A man in Georgia is becoming a pioneer of sorts, suing the creators of ChatGPT for defamation.

It certainly is a novel case, as Reason Magazine reports.

Fred Riehl is a journalist who was writing about a different lawsuit.

Riehl was writing about the Second Amendment Foundation (SAF) suing the Attorney General of Washington for apparently trying to silence the pro-gun rights group.

He asked ChatGPT to write the crux of the case.

The Georgia man, Mark Walters, was suddenly brought into the case.

ChatGPT said the suit was against Mark Walters, "who is accused of defrauding and embezzling funds from the SAF." It claimed Walters was the group's treasurer and chief financial officer and that he had "misappropriated funds for personal expenses without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures."

But while Walters was a member of the foundation, that's where the accuracy stops. He was neither plaintiff nor defendant in the case, and he held no leadership role in the organization. His name was pulled into the story out of nowhere.

Riehl let Walters know about the strange write-up and, needless to say, chose not to publish ChatGPT's synopsis of the case.

So Walters isn’t suing Riehl, but OpenAI.

The case on its face may not seem like a winnable one. Walters's suit claims "ChatGPT's allegations concerning Walters were false and malicious, expressed in print, writing, pictures, or signs, tending to injure Walter's reputation and exposing him to public hatred, contempt, or ridicule. By sending the allegations to Riehl, OAI published libelous matter regarding Walters."

But here's the interesting part. Technically, nothing was published in the traditional sense. Yet ChatGPT arguably did "publish" its write-up the moment the text appeared on Riehl's screen.

What is the legal definition of "publish"? That will be up to the court to decide, and that definition will become a huge factor in future artificial intelligence lawsuits.

But this lawsuit can provide a general blueprint for those trying to hold ChatGPT's makers accountable for producing incorrect information. If nothing else, it argues that even when false information appears only privately, the makers can be held liable.

Should this make OpenAI think twice before claiming ChatGPT is the answer to all writers' problems? Best to proceed with caution.
