
What ScarJo v. ChatGPT Could Look Like in Court


It doesn’t matter whether a person’s actual voice is used in an imitation or not, Rothman says, only whether that audio confuses listeners. In the legal system, there is a big difference between imitation and simply recording something “in the style” of someone else. “No one owns a style,” she says.

Other legal experts don’t see what OpenAI did as a clear-cut impersonation. “I think that any potential ‘right of publicity’ claim from Scarlett Johansson against OpenAI would be fairly weak given the only superficial similarity between the ‘Sky’ actress’ voice and Johansson, under the relevant case law,” Colorado law professor Harry Surden wrote on X on Tuesday. Frye, too, has doubts. “OpenAI didn’t say or even imply it was offering the real Scarlett Johansson, only a simulation. If it used her name or image to advertise its product, that would be a right-of-publicity problem. But merely cloning the sound of her voice probably isn’t,” he says.

But that doesn’t mean OpenAI is necessarily in the clear. “Juries are unpredictable,” Surden added.

Frye is also uncertain how any case might play out, because he says right of publicity is a fairly “esoteric” area of law. There are no federal right-of-publicity laws in the United States, only a patchwork of state statutes. “It’s a mess,” he says, although Johansson could bring a suit in California, which has fairly robust right-of-publicity laws.

OpenAI’s chances of defending a right-of-publicity suit could be weakened by a one-word post on X—“her”—from Sam Altman on the day of last week’s demo. It was widely interpreted as a reference to Her and Johansson’s performance. “It feels like AI from the movies,” Altman wrote in a blog post that day.

To Grimmelmann at Cornell, those references weaken any potential defense OpenAI might mount claiming the situation is all a big coincidence. “They intentionally invited the public to make the identification between Sky and Samantha. That’s not a good look,” Grimmelmann says. “I wonder whether a lawyer reviewed Altman’s ‘her’ tweet.” Combined with Johansson’s revelations that the company had indeed attempted to get her to provide a voice for its chatbots—twice over—OpenAI’s insistence that Sky is not meant to resemble Samantha is difficult for some to believe.

“It was a boneheaded move,” says David Herlihy, a copyright lawyer and music industry professor at Northeastern University. “A miscalculation.”

Other lawyers see OpenAI’s behavior as so manifestly goofy that they suspect the whole scandal might be a deliberate stunt—that OpenAI judged it could trigger controversy by going forward with a sound-alike after Johansson declined to participate, but that the attention it would receive seemed to outweigh any consequences. “What’s the point? I say it’s publicity,” says Purvi Patel Albers, a partner at the law firm Haynes Boone who often takes intellectual property cases. “The only compelling reason—maybe I’m giving them too much credit—is that everyone’s talking about them now, aren’t they?”
