Comments are full of people reading the headline and assuming that what OpenAI did here is fine because it's a different actress, but that's not how "right of publicity"(*) laws work. The article itself explains that there is significant legal risk here:
Mitch Glazier, the chief executive of the Recording Industry Association of America, said that Johansson may have a strong case against OpenAI if she brings forth a lawsuit.
He compared Johansson’s case to one brought by the singer Bette Midler against the Ford Motor Co. in the 1980s. Ford asked Midler to use her voice in ads. After she declined, Ford hired an impersonator. A U.S. appellate court ruled in Midler’s favor, indicating her voice was protected against unauthorized use.
But Mark Humphrey, a partner and intellectual property lawyer at Mitchell, Silberberg and Knupp, said any potential jury probably would have to assess whether Sky’s voice is identifiable as Johansson.
Several factors go against OpenAI, he said, namely Altman’s tweet and his outreach to Johansson in September and May. “It just begs the question: It’s like, if you use a different person, there was no intent for it to sound like Scarlett Johansson. Why are you reaching out to her two days before?” he said. “That would have to be explained.”
* A.K.A. "Personality rights": https://en.m.wikipedia.org/wiki/Personality_rights
The Midler case is readily distinguishable. From Wikipedia:
If you ask an artist to sing a famous song of hers, she says no, and you get someone else to impersonate her, that gets you in hot water.
If you (perhaps because you are savvy) go to some unknown voice actress, have her record a voice for your chatbot, later go to a famous actress known for once playing a chatbot in a movie, and are declined, you are in a much better position. The tweet is still a thorn in OA's side, of course, but that's not likely to be determinative IMO (IAAL).
1: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
The actress did impersonate Her though.
It's not just a random "voice for your chatbot"; it's that particularly breathy, chatty voice that she performed for the movie.
I would agree with you completely if they'd created a completely different voice. Even if they'd impersonated a different famous actress. But it's the fact that Her was about an AI, and this is an AI, and the voices are identical. It's clearly an impersonation of her work.
Did she? The article claims that:
1. Multiple people agree that the casting call mentioned nothing about SJ/her
2. The voice actress claims she was not given instructions to imitate SJ/her
3. The actress's natural voice sounds identical to the AI-generated Sky voice
I don't personally think it's anywhere near "identical" to SJ's voice. It seems most likely to me that they noticed the similarity in concept afterwards and wanted to try to capitalize on it (hence later contacting SJ), as opposed to the other way around.
No-one had to explicitly say any of that for it to still be an impersonation. Her was a very popular film, and Johansson's voice character was very compelling. They literally could have said nothing and just chosen the voice audition closest to Her unconsciously, because of the reach of the film, and that would still be an impersonation.
That's a very broad definition of impersonation, one that does not match the legal definition, and one that would be incredibly worrying for voice actors whose natural voice happens to fall within a radius of a celebrity's natural voice ("their choice to cast you was unconsciously affected by similarity to a celebrity, therefore [...]")
What you're arguing fails to pass the obviousness test; if I were running the company it would be plainly obvious that the optics would be a problem, so I would start to collect a LOT of paperwork documenting that the casting selection was done without a hint of bias towards an impression of a celebrity. Where is that paperwork? The obviousness puts the burden on them to show it.
Otherwise your argument lets off not just this scandal but an entire conceptual category of clever sleazy moves that are done "after the fact". It's not the Kafka trap you're making it out to be.
I think optics-wise the best move at the moment is quelling the speculation that they resorted to a deepfake or impersonator of SJ after being denied by SJ herself. The article works towards this by attesting that it's a real person, speaking in her natural voice, without instruction to imitate SJ, from a casting call not mentioning specifics, cast months prior to contacting SJ. Most PR effort should probably be in giving this as much reach as possible among those that saw the original story.
Would those doing the casting have the foresight to predict, not just that this situation would emerge, but that there would be a group considering it impersonation for there to be any "hint of bias" towards voices naturally resembling a celebrity in selection between applicants? Moreover, would they consider it important to appeal to this group by altering the process to eliminate that possible bias and providing extensive documentation to prove they have done so, or would they instead see the group as either a small fringe or likely to just take issue to something else regardless?
Yes, this should all have been obvious to those people. It would require a pretty high degree of obliviousness for it to not be obvious that this could all blow up in exactly this way.
It blew up by way of people believing it was an intentional SJ deepfake/soundalike hired due to being rejected by SJ. I think this article effectively refutes that.
I don't think it blew up by way of people believing simply that those doing the casting could have a hint of a subconscious bias towards voices that sound like celebrities. To me that seems like trying to find anything to still take theoretical issue with, and would've just been about something else had they made the casting selection provably unbiased and thoroughly documented.
Again, I think it requires a high degree of obliviousness to not have the foresight during casting to think, "if we use a voice that sounds anything like the voice in the famous smash hit movie that mainstreamed the idea of the kind of product we're making, without actually getting the incredibly famous voice actress from that movie to do it, people will make this connection, and that actress will be mad, and people will be sympathetic to that, and we'll look bad and may even be in legal hot water". I think all of that is easily predictable!
It seems way more likely to be a calculated risk than a failure of imagination. And this is where the "ethics" thing comes into play. They were probably right about the risk calculation! Even with this blow-up, this is not going to bring the company down, it will blow over and they'll be fine. And if it hadn't blown up, or if they had gotten her on board at any point, it would have been a very nice boon.
So while (in my view) it definitely wasn't the right thing to do from a "we're living in a society here people!" perspective, it probably wasn't even a mistake, from a "businesses take calculated risks" perspective.
I think it's deceptively easy to overestimate how likely it is for someone to have had some specific thought/consideration when constructing that thought retroactively, and this still isn't really a specific enough thought to have caused them to have set up the casting process in such a way to eliminate (and prove that they have eliminated) possible subconscious tendency towards selecting voice actors with voices more similar to celebrities.
But, more critically, I believe the anger was based on the idea that it may be an intentional SJ soundalike hired due to being turned down by SJ, or possibly even a deepfake. Focusing on refuting that seems to me the best PR move even when full knowledge of what happened is available, and that's what they're doing.
I'm sorry, but your first paragraph is a level of credulity that I just can't buy, to the point that I'm struggling to find this line of argument to be anything besides cynical. The most charitable interpretation I might buy is that you think the people involved in this are oblivious, out of touch, and weird to a degree I'm not willing to ascribe to a group of people I don't know.
If you are an adult living and working in the US in the 2020s, and you are working on a product that is an AI assistant with a human voice, you are either very aware of the connection to the movie Her, or are disconnected from society to an incredible degree. I would buy this if it were a single nerd working on a passion project, but not from an entire company filled with all different kinds of people.
The answer is based on "they wanted a voice that sounds like the one in Her, but the person whose voice that is told them no, but then they did it anyway". The exact sequence of events isn't as important to the anger as you seem to think, though it may be more important to the legal process.
My claim is not that they hadn't heard of the movie Her. It's that, while setting up auditions, the chain of thought that would lead them to predict a group would take issue in this very particular way (marcus_holmes's assertion that unconsciously favoring the VA's audition would constitute impersonation), to conclude that this necessitates the proposed rigor (setting up auditions in a way that eliminates the possibility of such bias, plus paperwork to prove it), and to consider it worthwhile appeasing the group holding this view, is not so certain to have occurred that the seeming lack of such paperwork can be relied on to imply much at all.
I would go further and say that chain of reasoning is not just uncertain to have occurred, but would probably be flawed if it did, in that I don't think it would noticeably sway that group. As opposed to the evidence in the article, or some other possible forms of evidence, which I think can sway people.
Less the order of events, and more "seeking out an impersonator and asking them to do an imitation" vs "possibility of unconscious bias when selecting among auditions"
A lot of legal constructs are defined by intent, and intent is always something that is potentially hard to prove.
At most the obviousness should put the burden of discovery on them, and if they have no records or witnesses that would demonstrate the intent, then they should be in the clear.
IMO having records that explicitly mention SJ or Her in any way would be suspicious.
IANAL
I think that reaches too far. Intent should be a defining part of impersonation. IANAL and I don't know what the law says.
Intent on whose part, though? Like, suppose, arguendo, that the company's goal was to make the voice sound indistinguishable from SJ's in Her, but they wanted to maintain plausible deniability, so they instead cast as wide a net as possible during auditions, happened upon an actor who they thought already sounded indistinguishable from SJ without special instruction, and cast that person solely for that reason. That seems as morally dubious to me as achieving the same deliberate outcome by instruction to the performer.
So who was doing the selecting, and were they instructed to perform their selection this way? If there were a lawsuit, discovery would reveal emails or any communique that would be evidence of this.
If, for some reason, there is _zero_ evidence that this was chosen as a criterion, then it's pretty hard to prove the intent.
SJ's voice has some very distinctive characteristics and she has distinctive inflections that she applies. None of that inflection, tonality, or those characteristics are present in the chatbot voice. Without those elements, it can be said to be a voice with vaguely similar pitch and accent, but any reasonable "impersonation" would at least attempt to copy the mannerisms and flairs of the voice they were trying to impersonate.
Listening to them side by side, the OpenAI voice is more similar to Siri than to SJ. That Sam Altman clearly wanted SJ to do the voice acting is irrelevant, considering the timings and the voice differences.
The phone call and tweet were awkward tho.
Exactly. Anyone who listens to both side by side should be able to clearly distinguish them.
I have this sinking feeling that in this whole debate, whatever anyone's position is mostly depends on whether they think it's good that OpenAI exists or not.
No it doesn't.
That's a verbatim quote from the article (albeit based on brief recordings).
I haven't heard the anonymous voice actress's voice myself to corroborate WP's claim, but (unless there's information I'm unaware of) neither have you to claim the opposite.
The clips are all online for you to listen to them yourself. The article can say what it likes, it's just wrong.
Just to make sure: have you correctly understood the article's claim?
It's saying that the anonymous voice actress's natural voice sounds identical to the AI-generated Sky voice (which implies it has not been altered by OpenAI to sound more like SJ, nor that they had her do some impression beyond her own natural voice).
If so, could you link the clips of the voice actress's natural voice, to compare to the AI version? I've searched but was unable to find such clips.
The AI voice is what's in question here, and there are plenty of examples; just listen to ones that aren't intentionally selected to create the impression of similarity. Sky has been around for half a year, so you don't have to limit yourself to the tech demo, but even if you do, listening to the whole thing you'll see the normal speaking voice is very different from ScarJo.
I know and agree. The article's claim is that the anonymous voice actress's natural voice sounds identical to the AI-generated Sky voice. That is, VA = AI, not AI = SJ or VA = SJ. Which corroborates the claim that OpenAI was not asking her to do any particular impression.
Here, a direct comparison:
https://x.com/chriswiles87/status/1792909936189378653
So your theory is that this was completely coincidental. But after the voice was recorded, they thought, "Wow, it sounds just like the voice of the computer in Her! We should contact that actress and capitalize on it!"
That's what you're going with? It doesn't make sense, to me.
Listen to the side by side comparisons. Sky has a deeper voice overall, in the gpt4o demo Sky displays a wider pitch range because the omni model is capable of emotional intonation. Her voice slides quite a bit while emoting but notably doesn't break and when she returns to her normal speaking voice you can hear a very distinct rhotic sound, almost an over-pronounced American accent and she has a tendency towards deepening into vocal fry especially before pauses. I'd describe her voice as mostly in her chest when speaking clearly.
Now listen to SJ's Samantha in Her and the first thing you'll notice are the voice breaks and that they break to a higher register with a distinct breathy sound, it's clearly falsetto. SJ seems to have this habit in her normal speaking voice as well but it's not as exaggerated and seems more accidental. Her voice is very much in her head or mask. The biggest commonality I can hear is that they both have a sibilant S and their regional accents are pretty close.
I was thinking someone thought "oh that sounds a fair bit like SJ in Her, if we can get SJ onboard, perhaps we can fine-tune what we got to sound like SJ in Her".
... that it would be even better to have a famous voice from Her than the rather generic female voice they had, but their proposal was declined. Well, oops, but SJ, famous as she is, doesn't have a copyright on every female voice other than her own.
Who’s making those claims, exactly? That will tell you a lot about their likely veracity.
First two claims are "according to interviews with multiple people involved in the process", direct quotes from the casting call flier, and "documents shared by OpenAI in response to questions from The Washington Post". Given the number of (non-OpenAI) people involved, I think it would be difficult to maintain a lie on these points. Third claim is a comparison carried out by The Washington Post.
Also, Sam had a one-word tweet: "her." So it looks like there was something going on.
It’s an obvious comparison to make for the technology, I don’t think it was meant as “it sounds like ScarJo”
It sounds more like Rashida Jones than SJ to me.
I think part of this PR cycle is also the priming effect, where if you're primed to hear something and then listen, you do hear it.
Then OpenAI did the priming by referencing "Her".
This is why things are decided by juries. You may well truly believe this all seems unrelated and above board. But very few people will agree with you when presented with these facts, and it would be hard to find them during jury selection.
Good luck proving that in court.
“Your honor, our evidence is that the audio clips both sound breathy”
That and repeated, rejected attempts to hire ScarJo to do the voice. I expect they have ample documentation on that.
Did you not read the article? They did the voice months before that.
Then why did they ask her two days prior to launch? A potential jury would be very, very curious to know that.
A very simple answer is that they also wanted SJ as a voice. That doesn't mean the original voice was trying to copy SJ.
They had a voice, the natural comparison when watching the interaction is to Her, and there was likely still time to get SJ's actual voice before a public rollout.
If you believe a known conman like Sam Altman didn't intend to steal SJ's voice I have a bridge to sell you.
See my comment from yesterday re him being a known conman: https://news.ycombinator.com/item?id=40435120
I find this whole thing very confusing because I never thought the voices sounded identical or very similar. I was initially even more confused because I thought I couldn't find clips of the SJ AI voice, only to realize I had, but it didn't sound like her.
This opinion is independent of Sam being a conman, scammer, creepy Worldcoin weirdo, and so on.
It’s not about what you believe but rather what you can prove
In particular, I'm pretty sure you can bake a voice model well enough for a canned demo in two days given we have cloning algs which will do it with 15 seconds of samples.
Production ready? Probably not, but demos don't have to be.
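To make that concrete, here's a minimal sketch of what few-sample cloning looks like today with an open-source model (Coqui's XTTS v2; the clip names and prompt are placeholders, and a polished product voice would obviously take far more work than this):

    # Rough sketch: zero-shot voice cloning from a short reference clip
    # using Coqui TTS's XTTS v2 model. "reference_clip.wav" stands in for
    # roughly 15-30 seconds of the target speaker.
    from TTS.api import TTS

    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")
    tts.tts_to_file(
        text="Hey there, what can I help you with today?",
        speaker_wav="reference_clip.wav",   # short sample of the voice to clone
        language="en",
        file_path="cloned_demo.wav",        # synthesized output in that voice
    )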
I may be wrong, but I believe this case would be made to a jury, not to a judge.
I think it would be hard to seat a jury that, after laying out the facts about the attempts to hire Johansson and the tweet at the time of release, would have even one person credulous enough to be convinced this was all an honest mix-up.
Which is why it will never in a million years go to a trial.
How do you explain the many people saying that the voices do not sound especially similar?
"The pitch is kiiiiiind of close, but that's about it. Different cadence, different levels of vocal fry, slightly different accent if you pay close attention. Johansson drops Ts for Ds pretty frequently, Sky pronounces Ts pretty sharply. A linguist could probably break it down better than me and identify the different regions involved."
https://old.reddit.com/r/singularity/comments/1cx24sy/vocal_...
There is also a faction claiming that Sky's voice is more similar to Rashida Jones's than Scarlett Johansson's:
https://old.reddit.com/r/ChatGPT/comments/1cx9t8b/vocal_comp...
Given the breadth and range of female voices available, this is way too close to be just a coincidence.
There are approximately 4 billion women in the world. Given that I know a few people who sound very similar to me, I would say that there are (subjectively) perhaps 1,000 to 10,000 different types of women's voices in the world.
This would mean that a celebrity could possess a voice similar to 0.5 million to 5 million other women, and potentially claim royalties if their voice is used.
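A quick back-of-envelope version of that estimate (all the inputs are the rough assumptions above, nothing measured):

    # Women per "voice type" under the very rough assumptions above.
    women = 4_000_000_000
    for voice_types in (1_000, 10_000):
        print(voice_types, "types ->", women // voice_types, "women per type")
    # 1,000 types  -> 4,000,000 women per type
    # 10,000 types ->   400,000 women per type
    # i.e. roughly the 0.5-5 million range claimed above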
This type of estimate is sadly deeply flawed. Voices are affected not just by ethnicity but also by language and culture. I know because I can hear slight but noticeable differences in tone between communities just 30-50 miles apart, within the same social classes and everything. Bilinguals also sound noticeably different from monolinguals, even in their primary language.
I think people think others sound the same not because they're similar to begin with, but because voices homogenize under peer pressure. There's a proverb: "the nail that sticks out gets hammered down". Most people have probably hammered their voices flat so as not to stand out.
You first said that Sky is "clearly an impersonation" of Johansson. Now you say that it's not a coincidence they chose Sky's voice actress. These are two different claims. It may not be a coincidence in the sense that they may have chosen Sky's actress because she sounds similar to Johansson. But that alone doesn't constitute an impersonation. Impersonation means deliberately assuming the identity of another person with the intent to deceive. So you'd have to demonstrate more than a degree of similarity to make that case.
Isn't it TTS + style transfer? Or is it e2e from text to waveform?
This is unclear. What is clear is OpenAI referenced Her in marketing it. That looks like it was a case of poor impulse control. But it's a basis for a claim.
Because they're building a voice-mediated AI, duh.
Can someone explain to me the outrage about mimicking a public persons voice, while half the people on hacker news argue that it's fine to steal open source code? I fail to see the logic here? Why is this more important?
In the Ford case they hired an impersonator to sing one of her copyrighted songs, so it's clearly an impersonation.
In OpenAI's case the voice only sounds like her (although many disagree) but it isn't repeating some famous line of dialog from one of her movies etc, so you can't really definitively say it's impersonating SJ.
But they asked her first!:
"Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and Al. He said he felt that my voice would be comforting to people....
https://twitter.com/BobbyAllyn/status/1792679435701014908
So it's: ask Johansson, get declined, ask casting directors for the type of voice actors they are interested in, listen to 400 voices, choose one that sounds like the actor, ask Johansson again, get declined again, publish with a reference to a Johansson film, claim the voice has nothing to do with Johansson.
[EDIT] Actually it looks like they selected the Sky actor before they asked Johansson and claim that she would have been the 6th voice; it's still hard to believe they didn't intend it to sound like the voice in Her though:
https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
Except it doesn't sound like Johansson, I don't know why people keep saying this. At best, the voice has a couple of similar characteristics, but I didn't think for one second that it was her. Can James Earl Jones sue if someone uses a voice actor with a deep voice?
Also, looking at it from the perspective of the lesser-known voice actress: does Scarlett Johansson have the right to trample the lesser-known voice artist's future job opportunities by intimidating her previous employers?
Imagine being a potential future employer of the lesser-known artist: would you dare hire her knowing that Johansson's lawyers might come after you?
Is this lesser-known voice artist now doomed to find a job in a different sector?
Voice archetypes are much much older than Johansson, so by symmetry arguments, could those earlier in line sue Johansson in turn?
When a strong person is offered a job at the docks, but refuses, and if then later another strong person accepts the job, can the first one sue the employer for "finding another strong man"?
At some point the courts are being asked to uphold exceptionalist treatment and to effectuate it with taxpayer dollars, mobilizing the executive branch in cases of non-compliance.
Right, it would be one thing if there was evidence that OpenAI asked the actress to imitate Johansson. But people are saying that using this voice actress at all without Johansson's permission shouldn't be legal, which is a bizarre claim. If someone thinks my voice sounds similar to a celebrity's, now that celebrity owns my voice? In any other situation, everyone here would think such a standard was completely outrageous.
(For what it's worth, I didn't find the Sky voice to sound anything like Scarlett Johansson personally)
Exactly, everyone claiming so hasn't actually listened to it [1], or they're basing their opinion off of "suspicious correlations", like that Altman mentioned "her" just before releasing a voice-interactive AI assistant.
[1] https://x.com/chriswiles87/status/1792909936189378653
Some may have listened to it and just have such a poor opinion of OpenAI that they allow it to cloud their judgment. Both voices sound like white women in a similar age group, must be Scarlett Johansson.
And not just white women, Rashida Jones is biracial and a lot of people think the voice sounds like hers. I agree that it actually sounds much more like Jones's than Johansson's - I think I'd be able to accurately distinguish between Sky and Johansson's voice just about every time, but I'm not sure I would be able to do the same with Sky and Jones's.
And as far as I can tell, OpenAI also had 4 different female voices - two from the API, and two in ChatGPT. So there are several different types of voices they covered.
I listened again and you're right, it does sound more like Rashida Jones.
I think that's exactly it, or they're critical of all corporations, and they're jumping all over suspicious timelines, like that they tried to convince her 9 months ago and again 2 days before the new release as some kind of evidence of malfeasance.
Completely false. Even the journalists at the launch of "Sky" last year were singling it out of the batch of voices, and specifically asking OA about how it sounded like Johansson: https://www.washingtonpost.com/technology/2023/09/25/chatgpt...
You mean people who are incentivized to stir up controversy were trying to stir up controversy? I'm shocked. This is definitely not a case where self interest would cloud anyone's judgment.
In case it's not clear, my moderately low opinion of Open AI is bested only by my even lower opinion of journalists.
Maybe because it's the most famous recent good movie about voice interface AI?
He wasn't citing Lucy or whatever other garbage.
I would bet these charitable readings of OA's intentions are going to get wiped away by some internal emails found during discovery. Given that Mr. Altman was talking about it publicly in a tweet, it's probable there are internal comms about this.
This is exactly the right argument. Accepting the lawsuit would give precedent to an insidious combination: the Matthew effect plus a chilling effect.
The lesser-known voice actor is dooming themselves to find a job in a different sector by contributing to the development of technology that will almost certainly replace all voice actors.
This is weird, if not bizarre. Scarlett didn't do anything. Literally no action besides saying no. Then a company decides to impersonate her and use her performance in a movie as implicit marketing for a product. That's the company's problem, not hers.
Would you ever say “except strawberries aren’t tasty, I don’t know why people keep saying this”?
Maybe it doesn’t sound like Johansson to you, but it does sound like her to a lot of people. Worse, evidence points to Altman wanting you to make that connection. It’s trying to sound like her performance in one specific movie.
I guarantee you that nobody who's ever heard her voice actually thinks that, go on:
https://x.com/chriswiles87/status/1792909936189378653
There is no such evidence.
You definitely do not. Think of the craziest conspiracy theory you can, one that has been debunked countless times, and I’ll show you people that have been shown the evidence yet still believe the conspiracy.
This case is not a conspiracy theory. But it is something where you disagree with a popular opinion. The point is that you’re projecting your thought pattern and way of understanding the world into everyone else instead of being empathetic and putting yourself on the other side.
Look, I get it. I also thought that the outrage about Apple’s recent ad was disproportionate. But I’m not going to go out of my way to defend the ad. I don’t give a hoot that a massive corporation is getting a bit of bad press for something largely inconsequential, I just wish the outrage had been directed at something which actually matters and go on with my life.
With this kind of dismissal, I don’t think it’s worth continuing the conversation. I’ll leave you to it. Have a genuinely good weekend.
Is it a popular opinion? Where's the evidence of that? Or is it maybe just a manufactured clickbait outrage now bordering on conspiracy theory? I provided a link that clearly demonstrates the very foundational claim is wrong, and the article has already clarified the mistakes in the timeline that conspiracists are citing as evidence of malfeasance. This controversy is a textbook nothing burger and it annoys me to no end that people keep falling for these tactics.
I already said I didn’t think it was worth continuing the conversation, so I hope I don’t regret trying to be kind.
Please realise that calling something “manufactured clickbait outrage” itself borders on conspiracy theory. Again, empathise, look at your argument with an outside eye.
Don’t let it. Being annoyed on the internet only ruins your day. And the more unhinged you become, the less sense your arguments will make and the more it will backfire.
Unless you have a personal relationship with either Altman or Johansson, I recommend you let this one go. It’s not worth the cost of any mental wellness. Some issues are, but not this one. Save yourself for the issues that are important.
Again I wish you a relaxing weekend.
For ten seconds I thought "Wow, that is strikingly similar to her". Then I realized that was the example from the movie. I tend to agree, not identical or even close, but definitely some similarities. I don't see a jury ruling that they're too similar.
Including Johansson herself and her family?
We can talk biases, but I think we're pretty far from "guarantee" in this matter.
Johansson already has lawyers at the ready. This goes beyond some publicity stunt. You can be cynical, but I struggle to call a potential legal proceeding a conspiracy.
Linking to two small soundbites isn't going to undo an entire court case. I doubt I always sound the same in two identical line readings.
Sky sounds like Scarlett to me. Not all the time, not in every sentence or statement, but frequently enough that combined with Altman's outreaches to Scarlett there's no way I'm giving them the benefit of the doubt.
Try to ask yourself: is it at all plausible that others operating in good faith might come to a different conclusion, particularly given the surrounding details? If you can't find your way to a "yes" then I'm not sure what to say, you must view everyone (seemingly the majority though who knows for sure) to be deluding themselves or trolling others.
I've been saying this for days, and I'm pretty firmly in the OpenAI critic camp.
The only reason people think it sounds like her is that they've biased themselves into it because of all the context surrounding it.
Maybe the fault for that belongs to the company who tried to create the association in your mind by using a similar voice and tweeting about that one movie with the voice.
That’s basic advertising. They knew what they were doing. It’s just that it may have backfired.
I don't think it's a particularly similar voice to begin with, and the technology in general is very 'Her' like - which would also explain why they reached out to ScarJo to do an additional voice.
I think the tweet was dumb but in my layperson's understanding of the law and the situation, I doubt OpenAI is at any huge legal risk here.
It doesn't really matter their legal risk here, IMO. What matters is the court of public opinion in this case.
Even if they are able to show irrefutable proof that it wasn't ScarJo and is in fact another person entirely it will not matter.
This is one of those times that no matter what the facts show people will be dug in one way or another.
If they ask James Earl Jones to do it, he says no, and then they hire someone with a deep voice in order to sound like him? Yes.
Except it doesn't sound like her, and that's not even the correct sequence of events.
It absolutely does sound like her, in this context where she is the voice in the AI voice companion zeitgeist.
In just the same way that if the context were "magical space villain in a black mask and cape" and someone was hired with a deep voice, it would be a clear rip-off of James Earl Jones.
And it requires a level of credulity that I find impossible to sustain to think that "the sequence of events" here doesn't start with "let's have a voice actor casting call and pick a voice that sounds like the one in the movie about this that everyone loves". I won't, however, be shocked if nobody ever wrote that down in a medium that is subject to legal discovery.
If I play the voices back to back, nobody thinks they're the same person.
If I ask which one is SJ, people that have seen her films know, those who don't, don't. (Hint, only one sounds hoarse like early smoker voice and self-assured even in the 'Her' role, only one sounds impossibly 2020s valley girl chipper and eager to please.)
Sure seems like all the dogpiling last week either didn't do a basic listen or ignored it, as it's much better for click farming to mock the cock of the walk.
I immediately thought "grade school teacher", although I was listening to the clip where she was telling a story.
This. It's what people want to hear. If you heard that voice in a vacuum with no context and asked someone what famous person it is, I doubt many people would say Johansson. Some, sure, but not the majority.
The only thing Sam did wrong was play too fast and loose with the implication of "Her" given that he had been talking to ScarJo. Lawyers should have told him to just STFU about it.
To add to this, the legal standard isn't whether it sounds "like" her. It has to be a replication of her voice. Millions of people may sound or look "like" another person, that doesn't mean they are a copy of that person.
The best case study in voices imho is David Attenborough. He has a distinct voice that many have sought to replicate. But you know who else had that voice? Richard Attenborough (the actor from Jurassic Park). They are brothers. Sadly, Richard recently passed. Their voices are unsurprisingly nearly interchangeable, along with a thousand other people with similar life stories. So who gets to own complete rights to the distinctive "Attenborough" voice? In any other area of intellectual property the answer is simple: nobody. It doesn't exist and/or was created by people long before any living Attenborough walked the earth.
Similarly, courts ruled that GTA did not steal from Lindsay Lohan. One cannot own the image of a generic California blonde in a red bikini. So why should Johansson own a sexy/breathy voice with a nondistinctive accent?
I was confused because it doesn’t really sound much like her normal voice, and I didn’t see Her. So I looked it up—it sounds a little bit more like her voice in Her… the movie where she’s adding some AI-like affectations to her voice.
I guess they decided to remove it for PR reasons or something.
Read the main article again. The voice actor was already auditioned and chosen before they went to Johansson. Maybe they had a change of plan and thought it would be cool to have Johansson's voice rather than an unknown actor's.
The point isn't the time line of hiring the voice actor. The question is whether OpenAI was deliberately trying to make the voice sound like Johansson.
Suppose someone asked Dall-e for an image of Black Widow like in the first Avengers movie, promoting their brand. If they then use that in advertising, Johansson's portrait rights would likely be violated. Even (especially) if they never contacted her about doing the ad herself.
This is similar to that, but with voice, not portrait.
That's because one can make the argument that Dall-E was regurgitating; it would be different if you got somebody who happens to look like her to pose in a similar way.
I don't think this is entirely right (not a lawyer)
You can't hire an artist to draw Black Widow in the style of Scarlett Johansson's Widow. The issue isn't how the art is made, it's whether the end result looks like her.
I think there may be additional issues (to be determined either in the courts or by Congress, in the US) with regard to how Dall-E makes art, but elsewhere in the thread someone mentioned the Ford/Bette Midler case, and that does seem relevant (also, though, not exactly what happened here).
I don't have the expertise to know how similar this is to the case at hand.
Not a lawyer, but wouldn't the important intent be the one when the voice is released, not the first moment someone gets hired? The economic harm happens when you publicize the voice in a way designed to gain value from Johansson's reputation without her permission, not when you record it. The tweet and requests speak directly to that moment.
Yes, if she lost movie roles or other contracts because people assumed they could license the OpenAI voice then she could claim she was harmed. However OpenAI removed the voice and this situation is widely publicized. So it is hard to prove that she is being harmed now
Is there any quality difference between hiring a voice actor specifically to provide the voice for an AI compared to cloning an actor's voice from their movies?
Much of the coverage I've seen thorough social media on this (including Johansson's statement) gives the impression that this is what OpenAI did. If the quality of doing that would be worse than using a voice actor to imitate Johansson's voice, what is the value of the publicity which gives the impression that their technology is more advanced than it is, compared to whatever they end up settling this for?
Especially when you have ex-OAers, who had been working there at the time on 'J.A.R.V.I.S.', tweeting things like https://x.com/karpathy/status/1790373216537502106 "The killer app of LLMs is Scarlett Johansson."
There are probably a number of other cases. The one I remember is when Sega asked Lady Miss Kier of Deee-Lite fame to use her public image for a game. Nothing came out of it, but Sega made the character Ulala[1] anyway. If you grew up in the 90s the character's name was strongly connected to Lady Miss Kier's catch phrase, but unfortunately she lost the suit and had to pay more than half a million.
[1] https://en.m.wikipedia.org/wiki/Ulala_(Space_Channel_5)
Not to over-analyze your use of language, but using the possessive here makes it seem like she personally owned that phrase or its use was associated with her. First, I don't know if that's true. Did she say, "Ooh la la," constantly, or is it just something she said at the beginning of the music video (and possibly studio recording) of the one hit from Deee-Lite, Groove Is In The Heart? Moreover, that phrase is a fairly well-known and widely-used one, see: https://en.wikipedia.org/wiki/Ooh_La_La. It certainly was not original to her nor would its use indicate an association to her. To your point, its use plus the aesthetic of the character does seem like a reference to Lady Miss Kier's appearance in that music video (if not also her style and appearance more generally, I don't know if that is how she always looks). But she didn't sue everyone else on this list for the use of her supposed catch phrase, ooh la la.
I hate to say one person's fame is so great that they get special or different treatment in court, but I think "Lady Miss Kier" was punching above her weight in trying to sue over use of her image. Her fame was a flash-in-the-pan one-hit-wonder in the early 90s, no offense to any Deee-Lite fans. It was certainly a great song (with some help from Herbie Hancock, Bootsy Collins, and Q-Tip).
https://www.youtube.com/watch?v=etviGf1uWlg
Not a native speaker, so "catch phrase" is maybe not the right term or too strong.
Prompted by your comment I read up about the case and from what I understand Sega wanted to use the song in a promotion and not (what I remembered) her likeness.
Wouldn't that apply to entertainers like Rich Little whose entire career was him doing his best to exactly impersonate famous peoples' voices and mannerisms?
Parody tends to fall into various exemptions. Of course one might argue that OpenAIs work itself is a parody of AGI.
If someone impersonates people but doesn't disguise/hide their appearance, there's no risk of confusing listeners about who is making the voice.
It amounts to nothing as it was a single word and they could spin that any way they want, it's even a generic word, lol. The "worst" interpretation on that tweet could be "we were inspired by said movie to create a relatable product" which is not an unlawful thing to do.
I see this a lot in spaces like HN focused on hard science and programming: this idea that judgments couldn't possibly consider things like context beyond what has been literally spelled out. To paraphrase a fitting XKCD, they have already thought of that "cool hack" you've just come up with [0].
I lack the knowledge to make a judgment one way or another about whether this will go anywhere, because I know very little about this area of law, more so in the US. However, the idea that tweeting the title of that specific movie, in the context of such a product release, couldn't possibly be connected to one of those voices having a cadence and sound similar to a specific actor from that same movie whom they approached beforehand, and couldn't have any legal bearing, seems naive. Is it that doubtful that a high-priced legal team could bring a solid case, leading to a severe settlement, or more if she wants to create a precedent?
Clichéd Mafia talk is not a foolproof way to prevent conviction in most jurisdictions.
[0] https://xkcd.com/1494/
The order of the events is different, but it still comes down to whether OA had a specific voice in mind when building the chatbot.
By your logic, I could go find a guy who looks and sounds like Tim Cook, make a promotional video of him praising my new startup, ping Tim Cook to check if by any chance he would miraculously be willing to do me the favor and save me all the trouble in the first place, but still go on and release my video ads without any permission.
"I did all the preparation way before asking the celebrity" wouldn't be a valid defense.
You could have an ad with a voiceover saying all sorts of positive things about your company, in a voice that sounds somewhat like Tim Cook. What you could not do is use a body double, dress someone up like him, or otherwise give the impression that your startup is loved by Tim Cook himself. He doesn’t have a monopoly on slightly southern accents.
Also a lawyer, and the Midler case is apparently not understood so narrowly. The possible chilling effect on the employability of actors who happen to look or sound just like already famous actors rankled me, too, and I really got into it with my entertainment law professor (who has had some quite high profile clients). His advice, in no uncertain terms, was basically “Sorry to those actors, but if you try to get away with hiring one of them knowing they’re likely to be confused with someone more famous, you’re stupid.”
Pulling the voice when ScarJo complained is not a good look. I’m sure her attorneys would be very excited to do discovery around that decision should it come to trial.
It won’t though, this is primarily a PR problem for OpenAI. Which is still a real problem when their future product development depends entirely on everyone giving them tons of data.
Counter-intuitively, I think this puts Johansson in a stronger position.
OpenAI did not want to copy her the actress, they wanted to copy HER the performance from the movie.
The tweet + approach is probably sufficient to bring a lawsuit and get into discovery and then it'll come down to if there's a smoking gun documents (e.g. internal emails comparing the voice to Her, etc.)
It's likely that someone internally noticed the similarity, so there's probably some kind of comms around it, and it will very much depend on what was written in those conversations.
They contacted Johansson after the Sky voice was created, they didn’t create it because she declined.
The voice actor isn’t a Johansson imitator, and the voice isn’t an imitation.
The only similarity between the Sky voice and Johansson’s is that it’s a white American female, so by your logic a significant percentage of the US population has a case.
Imagine this being such a legal minefield that I can't sell my own voice because a celeb's voice sounds a bit like mine.
You can sell your voice to whoever you want.
What you can't do is USE that voice in a way that seeks to mislead (by however much) people into believing it is someone else.
I'm really not sure why people can't understand that it is intent that matters.
I don't think OpenAI wants people to think ChatGPT 4 or whatever is Scarlett Johansson, that would not make any sense.
Then what were OpenAI hoping to get out of their association with her? Why go to the effort of getting in contact? Why reference the film Her?
They contacted her to use her voice. I mean, the Sky voice doesn't sound like Scarlet Johansson.
Because they developed an AI some people are bonding with. Which is the bigger deal? The voice or the AI, you tell me.
What is the angle here? They tell people that it's Scarlett Johansson responding to them instead of a computer? To what end? I just don't get it. And I think anyone who confuses a computer program for a real person has bigger problems than being potentially defrauded by OpenAI.
And people have been making computers sound like humans without anyone suggesting that it's some attempt at fraud for very long.
There’s a certain group of people on this site that do not want OpenAI, Apple, and others to have done wrong.
A celeb creates a particular voice personality for a role as an AI in a very successful movie. You create an AI, and "co-incidentally" create a strikingly similar voice personality for your AI. Not a legal minefield, you copied the movie, you owe them.
They could have used any accent, any tone, anyone. Literally anyone else. And it would have been fine. But they obviously copied the movie even if they used a different actress.
Yes, they tried to copy the film Her, yet do you think the voice AI character in the film was original? No other films with female AI voices predate that film? And would the actor have a claim on the fictional character they played in the film, versus the film's “owners”?
Obviously there have been lots of female voices for AIs in movies. But not all of them sound the same. OpenAI could have created a new voice personality for their AI by hiring almost any female voice actress available. But they didn't. They chose to copy Her.
Listen to the voices side by side:
https://www.reddit.com/r/ChatGPT/comments/1cwy6wz/vocal_comp...
And here is voice of another actress ( Rashida Jones ):
https://www.youtube.com/watch?v=385414AVZcA
which actress do you think is similar to the openAI sky voice?
Which sounds more like Johansson's voice on Her, the Sky voice, or Emma Watson?
Again, given the vast range of voices (even female voices) available, choosing one that sounds so close to Her, given the subject matter of the film (and the OpenAI leadership's references to the film), this is not coincidence.
Did I claim coincidence? The OpenAI Sky voice has been available among many other voices since last September, and I shared evidence that there are other actors who could claim the voice is more similar to their own. Yet you think the voice actor from the film Her has a right to prevent the use of the Sky voice now? On what basis? I doubt most people who watched the film Her even know who did the voice acting. And per the evidence I shared above, the claim that it is that same voice rather than another popular actor's is weak.
Exactly, that would be absolutely nonsensical.
You wouldn’t be the one liable. Any company that hired you might be liable if you sound like a famous voice and the more famous individual had already declined using their voice.
The company would also be liable if they used your voice and claimed it was someone more famous.
Ultimately you’re not liable for having a similar voice because you’re not trying to fool people you’re someone else. It’s the company that hired you who’s doing that.
This is why tribute acts and impressionists are fine…as long as they are clear they’re not the original artist
Because of laws and regulations like these, innovation is getting slower and slower.
I'm afraid the whole world will get regulated like EU someday, crippling innovation to a point that everyone's afraid to break a law that they aren't even aware of, and stop innovating.
My understanding was that the voices sound quite similar. I haven't heard the original Sky voice so don't know. Are there any samples online?
Listen to the voices side by side:
https://www.reddit.com/r/ChatGPT/comments/1cwy6wz/vocal_comp...
And here is voice of another actress ( Rashida Jones ):
https://www.youtube.com/watch?v=385414AVZcA
If you click through and listen, please reply and answer these questions: which actress do you think is similar to the OpenAI Sky voice? And what does that tell you about the likely court result for Johansson? And having reached this conclusion yourself, would you now think the other actress, Rashida Jones, is entitled to compensation based on this similarity test?
Are you asking me to trust that a video clip posted to the ChatGPT subreddit by u/SWAMPMONK is providing an accurate audio recording of the openAI sky voice?
You can literally go look up the original demo footage on YouTube too. Which is what I did (before that comparison was created) and didn't think it sounded anything like SJ.
Put side by side...I can hear a similarity, but they're also distinctly different (the OAI voice doesn't have the huskiness of the Her VA IMO, but they're both Californian female with a bubbly demeanour).
Thanks! Hadn’t heard it before now. They don’t sound all that similar to me, though perhaps I am a bit detail-oriented on account of my linguistics background.
You might be right, but still if it goes to court, OpenAI knows they have an uphill battle. Any jury is going to love Johansson.
The voice doesn't need to sound the same for OpenAI to lose.
https://www.youtube.com/watch?v=1uM8jhcqDP0&t=30s
Wait, what's the music artifact at 0:54? What are they even doing!?
If you program a digital voice to sound like someone, there's zero difference from using an impersonator. A voice actress playing a role in a movie (Her) is 100% that actress's voice too.
People are so weird on this. OpenAI screwed up, they know it, their actions show it, there isn't much to discuss here.
Her statement says otherwise:
"Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and Al. He said he felt that my voice would be comforting to people.
https://twitter.com/BobbyAllyn/status/1792679435701014908
The Sky voice was released with the first ChatGPT voices last year in September, so there's no contradiction there unless they asked her on the 1st of September and somehow trained another voice within the few weeks after she said no.
Here's a video that someone posted in October talking to the same Sky voice: https://www.youtube.com/watch?v=SamGnUqaOfU
Er, that is totally possible? You act like it's not a machine learning system. You train new stuff in hours or days easily, especially if you have good tooling. Imagine saying this of, say, Stable Diffusion image LoRAs: "this X artist LoRA couldn't be based on X because it was somehow trained within the few weeks after X said no!"
All the timing means is that, in good management & startup fashion, because they needed multiple voices, they had a voice pipeline going, and so could have a bunch of female voices in the pipeline for optionality. And if licensing Johansson didn't work out, you have a plan B (and C, and preferably D). This is big business, you don't do things serially or not have backups: "'hope' is not a plan".
They could do it but my point is that people are using the September rejection date as evidence for them copying her voice afterwards because it was 7 months before GPT-4o and they aren't aware that the voice has been in the app for 7 months already.
In what way?
That in no way contradicts the fact that the Sky voice was created first, although it does seem to suggest a misunderstanding by Johansson that this was to be an exclusive deal to be "the" voice, leading to the incorrect conclusion that the Sky voice was created after she declined, and must therefore be an impersonation (despite sounding nothing like her/Her, as she herself must know better than anyone). Stretch after stretch after stretch. (Being kind.)
In fact the recordings used for training were made in June/July 2023, which is before Johansson was contacted as a possible "also-ran": https://openai.com/index/how-the-voices-for-chatgpt-were-cho...
The jury will decide on the latter.
At any rate, Altman made clear allusions to hint that they are capable of synthesizing ScarJo's voice as a product feature. The actress retaliated saying she verbally did not consent, and now OpenAI's defense is that they hired a different actress anyway.
...which means they lied to everyone else on the capabilities of the tech, which is y'know, even worse
Exactly. And regardless of the timeline outlined, when discovery happens, there'll be a bunch of internal messages saying they want ScarJo to do the voice or to find someone who matches her closely enough. They went down both paths.
This will settle out of court.
her
Hmmm. This isn’t voice acting though. I suspect that we’ll find that OpenAI used thousands of Johansson’s voice samples for general training to give the voice a “Her” feel and then found someone with a similar voice for fine-tuning, but had Johansson said yes, they could then have had her do it instead.
If the records show that they did train Sky with Johansson’s voice samples it will be an interesting case.
You misunderstand how personality rights work.
Called it in the other thread and calling it in this one: there is no wrongdoing on OpenAI's side.
Looking/sounding like somebody else (even if it's someone famous) is not prosecutable. Scarlett Johansson has nothing in this case, whether people like it or not. That's the reality.
Who said it was ‘prosecutable’?
Scarlett Johansson is threatening legal action against OpenAI for this.
Are you not aware of this?
Scarlett Johansson cannot prosecute anyone. She can sue them, in civil court, for civil damages. Prosecution is done in connection with crimes. Nobody is alleging any crimes here.
prosecute: to officially accuse someone of committing an illegal act, and to bring a case against that person in a court of law
Source: Cambridge's dictionary (but any other would work as well)
Where did you get this? I'm seeing "to officially accuse someone of committing a crime" [1]. Criminality is essential to the term. (EDIT: Found it. Cambridge Academic Content Dictionary. It seems to be a simplified text [2]. I'm surprised they summarised the legal definition that way versus going for the colloquial one.)
You have to go back to the 18th century to find the term used to refer to initiating any legal action [3][4].
[1] https://dictionary.cambridge.org/dictionary/english/prosecut...
[2] https://www.cambridge.org/us/cambridgeenglish/catalog/dictio...
[3] https://verejnazaloba.cz/en/more-about-public-prosecution/hi...
[4] https://www.etymonline.com/word/prosecute
I think it’s still common informal usage to prosecute a (moral) case. Maybe more common in the UK where you can bring a literal private prosecution.
Although I think what lawyers say these days is that it’s not colorable.
Sure, those are other definitions [1], e.g. to prosecute an argument. Within a legal context, however, it is black and white.
For crimes. One wouldn't say one is prosecuting a defendant for e.g. libel. (Some states have private prosecution [2].)
[1] https://www.merriam-webster.com/dictionary/prosecute
[2] https://en.wikipedia.org/wiki/Private_prosecution#United_Sta...
The third definition listed on your Merriam-Webster link seems to be what's applicable here, and very clearly describes the term as applicable to any legal action.
This is consistent with my understanding of the term as a native English speaker, having experienced the term "prosecute" being used in reference to both criminal and civil cases in all forms of discourse, verbal and written, formal and informal, for decades, and only first encountering the claim that it shouldn't be used for civil cases here in this thread, today.
Partly why I used that citation. It’s one of the few (adult) dictionaries that acknowledges as much.
I wouldn’t go so far as to say the Webster 3b usage is incorrect—it’s in some dictionaries and was historically unambiguously correct. But it’s non-standard to a high degree, to the extent that Black’s Law Dictionary only contains the criminal variant. (I’ll leave it open whether publicly referring to someone who has only been sued as someone who has been prosecuted, when intended as an attack, qualifies as defamation.)
More to the point of clear communication, I’d put it in a similar category as claiming one’s usage of terrific or silly was intended in its historic sense [1]. (Though I’ll admit my use of “nice” falls in that category.)
All that said, I’m very willing to entertain there being a dialect, probably in America, where the historic use is still common.
[1] https://www.mentalfloss.com/article/84307/7-words-mean-oppos...
I didn’t know that about the states. Thanks.
Why would that make it more common in the UK? Being able to bring a private prosecution strengthens the distinction, a regular citizen can both sue someone for a civil offense and prosecute someone for a criminal offense. It makes it more clearly nonsense to refer to suing someone for slandering you as "prosecuting" them because you can bring prosecutions and that is not one!
Like I said, a moral case. There are also things like “trial of the facts”
Scroll down on that site, literally. Words have more than one meaning.
Here's what I get from MacOS's dictionary: institute legal proceedings against (a person or organization).
I can also be pedantic and insist that, even under the strict interpretation you are vouching for ...
... is a correct argument.
If this is really your hill to die on, go for it. Using “prosecute” to refer to civil litigation is not standard English in any dialect circa 1850.
Easy peasy, yes/no answer.
From your understanding, is looking/sounding like somebody else (even if that somebody is famous) prosecutable or not?
No. And if a lawmaker claimed they would like it to be, and then claimed they meant civilly litigable, they'd be labelled dishonest. (Even if it was an honest mistake.)
Great, then we agree. :^)
Go away with the “well, actually”.
It’s a civil dispute.
The term "prosecution" applies to civil cases as well as criminal ones.
Not true. SJ's statement seeks info and does not threaten legal action.
Nobody said looking/sounding like someone else is "prosecutable", and this willfully obtuse reading is getting annoying.
Many people here, including you, seem to be under the impression that a person who sounds like a celebrity can, because they are not that celebrity, do whatever they want with their voice regardless of whether or not they seem to be passing off as or profiting from the persona of that celebrity. This is not the case.
When others point this out many people, again including you, then go "so you're saying the fact that someone sounds like a celebrity means they can't do anything with their voice - how absurd!", and that isn't the case either, and nobody is saying it.
This binary view is what I'm calling obtuse. The intent matters, and that is not clear-cut. There are some things here that seem to point to intent on OpenAI's part to replicate Her. There are other things that seem to point away from this. If this comes to a court case, a judge or jury will have to weigh these things up. It's not straightforward, and there are people far more knowledgeable in these matters than me saying that she could have a strong case here.
People have now said this an absurd number of times and yet you seem to be insisting on this binary view that completely ignores intent. This is why I am calling it willfully obtuse.
If the above are misrepresentations of your argument then please clarify, but both seemed pretty clear from your posts. If instead you take the view that what matters here is whether there was intent to profit from Scarlett Johansson's public persona then we don't disagree. I have no opinions on whether they had intent or not, but I think it very much looks like they did, and whether they did would be a question for a court (alongside many others, such as whether it really does sound like her) if she were to sue, not that there is any indication she will.
Edit: And I should say IANAL of course, and these legal questions are complex and dependent on jurisdiction. California has both a statutory right and a common law one. Both, I think, require intent, but only the common law one would apply in this case as the statutory one explicitly only applies to use of the person's actual voice. (That seems a bit outdated in today's deepfake ridden world, but given the common law right protected Midler from the use of an impersonator perhaps that is considered sufficient.)
https://www.dmlp.org/legal-guide/california-right-publicity-...
All we need is discovery. They’ll settle before that because we all know they are swimming in dirt.
That is exactly it - people do not like how OpenAI is acting. Whether or not there is legal action to be had is an interesting tangent, but not the actual point - OpenAI's actions are ticking people off. Just because something is legal does not mean it is the right thing to do.
One of the great things about HN is you get all kinds of experts from every field imaginable.
Yikes.
The sad thing is: most probably absolutely nothing will happen.
These startups break laws, pay the fines and end up just fine.
Remember that it was Sam Altman who proposed to change the YC questionnaire to screen applicants by incidents where they have successfully broken rules. YC even boasts about that.
They're likely going to write Scarlett Johansson a large check and get a new voice.
What else do you think is going to happen?
The entire company gets shut down?
Why would it be a particularly large check? What are her damages?
Or to rephrase it: the laws don't apply to the rich. They just have to pay a little more "tax" (which probably still works out to less than a typical small business has to pay, percentage-wise) to be allowed to break them.
Then you don’t know ScarJo. She doesn’t fuck around and she has enough money to put legal fees where her mouth is. She was the vanguard of actors suing for streaming royalties, for example.
Do we really want someone's voice to be copyrightable, to a point where similar sounding people can't be used?
This is such a weird issue for HN to be upset about when other IP related issues (e.g. companies suing for recreating generic code, patent trolls, trademark colors, disregard of paywall, reverse engineering, etc), people here overwhelmingly fall on the side of weaker IP protections.
I guess the difference is that some people just pick the side of "the little guy", and in the example of a centi-millionaire beautiful actress vs. a billion-dollar founder, the scales tip to the actress.
That thought process puts them in the same boat as: Theranos, FTX, Lehman Brothers, Bear Stearns, Countrywide Financial, Enron, Washington Mutual, Kidder Peabody, and many other companies which no longer exist.
So if (since?) they lack the ethical compass to not break the rules, perhaps a simple history of what happens to companies that do break the rules might be useful guidance...
However strong her case may be, in lawsuits like these the prosecution usually needs to prove *beyond reasonable doubt* that OpenAI copied her voice, and did so intentionally. This, in all likelihood, seems very tough to come by given the evidence so far. Yes, she might drag the case on for a long time, but I doubt that will cause the slightest dent in OpenAI's business.
This would be a civil, not criminal, case: there is no prosecution, and "beyond a reasonable doubt" is not the standard. Rather, Johansson's lawyers only need to show that the balance of probabilities lies in her favour.
[1] https://www.findlaw.com/criminal/criminal-law-basics/the-dif...
What sort of damages can Scarlett Johansson expect to get if OpenAI launches with the Sky voice for a short while, then pulls it quickly after the backlash (like they did)?
Are punitive damages commonplace for such scenarios?
I’m sure someone can do some type of math about her general pay rate per performance minute then multiply it by the millions of hours Sky has been heard using her voice. I think that number would likely be quite high.
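As a toy illustration of that multiplication (both inputs below are invented; neither her rate nor Sky's usage hours are public):

    # Back-of-envelope version of the damages math above; numbers are made up.
    rate_per_minute_usd = 1_000      # hypothetical per-performance-minute rate
    hours_heard = 1_000_000          # hypothetical total hours Sky was heard
    estimate = rate_per_minute_usd * hours_heard * 60   # 60 minutes per hour
    print(f"${estimate:,}")          # $60,000,000,000 with these made-up inputs

Even with deliberately modest inputs the product balloons, which seems to be the comment's point.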
Any damages are probably going to be way cheaper than the cost of an equivalent publicity campaign.
Bad PR is also PR.
I don't get it. He was looking for a specific voice. Scarlett Johansson is one of the people who have that kind of voice. She wasn't interested. It's only logical to approach a different person with the same kind of voice.
It's kinda nasty for one person to monopolize work for all actors that have a similar voice to theirs just because she's the most famous of them.
100% this.
I'm in the exact same camp, but for some reason the HN crowd thinks that Scarlett has a right here because the other voice actor has a similar voice. Apparently there's an [archaic] law called the right to publicity (or something like that) that makes even working with someone with a similar voice illegal. According to that restrictive logic no one can do anything on Earth, as they might be doing/looking/sounding similar to someone else who might get offended, as everyone's offended by literally anything nowadays.
I frankly want to see a lawsuit of OpenAI vs Scarlett on this one, where OpenAI wins.
But it wasn't her voice, it was the voice of the impersonator. By that logic, the impersonator can never speak without authorization because the impersonator would use Bette Midler's voice.
They wanted people to either think that it was Bette Midler, or someone that sounded very like her, to gain the benefit of association with her. They wanted to use some of Bette's cultural cachet, without her permission.
That's because it was impersonating them, not sounding like them. If they didn't try to sell it as them they would have been fine
That's the problem here too, right? Sam implies the voice is Scarlett with his references to Her.
To me it all just shows these tech-bros are just spoiled little brats with strong incel energy. I 100% expect a scandal in a few months where it turns out Sam has a bunch of VR porn bots generated by AI that just 'happen' to look and sound EXACTLY like some celebs...
I think there's a pretty reasonable answer here in that the similarities to Her are quite obvious, and would be regardless of whose voice it was. If you wanted it to be SJ, reaching out right at the last minute seems rather odd, surely you'd reach out at the start?
There are three timelines that seem to be suggested here
* OAI want the voice to sound like SJ
* They don't ask her, they go and hire someone else specifically to sound like her
* They work on, train and release the voice
* OAI, too late to release a new voice as part of the demo, ask SJ if they can use her voice
This requires multiple people interviewed to be lying
Or
* OAI hire someone for a voice
* They train and release the voice
* People talking to a computer that reacts in this way is reminiscent of Her
* "We should get SJ as an actual voice, that would be huge" * Asks SJ
Or a third one, probably more middle of the road?
* OAI hire someone for a voice
* They train and release the voice
* People talking to a computer that reacts in this way is reminiscent of Her
* "Is this too similar to SJ? Should we ask them?"
* Asks SJ
Sure, though worth noting that they hired a Bette Midler impersonator to sing a cover of a Bette Midler song (edit - after asking and getting a "no")
To be honest, I'm not really that convinced it sounds like her
https://youtu.be/GV01B5kVsC0?t=165
https://youtu.be/D9byh4MAsUQ?t=33
Maybe they didn't need SJ to do anything to train the voicebot as it was already trained with the material from the movie.
(I'm in the camp that the Sky voice doesn't really sound like Johansson's)
Here's the thing though - if I was OpenAI, I'd be more interested in the actor sounding like the voice agent in Her than in Scarlett Johansson.
After all, Scarlett was playing a role in the movie (lending her voice to it), and they wanted to replicate this acted out role.
If the intent alone mattered, OpenAI should be in the clear. More so if they never specially instructed this voice actor to “sound like Scarlett”.
On the other hand, Sama reaching out to Scarlett directly a number of times doesn't look good. Perhaps they felt that Scarlett had already done it (acted out the voice agent they were trying to bring to life) and would truly understand what they were going for.
Maybe, it was also a bit for marketing and the buzz-worthy story it might generate (“OpenAI worked with ScarJo to create their life-like AI voice. Scarlett was also the voice behind the AI in “Her”).
However, I'm not a lawyer and the law could very well view this more favourably towards Scarlett.
I’ll admit that my cynicism is in overdrive but I wonder if OpenAI deliberately provoked this. Or at least didn’t mind it as an outcome.
More and more you see legal action as a form of publicity (people filing frivolous lawsuits etc), a lawsuit like this would help keep OpenAI looking like an underdog startup being crushed by the rich and powerful rather than the billion dollar entity it actually is.
"Mitch Glazier, the chief executive of the Recording Industry Association of America, said that Johansson may have a strong case against OpenAI"
Of course he does. The RIAA thinks nearly everything is illegal, and is in general mocked or critiqued on this site when it goes after some shared mp3s or whatever.
His opinion is neither authoritative nor informative.
OpenAI hires voice actress. Then it contacts famous actress for a license. Why. The implication is that OpenAI knows there is a potential legal issue.
Then later, when OpenAI promotes voice actress with an apparent reference to famous actress' work, and famous actress complains, OpenAI "pauses" the project. If OpenAI believes there is no issue, then why pause the project.
This behaviour certainly looks like OpenAI thought it needed a license from famous actress. If there is another explanation, OpenAI has not provided it.