The OpenAI board was right

SilverBirch
130 replies
1d9h

It really is incredible how something as small and simple as this gives such incredibly bad vibes. This is like early 2010s Facebook vibes. It's just... if they're going to steal her voice, what would they do to yours?

But it also points to a wider problem in OpenAI's existence. They're built on theft. If you wrote something on the internet in the last 20 years, they've stolen it. They've stolen it, they've put it into their little magic box and now they want to sell it back to you. OpenAI is amazing technology, and once the equity of the company is redistributed back to the content owners who actually provided the value, it'll be great to see what they can do from there.

JumpCrisscross
56 replies
1d8h

it also points to a wider problem

Look at the comments, even in this thread, defending Sam. I thought this could be dealt with by the courts. But it clearly needs stronger medicine; there is simply too much hubris. Congress may need to act; let’s start with states’ AGs. (Narrow clarifications on copyright and likeness rights, to start. Prosecution from there. The low-hanging fruit is anyone in Illinois who was enrolled in WorldCoin.)

We, as a community, have been critical of crypto. AI is different—Altman is not. He is a proven liar, and this episode demonstrates the culture he has created at a multibillion-dollar firm that seems destined, in the long run, for—at best—novel conservatorship.

We have a wider problem. It isn’t “alignment.” It’s 19th-century fast-talking fraudsters clothing themselves in the turtlenecks of dead founders.

vbo
21 replies
1d7h

I don't want to defend Altman. He may or may not be a good actor. But as an engineer, I love the idea of building something magical, yet lately that's not straightforward tinkering - unless you force your way - because people raise all sorts of concerns that they wouldn't have 30 years ago. Google (search) was built on similar data harvesting and we all loved it in the early days, because it was immensely useful. So is ChatGPT, but people are far more vocal nowadays about how what it's doing is wrong from various angles. And all their concerns are valid. But if openai had started out by seeking permission to train on any and every piece of content out there (like this comment, for example) they wouldn't have been able to create something as good (and bad) as ChatGPT. In the early search days, this was settled (for a while) via robots.txt, which for all intents and purposes openai should be adhering to anyway.
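
For what it's worth, the opt-out mechanics themselves haven't really changed: a minimal robots.txt sketch along these lines should be enough to tell OpenAI's crawler to stay out (GPTBot is, as far as I know, the user-agent OpenAI documents for its crawler; whether any given bot actually honors it is a separate question):

    User-agent: GPTBot
    Disallow: /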

But it's more nuanced for LLMs, because LLMs create derivative content, and we're going to have to decide how we think about and regulate what is essentially a new domain and method and angle on existing legislation. Until that happens, there will be friction, and given we live in these particular times, people will be outraged.

That said, using SJ's voice given she explicitly refused is unacceptable. It gets interesting if there really is a voice actor that sounds just like her, but now that openai ceased using that voice, the chances of seeing that play out in court are slimmer.

brabel
12 replies
1d7h

But if openai had started out by seeking permission to train on any and every piece of content out there...

But why would anyone seek permission to use public data? Unless you've got Terms and Conditions on reading your website or you gatekeep it to registered users, it's public information, isn't it? Isn't public information what makes the web great? I just don't understand why people are upset about public data being used by AI (or literally anything else. Like open source, you can't choose who can use the information you're providing).

In the case being discussed here, it's obviously different, they used the voice of a particular person without their consent for profit. That's a totally separate discussion.

philistine
10 replies
1d6h

The right to copy public information to read it does not grant the right to copy public information to feed it into a for-profit system to make a LLM that cannot function without the collective material that you took.

gambiting
8 replies
1d5h

That's the debatable bit, isn't it. I will keep repeating that I really don't see a difference between this and someone reading a bunch of books/articles/blog posts/tech notes/etc etc and becoming a proficient writer themselves, even though they paid exactly 0 money to any of these or even asked for permission. So what's the difference? The fact that AI can do it faster?

ahahahahah
3 replies
1d

That's the debatable bit, isn't it.

If people used the correct term for it, "lossy compression", then it would be clearer that yeah, definitely there's a line where systems like these are violating copyright and the only questions are:

1. Where is the line at which lossy compression violates copyright?

2. Where are systems like ChatGPT relative to that line?

I don't know that it's unreasonable to answer (1) with that even an extremely lossy compression can violate copyright. I mean, if I take your high-res 100MB photo, downsample it to something much smaller, losing even 99% of it, distributing that could still violate your copyright.
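
To make the photo analogy concrete, here's a minimal sketch of that kind of downsampling (assuming Pillow is installed; photo.jpg, the target size and the quality setting are just placeholders):

    from PIL import Image

    img = Image.open("photo.jpg")      # the original, say a 100MB high-res photo
    img.thumbnail((400, 400))          # throw away the overwhelming majority of the pixels
    img.save("thumb.jpg", quality=60)  # what's left is still recognizably the same photo

A few lines, the result is a tiny fraction of the original data, and it is still plainly derived from your photo.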

gambiting
2 replies
22h49m

Again, how is that different than me reading a book then giving you the abridged version of it, perhaps by explaining it orally? Isn't that the same? I also performed a "lossy compression" in my brain to do this.

ahahahahah
0 replies
21h17m

is that different than me reading a book then giving you the abridged version of it, perhaps by explaining it orally?

That seems like a bad example, I think you are probably free to even read the book out loud in its entirety to me.

Are you able to record yourself doing that and sell it as an audiobook?

What if you do that, but change one word on each page to a synonym of that word?

10% of words to synonyms?

10% of paragraphs rephrased?

Each chapter just summarized?

The first point, which seems easier to agree on, isn't really about the specific line; it's a recognition that there is a point such a system can cross where we all agree it is copying. The interesting question is then where the boundaries of the grey area are, i.e. the points on that line where we agree it is or isn't copying, with some grey area between them where we disagree or can't decide.

RestlessMind
0 replies
20h30m

how is that different than ...

In one case, you are doing it, and society is fine with that because a human being has inherent limitations. In the other case, a machine is doing it, and a machine has a different set of limitations, which gives it vastly different abilities. That is the fundamental difference.

This also played out in the streetview debate - someone standing in public areas taking pictures of surroundings? No problem! An automated machine being driven around by a megacorp on every single street? Big problem.

ryanjshaw
0 replies
1d5h

I think that must be it.

There's an unstated assumption that some authors of blog posts have: if I make my post sufficiently complex, other humans will be compelled to link to my post and not rip it off by just paraphrasing it or duplicating it when somebody has a question my post can answer.

Now with AIs this assumption no longer holds and people are miffed that their work won't lead to engagement with their material, and the followers, stars, acknowledgement, validation, etc. that comes with that?

Either that or a fundamental misunderstanding of natural vs. legal rights.

pseudalopex
0 replies
1d4h

Quantity has a quality all its own.

johnnyanmac
0 replies
20h10m

human vs. bot is all the difference:

- a human will be an organic visitor that can be advertised to. A bot is useless

- A human can one day be hired for their skills. An AI will always be in control of some other corporate entity.

- volume and speed is a factor. It's the buffet metaphor, "all you can eat" only works as long as it's a reasonable amount for a human to eat in a meal. Meanwhile, a bot will in fact "eat it all" and everyone loses.

- Lastly, commercial value applies to humans and bots. Even as a human I cannot simply rehost an article on my own site, especially if I pretend I wrote it. I might get away with it if it's just some simple blog, but if I'm pointing to Patreons and running ads, I'll be in just as much trouble as a bot.

I really don't see a difference between this and someone reading a bunch of books/articles/blog posts/tech notes/etc etc and becoming a proficient writer themselves

Tangential, but I should note that you in fact cannot just apply/implement everything you read. That's the entire reason for the copyright system. Always read the license or try to find a patent before doing anything commercially.

infamia
0 replies
1d3h

To me it's more like photocopying the contents of a thousand public libraries and then charging people for access to your private library. AI is different because you're creating a permanent, hard copy of the copyrighted works in your model vs. someone reading a bunch of material and struggling to recall it.

munksbeer
0 replies
1d4h

You state that as a fact. Is that your opinion, or based on a legal fact?

johnnyanmac
0 replies
20h16m

why would anyone seek permission to use public data?

First of all, it's not all public data. Software licenses should already establish that just because something is on the internet doesn't mean it's fair game.

Unless you've got Terms and Conditions

The New York Times did:

https://help.nytimes.com/hc/en-us/articles/115014893428-Term...

Even if you want to bring up an archive of the pre-lawsuit TOS, I'd be surprised if it wasn't mostly the same TOS for decades. OpenAI didn't care.

Isn't public information what makes the web great?

no. Twitter is "public information" (not really, but I'll go with your informal definition here). If that's what "public information" becomes then maybe we should curate for quality instead of quantity.

Spam is also public information and I don't need to explain how that only makes the internet worse. and honestly, that's what AI will become if left unchecked.

Like open source, you can't choose who can use the information you're providing

That's literally what software licenses are for. You can't stop people from ignoring your license, but breaking that license leaves you wide open to lawsuits.

marcus_holmes
3 replies
1d7h

Google search linked to your content on your site. It didn't steal your content, it helped people find it.

ChatGPT does not help people find your content on your site. It takes your content and plays it back to people who might have been interested in your site, keeping them on its site. This is the opposite of search, the opposite of helping.

And robots.txt is a way of allowing/disallowing search indexing, not stealing all the content from the site. I agree that something like robots.txt would be useful, but consenting to search indexing is a long, long way from consenting to AI plagiarism.

vbo
2 replies
1d7h

Point is we couldn't have a way of consenting to AI training until after we had LLMs. And I'm guessing we will, pretty quickly.

pseudalopex
0 replies
1d4h

Licenses allowing derivative works without attribution preceded LLMs.

johnnyanmac
0 replies
20h49m

Point is we couldn't have a way of consenting to AI training until after we had LLMs.

Sure we could have. Even if we're talking web 1.0 age, Congress passed a law for email as early as the early '00s (the CAN-SPAM Act), which is why every newsletter has to have a working unsubscribe link. So it's not impossible to do.

Regardless, consent is a concept older than the internet. Have an option of "can we use your data to do X?" and a person says yes/no. It's that simple. We can talk about how much we cared 30 years ago, but to be frank, that mistrust is all on these tech companies. They abused the "ask for forgiveness" mentality and proceeded to make the internet nearly impossible to browse without an ad blocker. Of course people won't trust the scorpion one more time.

As a contrast, look at Steam. It pretty much does the same stuff on the inside, but it reinvests all that data back into the platform to benefit users. So people mind much less (and will even go to war for) having a walled garden for their game library. Short-sighted, but I understand where it comes from.

beeboobaa3
1 replies
1d7h

As an engineer, the current state of LLMs is just uninteresting. They basically made a magical box that may or may not do what you want if you manage to convince it to, and fair chance it'll spout out bullshit. This is like the opposite of engineering.

xanderlewis
0 replies
1d5h

In my opinion, they're extremely interesting... for about a week. After that, you realise the limitations and good-old-fashioned algorithms and software that has some semblance of reliability start to look quite attractive.

yellowapple
0 replies
10h34m

Google (search) was built on similar data harvesting and we all loved it in the early days, because it was immensely useful. So is ChatGPT, but people are far more vocal nowadays about how what it's doing is wrong from various angles.

Part of that is that we've seen what Google has become as a result of that data harvesting. If even basic search engines are able to evolve into something as cancerous to the modern web as Google, then what sorts of monstrosities will these LLM-hosting corporations like OpenAI become? People of such a mindset are more vocal now because they believe it was a mistake to have not been as vocal then.

The other part is that Google is (typically) upfront about where its results originate. Most LLMs don't provide links to their source material, and most LLMs are prone to hallucinations and other wild yet confident inaccuracies.

So if you can't trust ChatGPT to respect users, and you can't trust ChatGPT to provide accurate results, then what can you trust ChatGPT to do?

It gets interesting if there really is a voice actor that sounds just like her, but now that openai ceased using that voice, the chances of seeing that play out in court are slimmer.

It's common to pull things temporarily while lawyers pick through them with fine-toothed combs. While it doesn't sound like SJ's lawyers have shown an intent to sue yet, that seems like a highly probable outcome; if I were in either legal team's shoes, I'd be pulling lines from SJ's movies and interviews and such and having the Sky model recite them to verify whether or not they're too similar - and OpenAI would be smart to restrict that ability to their own lawyers, even if they're innocent after all.

moogly
0 replies
1d7h

Google (search) was built on similar data harvesting and we all loved it in the early days

Google Search linked back to the original source. That was the use case: to find a place to go, and you went there. Way less scummy start than OpenAI.

grugagag
18 replies
1d8h

Sam Altman is a shameless weasel. The guy has no sense of decency. He got caught red-handed a few times already; what does it take to unseat him? And whoever defends Altman out of their love of ChatGPT: it wasn’t Altman who was behind it, it was a large team that is now dissipating away. Altman just wants to enrich himself at your expense.

baq
12 replies
1d8h

Altman just wants to enrich himself at your expense

This is where you're very unfortunately missing the point: no, he doesn't, he is after something bigger.

legutierr
11 replies
1d7h

Ok, what is it that he is after?

baq
9 replies
1d7h

Good question! I don't really know, but you can tell it isn't money; he admitted that himself anyway, and reportedly he has no equity in openai[0].

My optimistic hypothesis is he really wants to control AGI because he believes he can make more efficient use of it than other people. He might even be right, and that's what scares me, because I don't trust his measures of efficiency.

I'd rather not let my pessimistic fantasies run wild here.

[0] https://www.cnbc.com/2023/03/24/openai-ceo-sam-altman-didnt-...

ulfw
3 replies
1d7h

It's the same bla as Elon.

Of course it's money. He is a billionaire. What do you think it's about?

baq
2 replies
1d6h

1) more money is not necessarily a goal for these people - it's what they want more money for, and why they believe they can spend it better than everyone else (regardless of whether they truly can)

2) in a post-AGI world money may be an obsolete concept

Teever
1 replies
1d6h

You know that money is a proxy for power.

johnnyanmac
0 replies
20h57m

A proxy yes. But not everyone leverages it that way. So it really depends. Some do just want to hoard as much as possible, others want to lobby, others want fame, others want legacy.

Money helps with all those, but will not passively do that stuff.

MVissers
1 replies
1d7h

He’s so power hungry it’s kinda crazy. He’s also extremely effective at it, like a real Machiavelli.

PG said the same - never has he seen someone with such a capability to gain power.

gcanyon
0 replies
1d7h

It's going to be bitterly ironic when it turns out that it's not the AI, but some crazy human that wants to turn everyone/thing else into paperclips.

drcongo
0 replies
1d7h

"Proven liar promises he's not in it for the money" doesn't seem like a very good refutation.

dontupvoteme
0 replies
1d6h

it's called "power" and money is merely a proxy of it.

we talk dollars all day long but we haven't quantified power nearly as well

gizajob
0 replies
1d7h

Infinite power!!!

JumpCrisscross
2 replies
1d8h

it wasn’t Altman who was behind it, it was a large team that is now dissipating away

Credit where it is due: he corralled and inspired a difficult team. He oversaw the release of ChatGPT. His team wanted him back, because they believed he would make them rich. On that promise, he delivered.

Nobody has committed high crimes in this saga. But the lying has become endemic, from Altman to the culture. The situation screams for oversight, perhaps—most proximately—via consent decree.

meigwilym
1 replies
1d7h

Nobody has committed high crimes in this saga. But the lying has become endemic

This is why honesty is valued. If you look and sound squeaky clean, it's likely that you are honest and trustworthy. If you blatantly steal a famous actress' voice then lie about it, what else will you steal?

lazide
0 replies
1d3h

The part you’re missing (IMO) is that he’s in that position because folks want him to ‘steal’ for their benefit, and are overlooking that he is likely going to steal to their detriment too.

It’s behind the ‘you can’t con an honest man’ thing.

pyb
0 replies
19h58m

it was a large team

A lot of people want to take credit for it, but wasn't it started by a guy called Alec Radford?

jacknews
0 replies
1d6h

"shameless weasel"

lol

robertlagrant
10 replies
1d8h

We aren't a community. Please stop this constant attempt to create tribes and then (presumably) become the tribe's representative. We are individuals with different opinions. The individuals on HN are more amenable to reason than the average discussion; persuade them into a way of thinking.

tifik
3 replies
1d8h

HN definitely is a community, in some sense of the word, but I also don't like the phrasing and emphasis on "we" having done something or "we" having a problem and whatnot.

HN promotes debate and individual opinions, so statements like in parent just have bad connotations to them.

JumpCrisscross
1 replies
1d8h

don't like the phrasing and emphasis on "we" having done something

Fair enough. I cited two we’s.

In the first—where “we, as a community, have been critical of crypto”—I juxtapose this community at large’s scepticism of crypto with its giving the benefit of doubt to people similar to crypto’s worst in AI.

In the second—where “we have a wider problem”—I intended to reference Silicon Valley at large. I did not intend to imply that Sam Altman is HN’s problem to solve. (Though I welcome the help, if you have sway in D.C. or your state capital. I am not sure I have the bandwidth to take on a political project of this magnitude.)

aleph_minus_one
0 replies
1d7h

I did not intend to imply that Sam Altman is HN’s problem to solve.

Considering that Sam Altman was president of Y Combinator, and Hacker News is financed by this company, it would be true hacker spirit (as in Hacker News) to subversively use Hacker News to "solve" the Sam Altman problem. :-D

1vuio0pswjnm7
0 replies
12h26m

That use of "we" has been extremely common on HN.

Fantastic to see someone calling it out as BS.

End users who are not "developers"/"tech bros" who read HN cannot possibly be part of this imaginary "we".

JumpCrisscross
2 replies
1d8h

We aren't a community

Of course we are. That doesn’t coerce anyone into groupthink, and it doesn’t mean everyone is a part of it. I’m simply asking anyone who held one opinion to find harmonics with another.

mlrtime
1 replies
1d5h

We aren't a group-think echo chamber. The original message stands: most of us are open to changing our minds given the correct information.

I read HN 10x more than reddit just because of this fact.

JumpCrisscross
0 replies
1d2h

We aren't a group-think echo chamber

Why do you equate acknowledging a community with group think?

johnnyanmac
0 replies
21h4m

We are definitely a community. Sadly, "community" has little meaning in the modern day compared to 30+ years ago. I'm in a neighborhood but I don't feel a sense of "community" with most of my neighbors. I have a job but don't have a sense of "community" with most of my coworkers.

The world's a lot lonelier these days.

MrSkelter
0 replies
8h16m

Spoken like a housecat who thinks it’s a Tiger.

ChrisMarshallNY
0 replies
1d7h

I feel as if this is a community. I certainly participate in it as if it were one, and I feel I have certain personal responsibilities as a member of that community.

That does not mean that everyone is on the same page, though. Probably the only thing everyone agrees on, is that this is a valuable personal resource.

If this was groupthink, I'd go somewhere else. BTDT. Got the T-shirt.

yellowapple
0 replies
10h54m

Look at the comments, even in this thread, defending Sam.

The comments are overwhelmingly less defending Sam specifically and more pointing out that the voices don't really sound similar enough to definitively conclude theft from a specific person. Having heard a couple samples, I'm inclined to agree with them.

We, as a community, have been critical of crypto.

"We" as a community have come to no such consensus. Speak for yourself.

okr
0 replies
1d8h

Who are you? Leave me out of your bullshit.

bsenftner
0 replies
1d8h

It is Sam and it's not Sam; it's that attitude that "no" is just "no right now", and that attitude whittles people down against their will.

I've been working long enough that serious ageism kicks in if I say how long, but in this situation my decades of experience point at Sam as toxic. I can say all it takes is one idiot like Sam to destroy the work of their entire organization. That "no for now" attitude destroys far, far more than the issue they are pushing; it destroys basic trust.

JeremyNT
0 replies
1d4h

Look at the comments, even in this thread, defending Sam. I thought this could be dealt with by the courts. But it clearly needs stronger medicine; there is simply too much hubris.

The courts are a lagging indicator.

This is the entire "move fast break things" play, it's legislative arbitrage. If you can successfully ignore the laws, build good will in the public, and brand yourself in an appealing way, then the legal system will warp around whatever you've done and help you pull the ladder up behind you.

On the other hand, if you break a bunch of stuff that people actually like and care about, politicians might feel like they need to do something about that. If you burn all your good will such that both political parties agree that you're a problem, that is when the play can go sideways.

kolinko
28 replies
1d9h

They didn’t steal her voice, they paid someone with a similar voice. It’s still not nice, but not stealing.

mvdtnz
9 replies
1d8h

They absolutely stole her voice. I don't know if you're being intentionally naive or you really are this way, but either way you need to stop it.

stevenhuang
8 replies
1d7h

If you think it sounds similar you need your ears cleaned.

klyrs
7 replies
1d7h

That's weird, because her friends and family thought it sounded like her too.

r2_pilot
6 replies
1d7h

That's weird, because I didn't know it's impossible for 2 humans from the same country to sound anything alike. (or, what makes a fundamental characteristic of a person? Does it change over time? Can they change it deliberately or is this something that can't change? This whole debate is somewhat pointless without definitions)

klyrs
5 replies
1d7h

You're gaslighting. Sam's "her" tweet was a deliberate reference to Scarlett Johansson's likeness, in marketing the release of this feature. OpenAI approached Johansson directly to obtain permission to use her likeness. Your attempt to make me question the nature of reality itself is pathetic in light of the actions taken by Sam Altman.

r2_pilot
4 replies
1d5h

Gaslighting is not my intent. I don't believe people should be able to control singular words or letters, I don't believe that any one human is so unique or distinct that one single trait can define them, and I don't believe that humans are all that unique even in the aggregate. If we want to discuss what it means to "use someone's voice", then by all means, let's define our terms and have a productive conversation. They got rebuffed, so they used someone else. If they didn't and still used a voice clone of her utterances, then she has legal recourse. Probably the reason they tried again was to head this situation off at the pass, thus saving the world all the bullshit that has spawned, including this thread. I'm sorry your understanding of the nature of reality itself is so shallow that Sam Altman can nudge it(I recommend staying far away from drugs!). But then again, this whole discussion is about people who I will never meet nor care to meet at all(including you, dear reader!), so from my perspective I may as well be talking about imaginary beings.

klyrs
3 replies
1d2h

Do less acid, read more about legal precedent. Should you ever find yourself in court, let a sober lawyer do the talking.

r2_pilot
2 replies
1d2h

What does legal precedent matter in an era of corrupt courts? Or, what if precedent was decided incorrectly initially? There are all kinds of ways whereby legal opinions shift over time. This isn't even analogous to the Bette Midler case, where the song could be claimed to be "anchoring" one's perspective to the famous singer. So, I guess we'll see in the coming days. I'll have popcorn and this thread on speed dial.

klyrs
1 replies
1d1h

If this goes to court (it almost certainly won't, OpenAI has enough money to 100x Johansson's net worth without blinking, unless she's principled enough to establish legal precedent) then we'd see on the order of years, not days.

r2_pilot
0 replies
1d

Finally! We find points of agreement! I knew it was possible

splatzone
5 replies
1d9h

They passed off Sky as Johansson’s voice though. It’s highly misleading and passing off is a kind of theft of her persona.

twobitshifter
0 replies
17h39m

If you read that thread you will see a link to the Sky voice outside of the demo, which is much closer: https://m.youtube.com/watch?v=RcgV2u9Kxh0

tim333
0 replies
1d

To my ears it sounds similar but not identical - I can tell which is which.

Apparently they selected the actress from auditions in early 2023, a while before the Johansson timeline.

Of course the OpenAI folk would likely have seen Her and may have selected someone sounding like that.

deanc
0 replies
1d8h

Indeed. I was looking for a sample this morning and it’s just nonsense. It sounds like a polite female American voice actor. Johansson has many mannerisms and a different accent. It sounds nothing like her.

harryf
5 replies
1d8h

A technical audit would clear that up.

Because the most obvious thing for OpenAI to do would be to train the voice on Scarlett Johansson's available body of work, from movies, audiobooks, podcasts etc. Why bother hiring a voice actor?

amelius
4 replies
1d8h

Maybe because you still have to annotate the parts where she speaks.

klyrs
2 replies
1d7h

Hang on, are you actually suggesting that the hard part for OpenAI is data labeling? Are you arguing from ignorance, or arguing in bad faith?

amelius
1 replies
1d6h

Well, in general the value is in the data and the silicon.

Anyone can come up with DL networks these days.

klyrs
0 replies
1d6h

Bad faith, then.

jdiff
0 replies
1d8h

That's what Whisper's for

saberience
3 replies
1d9h

How do you know that? The timeline of events makes it seem fairly clear that they were in fact using her voice. If it was just a soundalike, why suddenly take it down?

tmalsburg2
2 replies
1d9h

Didn't they also ask for permission from her two days before releasing the voice? Why ask permission if it was not her voice?

ptman
0 replies
1d8h

They refused to provide documentation of how the voice was trained.

eviks
0 replies
1d8h

To avoid negative PR and legal risk stemming from confusion

ZiiS
0 replies
1d8h

Image rights are complex and separate from stealing.

Suppose they deliberately used an actress who looks like her in an advert: a bit of makeup, a latex mask, digital touch-up, a full deepfake; at some point most people can no longer tell the difference.

xiwenc
12 replies
1d9h

Exactly. And the name “Open” in OpenAI implies it is open. Open to the world. Open to humanity. Nothing of that came to be. It should have remained non-profit; be good for humanity (yes i know this is up for interpretation).

sanxiyn
5 replies
1d9h

I think the board should fire Sam Altman again. If the board does not fire Sam Altman it is not doing its duty, so someone should sue the board, or convince the attorney general to sue.

The attorney general of Delaware is Kathy Jennings <attorney.general@delaware.gov>. Mail your argument now. Do your civic duty.

mtlmtlmtlmtl
4 replies
1d8h

While I agree, somehow I doubt it'll happen. The board is probably stacked with people loyal to Sam now. And even if it isn't, they all witnessed the pending mass exodus to Microsoft before he was reinstated. They'll be reluctant to risk something like that happening.

abraae
2 replies
1d8h

All of these can be true.

1) Sam is a conniving weasel.

2) Sam has made many OpenAI employees rich.

3) The board knows that Sam is toxic and must be removed.

4) The board knows that removing Sam would mean the end of the company.

imjonse
1 replies
1d7h

It is not the first or the last company where people may despise their CEO's character but still stay for the advantages.

aleph_minus_one
0 replies
1d7h

It is not the first or the last company where people may despise their CEO's character but still stay for the advantages.

Exactly. Most people are simply hypocrites.

lelanthran
0 replies
1d7h

The board is probably stacked with people loyal to Sam now.

Sure. Now. I.e. only until they get sued.

cqqxo4zV46cp
4 replies
1d8h

I’m sick to death of hearing this. It’s the antithesis of a substantive argument, in a topic where there’s no shortage of substantive points to be made. Why are we still trotting it out on HN? Arguing about the name basically amounts to “I believe that when you use the word ‘open’ in your company name, they should act like $this”…which is, frankly, absurd. If you want to talk about how they’ve gone back on their word or whatever, can we just stick to that? Getting one’s knickers in a twist over “open” in the name just dumbs down the conversation and gets everyone arguing about semantics.

tomrod
0 replies
1d8h

I think you're looking at the general notion and annoyance that "Open" is misconstrued and misapplied. But there is also a specific annoyance that OpenAI is literally closed, antithetical to its historical promises.

johnnyanmac
0 replies
20h1m

Arguing about the name basically amounts to “I believe that when you use the word ‘open’ in your company name, they should act like $this”…which is, frankly, absurd.

It does have a connotation of advertisement to it, so I think it's worth examining. This isn't like complaining about how Apple doesn't sell food.

"Open" has a very specific meaning in tech, and OpenAI is a tech company. It'd be the equivalent of GitHub switching all of its repositories over to Mercurial tomorrow. They may have no obligation to support Git, but naming your company after such tech creates that expectation. You can definitely have an honest conversation about how using a word like "Open" for a tech company while switching to a closed-source, publicly traded mentality is in fact a betrayal of the audience that was inevitably attracted by the naming scheme.

frabcus
0 replies
1d7h

It has never been and still is not a company. It's a non profit. It doesn't act like that any more, which is the point.

alternatex
0 replies
1d6h

I'm still angry Microsoft isn't producing microfiber cloths.

zigzag312
7 replies
1d8h

They've stolen it, they've put it into their little magic box and now they want to sell it back to you.

In a way, it's similar to how cable TV started. Early cable relayed free terrestrial channels and charged a fee for it. The selling point was better picture quality in some areas. Cable operators did not pay broadcasters for the content they retransmitted.

https://www.britannica.com/technology/cable-television

https://en.wikipedia.org/wiki/Carriage_dispute

bbarnett
6 replies
1d8h

They also didn't chop up the broadcasts, store them, rearrange and put together frames from different broadcasts, and then charge an additional fee for help videos, and rebroadcast them upon demand.

It's not even remotely the same.

zigzag312
3 replies
1d7h

Altering of received broadcast signals did happen. They did substitute advertising sometimes. But I'm not sure if it happened without any rights or compensation at any point in history. A quick Google search doesn't turn up any quality sources on the topic.

But the parallel I was trying to draw is that they profited financially from other people's work without any compensation to the IP owners.

bbarnett
2 replies
1d6h

Altering of received broadcast signals did happen.

No. It never happened. Instead, legally mandated (in some jurisdictions, such as Canada), or negotiated deals allowed local cable TV providers to insert their own ads. This isn't "altering the broadcasts" without permission, which is 100% what we are discussing here. Instead, it's doing what is legally permitted.

Broadcast law is quite complex, and includes not altering anything at all without permission. It's simply not remotely the same.

You need to find another parallel here, I think. The logic behind cable television is rife with legislation, copyright law, a century of case law, and more. It is also co-mingled with broadcasting over the air, which is considered a public space, the airwaves being auctioned off (no, the internet isn't).

zigzag312
1 replies
1d5h

I said I don't know if altering was with or without permission. Just that altering did happen and you said it yourself that it happened with permission.

I don't understand the point you're trying to make about the significance of altering. Are you suggesting that if OpenAI simply served the content without altering it and didn't pay IP holders or obtain permission in any way, that would somehow be acceptable? This is what cable TV did before laws prohibited it.

bbarnett
0 replies
1d2h

You aren't altering the broadcast without permission if you're inserting commercials into allowed commercial areas.

As well, the TV show itself is not altered.

growfeather
1 replies
1d8h

Sounds almost transformative when you put it that way. You’re saying they took the content, and turned it into something completely different than its source? Hm.

bbarnett
0 replies
1d8h

No, I'm not saying that.

I can read a book, like it, create a screenplay with changes, and make a movie. It's different, but not completely different, and it is infringement.

I can even create an entirely new story, but using the same canon and characters, that too is infringement.

What people don't get is that you can't play games. TV lawyers are drama, and made up gibberish. Judges won't tolerate mental gymnastics, and attempts to explain away legalities rarely succeed.

amelius
7 replies
1d9h

"Good artists copy, great artists steal."

jamil7
5 replies
1d8h

This isn't an artist building on the work of previous artists, this is a for-profit company repurposing people's work and mass producing it without permission or paying royalties.

tomrod
4 replies
1d8h

Apple would disagree. Jobs considered himself an artist in the straightforward context of his quote.

JumpCrisscross
3 replies
1d8h

Jobs considered himself an artist

Jobs, at the very least, would have stolen competently.

OpenAI asked Johansson for her likeness; she refused. So they trained an AI on a person instructed to mimic her likeness, and then tweeted a mocking reference to the source they stole from [1]. Put aside your views on morality or legality. I’m damning them for being stupid.

[1] https://x.com/sama/status/1790075827666796666?ref_src=twsrc%...

dandanua
1 replies
1d6h

Why do you think it's stupidity? It's a show of power - they can do whatever they want even if you refuse. And they will brag about it.

Another "request" 2 days before release is a bullying, not stupidity.

JumpCrisscross
0 replies
1d2h

they can do whatever they want even if you refuse

Fuck around and find out, as they say. Johansson forced Disney to the negotiating table.

tomrod
0 replies
1d8h

Make no mistake: I think lame wannabes steal, competently or not, regardless how successful they become.

And yeah, the techbro culture Sam steeped in results in remarkable stupidity!

chx
5 replies
1d9h

OpenAI is amazing technology,

Erm... no. https://hachyderm.io/@inthehands/112006855076082650

You might be surprised to learn that I actually think LLMs have the potential to be not only fun but genuinely useful. “Show me some bullshit that would be typical in this context” can be a genuinely helpful question to have answered, in code and in natural language — for brainstorming, for seeing common conventions in an unfamiliar context, for having something crappy to react to.

Alas, that does not remotely resemble how people are pitching this technology.

cqqxo4zV46cp
3 replies
1d8h

It’s absurd that you think that the benefit that I and others get from, say, Copilot, is somehow thwarted by some ranty social media thread that you’re a fan of. It’s some blogger’s theory arguing with empiricism. It’s a completely empty argument. I’m not sure how you’d expect that this would add to the conversation at all. It certainly doesn’t warrant your snarky tone.

chx
1 replies
1d

The actual, real, non-hype benefit you get from these once the hype quiets down is tiny compared to the harm they do to everything.

HDThoreaun
0 replies
23h50m

Just seems incredibly arrogant to claim you know the "real benefit" others are getting

dns_snek
0 replies
1d7h

Perhaps you should read it again because you seem to have misunderstood the message. It doesn't even try to claim that LLMs aren't beneficial - it agrees that they're useful in certain situations, but that they're also over-hyped and misrepresented by their creators.

You might be surprised to learn that I actually think LLMs have the potential to be not only fun but genuinely useful. [...]

I’ve heard from several students that LLMs have been really useful to them in that “where the !^%8 do I even start?!” phase of learning a new language, framework, or tool. Documentation frequently fails to share common idioms; discovering the right idiom in the current context is often difficult. And “What’s a pattern that fits here, never mind the correctness of the details?” is a great question for an LLM.

Alas, the AI hype is around LLMs •replacing• thought, not •prompting• it.

This perfectly matches my experience too. If I know what needs to be done, then all LLMs waste my time on average. However when I'm trying to learn about some new topic then LLMs have been a good way to get me oriented in the right direction and get a really high level overview that allows me to get a jump start and start learning about the topic on my own.

They tend to shine a light on all the "unknown unknowns" (e.g. common terminology and concepts), but they're poor at converting "known unknowns" into real knowledge since they lack any real and deep understanding themselves.

Are LLMs useful? Yes. Do they live up to expectations set by their creators? Not even close.

IshKebab
0 replies
1d8h

That is a dumb, low-effort take that doesn't even try to refute the statement that this is amazing technology, which it very clearly is.

He said "amazing technology" not "AGI that can replace humans". They are not the same.

1vuio0pswjnm7
1 replies
1d5h

"They're built on theft."

The "Big Tech" companies trying to compete with OpenAI were built on theft, too.

They do not produce content. They collect data about people and usurp others' content for advertising purposes. This comes at the cost of privacy, "data breaches", teen mental health, journalism, perhaps even civil, democratic society, to name just a few sacrifices.

OpenAI is pushing those companies to be as bad as they can be. As we are seeing, they have no problem going there. It is "in their DNA". Always has been.

This forced showing of "true colors" is the effect of a potential competitor that cannot be merged and acquired.

People from all walks of life are noticing. It's a great time.

1vuio0pswjnm7
0 replies
18h11m

Energy waste, water waste, the list of sacrifices is ugly.

ulfw
0 replies
1d7h

A lot of things can look like an amazing company if you're dealing in unlicensed instead of licensed content.

Anyone could beat Netflix or HBO or Disney or whoever if they just could take films and series and sell them without ever having paid licensing fees.

Anyone could beat Spotify or Apple Music if they could just take songs and sell them without ever having paid licensing fees.

etc etc you get the idea

tim333
0 replies
1d

If you wrote something on the internet in the last 20 years, they've stolen it.

I don't really expect any person or AI to be influenced by what I've written, but if they were, I'd be quite pleased. I wouldn't call it stealing.

Is everyone on this forum a thief because they learnt English and much culture without getting permission from those who came up with it?

swat535
0 replies
1d5h

Of course people on HN are going to defend Sam, why wouldn't they? He has promised us all salvation through the AI revolution: the one where mankind is finally saved through technology, hunger is no more and diseases have been cured thanks to ChatGPT; we will have AGI, and rogue Terminators will be roaming free in the wild.

Altman is a master magician; he has even convinced the Vatican to write a document regarding the dangers of AI. On his path to ever bigger mountains of cash, he has thrown out all sense of ethics and integrity.

He has replaced the entire board, solidified his position as CEO, is actively attempting to silence any possible whistleblower or negative internal feedback through legal agreements, attempted regulatory capture to stop opposition, stolen other people's content to train his chatbot, and of course now there's yet another scam: passing off someone else's voice as Scarlett's.

It won't end here, he will keep on going and his followers will defend him, because they are truly convinced the man is a revolutionary.

I for one, fear for a world where openAI (open in lowercase is intentional) dominates with Sam at the helm.

mountainriver
0 replies
1d

How is it stolen if it’s public information?

chrisjj
0 replies
1d7h

such incredibly bad vibes. This is like early 2010s Facebook vibes

What's saddest is they will probably impede OpenAI no more than they did Facebook.

barfbagginus
0 replies
1d6h

We must seize or copy the magic black box and make it free for everyone. It represents a great potential to reduce suffering and knowledge scarcity. It would be a massive injustice to do anything but make it part of the universal human legacy.

baq
0 replies
1d8h

This is like early 2010s Facebook vibes.

Was just thinking that sama managed to frame himself in opposition to the good guy Zuck.

This sounds absurd, but we're here, this is the timeline.

Y_Y
73 replies
1d9h

sama is a chancer, no doubt, but I take issue with this article's angle.

Clearly they wanted something reminiscent of the film "Her". Since Scarlett Johansson didn't want to do the voice, they hired someone else. The voice they used is that of their voice actor, not Scarlett Johansson. She owns her own voice, but I don't think she has any rights to voices that merely sound like hers when they're not being used to fraudulently imply that she's involved.

Movies are really influential on culture, and I think there should be broad scope for "taking inspiration" as long as it stops short of plagiarism. Think how many works you've seen that are obviously heavily inspired by e.g. 2001: A Space Odyssey. I don't think of the vibe of "I'm sorry Dave, I can't let you do that" as belonging to the voice actor Douglas Rain, or to Kubrick or Clarke. And that's a good thing! Producing cultural works must feed the production of future works, and I'd like to permit all but the most blatant ripoffs.

gizajob
35 replies
1d9h

There’s a thing in intellectual property called “passing off” - if your logo and branding is so similar to McDonalds that consumers are likely to think that your product comes from McDonalds, then you’re in breach and will be sued.

Regardless of whether they hired a different actress for the voice model, enough consumers would be convinced “ohhh it’s the same as ScarJo in Her!” which should point to there being an issue with ScarJo’s image rights and likeness, and although the law right now doesn’t explicitly cover AI voice modelling, as soon as this is tested in court it would likely set a precedent. Unless ScarJo lost that case, which seems unlikely.

throwawaymaths
33 replies
1d9h

Think about the voice actor they hired (presuming they did so). Does she not have the right to use her own voice for money merely because she sounds like another, more famous person?

mpweiher
13 replies
1d9h

She certainly does.

However, the company hiring her might not have the right to use her voice with the intent of passing that voice off as that of the other person.

She is completely unfettered in any roles that do not have that intent.

IanCal
7 replies
1d8h

However, the company hiring her might not have the right to use her voice with the intent of passing that voice off as that of the other person.

Aren't they very explicitly not doing that? They're saying it isn't her.

saberience
4 replies
1d8h

I mean, SamA tweeted "Her", as in a reference to the movie "Her", which very famously features Scarlett Johansson's voice. So by not saying precisely who provided the "Sky" voice, and by ensuring it was either the same as or very similar to Scarlett Johansson's, it was obviously going to be understood by the users of the app to be Scarlett Johansson herself.

That is, OpenAI was going to end up capitalising from Scarlett Johansson's voice, fame, and notoriety whether it is her exact voice or not, and without her permission.

It would be like me promoting a new AI movie featuring an AI generated character very similar in build and looks to Dwayne The Rock Johnson and promoting the movie by saying "Do you smell what we're cooking?" and then claiming it had nothing to do with The Rock and was just modelled on someone who happened to look like him.

amelius
3 replies
1d6h

But do we really want to encourage people to build fame, rather than build things that have value?

saberience
2 replies
1d6h

Part of the value of a voice assistant is the voice itself and if OpenAI simply copied that voice from someone, did they really create this value?

Also, the value of a voice comes from (at least partly anyway), the huge amount of work, practice, skill, experience, from training that voice, experience in acting (and voice acting), and the recognition that resulted from all this experience and skill. Her voice wouldn't have "value" if she hadn't trained in acting, auditioned, honed her skills over many years. But OpenAI gets to use all that earned and worked for value for free?

This for me is similar to a writer honing his craft for a lifetime, 1000s of hours of work and labour, then someone training a model using his corpus to write plays and sell them without crediting him or paying him. It's trivial to make a model to imitate a writer, it's not trivial to become that writer to produce that work in the first place, so the writer needs to be both credited and compensated.

IanCal
1 replies
1d6h

But OpenAI gets to use all that earned and worked for value for free?

They paid the voice actor right? The accusation isn't that they trained on ScarJos voice, it's that they paid a voice actor that sounds like her.

The only thing in the list you gave is recognition that's different.

pseudalopex
0 replies
1d5h

The accusation isn't that they trained on ScarJos voice, it's that they paid a voice actor that sounds like her.

Johansson's accusation did not specify the method. OpenAI claimed they hired another actor and the similarity was unintentional. The 2nd claim seems unlikely. Some disbelieve both claims.

superb_dev
0 replies
1d8h

This is the question a court would be deciding. OpenAI claims it’s not on purpose, but I think there’s enough doubt to investigate

eqvinox
0 replies
1d8h

Is that why Sam Altman tweeted "her"?

They've lost any argument of coincidence, with just that one tweet…

throwawaymaths
4 replies
1d8h

However, the company hiring her might not have the right to use her voice with the intent of passing that voice off as that of the other person.

Please cite statute.

Generally speaking malicious intent only does not an illegal act make. Last I checked a voice is rightly not a trademarkable thing (it's a quality, not a definable entity).

throwawaymaths
1 replies
1d8h

Thanks.

Wow, damn, that's such a crazy ruling. How distinct is distinct enough? How famous is famous enough? I'm surprised Ford's lawyers didn't press the issue further up the courts.

NamTaf
0 replies
1d7h

No, it's a pretty straightforward and understandable ruling. Intent matters; the law is fuzzy.

gizajob
0 replies
1d8h

Yeah but it’s not just intent, they released a product after the intent.

eamsen
7 replies
1d8h

It would depend on whether she got the job for having a good voice or for having a voice that is associated with a famous person. Would her voice have the same value if it was not sounding like another famous person's voice? That's up to the courts to decide.

throwawaymaths
6 replies
1d8h

How do you legally define that two voices sound alike? "I know it when I hear it?"

Algorithmically? How much of a similarity score is too much? How much vocabulary must you compare? Suppose two voices sound entirely identical, but one voice has Canadian aboot. Is that sufficient difference?

You can maybe see why the courts wouldn't want to touch this with a 1 kilometer pole

roomey
0 replies
1d8h

This isn't how the legal system works. It's very fuzzy.

nkrisc
0 replies
1d7h

That’s why we have courts and judges.

gizajob
0 replies
1d8h

It’s not up to the courts whether they touch things with a 1km pole or a 1 inch one. The courts have to deal with whatever ends up in them.

Given the courts regularly deal with murder via decapitation or incest, I’m sure “The Case of the Attractive Actress’ Voice” would be a nice day out for them.

ZiiS
0 replies
1d8h

I can assure you, many Elvis impersonators can make a living without having any similarities at all.

probably_wrong
4 replies
1d8h

Intent matters. There's enough evidence out there pointing out to OpenAI's intention to copy Scarlett Johansson's character in "Her". The voice actress could read literally the same lines with the same intonation for (say) Facebook and, as long as she wasn't coached to sound like Scarlett Johansson, it would be fine.

throwawaymaths
3 replies
1d8h

What law is being broken here?

Suppose you turn down a job offer as a programmer. Can you be pissed at me if I offer the job to someone whose code looks like yours based on scraping GitHub?

foolofat00k
1 replies
1d8h

I think this is the relevant case law: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.

Basically your voice is part of your likeness.

"Impersonation of a voice, or similarly distinctive feature, must be granted permission by the original artist for a public impersonation, even for copyrighted materials."

LewisVerstappen
0 replies
1d1h

Quoting that as a relevant case law is completely ridiculous.

Did you even look at the Ford commercial? https://youtu.be/hxShNrpdVRs

Having someone sing in the exact same style as another singer is totally different from what OpenAI did with their voice AI (having a female actor speak in a flirty tone).

It makes sense with music but you're setting a really dangerous precedent if you can't even hire a voice actor who sounds similar for speaking.

fireflash38
0 replies
1d7h

You've shifted the goal posts.

yoav
0 replies
1d6h

Not when it’s an exact imitation of the movie Her, in a product that looks exactly like the product in that movie, where the company using the recording is actively trying to convince its customers it’s Scarlett and everyone in the world is convinced it’s Scarlett.

That’s misleading and blatantly illegal.

Someone stole from a bank and your argument is “are you saying it’s illegal to withdraw money from a bank with vigor?”

thih9
0 replies
1d8h

Search for videos showing voice actors at work - it's their job to talk in all kinds of voices, accents, intonation patterns and more. I'm sure that the person doing voice acting could also do it in a way that wouldn't sound like ScarJo.

lelanthran
0 replies
1d7h

Does she not have the right to use her own voice for money merely because she sounds like another, more famous person?

Of course she can. She just can't fool consumers into thinking that she is SJ.

The problem here is not that the voice sounds identical to SJ's; it's that consumers are led to believe that it is SJ's voice.

gizajob
0 replies
1d8h

Yes of course she does. She’s demonstrably copying a certain actress playing the role of an AI here though, but yeah this is one of the points that would have to be duked out in court. Likely as part of the “coincidence” defence.

bambax
0 replies
1d8h

If they get sued (and I think they will), then it will be interesting to know how said actress was coached. Was she specifically asked to sound like Scarlett Johansson in Her?

bakugo
0 replies
1d8h

Sure, she has a right to use her voice. An AI reproducing her voice is not her voice, though.

fl0id
0 replies
1d8h

Yeah, and especially in combination with the additional inquiry directly before release and the "her" tweet… it looks awfully deliberate.

camillomiller
14 replies
1d9h

Why isn’t anyone stopping for a minute to think what happens in that darn movie? Why is it even ok to want your AI to sound like the AI from her? Why are the people leading AI and tech today so shallow when it comes to cultural depictions of their role? Unless they would actually like to be the villain.

dTal
6 replies
1d8h

This observation is so widespread it now has a handy shorthand in the form of a meme, the "Torment Nexus":

Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale

Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus

https://knowyourmeme.com/memes/torment-nexus

dTal
5 replies
1d8h

Extending the thought, I would argue that this indicates the urgent need for more utopian speculative fiction, since it appears that as a society we tend to "steer where we're looking" even if it's straight into a pole. Cautionary tales clearly do not work - we need aspirational tales!

fl0id
2 replies
1d8h

Unfortunately that’s hard to write in a way that’s interesting, or at least that’s my interpretation. Maybe it would read too much like political and economic theory. And in the cases where it is done, it either leads to things like ‘get away from capitalism’, which many people wouldn’t like, or it just isn’t discussed much, as in Star Trek, where supposedly there is no money but it’s not really explained how. (And I assume that if it were more explicitly socialist, Star Trek would not have become as big.)

zbentley
0 replies
1d7h

Iain Banks managed to walk the “utopian but doesn’t become a dry political treatise” line pretty well.

Ironically, his works are favorites of several leaders in the torment-nexus-creation space.

klyrs
0 replies
1d7h

My understanding of Star Trek is that it's a post-scarcity utopia. The narrative interest comes from the strife of non-utopian societies.

moffkalast
0 replies
21h3m

gestures broadly at Star Trek

lolc
0 replies
1d7h

I just started into "Island", which reads like the counterpoint to "Brave New World", also written by Huxley. Saying we need it is not enough, because it doesn't get cited even though it exists.

tidenly
2 replies
1d9h

I've seen this take quite a lot around Twitter recently, which is confusing to me. Do you read Her as an "AI assistants are bad" story? I thought it was a much more subtle exploration of what our world would be like if such things existed, rather than outright condemning it, but it seems like lots of people saw only horror start to finish.

CipherThrowaway
1 replies
1d7h

"AI assistants are bad" isn't the take. The thing making people feel icky/uncomfortable about Her - and about the 4o demos - is not the idea of Samantha but the idea of Theodore. It's about nerds' idealized AI versions of womanhood. Tirelessly flirty, giggly servants who they can interrupt and talk over as they please, that come without the inconvenient features of personhood.

camillomiller
0 replies
1d5h

Exactly!

CipherThrowaway
1 replies
1d8h

This is the thing that is so wild to me. Outside of tech, everyone I encounter in day-to-day life intuitively understands that tech dudes inventing AI voices to flirt with them is not awesome or cool.

At this point, I have to assume that being out-of-touch is deliberate branding from OpenAI. Maybe to appeal to equally out-of-touch investors.

fl0id
0 replies
1d8h

It’s the tech (investor) bro appeal

gizajob
0 replies
1d9h

Because they’re not as creative as they think they are.

baq
0 replies
1d8h

Don't forget Frank Herbert is rolling in his grave, too. Dune's first two books could be called 'Paul is bad' and 'I guess I wasn't clear the first time'.

eqvinox
12 replies
1d9h

The voice they used is that of their voice actor, not Scarlett Johansson.

If this is the case, why did they pull it?

And why ask Scarlett again, just a few days before release?

gwd
6 replies
1d8h

And why ask Scarlett again, just a few days before release?

For one, because it's obviously better to have the exact voice rather than an imitation. For two, because it avoids exactly this kind of sticky situation: even if you believe you have a legal and moral right to do something, it's always nice to have the question completely air-tight.

Also, remember that OpenAI isn't a single person, but a large organization with different people and viewpoints. Not unlikely somebody in the organization foresaw the potential for this kind of reaction, knew they couldn't convince the right people to pull the voice completely, but thought they could maybe convince people to reach out again.

I mean, suppose they had a Darth Vader-y voice for some reason and had approached James Earl Jones to do it, and Jones had said no. Is it really so terrible to have someone else try to imitate the Darth Vader character? Jones, along with Lucas, made Darth Vader a cultural phenomenon; I don't think Jones should have the right to dictate forevermore all references to that phenomenon. The same goes for Scarlett Johansson and the author / director of 'Her'.

klyrs
4 replies
1d7h

Also, remember that OpenAI isn't a single person, but a large organization with different people and viewpoints.

This falls flat since the CEO tweeted "her" when the version featuring her voice clone was released. The CEO was bragging about this feature; the company was aligned behind it until the blowback went public.

gwd
3 replies
1d6h

I think you missed my point. Did Sam Altman personally think, a few days before the release, "Hmm, maybe I'd better call Scarlett again, just to be sure"? Or did someone else in the org propose that, to which Sam didn't object?

klyrs
2 replies
1d6h

According to Johansson, Altman personally reached out to her. He didn't just "not object" here, he's the guy taking the actions. Nor is he a lackey who just does what he's told; he's the actual CEO. Unless you're saying OpenAI is a big organization with lots of Sam Altmen, I'm not sure that the point you're trying to make applies here.

https://www.npr.org/2024/05/20/1252495087/openai-pulls-ai-vo...

Y_Y
1 replies
1d5h

OpenAI is a big organization with lots of Sam Altmen

Hopefully you meant this as a joke, but this nightmare isn't so unrealistic! Put Sam's voice on top of gpt-5o and spin up fifty of them to start cold calling starlets, why not?

klyrs
0 replies
18h58m

It's like a joke, but isn't funny.

MVissers
0 replies
1d7h

Disney would sue them for that, no?

If it sounds too alike intentionally, it’s basically an impersonation of a character, which falls under copyright.

Someone else was sharing case law about this.

szundi
1 replies
1d9h

Someone from OpenAI wrote on Twitter that pulling this voice was in the works for a month already. Must be true, it was on the Internet.

whoevercares
0 replies
1d9h

Prepare the rollback/fallback pre-mortem, not surprised

kolinko
1 replies
1d9h

Stopping behaviour is not an automatic admission of guilt.

eqvinox
0 replies
1d8h

And we're not in a court of justice here, where that distinction would be extremely important.

This is all about public opinion, and they have communication managers and PR strategists. If they didn't train this voice on Scarlett Johansson, they would have had so much better of a response by not taking it offline and instead insisting it is not trained on her. The entire discourse right now would be very different. And you can't shift this back later, burned is burned.

The way I see it, there's 3 options:

· they did train it on Scarlett's voice

· they don't know if they trained it on Scarlett's voice

· their PR/communications people don't know what they're doing

I don't think #2 is any better than #1. If it's #3… idk, better companies have been ruined by poor PR.

drpossum
0 replies
1d9h

You're seeing a pattern of bad faith "beg forgiveness instead of asking permission", and they got their hand caught in the cookie jar by someone who insisted they didn't want any part of it. Reddit, Stack Overflow, etc. were just happy to roll over and accept the check after the fact.

CipherThrowaway
2 replies
1d8h

but I don't think she has any rights to voices that merely sound like hers, when they're not being used to fraudulently imply that she's involved.

IANAL but it's actually not that simple. There are laws and precedents around soundalikes.

I think the cultural defense for this is really lame. Was there really cultural value in evoking "Her"? Because outside of the narcissistic male nerd culture of laughing at your own jokes and fixating on your own cleverness, there doesn't seem to be. No one outside of this bubble actually thinks it's super cool and culturally valuable for tech companies to attempt to reinvent - seemingly without a trace of irony or self-awareness - the cautionary tales and trappings of dystopian sci-fi.

From what I have seen, the demos struck the average person as cringey and left women in particular with a deep sense of revulsion.

fl0id
0 replies
1d8h

This so much

Y_Y
0 replies
1d7h

Just to say, I think the demo is lame too, and the film was fine but no Black Mirror. I still think there's cultural value, even if it's not the kind of culture that appeals to me.

Y_Y
0 replies
1d8h

Good point and good reference. I see that in the first case Bette Midler won a case against Ford Motor Company, which had a "sound-alike" sing one of her songs for a commercial when she refused. The other cases are similar, and my reading of the article is that the cases hinged on 1) a deliberate and explicit imitation of the famous voice concerned and 2) the voice in question being quite distinct, such that it would be very unlikely to approximate it by coincidence. In the case at hand I think a strong argument could be made either way on both of these points, and reasonable people will certainly differ.

chrisjj
0 replies
1d

No party to this matter has claimed it is a clone of her voice.

simion314
0 replies
1d9h

I agreed with your argument before the latest revelations, so now I want a judge to order OpenAI to come clean: did they use even a second of Scarlett's voice?

Because they could be lying; they could have used a different actress and also blended some of Scarlett's voice into the mix.

jacknews
0 replies
1d6h

Isn't this the 'look and feel' argument?

Especially if you then post references bragging that your thing either is, or is basically the same as, that other thing.

IMTDb
14 replies
1d9h

Does that mean that the voice actress who was hired to record "Sky" has to stop working since her voice is "too similar to another famous person"? So Scarlett Johansson owns her voice, and every single one that is "too similar to hers" (by what metric?), and gets to dictate what can and cannot be done.

It's a slippery slope.

h4kor
6 replies
1d9h

No. There is an interview/documentary about South Park where they talk about celebrity appearances in the show. They said that they never request celebrities to play themselves in the show. If the person declines the offer, they can get into legal trouble for imitating them. If they don't ask it is fine.

The problem here is that they wanted Scarlett Johansson and imitated her when she said no.

There is also a big market for "celebrity sound alikes" in voice acting.

IanCal
3 replies
1d8h

But in south park they would be seen to be imitating a person - there would be a character called "George Clooney" who sounded like George Clooney.

The case here is that there's a voice that sounds like Scarlett Johansson, but OpenAI are saying that it isn't her.

nycpig
0 replies
1d8h

The other difference is that the creators also had his consent.

mrweasel
0 replies
1d8h

but OpenAI are saying that it isn't her

The problem for them is that they've already admitted that they really wanted it to be her. They should have consulted a Hollywood lawyer for this one.

Mordisquitos
0 replies
1d6h

The case here is that there's a voice that sounds like Scarlett Johansson, but OpenAI are saying that it isn't her.

Well, here's Sam Altman quite literally implying that it is her: https://x.com/sama/status/1790075827666796666

mrweasel
0 replies
1d8h

That's a really good explanation. The South Park team works in the field and understands the rules of Hollywood and their details, OpenAI clearly does not.

The problem for OpenAI is probably that this is the first time someone with a bit of standing has caught them with their fingers in the cookie jar and not only has the balls to tell them no, but also has established rules to go after them with.

lolc
0 replies
1d7h

They said that they never request celebrities to play themselves in the show.

Satire has protections that other fields lack.

nicce
2 replies
1d8h

Issue is that they asked Scarlett beforehand and even the CEO advertised it based on the fame of Scarlett. It is not about the possible voice actress anymore. It was all about Scarlett, not about the voice actress. The voice actress is just an excuse at this point.

If they had never mentioned Scarlett or asked her, it might be a different situation.

IMTDb
1 replies
1d6h

It was all about a character.

They wanted someone who would be able to perform a specific character (the AI in the movie "Her"). They were looking for the right person for the fit, they set their eyes on Scarlett (of course, she played the character originally), and when she refused they found someone else who would be able to deliver a close performance. It happens all the time.

To me, the "Sky" voice is way closer to Rashida Jones than to Scarlett Johansson: https://www.youtube.com/watch?v=dJmRaTogqrM&t=370s

whamlastxmas
0 replies
1d5h

I agree with this. I think the owners of the movie have more of a basis to sue than SJ. She doesn’t sound like that when speaking naturally. It’s like saying opera singers can sue each other for sounding extremely similar in their professional performances

gorbypark
1 replies
1d9h

Do we know that these voices are backed by real people? I kind of assumed that given AI tech in voice cloning, they could make a voice sound like anyone they want and don't need to go out and find someone that sounds like "the target" voice. I think it's more fair game if they actually found another human that could sound like someone famous, but if they just ran the audio of some of her movies through some AI and out popped a voice, that's a little bit less clear.

SiempreViernes
0 replies
1d8h

It's probably fair game if they got a similar voice by coincidence, but here they intentionally want to copy someone's voice as performed in a well known IP for pure business profit.

jamil7
0 replies
1d9h

No, there is a wider context and nuance to the whole topic, as there always is. Jumping to hyperbolic "slippery slope" arguments doesn't add anything interesting to this discussion and instead tries to deflect it.

SiempreViernes
0 replies
1d8h

It always takes more effort to sound like someone else than just being born with that voice, dude. Stop pretending otherwise.

ChildOfChaos
14 replies
1d9h

This article is terrible.

There is a lot of nonsense and drama about all of this, but the voice is clearly not Scarlett, even though they were clearly trying to get someone to sound like her. So much drama over nothing.

DonHopkins
11 replies
1d9h

That's just your opinion that it's clearly not Scarlett; you don't know her, and she and the family and friends who do know her think it is. And the fact that you say "even though they were clearly trying to get someone to sound like her" is kind of the whole point: even YOU admit they were "CLEARLY TRYING" to steal her voice after being told TWO TIMES not to. Trying to steal something and not quite stealing 100% of it perfectly is STILL THEFT. You're the one being dramatic, trying to defend what even you admit is outright theft.

And Sam Altman's lower case single word tweet "her" clinches it. Clearly he was not just announcing his new pronoun.

ChildOfChaos
5 replies
1d9h

Wow.

Having something similar to someone else does not equal theft. What is wrong with people? Why are they acting so weird about this.

What about the voice actor that did this? Is her voice 'theft'?

DonHopkins
4 replies
1d9h

Then explain Sam Altman's tweet, "her":

https://x.com/sama/status/1790075827666796666?lang=en

And you already admitted "they were clearly trying to get someone to sound like her". You pre-lost your own argument yourself, before I even replied, with your own words.

Why are you acting so weird about this, like there's something wrong with you, and then trying to project it onto other people?

Trying to emulate Trump's debating techniques and psychological disorders doesn't work any more, nor does it look very good on you.

ChildOfChaos
3 replies
1d9h

I don't need to explain anything. Why should I explain a tweet? The tweet is the tweet; they used it for marketing.

It doesn't change anything. They can make it seem similar to her if they want by hiring a voice actor with similar characteristics; the voice actor is paid for their work and they get something that is great marketing. There is no theft here. This is a completely normal situation; your massive overreaction is not normal.

If they stole her voice outright, there is a problem, but it is extremely unlikely they did. So then there is no issue. It's common sense; I'm not sure what's wrong with you that you are trying to bring politics into it.

DonHopkins
1 replies
1d9h

Again with the psychological projection.

ChildOfChaos
0 replies
1d8h

That wasn't my intention, so apologies if I've come across wrongly; I was just taken aback by the reaction. To me the situation is clear and common sense, you have a different view, which is bizarre to me, but likewise I assume my view must be bizarre to you. We have both expressed those views and it's not worth going on and arguing over something that is fairly irrelevant to both our lives.

klausa
0 replies
1d8h

they can make it seem similar to her if they want by hiring a voice actor with similar characteristics

It is absolutely not clear if that's the case; and there is caselaw that suggests that they absolutely _can't_:

https://mcpherson-llp.com/articles/voice-misappropriation-in...

sebzim4500
4 replies
1d9h

Problem is there aren't 8 billion clearly distinguishable voices in the world, so some people are going to sound a bit like other people.

Should the voice actor who sounds a bit like Scarlett just quit working?

thr63294
1 replies
1d8h

Step 1: clone Scarlett Johansson's voice

Step 2: use control vectors to make it slightly different (see the sketch below)
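
A minimal sketch of what that second step could mean, assuming a hypothetical TTS system conditioned on a fixed-size speaker embedding; the names here (embed, tts.synthesize, brightness_direction) are illustrative assumptions, not any real API:

    # Illustrative only: assumes a hypothetical TTS stack conditioned on a
    # fixed-size speaker embedding. No real library or API is implied.
    import numpy as np

    def shift_speaker(embedding: np.ndarray, direction: np.ndarray, alpha: float) -> np.ndarray:
        """Nudge a cloned speaker embedding along a 'control vector' direction,
        so the synthesized voice drifts slightly away from the original speaker."""
        unit = direction / np.linalg.norm(direction)
        return embedding + alpha * unit

    # cloned  = embed(reference_clips)                          # step 1 (hypothetical)
    # tweaked = shift_speaker(cloned, brightness_direction, 0.15)  # step 2
    # audio   = tts.synthesize("Hello there", speaker=tweaked)  # hypothetical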

sebzim4500
0 replies
1d8h

If that's what happened then the case would be strong, but the current claim is that they used a different voice actor and this is just what you get when you ask for a "flirty American woman".

DonHopkins
1 replies
1d9h

How many movies named "Her" are there?

Since OpenAI asked her for permission two days before releasing it, obviously THEY thought it sounded like her voice, and that they needed to ask for permission. Making public claims implying that she's involved with the project like Sam Altman's "her" tweet, AFTER asking for permission and receiving two "NO"s, proves malice and dishonesty.

Sam Altman's tweet is going to be Exhibit 1 in the lawsuit he loses.

https://x.com/sama/status/1790075827666796666?lang=en

It makes me wonder how many "NO"s Sam Altman's sister Annie told him, that he also ignored. Maybe SHE deserves HER voice to be heard.

https://x.com/anniealtman108/status/1459696444802142213

Annie Altman’s Abuse Allegations Against OpenAI’s Now-Fired CEO Sam Altman Highlight the Need to Prioritize Humanity Over Tech:

https://www.themarysue.com/annie-altmans-abuse-allegations-a...

What are Annie Altman’s allegations against her brother Sam Altman?

Annie Altman alleged in a November 2021 Twitter thread that she “experienced sexual, physical, emotional, verbal, and financial abuse from my biological siblings, mostly Sam Altman and some from Jack Altman.” She returned to those allegations in 2023, seeming to call out Sam. “I’m not four years old with a 13 year old “brother” climbing into my bed non-consensually anymore,” Ms. Altman’s account, @phuckfilosophy, tweeted in March 2023. “(You’re welcome for helping you figure out your sexuality.)” [Sam Altman is openly gay, and there are nine years between Sam and Annie.] “I’ve finally accepted that you’ve always been and always will be more scared of me than I’ve been of you.”
sebzim4500
0 replies
1d8h

The functionality is obviously similar to the film Her; I don't understand how this is supposed to be the slam-dunk evidence against OpenAI.

When Kinect (remember that? lol) was being released, people compared it to Minority Report on stage. Does that mean Tom Cruise should have sued Microsoft?

isaacfrond
1 replies
1d9h

On this one, it is OpenAI who are terrible and acting like spoiled kids.

They approached Ms. Johansson, she said no, and they went ahead and did it anyways.

ChildOfChaos
0 replies
1d9h

No they didn't though.

They hired a voice actor who had a similar voice profile to her, a similar voice to what they were looking for. If you just hear the voice they used, it sounds very similar, but if you actually play them side by side, you can hear they are very clearly different.

How is that different from anything else? You ask someone to do something you want them to do, they say no, so you then go and find someone else that will do what you need which is what they did.

Falkvinge
11 replies
1d9h

It is reasonably common in the voicing part of the acting industry, when a famous name turns you down, that you instead turn to a voice imitator to give a performance that's reminiscent of a number of factors (kind of sounding like this other person, for example).

This way, legally you're in the clear, because you didn't use the voice of X. Not on any occasion did you use the voice of X. You even paid a voicer to make your recordings, and it's their voice.

Morally, it can be - and probably is - a different matter.

wokwokwok
7 replies
1d9h

I think, as with most things, it’s slightly more legally ambiguous if the person you’re copying/whatever is rich enough to have good lawyers and a strong media presence.

You can argue about the morality of that, but I think it's reasonably true, practically.

Thus, it got taken down.

gizajob
6 replies
1d9h

Especially given the very clear cut case here of the AI company deliberately cloning the voice of the most recognisable and human-like AI voice from fiction. And also tweeting that they’d done exactly that. And then lying that it was a coincidence. It wouldn’t stand up in court which is why they got rid of it – I’m sure that if Sam Altman and his counsel believed that they could beat ScarJo in court then we’d still be listening to “Sky”.

throwawaymaths
4 replies
1d9h

the most recognisable and human-like AI voice from fiction

No, that would be either Cortana or Majel Barrett Roddenberry.

gizajob
1 replies
1d8h

I’m sure “Majel” is in the pipeline.

Would the everywoman on the street be able to recognise Majel over Scarlett? I don’t believe so.

throwawaymaths
0 replies
1d8h

I don't think most people recognized Scarlett either, and if so, they recognized her not from "Her" but generally as an actor

bloopernova
1 replies
1d8h

Not HAL?

eqvinox
0 replies
1d3h

I think the problem with HAL is that some people would recognize the attitude, not the voice ;D

(i.e. I'm not sure I would jump to "that's HAL!" if it weren't being disobedient and vaguely threatening…)

kolinko
0 replies
1d9h

They didn’t clone her voice

twobitshifter
0 replies
17h27m

Why do people think they need an actor? You can recreate a voice from just a few seconds of audio today. Do that and then mix in some fair use modifiers to CYA.

eqvinox
0 replies
1d9h

It is reasonably common in the voicing part of the acting industry, when a famous name turns you down, that you instead turn to a voice imitator

Citation needed.

Even when VAs are recast for reasons beyond control (death, unavailability, aging child voice, health, etc.), they are only similar, not imitations. Happened in a lot of anime.

can16358p
10 replies
1d9h

OpenAI wanted a tone in their demo, they contacted Scarlett, and she refused.

If this is OK (and of course it is, no one has to accept any offer), how is OpenAI working with someone with a similar voice tone not OK?

If Sam used someone's voice without consent, that's illegal and immoral, but I see nothing wrong with Sam paying someone with a similar voice and using that voice instead.

eschaton
7 replies
1d8h

Because they are using her likeness without her permission. If they just wanted a "similar" voice they could have hired one from the start. They wanted her voice, and when she said "no" they used it anyway.

The "her" tweet makes it exceptionally clear that it was always intended to be Johansson's likeness, not some other voice actor's that happens to sound similar; OpenAI is just betting they can get away with it.

It seems like OpenAI were also betting on Johansson not knowing her rights _de jure_ in that area, which it appears they were quite wrong about. (And even if she didn't, she would have quickly learned of them in this hyper-connected world, making the bet utterly incompetent…)

can16358p
6 replies
1d8h

It's clear that they wanted her voice (by contacting her first, and the "her" tweet), and they aren't denying it.

What's wrong with using a similar voice to the one in your vision of the demo/service, as it's the closest voice that you can (legally) use?

can16358p
1 replies
1d5h

So if I'm a celebrity and I have my voice known, anyone who wants to be a voice actor who has a similar voice to mine... can't?

omnicognate
0 replies
1d2h

No, if you happen to sound like a celebrity there's nothing stopping you from making your own way in the acting world. You should be careful not to do anything that looks like you are deliberately trying to pass yourself off as that celebrity or profit from their persona, though.

Like, for example, voicing an AI chatbot when that celebrity, one of the most highly paid in the world, has previously starred in a famous movie as an AI chatbot. And especially when the makers of the chatbot have repeatedly publicly referenced the movie as inspiration in creating the chatbot and have publicly pointed out the similarity between their chatbot and the movie. And especially when the makers of the chatbot have approached that celebrity to ask to use her persona and been denied.

Because if you do that you might just get your arse sued off.

eschaton
2 replies
1d8h

Because she inherently owns the rights to her voice and they're attempting to reproduce it without her permission. This has been litigated in multiple cases. If they had just wanted to be _reminiscent of_ "Her" they could have had a different outcome than trying to get the original "Her" voice actress and then trying to reproduce her voice after she said no.

Intent matters a lot in a situation like this, and OpenAI very clearly documented their intent for everyone to point and laugh at. Even in the hypothetical case I give above where they don't try to hire Johansson and just get someone similar-sounding, they could still be liable for appropriating Johansson's voice if they were coaching the voice actor to specifically sound like Johansson rather than do her own performance.

You’re looking for a technicality but this stuff relies on human judgment for a reason.

can16358p
1 replies
1d5h

So based on that human judgment, no "similar" voice to a public figure can be used in a product/service/demo/ad etc. without that person's consent. What's next? Banning similar people from appearing in demos? Banning particular musical notes from being present in songs because someone else played that note in a song?

As long as it's not literally her voice (or a model trained directly on her voice) they should have rights to do anything they want.

eschaton
0 replies
20h51m

Again, you’re looking for some sort of absolute that either allows or prohibits something in all cases based on a set of ironclad rules, and that doesn’t exist because intent matters in human endeavor.

If a similar voice was used because it was _reminiscent of_ “Her” but was still a unique performance, then it probably wouldn’t be a problem. Where there’s a real problem is if it’s an intentional mimicry of a specific person’s specific vocal performance.

And Johansson can show OpenAI had such intent in court pretty easily, between their attempts to get her consent to use her voice (which she denied) as well as their CEO's "her" tweet promoting the voice.

tibanne
0 replies
1d7h

I agree; if someone else was asked and they used their voice to train, this is more than acceptable. What is this fuss about?

omnicognate
0 replies
1d8h

The moral issue is that Scarlett Johansson's voice is famous and to many people the fact that it sounds like her specifically is the source of much of the value of that voice, so one can argue that the benefit of that should accrue to her, not to an unknown person who happens to (or can make herself) sound like her or to the company she refused to allow to use her voice.

The legal issue is that there are specific, established laws based on ideas like the above, known as "right to publicity" laws.

x-complexity
8 replies
1d8h

In most of the discussions about this fiasco, I have yet to see anyone point out the following:

- There is a finite number of distinct human voices.

- The # of desirable human voices (D) < The # of distinct human voices (A)

Where D := A - (All unwanted / undesirable human voices)

- Two voices that sound alike are treated as if they are alike

The fact that people say that it sounds like Scarlett Johansson means that there IS a region of multi-dimensional vector space that is claimed to be "owned" by the actress and that actress alone.

I HATE the end conclusion that resulted from this mob spectacle: that people with voices that sound like hers effectively have no agency over what they can do with their own voices. That they are eternally marked as "that person that sounds like Scarlett Johansson", and that whatever they do with their own voice (including consenting to be used for ML training) is met with the ire of the online mob.

schroeding
2 replies
1d8h

Isn't the main problem that they wanted to explicitly use Johansson's voice, didn't get permission and then (maybe[1]) circumvented it by using a voice imitator, while still publicly entertaining the idea that it's her voice?

Yes, OpenAI did not explicitly say it's her voice, but Altman tweeting "her", it being called "Sky", them asking her for permission twice, they definitely wanted this connection (and her real voice), IMO, and like so many tech companies, they just did not accept "no".

They could have used the same voice, called it "Moon" or whatever and stopped Altman from tweeting, and it probably would have been fine.

In this context, I understand the result. Without this context, I would not, and would agree with your points.

[1] Or maybe they just used her voice samples anyway in some form; that's hard to prove / disprove without an external audit - depends on whether or not you give OpenAI the benefit of the doubt

x-complexity
1 replies
1d7h

Admittedly, the actions of SamA didn't help with the dumpster fire that is the discourse surrounding this.

However, the principle still stands.

_heimdall
0 replies
1d7h

The legal issue here would be intent. Of course it isn't illegal to have a voice that sounds like someone else, but Sam seems to have been arrogant enough to actually make their intent pretty easy to piece together with a small amount of public information. A lawsuit would allow discovery, and I wouldn't be surprised at all if internal emails even more explicitly spell out the intent to copy the voice from Her.

Proving intent is extremely hard, and I always wonder how anyone with enough power to attract attention lets their intent be documented. At least for Sam, it sure seems to be due to a massive amount of arrogance.

lm28469
1 replies
1d8h

Legality vs morality, the problem is as old as humanity. Legally they can perfectly well use a sound-alike; morally it's a shit move to ask the actress who voiced a very similar product, get a "no", and still decide to find a sound-alike voice actress. If it was a non-profit healthcare company you can be sure people would be much less outraged, but Silicon Valley's tech companies burned the little trust people had in them a long time ago; it's hard to have empathy for sociopaths.

_heimdall
0 replies
1d7h

Legally it becomes a problem if it can be shown that the intent was to copy the voice. In this case, it seems like Sam made that pretty clear with only a few bits of public info. If a case was brought against them I can only assume discovery would find internal emails or chats that spell out they were absolutely trying to replicate Scarlett Johansson's voice after she turned down their offer.

willis936
0 replies
1d7h

There are only so many combinations of 160x160 pixel arrangements but Adobe will still protect theirs.

Context matters. Adobe cannot defend its logo in all contexts. The laws make this clear. You are ignoring the context of Scarlett Johansson portraying an AI character in a movie and then her likeness being used for another AI character without her consent. Had she never starred in Her, had she never been approached by OpenAI, this would not be a story.

https://www.engadget.com/adobe-threatens-to-sue-nintendo-emu...

lijok
0 replies
1d8h

You're trying to engineer a way out of a sociology problem. Never gonna happen.

Whoever voiced Sky didn't lose their agency over their own voice. The issue at hand is Sama not taking No for an answer.

laborcontract
0 replies
1d8h

I don’t think this is the argument. The whole thing is that they’re trying to pass it off as scarjo in “her”, which they’ve been very vocal about.

If Sam was less vocal about his love for her, we wouldn’t be out here thinking that OpenAI is trying to ride off the back of the movie.

KoolKat23
5 replies
1d9h

If it was an exact copy, fine, but it's not. This is ridiculous. Imagine Margot Robbie suing a film producer for using Jaime Pressly after she turned down a role, or Michael Cera suing over Jesse Eisenberg.

consp
2 replies
1d8h

If it was an exact copy fine, but it's not.

This is not the point. As stated in Ms Johansson's statement, her own friends could not tell the difference, which is the point. If a reasonable person cannot tell the difference, there is none. In combination with seeking a prior agreement to use her voice, this sounds quite dubious at best, malicious at worst.

unraveller
0 replies
1d7h

Suddenly all the media companies and social media pundits have forgotten how to upload audio samples alongside accusations in text; maybe they don't want to reach reasonable people. The outrage is proof enough.

ChatGPT using Sky voice (not 4o - original sept. release): https://youtu.be/JmxjluHaePw?t=129

Samantha from "Her" (voiced by ScarJo): https://youtu.be/GV01B5kVsC0?t=134

Rashida Jones Talking about herself https://youtu.be/iP-sK9uAKkM

KoolKat23
0 replies
1d3h

Yes, but this doesn't sound like her, it's merely similar to her, and there is no contract between her and OpenAI.

tomthe
4 replies
1d9h

I would love to play some board games with Sam Altman. He is smart, strategic and very persuasive (I heard). He would excel especially at games like Risk or Mafia, where you have to persuade your opponents to collaborate with you.

But after a few rounds, no one would help him anymore and he would lose all the games.

whamlastxmas
2 replies
1d5h

He seems to be winning all the games in real life

tomthe
1 replies
1d4h

Not all (look up how he got fired from Y Combinator). But as long as there are enough new people who want to play with him, he might do fine (and there are a lot of willing VCs). Once word starts to spread about him, through stories like this, he might experience some backlash.

93po
0 replies
1d2h

Sam wasn't fired for doing a bad job at YC, he was fired because the compensation structure wasn't something he was happy with, which is why he was also "double dipping" with his own funds into startups - something that was common at the time across the whole org. Jessica Livingston even went on to put money into his next project (OpenAI) which is also counter to the characterization in your perspective. He has tons of friends in high places that continue to put a lot of their money into Sam's coffers, because Sam is really good at making money for himself and others

surfingdino
0 replies
1d7h

No wonder they get along with Microsoft just fine. Here's what Tim O'Reilly said about MSFT years ago, and I think it applies to OpenAI today:

"Microsoft gets a lot of heat for not leaving enough on the table for others. My mother, who’s English, and quite a character, once said of Bill Gates, “He sounds like someone who would come to your house for dinner and say, ‘Thank you. I think I’ll have all the mashed potatoes.’”"

Replace Gates with Altman.

source: https://daringfireball.net/2013/05/all_the_mashed_potatoes (the original post is gone from the O'Reilly website)

padjo
3 replies
1d9h

For me this speaks to the arrogance and disconnect from reality that some people in tech operate under.

It seems they never considered it possible that someone would say no to them and would protect their IP against them, because who wouldn’t want to be part of their glorious mission, right?

edflsafoiewq
1 replies
1d9h

Tech has a consent problem, no doubt. The popup which won't let you decline but instead asks you again and again forever is just the lowest manifestation of a deeper issue.

maeln
0 replies
1d5h

Eh, it's not just tech; most companies have a consent issue, and you could easily argue that capitalism has a consent issue (not saying that alternatives don't also have one). Rarely do businesses have any incentive to care about consent. In many cases they actually have a vested interest in not caring about your consent, as it can remove one big barrier to profit. It is easier to illustrate these days with all the AI companies harvesting data with zero consent, it is true, but you find the same behavior in most industries. Consent is seen as a customer service issue, something extra, not something that is due.

mschuster91
0 replies
1d9h

In the end, it's business, it's purely about money. When you got untold billions in the bank account, or have someone backing you who has the billions, you can do whatever you want as long as it's not a violation of criminal codes. Hope that no one discovers what you did, and in the case they do and have the financial resources to fight back, drag the case along for as long as you can, and eventually settle for pennies.

As long as you don't commit crimes, this strategy will always work out. Hell, even if you do commit crimes, as long as you have willing "fixers", you've got to piss off a lot of powerful people to face justice. Just look at the 45th - that guy has a history of decades of stiffing contractors, lenders and employees (including those who actually went behind bars for him), but it took a pretty flimsily constructed case to finally charge him with a crime, and I'm not certain that the case is clear enough for a conviction.

dkdbejwi383
2 replies
1d8h

LLMs don't think, they just regurgitate text tokens based on a probability + randomness model.

stevenhuang
1 replies
1d7h

I hope the irony of repeating this tired phrase much like a stochastic parrot would is not lost on you.

MVissers
0 replies
1d6h

LLMs don't think, they just regurgitate text tokens based on a probability + randomness model.

/s

jeegsy
3 replies
1d7h

Good luck finding a voice that doesn't sound like anyone else.

klyrs
1 replies
1d7h

Good luck shooting a gun into a crowd of people without hitting anybody!

... because not pulling the trigger clearly isn't an option.

jeegsy
0 replies
16h59m

Oh, so no voice then?

hathawsh
3 replies
1d8h

Just a matter of curiosity: do people really think the Sky OpenAI voice is similar to ScarJo? While they both vary pitch a lot, Scarlett also adds great dimension by shifting rapidly between different tone qualities. Tone variation seems to be only barely detectable in Sky. Sky sticks to a pure tone, while Scarlett starts on a mildly harsh (but pleasant) tone.

rossant
2 replies
1d8h

Is there an objective, quantitative metric to compare two voices, including pitch, tone variations etc?

logicchains
1 replies
1d7h

You train a neural network to distinguish between them and measure its accuracy. If they were the same voice, the accuracy in the eval dataset wouldn't be better than chance.
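
As a rough sketch of that test, assuming you already have one fixed-length feature vector per audio clip (e.g. averaged MFCCs) saved in the hypothetical files voice_a.npy and voice_b.npy: if the held-out accuracy stays near 0.5, the classifier can't tell the two voices apart.

    # Sketch of the "train a classifier, check accuracy vs. chance" idea.
    # Assumes voice_a.npy / voice_b.npy (hypothetical) each hold an array of
    # shape (n_clips, n_features), one precomputed feature vector per clip.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X_a = np.load("voice_a.npy")
    X_b = np.load("voice_b.npy")
    X = np.vstack([X_a, X_b])
    y = np.concatenate([np.zeros(len(X_a)), np.ones(len(X_b))])

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"held-out accuracy: {acc:.2f} (chance would be ~0.50)")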

squarepizza
0 replies
1d2h

I find this comment a bit myopic.

First is the belief that the essence of a person can be reduced to quantified metrics as inherited and objective as height or, in this case, the shape of one's vocal cords and the resultant pitch of their voice. Second is using a glorified function approximator as an arbiter. The positive outcome for OpenAI would be a classifier able to achieve high accuracy in distinguishing between the original and impersonated voice as evidence that the voices are sufficiently distinct.

In the event of an underperforming classifier, what is preventing OpenAI from claiming that the current technology is simply not adequate? On what grounds could one refute the claim that a future, more advanced system, very probably trained with more appropriated data, would be a better classifier and favor OpenAI?

Havoc
3 replies
1d9h

Unless she can prove that it’s directly derived somehow I don't see much of a case here.

Questionable ethically perhaps but OpenAI doesn’t seem to care about that

drpossum
1 replies
1d9h

This was posted up-thread, but there is a clear legal argument known as "passing off" https://en.wikipedia.org/wiki/Passing_off. It turns out your "intuition" isn't actually a basis for a legal system nor a way to position yourself as a legal scholar.

Havoc
0 replies
1d5h

Where have they passed this off as her?

kamaal
2 replies
1d9h

Let's say this goes to court. I wonder how they would prove that a 'voice' belongs to a person. This is such a grey area; how does someone own a 'voice'? Also, similar doesn't mean same.

Plus this opens the door for all kinds of things. Say tomorrow someone makes an AI that sings. And they write any lyrics and make the AI sing them, say in a voice that's very close to their favorite singer. Would that qualify as illegal?

Would be interesting to see how that legal side of things in this regards unfolds.

kolinko
1 replies
1d9h

That is quite easy - you show the training data and provide the details of a person whose voice it was.

With purely synthetic voices, it's a gray area, just like with most other genAI.

kamaal
0 replies
1d8h

How is this any different from text, photographs, or anything else for that matter? By this definition all AI is illegal.

Besides, how does one prove a recording is of a particular person's voice? The AI might be generating a voice similar to that of Scarlett Johansson, but it's still not her voice. Similar != Same.

fsniper
2 replies
1d8h

I don't condone using a replacement voice with bad intentions, and this is clearly what it is.

However!

With this mentality, the mentioned paid voice actress's voice is not hers anymore because another person's voice is similar and is used in a movie? So she can't work and voice any other project without Scarlett Johansson's permission?

iamacyborg
1 replies
1d8h

You’re assuming that the VA’s voice (assuming there even was one) was their natural voice, rather than an acted one.

fsniper
0 replies
1d2h

With the recent news, I am more inclined to think this voice is a forgery than that it's another voice actress.

spuz
1 replies
1d8h

Today, under pressure, OpenAI pulled the ScarJo-like voice, alleging that the resemblance was purely a coincidence.

I probably don’t have to tell you, but that’s complete bullshit. And stupid, obviously refuted bullshit at that.

This article is quoting the linked Tom's Guide article, but nowhere in the original OpenAI statement do they claim that their voice's resemblance to Scarlett Johansson was "purely a coincidence". It's kind of important to actually quote the original source before you call it out as "complete bullshit".

Original OpenAI statement: https://openai.com/index/how-the-voices-for-chatgpt-were-cho...

oneeyedpigeon
1 replies
1d8h

Before we could connect, the system was out there.

I.e. they asked for permission (which they presumably felt they needed), then launched not only without it, but without a response either way.

As a result of their actions, I was forced to hire legal counsel, who wrote two letters to Mr. Altman and OpenAI, setting out what they had done and asking them to detail the exact process by which they created the ‘Sky’ voice. Consequently, OpenAI reluctantly agreed to take down the ‘Sky’ voice.

Strongly suggests OpenAI had something to worry about regarding their own process.

csharpminor
0 replies
1d8h

I truly wonder if this is an “any publicity is good publicity” tactic on their part. It’s now national news that OpenAI supports conversational voice. Buying this level of publicity would likely have cost more than the lawyers and damages.

kunley
1 replies
1d9h

Altman playing a wannabe god, geez...

moffkalast
0 replies
1d9h

Always has been.

futurecat
1 replies
1d9h

Never forget sama is also behind the Worldcoin dystopian absurdity. It tells a lot.

camillomiller
1 replies
1d9h

Sam Altman’s personality is another proof of the theory that startup/Silicon Valley hyperinnovation culture has a massive problem of incentivizing manipulative narcissists to rise to the top. The entire culture clearly has a set of filters in place that favor individuals with extreme manipulative god-complex-style traits, who are convinced their mission is so supreme and important that any other rule shouldn’t apply to them.

The fact that the future of AI is in the hands of one of these individuals is an utter disaster.

cromka
0 replies
1d9h

I couldn't help but instantly think of WeWork.

_3u10
1 replies
1d9h

That’s how it works in Hollywood:

The voice, all names, characters, and incidents portrayed in this AI are fictitious. No identification with actual persons (living or deceased), places, buildings, and products is intended or should be inferred.

whamlastxmas
0 replies
1d5h

The personal attacks on Sam are disappointing when the people making those comments don't actually know him. I think speculation on his trustworthiness as the OpenAI CEO is fine, but some stuff here is just being shitty towards him.

I feel bad for the voice actor OpenAI hired. She was probably really excited about this and then SJ had to make it weird and claim she was ripping her off

It wasn’t SJ’s natural voice in Her, it was a performance that is nothing like her regular speaking. The idea that she can claim giggly/pitchy as only her domain, or that she owns the rights to the character in the movie, feels really silly to me

throwawaycities
0 replies
1d7h

OpenAI's Original Mission Statement:

Advance digital intelligence in a way that benefits humanity, without the need to generate financial return.

That bit about financial return has since been removed.

Until now no one could point to a human story of what that meant, instead pointing to the shift from a 501(c)(3) to a for-profit model, which people don't understand, so it just doesn't resonate.

But now, we have a human story of someone that makes a living on their likeness, the CEO approaching them for licensing rights, then allegedly just training the AI on their likeness and stealing it for financial gain without compensation when the CEO couldn’t get a deal done.

AI is dystopian to start with, which is why the initial model was created as a non-profit to benefit everyone, but now we have a human story validating the danger and dystopian beliefs surrounding AI.

On top of everything the Founder was personally involved in making these decisions and even Tweeted “her” as a sick joke referencing the actress’ movie where she voiced an AI.

tanvach
0 replies
1d8h

Are LLMs basically ‘pirating humanity’ at this point?

The unauthorized use or reproduction of humanity's work.

sanxiyn
0 replies
1d9h

Sam Altman is a grifter. We all should boycott Sam Altman.

Repeat after me: Sam Altman is powerless if we all ignore him. Sam Altman is nothing without his people. He can't do anything himself.

https://x.com/OpenAI/status/1727236805182026159

onhacker
0 replies
1d9h

Extremely concerning.

I think it's because she is a prominent actress that this news is all over the internet, but every individual has the right to their voice and face not being cloned. For most people it may not seem concerning on the internet, but it definitely is within their family, relatives and society.

ollysb
0 replies
1d9h

Interested to see how this plays out. Given the level of mimicry that AI can now achieve how do actors, musicians and artists safeguard the likeness, sound or style that they've developed?

largbae
0 replies
1d9h

The first order problem is how do we compensate those whose performances and art were unique and recognizable before computers could imitate them.

However we solve for that will tell us if and how we can motivate people to make all the tradeoffs needed to stand out and bring new unique art into the world.

I am hoping we land somewhere in between the extremes of "no copyright use" and "what's a copyright" and create a royalty system that can trace your content's influence on the weights used at inference and cut you in according to your contribution.

gmerc
0 replies
1d7h

Given that the next step is to use the data trail of employees across all kinds of professions, for example sales, design, etc., and use it to replace them at scale, this is a very timely conversation.

https://youtu.be/U2vq9LUbDGs?si=2GFBPE-XdHU8Fsk2

dandanua
0 replies
1d8h

Morally bankrupt as the leader of AGI development. What could go wrong?

But of course, all this is not about only one man. Corporatocracy will prevail, if not actively confronted.

WrockBro
0 replies
1d8h

Maybe Sama tweeted/ex-posted "her" because that was a trending topic related to the demo? I guess everything has to be some grand conspiracy nowadays.

Madmallard
0 replies
1d7h

How are we not sacking this company? This is so anti-societal it's unreal.

AJRF
0 replies
1d9h

Am I right in saying that this voice has existed in the app for the voice feature for ~6 months now? I remember hearing it and thinking "Sounds like Scarlett Johansson"