
David Attenborough is now narrating my life

iris2004
67 replies
21h34m

It's a neat demo. I don't know how I feel about cloning the voice of a living person, even a 'public' figure like a presenter. I wouldn't be surprised if it eventually becomes illegal, although it'll likely be as difficult to enforce as normal casual copyright infringement.

woleium
46 replies
21h30m

I expect the existing laws will be extended. I had a colleague tell me about looking into how much Morgan Freeman wanted for a 15s commercial ($$$$$$). The alternative he used was a “sound alike” voice actor, for which Morgan Freeman received royalties to the tune of $$$$$.

pcthrowaway
42 replies
21h24m

Why would Morgan Freeman be entitled to royalties for a commercial where the voice over sounded similar to his voice... unless he actually sold the rights to represent him with a vocal likeness in the commercial (AKA he didn't do the voice over, but the commercial was able to claim the voice over was by Morgan Freeman, or that it represented Morgan Freeman's opinions... like when a book by a celebrity is actually written by someone else)?

kevinmchugh
24 replies
21h13m

You can't use a Tom Waits (or Morgan Freeman) sound-a-like because you're profiting off their work and likeness. Unless the celeb sells a licensed sound-a-like.

https://faroutmagazine.co.uk/when-tom-waits-sued-doritos/

chankstein38
13 replies
21h6m

What if my voice sounded like Tom Waits or Morgan Freeman and I did commercials? It doesn't, and I don't; I'm just curious how this makes sense. Is it just if the voice is advertised to sound like him?

stuckinhell
8 replies
21h2m

Yes, you would owe them money. This is why I'm against artists trying to put a chokehold on AI.

pcthrowaway
7 replies
20h51m

I don't think with the current copyright system you would owe another artist money when recording a commercial just because your voice happens to sound like theirs. But with AI perhaps it's "fair" (given the assumption that there's any fairness to copyright in the first place) to expect a creator to provide some kind of compensation to the source material that their AI was trained on.

Case in point, this bot was clearly trained on David Attenborough samples, perhaps it can't be used commercially without purchasing some rights to the man's likeness.

stuckinhell
2 replies
20h49m

Picasso is widely quoted as having said that “good artists borrow, great artists steal.” https://lifehacker.com/an-artist-explains-what-great-artists...

lbwtaylor
0 replies
19h25m

There is a significant difference between "stealing" artistic ideas, and impersonating individual humans.

I don't understand how folks can think it's reasonable to impersonate Morgan Freeman and use his voice to sell, for example, crypto crap. It's gross and should absolutely be illegal.

cjaybo
0 replies
18h57m

Well Picasso wasn’t much of a legislator, and I don’t think his point was that all artistic theft is justifiable to begin with.

mywacaday
2 replies
20h35m

What would happen if a voice was trained on Morgan Freeman, Jack Nicholson, Robert De Niro, and George Clooney, and tuned to not sound like any one individual but have the characteristics we find appealing? Could that be used for its own copyright claim?

twoodfin
1 replies
17h26m

I believe the answer is almost certainly yes under current law.

Whenever the topic of copying material for model training purposes arises on hn, however, a number of folks seem convinced such use falls under the rubric of “transformative” and will be considered fair use.

yesco
0 replies
1h48m

Do you just believe this or was this tested in court yet? I haven't been following the various lawsuits on this issue closely but I had thought that, so far, they were leaning towards fair use.

dragonwriter
0 replies
17h45m

I don't think with the current copyright system you would owe another artist money when recording a commercial just because your voice happens to sound like theirs

No, it's a right of publicity / right of personality issue, not a copyright issue. While that's not a federal law issue in the US, it's an issue in some US jurisdictions, and it's a national-law issue in a number of non-US jurisdictions.

The mistake is thinking "would not owe money under the current copyright regime" means "would not owe money": there are more laws in the world than just copyright law.

tim333
0 replies
19h9m

I guess you would have the chance to argue about it in court, if someone came after you.

kevinmchugh
0 replies
3h16m

Sounding like someone isn't just having the same timbre of voice, it's cadence, accent, diction (both definitions). I'm skeptical how much two unrelated people could just happen to sound alike, especially if one of them is famous for their voice. Compare Larry David's natural speaking voice to how he speaks when impersonating Bernie Sanders. Those are two men from the same time and place and background, and there's still quite a gulf.

Now perhaps you could just happen to sound like a famous actor who isn't well known for their voice, like John Krasinski or something, who does voiceover ads but isn't super distinct. That would probably be fine, but you're still going to have to slog through the world of voice talent auditions.

jrockway
0 replies
17h29m

Sounds like a similar situation to someone named Mike Rowe starting a software company: https://en.wikipedia.org/wiki/Microsoft_v._MikeRoweSoft

golergka
0 replies
17h51m

The agency came back to Frito-Lay with a parody song inspired by Tom Waits' ‘Step Right Up’, featuring the same Carni hollerings as the Small Change track

If you did straight-up parodies of their songs, yes.

pavon
4 replies
20h59m

In that case though (and the Bette Midler precedent), the sound-a-like was used with a song that the artist was associated with, which implied that it was the actual artist, or at least deliberately invoked their likeness in the mind of the viewer.

I'm skeptical that using a narrator who happened to sound like David Attenborough or Morgan Freeman would be enough by itself to qualify as a right-of-publicity violation unless it included obvious references to prior work, catchphrases, or the like.

notahacker
3 replies
19h16m

Difficult to find a Morgan Freeman sound-a-like if they're not marketing themselves as a Morgan Freeman sound-a-like. And if they are (and obtaining more customers as a result), it's easier to pay Morgan Freeman royalties than come up with persuasive arguments that selling 'voiceovers that sound like Morgan Freeman' isn't profiting from deliberately invoking his likeness...

amelius
2 replies
18h33m

Difficult to find a Morgan Freeman sound-a-like if they're not marketing themselves as a Morgan Freeman sound-a-like.

How about this scenario: they publish a YouTube video, then someone says in the comments, "Hey, you sound just like Morgan Freeman," then the company looking for the voice performance googles for Morgan Freeman, finds the comment, and contacts them.

notahacker
0 replies
3h15m

I think if I wanted to earn a living as a voiceover artist who happened to sound like a very famous actor, I'd rather advertise my USP and pay royalties on the much higher earnings than only pick up business from companies that make hiring decisions based on random chance or YouTube comments...

bryanrasmussen
0 replies
18h19m

So then in court, the Google search for Morgan Freeman and the "you sound like Morgan Freeman" comment come out, and then you owe Morgan Freeman $$$$$$$$$ because he had to go to court over it.

ASalazarMX
2 replies
19h55m

If your voice is the twin of a celebrity's, but you don't benefit from the likeness (for example, not being paid for voice acting where your identities could be confused), you'd probably have a strong defense if the celebrity tries to extort royalties from you.

The rest is shades of grey. For example, if you're an incredible voice imitator and plan to do shows, negotiations would be wise. Even Weird Al asked for permission when he didn't have to, legally speaking.

aequitas
1 replies
19h42m

Even Weird Al asked for permission when he didn't have to, legally speaking.

But Weird Al does parodies, which is fair use. He's not trying to pass off an album as being a genuine Michael Jackson album or something.

morelisp
0 replies
18h49m

Why would Weird Al try to pass off his albums as being from the guy who stole his songs?

mym1990
0 replies
20h48m

What happens if the sound alike is given away for free and there is no profit or even revenue component?

ge96
0 replies
20h24m

Huh, Salma Hayek was right

vasco
7 replies
21h8m
airstrike
6 replies
20h57m

if my product doesn't imply or state that the voice is Morgan Freeman's, I don't think they really have a case? at least not based on common sense

justinpombrio
5 replies
20h49m

If the voice was chosen specifically to sound like Morgan Freeman, for the purpose of making listeners think it's Morgan Freeman when they hear it, it seems reasonable for it to run afoul of Morgan Freeman's right to publicity.

airstrike
4 replies
19h2m

I owe nothing to Morgan Freeman just because I sound like him, unless I'm intentionally misleading listeners by advertising his name or implying as much

thrwy_918
1 replies
17h23m

I owe nothing to Morgan Freeman just because I sound like him

What if a company chooses you as a voice artist largely on the basis that you sound like Morgan Freeman?

airstrike
0 replies
15h52m

Is it a case of (a) they like Morgan Freeman's voice as a general preference (and therefore want someone with a nice-sounding voice like Morgan Freeman's because it sounds smart, deep, engaging, whatever), or (b) they want someone who sounds like Morgan Freeman in order to somehow represent to listeners that it's Morgan Freeman's voice?

ewi_
1 replies
18h29m

If they can reasonably argue that you're being hired because you sound like Morgan Freeman and you wouldn't get the job otherwise, then they likely have a case.

airstrike
0 replies
15h53m

Not if "sound like Morgan Freeman" is a proxy for "have a nice-sounding voice". They could prefer people who sound like Morgan Freeman even if they are not representing to listeners that the voice is that of Morgan Freeman. Big distinction there.

tantalor
2 replies
17h15m

I could understand it if the actor was actually marketed as or publicly known as a soundalike.

But on the other hand, does "store-brand Lucky Charms" have to pay a kickback to General Mills?

happycube
1 replies
16h44m

If it's on the shelf right next to the original, it was probably made by General Mills.

onionisafruit
0 replies
11h56m

That’s what my mother said. She was wrong.

bradley13
2 replies
19h36m

Tell that to Einstein's family. Put an older guy with a German accent and wild white hair in a commercial, and they want to be paid.

Honestly, it's offensive that they can "own" a general appearance. But their lawyers manage to enforce it.

avar
1 replies
18h25m

An easy workaround for this is to dress your German speaker in a Nazi uniform; now suddenly nobody wants to be associated with them, even if you keep the unkempt white hair.

jmvoodoo
0 replies
17h24m

Now you're being sued for defamation though.

galangalalgol
0 replies
21h17m

Yeah, at least in the US you are fine as long as you don't claim the voice is a certain person.

frereubu
0 replies
21h18m

I don't get this either, and would be very interested in a clear explanation.

3np
0 replies
21h16m

Gathering, retaining, and using individual voice recordings for business purposes may be regulated, though. It already is in places, even if enforcement and precedent are lacking.

So there'd be no legal way to produce or share the model generating such a voice or its output.

ciabattabread
0 replies
20h50m

The alternative he used was a “sound alike” voice actor, for which Morgan Freeman received royalties to the tune of $$$$$.

Was it Josh Robert Thompson, who is officially authorized? [1]

[1] https://filmschoolrejects.com/morgan-freeman-crab-in-barb-an...

antod
0 replies
17h6m

My father had a friend who was a well-recognised commercial voiceover artist, and he told me about his fees. There was a fee for doing the actual work, but a much bigger exclusivity/exposure fee designed to prevent his voice getting overused to the point where future work dried up.

WalterBright
0 replies
17h20m

I'd pay to stop hearing Morgan Freeman's voiceovers everywhere. It's way overplayed.

At least do some James Earl Jones!

MrDresden
5 replies
21h16m

While I agree, I also did thoroughly enjoy a recent YouTube channel that did Warhammer 40K lore in Attenborough's voice. It recently got taken down for obvious reasons.

Makes you wonder if in a few years there will be licenses available for using someone's voice for such endeavours.

kkzz99
4 replies
20h58m

I have been creating audiobooks from ebooks for a while now. I just grab an .epub and turn it into a pretty good audiobook in minutes for completely free using local models.

thrwy_918
1 replies
17h20m

What's the state of the art in run-it-yourself TTS? I haven't seen anything that even comes close to what ElevenLabs is doing

sigmar
0 replies
14h49m

I converted a book to audiobook a couple of months ago using the TTS code here: https://github.com/coqui-ai/TTS

Wasn't as good as ElevenLabs, but it had enough options that I found a voice that I liked to listen to.
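
For anyone wondering what that workflow roughly looks like, here's a minimal sketch using Coqui's Python API. The model name, speaker ID, and hard-coded text are just illustrative assumptions (extracting the chapter text from the .epub is left out), not what either commenter actually ran.

    # pip install TTS   (Coqui TTS)
    from TTS.api import TTS

    # Multi-speaker English model; any model from `tts --list_models` would work.
    tts = TTS(model_name="tts_models/en/vctk/vits")

    # In a real script this text would be extracted from the epub;
    # it's hard-coded here for brevity.
    chapter_text = "It was the best of times, it was the worst of times."

    # "p225" is one of the VCTK speaker IDs; the full list is in tts.speakers.
    tts.tts_to_file(text=chapter_text, speaker="p225", file_path="chapter_01.wav")

Looping that over each chapter and concatenating the wav files gives a rough local audiobook, with quality depending entirely on the chosen model and voice.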

Tagbert
0 replies
20h53m

What do you use for the voice and the text to speech generator?

Dig1t
0 replies
19h27m

Yes please do share the models you are using

olalonde
4 replies
17h37m

Personally, I don't really see a problem with it as long as it's not used deceptively. In this case, it's pretty clear that it's not really David Attenborough speaking.

Outside of that, I am not sure I'm seeing a moral or economic argument for making voice copyrightable. The usual argument about incentivizing creation doesn't apply here.

tantalor
3 replies
17h20m

pretty clear that it's not really David Attenborough speaking

Not sure how you came to that conclusion. The bot mimics his speaking style very precisely.

harpiaharpyja
1 replies
17h14m

I'm pretty sure he came to that conclusion using context.

olalonde
0 replies
16h57m

Indeed.

olalonde
0 replies
16h57m

From the tweet and video where he explains how he did it...

anon_cow1111
3 replies
19h3m

The extremely obvious legal problem is that ALL simulated voices sound like some real person somewhere; it's just a matter of finding that person.

I don't see a solution to this, so I'll tend to believe the cynic's viewpoint that the final written law will be in favor of whoever has the most lawyers.

thrwy_918
0 replies
17h21m

ALL simulated voices sound like some real person somewhere

By the same logic, all simulated faces look like some real person somewhere, but I wouldn't expect to simulate a face that looks like Brad Pitt and then use it in a TV commercial

jstarfish
0 replies
18h49m

I don't see a solution to this

It's actually simple. Take David Attenborough's name out of the picture.

"A generic British guy is now narrating my life" isn't as funny as implicating a known celebrity, which is where the legal issues would arise. A generic British guy isn't going to sue you.

Hell, you might even be able to get away with "A generic British guy who sounds like David Attenborough is now narrating my life."

fragmede
0 replies
18h57m

Sure. The difference is that some voices, like David Attenborough's or Samuel L. Jackson's, are instantly recognizable to a large audience, while the voice of someone who isn't famous is not.

Where to put the bar is a specific issue, but the exemplars in their category stand out as recognizable.

thefourthchime
0 replies
15h47m

I've also cloned his voice; it's trivial.

That said, it'll be interesting to see how the legal side of this shakes out. Does one own their likeness and voice? Who decides what that is? There are plenty of voice actors who can do a great Morgan Freeman or Trump. How is that different?

samplatt
0 replies
16h13m

Cloning the voice of a dead person, to me, feels even ickier. Indigenous Australians would probably feel the same, too, considering they avoid contact with pictures or recordings of deceased people.

jojobas
0 replies
18h16m

If human vocal impersonation is legal (and impersonators are damn convincing at times), machine impersonation should be too.

RunSet
0 replies
16h18m

I don't know how I feel about cloning the voice of a living person,

I would like to hear it generate Kevin Conroy (R.I.P.) vocals for Christopher Nolan's Batman movies.

CamperBob2
0 replies
19h57m

Shouldn't be illegal, but I could understand it if we see laws that require explicit disclosure when voices of living people are simulated. At first glance that seems like the right compromise.

tptacek
14 replies
21h24m

This is simultaneously impressive and pretty gross.

i_am_jl
8 replies
21h21m

Do we have a word for this yet? I feel like it's becoming common enough that we'll soon need one.

devindotcom
3 replies
21h19m

generative impersonation, I suppose

AlecSchueler
2 replies
21h17m

But for something simultaneously impressive and gross?

rzzzt
1 replies
19h47m

grimpressive? improsse?

pvaldes
0 replies
17h26m

grosspresionist?

shmeeed
0 replies
20h28m

Uncanny vallAI

pawelduda
0 replies
21h18m

Imgrossive

jl6
0 replies
21h14m

CreePT

civilitty
0 replies
21h18m

Repugnificent.

throwaway4aday
4 replies
19h43m

Why do you think it is gross? Also, what does gross mean in this context? It's a very ambiguous term. Would it be less gross if this was a meme video where someone does an impression of Attenborough and just narrates what the person being filmed is doing in a comical way? There are probably a million of those.

tptacek
3 replies
16h55m

Yes, it would be different if this was an Attenborough impersonator.

lucubratory
2 replies
4h51m

Why?

tptacek
1 replies
44m

A pitch perfect Attenborough impression by a human would be an expression of talent and itself a kind of artwork. We understand intuitively that perfect impressions of famous people are scarce. When we watch something like this, we see that those impressions will be cheap, instantaneous and universally available; there is only the negative aspect, and none of the wonder at what it took to pull off.

Perhaps for a few minutes we're going to be marveling at the achievements of developers who make these things! But most people here, and, in the near future, everybody else, understand that a programming accomplishment today is a boxed up SKU next year. Most likely, in a very short period of time, the overwhelming majority of people making use of this technology will have had no hand in the accomplishment of building it.

So, yeah. Impressive. Creepy.

throwaway4aday
0 replies
19m

This is a good thing. It democratizes access to a product that very, very few could afford when it required a lot of human talent and time to produce.

In my experience, I only marvel at impressions done as an unexpected party trick, and the novelty quickly wears off. If I hear a voiceover that sounds very much like some famous person, I'm not imbued with wonder; I just perceive that the audio was well produced and don't give it a second thought.

I think this kind of automation may even be very good for people with unique voices who are looking to get into voice acting: they should be able to pay some service to clone their voice and then offer it on demand for a small fee. The amount of work they would have to put in is greatly reduced compared to doing all of the voice acting themselves, and they can take on more customers since the work can be done in parallel. It's like digital publishing for voice acting, with the same benefits and detriments, e.g. your voice can be cloned like your ebook or song can be torrented, but you also get a much bigger market and it's much easier to enter it.

xwdv
10 replies
21h10m

It’s exciting to imagine that we can soon enter the age where people will be able to voice their opinions in their own voice and choice of words, long after they are dead and gone from this physical world. Perhaps the first version of some digital afterlife.

midasuni
4 replies
20h55m

And of course anyone can use your voice to voice any opinion. Want Gandhi saying his words are backed with nuclear weapons? Want to portray Hitler as a nice loving chap? And that's before you get onto current politicians - https://www.theguardian.com/politics/2023/nov/10/faked-audio...

Dioxide2119
2 replies
19h54m

Hm. Gives a bit of truth to the Amish claim that having themselves photographed steals a bit of their soul. Your voice is (was) unique to you.

Recordings moved the needle to: your voice, when saying new phrases, is unique to you.

The voice cloning of the last few years means that your voice is unique to you only as long as it is never recorded and remixed.

A far more difficult proposition.

giraffe_lady
1 replies
19h38m

I sincerely find this to fall into the proscriptions against necromancy present in most ancient belief systems. The dead should not be made to speak the words of the living. And we should not create for ourselves any illusions about the completeness or finality of death.

It's not so much that it's a moral transgression as that it will undermine and corrupt our own understanding of what it means to be a living person.

xwdv
0 replies
19h31m

Turns out, it doesn't mean much to be a living person. Ideas are what matter to society, not the individual that spawns them. When you kill a revolutionary, you are only killing a man.

xwdv
0 replies
19h36m

This will be a fairly trivial problem to solve. Each spoken word can be cryptographically signed on some kind of distributed public ledger, and unless the words originate from a verified origin you cannot assume something you hear was indeed spoken by the original source LLM.

Only verified sentences coming from your LLM clone can be considered the actual you.
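
As a rough illustration of the signing half of that idea (the ledger and key-distribution parts are the hard bit and are skipped here), this is a minimal sketch using Ed25519 from the `cryptography` package; every name in it is made up for the example.

    # pip install cryptography
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # The "verified origin" holds the private key; listeners only need the public key.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    utterance = b"raw audio bytes (or a hash of them) for one spoken sentence"
    signature = private_key.sign(utterance)

    # Anyone with the public key can check whether the clip really came from that origin.
    try:
        public_key.verify(signature, utterance)
        print("spoken by the verified origin")
    except InvalidSignature:
        print("not verified - could be anyone's clone")

The open problem isn't the cryptography, it's getting listeners to check signatures at all and binding the public key to a real person.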

isoprophlex
2 replies
20h58m

I too long to retain some corporeal presence as a box, filled with an LLM trained on my life's collected texts and utterances. A featureless grey cube with only two buttons marked "more snark" and "more grossness", to be dusted off and interacted with every Christmas by my great-grandchildren.

thallium205
0 replies
18h46m

This comment will be added to your LLM.

metabagel
0 replies
20h24m

And a robotic finger to pull.

theGnuMe
0 replies
18h21m

This is a startup - digital tombstones or a digital Día de los Muertos.

cryptoz
0 replies
19h25m

But like, you can't know what someone would have said in the future if they aren't there to say it. No amount of LLM improvements would be able to know the real internal thoughts and memories of someone. You could guess, sure, but you don't know with any certainty.

People change. Small things change people - seeing a car speed by might alter your opinion on some variety of things, but no LLM was there to capture that moment.

It will always be a guess. You can never know for sure what someone who is deceased would think about something in the future since they are simply not there.

rcarmo
7 replies
20h49m

I want this, but with Werner Herzog. It would be the only way to do justice to the existential angst that pervades some of my meetings.

slothtrop
2 replies
20h24m

There's an AI-generated word-salad debate between Herzog and Zizek: https://infiniteconversation.com/

pacifika
1 replies
20h18m

Flagged by Safari for an invalid cert

justin_oaks
0 replies
20h10m

Should happen on any browser. Try https://www.infiniteconversation.com/ instead.

zestyping
0 replies
8h18m

I just did this. I couldn't resist. I looked into the camera and Werner said:

In the quiet chaos of an ordinary room, a lone figure peers into the abyss of a lens, a silent sentinel in the company of walls adorned with fragments of a life well-lived. The ambient light casts a celestial glow atop their crown, an unintentional halo in a moment of serendipity. The gaze, piercing yet calm, confronts the onlooker, challenging the nature of existence through the simple act of capturing a fleeting second in the digital ether. Here stands a human, enshrined in the mundane yet extraordinary tableau of their own making.
willseth
0 replies
20h22m

Came here to say this. This is correct.

willcipriano
0 replies
19h36m

Mike Rowe would be good for doing work around the house.

kadomony
7 replies
20h53m

Given the Attenborough Lore channel for Warhammer received a cease and desist, you can bet that his legal team will be busy suing the shit out of people like this.

hyggetrold
4 replies
20h50m

I was so bummed about that. It makes sense of course but those videos were really fun.

DaiPlusPlus
2 replies
20h25m

Was it noncommercial and done in the spirit of satire (surrealism?), in which case surely they'd get a pass? But I guess they just didn't want to go bankrupt trying to defend themselves.

It's a shame then - compare it to the situation with the nerdcore gangsta-rapper alter-ego of Stephen Hawking, MC Hawking, who was created without Dr. Hawking's authorization and sold real physical CDs for a profit (and I own one!), but who ultimately received an endorsement from the late physicist, who took it all in good humour - and arguably Stephen Hawking is a bigger public figure than David Attenborough.

...if it really does come down to just personal approval by the subject themselves (and not their legal/marketing teams acting without them even being aware), then I'm concerned about what it means for artistic liberty vs. public figures.

kadomony
1 replies
20h10m

I don't believe it was intended to be satirical. The depth of worldbuilding and lore in Warhammer affords an immense opportunity for Games Workshop to actually do some kind of series like what was created with Attenborough's likeness. His voice is his main source of income for his estate nowadays, so it makes sense he doesn't want it replicated, especially without any permission being granted.

throwaway4aday
0 replies
19h38m

Too bad they'd never do that and would probably do a worse job if they did. There should be carve outs for fans to make things that bloated corporations will never have the ingenuity to come up with or the balls to carry out.

kadomony
0 replies
20h21m

Same. When I logged into YouTube and saw "Scholar's Lore", my heart sank and I immediately knew what had come to pass. Attenborough is fiercely protective of his voice.

russellbeattie
1 replies
18h50m

That was such a great channel. The voice was paired with pitch perfect writing as well. The script had exactly the sort of flow that an Attenborough BBC program would have, plus it was actually really informative.

He really should re-dub the entire series with a unique AI generated voice and repost. They wouldn't be as good, but still pretty close.

kadomony
0 replies
17h28m

He is doing exactly that! It's not as unique as Attenborough's voice, but it's nice to see every video being reuploaded slowly. He's even taken feedback from the previous videos into consideration to "remaster" them.

SpaceManNabs
4 replies
20h45m

I think it's nasty to provide this as a service to others, because the owner of the voice didn't consent. I don't agree with others who also critique personal use, though. Personal use should be fair use as long as you aren't stealing or mining a service to minimize some personal cost.

So is this a text-to-speech model that mimics somebody's voice? That is pretty cool.

One thing I don't understand is how you get specific models for someone's voice. Do you fine-tune on that voice?

The issue of style transfer is not something I understand for audio. Anyone have example papers to read? I am familiar with MuLan, MusicLM, and MusicGen.

spyder
3 replies
19h26m

In the tweet he says he is using https://elevenlabs.io for the voice, which is currently the best-quality text-to-speech service and allows for voice cloning too.

And Elevenlabs is saying this about cloning other people's voices:

Can I clone anybody's voice?

Yes, as long as you have their consent. For Professional Voice Cloning, we have integrated robust security measures to make sure you can only clone your own voice. Unless you share it, your voice belongs and is available only to you.

But it looks like it's not enforced if the example in the tweet is possible.

But even if it were enforced by the company, open-source models are getting close to them and they are harder to control. The public or commercial use of the output from these models can (and probably should) be limited by laws, but if users want to use it privately (for example to narrate themselves) there is not much you can do to prevent it, especially since, with the progress in this tech, it will become trivial for anybody to do.

leobg
2 replies
19h21m

It actually is enforced. They have a verification system where they show you random text on the screen and you have to read that into the microphone. So I don’t know how the tweet author got around that.

chmod775
1 replies
15h39m

I don’t know how the tweet author got around that.

If I had to defeat such a thing, I'd try to fake the voice using a good-enough AI running locally.

Alternatively use some voice-changer/editing magic to make your own voice sound "good enough".

leobg
0 replies
11h55m

I tried that. But their recognition model seems to be quite picky. And you need to change the voice in real time. No time to edit anything.

nextworddev
2 replies
19h39m

Wonder how he’s deciding which frame to feed to judge.py, given it would be tremendously wasteful to call gpt4vision on every frame. Maybe there’s some logic to detect meaningful drifts?

circuit10
0 replies
19h31m

Looks like it just does it every 5 seconds

AussieWog93
0 replies
19h33m

In the video, it just captured a frame every 5s. You can see the way he holds up his cup of water for a little too long to make sure it gets captured.
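
A minimal sketch of that capture-every-5-seconds loop, assuming OpenCV for the webcam and the OpenAI Python client for the vision call; the prompt, model name, and structure here are illustrative guesses, not the author's actual judge.py.

    # pip install opencv-python openai
    import base64
    import time

    import cv2
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        _, jpeg = cv2.imencode(".jpg", frame)
        b64 = base64.b64encode(jpeg.tobytes()).decode()

        # One vision call per captured frame.
        resp = client.chat.completions.create(
            model="gpt-4-vision-preview",
            messages=[{
                "role": "user",
                "content": [
                    {"type": "text",
                     "text": "Narrate this scene in the style of a nature documentary."},
                    {"type": "image_url",
                     "image_url": {"url": f"data:image/jpeg;base64,{b64}"}},
                ],
            }],
            max_tokens=150,
        )
        print(resp.choices[0].message.content)
        time.sleep(5)  # one frame every 5 seconds, as in the demo video

A smarter variant could diff consecutive frames and only call the API when the scene changes meaningfully, which is what the parent comment was speculating about.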

hyperific
2 replies
20h39m

This is nuts. I want a version of this where Emma Thompson narrates my life like Stranger than Fiction.

jph
1 replies
20h35m

You're right, Stranger than Fiction is perfect for this, and her narration would be excellent.

"Harold checks his watch then begins to read Hacker News...."

93po
0 replies
3h20m

Imagine the shameful tone when a private browser window opens

BlueTemplar
2 replies
20h15m

Reminded me of the South Park guys' (et al.) take on Deep Fakes from a few years ago, especially the bit with Michael Caine:

https://www.youtube.com/watch?v=9WfZuNceFDM

(Of course they used a real fake voice for all of them.)

expertentipp
1 replies
17h27m

Did Trump marry his daughter? I'm out of touch with American politics.

93po
0 replies
3h18m

Yes, and then he peed the bed and slapped Nancy Pelosi's ass

gumballindie
1 replies
17h15m

Without consent, of course.

spiderice
0 replies
13h10m

You don't need consent to create parodies, which this clearly is.

floatrock
1 replies
18h51m

Isn't part of the actors' strike about maintaining rights over AI reproductions of an actor's likeness?

This is a brilliant and effective demo of why that's important.

gumballindie
0 replies
17h13m

I am starting to believe a lot of people working in AI are sociopaths. Their fixation with impersonating people and stealing IP points firmly in that direction.

wsintra2022
0 replies
15h33m

Pretty cool. A couple of months back a friend asked me if I could get him a Carl Rogers audiobook; I forget which one. I told him sure and jokingly asked if he wanted David Attenborough reading it. He said yes, and I had myself a challenge. Long story short, I was able to create an audiobook for my friend. The voice was meh!

throwawaaarrgh
0 replies
19h35m

Take my money

oldshatterhand
0 replies
18h39m

I need Werner Herzog to narrate my life!

"Look in the eyez of this nerd. The intenzity of the ztupidity looking at back at you is juzt amazing."

itissid
0 replies
21h5m

Oh man what a time to be alive. The creativity is amazing. Bravo.

fnordpiglet
0 replies
12h58m

This is the future I was promised.

divbzero
0 replies
19h27m

YouTube is developing revenue sharing for AI-generated music [1]. I wonder if they will extend revenue sharing to non-musical content like the narration in OP.

[1]: https://www.theverge.com/2023/8/21/23840026/youtube-ai-music...

consumer451
0 replies
13h11m

I have really enjoyed the new show Scavengers Reign. It does an amazing job of showing an alien planet's biodiversity. I have fantasized about a cut of just the nature scenes with Attenborough narrating. It seems like this makes an attempt at that.

atleastoptimal
0 replies
20h56m

I tried making something like this but OpenAI restricts Vision API calls to 100 per day rn

VikingCoder
0 replies
20h3m

I think you shouldn't be able to distribute code / model like this without permission...

...but...

What if you distributed code that when I ran it, and plugged in each of the Blue Planet / Planet Earth DVDs, it would learn how to do this, with a local model?

OnlyMortal
0 replies
18h40m

“… and here we have the common Twitter poster. Their attempt to gain favour with their peers and, therefore, popularity with potential mates falls foul once again.”

- Attenborough maybe.

LightBug1
0 replies
19h24m

Interesting ... and awful ... it sharpens how I feel about it knowing that the man is nearing the end of his life. This is absolutely not how I want to experience and remember the man.

Dig1t
0 replies
21h10m

This is astounding. This is the kind of demo that we could only dream of just five years ago, now some hacker dude on Twitter can just throw this kind of thing together. Seriously cool.