[HN Gopher] Statement from Scarlett Johansson on the OpenAI "Sky...
___________________________________________________________________
Statement from Scarlett Johansson on the OpenAI "Sky" voice
Author : mjcl
Score : 1299 points
Date : 2024-05-20 22:28 UTC (13 hours ago)
(HTM) web link (twitter.com)
(TXT) w3m dump (twitter.com)
| alsodumb wrote:
| Why do I feel like Sam's 'her' tweet pretty much gave Scarlett
| Johansson's legal counsel all the ammo they needed lol.
| CharlesW wrote:
| Also, it shows that today's blog post was fiction.
| crimsoneer wrote:
| The sky voice they took down has existed for more than a
| year. It's different to the new demo that kicked this all
| off.
| CharlesW wrote:
| The voice didn't change, just the ability of the model to
| "output laughter, singing, or express emotion".
| elevatedastalt wrote:
| It probably made things worse, but the fact that they reached
| out to her to use her voice and she explicitly refused would be
| sufficient ammo I feel. (Not a lawyer of course).
|
| Of course, Twitter continues to bring people with big egos to
| their own downfall.
| not2b wrote:
| Not to mention that it matches up pretty much exactly with
| the Bette Midler and Tom Waits cases, where courts ruled
| against companies using soundalikes after the person they
| really wanted turned them down. Doesn't matter if they hired
| a soundalike actress rather than clone her voice, it still
| violates her rights.
| cortesoft wrote:
| I am guessing this is only because they first tried to hire
| the originals before hiring sound-alikes... otherwise,
| would it mean that if your voice sounds similar to someone
| else, you can't do voice work?
| recursive wrote:
| > would it mean that if your voice sounds similar to
| someone else, you can't do voice work?
|
| Maybe only when the director's instructions are "I want
| you to sound like XYZ".
| wilg wrote:
| Of course, surely you can do this if you're playing a
| character, such as impersonating Trump or Obama or an
| actor on SNL?
| not2b wrote:
| Yes, parody's fine when it's clearly parody. But if you
| try to pretend that Trump or Obama (rather than an
| impersonator) is endorsing a product, you're in trouble.
| wilg wrote:
| but openai has only ever said that the chatgpt voice has
| nothing to do with scojo
| smugma wrote:
| Exception: Parody is covered under the first amendment.
| meat_machine wrote:
| >Wheel of Fortune hostess Vanna White had established
| herself as a TV personality, and consequently appeared as
| a spokesperson for advertisers. Samsung produced a
| television commercial advertising its VCRs, showing a
| robot wearing a dress and with other similarities to
| White standing beside a Wheel of Fortune game board.
| Samsung, in their own internal documents, called this the
| "Vanna White ad". White sued Samsung for violations of
| California Civil Code section 3344, California common law
| right of publicity, and the federal Lanham Act. The
| United States District Court for the Southern District of
| California granted summary judgment against White on all
| counts, and White appealed.
|
| >The Ninth Circuit reversed the District Court, finding
| that White had a cause of action based on the value of
| her image, and that Samsung had appropriated this image.
| Samsung's assertion that this was a parody was found to
| be unavailing, as the intent of the ad was not to make
| fun of White's characteristics, but to sell VCRs.
|
| https://en.wikipedia.org/wiki/White_v._Samsung_Electronic
| s_A....
|
| Maybe it depends on which court will handle the case, but
| OpenAI's core intent isn't parody, but rather to use
| someone's likeness as a way to make money.
|
| (I am not a lawyer)
| not2b wrote:
| Good voice actors can do a whole range of voices,
| including imitating many different people. The cases
| where someone got in trouble are where there's
| misrepresentation. If it goes to court, there's
| discovery, and if the OpenAI people gave specific
| instructions to the voice actor to imitate Scarlett
| Johansson, after denying it, there could be big trouble.
| We don't know that, but it looks likely given how they
| first approached her and how they seemed to be going for
| something like the "Her" film.
| planede wrote:
| It's only really about the intent of the voice to sound
| like the original. Reaching out to the originals first
| implies intent, so it makes the case easier.
|
| It would be harder to find a case if they simply just
| hired someone that sounds similar, but if they did that
| with the intention to sound like the original that's
| still impersonation, only it's harder to prove.
|
| If they just happened to hire someone that sounded like
| the original, then that's fair game IMO.
|
| IANAL
| joegibbs wrote:
| Definitely. GPT4o has a voice that sounds like Scarlett
      | Johansson? They'd probably get away with it, I'm sure there
| are a lot of people that sound like her. Tweeting a reference
| to a movie she was in - a bit more murky because it's
| starting to sound like they are deliberately cloning her
| voice. Asking to use her voice, then using a soundalike, then
| referencing the movie? 100%, no doubt.
| akr4s1a wrote:
| So was asking her to reconsider 2 days before the demo, how
| blatant can you get
| OrangeMusic wrote:
| They really wanted her voice yes. Does that prove anything?
| xyst wrote:
| https://en.wikipedia.org/wiki/Res_ipsa_loquitur
| dclowd9901 wrote:
    | They'll make the case that he was referring to the abilities of
    | the device, but I think the fact that they were pushing her so
    | hard for her involvement will actually be the damning aspect of
    | that line of defense.
| fareesh wrote:
| am I wrong to think this was the plan all along?
|
| mainstream adoption hasn't been that great - now there's drama
| heyoni wrote:
| You can read that tweet?
| dheera wrote:
| What are the chances that among 7 billion people in the world
  | there are always going to be 100 people that sound like you?
| If Sam Altman was going for a particular voice, there are
| probably 100 people that indistinguishably have that voice and it
| just becomes a question of a headhunt.
| guhidalg wrote:
| One word: "Her"
| robbomacrae wrote:
| That's precisely what they did with Doodle God to imitate
    | Morgan Freeman [0] and how James Veitch deepfaked David
| Attenborough in his PLnaT eRth video [1].
|
| [0]: https://www.mercurynews.com/2021/07/20/how-the-doodle-god-
| un...
|
| [1]: https://youtu.be/-CopbQ_QgmM?si=gkbWEva_qqG8dTib&t=205
| karaterobot wrote:
| Trying to imitate her voice to get around paying her wouldn't
| be okay either. _Waits vs. Frito Lay_ taught us that, if
| nothing else did. The question is whether people would think
    | they were hearing Scarlett Johansson's voice when using that
| product, and the answer is yes, so they have to pay her to
| trade on her identity.
| ClassyJacket wrote:
| Since Microsoft has given up on her, they should hire Jen Taylor
| and do almost-Cortana.
| heyoni wrote:
| Deleted?
| jaykru wrote:
| Try another browser; I wasn't able to open it on Librewolf
| (Firefox fork.)
| heyoni wrote:
| I'm on safari on iPhone though I do use the safari extension
| for adblocking...
| PixelPaul wrote:
| I am really not liking this Sam guy and how he does things. He
| has an attitude of "my way and only my way, and I don't care what
| you think or do"
| oglop wrote:
| That's all successful tech companies out of Silicon Valley.
|
| It is a silly place.
| dylan604 wrote:
| On second thought, let's not go there.
|
| It's not just successful companies though. There is a bit of
| ego necessary in a founder that makes them think their idea
| or their implementation of a thing is better so that it needs
| to be its own company. Sometimes though they even get caught
| up in their own reality distortion fields with obviously bad
| ideas or ideas implemented badly due to their own arrogance
      | that ultimately fail.
| aeurielesn wrote:
| Him and pretty much the entire SV culture.
| talldayo wrote:
| Relax guys, he's _Open_!
|
| Open for business, Open to suggestions, and Open season for
| any lawyers that want a piece of the sizable damages.
| dylan604 wrote:
| This is pretty much quintessential founder behavior. I have had
| my run-ins with people like this, and the relationship is
| usually short lived. I do not drink the kool-aid, and question
| pretty much everything. These types of personalities are like
| oil and water and do not mix. You almost need a third person to
    | act as an emulsifier to allow the oil and water to mix without
| separating.
| laborcontract wrote:
| An emulsifier role is a great way to put it. In basketball,
| that's the glue guy.
| suzzer99 wrote:
| Yeah this explains why most of my interviews with startups
| haven't gone well. I've even had friends ask me to do PoC
| work for them for equity, and they _still_ get all bent out
| of shape when I 'm not instantly smitten by their one-
| sentence pitch, and instead ask for more details about their
| business plan.
|
| If I thought I had a great idea, I would _want_ people to try
      | to poke holes in it. Yet founders almost universally seem to
      | be incredibly sensitive and insecure about their ideas.
| mmh0000 wrote:
| You don't like the sister rapist Sam Altman[1][2]? Seems like
| everybody should LOVE this guy!
|
| [1] https://www.lesswrong.com/posts/QDczBduZorG4dxZiW/sam-
| altman...
|
| [2] https://www.hackingbutlegal.com/p/statement-by-annie-
| altman-...
| octopoc wrote:
| > "Sam and Jack, I know you remember my Torah portion was
| about Moses forgiving his brothers. "Forgive them father for
| they know not what they've done" Sexual, physical, emotional,
| verbal, financial, and technological abuse. Never forgotten."
|
| That is...not a pretty picture. We desperately need someone
| else at the helm of OpenAI.
| vundercind wrote:
| Heavily self-promoting, brash aren't-I-great business sorts are
| a pretty damn consistent type across time, and it's not a
| _good_ type.
| aaronharnly wrote:
| Well, this confirms that OpenAI have been shooting from the hip,
| not that we needed much confirmation. The fact that they
| repeatedly tried to hire Johansson, then went ahead and made a
| soundalike while explicitly describing that they were trying to
  | make it sound like her voice in the movie ... is pretty bad for
| them.
| llamaimperative wrote:
| "Shooting from the hip" is giving them too much credit. Actual
| knowing malice and dishonesty is more like it.
| infotainment wrote:
    | It's definitely sketchy (classic OpenAI). But my question is: is
| what they did actually illegal? Can someone copyright their own
| voice?
| automatoney wrote:
| In the United States, likeness rights vary by state
| https://en.wikipedia.org/wiki/Personality_rights
| foota wrote:
| I think this will fall under what are termed personality
| rights, and the answer varies by state within the US.
| duskwuff wrote:
| It's not precisely copyright, but most states recognize some
| form of personality rights, which encompass a person's voice
| just as much as the person's name or visual appearance.
| bhhaskin wrote:
        | But where it will get murky is that people sound like other
| people. Most voices are hardly unique. It will be
| interesting to see where this lands.
| ocdtrekkie wrote:
| It isn't murky, because law is about _intent_ more than
          | result. It doesn't matter if they hired someone who
| sounds like Scarlett, it matters if they _intended to do
| so_.
|
| If they accidentally hired someone who sounds identical,
| that's not illegal. But if they intended to, even if it
| is a pretty poor imitation, it would be illegal because
| the intent to do it was there.
|
| A court of law would be looking for things like emails
| about what sort of actress they were looking for, how
| they described that requirement, how they evaluated the
| candidate and selected her, and of course, how the CEO
| announced it alongside a movie title Scarlett starred in.
| howbadisthat wrote:
            | Under what legal theory does intending to do something
            | which is legal (hiring a person who has a voice you
            | want) become illegal because there is another person who
            | has a similar voice?
| ocdtrekkie wrote:
| It's not intending to do something legal, it's intending
| to do something illegal: Stealing their likeness. The
| fact you used an otherwise legal procedure to do the
| illegal activity doesn't make it less illegal.
| howbadisthat wrote:
| How can something be illegal if every step towards the
| objective is legal? This would result in an incoherent
| legal system where selective prosecution/corruption is
| trivial.
| ocdtrekkie wrote:
| It is legal to buy a gun, and legal to fire a gun, and it
| can even be legal to fire a gun at someone who is
| threatening to kill you in the moment, but if you fire a
                  | gun at someone with the intention of killing them, that
| happens to be very, very illegal.
| howbadisthat wrote:
| Very well. But in this case the end goal is the end of
| someone's unique life.
|
| In the case of acquiring a likeness, if it's done legally
| you acquire someone else's likeness that happens to be
| shared with your target.
|
| The likeness is shared and non-unique.
|
                    | If your objective is to take someone's life, there is no
| other pathway to the objective but their life. With
| likeness that isn't the case.
| kelnos wrote:
| So? You're merely (correctly) pointing out that the acts
| have consequences that are of wildly differing severity.
                      | Not that one is legal and the other is not.
| tivert wrote:
| OpenAI should hire you as their lawyer.
| jcranmer wrote:
| What's illegal, in general, is not the action itself but
| the intent to do an action and the steps taken in
| furtherance of that intent.
|
| Hiring someone with a voice you want isn't illegal;
| hiring someone with a voice you want _because_ it is
| similar to a voice that someone expressly denied you
| permission to use is illegal.
|
| Actually, it's so foundational to the common law legal
| system that there's a specialized Latin term to represent
| the concept: mens rea (literally 'guilty mind').
| tivert wrote:
| > But where it will get murky is people sound like other
| people. Most voices are hardly unique. It will be
| interesting to see where this lands.
|
| Yes, it will be interesting in June 1988 when we will
| find out "where this lands":
| https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
| emmp wrote:
| There are two similar famous cases I know offhand. Probably
| there are more.
|
| https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
|
      | Bette Midler successfully sued Ford for appropriating her
| likeness in a commercial.
|
| Then also:
|
| https://casetext.com/case/waits-v-frito-lay-inc
|
| Tom Waits successfully sued Frito Lay for using an imitator
| without approval in a radio commercial.
|
| The key seems to be that if someone is famous and their voice
      | is distinctly attributable to them, there is a case. In both
| of these cases, the artists in question were also solicited
| first and refused.
| npunt wrote:
| Also Crispin Glover's case in Back to the Future II
|
| https://www.hollywoodreporter.com/business/business-
| news/bac...
| dralley wrote:
| What if the imitator is clearly an imitator? e.g.
| https://www.youtube.com/watch?v=YvF0l8RUGQ8
| gcanyon wrote:
| That's weird -- I would think Morgan Freeman would be
| able to sue over that, but I Am Not An Intellectual
| Property Lawyer.
| kelnos wrote:
| I feel like that's a little different. In the cases of
| Midler, Waits, and Johansson, the companies involved
| wanted to use their voices, were turned down, and then
| went with an imitator to make it seem to the audience
| that the celebrity was actually performing. In the case
| of this "Morgan Freeman" video, Freeman himself is very
| obviously not performing: the imitator appears on screen,
| so it's explicitly acknowledged in the ad.
|
| But I'm not a lawyer of any sort either, so... ::shrug::
| hooloovoo_zoo wrote:
        | Both cases also seem to have borrowed from the artists'
        | songs, however. That could perhaps make a difference.
| pseudalopex wrote:
| Bette Midler and Tom Waits didn't control their songs
| when they sued the companies.
| hooloovoo_zoo wrote:
| But it makes it more likely that the listener will
| associate the commercial with the artist than just using
| the voice.
| deprecative wrote:
| True to an extent. I'd argue that celebrity of a certain
| level would make one's voice recognizable and thus
| confusion can happen.
| pseudalopex wrote:
| The Midler v. Ford decision said her voice was
| distinctive. Not the song.
|
| OpenAI didn't just use a voice like Scarlett Johansson's.
| They used it in an AI system they wanted people to
| associate with AI from movies and the movie where
| Johansson played an AI particularly.[1][2]
|
| [1] https://blog.samaltman.com/gpt-4o
|
| [2] https://x.com/sama/status/1790075827666796666
| kcplate wrote:
| You would have to argue the distinctiveness of the voice
| (if they hadn't already pursued her to do it). Tom
        | Waits...that's a pretty distinctive voice. Scarlett
        | Johansson...not so much.
| yread wrote:
| The Tom Waits case had a payout of 2.6 million for services
| with fair market cost of 100k. What would it cost openai to
| train chatgpt using her voice? Is she also going to get a
| payout 26 times that? That GPU budget is starting to look
| inexpensive...
| aaronharnly wrote:
      | I'm not a lawyer and don't have any deep background in this
      | area of IP, but there is at least some precedent apparently:
|
| > In a novel case of voice theft, a Los Angeles federal court
| jury Tuesday awarded gravel-throated recording artist Tom
| Waits $2.475 million in damages from Frito-Lay Inc. and its
| advertising agency.
|
| > The U.S. District Court jury found that the corn chip giant
| unlawfully appropriated Waits' distinctive voice, tarring his
| reputation by employing an impersonator to record a radio ad
| for a new brand of spicy Doritos corn chips.
|
| https://www.latimes.com/archives/la-
| xpm-1990-05-09-me-238-st...
| crazygringo wrote:
| Yes, absolutely illegal. You don't need to copyright
      | anything, you simply own the rights to your own likeness -- your
| visual appearance and your voice.
|
| A company can't take a photo from your Facebook and plaster
| it across an advertisement for their product without you
| giving them the rights to do that.
|
| And if you're a known public figure, this includes lookalikes
| and soundalikes as well. You can't hire a ScarJo impersonator
| that people will think is ScarJo.
|
| This is clearly a ScarJo soundalike. It doesn't matter
| whether it's an AI voice or clone or if they hired someone to
| sound just like her. Because she's a known public figure,
| that's illegal if she hasn't given them the rights.
|
| (However, if you generate a synthetic voice that just happens
| to sound exactly like a random Joe Schmo, it's allowed
| because Joe Schmo isn't a public figure, so there's no value
| in the association.)
| zooq_ai wrote:
| But is that Scarlett Jo or Producers of Her that own the
| copyright?
|
| If you imitate Darth Vader, I don't think James Earl Jones
        | has as much of a case for likeness as the Star Wars franchise
| ceruleanseas wrote:
| James Earl Jones sold his voice rights to Disney a couple
| of years ago, so they can continue to use an AI likeness
| of his voice for future movies.
| https://ambadar.com/insights/james-earl-jones-signs-off-
| his-...
| crazygringo wrote:
| It's both.
|
| If you just want ScarJo's (or James Earl Jones') voice,
| you need the rights from them. Period.
|
| If you want to reuse the _character_ of her AI bot from
| the movie (her name, overall personality, tone, rhythm,
| catchphrases, etc.), or the _character_ of Darth Vader,
| you _also_ need to license that from the producers.
|
| And _also_ from ScarJo /Jones if you want the same voice
| to accompany the character. (Unless they've sold all
| rights for future re-use to the producers, which won't
| usually be the case, because they want to be paid for
| sequels.)
| nickthegreek wrote:
| If they didn't use her actual voice for the training,
            | didn't hire voice talent to imitate her, didn't pursue her
| for a voice contract, didn't make a reference to the movie
| in which she voices an AI, I feel OpenAI would have been on
| more stable legal footing. But they aren't playing with a
| strong hand now and folded fast.
| rockemsockem wrote:
| You're 100% correct and there's precedent
|
| https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
| GuB-42 wrote:
| Not only that but they didn't credit the voice actress
| who sounds like her. If she was semi-famous and just
| naturally sounded like Scarlett Johansson, maybe they
| could have an argument: "it is not Scarlett, it is the
| famous [C-list actress] who worked in [production some
| people may know]".
| howbadisthat wrote:
        | Scarlett owns the voice of a stranger that happens to sound
| like her? That seems absurd.
|
| Just find someone who sounds like her, then hire them for
| the rights to their voice.
| callalex wrote:
| It's really hard to assume in good faith that you are
| unfamiliar with the concept of impersonation. Just in
| case: https://en.m.wikipedia.org/wiki/Impersonator
|
| There is no doubt that the hired actor was an
| impersonator, this was explicitly stated by scama
| himself.
| howbadisthat wrote:
| The variance in voice is not that great. Just find
| someone who is very close to her voice naturally.
| airstrike wrote:
| Doesn't matter if the intent is to make the listener
| think they're hearing ScarJo
| sneak wrote:
| I missed that; where did he say that?
| warcher wrote:
| It's just that her voice by itself is relatively
            | unremarkable. Someone like, say, Morgan Freeman or Barack
            | Obama, someone with a distinctive vocal delivery, that's
            | one thing. Scarlett Johansson, I couldn't pick her voice
            | out of a lineup. I'm sure it's pleasant, I just can't
            | think of it.
| llamaimperative wrote:
| Scarlett Johansson does absolutely have a distinctive and
| very famous voice. I wouldn't take your own ignorance
| (not meant disparagingly) as evidence otherwise.
|
| That's why she was the voice actor for the AI voice in
| Her.
| serf wrote:
| >That's why she was the voice actor for the AI voice in
| Her.
|
| She was used in Her because she has a
| dry/monotone/lifeless form of diction that at the time
| seemed like a decent stand-in for an non-human AI.
|
                | IMDB is riddled with complaints about her vocal-
| style/diction/dead-pan on every one of her movies. Ghost
| World, Ghost in the Shell, Lost in Translation, Comic-
| Book-Movie-1-100 -- take a line from one movie and dub it
| across the character of another and most people would be
| fooled, that's impressive given the breadth of
| quality/style/age across the movies.
|
| When she was first on the scene I thought it was bad
| acting, but then it continued -- now I tend to think that
| it's an effort to cultivate a character personality
| similar to Steven Wright or Tom Waits; the fact that
| she's now litigating towards protection of her character
| and likeness reinforces that fact for me.
|
                | It's unique to her though, that's for sure.
| kristiandupont wrote:
| >She was used in Her because she has a
| dry/monotone/lifeless form of diction that at the time
| seemed like a decent stand-in for an non-human AI
|
| Do you have a source for this?
| tivert wrote:
| > There is no doubt that the hired actor was an
| impersonator, this was explicitly stated by scama
| himself.
|
| And here's some caselaw where another major corporation
| got smacked down for doing the exact same thing:
| https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
|
| But given how unscrupulous Sam Altman appears to be, I
| wouldn't be surprised if OpenAI hired an impersonator as
            | some kind of half-assed legal cover, and went about using
            | Johansson's voice anyway. Tech people do stupid shit
| sometimes because they assume they're so much cleverer
| than everyone else.
| planede wrote:
| Impersonating is defined by intent. "Just find someone
| who sounds like her" implies intent.
| kcplate wrote:
        | The problem is they pursued, were rejected, then approximated.
| Had they just approximated and made no references to the
| movie...then I bet social marketing would have made the
        | connection organically and neither Ms Johansson nor the Her
| producers would have much ground because they could
| reasonably claim that it was just a relatively generic
| woman's voice with a faint NY/NJ accent.
| simonsarris wrote:
| This is known as personality rights or right to publicity.
| Impersonating someone famous (eg faking their likeness or
| voice for an ad) is often illegal.
|
| https://en.wikipedia.org/wiki/Personality_rights
| bl4kers wrote:
| Not here to weigh in on the answers to these questions. But
| it certainly feels pretty scary to have to ask such questions
| about a company leading the LLM space, considering the U.S.
      | currently has little to no legal infrastructure to rein in
| these companies.
|
| Plus the tone of the voice is likely an unimportant detail to
      | their success. So pushing up against the legal boundaries in
| this specific domain is at best strange and at worst a huge
| red flag for their ethics and how they operate.
| tootie wrote:
| This is so pointless and petty too. Like "hee hee our software
| is just like the movies". And continuing the trend of tech
| moguls watching bleak satire and thinking it's aspirational.
| steveBK123 wrote:
| How do people watch 15 seconds of a demo like this -
| https://x.com/OpenAI/status/1790072174117613963
|
| And not see how over the top it is... cmon.
| yazzku wrote:
| If anyone thinks this demo is cool, I regret to inform you
| that your life is very, very sad.
| tsimionescu wrote:
| "Over the top" means exaggerated and corny, almost the
| opposite of "cool".
| Balgair wrote:
| Sci-Fi Author: In my book I invented the Torment Nexus as a
| cautionary tale
|
| Tech Company: At long last, we have created the Torment Nexus
| from classic sci-fi novel Don't Create The Torment Nexus
|
| https://x.com/AlexBlechman/status/1457842724128833538?lang=e.
| ..
| whamlastxmas wrote:
| I think this is such a massively trivial detail it's hard to
| draw broader conclusions from it
| signal11 wrote:
| OpenAI claimed they hired a different professional actor who
| performed using her own voice [1].
|
| If so, I _suspect_ they'll be okay in a court of law -- having
| a voice similar to a celebrity isn't illegal.
|
| It'll likely cheese off actors and performers though.
|
| [1] https://www.forbes.com/sites/roberthart/2024/05/20/openai-
| sa...
| hn_20591249 wrote:
| Seems like sama may have put a big hole in that argument when
| he tweeted "her", now it is very easy to say that they
| knowingly cloned ScarJo's likeness. When will tech leaders
| learn to stop tweeting.
| crimsoneer wrote:
| Yeeah, this was very stupid. Sigh.
| catchnear4321 wrote:
| only when they can get a bigger fix from something else.
|
| it takes more than money to fuel these types, and they
| would have far better minders and bumpers if the downside
| outweighed the upside. they aren't stupid, just addicted.
|
| musk was addict smart, owned up to his proclivities and
| bought the cartel.
| chipweinberger wrote:
| Or perhaps they cloned a character's likeness?
|
| Is there a distinction?
|
| Are they trying to make it sound like Her, or SJ? Or just
| trying to go for a similar style? i.e. making artistic
| choices in designing their product
|
| Note: I've never watched the movie.
| romwell wrote:
| >Is there a distinction?
|
| Yes, _that_ would be a copyright violation _on top of_
| everything else.
|
| Great idea though!
|
| I'm going to start selling Keanu Reeves T-Shirts using
| this little trick.
|
          | See, I'm not using Keanu's _likeness_ if I don't label
| it as Keanu. I'm just going to write _Neo_ in a Tweet,
          | and then say I'm just cloning _Neo_'s likeness.
|
| Neo is not a real person, so Keanu can't sue me!
| _Bwahahaha_
| moralestapia wrote:
| If you find a guy that looks identical to him, however
| ...
| romwell wrote:
| >If you find a guy that looks identical to him, however
| ...
|
| ...it wouldn't make any difference.
|
| A Barack Obama figurine is a Barack Obama figurine, no
| matter how much you say that it's _actually_ a figurine
              | of Boback O'Rama, a random person that _coincidentally_
              | looks identical to the former US President.
| whoknowsidont wrote:
| I love these takes that constantly pop up in tech
| circles.
|
| There's no way "you" (the people that engage in these
| tactics) believe anyone is that gullible to not see
| what's happening. You either believe yourselves to be
| exceedingly clever or everyone else has the intelligence
                | of a toddler.
|
| With the gumption some tech "leaders" display, maybe
| both.
|
| If you have to say "technically it's not" 5x in a row to
| justify a position in a social context just short-circuit
| your brain and go do something else.
| planede wrote:
| Nitpick: it's not copyright, it's personality rights and
| likeness. It's a violation of it nonetheless.
| gcr wrote:
| That's a weaksauce argument IMO. The character was played
| by SJ. Depictions of this character necessarily have to
| be depictions of the voice actress.
|
| Your argument may be stronger if OpenAI said something
| like "the movie studio owns the rights to this
| character's likeness, so we approached them," but it's
| not clear they attempted that.
| moralestapia wrote:
| They didn't approach her, they approached her agent,
            | which should've been the point of contact for either case.
|
| As to whether she owns the rights of that performance or
| somebody else, we'd have to read the contract; most
| likely she doesn't, though.
| mrbungie wrote:
| Probably anyone but her inner circle can "approach her"
| directly. I would expect any other kind of connection to
| be made through her agent.
| moralestapia wrote:
| I honestly don't think Scarlett (the person, not her
| "her" character) has anything to favor their case, aside
| from the public's sympathy.
|
| She may have something only if it turns out that the
| training set for that voice is composed of some
| recordings of her (the person, not the movie), which I
| highly doubt and is, unfortunately, extremely hard to
| prove. Even that wouldn't be much, though, as it could be
| ruled a derivative work, or something akin to any
| celebrity impersonator. Those guys can even advertise
| themselves using the actual name of the celebrities
| involved and it's allowed.
|
| Me personally, I hope she takes them to court anyway, as
| it will be an interesting trial to follow.
|
| An interesting facet is, copyright law goes to the
| substance of the copyrighted work; in this case, because
| of the peculiarities of her character in "her", she is
        | pretty much _only_ voice, I wonder if that makes things
| look different to the eyes of a judge.
| pseudalopex wrote:
| Likeness rights and copyright are different.
| moralestapia wrote:
| Fictional characters cannot have personality rights, for
| obvious reasons.
|
| That falls under copyright, trademarks, ...
| pseudalopex wrote:
| Actors who play fictional characters have personality
| rights.
| diego_sandoval wrote:
| It's also very easy to say that they were inspired by the
| concept of the movie, but the voice is different.
| llamaimperative wrote:
| Sure if they hadn't contacted her twice for permission,
| including 2 days before launch.
| lyu07282 wrote:
| I heard the voice before hearing this news and didn't
| recognize her, but it's crazy if they really cloned her
| voice without her permission. Even worse somehow since
| they did such a bad job at it.
| apantel wrote:
| They can say they had a certain thing in mind, which was to
| produce something like 'Her', and obviously Scarjo would
        | have sold it for them if she participated. But given
        | the fact that she didn't, they still went out and
| created what they had in mind, which was something LIKE
| 'Her'. That doesn't sound illegal.
| jahewson wrote:
| Not illegal, because this would be a civil case. But
| they're on thin ice because there's a big difference
| between "creating something like Her" and "creating
          | something like Scarlett Johansson's performance in Her".
| apantel wrote:
| Creating something like Her is creating something like
            | Scarlett Johansson's performance of her. The whole point
| is to hit the same note, which is a voice and an
| aesthetic and a sensibility. That's the point! She wasn't
| willing to do it. If they hit the same note without
| training on her voice, then I think that's fair game.
| a_wild_dandan wrote:
| Yeah, influential people shouldn't get to functionally
| own a "likeness." It's not a fingerprint. An actor
| shouldn't credibly worry about getting work because a
| rich/famous doppelganger exists (which may threaten
| clientele).
|
| Explicit brand reference? Bad. Circumstantial
| insinuation? Let it go.
| ocdtrekkie wrote:
| I mean, unless an investigation can find any criteria used to
| select this particular actress like "sounds like Scarlett" in
| an email somewhere, or you know, the head idiot intentionally
| and publicly posting the title of a movie starring the
| actress in relation to the soundalike's voice work.
| zone411 wrote:
| It probably is illegal in CA: https://repository.law.miami.ed
| u/cgi/viewcontent.cgi?article...
|
| "when voice is sufficient indicia of a celebrity's identity,
| the right of publicity protects against its imitation for
| commercial purposes without the celebrity's consent."
| romwell wrote:
| I'd be surprised if it was legal anywhere in the US, but
| this just puts the final nail into Sky's coffin.
| charlieyu1 wrote:
        | But why? Sounds like a violation of the rights of the voice
| actor
| whoknowsidont wrote:
| Because it's meant to give the _appearance_ or
| _perception_ that a celebrity is involved. Their actions
| demonstrate they were both highly interested and had the
| expectation that the partnership was going to work out,
| with the express purpose of using the celebrity's
| identity for their own commercial purposes.
|
| If they had just screened a bunch of voice actors and
| chosen the same one no one would care (legally or
| otherwise).
| whynotminot wrote:
| Sounds like one of those situations you'd have to prove
| intent.
|
| (and given the timeline ScarJo laid out in her Twitter
| feed, I'd be inclined to vote to convict at the present
| moment)
| sangnoir wrote:
| > Sounds like one of those situations you'd have to prove
| intent.
|
              | The discovery process may help figure out the intent -
| especially any internal communication before and after
| the two(!) failed attempts to get her sign-off, as well
| as any notes shared with the people responsible for
| casting.
| jahewson wrote:
              | Not necessarily. Because this would be a civil matter,
              | the burden of proof is a preponderance of the evidence -
              | it's glaringly obvious that this voice is emulating the
| movie Her and I suspect it wouldn't be hard to convince a
| jury.
| janalsncm wrote:
| What OpenAI did here is beyond the pale. This is open and
                | shut for me based on the actions surrounding the
| voice training.
|
| I think a lot of people are wondering about a situation
| (which clearly doesn't apply here) in which someone was
| falsely accused of impersonation based on an accidental
| similarity. I have more sympathy for that.
|
| But that's giving OpenAI far more than just the benefit
| of the doubt: there is no doubt in this case.
| sneak wrote:
| I think "beyond the pale" is a bit hyperbolic. The voice
| actor has publicity rights, too.
| charlieyu1 wrote:
| I guess the Trump lookalike satire guy would not want to
| go to California then
| actionfromafar wrote:
| Ah, so OpenAI does satire. That explains a lot.
| csomar wrote:
| I am guessing it's because you are trying to sell the
| voice as "that" actor voice. I guess if the other voice
| become popular on its own right (a celebrity) then there
| is a case to be made.
| mandmandam wrote:
| Did you read the statement? They approached Scarlett
          | _twice_, including two days before launch. Sam even said
| himself that Sky sounds like 'HER'.
|
| This isn't actually complicated _at all_. OpenAI robbed
| her likeness against her express will.
| rockemsockem wrote:
| It's almost certainly not legal exactly because of the
| surrounding context of openai trying to hire her along with
| the "her" tweet.
|
| There's not a lot of precedent around voice impersonation,
| but there is for a very, very similar case against Ford
|
| https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
| signal11 wrote:
| Amazing case law, thank you! I suspect OpenAI have just
| realised this, hence the walking back of the "Sky" voice.
| meat_machine wrote:
| I am not a lawyer, but other potentially relevant cases:
|
| https://www.quimbee.com/cases/waits-v-frito-lay-inc
|
| https://en.wikipedia.org/wiki/White_v._Samsung_Electronics_
| A....
| JeremyNT wrote:
| Whether or not what they've done is currently technically
| illegal, they're priming the public to be pissed off at them.
| Making enemies of beloved figures from the broader culture is
| likely to _not_ make OpenAI many friends.
|
| OpenAI has gone the "it's easier to ask forgiveness than
| permission" route, and it seemed like they might get away
| with that, but if this results in a lot more stories like
| this they'll risk running afoul of public opinion and future
| legislation turning sharply against them.
| klyrs wrote:
| Problem is, they asked for permission, showing their hand
| multiple times. Can I say, I'm somewhat relieved to learn
| that Sam Altman isn't an evil genius.
| OrangeMusic wrote:
| I honestly didn't think it sounded like Johansson. Because of
    | the controversy I just now re-listened to the demos and I still
    | find it very unlikely that someone would think it was her.
| dilap wrote:
| > 4. Naughtiness
|
| > Though the most successful founders are usually good people,
| they tend to have a piratical gleam in their eye. They're not
| Goody Two-Shoes type good. Morally, they care about getting the
| big questions right, but not about observing proprieties. That's
| why I'd use the word naughty rather than evil. They delight in
| breaking rules, but not rules that matter. This quality may be
| redundant though; it may be implied by imagination.
|
| > Sam Altman of Loopt is one of the most successful alumni, so we
| asked him what question we could put on the Y Combinator
| application that would help us discover more people like him. He
| said to ask about a time when they'd hacked something to their
| advantage--hacked in the sense of beating the system, not
| breaking into computers. It has become one of the questions we
| pay most attention to when judging applications.
|
| "What We Look for in Founders", PG
|
| https://paulgraham.com/founders.html
|
| I think the more powerful you become, the less endearing this
| trait is.
| shombaboor wrote:
    | it seems most of the big companies try to break the rules and
    | in the process become so strong that the penalty turns into a
    | marginal fine, just a cost of doing business. Facebook and
    | Uber come to mind first. This may just be the same.
| astrange wrote:
| Everyone let Uber get away with breaking taxi rules because
| those rules were only good for the people with the taxi
| medallion monopoly.
|
| (Which wasn't even the taxi drivers, although they were
| plenty bad enough on their own.)
| aeurielesn wrote:
    | This quote actually disgusts me. I don't think this is a
    | quality to encourage, especially since, despite the tone, it
    | reads more as abuse.
| laborcontract wrote:
| You are on "Hacker News", surely it's not that much of a
| surprise?
| christina97 wrote:
| It's a different type of hacking. This was about hacking a
| system for one's advantage, not the computer type of
| hacking.
| laborcontract wrote:
| I always interpreted the hacker in HN as a spirit,
| irrespective of vocation.
| jprete wrote:
| What is endearing and admirable as the underdog very
| easily becomes contemptible abuse in the biggest dog of
| the pack. It's not a contradiction - the hacking spirit
| is a trait that causes more damage the more power you
| have.
| abvdasker wrote:
              | The persistent belief that the tech industry is some kind
              | of underdog, I think, explains much of the recent
| deplorable behavior by some of the richest people on the
| planet. A bunch of unbelievably wealthy nerds are
| mentally trapped in the past and too out of touch to
| realize they have become the bullies.
| whamlastxmas wrote:
| I think governance is overly restrictive and stifles
| innovation. For example I love that Uber and Airbnb exist
| even though they both sort of skirt or acceptably break rules
| in place that a complete rule follower wouldn't have
| violated.
|
| Taking taxis 15 years ago was an absolute scammy shitty
| experience and it's only marginally better now thanks to an
| actual competitive marketplace
| dylan604 wrote:
| It's actually the most rational thing I've heard quoted from
| him. You have to be willing to open the box to see what's
| inside to know if you can do it
| better/cheaper/faster/smaller. There's ways of doing that
      | without breaking laws, or doing something unethical with
      | what you learn. There's also ways of doing it without
      | destroying something or violating anyone/anything. It also
      | allows you to hear their response to see where a person is
      | in that rationale. Do they toe the line, do they run right
      | across it, do they bend but not break, do they scorch the
      | earth with everything they touch?
| talldayo wrote:
| Makes me glad there aren't people betting on how far I'm
| willing to go to bend the law. When you lay it all out like
| that you make PG sound like a cockfighter paying to get his
| champion bloodied.
| croes wrote:
| Those are the same people who break the laws and exploit
| people if they know they can't fight back.
|
| Usually those people are considered sociopaths.
|
| Maybe it's time to ask the employees of OpenAI who fought
          | to get Altman back how this behavior is compatible with
| their moral standards or whether money is the most
| important thing.
| dylan604 wrote:
| Is that something that needs to be asked? I thought it
| was pretty evident when the coup was happening.
| blibble wrote:
      | personally I think it sums up Silicon Valley perfectly
| themagician wrote:
| Sure, a bit of rebellion can fuel innovation in founders, but
| as they gain power, it's important to keep things ethical. What
| seems charming at the startup phase might raise eyebrows as the
| company expands.
| idontknowtech wrote:
| > They delight in breaking rules, but not rules that matter.
|
| To them*
|
| Which is the whole problem. These narcissistic egotists think
| they, alone, individually, are capable of deciding what's best
| not just for their companies but for humanity writ large.
| rachofsunshine wrote:
| The problem is this line:
|
| > They delight in breaking rules, but not rules that matter.
|
| The question becomes "what rules matter?". And the answer
| inevitably becomes "only the ones that work in my favor and/or
| that I agree with".
|
| I think someone trying to defend this would go "oh come on,
| does it really matter if a rich actress gets slightly richer?"
| And no, honestly, it doesn't matter that much. Not to me,
| anyway. But it matters that it establishes (or rather, confirms
| and reinforces) a culture of disregard and makes it about what
| _you_ think matters, and not about what someone else might
| think matters about the things in their life. Their life
    | belongs _to them_, a fact that utopians have forgotten again
| and again everywhere and everywhen. And once all judgement is
| up to you, if you're a sufficiently ambitious and motivated
| reasoner (and the kind of person we're talking about here is),
| you can justify pretty much whatever you want without that
| pesky real-world check of a person going "um actually no I
| don't want you to do that".
|
| Sometimes I think anti-tech takes get this wrong. They see the
| problem as breaking the rules at all, as disrupting the status
| quo at all, as taking any action that might reasonably be
| foreseen to cause harm. But you do really have to do that if
| you want to make something good sometimes. You can't foresee
| every consequence of your actions - I doubt, for example, that
| Airbnb's founders were thinking about issues with housing
| policy when they started their company. But what differentiates
| behavior like this from risk-taking is that the harm here is
| deliberate and considered. Mistakes happen, but this was not a
| mistake. It was a choice to say "this is mine now".
|
| That isn't a high bar to clear. And I think we can demand that
| tech leaders clear it without stifling the innovation that is
| tech at its best.
| ixaxaar wrote:
| So a kind of lack of empathy? Do these guys have this image of
| "autists" and are basically filtering for them, cause this
    | criterion seems to be favoring oppositional defiant disorder.
|
    | I mention this specifically because I remember Marc Andreessen
    | commenting something similar on Lex Fridman's podcast, something
| along the lines of getting "those creative people" together to
| build on ai.
| deletedie wrote:
| The more accurate (though somewhat academic) term for this
| trait is 'narcissism'.
| mxstbr wrote:
| There is no source; black text on a white background. How do we
| know this is real?
| llamaimperative wrote:
| It was posted by the tech reporter at NPR. Inb4 "journos can't
| be trusted" blah blah blah, here in reality NPR is a reputable
| org and a reasonable person's Bayesian priors would put this at
| "almost certainly an actual statement from ScarJo."
| shombaboor wrote:
| This reporter appears to have confirmed it from a direct source
| https://x.com/yashar/status/1792682664845254683?t=EwNPiMPwRe...
| Animats wrote:
    | _Variety_ has a story.[1] It doesn't yet mention a direct
    | statement from Johansson. But watch that space. _Variety_ is
| well connected in Hollywood and will check with her agent to
| confirm or deny.
|
| [1] https://variety.com/2024/digital/news/openai-pulls-
| scarlett-...
| Animats wrote:
| Variety article updated: [UPDATE: Johansson released a
| statement saying Altman had reached out to ask her to lend
| her voice to ChatGPT but she declined; when she heard the
| demo, "I was shocked, angered and in disbelief that Mr.
| Altman would pursue a voice that sounded so eerily similar to
| mine."]
| timdorr wrote:
| Scarlett Johansson doesn't have social media accounts:
| https://nypost.com/2023/04/04/why-scarlett-johansson-is-not-...
|
| Stuff from her comes via press agents, which is generally sent
| directly to reporters.
| esafak wrote:
| https://www.nbcnews.com/tech/tech-news/scarlett-johansson-sh...
| notamy wrote:
| http://archive.ph/cr759
|
| https://nitter.poast.org/BobbyAllyn/status/17926794357010149...
| endisneigh wrote:
| Sadly not much will come of this. Even if they're fined, so what?
| MrSkelter wrote:
    | You have no idea. They will settle or be forced to admit they
    | used a movie studio's IP without payment to clone a voice model.
| They will cut a check for tens of millions and maybe stock as
| well. They will run from this as is clearly obvious from the
| immediate takedown. They are in crisis mode.
| astrange wrote:
| Who says creating a voice model from a movie would require
| you to pay anyone?
| rangerelf wrote:
| I'd say it's a given, every detail in a movie is an
| artistic expression of some kind.
| fullshark wrote:
| I don't agree, if Johansson is serious about wanting a legal
| precedent set and doesn't care about the money (tbd) then a
    | hypothetical lawsuit could go to the US Supreme Court and lead
| to a decision that has significant ramifications.
| bigiain wrote:
| "US Supreme court"
|
| Yeah. _That_ well known completely rational and
| unquestionably incorruptible institution.
|
| I would bet Altman had been to more teenage sex parties and
      | paid for more holidays with SC judges than Scarlett has...
| :sigh:
| polynomial wrote:
| Exactly, cost of doing business.
| Imnimo wrote:
| Many of OpenAI's productization ideas make more sense when you
  | remember that the guy in charge also thought Worldcoin and its
  | eye-scanning orb were a good idea.
| ozten wrote:
| Nooo. I've been enjoying that voice for a few months on my iPhone
| ChatGPT app. Launched... and tested... the voice is someone else
| now.
| keepamovin wrote:
| I think it's interesting that Johansson chose to forgo
  | substantial royalties and collaboration potential.
|
| But it must feel pretty fucking weird and violatory when you
| spend your entire life thinking about how you are going to
| deliver certain lines and that's your creative Body of work, and
| then for someone to just take that Voice and apply it to any
| random text that can be generated?
|
| I get why she wouldn't want to let it go.
|
| In a way it is similar to how a developer might feel about their
| code being absorbed, generalized, and then regurgitated almost
| verbatim as part of some AI responses
|
| But in the case of voice it's even worse as the personality
| impression is contained in the slightest utterance... whereas a
  | style of coding or a piece of code might be less recognizable,
| and generally applicable to such a wide range of productions
|
  | Voice is the original human technology. To try to take that from
  | someone without their consent is a pretty all-encompassing grab.
| steveBK123 wrote:
    | She chose to forego being a billion lonely guys' AI girlfriend.
|
| Not a bad call for someone already rich
| keepamovin wrote:
      | > She chose to forego being a billion lonely guys' AI
| girlfriend
|
| To suggest that Johansson's only appeal is to the opposite
| gender (and 'lonely' ones at that!) I think is myopic and
| reductive of her impact
| steveBK123 wrote:
| That is not her only appeal, she is a renowned actress.
|
| However Sam tweeted "her" which is literally the movie
| where she voices the AI girlfriend. And then made a
| synthetic replica of her voice the star of their new demo
| against her wishes.
|
| It's pretty direct what he is pitching at.
| keepamovin wrote:
| Fair enough. But I think your comment was reductive...
|
| but it doesn't matter, because how he may have marketed
| it in a 140 character tweet does not encompass the
| entirety of how it could be used, of course
| skywhopper wrote:
| That was not an exclusive statement. It can be a billion
| lonely guys, AND lonely women, AND gregarious enbies, AND
| everyone else.
| steveBK123 wrote:
| Indeed, they can find the male voice equivalent. Though
| to be fair I think men are MUCH more susceptible to this
| than women.
|
| That said, Krazam covered this topic well already
| https://www.youtube.com/watch?v=KiPQdVC5RHU
| chemmail wrote:
| More like delaying the inevitable. We have ai voice
| generation so good now, they use it to stage ransom calls and
| parents cannot tell the difference.
| logrot wrote:
| I think you're pretty naive.
| keepamovin wrote:
| > I think you're pretty naive.
|
| I don't think it's about me, but since you brought it up, I
| try to maintain my innocence in this world. I try to be
| biased towards that rather than cynicism, at least... I think
| that's important. Cynicism is a kind of arrogance: where you
| think you've seen it all before... but you're wrong.
|
      | But thank you for the opportunity your comment provides to
      | speak to that: I do appreciate the chance to add clarity.
| thelastCube wrote:
| I think you are a class act.
| tootie wrote:
| ScarJo probably has enough money for ten lifetimes already. The
| potential downside of signing your identity away to a SV
| plutocrat with questionable morality and world-changing
| technology is enormous. And it seems like she was instantly
| vindicated in not trusting him.
| suddenexample wrote:
| Were royalties in the picture? Don't think it's crazy to think
| that a company obsessed with replacing artists wouldn't value
| one enough to pay royalties.
|
| And in terms of collaboration potential... OpenAI is a big draw
| for businesses and a subset of tech enthusiasts, but I don't
| think artists in any industry are dying to collaborate with
| them.
| cdrini wrote:
| According to Open AI's post about the Sky voice controversy:
|
| "Each actor receives compensation above top-of-market rates,
| and this will continue for as long as their voices are used
| in our products."
|
| https://openai.com/index/how-the-voices-for-chatgpt-were-
| cho...
|
| Not sure if this is royalties, but it seems like there's some
| form of long term compensation. But it's a little vague so
| not sure.
| whamlastxmas wrote:
| SJ is a major character in like 5 of the top grossing movies of
| all time. The royalties from her voice being used by ChatGPT
| would be meaningless to her
| fxd123 wrote:
| > chose to forgo substantial royalties and collaboration
| potential
|
| We know nothing about their offer to her. Could have just been
| a bad deal
| UberFly wrote:
| OpenAI has successfully stolen the intellectual property of
| millions of people to incorporate into their product, so why
  | would they fear stealing someone's voice at this point? I hope she
| wins. Maybe it'll set some kind of precedent.
| whamlastxmas wrote:
| Calling it stealing is a stretch. They have, at worst,
| infringed on copyright and terms of service.
| steveBK123 wrote:
| Incredibly stupid
|
| The wink wink at creating an AI girlfriend is so bizarre
|
| I guess we know who their target user base is
| talldayo wrote:
| Worse than that, good luck positioning yourself as a paragon of
| "AI safety" when you can't even handle basic human business
| relationships honestly.
| crazygringo wrote:
| Seriously. This is utterly baffling to me.
|
| OpenAI is trying to demonstrate how it's so trustworthy, and
| is always talking about how _important_ it is to be
| trustworthy when it comes to something as important and
| potentially dangerous as AI.
|
| And then they do something like this...??
|
| I literally don't understand how they could be this dumb. Do
| they not have a lawyer? Or do they not tell their corporate
| counsel what they're up to? Or just ignore the counsel when
| they do?
| steveBK123 wrote:
        | Also retired the entire safety team in the same week too,
| lol.
| bigiain wrote:
          | I wonder how much their anti-disparagement clauses are
| about covering up how this went down internally?
| prawn wrote:
| Especially considering that the advantage gained by having
| an AI sound like this one individual is absolutely minimal.
| It's not as though any significant portion of a target
| market is going to throw a tantrum, saying "No, no, I
| refuse to accept this simulated companionship unless it
| sounds exactly like the voice in that one particular movie
| several years ago." Baffling that the company didn't
| recognise the risk here and retire that voice as soon as
| they were turned down the first time.
| imperialdrive wrote:
| Absolutely. Good for Scarlett, and my gosh Sam and that org
| need to learn a few lessons. What were they thinking?? So
| gross.
| steveBK123 wrote:
| Will be funny if Sam was the bad guy all along
| eschaton wrote:
| Uh, we all should know exactly who and what Sam Altman is
| by now.
|
| He's absolutely been the bad guy all along.
| DavidPiper wrote:
| "Duh." [1] ;-)
|
| [1] https://www.youtube.com/watch?v=0AjqljwVusk
| option wrote:
    | tell me, who is their target user base?
| __loam wrote:
| Lonely losers who think computers are magic.
| mrieck wrote:
| I can't believe this demo hasn't been deleted yet:
|
| https://twitter.com/OpenAI/status/1790089521985466587
|
| Giggly, flirty AI voice demos were already weird, but now it's
| even creepier knowing the backstory of how they try to get
| their voices.
| steveBK123 wrote:
| This demo and the one I linked just seem so open about the AI
| GF use case its bizarre.
|
| If you actually wanted a voice assistant AI, having a giggly,
| chatty computer acting like it has a huge crush on you is not
| remotely useful in day to day real world use. Unless that's
| exactly what you want.
| brcmthrowaway wrote:
| This reads like a PR stunt. Why did they clone the voice from
| Her?
| whamlastxmas wrote:
| SJ doesn't have a monopoly on the sultry giggly flirty American
| female voice. There are a million women who could imitate this
| pretty closely
| dangoodmanUT wrote:
| Alright who left the dwight schrute cloner on overnight in the
| comments
| blibble wrote:
| consent appears to be optional for everything OpenAI does
| ml-anon wrote:
| Those are the rumors about Sam...
| LeoPanthera wrote:
| Non-X sources:
| https://news.google.com/stories/CAAqNggKIjBDQklTSGpvSmMzUnZj...
| akr4s1a wrote:
| I can't fathom a decision as bad as asking someone for
| permission to use their voice and then doing it anyway after
| they say no. Especially when the NYT is currently suing them for
| unauthorized use; they really should not be making such an
| amateur mistake.
| MBCook wrote:
| I really hope she sues the company to hell and back.
|
| She has the resources to fight back and make an example of them,
| and they have the resources to make it worthwhile.
| ml-anon wrote:
| Scarlett Johansson made Disney cave. She's going to absolutely
| destroy this band of grifters.
| MBCook wrote:
| Not only that, I don't think Disney wants this precedent.
| They may also want to get back on her good side. Either way
| given they own a number of movies with her in starring roles
| I wouldn't be surprised if they were happy to help in her
| lawsuit with the legal fees.
|
| Hell they may sue on their own or join as another damaged
| party.
| ml-anon wrote:
| I'm sure they are salivating at the thought of discovery
| forcing OAI to crack open their datasets so they can put a
| dollar amount to every piece of infringing material in
| there.
| threatofrain wrote:
| I'm not sure she can do much since OpenAI withdrew so quickly.
| What damages are there?
| callalex wrote:
| The way USA courts are set up, setting precedent and
| assessing damages are two distinct things. I agree that the
| precedent she would be targeting wouldn't be all that
| financially rewarding but that's not the only thing that
| motivates humans.
| system2 wrote:
| Sky voice is still there.
| DevX101 wrote:
| OpenAI's demo was a massive marketing campaign for GPT-4o
| and led to the largest increases in revenue for their mobile
| app. The voice was a large part of why this release was a
| hit. The demo is still on youtube with 4M views. She has a
| great case for financial remuneration even if they haven't
| yet launched the voice feature.
| MBCook wrote:
| Withdrew too quickly?
|
| They didn't come to an agreement to use her voice, they used
| her voice. And they obviously knew it was a problem because
| they went back to her like the night before trying to get her
| approval again.
|
| The correct thing to do was NOT use her voice.
|
| You don't get to steal something, get all the benefit from it
| (the press coverage), and then say "oops never mind it was
| just a few hours you can't sue us".
|
| Why don't we try selling tickets to watch a Disney movie
| "just one time" and see how well that goes. I don't think
| Disney's lawyers will look at it and say "oh well they
| decided not to do it again."
| 8note wrote:
| They make no representation that it is her voice, and
| there's a really good chance that they separately made a
| voice that sounds similar enough that, if they could
| tack her name to it, it'd be good for advertising, but
| otherwise isn't her voice.
|
| Voices are really hard for people to distinguish as being a
| certain person without priming, so really she's doing for
| free the advertising they were hoping she'd do for pay
| thih9 wrote:
| > there's a really good chance that they separately made
| a voice that sounds similar enough that, if they could
| tack her name to it, it'd be good for advertising, but
| otherwise isn't her voice.
|
| There's also a really good chance this is in some way a
| deepfake. Would be interesting to see this get examined
| by courts.
| MBCook wrote:
| I don't know if it's actually her voice. At this point I
| wouldn't put it past them.
|
| But if they concocted a fake voice to sound as much like
| her as possible, that's not really better.
|
| Altman's tweet, combined with his previous statements that
| Her is his favorite movie, and the attempts to secure rights
| twice, looks really, really damning.
|
| > so really she's doing for free the advertising they
| were hoping she'd do for pay
|
| They didn't want her to advertise for them. They wanted
| to use her voice. Do you not see a difference?
| option wrote:
| Not defending OAI here, but why do _you_ seem to hate them? Did
| ChatGPT make your life better or worse?
| mepiethree wrote:
| I think for most people the answer is "worse".
| wilg wrote:
| lol
| DavidPiper wrote:
| Not OP, or the replier, but I think the longer answer
| here is that if Scarlett Johansson <insert any wealthy
| and/or popular figure> can't win a lawsuit against a
| company that has effectively:
|
| (1) Used content she has created (vocal lines) in order
| to train a generative AI with the ability to create more
| of that content (vocal lines), without permission or
| payment, let alone acknowledgement
|
| (2) In doing so removed the scarcity of the content she
| provides (Generative AI is effectively unlimited in the
| lines it can produce once effectively trained)
|
| Then no smaller time actor, voice actor, artist,
| musician, etc, is likely to have any chance defending
| themselves against the theft of their work for AI
| purposes.
|
| And, with that precedent set, the legal and financial
| landscape of art and creativity will have changed in a
| way that discourages anybody from creating original works
| for financial gain, because we've systematised the
| creation of original and derivative works with no legal
| or financial ramifications.
| wilg wrote:
| Maybe, but I don't think it's likely that they used any
| actual copyrighted material of her voice. It's not
| illegal to want one voice actor to play a role (or record
| voice data), then get a different, somewhat (but not
| actually very) similar sounding voice actor to do it
| instead.
| rockemsockem wrote:
| Curious, how?
| lotsoweiners wrote:
| Sam Altman has a smarmy, borderline vomit inducing face. The
| fact that I have to see his face every day while scrolling
| through the interwebs feeds has made my life worse.
| mycologos wrote:
| See, I don't like Sam Altman either, but this habit of
| criticizing people for their _faces_ seems wrong-headed.
| Isn't it his behavior that we should be criticizing?
| sooheon wrote:
| I could be wrong but in these cases (or when people
| criticize voices, similarly), people are more put off by
| expressions than raw features. Nonverbal communication is
| high emotional bandwidth. I don't begrudge someone
| disliking someone else's self presentation.
| z7 wrote:
| Very obvious bias against OpenAI in the comments here.
| Possible motives: a) there's a visceral human reaction
| against anyone extraordinarily successful and powerful
| (Nietzsche wrote about this). b) resentment for OpenAI's
| advancements in code generation and its possible impact on
| the job market. I don't think much of the outrage here is
| motivated by altruism; it's probably more about siding with
| whoever opposes your perceived enemy.
| Ar-Curunir wrote:
| Or maybe the prospect of theft and unpermissioned copying
| of an extremely personal quality (one's voice) is
| (correctly) being called out?
|
| People aren't as selfish/petty as you're making them out to be.
| 8note wrote:
| There are any number of people with equivalent voices whose
| voices Scarlett has "stolen" by selling hers to movies,
| and all of those other people who sound the same deserve
| compensation from her for it.
| eschaton wrote:
| Most people aren't that selfish or petty. But there are
| some people who are--and by an unfortunate quirk of human
| nature, they tend to believe everyone else is just like
| them.
| z7 wrote:
| So your response is ... "No I'm not, you are."
|
| Well. This reaction invites further psychological
| interpretations, lol.
| eschaton wrote:
| Uh, what? This is a pretty well-researched phenomenon: A
| very high percentage of people who engage in pathological
| behaviors that take advantage of other people believe
| that "everyone does it" and rely on that as
| justification. Penn & Teller did a pithy bit on the
| topic.
| z7 wrote:
| Can you provide the evidence indicating that her voice
| was stolen? Maybe also an audio analysis of the vocal
| characteristics involved and the degree of similarity?
| And wouldn't it be prudent to investigate these matters
| before making such accusations?
| maxbond wrote:
| I propose the following razor:
|
| Never attribute to jealousy that which can be explained by
| a difference of perspective.
|
| Jealousy can explain any criticism of anybody. That makes
| it an epistemic hazard; you can always fall back to it, and
| you can develop a habit of doing so. And then you have
| deafened yourself. Any substantive criticisms will be lost
| on you. As will any opportunity to learn from those you
| disagree with. Your ideas will be hot house flowers,
| comfortable and safe in their controlled environment, but
| unable to contend in the wild. Aspire to weeds.
| z7 wrote:
| That would be tautological, as 'difference of
| perspective' is a restatement of the phenomenon at a
| higher level of abstraction. Someone who is resentful
| will likely have a different perspective than they would
| otherwise have. Psychological motives are real and don't
| disappear by simply assuming a variant of a problem-
| solving heuristic. Notice also I didn't use the word
| 'jealousy.' I do think that speculations about
| psychological motives should be made only after careful
| consideration and generally remain unprovable. However,
| the uncertainty of psychological motives shouldn't
| prevent us from ever addressing them.
| 101008 wrote:
| It could make life better for a lot of people, but they're
| training their models on copyrighted material, which makes
| life worse for another set of people.
| MBCook wrote:
| Right. If what they're doing is so incredibly valuable,
| just pay for the resources you're using.
|
| If your business model only works if you steal stuff from
| other people you don't have a business.
| rockemsockem wrote:
| Not commenting on open AI specifically, but I feel like
| actually acquiring the data is borderline impossible if
| you only follow legal channels.
|
| Where can I insert my money to buy every Penguin Random
| House book? How do I pay every DeviantArt artist
| whatever amount they're owed?
| MBCook wrote:
| You could go to Penguin directly. Or perhaps the Authors
| Guild would help you set something up across publishers.
|
| All of DeviantArt? Probably not. But do you need to
| train on all of that? You could certainly run ads telling
| people you'd be willing to pay a small amount to train on
| their art and let them choose.
|
| Would it be legal to train against the national archives?
|
| Options exist. No, you won't get as much stuff as you do
| by taking whatever you want, but people would be
| compensated for their work, or at least given the
| choice to opt in.
| tavavex wrote:
| Ignoring the actual legality of using training data in
| machine learning, let's look at these "options" in a
| purely objective-driven way. If you do that, you'll
| quickly realize that what you describe would strengthen
| megacorporations to an even greater extent. If training
| on public data is banned, then the entities who have
| complete ownership of all their data would be granted a
| de facto monopoly. Stock image services, media
| conglomerates, music labels, industry giants would start
| in the winning position. When one side just gets their
| way for free and the other has to beg for scraps, do you
| really think this is a fair proposition?
|
| The reason why open-source AI exists at all is that we've
| always been allowing use of public data - it was okay
| when Google did it, it was okay when the Internet Archive
| did it, it was even okay when text translation services
| used that same data to train their models - or really,
| that applies to basically anything ML-driven before
| generative AI.
|
| There's, like, a sea of reasons to criticize OpenAI for -
| but arguing for extending IP laws even further and
| calling out opponents for "literal theft" is one of the
| weaker options that caught on with many people.
| qbxk wrote:
| You're not wrong, and I think the technology to accomplish
| something like this exists, and has for a few years. But the
| end result would be a system that funnels money from large
| numbers of people to different large numbers of people, with
| a lot of overlap and cross payments too. Maybe there's a good
| business in transaction fees on that? But it seems like big
| numbers going to other people, while a much smaller number
| goes to the party owning the system, is not an appealing
| business in our world.
|
| Which is probably why it doesn't exist while Spotify buys
| _podcasts_ and kills them.
| throwaway115 wrote:
| Sounds like you're using the music industry's definition
| of "steal." Nothing was stolen, because nobody was
| deprived of the thing.
| visarga wrote:
| In this case it just sounds like her; it's not even a direct
| copy.
| visarga wrote:
| Creatives will use AI even more than regular people, and
| with better results. What I find dangerous is protecting
| all paraphrases and variations as well, which is akin to
| owning an idea. When not replicating expression, ideas
| should be free, even in copyrighted works.
| wraptile wrote:
| I think the majority agrees that we're OK with a few
| privileged millionaire actors losing a small amount of their
| value in favor of personal AI assistants for the general
| populace. How this is even a topic worth discussing is
| truly perplexing.
| krainboltgreene wrote:
| No man, a majority do not believe that and it's really
| weird that you think they do.
| wraptile wrote:
| Why? The data clearly shows that nobody will go out of
| their way to protect IP laws. In most of the world
| nobody cares about some Hollywood actor with a
| tough-to-spell name getting their voice cloned.
|
| It's really weird that someone would think that IP law is
| more important than access to information. Must be some
| dystopian bubble.
| MBCook wrote:
| They stole someone's voice after being told not to?
|
| They unleashed a massive new wave of spam and scams and
| garbage onto the Internet?
|
| Because they're stealing every single bit of content on the
| Internet and everywhere else they can get their hands on
| without paying anything for it and then expect to sell it
| back to us in chewed up garbage form?
|
| They helped single-handedly cause MS to blast past their carbon
| commitments by 30%? And got everyone else into a big race for
| how many resources they could waste to power AI nonsense that
| doesn't even actually work that well for what people want to
| use it for?
|
| Perhaps the fact that they don't seem to care one bit about
| any of the consequences, legal/moral/ethical/economical/etc.
| caused by what they've done as long as they make money?
|
| I don't have a grudge against open AI. I have a grudge
| against the AI industry and the way these kind of SV golden
| boys seem to think they're immune from criticism.
|
| Why do you think stealing a professional actor's voice should
| be OK and immune from criticism? This is horrible.
| eschaton wrote:
| Amen. We need ethics and accountability and this wave of
| "AI" has been sorely lacking those, instead preferring the
| Uber model of "let's just get big enough to bully things
| into working out in our favor."
|
| It's important to nip that shit in the bud lest it spread.
| wraptile wrote:
| > They stole someone's voice after being told not to?
|
| Stole someone's voice? That's not stealing; let's not pump
| more power into copyright propaganda even if this particular
| case has merit.
| sashank_1509 wrote:
| This is hilarious. OpenAI didn't even need to press for this
| voice, their technical demo was impressive enough, but they did
| and now it'll cast a shadow over a pretty impressive AI
| advancement. In the long term though, this won't matter.
| anon373839 wrote:
| Well, that statement lays out a damning timeline:
|
| - OpenAI approached Scarlett last fall, and she refused.
|
| - Two days before the GPT-4o launch, they contacted her agent and
| asked that she reconsider. (Two days! This means they already had
| everything they needed to ship the product with Scarlett's cloned
| voice.)
|
| - Not receiving a response, OpenAI demos the product anyway, with
| Sam tweeting "her" in reference to Scarlett's film.
|
| - When Scarlett's counsel asked for an explanation of how the
| "Sky" voice was created, OpenAI yanked the voice from their
| product line.
|
| Perhaps Sam's next tweet should read "red-handed".
| MrMetlHed wrote:
| Would love to see this get far enough for discovery to see how
| that all played out behind the scenes.
| hehdhdjehehegwv wrote:
| They'll settle as soon as they figure that out. Idiot tax.
| ml-anon wrote:
| She has no incentive to settle and actually could win big
| by being the figurehead of the creative industry against
| AI. It's understandable why she accepted a settlement from
| Disney, but there's no reason why she should settle with a
| random startup that has no other influence on her
| employability in Hollywood.
| MrMetlHed wrote:
| And plenty of her peers have been fighting against AI
| content harvesting in their recent contract
| negotiations[1].
|
| 1. https://apnews.com/article/hollywood-ai-strike-wga-
| artificia...
| hehdhdjehehegwv wrote:
| OpenAI will settle, not sure how you read that in
| reverse.
| eurleif wrote:
| Settling isn't unilateral. OpenAI can offer to settle,
| but if she doesn't accept, there will be no settlement.
| hehdhdjehehegwv wrote:
| I also said "they" instead of "her", I'm confused as to
| why anybody misinterpreted what I said.
| ml-anon wrote:
| Everyone knows what you meant. But it's not up to them to
| "settle". If she brings forward a formal complaint they
| can offer to but she has no obligation or incentive to
| accept.
|
| This may turn out to be something they can't just buy
| their way out of with no other consequences.
| nickthegreek wrote:
| This statement from Scarlett really changed my perspective. I
| used and loved the Sky voice and I did feel it sounded a
| little like her, but moreover it was the best of their voice
| offerings. I was mad when they removed it. But now I'm mad it
| was ever there to begin with. This timeline makes it clear that
| this wasn't a coincidence and maybe not even a hiring of an
| impressionist (which is where things get a little more wishy
| washy for me).
| ekam wrote:
| Same here and that voice really was the only good one. I
| don't know why they don't bring the voices from their API
| over, which are all much better, like Nova or Shimmer
| (https://platform.openai.com/docs/guides/text-to-speech)
| sanxiyn wrote:
| I think because it is not text-to-speech. It probably isn't
| simple to transfer.
| crimsoneer wrote:
| But it's clearly _not_ her voice, right? The version that's
| been on the app for a year just isn't. Like, it's clearly
| intended to be slightly reminiscent of her, but it's also
| very clearly not. Are we seriously saying we can't make
| voices that are similar to celebrities, when not using their
| actual voice?
| gedy wrote:
| Normally I'd agree if this were some vague "artist style",
| but this was clearly an attempt to duplicate a living
| person, a media celebrity no less.
| threatofrain wrote:
| Is this different from the various videos of the Harry
| Potter actors doing comedic high fashion ads? Because
| those were very well received.
|
| https://www.youtube.com/watch?v=ipuqLy87-3A
| BadHumans wrote:
| Is a billion dollar AI company utilizing someone's voice
| against their will in a flagship product after they said
| no twice different from a random Youtube channel making
| comedy videos?
|
| I think so but that could just be me.
| nicklecompte wrote:
| I think anti-deepfake legislation needs to consider fair
| use, especially when it comes to parody or other
| commentary on public figures. OpenAI's actions do not
| qualify as fair use.
| throwway120385 wrote:
| The problem with that idea is that I can hide behind it
| while making videos of famous politicians doing really
| morally questionable things and distributing them on
| YouTube. The reason Fair Use works with regular parodies
| in my opinion is that everyone can tell that it is
| obviously fake. For example, Saturday Night Live
| routinely makes joking parody videos of elected officials
| doing things we think might be consistent with their
| character. And in those cases it's obvious that it's
| being portrayed by an actor and therefore a parody. If
| you use someone's likeness directly I think that it must
| never be fair use or we will quickly end up in a world
| where no video can be trusted.
| cjbgkagh wrote:
| I'm guessing you're referring to people still thinking
| Sarah Palin said she could see Russia from her house;
| that was from an SNL skit and an amazing impression by
| Tina Fey. I agree, people have a hard time separating
| reality from obvious parody, so how could we expect them to
| make the distinction with intentional imitation? Society
| must draw a clear line that it is not OK to do this.
| jacobolus wrote:
| One is a company with a nearly $100 billion valuation
| using someone's likeness for their own commercial
| purposes in a large-scale consumer product, which
| consumers would plausibly interpret as a paid
| endorsement, while the other seems to be an amateur
| hobbyist nobody has ever heard of making a parody demo as
| an art project, in a way that makes it clear that the
| original actors had nothing to do with it. The context
| seems pretty wildly different to me.
|
| I'm guessing if any of the Harry Potter actors threatened
| the hobbyist with legal action the video would likely
| come down, though I doubt they would bother even if they
| didn't care for the video.
| jprete wrote:
| Those are parodies and not meant at any point for you to
| believe the actual Harry Potter actors were involved.
| bottled_poe wrote:
| There's a big difference between a one off replica and
| person-as-a-service.
| tsimionescu wrote:
| That has a much better chance of falling under fair use
| (parody, non-commercial) if the actors ever tried to sue.
|
| There is a major difference between parodying someone by
| imitating them while clearly and almost explicitly being
| an imitation; and deceptively imitating someone to
| suggest they are associated with your product in a
| serious manner.
| __loam wrote:
| Why do you have an issue with them taking someone's
| likeness to use in their product but not with them taking
| someone's work to use in their product?
| gedy wrote:
| Because this isn't training an audio model along with a
| million other voices to understand English, etc. It's
| clearly meant to sound exactly like that one celebrity.
|
| I suspect a video avatar service that looked exactly like
| her would fall outside fair use as well. Though an image
| gen that used some images of her (and many others) to
| train and spit out generic "attractive blonde woman" is
| fair use in my opinion.
| numpad0 wrote:
| Chances are it is. It's basically the same as a LoRA. One of
| the go-to tools for this literally uses a diffusion model
| and works on spectrograms as images.
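| As a rough illustration of the "spectrograms as images" point
| above, here is a minimal sketch, assuming librosa and numpy
| are installed; the file path is hypothetical and this is not
| any specific tool, just the general idea of turning a voice
| clip into a mel spectrogram that an image model or LoRA
| fine-tune could treat like a grayscale image:
|
|     import numpy as np
|     import librosa
|
|     # Load a short voice sample (hypothetical path), mono, 22.05 kHz.
|     y, sr = librosa.load("voice_sample.wav", sr=22050, mono=True)
|
|     # Mel spectrogram: a 2D array of shape (n_mels, time_frames).
|     mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128,
|                                          hop_length=256)
|
|     # Convert power to decibels and scale to [0, 1] so it can be
|     # saved and consumed exactly like a grayscale image.
|     mel_db = librosa.power_to_db(mel, ref=np.max)
|     rng = mel_db.max() - mel_db.min() + 1e-8
|     mel_img = (mel_db - mel_db.min()) / rng
|
|     print(mel_img.shape)  # e.g. (128, T): image-shaped data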
| __loam wrote:
| Okay so as long as we steal enough stuff then it's legal.
| citizenpaul wrote:
| An actress that specifically played the voice of AI in a
| movie about AI no less.
| bigfishrunning wrote:
| It could be trained on Scarlett's voice though, there's
| plenty of recorded samples for OpenAI to use. It's pretty
| damning for them to take down the voice right away like
| that
| brandall10 wrote:
| Her statement claims the voice was taken down at her
| attorney's insistence.
| bobthepanda wrote:
| This is correct. In fact, the FCC has already clarified this
| for the case of robocalls.
| https://www.fcc.gov/document/fcc-makes-ai-generated-
| voices-r...
| emmp wrote:
| We can seriously say that, yes. The courts have been saying
| this in the US for over 30 years. See Midler v. Ford Motor
| Co.
| Avshalom wrote:
| Tom Waits won a lawsuit against Doritos too.
| dragonwriter wrote:
| If the purpose is to trade on the celebrity voice and
| perceived association, and it's subject to California right
| of personality law, then, yes, we're saying that that has
| been established law for decades.
| Last5Digits wrote:
| That's not the purpose though, clearly. If anything, you
| could make the argument that they're trading on the
| association with the movie "Her", and that's it. Neither Sky
| nor the new voice model sound particularly like ScarJo,
| unless you want to imply that her identity rights extend
| over 40% of all female voice types. People made the
| association because her voice was used in a movie that
| features a highly emotive voice assistant reminiscent of
| GPT-4o, which sama and others joked about.
|
| I mean, why not actually compare the voices before
| forming an opinion?
|
| https://www.youtube.com/watch?v=SamGnUqaOfU
|
| https://www.youtube.com/watch?v=vgYi3Wr7v_g
|
| -----
|
| https://www.youtube.com/watch?v=iF9mrI9yoBU
|
| https://www.youtube.com/watch?v=GV01B5kVsC0
| cowsup wrote:
| > People made the association because her voice was used
| in a movie that features a highly emotive voice assistant
| reminiscent of GPT-4o, which sama and others joked about.
|
| Whether you think it sounds like her or not is a matter
| of opinion, I guess. I can see the resemblance, and I can
| also see the resemblance to Jennifer Lawrence and others.
|
| What Johansson is alleging goes beyond this, though. She
| is alleging that Altman (or his team) reached out to her
| (or her team) asking her to lend her voice, she was not
| interested, and then she was asked _again_ just two days
| before GPT-4o's announcement, and she rejected _again._
| Now there's a voice that, in her opinion, sounds a lot
| like her.
|
| Luckily, the legal system is far more nuanced than just
| listening to a few voices and comparing it mentally to
| other voices individuals have heard over the years.
| They'll be able to figure out, as part of discovery, what
| led to the Sky voice sounding the way it does
| (intentionally using Johansson's likeness? coincidence?
| directly trained off her interviews/movies?), whether
| OpenAI were willing to slap Johansson's name onto the
| existing Sky during the presentation, whether the "her"
| tweet and the combination of the Sky voice was supposed
| to draw the subtle connection... This allegation is just
| the beginning.
| Last5Digits wrote:
| I honestly don't think it is a matter of opinion, though.
| Her voice has a few very distinct characteristics, the
| most significant of which is the vocal fry /
| huskiness, which isn't present at all in either of the
| Sky models.
|
| Asking for her vocal likeness is completely in line with
| just wanting the association with "Her" and the big PR
| hit that would come along with that. They developed voice
| models on two different occasions and hoped twice that
| Johansson would allow them to make that connection.
| Neither time did she accept, and neither time did they
| release a model that sounded like her. The two day run-up
| isn't suspicious either, because we're talking about a
| general audio2audio transformer here. They could likely
| fine-tune it (if even that is necessary) on her voice in
| hours.
|
| I don't think we're going to see this going to court.
| OpenAI simply has nothing to gain by fighting it. It
| would likely sour their relations with a bunch of media big-
| wigs and cause them bad press for years to come. Why
| bother when they can simply disable Sky until the new
| voice mode releases, allowing them to generate a million
| variations of highly-expressive female voices?
| om2 wrote:
| I hadn't heard the GPT-4o voice before. Comparing the
| video to the video of Johansson's voice in "her", it
| sounds pretty similar. Johansson's performance there
| sounds pretty different from her normal speaking voice in
| the interview - more intentional emotional inflection,
| bubbliness, generally higher pitch. The GPT-4o voice
| sounds a lot like it.
|
| From elsewhere in the thread, likeness rights apparently
| do extend to intentionally using lookalikes / soundalikes
| to create the appearance of endorsement or association.
| ncallaway wrote:
| > Are we seriously saying we can't make voices that are
| similar to celebrities, when not using their actual voice?
|
| They clearly thought it was close enough that they asked
| for permission, twice. And got two no's. Going forward with
| it at that point was super fucked up.
|
| It's very bad to not ask permission when you should. It's
| far worse to ask for permission and then ignore the
| response.
|
| Totally ethically bankrupt.
| nicce wrote:
| And they could have totally gotten away with it by never
| mentioning the name of Scarlett. But of course, that is
| not what they wanted.
|
| Edit: to clarify, since it is not an exactly identical
| voice, or maybe not even that close, they can plausibly deny
| it, and we would never know what their intention was.
|
| But in this case, they have clearly created the voice to
| represent Scarlett's voice to demonstrate the
| capabilities of their product in order to get marketing
| power.
| visarga wrote:
| > since it is not exactly identical voice, or even not
| that close, they can plausibly deny it
|
| When studios approach an actress A and she refuses, then
| another actress B takes the role, is that infringing on
| A's rights? Or should they just scrap the movie?
|
| Maybe if they replicated a scene from A's movies or
| there was a striking likeness between the voices... but not
| generally.
| nicce wrote:
| > When studios approach an actress A and she refuses,
| then another actress B takes the role, is that infringing
| on A's rights? Or should they just scrap the movie?
|
| The scenario here would have been that they approached no one.
| avarun wrote:
| > They clearly thought it was close enough that they
| asked for permission, twice.
|
| You seem to be misunderstanding the situation here. They
| wanted ScarJo to voice their voice assistant, and she
| refused twice. They also independently created a voice
| assistant which sounds very similar to her. That doesn't
| mean they thought they had to ask permission for the
| similar voice assistant.
| tomrod wrote:
| And... no. That is what OpenAI will assert, and good
| discovery by ScarJo's reps may prove or disprove it.
| chromakode wrote:
| So, what would they have done if she accepted? Claimed
| that the existing training of the Sky voice was voiced by
| her?
| og_kalu wrote:
| Voice cloning could be as simple as a few seconds of
| audio in the context window, since GPT-4o is a speech-to-
| speech transformer. They wouldn't need to claim anything;
| just switch samples. They haven't launched the new voice
| mode yet, just demos.
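| A minimal sketch of the "just switch samples" idea above,
| under the assumption that a speech-to-speech model is
| conditioned on a short reference clip so that swapping the
| clip swaps the output voice. The class and method names are
| hypothetical stand-ins, not OpenAI's actual API:
|
|     from dataclasses import dataclass
|
|     @dataclass
|     class SpeechRequest:
|         # A few seconds of reference audio (raw samples) that
|         # sets the target voice; swapping this swaps the speaker.
|         reference_audio: list[float]
|         # What the assistant should say.
|         text: str
|
|     class HypotheticalSpeechToSpeechModel:
|         def generate(self, request: SpeechRequest) -> list[float]:
|             # Stub: a real model would return audio spoken in a
|             # voice conditioned on request.reference_audio.
|             return [0.0] * 16000
|
|     model = HypotheticalSpeechToSpeechModel()
|     clip_a = [0.0] * 48000  # reference clip for one voice
|     clip_b = [0.1] * 48000  # a different clip, a different voice
|     voice_a = model.generate(SpeechRequest(clip_a, "Hello!"))
|     voice_b = model.generate(SpeechRequest(clip_b, "Hello!"))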
| sangnoir wrote:
| > Claimed that the existing training of the Sky voice was
| voiced by her?
|
| That claim could very well be true. The letter requested
| information on how the voice was trained - OpenAI may not
| want that can of worms opened lest other celebrities
| start paying closer attention to the other voices.
| blackoil wrote:
| Maybe they have a second one trained on her voice.
| voltaireodactyl wrote:
| You seem to be misunderstanding the legalities at work
| here: reaching out to her multiple times beforehand,
| along with tweets intended to underline the similarity to
| her work on Her, demonstrates intention. If they didn't
| think they needed permission, why ask for permission
| multiple times and then yank it when she noticed?
|
| Answer: because they knew they needed permission, after
| working so hard to associate with Her, and they hoped
| that, in traditional tech fashion, if they moved fast
| and broke things enough, everyone would have to reshape
| around OpenAI's wants, rather than around the preexisting
| rights of the humans involved.
| KHRZ wrote:
| You could also ask: If Scarlett has a legal case already,
| why does she want legislation passed?
| minimaxir wrote:
| To prevent it from happening again, with more legal
| authority than a legal precedent.
| ncallaway wrote:
| Because under the current justice system and legislative
| framework it would probably take hundreds of thousands to
| millions of dollars to bring a case that requires discovery
| and a trial.
|
| Maybe (maybe!) it's worth it for someone like Johansson
| to take on the cost of that to vindicate her rights--but
| it's certainly not the case for most people.
|
| If your rights can only be defended from massive
| corporations by bringing lawsuits that cost hundreds of
| thousands to millions of dollars, then only the wealthy
| will have those rights.
|
| So maybe she wants new legislative frameworks around
| these kind of issues to allow people to realistically
| enforce these rights that nominally exist.
|
| For an example of updating a legislative framework to
| allow more easily vindicating existing rights, look up
| "anti-SLAPP legislation", which many states have passed
| to make it easier for a defendant of a meritless lawsuit
| seeking to chill speech to have the lawsuit dismissed.
| Anti-SLAPP legislation does almost _nothing_ to change
| the actual rights that a defendant has to speak, but it
| makes it much more practical for a defendant to actually
| exercise those rights.
|
| So, the assumption that a call for updated legislation
| implies that no legal protection currently exists is just
| a bad assumption that does not apply in this situation.
| bradchris wrote:
| She has a personal net worth of >$100m. She's also
| married to a successful actor in his own right.
|
| Her voice alone didn't get her there -- she did. That's
| why celebrities are so protective about how their
| likeness is used: their personal brand is their asset.
|
| There's established legal precedent on exactly this--even
| in the case they didn't train on her likeness, if it can
| reasonably be suspected by an unknowing observer that she
| personally has lent her voice to this, she has a strong
| case. Even OpenAI knew this, or they would not have asked
| in the first place.
| parineum wrote:
| > If they didn't think they needed permission, why ask
| for permission multiple times and then yank it when she
| noticed?
|
| Many things that are legal are of questionable ethics.
| Asking permission could easily just be an effort for them
| to get better samples of her voice. Pulling the voice
| after debuting it is 100% a PR response. If there's a law
| that was broken, pulling the voice doesn't unbreak it.
| ncallaway wrote:
| > You seem to be misunderstanding the situation here.
| They wanted ScarJo to voice their voice assistant, and
| she refused twice. They also independently created a
| voice assistant which sounds very similar to her.
|
| And promoted it using a tweet naming the movie that
| Johansson performed in, for the role that prompted them
| to ask her in the first place.
|
| You have to be almost deliberately naive to not see that
| they were attempting to use her vocal likeness in this
| situation. There's a reason they immediately walked it
| back after the situation was revealed.
|
| Neither a judge, nor a jury, would be so willingly naive.
| ants_everywhere wrote:
| Yes, totally ethically bankrupt. But what bewilders me is
| that they yanked it as soon as they heard from their
| lawyers. I would have thought that if they made the
| decision to go ahead despite getting two "no"s, that they
| at least had a legal position they thought was defensible
| and worth defending.
|
| But it kind of looks like they released it knowing they
| couldn't defend it in court which must seem pretty
| bonkers to investors.
| ethbr1 wrote:
| > _I would have thought that if they made the decision to
| go ahead despite getting two "no"s, that they at least
| had a legal position they thought was defensible and
| worth defending._
|
| They likely have a _legal_ position which is defensible.
|
| They're much more worried that they don't have a _PR_
| position which is defensible.
|
| What's the point of winning the (legal) battle if you
| lose the war (of public opinion)?
|
| Given the rest of their product is built on apathy to
| copyright, they're actively being sued by creators, and
| the general public is uneasy about GenAI taking human
| jobs...
|
| ... this isn't a great moment for OpenAI to initiate a
| long legal battle, against a female movie actress /
| celebrity, in which they're arguing how her likeness
| isn't actually controlled by her.
|
| Talk about optics!
|
| (And I'd expect they quietly care much more about their
| continued ability to push creative _output_ through their
| copyright launderer, than get into a battle over
| likeness)
| justinclift wrote:
| > They likely have a legal position which is defensible.
|
| Doesn't sound like they have that either.
| XorNot wrote:
| How is the PR position not defensible? One of the worst
| things you can generally do is admit fault, particularly
| if you have a complete defense.
|
| Buckle in, go to court, and double-down on the fact that
| the public's opinion of actors is pretty damn fickle at
| the best of times - particularly if what you released was
| in fact based on someone you signed a valid contract with
| who just sounds similar.
|
| Of course, this is all dependent on actually having a
| complete defense - you absolutely would not
| want to find Scarlett Johansson voice samples in file
| folders associated with the Sky model if it went to
| court.
| ethbr1 wrote:
| In what world does a majority of the public cheer for
| OpenAI "stealing"* an actress's voice?
|
| People who hate Hollywood? Most of that crowd hates tech
| even more.
|
| * Because it would take the first news cycle to be
| branded as that
| XorNot wrote:
| It is wild to me that on HackerNews of all places, you'd
| think people don't love an underdog story.
|
| Which is what this would be in the not-stupid version of
| events: they hired a voice actress for the rights to
| create the voice, she was paid, and then is basically
| told by the courts "actually you're unhireable because
| you sound too much like an already rich and famous
| person".
|
| The issue of course is that OpenAIs reactions so far
| don't seem to indicate that they're actually confident
| they can prove this or that this is the case. Coz if this
| is actually the case, they're going about handling this
| in the dumbest possible way.
| ml-anon wrote:
| It's wild to me that there are people who think that
| OpenAI are the underdog. An $80B Microsoft vassal, what a
| plucky upstart.
|
| You realise that there are multiple employees including
| the CEO publicly drawing direct comparisons to the movie
| Her after having tried and failed twice to hire the
| actress who starred in the movie? There is no non-idiotic
| reading of this.
| XorNot wrote:
| You're reading my statements as defending OpenAI. Put on
| your "I'm the PR department hat" and figure out what
| you'd do if you were OpenAI given various permutations of
| the possible facts here.
|
| That's what I'm discussing.
|
| Edit: which is to say, I think Sam Altman may have been a
| god damn idiot about this, but it's also wild anyone
| thought that ScarJo or anyone in Hollywood would agree -
| AI is currently the hot button issue there and you'd find
| yourself a much more immediate target of their ire.
| ml-anon wrote:
| Then why bother mentioning an "underdog story" at all?
|
| Who is the underdog in this situation? In your comment it
| seems like you're framing OpenAI as the underdog (or
| perceived underdog) which is just bonkers.
|
| Hacker News isn't a hivemind and there are those of us
| who work in GenAI who are firmly on the side of the
| creatives and _gasp_ even rights holders.
| Sebb767 wrote:
| > they hired a voice actress for the rights to create the
| voice, she was paid, and then is basically told by the
| courts "actually you're unhireable because you sound too
| much like an already rich and famous person".
|
| There are quite a few issues here: First, this is
| assuming they actually hired a sound-alike person, which
| is not confirmed. Second, they are not an underdog (the
| voice actress might be, but she's most likely pretty
| unaffected by this drama). Finally, they were clearly
| aiming to impersonate ScarJo (as confirmed by them asking
| for permission and samas tweet), so this is quite a
| different issue than "accidentally" hiring someone that
| "just happens to" sound like ScarJo.
| jubalfh wrote:
| an obnoxious sleazy millionaire backed by Microsoft is by
| no means "an underdog"
| foobarian wrote:
| > But it kind of looks like they released it knowing they
| couldn't defend it in court which must seem pretty
| bonkers to investors.
|
| That actually seems like there may be a few people
| involved and one of them is a cowboy PM who said fuck it,
| ship it to make the demo. And then damage control came in
| later. Possibly the PM didn't even know about the asks
| for permission?
| anytime5704 wrote:
| The whole company behaves like rogue cowboys.
|
| If a PM there didn't say "fuck it ship it even without
| her permission" they'd probably be replaced with someone
| who would.
|
| I expect the cost of any potential legal
| action/settlement was happily accepted in order to put on
| an impressive announcement.
| kuboble wrote:
| > a cowboy PM who said fuck it, ship it to make the demo.
|
| Given the timeline it sounds like the PM was told "just
| go ahead with it, I'll get the permission".
| emsign wrote:
| It looks really unprofessional at minimum if not a bit
| arrogant, which is actually more concerning as it hints
| at a deeper disrespect for artists and celebrities.
| mensetmanusman wrote:
| Effective altruism would posit that one voice theft is
| worth it to help speed getting life-saving AI technology
| into the hands of everyone.
| ehnto wrote:
| It didn't require voice theft, they could have easily
| found a volunteer or paid for someone else.
| ncallaway wrote:
| Effective Altruists are just shitty utilitarians that
| never take into account all the myriad ways that
| unmoderated utilitarianism has horrific failure modes.
|
| Their hubris will walk them right into federal prison for
| fraud if they're not careful.
|
| If Effective Altruists want to speed the adoption of AI
| with the general public, they'd do well to avoid talking
| about it, lest the general public make a connection
| between EA and AI.
|
| I will say, when EA are talking about where they want to
| donate their money with the most efficacy, I have no
| problem with it. When they start talking about the
| utility of committing crimes or other moral wrongs
| because the ends justify the means, I tend to start
| assuming they're bad at morality and ethics.
| parineum wrote:
| This is like attributing the crimes of a few
| fundamentalists to an entire religion.
| ncallaway wrote:
| I don't think so. I've narrowed my comments specifically
| to Effective Altruists who are making utilitarian trade-
| offs to justify known moral wrongs.
|
| > I will say, when EA are talking about where they want
| to donate their money with the most efficacy, I have no
| problem with it. When they start talking about the
| utility of committing crimes or other moral wrongs
| because the ends justify the means, I tend to start
| assuming they're bad at morality and ethics.
|
| Frankly, if you're going to make an "ends justify the
| means" moral argument, you need to do a _lot_ of work to
| address how those arguments have gone horrifically wrong
| in the past, and why the moral framework you're using
| isn't susceptible to those issues. I haven't seen much of
| that from Effective Altruists.
|
| I was responding to someone who was specifically saying
| an EA might argue why it's acceptable to commit a moral
| wrong, because the ends justify it.
|
| So, again, if someone is using EA to decide how to direct
| their charitable donations, volunteer their time, or
| otherwise decide between moral goods, I have no problem
| with it. That specifically wasn't the context I was
| responding to.
| ocodo wrote:
| Effective Altruists are the fundamentalists though. So
| no, it's not.
| comp_throw7 wrote:
| > When they start talking about the utility of committing
| crimes or other moral wrongs because the ends justify the
| means, I tend to start assuming they're bad at morality
| and ethics.
|
| Extremely reasonable position, and I'm glad that every
| time some idiot brings it up in the EA forum comments
| section they get overwhelmingly downvoted, because most
| EAs aren't idiots in that particular way.
|
| I have no idea what the rest of your comment is talking
| about; EAs that have opinions about AI largely think that
| we should be slowing it down rather than speeding it up.
| ncallaway wrote:
| In some sense I see a direct line between the EA argument
| being presented here, and the SBF consequentialist
| argument where he talks about being willing to flip a
| coin if it had a 50% chance to destroy the world and a
| 50% chance to make the world more than twice as good.
|
| I did try to cabin my arguments to Effective Altruists
| that are making ends-justify-the-means arguments. I
| really don't have a problem with people that are
| attempting to use EA to decide between multiple _good_
| outcomes.
|
| I'm definitely not engaged enough with the Effective
| Altruists to know where the plurality of thought lies, so
| I was trying to respond in the context of this argument
| being put forward on behalf of Effective Altruists.
|
| The only part I'd say applies to all EA, is the brand
| taint that SBF has done in the public perception.
| emsign wrote:
| The speed doesn't really matter if their end goal is
| morally wrong. A slower speed might give them an
| advantage to not overshoot and get backlash or it gives
| artists and the public more time to fight back against
| EA, but it doesn't hide their ill intentions.
| 0xDEAFBEAD wrote:
| >Effective Altruists are just shitty utilitarians that
| never take into account all the myriad ways that
| unmoderated utilitarianism has horrific failure modes.
|
| There's a fair amount of EA discussion of
| utilitarianism's problems. Here's EA founder Toby Ord on
| utilitarianism and why he ultimately doesn't endorse it:
|
| https://forum.effectivealtruism.org/posts/YrXZ3pRvFuH8SJa
| ay/...
|
| >If Effective Altruists want to speed the adoption of AI
| with the general public, they'd do well to avoid talking
| about it, lest the general public make a connection
| between EA and AI
|
| Very few in the EA community want to speed AI adoption.
| It's far more common to think that current AI companies
| are being reckless, and we need some sort of AI pause so
| we can do more research and ensure that AI systems are
| reliably beneficial.
|
| >When they start talking about the utility of committing
| crimes or other moral wrongs because the ends justify the
| means, I tend to start assuming they're bad at morality
| and ethics.
|
| The all-time most upvoted post on the EA Forum condemns
| SBF: https://forum.effectivealtruism.org/allPosts?sortedB
| y=top&ti...
| ncallaway wrote:
| I've had to explain myself a few times on this, so
| clearly I communicated badly.
|
| I probably should have said _those_ Effective Altruists
| are shitty utilitarians. I was attempting--and since I've
| had to clarify a few times clearly failed--to take aim at
| the effective altruists that would make the utilitarian
| trade off that the commenter mentioned.
|
| In fact, there's a paragraph from the Toby Ord blog post
| that I wholeheartedly endorse and I think rebuts the
| exact claim that was put forward that I was responding
| to.
|
| > Don't act without integrity. When something immensely
| important is at stake and others are dragging their feet,
| people feel licensed to do whatever it takes to succeed.
| We must never give in to such temptation. A single person
| acting without integrity could stain the whole cause and
| damage everything we hope to achieve.
|
| So, my words were too broad. I don't actually mean _all_
| effective altruists are shitty utilitarians. But the ones
| that would make the arguments I was responding to are.
|
| I think Ord is a really smart guy, and has worked hard to
| put some awesome ideas out into the world. I think many
| others (and again, certainly not all) have interpreted
| and run with it as a framework for shitty utilitarianism.
| gibbitz wrote:
| Are we surprised by this bankruptcy? As neat as AI is, it
| is only a thing because the corporate class sees it as a
| way to cut costs by replacing people with it. The
| whole concept is bankrupt.
| ecjhdnc2025 wrote:
| 100% this.
|
| It's shocking to me how people cannot see this.
|
| The only surprise here is that they didn't think she'd
| push back. That is what completes the multilayered cosmic
| and dramatic irony of this whole vignette. Honestly feels
| like Shakespeare or Arthur Miller might have written it.
| ncallaway wrote:
| I don't think anyone said anything about being surprised by
| it?
| emsign wrote:
| The problem is they really believe that eventually we
| either won't be able to tell the difference between a human
| and an AI model, or that we won't care. Don't they
| understand the meaning of art?
| EasyMark wrote:
| Sure they could have taken her to court but right now they
| don't want the bad publicity, especially since it would put
| everything else in the shadow of such a scandalous "story".
| Better to just back off, let S.J. win and move on and start
| planning how they're gonna spend all that paper money they got
| with the announcement of a new, more advanced model. It's a
| financial decision and a fairly predictable one. I'm glad
| she won this time.
| __loam wrote:
| Paper money from the model they're giving away for free?
| EasyMark wrote:
| I mean, if you don't think these kinds of positive
| announcements increase the value of the company or its
| parent company, then I don't really know how to convince
| you, as it's a standard business principle.
| ml-anon wrote:
| There isn't a positive announcement here, what is wrong
| with you?
|
| This reads like "we got caught red handed" and doing the
| bare minimum for it to not appear malicious and
| deliberate when the timeline is read out in court.
| __loam wrote:
| I believe there's a difference between building a
| sustainable and profitable business and pumping the
| stock.
| smugma wrote:
| She also won big against Disney. They backed down even
| though it appeared the contract was on their side. Iger
| apologized.
|
| https://www.bbc.co.uk/news/business-58757748.amp
| mschuster91 wrote:
| Probably (and rightfully) feared that, had Disney stuck
| with their position, other MCU actors would be much, much
| harsher in new contract negotiations - or that some would
| go as far and say "nope, I quit".
| callalex wrote:
| I think we should all be held to the standard of "Weird" Al
| Yankovic. In personal matters consent is important.
| visarga wrote:
| > Are we seriously saying we can't make voices that are
| similar to celebrities, when not using their actual voice?
|
| I think the copyright industry wants to grab new powers to
| counter the infinite capacity of AI to create variations.
| But that move would kneecap the creative industry first;
| newcomers have no place in a fully copyrighted space.
|
| It reminds me of how NIMBYs block construction to keep
| prices up. Will the whole copyright space come to operate
| on NIMBY logic?
| kapildev wrote:
| I can still access the sky voice even though it is supposed
| to be "yanked".
| thorum wrote:
| There's still a Sky option but the actual voice has been
| changed.
| andrewinardeer wrote:
| I thought it sounded like Jodie Foster.
| ncr100 wrote:
| Scar Jo thought it sounded like herself, and so did people
| who knew her personally.
|
| That is what matters. OWNERSHIP over her contributions to
| the world.
| mcphage wrote:
| Clearly Sam Altman thought it sounded like ScarJo as well
| :-(
| smt88 wrote:
| I mostly agree with you, but I actually don't think it
| matters if it sounded exactly like her or not. The crime
| is in the training: did they use her voice or not?
|
| If someone licenses an impersonator's voice and it gets
| very close to the real thing, that feels like an
| impossible situation for a court to settle and it should
| probably just be legal (if repugnant).
| toomuchtodo wrote:
| https://en.wikipedia.org/wiki/Personality_rights
| sangnoir wrote:
| > The crime is in the training: did they use her voice or
| not?
|
| This is a civil issue, and actors get broad rights to
| their likeness. Kim Kardashian sued Old Navy for using
| a look-alike actress in an ad; Old Navy chose to settle,
| which makes it appear like "the real actress wasn't
| involved in any way" may not be a perfect defense. The
| timeline makes it clear they wanted it to sound like
| Scarlett's voice, the actual mechanics on how they got
| the AI to sound like that is only part of the story.
| randoglando wrote:
| > If someone licenses an impersonator's voice and it gets
| very close to the real thing, that feels like an
| impossible situation for a court to settle and it should
| probably just be legal (if repugnant).
|
| Does that mean if cosplayers dress up like some other
| character, they can use that version of the character in
| their games/media? I think it should be equally simple to
| settle. It's different if it's their natural voice. Even
| then, it brings into question whether they can use
| "doppelgangers" legally.
| aseipp wrote:
| It is not an impossible situation; courts have settled
| it, and what you describe is not how the law works
| (despite what many computer engineers think to the
| contrary).
| smt88 wrote:
| Courts have settled almost nothing related to AI. We
| don't even know if training AI using copyrighted works is
| a violating of copyright law.
|
| Please point to a case where someone was successfully
| sued for sounding too much like a celebrity (while not
| using the celebrity's name or claiming to be them).
| davidgerard wrote:
| Multiple cases already answering your question in this
| thread.
| ascorbic wrote:
| Midler vs Ford:
| https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
| dv_dt wrote:
| As I understand it (though I may be wrong), in music
| sampling cases it doesn't matter if the "sample" is
| an actual clip from a recording or if it was
| recreated from scratch in a new medium (e.g. a direct
| MIDI sequence); if a song sampling another song is
| recognizable, it is still infringing.
| parineum wrote:
| Sampling is not the same as duplication. Sampling is
| allowed as it's a derivative work as long as it's
| substantially different from the original.
|
| It's a "I know it when I see it" situation so it's not
| clear cut.
| Findecanor wrote:
| Oh, the day when an artist could sample other artists
| without attribution and royalties is long gone. The music
| labels are very hard on this these days.
| XorNot wrote:
| If OpenAI commissioned a voice actor to lend their voice
| to the Sky model, and cast on the basis of trying to get
| someone who sounds similar to Scarlett
| Johansson, but then did not advertise or otherwise use
| the voice model created to claim it _was_ Scarlett
| Johansson - then they're completely in the clear.
|
| Because then the actual case would be fairly bizarre: an
| entirely separate person, selling the rights to their own
| likeness as they are entitled to do, is being prohibited
| from doing that by the courts because they sound too much
| like an already famous person.
|
| EDIT: Also, up front, I'm not sure you can read much into
| the timelines for swapping out technology here. We have
| voice cloning systems that can do it with as little as 15
| seconds of audio. So having a demo reel of what they
| wanted to do that they could've used on a few days' notice
| isn't unrealistic - and training a model and _not_ using
| it or releasing it also isn't illegal.
| cycomanic wrote:
| That's confidently incorrect. Many others already posted
| that this has been settled case law for many years. I
| mean, would you argue that if someone built a MacBook
| lookalike, but not using the same components, they would be
| completely in the clear?
| XorNot wrote:
| I ask you, what do you call the Framework [1]? Or Dell's
| offerings? [2] Compared to the MacBook? [3]
|
| They look kind of similar, right? Lots of familiar styling
| cues? What would take it from "similar" to actual
| infringement? Well, if you slapped an Apple logo on there,
| that would do it. Did OpenAI make an actual claim? Did
| they _actually_ use Scarlett Johansson's public image
| and voice as sampling for the system?
|
| [1] https://images.prismic.io/frameworkmarketplace/25c9a1
| 5f-4374...
|
| [2] https://i.dell.com/is/image/DellContent/content/dam/s
| s2/prod...
|
| [3] https://cdn.arstechnica.net/wp-
| content/uploads/2023/06/IMG_1...
| mrbungie wrote:
| You're not arguing your way out of jurisprudence,
| especially when the subject is a human and not a device
| nor IP. They (OpenAI) fucked up.
| XorNot wrote:
| There is _not_ clear jurisprudence on this. They're only
| in trouble if they actually used ScarJo's voice samples
| to train the model, _or_ if they intentionally tried to
| portray their imitation as her without her permission.
|
| The biggest problem on that front (assuming the former is
| not true) is Altman's tweets, but court-wise that's
| defensible (though I retract what I had here previously -
| probably not easily) as a reference to the general
| concept of the movie.
|
| Because otherwise the situation you have is OpenAI
| seeking a particular style, hiring someone who can
| provide it, _not_ trying to pass it off as that person
| (give or take the tweets) and the intended result
| effectively being: "random voice actress, you sound too
| much like an already rich and famous person. Good luck
| getting any more work in your profession" - which would
| be the actual outcome.
|
| The question entirely hinges on whether they included
| _any_ data at all containing ScarJo's voice samples in
| the training. And also whether it actually _does_ sound
| similar enough - Frito-Lay went down because of intent
| and similarity. There's the hilarious outcome here that
| the act of trying to contact ScarJo is the _actual_
| problem they had.
|
| EDIT 2: Of note also - to have a case, they actually have
| to show reputational harm. Of course on that front, the
| entire problem might also be Altman. Continuing the trend
| I suppose of billionaires not shutting up on Twitter
| being the main source of their legal issues.
| einherjae wrote:
| Are you a lawyer?
| einherjae wrote:
| Grey laptops that share some ideas in their outline while
| being distinct enough to not get lawyers from Cupertino
| on their necks?
| jerojero wrote:
| Well, Sam Altman tweeted "her", so that does seem to me
| like they're trying to claim a similarity to Scarlett
| Johansson.
| jonathankoren wrote:
| This has been settled law for 34 years. See Tom Waits v
| Frito-Lay.
|
| They literally hired an impersonator, and it cost them
| 2.5 million (~6 million today).
|
| https://www.latimes.com/archives/la-
| xpm-1990-05-09-me-238-st...
| smt88 wrote:
| That case seems completely dissimilar to what OpenAI did.
|
| Frito-Lay copied a song by Waits (with different lyrics)
| and had an impersonator sing it. Witnesses testified they
| thought Waits had sung the song.
|
| If OpenAI were to anonymously copy someone's voice by
| training AI on an imitation, you wouldn't have:
|
| - a recognizable singing voice
|
| - music identified with a singer
|
| - market confusion about whose voice it is (since it's
| novel audio coming from a machine)
|
| I don't think any of this is ethical and think voice-
| cloning should be entirely illegal, but I also don't
| think we have good precedents for most AI issues.
| jonathankoren wrote:
| Let me connect the dots for you.
|
| Company identifies celebrity voice they want.
| (Frito=Waits, OpenAi=ScarJo)
|
| Company comes up with a novel thing for the voice to
| say. (Frito=Song, OpenAI=ChatGPT)
|
| Company decides they don't need the celebrity they want
| (Frito=Waits, OpenAI=ScarJo) and instead hires an
| impersonator (Frito=singer, {OpenAI=impersonator or
| OpenAI=ScarJo-public-recordings}) to get what they want
| (Frito=a-facsimile-of-Tom-Waits's-voice-in-a-commercial,
| OpenAI=a-facsimile-of-ScarJo's-voice-in-their-chatbot)
|
| When made public, people confuse the facsimile for the
| real thing.
|
| I don't see how you don't see a parallel. It's literally
| beat for beat the same, particularly around the part
| about using an impersonator as an excuse.
| minimaxir wrote:
| More notably for legal purposes, there were several
| independent news reports corroborating the vocal
| similarity.
| sangnoir wrote:
| ...and sama's tweet referencing "Her"
| dyno12345 wrote:
| I'm not sure to what extent you currently legally own
| imitations of your own voice. There's a whole market for
| voice actors who can imitate particular famous voices.
| adolph wrote:
| Should have renamed it
|
| https://en.wikipedia.org/wiki/Sosumi
|
| Or
|
| https://www.reddit.com/r/todayilearned/comments/9n44b6/ti
| l_t...
| parineum wrote:
| She doesn't own most (probably all) of her contributions
| to the world.
|
| If the voice was only trained on the voice of the
| character she played in Her, would she have any standing
| in claiming some kind of infringement?
| wkat4242 wrote:
| > maybe not even a hiring of an impressionist
|
| If they really hired someone who sounds just like her, it's
| fair game IMO. Johansson can't own the right to a _similar_
| voice, just like many people can have the same name. I think
| if there really was another actress and she just happens to
| sound like her, then it's really ok. And no, I'm not a fan of
| Altman (especially his Worldcoin, which I view as a privacy
| disaster).
|
| I mean, imagine if I happened to have a similar voice to a
| famous actor, would that mean that I couldn't work as a voice
| actor without getting their OK just because they happen to be
| more famous? That would be ridiculous. Pretending to be them
| would be wrong, yes.
|
| If they hired someone to change their voice to match hers,
| that'd be bad. Yeah. If they actually just AI-cloned her
| voice that's totally not OK. Also any references to the
| movies. Bad.
| confused_boner wrote:
| Discovery process will be interesting
| 101008 wrote:
| But clearly they are advertising it as her (no pun
| intended), which is a gray area.
| wkat4242 wrote:
| Yeah that was the bad part. Agreed there.
|
| I wonder if they deliberately steered towards this for
| more marketing buzz?
| sneak wrote:
| Why are you mad? We have no rights to the sound of our voice.
| There is nothing wrong with someone or something else making
| sounds that sound like us, even if we don't want it to
| happen.
|
| No one is harmed.
| mkehrt wrote:
| Are you sure? You certainly have rights to your likeness--
| it can't be used commercially without permission. Do you
| know this doesn't cover your voice?
| elicash wrote:
| The law can actually be interesting and nuanced on this:
| http://law2.umkc.edu/faculty/projects/ftrials/communications...
| ethbr1 wrote:
| I think it's a different argument with respect to famous
| media celebrities* too.
|
| If someone clones a random person's voice for commercial
| purposes, the public likely has no idea whose voice it
| is. Consequently, it's just the acoustic voice.
|
| If someone clones a famous media celebrity's voice, the
| public has a much greater chance of recognizing the voice
| and associating it with a specific person.
|
| Which then opens a different question of 'Is the
| commercial use of the voice appropriating the real
| person's fame for their own gain?'
|
| Add in the facts that media celebrities' values are
| partially defined by how people see them, and that they
| are often paid for their endorsements, and it's a much
| clearer case that (a) the use potentially influenced the
| value of their public image & (b) the use was theft,
| because it was taking something which otherwise would
| have had value.
|
| Neither consideration exists with 'random person's voice'
| (with deference to voice actors).
|
| * Defined as 'someone for whom there is an expectation
| that the general public would recognize their voice or
| image'
| thatoneguy wrote:
| At least in past court cases I'm familiar with, you can't
| use an impersonator and get people to think it's the real
| thing.
|
| It's not like Tom Waits ever wanted to hawk chips.
|
| https://www.latimes.com/archives/la-
| xpm-1990-05-09-me-238-st...
| chii wrote:
| > get people to think it's the real thing.
|
| but did openAI make any claims about whose voice this is?
| Just because a voice sounds similar or familiar, doesn't
| mean it's fraudulent.
| tedivm wrote:
| Just read the top post of the thread you're responding
| to-
|
| > - Not receiving a response, OpenAI demos the product
| anyway, with Sam tweeting "her" in reference to
| Scarlett's film.
| dzhiurgis wrote:
| To me the reference sounds more towards omni than her
| voice.
| altairprime wrote:
| What's "omni"?
| nickthegreek wrote:
| GPT-4o is the new model, the o stands for Omni.
| nickthegreek wrote:
| That's not a gamble they are willing to take in a court
| of law or the court of public opinion.
| acomjean wrote:
| Or Bette Midler singing for Ford. She turned them down.
| They used a sound-alike; she sued and won.
|
| https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
| dewbrite wrote:
| They used a sound-alike _and_ had her sing one of her
| songs. I believe that's a different precedent, in that
| it's leveraging her fame.
|
| Imo Sky's voice is distinct enough from Scarlett's, and it
| wasn't implied to _be_ her.
|
| Sam's "Her" tweet could be interpreted as such, but it
| could also be defended as referring to the concept of
| "Her" rather than the voice itself.
| sleepybrett wrote:
| Does not the 'her' tweet give away the game? As you said,
| it was a Midler impersonator singing one of Midler's
| songs. In this case they have a voice for their AI
| assistant/phone sex toy that is very much like the
| actress that played a famous AI assistant/phone sex toy.
| Even if he is taken as meaning the concept, it's very,
| very damning. If they had, instead, mimicked another
| famous actor's voice that hasn't played a
| robot/AI/whatever and used that, would that really be any
| better though? Christopher Walken, say, or hell, Bette
| Midler?
| splatzone wrote:
| There's a nice YouTube doc telling the story of this, and
| Tom Waits' hatred of advertising -
| https://youtu.be/W7J01e-OIMA?si=57IJooNwg5oTfh62
| windexh8er wrote:
| The thing about the situation is that Altman is willing to
| lie and steal a celebrity's voice for use in ChatGPT. What he
| did, the timeline, everything - is sleazy if, in fact, that's
| the story.
|
| The _really_ concerning part here is that Altman is, and
| wants to be, a large part of AI regulation [0]. Quite the
| public contradiction.
|
| [0] https://www.businessinsider.com/sam-altman-openai-
| artificial...
| dvhh wrote:
| Some people might see a parallel with SBF in how Altman
| would try to regulate competition without impeding
| OpenAI's progress.
| viking123 wrote:
| I always mix up those two in my head and have to think
| which one is which
| ocodo wrote:
| One is in jail, when it should be two are in jail
| garbthetill wrote:
| I don't like Sam, but he moves way smarter than people
| like SBF or Elizabeth Holmes. He actually has a product
| close to the reported specs, albeit still far away from
| the ultimate goal of AGI.
|
| I don't see why he should be in jail.
| choppaface wrote:
| He should be in jail for Worldcoin, which has robbed
| people of their biological identity. I guess you could
| literally delete Worldcoin and in theory make people
| whole, but that company treats humans like vegetables
| that have no rights.
| Findecanor wrote:
| If his sister's words about sexually abusing her are
| true, he should be in jail.
| bryanrasmussen wrote:
| No, in that case he should have been in the juvenile
| incarceration system, unless the argument is that he
| should have been charged as an adult, or that juvenile
| abusers should always be charged and sentenced as adults,
| or that juvenile sex offenders who were not charged as
| juveniles should be charged as adults.
|
| Which one?
|
| on edit: this being based on the American legal system;
| you may come from a legal system with different rules.
| numpad0 wrote:
| Maybe they were the rogue AGI escapes we found along the
| way
| choppaface wrote:
| sama gets to farm out much of the lobbying to Microsoft's
| already very powerful team, which spends a mere $10m, but
| that money gets magnified by MS's gov and DoD contracts.
| That's a _huge_ safety net for him; he gets to steal and
| lie (as demonstrated w/ Scarlett) and yet the MS lobbying
| machine will continue unfazed.
|
| https://www.opensecrets.org/federal-
| lobbying/clients/summary...
| startupsfail wrote:
| Most likely it was an unforced error; there's been a lot
| of chaos with cofounders and the board revolt, and it's
| easy to lose track of something really minor.
|
| Like some intern's idea to train the voice on their
| favorite movie.
|
| And then they've decided that this is acceptable
| risk/reward and not a big liability, so worth it.
|
| This could be a well-planned opening move of a regulation
| gambit. But unlikely.
| mmastrac wrote:
| It makes a lot more sense that he was caught red-handed,
| likely hiring a similar voice actress and not realizing
| how strong identity protections are for celebs.
| windexh8er wrote:
| I don't think this makes any sense, at all, quite
| honestly. Why would an "intern" be training one of
| ChatGPT's voices for a major release?
|
| If in fact, that was the case, then OpenAI is not aligned
| with the statement they just put out about having utmost
| focus on rigor and careful considerations, in particular
| this line: "We know we can't imagine every possible
| future scenario. So we need to have a very tight feedback
| loop, rigorous testing, careful consideration at every
| step, world-class security, and harmony of safety and
| capabilities." [0]
|
| [0] https://x.com/gdb/status/1791869138132218351
| Always42 wrote:
| At first I thought there may be a /s coming...
| Cheer2171 wrote:
| > easy to lose track of something really minor. Like
| some intern's idea
|
| Yes, because we all know the high profile launch for a
| major new product is entirely run by the interns. Stop
| being an apologist.
| mbreese wrote:
| This is an unforced error, but it isn't minor. It's quite
| large and public.
|
| The general public doesn't understand the details and
| nuances of training an LLM, the various data sources
| required, and how to get them.
|
| But the public does understand stealing someone's voice.
| If you want to keep the public on your side, it's best to
| not train a voice with a celebrity who hasn't agreed to
| it.
| surfingdino wrote:
| I had a conversation with someone responsible for
| introducing LLMs into a process that involves personal
| information. That person rejected my concern about one
| person's data appearing in the report on another person.
| He told me that it would be possible to train the AI to
| avoid that. The rest of the conversation convinced me
| that AI is seen as magic that can do anything. It seems
| to me that we are seeing a split between those who don't
| understand it and fear it, and those who don't understand
| it but want to align themselves with it. The latter are
| the ones I fear the most.
| kombookcha wrote:
| The "AI is magic and we should simply believe" is even
| being actively promoted because all these VC hucksters
| need it.
|
| Any criticism of AI is being met with "but if we all just
| hype AI harder, it will get so good that your criticisms
| won't matter" or flat out denied. You've got tech that's
| deeply flawed with no obvious way to get unflawed, and
| the current AI 'leaders' run companies with no clear way
| to turn a profit other than being relentlessly hyped on
| proposed future growth.
|
| It's becoming an extremely apparent bubble.
| surfingdino wrote:
| On the plus side, lots of cheap Nvidia cards heading for
| eBay once it bursts.
| kergonath wrote:
| > Like some intern's idea to train the voice on their
| favorite movie.
|
| Ah, the famous rogue engineer.
|
| The thing is, even if it were the case, this intern would
| have been supervised by someone, who themselves would
| have been managed by someone, all the way to the top. The
| moment Altman makes a demo using it, he owns the problem.
| Such a public fuckup is embarrassing.
|
| > And then they've decided that this is acceptable
| risk/reward and not a big liability, so worth it.
|
| You mean, they were reckless and tried to wing it? Yes,
| that's exactly what's wrong with them.
|
| > This could be a well-planned opening move of a
| regulation gambit. But unlikely.
|
| LOL. ROFL, even. This was a gambit all right. They just
| expected her to cave and not ask questions. Altman has
| something in common with Musk: he does not play 3D chess.
| vasilipupkin wrote:
| If this account is true, Sam Altman is a deeply unethical
| human being. Given that he doesn't bring any technical
| know-how to the building of AGI, I just don't see the
| reason to have such a person in charge here. The new
| board should act.
| ornornor wrote:
| He has "The Vision"... It's the modern entrepreneurship
| trope that lowly engineers won't achieve anything if they
| weren't rallied by a demi-god who has "The Vision" and
| makes it all happen.
| azinman2 wrote:
| Probably not wrong. Lots and lots of examples of that
| being true.
| safety1st wrote:
| There is something to it. Someone has to identify the
| intersection between what the engineering can do and what
| the market actually wants, then articulate that to a
| broad enough audience. Engineers constantly undervalue
| this very fuzzy and very human centric part of the work.
|
| I don't think the issue is that Vision doesn't matter. I
| think the issue is Sam doesn't have it. Gates and Jobs
| had clear, well-defined visions for how the PC was going
| to change the world, then rallied engineering talent
| around them and turned those visions into reality; that's
| how their billions and those lasting empires were born.
| Maybe someone like Elon Musk is a contemporary example. I
| just don't see anything like that from SamA. We see him
| in the media, talking a lot about AI, rubbing shoulders
| with power brokers, being cutthroat, but where's the
| vision of a better future? And if he comes up with one,
| does he really understand the engineering well enough to
| ground it in reality?
| parpfish wrote:
| I roll my eyes when somebody says that they're "the idea
| person" or that they have "the vision".
|
| I'd wager that most senior+ engineers or product people
| also have equally compelling "the vision"s.
|
| The difference is that they need to do actual work all
| day so they don't get to sit around pontificating.
| jcranmer wrote:
| I mean, there have already been some yellow flags with
| Altman. He founded Worldcoin, whose plan is to
| airdrop free money in exchange for retinal scans. And the
| board of OpenAI fired him for (if I've got this right)
| lying to the board about conversations he'd had with
| individual board members.
| imjonse wrote:
| He rubs elbows with very powerful people including CEOs,
| heads of state and sheiks. They probably want 'one of
| them' in charge of the company that has the best chances
| of getting close to AGI. So it's not his technical chops
| and not even 'vision' in the Jobs sense that keeps him
| there.
| dontupvoteme wrote:
| Are they really the ones with the best chance now though?
|
| They're basically owned by Microsoft, they're bleeding
| tech/ethical talent and credibility, and most
| importantly Microsoft Research itself is no slouch
| (especially post-DeepMind poaching) - things like Phi are
| breaking ground on planets that OpenAI hasn't even
| touched.
|
| At this point I'm thinking they're destined to become
| nothing but a premium marketing brand for Microsoft's
| technology.
| insane_dreamer wrote:
| I thought we had already established this when the
| previous board tried to oust him for failing to stick to
| OpenAI's charter. This is just further confirmation.
|
| > The new board should act
|
| You mean like the last board tried? Besides the board was
| picked to be on Altman's side. The independent members
| were forced out.
| silver_silver wrote:
| It shouldn't be forgotten that his sister has publicly
| accused him and his brother of sexually abusing her as a
| child.
| verisimi wrote:
| I didn't know about that, strange:
|
| https://www.lesswrong.com/posts/QDczBduZorG4dxZiW/sam-
| altman...
| lrvick wrote:
| "Some commenters on Hacker News claim that a post
| regarding Annie's claims that Sam sexually assaulted her
| at age 4 has been being repeatedly removed."
|
| Whelp. Let us see if this one sticks.
| serial_dev wrote:
| He must be bringing something to the table, as they tried
| to get rid of him and failed spectacularly. Business is
| not only about technical know-how.
| surfingdino wrote:
| Microsoft. They are protecting their investment.
| ocodo wrote:
| Altman has proven time and again that he is little more
| than a huckster wrt technology, and in business he is a
| stone-cold shark.
|
| A conman, plain and simple.
| wraptile wrote:
| Not going to lie, he had me. He appeared very genuine and
| fair in almost all media he appeared in, like podcasts,
| but many of his actions are just so hard to justify.
| svachalek wrote:
| He has a certain charm and seeming sincerity when he
| talks. But the more I see of him, the more disturbing I
| find him -- he combines the Mark Zuckerberg stare with
| the Elizabeth Holmes vocal fry.
| jesterson wrote:
| So do all psychopaths, don't they?
| johnnyanmac wrote:
| CEOs have been found in studies to have a
| disproportionately higher rate of psychopathy. So there's
| a little correlation. You don't get to the top of a
| company in this kind of society without having some
| inherent charm (assuming you aren't simply inheriting
| billions from a previous generation).
| polotics wrote:
| Do you have a link to a video of Altman's voice shifting
| from controlled deep to nasal? The videos of Elizabeth
| Holmes not being able to keep up with the faked deep tone
| are textbook-worthy...
| kristiandupont wrote:
| I have exactly the same feeling as I think you do. When
| you reach the levels of success he has, there will always
| be people screaming that you are incompetent, evil and
| every other negative adjective under the sun. But he
| genuinely seemed to care about doing the right thing. But
| this is just so lacking in basic morals that I have to
| conclude that I was wrong, at least to an extent.
| wraptile wrote:
| I feel that this is a classic tale of success getting to
| you. It almost feels like it's impossible to be
| successful at this level and remain true. At least, I
| haven't seen it yet.
| lawn wrote:
| You'd think that Worldcoin would be enough proof of what
| he is but I guess people missed that memo.
| JoRyGu wrote:
| Because of course he's got a crypto grift going.
| Shocking.
| ben_w wrote:
| Much as I dislike crypto, that's more of "having no sense
| of other people's privacy" (and hubris) than general
| scamminess.
|
| It's a Musk-error not an SBF-error. (Of course, I do
| realise many will say all three are the same, but I think
| it's worth separating the types of mistakes everyone
| makes, because _everyone_ makes mistakes, and only two of
| these three also did useful things).
| pwdisswordfishc wrote:
| > that's more of "having no sense of other people's
| privacy"
|
| Sufficiently advanced incompetence is indistinguishable
| from malice.
| ben_w wrote:
| It's not particularly advanced, it's the same thing that
| means the supermajority of websites have opted for "click
| here to consent to our 1200 partners processing
| everything you do on our website" rather than "why do we
| need 1200 partners anyway?"
|
| It's still bad, don't get me wrong, it's just something I
| can distinguish.
| lawn wrote:
| It's not just about privacy either.
|
| Worldcoin is centrally controlled making it a classic
| "scam coin". Decentralization is the _only_ unique thing
| about cryptocurrencies, when you abandon decentralization
| all that's left is general scamminess.
|
| (Yes, there's nuance to decentralization too but that's
| not what's going on with Worldcoin.)
| ben_w wrote:
| True decentralisation is part of the problem with
| cryptocurrencies and why they can't work the way the
| advocates want them to.
|
| Decentralisation allows trust-less assurance that money
| is sent, it's just that's not useful because the goods or
| services for which the money is transferred still need
| _either_ trust _or_ a centralised system that can undo
| the transaction because fraud happened.
|
| That's where smart contracts come in, which I also think
| are a terrible idea, but do at least deserve a "you
| tried!" badge, because they're as dumb as saying "I will
| write bug-free code" rather than as dumb as "let's build
| a Dyson swarm to mine exactly the same amount of
| cryptocurrency as we would have if we did nothing".
| lawn wrote:
| > Decentralisation allows trust-less assurance that money
| is sent
|
| That is indeed something it does.
|
| But it also gives you the assurance that a single entity
| can't print unlimited money out of thin air, which is the
| case with a centrally controlled currency like Worldcoin.
|
| They can just shrug their shoulders and claim that all
| that money is for the poor and gullible Africans that had
| their eyeballs scanned.
| ben_w wrote:
| > But it also gives you the assurance that a single
| entity can't print unlimited money out of thin air, which
| is the case with a centrally controlled currency like
| Worldcoin.
|
| Sure, but the _inability_ to do that when needed is also
| a bad thing.
|
| Also, single world currencies are (currently) a bad
| thing, because when _your_ bit of the world needs to
| devalue its currency is generally different to when
| _mine_ needs to do that.
|
| But this is why economics is its own specialty and not
| something that software nerds should jump into as if our
| experience with numbers counts for much :D
| mlindner wrote:
| I'm glad more people are thinking this. It's amazing that
| he got his way back into OpenAI somehow. I said as much,
| that he shouldn't go back to OpenAI, and got downvoted
| universally both here and on Reddit.
| m000 wrote:
| Aspiring technofeudalist.
| beefnugs wrote:
| The whole technology is based on fucking over artists;
| who didn't expect this exact thing?
| surfingdino wrote:
| It's not just the artists, anything you do in the digital
| realm and anything that can be digitised is fair game. In
| the UK, NHS GP practices refuse to register you to see a
| doctor even when it's urgent and tell you to use a third-
| party app to book an appointment. You have to use your
| phone to take photos of the affected area and provide
| personal info. I fully expect that data to be fed into
| some AI and sold without me knowing and without a process
| for removal of data should the company go bust. It is
| preying on the vulnerable when they need help.
| KineticLensman wrote:
| Last time I booked a blood test it was via the official
| NHS app, not a third party.
| surfingdino wrote:
| https://www.patientaccess.com/
| 4ndrewl wrote:
| Important to note that "The NHS" is not a single entity
| and the GP practice is likely a private entity owned in
| partnership by the doctors. There are a number of reasons
| why individual practices can refuse to register.
|
| Take your point about LLMs though.
| surfingdino wrote:
| I went to see my GP and the lady at the reception told me
| they no longer book visits at the reception and I had to
| use the app. Here's the privacy policy
| https://support.patientaccess.com/privacy-policy They
| reserve the right to pass your data to third-party
| contractors and to use it for marketing purposes. There
| is the obligatory clause regarding the right to be
| forgotten, but the AI companies claim it is impossible to
| implement.
| 4ndrewl wrote:
| I didn't read that as reserving the right - looks like a
| standard dpia that is opt-in and limited.
|
| However, GP practices are essentially privatised - so you
| do have the right to register at another practice.
| jjgreen wrote:
| App? What's an app?
|
| It's a thing you put on your phone
|
| I don't have a phone
|
| Well, we can't register you
|
| You don't accept people who don't have phones? Could I
| have that in writing please, ..., oh, your signature on
| that please ...
| choppaface wrote:
| Altman doesn't want to be part of regulation. sama wants to
| be the next TK. He wants to be _above_ regulation, and he
| wants to spend Microsoft's money getting there.
|
| E.g. flying Congress to Lake Como for an off-the-record
| "discussion" https://freebeacon.com/politics/how-the-aspen-
| institute-help...
| trustno2 wrote:
| Altman wants to be a part of AI regulation in the same way
| Bankman Fried wanted to be a part of cryptocurrency
| regulation.
| gds44 wrote:
| What's really interesting about our timeline is that when
| you look at the history of market capture in Big Oil,
| Telco, Pharma, Real Estate, Banks, Tobacco, etc., all the
| lobbying, bribing, and competition-killing used to be done
| behind the scenes within elite circles.
|
| The public hardly heard from or saw the management of
| these firms in the media until shit hit the fan.
|
| Today it feels like management is in the media every 3
| hours trying to capture the attention of prospective
| customers, investors, employees, etc., or they lose out
| to whoever is out there capturing more attention.
|
| So false and contradictory signalling is easy to see.
| Hopefully out of all this chaos we get a better class of
| leaders, not a better class of panderers.
| hoseja wrote:
| So great to have Twitter, where the narcissistic
| psychopaths can't resist revealing themselves for clout.
| askl wrote:
| I always had trouble telling apart those two Sams. Turns
| out they're the same person.
| belter wrote:
| This whole exchange from 1:04:53 to 1:10:22 takes on a
| whole different meaning...
|
| https://youtu.be/P_ACcQxJIsg?t=3891
| akudha wrote:
| What is so special about her voice? They could've found a
| college student with a sweet voice and offered to pay her
| tuition in exchange for using her voice, no? Or a voice
| actor?
|
| Why be cartoonishly stupid, and a cartoonish arsehole, and
| steal a celebrity's voice? Did he think Scarlett wouldn't
| find out? Or object?
|
| I don't understand these rich people. Is it their hobby to
| be a dick to as many people as they can, for no reason
| other than their amusement? Just plain weirdos
| meat_machine wrote:
| Scarlett voiced Samantha, an AI in the movie "Her"
|
| Considering the movie's 11 years old, it's surprisingly
| on-point with depictions of AI/human interactions,
| relations, and societal acceptance. It does get a bit
| speculative and imaginative at the end though...
|
| But I imagine that movie did/does spark the imagination
| of many people, and I guess Sam just couldn't let it go.
| mike_hearn wrote:
| It's not just that. Originally the AI voice in Her was
| played by someone else, but Spike Jonze felt strongly
| that the movie wasn't working and recast the part with
| Johansson. The movie immediately worked much better and
| became a sleeper hit. Johansson just has a much better-
| fitting voice and higher skill in voice acting for this
| kind of role, to the extent that it may have been a
| make-or-break choice for the movie. It isn't a surprise
| that after having created the exact tech from the movie,
| OpenAI wanted it to have the same success that Jonze had
| with his character.
|
| It's funny that just seven days ago I was speculating
| that they deliberately picked someone whose voice is very
| close to Scarlett's and was told right here on HN, by
| someone who works in AI, that the Sky voice doesn't sound
| anything like Scarlett and it is just a generic female
| voice:
|
| https://news.ycombinator.com/item?id=40343950#40345807
|
| Apparently... not.
| sage76 wrote:
| > Is it their hobby to be a dick to as many people as
| they can, for no reason other than their amusement? Just
| plain weirdos
|
| They seem to love "testing" how much they can bully
| someone.
|
| I remember a few experiences where someone responded by
| being an even bigger dick, and they disappeared fast.
| xinayder wrote:
| > The thing about the situation is that Altman is willing
| to lie and steal a celebrity's voice for use in ChatGPT.
| What he did, the timeline, everything - is sleazy if, in
| fact, that's the story.
|
| Correction: the thing about this whole situation with
| OpenAI is they are willing to steal everything for use in
| ChatGPT. They trained their model with copyrighted data
| and for some reason they won't delete the millions of
| protected works they used to train the AI model.
| chx wrote:
| Altman is a known conman. Surely you are aware of Yishan
| Wong describing how Sam Altman and the Reddit founders
| conned Conde Nast https://reddit.com/r/AskReddit/comments/3
| cs78i/whats_the_bes...
| sirsinsalot wrote:
| Wow, Altman in the replies there:
|
| > Cool story bro.
|
| > Except I could never have predicted the part where you
| resigned on the spot :)
|
| > Other than that, child's play for me.
|
| >Thanks for the help. I mean, thanks for your service as
| CEO.
| barbariangrunge wrote:
| Everyone is so mad about them stealing a beloved celebrity's
| voice. What about the millions of authors and other creators
| whose copyrighted works they stole to create works that
| resemble and replace those people? Not famous enough to
| generate the same outrage?
| surfingdino wrote:
| Welcome to the world where the "fuck the creatives" brigade
| wants everything for free.
| creato wrote:
| I think the unique thing about this case is not
| specifically the "voice theft", but that OpenAI
| specifically asked for permission and were denied, which
| eliminates most of the usual plausible deniability that
| gets trotted out in these cases.
| al_borland wrote:
| I had to go look at what voice I picked once I heard the
| news, it was Sky. I listened to them all and thought it
| sounded the best. I didn't make any connection to her (Scar
| Jo or the movie) when going through the voices, but I wasn't
| listening for either. I don't think I know her voice well
| enough to pick it out of a group like that.
|
| Maybe I liked it best because it felt familiar, even if I
| didn't know why. I'm a bit disappointed now that she didn't
| sign on officially, but my guess is that Altman just burned
| his bridge to half of Hollywood if he is looking for a plan
| B.
| npunt wrote:
| When people cheat on (relatively) small things, it's usually an
| indication they'll cheat on big things too
| iosjunkie wrote:
| I would love to see the providence of their training data.
| ojbyrne wrote:
| I think you want the word "provenance."
| blackeyeblitzar wrote:
| We need laws where companies are forced to reveal source of
| personal data. Like how did XYZ company get my contact info
| to spam me?
| nwoli wrote:
| OpenAI only hires for, and is built on, a culture that
| treats data and copyright as somehow free for the taking;
| otherwise they would have zero way to make a profit or
| "build agi".
| slg wrote:
| Which is what makes me wonder if this might grow into a
| galvanizing event for the pro-creator protests against these
| AI models and companies. What happened here isn't
| particularly unique to voices or even Scarlett Johansson, it
| is just how these companies and their products operate in
| general.
| bakuninsbart wrote:
| I think the only way for these protests to get really
| tangible results is if we reach a ceiling in LLM
| capabilities. The technology on its current trajectory is
| simply too valuable in both economic and military
| applications to pull out of, and "overregulation" can be
| easily swatted away by citing national security concerns
| with regard to China. As far as I know, China has
| significantly stricter data and privacy regulations than
| the US when it comes to the private sector, but these
| probably count for little when it comes to the PLA.
| andy_ppp wrote:
| We have almost run out of training data already, so I'm
| not convinced they will suddenly get massively more
| generalised. If you give them reasoning tasks they
| haven't seen before, LLMs absolutely fall apart and
| produce essentially gibberish. They are currently search
| engines that give you one extremely good result that you
| can refine up to a point; they are not thinking, even
| though there's a little bit more understanding than the
| search engines of the past.
| ncr100 wrote:
| Stealing someone's identity is indeed one of those "big
| things".
| sneak wrote:
| Impersonating someone's voice isn't stealing anything, and
| certainly not their identity.
| MrFoof wrote:
| 30+ years of established case precedent disagree with
| you:
|
| http://law2.umkc.edu/faculty/projects/ftrials/communicati
| ons...
|
| https://casetext.com/case/waits-v-frito-lay-inc
| iainctduncan wrote:
| If they are a celebrity actor it sure is.
| LewisVerstappen wrote:
| How did they even cheat here?
|
| OpenAI did nothing wrong.
|
| The movie industry does the same thing all the time. If an
| actor/actress says no, then you find someone else who can
| play the same role.
| tjmc wrote:
| Nothing? If you're acting like the sea witch in "The Little
| Mermaid" you're probably doing something wrong.
| ramenbytes wrote:
| Key difference here is that Scarlett still has her voice.
| ramenbytes wrote:
| I don't think that's quite the same. Are they going out and
| hiring impersonators of the actors who declined the role or
| digitally enhancing the substitute to look like them? That
| seems closer to what happened here.
| falloutx wrote:
| If they are so "Open" they should reveal their training
| data which created this voice. I am sure it is just movie
| audio from S. Johansson's movies.
| sneak wrote:
| Who cheated whom? Out of what?
| og_kalu wrote:
| - Two days before the GPT-4o launch, they contacted her agent
| and asked that she reconsider. (Two days! This means they
| already had everything they needed to ship the product with
| Scarlett's cloned voice.)
|
| The new voice mode is a speech-predicting transformer.
| "Voice cloning" could be as simple as appending a sample
| of the voice to the context and instructing the model to
| imitate it.
| jprete wrote:
| If they really did that then (A) it's not much better (B)
| they didn't even wait for an answer from Johansson (C) it's
| extraordinarily reckless to go from zero to big-launch
| feature in less than two days.
| og_kalu wrote:
| >(A) it's not much better
|
| OP seems to be on the "they secretly trained on her voice"
| train. The only reason "Two days!" would be damning is if
| a finetune was required to replicate ScarJo's voice. In
| that sense, it's much better.
|
| >(C) it's extraordinarily reckless to go from zero to big-
| launch feature in less than two days.
|
| OpenAI has launched nothing. There's no date for the new
| voice mode other than "alpha testing in the coming weeks
| to Plus users". No one has access to it yet.
| lolinder wrote:
| I'm really confused by this claim. How is it that so many
| people tried this Sky voice if no one has access yet?
| og_kalu wrote:
| There is a voice mode in the GPT app that's been out
| (even for free users) for nearly a year now. There are a
| couple of voices to choose from and Sky was one of them.
|
| This mode works entirely differently from what OpenAI
| demoed a few days ago (the new voice mode), but both seem
| to utilize the same base Sky voice. All this uproar is
| from the demos of new Sky, which sounds like old Sky but
| is a lot more emotive, laughs, is a bit flirty, etc.
| tnias23 wrote:
| Idk the voice in the 4o demo and the existing Sky voice
| seemed quite different to me. And Scarlett's letter says
| she and her friends were shocked when they heard the 4o
| demo. This whole situation is about the newly unveiled
| voice in the demo. It's different.
| jonpo wrote:
| Sky voice is old
| Havoc wrote:
| > OpenAI yanked the voice from their product line.
|
| Still live for me? Unless the Sky I'm getting is a different
| one?
| cjbillington wrote:
| It is. They didn't remove the UI option, they just swapped it
| out under the hood for the "juniper" voice.
| __loam wrote:
| The tweet is so fucking brazen lol
| RCitronsBroker wrote:
| Yeah, that was just poking the hornets' nest. Even if I
| wasn't mad enough to make a stink over my voice before,
| plausible deniability and all, that would've sealed the
| deal for me.
| sangupta wrote:
| With the recent departures at OpenAI it seems that all
| ethics and morals are going down the drain and OpenAI is
| becoming the big bully.
| hyperhopper wrote:
| There were never any. None of the models or code are
| actually open. It claims to be a nonprofit but is
| effectively a for-profit company pulling the strings of a
| nonprofit just to avoid taxes.
| sangupta wrote:
| I guess you are right. The era of "don't be evil" if there
| ever was one, is now long gone and forgotten.
| lambdaxyzw wrote:
| >There were never any.
|
| This is a bit unfair. Some people left OpenAI on ethical
| grounds, because they were unsatisfied with how this
| supposed nonprofit operates. The ethics was there, but
| OpenAI got rid of it.
| rvz wrote:
| Almost as if they knew that they cloned her voice without her
| permission.
|
| Don't hear any arguments on how this is fair-use. (It isn't)
|
| Why? Because everyone (including OpenAI) knows it clearly isn't
| fair-use even after pulling the voice.
| dragonwriter wrote:
| > Don't hear any arguments on how this is fair-use.
|
| > Why?
|
| Because it's a right of personality issue, not copyright, and
| there is no fair use exception (the definition of the tort is
| already limited to a subset of commercial use which makes the
| Constitutional limitation that fair use addresses in
| copyright not relevant, and there is no statutory fair use
| exception to liability under the right of personality.)
| kragen wrote:
| you're doing god's work; ignorance is never-ending
| fakedang wrote:
| Please tell us about the time you most successfully hacked some
| (non-computer) system to your advantage.
| IncreasePosts wrote:
| I wouldn't necessarily call that damning. "Soundalikes" are
| very common in the ad industry.
|
| For example, a car company approached the band Sigur Rós
| to include some of their music in a car commercial. Sigur
| Rós declined. A few months later the commercial aired
| with a song that sounds like an unreleased Sigur Rós
| song, but really they just paid a composer to make
| something that sounds like Sigur Rós but isn't. So maybe
| OpenAI just had a random lady with a voice similar to
| Scarlett's do the recording.
|
| Taking down the voice could just be concern for bad press, or
| trying to avoid lawsuits regardless of whether you think you
| are in the right or not. Per this* CNN article:
|
| > Johansson said she hired legal counsel, and said OpenAI
| "reluctantly agreed" to take down the "Sky" voice after her
| counsel sent Altman two letters.
|
| So, Johansson's lawyers probably said something like "I'll sue
| your pants off if you don't take it down". And then they took
| it down. You can't use that as evidence that they are guilty.
| It could just as easily be the case that they didn't want to go
| to court over this even if they thought they were legally above
| board.
|
| * https://www.cnn.com/2024/05/20/tech/openai-pausing-flirty-
| ch...
| dragonwriter wrote:
| > I wouldn't necessarily call that damning. "Soundalikes" are
| very common in the ad industry.
|
| As are disclaimers that celebrity voices are impersonated
| when there is additional context which makes it likely that
| the voice would be considered something other than a mere
| soundalike, like direct reference to a work in which the
| impersonated celebrity was involved as part of the same
| publicity campaign.
|
| And liability for commercial voice appropriation, even by
| impersonation, is established law in some jurisdictions,
| including California.
| IncreasePosts wrote:
| The most famous case of voice appropriation was Midler vs
| Ford, which involved Ford paying a Midler impersonator to
| perform a _well known Midler song_, creating the
| impression that it was actually Bette.
|
| Where are the signs or symbols tying Scarlett to the
| OpenAI voice? I don't think a single-word, contextless
| message on a separate platform that 99% of OpenAI users
| will not see is significant enough to form that
| connection in users' heads.
| pseudalopex wrote:
| The Midler v. Ford decision said her voice was
| distinctive. Not the song.
|
| The replies to Altman's message showed readers did
| connect it to the film. And people noticed the voice
| sounded like Scarlett Johansson and connected it to the
| film when OpenAI introduced it in September.[1]
|
| How do you believe Altman intended people to interpret
| his message?
|
| [1] https://www.reddit.com/r/ChatGPT/comments/177v8wz/i_h
| ave_a_r...
| romwell wrote:
| Sorry, that's an apples-to-pizzas comparison. You're
| conflating _work_ and _identity_.
|
| There's an ocean of difference between mimicking the style
| of someone's art in an original work, and literally
| _cloning someone's likeness_ for marketing/business
| reasons.
|
| You can hire someone to make art in the _style_ of Taylor
| Swift; that's OK.
|
| You _can't_ start selling _Taylor Swift figurines_ by the
| same principle.
|
| What Sam Altman did, figuratively, was giving out free
| T-Shirts featuring a face that is recognized as Taylor Swift
| by anyone who knows her.
| IncreasePosts wrote:
| But they aren't doing anything with her voice (allegedly?).
| They're doing something with a voice that some claim sounds
| like hers.
|
| But if it isn't, then it is more like selling a figurine
| called Sally that happens to look a lot like Taylor Swift.
| Sally has a right to exist even if she happens to look like
| Taylor Swift.
|
| Has there ever been an up and coming artist who was not
| allowed to sell their own songs, because they happened to
| sound a lot like an already famous artist? I doubt it.
| airstrike wrote:
| The detail you're missing is that those who claim it
| sounds like "her" include the CEO of the company.
| romwell wrote:
| TL;DR: This question had already been settled in 2001
| [3]:
|
| The court determined that Midler should be compensated
| for the misappropriation of her voice, holding that, when
| "a distinctive voice of a professional singer is widely
| known and _is deliberately imitated in order to sell a
| product_ , the sellers have appropriated what is not
| theirs and have committed a tort in California."
|
| I hope there's going to be no further hypotheticals after
| this.
|
| -----
|
| >They're doing something with a voice that some claim
| sounds like hers.
|
| Yes, that's what a _likeness_ is.
|
| If you start using your own paintings of Taylor Swift in
| a product without her permission, you'll run afoul of the
| law, even though your _painting_ is obviously _not the
| actual Taylor Swift_, and you painted it _from memory_.
|
| >But if it isn't, then it is more like selling a figurine
| called Sally that happens to look a lot like Taylor
| Swift. Sally has a right to exist even if she happens to
| look like Taylor Swift.
|
| Sally has a right to _exist_, not the right to be
| _distributed_, _sold_, and otherwise _used for
| commercial gain_ without Taylor Swift's permission.
|
| California Civil Code Section 3344(a) states:
|
| _Any person who knowingly uses another's name, voice,
| signature, photograph, or likeness, in any manner, on or
| in products, merchandise, or goods, or for purposes of
| advertising or selling, or soliciting purchases of,
| products, merchandise, goods or services, without such
| person's prior consent, or, in the case of a minor, the
| prior consent of his parent or legal guardian, shall be
| liable for any damages sustained by the person or persons
| injured as a result thereof._
|
| Note the word "likeness".
|
| Read more at [1] on Common Law protections of identity.
|
| >Has there ever been an up and coming artist who was not
| allowed to sell their own songs, because they happened to
| sound a lot like an already famous artist? I doubt it.
|
| Wrong question.
|
| Can you give me an example of an artist who was
| _allowed_ to do a close-enough impersonation _without
| explicit approval_?
|
| No? Well, now you know a good reason for that.
|
| Tribute bands are legally in a grey area [2], for that
| matter.
|
| [1] https://www.dmlp.org/legal-guide/california-right-
| publicity-...
|
| [2] https://lawyerdrummer.com/2020/01/are-tribute-acts-
| actually-...
|
| [3] https://repository.law.miami.edu/cgi/viewcontent.cgi?
| article...
| spuz wrote:
| The damning part is that they tried to contact her and get
| her to reconsider their offer only 2 days before the model
| was demoed. That tells you that at the very least they felt
| either a moral or a legal obligation to get her to agree to
| their release of the model.
| IncreasePosts wrote:
| Or, they wanted to be able to say, yes, that is "her"
| talking to you.
|
| I have no idea if they really used her voice, or it is a
| voice that just sounds like her to some. I'm just saying
| openai's behavior isn't a smoking gun.
| johnnyanmac wrote:
| > a reference to an object or fact that serves as
| conclusive evidence of a crime or similar act, just short
| of being caught in flagrante delicto.
|
| If this isn't a smoking gun, I don't know what is.
|
| I think people forget the last part of the definition,
| though. A smoking gun is about as close as you get
| without having objective, non-doctored footage of the
| act. There's a small chance the gun is a red herring, but
| it's still suspicious.
| ProjectArcturis wrote:
| Case law says no.
|
| There have been several legal cases where bands have sued
| advertisers for copying their distinct sound. Here are a few
| examples:
|
| The Beatles vs. Nike (1987): The Beatles' company, Apple
| Corps, sued Nike and Capitol Records for using the song
| "Revolution" in a commercial without their permission. The
| case was settled out of court.
|
| Tom Waits vs. Frito-Lay (1988): Tom Waits sued Frito-Lay for
| using a sound-alike in a commercial for their Doritos chips.
| Waits won the case, emphasizing the protection of his
| distinct voice and style.
|
| Bette Midler vs. Ford Motor Company (1988): Although not a
| band, Bette Midler successfully sued Ford for using a sound-
| alike to imitate her voice in a commercial. The court ruled
| in her favor, recognizing the uniqueness of her voice.
|
| The Black Keys vs. Pizza Hut and Home Depot (2012): The Black
| Keys sued both companies for using music in their
| advertisements that sounded remarkably similar to their
| songs. The cases were settled out of court.
|
| Beastie Boys vs. Monster Energy (2014): The Beastie Boys sued
| Monster Energy for using their music in a promotional video
| without permission. The court awarded the band $1.7 million
| in damages.
| mycologos wrote:
| Sucks that he had to do it, but the notion of Tom Waits
| making _Rain Dogs_ and then pivoting to spending a bunch of
| time thinking about Doritos must be one of the funnier
| quirks of music history.
| IncreasePosts wrote:
| Disregarding the cases settled out of court (which have
| nothing to do with case law):
|
| 1) Tom Waits vs Frito-Lay: Frito-Lay not only used a
| soundalike to Tom Waits, but the song they created was
| extremely reminiscent of "Step Right Up" by Waits.
|
| 2) Bette Midler vs. Ford Motor Company: Same thing - this
| time Ford literally had a very Midler-esque singer sing an
| exact Midler song.
|
| 3) Beastie Boys vs. Monster Energy: Monster literally used
| the Beastie Boys' music, because someone said "Dope!" when
| watching the ad and someone at Monster took that to mean
| "Yes you can use our music in the ad".
|
| Does Scarlett Johansson have a distinct enough voice that
| she is instantly recognizable? Maybe, but, well, not to me.
| I had no clue the voice was supposed to be Scarlett's, and
| I think a _lot_ of people who heard it didn't think so
| either.
| Cheer2171 wrote:
| > Does Scarlett Johansson have a distinct enough voice
| that she is instantly recognizable?
|
| It absolutely is if you've seen /Her/. It even nails her
| character's borderline-flirty cadence and tone in the
| film.
| underlogic wrote:
| Actually until recently I thought the voice actor for
| "her" was Rashida Jones
| locusofself wrote:
| I definitely thought "Sky" was Rashida Jones. I still do.
| Y_Y wrote:
| I've seen _Her_ and the similarity of the voice didn't
| occur to me until I read about it. I guess it wasn't
| super distinct in the movie. Maybe if they'd had
| Christopher Walken or Shakira or someone with a really
| distinctive sound it would have been more memorable and
| noticeable to me.
| pseudalopex wrote:
| The Midler v. Ford decision said her voice was
| distinctive. Not the song.
| parpfish wrote:
| And in an interesting coincidence: ScarJo recorded a Tom
| Waits cover album in 2008
| minimaxir wrote:
| Given the timeline, I'm still baffled Sam Altman tweeted "her."
| That just makes plausible deniability go away for a random
| shitpost.
| prepend wrote:
| I thought it was about functionality more than the specific
| voice.
| minimaxir wrote:
| I suspect that'll be OpenAI's defense.
| krisoft wrote:
| That is what discovery is for. If this ever gets to that
| phase.
|
| Someone from OpenAI hired the agency that hired the voice
| talent (or talents) for the voice data. They sent them a
| brief explaining what they were looking for, followed by
| a metric ton of correspondence over samples and contracts
| and such.
|
| If anywhere during those written communications anyone
| wrote "we are looking for a ScarlettJ imitator", or words
| to that effect, that is not good for OpenAI. Similarly if
| they were selecting between options and someone wrote
| that one sample is more Johansson than another. Or if
| anyone at any point asked if they should clear the rights
| to the voice with Johansson.
|
| Those are the discovery findings which can sink such a
| defense.
| prepend wrote:
| I recall the basics from my contracts law class that it's
| not against the law to hire an impersonator as long as
| you don't claim it's the celebrity.
|
| So it's legal to hire someone who sounds like SJ. And
| likely legal to create a model that sounds like her. But
| there will likely need to be some disclaimer saying it's
| not her voice.
|
| I expect that OpenAI's defense will be something like "We
| wanted SJ. She said no, so we made a voice that sounded
| like her but wasn't her." It will be interesting to see
| what happens.
| pseudalopex wrote:
| Bette Midler and Tom Waits won cases where the companies
| didn't claim the impersonators were them.
| unraveller wrote:
| Discovery works both ways. The original Her voice
| actress[1] was recast to someone more SoCal in post-
| production, so there is evidence of the flirty erotic AI
| style itself not being a unique enough selling point.
|
| It will come down to what makes the complaining
| celebrity's voice iconic, which for Scarjo is the
| 'gravelly' bit. Which smooth Sky had none of.
|
| [1] actress reading poem:
| https://www.youtube.com/watch?v=eWEEAjRFJKc
| krisoft wrote:
| > Discovery works both ways.
|
| Ok? What materials would you suspect discovery can
| uncover from Scarlett or her team?
|
| > was recast to someone more SoCal in post-production
|
| Was recast to Scarlett Johansson. Hardly a good argument
| if you want to argue that her voice is not unique.
| jrflowers wrote:
| What part of the functionality from the movie Her did you
| think it meant?
| brown9-2 wrote:
| Some people are just addicted to posting
| mvdtnz wrote:
| The same egomaniac tendencies that cause people like Elon
| Musk or Paul Graham to post the first dumbass thing that
| comes to their mind because they think everyone absolutely
| has to see how smart and witty they are.
| spuz wrote:
| It's also worth noting that Sam Altman admitted that he had
| only used GPT-4o for _one week_ before it was released. It's
| possible that in the rush to release before Google's I/O
| event, they realised the likeness of the voice to Scarlett
| Johansson way too late, hence the last-minute contact with
| her agent.
|
| https://www.youtube.com/watch?v=fMtbrKhXMWc
| emsign wrote:
| Then asking Johansson for permission months before was pure
| coincidence?
| mrbungie wrote:
| The "Sky" voice and it's likeness to SJo's have been there in
| the ChatGPT app for months.
| LewisVerstappen wrote:
| They approached Johansson and she said no. They found another
| voice actor who sounds slightly similar and paid her instead.
|
| The movie industry does this all the time.
|
| Johansson is probably suing them so they're forced to remove
| the Sky voice while the lawsuit is happening.
|
| I'm not a fan of Sam Altman or OpenAI but they didn't do
| anything wrong here.
| falloutx wrote:
| Then they should credit that actress and we can see if it's
| legit; otherwise we'll believe they used copyrighted audio from
| S. Johansson's movies.
| zombiwoof wrote:
| Smug Silicon Valley entrepreneur. Sam is a trash human
| cjbgkagh wrote:
| AFAIK they yanked it pretty quickly and the subsequent scandal
| has widely informed people that it was not authorized by
| Scarlett Johansson. So while it was clearly a violation
| resulting from a sequence of very stupid decisions by OpenAI, I
| am not sure if there would be much in the way of damages.
| BeefWellington wrote:
| In cases like this, don't damages essentially equate to the
| profit a company makes from the false association with the
| celebrity?
|
| Otherwise, it'd be impossible to show damages if you weren't
| personally being denied business because of the association.
| ml-anon wrote:
| Johansson is rich. The real value she could get from this
| would be as an advocate for the rights of creatives,
| performers and rights holders in the face of AI. If this goes
| to discovery OpenAI is done.
|
| How much do you think Disney or Universal Music or Google or
| NYT would give to peek inside OpenAI's training mixture to
| identify all the infringing content?
| spullara wrote:
| The voice was shipped last september.
| burntalmonds wrote:
| Do you know if the voice was the same back in september?
| tnias23 wrote:
| Scarlett says she was shocked to hear the voice in the 4o
| demo, and they had requested her consent (for the 2nd time)
| 2 days prior to the demo. If that demo voice was the same
| as the existing Sky voice, this wouldn't be happening.
| exitb wrote:
| The voices were available for some time for the ChatGPT TTS
| model, but it seems that they reused them for the 4o audio
| output, which sounds significantly more human-like. I've
| heard the Sky voice before and never made the connection. I
| did think of Johansson though during the live demo, as the
| voice + enhanced expressiveness made it sound much like the
| movie Her.
| arvinsim wrote:
| Does it really cost a lot to train one voice?
|
| Seems pretty reckless to not have alternatives just in case
| Scarlett refused.
| numpad0 wrote:
| Probably 5-10 minutes worth of dataset and GPU time for
| finetuning on an existing base model. Could be done on a Blu-
| ray rip or an in-person audition recording, legality and
| ethics aside.
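| For a rough sense of scale, here's a minimal Python sketch
| (standard library only) that just checks whether a folder of
| WAV clips adds up to the 5-10 minutes mentioned above. The
| folder name is made up, and the fine-tuning step is left as a
| commented placeholder because it depends entirely on whichever
| base model and toolkit you assume.
|
|   import wave
|   from pathlib import Path
|
|   # Hypothetical folder holding the ripped/recorded WAV clips.
|   CLIP_DIR = Path("voice_clips")
|
|   def clip_seconds(path: Path) -> float:
|       """Duration of one WAV file, in seconds."""
|       with wave.open(str(path), "rb") as wav:
|           return wav.getnframes() / wav.getframerate()
|
|   total_min = sum(clip_seconds(p)
|                   for p in CLIP_DIR.glob("*.wav")) / 60
|   print(f"Collected {total_min:.1f} minutes of audio")
|   print("Roughly 5-10 minutes?", 5 <= total_min <= 10)
|
|   # The fine-tune itself would be something like the lines
|   # below; both names are placeholders, not a real API.
|   # model = SomeTTSBaseModel.load("base-voice")
|   # model.finetune(CLIP_DIR)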
| ProjectArcturis wrote:
| I'm beginning to think this Sam Altman guy isn't so
| trustworthy.
| disqard wrote:
| You beat me to it.
|
| Bon mots apart, he really appears to have an innate capacity
| for betrayal.
| owlninja wrote:
| And he must be a helluva pitchman, given the weird
| fired/hired debacle. Mixed with some of the resignations
| that made the HN front page recently, apparently anyone
| leaving OpenAI signed away the right to speak. I even find
| it odd that their statement says they hired a voice
| actress, but they want to protect her privacy? Seems like a
| helpful alibi if true, or likely, said actress has signed
| an agreement to never reveal she worked with OpenAI.
| throwaway635383 wrote:
| Was there something in the water? Lots of rumbles around
| the early OpenAI members and questionable behavior.
| 0xDEAFBEAD wrote:
| And perhaps not consistently candid either.
| insane_dreamer wrote:
| that took a while ;)
| nvy wrote:
| It's not possible for me to express the full measure of my
| disdain for Sam Altman without violating the HN guidelines.
| toss1 wrote:
| ChatGPT is way better than to need stupid ripoffs like this.
|
| Sam should be ashamed to have ever thought of ripping off
| _anyone's_ voice, let alone done it and rolled it out.
|
| They are building some potentially world-changing technology,
| but cannot rise above being basically creepy rip-off artists.
| Einstein was right about requiring improved ethics to meet new
| challenges, and also that we are not meeting that requirement.
|
| sad to see
| swiftcoder wrote:
| It's part and parcel of the LLM field's usual disdain for any
| property rights that might belong to other people. What they
| did here is not categorically different than scraping every
| author and visual artist on the internet - but in this
| instance they've gone and brazenly "copied" (read: stolen)
| from one of the few folks with more media clout than they
| themselves have.
| cm2012 wrote:
| People hire celebrity voice impersonators all the time. You've
| heard a few impersonators this month probably from ads. This is
| such a non-issue that's only blowing up because Johansson wrote
| a letter complaining about it and because people love "big tech
| is evil" stories.
| benreesman wrote:
| I know I have a reputation as an OpenAI hater and I understand
| why: it's _maybe_ 5-10% of the time that the news gives me the
| opportunity to express balance on this.
|
| But I've defended them from unfair criticism on more than a few
| occasions and I feel that of all the things to land on them
| about this one is a fairly mundane screwup that could be a
| scrappy PM pushing their mandate that got corrected quickly.
|
| The leadership for the most part scares the shit out of me, and
| clearly a house-cleaning is in order.
|
| But of all the things to take them to task over? There's
| legitimately damning shit _this week_; this feels like someone
| at the mid-level exceeding their mandate, with legal walking it
| back.
| crznp wrote:
| It really doesn't sound like a "mid-level exceeding their
| mandate".
|
| It sounds like Altman was personally involved in recruiting
| her. She said no and they took what they wanted anyway.
| benreesman wrote:
| It feels weird to be defending Altman, but those of us who
| go hard on the legitimately serious shit are held to a high
| standard on being fair and while multiple sources have
| independently corroborated e.g. the plainly unethical and
| dubiously legal NDA shit Vox just reported on, his links to
| this incident seem thinly substantiated.
|
| I'm not writing the guy a pass, he's already been fired for
| what amount to ethics breaches in the last 12 months alone.
| Bad actor any way you look at it.
|
| But I spent enough time in BigCo land to know stuff like
| this happens without the CEO's signature.
|
| I'd say focus on the stuff with clear documentary evidence
| and or credible first-hand evidence, there's no shortage of
| that.
|
| I get the sense this is part of an ambient backlash now
| that the dam is clearly breaking.
|
| Of all the people who stand to be harmed by that leadership
| team, I think Ms. Johansson (of whom I am a fan) is more
| than capable of seeing her rights and privileges defended
| without any help from this community.
| gunsle wrote:
| You're allegedly "not writing the guy a pass" but then
| you go on to do so anyways. If Johansson isn't lying and
| Altman did personally reach out to her, I really don't
| see how you can even attempt to argue this is some middle
| manager gunning for a promotion. In the same way you're
| complaining that she needs no help from the community in
| defending herself, Altman needs no help from you reaching
| this hard. Like how can you not see that hypocrisy?
| crznp wrote:
| If this isn't the thing that makes your blood boil,
| that's fine. The world could probably do with less
| boiling blood, and it is still early, more evidence may
| come out. However, she indicated in her statement that
| Altman asked her, not OpenAI. It seems credible that he
| would want to be involved.
|
| Both sides of the story feel like we're slowly being
| brought to a boil: Sutskever's leaving feels like it was
| just a matter of time. His leaving causing a mess seems
| predictable. Perhaps I am numb to that story.
|
| But stealing a large part of someone's identity after
| being explicitly told not to? This one act is not the end
| of the world, but feels like an acceleration down a path
| that I would rather avoid.
| om2 wrote:
| There's more evidence of Altman being personally involved
| in this incident than in him being personally involved in
| the OpenAI exit agreement, and he has denied the latter.
| I'm not sure I believe his denial in the latter case.
|
| Having an NDA in exit terms you don't get to see until
| you are leaving, one that claims the ability to claw back
| your vested equity if you don't agree, seems more severely
| unethical, to be sure. But that doesn't mean there's more
| reason to blame it on Altman specifically. Or perhaps you
| take the stance that it reflects on OpenAI and their
| ethics whether or not Altman was personally involved, but
| then the same applies to the voice situation.
| ml-anon wrote:
| Scarlett Johansson literally mentioned that Sam personally
| reached out to her team.
|
| The CTO was on stage presenting the thing and the CEO was
| tweeting about it.
|
| Please explain for us which part of this is happening
| without the CEO's signature.
|
| Of everyone who has been harmed and had their work stolen
| or copyright infringed by Sam's team, Scarlett Johansson
| is the one person (so far) who can actually force the
| issue and a change, and so the community is right to
| rally behind her because if they're so brazen about this,
| it paints a very clear picture of the disdain they hold
| the rest of us in.
| azinman2 wrote:
| I still have the sky voice in my app.
| m_mueller wrote:
| I still have Sky voice. Is it because of my region?
| Aeolun wrote:
| What I don't understand is what they _expected_ to happen?
|
| Apparently they had no confidence in defending themselves, so
| why even release with the voice in the first place?
| unraveller wrote:
| They underestimated how quickly people would take off the
| headphones and jump on the bandwagon to claim affinity with
| an injured celebrity.
|
| Are you suggesting they should have engineered the voice
| actress' voice to be more distinct from another actress they
| were considering for the part? Or just not gone near it with
| a 10ft pole? Because if the latter, the studios can just
| release a new Her and Him movie with different voices in
| different geo regions and prevent anyone from having any kind
| of familiar, engaging voice bot.
| rlt wrote:
| I don't think that's quite right.
|
| OpenAI first demoed and launched the "Sky" voice in November
| last year. The new demo doesn't appear to have a new voice.
|
| I doubt it would take them long to prepare a new voice, and
| who's to say they wouldn't delay the announcements for a ScarJo
| voice?
|
| A charitable interpretation of the "her" tweet would be a
| comparison to the conversational and AI capabilities of the
| product, not the voice specifically, but it's certainly not a
| good look.
| GaggiX wrote:
| I believe that "Sky" voice was first released in September
| last year and according to the blog post released by OpenAI
| they were working with "Sky" voice actress months before even
| contacting Scarlett Johansson for the first time.
| nox101 wrote:
| To each their own. I personally didn't get Scarlett Johansson
| vibes from the voice on the GPT-4o demo
| (https://openai.com/index/hello-gpt-4o/) even though I'm a huge
| fan of hers (loved Her, loved Jojo Rabbit, even loved Lucy, and
| many many others) and have watched those and others multiple
| times. I'd even say I have a bit of a celebrity crush.
|
| To me it's about as close to her voice as saying "It's a
| woman's voice". Not to say all women sound alike but the sound
| I heard from that video above could maybe best be described as
| "generic peppy female American spokesperson voice"
|
| Even listening to it now with the suggestion that it might
| sound like her I don't personally hear Scarlett Johansson's
| voice from the demo.
|
| There may be some damning proof where they find they sampled
| her specifically, but saying they negotiated and didn't come to
| an agreement is not proof that it's supposed to be her voice.
| Again, to me it just sounds like a generic voice. I've used
| the version before GPT-4o and I never got the vibe it was
| Scarlett Johansson.
|
| I did get the "Her" vibe but only because I was talking to a
| computer with a female voice and it was easy to imagine that
| something like "Her" was in the near future. I also imagined or
| wished that it was Majel Barrett from ST:TNG, if only because
| the computer on ST:TNG gave short and useful answers whereas
| ChatGPT always gives long-winded, repetitive, annoying answers.
| GaryNumanVevo wrote:
| OpenAI confirmed it by removing the voice immediately after
| Johansson's lawyers reached out
| nox101 wrote:
| That's not confirmation. That's called prudence.
| dontupvoteme wrote:
| Could they have made it look less like Midler vs Ford?
|
| "Midler was asked to sing a famous song of hers for the
| commercial and refused. Subsequently, the company hired a
| voice-impersonator of Midler and carried on with using the song
| for the commercial, since it had been approved by the
| copyright-holder. Midler's image and likeness were not used in
| the commercial but many claimed the voice used sounded
| impeccably like Midler's."
|
| As a mostly casual observer of AI, even I was aware of this
| precedent.
| _rm wrote:
| It's a real shame she didn't take the deal though
| wnevets wrote:
| An AI company stealing someone elses IP for profit? Unheard of.
| ThinkBeat wrote:
| SJ mentions deep fakes.
|
| It is quite possible that OpenAI has synthesized the voice from
| SJ material.
|
| However, if OpenAI can produce the woman who did the current
| voice, and she has a voice nearly identical to that of SJ, would
| that mean OpenAI had done something wrong?
|
| Does SJ, since she is a celebrity, hold a "patent" on the right
| to sound like her?
|
| The more likely scenario is that they have hired a person and
| told her to try and imitate how SJ sounds.
|
| What is the law on something like that?
| not2b wrote:
| The answer, based on two different court precedents (Bette
| Midler, Tom Waits), is that the company can't do that.
| Companies cannot hire soundalike people to advertise their
| products after the person with a distinctive voice they really
| wanted declined. Doesn't matter if they hired a soundalike and
| used her voice.
| crimsoneer wrote:
| There is a big, big difference between actively and
| intentionally imitating a voice, and having a broadly similar
| voice though!
| crimsoneer wrote:
| The CEO tweeting a jokey reference to your voice really
| doesn't help though.
| talldayo wrote:
| A difference that gets increasingly narrow when you are
| desperately trying to license the original likeness.
| bigiain wrote:
| Yeah but...
|
| Them trying (and failing) to negotiate the rights, and then
| them vaguely attempting again 2 days before launch, and
| fucking Altman tweeting a quite obvious reference to a
| movie in which SJ is the voice of an AI girlfriend - leans
| very very strongly in the direction of "active and
| intentional imitation".
|
| Anybody trying to claim some accidental or coincidental
| similarity here has a pretty serious credibility hole they
| need to start digging themselves out of.
| chipweinberger wrote:
| If they are not trying to trick people into believing xyz
| person did the voice acting, and instead are going for a
| certain style, I think they would be protected by freedom of
| expression. Think of how authors of books describe voices in
| great detail.
|
| i.e. intent matters.
|
| In this case, since the other voice actor has a clearly
| different voice than SJ, it seems like their intent is to
| just copy the general 'style' of the voice, and not SJ's
| voice itself. Speculative though.
| Ar-Curunir wrote:
| > In this case, since the other voice actor has a clearly
| different voice than SJ, it seems like their intent is to
| just copy the general 'style' of the voice, and not SJ's
| voice itself.
|
| How can you say that when they literally approached SJ for
| her voice, and then asked the voice actor to reproduce SJ's
| voice?!
| chipweinberger wrote:
| > and then asked the voice actor to reproduce SJ's
| voice?!
|
| you are just making that up afaict.
|
| > How can you say that when they literally approached SJ
| for her voice
|
| Almost by definition SJ's voice will match the style of
| 'Her', at least for a while (*). So why not ask SJ first?
|
| (*) voices change significantly over time.
| rvz wrote:
| Exactly this.
|
| The fact that Johansson did not give OpenAI permission to
| use her voice, that they then hired a voice actor to copy
| her vocal likeness, and that Altman tweeted a reference to
| the film 'Her', in which Johansson was the starring voice
| actor, tells you that OpenAI intended to clone and use her
| voice even without permission.
|
| OpenAI HAD to pull the voice down to not risk yet another
| lawsuit.
|
| The parent comment clearly has the weakest defense I have
| seen on this discussion.
| prawn wrote:
| SJ's side would use pre-action discovery and trawl through
| internal OpenAI communications to check. If there was any
| suggestion of instruction from management to find a similar
| voice, they'd potentially be in trouble. There's already the
| indication that they wanted her voice through official
| channels.
|
| And would they need to use a voice actor when there is a
| substantial body of movie dialogue and interviews? I'd be
| surprised if they'd bothered.
| ThinkBeat wrote:
| This happens a lot in movies though. If the lead famous actor
| turns down a part, esp in a sequel the film maker will spend
| time deliberately finding an actor that looks a lot like the
| lead star.
|
| (or they rewrite the role)
| cdme wrote:
| Steal everything they possibly can and hope they end up too big
| to kill. I sincerely hope they fail spectacularly.
| chimney wrote:
| Sounds like someone has not been consistently candid with their
| communication.
| causality0 wrote:
| I'm sure Jen Taylor would be open to an offer.
| minimaxir wrote:
| Cortana on Windows is indeed already voiced by Jen Taylor.
|
| https://en.wikipedia.org/wiki/Cortana_(virtual_assistant)
| causality0 wrote:
| That's what I was referencing. That, and the fact Microsoft
| has given up on Cortana so she's probably free.
| 1vuio0pswjnm7 wrote:
| https://nitter.poast.org/pic/orig/media%2FGODgca6bAAAxaPB.jp...
| BadHumans wrote:
| I know there are people here who think you should be able to use
| a person's likeness for whatever but regardless of how you feel,
| I don't think you can disagree this is a pretty bad look and does
| not reflect well on Altman or OpenAI.
| nabla9 wrote:
| Johansson has money to hire lawyers and immediate access to
| media, so they backed off.
|
| Altman and OpenAI will walk over everyone here without any
| difficulty if they decide to take what's ours.
| ecjhdnc2025 wrote:
| I often wonder why tech people think so positively about
| companies they idolise who are Uber-ing their way through
| regulations. Where do they think it stops?
|
| Why would people not want laws? The answer is so they can do
| the things that the laws prevent.
|
| This is POSIWID territory [0]. "The purpose of a system is what
| it does". Not what it repeatedly fails to live up to.
|
| What was the primary investment purpose of Uber? Not any of the
| things it will forever fail to turn a profit at. It was to
| destroy regulations preventing companies like Uber doing what
| they do. That is what it succeeded at.
|
| _The purpose of OpenAI_ is to minimise and denigrate the idea
| of individual human contributions.
|
| [0]
| https://en.wikipedia.org/wiki/The_purpose_of_a_system_is_wha...
| wilg wrote:
| That nobody besides cab medallion owners really liked the
| regulations Uber violated is probably a big part of it.
| nerdponx wrote:
| > I often wonder why tech people think so positively about
| companies they idolise who are Uber-ing their way through
| regulations. Where do they think it stops?
|
| Because they don't think about the consequences, and don't
| want to. Better to retreat into the emotional safety of
| techno-fantasy and feeling like you're on the cutting edge of
| something big and new (and might make some good money in the
| process). Same reason people got into NFTs.
| ecjhdnc2025 wrote:
| > Same reason people got into NFTs.
|
| Same _people_ who got into NFTs.
| mycologos wrote:
| > Because they don't think about the consequences, and
| don't want to.
|
| This is a dangerous way of thinking about people who
| disagree with you, because once you decide somebody is
| stupid, it frees you from ever having to seriously weigh
| their positions again, which is a kind of stupidity all its
| own.
| timeon wrote:
| > because once you decide somebody is stupid
|
| You have just made up an argument. There is no stated or
| implied stupidity.
|
| You can't dismiss critique of carelessness like that.
| rockemsockem wrote:
| But like, Uber gave us taxis on our phones. No taxi company
| was going to do that without a force like Uber making them do
| it.
| ecjhdnc2025 wrote:
| It also gave you cab drivers who don't earn enough to be
| able to replace their vehicles.
|
| You can cheer on "forces" like Uber all you like but I
| would prefer it if progress happened without criminal
| deception:
|
| https://www.theguardian.com/news/2022/jul/10/uber-files-
| leak...
|
| I don't see how anyone can read this and think the uber app
| is a net positive.
| rockemsockem wrote:
| Citation needed on the "drivers who can't replace their
| vehicles" part. Lots of cities in the US have passed laws
| about how much such workers must be paid and I generally
| think the government is the one that should be solving
| that problem.
|
| I read that whole article. I didn't know about the
| intentional strategy to send Uber drivers into likely
| violent situations. That's fucked up.
|
| Most of that article seemed to focus on Uber violating
| laws about operating taxi services though. Sounds good to
| me? Like there's nothing intrinsically morally correct
| about taxi service operation laws. This sort of proves my
| point too. Some company was going to have to fight
| through all that red tape to get app-based taxis working
| and maybe it's possible to do that without breaking the
| law, but if it's easier to just break the law and do it,
| then whatever. I can't emphasize how much I don't care
| about those particular laws being broken and maybe if I
| knew more about them I'd even be specifically happy that
| those laws were broken.
| dclowd9901 wrote:
| There's gotta be a middle ground. The registration system
| was shit and encouraged a ridiculous secondary market
| that kept a lot of people under someone else's thumb too.
|
| Why does everything keep getting worse? Why do people
| keep making less? We need to figure out the answers to
| these questions. And no, nobody here knows them.
| afro88 wrote:
| > POSIWID
|
| You need to be honest about what it actually does then.
| Cherry picking the thing you don't like and ignoring the rest
| will bring you no closer to true understanding
| mateus1 wrote:
| I agree. They clearly have ethics beyond "steal whatever
| data is out there as fast as you can".
| dang wrote:
| Recent and related:
|
| _OpenAI pulls Johansson soundalike Sky's voice from ChatGPT_ -
| https://news.ycombinator.com/item?id=40414249 - May 2024 (96
| comments)
| stavros wrote:
| This is the first I've heard of this, but I've used the "Sky"
| voice extensively and never once thought it sounded like
| Johansson. Has anyone else noticed a similarity? To me they sound
| pretty different, Johansson's voice is much more raspy.
| z7 wrote:
| Looks like you're not the only one who thinks the voices don't
| sound similar:
|
| https://news.ycombinator.com/item?id=40414908
|
| https://news.ycombinator.com/item?id=40414923
|
| https://news.ycombinator.com/item?id=40419791
|
| https://news.ycombinator.com/item?id=40414802
|
| https://news.ycombinator.com/item?id=40414902
|
| https://news.ycombinator.com/item?id=40414713
|
| https://news.ycombinator.com/item?id=40415350
|
| Saw some other posters expressing this view who deleted their
| posts after getting downvoted, lol.
| WrongAssumption wrote:
| I mean, the first post you linked to explicitly says they do
| sound similar. The last one says they don't know what
| Scarlett's voice sounds like to begin with.
| chemmail wrote:
| Bottom line: anyone who watched the movie "Her" will make an
| immediate connection. It also does not help that Sam Xed "her"
| the night before. Pretty much a slam-dunk case.
| stavros wrote:
| Anyone who watched the movie "Her" will make an immediate
| connection with _any_ of the female voices. It's kind of the
| entire point of the film.
| muglug wrote:
| There's two different things: the Sky voice they launched last
| year, as heard here:
| https://www.youtube.com/watch?v=RcgV2u9Kxh0. The voice actor is
| the same, but the intonation is fairly flat.
|
| They changed the voice to _intone_ like Scarlett Johansson's
| character. It's like they changed the song the voice was
| singing to one that lots of people recognise.
| stavros wrote:
| Oh, interesting, I didn't know that, thank you. Do you have
| an example of the changed voice anywhere?
| kromem wrote:
| I don't think that's exactly accurate.
|
| What's likely different is that GPT-4o can output the
| tonality instructions for text to speech now.
|
| It's probably the same voice, but different instructions for
| generation: one was without tonal indicators, one with.
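| As a purely speculative sketch of what such "tonality
| instructions" could look like, imagine the model emitting
| inline tags that a downstream synthesizer strips out and maps
| to style parameters. Everything here (the tag syntax, the tone
| names) is invented for illustration; nobody outside OpenAI
| knows the real mechanism.
|
|   import re
|
|   # Hypothetical model output with inline tone tags.
|   model_output = ("[tone:excited] One! Two! "
|                   "[tone:breathless] Three... four...")
|
|   def split_tone_segments(text: str):
|       """Split text into (tone, segment) pairs on [tone:...] tags."""
|       parts = re.split(r"\[tone:(\w+)\]", text)
|       return [(parts[i], parts[i + 1].strip())
|               for i in range(1, len(parts) - 1, 2)]
|
|   for tone, segment in split_tone_segments(model_output):
|       # A real system would hand (segment, tone) to the TTS here.
|       print(f"{tone:>10}: {segment}")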
| wraptile wrote:
| That's a big stretch. Allowing intonation to be copyrighted
| would be incredibly silly. The timeline and intention make
| sense, but this being "similar" would never win, and it
| shouldn't. Sam should have kept his mouth shut.
| nmeofthestate wrote:
| Yes when the demos first came out I don't remember seeing
| anyone comparing the voice to Scarlett Johansson's. It seems to
| be a meme that's taken hold subsequently with the news about
| OpenAI trying to license her voice.
| sooheon wrote:
| Yes, I feel gaslit by the whole situation
| stavros wrote:
| Yeah, very odd. Don't get me wrong, I'd absolutely love
| Johansson's voice on the thing, but Sky is not it.
| sebzim4500 wrote:
| Maybe I'm crazy but I don't think the voices are even that
| similar. I simply don't believe that her closest friends could
| not tell the difference.
| cdrini wrote:
| I didn't think the old Sky sounded anything like her, but the
| sky they unveiled at the 4o event seemed super similar. While
| watching the event I was genuinely wondering "wait did they
| actually partner with Scarlett Johansson? That's wild!"
|
| This voice: https://x.com/OpenAI/status/1790072174117613963
| rvz wrote:
| Again. After Johansson was approached to be hired for the
| voice and declined, yet another AI company tried to clone her
| voice without her permission.
|
| The degree of similarity doesn't matter. There was nothing
| fair-use about this voice, and that is exactly why OpenAI
| yanked it and indirectly admitted to cloning her voice.
|
| [0] https://news.ycombinator.com/item?id=38154733
| drooby wrote:
| If we're talking about the voice from the "Say hello to GPT-4o",
| then this is clearly not Scarlett J.
|
| They have similar voices, but SJ has more bass and rasp.
|
| And if it's true that OpenAI hired a different actor, then this
| should basically be case closed.
|
| The voice of Sky (assuming that's the same as the demo video),
| sounds like a run of the mill voice actor tbh. Great, but not
| that interesting or unique.
| ml-anon wrote:
| Combined with Murati's reaction when asked if they trained Sora
| on YouTube videos, it's obvious that OpenAI has trained their TTS
| systems on a whole bunch of copyrighted content including the
| output of professional actors and voice actors who definitely
| weren't compensated for their work.
|
| Altman and Murati are world-class grifters but until now they
| were stealing from print media and digital artists. Now they're
| clashing with some of the most litigious industries with the
| deepest pockets. They're not going to win this one.
| meowface wrote:
| You claim to work for either Anthropic or DeepMind (almost
| certainly DeepMind). I'm doubtful their AI products don't use
| people's works in similar ways.
| Havoc wrote:
| Doesn't sound all that similar to me
| worstspotgain wrote:
| Most of the reactions here are in unison, so there's little left
| to contribute in agreement.
|
| I'll ask the devil's advocate / contrarian question: How big a
| slice of the human voice space does Scarlett lay a claim to?
|
| The evidence would be in her favor in a civil court case. OTOH, a
| less famous woman's claim that any given synthesized voice sounds
| like hers would probably fail.
|
| Contrast this with copyrighted fiction. That space is
| dimensionally much bigger. If you're not deliberately trying to
| copy some work, it's very unlikely that you'll get in trouble
| accidentally.
|
| The closest comparison is the Marvin Gaye estate's case.
| Arguably, the estate laid claim to a large fraction of what is
| otherwise a dimensionally large space.
| https://en.wikipedia.org/wiki/Pharrell_Williams_v._Bridgepor...
| jakelazaroff wrote:
| (Not a lawyer) I think the issue is not just that they sound
| similar, but that OpenAI sought to profit from that perceived
| similarity. It's pretty clear from Sam Altman's "her" tweet
| that OpenAI, at least, considers it a fairly _narrow_ slice of
| the human voice space.
| worstspotgain wrote:
| That's true. As I suggested, this case may well be open and
| shut. But what if it was Google's or Meta's voice that
| sounded exactly like Sky, i.e. without a history of failed
| negotiations? The amount of "likeness" would technically be
| identical.
|
| Are companies better off not even trying to negotiate to
| begin with?
| jakelazaroff wrote:
| Not sure legally. I chose the words "perceived similarity"
| intentionally to encompass scenarios in which the
| similarity is coincidental but widely recognized. Even in
| that case, I believe the original person should be entitled
| to a say.
| MrMetlHed wrote:
| It'd be fine to not negotiate if you weren't going to use a
| voice that sounded famous. If I were offering something
| that let you make any kind of voice you want, I would
| definitely not market any voice that sounded familiar in
| any way. Let the users do that (which would happen almost
| immediately after launch). I would use a generic employee
| in the example, or the CEO, or I'd go get the most famous
| person I could afford that would play ball. I would then
| make sure the marketing materials showed the person I was
| cloning and demonstrated just how awesome my tool was at
| getting a voice match.
|
| What I wouldn't do is use anything that remotely sounds
| famous. And I would definitely not use someone that said
| "no thanks" beforehand. And I would under no circumstances
| send emails or messages suggesting staff create a voice
| that sounds like someone famous. Then, and only then, would
| I feel safe in marketing a fake voice.
| worstspotgain wrote:
| Sounds judicious. You probably wouldn't get sued, and
| would prevail if sued. However, the question of how much
| human voice space Scarlett can lay claim to remains
| unsettled. Your example suggests that it might be quite a
| bit, if law and precedent causes people to take the CYA
| route.
|
| Consider the hypothetical: EvilAI, Inc. would secretly
| like to piggyback on the success of Her. They hire Nancy
| Schmo for their training samples. Nancy just happens to
| sound mostly like Scarlett.
|
| No previous negotiations, no evidence of intentions. Just
| a "coincidental" voice doppelganger.
|
| Does Scarlett own her own voice more than Nancy owns
| hers?
|
| Put another way: if you happen to look like Elvis, you're
| not impersonating him unless you also wear a wig and
| jumpsuit. And the human look-space is arguably much
| bigger than the voice-space.
| jakelazaroff wrote:
| I know toying with these edge cases is the "curious" part
| of HN discussions, but I can't help but think of this
| xkcd: https://xkcd.com/1494/
| worstspotgain wrote:
| HN discussions, grad school case studies, and Supreme
| Court cases alike. Bad cases make bad laws, edge cases
| make extensive appeals.
| kelnos wrote:
| > _However, the question of how much human voice space
| Scarlett can lay claim to remains unsettled_
|
| I don't think it's that unsettled, at least not legally.
| There seems to be precedent for this sort of thing (cf.
| cases involving Bette Midler or Tom Waits).
|
| I think the hypothetical you create is more or less the
| same situation as what we have now. The difference is
| that there maybe isn't a paper trail for Johansson to use
| in a suit against EvilAI, whereas she'd have OpenAI dead
| to rights, given their communication history and Altman's
| moronic "Her" tweet.
|
| > _Does Scarlett own her own voice more than Nancy owns
| hers?_
|
| Legally, yes, I believe she does.
| telotortium wrote:
| This is almost an identical case, and resulted in a ruling
| favorable to Midler, whose voice was imitated in that case:
| https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co
| worstspotgain wrote:
| It's indeed pretty similar. However, it involved singing a
| song Midler was known for. This case is at best peripheral to
| the movie Her, in that the OpenAI voice does not recite lines
| from the movie.
| pseudalopex wrote:
| The decision said Midler's voice was distinctive. Not the
| song.
| TaroEld wrote:
| My concern is that cases like this would set the precedent that
| synthetic voices can't be too close to the voice of a real,
| famous person. But where does that leave us? There have been
| lots of famous people since the recording age, and the number
| is only going to increase. It seems unlikely that you can
| distinguish your synthetic voice from every somewhat
| public/famous real voice in existence, especially going
| forward. Won't this result in a situation where synthetic
| voices must either sound clearly fake and non-human, so they
| can't be confused with an existing famous voice, or where
| companies/producers must in every case pay royalties to the
| owner of whichever famous voice sounds close, even if they
| never intended to copy that voice or any like it, just to avoid
| being sued afterwards? Are we going to pay famous people for
| being famous?
| wraptile wrote:
| This sort of copyright seems completely unethical to me.
|
| With 8 billion people, the odds of having a truly unique
| voice and intonation are extremely low. Imagine someone else
| owning your voice, someone much richer and more powerful. No
| entertainment is worth putting fellow human beings through
| such discrimination and cruelty.
| z7 wrote:
| So they wanted Johansson's voice, she declined, they chose
| another voice actress who sounds somewhat similar. Can someone
| explain why that is bad? I don't really get it.
| rangerelf wrote:
| Did they really hire another voice actress? Who?
|
| As for why it's bad: because they put on record that they
| wanted specifically Scarlett Johansson's voice, got declined,
| doubled down, got declined again, and then went ahead and did
| it anyway. They can say in their own defense that it's some
| other voice actress who sounds similar; OK, so produce that
| name, tell us who she is.
|
| Absent that, it's Johansson's voice, clipped from movies and
| interviews and shows and whatever.
| z7 wrote:
| OpenAI says:
|
| "Sky's voice is not an imitation of Scarlett Johansson but
| belongs to a different professional actress using her own
| natural speaking voice."
|
| https://openai.com/index/how-the-voices-for-chatgpt-were-
| cho...
| drcode wrote:
| Yeah, and ScarJo asked them to provide evidence of this
| before suing them, and they couldn't... and they have a
| track record of lying.
| dorkwood wrote:
| Why shouldn't we trust what OpenAI says? If they say they
| used another actress, we should give them the benefit of
| the doubt. Questioning them only gives fuel to their
| detractors, which could in turn slow progress and reduce
| investment in the space.
| mrbungie wrote:
| Do they get a free pass because of accelerationist
| reasons? If anything this is a reason to stall/brake.
|
| sama tweet after the demo + SJo's press release + OpenAI
| not even risking it and pulling out the voice from
| ChatGPT should raise enough doubts if anything.
| minimaxir wrote:
| The training for a voice decoder model at GPT-4o's quality
| would require specific and numerous high-quality examples.
|
| This is different from how voice cloning models like
| ElevenLabs work.
| ecjhdnc2025 wrote:
| OpenAI: it's like Uber for not respecting common decency.
| reducesuffering wrote:
| Remember when the OpenAI board said Sam was "not consistently
| candid" and most people here advocated he be reinstated and how
| dare the board? They are speedrunning the Google "Don't Be Evil"
| rug-pull. Not allocating the superalignment team resources they
| were promised, "don't criticize OpenAI or mention there's an NDA
| or you lose all your equity"...
| globalnode wrote:
| I think the 2 tech leads that just resigned are actually
| dodging a HUGE bullet.
| throwaway5752 wrote:
| Maybe this "non-profit" shouldn't be entrusted with one of the
| most potentially dangerous and world changing techologies in 80
| years if they can't ethically handle providing a voice to their
| model.
| j-bos wrote:
| Only partially related: one thing I often wonder about with
| generative tools, and even before that with the explosion of
| global artists publishing online:
|
| When is it infringing to make something that looks or sounds
| like somebody famous? I mean, there are only so many ways a
| human voice can sound or a face can look. At what point are
| entire concepts locked down just because somebody famous who
| pattern-matches exists or existed?
| djaykay wrote:
| At this point it's time to create lists of "ethical" AI services,
| ones that aren't OpenAI nor other bad actors. I'm dropping my
| ChatGPT Plus today. Any suggestions on what to replace it with?
| 1ark wrote:
| HuggingFace?
| risenshinetech wrote:
| It's OK everyone. I saw some people on HN say that "it sounds
| nothing her voice". I believe them over Her.
| GaggiX wrote:
| "Sky" voice has been the default for about 8 months now, I think,
| if it resembles Scarlett Johansson so much, why does no one seem
| to have mentioned it before?
| zamadatix wrote:
| It's been mentioned so much in the last 8 months it prompted me
| to go watch "Her" 11 years after it came out.
| GaggiX wrote:
| I only saw Scarlett Johansson and "Her" being mentioned
| after the presentation of ChatGPT-4o. I saw people asking for
| a Scarlett Johansson voice tho.
| zamadatix wrote:
| This was prior to 4o. The demo did seem to make the
| discussion louder though as everyone was suddenly talking
| about the voice feature whereas previously it was a slow
| side option for mobile devices only.
| ec109685 wrote:
| They added much more Her-like emotion to the gpt-4o version, so
| the similarities are more striking.
| tmsh wrote:
| i think openai would do better if they had principles, values,
| etc. around responsibility and ownership.
|
| it doesn't seem like principles should matter. but then the bill
| of rights doesn't seem like it should matter either if you were
| to cold read the constitution (you might be like - hmm, kinda
| seems important maybe...).
|
| it compounds culturally over time though. principles ^ time =
| culture.
|
| "Audacious, Thoughtful, Unpretentious, Impact-driven,
| Collaborative, and Growth-oriented."
|
| https://archive.is/wLOfC#selection-1095.112-1095.200
|
| maybe "thoughtful" was the closest (and sam is apologetic and
| regretful and transparent - kudos to him for that). but it's not
| that clear without a core principle around responsibility. you
| need that imho to avoid losing trust.
| HarHarVeryFunny wrote:
| I found the whole ChatGPT-4o demo to be cringe inducing. The fact
| that Altman was explicitly, and desperately, trying to copy "her"
| at least makes it understandable why he didn't veto the bimbo
| persona - it's actually what he wanted. Great call by Scarlett
| Johansson in not wanting to be any part of it.
|
| One thing these trained voices make clear is that it's a tts
| engine generating ChatGPT-4o's speech, same as before. The whole
| omni-modal spin suggesting that the model is natively consuming
| and generating speech appears to be bunk.
| aabhay wrote:
| I wouldn't go as far as your last statement. While shocking,
| it's not inconceivable that there's native token I/O for audio.
| In fact tokenizing audio directly actually seems more efficient
| since the tokenization could be local.
|
| Nevertheless. This is still incredibly embarrassing for OpenAI.
| And totally hurts the company's aspiration to be good for
| humanity.
| timeon wrote:
| > company's aspiration to be good for humanity
|
| Seems like they abandoned it pretty early - if it was real in
| the first place.
| leumon wrote:
| I think it is more than a simple tts engine. At least from the
| demo, they showed it can control the speed and it can sing
| when requested. Maybe it's still a separate speech engine, but
| one more closely connected to the llm.
| sooheon wrote:
| tts with separate channels for style would do it, no?
| kromem wrote:
| Most impressive was the incredulity in the 'okay' during the
| counting demo after the _n_th interruption.
|
| Was quickly apparent that text only is a poor medium for the
| variety and scope of signals that could be communicated by
| these multimodal networks.
| nabakin wrote:
| Azure Speech tts is capable of doing this with SSML. I
| wouldn't be surprised if it's what OpenAI is using on the
| backend.
| monroewalker wrote:
| > One thing these trained voices make clear is that it's a tts
| engine generating ChatGPT-4o's speech, same as before.
|
| I'm not familiar with the specifics of how AI models work but
| doesn't the ability from some of the demos rule out what you've
| said above? Eg. The speeding up and slowing down speech and the
| sarcasm don't seem possible if TTS was a separate component
| mmcwilliams wrote:
| I have no special insight into what they're actually doing,
| but speeding up and slowing down speech have been features of
| SSML for a long time. If they are generating a similar markup
| language it's not inconceivable that it would be possible to
| do what you're describing.
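| Purely for illustration, this is roughly what standard SSML
| prosody markup looks like, built here as a plain Python string.
| It's a generic sketch (the voice name is invented), not a claim
| about what OpenAI generates internally.
|
|   # Toy SSML showing rate/pitch control and a pause; an
|   # SSML-capable TTS engine accepts markup along these lines
|   # in place of raw text.
|   ssml = """
|   <speak version="1.0"
|          xmlns="http://www.w3.org/2001/10/synthesis"
|          xml:lang="en-US">
|     <voice name="some-neural-voice">
|       <prosody rate="150%" pitch="+2st">
|         One, two, three, four...
|       </prosody>
|       <break time="300ms"/>
|       <prosody rate="60%">
|         Okay, okay, I'll slow down.
|       </prosody>
|     </voice>
|   </speak>
|   """.strip()
|   print(ssml)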
| GrilledChips wrote:
| It's also possible that any such enunciation is being
| hallucinated from the text by the speech model.
|
| AI models _exist_ to make up bullshit that fills a gap.
| When you have a conversation with any LLM it's merely
| autocompleting the next few lines of what it thinks is a
| movie script.
| HarHarVeryFunny wrote:
| The older formant-based (vs speech sample based) speech
| synthesizers like DECTalk could do this too. You could select
| one of a half dozen voices (some male, some female), but also
| select the speed, word pronunciation/intonation, get it to
| sing, etc, because these are all just parameters feeding into
| the synthesizer.
|
| It would be interesting to hear the details, but what OpenAI
| seem to have done is build a neural net based speech
| synthesizer which is similarly flexible because it is
| generating the audio itself (not stitching together samples)
| conditioned on the voice ("Sky", etc) it is meant to be
| mimicking. Dialing the emotion up/down is basically affecting
| the prosody and intonation. The singing is mostly extending
| vowel sounds and adding vibrato, but it'd be interesting to
| hear the details. In the demo Brockman refers to the "singing
| voice", so not clear if they can make any of the 5 (now 4!)
| voices sing.
|
| In any case, it seems the audio is being generated by some
| such flexible tts, not just decoded from audio tokens
| generated by the model (which anyways would imply there was
| something - basically a tts - converting text tokens to audio
| tokens). They also used the same 5 voices in the previous
| ChatGPT which wasn't claiming to be omnimodal, so maybe
| basically the same tts being used.
| nabakin wrote:
| Azure Speech tts is capable of speeding up, slowing down,
| sarcasm, etc with SSML. I wouldn't be surprised if it's what
| OpenAI is using on the backend.
| vessenes wrote:
| Greg has specifically said it's not an SSML-parsing text
| model; he's said it's an end to end multimodal model.
|
| FWIW, I would find it very surprising if you could get the
| low latency expressiveness, singing, harmonizing, sarcasm
| and interpretation of incoming voice through SSML -- that
| would be a couple orders of magnitude better than any SSML
| product I've seen.
| og_kalu wrote:
| >One thing these trained voices make clear is that it's a tts
| engine generating ChatGPT-4o's speech, same as before. The
| whole omni-modal spin suggesting that the model is natively
| consuming and generating speech appears to be bunk.
|
| This doesn't make any sense. If it's a speech to speech
| transformer then 'training' could just be a sample at the
| beginning of the context window. Or it could be one of several
| voices used for the Instruct-tuning or RLHF process. Either
| way, it doesn't debunk anything.
| nikolay wrote:
| Except that Sky doesn't sound like Scarlett Johansson. I'm sick
| and tired of Hollywood!
| nicklecompte wrote:
| From the Ars Technica story[1], this is very funny:
|
| > But OpenAI's chief technology officer, Mira Murati, has said
| that GPT-4o's voice modes were less inspired by Her than by
| studying the "really natural, rich, and interactive" aspects of
| human conversation, The Wall Street Journal reported.
|
| People made fun of Murati when she froze after being asked what
| Sora was trained on. But behavior like that indicates
| understanding that you could get the company sued if you said
| something incriminating. Altman just tweets through it.
|
| [1] https://arstechnica.com/tech-policy/2024/05/openai-pauses-
| ch...
| gkanai wrote:
| Bloomberg's Odd Lots Podcast had an ex-CIA officer, Phil
| Houston, on in April of 2024. He was promoting a new book but
| he had a lot of great advice for anyone to use regarding
| 'tells' when people are lying. Murati was clearly lying- that's
| obvious then and now.
|
| https://podcasts.apple.com/us/podcast/an-ex-cia-officer-expl...
| coolandsmartrr wrote:
| Could you explain what "to use regarding 'tells'" means in
| this context?
| applecrazy wrote:
| A "tell" in this case is domain-specific terminology to
| denote a behavior that provides information that the person
| may have been trying to keep secret. I believe the term
| comes from poker:
|
| https://en.wikipedia.org/wiki/Tell_(poker)
| IceDane wrote:
| Is there actually any evidence for this? AFAIK, other similar
| claims about people doing certain things when lying have been
| debunked(like fidgeting, avoiding eye contact, etc)
| j-bos wrote:
| It's context-specific and varies from person to person, with
| many possibilities for false positives and negatives.
| lovemenot wrote:
| Reminiscent of the movie The Congress, in which Robin Wright's
| character, a famous actor, is hustled by a movie studio into
| giving up her likeness for them to continue making films starring
| her, in perpetuity.
| swat535 wrote:
| Does Sam have any more reputation to burn? Seriously, who would
| genuinely trust him at this point?
|
| I mean I know he has hundreds of blind followers but good Lord,
| you would think that the man, with all his years of experience
| had some sense to introspect about what he is trying to achieve
| vs how he is going about it.
|
| Money really does blind all our senses, doesn't it?
| rvz wrote:
| > Does Sam has any more reputation to burn? Seriously, who
| would genuinely trust him at this point?
|
| Sam doesn't care.
|
| After the board threw him out of his own company, why would he
| allow that to happen again? With that, he now trusts far fewer
| people.
|
| > Money really does blind all our senses, doesn't it?
|
| That is why the cultishness was full on display last year when
| he was fired by the board.
| gcanyon wrote:
| I watched the keynote and many of the demo videos and never once
| thought, "That sounds just like ScarJo."
|
| That said, the timeline she lays out is damning indeed.
| rockemsockem wrote:
| This is interesting to hear and if she decides to sue there's
| extremely clear precedent on her side.
|
| The fact that they reached out to her multiple times and
| insinuated it was supposed to sound like her with Sam's "her"
| tweet makes a pretty clear connection to her. Without that they'd
| probably be fine.
|
| Bette Midler sued Ford under very similar circumstances and won.
|
| https://en.m.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
| thefourthchime wrote:
| I wonder if Scarlett Johansson really has a legal copyright
| since Warner Bros. owns the rights to the movie "Her." It would
| be like Dan Castellaneta trying to get a copyright for Homer
| Simpson when the character is owned by Fox.
| rockemsockem wrote:
| It's not copyright, you can't copyright a voice, it's down to
| likeness laws. It seems to me that they clearly invoked her
| likeness as a celebrity.
|
| It'll be interesting to see what happens if she sues and
| refuses to settle.
| thefourthchime wrote:
| I'm not sure. In the case of *Midler v. Ford Motor Co.*, the
| advertising agency hired a singer to do an impression of Bette
| Midler herself, not a character she performs. The singer was
| instructed to sound as much like Bette Midler as possible while
| performing her hit song "Do You Want to Dance?" for the Ford
| commercial. This use of a sound-alike voice led to the lawsuit,
| as it mimicked Midler's distinctive voice without her
| permission.
|
| Scarlett Johansson's character in the movie is not Scarlett
| Johansson although her voice is very similar. I wouldn't say
| it's identical.
| rockemsockem wrote:
| I haven't seen the commercial, but I feel like it's also
| probably not identical. My read of the case is that the
| context connecting Midler to the ad without her consent was a
| key feature, which makes me think Scarlett Johansson would be
| in a similarly strong position if she brought a case.
|
| But ultimately I'm also not sure. There are some differences
| that courts could find important. I hope she sues and refuses
| to settle, so we can find out!
| vagab0nd wrote:
| What if the model asks the user to input an audio sample of the
| person they'd like to hear, and use that? Would that be legal?
| prawn wrote:
| Presumably the software could have terms of use that put any
| onus of use on the individual, assuming their marketing didn't
| promote being able to do this with random celebrities' voices.
| MarioMan wrote:
| That's how Elevenlabs voice cloning works. They put the onus on
| the person making the clone to have gotten consent.
|
| https://elevenlabs.io/voice-cloning
| skepticATX wrote:
| There are models that are nearly as good as GPT-4 now. For
| personal usage, I've been using them for a while now. OpenAI has
| jumped the shark so much that I'm going to advocate for moving to
| Anthropic/Google models at work now. OpenAI simply can't be
| trusted while Sam is at the helm.
| callalex wrote:
| I think we should all strive to meet the standards of "Weird" Al
| Yankovic. He set out to do something that was widely hated by a
| very powerful and litigious sector, and yet after a full career
| he is widely revered by both the public and the industry. He
| masterfully sidestepped any problems by adhering to the basic
| concept of consent while still getting what he wanted 99% of the
| time.
| justin66 wrote:
| Did anyone say no to him other than Prince?
| apengwin wrote:
| Paul McCartney
| morkalork wrote:
| Coolio didn't say yes to Amish paradise but the record label
| that had the rights agreed to the deal without him (which
| certainly shows who's in charge in that industry eh?). Many
| years later, Coolio admitted that he was wrong about the
| whole thing though.
| justin66 wrote:
| Oh my. That's disappointing, in the sense that I believe
| I've heard Al say that he always checks with the artist.
| zamadatix wrote:
| If you take Weird Al's word for it, he was told Coolio
| had approved and only later found out it was the other
| way around:
|
| "...two separate people from my label told me that they
| had personally talked to Coolio... and that he told them
| that he was okay with the whole parody idea...Halfway
| into production, my record label told me that Coolio's
| management had a problem with the parody, even though
| Coolio personally was okay with it. My label told me...
| they would iron things out -- so I proceeded with the
| recording and finished the album."
|
| https://www.vulture.com/2011/12/gangstas-parodist-
| revisiting...
| justin66 wrote:
| Thanks. Yeah, I don't think Weird Al would lie about
| something like that.
| willis936 wrote:
| It worked for Weird Al because he's always had good intentions.
| If all he ever tried to do was scam people by overselling, fear
| mongering, saber rattling, and stealing whatever was in reach
| then it wouldn't have worked out for him. We'll see if OpenAI
| can last as long as Weird Al.
| minimaxir wrote:
| One interesting legal caveat is that the Sky voice isn't
| "ScarJo", it's ScarJo as acted in the movie Her.
|
| An issue with voice actors having their voices stolen by AI
| models/voice cloning tech is that they have no legal standing,
| because their performance is owned by their client and they
| therefore have no ownership. ScarJo may not have standing,
| depending on the contract (I suspect hers is much different
| from a typical VA's). It might have to be Annapurna Pictures
| that sues OpenAI instead.
|
| Forbes had a good story about performer rights of voices:
| https://www.forbes.com/sites/rashishrivastava/2023/10/09/kee...
|
| IANAL of course.
| muglug wrote:
| IANAL either, but that's not the caveat you think it is.
|
| Bette Midler was able to sue Ford Motor Co. for damages after
| they hired a sound-alike voice:
| https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. Ford had
| acquired the rights to the song (which Midler didn't write).
| whatever1 wrote:
| Judging from this, YouTube is gonna make a ton of money from
| Sora.
| mtnGoat wrote:
| Oh wait Altman and team acting like they rule the world?!
|
| If that box wasn't on your bingo card I'm sorry, it's basically
| the center/free box at this point.
| 0xWTF wrote:
| Anyone care to bet Microsoft and other investors are actually ok
| with this narrative? I think they are, and in fact may be
| advocating for Sam to advance this narrative, because they 1)
| want a court decision, no matter which way it goes, and 2)
| they're confident OpenAI has more capabilities in the pipeline
| mensetmanusman wrote:
| The little mermaid story was true!
|
| Who would have thought we would be discussing voice theft
| someday.
| SLHamlet wrote:
| Really not a smart idea for OpenAI to do this when one of the top
| Congresspeople represents the Hollywood area, is about to be
| elected Senator, and already has a bill ready to require AI
| companies to abide by copyright:
|
| https://nwn.blogs.com/nwn/2024/04/adam-schiff-ai-video-games...
| xyst wrote:
| VCs and angel investors feeling the burn right now. I hope all
| the firms on Sand Hill Rd get bent
| helsinki wrote:
| Statement from Scarlett Johansson's publicist* FTFY
| whoknowsidont wrote:
| Has this industry learned nothing?
|
| I don't know guys, the super hyped up company with next-gen
| technology might just be using crime, underhanded tactics, and
| overstating their capabilities to pull in the thing we all
| love... and it's not each other or your friend's mother!
|
| It's money!
| ml-anon wrote:
| >Last week, OpenAI CTO Mira Murati told me the Sky voice was not
| patterned after ScarJo.
|
| >"I don't know about the voice. I actually had to go and listen
| to Scarlett Johansson's voice," Murati said.
|
| Seems like a big part of Mira's job is not knowing things. How is
| no one questioning how she landed a VP job at OpenAI 2 years
| after being an L5 PM?
| ptelomere wrote:
| First they lie to you saying they will save the world. Then they
| take from you saying they're using them to make the world a
| better place. Then they rule you, saying "there are no other
| ways".
|
| All the while many people believe them at every step.
| hotdogscout wrote:
| Such a nothing burger. Do moralists not get tired of picking up
| pitchforks?
|
| Imitating a movie AI was a cool idea and imitation was the only
| legal way to do it.
|
| Do you pull your hair out when companies advertise with Elvis
| impersonators?
|
| Nobody was significantly harmed by this. I can guarantee the rich
| people who use Hacker News consume things produced to much less
| savory standards than imitating a celebrity.
|
| Nestle is strong but you pull the plug at THIS?
|
| Pg has done worse and he owns this forum.
|
| Have some perspective.
| erichmond wrote:
| This is why OpenAI being the "leader" in the space worries me. We
| need to be building trust in AI systems, not leaning into what
| the public perception is. On the other hand, maybe it's good they
| are showing who they really are.
| totalhack wrote:
| Go f'in get 'em Scarlett.
| xyst wrote:
| Sam Altman is an absolute scumbag. Fuck ClosedAI. Hope this
| company and its VCs crash and burn like Theranos.
|
| Maybe Altman lands in jail or files for bankruptcy after all the
| dust settles.
| ab5tract wrote:
| As soon as you heard an AI laughing, you should have rejected it
| categorically. Simps.
| samcat116 wrote:
| What a stupid self own by OpenAI that could have easily been
| avoided.
| neilv wrote:
| Isn't OpenAI mostly built upon disregarding the copyright of
| countless people?
|
| And hasn't OpenAI recently shown that they can pull off a
| commercial coup d'etat, unscathed?
|
| Why would they not simply also take the voice of some actress?
| That's small potatoes.
|
| No one is going to push back against OpenAI meaningfully.
|
| People are still going to use ChatGPT to cheat on their homework,
| to phone-in their jobs, and to try to ride OpenAI's coattails.
|
| The current staff have already shown they're aligned with the
| coup.
|
| Politicians and business leaders befriend money.
|
| Maybe OpenAI will eventually settle with the actress, for a
| handful of coins they found in the cushions of their trillion-
| dollar sofa.
| tony_cannistra wrote:
| > No one is going to push back against OpenAI meaningfully.
|
| Couldn't, perhaps, one of the more famous people on Earth be
| responsible for "meaningfully" taking OpenAI to task for this?
| Perhaps even being the impetus for legislative action?
| neilv wrote:
| And allied with an army of other artists.
|
| If they tell the story of OpenAI, in a way that reaches
| people, that would be a triumph of the real artists, over the
| dystopian robo-plagiarists.
|
| I love it already.
| johnnyanmac wrote:
| >Isn't OpenAI mostly built upon disregarding the copyright of
| countless people?
|
| It sure was. But OpenAI decided to poke the bear and is being
| sued by the NYT. And apparently as a side quest they thought it
| best to put their head in a lion's mouth. I wouldn't call the PR
| clout and finances of an A-list celebrity small potatoes.
|
| They could have easily flown under the radar and been praised as
| the next Google if they had kept to petty thievery on the
| internet instead of going after high-profile content.
|
| >People are still going to use ChatGPT to cheat on their
| homework, to phone-in their jobs, and to try to ride OpenAI's
| coattails.
|
| Sure, and ChatGPT isn't going to make lots of money from these
| small-time users. They want to target corporate, and nothing
| scares off corporate more than pending litigation. So I think
| this will bite them sooner rather than later.
|
| >Maybe OpenAI will eventually settle with the actress, for a
| handful of coins they found in the cushions of their trillion-
| dollar sofa.
|
| I suppose we'll see. I'm sure she was offered a few pennies as
| is, and she rejected that. She may not be in it for the money.
| She very likely doesn't need to work another day in her life as
| is.
| flanked-evergl wrote:
| > > Isn't OpenAI mostly built upon disregarding the copyright
| of countless people?
|
| > It sure was.
|
| Can you cite something that elaborates on this point? Do
| people who read books and then learn from it also disregard
| copyright? How is what OpenAI does meaningfully different
| from what people do?
| NicuCalcea wrote:
| Are those people then reselling the contents of the books?
| z7 wrote:
| Very obvious bias against OpenAI in the comments here. Possible
| motives: a) there's a visceral human reaction against anyone
| extraordinarily successful and powerful (Nietzsche wrote about
| this). b) resentment for OpenAI's advancements in code generation
| and its possible impact on the job market. I don't think much of
| the outrage here is motivated by altruism; it's probably more
| about siding with whoever opposes your perceived enemy.
| rileytg wrote:
| I'm confused about the timeline: it's still available to me? Did
| they decide to reactivate it?
| browningstreet wrote:
| Sam is aligned with the Elon playbook.
| Art9681 wrote:
| Hot take. The lesson for OpenAI is to STFU. Always. This is
| always the best thing to do. STFU. You wanted to emulate her
| voice? Should have done it anyway and not told a soul. You know
| why? Because there are tens of thousands of women who sound like
| that. It's a very generic voice and accent. Let's be real here.
| Many of us have seen her movies, and had we not read about the
| controversy we would not have made the connection. All OpenAI had
| to do was move forward with intent and let HER prove it's HER
| voice, rather than a generalization of many similar women's
| voices found in the public domain, run through a process that
| collapsed into something resembling her and many other women's
| voices.
|
| In the not so distant future, when the world's top AI models can
| generate endless accents and voices at will, the probability of
| one of those sounding just like you (and thousands of other
| people) will be high. It will be VERY high.
|
| All this dealing with Hollywood and the music industry, and all
| the crap I've been reading about OpenAI trying to wiggle their
| way into those industries, is absolute damn nonsense. What is
| Sama thinking?! GO BACK TO BEING NERDS AND STFU.
|
| If you really believe you are going to create a real AGI, none of
| this is relevant. No one is going to thank you for creating
| something that can replicate what they value in seconds. Do it
| anyway.
|
| And remember, STFU.
| bpiche wrote:
| Luke Skywalker shooting torpedoes into the Death Star vibes. Burn
| it down.
| encoderer wrote:
| Quite the scarlet letter on Openai
| iainctduncan wrote:
| Sam Altman appears to not be smart enough to realize how much
| damage his unbridled selfishness and weaselry are capable of
| doing to OpenAI.
|
| They have no moat, they can't fix hallucinations, and people are
| starting to realize it's nowhere near as useful or close to AGI
| as he's been saying. If they hate him too, this ship is sunk.
|
| What a bloody arrogant idiot.
| abakker wrote:
| It seems he is either using ChatGPT instead of talking to
| experts... or just listening only to himself. Either way, Ilya
| should have stuck to his guns.
| dsign wrote:
| I wish you were right and that ship would be sunk, by the grace
| of Sam Altman the bloody arrogant idiot. There are burning
| issues that AI could tackle, unsolved problems that cost
| billions of lives and whose solution require non-human
| information processing. But instead of those life-or-death
| matters, we are using all of that compute to wreck our species
| cultural back-bone and identity. If Sam Altman the bloody
| arrogant idiot by his greed manages to convince us that we have
| taken a wrong turn with our application of AI, then I will hang
| a portrait of him in my living room.
| dbg31415 wrote:
| It's stuff like this that gives me 0 trust in Altman. Drama
| follows the guy everywhere and it's likely because he brings it
| on himself with questionable morals and actions.
| DavidPiper wrote:
| Interesting to see how the more upvotes and comments this thread
| gets, the further DOWN it goes on the frontpage, despite being
| more recent than almost everything above it.
| dorkwood wrote:
| What's the difference between an algorithm training on someone's
| voice, and a human baby listening to a voice and growing up to
| speak the same way? Would you punish the baby for that? It's
| exactly the same thing.
| deadbabe wrote:
| Are we doomed to someday have all the popular AI voices sound
| like submissive, borderline sexually subservient females?
| thih9 wrote:
| > and the passage of appropriate legislation to help ensure that
| individual rights are protected.
|
| Very interesting to see this there. Does anyone know how that
| could be legislated?
| nick137381 wrote:
| Too bad she didn't agree to it. It's the only voice they
| currently have that I can stand. Hey OpenAI, can you maybe try
| stealing Morgan Freeman's voice next instead?
| mepian wrote:
| Didn't expect Scarlett Johansson to become the flashpoint of the
| next AI winter.
| nycdatasci wrote:
| Could she at least release an audio statement instead of just the
| text?
| xbmcuser wrote:
| The more I find out about Altman the more I agree with the
| previous board about removing him. The guy just feels sleazy to
| me. Though he is doing what I want, which is not giving a fuck
| about artificial monopolies granted by government, i.e. copyright.
| PostOnce wrote:
| The man is literally lobbying congress to obtain an artificial
| monopoly on AI in the name of "safety".
|
| https://www.youtube.com/watch?v=TO0J2Yw7usM
| titanomachy wrote:
| It would be helpful if you linked to something more specific
| (even a timestamp). I'm not going to watch this nearly 3-hour
| video to decide whether it supports your statement or not.
| theyinwhy wrote:
| If you want to discover truths, 3 hours of research seems like
| a good deal. Anyway, I've got you covered. Altman has had a
| plan for this from the get-go:
| https://www.techemails.com/p/sam-altman-emails-elon-musk
| wraptile wrote:
| The board reinstating Altman was what broke the camel's back for
| me. It showed that the board is completely powerless and that
| Altman is simply a liar.
| oblio wrote:
| You're sort of funny. He's ignoring the existing artificial
| monopolies just to make money, when it suits him.
|
| He's using trade secrets, copyright, patents, NDAs liberally.
|
| This is not a principled stand, just opportunism.
| insane_dreamer wrote:
| Actually it's the copyright that is supposed to prevent a
| monopoly arising in the form of an OpenAI, or Google News back
| in the day.
| xbmcuser wrote:
| Copyright was supposed to protect a small inventor or maker for
| a short period of time. But as those small makers became
| corporations, they lobbied for and kept increasing the copyright
| period from a few years to decades after the death of the
| individual, and the actual essence of the reason for copyright
| is gone. I am of the opinion that copyright, apart from a fixed
| term, should also carry a form of value tax: if the government
| is allowing you to have a monopoly on something, you should pay
| a yearly tax on its valuation. And if paying the yearly tax to
| keep the copyright is not worth it, then the work is released
| to the commons.
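|
| A minimal sketch of how such a yearly valuation tax could work,
| with purely hypothetical numbers and names, just to illustrate
| the decision rule described above (Python):
|
|     def keep_copyright(declared_value, expected_yearly_income,
|                        tax_rate=0.02):
|         # Pay tax_rate on the self-declared valuation every year;
|         # keeping the copyright only makes sense while the work
|         # earns more than the tax costs.
|         yearly_tax = declared_value * tax_rate
|         return expected_yearly_income > yearly_tax
|
|     # A work valued at $10M earning $50k/year lapses into the
|     # commons under a 2% tax ($200k/year owed); one earning
|     # $500k/year is worth keeping.
|     keep_copyright(10_000_000, 50_000)   # False -> released
|     keep_copyright(10_000_000, 500_000)  # True  -> kept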
| wyldfire wrote:
| Evil genius territory.
|
| When the offer was declined by ScarJo, they could still train on
| her works of art and just hire a soundalike to make recordings
| regardless of whether they used it during training.
|
| Then, at release time, either they get the buzz of artist-
| licensed "Her" or they get the buzz/outrage/Streisand effect of
| unlicensed "Her". Even if they take it down, OpenAI benefits.
|
| I feel like the folks who fear the tech are wrong. But when the
| supposed stewards make such a moustache-twirling announcement, it
| seems like maybe we do need some restraint.
|
| If a trade group can't put some kind of goodwill measures in
| place, we will inevitably end up with ham fisted legislation.
| talldayo wrote:
| There's not a lick of genius to it. Sam Altman wasn't willing
| to compromise on his vision and had to change things after the
| fact because he was threatened.
|
| For me to believe this was genius I'd have to see some actual
| response from Sam. From the outside looking in, it appears that
| he was caught with his pants down when Johansson said no and
| went ahead _even though_ he was rejected a second time and
| obviously knew it was the wrong choice. There's no Streisand
| effect at play here; OpenAI already owned the news cycle with
| their 4o announcement and could have kept it quiet. But Sam
| just _had_ to have his One More Thing, and now he's getting
| his just deserts.
| nyolfen wrote:
| So Scarlett Johansson has rights over every VA who sounds like
| her as well, just because she's famous?
| _giorgio_ wrote:
| I chose the voice a long time ago just because it sounded nice;
| I never thought of a similarity to Scarlett, even after the
| Sama tweet.
|
| The real problem, now, is that they don't have a nice working
| voice anymore.
| gnicholas wrote:
| Does anyone have a link to examples? I'm curious to hear how
| close this sounds to the actress's voice.
| starspangled wrote:
| Too bad nothing substantive will happen to them.
|
| The worst of it is not that this one person is being ripped off
| (that's bad enough and I hope she gets some kind of resolution).
| The worst of it is that it shows the company and the people
| behind it who are making the big decisions are dishonest and
| unethical.
|
| All the alleged "safety" experts in corporations and in
| government policy and regulators? All bullshit. The right way to
| read any of these "safety" laws and policies and regulators is
| that they are about ensuring the safety of the ruling class.
| danans wrote:
| What an unforced error on OpenAI's part, but revelatory to all of
| us how their leaders actually see the world around them: either
| people like Johansson, whose likeness and style are there to
| copy, or chumps like the rest of us who would marvel at the
| regurgitated synthetic likeness.
|
| And really, how much worse would the demo have been if they
| hadn't cloned Johansson's voice, and instead used another unknown
| voice? If it was similarly flirty, we'd have fallen for it
| anyways.
| hurtuvac78 wrote:
| I am wondering if many OpenAI engineers now regret having
| promised to follow Sam Altman to MSFT after the board action.
| Jayakumark wrote:
| At this level of copyright infringement, I now 100% believe that
| it's fully trained on YouTube and other copyrighted videos,
| audio, books, etc. They don't care about using any public data
| or paying a dime as long as they can build a model with it. They
| will never disclose the data used and won't allow anyone who
| quits to talk about it, blackmailing them with equity.
| divbzero wrote:
| Wanting to imitate _Her_ is rather ironic: it's like watching
| _Wall Street_ and wanting to be Gordon Gekko, or watching
| _Gattaca_ and wanting to genetically engineer humans.
| gkanai wrote:
| Either _that_ is the joke for Altman or there are a lot of
| Johansson fans at OpenAI.
| fnord77 wrote:
| Sam is a creep
| skilled wrote:
| Has Sam already tweeted how sorry he is? All things considered,
| this might actually give people perspective on how weird OpenAI
| has been when it comes to respecting other people's property.
|
| The top comment in this thread is crazy too: they probably
| contacted her two days before launch on the off chance that
| they could use her as a marketing puppet.
|
| Lost for words on this one.
| SaintSeiya wrote:
| Sam messed with the wrong girl. In the end his firing was the
| correct thing to do. The "bad guys" of the company were doing the
| correct thing, and like Jesus, we crucified them.
| petre wrote:
| He could still ask Morgan Freeman I suppose.
| xlii wrote:
| Just a couple of days ago I discussed "Her" in the context of
| ChatGPT's Sky voice, and how it reminds me of the movie.
|
| It's interesting to see how it's unfolding.
| msoad wrote:
| With all of those stories coming out of OpenAI I'm happy I didn't
| join them. A lot of sleazy and shady practices.
| dino1729 wrote:
| Given the sequence of events, Scarlett Johansson suing OpenAI is
| a logical outcome. Sam Altman, of all people, should be
| anticipating this outcome for sure.
|
| Assuming Sam Altman is not stupid, this could be part of some
| elaborate plan and a calculated strategy. The end goals could
| range from immediate practical outcomes like increased publicity
| (see ChatGPT's mobile app revenue doubled overnight:
| https://finance.yahoo.com/news/chatgpts-mobile-app-revenue-s...)
| and market impact, to more complex objectives like influencing
| future legal frameworks and societal norms around AI.
| camillomiller wrote:
| This is such a classic Altman move. Why everyone keeps defending
| this guy as if he weren't such a manipulative, power-hungry
| weasel is beyond me.
| wojciechpolak wrote:
| What would happen if there was someone else in the world with
| exactly the same voice as Scarlett (or very, very similar) and
| they expressed a desire to work with OpenAI? Would Scarlett still
| have the right to prohibit its use?
| wraptile wrote:
| That's a fundamental flaw with this sort of copyright. It's
| first come, first served, and it's up to courts to "assign
| rights", so if you don't have that kind of clout, tough luck:
| someone else owns your identity because they're bigger.
|
| People cheering for this sort of copyright are completely lost
| imo. That's not a world anyone but the select few wants to live
| in.
|
| Nevertheless that's not what happened here.
| surfingdino wrote:
| It reads to me like he used a big dollop of flattery to try to
| get her to agree, then asked her to reconsider when she said no?
| That's cringey.
| leobg wrote:
| > He told me that he felt that by my voicing the system, I could
| bridge the gap between tech companies and creatives and help
| consumers to feel comfortable with the seismic shift concerning
| humans and AI. He said he felt that my voice would be comforting
| to people.
|
| To me, that reads like the same kind of snake oil he sold Elon
| when he proposed the joint founding of OpenAI.
|
| I can just about imagine the books in his private library. The
| Prince. 48 Laws of Power. Win Friends and Inference People.
| unraveller wrote:
| What would be damning is a side-by-side comparison of the voices
| to assess the claim. We have the technology.
|
| ChatGPT using Sky voice (not 4o - original release):
| https://youtu.be/JmxjluHaePw?t=129
|
| Samantha from "Her" (voiced by ScarJo):
| https://youtu.be/GV01B5kVsC0?t=134
|
| Rashida Jones talking about herself: https://youtu.be/iP-sK9uAKkM
|
| I challenge anyone to leave prejudice at the door: describe each
| voice in its totality first, then see if your descriptions
| overlap entirely with others'. Each has its own obvious
| wispiness and huskiness.
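|
| For anyone who wants to go beyond describing the clips by ear, a
| rough, non-authoritative way to run such a comparison is to
| compute speaker embeddings for short local clips of each voice
| and look at their cosine similarity. A minimal sketch in Python,
| assuming the open-source resemblyzer package is installed and
| that you have extracted WAV clips yourself (the file names below
| are placeholders):
|
|     from pathlib import Path
|
|     import numpy as np
|     from resemblyzer import VoiceEncoder, preprocess_wav
|
|     encoder = VoiceEncoder()
|
|     def embed(path):
|         # Load a clip and return its speaker embedding.
|         return encoder.embed_utterance(preprocess_wav(Path(path)))
|
|     def cosine(a, b):
|         return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
|
|     sky = embed("sky_chatgpt.wav")        # placeholder: Sky voice clip
|     samantha = embed("her_samantha.wav")  # placeholder: clip from "Her"
|     rashida = embed("rashida_jones.wav")  # placeholder: Rashida Jones clip
|
|     print("Sky vs Samantha:", cosine(sky, samantha))
|     print("Sky vs Rashida: ", cosine(sky, rashida))
|
| A higher score only means the embeddings are closer; it is a
| crude heuristic, not a legal test of likeness.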
| dilyevsky wrote:
| The first and second samples have this super noticeable vocal
| fry that Rashida doesn't have (as much of).
| unraveller wrote:
| They would all sound alike when itching for agreement, I bet.
| The narrow likeness in accent is there at first, but that isn't
| true likeness if, on further listening, significant other
| details emerge that don't overlap.
| surfingdino wrote:
| I wonder how they'll handle this? A pot of gold and an iron-clad
| NDA for an out of court settlement?
| raverbashing wrote:
| Sam: double the grift and half the taste of Steve Jobs
| Madmallard wrote:
| The fact that they did it anyway and only took it down after
| legal threat tells you these are not the people you want to be in
| charge of such powerful systems. They want to have their cake and
| eat it too, and regular humans get screwed over in the process. I
| think a relinquishment of power is in order. OpenAI should truly
| be open and there should be large public discussion forums
| regarding changes moving forward.
| badrunaway wrote:
| Sam Altman doesn't inspire confidence in where AI companies are
| going with user consent. And the board can't even remove him if
| he takes OpenAI down the wrong path. He is the board and the
| company.
| meta-level wrote:
| They should have copied the voices from
| https://en.m.wikipedia.org/wiki/The_Congress_(2013_film) instead.
| It would have been like Amazon removing 1984 from customers'
| Kindles.
| kashyapc wrote:
| Altman often uses tactical charisma to trap gullible people,
| government entities, and any unsuspecting powerful person for his
| ends. He will not bat an eyelid at taking whatever unethical
| route gives him a "moat". He relentlessly talks as if "near-term
| AGI" is straining to get out of the bottle in his ClosedAI
| basement. He will tell you with great concern how "nervous" or
| "scared" he is (he said this to the US Congress[1]) of what he
| thinks his newest LLM model is going to let loose on humanity.
|
| So he's here to help regulate it all with an "international
| agency" (see the reference[2] by _windexh8er_ in this thread)!
| Don 't forget that Altman is the same hack who came up with
| "Worldcoin" and the so-called "Orb" that'll scan your eyeballs
| for "proof of personhood".
|
| Is this sleazy marketer the one to be trusted to lead an effort
| that has a lasting impact on humanity? _Hell_ no.
|
| [1] https://news.ycombinator.com/item?id=38312294
|
| [2] https://news.ycombinator.com/item?id=40423483
| ml-anon wrote:
| "Tactical charisma" is a good one.
|
| Honestly though, if you actually listen to him and read his
| words, he seems even more devoid of basic empathetic human
| traits than Zuckerberg, who gets widely lampooned as a
| robot or a lizard.
|
| He is a grifter through-and-through.
| jajko wrote:
| Emotional intelligence / true empathy cannot be learned or
| acquired, at least IMHO.
|
| But mimicking it can be learned almost to perfection, either by
| endless trial & error or by highly intelligent, motivated
| people. It usually breaks apart when a completely new, intense,
| or stressful situation happens. Sociopaths belong here very
| firmly and form the majority.
|
| If you know what to look for, you will see it in most if not
| all politicians, 'captains of industry', or otherwise people
| who got to serious power by their own deeds.
|
| Think about it a bit: what sort of nasty battles did they have
| to keep winning against similar folks to get where they are?
| This is not a place for decent human beings; you or I would be
| outmatched quickly. Jordan Peterson once claimed circa 1 in 20
| of the general population are sociopaths, say 15 million just
| in the US? Not every one is highly intelligent and capable of
| getting far, but many do. Jobs, Gates, Zuckerberg, Bezos, Musk,
| Altman and so on and on. The world is owned and run by them,
| I'd say without exception.
| ecjhdnc2025 wrote:
| > Emotional intelligence / true empathy cannot be learned or
| acquired, at least IMHO.
|
| I think you are right in general here in this comment but I
| am not sure if you are right on this bit.
|
| Peterson might be slightly overstating the number of
| sociopaths (others put it at more like one in thirty).
|
| Those people have to fake it (if they can be bothered; it
| doesn't seem to hold people back from the highest office if
| they don't).
|
| The vast majority of people with noticeably low empathy,
| though, simply haven't ever been taught how to nurture that
| small seed of empathy, how to use it to see the world, how
| to feel the reciprocal benefits of others doing the same.
| How to breathe it in and out, basically. It's there, like a
| latent ability to sing or draw or be a parent, it's just
| that we're not good at nurturing it specifically.
|
| Schools teach "teamwork" instead, which is a lite form of
| empathy (particularly when there is an opposing team to
| "other")
|
| I was never a team player, but I have learned to grow my
| own empathy over the years from a rather shaky sense of it
| as a child.
| ml-anon wrote:
| In the case of Peterson, I'd say it takes one to know one.
| raxxorraxor wrote:
| Almost any public figure who displays empathy will do it
| for show or as a statement. On many occasions, not showing an
| emotional reaction is the empathetic thing to do as well.
|
| You cannot really judge people by their public appearance; it
| will in most cases be a fake persona. So the diagnosis
| of Jobs or Zuckerberg isn't really grounded in reality if
| you do not know them personally.
| kashyapc wrote:
| I agree. By tactical charisma, I didn't mean to imply that he
| has genuine empathy. I mean that he says things the other
| person finds pleasing, in just the right words, and credible-
| sounding seriousness. Tactical in a tempting sense: "Don't
| you want to be the bridge between man and machine, Scarlett?"
| or, "Imagine comforting the whole planet with your voice" --
| I've slightly rephrased a bit here, but this is how he tried
| to persuade Scarlett Johansson into "much consideration" (her
| words).
|
| Yes, I've listened to Altman. A recent example is him
| waffling with a straight face about "Platonic ideals"[1]
| while sitting on a royal chair in Cambridge. As I noted
| here[2] six months ago, if he had truly read and digested
| Plato's works, he simply would not be the ruthless conman he
| is. Plato would be turning in his grave.
|
| [1] https://www.youtube.com/watch?v=NjpNG0CJRMM&t=3632s
|
| [2] https://news.ycombinator.com/item?id=38312875
| jjgreen wrote:
| I enjoyed this comment [1] on the Reg's article on this story:
|
| _Hurray, OpenAI has found a new lucrative market. Horny incels._
|
| [1]
| https://forums.theregister.com/forum/all/2024/05/21/scarlett...
| sirsinsalot wrote:
| Again, I must recommend everyone read Jaron Lanier's "Who Owns
| the Future?".
|
| It's an excellent book, and so, so many of the issues raised in
| it are playing out blow by blow.
| zmmmmm wrote:
| If it spurs on the movement to create legislation controlling how
| likenesses are used in AI models, Sam Altman has done himself a
| great disservice here.
| lokhura wrote:
| I'm not a fan of Sam Altman, but this is such a non-issue. The
| solution is simple: adapt and find new ways to be creative in the
| world of AI. Copyright is becoming a thing of the past and
| rightly so. We just have to collectively accept this and move on
| because laws won't stop it.
| slim wrote:
| move fast break people
|
| hurting people is just a risk Sam Altman is willing to
| incorporate into his equation
| braza wrote:
| The most interesting aspect of this debacle, in my opinion, is
| that with new technologies that allow you to impersonate and/or
| recruit artists with minor modifications, the figure of the
| "movie star" and of the artist itself will be significantly
| diluted.
|
| For example, I would love to see all of the Bourne books adapted
| into live-action films, but I know that will be impossible. In
| the future, I believe it would be great to see some AI actors who
| are not related to any famous actors/actresses perform the same
| screenplay, provided of course that the book is licensed for that
| AI movie.
|
| [1] - https://bourne.fandom.com/wiki/The_Bourne_Directory
| ChrisMarshallNY wrote:
| If they used a different voice actress, then it should be trivial
| to simply tell everyone who she is (She could probably benefit
| from the publicity), and show the hundreds of audio samples, all
| dated before this kerfuffle.
|
| Problem solved.
| ChrisMarshallNY wrote:
| This article was on CNN a few days ago. Probably relevant:
| https://www.cnn.com/2024/05/17/tech/voice-actors-ai-lawsuit-...
| blinding-streak wrote:
| Everything the board said about Altman, the reasons for firing
| him, was correct.
___________________________________________________________________
(page generated 2024-05-21 12:01 UTC)