[HN Gopher] X blocks Taylor Swift searches after fake AI videos ...
___________________________________________________________________
X blocks Taylor Swift searches after fake AI videos go viral
Author : jmsflknr
Score : 45 points
Date : 2024-01-28 14:22 UTC (8 hours ago)
(HTM) web link (www.ft.com)
(TXT) w3m dump (www.ft.com)
| exo-pla-net wrote:
| https://archive.is/9ErBf
| 1B05H1N wrote:
| It matters now because it's easy to do?
| samstave wrote:
| It matters because she is famous?
|
| If this was just some random person, this would be a blip on
| the nightly news - similar to the revenge porn website stuff.
|
| It must be really interesting though, to be _the_ person in
| all of Human History at the dawn of Humanity's awareness that
| controlling AI versions of our likeness is going to be an
| impactful and meaningful area of legal precedent from here to
| evermore.
|
| Swift, IMO, should feel a certain sense of weird luckiness? To
| literally be the Human with whom we begin the discussion of
| protecting ourselves from AI fakes....
|
| My question is, then, if Swift can be entirely in control of
| how her likeness is used in any context, then what about any
| random person's likeness being scanned, documented and analyzed
| by millions of camera surveillance feeds every day?
|
| It's a weird tangent, but if Swift creates the foundation for
| (what would this be, case law? Precedent? Don't know what
| legal terms define this) - what impact could it have for
| people defending even the capture of their likeness by systems
| that use that likeness to develop a catalog of your biometric
| behaviors to track you, recreate you, catalog you, define you,
| and then have business systems use that data to make decisions
| upon or against you?
|
| If I own all aspects of my biologics, then do I have _any_
| agency over how data captured, and AI-ified, may be used?
| Mountain_Skies wrote:
| Hollywood is already struggling with how to handle these
| types of issues, with actors wanting to maintain control of
| their likeness and studios wanting to be able to own those at
| the very least as a "work for hire" property in the context
| of a movie character. Not that this is completely new.
| Crispin Glover sued the producers of 'Back to the Future II'
| over using another actor to give the impression of him playing
| the George McFly character instead of recasting the character
| or writing around the character's absence. IIRC, Glover ended
| up winning.
| foogazi wrote:
| > If this was just some random person, this would be a blip
| on the nightly news - similar to the revenge porn website
| stuff.
|
| Still a crime regardless of the victim's notoriety
| techdragon wrote:
| She is rich, and famous, and can both afford to fight legally
| and sway public opinion to her side (which is the ethically
| sound side; it's hard to argue it's totally fine for anyone to
| make fake porn of you _and share it_ without permission...
| doubly so given we allow people to exercise likeness rights)
|
| So she has the social and economic power to stand up for her
| rights, and enforce her existing likeness rights in the face
| of some widespread AI imagery that is violating those
| rights...
|
| I'm not expecting precedent, just another sad example of how
| the rich and powerful have rights the rest of us don't,
| because you have to _assert_ those rights which requires
| lawyers which requires money and so... the status quo
| continues as it exists today... "nothing to see here, move
| along"... sadly.
| samstave wrote:
| Full agreement.
|
| That's why I think this is really important for everyone -
| and not just about fake sex tapes - but in how much we
| control our presence in the world, specifically at the
| intersection of our Digital existence and our Biological
| existence.
|
| And this is imperative to get right, given that our Digital
| selves are effectively immortal from this point forward -
| even though they will ultimately just be boiled down to some
| coordinate in a vector graph for eternity.
| techdragon wrote:
| I think you expect too much. The end result is that she
| asserts rights she already had yesterday, rights she had
| years ago before DALL-E or Stable Diffusion even
| existed... there's nothing new here... just existing
| power structures that give the rich and famous some
| measure of control over their likeness due to it being
| part of their personal brands and thus their business
| interests... while the non-rich and non-famous have no
| right to privacy in public and, short of the limited
| protections that have been implemented in some
| circumstances against revenge porn, no right to protect
| their likeness from being misused... from people taking
| pictures in public, to AI-generated images... they do not
| have protections against this kind of thing, as no act has
| been perpetrated against them unless it crosses the tiny
| minority of laws around things like commercial imagery
| rights, which is why models have to sign a release... but
| that won't protect against generated imagery given away
| for free.
| Teever wrote:
| Ironically I think that her fame has a chance of being her
| undoing on this subject and we'll end up with a situation
| where the famous have to endure certain kinds of sexual
| content being produced that features their likeness while
| the little people do not.
|
| https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell
| crote wrote:
| > Swift, IMO, should feel a certain sense of weird luckiness?
| To literally be the Human with whom we begin the discussion of
| protecting ourselves from AI fakes....
|
| Did you miss the part about the images containing her being
| raped and assaulted? This isn't a deepfake ripoff concert or
| something.
|
| This woman is being actively stalked in real life by multiple
| people. She literally needs 24/7 security to protect her from
| weirdos. I highly doubt she's going to care even the
| _slightest_ about being part of the "discussion", and the
| fact that you'd even _suggest_ she should "feel lucky" shows
| an incredible lack of empathy on your part.
| samstave wrote:
| Sorry, actually I wasn't really following what was happening
| in this case, mostly because I don't care about anything at
| all related to Taylor Swift - but I do care about how we
| navigate the laws of AI likeness ownership.
|
| Her personal circumstances suck - and it was insensitive of
| me not to mention that, but I did so out of ignorance,
| because I don't care about Taylor Swift - not because I am
| lacking empathy.
|
| So, apologies for that. I stand by my point, and I hope
| this gets Swiftly resolved in a manner that beneficially
| covers all.
| danhon wrote:
| It mattered before it was easy to do. This has been brought up
| before, but leadership barreled ahead anyway.
| 2OEH8eoCRo0 wrote:
| Would it have been possible to pull this off when it wasn't
| easy to do? I can't put my finger on it but being easy
| changes things. It takes very few bad actors to have a big
| impact.
| mlrtime wrote:
| It matters only because AI is a new/hot topic. AI+TS is the
| only reason this is news, as this has been happening for a
| long time.
|
| TS needs to protect her image as it's worth a lot of money.
| Doesn't matter if it's AI, Photoshop or some drawing. The
| same as any other trademark owner, they have to protect it or
| risk losing it.
|
| Nobody is claiming this will stop AI or stop future
| scenarios, but that doesn't mean that TS shouldn't also
| aggressively protect her image.
| addicted wrote:
| It's always mattered. People have been trying hard to get
| social media companies and govt officials to pay attention.
|
| Because Swift is famous and because AI is involved they (and
| places like HN) are finally paying attention.
| kotaKat wrote:
| I assume this is why Nitter got slammed and is on the verge of
| breaking.
|
| https://github.com/zedeus/nitter/issues/1154
| techdragon wrote:
| I don't see the connection.
| JoeMattiello wrote:
| People use Nitter for embedding tweets where Twitter embeds
| have been broken for ages, such as in Telegram.
|
| Personally I use fixupx dot com
| hasty_pudding wrote:
| Meanwhile it's like every other thread on 4chan /b.
|
| If you ever want to know which sites are truly censorship
| resistant, just post AI pictures of Taylor Swift.
| rc_mob wrote:
| I gave up on that place when 4chan started to worship
| authoritarian Trump
| hasty_pudding wrote:
| I gave up on politics in general. Now I find it strange that
| people care so strongly about things they have such little
| control over. lol
|
| People on 4chan are motivated by doing things for the lulz
| more than any ideology... and you have to admit a Trump
| presidency would provide the most lulz.
| netsharc wrote:
| I have a feeling the OG 4chan-ners were doing it for the
| "lulz", but when it became well-known as a "site where the
| users support Trump", all the bleeding-heart Trump
| supporters (including the actually deplorable e.g. the
| white supremacists) actually started frequenting the
| site...
|
| I wonder how most of the media's inability to see jokes for
| what they are contributed to their misreporting...
| f38zf5vdt wrote:
| The DALL-E 3 stuff on 4chan is 10000% worse than nudes. One of
| my friends showed me a picture of TS putting Jews in ovens in
| a concentration camp. Microsoft is the one spouting off about
| the need for "generative image AI safety", meanwhile they have
| to be the worst offender given some of the stuff I've seen out
| of Bing.
| vdaea wrote:
| It's important that we have censorship so images of ungood
| things can't be generated.
| twoWhlsGud wrote:
| Figure out how to stop the social media robots from
| monetizing the destruction of civil society and you might
| be able to keep something resembling free speech.
| Otherwise, the future looks like China or Russia (massive
| kleptostates using the robots to keep control instead).
| It's not clear there's a way for democracies to survive the
| disappearance of the peaceful maintenance of reality.
| 0cVlTeIATBs wrote:
| If by "monetizing the destruction of civil society" you
| mean making money from posting celebrity nudes on the
| internet maybe you'll be glad to learn the people doing
| it are doing it for free.
| TheLoafOfBread wrote:
| Everyone speaking in defense of Taylor Swift, all the way up
| to the White House, has created a massive Streisand effect.
| Yet AI porn has been around for a few years at least.
| Shawnj2 wrote:
| AI porn trained off of a dataset of people who are fine being
| in an AI porn dataset vs. AI porn of celebrities who
| understandably go to great lengths to protect their image
| rights are different categories.
| gizmo686 wrote:
| This is not about Taylor Swift. If you want to fight against
| non consensual AI porn, you need someone to be the face of it.
| Swift is a good candidate because:
|
| A) She comes with a lot of free PR that you would have to work
| for if you wanted to build up a token victim from nothing and
|
| B) She is already a massively well-known public figure. She has
| money, a platform, a PR team; she will be fine. Heck, she
| probably already has had photoshopped porn of her floating around
| the internet for years.
|
| Swift will be absolutely fine with being made the face of this.
| Your random Jane Doe would not.
| Teever wrote:
| https://en.wikipedia.org/wiki/Hustler_Magazine_v._Falwell
|
| You're right that it's not about her but wrong that she's a
| good candidate.
| _heimdall wrote:
| Taylor Swift also isn't one to shy away from a public moral
| stand. She picked a fight with streaming services and her
| former label. This seems like one I could see her stepping in
| to take a public stand on.
| greenhexagon wrote:
| | If you want to fight against non consensual AI porn, you
| need someone to be the face of it
|
| Isn't this just an emotionally charged way of suggesting
| people fight against artistic freedom, freedom of speech,
| etc?
|
| I have literally zero concern about someone making AI porn
| that looks like me (or my spouse, family, celebrities,
| politicians, etc). People have already had photoshop and
| before that imaginations. It's maybe a little weird, icky or
| uncomfortable to think about, but that's a small price to pay
| for living in a free liberal democracy.
|
| I'm far more concerned that this will give powerful people
| another tool to crack down on journalists, artists,
| activists, documentary filmmakers, etc. Or even just any
| independent creative who attempts to publish work outside of
| one of the major copyright cartel corporations.
| RcouF1uZ4gsC wrote:
| It is interesting to contrast this reaction - widespread
| blocking and calls for laws - with the reaction to Kanye
| West's 'Famous' video, which featured a wax model of a naked
| Taylor Swift in bed with him.
|
| https://www.robertreeveslaw.com/blog/famous-video-kanye/
|
| There is almost an elitism where things are ok as long as
| someone famous does them, but if the common people do things,
| then there is outrage and a desire to involve the law against
| them.
| thih9 wrote:
| True, that wasn't ok either.
|
| At the same time this instance seems more dangerous, due to
| the obvious: scale, availability, technological advancements
| in an unregulated field, and the level of explicit content.
|
| I see no elitism. If Kanye distributed an explicit AI deepfake,
| there would be similar outrage. If a "common person" made a wax
| model of a celebrity and recorded a music video, they would be
| labeled a weirdo at most.
| kkarakk wrote:
| As Andy Warhol said - "Art is what you can get away with".
|
| Kanye gets a (heavily frowned upon) pass because there is an
| inherent artistic and cultural commentary sensibility to the
| video.
| kkarakk wrote:
| There are sex workers who already use AI tech to produce
| images of themselves trained on their own pictures - e.g.
| https://www.susu.bot/ (NSFW)
|
| This is likely the (near) future of this tech: you can't
| control it, only monetise it, so that using it is more
| convenient than spending a couple of hours downloading
| publicly available images and training models yourself.
|
| (The industrial revolution and its consequences hit the world
| of celebrity...)
| Khaine wrote:
| Fake celebrity porn has been a thing for the longest time. In the
| 90s people used to airbrush the face/head of a celebrity onto the
| body of a nude model. Prior to that people used their
| imaginations.
|
| What has changed since then? I guess technology now makes this
| easier; have fakes like this become more socially unacceptable?
| dharmab wrote:
| The change is that now it takes about 5 seconds and a nice-ish
| computer instead of hours of skilled work.
| _heimdall wrote:
| So is the problem that fake porn is being created, or the ease
| with which it can be created?
|
| The latter almost makes it sound like fake celebrity porn is
| more of an art form that should be appreciated and guarded
| from cheap knock offs.
| ender341341 wrote:
| > or the ease with which it can be created?
|
| It's the volume of it, which comes from the ease.
| gqcwwjtg wrote:
| Kanye West did a music video with wax figures of nude
| celebrities including Taylor Swift and that had barely any
| backlash. Is it because he's known as an artist? The price
| of the recreation? The way it's distributed? Is creating
| fake images of someone nude in a somewhat sexual context
| meaningfully different from creating images of them
| actively engaging in sex acts?
|
| It's almost like the problem is that nobody is claiming it as
| art they created.
___________________________________________________________________
(page generated 2024-01-28 23:01 UTC)