[HN Gopher] FTC proposes new protections to combat AI impersonat...
___________________________________________________________________
FTC proposes new protections to combat AI impersonation of
individuals
Author : oblib
Score : 89 points
Date : 2024-02-19 19:31 UTC (3 hours ago)
(HTM) web link (www.ftc.gov)
(TXT) w3m dump (www.ftc.gov)
| IronWolve wrote:
| So individuals still have fair use for protests and comedy?
| twoodfin wrote:
| The FTC's power only extends to commercial activity, and in
| this case it appears limited to implied (impersonated)
| commercial endorsements.
| dang wrote:
| Url changed from https://www.techradar.com/pro/security/the-ftc-
| bans-ai-imper..., which points to this.
| aetherspawn wrote:
| People don't want to be impersonated by deepfakes.
|
| Artists don't want their art style copied.
|
| Writers don't want to be put out of business or have the price
| of their work degraded by GPT spam.
|
| There are a lot of things people don't want AI to do, but I can't
| wait for them to use AI to remaster Star Trek Next Gen (and other
| old sci-fi) into 4K. This kind of application hurts nobody.
| Nuzzerino wrote:
| Personally I'd rather AI be used to help fix society and health
| issues. Help people thrive and survive, things like that.
| cynicalsecurity wrote:
| Human problems can only be solved by humans, not technology.
| hsbauauvhabzb wrote:
| I'm not sure that's true. Are you suggesting the internet
| hasn't solved a single human problem?
| cynicalsecurity wrote:
| Nope. It hasn't solved anything.
| BriggyDwiggs42 wrote:
| Agreed, but these things are actually harder than upscaling
| Star Trek, unfortunately.
| ben_jones wrote:
| > I can't wait for them to use AI to remaster Star Trek Next
| Gen (and other old sci-fi) into 4K. This kind of application
| hurts nobody.
|
| If the remaster only improves video and audio fidelity, sure.
| But they may also swap out characters, personalities, and plot
| points, based on the whims of the studio.
|
| AI controlled by major power brokers is going to further the
| aims of those power brokers.
| BriggyDwiggs42 wrote:
| If there's a good tool floating around to do the upscaling, I
| bet committed fandoms are about as likely as major companies
| to gain access. Superfans won't care about copyright. The only
| way that wouldn't seem likely to me is if the company owns or
| unilaterally controls the tool, which we haven't seen any
| indication is a likely business model.
| hammock wrote:
| The original Star Wars trilogy is now ruined because of
| overzealous "remastering"
| teekert wrote:
| Wasn't TNG shot on film and remastered [yes: 0]? Not sure if it
| made it to 4K, and it probably could use some AI clean up. I
| did read that DS9 was scaled up with "AI" because it was mostly
| CGI. I read on reddit that fans are even doing it, with mixed
| results.
|
| [0] https://trekmovie.com/trek-remastered/tng-remastered/
| megmogandog wrote:
| I'm curious why you would want 4K in this case? I was over at a
| family member's place and threw on an episode of TNG to kill
| time. I was trying to figure out why everything looked so
| cheap: Picard's uniform, other costumes, the alien makeup. It
| was because they'd increased the definition past what the sets,
| costumes, and makeup were designed for. Personally I prefer the
| fuzzy look which appears more 'realistic' to me.
| janalsncm wrote:
| Seems like this is broader than just AI impersonation. It also
| would include a person claiming to be from the government when
| they're not.
|
| > Falsely imply government or business affiliation by using terms
| that are known to be affiliated with a government agency or
| business (e.g., stating "I'm calling from the Clerk's Office" to
| falsely imply affiliation with a court of law).
|
| My first thought on this was _great, I'm glad the FTC is doing
| something about this and I'm surprised it wasn't already
| regulated by the FTC_.
|
| My second thought was the majority of this type of fraud is
| probably from foreign impersonation, not Americans. And it's not
| like they'd be sending in predator drones for surgical strikes
| against scam call centers, as satisfying as that might be.
|
| My third thought was that having this on the books and keeping a
| record of these violations will give the FTC leverage to crack
| down on telecoms that don't do anything about it.
| codetrotter wrote:
| "Hello sir I am calling from Windows support."
| cyanydeez wrote:
| indeed, almost no novel criminality or fraud is occurring.
|
| robocalls should just be illegal. unidentified propaganda
| should be banned. ads with fraudulent claims should be
| prosecuted.
|
| you can argue about "to what degree" but AI isn't doing
| anything but exposing the true lack of enforcement of existing
| laws because of capitalism grease.
| ethanbond wrote:
| > but AI isn't doing anything but exposing the true lack of
| enforcement of existing laws because of capitalism grease.
|
| This is just a dismissive way of saying "AI isn't doing
| anything but making the problem worse."
| cyanydeez wrote:
| right.
|
| I guess my opinion is "they already weren't tracking the
| type of fraud AI is capable of; they weren't enforcing
| against fraudulent activity..."
|
| AI just makes it more numerous. a non-story.
|
| "sorry guys, we're still not regulating white collar fraud"
| bobthepanda wrote:
| This is a legal clarification. Impersonation has always been
| illegal, but the rules didn't mention AI because it didn't
| exist back then, so now they do.
| notatoad wrote:
| >My second thought was the majority of this type of fraud is
| probably from foreign impersonation, not Americans. And it's
| not like they'd be sending in predator drones for surgical
| strikes against scam call centers, as satisfying as that might
| be.
|
| I think it means that american-owned social media platforms
| would be required to comply with these rules though, right?
| that seems like a fairly big deal.
| jasonwatkinspdx wrote:
| > My second thought was the majority of this type of fraud is
| probably from foreign impersonation, not Americans.
|
| There's an industry for it in India, because they can source
| English-speaking call center workers there fairly easily.
|
| The YouTuber Mark Rober ran into a couple of US mules for these
| sorts of scams while working on one of his prank videos, and
| decided to team up with some other YouTubers to troll some of
| these companies in India. The video on it is worth watching:
| https://www.youtube.com/watch?v=xsLJZyih3Ac
| thesis wrote:
| Isn't fraud already illegal?
| wmf wrote:
| They think there will be additional deterrence by making it
| super illegal. See "identity theft" for a previous example.
| hammock wrote:
| Yes and assaulting someone is also illegal (whether with a
| weapon or not), reckless driving is also illegal (whether you
| are drunk or not), and violent hate crimes are illegal (whether
| you had discriminatory hate in your heart or not)
| Cheer2171 wrote:
| Fraud has to be prosecuted by the Dept of Justice. Deceptive
| business practices can be independently pursued by the FTC.
| kjkjadksj wrote:
| Too bad scammers couldn't care less what the FTC decides,
| judging by the crap coming into my physical mailbox, email
| inbox, calls, and texts.
| spazx wrote:
| Nor the FCC. I just got a call from a contractor-recruiter call
| center. (About half of them are scams, but you can't tell,
| because they all sound like the same foreign call center.) The
| caller ID said "Public Service". That has got to be illegal?
| evolvingstuff wrote:
| I suspect in the near future there will be a number of cases
| where individuals will intentionally release a bunch of AI
| "chaff", in the sense that having a very large number of bad
| videos/texts about them, of which many are clearly false, will
| disguise the actual bad behavior. I'm not sure what term/phrase
| will be used for this particular tactic, but I am absolutely
| certain one will arise.
| hammock wrote:
| This almost certainly already happens, just without AI. For one
| example look at the surfeit of UFO stories, many of which can
| be plausibly attributed to state efforts to cloud intelligence
| about actual classified air and space technology
| anigbrowl wrote:
| This (minus the AI part until now) is pretty much the strategy
| of political operator Steve Bannon, who pithily summarized it
| as 'flood the zone with shit'. Think of all those junk 'news'
| sites that are just barely curated content farms using
| automated 'spinners' to pump out content, rage farming pundits
| etc.
| karmasimida wrote:
| AI watermarking would be a booming business.
| PeterisP wrote:
| Why would the scammers apply watermarking to their fraudulent
| data? Or do you expect that the FTC somehow will make the open
| source generative models disappear worldwide?
| MR4D wrote:
| Key words - "allow for _civil_ penalties".
|
| Civil is a much lower bar than criminal when it comes to court
| cases.
| reocha wrote:
| Unfortunately the cat is out of the bag; we can't simply
| legislate this away.
| malfist wrote:
| The same thing has been said before. Doesn't make it true.
|
| Lawn mower manufacturers said they couldn't make lawnmowers
| safe, it was impossible. Until the government mandated that
| they had to.
| reocha wrote:
| Lawnmowers can't emulate the President of the United States.
| scoofy wrote:
| People respond to incentives.
|
| We have insanely high quality printers, yet we do not have much
| counterfeiting.
|
| Just because we _can_ do something easily and illicitly
| doesn't mean that people _will_ do things illicitly if the
| proper incentive structure is implemented.
| reocha wrote:
| What proper incentive structure do you suggest? There are
| plenty of protections for printers, and even then we do have
| (printer-based and otherwise) counterfeiting.
|
| We are talking about a software-based solution, running
| locally, that can emulate any public figure well enough that
| the average person will not be able to tell the difference.
| This is a categorical risk to the information age.
| bobthepanda wrote:
| we can't legislate it away, but we can throw the book at people
| who do it anyway. i don't understand why the knee-jerk cynical
| response is "why bother," as if that will make the problem
| better.
| reocha wrote:
| Do you want to regulate every GPU that can run a video sim?
|
| An easier requirement would be for official sources to embed a
| public-key signature in their videos. This is a solved problem.
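|
| A minimal sketch of that idea, assuming an Ed25519 signature
| made with Python's cryptography package (the key handling and
| file name here are hypothetical, not any particular standard):
| the publisher signs the raw video bytes and ships the signature
| alongside the file, and anyone holding the publisher's public
| key can check that the bytes were not altered.
|
|   from cryptography.exceptions import InvalidSignature
|   from cryptography.hazmat.primitives.asymmetric.ed25519 import (
|       Ed25519PrivateKey,
|   )
|
|   # Publisher side: generate a key pair and sign the video bytes.
|   private_key = Ed25519PrivateKey.generate()
|   public_key = private_key.public_key()
|
|   # "broadcast.mp4" is a hypothetical file name.
|   video_bytes = open("broadcast.mp4", "rb").read()
|   signature = private_key.sign(video_bytes)
|
|   # Viewer side: verify against the published public key;
|   # verify() raises InvalidSignature if the bytes were altered.
|   try:
|       public_key.verify(signature, video_bytes)
|       print("video matches the official signature")
|   except InvalidSignature:
|       print("video does NOT match the official signature")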
| bobthepanda wrote:
| is this regulating every gpu? this simply says there are
| legal consequences for aiding fraud.
|
| laws that say murder is illegal are not regulating knives.
| jmorrice wrote:
| When dealing with Western actors, this is symmetric warfare. If
| you are a political party and the opposition impersonates you,
| you could do the same back.
|
| What concerns me is the asymmetric threat, such as that posed
| by North Korean dollar groups. For the uninitiated, this is a
| real threat, but let's for a moment not think about the
| silliness of present-day Communists getting extremely ruthless
| about stacking cash.
|
| The model I'm concerned about is, say you have an NK hacker
| group and they make a very, very convincing video of a CEO
| doing something embarrassing (shout out to a former UK PM's
| alleged porcine initiation ritual) with a view to making cash.
|
| These people are focussed on the bottom line. They could
| structure their extortion demand to be F.O. money and get paid
| with little fuss. And do it over and over.
|
| On the one hand, the replicability of such attacks concerns me.
| On the other, I have been considering a future where we are
| embarrassed or exposed to embarrassing content on an industrial
| scale.
|
| Embarrassment is a social concept that we all deal with, and deal
| with it we do. It could be that the AI impersonation mess gets so
| bad we all become inoculated to this type of content because
| virtually everyone notable has become a victim already. Could it
| become the cost of doing business?
| ranger_danger wrote:
| Ctrl+F "crypto", not found.
|
| Nothing to see here, moving on.
| Animats wrote:
| This is the controversial provision, the "means and
| instrumentalities" clause. Existing law covers people running
| impersonation scams. The big question is, what responsibilities,
| if any, do sellers of tools have? The draft language:
|
| _§ 461.5 Means and Instrumentalities: Provision of Goods or
| Services for Unlawful Impersonation Prohibited._
|
| _It is a violation of this part, and an unfair or deceptive act
| or practice to provide goods or services with knowledge or reason
| to know that those goods or services will be used to:_
|
| _(a) materially and falsely pose as, directly or by implication,
| a government entity or officer thereof, a business or officer
| thereof, or an individual, in or affecting commerce as commerce
| is defined in the Federal Trade Commission Act (15 U.S.C. 44);
| or_
|
| _(b) materially misrepresent, directly or by implication,
| affiliation with, including endorsement or sponsorship by, a
| government entity or officer thereof, a business or officer
| thereof, or an individual, in or affecting commerce as commerce
| is defined in the Federal Trade Commission Act (15 U.S.C. 44)._
|
| It's the "with knowledge or reason to know" clause that's key
| here. Various industry parties have already commented on this,
| some wanting stronger language there to protect sellers of
| general purpose tools for creating content.
|
| Sellers of automated outbound marketing tools which can be used
| to deliver impersonation scams might be caught up by this.
___________________________________________________________________
(page generated 2024-02-19 23:00 UTC)