[HN Gopher] A "deepfake" is at the center of a harassment case, ...
___________________________________________________________________
A "deepfake" is at the center of a harassment case, but what if
it's not faked?
Author : danso
Score : 107 points
Date : 2021-04-08 18:21 UTC (4 hours ago)
(HTM) web link (www.dailydot.com)
(TXT) w3m dump (www.dailydot.com)
| danso wrote:
| Thread on initial news story:
| https://news.ycombinator.com/item?id=26447471
| crazygringo wrote:
| This is no different from whether a letter was or wasn't faked,
| or an e-mail, or whether a signature was forged or not.
|
| You just get an expert witness (or opposing ones) to testify as
| to the likelihood of something being forgery or not and why, as
| well as take things like motivation, means, etc. into account.
|
| There's literally nothing different about deepfakes. We have a
| well-established court system for handling this.
| lolsal wrote:
| > There's literally nothing different about deepfakes.
|
| That's a bit disingenuous don't you think?
| mensetmanusman wrote:
| " "Scary though if it isn't a deepfake," Shamook added. "This
| could be the start of people being able to discredit real
| evidence by simply saying 'deepfake.'" "
|
| This.
|
| I have been on a jury, and I wouldn't be surprised if
| image-based evidence starts being routinely discounted by
| defense attorneys. It probably is already starting with the
| highly paid ones (e.g. the notion itself will protect the
| rich).
|
| It brings up an interesting future where you can't trust any
| witness testimony (because people do misremember) or any video,
| image, or audio evidence due to this technology.
|
| The only societal workaround will be to implant cryptographic
| keys inside of bodies and monitor all interactions on some
| blockchain /s?
| woah wrote:
| It's always so odd when people assume that there were no
| functioning legal systems during the tens of thousands of years
| of human history before photography was invented.
| buran77 wrote:
| The question is how effective that justice system was, not
| whether it existed or not. It's the same with medicine, it
| existed for quite some time but the effectiveness increased
| drastically and you wouldn't want to roll back all that
| progress.
|
| The reality is that for most of those tens of thousands of
| years justice systems were mostly a sham where the focal
| point wasn't actually determining innocence but rather the
| appearance of justice and pleasing (some) people.
| tshaddox wrote:
| Or now, for the vast majority of cases where direct
| photographic evidence would be useful but does not exist.
| xmprt wrote:
| People lived before the Internet existed but it would be a
| huge setback if we lost it. Similarly, advancements in
| technology have allowed for our justice systems to be a lot
| better and it would be a real shame if we lost that because
| of deepfakes. I'm not saying it will go that far, but it's
| totally normal to be wary of it.
| drdeadringer wrote:
| I remember an episode of the Olde Time Radio show "The Shadow"
| where Lamont and Margo attend a trial as audience members. This
| is in the 1930s // 1940s for context.
|
| The prosecution projected a film with audio [cutting edge
| technology at the time] demonstrating the crimes in question
| [bribery and related charges].
|
| Lamont ['The Shadow'] was able to hear the audio and read lips
| at the same time, and figured out that what was heard was not
| what was said. "Believe half of what you see, and nothing you
| hear" was his takeaway.
|
| As the story goes, that's what happened. The film shown in
| court was dubbed over by a hired guy who did great vocal
| impersonations. The accused was innocent.
|
| Short story long: deep fakes aren't a new idea. 100 years and
| counting. Same trick, different tools.
| nickysielicki wrote:
| The phrasing of this rubs me the wrong way -- I get the feeling
| that you're more afraid of guilty people being presumed
| innocent than you are of innocent people being framed by
| deepfakes.
|
| Our justice system worked for a very long time in an
| environment where cell phones didn't exist, where cities
| weren't covered end-to-end in CCTV cameras, and our courts were
| able to do a decent job of protecting the innocent and putting
| away the bad guys. Distrust of video evidence only takes us
| back 30 years, to when we weren't so monitored. It's
| something that we can survive. On the other hand, if we don't
| see an outright rejection of deepfakes in our courts, we're
| going to see it used as a political weapon. We can survive the
| former, but not the latter.
|
| Unfortunately, I don't see it working out that way. Once courts
| and police get their hands on a new technology and it becomes
| sufficiently ingrained, it doesn't matter how little sense it
| makes. Drug dogs are bullshit, but they're still everywhere.
| [1] DNA evidence can be synthesized and planted, but it's the
| gold standard in terms of courtroom evidence. [2]
|
| Given all of this, I don't have a lot of faith in our justice
| system getting deepfakes right, and that worries me a lot more
| than guilty people being presumed innocent.
|
| [1]:
| https://www.washingtonpost.com/opinions/2019/02/05/supreme-c...
|
| [2]: Good reason to avoid giving your DNA to 23andMe or to the
| state. Unfortunately, both my parents did 23andMe, so I'm
| screwed regardless.
| https://www.nytimes.com/2009/08/18/science/18dna.html
| Daho0n wrote:
| >Drug dogs are bullshit, but they're still everywhere. [1]
|
| Just because some abuse them and have no proper training
| doesn't mean they are fake. You clearly have never seen a
| police dog work. They are able to find dead people, drugs,
| etc. that are at the bottom of the sea, or follow the path a
| target took while inside a car. What is BS is the "police"
| and justice system in the US.
| samatman wrote:
| This is a misunderstanding of why "drug dogs are bullshit".
|
| Yes, dogs are capable of sniffing out people, animals, and
| substances. This has been made use of for centuries.
|
| Dogs are also capable of "hitting" on the trunk of a car,
| just because they guess that their master wants to look
| inside it. Plenty of evidence that this happens all the
| time.
|
| Which undermines the right protecting against unreasonable
| search and seizure. All you have to do is be the wrong
| color, or driving in a "suspicious" way (which means
| whatever police want it to mean), and they can bring a K9
| by, the dog will dutifully point at the trunk, and the cops
| get their search.
|
| It's no different from the "I smelled pot" routine: there
| aren't any penalties to the police for doing an "I smelled
| pot" search and not finding cannabis, and there are no
| consequences for a K9 'hitting' on a trunk which turns out
| to have nothing more interesting than a gym bag.
| rStar wrote:
| Drug dogs' usefulness to law enforcement is, almost
| exclusively, as machines to manufacture probable cause
| on command. As this is how they're used, "bullshit" is far
| too kind a descriptor.
| Enginerrrd wrote:
| Just because one drug dog might actually work doesn't mean
| that, on average, they and their handlers are performing to a
| standard sufficient for probable cause.
| rossdavidh wrote:
| I don't think the history of how courts (and juries) have
| handled eyewitness testimony is all that different from how
| they handle DNA evidence. It's regarded as far more reliable
| than it is.
| Klinky wrote:
| The track record of the justice system has been extremely
| mixed for many, and downright awful for minorities. DNA
| testing has been absolutely crucial in the last 30 - 40 years
| for undoing some of the damage the justice system caused
| before DNA testing was adequately available. There is no
| comfort to be found in regressing to a state where forensic
| science is easily dismissed and we're reliant on who
| can buy the most prestigious expert witness to testify for
| them.
| Rule35 wrote:
| Courts and police have still done vastly more for
| minorities than against them.
| fxtentacle wrote:
| There already are services that claim to be able to verify the
| authenticity of images. And so far, I've managed to fool all of
| them with good CGI videos injected through MIPI into an Android
| system.
|
| I remember that TruePic was especially awful, because their app
| fell for me just taking a photo of a printed-out image. Plus,
| their website overflows with blockchain and crypto buzzwords.
| Yajirobe wrote:
| > There already are services that claim to be able to verify
| the authenticity of images. And so far, I've managed to fool
| all of them with good CGI videos injected through MIPI into an
| Android system.
|
| Not to mention adversarial attacks will probably always
| exist, so there will always be a way to fool these systems.
| jmcqk6 wrote:
| I don't think this is new at all. The specifics might be new,
| but the problem is quite old, and has been with us for
| thousands of years.
|
| I think the danger is in trying to come up with an abstract,
| general solution for it. I don't think one exists.
|
| I think the only thing we can do is deal with things on a case
| by case basis in these situations.
| pvarangot wrote:
| I thought that juries are instructed to decide when it's
| "beyond a reasonable doubt"? So if the defense wants to
| discredit video evidence, they have to make a case that that's a
| reasonable doubt: probably have experts review the video and
| provide at least a sketch of the motive behind spending
| resources on forging evidence like that. Also, I understand that
| in most cases that go to trial there's more than just video
| evidence.
| Ma8ee wrote:
| It would surprise me if there aren't cases where the only
| evidence is security camera footage.
| throwaway0a5e wrote:
| >It brings up an interesting future where you can't trust any
| witness testimony (because people do misremember) or any video,
| image, or audio evidence due to this technology.
|
| That future sounds a lot like the past where we couldn't trust
| testimony and audiovisual evidence was practically nonexistent.
| paxys wrote:
| IMO it's a step in the right direction. Photos and videos
| _should_ at minimum be signed /cryptographically verifiable if
| they are used as key testimony in court. We have had the tools
| to easily and convincingly fake photos for decades now. I can
| bet there are people behind bars today because someone got
| creative with photoshop.
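The signing idea above can be sketched in a few lines. This is a hypothetical illustration only: a real deployment would use public-key signatures (e.g. Ed25519) with the private key in a secure element, but stdlib HMAC stands in here, and the device key is invented.

```python
import hashlib
import hmac

# Hypothetical sketch: a camera tags each capture so later edits are
# detectable. HMAC-SHA256 stands in for real public-key signatures.
DEVICE_KEY = b"secret-key-burned-into-the-camera"  # invented key

def sign_capture(image_bytes: bytes) -> str:
    """Return a hex tag binding the image bytes to the device key."""
    return hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, tag: str) -> bool:
    """Constant-time check that the bytes were not altered since signing."""
    return hmac.compare_digest(sign_capture(image_bytes), tag)

original = b"\xff\xd8...raw jpeg bytes..."
tag = sign_capture(original)
assert verify_capture(original, tag)             # untouched: verifies
assert not verify_capture(original + b"x", tag)  # any edit breaks the tag
```

Note the whole scheme rests on `DEVICE_KEY` staying secret, which is exactly the point of contention in the replies below.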
| thebean11 wrote:
| If I'm an end consumer taking pictures and videos for fun /
| posterity what's the incentive for me to have them
| cryptographically signed? Why would I ever need to prove
| they're real?
|
| Are you saying that by law devices should cryptographically
| sign pictures?
| ben0x539 wrote:
| Hm, I'm out of the loop here, who would sign the photos? The
| camera app?
| AnthonyMouse wrote:
| > Photos and videos _should_ at minimum be signed
| /cryptographically verifiable if they are used as key
| testimony in court.
|
| This kind of "verification" doesn't work. You're trying to
| have the device authenticate the image. That implies a root
| of trust that goes through the device manufacturer, which
| state actors will immediately compromise, making the
| signatures untrustworthy regardless. Then people will find
| vulnerabilities in the devices themselves, or convincing ways
| to have the device take a picture of a doctored photograph
| instead of a picture of the world, extending the ability to
| forge signatures to the general public.
|
| The resources needed to forge a signature aren't going to be
| much different than the resources needed to create a deepfake
| to begin with. The creator gets to choose which device to use
| and can choose based on which device they find a
| vulnerability in. So it's more likely to be a liability than
| a benefit because it lends false credence to the authenticity
| of fakes.
|
| For it to work you would need all devices to be secure
| against all attackers. It's not realistic.
| chrisco255 wrote:
| > The resources needed to forge a signature aren't going to
| be much different than the resources needed to create a
| deepfake to begin with.
|
| How? If you have a public/private key pair and then do some
| kind of multisig process wherein the manufacturer can sign
| the photo, the OS can sign the photo, and then a user
| supplied signature could also sign the photo. You're not
| going to be able to fake 3 cryptographic signatures without
| 3 private keys. It would be extremely difficult to fake all
| of that. It would take a concerted, concentrated effort and
| would be extremely rare in practice. It would certainly
| improve the situation.
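A rough sketch of the three-signature idea, with all party names and keys invented; HMAC again stands in for the asymmetric signatures a real scheme would use:

```python
import hashlib
import hmac

# Hypothetical multi-party tagging: manufacturer, OS, and user each hold
# an independent key and each tags the same photo hash. All keys invented.
KEYS = {
    "manufacturer": b"mfg-key",
    "os": b"os-key",
    "user": b"user-key",
}

def multisign(photo: bytes) -> dict:
    """Each party independently tags sha256(photo)."""
    digest = hashlib.sha256(photo).digest()
    return {who: hmac.new(key, digest, hashlib.sha256).hexdigest()
            for who, key in KEYS.items()}

def verify_all(photo: bytes, sigs: dict) -> bool:
    """All three tags must verify; forging one is not enough."""
    expected = multisign(photo)
    return all(hmac.compare_digest(expected[who], sigs.get(who, ""))
               for who in KEYS)

photo = b"fake-photo-bytes"
sigs = multisign(photo)
assert verify_all(photo, sigs)
sigs["os"] = "0" * 64           # forge one of the three
assert not verify_all(photo, sigs)
```

The rebuttal that follows is that in practice all three keys must be usable on the same device, so compromising the device compromises all of them at once.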
| AnthonyMouse wrote:
| Once you compromise the device it doesn't matter how many
| keys there are because they all have to be on the device.
| titzer wrote:
| As a user, why does my private key need to be on the
| device? Especially, because I wouldn't want it there,
| because devices get stolen.
| AnthonyMouse wrote:
| It has to be where the camera is. Otherwise the device
| holding your private key would need some way to
| authenticate photos from the device with the camera,
| which was the original problem.
| chrisco255 wrote:
| First off, you'd have to steal the actual device. Then
| you'd have to break the device encryption itself to
| extract an encrypted version of the private key (the
| private key itself can be encrypted with a password not
| stored on the device). You pair the 3 sigs plus a
| blockchain to track timestamps and photo hash signatures
| and maybe pair that with a different blockchain app to
| report lost or stolen devices...and you've got yourself
| as secure of a system as you can conceive in the age of
| deep fakes.
| xyzzy123 wrote:
| One threat is that a malicious actor uses a compromised
| device to fake a "legitimate" verified image.
|
| This image can now be injected into the news cycle and be
| taken seriously.
|
| Keeping a secret on the camera is more or less the
| "jailbreak problem", aka being able to make a dvd player
| or console that only does the things that the
| manufacturer wants.
|
| I will be pretty sad if this problem is actually "solved"
| effectively because it portends a future where
| manufacturers can completely curtail user freedom. But...
|
| If the key on the device is compromised by reverse
| engineering or because the manufacturer is compromised
| then the whole root of trust is gone.
|
| Attackers can choose the weakest brand of camera.
|
| You now have some doubt about _any_ image that has been
| "verified" because it could have come from a compromised
| camera secret.
|
| All the rest of the stuff, the blockchain, etc depends on
| the root of trust in the camera.
|
| You also now have debates around camera country of origin
| because there is no near future world where governments
| can't lean on local manufacturers. Finally, doubt can be
| thrown on images retroactively if we later learn that a
| camera we thought was secure has vulnerabilities.
| nharada wrote:
| > The resources needed to forge a signature aren't going to
| be much different than the resources needed to create a
| deepfake to begin with
|
| I imagine deepfakes will become way easier in the near
| future because there are legitimate commercial reasons to
| make them easy and accessible (for example digital art
| assets). I don't see the same happening for cracking
| devices, which is more a standard security cat and mouse
| game between hackers and device manufacturers.
| AnthonyMouse wrote:
| All you're saying is that creating deepfakes would get
| easier, not that cracking devices would get harder. And
| it's already not hard enough to be a significant barrier.
| tshaddox wrote:
| You've explained why it wouldn't work flawlessly, but that
| doesn't mean it wouldn't work better than the current state
| of affairs. Video evidence is already useful in the legal
| system despite "video evidence" obviously not being a
| flawless system. A state actor or even a competent video
| effects studio with a modest budget could already create a
| convincing fake video!
| Asraelite wrote:
| One possibility would be for all recording devices to
| immediately upload checksums of their recorded footage to
| multiple trusted online authorities upon recording.
|
| If someone presents altered footage, you can show that the
| checksum of the original was uploaded earlier.
|
| This of course only covers certain cases. If the deepfake
| is crafted from scratch or from a camera that doesn't
| upload its checksums, and the incentive for the creation of
| the deepfake arises before the supposed time the deepfake
| takes place, then this method won't be of use.
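The checksum-registration scheme might look like this in miniature; the "authority" here is just an in-memory dict standing in for the trusted online services the comment imagines, which would really return signed, timestamped receipts:

```python
import hashlib
import time

# Toy registry: upon recording, the device publishes sha256(footage).
# A first-seen timestamp lets the original outrank any later alteration.
authority_log: dict[str, float] = {}

def register(footage: bytes) -> str:
    """Record the checksum with a first-seen timestamp; return the digest."""
    digest = hashlib.sha256(footage).hexdigest()
    authority_log.setdefault(digest, time.time())
    return digest

def was_registered_before(footage: bytes, t: float) -> bool:
    """True if this exact footage was registered at or before time t."""
    digest = hashlib.sha256(footage).hexdigest()
    return digest in authority_log and authority_log[digest] <= t

clip = b"original footage"
register(clip)
later = time.time() + 1
assert was_registered_before(clip, later)            # original checks out
assert not was_registered_before(b"altered", later)  # edit has no early record
```

As the comment itself concedes, this only helps when the genuine footage was registered before the dispute arose.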
| AnthonyMouse wrote:
| Opt out and it's an invasion of privacy, opt in and
| nobody does and it isn't applicable to 99% of authentic
| footage.
|
| And subject to attack. Don't want to be recorded? Jam
| wireless so they can't have anyone sign their videos,
| then claim they're fake.
|
| Also doesn't work for state actors. A country can
| compromise three separate signing authorities.
| jjk166 wrote:
| Chain of custody is standard for evidence already. It might
| not be 100% infallible, but it certainly works better than
| nothing.
|
| Yes, if a state actor wants to create evidence to frame
| you, and they didn't care about cost or returns, they could
| do so. But they could always do so. The fact is most of us
| are never going to get framed by a nation state.
|
| The notion that you're either secure against all possible
| attacks or nothing is absurd.
| moftz wrote:
| I guess the weak point in all of these techniques to prove
| a photo was taken by a certain device, at a certain time,
| in a certain place is for someone wearing a very good
| disguise to publicly commit crimes in your name. Any
| witnesses or photo evidence is going to point to you. A
| full movie costume makeup kit probably costs less than
| what it takes to hack Apple for someone's iPhone private
| key.
| AnthonyMouse wrote:
| Chain of custody implies you have some impartial
| authority who was actually there at the time to
| authenticate the recording. But then you would already
| have an impartial eyewitness. That doesn't work the other
| 99% of the time when the two parties to the case each
| tell a different story and one of them is offering
| evidence they have in their possession. The chain of
| custody then contains someone partial to the case and
| untrustworthy.
|
| > Yes, if a state actor wants to create evidence to frame
| you, and they didn't care about cost or returns, they
| could do so. But they could always do so. The fact is
| most of us are never going to get framed by a nation
| state.
|
| This is a major problem for elections. Nation states
| absolutely will doctor video of adversarial political
| candidates and for other propaganda purposes.
|
| > The notion that you're either secure against all
| possible attacks or nothing is absurd.
|
| But that's how signatures work. You're trusting every
| device in the world to sign the output of its camera. If
| the attacker can compromise any device, they can produce
| signed forgeries.
| jjk166 wrote:
| You're describing a situation where the recording ought
| not to be admissible anyways. If you can't trust where
| evidence came from, it's not evidence.
|
| Again, nation states have always been able to doctor
| video of adversarial political candidates. Look up the
| pictures where the KGB made someone disappear.
|
| Yes, if an attacker can compromise any device they can
| produce a signed forgery. But that's a really big if.
| There are a lot of potential attackers who can't
| compromise any and all devices, and signatures protect
| against them.
| astrange wrote:
| There are pretty good ways to detect photo editing, using
| things like measuring levels of JPEG artifacts across the
| image. And if you save pictures raw, I've never seen anyone
| try to fake a camera raw file.
| Ma8ee wrote:
| Why would it be harder or different to fake a raw file?
| astrange wrote:
| I didn't say it would be hard, just that I haven't seen
| one. But they contain a lot of information that can't be
| displayed on a computer monitor and so it would be a lot
| of work to check all of it.
| NortySpock wrote:
| Couldn't you just "rebalance" the JPEG artifacts as a final
| pass? Or some sort of slight blur that re-jumbles the least
| significant bits of every pixel?
| therealx wrote:
| Yes, this is standard in the forgery business - dust and
| speckle filter, drop the res, ensure the light is
| balanced, and maybe blur if you can get away with it.
| astrange wrote:
| These general techniques are weak against specific
| detection models that understand how the original image
| should've been generated. Another example is camera noise
| modeling since the injected noise won't be consistent
| with how a phone camera etc actually behaves.
|
| Changing the resolution could be effective though.
| Another cheap trick would be shifting the image 4 pixels
| over to break up the DCT artifact grid.
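The noise-modeling idea can be illustrated with a toy example: a spliced-in region that has been denoised or synthesized tends to show noise statistics inconsistent with the rest of the frame. The patches and noise levels below are invented for illustration; real detectors fit far richer sensor-noise models.

```python
import random
import statistics

# Two invented pixel "patches": one with sensor-like noise, one pasted in
# from an over-smooth (edited) source. Local variance tells them apart.
random.seed(0)

def patch_variance(pixels: list) -> float:
    """Population variance of a flat list of pixel values."""
    return statistics.pvariance(pixels)

# A patch straight off a sensor: base level 128 plus per-pixel noise.
camera_patch = [128 + random.gauss(0, 4) for _ in range(256)]
# A spliced-in patch: same brightness, but almost no noise after editing.
spliced_patch = [128 + random.gauss(0, 0.2) for _ in range(256)]

# The spliced region is conspicuously quieter than the rest of the frame.
assert patch_variance(spliced_patch) < patch_variance(camera_patch) / 10
```

This is also why the comment notes that resolution changes and pixel shifts are effective countermeasures: they disturb exactly the statistics such models rely on.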
| foepys wrote:
| Who does the verification? Who guards the keys? Who is
| responsible when keys go missing and "prove" false
| information?
|
| Just saying "we need to sign photos" isn't enough, and is
| simply naive.
| cabalamat wrote:
| Maybe something like Third Eyes (
| https://slatestarcodex.com/2013/05/08/raikoth-corruption-
| pri... ) could do the job.
| floren wrote:
| As a fan of film photography, I smile at a future in which a
| roll of developed negatives is valuable evidence because it's
| a lot harder to fake.
| grobbles wrote:
| There is a recent British Netflix series called "The Capture"
| that is pertinent to this discussion. A little absurd at times,
| but for anyone who wants some mental floss around this topic
| it's a very interesting show.
| throwaway17_17 wrote:
| The other aspect of discounting image evidence in this way
| would be to produce multiple conflicting versions and then
| argue a jury can never be sure which is real (where deepfake
| quality is comparable to the law enforcement original). This
| would face a hurdle to admission in a criminal case, but I
| would be able to very easily argue for an acquittal if it were
| presented to the jury.
| panzagl wrote:
| Doesn't matter where the material came from, there was still
| harassment.
| joe_the_user wrote:
| This seems like the main point. Defamation can include true
| statements in a number of situations, notably when someone
| isn't a public figure.
|
| https://en.wikipedia.org/wiki/Defamation
| jcranmer wrote:
| In the US, truth is an absolute defense to defamation: if the
| defendant can show that the alleged statement is true, the
| defamation case fails right then and there. The difference
| between public and non-public figures with respect to
| defamation is that the statement needs to be made with
| "actual malice" if the figure is public (i.e., knew it was
| false or recklessly disregarded the falsehood).
| willcipriano wrote:
| Watch the embedded video. Either some random woman with no
| apparent background in this work made the most convincing deep
| fake I've ever seen, or that young woman was caught on film
| smoking something.
|
| If it's really fake, the folks over at Sassy Justice should
| hire her; their stuff looks like garbage in comparison.
| SolarNet wrote:
| I mean, it's a pretty classic trick to mix real stuff in with
| fake stuff. This video was probably real, but one of the videos
| in question is apparently fake.
| whimsicalism wrote:
| Yeah, having watched that, there's no chance that is a
| "deepfake" and likely is not a fake at all.
| bigtones wrote:
| A young woman was caught on film vaping; it is inconclusive to
| me that it was THAT young woman.
| willcipriano wrote:
| That is certainly possible but I'd personally bet that her
| own mother would know instantly if that was the case.
| _Microft wrote:
| Well, there were people mistaking pictures of others for
| pictures of themselves already...
|
| https://www.dpreview.com/news/7021408195/hipster-offended-
| af...
| henriquez wrote:
| The story here isn't about harassment or the technical chops of
| some bitter mom. This is the perfect way for the Mockingbird
| Media to introduce the concept of a "deepfake" to the public. A
| video, followed by claims it's fake, followed by claims it might
| not be fake. The end result is confused and more easily
| manipulated masses who need to trust authorities to vet all
| information they receive, an abusive but codependent
| relationship.
| ed25519FUUU wrote:
| Whenever I think about politicians lamenting deepfakes, I
| can't help but wonder if they're attempting to get ahead of
| the Jeffrey Epstein news. Epstein allegedly ran a blackmail
| operation by recording people in "compromising" situations on
| his island and ranch (and probably other places).
|
| "That wasn't me on camera. Must be a deepfake!"
| bentcorner wrote:
| > the Mockingbird Media
|
| Can you please not. It's hard to take someone's argument
| seriously when this kind of name-calling is intermingled with
| it. It's similar to someone writing m$ft or unironically
| referring to Zuckerberg as a robot.
| depingus wrote:
| I think Trey Parker and Matt Stone did the best introduction
| to deep fakes. https://www.youtube.com/watch?v=9WfZuNceFDM
| udhdhxnxn wrote:
| It is also a great opportunity to reveal blackmail inflation
| and the impact on controlled elites.
| macawfish wrote:
| I looked this up and was not disappointed. Thanks for
| sharing!
| [deleted]
| brippalcharrid wrote:
| I'd love for this information to be released now, but
| wouldn't publishing cryptographic hashes of the material now
| (or a decade ago) prevent it from being dismissed as
| deepfakery if it's released in the future?
| phreeza wrote:
| Is this a thing?
| joe_the_user wrote:
| _The end result is confused and more easily-manipulated masses
| who need to trust authorities to vet all information they
| receive, an abusive but codependent relationship._
|
| On the one hand, yeah, it's definitely more how the press likes
| a human interest story mixed with whatever is flavor of the
| month (the techno wow of "deep fakes").
|
| But on the other hand, the tone you take in that last sentence
| is really off-kilter. The press just works in a kind of
| hysterical fashion; moral panics (even conspiracy theories)
| aren't conspiracies but natural. Attributing stuff to a
| malicious effort to make the public more manipulable (as if
| that were necessary) should be dropped in favor of the natural
| incompetence of the press in framing issues reasonably.
|
| Edit: "Mocking Bird" above is a phrase referencing CIA
| manipulation of the media. This is a bit of over the top for
| just some rather screechy but pretty ordinary story.
| Natsu wrote:
| The GP's argument doesn't make any sense to me. The reporting
| was good for a change. It gives named, verifiable sources who
| are legitimate experts on the subject and several of them, it
| tells us where to find the video in question [1] and gives us
| the basis for their opinion (e.g. no visual artifacts in the
| smoke or when the hand goes in front of the face).
|
| All of this can be confirmed by a responsible news consumer
| and everything I can find here checks out. So this is legit
| investigative news and precisely the kind of thing we should
| encourage more of.
|
| I am aware of Project Mockingbird [2] but I cannot see any
| relation between it and this report. It makes no sense
| whatsoever that the CIA would care about some squabble over
| unknown cheerleaders. It makes no sense to believe that
| everything is or isn't a deepfake without examining the
| facts. And this article, unlike so many I have seen, is an
| example of news done right. Given how disinformation works,
| this is the opposite of it. This article was written to
| inform, not to inflame our passions.
|
| In short, it's the polar opposite of what I would expect from
| disinformation and this is the kind of good reporting that
| should be rewarded.
|
| [1] It should've actually linked it, IMHO, but I won't
| complain because it wasn't hard to find this video @0:49 -
| https://www.youtube.com/watch?v=F5I1RfxehT4
|
| [2] https://en.wikipedia.org/wiki/Project_Mockingbird
| readflaggedcomm wrote:
| A "mockingbird" also sings late into the night and mimics
| other birds, which can be annoying and throw the blame on
| others. It's a natural metaphor.
| redindian75 wrote:
| 'mockingbird media' is a QAnon term, like "deep state".
| readflaggedcomm wrote:
| The etymologies disagree, but don't let a good conspiracy
| theory go to waste when you can use it to label people.
| henriquez wrote:
| Qanon didn't invent Operation Mockingbird. The CIA did.
| NOGDP wrote:
| > Attributing stuff to a malicious effort to make the public
| more manipulable (as if that was necessary) should be dropped
| in favor of the natural incompetence of the press in framing
| issues reasonably.
|
| Except we've known for a long time that the mainstream press
| works, very intentionally, to manipulate public opinion.
| joe_the_user wrote:
| The press has influence. The press has always had
| influence. In a society involving mass media, the press
| can't not-have influence.
|
| We might talk about what form a society would have to have
| for the press to not have great influence - maybe "very
| strong education and civil institutions" or "directly
| democratic workers' councils" but if we're going have a
| modern capitalist society with multiple poles of elites and
| atomized consumers, the press, the corporations, the state,
| and the highest professionals will all have disproportionate
| influence.
|
| Which is to say, now, all the different press outlets
| manipulate, sometimes in distinct and opposite ways,
| sometimes in agreement with each other. And all the various
| economic and political institutions manipulate.
|
| And the thing with this manipulation is that it doesn't
| require a crazy plan like the ggp/op insinuates. They don't
| need to intentionally create "deep fake" as a vague threat;
| human psychology just naturally drifts that way, including the
| psychology of the reporters themselves. And manipulation for a
| specific purpose just requires the poor framing of ideas that
| can be leveraged whenever you need it. So yeah, the press
| certainly manipulates, but it doesn't generally have a "master
| plan" of manipulation. That wouldn't help (not that it hasn't
| been tried).
| sushisource wrote:
| Probably sometimes. Intuitively, it's pretty damn obvious it's
| not _all_ the time. And based on what evidence? What a huge,
| sweeping claim to just drop in one sentence.
| anonu wrote:
| I do agree with you. But this specific article seems to
| actually question the media:
|
| > Even more troublesome, it seems that no media outlets
| attempted to verify that the vaping footage was actually a
| deepfake before broadcasting the claim to millions of
| Americans.
|
| > The possible misstep by the media, and the chance that the
| criminal justice system could misapply laws against videos
| mischaracterized as deepfakes, is what worries experts the
| most.
| im3w1l wrote:
| > While prosecutors have not provided specifics, tools designed
| to make women appear nude in images with the aid of artificial
| intelligence (AI) do exist.
|
| They may be able to generate a "plausible" nude, suitable for
| rubbing one out, but it's not a real nude. Moles and scars won't
| match. The data is just not there, so the manipulator has to make
| something up. Thus I think the nudes are actually the most
| promising for conclusively proving or disproving a manipulation.
| lazide wrote:
| Hard to disprove without stripping naked for a court witness
| though, which is a pretty high degree of 'shitty' that would be
| easy to cause. Especially if you used a ton of stolen bikini
| photos to generate the deep fake. Not a lot of skin that
| wouldn't exactly match, and the parts that don't would
| definitely not be for public consumption.
| lurquer wrote:
| Don't know why you're getting downvoted.
|
| This happens.
|
| Particularly in rape cases where the woman (or child)
| describes some particular feature on the alleged rapist's
| genitals.
|
| Michael Jackson had to strip down, as I recall. Obviously not
| in open court.
| jjk166 wrote:
| You don't need to strip naked, you just need to show one
| discrepancy. Cut a hole in a pair of cheap jeans where you
| have a mole that's not in the video.
| Karawebnetwork wrote:
| The video in this story seems to be quite low resolution and
| from a distance. Scars and moles are pretty much invisible in
| this situation.
| jjk166 wrote:
| > "The videos or the images already existed in some form and then
| our allegation--this is of course subject to proof at trial,
| she's innocent until proven guilty--but our allegation is that
| Mrs. Spone took existing images from existing social media from
| these three victims' existing social media accounts and
| manipulated them," Weintraub said.
|
| > The statement is similar to claims made by Engelhart to the
| Daily Dot that Spone allegedly used legitimate video as a basis
| for the deepfake. Police did not know whether the alleged
| victim's face was added to a video of another young female vaping
| or if a vaping pen and smoke was digitally added to a legitimate
| video of the alleged victim.
|
| If you're saying that she started with one video or image that
| was publicly accessible and modified it, I'd expect them to have
| the original video or image so they could tell what was added.
| ALittleLight wrote:
| Weintraub also stated during the press conference that
| investigators determined that the vaping video had been
| manipulated after analyzing its "metadata," a term which refers
| to embedded information in digital media that can reveal how and
| when it was last edited.
|
| But after contacting Reiss, the officer who investigated Spone,
| the Daily Dot learned that police never actually obtained the
| original vaping video. Instead, like what was seen on NBC News,
| police only had access to a cellphone recording taken by Spone of
| the vaping video being played on a separate device. Any metadata
| analysis would therefore fail to include information on the
| source video.
|
| In response to questions on how it could have been determined
| that the vaping footage was manipulated without access to the
| original video, Reiss argued that he could see with his "naked
| eye" elements that "don't make sense."
|
| -----
|
| So... They can tell by the metadata on a copy of the file plus a
| naked eye analysis of the video that this is a deep fake? That's
| much worse evidence for the deep fake case than I assumed there
| would be.
|
| This does make me wonder though: Should we default to believing
| video or doubting it? Does the victim here need to prove it is a
| deep fake or the perpetrator prove it's not?
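| For what it's worth, the "metadata" in question is typically a
| container-level field like the MP4 creation timestamp, which a
| phone recording of a screen simply doesn't inherit from the
| source video. A minimal stdlib-only sketch of reading that
| field (the function name and the assumption of a well-formed
| MP4 are mine):

```python
import struct
from datetime import datetime, timedelta, timezone

# MP4/QuickTime timestamps count seconds from this epoch, not 1970.
_MP4_EPOCH = datetime(1904, 1, 1, tzinfo=timezone.utc)

def mp4_creation_time(stream):
    """Return the creation timestamp embedded in an MP4's moov/mvhd box,
    or None if the container carries no such field. Sketch only: assumes
    well-formed top-level boxes, ignores 64-bit box sizes."""
    def boxes(fh, end):
        # Yield (type, payload_offset, payload_size) for boxes in [pos, end).
        while fh.tell() < end:
            header = fh.read(8)
            if len(header) < 8:
                return
            size, kind = struct.unpack(">I4s", header)
            if size < 8:
                return
            yield kind, fh.tell(), size - 8
            fh.seek(size - 8, 1)  # skip the payload we just described

    stream.seek(0, 2)
    file_end = stream.tell()
    stream.seek(0)
    for kind, offset, size in boxes(stream, file_end):
        if kind == b"moov":
            stream.seek(offset)
            for inner, ioff, _ in boxes(stream, offset + size):
                if inner == b"mvhd":
                    stream.seek(ioff)
                    version = stream.read(4)[0]  # 1 version byte + 3 flag bytes
                    # version 0 stores 32-bit seconds; version 1 stores 64-bit
                    secs = int.from_bytes(stream.read(8 if version else 4), "big")
                    return _MP4_EPOCH + timedelta(seconds=secs)
            return None
    return None
```

| A re-recorded copy either lacks this field or carries the
| recording phone's own timestamp, which is why no analysis of
| the copy can speak to the original's edit history.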
| ASalazarMX wrote:
| > In response to questions on how it could have been determined
| that the vaping footage was manipulated without access to the
| original video, Reiss argued that he could see with his "naked
| eye" elements that "don't make sense."
|
| Never expected to see a version of the "I can tell from some of
| the pixels and from seeing quite a few shops in my time." meme
| used as legal testimony.
| vmception wrote:
| lol, a video being manipulated doesn't mean it's a deepfake
|
| but I suppose it will follow the same standard as sexual
| harassment and rape victims: if they've ever altered the
| truth in the past, then their current experience is
| invalidated for the purpose of using a court to reduce
| freedom for someone else
|
| so videos will have to be closest to raw sensor data or
| inadmissible
| aaron695 wrote:
| The fact Sassy Justice is meta says a lot about where deep fake
| tech is at.
|
| When CNN starts using deep fake tech to show Trump throwing a
| bowl of fish food into a pond, then it's reached maturity.
| epanchin wrote:
| This person allegedly created a deepfake video that simulated a
| child nude, and sent it to someone else. Why was she not charged
| with creating and distributing child pornography?
| nightpool wrote:
| So just to summarize:
|
| 1. A cheerleader's mom anonymously texted some images and a video
| of a rival cheerleader to her coach. One of these images was a
| deepfake--a swimsuit photo off of Facebook edited to make the
| subject look nude. The other images and the video (of the girls
| drinking and smoking), were real.
|
| 2. The girl and her mom were charged with cyber harassment and
| harassment. The prosecutors weren't entirely clear on which of
| the photos + video were real and which were edited--the victim
| claimed they were all fake--but the whole thing was generally
| classed as harassment. Later, the police claimed that "metadata
| analysis" proved the video was fake.
|
| 3. ABC (among others, but ABC is mentioned in this article) ran a
| TV spot about this story, repeating the prosecutor's assertion
| that the video was manipulated
|
| 4. Twitter user @HenryAjder ("Deepfake expert"), among others,
| pointed out the error
|
| 5. The DailyDot published this article
|
| This seems like a bit of a nothingburger? Any reason we're still
| talking about this? Nobody was "discrediting evidence"; the
| videos & photos were only evidence of one thing--that someone
| sent them to her coach in order to get her kicked off the team.
| Whether they were real or not was immaterial to the harassment
| that occurred.
| [deleted]
| swiley wrote:
| Wow, I was expecting assholes from 4chan, not an obsessive
| soccer mom.
| throwaway0a5e wrote:
| Technology trickles down:
|
| Nation states > Bigco > Smallco > Neckbeards > Karen
|
| (skipping some steps but you catch the drift).
|
| The metadata stuff the NSA was doing 20 years ago is what
| advertisers are doing today. SpaceX is doing stuff that was
| recently only the purview of nation states.
| macawfish wrote:
| The facts of the situation are definitely not that exciting but
| the significance of the story itself is pretty monumental. Just
| the fact that such ordinary people are caught up in public
| uncertainty about the authenticity of media depicting very
| ordinary human stuff, all because of the ubiquity of a bunch of
| < 10 year old tech... this is the tip of a massive very messy
| iceberg.
| zionic wrote:
| We're not that far off from every 14 year old being able to
| do this on their phone.
|
| The rising millennial generation is not ready for the chaos
| their progeny is about to unleash on the world.
| jjk166 wrote:
| > The statement is similar to claims made by Engelhart to the
| Daily Dot that Spone allegedly used legitimate video as a basis
| for the deepfake. Police did not know whether the alleged
| victim's face was added to a video of another young female
| vaping or if a vaping pen and smoke was digitally added to a
| legitimate video of the alleged victim.
|
| > Weintraub also stated during the press conference that
| investigators determined that the vaping video had been
| manipulated after analyzing its "metadata," a term which refers
| to embedded information in digital media that can reveal how
| and when it was last edited.
|
| > But after contacting Reiss, the officer who investigated
| Spone, the Daily Dot learned that police never actually
| obtained the original vaping video. Instead, like what was seen
| on NBC News, police only had access to a cellphone recording
| taken by Spone of the vaping video being played on a separate
| device. Any metadata analysis would therefore fail to include
| information on the source video.
|
| It appears the police are calling the video of the vaping a
| deepfake
| nightpool wrote:
| My mistake! I had hedged around that point in my original
| comment, but I removed it after reading the Twitter thread
| linked by the Daily Dot, where HenryAjder seemed to imply the
| opposite. I've updated my comment to clarify.
| DyslexicAtheist wrote:
| > This seems like a bit of a nothingburger?
|
| the takeaway here, regardless of the story being wrong, is
| that the damage deepfakes will cause might not be what self-
| proclaimed futurists predicted (e.g. rampant use of the
| technology as a means to subvert evidence) but the opposite:
| people will counter justified allegations by dismissing real
| evidence as deepfakes. So it's not the technology that does
| most of the damage but the mere hypothetical possibility of
| it, which is what makes these arguments available in the
| first place.
|
| Imagine any Karen or Kyle presented with video evidence who
| will soon scream "deepfake". And few saw this coming because
| we were more absorbed with hypothetical (but less realistic)
| scenarios.
|
| This is even more fascinating (and it was the simpler answer,
| yet most have missed it).
|
| https://cla.purdue.edu/academic/english/theory/postmodernism...
| danso wrote:
| I think it's interesting if law enforcement bungles a type of
| evidence that is almost certainly going to be more of a
| society-wide problem in the near future
| nightpool wrote:
| My impression is that law enforcement generally bungles every
| single type of evidence on a regular basis. Computer evidence
| is only notable because we're paying more attention to it.
| zionic wrote:
| Alternate: they bungle computer evidence just as much but
| society hasn't caught up to their ineptitude yet.
| qwerty456127 wrote:
| I'm just curious how many centuries will have to pass before we
| outgrow the bullshit society where being exposed smoking,
| drinking (but not harming anybody while drunk) and posing nude
| can harm you.
| ed25519FUUU wrote:
| Can someone fill me in here? Would an anonymous video with no
| chain of custody ever be admissible in court even without
| deepfakes? After all, there was always CGI.
| ceejayoz wrote:
| It would certainly be admissible.
|
| You could also certainly get an expert witness to give their
| opinion on whether that video is CGI or not.
| navaati wrote:
| You would also have to pay this expert witness... if you can.
| tiahura wrote:
| Yes. In order to have a photo admitted you just have to lay the
| foundation via a witness.
|
| To lay the foundation for admission for a photograph into
| evidence, ask questions such as these:
|
| Q. Mr. Witness, I'm handing you a photograph that's been marked
| Exhibit 2 for identification. What is depicted in that
| photograph?
|
| A. A stoplight at the intersection of 4th and Pine.
|
| Q. Is that photograph a fair and accurate representation of the
| stoplight at 4th and Pine as it existed on the day of the
| collision?
|
| A. Yes, it is.
|
| At this point, you can move for admission of the photograph
| into evidence.
|
| https://www.illinoistrialpractice.com/2004/11/foundational_q...
|
| On cross examination the other side has the right to try to
| discredit the witness: "Aren't you really blind, couldn't this
| be a deepfake, " etc. And, they have the right to put on an
| expert or other witness to testify that it's fake or the wrong
| picture or whatever. In the end, it all goes to the jury and
| they get to make of it what they will.
___________________________________________________________________
(page generated 2021-04-08 23:01 UTC)