[HN Gopher] Lucasfilm hires YouTuber who specializes in deepfaki...
       ___________________________________________________________________
        
       Lucasfilm hires YouTuber who specializes in deepfaking big-budget
       movies
        
       Author : thunderbong
       Score  : 292 points
       Date   : 2021-07-27 12:04 UTC (10 hours ago)
        
 (HTM) web link (www.theverge.com)
 (TXT) w3m dump (www.theverge.com)
        
       | marcodiego wrote:
       | To the people complaining that what you see is no longer
        | believable: there is a way out: signatures. It is time to start
        | pressuring public figures to digitally sign whatever they say.
        
         | 650REDHAIR wrote:
         | How would this work?
         | 
         | Plenty of videos are candid or shot and released by a 3rd
         | party.
         | 
         | This could work for press releases and such, but not for videos
         | with headlines like "CEO CAUGHT KICKING A BABY IN THE FACE" or
         | "UNDERCOVER INVESTIGATION: PRESIDENT ADMITS ALIENS ARE REAL".
        
         | jcrei wrote:
          | Hopefully this gains traction, but it can't be with a
          | "DocuSign"-type signature that has no legal validity. We
          | should champion Qualified Electronic Signatures.
        
         | Tepix wrote:
          | Judging by the European vaccination certificates, no one ever
          | verifies a signature.
         | 
         | I know there are cameras that digitally sign the data. However,
         | you hardly ever get to see raw footage. It is always edited.
        
         | WastingMyTime89 wrote:
         | > To the people complaining that what you see is no longer
         | believable
         | 
          | I don't understand the complaint. It's a movie: what you saw
          | never was believable.
          | 
          | It's trickery by design. Everything you see has been spliced
          | together from multiple takes, purposefully framed, lighted
          | and colorized. It's all fake but in a way so culturally
          | ingrained that people don't even notice the deceit anymore.
          | You think that continuous action you are watching is real?
        
           | sillyquiet wrote:
           | I could be wrong, but I think OP was speaking more about
           | filmed things that purport to be real-life events (i.e.,
           | evidence), rather than cinema.
        
             | UncleMeat wrote:
              | During the 2004 election, doctored photos of John Kerry at
             | an anti-vietnam-war protest circulated widely. They were a
             | simple cut-and-paste job. The world of "people will make
             | fraudulent media to sell a narrative" has already been here
             | for decades.
        
             | WastingMyTime89 wrote:
              | But it's the same. I think the fact that people are so
              | scared of deep fakes shows they are not critical enough of
              | what they are already shown. Images lie all the time.
        
               | sillyquiet wrote:
               | Eh, maybe. People watching cinema "know" it's not real,
               | even though propaganda is a thing I guess. Deepfakes
               | won't alter anything with regard to that though, whether
               | it's a deep-faked actor, a CGI monstrosity, or just a
               | look-alike actor in makeup.
               | 
                | Although you are right about video 'evidence': editing,
                | cuts, and carefully muted dialog can alter things to the
                | point of showing the opposite of what was actually filmed
                | - an unprovoked attack can become self-defense or
                | vice-versa, etc. Again, deep-fakery is just another tool
                | in that unsavory toolbox, not anything paradigm shifting.
        
         | Kosirich wrote:
         | What about signing normal photos for the purpose of stopping
          | misuse? The approach of "if a photo is not digitally signed by
          | each person in the photo for this specific case, it is assumed
         | that the photo use is not fair"?
        
         | jcims wrote:
         | I think it would have to work through the camera industry to be
         | effective, so the sensor actually includes signatures with the
         | image data. Otherwise all of the 'hot mic' moments are going to
         | be left unaddressed.
         | 
         | Image formats could be updated with an 'original' layer so that
         | if the news site wanted to crop or edit the content, the
          | original would still be available for comparison.
         | 
          | Given all of the hardware design going into high-speed hashing
          | on ASICs, it shouldn't be that hard to find a component to do
          | the work.
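          | 
          | A very rough sketch of that in-camera step - hashing the raw
          | sensor bytes and signing them with a per-device key (the file
          | name, timestamp and key handling here are hypothetical
          | stand-ins for whatever the sensor pipeline and a secure
          | element would actually expose):
          | 
          |   import hashlib, json
          |   from cryptography.hazmat.primitives.asymmetric import (
          |       ed25519,
          |   )
          | 
          |   # Hypothetical: bytes straight off the sensor.
          |   raw = open("frame_0001.raw", "rb").read()
          | 
          |   # Per-device key, ideally held in a secure element.
          |   device_key = ed25519.Ed25519PrivateKey.generate()
          | 
          |   manifest = {
          |       "original_sha256": hashlib.sha256(raw).hexdigest(),
          |       "captured_at": "2021-07-27T12:04:00Z",
          |   }
          |   signature = device_key.sign(
          |       json.dumps(manifest, sort_keys=True).encode()
          |   )
          | 
          |   # A news site can then crop or edit freely, as long as the
          |   # manifest, signature and original bytes stay available
          |   # for comparison.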
        
       | jordanab wrote:
       | Give it another decade or so, and I can see Lucasfilm/Disney
       | making full feature films starring only deepfake 'clones' of the
       | original trilogy characters in their younger/O.T. forms.
        
         | bick_nyers wrote:
         | The implications for the entertainment industry are massive.
         | 
         | When I was working in indie game development, I wondered if you
          | could use deepfakes as a voice actor. Basically, get a
          | famous/good voice with infinite voice lines, without having to
          | pay for studio time. Obviously, you would need them to sign off
          | on using their voice for commercial purposes.
        
           | wishinghand wrote:
           | There's a post on Hacker News for this, by a company called
           | Sonantic.
        
           | omgwtfbbq wrote:
           | Already happening. Recently was used to recreate Anthony
           | Bourdain's voice: https://en.wikipedia.org/wiki/Roadrunner:_A
           | _Film_About_Antho...
        
           | chronogram wrote:
           | How would you get the acting part of the voice acting right?
           | I can't imagine you wouldn't still need a skilled voice actor
           | for that.
        
             | post-it wrote:
             | A markup language for voice that tells the generator how to
             | inflect everything. It's not on the horizon yet, but
             | anything that our voice can do, a computer will do someday.
        
               | bick_nyers wrote:
               | Yup exactly. Anything you would tell a voice actor to do
               | you have in the markup. Obviously, the voice actor can
               | still produce higher quality, probably for a long time to
               | come.
        
               | mgdlbp wrote:
               | https://docs.microsoft.com/en-us/azure/cognitive-
               | services/sp...
               | 
               | https://cloud.google.com/text-to-speech/docs/ssml
               | 
               | https://docs.aws.amazon.com/polly/latest/dg/ssml.html
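                | 
                | For a flavor of what that markup looks like in practice,
                | a minimal sketch against the Google Cloud TTS Python
                | client (the voice name is an arbitrary pick, and
                | credentials are assumed to be configured):
                | 
                |   from google.cloud import texttospeech
                | 
                |   client = texttospeech.TextToSpeechClient()
                | 
                |   # The "direction" lives in the SSML: pauses, rate,
                |   # pitch, emphasis.
                |   ssml = (
                |       "<speak>I find your lack of faith "
                |       '<break time="300ms"/>'
                |       '<prosody rate="slow" pitch="-2st">'
                |       "<emphasis level='strong'>"
                |       "disturbing</emphasis></prosody></speak>"
                |   )
                | 
                |   resp = client.synthesize_speech(
                |       input=texttospeech.SynthesisInput(ssml=ssml),
                |       voice=texttospeech.VoiceSelectionParams(
                |           language_code="en-US",
                |           name="en-US-Wavenet-D",
                |       ),
                |       audio_config=texttospeech.AudioConfig(
                |           audio_encoding=(
                |               texttospeech.AudioEncoding.MP3
                |           ),
                |       ),
                |   )
                |   open("line.mp3", "wb").write(resp.audio_content)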
        
               | AnIdiotOnTheNet wrote:
               | Then the person doing the markup becomes the talent you
               | have to pay to make things good.
        
               | account42 wrote:
               | That percon can be replacable. Or it can be team. And you
               | don't need to worry about the AI tiring or damaging their
               | vocal cords after trying out different intonations all
               | day. And eventually the there will be good enough
               | automation to generate the intonattions too - either
               | entirely or with minimal input from a voice director.
        
               | jjk166 wrote:
               | It's a lot easier to write "cries like a baby" or
               | "screams in terror" than it is to actually do it on
               | command, over and over again, for take after take.
               | 
               | And one can even imagine a program with emotional slider
               | bars that lets a person listen to how a line sounds with
               | different levels of inflection and then automatically
               | inserts the appropriate markup for the settings the user
               | selects.
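                | 
                | Roughly, that slider-to-markup step could be as small as
                | this (a toy sketch; the scaling factors are arbitrary and
                | the output is plain SSML):
                | 
                |   def terror_markup(text, terror):
                |       # Map a 0-1 "terror" slider onto prosody values.
                |       rate = 100 + int(terror * 60)  # percent, faster
                |       pitch = int(terror * 6)        # semitones up
                |       vol = "x-loud" if terror > 0.7 else "medium"
                |       return (
                |           f'<prosody rate="{rate}%" pitch="+{pitch}st"'
                |           f' volume="{vol}">{text}</prosody>'
                |       )
                | 
                |   # e.g. terror_markup("Get out of the house!", 0.9)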
        
               | Cthulhu_ wrote:
               | I believe that's already out there, since services like
               | Alexa will do certain inflections depending on the
               | context of what they're saying. I think.
        
             | M277 wrote:
             | Yeah, the acting part is a valid concern. A mod for The
             | Witcher 3 does this to give the main character voiced
             | dialogue[1], but it doesn't really sound.... right. I mean,
             | it is voiced and some lines feel authentic, but some lines
             | also just feel odd.
             | 
             | [1]: https://www.gamesradar.com/witcher-3-mod-uses-ai-to-
             | create-n...
        
             | Cthulhu_ wrote:
             | Not if you're going to voice the Elcor from Mass Effect.
             | 
             | "With barely contained terror. You drive a hard bargain."
        
           | echelon wrote:
           | I'm working on https://vo.codes
           | 
           | The new version is almost ready to launch.
           | 
           | I've also got voice to voice conversion working, and I'm
           | trying to make it real time. It's pretty close.
        
         | stevesearer wrote:
         | I met someone a few years back who apparently worked in the
         | field of 'digital persona management' which is basically an
          | agent for an actor's likeness after they die. It sounded like
          | families and estates were very interested in the concept, as
          | long-dead actors could potentially become movie stars again, in
          | theory.
        
           | Ashanmaril wrote:
           | That sounds like a fairly big ethical dilemma that Disney
           | will happily ignore if making a puppet show out of people's
           | corpses earns them a few extra bucks
        
             | MrPatan wrote:
             | This will only be a thing for a while. Why pay somebody's
             | grandchildren when you can create a new face that's yours?
        
         | kickscondor wrote:
         | See the Harrison Ford vid linked at the end of the original
         | article. Billy Dee Williams also gets inserted. There's no
         | doubt that this technique will defeat CGI Youngface.
        
         | frankfrankfrank wrote:
         | I suspect it will also lead to essentially "real" fantasy
         | characters that totally replace real human actors. We already
          | have many comic/drawn characters that people associate and
          | identify with in a similar way to how they associate and
          | identify with human actors; there is no reason why you would
          | not be presented with "actors" that don't actually exist in
          | person. I cannot recall what it was called, but the industry
          | has already produced a fully CGI movie that tried to push this
          | very thing by essentially making a realistic manga-style movie.
        
         | ThePadawan wrote:
         | https://collider.com/james-dean-digital-cgi-performance-in-n...
         | (2019)
         | 
         | > James Dean, an iconic movie star who died in 1955 at the age
         | of 24, has been cast in a new Vietnam-era action film called
         | Finding Jack.
        
         | svieira wrote:
         | This is the plot of _The Congress_:
         | 
         | https://en.wikipedia.org/wiki/The_Congress_(2013_film)
        
         | echelon wrote:
         | Other way around. Real actors' likenesses won't be used as
         | often.
         | 
         | That's a good thing. More actors can now work.
        
       | baby wrote:
        | And I just got spoiled on The Mandalorian :(
        
       | ChrisArchitect wrote:
        | ILM isn't dumb - this is a talent hire for sure. Amazing work from
       | this guy and dedication to the niche (and they likely have some
       | ideas kicking around in Marvel/Lucas writing rooms that are about
       | bringing alllll the olds back to life)
        
       | bordercases wrote:
       | Security footage tamper contracting is one obvious black market
       | extension of this.
        
       | coolandsmartrr wrote:
        | Lucasfilm's subsidiary Industrial Light & Magic is known for
        | leading visual effects on actors' appearances to help make films
        | that cannot be realized without such technology. For instance,
        | Martin Scorsese entrusted them to "youthen" the leading actors in
        | "The Irishman" so that Robert De Niro et al. could play their
        | characters at a younger age without wearing a red-ball "clown-
        | nose" tracker. "Star Wars: Rogue One" practically reanimates
        | Peter Cushing to continue and expand the involvement of Grand
        | Moff Tarkin in the Star Wars saga. These processes are
        | painstaking, and artists sweat over details on a frame-by-frame
        | basis to negate the "uncanny valley" of artificial human
        | likenesses.
       | 
       | Obviously, the labor-intensive nature of today's CGI techniques
        | drives up production costs. Meanwhile, the deepfakes on YouTube
       | provide a convincing enough rendition of likenesses without
       | actual actors, all produced on consumer-level GPUs. This presents
       | a huge potential to save costs and the benefits are clearly
       | enticing to film productions.
       | 
        | As Hollywood gravitates towards blockbuster franchises,
        | productions will want to bring back the same ensemble of actors
        | (or at least their likenesses) for as long as possible. While
        | moviegoers may be unsettled by seeing "reanimated" dead actors
        | like in Rogue One, they still may hope to see franchise actors
        | look consistently youthful or attractive on screen. Deepfakes may
        | increasingly be relied upon to provide that effect.
        
         | shadowtree wrote:
         | The Irishman is the perfect example of where the Deepfake is
          | MILES BETTER than the classic CGI de-aging:
         | 
         | https://www.youtube.com/watch?v=dHSTWepkp_M
         | 
          | Just look at it, the CGI De Niro looks like something from The
          | Polar Express.
        
           | tjoff wrote:
           | It is? The deepfake just looks out of focus to me, just
           | something blurry that as a side-effect removes some wrinkles
            | but also takes away lighting and everything else.
           | 
           | Not to say the deepfake isn't seriously impressive. But I
            | very much prefer the Netflix version.
        
           | kilroy123 wrote:
           | Wow the deep fake version is WAY better.
        
           | dualboot wrote:
           | This is deceptive, though. The "deep fake" is essentially
            | building on the CGI de-aged product. If they'd started with
            | just a deepfake, it likely wouldn't have yielded this level
            | of result.
        
             | burnte wrote:
             | I disagree, deepfaking completely replaces the face. It
              | doesn't have to be close to start with. Check this one out:
             | https://www.youtube.com/watch?v=861gfPVmgdc
        
               | ad404b8a372f2b9 wrote:
                | It doesn't have to be close, but it helps a lot with both
                | the quality of the results and the convergence speed if
                | both faces are similar.
               | 
               | In movies you don't usually have the luxury of choosing
               | the original face you want to replace, but for people who
               | make memes (or porn), you commonly choose a source video
               | featuring someone that resembles the person you want to
               | put in.
        
               | seph-reed wrote:
               | > In movies you don't usually have the luxury of choosing
               | the original face you want to replace
               | 
               | Hiring an actor that roughly resembles the person you're
               | trying to deep fake seems doable.
        
             | planb wrote:
              | Well, Robert De Niro's face still has the same proportions
             | and bone structure, so taking the original video as a
             | starting point would probably yield the same results.
        
             | 411111111111111 wrote:
              | Uuuh, why? I can't imagine a reason why the de-aged video
              | would've improved the deepfake version. If anything, it
              | should've reduced its quality by adding incorrect data
              | which could confuse the model.
        
               | ad404b8a372f2b9 wrote:
               | The default model for deepfakes is an autoencoder. The
               | encoder and decoder will converge faster and to simpler
               | solutions if the distributions of original faces and fake
               | faces are closer.
               | 
               | It's an intuitive result even ignoring the specific
               | model. It takes less information to go to and from
                | similar faces than it does between two completely
                | dissimilar ones.
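                | 
                | A bare-bones sketch of that shared-encoder setup (one
                | decoder per identity, as in DeepFaceLab-style tools;
                | the layer sizes here are made up):
                | 
                |   import torch.nn as nn
                | 
                |   # One encoder shared across both identities.
                |   encoder = nn.Sequential(
                |       nn.Flatten(),
                |       nn.Linear(64 * 64 * 3, 512),
                |       nn.ReLU(),
                |   )
                | 
                |   def make_decoder():
                |       return nn.Sequential(
                |           nn.Linear(512, 64 * 64 * 3),
                |           nn.Sigmoid(),
                |           nn.Unflatten(1, (3, 64, 64)),
                |       )
                | 
                |   decoder_a = make_decoder()  # reconstructs face A
                |   decoder_b = make_decoder()  # reconstructs face B
                | 
                |   # Train encoder+decoder_a on A's faces and
                |   # encoder+decoder_b on B's. The closer the two face
                |   # distributions, the less work the shared encoder
                |   # has to do to serve both decoders.
                |   def swap_a_to_b(frame_a):
                |       # The "swap": encode a frame of A, decode as B.
                |       return decoder_b(encoder(frame_a))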
        
             | TrevorJ wrote:
             | That would still be an effective technique for big-budget
             | films if the net result is an improvement on the state of
             | the art. (Which I think it does seem to be).
        
             | vernie wrote:
             | I'm inclined to agree and I'd be interested to see this
              | process applied to some of the raw footage featured in the
              | award season campaign. It's also worth noting that in
             | addition to de-aging they also changed De Niro's eye color.
        
           | pen2l wrote:
            | Wait, what, the deepfake is better? To me, not at all; the
            | deepfake seems blurry. It looks like, well, a deepfake; there
            | is something distinct about deepfakes which just stands out.
            | I think it's the soft blur around individual parts. It's
            | taking the easy route by blurring and darkening a lot of
            | things; look at the eyelids, for example, in the modeled
            | version, there is incredible detail there.
            | 
            | Deepfakes definitely have a place in this space, in that they
            | can do things with 1/100th the effort in a scalable way, but
            | they have a lot more to catch up on with traditional modeling
            | than some folks appear to be thinking. Rendering with goodies
            | like SSS, AO, etc. gives magical results which are hard to
            | achieve any other way. And as soon as you get a little bit
            | complicated in what you're trying to create, at least the
            | currently existing neural network models fall apart and are
            | just not applicable. Take this video for example, which was
            | very manually modeled:
            | https://www.youtube.com/watch?v=BC2dRkm8ATU Deepfakes are a
            | long, long way from taking a stab at things like this.
        
             | jjeaff wrote:
             | On mobile, I don't see the blur on the deep fake. It looks
             | really good. I suspect it would be more obvious on a large
             | screen.
        
             | castlecrasher2 wrote:
              | >Wait, what, the deepfake is better? To me, not at all; the
              | deepfake seems blurry. It looks like, well, a deepfake;
              | there is something distinct about deepfakes which just
              | stands out. I think it's the soft blur around individual
              | parts. It's taking the easy route by blurring and darkening
              | a lot of things; look at the eyelids, for example, in the
              | modeled version, there is incredible detail there.
             | 
             | I agree with your complaints about DeepFake but imo it did
              | a far better job de-aging De Niro. To me, the release
             | version had a lot of "old man" cues that the DeepFake one
             | didn't, such as the jowls and heavy wrinkles.
        
         | ksec wrote:
          | My question is, how would the cost structure work? Can I now
          | hire an actor that looks 90% like someone I have in mind, and
          | then deepfake it to look 99.9%, and save on the actor's cost?
          | 
          | It works when you are trying to do that with actors who are no
          | longer with us, but what about actors that are still alive?
        
           | seanicus wrote:
           | Not a lawyer but you need likeness rights to reproduce an
           | actor's face. I.e. Peter Cushing's estate gave permission for
           | him to "appear" in Rogue One.
           | 
           | But maybe it doesn't apply to parody(?)
           | 
           | https://youtu.be/9WfZuNceFDM
        
       | jcims wrote:
       | The worst thing a lot of these deepfake folks do with movie clips
       | is hire impersonators to try to make it more realistic. The
       | problem is that a) the impersonator is usually off a bit in
       | timing or character and b) the soundstage is nothing like the
       | rest of the movie, it just sounds like cuts to a podcast. The
       | result just doesn't work.
       | 
       | https://youtu.be/A8TmqvTVQFQ?t=52
       | 
        | That said, one way it *does* work is with novel scenes filmed
        | with a good impersonator; the outcome can be pretty remarkable:
       | 
       | https://www.tiktok.com/@deeptomcruise/video/6957456115315657...
       | 
       | https://www.youtube.com/watch?v=krAU3C9jhj8
       | 
       | https://www.youtube.com/watch?v=ybasoc6LxIU
       | 
       | https://www.youtube.com/watch?v=VWrhRBb-1Ig
        
       | mindvirus wrote:
       | I wonder where this leads in the long term.
       | 
       | Do we have different actors deepfaked in for different markets?
       | 
       | Do actors even act anymore? Do companies just pay actors for
       | their likeness and do the rest?
       | 
       | Do we get to a point where you can choose who is acting in a film
       | you're watching?
       | 
        | How good are deepfaked voices? Can we substitute audio as well?
       | 
       | I understand that deepfakes are still a bit of a manual process,
       | but presumably that will change.
        
         | vollmond wrote:
         | > Do actors even act anymore? Do companies just pay actors for
         | their likeness and do the rest?
         | 
         | I assume that would eventually get to generating a totally new
         | person, rather than modeling after a specific actor who costs
         | money.
        
           | tk75x wrote:
           | That's already happening. Look into "digital influencers".
        
           | mindvirus wrote:
           | Even better! Imagine - every showing at every theatre in the
           | world has slightly different actors A/B tested for the
           | moviegoers.
        
             | Andrex wrote:
             | Pixar and the like can theoretically do this already, but
             | don't, maybe due to cost reasons.
        
         | bonoboTP wrote:
         | Well, why are good actors so sought-after? What makes someone
         | an actor whose likeness people seek and what makes a B-tier
         | actor?
         | 
         | Part of it is inertia and random celebrity status, having an
         | attractive or interesting face etc., but part of it is also the
         | raw knowledge of _when_ to apply certain microexpressions, how
         | to gesture etc. i.e. how to do the acting itself. To be a
          | convincing, charismatic etc. actor it's not enough to wear a
         | digital mask of a celebrity, the underlying actor still needs
         | to act well. That may not be so important for certain types of
         | shallow movies, but it certainly is for deeper drama films etc.
         | 
         | It's similar to today's text generation where you may be able
         | to generate sports game reports, user's manuals or travel
         | brochures etc. but not really those where you need high level
         | decisions, like applying the appropriate expressions to a real-
         | world event, taking into account all the context, like writing
         | a poem about your feelings reflecting on some recent real-world
         | event.
         | 
         | I'm not saying humans have a magical power that can't be
         | implemented in silicon.
         | 
         | What I'm saying is that deepfakes as they are today are not
         | sufficient to replace actors. You'd need a higher level
         | puppeteering AI that would take the whole storyline and script
         | into account to come up with the right ways to express the
         | appropriate emotions at that moment in the film and could take
         | the director's instructions regarding his vision of how the
         | drama should unfold etc.
        
           | tshaddox wrote:
           | I think all the stuff you describe about charisma is
           | absolutely true in the _creative_ mode of generating value,
           | but there is also an _extractive_ mode of generating value
           | for which I absolutely think deepfakes could be very
           | effective. Once an actor establishes an audience (generally
           | through the creative mode you describe), there is still an
           | opportunity to extract as much value as possible from the
           | remaining good will of the fan base. This already happens
           | with famous actors producing cheap and unpopular movies that
           | seem to only exist to put that actor's name and face on the
           | poster.
        
           | ant6n wrote:
           | One example where it shows that the actor doesn't just
            | provide a face, but, well, the acting, is Back to the Future.
            | Originally they had wanted to cast Michael J. Fox, but he
           | wasn't available. So they picked a different actor. And that
           | actor didn't get that the movie was supposed to be fun. They
           | shot several weeks with this actor who was turning the movie
           | into something very serious, being terrified by being
           | transplanted into the past, and finding it tragic to come
           | back to a present where everything is different.
           | 
            | There's a good documentary about this [1] that talks about
            | replacing that actor, and when Fox comes on set and delivers
            | the first line filmed ("You put a time machine... In a
            | DeLorean?!"), it's hilarious - a night and day difference.
            | 
            | [1] Season 2, episode 1 of The Movies That Made Us:
           | www.netflix.com/us/title/80990849
        
         | 6gvONxR4sf7o wrote:
         | There is a company doing this for video game voices. The voice
         | actors who provide training data get royalties. I hope that
          | becomes a standard for deepfakes and ML in general, as opposed
          | to how Copilot (and the rest of the industry) generally just
          | takes whatever they get their hands on as free training data.
        
         | gedy wrote:
         | I doubt Hollywood would want this, but I'd love to be
         | entertained by choosing some base story, then be able to pick
         | the lead actors and perhaps setting and mood. Deep fakes get a
         | ways toward this. This is basically what remakes are.
        
       | Amin699 wrote:
       | I'm not sure Shamook's results are always better than the
        | originals; many still have that uncanny, mask-like quality
        | common to a lot of deepfakes today, and they don't have the
       | benefit of all the additional CG lighting work that clearly went
       | into Disney's modern films and shows. But perhaps by combining
       | ideas, they can reach new heights.
        
       | mdrzn wrote:
        | That's awesome news, instead of the usual cease & desist
        | YouTubers get.
        
       | mensetmanusman wrote:
       | It is so promising that this technology is reaching the masses
        | and letting random YouTubers compete with the best. Hopefully
        | this type of tech further decentralizes content creation away
        | from Hollywood to other parts of the world.
       | 
       | It reminds me of the 'what's in the box' 2009 short made with
       | available cgi assets https://m.youtube.com/watch?v=IU_reTt7Hj4
        
         | lmilcin wrote:
         | It is also so devastating that you can no longer believe
         | anything published online, even if it looks legit.
         | 
         | This is going to be one more dimension to misinformation on the
         | net.
         | 
         | Now you will see videos of politicians saying something and
         | even then you cannot be sure whether this is actual video or a
         | fake.
        
           | tshaddox wrote:
           | Does anyone actually determine what to believe this way? Like
           | if you read a quote from a politician in a large newspaper,
           | you don't believe it's real, but if you see a cell phone
           | video of the politician saying something at a rally, you do
           | believe it's real? Personally my confidence in the veracity
           | would be the opposite. There's nothing special about video
           | that makes it fundamentally harder than text to distort,
           | edit, or even outright fabricate.
        
             | xfitm3 wrote:
             | What would a detective or a jury believe? Video or
             | testimony?
        
               | tshaddox wrote:
               | Presumably they would believe (or at least be instructed
               | to believe) neither implicitly.
        
           | RandomLensman wrote:
           | As long as everyone wants their information for free, this
           | will be just another layer of icing on the cake of
           | misinformation.
           | 
           | Good, clean, and reliable information is expensive and needs
           | a fair bit of work. I can see why most people have forgotten
            | that, but it might come back to them, and then this will be
            | way less of a problem.
        
           | madeofpalk wrote:
           | > you can no longer believe anything published online
           | 
           | When has this not been the case?
        
             | NateEag wrote:
             | Fifteen years ago it was not feasible for a random person
             | with an axe to grind to publish a convincing video of a
             | specific person doing something they did not actually do.
             | 
             | It's about to become pretty low-effort for a random person
             | with an axe to grind to do that.
             | 
             | It's not that you can no longer believe anything published
             | online - it's that video evidence without provenance was
             | relatively reasonable to trust for a few years, and it's
             | about to stop being so.
        
           | 0xbadcafebee wrote:
           | https://www.history.com/news/josef-stalin-great-purge-
           | photo-...
           | 
           | Nothing new under the sun
        
           | nIHOPp6MQw0f5ut wrote:
           | This has been a possibility for decades but only for those
            | with large budgets. Now that it is more widely accessible,
            | more people actually know it can be done.
        
             | a1369209993 wrote:
             | > This has been a possibility for decades but only for
             | those with large budgets.
             | 
             | Yep. You don't need CGI if you can search a large
             | population for someone who looks like insert-public-figure-
             | here and make up the difference with makeup.
        
           | gorwell wrote:
           | "Now you will see videos of politicians saying something and
           | even then you cannot be sure whether this is actual video or
           | a fake."
           | 
           | This is actually already the case with Biden and Trump video
           | clips even without being deepfaked. Often they are presented
           | out of context to the point of completely reversing reality.
           | It's helpful to assume any clip is fake by default,
           | especially if it's a viral one that makes one side look bad.
           | 
           | By the time deepfakes are common, it'll be best practice to
           | assume fake by default.
        
             | Cthulhu_ wrote:
             | Doesn't even need an actual quote taken out of context,
             | just a headline (or a thousand) will already have an effect
             | because people scan and can't be bothered to read the
             | contents, until it becomes a background idea stuck in
             | someone's head. People also forget where they read
             | something, people forget details, and they simplify things
             | over time.
        
           | liotier wrote:
           | > Now you will see videos of politicians saying something and
           | even then you cannot be sure whether this is actual video or
           | a fake
           | 
            | Back to text and the good ol' credibility of the messenger.
            | Digital media commodified journalists, and now the need for
            | credibility will let them come out of anonymity again.
        
           | thefourthchime wrote:
           | To be fair, this tech has been around for years and I've yet
           | to see it be used successfully in social media
            | misinformation. The stuff I see on my in-laws' Facebook is
           | usually some clunky meme photo with shocking text.
           | 
           | A video needs to actually be watched, that takes more effort,
           | and then it would be widely debunked as fake. The "fake news"
           | memes are usually at least partially true which helps
           | convince people that the misinformation is legit.
        
             | wussboy wrote:
             | "I've yet to see it used successfully..."
             | 
             | As far as you know. By definition, wouldn't its successful
             | use mean you didn't know it was successfully used?
        
           | Geee wrote:
            | The problem is actually the opposite. People won't believe
           | anything because they assume that what they are seeing is a
           | deep fake. It's happening already. If you see videos online
           | of Trump / Biden / someone notable there's always someone
           | claiming that it's a deep fake if they don't like what they
           | see.
        
             | lmilcin wrote:
             | Isn't it exactly the same problem rather than opposite of
             | it?
        
               | Geee wrote:
               | Yes, it is.
        
           | bongoman37 wrote:
           | I think we'll eventually see hardware that cryptographically
            | signs the content with a timestamp as soon as it is produced,
           | but then that could be fooled by someone creating a deepfake,
           | projecting it on a high resolution screen and then
           | photographing with another camera. Or we wouldn't believe
           | things unless they are captured by multiple cameras at
           | different angles and maybe future GANs will be able to cover
           | that. We are seeing the beginning of an arms race!
        
           | anoraca wrote:
           | It's probably safer if people stick to primary sources with
           | good reputations anyway, right?
        
             | derptron wrote:
             | Are there any news outlets left with any credibility?
        
           | planb wrote:
           | I say it every time this comes up: People don't care if
           | something looks legit. They care if it supports their views.
           | Nothing will get worse just because the fakes get better. It
           | might even help when it's common knowledge that everything
           | can be faked by a 14 year old on their PC.
        
           | danparsonson wrote:
           | Not to mention the politicians who are recorded doing
           | something genuinely shady and will wave it away as fake news
           | - this is happening already even without deep fakes.
        
           | only_as_i_fall wrote:
           | Idk seems like an overstated problem.
           | 
           | People will be less likely to believe leaks or supposed hot
           | mic recordings, but the majority of what politicians and
           | other public figures say happens in public view which makes
           | it difficult to fake.
           | 
           | You already don't really know if a video you see on Facebook
           | has been carefully re-cut to change the meaning or tone of
           | what the speaker was saying, so I think the fact that we can
           | more easily wholesale create videos doesn't really change
           | much. If you want to know if something is real the best
           | option is still to cross reference multiple sources and if
           | possible multiple recordings.
        
           | tarsinge wrote:
           | How do you already believe something written is legit? Does
           | it not just put video on par with text?
        
             | lmilcin wrote:
             | Normally you try to think and reconcile it with knowledge
             | and experience you already have.
             | 
             | The problem is when everything you have ever learned is
             | suspect.
        
           | DougN7 wrote:
           | I agree - this is going to make finding the truth so much
           | harder once it's weaponized as it surely will be. How can
           | representative democracy flourish when you can no longer tell
           | who you want to represent you?
        
           | krastanov wrote:
           | This feels like it has been true since forever, just with
           | other media. A picture with a made up quote seems exactly as
           | damaging. Good journalists will continue vetting sources and
           | unscrupulous TV personalities will continue showing whatever
           | fits their narrative without vetting.
        
             | psychomugs wrote:
             | Since the inception of photography, all photographs have
             | been lies [1,2]. The only remedy is critical thinking and
             | awareness and skepticism on the part of the recipients,
             | which is being outpaced by the technology.
             | 
             | [1] https://en.wikipedia.org/wiki/Hippolyte_Bayard#Self_Por
             | trait...
             | 
             | [2] https://i.redd.it/nh45pwigrhc21.jpg
        
             | lmilcin wrote:
              | It has been true only to a certain level. Photos can be
              | faked and videos can be mislabeled, but as long as it
              | happens in small enough numbers it is possible to have
              | people point it out and make a fuss about it.
              | 
              | This changes when individual people with no resources at
              | all can make convincing fakes and wield them as a weapon
              | to sow disinformation, only to have them picked up by
              | "major" media; then all information on the net becomes
              | pretty useless.
        
               | riffraff wrote:
               | Again, this was always true, we relied on news agencies
               | and such to be gatekeepers of what is true.
               | 
               | Sometimes a random video pops up and people believe it
               | shows X and then it propagates but it's something
               | completely different (e.g. people beating up immigrants
                | on the streets in northern Italy -> traditional Krampus
                | celebrations; Juncker drunk at some event -> Juncker
                | suffers from lumbago; Berlusconi mimicking a sexual act
                | on some woman -> it was a comedian skit; Britney Spears
                | sex tape -> it's a random pornstar ...)
               | 
               | People will learn to be doubtful of random internet
               | videos just as they have learned to be doubtful of random
               | internet articles.
               | 
               | Or not, since they haven't yet, but it's not a
               | qualitative change.
        
               | Fricken wrote:
               | The majority of Covid disinformation was spread by 12
               | people with limited resources. They didn't need
               | deepfakes. Deepfakes add nothing of substance to the
               | liar's toolkit.
               | 
               | It's easy to lie to people so long as you're saying
               | things that validate their shitty emotions. Conversely,
               | it's extremely hard to tell people the truth when it goes
               | against their shitty emotions.
        
               | lupire wrote:
               | > The majority of Covid disinformation was spread by 12
               | people with limited resources.
               | 
               | created by 12 people. spread by many thousands.
        
               | NateEag wrote:
               | Tangent, but:
               | 
               | Emotions can't be shitty.
               | 
               | You can miscalibrate your emotional responses to
               | situations, much as you can sear your conscience.
               | 
               | But the emotions themselves - the full range are valid
               | human feelings, from fury to transcendent joy.
        
               | jfengel wrote:
               | The effect that emotions have on their behaviors can be
               | shitty. Worse, they can be self-reinforcing, such that
               | their emotions cause them to seek out ways to deepen that
               | emotional state, resulting in an increase in shitty
               | behavior.
               | 
               | Those feelings are valid, but the effect they have on
               | other people is not. Dealing with valid emotions in a way
               | that doesn't harm other people can be incredibly
               | difficult, especially when those emotions put harm front
               | and center.
               | 
               | Our emotions are valid, but our behaviors are not. It
               | behooves us to mind our emotional states when they cause
               | problems for other people. Often, that will
               | coincidentally bring about an emotional state we prefer
               | as well, but often with unpleasant transitions.
        
               | NateEag wrote:
               | Yup, I agree with all of that.
               | 
               | I didn't say anything about behaviors, because it was
               | emotions that were labeled as shitty.
               | 
               | I didn't bother to go into the distinction between
               | emotions and the bad behavior they often give rise to, so
               | thanks for explaining that.
        
               | robertlagrant wrote:
               | "valid" seems sort of weasel-wordy here.
        
               | NateEag wrote:
               | Okay, I'll phrase it more strongly:
               | 
               | Emotions as such are good, from sorrow to rage to joy.
               | 
               | Each one is a good response to some situations, as far as
               | I can tell.
               | 
                | Your emotional response can be misplaced, so that you
                | experience an inappropriate emotional response in some
                | situations.
               | 
               | As noted elsewhere in the thread, you can also be
               | inspired by your emotions to inappropriate (and just
               | terrible) behavior.
               | 
               | The emotions themselves, though, are not shitty.
               | Recognizing them and understanding where they're coming
               | from can be tremendously helpful in aligning your actions
               | with reality and your own values, and even in discovering
               | what your own values are.
               | 
               | I share this perspective not out of a sense of
               | superiority but in the hope that it helps someone else
               | avoid my mistakes.
        
               | dahart wrote:
               | > It has been true only to a certain level. [...] as long
                | as it happens in small enough numbers it is possible to
               | have people point it out and make a fuss about it.
               | 
               | I think the opposite is true. This isn't a historical
               | perspective, you're using logic to speculate.
               | Historically speaking, there have been fakes that reached
               | huge numbers of people, and they were more damaging then
               | than they are today because they were more believable;
               | the public had not yet conceived that photos could be
               | faked, and it was not possible to see evidence of fakery.
               | Today, everyone knows photos and videos can be faked.
               | 
               | I don't know of any deep fake videos yet that have
               | tricked a large number of people or been used for
               | political purposes. Maybe it has happened, I don't know,
               | do you know? But there have been lots of influential
               | faked photos. Just Google a little to find hundreds of
               | historical examples of famous and misleading doctored
               | photos. (Lots of overlap in these lists, because some of
               | the photos are famous).
               | 
               | https://www.cc.gatech.edu/~beki/cs4001/history.pdf
               | 
               | https://delmarwatsonphotos.com/photographs/famous-
               | photograph...
               | 
               | https://www.quora.com/What-historical-photos-are-highly-
               | misl...
               | 
               | https://www.businessinsider.com/fake-photos-
               | history-2011-8
               | 
               | https://www.ranker.com/list/historic-images-that-were-
               | retouc...
               | 
               | https://www.ba-bamail.com/content.aspx?emailid=29607
               | 
               | https://www.pinterest.com/yosomono/faked-images-everyone-
               | thi...
        
           | 6gvONxR4sf7o wrote:
           | Maybe it could mean journalism will be professionalized
           | again, when trust isn't as simple as taking a video on your
           | phone or writing a blog post... if we're lucky.
        
           | kikokikokiko wrote:
           | I remember watching "The Running Man" when I was a kid and
            | thinking that the scene where Arnold loses the fight to the
            | Jesse Ventura guy, which in the movie was a "deep fake", was
           | so unrealistic... man, we are already there. Nothing can be
           | believed anymore.
        
           | minsc__and__boo wrote:
           | >It is also so devastating that you can no longer believe
           | anything published online, even if it looks legit.
           | 
           | I don't buy these sky is falling arguments.
           | 
            | Deepfake video will have about as much impact as Photoshop
            | has had.
        
           | [deleted]
        
           | tvirosi wrote:
           | This has been true since forever though. (It's such a weird
           | point.) Very believable photoshops have been possible since
           | forever, but you generally only believe images that are
           | verified by a trusted source. Even ridiculously fakable
           | things like "someone telling you a thing is true" (without
            | having photographic evidence of it) have somehow not been
           | completely eroded as a communication channel by deceptive
           | agents because of reputation and trust holding it all up.
        
           | another-dave wrote:
           | Would be good if content publishers did something like
           | digitally signing their content along with embedded metadata
           | so if e.g. you see a video circulating you can see that the
           | BBC attested that it was released by them & its original air
           | date was such & such.
           | 
           | Or if you see a quote claiming to be from Emmanuel Macron or
           | Boris Johnson, you can see it was released with a digital
           | signature from a Guardian journalist & they add whatever
           | date/time/location details they want to validate the
           | information.
           | 
           | If instead I release something (on Twitter or wherever) that
           | I say is a screen-capture of my TV, _I_ add the metadata
           | (originally seen on BBC on 27 Jul, 1:55pm) and sign and then
           | you know that you only trust it as much as you trust _my_
           | reputation rather than the BBC.
           | 
           | Wouldn't solve the problem entirely but it might create a bit
           | of an audit trail for stuff and encourage people not to trust
           | unvetted material.
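            | 
            | Something like this, perhaps - a sketch only, where the
            | "bbc.co.uk" key, file name and field names are placeholders
            | for whatever a real scheme would standardize:
            | 
            |   import hashlib, json
            |   from cryptography.exceptions import InvalidSignature
            |   from cryptography.hazmat.primitives.asymmetric import (
            |       ed25519,
            |   )
            | 
            |   bbc_key = ed25519.Ed25519PrivateKey.generate()
            |   video = open("clip.mp4", "rb").read()
            | 
            |   # Publisher attestation: bind the video hash to metadata.
            |   manifest = {
            |       "publisher": "bbc.co.uk",
            |       "aired": "2021-07-27T13:55:00+01:00",
            |       "sha256": hashlib.sha256(video).hexdigest(),
            |   }
            |   blob = json.dumps(manifest, sort_keys=True).encode()
            |   sig = bbc_key.sign(blob)
            | 
            |   # Anyone with the publisher's public key can later check
            |   # the metadata and that the bytes are untouched.
            |   try:
            |       bbc_key.public_key().verify(sig, blob)
            |       digest = hashlib.sha256(video).hexdigest()
            |       if digest != manifest["sha256"]:
            |           print("video bytes do not match the manifest")
            |   except InvalidSignature:
            |       print("not signed by this publisher")
            | 
            |   # A screen-captured copy can only be re-signed with the
            |   # re-sharer's own key, so it carries their reputation.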
        
             | rubicon33 wrote:
             | I wonder if there's a startup opportunity there. In the
             | coming years, this problem of deepfakes and lack of trust
             | in media is only going to get worse. Crypto could mitigate
              | this. Maybe a hardware company that sells very high-end
              | cameras for media outlets, which digitally sign all
              | recorded media and add it to a blockchain?
        
               | pintxo wrote:
               | Already done for (high-end) photo cameras.
               | 
               | But why store the signature in a blockchain? If you do
               | not trust the certificates in the first place, the
               | storage location won't make any difference. And if you
               | trust the certificates, the storage location is
               | completely irrelevant. Because the certificates alone
               | provide the trust.
        
               | rubicon33 wrote:
               | Well, my thinking was that you would want to store the
               | data cryptographically signed, on a block chain, for the
               | same reasons (more or less) that NFTs exist on a
                | blockchain. Predominantly, the public ledger of ownership
               | seems like an important aspect of digital content. Is it
               | necessary for trust? Not at all, but it certainly doesn't
               | hurt it?
               | 
               | Disclaimer: I am an armchair crypto fan. Not an
               | authority.
        
               | a1369209993 wrote:
               | > But why store the signature in a blockchain?
               | 
               | For the same reason as certificate transparency logs; you
               | want to avoid trusting something that has a history of
               | certifying false statements. You also need to handle
               | throwaways, so it's definitely not _sufficient_ (and
               | might turn out to not be necessary once a complete
               | solution is found), but it does seem useful.
        
             | Goz3rr wrote:
              | Metadata that would swiftly be destroyed by the first
              | Twitter/Facebook user reposting a screenshot or screen
              | recording from their phone.
        
               | another-dave wrote:
               | Exactly, so if I upload something and say "Look what I
               | just recorded off the BBC!" you _shouldn't_ believe me if
               | creating a deep-fake becomes as easy as recording the
               | real thing.
               | 
               | In that scenario, you'd want people to say "but wait,
               | there's no signature on this, it could be fake" and then
               | only trust the video as much as you trust the source (not
               | the claimed source).
        
               | JKCalhoun wrote:
               | Lack of the right kind of metadata would be the first
               | tell.
        
               | db_admin wrote:
               | While still maintaining a verifiable origin
        
             | barrkel wrote:
             | Then, when someone videos an atrocity, they need to choose
             | between publishing publicly (risking retribution) or
             | publishing anonymously (if they even know how) and risk
             | being disbelieved.
        
               | another-dave wrote:
               | Well, they could contact a news outlet on the condition
               | of anonymity & the news outlet satisfies themselves as to
               | the validity of the footage.
               | 
               | Similar to how anonymous 'tip-off' stories with protected
               | sources work in general at the moment -- the media outlet
               | put their own reputation on the line on the basis of the
               | source & we trust (to a certain degree) reputable news
               | outlets to validate & vet their sources correctly. This
                | is true for stuff that's easily forgeable at the moment,
               | e.g. a whistle-blower releasing documents.
        
               | SilasX wrote:
                | They could use ring signatures (like what Monero and
                | others do), where the signature only validates that it
                | came from one of several possible private keys.
        
             | bun_at_work wrote:
              | The problem is how you make that signature or digital
              | artifact accessible to the general public.
             | 
             | Any visual artifact can be mocked, so we end up with the
             | same problem as clickbait titles, where the conclusion one
             | arrives at from just a title can be disproved, but it
             | doesn't prevent the false information from going viral.
             | 
             | What good is it to say "that video you saw was fake!" after
             | the video has spread around and done the damage already?
             | 
             | It's hard to come up with a solution to this problem just
             | because the solution has to preempt the problem. A
             | cryptographic visual artifact _could_ work, but it's still
             | likely that misinformation via deep-fakes will cause
             | problems for society at large.
        
               | another-dave wrote:
               | Yeah agreed -- making security & authenticity
               | understandable to a layperson is always going to be
               | tricky.
               | 
               | Websites like Twitter adopting a "blue tick" for a
               | validated profile on their platform though is a model
               | people seem to get. If we had some equivalent of a "blue
               | tick" at a user-agent level, e.g. a for your browser to
               | take a signature and display it in a standardised, human
               | way to say "this video is signed by bbc.co.uk" it could
               | work. (With a similar model for user-agents elsewhere
               | e.g. you'd probably need adoption in apps like WhatsApp
               | to get traction.)
               | 
               | The other side of it (like privacy discussions) is how
               | much the average person will care -- tabloid journalism
               | often skirts the borders of what it can get away with at
               | the moment & nominally has a duty to write only factual
               | information. If Fox News or the Daily Mail release videos
               | and put their own signature to them, then you arguably
               | lend them legitimacy ("it's on the news so it must be
               | true. It's signed by them and all!").
        
               | pope_meat wrote:
               | Blue check twitter people are sus, don't trust them.
               | 
               | That's the mood in the algorithm hole Twitter put me
               | into.
               | 
               | So, make what you will of that.
        
               | ASalazarMX wrote:
               | As most viewing is done on digital screens, the player
               | itself could show when media is signed, much like the
               | lock icon in web browsers.
        
             | [deleted]
        
             | MR4D wrote:
             | > Would be good if content publishers did something like
             | digitally signing their content along with embedded
             | metadata
             | 
             | NFT for news!!
        
         | dimitrios1 wrote:
         | Hollywood hasn't had centralized content creation for a long
         | time. On the contrary, this past decade has been all about
         | streaming-service studios and indie content creators, while
         | Hollywood keeps remaking the same handful of scripts and plots
         | over and over, and creating sequels and modernizations of old
         | films.
         | 
         | As Ricky Gervais said in his infamous Golden Globes speech,
         | "This show should just be me coming out going 'Well done,
         | Netflix, you win everything. Well done.'"
        
           | mixmastamyk wrote:
           | Also "Hollywood" is not done there anymore for tax reasons.
           | Most stuff is done in Vancouver, New Zealand, Atlanta, etc.
        
       | croes wrote:
       | Wouldn't the developer of the software he used be the better
       | choice?
        
         | sumedh wrote:
         | Is it just one developer or a team of collaborators?
        
           | croes wrote:
           | It's open source, so multiple developers are possible
           | 
           | https://github.com/iperov/DeepFaceLab
        
         | kaetemi wrote:
         | Not really. It's still an art form. The output of a tool is
         | only as good as its operator. Would you hire the developers of
         | Photoshop to draw your paintings?
        
           | defectiveboss wrote:
           | I'd hire both of them. Artists in the VFX field are
           | frequently constrained by the capabilities of their tools.
        
           | croes wrote:
           | It depends: how much did the software do versus the artist,
           | and is it an established software field or a new one? In the
           | early days of a new software field, I would favor the
           | developers. So back when photo manipulation was a new thing,
           | I would have hired the developers.
        
       | coldcode wrote:
       | While even the new ones are not perfect, they are way better than
       | what's in the movies/shows. Good that they decided to hire the
       | artist instead of just beating him with lawyers.
        
         | andyp-kw wrote:
         | Lucas has always been good with things like this. Star Wars
         | games pre-Disney were generally easy to mod, and fan fiction
         | writers didn't have to be too careful about getting sued.
         | 
         | As long as the creator wasn't making money from it.
         | 
         | It's one of the reasons why the franchise survived for so many
         | years without new movies.
        
           | riffraff wrote:
           | But this is Disney now; they have not gone easy on people
           | touching their property.
        
           | tvirosi wrote:
           | This guy has been making money off of this though (through
           | ads and his patreon).
        
       | bick_nyers wrote:
       | Whenever DeepFakes are brought up I am reminded of this:
       | 
       | https://en.m.wikipedia.org/wiki/Simulacra_and_Simulation
        
         | Cthulhu_ wrote:
         | It's mentioned / shown in The Matrix as well, where everyone
         | lives in a realistic simulation.
        
         | tvirosi wrote:
         | Could you describe what that term means to you? I always hear
         | it referenced, but it hasn't really clicked for me what people
         | mean by mentioning it (reading the articles barely helps).
        
           | bick_nyers wrote:
           | I just find it interesting, notably the stages part.
           | Authenticity of information is a problem in today's society,
           | but really it's false trust in information. How many people
           | get their "news" from Facebook? What lies beyond trust,
           | authenticity, and information itself is an interesting thing
           | to think about. Is there a post-trust society that isn't
           | disorderly and chaotic?
        
       | nanna wrote:
       | Lucasfilms should have stopped making Star Wars after The Return
       | of the Jedi. Everything since has been abysmal. That's just a
       | fact no amount of deepfaking will change.
        
       | matsemann wrote:
       | I don't understand. What's the original vs deepfake comparison
       | about? I know nothing of Star Wars (sorry), is there a third
       | video it's based on or something?
        
         | nickthegreek wrote:
         | They are both deepfakes. But one was made by hollywood, and a
         | better one was made by a youtuber.
        
           | naz wrote:
           | Though the Youtuber's deepfake is a deepfake overlaid on
           | Hollywood's attempt, so all of the lighting and motion
           | capture acting is already done.
        
           | Nathanael_M wrote:
           | A key difference is that, at least in the case of the movie
           | "original", they were digital 3D heads, complete CGI. The
           | YouTuber's deepfakes use old footage that an AI overlays onto
           | the face.
        
         | wodenokoto wrote:
         | Both the original and the new one try to depict how a specific
         | actor looked in the 70s.
         | 
         | There's no ground truth to compare with other than the Luke
         | Skywalker character in the original Star Wars movies, and which
         | one looks more realistic.
        
           | matsemann wrote:
           | So there is a stand-in actor, and both have replaced his
           | face? Or is the deepfake done on top of the original that
           | already looks like the actor?
        
             | esrauch wrote:
             | There was a stand-in and the tv show computer generated the
             | character replacing the stand-in. And then a YouTuber took
             | _that_ footage and made it look even more realistic.
        
             | unlikelymordant wrote:
             | The deepfake will have been done by a youtuber over the top
             | of the tv show version (the one on the left is from the
             | show)
        
             | Cthulhu_ wrote:
             | One thing to note is that the 'original' (CGI actor) was
             | already a bit awkward because things like the lip sync were
             | off compared to a real actor.
             | 
             | That said, the facial animations for Leia in another video
             | he made (they did a digital version of Leia for Rogue One,
             | he deepfaked on top of that) actually improved with the
             | deepfake version.
        
         | [deleted]
        
       | tvirosi wrote:
       | I know the narrative is to be scared of this tech (maybe even
       | push towards legislating it). But personally, I just find these
       | things amazingly awesome and super cool.
        
         | Cthulhu_ wrote:
         | Scared if it's used to impersonate influential people and
         | spread misinformation, but cool when used in media. It's a
         | difficult one.
         | 
         | Mind you, impersonating influential people in a convincing
         | fashion is / has been a thing for a while now. I'm thinking of
         | Forrest Gump hanging out with the president and the like.
        
         | Andrex wrote:
         | If Photoshop could be likened to handing humanity a loaded
         | revolver, I feel like deepfakes are like handing it an AK-47.
         | I'm hopeful, but I'm also cautiously watching how the world
         | embraces this tech.
        
           | JasonFruit wrote:
           | So deep fakes are a great start toward a defense against an
           | increasingly coercive government? I'm even more confused now.
        
             | Andrex wrote:
             | Poe's law is real...
        
       | diegoperini wrote:
       | Better deepfakes also mean less make-up-related skin harm for
       | artists, which is a huge leap for the industry.
        
       | squarefoot wrote:
       | The deepfakes look much better than the original, especially
       | Leia, whose CGI recreation in Rogue One looked odd from the
       | beginning. Tarkin is OK, just like Luke, especially their eyes,
       | which now seem to be looking at whoever they're talking to,
       | thanks to better reflections. Luke's mouth, however, is still
       | unrealistic when matched to the speech; for example at 0:32 when
       | Luke says "He wants your permission", the lips don't even touch
       | to create the "P" sound.
        
       | sdevonoes wrote:
       | I'm not really into the tech behind deepfakes, but doesn't the
       | whole credit go to the tool? Or does one need to adjust the tool
       | somehow to produce decent fakes?
        
         | bick_nyers wrote:
         | Adjusting the tool for sure, and moreover, adjusting your data
         | collection strategy to the scene, and applying manual fixes to
         | the data or the output. Definitely an art as much as it is a
         | science, much like machine learning in general. I'm not sure
         | the tool itself will even improve much with time; as I
         | understand it, most of the work is in the data collection, and
         | quantity is definitely no substitute for quality.
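         | 
         | For a flavour of that data-collection step, here's a generic
         | sketch (my own illustration with OpenCV, not DeepFaceLab's
         | actual extractor; "source_footage.mp4" and the faces/ folder
         | are hypothetical). A real pipeline would also align facial
         | landmarks and have the artist throw away bad crops by hand:
         | 
         |   import os
         |   import cv2  # opencv-python
         | 
         |   # Stock frontal-face detector shipped with OpenCV.
         |   detector = cv2.CascadeClassifier(
         |       cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
         |   )
         |   os.makedirs("faces", exist_ok=True)
         | 
         |   cap = cv2.VideoCapture("source_footage.mp4")
         |   frame_idx = saved = 0
         |   while True:
         |       ok, frame = cap.read()
         |       if not ok:
         |           break
         |       if frame_idx % 10 == 0:          # sample every 10th frame
         |           gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
         |           for (x, y, w, h) in detector.detectMultiScale(
         |               gray, scaleFactor=1.1, minNeighbors=5
         |           ):
         |               # Save a fixed-size face crop as training data.
         |               crop = cv2.resize(frame[y:y + h, x:x + w], (256, 256))
         |               cv2.imwrite(f"faces/{saved:05d}.png", crop)
         |               saved += 1
         |       frame_idx += 1
         |   cap.release()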
        
         | fsloth wrote:
         | Deepfake tools are like... well, any other digital content
         | creation tool. The artist needs to do most of the
         | parametrization, even though the algorithms come pre-packaged.
         | 
         | Deepfacelab has some tutorials to explain what is actually
         | done: https://github.com/iperov/DeepFaceLab
         | 
         | e.g. https://www.youtube.com/watch?v=1smpMsfC3ls
        
         | cainxinth wrote:
         | Do you really think they didn't try doing this without this guy
         | before deciding it would be easier just to hire him?
        
           | jtbayly wrote:
           | Obviously they did. That's what the new guy is competing
           | with.
        
             | cainxinth wrote:
             | No, I mean afterwards. They saw his vid, and must have
             | thought, let's try and fix our previous work using a
             | similar method... and discovered even that isn't so easy.
        
         | robbrown451 wrote:
         | If it were so easy, he wouldn't be the only one getting all
         | those views. His results are far better than anyone else has
         | been able to get, so that says there's a lot more to it than
         | just running the tool.
        
         | omgwtfbbq wrote:
         | I suppose it's like saying a painting should really be credited
         | to the paint and brushes, since they did all the work of
         | creating the piece.
        
         | wccrawford wrote:
         | It's not just about the tool, it's about how to set up and use
         | the tool, and I'm sure there's a lot of tweaking, too. It's
         | cheaper to hire this person to set it up and run it for them
         | than to try to get someone who's already on staff up to speed,
         | I'd bet.
        
         | hermannj314 wrote:
         | "Point this device at the thing you are looking at and press a
         | button." is sufficient creative effort to generate copyright
         | for photographs, I am not going to be hard on digital artists
         | doing significantly more work.
        
           | ethanbond wrote:
           | Generate copyright, sure, but generate value that justifies
           | hiring? You can't take the camera away from the photographer
           | and have it autonomously recreate the photographer's taste
           | and tuning. OP is asking whether that's also the case with
           | the tool shown here (the answer to which I don't know, but I
           | suspect it does require a fair bit of artistic tuning).
        
             | cinntaile wrote:
             | Well, the person currently working at Lucasfilm got paid to
             | create a deepfake; now someone has made a better version,
             | so why not pay them for doing a better job?
        
               | ethanbond wrote:
               | The reason not to do that is if you can get the same
               | results for free or for a one-time cost. That's the
               | question GP is asking.
        
               | cinntaile wrote:
               | A lot of software consists of ready-built models, but
               | without the right parameters and constraints you'll just
               | end up with garbage. I don't see a reason why this
               | wouldn't be the case for deep learning models. The fact
               | that they hired him suggests to me that it's not possible
               | at this point in time, anyway.
        
             | tshaddox wrote:
             | For hiring it obviously just depends on how many people can
             | do the job and what compensation terms they will expect, as
             | well as how much compensation employers will offer.
        
       | remir wrote:
       | I've seen some deep fakes that are very convincing, especially if
       | you show them to someone who doesn't know about them. One example
       | would be this one, in which Sean Connery's face was replaced with
       | Burt Reynolds's in James Bond:
       | https://www.youtube.com/watch?v=foqeQM-7PSg
       | 
       | Truly amazing what can be done with consumer-grade stuff
       | nowadays.
        
         | Cthulhu_ wrote:
         | The same youtuber put Jim Carrey's face onto James Bond as
         | well. And face-swapped Jim Carrey and Hugh Jackman.
        
           | GiveOver wrote:
           | It's Shamook, the person mentioned in the article.
        
       | easterncalculus wrote:
       | "Shamook is the one who "fixed" Luke Skywalker's cameo in The
       | Mandalorian to the tune of 1.9 million _videos_... "
       | 
       | I understand this is a mistake, we all make them, but where are
       | the editors at The Verge? If a reader can find this with a
       | cursory look-over, shouldn't they find it too? I couldn't have
       | handed this in as a high school essay, so I'd imagine it wouldn't
       | get past an editor at a large magazine. Maybe it's just me, but
       | from what I see personally there are loads of errors, small and
       | large, in the news these days. It's weird.
        
         | kungito wrote:
         | From what I have heard, there is less and less money per
         | article in the market. They hire college temps to write (copy-
         | paste from other websites) and no one checks the articles. They
         | have to make sure that whatever has an article somewhere on the
         | internet, they have a copy of as well.
        
           | Cthulhu_ wrote:
           | It's the gig economy, moved to the internet. It's a race to
           | the bottom as I said in another comment.
        
         | nathancahill wrote:
         | Speed is king in the content/clicks game. It's easy to go back
         | and edit afterwards. Hitting publish as fast as possible is the
         | only way. Some "articles" are published as just headlines and
         | fleshed out afterwards. Really interesting to watch on a
         | Bloomberg terminal for example.
        
         | long_time_gone wrote:
         | I wonder if journalists say the same thing about how often
         | developers release software with bugs.
        
           | easterncalculus wrote:
           | They should! Though more people can read English than can
           | read code, and that's not counting bugs that exist outside
           | the code itself.
        
         | wyldfire wrote:
         | > Maybe it's just me, but from what I see personally there are
         | loads of errors, small and large, in the news these days. It's
         | weird.
         | 
         | From what little I know about journalism (very little), editors
         | at traditional media had a very opinionated stance about
         | language and punctuation, to the degree that they're not merely
         | finding errors like these, they're also suggesting rewrites for
         | clarity, etc.
         | 
         | I, too, notice many simple errors like these, which makes me
         | think the editor must not be as valued a role as it was in the
         | past. As the cost of communication drops toward zero, an editor
         | becomes a more significant relative cost, maybe. Would the
         | cumulative effect of errors like this one be enough to impact
         | the readership of The Verge?
        
           | easterncalculus wrote:
           | I definitely think the position must have dropped in value;
           | we're publishing more articles per day than before, and most
           | often online, in the land of instant corrections - there's no
           | printing press to worry about. Still, I would certainly
           | appreciate it if someone read them - there are plenty of
           | independents, bloggers, etc. who would catch this stuff in
           | their own writing.
        
         | yreg wrote:
         | I imagine you read past plenty of such mistakes without
         | noticing them. I also imagine that the editors at (say) The
         | Verge do find and fix plenty of mistakes that you wouldn't
         | notice anyway. In this case there's a combination where they
         | didn't notice and you did.
         | 
         | Proofreading is difficult because your mind subconsciously
         | fixes the text you are processing. Editors need to focus hard,
         | but as we all know, it's tough to maintain focus while doing
         | monotonous tasks.
        
           | easterncalculus wrote:
           | Proofreading is definitely hard, I've made tons of mistakes
           | just on this site. The reason I bring it up is because it is
           | more than a technicality and in the first bit of the article.
           | I would also imagine that editors miss things all the time,
           | but I really think that if one read this it would have been
           | caught.
           | 
           | It's not a big problem, the meaning is still there, but it
           | seems like a trend to me with online journalism at least, and
           | I was wondering if others felt similarly or if I'm just being
           | unfair.
        
             | yreg wrote:
             | I see. Some professional editor would have to chime in and
             | tell us. :)
             | 
             | As an amateur, I have proofread texts for my friends many
             | times and often missed very visible mistakes. But it's not
             | my job, of course; maybe specialists have methods beyond
             | just reading carefully.
        
         | Cthulhu_ wrote:
         | There are a lot of really low-effort articles written nowadays
         | - maybe not so much for The Verge, but definitely for other
         | platforms. They offer payment per word, and they can hire
         | editors that also get paid per word of the article. But it's a
         | race to the bottom, where people push to churn out as many
         | articles as they can per day just to optimize income (or worse,
         | to try and make ends meet).
        
       | me_me_me wrote:
       | This reminds me of the BoJack Horseman plot line where they
       | scanned every actor on set in case they died, so they could
       | recreate them virtually for the movie.
        
         | Cthulhu_ wrote:
         | Some years ago they didn't even need to scan them; they just
         | used old footage to create 'virtual reality' / 'hologram'
         | concerts for Elvis Presley, Michael Jackson and Tupac.
        
         | amitport wrote:
         | And "The Congress" movie
         | https://en.m.wikipedia.org/wiki/The_Congress_(2013_film)
         | 
         | (It's inspired by a book that I haven't read, so I'm not sure
         | if the same idea we're discussing is also in the book... But
         | it's definitely in the movie.)
        
           | phaedrus wrote:
           | Before clicking the link, I thought this was going to be a
           | completely different science fiction story, about (U.S.)
           | congressmen who leave the business of legislative voting to
           | AI copies (brain scans) of themselves while the originals
           | travel full-time to raise funds and votes. After a secret
           | congressional hearing is held on a matter of national
           | security, the scanned copies begin voting in ways that run
           | counter to, or don't make sense to, the original versions,
           | but the copies are not allowed to reveal what it was in that
           | hearing that changed their minds.
           | 
           | I don't remember where I read that plot synopsis, so I might
           | have some details wrong. (If it's a book I haven't read it,
           | or it could have been someone's description of a story idea.)
        
       | mmkos wrote:
       | Is it just me who thinks the title is sensationalised? I can
       | barely see any difference between the two, and the one made by
       | the YouTuber builds on top of the original one.
        
         | AnIdiotOnTheNet wrote:
         | I agree with that, and, while I can't put my finger on why, the
         | new deepfake actually seems a little less real to me.
        
         | nocturnial wrote:
         | It's not just you. I can't tell the difference either.
         | 
         | Maybe someone just has to explain to us what to look for. It
         | could be a curse to know why some people think the deepfake is
         | better. I once talked to a graphics artist about why they
         | thought some effect looked bad, because I didn't see it. They
         | explained it in detail. Now I can't unsee what they were
         | talking about and can easily spot that mistake.
         | 
         | I don't know. Maybe it's better not knowing and enjoying the
         | results than knowing and, each time you see it, thinking:
         | "Hey, they made that mistake."
        
         | aazaa wrote:
         | If you hid the caption, there's no way I could tell, either.
        
         | yourenotsmart wrote:
         | Building "on top of the original one" is not an asset in the
         | deepfake process, it's a hindrance, because then the deepfake
         | inherits the unnatural 3D facial animation, which is most of
         | the reason why traditional VFX 3D facial replacement works so
         | poorly.
        
           | jcims wrote:
           | 100% this is why the example looks so bad.
           | 
           | Here's another example from the same channel with a live
           | actor (Robert Pattinson as Batman):
           | 
           | https://www.youtube.com/watch?v=dmuYz0aZGgU
           | 
           | A careful examination will find all sorts of artifacts but an
           | unprimed general populace wouldn't have any clue.
        
         | dang wrote:
         | Ok we've replaced that with a more neutral description from the
         | article body. Thanks!
        
         | MajorBee wrote:
         | I think the key difference is how the eyes are "reanimated" in
         | the deepfakes. Rendering realistic eyes (with that trademark
         | "spark of life") has always been a challenge for CGI that
         | emulates real life, but these new clips capture the spark in
         | the actors' eyes much better. Luke Skywalker's eyes in the
         | original clip look kind of dead and plastic, the kind you'd
         | find in real-time gameplay (not to mention his nose also looks
         | like the product of bad rhinoplasty). The deepfake does this
         | much better, in my opinion.
         | 
         | Elsewhere in this thread, someone posted a similar comparison
         | clip from The Irishman, and the difference is even more
         | pronounced there. De Niro's eyes actually look the age they're
         | supposed to be in that scene. The original scene had two key
         | problems, in my opinion: one, they decided not to touch the
         | eyes and focused on smoothing the skin (I assume this is
         | because of the technical limitations of doing the de-aging in
         | painstaking CGI); two, and this is a little more of a mystery,
         | they thought they could get away with using De Niro's current
         | 80-year-old voice on a character that is supposed to be 35
         | (ages are ballpark numbers). The raspy voice of an old man is
         | just not something you expect out of a supposedly much younger
         | and healthier man. They should have just gotten a voice actor
         | who could do a convincing impression of De Niro in the 70s and
         | dubbed him in.
        
           | Multicomp wrote:
           | > (with that trademark "spark of life")
           | 
           | This was the big winner for me. The deepfake had "Tarkin" and
           | "Leia" on screen. The originals had 3D CGI (very good CGI,
           | but CGI) with dead eyes.
           | 
           | To add my own anecdote: I totally missed Leia's sequence the
           | first watch-through because I was watching the content and
           | zoned out, forgetting to evaluate her face for deepfake
           | pixels, because it looked real enough for me to suspend
           | disbelief.
           | 
           | I have never been able to look at Leia's face in the original
           | Rogue One scene without forcing myself to say "they did their
           | best, they'll redo it someday for the 8K release, until then
           | grin and bear the dead eyes".
        
         | knuthsat wrote:
         | The deep fake one looks very weird to me. The mouth is
         | sometimes not closed and there is minimal movement on the upper
         | lip on words that would need more.
        
         | bluebubble56 wrote:
         | Not just you - I can't really see the difference either, both
         | look pretty uncanny to me, tbh.
        
       ___________________________________________________________________
       (page generated 2021-07-27 23:02 UTC)