[HN Gopher] Deep Live Cam: Real-time face swapping and one-click...
       ___________________________________________________________________
        
       Deep Live Cam: Real-time face swapping and one-click video deepfake
       tool
        
       Author : blini2077
       Score  : 189 points
       Date   : 2024-08-10 13:05 UTC (9 hours ago)
        
 (HTM) web link (deeplive.cam)
 (TXT) w3m dump (deeplive.cam)
        
       | blini2077 wrote:
       | Deep Live Cam is a cutting-edge AI tool that enables real-time
       | face replacement in videos or images using just a single photo.
       | Perfect for video production, animation, and more.
        
         | emsign wrote:
         | I don't want to see faked videos.
        
           | warkdarrior wrote:
           | Son, you no longer have an option on our free service. But
           | you can subscribe to our deep fake-free service for only
           | $89.99/month.
        
           | exe34 wrote:
           | You won't know the difference.
        
             | luzojeda wrote:
             | That only makes it worse.
        
               | exe34 wrote:
               | resistance is futile. profits go brrrr.
        
           | corn13read2 wrote:
           | Maybe you have been this whole time...
        
       | cs702 wrote:
       | Well, I understand how it works, and I still find it freaking
       | amazing. The quality is... impressive.
       | 
       | On the flip side, the ability to deep-fake a face in real time on
       | a video call is now accessible to pretty much every script kiddie
       | out there.
       | 
       | In other words, you can no longer trust what your eyes see on
       | video calls.
       | 
       | We live in interesting times.
        
         | Xeoncross wrote:
         | It's interesting, because the subconscious ability of the mind
         | to identify discrepancies is incredible (even if we ignore that
         | feeling we get about something).
         | 
          | The feel of counterfeit bills, the color someone chooses to
          | wear, the sound that doesn't quite fit.
         | 
         | I think deep-fakes are mostly a danger to people without a lot
         | of source material for their minds to compare against. You
          | could trick me into believing I was talking with Elon, but
          | not my son.
        
           | reaperducer wrote:
            | _You could trick me into believing I was talking with
            | Elon, but not my son._
           | 
           | And yet there have been several recent studies that show the
           | younger someone is, the more likely they are to be scammed
           | online.
           | 
           | > In 2021, Gen Xers, Millennials, and Gen Z young adults
           | (ages 18-59) were 34% more likely than older adults (ages 60
           | and over) to report losing money to fraud,[1] and some types
           | of fraud stood out. Younger adults reported losses to online
           | shopping fraud - which often started with an ad on social
           | media - far more often than any other fraud type, and most
           | said they simply did not get the items they ordered.[2]
           | Younger adults were over four times more likely than older
           | adults to report a loss on an investment scam.[3] Most of
           | these were bogus cryptocurrency investment opportunities.
           | 
           | https://www.ftc.gov/news-events/data-visualizations/data-
           | spo...
        
             | dexterdog wrote:
             | Older people are less likely to invest in high risk, high
             | yield.
        
               | Rinzler89 wrote:
               | Older people are less likely to have their entire
               | personas and private lives fully documented on social
               | media.
        
               | eltoxo wrote:
               | Younger people really should consider this point.
               | 
               | Personally, I don't use streaming video outside of work
               | and there are no videos of me on youtube or any social
               | media to train a model on even if someone wanted to.
               | 
               | My mother in her 70s doesn't even have a debit card. She
               | thinks the idea is ridiculous and insecure. She writes
               | paper checks and that is it. To put her account number on
               | an electronic device would be completely unthinkable.
               | 
                | While the average older person might be more easily
                | confused by social engineering, the attack surface
                | for an electronic scam is so tiny compared to the
                | average younger person's.
        
               | bsenftner wrote:
                | Only one photo is needed. I've not looked deeply into
                | this specific project, but speaking as an early
                | developer of likeness-transfer algorithms: only one
                | image is needed, and it can even be a crude sketch.
                | If one's true likeness is captured by an image, it
                | can be recreated in full 3D. The catch is an
                | individual's specific facial expressions - the real
                | individual may have a slightly lopsided smile, or
                | smile dimples, or characteristic at-rest facial
                | positions that the swap misses - so they don't look
                | like themselves to those who know them.
        
               | reaperducer wrote:
               | _She writes paper checks and that is it._
               | 
               | As my accountant says: "For every person in your
               | neighborhood committing check fraud, there are ten-
               | thousand people around the world trying to steal your
               | money online."
        
               | mh- wrote:
               | _> She writes paper checks and that is it. To put her
               | account number on an electronic device would be
               | completely unthinkable._
               | 
               | But she's handing a plaintext copy of her account number
               | to everyone she pays with a check..
        
             | bsmithers wrote:
             | > And yet there have been several recent studies that show
             | the younger someone is, the more likely they are to be
             | scammed online.
             | 
             | I think you are misreading the post. Pretty sure they meant
             | 
             |  _you could trick me into believing I was talking with
             | Elon, but you could not trick me into believing I was
             | talking with my son_
             | 
             | To which I agree personally, though I don't know how
             | universal this is.
        
             | vincnetas wrote:
              | i fell victim to such a scam this year (first time i
              | got scammed in over 40 years). Key factor was that i
              | got the link to the scam shop not from a social ad but
              | from my wife :) she got it from an insta ad. So
              | basically my wife scammed me :)
        
           | emsign wrote:
           | What if that source material for young brains gets more and
           | more contaminated with artificial junk?
        
           | cs702 wrote:
           | The key take-away, for me, is that I should "keep my guard
           | up" on any video call about _money or other important
           | matters_ , even if other participants on the call are
           | colleagues, friends, or relatives. There are no guarantees of
            | authenticity anymore. My new motto for video calls is
            | "trust but verify."
        
           | despideme wrote:
           | There's interesting ambiguity in this comment. I interpret
           | the comment as saying, "I could be tricked by a deepfake of a
           | stranger due to a lack of experience with their 'true'
           | behaviors, but would not be tricked so easily when it's
           | someone I know well."
           | 
           | Others here seem to be interpreting the statement as, "I
           | could be tricked because I am an older person, while a
           | younger person would not be so easily deceived."
        
         | Bluecobra wrote:
          | Indeed. Just recently a company got fooled into hiring a
          | remote employee who turned out to be a North Korean hacker:
         | 
         | https://arstechnica.com/tech-policy/2024/07/us-security-firm...
        
         | emsign wrote:
         | Post-truthism has reached live video and is accessible to
         | everyone. Turns out it's still only a few weirdos who love to
         | use it for grifting purposes. I think most normal people are
         | like "what do I need this crazy sh*t for?"
        
         | ibejoeb wrote:
         | It's got a lot of uncanny valley going on. Zuck looks like a
         | corpse. I'm sure it could fool some people, but I'm not
         | terrified yet.
        
         | exe34 wrote:
         | I recommend setting up code words with people. I haven't gone
         | that far myself yet, but in my mind, there are clear phrases I
         | could say to the people in my life that they would be convinced
         | it was me. Unfortunately, until I have the specific talk with
         | them, I suppose anybody could impersonate me to them.
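A code-word check like the one described can even avoid saying the shared phrase aloud on a possibly compromised call. The sketch below is purely illustrative (the function names and HMAC construction are assumptions, not anything from Deep Live Cam): the verifier sends a random challenge, and the other party answers with an HMAC keyed by the secret phrase agreed in person.

```python
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    # Verifier sends a fresh random nonce over the (untrusted) call.
    return secrets.token_hex(16)

def respond(shared_phrase: str, challenge: str) -> str:
    # Callee proves knowledge of the phrase without revealing it;
    # an eavesdropper learns only one HMAC for one nonce.
    return hmac.new(shared_phrase.encode(), challenge.encode(),
                    hashlib.sha256).hexdigest()

def verify(shared_phrase: str, challenge: str, response: str) -> bool:
    # Constant-time comparison avoids leaking the expected digest.
    return hmac.compare_digest(respond(shared_phrase, challenge), response)
```

In practice the hard part is the one described above: agreeing on the phrase out of band, before you ever need it.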
        
       | rnimmer wrote:
       | FTA: "Ethical Use Safeguards
       | 
       | Built-in checks prevent processing of inappropriate content,
       | ensuring legal and ethical use."
       | 
       | I see it claims to not process content with nudity, but all of
       | the examples on the website demo impersonation of famous people,
       | including at least one politician (JD Vance). I'm struggling to
       | understand what the authors consider 'ethical' deepfaking? What
       | is the intended 'ethical' use case here? Of all the things you
       | can build with AI, why this?
        
         | KolmogorovComp wrote:
          | For many (notably Mastercard and Visa), when they say
          | "ethical" they really mean anything but porn.
        
           | hackernewds wrote:
           | that seems like an overreaction. the card processors ban much
           | more questionable trades - such as weapons and terrorism
           | financing
        
             | roamerz wrote:
             | I think "questionable" is very subjective especially when
             | blocking transactions on something that is constitutionally
             | protected. I wonder what would happen if a processor banned
             | paying for something that ended someone's life on a near
             | 1:1 ratio like abortion?
             | 
             | Just using that to highlight the subjective nature of the
             | comment.
        
               | coolspot wrote:
               | GP comment is misleading. Visa/MC do not block gun/ammo
               | purchases in the US.
        
               | roamerz wrote:
               | It's not so much Visa/MC but rather the payment
               | processors that are at issue here.
        
               | mynameisvlad wrote:
               | The payment processors interpret the networks' rules, you
               | do understand that right? If they're banning something,
               | it's because the networks either outright are banning it
               | too or have put enough restrictions and constraints in
               | place that the liability for the transaction doesn't make
               | sense.
               | 
               | The payment processors are doing what the networks tell
               | them to do.
               | 
               | It's not like the processors are actively looking for
               | ways to turn down money; they want as many transactions
               | going through them so they can earn their share of it.
        
               | highcountess wrote:
                | There were several efforts to restrict the people's
                | right and ability to marshal resistance to tyranny,
                | and Visa/MC was very much involved with that, even
                | though they were not the only ones.
        
               | lancesells wrote:
               | When did this happen?
        
               | tourmalinetaco wrote:
               | The entire US financial apparatus is part of the problem.
        
               | bee_rider wrote:
                | In general I think payment processors are not required to
               | associate with anybody. The government (in the US at
               | least) is limited in their ability to prevent you from
               | buying guns and making porn (a form of speech), but they
               | can't make people do the transactions with you; the right
               | to have somebody process payments for you is not
               | constitutionally protected.
               | 
               | But I'd be at least curious (as a non-lawyer) if there
               | could be issues around discriminating against pregnant
               | women in the US, since abortion is a service that is only
               | used by them.
        
               | parineum wrote:
                | The real answer to the guns/abortion comparison is
                | that there are a lot of people who will loudly state
                | their opposition.
                | 
                | No (or few) politicians are going to stand up for
                | porn.
        
             | cess11 wrote:
             | That's because the state is forcing them to.
        
               | tourmalinetaco wrote:
               | The state has never stopped the funding of terrorism.
               | 
               | https://www.nbcnews.com/politics/us-taxpayers-may-
               | funding-ta...
        
               | cess11 wrote:
               | Sure, but that doesn't mean it wants competition in that
               | space.
        
             | Maxatar wrote:
             | I can't find anything to support your claim about weapons.
             | Seems pretty much all online arms dealers I can find
             | selling anything from grenades, machine guns, and even
                | rocket launchers take credit cards, and I'm fairly
                | certain stores accept them too.
        
               | highcountess wrote:
               | I am not sure what the current state of the issue is, but
               | there was an initial effort to restrict gun sales in
               | various devious and deceptive ways since it is illegal to
               | overtly do so because it is legal trade and economic
               | activity.
               | 
               | I would not be surprised though if the clear illegality
               | of the violation of the Constitution of such efforts were
               | brought to the attention of the payment processors, and
               | they were reminded that they would severely regret
               | hastening attention on an effort that still needs to
               | happen, a public electronic payment processing capacity.
        
               | happyopossum wrote:
               | > grenades, machine guns, and even rocket launchers
               | 
                | Umm, yeah - what country are you buying live grenades
                | or a working rocket launcher online in with a
                | Mastercard? Cuz it's not the US or Canada. And if
                | it's not a live grenade or working rocket launcher,
                | it's no different than any hunk of metal.
        
               | sim7c00 wrote:
                | uh. yeah... us and canada are a tiny fraction of the
                | world. also, what really counts as buying using visa
                | or mastercard? if i use visa to buy crypto and then
                | get explosives (which can be done transparently)
                | there is nothing they can or will do about it... -
                | buying things online has nothing to do with countries
                | or borders, nor is it always clear, to payment
                | providers or even customers, what kind of scheme
                | enables a payment..
        
             | sneak wrote:
             | I buy weapons with my Visa card all the time.
        
             | PontifexMinimus wrote:
             | And backed off from blocking Onlyfans.
        
           | instagraham wrote:
            | That they do, but perhaps the relevant context is that
            | while porn is globally unregulatable, the one set of
            | entities that has proven able to regulate it (or at
            | least exercise some control over it) has been payment
            | processors like Visa and Mastercard.
           | 
           | FT had a fantastic podcast on the porn industry and the guy
           | behind Mindgeek. Like many stories about multinational
            | entities, you constantly hear the usual refrains - no one can
           | regulate this, the entities keep changing their name and
           | face, there is no accountability, etc. But when Visa and
           | Mastercard threaten to pull their payments, the companies
           | have to listen.
           | 
            | Visa and Mastercard are the de facto regulators of porn
            | today, and mostly do so to prevent nonconsensual and
            | extreme fetish stuff from being displayed on mainstream
            | platforms.
           | 
           | From what I gathered from the podcast, they're not super keen
           | on being the regulator - but it's a dirty job and somebody
           | has to do it.
        
           | godelski wrote:
            | So....
            | 
            |     ffmpeg -i porn.mp4 \
            |       -vf "crop=crop_w:crop_h:coord_x:coord_y" \
            |       "definitely not porn.mp4"
            | 
            |     *faceswap "definitely not porn.mp4"*
            | 
            |     ffmpeg -i porn.mp4 -i "swapped definitely not porn.mp4" \
            |       -filter_complex "[0][1]overlay=coord_x:coord_y" \
            |       -c:a copy "deepfake porn.mp4"
            | 
            | Got it
        
         | rikafurude21 wrote:
          |     # process image to videos
          |     if modules.globals.nsfw == False:
          |         from modules.predicter import predict_video
          |         if predict_video(modules.globals.target_path):
          |             destroy()
        
           | exe34 wrote:
           | I'm a big fan of explicit checks like this.
        
         | reaperducer wrote:
         | _Of all the things you can build with AI, why this?_
         | 
         | That can be asked of 90% of what's come out of the latest AI
         | bubble so far.
         | 
         | Like a lot of technology, AI has so much potential for good.
         | And we use it for things like games that simulate killing one
         | another, or making fake news web sites, or pushing people to
         | riot over lies, or making 12-year-olds addicted to apps, or
         | eliminating the jobs of people who need those jobs the most,
         | or, yes, pornography.
         | 
         | We can do better.
        
           | parineum wrote:
           | > AI has so much potential for good.
           | 
           | Like what?
        
             | dylan604 wrote:
             | I'm hoping that at some point the novelty and hype will die
             | down so that the headline grabbing "send a follow up email"
             | or "summarize call" will get out of the way so the more
             | impressive things like detecting medical conditions
              | months/years earlier than human doctors will be much
              | more visible. The things for making people lazy are a
              | total waste to me.
        
             | arjie wrote:
             | My wife used ChatGPT and Adobe's AI to design our wedding
             | outfits so there's that. Turned out great!
        
         | tdeck wrote:
          | The answer is that anyone working on deepfakes doesn't
          | care much about ethics, or they wouldn't be doing it in
          | the first place.
        
           | ithkuil wrote:
            | OTOH, now that we know the technology is possible, would
            | you prefer that only some actors had the ability to do
            | it? Or that it stayed obscure, with the lingering doubt
            | that anything you see could be a deepfake, but plausible
            | deniability that it would be too hard to actually carry
            | out?
            | 
            | If the technology is made widely available, that just
            | reveals that the Pandora's box was already open.
        
             | dylan604 wrote:
             | The claim of "deepfake" will be much more difficult to
             | disprove than "my account was hacked"
        
           | godelski wrote:
           | I think this is an oversimplification that undermines your
           | goals.
           | 
           | If you're unwilling to recognize the benefits of something,
           | it becomes easier to dismiss your argument. Instead, the
           | truth is balancing trade-offs and benefits. Certainly there
           | is a clear and harmful downside to this tech. But there are
           | benefits. It does save a lot of money for the entertainment
           | industry when you need to edit or do retakes. The most famous
           | example might be superman[0].
           | 
           | The issue is that when the downsides get easy to dismiss, it
           | becomes easy to get lost in the upsides. It'll get worse
           | because few people consider themselves unethical. We're all
           | engineers and we all have fallen for this trap in some way or
           | another. But we also need to remember that the road to hell
           | isn't paved with malicious intent...
           | 
           | [0] https://www.youtube.com/watch?v=2nxanN85O84
        
             | Vegenoid wrote:
             | > But there are benefits. It does save a lot of money for
             | the entertainment industry when you need to edit or do
             | retakes.
             | 
             | I think the downside is 10 orders of magnitude larger than
             | this benefit.
             | 
             | I also think there are more people who'd call this usage a
             | downside than a benefit.
        
         | greg_V wrote:
         | The number one use case for this will be to beat KYC checks,
         | which means KYC procedures will get more annoying and
         | bothersome for everyone else!
        
         | godelski wrote:
         | I'm a researcher who's made one of the best face generators.
         | I'd like to address your questions and discuss a larger more
         | critical point.
         | 
         | I too have ethical concerns. There are upsides though. It is a
         | powerful tool for image and video editing (for swapping, you
         | still need a generator on the backbone)[0]. It is a powerful
         | tool for compression and upsampling (your generative model
         | __is__ a compression of (a subset of) human faces, so you don't
         | need to transmit the same data across the wire). It is easy to
         | focus on the upsides and see the benefits. It is easy to not
         | spend as much time and creative thinking directed at malicious
         | usages (you're not intending to use or develop something for
          | malicious acts, right?!). But there are two ways to
          | determine malicious usages of a technology: 1) you emulate
          | the thinking of a malicious actor, contemplating how they
          | would use your tool, and 2) time.
         | 
         | But I also do think application matters. I think this can get
         | hairy when you get nuanced. Are all deepfakes that are done
          | without consent of the person being impersonated
          | unethical? I think at face (pun intended) value, this
          | looks like an unambiguous yes. But what about parody like
          | Sassy Justice?[1]
         | Intent here is not to deceive, and the deep fakes add to the
         | absurdity of the characters, and thus the messages. Satire and
         | parody itself doesn't work unless mimicry exists[2]. Certainly
         | these comedic avenues are critical tools in democracy,
         | challenging authority, and challenging mass logic failures[3]
         | (which often happens specifically due to oversimplification and
         | not thinking about the details or abuse).
         | 
         | I want to make these points because I think things are post hoc
         | far easier to dismiss than a priori. We're all argumentative
         | nerds, and I think despite the fact that we constantly make
          | this mistake, we can all recognize that cornering someone
          | doesn't typically yield surrender, but rather them fighting
          | back harder (why you never win an argument on the internet,
          | despite having all the facts and being correct). And since
          | we're mostly builders (of something) here, we all need to
          | take much more care. _The simpler you rationalize something
          | to be post hoc, the more difficult it will be to identify a
          | priori._
         | 
         | Even at the time, I had reservations when building what I made.
         | But one thing I've found exceptionally difficult in ML research
         | is that it is hard to convince the community that data is data.
         | The structure of data may be different and that may mean we
         | need more nuance in certain areas than others (which is
         | exciting, as that's more research!), but at the end of it, data
         | is data. But we get trapped in our common datasets to
         | evaluate[4] and more and more, our research needs to be
          | indistinguishable from a product (or at least an MVP). If we can
         | make progress by moving away from Lena, I think we can make
         | progress by moving away from faces AND by being more nuanced.
         | 
         | I don't regret building what I built, but I do wish there was
         | equal weighting to the part of my voice that speaks about
         | nuance and care (it is specifically that voice that led to my
         | successful outcomes too). The world is messy and chaotic. We
         | (almost) all want to clean it up and make it better. But
         | because of how far we've advanced, we need to recognize that
         | doing good (or more good than harm) is becoming harder and
         | harder. Because as you advance in any topic, the details matter
         | more and more. We are biased towards simplicity and biased
         | towards thinking we are doing only good[5], and we need to
         | fight this part of ourselves. I think it is important to
         | remember that a lie can be infinitely simple (most conspiracies
         | are indistinguishable from "wizards did it"), but accuracy of a
         | truth is bounded by complexity (and real truth, if such a thing
         | exists, has extreme or infinite complexity).
         | 
         | With that said, one of my greatest fears of AI, and what I
         | think presents the largest danger, is that we outsource our
         | thinking to these machines (especially doing so before they can
         | actually think[6]). That is outsourcing one of the key
         | ingredients into what defines us as humans. In the same way
         | here, I think it is easy to get lost in the upsides and
         | benefits. To build with the greatest intentions! But above all,
         | we cannot outsource our humanity.
         | 
         | Ethics is a challenging subject and it often doesn't help that
         | we only get formal education through gen ed classes. But if
         | you're in STEM, it is essential that you are also a
         | philosopher, studying your meta topic. Don't need to publish
          | there, but do think about it. Even just over beers with your
         | friends. Remember, it's not about being right -- such a thing
         | doesn't exist --, it is about being less wrong[7]
         | 
         | [0] https://www.youtube.com/watch?v=2nxanN85O84
         | 
         | [1] https://www.youtube.com/@SassyJustice
         | 
         | [2]
         | https://www.supremecourt.gov/DocketPDF/22/22-293/242292/2022...
         | 
         | [3] https://www.gutenberg.org/files/1080/1080-h/1080-h.htm
         | 
         | [4] I do think face data can be helpful when evaluating models
         | as our brains are quite adept at recognizing faces and even
         | small imperfections. But this should make it all that much
         | clearer that evaluation is __very__ hard.
         | 
         | [5] I think it is better to frame tech (and science) like a
         | coin. It has value. The good or evil question is based on how
         | the coin is spent. Even more so how the same type of coins are
         | predominantly spent. Both matter and the topic is coupled, but
         | we also need to distinguish the variables.
         | 
         | [6] Please don't nerdsplain to me how GPTs "reason". I've read
         | the papers you're about to reply with. I recognize that others
         | disagree, but I am a researcher in this field and my view isn't
         | even an uncommon one. I'm happy to discuss, but telling me I'm
         | wrong will go nowhere.
         | 
         | [7] https://hermiene.net/essays-trans/relativity_of_wrong.html
        
       | goda90 wrote:
       | How long until we see "anti-cheat"-like software to try to detect
       | this stuff for video chatting?
        
         | diggan wrote:
         | I'm guessing that finding a technology to try to detect this
         | would be over-engineering. I'd love to see a sample where the
         | person with the swapped face passes their hand with spread
         | fingers over their face, and see how it handles that.
        
           | Raicuparta wrote:
           | I've tried it, it currently does not handle that scenario
           | well at all.
        
             | warkdarrior wrote:
             | Wait three months, it'll be fixed.
        
         | radicality wrote:
          | Perhaps we'll see a requirement to use a closed platform
          | like an iPhone, where it would be much easier to attest
          | that the feed is not tampered with.
         | 
         | It's already a requirement sometimes to take a video of your
         | face from multiple angles using your phone - some identity
         | verification service forced me to do it. I imagine that stuff
         | like this will evolve to check for hardware attestations more,
         | or use info from depth/lidar sensors to verify video and other
         | sensor data align.
        
       | dpweb wrote:
        | Fascinating software, although I hope the idea "we're gonna
        | rely on people to be good humans and DO THE RIGHT THING" is
        | quickly abandoned, and instead there is just as robust
        | development of detection software alongside newer and better
        | deepfake tools.
        
         | BugsJustFindMe wrote:
         | Detection is ultimately impossible. Anything you can detect can
         | be explicitly evaded.
        
           | exe34 wrote:
            | I dream of a world where web-of-trust signatures are taken
           | seriously. A few hops should get you to a real human holding
           | the camera who claims it's a real recording. If that person
           | or someone along the way is regularly flagged as malicious by
           | others that you trust, you can blacklist them.
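            | 
            | A minimal sketch of that hop search, as a breadth-first walk
            | over a trust graph (Python; the graph shape and names are
            | made up for illustration):

```python
from collections import deque

def trust_path(graph, me, target, blacklist=frozenset(), max_hops=3):
    """Breadth-first search for a chain of trust edges from `me`
    to `target`, skipping anyone we've blacklisted."""
    queue, seen = deque([[me]]), {me}
    while queue:
        path = queue.popleft()
        if len(path) - 1 > max_hops:
            continue
        if path[-1] == target:
            return path  # a few hops to a human who vouches for the recording
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen and nxt not in blacklist:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no trusted chain: treat the recording as unverified

# Hypothetical graph of who directly vouches for whom.
web = {"me": ["alice"], "alice": ["bob"], "bob": ["camera_op"]}
print(trust_path(web, "me", "camera_op"))
print(trust_path(web, "me", "camera_op", blacklist={"bob"}))
```

            | Flagging someone malicious is then just adding them to the
            | blacklist, which severs every chain that runs through them.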
        
           | mhuffman wrote:
            | I think the terminal solution to this, in the US and maybe
            | the EU, will be putting identifying code/chips into all
            | devices capable of connecting to the Internet, tagging all
            | content (video, text, audio, images) in some way that
            | browsers will be legally required to interpret. This will
            | make everyone either unable to use the Internet or known to
            | anyone that "needs to know".
        
       | nope1000 wrote:
        | Technically impressive, but I fail to see a good use case for
        | it that isn't related to propaganda or scams, and the website
        | doesn't seem to list one either.
        
         | pavel_lishin wrote:
          | If I had to dig way, way down to the bottom of the barrel for
          | use cases, it would be _very_ funny if everyone showed up to a
          | meeting wearing one of the attendee's faces.
        
           | nope1000 wrote:
           | That would have been the dream of students during remote
           | schooling times.
        
           | stavros wrote:
           | Don't they already?
        
         | nwoli wrote:
         | One legitimate one I could imagine is if people want to pursue
         | a career in the adult film industry but without having to
         | reveal their true face (not with a celebrity face of course)
        
           | eltoxo wrote:
            | I have been thinking about this comment: if it isn't a
            | celebrity's face, then whose? Just someone you know's?
            | 
            | If not someone you know, then just a random stranger's?
            | 
            | Not that I can think of a better use case, but it is telling
            | if this is the best we can do.
        
             | nwoli wrote:
             | I just meant a synthetic human (a dalle/stylegan/stable
             | diffusion face output).
        
             | irq-1 wrote:
              | Job interviews, bank loans, anywhere racism or
              | discrimination exists (or might exist).
        
           | netsharc wrote:
            | There's some sort of filter on Instagram (or maybe it's a
            | deepfake tool) that replaces girls' eyes with a set of nice
            | eyes, but the tool seems to have only that one pair, so all
            | the videos of girls with these eyes are quite noticeable.
            | And so many "influencers" have this pair of eyes that it's
            | depressing.
           | 
           | It's even more amusing when one sees glitches like eyes
           | appearing in front of a strand of hair...
        
         | muixoozie wrote:
          | Only things I can think of:
          | 
          | - A streamer goofing around.
          | 
          | - Perhaps something like this could be used to map your facial
          | expressions onto video game characters in real time.
          | 
          | - Could take TikTok-style social media to the next level of
          | absurdity: make me into a meme, "Ghana says goodbye", etc.
        
         | skocznymroczny wrote:
         | It's fun for goofing around. Imagine a conference call with
         | your buddies and each one comes with a different deepfake. Kind
         | of like a costume party but on camera.
        
           | luzojeda wrote:
           | I don't think that compensates all the bad uses it will
           | probably have.
        
         | bsenftner wrote:
          | Oh, those without imagination: this is marketing gold for
          | makeup and fashion advertising companies. The "good use" is
          | the multi-billion-dollar makeup and fashion industry. People
          | will submit their own images so they can see themselves
          | randomly appear in their own media feeds in the latest
          | fashions. This is a no-brainer for those with connections to
          | fashion marketing.
        
         | rebeccaskinner wrote:
          | Just a few off the top of my head:
          | 
          | Movies and TV:
          | 
          | - As an alternative to motion capture for animation
          | 
          | - As an alternative to existing de-aging CGI when you want to
          | flash back to a younger version of a character (especially
          | when newer sequels are being made for much older movies)
          | 
          | - As an easy way to get some additional footage if an actor
          | no longer looks the part
          | 
          | In a professional setting:
          | 
          | - Conduct job interviews where interviewees' faces are mapped
          | to a few pre-defined images, to reduce a major source of
          | implicit bias in interviewing
          | 
          | - Get some footage of yourself when you're looking your best,
          | with great lighting, and use that rather than being
          | off-camera if you're joining a meeting when you don't look
          | great
          | 
          | - Create virtual spokespeople to represent your company in
          | marketing, and allow that person to be played by different
          | actors
          | 
          | News and politics:
          | 
          | - An alternative to blurred or blocked-out faces for people
          | giving interviews or whistleblowing
          | 
          | - Allow people to testify in court (virtually) without
          | revealing their identity and risking retaliation
        
           | realce wrote:
           | None of these uses - most of which benefit a slim percentage
           | of society or are needlessly complicated by this technology -
           | outweigh the severe downsides to society. It's the apex of
           | foolishness to act glibly about this.
           | 
           | This all ends extremely badly.
        
             | mhuffman wrote:
              | The counterpoint is that some or all of these could make
              | money, and not enough people care how it ends if money is
              | being made. I suspect it will take a terrorist plot using
              | generative AI, or something similarly significant, to
              | shut the door, and even then it will be disallowed only
              | for us commoners, not for the big four or five AI
              | companies, and not for the rest of the world.
        
             | rebeccaskinner wrote:
             | Whether the upsides outweigh the downsides or not is a
             | different discussion. My point is that there are plenty of
             | ways someone might use this technology. If you do think
             | that this technology is a net negative to society and
             | should be controlled or prohibited, then it's still
             | important to understand the potential ways someone might
             | want to apply it so that you can be prepared to make your
             | argument.
             | 
              | Personally, I have mixed feelings. I think most of the
              | outcomes we're concerned about are going to happen one
              | way or another, and developing in public, or even
              | commoditizing access, is going to be a net reduction in
              | harm compared to locking it up and pretending it doesn't
              | exist, while allowing people (and nations) with the
              | resources to run large models in secret to develop and
              | use the technology against people who are ignorant of
              | what's possible.
        
       | doctorhandshake wrote:
       | This makes me think there could be use for a very very (very)
       | easy-to-use tool that allows two parties to choose and store a
       | secret (verbally pronounceable) passphrase known only to the two
       | of them, for use in situations in which it might be necessary to
       | 'sign' a video chat or audio conversation in which one party's
       | identity might be in doubt.
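        | 
        | A hedged sketch of such a tool's core check: a
        | challenge-response over the shared passphrase, so the phrase
        | itself is never spoken on the (possibly recorded) call. Python
        | stdlib only; all names are illustrative:

```python
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    # The doubting party reads this fresh nonce aloud.
    return secrets.token_hex(8)

def respond(passphrase: str, challenge: str) -> str:
    # The other party derives a short code from the shared passphrase
    # and the nonce, then reads back the code, never the passphrase.
    mac = hmac.new(passphrase.encode(), challenge.encode(), hashlib.sha256)
    return mac.hexdigest()[:8]

def verify(passphrase: str, challenge: str, response: str) -> bool:
    # Constant-time comparison; a code replayed from an old call
    # fails against a fresh nonce.
    return hmac.compare_digest(respond(passphrase, challenge), response)
```

        | Because the nonce changes every call, an eavesdropper who
        | records one exchange can't reuse the code later.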
        
         | reaperducer wrote:
         | _a very very (very) easy-to-use tool that allows two parties to
         | choose and store a secret (verbally pronounceable) passphrase
         | known only to the two of them_
         | 
         | So, quite literally, a "password" in its original pre-internet
         | meaning.
        
           | doctorhandshake wrote:
           | Haha yes. Maybe a secret knock?
        
         | 101008 wrote:
         | I always liked that idea (not original, I know) in the Harry
         | Potter books (Harry Potter and the Deathly Hallows, more
         | specifically), where two people ask a private question they
         | should only know to be sure they are not being impersonated.
        
         | sebastiennight wrote:
         | I've seen people try and share such a "password" verbally on a
         | video call. With recording and live transcribing on. From free-
         | tier extensions with wobbly privacy policies.
         | 
         | This won't work.
         | 
         | I've resorted to using OTP apps with family and coworkers.
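          | 
          | For reference, the code those OTP apps compute is just RFC
          | 6238 TOTP, which fits in a few lines of stdlib Python (a
          | sketch; real deployments also handle secret provisioning and
          | clock skew):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII key "12345678901234567890" at T=59s.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))  # 94287082
```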
        
           | mynameisvlad wrote:
           | So you force your parents/kids/aunts/etc to give you a one
           | time code every time they want to talk to you?
           | 
           | That seems extremely clunky and impersonal and I couldn't
           | imagine anyone in my family willingly agreeing to do it.
        
       | haxiomic wrote:
        | I miss the recent past, when new tech felt exciting and
        | inspiring. For the last few years, new developments are often
        | coupled with anxiety about the new harms they make possible,
        | and often unclear benefits. I wonder how much of this is
        | 'inevitable' (at a large enough scale we will always be
        | exploring new possibility spaces as they become available) and
        | how much is our own choosing: we put resources into building
        | these things in full awareness, because we think they bring
        | value over focusing on other things. I realise, though, that
        | it is useful for society to develop understanding of and
        | defences against these things early.
        | 
        | I've noticed I've steadily become more ashamed to be associated
        | with tech. I'm still processing how to react to this and what
        | to choose to work on in response.
        | 
        | Am I in a bubble? Do you share similar feelings, or are yours
        | quite different? I am very curious.
        
         | diggan wrote:
          | > I've noticed I've steadily become more ashamed to be
          | associated with tech
          | 
          | Are you actively contributing to the areas you feel ashamed
          | about? If not, you shouldn't really feel ashamed about what
          | other people choose to work on, even if both of you work "in
          | tech".
          | 
          | I'm sure not all people working in medical research agree
          | with what all other researchers are working on, but you
          | cannot really control what others work on, so why feel
          | ashamed over it?
        
           | reaperducer wrote:
           | _you shouldn 't really feel ashamed about what other people
           | chose to work on, even if both of you work "in tech"._
           | 
           | Why not? Someone who builds boxes that hold bombs can be
           | ashamed of being in the munitions industry, even if they
           | don't make the actual bombs.
        
             | diggan wrote:
              | Right, but if you're in general "manufacturing", there
              | isn't much point in feeling ashamed that some parts of
              | the industry focus on munitions manufacturing.
        
         | 0xf00ff00f wrote:
         | I feel the same way. I can't think of a single legitimate use
         | case for this. I wish all those GPU teraflops were being used
         | for something else.
        
         | chillingeffect wrote:
          | Definitely with you. There used to be a higher barrier to
          | entry, so a certain passion was necessary, and it was less
          | about money and more about sharing info and joy. Now that
          | networking tech has been "democratized", it's just another
          | medium where the usual human pain and greed play out. High
          | school again, but with real consequences for people's lives.
        
       | Willingham wrote:
       | Just need to add voice augmentation and every grandma and grandpa
       | in the world will have their bank accounts cleaned out! Better go
       | warn mine now! XD
        
         | reaperducer wrote:
         | _Just need to add voice augmentation and every grandma and
         | grandpa in the world will have their bank accounts cleaned
         | out!_
         | 
         | Only if by "grandma" you mean "Millennial" and by "grandpa" you
         | mean "Gen Z." Your ageism doesn't jibe with reality:
         | 
         | https://www.ftc.gov/news-events/data-visualizations/data-spo...
        
           | hungie wrote:
           | I think you should read that again. It's clear that different
           | age groups fall for different scams and have different
           | impacts from them.
           | 
           | Grandparents absolutely fall for some scams at
           | disproportionate rates. And are less likely to be able to
           | recover. (A 19 year old who loses everything has many more
           | productive years to recover than a 72 year old.)
           | 
            | Also, humorously, millennials _are_ starting to become
            | grandma and grandpa. Elder millennials are in their
            | mid-40s. It's young, but not impossible for them to be
            | grandparents now.
        
         | curiousgal wrote:
          | In the world? You'd be surprised by how many grandmas out
          | there don't have bank accounts or access to the Internet...
        
       | XorNot wrote:
       | A practical use of this would be to animate your face onto a CGI
       | model which was independently posed for the purposes of video
       | meetings - which is something I've always wanted.
       | 
       | Let me separate my face, body and words and craft the experience.
        
         | greeniskool wrote:
         | Face tracking has existed for years now. I frankly don't see
         | what's different between what you described and FaceRig.
        
           | XorNot wrote:
           | You're missing the point: I want a fake version of myself.
           | 
           | I want a model which is made photoreal with my own image, so
           | it can be given a voice in real time with my words, but a
           | filtered version of my facial expression and pose.
           | 
           | So how I look and act is essentially scriptable.
        
             | radicality wrote:
              | I think that's already happening. You can buy a trained
              | model of someone to do 24/7 live-streaming peddling
              | products, or even, Black-Mirror-esque, bring back
              | deceased loved ones. A company in China selling this is
              | Silicon Intelligence.
             | 
             | https://www.technologyreview.com/2023/09/19/1079832/chinese
             | -...
             | 
             | https://www.technologyreview.com/2024/05/07/1092116/deepfak
             | e...
        
         | gunalx wrote:
          | Straight up, look into vtubing; it's already done.
          | https://inochi2d.com/
        
       | illnewsthat wrote:
       | What's the technical difference in how this works vs. previous
       | face swapping tech (like Snapchat filters)?
        
       | zug_zug wrote:
       | Technologically seems cool, and the first use that pops into mind
       | is "wouldn't this be funny to prank my friend?"
       | 
       | But maybe no, it wouldn't. Maybe it'd be deeply disconcerting. We
       | have very strong norms around honesty as a society, and maybe
       | crossing them in video just for a joke is comparably crass to
       | giving somebody a fake winning lottery ticket.
        
       | hackernewds wrote:
       | where are all these "wow impressive" comments coming from?
       | clicking "Get Started" dumps you into a loop of landing on the
       | home page
        
         | jakepage91 wrote:
         | the README.md seems straightforward enough.
        
           | robxorb wrote:
           | Where is the repo? Stuck in the landing page loop here and no
           | github link I could find.
        
             | stavros wrote:
             | Yeah all of the page's links just go to the same page,
             | except the "experience live cam" link at the top. That goes
             | to this:
             | 
             | https://github.com/hacksider/Deep-Live-Cam
             | 
             | Took me multiple minutes to find too.
        
       | declan_roberts wrote:
       | "Built-in checks prevent processing of inappropriate content,
       | ensuring legal and ethical use."
       | 
       | A software engineer says to himself, if only I could keep these
       | guns from jumping off the table and shooting people.
        
       | FredPret wrote:
       | I think AI + deepfakes will increase the value pressure on in-
       | person interactions - ie, the only time when you can (for now)
       | believe your eyes and ears.
       | 
       | I wonder how politics can be transacted in such an environment.
       | Old-timey first-past-the-post might be the optimal solution if
       | you can't trust anything from out of earshot.
        
         | carapace wrote:
         | > I wonder how politics can be transacted in such an
         | environment.
         | 
         | Codes and seals predate computers (by quite a bit.)
         | 
         | https://en.wikipedia.org/wiki/Code_(cryptography)
         | 
         | https://en.wikipedia.org/wiki/Seal_(emblem)
        
           | FredPret wrote:
           | Those are relevant to the workings of government.
           | 
           | The process of politicians debating and getting elected is
           | going to have to be much more local. Just look at how easy it
           | is to spread misinformation now.
        
             | carapace wrote:
             | My hope is that we (as a global society) re-learn to value
             | honor and honesty.
             | 
             | > The process of politicians debating and getting elected
             | is going to have to be much more local.
             | 
             | I'm no expert on government but that seems like it would be
             | a good thing. IMO the best but most expensive form of
             | government is Quaker-style Consensus Decision Making:
             | 
             | https://en.wikipedia.org/wiki/Quaker_decision-making
             | 
             | https://en.wikipedia.org/wiki/Consensus_decision-making
        
         | shireboy wrote:
         | In person interactions like this CIA mask expert tricking a US
         | president? https://spyscape.com/article/master-of-disguise-how-
         | the-cias...
        
           | jaynetics wrote:
           | This is a nice article, but I don't think it works as a
           | counter argument to GP. Deep fake shenanigans are way more
           | scalable and thus more likely to affect average people than
           | these elite spy techniques.
        
       | Stagnant wrote:
        | Looks like this project is a fork of the discontinued roop[0],
        | with primarily some UI improvements. One of roop's main
        | developers has been working on facefusion[1] for the past
        | year; it produces by far the most convincing results of the
        | ones I've seen, and it also supports real-time webcam face
        | swapping.
       | 
       | 0: https://github.com/s0md3v/roop/
       | 
       | 1: https://github.com/facefusion/facefusion
        
       | KaiserPro wrote:
       | Is there a legitimate usecase for this?
       | 
       | Like when they were brainstorming this as a product, what was the
       | persona/vertical they were targeting?
        
       | freeone3000 wrote:
       | The future of v-tubers is here
        
       | RafelMri wrote:
       | Interesting... This project is built upon "GFPGAN v1.4"
       | (https://github.com/TencentARC/GFPGAN) and "FaceSwap Extension -
       | Automatic 1111 - Proof of Concept"
       | (https://github.com/revolverocelot1/-webui-faceswap-unlocked).
        | The GFPGAN project is itself grounded in the paper "GFP-GAN:
        | Towards Real-World Blind Face Restoration with Generative
        | Facial Prior" by Wang et al.
        | (https://xinntao.github.io/projects/gfpgan)
        
         | ed wrote:
         | This is not a new face swapping technique, it's a wrapper
         | around inswapper (aka InstantID, an IP adapter):
         | https://github.com/haofanwang/inswapper
         | 
         | Relevant source https://github.com/hacksider/Deep-Live-
         | Cam/blob/main/modules...
        
       | chefandy wrote:
       | Well, this will make for some "interesting" viral campaign
       | fodder.
        
       | araes wrote:
       | Huh. Thought politics was dead with VASA-1 [1], EMO [2], and
       | Animate / Outfit anyone [3], so we could clothe people in
       | anything, animate them any way we want, put them anywhere, and
       | have them say anything to the public. "Thoughts and prayers
       | victims..."
       | 
        | However, this really nails that coffin shut itself. Wonder if
        | I can:
       | 
       | - Sit at home in pajamas.
       | 
       | - Change my face to Sec. of Def. Lloyd Austin.
       | 
       | - Put myself in a nice suit from TV
       | 
        | - Call the White House with an autotuned voice, pretending to
        | be going in for surgery yet again because of life-threatening
        | complications
       | 
       | - Send the entire military into conniptions (maybe mention some
       | dangerous news I need to warn them about before the emergency
       | rush surgery starts)
       | 
        | Edit: This [4] might be an Animate / Outfit Anyone image...
        | It's difficult to tell. Even with huge amounts of experience,
        | the quality has become too good, too quickly, to check
        | thousands of depressing murder images for fakes because one
        | might be a BS heart-string story. All stories on the WWW are
        | now "that might be fake, unless I can personally check."
        | Al-Arabiya upvoted casinos and lotteries for Muslims recently.
        | [5] They all might be fake.
       | 
       | [1] https://www.microsoft.com/en-us/research/project/vasa-1/
       | 
       | [2] https://humanaigc.github.io/emote-portrait-alive/
       | 
       | [3] https://humanaigc.github.io/animate-anyone/
       | 
       | [4]
       | https://www.reuters.com/resizer/v2/https%3A%2F%2Fcloudfront-...
       | 
       | [5] https://english.alarabiya.net/News/gulf/2024/07/29/uae-
       | grant...
        
       | xnx wrote:
       | https://github.com/C0untFloyd/roop-unleashed does this for free
       | on your own computer
        
         | petesergeant wrote:
         | So does this, the landing page is a wrapper for
         | https://github.com/hacksider/Deep-Live-Cam
        
       | hankchinaski wrote:
       | Now the question is, what is the use case that does not entail
       | misinformation or personal amusement?
        
         | bufferoverflow wrote:
         | Cam girls will get some competition from guys.
        
       | jader201 wrote:
       | I feel like this is one step closer to the Black Mirror episode
       | where the grieving widow orders an AI version of her late
       | husband.
       | 
       | And I don't say this with excitement.
        
       | paul7986 wrote:
        | Neil deGrasse Tyson says it best: the Internet will die
        | because of this type of technology.
        | https://www.youtube.com/shorts/AxM2XTwaaUA
        | 
        | AI will kill the Internet we know today. On the new one, I'm
        | guessing you will have to have an Internet license attached to
        | your identity, backed by your Internet reputation, which you
        | always want to keep high for veracity/validity. You can still
        | post anonymously, but it won't hold as much weight compared
        | to posting under your verified Internet identity. I've posted
        | this idea a good number of times here and it gets downvoted,
        | but with the IRS in bed with ID.me (Elon Musk is involved
        | with them in some capacity), you can see ID.me and the IRS as
        | a small step in this direction. Otherwise no one uses the
        | Internet (zero trust of it)... it dies and we go back to
        | reading books and meeting in person (which doesn't sound all
        | that bad, yet I've never read a book before).
        
       | mikkom wrote:
       | This is getting scary to be honest
       | 
       | And this is the worst quality it will ever be. In the future it
       | will be impossible to know who we are talking with online.
        
       | kshri24 wrote:
        | I am not at all comfortable with this. Even though it is an
        | amazing demo, I feel the release of this tool could not have
        | been more ill-timed: it has all the potential to wreck the US
        | elections this year. I don't really know what the optimal
        | release time would have been, but I don't see how this can be
        | used for good at all. And that is just considering the
        | implications for elections; I am not even going to think about
        | what it would mean for child porn, terrorism, or even
        | assassinations orchestrated to destabilize a government,
        | leading to civil or world wars. Lots of things can go wrong
        | with this tech.
        
         | mc32 wrote:
         | Another huge issue is fraud. People impersonating others for
         | financial fraud, etc.
         | 
          | The question is, is it stoppable? I doubt anyone thinks it
          | can be stopped unless you get into
          | fascistic/communistic/authoritarian tactics of arresting
          | people for using it at all.
        
       | cdrini wrote:
       | It's really getting to the point where multimedia online
       | shouldn't be trusted unless it's from a reputable source and
       | cross verified.
       | 
        | I wonder, is there a universe where cameras are updated to add
        | some sort of digital signature to videos/photos, to indicate
        | they are real and haven't been tampered with? Is that
        | feasible/possible? I'm not skilled enough with cryptography to
        | know, but if we can digitally sign documents with some amount
        | of faith...
        | 
        | I've heard folks mention trying to tag AI photos/videos, but
        | it seems like tagging non-AI photos/videos is more feasible?
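        | 
        | As a toy illustration of the signing idea: a real scheme (e.g.
        | C2PA / Content Credentials) gives the camera a per-device
        | private key so anyone can verify with the public key. Python's
        | stdlib has no asymmetric signing, so HMAC stands in below,
        | meaning verification needs the shared key:

```python
import hashlib
import hmac

def sign_frame(device_key: bytes, frame: bytes) -> str:
    # Hash the raw sensor bytes, then MAC the hash with the device key;
    # the signature would travel alongside the file as metadata.
    digest = hashlib.sha256(frame).digest()
    return hmac.new(device_key, digest, hashlib.sha256).hexdigest()

def verify_frame(device_key: bytes, frame: bytes, sig: str) -> bool:
    # Any edit to the pixels invalidates the signature.
    return hmac.compare_digest(sign_frame(device_key, frame), sig)
```

        | Copying the file leaves the signature valid; editing the
        | frame, or re-rendering it through a face swap, does not.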
        
         | bravoetch wrote:
         | The idea of a signed and verified stream has only been used to
         | enforce old-school distribution rights. Because of this, the
         | implementations are clunky and have zero incentive for consumer
         | adoption. Why buy your own handcuffs?
        
           | cdrini wrote:
            | The incentive is to prove that the video is not AI
            | generated. Useful for consumers, news organisations, camera
            | manufacturers, etc. The idea would be that you can still
            | copy the file or change the video, but the signature will
            | no longer be valid. It's not meant to be restrictive like
            | handcuffs/DRM.
        
       | sylware wrote:
        | That will be funny once those neural nets can be inferenced on
        | a normal person's beefy workstation (or a few of them). Then
        | the "ethical" safeguards go down the drain.
        | 
        | It is already easy to run text troll AIs on normal
        | workstations... so...
        
       ___________________________________________________________________
       (page generated 2024-08-10 23:00 UTC)