[HN Gopher] How much do I need to change my face to avoid facial...
       ___________________________________________________________________
        
       How much do I need to change my face to avoid facial recognition?
        
       Author : pseudolus
       Score  : 84 points
       Date   : 2024-12-08 14:38 UTC (8 hours ago)
        
 (HTM) web link (gizmodo.com)
 (TXT) w3m dump (gizmodo.com)
        
       | mdorazio wrote:
       | I wonder if adding stickers, tattoos, or makeup that look like
       | eyes above or below your real eyes would do it.
        
         | derefr wrote:
          | There's even a make-up trend of "enlarging" the eyes by
          | painting the waterline of the lower eyelid white, which could
          | be used as a justification for walking around like this even
          | in a totalitarian police state.
        
           | dylan604 wrote:
            | In the current state of policing, this would just invite
            | "probable cause" or "fits the description" type stops. Sure,
            | you might not be identifiable by facial rec, but you'd be
            | recognizable by every flatfoot out there, or even the see
            | something say something crowd.
           | 
           | Might as well just wear a face mask and sunglasses. If your
           | FaceID can't recognize you, neither can the other systems.
        
             | buran77 wrote:
             | > If your FaceID can't recognize you, neither can the other
             | systems.
             | 
             | FaceID can't recognize me if I tilt my head a bit too much
             | to one side.
        
         | bsenftner wrote:
          | That is, for now, 100% effective. I'm a former lead software
          | scientist for one of the leading FR companies in the world.
          | Pretty much all FR systems trying to operate in real time use
          | a tiered approach to facial recognition. First, detect generic
          | faces in an image; this picks up various things that are not
          | human faces, but it does catch every human face in the image.
          | That's tier 1 image / video frame analysis, and the list of
          | potential faces is passed on for further processing. This tier
          | 1 analysis is the weakest part: if you can make your face fail
          | the generic face test, it is as if you are invisible to the FR
          | system. The easiest way to fail that generic face test is to
          | not show your face, or to show a face that is "not human",
          | such as one with too many eyes, two noses, or a mouth above
          | your eyes in place of eyebrows. Sure, you'll stand out like a
          | freak to other humans, but to the FR system you'll be
          | invisible.
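
          A minimal sketch of the tiered pipeline described above,
          assuming Python with OpenCV's stock Haar cascade as the generic
          tier-1 detector; the model, thresholds, and function names are
          illustrative, not the commenter's actual system:

              # Tier 1: a cheap, generic "is there a face here at all?"
              # pass. Anything that fails this test never reaches the
              # expensive tier-2 recognizer.
              import cv2

              detector = cv2.CascadeClassifier(
                  cv2.data.haarcascades
                  + "haarcascade_frontalface_default.xml")

              def tier1_candidates(frame):
                  gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                  # Bounding boxes for everything that passes the generic
                  # face test (including some non-faces).
                  return detector.detectMultiScale(
                      gray, scaleFactor=1.1, minNeighbors=5)

              def process(frame, recognize):
                  # Only crops that pass tier 1 are handed to the heavier
                  # tier-2 recognition step (`recognize` stands in for
                  # whatever matcher a real system would use).
                  for (x, y, w, h) in tier1_candidates(frame):
                      recognize(frame[y:y + h, x:x + w])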
        
         | moffkalast wrote:
         | Juggalo makeup is supposedly extremely effective.
         | 
          | Just make sure you don't know how magnets work, for plausible
          | deniability.
        
           | thefaux wrote:
           | I don't even have to pretend!
        
         | dathinab wrote:
          | Wrt. cameras with depth sensors, like face unlock, this isn't
          | super likely to work.
          | 
          | Wrt. public cameras, which don't have such features, are much
          | further away, and aren't super high resolution either, it
          | could maybe even somewhat work.
        
       | iterateoften wrote:
        | I had a similar thought the last time I was in an airport for
        | an international flight: instead of scanning my boarding pass
        | and looking at my passport, they just let everyone walk through,
        | and as you passed the door it would tell you your seat number.
        | 
        | When I was in Mexico I filed a report with the airport after an
        | employee selling timeshares was overly aggressive, grabbed my
        | arm, and tried to block me from leaving. Quickly they showed me
        | a video of my entire time with all my movements at the airport
        | so they could pinpoint the employee.
       | 
        | Like the article says, I think it is just a matter of time until
        | such systems are everywhere. We are already getting normalized
        | to it at public transportation hubs with almost zero objections.
        | Soon most municipalities or even private businesses will
        | implement it and no one will care, because it already happens to
        | them at the airport, so why make a fuss about it at the grocery
        | store or on a public sidewalk?
        
         | Zigurd wrote:
         | This reminds me of the early days of applying speech
         | recognition. Some use cases were surprisingly good, like non-
         | pretrained company directory name recognition. Shockingly good
         | _and_ it fails soft because there are a small number of
         | possible alternative matches.
         | 
         | Other cases, like games where the user's voice changes due to
         | excitement/stress, were incredibly bad.
        
         | dylan604 wrote:
         | > Quickly they showed me a video of my entire time with all my
         | movements at the airport so they could pinpoint the employee.
         | 
          | This is just as interesting as it is creepy, but that's the
          | world we live in and this is Hacker News. So, how quickly was
          | quickly? You made your report, they got the proper people
          | involved, and then they showed you the video. How much time
          | passed before you were viewing the video?
         | 
         | For someone that plays with quickly assembling an edited video
         | from a library of video content using a database full of
         | cuepoints, this is a very interesting problem to solve. What
         | did the final video look like? Was it an assembled video with
         | cuts like in a spy movie with the best angles selected in
         | sequence? Was it each of the cameras in a multi-cam like view
         | just starting from the time they ID'd the flight you arrived
         | on? Did they draw the boxes around you to show the system
         | "knew" you?
         | 
          | I'm really curious how dystopian we actually are with facial
          | recognition systems like this.
        
           | eschneider wrote:
            | Those sorts of systems run in realtime. They neither know
            | (nor care) who you are. They work by identifying people and
            | pulling out appearance characteristics (like blue coat/red
            | hair/beard/etc) and hashing them in a database. After that,
            | it's straightforward to track similar-looking people via
            | connected cameras, with a bit of human assistance.
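
            A toy sketch of the attribute-hashing idea above: each
            sighting is reduced to coarse appearance attributes, hashed,
            and stored so later sightings with the same description can
            be pulled up quickly. The attribute names and hashing scheme
            are made up for illustration, not taken from any real
            product.

                import hashlib
                from collections import defaultdict

                def appearance_key(attrs):
                    # attrs e.g. {"coat": "blue", "hair": "red",
                    #             "beard": True}
                    canonical = "|".join(
                        f"{k}={attrs[k]}" for k in sorted(attrs))
                    return hashlib.sha1(canonical.encode()).hexdigest()

                # key -> [(camera_id, timestamp), ...]
                sightings = defaultdict(list)

                def record(camera_id, timestamp, attrs):
                    sightings[appearance_key(attrs)].append(
                        (camera_id, timestamp))

                def similar_sightings(attrs):
                    # Every camera/time where someone matching the same
                    # coarse description appeared; a human still has to
                    # confirm it is the same person.
                    return sightings[appearance_key(attrs)]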
        
           | Animats wrote:
           | Here's a marketing video for a multi-camera tracking system
           | which does just that.[1]
           | 
           | [1] https://www.youtube.com/watch?v=07inDETl3LQ
        
         | CalRobert wrote:
          | Making it opt-out instead of opt-in means that the vast
          | majority of people won't care, or will have better things to
          | do.
         | 
         | You don't have to have your photo taken to enter the US if
         | you're a citizen, but who wants to deal with the hassle? And on
         | and on it goes.
        
           | onetokeoverthe wrote:
           | wrong. photo taken at sfo inbound customs.
           | 
           | go ahead and decline while the cop is holding your passport.
        
             | dessimus wrote:
             | > holding your passport.
             | 
             | When my spouse and I crossed through US customs this past
             | spring, they called us by our names and waved us on before
             | even getting our passports out to hand to the customs
             | officer. This was at BWI, fwiw.
        
               | jamiek88 wrote:
               | Customs or immigration?
        
               | dessimus wrote:
               | CBP. We are citizens, and were returning from a trip.
        
             | jamiek88 wrote:
             | Customs or immigration?
        
               | CalRobert wrote:
               | For whatever reason most Americans use the word "customs"
               | when they are, in fact, referring to immigration, when
               | traveling internationally.
        
               | Cyph0n wrote:
               | Because entry is handled by CBP - Customs and Border
               | Protection.
               | 
               | Immigration - which is the process of becoming a US
               | permanent resident and/or citizen - is handled (mostly)
               | by USCIS.
               | 
               | Other visas are handled by the State Department (foreign
               | ministry).
               | 
               | Not an expert, but this is my understanding.
        
             | CalRobert wrote:
             | I fly back to the US pretty often (I am a US citizen living
             | abroad) and have declined every time. This is in SFO. They
             | are generally fine with it. But most people won't risk it.
             | 
             | It's much, much more annoying in Ireland, where US
             | immigration happens in Dublin (an affront to Irish
             | sovereignty, but that's another matter) - so being delayed
             | can mean missing your flight.
        
               | onetokeoverthe wrote:
               | some airports laid back. others like sfo must have an
               | ongoing bust quota contest.
        
               | kortilla wrote:
              | > (an affront to Irish sovereignty, but that's another
              | matter)
               | 
               | I'll bite. Why do you think it's an affront to their
               | sovereignty? It's entirely voluntary and it's something
               | the Dublin airport (and the dozens of other airports in
               | Canada) actively seek out to get direct access to the
               | domestic side in the US.
               | 
               | The US does not force any airports into these
               | arrangements.
        
         | onetokeoverthe wrote:
         | a bit after 911 i figured the airport dystopia would eventually
         | ooze out. after soaking deep within the nextgen.
         | 
         | rub my jeans sailor. no 3d xrays for me.
        
         | sema4hacker wrote:
         | Twenty (!) years ago I got home from a drug store shopping trip
         | and realized I had been charged for some expensive items I
         | didn't buy. I called, they immediately found me on their
         | surveillance recording, saw the items were actually bought by
         | the previous person in line, and quickly refunded me. No face
         | recognition was involved (they just used the timestamp from my
         | receipt), but the experience immediately made me a fan of video
         | monitoring.
        
           | WalterBright wrote:
           | I was talking with an employee at a grocery store, who told
           | me that management one day decided to review the surveillance
           | footage, and fired a bunch of employees who were caught
           | pilfering.
        
             | kQq9oHeAz6wLLS wrote:
             | I had a friend who was a checker at a large local chain,
             | and before shift one day he popped into the security office
             | (he was friends with the head of security) to say hi, and
             | they had every camera in the front of the store trained on
             | the employee working the customer service desk.
             | 
             | Someone got fired that day.
        
           | maccard wrote:
            | I worked in a retail/PC repair place about 10 years ago. The
            | boss phoned me one day to say that customer X's device was
            | missing, had I seen it? I immediately knew it had been
            | stolen, and by whom. I had been on my own in the shop, 10
            | minutes before closing, and busy for the previous hour, so
            | the device was in the front of the shop instead of stored
            | away securely like it normally would be. I was able to find
            | the video within about 30 seconds of getting in and pinpoint
            | the guy. I actually recognised him and was able to tell the
            | police where I saw him somewhat frequently (as I lived
            | nearby too).
            | 
            | Without it, I think all the fingers would have pointed at
            | me, rather than it being put down to me being tired and
            | making a mistake.
        
         | 1659447091 wrote:
         | > and no one will care because it already happens to them at
         | the airport, so why make a fuss about it at the grocery store
         | or on a public sidewalk.
         | 
          | You may be overestimating how many unique/different people
          | travel through airports, especially more than once or twice,
          | often enough to notice the tracking. People who travel by air
          | once or twice total in their life (they are usually easy to
          | spot) are far more concerned with getting through a confusing,
          | hectic situation than with noticing, or even knowing, that the
          | use of facial recognition is new and not simply a special
          | thing (because 9/11). And the majority of Americans have
          | travelled to zero or one other country, last time I saw
          | numbers on it. That country is usually Mexico or Canada, where
          | they drive (or walk).
          | 
          | I think once it starts hitting close to home, where people
          | have a routine, are not as stressed by a new situation, and
          | have the bandwidth to--at a minimum--take a pause, they will
          | ask questions about what is going on.
        
         | dathinab wrote:
          | The thing with your example is that there is a "time and
          | location bound context" due to which the false positive rate
          | can be _massively_ reduced.
          | 
          | But for a nationwide public search the false positive rate is
          | just way too high for it to work well.
          | 
          | Once someone manages to leave a "local/time" context (e.g. a
          | known accident at a known location and time) without leaving
          | too many traces (easy in the US due to the wide use of private
          | cars), the false positive rate often makes such systems
          | practically hardly helpful.
        
         | gleenn wrote:
          | It seriously pisses me off that they make the font so small on
          | the opt-out signage and you get told by a uniform to stare at
          | the camera like you have no choice. Everything you don't fight
          | for ends up getting taken.
        
           | foxglacier wrote:
           | I tend to just stop and read the fine print for things that
           | might matter or if I have the time, even if I'm holding up a
           | queue. I've spent several minutes at the entrance gate to a
           | parking building because of the giant poster of T&Cs. I ask
           | librarians to find books for me because the catalogue
           | computer has a multi-screen T&C that I can't be bothered
            | reading. I've turned away a customer from my business because
           | their purchasing conditions included an onerous
           | indemnification clause which they refused to alter. I
           | discovered you don't need ID to travel on local flights
           | because the T&C led me to calling the airline who gave me a
           | password to use instead. I've also found several mistakes in
           | T&Cs that nobody probably notices because nobody reads them.
        
         | jillyboel wrote:
         | Thank you for giving us this dystopian future, AI bros
        
       | Zigurd wrote:
        | The article correctly points out that the amount of information
        | available in a controlled environment makes it not even the same
        | problem. If I have data on your irises and blood vessels and
        | cranium shape, good luck evading a match if I get you where I
        | can get the same measurements. On the street there are some
        | hacks, like measuring gait, that can compensate for less face
        | data, but evading a useful match that's not one of a zillion
        | false positives is much easier.
        
       | derefr wrote:
       | If what you're trying to do is to _publish prepared images of
        | yourself_, that won't be facially recognized as you, then the
       | answer is "not very much at all actually" -- see
       | https://sandlab.cs.uchicago.edu/fawkes/. Adversarially prepared
       | images can still look entirely like you, with all the facial-
       | recognition-busting data being encoded at an almost-
       | steganographic level vs our regular human perception.
        
         | 1659447091 wrote:
         | Do you know if this is still being worked on? The last "News"
         | post from the link was 2022. Looks interesting.
        
       | nonrandomstring wrote:
        | The thing about biometrics, as discussed in more intelligent
        | circles, is "compromised once, compromised for all time". It's a
        | public key or username, not a password.
       | 
       | Fortunately that's not true of governments. Although your
       | government may be presently compromised it is possible, via
       | democratic processes, to get it changed back to uncompromised.
       | 
       | Therefore we might say, it's easier to change your government
       | than it is to change your face. That's where you should do the
       | work.
        
         | dathinab wrote:
          | biometrics are also way less unique than people think
          | 
          | basically the moment you apply them to a huge population (e.g.
          | all of the US) and ignore temporal and/or local context you
          | will find collisions
          | 
          | especially when you consider partial samples, whether that is
          | due to errors in the sensors used or other reasons
          | 
          | Innocent people have gone to prison because of courts ignoring
          | reality (biometric matches are always just a likelihood of a
          | match, never a guaranteed match).
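
          A rough sketch of that collision intuition, using the standard
          birthday-problem approximation; the "effective templates"
          figure is a made-up illustration of limited resolving power,
          not a property of any real biometric:

              import math

              effective_templates = 10_000_000   # hypothetical
              population = 300_000_000           # roughly the US

              # P(no two people share a template), birthday approximation
              p_no_collision = math.exp(
                  -population * (population - 1)
                  / (2 * effective_templates))
              print(1 - p_no_collision)          # effectively 1.0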
        
       | hammock wrote:
       | First order approximation is 10 years' worth of aging, or 5
       | years' worth for a child under 16. These are the timelines in
       | which you must renew your American passport photo.
       | 
        | Apple Face ID is always learning as well. If your brother opens
        | your phone enough times with your passcode, it will eventually
        | merge the two faces it recognizes.
        
         | sanj wrote:
         | citation?
        
           | hammock wrote:
           | First hand experience. Try it yourself?
        
             | IncreasePosts wrote:
              | Google Photos only has pictures of my mom from her 60s
              | onwards, but when I put in a sepia-toned scan of my mom as
              | a 9-year-old, Google Photos asked me "is this [your mom]?"
        
       | satvikpendem wrote:
       | Their conclusion reminds me of this lady in China, Lao Rongzhi,
       | who was a serial killer along with her lover, Fa Ziying [0]. They
       | both went around the country extorting and killing people, and,
       | while Fa was arrested in 1999 via a police standoff, Lao was on
       | the run for two decades, having had plastic surgery to change her
       | face enough that most humans wouldn't have recognized her.
       | 
        | But in those two decades, the state of facial recognition
        | software had advanced rapidly, and she was recognized by a
        | camera at a mall and matched to a national database of known
        | criminals. At first police thought it was an error, but after
        | taking DNA evidence, it was confirmed to be the same person, and
        | she was summarily executed.
       | 
       | In this day and age, I don't think anyone can truly hide from
       | facial recognition.
       | 
       | [0] https://www.youtube.com/watch?v=I7D3mOHsVhg
        
         | joe_the_user wrote:
          | Hmm, "cameras reported a 97.3% match". I would assume that for
          | a random person, the match level would be random. 1/(1 - .973)
          | ~ 37, i.e. 1 in 37 people would be tagged by the cameras. If
          | you're talking China, that means matching millions of people
          | in millions of malls.
          | 
          | Possibly the actual match level was higher. But still, the way
          | facial recognition seems to work even now is that it provides
          | a consistent "hash value" for a face, but with a limited
          | number of digits/information. This can be useful if you know
          | other things about the person (i.e., if you know someone is a
          | passenger on plane X, you can very likely guess which one) but
          | still wouldn't scale unless you want a lot of false positives
          | and are after specific people.
          | 
          | Authorities seem to like to say DNA and facial recognition
          | caught people since it implies an omniscience to these
          | authorities (I note above someone throwing out the either
          | wrong or meaningless "97.3%" value). Certainly, these
          | technologies do catch people, but they are still limited and
          | expensive.
        
           | Epa095 wrote:
           | > I would assume that for a random person, the match level
           | would be random. 1/(1 -.973) ~ 37.
           | 
           | Why would you assume that?
        
           | ImprobableTruth wrote:
           | The "97.3%" match is probably just the confidence value - I
           | don't think a frequentist interpretation makes sense for
           | this. I'm not an expert in face recognition, but these
           | systems are very accurate, typically like >99.5% accuracy
           | with most of the errors coming from recall rather than
           | precision. They're also not _that_ expensive. Real-time
           | detection on embedded devices has been possible for around a
           | decade and costs for high quality detection have come down a
           | lot in recent years.
           | 
            | Still, you're right that at those scales these systems will
            | invariably slip once in a while, and it's scary to think
            | that this might be enough to get you considered a criminal,
            | especially because people often treat these systems as
            | infallible.
        
         | wslh wrote:
         | This could help with the discussion: "Human face identification
         | after plastic surgery using SURF, Multi-KNN and BPNN
         | techniques"
         | <https://link.springer.com/article/10.1007/s40747-024-01358-7>
        
       | hyperific wrote:
       | CV Dazzle (2010) attempted this to counter the facial recognition
       | methods in use at that time.
       | 
       | https://adam.harvey.studio/cvdazzle
        
         | probably_wrong wrote:
         | D-ID (YC S17, [1]) promised that they would do the same. They
          | have been quite silent on whether they ever achieved their
          | target, and nowadays they've pivoted to AI, so no idea
          | whether they actually succeeded.
         | 
         | https://news.ycombinator.com/item?id=14849555
        
       | its_bbq wrote:
       | Why is makeup considered cheating but surgery not?
        
         | jquave wrote:
         | wrong app bro
        
         | jl6 wrote:
         | Maybe wearing enough makeup to hide your face would fool an
         | algorithm, but be conspicuous enough to get you noticed anyway.
        
       | throe844i wrote:
       | I welcome such tracking and surveillance.
       | 
       | It is too easy to get accused of something. And you have no
       | evidence to defend yourself. If you keep video recording of your
       | surroundings forever, you now have evidence. AI will make
       | searching such records practical.
       | 
        | There were all sorts of safeguards to make such recordings
        | unnecessary, such as due process. But those were practically
        | eliminated. And people no longer have basic decency!
        
         | dingnuts wrote:
         | who cares if you're tracked because you have nothing to hide,
         | right?
         | 
         | now imagine you're the wrong religion after the regime change.
         | 
         | "I have nothing to hide" is a stupid argument that leads to
         | concentration camps
        
           | simplicio wrote:
           | Seems like the Nazis managed to do the Concentration Camps
           | thing without facial recognition software.
        
             | pavel_lishin wrote:
             | But they did have tremendous data processing abilities for
             | their time!
        
               | simplicio wrote:
               | I don't think keeping the data processing abilities of
               | modern gov'ts below that of 1930's Germany is really a
               | plausible plan for avoiding concentration camps.
        
             | pessimizer wrote:
             | > As the Nazi war machine occupied successive nations of
             | Europe, capitulation was followed by a census of the
             | population of each subjugated nation, with an eye to the
             | identification and isolation of Jews and Romani. These
             | census operations were intimately intertwined with
             | technology and cards supplied by IBM's German and new
             | Polish subsidiaries, which were awarded specific sales
             | territories in Poland by decision of the New York office
             | following Germany's successful Blitzkrieg invasion. Data
             | generated by means of counting and alphabetization
             | equipment supplied by IBM through its German and other
             | national subsidiaries was instrumental in the efforts of
             | the German government to concentrate and ultimately destroy
             | ethnic Jewish populations across Europe. Black reports that
             | every Nazi concentration camp maintained its own Hollerith-
             | Abteilung (Hollerith Department), assigned with keeping
             | tabs on inmates through use of IBM's punchcard technology.
             | In his book, Black charges that "without IBM's machinery,
             | continuing upkeep and service, as well as the supply of
             | punch cards, whether located on-site or off-site, Hitler's
             | camps could have never managed the numbers they did."
             | 
             | https://en.wikipedia.org/wiki/IBM_and_the_Holocaust
             | 
             | They would have done a lot better faster with facial
             | recognition software, and certainly wouldn't have turned it
             | down.
        
               | mixmastamyk wrote:
               | But surely it couldn't happen in America, right? Guess
               | what, census data _was_ used to facilitate Japanese
               | internment.
        
             | whycome wrote:
             | America and Canada used facial recognition for their ww2
             | concentration camps.
             | 
             | https://en.m.wikipedia.org/wiki/Internment_of_Japanese_Cana
             | d...
        
           | throe844i wrote:
            | Data means power and freedom. With access to data you can
            | defend yourself from legal persecution! In the past people
            | were lynched and killed over false accusations! With
            | evidence they would have a chance!
            | 
            | A hostile regime will kill you anyway. But it is a long way
            | there. And a "soft hostile" one may throw you into prison
            | for 30 years, or take your house and family. Or will not
            | enforce punishment on crooks. All fully legally, in a
            | "proper democracy".
            | 
            | And "wrong religion" and "leads to concentration camps"
            | really is a stupid argument, given what has been happening
            | in the last year. People today are just fine with
            | concentration camps and genocide! It is just an absurd
            | argument used to defend a corrupted status quo!
            | 
            | If you have a "wrong religion", change it! People did that
            | all the time.
        
             | pavel_lishin wrote:
             | > _With access to data_
             | 
             | That's the key problem. Why do you assume you'll have
             | access to this data?
        
         | pavel_lishin wrote:
         | > _It is too easy to get accused of something. And you have no
         | evidence to defend yourself. If you keep video recording of
         | your surroundings forever, you now have evidence._
         | 
          | This assumes that you have _access_ to those recordings. If
          | you're live-logging your life via something you're wearing
          | all day every day, maybe - but if the government decides to
          | prosecute
         | you for something, what are the odds that you'll be able to
         | pull exonerating evidence out of the very system that's trying
         | to fuck you?
         | 
          | Even if a system doesn't _care_, it's still a hassle. Case in
         | point:
         | https://www.independent.co.uk/news/world/americas/michigan-s...
         | 
          | > _An African American man who was wrongly convicted of a
          | fatal shooting in Michigan in 2011 is suing a car rental
          | company for taking seven years to turn over the receipt that
          | proved his innocence, claiming that they treated him like "a
          | poor black guy wasn't worth their time"._
         | 
         | I found this article while looking for another story that's
         | virtually identical; I believe in that one it was a gas station
         | receipt that was the key in his case, and he ended up spending
         | very minimal time in jail.
         | 
         | How many people are in jail now because they weren't able to
         | pull this data?
        
         | trgn wrote:
         | i recently tried one of those cashierless amazon stores. it was
          | an odd jolt, this feeling of being trusted, by default. It was
          | vaguely reminiscent of one in my childhood, when, after my
         | parents had sent me on an errand to the local grocer, I'd
         | forgotten the money and the clerk/owner let me just walk out
         | since they knew me. Presumably they and my mom would take care
         | of the balance later.
         | 
         | I live now in a city where small exchanges are based on a
          | default of mistrust (e.g. locking up the tide-pods behind a
          | glass case - it's not a meme). The only supermarket near (not
         | even _in_) my food desert started random bag checks.
         | 
         | The modern police state requires surveillance technology, but
         | abusive authority has flourished in any technological
          | environment. the mafia had no problem terrorizing entire
          | neighborhoods into omerta, for example, without high
          | technology.
         | i'm sure there's other examples.
         | 
         | i don't know the right answer, but considering the extent to
         | which anti-social and criminal attitudes are seemingly allowed
         | to fester, while everybody else is expected to relinquish their
          | dignity, essentially _anonymize_ themselves, it makes me have
          | less and less of a kneejerk response to the expansion of
          | technologically supported individualization.
        
       | shae wrote:
        | What about infrared LEDs on my face?
        
         | gehwartzen wrote:
         | https://www.reddit.com/r/techsupportmacgyver/comments/mej7j7...
        
       | deadbabe wrote:
       | It's trivial to also implement gait analysis to help visually
       | identify someone if a face isn't available. Then when you do get
       | a glimpse of the face you can link the gait and the face
       | identity.
        
       | SoftTalker wrote:
       | I was traveling internationally earlier this year and I have
       | grown a heavy beard since my passport photo was taken. None of
       | the automated immigration checkpoints had any trouble identifying
       | me.
        
         | mixmastamyk wrote:
         | Believe they focus on the eyes/nose shape and spacing.
        
           | dathinab wrote:
            | yes, they mainly focus on bone structure, especially around
            | the eye/nose area
            | 
            | beards are too easy to change, and masks have been very
            | common for some time and cover more than beards do
        
       | darepublic wrote:
        | Need EMP charges like in Metal Gear. A bunch of metallic
        | confetti fills the air while you dash past the security cameras
        | big and small.
        
       | Scotrix wrote:
       | "Asking our governments to create laws to protect us is much
       | easier than..."
       | 
        | A bit naive, that; it's too late, since the data is already
        | mostly available and it just takes a different government to
        | make this protection obsolete.
        | 
        | That's why we Germans/Europeans have fought against data
        | collection and for protections for so long and quite hard (and
        | probably have some of the most sophisticated policies and
        | regulations in place), but over time it just becomes impossible
        | to keep data collection as low as possible (first small
        | exceptions for reasons that are valid in themselves, then more
        | and more participants and normalization until there is no
        | protection left...)
        
         | wizzwizz4 wrote:
          | It's not too late. Maybe it is _for us_: but in 100 years, who
         | will really care about a database of uncomfortably-personal
         | details about their dead ancestors? (Sure, DNA will still be an
         | issue, but give that 1000 years and we'll probably have a new
         | MRCA.) If we put a stop to things _now_ (or soon), we can nip
         | this in the bud.
         | 
         | It's probably not too late for us, either. Facial recognition
         | by skull shape is still a concern, but only if the bad guys get
         | _up-to-date_ video of us. Otherwise, all they can do is
         | investigate our _historical_ activity. Other types of data have
         | greater caveats preventing them from being useful long-term,
          | provided we don't participate in the illusion that it's
         | "impossible to put the genie back in the bottle".
        
       | imron wrote:
       | > If you wore sunglasses and then did something to your face
       | (maybe wear a mask or crazy dramatic makeup) then it would be
       | harder to detect your face, but that's cheating on the question--
       | that's not changing your face, that's just hiding it!
       | 
       | So sunglasses and a mask then. Who cares if it's 'cheating'.
        
       | dathinab wrote:
       | What often is fully ignored in such articles is the false
       | positive rate.
       | 
        | Like e.g. where I live they tested some state-of-the-art facial
        | recognition system at a widely used train station and applauded
        | themselves for how great it was, given that the test targets
        | were recognized even when they wore masks and capes, hats, etc.
        | 
        | But what was not told was that the false positive rate, while
        | percentage-wise small (I think <1%), combined with the number of
        | expected non-match samples, still made it hardly usable.
        | 
        | E.g. one of the train stations where I live has ~250,000 people
        | passing through it every day; even a false positive rate of just
        | 0.1% would be 250 wrong alarms for that single train station,
        | every single day. If you scale your search to a wider area you
        | now have way higher numbers (and let's not just look at
        | population size, but also at the fact that many people might be
        | falsely recognized many times during a single trip).
        | 
        | AFAIK the claimed false positive rate is often in the range of
        | 0.01%-0.1%, BUT when these systems are independently tested in
        | real-world contexts the observed false positive rate is often
        | more like 1%-10%.
       | 
       | So what does that mean?
       | 
        | It means that if you have a fixed set of video to check (e.g.
        | close to where an accident happened, within +- idk. 2h of the
        | incident) you can use such systems to pre-filter video and then
        | post-process the results over many hours.
        | 
        | But if you try to find a person in a nation of >300 million who
        | doesn't want to be found, and you missed the initial time frame
        | where you can rely on them being close to a known location,
        | then you will be flooded by such an amount of false positives
        | that it becomes practically not very useful.
       | 
       | I mean you still can have a lucky hit.
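
        The back-of-the-envelope arithmetic above, as a tiny calculation;
        the passenger count and rates are the commenter's illustrative
        figures, not measurements of any particular system:

            daily_passengers = 250_000   # one busy train station
            claimed_fpr = 0.001          # 0.1%, optimistic claimed rate

            # 250 false alarms per day, for a single station
            print(daily_passengers * claimed_fpr)

            # At the 1%-10% rates reportedly seen in independent
            # real-world tests, that becomes thousands per day.
            for fpr in (0.01, 0.10):
                print(f"FPR {fpr:.0%}: "
                      f"{daily_passengers * fpr:,.0f} alarms/day")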
        
         | Eisenstein wrote:
         | What does 'false positive' mean? That it thinks it is someone
         | else, or that it thinks it is a target of an investigation?
        
           | TuringNYC wrote:
            | The rate at which the actual is negative but the inference
            | is positive.
           | 
           | This is a very handy guide:
           | https://en.wikipedia.org/wiki/Confusion_matrix
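
            A minimal worked example of those terms, with made-up counts
            in the spirit of the train-station numbers upthread
            ("positive" meaning the system flags someone as the target):

                tp = 8        # targets correctly flagged
                fn = 2        # targets missed
                fp = 250      # non-targets wrongly flagged
                tn = 249_740  # non-targets correctly ignored

                false_positive_rate = fp / (fp + tn)  # ~0.1%
                recall = tp / (tp + fn)               # 80% of targets
                precision = tp / (tp + fp)            # ~3%: most alerts
                                                      # are wrong

                print(false_positive_rate, recall, precision)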
        
       | _DeadFred_ wrote:
       | This has been answered since the 80s. This much:
       | 
       | https://i.imgur.com/7cuDqPI.jpg
        
       | _heimdall wrote:
        | I'm of two minds when it comes to surveillance. I don't like
        | that businesses, airports, etc. do it, but it is their property.
        | I don't like that they can run video feeds through software,
        | either in real time or after the fact, to so easily find and
        | track my every move. But again, it's their property.
       | 
       | Where the line is always drawn for me, at a minimum, is what they
       | do with the video and who has access to it.
       | 
       | Video should always be deleted when it is no longer reasonably
       | needed. That timeline would be different for airports vs
       | convenience stores, but I'd always expect the scale of days or
       | weeks rather than months or years (or indefinitely).
       | 
       | Maybe more importantly, surveillance video should never be shared
       | without a lawful warrant, including clear descriptions of the
       | limits to what is needed and why it is requested.
        
       | gehwartzen wrote:
       | Kidding. (But maybe not?...)
       | 
       | https://en.m.wikipedia.org/wiki/Groucho_glasses
        
       | costco wrote:
       | The face ID feature on Bryan Johnson's phone no longer recognized
       | him after many months of his intense health regimen:
       | https://twitter.com/bryan_johnson/status/1777789375193231778
        
       | sandbach wrote:
       | At Tianfu Airport in Chengdu, there are large screens with
       | cameras attached that recognize your face and tell you which gate
       | to go to. Convenient but scary, like many things in China.
        
       | aprilthird2021 wrote:
       | It feels increasingly like the only way to avoid such facial
       | recognition is to suddenly grow a religious conviction that your
       | face should not be seen by strangers
        
       ___________________________________________________________________
       (page generated 2024-12-08 23:00 UTC)