[HN Gopher] How much do I need to change my face to avoid facial...
___________________________________________________________________
How much do I need to change my face to avoid facial recognition?
Author : pseudolus
Score : 248 points
Date : 2024-12-08 14:38 UTC (1 day ago)
(HTM) web link (gizmodo.com)
(TXT) w3m dump (gizmodo.com)
| mdorazio wrote:
| I wonder if adding stickers, tattoos, or makeup that look like
| eyes above or below your real eyes would do it.
| derefr wrote:
| There's even a make-up trend of "enlarging" the eyes by
| painting the waterline of the lower eyelid white, that could be
| used as a justification for walking around like this even in a
| totalitarian police state.
| dylan604 wrote:
| In the current state of policing, this would just be probable
| cause or a "fits the description" type of thing. Sure, you
| might not be identifiable by facial rec, but you'd be
| recognizable to every flatfoot out there, or even the see-
| something-say-something crowd.
|
| Might as well just wear a face mask and sunglasses. If your
| FaceID can't recognize you, neither can the other systems.
| buran77 wrote:
| > If your FaceID can't recognize you, neither can the other
| systems.
|
| FaceID can't recognize me if I tilt my head a bit too much
| to one side.
| bsenftner wrote:
| That is, for now, 100% effective. I'm a former lead software
| scientist for one of the leading FR companies in the world.
| Pretty much all FR systems trying to operate in real time use a
| tiered approach to facial recognition. First, detect generic
| faces in an image; this picks up various things that are not
| human faces, but it does catch every human face in the image.
| That's tier 1 image / video frame analysis, and the list of
| potential faces is passed on for further processing. This tier
| 1 analysis is the weakest part: if you can make your face fail
| the generic face test, it is as if you are invisible to the FR
| system. The easiest way to fail that generic face test is to
| not show your face, or to show a face that is "not human", such
| as one with too many eyes, two noses, or a mouth above your
| eyes in place of any eyebrows. Sure, you'll stand out like a
| freak to other humans, but to the FR system you'll be
| invisible.
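|
| A minimal sketch of that tiered pipeline, in Python with OpenCV
| and NumPy. The bundled Haar cascade stands in for the tier 1
| "generic face" detector, and embed() is a hypothetical tier 2
| embedding model, not any particular vendor's system:
|
|     import cv2
|     import numpy as np
|
|     # Tier 1: generic face detection. Anything that fails this
|     # test never reaches recognition at all.
|     detector = cv2.CascadeClassifier(
|         cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
|
|     def tier1_candidate_faces(frame):
|         gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
|         # Returns (x, y, w, h) boxes; includes some non-faces,
|         # but is tuned to catch essentially every ordinary face.
|         return detector.detectMultiScale(gray, scaleFactor=1.1,
|                                          minNeighbors=5)
|
|     def embed(face_img):
|         # Hypothetical tier 2 model mapping a face crop to a
|         # feature vector; a real system uses a trained network.
|         raise NotImplementedError
|
|     def tier2_match(frame, boxes, watchlist, threshold=0.6):
|         """Compare candidates against a dict of name -> embedding."""
|         hits = []
|         for (x, y, w, h) in boxes:
|             vec = embed(frame[y:y + h, x:x + w])
|             for name, ref in watchlist.items():
|                 sim = float(np.dot(vec, ref) /
|                             (np.linalg.norm(vec) * np.linalg.norm(ref)))
|                 if sim > threshold:
|                     hits.append((name, sim, (x, y, w, h)))
|         return hits
|
| Fail the cascade in tier 1 and tier2_match never sees you at
| all, which is the "invisible to the FR system" effect described
| above.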
| moffkalast wrote:
| Juggalo makeup is supposedly extremely effective.
|
| Just make sure you don't know how magnets work, for plausible
| deniability.
| thefaux wrote:
| I don't even have to pretend!
| marc_abonce wrote:
| > Juggalo makeup is supposedly extremely effective.
|
| Yeah, it's supposed to be better than black metal makeup and
| even better than that early-2010's anti-detection makeup:
|
| https://consequence.net/2019/07/juggalo-makeup-facial-
| recogn...
|
| https://adam.harvey.studio/cvdazzle
|
| Although that was around 2018-2019 so, given how quickly face
| recognition is evolving, I wonder if juggalo makeup works
| anymore. Besides, as mentioned by many of the interviewees in
| OP's article, there's a balance between changing or hiding
| your facial features and looking "suspicious" or "unnatural"
| which is of course context dependent. Concerts are safer to
| "cheat" than airports.
| dathinab wrote:
| Wrt. cameras with depth sensors, like face unlock, this isn't
| super likely to work.
|
| Wrt. public cameras, which don't have such features, are much
| further away, and aren't super high resolution either, it
| maybe could even somewhat work.
| iterateoften wrote:
| I had a similar thought last time I was in an airport for an
| international flight and instead of scanning my boarding pass and
| looking at my passport they just let everyone walk through and as
| you passed the door it would tell you your seat number.
|
| When I was in Mexico I filed a report with the airport after an
| employee selling timeshares was overly aggressive, grabbed my
| arm, and tried to block me from leaving. Quickly they showed me a
| video of my entire time with all my movements at the airport so
| they could pinpoint the employee.
|
| Like the article says I think it is just a matter of time until
| such systems are everywhere. We are already getting normalized to
| it at public transportation hubs with almost 0 objections. Soon
| most municipalities or even private businesses will implement it
| and no one will care because it already happens to them at the
| airport, so why make a fuss about it at the grocery store or on a
| public sidewalk.
| Zigurd wrote:
| This reminds me of the early days of applying speech
| recognition. Some use cases were surprisingly good, like non-
| pretrained company directory name recognition. Shockingly good
| _and_ it fails soft because there are a small number of
| possible alternative matches.
|
| Other cases, like games where the user's voice changes due to
| excitement/stress, were incredibly bad.
| dylan604 wrote:
| > Quickly they showed me a video of my entire time with all my
| movements at the airport so they could pinpoint the employee.
|
| This is just as interesting as it is creepy, but that's the
| world we live in and this is hacker news. So, how quickly was
| "quickly"? You made your report, they got the proper people
| involved, and then they showed you the video. How much time
| passed before you were viewing the video?
|
| For someone that plays with quickly assembling an edited video
| from a library of video content using a database full of
| cuepoints, this is a very interesting problem to solve. What
| did the final video look like? Was it an assembled video with
| cuts like in a spy movie with the best angles selected in
| sequence? Was it each of the cameras in a multi-cam like view
| just starting from the time they ID'd the flight you arrived
| on? Did they draw the boxes around you to show the system
| "knew" you?
|
| I'm really curious how dystopian we actually are with the
| facial recognition systems like this.
| eschneider wrote:
| Those sorts of systems run in realtime. They neither know nor
| care who you are. They work by identifying people and
| pulling out appearance characteristics (like blue coat/red
| hair/beard/etc) and hashing them in a database. After that,
| it's straightforward to track similar looking people via
| connected cameras, with a bit of human assistance.
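|
| A toy sketch of that idea in Python: each detection is reduced
| to coarse appearance attributes, hashed, and stored per camera
| so similar-looking people can be pulled up across feeds. The
| attribute names and schema here are illustrative, not any real
| product's:
|
|     import hashlib
|     from collections import defaultdict
|
|     def appearance_key(attrs):
|         """attrs: e.g. {"coat": "blue", "hair": "red"} as would
|         come from an attribute classifier (assumed, not shown)."""
|         canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
|         return hashlib.sha1(canonical.encode()).hexdigest()[:12]
|
|     # appearance hash -> list of (camera_id, timestamp) sightings
|     sightings = defaultdict(list)
|
|     def record(camera_id, timestamp, attrs):
|         sightings[appearance_key(attrs)].append((camera_id, timestamp))
|
|     def similar_sightings(attrs):
|         # Pull every sighting whose coarse description hashes alike.
|         return sightings.get(appearance_key(attrs), [])
|
|     # The same "blue coat, red hair" description seen on two
|     # connected cameras ends up under one key.
|     record("cam-3", 1700000000, {"coat": "blue", "hair": "red"})
|     record("cam-7", 1700000090, {"coat": "blue", "hair": "red"})
|     print(similar_sightings({"coat": "blue", "hair": "red"}))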
| Animats wrote:
| Here's a marketing video for a multi-camera tracking system
| which does just that.[1]
|
| [1] https://www.youtube.com/watch?v=07inDETl3LQ
| UltraSane wrote:
| This tech isn't new. My company uses Axis cameras and Axis
| has some pretty advanced video analytics software
| https://www.axis.com/en-us/products/analytics
|
| It records the license plates of all cars entering and
| leaving the parking lots. You can associate names to faces
| which we do for all employees and the system automatically
| records when people enter and leave buildings. You can even
| just tell it to find all people with a blue shirt in a
| particular camera in a time window. It can automatically
| detect people shouting.
| sho wrote:
| > I'm really curious how dystopian we actually are
|
| No idea how widespread it is, but in Singapore airport the
| system is tightly integrated. You are "tagged" when you check
| in, and "tagged out" as you board, with your appearance
| associated with your intended flight details. If you miss
| your flight or otherwise spend too much time in the secure
| zone, you are highlighted in the system and will eventually
| be approached. Arriving passengers are also given a time
| limit to take their next action, be it clear immigration or
| enter transit, and lingering will also trigger a response.
|
| All in the name of safety and security but I can't help but
| feel a measure of discomfort with it all.
| CalRobert wrote:
| Making it opt-out instead of opt-in means that the vast
| majority of people won't care, or have better things to do.
|
| You don't have to have your photo taken to enter the US if
| you're a citizen, but who wants to deal with the hassle? And on
| and on it goes.
| onetokeoverthe wrote:
| wrong. photo taken at sfo inbound customs.
|
| go ahead and decline while the cop is holding your passport.
| dessimus wrote:
| > holding your passport.
|
| When my spouse and I crossed through US customs this past
| spring, they called us by our names and waved us on before
| even getting our passports out to hand to the customs
| officer. This was at BWI, fwiw.
| jamiek88 wrote:
| Customs or immigration?
| dessimus wrote:
| CBP. We are citizens, and were returning from a trip.
| dawnerd wrote:
| They do that with kiosks and the app. It can be a bit
| hectic with global entry.
| jamiek88 wrote:
| Customs or immigration?
| CalRobert wrote:
| For whatever reason most Americans use the word "customs"
| when they are, in fact, referring to immigration, when
| traveling internationally.
| Cyph0n wrote:
| Because entry is handled by CBP - Customs and Border
| Protection.
|
| Immigration - which is the process of becoming a US
| permanent resident and/or citizen - is handled (mostly)
| by USCIS.
|
| Other visas are handled by the State Department (foreign
| ministry).
|
| Not an expert, but this is my understanding.
| CalRobert wrote:
| My understanding is that immigration gets you in the
| country, even as a tourist, and customs gets your stuff
| in.
| shiroiushi wrote:
| That's how other countries normally do things, but
| America is a little weird.
| Cyph0n wrote:
| I was simply trying to explain why the US refers to entry
| as "customs".
| ghaff wrote:
| Customs as it existed a few decades ago barely exists in
| many/most countries today except pro-forma. You used to
| routinely get your bag searched. Now, with very few
| exceptions, you just walk through the green door. Part of
| it (of course, this may change) is that there used to be
| a lot of financial incentive to buy items abroad and
| import them in your luggage.
|
| I have Global Entry but I don't think the US even has a
| customs form any longer.
| popcalc wrote:
| >a lot of financial incentive to buying items abroad and
| importing in your luggage
|
| >very few exceptions
|
| You've got it backwards. If you're an American you're
| probably traveling through freeports or low tax regimes
| like Singapore, UK, etc. and don't realise how regressive
| most regimes are. In places like Hungary, Angola, SEA --
| where tax can be in the range of 30-50% you will be lucky
| not to be shaken down by a customs agent before leaving
| the luggage carousel.
| ghaff wrote:
| As an American, I've traveled through a _lot_ of
| countries and don't have much experience over the past
| couple of decades with being shaken down by customs
| agents. But perhaps it would be different if I looked
| different and/or had several large pieces of luggage,
| which I don't travel with.
| CalRobert wrote:
| I fly back to the US pretty often (I am a US citizen living
| abroad) and have declined every time. This is in SFO. They
| are generally fine with it. But most people won't risk it.
|
| It's much, much more annoying in Ireland, where US
| immigration happens in Dublin (an affront to Irish
| sovereignty, but that's another matter) - so being delayed
| can mean missing your flight.
| onetokeoverthe wrote:
| some airports laid back. others like sfo must have an
| ongoing bust quota contest.
| kortilla wrote:
| > (an affront to Irish sovereignty, but that's another
| matter
|
| I'll bite. Why do you think it's an affront to their
| sovereignty? It's entirely voluntary and it's something
| the Dublin airport (and the dozens of other airports in
| Canada) actively seek out to get direct access to the
| domestic side in the US.
|
| The US does not force any airports into these
| arrangements.
| lmm wrote:
| The programme is there for the convenience of the US.
| Would they allow Ireland to operate a corresponding
| facility on US soil?
|
| (The popularity of that airport for CIA torture flights
| also doesn't help the case, even if not directly linked)
| rssoconnor wrote:
| > The programme is there for the convenience of the US.
| Would they allow Ireland to operate a corresponding
| facility on US soil?
|
| FWIW, I recall reading that the program in Canada is
| reciprocal, and it is simply the case that Canada hasn't
| decided to operate any corresponding facility in the US.
| goodcanadian wrote:
| That is correct. IIRC, Bermuda is also part of the
| agreement, and I would be very much surprised if Ireland
| doesn't operate on the same rules.
| majormajor wrote:
| It provides a good amount of convenience for US citizens,
| certainly.
|
| Let's talk about Toronto or Vancouver to set aside CIA
| whatever. What particular convenience does it provide for
| the US government to do it there vs on the US side?
| AFAICT that would save the airline that brought a person
| who got denied a bit of trouble - vs having to take them
| back to their departure airport - but not be a
| particularly huge convenience or burden for either
| government at a higher-up level.
| lmm wrote:
| > What particular convenience does it provide for the US
| government to do it there vs on the US side?
|
| It reduces legal accountability (I know the US courts
| have generally exempted border operations from the
| constitution anyway, but that interpretation could change
| in the future) and makes it easier to prevent people from
| e.g. landing and claiming asylum (yes there are measures
| to penalise airlines and oblige them to return
| passengers, but they're not always fully effective). More
| subtly it means there's less pressure to have reasonable
| border rules, since turning someone away before they
| board is lower-stakes. And having an official, pseudo-
| law-enforcement presence in a country is valuable almost
| in itself.
| goodcanadian wrote:
| I would argue higher legal accountability as they are
| subject to the host country's laws. If you are at a US
| airport, you are at the whim of US border officials. If
| you are at a Canadian airport, you have the right to turn
| around and leave.
| dotancohen wrote:
| CIA torture flights?
| lmm wrote:
| When the US government wants to torture people from
| another country, it gets around legal protections by
| having the CIA illegally fly them to a third country.
| Many of those flights went via Ireland. See e.g.
| https://www.independent.ie/irish-news/wikileaks-memo-
| tells-o...
| briandear wrote:
| Nothing to do with US immigration pre clearance.
| briandear wrote:
| The program is there for the convenience of Irish
| travelers. They can clear immigration and then when they
| arrive they are treated as domestic arrivals and save a
| lot of time.
| kalleboo wrote:
| The programme is there for the convenience of the
| airlines. If someone arrives in the US and is denied
| entry, the airline is on the hook to fly them back. It's
| much better for them for the traveler to be denied before
| even boarding.
| mattkrause wrote:
| More critically, it opens up a huge number of routes for
| the airlines because the US destination no longer needs to be
| an international airport with a CBP presence.
| CalRobert wrote:
| I think it's absurd to have US immigration policy
| enforcement on Irish soil (I suppose there's a diplomatic
| carve-out for whether the post-immigration area is "US
| soil" or whatever, but still).
|
| As said policies become increasingly inhumane I think
| Ireland should consider removing this arrangement. But
| you are right, Dublin Airport themselves do benefit since
| it makes them more attractive, especially as a transfer
| airport for people going to the US from Europe.
| louthy wrote:
| Is it "absurd"? If you're going to be rejected access to
| a country, wouldn't it be better before you get on the
| plane? Seems the opposite of absurd, it seems preferable.
| 6LLvveMx2koXfwn wrote:
| UK-based travellers travelling to Europe via either the
| Eurostar or Le Shuttle go through French immigration on
| UK soil before departing; this facilitates easy exit in
| France. Makes perfect sense to me, and as a UK national I
| don't see this as impinging on UK sovereignty.
| tsimionescu wrote:
| I can't even imagine a situation where this is not
| preferable. For example, if the US immigration check
| happens in Ireland, they can't detain you or mess with
| you in ways in which Ireland doesn't approve of, which
| they could if you were on US soil.
|
| If anything, it seems to me that the USA agreeing to
| perform immigration checks in Ireland and accept them
| when you reach the USA is a(n extremely mild) limitation
| to US sovereignty, not to Irish sovereignty.
| lupusreal wrote:
| When I took a US ferry to Canada, Canada border officials
| were on the boat so we could do all the paperwork before
| we arrived.
| onetokeoverthe wrote:
| a bit after 911 i figured the airport dystopia would eventually
| ooze out. after soaking deep within the nextgen.
|
| rub my jeans sailor. no 3d xrays for me.
| sema4hacker wrote:
| Twenty (!) years ago I got home from a drug store shopping trip
| and realized I had been charged for some expensive items I
| didn't buy. I called, they immediately found me on their
| surveillance recording, saw the items were actually bought by
| the previous person in line, and quickly refunded me. No face
| recognition was involved (they just used the timestamp from my
| receipt), but the experience immediately made me a fan of video
| monitoring.
| WalterBright wrote:
| I was talking with an employee at a grocery store, who told
| me that management one day decided to review the surveillance
| footage, and fired a bunch of employees who were caught
| pilfering.
| kQq9oHeAz6wLLS wrote:
| I had a friend who was a checker at a large local chain,
| and before shift one day he popped into the security office
| (he was friends with the head of security) to say hi, and
| they had every camera in the front of the store trained on
| the employee working the customer service desk.
|
| Someone got fired that day.
| HeyLaughingBoy wrote:
| Surprising how common it is. The first hardware I ever
| designed on the job was a device to detect employee theft.
| maccard wrote:
| I worked in a retail/pc repair place about 10 years ago. Boss
| phoned me one day to say X's (a customer's) device was missing,
| had I seen it? I immediately knew it had been stolen and who by.
| I was on my own in the shop, 10 minutes before closing and I
| had been busy for the previous hour so the device was in the
| front of the shop instead of stored away securely like they
| normally would be. I was able to find the video within about
| 30 seconds of getting in and pinpoint the guy. I actually
| recognised him and was able to tell the police where I saw
| him somewhat frequently (as I lived nearby too).
|
| Without it, I think all the fingers would have pointed at me
| rather than me being tired and making a mistake.
| notachatbot123 wrote:
| It's a different thing though. In your case they used a
| timestamp to manually look at footage and confirm an
| identity. In OP's case, automated recognition is used to
| identify and track people, in aggregate and en masse.
| interludead wrote:
| An added layer of complexity
| 1659447091 wrote:
| > and no one will care because it already happens to them at
| the airport, so why make a fuss about it at the grocery store
| or on a public sidewalk.
|
| You may be overestimating how many unique/different people
| travel through airports, especially more than once or twice,
| enough to notice the tracking. People who travel once or twice
| total in their life by air (they are usually easy to spot) are
| far more concerned with getting through a confusing, hectic
| situation than with noticing, or even knowing, that the use of
| facial recognition is new and not simply a special thing
| (because 9/11). And the majority of Americans have travelled
| to zero or one country, last time I saw numbers on it. That
| country is usually Mexico or Canada, where they drive (or walk).
|
| I think once it starts hitting close to home, where people
| have a routine, are not as stressed by a new situation, and
| have the bandwidth to--at a minimum--take a pause, they will
| ask questions about what is going on.
| highcountess wrote:
| I'm thinking it will only be a matter of time (if it's not
| already the case) before things like self-checkout systems
| capture HQ, face-level video for facial recognition and
| identification, akin to any number of dystopian novels/movies
| where some protagonist cannot move around without a face
| covering because there are scanners everywhere, or even
| something like Idiocracy, where the public is so conditioned
| that they immediately report someone who does not obey the
| government regime's requirement to have some barcode.
| interludead wrote:
| Is there a tipping point where familiarity leads to
| normalization, or does it instead give people the clarity to
| resist?
| tim333 wrote:
| Also there's the possibility people aren't particularly
| bothered by it as long as it gets used for reasonable
| purposes, to catch the bad guys. My main annoyance with
| surveillance in London is it wasn't good enough to catch
| the bastards who snatched my phone.
|
| From a practical point of view, if you look at the phone
| snatchers, to avoid getting caught they wear hoodies,
| balaclava-type cycle masks, and generic black tracksuit-like
| clothing. If you look at the photos of the NYC shooter, he
| slipped up in not wearing a balaclava-type mask and in
| having distinctive clothing and a distinctive backpack.
| photonthug wrote:
| > My main annoyance with surveillance in London is it
| wasn't good enough to catch the bastards
|
| Well that's the norm with all surveillance: it pretty
| much never helps you and might hurt you, regardless of
| the promises. Obviously after decades of constant spying,
| men are still getting ads intended for women and vice
| versa, and yet micro targeting is changing election
| outcomes. Banks and governments watch every single
| transaction, but it doesn't reduce the administrative
| burden of compliance for tax paperwork. Airport
| experiences are worse than ever and at greater expense,
| but anyone with a few brain cells to rub together knows
| that it's just security theater. Even more basically..
| google reads all your email and searching for that exact
| phrase you know you read or wrote just a few weeks ago
| somehow turns up zero results.
|
| This will all just get worse, because as the amount of
| data collected increases, everyone can be suspected of
| something just because of coincidence. Your insurance
| company is getting the memo about your poor diet or knows
| that you're driving too fast, and just won't bother to
| find out about your healthy exercise regimen or that your
| job is driving an ambulance! To be presumed innocent
| you'll need to opt into more data collection or
| disclosures of course, that's the way it goes, but this
| only makes things worse because the extra data is just
| more stuff that can be used in a case against you.
| alistairSH wrote:
| But, will they even realize when/where they're being
| surveilled?
|
| Out of sight, out of mind. If there isn't a large video
| camera tracking them as they move across a shop or down the
| street, I'm not sure many people will even notice.
| conductr wrote:
| It's pretty much too late by the time that happens. People's
| general indifference regarding privacy never ceases to amaze
| me; we really put up no fight whatsoever.
| dghlsakjg wrote:
| I wonder if there isn't a case to be made for some really
| bad faith projects as demonstrators for just how creepy
| this shit is.
|
| Privacy advocacy orgs should have contests for tracking
| people using publicly available video feeds, or something
| of the sort.
|
| Let people search their license plate to see how easy it is
| to track all of your movements. Maybe put up a few high res
| webcams in the vicinity of a legislature building for
| maximum effect.
| conductr wrote:
| I don't have much hope in that approach. It might get
| some attention and trend for a day/week or so, but
| nothing happens and people move on to the next thing and
| the cameras remain.
|
| Also, practically, the advocacy groups would need to get
| access to the surveillance feeds or deploy their own
| hardware - which I just don't see happening
| 1659447091 wrote:
| I've used this as a bit of a thought experiment, and also
| think it may do more harm than good--but a part of me
| wonders if it may be just the thing to create change. A
| non-profit that works a bit like _haveibeenpwned.com_ but
| with data sold by data brokers, that anyone can look up,
| with corresponding source attribution. At one point, long
| ago, I was of the idea that all data should be
| public/exposed or none of it (this ship already sailed with
| data brokers and such. Don't know how it could be
| undone).
|
| The problem I keep running into is a real world take on
| the Trolley problem[0].
|
| Do you publicly publish all data, which:
|
| 1. Reduces its sellable value
|
| 2. Makes people aware of how much they are being tracked
| and profiled
|
| 3. Gives back a small bit of agency over one's data by
| knowing where to send delete/remove request to make data
| brokers honour local laws
|
| However, doing so would also:
|
| 1. Give easy access to abuse victim data, putting them in
| further harm
|
| 2. Give actual stalkers an easier path to their targets
|
| 3. Other harm that I can not fathom at this point in time
|
| I don't know the answer, maybe mask the address part, or
| do like Strava and set a blocking geo fence around
| home/work addresses. For location tracking keep it months
| behind and remove/mask anything remotely related to
| health services (mental and physical).
|
| [0] https://en.wikipedia.org/wiki/Trolley_problem
| conductr wrote:
| I'd prefer the passive data never existed; it's actively
| collected, and that activity can be banned. Meaning, when
| I'm on strava I'm actively collecting data and have opted
| in to that. But, if I'm jogging, I didn't opt in to the
| cameras on every pole using facial recognition to
| triangulate my location (my face + camera location = my
| location) and so I think this is a bit of an overreach.
|
| Just like everyone though, I'm just going to gripe here
| and move on with my life as the mass surveillance
| infrastructure rollout proceeds
| dathinab wrote:
| The thing with your example is that there is a "time and
| location bound context" due to which the false positive rate
| can be _massively_ reduced.
|
| But for a nationwide public search the false positive rate is
| just way too high for it to work well.
|
| Once someone has managed to leave a "local/time" context (e.g.
| a known accident at a known location and time) without leaving
| too many traces (easy in the US due to the wide use of private
| cars), the false positive rate often makes such systems
| practically unhelpful.
| bigiain wrote:
| Not too sure modern private cars are all that good at letting
| you avoid leaving time/location traces.
| gleenn wrote:
| It seriously pisses me off that they make the font so small on
| the opt-out signage and you get told by a uniform to stare at
| the camera like you have no choice. Everything you don't fight
| for ends up getting taken.
| foxglacier wrote:
| I tend to just stop and read the fine print for things that
| might matter or if I have the time, even if I'm holding up a
| queue. I've spent several minutes at the entrance gate to a
| parking building because of the giant poster of T&Cs. I ask
| librarians to find books for me because the catalogue
| computer has a multi-screen T&C that I can't be bothered
| reading. I've turned away a customer from my business because
| their purchasing conditions included an onerous
| indemnification clause which they refused to alter. I
| discovered you don't need ID to travel on local flights
| because the T&C led me to calling the airline who gave me a
| password to use instead. I've also found several mistakes in
| T&Cs that nobody probably notices because nobody reads them.
| jillyboel wrote:
| Thank you for giving us this dystopian future, AI bros
| Buttons840 wrote:
| I think the best we can hope for is that government officials
| are subject to more surveillance than regular people. Everyone
| is going to have at least some surveillance.
| UltraSane wrote:
| You have zero expectation of privacy in public
| crooked-v wrote:
| There's a huge difference between the historical intent of
| that principle and the way that these days everyone in a
| given space can be exhaustively recorded and tracked 24/7.
| hibikir wrote:
| The availability of cheap disk is what makes the lack of
| privacy naturally different. There was a time when a
| police department could identify a suspect, talk to a
| judge, and then had that person followed for a while,
| dedicating multiple people to the efforts. With enough
| cameras and disk space, you now identify a person, and they
| were pre-followed for who knows how long.
|
| Then again, it depends on where you are. One could have
| thought that finding a specific guy in NYC after you had
| him on camera at a given time would be easy, but he
| wasn't so easy to locate immediately.
| wpietri wrote:
| For sure, and I think a key change here is asymmetry.
| Previously in public I'd have a reasonable chance of
| knowing that somebody was watching or following me. Between
| cameras, networks, and high-capacity recording, that's all
| out the window.
|
| I'd feel much better about it if we heavily surveilled the use
| of surveillance. E.g., every access is recorded both in
| terms of metadata and in terms of generating video of
| whoever's looking. And if I'm in something they're looking
| at, I get notified (barring temporary legal exceptions for
| open investigations and the like).
| HeatrayEnjoyer wrote:
| This saying isn't even true. Many countries have cultural
| expectations and legal structures providing some level of
| privacy in public. The very first GDPR fine issued stemmed
| from a business security camera that needlessly recorded
| people on the sidewalk.
| UltraSane wrote:
| How can you expect privacy when an unlimited number of
| cameras could be recording you?
| itishappy wrote:
| You reasonably can't. As a society, we need to choose
| between individual privacy and the pervasive use of
| invasive cameras. Regulations can be made to protect one
| or the other. The US seems to be going one way, the EU
| another.
| gosub100 wrote:
| Merely recording is a necessary, but not sufficient
| condition to invasion of privacy.
| Mashimo wrote:
| I actually have, by law even :)
| unethical_ban wrote:
| The ease of mass surveillance and analysis/tracking makes it
| worse. Machine powered automatic analysis and tracking is
| more than just video recording. I hope that difference is
| apparent.
| ct0 wrote:
| The individual tracking systems were getting secretly installed
| at a local to me state school about 10 years ago. It's got to
| be pretty advanced by now.
| nobody9999 wrote:
| >why make a fuss about it at the grocery store or on a public
| sidewalk.
|
| Because _my_ business is my business and nobody else's. Full
| stop.
| try_the_bass wrote:
| When you're in public or at a grocery store, it's no longer
| _just_ your business, though?
| JTyQZSnP3cQGa8B wrote:
| That's the American privacy concept. It's different in
| other countries.
| try_the_bass wrote:
| No, that's kind of just a fact? When you're in public,
| you're interacting with others, meaning your actions (or
| lack thereof) no longer impact only yourself.
|
| So, it stops being solely your business, and starts to
| become slightly others', as well.
| nobody9999 wrote:
| >So, it stops being solely your business, and starts to
| become slightly others', as well.
|
| Whose? And under what circumstances? Please be specific
| and include appropriate legal precedents. Thanks!
| gianjohansen wrote:
| Depends on the state and city, there's no federal law.
| Madison Square Garden (notoriously) uses facial
| recognition to ban all lawyers from their venue who work
| at firms engaged in active litigation against them. This
| was upheld in May [1][2] since in NYC you can collect
| biometric data for commercial use without consent as long
| as it's signposted and you're not selling the data [3].
|
| [1] https://law.justia.com/cases/federal/district-
| courts/new-yor...
|
| [2] https://news.bloomberglaw.com/litigation/madison-
| square-gard...
|
| [3] https://codelibrary.amlegal.com/codes/newyorkcity/lat
| est/NYC...
| try_the_bass wrote:
| I don't know about case law, but when you walk into a
| grocery store, it certainly becomes _their_ business!
|
| Where's the case law and precedent that says your
| business is only your own, even when on a public sidewalk
| or in a grocery store? If you're going to make such
| unreasonable demands, can we start with your own claims,
| since you made them first?
| hombre_fatal wrote:
| If you want to get your mind blown, bring up traffic
| light cameras in Texas where people use "I have the right
| to privacy" to literally mean they should be able to run
| a red light [and potentially T-bone someone].
|
| Public roads should be a clear case where your business
| is everyone else's business since you're hurtling down
| the road in an increasingly heavy vehicle, but we're
| far from being able to acknowledge that.
| unethical_ban wrote:
| You see no difference between
|
| Being seen by others, being recorded incidentally, being
| recorded constantly, and being recorded and analyzed by
| machines in real time?
| try_the_bass wrote:
| Of course I see a difference among those. Where did I
| indicate anything to the contrary, or on the topic at
| all?
|
| How am I supposed to take you privacy advocates seriously
| when you make such wild logical leaps?
| unethical_ban wrote:
| Because your statement makes no point unless it is a
| defense of current technology.
|
| Most people don't expect others to look away while you
| pick your nose at the grocery store. The statement about
| defending privacy in public is almost always about
| tracking and the ease of it.
| bayouborne wrote:
| Start masking up w/a consistent alterface now, because once
| everyone gets base-lined, then you're going to be stopped
| because you don't look like you.
| Razengan wrote:
| And the biggest problem is that all this surveillance is one-
| sided: "they" can see everything we do but we can't see what
| they do.
| temporallobe wrote:
| I just experienced one of these facial scanners in the UK while
| boarding a plane for the US. The thought had occurred to me
| that this could become the norm and that there's nothing one
| could actually do about it and that we are already living in
| the dystopian future we feared, where no one can truly ever be
| anonymous. But I also wondered about various problem scenarios.
| If the scanner couldn't match your face, would they deny you
| entry? If so, what would happen if someone had plastic surgery
| or some other condition that altered their face? What if this
| technology becomes so pervasive that your face is scanned
| everywhere you go? Where does any of this end?
| bookofjoe wrote:
| See my comment up top re: plastic surgery that alters your
| face.
| Gud wrote:
| Don't assume that this development is inevitable.
|
| Some countries have strong privacy laws, such as Switzerland.
| interludead wrote:
| Trading off your biometric data for that convenience
| kccqzy wrote:
| What you describe at the end has already happened in China.
| Municipalities (at least the large ones) routinely have cameras
| with facial recognition everywhere in public. The police have
| the power to pull up this kind of information without warrants
| (it's China, so what do warrants even mean?).
| Zigurd wrote:
| The article correctly points out that the amount of information
| available in a controlled environment makes it not even the
| same problem. If I have data on your irises and blood vessels
| and cranium shape, good luck evading a match if I get you
| somewhere I can take the same measurements. On the street there
| are some hacks, like measuring gait, that can compensate for
| less face data, but evading a useful match that's not one of a
| zillion false positives is much easier.
| derefr wrote:
| If what you're trying to do is to _publish prepared images of
| yourself_ , that won't be facially recognized as you, then the
| answer is "not very much at all actually" -- see
| https://sandlab.cs.uchicago.edu/fawkes/. Adversarially prepared
| images can still look entirely like you, with all the facial-
| recognition-busting data being encoded at an almost-
| steganographic level vs our regular human perception.
| 1659447091 wrote:
| Do you know if this is still being worked on? The last "News"
| post from the link was 2022. Looks interesting.
| sebastiennight wrote:
| My understanding is that this (interesting) project has been
| abandoned, and since then, the face recognition models have
| been trained to defend against it.
| derefr wrote:
| Very likely correct in the literal sense (you shouldn't rely
| on the published software); but I believe the _approach_ it
| uses is still relevant / generalizable. I.e. you can take
| whatever the current state-of-the-art facial recognition
| model is, and follow the steps in their paper to produce an
| adversarial image cloaker that will fool that model while
| being minimally perceptually obvious to a human.
|
| (As the models get better, the produced cloaker retains its
| ability to fool the model, while the "minimally perceptually
| obvious to a human" property is what gets sacrificed -- even
| their 2022 version of the software started to do slightly-
| evident things like visibly increasing the contour of a
| person's nose.)
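|
| A rough sketch of that recipe in Python/PyTorch: treat whatever
| embedding model you have as fixed, then nudge the image so its
| embedding drifts away from the original while the pixel change
| stays small. The model, step sizes, and loss are placeholders,
| an outline of the idea rather than the Fawkes authors' actual
| method or code:
|
|     import torch
|
|     def cloak(image, embed_model, steps=50, step_size=1e-2,
|               eps=0.03):
|         """image: float tensor in [0, 1], shape (1, 3, H, W).
|         embed_model: any fixed, differentiable face-embedding
|         network (assumed; swap in the model you are targeting)."""
|         embed_model.eval()
|         with torch.no_grad():
|             original = embed_model(image)  # embedding to move from
|         delta = torch.zeros_like(image, requires_grad=True)
|         for _ in range(steps):
|             out = embed_model(image + delta)
|             # Minimizing -distance pushes the cloaked embedding
|             # away from the original one.
|             loss = -torch.nn.functional.mse_loss(out, original)
|             loss.backward()
|             with torch.no_grad():
|                 delta -= step_size * delta.grad.sign()
|                 delta.clamp_(-eps, eps)  # keep the change subtle
|             delta.grad.zero_()
|         return (image + delta).clamp(0, 1).detach()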
| nonrandomstring wrote:
| The thing about biometrics, as discussed in more intelligent
| circles, is "compromised once, compromised for all time". It's
| a public key or username, not a password.
|
| Fortunately that's not true of governments. Although your
| government may be presently compromised it is possible, via
| democratic processes, to get it changed back to uncompromised.
|
| Therefore we might say, it's easier to change your government
| than it is to change your face. That's where you should do the
| work.
| dathinab wrote:
| biometrics are also way less unique than people think
|
| basically the moment you apply them to a huge population (e.g.
| all of the US) and ignore temporal and/or local context you
| will find collisions
|
| especially when you consider partial samples, whether that is
| due to errors of the sensors used or other reasons
|
| Innocent people have gone to prison because of courts ignoring
| reality (biometric matches are always just a likelihood of
| matching, never a guaranteed match).
| hammock wrote:
| First order approximation is 10 years' worth of aging, or 5
| years' worth for a child under 16. These are the timelines in
| which you must renew your American passport photo.
|
| Apple Face ID is always learning as well. If your brother opens
| your phone enough times with your passcode, it will eventually
| merge the two faces it recognizes
| sanj wrote:
| citation?
| hammock wrote:
| First hand experience. Try it yourself?
| IncreasePosts wrote:
| Google photos only has pictures of my mom from her 60s
| onwards, but when I added a sepia-toned scan of my mom as a
| 9-year-old, Google Photos asked me "is this [your mom]?"
| satvikpendem wrote:
| Their conclusion reminds me of this lady in China, Lao Rongzhi,
| who was a serial killer along with her lover, Fa Ziying [0]. They
| both went around the country extorting and killing people, and,
| while Fa was arrested in 1999 via a police standoff, Lao was on
| the run for two decades, having had plastic surgery to change her
| face enough that most humans wouldn't have recognized her.
|
| But in those two decades, the state of facial recognition
| software had advanced rapidly, and she was recognized by a
| camera at a mall and matched to a national database of known
| criminals. At first the police thought it was an error, but
| after taking DNA
| evidence, it was confirmed to be the same person, and she was
| summarily executed.
|
| In this day and age, I don't think anyone can truly hide from
| facial recognition.
|
| [0] https://www.youtube.com/watch?v=I7D3mOHsVhg
| joe_the_user wrote:
| Hmm, "cameras reported a 97.3% match". I would assume that for
| a random person, the match level would be random. 1/(1 - .973)
| ~ 37, i.e., 1 in 37 people would be tagged by the cameras. If
| you're talking China, that means matching millions of people
| in millions of malls.
|
| Possibly the actual match level was higher. But still, the way
| facial recognition seems to work even now is that it provides
| a consistent "hash value" for a face, but with a limited
| number of digits/information. This can be useful if you know
| other things about the person (i.e., if you know someone is a
| passenger on plane X, you can very likely guess which one) but
| still wouldn't scale unless you want a lot of false positives
| and are after specific people.
|
| Authorities seem to like to say DNA and facial recognition
| caught people since it implies an omniscience to these
| authorities (I note above someone throwing out the either
| wrong or meaningless "97.3%" value). Certainly, these
| technologies do catch people, but they are still limited and
| expensive.
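|
| To make the scale point concrete, some back-of-the-envelope
| arithmetic in Python (the match rates and volumes below are
| assumptions, not figures from the Chinese system):
|
|     # Parent's reading of "97.3%" as a 1-in-37 false match rate,
|     # applied to a large daily volume of faces (assumed).
|     faces_scanned_per_day = 5_000_000
|     naive_false_match_rate = 1 / 37
|     print(f"{faces_scanned_per_day * naive_false_match_rate:,.0f} "
|           "false alerts per day")   # ~135,135
|
|     # Even a far better per-comparison rate still floods
|     # operators once the population is large enough.
|     for fmr in (1e-3, 1e-4, 1e-5, 1e-6):
|         print(f"FMR {fmr:g}: "
|               f"{faces_scanned_per_day * fmr:,.0f} false alerts/day")
|
| Which is why such a hit is usually only a lead to be confirmed
| by other evidence (here, the DNA test) rather than an
| identification in itself.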
| Epa095 wrote:
| > I would assume that for a random person, the match level
| would be random. 1/(1 -.973) ~ 37.
|
| Why would you assume that?
| ImprobableTruth wrote:
| The "97.3%" match is probably just the confidence value - I
| don't think a frequentist interpretation makes sense for
| this. I'm not an expert in face recognition, but these
| systems are very accurate, typically like >99.5% accuracy
| with most of the errors coming from recall rather than
| precision. They're also not _that_ expensive. Real-time
| detection on embedded devices has been possible for around a
| decade and costs for high quality detection have come down a
| lot in recent years.
|
| Still, you're right that at those scales these systems will
| invariably slip once in a while and it's scary to think that
| this might be enough to be considered a criminal, especially
| because people often treat these systems as infallible.
| left-struck wrote:
| Another related thing to consider: if she had plastic surgery,
| what are the odds that among a billion people there isn't
| someone whose face looks more like her original face than her
| new face does?
| noqc wrote:
| The only way a percentage match means anything here is if
| the facial recognition software returns a probability
| distribution representing the likelihood that the person
| identified is each member of the set. I'm sure that 97.3% is
| actually low for most matches, since she had extensive
| plastic surgery.
| wslh wrote:
| This could help with the discussion: "Human face identification
| after plastic surgery using SURF, Multi-KNN and BPNN
| techniques"
| <https://link.springer.com/article/10.1007/s40747-024-01358-7>
| mmooss wrote:
| How do you know that story is true? Would the police say if
| they made a mistake? Would anyone be able to find out the truth
| or accuse them?
| satvikpendem wrote:
| By that logic, how do you know any crime story from anywhere
| in the world is true and not just a cover up by cops?
| mmooss wrote:
| Good question: When you encounter information, how do you
| determine the likelihood of its accuracy?
| h1fra wrote:
| that's a problem reported by many many minorities around
| the world
| 542354234235 wrote:
| The person they executed admitted to being Lao Rongzhi,
| admitted to participating in the crimes, but claimed she was
| not responsible because of abuse she suffered at the hands of
| Fa Ziying. While false and forced confessions are absolutely
| a thing, hers doesn't really fit that pattern. She
| acknowledged being involved, showed remorse for the killings,
| but distanced herself from them and minimized her involvement
| in violence, focusing on the robberies. After being presented
| with DNA evidence, it doesn't appear that she ever claimed
| not to be Lao again nor did her defense seem to ever attempt
| to put that forward, but both of them put forward a rigorous
| defense to attempt to save her.
|
| Anything is possible, but it seems from her own actions for
| years up until her execution that it was in fact her and she
| only denied it to the local police initially, hoping to be
| let go.
| jampekka wrote:
| > and she was summarily executed.
|
| Nitpick: Summary execution means execution without due process.
| As per Wikipedia there was a quite thorough legal process all
| the way to the supreme court.
|
| "On September 9, 2021, Lao was sentenced to death by the
| Nanchang Intermediate People's Court for intentional homicide,
| kidnapping, and robbery. She was also stripped of her political
| rights for life and had all of her personal property
| confiscated. Lao appealed her conviction in court, and the
| second trial was held on August 18, 2022 at Jiangxi Provincial
| Higher People's Court. Although Lao admitted to being an
| accomplice to Fa, she claimed to have only done so in fear of
| losing her own life, as Fa had physically and sexually abused
| her throughout their relationship. On November 30 of the same
| year, the court upheld the death sentence. On December 18,
| 2023, the Nanchang Intermediate People's Court carried out the
| execution of Lao Rongzhi, with the approval of the Supreme
| People's Court."
|
| https://en.m.wikipedia.org/wiki/Fa_Ziying_and_Lao_Rongzhi
| arcbyte wrote:
| Your overall point holds that there was China's version of
| due process and plenty of elapsed time between her capture
| and subsequent execution. Therefore it was not a summary
| execution. Nowhere close. Moreover, to call this out is not a
| nitpick, it's an important factual correction of the OP.
|
| However I would nitpick that while summary executions do
| include those without due process, the defining
| characteristic is simply speed. If the execution happened
| uncharacteristically fast compared to typical executions,
| even if all due process afforded to her was followed, then she
| was still summarily executed.
| jampekka wrote:
| Nitpicking continued: As per e.g. Wikipedia definition it
| refers explicitly to the process (and not the speed): "In
| civil and military jurisprudence, summary execution is the
| putting to death of a person accused of a crime without the
| benefit of a free and fair trial. The term results from the
| legal concept of summary justice to punish a summary
| offense, as in the case of a drumhead court-martial, but
| the term usually denotes the summary execution of a
| sentence of death."
|
| In practice a free and fair trial can't be very fast
| though.
|
| https://en.m.wikipedia.org/wiki/Summary_execution
| satvikpendem wrote:
| Thanks, I used the wrong word; I should have said that she
| was executed soon after conviction, which is not usually the
| case in many other countries.
| hyperific wrote:
| CV Dazzle (2010) attempted this to counter the facial recognition
| methods in use at that time.
|
| https://adam.harvey.studio/cvdazzle
| probably_wrong wrote:
| D-ID (YC S17, [1]) promised that they would do the same. They
| have been quite silent on whether they ever achieved their
| target and nowadays they've pivoted to AI so no idea whether
| they ever actually succeeded.
|
| https://news.ycombinator.com/item?id=14849555
| marc_abonce wrote:
| It looks like that makeup style might not work so well anymore,
| at least according to this tweet I found:
| https://x.com/tahkion/status/1013568373622362113
|
| It seems like the woman in the example is using a CV Dazzle
| makeup style and it doesn't fool the algorithm.
|
| Other makeup styles work better, although I think that's
| probably just a short term solution (tweet is from 2018) before
| any new trendy makeup style is added to the training dataset.
| its_bbq wrote:
| Why is makeup considered cheating but surgery not?
| jquave wrote:
| wrong app bro
| jl6 wrote:
| Maybe wearing enough makeup to hide your face would fool an
| algorithm, but be conspicuous enough to get you noticed anyway.
| throe844i wrote:
| I welcome such tracking and surveillance.
|
| It is too easy to get accused of something. And you have no
| evidence to defend yourself. If you keep video recording of your
| surroundings forever, you now have evidence. AI will make
| searching such records practical.
|
| There were all sorts of safeguards to make such recordings
| unnecessary, such as due process. But those were practically
| eliminated. And people no longer have basic decency!
| dingnuts wrote:
| who cares if you're tracked because you have nothing to hide,
| right?
|
| now imagine you're the wrong religion after the regime change.
|
| "I have nothing to hide" is a stupid argument that leads to
| concentration camps
| simplicio wrote:
| Seems like the Nazis managed to do the Concentration Camps
| thing without facial recognition software.
| pavel_lishin wrote:
| But they did have tremendous data processing abilities for
| their time!
| simplicio wrote:
| I don't think keeping the data processing abilities of
| modern gov'ts below that of 1930's Germany is really a
| plausible plan for avoiding concentration camps.
| Wicher wrote:
| https://en.wikipedia.org/wiki/IBM_and_the_Holocaust
| pessimizer wrote:
| > As the Nazi war machine occupied successive nations of
| Europe, capitulation was followed by a census of the
| population of each subjugated nation, with an eye to the
| identification and isolation of Jews and Romani. These
| census operations were intimately intertwined with
| technology and cards supplied by IBM's German and new
| Polish subsidiaries, which were awarded specific sales
| territories in Poland by decision of the New York office
| following Germany's successful Blitzkrieg invasion. Data
| generated by means of counting and alphabetization
| equipment supplied by IBM through its German and other
| national subsidiaries was instrumental in the efforts of
| the German government to concentrate and ultimately destroy
| ethnic Jewish populations across Europe. Black reports that
| every Nazi concentration camp maintained its own Hollerith-
| Abteilung (Hollerith Department), assigned with keeping
| tabs on inmates through use of IBM's punchcard technology.
| In his book, Black charges that "without IBM's machinery,
| continuing upkeep and service, as well as the supply of
| punch cards, whether located on-site or off-site, Hitler's
| camps could have never managed the numbers they did."
|
| https://en.wikipedia.org/wiki/IBM_and_the_Holocaust
|
| They would have done a lot better faster with facial
| recognition software, and certainly wouldn't have turned it
| down.
| mixmastamyk wrote:
| But surely it couldn't happen in America, right? Guess
| what, census data _was_ used to facilitate Japanese
| internment.
| whycome wrote:
| America and Canada used facial recognition for their ww2
| concentration camps.
|
| https://en.m.wikipedia.org/wiki/Internment_of_Japanese_Cana
| d...
| dredmorbius wrote:
| The Nazis utilised the best information-management tools of
| the time, including IBM computers (fully supported by IBM
| throughout the war) and punch cards (as another commenter
| notes: <https://news.ycombinator.com/item?id=42359352>).
| Those tattoos worn by concentration camp survivors were
| IBM-assigned identifiers.
|
| Nazis also used census and other civil data sources.
| Deliberate destruction of such records in the Netherlands
| is one of the legacies of WWII:
|
| <https://en.wikipedia.org/wiki/The_Holocaust_in_the_Netherl
| an...>
|
| This and other legacies of 20th-century genocide are chief
| reasons why European attitudes toward rampant data
| collection and exchange are far harsher than in the United
| States. Though I'd argue still not nearly harsh enough.
| throe844i wrote:
| Data means power and freedom. With access to data you can
| defend yourself from legal persecution! In past people were
| lynched and killed for false accusations! With evidence they
| would have a chance!
|
| A hostile regime will kill you anyway. But there is a long way
| there. And a "soft hostile" one may throw you into prison for
| 30 years, or take your house and family. Or will not
| enforce punishment on crooks. All fully legally in "proper
| democracy".
|
| And "wrong religion" and "leads to concentration camps"
| really is a stupid argument, given what has been happening in
| the last year. People today are just fine with concentration
| camps and genocide! It is just an absurd argument used to
| defend a corrupted
| status quo!
|
| If you have a "wrong religion" change it! People did that all
| the time.
| pavel_lishin wrote:
| > _With access to data_
|
| That's the key problem. Why do you assume you'll have
| access to this data?
| pavel_lishin wrote:
| > _It is too easy to get accused of something. And you have no
| evidence to defend yourself. If you keep video recording of
| your surroundings forever, you now have evidence._
|
| This assumes that you have _access_ to those recordings. If you
| 're live-logging your life via something you're wearing all day
| every day, maybe - but if the government decides to prosecute
| you for something, what are the odds that you'll be able to
| pull exonerating evidence out of the very system that's trying
| to fuck you?
|
| Even if a system doesn't _care_ , it's still a hassle. Case in
| point:
| https://www.independent.co.uk/news/world/americas/michigan-s...
|
| > _An African American man who was wrongly convicted of a fatal
| shooting in Michigan in 2011 is suing a car rental company for
| taking seven years to turn over the receipt that proved his
| innocence, claiming that they treated him like "a poor black
| guy wasn't worth their time"._
|
| I found this article while looking for another story that's
| virtually identical; I believe in that one it was a gas station
| receipt that was the key in his case, and he ended up spending
| very minimal time in jail.
|
| How many people are in jail now because they weren't able to
| pull this data?
| rustcleaner wrote:
| If people are sitting in cells for lack of that data, the
| standard of proof is too low.
| trgn wrote:
| i recently tried one of those cashierless amazon stores. it was
| an odd jolt, this feeling of being trusted by default. It was
| vaguely reminiscent of one in my childhood, when, after my
| parents had sent me on an errand to the local grocer, I'd
| forgotten the money and the clerk/owner let me just walk out
| since they knew me. Presumably they and my mom would take care
| of the balance later.
|
| I live now in a city where small exchanges are based on a
| default of mistrust (e.g. locking up the tide-pods behind a
| glass case - it's not a meme). The only super market near (not
| even _in_) my food desert started random bag checks.
|
| The modern police state requires surveillance technology, but
| abusive authority has flourished in any technological
| environment. the mafia had no problem to terrorize entire
| neighborhoods into omerta for example, without high technology.
| i'm sure there's other examples.
|
| i don't know the right answer, but considering the extent to
| which anti-social and criminal attitudes are seemingly allowed
| to fester, while everybody else is expected to relinquish their
| dignity, essentially _anonymize_ themselves, it makes me have
| less and less of a kneejerk response to the expansion of
| technologically supported individualization.
| gaiagraphia wrote:
| Would you be happy for such systems to scale with income and
| power?
|
| Surely those with larger means have a bigger impact if they're
| acting nefariously? And it'd be a HUGE issue for society if our
| rich and powerful were wrongly accused, and couldn't implement
| their efficiencies and expertise across the market.
| nitwit005 wrote:
| It may help, but the police will realistically not make an
| effort to prove your innocence. You'll have to dig that
| evidence up yourself.
|
| Netflix has a documentary, Long Shot, on someone who was proven
| innocent of a murder because footage of them at a baseball game
| at the time of the murder was found. They had to get help finding
| that footage, as the police wouldn't check.
|
| The prosecutors absolutely did not care that video footage, and
| phone evidence, placed him at another location, and continued
| to insist on his guilt. The judge eventually dismissed the
| charges.
| shae wrote:
| What about infrared LEDs on my face?
| gehwartzen wrote:
| https://www.reddit.com/r/techsupportmacgyver/comments/mej7j7...
| dredmorbius wrote:
| Oh, _that_ guy:
|
| <https://xkcd.com/1105/>
|
| (Different mechanism, similar result.)
| deadbabe wrote:
| It's trivial to also implement gait analysis to help visually
| identify someone if a face isn't available. Then when you do get
| a glimpse of the face you can link the gait and the face
| identity.
| SoftTalker wrote:
| I was traveling internationally earlier this year and I have
| grown a heavy beard since my passport photo was taken. None of
| the automated immigration checkpoints had any trouble identifying
| me.
| mixmastamyk wrote:
| I believe they focus on the eye/nose shape and spacing.
| dathinab wrote:
| Yes, they mainly focus on bone structure, especially around
| the eye/nose area.
|
| Beards are too easy to change, and masks have been very common
| for some time and cover more than beards do.
| throw310822 wrote:
| Which makes me wonder: what about contact lenses that could
| mess with the measurement of eye distance, for example by
| having a drawing of the iris (surrounded by a portion of white
| sclera) that is slightly offset from the real one?
| darepublic wrote:
| Need EMP charges like in Metal Gear. A bunch of metallic
| confetti fills the air while you dash past the security
| cameras, big and small.
| cbanek wrote:
| What's funny is the metallic confetti would inevitably have a
| serial number on it that they could trace to who bought it.
| Taser rounds already have this built in.
| Scotrix wrote:
| "Asking our governments to create laws to protect us is much
| easier than..."
|
| A bit naive, that; it's too late, since the data is already
| mostly available and it only takes a different government to
| make such protection obsolete.
|
| That's why we Germans/Europeans have fought data collection,
| and fought for protections, for so long and so hard (and
| probably have some of the most sophisticated policies and
| regulations in place), but over time it just becomes impossible
| to keep data collection to a minimum (first small exceptions
| for reasons that are valid in themselves, then more and more
| participants and normalization, until there is no protection
| left...)
| wizzwizz4 wrote:
| It's not too late. Maybe it is _for us_: but in 100 years, who
| will really care about a database of uncomfortably-personal
| details about their dead ancestors? (Sure, DNA will still be an
| issue, but give that 1000 years and we'll probably have a new
| MRCA.) If we put a stop to things _now_ (or soon), we can nip
| this in the bud.
|
| It's probably not too late for us, either. Facial recognition
| by skull shape is still a concern, but only if the bad guys get
| _up-to-date_ video of us. Otherwise, all they can do is
| investigate our _historical_ activity. Other types of data have
| greater caveats preventing them from being useful long-term,
| provided we don't participate in the illusion that it's
| "impossible to put the genie back in the bottle".
| bigiain wrote:
| So what you're suggesting is we do whatever we can to avoid
| hitting 2 degrees of universal facial recognition precision?
| Given that the 1.5 degree target is now inevitably
| impossible.
| wizzwizz4 wrote:
| Mass surveillance takes active maintenance, and most of its
| direct consequences cannot outlive the last of those
| subject to it. Alteration of the chemical composition of
| the atmosphere is expected to persist for millennia, with
| consequences that won't be felt for centuries. They're
| analogous only in that the same societal forces drive both:
| but trying to tackle those forces head-on is operating on
| such a high level of abstraction that you'd be wasting your
| time.
|
| Start small. Get your kid's school to take the CCTV out of
| the toilet rooms. There's no such problem as "facial
| recognition" or "mass surveillance": there are _many
| specific instances_ of it. Fight those.
| briandear wrote:
| But the Germans still ask people to register their religion,
| ostensibly so the government can give tax money to the relevant
| religion. Sorry, but the German government asking people to
| declare their religion just reminds me of something unpleasant.
| seszett wrote:
| If that's your only nitpick, then just look at France, which
| has similar privacy protections and doesn't collect religious
| data.
| imron wrote:
| > If you wore sunglasses and then did something to your face
| (maybe wear a mask or crazy dramatic makeup) then it would be
| harder to detect your face, but that's cheating on the question--
| that's not changing your face, that's just hiding it!
|
| So sunglasses and a mask then. Who cares if it's 'cheating'.
| dathinab wrote:
| What such articles often completely ignore is the false
| positive rate.
|
| For example, where I live they tested a state-of-the-art facial
| recognition system at a widely used train station and applauded
| themselves for how great it was, given that the test targets
| were recognized even when they wore masks, capes, hats, etc.
|
| But what wasn't mentioned was that the false positive rate,
| while small as a percentage (I think <1%), still made the
| system hardly usable given the number of expected non-match
| samples.
|
| E.g. one of the train stations where I live has ~250,000 people
| passing through it every day; even a false positive rate of
| 0.1% would mean 250 false alarms at that one station, every
| single day. If you scale the search to a wider area the numbers
| get far higher (and it's not just population size: many people
| might be falsely recognized several times during a single
| trip).
|
| AFAIK the claimed false positive rate is often in the range of
| 0.01%-0.1%, BUT when these systems are independently tested in
| real-world contexts the observed false positive rate is often
| more like 1%-10%.
|
| So what does that mean?
|
| It means that if you have a fixed set of video to check (e.g.
| footage from near where an incident happened, within roughly
| +-2h of it), you can use such systems to pre-filter the video
| and then post-process the results over many hours.
|
| But if you're trying to find a person in a nation of >300
| million who doesn't want to be found, and you've missed the
| initial time frame in which you could rely on them being near a
| known location, you will be flooded with so many false
| positives that the system becomes of little practical use.
|
| I mean, you can still get a lucky hit.
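|
| A minimal sketch of that arithmetic (all numbers are assumed,
| just to illustrate the base-rate problem):
|
|     # Expected alarms per day at one busy station, for a range
|     # of false positive rates. Every figure here is assumed.
|     daily_passersby = 250_000   # assumed non-target faces/day
|     true_targets    = 5         # assumed genuine targets/day
|     recall          = 0.9       # assumed true positive rate
|
|     for fpr in (0.0001, 0.001, 0.01, 0.10):
|         false_alarms = daily_passersby * fpr
|         true_alarms  = true_targets * recall
|         precision = true_alarms / (true_alarms + false_alarms)
|         print(f"FPR {fpr:.2%}: {false_alarms:8.0f} false "
|               f"alarms/day, precision {precision:.1%}")
|
| Even at the optimistic claimed rates, the false alarms
| outnumber the true hits many times over.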
| Eisenstein wrote:
| What does 'false positive' mean? That it thinks it is someone
| else, or that it thinks it is a target of an investigation?
| TuringNYC wrote:
| The rate at which the actual value is negative but the
| inference is positive.
|
| This is a very handy guide:
| https://en.wikipedia.org/wiki/Confusion_matrix
| Eisenstein wrote:
| That wasn't what I was asking. I was asking what the
| failure mode was.
| Sebb767 wrote:
| False positive would, in this case, mean wrongly identifying
| an unrelated person as the search target.
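|
| For completeness, the four outcomes in the watchlist setting (a
| toy mapping, not taken from any specific system):
|
|     # How confusion-matrix terms map onto a face watchlist.
|     outcomes = {
|         "true positive":  "on the watchlist, alert raised",
|         "false positive": "not on the watchlist, alert raised",
|         "false negative": "on the watchlist, no alert",
|         "true negative":  "not on the watchlist, no alert",
|     }
|     for name, meaning in outcomes.items():
|         print(f"{name:>14}: {meaning}")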
| _DeadFred_ wrote:
| This has been answered since the 80s. This much:
|
| https://i.imgur.com/7cuDqPI.jpg
| _heimdall wrote:
| I'm of two minds when it comes to surveillance. I don't like
| that businesses, airports, etc. do it, but it is their
| property. I don't like that they can run video feeds through
| software, either in real time or after the fact, to so easily
| find and track my every move. But again, it's their property.
|
| Where the line is always drawn for me, at a minimum, is what they
| do with the video and who has access to it.
|
| Video should always be deleted when it is no longer reasonably
| needed. That timeline would be different for airports vs
| convenience stores, but I'd always expect the scale of days or
| weeks rather than months or years (or indefinitely).
|
| Maybe more importantly, surveillance video should never be shared
| without a lawful warrant, including clear descriptions of the
| limits to what is needed and why it is requested.
| rainonmoon wrote:
| The complicating factor is that it isn't just "their property";
| it's an essential part of many people's ability to function in
| society, which makes such places adjacent to public utilities.
| If the water retailer that services your home started adding
| substances that could be used to track and identify its
| customers, you'd be pretty unhappy. Private ownership doesn't
| absolve an entity from public accountability, especially when
| there is very little option not to engage with it.
| _heimdall wrote:
| > The complicating factor is that it isn't just "their
| property"; it's an essential part of many people's ability to
| function in society
|
| That's a much bigger can of worms, one that reaches well
| beyond just airports. Many modern societies are extremely
| complex and assume individual access to a long list of
| resources and services.
|
| It's a pretty slippery slope, though. People can absolutely
| choose not to fly; it isn't a basic requirement for life. The
| slippery slope leads to larger and larger government - as
| long as society continues to create implied requirements on
| the individual it seems reasonable to give more and more
| power to a central authority to ensure everyone can have that
| access.
|
| It sounds reasonable enough, though there isn't a good
| guardrail built in to avoid eventually building a
| totalitarian or communist state, as so many things are now
| "basic necessities."
| gehwartzen wrote:
| Kidding. (But maybe not?...)
|
| https://en.m.wikipedia.org/wiki/Groucho_glasses
| costco wrote:
| The face ID feature on Bryan Johnson's phone no longer recognized
| him after many months of his intense health regimen:
| https://twitter.com/bryan_johnson/status/1777789375193231778
| roland35 wrote:
| Looks like he hangs out with RFK on his Twitter. Boo
| sandbach wrote:
| At Tianfu Airport in Chengdu, there are large screens with
| cameras attached that recognize your face and tell you which gate
| to go to. Convenient but scary, like many things in China.
| aprilthird2021 wrote:
| It feels increasingly like the only way to avoid such facial
| recognition is to suddenly grow a religious conviction that your
| face should not be seen by strangers
| dotancohen wrote:
| Like a burka?
| dredmorbius wrote:
| There are numerous religions and cultures which practice some
| level of covering of the face or head. Burkas are only one
| example.
|
| Not a great reference, but:
|
| "Religious Head Coverings: A Comprehensive Guide"
|
| <https://headcoverings-by-devorah.com/religious-head-
| covering...>
|
| Wikipedia has a listing as a section of this article:
|
| <https://en.wikipedia.org/wiki/List_of_headgear#Religious>
|
| (Many of these cover only the top of the head or a part of
| the head, e.g., yarmulke, fez, kofia. Others are more
| comprehensive.)
|
| The non-religious and/or cultural coverings might also be of
| interest.
| mmooss wrote:
| Religious laws and convictions have been born from similar
| situations.
| polote wrote:
| Is the tech to do facial recognition at this accuracy available
| to the public?
|
| Last time I checked there was deepface
| https://github.com/serengil/deepface/tree/master but it was far
| from working as well as that.
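|
| For reference, the basic deepface calls look roughly like this
| (a sketch from memory; exact parameters and return fields may
| differ between versions):
|
|     # pip install deepface
|     from deepface import DeepFace
|
|     # 1:1 verification -- are these photos the same person?
|     result = DeepFace.verify(img1_path="a.jpg",
|                              img2_path="b.jpg",
|                              model_name="ArcFace")
|     print(result["verified"], result["distance"])
|
|     # 1:N search -- look for a face in a folder of known faces.
|     matches = DeepFace.find(img_path="unknown.jpg",
|                             db_path="reference_faces/")
|     print(matches)
|
| Reported benchmark accuracy for these open models is high on
| curated datasets; the false-positive-at-scale problem discussed
| upthread is a separate issue.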
| marssaxman wrote:
| > Soon, the only real defense may be federal regulation.
|
| That doesn't sound like much of a defense!
| spaceguillotine wrote:
| According to the DMV and the passport office, just having bangs
| is enough to fool the system.
| marethyu wrote:
| Can wearing realistic face masks and contact lenses that change
| iris color possibly fool modern face recognition software?
| dotancohen wrote:
| Maybe just wear a burka?
| azalemeth wrote:
| I've often wondered what would happen if I wandered around with
| a bright IR LED flashing on my lapel at about 30 or 60 Hz,
| invisible to human eyes yet short enough in wavelength to
| register on most CMOS chips and dazzle the camera.
|
| I think about this routinely on shopping trips. I don't like
| being surveilled, and even though I have nothing to hide (I've
| never shoplifted in my life!) I hate the pervasive nature of it
| all. I don't even mind being followed by a human that much, but
| I do mind algorithmic analysis that is far more effective,
| scary, and invasive. Sadly, I think the outcome of this
| experiment would be being asked to leave or an uncomfortable
| chat with a policeman. Nevertheless I silently would like
| someone braver than me to try it. You're allowed to wear a
| light on your clothes -- why not make it an IR one?
| a012 wrote:
| I guarantee that it'll trigger an alarm for shop security and
| there'll be an officer coming to see you immediately.
| aetherson wrote:
| You "guarantee" that? I think it's a possibility, but very
| far from universal.
| tivert wrote:
| Maybe not "every shop," but ones with a security guard
| monitoring the video who's actually doing his job, trying
| to "dazzle the camera" would definitely draw _extra_
| attention to you, which is probably not what you want.
|
| "Dazzle the camera" is an idea that sounds good when you
| fail to think about the whole system, and instead
| hyperfocus on one component.
| jaco6 wrote:
| Whether we ultimately outlaw facial recognition or not is
| unimportant. Cameras and data are now so cheap that soon we will
| be able to track every public movement of every person in the
| country, making crime impossible. Once you leave your house, a
| street camera will see, and trace the movements of you or your
| car into the city and as you go about your business, with or
| without your face. It will follow you until you return home or
| check into a hotel or fall asleep in your car. Your address is
| public information so this isn't a privacy violation. The current
| cost of storing 24 hour footage of the entire urban street area
| of the USA is just $100 billion annually, far less than the
| current total of $300 billion spent on criminal justice.
|
| This will bring an end to crime and herald a massive revival of
| public trust and socialization.
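|
| That storage figure is easy to sanity-check with a back-of-the-
| envelope calculation; every number below is an assumption
| chosen for illustration, not sourced data:
|
|     # Rough annual cost of retaining compressed street footage.
|     cameras      = 50_000_000  # assumed urban street cameras
|     mbit_per_sec = 2           # assumed compressed bitrate
|     usd_per_tb_y = 100         # assumed $/TB-year, replicated
|
|     # Mbit/s -> MB/s -> MB/year -> TB/year per camera.
|     tb_per_cam_year = mbit_per_sec / 8 * 86_400 * 365 / 1e6
|     total_tb  = cameras * tb_per_cam_year
|     total_usd = total_tb * usd_per_tb_y
|     print(f"{tb_per_cam_year:.1f} TB per camera-year")
|     print(f"~{total_usd / 1e9:.0f} billion USD per year")
|
| With those assumptions it lands in the tens of billions per
| year, the same order of magnitude as the figure above; the
| camera count and retention period dominate the result.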
| nmeofthestate wrote:
| Kudos for offering an alternative view exploring the potential
| upsides, but I'd take issue with "Whether we ultimately outlaw
| facial recognition or not is unimportant." Making use of all
| that data to fight crime would absolutely require it to be
| legal to capture. Edit: sorry - just realised what you mean:
| the footage would be legal to capture and use in criminal
| investigations, just without face recognition.
| dogman144 wrote:
| I get your point but the literature on this I've read leans
| towards:
|
| - ubiquitous surveillance is here (your point broadly)
|
| - the data engineering to work the data isn't quite there, or
| isn't full spectrum in the manner you argue (what prevents your
| theory as of now)
|
| However, what is clunky tech today can be scaled and effective
| tech tomorrow, so maybe your argued future is possible, if not
| likely.
|
| https://www.mitre[.]org/news-insights/publication/decipherin...
| AtlasBarfed wrote:
| Basically, the trend is you have human rights.
|
| But moving anywhere at all by any means at all is a privilege.
| Driving is a privilege, walking is a privilege, flying is a
| privilege, biking is a privilege.
|
| Of course electronic payment systems are a privilege, health care
| is a privilege, internet is a privilege, school is a privilege,
| jobs are a privilege.
| stego-tech wrote:
| Not a bad piece, all told, though the general practical advice
| hasn't really changed in the decade-plus since I last touched the
| stuff: stop looking up (in general), keep as much of your face
| obscured as practical, try mixing up patterns to make it
| difficult for algorithms to match you over time, know where
| cameras are and how to avoid them, and if you do have to enter a
| known surveillance area, exit it as quickly and discreetly as
| possible - and adjust outfits between surveillance areas if
| you're particularly paranoid.
|
| That said, let me just help dash any hopes of fooling government
| surveillance right now. Any competent Nation State that has an
| axe to grind with you specifically, already has you in their
| dragnet. They already have enough information to match your face
| in grainy analog B&W surveillance footage from an ancient grocery
| store camera. You're not beating those short of significant
| cosmetic surgery or prosthetics of some sort, and even then, if
| they want you badly enough then they'll just pull partial prints
| off something you touched and validate that way.
|
| Always remember the first rule of security: if someone really
| wants something you have badly enough, there's nothing you can do
| to stop them. With that in mind, plan accordingly. It's why I
| don't go to protests myself, or otherwise engage with events
| where I know facial recognition tech is deployed: I'm _in_ that
| data set, _multiple_ times, with _pristine_ reference materials,
| simply by virtue of past work (not including the updates via
| passport photos or Global Entry access). My safest bet is simply
| not to put myself in that position in the first place, and
| that's likely yours as well.
| cbanek wrote:
| Between not wanting to be seen and sun protection, I'm tempted to
| go full Burka (even though I'm not religious).
| cabirum wrote:
| Changing your face doesn't matter. You will simply not be
| allowed to enter some areas without a successful scan.
| interludead wrote:
| I think we should push for legal frameworks that govern biometric
| data collection and usage
| SV_BubbleTime wrote:
| The people you may need to protect yourself from might be the
| people writing and enforcing the laws. What you need is a
| deterrent against people abusing these systems.
| briandear wrote:
| Even more than the face, gait recognition is hard to fool. A
| person's gait is as unique as a fingerprint.
| AlfredBarnes wrote:
| I would be very surprised if every large grocery store isn't
| already tracking every customer's movements. It would be
| relatively cheap to implement.
| resource_waste wrote:
| Ready for the cocktail:
|
| >You walk in the store and are ID'd on camera
|
| >You buy everything and use your credit card, which is linked
| to your ID, since the checkout has a camera too.
|
| >You use your email once, or your address once, with that same
| credit card, all connected... now they have your email and home
| address. Your significant other has the same address?
| Everything is linked.
|
| If you really want to get crazy, you can combine voting records
| too. Based on primary ballot numbers, you can figure out if
| someone voted D or R in the primary with an address.
|
| Imagine all the stuff you can get from an email address too..
| exabrial wrote:
| You need to:
|
| 1. change the distance of your eyes from the center of your
| face by a random amount
|
| 2. move both eyeballs up or down a random amount
|
| This will defeat the vast majority of simple systems. However,
| there are far more sophisticated ones that are slower and
| require more resolution:
|
| 1. mess with your jaw line, cheekbones, nose bones, and the
| depth your eyes sit inside your head
|
| Finally, the creme de la creme, by which even identical twins
| are as different as dogs and cats:
|
| 1. get the whites of your eyes tattooed with new vasculature.
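|
| A toy sketch of the kind of geometry a simple matcher keys on
| (all coordinates are made up, and modern deep-learning systems
| learn far richer features than this one ratio):
|
|     import numpy as np
|
|     def eye_ratio(lm):
|         """Interpupillary distance over outer face width."""
|         ipd = np.linalg.norm(lm["right_eye"] - lm["left_eye"])
|         width = np.linalg.norm(lm["jaw_right"] - lm["jaw_left"])
|         return ipd / width
|
|     base = {"left_eye":  np.array([60.0, 80.0]),
|             "right_eye": np.array([120.0, 80.0]),
|             "jaw_left":  np.array([30.0, 150.0]),
|             "jaw_right": np.array([150.0, 150.0])}
|     print(f"baseline ratio: {eye_ratio(base):.3f}")
|
|     # Shift the *apparent* right-eye centre a few pixels (the
|     # contact-lens / makeup idea upthread) and compare.
|     shift = np.array([6.0, -4.0])
|     shifted = dict(base, right_eye=base["right_eye"] + shift)
|     print(f"shifted ratio:  {eye_ratio(shifted):.3f}")
|
| Whether a few pixels of apparent shift is enough to push a real
| system past its match threshold is an open question; the sketch
| only shows which measurement is being perturbed.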
| tartoran wrote:
| Shouldn't gait play a role in identifying people in addition to
| facial recognition? Someone was suggesting dropping a small
| pebble into one of the shoes (or both) to change the natural
| walking pattern.
| crazygringo wrote:
| It _could_, but I don't think it _does_. Has anyone built a
| gait recognition system? It would be tricky because gait also
| varies simply depending on your shoes, whether you're wearing a
| heavy backpack, whether you're rolling a suitcase, etc.
|
| It's also actually really easy to change your gait if you want.
| Just watch someone, and then copy how they walk. Start by
| paying attention to whether they hold their more-fixed "center"
| of movement in their chest, abdomen, or waist (or somewhere in
| between), then match their degree of stiffness or sway, and
| you're most of the way there. It's a pretty common acting
| exercise.
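|
| For what it's worth, the simplest gait statistic is easy to
| sketch: cadence, estimated from the vertical bob of a tracked
| point. The signal below is synthetic; real gait-recognition
| pipelines use full pose sequences, and shoes, load, or
| deliberate mimicry shift exactly these kinds of features:
|
|     import numpy as np
|     from scipy.signal import find_peaks
|
|     fps = 30
|     t = np.arange(0, 10, 1 / fps)    # 10 s of "video"
|     step_hz = 1.8                    # assumed steps per second
|     rng = np.random.default_rng(0)
|     bob = (0.03 * np.sin(2 * np.pi * step_hz * t)
|            + rng.normal(0, 0.003, t.size))
|
|     # One peak of the vertical bob per step.
|     peaks, _ = find_peaks(bob, distance=int(fps * 0.3),
|                           height=0.015)
|     cadence = len(peaks) / t[-1]
|     print(f"estimated cadence: {cadence:.2f} steps/s "
|           f"(true {step_hz})")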
| ItDoBeWimdyTho wrote:
| I searched Google Scholar for "gait surveillance" articles
| published since 2023 and got 12,000 results. I'd be willing to
| bet some of those systems are in operation at this point.
| debugnik wrote:
| > Has anyone built a gait recognition system?
|
| Years ago it was announced that some Chinese cities would
| use gait recognition for surveillance, but I don't know if
| the deployment stuck. I remember a video showing off the
| tech, although I can't look for it right now.
| BurningFrog wrote:
| Must be orders of magnitude harder, since it needs video
| instead of just one photo.
|
| That said, I'm sure it exists.
| crazygringo wrote:
| What about just a prosthetic nose that's a bit wider and
| longer, blended in with makeup? I always assumed that's the
| easiest thing to change that would definitively mess with the
| metrics.
| BurningFrog wrote:
| Sunglasses are a simpler way to obfuscate your eyeball metrics.
| Clubber wrote:
| To circumvent facial recognition, wear a mask. Nearly all of the
| BLM rioters wore masks and very few (if any) were caught. Most of
| the J6 people didn't wear a mask and almost all of them were
| caught. Wear a simple surgical mask like those that were common
| during covid.
| brodouevencode wrote:
| The timing of this, with AI/FR being hotly reported as a
| technology used in the manhunt following the UnitedHealthcare
| CEO shooting, is kinda gross.
|
| But such are the times.
| tivert wrote:
| Remember when technology was going to liberate the common man? It
| turns out the tyrants are almost always in a better position to
| use it for tyranny.
| resource_waste wrote:
| Eh, they can't control messages as well. Communication worldwide
| is easier than ever.
|
| But yes, weapons are stronger too.
|
| Things aren't black and white.
| tivert wrote:
| > Eh, they cant control messages as well. Communication
| worldwide is easier than ever.
|
| They totally _can_ (see the People's Republic of China); it's
| just that in many places the authorities have _not yet_ chosen
| to.
|
| It's pure fantasy to think you'll be able to run around and
| resist (or even just ignore) Big Brother with a cell phone
| running Signal.
|
| Computer technology has an asymmetry that, despite 90s-era
| propaganda, actually favors the tyrants. It's time we
| acknowledge that.
|
| > But yes, weapons are stronger too.
|
| Not in any way that's meaningful when discussing tyranny.
| bookofjoe wrote:
| "Hum," a new novel by Helen Phillips, addresses this question
| precisely.
|
| The premise: A woman who's not well off financially after losing
| her job signs up for a study in which an advanced robot
| surgically alters her face ever so minimally so as to use her as
| a test case for the company's state-of-the-art/bleeding edge
| (sorry) facial recognition software.
|
| She signed up because, having become unemployed with no
| prospect of future employment, with a husband whose gig-
| handyman job is mostly pest control and pays terribly, and with
| two young children, she fears being evicted from their
| apartment.
|
| The study offers a huge payment in advance, enough for their
| family to live in comfort for 10 months without any other income
| source.
|
| One problem soon becomes apparent: with her appearance altered
| ever so slightly, her family and everyone she knows are taken
| aback. She looks just like she used to, but somehow not quite;
| the study is intended to see how surveillance video handles
| faces in the uncanny valley -- by creating them.
|
| NO -- I have not ruined the book if you're thinking about reading
| it: my introduction above happens early on, following which the
| story explodes in unexpected, compelling directions.
|
| This book is beautifully written: it's sci-fi, the sixth book by
| a highly regarded and awarded novelist.
|
| Read the first 19 pages (of 244) here:
| https://www.amazon.com/Hum-Novel-Helen-Phillips/dp/166800883...
| hcaz wrote:
| Added to my read-list
| pempem wrote:
| Immediate add! This is so interesting
|
| Hopefully folks understand that this is dystopian rather than a
| roadmap to their next product proposal
| nonameiguess wrote:
| I don't suppose anyone here knows the answer, but claims of
| matching accuracy like this make me wonder why basic touch ID so
| often fails and I need to delete my fingerprints and re-enroll. I
| always figured it was because of rock climbing tearing up my
| fingers and making the prints gradually different enough that
| they no longer match. Is it really easier to fool a fingerprint
| match than a face match? Or was I just wrong all along and the
| sensors suck? But if the sensors suck, why does deleting and re-
| enrolling work?
| LinuxBender wrote:
| _How much do I need to change my face to avoid facial
| recognition?_
|
| Taboo opinions inspired by W.O.P.R. Avoid playing the game:
|
| - Stay clear of areas with cameras when possible. _Revenue
| impacting._
|
| - Do Zoom or Jitsi calls with businesses and associates when you
| can.
|
| - Become self sufficient. Stop spending money when it is not
| required and have healthy groceries delivered to you. _Reduce tax
| revenue._
|
| - Work from home if your company permits it. _Go mostly off
| grid._
|
| - Hire someone to run errands for you when they cannot be
| avoided. _Pay cash to a neighbor's kid to run into town._
|
| I know none of this will be popular with anyone but I am _that
| guy_.
| wkat4242 wrote:
| > I think during the pandemic they changed the systems to rely
| heavily on the shape of people's eyes, because so many people
| were wearing masks over their noses and mouths. I don't honestly
| know how people could realistically change the shape of their
| eyes to fool these systems.
|
| Eh party contacts maybe? I use those a lot.
| nitwit005 wrote:
| > I think you could not realistically change your face to fool
| state-of-the-art facial recognition. I think during the pandemic
| they changed the systems to rely heavily on the shape of people's
| eyes, because so many people were wearing masks over their noses
| and mouths. I don't honestly know how people could realistically
| change the shape of their eyes to fool these systems.
|
| There are multiple common cosmetic surgeries that involve eye
| shape.
|
| > And now your face won't match your driver's license or
| passport, so traveling will be really difficult for you. So,
| honestly, why bother?
|
| My driver's license photo went without an update for over a
| decade. I didn't look remotely similar to my teenage self, and
| not a single person cared, except for one airport security
| person who commented on how old the photo was.
___________________________________________________________________
(page generated 2024-12-09 23:01 UTC)