[HN Gopher] Face recognition is being banned, but it's still eve...
       ___________________________________________________________________
        
       Face recognition is being banned, but it's still everywhere
        
       Author : laurex
       Score  : 202 points
       Date   : 2021-12-22 17:13 UTC (1 day ago)
        
 (HTM) web link (www.wired.com)
 (TXT) w3m dump (www.wired.com)
        
       | midjji wrote:
        | Seems rather obvious to me that this will remain the case.
       | 
        | Both encryption and face recognition software rely on math and
        | programming simple enough that a single person can build a
        | functional implementation from scratch in less than a year.
        | Using existing open source frameworks, anyone can put one
        | together in a few days (see the sketch at the end of this
        | comment); collecting a face database takes slightly longer,
        | though in most cases anyone with an existing app of some kind
        | can collect the data largely automatically. Admittedly I would
        | not trust an encryption framework put together like that, but
        | when it comes to face recognition I would.
       | 
        | So the question of whether governments will be able to keep
        | non-sanctioned face recognition systems out of the public
        | sphere is rather similar to the question of whether they are
        | able to prevent the use of non-sanctioned encryption.
       | 
       | How is that going these days?
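        | 
        | A minimal sketch of what I mean, using the open source
        | face_recognition library (the library choice, file names and
        | tolerance here are illustrative assumptions, not anything from
        | the article):
        | 
        |     import face_recognition
        | 
        |     # Tiny "database": one embedding for one known person.
        |     alice = face_recognition.load_image_file("alice.jpg")
        |     alice_enc = face_recognition.face_encodings(alice)[0]
        | 
        |     # Check every face found in a new photo against it.
        |     photo = face_recognition.load_image_file("crowd.jpg")
        |     for enc in face_recognition.face_encodings(photo):
        |         hit = face_recognition.compare_faces(
        |             [alice_enc], enc, tolerance=0.6)[0]
        |         if hit:
        |             print("match for alice")
        | 
        | That is essentially the whole pipeline; scaling it up is mostly
        | a matter of collecting more reference images.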
        
       | pueblito wrote:
       | > CBP says it has processed more than 100 million travelers using
       | face recognition and prevented more than 1,000 "imposters" from
       | entering the US at air and land borders.
       | 
       | I wonder how many of those 1000 were incorrectly bounced back
        
         | diebeforei485 wrote:
         | At least at airports, CBP never bounces anyone back at the
         | first instance. At the immigration counters, the dude either
         | lets you in or sends you to secondary.
        
         | PragmaticPulp wrote:
         | The software only flags people for manual review. It's not an
         | automatic software-only rejection.
        
           | 0xcde4c3db wrote:
           | If a selection process is screwed at the outset, I don't
           | think one can generally un-screw it with any amount of post-
           | hoc analysis or manual review.
        
             | karaterobot wrote:
             | Your comment seems unrelated to the comment you're
             | responding to. That comment is about automatic rejection,
             | not selection.
             | 
             | But speaking of selection, there is no perfect process in
             | any system involving humans, so whether it's a flawed
             | facial recognition algorithm, or a biased, tired,
             | overworked human doing it, having a manual review step and
             | post-hoc analysis is pretty useful.
        
               | 0xcde4c3db wrote:
               | My point is that by the time a biased selection happens,
               | much of the damage to the overall process is already
               | irreversible. If the algorithm is biased (and I'm not
               | saying it is, just that it's valid to ask the question),
               | making the final decision manual is more of a fig leaf
               | than an actual fix.
        
       | ctdonath wrote:
        | Can't stop an idea whose time has come.
        
       | 14 wrote:
        | If they ban facial recognition, then companies will start using
        | other tech like gait recognition. So do we want to ban facial
        | recognition specifically, or is it something more like the
        | right to not be tracked (if that even is a right)? I think we
        | need to establish what our end goal is, because clever
        | companies will find a way around such restrictions in other
        | creative ways. They will use shoe detection instead, or
        | something maybe not as accurate but enough to get them around
        | the law while still detecting a percentage of criminals. Or
        | jewelry detection, or tattoos, or whatever isn't banned by law.
        
         | petermcneeley wrote:
          | This is the same issue I have with 'right to repair' laws.
          | When you create a tangle of specific laws banning or
          | regulating specific things, all you have really done is
          | create a world where he who has the most lawyers wins.
         | 
         | What is the end goal with banning face recognition? I think
         | this is all about asymmetry of information.
        
           | midjji wrote:
           | Switching from defending the right to privacy to the right to
           | not be exploited due to power/information asymmetry would
           | solve a great many problems.
        
       | bsenftner wrote:
        | Note the immature (on purpose?) nature of some of these bans:
        | some ban the local government only, some ban only one city of a
        | multi-city district, and nearly all of them leave the option of
        | hiring a private agency (with zero public oversight, because
        | they are private) to do the FR for them. And that private
        | agency happens to be owned by the step-son of one of the
        | lawmakers. This type of legal nonsense is everywhere they are
        | trying to regulate facial recognition.
        
         | toper-centage wrote:
         | The annoying part is that there's no opt out. And when there
         | is, it's obscure (you can express that to the staff), and makes
         | you look hella suspicious.
        
       | ordiel wrote:
        | As I heard "somewhere", 'now that they have forced you to wear
        | a mask, don't remove it ;)'
        | 
        | (I know the profiling works even with only the eye section, but
        | you get the gist)
        
         | LeifCarrotson wrote:
         | "I wear my sunglasses at night, so I can so I can..." avoid an
         | automated surveillance state.
        
           | [deleted]
        
       | wintorez wrote:
       | Once the genie is out of the bottle...
        
       | latchkey wrote:
       | I just flew to Dubai and back.
       | 
       | On the way in, they scanned my face at the immigration counter.
       | 
       | On the way out, you go through a locked turnstile that scans your
       | face and won't open until it recognizes you.
       | 
       | It took about 2 seconds for the doors to open. Pretty amazing and
       | creepy at the same time.
        
       | Andrew_nenakhov wrote:
       | I believe that the efforts to ban technologies such as this one
       | are futile. Pandora's boxes can only be opened, never closed.
        
         | eunos wrote:
         | "We need to live with surveillance"
        
         | earthnail wrote:
          | We banned mines, too, and it's been fairly successful. Not
          | 100%, sure, but just imagine a world where mines weren't
          | banned, and what Europe and the US would look like, and you
          | know the ban was very effective.
        
           | trasz wrote:
            | Most countries banned mines, but not the US:
           | https://en.m.wikipedia.org/wiki/Ottawa_Treaty
        
             | hutzlibu wrote:
              | To be fair, "most countries" doesn't include China or
              | Russia, either.
        
               | trasz wrote:
                | But of the Western countries, the only one that opted
                | out is the US.
        
               | [deleted]
        
             | 2468013579 wrote:
              | The Ottawa Treaty only bans anti-personnel mines, not
              | anti-vehicle or command-activated ones. The US bans
              | persistent anti-personnel mines, but not mines that only
              | last a day or so. This argument gets more convoluted once
              | you read up on it. As I mentioned in spaetzleesser's
              | comment below, the US is one of a few countries with a
              | policy banning persistent anti-vehicle mines (they ban
              | all persistent mines in general), so in some ways it is
              | more strict than others.
        
               | trasz wrote:
                | Anti-vehicle mines don't really matter; they pose no
                | threat to civilians.
        
               | Andrew_nenakhov wrote:
               | Definitely. It is known that civilians never use
               | vehicles.
        
               | trasz wrote:
               | "Anti-vehicle mines" are rally anti-tank mines, and as
               | such:
               | 
               | 1. They require tons of pressure to detonate; it's
               | adjusted for a tank, not ordinary car, and
               | 
               | 2. They don't end up in unexpected places - they are
               | placed on the roads, and thus are easy to find and
               | detonate.
        
               | Andrew_nenakhov wrote:
                | Any truck or bus exerts higher ground pressure than a
                | tank, because a wheel's footprint is much smaller than
                | that of tank tracks.
                | 
                | A very quick search also shows that deaths from anti-
                | vehicle mines number in the thousands each year. Even
                | on the UN's own website [1] we read:
               | 
               | > _The Secretary-General calls on all countries to also
               | regulate the use of anti-vehicle landmines. Such weapons
               | continue to cause many casualties, often civilian. They
               | restrict the movement of people and humanitarian aid,
               | make land unsuitable for cultivation, and deny citizens
               | access to water, food, care and trade._
               | 
               | I believe we can safely assume that anti-vehicle
               | landmines DO pose a deadly threat to civilians.
               | 
               | [1]: https://www.un.org/disarmament/convarms/Landmines/
        
             | spaetzleesser wrote:
             | That's a fact the US should be deeply ashamed of. I had no
             | idea how bad the land mine and unexploded bomb situation is
             | until I visited Cambodia and Laos where even 50 years after
             | the war people are still getting their limbs blown off.
        
               | giraffe_lady wrote:
                | The countries that didn't ban land mines are the same
                | countries that were, essentially, expecting to fight
                | wars in the near future.
        
               | 2468013579 wrote:
                | Read up on the difference between persistent and non-
                | persistent land mines [1]. The US only uses non-
                | persistent mines that usually last 24-72 hours. The "50
                | years later" issue does not apply, since persistent
                | mines haven't been commonplace for decades and were
                | officially outlawed in 2004 [2]. The only place the US
                | allows persistent mines is at the DMZ. There's also a
                | distinction between anti-personnel mines and anti-
                | vehicle mines, about which the US is clearer than
                | others. The Ottawa Treaty only bans anti-personnel
                | mines [3], so the US is one of few countries that has a
                | policy banning the use of persistent anti-vehicle
                | mines.
               | 
               | [1]
               | https://sites.duke.edu/lawfire/2021/04/12/understanding-
               | u-s-...
               | 
               | [2] https://2001-2009.state.gov/t/pm/wra/c11735.htm
               | 
               | [3] https://en.m.wikipedia.org/wiki/Ottawa_Treaty
        
               | trasz wrote:
               | I'm sure Russia and China have similar excuses :->
        
               | spaetzleesser wrote:
                | How many of the non-persistent mines will fail to
                | disarm and still be a danger? I would expect quite a
                | few.
                | 
                | In my mind they should have been banned like biological
                | weapons. And the US could take leadership.
        
         | andrepd wrote:
         | That's a cop-out. Regulation can happen, of course, if there's
         | political will for it.
        
         | nopenopenopeno wrote:
         | Even with all our efforts, murder still happens, so I guess
         | it's time to throw the towel in on that as well.
        
       | darkwater wrote:
        | Looks like it's being deployed more every day, rather. For
        | example, a document-less airport experience pilot[1] is being
        | tested in Barcelona (Spain).
       | 
       | [1] https://identityweek.net/josep-tarradellas-barcelona-el-
       | prat...
        
       | flimflamm wrote:
        | Should police confiscate iPhones in those cities where face
        | recognition is banned?
        
         | raverbashing wrote:
         | Face detection (that is, only identifying a face in a picture)
         | is not face recognition (identifying the person behind the
         | face).
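          | 
          | For illustration, detection on its own looks something like
          | this (a rough sketch with OpenCV's stock Haar cascade; the
          | file name is just an example). It returns boxes, not
          | identities:
          | 
          |     import cv2
          | 
          |     # Stock face *detector*: finds faces, knows no names.
          |     det = cv2.CascadeClassifier(
          |         cv2.data.haarcascades
          |         + "haarcascade_frontalface_default.xml")
          | 
          |     img = cv2.imread("photo.jpg")
          |     gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
          |     faces = det.detectMultiScale(gray, 1.1, 5)
          |     for (x, y, w, h) in faces:
          |         cv2.rectangle(img, (x, y), (x + w, y + h),
          |                       (0, 255, 0), 2)
          | 
          | Recognition would add a second step: computing an embedding
          | for each detected face and matching it against a database of
          | known people.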
        
         | ben_w wrote:
          | That depends on what the law says. Some of the pressure here
          | is to stop the police themselves from using the tech; I
          | wouldn't know if any given law prevents personal use.
        
         | onethought wrote:
         | It's a government ban. Not a generalised ban.
        
           | flimflamm wrote:
            | Sorry, I am outside of the US. What's the difference?
        
             | lajamerr wrote:
              | Government ban means banned for use by government
              | officials/entities. So it doesn't really apply to the
              | general public or corporations (unless the corporation
              | specifically has a contract with the same requirements
              | to do work on behalf of the government).
        
         | bayindirh wrote:
          | You should collect Sony Alpha cameras too. Many of them have
          | a preferential focus feature which uses face recognition.
          | 
          | You add up to five faces, prioritized, and it focuses on them
          | if it recognizes them in a crowd of people.
        
           | sib wrote:
           | Wow - didn't know about this preferential focus feature.
           | Sounds extremely useful...
        
           | namibj wrote:
           | TIL, and this sounds actually rather useful. Do you have
           | information on which ones would support that? Especially in
           | the range a curious hacker (who else is HN for?) would be
           | interested in.
        
             | bayindirh wrote:
              | It's especially useful in crowd gatherings, group photos
              | and such. I think it's rather designed for concerts,
              | weddings, etc.
              | 
              | My A7-III has it. Since it uses the Eye-AF and Face-AF
              | pipelines (it generates a face model from the photo you
              | take, for that feature only), it needs the AI autofocus
              | stack inside the camera. The A7C should have it, and
              | maybe the latest APS-C lines (the 6600, for example) have
              | it.
              | 
              | Sony's AF tech is insane. Point at a person with
              | sunglasses, and it marks the eye instantaneously. What
              | actual sorcery is this?
        
               | novok wrote:
                | Well, the goal is to recognize a face, not to match a
                | specific face to a specific person. So you can train
                | an AF model to center focus on the center of the
                | sunglasses. Even with the 'track these specific faces'
                | feature, since you're still not trying to uniquely
                | identify someone, the model can latch on to generic
                | features like "red sunglasses, brown skin, surgical
                | mask & this general shape".
        
       | fortran77 wrote:
        | I don't care too much about face recognition for check-in. But
        | I don't like it for bag drop. If it mismatches me, I may never
        | see my bag again!
        
       | kotaKat wrote:
       | Hell. Walmart's security DVRs by Verint all have facial
       | recognition enabled on the firmware, along with other security
       | analytics options.
        
         | reaperducer wrote:
         | The apartment building I lived in from 2009-2010 had facial
         | recognition.
         | 
         | When you walked through the main lobby, your face appeared on a
         | screen for the inside doorman so he could grant you access to
         | the elevator lobby. It also showed your name, apartment number,
         | and lease end date.
         | 
         | That was over a decade ago. I can't imagine what new tech must
         | look like.
        
       | surajs wrote:
       | Everything is touching everything
        
       | bigodbiel wrote:
        | Pre-crime facial recognition surveillance is authoritarianism.
        | Post-crime facial recognition surveillance is investigative
        | process. The cat is out of the bag; the only thing that
        | prevents its inevitable abuse is a strict legal framework for
        | its implementation.
        
         | nirui wrote:
         | > strict legal framework for its implementation
         | 
          | And that might turn this technology into something that only
          | big companies can afford, artificially creating a
          | pro-monopoly environment.
         | 
         | The problem is complex.
        
           | pessimizer wrote:
           | Not really. I don't mind there being a near-monopoly on
           | facial recognition software if it takes a large company to
           | handle the needed guidelines and potential liabilities.
        
             | sailfast wrote:
             | One doesn't mind until it turns on one, typically. It's all
             | fine until the monopoly stops operating in good faith and
             | it's too late to stop it.
        
               | Jarwain wrote:
               | Well that's what the strict legal framework is for,
               | right? To ensure ethical operation instead of relying on
               | good faith?
        
             | uncletammy wrote:
             | Replace the words "facial recognition software" with
             | "location data", a comparable commodity in this context,
             | then it's immediately clear what would go wrong. See
             | https://www.eff.org/deeplinks/2021/08/its-time-google-
             | resist...
             | 
             | If the bureaucratic obstacles for using a technology are so
             | high that only very well funded companies can overcome
             | them, then those companies effectively form a cartel with
             | the state in the use and abuse of those technologies.
             | 
             | If citizens had stronger protections against the
             | application of the third-party doctrine in the US, we might
             | have less to fear. Currently though, the greatest oppressor
             | of personal freedoms through technology is the United
             | States Government. They can and will use "guidelines and
             | potential liabilities" to weaponize new technology while
             | making it next to impossible to counter the threat.
        
           | midjji wrote:
            | The problem with a strict legal framework for
            | implementation is that it takes one person just a few days
            | to get a face recognition system working using generic
            | frameworks for deep learning.
            | 
            | The laws can be as strict as you like, but it's like
            | introducing a law saying you can't watch nude cartoons at
            | home. Even if you somehow eliminated all the existing
            | ones, it's just a pen and paper away.
        
             | kilburn wrote:
             | Is that really a problem from a legal perspective?
             | 
             | There are plenty of things that are easy to do but have
             | strict legal frameworks around them. Quick examples include
             | copyright laws, use of force/violence and driving rules.
        
               | midjji wrote:
                | And how well are those laws preventing these actions?
                | Long term, there are few things more damaging to a
                | society's justice system than widespread breaking of
                | laws. Let it go on long enough, and people stop taking
                | any law seriously, while at the same time political
                | reasons for prosecution start dominating.
        
           | giraffe_lady wrote:
            | When did we decide that this should have a low barrier to
            | entry? Is there any reason to think increased competition
            | between surveillance providers will lead to more ethical
            | surveillance?
           | 
           | This is not a market problem at all, I don't see why you'd
           | use market-brain constructs like monopoly to engage with it.
           | 
           | One provider or thousands, the problem is the social dynamics
           | and the power structures, not a product-consumer
           | relationship.
        
         | sneak wrote:
         | There's really no chance of that happening in the USA within a
         | generation. The US federal government has decided that it alone
         | is trustworthy, and that everyone and everything is a potential
         | threat, and that it is entitled to unlimited and unrestricted
         | surveillance of the entire country (and several others),
         | regardless of what is or is not written into law.
         | 
          | Snowden even blew the lid off of it, and _nothing changed_.
          | That's how you know it's permanent.
        
           | nopenopenopeno wrote:
           | I'm not sure why you're being downvoted. If anything, your
           | comment is sadly an understatement.
        
         | [deleted]
        
         | deepstack wrote:
         | > strict legal framework for its implementation and strict
         | legal framework enforcement
        
         | andrepd wrote:
         | It's also trivially easy to regulate. Mandate that all cameras
         | recording public spaces record to encrypted storage. The key is
         | in possession of the judicial system. It can only be decrypted
          | and examined _with a warrant_, with probable cause, as you
         | would need for wiretapping or raiding a home or searching
         | someone's computer.
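          | 
          | A minimal sketch of that arrangement, assuming the camera is
          | provisioned with only the court's RSA public key (the key
          | file, function name and library choice are illustrative, not
          | part of the proposal):
          | 
          |     from cryptography.fernet import Fernet
          |     from cryptography.hazmat.primitives import (
          |         hashes, serialization)
          |     from cryptography.hazmat.primitives.asymmetric import (
          |         padding)
          | 
          |     # The camera only ever holds the court's *public* key.
          |     court_pub = serialization.load_pem_public_key(
          |         open("court_public_key.pem", "rb").read())
          | 
          |     def encrypt_clip(clip):
          |         # Fresh symmetric key per clip; only the court's
          |         # private key can unwrap it later.
          |         clip_key = Fernet.generate_key()
          |         ciphertext = Fernet(clip_key).encrypt(clip)
          |         oaep = padding.OAEP(
          |             mgf=padding.MGF1(algorithm=hashes.SHA256()),
          |             algorithm=hashes.SHA256(), label=None)
          |         wrapped = court_pub.encrypt(clip_key, oaep)
          |         return wrapped, ciphertext
          | 
          | The camera stores only (wrapped, ciphertext); reviewing the
          | footage then requires whoever holds the private key, i.e. a
          | warrant.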
        
           | aspaceman wrote:
           | Now explain that to a lawyer.
           | 
           | They won't get it. Trust me.
        
           | brk wrote:
           | Hardly sounds trivial, and how would you ever enforce or
           | manage that? Also, that is totally counter to the current
           | (US) laws that basically state you can record video (not
           | audio) of any publicly viewable areas freely.
           | 
           | What you propose would require massive changes to essentially
           | any camera that has any part of its view covering an outdoor
           | area, and probably many indoor areas (malls, etc.).
           | 
           | Also, face recognition typically runs on live streams to
           | build indexes, so encrypting storage would not do much. You'd
           | need to implement something like an HDMI-style encryption to
           | control what devices/processors/whatever can connect to a
           | live video stream from a camera to try and control exactly
           | how the streams are processed.
        
             | alkonaut wrote:
             | Where I am the law is very strict although somewhat
             | arbitrary. I can shoot a video of a public place. I can
             | publish the material (although commercial use requires sign
             | off naturally).
             | 
             | But here is the interesting bit: I can't set up a fixed
             | camera to record a public space without permission. And I
             | won't get that permission. Meaning basically that recording
             | in public spaces is only done by humans, limiting the scope
             | of how widespread it can be. I like this.
             | 
                | It also means that I'm not allowed to put up a camera
                | on my porch that covers the street in front of my
                | house. I suspect a lot of Ring/Nest users are in
                | violation.
        
               | 14 wrote:
                | Sounds like Japan. But in most countries that is not
                | the case. People want to protect their property. Also,
                | with hidden cameras it would be trivial to record and
                | not be detected, so in your country those who want to
                | record are already doing it, just quietly. So again it
                | is a law that only affects non-criminals.
        
               | VWWHFSfQ wrote:
               | Do you have the text of the law for this? I suspect
               | you're overstating the restrictions.
        
               | alkonaut wrote:
                | Which part of the law? That I can't set up a camera on
                | my house to film a street crossing?
                | 
                | That's almost impossible to get permission for. A
                | private citizen can _not_ get permission to film a
                | street corner. The difficulty in getting permission
                | obviously isn't written in the legal text, but
                | individuals don't get permission and businesses only
                | very rarely do.
               | 
               | https://www.riksdagen.se/sv/dokument-
               | lagar/dokument/svensk-f...
        
               | dgfitz wrote:
               | Don't tell Nest...
               | 
               | Seriously, this means every home outdoor security camera
               | is potentially illegal. This can't be true.
        
               | alkonaut wrote:
                | Of course it is. You can't put up a camera on your
                | porch and shoot your driveway _and_ part of the
                | street. To follow the law you have to point it so it's
                | only covering your driveway or lawn and no public
                | space. It's pretty simple to restrict the field of
                | view with a screen in front, or a bit of tape over the
                | camera if needed.
                | 
                | Does everyone do that? Probably not. But that doesn't
                | change how the law is written.
        
               | VWWHFSfQ wrote:
               | you keep saying this and you don't actually provide the
               | text of the law
        
               | herbstein wrote:
               | They did. Two comments ago. But it's in Swedish so you
               | probably won't be able to read it.
        
             | exo762 wrote:
             | > Hardly sounds trivial, and how would you ever enforce or
             | manage that?
             | 
              | Spitballing. Enforcement: make it a criminal offense to
              | distribute CCTV video that a court has not ordered
              | disclosed.
             | 
             | > Also, that is totally counter to the current (US) laws
             | that basically state you can record video (not audio) of
             | any publicly viewable areas freely.
             | 
              | This is irrelevant, because scale changes the nature of
              | things. A single photo of your person is just that; 24
              | photos per second for every second of your life is total
              | surveillance. The law was clearly about recording usage
              | that does not amount to surveillance.
             | 
             | > Also, face recognition typically runs on live streams to
             | build indexes, so encrypting storage would not do much.
             | 
             | Encrypt the stream on the camera itself. Storage is cheap.
        
               | brk wrote:
               | >Spitballing. Enforcement: make distribution of CCTV
               | video undisclosed via court order criminal offense.
               | 
               | The same way gun ownership laws have curbed the illegal
               | gun ownership/use problem? Or, do we just keep stacking
               | laws endlessly hoping one of them actually works?
               | 
               | >24 photos per second for every second of your life is a
               | total surveillance.
               | 
               | Sure but we're nowhere near that point, or likely to be
               | at that point. Also, almost nothing records at 24fps,
               | 15fps is more common.
               | 
               | >Encrypt the stream on the camera itself. Storage is
               | cheap.
               | 
                | In most cases the camera and the recording/viewing
                | software come from different companies, so you'd still
                | need some kind of key management system.
        
               | jrmg wrote:
               | _The same way gun ownership laws have curbed the illegal
               | gun ownership /use problem? Or, do we just keep stacking
               | laws endlessly hoping one of them actually works?_
               | 
                | There are legit arguments to be had about personal
                | freedom here, but it's plainly untrue that regulation
                | intrinsically can't work. It works for many, many
                | things - and works for gun ownership in basically
                | every other developed nation in the world.
        
               | zo1 wrote:
               | And how does one view the feed from the camera if it's
               | encrypted? That's the whole point of the camera itself.
               | 
               | Anywho, sounds like we're jumping through hoops without
               | really understanding the requirements. Like, what is
               | _really_ bad about recording people in public? What is
               | _really_ bad about performing facial recognition?
               | 
               | But I'd go a step further, what is it that we're trying
               | to prevent from happening by making facial recognition
               | illegal? This is the juicy part and the one where the
               | "problem" becomes wishy-washy. We get reasons like:
               | "Prevent stalking by government officials", or "stop
               | widespread surveillance from...[something]". All present
               | their own challenges and implied slippery-slopes, but all
               | have different ways of being solved without necessarily
               | making public recording and facial recognition blanketly
               | illegal.
        
           | adolph wrote:
           | All that seems trivial to type but less than trivial to
           | implement.
        
             | dunefox wrote:
             | So?
        
           | karmicthreat wrote:
           | If someone throws a rock through a window, why shouldn't it
           | be legal to run the surveillance through facial recognition?
           | A crime was committed, we have a pic of the perpetrator.
        
             | netizen-936824 wrote:
             | Because recognition is incredibly flawed and biased and
             | leads to false accusations ruining lives and costing people
             | tens of thousands of dollars
        
               | zo1 wrote:
               | Everything is flawed and leads to false accusations, even
               | something ridiculously black and white like electronic
               | bank records. Witnesses, intoxication, bias, outright
               | lying, guilt, etc. All evidence has flaws and potential
               | bias. One need only look at all the false imprisonments
               | that have happened over the years due to various bits of
               | "evidence" to see that.
               | 
               | Instead, we should take the opposite approach: Invest
               | heavily in this tech, and lightly-regulate glaringly bad
               | aspects of it. E.g. For facial recognition, we can put
                | down laws that punish the unfair treatment of
                | suspects. Or if we find employers misappropriating
                | facial recognition that was deployed to record hours
                | worked on the factory floor, using it instead to
                | punish workers for chatting or going to the bathroom
                | too many times, and we don't like that, then we
                | regulate that.
               | 
               | We really went down the wrong path here somewhere,
               | applying a black and white approach to things instead of
               | just riding the in-between strategically and fairly.
               | That's how we move forward as a society instead of
               | legislating ourselves into irrelevance.
        
               | tomschlick wrote:
                | Facial recognition should never be considered
                | evidence. It is a clue: a way to sift through tens of
                | thousands of possible entries and narrow them down to
                | five for a human to review. We need a legal framework
                | to ensure it's not used as evidence by itself, but the
                | investigative tool is very useful.
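                | 
                | As a rough sketch of that narrowing step (assuming you
                | already have face embeddings; the function and
                | variable names are made up for illustration):
                | 
                |     import numpy as np
                | 
                |     def shortlist(probe, gallery, names, k=5):
                |         # probe: (d,) embedding from the scene image
                |         # gallery: (n, d) known-face embeddings,
                |         # L2-normalised
                |         probe = probe / np.linalg.norm(probe)
                |         scores = gallery @ probe  # cosine similarity
                |         top = np.argsort(scores)[::-1][:k]
                |         return [(names[i], scores[i]) for i in top]
                | 
                | The output is a ranked handful of candidates for a
                | human investigator to check, not a verdict.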
        
               | karmicthreat wrote:
                | True, it should not replace an eyes-on review of
                | matching suspects. But it is a useful tool to reduce
                | the pile you need to go through.
               | 
               | Really, facial recognition just isn't getting weighed
               | properly judicially.
        
               | kbenson wrote:
                | So is eyewitness testimony. That's why we have trials.
                | The problem in the case of a crime like that is not
                | that the technology is used, it's when people assume
                | the technology is infallible, when there's plenty of
                | evidence to the contrary (just as there's plenty of
                | evidence that eyewitness testimony is not infallible
                | and is often subject to a bunch of problems).
               | 
                | If facial recognition is perceived as low accuracy,
                | but can yield some leads for investigators that can be
                | independently corroborated, that seems like a fine use
                | of the technology. If we're worried about the public
                | assuming it's more accurate than it is when used as
                | evidence in trials, we can either pass some laws about
                | its use as trial evidence (which is not the same as
                | using it as a lead), or train defense attorneys and
                | the public (often done through TV...) that its use in
                | the role of proving guilt is extremely limited because
                | of its false positive rate.
        
           | the_pwner224 wrote:
           | > The key is in possession of the judicial system. It can
           | only be decrypted and examined with a warrant, with probable
           | cause, as you would need for wiretapping
           | 
           | That's how you end up with the NSA and CIA being inside every
           | single camera in the country :)
        
             | vincnetas wrote:
             | Including your phone ;)
        
             | antris wrote:
             | As if they aren't already.
        
               | VWWHFSfQ wrote:
               | are they? or is this baseless speculation
        
               | A_non_e-moose wrote:
                | Not quite baseless; more like speculation with
                | established precedent, motive and opportunity.
        
               | antris wrote:
               | Snowden leaks
        
               | VWWHFSfQ wrote:
               | did the snowden leaks say they were in every camera
        
               | antris wrote:
               | For practical purposes, yes. The leaks showed how NSA has
               | built a multi-layered data harvesting-archiving-searching
               | machine that works not only through compromised hardware
               | (including cameras), but also big company
               | infrastructures, phone/email/SMS/internet browsing
               | records and content, fiber optic cable tapping, hacking,
               | installing bugs, spying and more.
               | 
                | If the camera itself is not bugged, its output is
                | likely harvested at another step at some point, and
                | it's clear that if the NSA wants to see what a camera
                | sees, they are able to tap it if needed. Sure,
                | untapped cameras exist, but it doesn't really make a
                | practical difference. The NSA will still have your
                | information if it wants it, and likely already has
                | most of it.
        
               | vngzs wrote:
               | I don't know how long the records persist, but presumably
               | for things like video surveillance footage the decay
               | period would be quite fast. For full-text contents, less
               | speedy but still fast. I can only truly envision long-
               | term collection and storage of metadata - and even then,
               | it's a big question how much is feasible and reasonable
               | to store indefinitely.
               | 
               | It's likely that for your security camera footage to be
               | accessed you would have to be targeted. It's likely that
               | such targeting would only affect footage in the future or
               | very recent past. But I'd grant that pervasive attackers
               | can probably capture exponentially-decaying single video
               | frames from millions of cameras if they were
               | appropriately motivated, and those single frames could go
               | back quite far.
               | 
               | The real problem here is the existence of such a system
               | creates a kind of panopticon [0], with chilling effects
               | not only on activity and discussion contrarian to the
               | current administration, but also any future
               | administration that may have access to electronic
               | surveillance records. Without knowing how long records
               | are kept, it is quite plausible that a future
               | authoritarian state will misuse past records to target
               | civilians.
               | 
               | [0]: https://en.wikipedia.org/wiki/Panopticon
        
             | hutzlibu wrote:
             | You can regulate the NSA and CIA too, if you really want.
        
               | pessimizer wrote:
               | Those agencies both monitor and retaliate against members
               | of Congress. The NSA and CIA regulate government, not the
               | other way around.
        
               | zip1234 wrote:
               | The NSA and CIA are both foreign focused. The FBI is
               | internally focused within the US.
        
               | throwaway19937 wrote:
               | Both agencies have a long history of illegal domestic
               | surveillance.
               | 
               | The NSA's warrantless surveillance program (https://en.wi
               | kipedia.org/wiki/NSA_warrantless_surveillance_(...) and
               | the CIA's Operation CHAOS
               | (https://en.wikipedia.org/wiki/Operation_CHAOS) are two
               | examples of this behavior.
        
               | lp0_on_fire wrote:
               | Tell that to John Kennedy.
        
               | the_pwner224 wrote:
               | In theory, yes. In practice... no, unfortunately. The
               | people of America seem to be fine with it.
               | 
               | Actually, as another commenter mentioned, they probably
               | already are doing it. It would be stupid to think
               | otherwise.
        
               | colinmhayes wrote:
               | >> If you really want
               | 
               | > The people of America seem to be fine with it.
               | 
               | The people don't care that they're being watched. That's
               | the main lesson I took from the snowden leaks. I think
               | most people might even like it because they think it
               | means there will be less crime.
        
       | 1cvmask wrote:
       | London is the surveillance capital of the world:
       | 
       | London is often called the CCTV capital of the world, and for
       | good reason. The city is home to hundreds of thousands of CCTV
       | cameras, and the average Londoner is caught on CCTV 300 times a
       | day.
       | 
       | https://www.cctv.co.uk/how-many-cctv-cameras-are-there-in-lo...
       | 
        | Facial recognition software is now being integrated into this
        | network. No one seems to mind at all. The same seems to be
        | true in most of the rest of the world.
        
         | ricardobayes wrote:
          | Not everywhere, though; in Spain it's illegal to film the
          | street. "In Spain, the law protects the rights of citizens to
          | use public spaces, which is free from interference (Clavell
          | et al., 2012)." If you want to live in a functioning
          | non-Orwellian society, move here.
        
           | andy_ppp wrote:
            | Spain seems really competent right now compared to the UK;
            | they even have a digital nomad visa for people outside the
            | EU.
           | 
           | With remote working being mainstream going forward I'm
           | certainly going to consider somewhere with more sunshine and
           | less harsh winters...
        
             | novok wrote:
              | They have a weak economy and were living under a
              | dictatorship within living memory. Such countries can do
              | well for a while once they free themselves from
              | dictatorship.
        
             | dmje wrote:
              | The other, easier option (although it doesn't solve the
              | sunshine problem...) is moving out of cities into rural
              | areas. Very few cameras where we live in Cornwall.
        
               | [deleted]
        
               | Yottaqubyter wrote:
                | If you do video calls through Teams or similar when
                | working remotely, internet speed would be a big issue,
                | coupled with the abandonment of a lot of rural areas
                | in Spain.
        
           | kenoph wrote:
           | Explain this to the folks at home: https://www.reddit.com/r/e
           | urope/comments/85qnx5/george_orwel...
        
             | ricardobayes wrote:
                | Yes, having one CCTV camera is exactly the same
                | situation as covering every square meter like in
                | London. /s
        
               | kenoph wrote:
               | I most certainly didn't imply that (:
               | 
                | But you did say it was illegal. Seems to me there are
                | exceptions if that picture is to be trusted.
        
         | brk wrote:
          | _London is the surveillance capital of the world_
         | 
         | That's only because China does not release much information
         | publicly about their surveillance systems, but many parts of
         | China are most likely far more surveilled than London.
        
       | danuker wrote:
       | The network effects of large databases lead to an imbalance of
       | power between surveillance and sousveillance: a personal face
       | recognition DB will never match a government one.
       | 
       | https://en.wikipedia.org/wiki/Sousveillance
       | 
        | As such, we cannot, say, catch not-so-publicly-known but well-
        | connected public officials where they should not be.
        
         | GuB-42 wrote:
         | Obviously, a government has more power and more means than an
         | individual.
         | 
          | But when it comes to databases, individuals have access to
          | ridiculously large databases too. With Facebook and Google
          | you can peek into the private life of most people; even the
          | police do it, because there is more data there than in their
          | own files. And I am just talking about ordinary access, not
          | what Facebook and Google can do as companies.
          | 
          | Add a bit of social engineering and crowdsourcing and no one
          | is safe if enough people want to find them. There have been
          | some pretty impressive instances of "doxing" in the past.
         | 
         | And staying off social media is pretty hard if you want to live
         | a normal life. You may not have a Facebook account but your
         | friends do, you may find your picture on your company website,
         | maybe even in a local newspaper. And officials are no
         | different, if they want to live normally, they are going to be
         | exposed.
        
       | tartoran wrote:
       | I see us wearing masks as a more general trend in the future.
       | That would probably mess with all facial recognition systems
        
         | Spooky23 wrote:
         | Agreed. My guess is that post pandemic, mask advocacy will flip
         | to more reactionary elements as well.
        
         | Nextgrid wrote:
         | Gait analysis/recognition covers that.
        
           | donkarma wrote:
           | wait until you put a pebble in your shoe
        
             | spicybright wrote:
              | I'd actually love to see if gait recognition gets screwed
              | up by changing someone's shoes (pebbles, an extra size
              | up, uneven platform height, etc.)
        
           | reaperducer wrote:
           | _Gait analysis /recognition covers that._
           | 
           | My wife has a hundred pairs of heels of varying heights and
           | widths. Good luck with that.
        
       | andy_ppp wrote:
       | It was suggested in the early 2000s that a face could be tracked
       | moving across London by the security services. What do we think,
       | was it possible then? I have to assume by now they have intent
       | monitoring everywhere and can probably tell when someone is doing
       | things like planning a bombing etc.
        
         | actually_a_dog wrote:
         | Well, the Independent claimed in 2004 that the average Briton
         | is caught on camera 300 times a day. [0] Since London probably
         | has more than the average number of cameras per area, even if
         | we wind the clock back 3-4 years, I suspect the answer is yes.
         | 
         | ---
         | 
         | [0]: https://www.independent.co.uk/news/uk/this-britain/how-
         | avera...
        
           | antihero wrote:
            | Which is an amazing number, yet the Met seems incapable of
            | solving most crimes.
        
             | bruce343434 wrote:
             | So why is that? Isn't this amount of surveillance basically
             | a god tool?
        
               | marginalia_nu wrote:
               | It's a very large haystack.
        
               | zo1 wrote:
                | Because it's apparently taboo and dystopian to perform
                | facial recognition and surveillance to help victims.
                | Most of those cameras and their recordings go into the
                | void instead of some sort of central database that can
                | be used for tracking criminals.
        
             | meheleventyone wrote:
              | Probably because most of the cameras aren't hooked up to
              | anything centralised, or necessarily even known about.
              | That necessitates an enormous data gathering and combing
              | operation to find anything useful, and there might not
              | actually be anything. Cameras aren't magic.
        
             | Nextgrid wrote:
             | Solving real, victim-reported crimes requires effort,
             | rarely scales and pretty much never brings any political
             | clout.
             | 
             | Solving bullshit manufactured crimes such as drug-related
             | ones scales better, doesn't even require a victim to
             | complain and provides seemingly-decent political clout and
             | a veneer of "we're doing something, see?".
             | 
              | When's the last time you heard about stolen bikes or
              | phones being recovered on the BBC? Personally, never.
              | But drug-related stuff is common.
        
           | cameronh90 wrote:
           | The vast majority of cameras in London are private
           | installations, rather than something hooked up to some
           | central system.
           | 
           | There are the ring of steel cameras, but that area hasn't
           | been widened recently. TFL cameras are possibly linked into
           | some security service, and there's a fair few of them, but
           | they're mainly pointed at roads.
        
         | bsenftner wrote:
         | "Intent monitoring" is fictional and does not exist beyond
         | scifi stories and the journalist written articles selling fear
         | about facial recognition. I work in the industry, there is no
         | such thing as "intent monitoring".
        
           | celeduc wrote:
           | "Intent monitoring" is a euphemism for detecting the presence
           | of the "wrong kind of people". It's not fictional at all.
        
       ___________________________________________________________________
       (page generated 2021-12-23 23:02 UTC)