[HN Gopher] Texas sues Google for collecting biometric data with...
       ___________________________________________________________________
        
       Texas sues Google for collecting biometric data without consent
        
       Author : fortran77
       Score  : 154 points
       Date   : 2022-10-21 13:22 UTC (9 hours ago)
        
 (HTM) web link (www.nytimes.com)
 (TXT) w3m dump (www.nytimes.com)
        
       | crazygringo wrote:
       | First of all, "biometric data" is a scary-sounding term that
       | sounds like iris or fingerprint scans, when it's just... what
       | people's faces and voices look and sound like. By this measure,
        | _my own brain_ is "collecting biometric data" on thousands of
       | people a day while I walk down the street.
       | 
       | Second, in cases like the Nest camera or Google photos,
        | presumably most of the people it's scanning _don't match_
       | anything. It sure doesn't feel like a violation of privacy or
       | consent for a computer to say, "I tried, but nope, face/voice not
       | recognized".
       | 
       | Third, everyone being recognized has _previously been identified
       | by the user_ , whether someone tagging their friends in Photos,
       | family members on Nest, or household members for Assistant. If
       | I'm OK with a friend uploading an image of me to their own Google
       | Photos, I don't see why I shouldn't allow them to run a facial
       | detection algorithm to organize all the photos that contain me.
       | 
       | Now _if_ Google were using all of this to build up a database of
       | all the faces in the world matched to identities (as other
       | sketchy companies have already done), then this would all be a
        | problem. But there's zero evidence of anything like that, and
       | that's not what this case is about. This isn't about misusing
       | Street View or anything.
       | 
       | So it's hard to see this as anything but a political stunt?
       | What's Texas's goal here -- to remove the helpful features that
       | alert us when a stranger (as opposed to family member) is at our
       | door, to distinguish voices in Assistant, and find photos of a
       | particular friend in our photo library...? These are all
       | genuinely useful features.
        
         | nova22033 wrote:
          | The Texas AG is under indictment for securities fraud. This
          | is red meat to Trumpers who feel the social media companies
          | are biased against them.
        
         | VWWHFSfQ wrote:
         | People always come in here with the same comment like "my brain
          | memorizes people's faces too! Should _that_ be illegal???"
         | 
         | Your brain isn't memorizing billions of people's faces on the
         | scale that Google is. There's a difference. It's a false
         | equivalency.
        
           | googlryas wrote:
           | So the argument isn't actually about collecting biometric
           | data, but it is about being really good at collecting
           | biometric data?
        
           | nova22033 wrote:
            | _Your brain isn't memorizing billions of people's faces on
           | the scale that Google is. There's a difference. It's a false
           | equivalency._
           | 
           | Doing things at scale isn't a crime. Can you point out which
           | law google broke?
        
             | weberer wrote:
             | >Google had violated a state consumer protection law that
             | requires companies to inform citizens and get their consent
             | before capturing their biometric identifiers, including
             | fingerprints, voiceprints and a "record of hand or face
             | geometry."
             | 
             | The law in question:
             | 
             | https://archive.ph/w9gtb
        
               | shadowgovt wrote:
               | It's going to be an interesting case because it can
               | easily be argued that Google captured nothing; they ran
               | post-capture analysis on data captured by individuals for
               | private use.
               | 
               | Whether that's relevant to the law is what the courts
               | will have to decide.
        
           | [deleted]
        
           | crazygringo wrote:
           | As long as the data Google is collecting is segregated per-
            | user, then _no_ -- it's _not_ any different.
           | 
           | If Google is learning about 10-20 people per account, and
           | that data never gets combined, then what's the problem?
           | 
           | Google also processes billions of people's e-mails. But
           | they're segregated per-account. So there's no problem. What
           | makes someone's face any different from an e-mail they send?
           | 
           | Again, this lawsuit is _not_ about Google aggregating
            | _anything_. It's _entirely_ about information that users
           | voluntarily choose to supply, that is used for features
           | desired by those users, and that remains segregated per-
           | account.
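            | 
            | (A toy sketch of the segregation model I'm describing --
            | every write and every query is keyed to one account, so
            | nothing is ever joined across accounts. The names here are
            | invented for illustration, not Google's actual design:)
            | 
            |     # per-account store: embeddings keyed by account id,
            |     # never joined across accounts
            |     store = {}
            | 
            |     def add_face(account_id, embedding):
            |         store.setdefault(account_id, []).append(embedding)
            | 
            |     def search_faces(account_id, probe, match):
            |         # a query only ever sees its own account's rows
            |         return [e for e in store.get(account_id, [])
            |                 if match(probe, e)]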
        
             | jeremyjh wrote:
             | Do you seriously believe that until the end of time no
              | court can sign a warrant demanding Google search its
             | database of photos across all accounts?
        
           | monksy wrote:
            | It's also not coordinating, indexing it, comparing it to
            | others, selling that data, or passing it on for blackmail
            | or law enforcement purposes, etc. Additionally, trying to
            | reproduce the person's overall biometrics from it would be
            | a very sloppy, unreliable process.
        
         | dkarl wrote:
         | > sounds like iris or fingerprint scans, when it's just... what
         | people's faces and voices look and sound like
         | 
         | It's important to point out the important difference, which is
         | that fingerprints are considered uniquely identifying for
         | practical purposes, and are commonly used to identify
         | individuals by automated security mechanisms and by law
         | enforcement, whereas faces are considered uniquely identifying
         | for practical purposes, and are commonly used to identify
         | individuals by automated security mechanisms and by law
         | enforcement.
        
           | bbojan wrote:
           | I see what you did there.
        
         | AnonCoward42 wrote:
          | It is always a matter of scale. You can get the fingerprint
          | of anyone (relatively) easily, but not everyone's. So the
          | problem only arises when it is done en masse, or more
          | generally when it can be used by centralized entities of any
          | sort.
          | 
          | The wider question is whether this can be avoided at all.
          | But I think we should at least try, because this kind of
          | data can so easily be misused; at one point in time we were
          | well aware of that, but have forgotten in recent times.
        
         | residentcoder wrote:
         | From the article:
         | 
         | >Mr. Paxton said in a statement. "I will continue to fight Big
         | Tech to ensure the privacy and security of all Texans."
         | 
          | It's obvious pandering to the masses. This is about Mr. Paxton
         | trying to get some attention.
        
           | cvwright wrote:
           | Big Tech keeps winning because we all hate our political
           | opponents more than we hate the abuse of our privacy.
        
             | manuelabeledo wrote:
             | Also because people like Paxton aren't really doing their
             | jobs.
             | 
              | He targeted Google, when Amazon, Facebook, etc. are
              | doing the same. Also, lesser-known companies like
              | Palantir roam free, just because they don't carry the
              | political baggage that Google does.
             | 
             | Worst of all, Paxton doesn't seem to have any issues with
             | police departments having access to Nest or Ring accounts.
        
           | bedast wrote:
           | He's on the ballot next month. So of course he is.
           | 
           | Not a fan of mass surveillance, but my gut says this is the
           | wrong person to fight this fight. His motives are
           | questionable, and it shows in specifically targeting private
           | business and not working to "ensure the privacy and security
           | of all Texans" from government and law enforcement.
        
         | Bhilai wrote:
         | Yep, this sounds like Ken Paxton trying to get some limelight a
          | week before early voting begins in Texas. It's supposed to
          | be a close election for Paxton, who has basically abused his
          | office with all sorts of ethical violations [1]. He has also
          | been under indictment for securities fraud [2] and recently
          | decided to literally run and drive away in his car to avoid
          | being served [3].
         | 
         | This is Paxton's incompetence for those unaware:
         | 
         | > Six of the people indicted last year on allegations that they
         | were involved in a scheme to force teenage girls to "exchange
         | sexual contact for crystal methamphetamine" are now free.
         | 
         | [1] https://apnews.com/article/elections-texas-presidential-
         | elec...
         | 
         | [2] https://www.texastribune.org/2022/09/15/ken-paxton-
         | securitie...
         | 
         | [3]https://www.texastribune.org/2022/09/26/texas-attorney-
         | gener...
        
         | advisedwang wrote:
          | Google Photos recognizes _EVERY_ face even if you don't tag
         | someone. It can show you a big array of every person in every
         | photo you've ever taken. Tagging just puts a name underneath
         | the picture and allows you to search them by name.
         | 
         | https://support.google.com/photos/answer/6128838
         | 
         | My google photos has a "photo group" for every person I studied
         | in college with (from a series of photos I took at a dinner
          | with everyone present). It even has Barack Obama, whom it
          | recognized off a shirt a friend was wearing. I tagged none
          | of these people.
        
           | amf12 wrote:
           | > Google Photos recognizes EVERY face
           | 
           | No. It does not "recognize" a face. If you take a few
           | pictures of another Google user, it won't say - "Hey, here
            | are all the pics of John". It will just allow you to look
            | up all pictures that match a face. It's clustering, not
            | recognition.
        
           | cromwellian wrote:
           | So it's a clustering algorithm that just clusters photos with
           | like features together.
           | 
           | Let's say it was done without any kind of special face
           | recognition but was done purely by some funky nearest
           | neighbor algorithm in high dimensional space. Would you still
           | object?
           | 
           | The end result is the same, grouping things by commonality on
           | different axes, one of them being faces. If the underlying
            | algorithm didn't even know what faces were, would it make a
           | difference?
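            | 
            | (Roughly, a minimal sketch of such a grouping pass,
            | assuming each photo has already been reduced to an
            | embedding vector; the greedy scheme and threshold are
            | invented for illustration:)
            | 
            |     import numpy as np
            | 
            |     def group_by_similarity(embeddings, threshold=0.8):
            |         # Greedy grouping: put each vector into the
            |         # first group whose exemplar is close enough in
            |         # cosine similarity, else start a new group.
            |         # Nothing here knows what a "face" is; it only
            |         # sees points in a high-dimensional space.
            |         groups = []  # (exemplar_vector, member_indices)
            |         for i, v in enumerate(embeddings):
            |             v = v / np.linalg.norm(v)
            |             for exemplar, members in groups:
            |                 if float(np.dot(exemplar, v)) >= threshold:
            |                     members.append(i)
            |                     break
            |             else:
            |                 groups.append((v, [i]))
            |         return [members for _, members in groups]
            | 
            | Feed it four vectors where 0/1 and 2/3 point roughly the
            | same way and it returns [[0, 1], [2, 3]] -- grouping by
            | commonality, with no notion of identity anywhere.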
        
       | heisenbit wrote:
       | A state which is now handing out DNA and fingerprint kits to kids
       | certainly has the expertise to navigate the finer aspects of
        | people's rights.
        
         | mechanical_bear wrote:
         | I fail to see the connection.
        
           | colpabar wrote:
        
         | [deleted]
        
         | bee_rider wrote:
         | Broken clock I guess.
        
           | shadowgovt wrote:
           | We should keep a close eye on this concern.
           | 
           | Brin's "The Transparent Society" notes that once ubiquitous,
           | automated, cheap surveillance becomes widely available, the
           | genie is out of the bottle... There isn't a scenario where
            | nobody uses it; there are merely scenarios where we decide
            | who may use it and in what contexts.
           | 
           | Is the same power structure that's going to attempt to ban it
           | for private use going to refrain from use by, say, the
            | police? Because it's not Google that will ultimately
            | absorb the consequences of having these technologies
            | suppressed... it's individuals, who can now no longer
            | identify who just walked onto their property as easily as
            | the police can identify who just walked into an abortion
            | clinic.
        
         | prepend wrote:
         | How does voluntary dna registering intended to protect kids
         | with data held by the state relate to Google collecting data
         | involuntarily from people and using it for commercial purposes?
         | 
         | I don't think the issue is with dna and face data existing.
         | It's that Google is scooping it up and selling it that is the
         | problem.
        
         | infecto wrote:
          | I cannot argue with the optics and timing, but it's
          | important to note that this is only part of the National
          | Child Identification Program (https://childidprogram.com).
          | It's not as nefarious as some of the news articles made it
          | seem. So I hardly see how this comment is relevant to the
          | discussion of the OP.
        
         | detaro wrote:
         | > _handing out DNA and fingerprint kits_
         | 
         | That the state is neither mandating nor collecting. Seems a
         | fairly rights-compatible way of doing it. (Although I'd also
          | question if it's really worth doing overall.)
        
       | advisedwang wrote:
       | The article mentions Washington having a biometric privacy law
       | too, but it explicitly excludes facial recognition:
       | 
       | RCW 19.375.010 (1) ... "Biometric identifier" does not include a
       | physical or digital photograph, video or audio recording or data
       | generated therefrom...
        
       | neonate wrote:
       | https://archive.ph/DKfRd
        
       | gerash wrote:
        | Is this AG arguing that doorbell cameras and indoor cameras
        | (e.g. Nest Cam) are all illegal in Texas now?
        
       | r3trohack3r wrote:
        | I think many comments here are missing the bigger-picture
        | context of how this data is problematic. Not speaking law
        | here; speaking from my personal mental model of evaluating the
        | risk/ethics of facial recognition in personal security
        | devices.
       | 
       | Building a local profile of faces observed on a single person's
       | property isn't problematic for me. In order for the police to
       | search it, they have to issue a very specific warrant for a
        | specific address's surveillance data. It's also anonymous by
        | default - while you have the face, the data is local and isn't
        | attached to any "network" of facial recognition software. It's
        | not being used to track people moving from house to house, or
       | business to business, to build a "model" of their behavior to
       | manipulate them. It's specifically "what footage do I have
       | locally of this face?" which can be extremely helpful when
       | securing a home.
       | 
       | This is in stark contrast to what I worry companies like Google
       | do (Facebook has been caught doing it): building shadow profiles
       | of non-users that track behaviors across all places their face
       | shows up (surveillance footage, personal photos of other users,
       | etc.). This is a network of information consolidated into a
       | single point. You have no reasonable expectation of privacy in
       | public, but I do believe you should have a reasonable expectation
       | that people aren't stalking your every move across many
       | properties and aggregating that data into a single source law
       | enforcement can get access to with a single warrant.
        
         | giraffe_lady wrote:
         | I think they're more similar than you think, definitely along
         | the same continuum. This isn't an argument for why this data
          | collection is OK, however; I'll actually take the more
          | radical stance that even the local face recording is
          | unethical, or at least pointless and possibly harmful.
         | 
         | > "what footage do I have locally of this face?" which can be
         | extremely helpful when securing a home.
         | 
         | How? What mechanism of security is strengthened here? Almost
         | certainly your only use case for this data is handing it over
         | to the police to apply _their_ facial recognition systems to
          | it. Which they have because of homeowners like you, I guess.
          | It's the same surveillance apparatus as in your Google case
         | except more manual and less efficient. If it shouldn't exist at
         | all, a bad version of it also shouldn't exist.
        
           | r3trohack3r wrote:
           | "Send me a push notification for every novel face detected"
           | 
           | "Alarm if any face other than these 5 are detected on an
           | indoor camera between the hours of 10pm and 8am"
           | 
           | "I want to review all footage of the nanny from the past week
           | since they spend time alone with my children"
           | 
           | "Show me all footage of myself inside the house in the last
           | 15 minutes because I have no idea where I set my keys down"
           | 
           | "I just heard a window break at 2am, which cameras are
           | detecting motion so my wife can avoid them to get the kids
           | out while I do the opposite to buy them some time"
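            | 
            | (The first rule is just a set-membership test. A rough
            | sketch, assuming a camera pipeline that already yields
            | one embedding per detected face; the threshold and names
            | are invented:)
            | 
            |     import numpy as np
            | 
            |     def is_novel_face(face, known_faces, threshold=0.8):
            |         # True -> push a notification: the detected face
            |         # matched none of the enrolled household faces.
            |         v = face / np.linalg.norm(face)
            |         for known in known_faces:
            |             k = known / np.linalg.norm(known)
            |             if float(np.dot(k, v)) >= threshold:
            |                 return False  # recognized; stay quiet
            |         return True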
        
             | giraffe_lady wrote:
             | Several of those are still currently science fiction and
             | the rest are the same cop shit google wants to do to all of
             | us, except you're the cop and you're only doing it to your
             | family and employees. It's not better.
             | 
             | It doesn't matter whether it's a paternalistic police state
             | or literally you, dad, sitting in the seat of the
             | panopticon. It's bad for the psyche of everyone surveilled
             | in this way and it shouldn't be done. The fantasy that
             | you'll use this in an emergency is the same one gun owners
             | use to justify their decisions, and just as likely to come
             | true.
        
               | r3trohack3r wrote:
               | There is a difference between the power that comes with
               | the centralization of surveillance into a panopticon (by
               | the state or a business) and localized home surveillance.
               | There is also a difference in how consent works in those
               | cases.
               | 
               | > The fantasy that you'll use this in an emergency
               | 
               | Knowing how many people are in my home and where they are
               | at 2am seems like information that would be extremely
                | helpful. You'll need to share data to back up your stance
               | that I'm doing home security wrong. I'd love to learn how
               | to do it better.
               | 
               | Let's not derail this into a conversation about guns, we
               | can save that disagreement for another day.
        
           | mananaysiempre wrote:
           | > If [country-wide private surveillance] shouldn't exist at
           | all, a bad version of it also shouldn't exist.
           | 
           | While I'm on the side of less surveillance in general, I
           | don't think I like this logic: it seems to me that the
           | crucial point that distinguishes the systems of the last
           | couple of decades from what came before is convenience. "No
           | expectation of privacy" means very different things when
           | someone needs to physically follow you around or even make a
           | lucky guess as to which CCTV to look at compared to making a
           | query from the comfort of their desk. It's one thing for a
           | passing cop to memorize your license plate number and then
           | have to thumb through paper files in order to know that's
           | you; it's another for them to have a full map of every car
           | and owner name on it. (I don't think the latter extremes are
           | available in most countries yet, FWIW.) The bland
           | "expectation of privacy" wording seems to miss this sliding
           | scale of effort for a given amount of tracking.
           | 
           | I actually half suspect that this argument is wrong for some
           | dumb reason I'm missing, because I don't have a source for it
           | (even though it seems to be in the air somehow) and even
           | supposed experts like EFF lawyers don't seem to be making it.
           | But I haven't yet found that reason.
        
             | giraffe_lady wrote:
             | I think the difference is right in front of you, you almost
             | said it explicitly right here:
             | 
             | > It's one thing for a passing cop to memorize your license
             | plate number and then have to thumb through paper files in
             | order to know that's you; it's another for them to have a
             | full map of every car and owner name on it.
             | 
              | Those are the exact same thing _if you're a person who
             | cops are following around already_, memorizing your number
             | trying to get you for any offense they can. The difference
             | is for many people, particularly people overrepresented on
             | HN, the police are not generally taking much interest in
             | your routine activities and trying to use them against you.
             | But this isn't a relationship with policing that everyone
             | has.
             | 
             | All these abuses were already here, they just weren't
             | applied to everyone. The expanding of these capabilities
              | _is_ what makes them devastating, you're right. But the
              | earlier iterations weren't actually less bad, they were
             | just less likely to be applied to us.
        
               | r3trohack3r wrote:
               | I love this train of thought and it aligns well with how
               | I think about the world. You treat state actors
               | differently than citizens and, living under a state, they
               | are absolutely in your threat model for home security.
               | Police get it wrong sometimes. Sometimes they act
               | maliciously.
               | 
                | Counterpoint. My security system is not easily taken
               | offline (resilient against cutting power for 2+ hours,
               | cameras are all PoE, hard to find the drive storing
               | footage, etc.). If the state enters my home COINTELPRO
               | style, or gets the wrong house, I want footage of that
               | encounter under my control when it's time to go to court.
               | 
               | I think this line of thinking lines up well with my
               | original comment. We shouldn't look at this technology as
               | an all encompassing shift towards centralization of
                | power. There are uses that give power back to
               | individuals. This power is a good thing on an individual
               | level and dangerous in aggregate.
        
           | r3trohack3r wrote:
           | > Which they have because of homeowners like you, I guess.
           | 
           | All of my security cameras are isolated on my network. They
           | do not exchange packets with the public internet. They can
           | not phone home to the manufacturer.
           | 
           | If law enforcement has footage from my surveillance system,
           | it's because they have a warrant or because I willingly
           | handed the footage over to them after someone made a bad
           | decision on my property.
        
             | brnaftr361 wrote:
              | Great, so now _you're_ surveilling _your_ family. It's
              | still a violation of privacy, perhaps even more so
              | because you, an intimate, now have a much wider view of
              | your family's behavior. It's weird and unnatural. And
              | you may say today that you're a beneficent broker of
              | this information, but tomorrow, when your wife or your
              | children in some manner act aberrantly - _voila_ - your
              | opinion suddenly changes - you'll leverage it for some
              | self-concerned motive. At least with a third party
              | monitoring shit there's not such a considerable conflict
              | of interest.
             | But none of it is really reasonable. And criminals _will_
             | adapt - they have way more options than you do.
        
             | giraffe_lady wrote:
             | > they have a warrant or because I willingly handed the
             | footage over to them
             | 
             | Yes, that was the assumption I had in mind while writing
             | that.
        
         | ajross wrote:
          | Notably, Facebook has been building AI face-recognition
          | profiles from its users' photos for more than a decade now.
         | Yet no Texas lawsuit. It's important to recognize the clear
         | political angle here too. Paxton sees Google as an enemy
         | whereas Facebook is closer to "his team" (or at least relies
         | for more of its revenue on his team).
         | 
         | And I get how this comment will be taken, but it would be good
         | for all the HN libertarians to introspect and think a bit on
         | whether they really want to get their "stronger enforcement of
         | personal internet privacy standards" via "extended use of state
         | power against political enemies".
         | 
         | Seems like both of those are slippery slopes. I know which one
         | I personally fear more.
        
           | selwynii wrote:
           | FB shut down facial recognition last year. I'm waiting to see
           | other companies follow suit. Apple?
           | 
           | https://about.fb.com/news/2021/11/update-on-use-of-face-
           | reco...
        
           | thetrb wrote:
           | What are you talking about? There is a Texas lawsuit about
           | this against Meta:
           | https://www.nytimes.com/2022/02/14/technology/texas-
           | facebook...
        
         | cromwellian wrote:
         | But these profiles can be built on the fly. Imagine you host
         | photos but have no biometric profiles. The police show up with
         | a warrant and this warrant demands you run a search for all
         | faces that match a given face and turn over the resulting
         | images to the court. If such a search algorithm exists, the
         | court might be able to order you or an outside forensics
         | company to use it.
         | 
         | So basically to prevent police being able to do this you have
          | to prevent hosting non-end-to-end-encrypted photos.
         | 
         | That's a significantly reduced service capability.
        
           | vineyardmike wrote:
           | > demands you run a search for all faces that match a given
           | face and turn over the resulting images to the court.
           | 
           | Hopefully a court would reject that warrant as being too
           | broad, but either way the compute to do this would be really
           | expensive for any not-large company, or even a large company
           | if they regularly get these requests.
        
             | scarface_74 wrote:
             | You give the US justice system way too much credit.
             | 
             | https://www.theguardian.com/us-news/2021/sep/16/geofence-
             | war...
        
             | cromwellian wrote:
             | The compute wouldn't be that high as long as you had a way
             | of scoping it down. If the police limited it to a handful
              | of accounts or to a geo-fenced time of day, it would be
             | quite tractable.
             | 
             | The police have already used such geo-fenced warrants
             | before.
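              | 
              | (A toy sketch of why scoping keeps this tractable: the
              | expensive face comparison only runs on photos that
              | survive the cheap metadata filters. The field names,
              | the bounding-box geofence, and the match predicate are
              | all assumptions, not any real provider's API:)
              | 
              |     def scoped_face_search(photos, box, t0, t1,
              |                            probe, match):
              |         # box = (min_lat, min_lon, max_lat, max_lon)
              |         def in_box(p):
              |             return (box[0] <= p["lat"] <= box[2]
              |                     and box[1] <= p["lon"] <= box[3])
              |         # time and geofence filters first; the costly
              |         # face match only sees the small candidate set
              |         return [p for p in photos
              |                 if t0 <= p["taken_at"] <= t1
              |                 and in_box(p)
              |                 and match(probe, p["embedding"])]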
        
       | [deleted]
        
       | uncletammy wrote:
       | As a Texan, it's nice seeing Ken Paxton, our crook of an attorney
       | general who has been under indictment for multiple years for
       | multiple crimes, finally do something positive for the people.
       | 
       | I still won't vote for the fascist criminal but I'm glad he's
       | finally doing his job.
        
       | cestith wrote:
       | Ken Paxton is up for reelection and is still under federal
        | indictment for multiple felonies. This is a state right of
        | action, and if this election stunt works, don't look for any
       | reimbursement to the wronged parties.
        
       | shagie wrote:
       | Gift link: https://www.nytimes.com/2022/10/20/technology/texas-
       | google-p...
        
       | monksy wrote:
        | I think the bigger issue isn't just collecting consent for
        | this; it's questioning whether the data is necessary for the
        | business need they claim, and whether the consent is coerced.
        | 
        | GDPR is great; however, phone companies still collect
        | biometric data and claim that it's "necessary" (it's not),
        | when it's the copy of the passport that is actually
        | necessary. Additionally, forcing the collection of it, such
        | that otherwise you can't get a phone number, is not exactly
        | consent. That's coercion.
        | 
        | This is the case with Telefonica in Spain (where I went to
        | the store and they didn't warn me, but coerced me into going
        | through this with a store rep right there). This is also the
        | case with Sonder apartments, where they want pictures of your
        | ID and of you to rent from them. https://www.sonder.com/
        
       | manuelabeledo wrote:
       | Texas is a one-party consent state, meaning that you can store
       | any sort of recording you have been involved in, without seeking
       | consent from the other parties involved.
       | 
       | Texas police departments have also been trying to partner with
       | camera manufacturers like Ring or Google, to access their
       | customer records [0].
       | 
       | I must be missing something here, then. Either that, or this is
       | just another political stunt directed against "big tech", while
       | it ignores the bigger picture.
       | 
       | [0] https://www.dallasjustice.com/ring-cameras-and-police-
       | survei...
        
         | j33zusjuice wrote:
         | 100% a political stunt. The guy bringing these suits is a
         | political hack. Even if he's right that this could lead to
         | abuse, I'm certain it's a self-interested thing. This could be
         | my biases showing themselves, but I've got good reason for my
         | biases against the government. I'm also against big tech,
         | though, so I'm torn on how to feel about Google getting sued.
        
         | nordsieck wrote:
         | > Texas is a one-party consent state, meaning that you can
         | store any sort of recording you have been involved in, without
         | seeking consent from the other parties involved.
         | 
         | Seems unlikely to apply.
         | 
         | 1. "one-party consent" applies only to "wiretap" laws i.e.
         | audio recording. It looks like a lot of this case is about
         | photo and video data.
         | 
         | 2. "one-part consent" refers to people in conversation. That
         | definitely doesn't apply to Google here.
        
           | bravetraveler wrote:
           | I think an important part of this is the 'reasonable
           | expectation of privacy', too.
           | 
           | From what I understand, recording video or audio outside _in
            | public_ where there's no reasonable expectation of
           | privacy is a-okay.
           | 
           | This may put the ring cameras in an interesting position.
           | They're attached to your home, yes, but they're looking at
           | the public street.
        
             | anon_cow1111 wrote:
              | Is there any reason I _can't_ put a Ring camera in the
             | bathroom and sue someone's ass off later, though? Do they
             | actually advertise them as being accessible by police, so
             | the user reasonably expects their security camera footage
             | to not be private? (I really don't know, I've only seen
             | commercials which seem to imply it's just home camera/voice
             | com you can use with your phone)
        
         | phrz wrote:
         | Yes, you're thinking of recordings. This suit is brought under
         | the Texas Capture or Use of Biometric Identifier Act (Tex. Bus.
         | & Com. Code SS 503.001 et seq.) [0]. At issue is the alleged
         | unauthorized collection of face and fingerprint biometric data.
         | 
          | [0] https://statutes.capitol.texas.gov/Docs/BC/htm/BC.503.htm
        
           | manuelabeledo wrote:
            | Thing is, outside purely legal statutes, I don't see how
            | the collection of "faces" done by a company on my behalf
            | is anywhere near as dangerous as the police tapping into
            | my account to retrieve pictures of alleged criminals.
           | 
           | In other words, I highly doubt that anyone wanting to enter
           | my home would object to Google scanning their face, but I
           | cannot say the same about the police collecting information
           | on them.
        
         | shadowgovt wrote:
         | Not only are you missing nothing, you may have identified one
         | of the dimensions of why this lawsuit is coming from Texas
         | specifically.
         | 
         | It may be the stick in a carrot-and-stick negotiation for
         | sharing that camera data with Texas LEO. A lawsuit like this
         | can be dropped by the prosecution any old time, and there's no
         | particular requirement that _all_ reasons the suit was dropped
         | be publicly stipulated.
        
       | gzer0 wrote:
       | I find this hysterical, especially since Texas was "hacked" and
        | all 28 million residents at the time had their entire driver's
        | license info stolen (name, address, DL #, age, height). Nothing
       | was done [1].
       | 
       | [1] https://www.fox26houston.com/news/nearly-28-million-
       | licensed...
        
         | TehCorwiz wrote:
         | I think the major distinction between your example and the
         | article is: voluntary vs involuntary.
         | 
         | Texas did not intentionally give away their data, it was
         | exposed due to an attack from a third-party.
         | 
         | Google is willfully collecting cross-referenced location,
         | image/video, and personal profile data.
        
           | jonas21 wrote:
           | Google has not given their data away at all, voluntarily or
           | involuntarily, except when compelled to by court order.
           | 
           | Texas did voluntarily give their data to a third-party, the
           | insurance software company Vertafore, who then inadvertently
           | exposed it publicly.
        
         | calculatte wrote:
         | I don't find that hysterical. Both cases are wrong and should
         | not happen. People turning such stories into political issues
         | is how they keep getting away with it.
        
       | VWWHFSfQ wrote:
       | > The complaint targets the Google Photos app, which allows
       | people to search for photos they took of a particular person;
       | Google's Nest camera, which can send alerts when it recognizes
       | (or fails to recognize) a visitor at the door; and the voice-
       | activated Google Assistant, which can learn to recognize up to
       | six users' voices to give them personalized answers to their
       | questions. Mr. Paxton said the products violated the rights of
       | both users and nonusers, whose faces and voices were scanned or
       | processed without their understanding or consent.
       | 
       | It would be great if they could actually stop Google and everyone
       | else from doing face recognition on random people in photos
       | without them knowing or having any way to stop it.
        
         | amf12 wrote:
         | The complaint is a farce.
         | 
         | > Google Photos app, which allows people to search for photos
         | they took of a particular person;
         | 
         | All Google Photos does is group pictures by matching faces in
         | pics to allow you to click on a face and lookup all pictures
         | which have that face. It does not "recognize" who the picture
         | belongs to.
         | 
         | > Google's Nest camera, which can send alerts when it
         | recognizes (or fails to recognize) a visitor at the door;
         | 
         | All the processing happens on the device locally. It will only
         | "recognize" by name if you name a face. This is a premium
         | feature and the tagged names are wiped when you stop the
         | premium subscription.
         | 
         | > voice-activated Google Assistant, which can learn to
         | recognize up to six users' voices to give them personalized
         | answers to their questions.
         | 
         | Users have to enroll their voice to access this feature.
        
           | shadowgovt wrote:
            | It might not be the case that the complaint is a farce as
            | far as the law is concerned.
           | 
           | On the other hand, it might be the case Texas passed an ill-
           | thought-out "Don't collect biometric data" law that makes it
           | illegal to do anything interesting with privately-taken
           | photographs for personal use without the express consent of
           | every individual in every photograph.
        
         | theptip wrote:
         | It's really simple (not easy): pass a US equivalent of GDPR,
         | and make sure biometrics count as "personal data" as they do in
         | the EU.
         | 
         | CCPA gets you a lot of the way there (indeed I suspect you can
         | request to have Google delete all your biometric data under the
         | "right to be forgotten" clause), but I think CCPA doesn't
         | prevent collection/processing of data without an agreement in
         | place as the GDPR does. (Basically, GDPR requires you to have
         | an agreement in place before storing any of my Personal Data,
         | since it's that agreement that then binds who you can share it
         | with, and how. If I don't use Google, then Google can't process
         | my biometric data. I wonder if they get around this by
         | approximately everyone using Google in some capacity, and
         | having a "I also agree that you can process my biometric data"
         | term in the ToS?)
         | 
         | Interested in any lawyers' opinions on the above; my read of
         | e.g.
         | https://www.bakerlaw.com/webfiles/Privacy/2018/Articles/CCPA...
         | is that Google cannot (i.e. would be forbidden to) do biometric
         | recognition on people that have requested for their data to be
         | deleted.
         | 
         | Where I'd like to see CCPA go further is that it doesn't
         | strongly restrict transfers of data; under GDPR you explicitly
         | approve a set of Processors and Sub-Processors, and must be
         | informed when that changes. CCPA does seem to restrict sales of
         | data, but doesn't tightly control where it is shared without
         | sale.
        
           | mindslight wrote:
           | A direct analog of the GDPR is exactly what the US needs,
           | especially with its framing of consent that doesn't allow the
            | standard dance of nullification-via-nonconsensual-contract.
           | 
           | I fear that all of these half-baked state privacy laws are
           | going to force a move for overriding federal legislation, at
           | which time the surveillance companies will lobby hard for all
           | sorts of loopholes that effectively neuter the protections,
           | and the totalitarian status quo will be set in stone.
           | 
           | It's unfortunate the Internet surveillance industry wasn't
            | nipped in the bud 15 years ago; now Surveillance Valley is
           | held up as a bastion of "innovation" and most of the people
           | who should know better are happily on the take.
        
             | shadowgovt wrote:
             | On the other hand, given that the GDPR is the source of the
             | now-near-ubiquitous cookie-consent noise, perhaps it's
             | better if the US _not_ mirror that law.
             | 
             | There are probably good parts of it that can be lifted, but
             | it may tilt too hard in the direction of personal ownership
             | of other people's perception of you.
             | 
             | In any case, we can be confident that nothing will pass at
             | the Federal level that would make credit scores illegal.
        
               | mindslight wrote:
               | The GDPR is not "the source of the now-near-ubiquitous
               | cookie-consent noise". Rather, those banners arose from
               | _malicious compliance_ by companies protesting the
                | earlier EU cookie law, and you've seemingly fallen for
               | the ruse. Furthermore, the GDPR _fixed_ the loophole that
               | made this malicious compliance technically legal. So the
                | only takeaway from your point is that enforcing
               | regulations on large companies suffers from
               | incompleteness, which is obvious.
               | 
               | Your assertion that the GDPR affects "personal ownership
               | of other people's perception of you" is blatantly false.
                | From the GDPR: "_2. This Regulation does not apply to
               | the processing of personal data: ... (c) by a natural
               | person in the course of a purely personal or household
               | activity_ ". It explicitly excepts regulating _personal
               | activity_ , and instead focuses on commercial activity -
               | ie companies. Companies do not have some inherent right
               | to keep surveillance records on individuals. And this
               | often referenced idea that company behavior is merely
               | individual behavior scaled up is utterly fallacious,
               | starting with the fact that companies are formed
               | precisely to shield liability by diffusing
               | responsibility.
               | 
               | And sure, the corporate lobby is extremely powerful in
               | the US, so I agree I'm dreaming to think that any law
               | would ever hamper the credit surveillance bureaus - they
               | already bought their regulatory capture with the
               | indemnifying "Fair" Credit Reporting Act. But still if
               | we're talking about what _ought_ to be, then a law that
               | would allow me to opt out of their keeping surveillance
               | records on me is sorely needed. I for one would be happy
               | to live without them.
        
               | shadowgovt wrote:
               | > those banners arose from malicious compliance
               | 
               | The consequences of malicious compliance _must_ be
               | considered for every law passed. That 's fundamental to
               | the process of law, because humans react to incentives
               | and are often selfish (or at least, self-focused). There
               | was very little carrot attached to the law and plenty of
               | stick, so people pushing right up to the edge of what the
               | law allows was _completely_ anticipated.
               | 
               | Can't blame people with goals opposed to the law for
               | legally bending the law to reach those goals.
               | 
               | I like thinking about what _ought_ to be, but I sure get
               | burned too often by people passing laws in that direction
                | who haven't given sufficient thought to what _is_.
        
               | mindslight wrote:
               | I agree that the cookie law was a terrible law, written
               | by tech-ignorant politicians who reached for a naive
               | solution - one that already existed as a configuration
               | option in web browsers.
               | 
               | Malicious compliance within the letter of the law should
               | be anticipated, yes. But malicious _noncompliance_ is
               | still illegal, like these GDPR-inspired faux-consent
               | popups. The framing of consent in the GDPR is precisely
               | because of how the cookie law played out.
               | 
               | I find it curious that your nick is "shadowgovt", yet
               | you're arguing against attempting to regulate a shadow
               | government. Are you like a shadow government _enthusiast_
               | or something?
        
               | shadowgovt wrote:
               | Check my profile, it's a reference to a song.
               | 
               | The crux of the song is shadow governments do not exist.
               | It's just people doing their best with the incentives
               | before them.
        
               | theptip wrote:
               | > the GDPR is the source of the now-near-ubiquitous
               | cookie-consent noise
               | 
               | I didn't think this was true off the top of my head, and
               | a quick dig supports that:
               | 
               | https://gdpr.eu/cookies/
               | 
               | The GDPR doesn't have anything to say about cookies,
               | aside from that they count as Personal Data. (It's quite
               | readable, you can verify this for yourself: https://gdpr-
               | info.eu/). GDPR is about what happens to the data you
               | share with a company, what they are allowed to do with
               | it, who they are allowed to share it with, what your
               | rights to delete that data are, and what the penalties
               | should be if they leak/misuse your data.
        
               | shadowgovt wrote:
               | Because cookies count as personal data (which is pretty
               | strange by itself... data the client stores that the
               | server hands it is personal?), The need to gather consent
               | to store cookies on your machine became suddenly
               | ubiquitous, and rather than cease to store them (which
               | should have been a ridiculous expectation on its face...
               | What, every website on the planet was going to audit how
               | the cookies are used? Nobody's spending that money, and
               | you can't just remove them without risking breaking some
               | flow), companies went with the shortest path to
               | continuing their current practices with no major changes,
               | which was to pop a banner asking for your consent.
               | 
               | That outcome should have been obvious and the fact that
               | it wasn't does not inspire confidence in the people
               | making these laws.
        
         | Mikeb85 wrote:
         | > face recognition on random people in photos
         | 
          | Except they're not. They just match faces to each other,
          | and the user has to tell Google Photos who it is. Google
          | Photos isn't running faces against a database to ID people;
          | it only matches like faces within your own albums.
        
           | 8ytecoder wrote:
           | I avoid Google products. But even to me it's obvious that
           | they're not doing a global match. All facial recognition - if
            | you could even call it that - is simply within the user's
           | photos. Kinda like a categorisation algorithm. Now, if they
           | took this data and used it elsewhere Texas might have a case.
           | Otherwise it should be thrown out.
        
           | monksy wrote:
            | They are doing face detection but not recognition. It's
            | always amusing when I go through Google Photos and it
            | shows me complete strangers that I have in my photographs.
        
         | shadowgovt wrote:
         | Just keep in mind that the "they" in question is _definitely_
         | not going to stop themselves from using it to comb crowd images
         | for perpetrators of potential crimes and other wanted
         | individuals.
        
           | VWWHFSfQ wrote:
           | We can care about more than one thing at once. This article
           | is about attempts to stop Google from doing it. That's what
           | we're talking about.
        
             | shadowgovt wrote:
             | It's about attempts _by the people who will do the other
             | thing_ to stop Google from doing it.
             | 
             | I'm sure Texas law enforcement wouldn't mind if cops could
             | operate in neighborhoods with less worry about what
             | people's Ring cameras will show them doing.
        
               | VWWHFSfQ wrote:
               | you're trying to derail this conversation with some
               | whataboutism
        
               | shadowgovt wrote:
               | It's not a whataboutism to look one step forward to the
               | consequences if Texas bans Google from offering these
               | tools to consumers (which is: an arrangement where the
               | tools are in the hands of _only_ the government, not the
               | government and private individuals).
               | 
               | Texas's ban is in the business and commerce code; it's a
               | ban for the commoners but not the government. We should
               | be looking at attempts to exercise it with suspicion and
               | concern.
        
       | chatterhead wrote:
       | I thought you had to show actual damages to sue in this context?
       | How does a state assume the role of the people transgressed and
       | then not pass any money from the suit onto the people impacted?
       | 
       | So tired of governments using people as a means of attacking
       | corporations only to not provide the people with relief.
        | Otherwise, it needs to be criminal. But they can't do that
        | unless a specific individual is found to have broken the law,
        | and that's what the layers of separation and plausible
        | deniability are for.
       | 
       | How does any of this benefit the people?
        
       | balderdash wrote:
        | Personal opinion: While I don't expect much in the way of
        | privacy in public, I do have a general expectation of
        | anonymity. E.g. if I leave my home, walk down to the local
        | cafe, pay cash for a coffee, put my head into the local
        | bookshop for a browse, and walk home, then while people can
        | see me and judge my actions, I can complete this journey in an
        | anonymous way. If I misbehave (don't pay for my coffee) I
        | expect my actions to come under scrutiny. But the thought of
        | the coffee shop, book shop, and all my neighbors logging my
        | comings and goings is terrifying.
        
       | hirundo wrote:
       | If I look at you and recognize you I don't feel that I have
       | violated your rights, whether I use cybernetic augmentation or
       | not.
        
         | thatguy0900 wrote:
         | What if I pay someone who knows you to find and list every
         | picture you've ever been in together with the location it was
          | taken in, and I start tracking whom you hang out with the
          | most, and where? Maybe every once in a while I send you
          | lists of things your friends have bought (I'm also doing
          | this to them) just in case you might want to buy them too.
          | Genuinely curious if that feels violating to you.
        
           | pc86 wrote:
           | No, that doesn't sound like genuine curiosity.
        
             | thatguy0900 wrote:
             | How so? Where is the line drawn? He's comfortable
              | recognizing people's faces; we all are. That's not
              | really what Google's doing, though; mine is closer. I
              | would draw the comfort line at combining what they know
              | about my pictures with data they've only gained through
              | other people's uploads, I think.
        
         | petilon wrote:
         | You may not have, but the company that provided you cybernetic
         | augmentation probably did, if they collected info without
         | consent and rented it to you.
        
           | [deleted]
        
         | boplicity wrote:
         | Yes, but you don't have literally millions of vision inputs
         | that are constantly scanning for me.
        
         | AlexandrB wrote:
         | When did it become OK to treat each other like shit? Not
         | violating someone's rights is a pretty low bar. I don't
         | understand why people feel justified in turning the
         | surveillance capabilities of the world's largest companies on
         | their fellow humans.
        
           | esprehn wrote:
           | What part of this is surveillance though? Within a particular
           | Google Photos account the faces of humans (and pets) are
           | identified and then labeled by the account owner. Google
           | doesn't match faces across accounts, doesn't use the data for
           | ads, and is doing effectively what a human could do by
           | scrolling through the photos and manually circling faces with
           | a pen.
        
             | shadowgovt wrote:
             | It's ultimately irrelevant to the Texas code whether Google
             | is correlating the data across sources, because what the
             | Texas code banned was collecting the data in the first
             | place, not the way it's collated and used.
             | 
             | There's some sense to that approach; people don't want to
             | trust that the only thing keeping Google from doing the
             | cross-correlation is their own corporate ethics. I
             | ultimately think trying to ban the intake in that way is
              | bailing out the Titanic, but I think I see where they're
              | coming from.
        
               | googlryas wrote:
               | But the data is just photos. Is any photo hosting service
               | illegal in Texas now? The biometric data is literally in
               | any photograph of you. And the data is entirely isolated
               | within a single account.
               | 
               | I have tens of thousands of photos in my google account,
               | and thousands with my wife. Right now, google can show me
               | photos of me and my wife. Is the idea that I'm supposed
               | to ask for some kind of consent from my wife before I'm
               | allowed to ask google to show me photos of me and my
                | wife, while it's fine if I go through my account and
                | just select photos of me and my wife by hand?
        
           | shadowgovt wrote:
           | The jury's out on whether that counts as "treating each other
           | like shit."
           | 
           | Most of human civilization has been a pattern of small
           | communities where everyone knew everyone. The privacy granted
           | implicitly by anonymity is relatively new (and, I'd argue,
           | whether it's a net benefit for society is a largely open
           | question... A lot of harm is done by people who quietly go
           | off the rails because nobody knows who they are).
        
             | dadoomer wrote:
             | > Most of human civilization has been a pattern of small
             | communities where everyone knew everyone. The privacy
             | granted implicitly by anonymity is relatively new
             | 
             | I find that hard to believe. Teotihuacan (first example
             | that came to mind) already had 100,000+ inhabitants around
             | 1 to 500 CE, and according to Wikipedia it was only the
             | sixth largest city.
             | 
             | Regardless, I don't see why it's particularly important
             | what cities were like during "most of human civilization"
              | when discussing electronic surveillance.
        
         | bakugo wrote:
         | I'm so tired of this "well a real human can do it so it's okay
         | for a machine under the control of a megacorporation to do
         | something vaguely similar at a 1000000x larger scale!" non-
         | argument.
        
           | shadowgovt wrote:
           | "If I hired two million private investigators to follow
           | around the citizens of Dallas and record everything they do,
           | would that be illegal?"
           | 
           | (No; if anything, it'd be a job-creation program at that
           | point. ;) )
        
           | googlryas wrote:
           | Well then people should stop putting forward
           | moralistic/absolutist arguments and provide more nuance.
           | 
           | Anyways, where's the limit? Is it just megacorps at 1000000x
           | the scale that can't do it? Can a tiny company do it at 10x
           | the scale? Can I personally do it at 1x the scale?
        
             | mindslight wrote:
             | The GDPR lays out a straightforward threshold:
             | 
             |  _2. This Regulation does not apply to the processing of
             | personal data: ... (c) by a natural person in the course of
             | a purely personal or household activity_
             | 
             | Once something is done commercially, it inevitably scales
             | in frequency and continuity.
        
               | 988747 wrote:
               | Now the question is: if I write a web scraper that
                | collects data from LinkedIn, and I do it on a weekend
                | as a hobby project, but I end up with personal data of
                | 1M people, is it still a "personal or household
                | activity"?
        
       ___________________________________________________________________
       (page generated 2022-10-21 23:01 UTC)