[HN Gopher] FTC: Vast Surveillance of Users by Social Media and ...
       ___________________________________________________________________
        
       FTC: Vast Surveillance of Users by Social Media and Video Streaming
       Companies
        
       Author : nabla9
       Score  : 271 points
       Date   : 2024-09-19 16:49 UTC (6 hours ago)
        
 (HTM) web link (www.ftc.gov)
 (TXT) w3m dump (www.ftc.gov)
        
       | GeekyBear wrote:
       | This portion is particularly problematic:
       | 
       | > many companies engaged in broad data sharing that raises
       | serious concerns regarding the adequacy of the companies' data
       | handling controls and oversight.
        
         | mrmetanoia wrote:
         | It would be wonderful if the staff report recommendations were
         | taken seriously by our legislators. I think I'll send a copy of
         | this to my reps and say hi.
        
       | ryanisnan wrote:
       | I love the cognitive dissonance on display within the federal
       | government.
       | 
       | One arm: "everyone is a criminal; spy on everyone"
       | 
       | Other arm: "hey you shouldn't really harvest all of that data"
        
         | bitwize wrote:
         | And it's not just here.
         | 
         | The EU: Unlike the barbarians across the pond, we actually
         | protect people's privacy rights.
         | 
         | Also the EU: ChAt CoNtRoL
        
           | ryanisnan wrote:
           | The problem seems deeply fundamental to what it means to be a
           | human.
           | 
            | On one hand, there's a lack of clear leadership unifying the
            | societal approach, on top of the inherently different value
            | systems individuals hold.
           | 
            | It seems like, increasingly, it's up to technologists, like
            | the ones who author our anti-surveillance tools, to create a
            | free way forward.
        
           | whimsicalism wrote:
           | this view presupposes the state as "just another actor" as
           | opposed to a privileged one that can take actions that
           | private actors can't
        
             | Karunamon wrote:
             | Those privileged actions are mostly irrelevant when
             | discussing mass surveillance. Doubly so since they can just
             | buy or acquire the data from corps.
        
             | lupusreal wrote:
              | In the matter of corporations vs governments, if you tally
              | up the number of people shot, it's clear which of the two
              | is more dangerous. You would think Europe of all regions
              | would be quick to recognize this.
              | 
              | I don't _like_ corporations spying on me, but it doesn't
              | scare me nearly as much as the government doing it. In
              | fact, the principal risk from corporations keeping
              | databases is giving the government something to snatch.
        
               | whimsicalism wrote:
                | because the government has a monopoly on violence. I
                | would much prefer that to corporations being able to wage
                | war themselves
        
               | lupusreal wrote:
               | Who is arguing for corporations to wage war? What an
               | absolutely insane strawman. What I am arguing against is
               | letting governments grant themselves the ability to spy
               | on their own populations on an unprecedented scale,
                | because governments _"waging war"_ (mass murder) against
               | their own people is a historically common occurrence.
        
           | immibis wrote:
           | The EU has multiple parts. One part keeps asking for chat
           | control, and another part keeps saying no.
        
         | jlarocco wrote:
         | The cognitive dissonance is in the voters and users.
         | 
         | Even right here on HN, where most people understand the issue,
         | you'll see conversations and arguments in favor of letting
         | companies vacuum up as much data and user info as they want
         | (without consent or opt-in), while also saying it should be
         | illegal for the government to collect the same data without a
         | warrant.
         | 
         | In practice, the corporations and government have found the
         | best of both worlds: https://www.wired.com/story/fbi-purchase-
         | location-data-wray-... Profit for the corporation, legal user
         | data for the government.
        
           | spacemadness wrote:
           | HN is filled with folks that wrote the code in question, or
           | want to create similar products. And they hate to have it
           | pointed out that these tools may cause harm so they thrash
           | around and make excuses and point fingers. A tale as old as
           | this site.
        
             | mrmetanoia wrote:
             | I often have to remind myself who hosts this board and that
             | I am hanging out on a site for successful and aspiring
             | techno-robber-barons.
        
               | sabbaticaldev wrote:
               | > I am hanging out on a site for successful and aspiring
               | techno-robber-barons.
               | 
                | that's how we all first arrive here. Time passes though,
                | and most of us around here fail, and then we become
                | proper people capable of reasoning
        
               | 2OEH8eoCRo0 wrote:
               | Complete with egotistical and ironic appropriation of the
               | word _hacker_.
        
               | singleshot_ wrote:
                | Explaining that modern technology is user-hostile and
                | destructive to society is nowhere more on-topic than on
                | Paul Graham's ego blog. While it might be true to say the
                | site is "for" robber barons, there are a lot more users
                | here than the ones you described.
        
           | itronitron wrote:
           | And in Europe, everyone and their dog uses WhatsApp
        
           | BeetleB wrote:
           | Anti-disclaimer: I'm not one of those folks.
           | 
           | However, that's not at all a cognitive dissonance.
           | Fundamentally, there's a difference between governments and
           | private companies, and it is fairly basic to have different
           | rules for them. The government cannot impinge on free speech,
           | but almost all companies do. The government cannot restrict
           | religion, but to some extent, companies can. Etc.
           | 
           | Of course, in this case, it's understandable to argue that
           | _neither_ side should have that much data without consent.
            | But it's also totally understandable to allow only the
           | private company to do so.
        
             | jlarocco wrote:
             | There is fundamentally a difference between corporations
             | and the government, but it's still a cognitive dissonance.
             | These aren't the laws of physics - we _chose_ to have
             | different rules for the government and corporations in this
             | case.
             | 
             | There are plenty of cases where the same rules apply to
             | both the government and corporations.
        
           | doctorpangloss wrote:
            | There isn't a single intellectually honest harm associated
            | with the majority of app telemetry or with almost all ad data
            | collection. Like, go ahead and name one.
           | 
           | Once you say some vague demographic and bodily autonomy
           | stuff: you know, if you're going to invoke "voters," I've got
           | bad news for you. Some kinds of hate are popular. So you
           | can't pick and choose what popular stuff is good or what
           | popular stuff is bad. It has to be by some objective
           | criteria.
           | 
            | Anyway, I disagree with your assessment of the popular
            | position. I don't think there is really that much cognitive
            | dissonance among voters at all. People are sort of right not
            | to care. The FTC's position is really unpopular when framed
            | in the intellectually honest way, as it is in the EU: "here
            | is the price of the web service if you opt out of ads and
            | targeting."
           | 
           | You also have to decide if ad prices should go up or down,
           | and think deeply: do you want a world where ad inventory is
           | expensive? It is an escape valve for very powerful networks.
           | Your favorite political causes like reducing fossil fuel use
           | and bodily autonomy benefit from paid traffic all the same as
           | selling junk. The young beloved members of Congress innovate
           | in paid Meta campaign traffic. And maybe you run a startup or
           | work for one, and you want to compete against the vast
           | portfolio of products the network owners now sell. There's a
           | little bit of a chance with paid traffic but none if you
           | expect to play by organic content creation rules: it's the
           | same thing, but you are transferring money via meaningless
           | labor of making viral content instead of focusing on your
           | cause or business. And anyway, TikTok could always choose to
           | not show your video for any reason.
           | 
           | The intellectual framework against ad telemetry is really,
           | really weak. The FTC saying it doesn't change that.
        
             | arminiusreturns wrote:
              | The intelligence agencies literally use ad data to do
              | "targeted killing"; what are you even talking about?
             | 
             | Ex-NSA Chief: 'We Kill People Based on Metadata'...
        
               | doctorpangloss wrote:
               | Can you define a harm suffered by the people that the FTC
               | represents? What about the EU beneficiaries of the GDPR?
                | This is sincere; it is meant to advance a real and
                | interesting conversation.
        
               | arminiusreturns wrote:
               | I think privacy violations are a harm in themselves, but
               | you seem to have already dismissed this issue, so I'll
               | move on. How about behavioral manipulation via
               | microtargeting, economic harm via price discrimination,
               | reselling of the data via monetization to unscrupulous
               | aggregators or third parties, general security reduction
               | (data and metadata sets could be used for APT, etc), or
               | the chilling effect of being tracked all the time in this
               | way?
        
           | neuralRiot wrote:
           | >The cognitive dissonance is in the voters and users.
           | 
            | People really need to learn to say "NO", even if that means
            | an inconvenience. "Your personal information might be shared
            | with our business partners for metrics and a customer-
            | tailored experience": no thanks. "What is your phone number,
            | so I can give you a 10% discount?" No thanks. "Cash or
            | credit?" Cash, thanks. "Login with Google/Apple/blood
            | sample?" No thanks.
        
         | whimsicalism wrote:
         | It seems entirely reasonable/consistent that we would allow
         | some capabilities among publicly sanctioned, democratically
         | legitimate actors while prohibiting private actors from doing
         | the same.
         | 
         | In fact, many such things fall into that category.
        
         | daedrdev wrote:
         | I would be worried if the state was conscious of what it itself
         | was doing as a whole
        
         | bee_rider wrote:
          | It isn't cognitive dissonance; the state does lots of things
          | we're not supposed to do. Like, we're not supposed to kill
          | people, but they have whole departments built around the task.
         | 
         | Should the state do surveillance? Maybe some? Probably less?
         | But the hypocrisy isn't the problem, the overreach is.
        
         | kiba wrote:
         | There are different organizations with different opinions. The
         | government isn't a monolithic entity.
        
         | cvnahfn wrote:
         | The FTC is under the president's authority. This is election
         | pandering, same as Zuckerberg's backpedaling on government
         | censorship.
         | 
         | This is for getting votes from the undecided.
         | 
         | Everything will be back to normal (surveillance, data
         | collection and censorship) after the election.
        
           | singleshot_ wrote:
            | Begs the question of agency authority, which is manifestly
            | not resolved. You will find that the election's results will
            | affect the eventual resolution of the question of the unitary
            | executive quite dramatically.
        
           | munk-a wrote:
           | I don't know if you've been watching but the FTC has actually
           | been extremely proactive during this cycle. Lina Khan is an
           | excellent steward and has pushed for a lot of policy
           | improvements that have been sorely needed - including the ban
           | (currently suspended by a few judges) on non-competes.
           | 
           | It is disingenuous to accuse the FTC of election pandering
           | when they've been doing stuff like this for the past four
           | years consistently.
        
             | srndsnd wrote:
             | And has sued Amazon for their use of anti-competitive
             | pricing.
             | 
              | This is just what Khan's FTC _does_.
        
           | layer8 wrote:
            | The FTC is bipartisan; no more than three of the five
            | commissioners can belong to the same party. The present
            | report was approved unanimously by all five.
        
         | layer8 wrote:
         | Since the federal government isn't a single mind (nor a hive
         | mind), a cognitive dissonance can only be meaningfully located
         | on the observer's side.
        
       | vundercind wrote:
       | Behind the ball by 15 years to start taking this seriously and
       | beginning to _think about_ pushing back, but better late than
       | never.
       | 
        | Next, please rein in the CRAs.
        
         | devonbleak wrote:
         | It makes me irrationally angry that I suddenly started getting
         | spam emails from Experian. Like motherfucker I never consented
         | for you to have my data, then you leak it all, now you're
         | sending me unsolicited junk email? It's just such bullshit that
         | I'm literally forced to have a relationship with these
         | companies to freeze my credit or else I'm at the mercy of
         | whoever they decide to release my information to without my
         | authorization.
        
           | twoodfin wrote:
           | My pet solution has been to make the credit _reporters_
           | liable for transmitting false information to the CRAs.
           | 
            | Chase tells Experian I opened a new line of credit with them,
            | but it's later demonstrated that it was a scammer with my
            | SSN? Congratulations, $5,000 fine.
           | 
            | Of course this all gets priced into the cost and
           | availability of consumer credit. Good! Now the lenders have
           | an incentive to drive those costs down (cheaper, better
           | identity verification) to compete.
        
             | lotsofpulp wrote:
             | The solution is much simpler. Put all of the consequences
             | of being defrauded by a borrower onto the lender.
             | 
             | If a lender wants to be repaid, then they need to show the
             | borrower all the evidence they have for proof that the
             | borrower entered into the contract.
             | 
             | If all a lender has is the fact that a 9 digit number, date
             | of birth, name, and address were entered online, then the
             | borrower simply has to say "I did not enter that
             | information", and the lender can go pound sand.
             | 
             | Guarantee all the lenders will tighten up their operations
             | very quickly, and consequently, so will the loans that
             | appear on one's credit report.
        
               | ryandrake wrote:
               | Right. This is a problem between the lenders and the
               | people who stole from the lenders. The person whose
               | name/number was used shouldn't even be part of the
               | transaction or part of the problem.
               | 
               | They call it "Identity Theft" instead of what it should
               | be called: Bank fraud. The term "Identity Theft" 1.
               | needlessly pulls an otherwise uninvolved person into the
               | mix, suddenly making it _their_ problem too, and 2.
                | downplays the bank's negligence.
               | 
               | If someone uses my name to take out a loan, and the bank
               | stupidly lets them, this shouldn't even remotely be my
               | problem. I shouldn't even have to know about it. This is
               | the bank's problem from their own stupidity.
        
               | twoodfin wrote:
               | Lenders hand over bad loans to collection agencies
               | ("accept the consequences") all the time. Cost of doing
               | business. That an innocent person's credit is destroyed
               | is just collateral damage from their perspective.
        
               | sib wrote:
               | "Put all of the consequences of being defrauded by a
               | borrower onto the lender" - that seems a bit strange.
               | 
               | Imagine saying "put all of the consequences of getting
               | robbed onto the bank, not the robber"
        
               | lotsofpulp wrote:
               | Who bears the consequences of their home being robbed? Or
               | mugged on the street? Or a contractor taking payment for
               | services and then disappearing?
               | 
                | Why are we subsidizing lenders by putting this
               | ridiculous burden on people who have nothing to do with
               | the lender's business?
               | 
               | The lender can pay to appropriately verify their
               | borrower's identity, or go to court and sue for damages
               | like everyone else has to.
        
             | trinsic2 wrote:
             | Can you describe how you make them liable in this
             | arrangement?
        
               | twoodfin wrote:
                | You can challenge entries on your credit report today.
                | Win the challenge, and whoever reported the entry is
                | liable to the Feds. Maybe add a modest bounty for the
                | injured taxpayer.
        
           | nicholasjarnold wrote:
           | Yep. It sucks. Zero consequences of any import for those
           | companies as far as I'm aware too. Tiny fines end up being
           | "cost of doing business". Then they get to externalize their
           | failures onto us by using terms like "Identity Theft", which
           | indicates something was stolen from ME and is now MY problem.
           | 
           | In actuality some not-well-maintained systems owned by <corp>
           | were hacked or exposed or someone perpetrated fraud on a
           | financial institution and happened to use information that
           | identifies me. It's really backwards.
           | 
           | PSA: If you haven't already, go freeze your credit at
           | Experian, TransUnion, Equifax and Innovis. It will make the
           | perpetration of this type of fraud much more difficult for
           | adversaries.
        
             | singleshot_ wrote:
             | PSA pro tip: they will try to steer you toward "locking"
             | your account. Don't fall for it. Freeze your account.
        
           | rkagerer wrote:
           | That's not an irrational reaction.
        
           | conradev wrote:
           | For a while they were sending emails about my account that I
           | was actually _unable_ to unsubscribe from[1]. I _knew_ it was
           | illegal at the time, and when I finally noticed an
           | unsubscribe button it was because the FTC finally
           | intervened[2].
           | 
            | https://www.reddit.com/r/assholedesign/comments/udy8rz/exper...
           | 
           | https://www.ftc.gov/news-events/news/press-
           | releases/2023/08/...
        
           | itronitron wrote:
           | https://www.youtube.com/watch?v=CS9ptA3Ya9E
        
         | flycaliguy wrote:
         | I think Snowden was bang on when in 2013 he warned us of a last
         | chance to fight for some basic digital privacy rights. I think
         | there was a cultural window there which has now closed.
        
           | orthecreedence wrote:
           | Snowden pointed and everyone looked at his finger. It was a
           | huge shame, but a cultural sign that the US is descending
           | into a surveillance hell hole _and people are ok with that_.
           | As someone who was (and still is) vehemently against PRISM
            | and NSLs and all that, it was hard to come to terms with.
            | I'm going to keep building things that circumvent the
            | "empire" and hope people start caring eventually.
        
             | Clubber wrote:
             | >and people are ok with that.
             | 
             | All the propagandists said he was a Russian asset, as if
             | even if that were true, it somehow negated the fact that we
             | were now living under a surveillance state.
             | 
             | >Snowden pointed and everyone looked at his finger.
             | 
             | This is a great way of putting it.
        
               | lesuorac wrote:
               | > it somehow negated the fact that we were now living
               | under a surveillance state.
               | 
                | There have long been surveillance programs and also
                | numerous laws outlining the responsibilities of telecom
                | providers to enable wiretapping.
               | 
               | There's really nothing new from Snowden besides the names
               | of a bunch of people to go kill cause they're spies.
               | 
               | FISA [1] isn't a private law either.
               | 
               | https://en.wikipedia.org/wiki/COINTELPRO
               | 
               | https://en.wikipedia.org/wiki/Mass_surveillance_in_the_Un
               | ite...
               | 
                | Note: 2006 (Klein) predates 2013 (Snowden)
               | 
               | https://en.wikipedia.org/wiki/Room_641A
               | 
               | [1]: https://en.wikipedia.org/wiki/Foreign_Intelligence_S
               | urveilla...
        
               | Clubber wrote:
               | >There's really nothing new from Snowden besides the
               | names of a bunch of people to go kill cause they're
               | spies.
               | 
               | https://en.wikipedia.org/wiki/2010s_global_surveillance_d
               | isc...
        
               | lesuorac wrote:
               | You are dense. Imagine a government authorizes 10B for a
               | bridge and then in 5 years a bridge shows up.
               | 
               | Now instead, imagine in 1978 [1] a government authorizes
               | "United States federal law that establishes procedures
               | for the surveillance and collection of foreign
               | intelligence on domestic soil" and in 2008 [2] amends it
               | to not be a big deal if they're foreign or not and then 5
               | years later it turns out they're doing just that.
               | 
               | These bills are not secret. Were not secret. Have never
               | been secret. It's not my fault you didn't read them but
               | it doesn't make Snowden novel.
               | 
               | [1]: https://en.wikipedia.org/wiki/Foreign_Intelligence_S
               | urveilla...
               | 
               | [2]: https://en.wikipedia.org/wiki/Foreign_Intelligence_S
               | urveilla...
        
               | Clubber wrote:
               | >You are dense.
               | 
               | Well, maybe you're one of those propagandists. If you
               | can't attack the idea, attack the person, right?
               | 
               |  _Hand waves, nothing new to see here, carry on._
               | 
               | The bills aren't what were exposed, it was more the
               | techniques and scope. Like PRISM and XKeyScore and
               | circumventing laws by sharing intelligence on US citizens
               | with allies who aren't restricted by US laws. Spying on
               | allied governments, etc. You know, that stuff.
               | 
               | You should really click on the link.
               | 
               | https://en.wikipedia.org/wiki/2010s_global_surveillance_d
               | isc...
        
               | dialup_sounds wrote:
               | https://en.wikipedia.org/wiki/Project_SHAMROCK
               | 
               | https://en.wikipedia.org/wiki/Martin_and_Mitchell_defecti
               | on
               | 
               | https://en.wikipedia.org/wiki/Church_Committee
               | 
               | https://en.wikipedia.org/wiki/ECHELON
               | 
               | Et cetera. These aren't new issues. The obsession with
               | Snowden as a messianic figure is unhelpful in
               | contextualizing the information.
        
               | simoncion wrote:
               | > There's long been surveillance programs and also
               | numerous laws outlining the responsibilities of telecom
               | provides to enable wire tapping.
               | 
               | Laws which the telecoms were knowingly and willfully
               | breaking for years.
               | 
               | You do remember that Congress gave them retroactive
               | immunity? [0][1] You do know that this was only granted
               | because people COULD sue (and were suing) them because of
               | the information made public by Snowden and others?
               | 
               | [0] <https://www.aclu.org/news/national-
               | security/retroactive-tele...>
               | 
                | [1] See Title II of this bill
               | <https://www.congress.gov/bill/110th-congress/house-
               | bill/6304>
        
             | digging wrote:
             | > and people are ok with that
             | 
             | I've seen no evidence of this. People mostly either don't
              | understand it or feel powerless against it.
        
               | immibis wrote:
               | I've seen no evidence people aren't ok with that. Most
               | people around me didn't care about the Snowden
               | revelations. It was only tech people who tightened up
               | security.
        
               | orthecreedence wrote:
               | This is my experience as well. I talked to a LOT of
               | people after the Snowden debacle (techies and otherwise)
               | and the general attitude was "so what? they aren't using
               | the information for anything bad!" or "I have nothing to
               | hide!" (in this thread, for instance:
               | https://news.ycombinator.com/item?id=41594775)
               | 
               | I think people don't really understand what an enormous
               | sleeping dragon the entire thing is.
        
               | digging wrote:
               | > I think people don't really understand what an enormous
               | sleeping dragon the entire thing is.
               | 
               | Isn't that what I said? Mostly we're debating semantics.
               | My deeper point is that it's counterproductive and
               | borderline misanthropic to argue "People just don't care
               | about evil being done!" whereas the argument that "People
               | seriously have no idea yet what they're 'agreeing' to"
               | opens the door to actual solutions, for one inclined to
               | work on them.
        
               | dylan604 wrote:
                | There's also a vast number of people who were just too
                | young to be aware of Snowden's revelations. These people
                | are now primarily on TikTok and whatnot, and I doubt
                | there's much in those feeds to bring the revelations to
                | light, even as the feeds directly feed the beast of data
                | hoarding.
        
               | davisr wrote:
               | > I've seen no evidence of this
               | 
               | Over 99% of Americans point a camera at themselves while
               | they take a shit.
        
               | lcnPylGDnU4H9OF wrote:
               | And I'd bet over 99% of those people have never once
               | considered that said camera could even be capable of
               | saving any data without them operating it.
        
               | davisr wrote:
               | Very doubtful they've not considered it. When I go to
               | coffee shops, I see maybe a quarter-to-half the laptops
               | have a shade over the webcam. But when I see people using
               | their phones, I've never once seen them use a shade,
               | piece of tape, or post-it note.
               | 
               | They use the front-facing camera of their phone so often
               | that the temporary inconvenience of removing a shade
               | outweighs the long-term inconvenience of malware snapping
               | an exposing photo.
        
               | chiefalchemist wrote:
               | The cover over the webcam might not be for security per
               | se. It could be they don't want anyone at work - or home?
               | - to accidentally see where they are. If you cover the
                | camera, you don't have to worry about any such accidents.
               | 
                | My gut says that for most people that's the reason.
        
               | doctorpangloss wrote:
               | Snowden couldn't convince people that the privacy he was
               | talking about meant a limit on government power. Not
               | sensitive data. And honestly, nobody cares about anyone
               | taking a shit.
               | 
               | You can advocate for limiting govt. power ("LGP") without
               | leaking any NSA docs. I don't think a single story about
               | "LGP" changed due to the leaks. Everyone knows the
                | government can do a lot of violence to you. So it's very
               | hard.
               | 
               | If you're a high drama personality, yeah you can conflate
               | all these nuanced issues. You can make privacy mean
               | whatever you want.
        
               | ajsnigrutin wrote:
               | But won't you think of the children!
               | 
               | (EU is trying to implement chat control again...)
               | 
                | We need more real-world analogies... "see, this is like
                | having a microphone recording everything you say in this
                | bar"... "see, this is like someone ID-ing you in front of
                | every store and recording which stores you've visited,
                | and then following you inside to see what products you
                | look at. See, this is like someone looking at your
                | clothes and then pasting higher price tags on products.
                | ..."
        
             | EGreg wrote:
             | Over ten years ago I wrote about the root of the problem:
             | https://magarshak.com/blog/?p=169
             | 
             | And here is a libertarian solution:
             | https://qbix.com/blog/2019/03/08/how-qbix-platform-can-
             | chang...
        
         | newsclues wrote:
         | The long term consequences of 9/11.
        
       | nabla9 wrote:
       | Facebook Employees Explain Struggling To Care About Company's
       | Unethical Practices When Gig So Cushy
       | https://www.youtube.com/watch?v=-DiBc1vkTig
        
         | xyst wrote:
          | an Onion parody/satire video, lol
        
       | ChrisArchitect wrote:
       | Related earlier this week:
       | 
       |  _Instagram Teen Accounts_
       | 
       | https://news.ycombinator.com/item?id=41572041
        
       | mgraczyk wrote:
       | "these surveillance practices can endanger people's privacy,
       | threaten their freedoms, and expose them to a host of harms, from
        | identity theft to stalking."
       | 
       | Is there any evidence that any of these things have ever happened
       | as a result of this sort of data collection? I'm not talking
       | about data posted to social media, I'm talking about the specific
       | data collection described in this FTC press release.
        
         | mu53 wrote:
         | I have been stalked and harassed by an Apple employee using
         | data they were able to glean from their access at Apple.
         | 
         | The impossible part is proving the abuse. All of these
          | companies keep their databases, access controls, and everything
          | they possibly can about these data lakes secret. The simple
          | fact of the matter is that you will never have any evidence
          | that someone looked you up in a database.
         | 
         | It is really easy to walk the line, but be obvious enough to
         | intimidate.
        
           | mgraczyk wrote:
           | Apple wasn't listed and (outside the app store) doesn't
           | collect the data described in the press release.
        
             | drawkward wrote:
             | So imagine the possible abuses by people at companies who
             | do.
        
             | stiffenoxygen wrote:
             | They mentioned practices that corporations do. I think any
             | corporation that collects data on you counts here. I don't
              | think it's worth it to only talk about the examples provided
             | in the article.
        
             | stiffenoxygen wrote:
              | They absolutely do. In fact, they even tried to encrypt
              | user data to not be as invasive as other companies, but the
              | FBI sued them and said no, you can't do that; you need to
              | keep that data so we can subpoena you.
        
         | orthecreedence wrote:
         | Your comment is really coming across as "well, nothing bad has
         | happened yet so who cares?" If that's not the case, please let
         | me know how you meant it. If it is the case, surely you can
         | imagine a world in which dragnet surveillance of people who
         | have an expectation of privacy can be abused by corporations,
         | institutions, or private individuals. It really doesn't take a
         | lot of imagination to picture this world.
        
           | mgraczyk wrote:
           | It's been ubiquitous for around 20 years now (Google started
           | doing mass surveillance for display ads in the early 2000s)
           | and nothing bad has happened, so yes that's my point.
           | 
           | If nothing bad happens for decades, and that is inconsistent
           | with your model of danger, then the model is probably wrong
        
             | orthecreedence wrote:
             | Your argument boils down to "yes, someone has had a gun
             | pointed at my head for quite some time now, but they
             | haven't pulled the trigger yet so I don't see the problem."
        
               | mgraczyk wrote:
               | No, I'm arguing that it's not actually a gun, and my
               | evidence is that there are 2 billion "guns" that have
               | been pointed at 2 billion people's heads for years, and
               | nobody has been hurt.
               | 
               | It's more like a flashlight than a gun
        
               | orthecreedence wrote:
               | > It's more like a flashlight than a gun
               | 
               | I disagree, and again, implore you to use your
                | imagination. If private messages (_not just yours but
                | someone else's_) were to suddenly be public or
               | institutional knowledge, what damning things might
               | happen? What influence might some have over others? What
               | dynamics could or would shift as a result?
               | 
               | I'm comfortable making the claim that you aren't really
               | thinking this through, at all, in any meaningful way.
        
               | immibis wrote:
               | What was the fallout last time this happened? Was it like
               | pulling the triggers of guns pointed at people's heads?
        
               | mgraczyk wrote:
               | The FTC press release is not talking about private
               | messages, that is not the kind of data they are asking to
               | protect. Private messages are already generally protected
               | in the way the FTC is asking for.
        
             | ryandrake wrote:
             | If you don't think anything bad happens from personal data
             | being accessed without one's consent, please reply to this
             | comment and share:
             | 
             | 1. Your full name
             | 
             | 2. Your home address
             | 
             | 3. Your social security number (if you're American)
             | 
             | 4. Your mother's maiden name
             | 
             | If you're right, then you have nothing to worry about.
        
               | mgraczyk wrote:
               | None of this data is included in the FTC report. They are
               | not talking about this.
               | 
               | My full name is Michael Graczyk, I live in San Francisco,
               | none of these companies know any more detail than that
               | about the questions you asked
        
               | lcnPylGDnU4H9OF wrote:
               | > none of these companies know any more detail than that
               | about the questions you asked
               | 
               | I suspect you mean that you haven't provided these
               | companies with these details. What reason do you have to
               | think they don't know those details?
        
               | tway_GdBRwW wrote:
               | Michael, I disagree with your point but I recognize your
               | integrity. You just posted your name and city, and your
               | HN profile shares more personal information.
               | 
               | I respect that you are willing to stand behind your
               | claim. Best of success with your current venture.
        
             | tway_GdBRwW wrote:
             | > nothing bad has happened
             | 
             | ummm, WTF?
             | 
             | 10x increase in teen suicide doesn't qualify as "bad"?
             | 
             | or repeated DOJ lawsuits against Facebook because their
             | advertising practices result in highly effective racial
             | discrimination?
        
         | drawkward wrote:
         | Wait for the AI tools Larry Ellison wants to give to law
         | enforcement to retroactively connect/hallucinate the dots.
        
         | dogman144 wrote:
          | Not only is there evidence of harms, there is a whole industry
          | focused on fixing the problem for those wealthy enough or
          | incentivized enough to care.
         | 
         | Do a bit of googling, but ADINT and RTB tracking will get you
         | there for search terms.
         | 
          | Or, continue being confidently dismissive of something serious
          | people are taking very seriously. I am sorry if this FTC report
          | targeted the source of your RSUs or some otherwise motivated
          | set of incentives, but there's no free lunch. The consequences
          | of your viewpoint, held collectively over the last decade, are
          | finally landing.
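          | 
          | For a rough sense of what "RTB tracking" means in practice,
          | here is a minimal sketch of the kind of bid request an ad
          | exchange broadcasts to bidders (illustrative Python; field
          | names loosely follow the OpenRTB 2.x spec, and the values are
          | made up):
          | 
          |     bid_request = {
          |         "id": "auction-123",
          |         "imp": [{"id": "1", "banner": {"w": 320, "h": 50}}],
          |         "app": {"bundle": "com.example.flashlight"},
          |         "device": {
          |             # hypothetical advertising ID
          |             "ifa": "00000000-0000-0000-0000-000000000000",
          |             "geo": {"lat": 37.77, "lon": -122.42},
          |             "ip": "203.0.113.7",
          |             "os": "iOS",
          |         },
          |         "user": {"id": "exchange-user-4711"},
          |     }
          |     # Every bidder that receives this can log it, win or lose
          |     # the auction; that logging is what ADINT-style tracking
          |     # and data-broker resale build on.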
        
           | mgraczyk wrote:
           | > targeted the source of your RSUs or otherwise motivated
           | 
           | I don't currently have any financial interest in any of these
           | companies
           | 
           | > but ADINT and RTB tracking will get you there for search
           | terms.
           | 
            | These are good things. Do you have any examples of harm that
            | has been caused by ADINT or RTB? Prosecuting criminals
            | doesn't count for me.
        
       | srndsnd wrote:
       | To me, what's missing from that set of recommendations is some
       | method to increase the liability of companies who mishandle user
       | data.
       | 
       | It is insane to me that I can be notified via physical mail of
       | months old data breaches, some of which contained my Social
       | Security number, and that my only recourse is to set credit
       | freezes from multiple credit bureaus.
        
         | alsetmusic wrote:
         | Regulation is key, but I don't see it as likely when our
         | society is poisoned by culture war bs. Once we put that behind
          | us (currently unlikely), we can pass sane laws reining in huge
          | corporations.
        
         | zeroonetwothree wrote:
         | If you aren't directly harmed yet what liability would they
         | have? I imagine if your identity is stolen and it can be tied
         | to a breach then they would already be liable.
        
           | drawkward wrote:
           | Surveillance apologist.
        
           | kibwen wrote:
           | The fact that my data can be stolen in the first place is
           | already outrageous, because I neither consented to allowing
           | these companies to have my data, nor benefit from them having
           | my data.
           | 
           | It's like if you go to an AirBNB and the owner sneaks in at
           | night and takes photos of you sleeping naked and keeps those
           | photos in a folder on his bookshelf. Would you be okay with
           | that? If you're not directly harmed, what liability would
           | they have?
           | 
           | Personal data should be radioactive. Any company retaining it
           | better have a damn good reason, and if not then their company
           | should be burned to the ground and the owners clapped in
           | irons. And before anyone asks, "personalized advertisements"
           | is not a good reason.
        
             | pc86 wrote:
             | I mean it's pretty clear that you _are_ directly harmed if
             | someone takes naked photos of you without your knowledge or
              | consent and then keeps them. It's not a good analogy, so if
             | we want to convince people like the GP of the points you're
             | making, you need to make a good case because that is not
             | how the law is currently structured. "I don't like ads" is
             | not a good reason, and comments like this that are seething
             | with rage and hyperbole don't convince anyone of anything.
        
               | drawkward wrote:
               | What is the harm? It is not obvious to me, if the victim
               | is unaware...unless you are alleging simply that there is
               | some ill-defined right to privacy. But if that is so, why
               | does it apply to my crotch and not my personal data?
        
               | simoncion wrote:
               | These are exactly my questions. If I never, ever know
               | about those pictures and never, ever have my life
               | affected by those pictures, what is the _actual_ harm to
               | me?
               | 
               | If the answer to them ends up being "Well, it's illegal
               | to take non-consensual nudie pictures.", then my follow-
               | up question is "So, why isn't the failure to protect my
               | personal information also illegal?".
               | 
               | To be perfectly clear, I do believe that the scenario
               | kibwen describes SHOULD be illegal. But I ALSO believe
               | that it should be SUPER illegal for a company to fail to
               | secure data that it has on me. Regardless of whether they
               | are retaining that information because there is literally
               | no way they could provide me with the service I'm paying
               | them for without it, or if they're only retaining that
               | information in the hopes of making a few pennies off of
               | it by selling it to data brokers or whoever, they
               | _should_ have a VERY SERIOUS legal obligation to keep
               | that information safe and secure.
        
               | lcnPylGDnU4H9OF wrote:
               | > to fail to secure data that it has on me
               | 
               | Just want to point out that the company is usually also
               | doing what it can to get other information about you
               | without your consent based on other information it has
               | about you. It's a lot closer to the "taking non-
               | consensual nudie pictures" than "fail to secure data"
               | makes it sound.
        
               | JumpCrisscross wrote:
                | > _it's pretty clear that you are directly harmed if
               | someone takes naked photos of you without your knowledge
               | or consent and then keeps them_
               | 
               | Sure. In those cases, there are damages and that creates
               | liability. I'm not sure what damages I've ever faced from
               | any leak of _e.g._ my SSN.
        
               | pixl97 wrote:
                | I mean, most people won't until the day they find out
                | there's a house in Idaho under their name (and yes, I've
                | seen just this happen).
                | 
                | The problem here is that, because of all these little
                | data leaks, you as an individual now bear the cost of
                | ensuring that others out there are not using your
                | identity, and if it happens you have to clean up the mess
                | by pleading it wasn't you in the first place.
        
             | lesuorac wrote:
              | I don't think that's a proper parallel.
             | 
             | I think a better example would be You (AirBnB Host) rent a
             | house to Person and Person loses the house key. Later on
             | (perhaps many years later), You are robbed. Does Person
             | have liability for the robbery?
             | 
              | Of course it also gets really muddy because you'll have
              | been renting the house out for those years and during that
              | time many people will have lost keys. So does liability get
              | divided? Is it the most recent lost key?
             | 
             | Personally, I think it should just be some statutory
             | damages of probably a very small amount per piece of data.
        
               | pixl97 wrote:
               | The particular problem comes in because the amount of
               | data lost tends to be massive when these breaches occur.
               | 
               | It's kind of like the idea of robbing a minute from
                | someone's life. It's not very much to an individual, but
               | across large populations it's a massive theft.
        
               | lesuorac wrote:
                | Sure, and if you pay a statutory fine times 10 million,
                | then it becomes a big deal, and therefore companies would
                | be incentivized to protect it better the larger they get.
               | 
               | Right now they probably get some near free rate to offer
               | you credit monitoring and dgaf.
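                | 
                | Back-of-the-envelope sketch of the scaling (illustrative
                | Python; the per-record amount is a made-up assumption,
                | not anything actually proposed):
                | 
                |     records_leaked = 10_000_000
                |     fine_per_record = 5   # dollars, hypothetical
                |     total = records_leaked * fine_per_record
                |     print(f"${total:,}")  # $50,000,000
                | 
                | Trivial per person, but large enough in aggregate to
                | change the "keep everything, offer credit monitoring
                | later" calculus.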
        
               | polygamous_bat wrote:
               | > I think a better example would be You (AirBnB Host)
               | rent a house to Person and Person loses the house key.
               | 
                | This is not a direct analogue; a closer analogy would be
                | when the guest creates a copy of the key (why?) without
                | my direct consent (signing a 2138-page "user agreement"
                | doesn't count) and then, at some later point when I am no
                | longer renting to them, loses the key.
        
               | lesuorac wrote:
               | I'm still much more interested in the answer to who is
               | liable for the robbery.
               | 
               | Just the Robber? Or are any of the key-copiers (instead
               | of losers w/e) also?
        
               | 8note wrote:
                | This version loses multiple parts of things that are
                | important:
                | 
                | 1. I have no control over what was stored
                | 2. I have no control over where the storage is
                | 
                | The liability in this case lies with the homeowner/host,
                | as you had (and should have used) full ability to change
                | out the locks.
               | 
               | To make it more similar, I think you'd need one of the
               | guests to have taken some amount of art off the wall, and
               | brought it to a storage unit, and then the art later was
               | stolen from the storage unit, and you don't have access
               | to the storage unit.
               | 
               | It's not as good as the naked pictures example because
               | what's been taken is copies of something sensitive, not
               | the whole thing
        
             | ryandrake wrote:
             | That's the big problem with relying on tort law to curb
             | this kind of bad corporate behavior: The plaintiff has to
             | show actual injury or harm. This kind of bad behavior
             | _should_ be criminal, and the state should be going after
             | companies.
        
             | JumpCrisscross wrote:
             | > _before anyone asks, "personalized advertisements" is not
             | a good reason_
             | 
             | The good reason is growth. Our AI sector is based on, in
             | large part, the fruits of these data. Maybe it's all
             | baloney, I don't know. But those are jobs, investment and
             | taxes that _e.g._ Europe has skipped out on that America
             | and China are capitalising on.
             | 
             | My point, by the way, isn't pro surveillance. I enjoy my
             | privacy. But blanket labelling personal data as radioactive
             | doesn't seem to have any benefit to it outside emotional
             | comfort. Instead, we need to do a better job of specifying
             | _which_ data are harmful to accumulate and why. SSNs are
             | obviously not an issue. Data that can be used to target
             | _e.g._ election misinformation are.
        
               | thfuran wrote:
               | So you're saying it's all vastly valuable and that's why
               | it is right that it is taken without consent or
               | compensation?
        
             | ranger_danger wrote:
             | >I neither consented to allowing these companies to have my
             | data, nor benefit from them having my data.
             | 
             | I think both of those are debatable.
        
           | drawkward wrote:
           | Go ahead, post your phone number here. It's not directly
           | harmful.
        
           | idle_zealot wrote:
           | That's the whole problem with "liability", isn't it? If the
           | harms you do are diffuse enough then nobody can sue you!
        
           | bunderbunder wrote:
           | This is exactly why thinking of it in terms of individual
           | cases of actual harm, as Americans have been conditioned to
           | do by default, is precisely the wrong way to think about it.
           | We're all familiar with the phrase "an ounce of prevention is
           | worth a pound of cure", right?
           | 
            | It's better to think of it in terms of prevention. This
            | fits into a category of things where we _know_ they create a
            | disproportionate risk of harm, and we therefore decide that
            | the behavior just shouldn't be allowed in the first place.
           | This is why there are building codes that don't allow certain
           | ways of doing the plumbing that tend to lead to increased
           | risk of raw sewage flowing into living spaces. The point
           | isn't to punish people for getting poop water all over
           | someone's nice clean carpet; the point is to keep the poop
           | water from soaking the carpet in the first place.
        
             | supertrope wrote:
             | Safety rules are written in blood. After a disaster there's
             | a push to regulate. After enough years we only see the
             | costs of the rules and not the prevented injuries and
             | damage. The safety regulations are then considered annoying
             | and burdensome to businesses. Rules are repealed or left
             | unenforced. There is another disaster...
        
               | bunderbunder wrote:
               | Tangentially, there was an internet kerfuffle about
               | someone getting in trouble for having flower planters
               | hanging out the window of their Manhattan high rise
               | apartment a while back, and people's responses really
               | struck me.
               | 
               | People from less dense areas generally saw this as
               | draconian nanny state absurdity. People who had spent
               | time living in dense urban areas with high rise
               | residential buildings, on the other hand, were more
               | likely to think, "Yeah, duh, this rule makes perfect
               | sense."
               | 
               | Similarly, I've noticed that my fellow data scientists
               | are MUCH less likely to have social media accounts. I'd
               | like to think it's because we are more likely to
               | understand the kinds of harm that are possible with this
               | kind of data collection, and just how irreparable that
               | harm can be.
               | 
               | Perhaps Americans are less likely to support Europe-style
               | privacy rules than Europeans are because Americans are
               | less likely than Europeans to know people who saw first-
               | hand some of what was happening in Europe in the 20th
               | century.
        
           | halJordan wrote:
           | This is the traditional way of thinking, and a good question,
           | but it is not the only way.
           | 
            | An able-bodied person can fully make complaints against any
            | business that fails its Americans with Disabilities Act
            | obligation. In fact, these complaints by able-bodied well-
            | doers are the de facto enforcement mechanism, even though
            | these people can never suffer damage from that failure.
           | 
           | The answer is simply to legislate the liability into
           | existence.
        
           | squeaky-clean wrote:
           | The same way you can get ticketed for speeding in your car
           | despite not actually hitting anyone or anything.
        
         | 2OEH8eoCRo0 wrote:
         | I get a feeling that liability is the missing piece in a lot of
         | these issues. Section 230? Liability. Protection of personal
         | data? Liability. Minors viewing porn? Liability.
         | 
         | Lack of liability is screwing up the incentive structure.
        
           | brookst wrote:
           | I think I agree, but people will have very different views
           | on where liability should fall, and on whether there is a
           | malicious / negligent / no-fault model.
           | 
           | Section 230? Is it the platform or the originating user
           | that's liable?
           | 
           | Protection of personal data? Is there a standard of care
           | beyond which liability lapses (e.g. a nation state supply
           | chain attack exfiltrates encrypted data and keys are broken
           | due to novel quantum attack)?
           | 
           | Minors viewing porn? Is it the parents, the ISP, the
           | distributor, or the creator that's liable?
           | 
           | I'm not here to argue specific answers, just saying that
           | everyone will agree liability would fix this, and few will
           | agree on who should be liable for what.
        
             | TheOtherHobbes wrote:
             | It's not a solvable problem. Like most tech problems it's
             | political, not technical. There is no way to balance the
             | competing demands of privacy, security, legality, and
             | corporate overreach.
             | 
             | It might be solvable with some kind of ID escrow, where an
             | independent international agency managed ID as a not-for-
             | profit service. Users would have a unique biometrically-
             | tagged ID, ID confirmation would be handled by the agency,
             | ID and user behaviour tracking would be disallowed by
             | default and only allowed under strictly monitored
             | conditions, and law enforcement requests would go through
             | strict vetting.
             | 
             | It's not hard to see why that will never happen in today's
             | world.
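
               A minimal sketch of the "ID escrow" idea above (Python
               with PyNaCl signatures; the agency, field names, and flow
               are illustrative assumptions, not any real system's API):
               the agency verifies the person and hands the service only
               a signed "verified" attestation, never the identity
               itself.

                   import json, time
                   from nacl.signing import SigningKey

                   agency_key = SigningKey.generate()         # held only by the agency
                   agency_verify_key = agency_key.verify_key  # published for services

                   def attest(session_id: str) -> bytes:
                       """Agency side: sign a claim that this session passed
                       ID verification, without naming the person."""
                       claim = json.dumps({"session": session_id,
                                           "verified": True,
                                           "ts": time.time()})
                       return agency_key.sign(claim.encode())

                   def service_accepts(signed_claim: bytes) -> bool:
                       """Service side: check the agency's signature; the
                       real ID never reaches the service."""
                       try:
                           agency_verify_key.verify(signed_claim)
                           return True
                       except Exception:
                           return False

                   print(service_accepts(attest("session-42")))  # True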
        
               | malfist wrote:
               | > It's not a solvable problem
               | 
               | Lawnmower manufacturers said the same thing about making
               | safe lawnmowers. Until government regulations forced them
                | to.
        
               | ToucanLoucan wrote:
               | https://i.imgur.com/mXU28ta.jpeg
               | 
               | Specifically, 1970.
        
             | StanislavPetrov wrote:
             | >Protection of personal data? Is there a standard of care
             | beyond which liability lapses (e.g. a nation state supply
             | chain attack exfiltrates encrypted data and keys are broken
             | due to novel quantum attack)?
             | 
             | There absolutely should be, especially for personal data
             | collected and stored without the express written consent of
             | those being surveilled. They should have to get people to
             | sign off on the risks of having their personal data
             | collected and stored, be legally prevented from collecting
             | and storing the personal data of people who haven't
             | consented and/or be liable for any leaking or unlawful
             | sharing/selling of this data.
        
         | arminiusreturns wrote:
         | I agree. Let me tell you about what just happened to me. After
         | a very public burnout and spiral, a friend rescued me and I
         | took a part time gig helping a credit card processing company.
         | About 2 months ago, the owner needed something done while I
         | was out, and got their Uber driver to send an email. They
         | emailed the entire customer database, including bank
         | accounts, socials, names, addresses, and finance data, to a
         | single customer. When I found out (it had been kept hidden
         | from me for 11 days), I said, "This is a big deal; here are
         | all the remediations, and besides PCI we have 45 days by law
         | to notify affected customers." The owner said, "We aren't
         | going to do that," and thus I had to turn in my resignation
         | and am now unemployed again.
         | 
         | So, for trying to do the right thing, I am now scrambling
         | for work, while the offender pretends nothing happened after
         | potentially exposing the entire customer base, and will
         | likely suffer no penalty unless I report it to PCI, which I
         | would get no reward for.
         | 
         | Why is it that everywhere I go, management is always doing
         | shady stuff? I just want to do linuxy/datacentery things for
         | someone who's honest... /cry
         | 
         | My mega side project isn't close enough to do a premature
         | launch yet. Despite my entire plan being to forgo VC/investors,
         | I'm now considering compromising.
        
           | mikeodds wrote:
           | As in.. his actual Uber driver? He just handed his laptop
           | over?
        
             | arminiusreturns wrote:
             | Yes. The owner is old, and going blind, but refuses to sell
             | or hand over day to day ops to someone else, and thus must
             | ask for help on almost everything. I even pulled on my
             | network to find a big processor with a good reputation to
             | buy the company, but after constant delays and excuses for
             | not engaging with them, I realized to the owner the
             | business is both their "baby" and their social life,
             | neither of which they want to lose.
        
           | TinyRick wrote:
           | Why would you resign? You could have reported it yourself and
           | then you would have whistleblower protections - if the
           | company retaliated against you (e.g. fired you), you then
           | would have had a strong lawsuit.
        
             | arminiusreturns wrote:
             | Because I don't want to be associated with companies that
             | break the law and violate regulations knowingly. I've
             | long had a reputation for integrity, and it's one of the
             | few things I have left, having almost nothing else.
        
               | TinyRick wrote:
               | So you would rather be known as someone who had an
               | opportunity to report a violation, and chose not to? From
                | my perspective it seems like you decided _against_ acting
               | with integrity in this situation - the moral thing would
               | have been to report the violation, but you chose to look
               | the other way and resign.
        
               | qup wrote:
               | I wonder if I was part of the database that got emailed.
        
               | arminiusreturns wrote:
               | Very unlikely, this is a very small operation with a tiny
               | customer base.
        
           | ValentinA23 wrote:
           | The DOJ has just launched a corporate whistleblower program,
           | you should look into it maybe it covers your case:
           | 
           | https://www.justice.gov/criminal/criminal-division-
           | corporate...
           | 
           | >As described in more detail in the program guidance, the
           | information must relate to one of the following areas: (1)
           | certain crimes involving financial institutions, from
           | traditional banks to cryptocurrency businesses; (2) foreign
           | corruption involving misconduct by companies; (3) domestic
           | corruption involving misconduct by companies; or (4) health
           | care fraud schemes involving private insurance plans.
           | 
           | >If the information a whistleblower submits results in a
           | successful prosecution that includes criminal or civil
           | forfeiture, the whistleblower may be eligible to receive an
           | award of a percentage of the forfeited assets, depending on
           | considerations set out in the program guidance. If you have
           | information to report, please fill out the intake form below
           | and submit your information via
           | CorporateWhistleblower@usdoj.gov. Submissions are
           | confidential to the fullest extent of the law.
        
           | aftbit wrote:
           | >Why is it everywhere I go management is always doing shady
           | stuff.
           | 
           | Well here's a cynical take on this - management is playing
           | the business game at a higher level than you. "Shady stuff"
           | is the natural outcome of profit motivation. Our society is
           | fundamentally corrupt. It is designed to use the power of
           | coercive force to protect the rights and possessions of the
           | rich against the threat of violence by the poor. The only way
           | to engage with it AND keep your hands clean is to be in a
           | position that lets you blind yourself to the problem. At the
           | end of the day, we are all still complicit in enabling slave
           | labor and are beneficiaries of policies that harm the poor
           | and our environment in order to enrich our lives.
           | 
           | >unless I report it to PCI, which I would get no reward for.
           | 
           | You may be looking at that backwards. Unless you report it to
           | PCI, you are still complicit in the mishandling of the
           | breach, even though you resigned. You might have been better
           | off reporting it over the owner's objections, then claiming
           | whistleblower protections if they tried to terminate you.
           | 
           | This is not legal advice, I am not a lawyer, I am not your
           | lawyer, etc.
        
             | arminiusreturns wrote:
             | I did verify with an attorney that, since I wasn't
             | involved and made sure the owner knew what was what, I
             | had no legal obligation to disclose.
        
             | positus wrote:
             | The problem isn't society or profit motivation. It's
             | people. Humanity itself is corrupt. There aren't "good
             | people" and "bad people". There's only "bad people." We're
             | all bad people, just some of us are more comfortable with
             | our corruption being visible to others to a higher degree.
        
               | ragnese wrote:
               | > We're all bad people, just some of us are more
               | comfortable with our corruption being visible to others
               | to a higher degree.
               | 
               | If the GP's story is true (and I have no reason to
               | suspect otherwise), then there are clearly differences in
               | the degree of "badness" between people. GP chose to
               | resign from his job, while his manager chose to be
               | negligent and dishonest.
               | 
               | So, even if we're all bad people, there are less bad and
               | more bad people, so we might as well call the less bad
               | end of the spectrum "good". Thus, there are good and bad
               | people.
        
         | bilekas wrote:
         | > To me, what's missing from that set of recommendations is
         | some method to increase the liability of companies who
         | mishandle user data.
         | 
         | As nice as this is on paper, it will never happen; lobbyists
         | exist. Not to put on a tinfoil hat, but why would any
         | lawmaker slap the hand that feeds them?
         | 
         | Until there is an independent governing body that is
         | permitted to regulate the tech industry as a whole, it won't
         | happen. Consider the FDA: they decide which drugs and
         | ingredients are allowed, and that's all fine. There could be
         | a regulating body that determines the risk to people's
         | mental health from the 'features' of tech companies, for
         | example. But getting that body created will require a
         | tragedy, like the one that led to the FDA being created in
         | the first place. [1]
         | 
         | That's just my 2cents.
         | 
         | 1 : https://www.fda.gov/about-fda/fda-history/milestones-us-
         | food....
        
         | layer8 wrote:
         | I'm completely sympathetic to making companies more liable for
         | data security. However, until data breaches regularly lead to
         | severe outcomes for subjects whose personal data was leaked,
         | and those outcomes can be causally linked to the breaches in an
         | indisputable manner, it seems unlikely for such legislation to
         | be passed.
        
           | Onavo wrote:
           | Then instead of regulating the companies, make SSNs easily
           | revocable and unique per service. I don't understand why
           | Americans are so opposed to a national ID despite the fact
           | that every KYC service uses SSNs and driver's licenses.
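
             A minimal sketch of the per-service, revocable identifier
             idea above (Python standard library; the issuer-held secret
             and names are hypothetical, not any real scheme). Each
             service sees a different token derived from the root ID,
             and one service's tokens can be revoked by bumping that
             service's key version without reissuing the root ID:

                 import hmac, hashlib

                 def service_token(root_id: str, service: str,
                                   key_version: int, secret: bytes) -> str:
                     """Derive a stable, service-specific pseudonym for root_id."""
                     msg = f"{service}:{key_version}:{root_id}".encode()
                     return hmac.new(secret, msg, hashlib.sha256).hexdigest()

                 # Assumption: only the ID issuer holds this secret.
                 secret = b"held-by-the-id-issuer"
                 # A bank and a telecom see unrelated tokens for the same person.
                 print(service_token("123-45-6789", "examplebank.com", 1, secret))
                 print(service_token("123-45-6789", "exampletelco.com", 1, secret))
                 # Revoking one service's identifiers means bumping its
                 # key_version; every other service's tokens are unaffected.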
        
             | candiddevmike wrote:
             | Because they're the mark of the beast or a step towards
             | fascism or something.
             | 
             | I don't think it would take much to convert Real IDs
             | into a national ID; they are as close as they can get
             | without "freaking people out".
        
             | mapt wrote:
             | The expansion of KYC and the hegemonic dominance of our
             | global financial intelligence network are recent
             | infringements on our privacy that would not necessarily
             | pass popular muster if they became well-known.
             | 
             | Most of our population is still living in a headspace where
             | transactions are effectively private and untraceable, from
             | the cash era, and has not considered all the ways that the
             | end of this system makes them potential prey.
             | 
             | The fact is that the market is demanding a way to
             | identify you both publicly and privately, and it will
             | use whatever it needs to, including something fragile
             | like telephone-number 2FA, where you have no recourse
             | when something goes wrong. It's already got a covert
             | file on you a mile long,
             | far more detailed than anything the intelligence agencies
             | have bothered putting together. The political manifestation
             | of anti-ID libertarians is wildly off base.
        
           | wepple wrote:
           | I forgot where I saw this, but the US govt recently announced
           | that they see mass PII theft as a legitimate national
           | security issue.
           | 
           | It's not just that you or I will be inconvenienced with a bit
           | more fraud or email spam, but rather that large nation state
           | adversaries having huge volumes of data on the whole
           | population can be a significant strategic advantage.
           | 
           | And so far we typically see email+password+SSN as the
           | worst of the data leaked; I expect attackers will put in
           | more effort to get better data where possible: images,
           | messages, GPS locations, etc.
        
           | mapt wrote:
           | "What fraction of the FBI and CIA do the Communists have
           | blackmail material on?"
        
       | shawn-butler wrote:
       | The full report [0] is a good read; don't just read the summary.
       | 
       | >>> But these findings should not be viewed in isolation. They
       | stem from a business model that varies little across these nine
       | firms - harvesting data for targeted advertising, algorithm
       | design, and sales to third parties. With few meaningful
       | guardrails, companies are incentivized to develop ever-more
       | invasive methods of collection. >>>
       | 
       | [0]: https://www.ftc.gov/system/files/ftc_gov/pdf/Social-
       | Media-6b...
        
       | JackOfCrows wrote:
       | Shocked, gambling, establishment, etc.
        
       | MengerSponge wrote:
       | 2016 Schneier on Security "Data is a Toxic Asset":
       | https://www.schneier.com/blog/archives/2016/03/data_is_a_tox...
        
       | kart23 wrote:
       | > Profound Threats to Users Can Occur When Targeting Occurs Based
       | on Sensitive Categories
       | 
       | > Targeted ads based on knowledge about protected categories can
       | be especially distressing. One example is when someone has not
       | disclosed their sexual orientation publicly, but an ad assumes
       | their sexual orientation. Another example is when a retailer
       | identifies someone as pregnant and targets ads for baby products
       | before others, including family, even know about the pregnancy.
       | These types of assumptions and inferences upon which targeted
       | advertising is based can in some instances result in emotional
       | distress, lead to individuals being misidentified or
       | misclassified, and cause other harms.
       | 
       | If this is one of the biggest harms the FTC can come up with,
       | then honestly, as a consumer, I don't really care. Having free
       | YouTube is worth getting a few mistargeted ads, or I CAN JUST
       | TURN TARGETED ADS OFF. Advertising isn't someone harassing
       | you; it's an ad that I can close or just report as not being
       | accurate.
       | I'd really be interested to hear from someone who thinks getting
       | a mistargeted ad is in top 10 most stressful things in their
       | life.
       | 
       | What I would really be interested in is the raw responses from
       | the companies, not this report.
        
         | macawfish wrote:
         | Use your imagination?
        
         | carb wrote:
         | > I CAN JUST TURN TARGETED ADS OFF
         | 
         | The only reason you have the option to do this is because of
         | groups pushing back against advertising companies. Ad companies
         | have no incentive to offer the option to disable targeting.
         | 
         | If you like having this option available, then you should like
         | this FTC report and the position they are taking.
        
           | kart23 wrote:
           | > If you like having this option available, then you should
           | like this FTC report and the position they are taking.
           | 
           | I can like other positions and actions the FTC has done, like
           | requiring the ability to turn off targeted ads, and not like
           | others, like this one. This is among the biggest problems in
           | politics right now. Supporting a political party doesn't mean
           | you need to 100% back all their opinions and policies;
           | that's how change is effected in successful democratic
           | systems.
        
             | stiffenoxygen wrote:
             | > I can like other positions and actions the FTC has done,
             | like requiring the ability to turn off targeted ads, and
             | not like others, like this one
             | 
             | They weren't saying that was the case; I think you're
             | misunderstanding them here. But they are 100% correct:
             | you are benefiting from other people fighting against
             | this mass surveillance and yet speaking against it. I
             | think you should do some research on why privacy is
             | important and challenge yourself and your potentially
             | entrenched beliefs.
        
               | kart23 wrote:
               | Read my first comment. I definitely agree privacy is
               | important. All I'm saying is that this is not one of the
               | harms we should be worrying about when saying targeted
               | advertising is a problem, and I don't understand why this
               | is an important issue that we should care about when
               | targeted advertising can be turned off:
               | 
               | "Profound Threats to Users Can Occur When Targeting
               | Occurs Based on Sensitive Categories"
        
       | russdpale wrote:
       | Instead of stupid recommendations, which are laughable, the
       | government should actually enforce them.
        
         | layer8 wrote:
         | "The government" isn't a singular entity, and the FTC is an
         | independent agency.
        
       | notinmykernel wrote:
       | See also: "How advertisers became NSA's best friend"[1].
       | 
       | [1]https://www.theverge.com/2013/12/12/5204196/how-
       | advertisers-...
        
       | yieldcrv wrote:
       | Wait till the FTC discovers FullStory
        
       | cynan123 wrote:
       | Lina Khan has been on a tear. She actually seems to care about
       | online human rights.
        
       | seydor wrote:
       | A little hypocritical when it comes from various government
       | organizations all over the Western world. Surveillance
       | companies are essential for police to be able to gather data
       | quickly when needed. It is a happy accident that surveillance
       | is so lucrative for advertising and also so effective for
       | policing.
        
         | janalsncm wrote:
         | Different parts of government might disagree on the best course
         | of action but I wouldn't call that disagreement _hypocrisy_ per
         | se.
         | 
         | It's also not true that it's an irresolvable conflict. Yes the
         | cops can and do buy your phone location data, but even if we
         | said that was fine and should continue, that doesn't also mean
         | that any schmuck should be able to buy real-time Supreme Court
         | justice location data from a broker.
        
       | doctorpangloss wrote:
       | Simple questions:
       | 
       | Should ad prices be lower or higher?
       | 
       | Should YouTube be free for everyone, or should it cost money?
        
         | beezlebroxxxxxx wrote:
         | Having ads does not require mass surveillance --- that's really
         | just something that social media companies have normalized
         | because that's the _particular_ business model and practices
         | they have adopted and which makes them the most amount of money
         | possible.
        
           | goosejuice wrote:
           | Well put. Targeting and more specifically retargeting is the
           | problem.
           | 
           | Most companies can't afford to not do this when their
           | competitors are. Hence the need for regulation.
        
         | janalsncm wrote:
         | Those are useful questions but I don't think they're the only
         | ones that matter. Here's another one for consideration:
         | 
         | What is the minimum level of privacy that a person should be
         | entitled to, no matter their economic status?
         | 
         | If we just let the free market decide these questions for us,
         | the results won't be great. There are a lot of things which
         | shouldn't be for sale.
        
           | doctorpangloss wrote:
           | > What is the minimum level of privacy that a person should
           | be entitled to, no matter their economic status?
           | 
           | This is an interesting question: maybe the truth is, very
           | little.
           | 
           | I don't think that user-identified app telemetry is below
           | that minimum level of privacy. Knowing what I know about ad
           | tracking in Facebook before Apple removed app identifiers, I
           | don't think any of that was below the minimum level.
           | 
           | This is a complex question for sort of historical reasons,
           | like how privacy is meant to be a limit on government power
           | as opposed to something like, what would be the impact if
           | this piece of data were more widely known about me? We're
           | talking about the latter but I think people feel very
           | strongly about the former.
           | 
           | Anyway, I answered your questions. It's interesting that no
           | one really wants to engage with the basic premise: do you
           | want these services to be free or not? Is it easy to
           | conceive that people never choose the paid version of the
           | service? What proof do you need that normal people (1)
           | understand the distinction between privacy as a barrier to
           | government enforcement versus privacy as a notion of
           | sensitive personal data, and (2) will almost always view
           | themselves as safe from the government, probably rightly
           | so, and so will almost always choose the free+ads version
           | of any service, and, just like they have been coming out
           | ahead for the last 30 years, are likely to keep coming out
           | ahead, in this country?
        
             | BriggyDwiggs42 wrote:
             | The issue to me is that these companies have operated and
             | continue to operate by obfuscating the nature of their
             | surveillance to users. This isn't a system of informed
             | consent to surveillance in exchange for free services; it's
             | a system of duping ordinary people into giving up sensitive
             | personal information by drawing them in with a free
             | service. I'm almost certain this model could still exist
             | without the surveillance. They could still run ads; the ads
             | would be less targeted.
        
             | janalsncm wrote:
             | I didn't mean to evade your questions, but my opinion is as
             | follows:
             | 
             | Yes I want YouTube to be free, but not if that requires
             | intrusive surveillance.
             | 
             | People who pay for YouTube aren't opted out of surveillance
             | as far as I can tell. So I reject the premise of your
             | question, that people are choosing free because they don't
             | value privacy. They haven't been given the choice in most
             | cases.
             | 
             | On a tangential note, you previously asked if ads should be
             | more expensive. It's possible that ads should be _less_
             | expensive, since they may be less effective than ad spend
             | would suggest: https://freakonomics.com/podcast/does-
             | advertising-actually-w...
        
       | DaleNeumann wrote:
       | "According to one estimate, some Teens may see as many as 1,260
       | ads per day.[200] Children and Teens may be lured through these ads
       | into making purchases or handing over personal information and
       | other data via dark patterns"
       | 
       | There is a long trail of blood behind Google, Facebook,
       | Amazon... etc.
        
         | 93po wrote:
         | Even with ad blockers, we still see tons of ads. Corporate news
         | like CNN constantly has front page stories that are just paid
         | promotion for some product or service wrapped in a thin veil of
         | pseudo-journalism. Product placement is everywhere too. Tons of
         | reddit front page content is bot-upvoted content that is
         | actually just a marketing campaign disguised as some TIL or
         | meme or sappy story.
        
       | bilekas wrote:
       | More details :
       | https://www.ftc.gov/system/files/ftc_gov/pdf/Social-Media-6b...
        
       | ianopolous wrote:
       | We really need e2ee social media that's designed to protect, not
       | addict people.
        
         | janalsncm wrote:
         | "E2ee social media" isn't a coherent concept. E2ee has to do
         | with how information is transferred not what is transferred.
        
           | ianopolous wrote:
           | Sure it is: https://peergos.org/posts/decentralized-social-
           | media
           | 
           | It is social media where only the end users' devices can
           | decrypt the posts and comments. Then surveillance is not
           | possible. Targeted ads are not possible.
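
             A toy sketch of the idea (Python with PyNaCl sealed boxes;
             this is not Peergos's actual protocol, and the names are
             illustrative): the post is encrypted on the author's device
             to each follower's public key, so the server only ever
             stores ciphertext it cannot read or target ads against.

                 from nacl.public import PrivateKey, SealedBox

                 # Each follower generates a keypair on their own device;
                 # only the public half is ever shared with the poster.
                 follower_sk = PrivateKey.generate()
                 follower_pk = follower_sk.public_key

                 def publish(post: bytes, follower_public_keys):
                     """Encrypt the post once per follower; the server
                     stores only these opaque blobs."""
                     return [SealedBox(pk).encrypt(post)
                             for pk in follower_public_keys]

                 ciphertexts = publish(b"hello, friends", [follower_pk])

                 # Only the follower's device, holding the private key,
                 # can read the post.
                 print(SealedBox(follower_sk).decrypt(ciphertexts[0]))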
        
       | herf wrote:
       | Please make it so my kids can watch a YouTube video required by
       | school without watching 20 YouTube shorts after. That's all I
       | want.
        
         | goosejuice wrote:
         | Download the video?
        
       | hnpolicestate wrote:
       | Imagine the respect the government has for your intelligence,
       | publishing this while purchasing said surveilled user data.
        
         | carom wrote:
         | The government is large and consists of multiple organizations
         | with different goals.
        
         | bbarnett wrote:
         | There is no single "the government".
         | 
         | Instead "The Government" is like a huge community. They are all
         | supposed to adhere to the same code, but like any community
         | there are those members that look for a way to bypass the law,
         | without quite going over it.
         | 
         | That's what said purchases are. And even parts of the community
         | in the same branch of a government department, may do what
         | other parts are not even really aware of. Or agree with.
        
           | hollerith wrote:
           | Although you have a valid point, I object to your calling it
           | a community because communities don't have constitutions and
           | cannot throw people in jail if they break the community's
           | rules. Also, a community has much less control over who
           | becomes a member of the community than a government has over
           | who it employs.
        
       | CatWChainsaw wrote:
       | Surveillance is cancerous. It keeps on growing, feeding on
       | "just because" justifications for every data point, and then
       | eventually it kills you.
        
       ___________________________________________________________________
       (page generated 2024-09-19 23:00 UTC)