[HN Gopher] The Fantasy of Opting Out
___________________________________________________________________
The Fantasy of Opting Out
Author : Fiveplus
Score : 211 points
Date : 2021-02-18 11:18 UTC (11 hours ago)
(HTM) web link (thereader.mitpress.mit.edu)
(TXT) w3m dump (thereader.mitpress.mit.edu)
| marshmallow_12 wrote:
| this level of monitoring can only become useful with tools that
| don't yet exist. it's only theoretically possible to aggregate
| all the available data on any individual. it will require
| advanced AI and vastly greater data sharing in order to make this
| a significant issue. i imagine though that by the time these are
| developed, they will have their teeth blunted by advanced new
| obfuscation techniques and technologies.
| ClumsyPilot wrote:
| I think this is a dangerous fantasy. It does not require any
| AI, just putting all the data together with loads of man-hours.
| pksebben wrote:
| they're already abusing our data. adtech sells to the US army
| (pointedly: data from apps that are marketed to Muslims). if
| you wait for the exploitation to be absolute before dealing
| with it, you'll miss the opportunity to deal with it at all
| marshmallow_12 wrote:
| The exploitation will never become absolute. Yes, blurred
| lines are very dangerous, and yes an opportunity is being
| missed here, but the scale of the danger is being
| overestimated. The next data crisis will be entirely
| unexpected but obvious in retrospect. (like every single
| google account being breached).
| pksebben wrote:
| That's orthogonal to the discussion, though. The concern
| isn't about cybersecurity, or breaches - this is about
| governments and companies abusing their access. The
| concern is that, given enough time / development / etc.,
| handing _anyone_ a "track all folks down to the
| millimeter" capability is a fundamentally dangerous
| proposition.
|
| We know for a fact that government agencies misbehave.
| Imagine McCarthy's America, but with the ability to
| programmatically crawl your every movement and spoken
| word. _That's_ the nightmare scenario.
| marshmallow_12 wrote:
| imho Western Democracy is too strong and data gathering
| is too weak to make this an immediate concern. even if
| democracy unravels and the tech matures it may well be
| less potent and/or effective than feared. A far more
| pressing concern is a large data breach, which is
| possible and probable.
| marshmallow_12 wrote:
| ...which is not practical and therefore only of consequence
| for a person of interest, i.e. a criminal or suspected
| criminal. I'm not saying it isn't a big problem (suspicions
| can be invented), i'm just pointing out it is unlikely to lead
| to a dystopia. I think that laws will be drafted to
| invalidate most data as evidence in court (since minor
| offenses are universal and can't all be prosecuted).
| hedora wrote:
| The world needs to move to making all data collection opt-in.
| There should be no negative impact to failing to opt in, unless
| absolutely necessary. Those exceptions should be clearly
| legislated, and be easily challenged in court.
|
| This would require a right-to-privacy constitutional amendment in
| the US.
| keiferski wrote:
| I'm sure I'll be called crazy for this, but the true solution to
| oppressive societal forces is personal space travel and
| colonization. When it becomes possible for a small group of
| people to fund their own "opt out" and escape into outer space,
| individuals will regain some bargaining power.
|
| Obviously this won't happen for centuries. But on the timeline of
| "future human existence", it's really not very long at all. I see
| this as an inevitable outcome of technological development, even
| if the Private Ownership of Spacecraft War of 2346 is bloody.
| That gives me hope for the future.
| packetlost wrote:
| How is that any different from someone deciding to opt out from
| society and move to... I dunno, the mountains? There's still a
| lot of places on Earth that are sparsely inhabited or
| completely uninhabited.
| hertzrat wrote:
| Living in the mountains, where I live, is about $2k per month
| just for rent
| keiferski wrote:
| Still under the authority of a state. Escaping to Patagonia
| doesn't really let you avoid governmental surveillance, just
| makes you less of a target.
|
| Besides that, one shouldn't have to be a hermit in order to
| avoid oppression.
| AnimalMuppet wrote:
| But how is it different if you're a space hermit?
| gpm wrote:
| I suppose because the cost to reach you is greatly
| increased, meaning the cost to exercise authority is
| greatly increased, so actual authority is greatly
| decreased.
|
| Right now the cost for the government to send their
| agents to "the mountains" is trivial; the most remote you
| could possibly get is "a few hours' commute and a
| helicopter ride". The government is more than willing to
| pay that price in important cases.
|
| I think they imagine that space travel can increase that
| price to "months or years" and "expensive vehicles that
| support people during that time". Rather like how much it
| would have cost to send government agents to remote
| places hundreds of years ago.
|
| (Personally I'm not particularly convinced; the number of
| habitable rocks in the solar system is small. Living not
| on a rock means you need to import resources. Maybe if we
| get interstellar travel).
| joubert wrote:
| > oppressive societal forces is personal space travel and
| colonization
|
| Why do you think colonies in space will be different and not
| see "societal forces" emerge?
| keiferski wrote:
| Of course human societies will still have issues. But the
| option of exiting will exist, as will the option of just
| going off on your own.
| joubert wrote:
| The article had this bit that stood out to me:
|
| "It isn't possible for everyone to live on principle; as a
| practical matter, many of us must make compromises in
| asymmetrical relationships, without the control or consent
| for which we might wish."
|
| Even if you could just "go off on your own into space", you
| will likely need to transact with other humans in order to
| survive. Put differently, I think while it is an "option"
| in principle, it isn't in reality.
| keiferski wrote:
| The ability to walk away is perhaps the single most
| effective negotiation tactic. Simple as that.
|
| In MBA land they call it BATNA. Best alternative to a
| negotiated agreement.
| ccsnags wrote:
| I think you are correct.
|
| The more accessible opting out becomes, the more leverage
| regular people will have when negotiating with the
| dominant social order.
|
| Space travel like this is down the road, but it's only a
| long road if you are thinking in terms of your own finite
| existence.
|
| Also, opting out is in demand. From a more immediate
| perspective, I think that there is a lot of room for
| innovation around personal privacy and opting out. No
| system created by humans is permanent. More surveillance
| means that humans will create systems to subvert it or
| render it useless. This has already happened in many ways.
|
| A system is nicer to its people when they are there
| voluntarily. These changes will also work to improve the
| quality of life of those in the system as well as those
| outside of it.
| ymbeld wrote:
| I won't call you crazy, but I will call you naive. Modern
| civilization was what brought us centralized states and
| corporations. And yet we always seem to think that the next
| hill, just over the horizon, is where everything will flip on
| its head and we will be back to some mythical past where we
| could roam wherever we please--all enabled by technology of
| course.
|
| New World 2.0 isn't coming.
| keiferski wrote:
| I don't see the early 19th century as a "mythical past", nor
| did I say big changes were "just over the horizon."
|
| Space is quite literally limitless from a human perspective;
| to assume that somehow human beings will make zero progress
| on space travel 500 or 1,000 years from now seems naive to
| me.
| timerol wrote:
| Are you familiar with Sealand? It was basically the same idea,
| but in international waters.
| throwawayboise wrote:
| Good point, you could do it now, far more cheaply, and far
| more safely than going into space.
| keiferski wrote:
| Most of these sorts of attempts have been either poorly
| planned or deliberately shut down by nation states (like the
| ones near Italy and Thailand).
|
| I'd also imagine it's far less exciting to live on a floating
| platform in the ocean than out in space. Certainly the
| marketing materials will be more appealing.
| coldtea wrote:
| > _I'm sure I'll be called crazy for this, but the true
| solution to oppressive societal forces is personal space travel
| and colonization. When it becomes possible for a small group of
| people to fund their own "opt out" and escape into outer space,
| individuals will regain some bargaining power._
|
| Besides the infeasibility of billions doing so in the next
| 2-3-5 centuries at least, it will probably also be the total
| opposite if/when it happens.
|
| Those space colonies won't be like roaming around some empty
| earth. It will be like living in a very close-knit community
| on Earth, where everybody is monitored and depends on
| everybody else not doing something stupid/suicidal that would
| put the colony in danger...
|
| > _I see this as an inevitable outcome of technological
| development_
|
| Why would it be inevitable?
|
| There are big show stopping issues which are only handwaved
| away atm with "but, progress" (as if technological development
| is boundless and creativity can bypass any hard constraint).
|
| Some issues are such hard physics problems that (BS like
| "Alcubierre drive" aside) the best we could ever do would
| be "generation ships".
| keiferski wrote:
| By inevitable, I mean that space travel will become
| affordable enough to be personal, and that this is a question
| of time, not physical limitations.
|
| We're also talking about hundreds or thousands of years here.
| It seems totally reasonable to me to assume that a private
| spaceship priced at ~$500,000 in 2021 dollars will exist by
| say, 2500.
| simonh wrote:
| This is silly; we are all observed by other people almost all of
| our lives, but that's fine because they don't conspire behind our
| backs to create a comprehensive record that's handed over to the
| government. I don't mind if the building security records me
| entering the building, or the bank records me using the ATM, or
| that London Transport videos me on the train. What I object to is
| if all of those are stitched together and handed over to
| advertising agencies or my employer.
|
| Likewise I don't care that Google knows what I searched for, or
| that Twitter knows what I tweeted, or that LinkedIn has my
| employment history. What I don't want is all of that being sold
| to Cambridge Analytica to then aggregate and sell on to someone
| else for goodness knows what purposes.
|
| An awful lot of my life and interests are easily searchable. My
| handle here is basically just my name, and I use the same handle
| or even more complete versions everywhere I can. When I'm out in
| public, the public can see me. When I post in public, the public
| can read what I say. That's fine, that's why I said it.
|
| However my private correspondences with my wife and kids on
| iMessage or WhatsApp are nobody else's business. My bank
| transactions and online shopping likewise; the latter is mainly
| between me and Amazon. Where I would get upset is if Amazon sold
| that data to Google to show me 'relevant ads', or show my
| purchases to my friends. Remember Facebook Beacon? There need to
| be clear, hard lines in the sand.
| feralimal wrote:
| You're naive, sorry to say.
|
| I would think that all that data is being shared. And if it's
| not being shared now, it is being recorded. And an AI will run
| through all that information and process it in the future. Why
| anyone would trust self-serving governments and corporations
| with private information amazes me!
|
| It is a perfectly rational hypothesis that a lot of the
| reasons governments use to take civil liberties away are
| ones that they orchestrated themselves to facilitate their
| power grab. To not consider this as a possibility, in
| psychological terms, is like being the co-dependent in a
| narcissistic relationship, or like the victim in Stockholm
| syndrome - you can't imagine that someone would be that
| abusive, even though you know already that governments and
| corporations do NOT have your back.
|
| All government conspiracy aside, anyone can see that one makes
| lots of decisions to do things (or not) on account of what it
| means to be in public. In your mind, contrast the idea of being
| in public in a busy city versus a quiet country road. You do
| not act in the same way! You are under greater stress in a
| city, you will conform with the social norms as you perceive
| them, you will not 'flower' as an individual.
|
| This stuff is all known. We are better managed in cities, hence
| 'they' want to move the mass of people into 'smart' (spy)
| cities. It's not a secret. It's been planned for a long
| time. Look into technocracy.
| [deleted]
| simonh wrote:
| Of course it is; I even gave examples of data being used in
| that way (Beacon and Cambridge Analytica). How can you
| possibly think I'm not aware of activities I cited examples
| of? That's exactly the sort of thing we need to focus our
| efforts on stopping.
| ttt0 wrote:
| > Likewise I don't care that Google knows what I searched for,
| or that Twitter knows what I tweeted, or that LinkedIn has my
| employment history. What I don't want is all of that being sold
| to Cambridge Analytica to then aggregate and sell on to someone
| else for goodness knows what purposes.
|
| So you don't like Cambridge Analytica, but Google, Twitter and
| LinkedIn using your data for goodness knows what purposes is
| fine? I'm pretty sure they're running all that data through all
| sorts of machine learning algorithms and some of that might be
| used at some point for surveillance and censorship purposes or
| dystopian stuff in general. Not might, _will_ be, if already
| isn 't. Because you'd have to be stupid to have this amount of
| data and not use it to further your political agenda.
| simonh wrote:
| They need this information to provide the services they
| offer; if you don't want the services, don't provide the data.
| I thought I made it abundantly, crystal clear I am against
| them arbitraging or selling this data for other purposes of
| the kind you describe. That's what we need to focus on with
| regulation. Of course they want to use this data for other
| purposes, and we need to make sure they do not. I find it
| somewhat exasperating that you seem to think I believe
| otherwise.
| SamuelAdams wrote:
| > I would get upset is if Amazon sold that data to Google to
| show me 'relevant ads'
|
| If the email account you use for Amazon is a Gmail account,
| they email a receipt of purchase to that account, which Google
| will use for 'relevant ads'.
|
| But of course in that case you are expressly granting
| permission for Amazon to contact Google. If it was done via a
| TOS agreement or something then yes, I agree that would be
| concerning.
| andagainagain wrote:
| Even then, that's more a problem with google than amazon. Or,
| more accurately, it's a problem that we treat email as
| electronic mail. We feel we own the account like we own our
| address. Heck, for some stupid reason we use them for
| identification all over the internet.
|
| But technologically, they are postcards sent to a business.
| And we just visit the business to pick up our postcards.
| Imagine every time you bought something, they sent the
| receipt to walmart for you to pick up. Not in an envelope,
| just handed to the guy at the counter and put into a box for
| you. We never should have let email get this far, but we did.
| dnissley wrote:
| Supposedly that's no longer the case:
|
| https://www.nytimes.com/2017/06/23/technology/gmail-ads.html
| zeta0134 wrote:
| This is something I wish was noted more often. When I'm
| searching for something on Amazon, I have intentionally visited
| that storefront and am willfully handing them my data (in the
| form of search and browsing history); of _course_ Amazon is
| going to keep that and use it to personalize my results.
| Frankly that's part of their value add, so this is neither
| surprising nor particularly upsetting.
|
| What's surprising to most people (and should be the focus of
| any litigation, imho) is precisely this third-party data
| sharing. It's partly why the cookie law drives me nuts, since
| it's made all tracking the bogeyman, and in reality, most first
| party "tracking" is completely benign. If companies would agree
| not to sell or share my data with third parties (law
| enforcement serving a warrant being the major exception) then I
| have no real issue with the tech. The blatant sharing, and
| especially _ad networks_ make my blood boil.
| II2II wrote:
| > It's partly why the cookie law drives me nuts, since it's
| made all tracking the bogeyman, and in reality, most first
| party "tracking" is completely benign.
|
| There is a lot more disclosure about the use of cookies due
| to those laws. One of the things those disclosures will note
| is how many of those cookies are from third-parties. How many
| people have even reviewed a single disclosure? Of those who
| have, how many know how to disable third-party cookies? I am
| not surprised that tracking ended up as the bogeyman due to
| the amount of it, the dubious motives of most of it, and the
| limited control that people have.
|
| Even if you eliminate third-party tracking and other forms of
| data sharing, the amount of tracking happening through first-
| party cookies is sometimes questionable. A company may be
| fully justified in figuring out how their services are used,
| but does that extend to creating profiles on individuals?
| There is a big difference between a business using aggregate
| data to improve sales and using data to tailor services to
| individuals. It is worth noting that many people would
| consider even the former too manipulative, while it is
| reasonable to argue that the latter exploits the
| vulnerabilities of individuals.
|
| Personally, I find any form of tracking beyond ensuring
| security and performance to be excessive, since most of the
| other tracking is intended to establish a one-sided
| relationship that benefits the people doing the tracking.
| Arguing that it sometimes improves the lives of those being
| tracked is missing the point, since any such improvement is
| usually very much unintentional.
| Jgoure wrote:
| Anecdotally, I purchased a heated blanket from Amazon for my
| father and had it shipped to his address. When I went to look
| at order details, Amazon advertised to me other things the
| person I shipped this item to may like: adult diapers, pet
| treats, etc. I don't own a pet due to allergies and I am
| far too young to be searching for adult diapers. I found it
| very rude and unprofessional of Amazon to share such a
| personal suggestion with me.
|
| I called Amazon's customer service to complain about their
| suggestions. The representative said that their suggestions
| are based on my searches and purchases. I believe the
| suggestions are also based on the purchases made for that
| address.
|
| My father doesn't even get an option to opt out of Amazon
| suggesting things he's purchased to other people who buy
| things for him. There isn't an option for privacy.
| africanboy wrote:
| > of course Amazon is going to keep that and use it to
| personalize my results
|
| and yet after years of buying the same pair of shoes, year
| after year, amazon still doesn't know my shoe size and I
| have to check if my size is still available every time...
|
| I believe they are not really trying to improve _my_
| experience, but their profits.
|
| But I have no proof.
| sandworm101 wrote:
| >> of course Amazon is going to keep that and use it to
| personalize my results. Frankly that's part of their value
| add
|
| You think it is value add because you assume they are using
| the data to send you more relevant content. That's just not
| how data is always used. A Google customer (e.g. an advertiser)
| might want to hit you with deliberately non-relevant ads, ads
| to divert you away from a competitor's product. You might want
| to book a trip to visit family in Hawaii, but the Florida
| resort advertiser doesn't much care what you actually want.
| They will hit you with Florida ads in hopes that they can
| divert a potential traveler to a different destination. You
| will miss out on relevant Hawaii content simply because
| Florida has paid more to put content in front of you.
| gumby wrote:
| Also, they can use this to change the price they offer you.
| zeta0134 wrote:
| Don't misunderstand: this is a bad advertising practice. I
| don't like most advertising, relevant or otherwise, so if
| this happens as you describe then yes, the value add is
| negative. That doesn't make the data usage _surprising_
| though, and that was my point. It's okay if Amazon uses
| data I entered into search directly to advertise to me,
| even if they're not very good at it. Ethically, no line was
| crossed here.
|
| What would be surprising (to most consumers) is if the
| Florida resort advertiser, instead of bidding on some broad
| target demographic, has access to enough data to target me
| individually. However that comes about, _that_ is the line
| that is crossed. Why does the third party have direct(ish)
| access to this data? Why wasn't I informed? Etc. Whether
| that is direct sale of the data, or indirect targeting
| through unusually specific ad campaign targeting, the
| effect is the same: it's _creepy._
| doggodaddo78 wrote:
| Amazon then sells your searches to every other company that
| happens by and correlates it with other broker data, and
| resells that.
| Wowfunhappy wrote:
| Counterpoint: Many of the tech giants are so big and all-
| encompassing that they are practically their own third party.
| Google can take your location history from Google Maps and
| use it to recommend videos on Youtube.
|
| Perhaps more importantly, restricting third-party but not
| first-party sharing creates bad incentives. If Google could
| share search data with first-party services like Google
| Reviews, but not third-party ones like Yelp, what does that
| mean for Yelp? We'd just be encouraging the largest companies
| to bring even more of the world in-house.
| antasvara wrote:
| If I'm not mistaken, this problem is exactly what is
| attempting to be solved by the current anti-trust suit
| against the big tech companies in the United States (this
| is the suit brought by Texas and a few other states).
| Depending on the outcome, vertical integration like you're
| describing could be deemed monopolistic and result in
| Google being spun off into separate entities for each of
| the parts you're describing.
|
| I think views related to this topic somewhat depend on
| whether or not you consider Google a monopoly in the
| digital advertising space. If you do, it would seem that
| bringing more of the world in-house would be deemed illegal
| under anti-trust laws.
| Wowfunhappy wrote:
| I do believe that Google, Facebook, Amazon, and even
| Apple ought to be broken up. However, I don't have a ton
| of faith that it will happen, and even if it does, I'm
| wary of policies that would encourage future
| consolidation.
| cjfd wrote:
| Good, let us forbid cross-service sharing of data. Gmail
| can do everything it likes with whatever data is generated
| while I am using it. Let us just not allow it to also use
| this data in Google Maps and on YouTube.
| Wowfunhappy wrote:
| How do you define a single service? What if Google adds
| gmail results to Google Search, or Facebook integrates
| Whatsapp into Facebook Messenger?
| hertzrat wrote:
| "Everything it likes" is pretty broad, isn't it?
| tehjoker wrote:
| Why should they be able to personalize to the individual? Why
| shouldn't they be working from depersonalized aggregated
| statistics?
| clairity wrote:
| > "...willfully handing them my data (in the form of search
| and browsing history); of course Amazon is going to keep that
| and use it to personalize my results."
|
| sure for the few minutes that that data is relevant to
| selling you stuff you're looking for right now, but why would
| you expect them to keep it for longer, as this seems to
| imply?
|
| the time dimension matters too. keeping that data for more
| than a few minutes should also be explicitly opt-in, as it's
| data being collected and potentially shared in the future
| (intentionally or not).
|
| reach and accessibility were naturally limited 'in the old
| days', when a salesperson might remember your preferences,
| even writing them down to share with other salespeople, but
| that data hardly leaked out to other retailers (and potential
| competitors). it seems that that should be our baseline, and
| any further gathering/sharing be subject to explicit opt-in.
| cmckn wrote:
| I agree that the time dimension should at a minimum be
| communicated, and longer term analytics on this kind of
| data can almost always be done without associating the data
| with individuals.
| dmitryminkovsky wrote:
| > most first party "tracking" is completely benign
|
| Drinking water, breathing fresh air, sitting by a fire on a
| cold day, hugs from loved ones--things like this tend to be
| completely benign. A permanent record of your activity in
| somebody else's hands should never be assumed to be benign,
| much less _completely_ benign. On the contrary, such a record
| should be assumed to be hostile, because there is no
| legislation or level of care in the world that can truly
| prevent this record from leaking to third parties, or from
| being abused by the first party. In the United States we have
| this beautiful right to remain silent, because anything we
| say can and will be used against us in a court of law. I
| believe it is wise to at least deeply internalize this, so
| that even if you do share data with "first parties," you
| won't be surprised later when that activity comes to bite
| you, because eventually it will.
| einpoklum wrote:
| > but that's fine because they don't conspire behind our backs
| to create a comprehensive record that's handed over to the
| government.
|
| > I don't mind if the building security records me entering the
| building, or the bank records me using the ATM
|
| Actually, I would pretty much bet the government in a bunch of
| states in the world can access your bank records. Places like
| the US, or China, or Russia. But let's ignore that.
|
| > What I object to is if all of those are stitched together and
| handed over to advertising agencies or my employer.
|
| The thing is, once the information is gathered and stored,
| it's an easy transition to feed it somewhere. And judging by
| current trends, some company will soon offer to pay those
| disparate surveillers to feed such data to it, constantly -
| since it can be processed and analyzed en masse, and
| monetized. Oh, and they'll probably send the government a
| copy of everything too (judging by what FAANG do, for
| example).
|
| > Likewise I don't care that Google knows what I searched for
|
| That's not "likewise". Google is already a huge stitcher of
| surveillance - the kind you said you disapprove of. And, again,
| they send everything to the US government.
|
| > that latter is mainly between me and Amazon.
|
| You mean between you and the entity controlling a huge chunk of
| all on-line commerce and whose operations are larger in
| monetary terms than most states in the world? And that acts
| like a government with its own body of rules and internal
| judicial system for disputes? ... yeah, it's "just" between you
| and them.
| andagainagain wrote:
| Indeed.
|
| I'm not trying to separate myself from society. I'm just trying
| to keep the stalkers away.
|
| A lot of companies seem to want to act less like members of
| society and more like stalkers.
| hinkley wrote:
| My go-to response for people who say "If you don't have
| anything to hide then why do you care if we know?" is that
| clearly they have never bought Preparation-H or itch cream from
| the drugstore, or they're not thinking about how that purchase
| is not information their classmates or rivals need to know
| about.
|
| I once told someone something about my kid, and they responded,
| "why haven't you told me this before???" My flat reply was,
| "because it's not the most interesting thing about her." I'm
| about 50-50 on smart versus stupid answers, but occasionally I
| surprise even myself. That's one of my best one-liners.
|
| Setting aside police/surveillance state dystopias for a moment:
| People try to make you small by labeling you. The more things
| they know about you, the more labels they have. If you don't
| believe me just look at the sewer that flows through replies to
| AOC's tweets. Having depression as a teenager should not define
| you. Being a survivor of assault or harassment should not
| define you. Having a working class upbringing should not
| pigeonhole you. Having an itchy groin should not be ammo for
| somebody to derail and deflect what you're trying to do. Mind
| your own goddamned business and keep the conversation on topics
| that are actually relevant, like your embezzlement conviction
| or my ongoing bribery lawsuit.
| doggodaddo78 wrote:
| Live in a glass house, tell me your complete sexual history
| including all of the kinks you're ashamed of, and give me your
| email password. I thought so.
|
| Your viewpoint comes across as extremely naive until you've had
| political persecution, targeted harassment, or stalking issues.
| Privacy isn't something you or anyone else gets to decide no
| one else needs because you don't understand it or value it, but
| you're free to try living in a fantasy world so long as you
| don't put the lives of reporters or refugees at risk, or
| condone the invasion of the lives and personal effects of
| others.
| grawprog wrote:
| >Likewise I don't care that Google knows what I searched for,
| or that Twitter knows what I tweeted, or that LinkedIn has my
| employment history. What I don't want is all of that being sold
| to Cambridge Analytica to then aggregate and sell on to someone
| else for goodness knows what purposes.
|
| This logic would be fine if all google did was search, if all
| amazon did was shopping, but they don't. They have federal
| government contracts, they work with defense contractors and
| law enforcement, they control huge amounts of the internet
| infrastructure. Google, amazon, facebook gathering your data is
| more than just a search engine, a storefront and a social
| network gathering it. Even if they share it with nobody other
| than their internal businesses, those internal businesses have
| massive control over the internet and many people's lives.
| throwaway98797 wrote:
| Hiding in plane sight is powerful. Especially true if one can do
| it with a community.
| maxerickson wrote:
| How many bits are revealed by choosing the wrong spelling
| of 'plain'?
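[Editor's note] The "bits" quip rests on standard deanonymization arithmetic: each unusual, observable trait (a rare spelling, a browser quirk) leaks identifying information measured in bits, and few bits are needed to single someone out. A minimal sketch; the 2% misspelling rate is an invented illustration, not a measured figure:

```python
import math

def surprisal_bits(p: float) -> float:
    """Bits of identifying information revealed by observing a
    trait shared by a fraction p of the population."""
    return -math.log2(p)

# Hypothetical rate: if 2% of writers use "plane" for "plain",
# seeing that spelling narrows the candidate pool 50-fold.
print(round(surprisal_bits(0.02), 2))  # 5.64 bits
print(round(surprisal_bits(0.5), 2))   # 1.0 bit: a coin-flip trait

# ~33 bits suffice to single out one person among ~8 billion,
# so a handful of such quirks combined can be fully identifying.
print(round(math.log2(8e9), 1))        # 32.9
```

Traits combine: independent quirks simply add their bits, which is why hiding in a community (sharing the same observable traits) helps.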
| [deleted]
| germinalphrase wrote:
| There will be a great spookiness to augmented reality.
|
| Already, we invite soft surveillance into our private spaces, but
| will we agree to having those spaces mapped to the millimeter,
| our objects tracked in kind and location, our private actions (in
| addition to our words) persistently noticed, considered and
| logged?
| airstrike wrote:
| Relevant (and still disturbing):
| https://www.youtube.com/watch?v=YJg02ivYzSs
| pdkl95 wrote:
| I strongly recommend everyone watch Raph Koster's talk "Still
| Logged In: What AR and VR Can Learn from MMOs"[1] about the
| ethical issues involved in VR/AR... and how VR/AR can be used
| as a weapon.
|
| [1] https://www.youtube.com/watch?v=kgw8RLHv1j4
| germinalphrase wrote:
| It is a solid piece of work, but the blaring visual pollution
| is probably not our biggest worry.
| ergl wrote:
| Needs a (2019) in the title
| choeger wrote:
| The issue will become obvious rather soon. It won't be the
| state using the surveillance the way the Stasi or Gestapo
| would have (although it might come close, with excuses like IP
| or public health). Instead, my bet is on online shopping.
|
| Right now, dynamic pricing is still asynchronous. If They do
| it at all, They have a model of you that fits some marketeer's
| understanding of people, and this model suggests a price
| increase or maybe even a decrease.
|
| But what _will_ happen is real-time data exchange. Say you
| booked a nice hotel for your vacation and now search for
| flights. Wonder why your prices are 50% higher? Say your TV
| just broke, or your car didn't start this morning, or you
| mentioned on WhatsApp that you need new sports equipment.
| Basically, whenever you _need_ something, you will pay a
| premium, no matter where the data comes from. That's the price
| of giving up privacy.
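The mechanism this comment describes can be made concrete with a toy model. Everything below is invented for illustration; the signal names and multipliers are not drawn from any real pricing system:

```python
# Hypothetical "need" signals a real-time data exchange might
# surface about a buyer, with invented markup multipliers.
NEED_MULTIPLIERS = {
    "booked_hotel": 1.5,    # already committed to the trip
    "tv_broke": 1.2,
    "car_wont_start": 1.3,
}

def needs_aware_price(base_price, signals):
    """Mark a base price up by every need signal known about the buyer."""
    price = base_price
    for signal in signals:
        price *= NEED_MULTIPLIERS.get(signal, 1.0)
    return round(price, 2)

# A $200 flight, searched for right after booking a hotel:
print(needs_aware_price(200.0, ["booked_hotel"]))  # 300.0
```

The point of the sketch: once the seller's model ingests live need signals, the markup is a trivial computation; all the difficulty (and the privacy harm) is in the data exchange that feeds it.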
| GCA10 wrote:
| In today's society, the desire to be noticed is easily 50x the
| level of anxiety about being in a surveillance state.
|
| We could start with the nonstop, look-at-me nature of Instagram
| (or any other social site). They satisfy a deep craving that just
| keeps growing. We could marvel at the Jan. 6 rioters posting
| their moments in history for all to see. It's endless, and it
| isn't slowing down.
|
| Yes, there's a powerful argument to be made that nonstop
| surveillance could work out badly. But after 15 years of seeing
| such pieces thunder into obscurity, rehashing the same arguments
| in isolation seems futile.
|
| Anyone who wants to contribute to the conversation needs to spend
| serious time thinking about the reasons why so many people want
| strangers to know about them. It's a deep-felt desire. For a lot
| of people, the dread of being unknown/un-noticed/ignored is
| greater than the risks that come from being noticed. Once we
| understand why that's so, we might be able to move forward.
| lucasmullens wrote:
| These seem unrelated to me. I should be able to post on social
| media publicly while still wanting privacy in other aspects of
| my life. A desire to be seen is not in any way fulfilled by
| security cameras and tracking cookies.
| GCA10 wrote:
| Dash cam footage! Why has dash-cam footage become a thing on
| YouTube? Why aren't people demanding that it be shut down?
|
| Even security cameras are now part of performance culture.
| ymbeld wrote:
| Beware of loudest people in the room bias.
| colllectorof wrote:
| The vast majority of people out there have no mental capacity
| to imagine how data they post online and provide to various
| orgs could be and most likely eventually _will be_ used against
| them. This is evidenced by the continuing proliferation of dumb
| comments along the lines of "I am boring", "I am doing nothing
| wrong", etc.
|
| There is a tremendous cognitive bias in play here: the
| assumption that because most people around you don't weaponize
| certain types of information, no one anywhere will ever
| weaponize that information, even if it is globally and
| indefinitely available.
| GCA10 wrote:
| Agreed that a lot of people do things that are appealing now
| and not so wise later. But I think we'll get farther if we
| talk about this as a "short horizon" problem, rather than
| assailing their mental capacity.
|
| It's all a variant of "candy today; diabetes in 20 years."
| Public health experts have probably thought the hardest about
| how to get people to take the long term into account. There
| must be something in their playbook that could benefit the
| anti-surveillance cause.
| colllectorof wrote:
| I would gladly just say "imagination" instead of "mental
| capacity to imagine", but that word has been ruined by
| making it sound like something only kids and painters have
| to exercise. Hard reality: the modern world _requires_
| imagination to navigate.
|
| For example, most people can imagine living with diabetes,
| but they have no idea what it would feel like if some
| entity started using their leaked data against them. It's a
| much more complex scenario with lots of possible outcomes
| and variables.
| ymbeld wrote:
| They don't have the mental capacity? In what sense?
| anaerobicover wrote:
| You're right, but there's a crucial and fundamental difference
| between surveillance and posting to social media: the second is
| _voluntary_. The poster has chosen to share whatever it is.
| They may not be completely aware of the full range of
| consequences, but it's still their choice. In some measure,
| the tech is empowering them to do this thing that they want --
| to be noticed.
|
| Surveillance -- including stuff like profiling people by
| analyzing their voluntary social media posts -- is _imposed_ on
| a person by someone else. It is taking away the surveilled
| person's free choice, and its entire purpose is to gain power
| over them.
|
| There's also absolutely no _inherent_ reason that surveillance
| -- the deliberate steps of gathering/cataloging/analyzing --
| has to come along with people being able to post things in
| public. That's just a f'd-up practice that our society has
| adopted.
| hertzrat wrote:
| Is it really a deep craving, or even a real choice? I am
| working on an indie game. I really don't want to play the
| social media game, I don't even have Twitter or Facebook. Yet,
| I'm spending today researching how to make a YouTube channel
| and how to gain followers. Indie games just almost never sell
| unless you build an audience before release. Don't assume
| everyone does this out of vanity or enjoys the idea of being
| talked about online.
| rjbwork wrote:
| Personally I don't take pictures of myself and don't have any
| of the real-name or picture-based social media sites (Facebook,
| Instagram, TikTok, Snapchat, etc.) I only use HN, Reddit, and
| Discord because I enjoy talking about current goings on and
| ideas with others, and to communicate about shared hobbies or
| interests.
|
| Do I like to get some Karma on HN/Reddit or reactions on
| Discord or replies on all 3? Yeah, I do, but not because of
| some "desire to be noticed" (I think) but because it means
| someone thinks I have provided some input to a conversation and
| they want to talk about it; at the very least, it lets me know
| I'm not a crazy person talking out of my ass. In fact, I'd
| prefer that nobody knows my real name or what I look like on
| all 3 of those sites. I can be a bit more authentic and candid
| than I'd feel comfortable being otherwise.
| jpm_sd wrote:
| David Brin covered this topic in a 1996 Wired article [1] and a
| follow-on 1998 book [2]. So far, we're not doing great on the
| "Accountability" part.
|
| Bruce Schneier disagreed with him in 2008 [3] (and probably still
| does).
|
| [1] https://www.wired.com/1996/12/fftransparent/
|
| [2] https://www.davidbrin.com/transparentsociety.html
|
| [3] https://www.wired.com/2008/03/securitymatters-0306/amp
| frompdx wrote:
| _Privacy does not mean stopping the flow of data; it means
| channeling it wisely and justly to serve societal ends and values
| and the individuals who are its subjects, particularly the
| vulnerable and the disadvantaged._
|
| I found the conclusion to be very open ended. Who decides what it
| means to channel the flow of data _wisely and justly_, and to
| what ends?
| naringas wrote:
| > Those who know about us have power over us.
|
| I'm not sure about this. Those who can change our behavior have
| power over us; knowing somebody does not necessarily mean
| having power over said somebody.
|
| Likewise, there are things that have power over us without even
| having to know us.
|
| However, if someone has power over somebody AND knows a lot
| about said somebody, then their power is (indeed) more
| effective.
| ReactiveJelly wrote:
| If someone can't change your behavior, they can share their
| knowledge with someone who can.
| eternalban wrote:
| Brunton & Nissenbaum describe some of the features and mechanics
| of the panopticon -- "the apparatus of total surveillance" -- but
| do not comment on the _psychological effects_ of "total
| surveillance" on collective and individual behavior.
|
| Foucault on 'Panopticism' addresses that far more important
| aspect.
| Psychologically defeated people will _not_ seek to "opt out".
| Opting out is the analog of escaping from prison: most prisoners
| do not seriously entertain such notions, much less act on them.
| "A real subjection is born mechanically from a fictitious
| relation."
|
| https://foucault.info/documents/foucault.disciplineAndPunish...
|
| "Hence the major effect of the Panopticon: to induce in the
| inmate a state of conscious and permanent visibility that assures
| the automatic functioning of power. So to arrange things that the
| surveillance is permanent in its effects, even if it is
| discontinuous in its action; that the perfection of power should
| tend to render its actual exercise unnecessary; that this
| architectural apparatus should be a machine for creating and
| sustaining a power relation independent of the person who
| exercises it; in short, that the inmates should be caught up in a
| power situation of which they are themselves the bearers. To
| achieve this, it is at once too much and too little that the
| prisoner should be constantly observed by an inspector: too
| little, for what matters is that he knows himself to be observed;
| too much, because he has no need in fact of being so. In view of
| this, Bentham laid down the principle that power should be
| visible and unverifiable. Visible: the inmate will constantly
| have before his eyes the tall outline of the central tower from
| which he is spied upon. Unverifiable: the inmate must never know
| whether he is being looked at at any one moment; but he must be
| sure that he may always be so. In order to make the presence or
| absence of the inspector unverifiable, so that the prisoners, in
| their cells, cannot even see a shadow, Bentham envisaged not only
| venetian blinds on the windows of the central observation hall,
| but, on the inside, partitions that intersected the hall at right
| angles and, in order to pass from one quarter to the other, not
| doors but zig-zag openings; for the slightest noise, a gleam of
| light, a brightness in a half-opened door would betray the
| presence of the guardian. The Panopticon is a machine for
| dissociating the see/being seen dyad: in the peripheric ring, one
| is totally seen, without ever seeing; in the central tower, one
| sees everything without ever being seen.
|
| It is an important mechanism, for it automatizes and
| disindividualizes power. Power has its principle not so much in a
| person as in a certain concerted distribution of bodies,
| surfaces, lights, gazes; in an arrangement whose internal
| mechanisms produce the relation in which individuals are caught
| up. The ceremonies, the rituals, the marks by which the
| sovereign's surplus power was manifested are useless. There is a
| machinery that assures dissymmetry, disequilibrium, difference.
| Consequently, it does not matter who exercises power. Any
| individual, taken almost at random, can operate the machine: in
| the absence of the director, his family, his friends, his
| visitors, even his servants (Bentham, 45). Similarly, it does not
| matter what motive animates him: the curiosity of the indiscreet,
| the malice of a child, the thirst for knowledge of a philosopher
| who wishes to visit this museum of human nature, or the
| perversity of those who take pleasure in spying and punishing.
| The more numerous those anonymous and temporary observers are,
| the greater the risk for the inmate of being surprised and the
| greater his anxious awareness of being observed. The Panopticon
| is a marvellous machine which, whatever use one may wish to put
| it to, produces homogeneous effects of power.
|
| A real subjection is born mechanically from a fictitious
| relation. So it is not necessary to use force to constrain the
| convict to good behaviour, the madman to calm, the worker to
| work, the schoolboy to application, the patient to the
| observation of the regulations."
| mrmikardo wrote:
| Thanks for reminding me of this. A very pertinent and
| insightful observation and, as you suggest, one that is missing
| from the linked article.
| antattack wrote:
| I view it as a done deal; there's no escape, and we need to
| plan for the future.
|
| Our current laws are not very detailed. Often they are overly
| severe, to serve as a deterrent, and/or make assumptions based
| on the available evidence (of which there was less before).
|
| As we come to know more and more about an individual (due to
| gadgets, online activity, and cashless transactions), laws and
| punishments need to take it all into account, be more tailored
| to the actual crime, and make fewer assumptions, because
| there's plenty of evidence to go by.
|
| Another important issue is that we should not allow those in
| power to shield themselves from surveillance and accountability
| under the guise of safety or security.
| bogomipz wrote:
| >"The browser plugins TrackMeNot and AdNauseam, which explore
| obfuscation techniques by issuing many fake search requests and
| loading and clicking every ad, respectively."
|
| I would be curious to hear anyone's experience and/or feedback
| on these plugins.
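Not speaking from experience with either plugin, but the decoy-query idea behind TrackMeNot can be sketched in a few lines of Python. The word list and search endpoint below are invented placeholders, not the plugin's actual sources (it seeds its phrases from evolving sources such as feeds so decoys blend in):

```python
import random
import urllib.parse

# Placeholder decoy vocabulary -- purely illustrative.
DECOY_TERMS = ["weather", "recipes", "football scores", "train times",
               "guitar chords", "history of rome", "stock prices"]

def decoy_query_urls(n, seed=None):
    """Build n fake search URLs to interleave with real traffic."""
    rng = random.Random(seed)
    urls = []
    for _ in range(n):
        phrase = " ".join(rng.sample(DECOY_TERMS, rng.randint(1, 3)))
        urls.append("https://search.example.com/search?q="
                    + urllib.parse.quote_plus(phrase))
    return urls

for url in decoy_query_urls(3, seed=42):
    print(url)
```

The hard part, which this sketch omits, is making the decoys statistically indistinguishable from a real user's queries; a fixed vocabulary like this one would be trivial for the tracker to filter out.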
| purplezooey wrote:
| "life outside the totalitarian microscope?"... exaggerate much?
| pdkl95 wrote:
| > Obfuscation may be our best digital weapon.
|
| From Dan Geer's portentous talk _"Cybersecurity as
| Realpolitik"_[1][2]:
|
| >> Privacy used to be proportional to that which it is impossible
| to observe or that which can be observed but not identified. No
| more -- what is today observable and identifiable kills both
| privacy as impossible-to-observe and privacy as
| impossible-to-identify, so what might be an alternative? If
| you are an optimist
| or an apparatchik, then your answer will tend toward rules of
| data procedure administered by a government you trust or control.
| If you are a pessimist or a hacker/maker, then your answer will
| tend towards the operational, and your definition of a state of
| privacy will be my definition: _the effective capacity to
| misrepresent yourself_.
|
| [1] https://www.youtube.com/watch?v=nT-TGvYOBpI
|
| [2] http://geer.tinho.net/geer.blackhat.6viii14.txt
| waynecochran wrote:
| Is there any hope in feeding the surveillance system noise --
| a lot of it? E.g., create bots with my credentials that visit
| random websites, have a phone that reports bogus GPS
| coordinates, numerous dummy accounts, that sort of thing...
|
| Or what if enough folks gang up and feed the system an
| avalanche of random (or misdirected) information, so that we
| drown our signatures in a sea of noise?
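For the bogus-GPS piece of this, a toy generator is easy to write; the hard part (which this sketch ignores entirely) is injecting the fake fix at the OS location-service layer so apps actually consume it. The coordinates and step size are invented:

```python
import random

def fake_gps_track(start_lat, start_lon, steps, seed=None):
    """Random-walk a position in roughly 50 m increments so the
    reported track looks like plausible wandering, not raw noise."""
    rng = random.Random(seed)
    step_deg = 0.0005  # ~50 m of latitude per step
    lat, lon = start_lat, start_lon
    track = [(lat, lon)]
    for _ in range(steps):
        lat += rng.uniform(-step_deg, step_deg)
        lon += rng.uniform(-step_deg, step_deg)
        track.append((round(lat, 6), round(lon, 6)))
    return track

# A short fake stroll near some (invented) starting coordinates:
for point in fake_gps_track(47.6097, -122.3331, steps=3, seed=7):
    print(point)
```

As the sibling comment notes, naive noise like this mostly adds occasional fuzz: an aggregator that already has your real home and work locations can discard a random walk that never visits either.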
| marshmallow_12 wrote:
| Short term: I expect so; long term, not so much. At best, some
| occasional fuzz will mar an otherwise clear picture of you
| and your activities. Too many fake accounts will only force
| users to surrender more personal information in order to
| authenticate themselves.
| keiferski wrote:
| Deepfakes may do this job for us.
| kodah wrote:
| Giving these big detailed anecdotes about how we're actively in a
| surveillance state isn't working. People don't care. A lot of
| those same people have probably helped, in the form of public
| opinion solidarity, to make it this way. When you support or make
| excuses for engineering firms that engage in aggressive tracking,
| you give them clearance. When you constantly murmur about
| immigration or terrorism, it provides tools and reasoning for
| these systems to exist. If you fear monger about all the "bad
| people" on the internet, you create pathways for things like real
| name policies, sentiment analysis, or private data collection to
| prove who you are and what faith you come in. Then there are
| people who will aid them and say things like, "Well those
| governments and companies aren't quite sharing data yet!" as if
| mass aggregation at a governmental level or through private
| partnerships isn't already happening. When you put all this hand
| wringing together it forms a useful set of tools for governments
| and private companies to abuse or misuse. Privacy on the internet
| was never about one small thing, it was always about an aggregate
| of decisions that achieve an outcome.
| pksebben wrote:
| My interpretation of this article is that the author wanted to
| remind us to engage and to contribute to the conversation because
| we don't know how to manage the situation. It sounds like you
| have some format of a game plan to deal with this, would you
| care to share it? I'm legitimately interested.
| kodah wrote:
| > There is no simple solution to the problem of privacy,
| because privacy itself is a solution to societal challenges
| that are in constant flux. Some are natural and beyond our
| control; others are technological and should be within our
| control but are shaped by a panoply of complex social and
| material forces with indeterminate effects. Privacy does not
| mean stopping the flow of data; it means channeling it wisely
| and justly to serve societal ends and values and the
| individuals who are its subjects, particularly the vulnerable
| and the disadvantaged. Innumerable customs, concepts, tools,
| laws, mechanisms, and protocols have evolved to achieve
| privacy, so conceived, and it is to that collection that we
| add obfuscation to sustain it -- as an active conversation, a
| struggle, and a choice.
|
| The author comes to the same conclusion I do. I just stated
| that painting vivid images of what your loss of privacy looks
| like isn't really making a dent.
|
| The author also says it best: there is no simple solution.
| Rather, the problem exists in people's behavior and belief
| systems. They feel justified in their beliefs for a cause,
| and once they are galvanized into that belief system, they no
| longer feel required to consider the second- and third-order
| effects of supporting it. In fact, they feel totally allowed
| to just dismiss people altogether as long as they are doing
| so _in support of the cause_.
|
| This isn't anything new. Hot topics show that people _just do
| this_ when they feel some type of way about a given topic. If
| you want to solve these problems, I think the place to start
| is making vocal calls to the people within your belief system
| who are encouraging a loss of privacy. This _must_ come from
| within your belief system, because people don't listen fully
| to people of opposing belief systems, and it must be vocal so
| that everyone sees the example.
|
| More or less saying: privacy must be a common concern that is
| continually addressed and answered for in every discussion
| where we encourage change. It can no longer be an option.
| pksebben wrote:
| Thank you for this. You managed to put fairly specific
| words to something that I constantly struggle to define,
| which is the process by which one affects and influences
| the culture they exist in. I think it's important to bring
| these things up and talk about them, especially with people
| who are not exposed to echo chambers like this one. Talking
| about these issues on HN is important, but only matters if
| you take the subject matter and expose it to folks in other
| contexts. You describe this process really well.
|
| I wonder, too, how we as tech-minded hackers and
| programmers and doers can use what we have and what we work
| with to strengthen these signals. Like, Facebook and
| Twitter et al. have optimized for things like raw engagement
| numbers / advertising exposure etc. Are there things we
| could do to optimize for engagement / cultural development?
| I pose this question in earnest. It's something I have
| thought about a lot without many good answers to show for
| it.
| raintrees wrote:
| A concept explored by Greg Bear in his book Slant:
| https://www.amazon.com/Slant-Novel-Greg-Bear/dp/0812524829
| ThrustVectoring wrote:
| > If the apparatus of total surveillance that we have described
| here were deliberate, centralized, and explicit, a Big Brother
| machine toggling between cameras, it would demand revolt, and we
| could conceive of a life outside the totalitarian microscope.
|
| Really not sure how to turn this into actionable legislation, but
| the fundamental problem isn't the data _collection_. Rather, it's
| the massive reduction in the cost of organizing and querying the
| data. The laws and norms were set up when the only way to tell if
| someone had walked down a specific street was to pay someone to
| watch or to knock on doors and talk to people with faulty
| memories. Cameras couldn't store years of footage, databases
| weren't invented yet, and machine facial recognition was pure
| fantasy.
|
| Like, in the 19th century it'd be absolutely _ridiculous_ to
| insist that you have a right of "privacy" that means that people
| can't recognize you when you're walking around in public. And for
| a long time, pointing a camera outside your window was basically
| just like looking out of it, and it got treated that way. A database
| of camera footage looking out at a majority of public streets,
| recording 24/7 with 5 years of back footage, indexed by time +
| location + facial recognition match, on the other hand, exploits
| people's privacy in a way that is _far_ more than the sum of
| parts.
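The cost collapse is easy to demonstrate. A toy version of the database described above (the schema and all values are invented) turns "was this person ever on Main St?" from weeks of paid legwork into one indexed lookup:

```python
import sqlite3

# Toy index of the kind described above: sightings keyed by
# timestamp, camera location, and a face-recognition match ID.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sightings (ts TEXT, street TEXT, face_id TEXT)")
db.executemany("INSERT INTO sightings VALUES (?, ?, ?)", [
    ("2021-02-18T09:14", "Main St", "person_417"),
    ("2021-02-18T09:52", "Oak Ave", "person_417"),
    ("2021-02-18T10:03", "Main St", "person_902"),
])
db.execute("CREATE INDEX by_face ON sightings (face_id, ts)")

# What once required paid watchers and faulty memories is now
# a single cheap query over years of footage:
rows = db.execute(
    "SELECT ts, street FROM sightings WHERE face_id = ? ORDER BY ts",
    ("person_417",)).fetchall()
print(rows)
```

Scaled up, the expensive part is the face matching that populates the table, not the query; once matches are indexed, every retrospective question is this cheap, which is exactly the "far more than the sum of parts" effect.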
|
| Essentially, my view is that some databases are repugnant to
| public policy and should be illegal to build and to query. GDPR
| has amply shown the problems involved in legislating this, and
| there's a massive free speech argument that torpedoes the whole
| thing anyhow, so I'm pessimistic about actually fixing things.
| jtbayly wrote:
| Needs 2019 added to title
___________________________________________________________________
(page generated 2021-02-18 23:01 UTC)