[HN Gopher] Rights, Laws, and Google
___________________________________________________________________
Rights, Laws, and Google
Author : vinnyglennon
Score : 111 points
Date : 2022-08-29 14:09 UTC (8 hours ago)
(HTM) web link (stratechery.com)
(TXT) w3m dump (stratechery.com)
| bsedlm wrote:
| edit: deleted.
| merely-unlikely wrote:
| This doesn't necessarily take away from your point, but
| under the law there is commonly a legal fiction that treats
| a corporation as a "person"[1].
|
| [1] https://en.wikipedia.org/wiki/Corporate_personhood
| nonrandomstring wrote:
| > power ... exerting force over a minority ... free software
| movements were never about the openness nor the accessibility
| of the source code. But about issues such as this.
|
| Yes, Free Software culture has always been more about justice
| on a new (digital) frontier where new forms of injustice and
| abuse roam wild (and have only grown worse).
|
| I sincerely think the best model for understanding big-tech
| corporations like Google is as serfdom under feudal warlords
| within modern fiefdoms. It closely mirrors these historical
| power relations where laws and constitutions have nothing to
| say.
|
| The article is long and complex but I eventually found the
| kernel in this line:
|
| > "A Google spokeswoman said the company stands by its
| decisions, even though law enforcement cleared the two men."
|
| Though the moral questions behind it all are very complex, this
| case is not itself actually that complex. There was no mistake.
| No lack of proper investigation or tardiness by the police. The
| child, parent, doctor and police - all of the parties except
| Google - acted fairly, in good faith, and with mutuality and
| consent.
|
| Google is the problem here; it simply believes itself a law
| unto itself. That's the nub of it. For all the posturing about
| complying with the laws of nation states, companies like
| Facebook and Google have grown smug about their power. When
| they roll over the toes of the innocent, they laugh and say
| "and what are you going to do about it?"
|
| We are in new "might is right" times and nobody big enough has
| yet had the courage to say "Act justly, or we will hurt you
| back", and then follow up.
|
| For the ordinary citizen, the only sensible course is to
| resolutely refuse to use their services and products, and Free
| Software which provides so many alternatives to Google,
| Microsoft, Facebook and suchlike is the solution. It is the
| _moral_ choice that a _good citizen_ can and should make in
| these times.
| waylandsmithers wrote:
| My theory on why Google does this (and follows the same pattern
| in so many areas) is that having one person simply use common
| sense and reasonable judgment would be something of an admission
| that their software, created by the best and brightest engineers,
| isn't top notch.
| paulcole wrote:
| Start with questioning whether the software was created by the
| best and brightest.
| Animats wrote:
| There's a question as to whether Google is acting as an agent of
| the Government here. When the Government outsources something,
| that activity sometimes becomes subject to constitutional
| protections. Cases on this are iffy, though.[1] This has come up
| in reference to Amtrak, the Federal Reserve, Blackwater, and
| privately run prisons.
|
| Suppose Google analyzed all business data that passed through its
| servers looking for patterns of tax evasion. If the probability
| of tax evasion was high, reporting it to the IRS, which offers
| rewards, could be a profit center.
|
| [1] https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2394843
| cee_el123 wrote:
| Does a petition/campaign in Mark's favour have any hope of
| getting Google to restore his account?
|
| The most frustrating question I have: why isn't Google
| restoring his account despite him being cleared by the
| police? Why are they not even explaining this?
| gumby wrote:
| They are probably worried that if they did, it would open a
| floodgate of challenges, lawsuits and political attacks.
|
| As the last paragraph of this essay points out: tough. Google
| needs to step up and deal with the consequences of its own
| size.
| golemotron wrote:
| > Google needs to step up and deal with the consequences of
| its own size.
|
| This is where regulation can help.
| gumby wrote:
| This is an under-appreciated factor in regulation ignored
| by the reflexive "regulation bad" crowd. Companies can
| welcome regulation, and not in a malign, regulatory capture
| fashion.
|
| One case could be this one: doing the right thing could
| open you up to attacks from bad actors. A regulation could
| give you air cover (and legal cover).
|
| Or let's say you want to be more environmentally friendly
| but customers won't pay extra. Your competitors want the
| same, but of course you can't agree to do this (that's
| collusion). A regulation removes the risk and applies to
| all.
| crazygringo wrote:
| I am utterly baffled as well. This was literally a front-page
| article in the NYT: not just the online home page, but the
| front page of the _paper edition_[1].
|
| It's one thing to get the attention of some Googlers on the
| front page of HN, it's a whole other order of magnitude for the
| front page of the NYT.
|
| I cannot imagine how this hasn't become an urgent priority for
| Sundar himself to make sure this gets fixed and fast, at a
| minimum in these known situations. What could they possibly
| still be debating or deciding inside there at Google...? Police
| determined no charges, so restore the accounts. I don't see any
| potential risk, legal/reputational/otherwise, this would open
| up. Google's inaction here _boggles the mind_.
|
| [1]
| https://www.nytimes.com/issue/todayspaper/2022/08/22/todays-...
| wswope wrote:
| I'm right there with you in dismay, but I think this has been
| building up a while: consider how long there have been Google
| employees who browse HN, fully aware of how broken processes
| are damaging users and developers, and just sitting on the
| sidelines because they're in it for the paycheck.
| codefreeordie wrote:
| I'm surprised at how long this has gone on, but not _at all_
| surprised that it happened. Based on my experience at Google,
| I would have expected the NYT article to get it unstuck
| within 2-3 days.
|
| When I was at Google, I was continuously amazed at how hard
| it was to stop this kind of own-goal when we knew it was
| happening. (Based on where I was inside Google, I came across
| (dramatically-less-sensational) obviously dumb decisions of
| this sort).
|
| It was always _shockingly_ hard as a Googler to get traction
| on fixing some customer's bureaucracy-navigation problem --
| but usually a story getting into the press finally got
| someone through the bureaucracy.
| [deleted]
| Tangurena2 wrote:
| Google has become too large. As a result, they're a threat
| rather than a benefit to the web ecosystem.
|
| We saw similar stuff when banks were getting bailed out because
| they were "too big to fail". That meant that their very size
| was a threat to the financial system - they were really too big
| to allow to exist.
| alexb_ wrote:
| This is an extremely good article. The hard part is that
| everyone can agree "legality is not morality" - that what is
| legal is not necessarily moral. But this gets used for the
| wrong side of the argument: since the Bill of Rights is a
| legal document, some disregard it as offering any guidance
| on morality.
| vorpalhex wrote:
| The Bill of Rights is a legal document based on a moral set
| of beliefs.
| merely-unlikely wrote:
| I think this article points to a missing product in the market -
| integrated cloud services but with the cloud located on a private
| server in the home. Specifically, for a company like Apple
| (who hand-waves at being pro-privacy) to enable all iCloud
| functionality to be hosted on a personal Mac. From a high level
| (and likely ignoring important but probably not insurmountable
| technical details), the idea would be for everything to look
| exactly the same to the iPhone user, but for all data and
| processing to happen on a machine the user controls. No Apple
| accessible or hosted user data, end-to-end encrypted in transit,
| etc.
|
| One might argue there are products that can fill this niche and
| certainly many on Hacker News are capable of setting up their own
| home servers. However, that is not nearly as convenient or
| realistic for the average person to do. Those setups also don't
| integrate directly into default services like iMessage, Photos
| (including both storage and things like photo tagging), etc.
| Modern desktops and many people's internet connections should be
| powerful/fast enough to handle this.
|
| To the extent user data is used to feed big-data algos/machine
| learning/etc, this might not be practical. That's why I suggest
| Apple do it rather than Google (Apple already claims much of
| its ML services are handled on-device).
|
| One might argue that's essentially something iTunes handled
| (still could?) but again it's not nearly as integrated as modern
| iCloud services. And that only works over a local network or
| direct wired connection.
|
| The basic idea is just to move data hosting and processing from
| third-party servers where users have no direct control over how
| their data is used or accessed to hardware sitting in their home.
| izacus wrote:
| This exists as part of NAS devices. Buy a Synology and you
| get your iCloud/Google set of services in your home, for a
| one-time price.
|
| With all of the services you mentioned.
| merely-unlikely wrote:
| That only covers a subset of iCloud functionality though.
| Remote file storage, sure. But what about things like
| backing up the entire device with the ability to restore on
| a new iPhone, browsing history, app data (first- and
| third-party), passwords, etc.? AFAIK solutions aren't
| nearly as integrated, which is why it would need Apple's
| support.
|
| Then there are services like email, iMessage, reminders,
| notes, etc. For the latter, there may be some service that
| does it, but not in the default apps. For the former, I
| don't know of any plug-and-play solutions that aren't cloud
| hosted. For example, many privacy-conscious people use
| Proton Mail, but that is cloud hosted and subject to search
| without the user's knowledge (hopefully only with a valid
| warrant, but I don't think that takes away from the point).
|
| I'd also rather services like find-my-iphone, recent
| locations, and other map data be handled on a server I
| control without Apple having any potential access.
|
| I'm not saying there are no solutions but I am saying that
| they involve tradeoffs that a first-party solution could make
| a lot smoother. I'd gladly pay for a home Apple server that
| could replace all iCloud functionality without having to
| change my habits. I suspect I'm not the only one - just look
| at their recent privacy focused offering that turns off a
| bunch of features for "at risk" individuals.
| smoldesu wrote:
| It sounds more like your fault for becoming overly reliant
| on services that are only possible by putting your trust in
| one company. I'm also one of the people who does the whole
| "home NAS" thing, and I have no problem syncing my
| passwords, notes, dotfiles and whatnot.
|
| This is the hook that Tim Cook has been setting up for
| years. They wanted you to take a bite out of their services
| without thinking about how their concentration of power
| affects you. Now that people are realizing that
| iCloud/Apple Pay/App Store are all just ploys to add an
| Apple Tax to other segments of the economy, they're relying
| on the people addicted to their services to justify its
| use. When people talk about the degrading Apple ecosystem,
| _this_ is what we're referring to.
|
| I jumped off that racket years ago. Buy a few WD Reds and
| set up RAID, then you can self-provision your own cloud
| resources.
| merely-unlikely wrote:
| I don't disagree. Still, the situation remains and I bet
| Apple could charge another "tax" to remedy it. And to be
| clear, I'm suggesting a product offering not a regulatory
| solution.
| smoldesu wrote:
| Yep. I certainly wouldn't hold your breath for Apple to
| reverse their stance on services/the cloud though. They
| make too much money off it to encourage people to self-
| host. It may well take regulatory action before Apple
| puts their customers before profit margins again.
| blfr wrote:
| _I think this article points to a missing product in the market
| - integrated cloud services but with the cloud located on a
| private server in the home._
|
| That would be very cool but, sadly, 60% of that market is in
| this comments thread. And we want it to be open source.
| rpdillon wrote:
| The product that kicked off the topic is Google Photos, on
| which I recently neared my storage limit. Sensing that Google
| One may not be an ideal next step, I set up my phone to
| back up photos and videos to my Synology instead using DSFile
| (Synology's file manager for Android). This has been pretty
| awesome so far, at least for my use cases.
|
| I think Synology is the closest product I know of that
| provides something like what you're describing, though, as
| you point out, it's not OSI open source.
| gernb wrote:
| I use Google Photos for access, not backup. So not only can
| I access all my photos from anywhere, I can also take
| advantage of their ML-driven search. I'm not entirely happy
| with the UX of Google Photos, but it's 1000x better than a
| self-hosted file share.
| merely-unlikely wrote:
| I suspect the market may actually be of decent size if you
| consider enterprises, journalists, politicians, and the like.
| Users who aren't necessarily tech savvy, won't stop using
| default services/apps without a fight, but who require a
| level or privacy that is currently somewhat compromised.
| vkou wrote:
| Have you seen the unmitigated dumpster fire that is mass-
| market Wordpress hosting? Now imagine that, but _even
| worse_.
|
| Users who aren't tech savvy who are trying to run their own
| mission-critical software on their own hardware:
|
| 1. Should expect to pay $200+/month for a support plan.
| That will buy them one hour of an engineer's time per
| month. They are going to need that time, because _deploying
| software is hard_, _keeping software running is hard_, and
| there are no SRE/DevOps economies of scale when every Tom,
| Dick, and Harry wants to run their own service on their own
| hardware.
|
| 2. Should expect to either deal with all the problems
| (stability and security) that come with regular auto-
| updates, or regularly getting pwned (because they aren't
| auto-updating, and because they are a juicy, high ROI
| target for bad actors).
|
| 3. All this stuff already exists for enterprises, and they
| aren't complaining in these HN threads, because their needs
| _are_ being met by cloud providers (as are the needs of
| politicians, because they generally don't need security,
| they need security _theatre_; as long as they can check the
| compliance checkbox, they are in the clear). The people
| whose needs aren't met are the long-tail small-fry of
| journalists and hobbyists and their ilk, who aren't enough
| of a market, and can't afford the money and the time (see
| points #1 and #2). Maybe the NYT can afford to have this
| set up, and set up well, for their people, but they already
| have dedicated IT teams doing just that!
| marcosdumay wrote:
| > And we want it to be open source.
|
| I don't see how that bit would be a problem. The market
| size, yes, that's a problem, but it's supposed to be a
| cloud service; there is no loss in the clients being open
| source.
| bsedlm wrote:
| Yes, there's a loss - not a monetary/economic loss, but a
| loss of (potential) power. The fact that the loss is
| "potential" makes me hesitant to call it a 'loss'; it's not
| quite a loss, but a missed opportunity to gain leverage.
| smolder wrote:
| We want it to be open source because open source is the only
| source of services you can run at home or on premises at a
| small business for a reasonable cost.
|
| It's really a supply side issue pretending to be a demand
| one. Self hosted services of all kinds are woefully
| underdeveloped.
|
| Tech giants aren't going to encourage you to own your data
| and hardware by spending effort on supporting it, because
| that undermines key reasons for building cloud services:
| surveillance and control.
| SpicyLemonZest wrote:
| Self-hosted services are woefully underdeveloped because
| people outside the open source community generally don't
| want them. I'm familiar with a couple of small
| organizations who've been around since the pre-cloud days;
| all of them hated "the server" and love that Google enabled
| them to get rid of it.
| ineptech wrote:
| > Tech giants aren't going to encourage you to own your
| data and hardware by spending effort on supporting it
|
| On the contrary, it would make a lot of sense for Amazon to
| work on open source self-hosted server apps, because they
| would seem to be the answer to the question, "Why would a
| non-techy want a cloud VM?" Surely their PMs must daydream
| about a future in which a cloud VM is as common a monthly
| fee for a middle-class family as a gym membership or a
| Netflix subscription, right?
| ineptech wrote:
| Fake edit to add: there's a startup devoted to solving
| this problem with a novel OS intended to be a good
| platform for server-side apps, and since they appear
| unlikely to get traction I would be very happy to hear
| that Amazon is forking or duplicating it, just because I
| so badly want to be a user of such a product.
| nemothekid wrote:
| > _I think this article points to a missing product in the
| market - integrated cloud services but with the cloud located
| on a private server in the home._
|
| I don't know; this sounds like a technical solution to what
| is a political problem. Gmail, today, already scans your
| email for ads; what if congress decides to mandate CSAM
| scanning of your emails? How feasible is it for everyone to
| host their own email in that situation?
|
| This also echoes the controversy over CSAM scanning in
| iCloud. There is no clear national consensus on whether the
| government should be allowed to invade your privacy when it
| comes to child abuse. Ironically, terrorists seem to have
| better privacy protections than pedophiles.
|
| Personally I don't think CSAM scanning of your private data
| should exist, much like I think everyone should be afforded
| E2EE communication even if they might break the law with it.
| Whether we allow our privacy to be invaded in such a manner
| is a political/regulatory problem. A "private cloud" doesn't
| help if congress decides that Comcast should also be allowed
| to MITM your SSL connection to scan every image you
| download.
| Wowfunhappy wrote:
| > A "private cloud" doesn't help if congress decides that
| comcast should also be allowed to MITM your SSL connection to
| scan every image you donwload.
|
| Not to be overly pedantic, but I don't think this is true on
| a technical level? Any "private cloud" worth its salt would
| presumably use its own E2E encryption, with keys known only
| to the owner.
|
| The government could _force_ the private cloud vendor to
| build a back door, but that's kicking things up a notch. The
| vendor could also decide to build a back door on their own, I
| suppose... which is another reason people in this market tend
| to want open source.
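A "keys known only to the owner" design can be sketched in a few lines. The snippet below is a toy illustration of client-side encryption (not the design of any real product, and the keystream cipher here is deliberately simplistic, where a real system would use a vetted AEAD such as AES-GCM): the host only ever stores ciphertext, so neither the vendor nor a MITM can read the data without the key.

```python
# Toy sketch of client-side ("zero-knowledge") encryption for a
# private cloud: the server stores only nonce + ciphertext.
# Illustration only -- a real system would use a vetted AEAD.
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Pseudorandom keystream: counter mode over HMAC-SHA256."""
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    ct = bytes(a ^ b for a, b in zip(plaintext, ks))
    return nonce + ct  # what the untrusted host stores

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = secrets.token_bytes(32)          # never leaves the owner's devices
stored = encrypt(key, b"family photo bytes")
assert decrypt(key, stored) == b"family photo bytes"
```

The point of the sketch is the trust boundary: as long as the key stays on the owner's devices, the vendor (or an ISP ordered to MITM the connection) holds only opaque bytes, which is exactly why a government would have to compel a back door in the client rather than the pipe.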
| nemothekid wrote:
| Yes, a private cloud should be able to do it; I don't want
| to imply that it is impossible to build such a solution.
| justinwp wrote:
| > Gmail, today, already scans your email for ads;
|
| https://support.google.com/mail/answer/6603
|
| > The process of selecting and showing personalized ads in
| Gmail is fully automated. These ads are shown to you based on
| your online activity while you're signed into Google. We will
| not scan or read your Gmail messages to show you ads.
| hertzrat wrote:
| Notice is says "for ads," and not in general to profile you
| or train ai systems
| sib wrote:
| Although it's a fair mistake: Google used to do this.
|
| Here is an excerpt from my "welcome to Gmail" message, from
| Google, from June 2004:
|
| "You may also have noticed some text ads or related links
| to the right of this message. They're placed there in the
| same way that ads are placed alongside Google search
| results and, through our AdSense program, on content pages
| across the web. The matching of ads to content in your
| Gmail messages is performed entirely by computers; never by
| people. Because the ads and links are matched to
| information that is of interest to you, we hope you'll find
| them relevant and useful."
| pdonis wrote:
| _> A "private cloud" doesn't help if congress decides that
| comcast should also be allowed to MITM your SSL connection to
| scan every image you donwload._
|
| It also doesn't help if you send the image in an email to a
| third party that doesn't share encryption keys with you.
| Which is going to be the case for situations like the one
| described in this article unless your "private cloud" expands
| to include all your health care providers--not to mention
| your bank, your insurance company, etc., etc. Which of course
| it won't; all of those third parties are not going to care
| what your private cloud setup is; they're going to use
| whatever all the other large companies use.
| verisimi wrote:
| > I think this article points to a missing product in the
| market - integrated cloud services but with the cloud located
| on a private server in the home
|
| Absolutely. But who's going to sell it to the masses? Most
| people don't care, as they haven't been told to - no one has
| sold it to them.
|
| And who is going to explain about the erosion of privacy? Would
| that be Google, the government? Why, when both are
| beneficiaries of the existing system (ie making money for
| google, increasing control and monitoring for govts)? This is a
| market that is studiously ignored as it is in no one's
| interests!
|
| Do you remember when Google piled into RSS only to then try
| its best to kill it? When data is your business, it makes no
| sense to cut yourself out of the intermediation.
| tempodox wrote:
| I guess the three-letter agencies would be severely
| disappointed if that happened, at a minimum.
| gernb wrote:
| Apple is never going to do this. Their fastest-growing
| source of income is "services" (iCloud, etc.). They make a
| ton of money charging users for those services.
| murphyslab wrote:
| > the tremendous trade-offs entailed in the indiscriminate
| scanning of users' cloud data.
|
| It might be helpful to frame any cloud-based scanning of
| user activity or files as akin to mass testing in health
| care. While it sounds good in principle, there is always a
| background rate of false positives, and being incorrectly
| diagnosed can have negative consequences. The consequences
| of mass testing for various diseases without indication
| have been examined extensively.
|
| e.g.
|
| COVID testing: https://www.bmj.com/content/373/bmj.n1411/
|
| Herpes: https://www.statnews.com/2017/01/26/flawed-herpes-
| testing-le...
|
| Genetic testing: https://www.geneticsdigest.com/genetic-testing-
| potential-con...
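The mass-testing analogy comes down to the base-rate fallacy: when the condition being screened for is rare, even an accurate scanner flags mostly innocent people. A quick Bayes calculation makes this concrete (the prevalence and error rates below are invented for illustration, not figures from the article or from any scanner):

```python
# Base-rate sketch: why an accurate scanner over a rare condition
# still produces mostly false positives.  All numbers are
# illustrative assumptions, not measured figures.
prevalence = 1e-5            # 1 in 100,000 accounts truly offending
sensitivity = 0.99           # scanner catches 99% of true cases
false_positive_rate = 1e-3   # flags 0.1% of innocent accounts

# P(flagged) = P(flag | guilty)P(guilty) + P(flag | innocent)P(innocent)
p_flagged = sensitivity * prevalence + false_positive_rate * (1 - prevalence)

# Bayes: P(guilty | flagged)
p_guilty_given_flag = sensitivity * prevalence / p_flagged

print(f"{p_guilty_given_flag:.1%} of flagged accounts are true positives")
```

With these assumptions, roughly 1% of flagged accounts are true positives; the other ~99% are innocent users like the "Mark" of the article, which is why the consequences attached to a flag matter so much.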
| colinsane wrote:
| The article makes good points, but I don't think the
| conclusion is obvious. For example, I want the freedom to
| not interact with Google. A ruling requiring Google to
| provide services to every individual might allow for a
| future corresponding ruling that individuals must interact
| with Google (or at least this becomes the socially accepted
| view, which shapes laws later). Wiser, I think, would be to
| prevent any single company from becoming critical to
| participation in society.
| SpicyLemonZest wrote:
| I don't think that really solves anything here, because nothing
| in the story indicates that Google was _critical_ to Mark's
| participation in society. He got new contact information and it
| sounds like he's been able to (with some effort) get all his
| non-Google accounts switched over. He didn't get fired, or
| evicted, or land in any legal trouble. He just liked Google's
| services and enjoyed using them until he was unfairly banned.
| [deleted]
| themacguffinman wrote:
| If regulation is the answer, I'd hope it is designed with the
| restraint that Madison had: only guarantee the deepest and most
| fundamental rights. It should not rule out the ability to use
| error-prone systems at all, at most it could restrict their use
| for the most severe consequences like broad account bans. Google
| can and should be reducing the scope of their automated action.
|
| I think error-prone systems are still necessary today to reduce
| costs to a tolerable modern level. I'm not inclined to throw out
| the baby with the bathwater regarding communication costs for the
| same reason I don't want $300 toasters even if "they don't make
| 'em like they used to".
|
| A good example is something that Facebook does: they have a
| blacklist of domains you can't send on Messenger; the
| message is instantly blocked when you paste a detected
| link. From experience, this list is fairly large and very
| inaccurate, but the false positives for this system are
| annoying rather than catastrophic, because all it does is
| block a specific message. If you don't like this, you can
| use Signal or something; I don't want other people making
| that choice for me.
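The mechanism described above, an error-prone filter whose worst case is one blocked message, can be sketched in a few lines. The domains and matching rules below are invented for illustration; Facebook's actual list and logic are not public:

```python
# Toy sketch of a Messenger-style link blocklist: reject a message
# outright if it links to a blocked domain (or a subdomain of one).
# BLOCKED_DOMAINS is a hypothetical stand-in for the real list.
import re
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"evil.example", "spam.example"}

URL_RE = re.compile(r"https?://\S+")

def is_blocked(message: str) -> bool:
    """Return True if the message contains a link to a blocked domain."""
    for url in URL_RE.findall(message):
        host = urlparse(url).hostname or ""
        # Match the domain itself and any subdomain of it.
        if any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS):
            return True
    return False

assert is_blocked("check this out https://evil.example/offer")
assert not is_blocked("see https://en.wikipedia.org/wiki/Corporate_personhood")
```

Note how the failure mode is bounded by design: a false positive on the list blocks one message, rather than (as with account-level enforcement) locking a user out of everything at once.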
| sylware wrote:
| Isn't google/alphabet a part of the vanguard/blackrock galaxy
| now?
|
| Aren't blackrock/vanguard making the laws in the US? Like killing
| the planned obsolescence related bills?
| jjulius wrote:
| >As company shareholders, BlackRock and Vanguard can vote on
| behalf of their clients at company shareholder meetings. Both
| firms also have "investment stewardship" functions, which
| enables the proxy votes.
|
| >BlackRock's spokesperson said the votes can also be carried
| out by a portfolio manager - and in some cases at BlackRock,
| can be carried out by the clients themselves (here).
|
| >BlackRock and Vanguard do not "own" all the biggest
| corporations in the world. They invest trillions of dollars
| into leading companies on behalf of their clients, who
| ultimately own the shares.
|
| https://www.reuters.com/article/factcheck-business-investmen...
| sylware wrote:
| So blackrock/vanguard can "hide" their "big" investors,
| depending on the company invested in, or is this public
| info?
|
| That said, it is actually blackrock/vanguard people on the
| boards, and often they are the biggest ones; expecting no
| steering from them or the "big" investors behind them would
| be... "unexpected".
|
| There is also the massive debt, with gigantic interest to
| be paid all along the calendar year. In the case of
| Starbucks, is the ownership of the debt public info, or is
| the web page incomplete?
|
| That said, now I understand why online non-
| franchised/licensed starbucks stuff has the aspx file
| extension.
| jjulius wrote:
| >... it is actually blackrock/vanguard ppl on the boards...
|
| Citation needed.
| sylware wrote:
| The two other heavily blackrock-ed/vanguard-ed companies
| are Microsoft and Apple, so "their" directors would be the
| current Microsoft CEO and the Apple guy.
| jrm4 wrote:
| This is where much of the "free market" rhetoric, and
| Libertarianism in particular, needs to take a back seat.
|
| Given the reach, it's absolutely time for "government
| interference." Now, I'm not actually saying that we definitely
| should shut things down. But we absolutely need to drag Google
| people into public hearings and such. Google et al have
| inserted themselves into our lives to an extent that they
| ought no longer be able to claim "but we're a private
| company." It's increasingly difficult, if not impossible, to
| opt out.
|
| Example: I fully intended for my children to not have
| anything like a Google account until a certain age -- but
| then the pandemic and remote schooling happened. It is true
| that, at the cost of a great deal of pain to teachers et
| al., I could have opted out, but that would have been
| unreasonable, not to mention unsustainable for other people.
| Stupulous wrote:
| This is tangential, but your opening statement threw me for a
| loop. As a Libertarian, I've spent a lot of time arguing with
| people on the left and right who are on the pro-censorship side
| when it comes to corporations. While it wouldn't surprise me to
| see a Libertarian take a position against individual liberty on
| this issue, I have not seen that happen. Your ire may be better
| directed at the ones in the driver's seat who have the power to
| address this and refuse to do so rather than the powerless
| minority who consistently get this right.
| [deleted]
| pdonis wrote:
| _> This is where much of the "free market" rhetoric, and
| Libertarianism in particular, needs to take a back seat._
|
| No, it's where we the people need to understand that rights
| come with responsibilities. If you want to claim the right to
| keep your data private, you need to not give it to third
| parties that you know are not going to honor that. And knowing
| that is easy when it comes to Google, Facebook, Twitter, etc.,
| because of the simple rule that if you're not paying for the
| product, you _are_ the product.
|
| The problem here is not too much free market and
| libertarianism, but too little. The ultimate reason for this...
|
| _> It's increasingly difficult, if not impossible, to opt
| out._
|
| ...is that the _government_ has inserted itself more and more
| deeply into our lives, and we have let it. For example, your
| kids' schools are beholden to Google because they are run by
| the government, which is insulated from free market competition
| and can just punt on providing proper infrastructure. And the
| reason why Google, Facebook, Twitter, etc. can survive on the
| ad-supported business model, instead of having to make their
| users actual paying customers as would be the case in a real
| free market, is that the government has set things up to favor
| them. They can get cheap loans to build the massive
| infrastructure they need because of government monetary policy.
| They can insulate themselves from any effective legal challenge
| because of the way the legal system is set up.
| Aunche wrote:
| > But we absolutely need to drag Google people into public
| hearings and such
|
| We _have dragged_ Google into public hearings. The problem is
| that the purpose of these hearings has always been to collect
| sound bites rather than to define a relationship between
| internet services, the people, and the government.
| ghiculescu wrote:
| What should the government do? The article discusses how it is
| not clear what should be different (other than turning the
| relevant accounts back on).
|
| It is easy to ask for public hearings but if there's no
| intended remedy or change they will just be a show trial and
| nothing will change.
|
| Your example is great, because to me it illustrates my point
| well. What would you have liked to be different?
| jrm4 wrote:
| I think good places in the past to look are "warning labels"
| and/or "common carrier"-like ideas.
|
| Warning labels today, I think, means fixing voluminous EULA
| crap, which we've done reasonably well in other fields --
| the FDA, I think, is pretty solid here. Simply requiring
| "readable" terms of service would go a long way (though you
| have to wait a bit for things to shake out).
|
| The general principles behind "common carrier" type services
| would also be a good idea to start thinking about; they
| essentially just say "When your product or service becomes a
| day-to-day thing for a lot of people, we're going to hold you
| to certain accessibility rules."
| loudmax wrote:
| I think I largely agree with you about getting the government
| involved. But the nature of that involvement is important here.
|
| As I see it the fundamental problem is not that Google isn't
| behaving as a responsible steward of free speech. The problem
| is that Google shouldn't have this power to begin with. By law,
| Google should be allowed to capriciously and arbitrarily shut
| down a user's account. It's their service, they shouldn't need
| to provide a justification. But Google shouldn't be in a
| position where they control so much of the market. There needs
| to be real competition in this area so users can easily switch
| to a different provider if they lose faith in Google's
| reliability. Where the government needs to be involved is in
| ensuring real choice in a free market, rather than the current
| duopoly.
|
| Ensuring a free market isn't easy. It's much easier to simply
| pass a law that says that Google (or a company in a similar
| position) needs to provide better service or offer users
| recourse to dispute this sort of service interruption. But
| that's just putting wallpaper over a crack. We need the
| government to commit to the hard work of enforcing open
| standards of interoperability so smaller entrants can get into
| the market.
| merely-unlikely wrote:
| I'd like to see a private-equity-style Competition Authority.
| The idea is that the government would come into a monopolized
| market and fund new competitors. I say PE because I'd rather
| the public profit if the ventures succeed than hand out
| subsidies, but the real point is funding more choice versus
| only having the power to break up or constrain existing
| corporations. Another tool in the toolbox.
| jrm4 wrote:
| Essentially, though -- this is just "more regulation with
| extra steps."
|
| Just pass the regulations and competition will likely find
| a way around it. Honestly, this to me is the biggest blind
| spot in all of "pro free market capitalism." The good
| flavor of capitalism is _antifragile._ You can make
| competition better precisely by making things HARDER on
| private companies, not easier. Stuff gets better because
| the weak die.
| shkkmo wrote:
| > By law, Google should be allowed to capriciously and
| arbitrarily shut down a user's account. It's their service,
| they shouldn't need to provide a justification.
|
| I strongly disagree.
|
| While there should be some level of protection for large
| companies' ability to terminate customers at will, there also
| need to be consumer protections against unreasonable EULAs
| and capricious decisions. (Edit: some such protections
| already exist, so it is a question of whether and how we shift the
| balance between consumer rights and corporate rights. I think
| as the companies grow larger, more powerful, and more
| integral to our daily lives, that balance needs to shift
| further towards consumer rights given the power imbalance.)
|
| When companies sell us digital goods and become integral
| parts of our digital lives, there arises a need to balance
| their rights of free association against the rights of their
| users to due process.
|
| That is not to say that simply mandating a fair dispute
| resolution system is a sufficient solution, but it is a
| necessary step.
|
| I think mandating data portability and maybe even
| interoperability is also necessary (though an
| interoperability mandate seems tricky to do well.)
| pdonis wrote:
| _> When companies sell us digital goods_
|
| Google is not selling you any digital goods. They provide
| services like GMail for free to people. That is a big part
| of the problem: users are not paying customers and so have
| no leverage, as they would in a proper customer
| relationship. Instead, users are the product, and the
| actual paying customers have all the leverage in
| determining how the product is treated.
| horsawlarway wrote:
| I think this is roughly my stance as well.
|
| If we've allowed Google (and a few other select tech
| companies) the privilege of becoming de-facto monopolies in
| their space (and I believe we absolutely have, often
| intentionally as government policy in the international
| space) then they should be bound by the same rules that the
| government is.
|
| I think this is even more obvious when you consider how some
| of these companies creep into government processes.
|
| Ex: Google Classroom is estimated to be in use in more than
| 60,000 schools across the US. This means that a Google
| account is required for all students in these schools (opt-
| outs exist, but are impractical, hard to enforce, and often
| cause you to be labeled as troublesome by teachers and
| staff).
|
| If my child attends a public school that is using Google
| software and requiring a Google account, the government damn
| well is involved, and I expect the government to require the
| same rights & due process for a Google account as they do for
| other governmental actions.
| ghiculescu wrote:
| It seems like the simplest solution here would be to not ban
| accounts automatically for CSAM detections, but to have a process
| to do so based on police recommendation.
|
| Clearly Google already has a process to escalate detection to the
| police so banning the account based on what happens there doesn't
| seem like a big leap.
| notriddle wrote:
| > It seems like the simplest solution here would be to not ban
| accounts automatically for CSAM detections, but to have a
| process to do so based on police recommendation.
|
| The police aren't going to recommend a ban. They don't want
| people banned. They want people arrested, tried, and convicted
| for criminal possession. They're going to recommend keeping the
| account open until they have enough evidence to take it to
| court.
| dcow wrote:
| Which seems completely fine. If the person is being
| investigated and continues to commit abuses, the police can
| arrest them immediately with evidence in hand, increasing the
| odds that society locks the person up promptly.
| If the suspect is in custody, what does the status of their
| account matter? The point is that the account ultimately gets
| closed and the data purged if the suspect is found guilty, so
| what benefit is there to anybody in being delete-happy?
| bcatanzaro wrote:
| We need to regulate Google and require it to have a transparent
| appeals process.
| tinalumfoil wrote:
| I don't think Google is doing this out of deep concern about
| CSAM or contempt for due process; I think it's the law itself
| that essentially requires corporations to have these trigger
| bans, since otherwise they'd face liability.
|
| Think about Google's alternatives here.
|
| They could (1) not have CSAM filters -- your data is private,
| no need for Google to scan anything -- in which case people would
| use their platform to distribute illegal content and they'd be
| a nice target to get massive damages from.
|
| Or (2) give people due process, rights, appeals, etc. despite
| having none of the tools of a court of law (it's even illegal
| for them to examine the evidence), and if they make the wrong
| decision _or_ make the right decision too late, they're still
| liable for their platform being used for illegal activity.
| Keep in mind the only way to truly know if material is
| sexually explicit is to show it to Potter Stewart. He'll know
| it when he sees it.
|
| Or (3) be proactive in building systems that detect illegal use
| of their platform and aggressively, without process or appeal,
| remove any user of the platform the moment they have any
| evidence at all that the user is engaged in this activity.
|
| Because people will try to use Google's platform for illegal
| activity and Google knows this, (3) is not only the only
| sensible option but it's actually the only legal option. It
| would, in effect, be _illegal_ for Google to do anything other
| than scanning all your files and aggressively ban people.
| treis wrote:
| >use their platform to distribute
|
| I'm not a fan of this phrasing because it absolves Google of
| responsibility. If the CSAM is on a Google server then Google
| is possessing and distributing CSAM. You can say they're
| doing it unwittingly or unintentionally but that doesn't
| change the fact that they're doing it.
| josephcsible wrote:
| > how big the number of false positives would have to be to shut
| the whole thing down?
|
| Since it seems that false positives are uncorrectable (given that
| even a New York Times article hasn't gotten this one fixed), and
| the consequences of one are thus life-destroying, even a single
| false positive should mean shutting the whole thing down.
| sulam wrote:
| So, Google, Apple, Twitter, Facebook, etc should all shut down
| their scanning for CSAM? Do you realize the outcry this would
| produce?
|
| Google is not unique here, nor does it have unique power. This
| story would be just as plausible, and difficult, if the company
| in question were Facebook or Apple.
|
| Once a system exists to scan for this material, it's quite easy
| to argue that shutting it off would be immoral and potentially
| criminal. "You had the means to identify abused children and
| help them, but you turned it off because you couldn't work out
| a reasonable policy to deal with the false positives? Try
| harder, people."
|
| (Note that I do not speak for my employer here.)
| Semaphor wrote:
| > So, Google, Apple, Twitter, Facebook, etc should all shut
| down their scanning for CSAM?
|
| What they should shut down is their irrevocable deletion of
| accounts. When the false positives trigger nuclear bombs,
| then yes, a single false positive is too much.
|
| As far as I know, only Google does this, so everyone else can
| keep scanning in this argument.
| count wrote:
| Shutting down this new type of scanning is not the same as no
| longer scanning for CSAM.
|
| It's curious how the big providers have been scanning for
| CSAM for YEARS with nothing making the news... because
| matching hashes of known material is a much different
| approach and doesn't produce false positives like this.
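The distinction the comment draws can be sketched in a few lines. This is a hypothetical illustration only, not any provider's actual pipeline: the hash list, names, and file contents below are stand-ins.

```python
import hashlib

# Illustrative only: classic CSAM scanning compares a file's digest
# against a list of hashes of *previously identified* material, so a
# never-before-seen photo cannot match. The entry below is a stand-in.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"previously-identified-file").hexdigest(),
}

def flagged_by_hash_list(file_bytes: bytes) -> bool:
    """Flag a file only if its exact digest is already on the list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

print(flagged_by_hash_list(b"new family photo"))            # False
print(flagged_by_hash_list(b"previously-identified-file"))  # True
```

An ML classifier, by contrast, assigns a score to novel images and can misfire on innocent ones, which is the failure mode in the article.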
| josephcsible wrote:
| It's immoral _not_ to violate everyone's privacy, because
| some people are criminals?
| karaterobot wrote:
| The process that has led to our phones scanning everything we
| do and condemning us in an opaque extrajudicial process began
| with much smaller violations of privacy and trust -- analysis
| of our emails, tracking our movement with cookies, and so on.
| When these practices were introduced, some people complained,
| saying they were part of a mechanism that could only ratchet in
| one direction: toward ubiquitous surveillance and corporate
| control over more and more of our lives.
|
| Cut to just a few years later, and here we are: "shutting it
| off would be immoral and potentially criminal". We can't go
| back! So, that clicking sound you hear is the ratchet
| turning.
| tomrod wrote:
| I disagree. The value Google and similar web services bring is
| enormous. That said, there should be people in the loop who are
| trained and authorized to work around the operational systems.
| This is the cost of doing business.
| josephcsible wrote:
| I don't mean shutting down all of Google. I mean shutting
| down the whole automatic image scanning and account banning
| system.
| ajross wrote:
| That logic seems to go both ways, though. Actual child abuse
| is, surely, even more "life-destroying", right? So wouldn't it
| make more sense to tolerate any number of false positives if we
| can save just one kid?
|
| I'm always amazed at the number of people who come to a
| complicated nuanced issue like this, accuse one side of having
| implemented an absolutist and inflexible dystopian monster, and
| then promptly demand an equally absolutist solution.
|
| The libertarian answer might well be that we need to tolerate
| CSAM in the interests of free speech, but that's not the sense
| of the society we live in, nor of the laws it has passed. All
| paths must be middle paths.
| dont__panic wrote:
| It is interesting to me that CSAM is used as justification for
| such heavy-handed consequences and widespread surveillance.
| People use the internet to do all kinds of awful things, plenty
| of them crimes, plenty of those crimes being violent or
| personal property crimes. Murder, robbery, burglary, etc.
|
| Obviously CSAM is vile and not something your average citizen
| condones. But so are those other crimes. I suppose children
| "pull on the heartstrings" more than your average victim, but
| children are also the victims of those other crimes. Why
| haven't we started surveilling personal messages to detect
| criminal planning in advance? Because more people understand
| the nasty implications of the government reading your messages,
| but few understand the implications of machine learning
| "detecting" CSAM in your Google Drive or Photos?
| matthewdgreen wrote:
| In the UK they also search for "terrorism related content"
| and other types of criminal media. While this isn't required
| in the US (and neither is CSAM scanning) part of the impetus
| for building these systems is to satisfy non-US governments
| that don't have the same constitutional protections that the
| US does. US law enforcement officials routinely join with
| their non-US counterparts in urging/pressuring US firms to
| implement these systems, knowing that once they're built
| (non-voluntarily) for non-US compliance they will likely be
| turned on ("voluntarily") for US customers as well.
|
| See e.g. this open letter to Facebook authored by US AG Barr
| and counterparts from the UK and Australia.
| https://www.justice.gov/opa/pr/attorney-general-barr-
| signs-l...
| vineyardmike wrote:
| Your broader point stands, but in the US at least "pre-crime"
| monitoring or police intervention is a massive legal can of
| worms.
|
| The planning of a crime can itself be illegal, e.g.
| "conspiracy to commit murder," but prosecuting it is
| generally a legal minefield.
___________________________________________________________________
(page generated 2022-08-29 23:00 UTC)