[HN Gopher] Few people know that Google voluntarily removes some...
___________________________________________________________________
Few people know that Google voluntarily removes some search results
Author : danso
Score : 281 points
Date : 2021-06-11 20:05 UTC (2 hours ago)
(HTM) web link (twitter.com)
(TXT) w3m dump (twitter.com)
| drdavid wrote:
| I wonder what checks and balances are in place.
|
| Can someone be a complete dirtbag and request that legitimate
| criticism be removed simply because they don't want folks to know
| they're a dirtbag?
|
| Can convicts request the results be removed? How about sex
| offenders? How about people convicted of domestic violence
| assaults or similar?
| k2xl wrote:
| What I was wondering is how they verify that the person making
| the request is the one being removed. Could you technically have
| all results about someone you don't like removed? I could see
| that harming businesses, if your name is closely associated with
| the websites for your business.
| paxys wrote:
| Not always voluntary. They have to do this by law in a lot of
| countries.
| renewiltord wrote:
| Well, they kind of have to, right? Right to be forgotten and all
| that.
| dkokelley wrote:
| This is an example of a large central authority censoring
| information.
|
| How does the notion of a purely distributed, unregulated,
| uncensorable, blockchain-backed internet handle "revenge porn" or
| other genuinely harmful content?
|
| An argument I hear from the crypto community is that blockchain
| is good because it enables freedom of speech that can't be banned
| by governments or other central authorities.
|
| The crypto community needs to address the other side of that coin
| too. Are there circumstances when something _should_ be banned,
| and how does that work on a blockchain?
| lozaning wrote:
| I freaked a bunch of FBI agents out a while back by Base64
| encoding a photo of an FBI logo and writing that to the Eth
| chain. They have massive concerns around this regarding data
| exfiltration. The cost of removing the data, once written to a
| chain, is essentially equal to the market cap of the coin for
| that chain.
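|
| (For anyone curious what "writing data to the chain" looks like
| mechanically, here is a minimal sketch, not the commenter's
| actual code, assuming web3.py v6; the RPC URL, private key, and
| file name are placeholders you would supply yourself:
|
|     import base64
|     from web3 import Web3
|
|     RPC_URL = "https://example-eth-node.invalid"   # placeholder
|     PRIVATE_KEY = "0x..."                          # placeholder
|
|     w3 = Web3(Web3.HTTPProvider(RPC_URL))
|     acct = w3.eth.account.from_key(PRIVATE_KEY)
|
|     # Base64-encode the image; the bytes ride along as calldata.
|     payload = base64.b64encode(open("logo.png", "rb").read())
|
|     tx = {
|         "to": acct.address,          # send to yourself
|         "value": 0,
|         "data": Web3.to_hex(payload),
|         "nonce": w3.eth.get_transaction_count(acct.address),
|         "gas": 21000 + 16 * len(payload),  # rough per-byte cost
|         "gasPrice": w3.eth.gas_price,
|         "chainId": 1,
|     }
|
|     signed = w3.eth.account.sign_transaction(tx, PRIVATE_KEY)
|     tx_hash = w3.eth.send_raw_transaction(signed.rawTransaction)
|     print("payload embedded in tx", tx_hash.hex())
|
| Once mined, that calldata is replicated by every full node,
| which is the permanence being described here.)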
| IncRnd wrote:
| The reason is that it is illegal to use the FBI logo unless
| authorized to do so.
| CDRdude wrote:
| I don't understand. Why did this freak out a bunch of FBI
| agents?
| gninazol wrote:
| Are you asking why it freaked out real FBI agents or the
| ones in lozaning's imagination?
|
| The FBI doesn't give a single damn hoot about "data
| exfiltration" via blockchain metadata. Anyone with half an
| understanding of any of those terms knows what an absurd
| implication that is.
|
| For concerns about "data exfiltration", discussing
| blockchain doesn't even make a single damn bit of sense.
| Blockchain is about permanence and publishing, not about
| "exfiltration".
|
| I'm not sure which is more amusing, thinking that the FBI
| would give a hoot about b64-encoded data in the metadata of
| a transaction, or that this person wrote that comment on HN
| to try and seem cool. LOL, what, the FBI reached out to
| them (how?/why? nothing about this makes sense)?
| lozaning wrote:
| Because the data is there _forever_ and there's not really
| anything they can do about it, which stands in stark
| contrast to how they're used to operating.
|
| There is no person they can throw in jail, no corporation
| that can be sued, no servers that can be seized, domains
| taken over, etc, that will result in that data no longer
| being available to those that know where to look for it.
|
| The data just remains online until such a time that the
| coins of a given chain are worth $0.0 and coin holders no
| longer have a financial incentive to keep the ledger
| online.
|
| FWIW the ones I talked to were more of the door kicking
| variety, not the 'cyber' type.
| int_19h wrote:
| But why would they freak out about a _logo_?
| dkokelley wrote:
| Another thought experiment:
|
| What would happen if someone encoded something horrible and
| illegal like child porn into a blockchain? Is everyone who
| operates on that chain guilty of possession? Does that make a
| specific crypto "illegal" if its blockchain contains illegal
| content?
| tick_tock_tick wrote:
| People already added it into bitcoin if you stretch the
| definition a bit. As in the bits needed to reconstruct the
| image exist in the chain.
| lostmsu wrote:
| If you stretch it wide enough, you should ban natural
| numbers, because some of them in binary form are identical
| to child pornography video files.
| kr99x wrote:
| No, no information should ever be banned from any
| public/commons - the end. You want to ban certain information
| from a particular place you control? Go nuts. The wider public?
| No.
|
| The power to ban information is _too great_ to be entrusted to
| any authority at all. Depending on how thorough the "ban" (web
| text filter at the ISP level? mandatory AR implants at birth
| filtering banned content? worse?), it's anywhere from an
| abhorrent violation of human rights and the principles behind
| free exchange and scientific inquiry all the way up through
| literally the most powerful weapon which could even
| _theoretically_ be designed.
|
| This is not a road worth going down, for any amount of harm
| reduction. The cost _today_ may be worth it. The cost long term
| is potentially too great to even consider risking. There is no
| guarantee of who holds the ban hammer tomorrow.
| dkokelley wrote:
| Thanks for sharing. Do you think there is ever a scenario
| where information should be censored from the public at
| large? (Child porn, or you or your family getting 'doxxed',
| for example)
|
| I appreciate the take that the future harm isn't worth the
| benefit today, since we'd enable future arbiters to control
| what is available.
|
| There's nuance in determining what is a public sphere vs. a
| privately controlled platform. The places with the most
| distribution are currently private, but crypto could change
| that to where we collectively own the platform, effectively
| making it publicly owned. Does that change your thoughts on
| when censorship is appropriate?
| ipaddr wrote:
| You don't undo a harm by trying to erase it. If the porn has
| been created/recorded, the ex has a copy and could upload it to
| non-blockchain locations. Once he shares it, it has been
| decentralized.
|
| There was never a way to undo a porn video you made when you
| were younger.
| afavour wrote:
| The internet is incomprehensibly vast. Yes, there might be a
| copy (or multiple copies) of a video you made in your youth
| on the internet. But as long as it isn't a) prominently
| available and b) attached to your name, you might be OK.
|
| The situation we have today does allow you to address such a
| problem: Google can remove search results. It's not possible
| to do that on the blockchain.
| ipaddr wrote:
| The blockchain is vast and growing. Discovery is difficult.
|
| Google prevents new searches, or tries to, but if you have
| the link you can visit it without Google. Same with the
| blockchain: you would need to know where to look.
| summerlight wrote:
| But you can still try making it harder to access by cutting
| off the major distribution channels, which may significantly
| reduce its propagation velocity. With technologies that
| refuse to address this issue, you cannot. This is a critical
| difference for law enforcement.
| ipaddr wrote:
| You wouldn't stop a torrent. Why should the blockchain be
| different? Discovery still happens elsewhere.
| pjc50 wrote:
| You can't 100% undo it, but you can criminalize distribution.
| Uploading it to a system from which it cannot be deleted
| guarantees no leniency.
| [deleted]
| gninazol wrote:
| Always good to rehash the same exact conversation over and over
| for ... 13 years now. Can we at least pretend to learn
| something from every other time this exact conversation has
| played out?
|
| No? We're just gonna relitigate it all from scratch again -
| relearning the same naive lessons over and over? (Shockingly
| people in this space _do_ indeed understand the implications of
| censorship-resistant platforms.)
| meekmind wrote:
| > The crypto community needs to address the other side of that
| coin
|
| Do they? Nothing is ever _really_ deleted on the good old-
| fashioned internet either.
|
| We can't have our cake and eat it too. Having a centralized
| arbiter of truth is more dangerous to the truth than bad people
| who do bad things.
| devenblake wrote:
| > Nothing is ever _really_ deleted on the good old-fashioned
| internet either.
|
| Tell that to folks doing web archival.
| jedimastert wrote:
| So, like, I get what you're saying, but I feel like you didn't
| actually look past the title of the post. This isn't
| governments censoring information or whatever, it's a form you
| can fill out to request removal of coerced personal information
| like revenge porn or doxxing.
| dkokelley wrote:
| I understand what the post is saying. I see this as an
| example of censorship (by Google in this case) being a GOOD
| thing. This caused me to wonder about how a decentralized
| platform would handle similar circumstances.
| jedimastert wrote:
| Whelp color me an ass for commenting about not reading past
| the headline while also completely missing the main crux of
| your argument! That's my b
| narrator wrote:
| When talking about the blockchain and other voluntary systems,
| I tend to look at these problems from a "what if there were no
| government that could just use physical coercion to implement
| its laws; how would you get people to voluntarily sign up for
| this?" perspective.
|
| You would get users to sign the social contract with some
| online entity and that entity would censor the blockchain for
| their ideological/legal jurisdiction. For example, you could
| sign the Christian fundamentalist social contract and have all
| blasphemy removed from your search results. In exchange, you
| would receive community support, access to their content, etc.
| If they found out you were browsing blasphemous material they
| would revoke the particular social contract you signed.
|
| Just spitballing here, but I wonder if the nofap guys could
| start something like this to block all nsfw content on whatever
| distributed blockchain thing was out there. Then they could
| offer some special forum as a benefit. You could use a DAO for
| governance conflicts, etc.
| pjc50 wrote:
| That doesn't (and can't) prevent revenge porn or blackmail or
| doxxing in any way, because the victim is not a party.
| narrator wrote:
| Obviously you can use courts and the police in the form of
| the existing government. That works fine. We're trying to
| figure out how to do this in a borderless global internet
| via blockchain and so forth.
|
| The victim would contact the organizations and ask that
| they remove the material citing their mission and bylaws.
| Maybe they would form a hierarchy with the most generally
| agreed upon rules being shared, like a treaty, between
| blockchains.
| kevin_thibedeau wrote:
| I'd love to walk into a business without shoes or a shirt and
| demand service. For some crazy reason these businesses are
| allowed to restrict my liberty on their property.
| dkokelley wrote:
| I guess I'm thinking more about a "decentralized Google"
| where it's impossible to remove things like revenge porn. Is
| there anything we can/should do about that potential?
| kypro wrote:
| IMO both of these options are terrible.
|
| I seem to lean quite far on the freedom side of most arguments,
| but I do acknowledge there are times when action may need to be
| taken in the interest of the public. My objection is that I
| want neither for it to be impossible to take action nor for a
| foreign company to unilaterally decide what action to take.
|
| What we need in situations like this is a legal process. If one
| doesn't exist, it's not for companies to start deciding what
| information the public should have access to and what is
| "harmful", but for democratic countries to pass laws with the
| consent of their local electorates to decide what legal
| protections and processes need to be put in place.
|
| The problem we have today is that there are too many foreign
| companies deciding what we can and can't say or do. Crypto has
| the exact opposite problem, but unless our governments step up
| and regulate these companies in the interests of the public,
| our only option (if you don't agree with the censorship) is to
| create something uncensorable.
| dkokelley wrote:
| I appreciate this response! I agree that a legal process
| seems to be the best solution (so far) to collectively
| deciding what is and isn't ok.
|
| The legal system has its own faults of course. It's
| administered by fallible governments and can have individual
| bad actors. But could a legal system expect to exert control
| over a decentralized system like a blockchain?
|
| Put another way, if there was a "crypto twitter" clone, and
| someone posted revenge porn or personal information about
| someone (home address, let's say), wouldn't that post be
| forever embedded into the blockchain? Would a legal system
| ever be able to remove it?
| avianlyric wrote:
| It may not be able to remove it, but it certainly can make
| viewing, storing or disseminating it illegal and punish
| those that do.
|
| Ultimately a legal system is how a society controls a
| government's monopoly on violence. A legal system can remove
| almost anything if it really wants to, by virtue of the
| fact that it can send big men with guns to destroy whatever
| physical manifestations of the thing exist.
|
| Of course there are practical limits to this power. But
| that's never stopped a government before.
| Dracophoenix wrote:
| It's not just companies. It's differing jurisdictions. The
| Middle East would hold that promoting homosexuality is
| "harmful" (not unlike the US even a few years into the 21st
| century). China would hold that it is harmful to promote a
| different political party than the CCP. India finds it
| harmful for people to kiss in public. There is no legal
| process that discerns "truly harmful" material from that
| which is merely perceived. Harm or lack thereof comes down to
| individual assessment.
| slg wrote:
| >I seem to lean quite far on the freedom side of most
| arguments, but I do acknowledge there are times when action
| may need to be taken in the interest of the public. My
| objection is that I neither want it to be impossible to take
| action or for a foreign company to unilaterally decide what
| action to take.
|
| >What we need in situations like this is a legal process. If
| one doesn't exist, it's not for companies to start deciding
| what information the public should have access to and what is
| "harmful", but for democratic countries to pass laws with the
| consent of their local electorates to decide what legal
| protections and processes need to be put in place.
|
| What is interesting is that I wouldn't even have to change a
| single word here and I can equally apply this reasoning to
| encryption. It is almost always unpopular on HN to suggest
| that universal end-to-end encryption might not be a great
| idea or that encryption backdoors are something that need to
| be seriously discussed, but those opinions are born out of
| the same underlying logic.
|
| Free speech is good, but no society wants universal free
| speech since there are legitimately evil people who will use
| their speech for malicious ends. The same is true of
| encryption. Why are we ok with removing revenge porn from
| Google results, but have to just accept that we are allowing
| child porn to be shared via readily available encrypted
| channels?
| kypro wrote:
| Personally I'm not convinced the rewards outweigh the risks
| in regards to encryption, at least not at this moment in
| time. But again, I'd much rather issues like these were
| debated democratically than some tech company deciding one
| day that they need to view my private messages to "protect
| the children".
|
| But to your point, if the sharing of child porn or other
| illegal content ever got so bad that something urgently
| needed to be done, I would personally be open to limiting
| the use of E2EE (if there truly was no better option), and
| I'd assume most people here would agree, although I would
| argue that in many cases it's E2EE that prevents you from
| becoming a victim of things like revenge porn in the first
| place.
| int_19h wrote:
| The thing about E2EE is that it's not something that you
| can meaningfully ban in a non-totalitarian society. RSA
| boils down to one fairly simple formula, for example, or
| a Perl one-liner. People who _really_ need it will figure
| it out.
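|
| (A toy illustration of that "one fairly simple formula",
| textbook RSA with deliberately tiny, insecure parameters:
|
|     p, q = 61, 53
|     n = p * q                          # public modulus
|     e = 17                             # public exponent
|     d = pow(e, -1, (p - 1) * (q - 1))  # private exponent
|     m = 42                             # message as integer < n
|     c = pow(m, e, n)                   # encrypt: c = m^e mod n
|     assert pow(c, d, n) == m           # decrypt: m = c^d mod n
|
| Real deployments add huge primes and padding, but the core math
| is no more than modular exponentiation, which is the point the
| comment is making about enforceability.)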
|
| Besides, there would be quite a few people who'd be
| actively circumventing any such law if it were enacted,
| e.g. by hosting the apps in other jurisdictions. And I
| think that's a good thing! There should be fundamental
| limits on the power of governance, regardless of how
| democratic said governance is; and democracies can be no
| less abusive than other forms when it comes to minorities
| etc. Or even majorities, when established public mores
| essentially require widespread hypocrisy - the
| Prohibition comes to mind. "Think of the children" (or
| terrorists, or whatever is the go-to political excuse at
| any particular moment) is not a valid exception.
| slg wrote:
| >I'd much rather issues like these were debated
| democratically than some tech company deciding one day
| that they need to view my private messages to "protect
| the children".
|
| I agree with your general point except most people in the
| tech community are not willing to even have these
| debates. It is often treated as an issue with a single
| right answer and that encryption can never be
| compromised.
|
| Removing ourselves from the debate is only going to end
| up with the decision being made without our input.
| avianlyric wrote:
| The difference is that strong encryption is an all or
| nothing proposition. You either have strong encryption that
| works in all scenarios, both good and ill, or you have weak
| encryption that protects nothing. There unfortunately is no
| middle ground, and plenty of people and governments have
| tried.
|
| Selective censorship by central authorities can be
| granular; choosing to create a legal process for selective
| censorship doesn't suddenly allow any person on the
| internet to censor anything they want. Weak encryption,
| however, allows anyone, anywhere, with enough effort, to
| break all encryption everywhere, and to do so without
| detection or needing to expose themselves via a legal
| system.
|
| Also there aren't "central encryption authorities" that all
| encryption passes through. Anyone can implement modern
| crypto wherever, so banning it makes no sense. It's like
| trying to ban the concept of Pi.
| TurningCanadian wrote:
| I wouldn't say it's all or nothing. Several of the more
| regressive countries already demand access to decryption
| keys. The government gets easy access to the data but
| it's still hard for the non-state bad guys to intercept.
| avianlyric wrote:
| Doesn't help if people are using end-to-end encryption.
| slg wrote:
| >You either have strong encryption that works in all
| scenarios, both good and ill, or you have weak encryption
| that protects nothing. There unfortunately is no middle
| ground, and plenty of people and governments have tried.
|
| This is always stated as a universal truth of the
| technology, but this mostly seems like a people and
| organizational problem. We already have encryption
| algorithms that can encrypt content for decryption by
| multiple optional keys. Why couldn't we fragment and
| distribute one set of those keys among multiple legal
| entities? That would require coordination and agreement
| to decrypt anything. That wouldn't be a true backdoor
| any more than the original key is a backdoor; it would
| simply be overhead on the existing encrypted content to
| allow it to be decrypted by multiple keys.
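|
| (A toy sketch of the "fragment one set of keys among multiple
| legal entities" part of this idea: a plain XOR split in which
| every share is required to rebuild the key. Real escrow
| proposals would more likely use a threshold scheme such as
| Shamir's; this is only to make the concept concrete, and the
| names are illustrative:
|
|     import secrets
|
|     def xor(a: bytes, b: bytes) -> bytes:
|         return bytes(x ^ y for x, y in zip(a, b))
|
|     def split(key: bytes, n: int) -> list[bytes]:
|         # n-1 random shares plus one that XORs back to the key
|         shares = [secrets.token_bytes(len(key))
|                   for _ in range(n - 1)]
|         last = key
|         for s in shares:
|             last = xor(last, s)
|         return shares + [last]
|
|     def combine(shares: list[bytes]) -> bytes:
|         key = shares[0]
|         for s in shares[1:]:
|             key = xor(key, s)
|         return key
|
|     escrow_key = secrets.token_bytes(32)
|     parts = split(escrow_key, 3)         # one part per entity
|     assert combine(parts) == escrow_key  # all must cooperate
|
| It says nothing about the operational questions raised in the
| replies, such as how the shares are stored and recombined at
| scale.)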
|
| >Also there aren't "central encryption authorities" that
| all encryption passes through. Anyone can implement
| modern crypto wherever, so banning it makes no sense.
| It's like trying to ban the concept of Pi.
|
| This is true, but defaults matter. If the US makes a law
| regarding encryption, Google, Apple, Amazon, Microsoft,
| Facebook, etc are all going to follow it. You still might
| be able to roll your own encryption, but the people who
| do that will be a tiny share of overall communication.
| Laws like this aren't meant to completely stop something.
| Putting in financial regulations to crack down on money
| laundering doesn't stop money laundering. The goal is to
| make it more difficult and prevent the most egregious
| cases.
| avianlyric wrote:
| > Why couldn't we fragment and distribute one set of
| those keys among multiple legal entities?
|
| There would be thousands if not millions of legitimate
| decryptions every year. At each instance all of the
| fragments will need to be put together, creating an
| opportunity for the data to be exfiltrated.
|
| Additionally, you're making the assumption that legal
| agencies will be able to securely store these keys long
| term (i.e. forever). Regardless of your view on the
| operational and security competency of these agencies,
| it's extremely difficult to keep cryptographic keys secure
| long term if you need to be able to access them on a
| regular basis.
|
| Even if you don't think most criminal organisations will
| manage this, you can pretty much guarantee that large
| nation state actors like China and Russia will find a way
| to get hold of these keys. You then give your largest
| competitors the ability to seriously damage your economy
| or steal secrets by either using the keys themselves, or
| leaking them on the internet. Suddenly every single
| message sent by every citizen, politician, bank, weapons
| company etc becomes public for all to see.
|
| > This is true, but defaults matter. If the US makes a
| law regarding encryption, Google, Apple, Amazon,
| Microsoft, Facebook, etc are all going to follow it. You
| still might be able to roll your own encryption
|
| You will 100% be able to roll your own, and trivially
| too. You'll just need to go to GitHub and grab the source
| code, or pre-built binaries for a working crypto system,
| of which there are many. Banning the big companies from
| using crypto isn't enough, you also need to ban anyone
| from talking about crypto as well.
|
| This is very different to money laundering rules.
| Realistically you can't opt out of the modern financial
| system, regardless of what blockchain proponents say, so
| introducing rules to gatekeep money flows at system choke
| points makes sense. Also, the ability to censor money at
| choke points doesn't also create an opportunity for
| enemies to exploit those same choke points.
|
| The money laundering equivalent of crypto censorship
| would be like the US deciding that it was going to switch
| to Bitcoin so that all transactions are public and
| accessible to law enforcement. Then just kinda hoping
| that a country like China isn't going to launch a 51%
| attack.
| slg wrote:
| Your first three paragraphs are all focusing on
| structural flaws with my suggestion and not technical
| limitations of encryption itself. The debate has already
| moved from this is impossible to this is impractical. We
| can fix impractical. For example, we can design a way for
| content to be reencrypted with new keys if anything ever
| leaks.
|
| >You will 100% be able to roll your own, and trivially
| too. You'll just need to go to GitHub and grab the source
| code...
|
| GitHub is owned by Microsoft. Microsoft won't let you
| host code that is designed specifically to break the law.
| That is the point. You would need to jump through a
| variety of hoops in order to avoid this. It wouldn't be
| impossible, but it won't be the default and most people
| won't go through the effort to do it.
| avianlyric wrote:
| > Your first three paragraphs are all focusing on
| structural flaws with my suggestion and not technical
| limitations of encryption itself. The debate has already
| moved from this is impossible to this is impractical.
|
| I think the flaws are so large, and fixing them so
| impractical (especially anything involving direct human
| involvement), that the problem is essentially impossible.
|
| In theory traffic laws should prevent 100% of car
| accidents. Yet people die every day in traffic accidents.
|
| In theory the judicial system should never execute an
| innocent person. Yet the US sends people later proven
| innocent to the electric chair on a semi regular basis.
|
| In theory everyone should have a strong interest in
| keeping planet earth healthy enough to support human
| life, yet we're on a course to cataclysmic climate
| change.
|
| Alcohol addiction should have ended during prohibition,
| and weed should be impossible to buy in the US. Yet
| neither is true.
|
| What on earth makes you think we'll solve the structural
| issues with government key escrow for all encryption,
| when all the evidence suggests that we're unable to
| perfectly solve these issues as a species?
|
| > Github is owned by Microsoft. Microsoft won't let you
| host code that is designed specifically to break the law.
|
| Cool, we'll host it on GitLab then, or one of many non-US
| hosting providers. We can't even stop people pirating
| movies despite the media industry throwing billions at
| the problem. Do you honestly think we'll have better luck
| with crypto?
|
| You wanna see how this story ends? Read up on Prohibition-era
| America, or just the war on drugs. We've been down this
| road many times, with many different vices, and it ends the
| same way every time. Normal people get criminalised,
| illicit behaviour continues regardless. Organised crime
| profits from facilitating illicit behaviour. Please
| explain why you think crypto is going to be the
| exception.
| kortilla wrote:
| > Why couldn't we fragment and distribute one set of
| those keys among multiple legal entities?
|
| Because crypto is implemented in software and it's
| trivial to remove the part that encrypts for the
| government's key.
| slg wrote:
| You completely ignored the last paragraph of my comment
| where I addressed this type of concern.
| gjs278 wrote:
| stop posting. you're wrong and your idea sucks. there's
| nothing else to say.
| ChrisKnott wrote:
| It's not all or nothing, at all. The argument is about
| whether the material can be theoretically legally
| recovered like Gmail and Facebook Messenger (currently),
| or not, like WhatsApp. All these services use encryption
| and are secure.
| version_five wrote:
| Huh? The google example is about content being available in
| public, essentially the equivalent of someone posting dirty
| / embarrassing pictures in a public square and needing
| there to be a way to take them down. That sounds pretty
| reasonable.
|
| The encryption example is about being able to intercept
| private communication (never mind outlawing math) because
| someone could use that private communication for something
| illegal.
|
| The two examples have nothing to do with each other IMO.
| throwitaway1235 wrote:
| It's simple. The user is a thinking, conscious being; he or she
| views what he or she wants. No dilemma. Nothing shall be
| censored.
| anamexis wrote:
| So if someone posts revenge porn of you, you're fine just
| trusting the 8 billion people on Earth not to look at it, and
| leaving it at that?
| summerlight wrote:
| To add more data points for your point:
|
| https://en.wikipedia.org/wiki/Nth_room_case
|
| Everyone, please DO NOT underestimate people's malicious will
| to abuse vulnerable minority groups when un-moderated platforms
| are their primary tool. Due to Telegram's lack of basic
| moderation, those victims were severely abused, to an
| unrecoverable degree, for more than two years. This could've
| been at least mitigated if Telegram had simply closed the room
| based on user reporting, which it refused to do.
| shakezula wrote:
| Google is not backed by a blockchain.
|
| And as someone who has worked in the blockchain space for a
| long time, it's pretty much unanimously agreed in the space
| that the cost of inevitably protecting bad actors is worth the
| value in protecting good actors. Good and bad are subjective
| there, but for better or worse the blockchain isn't.
| spondyl wrote:
| I'd like to take this opportunity to share some relevant
| excerpts from In The Plex[^1] about Eric Schmidt, then-CEO of
| Google.
|
| > One day Denise Griffin got a call from Eric Schmidt's
| assistant. "There's this information about Eric in the indexes,"
| she told Griffin. "And we want it out." In Griffin's
| recollection, it dealt with donor information from a political
| campaign, exactly the type of public information that Google
| dedicated itself to making accessible. Griffin explained that it
| wasn't Google policy to take things like that out of the index
| just because people didn't want it there. "Principles always make
| sense until it's personal," she says.
|
| > Then in July 2005, a CNET reporter used Schmidt as an example
| of how much personal information Google search could expose.
| Though she used only information that anyone would see if they
| typed Schmidt's name into his company's search box, Schmidt was
| so furious that he blackballed the news organization for a year.
|
| > "My personal view is that private information that is really
| private, you should be able to delete from history," Schmidt once
| said. But that wasn't Google's policy...
|
| I guess they've since changed the policy a bit?
|
| [^1]: https://www.amazon.com/Plex-Google-Thinks-Works-
| Shapes/dp/14...
| [deleted]
| Blikkentrekker wrote:
| On a slightly unrelated but nevertheless interesting issue: a
| while back I wanted to find an "incel" board to see how discourse
| thereon actually was, and _Google_ returned no direct results to
| any of them but _DuckDuckGo_ immediately returned the results one
| might expect when searching "incel forum".
| legitster wrote:
| To Google's credit, it's literally the first search result for
| "how do I remove myself from Google". It is a bit ironic that
| people looking to remove something for being too easy to find via
| search are being stymied by a simple search.
|
| Still, good to provide some visibility about it. I certainly
| never knew this was a thing.
| oh_sigh wrote:
| First result...under 4 ads that probably make you pay to do the
| thing google does for free for you.
| minsc__and__boo wrote:
| 2 ads and a card linking directly to the Google support
| article explaining how to remove it here.
| jfrunyon wrote:
| I get 0 ads on that search. The very first result, above
| anything but their header, is
| https://support.google.com/websearch/troubleshooter/3111061 .
| legitster wrote:
| A weird example of Google creating business for third
| parties.
| danso wrote:
| Couldn't fit the whole tweet, so just to be clear, she's talking
| about manual, non-legal requests:
|
| > One of the surprising things about working on the slander
| series is how few people in the field, even experts, know that
| Google voluntarily removes some search results. (No court order
| needed!) You have to visit this generic url:
| https://support.google.com/websearch/troubleshooter/3111061?...
|
| Down the thread, she adds this:
|
| > _Because so few people know about it, "reputation managers" are
| charging people like $500 a pop to "remove damaging information
| from Google results." And ALL THEY DO is fill out that form for
| free. Someone tried to hawk this service to my husband after he
| came under attack._
|
| I didn't know about this service URL, and had just assumed
| "reputation managers", if they did anything at all, were limited
| to SEO spamming.
|
| edit: Searched HN for mention of this link and found exactly 5
| results, 3 of which look unique, and the earliest on Dec. 23,
| 2015:
|
| https://imgur.com/EEUJn9M
| tlogan wrote:
| There is a saying in Balkan countries (my translation): "who
| knows, knows; who does not know, then it is 500 deutsche marks
| (dinars, etc.)".
| tyingq wrote:
| They are fairly narrow categories though, and don't handle
| things like mugshot sites, even if it's something you were
| arrested for...but not convicted of. Still useful to know,
| for sure.
| joshuamorton wrote:
| Mugshot sites _might_ be covered by
| https://support.google.com/websearch/answer/9172218, but it
| depends.
|
| [disclosure: I'm a googler, but no clue about this service]
| tyingq wrote:
| I'm pretty sure the mugshot sites have adjusted their
| practices so that it's much harder to prove what's really
| going on. Straight up blackmail payment pages are gone.
| Rasta1s wrote:
| Aren't mugshots public records?
| tyingq wrote:
| Sort of. Some police departments publish them, some
| don't. Some publish them only day of, then remove them.
| Some only show them if you do the right kind of search
| (last/first/ maybe + birth). The predatory mugshot sites
| scrape these variations and publish them in a "forever"
| way, with lots of SEO tricks. Then basically shake you
| down for payment to remove them.
| jrockway wrote:
| It's the great "public" vs. "publicized" debate.
| mike_d wrote:
| Technically yes, but they are removed when a person is
| found not guilty or released without being charged thus
| protecting the innocent and falsely accused. The mugshot
| sites capitalize on this by scraping the government
| websites and archiving them forever - until you pay a
| removal fee.
|
| Mugshots.com was doing exactly this until they got hit
| with extortion charges:
| https://www.chicagotribune.com/business/ct-biz-mugshot-
| websi...
| Rasta1s wrote:
| " Technically yes, but they are removed when a person is
| found not guilty or released without being charged thus
| protecting the innocent and falsely accused. The mugshot
| sites capitalize on this by scraping the government
| websites and archiving them forever - until you pay a
| removal fee."
|
| I don't believe anyone has an obligation to remove it if
| someone is found not guilty. The arrest happened, that's a
| fact. The information was made public by the
| government, that's also a fact.
|
| From my understanding once something is made public you
| can't put it back in the bag. The government may do so on
| their own databases (Ie. Expungement, removal from
| government databases) but that doesn't apply to the
| public and especially not to publications who have first
| amendment right to publish public records.
|
| I looked up the case you mentioned, it's still pending
| and the arguments made by the government are questionable:
|
| https://www.techdirt.com/articles/20180523/10224639892/ca
| lif...
| judge2020 wrote:
| This might also include what is required to comply with GDPR's
| right to be forgotten, if that's still a thing.
| vitus wrote:
| That's a different form!
|
| https://www.google.com/webmasters/tools/legal-removal-
| reques...
| gist wrote:
| > are charging people like $500 a pop to "remove damaging
| information from Google results." And ALL THEY DO is fill out
| that form for free. Someone tried to hawk this service to my
| husband after he came under attack.
|
| 'ALL THEY DO' is what anyone offering a service does. You are
| paying for what they know and you don't know how to do. There
| is nothing wrong or deceptive about making money in this way
| (the use of 'all they do' seems to imply there is). Plenty of
| people are busy and willing to pay to have someone else
| handle the details. Google could easily publicize this but they
| choose not to.
|
| Here is the thing. Someone charging $500 (or any amount) has
| already filtered people that feel they have a real need from
| everyone attempting to do similar. Now you could argue the fee
| should be less but the friction caused by that higher fee (for
| those who can afford it) is worth it. And those who can't can
| put in the research and effort (like with anything) to find a
| way to achieve the same.
|
| People often expect everything to be free and nobody to make
| money off of knowledge. By the way, if reputation managers are
| charging $500, there's nothing to prevent someone from doing the
| same for less and changing the market price. Also, I'd imagine
| the rep managers possibly massage some things to get the job
| done, something many people wouldn't know or want to do.
| majormajor wrote:
| > Because so few people know about it, "reputation managers"
| are charging people like $500 a pop to "remove damaging
| information from Google results." And ALL THEY DO is fill out
| that form for free. Someone tried to hawk this service to my
| husband after he came under attack.
|
| While that sounds like a high price, and you usually don't want
| to just hire the first person you see advertise a service, the
| concept of specialization there, and some people just not
| wanting to devote the time to becoming experts everywhere,
| seems perfectly fine and normal.
|
| People often pay electricians even for tasks that are just
| "flip a switch and turn some screws," after all.
| [deleted]
| akiselev wrote:
| _> People often pay electricians even for tasks that are just
| "flip a switch and turn some screws," after all._
|
| The ultimate reason people pay electricians is liability.
| Unlicensed work is an easy way to invalidate your homeowner's
| insurance, lose everything, and get sued for any damage to
| your neighbors' property (by their insurance company, no
| less).
| version_five wrote:
| My view is that if the only reason you are doing something
| is because of the insurance implications, then you've
| surely made the wrong decision.
|
| Nobody's interests are less aligned with your own than an
| insurance company's. There are lots of legitimate reasons
| to hire an electrician, but making decisions solely based
| on the say-so of someone who wants you to perpetually give
| them money, while finding reasons to never give it back, is
| never good practice.
| sombremesa wrote:
| Electricity can kill you. Doesn't seem like a good
| comparison.
| amelius wrote:
| Ending up in a Google search result can too, I suppose, in
| some indirect way.
| theli0nheart wrote:
| > _People often pay electricians even for tasks that are just
| "flip a switch and turn some screws," after all._
|
| Not really.
|
| People pay electricians for their experience and knowledge
| that give them the tools to solve complex problems, simply,
| without messing up your wiring, causing a fire, or blowing
| something up.
|
| Removing a link on Google, on the other hand, involves
| clicking on a link and filling out a form. You don't need
| state licensing or problem solving abilities to fill out that
| form. Any "reputation managers" out there that are doing this
| are essentially defrauding people, and it's shameful.
| majormajor wrote:
| Let's say someone wants to replace a lightswitch.
| Google/Youtube will show you how to do that safely in a
| matter of minutes.
|
| People pay because they're afraid of at least one of (a)
| their ability to be sure they're getting the right
| instructions, or (b) their ability to execute. In the
| electrician example, (b) could be substantial if you don't
| understand electricity. "Electricity can kill you" is
| basically the home-maintenance version of "the internet is
| confusing" that would prevent someone from wanting to find
| this Google form themselves.
|
| In the "reputation manager" example, (a) is going to come
| into play more, I bet. A _good_ "reputation manager"
| probably isn't going to just fill out one form on google, I
| imagine there's a lot you could do to cover Bing,
| wikipedia, and various other sites and services. I don't
| know where employers go to get background checks, for
| instance, but maybe I need to have that covered, too. So
| now the problem still parallels working with an electrician
| - how do you know if you found a good one? Lots of
| handyman/contractor horror stories and scams out there too!
| [deleted]
| version_five wrote:
| I think you meant that although it's easy to trivialize what
| an electrician does, in practice there's actually a lot of
| skill and experience in it. (I could be wrong, although it's
| always funny to see how what was probably an off-hand comment
| causes people to go down such a rabbit hole).
| vmception wrote:
| Agreed, my favorite example of this is how people pay like
| $100 to get a tax-ID number for their new business, which is
| free from the IRS (and where accidental form choices are
| consequence free), because they think it's fraught with
| disaster or just too obtuse.
| mattzito wrote:
| I would say that the difference between your example and
| these reputation managers is that the latter are often
| themselves directly responsible for creating the problems
| they then charge to fix:
|
| https://www.nytimes.com/interactive/2021/04/24/technology/on.
| ..
|
| I guess this is more analogous to flipping someone's breakers
| in their house and then charging them to flip them back on.
| tialaramex wrote:
| I paid a gas plumber full rates to in effect remind me where
| the tap is for the fill line for my boiler.
|
| Over time a boiler that isn't meaningfully "leaking" will
| still gradually lose internal pressure. Once the pressure
| _inside_ the boiler is less than _outside_ the boiler even
| when the water inside is hotter, that's not good. Eventually
| the system won't work and shuts down for safety, but before
| that it'll make a _lot_ of noise while running. There's a
| pressure meter so you can see what's wrong if you don't
| understand boilers. My meter said about 0.4 bar. So that's
| too low, now I just needed to re-pressurize it.
|
| The regulations here say that, to avoid mistakes resulting in
| stagnant water flowing back from a boiler tank into the fresh
| water system, the two mustn't be permanently connected. So
| there'd logically be an input, you connect a temporary hose,
| re-pressurize.
|
| In practice, nobody does that, the installers will run a
| permanent line, in defiance of the rules, and use a tap, now
| you can re-pressurize by turning the tap.
|
| Except, many years after moving in and setting up I'd
| forgotten where that tap could be. It wasn't where I expected
| to find it, and I could not think where else they'd have put
| it. So after sleeping on it and still not remembering I
| called a plumber. Not an emergency plumber, but still
| plumbers aren't cheap.
|
| The plumber also couldn't immediately find it, it wasn't
| where he first looked either. But I can't actually tell how
| much time he spent "pretending" to look versus how long it
| really took him to discover it. Because it's embarrassing
| right? Even if the customer is up-front about the nature of
| the task, "It's under the kitchen sink - behind this panel,
| here" (yes, that's where it actually is) doesn't feel like a
| real job worth £80 or whatever it was.
| amalcon wrote:
| I'm pretty good with electrical diagrams and such, and I know
| exactly what one needs to do to stay about as safe as one
| can. I've fished (ethernet) wires before. I still hire
| electricians, because it's the only practical way to buy
| electrician's insurance for a project.
|
| I know a retired electrician. He hires electricians for
| anything nontrivial, for the insurance.
|
| That said your broader point has a lot of merit. I hire
| plumbers for anything more complicated than snaking a drain,
| and it's because of their specialty knowledge.
| gambiting wrote:
| What do you mean electrician's insurance?
|
| I do have a so called "DIY insurance" on my home policy -
| if I damage something while doing a DIY project(drill into
| a cable or a pipe for instance) my home insurance will
| cover the repair. Is that not enough?
| akiselev wrote:
| No, that's just for incidental damage and probably
| legally mandated by your state. If your house burns down
| due to DIY wiring, the insurance company won't pay out
| a dime (if the inspectors discover it).
|
| Electricians have liability insurance provided by an
| organization specializing in policies for practicing
| tradesmen. If an insurance company fails to disqualify a
| homeowner's policy and pays out, it will then go after
| the electrician and his insurance to recoup their losses.
| Residential electrical fires are extremely preventable so
| insurance companies almost universally refuse to insure
| homes with DIY electrical work because the homeowner _is_
| the one liable.
| einpoklum wrote:
| TBH the electricians I know tend to rely on their know-how
| to just do things right rather than on somebody else with
| the ability to insure in case it all burns down.
| InvertedRhodium wrote:
| Here in NZ, a lot of work on your own property is legal -
| even some relatively complicated stuff - but the catch is
| that you need to find a qualified inspector to sign off on
| it.
|
| The reality is that unless someone knows you and your work
| personally (friend, relative, w/e) it can be really
| difficult to find an inspector who is willing to sign off
| on some random person's work as there is a liability
| component (nowhere near as large as there is in the US, we
| have publicly funded "accident insurance" called ACC) in
| doing so.
|
| That being said, anything I can legally do myself I do and
| I am sure to maintain good relationships with the
| inspectors I know.
| f38zf5vdt wrote:
| ...what? I've known of this for at least a decade. Do people
| expect Google to be some free-for-all of unmoderated knowledge
| that can be used to defame and degrade others?
| danso wrote:
| The author of the tweet also wrote a major NYT story about the
| topic earlier this year:
|
| https://news.ycombinator.com/item?id=25972121
|
| https://www.nytimes.com/2021/01/30/technology/change-my-goog...
|
| The story focuses on a software engineer who discovered that he
| was the slander target of someone who had been fired by his
| father 30 years ago. He found her identity and took her to
| court in 2018. But the libelous Google results didn't change.
| The NYT story even ends with this:
|
| > _Yet even that hasn't solved the problem. See for yourself:
| Do a Google search for "Guy Babcock."_
|
| A day after the NYT story, those results disappeared. Here's
| what they looked like:
| https://news.ycombinator.com/item?id=25973045
|
| If a software engineer with the resources to find and take an
| anonymous libeler to court didn't know that Google could
| manually intervene and remove results that listed him as a
| pedophile, I'm assuming the vast majority of people are
| equally unaware.
| wly_cdgr wrote:
| This feature is certain to be abused & should really not exist
| despite the good it may do in specific cases
| kaliali wrote:
| I would've thought more people would realize this from last year's
| US election. That's too bad.
| joecool1029 wrote:
| Cool, the section "Remove content about me on sites with
| exploitative removal practices from Google" seems custom-crafted
| to handle sites like RipoffReport.
| https://support.google.com/websearch/answer/9172218
| seumars wrote:
| What's also surprising is how bad Google is at processing these
| requests. It's almost like a PR stunt. I've had to use their EU
| Privacy Removal form in the past, and a single request can take
| anywhere from a few days to several weeks to get a response, or
| get no response at all.
| Half the time it seems like you're emailing a bot as I've
| received the same canned reply to simple inquiries. In the end I
| just gave up.
| toby- wrote:
| If people don't know about this, they haven't been paying
| attention. I find the idea that 'even experts' (haha) don't know
| about this rather hilarious.
| H8crilA wrote:
| Toby, I haven't laughed so hard in a month.
| throwaaskjdfh wrote:
| What does the Internet Archive/Wayback Machine do in similar
| circumstances?
| paulpauper wrote:
| i am sure they will also remove stuff, but most ppl such as
| employers just do a google search
| deertick1 wrote:
| Yeah also if you compare results from google vs duckduckgo for
| controversial search terms like "I don't care about gender
| identity"
|
| Google will return only content that tells you why that opinion
| is wrong e.g. "why you should care about gender identity"
|
| Whereas duckduckgo will return stuff that actually matches your
| search.
|
| Google always errs on the side of the left-wing political
| platform.
|
| It's actually really egregious once you start testing it out.
| To the point that Google completely buries really valuable
| information such as primary sources for controversial events,
| scientific studies, etc., and instead promotes shitty WaPo
| articles and the like that tell you how to think about the
| thing you are actually searching for.
| cfgghsj wrote:
| They call it prioritizing "authoritative sources" over organic
| results.
| A4ET8a8uTh0 wrote:
| Damn. That is a controversial statement? I am a rebel again.
| gowld wrote:
| Google is personalized to some extent.
|
| I don't see what you are describing at all.
| thrwawy11jun21 wrote:
| Searched Google and DDG for "I don't care about gender
| identity"
|
| google top: reddit post about "What are you if you just don't
| give a fuck about gender and you have no kind of dysphoria?"
|
| DDG top: blog post titled "Don't care about gender Self-ID?
| Here's why you should"
| Blikkentrekker wrote:
| I also echo this; I'm not sure what exactly the poster you
| replied to suggests.
|
| Both normal and incognito on _Google_ return similar results
| to what was returned to you.
| pjc50 wrote:
| The second one is part of the anti-trans campaign.
| crazygringo wrote:
| What are you even talking about?
|
| I just Googled "I don't care about gender identity" and the top
| results are all agreeing with that statement.
|
| The search is working perfectly. Google isn't inserting any
| kind of left-wing bias whatsoever. It's finding the same types
| of pages as DuckDuckGo does as well.
|
| So why are you spreading this misinformation?
| deertick1 wrote:
| I hadn't tried that particular search. Here's one I have tried
| and just double-checked: "I don't care about transgender
| rights"
|
| The example further down about the moon landing is a much
| better example though. It's a lot worse right when some
| political event happens. Like if Trump said something really
| controversial and you searched for more context around it, on
| Google the top results would all be articles about how shitty
| the thing he said was, whereas on DDG you'd have a greater
| probability of actually getting some context.
|
| Like when the pandemic was in full swing, trying to get
| information that went against the mainstream narrative was
| impossible on Google but was doable on DDG.
|
| This might not be an active decision by Google, but their
| results do skew really heavily left. It might just be a
| function of popularity rankings. But I'd imagine DDG uses a
| similar ranking system, so not sure what makes the difference.
|
| There are times, though, when I've searched for something I
| know exists, like word for word, and it still won't come up on
| Google but will on DDG, so go figure.
| Cd00d wrote:
| > I hadn't tried that particular search
|
| So, the one example you provided you hadn't taken the 10
| seconds to validate?
|
| You're not providing value here. Please make an effort to
| make meaningful and additive comments instead of spewing
| misinformation or made up anecdotes.
| mythrwy wrote:
| Yandex works even better for stuff like that.
| brunoTbear wrote:
| I'd ask if that error might be better understood as downranking
| of two things that seem very reasonable to downrank in a search
| engine (something designed to return accessible and useful
| information):
|
| - bigots motivated by animosity towards minorities
|
| - open contempt for the truth
|
| Hard to feel there's a compelling interest in supporting
| bigotry when knowledge is best advanced by open inquiry and the
| net result of bigotry is a suppression of voices which will
| lead to less knowledge.
|
| As to conspiracies and lies? Very little truth value there,
| unclear why Google would want to uprank that kind of content.
|
| Were there real debates about truth with actors in good faith
| on both sides, I might be more open to your framing of the
| problem as left wing vs right wing.
| deertick1 wrote:
| Yeah so that's the thing. You are just immediately assuming
| that you are right and everyone who disagrees is wrong and a
| horrible person. Personally, I see people obsessed with
| gender identity as being wrong, anti-truth, anti-science,
| disingenuous, etc.
|
| So that's my point: I don't want Google to give me the truth,
| I want it to give me the internet, warts and all. I don't
| want Google to gatekeep the information I have access to. Lol,
| I remember everyone being in a tizzy when net neutrality got
| the axe because ISPs would start gatekeeping. But if a
| wholesome company like Google does it, it's in pursuit of
| truth.
| pjc50 wrote:
| > Personally, I see people obsessed with gender identity as
| being wrong, anti-truth, anti science, disingenuous, etc.
|
| Well, yes, that's the basis of the anti-trans campaign:
| obsession. It ends up taking over people and causing them
| to become monomaniac posters on the subject, sometimes to
| the detriment of their careers. It's worse than
| scientology.
|
| People not caring about gender identity would in many cases
| be a huge improvement.
| danso wrote:
| Why did you start using Google in the first place? Their
| early papers on PageRank/BackRub describe how their
| algorithm filters for quality, and their most prominent
| example is explicitly political:
|
| http://infolab.stanford.edu/~backrub/google.html
|
| > _As an example which illustrates the use of PageRank,
| anchor text, and proximity, Figure 4 shows Google's
| results for a search on "bill clinton". These results
| demonstrates some of Google's features. The results are
| clustered by server. This helps considerably when sifting
| through result sets. A number of results are from the
| whitehouse.gov domain which is what one may reasonably
| expect from such a search. Currently, most major commercial
| search engines do not return any results from
| whitehouse.gov, much less the right ones._
|
| > ... _The biggest problem facing users of web search
| engines today is the quality of the results they get back.
| While the results are often amusing and expand users'
| horizons, they are often frustrating and consume precious
| time. For example, the top result for a search for "Bill
| Clinton" on one of the most popular commercial search
| engines was the Bill Clinton Joke of the Day: April 14,
| 1997. Google is designed to provide higher quality search
| so as the Web continues to grow rapidly, information can be
| found easily._
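|
| (For context, the "quality" mechanism the paper describes is
| PageRank's link analysis. A toy power-iteration sketch over a
| three-page link graph, nothing like the production system, just
| to make the idea concrete:
|
|     # each page lists the pages it links to
|     links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
|     d = 0.85                                  # damping factor
|     rank = {p: 1 / len(links) for p in links}
|     for _ in range(50):
|         new = {}
|         for p in links:
|             incoming = sum(rank[q] / len(links[q])
|                            for q in links if p in links[q])
|             new[p] = (1 - d) / len(links) + d * incoming
|         rank = new
|     print(rank)  # well-linked pages end up ranked higher
|
| Pages pointed to by other highly ranked pages score higher,
| which is the "quality" signal the quote contrasts with the Bill
| Clinton Joke of the Day result.)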
| IncRnd wrote:
| That isn't exactly how Google sorts or filters results
| today.
|
| "Yes, we do use PageRank internally, among many, many
| other signals. It's not quite the same as the original
| paper, there are lots of quirks (eg, disavowed links,
| ignored links, etc.), and, again, we use a lot of other
| signals that can be much stronger." [1]
|
| [1] https://twitter.com/JohnMu/status/1232014208180592641
| John Mu, Search Advocate at Google.
| deertick1 wrote:
| Huh, interesting. But let's be real, asking "why did you
| start using Google anyway" is a preposterous question.
|
| You used to be able to find all kinds of weird ass fucked
| up shit on Google, but now it's all internet-based news
| media. I miss the raw shit.
| danso wrote:
| To be fair, I asked the question because you made a
| sweeping assertion about others being "anti-truth/anti-
| science", which implies you'd be more cognizant and
| knowledgeable about the algorithms and mindset behind
| Google back "in the good ol days".
| TazeTSchnitzel wrote:
| You're making a huge leap by assuming this is because such a
| subject isn't politically correct or whatever. There's many
| more likely explanations:
|
| * You have phrased the search in a way that people expressing
| that view wouldn't be likely to use, yet is similar to how
| people holding contrary views would express theirs. I think
| that's the case here. If you search for "gender does not exist"
| or so, you may get the results you want.
|
| * Google's results are customised based on what they think you
| would be interested in.
|
| * Content from one side of a contentious topic may be less
| popular online and/or not linked to as many authoritative
| sites, and thus have a lower PageRank.
| jfoster wrote:
| > Google always errs on the side of left wing political
| platform
|
| Reminds me that yesterday I tried googling around the topic of
| how covid affects fertility (eg. "covid fertility") and nearly
| every result that comes back is about vaccines not affecting
| fertility. Okay, thanks Google, but how about the actual
| disease? Results were a fair bit more relevant by adding
| "disease" to the query, but still got one or two about
| vaccines.
|
| I don't think you can call that behaviour favouring left. It
| feels as though they're creating vaccine hesitancy.
| deertick1 wrote:
| Perfect example. Had the exact same experience. As someone
| who leans more right wing, I call it left-skewed, but yes you
| are right, it is not necessarily purely left-aligned, or even
| at all. It feels like any controversial opinion just
| gets obliterated by their ranking algorithm (in the case of
| covid I would bet a million dollars there was manual
| intervention in the name of public health tho).
| nradov wrote:
| That's the type of thing that you'll have to search on the
| specialized Google Scholar site to find useful results.
|
| https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=COVI.
| ..
| IncRnd wrote:
| That's exactly the poster's point.
| TazeTSchnitzel wrote:
| COVID-19 is a special case. I think Google has intervened
| there to ensure that official sources are favoured due to
| concerns about the impact of misinformation in a pandemic
| specifically. This is not a common Google practice so far as
| I know.
| Blikkentrekker wrote:
| Perhaps much of it is not ideological on Google's part but
| actually simply the machine learning about what people want
| to see.
|
| It's entirely possible that so many more people access links
| about vaccines and such politics than about what you searched
| for that the machine, not the man, has elected to favor those,
| without having any real political motive.
| meisel wrote:
| Can you provide more examples of this?
| quacked wrote:
| Look up "proof the moon landing was fake". I have an
| aerospace degree and don't think it was fake, but I still
| find it strange that Google won't return any of the
| conspiracy pages I enjoyed reading about a decade ago.
| crazygringo wrote:
| Probably not strange since they probably have very low
| PageRank, visits, etc.
|
| On the other hand, the top Google results go to The
| Guardian, Wikipedia, Time, the BBC, etc., discussing the
| topic -- hugely popular sites.
|
| Google's meant to find _popular_ relevant pages for your
| search terms. Remember, that's what PageRank was all
| about. So seems to be working as expected.
| Blikkentrekker wrote:
| This does run into the same problem that _Wikipedia_ runs
| into, that it insists on using "credible" or "reputable"
| sources but does not really bother to define that, and it
| essentially comes down to sources not being "credible"
| because they disagree with their beliefs.
|
| Personally, I have yet to see any "credible" news source
| and the adage remains that every news article about
| anything I have even the most minor inside knowledge of
| seems completely inaccurate, especially the politically
| laden ones.
|
| No matter what mechanism _Google_ uses to assign such
| ranking: be it their own judgement or simply an agnostic
| a.i. that lets the masses decide, I cannot see anything
| good coming from it and there will always be a bias not
| based in factuality, but politics.
| crazygringo wrote:
| ...you don't see anything good coming from Google search
| results?
|
| ...that it's incentivized to return the results that the
| most people are looking for? As measured by clicks?
|
| If you don't believe the news and you don't find
| popularity useful either, then I honestly can't imagine
| what you're even looking for.
___________________________________________________________________
(page generated 2021-06-11 23:00 UTC)