[HN Gopher] Google Drive bans distribution of "misleading content"
___________________________________________________________________
Google Drive bans distribution of "misleading content"
Author : temp8964
Score : 711 points
Date : 2021-07-16 16:34 UTC (6 hours ago)
(HTM) web link (support.google.com)
(TXT) w3m dump (support.google.com)
| irthomasthomas wrote:
| I just cancelled my drive subscription.
|
| I don't know how anyone could continue using it after this.
|
| I probably have dozens of docs and hundreds of research papers
| contradicting government health advice on diabetes and heart
| disease. These would fall under "Misleading content related to
| harmful health practices" since they promote a health theory
| which the government considers harmful.
|
| However I would have cancelled regardless since the idea of
| automatic bans and/or content deletion based on ML models is
| crazy. They are obviously going to find a lot of false positives
| and I can't deal with the idea of trying to speak to google to
| explain that their algorithm mistakenly flagged my innocent
| content. In other words even if you are the perfect citizen,
| there is a chance you will get flagged anyway.
| tyingq wrote:
| Yes, this is really bizarre. I get this kind of policy for some
| kinds of platforms, but not Drive, Docs, Sheets, Slides, and
| Forms. Those are my documents, and I should be free to put
| whatever I want in them.
|
| What if I just like to collect and share old conspiracy theory
| stuff that I know is wrong? For whimsy, historical, whatever
| purposes...
| maxk42 wrote:
| Some commenter replied it will change nothing. I disagree (but
| didn't downvote) - it will change the number of people in the
| market for a competitor. There are competitors out there and
| the people who are cancelling their Drive subscriptions here
| are going to support them financially, building a viable rival
| to Drive with their dollars. More competition is one of the
| best possible outcomes and I fully support it. Please cancel
| your Google subscriptions, folks!
| kderbyma wrote:
| going to follow suit
| systemvoltage wrote:
| Please switch to Firefox as well. I beg everyone.
| ukie wrote:
| It will change nothing. Google has too much money. Getting away
| from its services is the right thing to do though.
| pwned1 wrote:
| Big Brother is unhappy.
| xiphias2 wrote:
| I recently started working on a program that distributes my own
| relational data in my web browser (IndexedDB) between my devices
| using WebRTC so that I can build applications on top of it. It
| seems like it will be needed more and more over time.
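|
| A minimal TypeScript sketch of the idea (not the commenter's
| actual code): read rows out of an IndexedDB object store and push
| them to a peer over an already-open WebRTC data channel. The
| database/store names and the out-of-band signaling step are
| assumptions.
|
|   // Open (or create) a small IndexedDB database with one store.
|   function openDb(): Promise<IDBDatabase> {
|     return new Promise((resolve, reject) => {
|       const req = indexedDB.open("sync-demo", 1);
|       req.onupgradeneeded = () =>
|         req.result.createObjectStore("records", { keyPath: "id" });
|       req.onsuccess = () => resolve(req.result);
|       req.onerror = () => reject(req.error);
|     });
|   }
|
|   // Read every row from the "records" store.
|   function readAll(db: IDBDatabase): Promise<unknown[]> {
|     return new Promise((resolve, reject) => {
|       const req = db.transaction("records")
|         .objectStore("records").getAll();
|       req.onsuccess = () => resolve(req.result);
|       req.onerror = () => reject(req.error);
|     });
|   }
|
|   // channel is an RTCDataChannel assumed to be open already;
|   // the WebRTC offer/answer signaling is omitted here.
|   async function pushToPeer(channel: RTCDataChannel) {
|     const db = await openDb();
|     channel.send(JSON.stringify(await readAll(db)));
|   }
|
|   // On the receiving device, write incoming rows back into
|   // the local store.
|   async function acceptFromPeer(channel: RTCDataChannel) {
|     const db = await openDb();
|     channel.onmessage = (ev) => {
|       const tx = db.transaction("records", "readwrite");
|       for (const row of JSON.parse(ev.data)) {
|         tx.objectStore("records").put(row);
|       }
|     };
|   }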
| intended wrote:
| Good on Google!
|
| The old assumptions of the internet and how it works were wrong.
| The declaration of freedom was naive.
|
| The simple model of how "free speech" leads to wiser outcomes
| turned out to be wrong.
|
| What the heck - this combination may even be a Great Filter.
|
| 1) The rapid deployment of near-species scale information
| networks.
|
| 2) Information losing neutrality because any species (aside from
| maybe a hive mind) adds spin and polarity to data
|
| 3) Unprepared economic markets that end up rewarding polarizing
| ("engaging") content, attriting societal level attention/mental
| resource in a way never experienced before.
|
| I believe the recent British Govt paper argues that social media
| firms need to bear a duty of care to users. This is a better
| place to start.
|
| Removing misinformation which has an obvious known fingerprint is
| a good start.
| slumdev wrote:
| I'll be cancelling and migrating away from Google as soon as I
| have a free weekend to identify alternatives and get it done.
|
| Now to find a vanity email domain... My last name is taken, so
| brainstorming something that is professional and individualized
| could take longer than the migration...
| MR4D wrote:
| As I read this, there is not one link to an alternative to Google
| Drive in the comments.
|
| Any suggestions? Distributed solutions (or self managed)
| preferred.
| TemporaryUser9 wrote:
| Try Zoho https://www.zoho.com/docs/
| choward wrote:
| Nextcloud seems to me to be the most popular self hosted
| solution.
| constantinum wrote:
| As I've mentioned above, Skiff, still in invite-only beta, looks
| like a promising alternative [https://www.skiff.org/].
| [disclaimer] I don't work for them. I'm waiting for the beta
| early access though.
| sp2021 wrote:
| So is it "misleading" to address a male suffering from a mental
| disorder who calls himself a "she" as a "he" in my publicly
| shared google doc?
|
| We have abandoned all axioms.
| grawprog wrote:
| Nifty, censorship built right into office tools now too. It's so
| great my word processor and my spreadsheet app can police my
| thoughts now too.
|
| Maybe they'll be able to use AI at some point and just detect
| when I'm writing the wrong combination of words together and
| preemptively block me from even writing things.
| [deleted]
| [deleted]
| throwitaway1235 wrote:
| I also wonder how much control Google has over data stored on a
| Chromebook?
|
| Do they assert ownership over the data on your physical storage
| device?
| jinpa_zangpo wrote:
| The problem isn't that one company has made its own decision on
| what to ban. The problem is that the White House is literally
| compiling a list and telling all the social media companies who
| to ban. This is censorship by proxy.
| SV_BubbleTime wrote:
| That was the most shocking thing that came out recently imo.
| And it's not been reported at all anywhere.
|
| Trump's initially ridiculous lawsuit against big tech, claiming
| they are operating as government agents - the White House just
| basically agreed with it.
|
| Ignore the Twitter, but the CSPAN video is good.
| https://twitter.com/bennyjohnson/status/1416095333877260292
|
| _"We don't censor, we just tell a social media monopoly what
| to censor, it's totally not violating the first amendment!"_
| ... wth
| lettergram wrote:
| Interesting... so I wrote an article on gun violence
|
| https://austingwalters.com/firearms-by-the-numbers/
|
| Guess it contradicts the official position of the current
| administration (not reality). So will my drive content be
| removed?
|
| Similarly, I have been monitoring how the CDC has changed the
| covid19 death rates over time.
|
| https://austingwalters.com/changes-in-the-cdc-counts-of-deat...
|
| It appears the CDC had been inaccurately portraying the death
| rate(s) (I assume unintentionally). In particular, it appears
| there's a significant number of unexplained deaths. That could be
| considered "misleading" because I regularly collaborate with
| others and we update the data.
|
| "misleading" does not mean not inaccurate. Often the context
| matters and how is Google going to take this into account? We
| have multiple theories and discuss them, find more information
| and put it together.
| tablespoon wrote:
| > Guess it contradicts the official position of the current
| administration (not reality). So will my drive content be
| removed?
|
| Below is the actual policy. What term do you think your blog
| post violates?
|
| > Do not distribute content that deceives, misleads, or
| confuses users. This includes:
|
| > Misleading content related to civic and democratic processes:
| Content that is demonstrably false and could significantly
| undermine participation or trust in civic or democratic
| processes. This includes information about public voting
| procedures, political candidate eligibility based on age /
| birthplace, election results, or census participation that
| contradicts official government records. It also includes
| incorrect claims that a political figure or government official
| has died, been involved in an accident, or is suffering from a
| sudden serious illness.
|
| > Misleading content related to harmful health practices:
| Misleading health or medical content that promotes or
| encourages others to engage in practices that may lead to
| serious physical or emotional harm to individuals, or serious
| public health harm.
|
| > Manipulated media: Media that has been technically
| manipulated or doctored in a way that misleads users and may
| pose a serious risk of egregious harm.
|
| > Misleading content may be allowed in an educational,
| documentary, scientific, or artistic context, but please be
| mindful to provide enough information to help people understand
| this context. In some cases, no amount of context will allow
| this content to remain on our platforms.
| lettergram wrote:
| According to the current administration gun violence is a
| public health crisis:
|
| https://efsgv.org/learn/learn-more-about-gun-
| violence/public...
|
| I present arguments that could be considered "misleading"
| based on the administration's official position. Personally,
| I'd like to actually fix the issues; to do so, we need to
| discuss them. With that in mind, I wrote something we can use
| as a framework for that discussion.
|
| ----
|
| I simply did a deep dive into the data and found interesting
| results that differ (this is just a random selection):
|
| (1) There doesn't appear to be a correlation between firearm
| access and homicides (if anything it's slightly the reverse)
| https://austingwalters.com/firearms-by-the-
| numbers/#Firearms...
|
| (2) The white, Hispanic, and Asian populations have some of the
| lowest firearm homicide rates in the world. In contrast, the
| black population has one of the highest firearm homicide rates,
| which pushes the U.S. average up (which arguably could support
| the systemic racism theory, but is a fact)
| https://austingwalters.com/firearms-by-the-
| numbers/#Comparin...
|
| (3) The CDC & FBI crime statistics show that <0.5% of the
| population is murdered by firearms in a given year (~1-1.5%
| if you include suicides).
|
| (4) Self-defense homicides are included in the data
|
| (5) Gangs don't appear to be the reason for a high firearm
| homicide rate https://austingwalters.com/firearms-by-the-
| numbers/#Gang_Dem...
|
| (6) Homicides per firearm are very low
| https://austingwalters.com/firearms-by-the-
| numbers/#Homicide...
|
| (7) You're more likely to be beaten to death or stabbed than
| shot (arguably guns would save you from this)
| https://austingwalters.com/firearms-by-the-
| numbers/#_Circums...
| hxjemzbskwkxb wrote:
| Sorry but your content is misleading. Take for example the
| following quote from Amnesty International, which you cite in
| your article:
|
| > governments [with] poor regulation of the possession and
| use of guns lead to violence and that they must tackle this
| now through strict controls on guns and effective
| interventions in communities suffering high levels of gun
| violence.
|
| From this you say the following:
|
| > The key statement is:
|
| Guns lead to violence
|
| The statement above implies a couple of things:
|
| 1. Gun volume and violence are correlated
|
| 2. As the number of guns increase, violence increases
|
| ----------
|
| This is a blatant distortion of what that quote from
| Amnesty is saying.
|
| They are clearly saying that _poor regulation of the
| possession and use of guns_ leads to violence.
|
| You are quite obviously engaging in bad faith arguments.
|
| edit: formatting
| remarkEon wrote:
| So his blog post should be banned then, right?
| hxjemzbskwkxb wrote:
| No, it should be ignored.
| Jiro wrote:
| I can see how information that contradicts the CDC falls
| under "Misleading content related to harmful health
| practices: Misleading health or medical content that promotes
| or encourages others to engage in practices that may lead to
| serious physical or emotional harm to individuals, or serious
| public health harm."
|
| Of course, it wouldn't actually fall under anything since
| it's not misleading, but such things get interpreted by
| social media censorship boards as misleading.
|
| Also, the gun violence one may fall under "serious public
| health harm". There have been plenty of attempts to control
| guns using public health claims. https://www.apha.org/topics-
| and-issues/gun-violence
| [deleted]
| tablespoon wrote:
| Both interpretations seem like mighty big stretches to find
| this particular content noncompliant with that policy.
| IMHO, if the policy gets stretched like that, then the
| issue isn't with the policy itself.
|
| I mean, you could make similar stretches to hypothetically
| ban discussion of tax increases, because of the serious
| emotional harm that would cause to wealthy people fearing
| the loss of their money.
|
| Honestly, the only issue with the actual text that I see is
| the reference to "emotional harm," given how subjective
| that is and how certain ideological propositions can be
| medicalized via that route. The rest of it is very
| reasonable, especially the paragraphs about civic processes
| and manipulated media.
| CheezeIt wrote:
| > It also includes incorrect claims that a political figure
| or government official has died, been involved in an
| accident, or is suffering from a sudden serious illness.
|
| In other words, it bans _correct_ claims that Hillary Clinton
| had health problems, and that Joe Biden has dementia.
| mancerayder wrote:
| I read regular stories on HN about employee activism on certain
| types of popular issues around gender and race, involving
| petitions and publicity. Spotify, Apple, Google. Yet nothing
| about the march towards totalitarianism and information control -
| always with good intentions and to our benefit - promulgated with
| stories such as this.
|
| Not a peep from our young Silicon Valley activists. Do we still
| teach history at school? I'm talking about history of many
| countries. No fear whatsoever of information control and
| government censorship. I'm baffled and saddened.
| sixothree wrote:
| Just because you didn't hear anything doesn't mean they aren't
| saying it. That's just a weak-sauce argument and insults on your
| part.
|
| Lose the agenda.
| creddit wrote:
| Join one of these companies and what you will find is that:
|
| > Not a peep from _our young Silicon Valley activists_.
|
| are the ones pushing FOR these changes.
| s3r3nity wrote:
| > are the ones pushing FOR these changes.
|
| Because the goal isn't to deconstruct those systems of power
| (as it probably _should_), but rather to put someone else in
| the center.
|
| I once heard a Silicon Valley VC say on a podcast: "If you
| hear someone utter the term 'equity,' then run for the hills.
| Because it's really a power grab." (Not obviously talking
| about stock compensation, of course...)
| zpeti wrote:
| What's sad to me is that someone who really should be a hero to
| everyone in Silicon Valley, the journalist who decided to
| publish Edward Snowden's stuff, at insane risk to himself, has
| been warning about censorship forever. (Glenn Greenwald)
|
| Yet now he's apparently a right wing trump apologist to most
| people on the left.
|
| That doesn't change the truth of what a person who's put a lot
| on the table is saying.
|
| Censorship always becomes about power. Once you create the
| tools, the powerful will take them over. You may think that's a
| good thing when your side is in power, but that will NEVER be
| forever.
| [deleted]
| throwaways885 wrote:
| Snowden (and countless others) are heroes. This new breed of
| authoritarian leftists does not represent Silicon Valley and
| tech at large, but rather silences those of us who do still
| believe in free speech.
|
| I'm petrified about speaking out because I don't want to be
| labeled as a far-right trump apologist. No, believing
| gigantic megacorporations shouldn't censor information is not
| "right-wing".
| JasonFruit wrote:
| Please, please do speak out! Otherwise, the only people
| speaking are the authoritarian leftists and their worst
| possible opponents, the Trump apologists. People speaking
| in favor of speech _they do not agree with_ are the most
| compelling free-speech advocates of all.
| AgentME wrote:
| Is totalitarianism when you don't let your servers be open
| relays for antivaxxer propaganda during a pandemic?
| zpeti wrote:
| Soooo... The WHO initially said the virus isn't airborne.
| They initially said masks don't work.
|
| What's your plan of action there? Google censors what the
| status quo asks them to and then flips and censors the other
| side the second they change their mind? Does that sound like
| a good world to you?
|
| How exactly does long-term discussion happen in that case?
| Since basically by about April 2020 you've banned all pro
| mask discussion and anti mask discussion...
| umvi wrote:
| Well we could create a new government organization that
| determines what the current best known truth is
| ("Department of Truth" say) and Google censors just remove
| anything that goes against the Department of Truth
| _-david-_ wrote:
| Can you clarify if you are being sarcastic or not?
| zpeti wrote:
| Damn, what a great idea! Why hasn't anyone thought of
| this yet?
| AgentME wrote:
| Does Google need a government agency to determine what
| emails are or aren't spam? Why would it need that in this
| case instead, and do you really think that would be an
| improvement? This is a bad strawman.
| AgentME wrote:
| If they're bad at identifying misinformation, then I'll
| fault them for that. Has Google had a tendency to censor
| things on the basis of information that later turned out to
| be true? Your examples are the judgments of groups that
| aren't Google, and I don't think the groups responsible for
| those judgments ever pushed for others to get censored over them.
| throwaways885 wrote:
| But that is precisely the lesson history taught us.
| Misinformation, fundamentally, cannot be identified. It's
| easy to say "oh, everyone who believed that was an idiot"
| when talking about Galileo being thrown in prison for
| heliocentrism. If he was alive today, we'd be calling him
| a "far-right conspiracy theorist" or something equally as
| nasty.
| AbrahamParangi wrote:
| In the early days of the coronavirus, you couldn't use
| Google to find information at all because everyone other
| than the WHO was getting censored. I was using Bing for a
| while because their censorship was much slower.
| [deleted]
| slg wrote:
| >Do we still teach history at school? I'm talking about history
| of many countries. No fear whatsoever of information control
| and government censorship. I'm baffled and saddened.
|
| I agree. We don't spend enough time teaching about the history
| of free speech in other countries like Germany. And no, I don't
| only mean in 1930s and 1940s Germany. I also mean the Germany
| of today. They, along with much of Europe, have laws outlawing
| some misinformation such as Holocaust denial and their
| societies haven't collapsed into totalitarianism.
|
| Some restrictions on speech are truly dangerous. Some aren't.
| It is important to have the historical context to help know
| which one we are discussing.
| imwillofficial wrote:
| Playing devil's advocate here, how is Holocaust denial
| dangerous?
| slg wrote:
| At the most basic level, denying or downplaying a previous
| genocide increases the odds of a future genocide.
| skissane wrote:
| But the Turkish government and its supporters are allowed
| to deny and downplay the Armenian genocide to their
| hearts' content. The European Court of Human Rights even
| decided (Perincek v. Switzerland) that people have a free
| speech right to deny the Armenian genocide, yet it has
| also held that people don't have a free speech right to
| deny the Holocaust (Pastors v. Germany).
|
| Both genocides happened, both genocides were awful, but
| it seems like different rules apply to denying different
| genocides, and that those rules are based on political
| calculations rather than defensible principle.
| slg wrote:
| Yes, the laws in Turkey, Switzerland, and Germany are not
| going to be identical. And as I said elsewhere in this
| thread:
|
| "My original comment was about laws against
| misinformation in Germany. That doesn't mean I endorse or
| need to defend all free speech laws in all of Europe."
| imwillofficial wrote:
| Do you have data to support that, by any chance? I've
| heard old axioms, but nothing more than that.
| slg wrote:
| I don't want to be a jerk, but if you want to play
| devil's advocate, you can do the research yourself. I am
| comfortable believing the old "those who don't know
| history are doomed to repeat it" adage without having a
| peer reviewed study on it.
| zpeti wrote:
| It is also slowly becoming illegal to criticise Islam.
|
| Taking Islam out of the equation, do you think it's a good
| idea for any religion to actually be off the table in terms
| of discussion?
| bigthymer wrote:
| > It is also slowly becoming illegal to criticise Islam
|
| Really? Literally "Illegal" as in laws against it, or
| figuratively as in no longer socially acceptable?
| whoooooo123 wrote:
| Go burn a Koran and see what happens.
| zpeti wrote:
| Yes, literally. Hate speech laws.
| cm2187 wrote:
| https://eclj.org/free-speech/echr/blasphemy-crime-the-
| echr-i...
| 8note wrote:
| It looks like there's a lot more nuance in there than
| criticising Islam. Trying to convince people that all
| Muslims are pedophiles is a bit different than calling
| Mohammad a pedophile
| cm2187 wrote:
| That's not what the ECHR ruled on.
| slg wrote:
| You are equating criticizing a religion with spreading hate
| speech. Islam or any religion is not "off the table in
| terms of discussion". Many European countries, including
| Germany, simply don't want people crossing the lines into
| speech that can incite people to harm others.
| nomel wrote:
| > simply don't want people crossing the lines into speech
| that can incite people to harm others.
|
| I think that's very, very far from simple, since it
| necessarily requires the government to slowly align, and
| converge, with those people, so that they are never
| offended and never cause harm.
| slg wrote:
| > since it necessarily requires the government to slowly
| align, and converge, with those people, so that they are
| never offended and never cause harm.
|
| "Those people" are not the ones who dictate whether
| something is hate speech. Whether someone is offended is
| not a factor in hate speech laws. When I said "incite
| people to harm others" I am talking about physical harm
| or violence. It is perfectly legal to offend people in
| Germany.
| corty wrote:
| > It is perfectly legal to offend people in Germany.
|
| No, wrong, it isn't.
|
| https://dejure.org/gesetze/StGB/183a.html
| https://dejure.org/gesetze/StGB/166.html
|
| Sexual and religious offense. Both predicated on someone
| being (even just possibly) offended.
|
| https://dejure.org/gesetze/StGB/185.html
| https://dejure.org/gesetze/StGB/188.html
|
| General offense, usually only prosecuted if the offended
| wants it (but the prosecutor has discretion to proceed
| without). Harsher punishments if the offended is a
| politician.
|
| https://dejure.org/gesetze/StGB/104.html Even just
| offending other states by burning flags is punishable.
|
| So you couldn't be more wrong. (I personally think those
| laws are BS and should be done away with.)
| slg wrote:
| I don't know what your point is here. You are citing
| particular laws in which a party is likely to be offended
| like public sex, defamation, and desecration of a flag.
| But the crime isn't that offense was caused. That offense
| is the byproduct of the actual crime.
|
| Plus many of those are illegal in other countries too.
| You can't have public sex or defame people in the US
| either, but no one would say it is illegal to offend
| people.
| zpeti wrote:
| Already linked above but here you go:
| https://eclj.org/free-speech/echr/blasphemy-crime-the-
| echr-i...
| slg wrote:
| FYI Austria and Germany haven't been a unified country in
| three quarters of a century.
|
| My original comment was about laws against misinformation
| in Germany. That doesn't mean I endorse or need to defend
| all free speech laws in all of Europe.
| guelo wrote:
| We don't think lies about the vaccine or who won the election
| are noble causes worth making sacrifices for.
| [deleted]
| LegitShady wrote:
| Once you put the structures in place you don't get to choose
| what gets censored. You're arming a terrible weapon that will
| eventually be used against you, on the pretext that it will only
| be used for these two things - and it won't only be used for
| those things.
|
| Terrible short sightedness.
| s3r3nity wrote:
| Glowing example of "tossing the baby out with the bath
| water."
| bopbeepboop wrote:
| They also censored that Fauci caused COVID by funding gain-
| of-function, even though that turned out to be true.
|
| Fauci unbanned gain-of-function, funded the lab which leaked
| a seemingly engineered virus, and had that banned by the
| Senate when it emerged what he'd done.
|
| The same can be said about media censorship around Cuomo and
| Whitmer killing tens of thousands.
| s3r3nity wrote:
| This ain't it, chief.
|
| We shouldn't be trusting (and giving power to) central
| governing bodies to dictate what counts as a "lie," and
| scrubbing away inconvenient information.
|
| "I disapprove of what you say, but I will defend to the death
| your right to say it" - Voltaire
| BitwiseFool wrote:
| Full agreement. "Slippery Slope" may be a fallacy, but
| establishing a precedent is a real thing and there's no way
| this stops here.
| imwillofficial wrote:
| I think whoever popularized "slippery slope" being a
| fallacy was an evil mastermind.
|
| It's given countless midwits the ability to blithely
| dismiss valid slippery slopes with "nuh uh! It's a
| fallacy!"
| vimy wrote:
| Because the activists _want_ tech companies to censor and
| control information.
| notquitehuman wrote:
| Not all or even most. Tech platform censorship is just ranked
| lower on their list of issues/grievances. How do you think
| activists should be spending their time?
| drstewart wrote:
| >Tech platform censorship is just ranked lower on their
| list of issues/grievances.
|
| Yeah, I doubt that.
|
| https://alphabetworkersunion.org/principles/mission-
| statemen...
|
| >Social and economic justice are paramount to achieving
| just outcomes. We will prioritize the needs of the worst
| off. Neutrality never helps the victim.
|
| That doesn't scream "end platform censorship" to me.
| ceilingcorner wrote:
| The ideology of young activists basically boils down to power.
| You either have it, or you don't. Foucault in a nutshell.
| Abstract principles that should apply equally to everyone are
| just a colonialist legacy, or something.
| kube-system wrote:
| For better or for worse: the cloud was always somebody else's
| computer.
|
| Honestly I think it's pretty silly that we've all become so
| complacent with the cloud that we pretend that we're not guests,
| and that we own the place.
| knownjorbist wrote:
| While worrisome, I'm surprised HN has essentially nothing
| negative to say about people spreading vaccine misinformation,
| peddling The Big Lie, or other conspiratorial zeitgeist stuff
| from the last 5ish years.
| _-david-_ wrote:
| This virus is the prime example of why most people here are
| against things like this. Look how many times the government
| officials have flipped on all of this virus stuff. One day it
| is completely debunked, discredited and misinformation to
| suggest the virus came from a lab. The next day it is a
| plausible theory. We cannot trust banning misinformation
| because we don't know what is actually misinformation. What is
| misinformation today is accurate information tomorrow.
| tomcam wrote:
| This will end well.
| MattGaiser wrote:
| People call for tech regulation.
|
| Government has mandate to regulate tech.
|
| Tech realizes that shift in power and self regulates in a way
| favourable to the government.
|
| This reminds me of when Reddit tossed out Ellen Pao. Reddit users
| demanded wholesale change, so what they got was /u/spez, who happily
| banned all manner of subreddits when the fight was originally
| about one.
|
| https://old.reddit.com/r/announcements/comments/3dautm/conte...
| floren wrote:
| "Mighty nice service ya got here, shame if it was ta get...
| regulated. Unrelated, boy we sure hate when people say XYZ"
| gjsman-1000 wrote:
| Amazing they have the gall to think they can still arbitrate truth,
| considering how YouTube up until recently was banning the
| "conspiracy theory" that the recent pandemic may have originated
| from a Wuhan Lab... Something that now all the experts are saying
| may actually be true.
| seoaeu wrote:
| You don't have to be an arbiter of all truth to flag some
| statements as dangerously untrue. Just because truth is
| sometimes hard to determine doesn't mean that we have to give
| up on there being an objective reality
| cwkoss wrote:
| "Dangerously untrue" is entirely subjective. NSA director
| testified that Snowden revelations were untrue. Snowden
| revelations cost US credibility, and probably have hampered
| our ability to "protect the globe". Does that make Snowden's
| revelations dangerously untrue?
|
| Is there a specific legal test you could propose to
| distinguish actually 'dangerously untrue' information from
| info which just threatens existing power structures?
| polynomial wrote:
| Oh that's ok, didn't YouTube just win a major award for Human
| Rights or something...?
| gjsman-1000 wrote:
| Yes, their CEO won a Free Speech Award. At an event sponsored
| by YouTube. Unbelievably stupid.
| IgorPartola wrote:
| I am so tired of seeing this crap. Here is what happened:
|
| Act I: there was no solid evidence as to where COVID
| originated. There was speculation of various kinds ranging from
| transmission from a bat cave via wet markets, to lab leak at a
| Wuhan lab, to it being a bio weapon. Nobody had solid evidence
| though people started digging.
|
| Act II: a group of pundits and conservative political
| operatives who were already known to be liars and blowhards
| latched onto the lab leak theory. It just so happened that it
| was politically advantageous for the GOP to push anti-Chinese
| sentiment at the time and a lab leak theory would point the
| finger directly at the Chinese government. They presented no
| evidence and had no evidence. They started spreading this
| theory along with various online groups _all known for
| spreading disinformation_. The clear undertone of this message
| was a call for violence against Asian Americans.
|
| Act III: in reaction to the calls for violence tech companies
| started curbing spread of this message. Remember some of the
| other things these same people were saying: that COVID was fake
| and invented by the US government to keep us indoors while they
| installed 5G towers everywhere to control our minds; that
| staying home and wearing masks was designed to weaken our
| immune systems to prepare for some kind of bio weapon attack;
| that Bill Gates was using vaccines to implant trackers in every
| arm of every individual around the world; that the mRNA
| vaccines are designed to let Pfizer and Moderna copyright or
| trademark your DNA such that they own you and you become their
| slave; that nobody actually was dying from COVID and this was a
| massive coverup designed to make Trump look bad.
|
| Act IV: evidence had emerged that the lab leak theory might
| have credibility. The tech companies lifted the filtering
| efforts.
|
| ---
|
| Note that this is in no way different than if I tell you that
| lizard people from Mars run the US government and you tell me
| to shut the fuck up. I _might_ be right and maybe evidence
| later shows up that in fact lizard people from Mars do run the
| US government, but since I have no evidence at the moment this
| is just bullshit at best and a call for insurrection at worst.
| Even a broken clock is right twice a day, and when liars say
| something there is less than neutral reason to trust what they
| say. When mostly liars are frothing at the mouth spreading a
| theory, well it sure walks, talks, and sounds like a conspiracy
| theory. Had the people spreading the lab leak theory initially
| taken the time to do proper research and present any kind of
| shred of evidence then maybe it would have been treated
| differently. But as is they had no credibility to begin with,
| so is it that surprising that what they had to say was treated
| as lies, especially given their clear conflict of interest?
|
| And yes I am aware of individual incidents of various
| investigators being hampered by their higher ups from looking
| into the lab leak theory. I read those stories with nuance and
| what I gleaned is that it was (a) partly incompetence and (b)
| partly reactionary to the bullshit that the talking heads on TV
| were spreading. Maybe what we should focus on is holding the
| talking heads on TV to some standard of reality and these
| things won't happen instead of crying "censorship!" when
| someone calls for violence against an ethnic group based on
| unsubstantiated (at least at the time and still currently not
| proven) theory.
| _-david-_ wrote:
| I am so tired of seeing this crap. Here is what happened:
|
| People said the lab leak theory was debunked because Trump
| said it.
|
| Maybe what we should focus on is holding HN posters to some
| standard of reality and these things won't happen.
| mshanowitz wrote:
| Act II is grossly misrepresented. It wasn't just the GOP and
| a lot of evidence was presented. In fact, there hasn't been a
| whole lot of new evidence presented since then.
| cwkoss wrote:
| The comment you're replying to is a perfect example of why
| tech companies shouldn't be trusted to be arbiters of
| truth.
| thatguy0900 wrote:
| They're just making sure next time scientists come up with a
| politically based consensus they bury the dissent a little
| better.
| prezjordan wrote:
| Why is conspiracy theory in quotes? It was a conspiracy theory,
| peddled by conspiracy theorists, and laundered into the public
| discourse by conservative media outlets. One of many dozen such
| claims about COVID-19. A broken clock is right twice a day (in
| this case the clock is probably still wrong).
| maaand wrote:
| both sides think the propaganda they consume is closer to
| reality and truer than the propaganda others consume.
| prezjordan wrote:
| Patently false that this is a both-sides type of scenario. One
| dabbles in conspiracy theory an order of magnitude more
| than the other.
| djrogers wrote:
| No, it was _a theory_ that millions of people thought was the
| most likely possibility. Those people were silenced on
| Youtube /Twitter etc.
| fraudz4us wrote:
| It isn't a conspiracy.
|
| It is a fact.
|
| There are official government funding records of Fauci shipping
| money not just to the North Carolina lab but directly to Wuhan.
|
| There are official government funding records of Fauci
| explicitly funding "gain of function" research a.k.a., bio-
| weapons.
|
| Fauci needs to FUCKING HANG.
|
| Google is hardcore fascist.
| echelon wrote:
| I just wanted to respond.
|
| Gain of function isn't a conspiracy. It's research
| methodology.
|
| We still don't know for a fact what happened. Lab leak seems
| incredibly plausible, but we need more data to understand.
|
| If lab leak is what happened, then this is likely a case of
| best intentions that went horribly awry. Your response is far
| too extreme and discounts the failure modes that may have
| been more likely.
|
| Researchers in the US wanted to conduct gain of function
| research but couldn't due to the legislative environment.
|
| China has a ton of novel coronaviruses in local wildlife
| reservoirs that do cause disease and can evolve to impact
| humans. These are worth studying.
|
| An arrangement could have been made to study these viruses
| with research objectives laid out by Western scientists.
| That's not bioweapons research. That's basic science.
|
| The Wuhan lab may not have been equipped with the same safety
| protocols, enabling the virus to escape. Here's something
| we'd still need to find out.
|
| What we need to look at is the cause of failure and prevent
| it from happening again. If rules were broken, then a handful
| of individuals may be responsible for unleashing this.
| (Overzealous Western researchers and Chinese lab personnel.)
|
| This is geopolitically complicated and all parties involved
| are trying to save face. China and the US included. Not to
| mention every government that was slow to act in stopping the
| spread.
|
| This is maybe human error. Lots of human error. If this is
| the case, it's historically notable as probably one of the
| greatest mistakes in human history. It cost millions of
| lives.
|
| Through the same lens, it's also interesting to see all of
| the positive changes. RNA vaccines, remote work, supply chain
| discoveries, etc.
| CodeWriter23 wrote:
| Why not study the human immune system and how to reinforce
| it via diet? Like how they studied bone loss decades ago
| and decided to add Vitamin D to milk to increase calcium
| uptake. Seems to me there are millions of potential
| pathogens that could be problematic for humans, increasing
| defenses seems to be the rational approach, if one's intent
| is to actually preserve human life.
| ixacto wrote:
| It's possible that there were great intentions all around
| and the COVID leak was a horrible accident.
|
| It's also possible that a PRC agency or individual decided
| to take advantage of the situation and leak it to try and
| stick it to the United States as a geopolitical move.
|
| I hope this isn't true, but until we have additional
| evidence it would be impossible to rule this out entirely.
| Also the PRC needs to show that evidence to the world ASAP.
|
| For some reason the media wants to call everyone that looks
| at the situation xenophobic and racist, which just leads me
| to believe there is more to the story.
| drew-y wrote:
| Saying "all the experts" think it may actually be true is a big
| stretch. It's far more accurate to say that the experts haven't
| ruled it out. But they still think natural origin is far more
| likely[1].
|
| [1] https://www.nature.com/articles/d41586-021-01529-3
| djkivi wrote:
| I thought this theory was debunked by scientists over a year
| ago?
|
| https://www.npr.org/2020/04/22/841925672/scientists-
| debunk-l...
| naasking wrote:
| It was not: https://thebulletin.org/2021/05/the-origin-of-
| covid-did-peop...
| djrogers wrote:
| It was debunked by _some_ scientists, but nowhere near all,
| and there was no consensus.
|
| https://www.npr.org/2021/07/15/1016436749/who-chief-wuhan-
| la...
| CodeWriter23 wrote:
| That is the truth. But mainstream propaganda, as it
| typically does, shouted loudly that IT DID NOT COME FROM A
| LAB IN WUHAN.
| polynomial wrote:
| Seems like the inevitable result is decision fatigue,
| where people decide, in the face of 2 irreconcilably
| polarized viewpoints, that it's not worth the emotional
| stress of all the information overload to try to
| understand which is correct and simply give up.
| peytn wrote:
| > I spoke to Peter Daszak, president of the EcoHealth
| Alliance
|
| This guy became a little controversial [1] after a FOIA
| request turned up emails in which he appears to be
| conspiring with colleagues to manipulate public perception:
|
| > "you, me and him should not sign this statement, so it
| has some distance from us and therefore doesn't work in a
| counterproductive way." Daszak added, "We'll then put it
| out in a way that doesn't link it back to our collaboration
| so we maximize an independent voice."
|
| > Baric agreed, writing back, "Otherwise it looks self-
| serving and we lose impact."
|
| [1]: https://www.vanityfair.com/news/2021/06/the-lab-leak-
| theory-...
| BitwiseFool wrote:
| "Debunk" is becoming quite the loaded term. For one thing
| it's being used definitively despite the fact that there
| are still ongoing debates. The other thing is that it
| implies a sort of finality. New evidence or the results of
| an investigation can pop up anytime.
| beervirus wrote:
| A few months ago, it was forbidden to think about. Now it's
| considered a real possibility by the mainstream. Next year,
| it will probably be considered irrefutable.
| gjsman-1000 wrote:
| I said that almost "all the experts" agree that it _may_ be
| true, not that they necessarily believe it to be so.
| colinmhayes wrote:
| I don't see an "almost"
| drew-y wrote:
| Still. There is a difference between saying something
| hasn't been ruled out and saying that it may be true. To
| me, "may be true" implies a decent (> 10% chance) that it
| _is_ true. My impression is that most experts do not think
| it's true. They just don't have enough evidence to
| definitively rule it out.
| gjsman-1000 wrote:
| Really? Just yesterday: "The WHO's Chief Says It Was
| Premature To Rule Out A Lab Leak As The Pandemic's
| Origin."
|
| https://www.npr.org/2021/07/15/1016436749/who-chief-
| wuhan-la...
| drew-y wrote:
| That headline is literally my point.
|
| "It Was Premature To Rule Out A Lab Leak" === We do not
| have enough evidence to definitively rule it out.
| inglor_cz wrote:
| We are talking about a one-off event that happened in a
| totalitarian country with a huge penchant for purging
| undesirable information.
|
| Even experts are on a shaky ground when almost all
| primary information sources are controlled by a non-
| cooperative party. The only thing concerning the origin
| of the epidemics that is beyond reasonable doubt is the
| genetic sequence of the virus. We do not even know for
| sure who the first covid-19 patient really was and when.
|
| Of course it is hard to speak with confidence in such a
| situation.
| jsight wrote:
| He should not have used the word all, but this feels like a
| really pedantic point to me. The main thing is that there is
| no consensus that it didn't get created that way.
| [deleted]
| mrguyorama wrote:
| These two things are vastly different: A) All the experts
| agree that covid comes from a wuhan lab B) Experts have not
| ruled out a lab leak
|
| Pointing out such a difference is not being pedantic, and
| in fact is hugely important
| mwigdahl wrote:
| But that is not the difference here. The original post
| said "Something [meaning the lab leak theory] all the
| experts are saying may actually be true". This is
| somewhere in between your A and B and at least to me
| sounds closer to B than A, as "may actually be true" is
| more of an acknowledgment of possibility than an
| assertion of probability. It's definitely not a statement
| of certainty.
| bzbarsky wrote:
| The post said "all the experts are saying may actually be
| true". The "may" (not "is"!) there is pretty key in
| making the meaning a lot closer to your (B) than anything
| like your (A)... Did that comment get edited after yours?
| Because it seems like you are arguing against a strawman,
| not what was actually said.
| devwastaken wrote:
| Youtube is still populated by actual conspiracy Qnuts. Much of
| the YouTube active discussion is right leaning, look to the
| comment sections.
|
| Google isn't trying to arbitrate truth, it's a combination of
| doing what China tells them to do and _not_ being the arbiter
| of truth by removing almost anything related to covid. Simply
| discussing it can get you demonetized or your video removed.
| guelo wrote:
| Why pretend that there hasn't been loads of fake harmful covid
| misinformation over the last year? The conspiracy theorists
| have not been vindicated, or are you taking hydroxychloroquine?
| ASalazarMX wrote:
| > Something that now all the experts are saying may actually be
| true
|
| Since this is blatantly false, I would ban this as
| misinformation if it happened in a place I moderated. I don't
| care if it's about China, I care about not enabling
| misinformation. If people want conspiracy theories, there are
| friendly forums they could visit, but they want them in serious
| forums too.
| gjsman-1000 wrote:
| WHO Chief, yesterday, admits ruling out the lab leak theory
| was premature. It's not blatantly false unless the WHO Chief
| is propping up conspiracy theories now.
|
| https://www.npr.org/2021/07/15/1016436749/who-chief-wuhan-
| la...
| ASalazarMX wrote:
| I agree it's among the possibilities, but one should be
| honest about their likelihood. Right now the consensus is
| that natural occurrence is the most likely explanation, but
| there is not solid evidence to discard a lab leak.
|
| They don't have solid evidence to disprove a lab leak.
| Disproving is a lot harder than proving.
|
| I must concede that your statement wasn't blatantly false,
| but it uses weasel language to appear more truthful than it
| is.
| Uberphallus wrote:
| Again, it can be both, (and IMO chances are it's both).
| It could have been a sample from nature that leaked out
| of the lab.
|
| There are no signs that it might be man-made. There are
| no signs that it might have evolved in GoF research. But
| the WIV had a sample of a close ancestor of this virus, so
| it's not crazy that someone caught the bug and it got out in
| that very same city.
| slices wrote:
| evolutionary biologists Weinstein & Heying argue that it
| has the fingerprints of GoF all over it
| Uberphallus wrote:
| I haven't read from them, but as a general thing, GoF
| research is very directional, so in practice the
| experiments will pressure a virus to gain a function, but
| the rest of the functions are usually impacted to a large
| extent due to the lack of evolutionary pressure.
|
| So you may do selection on virions that target better
| certain receptors in certain human cells to infect them,
| and that's useful to know of possible evolutions of a
| wild virus, but in parallel it might be losing
| environmental resistance (temperature range, UV light),
| or maybe damage the expression of some vital protein, or
| become too pathogenic and die along with the host.
|
| By all means, one could try to perform GoF in live humans
| to ensure there's no LoF, but that enormously limits the
| speed of the research, plus it's usually forbidden to
| experiment in humans these days.
| s3r3nity wrote:
| >...but one should be honest about their likelihood.
|
| I agree, but I don't trust any central authority
| (especially the government!) to do that for me.
|
| Educating people & empowering them to make this call is a
| much more ethical, and fruitful, endeavor.
|
| To loosely paraphrase Mark Twain (I believe?): Just
| because a toddler can't use a sharp knife doesn't mean
| that I have to use a butter knife to cut my steak.
| ASalazarMX wrote:
| > To loosely paraphrase Mark Twain (I believe?): Just
| because a toddler can't use a sharp knife doesn't mean
| that I have to use a butter knife to cut my steak.
|
| That's a great quote, only we're the toddler. For all its
| political failings, the WHO is brutally more
| knowledgeable about the subject than us, and less biased
| than our local governments. They're asking China to open
| up so they can discard/prove a lab leak. If China
| refuses, the lab leak possibility remains on the table no
| matter its likelihood.
|
| Weaseling the phrasing, one can give the impression that
| the WHO turned around and now thinks the lab leak is the
| most likely source, while they still consider the natural
| origin most likely.
| nradov wrote:
| There is no such consensus.
| thinkingemote wrote:
| It's a slippery slope but not for the reasons you might first
| think.
|
| It's because this is turning a platform into a publisher. Soon
| more and more governments will be demanding that publishers enforce
| editorial standards. There seem to be no platforms these days
| that say they are just that. Even Cloudflare, a piece of
| background infrastructure, is happily editorialising certain
| things.
|
| With this idea, try not to get sucked into the other argument.
| The issue is not that the list of certain things is going to
| increase, the issue is that the impartial nature of the internet
| is disappearing.
|
| There's a fair bit of evidence from some public hearings that the
| major players were given an ultimatum from both Republicans and
| Democrats to start being publishers, to relinquish control to the
| state, or be split up under anti-monopoly and anti-competition
| laws.
| krapp wrote:
| >It's because this is turning a platform into an publisher.
|
| The "platform vs. publisher" dichotomy that crops up in these
| conversations is propaganda. Exercising editorial control over
| content does not create a distinction between one or the other,
| or convert one into the other. The distinction doesn't exist as
| a matter of law, legal right or obligation.
|
| Every web site and service, from tiny phpbb forums to FAANG
| silos, has always had the right to choose what does and does
| not appear on their site, and the discretion to "editorialize"
| as they see fit.
|
| And the internet has never been impartial. Individual sites can
| be run impartially, but that's a choice made by the site owners
| - not an obligation or legal duty. Other site owners have every
| right to run their site under other terms. If you don't like
| the terms under which a service is offered, you can use another
| service.
|
| https://www.eff.org/deeplinks/2020/12/publisher-or-platform-...
| stale2002 wrote:
| I would recommend you read the actual law.
|
| https://en.wikipedia.org/wiki/Section_230
|
| "No provider or user of an interactive computer service shall
| be treated as the PUBLISHER or speaker of any information
| provided by another information content provider."
|
| So yes. In the actual law, there is _something_ called a
| publisher, and the law is distinguishing the website from
| being _something_ as opposed to a publisher.
|
| So yes, there absolutely is a "distinction" being made here,
| regarding something that the law itself calls a publisher.
| krapp wrote:
| My claim was not that publishers didn't exist, but that the
| commonly presented distinction between platform and
| publisher - that a "platform" cannot moderate content
| beyond strict legality, or else they must be be considered
| a "publisher," lose Section 230 protection and take full
| legal responsibility for all content on their site - does
| not exist. Google's editorial policies "turning them from a
| platform into a publisher" is not a thing that actually
| happens.
| stale2002 wrote:
| > that a "platform" cannot moderate content beyond strict
| legality
|
| Nobody in this thread said anything about this being
| currently illegal. Instead the claim was about a platform
| acting more like a publisher, in practice.
|
| From a de facto perspective, things that people often call
| platforms act very differently from publishers,
| in practice.
|
| The person you were responding to is saying that this
| change, from the previous status quo of platforms acting
| neutrally, is a problem, but didn't bring up anything to
| do with the law.
|
| The fact that this stuff is legal, to become less
| neutral, is in fact precisely the issue!
|
| EX: the following statement "the issue is that the
| impartial nature of the internet is disappearing"
|
| Is a description of how things work, in practice, that
| has nothing to do with the law, and instead having to do
| with how these entities act in practice.
|
| > Google's editorial policies "turning them from a
| platform into a publisher" is not a thing that actually
| happens.
|
| Yes it is a thing that is happening. It just has nothing
| to do with the stuff you brought up. In the past, Google
| acted differently. It has nothing to do with it being
| legal to act differently.
|
| The actual, more interesting thing, though, is how this
| will affect _future_ laws.
|
| In reality, colloquially called platforms acting more like
| colloquially called publishers very much could cause
| changes in future laws.
| krapp wrote:
| >Nobody in this thread said anything about this being
| currently illegal.
|
| And neither did I. My comment obviously referenced
| legality in the context of what sort of content
| "publishers" versus "platforms" are considered able to
| moderate - strictly legal content versus legal content
| which offends some otherwise arbitrary guidelines.
|
| This is the second time you've misconstrued my comment,
| so I'm going to find a better use of my time now. Good
| day.
| stale2002 wrote:
| > you've misconstrued my comment
|
| You are misconstruing the other person's comment is the
| point. They were talking about how laws might change, and
| the consequences of how platforms acting more like what
| are commonly called publishers might be.
|
| And platforms acting more like publishers, in
| the colloquial sense, is absolutely a thing, and it has
| nothing to do with the law.
|
| You called it propaganda, when there is actually a point
| to be made here.
| trhway wrote:
| Such rules are probably not for everyday enforcement. It is more
| like Russian laws - have something to use when needed and/or to
| publicly demonstrate that you care about the issue.
| teh_infallible wrote:
| It's ironic that the largest advertiser is concerned about
| misleading content.
| tikiman163 wrote:
| Google drive is not supposed to be a mass distribution platform
| and it actually violates the TOS to treat it like one. You can
| argue all you like about privacy and free speech rights, but they
| do not apply to anyone trying to use Google drive as some sort of
| free content distribution network. It wasn't designed to let
| people do that, and Google has every right to ban people for
| violating the TOS. The fact that they typically haven't reacted
| to this kind of use on a large scale before now should drive home
| the point of just how much these idiots have started to abuse the
| TOS all for the purpose of distributing misleading content.
| fraudz4us wrote:
| Hacker News was cool maybe 10 years ago. Now it's just one
| gigantic liberal toolshed.
| toss1 wrote:
| >> "Misleading content related to harmful health practices:
| Misleading health or medical content that promotes or encourages
| others to engage in practices that may lead to serious physical
| or emotional harm to individuals, or serious public health harm."
|
| One can hope that this will apply to the massive anti-vax
| campaigns on YouTube
| axy wrote:
| Idiots found the right place to publish their knowledge: Google
| Drive, yeah. If they were true scientists, that would be an
| excuse - scientists are unaware of popular technology; they deal
| with pure science.
| theodric wrote:
| No more CNN clips. Got it.
| kart23 wrote:
| >could significantly undermine participation or trust in civic or
| democratic processes
|
| Participation in our democratic and civic processes includes the
| freedom to speak about our government in a way that we desire,
| even if it's misleading or contradicts our government. It's part
| of the reason why America was founded in the first place. Google
| is being hypocritical here.
| quantum_state wrote:
| Misleading by what standard? "Misleading content" may well be in
| the eye of the beholder.
| rhapsodic wrote:
| Which side are you on?
| ajsnigrutin wrote:
| Companies should really decide and be either platforms (only
| remove directly illegal material, when reported, and carry no
| responsibility for other stuff) or publishers (cherry-pick what
| they want posted/hosted, and carry all the responsibility for the
| posted content).
| crazygringo wrote:
| I see a lot of comments misinterpreting this.
|
| First, it's not about private files, it's about _distributing_
| content.
|
| Google isn't spying on your private files, but does scan them
| when you share them publicly. E.g. keep all the pirated movies
| you want on your Drive, and even give private access to friends,
| but the moment you make them _publicly_ viewable Google scans
| them and limits access accordingly. So no, this isn't applying
| to your private diary or privately shared documents.
|
| And second, to those who claim absolute free speech with no
| limits -- notice that the two main categories here are related to
| _democracy_ and _health_. All our legal protections ultimately
| depend on a democratic foundation -- undo that with
| misinformation and you don't have anything anymore. Similarly,
| your rights don't matter much if you're dead. Companies aren't
| allowed to advertise rat poison as medicine and neither are you.
| listmaking wrote:
| Actually, this is not even about distribution exactly, but
| about the "Report Abuse" button: what this page lists are
| categories of things that, if someone with access to the file
| clicks on "Report Abuse", whoever is acting on those flags may
| decide is a violation. Note that the page says:
|
| > _After we are notified of a potential policy violation, we
| may review the content and take action..._
|
| So (1) it's not about Google proactively scanning all your
| files (even public ones: though I guess with sufficiently
| public files, sooner or later someone will click on "Report
| Abuse", perhaps even by accident), and (2) I imagine it could
| happen with files you shared with just your friend, if your
| "friend" decides to "Report Abuse".
|
| (Disclaimer: I work at Google but not on Google Drive or
| anything related to these policies.)
| prepend wrote:
| I don't want a cloud storage provider with only private
| storage. If I have a library of book files I want to share it
| with my spouse and if Google is trying to filter out misinfo
| and not let me distribute it to my spouse, that's bad.
|
| I think we'll have augmented intelligence through computing
| soon and imagine how horrific it will be if Google says "you
| can think misinfo, we just won't let you think it?"
|
| That's bad. Storing, creating, and distributing don't need
| limits like this.
|
| Asimov's three laws were possible and they still had issues.
| Imagine having a law for robots that they couldn't say anything
| Google thought was misinfo.
| wes-k wrote:
| > Companies aren't allowed to advertise rat poison as medicine
| and neither are you.
|
| You may want a different example :).
|
| > Warfarin first came into large-scale commercial use in 1948
| as a rat poison. Warfarin was formally approved for human use
| by the US FDA to treat blood clots in 1954.
|
| Source: https://en.m.wikipedia.org/wiki/Warfarin
| contravariant wrote:
| There's something fundamentally flawed about the idea that
| censorship in the name of preventing misinformation is
| protecting the foundation of democracy.
|
| You cannot have true democracy if people cannot disagree with
| their governments, they must be able to disagree with _any_
| truth or opinion such a government might consider self-evident,
| just on the off chance they're right.
|
| I should at this point note that Google doesn't directly claim
| to go quite _that_ far in preventing misinformation, they
| mostly claim to disallow things that could harm the democratic
| _process_ (e.g. telling people to vote at the wrong place,
| their candidate has died, etc.). At least that kind of
| information is usually agreed upon (if not there are bigger
| problems than mere misinformation), though they seem to try to
| include claims of voter-fraud, which is a bit dangerous.
| nickysielicki wrote:
| The bigger problem that I have with the idea that
| misinformation kills democracy is that it seems to suggest
| that misinformation is some new phenomenon or that the
| average person has been well informed throughout the history
| of western democracy.
|
| Democracy thrived before the printing press. Democracy
| survived the invention of the printing press, which was
| mostly in the hands of magnates who could afford it.
| Democracy survived the invention of television and radio,
| which was (and still is) in the hands of a select few
| magnates. We build up terms like "journalistic integrity" and
| look at the past with rose colored glasses as if these
| mediums delivered pure objective truth.
|
| If anything, what we're seeing with the internet is a more
| true democracy with a wider range of opinions, less
| controlled by small groups of plutocrats. If you don't like
| to see the death of that plutocracy, or you're happy to see a
| new group of benevolent plutocrats come in to retake control of
| the narrative, I hate to be the one to tell you this, but you
| don't really like democracy.
| BrainWorm wrote:
| Soapbox, Zine, Flyer, Specialty Forum, Usenet, Civilian Radio
|
| You aren't censored if you can't generate a link to Bits on a
| google server.
| contravariant wrote:
| Call me a deontologist but if it's good then all of them
| should do it and if it's bad then none of them should.
| throw_nbvc1234 wrote:
| Do you feel the same way about free speech zones? Don't
| like what someone says, just force them to move out of the
| way so they end up protesting to an empty audience.
| Naturally if it's a big enough protest you may have to use
| violence (tear gas, riot police etc...) to do this but hey,
| they still get their right to free speech, so i guess it's
| all good and democratic.
|
| https://en.wikipedia.org/wiki/Free_speech_zone
| kirykl wrote:
| If Google didn't hide the URL people might know what voting
| location info is official and what's not
| Steltek wrote:
| What would you do to combat the deliberate misinformation
| campaign that is weakening the US and many other countries?
| While propaganda and polarization are not new, the speed,
| reach, and aggregation are only possible with modern
| communication.
|
| An early tidbit that may have been lost with deplatforming
| was that Euro leaders also disagreed, citing that only the
| state could be trusted with that power. However, the US
| Constitution mostly prohibits that route and it's left to
| private companies to make up their own minds about what
| content they want to host.
| contravariant wrote:
| It's a hard problem for democracy, the best countermeasures
| I know of are transparency and education, but those are
| mitigations at best, you can't really do much if a majority
| of people believe an untruth.
|
| You could also elect me as your benevolent dictator, I'll
| be happy to bring the misinformation to an end, but the lie
| I'd tackle first would be that this has anything to do with
| democracy.
| hackererror404 wrote:
| Imagine if Britain had this same technology when the USA was
| founded... It of course would have quickly cracked down on
| communications and it would have done so in the name of
| "peace" and "what's right"...
|
| Thinking critically of a government, and even believing that
| perhaps the government as it stands today is not the government
| "of and for the people", could easily be interpreted as
| anti-democracy by that same corrupt government... And maybe
| that's not correct, but who is the government to say that we
| can or cannot challenge them in public discourse, which is
| supposedly protected under the first amendment?
|
| This is indeed an insanely slippery slope and people willing
| to trade their freedoms because they think it's for the
| ultimate good, I think are really making a mistake... it's
| not difficult to understand that this is one of the first
| steps of an actual fundamentally corrupt government... This
| is easily open to abuse and vast interpretation.
| BrainWorm wrote:
| `Magna Carta originated as an unsuccessful attempt to
| achieve peace between royalist and rebel factions in 1215`
| [deleted]
| crazygringo wrote:
| > _There's something fundamentally flawed_
|
| There isn't really. You're adopting, I assume, J.S. Mill's
| view, that the cure for bad speech is more speech, which he
| famously published in 1859.
|
| However, since then it's been widely accepted that when
| speech reaches a certain level of _harm_ then the greater
| good is to prevent/punish it. You can't incite violence
| under the guise of free speech. You can't advertise that
| something is safe when it's not. This is because more speech
| can't undo violence and death after it occurs.
|
| And when it comes to misinformation with regards to provable
| and intentional lies about voting procedures, election
| results, etc. that _falsely_ harm the country's institutions
| and legitimacy, it's entirely consistent for that to fall
| under the widely-accepted prohibition of speech that rises to
| a certain threshold of harm. It directly leads to mobs,
| riots, and revolution _based on lies_, not based on actual
| injustices.
|
| This doesn't mean _any_ harmful speech is prohibited -- that's
| ridiculous. You're generally allowed to insult people,
| tell lies, etc. But there's a _threshold_ of harm that gets
| established.
| overgard wrote:
| Giving anyone the ability to arbitrate what is "good"
| speech vs "bad" speech is way too much power. In any era of
| history there have always been "truths" that were massively
| popular and eventually overturned. I don't think we are the
| first era to be an exception. So when you're talking about
| punishing "bad" speech you are talking about creating super
| powerful entities just because they agree with you. That
| intent scares me far more than whatever nonsense you get
| from q anon or antivaxxers.
| cmckn wrote:
| > Giving anyone the ability to arbitrate what is "good"
| speech vs "bad" speech is way too much power.
|
| But that isn't at all what is happening here. Google has
| decided that they don't want to enable people to
| distribute certain data using their platform. They're not
| being crowned the omnipotent oracle of good and bad.
|
| > you are talking about creating super powerful entities
| just because they agree with you.
|
| This position is bizarre to me -- what do you think an
| elected government is? I vote to create "super powerful
| entities that agree with me" every 4 years. _Those_
| entities possess the power to destroy all life on earth.
| Google is not anywhere near as powerful as those
| entities, and while it is not (directly) democratically
| accountable, it _does_ derive its power from its users.
|
| No information will be permanently erased just because
| Google does not spend money and time making it available
| on Drive.
| hf98shf89sh wrote:
| So, last year when the Wuhan lab leak was a conspiracy theory,
| it would have been (and was) censored.
|
| But what happens when Google/their allies change their mind and
| determine that something is no longer a conspiracy theory?
| gjs278 wrote:
| the rest of us were fine before google did this drive ban. how
| about they just put a warning on it for morons like you and let
| the rest of us read it?
| ping_pong wrote:
| You are assuming a professional is reading whatever the content
| is you are distributing and will make a rational, fair
| decision.
|
| No.
|
| It's going to be a minimum wage indentured Google servant that
| doesn't quite understand what they are reading but they have
| 17.5 seconds per case to make a decision. They will shoot first
| and ask questions later. What if the document is satire but
| they couldn't understand it? Oh well there goes one strike
| against your account, or maybe that's your third strike and now
| ALL your Google accounts are banned.
|
| We already know what the appeals process is like. Unless you
| get it publicized on Hacker News et al, you won't get any
| chance to appeal.
| rscoots wrote:
| Who at Google do you trust to decide what information you and I
| are allowed to know? What are this person's qualifications?
|
| How can they be held to account when they inevitably get it
| wrong?
|
| Where will the highly-transparent write-ups detailing
| moderation decisions be published?
|
| Seems like if Google actually gave a damn about the morality of
| censorship as some sort of 'necessary evil' you'd be able to
| answer these questions easily^. Until then, it's a non-starter
| in my book.
| systemvoltage wrote:
| To add: Who at Google listens to dissent!? No one is allowed
| to say anything - there is sort of a chilling effect
| internally at Google.
|
| It's debateless policies that are spreading on the world
| stage. People need to rise up against a small group of
| individuals located in Menlo Park, CA who are demonstrably
| and utterly out of touch with the rest of the world, but
| deciding how and what information flows. These people have no
| idea how agriculture works or how people live in Indonesia or
| what conflicts are going on in Namibia.
| revnode wrote:
| This is the key problem with all censorship, however well-
| intentioned. A person is needed to censor. People make
| mistakes. Sometimes by accident, sometimes on purpose. There
| is a strong dis-incentive to having any transparency or
| accountability. If there were, you might be held liable for
| your mistakes and nobody wants that.
| joe_the_user wrote:
| I don't trust Google to fully filter information for its
| credibility ... so I don't automatically trust things shared
| on Google drive.
|
| But I don't have to trust that Google won't suppress valid
| positions on Drive since there are many alternatives for sharing
| information beyond Google drive, which isn't meant to
| primarily host public content in any case.
| rscoots wrote:
| >many alternatives for sharing information beyond Google
| drive
|
| You'll be disturbed to learn then that every major social
| network heavily censors information in an opaque manner.
|
| What about the elderly or others who might only know how to
| use Facebook or YouTube? Fuck em?
| joe_the_user wrote:
| First, since you've been derailing things from one
| question to another, I have to mention that a thing shared
| publicly from a Google Drive account is no more
| accessible than a thing shared from a website that a
| person sets up themselves, so Google Drive accessibility is
| not a particular answer to social network news filtering.
|
| But on the topic of social network news filtering, anyone
| who uses a social network is implicitly consenting to
| that network's filtering of information.
|
| Once upon a time, most people got their news from a
| single newspaper - well informed people might read
| several papers as well as newsmagazines but even this
| implied a lot of filtering. Those newspapers filtered the
| news more heavily than any present network.
| akira2501 wrote:
| > undo that with misinformation and you don't have anything
| anymore.
|
| That presumes that we have come from a period that was somehow
| free of misinformation. This is obviously false, and all we're
| doing is trading one corrupt system of control for another.
|
| Democracy also demands that the burden of proof is on the
| accuser, don't you feel this same standard should apply to
| those, who of their own volition, take on the task of fighting
| this "misinformation?" Shouldn't those deprived have recourse?
|
| > Companies aren't allowed to advertise rat poison as medicine
| and neither are you.
|
| Advertising is always a commercial activity. If I'm merely
| sharing my opinion that rat poison, in some dose, might
| possibly serve as a cure for some particular ailment, how am I
| advertising? Isn't there a responsibility of the other end user
| to not accept medical advice from anonymous information
| published from a free document sharing service?
|
| I'm not sure the trade offs you suggest are gaining us anything
| important.
| gjs278 wrote:
| I only get two comments a day and im going to spend the next
| one on you as well. the rest of the world would be better off
| if you were dead because you are leading us down a path to
| actual hell. let people write whatever the fuck they want you
| fucking fascist prick. not everyone knows how to make a website
| and google drive might be how they communicate their
| information. it's the equivalent to locking up a guy standing
| on venice beach with a megaphone because he sounds crazy, and
| he probably is, but he's got the right to do it. google drive
| is a common carrier and as long as they're not violating the
| laws of the country it's being distributed in, they should fuck
| off and not worry about it.
| freedomben wrote:
| what about if you share a file with select people? Is that
| still "private" or does it become public the moment you give
| someone else access?
| driverdan wrote:
| > Google isn't spying on your private files
|
| This is false. All of the major services that host images scan
| them for child porn, regardless if they're private or shared.
|
| I don't know how they're going to apply these rules and, unless
| you work there and are involved in this, neither do you.
| tyingq wrote:
| >First, it's not about private files, it's about distributing
| content.
|
| Sort of. People use "anybody with the link can view" for lots
| of purposes that are far short of broad public publishing.
|
| I use it for sharing with single digit numbers of people I
| already know, or sometimes just for myself for things that
| don't need to be private.
| mikevm wrote:
| This is not just Google Drive, this is a policy across many of
| their services (Drive, Docs, Sheets, Slides, Forms, and new
| Sites). Holy smokes... I think it's time to finally dump Google
| services for me.
| hiidrew wrote:
| second this...I've been thinking of 'degooglefying' my life for
| a bit; as convenient as some of the products are, this concerns
| and worries me
| theknocker wrote:
| The march to implement the TPP as the de facto policy of
| "private" monopolies continues unabated, with full support from
| our completely amoral and lying intelligence community.
| cbradford wrote:
| As a common carrier they should not be allowed to discriminate.
| Like railroads cannot discriminate against traffic they do not
| like, and restaurants and hotels cannot discriminate. As a
| society we decided long ago that if you hold yourself out to the
| public you cannot discriminate
| RosanaAnaDana wrote:
| This. Also, 'misleading' is a function of how content is used,
| not the content itself. So content in one use case may be
| misleading and in another, it is not. How content is used
| matters. How is Google to judge this?
| YeBanKo wrote:
| It sort of makes sense if it is a document that is public or
| widely shared (anyone with the link can access), in which case
| it serves not only as a collaboration tool, but also as a sort
| of "CDN". Though even then, limiting access seems appropriate;
| "removing the content, and limiting or terminating a user's
| access to Google products" is too much.
| h3rsko wrote:
| I highly recommend people look at Cryptomator[1]. It encrypts all
| your data on your Google Drive or Dropbox etc, but works
| normally on your local machine. It allows you to use these
| services while blocking the provider from being able to view
| your unencrypted data. Also it's free and open source!
|
| [1]https://cryptomator.org/
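|
| The general idea -- this is only a rough sketch of the concept,
| not Cryptomator's actual scheme or code, and the file paths are
| made up -- is to encrypt locally before anything reaches the
| sync folder, e.g. with Python's "cryptography" package:
|
|     from pathlib import Path
|     from cryptography.fernet import Fernet
|
|     key = Fernet.generate_key()   # keep this key OUT of the cloud
|     f = Fernet(key)
|
|     # encrypt a local file into the synced folder
|     data = Path("notes.txt").read_bytes()
|     Path("Drive/notes.txt.enc").write_bytes(f.encrypt(data))
|
|     # decrypt it again on any machine that holds the key
|     blob = Path("Drive/notes.txt.enc").read_bytes()
|     Path("notes.txt").write_bytes(f.decrypt(blob))
|
| The provider only ever sees ciphertext, so there is nothing for
| it to scan or judge.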
| emerged wrote:
| I've been using bitlocker virtual disks on Google Drive for
| years. I wouldn't be surprised if Google eventually decides to
| delete anything encrypted.
| commandlinefan wrote:
| I keep hoping to see a fully end-to-end encrypted, peer-to-
| peer, uncensorable communications channel catch on... but if it
| hasn't caught on yet, I don't think it's ever going to. Too
| many people believe that "if you don't have anything to hide
| you don't have anything to fear" - so they figure if it looks
| like they're hiding something, they should have reason to fear.
| throwawaysea wrote:
| This move shouldn't surprise anyone. This was always the next
| logical step for activist companies and institutions that have
| been openly practicing authoritarian censorship. Their previous
| moves faced no real consequence or meaningful pushback, and they
| face little competition because they are monopolistic, so why
| would they stop marching down this path?
|
| What's more surprising is that there are people here on HN who
| are excusing this by claiming that Google isn't banning hosting
| the content but only distributing it. We need a new phrase for
| this kind of unhelpful trivialization and gaslighting - it isn't
| just harmless trolling.
|
| The biggest impact of this will be from how inconsistently this
| will be applied. Google will not ban activists from sharing
| "toolkits" used to organize riots or push left politics. They
| won't stop false content like the core claims of the 1619 project
| from being shared via BLM materials for schools that are hosted
| on Google drive. Google will suppress one side and in effect
| amplify all others, propagandizing the world through their
| services.
|
| It's time we seek out alternatives, break up big tech companies,
| regulate them, and put an end to their abuse of power. For now
| here are some alternatives to Google:
|
| https://restoreprivacy.com/google-alternatives/
|
| https://www.techspot.com/news/80729-complete-list-alternativ...
|
| https://fossbytes.com/google-alternative-best-search-engine/
| colordrops wrote:
| People aren't seeing the forest for the trees. The establishment
| had a near perfect lock on the narrative before the internet
| became popular. There was a very tight Overton window on TV,
| radio, and publications that didn't necessarily match what people
| were thinking about or wanted to hear about. This was
| intentional.
|
| When the internet broke into the mainstream, it was a strong
| feature, not a bug, that content was hard to censor, and sites
| like Google got a lot of their earlier traction due to their
| results not being gamed or massaged for profit.
|
| You could hear the establishment gears grinding though. They were
| gonna take back control of the narrative even if it took decades.
| Well it did, and they have, to a degree. It's only going to keep
| getting worse as long as they can continue to wrest further
| control.
|
| Do you think that only leaders of other countries like China want
| to control what people see, hear, and think? Do you recognize the
| immense power that comes from narrative control, militarily and
| financially?
|
| edit: for those downvoting, would you care to point out flaws in
| my statements?
| honksillet wrote:
| Just more censorship and gaslighting.
| ehsankia wrote:
| Not sure if related, but for the past 2 months, on a daily basis,
| sometimes 2-3 times per day, I get a Drive notification about
| someone "resolving a comment" with me tagged, and the document is
| naked ladies and other pornographic ads.
|
| Generally by the time I get to the document, it's gone, and
| otherwise I mark it as spam, but it hasn't put a dent in the
| daily notifications I receive...
| dukeofdoom wrote:
| Also today,
|
| Psaki on de-platforming American citizens: "You shouldn't be
| banned from one platform and not others if you providing
| 'misinformation' out there."
|
| https://twitter.com/disclosetv/status/1416095652162031624
|
| They will define misinformation of course.
| willhinsa wrote:
| Not to mention the executive branch "flagging problematic posts
| for Facebook".
|
| https://news.yahoo.com/biden-administration-flagging-problem...
| ComodoHacker wrote:
| >Manipulated media: Media that has been technically manipulated
| or doctored in a way that misleads users and may pose a serious
| risk of egregious harm.
|
| What about deepfakes and other machine-generated content?
| [deleted]
| josephcsible wrote:
| Their definition of misleading says it "includes information
| [...] that contradicts official government records". Because no
| government record has ever been wrong before, right?
| gjsman-1000 wrote:
| A. Which government? All governments in general collectively?
| Or just the governments you want to believe? I'm sure they
| aren't going to listen to the COVID skeptic President of
| Brazil, right?
|
| Even better, _as a user_, can I appeal to the authority of a
| different government than the one I live in? Let's say I appeal
| to the authority of Brazil as governing my content even though
| I live in the US. How does that work?
|
| B. Because no government has ever lied on official records when
| there is a disaster. For sure. And no government is currently,
| right now, lying on their records to save face. For sure.
| zpeti wrote:
| Well, we only have to go back a year, and you are saying the
| Trump admin is the only one speaking truth...
|
| I find it insane that Silicon Valley companies have such
| short memories, and can't even comprehend that 1 year ago the
| same policies would have resulted in banning anything Trump
| disagreed with.
| native_samples wrote:
| The "policy" is merely a fig leaf to disguise untrammelled
| totalitarianism and arbitrary abuses of power.
|
| It's sad. I used to work there. What monster did we create,
| exactly.
| judah wrote:
| I was as shocked as you until I read the actual text.
|
| Banned misinformation "includes _census participation_ that
| contradicts official government records."
|
| Census participation. For example, "You didn't register in the
| census last year? You're ineligible to vote." It's aimed at
| misinformation that discourages people from voting.
|
| That seems much more reasonable, and far more precise, than
| information that contradicts any government record.
| josephcsible wrote:
| Census participation was the last entry in the list. I think
| that clause is meant to cover the entire list, not just its
| last entry.
| Negitivefrags wrote:
| It's amazing to see an authoritarian regime establish itself in
| real time in front of our eyes.
|
| It's so easy to look at the past and think "How did people let
| this happen?".
|
| And yet here we are.
|
| I hope stopping the anti-vaxers and election fraud people is
| worth it.
| cpr wrote:
| Especially as the news about the election fraud slowly rolls
| out across multiple states...
| defaultname wrote:
| Did I accidentally stumble into The_Donald? The bizarre,
| fact-free comments that dominate this discussion are simply
| gross for HN.
|
| How in the world was this comment flagged, beyond the
| brigading of the most ignorant of deplorables. There is
| zero information about "election fraud" coming out, much
| less from multiple states, but this is the tack these
| horrendous cretins use to ply their disinformation noise.
| But did you see all of the news coming out about how the
| "MAGA" crew are actually lizard people with sub-50 IQs?
| It's true, you'll see. It's true! The Cyber Biologists did
| a "study" and they pointed out that in a picture the
| insurrectionists reflected light just so, clearly
| demonstrating that they must be lizard people.
|
| It is embarrassing to see this on HN. The US is turning
| into a laughing stock.
| oogabooga123 wrote:
| I can't tell if you are being sarcastic, this is how far
| internet discussion has deteriorated.
| andrewclunn wrote:
| https://www.westernjournal.com/az-audit-revelation-wrong-
| pap...
|
| Slowly, The_Donald (now banned from reddit by the way) is
| proven right. I was shadowbanned here on HN a while ago,
| but even this place is slowly waking up to the truth.
| Your outrage is your mind crying out in cognitive
| dissonance as you deny the thought, "What if I'm wrong?"
| defaultname wrote:
| While usually it's just a cheap saying, I literally
| laughed out loud at your comment. The delusion is
| incredible.
| mikevm wrote:
| They will support this as long as the government in place is
| one that aligns with their own views. That's pretty obvious,
| right? Given the explicit mention of banning the discussion of
| voter fraud - something the Democrats have done before, but is
| now "illegal".
| markzzerella wrote:
| Early on Fauci was telling everyone how useless masks were [1].
|
| I have a couple people in my social circle that were banned
| from fb for saying he was wrong early on about masks being
| ineffective. And several others were banned later for pointing
| out fauci's earlier stance and calling it all propaganda. You
| are not allowed to think for yourself. Pick up that can.
|
| [1] https://www.youtube.com/watch?v=lE-XVfZCX-o
| deregulateMed wrote:
| N95 masks were never useless and Fauci should be seen as a
| monster.
|
| What I don't understand is why there were no nations giving
| away n95 masks and education on how to wear it. It seems like
| we hivemind to the unsustainable lockdowns.
| djrogers wrote:
| > What I don't understand is why there were no nations
| giving away n95 masks
|
| Don't know if you remember last March/April, but there
| _were_ no N95 masks to give out in a lot of places...
| ceejayoz wrote:
| Right, but what about by _this_ March/April?
|
| I don't understand why most functional governments didn't
| eventually have a monthly care package of "here's your
| masks, hand sanitizer, the latest newsletter, and your
| stimulus check".
| SV_BubbleTime wrote:
| > I don't understand why most functional governments
| didn't eventually have a monthly care package of "here's
| your masks, hand sanitizer, the latest newsletter, and
| your stimulus check".
|
| Partly because it isn't the role of government to "give"
| you things you might want while they are paid for with
| foreign debt in box.
| ceejayoz wrote:
| Disaster mitigation and relief has long been a
| governmental responsibility.
| joshuamorton wrote:
| There was a proposal to do this:
| https://www.washingtonpost.com/context/read-the-scrapped-
| usp...
|
| It was killed for political reasons.
| TheCoelacanth wrote:
| This April they're mostly useless in the US because you
| can just get vaccinated. If you're refusing to get
| vaccinated, you're probably refusing to wear a mask too.
| ceejayoz wrote:
| There's a lot more to the world than the US, and that's
| really missing the point. "We're out of masks" is a
| perfectly good answer to "why aren't governments giving
| them out" in March 2020; it's not a good answer a few
| months later.
| deregulateMed wrote:
| I agree, but by October they were everywhere.
| merlinscholz wrote:
| Last year in Germany you could get IIRC 7 free N95/FFP2
| masks at the local pharmacy. The pharmacists were
| supposed to show you how to wear them.
| YeBanKo wrote:
| "We have recently been notified of a potential policy
| violation and after a thorough review of the video materials
| uploaded, it has been determined that the content is
| misleading and contradicts official government records. As a
| result of this decision, your access to YouTube
| and other Google products, including Gmail, has been
| terminated. The decision is final. This message is auto-
| generated"
|
| "The past was alterable. The past never had been altered.
| Oceania was at war with Eastasia. Oceania had always been at
| war with Eastasia."
| read_if_gay_ wrote:
| Lab leak hypothesis.
| jonnycomputer wrote:
| Pressure by Chinese government, maybe?
| MeinBlutIstBlau wrote:
| A lot of major tech/game companies have been kowtowing to
| Chinese policies.
| jonnycomputer wrote:
| your comment would be more acceptable without the use of
| "kowtowing".
| Uberphallus wrote:
| Strange, I find it particularly accurate.
| jonnycomputer wrote:
| Then use "prostrate", or "abject submission" when means
| the same thing, more or less. English already has words
| for this!
|
| It is not about accuracy, but about language that is
| stereotypically reserved for when talking about China,
| and to my ears, it comes off as a bit racist, tbh.
| selimthegrim wrote:
| I suppose if I say Xi Jinping should go to hoosegow I'm
| stepping in it too?
| mdoms wrote:
| What?! "Kowtowing" has absolutely no racist connotation
| whatsoever! You're a lunatic.
| MeinBlutIstBlau wrote:
| Ke Tou is the Chinese word "Kowtow" basically meaning
| "submission" or "prostrate." Yet For some reason, the
| fact that the Chinese employed this method of submission
| on others was okay, but when we use it on the context of
| China forcing it on Western countries, suddenly it's
| "racist."
|
| Whether I use the word the Chinese made or the use an
| English amalgamation of its meaning is irrelevant. There
| is absolutely nothing even remotely ethnocentric or
| bigoted in using the word. You are just an easily
| offended person.
| jonnycomputer wrote:
| Are you writing in Chinese? No, you weren't. So the
| question is, what is the intent here? Would you have used
| the same word if, for example, the context was Brazil? or
| Russia?
|
| I think a little more self-reflection, rather than
| defensiveness is in order. Especially because the comment
| was intended to be helpful, and not a criticism. The
| original comment was getting downvoted, and I suggested
| what the reason might be.
| OhWellLol wrote:
| The intent is to reveal your linguistic hypochondria,
| which appears to be working very well.
| jonnycomputer wrote:
| Or just that I'm a little more aware?
|
| https://www.google.com/search?q=news+kowtow+-definition+-
| mea...
|
| That's a google search. I just filter out definitions,
| references, and the clothing brand. Now look at it. Sure
| there are examples of the word kowtow used for other
| things. But it's mostly about China. Because in the
| context of China, the word just suggests itself. But why?
| What does it add? What does it subtract. No one here
| wants to wrestle with that. Instead they get defensive.
| OhWellLol wrote:
| Oh goodie, using linguistic references from Google to
| justify accusatory paranoia. Let's play that game!
|
| https://trends.google.com/trends/explore?q=Kowtow&geo=US
|
| I-I-I-is North Dakota a bastion of Chinese context!?
|
| Your conspiracy theory is not only nakedly annoying, but
| pointless until you address this very serious North
| Dakota problem.
| jyrkesh wrote:
| I would have. English is a Frankenstein language,
| adopting words from all sorts of different languages.
| Kowtow means what it means in English, it's a perfectly
| reasonable word to have used there.
|
| I tried really hard to see if this wasn't the case, but
| the consensus across the web seems to align with this SO
| post:
| https://english.stackexchange.com/questions/314900/does-
| kowt...
|
| IMO, he was being downvoted because this particular
| instance is just as much a show of kowtowing to the US
| government as the Chinese government.
| MeinBlutIstBlau wrote:
| I call the Russian special forces Spetznaz and the
| Brazilian yearly festival Car-ni-val. Or when I am happy
| when something bad happens to other people, I have schadenfreude.
| Did you also know most of English is comprised of loan
| words from other languages too? Or are you offended too
| that words like etiquette are actually french?
|
| So yes. I would use a word that the said culture would
| say themselves.
| jonnycomputer wrote:
| Yeah, well that's fine and all. But you're also not
| exactly talking about the ancient Chinese practice of
| submitting to the Emperor either. There are no Chinese
| emperors, and kowtow is not a contemporary custom. So,
| no, you aren't doing what you are saying you are doing.
| You are using the word to have the meaning it has
| acquired in Europe and America.
|
| So why is it that when we talk about Western
| relationships with China the word kowtow comes up so
| often? What does it add? Why does it suggest itself?
| Maybe wrestle with that for a while.
|
| This isn't about _you_.
| Uberphallus wrote:
| > kowtow is not a contemporary custom
|
| It's rare as a protocolary practice, but it's still part
| of the culture.
|
| https://www.bbc.com/news/world-asia-35553120
| mooseburger wrote:
| > What does it add? Why does it suggest itself?
|
| What do you think it does? No idea where you're going
| with this.
| acrobatsunfish wrote:
| "The point is obvious. There is more than one way to burn
| a book. And the world is full of people running about
| with lit matches. Every minority, be it
| Baptist/Unitarian, Irish/Italian/Octogenarian/Zen
| Buddhist, Zionist/Seventh-day Adventist, Women's
| Lib/Republican, Mattachine/Four Square Gospel feels it
| has the will, the right, the duty to douse the kerosene,
| light the fuse. Every dimwit editor who sees himself as
| the source of all dreary blanc-mange plain-porridge
| unleavened literature licks his guillotine and eyes the
| neck of any author who dares to speak above a whisper or
| write above a nursery rhyme."
|
| Ray Bradbury
| MeinBlutIstBlau wrote:
| Kowtow is not an offensive term, has been used by the
| Chinese for millennia, and in this context, happens to be
| about the Chinese.
|
| Quit being offended just for the sake of it.
| jonnycomputer wrote:
| TBH, it is you who seems to be offended. Not me.
| [deleted]
| MeinBlutIstBlau wrote:
| Says the person calling me a racist for not saying
| something racist...
| jonnycomputer wrote:
| I didn't call you racist.
| MeinBlutIstBlau wrote:
| > I didn't call you racist.
|
| But at the top of the thread:
|
| >and to my ears, it comes off as a bit racist
|
| If that's not calling me a racist then I don't know what
| is.
| [deleted]
| crocodiletears wrote:
| No, this is just an extension of their policy on covid
| misinformation.
| Lammy wrote:
| "Ey yo the government is lies, son / United States of Google;
| Verizon" https://www.youtube.com/watch?v=Ezpohf6-SuA
| neuronexmachina wrote:
| The full paragraph for reference:
|
| > Misleading content related to civic and democratic processes:
| Content that is demonstrably false and could significantly
| undermine participation or trust in civic or democratic
| processes. This includes information about public voting
| procedures, political candidate eligibility based on age /
| birthplace, election results, or census participation that
| contradicts official government records. It also includes
| incorrect claims that a political figure or government official
| has died, been involved in an accident, or is suffering from a
| sudden serious illness.
| protomyth wrote:
| Which government?
| josephcsible wrote:
| The way it's written, it sure sounds like _any_ government. I
| expect them to selectively enforce it only for governments
| they agree with, though.
| not2b wrote:
| Where legally required, they have to enforce it for a
| government they disagree with if they want to continue to
| do business in that country. Otherwise their employees
| could face arrest (that's happening in India, for example).
| kilovoltaire wrote:
| In my opinion you've removed very important context from this
| quote.
|
| As mentioned elsewhere, the full quote is very different:
|
| "information about public voting procedures, political
| candidate eligibility based on age / birthplace, election
| results, or census participation that contradicts official
| government records"
| throwitaway1235 wrote:
| I've seen multiple comments assigning Google's illiberal behavior
| to Chinese influence or pressure.
|
| This is patently false, Big Tech is beholden to Western dictate.
| There is currently a coordinated effort by non-partisan forces to
| push the United States into a war with China. Don't fall for it.
| omginternets wrote:
| Like many on HN, I rely on Google's services to a degree that I
| find worrying. The reason for my immobilism is the usual one:
| Google's services are extremely convenient, and switching to
| alternatives is extremely costly in time.
|
| Like many of you, I make it a point to investigate alternatives
| to G-Suite, Android, _etc_. Sadly, the answer to "is there a
| practical alternative to Google" has so far been a resounding
| "no".
|
| After thinking about this some more, I'm beginning to think the
| question is ill-posed. I'd like to ask a better question, and I'd
| really appreciate your input. But first, let me list a handful of
| observations about my behavior as a user and customer that I
| think are important.
|
| == OBSERVATIONS ==
|
| OBSERVATION 1: I don't mind paying for software, I just hate
| accounting.
|
| I don't often buy software, _especially_ subscription-based SaaS
| products. For the longest time, I thought my aversion had to do
| with frugality, but I now realize it has to do with peace of
| mind. I refuse to accept the burden of tracking my subscriptions,
| being responsible for cancelling them when no longer in use, and
| chasing down the odd unexpected charge on my bank statement, or
| trying to remember how much a subscription costs and whether or
| not to cancel before its automatic renewal. Life is too short.
| The mere thought of such a snake pit provokes a response
| approximating an anxious rage. My reaction is visceral; one
| cannot reason with me on this point.
|
| Subscriptions-management services don't solve the problem. They
| might remove accidental complexities that emerge from managing
| subscriptions -- visualizing pricing side-by-side, alerting me
| when a subscription is about to expire, performing the
| (un)subscription automatically, etc. -- but it is impossible for
| them to solve the essential complexity of subscription
| management: thinking about subscriptions.
|
| OBSERVATION 2: I am a software developer, but I am not an
| advanced user.
|
| I am technically literate, but most of the work I do in G-Suite
| is basic. I draft documents, often collaboratively. I use track-
| changes and comments. I use Times New Roman in 12-point font. I
| sometimes make basic slide decks, using the default theme and
| paying no regard to aesthetics beyond ensuring proper alignment.
| I rarely use spreadsheets, but when I do, I use the basic
| functions, and rudimentary formatting.
|
| Same goes with email. Helvetica, 11-pt. Attachments.
|
| OBSERVATION 3: Cloud storage is actually pretty great, but I
| mistrust you deeply.
|
| In a utopian world, Cloud storage and SaaS software would be
| perfect! I almost always have access to the internet when I need
| it, and during the rare times when I don't, it can wait an hour
| or two. In this utopia, I could take full advantage of Cloud/SaaS
| offerings, effectively freeing myself of the need to store,
| organize, back-up and garbage-collect local copies. It would be
| great!
|
| Only, here's the issue: I can't do that. I can't do it because
| SaaS platforms can't be trusted to have adequate disaster
| recovery in place, they spy on me, routinely hold my livelihood
| hostage [0], and might arbitrarily deplatform me.
|
| As such, I find myself in the ironic situation of having to
| _manually_ back up my Google documents by exporting them to RTF
| and storing them in a well-organized folder hierarchy for a rainy
| day. My life has actually gotten a bit _worse_, but not so bad
| that I'm willing to give up collaborative editing.
|
| == QUESTION ==
|
| Taken together, the above observations point to something
| approximating:
|
| 1. Self-hosted service. Or, at least, some kind of ability to
| continue working in the event that the Cloud-based mothership
| fails or banishes me. Such self-hosting should be prepackaged and
| easy to install/configure.
|
| 2. Non-subscription based monetization. More generally: non-
| recurring expenses.
|
| 3. Optional cloud-based storage. Perhaps just a dumb S3-style API
| into which data can be backed up?
|
| Does anything like this (or better!) exist?
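|
| For point 3, a minimal sketch of the kind of "dumb" backup I
| mean (the endpoint and bucket name here are made up; any
| S3-compatible provider should do), using Python's boto3:
|
|     import boto3
|
|     # point this at whatever S3-compatible storage you trust
|     s3 = boto3.client("s3", endpoint_url="https://storage.example.com")
|
|     # one-way backup: push a local export into the bucket
|     s3.upload_file("exports/notes.rtf", "my-backups", "notes.rtf")
|
|     # pull it back down if the mothership ever banishes me
|     s3.download_file("my-backups", "notes.rtf", "notes-restored.rtf")
|
| Bytes in, bytes out, no opinions about what the bytes say.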
|
| [0] I'm looking at you, Lastpass.
| tomohawk wrote:
| It's crazy how real life is almost impossible to satirize at this
| point. Consider this attempt:
|
| https://babylonbee.com/news/to-avoid-1st-amendment-concerns-...
| breck wrote:
| Going to plug our new subreddit: https://Reddit.com/r/ifa
|
| We need to pass an amendment abolishing copyright and patent
| laws.
|
| We need to make (old) Napster, torrents, Pirate Bay, Sci hub,
| legal, and decentralize the control of information. Otherwise
| it's 1984 baby.
| SpicyLemonZest wrote:
| The "census participation that contradicts official government
| records" clause is especially wild, given the US Census Bureau's
| open policy of injecting small errors for privacy purposes.
| (https://www.census.gov/about/policies/privacy/statistical_sa...)
| humanistbot wrote:
| Misleading title. Google Drive bans *sharing* "Misleading
| Content," which means they can disable sharing if a doc
| containing blatant misinformation starts to be widely shared on
| other platforms. (Edit: This doesn't mean they'll go into your
| drive and remove doubleplusungood content)
|
| Google Drive is now widely used as a social media platform -- I
| even get spam from google drive share notifications now! And so
| I'm sure they will stumble in all the ways that all the platforms
| do around content moderation.
| [deleted]
| temp8964 wrote:
| I updated the title, thanks.
| acmdas wrote:
| "In some cases, no amount of context will allow this content to
| remain on our platforms..." sounds exactly like they will "go
| into your drive and remove doubleplusungood content."
| finiteseries wrote:
| "After we are notified of a potential policy violation, we
| may review the content and take action, including restricting
| access to the content, _removing the content, and limiting or
| terminating a user's access to Google products._ "
|
| It's absolutely in the cards.
| [deleted]
| [deleted]
| jfoutz wrote:
| Oh no! I have an OKR sheet that might be in violation.
| axy wrote:
| So, any company that works on, for example, civic or democratic
| processes, may lose all their documents on google drive.
| CONGRATULATIONS!!! The next step would be your gmail box.
| Animats wrote:
| Does this apply to Google ads?
| listmaking wrote:
| Just to be clear, this is about the "Report Abuse" button. This
| page lists categories of things that, if someone clicks on
| "Report Abuse", Google may decide is indeed abuse, and take
| action. So this does not apply to private files, but to files for
| which someone with access to the file decided to complain. Note
| the page title, and that it says:
|
| > _After we are notified of a potential policy violation, we may
| review the content and take action..._
|
| There are a lot of comments here. Many may indeed hold the
| opinion that even when someone clicks on "Report Abuse" for a
| file being distributed via Google Drive on the grounds of
| "misleading content", then Google shouldn't take action like _"
| restricting access to the content, removing the content, and
| limiting or terminating a user's access to Google products"_.
| That's a valid position, and legitimately a criticism one can
| hold, and this discussion makes sense. But there are also quite a
| few comments imagining this to be about Google proactively
| scanning everyone's private files, so just making this clear.
| (The submitter probably understands the distinction as they put
| "distribution" in the title, but clearly, some comments do not.)
|
| (Disclaimer: I work at Google but just posting as myself; have no
| special information here but this is just my obvious reading of
| the page.)
| brailsafe wrote:
| Remember, Google can just outright ban your full account and not
| explain why.
| agentdrtran wrote:
| Good. this rule is pretty narrowly scoped and sorely needed on
| every platform.
| regnull wrote:
| People must be able to exchange information in a secure and
| private way, free of government or corporate censorship. The fact
| that there is no common, simple, and ubiquitous way to exchange
| private messages blows my mind. I wish someone would do something
| about it.
|
| Oh wait, I did! Here's my attempt to implement email based on
| Self-Sovereign Identity: https://github.com/regnull/ubikom and
| https://ubikom.cc
|
| Still work in progress, but it's fully functional. There are no
| accounts or domains - private key ownership is all we need.
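|
| To illustrate the "key ownership is identity" idea -- a generic
| sketch only, not ubikom's actual API or wire format:
|
|     from cryptography.hazmat.primitives.asymmetric.ed25519 import (
|         Ed25519PrivateKey,
|     )
|
|     # your identity is nothing more than a keypair you generate
|     identity = Ed25519PrivateKey.generate()
|     public_identity = identity.public_key()
|
|     # sending a message means signing it with your private key
|     message = b"hello, no account or domain required"
|     signature = identity.sign(message)
|
|     # anyone holding the public key can check it really came from
|     # you; this raises InvalidSignature if the message was altered
|     public_identity.verify(signature, message)
|
| No registration, no account recovery, no provider to ban you --
| whoever holds the private key _is_ the identity.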
| sysadm1n wrote:
| Surprised nobody has mentioned ProtonDrive[0] as a near future
| candidate for eliminating these types of incidents. Since the
| data is encrypted, ProtonDrive can't make arbitrary value
| judgements on your data. For me personally I can't wait for it to
| arrive so I can wean myself off Google properly. It's the one
| final thing that I need to replace Google with: Proper Encrypted
| Cloud Storage.
|
| [0] https://protonmail.com/blog/protondrive-security/
| DeathMetal3000 wrote:
| Sync.com already exists and is e2ee. Why people keep using non
| e2ee services and get outraged when their data is misused I
| can't understand.
| das_keyboard wrote:
| I don't have much experience with Google's policies, but is this
| some "report only" type of thing?
|
| That means it will only be enforced if someone reports a file I
| created, or should I expect some Google employees or AI to scan my
| files for the mentioned content?
| iandanforth wrote:
| As a government censor it is very pleasing to see commercial
| companies go above and beyond like this. In the old days I would
| have to threaten legal action before they instituted preemptive
| policies. Now all I have to do is wait for an internal group to
| raise an inclusion/exclusion/psychic violence issue and they'll
| grant themselves all sorts of broad policing authority. Saves me
| a whole lot of trouble I can tell you that much!
|
| </sarcasm>
| matheusmoreira wrote:
| > Circumvention
|
| > Publicly sharing apps suspended by Google Play Developer
| Policies.
|
| > Publicly sharing videos that do not comply with the YouTube
| Community Guidelines.
|
| Google bans people for this? Wow.
| dukeofdoom wrote:
| Just a friendly reminder that if you use "google" for all of your
| logins...then if "Google" bans you, you will lose access to all
| of your logins.
| semitones wrote:
| This is terrifying and is one of the main reasons I am working
| on de-googling my digital life.
| floatingatoll wrote:
| It took me over a year to extricate myself from Gmail, and I
| still keep the account open and forwarding because it hasn't
| been a year since the last time I found something I have to
| migrate. I'll be thrilled someday if I can drop it, but I
| imagine the vestigial leftover will just persist for all time.
| Gmail was a very convenient service, but ultimately I can't
| trust them to use reasonable human judgement in administrative
| decisions, and so I can't trust Google with the keys to my
| life.
| fraudz4us wrote:
| Everyone except fascists and fucktards from San Francisco knows
| the election was straight up stolen.
|
| Youtube even deleted the OFFICIAL Maricopa County hearing
| yesterday because it totally incriminated the shit out of
| everyone involved. This is on top of the insanity coming out of
| Georgia.
|
| The fraudulent election needs to be dealt with otherwise I fear
| the US will dive into a second civil war and we can't do this
| right now - not when China is sharing nuclear bombing videos of
| Japan on twitter and not when they are looking at invading Taiwan
| in the next few years.
|
| San Francisco - get your shit together - your city is the first
| city China will bomb.
| cblconfederate wrote:
| It's even in the name, it's google's drive, not my drive, why
| would i expect otherwise
| cronix wrote:
| I see Google got the DNC's memo. Gov't is asking private
| corporations to censor your SMS messages, because gov't itself
| can't. They want to censor every form of digital communication,
| period. We either have free thought and expression of ideas, or
| we all live under someone else's ideals of what those are, and
| most likely, if history has taught us anything....you won't like
| it but it will be too late. Your voice will just be gone, unless
| it's full of praise for whoever your "dear leader" ends up being.
| This isn't done. More will come. We'll hem and haw when we hear
| about it, but keep on keeping on...right off the cliff.
|
| Free speech is not easy, but it is an absolute necessity unless
| we want to revisit some of history's uglier chapters, and all I
| see is pedal to the metal.
|
| > And this campaign is far from a single-pronged strategy.
| According to Politico, the Democratic National Committee (DNC)
| and "Biden-allied groups" - whatever that last phrase means -
| have plans to "engage fact-checkers more aggressively" and "work
| with SMS carriers to dispel misinformation about vaccines that is
| sent over social media and text messages."
|
| https://www.eutimes.net/2021/07/us-says-it-will-censor-anti-...
| throwitaway1235 wrote:
| Nailed it. Government knows it would be difficult to implement
| vaccine passports (stateside anyways) so they have publicly and
| privately asked private business to check vaccination status,
| bar entry etc for them.
|
| Same goes for the Democrat Party asking Facebook to flag
| opposition comments regarding Covid. Government is shackled by
| the 1st amendment, corporations aren't.
| acmdas wrote:
| Yet another reason to avoid using cloud (i.e., someone else's
| computer that you don't control) products.
| tobinfekkes wrote:
| Well isn't this just rich (in a poor way), considering Google is
| notorious over the years for making the display of their ad
| placements harder and harder to differentiate from actual search
| results, thus more and more misleading....
|
| Can we ban that too?
|
| Give me a break.
| twodayrice wrote:
| Do you mean ads?
| homedrive wrote:
| Looking for an easy way to self-host your own Web drive at home?
| Try HomeDrive:
|
| https://www.homedrive.io
| temp8964 wrote:
| Related:
|
| @BreesAnna First time I've seen a cloud drive blocking a
| document...
|
| It was a very long document re vaccination headlines from around
| the world.
|
| https://twitter.com/BreesAnna/status/1384763150109716481
| rubyist5eva wrote:
| Absolutely outrageous.
| qwertox wrote:
| So I downloaded the PDF and took a look at it. It's 80 MB of
| screenshots and photos of newspaper articles of reports of
| people which died because of the vaccine (or at least shortly
| after having gotten it).
|
| I wouldn't consider it misinformation, but it is misleading.
|
| There can be several causes of serious side effects, most of
| which I believe to be either contamination, a pre-existing
| condition which got its last kick due to a component of the
| vaccine, or something like this.
|
| In any case, the document notes that around 2500 people have
| died, and according to what I checked in the database of the
| CDC, now around 4500 have died. This includes those who
| accidentally fell down the stairs after tripping over their cat,
| but who got their vaccine a couple of hours before, but that
| doesn't matter.
|
| So from 34,000,000 Covid-19 cases in the US 608,500 died =
| 1.79%
|
| From 334,927,961 administered doses in the US 4,434 died =
| 0.0013%
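|
| (If you want to reproduce the arithmetic, in Python:)
|
|     cases, case_deaths = 34_000_000, 608_500
|     doses, dose_deaths = 334_927_961, 4_434
|
|     print(case_deaths / cases)   # ~0.0179    -> 1.79%
|     print(dose_deaths / doses)   # ~0.0000132 -> 0.0013%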
|
| If you are vaccinated you're protected by at least 40% (used to
| be 90+) against getting infected and if you do, it will be more
| likely that you won't have to go to the hospital.
|
| There are a lot of people who tend to ignore this comparison,
| probably because it makes them feel more important to know
| about these deaths and we, the sheep, just get silently
| vaccinated, because they tell us to do so.
|
| The point is, that it has benefits, and also some minor risks,
| to get vaccinated, yet the article attempts to make you believe
| the contrary, that the risks are too high. So yes, it is
| misleading, but in a somewhat dangerous way. There are people
| who are easily influenced by this kind of narrative.
|
| Now, the question is still open for debate: should Google be
| allowed to decide if content like this can't be shared over
| their platform?
| fraudz4us wrote:
| Sundar needs to step the fuck down.
| ncphil wrote:
| Wow. Just wow.
| mberning wrote:
| A Russian troll farm buys some election ads on facebook and it's
| a national security crisis with years of investigations and
| hearings. Google decides to play god with the flow of information
| and I highly doubt they receive anything more than the slightest
| pushback from our political establishment. Our elected officials
| are asleep at the wheel.
| coldtea wrote:
| In case you were wondering why countries that don't feel like
| being pets to the US like China have their own Googles,
| Twitters, and everything.
|
| It's not just for "government control". They'd do the same even
| if they were perfect democracies.
|
| It's first and foremost for avoiding such blatant foreign
| control.
| avivo wrote:
| Google can do a _much_ better job of making clear what is out of
| scope. When I first saw this headline I was also surprised and
| outraged.
|
| Looking more closely, this is just about _distribution_,
| probably in terms of "content hosting". It doesn't target
| individuals or families storing whatever they want for
| themselves. This made more sense in that context.
|
| For example, if I create a fake video urging people to vote
| illegally, or at the wrong time, and I am sharing it through
| Google Drive with many many people seeing it, Google wants a
| policy to prevent that sharing.
|
| Otherwise either its hands are tied or it's just doing arbitrary
| things. Which is far more authoritarian.
|
| If a document is shared and accessed by thousands of people, it
| makes plausible sense that Google might not want to essentially
| be a hosting service if that content is leading to real-world
| harm.
|
| ...but this has not been made explicit enough for such a
| sensitive issue, with real speech and free expression concerns.
| (and there are real concerns as always about who decides what is
| misinformation)
| WalterBright wrote:
| > if that content is leading to real-world harm
|
| The truth can also lead to real-world harm.
| wolverine876 wrote:
| We can't make any distinction at all?
| [deleted]
| sneak wrote:
| Google turns over user data on any Google user instantly to the
| US government without a search warrant, including that of US
| citizens. Anything in Google is free for the taking by the feds.
|
| There are already sufficient reasons to stop trusting them with
| your data. This is not the first and it won't be the last.
|
| Google delenda est.
| TameAntelope wrote:
| The problem we _actually_ need to solve is how to divorce the
| concept of free speech from private companies. For some reason,
| it's almost ubiquitously believed that the "Internet" is
| actually just Twitter, Google, Facebook.
|
| How can we, the tech literate, start to explain to our
| friends/family and eventually the world, that the Internet is
| actually an interconnected network of computers that no one
| person or entity controls?
|
| Honestly, I fear that even among us, we have large groups of
| uninformed people who believe companies like Google and Facebook
| _are_ more than just leaf nodes on the graph, and that worries
| me. If knowledgeable people can be this wrong, how can we expect
| the uninformed to get this right?
|
| I worry about the future of tech, if the discourse on HN is any
| indication of how the greater tech community feels. The idea that
| Google is an integral part of using the Internet to express
| oneself is not only completely wrong, but actively harmful, and I
| have no clue what to do about it.
|
| "The Net interprets censorship as damage and routes around it".
| John Gilmore's statement has seemingly gone out of vogue, but it
| remains true. Google _cannot_ stop information (good and bad)
| from spreading, so why do people think they can, and how do we
| disabuse Americans (and the world) from the notion that Google
| has that power?
| xerxesaa wrote:
| Even if you are tech literate, how do you easily find an
| alternative to Facebook and Twitter?
|
| The entire benefit of these platforms derives from the fact
| that they've accumulated so many users. People use these
| platforms because everyone else they know is on it.
|
| I personally haven't used either for several years but I've had
| to "sacrifice" my online social presence and miss out on
| updates from my circle as a result. I put "sacrifice" in quotes
| because I think the net result is positive and though I miss
| having some updates, I overall feel quality of life is better
| without a public online presence.
| TameAntelope wrote:
| As you've mentioned, the notion that you _need_ an
| alternative to live a healthy, happy life is untrue.
|
| How do we explain this to the world?
| pcdoodle wrote:
| I can't believe we give a company such power in exchange for so
| little.
| Spivak wrote:
| The power to arbitrarily take down content on their own
| servers? Like I'm with you that it's an overreach but this
| really isn't that much power in the grand scheme of things.
| Don't publish on Google Drive and they can't touch you. Very
| very little pushes you to publish on Drive. Unlike social media,
| where the lock-in is the ability to find an audience, Drive is
| a glorified S3 bucket for all it matters.
| Trias11 wrote:
| Google is not exactly a fairy godmother.
|
| They're a very commercial entity whose management has very
| political biases.
|
| Anyone can walk away from Google at any time. No one is forcing
| anyone to use any of Google's services.
|
| Every time I use Google I assume there are thousands of eyes
| going through my docs. I accept this as a tradeoff for quality
| services.
|
| If I don't accept it, I walk away.
| Andrew_nenakhov wrote:
| Almost everybody here cheered when Twitter and Facebook started
| censoring and 'fact-checking' Trump. The few voices who said
| this would end badly were drowned out by the rejoicing crowd,
| which wasn't thinking about the consequences.
|
| News such as this is a direct consequence of allowing big tech to
| become arbiters of 'truth'. And yes, it will be much worse than
| this.
| jonnycomputer wrote:
| What. The. Hell.
|
| I suppose limiting sharing _might_ be justified. But limiting
| access and/or deleting one's own files is outrageous.
|
| For that matter, collections of misleading content can have
| legitimate purposes. Such as research.
|
| Now I'm seriously considering dropping my Drive subscription.
|
| Update: I got a lot of upvotes, but tbh I may have misunderstood
| the new policy, which does seem to (maybe?) only be limited to
| distribution of misleading content. I do think Google has a
| legitimate interest in regulating use of its services as a means
| of distributing information. I'm not sure where the line is,
| though. For example, if I sent an email to a friend in which I
| said something that isn't true, I think I would rightly be upset
| if Google refused to deliver the email. OTOH, if I was sending
| this to large numbers of people regularly, as part of some kind
| of misinformation operation, then maybe blocking me would be
| legitimate. Complicated.
| commandlinefan wrote:
| > I do think Google has a legitimate interest in regulating use
| of its services as a means of distributing information
|
| If so... do you think that your ISP also has a legitimate
| interest in regulating use of its services as a means of
| distributing information?
| Craighead wrote:
| Of course... Have you not heard of copyright notices leading
| to termination of services?
| jonnycomputer wrote:
| I think ISPs should be public utilities, so, no.
| mshanowitz wrote:
| Perhaps the basic functions of the internet that Google
| provides like mail and basic document storage should be
| treated as public utilities as well
| ixacto wrote:
| Does this apply to Google Workspace (paying) customers too?
|
| If so, Google is actually scanning what is supposed to be
| private information.
| m-p-3 wrote:
| If it's not end-to-end encrypted, it's not private from
| Google.
|
| Google can and will scan anything you feed it.
| ixacto wrote:
| I'm tempted to make a bunch of honeypot accounts just to
| see how Orwellian and/or gameable their system is.
|
| What will trigger it? Fan pages to Trump/Qanon? Antivaxxer
| propaganda? The German federal police paper on how to
| clandestinely manufacture heroin? Or maybe just a 20mb text
| file with the "Wan " character?!
| freedomben wrote:
| If you do they will figure out your real identity and
| will terminate your real account without warning. I
| highly suggest you move _everything_ out before that.
| ixacto wrote:
| That is...true. Everything google account backed is now
| wiped from my iPhone. Just got to download the google
| drive stuff next.
| stingraycharles wrote:
| I looked closely but I cannot find anything that says that
| paying customers are excluded. So I would assume it is
| included.
|
| It makes sense, though, to enforce one single policy:
| otherwise, just becoming a paying customer would be a
| loophole to circumvent phishing etc policies.
|
| I guess the crux here is when, exactly, they start scanning
| the content. Is that perhaps the moment the content is first
| shared?
|
| It doesn't seem to be documented.
| read_if_gay_ wrote:
| > If so google is actually scanning what is supposed to be
| private information.
|
| Am I just too cynical or are you naive for thinking they
| aren't datamining absolutely anything they can get their
| hands on?
| DaniloDias wrote:
| Logs are only useful if they are monitored.
|
| This individual seems to look at events that don't align
| with their prejudices and assume that the signal is an
| aberration.
|
| Nihilists get burned. You have to hold people accountable
| for the consequences of their actions rather than their
| intentions.
| kevincox wrote:
| Their policies for Workspace accounts have tight
| restrictions on what they are allowed to use your data for.
| So they shouldn't be "datamining absolutely anything that
| they can get their hands on".
| jfengel wrote:
| Many of the components of the abuse page talk about
| "distribution", but the one about hate speech is simply "Do not
| engage in hate speech".
|
| The way I read that, they could conceivably boot you just for
| having a private diary of racist rants. I don't know if they
| intend it that way; it's also possible that they'd interpret
| "speech" as meaning "speech where somebody else can hear you".
|
| But if you were looking for more reason to drop your Drive
| subscription, that might be it.
| BitwiseFool wrote:
| The Terms of Service are written broadly and that's on
| purpose so that Google can take the liberty of interpreting
| things however they please. Ultimately, it comes down to the
| culture of the team that deals with content scanning and
| take-down requests. I'm willing to bet money that it is
| staffed by Progressive Bay-Area folks and the implications
| should be self evident.
| polynomial wrote:
| Even more problematic is the difficulty current algorithms
| have in distinguishing "engaging" in a certain class of
| speech acts from studying said class. This requires a
| subtlety of parsing humans are (sometimes) capable of, but
| machines still struggle with.
| Uhhrrr wrote:
| Also, what if the content wasn't racist when you wrote it?
| kgrimes2 wrote:
| Chiming in on your update. It's my understanding as well that
| it's only related to distribution of content. There's nothing
| keeping you from storing misinformation on your personal Drive,
| but if you're linking to it from Facebook and using it to, say,
| sway a political election, cutting off others' access to it
| makes some sense.
| 2OEH8eoCRo0 wrote:
| The headline is misleading. This makes it sound like anything
| that's misleading is banned, which is false. They are banning
| very specific things that are verifiably false, like spreading
| fake election rules, spreading false election dates, etc.
|
| These things have happened:
|
| >The Trump campaign has sent Facebook advertisements to tens of
| thousands of voters in swing states, erroneously telling them
| it was election day after the social media group's ad blockers
| failed to detect messages that violated its rules on
| misinformation
|
| Is the election date a matter of opinion? I don't think that
| lying about which day is election day is protected speech; it's
| malicious to the very foundation of Democracy.
| dfdz wrote:
| A recent high profile example is the lab leak "conspiracy
| theory". Facebook block discussion of the possibility that
| Covid-19 started from a lab leak, but now scientists are
| seriously considering the issue.
|
| Imagine if Google had this policy last year, and a scientist
| posted a Google Slides presentation about the merits of the lab
| theory (and imagine it was banned under the misleading "health"
| information umbrella) ....
|
| Disclaimer: I use Dropbox (mostly because they have supported
| Linux since I started using the service, and at the time the
| syncing was much better than Google Drive's). But now I have
| another reason!
| rtkwe wrote:
| I remember that the original theory going round, the one that
| was getting removed, was that it was a lab leak of a bioweapon,
| not the current theory that it was more conventional 'gain of
| function' research that got out. Those are two very different
| accusations.
| hackererror404 wrote:
| If we are honest, however, these are theories, however
| "unfounded" they might be...
|
| Science is a continual pursuit of what is accurate, and even
| when we have scientific proof, it's not always permanent.
|
| It was a theory that this came from a lab. It was perhaps
| another theory that the lab was working on this perhaps in
| part as a bioweapon. We can get into the politics of why
| China in general can't really be trusted and so who the
| heck knows... and that of course leads to skepticism and
| theories such as these.
|
| These theories may or may not be true the same way
| "covid-19" came from a wet market may or may not be true.
| It's okay to say we don't know, here are some theories...
| That's science.
|
| We should try to get to the bottom of it so that whatever
| happened is exposed and we can learn from this.
|
| A third party getting in the way and deciding based on who
| knows what, what theories are okay to talk about and what
| theories are not okay to talk about is a horrible idea.
|
| Perhaps Google is working on a multi-billion dollar deal
| with China to put Google in all of the phones made in
| China? You better believe that if China told Google, "Hey
| this theory is false (trust us), you better take down
| anything that talks about it"... Google would be very
| pressured to comply.
| brippalcharrid wrote:
| How could we tell the difference without knowing the
| motivations of the people responsible for it? Has their
| conduct been likely to inspire confidence that they were on
| the up-and-up? Gain-of-function research is used for
| offensive as well as defensive purposes; it is inherently
| dual-use, and it can be hard to draw a line between the
| two. We aren't in a position to be able to exclude the
| possibility that this research was being carried out with
| the intention of being transferred to pure weapons
| development once suitable candidate pathogens had been
| developed.
| soheil wrote:
| I'm not defending Google's policy here. But imagine for every
| lab leak that Google falsely blocked, it also blocked many
| dangerous conspiracy theories before they got out of hand. I
| think it boils down to numbers, and I think Google is probably
| in the best position to have the most accurate numbers; the
| question is whether they are also making decisions that are
| consistent with those numbers.
| underwater wrote:
| Maybe it should be the scientists and journalists exploring
| that possibility, not random Facebook conspiracy nuts? Once
| a conspiracy is proven true, then FB can let it ride.
| jdasdf wrote:
| Science isn't a god with priests telling you what it says.
| It's a process that anyone can and should do.
| [deleted]
| chitowneats wrote:
| First of all it isn't "proven true". It's just not proven
| false. Which has been the case the entire time.
|
| One of the reasons it was assumed false was that the
| scientific community bought into the original letter from
| Peter Daszak. It turns out Daszak has conflicts of interest
| regarding WIV and GoF research that, in theory, could have
| caused the pandemic.
|
| It is incredibly naive to allow people to be the
| gatekeepers of information that is inconvenient for them.
| matheusmoreira wrote:
| Scientists? Sure. Journalists? No. Banning discussion but
| allowing the mass media free rein means people are being
| fed propaganda.
| zarkov99 wrote:
| Something similar is happening now with the Ivermectin
| discussion, which to some sounds as crazy today as the lab
| leak did to some a few months ago. So we have learned nothing
| from the lab leak debacle.
| 2OEH8eoCRo0 wrote:
| Lab leak theory has always been taken seriously but it was
| also largely hijacked by racists at the time who were much
| louder than the scientists. If memory serves.
| Latty wrote:
| There is a distinction between discussing a theory, and
| presenting it as the truth with no evidence.
|
| Not a distinction I imagine automated moderation systems are
| going to manage well, of course, but it exists.
| seanclayton wrote:
| Google will gladly distribute public PDFs that say the
| elimination of Jews "must necessarily be a bloody process,"
| presented as truth with no evidence.
|
| Why do they get to be the arbiter of what information
| deserves presentation as truth with no evidence, when they
| allow Mein Kampf to be presented as such?
|
| This is all assuming that setting a file to public in
| Google Drive is considered "presenting it as the truth with
| no evidence."
| Latty wrote:
| I agree with this to some extent: how direct should the
| harm be before you stop distributing it?
|
| I don't think there is an easy, obvious line there--
| direct calls to harm (murder these people) shouldn't be
| accepted, and I believe even beyond that there should be
| limits, but finding that line and enforcing it is very
| hard.
| golemiprague wrote:
| You are downvoted but this is exactly what is happening in
| the Arabic sphere of discussion and nobody really cares
| because Google, like most of the discussion here, is very
| America centric. This is not even about truth but rather
| about calls for murder which are against the law in many
| countries.
|
| In some way though, I am happy that this "diversity of
| thought" exists because that's the only thing that can
| stop companies like Google from becoming the arbiter of
| truth. It is just sad that a lot of those countries are
| also ok with calls for murder and violence.
| pageandrew wrote:
| This distinction never mattered. Any discussion of the
| theory besides refuting it as "debunked" was banned on
| Facebook.
| Alex3917 wrote:
| > I suppose limiting sharing might be justified. But limiting
| access and/or deleting own files is outrageous.
|
| It's definitely not just a hypothetical risk either. In the
| last month, Facebook blocked me from distributing my blog post
| on Django best practices because they claim it violates their
| community standards. It's literally a post about how to
| structure API code, but now they're claiming that it's
| advocating for genocide or something and there's apparently no
| one to appeal to or any way to reverse this.
|
| For reference:
| https://alexkrupp.typepad.com/sensemaking/2021/06/django-for...
|
| https://developers.facebook.com/tools/debug/?q=https%3A%2F%2...
| jonnycomputer wrote:
| That's crazy.
|
| Twice in the last week a comment of mine got blocked on
| Facebook for violating policies. In the first case, I posted
| a link to an EPA website and a screenshot of a graph posted
| on it, showing a reduction in smog in the last 10 years
| compared to 1987. In the second, I pasted a screenshot of
| something from the CDC's COVID Tracking site and a link to a
| British research paper on the efficacy of the vaccines wrt
| the delta variant. In the second case, my link was broken
| somehow, which might have contributed to the problem.
|
| They never went back up.
| [deleted]
| bootaccount wrote:
| why consider it when you can do it? Behavior can only change
| when there's a massive drop in MAU and revenue, then the
| product team will consider backtracking
| jonnycomputer wrote:
| Because it would be a major inconvenience for me, that's why.
|
| Update:
|
| I'm getting a lot of criticism here.
|
| If your aim is to try to get me to actually make the switch,
| then the approach you are taking is counter-productive.
| Instead, I guess you are doing what most people do when faced
| with something they disapprove of: they do the thing that is
| easiest and most satisfying, which is to criticize or punish
| the other person, and to pat themselves on the back for being
| better.
|
| But criticizing people just isn't an effective way to change
| people's minds or behavior.
|
| What would work in this case? Showing them a path forward.
|
| In this case, I presented a real problem, that it would be a
| major inconvenience to switch. That is the truth. I don't
| live in some idealized fantasy world, where I can take
| actions for free.
| seneca wrote:
| Bang on. I think one of the best things we can do as
| engineers is help build alternatives to these authoritarian
| companies' tools, and to strive to make the on ramp as easy
| as possible. As it stands right now, it is a lot of effort
| to use Libre tools when compared to the poison pill
| corporate ones.
|
| The second best thing we can do is support projects and be
| vigilant against the creep of corporate interests into
| FOSS.
| bootaccount wrote:
| ah, so you're just talking the talk, without preparing to
| walk the walk... things don't improve because you just
| wrote a comment on HN, it has to materially impact the
| business for them to notice the disagreement with the
| policy.
|
| Most people are like you though, too lazy to do anything
| and they'll subject themselves to whatever big G says is
| good for them.
| ASalazarMX wrote:
| Beware of extremist/polarizing arguments like this one,
| they're mostly manipulative.
|
| You can voice your disagreement, and still use the
| platform as long as they don't cross the line. That
| doesn't mean you haven't prepared a plan for degoogling
| your life and accepted the inconveniences.
| Zachsa999 wrote:
| Yes, well said.
|
| Please stand by while I puke and shit and regret opening
| the comments section of an article discussing whether the
| first amendment is still valid.
| nazgulsenpai wrote:
| I dropped Google Drive late last year and it wasn't too
| bad. Set up another service, made a directory junction from
| the GoogleDrive folder to the new folder, profit.
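|
| On Windows that junction step is a one-liner under the hood; a
| minimal sketch in Python (the paths and target service are purely
| illustrative, and the old Google Drive folder has to be moved out
| of the way first):
|
|   import subprocess
|
|   link = r"C:\Users\me\Google Drive"      # junction to create
|   target = r"C:\Users\me\NewServiceSync"  # new service's folder
|
|   # mklink is a cmd built-in, so it has to run through cmd /c.
|   subprocess.run(["cmd", "/c", "mklink", "/J", link, target],
|                  check=True)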
| kratom_sandwich wrote:
| Refreshingly honest
| errantmind wrote:
| If you care about privacy and freedom of expression, you
| have to live it, not just complain on HN.
| [deleted]
| ping_pong wrote:
| So now, a minimum wage drone at Google, or even worse an AI, will
| be able to shut down my entire account because they perceived my
| document that I'm sharing as "misinformation"? That's pretty
| fucking scary.
| Spivak wrote:
| I mean they already could immediately shut down your account
| for suspicious activity with no recourse so really this is less
| mild than that.
| c0brac0bra wrote:
| Quis custodiet ipsos custodes?
| fareesh wrote:
| My guess is that a lot of this is driven by the climate in the
| USA, so with that in mind I have some examples that I am curious
| about, as an outside observer.
|
| > It also includes incorrect claims that a political figure or
| government official has died, been involved in an accident, or is
| suffering from a sudden serious illness.
|
| 1) What is the ruling if I store and share a CNN video of a Don
| Lemon show in which he and a mental health professional discuss
| her diagnosis that a political figure, whom she has never met, is
| mentally ill?
|
| 2) Same thing but on Fox News and different political figure
|
| 3) Same thing but on Infowars but about Hillary Clinton
|
| > Content that is demonstrably false and could significantly
| undermine participation or trust in civic or democratic
| processes.
|
| and
|
| > This includes information about public voting procedures,
| political candidate eligibility based on age / birthplace,
| election results,
|
| 1) Claims in the media and by political figures that the
| gubernatorial election in the state of Georgia was fraudulent
|
| 2) Claims that a Presidential election was "hacked" by a foreign
| state
|
| 3) Claims that a Presidential election was fraudulent
|
| > Manipulated media: Media that has been technically manipulated
| or doctored in a way that misleads users and may pose a serious
| risk of egregious harm.
|
| Again here, dealing in pure objective truth versus media spin and
| misleading framing:
|
| 1) Out-of-context edits suggesting that political figure called
| neo-nazis "very fine people"
|
| 2) A video, absent context, of a fictional reading of a phone
| call transcript, during an impeachment
|
| 3) A statement by a political figure that the COVID-19 virus may
| have leaked from a laboratory
|
| I will be very surprised if the enforcement does not fall on
| partisan political lines.
| a254613e wrote:
| I, for one, am happy about this decision.
|
| If there's one thing that's become way more obvious to me over
| the past year and a half, it's how susceptible a lot of people
| are to conspiracy theories and other similar content.
|
| In an ideal world we would have different solutions, however we
| don't live in a perfect world so I welcome decisions like this.
| Yes it does open doors for more government control and
| censorship, but I'd take that if it means reducing conspiracy
| theories, racism, etc.
| kderbyma wrote:
| so when the government starts to promote racism as policy you
| will obviously be in favour of it.
| a254613e wrote:
| No I will not.
|
| Life isn't a computer program that you can define with
| EXTREMELY specific if statements, like HN likes to believe.
|
| So I'm happy with statements such as "Content that is
| demonstrably false" from google. And I can be in favour of
| supporting more regulation, but also against regulation
| that's demonstrably harmful even if I can't write you an
| extremely specific definition of what "harmful" or
| "demonstrably false" is that would cover every single
| imaginable edge case.
| ceilingcorner wrote:
| This seems like a move to lessen upcoming monopoly legislation by
| preemptively banning stuff that the government is about to deem
| "misleading" or "dangerous."
| thinkingemote wrote:
| Agree totally. Both parties in the USA have held public hearings
| with the large tech companies. They were effectively given the
| ultimatum: censor or be broken up. There was a fair bit of
| overlap, but during the questions the Democrats were pro
| removal of fake news (this was during Trump's term) and the
| Republicans were anti non-competition, anti-monopoly.
|
| I doubt that going from platform to publisher is a wise move,
| but it's probably the only one they could make.
|
| As an aside, during the hearings it was Amazon that seemed less
| concerned...
| briffid wrote:
| Of course everything is in the interest of the public, so
| everything is going to be transparent, accountable, appealable,
| based on clear rules. As usual for Google.
| tpolzer wrote:
| The doc in question (https://archive.org/details/frontline-
| workers-testimonies-va...) contains anecdotes mixed with blatant
| lies about vaccine side effects.
|
| I totally agree that moderation can be overreaching, but this is
| not one of those cases.
|
| (Edit: context from the submitter here:
| https://news.ycombinator.com/item?id=27858045)
| jonnycomputer wrote:
| Drive is a storage service. I don't expect Google to moderate
| _any_ of my content, as long as it is legal.
| tpolzer wrote:
| This is about _distribution_.
| fraudz4us wrote:
| No. It's about fascism and pushing a political agenda that
| has destroyed millions of people's jobs, livelihoods, out-
| right MURDERED people all to take out orange man.
|
| Stop being a fucking tool.
| inglor_cz wrote:
| Imagine the same policy with GMail. Captive audience,
| because changing your address after 10 years is
| complicated. And suddenly Google decides what ideas you are
| allowed to discuss over e-mail with other people. Because
| that is distribution, m'kay?
| slumdev wrote:
| Ridiculous distinction.
|
| I can own content. I can print it or burn it to a DVD and
| give it to a friend. But I can't just send him a link?
|
| Ridiculous.
| tpolzer wrote:
| You can't host anything you want on somebody else's
| public file hosting service, surprise.
| syshum wrote:
| I just wish they would admit they do not support free
| speech, or American Values... if they did that I would
| not have much of an issue with it
|
| Their continued false advertising of "All users are
| welcome" and "We support free speech" is the biggest
| problem.
| slumdev wrote:
| This move is cynical, in bad faith, and contrary to
| everything that makes the Internet useful.
|
| And if they're trying to lose CDA 230, this is a great
| step toward convincing lawmakers who might have been on
| the fence.
| syshum wrote:
| Drive is a storage service. I don't expect Google to
| moderate any of my content, as long as it is legal.
| eertami wrote:
| They never said they would. You can store all the
| misleading content you like in your Google Drive. Using
| Google drive as a distribution platform to share
| misleading content is what they are moderating.
|
| Claiming otherwise is completely disingenuous, as it
| intentionally obscures core features of the service. It
| would be like saying "Twitter is an instant messaging
| service, I don't expect them to moderate any of my
| content" after a public tweet gets deleted.
| fraudz4us wrote:
| Fuck you.
| pwned1 wrote:
| What if I write a paper disputing incorrect information,
| and cite the incorrect information's source? Now I am
| censored.
| qwertox wrote:
| It helps to read the ToS:
|
| > Misleading content may be allowed in an educational,
| documentary, scientific, or artistic context, but please
| be mindful to provide enough information to help people
| understand this context. In some cases, no amount of
| context will allow this content to remain on our
| platforms.
| jonnycomputer wrote:
| I can't edit my comment anymore, but I think you are
| correct.
| drstewart wrote:
| I take it you disagree with any E2E encrypted messaging
| services, on principle?
| it wrote:
| I'm switching from Gmail to Protonmail so I don't have to worry
| if Google thinks my messages or those of my friends are
| misinformation.
| overgard wrote:
| This is an insane policy. The only way to know what's true vs
| false is by allowing free expression, not suppressing it.
| [deleted]
| Pick-A-Hill2019 wrote:
| Ahh Hell No!
|
| I mean - Who gets to define 'misleading and/or confusing'?
| Google? A court case?
|
| If they (Google) want to impose restrictions such as "Do not
| distribute content that deceives, misleads, or confuses users" -
| Might I suggest that they apply those very same standards to
| their own behavior.
|
| * But who can watch the watchmen? *
|
| I am grateful that I have a symmetric ftth connection.
| Information (correct or incorrect) needs to be free - as in Free
| to be expressed; Free to be ridiculed; Free to be disseminated;
| Free to be discussed; Free to be exposed to the light of day.
|
| It is for the people themselves to decide what they do or do not
| believe.
|
| Do I want Google to decide what is or is not misleading or
| confusing? Eh??? Say Whhaaatttt!!! Ever read a contracts terms
| and conditions? They can be as confusing AF... so uhhmm Yep -
| Let's Ban 'em! Woot!
|
| Google, Eat Your Own Dogfood.
| shadowgovt wrote:
| I think Google would agree with you. If you're going to share
| information that (in Google's perception) is wildly misleading,
| they'd prefer you do it off a domain that doesn't have "Google"
| in its path. And all of us are free to do that.
| Pick-A-Hill2019 wrote:
| That is a very valid point and I'm sorry you are seeing
| downvotes on your comment (so have an upvote from me).
|
| The "My Server, My Rules" point of view is absolutely 100%
| valid (and also why I commented about symmetric ftth).
|
| But, What I am also saying is this -
|
| At what point did the Google that was the 'Do No Evil'
| version pivot in to a position such as they are currently
| taking?
|
| When did the ethos of 'Let's do good things for good reasons
| for the good of humanity' pivot to what is potentially a
| Section 230 nightmare?
|
| To repeat my question - Who judges (and adjudicates) what is
| or is not misleading or confusing?
| shadowgovt wrote:
| I think Google (and the rest of FAANG) is wrestling with
| the uncomfortable possibility that they are pawns in several
| nation-states' disinfo campaigns and they suspect their
| previous lack of intervention and professional having-no-
| opinion on questions of fact made everything worse.
|
| It's possible that "Don't be evil" means "exercise control
| over the things you create." Frankenstein's monster wasn't
| created evil... It learned cruelty after its creator
| abdicated responsibility for it and it was exposed nakedly
| to a cruel world.
| floren wrote:
| > I am grateful that I have a symmetric ftth connection.
| Information (correct or incorrect) needs to be free - as in
| Free to be expressed; Free to be ridiculed; Free to be
| disseminated; Free to be discussed; Free to be exposed to the
| light of day.
|
| You have a symmetric FTTH connection for now. Host something
| somebody considers too offensive and they'll get your
| connection pulled. We don't hear about this because essentially
| nobody hosts their own stuff at home, but all it would take is
| identifying the ASN of your IP address and making a
| sufficiently loud noise on Twitter.
| Pick-A-Hill2019 wrote:
| You are absolutely right. If I ran un-encrypted file sharing
| or torrented something that matched a hash somewhere in some
| system, yes, absolutely my connection would be shut-off.
| While I could debate the rights and wrongs of that I take the
| coward's approach and just shuffle random bits of bytes
| backwards and forwards.
| s3r3nity wrote:
| > Who gets to define 'misleading and/or confusing'? Google? A
| court case?
|
| They claim that anything "misleading" "includes information
| [...] that contradicts official government records."
|
| This is a wildly Orwellian way to dictate "truth" : it's
| whatever the government body says is true. (1984 _literally_
| has a "Ministry of Truth")
|
| Authorities can be wrong, and can _themselves_ be incentivized
| to mislead. Why make them the arbiter instead of relying on
| individual responsibility?
|
| Helping people understand, weigh, and index information and
| sources is an important problem - but solving it in this way is
| _absolutely not_ the right way to do it.
|
| EDIT: spelling & grammar
| krautsourced wrote:
| The more any of these companies start to filter not-illegal
| content, the less it becomes justified to keep them exempt from
| liability for the things they _do_ allow. Because at some point we
| are reaching that 'have your cake and eat it' point.
| generalizations wrote:
| How long until this applies to Gmail?
| ncal wrote:
| "Every record has been destroyed or falsified, every book
| rewritten, every picture has been repainted, every statue and
| street and building has been renamed, every date has been altered.
| And the process is continuing day by day and minute by minute.
| History has stopped. Nothing exists except an endless present in
| which the Party is always right." -George Orwell, 1984
| zenron wrote:
| If there's one thing the history of authoritarianism has taught
| us, it's that the invasion of privacy will not be contained.
| Invasion of Privacy breaks free, it expands to new territories,
| and crashes through barriers painfully, maybe even dangerously,
| but, uh, well, there it is. ...
|
| Authoritarianism will find a way.
|
| - Ian Malcolm after they raided his house for publishing the
| memoirs of his stay at Jurassic Park to Google Docs
| zarkov99 wrote:
| Man, we are completely unprepared for the level of control it is
| now possible to have over speech. I guess I get where Google is
| coming from, maybe, but this is getting downright Orwellian.
| LatteLazy wrote:
| Google drive reserves the right to remove content it _deems_
| misleading, _when it wants to_ , or not.
|
| So nothing has actually changed.
| jollybean wrote:
| This is definitely a bridge too far.
|
| YouTube is a public facing media platform, they can do as they
| please.
|
| But dipping their toes into private content is Orwellian.
|
| This has to stop.
| chrisstanchak wrote:
| Crypto, get off the bench. You're in.
| [deleted]
| _Algernon_ wrote:
| Putting the megacorps that have manipulating people on behalf of
| others as their primary business model in charge of determining
| truth. Cyberpunk dystopia, here we come (except for the cool
| gadgets).
| pensivebeard wrote:
| I'm not quite sure how Google's announcement that Drive is now
| subject to censorship comes as a shock to anyone. Google already
| heavily censors their search results and I am concerned that no
| one is talking about it in relation to this. As bad as the
| violation of privacy and dictatorial intentions that now apply to
| Drive are, I believe that search result augmentation is far
| worse. In absolute terms, now that "to Google" is synonymous with
| every possible form of "to search", "to know", or "to
| discover" it as an organization is actively crushing independent
| thinking. Throughout many conversations with younger folks and
| even hapless adults, I have discovered the prevailing attitude
| toward the act of thinking to have diverged outside of the notion
| of self. When something is to be discovered, it is to be searched
| for. There is no critical thinking involved in this process. Due
| to the immediacy of seemingly correct or cogent information,
| people have ceased relying on any other metric in the evaluation
| of information besides a measurement of consensus. The problem is
| that they directly associate this measurement with the rank of a
| result in a search engine. The thinking goes that because it is
| ranked towards the top it must be more true.
|
| But what happens when truthiness is only measured through
| consensus and commonness? The most dire and direct consequence is
| that the rate of convergence toward true diversity of thought is
| absolutely flattened. This artifact is one of the main causes in
| the existential division in the U.S. The real crisis is that the
| ranking of the truthiness of an opinion on important social
| issues that are presented through Google's search results is
| treated and evaluated like objective fact. When presented with
| only opinion and information about social issues that one
| strongly agrees with, the disciplined and liberal thinker might
| imagine themselves on the other side of the situation and
| consider what to do when faced with only information that he or
| she disagrees with. What then?
|
| As an example, try searching Google for the phrase "Systemic
| racism is not real". The only results that appear are in
| opposition to the assertion. The searcher is then faced with a
| quandary about whether to question the reality that all opposing
| points of view are absent. But as I've pointed out, the new
| default to laziness and consensus acceptance guarantees that the
| searcher's true belief about this complex issue will default to
| what is offered. The noble cause of the searcher, to believe in
| an America where Good and Honest people are free to seek others
| who wish to treat other humans based on their character rather
| than on as pitifully shallow a discriminant as melanin level, to
| learn about the noble spirit of their own country, and who hate
| to be told that they must be complicit in some sort of hellish
| oppression of others, is met with nothing but a brutal
| emptiness.
| doomed! No wonder we all feel so alone. We sit in our apartments
| and stare at the consensus feed. We let it alter our emotions and
| perceptions without thinking for ourselves. We live in vast
| cities of millions of individuals who want nothing more than to
| be fed the consensus and forget our neighbors. By censoring and
| skewing social issues like this we are forced to abandon the
| liberal notions that are the foundation of Western thought. What
| free thinker is brave enough to fight the shallow consensus of
| millions of lazy thinkers who are being spoon fed perspectives of
| a select few far left technocrats? This is the modern book
| burning. The Arbiter of the zeitgeist sits behind an innocent
| text box and it grows ever more powerful.
|
| I challenge the reader to be truly disciplined in their analysis,
| set aside their political leanings, and objectively consider the
| perspective of a right winger who just saw Twitter silence their
| leader and watched Amazon boot the only social network where they
| could speak their mind, Parler, out of existence. We discuss
| these companies as if they are just organizations who are just
| private companies out to make money but any rational person can
| look at these actions and evaluate them to be political forces
| that are heavily biased and who are asserting their control. We
| circle endlessly around discussions about whether they are
| publishers or platforms and it is obvious that they are both.
| Google has intentionally steered the very nature of the internet
| into their control. The only way to be known is to advertise on
| their platform and you must pay them to do that. They protect
| this modality at all costs and offer the free services that we
| are discussing as an indirection away from their intent. Make no
| mistake: Google wants to become the internet. But the horrible
| reality is that they already have and no one has noticed. Google
| and Amazon alone control an almost absolute majority of servers
| that host the entire internet. They could turn it off if they so
| chose. Companies are willingly giving control of the critical
| parts of their business to these companies. If Amazon and Google
| go down every business that hosts on their servers is doomed.
|
| Microsoft is just as bad. They are actively trying to take away
| the control of user's computers by hosting Windows in the cloud.
| What happens when they only make it available in the cloud on
| their servers? Worse still, we developers are giving up all of
| our control to them by hosting everything on Github. What did
| they do when we made all of our code and expertise available to
| them for free? They trained an AI on it with the intention of
| centralizing the very skill we gave them and are going to try to
| sell it back to us in the form of Copilot. What happens when your
| employer only trusts you to verify Copilot's output and not write
| anything for yourself? What happens to the next generation of
| programmers who don't even know how to write code without their
| help? Microsoft's seemingly benevolent attitude towards
| developers is just another example of the damage centralization
| is causing.
|
| This announcement is just another assertion of totalitarian
| control. The ruthless centralization these companies are seeking
| is an existential danger to everyone. The threat is critical! We
| are in mortal danger. We have given them complete control and are
| doing nothing about it.
|
| The only option that can save us is to build an alternative and
| compete with them.
| exabrial wrote:
| The fact Google is scanning people's private drives for this
| sort of stuff should be scary to absolutely everyone.
| systemvoltage wrote:
| No one is talking about this but no matter what side you're on,
| it's terrifying that a group of individuals congregated in Menlo
| Park, CA, and to a lesser extent a few other offices around the
| world, has become the arbiter of the world's information. Even
| if that group is benevolent and has good intentions, it's
| principally wrong. These groups of people have no dissent because
| it gets silenced internally at Google. It's like an unstoppable,
| unchecked steamroller of ideas that makes the world dance to
| their tune.
|
| I also do not want FAANGs to spread their work culture, their
| shitty UI frameworks, their interview process, their politics,
| their methodology and engineering libraries, their philosophical
| stance, their morality and ultimately their influence over the
| rest of the world.
|
| I love most of their products but I will not use them because of
| their overpowering influence. People in Australia have to deal
| with shitty SV culture because of these companies. It's
| cancerous. It's dangerous and it's going to stifle freedom of
| information, ultimately freedom of political expression and
| opposition. They're acting as a government proxy, but with no
| ability to vote them out. Makes me angry just thinking about it.
| Fuck this.
| [deleted]
| draklor40 wrote:
| So I could be banned for sharing a document that says 'Google is
| evil' ?
| Vuska wrote:
| It's not entirely clear to me which of these policies apply to
| any and all content you upload without sharing externally.
|
| Both the misleading content and sexually explicit material
| sections start with "Do not distribute content...". So does that
| mean as long as I don't enable sharing, I'm free to upload such
| content? If I'm the only person who can access these files, then
| I'm not distributing it surely.
|
| I also find it curious the section on circumvention makes no
| mention of the use of encryption. I suppose doing so would
| imply Google scans your content and uses such techniques to find
| files in violation of their policies. With that not explicitly
| banned, I'll continue to use Cryptomator to keep my private
| documents out of Google's hands.
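|
| Cryptomator does this transparently per file; a minimal sketch of
| the same idea in Python (filenames are illustrative, assuming the
| cryptography package is installed):
|
|   from pathlib import Path
|   from cryptography.fernet import Fernet
|
|   # Generate a key once and keep it outside the synced folder.
|   key = Fernet.generate_key()
|   Path("drive.key").write_bytes(key)
|
|   # Encrypt a document before it ever touches the Drive folder.
|   fernet = Fernet(key)
|   plain = Path("notes.txt").read_bytes()
|   enc_path = Path("GoogleDrive/notes.txt.enc")
|   enc_path.write_bytes(fernet.encrypt(plain))
|
|   # Later, decrypt locally with the same key.
|   print(fernet.decrypt(enc_path.read_bytes()).decode())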
| tennis74 wrote:
| The fundamental problem is that it used to involve a cost to
| publish anything. Now any idiot can post things for free, and the
| viral aspect of the systems ensures it will explode. We humans
| are hardwired for new and outrageous gossip, and hence the ads
| industry is related to this. Nothing can fix this as long as
| publishing is cost free. Human nature can not be fixed. Get over
| it and design systems which mitigate the worst part of it.
| perryizgr8 wrote:
| Well it is google's service and they can do whatever they like on
| it. If you have a problem, just make your own online office
| suite.
| semitones wrote:
| The road to hell is paved with "good intentions". We desperately
| need to find a globally adoptable alternative to google and the
| services that it provides. Docs, Sheets, Drive, etc. are
| fantastic services in that they work really well on a massive
| scale. However, Google's increasing role as an arbiter of right
| vs wrong and a steward of information puts too much power into
| the hands of one corporation, whose best interests are provably
| not aligned with that of the general population.
|
| I've been working as a SWE at google (in ads...) for over two
| years and I've really started to loathe it over the past year.
| The pay is fantastic and it's really hard to walk away from that,
| but the idea that they are not (or at least no longer)
| contributing to the better world that I think we need, has
| started to weigh heavier and heavier on me...
|
| We should be able to implement services like these, that are free
| of ads, on globally distributed infrastructure, with no central
| authority, to have truly free-flowing information.
|
| edit: added quotes around "good intentions"
| ayngg wrote:
| You need to remember that people actually wanted Google and
| tech companies to be arbiters of truth and moderate content.
| This is what people asked for, so the real issue probably has
| to do with why people are comfortable with giving this amount
| of power to companies like Google in the first place.
| WalterBright wrote:
| > whose best interests are provably not aligned with that of
| the general population.
|
| Hell always comes when people feel they know what's best for
| everyone else, and attempt to forcibly implement it.
|
| I'm for people being free to choose what they believe their own
| individual self interest is. Even if it isn't what others
| imagine it to be. Even if those others are right.
| galaxyLogic wrote:
| Google could be sued if they helped distribute dangerous
| misleading health-information.
|
| It's really not about their "good intentions" but about their
| self-interest.
|
| It's not about freedom of speech either. Everybody has the
| right to say whatever they want, and Google has the right not
| to distribute it. Free markets 101.
|
| The fact that Google has de facto monopoly in some areas is a
| problem that government needs to deal with.
| prepend wrote:
| They can only be sued now because they are trying to filter
| out content. Therefore anything remaining is intentional.
|
| I think it would be better to not take proactive filtering
| based on what they think is misleading and to instead rely on
| legal judgements to remove info.
|
| Misleading people isn't illegal (in the US at least,
| currently). There are situations where it is and that would
| be a good line for Google to take.
|
| Now I want them to be sued enough times to make them stop
| doing this.
| bopbeepboop wrote:
| Google actively suppressed holding Fauci accountable for
| gain-of-function research.
|
| I don't believe their censorship makes people safer -- in
| practice, it seems to enable establishment dishonesty.
| analyte123 wrote:
| Google (and all other providers) can _not_ be sued for their
| users distributing "dangerous" or whatever content. This is
| literally the purpose of section 230. The only exception is
| the sex trafficking stuff covered under SESTA/FOSTA.
|
| Whatever pressure they're getting to do this comes from
| somewhere besides the threat of private civil lawsuits.
| seniorgarcia wrote:
| I think I hate you.
|
| "But the money" is your argument for contributing to something
| you recognize as evil. Good on you for having at least some
| conscience so it "weighs on you". Apparently the money is too
| good though, or you don't care enough. Curious which one it is.
| sumnuyungi wrote:
| The toxicity on HN is a strange combination of vitriol and
| superiority complex. I can't really think of many other
| forums that spew such hatred while morally grandstanding to
| such an extent.
| _huayra_ wrote:
| I always wonder if it will be challenging to get people used to
| the idea of paying for these services from a non-Google
| provider. Although Google hoovers up cash hand over fist (hence
| the high salaries), paying even a low cost for an equally
| well-polished set of apps is going to be a hard pill to swallow
| for so many folks that are just used to "free email" by now.
|
| I host my own email and pay about a dollar a month and my non-
| tech family members look at me like I'm some fool for not using
| the free stuff. Conversations about the value of data just go
| woosh over their heads :|
| VoodooJuJu wrote:
| like, just stop using google, lol. What's wrong with computing
| the way it was intended? Stop using the cloud people.
| moneywoes wrote:
| When were they contributing to the global good?
| pfortuny wrote:
| The "public square" is free for speaking because it belongs to
| everybody.
|
| As long as speech happens in a private place (e.g. Google's
| servers, Facebook's...) it is subject to money. It will never
| be free because it is not "public".
| 1vuio0pswjnm7 wrote:
| "We should be able to implement services like these, that are
| free of ads, on a globally distributed infrastructure, with no
| central authority, to have truly free-flowing information."
|
| Google's job is to take away the motivation to do this work.
|
| Alternatively, if it looks like someone is doing it, hire or
| acquire them.
| robertlagrant wrote:
| Can't I get Google Workspace right now which isn't ad-
| supported?
| MuffinFlavored wrote:
| > We desperately need to find a globally adoptable alternative
| to google and the services that it provides
|
| I'm pretty sure Facebook and Twitter aren't far behind in the
| pursuit of also actively banning content they deem misleading.
| I don't think it's a Google-specific problem.
| colordrops wrote:
| They are ahead of the game; they've both been doing it for a
| while.
| lazyjones wrote:
| All these services are easy to replace. What isn't yet
| available elsewhere are the content discovery possibilities
| provided by Youtube, especially what would be considered
| "pirated" content on other sites without deep ties to the
| government (e.g. full CDs, movies that get uploaded and stay on
| yt for a long time, to be downloaded by millions).
| andai wrote:
| A search engine for videos?
| bilbo0s wrote:
| Deep ties to the government alone won't do it. You'll need
| deep ties to these content industries, and that will cost you
| money. Google has it. Most startups probably won't.
| galaxyLogic wrote:
| Point is Google can offer services like Docs for free and
| still invest money in continually improving them because it
| sells advertisement-space on its free products.
|
| Sure you could build your own. But how would you finance the
| maintenance of your software and its free-for-customers
| delivery platform? Answer: By either selling advertisement,
| or by charging people for using it.
|
| The problem really is that by now Google has a monopoly.
| Therefore it is difficult or impossible for you to compete
| with them.
| lazyjones wrote:
| As a user, I don't need to compete with Google. I just need
| an alternative. If I needed collaborative document editing
| (as a replacement for Docs, a feature I don't use), I'd
| probably try something like https://www.samepage.io/ . My
| only use of Docs at the moment is for maintaining some
| stock price calculations and tracking with the
| =GOOGLEFINANCE(...) function. There's probably some
| personal finance apps out there that do the same.
| semitones wrote:
| "All these services are easy to replace" - I would like to
| agree, but I think that statement trivializes the endeavor a
| bit too much.
|
| For Docs for example: Sure, from a technical standpoint,
| creating a webapp that lets you modify a document, syncs over
| new changes on a regular basis, and lets multiple users
| collaborate on the same doc at the same time is certainly
| harder than making a hello world calculator, but it's
| definitely not that hard. Most of us could figure out how to
| do it, especially since it's 2021 and our hardware is up to
| the task.
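|
| A toy sketch of the core sync loop, nowhere near real Docs
| (assuming Flask; clients would poll /doc and retry on conflict):
|
|   from flask import Flask, jsonify, request
|
|   app = Flask(__name__)
|   doc = {"text": "", "version": 0}  # single in-memory document
|
|   @app.get("/doc")
|   def read_doc():
|       return jsonify(doc)
|
|   @app.post("/doc")
|   def write_doc():
|       update = request.get_json()
|       # Reject writes based on a stale copy; the client re-fetches,
|       # merges, and resubmits.
|       if update["version"] != doc["version"]:
|           return jsonify(doc), 409
|       doc["text"] = update["text"]
|       doc["version"] += 1
|       return jsonify(doc)
|
|   if __name__ == "__main__":
|       app.run()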
|
| However, what is stupidly non-easy, is getting billions of
| people to use your solution, and for it to become the
| "standard" in society, for powerusers and non-powerusers
| alike, and for it not to have hiccups when a billion people
| use it. Oh, also, you need to somehow pay for the physical
| infrastructure that supports it, and the human beings that
| maintain it (hardware + software).
|
| The situation would be even more complicated for YouTube
| because of the nature of the content, i.e. a single video on
| YouTube might be viewed by billions of people, while even the
| most popular docs are viewed by thousands at most, and even
| then making a copy of a text document to then distribute off-
| platform is pretty trivial.
|
| And yes, we can spit out terms like blockchain w/ proof of
| stake, IPFS, etc. - but the tech isn't even the hard part,
| it's everything else that's hard (adoption, consensus,
| complexity for non-powerusers, funding, etc.)
| Hizonner wrote:
| Why does any one thing have to be a "standard"? That's part
| of the problem.
| JumpCrisscross wrote:
| > _Why does any one thing have to be a "standard"? That's
| part of the problem._
|
| Economies of scale and network effects.
|
| Nothing _has_ to be the standard. But if 80% of people
| are familiar with X, choosing Y incurs friction.
| lazyjones wrote:
| What has to be "standard"? The application itself or its
| UX? I'm pretty sure it's the latter, so all it takes is a
| familiar UI and low-friction approach. You don't have to
| be a world famous brand for people to be able to use it.
| semitones wrote:
| Standards != centralization. Standards apply more to
| protocols, and centralization applies more to ownership
| of data.
|
| For instance, there is an enormous benefit to the entire
| world from using a standard IP/TCP stack. Yet, we don't
| seem to suffer from a centralized authority making
| controversial or conflict-of-interest decisions regarding
| that stack.
|
| So what I was suggesting is that document-editing abide
| by some global protocol/standard, which would include
| storage, versioning, and permissions. As far as the
| choice of user-facing interfaces and implementations of
| logic around those standards, there need not be just one,
| or even a few. Everyone can bring something cool to the
| table and anyone can use whichever flavor they wish.
| kdmdmsl wrote:
| To avoid Google Drive/Docs/etc you don't have to boil the
| ocean. Buy a Synology DiskStation and just use the
| included software
|
| It's a solution that works for anyone with basic IT
| skills. Offer your extended family access, and use the
| provided software to control permissions etc
| jchrisa wrote:
| In the pandemic my elementary-aged kids became Google
| Docs experts. Network effects are real.
| semitones wrote:
| Also, non-powerusers will gravitate towards the simplest
| solution to their problem, and that will also tend to be
| the one that the highest number of people understand how
| to use.
| dudus wrote:
| Exactly, if we replace one standard with another, what
| problem are we solving if the main goal was to reduce
| centralization?
| [deleted]
| judge2020 wrote:
| If nobody uses something, what difference does it make?
| j4yav wrote:
| Everyone uses it or nobody uses it aren't the only two
| options.
| dimitrios1 wrote:
| Exactly. We only need a "standard" because everyone
| builds walled gardens and easy content distribution to
| multiple platforms is intentionally difficult. Right now
| you have to watch a "YouTube" video, instead of just a
| video on YouTube.
| swalsh wrote:
| I think what we need is a users' bill of rights. There should be
| certain rules a company just can't decide to make, and there
| should be certain rights afforded to every user.
|
| Without that we're just choosing between which monarchy to rule
| us.
| smrtinsert wrote:
| Quit or just admit you'd rather get paid. I have no problem
| ignoring calls from recruiters.
| drluke wrote:
| Maybe it's time we made it easy for people to host their own
| clouds... like Nextcloud. An all-in-one at-home server with
| Owncloud Plug-N-Play. While we are at it, have it ready to go
| for hosting email and federated social media.
|
| To be honest, it would be a challenge, but totally doable.
|
| Decentralize the internet once again!
| jakelazaroff wrote:
| This is techno-utopianism. The problem isn't technical. Your
| grandparents won't figure out how to host their own cloud or
| use Mastodon. Your friends who work at art galleries or in
| construction won't figure it out either. The reason
| centralized services proliferate is that they cater to the
| vast majority of people who don't care about this stuff.
|
| "How do we limit the harm of misinformation while preserving
| our freedoms?" That's a political problem. We can't just code
| our way out.
| Ostrogodsky wrote:
| This is what most people here don't seem to understand. If
| you are banned from Facebook/YouTube/Google/Twitter, for
| the mainstream public (that is, 90% of the people) you are
| effectively banished from the Internet. This is like a
| candidate not being covered in the NY Times, WP, or LA Times,
| but it is OK because the "Quarterly Express" in "Chinook
| county" published a 2-page interview.
| hollasch wrote:
| ISPs have shown increasing comfort with delisting sites they
| deem bad. I'm a huge proponent of commoditizing data hosting,
| but the culture today leans heavily toward "filtering"
| information.
| native_samples wrote:
| It's not really culture. It's a small minority of radical
| extremists who get their own way repeatedly by threatening
| meltdowns out of all proportion to the severity of the
| problem, and emotionally manipulating their own managers
| ("you're a bad person if you don't do this"). And they do
| it again and again, until the organization starts to
| collapse and becomes a mere tool for their political
| agendas.
|
| The best way to push back on this is not some p2p techno
| fix. It's to systematically start firing anyone in an
| organization who demands the moral cleansing of customers
| or colleagues.
| drluke wrote:
| ISP censorship is next on their list... They are going to
| push normal discourse underground, and bad ideas will just
| fester instead of being naturally filtered out in the
| proving grounds of public discourse.
| umvi wrote:
| Then there will be calls for cloud providers to remove
| "misleading" clients. (Read: AWS + Parler, Cloudflare, etc.)
|
| What we really need is a legal designation of an online
| "public square" that protects free speech
| drluke wrote:
| I agree, but my solution was self_hosted at home, not on
| big tecks slippery back. But I guess the same could be said
| by your ISP.
|
| Although with the increasing use of private gaming servers
| and streaming, ISP's are starting to acommadate the gen_pop
| with decent upload bandwith finally.
|
| Who know, it wont be long untill ISP's start filtering
| content on a massive scale.
| TameAntelope wrote:
| A "public square" online designation would effectively be
| the nationalization of whatever service gets that
| designation.
|
| I, for one, don't think the US government should be in
| charge of Twitter.
|
| Taking private companies and forcing them to say certain
| things is not how I'd like this country to evolve, and as
| long as the 1st Amendment exists, is not how it will
| evolve. "Public Square" designations for private companies
| cannot exist alongside the 1st Amendment, period.
| Ostrogodsky wrote:
| OK, and then what happens when a cartel of the big
| companies decide what and who can publish stuff?
| hash872 wrote:
| The issue is that we know from experience, after 20+ years of
| the modern Internet, that if you make a 'free speech'
| drive/repository place that's widely available, it will host
| the absolute worst of the human race. Then, let's say you
| personally were in charge of said Free Speech Drive- every day
| you'd get up and hear about people using it for (legal)
| jailbait photos, Islamic State recruiting, collaboration
| between extremist militia groups in various countries
| (including your own), actual illegal content, and so on. Pretty
| soon the FBI & CIA start contacting you about some of the
| actual or borderline illegal content being hosted on Free
| Speech Drive. Do you want to deal with that?
|
| For one thing, it's easy to say 'well we'd only take down
| illegal content'. But in practice there isn't such a bright
| line, there's lots of borderline stuff, authorities could rule
| something posted on your site illegal after the fact- lots of
| these situations are up to a prosecutor's judgement call. Would
| you risk jail to push the boundaries? Coordinating 1/6 wasn't
| necessarily illegal until- it was.
|
| If Islamic State is recruiting on Free Speech Drive, posting
| manifestos, encouraging Western residents to actual jihad- you
| wouldn't take that down? You'd leave it up if it hewed close
| to the line of being legal- really? Jailbait or non-nude pics
| of someone's teenage daughter, hosted in the thousands- you
| wouldn't take that down? It's easy to be an absolutist in an
| Internet argument; it's much harder when you face the sort of
| everyday content moderation issues you see in the real world.
| walrus01 wrote:
| a lot of people who say they want an absolute free speech
| drive/free speech host have never actually worked for a
| colocation/dedicated server/hosting ISP and seen how the
| sausage is made.
| travoc wrote:
| Is "seeing how the sausage is made" a requirement for
| having beliefs or opinions on the matter?
| imbnwa wrote:
| I think it brings one closer to having "skin in the game"
| user-the-name wrote:
| Yeah, actually, yeah, sometimes it is.
| [deleted]
| jdasdf wrote:
| >The issue is that we know from experience, after 20+ years
| of the modern Internet, that if you make a 'free speech'
| drive/repository place that's widely available, it will host
| the absolute worst of the human race.
|
| And that's just fine. People have the right to be assholes.
| Torwald wrote:
| If I were hosting said Free Speech Drive, I would be
| oblivious to the contents of what my users host on their
| drives. I would not violate their privacy by spying on them.
| Their files, not mine.
| hash872 wrote:
| If your site is widely used and you make it technically
| impossible for you to see or moderate content in any way
| whatsoever, your site will become a host for _real_ illegal
| content- not just the borderline examples I gave. As the
| comment above me notes, even 4chan removes CP. You place
| yourself in serious legal jeopardy with this decision
| iammisc wrote:
| No I wouldn't. Google is not necessarily wrong here. The
| issue is that you cannot easily 'own' a part of the internet,
| despite much of our life playing out there.
|
| In the real world, if no one wants to host you and your
| group, the standard answer is to acquire money and buy your
| own land, your own broadcasting, etc. On the internet, this
| is much harder for 'normal' people to do, requiring them to
| use services like Google.
|
| Look, once Parler was taken down, I purchased several servers
| off eBay, rented colocation space, and set up my own
| services. But I have the technical know-how to actually 'own'
| part of the internet without depending on anyone else. Most
| people can't really do this. Thus they depend on Google, et
| al, who are not selling them something akin to a land title,
| which is what people feel they ought to have, but rather a
| service.
|
| The reason people got mad at AWS for taking down Parler is
| because to the common man's mind, when Parler pays for its
| "website" (because let's be honest, that's as deep as most
| people go), it 'owns' it, and it ought to be able to hold
| title to that thing in perpetuity, like land. People felt
| upset because they felt that amazon simply seized what they
| perceived to be the equivalent of land or personal property.
|
| Of course websites are different because they require active
| serving by a computer, and parler was paying AWS to do that
| and Amazon decided not to. But that's not how people view it.
|
| To top it off, people are scared because they don't know how
| to own anything on the internet. Even tech savvy people have
| no idea how to purchase or lease IP space, set up servers,
| routes, etc. It's all very confusing.
| jaywalk wrote:
| > once Parler was taken down, I purchased several servers
| off eBay, rented colocation space, and set up my own
| services. But I have the technical know-how to actually
| 'own' part of the internet without depending on anyone
| else.
|
| You said you rented colocation space. You (I assume) are
| paying a monthly fee for an Internet connection for your
| servers. You are absolutely depending on others who can be
| pressured just like AWS was with Parler. Don't kid
| yourself.
| iammisc wrote:
| Sure, but the difference is that... if that happens, I
| own the computers. They can take down my internet
| connection (although I have multiple colocation centers
| owned by different people... I guess I could go
| international if I really want to add extra redundancy),
| but the computer is mine. The data on the drive is mine.
| They cannot touch this stuff. If they did, I can accuse
| them of larceny, and sue for damages.
|
| It's written in the contract that they can take me
| offline, but they cannot touch my stuff. They can take it
| off the shelf for non-payment, but there's a period in
| which they have to retain it and offer it for pick-up.
|
| This is wholly different from Amazon not only taking
| Parler down, but also deleting the data, forcing them to
| download terabytes in three days, over a weekend.
|
| And you're still right though. I don't actually _own_ any
| IP space. In fact, IP space 'ownership' is handled by an
| NGO with little regulation. That's terrible. It ought to
| be governmental, because owning parts of the internet is
| an extremely important part of society. Too important to
| be left in any non-governmental organization's hands.
|
| There are blockchain-like systems that could solve this
| problem in a distributed fashion. There's also urbit. Or
| we could have proper governmental authority.
| kmeisthax wrote:
| Another wrinkle in all of this is that you can use free
| speech _as a form of censorship_.
|
| For example, if someone says something you don't like, you
| can intimidate them into shutting up by, say, threatening to
| reveal their personal information, such as their legal
| identity, address of residence, and so on. On certain corners
| of the Internet, merely dropping dox is good enough to get
| randos (who won't even be affiliated with you, so +1 to
| plausible deniability) to harass someone you want to shut up.
|
| A more technical variant of this is DDoS attacks. Instead of
| trying to intimidate someone into shutting up with threats of
| stochastic terrorism, you shout over them by sending a bunch
| of traffic to their site until the server crashes or they run
| out of money.
|
| So even if you're a hardcore free speech extremist, you still
| need to embrace some level of "censoring the censors" if you
| want the Internet to actually be _usable_.
| native_samples wrote:
| That's not censorship though. Threats are what people are
| forced to do when they cannot censor you, as censorship is
| much more direct. And DDoS attacks aren't speech.
| d23 wrote:
| You've seemingly distinguished DDoS attacks from
| legitimate traffic. If someone is "flooding the zone"
| with disinformation with the purpose of making it
| impossible to discern the truth (i.e. it's not
| legitimate, good faith discourse), is it not reasonable
| to draw a parallel with DDoSing?
| hash872 wrote:
| Agreed. That's not even getting into just pure spam, which
| from people like Alex Stamos I've heard is 100-1000x the
| issue that culture war content moderation is. Once you've
| accepted that a platform can remove the kind of spam that
| killed MySpace- or doxing or a DDoS attack, as you say-
| you're already on the (common sense IMO) road to content
| moderation. Which again, from 25+ years of the modern
| Internet, we know is just mandatory to have a usable site.
| mrtksn wrote:
| > threatening to reveal their personal information, such as
| their legal identity
|
| What's the problem with that? Bad things on the internet
| happen more often than not because of the lack of
| responsibility.
|
| Doxxing has become the primary sin in the Internet religion
| but it would solve all kinds of problems. I am going to
| commit that sin and say that doxxing is the solution; you
| can downvote me and make my comment greyed out and censor
| me while you argue against censorship.
|
| Instead of deleting content, simply make sure that it's
| linked to someone who can pay for it if it turns out to be
| something to be paid for.
|
| The anonymity argument is only good when you are actively
| persecuted by a state actor. I don't agree that you deserve
| anonymity because the public will demonise you. If you hold
| strong beliefs that can be met harshly by the general
| public, you had better be ready for the pushback and think
| of ways to make them accepted. That's how it has always been
| done.
|
| Therefore, when content is questionable maybe the users
| should simply be KYC'ed and left alone until a legal take-
| down order is issued. If it's illegal (like illegal porn,
| copyrighted content, terrorist activity, etc.), go to prison
| for it. If it's BS, get your reputation tarnished.
| breakfastduck wrote:
| You are so, so wrong here.
|
| Who the hell gets to judge what is 'to be paid for' in
| this world you're talking about? The mob? No thanks!
|
| In fact the internet actually went to shit the minute it
| pivoted from 'Don't share your personal info publicly' to
| 'please give us every last drop of your personal
| information and share it publicly'
| mrtksn wrote:
| >Who the hell gets to judge what is 'to be paid for' in
| this world you're talking about
|
| Those who demand the payment, obviously.
|
| Denying the existence of God or being gay could be
| something to be paid for in some places, and obviously
| that is a horrible thing, but anonymity doesn't solve it.
|
| Fighting for change, or leaving that place, solves
| something. Alan Turing himself was subjected to these
| things in the United Kingdom. A few decades later things
| changed in the UK, and it had nothing to do with
| anonymity.
|
| Now those who think that gays deserve equal rights demand
| payment. Again, anonymity isn't helping the anti-gay
| folks; it simply creates low-quality discussion and
| stress, nothing more.
| nemo44x wrote:
| Well clearly people that think like them will always be
| in control of things.
|
| That's the thing people don't often consider - how will
| this policy/law/norm work when I'm not the one benefiting
| from it?
| TheDong wrote:
| > What's the problem with that?
|
| > If you hold strong believes that can be met harshly by
| the general public, you better be ready for the pushback
| and think of ways to make it accepted. That's how it has
| been done since ever.
|
| The problem is that we're not talking about the general
| public. Let's say I'm Jewish. Someone on the internet may
| "doxx" me by finding a group of neonazis and spreading my
| information there, resulting in me getting threats and
| hate.
|
| The internet seems to specialize in this sort of
| "doxxing". Why? My theory is that the internet, even if
| you have a real name for a handle, still distances and
| dehumanizes others to the point where it's hard to
| understand the pain you're causing.
|
| It's hard to walk up and slap someone because you feel
| the slap and see them wince in pain. It's easy to DM
| someone on twitter something far more hurtful than a
| slap, laugh about it with your tribe of neonazis, and
| forget about it the next day.
| mrtksn wrote:
| I think the solution to your problem is physically
| securing you against neo-Nazis instead of hiding your
| identity. Unless of course you are writing this from the
| 1940s and you are in Central Europe. If that's the case,
| you have a case.
| TheDong wrote:
| I am writing this on the internet. Physically securing
| myself does nothing to prevent hatemail, DoSs, and
| slander.
|
| The ideal that "lies can't hurt you, the truth is
| stronger than lies" has never seemed to actually work.
| There are countless fictions far more prominent than
| facts, and there are countless people whose online
| experience has been damaged by a small contingent of
| dedicated attackers.
|
| The response to "harboring free speech to the extreme
| results in neonazis digitally harassing Jews" should not
| be "okay, fine, lock your door at night, free speech is
| more important than you being harassed".
| mrtksn wrote:
| You get your e-harassment non-anonymously too.
|
| If people believe that it's wrong for you to be subjected
| to that, those who do this to you will pay for it.
|
| Anonymity does have uses, but it's powerful and open to
| misuse. What we see today on the internet is mostly its
| misuse.
| yjftsjthsd-h wrote:
| ...you cannot possibly be seriously decrying anonymity
| under the handle "mrtksn". If you're going to argue that,
| you can do it under your real name.
| the8472 wrote:
| > That if you make a 'free speech' drive/repository place
| that's widely available, it will host the absolute worst of
| the human race.
|
| That's only due to selection effects. If being open were the
| default then they'd be diluted among all the other people.
| ISPs themselves, (older) reddit, 4chan all serve as examples
| that the people you don't want to talk to can be mostly
| siloed off to some corner and you can have your own corner
| where you can have fun. Things only get problematic once you
| add amplification mechanisms like twitter and facebook feeds
| or reddit's frontpage.
|
| > For one thing, it's easy to say 'well we'd only take down
| illegal content'. But in practice there isn't such a bright
| line, there's lots of borderline stuff, authorities could
| rule something posted on your site illegal after the fact-
| lots of these situations are up to a prosecutor's judgement
| call. Would you risk jail to push the boundaries?
|
| I don't see how that's an issue. They send a court order, you
| take down the content - that's a perfectly reasonable default
| procedure. For some categories of content there already exist
| _specific_ laws which require takedown on notification
| without a court order, which exactly depends on jurisdiction
| of course, in most places that would be at least copyright
| takedowns and child porn.
|
| > Pretty soon the FBI & CIA start contacting you about some
| of the actual or borderline illegal content being hosted on
| Free Speech Drive. Do you want to deal with that?
|
| That's pretty much what telcos have to deal with for example.
| Supposedly 4chan also gets requests from the FBI every now
| and then. It may be a nuisance, but not some insurmountable
| obstacle. For big players this shouldn't be an issue and
| smaller ones will fly under the radar most of the time
| anyway.
|
| Also, having stricter policies doesn't make those problems go
| away. People will still post illegal content, but now in
| addition to dealing with the FBI you also need to deal with
| moderation policies, psychiatrists for your traumatized
| moderators (whom you're _making_ see that content), and
| end users complaining about your policy covering X but not Y
| or your policy being inconsistently enforced or whatever.
| ABCLAW wrote:
| >ISPs themselves, (older) reddit, 4chan all serve as
| examples that the people you don't want to talk to can be
| mostly siloed off to some corner and you can have your own
| corner where you can have fun. Things only get problematic
| once you add amplification mechanisms like twitter and
| facebook feeds or reddit's frontpage.
|
| This isn't true at all, and the reddit report following
| their ban wave is pretty clear about it; once areas that
| actively established a standard of violent or racist
| discourse as acceptable were banned, the volume of
| objectionable material across the site dropped.
|
| 4chan had a similar situation: the culture on /b/, which
| was intentionally left as an explicitly unmoderated
| segment of the site (a silo), actively invaded other boards
| with violent, racist content.
|
| It isn't that people sit in silos and do nothing otherwise
| - it's that the silos themselves cause people to believe
| their content is acceptable, then spread that shit
| everywhere.
| the8472 wrote:
| I wrote "mostly siloed" not "perfectly siloed". This is
| no different from real life where your social sphere is
| not perfectly insulated from other social spheres.
| Perfectly siloed also means filter bubbles.
| semitones wrote:
| It's a real problem. It's easier to suppress such content,
| but it just goes elsewhere, where it is almost completely
| unchecked; it proliferates in much darker circles as a
| result, and we have even less visibility into its true
| volume.
|
| Maybe there should be more of an effort to reduce people's
| incentive to engage in that sort of behavior in the first
| place. Why do people join violent extremist groups? Why do
| people engage with CP? Why do terrorist groups exist? Is it
| just human nature? Is it a fact that with 7+ billion people
| we are destined to have millions of people engage in this
| behavior?
|
| De-platforming horrible material is better than nothing, but
| it feels like whack-a-mole
| [deleted]
| TeeMassive wrote:
| > The issue is that we know from experience, after 20+ years
| of the modern Internet, that if you make a 'free speech'
| drive/repository place that's widely available, it will host
| the absolute worst of the human race. Then, let's say you
| personally were in charge of said Free Speech Drive- every
| day you'd get up and hear about people using it for (legal)
| jailbait photos, Islamic State recruiting, collaboration
| between extremist militia groups in various countries
| (including your own), actual illegal content, and so on.
| Pretty soon the FBI & CIA start contacting you about some of
| the actual or borderline illegal content being hosted on Free
| Speech Drive. Do you want to deal with that?
|
| Imagine if there was some kind of website or network that
| existed for years with barely any rules or enforcement, like
| image boards that only remove CP or a decentralized
| anonymizing network with even decentralized payment systems.
| That would be the end of the world.
| hash872 wrote:
| Right, and I left this out of my already-lengthy comment
| just so it wasn't a total wall of text. If you can easily
| host your 'censored' ideas on some other corner of the
| Internet- what exactly is the problem? Why are you entitled
| to Google's private property, specifically? You've been
| asked to leave one establishment, and are free to simply go
| elsewhere.
|
| We've entered a Golden Age for radical/controversial
| content- totally unthinkable freedom to say or read
| anything that would've been technically impossible even in
| the 80s. It's actually the opposite of censorship- never
| have people been so free to express any view, thanks to the
| Internet. I'm not really clear on the level of hysteria over
| Google Drive's policies specifically- 4chan, or another site
| just like it, will always be there.
| andai wrote:
| Google docs etc aren't even Google's inventions, Google just
| bought them. I think it's important to emphasize that, to
| dispel the notion that you need to be a big company to make a
| product like that.
| semitones wrote:
| I agree that you do not have to be a big company to make a
| product like that, but it seems like you have to be a big
| company in order to host and deliver it.
| oarsinsync wrote:
| > _I agree that you do not have to be a big company to make
| a product like that, but it seems like you have to be a big
| company in order to host and deliver it._
|
| The tragedy isn't in what you've said, but rather what you
| haven't said. The implication is already that a product of
| GDocs/GSheets quality should be free, as part of a large
| company's moat, rather than a paid-for product that people
| _will_ pay for.
|
| The tragic reality is that these large companies have turned
| what would otherwise be successful standalone businesses
| into free additional features.
|
| I've used that word because Steve Jobs famously described
| Dropbox as a feature. Google has effectively made MS Office
| a feature. Apple effectively made operating systems a
| feature, by giving away macOS and iOS for free with their
| hardware sales.
|
| Increasingly, everything becomes a feature, in search of
| what? For big tech, it's to sell users' attention.
|
| Meanwhile, on the other side, big media is charging us to
| give them our attention...
| throwaway984393 wrote:
| > it seems like you have to be a big company in order to
| host and deliver it.
|
| What you'd need to host/deliver something like Docs/Sheets
| is: a product team (2 QE, 8 SWEng, 3 SRE, 1 product owner),
| the product, some cloud infrastructure, and the capital to
| pay for it. You could go larger than that to build it, but
| that is _plenty_ of people to run/maintain/support it.
| Assuming "large scale" is between 1M and 100M users, figure
| between $750K and $3M for infra, and non-SV salaries for
| employees, and you're lookin' at between $1.75M and $4M.
|
| If you use the cheapest infrastructure and labor, you could
| do it for $500K.
|
| The question isn't whether you need to be a big company or
| not, it's where you're gonna get the money. VCs throw that
| much cash around every day. If you can actually get _paying
| customers_, even better.
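|
| Back-of-envelope, that arithmetic works out roughly like this
| (the team size and infra range are the figures above; the
| ~$75K average fully-loaded non-SV salary is an assumed number
| chosen to land in the quoted range):
|
|     team_size = 2 + 8 + 3 + 1                    # QE + SWEng + SRE + product owner
|     avg_salary = 75_000                          # assumed non-SV fully-loaded average, USD/yr
|     payroll = team_size * avg_salary             # ~$1.05M/yr
|
|     infra_low, infra_high = 750_000, 3_000_000   # rough infra range for 1M-100M users
|
|     print(payroll + infra_low, payroll + infra_high)   # ~1.8M to ~4.05M per year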
| root_axis wrote:
| Nah. Hosting is cheaper than it's ever been at any time in
| history; the costs only become a concern if you have lots
| of users, in which case you should be generating enough
| revenue to pay for the increased hosting costs.
| semitones wrote:
| There's more to hosting than just spinning up a fleet of
| containers, you know (and even that is not necessarily
| trivial...).
|
| Also, how would you be generating revenue from these
| users? Just do what google does and run ads on the side?
| Then what's the difference?
| rambambram wrote:
| Of course there's more to hosting. But that's what the
| hosting company does! The web hosting landscape - at least
| here in Europe - is perfect: world-class technology, local
| service. I can be on the phone with these companies if
| there's a problem.
| root_axis wrote:
| > _There's more to hosting than just spinning up a fleet of
| containers, you know (and even that is not necessarily
| trivial...)._
|
| Spinning up containers is like the bare minimum I'd
| expect from an ops engineer; I wouldn't call it "trivial",
| but it's the job.
|
| > _Also, how would you be generating revenue from these
| users?_
|
| That's a business question, not an engineering one. If
| the business doesn't have a plan for revenue, then
| hosting costs are irrelevant.
| swalsh wrote:
| Hosting content that some people find "problematic" has
| its own additional layers of
| difficulty. Amazon is completely willing to dump you if
| they disagree with you.
| root_axis wrote:
| There are a few cases of this happening, but it's not
| common. If you intend to host "problematic" content pick
| a more understanding host or colocate.
| rootsudo wrote:
| Hosting costs are at an all-time low, while the innovation
| and time you have to put into a project like this are not.
| nine_k wrote:
| Have you ever been on-call for any project larger than a
| toy? If you have, you likely noticed that keeping the
| whole thing up sometimes takes effort, more effort than
| meets the eye.
|
| Hosting as in having some code deployed to some machines
| is indeed cheap. Keeping a large app like g.docs up and
| running, especially without breaking the bank, is a bit
| more tricky.
| root_axis wrote:
| > _Have you ever been on-call for any project larger than
| a toy? If you have, you likely noticed that keeping the
| whole thing up sometimes takes effort, more effort than
| meets the eye._
|
| Of course, that applies to literally every piece of
| production software ever, but keeping a web app running
| really isn't that hard; it's honestly the bare minimum of
| competent software development. If you have a team of
| SREs up at 3am triaging the site every night, you're doing
| something wrong. Now of course, when you get to _Google_
| scale, you will encounter unique problems, but if you're
| at Google scale your business has more than enough
| revenue to pay for the costs.
| rambambram wrote:
| My experience as well. Web apps - if made slightly
| streamlined and lightweight - with thousands of visitors
| a month are easy peasy on cheap web hosting. Google is
| another scale, of course. That's like comparing elephants
| with mosquitoes.
| iso1631 wrote:
| Anything that is in danger of breaking away from the
| ecosystem gets bought out. Indeed, that seems to be the end
| goal of most startups
| colechristensen wrote:
| No you have to be a big company to resist the urge of
| getting bought out.
|
| Or you have to either have aspirations of becoming a big
| platform company or a plan to survive and be happy watching
| big companies push you to fifth place in a category you
| once dominated.
| izacus wrote:
| > Google docs etc aren't even Google's inventions, Google
| just bought them.
|
| That's the same thing as saying that macOS Monterey isn't
| Apple's invention; they just copied Xerox.
|
| There are years of development between what Google bought and
| what the Docs suite is now, and any engineer who has developed
| a product for years shouldn't say silly stuff like that
| sentence.
| guerrilla wrote:
| We have both nextcloud and LibreOffice.
| 37r7dyysy wrote:
| As always, the question is who's going to do better? The
| Americans already gave it their best shot and this is where
| they ended up. Certainly not any of the members of the EU, it'd
| get hobbled by regulation as they've never cared about actual
| free speech of the kind the Americans value. It's sure not
| going to be coming out of anywhere in SEA or NAME with the way
| governments in those regions tend to operate. So who's left to
| do better at a scale that matters?
| api wrote:
| I know there are a ton of people who, like you, want
| alternatives. Unfortunately nobody is willing to pay for it.
|
| Advertisers (and governments and criminals) pay handsomely for
| surveillance-driven "free" platforms, but who will pay for the
| development, maintenance, productization, polish, and support
| of open decentralized alternatives? Users have been conditioned
| to believe that software should be free, and the more
| ideological people in the FOSS movement will tell you it's "not
| open source" if you don't give it away with no strings
| attached. I know people who will actually uninstall things if
| they do not have an OSI-compliant license.
|
| Look at how much work it takes to develop and maintain these
| centralized systems. Now consider that decentralized systems
| are more challenging to develop and scale because you have to
| deeply understand distributed systems instead of just hacking
| some code to run on one centrally managed 100% trusted
| platform.
|
| Where is the army of independently wealthy highly skilled
| developers who are going to do all this unpaid?
|
| I am not optimistic. Nobody pays for freedom, openness, or
| privacy. All the money and momentum is behind the current user-
| exploiting paradigm, and now we have a generation of
| programmers who are learning "cloud native" development and
| don't even know how to develop things that don't run this way.
|
| Edit:
|
| I once gave a presentation to a room of college kids and was
| discussing a peer to peer system. A student raised his hand and
| asked how two devices could communicate this way without "a
| cloud." He was not aware that it was possible for something to
| communicate directly with something else over a network without
| a server.
| leppr wrote:
| This problem is nowhere near solved, but if you look at the
| more general area around "crypto" _(currency)_, lots of
| projects are exploring novel forms of funding the
| development, maintenance and usage of decentralized networks,
| either for themselves or as generalized solutions[1].
|
| [1]: https://gitcoin.co/
|
| _(Obligatory disclaimer pre-acknowledging the drive-by
| comments when mentioning anything crypto on HN: yes there are
| lots of scams, yes lots of illegal activity uses these
| networks, yes blockchain is often used as a buzzword)_
| api wrote:
| I was optimistic about cryptocurrency, but the problem I
| see is that the bad drives out the good. There are so many
| scams that the very idea is now linked to scams in the mind
| of a huge number of people, driving away a ton of people
| who might otherwise use it for things that are not scams.
|
| It's the "bad neighborhood effect." A few people commit
| crimes, so a neighborhood becomes known as "bad." The
| people who don't commit crimes move out. Now most of the
| neighborhood's residents are criminals.
|
| A lot of financial regulation is about maintaining the
| reputation of markets so that serious people will use them.
| If too many scams, bubbles, and other nonsense goes down,
| the market gets an overall bad reputation.
| leppr wrote:
| I think this effect certainly plays a big part in slowing
| down progress on fundamental research in an otherwise
| hyper-active field. But the good thing about open
| technologies is that they can act like neutral tools and
| not neighborhoods.
|
| Decentralization allows anyone to start their own bubble
| with however much curation they want. If you want crypto
| without scams, it's very easy to achieve. Let's not
| forget how terrible a reputation the whole "world wide
| web" had just 2 decades ago, and yet here we are
| complaining how locked-down and ultimately "too safe" it
| has become.
|
| The whole space will change as people give more regard to
| the fundamental aspects of the platforms' technology than
| to transient feelings about the current ecosystem.
| GekkePrutser wrote:
| > I know there are a ton of people who, like you, want
| alternatives. Unfortunately nobody is willing to pay for it.
|
| I don't know, I pay for O365. I know a lot of people that do.
| And remember: Google Drive is not free either. To get more
| than a minimal amount of storage you have to pay. As a
| result, most serious users of these services are already
| paying for them.
|
| However, we end up paying _AND_ still get judgement passed on
| our data. It really should be E2E encrypted, and these
| conditions should only apply to files that are actually shared
| with external people.
|
| However, I've heard many stories of people getting their
| accounts banned for having copyrighted content on their drive
| that was never shared at any point.
|
| If it wasn't for the fact that I mainly have O365 for other
| stuff (email in particular), I would never pay for OneDrive
| under these conditions. Imagine your computer suddenly going
| like "oh hey this is a downloaded movie, you shouldn't have
| this!!" and deleting it from its hard drive. Or worse, even
| forbidding you from logging in and accessing any of your data.
|
| Ridiculous of course but this is the situation we now have
| with online storage. I back my OneDrive up every day for this
| reason.
| seaman1921 wrote:
| must be one of those AWU promoters
| jodrellblank wrote:
| > " _but the idea that they are not (or at least no longer)
| contributing to the better world that I think we need_ "
|
| If I can ask this in a non-snarky way, did you ever honestly[1]
| think working on ads was contributing to a better world? If so,
| how?
|
| [1] 'honestly' contrasted with "it pays well so I'm not going
| to think about that" or "I say it does in public but I don't
| believe it".
| yobbo wrote:
| > We should be able to implement services like these
|
| Implementation would be easy if we knew how to solve payments
| for decentralized hosting, maybe with ads as a possible
| business model. And the payments brokers mustn't be the central
| points either.
| nobody_at_all wrote:
| Google has property rights.
|
| They don't want BS that is harming everyone on their property.
|
| I would ban the same on my property.
|
| That said, there is no reason to ever use Google. Nothing they
| offer is so critical that you must use them.
|
| I don't use Google for anything and never will because they are
| a spyware company.
|
| You willingly work for a spyware company and are trying to
| claim personal morals? How does that work in your mind?
| Genuinely curious.
|
| > We should be able to implement services like these, that are
| free of ads, on globally distributed infrastructure, with no
| central authority, to have truly free-flowing information.
|
| How does that work?
|
| Who pays for it?
|
| Who manages it?
|
| You realize that would create a de facto government if an
| actual government wasn't running it.
|
| Look at any website that has truly "free-flowing information".
|
| They all end up being a cesspool filled with racists, sovcits,
| terrorists, and pedophiles.
|
| Even places like Kiwifarms have rules and delete posts and ban
| users that violate their few rules.
| yibg wrote:
| Perhaps. But isn't Google (and Facebook, Twitter, etc.) also in
| a bit of a no-win situation? They implement these types of
| measures and they get accused of censorship and being the
| "arbiter of right vs wrong". They don't, and they get accused
| of helping spread fake news and false information.
|
| Putting aside intent, cost, etc., what can/should these
| companies do that'll make everyone happy?
| alexfromapex wrote:
| Build your own cloud with NAS. It's a game changer.
| summerlight wrote:
| > We should be able to implement services like these, that are
| free of ads, on globally distributed infrastructure, with no
| central authority, to have truly free-flowing information.
|
| How naive. The reality is terrible, probably much more so than
| you can imagine. Let's say someone tries to find teenage
| victims for cybersex trafficking at a massive scale on your
| proposed infrastructure with "free-flowing information". How
| will you stop them? Is this just a hypothetical scenario? No.
| It actually happened on Telegram, which refused an official
| government order to shut down the chat room at a victim's
| request.
|
| https://en.wikipedia.org/wiki/Nth_room_case
|
| Please don't underestimate how far average people's malice can
| go. Both private and governmental intervention exist not just
| because of "good intentions", but because they come from real
| demands from society.
| EGreg wrote:
| We developed an alternative to google and the services that it
| provides. It's part of a growing open source ecosystem that
| includes NextCloud for file management.
|
| If you want a totally free and open source alternative to
| Facebook and Google, that you can run on your own servers, you
| can find it here:
|
| https://github.com/Qbix/Platform
| gumby wrote:
| Have you considered NextCloud? Even available as a service if
| you don't want to host the code yourself.
| TameAntelope wrote:
| Are we, as humans, entitled to a certain level of UX, or are we
| simply entitled to the ability to share information?
|
| Because if it's the former, then yeah Google itself needs to be
| replicated and provided for all humanity. If it's the latter,
| well what does FTP not have?
|
| I'm having a very hard time seeing how one specific
| implementation of any technology becomes so important that to
| live without it is to be deprived of a human right.
| breck wrote:
| > We should be able to implement services like these, that are
| free of ads, on globally distributed infrastructure, with no
| central authority, to have truly free-flowing information.
|
| Yes! And the way we do it is simple: amend the US Constitution
| to abolish copyright and patents and ensure people have the
| right to intellectual freedom.
|
| https://reddit.com/r/ifa
| robertlagrant wrote:
| We can do it today on patent-unencumbered technology.
| [deleted]
| ghanpatel wrote:
| and pay for it with... love?
| [deleted]
| izacus wrote:
| I think your fundamental error is in the fact that you think
| that a private company (and market competition) can fix these
| issues. It seems that many people on HN are just waiting for
| the new savior company that will magically have incentives to
| fight for them instead of making money. It's like hoping for
| market competition to create health regulation in the food
| industry.
|
| Turns out, not even Apple is that messiah, and perhaps the
| solution isn't in demanding that private companies be your
| regulators and defenders of good morals and truth. What
| happened to having specialized agencies regulate and inspect
| industries?
| abaracadab wrote:
| Honestly a lot of folks should assess the value that big tech
| provides. We survived the '90s running MS-DOS, backing stuff
| up to disks. And storage is cheap. Cloud this-and-that is great
| in theory but in practice it makes things too complicated.
| For instance, whenever I use a cloud-centric application I'm
| thinking okay... so where's my files? And the answer is who
| knows!
|
| More specifically I used to use OneNote when it used the
| local file system and I could get to my data. Then MSFT puts
| everything behind a cryptic ten-layer hashed URI...
|
| The tech industry is like an insurance company now selling
| people on fear of losing files or productivity--but I'd wager
| more often than not technology gets in the way of folks'
| productivity. By technology I mean fluffy clouds, because
| obviously tech CAN help.
|
| We need a better balance and a lot of that starts by keeping
| it simple.
| Bud wrote:
| What happened to regulation of private companies?
|
| Easy answer. The modern GOP happened.
| clairity wrote:
| we could get a lot closer if markets were well-regulated for
| fairness and competitiveness. instead, we get all sorts of
| distortions, some well-meaning but most not. one sign of a
| good market dynamic is a lot of medium-sized companies,
| rather than very large or very small, because that means the
| companies have taken advantage of economies of scale but no
| one firm has outsized advantage, so all must still compete.
|
| to that end, google, facebook, et al. should each probably be
| split up into dozens of companies to start, with strong
| privacy, portability and interoperability standards/mandates
| at minimum.
| jhoechtl wrote:
| I wonder how long it will take until your, IMHO, very true
| observations are heard.
|
| Splitting up these internet behemoths is long overdue!
| leppr wrote:
| Heard? By whom? Monopoly capitalism directly benefits
| rulers; complaining to the very people who have
| incentives to keep the system as it is will not change
| anything. You and your fellow citizens have to actually
| do something.
| JasonFruit wrote:
| What makes you think the government is that Messiah? Is it
| likely, in your view, that the government will go out of
| their way to "encourage the spread of misinformation"? I'm
| not seeing that happening.
| paulluuk wrote:
| In the EU, governments are actually trying very hard to
| discourage the spread of misinformation, as well as passing
| legislation that the tech sector has always claimed was not
| needed. So yes, it's very likely, as it's already
| happening.
| duxup wrote:
| >What makes you think the government is that Messiah?
|
| I didn't get that impression from that post...
| bdcravens wrote:
| I don't think izacus suggested it was. My read: there is no
| messiah, and a system of checks and balances is how we
| protect the public interest.
| ezekg wrote:
| I mean, I don't really think this holds up. I have personally
| dropped all Google products I use in favor of privacy-focused
| alternatives. For example, GA>Fathom, Gmail>Protonmail, etc.
| So I'm sure privacy-focused competitors will pop up soon, or
| perhaps they already exist and I simply don't know of them
| because I'm not a big user of the OP's listed Goog products.
|
| The market has most certainly spoken: we prefer privacy, and
| we no longer want to be a product. Competition will come to
| fill those needs. No need for more government regulation. The
| free market works.
| antihero wrote:
| > The market has most certainly spoken
|
| This laughable arrogance. The market has most certainly
| spoken but not in the way you and your microcosm think it
| has. The market doesn't give a fuck about privacy. They
| want convenience and want to pay as little as they can for it.
| Privacy might seem to be on people's tongues at this moment
| and it will continue to be still, but only a core few will
| actually make decisions that inconvenience them in order to
| achieve it. Only a few million people are leaving services
| like WhatsApp en masse. 99% of their customers do not give a
| fuck. They aren't doing drug deals (well some are), they
| aren't dissidents, they aren't terrorists, they're just
| every day people going about their business with likely
| zero real repercussions other than finding it a bit creepy
| and continuing to scroll. There are probably more people
| leaving because things have gotten a bit boring and uncool
| than people taking a principled stand for privacy.
|
| We are, at least currently, outliers. Do not forget that.
| ezekg wrote:
| I'm referring to the segment of the market I'm in. I
| didn't mean 'the market' as in 100% market share. I
| should have said "the market is speaking." No need to
| come off as rude.
| semitones wrote:
| I agree with everything except "I think your fundamental
| error is in the fact that you think" - I don't think this,
| and I don't think my post implied that I think that either.
|
| To clarify my statement: "We should be able to implement
| services like these, that are free of ads, on globally
| distributed infrastructure, with no central authority, to
| have truly free-flowing information." - I don't think that
| the structure provided by a company is sufficient to do this
| properly. Rather than companies/governments, we need
| protocols and standards. For instance, what if we had a
| decentralized app (dapp) built on something like Cardano
| that allowed one to edit docs that lived on IPFS? We might
| have to sacrifice some micro-conveniences (e.g. google docs
| saves automatically for you as often as every few keystrokes)
| to make it tenable on something like a blockchain, but it
| seems feasible.
| the8472 wrote:
| There is a category of companies that doesn't try to be a
| content arbiter. Common carriers, utilities, other services
| that have an obligation to contract.
|
| Of course the "contract" with SNPs is that you "pay" with
| your eyeballs falling on adverts, and the ads in turn don't
| want to be anything like common carriers. So turning any web
| service into something like a common carrier would have
| interesting knock-on effects.
| TazeTSchnitzel wrote:
| I think another error is assuming that having all content
| within a few hyper-scale hyper-global ad-supported commercial
| repositories of everything is a natural or healthy state of
| affairs. Many small websites dedicated to particular things
| is IMO generally better both from a free speech and a
| moderation standpoint than these giants that have to thread
| an impossible needle. In other words, web 1.0 was better.
| rambambram wrote:
| Can't agree more.
| coolaliasbro wrote:
| I often wonder whether the solution is setting the
| expectation with companies that if they act in bad faith,
| their leadership will be om nom nommed? Apologies for the
| metaphor, it's the best I could come up with on short notice.
| nitwit005 wrote:
| > What happened to having specialized agencies regulate and
| inspect industries?
|
| A lot of censorship has traditionally been governments
| threatening to regulate industries if they don't self
| regulate, and that's pretty much what's been going on.
| danuker wrote:
| Doesn't need to be a private company.
|
| For instance, LibreOffice is looking at WASM:
|
| https://wiki.documentfoundation.org/Development/WASM
| mod50ack wrote:
| In fact, there is already Collabora, which is production
| ready, FOSS and based on LibreOffice.
| amelius wrote:
| The fundamental issue is companies dealing with our data,
| when they should just be making hardware or software.
|
| In the old days, computer companies made the hardware and
| government institutions used that to run the internet. This
| is how it should be. Companies should not be touching our
| data directly. They simply cannot handle the responsibility.
| starkd wrote:
| Government regulation is no substitute for competition. The
| ability for consumers to walk away to a suitable alternative
| maintains a continual accountability that a single agency is
| unlikely to deliver.
| avanderveen wrote:
| We still need there to be suitable alternatives, so there
| needs to be some regulation at least, to encourage
| competition and/or to prevent companies from reaching a
| point where there are no longer alternatives.
| macintux wrote:
| I've walked away from Facebook, but I'm still negatively
| impacted by their algorithm which promotes anger, division,
| and conspiracy theories.
| bruiseralmighty wrote:
| This actually highlights the point well I think. The cost
| of walking away from Facebook is too high for most people
| even though they know it's bad for them.
|
| Take myself for example. I only ever get angry when I
| browse my facebook feed and for some reason I've actually
| taken steps to ensure I get angry when I browse my own
| feed (chalk it up to silicon magic). I would like to
| leave facebook, but if I do I'd lose messenger which is
| the easiest and most consistent way for me to keep in
| contact with dozens of people (who mostly don't make me
| angry).
|
| If facebook protocols were open source, then by now I
| would likely have dozens of different options to take my
| friends list to a messenger-only app that does not
| include a feed for me to angrily browse.
|
| Lowering the cost to leave is a benefit to everyone since
| it reduces the reach of that anger inducing feed.
| starkd wrote:
| And how old are you? I don't think it's too much to
| expect you to be able to manage your own emotions by
| now. FB has nothing to do with that.
| rainsil wrote:
| Isn't the Messenger app a "messenger only app that does
| not include a feed for me to angrily browse."? They also
| seem to have standalone web and desktop versions at
| messenger.com
| ai_ja_nai wrote:
| I'd say that this decision by Google stems from regulators
| and public opinion bashing misinformation.
|
| Private companies follow the wind of the policies that are
| there to coerce them. In this case, the policy that is being
| put in place is that misinformation is dangerous and should
| be countered.
|
| In 5 years, misinformation could be relegated to Tor
| networks, where it belongs.
| galangalalgol wrote:
| Executive agencies should inspect and enforce, but not
| regulate. That is the legislature's job, and they cannot
| delegate it beyond "implementation details". We have been too
| lenient with this, so we get the FCC, FDA, EPA, ATF, etc.
| flip-flopping on what amount to laws instead of details with
| every change of administration. That isn't how it's supposed
| to work. Congress decides laws, no one else.
| [deleted]
| philwelch wrote:
| As people remind us all the time, it's not a violation of the
| right to freedom of expression if a private company stops
| letting you use their services. It is a violation of that
| right if the government does so.
| iammisc wrote:
| Well by all means regulate, but realistically, for example,
| if we had a Google Drive run by the 'other side' of the
| political aisle (kind of like Gab v Twitter), then if Google
| banned certain content, it's unlikely the other would, and
| vice versa.
|
| Unfortunately, the issue here is companies responding to
| something other than the market.
| mgraczyk wrote:
| I would much rather live with censorship that can be removed
| by withholding dollars vs one that requires votes.
|
| Just look at the puritanical rules concerning language and
| sexuality forced on broadcasters. Those rules are decades old
| and outdated and will likely remain forever.
|
| When companies fuck up on censorship, the results seem to
| last only a few years. When governments fuck up, the
| consequences echo for multiple lifetimes.
| mikeiz404 wrote:
| > What happened to having specialized agencies regulate and
| inspect industries?
|
| My guess is a combination of regulatory capture and apathy on
| the part of citizens for various reasons.
| yazantapuz wrote:
| On the other hand, government agencies are not saviors
| either.
| ksec wrote:
| >Turns out, not even Apple is that messiah
|
| They _were_, just no longer the same under Tim Cook.
|
| I don't want to derail the discussion into another political
| debate, but my thesis is that some _ideology_ spread like a
| plague in Silicon Valley. The Good vs Evil. As the OP said,
| Google started off being good, but somewhere along the line
| the definition of Good got twisted a little bit. They kept
| thinking they were so righteous that they literally started a
| crusade or witch-hunt (so to speak).
|
| And it is in some way interesting because it rhymes with many
| historical events.
| psyc wrote:
| I'm really sorry for the off-topic, but I've been waiting for
| a chance to ask because I see it here very, very often. What
| are people intending to denote with the "private" in "private
| company", especially where the company is so obviously and
| well known to be public[ly traded]? What are you trying to
| distinguish it from by calling it private? Are you just
| pointing out it's not the government or quasi-governmental?
| Are you emphasizing that it has its own prerogatives?
|
| Sorry, this has just been becoming a peeve for me on this
| site. I just want to know what you're trying to express by
| calling a company that is not privately owned, "private".
| Thanks.
| BitwiseFool wrote:
| I feel the exact same way. I suspect "Private Company" is
| being used in relation to Public/Private Sector and not in
| relation to whether or not the company is publicly traded.
| psyc wrote:
| I feel like your sense of it might be right. But even
| then, you can count the number of public sector
| "companies" (like Fannie and Freddie) on one hand, so it
| just strikes me as such a bizarre distinction.
| ABCLAW wrote:
| It's not a bizarre distinction. The discussion is about
| free speech, and a common talking point has to do with
| comparing the actions of companies to the protections
| laid out for free speech in the American first amendment.
|
| However, the first amendment is a restriction on
| government, not a restriction on private individuals or
| corporations.
| psyc wrote:
| Private equity groups are private. My uncle's
| construction company is private. Google isn't private in
| any sense that isn't confusing. Why not just say that the
| 1st amendment doesn't apply to corporations? That seems
| clear to me.
|
| And as an aside, how many of us really need to be
| reminded, several times per day, of the scope of the 1st
| amendment? Really? Isn't it more likely it's a tired
| debate stopper?
| ABCLAW wrote:
| >And as an aside, how many of us really need to be
| reminded, several times per day, of the scope of the 1st
| amendment? Really?
|
| Yes, that's why people are pre-emptively raising the
| status of the corporations in question as private - to
| head off the discussion you're tired of hearing.
|
| Despite that, instead of discussing what the proper ambit
| of content review should be, 80% of the thread is
| still debating whether or not editorial control should
| exist in the first place; the exact same type of boring,
| rehashed discussion that adds nothing of value.
|
| Now we're here having a meta discussion about the
| discussion that ALSO adds nothing of value, so it looks
| like no matter where we go there's no shortage of ink
| that leads nowhere :(.
| BitwiseFool wrote:
| "Man, I really wish people would focus on discussing
| something more impactful than the same old procedural
| arguments".
|
| Hah! By saying that you're _also_ not focusing on the
| real issue. Got You! Hypocrite much? /s
| ABCLAW wrote:
| That's why I've got a frowny face there :).
| BitwiseFool wrote:
| >"And as an aside, how many of us really need to be
| reminded, several times per day, of the scope of the 1st
| amendment? Really? Isn't it more likely it's a tired
| debate stopper? "
|
| My feelings exactly. It doesn't accomplish anything and
| literally adds nothing to the debate. Does the Eighth
| Amendment's prohibition on excessive fines and cruel and
| unusual punishment _only_ apply to the
| government too? Clearly, we can talk about the spirit of
| the Bill of Rights and apply that to things that aren't
| literally the government.
| iammisc wrote:
| In America you can count the number of public sector
| 'companies' on one hand, but in other countries you cannot.
|
| And every once in a while, the government does take over a
| company - GM, for instance, was government-owned for a long
| time.
| izacus wrote:
| Uhh, this might be a language difference - here the term
| "public" company usually means a company that's majorly
| owned by a government.
|
| Companies owned by private individuals (whether they're
| publicly traded on a stock market or not) are usually
| referred to as "private".
|
| So that's what I meant - a company that is not controlled by
| a government and has no accountability beyond its private
| owners.
| sadosystems wrote:
| You got it, he is emphasizing the fact that google is not a
| part of the government. I agree that this distinction is
| kind of annoying for some reason.
| jedimastert wrote:
| The term "private company" has no relation to whether or
| not it is traded publicly. It simply means the company is
| not owned by or controlled by (in some sense) the
| government, but by "private" individuals
| iammisc wrote:
| The reason a distinction is drawn is because government has
| more rules it must follow when interacting with the public
| than google. Non-governmental entities in the united states
| are less regulated than the government.
| HanayamaTriplet wrote:
| The key is in your last paragraph: the companies in
| question are privately owned in the sense of "private
| property". In the context of (American) public policy, due
| to factors such as the First Amendment, the distinction of
| a company being governmental/public vs non-
| governmental/private is much more likely to be relevant
| than it being publicly/privately traded.
| psyc wrote:
| My point is 0.0000006% of companies are governmental. In
| the context of the discussion, nobody is even thinking of
| those. So, "private" as a qualifier is hot air. It sounds
| ignorant to me.
| izacus wrote:
| Ignorant of what? Publicly (as in government) owned base
| infrastructure companies (like telecoms, to which Google
| et al. are pretty similar in these ethics cases) are not
| rare in the world at all.
| option wrote:
| Competition definitely can make things better. But we aren't
| seeing enough competition, and these companies are engaging
| in anti-competitive behavior without much consequence.
| [deleted]
| ashtonkem wrote:
| I personally think nobody should have this kind of power.
| ukie wrote:
| Hitler would be proud of this comment.
| soheil wrote:
| > It's like hoping for market competition create health
| regulation in the food industry.
|
| Why is it like that and not like gas stations not allowing
| smoking so their property doesn't go up in flames? It's
| really strange to think market forces must always be against
| the consumer; there can be mutual benefit if the right
| incentives are put in place.
| duxup wrote:
| The other truth is we're all outraged when these companies
| host some stuff that we don't like ... and get upset when
| they don't host the stuff we do like.
|
| Consumers aren't rational. Neither are their demands.
|
| I'm probably no more rational than anyone else, but I'm
| honest that I sure as hell don't want to give money to a
| service that is happy to host some violent folks' content /
| garbage...
| einpoklum wrote:
| > The other truth is we're all outraged when these
| companies host some stuff that we don't like
|
| If by "we" you mean US politicians and mass media, then
| sure. But I'm not sure that's true for the general populace
| in the US, nor for other countries.
|
| On another note - I don't like that they host things, at
| all. That is, I don't like that the entity which runs a
| search engine is also the one which hosts a large part of
| the videos available for free on the Internet. Or that a
| company making popular computing hardware like Apple is
| also the host and gatekeeper for mobile apps, podcasts etc.
| HarryHirsch wrote:
| _host some violent folks' content_
|
| Consider the Irish Troubles. Consider the US Civil Rights
| movement.
| robotresearcher wrote:
| Isn't the US Civil Rights movement notable for its
| nonviolence overall? MLK emphasized asserting basic human
| rights, so that the violence of the state should be seen
| more clearly by contrast.
|
| The US revolutionary movement might be a clearer
| example where violent action was decisive.
| throwaways885 wrote:
| > The other truth is we're all outraged when these
| companies host some stuff that we don't like
|
| Please, speak for yourself. I think these companies should
| host absolutely everything[0]. The years 2010-2016 were the
| golden age for these companies actually being free and
| open.
|
| Edit: Within the law. To be honest, there is very little I
| see that should be censored beyond CP.
| ixtli wrote:
| "Within the law" undermines your point. It takes the
| issue and just shifts it to another location on the map.
| It resolves nothing.
|
| We need to collectively grow up and acknowledge that not
| all disagreements can coexist and solve them.
| throwawaysea wrote:
| In America it shifts it to the US constitution, which
| provides a more principled approach to speech than the
| biases of tech companies. Is it a perfect solution? No.
| But it is closer to it than the present situation.
| pasca1 wrote:
| It seems like consumers can't agree on why they are upset
| with these companies. I don't even think we can agree
| that a private company shouldn't be making decisions
| about what information should be allowed or removed.
| oceanplexian wrote:
| +1 I have never been "outraged" by something posted on
| the Internet. Disgusted, disappointed or shocked, sure.
| But in no case did I think the hosting company was
| somehow responsible for it.
| ai_ja_nai wrote:
| > Edit: Within the law. To be honest, there is very
| little I see that should be censored beyond CP.
|
| Self harm, terrorism, revenge porn, fabricated news. Just
| to name a few.
|
| The Internet is very different from what we experienced in
| the 90s. The ingress barrier guaranteed good content (or
| at least entertaining content). Now the barrier to
| content submission has been lowered so much that really
| anything makes it onto the Web, and this is not good. There are
| reasons, after all, for having locks on entrance doors,
| right?
|
| I am quite happy that Google "got the message" from
| regulators that misinformation is a real danger and we
| should apply zero tolerance to web polluters.
| ferdowsi wrote:
| To you it's a golden age, to others this is a period
| where the internet was thoroughly weaponized by
| misinformation agents to undermine democracy and civil
| societies around the world.
| throwaways885 wrote:
| But isn't the flip side of all this censorship a reduced
| ability for people to tell bullshit from reality?
|
| Why is information intelligence not taught in schools?
|
| Much like an immune system, we need to be exposed to
| nonsense so that we're constantly vigilant. There will
| always be a group of nutters who believe in flat earth,
| that vaccines cause autism, etc... Trying to cut that off
| at the source just throws these people into underground
| cults. A more scalable, sustainable solution is to
|
| 1) teach people to do their research - properly, as in,
| don't go on Facebook and join "Flat Earth Society Boston"
| to find The Truth
|
| 2) teach people it's okay to change their minds - part of
| this spreading cultism is that political opinions are now
| core identities
|
| 3) teach people to tolerate opposing viewpoints, even the
| silly ones - point and laugh, but don't try to cancel and
| destroy their lives
|
| Another thing - every time a tweet or document is
| censored, the replies are generally cut off as well. How
| can people learn to distinguish true from false if they don't
| get to see examples of people being wrong and corrected?
| pjc50 wrote:
| How is the education system _today_ going to help people
| who went to school in the 1970s?
|
| Conversely, what should we teach children today about the
| information threats of the 2050s?
| throwaways885 wrote:
| A proper fix is better than an instant fix.
| pjc50 wrote:
| Sure, but in the meantime the misinformation voters get
| to pick the textbooks. This is a bit like educating
| people in fire prevention when the forest is already
| ablaze.
| ABCLAW wrote:
| >But isn't the flip side of all this censorship a reduced
| ability for people to tell bullshit from reality?
|
| No. People have limited processing cycles in their heads,
| and you never notice the bullshit you fall for, so you
| can't correct for errors you're making. Critical thinking
| skills are great, but they don't provide you expert-level
| knowledge in every field, nor can they. Sometimes your
| educated heuristics are just plain wrong and someone else
| has better information you don't have access to. I see
| this all the time here with content in my field -
| developers just get law wrong all the time.
|
| You might remember an era when email inboxes were FLOODED
| with a deluge of dick pills, get rich quick schemes,
| Nigerian prince scams, and other low-effort, low-value
| content. Sure you might be able to avoid clicking on
| garbage, but the general health of your inbox declined
| dramatically.
|
| Does that mean society is going to end because we've
| trampled upon the rights of the latest Cialis replacement
| to spam my inboxes? Probably not.
| ajsnigrutin wrote:
| You have Johnny, saying you should wear a mask when you
| go to the store, because it helps you against covid. And
| Bobby says you shouldn't, because there's no evidence it
| helps, and it might even be worse for you if you
| wear it wrong.
|
| So you're suggesting we should remove Johnny's
| fearmongering conspiracy post, because we should listen
| to the experts?
|
| https://edition.cnn.com/2020/03/30/world/coronavirus-who-
| mas...
|
| Experts clearly say that "There is no specific evidence
| to suggest that the wearing of masks by the mass
| population has any potential benefit. In fact, there's
| some evidence to suggest the opposite in the misuse of
| wearing a mask properly or fitting it properly,"
|
| Expert knowledge by WHO.
|
| ...and a month later, we should delete Bobby's post too?
| Do we undelete Johnny's post?
| ABCLAW wrote:
| This is actually pretty interesting because your post is
| premised upon a massive misreading of what actually
| happened. It shows you can spread disinformation while
| thinking you did your research.
|
| The expert knowledge was that mass usage of masks early
| on in the pandemic could prevent frontline healthcare
| workers from accessing needed supplies while logistics
| spun up to meet demand.
|
| "There also is the issue that we have a massive global
| shortage," Ryan said about masks and other medical
| supplies. "Right now the people most at risk from this
| virus are frontline health workers who are exposed to the
| virus every second of every day. The thought of them not
| having masks is horrific."
|
| This is literally the third paragraph.
|
| When community spread began to drive the bulk of new
| infections and we've had months to spin up production on
| masks, obviously mass adoption changes in value.
| ajsnigrutin wrote:
| I literally quoted the paragraph, where they said that
| there's no evidence to suggest that the wearing of masks
| by the mass population has any potential benefit.
|
| They didn't say "it helps, but we're unable to call
| Walmart and buy all their stock, so we're asking you not
| to buy them, so we can", they said that there's no
| evidence to suggest that the wearing of masks by the mass
| population has any potential benefit... those are two
| different things.
| ABCLAW wrote:
| I've actually watched the briefing you're referring to.
| Go to 26:00 - 28:00 https://www.pscp.tv/w/1OyJAYoodRnJb
|
| The very next statement by Dr. Ryan "There also is the
| issue that there is a massive global shortage, and where
| should these masks be, and where is the best benefit?
| Because one can argue there's a benefit in anything, and
| where does a given tool have its best benefit? And right
| now the people who are most at risk are frontline health
| workers [...]"
|
| The follow up statement is also very emphatic that this
| is about mask allocation due to constrained supply. I
| don't get why you're trying to ignore the very clear
| context of the statement.
| throwawaysea wrote:
| You seem to be making an argument from authority by
| leaning on experts, and I don't _fully_ disagree with
| that approach either. But trusted authorities regularly
| betray trust, and use their label of expertise to push
| their own agendas. A recent example is found in the false
| attribution of the PNW heat wave to climate change
| (https://cliffmass.blogspot.com/2021/07/flawed-heatwave-
| repor...). They also can be wrong solely due to making a
| mistake (COVID had many examples of this with rapidly
| changing guidance). So you aren't free from the need for
| critical thinking skills, because in the most important
| matters you have to still challenge them. To be able to
| do so, you need to have trained that muscle beforehand.
| ABCLAW wrote:
| >You seem to be making an argument from authority by
| leaning on experts
|
| I don't think that's the core of what I'm saying. I'm
| saying people have limited mental time, so devoting an
| unlimited volume of time to sorting through bullshit is
| not feasible.
|
| Everyone is going to need to make choices, but if
| statistically the options presented before them are
| better, we'd expect better outcomes in general.
|
| 'Critical thinking' is one of those things that people
| keep raising as the catch all solution. This line of
| reasoning states that it doesn't matter what options are
| on offer because people will calculate the best ones!
| Unfortunately, they don't.
|
| Most of our language fluency tests rate people's skills
| in this area; take the ACTFL scale for instance. The sad
| reality is that when people are provided with language
| based reasoning testing, many people perform fairly
| poorly due to common errors, even in test-based
| situations. People misread statements, misunderstand
| their meaning, have trouble moving from specific to
| general or vice versa, have difficulties tailoring their
| message to their audience, etc.
|
| In general, the very HIGHEST level of linguistic ability
| in specifically tested scenarios is what we assume out of
| everyone as the baseline when having these discussions
| about social discourse. This is an unreasonable starting
| point.
| wizzwizz4 wrote:
| > _Much like an immune system, we need to be exposed to
| nonsense so that we 're constantly vigilant._
|
| No: much like an immune system, we need to be exposed to
| _vaccines_ (i.e. education on how to spot deception,
| knowledge of what scams are currently going on). Enough
| people trying to deceive you, and eventually someone will
| succeed.
| mikepurvis wrote:
| Indeed, and that was also the period where Reddit happily
| hosted a whole bunch of extremely tasteless and
| borderline illegal communities centered around things
| like pictures of overweight people and sexualized
| children.
|
| A quick googling suggests that the first wave of closures
| was in 2015 [1] after a crushing wave of negative
| publicity and advertiser pullouts. The other really high-
| profile one was r/The_Donald, which wasn't closed until
| 2020 and even has its own Wikipedia article. [2]
|
| [1]: https://www.washingtonpost.com/news/the-
| intersect/wp/2015/06...
|
| [2]: https://en.wikipedia.org/wiki/R/The_Donald
|
| ----
|
| EDIT: Actually, looks like r/jailbait was closed much
| earlier, in 2011, per https://en.wikipedia.org/wiki/Contr
| oversial_Reddit_communiti...
|
| But certainly prior to June 2015, there was much, much
| broader tolerance for racist, sexist, transphobic content
| on the site.
| ajsnigrutin wrote:
| > But certainly prior to June 2015, there was much, much
| broader tolerance for racist, sexist, transphobic content
| on the site.
|
| And now it has moved to other, more hidden platforms,
| where there is no one to write counterarguments, and where
| sometimes (Tor, Freenet, ...) it is impossible to identify
| someone who writes actual threats and not just "yo momma
| so fat..." jokes.
| mikepurvis wrote:
| I mean, that's literally what most of the posts are
| arguing for in this thread-- that this needs to be a
| watershed moment to get serious about distributed,
| uncensorable alternatives to products like Google Drive.
|
| But in any case, "people will find alternatives" has
| never been a valid reason not to act (either here or in
| other popular cases such as guns/suicide). There is _real
| value_ in having standards of conduct that go above and
| beyond the bare minimum of "not illegal." Moral and
| ethical value, of course, but also economic-- in the
| reddit case, ultimately being a place that was viable for
| ad spends by mainline brands who wouldn't want to be
| associated with a site whose public image was that of
| being a safe space for hate speech.
|
| Google is a little different since there's no r/all page
| for GDrive that can be gamed to show this content, and
| nor is Google likely as worried about the safety of its
| reputation as an advertiser.
| objectivetruth wrote:
| > And now it has moved to other, more hidden platforms
|
| Not Reddit's fault nor problem.
|
| > where there is no one to write counterarguments
|
| Yeah, I don't think there were a ton of countering voices
| trying to teach the vile racists or the pedophiles the
| errors of their ways.
|
| > impossible to identify someone who writes actual
| threats
|
| That's an argument against the existence/legality of
| anonymous networks -- probably not going to go over well
| here.
| slg wrote:
| >Within the law. To be honest, there is very little I see
| that should be censored beyond CP.
|
| Within the law of which country?
|
| Should copyrighted content be blocked in the US? What
| about in the Netherlands?
|
| Should Holocaust denying content be blocked in Germany?
| What about in the US?
|
| Should anti-CCP content be allowed in China? What about
| outside China?
|
| If you want to do the bare minimum according to the law,
| you are going to need a different implementation for
| every country. And even then people in countries with
| more lax laws are going to think you are acting
| unethically by censoring content in more restrictive
| countries and vice versa.
|
| EDIT: I have no idea why a comment that amounts to
| "different countries have different laws" is being
| downvoted.
| laumars wrote:
| > _Edit: Within the law. To be honest, there is very
| little I see that should be censored beyond CP._
|
| The thing is, people are never going to agree where the
| line is drawn. So I'd rather let individual companies
| decide where they draw the line, and if that happens to be
| not where you'd agree, then you can go support an
| alternative - of which there are many - that draws the line
| elsewhere. And for those who don't want a private entity
| owning the infrastructure, there are always solutions like
| IPFS.
| koonsolo wrote:
| You clearly don't have kids that consume content on the
| internet.
| ithinkso wrote:
| This is what I think people often miss in those
| discussions. There is an assumption that everything
| shared/'hosted' online is someone's opinion and thus
| should be allowed to be shared. This was never the case on
| the internet - don't trust what you read online. What
| should be regulated is what can be advertised, because
| most of what you read is indistinguishable from an ad.
| Smoothies cure cancer - an ad, or someone who believes some
| antivax-type bullshit? What if it's an ad to vote for a
| particular party because the other will do unbelievably
| bad things? Fanatic or serious opinion? Should you allow
| those kinds of posts on your platform?
|
| In my opinion you should, just because it is the only way
| to make sure that your platform is not taken seriously
| and will prompt people to read something a bit more
| serious. It will stir controversies on Twitter, but who
| IRL seriously considers the opinion of a Twitter person?
|
| Edit: I went for a smoke and thought about it a bit more;
| take a look at Voat vs Reddit. Voat was created as a
| response to censorship on Reddit, and look at the cesspool
| of a place it is. Companies do not censor content because
| they have a moral stance or a political agenda; they
| moderate the content because otherwise it will turn into
| shit you wouldn't believe (HN does it too). There are
| many more trolls on the internet responding to everything
| they can than legitimate people trying to have a
| discussion.
|
| A new Eternal September started with social media, and those
| not hardened by the internet before have a hard time just
| dismissing what they read as 'a troll'.
| bserge wrote:
| Well then 00-10 was the platinum age. Few behemoths, lots
| of smaller companies, less focus on marketing and money,
| fewer idiots online, a focus on hosting your own stuff
| from 0.
|
| But I agree, platforms should be neutral. And people
| should still strive to control as much of their online
| properties as possible. Not just give it to Amazon and
| Google.
| eigen wrote:
| > Well then 00-10 was the platinum age. Few behemoths,
| lots of smaller companies, less focus on marketing and
| money, fewer idiots online, a focus on hosting your own
| stuff from 0.
|
| Eternal September was 1993. Maybe this golden age of the
| internet where everyone was civil and educated and almost
| nothing was subject to censorship didn't really exist.
|
| [1] https://en.wikipedia.org/wiki/Eternal_September
| zuminator wrote:
| Host absolutely everything? Instructions on creating
| explosives, chemical and biological weapons? Some future
| doomsday weapon? Would you say everything should be
| available up to and including methods for any unbalanced
| individual to single-handedly kill thousands or millions?
| How about your personal details, ID, address, employer,
| medical history, surfing, shopping habits? Would you even
| be ok with hackernews piercing the veil of your
| "throwaways885" username and publishing it?
| throwaways885 wrote:
| What I'm suggesting is that neither I nor large
| corporations are capable of making that distinction. Nor
| the government for that matter.
| dahfizz wrote:
| Platforms are not publishers. The publisher of these
| things should face legal action (including the removal of
| their content). It's not for the platform to pick and
| choose.
| mahogany wrote:
| Can you define "platform" in this context? Is this a
| legal term?
|
| It's interesting to see this dichotomy between platforms
| and publishers in these types of threads. I assume it
| stems from a reading of Section 230 somehow, but the word
| "platform" never appears in that text.
| Dylan16807 wrote:
| Strictly speaking, "platform" is an umbrella term, and
| people are actually distinguishing between publisher and
| non-publisher.
| heavyset_go wrote:
| > _Between 2010-2016 was the golden age for these
| companies actually being free and open._
|
| It was in the early part of the 2010's that Google,
| Twitter and others started censoring Islamic content in
| the name of antiterrorism and stopping the spread of
| extremism on their platforms.
| yibg wrote:
| I think this is one of those things that sounds nice on
| the surface but isn't really tenable in reality.
|
| Should they only host or also surface "everything"? Is
| deranking something censoring? What about not promoting
| something? If the idea is every thing is given equal
| weight, then things pretty quickly become a cesspool.
| diego wrote:
| The law where? That is part of the problem. Every piece
| of content would need to have hundreds of flags, one for
| each jurisdiction in which the content could be viewed.
| That may not be enough for some countries given how easy
| it is to fake your origin address. It's solvable but not
| an easy problem.
| michaelmrose wrote:
| If I host a document from a certified not-a-doctor telling
| you that you should treat your children's autism by
| feeding them bleach - which will certainly constitute
| abusing all of them and perhaps killing some - do you think
| online marketplaces of ideas should ignore the fact that
| half the population is dumber than dirt, ignore the dead
| kids, and keep serving up poison?
| reedjosh wrote:
| > fact that half the population is dumber than dirt
|
| I think you're going a little far there, and the fact
| that you're using this to justify censorship is pretty
| ugly.
| michaelmrose wrote:
| The case I gave is not in any way hypothetical: there were
| many popular, actual self-published ebooks on Amazon
| instructing you to abuse and possibly kill your kids with
| bleach to cure their autism.
|
| https://www.cnbc.com/2019/05/28/amazon-removes-books-
| promoti...
|
| I'm not going too far; I'm speaking empirically. Almost
| 1/4 of the population has an IQ of less than 90 and is
| empirically challenged, and a substantial portion of the
| remainder, including those with reasonable or even high IQ,
| are completely dysfunctional because, regardless of how
| functional a brain they were born with, they have
| basically ruined it by training it only to consume and
| create trash.
|
| One has only to talk to a large enough number of one's
| fellow humans to realize half of them are in fact dumber
| than dirt. If they weren't, literature about bleaching
| your child's insides to cure their autism, or other
| insanity of the same grade, would find no takers.
|
| If you find the number of bleach swillers insufficient,
| consider that nearly 40% of us in America believe that a
| genie created the earth less than 10k years ago.
| Overwhelmingly this is because they do not possess the
| intellectual aptitude to dismiss this theory. If their
| brains were highly functioning they would do so despite
| conditioning. Plenty of people will live 70-100 years and
| pass away in their hospital bed without ever having
| turned their brain to the on position.
|
| On the flip side, others aggressively question the reality
| they are given, but because they lack the inherent
| intelligence or have spent their entire intellectual life
| consuming the equivalent of junk food, they are utterly
| incapable of discerning the difference between insane
| fantasy and truth.
| landryraccoon wrote:
| That's your choice.
|
| Isn't the point of the free market that consumers are
| free to vote with their dollars?
|
| I don't want to give my money to companies that host
| things that I find morally abhorrent. That doesn't mean
| other companies can't host that data, simply that I don't
| want any of my dollars to go to those companies.
|
| If the vast majority of consumers also don't want to give
| money to companies that host that content, that's the
| hosting company's problem. The free market is speaking.
| Nobody is making it illegal for that data to be hosted,
| but nobody is obliged to pay for it to be hosted. So it
| seems to me that things are working as intended.
| theandrewbailey wrote:
| > I don't want to give my money to companies that host
| things that I find morally abhorrent.
|
| Isn't that a sign of a moral panic? Back in the 90s, my
| parents didn't want to patronize companies that signed
| deals with pornographers. Even though I was only 10, that
| sounded ridiculous. Good luck finding any large company
| that doesn't.
| michaelmrose wrote:
| If people won't do business with people who do business
| with pornographers, what right do the pornographers or
| porn aficionados have to object? If it makes it
| clearer: it wouldn't matter if the subject were
| avocados. It's not a moral judgement on the worthiness of
| porn; it's about consumer choice in aggregate.
| lostlogin wrote:
| Can you explain this? Are you referring to tech only?
| Maybe I'm way too oblivious but of the large companies I
| pay money to I find it hard to imagine that most are
| signing deals with pornographers.
| Chris2048 wrote:
| Did the local utilities (water, electric) refuse service to
| pornographers, and did your parents boycott those?
| jjeaff wrote:
| Just how many modern companies do you think are actively
| dealing in pornography?
| landryraccoon wrote:
| I don't see what's changed. Youtube doesn't host
| pornographic content either. The Apple App store doesn't
| allow adult content of any kind. If you're saying maybe
| Youtube and the Apple store should grow up a little and
| allow adult content I'm sympathetic but the horse has
| long left the barn on that one.
| eurasiantiger wrote:
| > Youtube doesn't host pornographic content
|
| While they don't host porn, they do host a host of porn
| performers.
| teh_infallible wrote:
| I think the whole concept of "voting with your dollars"
| is flawed. Google will always have more votes than you.
| derek_codes wrote:
| What? So because someone or something has more money than
| us, we should just say screw it and buy products we are
| morally against? That makes no sense.
| nybble41 wrote:
| The analogy to voting is flawed in that actual voting is
| much too powerful, because it involves politics and the
| use of force. When you "vote" with your dollars your
| choices are to either support something or ignore it and
| spend your money somewhere else. The amount of support
| you can provide is limited by the economic surplus you've
| created, and no matter how much money you amass you can't
| just shut down something you dislike as long as other
| people continue to support it. Voting in the political
| sense lacks these safeguards: a majority (or vocal
| minority) can subsidize their pet projects with the
| opposition's resources, or prohibit harmless activities
| merely because they find them distasteful.
| jlawson wrote:
| No, because just a handful of companies control the
| entire ecosystem. There is no alternative, you cannot
| escape their influence and you cannot function without
| interacting with them.
|
| If Google and Apple ban you from their products and
| platforms, and Visa shuts down your payment processing,
| there is nowhere to go. You're done. This and much more
| has happened many times.
|
| You're making the "just build your own internet bro"
| argument.
| mason55 wrote:
| But I think we need to examine both sides of this.
|
| When the American concept of free speech was coined, it
| was valuable because you could go stand in a government
| owned square and communicate your message for free. It
| was a good balance between not forcing private companies
| to accept speech but still allowing the speech to happen.
|
| Online we don't have the concept of a government run
| square, and so your speech can be totally stifled by
| private companies.
|
| But the difference is that when you're standing in the
| town square shouting nonsense, your reach is constrained,
| your ability to reproduce your speech is low (you have to
| just stand there and keep shouting) and everyone knows
| who you are. Damaging speech just can't be that damaging.
| Online is totally different.
|
| I think the argument of "Google can't censor you, only
| the government can" is not great because there's no gov't
| equivalent of the town square. But I don't think that
| "make Google accept all speech" or "create a gov't
| equivalent of the town square" is necessarily the
| answer either. I think we should be starting from first
| principles and understand what free speech is trying to
| accomplish and come up with a framework that helps us
| accomplish it.
| shkkmo wrote:
| The history of printing things for wide scale
| distribution well predates the first amendment and it is
| silly to pretend otherwise.
| inlined wrote:
| But the history of forcing those publications to host
| your opinion is unprecedented
| Turing_Machine wrote:
| Historically there were two modes of distribution,
| "publishers" and "common carriers".
|
| Publishers (like newspapers) had full control over their
| content, and also had full responsibility for it (e.g.,
| if they printed something libelous, they could be sued).
|
| Common carriers (like the phone company) had no control
| over the content, and no responsibility for it, either
| (you couldn't sue the phone company if someone used the
| phone to plan a crime, for instance).
|
| Google and their ilk want to have it both ways. They want
| the full control of publishers, and the zero
| responsibility of common carriers.
|
| Historically, power without responsibility has invariably
| been a recipe for abuse.
|
| I think they should have to choose one or the other.
| throwawaysea wrote:
| This is the point SCOTUS Justice Clarence Thomas made in
| his recent opinion, that communication networks like
| social media should be regulated as common carriers:
| https://reclaimthenet.org/justice-clarance-thomas-big-
| tech-p...
| [deleted]
| tsimionescu wrote:
| Pretending free speech was only about the town square is
| ahistorical.
|
| Free speech has always been about distribution as well -
| publishing a book or a newspaper, distributing pamphlets
| - those had similar reach to a random FB post or YT video
| today (in terms of percentage of the population).
|
| Of course newspapers had no obligation to carry anyone's
| message, but, far more importantly, a newspaper couldn't
| be censored by government for printing stuff the
| government didn't like.
|
| It's also important to remember that there used to exist
| many more newspapers - factories would have newspapers,
| most towns would have one or two, many clubs and similar
| organizations would have one.
| michaelmrose wrote:
| Why can't you do business with cash/money orders and run
| a website on your own hardware exactly?
| GekkePrutser wrote:
| Doing the money order thing is so inconvenient these days
| that 90% of consumers will just skip you. It's simply not
| practical.
| rOOb85 wrote:
| Sounds like they just need to pull themselves up by their
| bootstraps and work harder.
|
| They could also accept checks via the mail. Or cash via
| the mail. Or cryptocurrencies. Or gift cards. Or barter.
| There are multiple options available to them.
| GekkePrutser wrote:
| Those things are all really inconvenient. Really, who
| wants to go out and mail something and wait for it to
| arrive? And gift cards risk the same kind of banning by
| credit card companies. This is going to limit any
| business to the point of inviability.
|
| The only thing you mention there that is a serious
| alternative is cryptocurrencies. And those are constantly
| fluctuating, needing very complex hedging against sudden
| value changes. And still something that most consumers
| will really struggle with. Most will not have a clue how
| to obtain crypto or how to deal with it (safely).
|
| It might work for a really highly educated niche, but not
| for 99% of consumers. They just want to put in their
| paypal or credit card details and click buy now.
| nobody_at_all wrote:
| I don't do business with either Google or Apple.
|
| I don't need them to do anything at all.
|
| There are competitors to Visa.
|
| I can do anything I want without Apple, Visa, Google and
| even Amazon and Microsoft.
| 13of40 wrote:
| The impact of that is asymmetrical, though. A company
| that's making 1% profit (thinking a rural TV station
| here, not necessarily Google) can't afford to lose 10% of
| its viewers, so they dial down their programming to the
| least objectionable possible. That's how we get in a
| situation where most people want to watch Breaking Bad or
| Game of Thrones but what they get are Brady Bunch reruns.
| LudwigNagasena wrote:
| There is no free market in big tech.
| TeMPOraL wrote:
| There's little free market for the consumers _in tech in
| general_. The barriers to entry are extreme - between
| inherent complexity of the products, enormous capital
| investments required, and scale-related benefits
| (economies of scale, network effects), few can afford to
| start a real competitor. You can 't _just_ make a new
| Facebook or a new smartphone or even a new coffee machine
| - not without making a faustian bargain with investors, a
| deal which is usually the root problem behind why
| technology sucks.
|
| Consumers only get to choose out of what's available, and
| in this market, it's hard for an entrepreneurial consumer
| to _make_ some things available when the market isn't
| providing them.
| madmax96 wrote:
| That is the point of the free market. But deplatforming
| certain kinds of offensive content is regressive.
|
| For instance, in the not-so-distant past many Americans
| found interracial marriage morally abhorrent. Wikipedia
| says only 5% of Americans thought interracial marriage
| should be legal in the 1950s. In today's environment,
| that kind of majority opinion would lead to deplatforming
| those who supported marriage equality. This is not
| something we would desire.
|
| People are full of prejudices. I'm sure our grandchildren
| will look back in horror at ours. Let's not deplatform
| them for that.
| IX-103 wrote:
| There's a difference between disagreeing with a position
| and finding it "abhorrent".
|
| Your evidence with regards to interracial marriages does
| not support your point. Whether "something can be
| discussed" is an entirely different question than than
| "do you support position X". The very fact that they
| _were able to take a poll_ is strong evidence that
| talking about it was not verboten.
| obedm wrote:
| I always think I can draw the line in the sand as a very
| rational and relatively well read person.
|
| But then I remember that the best thinkers the world has
| ever seen (Plato, Aristotle, Cicero, ad nauseum) were
| never able to look beyond their noses to see the human
| suffering of others.
|
| Aka, they were perfectly happy to have a society run by
| slaves, to ignore the plight of the poor and sick, etc.
| jampekka wrote:
| Perhaps these are the "best thinkers that the world has
| ever seen" because they said stuff that was beneficial
| for (some) powers that be. E.g. the Plato-Aristotle
| line/myth is directly linked to Alexander the Great.
|
| Worth noting that there were very influential thinkers
| and entire schools of thought that looked beyond their
| noses. A good example is the cynics/Diogenes the Dog, who
| may well have been more influential than the Platonic
| line. E.g. (as per the anecdotes we have left) Alexander the
| Great had great respect for Diogenes, who totally
| ridiculed Alexander's (and Plato's) position.
|
| Also stoics (e.g. Marcus Aurelius) are quite direct
| descendants of cynics and not ashamed of this at all.
|
| The more I look into classical philosophy, or the "myth" of
| academia, the more it seems that it's mostly a fabrication
| of, perhaps, the scholastics.
| jjeaff wrote:
| That may be true of Plato, Aristotle, etc. But one thing
| I have learned from history is that there is almost
| always a contingent of people that did find terrible
| things like slavery abhorrent and were even outspoken
| about it. But if you are an elite, and benefit greatly
| from something, you are probably much less likely to be
| outspoken against it.
| obedm wrote:
| I've never heard of any ancient philosopher who found
| slavery and the like abhorrent, and I've read that
| there weren't any.
|
| Could you point to some readings if you're aware of it?
| I'd love to know
| MatteoFrigo wrote:
| Seneca had a somewhat more humane attitude towards
| slavery. See e.g. https://figsinwinter.medium.com/seneca-
| to-lucilius-47-on-sla...
| spaced-out wrote:
| At the same time, the Civil Rights Movement aggressively
| used boycotts, protests, and shaming campaigns to push
| for racial equality.
| synergy20 wrote:
| You're behind if you still think about 'equality'; it's
| all about 'equity' these days - 'equality' is no longer
| good enough.
| wslack wrote:
| Tolerance of wrong viewpoints is different than the
| active support algorithms give to discovering false
| content.
| throwawaysea wrote:
| It's not "active" support if the algorithm acts in a
| content neutral fashion, for example based on engagement
| metrics. In such a situation, changing the algorithm to
| artificially not let allegedly-false content be
| discovered is actively supporting the opposing
| viewpoints. Leaving the algorithm to act without
| artificial content-specific modification is not active
| support. Tolerance would be leaving the algorithms alone.
| AlbertCory wrote:
| Former Googler here (11 1/2 years, including in Ads)
|
| The idea that algorithms are "neutral" is laughable.
| There is a loosely organized group of activists out there
| who are aware of how these algorithms work and actively
| manipulate them.
|
| "Engagement metrics" are nothing more than these people
| pushing the buttons.
| onetimeusename wrote:
| I don't really understand why tech companies, like
| Google, go so far out of their way to maintain the image of
| being neutral. I agree they have a right to censor
| content they choose for whatever reason but what I don't
| understand is why they try appear to be neutral about
| their decisions. It feels like everyone is aware of what
| is going on, even other commenters who support Google
| censorship admit they approve of the bias.
|
| So why do tech companies cling to this line of being
| neutral when no one really seems to accept it and they
| themselves have no intention of being neutral? I feel
| like there wouldn't be any conflict about policies or
| complaints they have to deal with if they were more
| honest. Maybe it has to do with section 230. I don't know
| but I feel like we would be better off if consumers had
| more information.
| nobody_at_all wrote:
| There is a difference between opposing viewpoints and
| factually incorrect information that is destructive.
|
| If your view is based on provable falsehoods, your view
| is worse than valueless.
|
| Tolerating these people is harmful to everyone, but that
| is not why Google is banning it. It is because it is
| harmful to Google's bottom line.
|
| At the end of the day, Google owns their servers and can
| say what is allowed and what is not.
| peytn wrote:
| Google is a legal construct. I can't go have a coffee
| with Google. I can't get a high five from Google. Google
| will do whatever our laws say it has to do in exchange
| for liability protections for its owners.
| jjeaff wrote:
| I disagree. I think the algorithms are fundamentally
| immoral because they promote content that gets
| "engagement". Which includes and in many cases
| prioritizes content that people have engaged with because
| it causes a negative response. Rather than pushing good*
| content, it prioritizes lowest common denominator,
| reality tv, desperate pundit, fast food, self
| congratulatory, outrage porn garbage.
|
| *By good, I simply mean thoughtful, high quality,
| factual, educational, or otherwise uplifting content
| regardless of politics
| ShroudedNight wrote:
| I don't think "Good" is unambiguous enough to trust the
| platforms to promote it. How about simply "related"? Show
| people the content they've explicitly asked for. If
| people explicitly ask for outrageous content, then fine,
| but we needn't force feed it to society.
| acituan wrote:
| We're not talking about individual choice but an inherent
| ir/rationality in censorious behavior.
|
| The vast majority of consumers probably make pragmatic
| rather than idealistic consumption choices. Eg when you
| source a new iPhone, you source certain unethical labor
| practices. When you make use of the US dollar, you make
| use of some amount of atrocities that built its
| international purchasing power (eg any of the petrodollar
| wars). I bet those rarely bother even the most so-called
| "idealistic" consumers because a) it is a hard calculus
| to compute b) it is impossible to live when every
| "impure" thing is removed from use.
|
| The difference with public content hosting is being able
| to twist arms to make them take down stuff and conform to
| an _image_ of virtuousness which we narcissistically and
| psuedo-religiously identify with. It is not about the
| real damage the contents pose, it is our intolerance to
| being seen as a "person who can use such sites". The
| threat is to our confirmation bias, in this case the
| confirmation of an idea that there is a clear right and
| wrong and we are definitely right.
| landryraccoon wrote:
| I'm confused. I don't see the problem. What's the
| difference between the image of virtue and just virtue?
|
| If the majority of people believe that images of
| virtuousness are what they want, then that's just what
| they want. People aren't computing the outcome, their
| ethics are based on appearances and always have been. The
| internet doesn't change that fact. Whether it was in the
| middle ages or the post-industrial period or today,
| virtue has always been performative. So I don't really
| see what you think your argument demonstrates.
| oceanplexian wrote:
| What does the "Majority of People" mean?
|
| The enlightenment concept of free speech is likely a
| minority viewpoint among the people of the world. However
| I also think it is the correct view and that corporations
| or governments looking to censor speech are infringing on
| human rights, and believe in fighting for it the same as
| I would fight against racism, slavery, religious
| discrimination, authoritarianism, and so on. Just because
| a lot of people believe something is ok does not make it
| right.
| acituan wrote:
| > What's the difference between the image of virtue and
| just virtue?
|
| What's the difference between a real car and a perfect
| cardboard replica? One has functionality and interiority.
| The other is just exteriority.
|
| In the King Midas story, he wants everything to be
| mindlessly golden and thus turns it all into unusable shiny
| crap. He got the golden exterior alright, with none of
| the real goldenness, the goodness it would have afforded him.
| landryraccoon wrote:
| Your example is a bit facile. A cardboard car does not
| function as a car does. It sounds almost like you are
| saying something but you never explain exactly what's
| wrong with performative virtue.
|
| But you have to define what is virtuous before you can
| claim that performative virtue is "fake". The young
| people of today are no less virtuous than their elders,
| despite the fact that their elders (like virtually every
| generation before them) complain that they are immoral.
| nobody_at_all wrote:
| You start with a faulty premise: google services are
| public, so you have no valid point.
|
| Build your own data center and upload anything you want.
| Don't demand that a private company give you an
| unfettered soapbox.
|
| Also "source a new iPhone"? Can we dispense with the
| meaningless buzzwords, please?
| gcanyon wrote:
| This is reasonable as an ideal, but could be harmful in
| practice. A significant percentage of the U.S. population
| believes inarguably wrong and demonstrably dangerous
| things at this point. It is possible that the only
| effective way to fix that is corporate censorship. That
| wouldn't make me happy, but I'm not going to agree with
| letting a crazy person steer the Titanic into an iceberg
| just because it's their right to do so.
| lenkite wrote:
| Until they commit crimes and are formally charged and
| prosecuted in a court of law, they can believe anything
| they like.
| labawi wrote:
| I guess education and honest information would be too
| radical an approach.
|
| Governments and leaders are concerned that people don't
| trust them, yet the truth is, they don't deserve to be
| trusted. When the system is designed to create a placated
| populace instead of critical thinkers, those in charge
| are routinely lying and blatantly misleading instead of
| informing, then it's no surprise people will believe in
| all kinds of fringe ideas.
|
| Hiding and shunning information can be a temporary band
| aid, but the inevitable effect is that people will trust
| official sources even less.
| hanselot wrote:
| Demonstrably used to mean "provably so", but now it's
| more akin to "debunked by made-up website X"
| LudwigNagasena wrote:
| Well, remove their voting rights then if you think that
| they are too stupid. Why should I be the one who suffers?
| tomjen3 wrote:
| >A significant percentage of the U.S. population believes
| inarguably wrong and demonstrably dangerous things at
| this point
|
| Yes. Both sides can agree that they think the other
| believes in falsehoods. Since one person has one vote,
| there is effectively nothing you can do about it.
| native_samples wrote:
| Ah, but how do you know you're not the one believing
| crazy wrong and dangerous things? All those terms are
| highly relative and if your answer is argument by
| authority, well ...
| prox wrote:
| Because there are an awful lot of knowledge domains where
| there is consensus among experts, and one can verify
| their own knowledge along those lines.
| wtetzner wrote:
| The consensus of experts has been wrong many times
| throughout history.
| katbyte wrote:
| There is always a line and content that is illegal, and
| the question will always be where that line is drawn.
| nybble41 wrote:
| A straightforward reading of the 1st Amendment indicates
| that there can be no such thing as illegal content under
| US law, including whatever you are right now considering
| proposing as an exception. An act _involving_ speech may
| be illegal; if you make a credible threat of imminent,
| irreversible harm others are free to take you at your
| word and defend themselves--the justification here is the
| harm which is reasonably expected to follow, not the
| content of your speech. If you lie to someone to obtain
| their property under false pretenses, knowing that the
| lie precludes any "meeting of the minds" and thus
| renders the contract invalid such that the property still
| belongs to them, then you are committing theft. Your
| punishment derives from the act of taking property which
| did not belong to you, not the fact that you lied in the
| process. And of course what you say may be used as
| evidence against you without the speech itself being
| illegal.
|
| There are some more problematic areas where the
| Constitution itself is inconsistent. Copyright should not
| exist, for one; the core concept is utterly incompatible
| with freedom of speech. The Supreme Court even recognized
| this at one point--it's why we have the concept of "fair
| use" in the first place--but "fair use" is a poor
| compromise which does not fully negate the infringement
| of the freedom of speech. When you have one clause saying
| that Congress has the power (but not the obligation) to
| do something which would infringe on the freedom of
| speech, and another clause later passed as an amendment
| saying "Congress shall make no law... abridging the
| freedom of speech", the obvious reconciliation of these
| clauses is that Congress is barred from exercising that
| enumerated power because it would violate the later
| amendment. The Court tried to strike a balance instead...
| but it still amounts to Congress passing a law which
| abridges the freedom of speech, despite the limitations
| imposed by the Court.
| tsimionescu wrote:
| There was one other notable exception to free speech that
| undermines your point: obscenity is not, or has not been,
| considered speech under the terms of the free speech
| amendment. So, at least historically, there is some
| precedent for considering that some forms of human
| expression can be censored on their own merits.
|
| I should note that I am anti-censorship and consider such
| laws absurd, but my point is that we can't rely on
| readings of the constitution to self-evidently protect us
| from such things.
| CamperBob2 wrote:
| _obscenity is not, or has not been, considered speech
| under the terms of the free speech amendment._
|
| What part of "shall make no law" isn't clear? It's true
| that obscenity is an exception to 1A, but that's
| something that some people made up after the fact, in
| direct contravention of the framers' stated intention.
|
| The founders weren't short on ink. If they had wanted to
| equivocate, they were certainly free to do so. The fun
| _really_ starts when some people decide that their
| (least) favorite amendments are subject to less (more)
| interpretation than others.
| michaelpb wrote:
| > Between 2010-2016 was the golden age for these
| companies actually being free and open.
|
| Were they really though? I'm pretty sure there were
| plenty of things, especially things deemed to be
| "Intellectual Property violating", which were strictly
| banned on all these platforms far before 2010. One
| example is that Google has been removing search results
| for as long as I can remember.
|
| In other words, what was in your perspective "free and
| open" was very restrictive and politically pro-corporate
| from my perspective --- hardly an "objective stance" but
| instead a very political stance on what was permitted and
| what was forbidden.
| logifail wrote:
| > we're all outraged when these companies host some stuff
| that we don't like
|
| I'd argue that there is already a (fairly) tried and tested
| process in place to deal with this, it's the legal system.
|
| There are plenty of media outlets that publish stuff I
| don't particularly like, but almost none of it is illegal,
| so - to be blunt - I just have to suck it up.
|
| Some of my friends have opinions that I - at times -
| violently disagree with, but I file that under one of the
| side effects of life, and I deal with it.
|
| I'm rarely "outraged" by companies hosting stuff. If it's
| illegal, knock yourself out and get it taken down.
|
| However if it's just really, really annoying or you find it
| against your own worldview, perhaps take a deep breath /
| drink a cup of tea* / go to the gym / hug your OH, and move
| on to something more important?
|
| * or gin :)
| freedomben wrote:
| I think rather than "violently" you meant to say
| "vehemently." If not then ignore this comment. If so then
| you should probably edit as the two have important
| differences in meaning.
| GekkePrutser wrote:
| The legal system isn't great for this as it tends to
| listen to the one with the most expensive lawyers,
| especially in the US. And companies like Google have a
| lot of expensive lawyers.
| TimTheTinker wrote:
| There are two more ways to deal with content we don't
| like. We've largely abandoned these to our great harm:
|
| 1) Get to know people in your community who hold
| differing opinions. We all need to be doing this more -
| fostering friendship over the things we have in common.
| This isn't easy, but the Western world used to be _far_
| better at it.
|
| 2) Engage in healthy debate, which means advocating a
| specific opinion in a public way -- either speech, column
| in your local newspaper, etc. -- with carefully
| researched references/sources, no ad-hominem attacks,
| assuming good faith and intent on the part of those who
| disagree, and respect for the differing opinions of
| others.
|
| Imagine if every local community did (1) and (2) --
| people would be a lot happier and would be less likely to
| hold unsupportable opinions, since even cursory research
| (i.e. prior to publicly arguing in favor of them) would
| show those opinions have no basis in reality.
| etchalon wrote:
| There seems to be an immense body of research that shows
| that good ideas do not win out over bad ideas.
| TimTheTinker wrote:
| What's worse than a bad idea? A forced idea (or ideology)
| and the lack of freedom to express a contrary opinion.
|
| The real problem isn't those who disagree with you or
| hold "bad ideas", it's those who would take away
| everyone's freedom to disagree in peace.
| LudwigNagasena wrote:
| I am absolutely happy if a company is ready to host
| everything legal.
| CyberRabbi wrote:
| Do you support the right of others to host content you
| consider violent or garbage?
| duxup wrote:
| Generally yes, but that depends on what we're talking
| about exactly.
|
| Specifically I noted my willingness to use their services
| / indirectly support that company if I have the choice.
| barnesto wrote:
| This is the problem, isn't it? You seem to be posing an
| easy question to answer. But it's not, is it? Who is
| Google to judge what anyone would "consider violent or
| garbage?"
|
| I don't remember electing them to control this aspect of
| life. And where does it stop? What is the line? Who is
| actually defining these things?
|
| When a small group of people control the definition of
| "wrong think" then we're gonna have a problem regardless
| of which side of the argument those people are on.
|
| While your question is innocent enough, I get the feeling
| you already knew the answer.
| gwright wrote:
| > This is the problem, isn't it? You seem to be posting
| an easy question to answer. But it's not, is it? Who is
| Google to judge what anyone would "consider violent or
| garbage?"
|
| You aren't even touching on the complexities. The
| original article was about "misleading content". Google
| is asserting that they will take action on "content that
| deceives, misleads, or confuses users".
|
| Good grief.
| throwaways885 wrote:
| Speech is violence nowadays. Silence is also violence.
|
| One of these days, I'm going to go live in the woods with
| no internet. People clearly do not want a free society
| anymore so I may as well just check out.
| throwawaysea wrote:
| > Speech is violence nowadays. Silence is also violence.
|
| Also: their violence is speech.
| zo1 wrote:
| If only we could leave our respective countries and self
| organize voluntarily into new ones.
|
| Until then, those of us with kids can't afford to
| check out. We have to secure a future for our children.
| [deleted]
| [deleted]
| minsc__and__boo wrote:
| Or just migrate to a country that aligns more with your
| views, like many immigrants have done and do today. John
| Locke never said anything about needing to form new ones.
|
| The "Won't somebody please think of my children" excuse
| is a little selfish considering there are people with
| differing opinions (who may or may not have children
| too).
| stevenicr wrote:
| would be great if fbk and twitter did the same thing!
|
| any group like 'occupy democrats' or 'vets against trump'
| would be gone.
|
| If they did this to the search results, most of the
| news sites would no longer have top positions! I'm
| liking this now.
|
| "content that deceives, misleads, or confuses users"
|
| - funny that I used to ask people on fbook some years ago
| when they posted some things, 'do you believe the thing
| you are re-posting is true or fake? Is it funny or
| serious? Do you think your 'followers' think it's true
| when they see you post it?'
|
| Trying to determine the understanding of the re-poster -
| but also the 'intent' of them re-sharing - sadly I think
| most of the time it was to 'deceive' aka virtue signal
| tribe thing - even when they admitted things may not be
| true, they still wanted the thing posted and shared -
| knowing others might not look closely and would not
| realize it wasn't true.
|
| I need to think on this longer. Wait: when g/f/t thought
| the Hunter Biden laptop was fake, they affected our national
| elections and discourse; when they did not care if golden
| shower oppo research was true or fake, it affected real-
| world stuff.
|
| Not so sure these folks can be trusted with deciding what
| should be shared as true/false actually. The reach and
| effects of these decisions are large and serious.
| krapp wrote:
| >Who is Google to judge what anyone would "consider
| violent or garbage?"
|
| They own and operate a service called Google Drive. They
| offer that service under whatever terms they decide. And
| the decisions they make are relevant to their own
| service. They likely also don't allow you to use their
| service to distribute illegal material.
|
| >I don't remember electing them to control this aspect of
| life. And where does it stop? What is the line? Who is
| actually defining these things?
|
| Google has the right to make this decision on their own
| platform. They don't have the right to make this decision
| outside of their platform, and are not attempting to do
| so. They're not a government. They cannot control what
| other sites do. They don't have an army. They're not
| burning textbooks or jailing teachers. They're not
| controlling the definition of "wrong think."
|
| If you don't like what they're doing, you're welcome not
| to use their service. Google Drive isn't the only cloud-
| based document backup service by a long shot.
| gizmo wrote:
| In a free society sites like liveleaks and wikileaks
| absolutely have a right to exist. As well as all the
| fringe conspiracy sites.
|
| The problem with censorship by Facebook, Youtube, Twitter
| and Google is different. Here the government is putting
| pressure on the tech companies to censor content they
| don't like. Censor the bad people or risk antitrust
| action.
|
| It's so gross and so clearly in violation of the first
| amendment. Even elected officials and professors are not
| exempted. It doesn't matter if you're elected by the
| people or if you're an expert in your field, if you say
| something that is considered 'misinformation' by the
| Ministry of Truth you get censored. It's outrageous, and
| if big tech doesn't change course we need to start
| building alternative platforms. But it might already be
| too late.
| CyberRabbi wrote:
| It doesn't matter if competing platforms are built.
| Normal people don't care one way or another, especially
| outside of how those other platforms are popularly
| characterized, so those other platforms will never take
| off. Only a minority of people are conscious of their
| liberties and, consequently, of any potential infringement
| of them.
| guerrilla wrote:
| > The other truth is we're all outraged when these
| companies host some stuff that we don't like ... and get
| upset when they don't host the stuff we do like.
|
| Yet it is rational to oppose distribution of what you think
| is wrong and promote distribution of what you think is
| right. You must have a hidden premise in there somewhere.
| Broken_Hippo wrote:
| I think most religion is wrong, yet I'm not opposing
| that. I also think genocide is wrong, and yes, I'll
| oppose that. Same with anti-vax rhetoric.
|
| While religion can be a real negative, harm isn't generally
| the goal, and often there are good intentions. Genocide,
| though, hurts people by its very nature. Anti-vaccine
| propaganda hurts folks as well. You simply cannot have
| these movements without hurting folks.
| stevenicr wrote:
| Some religions have hurting people codified in their main
| directives - so some should be stopped from being shared.
|
| Ponzi schemes? Crypto investments? Drugs? Gambling?
| Alcohol recipes? Sugar?
|
| i-robot protect us all with truth! (except hunter laptop,
| and lab leak theory - hide those haha)
|
| Actually - I am kind of okay with this new kindergarten
| gloves way of treating the people - let's give them the
| sharing ability they deserve. California knows best
| what's good for everyone - just don't talk bad about
| beef. Well you may have to censor that in some other
| parts of the world.
|
| Different kindergarten for different countries? Different
| states?
|
| Think how much better and safer this internet world is
| going to be without all these bad things!
| hanselot wrote:
| People take the vaccine, they die. They don't take it, they
| die. Why bother taking it then? It doesn't do anything a
| vaccine is supposed to do. So why call it that? Why not
| call it what it is? Experimental gene therapy with 0 long
| term studies. Why the social pressure to force others to
| make the same mistake you made by volunteering for use of
| a drug that you can't even sue the creators of if it
| kills you? My hat off to you for being willing to die so
| that others can learn from this experiment, but don't
| pretend you are anything more than rats in a lab.
| psyc wrote:
| Is it rational, though? Maybe for a certain type of
| politically active person. But I believe in free
| expression (the principle kind, in addition to the 1st
| amendment kind), so it seems to me it is not rational for
| me to oppose the distribution of anything legal. Or is it
| irrational of me to have principles, rather than being
| maximally self-interested or social-utopia-utilitarian?
| michaelpb wrote:
| The thing is you are "promoting distribution of what you
| think is right" literally right now. Like, with this very
| comment right here, _you_ are being politically active!
|
| So, assuming you are acting rationally, you are right now
| promoting what you think is "right" ("anything legal",
| "1st amendment" eg the United States's Constitution),
| while countering what you think is "wrong".
|
| If you didn't believe in the promotion of what you think
| is right, then you wouldn't be posting to argue against
| what you think is wrong! You would never upvote (bias) or
| downvote (censorship) and so on. Sure, you could argue
| that your style of promotion (comments on HN), or that
| promoting your worldview in general is better for certain
| outcomes, but ultimately you're still just arguing for
| "freedom" in your particular definition of "freedom"
| (still promoting or opposing distribution of right/wrong).
| psyc wrote:
| I think you are equivocating. I don't oppose the
| distribution of any other opinion. I don't like those
| opinions, but I'm not trying to make it harder for anyone
| to say them. And trying to change someone's mind about it
| is not at all the same as "opposing distribution"
| regardless of whether it has the same intent or potential
| effect.
| michaelpb wrote:
| I'm not trying to equivocate here (or be combative, I
| hope this is an interesting discussion for both of us!).
| I'm being serious: I consider downvoting to be "opposing
| distribution" of a statement by definition, since it
| limits the distribution, although it's perhaps not very
| effective when done by yourself.
|
| > regardless of whether it has the same intent or
| potential effect.
|
| I disagree, and I think I'm in the majority in having more
| outcomes-based ethics. [1]
|
| What I'm trying to get across is that you are
| "politically active", whether you think you are or not.
| "Activism" can literally involve just a bunch of friends
| on an online platform upvoting and downvoting. Even just
| a small group of people doing this can be effective
| censorship in certain contexts, such as local elections.
| Sure, Google may have more cost-effective means of
| censorship --- larger political campaigns have to pay
| firms LOADS to bury stories or control online discourse
| without access to the power Google has --- but it's still
| the same result, just a matter of who calls the shots and
| cost-efficacy.
|
| You might argue that controlling discourse like this is
| not censorship or unethical based on your definition, but
| as you said it can have the same intent and has the same
| potential effect, so from another perspective, perhaps one
| that places less value on the USA constitution, it most
| certainly is.
|
| [1] About 90%, in the case of a survey question about the
| trolley problem -
| https://en.wikipedia.org/wiki/Trolley_problem#Survey_data
| psyc wrote:
| How would my being mildly politically active on HN
| contradict my belief that it is rational for me to
| encourage the distribution of legal speech, without
| regard for agreement with the content? I feel like you're
| trying to chide me by reminding me that I'm engaging in
| politics. Yes, I'm engaging in politics. I just don't see
| what that has to do with the thrust of my objection.
|
| Your point about downvotes being censorship is troubling
| and gives me pause. I think it's only censorship because
| of how HN fades the comment towards illegibility. But I
| have always thought that is a user-hostile design choice.
| I'd much rather you could see the score, but still be
| able to read them easily. I have spent enormous amounts
| of time carefully reading people I disagree with.
|
| (FWIW, I did not downvote you, and I do so rarely.)
| michaelpb wrote:
| > How would my being mildly politically active on HN
| contradict my belief that it is rational for me to
| encourage the distribution of legal speech, without
| regard for agreement with the content? [...] I just don't
| see what that has to do with the thrust of my objection.
|
| Yeah I'm not really making my point clear here, sorry.
|
| My point is that a "values-neutral" platform doesn't
| exist, and every attempt to build such a thing is usually
| only "values-neutral" from the perspective of its
| creator. For example, I'd argue there's a contradiction
| even in the way you phrased it here: "Legal speech"
| implies you are indeed giving "regard for agreement with
| the content", since this would imply suppressing content
| that is not in agreement with some legal framework you
| have in mind.
|
| > I think it's only censorship because of how HN fades
| the comment towards illegibility.
|
| It's not just that. Upvotes promote one position over
| another, so when one considers the statistical properties
| of how far people scroll down, or how likely people are
| to expand low-voted comments or go to another page (for
| platforms like Reddit, HN, etc), the effect can be the
| same.
|
| It's interesting to see that as online platforms
| gradually replace "traditional" journalism for how people
| get information, we're rehashing a some of the same old
| arguments about what is "objective" journalism.
| Publishing ANYTHING, whether physical documents (eg
| newspapers) or HTML documents (eg HN, Facebook), will
| always promote some worldview and censor another based on
| what is included in the publication, and the ordering of
| the topics.
|
| Sometimes this censorship is explicit (eg nixing a story,
| Google taking down a search result), other times it's
| done statistically (eg putting stuff "below the fold", a
| search result being on page 10), but our informational
| world is perpetually being shaped like this. Pretending
| that it's even logically possible to have unbiased
| platforms "without regard for agreement with the content"
| --- as an example, not you, but elsewhere here it was
| claimed 2010-2016 was mostly censorship-free --- is
| starting off on a wrong premise. If we start on a wrong
| premise, any further discussion is meaningless at best,
| and actively manipulative at worst (Fox News' "Fair and
| Balanced" slogan comes to mind)
|
| > (FWIW, I did not downvote you, and I do so rarely.)
|
| Thanks, I avoid this also!
| nybble41 wrote:
| > I'd much rather you could see the score, but still be
| able to read them easily.
|
| I agree. Actually I'd almost forgotten they did that
| since I use the StyleBot extension and have ".commtext {
| color: black; }" in the CSS for this site, which
| overrides the fading. I also enabled the setting to show
| "dead" comments; they aren't always worth reading, but it
| happens more often than you might think.
|
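| As a rough sketch, the full StyleBot rule for
| news.ycombinator.com amounts to something like this
| ("commtext" being the class HN applies to comment text):
|
|     /* keep downvoted comments readable */
|     .commtext { color: black; }
|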
| Actually that's another aspect in its own right--HN
| doesn't just fade downvoted comments but removes them
| altogether if they're downvoted enough. And sometimes
| comments are actually deleted by the administrators and
| not just marked as "dead" and hidden from the page by
| default. It's their site, and I would uphold their right
| to not be forced to host comments they object to against
| their will, but there is some actual censorship going on
| and not just convenient "curation" of what shows up
| clearly in the default view.
|
| I don't see downvotes as censorship when the downvoted
| comment is still available for those who care to read it,
| and not actually deleted. To me it's more of an
| indication that, in the reader's opinion, the comment was
| perceived as not contributing to the discussion. In
| general I prefer to upvote the good comments and save the
| downvotes for trolling, flamebait, etc. which would be
| likely to derail the thread. To put it in words, an
| upvote is like saying "check this out" while a downvote
| is either "don't waste your time" or "this thread belongs
| somewhere else". But you can still see the downvoted
| comments if you want.
| guerrilla wrote:
| It's obviously rational; otherwise you wouldn't have
| brought up legality. We already do this and agree that it
| is rational.
| michaelpb wrote:
| That is rational, yes. I think pretty much everyone
| thinks this way, with differing definitions of both
| "promote" and "oppose", and "wrong" and "right".
|
| Though I may be misunderstanding what you are getting at
| here.
| ceilingcorner wrote:
| > The other truth is we're all outraged when these
| companies host some stuff that we don't like ... and get
| upset when they don't host the stuff we do like.
|
| Is this actually true? I think only certain fringe Twitter
| groups are mad that companies host controversial things.
| tsimionescu wrote:
| There is a long history of people in the USA and
| elsewhere being mad that companies host certain things -
| pornography was illegal to distribute for decades because
| of such beliefs, and is still segregated from non-
| pornographic content, and shunned by all regular
| advertisers (you won't see Coca-Cola ads or Beats
| headphones on pornographic sites) for precisely this
| reason.
| duxup wrote:
| Man, I'd love to find a way to test that theory scientifically.
| filleduchaos wrote:
| Complacence might be a better descriptor than
| mad/outraged. For example nobody really gets up in arms
| over companies choosing not to host (what they deem to
| be) pornography, and various bans of risque
| content and the people who produce them from major
| platforms tend to get a lukewarm response from people who
| are otherwise vocal about free speech (see: the USA's
| FOSTA/SESTA)
| jrsj wrote:
| Almost no one in this industry is actually making the world
| better. Just take the money and use that to make the world
| better if you can. Especially since all things considered your
| role at Google probably doesn't make things better or worse
| either way.
| octonion wrote:
| "we"? No "we" don't. If it bothers you so much, why don't you
| do it? No one is stopping you.
| chaganated wrote:
| resignation or gtfo
| semitones wrote:
| maybe soon
| dang wrote:
| We've banned this account for repeatedly breaking the site
| guidelines. Please don't create accounts to do that with.
|
| https://news.ycombinator.com/newsguidelines.html
| [deleted]
| IX-103 wrote:
| I don't think the answer is alternatives, as you would just be
| trading one master for another. What we need is 1) for clients
| to retain control of their data and 2) standardized formats and
| interfaces for handling data. If moving off of Facebook was as
| simple as creating an account somewhere else, then things would
| be different. If Google doesn't own your data you can tell them
| how they are allowed to use it.
|
| As for Google, I've given up on anything revolutionary from
| them. They have a lot of smart people with good ideas but
| they've grown so big that they can't "move fast and break
| things" without breaking whole sectors of the economy--and
| given the number of anti-trust suits Google is embroiled in, I
| think the governments of the world know it. That said, Google
| generally seems to at least try to do the right thing, even if
| no one there seems to know exactly what that is.
| soheil wrote:
| > whose best interests are provably not aligned with that of
| the general population.
|
| Intuitively I agree with you, but can you provide sources so
| next time someone argues against it I can back up that claim
| better?
| efsavage wrote:
| I think there's a vacuum here in that society wants _someone_
| to intervene when Bad Things Happen, but we either can't agree
| who that should be or (more likely IMO) the right choice of
| person/organization just doesn't exist. So you end up with some
| people/organizations/governments stepping up to increase their
| power and/or protect their own interests.
|
| I think this is why Zuckerberg and some other big players have
| called for laws to regulate these things, which seems
| counterintuitive, but then FB can pass the buck and is more
| likely to maintain the status quo where they're on top. But
| until they are more insulated from the risks, they're going to
| be forced to defend themselves.
|
| Disclaimer: I too work for Google
| semitones wrote:
| This is also probably why Google is advocating for the
| privacy sandbox and banning third-party-cookies, and staying
| ahead of the law tech-wise. Such that when the inevitable
| regulation of the playing field does come, they are sharing
| drinks and chuckling with the referees, while the other
| players are still struggling to figure out what their game
| plan is.
| charwalker wrote:
| It's also easier for FB/etc to push for laws to be written
| when they can pour millions into a PAC to get their ideal
| language into those bills, if not straight up write sections
| themselves. They can lobby for fines that are lower than
| profit from acting in bad faith or anti-competitively (who
| even knows how much money FB saved by buying out
| Instagram/etc), they can run their own disinformation or
| targeted campaigns to sway public opinion, or simply minimize
| anything on their platform to hide it from users. There's a
| massive power imbalance there between a regular voter and
| Zuckerberg/etc, even an imbalance between regular voters
| who can or cannot vote early or by mail.
|
| I support regulating these groups but that must be done
| within the rights assigned via the constitution, existing
| precedent where available, and in-depth knowledge of how
| these companies operate and how the tech influences
| consumers. It's complicated.
| shadowgovt wrote:
| Funny enough, I arrived at the opposite conclusion during my
| tenure as SWE... Google wasn't taking responsibility for the
| wide sweep of its arms as the gorilla in the room, and was
| trying to stay neutral ("We just build services; expecting us
| to be responsible for them is like expecting fire not to burn")
| in a situation where they fundamentally can't.
|
| Google is already non-neutral... they ban all sorts of content
| on search and comply with law in multiple countries. It's never
| a question of "Should Google be neutral," but instead "How
| neutral."
| nomoreplease wrote:
| > in a situation where they fundamentally can't.
|
| I'm sorry but they can stay neutral by just not
| reading/blocking people's Docs
| Spooky23 wrote:
| No. That appeal to some notion of free speech is bunk.
|
| They can "stay neutral" by limiting the ability to share
| widely. Free speech doesn't mean that you are entitled to a
| billboard, radio station, or global content delivery
| network.
|
| By aligning global distribution with easy content
| marketing, Google, Facebook, etc. created a monstrosity that
| encourages the worst content and systematically eviscerates
| high-quality content.
|
| That fundamental lack of understanding by naive engineers
| drives a lot of the problems we have.
| zepto wrote:
| > Free speech doesn't mean that you are entitled to a
| billboard, radio station, or global content delivery
| network.
|
| What has any of this got to do with _Google Drive?_
| shadowgovt wrote:
| Google Drive is a global content delivery network. A
| document flagged as "Share with link" is reliably hosted
| worldwide.
| nomoreplease wrote:
| There was no appeal to free speech. You're creating a
| straw man
| clipradiowallet wrote:
| In theory they can... in reality something bad will happen,
| someone will have written about it in a google doc, and the
| narrative will read "...they had access, yet
| evil/neglectful Google did nothing!".
| nomoreplease wrote:
| Sure, someone can frame it that way. Someone else can
| also frame it as respecting privacy
| shadowgovt wrote:
| That's not actually neutral; that's letting harmful content
| spread at the speed of Internet discourse. We already know,
| for example, that they aren't going to allow Google Docs to
| be used to wide-cast child pornography. They never have. It
| appears the only change is that they're adding new
| categories of misinformation to the "harmful content" list.
|
| When you have a reach like Google, neutrality isn't an
| option. And their mission isn't neutrality anyway; it's "To
| organize the world's information and make it universally
| accessible and useful." Dangerous misinformation coming
| from a google dot com domain isn't useful.
| LegitShady wrote:
| CP is already illegal, and Google doesn't need to
| arbitrate whether it's illegal or not. Blocking CP is
| neutral and basic.
|
| It's when google decides to block things that aren't
| illegal that they get into the weeds, and companies
| framing opinions they disagree with as "Dangerous
| misinformation" is itself dangerous misinformation, and a
| net negative on society.
|
| Google should not be arbitrating truth. They are not
| qualified, not capable, and not honest enough to do it,
| and they will never be.
|
| I don't know a single person in real life I would trust
| to censor what I can and can't see, and I trust google
| much less than those people I actually know.
| shadowgovt wrote:
| Is it "censorship" when they simply choose not to be the
| medium to communicate that data to you?
|
| If so, that puts the entire search apparatus in the
| category of "censorship," since it makes opinionated
| decisions regarding what the answer to your query should
| be. Choosing to refrain from vending a Drive URL is
| basically the same thing.
|
| Edit: I could see an argument that they're being a bad
| steward of other people's data if they choose not to
| honor share requests on content they host or choose to
| remove content they've previously hosted. In which case,
| I'm glad they're putting the fact they'll do that right
| in a public disclosure, and it _is_ something people
| should consider when choosing Drive to host their
| content.
| LegitShady wrote:
| The search apparatus is exactly the target of those
| trying to implement censorship.
|
| Six months ago saying covid-19 originated in a lab was
| verboten on many platforms - saying it on facebook or
| YouTube would get you called a purveyor of misinformation
| and the content deleted and your account put at risk.
|
| Suddenly it turns out the people involved at the
| government level funded the very work in question, and they
| worked with these companies to define what was misinformation,
| and suddenly Jon Stewart is making jokes about it and
| these companies allow it to be talked about again.
|
| If you don't understand how dangerous it is for the
| platforms that house our public speech to ban some speech
| based on "misinformation", you aren't paying any
| attention. They
| have set up their systems to detect/downrank/remove
| arbitrary content and that will be used for political
| reasons - it already has been and quite recently.
|
| We live in dangerous times and a lot of people are
| oblivious.
| Spooky23 wrote:
| No, of course not. Every entity is allowed to manage
| how its property is used.
|
| Barnes and Noble isn't censoring anyone because it
| chooses to feature some books on an end cap vs. buried on
| the shelf.
| asiachick wrote:
| It's funny to me that, in my mind, HN = SV and SV is
| hyper-liberal, and, listening to NPR, which is also fairly
| liberal, all the shows I listen to are calling for exactly
| "Google and Facebook need to ban all speech we don't
| like"
|
| This isn't Google's problem. It's a society-level
| problem. Google is just responding to the pressure
| Spooky23 wrote:
| That's like 3 layers of assumption.
|
| Google doesn't want to eliminate its cash cow, and
| doesn't want to be associated with crazy fringe people.
| It's pretty simple really.
|
| Running this stuff through a "liberal" or "conservative"
| lens isn't productive. Big public companies care about
| making money and eliminating risks associated with doing
| so.
| robertlagrant wrote:
| Modern liberalism is generally fairly pro-censorship; pro-
| authoritarian. The word's definition has just flipped in
| recent years, so it means different things to different
| people.
| clord wrote:
| My mind boggles that people have opinions like this. It
| is so anti-liberal. Every evil regime in history takes it
| upon itself to define and eliminate hateful content,
| using the contemporary unconscionable act to justify this
| evil.
|
| To be a good liberal, Google has to decide if it's a
| publisher or a platform. If it's a publisher, then the
| hateful content is coming from them and they must take
| editorial control of google docs and whatnot. If they're
| a platform provider, then they get the same protections
| as telephone network operators and others from the
| actions of their users.
|
| This current situation is the start of the road to
| tyranny.
| 8note wrote:
| I think the protections for phone operators are the anti-
| liberal parts.
|
| These companies are choosing to repeat what people tell
| them. They should always be considerate when doing so.
|
| Picking operators whose opinions align with yours is the
| liberal way, using the free market to pick which ideas
| get repeated.
| 13years wrote:
| > contributing to the better world that I think we need, has
| started to weigh heavier and heavier on me...
|
| This is the problem. We have arrived where we are because some
| group of people think that powerful companies "should make the
| world better".
|
| Historically, many of the major atrocities were carried out
| with these very intentions.
|
| How to build a tyrant in one simple lesson: 1) Take any normal
| person who wants to make the world better; 2) Give them the
| power to do so.
| constantinum wrote:
| Tools like Skiff [https://www.skiff.org/] are on the path to
| being that "alternative" which is, quoting from their website
| "..a privacy-first collaboration platform with expiring links,
| secure workspaces, and password protection."
| cwkoss wrote:
| Don't quit, organize with other google workers to fight against
| the policies you disagree with from within until they fire you,
| then take a fat unemployment check until you find your next gig
| - somewhere hopefully with fewer moral compromises.
| Florin_Andrei wrote:
| > _Google's increasing role as an arbiter of right vs wrong
| and a steward of information_
|
| That is exactly what's needed.
|
| When there's no penalty for bullshit, lies tend to self-
| perpetuate, like viruses.
|
| You may be fixated right now on some half-baked ideal of "free
| speech" and so on, but the fact is - a lot of people out there
| can't tell left from right. And in situations such as the
| current pandemic, when the consequences of spreading bullshit
| are thousands of deaths, your naive ideas about "freedom" need
| a reality check.
|
| > _truly free-flowing information_
|
| Which will then immediately get hijacked by bullshit-spewing AI
| bots and folks with agendas. See what happened recently with
| all the "freedom" emphasizing social networks.
|
| What you're proposing is the online equivalent of a country
| with no military, no police, no laws and no judicial system.
| This may be fine in some Ayn Rand fanfic novel for young
| adults, but it's not a reality where anyone would want to live.
| [deleted]
| colordrops wrote:
| Hanlon's razor has really messed up people's discernment. There
| are plenty of amoral, if not malicious, decisions by these
| companies, and these behaviors are intentional. All you HN
| commenters think you see the consequences clearly, and that the
| execs making these decisions are doe-eyed innocents led down
| the wrong path... Are you serious?
| dlahoda wrote:
| sure, try fluence.network
| mioasndo wrote:
| You can build an alternative, and many already exist, but the
| average user is far more influenced by branding and ads than
| privacy or freedom.
| jhoechtl wrote:
| I think with ipfs the infrastructure is there.
|
| https://ipfs.io
| duxup wrote:
| Who wants to be a consumer company that just has an anything
| goes policy?
|
| I sure as hell don't want to do it.
| prepend wrote:
| I don't know anyone proposing "anything goes." I think the
| most common request is to remove illegal speech and stop
| there.
| beebmam wrote:
| Please tell me why a private company should be forced to host
| content that they fundamentally disagree with.
|
| Why should Google be forced, legally, to carry Chinese state
| propaganda, for example? Why should Google be forced, legally,
| to carry ISIS propaganda?
|
| There are alternatives to Google. They should host that content
| there.
| throwitaway1235 wrote:
| The argument would be that the magnitude of their impact on how a
| member of society can search for, view or transmit
| information is too large for Google to be deemed a private
| company.
|
| If a private company impacts a nation-state's democracy to
| such an extent that it rivals it in power, they ought to be
| classified as something else.
| x86_64Ubuntu wrote:
| I'm sorry, but no one from a reasonable standpoint is going
| to look at Google not hosting "misleading content" and then
| think democracy is threatened.
| throwitaway1235 wrote:
| I do. A candidate for political office being barred from
| posting campaign clips to YouTube is a threat to our
| democracy.
|
| Most average people view YouTube as the de facto video
| portal of the entire internet.
| x86_64Ubuntu wrote:
| Once again, a candidate not being able to post to Youtube
| is not a threat to democracy. Nothing is stopping this
| candidate from posting this on their campaign site, or an
| RNC affiliated site or even Facebook where most of their
| supporters are likely to be.
| throwitaway1235 wrote:
| YouTube would be literally interfering with a democratic
| election. If threat is too strong a word, fine. But you
| can't deny that they are actively participating in public
| elections. We want that? We want to privatize democracy?
| I don't.
| prepend wrote:
| > Once again, a candidate not being able to post to
| Youtube is not a threat to democracy.
|
| Yes it is. Youtube is a huge conduit for communication.
|
| Imagine if ABC banned a candidate. The argument that
| CBS and NBC are still available is not relevant,
| as a major media outlet would be favoring a candidate by
| blocking their opponents.
|
| For small outlets it's not an issue. But YouTube is the
| biggest video provider on the planet; not allowing a
| political candidate would be detrimental to democracy.
|
| Even if that candidate said stupid stuff like "world is
| flat." People have to make their decision and as long as
| we're a democracy, that choice should be individual.
| Ostrogodsky wrote:
| Yeah google de-indexing all pages criticizing the
| democrats and the company is also not a threat to
| democracy. They must provide us with a curated set of
| sound information vetted by the politicians, FAANG and
| the state department.
| x86_64Ubuntu wrote:
| No one is curating anything. If you want your right-wing
| search engine, then create it. Google is under no
| obligation to give you top page rank.
| throwitaway1235 wrote:
| Can we at least agree on some basic facts? Google does in
| fact curate search. Where a page is listed does not
| depend on how many times the link was clicked. Yes?
| Ostrogodsky wrote:
| "If you want your left-wing gasoline create it.
| Shell, Citgo, Exxon, etc. are under no obligation to give it to
| you. Go and extract your own oil and refine it."
| drstewart wrote:
| >If you want your right-wing search engine, then create
| it.
|
| And your true intentions come out.
|
| If Google delists AOC from all their properties in the
| next election and replaces the results with those from
| her opposition, I'm sure you'll be the first to say it's
| totally fine and she should just host her own search site
| if she doesn't like it.
| beebmam wrote:
| So, in your opinion, what should a government force the
| company to do when they are classified as what you
| describe? Force them to host all speech? Does that include
| art? Disinformation? Propaganda? Porn? Spam? Terrorism?
|
| I don't think you've thought out the consequences of what
| you're advocating for.
|
| If you have thought it out, please explain exactly what
| speech they should be forced to host and what speech they
| shouldn't be.
| throwitaway1235 wrote:
| Right. I personally never thought it was that
| complicated.
|
| Once a company is classified as a public utility (I
| believe Google is) it should be forced to host all legal
| content. You tell me what's illegal and I can safely tell
| you it can't be hosted by Google.
| beebmam wrote:
| By whose laws? Even within the US, there are plenty of
| different laws. Should it be an intersection of all laws,
| everywhere? Only content which is lawful around the
| world? Or regionally? Should people outside those regions
| be segmented off from content that isn't in their region?
|
| Moreover, you're saying that spam should be forced to be
| hosted by these companies, just as our snail-mail system
| protects it. Even if it takes up exabytes of information.
|
| Should people who have their content removed be able to
| sue these companies for removing it?
| throwitaway1235 wrote:
| They are American companies; they therefore fall under
| American law.
|
| Just with Twitter, we know that content is regulated by
| region. Setting your location to Germany will prohibit
| seeing certain content from the U.S. Many more such
| regional cases.
|
| The spam example would be a problem, but it's more of an
| annoyance to solve than a basic human rights case. You
| simply cannot have a democracy where segments of the
| population are barred from interacting with public
| officials online. Especially when public business
| (advertising, fundraising, making political arguments) is
| now a core of online communications.
| beebmam wrote:
| If you're saying US companies (that classify as whatever
| you're defining them as) should be forced to carry _all_
| legal speech, no matter how terrible or cruel or
| provocative it is, I'd be okay with this, and that means
| literally all spam, and that admins would not be able to
| moderate any legal speech. If it's any less than this,
| I'm not okay with what you're advocating for.
|
| And effectively this would turn these sites into
| platforms that are so filled with trash that they would be
| unusable. And the chaotic part of me would love to see
| that happen. But it would basically mean the end of these
| companies' ability to function.
|
| Realistically, I think we should keep to the standard
| we've had in the past: we can't compel companies to host
| speech they disagree with, and we should take strong
| measures to limit their anti-competitive behavior and
| break them up into competing companies if necessary (like
| we did with telecom)
| quotemstr wrote:
| Don't be disingenuous. The problem is viewpoint
| discrimination. Spam isn't a viewpoint. Porn isn't a
| viewpoint. Libel isn't a viewpoint. We can limit the
| ability of tech companies to arbitrarily censor points of
| view while still keeping the platform free of spam.
|
| How? Create a cause of action whereby if a tech company
| removes someone's content, that person can go to court
| and ask that a judge determine whether that content
| removal is some kind of anti-spam operation or viewpoint
| censorship. You don't let the company have the final say.
| throwitaway1235 wrote:
| I don't want to keep arguing. Mostly informative
| exchange.
|
| I would say though, Twitter's model from around 2012 was
| extremely open compared with today (remember the Arab
| Spring?) and in no way was it an unusable, trash/spam-
| laden platform.
| prepend wrote:
| > Please tell me why a private company should be forced to
| host content that they fundamentally disagree with.
|
| Prior to the Civil Rights Act, companies didn't serve certain
| people based on their race because they disagreed with that
| race.
|
| It's not that Google should be forced to carry stuff; they
| should be forced to not discriminate because they don't like
| it. ISIS propaganda is illegal and should be taken down for
| that reason.
| _Nat_ wrote:
| > It's not that Google should be forced to carry stuff,
| they should be forced to not discriminate because they
| don't like it. ISIS propaganda is illegal and should be
| taken down for that reason.
|
| Bright-line rules like that worry me. That is, a lot of
| stuff is subjective -- the line between "_legal_" and
| "_illegal_" isn't as clear or immutable as one might
| naively guess. So if something more binary -- like host-or-
| remove -- is tied to such a fuzzy, dynamic determinant,
| it'd seem to give rise to all sorts of problems.
|
| For example, say we forced big companies to host all legal
| content, but remove all illegal content, and then we want
| to know if something controversial is legal (e.g., taxes on
| Bitcoin back when it was newer). Then someone could post
| two images: one telling people to pay taxes on Bitcoin, and
| another telling people to not pay taxes on Bitcoin. Then
| the hosting-company would have to remove exactly one of
| those. By contrast, a hosting-company could normally just
| remove stuff they're unsure about because they're not
| required to host legal content, sparing them the burden of
| having to properly determine the legality of everything.
|
| Basically, the problem is that we'd be stripping hosting-
| companies of their freedom to operate in safe waters,
| forcing them into murky areas and then opening them up to
| punishment whenever they fail to correctly navigate those
| murky waters.
| prepend wrote:
| I don't think it's perfect, I think it's just better than
| the current system.
|
| I trust society's laws for legal/illegal more than
| Google's arbitrary decisions of info/misinfo.
| _Nat_ wrote:
| If hosting-companies become responsible for determining
| what's legal/illegal, then they'll have reasonable cause
| to become an authority on the topic.
|
| It'd probably make them more influential and powerful
| rather than less, because their judgements would carry
| the implication of legal-determination, and in popular
| perception, _be_ law.
|
| They'd essentially be elevated to the status of being
| lower courts.
| mullingitover wrote:
| > ISIS propaganda is illegal and should be taken down for
| that reason.
|
| What specifically is illegal about pro-ISIS speech?
| prepend wrote:
| Calls to violence, images of chopped off heads, etc.
|
| If it's literally just assholes saying "ISIS is great"
| then that shouldn't be taken down. Just like if two ISIS-
| lovers are IMing each other messages about how much they
| love ISIS and nothing else illegal, it should be allowed.
| I think.
| beebmam wrote:
| Are you arguing that we should pass legislation that forces
| all US companies to host all legal content?
| quotemstr wrote:
| Yes. Any company with a user base or influence above a
| certain threshold should not get to make moderation
| decisions unilaterally without the input of society at
| large. That input is called "the law".
| prepend wrote:
| Not at all. But I would like to see legislation (or some
| strong rule) that forces huge corporations or companies
| over a certain market share to host all legal content.
|
| Similar to how television broadcasters have regulations
| that force them to provide equal time to all major
| candidates.
|
| I think there's some reasonable threshold that doesn't
| require small providers to host everything.
| swalsh wrote:
| Why do we not allow telephone companies to censor our
| conversations? Why can't the power company choose not to provide
| power to offices for political campaigns they disagree with?
|
| We afford monopolies certain privileges, but we also require
| monopolies to have certain restraints.
|
| Many of these large tech companies have become natural
| monopolies. I think it's reasonable to expect similar
| restraints.
| beebmam wrote:
| I agree with you on restraints on natural monopolies. That
| should come in the form of limiting their anti-competitive
| behavior and probably also their harvesting of user data
| without consent for profit.
|
| But regarding limiting these companies from being able to
| decide which speech they don't want to host: I don't think
| you've thought this out fully.
|
| What speech should they be forced to host? Art?
| Disinformation? Propaganda? Porn? Spam? Terrorism? For
| illegal speech: Whose laws should they be forced to obey?
| gshixman wrote:
| With respect, no speech should be illegal, regardless of
| how absurd, abhorrent, or inaccurate the speech is to
| anyone. Free speech is fundamental to a truly free
| society, full stop.
| beebmam wrote:
| You disagree with your own statement and you don't even
| know it!
|
| "No speech should be illegal" - should I be able to
| threaten people then? That's speech. Should I be able to
| detail my plans for how I'm going to commit a crime?
| That's speech.
|
| Should I be able to scream at the top of my lungs in a
| public space? Should I be able to use a loudspeaker to
| broadcast my voice (or an advertisement) to drown out all
| other sound in a public space?
|
| It sounds nice what you're saying, but it's not what you
| actually believe, so I kindly ask you to argue less in
| bad faith.
| swalsh wrote:
| Disinformation and propaganda are just specific types of
| protected free speech. I disagree with socialism, but
| communist propaganda is protected speech which should be
| allowed. Disinformation can be dangerous too, I
| understand that. But I'm not willing to allow Google or
| my government to make a decision on what is
| disinformation.
|
| As for everything else, you should clearly follow the
| laws of the country you are doing business in.
| commandlinefan wrote:
| > Google's increasing role as an arbiter of right vs wrong
|
| The problem is that Google (I suspect) isn't really doing this
| out of some misguided attempt at "protecting" people but rather as
| a reaction to what they perceive to be what the people want.
| When America was a very religious (Christian) country, media
| distributors stayed away from anything that appeared
| "blasphemous". They didn't necessarily do it because there was
| a law against it (there were some odd laws here and there, but
| the media didn't start actually challenging them until religion
| really fell out of favor), but because they were afraid of
| consumer reactions. Google (and every other tech company) is
| doing essentially the same thing here: speaking ill on certain
| topics is modern-day heresy and they just don't want to be
| attached to it because they do ultimately fear the consumer.
|
| Even if you found a globally adoptable alternative to google,
| the same people who pushed Google to ban distribution of
| "misleading content" would start looking for ways to ban your
| globally adoptable alternative - at the network level if
| necessary (look what happened to Parler before they agreed to
| follow the unwritten rules). At the end of the day, we won't
| have truly free speech because far too few of us really _want_
| truly free speech.
| semitones wrote:
| "At the end of the day, we won't have truly free speech
| because far too few of us really want truly free speech." -
| that's a terrifying prospect.
|
| I agree, I don't think Google is trying to "protect" people.
| They are ultimately, almost always, protecting their pockets.
| colordrops wrote:
| It's not only right-wingers but left-wing organizations like
| the Atlantic Council. They assist multiple companies in
| determining which content is deemed permissible.
| driverdan wrote:
| > America was a very religious (Christian) country
|
| America is still a very religious country. It's not as bad as
| it used to be but it's still pretty bad.
|
| Don't forget that these companies are global. Your example is
| still happening with pictures of Muhammad. Many companies
| refuse to host or show them for fear of offending Islamic
| extremists.
| ksec wrote:
| Oh Thank You. That is an interesting take I haven't thought
| about.
|
| For those of us not from the US, does this _"as a reaction
| to what they perceive to be what the people want."_ really
| represent the majority, as in your example of when America
| was very religious?
|
| Because it seems to me (and I know zip about the US) this
| action only pleases half and angers the other half?
| unyttigfjelltol wrote:
| >speaking ill on certain topics is modern-day heresy
|
| The AUP would be more transparent if it simply banned
| "modern-day heresy". Folks would then be tagged as tech-
| heretics, and many would wear that badge with honor.
|
| Calling questionable, unproven, unpopular or ambiguous
| information "misleading" -- it's doublespeak. Worse, having
| my cloud drive spontaneously dump or block my data
| because some algorithm or faceless reviewer disagrees with
| the content-- that's totally unacceptable as a consumer
| proposition. Is my Android phone next simply because I'm
| posting an HN comment Google might disagree with? Seriously,
| it's completely unworkable from a consumer position for
| Google to arrogate to themselves that power.
| themacguffinman wrote:
| That's not how heresy works. People don't point fingers at
| you and hiss "heretic!", instead they judge you and think
| you're a terrible human being who does terrible things so
| they shouldn't help or associate with you. You can't
| "simply ban" heresy.
|
| If you're genuinely interested in convincing people across
| the aisle to stop trying to ban stuff like this, simply
| yelling that it's "totally unacceptable as a consumer
| proposition" and "completely unworkable" and "doublespeak"
| is barely an argument. Evidently, many consumers are
| accepting it and will continue to accept it.
| cabalamat wrote:
| > The road to hell is paved with good intentions.
|
| Good intentions my arse. Google, and the US ruling class in
| general, seems very, very keen that everyone believe the 2020
| US election was fair. Which to my mind is very suspicious.
|
| I think the election was probably rigged, and I think the
| democratic primary was also rigged, in 2016 and 2020, to
| prevent Bernie Sanders from winning, because he represented a
| threat to the ruling class.
|
| US elections are about as free and fair as Russian or Iranian
| elections -- and I bet in those countries the ruling class also
| tries to prevent people from saying they aren't.
| poorjohnmacafee wrote:
| Uh "good intentions"? I think you mean "ruling class lackeys"
| jensensbutton wrote:
| > We should be able to implement services like these, that are
| free of ads, on globally distributed infrastructure, with no
| central authority, to have truly free-flowing information.
|
| Who's going to pay for and operate it? And why won't they be
| subject to the same pressures that got us here?
| unixhero wrote:
| Time to double down on self-hosting a Google Drive clone. I don't
| trust Google any more.
| SavageBeast wrote:
| This is like watching someone (tech companies) wrap a rope around
| his neck while standing on a ladder. HEY HOLD MY BEER! It's all
| fun and games when an administration "in charge" has your back.
|
| It's going to be 100% different the day that changes. It's almost
| like people forget the pendulum of leadership in the US has a
| well known tendency to swing.
|
| Don't get me wrong, in principle I can get behind the idea of
| this but at the same time I don't think I'd feel comfortable with
| any actual implementation of the idea let alone the people in
| charge of the implementation. In my own personal view I feel like
| the harm done by people publishing stupid bullshit online is
| vastly outweighed by what amounts to government backed methods to
| cull publishing stupid bullshit online.
|
| This cure is worse than the disease in my opinion.
| [deleted]
| samlevine wrote:
| You are not immune to censorship.
|
| The things that you say and write now are very likely going to be
| put under a microscope in the future and be found wanting.
|
| And you will have to choose between lying, keeping your mouth
| shut, and having your entire life (digital and personal) torn
| apart for not agreeing with whoever is deciding what's true that
| day.
|
| If you want to have freedom of speech you have to have the
| freedom to say things that some people think are lies. And let
| other people do the same.
|
| There are laws to deal with edge cases that are seriously
| damaging, like libel. Which, unsurprisingly, is quite hard to
| prosecute in the US due to first amendment protections.
|
| Google banning illegal content is entirely reasonable. Kicking
| out users who are disruptive to the platform itself is also
| justifiable. This is not that.
| ilamont wrote:
| "Misleading content" is already an issue for anyone who uses
| Google Merchant Center, and can cause your account to be
| suspended. Try getting a human being to give you a straight
| answer on what this means in specific cases - it's nearly
| impossible. Small accounts can't get through to anyone, and even
| if larger accounts or consultants can, the Googlers on the other
| end can't explain what triggered their "AI" to shut down the
| account.
| GekkePrutser wrote:
| The problem with this is that people use Google Drive for a lot
| of purposes. Including private ones.
|
| I really don't want Google removing my stuff because of some T&C
| crap when I'm not even sharing it with anyone. An external HDD is
| much less likely to break but it doesn't try to impose any
| opinions on my data. Overall the risk of data loss is much less
| with redundant personal storage.
|
| At least just disable the sharing feature but don't delete it or
| ban the account. It makes the service completely unreliable as a
| storage medium. Sharing is only a tiny part of the features of
| these services and the only part where such limitations should
| apply.
|
| For example I don't have anything really bad but I do have
| copyrighted stuff on my O365 like ISO images to have handy when I
| need to reinstall something. I'm not sharing them but if they'd
| be deleted anyway I'd be really really pissed.
|
| And really, Google/MS shouldn't even be able to see what we store
| on Drive unless we try to share it publicly. It really should be
| zero-knowledge encrypted. Luckily there's apps like Cryptomator.
| sillysaurusx wrote:
| Throwing in my comment like a penny down a wishing well:
|
| Google recently _turned off my gmail_ because _my google drive
| quota was exceeded_. They were "protecting" me from myself, by
| denying me the ability to buy more storage. They refuse to accept
| any payment method until I deliver a passport photo to them.
|
| The problems here go a bit beyond them being the steward of
| information or arbiter of right vs wrong. They'll deny you
| service for any or no reason, and leave no recourse when their
| algorithm malfunctions.
|
| That said, they can ban distribution of misleading content. It's
| their right, and no one can dispute that. I'm not sure what
| people hope to accomplish by expressing outrage over this. We
| wouldn't want providers _not_ to be able to decide what content
| to distribute.
| anigbrowl wrote:
| _They refuse to accept any payment method until I deliver a
| passport photo to them._
|
| That seems like something worth supporting with screenshots and
| posting as its own thread. I've never heard of anyone needing a
| passport photo for a credit card transaction, at least not in
| the US.
| shadowgovt wrote:
| > We wouldn't want providers not to be able to decide what
| content to distribute.
|
| That is basically what some people want.
|
| Hacker News is looking at the choice between the George Orwell
| future and Neal Stephenson future and finding neither to its
| liking.
| scrps wrote:
| The very idea of corporations deciding on what is correct thought
| should shake people to the core regardless of ideology. Some
| might come up with hair-splitting justifications but this will
| come back to bite all of us, and we're letting it happen. I'd bet the
| bank on it.
| corty wrote:
| Oh, Google Drive bans advertisements? Interesting... ;)
| simplecto wrote:
| Have a look at Arweave - https://www.arweave.org/
|
| It is an interesting exploration of crypto-based storage.
| reedjosh wrote:
| Gross.
|
| All must recognize the indubitable authority of the ministry of
| truth.
|
| https://en.wikipedia.org/wiki/Ministries_of_Nineteen_Eighty-...
| seneca wrote:
| FYI rclone is a great CLI tool that lets you easily pull all
| content out of a Google Drive and push it to several different,
| easily configured backends, or store it locally.
|
| It can also be used to encrypt all content on your drive,
| presumably hiding content from big brother, but also making it
| useless for sharing.
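|
| For instance, a rough Python sketch of pulling everything down,
| assuming rclone is installed and a Drive remote named "gdrive" has
| already been set up via `rclone config` (the remote name and local
| path are just placeholders):
|
|     import subprocess
|     from pathlib import Path
|
|     # Local folder to mirror the Drive contents into.
|     dest = Path.home() / "drive-backup"
|     dest.mkdir(exist_ok=True)
|
|     # "rclone copy" downloads everything from the remote root
|     # without deleting anything locally; --progress shows status.
|     subprocess.run(
|         ["rclone", "copy", "gdrive:", str(dest), "--progress"],
|         check=True,
|     )
|
| The encryption mentioned above is rclone's "crypt" remote type,
| which layers encryption on top of any other configured backend.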
| cs2733 wrote:
| Relevant: How The CIA made Google (2015)
| https://medium.com/insurge-intelligence/how-the-cia-made-goo...
|
| Is Google simply acting in its economic interest? Is it to
| avoid liability from hosting damaging content? Is it acting to
| protect us?
|
| Google, which isn't even the most valuable company in the world,
| is worth more than all but - give or take - the top 36 countries
| in the world.
|
| A company of this size wields immense power while simultaneously
| remaining unhampered by all the dead weight governments have to
| haul around to get anything done.
|
| [edit: switched link to original]
| w0m wrote:
| My wife gets a gdrive share ~5x/day from strangers filled with
| xxx material. Maybe related?
| cameronh90 wrote:
| I wish they would solve this malicious/spam sharing issue.
|
| I have an abusive ex who regularly still sends me stuff by
| sharing it on Google Drive. Google will not block it as it
| doesn't meet their standard of harassment, and even if they
| did, it often comes from a new account.
| trentnix wrote:
| And to be completely consistent with their censorship and account
| termination on YouTube, their app store, and ads, their
| justification for banning "misleading content" will be opaque and
| frustrating. And if they screw up, you'll get the old
| "inadvertent error" or "administrative error" catch all.
|
| The support and excuses for why this should be acceptable are
| pretty much summed up as: as long as it's not my ox being gored,
| it's fine!
| chroem- wrote:
| In 2003 the US Government and media collaborated to start the
| Iraq War based on false reporting. Tens of thousands of people
| died. Today, the notion that Saddam Hussein had no weapons of
| mass destruction would be suppressed as "misinformation" because
| it contradicts the official narrative.
| katabasis wrote:
| Is "misleading content" going to be determined on a per-country
| basis? Will Google ban activists in Hong Kong from distributing
| "misleading content" about the PRC? How about people in Hungary
| sharing "misleading" resources for LGBT youth?
| bobthechef wrote:
| Is this part of the general push by the White House to censor
| such content across all communication channels? I know Biden was
| going to push telecoms to monitor, and I think even censor, text
| messages that spread anything about COVID that doesn't agree with
| the official narrative.
| errantmind wrote:
| Can all the people moralizing, grandstanding, and otherwise
| signaling their outrage stop using and recommending Google
| products? Yes, even if you feel like the alternatives are
| inferior.
| throw_m239339 wrote:
| So is Google Cloud going to do that on people's servers and
| databases as well? Delete the (lawful) content Google deems
| "misleading content"?
|
| Can someone from GCP comment here on record?
| Mountain_Skies wrote:
| Simple solution: Google can delete whatever it wants for being
| "misleading content" but if they ever get it wrong, even in the
| slightest, 10% of Google's wealth is transferred to the
| aggrieved. If Google is so confident in its ability to be the
| sole arbiter of truth for humanity, they should have zero
| issue with agreeing to this because it will be zero risk.
|
| Of course they will never agree to such a condition because they
| know they aren't capable of even figuring out how to make
| products that the market wants much of the time. The Google
| Graveyard is proof of this. They sure as hell aren't capable of
| knowing more about virology than the virologists of the world,
| who are still trying to figure things out.
|
| Tech companies wanting to be in control of people's ability to
| communicate with each other is simply a power play. Combat it
| with risk and they'll have to back down or suffer the
| consequences when they make a mistake (which they will, often).
| quickthrower2 wrote:
| There is probably no binary wrong or right in some of the
| scenarios; you'd need a court to decide.
| anigbrowl wrote:
| Don't really see the point of responding to policies you
| dislike with absurdly over-the-top counter-proposals.
| charonn0 wrote:
| This might be reasonable if the aggrieved user has a right that
| Google violated. But that doesn't seem to be the case.
| judge2020 wrote:
| You'll find that not many are for Google being so dominant and
| thus able to limit the spread of information (or misinformation),
| but you'll find almost everyone against actually taking some of
| Google's wealth for doing so, since they're still a private
| company and can do whatever they want with their services, even
| if that's deleting information or removing files with the letter
| 'x' in their file name.
| spacephysics wrote:
| This is ridiculous; I agree with most of the top comments about
| the need for a new platform.
|
| But if you have such "misleading" content as Big Brother sees it,
| shield it from them.
|
| Encrypt the content, give out the key. Even if it's a simple key,
| who cares. I'd bet you could even share it as a doc with other
| people; their internal algorithms surely aren't sophisticated
| enough to figure it out.
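|
| A minimal sketch of that idea in Python, using the third-party
| "cryptography" package (the file names are just placeholders):
|
|     from cryptography.fernet import Fernet
|
|     # Generate a symmetric key; hand this to recipients out-of-band.
|     key = Fernet.generate_key()
|     print("share this key separately:", key.decode())
|
|     # Encrypt the file before it ever touches Drive.
|     with open("notes.pdf", "rb") as f:
|         ciphertext = Fernet(key).encrypt(f.read())
|     with open("notes.pdf.enc", "wb") as f:
|         f.write(ciphertext)
|
|     # Recipients reverse it with Fernet(key).decrypt(ciphertext).
|
| Whatever lands on Drive is then just ciphertext; only people
| holding the key can read it.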
| pm90 wrote:
| Actual Content:
|
| Do not distribute content that deceives, misleads, or confuses
| users. This includes:
|
| Misleading content related to civic and democratic processes:
| Content that is demonstrably false and could significantly
| undermine participation or trust in civic or democratic
| processes. This includes information about public voting
| procedures, political candidate eligibility based on age /
| birthplace, election results, or census participation that
| contradicts official government records. It also includes
| incorrect claims that a political figure or government official
| has died, been involved in an accident, or is suffering from a
| sudden serious illness.
|
| Misleading content related to harmful health practices:
| Misleading health or medical content that promotes or encourages
| others to engage in practices that may lead to serious physical
| or emotional harm to individuals, or serious public health harm.
|
| Manipulated media: Media that has been technically manipulated or
| doctored in a way that misleads users and may pose a serious risk
| of egregious harm.
|
| Misleading content may be allowed in an educational, documentary,
| scientific, or artistic context, but please be mindful to provide
| enough information to help people understand this context. In
| some cases, no amount of context will allow this content to
| remain on our platforms.
| cjdrake wrote:
| This seems like a bad idea.
| s3r3nity wrote:
| How is this _not_ on the front page?
|
| Google is analyzing your files for content, flagging anything that
| doesn't align with narratives that a government body sets - and
| _removing all trace of that content._
|
| If this doesn't conflict with the "hacker ethos," what the fuck
| is Hacker News then?
| floatingatoll wrote:
| It's on the front page right now, but I'm not sure it deserves
| to be, given the quality of HN's discussion so far. The top ten
| highest voted comments are essentially redundant copies of the
| same outrage, stated slightly differently. If everyone is just
| going to pile on and state how upset they are, instead of
| replying to each other's comments and keeping one thread for
| all the outrage, what's the point in bringing this to the front
| page at all? Where's the curious and interesting discussion?
| Outrage is pervasive on TV and the Internet and yet, as
| presented here so far, it's the most uninteresting response
| possible.
| semitones wrote:
| Why don't you be the change you want to see, and contribute
| constructively with your own opinion to facilitate curious
| and interesting discussion? Complaining about it is also not
| that interesting.
| floatingatoll wrote:
| I am. Meta-commenting about HN is a constructive contribution to
| the quality of the site and, in some instances, leads to quality
| contributions within a specific discussion as well. The site
| admin 'dang' certainly does a
| lot more of it than I do, and regularly links to his own
| past meta-commentaries, so while I respect that they're
| uninteresting to you, I disagree.
| natural219 wrote:
| Why are you not outraged, lmao.
| floatingatoll wrote:
| I am, but anything I would have said was already said by
| one of the top ten redundant outrage comments, so I'm not
| wasting HN's time piling on in that regard. Upvotes are a
| stellar alternative to "me too" replies that contribute
| nothing but redundancy.
|
| It's interesting that you interpret my reply as containing
| an opinion about the post topic, when it doesn't state any
| opinion about Google's actions at all. I don't consider it
| appropriate to infuse every statement I make with outrage,
| even if I'm outraged, because that saps the life from
| communities when it's a common practice. Perhaps you're
| (incorrectly) reading between the lines and taking the lack of
| infused outrage as some sort of statement of my opinion?
| jodrellblank wrote:
| From the link: " _After we are notified of a potential policy
| violation, we may review the content and take action, including
| restricting access to the content, removing the content_ ".
|
| Would you expect any company not to do this?
|
| Where does it say "Google is analyzing your files for content"
| in the way you claim? It's not "all content that doesn't align
| with narratives that a government sets", it's "information
| about public voting procedures, political candidate eligibility
| based on age / birthplace, election results, or census
| participation that contradicts official government records.".
| That's a lot more specific than you imply, and a lot more like
| "you can't spin up an overthow-the-government movement using
| Google services" than "you can't disagree with the government",
| even if it's literally "you can't say Obama was born in Kenya"
| - there is apparently a human review involved.
| healthysurf wrote:
| It's not even about government narratives - it's about
| narratives of executives and owners of tech companies.
| TameAntelope wrote:
| Hacker News is a place for people to discuss interesting
| things, while being generally associated with Y Combinator, a
| startup incubator.
|
| I've said this before, and dang has corrected me, but I believe
| this is a good thing for Y Combinator, because it attracts
| smart people to the YC brand, and gives YC companies a pool of
| talent they can pick from to build companies.
|
| The "Hacker" part has always, to me, been about makers, not
| about ethics per se. I've always found the notion that there is
| a shared "hacker" set of mores to be kind of curious.
|
| Ripping things apart to see how they work vs. kludging
| something together to solve a specific problem vs. adversarial
| digital trespassing for fun and profit -- none of that lends
| itself, to me, to some specific shared morality. Certainly a
| few personality traits will show up, but folks trying to build
| a set of acceptable behaviors out of that will probably not
| find a ton of consistency across the people who engage in the
| aforementioned activities.
| pwned1 wrote:
| It appears as though this post has been memory-holed.
| dang wrote:
| It set off the flamewar detector. We turned that off when we
| saw it.
| motohagiography wrote:
| It's an interesting form of curation: if something has been
| censored, now that's something I'm interested in knowing about.
| Dangerous means powerful, and powerful is valuable. Most of
| everything is crap, but what's not crap is stuff that genuinely
| puts dominant narratives at risk.
|
| I can't remember the name of it, but there is a feature of change
| in systems that has to do with complexity, where the change
| happens rapidly as the result of explosive exponential growth,
| rather than in relatively linear increments (period doubling?). In that
| view, something being censored is a good leading indicator that
| it's a candidate for sudden growth.
| unyttigfjelltol wrote:
| What could go wrong.
| russh wrote:
| Impartiality is just one in a long line of canceled Google
| products.
| [deleted]
___________________________________________________________________
(page generated 2021-07-16 23:00 UTC)