[HN Gopher] Facebook Is Not Necessary
___________________________________________________________________
Facebook Is Not Necessary
Author : chrbutler
Score : 33 points
Date : 2021-04-24 16:44 UTC (6 hours ago)
(HTM) web link (www.chrbutler.com)
(TXT) w3m dump (www.chrbutler.com)
| renewiltord wrote:
| Guys, when you give people the ability to share things, they'll
| share things with each other that aren't true.
|
| Facebook is as much to blame for this as Gutenberg is.
|
| The extent to which people will go to avoid accepting that
| grandpa actually is racist is astonishing. No, dude, Facebook
| didn't make grandpa racist. It gave your racist grandpa the
| ability to talk to other racists.
|
| Facebook accurately detects that your grandpa wants to talk to
| racists and gives him that ability.
| rkk3 wrote:
| "Facebook is as much to blame for this as Gutenberg is."
|
| The invention of the printing press is a good analogy to
| Facebook; it catalyzed the Protestant Reformation, which
| caused centuries of persecution, war, and unrest in Europe.
| Communication media are very powerful tools that have far
| reaching effects. It's not about "blame".
| chrbutler wrote:
| What Facebook and Gutenberg have in common is that money
| controls the flow of information. But that's where the
| comparison ends.
|
| If Gutenberg had built his printing press to print only
| information that fit within one's existing preferences, then
| it would have been a lot like Facebook. But the reason we
| know about Gutenberg is that his press did the opposite.
| chrbutler wrote:
| I think that's a fair counterpoint, but I'm not sure it covers
| the whole charge.
|
| What Facebook does is both limit the diversity of opinion to
| which a single user is exposed and extend the reach of
| particular points of view (those on the more extreme end, which
| are more "performative"). That validates points of view that
| can be expressed harmfully, and creates a gathering cloud of
| cultural force that would likely not exist without its
| influence.
| m0llusk wrote:
| Okay, but if I see something shared that I know to be false and
| can demonstrate to be false with links to robust summaries of
| evidence and argument from trusted sources then how do I
| respond? The Facebook model is that I have no real alternative
| except to share opposing views with others in my bubble. It is
| fine for your grandpa to talk to racists, but if he shares his
| newfound junk with me then I should be able to flag it and link
| the truth such that any viewer of his posts can see how others
| see it.
| lupire wrote:
| The algorithms radicalize people:
|
| https://www.wired.com/story/christchurch-shooter-youtube-rad...
|
| Just because Facebook and YouTube management aren't doing it as
| intentionally as Fox News management does, that doesn't make
| the effect any less real.
| bsder wrote:
| > Facebook accurately detects that your grandpa wants to talk
| to racists and gives him that ability.
|
| Facebook _ENCOURAGES_ grandpa to talk to racists because that
| increases "engagement". And they _block_ his contact with non-
| racists.
|
| That's a gigantic difference.
|
| Okay, let's say grandpa is racist. But he has to keep it toned
| down because his grandkids don't like that talk and admonish
| him occasionally.
|
| Facebook, however, gives him _positive reinforcement_ to be
| racist. And he will now argue with his grandkids because he has
| the social validation from Facebook and Fox that "It's
| perfectly okay to be racist, and there are lots of racists just
| like you." And Facebook will match him with more people who are
| increasingly racist because that increases the engagement
| further.
|
| Facebook _amplifies_ the tendency rather than _damping_ the
| tendency.
|
| And eventually all his grandkids can do is quit coming to see
| him because he's now a lost cause.
|
| We saw this in the January 6th insurrection. Lots of people
| didn't get the fact that they were participating in an
| _insurrection_ because their social validation circle was too
| closed to point that out.
| paulpauper wrote:
| > When enough fake news is spread, serious things can happen. Bad
| things. At a small scale, an individual's reputation can be
| ruined. At a larger one, a foreign government can manipulate the
| outcome of a rival's election.
|
| people keep repeating this like it is a truism, but how many
| specific examples are there of this and how much damage does fake
| news really do? Which specific piece of fake news influenced the
| 2016 U.S. presidential election? Where are these fake news
| stories that are always going viral? The only viral fake news
| that I have seen are The Onion and Babylon Bee but those are
| intentional satires.
| smt88 wrote:
| You're asking for data and then using your anecdotal experience
| as proof that whatever data you're going to get will be false.
|
| Do you see how that A) puts the burden of research you should
| do yourself onto someone else, and B) tells people that you
| won't believe them anyway?
|
| There have been thousands of articles and studies with specific
| examples. You should try Googling it first and build up a real
| counterpoint before telling people you don't believe all the
| research.
| chrbutler wrote:
| https://www.nature.com/articles/s41562-020-0833-x
| chrbutler wrote:
| A team of researchers led by Andrew Guess of Princeton
| University tracked the internet use of over 3,000 Americans in
| the lead-up to the 2016 presidential election. They found
| Facebook to be the referrer site for untrustworthy news
| sources over 15% of the time. By contrast, Facebook referred
| users to authoritative news sites only 6% of the time.
|
| The authors state, "This pattern of differential Facebook
| visits immediately prior to untrustworthy website visits is
| not observed for Google (3.3% untrustworthy news versus 6.2%
| hard news) or Twitter (1% untrustworthy versus 1.5% hard
| news)."
| bquest2 wrote:
| Is there updated data from the 2020 election? FB has done a
| lot since 2016.
| chrbutler wrote:
| and more:
|
| "approximately 57% of Trump supporters read at least one fake
| news article in the month prior to the 2016 election compared
| to only 28% of Clinton supporters. Older Americans were also
| more likely to visit untrustworthy news websites.
|
| Perhaps most alarming is the observed "stickiness" of fake news
| websites. The researchers estimate that people spend an average
| of 64 seconds consuming a fake news article compared to only
| 42 seconds on verified news stories."
___________________________________________________________________
(page generated 2021-04-24 23:02 UTC)