[HN Gopher] Facebook exec blames society for COVID misinformation
       ___________________________________________________________________
        
       Facebook exec blames society for COVID misinformation
        
       Author : cbtacy
       Score  : 177 points
       Date   : 2021-12-13 17:54 UTC (5 hours ago)
        
 (HTM) web link (www.axios.com)
 (TXT) w3m dump (www.axios.com)
        
       | saboot wrote:
       | My bad, sorry everyone. Guess this was all my fault.
        
       | mikem170 wrote:
       | What about their algorithm?
       | 
        | Facebook decides what to show people. They could show you your
        | friends' posts in chronological order, and/or let people have
        | control over what they see.
       | 
       | But no, Facebook decides what people see. Therefore they have
       | some responsibility for the spread of misinformation.
        
         | freediver wrote:
          | And people decide to use Facebook. I am not trying to defend
          | it, but blaming it 100% on Facebook is not fair. Even if their
          | algorithms were perfectly tuned to amplify misinformation,
          | there would still need to be enough people reading and sharing
          | content for it to have an effect.
          | 
          | A solution could be a paid Facebook, which would change both
          | the number of people using it and their incentives.
        
           | nowherebeen wrote:
           | > And people decide to use Facebook
           | 
            | I don't choose to use WhatsApp, but I have to because that's
            | what my family members use, and they aren't tech-savvy enough
            | to use anything else. So no, it's not a simple choice. Once a
            | product saturates the market, it gets very difficult to
            | replace.
        
           | mithr wrote:
           | > blaming it 100% on Facebook is not fair.
           | 
            | The problem is that humans are never 100% rational. If the
            | audience for Facebook were purely rational robots, then sure,
            | you could argue that since they could simply stop caring
            | about these problematic things and the issue would go away,
            | it's not Facebook's fault that they haven't done so.
            | 
            | But given that we are talking about humans, once Facebook has
            | spent considerable time and money _studying_ human behavior
            | and society in general, exactly _in order to_ figure out how
            | to maximize their own goals over anything else, I think they
            | should take the blame when there are negative side effects to
            | doing so. Saying "well, if society just stopped caring about
            | (e.g.) negative content it'd be fine, so it's society's
            | fault" is misdirection at best. It ignores both the
            | concentrated effort Facebook has spent on achieving this
            | outcome and the hoops they've spent the past few years
            | jumping through to defend their choices once people started
            | calling them out on it.
        
             | freediver wrote:
              | This is why I suggested 'paying for Facebook'. Legislation
              | could exist that simply says things with a commercial
              | interest behind them cannot be given away for free.
              | 
              | Even a price of $0.01 would radically change the
              | environment on Facebook.
        
         | hardtke wrote:
         | Facebook doesn't really decide what you see, but instead
         | optimizes what you see to maximize your engagement. If you
         | never engage with political content or misinformation, you
         | generally won't see it. Once you start engaging, it will drown
         | out everything. What they could provide is a "no politics"
         | option, but I wonder if anyone would utilize it. There was an
         | old saying in the personalized recommendations world along the
         | lines of "if you want to figure out what someone wants to read,
         | don't ask them because they will lie." For instance, if you ask
         | people what kinds of news they want they will invariably check
         | "science" but in fact they will mostly read celebrity gossip
         | and sports.
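          | 
          | A toy sketch of what "optimizing for engagement" means versus
          | a plain chronological feed (hypothetical scoring function, not
          | Facebook's actual ranking code):
          | 
          |     def chronological(posts):
          |         # newest posts first, no other weighting
          |         return sorted(posts, key=lambda p: p["created_at"],
          |                       reverse=True)
          | 
          |     def engagement_ranked(posts, predict_engagement):
          |         # predict_engagement scores how likely the viewer is
          |         # to click, comment on, or share a given post
          |         return sorted(posts, key=predict_engagement,
          |                       reverse=True)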
        
           | mithr wrote:
           | Facebook decides what you see. That they have created an
           | algorithm that "maximizes engagement" is just another way of
           | saying that they've decided what you should see is what they
           | believe will keep you most "engaged". They could choose to
           | use a different metric -- it is entirely their choice.
        
           | wvenable wrote:
            | Facebook has experimented with a number of different options
            | to clean up your feed, but ultimately they never get deployed
            | because they all decrease engagement.
        
         | andrew_ wrote:
         | This is really the crux and it doesn't get as much attention as
         | it should.
        
           | saddlerustle wrote:
           | It's really not. WhatsApp has just as big of a misinformation
           | problem without any sort of algorithmic ranking.
        
             | dariusj18 wrote:
             | And I, personally, wouldn't blame Facebook for the spread
             | of misinformation via WhatsApp because it doesn't make
             | sense to.
        
           | dntrkv wrote:
           | It doesn't get enough attention? The "algorithms" are all
           | anyone talks about when it comes to this issue. I think
           | people put way too much weight on them.
           | 
           | Once you have enough people participating in a single
           | conversation, from all walks of life, the discourse will go
           | to shit very quickly.
           | 
           | Look at Reddit as an example. Small subreddits that are
           | focused on a specific hobby are usually pleasant and
           | informative. Large subreddits that cater to a large audience
           | are no better than the comment section on a political YouTube
           | video.
        
       | cronix wrote:
        | Facebook chooses what I see while on their platform. If they
        | didn't, I'd just see a chronological feed of posts from the
        | friends I chose to follow, as they came through, without any
        | external filtering. Going directly to friends' walls shows that
        | is not the case.
       | 
       | Instead, they amplify emotionally based content that they think I
       | will react to (engagement) by studying previous interactions and
       | don't show me things they don't agree with (censorship) even if
       | it originated from an authoritative primary source. That doesn't
        | sound like it originated in society, but more like a purposeful
       | curation imposed on users, who have to conform if they want to
       | stay. I didn't.
        
       | KaiserPro wrote:
       | Once again Boz has been reading a bunch of pop philosophy books
       | and the press team left him alone with the press again.
       | 
       | He _may_ have made some excellent choices with oculus (price,
       | formfactor feature tradeoff). However he still hasn't grasped the
       | basics of communication.
       | 
        | His internal posts are long, rambling and at a tangent to the
        | point he's making. Someone once told him that allegories in
        | stories can be a more effective means of communication. Either no
        | one has told him he's doing it wrong, or no one he respects has.
        | More importantly, the point he's making is normally painfully
        | reductive, despite the 8k words implying it's well thought out.
       | 
        | The core problem is that he honestly believes that Facebook has
        | done the best it can. He is firmly of the school of thought that
        | he and his team can do anything, and do it better than anyone
        | else.
        | 
        | The problem is, he can't do communication, and it shows. Worse
        | still, he can't empathise with the "other side". I don't mean
        | sympathise, I mean understand why they are thinking the things
        | they are.
        
         | wiz21c wrote:
         | >> basics of communication.
         | 
          | It's not about communication. He's got a big role at Facebook,
          | and Facebook is a big company. They cannot escape their social
          | responsibility anymore. Hiding behind "freedom of choice, of
          | the individual, etc." is simplistic. But sure, with that kind
          | of mindset, Facebook will continue to make piles of money.
        
       | vernie wrote:
       | Boz never not giving off psycho vibes.
        
       | handrous wrote:
       | We ought to stop mixing up what are effectively mass-
       | communication broadcast media with personal social networks. They
       | ruin one another.
       | 
       | But of course that would harm profits.
        
       | diegof79 wrote:
       | Facebook execs are not responsible for what people think, but
       | they aren't neutral either.
       | 
        | The chain of incentives ends up generating a situation where
        | their decisions have a huge influence on society:
        | 
        | - Their goal is to make the company profitable, and they chose
        | ads as the business model.
       | 
       | - Without viewers, there are no advertisers. So, engagement is
       | key.
       | 
       | - They need to create incentives to make people both content
       | creators and followers: share your thoughts, share your photos,
       | and show us what you like.
       | 
       | - Content creation is hard and strong opinions attract people
       | (both detractors and followers).
       | 
       | - A long post format doesn't work for casual engagement, and the
       | UI is optimized for a quick scan (because that helps with
       | engagement).
       | 
       | The result is short posts of shitty content with very strong
       | opinions that create an echo chamber. Can they get out of that
       | trap? I don't know. I've seen good quality content in smaller
       | online communities. (for example, while HN is not small, the
       | quality of the comments is usually better than the article
       | itself). But, I'm suspicious that optimizing for profit
       | contradicts content quality. Something similar happens with TV
       | shows. TV networks increased the number of reality shows: they
       | are cheap to produce, generate strong emotions, and have a higher
       | immediate audience than a high-quality TV series. The high-
       | quality TV series came from media companies like HBO or Netflix
       | because they don't care about optimizing minute-to-minute ratings
       | (they care more about exclusives to attract subscribers).
        
       | PicassoCTs wrote:
        | Well, that's a narrative that has no cheerleaders.
        | 
        | Either all evil is imposed from outside and completely repairable
        | once that evil is removed.
        | 
        | Or all evil is eternal and unfixable, and the best thing one can
        | do is give in.
        | 
        | Nobody seems willing to reverse engineer human brain
        | architecture, to find reachable local optima and best outcomes
        | for a given, minimally changeable species. I guess it lacks the
        | flair.
        | 
        | The irony is that FB has the resources to do such a job. They
        | have the largest behavioral DB on the planet.
        | 
        | They know us better than we know ourselves.
        | 
        | They could accomplish some nice Leviathan cybernetics, but they
        | do not want that.
        | 
        | They want to sell that knowledge as scary tales and social
        | engineering advice to governments (Palantir) and as marketing to
        | everyone with money.
        
       | Barrin92 wrote:
       | The medium is the message as the saying goes, the technology we
       | interact with shapes what kinds of interaction we have. Blaming
       | 'people' makes no sense because 'people' cannot be changed,
       | unless we genetically reengineer humanity to be more
       | accommodating to Facebook's algorithms, which they would probably
       | prefer compared to having to actually fix the problems of their
       | platform
       | 
       | >"At some point the onus is, and should be in any meaningful
       | democracy, on the individual"
       | 
       | Viewing systems issues that are macro at scale through an
       | individualized lens is great for dodging responsibility and
       | Facebook's bottom line but it makes about as much sense as
       | thinking that dealing with climage change will be achieved by
       | everyone voluntarily shopping greener on amazon rather than
       | creating systems that are, collectively and at a mechanism level
       | oriented towards social good
        
       | anikan_vader wrote:
       | In related news, society blames Facebook for COVID
       | misinformation.
        
       | citizenpaul wrote:
       | I mean did you expect facebook to step up and say they were
       | responsible for the spread of misinformation?
        
       | nathias wrote:
        | They are right: these aren't problems of online discourse but
        | rather problems of a lack of critical thinking, reflection and
        | education in general that have become visible with
        | digitalization. The solution isn't to create a Facebook that
        | will regulate and rule the masses for good, but to gradually
        | educate the masses into thinking.
        | 
        | This just seems like a natural side effect of the production
        | mode we have had for some time now: selling ads as the main
        | source of income.
        
       | threatofrain wrote:
       | If I spread business lies on Facebook then Facebook becomes part
       | of the problem. But if I spread lies which result in people
       | dying, then Facebook is suddenly not the problem and we frame it
       | as the fundamentals of democracy.
        
       | alphabettsy wrote:
       | The Fb algorithm favors engagement and controversy to the point
       | that most people may not even see reasonable takes on a given
       | issue. They're not neutral.
        
       | mensetmanusman wrote:
       | Globalization has created winners and losers.
       | 
        | The losers were ignored by the winners, creating huge gaps in
        | trust.
       | 
       | The winners used to utilize the few media institutes that they
       | controlled to keep a lid on discontent.
       | 
       | Social media has complicated the situation.
        
         | csours wrote:
         | Not disregarding your comment, but we are all both winners and
         | losers. I get a cheap phone with amazing capabilities, but
         | healthcare is confusing and expensive for me.
         | 
          | Edit: the balance is different for different people.
        
       | Manuel_D wrote:
       | I largely agree with this, not just in the context of COVID
       | misinformation but a lot of the stuff Meta gets flak for in
       | general.
       | 
       | With respect to Instagram's effect on teens, people seem to
       | conspicuously omit the fact that this leaked internal research
       | showed that users were twice as likely to say that Instagram
       | improves their well-being than harms it. It's really not clear to
       | me how much of this is due to Meta products themselves, versus
       | inherent challenges people tend to experience during adolescence.
       | And also "Facebook knew instagram was hurting teens" is reductive
       | at best, disingenuous at worst given that teens were twice as
       | like to say it benefitted them.
       | 
       | Similar analogies can be made with the Rohingya issue. Talk radio
       | played a big part in inciting the Rwandan genocide. Is it right
       | to say that talk radio was responsible for the genocide? I don't
       | think so, the underlying social issues are mainly the cause and
       | radio was part of the landscape in which in played out. I think
       | it's a similar situation with Facebook. Like radio, they were a
       | communication mechanism in societies that were perpetrating
       | genocide. Facebook did their best to shut it down, but the
       | challenges of suddenly scaling up moderation in a foreign
       | language is hard. Yet people seem to genuinely think that
       | Facebook was knowingly endorsing the genocide.
        
       | Karsteski wrote:
       | > Bosworth defended Facebook's role in combatting COVID, noting
       | that the company ran one of the largest information campaigns in
       | the world to spread authoritative information.
       | 
       | I find this to be such a creepy, off-putting statement. The last
       | thing I want in my society is to have profit-driven tech
       | corporations deciding what is and is not "authoritative
       | information".
       | 
       | Especially given how time and time again they have censored
       | "misinformation" which then proceeded to have merit.
       | 
       | No thank you.
        
         | wutwutwutwut wrote:
         | Did you just consume authoritative information from Axios Media
         | Inc?
        
         | SleekEagle wrote:
         | Well said. I wonder how long it will take legislation to catch
         | up to the unregulated information highways that dominate so
         | much of our national and global thought and perceptions!
        
           | gimme_treefiddy wrote:
           | Won't the regulation force them to censor? I get not wanting
           | corporations to moderate discourse, but I can't help but feel
           | they get shit from both ends.
           | 
           | Fuck you if you censor, and fuck you if you don't.
        
             | SleekEagle wrote:
              | I don't think censorship is necessary, given that it isn't
              | the only way to address the problem of misinformation. For
              | example, providing reasonable metrics on how trustworthy a
              | source is isn't censorship; it's providing information
              | that's useful in gauging how much credibility to assign to
              | a certain report.
              | 
              | Do I think they should _censor_ information? No. But we're
              | all ill-informed with respect to almost all subjects, so
              | knowing whether a climate report came from, e.g., a group
              | of Ivy League researchers vs. ExxonMobil is important, and
              | knowing whether information on evolution comes from PhD
              | evolutionary biologists vs. the Young Earth Creationist
              | Museum matters a lot. What I think we're seeing is a
              | growing distrust of subject-matter experts, but only in
              | select, highly political domains.
             | 
             | I understand what you said about pressure from both ends,
             | but Facebook is a trillion dollar company with an extremely
             | robust AI team. I'm sure they can figure out a solution, or
             | at least come up with a gameplan on how to work their way
             | towards one.
             | 
              | I also understand trepidation with respect to legislation,
              | considering the not-infrequent tendency to under- or over-
              | legislate, but this is common in every area of law, and
              | eventually the pendulum swings slow down to a reasonable
              | middle ground.
        
         | pm90 wrote:
         | > The last thing I want in my society is to have profit-driven
         | tech corporations deciding what is and is not "authoritative
         | information".
         | 
          | Most of the national press (NYT, WaPo, etc.) is profit-driven
          | though, isn't it? They're the ones who generally report and
          | decide what's authoritative...
        
           | gimme_treefiddy wrote:
           | I feel like the "toxic" big tech social media platforms have
           | been a lot more impartial than any of the MSM sources these
           | days. I don't think the parent comment argued for that
           | though.
        
           | Karsteski wrote:
            | Corporate press interests are also a problem, yes.
        
         | weddpros wrote:
          | That's why I think combatting fake news actually promoted it
          | and made the world a worse place: when people hear "fake news",
          | they've learnt to doubt the "fake" part and start thinking
          | "this news is so worth it that Facebook/Google/the media is
          | combatting it to keep me ignorant of information that is vital
          | to me".
          | 
          | That's how I think combatting fake news made things worse for
          | the very people we're trying to protect: it pushed them deeper
          | into the fakeverse (TM).
         | 
         | Second order effects...
        
         | changoplatanero wrote:
         | The last thing that Facebook wants is to be forced into a
         | position where they have to decide what is and is not
         | "authoritative information". They are only doing it because
         | society gives them no other choice except to shut down the
         | whole business.
        
           | SleekEagle wrote:
            | Of course they don't want to be in such a position, because
            | it costs them money. Similarly, it costs e.g. DuPont money to
            | dispose of chemicals responsibly, but we're comfortable
            | requiring them to do it or else face being shut down.
            | 
            | We learned our lesson with DuPont when they were poisoning
           | water supplies. Similarly, I believe Facebook is poisoning
           | our information supplies, which you can see in the rapid
           | uptick in QAnon believers. Hell, they whipped themselves into
           | enough of a frenzy to storm the Capitol building to overthrow
           | an election.
           | 
           | I'm not making any political statements, truly. I'm just
           | noting that these groups and ideas simply would not exist and
           | perpetuate without being facilitated by Facebook's rampant
           | misinformation problem.
        
       | scyzoryk_xyz wrote:
       | Yeah I don't think someone in that role from fb is particularly
       | qualified to talk about society and human nature in relationship
       | to social networking.
       | 
       | It's like listening to someone who builds, designs and optimizes
       | production lines in cigarette factories philosophize about why
       | people smoke and whether it is their free choice to do so.
        
       | mc32 wrote:
        | Maybe Facebook's graph could be designed differently, so that
        | you have circles of relationships: family, friends,
        | acquaintances, business relationships, interests and everyone
        | else. By default, the closer a relationship is to the periphery,
        | the less it gets promoted; the smaller the circle, the higher the
        | chance of promotion.
        | 
        | That way idiocy doesn't spread far and wide and instead has a
        | limited transmission radius.
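        | 
        | A rough sketch of that weighting (hypothetical circle names and
        | weights, not anything Facebook actually does):
        | 
        |     import random
        | 
        |     # closer circles get a higher chance of promotion
        |     CIRCLE_WEIGHTS = {
        |         "family": 1.0,
        |         "friends": 0.6,
        |         "acquaintances": 0.3,
        |         "business": 0.15,
        |         "everyone_else": 0.05,
        |     }
        | 
        |     def should_promote(circle):
        |         # promote with probability tied to the circle's weight
        |         return random.random() < CIRCLE_WEIGHTS.get(circle, 0.05)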
        
         | jonas21 wrote:
         | You clearly don't have as many crazy people in your family as I
         | do.
        
           | mc32 wrote:
            | It's not that there aren't crazy relations, but rather that
            | the craziness stays within the family or circle of friends.
        
       | addcn wrote:
       | There are no causes, just dynamics.
       | 
        | And Facebook and society are in one together. Society functioned
        | 'better' before Facebook, so I'd start looking at the Facebook
        | end of the dynamic first.
       | 
       | QED.
        
         | ethanbond wrote:
          | Also, when it comes down to it, I think that, aside from the
          | cited exec, most people would prefer that we dismantle Facebook
          | rather than society.
        
       | gfodor wrote:
       | Boz is 50% right but would be 100% right if they ditched the
       | business model that leads them to be incentivized into spying on
       | people and feeding them misinformation they'll click on.
        
       | snthd wrote:
       | It's meta being meta about meta.
       | 
        | On the one hand, people have free will to believe what they want
        | to - and, apparently, Facebook has no influence on or
        | responsibility for that.
       | 
       | On the other hand Facebook is entirely in the business of selling
       | influence to change what people believe.
       | 
       | The meta is that this is a piece trying to influence what people
       | believe about Facebook's influence.
       | 
       | I guess that makes this meta about meta being meta about meta.
       | 
       | Outrage against Facebook being too influential is marketing for
       | Facebook adverts. It's a logical PR strategy. There's a perverse
       | incentive to do it for real, and for Facebook to cause actual
       | harm.
       | 
       | It doesn't matter if anyone in Facebook actually believes that
       | (following a perverse incentive is a good idea). All that needs
       | to happen is for the incentives to be aligned that way. Which
       | might literally be the famed "optimising for engagement".
       | https://www.youtube.com/watch?v=hn1VxaMEjRU
        
       | dandanua wrote:
       | Society is dumb because execs of Facebook and other mega
       | corporations want it to be dumb. They're building walled gardens
       | where the herd can graze in a virtual reality, fully controlled
       | by them.
       | 
        | Yes, the average man has no idea how viruses work or what mRNA
        | is, but he is not too dumb to understand that Zuckerberg and
        | other billionaires just don't give a ** about his life. Hence all
        | the conspiracies and the denial of whatever comes from
        | authorities.
        
       | shadowgovt wrote:
       | Facebook is discovering what casinos, distilleries, and cigarette
       | companies learned from their own experiences:
       | 
       | Sometimes, you stumble across something wildly successful that
       | hits so hard partially because it keys into the pathological
       | behaviors of some human beings (which can become self-
       | destructive).
       | 
       | When you discover something like that, you either volunteer to
       | regulate your interactions with your customers or the government
       | steps in to regulate them for you.
        
       | jimbob45 wrote:
       | Thought experiment #1: Facebook/Instagram disintegrate overnight.
       | During that same night, a decentralized version appears that
       | operates on something resembling the torrent protocol where users
       | install a small receiver/transmitter on their machine and are
       | able to participate in something perfectly resembling FB today
       | (without the ads). The cases of bullying and such remain. How do
       | you go about solving the problem in that case?
       | 
       | Thought experiment #2: The timeline is changed such that Bitcoin
       | is actually made into a centralized protocol with one company at
       | the middle with perfect control. Do lawmakers go about shutting
       | it down because it contributes to drug exchanges and black
       | markets?
        
       | somehnacct3757 wrote:
       | This dodges the real issue which is that Facebook profits from
       | users spreading disinformation. In fact this content is more
       | engaging so their profit maximizers will amplify the
       | disinformation. And as Frances Haugen's testimony demonstrates,
       | FB knows this internally and chooses their profits over their
       | users' well-being.
       | 
        | What the FB exec is trying to do here is akin to oil companies
        | cleaning sludge off ducks. "Look! We're helping! We're part of
        | the solution!" But the ducks shouldn't have any sludge on them in
        | the first place.
        
         | macns wrote:
         | Adding to your comment:
         | https://www.theguardian.com/technology/2021/oct/05/facebook-...
         | 
          | Everyone should be reminded of Frances Haugen's testimony. Two
          | months later, it's like everyone has forgotten, so a Facebook
          | exec can blame it on society.
        
           | jjkaczor wrote:
            | Less than two weeks ago CNN reported that Facebook took money
            | for ads relating the vaccine to the Holocaust:
            | 
            | https://www.cnn.com/2021/12/02/tech/facebook-vaccine-
            | holocau...
            | 
            | Society just doesn't seem to remember much of anything these
            | days.
        
             | s1artibartfast wrote:
             | >A spokesperson for Meta, Facebook's parent company, said
             | the ads comparing the US Covid-19 response to Nazi Germany,
             | comparing vaccines to the Holocaust, and the ad suggesting
             | the vaccine was poison went against Facebook's vaccine
             | misinformation policies.
             | 
              | It seems terrifying to me that Facebook has such policies
              | in the first place.
              | 
              | I agree that a shirt comparing the US Covid-19 response to
              | Nazi Germany or Fauci to Josef Mengele is in poor taste,
              | but to call a T-shirt misinformation is a stretch.
             | 
             | "I'm originally from America but I currently reside in 1941
             | Germany"
             | 
             | This isn't misinformation because it isn't information. It
             | is political criticism and opinion.
        
               | jjkaczor wrote:
               | Well - maybe if ads in the linear feed were not treated
               | like every other kind of "information" and actually did
               | not allow "Likes, Comments & Shares", I could agree with
               | you a little more.
        
               | s1artibartfast wrote:
               | Can you expand on how "Likes, Comments & Shares" changes
               | disagreeable opinions to misinformation? Is it because
               | individuals post actual misinformation in the comments?
        
       | _moof wrote:
       | At least he admits there's a problem. I guess that's progress?
        
       | kodah wrote:
       | Why not both?
       | 
       | Facebook (and more broadly, social media) is to blame for letting
       | every idiot in the world go viral with their terrible opinions.
       | 
       | Society is to blame for the content.
        
       | tamentis wrote:
       | At this point they define how society behaves.
        
       | hui-zheng wrote:
        | btw, is this host a man or a woman, or some other gender? I
        | genuinely want to know so that I can use the correct gender
        | pronouns for this person. The host should wear a badge with their
        | preferred gender pronouns.
        
       | AtNightWeCode wrote:
        | Blame society. Facebook IS a large part of society. It's the same
        | way religious people blame things on culture. Religion is a large
        | part of culture.
        
       | rishabhsagar wrote:
       | Isn't this the same point as "Guns don't kill people, people kill
       | people"?
       | 
       | While that statement is objectively true, the fact remains that
       | Facebook amplifies and benefits from sensational posts.
       | 
       | They are in a position to moderate the worst impacts of mis-
       | information and yet consistently appear to be falling short.
       | 
        | While I agree that parts of society are to blame for the
        | toxicity online, FB (and other social media companies) are
        | certainly in a position to do a better job of managing some of
        | the worst impacts, such as vaccine hesitancy and political mis-
        | information.
        | 
        | In fact, they appear reluctant to act on it, since sensational
        | posts and controversial topics appear to drive attention.
        
         | 908B64B197 wrote:
         | > Isn't this the same point as "Guns don't kill people, people
         | kill people"? While that statement is objectively true, the
         | fact remains that Facebook amplifies and benefits from
         | sensational posts.
         | 
         | And yet simply looking at Switzerland or Vermont,
         | countries/States with a high number of guns per capita and an
         | extremely low rate of gun violence, tells us a different story.
        
           | mrtranscendence wrote:
           | I'm not sure what you're even responding to here. How does it
           | matter that there are areas with high gun ownership and low
           | gun violence? It remains true that a gun gives the means to
           | more easily kill someone, which is all that the analogy needs
            | -- unless you're arguing that guns _don't_ make it easier to
           | kill, in which case I don't know why so many people bother
           | using them.
        
       | jjkaczor wrote:
        | Just 11 days ago CNN reported that Facebook sold ads comparing
        | the vaccine to the Holocaust...
        | 
        | https://www.cnn.com/2021/12/02/tech/facebook-vaccine-holocau...
        | 
        | Hmmm - where is that meme of Gene Wilder from Charlie and the
        | Chocolate Factory grinning maniacally while saying something
        | along the lines of "Oh yes, please tell me more..."
        
         | ahartmetz wrote:
         | That is the advertiser's fault. /s
        
       | ElectronShak wrote:
       | I agree, and to assume that censorship helps society become
       | better is just daft thinking.
       | 
        | What happened to things like "It's healthy to question everything
        | around you"?
        
       | barefeg wrote:
       | Facebook is trying to save the internet culture of 20 years ago.
       | A teenager version of me would 100% support what they are trying
        | to do. But mainstream society is not compatible with the internet
        | mentality of back in the day.
        
       | imapeopleperson wrote:
       | If corporations won't take accountability then they shouldn't be
       | allowed to censor
        
         | SQueeeeeL wrote:
          | I'm not sure what this comment is trying to get at. It's a
          | blanket call for Facebook to take "accountability", but for
          | what specific actions? Private corporations will always have
          | control over what goes on on their platforms. If the US
          | government ran a version of Facebook, it would be a different
          | story.
        
         | wutwutwutwut wrote:
          | That makes no sense to me. So if they don't hire a ton of
          | people, they shouldn't be allowed to remove obvious child porn
          | from their site?
        
           | gambiting wrote:
           | Like, there's a difference between removing material that's
           | actually illegal and removing material that they just don't
           | agree with as a company. I think the difference is important.
        
           | SkittyDog wrote:
           | Hey, a genuine Strawman argument!
           | 
           | You've implied two separate false equivalences, here:
           | 
           | * That "censorship" necessarily and only means "removing
           | obvious child pornography", AND
           | 
           | * that in order to "take accountability", a company must
           | "hire a ton of people".
           | 
            | You haven't established why either of those equivalences is
            | a reasonable interpretation of the parent statement. And to
           | this casual observer, it would appear as though you are
           | deliberately conflating extremes in a rather disingenuous
           | fashion.
        
             | mrtranscendence wrote:
             | No, the poster is not implying that "'censorship'
             | necessarily and only means 'removing obvious child
             | pornography'". They're implying that removing child
             | pornography is one form that censorship might take, and
             | that entirely removing the ability to censor would remove
             | the ability to censor child pornography. You might have
             | valid objections to this, but it's not the same as what
             | you've written.
             | 
             | As to the second point, either an organization has the
             | manpower to defend itself from content that it is
             | "accountable" for, or it will be unable to defend itself
             | when necessary. It seems pretty clear to me that only
             | organizations with decent resources would be able to
             | moderate content in the "if you censor, you're accountable"
             | regime.
        
         | bilbo0s wrote:
         | If it makes you feel any better, I think this exec agrees with
         | you that FB should not necessarily moderate. That said, judging
         | by the blowback being received, it looks as though this exec's
         | position will likely not win the day in the long run.
        
       | SllX wrote:
       | " Facebook exec blames society for COVID misinformation" is the
       | actual title.
       | 
       | I thought this article was going to be about something else.
        
         | andreyk wrote:
         | Thanks for pointing this out. A lot of people on HN seem to
          | instinctively deride Facebook, but this statement is not wrong
         | -- Facebook is a tool for communication, and how people use it
         | is not entirely in its control. At best you could argue that
         | their algorithms might have promoted misinformation since it
         | got more engagement, but I suspect the opposite was true.
        
       | ricardobayes wrote:
        | Facebook is made of people; it's us. We created it, we operate
        | it, and we can make it better or cancel it altogether. It's just
        | on us.
        
       | robomartin wrote:
       | > He said that individuals make the choice whether to listen to
       | that information or to rely on less reputable information spread
       | by friends and family.
       | 
       | > "That's their choice. They are allowed to do that. You have an
       | issue with those people. You don't have an issue with Facebook.
       | You can't put that on me."
       | 
       | This is such nonsense.
       | 
       | Psychology tells us that we are susceptible to message repetition
       | and perceived authority.
       | 
       | At the most basic level this is a necessary element for the
       | survival of the species: Parent constantly tells a kid not to get
       | too close to the edge of a cliff. If the brain wasn't wired to
       | accept such messages without question humanity might have failed
       | the evolutionary fitness test.
       | 
       | I've seen comments on this thread attributing aspects of the
       | social media effect to the village idiot. That's also nonsense.
       | Perfectly intelligent people who are demonstrably not idiots fall
       | prey to these psychological effects. Once someone ascribes trust
       | to a source --whether it is an individual, group, news
       | organization, politician, etc.-- it is nearly impossible to make
       | them see the errors in what they are being led to believe. It
       | takes a particularly open mindset to be able to look outside of
       | what I am going to call indoctrination.
       | 
       | In the US it is easy to identify some of these groups. Besides
       | religious groupings, anyone who will generally refer to
       | themselves as "life-long democrat" or "life-long republican" is
       | far more likely to accept a world view and "truths" from members
        | of those groups. Religion, of course, is likely the oldest such
        | resonance chamber.
       | 
       | Facebook and other social media outlets, along with their
       | algorithms, have introduced segmentation and grouping at a sub-
       | level never before possible in society. Worse than that, they
       | allow and, in fact, are the source of, a constant bombardment of
       | ideas and ideologies in sometimes incredibly narrow domains. This
       | is great when you are trying to understand the difference between
       | using synthetic vs. organic motor oil in your engine. Not so
       | great when it makes someone descend into a deep dark and narrow
       | hole of hatred.
       | 
       | That's the problem. And yes, FB and social media are absolutely
        | at fault for enabling the constant repetition of some of the
       | most negative, violent and counterproductive messaging humanity
       | has ever seen.
       | 
        | I have mentioned this in other related discussions. We've seen
        | this firsthand in our own family. Over the last four years or so
        | we watched two family members (cousins) who grew up together
        | descend into equally extreme opposites thanks to FB. It is
        | interesting because prior to this they didn't even have FB
        | accounts. They each got one at the same time to keep in touch
        | with family. Four years later, one is what I can only describe
        | as a hate-filled Republican and the other an equal and opposite
        | hate-filled Democrat. And 100% of this happened because FB drove
        | these two people into deeper, darker and more hateful holes day
        | after day, for years. The damage done is likely irreversible.
       | 
       | They (FB) didn't need to do that. Yet, that's what these geniuses
       | thought was the "right" thing to do. Brilliant.
       | 
       | I am not generally in favor of heavy-handed government
       | intervention. And yet, I have no idea how else something like
       | this could be corrected in order to make these social media
       | companies stop being radioactive to society. We have probably
       | wasted at least a decade optimizing for hatred. How sad.
       | 
       | EDIT: I was going to say "unintentionally optimizing", however,
       | at some point anyone with one bit of intelligence could
       | understand this was trending in the wrong direction. Not making
       | modifications to the recommendations algorithms to reduce the
       | occurrence of deep dives into dark holes of hatred is a form of
       | intentionally promoting such results. Again, sad.
        
       | specialist wrote:
       | Imagine if Facebook, Twitter, YouTube, etc were run like MMORPGs.
       | Imagine them proactively mitigating griefing, bots, brigading,
       | etc.
       | 
       | John Siracusa has been making this point on Accidental Tech
       | Podcast (http://atp.fm): Zuck's envisioned metaverse would also
       | be a toxic hellscape. Because Zuck is ideologically opposed to
       | moderation.
       | 
        | The difference, of course, is that social media sells
        | advertising, whereas MMORPGs sell experiences.
        
       | kvathupo wrote:
       | This is a topic that came up in the wonderful Maria Ressa's Nobel
       | Peace Prize speech. She argued that an international coalition of
       | governments needs to combat disinformation on social media by
       | saying what information is the "truth", a word that came up often
       | in her speech.
       | 
       | I don't think this will work at all. Ignoring the implications of
       | government interference in private corporations, do we trust
       | governments to be arbiters of truth?
       | 
       | I certainly don't, and I'm sure John Locke would have agreed. In
       | the case of the US, the government itself was the source of much
       | skepticism concerning COVID, and masks!
        
         | quantumwannabe wrote:
         | Maria Ressa herself doesn't trust (some) governments to be
         | arbiters of truth. She was found guilty of spreading
         | disinformation by the Philippine government. She claims that
         | she was telling the truth, and the government is corrupt and
         | lying and attacking the free press, but authoritative sources
         | in the Philippines have stated that she is mistaken. The
         | Philippine government was clearly combating disinformation from
         | online foreign-backed conspiracy theorists.
        
         | 908B64B197 wrote:
         | > I don't think this will work at all. Ignoring the
         | implications of government interference in private
         | corporations, do we trust governments to be arbiters of truth?
         | 
          | Excluding the obvious bad-faith actors (pretty much all on some
          | sanctions list), some countries will dance around the truth and
          | bend over backward to avoid offending other countries [0] [1]
          | [2] [3]
         | 
         | [0] https://www.foxnews.com/media/who-china-taiwan-interview
         | 
         | [1] https://www.theguardian.com/world/2020/mar/30/senior-who-
         | adv...
         | 
         | [2] https://www.dailymail.co.uk/news/article-8172031/WHO-
         | officia...
         | 
         | [3] https://www.bbc.com/news/world-asia-52088167
        
         | ahartmetz wrote:
         | > do we trust governments to be arbiters of truth?
         | 
         | You already do, might as well accept it. The government funds
         | science, makes some decisions about education, and makes laws
         | based on what it thinks is true. Some of it is always wrong,
          | but it's not as wrong as idiots on Facebook. And the government
         | is something that is (somewhat) controlled by the people, more
         | so than Facebook.
        
       | beardyw wrote:
       | This headline is wrong. He did not say _society_ is to blame, he
       | said individual people are.
       | 
       | Margaret Thatcher said "there is no such thing as society". By
       | that she meant we have no collective consciousness, we are just
       | individuals. If that is the case then we would have no desire to
       | protect people we don't know. Spreading misinformation is not an
       | issue. Most legislation is unnecessary.
       | 
       | If you do believe in society, you are part of a larger organism
       | and you aim to protect others.
       | 
       | It just depends where you are on this.
        
       | Apocryphon wrote:
       | And yet they participate in society.
        
         | commandlinefan wrote:
         | I worry that's where this is all going: "We've prohibited
         | people with problematic opinions from participating in social
         | media, but after some careful reflection, we've realized that
         | they're still alive even so..."
        
       | Maursault wrote:
       | Oblig. Repo Man quote:
       | 
       | Bud: _I know a life of crime has led me to this sorry fate, and
       | yet, I blame society. Society made me what I am._
       | 
        | Otto: _That's bullshit. You're a white suburban punk just like
        | me._
        
       | Bendy wrote:
       | "The NRA say that 'guns don't kill people, people kill people'. I
       | think the gun helps. Just standing and shouting BANG! That's not
       | gonna kill too many people." - Eddie Izzard, Dress to Kill (1999)
        
         | dandotway wrote:
         | Homo Sapiens have been murdering each other on this planet for
         | at least 10,000 to 100,000 years, and have only used burning-
         | powder projectile launch tubes for roughly the past 1000 years.
         | (Poison darts launched from a blowgun are a more ancient form
         | of killing tube.)
         | 
         | When convenient projectile launching killing tubes aren't
         | available, Homo Sapiens will rapidly revert to 10,000+ year old
         | murder methods, and thus a husband inflamed with murder-rage
         | who just learned his wife's ovaries have been voluntarily
         | fertilized by another man's semen will not infrequently use
         | punches or a nearby blunt object (hammer or rock) to fracture
         | her skull and destroy her brain function, or use his hands to
         | crush her windpipe, or bleed her out with a knife. This has
         | been happening essentially every year for at least the past
         | 10,000 years. If his wife had been armed with a handheld
         | projectile launching killing tube she could have defended
         | herself, but women frequently don't carry projectile tubes and
         | frequently vote for restricting access to projectile tubes,
         | because projectile tubes are loud and scary and make them feel
         | unsafe.
        
           | teawrecks wrote:
           | Why use many word when few word do trick?
        
             | dandotway wrote:
             | Many word have trick. Make see different than few word.
        
           | mcguire wrote:
           | Homo Sapiens have been killing each other on this planet for
           | at least 10,000 years, and only used nuclear weapons for
           | roughly the last 75 years.... If his wife had been armed with
           | an M-28 Davy Crockett firing an M-388 round using a W54 Mod 2
           | with a yield of 84GJ, she could easily have deterred him, but
           | she would not be legally allowed to carry military weapons.
           | 
           | Maintain the purity of your bodily fluids, friend!
        
         | adolph wrote:
         | Izzard is from England which has a high degree of gun control.
         | The murder rate doesn't seem to be strongly correlated to
         | regulation, however. This lends less credence to Izzard's
         | conjecture. Maybe people who may murder will use whatever tool
         | is available or aren't concerned about breaking gun laws?
         | 
         | https://www.macrotrends.net/countries/GBR/united-kingdom/mur...
         | 
         | https://en.wikipedia.org/wiki/Firearms_regulation_in_the_Uni...
        
           | root_axis wrote:
           | The murder rate seems a lot lower than the US. Also the
           | "murderers aren't concerned about gun laws" line is specious;
           | a criminal isn't concerned about laws _by definition_ - that
            | doesn't invalidate the reasoning behind passing a law.
        
             | hui-zheng wrote:
              | Come to China. We have no guns, and we have one of the
              | lowest murder rates in the world.
              | 
              | We welcome you to immigrate here, where it is illegal to
              | possess any weapon other than kitchen knives, and you are
              | fully protected.
        
               | adolph wrote:
               | Doesn't seem like China is rolling out the welcome mat
               | for many folks.
               | 
               |  _In 2016, China issued 1,576 permanent residency cards.
               | This was more than double what it had issued the previous
               | year, but still roughly 750 times lower than the United
               | States' 1.2 million._
               | 
               | https://en.wikipedia.org/wiki/Immigration_to_China
        
               | 3pt14159 wrote:
               | Well, it's partially because police salaries are based on
               | the rate of solved cases, no? Surely there is
               | underreporting. And the homicide rate in Canada (which
               | includes unintentional manslaughter) isn't much higher
               | than the reported rate in China and we're fairly well
               | armed by international standards:
               | 
               | https://en.wikipedia.org/wiki/Estimated_number_of_civilia
               | n_g...
        
               | mrtranscendence wrote:
                | Part of my family's Muslim. I'll take my chances with the
               | US and its guns, thanks.
        
               | root_axis wrote:
               | I would not want to live in China due to the government's
               | authoritarian tendencies.
        
           | Bendy wrote:
           | So long as it's mightier than the sword.
        
           | teawrecks wrote:
            | Your link shows the US has 4x the homicide rate of the UK.
           | Meanwhile, Japan has incredibly restrictive weapon laws
           | (including knives and swords), and they have a fraction of
           | the rate of everyone else.
           | 
           | And before you point to Switzerland, I am all for gun
           | ownership by certified, well trained, responsible citizens.
           | But the US doesn't have that. Either decrease access to guns,
           | or enact 2 years of compulsory military service where you are
           | trained to respect your weapon and know precisely when and
           | how to responsibly use it AND store it. If you do neither,
           | you get the US.
           | 
           | And in either case, we need to improve the mental wellbeing
           | of everyone in the US by giving more people access to "free"
           | healthcare and not stigmatizing mental health.
        
             | adolph wrote:
             | > Your link shows the US has 4x the homicide rate as the
             | UK.
             | 
             | That isn't germane. If the rate of something fluctuates
             | without connection to the action taken to change the rate,
             | then the action isn't effective or is confounded by other
             | more significant factors.
             | 
             | Izzard's implicit conjecture is that if guns are not
             | present, then murder is less likely (please let me know if
             | I am misunderstanding it). Izzard's country of primary
             | domicile is England which has a recent history of making
             | guns less available. However, the murder rate in England
             | appears to increase and decrease without regard to the
             | timing of key legislation. Since a murder may or may not be
             | performed with a gun, if murders do not decrease in the
             | absence of guns then it follows that a gun may be a
             | particular method but is not necessary for the commission
             | of a murder.
             | 
             | I suppose a counterfactual could be asserted: the murder
             | rate would have been even higher without England's gun
             | laws. I suppose that is possible and would be plausible
             | with more information about the natural variation in murder
              | rates of various methods. Maybe something like narwhal tusks
             | aren't a replacement for guns but have their own natural
             | rate of usage in murders.
        
             | mrtranscendence wrote:
             | Some people seem to be blind to the fact that access to a
             | firearm lowers the cost of killing (by making it easier to
             | do so); and what lowers the cost of something will
             | encourage that behavior at the margins. But Switzerland!
             | Sure, they've managed to thread that needle through
             | education and regulation. But just relaxing gun laws
             | without counteracting that in some way will of course
              | increase homicides (and suicides, similarly). The US is a
              | case in point.
        
       | lexapro wrote:
       | Society allowed Facebook to emerge and grow into what it is now,
       | so I can't say I disagree.
        
       | rdiddly wrote:
       | Calling it "blaming society" makes this pretty funny, like when
       | Manson did it and the kid in the Suicidal Tendencies song does it
       | when his mom won't bring him a Pepsi.
       | 
       | Facebook's self-serving algorithms are of course a scourge in
       | this area, but he does have a point. Part of why the messaging on
       | COVID has been so fucked is this very thing: spinning
       | or tweaking the truth. Facebook does it, to increase engagement,
       | but public officials and others also do it. People who should've
       | just told the simple truth, instead tried to gauge our response
       | to it, and spun and tweaked the truth in an effort to "game" the
       | response. Just tell the truth. Because you're probably
       | underestimating the general public, as usual, and will ultimately
       | end up increasing the danger and impact, by two mechanisms: 1)
       | people have incomplete or incorrect or insufficient information
       | to act on, and/or 2) certain people (who are adults and can tell
       | when someone is dissembling, or communicating manipulatively,
       | a.k.a. propagandizing) start to distrust the "official story,"
       | and the cumulative effect is that they go looking for "the real
       | truth" in all kinds of wacky out-of-the-way places and get all
       | conspiracy-minded, and the Facebooks of the world pick up on this
       | and amplify it in their feeds. You want to combat this? Give them
       | an authoritative, trustworthy source. Tell them the whole,
       | unvarnished truth. Gauging the response, communicating to achieve
       | a goal, well that's not informing, that's either sales or
       | propaganda. You want to combat disinformation, start with
       | _information_ - all of it, without spin, without censorship.
       | 
       | Seemingly every disaster movie has a character who refuses to
       | sound a warning because they don't want to start a panic, but
       | then they ultimately cause greater loss of life or whatnot. That
       | character is always a villain. We hate them precisely because
       | their communication, or lack thereof, has an agenda that
       | underestimates us and ultimately ends up costing us.
        
       | Grismar wrote:
       | Drug dealer blames addicts for substance abuse. No shit Sherlock,
       | but you're not accused of coming up with the filth or consuming
       | it - you're accused of being a primary trafficker. In fact, the
       | comparison with a dealer is mild, as Facebook is to a typical
       | drug dealer what Walmart is to your local convenience store.
       | Escobar has nothing on Zuckerberg.
        
       | Booktrope wrote:
       | Not FB's responsibility if they set up and tuned an information
       | spreading system that promotes stuff that's inflammatory over
       | stuff that's informative? Users of FB only see what FB feeds to
       | them, and that's all about how FB aligns the user's activities
       | and characteristics to the content FB is supplying. What a total
       | cop out to say, the problem is what people say, when FB plays
       | such a crucial role in what people see. Before FB (yes and other
       | social media) amped this stuff up, a village idiot standing on a
       | corner shouting conspiracy theories got very little attention.
       | But on FB this kind of stuff feeds engagement, and we know how
       | important engagement is.
       | 
       | Yet the guy who's slated to be FB's CTO says, don't put all this
       | inflammatory stuff on me! Freedom of speech you know and just let
       | us do our job of promoting engagement and building ever more
       | effective ad targeting technology!
        
         | s1artibartfast wrote:
         | FB shows people what they want to see. They provided the public
         | a channel to tune into what anyone has to say and it turned out
         | that people sought out the village idiot.
         | 
         | Now the public says show me what I want, but prevent everyone
         | else from seeing what they want.
        
         | jensensbutton wrote:
         | > an information spreading system that promotes stuff that's
         | inflammatory over stuff that's informative
         | 
         | The problem is who's to say which is which.
        
       | throwaway47292 wrote:
       | > "Individual humans are the ones who choose to believe or not
       | believe a thing. They are the ones who choose to share or not
       | share a thing,"
       | 
       | This is just bullshit.
       | 
       | You can say the same thing about extracting a confession under
       | torture... the individual "decided" to tell the truth, so why
       | shouldn't we admit it into evidence?
       | 
       | These megastructures are controlled by billion-parameter
       | models, and then some human goes and does an interview and
       | says 'yeah, no problem, it's your choice'. It is naive and
       | arrogant, most of all, for anyone to even pretend they
       | understand how the model is impacting the social structure.
       | 
       | As Jacques Ellul says, the strongest unit against propaganda is
       | the family, or small groups of individuals, as they pull each
       | other to the center of the group, but imagine now each individual
       | is exposed to unique personalized propaganda, so the group
       | constantly diverges. I imagine it like a group of friends
       | holding hands in a pool, but then there is a giant influx of
       | water between them, so there is a constant force pulling
       | them apart; they have to work to keep their relationships
       | strong, and the force increases over time.
       | 
       | It is the people that seek propaganda, not the other way
       | around. Now, however, the algorithm satisfies that search in
       | the most satisfying way possible (within the limits of
       | current technology).
        
       | chris_wot wrote:
       | He says that even if he spends every dollar on removing fake
       | and misleading content, it won't fix the issue.
       | 
       | More and more people like myself are just... deleting Facebook.
       | Not deleting their account. Just deleting it from their phone,
       | and never bothering to log in from anywhere.
       | 
       | I urge people to try it. Don't bother jumping through the hoops
       | Facebook give you for deleting it. Just delete it from your
       | phone. You'll spend a day trying to check it, realise it isn't
       | there, then you'll completely forget about it. And you won't miss
       | it.
        
       | arduinomancer wrote:
       | I feel like a lot of people think "the internet" is toxic
       | 
       | When in reality maybe humans are just bad and the internet just
       | exposed it
       | 
       | Before the internet there was just no way to see it
        
         | commandlinefan wrote:
         | > no way to see it
         | 
         | Sure there was. All the pearl-clutchers who are telling us that
         | Facebook has doomed society were saying the same thing about TV
         | a few decades ago. Remember Morton Downey Jr? Jerry Springer?
         | Sally Jesse Raphael? They were saying that about music.
         | Remember 2 Live Crew? People burned Beatles records for fear
         | that this "devil's music" would destroy civilization. They were
         | saying that about comic books. They were saying that about
         | Dungeons and Dragons. I've spent my entire near half-century of
         | life hearing about how everything I enjoy is going to destroy
         | society.
        
           | jjkaczor wrote:
           | Both can be true.
           | 
           | Facebook can be both a problem - and a boon. It certainly
           | helps people find other people who share the same
           | "pearl/rifle" clutching mentality - and amplifies them into
            | echo chambers.
        
       | uptownfunk wrote:
       | You're all still on FB? Dropped out and haven't looked back.
       | 
       | Youtube on the other hand...
        
         | mrtranscendence wrote:
         | I'm still on Facebook, but I don't engage with political
         | content and I unfriend acquaintances that post stupid crap.
         | Aside from posts by close family my feed is 90% cute animal
         | photos and videos, along with a wee bit of PC building content.
        
       | wodenokoto wrote:
       | "We are only _bringing_ the worst of people out, it was already
       | in there"
        
         | nowherebeen wrote:
         | This is such an understatement.
        
       | lr4444lr wrote:
       | I understand that corp execs are gonna corp exec, but I gotta
       | admit I am still unclear on a fundamental level why social
       | media is any more blameworthy for misinformation than
       | broadcast media. FB never made you any promises that what
       | you read on it contains any truth whatsoever.
        
         | bitexploder wrote:
         | I think it's mostly that they are incentivized to let some
         | voices be amplified on their platform. The radical right is
         | good for business so they let disinformation fester to make a
         | few extra dollars.
        
           | lr4444lr wrote:
           | I'm sorry, but I don't see how this is any different than
           | networks being concerned about the ratings of each show, or
           | Hollywood about projected return on budget when creating
           | projects.
        
         | newfonewhodis wrote:
         | If you have some time, I very highly recommend reading An Ugly
         | Truth (https://www.indiebound.org/book/9780062960672). That
         | book gave me a new realization - Facebook (and other companies)
         | are not victims of their surroundings. There are some very
         | intentional, very impactful decisions that are made at the
         | highest level.
        
       | LNSY wrote:
       | Alas, poor Mark Zuckerberg, a victim of society.
        
         | LNSY wrote:
         | Also: Facebook could go a long way toward countering COVID
         | disinformation by removing the billions of bots and fake
         | accounts that infest its service. But if they did that we would
         | find out that most of their market share is, in fact, fake. And
         | then they wouldn't be worth nearly as much money as they say
         | they are.
        
       | ricardoplouis wrote:
       | Mental health has been an issue for as long as we've known, but
       | Facebook does have a curious way of amplifying societal problems
       | such as this and making it worse.
       | 
       | https://www.wsj.com/articles/facebook-knows-instagram-is-tox...
        
       | mbesto wrote:
       | He's right in some sense, but the context is important. The
       | problem is that Facebook is giving the village idiot a megaphone.
       | Facebook can't say:
       | 
       | - Amplify your commercial business message to billions of people
       | worldwide.
       | 
       | AND at the same time
       | 
       | - Well it's your individual choice whether or not to listen to the
       | village idiot.
       | 
       | You guys gave them a megaphone, how do you expect society to
       | behave?!
        
         | baq wrote:
         | it isn't the village idiot, it's the insidious manipulator that
         | influences village idiots at industrial scale now.
        
         | api wrote:
         | It's much worse than giving the village idiot a megaphone.
         | Facebook (and most other socials) prioritize content to
         | maximize engagement, and (big surprise) the village idiot
         | maximizes engagement. Facebook is a machine tuned specifically
         | to spread hate and bad ideas because that's what maximizes the
         | time people spend on Facebook.
         | 
         | I thought of a good analogy a while back. Let's say someone
         | walks past you and says "hi" and smiles. Let's say someone else
         | then walks past you and punches you in the face. Which
         | interaction maximizes engagement? Well that's the interaction
         | and content that social media is going to amplify.
         | 
         | Social media companies are the tobacco companies of technology.
         | They make billions by lobotomizing the body politic.
        
           | freediver wrote:
           | > Let's say someone walks past you and says "hi" and smiles.
           | Let's say someone else then walks past you and punches you in
           | the face. Which interaction maximizes engagement?
           | 
           | Likely the first one. Could also lead to a literal
           | 'engagement'.
        
             | IntrepidWorm wrote:
             | Not on a busy street corner. A fistfight tends to attract
             | much more attention than a passing greeting. Facebook is a
             | very busy street corner.
        
             | whimsicalism wrote:
             | If you're optimizing for time spent in the interaction,
             | which is what FB does - then probably the punch in the face
             | will cause you to stay on the scene for longer, whether to
             | yell at the person, fight back, when you call the cops,
             | etc.
             | 
             | The "hi" is returned with a "hi" back and you both continue
             | walking.
        
               | freediver wrote:
               | The likely scenario is that the interaction stops the
               | moment you are floored by being punched in the face. You
               | may stay on the scene longer, but the perpetrator would
               | likely take off.
               | 
               | Also, answering a hi and a smile with a hi and a smile
               | _could_ in fact be the right way to optimize for time
               | spent in this interaction because it has low probability
               | but very high impact outcomes as far as time spent goes
               | (dating/marriage).
        
           | ricardobayes wrote:
           | Supposedly they were working on detuning this effect.
        
           | soyiuz wrote:
           | True, but the financial incentives of many (most?) companies
           | don't align with public benefits (see pollution, plastics,
           | diet etc.) Why is social media singled out in our demands for
           | them to act morally good, instead of just profitable?
        
             | JumpCrisscross wrote:
             | > _Why is social media singled out in our demands for them
             | to act morally good, instead of just profitable?_
             | 
             | It's not. We're so used to being totally unregulated in
             | tech that even minor oversight can be spun as burdensome.
        
             | alisonkisk wrote:
             | They aren't singled out.
        
         | canistr wrote:
         | Twitter and Youtube, sure.
         | 
         | But the blast radius of a Facebook post doesn't have the same
         | reach given the majority of posts go to your explicit network
         | of connections. Unless you're specifically referring to
         | Facebook Groups? But then are we certain it's different from
         | Reddit or other forums?
        
           | itake wrote:
           | Facebook Groups and Pages create ways for people to share
           | content, triggering exponential growth (e.g. user shares meme
           | to their page so that their friends see it. Their friends
           | choose to re-share. wash. rinse. repeat.)
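           | 
           | (Toy numbers to make "exponential" concrete: if each share
           | reaches ~100 friends and even 2% of them re-share, every hop
           | roughly doubles the audience: 100, 200, 400, 800...)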
        
         | [deleted]
        
         | freediver wrote:
         | So should there be a special tax on "megaphones" like Twitter,
         | Facebook or YouTube? What exactly is the legal framework under
           | which these companies could be scrutinized? Normally the
         | manufacturer of megaphones does not get sued when a person uses
         | it to promote hatred on a village square.
        
           | varelse wrote:
           | Well since SCOTUS has ruled that it's okay for private
           | citizens to sue over abortion, just make it okay for everyone
             | to sue Facebook at scale regardless of the
           | constitutionality of doing so. There's a lot they can do
           | about this content and they choose not to do so because it
           | costs money. Always follow the money.
        
             | adolph wrote:
             | The above is SCOTUS misinformation.
             | 
             |  _WASHINGTON -- The Supreme Court ruled on Friday that
             | abortion providers in Texas can challenge a state law
             | banning most abortions after six weeks, allowing them to
             | sue at least some state officials in federal court despite
             | the procedural hurdles imposed by the law's unusual
             | structure._
             | 
             |  _But the Supreme Court refused to block the law in the
             | meantime, saying that lower courts should consider the
             | matter._
             | 
             | https://www.nytimes.com/2021/12/10/us/politics/texas-
             | abortio...
        
               | varelse wrote:
               | Senate Bill 8 is still in effect. The above is SCOTUS
               | disinformation.
               | 
               | https://www.kvue.com/article/news/politics/texas-this-
               | week/t...
               | 
               | "So, essentially, the Supreme Court left the law in
               | effect. We were expecting to possibly see them limit the
               | enforcement of it because that was the biggest concern
               | that Supreme Court, Supreme Court justices seemed to
               | have. And that enforcement, of course, is allowing
               | private citizens to sue, under the law, anyone who aids
               | and abets an abortion for at least $10,000 damages, if
               | won. And so, that's where it kind of was being targeted
               | today. And the Supreme Court essentially put that back on
               | the U.S. District Court, allowing that lawsuit to resume
               | to determine the constitutionality of the law."
               | 
               | Or TLDR they left it to the states. Can't wait to see how
               | the states run with that concept.
        
               | adolph wrote:
               | "ruled that it's okay" != "left the law in effect"
               | 
               | "U.S. District Court" != "the states"
               | 
               |  _Whatever a state statute may or may not say about a
               | defense, applicable federal constitutional defenses
               | always stand available when properly asserted. See U. S.
               | Const., Art. VI. Many federal constitutional rights are
               | as a practical matter asserted typically as defenses to
               | state-law claims, not in federal pre-enforcement cases
               | like this one._
               | 
               | https://www.supremecourt.gov/opinions/21pdf/21-463_3ebh.p
               | df
        
           | sverhagen wrote:
           | I think the megaphone is thus more of a metaphor than it is
           | an analogy. Or at least, like most analogies, it breaks down
           | under even the lightest pressure. For it to be an analogy,
           | it'd have to be a megaphone manufacturer that also brings the
           | audience together. Maybe Facebook is the megaphone AND the
           | village square AND then some.
        
           | noahtallen wrote:
           | That's what's challenging about this situation. We're
           | experiencing a fairly new problem. It hasn't before been
           | possible for a member of society to communicate with all
           | other members of society at the same time, nor has it been
           | possible for a member of society to get addicted to a curated
           | feed of random (sometimes anonymous) folks spreading their
           | ideas globally.
           | 
           | All of these things seem new to me:
           | 
           | - Global, direct communication with all members of society.
           | 
           | - Addictive design patterns in software.
           | 
           | - AI-curated news feeds based on increasing engagement.
           | 
           | - Anonymous conversations.
           | 
           | Since it's new, society doesn't have frameworks to think
           | about this kind of stuff yet.
        
             | throwaway0a5e wrote:
             | >That's what's challenging about this situation. We're
             | experiencing a fairly new problem. It hasn't before been
             | possible for a member of society to communicate with all
             | other members of society at the same time, nor has it been
             | possible for a member of society to get addicted to a
             | curated feed of random (sometimes anonymous) folks
             | spreading their ideas globally.
             | 
             | This comment could have been taken more or less word for
             | word from the diary of a monk who lived in the 1500s.
             | 
             | We've been through this before.
        
               | kelnos wrote:
               | I think scale matters, though. In the 1500s (through much
               | of the 1900s, even), most people were still mainly
               | exposed to the viewpoints of people and groups who were
               | physically local to them. Your local newspaper and (more
               | recently) local TV news was a product of local attitudes
               | and opinions. Certainly not all of those people were
               | members of your "tribe", but many were, and there were
               | limits as to how far off the beaten path you could go.
               | 
               | If you had some wacky, non-mainstream ideas, you self-
               | moderated, because you knew most of the people around you
               | didn't have those ideas, and you'd suffer social
               | consequences if you kept bringing them up and shouting
               | them from the rooftops. Even if you decided you'd still
               | like to do some rooftop-shouting, your reach was
               | incredibly limited, and most people would just ignore
               | you.
               | 
               | Today you can be exposed to viewpoints from every culture
               | and every walk of life, usually with limited enough
               | context that you'll never get the full picture of what
               | these other people are about. If you have crazy ideas, no
               | matter how crazy, you can find a scattered, distributed
               | group of people who think like you do, and that will
               | teach you that it's ok to believe -- and scream about --
               | things that are false, because other people in the world
               | agree with you. And the dominant media platforms on the
               | internet know that controversy drives page views more
               | than anything else, so they amplify this sort of thing.
        
               | mrtranscendence wrote:
               | In a similar sense to how if you've experienced a low-
               | pressure shower you've experienced Niagara Falls, sure,
               | we've been through this before.
        
           | andrew_ wrote:
           | I don't understand how a targeted tax would help at all here.
        
         | commandlinefan wrote:
         | > the village idiot
         | 
         | One man's terrorist is another man's freedom fighter.
        
         | maerF0x0 wrote:
         | > The problem is that Facebook is giving the village idiot a
         | megaphone
         | 
         | While you're not wrong that it's giving the idiot a megaphone,
         | it's missing the greater picture. It's giving _everyone_ a
         | megaphone. The real question is why can't people discern the
         | difference between the idiot and the non-idiot?
         | 
         | I'd also note that a big issue now is trust -- trust in
         | "elites" (technocrats, wealthy, those in positions of power)
         | has been declining for a long time. I think people are not so
         | much seeking out the village idiot, but massively discounting
         | "experts".
         | 
         | A list of things that come to mind which have broken trust:
         | the 60s saw hippies who wanted to break the norms of their
         | parents/grandparents, the 70s saw the Vietnam war and the
         | end of the gold standard, the 80s "greed is good" and
         | Iran-Contra, the 90s tough-on-crime policies and Y2K fears,
         | the 00s Iraq/Afghanistan, the 9/11 attacks, the governmental
         | data dragnet, Manning/Snowden/Assange, and COVID statements
         | which did not pan out as planned...
         | 
         | People have good reasons to be skeptical of elites, but I think
         | anti-corruption work is more important than trying to silence
         | the idiot.
        
           | onlyrealcuzzo wrote:
           | Facebook (like News) is entertainment. People don't select
           | entertainment for accuracy.
           | 
           | The village idiot (that's successful on Facebook) has self-
           | optimized for being catchy - that's why people are listening.
        
           | vlovich123 wrote:
           | > The real question is why can't people discern the
           | difference between the idiot and the non-idiot?
           | 
           | Societally we solve this through trust organizations.
           | Individually, I have no way to validate the information from
           | every expert/idiot I might come across. So is "connect the
           | frombulator to the octanizer but watch out for the ultra
           | convexication that might form" gibberish or just your
           | ignorance of the terminology in use in that problem domain?
           | Most people don't try to figure out how to navigate each
           | field. Heck, even scientists use shortcuts like "astrology
           | has no scientific basis so it doesn't matter what so-called
           | SMEs in those fields say". So you rely on trust in various
           | organizations and peers to help guide you. These structures
           | can often fail you for various reasons but that's the best
           | we've managed to do. That's why for example "trust the
           | science" is a bad slogan - people aren't really trusting the
           | science. They're trusting what other people (sometimes
           | political leaders) tell them the science is. Add in bad-faith
           | actors exploiting uncertainty and sowing chaos and it's a
           | mess.
           | 
           | Silencing the idiot is fine as long as you're 100% certain
           | you're silencing someone who's wrong and not just someone
           | espousing a countervailing opinion (eg Hinton's deep learning
           | research was poo-pooed by establishment ML for a very long
           | time)
        
           | didibus wrote:
           | > it's missing the greater picture. it's giving _everyone_ a
           | megaphone
           | 
           | I think this can be argued against, because Facebook does
           | recommendation and algorithmic curation.
           | 
           | Even if Facebook didn't purposely tweak things to propagate
           | disinformation, you could say it is easy to manipulate their
           | algorithms to disproportionately push such information.
           | 
           | So for me it's a case of Facebook not doing enough to fight
           | potential abuse on their platform.
           | 
           | There's an element of responsibility here, because we are
           | prone to some material more than others. There are primitive
           | instincts in us, and content designed to take advantage of
           | them is parasitic, and it is manipulative and addictive in
           | that sense.
           | 
           | Crazy theories, appeal to emotions, controversy, nudity, clan
           | affiliation, and all that are ways to take advantage of our
           | psyche.
           | 
           | Even a smart person is only as smart as the data most readily
           | available to them. If the only thing about gender psychology
           | I ever heard about was Jordan Peterson because he's been
           | recommended to me, even if I'm the smartest most reasonable
           | person, this is now the starting point of my understanding
           | and thoughts around gender psychology.
           | 
           | So I think a platform that is optimized to show the
           | information people are most susceptible to, and to target
           | that information at the people most susceptible to it, is by
           | design going to result in the outcomes we're seeing.
        
           | d1sxeyes wrote:
           | That's also missing the greater picture. It's giving
           | _everyone_ a megaphone... but giving the loudest megaphones
           | to the people who can get most people to listen to them.
           | 
           | You'll have noticed on the internet that there's a tendency
           | to prioritise engaging with things you disagree with (hell,
           | half of my HN comments are because I felt motivated to write
           | something to disagree with some OP at some point - even this
           | one).
           | 
           | What that means is the traditional small-c conservative
           | 'village elders', 'parish priests', and 'elected officials',
           | who hold authoritative positions not because they're
           | controversial, but because they historically represented
           | positions of neutrality and consensus end up with quiet
           | megaphones, and the madmen claiming the world is flat and
           | there's a paedophile ring run out of a pizza shop end up with
           | the loudest megaphones.
           | 
           | Half of the population is below average intelligence, and
           | giving the wrong people the loudest megaphones has a
           | devastating effect on society.
        
             | maerF0x0 wrote:
             | > You'll have noticed on the internet that there's a
             | tendency to prioritise engaging with things you disagree
             | with
             | 
             | I wonder if you're aware of any known effect/research on
             | this topic? I'm open to learning more.
        
             | dstroot wrote:
             | Was going to say exactly this. Reasonable people with
             | reasonable views have no reason to promote themselves or
             | their views on Facebook. However non-reasonable people with
             | non-reasonable views promote heavily for clicks/engagement
             | to sell you something, or just to "idiot farm" to sell the
             | idiots something later.
             | 
             | Facebook's unregulated revenue model will keep ensuring
             | this dynamic.
        
             | Aunche wrote:
             | > You'll have noticed on the internet that there's a
             | tendency to prioritise engaging with things you disagree
             | with (hell, half of my HN comments are because I felt
             | motivated to write something to disagree with some OP at
             | some point - even this one).
             | 
             | It tends to be that platforms with more disagreement have
             | healthier discourse. I think that actually the opposite is
             | more harmful. Echo chambers allow for extremist ideas to
             | grow and encourage hostility towards those that don't go
             | along with the echo chamber.
        
             | animal_spirits wrote:
             | Yeah I think this is the crux of it. Facebook is prioritizing
             | the most "engagement" which means prioritizing the most
             | "reactions" which means prioritizing the most divisive and
             | enraging content on both sides. If Facebook instead
             | prioritized the "ah that's nice" kind of content we
             | wouldn't see the divisiveness we see today
        
               | Jcowell wrote:
               | I feel this ignores the human tendency to engage in
               | conflict. Or in other words, humans like to debate so we
               | will.
        
               | animal_spirits wrote:
               | Yeah but it is also human nature to fuck as much as
               | possible but we have rules and laws against things like
               | rape to control those tendencies. Just because we are
               | naturally inclined to do something does not necessarily
               | mean that it is best for us
        
               | gretch wrote:
               | What? This is certainly [citation needed]
               | 
               | I highly disagree that it's human nature to 'fuck as much
               | as possible'.
               | 
               | Certainly it is the goal of some humans. I can speak from
               | personal experience that my nature isn't just to 'fuck as
               | much as possible'. And neither is it for most people I know.
               | And the thing that's stopping us is not just anti-rape laws?
               | 
               | Fucking is great, but if you have a family and young
               | kids, you care a lot about taking care of your family and
               | not just going to the club and fucking more people.
        
               | satellite2 wrote:
               | It's more interesting to debate subjects and preferences
               | where there is not already an overwhelming amount of
               | evidence that one side is wrong.
        
               | philipkglass wrote:
               | It's also common for people to accept defaults without
               | substantial customization. That's why the algorithms
               | matter. Some people will deliberately seek out rage-bait
               | even if the default algorithm delivers just-the-facts
               | news and heartwarming pictures from friends' families.
               | Most won't. Also, most people won't customize their
               | settings to eliminate rage-bait if _that 's_ what gets
               | prioritized by algorithmic defaults.
        
               | xcjs wrote:
               | It also doesn't matter how much you customize your
               | settings if they're inherently useless, minimally
               | functional, or never there in the first place. A lot of
               | the content control settings that Facebook loves to tout
               | are practically useless.
               | 
               | Sure, I can hide every post from the "Controversial News"
               | page, but I can't stop viewing content from third parties
               | entirely. I'm only interested in first-party content -
               | what my contacts create. Unfortunately that goes against
               | the monetization model of Facebook.
               | 
               | I want a more closed loop social network and think that's
               | the model we should return to, but unfortunately that's
               | not where the profit/engagement is.
        
             | foogazi wrote:
             | > Half of the population is below average intelligence
             | 
             | This is the real problem
        
               | amf12 wrote:
               | Half of the population will _always_ have below average
               | intelligence.
        
               | d2z-NdEa wrote:
               | This is incorrect. At least half the population will
               | always have _median or below_ intelligence.
        
               | rightbyte wrote:
               | If we are referring to IQ, the median and average are
               | supposed to be the same by definition.
        
               | waltbosz wrote:
               | I disagree. Half the population will _always_ have above
               | average intelligence.
        
               | xupybd wrote:
               | But the majority is somewhere near center.
        
             | yololol wrote:
             | > Half of the population is below average intelligence
             | 
             | By definition :)
        
               | MaxBarraclough wrote:
               | Only if the distribution is symmetric. [0]
               | 
               | Extra pedantic mode: if the population is an odd number,
               | the number of below average intelligence will not equal
               | the number of above average intelligence.
               | 
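               | A toy example with made-up numbers: in the population
               | {1, 1, 1, 1, 96}, the mean is 20, so four of the five
               | people are below average while only one is above it.
               | 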
               | [0] https://en.wikipedia.org/wiki/Symmetric_probability_d
               | istribu...
        
               | d1sxeyes wrote:
               | >if the population is an odd number, the number of below
               | average intelligence will not equal the number of above
               | average intelligence
               | 
               | I'll leave counting as an exercise for the reader :D
        
             | takeda wrote:
             | It's not everyone though. The FB algorithm gives preference
             | to the most controversial people. People that are reasonable
             | are boring and don't cause engagement, so their posts are
             | not displayed either.
             | 
             | When you stay away from the site, FB will start bombarding
             | you with messages from the people you would most likely
             | react to, because they want you back.
        
               | endymi0n wrote:
               | ...and not just "the most controversial", in a lot of
               | cases on both the content creators' as well as the
               | sockpuppets' sides, we're not even talking about "village
               | idiot" type real humans anymore, but -- in contrast to
               | Meta CTO's words -- about extremely skillful mass
               | manipulators sitting somewhere in Russia and hiding
               | behind international proxies.
               | 
               | Not only was that tolerated in the name of profit, these
               | individuals were able to create official-looking,
               | completely unverified "pages" with bogus attribution to run
               | their "engagement" campaigns meant to poison and destroy
               | Western society (and arguably succeeding at that).
               | 
               | How this is legally any different from complicity in
               | treason is hard for me to comprehend.
        
             | hui-zheng wrote:
             | Those are all good points; I agree. I think it's not that
             | Facebook gives the wrong people the loudest megaphones, but
             | that our human nature and the nature of the population draw
             | us to the megaphones held by the wrong people.
             | 
             | What could we do about this? How could we identify the
             | wrong people so that we could take away the megaphone from
             | them? Who decides which people are wrong? Some of them are
             | obvious, but some of them are not.
             | 
             | Maybe we could say madmen claiming the world is flat and
             | there's a paedophile ring run out of a pizza shop are
             | obviously wrong. We might know Nazis are obviously wrong,
             | but what about Antifa, what about "woke"? What about all
             | those theories behind "group identity"? The most dangerous
             | wrong people are the ones who hold "good intentions" (they
             | could be self-deceiving or could be truly genuine) but bad
             | ideas, and it's hard to discern.
             | 
             | History repeats itself. I suggest reading the history of
             | China from 1930-1950, the rise of Communist China, and
             | then the "Cultural Revolution" in the 1970s. You will find
             | how people with "good intentions" ended up being among the
             | most evil in history.
             | 
             | How could we avoid that happening here? I don't have an
             | answer.
        
             | lubesGordi wrote:
             | This is what I'm seeing also. Youtube, Facebook, etc. all
             | prioritize engagement. It's not only the megaphone problem,
             | it's a quicksand problem. As soon as you watch some
             | misinformation to even try to understand what the hell
             | anti-vaxxers are claiming, then you get a ton of related
             | misinformation promoted to your homepage. How the hell are
             | technically ignorant people supposed to keep up with this?
             | Youtube and Facebook will lump you in a category and show
             | you what similar viewers watched.
        
             | nradov wrote:
             | Intelligence is a tool that can be used for good or evil.
             | Vladimir Lenin and Mao Zedong were quite intelligent by any
             | objective measure, but giving them megaphones resulted in
             | horrific disasters far worse than anything caused by
             | Facebook users so far.
        
             | AnIdiotOnTheNet wrote:
             | > Half of the population is below average intelligence
             | 
             | Maybe the real problem is that everyone assumes they're in
             | the other half. Or possibly that intelligence and wisdom
             | are the same thing.
             | 
             | I guess what I'm saying is that I would generally agree
             | with your post if it weren't for this statement. I don't
             | think intelligence really has anything to do with the
             | problem as even a lot of otherwise 'intelligent' people
             | have engaged with today's bullshit conspiracy theories and
             | nonsense.
        
           | jonny_eh wrote:
           | > why can't people discern the difference between the idiot
           | and the non-idiot?
           | 
           | Because it's not idiots that are the problem, it's bad-faith
           | actors, and they're very good at manipulating people. In the
           | past they'd have to do that 1:1, now they can do it at scale.
        
             | hui-zheng wrote:
             | I would say the most dangerous are not the bad-faith
             | actors, but those self-deceived ones who have genuine "good
             | intention" but act out the "bad" consequence.
             | 
             | And every one of us could be among those self-deceived ones,
             | including you and me.
        
             | pier25 wrote:
             | It's true there are bad-faith actors, but there are
             | definitely lots of idiots who don't know they are wrong.
        
               | nradov wrote:
               | Is it possible that you're also a wrong idiot but are
               | unaware of it?
        
             | xedrac wrote:
             | I've been in several debates where I was written off as
             | being in "bad faith", when in reality I just didn't agree
             | with the popular opinion on a particular subject. It seems
             | people are all too eager to justify their own position by
             | labeling others as being in "bad faith".
        
               | kelnos wrote:
               | Except in this case we know that there have been many
               | organizations, FB pages, and fake individual FB accounts
               | set up specifically to spread misinformation and FUD
               | about COVID and vaccines. That's the definition of bad
               | faith.
               | 
               | Certainly, from there, real regular people pass on and
               | help spread this misinformation. Hard to say how many of
               | those people are also acting in bad faith or have just
               | been manipulated and scared into believing the bad
               | information. But it seems certain that the source of much
               | of this garbage is bad-faith actors.
        
               | xupybd wrote:
               | Where can I read more on these organisers of
               | misinformation?
        
               | r721 wrote:
               | >Just twelve anti-vaxxers are responsible for almost two-
               | thirds of anti-vaccine content circulating on social
               | media platforms. This new analysis of content posted or
               | shared to social media over 812,000 times between
               | February and March uncovers how a tiny group of
               | determined anti-vaxxers is responsible for a tidal wave
               | of disinformation - and shows how platforms can fix it by
               | enforcing their standards.
               | 
               | https://www.counterhate.com/disinformationdozen
               | 
               | News story:
               | https://www.npr.org/2021/05/13/996570855/disinformation-
               | doze...
        
               | lubesGordi wrote:
               | If it's 'bad-faith' actors, then you're saying that the
               | misinformation is intentional, which makes it
               | disinformation.
        
             | brighton36 wrote:
             | I would also suggest that bad faith is contagious. Once bad
             | faith enters, the incentives are such that everyone acts in
             | bad faith in short order. ('Well if he's lying, then I'm
             | forced to lie too' becomes easy to justify.)
        
             | treis wrote:
             | They've been able to do that since at least radio and
             | arguably since the printing press. At worst, Facebook is an
             | evolutionary step along that spectrum.
        
               | jonny_eh wrote:
               | Radio and newspapers were effectively local, we now have
               | global reach so that it takes just a few to mess things
               | up everywhere.
        
               | stilist wrote:
               | There's effectively no cost to do it now, though.
        
             | rumblerock wrote:
             | Given enough capability and knowledge of manipulation
             | methods bad actors can not only shape conversations and
             | promote controversial / conspiratorial information, but
             | also fan the flames of the backlash that make collective
             | reason impossible. So long as there are holes in these
             | platforms and the platforms take a combative stance toward
             | resolving them, this power can be had or hired. You don't
             | need nation-state resources to pull it off at this point.
             | 
             | It's pretty widely acknowledged that what happens or begins
             | on social media is now shaping the behavior of politicians
             | and the narratives of legacy media. So if you successfully
             | seed something on social media you get to enjoy the ripple
             | effects through the rest of society and the media. If I
             | have enough resources and motivation, I'm fine with that
             | success rate being 1%, even 0.1% if it gets significant
             | traction. And once it's out there, the imprint on those
             | exposed really can't be undone by a weakly issued
             | correction that never gets the reach of the original false
             | information.
        
           | kelnos wrote:
           | > _it 's giving _everyone_ a megaphone._
           | 
           | Are they, though? It seems like FB amplifies things that they
           | think will generate more engagement and "stickiness".
           | Sensational things that cause outrage tend to do that more
           | than cold, hard facts. I would not at all be surprised if
           | misinformation gets amplified orders of magnitude more than
           | the truth.
        
             | mavhc wrote:
             | They amplify things that cause you to see more ads, because
             | you picked a free platform
        
           | topkai22 wrote:
           | Facebook is not just a hosting platform, through the Facebook
           | feed it exercises a great deal of editorial control over what
           | posts/information is surfaced to users. So while Facebook
           | might be giving everyone a megaphone, it doesn't turn
           | everyone up to the same volume. It needs to own that.
        
           | downWidOutaFite wrote:
           | Erosion of trust in elites just so happens to also be a long-
           | term goal of polluters, quacks, scammers, and other powerful
           | parasites of the common wealth when they run up against
           | government or science.
        
           | blablabla123 wrote:
           | I think in general Facebook has a bias towards inflammatory
           | posts - and other platforms for that matter as well including
           | HN actually. Also it's easy to blame the village idiot for
           | everything, but I don't think Donald Trump or Alex Jones are
           | village idiots. They are surely idiots but left the village
           | quite some time ago and gained popularity before Facebook
           | (InfoWars was founded 1999) - although FB surely was an
           | accelerator stage.
           | 
           | That said, the village idiot is harmless and I think
           | aristocracy (rule of the elite) is definitely not the
           | solution. But what is true is that the normal filters in an
           | offline community haven't been translated online yet.
        
           | matwood wrote:
           | > The real question is why can't people discern the
           | difference between the idiot and the non-idiot?
           | 
           | And here is the real problem with FB, the algorithmic feed.
           | Normal life is pretty boring day-to-day, and doesn't trigger
           | 'engagement'. Conspiracies, etc... cause an enormous amount
           | of engagement. When a person is fed conspiracies all day by
           | the engagement algorithm, even the most critical thinkers
           | will start to drift. It works for the same reason advertising
           | works, familiarity and repetition. The solution is never use
           | FB, but that ship has sailed for most.
        
             | bluGill wrote:
             | I have had decent luck in reverting facebook back to what
             | it is for: sharing pictures of my kids with people who care
             | to see pictures of my kids. That means every time someone
             | shares something - no matter how funny - I block the place
             | it was shared from permanently. Slowly facebook is running
             | out of things to show me that aren't pictures of my friends'
             | kids.
        
               | bryan_w wrote:
               | Same, it didn't really take as much as I thought it would
               | to get it to stop showing me reshares, but now I see
               | either groups content or family/friends' original content.
               | 
               | I still wish they had fine-grained controls though.
        
               | mavhc wrote:
               | If only they could advertise your kids to you.
               | 
               | "I picked a free service and was annoyed they wanted to
               | make money somehow"
        
         | burnte wrote:
         | > You guys gave them a megaphone, how do you expect society to
         | behave?!
         | 
         | Considering most of humanity is... challenged when it comes to
         | thinking critically, this should have been an entirely
         | foreseeable outcome. I agree it's society's fault, but Facebook
         | is part of society. They watched how their tool was being used
         | by these people, and ENHANCED the reach of those messages
         | because it was good for Facebook. Facebook is a microcosm of
         | the object of its blame. Idiocy writ large in recursion.
        
           | [deleted]
        
           | Hokusai wrote:
           | > most of humanity is... challenged
           | 
           | Most, no. Everybody is blind to one perspective or another.
           | Also, time is limited and attention is limited. Do not think
           | that others are just stupid because their focus or knowledge
           | does not overlap with yours.
           | 
           | "Those people" does not exist. It's just an illusion of your
           | own limited perspective. We are on this together and calling
           | people stupid not it is true, not it helps.
        
             | d23 wrote:
             | "Stupid people don't exist" is a bold take.
        
               | kelnos wrote:
               | That's not what the parent was claiming. The grandparent
               | was claiming that most people are stupid. The parent was
               | pointing out that most people are _not_ stupid. Some
               | people are, and many people have various biases and
               | preconceptions that make it easier for them to be
               | manipulated into believing misinformation.
        
               | d23 wrote:
               | > "Those people" does not exist. It's just an illusion of
               | your own limited perspective. We are on this together and
               | calling people stupid not it is true, not it helps.
               | 
               | I don't understand what's going on on this site. This is
               | the second time recently I've come across someone
               | claiming a comment didn't say something that's
               | essentially copied verbatim mere centimeters higher on
               | the monitor. It's basically in the same eyeful.
               | 
               | Hell, the commenter even _added_ the word  "stupid",
               | which wasn't in the parent comment.
        
             | tppiotrowski wrote:
             | I agree here. My observation is that most of humanity is
             | rational but acts on limited or incorrect information. If
             | you can provide truthful and complete information (in a
             | digestible form), humanity will do just fine.
        
             | kahrl wrote:
             | You're pretending that there's no difference in
             | intelligence/knowledge/skills between individuals or groups
             | of people. There are differences. Can we stop pretending
             | that there's no difference between a college educated
             | European, your average American who reads at a 7th grade
             | level, and a 3rd world farmer who has no perspective
             | outside their small village?
        
               | bigthymer wrote:
               | > a 3rd world farmer who has no perspective outside their
               | small village?
               | 
               | I think this is a disappearing breed. Ubiquitous cell
               | coverage means everyone knows what is happening
               | everywhere now.
        
               | mavhc wrote:
               | Still a lot of people getting their crops burned because
               | they knew about science and managed to grow a crop, and
               | everyone else didn't and thinks they're a witch
        
               | IntrepidWorm wrote:
               | To that effect, it's worth pointing out that in many
               | developing nations, facebook IS the internet. To say that
               | this compounds all of the issues already discussed in
               | this thread is a fairly drastic understatement.
        
               | giantrobot wrote:
               | > Ubiquitous cell coverage means everyone knows what is
               | happening everywhere now.
               | 
               | I think this assumes facts not in evidence. Just because
               | someone is "connected" doesn't mean they're automatically
               | informed. As we see in first world countries, there's a
               | lot of fucking morons that only listen to comfortable
               | lies rather than uncomfortable truths.
               | 
               | There's also industrial production of bullshit peddled by
               | disingenuous actors taking advantage of that fact.
               | Fleecing rubes can be very profitable.
               | 
               | The very problem being discussed is Facebook trying to
               | absolve themselves of bullshit peddling by blaming
               | everyone else. They're blaming people for believing shit
               | _Facebook_ put in front of them under the guise of news.
                | They're also fine taking money to promote bullshit as
               | "news". Yet it's society's fault that they believed
               | everything labeled news Facebook put in front of them.
        
           | Jenk wrote:
            | > this should have been an entirely foreseeable outcome
           | 
           | It was. It is. It always will be.
           | 
           | Could you imagine the outrage had the authorities even
           | attempted to prevent Facebook et al?
        
           | danpalmer wrote:
           | Not saying you're wrong, but to take a slightly more
           | charitable view on humanity: Facebook exploits well known
           | human behaviour to amplify content.
           | 
           | It's (unfortunately?) human nature to share shocking things,
           | it may have even been evolutionarily advantageous at some
           | point. Using algorithms to exploit this behaviour at a scale
           | never before possible is harmful to humanity. No idiocy
           | required.
        
         | 908B64B197 wrote:
         | > The problem is that Facebook is giving the village idiot a
         | megaphone
         | 
         | What's interesting is that before Facebook, the only people who
         | could afford a megaphone were either state-sponsored media or
         | billionaires who owned TV stations and newspapers.
         | 
         | For the ordinary citizens, the only way you could be heard was
         | to write a letter to the editor of your local paper. If the
         | state/billionaire/editor didn't like you, your views or
         | anything really (your skin color perhaps?) it would simply not
         | get published, period.
         | 
         | With Facebook a lot of gatekeeping simply disappeared. It's
         | interesting to see who has an interest in regulating Facebook
         | and bringing back the "good old days" of media.
        
         | alpineidyll3 wrote:
         | Internally, Facebook works aggressively to combat COVID
         | misinformation (source: I work at FB). Literally most of the
         | commonly used datasets are about it. It's easy to hate and hard
         | to understand.
        
         | JPKab wrote:
         | The problem is that Gutenberg is giving the village idiot a
         | megaphone. Gutenberg can't say:
         | 
         | - Amplify your commercial business message to billions of
         | people worldwide.
         | 
         | AND at the same time
         | 
         | - Well, it's your individual choice whether or not to listen to
         | the village idiot.
         | 
         | You guys gave them a megaphone, how do you expect society to
         | behave?!
        
         | TigeriusKirk wrote:
         | I have no problem with them saying both things at the same
         | time. You're responsible for what you give your attention to,
         | and so is everyone else.
        
           | fragmede wrote:
            | Ultimately, yes, but that's a rather short-sighted position
            | to take when there's a cadre of psychologists and other
            | highly-trained people whose entire job is to entrap you
            | further, just so _someone_ can make (more) money.
           | 
            | E.g., when you buy items at the grocery store, do you
            | consciously examine all options, including the items on the
            | bottom shelf by your feet, or do you just go for the items at
            | eye level, and are thus tricked by a similar group of
            | psychologists into buying the product you've been trained to
            | want? And even if you, personally, do, there's a reason why
            | product companies pay supermarkets to have their products at
            | eye/arm level - it works.
        
           | SamoyedFurFluff wrote:
           | Sure, but also Facebook has a bunch of doctorate-having
           | engineers and psychologists dedicating hundreds or thousands
            | of hours to figure out a system that gets me to give my
           | attention to Facebook, whereas I'm one dude who doesn't even
           | have a graduate degree who gets tired and bored and struggles
           | to sleep sometimes.
        
         | foobarian wrote:
         | I think it's not so much Facebook alone but the entire
         | Internet. The connectivity between humans is suddenly increased
         | manyfold, and reaches much wider. Imagine using a graph layout
         | tool on a giant graph with only a few localized connections.
         | Likely the picture will have evenly distributed nodes without
         | much movement. But then as you dump all these new edges onto
         | the graph, the nodes start to move into rigid clusters
         | separated by weak boundaries. I think this is what's happening
         | with the red/blue, vax/antivax etc. groups.
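         | 
         | A rough, hypothetical sketch of that intuition in Python
         | (purely illustrative; not any platform's real algorithm):
         | nodes repel each other, edges pull their endpoints together,
         | so dumping in long-range edges pulls the layout into tight
         | clusters.
         | 
         |     # force-directed toy layout: pairwise repulsion plus
         |     # attraction along edges; more edges => tighter clusters
         |     import random
         | 
         |     def layout(n, edges, steps=200):
         |         pos = [[random.random(), random.random()]
         |                for _ in range(n)]
         |         for _ in range(steps):
         |             f = [[0.0, 0.0] for _ in range(n)]
         |             # every pair of nodes pushes apart
         |             for i in range(n):
         |                 for j in range(i + 1, n):
         |                     dx = pos[i][0] - pos[j][0]
         |                     dy = pos[i][1] - pos[j][1]
         |                     k = 1e-3 / (dx * dx + dy * dy + 1e-9)
         |                     f[i][0] += k * dx; f[i][1] += k * dy
         |                     f[j][0] -= k * dx; f[j][1] -= k * dy
         |             # connected nodes pull together
         |             for i, j in edges:
         |                 dx = pos[j][0] - pos[i][0]
         |                 dy = pos[j][1] - pos[i][1]
         |                 f[i][0] += 0.05 * dx; f[i][1] += 0.05 * dy
         |                 f[j][0] -= 0.05 * dx; f[j][1] -= 0.05 * dy
         |             for p, g in zip(pos, f):
         |                 p[0] += g[0]; p[1] += g[1]
         |         return pos
         | 
         | Run it once with only a few local edges and again after adding
         | a pile of random long-range edges, and the second layout
         | collapses into the rigid clusters described above.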
        
           | kelnos wrote:
           | The internet alone doesn't connect people. Remove things like
           | Facebook and Twitter, and how do you get this giant
           | interconnected graph with few localized connections?
        
             | foobarian wrote:
             | Given a network like the Internet, things like Facebook and
             | Twitter naturally emerge.
        
         | kjgkjhfkjf wrote:
         | I wouldn't blame megaphones for the fact that "idiots" use
         | them. Nor would I expect megaphone manufacturers to dictate
         | what messages can be amplified using them. Nor would I expect
         | megaphone retailers to determine somehow whether a person was
         | an "idiot" before selling them a megaphone.
         | 
         | If someone uses a megaphone in an anti-social manner, that's a
         | matter for the police to handle.
        
           | tspike wrote:
           | Would you expect megaphone manufacturers to give souped up
           | models capable of drowning out other megaphones to only the
           | most controversial, destructive people?
        
           | michaelmrose wrote:
            | Analogies are nearly useless in making an argument. Facebook
            | is an online platform with real-time access to users'
            | communications, plus metrics and analysis of how it's used,
            | which allow it to make reasonable predictions about how it's
            | going to be used in the future.
           | 
           | Comparing it to dumb hardware is ridiculous.
           | 
           | Their ability to predict the negative effect of amplifying
           | crazy provides a moral imperative to mitigate that harm. In
            | case you don't understand, there is a difference between the
            | platform allowing Bob to tell Sam a harmful lie, and letting
            | Bob tell 200 people who tell 200 people..., which is
            | different yet from algorithmically promoting Bob's lie to
            | hundreds of thousands of people who are statistically
            | vulnerable to it.
        
           | kelnos wrote:
           | Is it a problem, however, if the megaphone manufacturers
           | specifically look for people who spread misinformation, and
           | sell them the loudest megaphones with the most reach?
           | 
            | FB has not _directly_ done that, but they have consistently
            | refused to acknowledge that the people they sell the biggest
            | megaphones to - the people who create the most "engagement"
            | (aka money for FB) - tend to be the types of people who
            | generate false information and outrage.
           | 
           | Their publicized efforts to shut down fake accounts and pages
            | set up specifically to spread misinformation are perfunctory,
           | and simply something for them to point at and say, "see,
           | we're doing things to fix the problem", when they're merely
           | playing whack-a-mole with symptoms, know what the root of the
           | problem is, but refuse to fix it because it's their cash cow.
        
           | lijogdfljk wrote:
            | So I think this is a breakdown of our previous mindset on the
            | matter. I don't know what future is "right", what the answer
            | is... but I think it is important for us to at least recognize
            | that in the past, a crazy person on the street corner was
            | limited quite a bit on velocity.
            | 
            | This megaphone is a poor example imo. A far better example
            | would be broadcast television. We're now broadcasting
            | everyone straight into not just American homes, but homes
            | worldwide.
            | 
            | So I ask, because I don't know, how does broadcast television
            | differ from a megaphone in requirements? What responsibility
            | is there on broadcast television that doesn't exist for a
            | street corner?
        
         | FpUser wrote:
         | I am not sure how it goes for the average person. Myself: I
         | just do not go to places where village idiots tend to
         | accumulate, like FB, or if I do (it's hard for me not to watch
         | YouTube), I just completely ignore all that crap.
        
           | geodel wrote:
           | And that might be most reasonable thing to do.
           | 
            | It seems like a lot of folks here allude to, though don't
            | exactly say, that they should be in a position to decide who
            | is an "idiot", "bad-faith", "anti-science", and so on.
        
         | twblalock wrote:
         | Should our society have free speech, or free speech for
         | everyone except idiots?
         | 
         | If you agree with the second formulation, who do you think
         | ought to be in charge of deciding who the idiots are? Surely
         | Mark Zuckerberg would not be your first choice.
         | 
         | Maybe there is a third option: no free speech for anyone, all
         | speech must be moderated for lies and misinformation. Is that
         | what you want? In that case, who gets to decide what is true
         | and what is not? Surely Zuckerberg wouldn't be your first
         | choice for that either, right? And what should happen when
         | Facebook blocks "misinformation" that turns out to actually be
         | truthful?
         | 
         | Those who want Facebook to regulate "misinformation" and
         | gatekeep who (and what) is allowed on the site need to admit
         | that they don't actually believe in free speech -- they believe
         | in limited speech regulated by corporations.
        
           | michaelmrose wrote:
           | Facebook should ban or suspend accounts which spread
           | objective untruths that will tend to be harmful if spread.
           | 
           | You can have your free speech on your own website.
        
             | hpoe wrote:
             | Objective untruths like COVID being the result of a lab
             | leak?
        
               | JPKab wrote:
               | This.
               | 
                | People who support these kinds of activities are too
                | youthful and arrogant to have any form of humility about
                | things they once passionately believed to be true turning
                | out to be incorrect.
               | 
               | In the 90's, eggs were thought to be as deadly as
               | cigarettes. A bowl of cheerios was considered to be
               | vastly superior nutrition wise to a plate of eggs. This
               | is the opposite of what we know to be true today. If I
               | had tried to argue against this with the current form of
               | Facebook, I'd be censored. (They also thought avocados
               | were bad for you in the 90s.)
               | 
               | The elevation of a collectively determined "objective"
               | truth over the freedom of individuals to exchange ideas
               | is the first step towards creating an environment for
               | authoritarianism to flourish. Subjugation of the
               | individual to the collective is the norm in most of
               | history, and it's not an accident that our current
                | prosperity emerged in the times and places where it was
                | lifted.
        
               | redis_mlc wrote:
               | The disturbing things about the social media bans on lab
               | leak reporting are:
               | 
               | 1) DOE LLNL reported it was a possible lab leak in May
               | 2020.
               | 
               | 2) A Chinese virologist defected to the US in late 2020
               | and all of her interviews were deleted, though I saw her
                | photos and summaries before the deletions.
               | 
               | 3) NTD Media started reporting in early 2020 and has been
               | persecuted by US socials at every turn since then. All of
                | their reports have turned out to be factual. Only Sky
                | Australia has had most of their reports not censored.
               | 
               | So there was plenty of early lab leak info, but it was
               | virtually all censored.
        
           | JPKab wrote:
           | Take any of these arguments about Facebook, replace
           | "Facebook" with "printing press" and everything still makes
           | sense, which tells you what this really is:
           | 
           | Cultural elites wanting to control what their perceived
           | inferiors think, believe, and most importantly, vote for.
           | 
           | The same class of people who wanted to regulate the printing
           | press in Europe during the 15th and 16th centuries are the
           | ones who want to regulate the internet today.
        
             | twblalock wrote:
             | To be fair there is also a large contingent of well-
             | intentioned people who don't realize the full implications
             | of what they are asking for.
             | 
             | Ironically many of those people would say they oppose the
             | concentration of corporate power, yet they are asking a
             | very large capitalist corporation to exercise power over
             | one of the most fundamental freedoms.
        
           | tenebrisalietum wrote:
           | I want free speech for everyone except idiots.
           | 
           | > who do you think ought to be in charge of deciding who the
           | idiots are?
           | 
            | Think about it. Engineering disciplines have mostly solved
            | this issue. Let's take structural/civil engineering and
            | something that affects many people - bridges. Through a
            | combination of law, codes, and government, not just any Joe
            | Schmoe can build a bridge. Existing bridges generally work
            | well and can be trusted. Sometimes bad things happen, like
            | the FIU collapse, but that's very rare.
           | 
            | I don't understand why there can't be a group of people,
            | large or small, educated and from diverse backgrounds, that
            | can set basic standards on what is and is not misinformation,
            | with due-process-like things such as appeals, etc. It's not
            | an impossible task.
           | 
           | > Those who want Facebook to regulate "misinformation" and
           | gatekeep who (and what) is allowed on the site need to admit
           | that they don't actually believe in free speech -- they
           | believe in limited speech regulated by corporations.
           | 
           | If you're going to use a third party for communication and
           | that third party is not owned by the people (i.e. a
           | government entity) then it follows from the above statement
           | that you don't believe in private property rights.
        
             | twblalock wrote:
             | How do you ensure that this Ministry of Truth you are
             | proposing will remain free of political pressure and
             | corruption?
             | 
             | And how do you expect a panel of experts to escape
             | groupthink and rule fairly in cases where the expert
             | consensus turns out to be incorrect?
             | 
             | Both of those goals are impossible to achieve.
        
               | michaelmrose wrote:
               | There is a difference between defining absolute truth and
               | identifying obvious lies.
               | 
                | If a medication is approved and then a new risk factor is
                | identified, the issue goes from unproven to validated; but
                | as long as it wasn't based on nothing, it was never a lie.
               | 
               | The covid vaccine containing a chip to track you was
               | always a lie.
               | 
                | We don't need a Ministry of Truth; we need a Ministry of
                | Obvious Bullshit.
        
               | kelnos wrote:
               | Under the -- very rocky -- assumption that a Ministry of
               | Obvious Bullshit could still avoid scope creep into
               | moderating truth, I still don't think this would fix
               | things. People who actively want to spread bullshit will
               | find a way. It's like spam vs. anti-spam, ads vs. ad-
               | blockers.
        
               | tenebrisalietum wrote:
               | People who want to commit murder might still find a way
               | too but we still have laws, police, prosecutors, etc.
               | Perfect should not be the enemy of good.
        
               | jiveturkey42 wrote:
               | That's quite the leap from moderating information to a
               | murder investigation
        
             | adolph wrote:
             | > It's not an impossible task.
             | 
             | Ok, set it up and then maybe removing idiotic speech can be
             | considered. Until then you have nothing but a desire to
             | define what is undesirable.
        
             | JPKab wrote:
             | Your comment can be loosely translated to the following:
             | 
             | "Authoritarianism can work. It's just the wrong people were
             | in charge. If me and other people like me had the same
             | power as Stalin/Hitler/Mao/Mussolini/Putin, everything
             | would be better because they were dumb, but we know better.
             | We can create Utopia when nobody else could. We are
             | uniquely prescient and intelligent."
             | 
             | The amount of arrogance and utter lack of humility is
             | shocking.
        
               | jiveturkey42 wrote:
               | If you thought the world is bad with aggressive bullies
               | in power, wait until you see how bad it gets with
               | aggrieved nerds in power.
        
             | SamoyedFurFluff wrote:
             | > I don't understand why there can't be a group of people,
             | large or small, educated and from diverse backgrounds, that
             | can set basic standards on what is and is not
             | misinformation, with due-process-like things such as
             | appeals, etc. It's not an impossible task.
             | 
             | This seems incredibly naive to me. It seems what would
             | happen there is we'd give people a list of important
             | figures to bribe to get their speech considered
             | information (or a competitor's speech considered
             | misinformation).
             | 
             | And even if these were incorruptible humans, there are
             | several statements that are heavily under debate currently
             | as fact or fiction, such as the validity of neopronouns,
             | whether or not Spanish speakers anywhere use Latinx, who is
             | the bad art friend, whether Kyle Rittenhouse should've been
             | convicted of murder, whether trans children exist and, if
             | so, whether they can pursue any transitioning or puberty
             | delay, what critical race theory is, whether the COVID
             | vaccine rollout speed is sinister, and whether Trump lost
             | the 2020 presidential election legitimately. And that's all
             | just in America.
        
           | lmilcin wrote:
           | I believe speech should be free but people should be
           | responsible for their speech.
           | 
           | People behave completely differently when there are
           | consequences to what they say.
           | 
           | Speech for "everybody but idiots" is not free speech.
        
             | twblalock wrote:
             | What should the consequences be?
             | 
             | Removal of speech is not a consequence of speech -- it's
             | preventing speech in the first place. That's what happens
             | when Facebook blocks or deletes "misinformation" -- they
             | are removing the speech itself. That's not the same thing
             | as "consequences" for speech.
             | 
             | Look at what HN mods do -- they ban trolls, but they don't
             | delete what the trolls posted. It's there for everyone to
             | see -- in fact, if you look at "dead" comments you can see
             | flagged stuff too. In terms of free speech, that's very
             | different from deleting the comments entirely, which is
             | what people seem to want Facebook to do.
             | 
             | And for the sake of argument, even if we accept that
             | "consequences" ought to include the right to free speech
             | being taken away from bad actors -- who can be trusted to
             | decide who ought to be punished? Again, surely not
             | Facebook. Surely not the government either -- the winners
             | of every election would punish their enemies by taking away
             | their rights. So even if we could tell, 100% reliably, who
             | were trolls and who were not, we still should not give any
             | corporation or government the power to take away the right
             | of free speech.
        
               | kelnos wrote:
               | > _What should the consequences be?_
               | 
               | The usual when you get up in front of a group and act
               | like a jackass: social shame and ostracization. Something
               | that's hard to do on internet platforms. Even when people
               | are not anonymous, they have plenty of ways to "hide",
               | and it's easy to unfollow and block people who criticize
               | you for spreading misinformation.
               | 
               | So I don't know. Removal and deplatforming, IMO, is not
               | the answer. You don't fix extremism through censorship;
               | that just makes it worse and drives it underground.
        
               | lmilcin wrote:
               | Yes, removal is not a consequence.
               | 
                | What should the consequences be? What are the consequences
                | when you say something stupid to your family or friends?
                | What should the consequences be when you knowingly lie to
                | slander somebody?
               | 
               | > And for the sake of argument, even if we accept that
               | "consequences" ought to include the right to free speech
               | being taken away from bad actors
               | 
               | This already exists in the law. Just as your right to
               | move freely is taken away in certain situations (for
                | example due to a restraining order).
        
         | sharadov wrote:
         | That's too simplistic and naive; their algorithms amplify what
         | will get the most clicks!
        
         | betwixthewires wrote:
         | It's not a megaphone, the only people that can see it are
         | literally the village idiot's friends and family. It's gossip
         | within your social circle.
        
           | seanmcdirmid wrote:
           | But village idiots can share content into their social circle
           | from other more popularly known village idiots.
        
         | s1artibartfast wrote:
         | I think a better analogy is that Facebook gave society a window
         | into each other's lives, and people can't look away.
         | 
         | Facebook prioritizes what people want to see, and people want
         | to see train wrecks and inflammatory content.
        
           | disambiguation wrote:
           | > people want to see train wrecks and inflammatory content.
           | 
           | I'm starting to believe this more and more, but what I can't
           | understand is why? We know it has no real "nutritional
           | value", yet we crave it anyway.
           | 
           | Are we just bored and desire entertainment and drama?
           | 
           | What's the evolutionary drive for drama anyway?
        
             | s1artibartfast wrote:
             | >Are we just bored and desire entertainment and drama?
             | 
             | Essentially yes. We evolved so that we get a chemical hit
             | when we engage with social drama. If you are bored, have
             | nothing better to pay attention to, and/or have low
             | self-control, it is a very easy way to get a quick fix.
             | 
             | >What's the evolutionary drive for drama anyway?
             | 
             | Humans evolved as pack animals; paying attention to pack
             | drama was extremely important. Picking sides or paying
             | attention could mean the difference between getting your
             | next meal or being beaten to death.
             | 
             | Because we evolved in small packs, where information was
             | usually relevant, we don't have a good filter, or on/off
             | switch.
             | 
             | Today we are exposed to the latest and most exciting drama
             | from around the world, as opposed to our tiny pack, and it
             | is really hard to resist paying attention.
             | 
             | Paying attention to anything in the news or on social media
             | is unlikely to make an impact on your life. Even the
             | biggest topics have a very low risk of impacting you
             | personally, but you will notice that most of them have an
             | explicit or implicit hook that they _could_ impact you.
        
           | cgriswald wrote:
           | People want inflammatory content like a moth wants a flame.
           | Facebook amplifying a signal from the lunatic fringe preys
           | upon the need of non-lunatics (or different-thinking
           | lunatics) to argue against ideas they consider dangerous or
           | just wrong. As a side-effect, it makes the ideas appear more
           | mainstream, which has the effect of making the ideas more
           | popular. This further increases the compulsion of non-
           | lunatics to address the ideas.
           | 
           | I'm not sure if that qualifies as being what people 'want' or
           | not, but it seems like it's profitable.
        
             | s1artibartfast wrote:
             | >I'm not sure if that qualifies as being what people 'want'
             | or not, but it seems like it's profitable.
             | 
             | People want it like an alcoholic wants a drink. Facebook is
             | the corner store that sells to the public.
             | 
             | I think we agree on the mechanism. but the question is what
             | to do about it.
             | 
             | People generally say, "silence people I don't like, and
             | blind others from seeing what I don't agree with."
             | 
             | The problem is that this is not a workable standard,
             | because everyone has a different opinion on what should be
             | censored.
        
         | jeffrogers wrote:
         | Right. Prior to social media, people were vetted in many ways
         | and in every context in which they gained an audience (e.g.,
         | earned standing in social settings and community groups,
         | promotions at work, editors of one sort or another when
         | publishing to a group, etc.). Audiences grew incrementally as
         | people earned their
         | audience. Social media removed all that vetting and it inverted
         | the criteria to grow an audience. Sensationalism was rewarded
         | over thoughtfulness. So one of the most important tools we've
         | always relied on to judge information was removed. Hard to
         | believe, as intelligent as these folks at Facebook/Meta are
         | said to be, that they don't understand this. Feels
         | disingenuous.
        
           | LeifCarrotson wrote:
           | It is difficult to get a man to understand something when his
           | salary depends upon his not understanding it.
           | 
           | - Upton Sinclair
        
         | servytor wrote:
         | Yeah, I always hear people talking about the great "global
         | village" where everyone is 'connected', but I have to admit I
         | am against it. I don't want to be prank called.
        
       | JKCalhoun wrote:
       | > Asked whether vaccine hesitancy would be the same with or
       | without social media, Bosworth...
       | 
       | answered elliptically.
        
         | GuB-42 wrote:
         | Because it is a loaded question and there is no good answer.
         | 
         | Social media exist; they are part of our world, not a factor
         | we can isolate, which means answering that question would be
         | science fiction. It makes as much sense as speculating about a
         | world without Hitler: yeah, it can make great stories, but
         | that's it.
         | 
         | Maybe the answer would be "no, because a world without social
         | media would be a different world", but in which way? No one
         | knows.
        
       | ambrozk wrote:
       | The public doesn't trust politicians, the government, or its
       | official experts. Facebook is a public forum, and the public uses
       | it to express their mistrust. Politicians then pretend that
       | Facebook is the reason no one trusts them, because shooting the
       | messenger is easier than admitting that they don't have much
       | authority any more.
       | 
       | Are there major problems with Facebook? Absolutely. But the
       | motivation behind attacks like the one leveled by this journalist
       | is transparently to deflect blame off our incompetent political
       | establishment and onto an easy scapegoat. The truth is that if
       | politicians want people to trust them, they're going to have to
       | figure out how to convince those people. Making Facebook or
       | Twitter delete your enemies' posts hasn't worked in the past and
       | it won't work in the future. This is a free society. You don't
       | get to replace the public's opinions just because you declared
       | those opinions "misinformation." Maybe in China, but it just
       | doesn't work that way here.
        
       | zeruch wrote:
       | ...as if a corporation exists in a vacuum, outside of society.
        
       | iansimon wrote:
       | You know what I blame this on the breakdown of?
       | https://www.youtube.com/watch?v=Kw39tcyg7So
        
       | kgin wrote:
       | It's not false that there is a societal problem that is not
       | unique to Facebook.
       | 
       | But that sidesteps the question of what responsibility they have
       | as a company whose profits are, at minimum, powered by that
       | problem, if not exacerbating the problem.
       | 
       | "Privatize the profits, socialize the costs" is not sustainable.
        
         | sneak wrote:
         | > _"Privatize the profits, socialize the costs" is not
         | sustainable._
         | 
         | The church (as well as the integrated state) has been doing it
         | for thousands of years. I think treating the general population
         | as a somewhat renewable resource to be mined/harvested has a
         | longstanding tradition and history of being one of the most
         | sustainable things in human history.
         | 
         | Depending on how tax money is spent (i.e. in ways that benefit
         | a subset of society, rather than all taxpayers) this is perhaps
         | the most common and longstanding tradition we human beings
         | have.
        
           | geodel wrote:
           | Indeed. Many think if it sounds like feel-good rhetoric then
           | it must be true.
        
         | jensensbutton wrote:
         | > whose profits are, at minimum, powered by that problem
         | 
         | I don't think it's established that that's the minimum.
         | Facebook usually argues that it's a small minority of their
         | content and I don't see any evidence against that (it just gets
         | a lot of scrutiny). It seems like if you magically removed all
         | the "bad actors" they'd make just as much money.
        
         | joshenberg wrote:
         | Exactly. Running 'the world's largest vaccine information
         | campaign' rings hollow when it's really a mitigation effort.
         | That's akin to saying that the Valdez tragedy and subsequent
         | clean-up made Exxon the top environmentalists of '89.
        
       | SleekEagle wrote:
       | The legislative system has always taken a long time to catch up
       | to new technological innovations. It's certainly a problem that
       | technology advances so rapidly now and the time scale of
       | legislative action can't keep pace, especially given how
       | connected the world is.
       | 
        | Laws no longer have to keep up only with new technologies that
        | affect _how_ we live, but now also with information highways
        | that determine, to some degree, _what_ we think. I'd much rather
        | these highways be regulated in a way that at least touches
        | democracy than by private companies whose role it is to drive
        | profit.
        
       | bryan_w wrote:
       | At 1 hour old, this has 143 comments. Also $FB is up over 1%
       | today.
       | 
       | Just some random thoughts; take them as you wish.
        
       | ypeterholmes wrote:
       | Neither society nor facebook is to blame. The three foundational
       | systems of our lives have been centralized and corrupted: money,
       | information, politics. These systems are to blame, and the answer
       | is decentralized systems. Decentralized money (Crypto),
       | decentralized information (like HN), and decentralized voting
        | (DAOs). Once we start using healthy systems, we'll get our power
       | back and be able to fix our problems.
        
         | yakkityyak wrote:
         | This reads like a GPT-3 response.
        
           | HatchedLake721 wrote:
           | Lol for real, don't know if OP is serious. How do you catch a
           | GPT-3 commenter bot?
        
         | GDC7 wrote:
         | > Decentralized money, decentralized information, decentralized
         | voting.
         | 
         | Satoshi Nakamoto, net worth 30 billion and veto power over the
         | entire cryptospace (if he's alive)
         | 
          | Vitalik Buterin, net worth 10 billion and veto power over the
         | whole Ethereum ecosystem.
         | 
         | "Meet the new boss...the same as the old boss"
        
           | ypeterholmes wrote:
           | Satoshi has veto power over the entire crypto space?
           | 
           | Uhhhhhhhhhhh no.
        
             | GDC7 wrote:
             | if he/she moves the original coins and proves that he/she
             | is indeed Satoshi via technical competence and social
              | consensus, then between the immense wealth and social
             | status as the creator of Bitcoin their influence on the
             | crypto world will be massive.
             | 
              | You can never decentralize power and wealth, because they
              | obey power laws, and also because it happens organically
              | and unspectacularly: it's simply a bunch of people agreeing
              | that somebody is cool or that what they are saying makes
              | sense.
             | 
             | It snowballs from there and a couple of exponentials later
             | that person is sitting on billions of dollars and a 70M
             | people platform
        
         | JaimeThompson wrote:
         | How is HN decentralized?
        
           | ypeterholmes wrote:
           | Crowd-sourced content + crowd-sourced voting on visibility.
           | 
           | It's not perfectly decentralized of course, running on a
            | server with centralized administration, but the general
           | structure of the data flow is many to many.
        
             | ldiracdelta wrote:
             | + content moderation by admins.
             | 
             | "but some animals are more equal than others."
        
         | Kina wrote:
         | Extraordinary claims require extraordinary evidence.
        
           | ypeterholmes wrote:
           | Which claim? Our money is centralized via the Fed. Our
           | information is centralized via corporate media. Our politics
           | are centralized via Oligarchy.
           | 
           | These systems act to divide and conquer us, sapping our
           | power. By switching to decentralized systems, we will be
           | reconnected and rediscover our power. It's already happening
           | via sites like this, crypto, etc.
        
       | redwood wrote:
       | This is a "the media is the message" situation
        
       | actuator wrote:
       | Title is really poor compared to the content of the article.
       | 
        | In any case, he is right. Look at the pattern: any large social
        | network has these issues, which more or less seems to be related
        | to how people interact. Twitter is massively toxic; so is Reddit.
        | Back in the day, Tumblr, which was never as huge as current
        | social media, also used to have the content Facebook gets blamed
        | for.
       | 
        | Give people a platform to publish and share, and every opinion
        | has the chance to be there.
       | 
        | It also doesn't have to be a massive broadcast platform;
       | messaging platforms with small communities in the form of groups
       | have these issues on a smaller scale. Though broadcast does make
       | it worse.
        
         | delusional wrote:
         | The maker of the gun may not be held solely responsible for
         | murder, but we surely have to consider if we want guns in the
         | hands of murderers.
         | 
         | Facebook isn't arguing this from a neutral standpoint. They are
         | arguing from the position of the company that stands to lose
         | their business if society decides murderers shouldn't have
         | guns.
        
           | actuator wrote:
            | But that's exactly the issue, no?
           | 
            | Social media platforms create immense value as well. I might
            | not be a user, but I can see how they help a lot of people.
           | 
            | Do we enforce a license on people to use social media
           | platforms?
        
             | baq wrote:
             | given what i see in there, yes, please.
        
         | d0mine wrote:
         | Said the person crying "fire" in a crowded theater (it is human
         | nature to panic, but that doesn't excuse the instigator).
        
         | knuthsat wrote:
         | Whenever I see toxicity, I always assume it's a young
         | individual. I remember myself in my teens, always up for some
         | trolling on forums.
         | 
         | Today, I just don't have a need to do such things. Whenever I
         | encounter this weird behavior, I just stop interacting, because
         | I have a feeling it's some 13-16 year old wasting my time.
        
         | nefitty wrote:
         | I don't think that's true. HN is pretty non-toxic, from my
         | perspective at least. Reddit is tolerable to me.
         | 
         | I think the problem is misalignment of incentives. If I'm
         | incentivized to increase engagement, then I can think of some
         | pretty ridiculous shit to say that will get lots of people
         | clicking downvote and shouting at me.
         | 
         | I think these platforms could have been more proactive in
         | setting those cultural norms that evolve into fruitful social
         | interaction.
        
           | actuator wrote:
            | Have you been to any thread which mentions, say, China,
            | India, or several other topics?
           | 
           | You will see a lot of toxicity and you can find several
           | deleted or heavily downvoted comments.
           | 
            | Even the HN front page and comment section are algorithmic.
            | Vote brigading happens here as well, for posts and comments.
        
             | Ottolay wrote:
             | The HN algorithm will de-front page stories where the
             | comment section starts looking like a flame war. If you
             | look up the stories which were removed from the front page,
             | there is quite a bit of toxicity in the comments.
             | 
             | I personally like that they try to play down controversies
             | instead of optimizing for engagement from them.
        
         | carabiner wrote:
         | Lol tumblr did not have much right wing boomer content that I
         | saw.
        
           | forgotmyoldacc wrote:
           | That misses the point of the OP. Tumblr (as a general social
           | media platform) had incredible amounts of bullying and
            | toxicity. Whether it's from the left or right is a moot point.
           | 
           | Example found after 10 seconds of Googling:
           | https://www.vice.com/en/article/3da838/an-attempted-
           | suicide-...
        
       | [deleted]
        
       | cletus wrote:
       | Look at video games, particularly on mobile. I mean they aren't
       | even games anymore. They're just metrics-optimized psychological-
        | trick machines to extract the most money from you $1 at a time,
        | i.e. in-app purchases and pay-to-win. These aren't games: they're
       | engagement bait to bring you and your wallet back each day.
       | 
       | Why do we have this? Because people suck and it just makes way
        | too much money for anyone not to do it. Why didn't we have this
        | 20 years ago? Because the technical capability wasn't there.
       | 
       | It's really no different here. Communication and messaging costs
       | have really gone down to zero. If it wasn't FB, it'd be someone
        | else. There's simply too much money and very little cost in
       | engagement bait, whether or not that's the intent of the platform
       | or product.
       | 
       | And yeah, that's the case because people suck. Most people aren't
       | looking for verifiable information. They're looking for whatever
       | or whoever says whatever it is they've already chosen to believe.
       | That's it.
       | 
       | I'd say the biggest problem with FB and Twitter is sharing links
       | as this is such an easy way for the lazy, ignorant and stupid to
        | signal their preconceived notions to whatever audience they
       | happen to have. But if Twitter or FB didn't allow sharing links,
       | someone else would and that someone else would be more popular.
       | 
       | I honestly don't know what the solution to this is.
        
         | novok wrote:
         | There are still many video games that are not click boxes, and
         | if you watch what kids are actually into, they tend to be
         | actual games like Roblox, Minecraft, Among Us, and Fortnite.
         | Even the small mobile games they play end up being mostly
         | actual games vs. metrics-optimized click games.
        
         | Exendroinient wrote:
         | So the biggest obstacle to solving anything about this world is
         | the stupidity of the general population. People can't stop
         | choosing the most stupid things with their wallets and
         | attention.
        
           | bmitc wrote:
           | As harsh as I am on humanity, it's not entirely people's
           | fault. We have primate emotions that have been grossly
           | outpaced by our technological development. This is why I've
           | become somewhat anti-technology despite still working on
           | technology because it stretches us to lifestyles that are
           | distinctly inhuman.
           | 
           | Humanity simply does not have the emotional capacity to
           | handle the technology we create. It never has and never will,
           | and it's just that software has greatly amplified this
           | inability. This is why mental disorders are skyrocketing.
           | We're building emotional prisons with technological walls and
           | bars.
        
             | kmlx wrote:
             | > and never will
             | 
             | don't humans continue evolving?
        
               | bmitc wrote:
                | My layman's and superficial understanding of the situation
               | is that humans do continue to evolve but our genome is
               | also degrading in terms of building up mutations.
               | 
               | https://www.nature.com/articles/news990204-2
               | 
               | If we're evolving, I don't think it's anywhere close to
               | the rate of increase of our technology.
        
           | gretch wrote:
           | Not everything that's stupid is harmful.
           | 
           | It's fine if everyone is stupid, or most of us choose to
           | engage in stupid things from time to time. I paid $17 to see
           | Transformers. That was pretty stupid, but oh well, no one got
           | hurt.
           | 
           | People want to be entertained and engaged - I do not think
           | they aim to harm. So I think we need to develop alternatives
           | to entertain people - these alternatives can still be stupid,
           | they just shouldn't be harmful.
        
         | GuB-42 wrote:
         | > Look at video games, [...] I mean they aren't even games
         | anymore. They're just metrics-optimized psychological-trick
         | machines to extract the most money from you $1 at a time [...]
         | 
         | Did you just describe arcades?
        
           | notacoward wrote:
           | I don't think so, and I used to spend a lot of time in
           | arcades. For one thing, _everyone_ had to pay the exact same
           | amount to play. Good player, bad player, whatever. There was
           | no  "free tier" to get you hooked before things suddenly got
           | much harder, and when they did there was no way to pay more
           | to make things easier (though toward the end some games did
           | let you _continue_ by pumping in more tokens). Every game was
           | also self contained. There was no persistent state that you
            | were in danger of losing if you didn't keep checking in day
           | after day, week after week. Fortunately, the games were also
           | _cheap_. Even when I was really poor, I could afford a dollar
           | for several tokens that were enough to play the games I liked
           | just about as long as I could stand. Hour-long games of
           | Battle Zone turned into all-afternoon games of Joust. I could
           | turn a game over to someone less skilled, go back to my
           | apartment, eat lunch, come back, and pick up again before
            | they'd managed to exhaust the lives I had accumulated.
           | 
           | Arcade games were certainly meant to be enjoyable, and to
           | keep you playing, but they were nowhere near the dark-pattern
           | psychological minefield that modern games - especially mobile
           | games - often are.
        
           | supertrope wrote:
           | Arcades may have heavier users but not any "whales" who are
           | using mom's credit card.
        
       | aborsy wrote:
       | I agree with the title.
        
       | nimrody wrote:
       | "When you're young, you look at television and think, There's a
       | conspiracy. The networks have conspired to dumb us down. But when
       | you get a little older, you realize that's not true. The networks
       | are in business to give people exactly what they want. That's a
       | far more depressing thought." - Steve Jobs.
        
       ___________________________________________________________________
       (page generated 2021-12-13 23:02 UTC)