[HN Gopher] Facebook Whistleblower Leaks Thousands of Pages of ...
___________________________________________________________________
Facebook Whistleblower Leaks Thousands of Pages of Incriminating
Internal Docs
Author : sizzle
Score : 831 points
Date : 2021-10-05 16:03 UTC (6 hours ago)
(HTM) web link (www.npr.org)
(TXT) w3m dump (www.npr.org)
| prvc wrote:
| What concrete information has been revealed that implicates
| Facebook in illegal behavior?
|
| How do the "whistleblower"'s stated desires wrt government-
| mandated moderation standards differ from those articulated by
| representatives of the company itself? Such a framework would
| shift liability away from the company, possibly reduce their
| moderation load, and create a significant barrier for entry for
| potential competitors.
| avisser wrote:
| In the 60 minutes interview her lawyer cites part of Dodd-Frank
| that protects whistleblowers going to the SEC. Their position
| is that the documents are material to investors re:valuation
| and that Facebook was negligent by not providing it.
|
| https://youtu.be/_Lx5VmAdZSI?t=596
| jhawk28 wrote:
| The problem is that would cover pretty much anything.
| gameswithgo wrote:
| Why is that a problem?
| nostrademons wrote:
| That's the intent, not the problem.
| TeMPOraL wrote:
| As Levine's famous quip goes, "everything is securities
| fraud". From a first reading of the article, it seems to me
| she just wants to nail Facebook with the SEC, with everything
| else being just noise to get media attention.
| travoc wrote:
| The whistleblower is taking a huge legal risk here. I wonder if
| the Wall Street Journal has really explained the consequences
| to the leaker.
| nostrademons wrote:
| I worked with her briefly on Google+. She struck me as a
| trust-fund kid: idealistic, brash, (perhaps over)confident,
| intelligent but prone to tackling problems that can't be
| solved. Wikipedia says her parents were a doctor & academic-
| turned-priest, so she's likely got some family wealth backing
| her up. It's a different calculus than for most of the people
| here, who can't afford good lawyers and need that paycheck to
| survive.
| robbrown451 wrote:
| She is taking a risk, and I salute her for it. But at the end
| of the day, it is likely her career will have a huge boost
| from this. And Facebook would probably be very foolish to go
| after her.
| AzzieElbab wrote:
| Umm, if this hearing results in some kind of monetary
| settlement, she will be entitled to some of it.
| robbrown451 wrote:
| I can't see how a monetary settlement would be the result
| of this.
| AzzieElbab wrote:
| Why not? This hearing is not about monopolization, and FB
| cannot be singled out when it comes to regulations either.
| newfonewhodis wrote:
| > What concrete information has been revealed that implicates
| Facebook in illegal behavior?
|
| I'll leave it up to the legal system to decide what FB has done
| that's currently illegal. However, you don't just blow the
| whistle when you think laws are being broken. Sometimes you do
| it to start a discussion that can change the laws.
|
| It's clear from the Facebook Files stories and documents that
| FB has known about the effects of their product on consumers
| (especially kids) and 1. lied to the public 2. lied to
| lawmakers 3. didn't change the product in ways that positively
| impacted kids.
|
| That's obviously messed up and is not (afaik, ianal) covered
| under current US laws. I do strongly believe that it should be.
| Platforms have responsibility, and S230 entirely takes the
| responsibility away from them.
| caminante wrote:
| _> 1. lied to the public 2. lied to lawmakers 3. didn't
| change the product in ways that positively impacted kids._
|
| To the parent's request, something needs to be concrete and
| egregious for a prosecutor to chase this.
| prvc wrote:
| Will the "lying to shareholders" argument hold up under
| scrutiny, though? The putative knowledge they supposedly
| withheld is extremely vague, and what's more, pertains to
| very sensationalistic topics. The delta between appearance
| and reality for these revelations' severity couldn't be
| greater.
| flandish wrote:
| What I don't understand is how this is different than Nike
| making shoes with child labor overseas in the "global south",
| etc. That harms children, for profit.
|
| Facebook is a for profit organization. This is the rule with
| organizations like this. If given enough time, size, and lack
| of shielding, any corporation will eventually cause harm in
| some way.
|
| We seem to have less focus on "Nike whistleblowers" or similar,
| you get the idea.
| robbrown451 wrote:
| It's a very different issue. One important difference is that
| the children being harmed here are American children. Rightly
| or wrongly, it raises the priority since American parents are
| seeing their own children harmed.
|
| I don't want to defend Nike, but still, the fact that people
| are poor enough to send their kids off to factories to make
| shoes isn't Nike's fault.
|
| But again, very different issue. Worthwhile to think about,
| but I don't think it should detract from this one.
| JohnFen wrote:
| Nike has received (and continues to receive) a lot of
| attention about that sort of thing for decades.
|
| But, on a larger level, I see several comments that seem to
| be implying that talking about the wrongdoing of one company
| isn't valid unless we talk about the wrongdoing of all
| companies at the same time.
|
| I think that argument is faulty. If the argument were good,
| then it wouldn't be possible to talk about any company's
| wrongdoing because you'll always be leaving others out of the
| conversation.
| flandish wrote:
| I understand where you are coming from - I was speaking
| more to the "meta" understanding of how this is a surprise
| to some folks (in gov).
| 8note wrote:
| I don't think the focus on Nike has resulted in the whole
| industry being fixed.
|
| Companies are case studies of a problem; any single one is
| insufficient information for coming up with solutions.
| iabacu wrote:
| This is not about children, though, it's about investors.
|
| If Nike makes a material claim to investors (e.g. that
| child labor is not used), but the claim is revealed to be
| knowingly false, then that's securities fraud.
| SavantIdiot wrote:
| If you can't moderate your platform, you shouldn't have a
| platform.
|
| This "Boo hoo we can't moderate ... " _wipes tears with
| billions of dollars_... is deadly comedy, yet lots of people
| fall for it.
| gruez wrote:
| > If you can't moderate your platform, you shouldn't have a
| platform.
|
| what counts as "moderate"? only removing illegal material?
| removing fake news as well? maybe remove questionable news as
| well, as long as it's towards the "greater good"?
| throwawayay02 wrote:
| Or maybe have no moderation at all? I was under the
| impression you only see posts from people you follow, so if
| you're an adult, why require any moderation at all except for
| not showing what your country deems illegal?
| 6gvONxR4sf7o wrote:
| Is Zuckerberg lying to congress illegal behavior? These docs
| seem to have made a pretty clear case that he did that.
| fullshark wrote:
| The claim is FB mislead shareholders about their product and
| committed securities fraud.
| I_am_tiberius wrote:
| I assume not all of it concerns relevant information. Therefore
| I ask myself: why not leak only the information relevant to the
| public?
| PragmaticPulp wrote:
| EDIT: I removed a misquote from this comment after another user
| pointed out my mistake.
|
| There do appear to be a few legitimate concerns worth
| investigating, but it's starting to feel like the media has
| sensed that Facebook is the villain du jour and they're throwing
| everything at the wall to see what sticks. These stories seem to
| dance around the subject of _what_ exactly Facebook did, but
| instead focus on the existence of a whistleblower. If there was a
| story here, I feel like they're working hard on overplaying their
| hand at the risk of losing the audience once the initial frenzy
| wears off. That's fine for media companies who can move on to the
| next bogeyman, but it's not going to help the underlying cause.
| paxys wrote:
| > Facebook issued a lengthy statement from director of policy
| communications Lena Pietsch titled "Missing Facts from
| Tonight's 60 Minutes Segment."
|
| > She pointed to Facebook's investment to monitor for harmful
| content; disputed the way Facebook's own research on teenagers'
| mental health has been reported; and rejected the claim that
| the social network has furthered political polarization.
|
| You are quoting something Facebook used as a defense
| [deleted]
| aardvarkr wrote:
| Did you actually read the article or are you just trying to
| create controversy? That quote comes from a Facebook
| spokesperson trying to push back against the negative
| attention.
|
| > Facebook issued a lengthy statement from director of policy
| communications Lena Pietsch titled "Missing Facts from
| Tonight's 60 Minutes Segment."
|
| > She pointed to Facebook's investment to monitor for harmful
| content; disputed the way Facebook's own research on teenagers'
| mental health has been reported; and rejected the claim that
| the social network has furthered political polarization.
|
| How can you say "If there was a story here, I feel like they're
| working hard on overplaying their hand at the risk of losing
| the audience once the initial frenzy wears off." when you have
| clearly demonstrated that you didn't bother to read the story
| in the first place and just like to cherrypick quotes out of
| context to prove a point?
|
| EDIT: And now the OP has edited their post but still stands by
| their claim that this is a nothingburger, without evidence this
| time. For context, the OP originally quoted "She pointed to
| Facebook's investment to monitor for harmful content" out of
| context as evidence of something good that facebook did but
| this article is trying to trump up into something bad.
| isabelc wrote:
| From HN guidelines:
|
| > _Please don't comment on whether someone read an article.
| "Did you even read the article?"_
| aardvarkr wrote:
| Because at the time it was the top comment making
| speculative claims and taking quotes WILDLY out of context.
| Accusing the author of trumping up an issue when the OP is
| quoting the rebuttal from the company is not just
| misleading but harmful misinformation and should be
| rightfully called out
| [deleted]
| PragmaticPulp wrote:
| > EDIT: And now the OP has edited their post but still stands
| by their claim that this is a nothingburger, without evidence
| this time
|
| There was more to my post than the one-line quote that I
| removed. I also acknowledged the error and thank a commenter
| for pointing it out below.
| [deleted]
| swampthinker wrote:
| That was a quote from Facebook's official response to the
| leaks.
|
| The display ad on mobile cuts the article in a really odd way.
| PragmaticPulp wrote:
| > The display ad on mobile cuts the article in a really odd
| way.
|
| Yep. That got me. Thanks for pointing it out. I removed that
| part of my post.
| amznthrwaway wrote:
| Thanks for leaving the part of your post where you argue
| that we should simply _not_ be bothered by these things,
| that being less bothered will, magically, result in greater
| response.
|
| You definitely aren't making a bad faith argument at all.
| bryan0 wrote:
| You should put an edit line in your OP then. Currently the
| thread doesn't really make sense with those silent edits.
| ParanoidShroom wrote:
| Funny how those articles are popular; hating on big tech
| creates high engagement. Haven't we heard that argument before?
| amznthrwaway wrote:
| > I really don't understand why investing in content monitoring
| is being used as a point against Facebook. Isn't this what
| people wanted?
|
| That isn't what was stated. I fully understand that you are
| purposefully making disingenuous arguments, because you know
| that YC leads are fine with disrespectful shit that supports
| hard right-wing positions... But at least _try_ to pretend
| that you're not shitting all over us with nonsense.
| r00f wrote:
| It was pointed out by the PR director; of course she tried to
| find some good things to say about FB.
| edoceo wrote:
| And paid handsomely for the trouble.
| cycomanic wrote:
| I have to say it's extremely bad form to retroactively and
| completely re-edit your post so that your original statements
| (which were shown to be wrong) don't show anymore.
| PragmaticPulp wrote:
| I removed the misquote and thanked another user for pointing
| it out, but the rest of the comment still stands. Deleting a
| comment isn't an option after people reply, so I don't know
| what you want me to do. I apologized, removed the mistake,
| and thanked someone for pointing it out.
| [deleted]
| [deleted]
| kadabra9 wrote:
| "Whistleblower" - with the heaviest use of air quotes.
| woeirua wrote:
| I think a lot of people are choosing to ignore that a lot of
| companies have done things in the past that were not illegal at
| the time of action. However, those actions were later decided to
| be _made_ illegal because the behavior was deemed to be
| antithetical to our values.
|
| For example, Standard Oil did not break any laws in its ruthless
| consolidation of the nascent oil industry. In fact, it exploited
| the law to allow it to grow into the monstrosity that it
| eventually became. In response, Congress passed the Sherman
| Antitrust Act in 1890 which subsequently prevented the actions
| that Standard Oil had used to consolidate the market.
|
| There should be no question that what FB is doing here, while
| not illegal, is highly dubious ethically.
| Pyramus wrote:
| Enron is another fascinating example; there is an interview
| with the former CFO where he talks about what he calls "legal
| fraud" - practices that are highly dubious but technically not
| illegal [1].
|
| [1] https://youtu.be/goQhGqQtFZ4
| _hyn3 wrote:
| There's no such thing as "legal fraud". Fraud is always
| illegal.
| Pyramus wrote:
| In theory yes - in practice no.
|
| Legal/illegal is only a binary variable in theory. In
| practice it's things that are clearly white, clearly black
| and lots of grey in between. There are many concrete
| examples regarding accounting rules mentioned in the
| interview.
|
| The question is where does a court draw the line, and as
| parent rightly points out, sometimes code/case law changes
| after the fact.
| tsimionescu wrote:
| I think the parent is making a point about definitions -
| that the word fraud can only refer to illegal actions by
| definition. I think this is an overly literal
| interpretation of language, though.
| gadrev wrote:
| Good point. The state doesn't decide what's "good" or "bad". We
| must not forget that.
|
| Laws are useful but are an application of power. Power of whom?
| Of the people? Of some bureaucrats? The answer will not be
| black and white in most cases, but what's important is not to
| regard any body that can legislate as a source of moral
| authority based on that power alone.
|
| And one consequence is not to buy big corp arguments about evil
| practices "but it's legal!". Yeah, it's legal because it hasn't
| been outlawed _yet_, because of strong lobbying... b/c of
| whatever. Never relegate the moral judgement to just "it's
| legal/illegal".
| hdjjhhvvhga wrote:
| The problem is, very often the law is unable to keep up,
| especially on technical issues. And even when it does,
| sometimes is way too late. Gates knew it when he asked to
| implement the AARD code - yes, they were sued, but they settled
| out of court and made the competitor's product irrelevant. A
| lot of Microsoft behavior in the 90s was just this: they could
| get away with it, but it simply didn't feel right.
| akudha wrote:
| That is the problem of following the _absolute minimum_
| standards in life - we (the society as a whole) have accepted
| that as long as businesses follow the law, it's all good. We've
| accepted that the sole purpose of businesses is to make money
| within the bounds of the law. While this makes sense logically,
| it isn't good for anyone in the long run, practically. Also
| remember that all kinds of unfair laws can be passed, if you
| have enough money to buy politicians.
|
| We should strive for higher standards, but who am I kidding -
| we live in a world of "greed is good" mantra.
| Digory wrote:
| Scarcity exists.
|
| You cannot follow the _maximum_ standards, because you only
| have so many resources. We can sort our 'recycling' and go
| through 'security theater' at the airport, but those trade off
| against real care for the environment or security.
|
| I don't mind when wealth and education allows us to
| voluntarily do better or more. But there's danger in
| punishing people using hindsight.
| titzer wrote:
| > the sole purpose of businesses is to make money within the
| bounds of the law.
|
| It's worse than that. Businesses _constantly_ make risk
| /reward/punishment tradeoffs and will flout the law according
| to their estimation of the risk of being caught and paying
| fines. There are precious few illegal behaviors that bubble
| up to criminal charges for executives, so the risk is
| quantifiable in dollar amounts.
| bilbo0s wrote:
| The problem is that without laws, what would those
| "standards" be?
|
| You have to do more than the law? But how does any business
| in the future know what that is?
|
| The great things about law are predictability and
| flexibility. If the law is not enough, we can change it. Then
| everyone is held to that new standard. But having a standard
| that is not laid out is the same thing as having no standard
| at all.
|
| Going into an area where we say companies have to meet an
| unwritten ethical and/or moral standard is ripe for abuse.
| Under those conditions, if I show certain messages or ads on
| my website that are wholly unethical and immoral, but not
| necessarily illegal in the written law, I'm opening myself up
| to liability based on violating unstated ethics and morals.
| akudha wrote:
| You seem to be misunderstanding what I was saying (or I
| wasn't clear). I am not at all saying we shouldn't have
| laws - we absolutely should have laws and rules. I am not
| saying that we should sue companies based on unstated
| ethics or morals (how would that even work anyway?). All I
| am saying is we should, as a society, have a better
| attitude and higher standards than stuff like _greed is
| good_, _the sole purpose of a business is to make profit,
| even at the expense of everything else_ etc etc.
|
| I fully understand this sounds idealistic and maybe it is
| dumb to expect people to do better, when much of humanity
| is trying to do the minimum and get the maximum in return.
| cddotdot wrote:
| Humanity isn't that selfish. It is idealistic, but even
| having the conversation about why is minimally ethical.
|
| Taking the converse argument of 0 ethics except THE LAW
| is infuriatingly common. Is it okay to murder so long as
| the act of murder is technically legal? Assuming perfect
| proof: intent, a direct action, an indefensible confession to
| the murder. But also 100% legal. Breaking no law. Would
| society find that acceptable? Even for those that used
| the loop hole?
|
| It seems like folks want to live in a 0 common ethical
| baseline reality. We're discussing the middle and if
| Facebook is wrong. Not if they will get away with it.
| They will. And just because they get away with it does
| not make it right. Can we stop with the definition
| arguments of legality or the accountability of large
| corporations for a moment?
|
| Is knowingly proceeding with a damaging action
| acceptable? One could even argue social media isn't
| damaging. The study is wrong and Facebook paid for it.
| Not me. But at least it's not this manifest destiny
| morality bullshit.
| [deleted]
| Dracophoenix wrote:
| >It seems like folks want to live in a 0 common ethical
| baseline reality.
|
| We _do_ live in a world with zero common ethics, except where
| these ethics pertain to the laws of physics. Anything
| else is determined by societal dictate by way of law or
| cultural fiat. If Facebook were in Saudi Arabia, there
| wouldn't be a rainbow flag filter, and accounts would
| be shadow banned for any mention of Khashoggi's murder.
|
| So who gets to determine what is ethical?
| [deleted]
| hatmatrix wrote:
| Agreed. Facebook has a fiduciary duty to its shareholders to
| maximize profits; it's up to the government to create legal
| boundaries by which they can do that.
| [deleted]
| hunterb123 wrote:
| To clarify, can you specify exactly what law you would like
| made? What do you want to be done _exactly_?
|
| I would rather take the stance that feed algorithms and
| moderation logs should be PUBLICLY available. Transparency
| instead of censorship.
| vpfaulkner wrote:
| A lot of these issues seem difficult to regulate but one that
| seems more realistic is usage by minors.
|
| What if social media platforms required all minors to have
| their account associated with a parent account? The parent
| could monitor activity, institute time limits, etc.
| EVa5I7bHFq9mnYK wrote:
| Minors don't use FB much anyway. It's more TikTok now. And
| of course no minor would use an app monitored by her
| parents; she would immediately switch to another app.
| vpfaulkner wrote:
| Sorry, should have clarified: I was suggesting that if
| the government decides to regulate it should apply to all
| social media platforms, not just FB. Updated the original
| comment.
| azinman2 wrote:
| > I would rather take the stance of feed algorithms and
| moderation logs be PUBLICLY available. Transparency instead
| of censorship.
|
| Ok let's say that's now the case. The FB source is now open.
| What changed? Any negative consequences are still occurring,
| and if anything, we have just given bad actors better
| visibility into what to exploit.
| zabatuvajdka wrote:
| In my opinion, if social media is as important to society as
| it seems, there should be a government-funded social network
| for users and businesses - in the USA, something like NPR
| and PBS.
|
| The problem is companies selling people's data and optimizing
| the algorithms on probability. Instead, what if everyone paid
| some taxes to have a social network which helps people
| interact and businesses promote themselves, you can get rid
| of the ads AND the algorithms. Let users customize settings
| which dictate the algorithm.
| onemoresoop wrote:
| You can be sure the incumbents would fight tooth and
| nail to make this look like a very unattractive idea and lobby
| heavily to make sure it never happens.
| intended wrote:
| The testimony was about what you'd expect.
|
| It looks like the recco algorithms will be carved out from
| the neutral platform protections they enjoy.
| burlesona wrote:
| I would ban algorithmically targeted media -- i.e., no
| personalized feed based on an "engagement" algorithm; for
| social media you would just see a chronological feed of posts
| from the people you follow. This is the most addictive and
| radicalizing part of
| social media - and the most lucrative. Much like the nicotine
| in Big Tobacco's case.
| woeirua wrote:
| No recommendation algorithms for certain classes of websites.
|
| News feeds must be sorted in chronological order only. Users
| may selectively filter their feeds if they choose.
|
| No infinite scrolling.
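The regulation proposed above (chronological order only, user-chosen filters, explicit pages instead of infinite scroll) is simple enough to sketch. This is a minimal illustration, not any platform's actual code; the `Post` shape and filter style are invented for the example:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def chronological_feed(posts, filters=(), page_size=25, page=0):
    """Newest-first feed: no engagement scoring, no personalization.

    `filters` are user-chosen predicates (e.g. mute an author);
    `page_size`/`page` replace infinite scroll with explicit pages.
    """
    visible = [p for p in posts if all(f(p) for f in filters)]
    visible.sort(key=lambda p: p.created_at, reverse=True)
    return visible[page * page_size : (page + 1) * page_size]

posts = [
    Post("alice", "hello", datetime(2021, 10, 5, 12, 0, tzinfo=timezone.utc)),
    Post("bob", "ad spam", datetime(2021, 10, 5, 13, 0, tzinfo=timezone.utc)),
]
# A user who has muted "bob" sees only alice's post.
feed = chronological_feed(posts, filters=[lambda p: p.author != "bob"])
```

The point of the sketch is that nothing in a feed like this depends on a hidden ranking model: the ordering rule is fully stated and user-auditable.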
| christkv wrote:
| How about an age limit. You have to be 18 or above.
| asdff wrote:
| Treat addictive social media companies like addictive
| cigarette companies. Let's see some huge warning labels about
| how mentally harmful it is to continue scrolling on Facebook,
| right on the first result, where it's unavoidable. Let's
| tax the hell out of social media companies to generate local
| revenue, just like sin taxes. It won't be a huge change, but it
| will be a great starting point and will come with revenue
| that can potentially fund mental healthcare programs for
| people damaged by these companies.
| mcguire wrote:
| I'm not sure this would have the effect you want.
|
| You tax Facebook but allow it to operate however it wants.
| Facebook is then incentivized to double down on its
| algorithms---like tobacco companies using chemical and
| biological techniques to make cigarettes more addictive---
| in order to regain the lost profits.
| asdff wrote:
| Then you can double down on the taxes you levy against
| them if they begin harming more people, no? The idea is
| the cost of doing bad business will eventually be too
| much to make it worth doing that sort of bad business.
| Same idea with carbon taxes where the costs scale to
| damage and incentivize shifting to good behavior rather
| than doubling down on bad behavior. And even with
| cigarette companies doubling down, far fewer people smoke
| today and die of lung cancer than 50 years ago, so this
| stuff works on the whole.
| lazide wrote:
| That definitely isn't what happened with alcohol or
| tobacco! Instead you end up with a significant enough
| amount of money going to the government that the
| government now ends up protecting those industries to an
| extent - ensuring lower-priced competition (e-cigs,
| moonshine) gets stomped on and the market gets protected
| rather than eliminated or reduced too much.
| hunterb123 wrote:
| Uhm a cigarette you cannot change the base ingredient of,
| it's tobacco. A cig by definition burns the carcinogen
| tobacco.
|
| A site you can specify certain requirements like no doom
| scrolling or requirements like PUBLISH YOUR ALGORITHMS.
|
| Why go full California with another banner? People will
| ignore it and you will have done nothing of substance.
| habeebtc wrote:
| > Uhm a cigarette you cannot change the ingredients of,
| it's tobacco.
|
| You can soak the tobacco in a solution which contains
| additives, such as more nicotine. Which is exactly what
| cigarette companies have done in the past (and not just
| the tobacco, the filters, and the paper as well).
|
| The parallel here is filling people's feeds with divisive
| political news and posts, even when they have tried to
| opt out.
| hunterb123 wrote:
| The point is tobacco itself is a carcinogen, you cannot
| make a cig not cause cancer because it needs to burn
| tobacco at least.
|
| A social media website does not need doom scrolling or
| private algorithms for the feed, you can change how it
| works instead of adding a useless banner.
| prancer_or_vix wrote:
| 2 questions:
|
| 1) What would you expect be implemented to
| reduce/eradicate doom scrolling?
|
| 2) What would making the algorithm public do for us? I'm
| not an ML engineer, but presumably their algorithm isn't
| just an algebraic equation where x is how toxic the post
| is and y is how inflammatory it is and z is the number of
| kids who will think harder about suicide because of the
| post.
|
| Maybe I'm just super naive and that _is_ how Facebook
| made their algorithm, but my understanding is that the
| algorithm is a little more of a black-box and is a little
| abstract. How is a lay-person supposed to evaluate
| something like that?
| hunterb123 wrote:
| I don't want them to do anything regarding doom
| scrolling, it was just an example and came from another
| user.
|
| I do want them to publish their algorithms and moderation
| logs so we have insight on how they are serving and
| moderating content.
|
| I don't care about organic user content, I do care if FB
| is pulling the strings to make it either more salacious
| or being biased in one way or another.
|
| I also care if they are banning certain users or content
| but not others.
| iamstupidsimple wrote:
| The inputs to these algorithms are usually human-
| understandable and quantifiable signals like likes, text
| sentiment, maybe engagement history -- and the output is
| probably a score that can be ranked. Ultimately, though,
| even if the algorithm is a black box (entirely possible
| it's not ML based!) we can still evaluate it in a lab
| environment.
|
| Some of the signals might be generated by ML also, like
| photo labels, but ultimately these things are very
| understandable if you have the model and data.
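That description (quantifiable signals in, a rankable score out) can be exercised in exactly the kind of lab setting the comment mentions. Here is a toy linear ranker; the weights and signal names are invented for illustration and are not Facebook's actual model:

```python
# Toy engagement ranker: human-readable signals -> a sortable score.
# The weights are made up; a real system might learn them with ML.
WEIGHTS = {"likes": 1.0, "comments": 4.0, "reshares": 8.0}

def score(signals):
    """Combine quantifiable input signals into one ranking score."""
    return sum(WEIGHTS.get(name, 0.0) * value for name, value in signals.items())

def rank(posts):
    """Order posts by descending score. Even a black-box scorer can be
    audited this way: feed it controlled inputs, inspect what rises."""
    return sorted(posts, key=lambda p: score(p["signals"]), reverse=True)

posts = [
    {"id": "a", "signals": {"likes": 100, "comments": 2}},   # score 108
    {"id": "b", "signals": {"likes": 10, "comments": 30}},   # score 130
]
ranked = rank(posts)  # "b" outranks "a": comments weigh more than likes
```

Publishing the weights and signal definitions, as proposed upthread, would let outsiders run exactly this kind of controlled comparison.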
| VikingCoder wrote:
| Sorry for being pedantic, but you can absolutely change
| the ingredients of a cigarette. There's a ton besides the
| tobacco. And you can breed different strains of tobacco
| to have more or less of some chemical.
| hunterb123 wrote:
| My point was a cig always causes cancer because it burns
| tobacco, so you need a banner to warn users.
|
| A social media behemoth does not need to use doom
| scrolling or private algorithms to be a social media
| site.
| asdff wrote:
| My Nintendo DS from 15 years ago gave me an eye strain
| warning every time I started it up, and it doesn't always
| cause eye strain - only misuse does, and I know that thanks
| to the informative banner.
| jmcgough wrote:
| I love that Nintendo is very aware of the potential
| negative effects of their products and games and tries to
| inform users / mitigate.
|
| Even when it comes to encouraging positive play between
| users - in the new Pokemon MOBA (games known for their
| toxicity) there's no text chat, only communication with a
| few emotes you can show. Some of their decisions make for
| arguably worse games for "hardcore" gamers (like the way
| they rank users in smash, or how they focus on more
| casual-style in-game tournaments or make matchmaking
| harder) but they sacrifice that in favor of a more
| positive general experience, especially important since
| children play their games.
| asdff wrote:
| I thought pictochat was great too and a lot of fun. They
| could have opened it up and made it into a global
| network, but the beauty is that it operates on local
| networks so it was more of an in person social network,
| plus no way for advertisers and commercial companies to
| break in.
| hunterb123 wrote:
| I remember Pictochat - so many dicks and graphic drawings
| sent to each other in junior high. The sensitive world today
| would have had a field day with that.
|
| This part of the thread went pretty off topic but I like
| it! Pictochat was certainly ahead of its time, wish we
| stuck to things like that.
| mpalczewski wrote:
| Smoking was happening in the 1800s, but lung cancer was
| rather rare; rates didn't shoot up until the 1900s.
| This is around the same time that tobacco companies
| figured out they could soak tobacco in ammonia. This
| allowed for inhalation into the lungs (e.g. it sucks to
| inhale cigar smoke deep into your lungs). It also made the
| cigarettes much more addicting, so people smoked way more
| and inhaled into the lungs. That's about when lung cancer
| stopped being so rare.
|
| Yes, cig's cause cancer, but to say that it's because it
| burns tobacco is missing a big part of the story.
| asdff wrote:
| That was probably because smoking was not common outside
| of wealthy men during the 1800s. It was not widespread at
| all among most of the public until after the world wars
| thanks to mass produced cigarettes (which weren't around
| until the late 1800s) now being added to rations. Smoking
| rates after WWI increased 350% and stayed high ever since. US
| government didn't stop issuing cigarette rations to
| soldiers until 1975. Lung cancer rates have followed in
| lockstep with smoking rates; it's not really that smoking
| suddenly became harmful. It always was, it just wasn't
| common to smoke and even among those who did back in
| those days, it wasn't common to smoke very much at all
| and certainly not around the clock (kinda like hookah
| users today).
| erosenbe0 wrote:
| What is the appropriate middle though? Think about
| alcohol culture. Should we ban beer commercials on TV?
| Only allow beer commercials with talking frogs rather
| than attractive young people having fun?
| munk-a wrote:
| I'm honestly baffled over why beer commercials are
| considered socially acceptable - but then again I think
| that advertising (in our modern interconnected world)
| only ever serves to drive overconsumption. If you want a
| beer - you go to the bevy and pick out a beer you'd
| enjoy... if I'm watching TV and the TV tries to make me
| want a beer - that's not a good thing.
|
| Good advertising[1] is limited to making sure your
| product is visible in comparison to competitors - having
| shiny cereal boxes is something I find pretty meh, but in
| the cereal aisle you're dealing with someone who wants to
| buy some kind of cereal and you're trying to convince
| them to buy yours. TV Advertising drives up demand for
| products which, by definition, means we're consuming more
| of that product than we otherwise would... that's great
| for business... and it's also great for the obesity
| epidemic.
|
| 1. What I'd consider to be ethical advertising, but
| that's like my opinion man.
| hunterb123 wrote:
| It's fine, only piss water beers advertise anyway.
| naravara wrote:
| Moreover, warnings are useless if people can't vote with
| their feet. So if you want to actually effect change in
| the dynamics of the market you need to make services
| compete on quality and value to the customer rather than
| engaging in a scramble to accrue insurmountable network
| effects and lock-in.
|
| That means mandates for data interoperability. Sadly, I
| have no idea how to implement that in a way that doesn't
| utterly stifle innovation by ossifying what sorts of data
| models social media is allowed to have. But at the very
| least we could create a sort of interoperability minimum
| that prevents you from locking up things like photo
| albums or peoples' "social graphs."
|
| Over the longer term I'd like to see some kind of
| disentanglement of the protocols, standards, and data
| models from the front-end clients. It's obviously a lot
| more complicated now, but in the same way that you could
| access AIM, ICQ, GChat, and a bunch of other stuff from a
| variety of chat clients it would be good to be able to do
| this with everything social. Hell, ActivityPub basically
| tries to do this now so it's not impossible.
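To make the ActivityPub point concrete: its interoperability comes down to servers exchanging plain JSON documents (ActivityStreams vocabulary) over HTTP, so any client or server speaking the protocol can participate. A minimal sketch in Python — the actor and recipient URLs are hypothetical:

```python
import json

# A minimal ActivityPub-style activity: a "Create" wrapping a "Note".
# Any server that speaks the protocol can deliver this JSON document
# to the recipient's inbox, regardless of which client produced it.
def make_create_note(actor, recipient, text):
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",
        "actor": actor,
        "to": [recipient],
        "object": {
            "type": "Note",
            "attributedTo": actor,
            "to": [recipient],
            "content": text,
        },
    }

activity = make_create_note(
    "https://example.social/users/alice",   # hypothetical home server
    "https://other.example/users/bob",      # user on a different server
    "Hello from a different server!",
)
print(json.dumps(activity, indent=2))
```

Because the document format, not the front-end, is the contract, the same note can be authored from any client — the same property that let AIM/ICQ-era multi-protocol chat clients exist.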
| parineum wrote:
| > sin taxes
|
| How about we don't let the state decide what I get to do
| with my body (and mind)?
| dustinchilson wrote:
| Isn't this the root of the problem?
|
| Neither you nor the government/state decides what you get
| to do with your mind. An advertising company decides what
| to do with it and can manipulate it however it decides
| best benefits itself. Not you, not society: whatever
| makes Facebook the most money.
| mynameisash wrote:
| That's not what a sin tax does. You are still free to
| smoke cigarettes or drink alcohol, and were we to tax
| social media usage, you would still be free to use or not
| use that.
|
| But a sin tax ostensibly accounts for the economic
| externality*. We know that cigarettes impose a cost on
| society beyond the individual smoker. I'm all in favor of
| making people pay for things that we know cause damage to
| society more broadly. And I hardly think it's
| controversial that social media is in many aspects
| harmful to society.
|
| *Sin taxes are technically different from Pigovian
| taxes, but I, and I think most people, tend to use the
| terms interchangeably.
| parineum wrote:
| > We know that cigarettes impose a cost on society beyond
| the individual smoker
|
| What's that cost?
|
| From what I've read, all economic costs smokers impose on
| society are more than made up for in their dying early,
| they actually cost less [1]. I guess everyone should
| smoke to save the state money!
|
| [1] https://pantagraph.com/news/fact-check-do-smokers-
| cost-socie...
| oarabbus_ wrote:
| You are able to consume both tobacco and alcohol
| (let's not tangent into a drug legalization discussion).
| Tobacco and alcohol cause measurable societal harm and
| measurable costs to the state - are you implying it's
| unreasonable for states to tax these goods for those
| reasons?
|
| Generally speaking I'd rather reduce taxes but I fail to
| see what's wrong with e.g. an alcohol excise tax going
| towards rehabilitation and/or highway safety programs.
| "Sin tax" is just a colloquial name for an excise tax,
| which a state has every right to enact.
| parineum wrote:
| > Tobacco and alcohol cause measurable societal harm
|
| And if I choose to smoke in the privacy of my own home
| (or yard)? What societal harm am I causing?
|
| As for alcohol, the societal harm caused is a laundry
| list of already illegal behaviors that are illegal
| regardless of alcohol's involvement with the exception of
| sin tax avoidance.
|
| Why not outlaw the societal harm instead?
|
| > e.g. an alcohol excise tax going towards rehabilitation
| and/or highway safety programs
|
| Both of those seem like good things regardless don't
| they? Why do we need a special tax on alcohol for things
| that are generally good? It's not like only people who
| consume alcohol are the only ones who need rehab or
| they're the only problem with highway safety.
|
| Does the tobacco tax go toward lung cancer patients? It
| actually goes toward funding campaigns that overstate
| (i.e., lie about) the dangers of smoking, to the point
| that people vastly overestimate them [1].
|
| > Sin tax" is just a colloquial name for an excise tax,
| which a state has every right to enact.
|
| Of course it's legal, it's just garbage policy. Sin taxes
| come from the pairing of politicians wanting more money
| with pearl-clutching interest groups pleading to think of
| the children.
|
| [1] https://journals.plos.org/plosone/article?id=10.1371/
| journal... > 99.5% of respondents overestimated absolute
| risk, only about 0.3% estimated it correctly (by giving
| an answer of 30), and 0.2% underestimated it (by giving
| an answer less than 30).
| oarabbus_ wrote:
| >Why not outlaw the societal harm instead?
|
| Weren't you just saying how you don't want the state
| legislating what you put in your body and mind? That's
| why.
| parineum wrote:
| I want the state to not outlaw hurting myself when it
| doesn't hurt others. The state's role is to prevent
| individuals from hurting others.
| oarabbus_ wrote:
| Unfortunately it's not so simple. An individual's smoking
| and alcohol use can and does harm others, and the state
| levies excise taxes for that reason.
|
| Another example is driving a car, which results in
| thousands of fatalities and many more injuries daily. Not
| to mention environmental impacts which affect others. The
| state chooses to require drivers to have insurance and
| their cars to pass smog tests, rather than outlawing
| driving.
| parineum wrote:
| > An individual's smoking and alcohol use can and does
| harm others, and the state levies excise taxes for that
| reason.
|
| Smoking and alcohol use can also not harm others. Should
| those who smoke and drink responsibly be held responsible
| for those who don't? How does the tax ameliorate those
| harms?
| analognoise wrote:
| So you get to ignore all the responsibilities that come
| with your rights so you get to clog up our hospitals with
| your bad decisions?
|
| How about you take full responsibility: you get to not
| put whatever in your body, and you agree never to take an
| ambulance ride or be treated by a hospital.
|
| Don't want the vaccine? That's fine, it's your right. But
| now when you can't breathe, nobody coming to help.
| parineum wrote:
| > So you get to ignore all the responsibilities that come
| with your rights so
|
| Absolutely not. I know lots of people that manage to
| drink alcohol responsibly. They never drink and drive,
| don't regularly over indulge and it makes their and their
| peers lives _better_.
|
| What negative externality are they paying for with
| alcohol taxes?
|
| > How about you take full responsibility: you get to not
| put whatever in your body, and you agree never to take an
| ambulance ride or be treated by a hospital.
|
| > Don't want the vaccine? That's fine, it's your right.
| But now when you can't breathe, nobody coming to help.
|
| If I pay for health insurance, I'm already taxing myself
| in this instance. It would be perfectly reasonable for a
| health insurance company to offer incentives for people
| to be vaccinated just like they offer incentives to non-
| smokers.
|
| If the government wants to start providing that
| healthcare, then they can have a say in the cost of poor
| health decisions.
| asdff wrote:
| Curious, do you not wear seatbelts, either? Opt for
| asbestos insulation since it's better than anything on
| the market today? Plumb your home with lead since it's
| more durable and flexible? Use leaded gas because it's
| better for your older engine?
|
| The state acts on the collective when the public is not
| making good decisions for themselves and causing net harm
| onto themselves, usually with the public paying the
| price. Sometimes that's overt, like with death rates from
| accidents without seatbelts, or cancer from asbestos
| exposure. Sometimes it's less overt, like the behavioral
| issues, increased incidence of mental illness, and crime
| rate increases from leaded pipes and gasoline.
|
| I'm willing to bet social media causes net harm. It
| hasn't enabled communication that wasn't possible before;
| if you can get access to a Facebook account you therefore
| have email and access to IRC. But it has cost probably
| trillions in productivity from people staring at it so
| much during all their idle time, and the cost to treat
| mental health issues that wouldn't have cropped up
| without toxic social media culture.
|
| I say we have these companies pay for these externalities
| if they are forcing us to pay for them otherwise. By not
| passing a tax on externalities like this, the state is
| deciding that I need to pay for facebook's ills on
| society whether I use the service or not, which should
| anger you as a libertarian as much as it angers me as
| someone on the left.
| ethbr0 wrote:
| As another example:
|
| - The US prohibits people under 21 years old from buying
| alcohol, and allows those over 21 to do so.
|
| - The US prohibits anyone of any age from driving a motor
| vehicle over a certain blood alcohol level.
|
| This is something which causes health and community harm
| (alcohol), which we have allowed and denied to people in
| certain ways.
|
| And honestly, I think it strikes a fair balance between
| individual liberty and social liberty/good.
|
| I don't think anyone would argue that everyone should be
| allowed to drive anywhere, as drunk as they wanted to, at
| whatever age they wanted to.
| asdff wrote:
| Not only the age restriction but there are restrictions
| meant to curb some abuse at least. Drunk in public is a
| crime, establishments technically aren't allowed to
| overserve patrons who are very drunk, you can get tried
| for manslaughter worst case if you force someone to
| overconsume and they die, etc.
| parineum wrote:
| > Curious, do you not wear seatbelts too?
|
| I wear my seatbelt, I don't smoke, I don't drink and I'm
| vaccinated.
|
| Everyone keeps talking about these "negative
| externalities" without being specific. Why not just make
| the societal harm illegal and let people hurt themselves
| without buying permission from the government?
| ethbr0 wrote:
| Because making it illegal to accidentally kill someone
| with your car while intoxicated doesn't solve the
| problem.
| parineum wrote:
| How do we solve the problem of people accidentally
| killing someone with their car?
| unethical_ban wrote:
| It is the responsibility of the state to inform citizens
| on the facts and dangers of activities. And yes,
| sometimes to incentivize healthy behavior.
| colinmhayes wrote:
| When your choices make everyone else worse off it
| absolutely is. Facebook usage is harmful to society.
| mrep wrote:
| For everyone responding that smokers cost the government
| money, it is actually the opposite: they save the
| government money because on average they die sooner. From
| the Manning study: "In this analysis, the federal
| government saves about $29 billion per year in net health
| and retirement costs (accounting for effects on tax
| payments). These include a saving in retirement (largely
| social security benefits) of about $40 billion and in
| nursing home costs (largely medicaid) of about $8
| billion. Costs include about $7 billion for medical care
| under 65 and about $2 billion for medical care over 65;
| the remaining $10 billion cost is the loss in
| contributions to social security and general revenues
| that fund medicaid. "
|
| (PDF): https://www.everycrsreport.com/files/19980430_97-1
| 053E_53c59...
| elliekelly wrote:
| It's not the state "deciding" it's the state requiring
| compensation for the negative externalities created by
| the product. You're more than welcome to smoke cigarettes
| if you so chose. But that decision isn't made in a vacuum
| and it impacts the rest of us in the form of increased
| public health burden, insurance costs, secondhand smoke,
| etc. A "sin tax" serves not only to discourage the
| asocial behavior (we'd have a big problem if _everyone_
| made the same choice) but also to pay your fair share of
| the costs of your decision.
| parineum wrote:
| > state requiring compensation
|
| So, advertising company hurts me and the state gets
| compensated?
| oarabbus_ wrote:
| > Lets tax the hell out of social media companies to
| generate local revenue just like sin taxes.
|
| Very interesting idea, actually. There is evidence Social
| Media causes harm to some individuals' mental health (in a
| widespread manner causing some measurable societal harm),
| so a proposed tax on all social media companies with
| revenue going towards mental health programs seems worth
| exploring.
|
| Generally I'm not much in favor of implementing new taxes
| (would rather close existing loopholes) but if implemented
| reasonably and backed by scientific evidence this seems
| valid.
| intended wrote:
| It seems internal data shows that FB and Insta harm
| 14-year-olds more than other groups.
| erosenbe0 wrote:
| That could apply to TV, video games, alcohol culture,
| porn or any number of things.
|
| It seems to be the near monopoly that is one of the major
| issues for FB. Lack of competition seems to lead to a
| house of horrors.
| asdff wrote:
| Yet with all those things we have laws and regulations
| and even restrictions for young people explicitly. FB is
| the wild west on the other hand and constantly lobbies to
| keep it that way in terms of how regulators see it.
| intended wrote:
| Yes, that's the buried lede. Those are all things which
| you need to be old or mature enough to use responsibly -
| they make demands of experience and impulse control you
| develop as adults.
|
| Meaning that blocking social media for kids and teens is
| likely on the anvil at some point.
| cratermoon wrote:
| > There is evidence Social Media causes harm to some
| individuals' mental health (in a widespread manner
| causing some measurable societal harm)
|
| And they do so by exploiting human weaknesses using the
| same psychological techniques used by casinos and other
| forms of gambling.
| asdff wrote:
| Yes, and we have regulations and taxes and laws around
| these industries, but evidently Facebook gets off
| scot-free.
| cratermoon wrote:
| That's because, so far, they've managed to deflect, deny,
| and discredit research and critics pointing out exactly
| how social media uses things like variable rewards in the
| same way as slot machines use them to keep gamblers
| pulling the lever. They do this using tactics developed
| by the tobacco companies to fight findings that smoking
| causes cancer and other harms and refined by the fossil
| fuel industry to prevent action on global warming.
| erosenbe0 wrote:
| I agree with you but a lot of the analogies and metaphors
| here are insufficiently subtle.
|
| FB in some sense, but not entirely, is a form of speech,
| no better or worse than Grand Theft Auto or the National
| Enquirer. That's how I thought of it ten years ago.
|
| Now that it is in our pockets nearly cradle to grave; a
| monopoly; and dependent on minutes of engagement rather
| than subscriptions -- it is a different animal
| altogether.
| FFRefresh wrote:
| Honestly asking:
|
| What is the _specific_ harm involved here that is deserving
| to be taxed?
|
| How would we measure this harm in order to know how much to
| tax a given company?
|
| Should other causes of this harm be taxed/penalized as
| well? If not, why?
|
| For instance, if the harm in question is some people feel
| varying degrees of worse after using a given product, is
| there any limit we as a society should set on penalizing
| the cause of the harm?
|
| Should people or entities who say things that make people
| feel worse be fined/prosecuted by the law? If I feel worse
| (let's call this 'trauma' or 'anxiety' or 'depression' or
| 'literally shaking' or 'panic attack') after reading a book
| or reading a news site, should I have standing to sue the
| creators and medium which presents said content?
| RIMR wrote:
| Freedom of speech also includes freedom from being
| compelled to say things you don't want to, so forcing
| companies to make their recommendation and moderation
| systems publicly visible would be even more of a free
| speech issue than expecting companies to moderate
| violent, hateful, or deliberately misleading content.
| solveit wrote:
| I absolutely disagree but I'm upvoting anyway because it's
| an argument I haven't seen before with regards to making
| algorithms public and god knows the discourse could use
| some variety.
|
| That being said. No. This is no more a free speech issue
| than forcing food manufacturers to make their ingredients
| public.
| alistairSH wrote:
| Can you cite some case law that bears out this argument?
| While I agree that your point is true in the most general
| sense, we compel companies to make their internal
| information public fairly regularly via various mechanisms
| (admittedly, none of which are 100% analogous to the
| FB/social media situation).
| alistairSH wrote:
| _What do you want to be done exactly?_
|
| Public algorithms, or at least some 3rd party review. Ban
| infinite-scroll on social media platforms. Require feeds to
| be configurable (users can set to "newest first" or "top
| picks" or whatever else). I'm sure I could come up with more,
| that's just off the top of my head.
| idiotsecant wrote:
| These seem like awkward things to encode into law in a
| durable way. Laws are long-term blunt instruments, banning
| something like infinite scroll will have all kinds of
| unintended consequences.
| alistairSH wrote:
| That's true - these things might be better implemented as
| regulations out of the Executive branch - but that would
| still require legislation authorizing somebody to
| implement the regulations.
| hatenberg wrote:
| I'm infinitely scrolling hacker news comments right now.
| You think adding a next page button is gonna change
| anything?
| kreeben wrote:
| FB's research shows that, yes, any friction between you
| and the next post/article/item will decrease the
| likelihood that you see it.
|
| A click == friction.
| alistairSH wrote:
| Have you ever watched a teenager (or addicted adult)
| scroll through their IG feed? It's disturbing. They just
| scroll and scroll and scroll waiting for the tiny little
| dopamine hits. I don't know if a "Next" button fixes it
| completely, but it almost has to be better, even if only
| marginally so.
| photochemsyn wrote:
| I think that's the right approach. Legally you could require
| every social media company that collects and sells data on
| its users to advertisers to allow the users to access their
| internal algorithmic interface (for their own account).
|
| Now, what controls are on the internal algorithmic dial?
| Apparently that's top secret, but a legal requirement to
| expose the interface to the users seems reasonable.
|
| Note that this might not affect what ads you get served (that
| seems more on the private business side, although banning
| prescription pharma ads makes sense), but it would affect
| what shows up in your feed, what content you get served, etc.
| You could write your own exclude lists, for example (i.e. if
| you never want to see content from MSNBC, FOX, or CNN, that
| would be your decision - not the algorithms, etc.)
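The exclude-list idea is easy to sketch: the platform's ranking can stay server-side while a user-supplied blocklist is applied before anything is shown. A minimal illustration — the item shape and source names are hypothetical:

```python
# Filter a ranked feed against a user-maintained exclude list.
# The feed items here are placeholders; a real feed item would
# carry whatever fields the platform's data model uses.
def apply_exclude_list(feed_items, excluded_sources):
    excluded = {s.lower() for s in excluded_sources}
    return [
        item for item in feed_items
        if item["source"].lower() not in excluded
    ]

feed = [
    {"source": "MSNBC", "title": "..."},
    {"source": "LocalNews", "title": "..."},
    {"source": "FOX", "title": "..."},
]
# With MSNBC, FOX, and CNN excluded, only the LocalNews item remains.
print(apply_exclude_list(feed, ["MSNBC", "FOX", "CNN"]))
```

The point of the sketch is that this control belongs to the user, not the ranking algorithm: the filter runs on declared preferences rather than inferred engagement signals.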
| 6gvONxR4sf7o wrote:
| If you get too big, you can't buy your competition (e.g. FB
| buying IG). Or if you get too big, you have to open your
| stuff up like email does. Or if you lie to congress, you get
| penalized. Or if you get too big, you have to make your
| algorithms publicly available.
| piggybox wrote:
| What part of Gmail is open?
| 6gvONxR4sf7o wrote:
| Email, not Gmail. I can email people from my provider
| even if they use other providers, including people who
| self host. And I can get email from them too.
| thinkling wrote:
| I think GP is referring to the fact that email overall is
| a system that is based on public standards and open to
| new entrants. You can start Hmail.com if you want, and
| plug into the existing email eco-system as a new
| competitor very easily.
|
| The social media ecosystems aren't like that. You can't
| be a chat provider and plug into FB Messenger; you can't
| plug into Twitter, etc.
|
| There _is_ an open social media eco-system called the
| fediverse (for its federated nature), in which Mastodon
| is the best-known player. But it 's gotten very limited
| traction, because of the network effect that keeps people
| on FB and Twitter. No such effect keeps people on Gmail.
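Email's openness is visible in the message format itself: RFC 5322 messages and SMTP delivery are public standards, so a brand-new provider can construct mail that every incumbent understands. A sketch using Python's standard library (the addresses are placeholders):

```python
from email.message import EmailMessage

# Any new provider can build this message and relay it to any other
# provider: the wire format (RFC 5322) and transfer protocol (SMTP)
# are open standards, with no gatekeeper on either end.
msg = EmailMessage()
msg["From"] = "alice@hmail.example"   # the hypothetical new entrant
msg["To"] = "bob@bigmail.example"     # a user on an incumbent provider
msg["Subject"] = "Federation in action"
msg.set_content("No permission from the recipient's provider needed.")

# smtplib.SMTP(...).send_message(msg) would deliver it; here we just
# show the standard wire format any mail server can parse.
print(msg.as_string())
```

That is the asymmetry the comment describes: a closed platform like FB Messenger has no equivalent of this — there is no public message format or delivery protocol a competitor could implement.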
| piggybox wrote:
| Ah, I got it. Thanks!
| new_guy wrote:
| Yeah but that's just an American-centric take, there's
| plenty of companies out there that don't give a flying f*ck
| about American laws and congress.
| munificent wrote:
| Simply disallowing corporations from reaching the scale
| of Facebook/ConAgra/Amazon/WalMart would solve many, many
| problems.
|
| These companies do awful things because:
|
| 1. They have few viable alternatives so ethical consumers end
| up choosing them when they might not otherwise.
|
| 2. They have enough money that it is profitable to do bad
| things and pay for the damage control.
|
| 3. They have enough power to defeat regulation and government
| oversight.
|
| 4. They are so large and monolithic that they can hide their
| internal workings more easily.
|
| 5. The org chart is so deep that those in power are
| psychologically removed from much of the consequences of
| their overpowered actions.
| at-fates-hands wrote:
| >> To clarify, can you specify exactly what law you would
| like made? What do you want to be done exactly?
|
| Honestly, social media issues are for the most part a
| parenting issue. If you don't have access to your kids'
| phones, or know what platforms they are on, who they are
| talking to, and what they're sharing, I'm not sure
| legislating social media is going to do much of anything.
| New platforms will pop up, more private networks will be
| started, and suddenly everything becomes too fragmented
| to really oversee.
|
| I would create laws that have teeth and address issues
| like bullying, doxxing, swatting, and other ways people weaponize
| social media against other people. You start to put some
| teeth into laws where people are facing serious consequences
| for bullying and pushing people to suicide, then you might
| see some changes.
| mLuby wrote:
| > You start to put some teeth into laws where people are
| facing serious consequences for bullying and pushing people
| to suicide
|
| Counterpoint: kids aren't all neurologically and socially
| developed enough to understand life-altering consequences
| for certain actions, and _that 's not their fault._ Legal
| codes and law enforcement are too crude in most child-
| related cases, unless you're okay with incarcerating
| misbehaving children.
|
| It's on adults to make sure things kids can reach are
| reasonably safe for--as well as from--them.
| handrous wrote:
| Make spying on people illegal, even when a computer does it
| to billions of people rather than one creep doing it to one
| person. If you _have to_ collect info about people to provide
| a product or service, make it strictly illegal to transfer or
| sell that info _or anything derived from it_. Don 't like it,
| get into another business. No one's making you collect
| people's info. Yes, this should apply to e.g. credit card
| companies, not just big tech. This'd need some fine points
| hammered out (don't laws always?) but it's not that crazy.
|
| Do something to make platforms responsible when their
| "algorithms" promote something. Not just hosting it, but when
| they _promote_ it. Don 't like it? Don't curate, then, or
| have a human do it so you're _sure_ nothing you 're
| deliberately promoting is shitty enough to land you on the
| wrong end of a lawsuit. "But how will tech companies show
| every visitor a totally different home page of content
| they're promoting (but in no way responsible for), and how
| will Youtube find a way to recommend Jordan Peterson and Joe
| Rogan videos next to every damn thing? How will tech
| companies make every part of their 'experience'
| algorithmically-selected, personalized recommendations of
| content they farmed from randos?" They won't, they won't,
| and... they won't. You're welcome.
|
| Make data leaks so cripplingly expensive that no company
| would dare hoard personal data it didn't _absolutely_ need to
| get by.
|
| Force the quasi-official credit reporting agencies not to be
| so shitty. In particular, "freezes" should be free and should
| be the default, alerts for activity should be free, and
| access to one's own info should be _on demand at any time_ ,
| not once per year per agency. Or just outlaw the bastards
| completely, IDGAF.
|
| I dunno, lots of things we could do to make the current
| personal data free-for-all less hellish.
| smolder wrote:
| > This'd need some fine points hammered out (don't laws
| always?) but it's not that crazy.
|
| It sounds like you're suggesting GDPR style regulation.
| They're still figuring out how to enforce that but
| generally I support it. Too much money is against it to get
| anything passed in the US, though.
|
| Another problem is that the US government seems to like
| when the tech sector gobbles up data on people. It gives
| them new powers for social control.
| nuerow wrote:
| > I think a lot of people are choosing to ignore that a lot of
| companies have done things in the past that were not illegal at
| the time of action. (...)
|
| What counts as good and evil is not defined by what gets
| passed as legislation, and neither is a business's
| negative influence on society as a whole.
|
| Legislation is also a moot point given that these mega-
| corporations actively lobby law-makers into not passing any
| inconvenient legislation.
| 1vuio0pswjnm7 wrote:
| "In response, Congress passed the Sherman Antitrust Act in 1890
| which subsequently prevented the actions that Standard Oil had
| used to conslidate the market."
|
| This tech company employee (aka the "Facebook Whistleblower")
| is refusing to share the documents she stole with the FTC.
|
| Although she did share them with several state attorneys
| general.
|
| It appears she does not support antitrust inquiries. Heavy
| consolidation of "social media" is to her an acceptable status
| quo.
|
| Needless to say, some would argue competition provides
| incentives for large players to improve their services.
| [deleted]
| echelon wrote:
| > while not illegal
|
| While not _currently_ illegal.
|
| Call your representatives and tell them you find this
| reprehensible.
|
| Let's make it illegal. We live in a representative democracy.
| beaner wrote:
| What is _it_ , exactly?
| thrwn_frthr_awy wrote:
| Personalized advertising.
| malandrew wrote:
| I personally find the personalized advertising great. A
| lot of the time I am shown things that are actually
| useful/valuable to me.
|
| I think a lot of the value really depends on the
| individual. If you're engaging in productive activities
| like hobbies, you get valuable targeted ads. If you're
| engaging in activities that are low value like signaling
| to others in myriad ways, you probably get ads for
| things like disposable fashion.
|
| Personalized ads are basically a mirror. They feed what
| the person already wants to engage in. If you want less
| of the bad types of advertising, then you need to start
| at the root which is getting people to stop being
| interested in activities and behaviors that are lower
| value.
| Tamrind7 wrote:
| All ads are fundamentally ugly in the sense that their
| effect is the opposite of a great work of art or
| entertainment. Ads are fundamentally just some pathetic
| person's selfish attempt to control what other people
| think and feel in order to increase their own power
| through financial profit. In a sane world they would all
| be banned. Ads exist in their current deranged and
| disgusting form because contemporary humans have been
| selectively bred through social engineering to be
| submissive, cowardly, selfish, and stupid.
| Personalized/targeted advertising is not something that
| needs to be discussed.
| RandomLensman wrote:
| Ads have been around for more than 2000 years now - would
| need a massive shift in mores to get rid of them (they
| survive in a lot of very different societies).
| MetaWhirledPeas wrote:
| > I personally find the personalized advertising great. A
| lot of the time I am shown things that are actually
| useful/valuable to me.
|
| There are plenty of ways to deliver this value without
| secretly fingerprinting every user and delivering
| targeted ads at every corner. A search where you profile
| _yourself_ , for instance; similar to how you provide
| search filters on Amazon.
| MetaWhirledPeas wrote:
| Yes! Excellent suggestion. This is at the root of so many
| data-related problems.
| handmodel wrote:
| What would be the definition of this?
|
| I listen to a podcast on football. Are they allowed to
| run ads that are about sports betting and NFL tickets?
| That is personalized to the group. Is Facebook allowed to
| run ads for sports betting to all people who are fans of
| a professional team on their site?
|
| Is Facebook not allowed to run me ads for local
| restaurants any more?
| MetaWhirledPeas wrote:
| This could be done. The ads on a football podcast could
| be based on who their broader audience is, not based on a
| specific user.
|
| > Is Facebook not allowed to run me ads for local
| restaurants any more?
|
| Nope.
| yellow_postit wrote:
| the difference between a cohort that listens to a
| football podcast and lives in a metro doesn't seem
| obvious to me.
| runarberg wrote:
| I'm guessing that people use _targeted advertising_ and
| _personalized advertising_ interchangeably. The
| advertising industry knows full well what it means, and
| I'm sure the legislator should have no problem finding
| experts in that area to make a legally rigorous
| definition.
| zo1 wrote:
| _Ads_.
|
| I knew something was seriously wrong the moment I saw a
| legitimate business (EBay) selling eye-ball space (ads) on
| their property that was supposedly profitable through
| legitimate business (hosting a marketplace, taking a cut,
| etc).
|
| Ads create a negative and detrimental feedback loop by
| incentivizing dark patterns and other negative gamification
| in order to squeeze out previously non-existent eyeball
| time from your product. E.g. the optimal path for, say,
| EBay is to have a user come on, find what they want, browse
| a bit through interesting things and recommendations, buy
| what they want/need, then log off. Instead, ads have
| incentivized spam listings, which do two things: they add
| eyeball time and thus ad impressions/clicks, and they've
| caused the creation of non-optimal experiences by allowing
| non-optimal players to exist through pure randomness. I.e.
| in an ideal market it should be "winner takes all" for any
| unique genre or field or product space worth exploring.
| Instead, the spam listings make it so that a non-negligible
| number of useless, bottom-of-the-barrel
| products/sellers/companies exist and _thrive_.
|
| For FB, ads have commoditized eyeball time even more
| directly than the indirect example I gave above with EBay.
| A potential product path with FB should be people using it
| as a platform to interact with people they know, organize
| events, and to have a shared space to communicate and
| discuss ideas.
| asdff wrote:
| There should be some laws about using addictive patterns
| imo. I'm sure it would be fine and profitable, and
| Coca-Cola would happily have kept putting cocaine into
| their drinks to make their customers want them all the
| more, but we have laws preventing that behavior in
| meatspace, so we can have laws preventing this sort of
| evil behavior from technology companies too. Tie it into
| the website accessibility laws that are already codified
| and can be used to sue certain companies today.
| MetaWhirledPeas wrote:
| While this is extremely murky and maybe impossible to pin
| down from a legal standpoint, I do like the thought. It's
| not just Facebook and it's not just social media. It's
| any software (online games?) that clearly goes out of its
| way to induce addictive behavior as its business model.
| nradov wrote:
| How could legislators draft such a law in a way that
| wouldn't be voided for vagueness?
|
| https://en.wikipedia.org/wiki/Vagueness_doctrine?wprov=sfla1
| asdff wrote:
| Probably with the help of psychologists who can offer
| more concrete definitions of addictive behavior and dark
| patterns than you or I.
| nradov wrote:
| Since psychology is mostly unscientific bunk, hopefully
| the courts would put a stop to that type of legislative
| overreach.
| asdff wrote:
| Yikes
| fidesomnes wrote:
| You sound like an absolute tool.
| mckirk wrote:
| The gaming industry would like to have a word...
|
| In a sense, companies are right now incentivized to develop
| the most effective 'digital crack', because anything that
| hijacks the reward pathway of the brain more effectively
| leads to more profit. It'll be quite interesting to see
| how the public discourse around this will progress, since
| digital entertainment isn't as easy to publicly mark as
| 'bad' as drugs were.
|
| On the other hand, China is sending quite clear signals
| that it's theoretically possible to legislate against
| e.g. video games -- though only after you've already
| established an intrusive 'social credit' system, which I
| hope we won't see in the west any time soon.
| asdff wrote:
| Digital crack is a perfect way to describe this. I'm sure
| someone clever enough can write some great legislation
| for this. The issue is that so many industries are
| beholden to relying on digital crack. You might get one
| senator who wants this, then 99 others who are getting
| flooded with calls from every major employer in their
| district telling them to vote no. I wish we had stronger
| government that wasn't so susceptible to having anything
| good for the public exploited to make a few people very
| wealthy. Then again, we've never had this sort of public-
| first government in the history of our nation; it's sort
| of always been like this by design, as I find whenever I
| learn more about our history.
| MetaWhirledPeas wrote:
| > The gaming industry would like to have a word
|
| And I would like to have a word with them. They've been
| given free rein to turn our kids into absolute digital
| junkies (this is coming from a self-diagnosed sometimes-
| addict who realizes these kids are _on another level_),
| deliberately dangling carrots that reward 24/7 engagement
| in the activity.
|
| > digital entertainment isn't as easy to publicly mark as
| 'bad' as drugs were
|
| Definitely true. We need a way to differentiate between
| Super Mario Brothers and Mega Crack Force Gacha Legends
| Online.
| threeseed wrote:
| > There should be some laws about using addictive
| patterns
|
| Of course there should be.
|
| But then you would also need to ban casinos, sports
| gambling, gaming, porn, cigarettes, alcohol and the
| myriad of other things that are addictive in nature.
| kaibee wrote:
| Notably, all of those things are in fact, banned for
| people under the age of 18 or 21.
| asdff wrote:
| Well we do have laws regulating and/or taxing most of
| those addictive things already. Except for social media
| and gaming really, although gaming is under hot water
| currently due to loot box gambling mechanics.
| axguscbklp wrote:
| I disagree - I think that it should be completely legal
| to sell cocaine drinks as long as you inform the
| customers that the drinks have cocaine in them and I
| think that it should be legal to use even the most
| psychologically manipulative marketing techniques
| imaginable. I would rather that it be the responsibility
| of consumers to avoid getting addicted than to use
| government power to ban things. Similarly, for example I
| think that it should be legal to sell skateboards even
| though people sometimes injure themselves while riding
| them.
| madengr wrote:
| No, we live in a constitutional republic.
| Animats wrote:
| _I think a lot of people are choosing to ignore that a lot of
| companies have done things in the past that were not illegal at
| the time of action. However, those actions were later decided
| to be made illegal because the behavior was deemed to be
| antithetical to our values._
|
| Which, on good days, is why we have legislatures. To make new
| laws to cover new situations.
| 1vuio0pswjnm7 wrote:
| "In response, Congress passed the Sherman Antitrust Act in 1890
| which subsequently prevented the actions that Standard Oil had
| used to consolidate the market."
|
| This tech company employee (aka the "Facebook Whistleblower")
| is refusing to share the documents she stole with the FTC.
|
| Although she did share them with several attorneys general.
|
| It appears she does not support antitrust inquiries. Heavy
| consolidation of "social media" with no meaningful competition
| is acceptable to her.
|
| Needless to say, some would argue competition provides
| incentives for large players to improve their services.
| madengr wrote:
| It's not unethical at all. It adheres to the 1st amendment. If
| anything, the censorship is illegal.
| ayngg wrote:
| One fairly common pattern is that companies develop in a
| nascent space where there are few rules and are therefore
| able to basically outrun regulation and the law, which
| move very slowly. When that regulation eventually comes,
| it ends up
| solidifying the monopolistic advantage by essentially creating
| a moat and closing the door on practices that helped create
| such growth in the first place. I think when stakes are that
| high, companies are generally rewarded and incentivized to be
| unscrupulous rather than virtuous, especially when the
| unscrupulous actors just become wealthy enough to buy out the
| virtuous ones.
|
| I wouldn't be surprised if we are currently in the middle of a
| version of this regarding social media and how privacy of
| personal information is handled right now.
| tsimionescu wrote:
| This argument is brought up a lot, but it seems the lack of
| regulation hasn't really stopped FB and Google from
| monopolizing their markets anyway (or, oligopolizing if we
| think they're in the same market).
| laurent92 wrote:
| What would a developed civilization do? I doubt we would
| be able to prevent the "bubble up and close the door"
| behavior, so should it instead ensure that corporations
| are regularly rotated (i.e. dismantled for others to take
| the space) so that only those which can succeed fairly
| under the current legal framework survive?
| axguscbklp wrote:
| >There should be no question, that what FB is doing here, while
| not illegal, is highly dubious ethically.
|
| Why, what exactly are they doing that is ethically dubious? So
| far based on what I have read of this whistleblower's
| revelations, I do not have a problem with Facebook doing any of
| it.
| woeirua wrote:
| Really? You don't have a problem with an app that causes 1%
| of teens that use it to develop suicidal thoughts? By the
| way, according to the leaked study these teens directly
| attributed their suicidal ideation to Instagram.
| newaccount2021 wrote:
| Are you coming after my collection of Smiths CDs?
| discobot2 wrote:
| What would that number in question be if we evaluated
| schools or cinemas or night clubs?
| axguscbklp wrote:
| Yes, I do not have any problem with it whatsoever. There
| are probably plenty of books that also cause some
| percentage of people to develop suicidal thoughts, but I do
| not want to start banning those books.
| w0m wrote:
| This is the kicker, I think. Facebook scales 'keeping up
| with the Joneses' and makes it easier. But that's been a
| common trope since (google search... 1920ish). What
| Facebook's doing isn't new; it's simply Easier.
|
| When you say '1% develop suicidal thoughts' - is that
| causation or correlation? Maybe I'm missing something,
| but this seems somewhat like 'biggest target' to me as
| the world has shrunk.
|
| https://health.ucdavis.edu/health-news/newsroom/even-
| before-...
| jimkleiber wrote:
| I really appreciate this point. I often see it as written rules
| (laws) and unwritten rules (ethics). If something breaks the
| unwritten rules we have about how people are supposed to
| interact with each other, then we often codify that rule into
| law. Many people will say "I didn't break the law" even
| where many others would say that person did break an
| unwritten rule.
|
| > There should be no question, that what FB is doing here,
| while not illegal, is highly dubious ethically.
|
| At the same time, I believe some of the stuff FB has done is
| currently illegal, such as this example in one of the
| whistleblower's disclosures to the SEC [0]:
|
| > Our anonymous client is disclosing original evidence showing
| that Facebook, Inc. (NASDAQ: FB) has, for years past and
| ongoing, violated U.S. securities laws by making material
| misrepresentations and omissions in statements to investors and
| prospective investors, including, inter alia, through filings
| with the SEC, testimony to Congress, online statements, and
| media stories.
|
| So it could be a combination of them both violating ethics and
| violating the law.
|
| [0]:
| https://twitter.com/jason_kint/status/1445248400237244423?s=...
| vladd wrote:
| If a poem (or book) makes 10% of its readers more likely to
| become geniuses and contribute to solving world problems such
| as cancer, but 0.1% of its readers are more likely to commit
| suicide, should that book be banned by law?
|
| Today's online society is based on posts created by content
| creators around the world, where algorithms can barely
| scratch the surface at interpreting their content, humans
| don't scale in reviewing every post, but statistics such as
| the above could be arguably inferred easily based on a
| combination of engagement (click/scrolls) data and
| attrition/session-revisits numbers.
|
| Which is really problematic, because codifying into law rules
| and punishments based on aggregated outcomes and impact to us
| as a society (or to society sub-segments such as teens) makes
| it a very hard process to navigate between censorship vs.
| positive overall outcome vs. specific negative outcome on
| some outliers.
| cutemonster wrote:
| It's a misleading comparison.
|
| From what I read, trafficking / sex slavery was (is)
| happening via some places on FB, the company knew about it,
| did nothing. For example.
|
| Edit: the article from this HN discussion:
| https://news.ycombinator.com/item?id=28741532 , search for
| "traffickers" and "drug cartels". /Edit.
|
| Understaffed moderation teams, although FB had lots of
| money
| dfadsadsf wrote:
| I know that somebody is raping someone in NYC right now
| and somebody will be killed in Chicago by the end of the
| day today. Should we ban the cities or at least force
| them to spend all their budget on security? Or set up
| curfew for citizens? Maybe public hangings a la Taliban -
| those definitely reduce crime.
|
| Humans are using FB and where you have humans they commit
| crimes. Trying to eradicate all crime when you have
| humans in the loop is generally not a great idea. Besides,
| fighting trafficking/sex slavery with very few exceptions
| generally means harassing women with zero benefit to
| society or reduction in actual sex crimes.
| freetinker wrote:
| But because we know humans rape and kill, we take
| measures to create circumstances that reduce the
| probability of such things happening.
|
| Such as well-lit streets or gun control laws.
| vladd wrote:
| Let's try to phrase it in an actionable way for the law-
| makers to act upon it.
|
| Are you suggesting that any profitable company hosting
| user-submitted content should invest all the profits in
| moderation teams to the point where they are either a)
| becoming profit-neutral or b) all the relevant content
| has been reviewed by a human moderator?
|
| And how do you define relevant content -- having had 50
| views? 10 views? 1 view? Who should decide where to set
| these limits? Do we believe politicians are going to do a
| better job at it rather than the existing situation? Or
| should we ban any non-human-reviewed post just to move
| the certainty of illegal post removals from 99.9% to
| 99.99%? (humans do make mistakes too)
|
| (Facebook is really big so having just 99.99% of posts in
| compliance still means an awful lot of them escaping
| the system undetected)
| tsimionescu wrote:
| > Are you suggesting that any profitable company hosting
| user-submitted content should invest all the profits in
| moderation teams to the point where they are either a)
| becoming profit-neutral or b) all the relevant content
| has been reviewed by a human moderator?
|
| Yes, obviously. Why should a company get to profit from
| sex traffic or any other such content on their platform,
| just because it would cost money to take it down?
| haroldp wrote:
| Sex slavery is being facilitated by telephone
| conversations. What is the phone company's obligation to
| do about that?
| finfinfin wrote:
| Would you agree that it would be wrong for telephone
| companies to amplify sex slavery conversations? Like they
| would call you directly and just let you participate in
| the conversation because that would generate more
| engagement?
| haroldp wrote:
| That is a very good counter point. I haven't read this
| facebook story yet, but I am willing to assume for
| argument that describes what happened. I guess it would
| depend for me on whether _people_ saw sex-slavery content
| and decided to amplify it, vs an algorithm that finds and
| promotes "engaging" things without being very smart
| about what they are.
| vladd wrote:
| When phone companies came into existence, that's exactly
| what they did -- they amplified such conversations by
| making it easier for people to have phone calls and talk
| at a distance from each other.
|
| They also got amplified whenever long distance calls got
| cheaper (as the overall volume of conversations
| increased).
| Dracophoenix wrote:
| How are you defining "amplification"? Phones already
| operate by complex signal amplification over long
| distances. Why do you think burner phones are still
| prevalent for all manner of illicit activity?
|
| I don't think the phone company should be shut down
| because others can use it in a way that's considered
| devious. I don't think the phone company should play
| "morality police" either. I simply expect the phone
| company to provide the service I paid for.
|
| This type of thinking strikes me as the kind that would damn
| Gutenberg for inventing the movable-type printing press
| because print has been used to disseminate propaganda and
| debauchery to billions of people several centuries later.
| finfinfin wrote:
| Amplification not in the electrical signal amplification
| sense but rather in the sense of amplifying the message.
| Facebook is giving more visibility to content that it
| considers more engaging, even if that content leads to
| harmful outcomes (its own research proves that).
| Dracophoenix wrote:
| You were making a point regarding phone-operated sex
| trafficking. Your characterization of what the phone
| company should do was what I contended. While, I'm aware
| that this was made as a broader point regarding Facebook,
| amplifying a signal and amplifying a message aren't
| functionally different. Television is an example of where
| both are happening. Even Twitter and Tiktok engage in
| amplification every time there's some Tide-pod Challenge.
| I don't see why Facebook would have to be responsible for
| how people feel about themselves, or what stunts bad
| actors pull.
| finfinfin wrote:
| Right. In the case of phone-operated sex trafficking I
| don't think amplification is even an option. It's not
| like phone companies are deciding what phone calls you
| should be receiving today and are lining them up for you
| to take part in. So they don't involve algorithmic
| manipulation (or optimization for engagement), unlike
| Facebook or other social media.
|
| In my parent post I was giving an example of an absurd
| imaginary situation with phone companies attempting to
| amplify sex trafficking by directly deciding who will
| participate in the conversation for the purpose of
| increasing engagement.
| drdeca wrote:
| if magic books were real, then the way we would have to
| treat books would be much different.
| finfinfin wrote:
| Looks like you are willfully ignoring Facebook's own
| findings. They know that polarizing content is more
| engaging yet harmful... and they choose to amplify it
| anyway.
|
| The same old "it's hard, therefore let's not do anything"
| argument is not applicable.
|
| Facebook is not a neutral platform that just shows all
| posts from your friends in a chronological order. They are
| actively manipulating the stream and are fully responsible
| for what you consume.
| freeopinion wrote:
| > Facebook [clipped] are fully responsible for what you
| consume.
|
| I'm not sure how deeply you hold this belief, but I am
| concerned to see so many people push all blame from their
| own actions. While it may be true that Facebook is
| largely responsible for what is consumed *on Facebook*,
| individuals are largely responsible for consuming
| Facebook.
| rmahan wrote:
| I think they fall into more responsibility here because
| they've also designed it to be addictive. If Facebook was
| easier to quit, I'd hold individuals more accountable.
| drbojingle wrote:
| That's true, but does my mother understand what's really
| going on? Do you? Do I? Choosing to pick up the phone and
| call your daughter and choosing to go on Facebook are very
| different, and people who grew up with the former might
| not realize how different the latter really is.
| chasd00 wrote:
| this is true and if you're going to put Facebook in the
| spotlight you're going to have to put a light on everyone
| else. The entire computer gaming industry is one big
| dopamine cartel. If the facebook addiction is such a big
| deal then it's a little ironic gaming hasn't been
| completely dismantled.
|
| //edit: honestly i think politics are a little at play
| here. Facebook (these days) is used heavily by an older
| more conservative crowd and i think it's irritating to
| the other side
| satellite2 wrote:
| I think they do. When you only see posts about how
| vaccines cause autism, anecdotes about this and that
| person and the diseases they got from the vaccine, and,
| on top of that, claims that the vaccine doesn't even
| prevent the disease it was designed against, then it
| becomes reasonable to become antivax.
|
| And if Facebook effectively chose, knowingly, through
| their algorithm parameter selection, to promote this
| material because it increases engagement more than
| reasonable content does, then yes, I think they should at
| least be partly held responsible for the harm caused by
| the anti-vaccine movement.
|
| And this is only one example.
| vladd wrote:
| Walmart is "manipulating" the placement of products on
| the shelf so that it's more likely for you to engage in
| bulk buying when you visit their stores.
|
| Both Facebook and Walmart have a fiduciary duty to their
| shareholders to create value for them.
|
| The difference is that, with user generated content, the
| idea of black and white "bounds" of the law is no longer
| applicable and you have to devise a system of checks and
| balances based on probabilities.
|
| You can consider 10'000 posts for offline analysis: give
| them to some human raters and decide retrospectively what
| engagement and thoughts (positive/negative) they are
| generating in teens, which should enable you to draw some
| statistics about the expected average outcome. This
| doesn't mean it's either scalable or economically
| feasible to do so in real time for every post (so you
| cannot take decisions based on something that doesn't
| exist at the individual post level).
|
| You can have multiple algorithms, send all of them to
| human raters and get for each algorithm some aggregated
| behaviour, but then we're back to the book question above
| -- what ratio of positive vs negative outcome in outliers
| is acceptable, and how do you define a "legal"/"allowed"
| algorithm?
| sul_tasto wrote:
| Walmart doesn't stock land mines, rocket launchers,
| anthrax, or many other items harmful to democracy and
| society on its shelves, even though I'm sure it could
| make a lot of money selling such items.
| chalst wrote:
| My regular reminder that there is no fiduciary duty to
| behave unethically. Fiduciary duty is a class of highly
| specific legal obligations on directors to act
| attentively and not put their own financial interests
| above those of shareholders. It is not an obligation to
| maximise return on investment.
|
| Cf. https://news.ycombinator.com/item?id=20776770
| finfinfin wrote:
| I am baffled by this display of lack of ethics. Do we
| need a Walmart comparison to put Facebook's action in
| perspective? Facebook - by its own acknowledgement -
| negatively affects teenage mental health and the
| democratic processes in many countries. Do you see how
| different this is from selling more mayonnaise jars in
| Walmart?
|
| Facebook doesn't have a duty to manipulate content. This
| is a very weak excuse that works mostly for people
| directly benefiting from the situation. Didn't cigarette
| companies have a duty to maximize profits? Pharma
| companies pushing accessible opioids? Is that a more apt
| analogy?
| vladd wrote:
| The following have certainly been used to commit crimes
| and meddle with democracy: Verizon phone conversations,
| Gmail discussions, Twitter, Snapchat or TikTok messages,
| etc.
|
| Nobody wakes up and says "let's be unethical today";
| rather, it's the reality of user-generated content
| platforms that either you get both outcomes, or you get
| none.
|
| The discussion is about making people realize that the
| "technology" to keep only the good parts (without the
| downsides) wasn't invented yet.
|
| Hence we're in a position to argue whether it would be
| more ethical to shut down / censor everything, or have
| fruitful discussions on how to emphasize the good
| outcomes over the bad ones with the current tech (by
| first understanding it, something that politicians seem
| to be very bad at, or show little interest in it compared
| to the negative FB sentiment engagement they're
| generating in their voters -- ironic :) ).
| cmorgan31 wrote:
| Nobody? Give it a rest. We're not dumb enough to think
| everyone in technology, specifically ad tech, is ethical
| by default. Facebook made their own bed and made the
| mistake of allowing the internal research out of the
| closed corporate box. They can mitigate the impact of
| their most engaged content but it would be to their own
| fiscal detriment which is why they fundamentally decide
| not to mitigate it.
| dkarl wrote:
| > Facebook - by its own acknowledgement - negatively
| affects teenage mental health and the democratic
| processes in many countries. Do you see how different
| this is from selling more mayonnaise jars in Walmart?
|
| Replace mental health with physical health and you have a
| great argument against how food is produced, marketed,
| and sold. We tackled these issues first with tobacco, and
| food wouldn't be a bad place to turn our attention after
| the social media companies.
|
| Corporations are ruthless, inhuman optimization engines.
| When we don't sufficiently constrain the problems we ask
| them to solve, we get grotesque, inhuman solutions, like
| turning healthy desires into harmful addictions.
| kaibee wrote:
| I would also have OP consider that yes, maybe having
| corporations like Nestle, CocaCola, etc that prioritize
| profit above all else is, in fact, also bad. Like, lets
| be real here, if the CEO of Coke had a button that could
| double the consumption of Coke products in the USA he
| would definitely push it, despite the fact that hundreds
| of thousands of people would become more obese and live
| worse, shorter lives. Advertising is an attempt at such a
| button.
| int_19h wrote:
| > Both Facebook and Walmart have a fiduciary duty to
| their shareholders to create value for them.
|
| I feel like the more this claim is repeated, the more
| pushback you're going to see against it - and rightly so.
|
| We need to remember that corporations are themselves
| fictitious legal entities. They only exist because
| society wills them into existence, and it can do so with
| arbitrary strings attached - there's no natural right to
| form a corporation. So, if it turns out that "fiduciary
| duty to their shareholders to create value" inevitably
| leads to the abusive megacorp clusterfuck that we are
| seeing today, why should we be clinging to it?
| finfinfin wrote:
| It's puzzling how many people are so ready to mask their
| own responsibility by shifting it to a legal entity that
| apparently now has a duty to do whatever it takes to
| generate more profit. As if individually these people
| wouldn't act in unethical ways but once they put on the
| "I am a corporation" mask anything goes.
| nuerow wrote:
| > Walmart is (...)
|
| Whataboutism advances no discussion. Either Facebook's
| problems are discussed based on Facebook's circumstances
| and decisions and consequences, or we're better off not
| posting any message at all.
| docmars wrote:
| Comparisons, analogies, and metaphors are useful tools to
| increase understanding and draw parallels to ideas that
| are challenging to navigate and naturally, lead to a
| variety of thoughtful outcomes or interpretations.
|
| Crying "whataboutism" is as fruitless as you've described
| above. It is often used to steer a conversation towards a
| single direction of bias when those comparisons lead to
| inconvenient conclusions/possibilities that fall outside
| of what the person claiming it has accepted. Just sayin'.
| ;)
| nuerow wrote:
| > Comparisons, analogies, and metaphors are useful tools
| (...)
|
| Whataboutism is neither. It's a logical fallacy employed
| to avoid discussing the problem or address issues by
| trying to distract and deflect the attention to
| irrelevant and completely unrelated subjects.
| wanderingstan wrote:
| I found it an apt comparison, highlighting how we might
| accept something in physical space (Walmart) yet be
| critical of the equivalent action in the online space.
| It's a thoughtful and coherent argument, even if one
| disagrees with it, not whataboutism.
| haroldp wrote:
| Please stop down-voting thoughtful comments such as this
| just because you disagree with them.
| chacham15 wrote:
| > I often see it as written rules (laws) and unwritten rules
| (ethics).
|
| I think this is a very dangerous line to walk. A common
| phrase in law is "the law often allows what honor forbids"
| and that is because there is a difference between the law and
| ethics and IMO that is a good thing.
|
| Is it ethical to eat all the cookies in the cookie jar and
| leave none for anyone else? No. Should it be illegal? No.
| pjc50 wrote:
| Should it be subject to social sanction? Yes.
|
| (arguably eating cookies that aren't yours _is_ a crime,
| and I don't doubt that someone has in the past been
| arrested for it in ridiculous circumstances)
| cookie_monsta wrote:
| Australia says hello
| munk-a wrote:
| Let's say that if there are two or more cookies in the jar
| every morning I add another one to it - under that scenario
| (especially if we go so far as to say cookies reproduce at
| some fixed proportion) then yea - it's totally illegal to
| eat all the cookies. The most common example of this
| tragedy of the commons is fishing but it happens all over
| the place.
|
| Specifically on the topic of cookies - it honestly is
| "forbidden" in a lot of households to eat all the cookies
| in the jar. At work you'll probably face some consequences
| if there's a communal cookie jar (or, the more common
| scenario, drinking all the half-n-half and not getting
| more). We don't really have "public" cookie jars so this
| scenario is pretty contrived, but if there was one (i.e. if
| NYC installed a big cookie jar in Times Square for
| Halloween) then it probably would actually be illegal (or
| at least, against a city ordinance) to eat more than X
| cookies. But, like I said, it's pretty contrived feeling.
| NomDePlum wrote:
| Is it a dangerous line to walk? For who?
|
| FB are involved in unethical practices that, whilst not
| illegal at present, are much more consequential than your
| example.
|
| There are obvious questions here to ask that may lead to
| new laws being made and perhaps even retrospectively
| enforced.
|
| Social engineering for profit is a little more serious than
| who ate the cookies surely?
| licebmi__at__ wrote:
| If we are trapped somewhere with no food other than the
| cookie jar, then we will see how long it takes before
| eating all the cookies becomes illegal.
|
| Justice is a messy concept because it is rooted in
| specific circumstances, and it's absurd to think there's
| a clear line between what's unethical and what's illegal.
| TAForObvReasons wrote:
| We've slowly seen an alternate interpretation promulgated
| by many: anything that is not illegal is ethical. The
| endpoint is practically the same (anything legal is ethical
| and vice versa) but it arguably makes for a worse society.
| bregma wrote:
| We've also seen, sometimes in high places, the third way
| of "it's illegal and unethical but I can get away with
| it."
| int_19h wrote:
| "For my friends, everything; for my enemies, the law."
| satellite2 wrote:
| Isn't the cookie jar an allegory of the commons? And as
| such, shouldn't it be forbidden to shit in the cookie jar?
|
| I had the impression that it was how the story ended when
| we talked about cookies, but maybe I didn't get the memo.
| josephg wrote:
| The point is that being forbidden and being illegal are
| different ideas. It's bad for society to codify too much
| behaviour in law. Knowing the law is no substitute for
| knowing the difference between right and wrong.
|
| Regulating Facebook is a great example. Congress could
| easily react to facebook's indiscretions by passing new
| laws here which stifle innovation.
| munk-a wrote:
| > It's bad for society to codify too much behaviour in
| law.
|
| The issue with over-codification is one of the complexity
| in the laws that result - not that a large number of
| prohibitions is actually damaging to society. If too many
| laws exist then enforcement becomes intractable,
| arbitrary and unjust - but if enforcement could be sanely
| and fairly dealt out then there are lots of things that
| we'd appreciate being laws - e.g. sniping someone's
| parking spot while they're pulling in: it's a dangerous
| action that encourages people to park faster than they're
| comfortable with and generally makes people act like
| assholes... but is it worth paying someone 50k/year to
| prevent parking-spot sniping? Nope.
| satellite2 wrote:
| I think I got that part. I was referring to the book "The
| Tragedy of the Commons" (a very interesting small book that
| I recommend), which basically says that when you have N
| users of some commons, even if game theory says it's in
| everyone's collective best interest to protect the commons,
| as that is the strategy that maximises satisfaction, once N
| becomes large enough, someone will start damaging it and
| soon everyone will do the same. So the tragedy is that you
| actually have to enforce the behaviour that's in everyone's
| best interest as a law.
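The dynamic described above fits in one line of arithmetic: exploiting the commons yields a private gain while the damage is shared by all N users, so defection becomes individually rational once N is large. A minimal sketch (the payoff numbers here are invented for illustration):

```python
def exploiting_is_rational(n_users: int,
                           private_gain: float = 2.0,
                           shared_damage: float = 3.0) -> bool:
    """Each extra unit harvested gives the harvester +private_gain but
    degrades the commons by shared_damage, a cost split equally among
    all n_users. Defection pays once the private gain exceeds one's
    own share of the damage, which it always does for large enough N."""
    own_share_of_damage = shared_damage / n_users
    return private_gain > own_share_of_damage

# A lone user internalizes the full damage and conserves;
# in a crowd, the same user's share of the damage is negligible.
print(exploiting_is_rational(1))    # False: gain 2.0 < damage 3.0
print(exploiting_is_rational(100))  # True: gain 2.0 > share 0.03
```

With these numbers the tipping point is already at N = 2, which is the tragedy: only enforcement (a law) keeps individual incentives aligned with the collective interest.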
| billiam wrote:
| AFAIK SEC laws and regulations about misrepresentation are
| only sporadically enforced to encourage compliance by
| example. Look at what Musk and his companies have gotten away
| with. Of course, I am all for these disclosures, which of
| course FB will pay their way out of without admitting
| wrongdoing. Because corporations manage our government, not
| the other way around.
| chalst wrote:
| SEC cases are hard to put together, but the SEC is far
| from toothless. Cf. [1].
|
| Lying to Congress is another matter: perjury convictions
| are pretty rare.
|
| [1]:
| https://papers.ssrn.com/sol3/papers.cfm?abstract_id=933333
| taf2 wrote:
| My Mom said it to me in simpler terms when I was little...
| "we only have laws because of the assholes"
| koonsolo wrote:
| Same with contracts: you only need them when things go bad.
| sarkron wrote:
| Plato put it like this in his "Laws": "Laws are made to
| instruct the good, and in the hope that there may be no
| need of them; also to control the bad, whose hardness of
| heart will not be hindered from crime."
| chmsky00 wrote:
| I wonder how he'd speak of a supposedly even better
| educated society stealing the future of the next due to
| circular validation of our waste filled industrialism.
| SuoDuanDao wrote:
| Given that most Greeks of that era believed they lived
| after the decline of a golden age, I suspect he might be
| more understanding than most people today.
| beepbooptheory wrote:
| In fact, Plato himself lived through a ruthless and
| bloody revolution that killed his friends and ended in
| the reign of the Thirty Tyrants!
| toomuchtodo wrote:
| May you live in interesting times.
| munk-a wrote:
| Plato at the Googleplex might be of interest to you -
| it's got its rough points but it brings a lot of his
| philosophy forward to be more relatable.
| B1FF_PSUVM wrote:
| > stealing the future of the next
|
| How would the next generation like having a pristine
| Earth, without any infrastructure whatsoever?
|
| No roads, no buildings, no domesticated animals, no
| libraries, nothing but nature.
|
| No pyramids, no poems, all the oil and coal in the
| ground, "noble savages" all around.
| listless wrote:
| Wait - does that mean it's the assholes who make the laws
| or are they the ones we create laws for? Or does it work
| both ways?!
|
| Your mom's super profound.
| Ancapistani wrote:
| Which assholes? The ones that passed them, or the ones that
| did something stupid to prompt them?
|
| (I like this statement!)
| _tom_ wrote:
| Then there is the case of things are illegal, but are not
| enforced. Leading to the question of what is the law? What is
| written, or how it is enforced?
|
| How many of you went above the speed limit today?
|
| I suspect that much of what goes on in the stock market is
| similar.
| noizejoy wrote:
| Having driven in quite a few different jurisdictions over
| the years, my impression became: The safest driving speed
| is the one that blends with local driving culture. In some
| places, that's well above the posted limits, and in others
| it's quite a bit below.
|
| I suspect that degrees of being generally law abiding also
| vary across cultures.
| Y_Y wrote:
| At least speeding is democratised. Any asshole can go too
| fast. Insider trading is tough and you've got to be an
| insider (or at least know one).
|
| I wonder which causes more net deaths.
| playguardin wrote:
| If Congress invites you to speak without fear of arrest and
| the main stream media hold you up as a darling then you are
| NOT a whistleblower. You are doing the bidding of power... If
| you are exiled to Russia (Snowden) or locked up without trial
| like Assange THEN you are a whistleblower and dangerous to
| power. This chick is a shill
| downandout wrote:
| Not everyone has the same moral compass. It isn't even clear
| that the whistleblower herself was guided by concern for
| society: she first went to the SEC with these complaints.
| That's a weird place to go with concerns about social media's
| impact on society. I wonder why? Perhaps...just maybe...it was
| because the SEC will give her 10-30% of any fines levied
| against Facebook, leading to a potential windfall of $1 billion
| or more to her personally.
|
| It takes truly egregious behavior for society to agree that new
| laws must be passed to outlaw it. The current state of social
| media says much more about _human_ behavior than _Facebook's_
| behavior. Everyone here would also almost certainly reject the
| kinds of laws that would be required to make Facebook /IG a
| healthier place. They would likely involve serious privacy
| violations, just for starters. So given that legislation in
| this area has almost no chance of passing, it is unclear what
| the point of this is, other than a huge payday for the
| whistleblower.
| hef19898 wrote:
| How about she went to the SEC because for certain things
| that is _the right place_ to go? And the SEC would have to
| fine FB first, which would mean that FB committed illegal
| acts. That alone is behavior that needs to be encouraged;
| exposing corporate wrongdoing is a net positive for society.
| downandout wrote:
| _which would mean that FB committed illegal acts_
|
| The SEC will make extremely vague allegations that Facebook
| misled investors by not disclosing some of these reports.
| Facebook will settle to avoid further reputational damage,
| paying large fines without admitting wrongdoing. This woman
| will then buy an island to vacation on and a G650 to get
| there with. That is how 99% of these things play out.
|
| I personally don't believe that Facebook has done anything
| illegal here. That is not to say I don't think they have
| done anything _wrong_ - their business, like many others,
| is morally bankrupt in some ways. But there is no codified
| responsibility for Facebook to do _anything_ to cure the
| ills of social media. You don't see casinos being
| successfully sued for causing suicides, bankruptcies,
| divorces, financial crimes, etc., even though those harms
| happen every day.
| That's because there is no law against being in a scummy
| business. Investors in such businesses know (or should
| know) what they are supporting.
| tannhaeuser wrote:
| Frankly, I'm not expecting any meaningful legislative response,
| given US antitrust has blessed the WhatsApp and Instagram
| acquisitions by Fb as well as Google's acquisition of
| DoubleClick and YouTube.
| dantheman wrote:
| Standard Oil reduced the cost of oil, delighted customers,
| and had already suffered a massive decline in market share
| by the time the antitrust action happened. It was driven by
| people who couldn't compete.
| _hyn3 wrote:
| As someone who has studied this fairly extensively, I believe
| this comment to be factually correct.
|
| It also didn't hurt Rockefeller in the slightest. To the
| contrary, he actually became _far_ wealthier post-breakup,
| possibly because all former business units became more
| efficient in the light of open competition.
|
| Many of them live on today, such as Texaco, Chevron, and
| Mobil.
|
| https://historyincharts.com/the-legacy-of-the-breakup-of-
| sta...
|
| https://www.britannica.com/topic/Standard-Oil
|
| In general, trustbusting almost never actually works as
| planned, but it always seems like a good idea -- a desperate
| solution, perhaps the _only_ possible solution -- at the
| time.
|
| The only thing that tends to work is upstart competition
| driven by new technology that blindsides the older company.
|
| When it comes to a monopolist, the one thing that we can say
| historically is that, "This too shall pass."
| jimbob45 wrote:
| >possibly because all former business units became more
| efficient in the light of open competition.
|
| Then it sounds like it _did_ work in the eyes of the people
| who wanted more efficient corporations (and thus potential
| savings to be passed down to them).
| freeopinion wrote:
| Your point is a good one, but we should be careful not to
| equate [edit: completely] Facebook's actions with those of
| Standard Oil.
|
| If we say that the Sherman Antitrust Act was only necessary
| because of unethical behavior on the part of players like
| Standard Oil, we cannot say the same in this situation.
|
| If you consider Facebook's behavior unethical, how do you view
| the behavior of the millions of people who fund them and
| provide them such market power? There are many many
| alternatives to Facebook. But non-Facebook parties routinely
| force people to Facebook if they want to be involved in an
| event or receive a notification or provide feedback.
|
| If you would shame Facebook for their behavior, you should also
| shame others for using Facebook. Users enable Facebook's
| behavior.
|
| Conversely, if you hold Facebook users harmless, it is harder
| to sympathize with complaints about Facebook's behavior.
|
| There are clear parallels between Facebook and Standard Oil.
| But it is useful to note where there are differences, too.
| stephc_int13 wrote:
| Law always lags behind social norms, for many reasons.
|
| Because of that, I don't think laws can change the world;
| it's the opposite: laws merely acknowledge the common rules
| of the majority.
|
| Technology also changes the world, and we need time to
| figure out new rules to adapt. In the case of Facebook and
| the other giants, some changes are clearly needed; at least
| there seems to be a growing consensus on that.
| tsimionescu wrote:
| Laws can absolutely change the world (at least if we take
| the world to mean one country). Look at de-segregation.
| It's obvious that the world was segregated before the
| de-segregation laws were passed. Of course, the laws were
| passed because there was ample support for them, but that
| doesn't mean the laws only acknowledged an existing state
| of affairs. Even more so, the laws themselves helped
| accelerate the perception of segregation as evil among the
| majority of the population, whereas before it was just a
| regular part of life to many (many on the good side of the
| segregated world, of course).
| dionidium wrote:
| It really bums me out to see these sentiments at the top of HN.
| What to do about "misinformation" is an interesting question
| for private actors to think about, if they wish, but what the
| government should do about it is not an interesting question.
| The debate has been had for a couple hundred years, already.
| It's over. One side already won.
| _hyn3 wrote:
| > a lot of companies have done things in the past that were not
| illegal at the time of action. However, those actions were
| later decided to be made illegal because the behavior was
| deemed to be antithetical to our values.
|
| What you are saying is literally the opposite of hundreds of
| years of the rule of law.
|
| If FB did break the law as written, then prosecute them for
| that in a fair trial by a jury of their peers, but yours (or
| anyone else's) personal feelings about "our values" should
| never be able to override the plain language of the law,
| especially retroactively.
| NineStarPoint wrote:
| The point they're making is you don't prosecute Facebook for
| something that we think is unjust but is not yet illegal. You
| make it illegal, and then after that point if anyone
| continues with the now illegal course of action then you
| prosecute them. Prosecuting someone for something that wasn't
| illegal when they did it would of course be wrong.
| _hyn3 wrote:
| If that was what the OP had said, I would agree, but I
| don't interpret that as being what they actually said
| (although I may have misinterpreted):
|
| "However, those actions were later decided to be _made_
| illegal " (emphasis added)
|
| I interpret this as saying that they believe retroactively
| applying enforcement would be moral, just, and legal.
| oort-cloud9 wrote:
| Standard Oil would not have been able to consolidate the
| oil industry without the help of the government.
| philwelch wrote:
| It's astounding to me that _too little_ censorship is
| characterized as "antithetical to our values", "highly dubious
| ethically", and worthy of potential legal sanction in the top-
| ranked comment on HN.
| paulryanrogers wrote:
| Is it too little censorship, or rather amplifying
| problematic things and suppressing healthier things because
| of perverse incentives? FB and Instagram timelines are
| _not_ raw feeds from one's friends/follows. They are tuned
| by human-calibrated algorithms.
|
| Kids need to eat vegetables and lean protein sources. But if
| school districts instead optimize for profit they may end up
| feeding the kids borderline poison like sodas and candy. When
| companies come to dominate a public space, like huge parts of
| digital comms, then maybe it's OK to demand more responsible
| behavior of them.
| dionidium wrote:
| What people choose to "amplify" is none of the government's
| business. People are allowed to be wrong. Yes, _even if_
| you think it's about something really important.
| philwelch wrote:
| > Kids need to eat vegetables and lean protein sources. But
| if school districts instead optimize for profit they may
| end up feeding the kids borderline poison like sodas and
| candy. When companies come to dominate a public space, like
| huge parts of digital comms, then maybe it's OK to demand
| more responsible behavior of them.
|
| Adults are not children and social media sites are not
| school districts.
|
| The school district analogy also doesn't really hold up on
| its own terms unless you're talking about boarding schools,
| which you probably aren't given the term "school district".
| When I was a kid, I ate at least 2 out of 3 meals at home,
| and more often than not, I brought a sack lunch. I know
| that poorer kids rely on school lunches a lot more than I
| did, but that's still just one meal a day. My high school
| actually did have a Coca-Cola machine, but I think that's
| old enough for kids to start making some of their own life
| decisions, like whether or not to have a Coke with their
| lunch. I mean, high school is around the same time that
| students start planning for their future career and/or
| higher education, so if you can be trusted to decide
| between taking vocational classes and fulfilling college
| admissions requirements, I think you can also be trusted to
| decide whether or not to drink a Coke. 14 isn't that far
| off from 13, which is the legal minimum age to get a social
| media account.
|
| Also, unlike going to school, nobody is forced by the
| government to spend multiple hours a day using social
| media. Of course we regulate schools. We also regulate
| prisons to make sure that prisoners are humanely treated,
| or at least we're supposed to. The better analogy isn't
| school districts but convenience stores, in an alternate
| universe where children under the age of 13 were prohibited
| from entering convenience stores and some people were
| complaining that still wasn't enough.
| [deleted]
| [deleted]
| mrweasel wrote:
| Facebook may not be doing anything illegal, but it is
| immoral. While morality is subjective, and not enforceable,
| the public needs to know what is happening, so they can
| make up their minds about supporting a given company.
| chasd00 wrote:
| I agree: legislating morality has never worked. However,
| legislation to inform the consumer has worked.
|
| Social media should be forced to inform the consumer when/how
| they're being targeted. When a user is shown 15 pieces of
| content it should be crystal clear the platform is trying to
| tease out an emotional response from them and not just
| showing them their friend's posts. Maybe a warning label like
| "This content was algorithmically curated to elicit the
| maximum emotional response from you".
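The disclosure proposal above can be sketched in a few lines: a hypothetical renderer that prepends the suggested warning label whenever an item was algorithmically curated rather than simply shown from the user's follow graph (the type and field names here are invented for illustration).

```python
from dataclasses import dataclass

@dataclass
class FeedItem:
    text: str
    curated: bool  # True if an engagement model selected/boosted this item

# The label text proposed in the comment above.
LABEL = ("This content was algorithmically curated to elicit "
         "the maximum emotional response from you")

def with_disclosure(item: FeedItem) -> str:
    """Render a feed item, prepending the warning label when the
    platform (not the user's follow graph) chose to show it."""
    if item.curated:
        return f"[{LABEL}]\n{item.text}"
    return item.text

print(with_disclosure(FeedItem("A friend's vacation photos", curated=False)))
print(with_disclosure(FeedItem("Recommended: divisive hot take", curated=True)))
```

The hard part of such a rule is not the rendering but the definition of "curated": any ranking, filtering, or boosting step would have to set that flag honestly for the label to mean anything.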
| onemoresoop wrote:
| Facebook is not doing anything illegal only because they
| are in a relatively new space the law hasn't caught up with
| yet.
| Simplicitas wrote:
| Hmm ... I'm personally getting tired of this "well, it's
| currently legal" argument. Law changes start with moral
| indignation, and we are at that juncture now. Although the
| point is accurate, let's set the legality question aside.
| shmatt wrote:
| There is something that doesn't sit right with me about the
| whistleblower. Yes, she is a data scientist, which gives her
| many extra points. But she was a PM for 2 years in FAANG,
| and her actual view of what was going on in FB as a huge org
| was narrower than her commentary suggests (the "I want to
| fix FB" quotes from 60 Minutes).
|
| Everyone I know who works in FAANG, FB specifically, and
| PMs even more specifically, really has little idea of
| what's going on in the bigger picture. They kind of
| understand their little piece of the puzzle, but even then
| things can get ambiguous sometimes (and this is
| intentional, as I understand the inner workings at FB)
|
| Bringing her into the media and Congress as this star
| witness that understands exactly what's going on seems
| misleading. People from outside this world are taking her
| word as if we were hearing from someone in the C-suite or
| an executive who has been with the company for 10+ years
|
| That, and she seems to falsely claim she is an officially titled
| co-founder of Hinge on her Linkedin
| unethical_ban wrote:
| >Everyone I know who works in FAANG, FB specifically, and PMs
| even more specifically, really have little idea of whats going
| on in the bigger picture.
|
| First, I disagree with your assertion, based on my
| experience in, well, working anywhere. Two years as a PM in
| the field she is blowing the whistle on does not amount to
| a lack of credibility (others are saying her PM experience
| at FB dealt with exactly the kinds of issues that would
| make her well informed).
|
| Second, if what you say is true, then what kind of Hell is
| Facebook? This behemoth of a company, this pillar of society is
| somehow so large that only a handful of privileged overseers
| can possibly understand its mechanics?
|
| Tear it down, then.
| farcebook wrote:
| So, what, we should just wait until Zuckerberg decides to get
| his act together one day and admit he's been a wee bit evil?
|
| The whole point of whistleblowing is so the average worker
| can tattle on the unethical decisions made by leadership.
| The C-suite isn't going to volunteer for the guillotine.
| aaroninsf wrote:
| There is not much mystery here, IMO.
|
| The obvious options:
|
| - she went in with an agenda from the beginning, intending
| to get this stuff
|
| - she went in, discovered that the sociopathic pattern
| she'd been warned about and had nagging doubts about was
| worse than she thought, got religion, and got the stuff
|
| As an aside,
|
| Facebook hires smart. Both of these and her success are
| consistent with a smart and motivated person. Isn't that what
| we're supposed to be selecting for? Isn't drive and high
| functioning what Facebook is filtering for?
|
| More: isn't setting aside moral misgivings and agonizing, in
| pursuit of achievement of your mission, EXACTLY what Facebook
| is trying to filter for...?
|
| Always a shame when the sword cuts the wielder!
|
| But back to the point: does it matter which of these is true?
|
| Not to me, or to democracy; the bottom line is that it takes an
| action like this to force the endless malfeasance, amorality,
| and actual destructive behavior into public consciousness. Not
| least when fighting a machine that seeks to stifle criticism
| and control narrative: this is indeed exactly what she is
| bringing receipts on.
|
| The supposition seems to be that she is a plant, sent on a
| mission to bring down Facebook.
|
| Let's say that's the case.
|
| I have no issue with that, as Facebook belongs down.
|
| I don't care who paid for this skullduggery--especially if it's
| the US taxpayer.
|
| The GOP has done everything it can to curtail the ability of
| the state to challenge the power of accumulated capital.
|
| I would applaud game-leveling, asymmetric-warfare-style
| methods to do the state's work.
|
| Indeed, this would be a remarkable and rare return on taxpayer
| money, should it bring about the dismantling of their
| profoundly caustic monopoly.
| gameswithgo wrote:
| So who do you propose to bring in and testify against Facebook
| who does understand the bigger picture, shmatt?
|
| Are you arguing in good faith here?
| koheripbal wrote:
| At minimum it needs to be someone who is aware of the company
| strategy.
|
| She simply doesn't have any relevant knowledge.
|
| This is all politics.
| the_snooze wrote:
| "Oh, they're a Corporal? They don't know the strategy."
|
| "Oh, they're a General? They don't know what's actually in
| the trenches."
| Graffur wrote:
| Question both then?
| shmatt wrote:
| Like I've written in other comments:
|
| She has proven evil happened; we're now asking who directed
| it. The first thing we need to know is who she reported to,
| and who the most senior person in her team's meetings was.
|
| From there, just keep going up; it will stop at some point
| (I have no doubt there will be Zuck martyrs, just wondering
| at which level)
| walrus01 wrote:
| > Everyone I know who works in FAANG, FB specifically, and PMs
| even more specifically, really have little idea of whats going
| on in the bigger picture
|
| That's exactly the thing about the banality of evil: to any
| individual person working on some small subsystem or
| component at a FAANG, it might not immediately be apparent
| that what they're doing is enabling the corrosion of
| democracy and discourse, and the monetization of people's
| attention into endlessly scrolling social media feedback
| loops.
|
| It's the end result of all the work of hundreds, or thousands
| of 'engineers' and 'product managers' in facebook working
| together on their own projects, combined together in aggregate,
| that turn it into a monster.
|
| You can also see the same thing in hardware and software
| engineers that work on some small discrete component of some
| piece of equipment in the defense industry. They might not
| agree with the total product as it's used in the real world
| (precision US weapons sold to Saudi Arabia and deployed in
| Yemen, for instance). But all that one engineer sees is the
| small item or subsystem that is within their scope of work.
| aierou wrote:
| You just compared a social media company to a weapons
| manufacturer.
|
| Elsewhere, I see comparisons to cigarettes, oxycontin,
| gambling.
|
| It is easy to see how one might think Facebook to be evil if
| they are constantly bombarded with these false equivalencies.
| gameswithgo wrote:
| Nothing false about those equivalencies. In the case of
| cigarettes and Facebook you have the exact same practice of
| optimizing for addiction with no regard for the harm it
| does.
| entropicdrifter wrote:
| Not to mention targeting children specifically to foster
| deeper, longer-term addiction
| OvidNaso wrote:
| You really can't see the comparison to gambling? Not so
| long ago most would have thought it absurd to put that on
| the list as well.
| JohnFen wrote:
| How are they false?
|
| The similarities being talked about center around those
| companies specifically leveraging the mechanisms of
| addiction in order to make more money in spite of the fact
| that doing so harms people.
|
| I think a very good argument can be made that Facebook
| (among others) are doing this very thing.
| aierou wrote:
| Let me put it this way: how would you go about measuring
| the harm that Facebook or any other social media company
| causes?
|
| With cigarettes, oxycontin, or weapons manufacturers, you
| can look directly at the physical harm they cause. We
| have statistics and studies that require little effort to
| interpret.
|
| With social media, it is nearly impossible to connect
| physical or psychological harm to a platform over the
| course of someone's life. The papers that draw
| conclusions in this space are based on limited studies
| and polls that could be influenced by any number of
| external factors. We judge these things based on little
| more than feeling.
|
| Now, I think it would be great to be able to make
| informed decisions based on diligently collected data
| (and maybe that's what we should be fighting for) but we
| don't have that right now. Why, in this case, do we seem
| eager to throw scientific rigor and frankly due process
| out the window?
| kaibee wrote:
| > With social media, it is nearly impossible to connect
| physical or psychological harm to a platform over the
| course of someone's life. The papers that draw
| conclusions in this space are based on limited studies
| and polls that could be influenced by any number of
| external factors. We judge these things based on little
| more than feeling.
|
| Yeah, you'd need to run some kind of A/B test on
| unsuspecting users and see whether you can make their
| mental health worse. Fortunately, this sort of thing would
| never pass an ethics review board. Unfortunately, Facebook
| either didn't have one at the time or didn't listen to it,
| because they literally did exactly that.
| JohnFen wrote:
| The misbehavior of Facebook has been well documented and
| has been going on for many years. Facebook has been
| shifty and deceptive in their responses, and there is
| literally no reason to give them the benefit of any doubt
| at this point.
|
| Your comments about hard data are well-taken, but you
| talk as if we have no, or very little, evidence that harm
| has been (and continues to be) done. I think that we have
| a lot of evidence to indicate that there's a real problem
| here -- and one that Facebook continues to downplay. None
| of that evidence is as clear-cut and solid as physical
| injury is, but that's to be expected with social harm.
|
| > Why, in this case, do we seem eager to throw scientific
| rigor and frankly due process out the window?
|
| I'm not eager for that at all. On the other hand, the
| evidence we do have very strongly indicates (but does not
| prove) that there's a real, serious issue here. Are you
| suggesting that we should ignore that? If we waited until
| there was zero uncertainty on things before taking
| action, the world would be unacceptably dangerous.
|
| Facebook could have helped on this front by being honest
| and taking the issue as seriously as they take profit-
| generation. But they chose not to, and now, after so many
| years of deceptive and abusive behavior, we have no
| reason to trust them anyway.
| dntrkv wrote:
| The discussion around FB, and social media in general, has
| gone off the deep end. There is no nuance. Facebook is
| Hitler, Philip Morris, and Purdue combined.
| plaidfuji wrote:
| The comparisons are completely relevant. All are companies
| that provide a product with significant negative societal
| externalities. The only difference is that social media has
| yet to be regulated in any way.
| silverlake wrote:
| She collected all this info from Facebook Workplace. Nearly
| everything is available there. It's very open.
| 1024core wrote:
| From her profile:
|
| > In June 2019, she joined Facebook. There, she handled
| democracy and misinformation issues, as well as working on
| counterespionage as part of the civic misinformation team,
| according to her personal website.
|
| So she's not just a random "PM" in Facebook; she was intimately
| involved in what she's talking about. Please stop trying to
| spread disinformation.
| rightbyte wrote:
| Maybe she read the internal docs and meeting notes? Most
| employees wouldn't notice a proverbial elephant farm in
| another department as long as it didn't involve them
| directly, even if it was advertised in emails and on
| billboards ...
| unsui wrote:
| This is textbook ad hominem:
|
| https://en.wikipedia.org/wiki/Ad_hominem
|
| > attacks the character, motive, or some other attribute of the
| person making an argument rather than attacking the substance
| of the argument itself
|
| You question her motives and qualifications rather than the
| claims themselves.
| shmatt wrote:
| No. I want to see C-level and senior VPs, and those around
| them, start answering questions. Now that it's in Congress,
| subpoenas are pretty easy
| elliekelly wrote:
| Lucky for Zuck he's spent the last decade cultivating lots
| of relationships in DC...
| BitwiseFool wrote:
| To me at least, something feels very off about this whole
| situation. Out of the blue some larger-than-life person comes
| out of the woodwork and is lauded with attention while the
| big news outlets make this massive push against Facebook, all
| the while Congress is holding hearings and a massive outage
| happens at Facebook right after the New York Times published
| an article titled "Facebook Is Weaker Than We Knew."
|
| Edit: She is _remarkably_ calm, well-spoken, knowledgeable,
| and articulate for someone testifying before the Senate for
| the very first time - all while being broadcast around the
| globe, live on television. Perhaps she's simply a natural,
| but I sense she received some coaching and preparation
| beforehand.
|
| Combine all of this with politicians chomping at the bit to
| fight online 'misinformation' and I become very skeptical.
|
| It certainly could all just be a perfect storm and Mark Z.
| has some terrible luck. But again, my intuition is telling me
| there is something coordinated going on here.
| 6gvONxR4sf7o wrote:
| > Bringing her in to the media and Congress as this star
| witness that understands exactly whats going on seems
| misleading
|
| I mean, they tried bringing Zuckerberg in, but he apparently
| just lied, so she seems like the best option to get honest
| informed answers.
| kentonv wrote:
| > But, she was a PM for 2 years in FAANG.
|
| More like 15 years. I've known Frances since 2009 when she was
| a PM at Google, where she worked on a few products including
| Google+. She also spent several years an Pinterest which, while
| not technically a FAANG, is certainly a major social network
| that does lots of algorithmic ranking. She is definitely an
| expert in the topics she is testifying about.
| Graffur wrote:
| I'm not surprised G+ wasn't brought up earlier. What a
| failure
| shmatt wrote:
| My point was: people who work for 2 years at a FAANG, in 1
| company, are far from understanding the complete scope. I
| did not mean she was 2 years into her career.
|
| It can take 6 months to just fully understand what a 7 person
| team does
|
| After 2 years at Facebook, does she really understand the
| full strategy 4 levels above her? Her leak is super
| important, but she doesn't have all the answers the media is
| claiming she does
| kentonv wrote:
| I think her fundamental argument isn't even really about
| Facebook specifically. It's about algorithmic content
| ranking being intrinsically harmful to society. The thing
| that's specific to Facebook is that they actually have a
| ton of research quantifying this -- which she has released
| -- and yet they are still all-in on it.
|
| It doesn't seem like there's a lot of complexity to
| understand here about how Facebook works as a company.
| Their commitment to optimizing metrics over all else is
| well-known already. The interesting thing is the research
| showing how harmful that is.
| jasondigitized wrote:
| The importance here is she understands the B2C model for
| revenue, which is to test and test some more and let the data
| optimize your funnel. That's the story, and that method has lots
| of unintended chickens coming home to roost.
| alfalfasprout wrote:
| Well, yes and no.
|
| While as a PM (or IC) she'd be primarily working on a scoped
| area of work, the reality is also that she likely didn't come
| to learn all this on her own. At nearly any large tech company
| it's pretty easy to get in touch with someone on another team
| and learn more about their work, etc.
|
| The fact that she's been able to get all this context and
| significant supporting documentation points to her not having
| gone about this totally solo (despite her claims).
|
| There's also something to note about things being intentionally
| ambiguous-- that's by design to prevent most ICs from putting
| together broader context. But it's not clear that it would
| prevent a highly motivated employee from amassing broader
| context. Like I said before, with even a bit more context other
| ICs can be looped in to fill in the gaps.
|
| You're giving Facebook far too much credit here.
| aardvarkr wrote:
| FYI the lady was the Product Manager and ran the entire Civic
| Engagement group. She's not just a project manager working on
| a team of devs.
| shmatt wrote:
| >FYI the lady was the Product Manager and ran the entire
| Civic Engagement group. She's not just a project manager
| working on a team of devs.
|
| See, i'd just like to hear more of this
|
| * How many direct reports did she have?
|
| * What was her official FB level?
|
| * What level of management were in meetings with her?
|
| * Who did she directly report into?
|
| All really good context most people are ignoring
| nitrogen wrote:
| Every small engineering team I have worked on had its own
| pro_duct_ manager.
| alfalfasprout wrote:
| Right, but my point is that even if she ran the entire
| civic engagement group it's naive to think she couldn't
| find out more about what's going on in other orgs if
| properly motivated. She'd have to go out of her way to do
| it and do a fair amount of digging but it's far from
| impossible.
| OldHand2018 wrote:
| Let me ask you something: How much of the "bigger picture" did
| Snowden have?
|
| Because, you know, one person is male and the other is female,
| and I'd really like to believe that gender has nothing to do
| with it.
| shmatt wrote:
| They pretty much seem the same to me.
|
| Same with Snowden. The person who needs to answer the
| question is 10 levels above him
|
| She is showing us "evil stuff happened"
|
| The question we're asking is "Did evil stuff happen on
| purpose, and who directed it"
|
| Both Snowden and she can't honestly answer that question
| (Snowden wasn't really an official employee IIRC)
|
| My issue is her portrayal not by herself but mostly by the
| media like she can prove the latter
| aardvarkr wrote:
| What's your point here? This lady is a PRODUCT manager and heads
| up the entire product, in this case Civic Integrity, which was
| directly charged with tamping down hate on the platform related
| to civic engagement. She is not a PROJECT manager like what you
| deal with on your engineering team. She is by definition an
| expert on this segment of Facebook and you're dismissing her
| wrongly for being a "PM".
|
| She's a Harvard MBA with a degree in Computer Engineering and
| experience at Google and Pinterest. She's qualified to have a
| pretty significant role in the company and rightly so earned
| the job that she has.
|
| EDIT: I looked into the claim that she co-founded Hinge and
| it's mostly accurate. She worked with Justin McLeod (Hinge CEO)
| to build "Secret Agent Cupid" which was Hinge before it
| rebranded with the launch of its new mobile app.
| blahblah123456 wrote:
| The PM people usually deal with on eng teams are product
| managers and people do call them PMs.
|
| Lots of junior people also have these educational
| credentials.
|
| Not saying she isn't senior, but these two things mean
| nothing.
| intended wrote:
| This is simply forgetting the context.
|
| Someone said she doesn't know firm strategy.
|
| The response was that she is one of the best people to
| speak on this topic.
|
| And then your response dismisses the value of her
| credentials.
|
| In context, her credentials rebut the claim that she has no
| perspective. A Harvard MBA and a data science degree give her
| a unique and valid perspective on what she is talking about.
|
| Further, I have watched the whistleblower's deposition, and she
| has kept her statements within the ambit of what she knows.
|
| Additionally I personally have a position to know this
| particular space and she has repeated things that many
| already know.
| tinyhouse wrote:
| Her engineering degree from Olin is more impressive than her
| MBA from Harvard. If you highlight just one then highlight
| that. Every idiot with money can get an MBA from Harvard
| (this is going to be down-voted badly I know; but don't get
| me wrong, I worked with a lot of very smart MBAs but also
| with many idiots; the distribution is very wide)
|
| Also, a Facebook PM is not a senior role. While she does have
| a lot of experience, I doubt she led anything significant at
| FB with a PM role. With her experience I would expect her to
| be at least a senior PM if not a Principal. Probably didn't do
| well in the interview, or maybe there were concerns about a
| lack of promotions anywhere she worked.
| johntiger1 wrote:
| Agree with the Olin point. But disagree on the scope
| argument; I agree with GP about banality of evil - doesn't
| matter how small or minor it is
| [deleted]
| koheripbal wrote:
| The issue is that her experience at Facebook did not put her
| in a position to know the company strategy.
| crooked-v wrote:
| The company strategy seems pretty self-explanatory from the
| documents that she released.
| intended wrote:
| Are there any specific parts of her testimony that you disagree
| with?
|
| From what I have seen, everything she has said tracks
| correctly, and does not go outside her realm of knowledge.
|
| She has limited her focus very clearly, for example on FB
| prioritizing reshares vs a known increased risk of violence.
|
| So she is in a perfect position to say that FB has chosen to
| prioritize growth over user safety.
| trompetenaccoun wrote:
| This story is on the current frontpage with 3 separate threads,
| two even linking to the same outlet (NPR). When you go into the
| threads, criticism of this call for greater government
| regulation of speech is met with personal attacks and other
| fallacious arguments. Might just be humans being overly excited,
| but this sure doesn't seem very organic.
| nanidin wrote:
| My $0.02, but it seems like the moderation team (dang in
| particular) has been less active over the last 24 hours.
|
| Usually there are reminders in large threads to click the
| "more" button to see all comments, but there was no such
| comment on the huge "Facebook-owned sites were down" post
| yesterday. They usually also merge duplicates and add top level
| comments with links to the other conversations taking place on
| the same topic.
|
| The moderation team usually does a lot to rein in problem
| behavior, but there seem to have been a lot of questionable
| comments making it through lately.
|
| I have also noticed a lot of comments by users with "throwaway"
| in their usernames - seems like a shift in who is using HN and
| how they are using it.
| jasondigitized wrote:
| Now do the same for CNN, Fox News and every other sophisticated
| media company who have all created Skinner boxes to get more
| eyeballs. The entire system needs a hard examination.
| afavour wrote:
| Oh come on, they're not comparable.
|
| If you turn on CNN or Fox News right now I know exactly what
| you'll see. Because it's exactly what I'd see if I did the same.
|
| What's on your Facebook feed right now? No clue. How are
| different types of content weighted? No way of knowing.
|
| Facebook holds a _lot_ more secrets than any traditional media
| company about what they're showing their users. Leaks from FB
| are much more valuable.
| jasondigitized wrote:
| What will you see? 24-7, Emotionally charged "Breaking News"
| which is simply a different fruit from the same horrible
| tree. CNN and Fox News are just using different methods based
| on the data they have available which comes from focus
| groups, Nielsen, and a whole cottage industry that most
| people are unaware of.
| giantrobot wrote:
| You're doubling down on your whataboutism and deflection.
| The key difference between Facebook's "news" feed any
| mainstream television and print news is visibility.
|
| If I record every minute of cable news and subscribe to a
| bunch of newspapers I can tell you the exact contents of
| each. If any of them wants to espouse a narrative it's
| pretty easy to see just by looking at what they've
| aired/printed.
|
| That's not possible to do with Facebook, unless you _are_
| Facebook. Every user's "news" feed is slightly different
| based on their behavior/relationships tracked on Facebook
| and through their massive advertising network (off
| Facebook).
|
| I put "news" in quotes because Facebook tunes the contents
| of their feeds for monetization rather than any semblance
| of truth or accuracy. Facebook is awash in literal fake
| news, as in completely made up "news" articles, because
| they peak in some engagement metric.
|
| An actual news program on cable or non-opinion news article
| in a paper can't get away with outright lies. They also
| would be liable for outright defamation.
|
| For all the ills and failings of cable news and newspapers
| they are nowhere near as toxic or fundamentally broken as
| Facebook's "news" feeds. They're not even in the same
| ballpark which makes your whataboutism really puzzling. Do
| you honestly not see the difference?
| jasondigitized wrote:
| You are pointing out scale and impact which I agree with.
| I am pointing out economics. Both have incentives which
| are driving outcomes that are undermining civility and
| cooperation.
| afavour wrote:
| > What will you see? 24-7, Emotionally charged "Breaking
| News" which is simply a different fruit from the same
| horrible tree.
|
| Right, but _we know what they are showing people_. We have
| no idea what Facebook is showing people. There's no
| equivalence there. Yes, both are bad, but in the context of
| these leaks equating Facebook to CNN is delusional
| deflection.
| jasondigitized wrote:
| It's not deflection. You are pointing out magnification.
| It's the same concept. Facebook just has far greater
| granularity in their ability to target and deliver
| personalized content. CNN and Fox News are still
| segmenting content and targeting users. That's why they
| can both coexist. I am not arguing that Facebook isn't
| far more sophisticated. Newspapers < CNN < CNN.com <
| Facebook.
| unsui wrote:
| whataboutism...
|
| those deserve their own attention on their own merits.
|
| But not directly relevant to this discussion, particularly if
| it defers or interferes with the FB discussion specifically.
| q1w2 wrote:
| I suspect the reaction from a lot of companies is going to be
| to lock down internal documents under a need-to-know security
| model.
| CosmicShadow wrote:
| Easier said than done though. My wife works for a massive
| company that now makes you classify every email as external,
| internal, or confidential and after numerous emails, training
| and constantly calling people out on things, nobody can still
| figure out the difference (and thus marks everything far less
| secure than it is), despite it being trivial.
| fullshark wrote:
| Well, if the precedent from this is that any employee can leak
| whatever they want, violating their NDA, and claim the company
| is committing securities fraud by having secrets, why wouldn't
| they?
| throwaway6734 wrote:
| And greatly reduce their efficiency
| amoshi wrote:
| That's pretty much what happened after Snowden, everything is
| much more locked down, access is tightly controlled and
| employees are more closely monitored.
|
| That's why I'm glad he released as much as he did, any
| followup whistle-blower leak is bound to be much much harder.
| cratermoon wrote:
| That also happened somewhat under Sarbanes-Oxley and HIPAA.
| Documents that might have once sat in an unlocked filing
| cabinet are now locked in vaults.
| daniel-cussen wrote:
| That helps the public good too.
| colpabar wrote:
| My tinfoil theory about the outage yesterday is that it was
| done on purpose, as a way to create an opportunity to "hide"
| as many internal documents as possible without anyone
| noticing.
| bellyfullofbac wrote:
| First reaction: That really is tinfoil, if you don't
| include how the outage would help in that process..
|
| Second reaction: Huh, employees were locked out of their
| offices, VPN was surely down, yeah if there was someone
| inside the data centre deleting files off their Intranet no
| one from outside the data centre would be able to notice,
| due to the lack of connectivity.
|
| But it would have to be a very good scrubbing and people
| would notice things missing anyway: "Hey, didn't there use to
| be a PDF here?" Hah,
| https://en.wikipedia.org/wiki/Memory_hole
| lrem wrote:
| Man, you could delete random 200 documents I worked on at
| least 3 months ago and I would only notice if I followed
| a broken link...
| 5faulker wrote:
| The system's imploding.
| RobRivera wrote:
| i disagree
| [deleted]
| the_snooze wrote:
| This is straight-up deflection. It makes sense to start
| somewhere, so why not the biggest player in town? That would be
| Facebook, with billions of active users. None of those cable
| channels are even close to 100s of millions of viewers.
| justicezyx wrote:
| No, calling out the more fundamental and broader problem is
| the first step to addressing the root problem.
|
| Otherwise, you are just whacking moles, and pretending that
| one day there would be no more moles...
|
| Or rather, you indeed are OK with a constant number of moles
| indefinitely. That's OK. But the premise of the parent is
| obviously that the moles are growing too numerous and are not
| staying constant at all...
| 6gvONxR4sf7o wrote:
| I mean, I think this should be an 'and' thing, not an 'or'
| thing. Don't deflect from FB. But when FB's dirty laundry is
| aired, air these other groups' too.
| asdff wrote:
| Easier to set a precedent against one org and use that to
| enforce good behavior towards other orgs in the future.
| It's how we've always gone after corrupted industries in
| this country. If you go after every bad egg at once you
| aren't going to win.
| 6gvONxR4sf7o wrote:
| I don't think we're disagreeing.
| hunterb123 wrote:
| Mainstream media and social media are recursive.
|
| Social media shares mainstream media stories.
|
| Mainstream media stories use social media as sources.
|
| Social media shares the social media source as a story.
| cratermoon wrote:
| But which of the those two have made doing the harmful
| amplification for profit a business model?
| hunterb123 wrote:
| The mainstream media, of course. FB makes money on ads.
|
| FB's media is organic from users. The MSM is
| orchestrated.
| cratermoon wrote:
| And how do they make money on ads?
| AzzieElbab wrote:
| _None of those cable channels are even close to 100s of
| millions of viewers._ That is not fair at all. TV networks
| are local, there are only 330M ppl in the US. Also, you are
| not counting how many _viewers_ TV networks and their
| employees reach via FB/Twitter.
| fullshark wrote:
| Considering the claim that FB committed securities fraud
| based on the evidence we've seen so far is pretty laughable,
| it appears her testimony is designed to encourage new laws
| and regulations for (social in particular) media companies.
| As such a conversation about the industry at large and its
| practices makes sense.
| softwarebeware wrote:
| Let's assume Facebook does invest into policing their platform to
| the extent necessary for it to not result in political
| consequences ("misinformation and disinformation"). There's still
| the rest of the internet. There's still all the open-source tools
| available that can be used for good or ill. The problem is much
| larger than Facebook.
| grappler wrote:
| The headline on this hacker news entry says "Incriminating". I
| didn't find that word anywhere in the linked npr article. Is what
| has been leaked in fact incriminating?
| PrinceRichard wrote:
| "Facebook does not moderate content in a way that I approve of"
| is a long way off from "incriminating"
| plandis wrote:
| If you don't like Facebook just don't use Facebook, WhatsApp, or
| Instagram. Don't know why so many people here think parents and
| adults are not capable of regulating themselves and their kids.
|
| It's honestly condescending to think you know better than your
| fellow citizens if you're arguing for government intervention.
| bbarn wrote:
| There are places in the world where these services are the only
| option if you wish to engage in the US equivalent of simple
| text messaging. I have been "that guy" that doesn't want to use
| the platform the rest of the social group uses, and it sucks.
|
| Is it so much to ask for honesty from a company these days?
| Steltek wrote:
| Pardon but network effect much? In my area, FB is a huge
| resource for community groups and news. It would probably make
| life as a local parent more difficult to unilaterally cut off FB.
|
| It's a good market opportunity actually. Too bad NextDoor is
| basically known as "racist people alerting their neighbors that
| a black person is out walking their dog" or some nonsense.
| notacoward wrote:
| "If you don't like vaccines..."
|
| At first I thought that was a bit _too_ facile, they have
| nothing to do with each other, but ... is that true? Other
| people using Facebook to spread hate and misinformation
| (including vaccine misinformation) _does_ affect me. They
| affect who gets elected (or appointed) and what policies get
| enacted. We're already seeing real tangible effects at the
| state level, and with the 2022 elections we might see more at
| the federal level. (That's just the US. The same is absolutely
| true elsewhere, but it's harder for me to come up with examples
| that are both accurate and familiar to most readers.) In the
| sense that they both affect public health -
| political/economic/social in one case, physical in another -
| hate/misinformation and vaccine refusal _are_ similar. And for
| the same reasons, "if you don't like..." is an unhelpful
| response.
| CountDrewku wrote:
| Exactly how does it affect you if you're vaccinated?
| notacoward wrote:
| What @heartbreak said, but also far more. Vaccines aren't
| 100% so the continuing circulation of the virus will cause
| some number of those to get sick. Then there are variants.
| Kids being sent home because someone else tested positive.
| Travel restrictions. The list goes on. _None_ of this
| should be news to anybody who has actually been paying
| attention, and such a person better not be saying "do your
| own research" to anyone else.
| heartbreak wrote:
| Medical resources are finite, and there are people who need
| non-Covid medical care competing for those limited
| resources.
|
| Getting pretty tired of having the same argument with
| people who are smart enough to figure this out on their
| own.
| dirkt wrote:
| While I agree, and while I do that myself, the social pressure
| to use Facebook/WhatsApp/Instagram is immense. And it's worse
| for kids.
|
| I have to constantly explain to people why I don't use those,
| and they still keep trying to convince me that I should.
| 015a wrote:
| If you don't like Oxycontin, just don't use Oxycontin,
| Dilaudid, or Fentanyl. Don't know why so many people here think
| adults are not capable of regulating themselves.
| CountDrewku wrote:
| You realize that ending the drug war is largely favored by most
| people on here, right?
| bestcoder69 wrote:
| I haven't polled HN, but I get the sense it's more about
| regulating formerly black market drugs, rather than
| deregulating pharmaceuticals.
|
| Anyone on the 2nd side should do a deep dive on buying
| "clean" delta 8 THC.
| [deleted]
| an-allen wrote:
| Chemical Addiction != scrolling facebook
| 015a wrote:
| Of that, we are in agreement; one is popping a pill which
| releases chemicals in your brain which over time correlate
| with a measurable reduction in quality of life; the other
| is interacting with an app which releases chemicals in your
| brain which over time correlate with a measurable reduction
| in quality of life.
|
| Definitely on the same page; not a strict mathematical
| equivalence.
| AlexandrB wrote:
| Maybe gambling is a better analog. Last time I checked
| gambling was pretty tightly regulated.
| bigphishy wrote:
| Excellent analogy.
| sentinel wrote:
| Not a better analog at all.
| unethical_ban wrote:
| The reward system is quite similar, in my opinion.
| jeffrallen wrote:
| Facebook's researchers found the opposite...
| karaterobot wrote:
| And yet, regulating off-label usage of opioids doesn't seem
| to be very effective at reducing addiction either. So perhaps
| this analogy doesn't work, and we should acknowledge that
| using social media and becoming addicted to opioids are
| substantively different?
| CountDrewku wrote:
| Because they're authoritarians. There's a large cultural push
| by people to use government to bludgeon everyone else with
| their values.
|
| I'm not sure when it switched, but it seems to be more and
| more acceptable to force people into acting a certain way. It's
| really anti-American and anti-progressive imo.
|
| A government-controlled social media ripe for propaganda seems
| far more terrifying to me than what's currently on facebook.
| [deleted]
| brap wrote:
| I had to read through way too many comments to get to a sane
| one.
| unethical_ban wrote:
| We haven't banned cigarettes or alcohol, but we have put
| labels, restrictions and public ad campaigns into place to
| govern the behavior of their sellers, and to inform the public
| of their dangers.
|
| I think a society where government does not have a
| responsibility, as the union of the people, to help inform
| citizens of the dangers of addictive products and to regulate
| unethical behavior by the sellers of such products would be a
| terrible place.
|
| Your perspective completely ignores the intentionally addictive
| design of the products, and the network effects of having
| everyone and every business you know also using it.
| lr4444lr wrote:
| There is no amount of tobacco, and probably no amount of
| alcohol, that has any health benefits. It's completely toxic.
|
| What you're advocating is more like suggesting we might also
| consider putting labels on cheese about the risk of eating
| too much saturated fat, or heck, a warning on most green
| vegetables for people taking Coumadin and other blood
| thinners.
|
| In any case, the warning labels on the things you mention are
| not placed there on the basis of addictive qualities.
| CountDrewku wrote:
| The problem with that is what is and isn't disinformation
| isn't as clear cut as alcohol/tobacco being harmful for your
| health.
|
| The former could be easily abused to create a certain
| narrative.
| yumraj wrote:
| I think FB is having its MySpace moment.
|
| From what I remember, right around the time FB was picking up,
| MySpace was facing a lot of scrutiny especially around predators
| preying on children, which was one of the main, if not the only,
| reason for MySpace's downfall.
|
| The time is ripe for a competitor to enter this space.
| throw_m239339 wrote:
| > I think FB is having its MySpace moment.
|
| No. Beyond the USA, Facebook and its app ecosystem are used by
| people all over the world as a way to do business. This is
| especially true in developing countries: people sell goods via
| Facebook, advertise, and communicate with their customers via
| Facebook... MySpace never had as much reach and was never
| central to anybody's life. For many people all over the planet,
| Facebook and Whatsapp are the only apps they use. A lot of
| people on HN, because they are westerners, completely fail to
| understand that and only see how Facebook "fails to moderate
| the speech they don't agree with".
|
| > MySpace was facing a lot of scrutiny especially around
| predators preying on children, which was one of the main, if
| not the only, reason for MySpace's downfall.
|
| No, Myspace failed mostly for racist reasons, ironically, when
| young white educated people left Myspace for Facebook when the
| latter was deemed, and I quote, "less ghetto".
| binarymax wrote:
| Meta: be warned that anyone who is anonymous and commenting on
| the validity of the whistleblower may be speaking in the
| interests of Facebook and spreading disinformation.
| erehweb wrote:
| The link claims that thousands of documents were shared. I
| wonder if we will get a list of what these were at some point -
| perhaps just titles if not the full docs.
| intended wrote:
| A frustration is how often these discussions are
| America-centric, and the whistleblower herself pointed out how
| little integrity work is done on most other languages - that
| integrity software is not even pushed to those regions. The
| specific example discussed was Ethiopia, where Facebook has
| integrity teams(?) for 2 out of 5 languages.
|
| However, it's remiss to say this is just a FB thing. Try
| working on hate detection in any complex region: the ease with
| which you can do hate speech detection in English (with all
| its caveats) pales in comparison to working on commonly shared
| content in other regions.
| betwixthewires wrote:
| Imagine unironically whistleblowing and stoking outrage that the
| censorship isn't enough.
|
| It is bad that the company knew that Instagram is harmful to
| teenagers' mental health and still marketed to them anyway. But
| "Facebook doesn't do enough to stop the spread of misinformation"
| is the most ridiculous narrative on something like this I've ever
| heard, and honestly I'm not surprised such a narrative got a 60
| minutes special. The problems I have with Facebook are the _real_
| problems, like how it is designed to get people addicted and
| steal their lives from them, how Instagram is designed to instill
| envy so as to maximize use. These are real problems, bad
| problems. "Facebook isn't doing enough to prevent vaccine
| hesitancy" is not on the radar of anyone who seriously cares
| about the effect Facebook is having on our societies; the only
| people hammering on this narrative are misdirecting at best,
| and most likely working a propagandistic angle.
| bestcoder69 wrote:
| Anecdotally, all of the people I know who are addicted to FB
| (and they were before COVID) are also all anti-vax.
| betwixthewires wrote:
| For me it ranges. Some people are opposed to the covid
| vaccine mandate, some people don't want it for themselves,
| some people (I think only one or two) are anti vax types,
| some have it but don't think others should have to and some
| think you're evil if you disagree that everyone should have
| to. Precisely what information people internalize on Facebook
| isn't the common thread there, the common thread is that it
| consumes their social life and has a near monopoly on their
| information availability and therefore worldview. Facebook's
| problems go beyond what information is available on it.
| lurquer wrote:
| A 'whistleblower' who claims FB isn't censoring enough?
|
| Isn't manipulating and curating political views enough?
|
| Gimme a break...
|
| This is like something you'd read in the Gulag Archipelago.
| TheGigaChad wrote:
| Idiot.
| kentonv wrote:
| She is not arguing for censorship at all. She's actually
| arguing that content-based censorship doesn't work because the
| AI algorithms are so inaccurate.
|
| She is arguing against algorithmic content ranking, and in
| favor of chronological feeds, as well as other measures that do
| not attempt to judge the content itself.
| tomcam wrote:
| Can anyone provide links to the actual documents?
| phantom_oracle wrote:
| Firstly, it is sad to say that HN is becoming more negative like
| the rest of the internet. On the front page there are more
| articles devoted to the ever-evolving shit-show of American-
| focused news issues. There are far fewer links to things that
| _I THINK_ HN is more suited for: like the BGP protocol or
| building your own ham radio.
|
| That aside, my theory about whistleblowing is that it is a
| counter-intuitive exercise that results in very little at the
| expense of orgs tightening their security policies. Case in
| point: Snowden and the NSA
|
| Leaks don't seem to happen after the first one. One or two small
| bills to "change a law" doesn't fix an endemic problem.
|
| Facebook will continue after this blip. They have enough money to
| spin the PR in their favor and to grease the hands of their
| political-dependents there in Washington.
| Syonyk wrote:
| Two of the front page articles right now are about BGP - one
| about exploring it, one about _playing Battleship over it._
| That seems... relevant?
|
| But we all have to live in a world influenced by social media,
| Facebook is really the most overtly evil of them (What's Good
| for Zuck is Good for Zuck! seems to be their guiding principle
| lately - anything for more ZuckBucks), and as it comes out that
| they've _known_ that what they're doing is evil, and continue
| doing it? This is relevant tech news.
|
| And, yes, I'm exceedingly "negative" about social media
| anymore. The downsides in terms of ripping apart society
| outweigh the upsides of making a lot of money for a few people.
| bigphishy wrote:
| facebook, inc. is not just an American issue. For years they
| have been nefariously bribing other countries to promote their
| website.
|
| Take for example internet.org's attempted internet takeover in
| India, or, a better example, in Brazil, where facebook, inc.
| bribes local telecom providers to provide WhatsApp access for
| free (users are not charged data usage).
|
| their agenda is clear. abuse and lie through their teeth,
| making as much money and power as possible.
|
| facebook, inc. is a cancer on our global society.
| runawaybottle wrote:
| It's because that blockchain someone made is being used as
| national currency somewhere (and contributing needlessly to
| global energy consumption), and because that photo sharing site
| is causing body image issues, and that ad-tracker is building a
| digital trail of everything you do, and that ML algo is
| identifying protestors, and and and ...
|
| We are not discussing the shit show, we are the shit show.
| tantalor wrote:
| > Please don't complain that a submission is inappropriate
|
| > Please don't post comments saying that HN is turning into
| Reddit
|
| https://news.ycombinator.com/newsguidelines.html
| nostrademons wrote:
| > That aside, my theory about whistleblowing is that it is a
| counter-intuitive exercise that results in very little at the
| expense of orgs tightening their security policies.
|
| Oftentimes this is the point. An organization with tighter
| internal security policies and lower levels of trust internally
| is significantly less efficient. Over time, this leads to them
| being unable to respond to competitive pressures and then
| getting eclipsed in the marketplace. It's not that the
| whistleblower kills the organization, it's that the
| whistleblower triggers the organization into killing itself.
|
| This was the explicit goal of Wikileaks and of Osama bin Laden.
| They knew they couldn't take down governments themselves, but
| they could make government so inefficient that its own citizens
| would take it down.
| vincent_waters wrote:
| What is "incriminating"? It is legal to remove protected speech,
| but it's not illegal to not remove it. COVID and election
| "misinformation" are, for the most part, protected. In fact, the
| actual headline is just "Facebook's new whistleblower is renewing
| scrutiny of the social media giant," with no mention of
| incrimination.
|
| The Hacker News version of the headline is misinformation.
| sizzle wrote:
| She leaked info to the SEC about Facebook materially misleading
| shareholders and investors, which is illegal. Reread the full
| article till the end.
| kjgkjhfkjf wrote:
| I'm not outraged or surprised by the revelations. Everyone knows
| there is toxicity on Facebook and other social media products.
|
| Social media is sometimes toxic because people are sometimes
| toxic. Moreover, people are drawn to toxicity because it is
| grotesquely fascinating. This is also the case for movies, video
| games, and other media; much of it is anti-social and misleading.
|
| Condemning Facebook for the toxicity of some of its users is like
| condemning the manufacturers of mirrors because you don't like
| what you see in them. If you are appalled by society's propensity
| for producing and consuming toxicity, then consider directing
| your attention to shortcomings of our education and healthcare
| systems rather than a company that is simply providing a useful
| service to its users and value to its stockholders.
| marstall wrote:
| Time to talk about Section 230 again?
| siruncledrew wrote:
| I don't care about Facebook, and am not interested in using it,
| but after reading the spiral of consequences Facebook has been
| in, it made me think:
|
| 1. To all the governments that want to tighten more control on
| communication, this is great kindling to show people "we the
| government should further control tech for your own good".
|
| 2. You can hardly get the US gov to agree on anything, but when
| it's about hating each other as much in the digital world as in
| the physical world via a common source, everyone's at attention.
|
| 3. Facebook is so ill-equipped to handle most of the issues that
| were brought forth. The expectation that the gov/people places on
| Facebook =/= reality of what Facebook can deliver. Facebook is
| running around frantically trying to manage an existing mess of a
| switchboard; they are not going to pull off a miracle.
| PragmaticPulp wrote:
| > 1. To all the governments that want to tighten more control
| on communication, this is great kindling to show people "we the
| government should further control tech for your own good".
|
| I'm genuinely shocked that the popular sentiment on HN leans
| toward more government intervention and control of internet
| communications. So many comments here are calling for more laws
| and regulation, but few people can even begin to elaborate
| _what_ they want those laws to do.
|
| If laws are passed, they won't be targeted at a specific
| company, nor will they be limited to specific bad actors on
| Facebook. There is no magic law that makes all of the bad parts
| of the internet disappear without also having some chilling
| effects on the part of the internet that you actually like. If
| anything, large incumbents like Facebook tend to come out ahead
| of the smaller companies when onerous regulations are put in
| place.
| lrem wrote:
| > 3. Facebook is so ill-equipped to handle most of the issues
| that were brought forth. The expectation that the gov/people
| places on Facebook =/= reality of what Facebook can deliver.
| Facebook is running around frantically trying to manage an
| existing mess of a switchboard, they are not going to pull a
| miracle.
|
| Do we expect a miracle? Frankly, a modicum of decency would be
| a huge step forward...
| lr4444lr wrote:
| Heaven forbid parents be held responsible for their kids' mental
| health. We saw this with rock 'n roll, video games, and
| marijuana. Your kid saw someone possibly wearing a more expensive
| outfit on social media? Scarred for life, right? Imposing
| stricter age verification to clamp down on trafficking and the
| like is perfectly reasonable, but this social media demonization
| is the latest moral panic, and I hope it dies down before
| Congress does something stupid.
| endisneigh wrote:
| My issue is that the government shouldn't allow companies -
| especially tech companies - to become so big to begin with.
|
| There are ways the government can begin to minimize the growth
| of said companies: taxes based on the number of MAUs, a strict
| prohibition on acquiring other "social media" companies after a
| certain size, etc.
|
| In my opinion Facebook should be forced to break up WhatsApp,
| Insta, etc. In addition, the new broken up Facebook organizations
| should be taxed heavily (call it a network effect tax, one
| that's progressive and strongly disincentivizes being so huge).
|
| Alternatively, the government could just deem certain internet
| activity "marked" and make all "marked" internet activity
| require payment. This would include pornography, social media,
| etc.
|
| That being said the challenge would be creating a reasonable
| definition of what "marked" includes.
| impostervt wrote:
| What are the potential legal ramifications for the whistleblower?
| chipgap98 wrote:
| I don't think it is what you are getting at, but she may be
| entitled to financial compensation if Facebook is fined as a
| result of her whistleblowing.
|
| > Whistleblower awards can range from 10-30% of the money
| collected when the monetary sanctions exceed $1 million.
|
| [0]: https://www.sec.gov/news/press-release/2021-149
| culebron21 wrote:
| This program says there's an act passed 10 years ago protecting
| whistleblowers who report internal documents to the state.
|
| https://www.cbsnews.com/news/facebook-whistleblower-frances-...
| PragmaticPulp wrote:
| Whistleblowing to the state is different than going on a
| media tour. The protections for the former don't extend to
| the latter.
| propogandist wrote:
| anything FB does against her will undermine the PR damage
| control campaigns underway. They may wait for all this to "blow
| over" and then try to go after her once their very expensive
| campaigns make some impact.
|
| It's more likely that they will collude with other big tech
| firms and lobby for aggressive legislation against
| whistleblowers to prevent something like this from ever
| happening.
|
| Edit -- there are reports suggesting the whistleblower is
| represented by the PR firm where the current US govt press
| secretary held an SVP role, so her case is unofficially aligned
| with the current administration's agenda:
| https://twitter.com/JackPosobiec/status/1445438141775683584
| ahdeanz wrote:
| Coming in with the amplification of alt right conspiracy
| theorists hot take... How meta.
| whatthesmack wrote:
| How is that "alt-right" (whatever that means) or a
| "conspiracy theory"?
|
| It is a direct connection that would be looked into by
| anybody, including Facebook, who is investigating the
| information Frances leaked.
| lr4444lr wrote:
| What could come out in court if they shut her up could be much
| more damaging to FB's reputation than it's worth, and it'd
| probably be hard to find a jury likely to convict her when she
| hasn't directly gained financially from it.
| boringg wrote:
| Depends how she plays it. From a work perspective she sadly
| might have a tough time getting employed at other companies; it
| would depend on how she plays the next bit, though. That said,
| I wouldn't be surprised if there's a book that comes out of
| this, given the amount of media already surrounding it.
|
| I am curious what her longer term goals are - she is clearly
| intelligent, has solid PM experience in the industry and is
| certainly aware of what she is doing. I'm guessing there is a
| strategy of some sort at play - maybe leading an NFP for better
| corporate practices. Best of luck either way; I hope this
| changes things in a meaningful way!
| PragmaticPulp wrote:
| That's a good question. Leaking internal company documents
| doesn't automatically grant someone whistleblower protection.
| She appears to be trying an angle where she claims that
| Facebook's activities have harmed shareholders in an attempt to
| capture some degree of whistleblower protection, but that's a
| huge stretch given that she's arguing they made choices in the
| interest of profits without violating any actual laws.
|
| At this point, I think her best chance is to hope that Facebook
| will simply try to minimize the legal issue to avoid making her
| into too much of a martyr.
| Me1000 wrote:
| It's not really an "angle", she worked with a whistleblower
| protection organization and she specifically gave these
| documents to the SEC, filing 8 complaints with the regulatory
| organization. All of that communication is protected.
|
| I don't know what the legal implications for leaking to the
| press are, but that's where her exposure is.
| asdfasgasdgasdg wrote:
| Her bigger risk is probably the employment angle. It's hard to
| picture hiring this person unless you're 100% certain your
| company's behavior aligns with her morals, and you can't find
| anyone else. Even for people who generally agree with her view
| on Facebook, employing her presents a pretty serious known risk
| at this point.
| mikestew wrote:
| Oh, she doesn't work in tech now. Her work will now
| involve television interviews and book-signing tours.
| From that jumping off point, she'll be able to do what
| she likes.
| shkkmo wrote:
| I think that underestimates her abilities. I think it is
| more likely she will found or at least join a policy
| focused non-profit that will further her social goals
| while making use of her existing technical and management
| skills.
| solveit wrote:
| Also overestimates her abilities in a different
| direction. Becoming a media personality is _hard_ , and
| takes an entirely different skillset. It just seems easy
| to people who have an axe to grind against the currently
| successful crop of personalities.
| your_a_poor wrote:
| You must be a poor. She doesn't need a job, she worked in
| SV for a decade at 4 startups that went unicorn. Her
| yearly bonus at FB was probably 7 figures.
| xxs wrote:
| She has politics written all over - not to worry.
| asdfasgasdgasdg wrote:
| That would be my assumption, or some kind of public
| policy position. Not to say that her concerns aren't
| genuine, but she has to be aware that for better or for
| worse she won't have an easy time finding further work in
| the industry.
| robbrown451 wrote:
| She'll be fine. She is now globally famous for doing
| something an awful lot of people think is brave and
| admirable, and she has also shown off on TV how sharp and
| eloquent she is. There are a huge number of companies who
| will jump on the chance to bring her onboard.
| warent wrote:
| I'm having doubts; it seems that _most_ startups and
| small/medium businesses will not care at all. Have you seen
| the market? Engineers are ridiculously scarce. Most
| hiring managers and execs will just say something like
| "Great, we're too small and don't do anything like
| Facebook. Just keep her in the code and restrict access
| to Google Drive"
| robbrown451 wrote:
| I can't see her being a coder going forward. She is
| globally famous for taking a stand (from a very
| technically informed perspective) on policy issues. A
| smart company would hire her to be in a very public
| facing position, that will reflect upon themselves
| positively.
|
| Obviously, a company that has a huge amount to hide isn't
| going to want to hire her.
| Me1000 wrote:
| OP asked about the legal ramifications, not the social
| and professional implications of her whistleblowing.
|
| But that said, I think there are a lot of cynical takes
| in this comment thread. I don't think Frances will have a
| difficult time getting a job. There are plenty of
| people in tech who think what she did was admirable and
| are very proud of her, including her alma mater. Sure
| there will be many people and companies who wouldn't hire
| her, but there are also many who will.
| jyxent wrote:
| I'm guessing she would be eligible for an SEC
| whistleblower award if her complaints result in fines to
| Facebook.
| asdfasgasdgasdg wrote:
| Has she alleged any illegal behaviors? Nothing I read
| seemed to be against the law, but I guess the law is a
| big and complicated beast.
| q1w2 wrote:
| I seriously doubt shareholders agree with her angle.
|
| > her best chance is to hope that Facebook will simply try to
| minimize the legal issue to avoid making her into too much of
| a martyr
|
| I suspect the opposite. They will attempt to make an example
| of her to dissuade subsequent whistleblowers.
|
| Social media has a short attention span and forgets its
| "martyrs" within days.
| avisser wrote:
| Doesn't the fact that Facebook shares dropped 5%+ yesterday
| give some credence to that argument? It's obviously more
| complex than that, but selling is how shareholders would
| express their agreement.
| joshmlewis wrote:
| But the stock has bounced back today. A lot of tech
| stocks dipped yesterday.
| CPLX wrote:
| Seeing as how the actual mechanics of the leak were
| orchestrated by an attorney, and the documents were sent to an
| enforcement division of the Federal Government, it would be a
| pretty fair guess that they have considered it and are on solid
| legal ground.
| PragmaticPulp wrote:
| Are they operating pro bono?
|
| If not, they may simply take the case because it will
| generate a lot of work for them, and it's likely that such a
| public case could attract plenty of donations to fund the
| cause. Having argued a high profile case against Facebook is
| a huge reputation boost for a lawyer.
| CPLX wrote:
| > Are they operating pro bono?
|
| They are a public interest non-profit:
| https://whistlebloweraid.org/vision/
| N00bN00b wrote:
| No more Facebook for her.
| aardvarkr wrote:
| Potentially severe for leaking company IP. There are explicit
| legal protections for whistleblowing to the SEC but I don't
| believe there is anything protecting one's right to go to
| journalists with privileged information. However, Facebook
| would be 100% insane to press legal charges because that just
| drags this on even longer and reinforces the perception that
| they're bullies.
| bpodgursky wrote:
| I don't think I agree. If they don't set a precedent of
| consequences for leaking confidential documents, it will be a
| complete breakdown of operational security (everyone will
| feel free to leak memos and documents).
| fullshark wrote:
| And ultimately it's gonna be the call of a single person in
| the entire company.
| q1w2 wrote:
| They have no choice. They MUST enforce their IP. That's how
| IP law works. If you don't contest it - you lose ownership of
| it.
| kentonv wrote:
| I highly recommend watching the Senate hearing from today, and
| I'm a person who normally can't stand these things. This was
| totally different from any other hearing I've seen -- little
| grandstanding, no partisan bickering, no evading of questions.
| Most of the Senators seemed genuinely interested in what Frances
| had to say, and she gave meaningful insights backed by real data.
|
| A lot of what she's saying is stuff that has been generally known
| in the tech industry for some time -- that algorithmic content
| ranking amplifies division and outrage. But the detail she gave
| about actual research quantifying it goes way deeper than I think
| most of us were aware of.
|
| https://www.commerce.senate.gov/2021/10/protecting%20kids%20...
| zestyping wrote:
| Strongly agree. She is doing this in a way that no one has
| really done (certainly not at this level of skill) before:
| explaining the systemic issues, giving clear and direct
| answers, keeping the conflict away from the personal, and most
| of all delivering criticism with empathy and compassion for all
| involved.
| htrp wrote:
| That's future Senator Frances Haugen to you
| oort-cloud9 wrote:
| She is a Deep State plant. People like her are there to reinforce
| the idea that we need more government control over business. She
| is not any kind of genuine whistleblower. It's all political
| theater.
| dukeofdoom wrote:
| The government doesn't belong in the bedrooms of the nation, but
| somehow should decide which meme graphics two adults can share
| online. Let's Go Brandon.
| dionian wrote:
| The fact that the whistleblower has connections to the Democratic
| Party (donations, legal representation), and is calling for more
| censorship... makes me wonder about the possible ulterior motives
| grouphugs wrote:
| once again people are not really understanding this issue. this
| isn't about security or products, it's about white men and their
| fascist institutions controlling technology for their own fascist
| gains
|
| once again, it was never about products, security, or customer
| satisfaction
| ozzythecat wrote:
| I don't personally use FB and don't have a very favorable view of
| their products.
|
| I would, however, like to see the same level of scrutiny
| applied to the general American media, especially the news
| media. FB has come in and started eating their lunch. There's
| a deeper problem here, and it feels more like the powers that
| be want to take down FB.
|
| I'm not denying any allegation made against FB, but why is it
| that Fox News, CNN, MSNBC, and Hollywood get a pass when they've
| been damaging America for much longer than FB has existed?
|
| If the government sees a problem and wants to get involved -
| great. But let's hold an equal bar.
| rblion wrote:
| Both mass media and social media are sleeping in the same bed
| with the same 'influencers' and 'voices of authority' on a pile
| of ad dollars.
|
| They can both go fuck themselves.
| Noumenon72 wrote:
| I've definitely had much darker thoughts after reading a lot of
| traditional media than looking at Instagram.
| isoskeles wrote:
| This past year, I think immediately back to believing and
| repeating the grotesque lie that Brian Sicknick had his
| brains savagely bashed in with a fire extinguisher by Jan 6
| rioters. That never happened, but it was major news for a few
| days. I actually feel a bit violated for believing and
| repeating something so false.
|
| On the other hand, I haven't had a FB account of any type for
| over five years, so it hasn't directly affected my personal
| life.
| rumblerock wrote:
| That feeling of being violated by false / incomplete
| information and narratives drives me mad. I've spent the
| last 5 years trying to manage my digital hygiene, not
| oversaturate myself with news notifications, etc. I look at
| things with a more critical eye, always hunting for bias.
| But it's still inescapable, especially with the force with
| which some of these narratives are pushed.
| jasondigitized wrote:
| Hey I said that. https://news.ycombinator.com/item?id=28762415
| gremIin wrote:
| When you phrase your argument like that, it comes off as
| whataboutism.
|
| You are free to campaign against Fox News, CNN, and the like.
| I will even upvote and share those HN posts if you do.
| jasondigitized wrote:
| I should have simply said "This is a good start"
| [deleted]
| cblconfederate wrote:
| The press and media are an old and mature ecosystem that has a
| legal framework around it. These new companies are using free
| content, and sometimes free moderation, but still act like the
| press. There's something unsustainable about that, and sooner
| or later society and the law will have to deal with it.
|
| There's something particularly unethical about companies that
| hide behind "User generated content" and "external fact-
| checkers".
| 2OEH8eoCRo0 wrote:
| Opinion: Facebook is eating their lunch because humans love
| being fed belief-affirming drivel for dopamine. It's easy to
| churn out this content when you have no integrity or regard for
| the truth.
|
| Tinfoil hat: Bad actors are freaking out because their greatest
| mis/disinformation tools (Facebook, Twitter, etc) are about to
| be regulated. The jig is up.
| farcebook wrote:
| One, most of the mainstream news has at least basic editorial
| processes in place that do a rudimentary check on truthiness.
| Facebook, along with some of the sketchier news outlets, is
| the opposite: it profits off (and optimizes for)
| disinformation.
|
| Two, it's a matter of scale. Facebook reaches a far larger
| audience, has far more insight into their preferences, and can
| nano-target personalized stories for them and corral them into
| groups, creating perfect echo chambers. The news companies are
| way too small and in some way irrelevant. Even if they all went
| bankrupt overnight, Facebook's algorithms will keep working,
| keep producing personalized truth bubbles. Is fake news a
| problem in general? Sure. But the news companies are tiny
| compared to Facebook, and not the immediate and persistent
| threat to democracy that Facebook is, just because they're much
| smaller. Even if you take the entirety of local news networks
| as a whole, they don't have the same saturation and engagement
| feedback loops that Facebook has.
|
| The Powers That Be are typically reactionary forces, and have
| long battled the news industry over free speech and censorship
| etc. Facebook is a relatively new villain, different than the
| old ones, and way more powerful. Further regulating the news
| industry won't really deal with the Facebook issue, since they
| can keep on aggregating from anywhere and everywhere. It's a
| different beast altogether.
| SirensOfTitan wrote:
| I'd heavily recommend Matt Taibbi's Hate Inc to learn more
| about how news media lies, how it addicts people to its
| consumer product, and ultimately how high bar publications
| have been long dead. From the book;
|
| "The public largely misunderstands the "fake news" issue.
| Newspapers rarely fib outright. Most "lies" are errors of
| omission or emphasis. There are no Fox stories saying blue
| states have lower divorce rates, nor are there MSNBC stories
| exploring the fact that many pro-choice Democrats,
| particularly religious ones, struggle with a schism between
| their moral and political beliefs on abortion."
| mdoms wrote:
| > One, most of the mainstream news has at least basic
| editorial processes in place that do a rudimentary check on
| truthiness.
|
| Come on, man. There are just so many counter examples, from
| the "paper of record" NYT (1619 Project) to MSNBC (Russia
| nonsense) to Fox News (literally everything) to Rolling Stone
| (A Rape on Campus) I could sit here all day listing outright
| lies and untruths published in the pages of mainstream media
| outlets in order to push an agenda.
| kriskrunch wrote:
| Agreed. I'll just add, specifically the media's agenda is
| making money, and just like FB it affects their ethical
| obligation to society.
|
| The surge in click-bait and outrageous lies is eroding one
| of the pillars of freedom: the freedom of the press. Now,
| the press is largely viewed as untrustworthy by 60% of the
| US population.
|
| Source: https://news.gallup.com/poll/321116/americans-remain-distrus...
| farcebook wrote:
| I agree, and that's why I said "rudimentary". Still, as
| content producers and not algorithmic aggregators, both
| their ability and success at amplifying disinformation is
| much, much less than Facebook's. Even Breitbart or
| DailyKOS's impacts -- as outlets who often and purposefully
| distort the truth -- are not even rounding errors to
| Facebook's sheer scale.
| phendrenad2 wrote:
| Both are true. Both the news and social media optimize for
| clickbait, which bad actors use to slip in misinformation.
| What's funny is, people use the exact same language about the
| news. They'll say that CNN is just being used by the far
| left to promote far-left views, or that Fox News is being
| used by the far-right to promote far-right views. Yet when
| the medium is Facebook instead of the media, suddenly it's
| a problem that must be solved with more laws.
| cronix wrote:
| Facebook and other internet media have special Section
| 230 exemptions from libel and other things that print
| media and media using public airwaves do not.
|
| How about we just remove that so they are all equal? Not
| special or "more laws," just equal.
| mcguire wrote:
| 1619 Project? " _The 1619 Project is an ongoing initiative
| from The New York Times Magazine that began in August 2019,
| the 400th anniversary of the beginning of American slavery.
| It aims to reframe the country's history by placing the
| consequences of slavery and the contributions of black
| Americans at the very center of our national narrative_ "?
|
| I mean, sure, some historians objected
| (https://web.archive.org/web/20200814135117/https://www.nyboo... [1],
| https://www.theatlantic.com/ideas/archive/2019/12/historians...).
| But if "cynicism" is a sin, everyone here on HN (and not
| the least you) is due for some unpleasant cigars in hell.
|
| [1] I note that _I,_ non-historian that I am, could poke
| some logical and (from primary sources) historical holes in
| that speech.
| dillondoyle wrote:
| I agree with holding all media accountable. But the problem
| with what you say is that for millions and millions 'news' is
| actually BS opinion shows. On both sides but from my
| perspective Fox & Murdoch's empire abroad has done far more
| harm than say Maddow preaching whatever riles that audience
| up.
|
| The actual press, WaPo, NYTimes (which does great docs, so
| does Vice imho) have rigid editorial processes. Yet I still
| see people on HN say the NYTimes is liberal, which I don't
| believe either.
| iammisc wrote:
| > The actual press, WaPo, NYTimes (which does great docs,
| > so does Vice imho) have rigid editorial processes. Yet I
| > still see people on HN say the NYTimes is liberal, which I
| > don't believe either.
|
| The New York times and WaPo? You mean the ones that
| consistently do things like make up false allegations about
| children (Nick Sandmann... the kid who smiled) and take
| much more time to scrutinize the later proof of their
| wrongdoing than they ever did for their initial reporting? This
| happens constantly. It's not a one time event. It's almost
| like they're pushing an agenda.
|
| The 'right wing' bias of the NYT tends to be around being war
| hawks, but the right wing isn't necessarily the sole wing of
| war hawks in this country. The last Republican president was
| quite anti-war.
| keneda7 wrote:
| I've found this site to be pretty fair when rating media
| bias. It lists all the three sites as having liberal bias.
|
| https://mediabiasfactcheck.com/
|
| Also I want to point out Vice gave a known murderer a
| national interview for him to push his fake narrative.
| Meanwhile there was actual video and pictures of the
| killing that completely blew away any notion this was
| anything other than a murder. Michael Reinoehl hid behind
| a wall, came out behind two people, and shot one in cold
| blood. The two people were simply walking. The video and
| pictures of the killing were available online before Vice
| decided to give him an interview.
|
| https://www.vice.com/en/article/v7g8vb/man-linked-to-killing...
| cyberpunk wrote:
| Vice are an absolute shower of bastards, I don't read
| anything there after they basically outed SexyCyborg even
| though she begged them not to, and then took her patreon
| down cutting off a lot of her funding... [0]
|
| Wankers.
|
| 0: https://thefederalist.com/2018/08/20/spat-chinese-hacker-vic...
| NineStarPoint wrote:
| New York Times is listed as center-left and high factual
| though. Orbiting the center and factual is really about
| as good as you're going to get for less biased reporting.
| (Ignoring how even that is reductive, and that the NYT
| definitely has its own set of biases that don't neatly
| fit on the left-right spectrum.)
| mcguire wrote:
| " _Now, I know there are some polls out there saying
| [George W. Bush] has a 32 percent approval rating. But
| guys like us, we don't pay attention to the polls. We
| know that polls are just a collection of statistics that
| reflect what people are thinking in reality. And reality
| has a well-known liberal bias ... Sir, pay no attention
| to the people who say the glass is half empty, [...]
| because 32 percent means it's two-thirds empty. There's
| still some liquid in that glass, is my point. But I
| wouldn't drink it. The last third is usually backwash._"
|
| (And what is the Vice article an example of? Presenting
| both sides of an issue?)
| speedybird wrote:
| > _The actual press, WaPo, NYTimes (which does great docs,
| > so does Vice imho) have rigid editorial processes._
|
| That hasn't stopped them from publishing complete bullshit
| to start wars and similarly abominable shit. The New York
| Times allowed Judith Miller to uncritically publish
| flagrant government lies to start the Second Iraq War. In
| the leadup to the First Iraq War, ABC and NBC published
| atrocity propaganda (the "Nayirah testimony") for the
| American government. In the 20th century, the New York
| Times let Walter Duranty publish Stalinist propaganda
| denying genocidal oppression and famine in Ukraine. In the
| late 19th century, a bunch of American newspapers were used
| to start a bullshit war with Spain.
| farcebook wrote:
| Facebook further blurs the difference between objective
| analysis, expert opinion, and uninformed nonsense. And,
| yes, like the talk shows, it profits off this amplification
| of nonsense. Algorithmic engagement is the natural
| evolution of "if it bleeds, it leads".
|
| Facebook just happens to be much bigger and much better at
| it than the legacy news companies.
| dado3212 wrote:
| The vast majority of what you could call disinformation is
| directly downstream of mainstream media sources. FB is a
| platform: the content has to come from somewhere, and it's
| usually downstream of these news companies. Should FB be
| banning Fox and MSNBC? What's the expectation here?
| mcguire wrote:
| While "mainstream media" deserves its share of opprobrium,
| they didn't generate a lot of the "content" shared on
| Facebook:
|
| https://img.buzzfeed.com/buzzfeed-static/static/2019-02/14/1...
|
| Or if they did, they'd be looking at some consequences.
| dado3212 wrote:
| Yeah, but by VPVs (viewport views) those groups don't. I
| think Fox News and Ben Shapiro were the two highest VPV
| producers of what was dubbed misleading content.
| MikusR wrote:
| Facebook has more rudimentary checks on truthiness than any
| of mainstream news organizations.
| farcebook wrote:
| Explain?
| Tamrind7 wrote:
| No one with a brain uses the terms "disinformation" or
| "misinformation".
| smolder wrote:
| What was the point of posting this? Surely it wasn't to
| convince anyone they don't have a brain.
| sentinel wrote:
| 1. The mainstream news pushed numerous stories over the past
| years without those rudimentary truthiness checks.
|
| 2. I disagree with many points in that paragraph:
|
| - Echo chambers: when was the last time you heard a nuanced
| "from the other aisle" opinion from a NYT reader; an opinion
| that wasn't covered in the NYT? Same echo chamber, just a
| different format.
|
| - News companies are small and irrelevant: seems like this
| narrative has captured everybody's imagination today.
| Everybody is talking about it. It might even lead to action
| in Congress. Are news companies really as small and
| irrelevant as you claim them to be?
| colinmhayes wrote:
| I read the Times almost every day and have plenty of
| conservative opinions. The Times has multiple conservative
| columnists and consistently publishes conservative op-eds.
| iammisc wrote:
| These 'conservative' columnists do not reflect the
| majority of conservative opinion in this country though.
| NYT columnists tend to be members of the political or
| cultural elite. There is a large strain of anti-elite
| sentiment running through the country right now,
| especially on the right. These people do not feel their
| views are expressed in the NYT. For that matter, I know
| plenty of working-class dems who feel disenfranchised
| with the NYT, the media elite, the democratic party, etc.
| sentinel wrote:
| You're in the minority.
| whimsicalism wrote:
| Conservatives at NYT do not reflect the opinions of most
| conservatives or most conservative media.
|
| If you want a taste of conservative media, look at what
| is collated on RealClearPolitics. NYT is very different,
| you won't see Ross Douthat saying the same things they
| say in The Federalist.
| BitwiseFool wrote:
| >"basic editorial processes in place that do a rudimentary
| check on truthiness"
|
| I think this is merely an illusion. The beauty of English is
| that you can spin a story a dozen different ways while still
| presenting 'true' facts/details about an event. (Edit:
| "Fiery, but mostly peaceful" is a quintessential example of
| this.) You can also shape how strongly the public reacts to
| something by how much you decide to cover the story. I
| guarantee you that if the Fall of Kabul happened under Donald
| Trump's administration we'd still be getting daily news
| stories about the fallout.
|
| Furthermore, the press has done an excellent job of branding
| itself as an impartial arbiter of truth, whereas in reality
| they're just another business run by people with their own
| motivations.
| hatenberg wrote:
| Have you ever watched Fox? People die every day because of
| their anti-vaccine crap. Basic editorial process.
| iammisc wrote:
| I was just listening to Hannity the other day and the man
| was constantly talking about how you should talk to your
| doctor about vaccines because it's the best way to prevent
| COVID. Listening to Cavuto the other afternoon, I heard a
| similar sentiment on his show. Who are you listening to
| exactly? The majority of anchors, especially the biggest
| ones, seem to basically endorse the vaccine.
| keewee7 wrote:
| One thing that makes Facebook distinct from other big Internet
| companies is that they allow far-right speech on their
| platform.
|
| That is probably why the Democrats and liberal media are
| targeting Facebook but not Twitter, YouTube, Reddit, Snap,
| TikTok, etc.
| baby wrote:
| Have you read the comments on youtube?
| philwelch wrote:
| The elephant in the room here is the First Amendment. The
| government, legally, cannot do anything to censor "hate speech"
| or "misinformation" on CNN or Fox News. They also can't do
| anything to censor "hate speech" or "misinformation" on
| Facebook, but they can certainly harass Facebook into doing it
| for them.
| CountDrewku wrote:
| And this is exactly what they're doing. In effect they're
| loopholing the 1st amendment. Bring in Zuckerberg, hound him
| about removing specific "disinformation" under the threat of
| future political action if they don't comply. Then they get
| to throw the private company bs flag whenever someone says
| it's stomping on the 1st amendment.
| cronix wrote:
| > but they can certainly harass Facebook into doing it for
| them.
|
| I believe you are exactly right here, or at least giving them
| cover to censor more with "the public's blessing."
| n8cpdx wrote:
| Not completely true. Historically, mass media was regulated.
| Lying also isn't legally protected if it causes injury ("fire
| in a crowded theater" being the canonical example, libel being
| another).
|
| https://en.m.wikipedia.org/wiki/FCC_fairness_doctrine
| riffic wrote:
| well, the airwaves (which is where the FCC fairness
| doctrine applied) are a limited resource which,
| without regulation, become quickly polluted by bad actors.
|
| print and cable are somewhat more immune to that particular
| issue.
| danShumway wrote:
| The Supreme Court only upheld the fairness doctrine in
| situations where spectrum was limited (Red Lion
| Broadcasting vs FCC). Even at the time the fairness
| doctrine wasn't really applicable to mediums in which
| bandwidth/spectrum is not limited and not licensed to
| specific broadcasters.
|
| It's not clear to me how this kind of regulation would be
| justified to the Supreme Court in regards to the Internet.
| bhupy wrote:
| > fire in a crowded theater being the canonical example
|
| https://www.theatlantic.com/national/archive/2012/11/its-
| tim...
| philwelch wrote:
| > fire in a crowded theater being the canonical example
|
| That phrase comes from _Schenck v. United States_, where
| the Supreme Court ruled that the federal government could
| imprison an anti-war activist for disseminating pamphlets.
| ashtonkem wrote:
| "Fire in a crowded theater" comes from the Supreme Court
| upholding the criminal conviction of a man handing out
| pamphlets protesting the WW1 draft. Not exactly the
| precedent I'd reach for if I was trying to argue that
| governmental regulation of speech is not dangerous.
| ashtonkem wrote:
| Breaking up Facebook would probably be the way to go; it has
| fewer troubling side effects than trying to regulate "hate
| speech" via a clever legal trick.
| jorblumesea wrote:
| FB is much worse because there's no sense of moderation.
|
| In mainstream media, there's someone who vets content, fact
| checkers etc. You can disagree on whether you think Tucker
| Carlson is "factual" but effort goes into not running afoul of
| legal rules and fear of being sued. Scripts are written, it's
| edited, etc.
|
| With social media, it's just straight up lies. Not even "a
| little bias" but just straight up garbage. They hide behind
| "free speech" and "light moderation" but the reality is that
| there are no rules like those traditional media have to take
| into account.
|
| > when they've been damaging America for much longer than FB
| has existed
|
| Has modern mainstream media been implicated in major events
| like the Jan 6th insurrection?
| mcguire wrote:
| I note that the "general American media" is subject to quite a
| few more constraints than Facebook and others of similar ilk.
|
| At the lowest level, the "general American media" can have
| advertising pulled if they get too far out of hand. Going
| further, they have to live with their editorial decisions---
| see, for example, your own antipathy to them. Ultimately, they
| can be sued.
|
| Facebook, on the other hand, can offer not to associate
| someone's advertising with _that_ content, but someone else
| will surely be happy to fill the spot and FB will be making
| money on both. Facebook doesn't have to live with its (non-)
| editorial decisions---that's user generated content, right? And
| you can't really sue Facebook either.
|
| The bar has never been equal, but not in the way you think.
| h2odragon wrote:
| Facebook gets to say "its the users being polarizing"; where
| traditional media gets to enjoy libel and slander laws.
|
| If Facebook's editorial decisions rendered them subject to
| those laws, would that be equal enough?
|
| There _are_ answers to this question that do not easily
| summarize as "we need more rules about who gets to say what".
| We gots plenty of rules. Let's apply them fairly and equally
| for once.
| mdoms wrote:
| Am I the only one who found the Facebook Files reporting very
| overwrought and stretched? There were some issues and red flags
| in there, but mostly it seemed... not that bad?
| fullshark wrote:
| Appears to be the emerging consensus, at least on Hacker News.
| Graffur wrote:
| How do you see the files?
| mdoms wrote:
| "Facebook Files" was the project name Wall Street Journal
| gave to their flagship reporting on this story. I haven't
| seen the files unfortunately.
| ncr100 wrote:
| As a technologist, to me this is technology being disruptive
| without sufficient control. In this case disruptive to human
| communication. And FB doesn't have a comprehensive set of
| human-ethics checks sufficiently in the loop.
|
| Why did FB do this? In one of the 60 Minutes videos, she
| identified a disconnect between the FB Civic Engagement team's
| recommendations and FB's willingness to act on them.
|
| She says people at FB aren't trying to be evil, however:
|
| > "... also detailed how she says Facebook quickly disbanded its
| civic integrity team -- responsible for protecting the democratic
| process and tackling misinformation -- after the 2020 U.S.
| election."
|
| What ethics does FB corporation stand for, factually? [Spare the
| cynical "profit" ethic - other than that.] Is it "connecting
| people for .. " some reason?
|
| Today's hearings indicate an evolution to COPPA, at the least,
| may be a legal outcome:
| https://www.youtube.com/watch?v=VvYB9_PR3sQ
| darkwizard42 wrote:
| I mean, disbanding the civic integrity team after a huge civic
| event seems reasonable? It could be that the team was more of
| a task force that comes together when large events occur. We
| don't really have much insight into how they spun up a war
| room for the 2020 election.
|
| A related situation might be how Lyft/Uber have massive war
| rooms and special ops teams to handle Halloween and New Years
| (massive heavy-traffic events on these platforms). There is a
| special ops team and several marketplace teams that get pulled
| in, but they get "disbanded" and go back to more standard work
| afterward.
| shkkmo wrote:
| Unless that civic event had significant unresolved elements,
| such as the sitting president's refusal to concede the
| election, choosing instead to contest it.
|
| And it's not like Facebook isn't a global company that deals
| with a more or less constant deluge of important elections in
| many countries.
| utunbu wrote:
| Not surprised after seeing "aliens are our ancestors" ads on FB
| for a while.
| nradov wrote:
| The title here doesn't match the article headline. The documents
| make Facebook look bad but so far I haven't seen anything
| _incriminating_ , in the sense of violating a US Federal criminal
| statute. Are there credible allegations of something like
| securities fraud or wire fraud here?
|
| Note that there is generally no law against disseminating
| incorrect or "harmful" information.
| slongfield wrote:
| The section at the end -- "Haugen contacted state officials and
| the SEC" talks about the parts of the whistleblower documents
| that show that Facebook was misleading investors. That is
| criminal under current US law.
|
| I think people are more mad about the other things in the
| documents--most people are more sympathetic to harm done to
| kids' mental health than to investors' bottom line, but lying
| to investors is the one that's illegal.
| ENOTTY wrote:
| This might fall under Matt Levine's theory that "everything
| is securities fraud"
| elliekelly wrote:
| They've lied to the public and (if the lies were material to
| investors) that's securities fraud.
| cratermoon wrote:
| Which says more about the emasculated regulatory regime in
| the US than anything else. There are no penalties, or none of
| any real meaning, for causing harm to consumers or the
| community, but violating the holy tenet of maximizing
| shareholder value is almost a cardinal sin.
| nradov wrote:
| Every single one of us has caused some harm to the
| community at some point in our lives. That doesn't mean it
| should be a criminal offense.
| cratermoon wrote:
| Perhaps, but no individual continually does it for years,
| harming millions if not billions of others, in pursuit of
| profits.
| cratermoon wrote:
| The fact that it's possible to question whether any illegal
| behavior has been revealed doesn't mean that Facebook didn't
| cause - and is not causing - any _harm_, or that there's no
| basis to Haugen's revelations.
|
| Rather, it means that under the current regulatory regime,
| there are no penalties for behavior harmful to consumers. That
| the only recourse is to file with the SEC alleging harm to the
| stockholders is simply a reflection of the fact that the mantra
| of "shareholder value" has become the only law that
| corporations have to follow.
| coolspot wrote:
| IMO Twitter and Reddit cause more mental harm than all FB
| properties combined.
| bobthepanda wrote:
| we can walk and chew gum. Looking at one doesn't preclude
| looking at the other.
| IshKebab wrote:
| Yeah this appears to just be some _embarrassing_ things but
| nothing outright illegal or really even surprising. What
| company doesn't have embarrassing secrets they'd rather not be
| leaked? None that I've worked at anyway, and none of those were
| evil.
| ausbah wrote:
| Why would you reveal yourself if you don't have guaranteed
| protection?
| jasondigitized wrote:
| She went through the FTC which adds a layer of potential
| whistleblower protection.
| travoc wrote:
| Facebook likely knew the identity of the leaker right away.
| There was little advantage to maintaining the illusion of
| anonymity at this point.
| achow wrote:
| Correct.
|
| She said that she began thinking about leaving messages for
| Facebook's internal security team for when they inevitably
| reviewed her search activity. On May 17, shortly before 7
| p.m., she logged on for the last time and typed her final
| message into Workplace's search bar to try to explain her
| motives.
|
| "I don't hate Facebook," she wrote. "I love Facebook. I want
| to save it."
|
| https://www.wsj.com/articles/facebook-whistleblower-
| frances-...
___________________________________________________________________
(page generated 2021-10-05 23:00 UTC)