[HN Gopher] Facebook deliberately made people sad. This ought to...
___________________________________________________________________
Facebook deliberately made people sad. This ought to be the final
straw (2012)
Author : seesawtron
Score : 135 points
Date : 2021-04-17 19:29 UTC (3 hours ago)
(HTM) web link (www.theguardian.com)
(TXT) w3m dump (www.theguardian.com)
| alfl wrote:
| When they announced that they had this capability a couple of
| years ago I deleted my account.
| [deleted]
| stadium wrote:
| The ultimate propaganda machine.
| dhosek wrote:
| 9 years later and still a whole lot of straw left.
| sidcool wrote:
| Facebook has indeed done much harm to individuals and the society
| as a whole. But at the same time, their tenacity to continue
| making money is impressive. Villains too have some quaint evil
| power.
| chris_wot wrote:
| That was when I started to realise just how bad they were.
| yeuxardents wrote:
| This was the last straw for me; that's when I permabanned facebook
| from my life.
|
| This was an unauthorized, unguided, unethical, mass psychological
| experiment on human beings. Anyone involved should have gone to
| jail for crimes against humanity.
| jbverschoor wrote:
| luckily we don't have the dependency on fb anymore, since
| nobody forces "login with facebook" these days
| neonological wrote:
| No it's not a crime against humanity, let's not go that far and
| let's not unreasonably promote cancel culture by burning the
| witch at the stake before giving the issue some deep thought.
|
| It's a much more complicated issue than you describe. Many
| corporate organizations have been selectively distributing
| information to deliberately influence sentiment for decades.
| Facebook is not the first nor will they be the last.
|
| Restricting any entity from doing this is a complicated issue
| that has to do with restricting freedom of speech.
|
| Fox News comes to mind when I think of another organization
| that does this deliberately. To add more complication to the
| issue, one should consider the fact that the success of Fox News
| can also be attributed to its customers: people. People choose
| what they want to hear, and many people prefer news viewed
| through a biased right leaning lens.
|
| Just like how the above poster wants to paint this issue as a
| crime against humanity, I think part of the problem is that on
| some level we all want to lie to ourselves. We want to view the
| world in a very specific and certain black and white way.
| wruza wrote:
| Honestly that sounds like "this stuff dealer is bad because they
| make people sad while they wait for a dose; I quit, clean for 2d
| 6h 19min". No doubt facebook is evil, corporate monster, etc, but
| hey what about stopping being a junkie. The problem is not
| someone experimenting with your unhealthy addiction, it _is_ your
| unhealthy addiction.
|
| In the early stages of the internet, most of the content was
| hidden, waiting for you to actively discover and bookmark it.
| Like you do with good places in your town - you find one, add it
| to your address book and visit occasionally. It was a slow
| process, full of findings, enjoyment and variety. Now everyone
| seems to sit at their mailbox, desperately waiting for another
| pack of junk mail to arrive. Facebook is just that - a postman
| who chooses from a variety of crap to push into your inbox. It
| doesn't change lives unless people are too lazy to live by
| themselves.
| d3ntb3ev1l wrote:
| In other news, the ocean is wet
| imwillofficial wrote:
| What if somebody had killed themselves? This type of
| non-consensual experimentation is criminal.
| jimbob45 wrote:
| I don't know anyone that uses Facebook anymore that likes it.
| Everyone I know who uses it says, "I'm thinking I should delete
| it soon". Universally, the number one criticism is, "All I ever
| wanted to see are my friends' posts and every update shows me
| less and less of those".
|
| Does anyone actually know people who avidly use and love
| Facebook? It seems like Facebook is like the Christian church,
| where everyone _says_ they go every Sunday but it's really more
| like once a year at this point.
| alistairSH wrote:
| Same boat here. Wife deleted her account. I keep mine only to
| use Marketplace (currently the best place to sell bicycle
| parts locally). I'll probably delete my account once I do my
| next round of parts-bin purging.
| chatmasta wrote:
| Pre-2013 was a wild time on the internet. It seems like that's
| when a lot of its nasty underbelly went mainstream.
|
| The Snowden leaks were a turning point, I think, when people
| realized "the NSA and corporations are spying on you" wasn't just
| a tinfoil hat conspiracy theory.
|
| It's mind-blowing to think that most major sites on the internet
| (including Amazon) were not using HTTPS at that time. It's
| possible Amazon used it on its payment pages, but it certainly
| didn't for much of the site. Tools like FireSheep existed for
| years before anyone started to care that everyone from your
| coffeeshop to your ISP could read your plaintext traffic.
|
| Now 9 years later we're finally about to fix DNS (albeit with
| protocols hostile to reverse engineering). Then hopefully up next
| is fixing BGP before the bad guys realize how absurdly vulnerable
| it is.
|
| All this is to say, Facebook could maybe be excused for this
| experiment, because our standards for "that's messed up" were
| already so much lower in 2012 than they are now.
| cryptoz wrote:
| There is no excuse for this. This was a morally bankrupt and
| exceedingly awful thing to do. I was absolutely _furious_ when
| this happened and made it a life mission to help other people
| know this was going on. 99% of people don't know this event
| happened, still.
|
| I don't trust Facebook one bit. I assume they do this kind of
| illegal 'experiment' all the time now, and the only change is
| they stopped bragging about it.
|
| So gross, this whole thing, it makes me mad there were no real
| repercussions from this. Worse yet is how many people actually
| _support_ this unethical and sick experiment from Facebook.
|
| Lots of people don't see _anything_ wrong with it, even
| technical people like those on HN, and that is just as
| disturbing to me as Facebook actually carrying it out.
|
| ---
|
| Edit: I'm now blocked from posting on HN (for a while I
| guess?), so here is my response to a poster below.
|
| @antiterra says,
|
| > That you'd prefer they either stayed ignorant or unconcerned
| to what changes they made and their effect is exceedingly awful
| and morally bankrupt.
|
| I have no idea how you reached the conclusion that I would
| prefer either of those things. There is a massive chasm between
| 'perform illegal, unethical mass psychological experiments' and
| 'I don't care about stuff'.
|
| ---
|
| @chatmasta says I'm not blocked, but I am. I cannot reply to
| you either. It says I'm posting too fast and won't let me
| submit any content.
| chatmasta wrote:
| > I assume they do this kind of illegal 'experiment' all the
| time now
|
| Oh 100% they do. But hey at least they've contributed Presto
| back to the community!
|
| > Lots of people don't see anything wrong with it
|
| Surely that's not true. In 2021, Facebook is largely hated by
| people on all sides of all political spectrums. [Remember
| when people thought Zuck would run for president? Lol!]
|
| (Edit:
|
| > I'm now blocked from posting on HN
|
| You're not blocked; you just can't reply to someone within a
| minute of their post past a certain nesting level.
|
| (Edit edit: Oh guess you're right! That's annoying. :/ But I
| am enjoying our conversation via edit nonetheless!)
|
| )
| antiterra wrote:
| That you'd prefer they either stayed ignorant of or
| unconcerned about what changes they made and their effect is
| of questionable morality.
|
| Edit re your edit response:
|
| It is absurd to me that deciding whether to show more or fewer
| 'I got a new car' posts based on the resulting behavior is
| some kind of sinister event. Even the author of this article
| struggles to articulate why this isn't just standard A/B
| testing.
| jwilber wrote:
| It sounds like you're trying to equate the ever-changing
| adoption and development of web standards with longstanding
| ethical expectations in research.
|
| Never mind the fact that the former aren't direct actions on
| customers (Amazon wasn't withholding HTTPS from only certain
| users) while the latter is a direct, concentrated, understood
| action on customers.
|
| The two aren't similar at all.
| chatmasta wrote:
| No, I'm comparing the complete disregard for users' privacy /
| well-being with the status quo at the time, which was "major
| payment portals don't even implement HTTPS."
|
| If you want to document Facebook's ethical lapses, you could
| go all the way back to 2004. But it's only in the past few
| years that people started _actually caring_.
| aasasd wrote:
| In related news, to keep their jobs or occupy their time,
| journalists and social media commenters continue pretending that
| web users have self-respect.
| vmception wrote:
| Do we know what specific week that was? Would like to see if it
| affected trading patterns or the VIX.
| rationalfaith wrote:
| Really good point there. Curious about that too. This could be
| a huge vector for market manipulation.
| antiterra wrote:
| Product Person #1: Hey, here's a writeup that says if we show
| too many positive posts to people, it creates bad feelings.
| Should we show fewer positive posts?
|
| Product Person #2: Not sure, wouldn't negative posts make people
| feel worse?
|
| Product Person #1: Wow, I dunno, maybe we should try adjusting
| which posts people see one way a bit and see what happens?
|
| Product Person #2: Not a bad idea, how about we try both? You
| know, an A/B test.
|
| Product Person #1: Hmmm. Ok, but-- when we're done, let's have
| some scientific review of our data just so that we can correct
| the record and push along the science around this stuff.
|
| Journalist: This company deliberately made people sad
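|
| (For the curious: a minimal sketch, in Python, of what the
| mechanics of such a feed experiment might look like. Every
| name and number here is hypothetical; the real study
| reportedly used LIWC-style word counts rather than this toy
| lexicon.)
|
|     import random
|
|     # Toy sentiment lexicon (purely illustrative)
|     POSITIVE = {"great", "happy", "love", "awesome"}
|     NEGATIVE = {"sad", "awful", "hate", "terrible"}
|
|     def cohort(user_id):
|         # Hypothetical deterministic 50/50 split by user id
|         return "fewer_pos" if user_id % 2 == 0 else "fewer_neg"
|
|     def sentiment(post):
|         words = set(post.lower().split())
|         if words & POSITIVE:
|             return "pos"
|         if words & NEGATIVE:
|             return "neg"
|         return "neutral"
|
|     def filter_feed(user_id, feed, drop_p=0.1):
|         # Probabilistically omit posts whose sentiment matches
|         # the cohort's target; all other posts pass through.
|         target = "pos" if cohort(user_id) == "fewer_pos" else "neg"
|         return [p for p in feed
|                 if not (sentiment(p) == target
|                         and random.random() < drop_p)]
|
| The measured outcome would then just be the sentiment of the
| words those users go on to post, which is why critics call it
| a psychology experiment rather than a UI test.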
| mhh__ wrote:
| > This company deliberately made people sad
|
| Which would be correct, surely?
| antiterra wrote:
| Would you consider 'this company deliberately made people
| happy' to also be correct, then?
| [deleted]
| Aunche wrote:
| It's technically correct but implies malicious intent. It's
| like saying "Pfizer deliberately injected a substance in
| people that kills them" to describe a failed drug trial.
| dheera wrote:
| "it creates bad feelings"
|
| If this is the hypothesis they are testing for, the problem
| is that they subjected people to a psychological experiment
| without their consent, for the ultimate purpose of their own
| financial gain.
|
| How is that not malicious intent?
| eptcyka wrote:
| If you dislike the _b_ version of an A/B-tested site to the
| point that you feel deeply disappointed by it, were the
| people who designed the experiment operating with
| malicious intent?
|
| I don't think Facebook engineers or data scientists care
| what people feel when they're adjusting their models;
| they only care about time spent on their platform or some
| other overly reductionist metric. One could argue that
| chasing super-efficiency will always result in
| abominations, but I think it's hard to say that someone is
| doing something immoral by just doing the human equivalent
| of gradient descent on the optimization problem of
| driving engagement. That's not to say that we shouldn't
| have a conversation about this and figure out ways to
| make these super-efficient and powerful companies find
| ways to be satisfied with organic engagement.
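|
| (A toy numeric illustration of that point: if the only
| objective is an engagement metric, a hill-climber drifts to
| whatever feed mix the metric rewards, with no term anywhere
| for how users feel. The metric below is entirely made up.)
|
|     def engagement(neg_frac):
|         # Hypothetical: engagement peaks when 70% of the
|         # feed is negative content, with diminishing returns.
|         return 1.0 - (neg_frac - 0.7) ** 2
|
|     def tune(steps=100, lr=0.1, eps=1e-4):
|         x = 0.1  # start with a mostly positive feed
|         for _ in range(steps):
|             # follow the numeric gradient of the metric
|             g = (engagement(x + eps)
|                  - engagement(x - eps)) / (2 * eps)
|             x = min(1.0, max(0.0, x + lr * g))
|         return x  # drifts to ~0.7, the "sad feed" optimum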
| dheera wrote:
| There's a difference between disappointment about a
| product (which is fine) and running an experiment that
| you suspect will knowingly create bad feelings in the
| lives of one cohort of your subjects.
|
| What if an almost-suicidal person got the "b" version?
|
| A/B testing should be used when you think, _in good
| faith_, that BOTH "a" and "b" versions could be good and
| want to know which one works better, not to confirm a
| hypothesis that "b" has a negative impact on users'
| personal lives.
| Aunche wrote:
| Surely, people understand that Facebook makes changes to
| their website, and that these changes are based on data.
| That is all an experiment is. If Facebook wasn't allowed
| to experiment, they would never be allowed to change
| their interface or recommendation engine at all.
| dheera wrote:
| There's a difference between experimenting on the
| product, and conducting an experiment that deliberately
| puts one cohort of users in an environment hypothesized
| to create negative emotions.
|
| A/B testing is for the case where you honestly, in good
| faith, think both A and B could be good designs, but you
| don't know which one is better.
|
| A/B testing is for product design changes, not psychology
| experiments. It is NOT meant to serve as a psychology
| experiment on a group of subjects who didn't consent to a
| study, where either A or B has a suspected negative impact
| on emotions and you're using the test to confirm or deny
| that hypothesis.
| Aunche wrote:
| It's still a product decision though, especially the way
| the top comment frames it. It's just one that happens to
| be centered around emotional language. I suspect that an
| experiment that deals with more or less political content
| would have a similar effect on people's mental state.
| moron4hire wrote:
| Except people have to give consent for drug trials.
| kennywinker wrote:
| Not just consent! _informed_ consent.
| detaro wrote:
| Guess why you need to opt in to a drug trial.
| tshaddox wrote:
| It seems like you're implying that, by explaining in slightly
| more detail how this might have happened, you somehow show that
| the journalist was oversimplifying or distorting things. But,
| no, your last sentence is definitely still correct.
| typon wrote:
| The constant derision of journalists in tech circles
| (especially on HN) is kind of shocking to see. Did you read the
| article, or did you miss that the journalist is relaying fellow
| researchers' apprehensiveness about Facebook being allowed to
| conduct an unethical study? This article [1] is linked in the
| second paragraph.
|
| [1] https://www.theguardian.com/technology/2014/jun/30/facebook-...
| hn_throwaway_99 wrote:
| It's nothing personal against journalists, but the economics
| of the business mean that they have huge incentives to make
| things seem more outrageous or nefarious than they actually
| are.
|
| I did read the whole article, and that doesn't excuse the BS
| clickbait title of "Facebook deliberately made people sad."
|
| I find the fact that the article is complaining about
| emotional manipulation by Facebook, while using a
| deliberately manipulative title, to be more than a tad
| ironic.
| rendall wrote:
| Questioning whether someone read the article is against the
| guidelines:
|
| > "Please don't comment on whether someone read an article.
| "Did you even read the article? It mentions that" can be
| shortened to "The article mentions that.""
|
| https://news.ycombinator.com/newsguidelines.html
| antiterra wrote:
| The commentary is more on the clickbait title, which very
| likely could have been written by someone other than the
| article's author. (NB: journalists do A/B testing to pick the
| title that incites the most clicks from outrage[1].)
|
| Further, the article you link appears to have commentary by a
| single researcher.
|
| Here's a writeup by a researcher who thinks it wasn't unethical
| at all and that it would have passed IRB as performed. It, of
| course, does not end the discussion, but it does demonstrate
| that the issue is more complex than a company deliberately
| making people sad:
|
| https://hbr.org/2014/07/were-okcupids-and-facebooks-experime...
|
| [1] https://www.washingtonpost.com/lifestyle/media/media-values-...
| Graffur wrote:
| Why did you make this comment? Is it what you believe happened?
| chatmasta wrote:
| I admit I didn't read the article because I thought I knew the
| event it was referring to, but didn't Facebook actually publish
| this as a psychology study? I.e., they didn't just use it for
| A/B testing features but actually thought they were doing
| something "good" to the point of publishing a study about it.
| It's laughable now lol.
| cryptoz wrote:
| Yes, they published the paper and publicly bragged about how
| successful they were in making hundreds of thousands of
| people feel sad.
|
| https://www.pnas.org/content/pnas/111/24/8788.full.pdf
|
| Edit: And in their public defense of this paper, they claim
| we all gave 'informed consent' to this kind of psychological
| manipulation when we signed up for Facebook, because it's in
| the TOS.
|
| They literally define 'informed consent' as the legal text in
| their TOS. I mean, that's insane; they _know_ their users
| don't read that before signing up. It's not f'ing _informed_
| or _consent_. Ugh, this makes me so mad.
|
| Also what kind of sick company puts into their TOS that they
| can emotionally manipulate you on purpose whenever they want?
| Fucked up.
| wpietri wrote:
| Agreed. If people end up surprised about what they
| nominally consented to, it's not informed consent. Another
| example of Facebook's "move fast and break people" ethos.
| babesh wrote:
| Mark Zuckerberg - They "trust me". Dumb fucks.
| thatcat wrote:
| That's the definition of an unethical experiment design and
| would be considered academic misconduct if it were academic
| work. Regardless of intent, FB probably shouldn't be allowing
| front-end devs to design psychological experiments at mass
| scale.
|
| https://psychology.wikia.org/wiki/Experimental_ethics
| shadowgovt wrote:
| Believe it or not, they didn't have the front-end devs design
| it.
|
| There was a specific task force at Facebook intended to
| better understand customer psychology. It came from them.
| mgraczyk wrote:
| Which part of this conception of experimental ethics does
| this violate? Facebook's experiment did not directly cause
| the emotions; the posted content did that. Even if it did,
| this sort of thing is well within the bounds of normal
| experimentation with large systems.
|
| This is similar to claiming that traffic experiments which
| change the flow of cars, in turn causing some people to
| experience more traffic and become unhappy, are unethical. Do
| you think traffic experiments are unethical?
| seattledev wrote:
| I disagree. If Facebook is testing content like this, then
| they explicitly know which content makes people sad. To
| manipulate someone's feed to push them stuff that they know
| makes them sad, for the purpose of experimenting with their
| emotions, is highly unethical to me.
|
| There is a massive difference in studying and directing
| traffic to help make traffic flow better. The goal there is
| to improve driving for everyone. Facebook's goal is to
| intentionally make people unhappy for the sole purpose of
| making more money.
| plaidfuji wrote:
| To be fair, the article states that the study was designed by
| a data scientist. Not that that implies that any additional
| ethical considerations were taken into account.
| graeme wrote:
| Doesn't that suggest professional ethics are immoral? They
| block a _temporary, one-off_ experiment which could suggest
| how to avoid sadness.
|
| The alternative is relying on intuition only and possibly
| making people permanently sadder. Or is there a way to
| achieve this knowledge consistent with the field of research
| ethics?
|
| We saw this with challenge trials in the pandemic where
| ethicists fretted people would reject them and concluded
| challenge trials are immoral. Meanwhile opinion polls suggest
| people view challenge trials as virtuous and the current
| method of waiting for infections as immoral.
| seattledev wrote:
| Yes, there is a way. You let them specifically opt in to
| it, you provide them disclosure documents of what's being
| tested, and you provide free access to mental health
| resources.
| unethical_ban wrote:
| How would it have been ethical?
|
| If their current process was already suspected to be the
| worse of the two, and their goal was to improve people's
| moods per se, what is the ethical thing to do? Keep people
| feeling more sad?
|
| Or, perhaps, stop being algorithmic in what people see, or
| else put people in full control of simple sorting (prioritize
| my "starred" category and sort by date)?
| throw7 wrote:
| "Anyone using facebook is a dumb fuck." - Mark Zuckerberg
| bserge wrote:
| Yes, indeed, we should all learn _not_ to use our emotions in
| online discussions. People will say and do things that they would
| never say in real life, even under their real name, because of
| the disconnection from the actual person hearing/reading/seeing
| that.
|
| Funnily enough, this is how I quit Imgur for good. It's amazing.
| I posted something that I believed was right and got downvoted to
| hell.
|
| So I started asking people "why, why do you downvote?" and got
| mostly laughs, memes and people calling me stupid. Except one
| person, who said "you care way too much about this". Indeed, I
| did. Thank you random person!
|
| Not sure why but it affected me more on Imgur. Maybe it's the
| length of the comments? The memes that encompass a thousand
| words, as they say? Regardless, I just deleted my account and
| never went back. It's great.
|
| Still trying to quit Reddit and HN, but they're good resources if
| you ignore all the stupidity. Imgur is bottom of the barrel
| social media, but it was fun.
|
| And of course, this is used by various media outlets, and has
| been for a long time.
|
| It's all about eliciting emotions, which come from the primitive
| part of the brain, bypassing any advanced conscious analysis and
| engaging the impulse to do either what you're told (good for
| sales) or the opposite of it (good for spreading a message) or
| something in between, but it's a response that you _will_
| remember and most likely act on.
|
| I forgot what I was trying to say. Somehow Facebook never got me,
| it's just a useless platform aside from contacting people.
| towergratis wrote:
| It does affect you, because you get the feeling you are "part"
| of the community.
|
| I used to be very active on HN until I got into a heated
| discussion with someone and since then my HN "flag" privileges
| were stripped away.
|
| I know I shouldn't care, but I couldn't help it. Now I just
| lurk on HN, rarely reply (with this new account, using Tor),
| and care as little about "karma" as I do about "producing
| value" with my replies to HN.
| doublerabbit wrote:
| Feel the same.
|
| HN is exactly the same as Reddit, just a different crowd and
| less edge. It's nice to think that you can have a civil
| discussion on the HN platform, but you can't. The whole
| internet karma/points thing is a flawed system.
|
| For HN: You should have to give a reason for downvotes. To
| give someone downvote permissions and then allow them to
| downvote something they're biased against isn't fair.
| NateEag wrote:
| > For HN: You should have to give a reason to down votes.
| To give someone downvote permissions and then allow them to
| downvote on something they bias, isn't far.
|
| Because no one would ever lie or put noise in the "why I
| downvoted" field.
| doublerabbit wrote:
| And in that case they don't get to downvote. Problem
| solved.
|
| See, downvoted already; point proven.
| shadowgovt wrote:
| I believe Slashdot experimented with meta-moderation for
| that issue, but I actually never found out what came of
| that system.
| jb775 wrote:
| I noticed during the election that whenever I glanced at the fb
| "watch" video section, it was videos of guys getting into fist
| fights, or videos containing disturbing violence. I don't ever
| watch or search for videos like this, so it's not like it was
| selected based on my watch history or something.
|
| At the time, I figured it was a psychological trick to suck me
| into viewing more ads since violence has that "can't look away"
| nature to it... but now that I think about it, they could have
| been intentionally stirring up angry emotions within the general
| population during the election cycle. Anyone else notice this?
| bob33212 wrote:
| I rarely use Facebook, but when I do they show me some
| woodworking projects. I have never said anything about
| woodworking or joined any woodworking groups on Facebook. They
| are just making a guess that someone like me would find those
| interesting.
| beforeolives wrote:
| You're breaking some data scientist's heart by calling their
| model "just making a guess".
| shadowgovt wrote:
| Much of that content is based on other people's watch
| histories.
|
| It's similar to the process of trending topics on Twitter (and
| one of the reasons that space is such a garbage fire).
| tittenfick wrote:
| Social media algorithms are designed to show you things which
| make you upset because that type of content is highly likely to
| "engage."
| dbtc wrote:
| Aren't they designed to optimize for most engagement, in
| whichever way they can (which turns out to be with upsetting
| content)?
|
| Or are people actually selecting specifically upsetting
| stuff?
| tittenfick wrote:
| Yes, they are designed to select for most engagement,
| regardless of the content.
| secondcoming wrote:
| It's not just algorithms; MSM do this too. 'Fear sells.'
| ACow_Adonis wrote:
| I admit confusion. Isn't unsolicited experimental psychological
| manipulation without consent just another word for most modern
| marketing?
|
| I mean, don't get me wrong, I don't like it and try to exclude
| it from my life and my family's life, but there's pretty wide
| acceptance socially for this type of behaviour.
|
| I would have thought the beauty industry would be an old
| perpetrator that should generally be investigated. Going to
| shut that down?
|
| And negatively instilling fear and distrust in a population
| without their consent for personal gain is about as old as
| politics itself?
|
| Isn't this practically mainstream media behaviour?
|
| FOMO? Status anxiety? Conspicuous consumption? I don't see why
| Facebook should be singled out for society-wide mandated and
| culturally supported practices.
| [deleted]
| luckylion wrote:
| "News media deliberately make people outraged. This ought to be
| the final straw".
|
| I don't have issues with the study in general. You _do_ want to
| know whether and how you can influence people in a positive or
| negative way, especially if you want to avoid it. There's really
| no other way to find out than to study it. They should've gotten
| clear consent for participation in that study, but that's about
| it from my point of view.
| seoaeu wrote:
| "There was nothing wrong except for the lack of consent" is
| really rather missing the point... In other contexts that's the
| difference between acceptable behavior and a felony.
| cryptoz wrote:
| The article was published in 2014, not 2012, I think, btw. The
| event happened in 2012.
| fogof wrote:
| How sad did this experiment really make people? The chart in the
| study says that people who saw fewer positive posts used one
| percent fewer positive words and about 0.3% more negative words.
| But on the other hand, they were seeing fewer positive posts, so
| maybe they were just replying to the posts they saw in a way that
| was natural, without their inner emotional state being very
| affected.
| seattle_spring wrote:
| Excuse me, but this is the HN bi-hourly "Facebook is evil" post
| (tm). Questioning the content is against the rules.
| pindab0ter wrote:
| There's no need for this.
| [deleted]
| imwillofficial wrote:
| Does it matter?
| hackinthebochs wrote:
| I still don't see why I should be outraged over this. A user of
| Facebook already consents to Facebook displaying whatever
| content on their feed it chooses. Why should Facebook require
| extra consent to gather scientific evidence on the effects of
| the content being displayed, given they are only analyzing data
| they already gather?
| cryptoz wrote:
| This is unethical psychological research on non-consenting
| adults and children that likely caused real harm. From a paper
| cited below,
|
| > This Facebook study was conducted without consent and without
| appropriate oversight, and may have harmed both participants
| and non-participants. Kramer's apology also puts the vast
| number of participants in context; a full 0.04% of Facebook's
| users, or 1 in 2500, were unwitting subjects in this unethical
| research. Many of these people were almost certainly children,
| and many of the participants were probably suffering from
| depression. It is surprising and worrying that one of the
| world's most prominent companies should treat both the emotions
| of its users and research ethics so carelessly. Steps must be
| taken to ensure that international psychological and medical
| studies involving social network users are regulated to the
| same standard as other human subjects research.
|
| https://journals.sagepub.com/doi/10.1177/1747016115579535
|
| Edit: Also you said something that confused me,
|
| > given they are only analyzing data they already gather?
|
| I'm not sure what this means. But they did not only analyze
| existing data, they _created new_ data specifically intended to
| make people feel sad, then analyzed _that_ data.
| antiterra wrote:
| Again, that means if they just accepted the wisdom of the
| time, which was the 'common worry that seeing friends post
| positive content leads to people feeling negative or left
| out' and unilaterally limited positive content, they'd still
| possibly be creating more negative feelings and causing this
| 'harm.' But, you know, it'd be all ethical, since they didn't
| look at the data or run an A/B test.
| philplckthun wrote:
| To be fair, since this was a while ago, it's hard to tell
| what to do about it. Personally I find it hard to draw any
| conclusions from this just due to the time that has passed. I
| don't use Facebook, so maybe it's just my distance from it.
| But it did happen, and it's worth stating that this was
| basically a psychological experiment, not a simple A/B test,
| at a company that most likely didn't have an ethics board to
| review it at the time.
|
| Other sources list a couple of principles behind the ethics of
| psychological research. The relevant ones being:
|
| - Minimise the risk of harm
| - Obtain informed consent
|
| Some of them do state that the latter isn't always exactly
| possible, since that may influence the outcome.
|
| But the fact of the matter is that Facebook ran an A/B test
| that could inflict serious harm on the quality of life of the
| participants, who weren't aware of any research being
| conducted. The latter sounds like it'd be at least the minimum
| here.
|
| So, I'm not a psychologist, but this does sound like it
| shouldn't have happened in this way. There were definitely more
| ethical ways of running this experiment that wouldn't have
| involved 700K unknowing and potentially unwilling participants.
| emodendroket wrote:
| Let's imagine hypothetically that sad, negative posts get
| more engagement by whatever metric Facebook uses, and
| Facebook was paying no attention to sentiment at all and
| ended up putting more sad posts on feeds. Would that have
| been unethical? I can't really see what would be so
| different.
| alfl wrote:
| Is it unethical to create an automated system that
| maximizes global unhappiness for profit?
| emodendroket wrote:
| Well, if so the problem goes a bit deeper than Facebook.
| erik_seaberg wrote:
| When a movie makes the audience sad, it wins Oscars; we
| don't censor it. Why should the rules be different for
| Facebook?
| pizza wrote:
| It's hard to get a lot of positive reinforcement by
| interacting with like-minded others at scale through a
| movie. Facebook's original stated intent was to study
| contagion of emotion, which seems to me to suggest a
| multiplayer, interactive effect.
| [deleted]
| tobr wrote:
| That is a banal comparison. When a film makes you sad,
| you are aware of what is going on. If you are unusually
| sensitive to these types of emotions, you can read about
| the film ahead of time to see if you might want to avoid
| it.
| emodendroket wrote:
| Do you typically go read a synopsis of the entire plot of
| a film, including any surprise developments, before
| watching it?
| tobr wrote:
| Yes, that would be deeply unethical. And to make matters
| worse, I believe that's a fairly accurate description of
| how Facebook works.
| emodendroket wrote:
| So how could someone ethically run social media of any
| stripe?
| tobr wrote:
| I don't understand how this question can follow. Are you
| suggesting that social media simply must optimize for
| engagement and not pay attention to negative
| consequences?
| emodendroket wrote:
| What does that mean? Like, we want the Facebook mods to
| delete anything that's too depressing? Sounds more
| dystopian, not less... and I thought we were
| supposed to be worried about "duck syndrome" where
| everyone appears to be having great lives, making you
| feel bad, because you don't see the negatives (like a
| duck paddling underwater, see?).
| jancsika wrote:
| Is it against the rules for me to use HN as a dating site? I'm
| going to pen-test it:
|
| I enjoy cuddling, long walks on the beach, and services that do
| not run social experiments on me like something out of a cheap
| movie plot about a mad scientist.
|
| to contact this user please dial "jancsika" on your rotary phone
| now
| alfl wrote:
| You asked our permission, so unfortunately this experiment is
| not comparable with Facebook's.
| jancsika wrote:
| Fortunately and wittingly it is not comparable!
| neonological wrote:
| This is a psychological experiment on me and all the users on
| this site and I am now outraged.
___________________________________________________________________
(page generated 2021-04-17 23:02 UTC)