[HN Gopher] Harvard concluded that a dishonesty expert committed...
       ___________________________________________________________________
        
       Harvard concluded that a dishonesty expert committed misconduct
        
       Author : Tomte
       Score  : 235 points
       Date   : 2024-03-15 04:53 UTC (18 hours ago)
        
 (HTM) web link (www.chronicle.com)
 (TXT) w3m dump (www.chronicle.com)
        
       | smcin wrote:
       | For excellent ongoing coverage of Francesca Gino and other
       | misconduct cases in academia, esp. behavioral science (such as
       | Dan Ariely), see Pete Judo. Also covers replication failures,
       | data hacking, and much more.
       | 
       | [YouTube: https://www.youtube.com/@PeteJudo1 ,
       | https://www.petejudo.com/]
       | 
        | Also: the DataColada blog [https://datacolada.org/], which made
        | public the misconduct reported by Gino's graduate student.
        
         | russdill wrote:
         | Paul Sutter's new book seems very timely
         | https://rowman.com/ISBN/9781538181614/Rescuing-Science-Resto...
        
         | hashemian wrote:
          | Thanks for the links. I watched some of his videos where he
          | explained how DataColada did their forensic investigation into
          | data manipulation.
          | 
          | What amazes me is how simple the fraud was (or at least the
          | cases reported by Pete!). They basically just opened an Excel
          | file, started from the top, and changed some numbers at random
          | until they reached the effect they were aiming for! Really?
          | What about those who can do more sophisticated data
          | manipulation?
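          | 
          | For a concrete sense of what such forensics can look like,
          | here is a minimal, illustrative Python sketch (not
          | DataColada's actual tooling) that peeks at the calcChain.xml
          | part inside an .xlsx file. Because an .xlsx is just a zip of
          | XML parts, and calcChain.xml records the order in which
          | formula cells were last calculated, rows that show up badly
          | out of sequence there can hint that rows were moved around
          | after the data were entered. The filename "suspect.xlsx" is a
          | placeholder.
          | 
          |     import re
          |     import zipfile
          |     import xml.etree.ElementTree as ET
          | 
          |     def calc_chain_rows(path):
          |         # Read xl/calcChain.xml straight out of the
          |         # .xlsx zip container.
          |         with zipfile.ZipFile(path) as zf:
          |             root = ET.fromstring(zf.read("xl/calcChain.xml"))
          |         rows = []
          |         for cell in root:
          |             ref = cell.get("r", "")   # cell ref, e.g. "B17"
          |             m = re.search(r"\d+$", ref)
          |             if m:
          |                 rows.append(int(m.group()))
          |         return rows
          | 
          |     def out_of_sequence(rows):
          |         # Flag rows that break an otherwise increasing order.
          |         return [r for prev, r in zip(rows, rows[1:])
          |                 if r < prev]
          | 
          |     rows = calc_chain_rows("suspect.xlsx")
          |     print("rows breaking the sequence:", out_of_sequence(rows))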
        
       | KingOfCoders wrote:
       | I guess she is the expert.
        
         | cloudbonsai wrote:
         | > I guess she is the expert
         | 
          | Absolutely. I recommend everyone look at her publication
          | history on the HBS website:
         | 
         | https://www.hbs.edu/faculty/Pages/profile.aspx?facId=271812&...
         | 
         | Her featured work is a book titled "Why It Pays to Break the
         | Rules at Work and Life."
         | 
          | One of her most cited papers is "The Dark Side of Creativity:
          | Original Thinkers Can Be More Dishonest".
        
           | sitkack wrote:
           | Maybe she wasn't committing fraud, but original research.
        
             | xkcd-sucks wrote:
             | If enough people read and internalize the fraudulent
             | conclusions via pop science journalism etc., maybe the
             | effect will become measurable enough to replicate the
             | original paper :)
        
       | GauntletWizard wrote:
       | https://archive.is/6BrOv
        
       | jose_zap wrote:
       | Researchers should tick a checkbox "I swear I did not hack the
       | data" before submitting a paper to a peer-reviewed journal to
       | prevent this kind of misconduct.
        
         | dtech wrote:
         | Isn't this just basically fraud? I'm sure it's already covered
         | by the existing things you sign, but surprisingly that doesn't
         | stop people who are willing to commit fraud.
        
           | thaumasiotes wrote:
           | I assume that in this particular case, it's a joke referring
           | to the subject of the study, which involved similar, if even
           | weaker, assurances.
           | 
           | In general, this kind of thing is oddly common. It's all over
           | government forms. I just interviewed with a Chinese father
           | who wanted me to spend time with his children providing
           | exposure to English. He asked me whether I had a criminal
           | record.
           | 
           | I don't, but if I did, and I chose to lie about it, random
           | Chinese parents would never know the difference. (Though
           | entering China might have been a challenge.) Why ask?
        
             | gwd wrote:
             | > I don't, but if I did, and I chose to lie about it,
             | random Chinese parents would never know the difference.
             | (Though entering China might have been a challenge.) Why
             | ask?
             | 
             | I think you'd be surprised how many people are really bad
             | at lying; or even bad at acting normal when they think they
             | have something to hide, particularly when asked such a
             | question unexpectedly. Sure, some people with a criminal
             | record may be able to lie plausibly when put on the spot,
             | but there are a reasonable number who would give themselves
             | away even if denying it. No real cost, some benefit, so why
             | not ask?
        
               | thaumasiotes wrote:
               | Well, I sanitized my report of the interview. He actually
               | asked if I had a bad history. I was confused and said I
               | didn't understand. He clarified that he was asking about
               | things like crime.
               | 
               | So the element of getting suddenly put on the spot
               | doesn't really apply. That's already about as awkward as
               | communication gets.
               | 
               | More importantly, though, I don't think the point is
               | correct to begin with. This is fine:
               | 
               | > I think you'd be surprised how many people are really
               | bad at lying; or even bad at acting normal when they
               | think they have something to hide, particularly when
               | asked such a question unexpectedly.
               | 
               | But job applicants with criminal records are only going
               | to match this description once or twice. For the rest of
               | their lives, it's not going to be an unexpected question,
               | and they'll have lots of practice in denying their record
               | if that's the way they choose to go. You're just never
               | going to catch anyone out this way.
        
               | gwd wrote:
               | > But job applicants with criminal records are only going
               | to match this description once or twice. For the rest of
               | their lives, it's not going to be an unexpected question,
               | and they'll have lots of practice in denying their record
               | if that's the way they choose to go. You're just never
               | going to catch anyone out this way.
               | 
               | You're right, you're only going to catch people out if
               | you're the only one asking this kind of question... but
               | right now that's more or less true of the people in your
               | story. First-mover advantage. :-)
        
             | woah wrote:
              | Why do forms require signatures when they're easy to forge
              | (especially if you're submitting electronically), and when
              | evidence of the agreement is gathered from other sources if
              | it comes up in court?
              | 
              | Because a signature boils down what could have started as a
              | nebulous mix of halfway-unethical actions into a single
              | fraudulent act that can be pointed to and punished later,
              | and it will give a lot of people the impetus to back out of
              | the fraud once they see themselves about to commit an
              | unambiguously illegal act.
        
             | triceratops wrote:
             | Can't they run a background check? It's common to check a
             | tenant out when renting a property.
        
         | evandijk70 wrote:
         | They already do for a lot of publications. Check the 'reporting
         | summary' in this random article
         | https://www.nature.com/articles/s41586-024-07171-z#rightslin...
        
           | mattkrause wrote:
           | They do, but those "Reporting Summaries" are useless
           | makework, IMO.
           | 
            | First, you fill them out _after_ the data has been collected,
            | analyzed, and written up. It's perhaps helpful as a reminder
            | to include a few tidbits in the text (e.g., the ethics
            | approval #), but literally _no one_ is going to fill this
            | form out, realize the sample size is way too small, and...
            | abandon the manuscript.
            | 
            | Second, you don't actually want people to comply with the
            | instructions. For example, it asks for "A description of any
            | assumptions or corrections, such as tests of normality". A
            | decent number of statisticians argue that you _shouldn't_ be
            | using normality tests to choose between parametric and non-
            | parametric stats. On top of that, nobody actually writes out
            | the assumptions behind OLS in their paper either.
           | 
           | I am deeply skeptical that this cookie-cutter stuff actually
           | helps in any meaningful way. It feels like rigor-theatre
           | instead.
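            | 
            | To make the contested step concrete, here is the kind of
            | pre-test-then-branch workflow that gets criticized, as a
            | small made-up Python/SciPy sketch (the data and the 0.05
            | cutoff are arbitrary illustrative choices, not anything the
            | reporting summary prescribes):
            | 
            |     import numpy as np
            |     from scipy import stats
            | 
            |     rng = np.random.default_rng(0)
            |     a = rng.normal(0.0, 1.0, 40)
            |     b = rng.normal(0.3, 1.0, 40)
            | 
            |     # Step 1: pre-test each group for normality
            |     # (the step many statisticians object to).
            |     _, p_a = stats.shapiro(a)
            |     _, p_b = stats.shapiro(b)
            | 
            |     # Step 2: branch on the pre-test result.
            |     if p_a > 0.05 and p_b > 0.05:
            |         result = stats.ttest_ind(a, b)     # parametric
            |     else:
            |         result = stats.mannwhitneyu(a, b)  # rank-based
            |     print(result)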
        
         | red_admiral wrote:
         | And the checkbox should be at the top of the submission form :)
        
           | zaphirplane wrote:
           | It's Pre-ticked
        
           | seagullz wrote:
           | And a question at the bottom asking "do you feel the urge to
           | wash your hands with soap?"
        
           | 2cynykyl wrote:
           | You just doxxed yourself as Dan Ariely :-)
        
           | lern_too_spel wrote:
           | Background for the joke:
           | https://www.buzzfeednews.com/article/stephaniemlee/dan-
           | ariel...
        
         | kmeisthax wrote:
         | This sounds RFC 3514 compliant.
        
       | DominikPeters wrote:
       | Direct link to committee report:
       | https://s3.documentcloud.org/documents/24481375/gino-hbs-inv...
        
       | gwd wrote:
       | Reportedly, a lot of people who choose to study psychology are
       | motivated originally to figure out what's going on in their own
       | heads (which they have a sense is not quite normal) [1]. I wonder
       | if there's a similar dynamic with "honesty / ethics": people who
       | lack a native impulse to be honest / ethical and are curious
       | about people who do.
       | 
       | [1] See e.g., https://news.ycombinator.com/item?id=39703638 from
       | yesterday
        
         | beezlebroxxxxxx wrote:
          | An alternative, and more likely imo, hypothesis is that
          | attaining and maintaining the social (and probably $$$) capital
          | of being a "Harvard Expert" leads to desperation and breaking
          | the rules.
        
           | gwd wrote:
           | Sure, that might have been the "trigger"; but that's missing
           | the key thing that needs to be explained. If this were
           | dishonesty by someone doing chemistry, it's unlikely that
           | this would have hit the front page of HN. As they say, "Dog
           | bites man isn't a story; man bites dog is."
           | 
           | But a priori, you'd expect someone studying honesty* to
           | personally care about honesty, and thus to be _less_ likely
            | to give in to these sorts of pressures. That's the thing
            | that needs to be explained; the "man bites dog" aspect.
           | 
           | * EDIT s/honestly/honesty/g;
        
             | Kamq wrote:
             | > But a priori, you'd expect someone studying honestly to
             | personally care about honestly, and thus to be less likely
             | to give in to these sorts of pressures.
             | 
             | I'm not so sure you can automatically assume this. You can
             | definitely assume that the subject interests them.
             | 
             | But it's also possible that someone studying
             | honesty/dishonesty might start seeing the subject in
             | academic/technical terms instead of moral terms. Which may
             | give them much less of a disincentive to be dishonest than
             | the average person.
             | 
             | Which is to say that repeated studying and analysis of
             | instances where people are dishonest may break down the gut
             | reaction people have to being dishonest.
        
               | gwd wrote:
               | I didn't say one _should_ assume it, just that many
               | people _do_.  "Chemistry researcher committed fraud" is
               | simply not the same as "Dishonesty expert committed
               | fraud".
               | 
               | You give an alternate explanation, but it's still an
               | explanation; one which wouldn't be needed (and indeed
               | wouldn't apply) to a chemistry researcher.
               | 
                | A variation on your explanation might be: Dishonesty
                | researchers discover _just how easy_ it is for dishonest
                | people to cheat the system, and how little consequence
                | there is, and so are more tempted to be dishonest.
        
             | verisimi wrote:
             | > But a priori, you'd expect someone studying honestly to
             | personally care about honestly
             | 
             | But then, why study honesty if you already know what it is?
             | 
             | For me, this sort of thing - an honesty expert - is in the
             | same realm as 'ethics panels'.
             | 
              | These folks are there to abuse edge-case arguments (think
              | "trolley problem") in order to provide moral cover for
              | corporations to act dishonestly and unethically. And they
              | will provide documentation in support. CEOs will just do
              | what they wanted anyway, but call it "moral"(tm).
        
             | JackeJR wrote:
             | The researcher was studying dishonesty... So I gather that
             | she was more interested in dishonesty?
        
             | yellowstuff wrote:
             | Probably the most important reason this story got big is
             | because it involved Dan Ariely, a popular author and
             | perhaps the most famous active researcher in the world.
        
               | hn_throwaway_99 wrote:
               | Strong disagree. Literally every story I've read about
               | this starts by talking about Francesca Gino.
        
             | zaphirplane wrote:
              | Lawyer breaking the law, accountant cheating on taxes,
              | doctor stealing kidneys, locksmith moonlighting as a cat
              | burglar.
             | 
             | People using their skills dishonestly is the theme of the
             | story
        
               | globalnode wrote:
               | computer professional caught hacking!
        
             | almostnormal wrote:
             | > As they say, "Dog bites man isn't a story; man bites dog
             | is."
             | 
              | The headline is just making fun of theory vs. practice. If
              | someone is a murder expert, is that someone who understands
              | murder or someone highly qualified to kill people?
              | 
              | Therefore, the title is about a "dishonesty expert"
              | matching the practice, not about an "honesty expert", which
              | could be replaced by any other field.
        
               | philwelch wrote:
               | Reminds me of the grad student in criminology who
               | allegedly committed a brutal multiple murder in Moscow,
               | Idaho. He got caught, so I guess he wasn't as smart as he
               | thought he was.
        
               | pixl97 wrote:
                | > not about an "honesty expert", which could be replaced
                | by any other field.
               | 
               | Surely not marketing and sales!
        
         | willcipriano wrote:
         | A thief has the most locks on his door.
        
         | solfox wrote:
          | I think it's simpler than that: there's a heck of a lot of
          | dishonesty in academia, but the dishonest dishonesty researcher
          | grabs the headline.
        
           | bartwr wrote:
           | Yes, my experience with academics is that there are a lot of
           | very dishonest people. They are political bullies who also
           | lie in their research.
           | 
            | Chances of being caught are close to zero (I have many times
            | contacted authors of papers whose work I was unable to
            | replicate - most of the time zero reply, sometimes "yeah it
            | was an honest mistake, oops"), super-high competition (only a
            | few tenured positions per year in all of the world's
            | high-visibility institutions), and full control over
            | students' futures, with the ability to force them to commit
            | fraud (and later blame it on them).
           | 
           | Obviously, not all, blah blah - but many academic scientists
           | are the last people that should be doing science.
        
         | whack wrote:
         | Interesting hypothesis but my money is on the opposite
         | direction of causality. When you're surrounded by dishonesty on
         | a daily basis, you get de-sensitized to it. You start to see it
         | as "normal" or as "everyone does it". And then you start to
         | think _" Oh, what's the harm if I just fudge a little bit here
         | and there. It's not like I'm profiting off of this. Unlike all
         | these other millionaires who do far worse"_
         | 
          | Sure, it's easy for us to think _"I would never do anything
         | like that, no matter what others around me are doing"_. But as
         | someone who has lived in multiple countries, trust me - the
         | vast majority of the things we do are simply a reflection of
         | what we see other people doing.
        
         | simpletone wrote:
         | > Reportedly, a lot of people who choose to study psychology
         | are motivated originally to figure out what's going on in their
         | own heads
         | 
          | People who go into psychology are more interested in what's
          | going on in other people's heads and, more importantly, in
          | manipulating other people. It's more about controlling others
          | than controlling oneself. It's why psychology was founded in
          | the first place.
         | 
         | > people who lack a native impulse to be honest / ethical and
         | are curious about people who do.
         | 
          | Ethics isn't about studying people. It's about studying
          | principles, i.e., what does ethics mean? What makes an act
          | ethical vs. unethical? And so on. You can delve into the
          | ethics of gods, God, AI or even animals. Are ethical
          | principles universal or not? And so on and so forth.
        
           | staunton wrote:
           | > manipulating other people. [...] It's why psychology was
           | founded in the first place.
           | 
           | What are you referring to here? Which specific founder(s)
           | wanted to (or did) manipulate people?
        
         | pessimizer wrote:
         | I think it's more likely that psychology offers a lot of
         | potentially marketable propositions, such as lie detection,
         | effective lying, covert behavioral control, secret information
         | about the movements of financial markets, etc. Therefore, a lot
         | of snake oil is sold, and people's careers advance
         | proportionally to the amount of snake oil they can sell.
         | 
         | The job has often been to come up with a marketable theory; to
         | design experiments and write papers to imply that that theory
         | is true without quite proving it; and to avoid the possibility,
         | _through any means,_ that someone will weaken or disprove the
         | theory.
        
         | rossdavidh wrote:
         | Within academia, this is known as "research is me-search".
        
       | iforgotpassword wrote:
        | It's one of those days where I have to double-check I'm not on
        | an HN-themed version of The Onion.
       | 
       | "CEO of data privacy company Onerep.com founded dozens of people-
       | search firms"
       | 
       | "Harvard concluded that a dishonesty expert committed misconduct"
       | 
        | And depending on the general mood I'm in, this is kind of a
        | downer sometimes, like apparently only assholes make it in this
        | world because you cannot compete with others if you have a
        | conscience and they don't. Sorry for the tangent, but it's just
        | one of those days somehow.
       | 
       | The McDonald's outage at least was good for a quick laugh when I
       | read the "unexpected world's health day" comment.
        
         | vsnf wrote:
         | > you cannot compete with others if you have a conscience and
         | they don't.
         | 
         | This is self evidently true, though. A conscience is a
         | constraint on behavior. People without this constraint can
         | accomplish more things. Of course, they may end up suffering
         | for their actions, but it's a surefire way to short-term
         | success, and not unlikely to lead to long-term success as well.
        
           | trashtester wrote:
           | > This is self evidently true, though.
           | 
           | I absolutely disagree.
           | 
            | Conscience, empathy, shame and similar emotions/instincts
            | that encourage pro-social behavior would only have developed
            | if they led to higher reproductive fitness for those who had
            | them compared to those who didn't.
            | 
            | For instance, only a very small percentage of people are
            | psychopaths. If we look at the average life outcome for
            | psychopaths, we discover that while some of them may be
            | super-successful, we also find a lot of them as social
            | outcasts or in prison. And in areas with active law
            | enforcement, they're also more likely to get killed.
           | 
           | I would say the reason should be obvious. People without
           | these guardrails on behavior may get away with anti-social
           | behavior for a time, but eventually people figure them out.
           | This leads to all sorts of active and passive punishment,
           | ranging from difficulty making friends or holding on to a job
           | all the way to life in prison or execution.
           | 
           | SOME people may negate the downsides by acting in a super-
           | rational way. But this only works for people who are both
           | highly intelligent and who also have very good impulse
           | control.
           | 
           | But for most people, having at least moderate levels of pro-
           | social instincts/emotions will be just as beneficial for
           | themselves as for the people around them.
        
             | hackerlight wrote:
             | Genetic fitness != financial success. Moreover, ancestral
             | environment != modern environment. Not saying you're wrong,
             | but it's not a straightforward application of evolutionary
             | thinking.
        
               | trashtester wrote:
                | I was interpreting "this world" a bit more widely than
                | just the work environment in high-level academia.
                | 
                | There are certainly environments where anti-social
                | behavior can be optimal. But there are also a LOT of
                | environments where pro-social behavior is rewarded.
                | Including, but not restricted to, within families.
                | 
                | Anyway, you're right that we're not living in our
                | ancestral environment. But I don't think we're in a world
                | that actually rewards anti-social behavior less (EDIT)
                | than much of our evolutionary history did.
                | 
                | Rather, I think cynical outlooks like that of the OP
                | actually appear for reasons such as:
                | 
                | 1) People who are themselves anti-social, who either
                | actually think that everyone else is the same, or at
                | least try to normalize such tendencies.
               | 
               | 2) People who, during early development, had a very naive
               | world view and then when faced with reality, became so
               | disillusioned that they went to the other extreme.
               | 
               | I think the vast majority of people on earth today still
               | live in environments where some amount of compassion,
               | conscience and empathy are useful traits (for
               | themselves).
               | 
               | However, in a lot of cases, it can also be useful to have
               | the ability to turn such emotions on or off depending on
               | the context. Once we start to think of some other group
               | of people (or individuals) as either a mortal enemy or a
                | kind of prey, a lot of us can find such an off-switch
               | when dealing with them.
        
             | everforward wrote:
             | > Conscience, empathy, shame and similar emotions/instincts
             | that encourage pro-social behavior would only have
             | developed if it lead to higher reproductive fitness for
             | those who had them compared to those who didn't.
             | 
              | It's a classic prisoner's dilemma. In aggregate, the best
             | option is cooperation. For an individual, the best option
             | is betrayal in a society where everyone else follows the
             | rules.
             | 
             | Betrayers can't become dominant because the worst option
             | for both the individual and society is everyone being a
              | betrayer. It ruins social cohesion, so there are no
              | synergies to reap rewards from.
        
               | trashtester wrote:
               | > For an individual, the best option is betrayal ...
               | 
               | Well, this applies to prisoner's dilemma games if you
               | know exactly how many times you will "play" the game with
               | a given other player, including the case where you will
               | only play one time.
               | 
               | But for repeated prisoner's dilemma games, where you
                | don't know which game is the last against a given player,
               | a tit-for-tat strategy massively dominates an always-
               | defect strategy.
               | 
                | Humans have evolved in environments where repeated
                | prisoner's dilemma situations are common, and have
                | evolved emotions such as empathy and conscience for
                | interactions with cooperative agents.
                | 
                | We ALSO have evolved (to a lesser or greater degree) the
                | ability to turn such emotions off when dealing with
                | people who attempt the always-defect strategy (such as
                | psychopaths).
                | 
                | In fact, psychopathic behavior only becomes viable in
                | populations that have been using tit-for-tat for so long
                | that they've "forgotten" how to punish defectors (i.e.
                | almost all agents have switched to always-cooperate).
               | 
               | But once you introduce a few always-defect agents into
               | the population, the tit-for-tat agents will soon dominate
               | again.
               | 
                | At least, that's the brain wiring for interacting with
               | members of our own "tribe". When dealing with people
               | outside our "tribe", it's much more likely that we will
               | only play them once, and in such cases, always-defect may
               | indeed dominate.
               | 
                | However, there is also a tribe-level game, where the
                | tribes act as the agents. And in tribe-vs-tribe games,
               | depending on circumstances, both tit-for-tat and always-
               | defect can be optimal strategies. Always-defect can,
               | ultimately, lead to genocide, while tit-for-tat can lead
               | to alliances, trade and intermarriage.
               | 
               | People who think that always-defect is always the optimal
               | mode of behavior tend to be people who either don't
               | consider it for their close relations, or who have very
               | few close relations. For instance, someone who lives
                | alone in the city and is either unemployed or tends to
               | have purely transactional work relations.
               | 
                | What such people tend to forget is that there is an
                | artificial presence in the city, called the police, that
                | can impose a level of safety that removes the need for
                | high-trust, pro-social relationships. In a society with
                | police, all you have to do is know what behavior lands
               | you in prison, and otherwise you can be a selfish
               | bastard.
               | 
               | As soon as you remove the police, loners like that are
                | fair game for "predators", and only those who have a
               | "tribe" have any kind of protection. Such a "tribe" can
               | be a clan in an Afghan mountain area or it can be a gang
               | in a high-crime city. While a gang member may have little
               | to no empathy or conscience when dealing with outsiders,
                | maintaining good relations with the other members can be
                | critical (depending on what specific gang it is).
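                | 
                | To make the tit-for-tat claim above concrete, here is
                | a toy iterated prisoner's dilemma in Python. The payoff
                | values and round counts are arbitrary illustrative
                | choices, not taken from any study:
                | 
                |     import random
                | 
                |     # payoff[(me, them)] -> my score
                |     # C = cooperate, D = defect
                |     PAYOFF = {("C", "C"): 3, ("C", "D"): 0,
                |               ("D", "C"): 5, ("D", "D"): 1}
                | 
                |     def tit_for_tat(opp_moves):
                |         # Cooperate first, then copy the
                |         # opponent's previous move.
                |         return opp_moves[-1] if opp_moves else "C"
                | 
                |     def always_defect(opp_moves):
                |         return "D"
                | 
                |     def match(a, b, rounds):
                |         sa = sb = 0
                |         ha, hb = [], []  # opponents' past moves
                |         for _ in range(rounds):
                |             ma, mb = a(ha), b(hb)
                |             sa += PAYOFF[(ma, mb)]
                |             sb += PAYOFF[(mb, ma)]
                |             ha.append(mb)
                |             hb.append(ma)
                |         return sa, sb
                | 
                |     random.seed(0)
                |     n = random.randint(50, 150)  # unknown match length
                |     for x, y, name in [
                |         (tit_for_tat, tit_for_tat, "TFT  vs TFT "),
                |         (tit_for_tat, always_defect, "TFT  vs AllD"),
                |         (always_defect, always_defect, "AllD vs AllD"),
                |     ]:
                |         print(name, match(x, y, n))
                | 
                | With these payoffs, two tit-for-tat players earn 3 points
                | per round each, two always-defect players earn only 1 per
                | round, and always-defect exploits tit-for-tat for just a
                | single round before the pair settles into mutual
                | defection.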
        
               | everforward wrote:
               | I suppose I should have clarified that always defecting
               | is optimal if you're optimizing purely selfishly for
               | material goods. It won't make anyone feel fulfilled and
               | loved.
               | 
               | > But for repeated prisoner's dilemma games, where you
                | don't know which game is the last against a given player,
               | a tit-for-tat strategy massively dominates an always-
               | defect strategy.
               | 
               | If we're considering repeated prisoner's dilemma games,
               | you also have to consider the gain/loss of power between
               | the various players. The classic prisoner's dilemma
               | consists of two equally impotent players. Over a series
               | of games, a player could aggregate enough power to change
               | the nature of the game. They could aggregate enough
               | power/resources to alter the parameters of the game, like
               | using those resources to offer a substantial reward for
               | cooperating with them or confessing. Inversely, they
               | could use that power to enforce penalties on other
               | players for betraying them.
               | 
               | The question then becomes whether an always-defect player
               | can accumulate enough resources to change the rules of
               | the game before the other players switch to tit-for-tat.
               | 
               | The answer in the real world appears to be "mostly yes".
               | Uber appears to have chosen "always defect", but they got
               | enough power through their defections that we've become
               | effectively powerless to penalize them for defecting.
               | Theranos is a counter-example where they were eventually
               | punished, though they made the mistake of playing the
               | prisoner's dilemma against already powerful players.
               | 
               | Without getting too political, Trump has a history of
               | defecting on business relationships and the man was
               | elected president.
               | 
                | > At least, that's the brain wiring for interacting with
               | members of our own "tribe". When dealing with people
               | outside our "tribe", it's much more likely that we will
               | only play them once, and in such cases, always-defect may
               | indeed dominate.
               | 
               | Within the tribe, the power levels tend to be flatter,
                | though, making it easier for an always-defect player to
               | seize enough power to not have to play the game anymore.
               | I.e. if the player can become moderately wealthy, the
               | resources they offer their tribe would likely outweigh
               | their history of defection. Company towns are an example
               | of this; virtually all the residents agreed the company
               | was awful, and yet they stayed (for a long while).
               | 
               | > As soon as you remove the police, loners like that are
                | fair game for "predators", and only those who have a
               | "tribe" have any kind of protection. Such a "tribe" can
               | be a clan in an Afghan mountain area or it can be a gang
               | in a high-crime city. While a gang member may have little
               | to no empathy or conscience when dealing with outsiders,
                | maintaining good relations with the other members can be
                | critical (depending on what specific gang it is).
               | 
               | Sure, but that's not the prisoner's dilemma anymore, so
               | always-defect is probably not an optimal choice.
               | 
               | A crux of the prisoner's dilemma is that a player can
               | benefit by choosing to defect when the other player
               | chooses to cooperate. If cooperating results in the best
               | individual outcome and the best societal outcome, it's
               | not a dilemma anymore. Cooperating is the obvious best
               | choice.
               | 
               | This is part of what makes always-defect viable. Most
               | situations do not mirror the prisoner's dilemma, and
               | cooperating is often the best individual outcome. If you
               | look at an always-defect player, they might be
               | cooperative in 9/10 or 99/100 situations because only
               | 1/10 or 1/100 were prisoner's dilemmas where they could
               | benefit by defecting.
               | 
               | I don't think "always defect" is globally optimal, only
               | for prisoner's dilemmas. Always-defect globally implies
               | doing so even when it is clearly a sub-optimal choice
               | (like driving your car into a brick wall just because the
               | police said not to). The parameters of the dilemma scope
               | it to only situations where defecting confers gain.
        
               | silvestrov wrote:
               | > repeated prisoner's dilemma games
               | 
                | I think this is why communities are so important: so
                | that your interactions with other people are mostly with
                | the same people instead of mostly strangers.
                | 
                | The other important aspect is that people think "we have
                | the same values, so we are on the same team", as you
                | will otherwise always choose to defect due to mistrust
                | of the other party.
                | 
                | The police on their own are not enough; they cannot
                | solve most crimes without help from citizens. A police
                | officer without good informers is not good police ("The
                | Wire", see IMDB). So police are only really effective in
                | a high-trust, pro-social society.
        
           | randysalami wrote:
           | A constraint can lead to more power e.g. the power of
           | abstractions in code. On a more grounded note, having a
           | constraint (a creed or code that limits you) can be a visible
           | signal to others that you're worth following. With these
           | bonds strengthened by self-imposed constraint, you can
           | accomplish more together than you could alone but unfettered
           | by limitation (debatable). This is a theme in many movies and
           | stories where the main character's restraint inspires others
           | to great deeds and gives them the resolve to do the
           | unthinkable (Star Wars). Of course, a person with no
           | conscience can always feign this restraint... (Palpatine)
        
           | mistrial9 wrote:
           | congratulations - a completely self-centered analysis amidst
           | a world of systems of systems
        
           | javajosh wrote:
           | _> This is self evidently true, though._
           | 
           | I think you mean "this is intuitively true" because, like so
           | many intuitive conclusions, it is wrong. Looking at animal
           | behavior, it's intuitively true that organisms will only win
           | if they are selfish, and yet selfless behavior has evolved in
           | countless species. Another intuitive truth is that when you
           | have a powerful species of predator that can kill prey at
            | will, they will decimate the prey population and end up
           | dying off from lack of food. Yet most ecosystems can and do
           | support both predators and prey in equilibrium. In business,
           | it's intuitively true that it would be stupid to make another
           | fast-food hamburger business in the presence of McDonalds -
           | yet Carl's Junior/Hardees did very well doing just that.
           | Counter-examples abound to the intuitive conclusion.
           | 
           | One of the best habits you can form is the habit of
           | questioning your intuition.
        
         | orzig wrote:
         | Adding a little bit of optimism: there are many decent people
         | in all of these fields. They might not as often make the news
         | (initially for extraordinary "achievements" then for
         | downfalls), but it is entirely possible to "make it in this
         | world" (supporting a family, taking the occasional vacation,
         | retiring at age 65) while being honest.
        
         | itsoktocry wrote:
         | > _like apparently only assholes make it in this world because
         | you cannot compete with others if you have a conscience and
          | they don't. Sorry for the tangent, but it's just one of those
         | days somehow._
         | 
         | We are surrounded by psychopaths. We have politicians willing
         | to sell out to foreign countries for what isn't even massive
          | wealth[0]. We have government workers "working from home" on
          | side gigs on government contracts[1]. And these are examples of
         | "nice" ones that don't involve wars. It's crazy.
         | 
         | [0] https://www.justice.gov/usao-sdny/pr/us-senator-robert-
         | menen...
         | 
          | [1] https://nationalpost.com/news/politics/auditor-general-
         | fired...
        
         | vundercind wrote:
         | This and other factors are why n-gate's HN article descriptions
         | were so good. Most of the time they were basically more
         | accurate and honest than the actual articles or most posts on
         | this site, while also being concise. Cynical? Yeah, but...
         | accurate.
        
         | ninju wrote:
         | For further details on the "data privacy company Onerep.com"
         | discovery
         | 
         | https://news.ycombinator.com/item?id=39709089
        
         | 20after4 wrote:
         | > like apparently only assholes make it in this world because
         | you cannot compete with others if you have a conscience and
         | they don't.
         | 
         | I think this is the key fact of life that it took me far too
         | long to realize. They say that "cheaters never win and winners
         | never cheat" but it's actually pretty much the opposite of
         | that.
        
         | BurningFrog wrote:
         | This is true in some fields and not at all in others.
        
       | vouaobrasil wrote:
       | The article states that this researcher falsified data. While I
       | rather abhor dishonesty, especially being a former academic
       | myself, one must keep in mind the following: 90% of academia is
       | rotten in the sense that the research being done is being done
       | because the academic path requires _some_ kind of research. This
       | in turn is the case because research groups and departments want
        | to keep themselves alive, and the only way to stay alive is to
        | gain funding. Finally, and worst of all, gaining funding has been
       | almost entirely divorced from the usefulness that research has to
       | people.
       | 
        | As a counterpoint, I do support basic research and curiosity,
        | which I fundamentally believe have value to societies, but the
       | research being done is hardly out of curiosity. Instead, these
       | perverse incentives have led to an environment where basic
       | curiosity is paradoxically discouraged.
       | 
       | The end result is that large portions of academia are filled with
       | useless machines doing useless work. As a result, we're getting
       | these pathological cases. And we can think about this fraud on
       | one more level: don't think that it will necessarily hurt
        | academia... a fraud is necessary every once in a while so that
       | everyone else shines just a little brighter.
        
         | itsoktocry wrote:
         | > _Finally, and worst of all, gaining funding has been almost
         | entirely divorced from the usefulness that research has to
         | people._
         | 
         | By "usefulness to people" do you mean, essentially, "good for
         | society"? Because I have a hard time believing that funding
         | entities are simply flushing money down the toilet. They must
         | find the research useful to their ends.
        
           | vouaobrasil wrote:
           | Yes, I mean good for society. They are not flushing money
           | down the toilet: who do you think makes funding decisions?
           | Who do you think reads the funding proposals? Scientists.
           | Scientists are giving money to each other to keep their
           | efforts alive. Scientists don't do things for the good of
           | humankind, they do it for fame and pure intellectual
           | curiosity.
           | 
           | Of course, there are benefits to those outside of science:
           | science furthers the growth of technology, but is that really
           | beneficial to us? Yes, we are thrown a bone now and then that
            | is useful such as a vaccine here and there, but the
           | benefits that actually make life better probably account for
           | 1% of scientific activity these days.
        
           | red_admiral wrote:
           | There is definitely a lot of money thrown after whatever the
            | current buzzword is - we had blockchain, now we have AI;
            | sustainability is also a big one in some places. That doesn't
            | mean that there aren't great things one could achieve for
           | society in both the AI and sustainability fields, but having
           | the correct keywords on your project proposal goes a long way
           | even if you quite obviously don't have a clue what those
           | words mean.
        
         | BurningFrog wrote:
         | I've come to think we need to ignore the current science
          | system/establishment and start over with something[1] better.
          | 
          | [1] To be determined :)
        
           | vouaobrasil wrote:
           | For the most part, I absolutely do agree with you. I would
           | say something like more traditional knowledge emphasizing the
           | relationship with the earth. I would get rid of 90% of
           | science, at least.
        
       | j7ake wrote:
       | The sad part about the field is that in the end nobody really
        | cares which science turned out to be fake... because nobody
       | actually cares about the specific results that come out of
       | psychology research.
        
         | tucnak wrote:
         | Microbiology is similarly rife with fraud
        
           | burnished wrote:
           | How so?
        
           | kjkjadksj wrote:
           | At least you can more easily verify an experimental result
            | that takes a million E. coli vs one that takes a million
           | depressed people.
        
           | theGnuMe wrote:
           | Just wait until it's all in AI LLMs... fun times ahead.
        
         | sitkack wrote:
         | That seems like a worse thing, it says something about the
         | field. There are always going to be individuals who do things
         | wrong, or wrong things. But ultimately, it really comes down to
         | how the group behaves and the goals they seek.
        
         | gustavus wrote:
          | The fact of the matter is that psychology is at this point
          | more pseudo than science.
         | 
         | The biggest factor impacting the outcome of an experiment in
         | psych is what the researcher conducting the study wants to be
         | true. The whole subject is broken down in this article:
         | https://slatestarcodex.com/2014/04/28/the-control-group-is-o...
        
           | nathan_compton wrote:
           | I think you're kind of mischaracterizing this article. The
           | conclusion drawn isn't really that psychological research is
           | wrong per se, but that the statistical and methodological
            | tools which we _believed_ sufficed to allow us to formulate
            | experiments that make decisive epistemological statements
            | apparently do not do so. Scott Alexander isn't impugning the
            | behavior of most of these scientists or even the endeavor of
            | social psychological research. He is simply observing that
            | our methods are clearly insufficient for attributing strong
            | epistemological weight to results.
        
             | pessimizer wrote:
             | > I think you're kind of mischaracterizing this article.
             | The conclusion drawn isn't really that _psychological
             | research is wrong per se_
             | 
             | This is a characterization that you have just made, not one
             | that the person you are replying to made. They were very
             | clear: "The biggest factor impacting the outcome of an
             | experiment in psych is what the researcher conducting the
             | study wants to be true."
             | 
             | Even to say something is "more pseudo than science" almost
             | directly contradicts your characterization, because it
             | implies that there is _both science and pseudoscience_.
        
           | slt2021 wrote:
           | Psychology is just vibes, dressed up as science
        
         | andoando wrote:
         | I certainly don't. I see so many scientific/philosophical
          | issues with it, and I'm surprised anyone takes it seriously.
        
         | chubot wrote:
         | Yup, that's the point this article makes:
         | 
         |  _I'm so sorry for psychology's loss, whatever it is_ -
         | https://www.experimental-history.com/p/im-so-sorry-for-psych...
         | 
         |  _This whole debacle matters a lot socially: careers ruined,
         | reputations in tatters, lawsuits flying. But strangely, it
          | doesn't seem to matter much scientifically. That is, our
         | understanding of psychology remains unchanged ..._
         | 
         |  _That might sound like a dunk on Gino and Ariely, or like a
         | claim about how experimental psychology is wonderfully robust.
         | It is, unfortunately, neither. It is actually a terrifying fact
         | that you can reveal whole swaths of a scientific field to be
          | fraudulent and it doesn't make a difference._
         | 
         | ---
         | 
         | Also interestingly the author apparently studied under Dan
         | Gilbert at Harvard, so he's an "insider". I remember >10 years
         | ago seeing the "happiness science" from Gilbert go around:
         | 
         | https://blog.ted.com/ten-years-later-dan-gilbert-on-life-aft...
         | 
         | Although I actually think the core insight is a good one, and a
         | memorable one -- people are unable to predict what makes them
         | happy. They think they will be happy if they buy a new car to
         | show off, but if you ask them afterward, that didn't really
         | happen.
         | 
         | There was another one of these "TED memes" that turned out to
         | be widely mocked / unreplicable:
         | 
         |  _When the Revolution Came for Amy Cuddy_
         | 
         |  _As a young social psychologist, she played by the rules and
         | won big: an influential study, a viral TED talk, a prestigious
         | job at Harvard. Then, suddenly, the rules changed._
         | 
         | https://www.nytimes.com/2017/10/18/magazine/when-the-revolut...
        
         | RowdyTomato wrote:
         | I understand where you are coming from. However, this overlooks
         | a critical stakeholder group directly affected by the outcomes
         | of such research: patients.
         | 
         | For instance, controversies surrounding the PACE trial, which
         | explored treatments for Chronic Fatigue Syndrome (ME/CFS),
         | illustrate the profound implications of research integrity.
         | Critics have condemned the trial as "biased and profoundly
         | flawed," arguing that its deficiencies have led to detrimental
         | effects on patients' lives and treatment approaches. This
         | controversy underscores the significance of research outcomes.
         | 
          | For more insight into the academic dishonesty surrounding
          | ME/CFS and its consequences for patients, I recommend this
          | article:
         | https://www.theguardian.com/commentisfree/2024/mar/12/chroni...
         | 
         | This issue highlights why it's crucial for both the scientific
         | community and the public to demand rigor and transparency in
         | research, especially when the well-being of vulnerable
         | populations is at stake.
        
       | fasteo wrote:
       | Being an "expert" takes a lot of practice, and it shows.
        
       | fsflover wrote:
       | Dupe: https://news.ycombinator.com/item?id=39712021
        
       | DrNosferatu wrote:
       | Fake it until you make it!
        
       | noneeeed wrote:
       | There was an interesting two-part series on the Freakonomics
       | podcast about academic fraud, and it covered this case.
       | 
       | It's all incredibly depressing. I really feel for all the junior
       | researchers who end up wasting their time, and often derailing
       | their careers because they followed a path based on other
       | people's academic fraud.
       | 
       | https://freakonomics.com/podcast-tag/academic-fraud/
        
         | 11101010001100 wrote:
         | No one should be putting all their eggs in one scientific
         | basket even if the basis ISN'T fraudulent. I see this mistake
         | over and over.
        
           | noneeeed wrote:
           | I understand where you are coming from, but if you are
           | starting out on a PhD and decide to do research that is
            | branching off from work done by someone like Gino, you could
            | spend a long time chasing ghosts, and it will be hard for you
            | to turn around and say "I think this is actually BS".
           | 
           | Even once you are over that hump you will have quite a few
           | years going from one short-term grant to another, with your
           | ability to get funding being dependent on your previous work.
            | If that has been stymied because you were basing it off other
            | people's dodgy research, it could take a long time to build
            | up the kind of record where you get to diversify and gain any
            | kind of academic security or freedom.
        
             | mcv wrote:
             | A friend of mine proved (for his PhD I think) that the
             | thing his entire department was working on was based on
             | bullshit. They weren't happy.
             | 
             | That said, putting all your eggs in one basket is often
             | necessary to get anywhere with that basket of research.
        
               | andoando wrote:
               | What did they disprove? Or at least, what field of study?
        
               | mcv wrote:
               | I forgot the details, but he studied both mathematics and
               | AI, so something in either of those fields.
        
             | 11101010001100 wrote:
             | Having done a PhD and said 'I think this is BS', it's not
             | easy but possible.
        
           | 0cf8612b2e1e wrote:
           | Academic research (especially a PhD) is about going deep into
           | one particular topic. You are fully dependent upon the giants
           | on which you stand.
        
             | smallmancontrov wrote:
             | Yep. "Just stand on the shoulders of TWO giants" is muuuuch
             | easier said than done, lol.
        
         | dbsmith83 wrote:
         | It seems like dishonesty in academia is not punished severely
         | enough. Someone who is caught may face professional
         | embarrassment, and lose some privileges, but I think there
         | should be tougher consequences. Hell, Ranga Dias still seems to
         | be employed. These people are wasting government (our) money
         | which could have been used by honest scientists to further our
         | scientific knowledge of the world. It also further wastes the
         | time of other scientists who may try to build off of the
         | fraudulent research. To me, they are essentially committing
         | fraud in a field where truth-seeking is paramount. I think a
          | trial and prison time need to be part of the consequences for
         | flagrant fraud.
        
           | smallmancontrov wrote:
           | A hardline approach is not going to win support from the
           | people on the front lines in the best position to spot and
           | police this. Even the most honest researcher has a published
           | result or three that they suspect is incorrect and they will
           | not trust you to accurately litigate against only the worst
           | players because they are smart cookies and fully understand
           | that honesty is a severe liability when there is an authority
           | out for blood.
           | 
           | No, the better approach here is to just shift the incentives.
           | Start funding replication. Once we see labs and career paths
           | that specialize in replication / knowledge consolidation, the
           | whole system will shift for the better. Bibliometrics and
           | hiring committees will start to pay attention and then
           | exploratory researchers will start to pay attention and the
           | system will start to work a little bit better.
        
             | lupusreal wrote:
             | Why can we charge other professions with fraud but not
             | researchers? I'm sure people in other professions might
             | worry about being wrongfully prosecuted too, but that
             | doesn't stop us. Even doctors get criminal charges when
             | they deliberately do something wrong. You can be pretty
             | sure that virtually every doctor has made honest mistakes,
             | but that doesn't prevent dishonest doctors from being
             | brought to justice.
        
               | CamperBob2 wrote:
               | There are objective standards in other professions. The
               | thing about research is, by definition, whatever you're
               | doing is not part of an established profession yet.
               | 
               | Obviously that doesn't apply in settings like drug
               | development, where standards do exist, as defined by the
               | best currently available treatments. But if someone is
               | working on something like psychological studies where
               | replicability is the exception rather than the rule, or
               | on exotic tech where only one experimental facility might
               | exist, or on substances or effects that exist only under
               | weird conditions, it's not always that easy (or that
               | safe) to accuse them of lying. Even when you're pretty
               | sure they are.
        
               | kelipso wrote:
               | Agreed. There are plenty of stories in physics where
               | researchers reported some effect which was found later to
               | be due to an error setting up the experiment. Are they
               | supposed to face fraud charges because of this?
               | Researchers will just quit and go work in the industry or
               | something.
               | 
               | And there are plenty of fields where you have different
               | interpretations of the same data (see the entire field of
               | economics, also plenty in physics and other fields).
               | Should the people who espoused ether theory be sued for
               | fraud? It'll be a huge mess because doing research is by
               | definition doing something unprecedented.
        
               | persnickety wrote:
               | I have a hunch that not everything you do as a
               | researcher is novel. A part of it is, like you mention,
               | new by definition.
               | 
               | But there's also the old and established parts, like
               | statistics, parts of the experimental setup, methodology,
               | reporting data accurately (or at all). This is plenty
               | enough to have objective standards for.
               | 
               | A lot of fraud is not in making up experimental results,
               | but instead misreporting the data and drawing unsupported
               | conclusions.
        
               | dbsmith83 wrote:
               | Exactly! There are some obvious things too like: don't
               | copy and paste tiny bits of an electrophoresis gel and
               | put it into another image to make it look like it was the
               | same result. Sylvain Lesne comes to mind here. Last I
               | checked, this jackass still has a job, too
        
               | burnished wrote:
               | If your contention is fraud then good news - we already
               | have the laws and authorities required to pursue it.
               | Nothing new required.
        
               | Nevermark wrote:
               | (IANAL) Fraud can often be pursued as a civil claim,
               | in which case there needs to be an aggrieved party with
               | credible evidence of damages, no inhibitions about making
               | a splash, and the time and money to sue.
               | 
               | I suspect educational institutions don't want to be seen
               | as organizations that sue their own researchers. Same
               | with paper publishers. And same for grant issuers.
               | 
               | Downstream researchers? Do they have the time, the money,
               | or the ability to show direct damages whose recompense
               | would be worth all the effort?
               | 
               | Students affected perhaps could, but only if the effect
               | was very direct and they had the resources. And desire to
               | be known as someone who sues their professor.
               | 
               | The damage is usually so diffuse. There is no one party
               | with all the reasons to sue.
               | 
               | I have no idea what the process would be for criminal
               | prosecution, but the diffuse impact may be an inhibitory
               | factor there too.
        
             | ksenzee wrote:
             | > Start funding replication.
             | 
             | I'm not convinced this fixes anything. Even when a result
             | is genuine, it's very easy to fail to replicate it. We all
             | know this from software development: it's a lot easier to
             | say "couldn't reproduce" about a genuine bug than it is to
             | track down the precise context in which the bug actually
             | manifests. So if you get rewarded for failing to replicate
             | a result, all the fraudsters will do that. If you get
             | funded only when you actually replicate the result, the
             | fraudsters will pretend to replicate the result.
        
               | mannykannot wrote:
               | > Even when a result is genuine, it's very easy to fail
               | to replicate it.
               | 
               | To the extent that is true, then it is itself evidence
               | supporting the proposition that not-yet-replicated
               | results should be regarded as provisional.
        
               | Sebb767 wrote:
               | > So if you get rewarded for failing to replicate a
               | result, all the fraudsters will do that. If you get
               | funded only when you actually replicate the result, the
               | fraudsters will pretend to replicate the result.
               | 
               | So reward either? This seems pretty obvious.
        
               | andrewaylett wrote:
               | It's much easier to fail to replicate a result than to
               | actually try to replicate it.
        
               | anankaie wrote:
               | Yes, but the original author is incentivized to attempt
               | to show where the replicators got it wrong, so there will
               | still be a push to correct the bad data from the false
               | replication failure. With that said, I am not convinced
               | it will be a panacea.
               | 
               | A bigger issue is that... 3 sigma is really a weak
               | signal in a high-cardinality state space, which is
               | basically everything above the physics of small numbers
               | of elementary particles, and it is the particle
               | physicists who go for higher thresholds. This is what is
               | feeding the replication crisis: weak signals from very
               | poorly sampled studies (see the sketch after this
               | comment).
               | 
               | The meta issue is that we as a society need to start
               | accepting that some things will take longer and require
               | more investment to achieve results. Do fewer studies per
               | unit grant, but use the three-sigma ones only to justify
               | a real experiment/study, not as an acceptance criterion
               | for "discovery!".
               | 
               | I'm not even going to touch politicization, since I
               | _really_ have no idea what to do about it without the
               | cure being worse than the disease.
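                 A minimal sketch (not from the thread) of the point above:
                 with a bare 3-sigma cutoff and thousands of candidate
                 effects screened, a handful of pure-noise comparisons will
                 clear the threshold by chance alone. The test count and
                 sample size below are illustrative assumptions, not
                 figures from any study.

                     import numpy as np

                     rng = np.random.default_rng(0)
                     n_tests = 10_000  # hypothetical number of candidate effects
                     n = 200           # hypothetical sample size per group

                     # Pure noise: no real effect in any comparison.
                     a = rng.normal(size=(n_tests, n))
                     b = rng.normal(size=(n_tests, n))

                     # Two-sample z-style statistic for each comparison.
                     diff = a.mean(axis=1) - b.mean(axis=1)
                     se = np.sqrt(a.var(axis=1, ddof=1) / n +
                                  b.var(axis=1, ddof=1) / n)
                     z = diff / se

                     hits = int(np.sum(np.abs(z) > 3.0))
                     print(f"{hits} of {n_tests} noise-only tests clear 3 sigma")
                     # Expect roughly 0.0027 * n_tests, i.e. a few dozen
                     # spurious "discoveries", before any selective reporting.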
        
               | r00fus wrote:
               | > We all know this from software development: it's a lot
               | easier to say "couldn't reproduce" about a genuine bug
               | than it is to track down the precise context in which the
               | bug actually manifests.
               | 
               | If a study claims to prove something, it should be
               | repeatedly provable or it's a) fraud or b) not proven
               | solidly enough.
               | 
               | I think replication is a key component of functional
               | research.
        
               | singleshot_ wrote:
               | Figuring out why a result is reproducible by some, but
               | not others, is probably where the scientific discovery
               | lies, if there is one to be had.
        
         | ProjectArcturis wrote:
         | Or derailing their careers because only the very successful
         | grad students and postdocs get grants and tenure, and evidently
         | a good chunk of those elite spots get taken by people who
         | publish dishonest research.
        
       | barfbagginus wrote:
       | As an expert in dishonesty - literally a corporate espionage
       | contractor - I would be very sad if anyone were to actually
       | expect me to not commit misconduct...
       | 
       | Things in the academic world are a little different I see.
       | Strange.
        
       | _ZeD_ wrote:
       | It reminds me of Fullmetal Alchemist's Shou Tucker.
        
       | pavlov wrote:
       | You become a dishonesty expert by lying for 10,000 hours. There
       | are no workarounds, got to do the work.
        
       | wtcactus wrote:
       | It's high time we accept that social sciences have nothing
       | scientific about them, and are just a mix of political ideology
       | and astrology.
       | 
       | I have said this before and I'll repeat it: the media constantly
       | portraying the social sciences and their findings as actual
       | science is a big culprit in all the disbelief the masses are
       | showing towards real science (the physical and natural sciences).
        
         | CJefferson wrote:
         | I'm not sure the social sciences are any more scientifically
         | corrupt than the "harder" sciences.
        
           | orzig wrote:
           | Biology has probably had the most replication scandals of the
           | hard sciences, but if you look back 10 years, there have been
           | several tangible breakthroughs. CRISPR, mRNA vaccines, Cystic
           | Fibrosis treatments, malaria vaccine, and that's just off the
           | top of my head.
           | 
           | So let's not throw the baby out with the bathwater on science.
           | 
           | I would be interested to see what other people's comparable
           | lists would be for the social sciences - I don't know enough
           | to say that "absence of evidence is evidence of absence".
        
             | realityfactchex wrote:
             | Here's the thing: even some of the listed items "off the
             | top of your head" as counterexamples are entirely
             | fraudulent.
             | 
             | I don't think anybody's trying to throw the baby out with
             | the bathwater.
             | 
             | This is entirely serious. But most people have no idea how
             | broad and deep the reach of the hard science integrity
             | problem goes (at least in biology).
             | 
             | Not only that, but it's HARD for people to know, and there
             | are, with 100% certainty, efforts to keep it that way.
        
           | willcipriano wrote:
           | You honestly believe that the social sciences have the same
           | rigor as physics?
        
             | sitkack wrote:
             | Corruption and rigor are different things.
        
               | willcipriano wrote:
               | Can't really be corrupt if you have enough rigor.
        
               | dahart wrote:
               | Oh, lots of people have been rigorous about their
               | corruption. Maybe they only got caught because they're
               | not rigorous enough. :P https://en.wikipedia.org/wiki/Lis
               | t_of_scientific_misconduct_...
        
               | willcipriano wrote:
               | Physics: "Elements of what became physics were drawn
               | primarily from the fields of astronomy, optics, and
               | mechanics, which were methodologically united through the
               | study of geometry. These mathematical disciplines began
               | in antiquity with the Babylonians and with Hellenistic
               | writers such as Archimedes and Ptolemy." Established: 200
               | BC[0]
               | 
               | Social Sciences: "The history of the social sciences
               | began in the Age of Enlightenment after 1650"
               | Established: 1650 AD[1]
               | 
               | Physics: 2224-year history
               | 
               | Social Sciences: 374-year history
               | 
               | Physics: 3 incidents of misconduct on the Wikipedia
               | article (and that includes engineering as well).
               | 
               | Social Sciences: 11 incidents of misconduct on the
               | Wikipedia article.
               | 
               | Physics: One incident per 741 years.
               | 
               | Social Sciences: One incident per 34 years.
               | 
               | [0]https://en.m.wikipedia.org/wiki/History_of_physics
               | 
               | [1]https://en.m.wikipedia.org/wiki/Social_science
        
             | xkcd-sucks wrote:
             | Yes, _for whatever that's worth_.
             | 
             | One example: I did a pure-chemistry undergrad degree and a
             | psychology-adjacent graduate degree. The "hard science"
             | degree involved basically zero applied statistics of any
             | kind; any statistics concepts that came up were emergent
             | from lower-level stuff even in e.g. stat mech courses, and
             | I wasn't even aware that "Design of Experiments" was a
             | thing. The "soft science" curriculum, by contrast, was
             | heavily focused on statistical rigor, DoE, layers upon
             | layers of internal controls, and so forth; basically
             | because there was no practical way to see an unambiguous
             | effect. It certainly seemed "rigorous".
             | 
             | However, the soft-science stuff just has less predictive
             | power despite the rigor. In a broad sense it relates to
             | human perception, the capabilities of technology, model
             | systems, semiotics/epistemology (maybe the wrong word),
             | prediction/confirmation, and especially faith.
             | 
             | For example, at a macro scale on Earth, it is very easy to
             | accurately predict how things work at human scale.
             | Micro-scale physical objects require a bit more
             | bootstrapping.
             | You can't see most molecules or atoms, so you have to come
             | up with a way of inferring measurements through other
             | processes which themselves have to be trustworthy. Etc.
             | 
             | At some point of course, one has to select some model as an
             | axiom or matter of faith to make any progress. You can't
             | model a dropping brick if you can't trust your timer or
             | measuring tape, etc. So you stand on giants' shoulders and
             | make a predictive model as an extension of the axiomatic
             | one.
             | 
             | This starts getting really squishy when you're dealing with
             | _entire concepts that have no agreed-upon definition,
             | whether quantitative or qualitative_, and try to make them
             | into something onto which statistical rigor can be applied!
             | 
             | So with soft-science literature it is always extremely
             | important to mentally substitute the details of a model's
             | implementation for the _shorthand expression used to
             | describe it, which may overlap with a commonly used word_.
             | 
             | For example, "We found that this drug candidate
             | significantly reduced depression in mice" -> "We found that
             | this drug candidate significantly _increased the amount of
             | time mice swim around before giving up when you chuck them
             | into a tank of water_", etc.
             | 
             | because the rigorous conclusion might not actually be
             | practically meaningful if the axioms are practically
             | unpredictive.
             | 
             | And even worse, going back to the "no agreed definition"
             | thing, the definitions of psychological concepts are only
             | defined in terms of these very vague experiments! It's like
             | bootstrapping physics if you're a disembodied nothing in a
             | simulation.
             | 
             | idk there's much more to say on this but I'm
             | procrastinating at work so
        
           | Workaccount2 wrote:
           | The harder sciences get to lay claim to things like AI and
           | cell phones. Antidepressants and EVs.
           | 
           | Social sciences get to lay claim to CBT?
        
             | dahart wrote:
             | So? That list has nothing to do with corruption in the
             | sciences.
             | 
             | BTW, social sciences get credit for the modern economy and
             | for the known history of humankind, for modern farming, for
             | today's government and policy, for mental health and mental
             | therapies, for modern advertising, for understanding of
             | languages, and for modern corporate management, just to
             | name a few.
        
               | 20after4 wrote:
               | Your list includes a lot of things I wouldn't be proud to
               | claim as accomplishments.
        
         | rhelz wrote:
         | The question of whether social sciences are sciences is subtle
         | to answer. If we take Francis Bacon's method as more or less
         | what a science is, then, in order to be a science, we need to
         | be able to do experiments. Experiments are objective (anybody
         | can watch it happening), communicable (you can tell anybody how
         | to do the experiment), and repeatable (anybody can do it).
         | 
         | An example would be something like chemistry. An experiment
         | could be burning hydrogen with oxygen to see if it yielded
         | water. There is a very strong sense in which anybody could do
         | _this same exact experiment_. It is repeatable by anybody, it
         | happens in an objective space we can all observe.
         | 
         | Now, take something like history (perhaps one of the social
         | sciences). It's proverbial that history repeats itself, but can
         | you do _experiments_ in history? Not really, because you can't
         | do something like rerun WWII, only this time with the Nazis not
         | chasing out Einstein.
         | 
         | Note, this doesn't mean that History is bogus. Yes, it takes as
         | its object of study something which you can't perform
         | experiments on, but nevertheless, it does have other methods
         | which are apropos to its subject matter, and it is a discipline
         | --doing history is scholarship. It yields knowledge--just not
         | scientific knowledge in the sense of the Baconian scientific
         | method.
         | 
         | Economics is the same--we can't just re-run the Reagan era,
         | only this time without trickle-down economics. If somebody is
         | studying, say, the causes of the great depression, their
         | subject matter isn't something which you can gain knowledge
         | about using experiments. Doesn't mean you can't get economic
         | knowledge in other ways.
         | 
         | So....what about Psychology? Specifically, somebody who
         | studies...deception. Is deception something you gain knowledge
         | about by doing experiments? If you take a group of people and
         | try to deceive them, and study what happens, is that an
         | experiment?
         | 
         | It's not, and for a very interesting reason: the "is-ought"
         | distinction. Experiments can tell you the "is"--what is
         | happening. But they can't tell you the "oughts"--what _should
         | be_ happening. In English, we use two different verb moods to
         | mark the distinction, e.g.  "Fred is not lying" vs "Fred should
         | not be lying."
         | 
         | Now, you can perform experiments and observations to determine
         | whether or not Fred is telling the truth. But lying? Lying
         | has an extra ingredient, the _intention_ to deceive. Intentions
         | are subjective and not observable in the same way that a
         | chemistry experiment is. Very problematic from the standpoint
         | of Baconian science.
         | 
         | But what is completely outside the scope of Baconian science is
         | determining whether a sentence like "Fred should not be lying"
         | is true. If Fred is hiding Jews in the basement, should he lie
         | to the stormtroopers at his front door? If he had an affair,
         | but it's over and he wants to stay happily married to his wife,
         | _should_ he lie to her if she asks him whether he had an
         | affair?
         | 
         | There are no experiments--in the sense of yielding publicly
         | observable and repeatable results--which will tell you this.
         | Note, this doesn't mean that Psychology is bogus, any more than
         | History is bogus. It's just that Psychology has as its subject
         | matter, some phenomena which are not amenable to experiment. It
         | _used_ to develop methods--like psychoanalysis--which _were_
         | more apropos to studying its subject matter.
         | 
         | But these days, it seems to be embarrassed about all that, and
         | wants to be a "respectable" science. Instead of Freud
         | theorizing about how childhood trauma affects adult moodiness,
         | they do things like study whether or not Prozac helps
         | depression. Studying whether or not a drug can boost serotonin
         | levels in the brain can be investigated scientifically, but in
         | an important sense, it is changing the subject. We're not
         | talking about _humans_ anymore, we are talking about what
         | chemicals do to neuron serotonin re-uptake.
         | 
         | So, what if you are an assistant professor, desperately trying
         | to get some scientific results so you can get tenure? If you
         | try to study something like "deception" scientifically---doing
         | experiments on sophomores--you'll run up against the brute fact
         | that your subject matter can't be studied scientifically. You
         | can call what you are doing "experimenting", but any results
         | you get will not be repeatable.
         | 
         | This is a problem if you are trying to get tenure. Psychology
         | _could_ have just said  "hey, we are more like economics and
         | history than we are like chemistry, so lets develop some
         | methods appropriate to investigating our subject matter."
         | 
         | But no. What they did instead was just make up results, on what
         | looks like a phenomenal scale. And it's been going on so long
         | that we have _generations_ of people who got their Ph.D. from
         | somebody who cheated their way through. And _even if_ they
         | wanted not to cheat---they are competing with their peers who
         | will happily cheat.
         | 
         | The result is sad stories like the OP. It's a vicious circle, a
         | race to the bottom. Honesty is punished, deception is rewarded.
         | The whole field has seriously lost its way. They need to get
         | back to realizing that they are studying a subject matter which
         | needs different methods than experimentation to study.
        
       | beryilma wrote:
       | Once they start giving TED talks and such, they are not
       | researchers anymore. They are highly-educated influencers who
       | want to stay in the limelight. And this brings its own set of bad
       | incentives, with staying "prolific" and, eventually, dishonesty
       | being some of them. In the same field, see what is happening with
       | Jordan Peterson.
        
         | homeless_engi wrote:
         | In my experience most/all academics crave a certain type of
         | attention. It is an occupation where you get promoted in part
         | by how famous you are (how many citations your publications
         | have, which conferences you attended, etc.).
        
       | javajosh wrote:
       | This is great news for Harvard! It implies that it's taking
       | academic honesty and rigor seriously, and deserves praise for
       | that. Just as police departments deserve praise for
       | firing/suspending/charging bad officers, just as state bars
       | deserve praise for disbarring bad lawyers, we need to see more of
       | this not less. It concerns me that such news is often used as
       | evidence of systemic problems when this sort of news signals the
       | exact opposite.
       | 
       | We should be more concerned when there are NO instances of
       | malpractice reported by institutions. This doesn't mean there
       | isn't malpractice, but that the institution has lost the will to
       | enforce its own rules. Reacting to this news negatively provides
       | a perverse incentive to institutions, and should be quelled.
        
       | curtis3389 wrote:
       | This reminds me of Orson Welles' excellent film: F for Fake.
       | 
       | It explores art forgery, art experts, and the author of a book
       | about art forgery committing a massive forgery.
        
       | ivraatiems wrote:
       | So the author of a bestselling book on why breaking the rules can
       | be advantageous... broke the rules?
       | 
       | https://www.amazon.com/Rebel-Talent-Pays-Break-Rules/dp/0062...
        
         | pknomad wrote:
         | Irony aside... doesn't that validate the premise?
        
           | rossdavidh wrote:
           | Well not at the moment, her career is pretty well off track
           | now.
        
         | saghm wrote:
         | They say to write what you know, so they did!
        
         | antegamisou wrote:
         | I think this is a great reason why one should trash all these
         | self-improvement books by grifters with "expensive" credentials
         | (CEO, Ivy League person, etc.) and replace them with works of
         | 19th-century American and Russian fiction.
        
       | gnicholas wrote:
       | Discussed concurrently:
       | https://news.ycombinator.com/item?id=39712021
        
       | gotoeleven wrote:
       | Harvard will come out on top of all this because they can easily
       | pivot to being a clown college.
        
       | neurotech1 wrote:
       | Archive copy: https://archive.ph/6BrOv
        
       | visarga wrote:
       | Plot twist - he was doing his job, it was just a dishonesty
       | experiment. /s
        
       | seizethecheese wrote:
       | I believe high profile academic fraud to be worse for the world
       | than financial fraud, even.
       | 
       | Financial fraud misappropriates one sum of money. Academic fraud
       | can misappropriate massive amounts of resources when decision
       | makers rely on bullshit ideas.
        
       | stmichel wrote:
       | Ethicists and especially bio-ethicists are the least ethical
       | people on Earth. "Happiness experts" are the most miserable
       | people on Earth. etc. etc. etc...
        
       | semiquaver wrote:
       | https://archive.ph/6BrOv
        
       | seydor wrote:
       | The choice of title has an infuriating amount of wasted potential
        
       | belter wrote:
       | The Wikipedia article also has some nuggets:
       | 
       | "In or before 2020, a graduate student named Zoe Ziani developed
       | concerns about the validity of results from a highly publicized
       | paper by Gino about networking. According to Ziani, she was
       | strongly warned by her academic advisers not to criticize Gino,
       | and two members of her dissertation committee refused to approve
       | her thesis unless she deleted criticism of Gino's paper from it"
        
       ___________________________________________________________________
       (page generated 2024-03-15 23:01 UTC)