[HN Gopher] The Crime of Curiosity
       ___________________________________________________________________
        
       The Crime of Curiosity
        
       Author : prostoalex
       Score  : 143 points
       Date   : 2021-09-15 16:06 UTC (1 day ago)
        
 (HTM) web link (www.piratewires.com)
 (TXT) w3m dump (www.piratewires.com)
        
       | donmcronald wrote:
       | > I'm hesitant to say it worked because vaccines are complicated
       | and we'd need further testing to confirm our results.
       | 
       | That's the line that makes me interested enough to read more and
       | think about what the author is saying. Misinformation doesn't
       | typically come with a disclaimer saying it might be wrong, so
        | that tells me the author isn't being intentionally deceptive or
        | spreading claims about things they don't actually understand.
       | 
       | Reading more, it actually piques my curiosity a bit. It reminds
       | me of board level electronics repair before I saw Louis
       | Rossmann's YouTube channel. I always thought it was impossible,
       | but then you watch one video of him doing it and realize all the
       | rhetoric about it being difficult, impossible, and dangerous is
        | just that: rhetoric.
       | 
       | I don't know what the answer is in terms of dealing with
       | misinformation, but one thing I believe fairly strongly is that
       | letting large, private institutions decide what's fact and what's
       | fiction is problematic. The "facts" will _always_ align with
       | their business interests.
       | 
       | I'd even say it's risky to let big companies like manufacturers
       | get away with their lies. I _KNOW_ electronic and appliance
       | manufacturers are lying about the complexity of their products
       | because I can watch a YouTube video and fix a lot of what they
       | claim is unfixable or too dangerous to fix.
       | 
       | Considering that, WTF am I supposed to believe when I read an
       | article like this and someone is saying that biotech isn't as
       | complicated as the profit driven institutions claim? There's got
       | to be _some_ truth to it if they behave the same as the
       | electronics industry, right? Or am I sliding into anti-vaxxer
       | territory?
       | 
       | No matter what, I think it would be a great step forward if we
       | stopped accepting deceit and disinformation as being a normal
       | thing for large corporations and institutions to engage in. We
       | have so many provable examples in the electronic repair industry
       | alone that I think it's endemic. Maybe if the big companies
       | didn't lie so much, there would be less distrust and less of an
       | opportunity for people to spread misinformation and propaganda.
        
         | jrochkind1 wrote:
         | I agree that's an important bar, but intentional misinformation
         | purveyors will learn to use that language too if necessary.
         | It's necessary but not sufficient.
         | 
          | Note: I don't think this guy should have been removed from
          | YouTube, and I have no reason to doubt the veracity of his
          | story; it seems plausible and interesting.
        
       | bserge wrote:
       | Everybody back to your own domains and hosting (not fucking
       | Amazon!). If your server provider bans you, _then_ we have a real
       | problem.
        
       | johnhenry wrote:
       | I wonder if the video had prominent "Don't try this at home"
       | warnings?
        
         | Rd6n6 wrote:
         | The whole point of his video and work is to democratize
         | science. He sells kits to let you do genetics science at home.
          | I.e., a disclaimer would not make sense given his objective.
        
       | phkahler wrote:
       | >> monopoly speech platforms are also sloppily banning any form
       | of science that doesn't come from 'ordained' sources.
       | 
       | It's not just that. The real message is that individuals are not
       | able to make decisions or do things for themselves. This is why
       | any medical treatment that doesn't involve surgery or
       | prescriptions is thrown in with homeopathy and general quackery.
       | It's not just companies either, individuals will look at you
       | funny if you do anything not sanctioned by some perceived source
       | of truth.
        
         | [deleted]
        
         | swayvil wrote:
         | My body my choice.
         | 
         | No, you're not qualified to make that choice.
        
         | heyitsguay wrote:
         | Nobody has to use YouTube. There are many other video
         | hosting/watching platforms. None of them are especially popular
         | because they get filled with a lot of junk, because they
         | attract the people who can't host their content on YouTube.
         | 
         | In a way, this dynamic is an effective, natural check on
         | moderation. If moderation primarily removes stuff that most
         | people think is junk, then alternatives competing on reduced
         | moderation will be junk. If moderation removes valuable stuff
         | that lots of people want, then other platforms that host it
         | instead will receive more traffic and legitimacy.
         | 
         | It's kind of like the Reddit vs. Voat thing. Reddit isn't
         | great, but Voat ended up a cesspool, because Reddit's biggest
         | exoduses were driven by hate groups.
        
           | AlbertCory wrote:
           | Nonsense. A monopoly does not conform to the rules a "normal"
           | company does. Rumble does not compete with YouTube, because a
           | generation of uploaders have put essentially every piece of
           | TV, music, movies, conference proceedings, and every other
           | form of media on YouTube.
           | 
           | If you're wondering if something unusual is online, you would
           | always look on YouTube first. And then stay there.
        
             | breckenedge wrote:
             | I disagree. You're confusing network effects with monopoly.
             | YouTube is not a monopoly.
        
       | [deleted]
        
       | amadeuspagel wrote:
       | > In general, I think I understand where the criticism is coming
       | from. Every time I post on social media about being deplatformed,
       | banned, or silenced, someone chimes in with their own story about
       | being banned because "big government is trying to suppress the
       | fact that echinacea cures Covid" or whatever. Spoiler, echinacea
       | doesn't cure Covid, but this is the kind of crazy nonsense my
       | work is compared to. Are you a credential person? Great, I'm a
       | scientist with a PhD from one of the top universities in the
       | world. I've worked at NASA. I've published a number of papers.
       | 
       | I couldn't care less for this shit. "Yes I understand the
       | necessity of censorship. But not me. I've worked at NASA. I've
       | published 'a number of papers.'" Lol.
       | 
       | > The problem isn't my thoroughly detailed research, which I
       | would love to have critiqued in good faith. The problem is big
       | tech companies making billions of dollars aren't capable of doing
       | basic analysis of scientific work, or hiring a team that can,
       | which is why the best they're capable of on the pandemic front,
       | for example, is attaching a link to the CDC website on every post
       | that mentions "Covid" or "vaccine."
       | 
       | Indeed. They are not capable of doing basic analysis of every
       | video, or hiring a team that can. Billions of dollars aren't
       | enough to hire a team to analyze every video on youtube.
        
         | AlbertCory wrote:
         | Unfortunately, the "echinacea cures Covid" claim probably isn't
         | true, but the CDC and YouTube people seem to have decided that
         | ANY claims about therapeutics must be suppressed. There isn't
         | any objective reason for this; it's purely political side-
         | taking.
         | 
         | As for your research: are you saying there are _no_ forums
          | where your work can be "critiqued in good faith"? Or just that
         | YouTube isn't that forum? Because you're right -- it isn't.
        
         | hnuser123456 wrote:
         | >I couldn't care less for this shit. "Yes I understand the
         | necessity of censorship. But not me. I've worked at NASA. I've
         | published 'a number of papers.'" Lol.
         | 
          | There are situations where someone with truly deep merit
          | meets someone who believes they have it and tries to relate
          | to the former, but is so disconnected that their
          | categorically different experience immediately disqualifies
          | them. The general public would be unable to recognize this,
          | yet it would take significant effort for the former to
          | explain the situation in terms the public could understand,
          | and at some risk to the emotions and ego of the latter.
          | Surely you can relate to this situation to some extent?
        
       | perihelions wrote:
       | Also covered by _Reason_ (whose reporting on Zayner was likewise
       | censored),
       | 
        | https://reason.com/2021/06/16/why-did-youtube-remove-this-re...
        | (_"Why Did YouTube Remove This Reason Video?"_)
       | 
       | edit: Here's a complete transcript of the censored material, post
       | by Eugene Volokh
       | 
       | https://reason.com/volokh/2021/06/17/youtube-removes-march-2...
        
       | Imnimo wrote:
       | I do sympathize with him, and I think his specific content should
       | be available on YouTube, but I can also see why YouTube would
       | want to ban him. If I'm YouTube, and I don't really understand
       | anything about biohacking, do I really want to take the risk of
       | promoting material about making your own vaccines? Sure, this
       | guy's content is informative and valuable, but it wouldn't be
       | long before you have someone less competent promoting a self-made
       | vaccine that kills someone. If YouTube doesn't feel it has the
       | expertise to distinguish the two, I can see why it would be
       | prudent to just ban them all.
        
       | daenz wrote:
       | >Trust is antithetical to science.
       | 
       | Never heard it put this way before, but I think I agree with it.
        
       | Barrin92 wrote:
        | The author conflates two things: the right to biohack at home
        | and YouTube's alleged obligation to let him broadcast his
        | material.
       | 
        | The first is fine, as long as we're not talking about
        | experiments that may be dangerous to the public (a possibility
        | he seems to categorically reject for some reason, even though
        | home-made bioterrorism is a real threat). The second doesn't
        | follow at all.
       | 
        | YouTube has no realistic way to tell whether someone producing
        | home-made science on YouTube is a PhD ex-NASA biohacker who
        | follows best practices or just a complete quack who tries to sell
        | dangerous fake remedies to vulnerable people. In practice the
        | latter probably far outnumber the former, given the huge,
        | generic audience on YouTube.
       | 
        | 77% of the YouTube audience in the US is 17-25 years old; it's
        | not some niche forum for engineers, and the notion that its
        | users can read scientific papers and weed out legitimate
        | science created at home from misinformation is absurd. The
        | correct platform for
       | something like this is a separate forum or community where
       | enthusiasts meet with some barrier to entry, not the mass media.
       | 
       | I immediately question the motive of someone who promotes
       | individual science or 'hacking' and seeks the largest mass media
        | audience. I think the motive is much more straightforward: the
        | author has a company that sells genetic engineering kits to
        | people, and being banned from YouTube impacts him financially.
        
         | reginold wrote:
          | Agreed - the lesson here is that platforms like YouTube have
          | a lot more control than most assume. An easy-to-use,
          | censorship-resistant video host seems needed.
        
           | vippy wrote:
           | That's an interesting, and entirely unrelated, takeaway given
           | what Barrin92 wrote.
        
       | ssijak wrote:
        | I don't understand these social platforms.
       | 
        | Covid stuff and fake news aside, I've reported maybe 20
        | obvious scam videos, or posts with extreme hatred and threats
        | of violence against a group of people, and similar stuff where
        | 99.99999% of us would agree it should be removed without
        | question, and only once did they remove it. All the other
        | times I get a message that they reviewed my complaint, that it
        | does not go against their standards/ToS/blah blah, and that I
        | can block the video/post/channel for myself.
       | 
        | And taking into account how much random stuff they ban
        | proactively, it just does not compute in the logic board in my
        | brain.
        
         | gameswithgo wrote:
         | Most things are badly done. Most social media companies are not
         | doing moderation well. Fitting the general pattern of most
         | things are garbage. But some things are not! Find them and
         | treasure them.
        
         | nix0n wrote:
         | A few days ago, I learned that "Facebook has exempted high-
         | profile users from some or all of its rules"[0]. Maybe there's
         | something like that happening on youtube also.
         | 
         | [0]https://news.ycombinator.com/item?id=28512121
        
         | ravi-delia wrote:
         | It's the same way ADHD is both under- and over- diagnosed. If
         | you're bad enough at your job, you can both miss the majority
         | of content that is harmful, and wind up taking down
         | significantly more false positives than truly harmful material.
        
       | peter303 wrote:
        | Isaacson's new book The Code Breaker, about CRISPR, has a
        | couple of positive chapters on Zaynor. (Isaacson interviewed
        | Steve Jobs in his final months for the definitive biography.)
        
         | lostlogin wrote:
          | I think that's a typo and you mean Josiah Zayner, the author
          | of the article here?
         | 
         | Thanks for the connection, that book looks really interesting.
        
       | carlisle_ wrote:
       | My question is how can platforms ensure only "good" science is
       | done at the scale they operate at? So many pseudosciences hide
        | behind jargon; how much scientific education is required to
        | distinguish between the two at the velocity people want? A
        | company
       | might take two weeks to review and approve a flagged post, but
       | isn't the damage already done for how these platforms operate?
        
       | reginold wrote:
       | What's the best video platform that supports open standards and
       | censorship resistance?
       | 
       | Is self-hosting the only option?
        
       | jimbob45 wrote:
       | At the core of YouTube's censorship is the idea that people
       | aren't intelligent or wise enough to know what kinds of videos
       | they should watch. They don't know what's good for them.
       | 
       | Whatever you may think of that idea, surely the idea that
       | YouTube, which reduces down to a bunch of people sitting behind
       | desks, should get to decide what others watch is questionable.
        | That is, they think a small group of people is wiser and more
        | intelligent than everyone else and should get to curate and
        | censor content for the majority at will.
        
         | shadowgovt wrote:
         | > surely the idea that YouTube, which reduces down to a bunch
         | of people sitting behind desks, should get to decide what
         | others watch is questionable
         | 
         | That's been the value-add of YouTube since the day they
         | implemented preference-learning and recommendations, and it
         | continues to be one of their distinguishing factors in the
         | marketplace of alternatives.
        
         | Syonyk wrote:
          | > _At the core of YouTube's censorship is the idea that people
         | aren't intelligent or wise enough to know what kinds of videos
         | they should watch. They don't know what's good for them._
         | 
         | I'm not sure I'd agree. I think that's putting far too much
         | faith in them.
         | 
         | YouTube's goal, roughly stated, is "More Hours Watched." This,
         | in whatever form is appropriate, is pretty much the end goal of
         | _any_ social media sharing handwave etc platform. More eyeball-
         | hours to sell ads to.
         | 
         | For a while, YouTube's algorithms (which I think are almost
         | certainly too dumb to have any idea what a video is about)
         | pushed conspiracy theory content for the simple reason that if
         | you can get someone watching conspiracy theory content on
         | YouTube, they're very likely to continue watching conspiracy
         | theory content on YouTube. Of the people who watch video 1234,
         | 30% of them then watch dramatically more YouTube afterwards, so
         | the more people you can show video 1234 to, the more hours will
         | be watched - think "paperclip maximizer," not "Muahaha, we will
         | drive people down conspiracy rabbit holes!"
         | 
         | Of course, do this long enough, and eventually you have a
         | problem - bad press coverage about how you're driving people
         | down conspiracy rabbit holes. Whoops. But the problem here,
         | from YouTube's perspective, isn't that you're driving people
         | down conspiracy theory rabbit holes - it's the standard tech
         | industry problem that you're getting bad press for it. Bad
         | press is bad for hours viewed. So you fix the problem, and
         | issue the standard tech industry appy polly loggy - "We are so,
         | so sorry you caught us doing this and we will do the work to
         | ensure that you don't catch us doing it in the future."
         | 
         | I don't think YouTube particularly cares about Covid
         | misinformation or [whatever]. They care about the bad press
         | from being seen hosting it, which might impact people's opinion
         | of them and reduce hours watched.
         | 
         | To claim that the algorithms understand anything beyond the
          | title or such is to claim a capability for which there is no
          | evidence.
        
       | Syonyk wrote:
       | > _A day after YouTube took down my video, I received an email.
       | They banned me for life. This is not only to say I could no
       | longer upload content. I could no longer even login._
       | 
        | We have, sadly, _long_ since reached the point where cloud
        | providers, free content hosting, YouTube, etc. have to be
        | considered hostile to anything outside the tech industry's
        | consensus of what's allowed (as interpreted by their
        | algorithms, which they like to pretend are fancy, but which
        | seem only barely smarter than keyword matching, except when
        | they're rather dumber). Of course, due to Scale(TM), you can't
        | actually have any _humans_ in the loop. Unless, of course,
        | you're well enough connected to get a bit of attention on a
        | tech news site, at which point a human will (usually) step in,
        | mutter something about a mistake, fix the problem (the actual
        | problem being the bad PR created), and go on their way.
       | 
       | If you're posting funny reaction videos to nonsense content,
       | sure, use YouTube. For anything serious, this is no longer a good
       | idea (well, if you're outside the tech industry consensus for
       | whatever that is today).
       | 
       | But if you are even the slightest bit outside the mainstream, you
       | probably shouldn't be using YouTube, or even the various cloud
       | based hosting services. Your own server, in a local datacenter,
        | perhaps fronted by CloudFront, is closer to the right answer
        | these days.
       | 
       | In the rush to free services, we've handed far, _far_ too much
       | power to a very small set of companies, who are now happy to use
       | that power to turn the internet into only what they want to hear.
        
         | shadowgovt wrote:
         | Plenty of serious content survives on YouTube.
         | 
         | I'd say posting something at the intersection of fringe and
         | dangerous can get your content canned, and unfortunately for
         | the article author, "How to cook a COVID-19 vaccine in your
         | kitchen" is in that category.
        
         | [deleted]
        
       | ftrobro wrote:
       | "Democratizing genetic engineering won't suddenly unleash
       | bioterrorism upon the world."
       | 
       | How sure is he of that, and why? As a comparison, nuclear power
       | has been of great utility for many countries, but I sure would
       | not want to see it "democratized".
        
         | bserge wrote:
         | A motivated party can deploy and detonate several bombs in
         | places with a lot of people any day, Covid rules
         | notwithstanding.
         | 
         | It's laughably/terrifyingly easy, the hardest part is building
         | or buying the bombs.
         | 
         | But you don't see it done regularly at all.
         | 
         | Granted, a single nuclear or bio attack would affect way more
         | people.
        
           | [deleted]
        
         | frazbin wrote:
         | Yeah I was with him up to that point.
         | 
         | "None of the kits we sell contain anything dangerous, nor is
         | the average person experimenting with biology inherently
         | dangerous. If you are trying to engineer something hazardous --
         | like say a bat virus -- you might have a problem, but the
         | genome search space is large enough that accidentally creating
         | a harmful organism is astronomically improbable. Access to most
         | dangerous materials are also heavily restricted..."
         | 
          | This sounds really sketchy to me. I can't tell if he's lying
          | or if he really can't see any further than his own research
          | program. Sometimes a field of research actually is
          | existentially dangerous for life on Earth; you can't handwave
          | it away.
         | 
         | The possibility of biohacking as a hobby is thrilling. Trouble
         | is, the dangers are so extreme they're difficult to even think
         | about. Like, thought experiment: can you design a virus that
         | kills all eukaryotic cells? Can you think of other
         | possibilities equally terrifying? Of course stuff like that is
         | far off, but neither you nor this guy know whether 'far off'
         | means 10, 100, or 1000 years.
         | 
          | We already have examples of substances that are strictly
          | harmful to almost all life... Dispersal of them (via the
          | democratization of access to chemistry) is already placing
          | stress on the biosphere... which, again, is such a large and
          | horrible thing it's hard to imagine or reason about.
         | 
         | Another thing: it is still plausible that covid was a lab
         | escape. I think the biohacker community should be incredibly
         | humbled by that, and really think about how they'll be seen in
         | decades to come.. we've seen cycles of technologists becoming
         | disgraced because of the impact of their products; those cycles
         | seem poised to accelerate. Maybe this is a good time to get on
         | the right side of history?
        
           | sjtindell wrote:
           | Do you see any comparison to Machine Learning? I get the
            | impression he considers himself like a hobbyist running
            | some models on a home GPU. Without access to server farms,
           | there is an almost zero percent chance a home hobbyist
           | invents an AI capable of displacing large amounts of human
           | workers or setting off nukes on its own. AI is dangerous, but
           | can be explored safely by a small practitioner. Perhaps these
           | biomaterials are more sensitive though.
        
             | Rd6n6 wrote:
             | Ai isn't really at risk of escaping the lab and afflicting
             | a population with something. I mean, deep fakes went online
             | and that will afflict people, but that's not the same: the
             | closer parallel is "a technique for bio hacking escaped his
             | computer," which is different than "I accidentally/sort-of-
             | not-accidentally made mosquitos extinct in my region
        
             | TeMPOraL wrote:
              | Live biomaterials are _self-replicating_. That's one
             | difference between biotech and any other scary tech we're
             | used to dealing with, like explosives, dangerous chemicals
             | or nuclear material. Even ML scales only as fast as you can
             | buy compute and convince companies and governments to put
             | your algorithms to use. Self-replication is another game
             | entirely.
        
       | autoliteInline wrote:
       | I wonder if there isn't a market for a COVID vaccine that the
       | inevitable variants aren't resistant to.
       | 
        | If you kept it to a small group of customers, it would
        | maintain its value and be worth $$$$ to them potentially.
        
         | [deleted]
        
         | BrianOnHN wrote:
         | That's certainly the future.
         | 
          | Mass vaccination might have kept costs down for TPTB in the
          | past. But if that doesn't work because a sufficient share of
          | the population refuses to cooperate...
        
         | GuB-42 wrote:
          | AFAIK, variants are not "resistant" to vaccines, they are
          | just different, so a vaccine targeting the previous strain
          | will be less effective because it doesn't match perfectly.
         | 
         | There is no better vaccine than the one for the current
         | dominant variants. You can't really target future mutations
         | because you don't know what the future mutations will be. You
          | can imagine restricting the best, most current vaccine to
          | the rich to limit escape mutations, but besides the obvious
          | ethical concerns, the virus will mutate anyway. Better to
          | vaccinate everyone with the best, because the fewer people
          | infected, the fewer chances you give the virus to mutate.
        
           | gweinberg wrote:
           | Current vaccines aren't optimized for the currently dominant
           | Covid strains, they're optimized for the Covid strains that
           | were around a year and a half ago.
        
         | pfdietz wrote:
         | The problem, IMO, with the current vaccination system is not
         | that lots of people aren't getting their shots with the
         | available vaccines (although that's sad); the problem is how
         | many hoops have to be jumped through to get the vaccines on the
         | market at all. The GOP could have legitimately criticized Big
         | Government on this if they hadn't swallowed all the stupid.
         | Freedom doesn't mean just the freedom to decline a vaccine; it
         | means the freedom to act without the government's permission.
        
       | ve55 wrote:
        | Systemically removing correct but controversial/fringe
        | content is extremely detrimental to scientific progress in
        | general. Sure, we remove thousands of obviously false and
        | likely harmful posts, and there are cases to be made for why
        | that may be good, but sometimes hidden within those thousands
        | of posts was something that could have had a tremendously
        | positive effect, yet could not be separated from the noise.
       | 
       | "The Crime of Curiosity" is a great way to put it, because we're
       | already banned from questioning a lot of areas of science on most
       | major tech platforms. This system seems to be helpful in some
       | areas until it makes some mistakes, in which case the effects are
       | catastrophic.
       | 
       | Remember that within the first year of covid, "masks work" was
       | considered misinformation along with "a vaccine is likely to
       | happen within a year or two", along with "this may be related to
       | a lab leak", along with... Reality is always changing and
       | uncertain and our policies should reflect that we do not have it
       | all figured out, nor will we (collectively) ever. (Edit: as one
       | commenter expressed skepticism of the mask claim, read over a
       | link like
       | https://old.reddit.com/r/AmItheAsshole/comments/fe2oqg/aita_...
       | about how normal people felt about masks in the first few months
       | of covid. It's pretty shocking and I feel like I'm living in an
       | alternate reality just re-reading it and the top responses).
       | 
       | Now that our infrastructure is being expanded and built out with
       | censoring of 'incorrect' information as a top priority, I fear
       | for how bad the mistakes we make in the future may be.
        
         | reginold wrote:
         | Yeah the whole "censoring" stuff ratcheted up really fast.
         | Kinda crazy.
        
           | Animats wrote:
           | It's scary.
           | 
           | Imagine the US five years from now, with Team Trump in charge
           | of the censoring.
        
             | reginold wrote:
              | Banning Trump from Twitter...the pendulum will indeed
              | swing the other way. How long do we have?
             | 
             | The clock is ticking on open source, decentralized
             | solutions. Nothing else is relevant.
        
           | hnisfullofshit wrote:
            | HackerNews is _notorious_ for censoring anything that
            | falls outside its groupthink or threatens a portfolio
            | company. It's actually one of the more aggressive censors
            | on the internet.
           | 
           | Case in point from this very thread:
           | 
           | https://news.ycombinator.com/item?id=28558254
           | 
           | There is no free speech here.
        
             | reginold wrote:
             | Curious to hear more about this. Is censorship "removal" or
             | simply not getting votes? I've seen this concept generally
             | addressed here before as "we don't censor, people just vote
             | this way"
        
               | hnisfullofshit wrote:
               | It was flagged by a mod.
        
               | detaro wrote:
               | flagging is something normal users can do.
        
               | [deleted]
        
         | donmcronald wrote:
         | > The potential costs of systemically removing correct but
         | controversial/fringe content are extremely detrimental to
         | scientific progress in general.
         | 
         | I fall pretty strongly on the side of combatting
         | misinformation, but I disagree with outright removal of
         | content. I think it should be flagged as misinformation, but
         | left available. Put another way, I'm a fan of labelling, not
         | censoring.
         | 
         | I want transparency. "Our misinformation bot rated this as a
         | 90% chance of being misinformation because XYZ." I bet the only
         | reason we can't have that is because the ML bots suck so much
         | the tech industry is scared to implement any system that might
         | be open to scrutiny or analysis. It's a bit ironic.
        
           | reginold wrote:
           | Appreciate this perspective here. I'm against censorship, and
           | am surprised how clear your "label not remove" concept is. I
           | like it.
        
           | nradov wrote:
           | There is no such thing as a bot which can identify
           | misinformation with 90% accuracy.
           | 
           | Google has some very advanced natural language processing
           | technology. Try this Google search: "What year did Neil
           | Armstrong land on Mars?"
        
              | dragonsky67 wrote:
              | Wow, unless you happen to know that the Sea of Tranquility
              | is on the moon (seas on the moon, no way), you are
              | well on your way to believing that Armstrong is relaxing on
        
           | nine_k wrote:
           | Marking is helpful. Removal is not.
           | 
           | Very much like spam: filters are good but not 100% good, so
           | there must be a way to look at what the filter has rejected,
           | and allow the reader to judge.
        
           | perihelions wrote:
           | YouTube's ML bots flagged an episode of Michael Osterholm's
           | podcast. If you're unfamiliar, that's a former COVID advisor
           | to President Biden.
           | 
           | https://news.ycombinator.com/item?id=28003635
        
          | tidalmcmuffin wrote:
          | Funny, I replied to you saying that you're wrong about the mask
          | statement and HN censored me! Real free speech platform you
          | guys have here. The irony, considering what I was replying to!!
        
         | [deleted]
        
         | tidalmcmuffin wrote:
         | > Remember that within the first year of covid, "masks work"
         | was considered misinformation
         | 
         | No, because that never happened. Maybe don't spread
         | misinformation yourself if you want to make a point.
        
         | AussieWog93 wrote:
         | >"The Crime of Curiosity" is a great way to put it, because
         | we're already banned from questioning a lot of areas of science
         | on most major tech platforms.
         | 
         | I think the keyword that, in my mind, justifies the censorship
         | here is "on most major tech platforms". Nobody is banning the
         | discussion of these ideas in academic journals or HN or other
         | places where curious people can go to discuss things - it's
         | just making sure unverified, potentially dangerous theories
         | aren't spreading like wildfire amongst the general population
         | who _aren't_ curious and will assume whatever they're reading
         | is absolute truth.
        
           | hnisfullofshit wrote:
           | You can't actually discuss anything on HN that goes against
           | the moderator's petty ideologies (or yc's profit motive)
           | without being censored. Turn on show dead and take a scroll
           | through nearly any topic.
           | 
            | The irony is that many of the people who claim to be free-
            | speech advocates see nothing wrong with this platform
            | blacklisting people who don't share in the groupthink.
        
             | liamwire wrote:
             | I feel this is a dishonest take. I've seen the moderators
             | leave up some pretty egregious violations of the site rules
             | because it fostered good discussion, which seems to be at
             | the heart of HN.
             | 
             | However, good discussion ultimately requires respect for
             | one another, and it seems maybe that thinking is not
             | bilateral in your case. I'd encourage you to introspect on
             | why you may be finding resistance wherever you look.
        
           | nradov wrote:
           | That's such an arrogant, condescending statement. You're
           | assuming that the general population is too stupid to be
           | trusted with unfiltered information. But theories aren't
           | dangerous. Actions are dangerous.
        
              | amrocha wrote:
              | I think social media has proven pretty conclusively that
              | giving the general population unfiltered information and
              | enabling everyone to promote whatever they want is in fact
              | a terrible idea.
        
         | [deleted]
        
       ___________________________________________________________________
       (page generated 2021-09-16 23:00 UTC)