[HN Gopher] YouTube algorithm recommends videos that violate its...
       ___________________________________________________________________
        
       YouTube algorithm recommends videos that violate its own policies:
       study
        
       Author : Liriel
       Score  : 152 points
       Date   : 2021-07-07 08:59 UTC (14 hours ago)
        
 (HTM) web link (foundation.mozilla.org)
 (TXT) w3m dump (foundation.mozilla.org)
        
       | encryptluks2 wrote:
       | I agree it sucks, but so does Netflix, Amazon, Twitter, Facebook,
        | etc. It's almost like social media is just a marketing tool and
       | your promotions aren't even recommendations at all, but paid
       | placements.
        
         | herbst wrote:
          | You can't compare Netflix, though. With the small amount of
          | content they have, it's enough to browse the new section maybe
          | once a week to not miss anything. Facebook and Twitter are not
          | there to recommend content but to follow content you already
          | care about.
         | 
         | We have 'illegal' streaming sites with better discovery than
         | YouTube has.
        
         | rvz wrote:
          | They all do it: influencers, big advertisers, and cable news
          | networks on social media sites like Facebook, Twitter,
          | YouTube, Instagram, Snap, TikTok, etc. all game the
          | recommendation algorithms of each of these platforms.
          | 
          | As long as the advertisers are happy paying them, they won't
          | change. Not only do these algorithms govern what is seen and
          | unseen on those platforms, they are also a privacy issue,
          | since they operate by tracking the user's usage information.
          | 
          | The small users and 'content creators' lose either way. They
          | work for the recommendation algorithm, which can change at
          | any time, and they end up earning less. Or they get removed
          | from the platform / programme because of their low number of
          | views and that's that, whilst the big players game the system.
        
       | boyka wrote:
       | I stream music from YouTube because of its superior
       | recommendation algorithm (in comparison to, e.g., Spotify). The
        | recommendations for music I didn't know before it was suggested
        | to me are spot-on.
       | 
       | This must be entirely based on my views and non-specific data
       | from my Google account, as I do not engage (like, dislike,
       | comment).
        
       | Tenoke wrote:
       | I literally never get such content and I use YouTube constantly.
       | I'm not sure how much I'd blame youtube for showing similar
       | content to whatever people are already watching.
        
         | wodenokoto wrote:
         | As far as I can tell, simply watching news is enough to get
         | entangled with the loonies.
        
           | krapp wrote:
           | On the contrary, the loonies are the ones who refuse to watch
           | the news because they believe it's controlled by evil Marxist
           | puppetmasters, while believing anything Youtube, Twitter or
           | Facebook tell them.
           | 
           | As bad as the news gets, it isn't going to tell you Bill
           | Gates is putting mind-control chips into COVID vaccines, or
           | that the Presidential election was stolen by a conspiracy of
           | Chinese communists and Democratic party pedophile cultists,
           | or that Jewish space lasers are responsible for forest fires.
           | It may _report_ on those beliefs and their effects elsewhere,
            | but unlike the internet, won't assert them as fact, or let
           | you form a comfortable bubble of alternate reality around
           | yourself where those beliefs are only ever reinforced, but
           | never questioned.
        
             | wodenokoto wrote:
             | They are also the ones brigading the comment section on
             | news segments.
        
         | swiley wrote:
         | I prefer to use YouTube in private mode so the ML doesn't
          | pigeonhole me. It only takes a few videos for it to "decide
         | what you like" and so you can pretty easily convince it to feed
         | you weird stuff like this if you want.
        
         | codemusings wrote:
          | I think the trick is also to watch potentially extreme videos
         | linked from some place else in private mode so that your feed
         | doesn't get messed up.
         | 
         | But yeah I'd second that. My feed is mostly video games, music
         | & crafting which fits my interests pretty well. I don't get any
         | political or violent content suggested at all.
         | 
         | I also wonder how much this gets influenced by tracking data
         | gathered outside of YouTube.
        
           | Tenoke wrote:
           | >I also wonder how much this gets influenced by tracking data
           | gathered outside of YouTube.
           | 
            | I think not at all, given what I see in an incognito/secondary
            | Google account, and that I don't see any of my interests that
            | I haven't watched much about on YouTube recommended - e.g. no
            | programming stuff, and I'm sure there's a ton of tracking data
            | outside of YouTube suggesting I'm interested in that.
        
       | [deleted]
        
       | The_rationalist wrote:
        | Anecdotal: music recommendations have regressed for me.
        
       | rickstanley wrote:
        | My case: my brother committed suicide early this year. I got
        | recommended Reckful's last minutes of streaming [1]; I didn't
        | know what was going on, and only after reading the description
        | and comments did I understand the whole thing. And guess what, I
        | got invested and searched some more, and I found out that his
        | brother had also committed suicide. Imagine that going through
        | my thoughts.
       | 
       | I didn't take the "recommendation" very well, couldn't sleep
       | properly after.
       | 
       | [1] https://www.youtube.com/watch?v=UiER4tMnbJA
        
         | pjc50 wrote:
         | That's horrendous, sorry for your loss.
        
       | iNane9000 wrote:
       | "Art Garfunkel music video, and was then recommended a highly-
       | sensationalised political video titled "Trump Debate Moderator
       | EXPOSED as having Deep Democrat Ties, Media Bias Reaches BREAKING
       | Point."
       | 
       | It was much harder to figure out the connection on that one. But
       | seriously, most of this is just demographics. It's not like
       | there's a conspiracy to control information or anything crazy
       | like that. I remember the panic about the creepy kids cartoons.
       | It's just algorithms doing their thing. Get used to it.
        
         | enumjorge wrote:
         | > It's just algorithms doing their thing. Get used to it.
         | 
         | Given how much influence some of these platforms have on
         | people, how about the companies running them fix their stuff?
         | We could also introduce legislation to make them fix it. Not
         | sure why we have to get used to it.
        
         | SllX wrote:
          | Honestly I wonder how much of it is just people's own
         | personalized recommendations and how much of it is general
         | recommendations.
         | 
         | I have a hard rule not to use YouTube for any sort of news or
         | politics, and the algorithm accommodates that very well. I'm
         | sure if I created a new account, it would start off
         | recommending me whatever BS is popular across YouTube at the
         | time of creation, but if I used it exactly like I use my
         | current account, I think the algorithm would accommodate that.
         | I actually _like_ YouTube, which seems to be an unpopular
         | opinion in some places.
         | 
          | The only thing I can't seem to get rid of, permanently, is the
         | stupid banners of videos that YouTube adds in about COVID, or
         | at the time of the election, that crap, and some other
         | nonsense. I mean if I keep clicking the X to get that out of my
         | face, isn't that a strong signal I never want to see that crap
         | or any of those topics again?
        
       | LatteLazy wrote:
       | Why wouldn't it? Until YT has a perfect algo to detect violating
       | videos, this will obviously happen.
       | 
       | I keep seeing pieces like this and I feel like I'm taking crazy
       | pills. I'm no expert at tech or social media. But this is a
       | really obvious fact isn't it? So obvious that doing a "study" to
       | "discover" it seems actively dishonest.
       | 
       | What am I missing here?
        
         | Black101 wrote:
          | They will never make the algo perfect... that way they always
         | have an excuse to remove whatever they don't like.
        
       | tpmx wrote:
       | _Mozilla conducted this research using RegretsReporter, an open-
       | source browser extension that converted thousands of YouTube
       | users into YouTube watchdogs._
       | 
       |  _Research volunteers encountered a range of regrettable videos,
       | reporting everything from COVID fear-mongering to political
       | misinformation_
       | 
       | Like videos discussing the lab leak theory?
        
         | kube-system wrote:
         | > Like videos discussing the lab leak theory?
         | 
         |  _Which_ lab leak theory?
         | 
         | There are widely varying claims on this topic, ranging from
         | narrow and nuanced scientific discussions to all-out
          | nonsensical conjecture. Painting all of these claims with a
         | broad brush and equating them is simply a straw man.
        
           | kiawe_fire wrote:
           | And yet, the "study" itself appears to imply that a video is
           | extremist just because it has a title asserting that there is
           | a left-leaning bias in some of the media.
           | 
            | Seems false equivalencies, lack of nuance, and straw man
           | arguments only matter some of the time.
        
             | kube-system wrote:
             | I wasn't defending the study. I doubt crowdsourced
             | measurements of "regret" are good at measuring the accuracy
             | of a video's content.
        
               | kiawe_fire wrote:
               | Fair enough, and your point is perfectly valid taken on
               | its own.
        
         | throwawayboise wrote:
         | I keep telling YouTube that I'm "Not interested" in COVID-19
         | videos, but every other day they shove it in my face anyway.
        
         | uniqueuid wrote:
         | The lab leak may have been in there (you can probably check in
         | the appendix to the report).
         | 
         | But to be honest, it's pretty irrelevant. Content analysis is
         | usually performed using the current state of knowledge. It is
         | to be expected that known facts and what is perceived as truth
          | change after the fact. This does not invalidate the analysis
         | itself.
         | 
         | Here's the explanation from the report, which I find pretty
         | reasonable:
         | 
         | > Over the course of three months, the working group developed
         | a conceptual framework for classifying some of the videos,
         | based on YouTube's Community Guidelines. The working group
         | decided to use YouTube's Community Guidelines to guide the
         | qualitative analysis because it provides a useful taxonomy of
         | problematic video content and also represents a commitment from
         | YouTube as to what sort of content should be included on their
         | platform.
        
         | cromwellian wrote:
          | Isn't there a problem with the sampling here? What if people
          | who are likely to install RegretsReporter are also people who
          | are more likely to view tangentially related problematic
          | content in the first place, and are then shown more of it?
          | Also, the dividing line on what's considered problematic is
          | difficult to draw, as we saw recently with Right Wing Watch.
         | 
         | And the proposed regulation seems even more problematic. If the
         | AI model were public, then it would be far easier for people to
         | create adversarial content to game it. This is a problem with
         | any models built on public data, including OpenAI's GPT stuff,
         | or GitHub's Copilot. Detailed knowledge of how it works allows
         | more efficient adversarial attacks, and the more these services
          | become important public infrastructure, the more valuable such
          | attacks will be.
         | 
          | Imagine a Copilot attack that inserts hard-to-discover
         | Heartbleed-esque bugs into code for example.
         | 
         | It seems way way too early to be discussing regulation of
         | algorithms of a field so young and changing so rapidly. 5 years
         | from now, the network architectures may be unrecognizably
         | different in both form and function.
         | 
         | It might be better to have some industry-wide group like the
          | W3C or IETF which sets these standards, and have tech companies
          | publish reports and audits for compliance, but done in a way to
          | prevent attacks.
        
           | uniqueuid wrote:
            | > Isn't there a problem with the sampling here? What if
            | people who are likely to install RegretsReporter are also
            | people who are more likely to view tangentially related
            | problematic content in the first place, and are then shown
            | more of it? Also, the dividing line on what's considered
            | problematic is difficult to draw, as we saw recently with
            | Right Wing Watch.
           | 
           | As I commented above, sampling is a problem, but the sample
           | was only used to gather candidate videos which were _then_
            | manually analyzed following Youtube's own content
           | guidelines.
           | 
           | So the takeaway is: According to the study's authors, Youtube
              | recommends videos despite them violating its own
           | guidelines.
        
             | cromwellian wrote:
             | Does Right Wing Watch violate the guidelines? It's
             | difficult enough to make AI that could enforce these
             | guidelines, but even with a staff of tens of thousands of
             | human evaluators I doubt you could avoid even 10% false
             | positives given how much people argue over classification
             | already.
             | 
             | This kind of filtering, some say censoring, is super
              | problematic both to pull off and to please all
              | stakeholders, which is why a one-size-fits-all regulation
             | is likely to fail and create a new class of problems.
        
       | secretsatan wrote:
        | I've found my YT sticks to mostly what I like, but if I
        | accidentally select a video about some old vintage weird gun
        | from some guy
       | that occasionally appears in my feed, I guess through some other
       | inference, then my channel is suddenly all guns, nothing but
       | guns, guns, guns.
        
         | troelsSteegin wrote:
         | How do you then un-gun your feed? Select a recommendation from
         | a different topic? Or, how long does it take for the gun recs
         | to go away if you don't pick any? It's interesting that there'd
         | be a burst of recommendations around a new topic.
        
           | secretsatan wrote:
            | I think last time I just avoided all the gun ones and they
            | eventually went away. Since then I've clicked on another
            | annoying style of video topic, and a faster way to get rid
            | of an annoying category is to actively go and dislike the
            | videos.
        
           | whywhywhywhy wrote:
            | Going into your history and removing the video that caused
            | it usually fixes it for me.
        
         | walshemj wrote:
          | I don't get that, and I subscribe to Forgotten Weapons and a
          | fair number of military-themed channels, Tank Museum etc.
        
       | temac wrote:
       | > Another person watched a video about software rights, and was
       | then recommended a video about gun rights.
       | 
       | Do all videos about gun rights violate Youtube policy? Maybe the
        | content was actually problematic, but presented like this, I'm
        | wondering what the problem was with this one.
        
       | kebman wrote:
       | Whatever happened to "let people be their own judge?" Or are we
       | really that elitist, and do we really have so little faith in
        | individuals' ability to think for themselves? Why is Mozilla, of
       | all entities, seemingly seeking to influence YouTube, and to what
       | end? Should we just abolish the idea of a free market of ideas?
       | Is it bad to be radical? Are you automatically _wrong_ if you
       | are? And are YouTube 's own policies flawless? Just some
       | questions that come to mind.
        
         | swiley wrote:
         | The feedback loop the recommendation algorithm generates is
          | really powerful. I personally have to treat YouTube cautiously,
          | the way I treat any addictive drug, otherwise I end up watching
         | it long after I would have liked to.
        
         | batch12 wrote:
         | Some others- Should Youtube develop and use algorithms which
          | lead people down the path of radicalism for the purpose of
         | engagement? Why are people so radically and violently opposed
         | to conflicting thoughts?
        
       | FranzFerdiNaN wrote:
       | Youtube is like Reddit or Twitter: a dumpster fire unless you
       | carefully curate what you do on it. I remove everything from my
        | YouTube history unless I specifically want it to be used for
       | recommendations.
       | 
        | I mostly watch videos about bass playing and use YouTube to
        | listen to music, and even with curating my history I still get
        | the occasional insane political rant recommended out of nowhere.
        
       | fareesh wrote:
       | It's not harmful to watch Alex Jones videos, lizard people
       | videos, deep state conspiracy videos. This premise should be
       | rejected outright. It is _more_ harmful to be watching videos
       | about religion, astrology, etc. In some cases, these are
       | mainstream belief systems that have caused countless death,
       | destruction, financial ruin for thousands of years. This sudden
       | interest in harm reduction is disingenuous and conveniently
       | selective.
       | 
       | If you are not treating these videos as harmful, do not waste my
       | time with blatant politics disguised as sanctimonious harm
       | reduction.
       | 
        | As a new YouTube user, why do 100% of the comedians and channels
        | recommended to me have identical politics? What are the odds?
       | 
        | All extreme ideology and ideologues - left, right, up, down - are
       | poisonous and divisive. They should be treated equally, not
       | selectively.
        
         | IfOnlyYouKnew wrote:
         | The Alex Jones crowd recently tried to stage a coup. Thousands
         | of Asians were assaulted in the last 16 months or so, often by
         | people indoctrinated by middle-aged men talking rather loudly
         | directly into a camera, in their car with ugly sunglasses.
        
           | wyoh wrote:
           | A riot is not a "coup", especially by people with no weapons.
            | The people attacking Asians are mostly not from the right
           | wing crowd.
        
             | slothtrop wrote:
             | Heavy projection there. Racism falls within the purview of
             | the right.
        
         | prezjordan wrote:
         | It is not more harmful to watch videos about Astrology than
         | videos about how the Sandy Hook Elementary School shooting was
         | a false flag operation.
        
           | fareesh wrote:
           | I have used financial ruin as one of the criteria for harm.
           | There are far more examples of people who have been reduced
           | to financial ruin because of astrology, so I am correct with
           | regards to the quantitative comparison.
        
             | prezjordan wrote:
             | You are not correct under any lens in any context
        
         | specialist wrote:
         | You argue that belief systems are both equal and not equal.
         | 
         | Pick a side.
        
           | fareesh wrote:
           | No that is not accurate, I am arguing that if the goal is to
           | tweak an algorithm to reduce "harmful" information then the
           | criteria for what is considered harmful should not be
           | narrowed down on the basis of one's personal politics.
           | 
           | Any sincere effort to minimize harmful information would
           | start with astrology, superstition, religion, homeopathy and
           | other such content - not some local USA "problems".
        
             | specialist wrote:
             | Then make that argument.
             | 
             | That our tools should empower us. That all existing
             | recommenders disempower, or worse. That people should be in
             | control of their own eyeballs.
             | 
             | In our attention economy, _the choice_ of what to show and
              | not show are equally important. It's trivial to crowd out
             | opposing opinions, no censorship needed.
             | 
             | Who controls that algorithm? Certainly not us.
             | 
             | Most of the pearl clutching over censorship, "cancel
             | culture", blah blah blah would be resolved today by
             | restoring the fairness doctrine (equal time).
             | 
             | (This obvious and correct remediation will not happen so
             | long as the outrage machine is fed by advertising.)
             | 
             | ((Chomsky, McLuhan, Postman, many others, all made all
             | these points, much more eloquently, repeatedly, much better
             | than I ever could.))
             | 
             | > _...minimize harmful information would start with
             | astrology..._
             | 
              | I encourage you to omit your own opinions of any particular
             | belief system. No one cares. It's a distraction (food
             | fight).
        
         | JasonFruit wrote:
         | It's not harmful to watch videos, if you don't do so
         | uncritically. If you watch uncritically, almost _any_ media is
         | harmful, down to Teletubbies. People need to take
          | responsibility for their minds: Boogaloo bois and Antifa didn't
          | radicalize you, you radicalized yourself by believing what
         | they said, either rationally or by default -- but own your
         | decision-making.
        
         | [deleted]
        
         | uniqueuid wrote:
         | > It's not harmful to watch Alex Jones videos, lizard people
         | videos, deep state conspiracy videos. This premise should be
         | rejected outright.
         | 
         | That's an open question - whether conspiracy videos have a
         | _negative effect_. Given the current state of research in
         | psychology, political science and communication research, it
         | seems plausible that they do have a negative effect, albeit a
         | small one.
         | 
         | The defining feature of conspiracy theories is mistrust in
         | state institutions and elites. Such attitudes - while legally
         | protected - can lead to people rejecting sensible and helpful
         | behavior, including vaccinations [1].
         | 
         | So given the current state of research, I do not think the
         | premise should be rejected outright.
         | 
         | [1]
         | https://journals.sagepub.com/doi/full/10.1177/19485506209346...
        
           | throwaway316943 wrote:
           | The only reason we have somewhat responsible state
           | institutions and elites is due to a long history of well
           | deserved mistrust of said powers.
        
           | fareesh wrote:
           | You are selectively highlighting part of the comment.
           | 
           | The full context is that it should be rejected outright IF
           | other forms of harmful information are allowed to remain.
           | 
           | Any selective enforcement of this nature is insincere because
           | it blatantly ignores more harmful examples.
           | 
           | If there is a comprehensive enforcement that treats all
           | harmful content equally, it can be considered sincere, else
           | it is simply politics. These are not minor examples, they are
           | far more dangerous than the examples being cited in this
           | study.
        
             | meowface wrote:
             | I agree with you, but I think I and most people wouldn't
             | view astrology videos anywhere near as harmful as any of
             | the other videos mentioned. Could you link one or two
             | videos that you consider to be presenting information as
             | harmful as the other things you listed?
             | 
             | For things like homeopathy and other medical pseudoscience,
             | I think a lot of those things do get banned, depending on
             | the claims they're presenting.
             | 
             | And for something like a video recommending you invest in
             | the video creator's company because [astrology gibberish],
             | I think in that case it's just a matter of magnitude. Of
             | course such a video causes harm, but YouTube can't be
             | expected to be able to prevent all levels of harm. Your
             | argument is sound when you're comparing against things that
             | many would agree are at least as harmful as the other
             | examples you give, rather than just meeting the criteria of
             | any level of harmfulness.
        
               | gunapologist99 wrote:
               | You are using the term "homeopathy", which includes
               | osteopathy, acupuncture, chiropractic treatments, and
               | other "holistic" areas of treatment.
               | 
               | Homeopathy is a very broad area, and some of it isn't
               | just pseudoscience, although we may not yet know the
               | mechanism of operation for some of the 'treatments' (and
               | some are almost certainly actively harmful).
               | 
               | It's probably a weak analogy, but homeopathy is to
               | medicine as supplements are to FDA-approved
               | pharmaceuticals. There's always going to be some crazies,
               | but there will also be some good things there, too, and
               | by banning it, we will miss out on some really good
               | innovation and ideas.
        
               | temac wrote:
                | I've never heard it used to refer to all pseudo-medicines.
               | Wikipedia seems to agree on a narrow definition about
               | just heavily diluted shits.
        
         | meowface wrote:
         | I think a better analogy would be Flat Earth, moon landing
         | hoax, aliens on the dark side of the moon vs. astrology and
         | mysticism and religion.
         | 
         | I doubt astrology causes much financial distress for the vast
         | majority of believers, even if it results in some people making
         | poor financial decisions. Same for any other kind of distress.
         | 
         | The problem with lizard people, Deep State, and much of Alex
         | Jones' stuff is that the claims are about specific people and
         | groups of people and their intentions and actions. They cause
         | many people to genuinely, truly believe that certain people are
         | the most abominable evil imaginable. That's inevitably going to
         | increase the risk of direct and indirect harm a lot more than
         | woo-woo fortune-scrying.
         | 
         | One could say religion has done the same, and that's not
         | necessarily wrong, but it's misleading. You'd have to compare
         | specific things like religious extremist videos rather than
         | merely religious/spiritual videos. Lizard people / Deep State
         | YouTube content is very often extreme, while astrology and
         | religion YouTube content is very rarely extreme. Pizzagate and
         | QAnon adherents are earnestly trying to bring about the arrest
         | and/or execution of people they believe to be child rapists and
         | murderers.
         | 
         | Even stuff like 9/11 conspiracy theory videos, if believed,
         | lead you to the unavoidable conclusion that some, much, most,
         | or all of the US government and/or Israel and/or Jews are
         | diabolical mass murderers. You're not going to come away with a
         | conclusion like that after watching a crystal healing or reiki
         | video.
         | 
          | >As a new YouTube user, why do 100% of the comedians and
          | channels recommended to me have identical politics? What are
          | the odds?
         | 
         | Probably because something in your Chrome or Google search
         | history, or the history or video activity of other people
         | sharing your IP address or device, led YouTube to believe that
         | may be something you like. Or, as this article points out,
         | maybe you clicked one or two things that were tangentially
         | related in some way and their data set indicated a lot of
         | people who watched those also watched those other things. So,
         | the odds are probably very high.
        
         | baobabKoodaa wrote:
         | > It's not harmful to watch Alex Jones videos, lizard people
         | videos, deep state conspiracy videos. This premise should be
         | rejected outright. It is more harmful to ...
         | 
         | Sure we can go down the whataboutism-road to find examples of
         | things that are _more_ harmful, but you do realize there are
         | people who believe in deep state and other insane conspiracy
         | theories?
        
           | fareesh wrote:
           | There are concepts like equal enforcement and equal justice
           | which by definition require what you are terming as
           | "whataboutism" to highlight.
           | 
           | If there is going to be selective enforcement of content,
           | then call it what it is - politics.
           | 
           | No sincere effort to minimize harm would ignore the harm
           | caused by religion, astrology, homeopathy and other ideas
           | like this.
        
             | baobabKoodaa wrote:
             | I fully agree!
        
       | intended wrote:
        | Big Tech went from the shiny new thing to the McDonald's of
       | content.
       | 
       | They serve crap in mass produced quantities because it sells.
       | 
       | Remember that content moderation stops scaling after a point. No
       | matter what good engineers think they will achieve in the Trust
       | and Safety / Risk divisions.
       | 
       | Reading this article it seems pretty clear that training
       | parameters weigh engagement at a rate that makes junk content a
       | guarantee.
       | 
       | The troubling part is that this IS creating another
       | educated/uneducated divide. The kind of conspiracy theories I see
       | gaining traction around me differ entirely based on education
       | (and therefore income).
       | 
       | And if you are reading this far - almost all the data needed to
        | have a society-level discussion is under an NDA.
       | 
       | It gets worse the moment you leave the English language. Want to
        | find a sentiment analyzer that works well on code-mixed
       | subcontinental/Portuguese/Dominican hate speech? Good luck with
       | that.
        
         | cvwright wrote:
         | > No matter what good engineers think they will achieve in the
         | Trust and Safety / Risk divisions.
         | 
         | You can have all the "Trust and Safety" effort you want, but
         | when the core architecture of the system is built to reward the
         | craziest stuff, you're still going to get a ton of crazy.
         | 
         | It's like building a paintball park inside a nuclear reactor,
         | in the middle of a WW2 minefield. No matter how many "Safety"
         | staff you have running around in orange vests and hard hats,
         | people are still going to get hurt.
        
       | Quequau wrote:
        | I enjoy watching metal machining content, and trying to find it
        | in a language I can at least semi-follow along with is way harder
        | than it has any right to be.
       | 
       | On the other hand I really, really dislike most extremely popular
       | YT shows that are focused on personalities and/or talking
        | heads... yet YT has no problem relentlessly suggesting those to me.
        
         | swiley wrote:
          | If YT ever suggests something you don't like, you have to
          | avoid watching it for a long time to make it go away.
         | 
          | Unless it's television news - something seems to artificially
         | inject that into the feed.
        
         | nitrogen wrote:
          |  _metal machining content, and trying to find it in a language
          | I can at least semi-follow along with_
         | 
         | What language are you looking for, and what channels do you
         | like so far? I'm curious how much overlap there is with what
         | I've been watching in this area, and can recommend some English
         | language channels.
        
           | Quequau wrote:
           | Most of what I see is in English but I also speak German and
           | can understand some Russian (at least enough to be a danger
           | to myself).
           | 
           | Some of what I subscribe to: Tom Lipton, Adam Booth, Chris
           | Maj, Cutting Edge Engineering, Dragonfly Engineering,
           | Blondihacks, CNC4XR7, Edge Precision (a favourite),
           | JamesPark_85 Machining TV, JohnSL, Keith Rucker, Mateusz
           | Doniec, Max Grant, The Swan Valley Machine Shop,
           | MrPragmaticLee, myfordboy, outsidescrewball,
           | PracticalMachinist, Rustinox, Special Instrument Designs,
           | shaper andi, Stef van Itterzon, Steve Summers, Topper
           | Machine, AlwaysSunnyintheShop, An Engineer's Findings,
           | AndysMachines, Assolo, Bart Harkema Metalworks, BAXEDM,
           | clemwyo, CompEdgeX, Ca Lem, David Wilks, FactoryDragon87,
           | Fireball Tool, H&W Machine Repair and Rebuilding,
           | Halligan142, Henrik Andren, igorkawa, IronHead Machine, James
            | T. Kilroy Jr., Jan Sverre Haugjord, Joerg Beigang,
           | Mechanical Advantage, Piotr Fox Wysocki, Solid Rock Machine
           | Shop Inc. (another favourite), Stefan Gotteswinter (really
           | good), THE IRONWORKER, Vladimir Alekseenko, TheMetalRaymond
           | (now I want a horizontal boring mill).
           | 
           | I guess that's just over half. A lot of these channels have
           | either stopped uploading videos or upload very irregularly,
           | so YouTube's algorithm hides their content from me, even
           | though I am subscribed to the channel.
        
             | jcims wrote:
             | I wonder how many of us there are out there. It seems like
             | I bump into folks that are into the same thing all the
             | time, but where are the numbers? Tom Lipton (for example)
             | is one of the GOATs of course, but he has 140k subs? Peter
             | at Edge Precision, consistently delivering excellent
             | tutorials and great camera work on a huge machine...60k
             | subs. Stef van Itterzon, building one of the most
             | ridiculous DIY CNCs I've ever seen, less than 10k subs.
             | It's odd.
        
               | throwawayboise wrote:
               | I watch a fair bit of this sort of thing on YouTube, but
               | I don't use the subscribe or like functions on anything.
        
           | MaxikCZ wrote:
           | "This Old Tony" is english spoken with just the right humour
           | for me. Anyone got any more channels of similar style, not
           | neccesairly just about metal machining?
        
             | nitrogen wrote:
             | Breaking Taps has some interesting stuff including
             | machining but without the humor, Rainfall Projects is
             | metalworking rather than machining, Wintergatan is building
             | a crazy machine, Ron Covell shows some crazy sheet metal
             | bending techniques in a very dry style, Machine Thinking
             | goes into some of the history of machining, and Clickspring
             | makes clocks and an Antikythera mechanism out of brass.
             | 
             | Not all of those upload regularly, and as someone who also
             | uploads occasionally (though not machining related) I don't
             | blame them, because good video is _hard_.
        
             | arethuza wrote:
             | Abom79 is one I watch regularly.
        
             | throwanem wrote:
             | AvE's worth a look if you like This Old Tony. The drunk
             | uncle to Tony's sober one, if you like.
        
       | herbst wrote:
        | I get recommendations of people that literally make me angry.
        | People I totally can't stand, and because they make kinda
        | similar content to people I watch, they get recommended all the
        | fucking time. I tried disliking (which is not fair to begin
        | with, because I would not watch them if my TV did not simply
        | start them) but it changed nothing.
        | 
        | So I am stuck with getting the same few shitty YouTubers pulled
        | into my autoplay every single day.
        | 
        | I tried making a new account from scratch. But nope,
        | recommendations seem to be based on the channels you follow,
        | with no personal preferences pulled in.
        
         | ud_0 wrote:
         | _> I tried disliking_
         | 
         | Don't do that! The algorithm counts that as an engagement
         | action.
         | 
         | What you need to do is click on the three dots symbol next to
         | the video title and select "don't recommend this channel". This
         | absolutely fixes it. You may have to do this a lot. There is
         | also a general "don't recommend this" option that, anecdotally,
         | doesn't work as well.
         | 
         | Also, disable autoplay. It doesn't seem like YT can distinguish
         | between clicking on a video and it just playing on autoqueue.
         | 
          | The YT algorithm is not fundamentally different from the other
          | social media algorithms out there: it tries to serve you
          | controversial content that "engages" you. You are most likely
          | to be "engaged" by content you directly disagree with, or by
          | second-source channels that report on content you disagree
          | with. As a fallback, the algorithm will also try to serve you
          | content based on your cohorts, which can in and of itself be
          | pretty radical and disagreeable.
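          | 
          | A minimal sketch of that idea in Python (the weights and
          | signal names are made up for illustration; YouTube's actual
          | model is not public): every interaction, including a dislike,
          | feeds the same engagement score.
          | 
          |     from dataclasses import dataclass
          | 
          |     @dataclass
          |     class Signals:
          |         watch_seconds: float
          |         liked: bool
          |         disliked: bool
          |         commented: bool
          | 
          |     def engagement_score(s: Signals) -> float:
          |         score = s.watch_seconds / 60.0       # watch time dominates
          |         score += 2.0 if s.liked else 0.0
          |         score += 1.5 if s.disliked else 0.0  # a dislike still counts
          |         score += 3.0 if s.commented else 0.0
          |         return score
          | 
          |     # An angrily disliked, fully watched video can outrank
          |     # a quietly enjoyed one:
          |     print(engagement_score(Signals(600, False, True, True)))   # 14.5
          |     print(engagement_score(Signals(300, True, False, False)))  # 7.0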
         | 
         | Get a YT Premium account if you can, sending a very small
         | message that you vote against ads and all the algorithmic
         | baggage that comes with them.
        
           | yissp wrote:
           | > Get a YT Premium account if you can, sending a very small
           | message that you vote against ads and all the algorithmic
           | baggage that comes with them.
           | 
           | YT premium users are still presented with the same
           | recommendations as normal users, are they not? That's kinda
           | my problem with the service. Sure you don't actually see the
           | ads, but you're still using an advertising-based platform
           | with all the user-hostile design that model encourages.
        
             | ud_0 wrote:
             | Oh absolutely, I'm recommending this because of the second
             | part of my sentence above. I believe the ad model is bad
             | for both consumers and companies, in ways that are not
             | immediately obvious. Ad models tend to generate perverse
             | incentives in my opinion. I also think it's important to
             | signal a willingness to be a reasonable consumer, and to me
             | it's also about ethics: I watch a lot of Youtube, I can
             | afford to pay for YT Premium, and I run an adblocker.
        
           | judge2020 wrote:
           | Yes, a view is a view and a dislike is engagement. If you
           | really don't want to see people or content, use the triple
           | dot menu and click 'not recommended' or 'don't recommend this
           | channel' (which actually straight filters it out).
        
           | namibj wrote:
           | The "don't recommend this" notably has two reasons on the
           | "tell us more" dialog: "I've already seen this video", and "I
           | don't like this video".
           | 
           | I often use the former, after watching stuff via mpv
           | (chromium has tearing, but more notably, high enough CPU
           | usage to annoy me with a half-broken fan that's quite loud).
        
         | weare138 wrote:
          | I have the same problem. I watch a lot of science and history
         | channels but damn near every time I watch videos on World War
         | II I start getting suggestions for crazy far-right channels.
         | And I don't mean like mainstream 'conservative' channels like
         | FoxNews, I mean crazy stuff like Nick Fuentes. Even if I tell
         | it to not recommend a certain channel it will just pop back
         | into the recommendations after a few weeks. It would be
         | different if the rest of the recommendations were good but
         | they're just crap too.
        
         | Felk wrote:
          | I am personally abusing the block feature for this. I haven't
          | tested whether it actually works, but I began blocking channels
         | that I am not interested in and haven't noticed anything being
         | recommended yet from any channel I remember blocking.
        
           | herbst wrote:
           | That sounds like it could work. Pretty sure I never saw that
           | function on my Android TV tho.
           | 
           | Kudos, gonna try!
        
             | sbarre wrote:
             | I spend time blocking/curating videos in a desktop browser
             | so that my Apple TV YouTube app sucks less from a
             | recommendations perspective..
             | 
             | It's lame that it has to be done that way, but it's the
             | best way I've been able to do it.
        
           | kevincox wrote:
           | Is this abusing the function? Block is for things that you
           | never want to see. It sounds fairly appropriate.
        
         | mdoms wrote:
         | Not sure what client you're using, but on the web and mobile
         | you can click/tap the kebab menu and select "Don't recommend
         | this channel". Youtube has a nasty habit of recommending rage-
         | inducing channels because they generate engagement. If "Don't
         | recommend this channel" was a button on my keyboard it would be
         | worn down to a nub by now.
        
           | herbst wrote:
           | Google just really hates their Android TV users. I realized I
           | simply can block them on my phone which actually works!
        
           | wincy wrote:
           | I did this to Cocomelon as my daughters watched it a few
           | times and the algorithm thought I wanted to watch nursery
           | rhymes all day long.
        
         | nickthegreek wrote:
         | turn off autoplay to protect your recommendations.
        
         | 2OEH8eoCRo0 wrote:
         | https://github.com/TeamNewPipe/NewPipe/
         | 
         | Turn off recommendations, comments, autoplay etc. I only see
         | exactly who I've subscribed to and nothing more.
        
         | sureglymop wrote:
        | A few years ago I completely scrapped using a YouTube account
        | and instead switched to the Android app NewPipe (I was
         | primarily watching on my Android phone).
         | 
         | This has completely changed and improved the experience for me.
         | NewPipe lets you create multiple feeds of groups of channels.
        | So for example I can have feeds for podcasts, music and tech news.
         | 
         | This not only makes it easier to actually get to the content
         | you want to consume at any given time without distracting
         | yourself but also helps you get better recommendations in these
         | grouped feeds (the recommendations still work).
         | 
         | NewPipe is free and open source and can be downloaded on
         | F-Droid.
        
           | herbst wrote:
           | Thanks for that. Going to look into it. (In the hope it works
           | on TV and also blocks ads)
        
             | pjc50 wrote:
             | Seconding newpipe, it's great if you just want to watch
              | specific videos or channels. Should work on Android TV,
             | unfortunately not available for LG WebOS TVs.
        
       | wilde wrote:
        | I turned autoplay off by default on YouTube. Recently I noticed
        | that sometimes autoplay was happening anyway. It seems that
        | YouTube has introduced videos called "Mixes" that override my
        | choice. Whoever came up with that is an evil genius. Now I have
       | to be really careful to click the actual video I want to watch
       | rather than a version of it with the same thumbnail that ignores
       | my settings.
       | 
       | Fuck YouTube.
        
       | uniqueuid wrote:
       | By the way, regarding these algorithm studies:
       | 
        | It's important to do them correctly to _prevent bad studies_,
       | not (just) to get valid information.
       | 
       | Or, as Andrew Gelman [1] said:
       | 
       | > The utilitarian motivation for the social sciences is they can
       | protect us from bad social-science reasoning.
       | 
       | [1] https://statmodeling.stat.columbia.edu/2021/03/12/the-
       | social...
        
         | janto wrote:
         | I'd argue that the net utility of social "science" has been
         | negative because it generates bad ideas under the guise of
         | "good social-science reasoning" and is often explicitly
         | antiscience.
        
       | azangru wrote:
       | > recommender AI
       | 
       | So, am I correct to understand that the recommender AI is just
       | learning from youtube users' actual behavior and shows what
       | content people who have watched a given set of videos tend to
       | find "engaging" (watch for a significant amount of time, leave
       | comments, like, etc)? That, in effect, the AI is just holding a
       | mirror to youtube users, and some journalists don't like what
       | they see in it?
        
         | intended wrote:
         | Think more like junk food when it was first invented.
         | 
          | A culture/society with no understanding of McDonald's will
          | immediately see McDonald's proliferate.
          | 
          | The food is cheap, fast and tasty. It hits all the buttons.
          | 
          | Same thing here. Except this is the phase before health and
          | safety regulations, and poisonous/radical content is
          | distributed alongside clickbait.
        
         | uniqueuid wrote:
         | Not quite.
         | 
         | a) we don't know how the recommender works, and it changes over
         | time, and some problematic recommendations have been found even
         | in absence of actively seeking out problematic content, and
         | 
            | b) it's not journalists but researchers from Mozilla and
            | scientists from the University of Exeter.
        
           | azangru wrote:
           | > some problematic recommendations have been found even in
           | absence of actively seeking out problematic content
           | 
           | I don't know how to put it in proper terms, but isn't it
           | possible that the user who has watched videos A and B ("non-
           | problematic") is classified the same as a subset of users who
           | have watched A, B and C (C being "problematic"), and thus is
           | shown C?
           | 
           | Asking because personally, I've never seen my "safe for work"
           | youtube account ever recommend me anything "problematic". At
           | the moment, it's just feeding me music videos of the same
           | genre, which I am already bored by :-)
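            | 
            | A toy sketch of exactly that mechanism (item co-occurrence
            | with entirely made-up data; YouTube's real system is unknown
            | and far more complex):
            | 
            |     from collections import Counter
            | 
            |     histories = [
            |         {"A", "B", "C"},  # users who watched A and B also watched C
            |         {"A", "B", "C"},
            |         {"A", "B"},
            |         {"A", "D"},
            |     ]
            | 
            |     def recommend(watched):
            |         co_counts = Counter()
            |         for h in histories:
            |             if watched & h:                    # overlaps this user
            |                 co_counts.update(h - watched)  # count their other videos
            |         return [v for v, _ in co_counts.most_common()]
            | 
            |     print(recommend({"A", "B"}))  # ['C', 'D'] -- C ranks first
            | 
            | A viewer of only "non-problematic" A and B still gets C
            | recommended first, purely because of what similar users
            | watched.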
        
             | uniqueuid wrote:
              | Both happen.
             | 
             | Some studies such as [1] measure recommendations without
             | past browsing history, and they still find recommendations
             | for harmful content.
             | 
             | Other studies (such as this from Berkman Klein Center [2])
             | simulate user behavior and subsequent recommendations and
             | find additional harmful effects of personalized
             | recommendations.
             | 
             | [1] Faddoul, M., Chaslot, G., & Farid, H. (2020). A
             | Longitudinal Analysis of YouTube's Promotion of Conspiracy
             | Videos. ArXiv:2003.03318 [Cs].
             | http://arxiv.org/abs/2003.03318
             | 
             | [2] https://cyber.harvard.edu/story/2019-06/youtubes-
             | digital-pla...
        
         | dtech wrote:
          | There is a known problem with YouTube both excessively
         | recommending in the same category and driving people to more
         | extreme videos.
         | 
         | If you watch a surfing video, even for a few seconds, expect a
         | lot of surfing videos to pop up in your recommendations for a
         | long time. This has gotten slightly better over time in my
         | experience.
         | 
         | The larger problem is extremism. Watch a video about a
         | vegetarian dish, get recommended vegetarian dish and lifestyle
         | videos, watch those, get recommended veganism videos, watch
         | those, get recommended PETA and animal activism. It kinda works
         | like a funnel.
         | 
          | Might not seem that dangerous for your favourite topic, but
          | now replace the topic with conspiracy theories, religious
          | extremism, radical feminism, anti-government armed resistance,
         | self-harm and suicide, anorexia etc. etc.
        
           | bl5THJUSFXWy4ii wrote:
            | > There is a known problem with YouTube both excessively
            | recommending in the same category and driving people to more
           | extreme videos.
           | 
           | There is a belief that that problem exists with YouTube but
           | whether that actually exists requires further investigation.
           | For example [1] suggests the opposite.
           | 
           | [1]
           | https://firstmonday.org/ojs/index.php/fm/article/view/10419
        
           | azangru wrote:
           | I believe it's been attributed to the fact that more
           | "extremist" content gets more engagement from users in the
           | form of comments or longer watch time.
        
           | banana_giraffe wrote:
           | All of this reminds me of the early days of the TiVo. It had
           | a recommendation system that it would use to fill your DVR
           | with shows you would like. If you recorded a show in a new
           | category (which gave the show an implicit positive rating),
           | it would often go off the rails and find everything in that
           | category. The first time I recorded "The Simpsons", it filled
           | up my DVR with cartoons over the course of the next week
           | before I noticed what it was doing.
           | 
           | We seem to be basically no better than that sort of logic.
        
       | uniqueuid wrote:
        | Here is a link to the original announcement by Mozilla, which is
       | a better source (https://foundation.mozilla.org/en/blog/mozilla-
       | investigation...) (@dang should this be updated?)
       | 
       | Some insights from the report:
       | 
       | * The study uses donated data from mozilla's browser plugin [1].
       | This means it almost surely has self-selection bias and does not
       | offer representative usage data. But that's a fair tradeoff to
       | get realistic measurements of recommendations.
       | 
       | * The focus is on harmful content, so we don't know how much this
       | is out of the entire exposure people get. (But there are a couple
       | of representative studies out there).
       | 
       | * Out of all _reported_ videos, 12% were considered actually
        | harmful. That's not a terrible rate in my opinion.
       | 
       | * 70% of harmful videos came from recommendations - that's why
       | the authors demand changes to the YT recommender system.
       | 
       | * Data quality: Reports came from ~37k users of the browser
       | extension, but only ~1600 submitted a report. Total N of reports
       | is ~3300, so pretty low. The actual analysis was performed on a
       | sample of ~1100 out of these (why?). Harmfulness was analyzed by
       | student assistants which is ok.
       | 
       | * Non-english languages seem to have higher prevalence of harmful
       | videos (which is to be expected given the attention and resources
       | spent).
       | 
       | [1] https://foundation.mozilla.org/en/campaigns/regrets-
       | reporter...
        
         | pseudo0 wrote:
         | The problem is that the study's design makes it impossible to
         | measure the magnitude of the issue. Any recommendation system
         | operating on the scale of YouTube's is going to have a non-zero
         | error rate, and this study effectively just confirmed this
         | known fact. The relative uselessness of the design makes me
         | think this is more about ammo for advocacy efforts than genuine
         | interest in studying this issue.
         | 
         | The question I'd be interested in is: how frequently does a
         | typical YouTube viewer have actually harmful content
         | recommended to them? Looks like it's pretty rare, if even the
         | self-selected group that was interested in this effort and
         | reported harmful content saw an average of (0.12 * 3300)/1600 =
         | 0.2475 over the entire duration of the study. And that's a very
         | generous upper bound, assuming that all of the people who
         | didn't submit reports just forgot about the extension rather
         | than failing to see any potentially harmful content.
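          | 
          | The same upper bound, spelled out with the report's own
          | figures:
          | 
          |     reports, harmful_rate, reporters = 3300, 0.12, 1600
          |     harmful_per_reporter = reports * harmful_rate / reporters
          |     print(harmful_per_reporter)  # 0.2475 harmful videos per
          |                                  # reporter over the whole study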
        
           | uniqueuid wrote:
           | Right, that's the issue here: We don't get any meaningful
           | base rate.
           | 
           | But the problem is: It's incredibly hard to get
           | representative measurements of _personalized_
            | recommendations. How do you want to collect those? You'd
           | need a representative sample in the upper four digits (costs
           | five to six figures), and track their browsing including on
           | mobile phones.
           | 
           | This is expensive and technically difficult, because you need
           | to essentially MITM the connection (SNI doesn't give you the
           | videos, only the domain).
           | 
           | Of course, Youtube knows the precise extent of the issue, but
           | they can't/won't/don't give precise data to anyone
           | (problematic but understandable, given that every
           | totalitarian government in the world would immediately send a
           | wishlist).
        
             | gipp wrote:
             | YT started reporting Violative View rate a few months ago:
             | https://blog.youtube/inside-youtube/building-greater-
             | transpa...
        
             | pseudo0 wrote:
             | It would be more expensive, but I don't think it would be
             | infeasible for an organization like Mozilla. Focus just on
             | the desktop use to start with (extracting recommendations
             | from the YouTube app would likely be challenging) and pay a
             | representative set of users to install a browser extension
             | that records the videos they are recommended. Then go
             | through some subset of the videos and evaluate if they are
             | harmful or not. It would definitely be more work, but at
             | least that would give some useful new information on this
             | issue, rather than confirming what was previously known.
        
               | uniqueuid wrote:
               | Well, I've done some of these things in other contexts.
               | 
               | Market rate compensation for participation in such
               | studies is ~$1 per day. You'd need at least 10'000 users
                | to have meaningful confidence intervals. Say you're
               | tracking for one month, then you pay 300k for the sample
               | alone.
               | 
               | If you have a single multi-purpose researcher doing study
               | design, programming, logistics, analysis, outreach and
               | all, that's perhaps 50k-100k a year (highly dependent on
               | country and institution).
               | 
               | Next you hire a bunch of people to do the content
               | analysis. Either mechanical turk with a 2-3 factor
               | overprovisioning or actually trained people who are 2-3
               | times as expensive. At roughly one minute per video,
                | you'd need one hour for 20 videos, i.e. approximately 50
               | cents to 1 dollar (ballpark) per video.
               | 
               | Say you argue that the long tail is irrelevant, and you
               | want to code the top 100k videos (danger, arbitrary
               | cutoffs are bad), then that's another 100k.
               | 
               | That's half a million for something that isn't mozilla's
               | core mission and is already being done by publicly funded
               | researchers. Would be nice if they did it, but it sounds
               | like a bad investment to me!
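                | 
                | The arithmetic behind that, laid out (every input is one
                | of the ballpark assumptions above, not a real quote):
                | 
                |     users, days, rate = 10_000, 30, 1.00  # ~$1/day per user
                |     sample_cost = users * days * rate     # $300,000
                | 
                |     videos, per_video = 100_000, 1.00     # top-100k videos
                |     coding_cost = videos * per_video      # $100,000
                | 
                |     researcher = 100_000                  # one multi-purpose
                |                                           # researcher, upper end
                |     total = sample_cost + coding_cost + researcher
                |     print(total)                          # 500000.0, i.e.
                |                                           # "half a million"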
        
         | justshowpost wrote:
         | > browser plugin
         | 
         | Didn't Mozilla drop plugin support completely in Jan?
        
           | vengefulduck wrote:
           | They most likely meant a web extension (or "add on" as
           | Firefox refers to it) not a NPAPI plugin which was what was
           | removed in January.
        
           | [deleted]
        
         | dang wrote:
         | It certainly should, since
         | https://techcrunch.com/2021/07/07/youtubes-recommender-ai-st...
         | (the submitted URL) does nothing but copy the Mozilla report
         | and sex it up with gaudy language. We've changed it now.
         | 
         | Submitters: please read the site guidelines. Note this one: "
         | _Please submit the original source. If a post reports on
         | something found on another site, submit the latter._ "
         | 
         | https://news.ycombinator.com/newsguidelines.html
        
         | karmakaze wrote:
         | That's a good summary. I was trying to get a handle on what
         | 'harmful' could mean, and why this isn't higher than 70% from
         | recommendations. What are the other sources, search results,
         | word-of-mouth, etc? Anyway this part cleared up the context.
         | 
         | > New research published today by Mozilla backs that notion up,
         | suggesting YouTube's AI continues to puff up piles of 'bottom-
         | feeding'/low grade/divisive/disinforming content -- stuff that
         | tries to grab eyeballs by [...]
         | 
          | It seems the best way to play this is to not interact with
          | (downvote, comment on) content you don't want to see, and stop
          | playing such content as soon as you realize what it is.
        
         | pizza wrote:
          | Well, not sure what to make of that 70%, since 70% of all
          | YouTube video watches are due to recommendations in the first
          | place:
         | 
         | https://qz.com/1178125/youtubes-recommendations-drive-70-of-...
        
         | belorn wrote:
          | Looking at the examples of their harmful content, half are
          | pro-Republican videos talking about the US 2020 election.
         | If that is representative of their study then that seems very
         | much self-selected in both nationality and politics.
         | 
          | It should be noted that those who classified whether content
          | was harmful or not were a team of 41 research assistants
          | employed by the University of Exeter.
        
       ___________________________________________________________________
       (page generated 2021-07-07 23:01 UTC)