[HN Gopher] YouTube suspends account for linking to a PhD resear...
       ___________________________________________________________________
        
       YouTube suspends account for linking to a PhD research on WPA2
       vulnerability
        
       Author : decrypt
       Score  : 741 points
       Date   : 2021-04-14 10:05 UTC (12 hours ago)
        
 (HTM) web link (www.reddit.com)
 (TXT) w3m dump (www.reddit.com)
        
       | black_puppydog wrote:
       | FWIW the video in question seems to be this one from ~1y ago:
       | 
       | https://peertube.sunknudsen.com/videos/watch/182e7a03-729c-4...
        
       | Aissen wrote:
       | FYI, this is the video:
       | https://peertube.sunknudsen.com/videos/watch/182e7a03-729c-4... ;
       | the links in the description:
       | 
       | KRACK Attacks: Breaking WPA2 https://www.krackattacks.com/
       | 
       | KRACK - Key Reinstallation Attacks: Forcing Nonce Reuse in WPA2
       | https://www.youtube.com/watch?v=fOgJswt7nAc
        
         | koheripbal wrote:
          | Are these new attacks? The video in OP only seems to show the
          | ability to attack the traffic of other connected clients on a
          | Wi-Fi network you're already connected to.
         | 
         | It doesn't give you the ability to "crack" a WPA2 network
         | without the password.
        
       | kypro wrote:
        | YouTube's content policy is bizarre. Children in my family seem to
       | watch videos of Spiderman dry humping Elsa or grannies kicking
       | people in the nuts, but then they remove stuff like this.
       | 
       | I guess this is what happens when advertisers are the main
       | priority and the moderators have all been replaced with robots.
       | Content that may actually harm the development of children is
        | considered fine, while someone posting a slightly edgy political
       | take or a research paper about something which triggers a bot is
       | removed.
        
         | Minor49er wrote:
         | Knowing that that's what the kids are watching, why are you
         | still allowing them to have access to YouTube?
        
           | hansel_der wrote:
           | i kind of get where you are coming from but denying access to
           | knowledge/reality is not helpful.
        
             | Minor49er wrote:
             | What knowledge/reality is gained from videos of Spiderman
             | dry humping Elsa or grannies kicking people in the nuts?
              | This is the typical kind of content that YouTube has lined
             | up for children. Do you really expect a cesspool like this
             | to harbor anything of quality or value for developing
             | minds?
        
               | t-writescode wrote:
               | You're speaking of a specific instance and declaring all
               | instances bad. This is unhelpful.
               | 
                | If that specific kind of video were common, this
                | conversation would be worth having; but it's relatively
                | uncommon, which is what makes it surprising.
               | 
               | The real answer, as someone else has alluded to, is that
               | YouTube is something that parents should watch or be
               | aware of with their children.
               | 
               | There is a great deal of valuable information on YouTube
               | and cancelling it wholesale is absolutely the wrong
               | approach.
        
               | Minor49er wrote:
               | You realize that this comment section is in response to a
               | story where YouTube is suspending educational content,
               | right? Someone else already mentioned Elsagate which is
               | still an ongoing problem as noted by the GP. YouTube is
               | not interested in providing healthy material, which is
               | not surprising since they are owned by the world's
               | largest advertising company. I would advocate that
               | parents should look for better alternatives altogether
                | rather than use it as a source of education or
               | entertainment, even with supervision. It's simply the
               | wrong place to go for that sort of thing.
        
               | re wrote:
               | > If that specific video exists and if it's common, this
               | conversation could be had; but, it's relatively uncommon,
               | making it a surprise.
               | 
               | It is common: https://en.wikipedia.org/wiki/Elsagate
        
             | yarcob wrote:
              | Our kids are not allowed to watch YouTube on their own
              | because YouTube's algorithms surface the worst of the worst.
             | 
             | Youtube is not something kids can use without supervision.
        
               | ranieuwe wrote:
               | And for those pointing out YouTube Kids: it is only
               | slightly less bad. It's still full of questionable
               | material.
        
         | shakezula wrote:
         | YouTube will robot-moderate themselves into the ground. Too
          | many people are leaving the platform. It's just a trickle now
          | but soon it will be a flood.
         | 
         | It's the inevitable result of moderation based only on who's
         | paying you.
        
           | 3minus1 wrote:
           | > Too many people are leaving the platform
           | 
            | A quick Google search shows the platform has 2 billion
            | monthly active users[1], a number that has only been growing.
           | Why do you claim people are leaving? Is it because of a
           | handful of high-profile cases, or maybe you live in a
           | hackernews/techie thought bubble where everyone is annoyed at
           | youtube? The claim that people are leaving, or will leave is
           | just completely baseless imo. This reminds me of the saying
           | "there are 2 types of programming languages: those that
           | people complain about, and those that no one uses".
           | 
           | [1] https://www.businessofapps.com/data/youtube-statistics/
        
             | dmos62 wrote:
             | I too don't see people leaving. We can complain all we
             | like. Nothing will change until there's an alternative.
        
             | sillysaurusx wrote:
             | We've all seen mighty platforms fall. Myspace at one time
             | could've made the same argument.
             | 
             | You're not wrong, but that logic doesn't hold forever.
        
           | iso8859-1 wrote:
           | Where are people escaping to? Are people publishing their own
           | content on Gopher or BitTorrent nowadays? These platforms
           | have a perfect balance of usability and freedom. They can
           | take over the world.
        
             | Balgair wrote:
             | It seems like they are making their own sites. Nebula is
             | one such place where a few of the higher quality content
             | creators are going. I've no idea if it'll work, but they
             | are trying.
             | 
             | https://watchnebula.com/
        
               | CameronNemo wrote:
                | The problem with these techniques is discoverability.
               | How do you build an online audience without paying google
               | for advertisements?
        
               | dane-pgp wrote:
               | Then perhaps the solution isn't for politicians to demand
               | that YouTube be separated from Google/Alphabet, but
               | instead demand that users on YouTube be able to discover
               | videos on other sites.
               | 
               | I see no reason why the search feature within YouTube
               | couldn't return videos that people were hosting on their
               | personal sites or on competitors to YouTube (including
               | Twitch VODs and livestreams). Even subscriptions and
               | recommendations could work across platforms.
               | 
               | YouTube could still run adverts in/on the videos, and the
               | revenue from that would be split with the site where the
               | video was actually published. For a greater share of the
               | ad revenue, YouTube could also offer to stream (a copy
               | of) the video from their own servers.
               | 
               | Even if sites (or Multi-Channel Networks) had to pay a
               | $1000 fee to be included in this system, and YouTube kept
               | its rules about moderation and demonetisation, it would
               | still greatly improve discoverability and hopefully
               | competition generally.
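A rough sketch of the revenue-split part of this proposal. Everything here is invented for illustration (the `FederatedResult` shape, the field names, the 70% share); the comment only proposes the idea, not a concrete API:

```python
from dataclasses import dataclass

@dataclass
class FederatedResult:
    """A hypothetical search result for a video hosted off-platform."""
    title: str
    host_site: str          # where the video is actually published
    stream_url: str
    host_share_percent: int  # whole-percent share of ad revenue for the host

def split_ad_revenue(result: FederatedResult, ad_cents: int) -> dict:
    """Divide ad revenue between the hosting site and the aggregator,
    using integer cents to avoid float rounding."""
    host = ad_cents * result.host_share_percent // 100
    return {"host": host, "aggregator": ad_cents - host}

r = FederatedResult("KRACK explained", "example-peertube.org",
                    "https://example-peertube.org/v/123", 70)
print(split_ad_revenue(r, 100))  # {'host': 70, 'aggregator': 30}
```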
        
               | leereeves wrote:
               | With a $5 monthly fee and no way to upload your own
               | videos, Nebula is most directly competing against
               | Netflix, not YouTube. A few top creators might be able to
               | make that work, but most won't.
        
               | timbre1234 wrote:
               | And no free trial w/o giving credit card info. Yeah, hard
               | pass.
        
               | sillysaurusx wrote:
               | Use privacy.com. It's a godsend.
        
               | bradlys wrote:
               | That is very normal. It's to prevent people from just
               | creating accounts multiple times to never have to pay.
        
               | clankyclanker wrote:
               | A $20 gift card is great for those things. Then, it
               | doesn't matter whether you remember to cancel.
        
           | [deleted]
        
         | tgsovlerkhgsel wrote:
         | The author receiving a strike can mean two things: a) the
         | policy prohibits this b) a mistake was made.
         | 
         | The author seems to consider the latter more likely.
         | 
         | Likewise, Spiderman dry humping Elsa staying up can mean either
         | that the policy allows it, or, much more likely, that they just
         | didn't manage to identify and take it down yet.
        
           | t-writescode wrote:
            | The problem is, how difficult is it for the average, non-
            | special YouTube producer to fix "b) a mistake was made"?
            | 
            | And how long are they refused earnings during that time?
           | 
           | It's similar to the copyright infringement stuff, where real
           | livelihoods can be put in jeopardy from these sorts of
           | decisions.
        
       | sagolikasoppor wrote:
        | Google being Google. It's funny that I once looked up to this
        | company and now I view it as a threat to society.
        
         | devoutsalsa wrote:
         | "You either die a hero, or you live long enough to see yourself
         | become the villain." - Harvey Dent in The Dark Knight
         | 
         | It's funny that I really appreciated Google when I first signed
         | up for Gmail. Now that I've had it for 16 years, I resent the
         | fact that I can get deplatformed for any reason, and there's
         | really nothing I can do to prevent that.
        
           | Jukg7e3AFyskdj wrote:
           | export all of your email to Protonmail and stop sucking on
           | that Google breast
        
             | Andrew_nenakhov wrote:
             | buy your own domain and only use the email addresses you
             | fully control.
        
             | bayindirh wrote:
             | I'm seriously considering it, but I'm a heavy IMAP user.
             | When you enable IMAP in ProtonMail, it's no different than
             | any other mail provider, security wise.
             | 
             | This is the dilemma for me.
        
               | Jukg7e3AFyskdj wrote:
               | I used to be also but now use the protonmail app on my
               | phone. And this on my desktop
               | https://github.com/vladimiry/ElectronMail
        
               | decrypt wrote:
               | It is for this reason I am moving away from ProtonMail to
               | Migadu. Literally any IMAP-offering email service would
               | do. All that we need is a custom domain.
        
               | dunnevens wrote:
               | Other people have made good points. Just wanted to add:
               | you'll feel better knowing you're no longer vulnerable to
               | Google. It's a nice feeling of relief knowing Google
               | could cancel my account tomorrow and I wouldn't be hurt
               | in any substantial way. Definitely worth the effort to
               | move your email to an independent provider.
        
               | infamia wrote:
               | I don't think you can connect to your email via IMAP
               | without ProtonMail Bridge (which encrypts traffic between
               | your email client and ProtonMail).
        
               | bayindirh wrote:
               | AFAIK you bypass 2FA when you use IMAP. I don't see many
               | advantages of using Proton (over any other independent
               | e-mail provider) without 2FA.
               | 
               | The thing is, I connect all my accounts to a single mail
               | application and handle everything from there, including
               | their automated backups for ~15 years or so.
               | 
               | The jury is still out for me, we'll see.
        
               | social_quotient wrote:
                | Technically no different, but you have cast a small vote
                | against behavior you no longer tolerate or support
                | (socially). Even if it's a fraction of a dollar, in
                | aggregate they add up.
        
               | bayindirh wrote:
                | That's another way to look at it, you're right.
        
       | PhilosAccnting wrote:
        | Whenever I hear "algorithm" these days, I see it as the new
        | "statistics". This entire thing is a product of math washing[1].
       | 
       | [1]https://www.mathwashing.com/
        
       | enriquto wrote:
       | > I am a privacy and security researcher and content creator on
       | YouTube and PeerTube.
       | 
       | > I am no longer able to post content for a week.
       | 
       | Honestly, they had it coming.
       | 
       | While "regular people" can be understood to use overtly shitty
       | platforms like youtube, somebody who is a security and privacy
       | researcher has no excuse. They should just avoid platforms with
       | such callously hostile policies. Moreso when there are plenty of
       | sane alternatives like peertube and others.
        
         | giansegato wrote:
          | If you're interested in reaching the mainstream public to
          | increase its exposure to a topic you care about, there's no way
          | around YouTube and the like.
         | 
          | It's obviously and evidently not about uploading videos, but
          | rather about building and reaching a community.
         | 
         | Also, it's wrong to accept that these platforms have this kind
         | of influence and we should just avoid them. They're part of the
         | social fabric of our time, and it's our duty to make things
         | better.
        
           | zentiggr wrote:
           | I see that "duty" as avoiding using YouTube and encouraging
           | others to do so as well, until YouTube goes the way of
           | MySpace and Yahoo and every other service that's lost their
           | way.
           | 
           | Eventually even Google in general.
           | 
           | MySpace was part of the social fabric, too, at one point.
           | 
           | No one is too big to fail or at least become irrelevant. No
           | one is sacred.
        
           | grumple wrote:
            | > If you're interested in reaching the mainstream public to
            | increase its exposure to a topic you care about, there's no
            | way around YouTube and the like
           | 
           | Is this true? There's almost no discoverability on YouTube
           | imo. You could just push people towards your privately hosted
           | platforms.
        
           | enriquto wrote:
            | > If you're interested in reaching the mainstream public to
            | increase its exposure to a topic you care about, there's no
            | way around YouTube and the like
           | 
           | This kind of attitude only serves to entrench the problem and
           | make everything worse. Even if you are not personally
            | affected by YouTube's damaging policies, just using the
            | platform is an insulting lack of respect towards other people
            | who are. Whatever happened to "be the change you want to see
           | in the world"? Any such change must be led by knowledgeable
           | people like this security researcher. I hope that this ban
            | helps Sun Knudsen (who already publishes their videos on
            | PeerTube) lead more and more people out of this and other
           | user-hostile platforms, until these platforms finally turn
           | into a cesspool of content-free "influencers".
        
       | at_a_remove wrote:
       | I think the "oh no human moderation would cost us too much"
       | defense is a distraction. It's a good one, because it is working,
       | but it is still a distraction.
       | 
       | Let us suppose we have a program which checks for orthodoxy
       | according to the YouTube Terms of Service against each video
       | submitted.
       | 
       | Nothing but a lack of having done the work prevents YouTube from
       | adding to its "rejected" flag the following:
       | 
        | 1) A list of start times and durations of the portions of the
        | video violating the ToS.
        | 
        | 2) For each item in that list, make it a tuple and add a
        | _reason_ -- which clause was violated?
       | 
       | Then outputting that to the owner of the rejected video.
       | 
       | Nothing stops them from doing this. _Some_ set of words or images
       | happened somewhere in the timestream -- the orthodoxy program has
       | that timestamp as it churns through the video. And a specific
       | clause was violated -- a particular word in a wordlist as an
        | example, because there's no simple, non-composite function that
       | just says "good or not good."
       | 
       | This is really not a particularly high bar as an ask.
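The rejection report this comment proposes can be sketched as a small data structure. Every name here (`Violation`, `rejection_report`) is illustrative, not anything YouTube actually exposes:

```python
from dataclasses import dataclass

@dataclass
class Violation:
    """One flagged portion of a video: where it starts, how long it
    lasts, and which ToS clause the checker believes it violates."""
    start_seconds: float
    duration_seconds: float
    clause: str

def rejection_report(violations):
    """Format the (start, duration, reason) tuples for the uploader."""
    return [
        f"{v.start_seconds:.0f}s-{v.start_seconds + v.duration_seconds:.0f}s: {v.clause}"
        for v in violations
    ]

print(rejection_report([
    Violation(12, 5, "restricted-word-list"),
    Violation(300, 2, "graphic-imagery"),
]))  # ['12s-17s: restricted-word-list', '300s-302s: graphic-imagery']
```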
        
         | flavius29663 wrote:
         | That will never work.
         | 
          | If you tell the user exactly what they did wrong, they will
          | keep trying, changing very little in the original video, until
          | it gets past the bots but still keeps the illegal or
          | "forbidden" message.
         | 
         | I think youtube should not be allowed to remove videos for
         | their content unless for legal reasons. Meaning it's a
         | violation of the law, in which case they have to send the
         | information to the police as well. And that, of course, should
         | require a human stamp of approval. Of course, Youtube should be
         | allowed to remove videos if they are too long, wrong format
         | etc., but NOT for their content.
        
           | tumetab1 wrote:
            | > If you tell the user exactly what they did wrong, they
            | will keep trying, changing very little in the original
            | video, until it gets past the bots but still keeps the
            | illegal or "forbidden" message.
           | 
            | I think there's a possible implementation that handles the
            | issue you describe.
           | 
            | It's trivial to detect a user trying to circumvent content
            | controls if they're trying from the same account. When that
            | is detected, the account can be put into manual review mode,
            | or new uploads can be blocked automatically for a while.
            | 
            | If circumvention is attempted with different accounts, the
            | current system already doesn't prevent that, so nothing is
            | lost.
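A minimal sketch of this escalation idea, assuming (my assumption, not the commenter's) that rejected uploads can be matched by some content fingerprint so near-identical retries from the same account are countable:

```python
from collections import defaultdict

RETRY_LIMIT = 3  # assumed threshold before escalating to a human

class UploadGate:
    """Count rejected-content retries per (account, fingerprint) pair;
    escalate accounts that keep resubmitting near-identical content."""
    def __init__(self):
        self.rejections = defaultdict(int)

    def on_rejection(self, account: str, fingerprint: str) -> str:
        self.rejections[(account, fingerprint)] += 1
        if self.rejections[(account, fingerprint)] >= RETRY_LIMIT:
            return "manual-review"   # a human looks at further uploads
        return "auto-rejected"

gate = UploadGate()
gate.on_rejection("acct1", "fp-abc")         # 'auto-rejected'
gate.on_rejection("acct1", "fp-abc")         # 'auto-rejected'
print(gate.on_rejection("acct1", "fp-abc"))  # 'manual-review'
```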
        
         | slt2021 wrote:
          | this will make it very easy to develop an adversarial
          | algorithm and avoid detection by YouTube's system
        
           | at_a_remove wrote:
           | That would literally beat the current situation by a country
           | mile. The goal isn't to perfect the YouTube flagging algo,
           | after all ... an adversarial algo is a small price to pay.
        
             | colinmhayes wrote:
             | >That would literally beat the current situation by a
             | country mile.
             | 
             | I don't think google agrees.
        
               | pixl97 wrote:
               | Google likely only agrees with whatever choice makes them
               | the most money. Whatever decision they make will always
                | be colored by how they think it will affect their stock
               | price.
        
       | m_a_g wrote:
       | >Anyhow, I truly believe humanity has to rollback to operating at
       | a human scale.
       | 
       | >Using algorithms to flag content is totally fine... problem is
       | when humans cannot interact with humans anymore and AI gets to
        | choose what is right and wrong.
       | 
        | Couldn't agree more.
        
         | agilob wrote:
          | That's exactly the point. This issue was identified as a
          | problem in 1977 in "Computers Don't Argue", but was developed
          | in the 2010s as a feature.
         | 
         | https://archive.org/details/bestofcreativeco00ahld/page/133/...
        
         | lolthishuman wrote:
         | Only way is probably population reduction
        
         | BrianOnHN wrote:
         | This ignores human bad actors.
         | 
         | There is no "rollback" for the existence of algorithms.
         | 
          | Instead of whining (problems without solutions), spend some
          | time contemplating what a sufficient algorithmic solution
          | looks like.
         | 
         | To be clear, that would be an algorithmic solution that "fights
         | back" against issues like these (and not simply improving a
         | failing solution).
         | 
         | That may not be a competing product. Maybe it's an algorithmic
         | whistleblower?
        
           | jrumbut wrote:
           | The algorithm can exist, but to run it at scale requires
           | buildings and large teams and bank accounts and therefore the
           | companies that use these algorithms are susceptible to
           | regulation.
           | 
           | Otherwise it seems like an enormous waste of talent and
           | resources playing cat and mouse, when we can just give them
           | guidelines and fairly often they'll be followed.
           | 
            | Of course bad actors can exist, but that's not a reason to
            | give up on social solutions to problems.
        
             | BrianOnHN wrote:
              | > give up on social solutions to problems
             | 
             | Everything is a trade-off.
             | 
             | Sure humans can be used for moderating spam.
             | 
             | I'd prefer if they performed real social work, instead.
        
         | b3kart wrote:
         | The thing is, we have to make sure we're OK with the
         | consequences of going this way. Would people be happy to wait
         | for days, maybe weeks, for their video to be approved for
         | upload to YouTube? Would that work with the current society's
         | attention span?
         | 
         | Is YouTube as we know it actually feasible without AI-based
         | moderation? Can it be a sustainable business with an army of
         | human moderators and/or a drastically reduced scale as a
         | result?
         | 
         | I don't know, but I have my doubts.
         | 
         | Yes, moderation/suspensions can be a pain, and pointing out
         | embarrassing mistakes is important to add pressure on Google to
         | put more resources into it. At the same time, YouTube is full
         | of great content and content creators. Anyone can upload a
         | video and make it viewable by billions, _instantly_. I think we
          | shouldn't forget how amazing this actually is.
        
           | izacus wrote:
            | What if... instead of YouTube you'd have several interest-
            | based sites which could actually handle their moderation and
            | workload, instead of being a giant uncaring behemoth?
        
             | rapnie wrote:
             | You mean, like the open standards-based Fediverse, where
              | there are also no ads or algorithmic feeds optimizing for
             | engagement?
        
             | b3kart wrote:
             | Isn't that basically how reddit operates? And last I heard
             | they have their native video hosting as well. Sounds like
             | this could work, but doesn't quite at the moment. Is
             | YouTube handling content discovery better? Network effects
             | maybe?
        
               | judge2020 wrote:
               | Reddit is interest-based but gets free labor from most
               | subreddit moderators that remove bad content for them.
               | Reddit is also much less lucrative of a content platform
               | given you have limited ways to make money (usually
               | limited to linking to your patreon), so the content is
               | constrained to whatever people do with no expectation of
               | money.
        
             | adrianN wrote:
             | Who pays for the moderators? You still need a lot of
             | manpower if you don't want algorithms, regardless of
             | whether you have one Youtube or a thousand
             | SpecialInterestTube.
        
               | tclancy wrote:
               | Without disagreeing with your point, I do think there's
               | something to be said for the more specialized sites the
                | Internet used to be built around. Places that foster a
               | sense of community tend to wind up with users who are
               | happy to pitch in when something doesn't look right (or
               | get overrun by spammers if the community isn't able to
               | deal with the issues). I am happy to pull weeds in a
               | community garden in a way I will not at some factory
               | farm. This approach isn't a utopia; people are people and
               | everything is political, but at smaller scales, saying
               | "Well, just start your own [niche] video site" isn't
               | snark, it's a realistic thing and perhaps both sites
               | contribute to the marketplace of ideas. YouTube is the
               | marketplace of the unthinking.
        
               | foepys wrote:
                | Tech is very special in that it can get away with
                | claiming that quite a lot is not possible at its scale.
                | Imagine this in other industries.
               | 
               | "We cannot do quality control before sending products
               | out. We make too much stuff."
               | 
               | "We don't check all tickets for the venue. Too many
               | people."
               | 
               | "Cleaning the entire building is not possible. It's too
               | much space."
        
               | anticensor wrote:
               | "We don't check all tickets for the venue. Too many
               | people."
               | 
                | This already happens with some public transport operators.
        
               | manigandham wrote:
               | There's a massive difference between centralized
               | production of physical goods vs user-generated digital
               | assets. Scale itself becomes the challenge.
        
               | ghaff wrote:
               | And those things are paid for. It would be perfectly
               | reasonable if you had to pay per minute for uploading
               | video and people had to pay for a subscription to watch
               | it. That's just not what is expected for a mass platform
               | today.
        
               | colejohnson66 wrote:
               | Vimeo has an "uploader pays" system (of sorts), and,
               | while it prevents garbage on the platform, it stops
               | creators from wanting to use it when YouTube is free.
        
               | ghaff wrote:
               | Yes. It's of course a perfectly valid (and widely-used)
               | model to pay for hosting. But, by and large, only
               | "serious" users who probably otherwise have a business
               | model will do so. And there will inevitably be far less
               | content on the platform.
               | 
               | On the one hand, I think it's sort of unfortunate that so
               | much of the Web has come to be about "free" access
               | (though paying for access with attention and time) but
               | that does come from the perspective of someone who would
               | be able and happy to pay with money instead.
        
               | adrianN wrote:
               | There are few other industries where users expect to get
               | everything for free.
        
               | b3kart wrote:
               | All the examples you provide have to do with tangible,
               | physical things. Tech _is_ in some sense special because
                | we're dealing with intangibles. It costs a bad actor
               | next to nothing to upload the same video with harmful
               | content to YouTube a thousand times, a million times
               | even. And YouTube has to deal with all of this.
               | 
               | That's why they still use physical ballots in most
               | elections: as soon as you're dealing with physical things
               | the costs for a bad actor to do bad things at scale
               | balloon up.
        
               | SamoyedFurFluff wrote:
               | But a bad actor can only do so much if manual reviews are
               | involved. If they're a completely new account because
               | their previous accounts have been banned, accessing on a
               | new device because their previous Alphabet-surveilled
               | fingerprint had been banned, or on a VPN because their IP
                | has been banned, YouTube could just apply manual review,
                | which would slow such a bad actor down. And if the intent
                | is to clog up the manual review process with copies of a
                | single illegal video, we already have technology that
                | automatically identifies specific videos or images and
                | removes them, usually for the purpose of preventing the
                | spread of illegal depictions of children.
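As a simplified stand-in for the matching technology the comment refers to: real systems such as PhotoDNA or Content ID use perceptual hashes that survive re-encoding, whereas the exact SHA-256 used here only catches byte-identical re-uploads. The dedup check might look like:

```python
import hashlib

KNOWN_BAD = set()  # hashes of content already removed by a human reviewer

def fingerprint(data: bytes) -> str:
    # Exact hash for illustration only; production systems use
    # perceptual hashes robust to cropping and re-encoding.
    return hashlib.sha256(data).hexdigest()

def is_known_bad(upload: bytes) -> bool:
    """Auto-remove re-uploads of known-bad content; everything else
    still goes to the (human) review queue."""
    return fingerprint(upload) in KNOWN_BAD

KNOWN_BAD.add(fingerprint(b"previously banned video bytes"))
print(is_known_bad(b"previously banned video bytes"))  # True
print(is_known_bad(b"new upload"))                     # False
```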
        
               | adrianN wrote:
               | IPs are very cheap. Botnets are huge and residential IPs
               | are easy to come by.
        
               | water8 wrote:
               | If you change the MAC address on your cable modem you can
               | get tons of different IPs. Same thing with cellular
               | networks. With Raspberry Pis so cheap, fingerprints
               | don't work that well for all devices.
        
               | SamoyedFurFluff wrote:
               | There are relatively few people capable of doing this
               | compared to the population of the world that uses
               | YouTube, and there are even fewer people who, even with
               | the capacity, will be doing this just to upload illegal
               | content into YouTube. I think this sizes down the problem
               | of manual review quite easily.
        
               | sgift wrote:
               | But Tech also enjoys the benefits of this. You make a
               | product once, sell it thousands of times. Or allow
               | thousands/millions of people access without much
               | overhead. I'm simplifying here of course, but compared to
               | e.g. a machine which you have to produce new for each
               | customer I think we are in a very good situation. And
               | with that advantage comes a price tech should be forced
               | to pay.
        
               | BigGreenTurtle wrote:
               | "We cannot do quality control before sending products
               | out. We make too much stuff."
               | 
               | Ever buy strawberries?
               | 
               | "We don't check all tickets for the venue. Too many
               | people."
               | 
               | Train conductors often check tickets after the train has
               | started moving.
               | 
               | "Cleaning the entire building is not possible. It's too
               | much space."
               | 
               | Climate change. Dirty streets.
        
               | da_chicken wrote:
               | Very much so. Tech companies have a way of confusing
               | "this is difficult" with "this is impossible," or "this
               | is impossible" with "this isn't free," or "this is
               | impossible" with "we don't want to."
        
               | LegitShady wrote:
               | They are incentivized to have that confusion. So no
               | surprise.
        
               | Nasrudith wrote:
               | It is impossible is the polite way of telling you that
               | your idea is incredibly illbconsidered and contrary to
               | the actual purpose of the program and please fuck off and
               | get cancer. Linus Torvalds was one of the few who
               | operated under that style. Sadly for transparency and
               | clarity it is the diplomatic ignore the elephant in the
               | room which has become the standard. I blame all of the
               | professional emotional manipulator sophist whiners who
               | don't even try to understand reality. "Do it yourself
               | then." is embedded in it as a message to fuck off because
               | everyone knows they either cannot do it because they
               | asked in the first place in those terms.
        
               | viraptor wrote:
               | I think it's much less work on SpecialInterestTube. On
               | YouTube there's a huge amount of decisions and ideas the
               | moderator has to take into account. The other one can
               | filter a large amount of spam with "the first 5 seconds
               | make it clear it's not SpecialInterest".
               | 
               | For example comparing to HN, I can go on "New" and see
               | "Places to Spray a Fragrance on for a Long Lasting
               | Effect-Ajulu's Thoughts" - I know it's immediately
               | flaggable, even if it could be a valid post on another
               | website. (and require a full review there to check it
               | doesn't break some specific rules)
        
               | [deleted]
        
               | tinco wrote:
               | Google had a revenue of $180B in 2020, and their EBITDA
               | was $55B. I'm no accountant but that sounds like an
               | absolutely insane profit margin. 8-12% would seem much
               | more palatable, so that means they have over $25B of
               | excess profits they could spend on 500,000 middle-class
               | incomes for content moderators.
               | 
               | YouTube represents 10% of Google's income BTW, so the
               | other 90% of those middle-class incomes could do
               | moderation for their other services.
               | 
               | Is that naive?
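The arithmetic in the comment above holds up; a quick check using its stated figures (the $50k per-head cost is the figure implied by $25B / 500,000 moderators):

```python
# Figures as stated in the comment above (Google, 2020).
revenue = 180e9
ebitda = 55e9
palatable = 0.12                       # upper end of the "8-12%" margin
excess = ebitda - revenue * palatable  # ~ $33.4B, so "over $25B" holds
assert excess > 25e9

salary = 50_000                        # implied: $25B / 500,000 moderators
assert 25e9 / salary == 500_000
```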
        
               | izacus wrote:
               | I usually frame it this way: Mazda Motor Corporation, a
               | company an order of magnitude smaller, can provide
               | several showrooms in my <2M-population country where
               | there are physical people who will fix my problem.
               | 
               | And Google, the mighty megacorporation, can't even clear
               | a basic phone support bar, much less a local user support
               | office in each region? Something normal for other
               | corporations?
        
               | adrianN wrote:
               | Mazda earns several hundred to several thousand dollars
               | of profit from the sale of a car to you. Google earns an
               | order of magnitude or two less per user.
        
               | foepys wrote:
               | Thousands of dollars are only possible with luxury cars.
               | 
               | VW, one of the two largest auto makers, makes around
               | 300EUR profit per vehicle sale (excluding luxury brands).
               | For some models it's even less than 100EUR. You can
               | calculate around 1% profit for a car sale, often less.
               | 
               | The actual money is in leasing and financing, which is
               | why almost all automakers own a bank.
        
               | gambiting wrote:
               | >>VW, one of the two largest auto makers, makes around
               | 300EUR profit per vehicle sale (excluding luxury brands).
               | 
               | Seeing as it's absolutely trivial to negotiate 10%
               | discount on any new VW car, I struggle to see how that's
               | true? Unless you mean that VW makes 300 euro and the rest
               | is the profit of the dealership?
               | 
               | Even if that's true, VW makes an absolute mountain of
               | money through financial services. Yes, they will make
               | very little profit selling you a £12k Polo, but VW
               | Financial Services will make £2-3k just providing you
               | with a financing agreement for it.
        
               | foepys wrote:
               | I don't see how you are contradicting me? That's exactly
               | what I said. The dealer doesn't matter here, it's just
               | about the manufacturer.
        
               | gambiting wrote:
               | Oh it's not about contradiction, I'm just curious where
               | the 300EUR value comes from.
        
               | tinco wrote:
               | Mazda spends several hours in pre- and after sales on
               | each car. Google could automate 99.99% of moderation, and
               | spend literal seconds on 90% of all further moderation
               | per user. A magnitude or 3 less human attention than
               | Mazda's sales pipeline spends on a user would be worlds
               | better than what Google does now.
               | 
               | I've seen YouTube channels with thousands of dollars of
               | monthly revenue be closed over issues that would've taken
               | a trained human less than 10 minutes to resolve.
               | YouTubers with that kind of revenue are literally one in
               | a million users or so.
        
               | izacus wrote:
               | They also had to source parts, manufacture it, transport
               | it via boat and train, then pay someone to show me
               | through all the buttons, clean it and then offer service
               | at least once every year.
               | 
               | Besides that they also provide 24 hour support for
               | breakdowns and plenty of other services.
               | 
               | And at best, they made 10% (!!) gross margin on my car -
               | which is in the range of $2,500. Meanwhile Google has
               | 45-60% (!!) profit margins while not giving a crap.
        
               | TheRealDunkirk wrote:
               | Let people have the detestable YouTube TV for free as
               | mods.
        
               | Dylan16807 wrote:
               | It's definitely not just a money thing, because even
               | major channels that bring in lots of money can't get
               | someone to review their cases.
        
               | atat7024 wrote:
               | Hackaday.com does just fine.
               | 
               | YouTube was fine before Google.
               | 
               | Stop centralising.
        
               | judge2020 wrote:
               | YouTube was bought in 2006 - not much time to even think
               | about having a content problem or adpocalypse.
               | 
               | 0: https://www.nbcnews.com/id/wbna15196982
        
             | amelius wrote:
             | What if... people could make their own websites and host
             | their own videos, and people could link to each other's
             | content?
        
             | criley2 wrote:
              | >What if... instead of YouTube you'd have several
              | interest-based sites which could actually handle their
              | moderation and workload instead of being a giant
              | uncaring behemoth?
             | 
             | What if... the average viewer was willing to pay for video
             | content instead of requiring a free, unprofitable service
             | run as a "Free add-on" to an Ad Empire?
             | 
             | Youtube may finally be profitable in the past few years but
             | the first 90% of its existence, the entire rise to power,
              | it was unprofitable and run at a loss. There aren't
              | other major providers because no one else wants to lose
              | that much money.
        
               | Nasrudith wrote:
                | Have you forgotten about Netflix and the innumerable
                | streaming services out there, as every studio and
                | their dog's mother wants to claim 100% of the pie? The
                | money isn't the issue - it's a Freudian slip that
                | either you want it to be like cable or are blindly
                | repeating the words of someone who does.
               | 
               | If you don't like youtube's free service you can get your
               | money back.
        
               | [deleted]
        
               | JoBrad wrote:
               | That will end up just like paywalls on news sites. It's
               | the best way forward (with tweaks), but ultimately will
               | fail unless everyone participates.
               | 
               | Services like Brilliant and others are attempting what
               | you're suggesting, though.
        
               | b3kart wrote:
               | > What if... the average viewer was willing to pay for
               | video content
               | 
               | That would solve all of our problems, wouldn't it? Too
               | bad there's likely always going to be someone willing to
               | offer the same service for free + ads and/or selling
               | users data. Unless people start valuing their
               | privacy/data much more than they do at the moment (or
               | regulation comes in), it's just wishful thinking.
        
               | Dylan16807 wrote:
               | It would solve a couple problems. I don't see how it
               | would solve this problem.
        
             | JohnWhigham wrote:
             | That may have worked in the pre-iPhone Internet, where it
             | took effort to become a part of communities, and everyone
             | implicitly understood that. Now, it's a rock vs. a hard
             | place: the masses want centralized platforms, but those
             | centralized platforms are proving cancerous to society.
        
               | api wrote:
               | The masses don't want centralized platforms or
               | decentralized platforms. The masses don't care how things
               | work. What the masses want is an easy to use, fast
               | platform. Centralized systems have been the best at
               | providing that because they are easier to engineer and
               | because there is an economic model via ads and
               | surveillance.
        
           | doublejay1999 wrote:
            | The answer to all your questions is no, which is
            | precisely the point the author is trying to make.
            | 
            | If we made the case for operating "at human scale",
            | YouTube would be amongst the first casualties.
           | 
           | to your last point :
           | 
           | > Anyone can upload a video and make it viewable by billions,
           | instantly. I think we shouldn't forget how amazing this
           | actually is.
           | 
           | By the same reasoning, COVID-19 is similarly amazing.
           | 
            | Opinions move on: YouTube is 15 years old now. It's a
            | practical monopoly, being the world's 2nd most visited
            | website. I won't deny its utility - but it's a gross
            | centralisation whose modus operandi presents significant
            | questions - chiefly: is it right that it has for years
            | circumvented nearly all regulation applied to other media,
            | and should it remain largely unaccountable to any elected
            | power?
           | 
            | So the challenge, it seems, is thus: how do we preserve
            | the good things that self-published video content has done
            | for us, but prevent or limit the consequences and
            | attendant injustice caused by a service so infeasibly
            | large it's impossible to administer?
        
             | ehsankia wrote:
             | Do you have reason to believe it can be done better than
             | how Youtube does it? Again, there are two approach:
             | 
             | 1. Limit who can post content
             | 
             | You'll end up with only the powerful being able to push
             | content, you're taking the voice away from the masses.
             | 
             | 2. Try to gradually improve content moderation and live
             | with the rare false-positives or false-negatives.
             | 
             | This is the current approach, and while not perfect, I
             | personally much prefer it to the former.
        
               | doublejay1999 wrote:
                | I don't think it could be done much better than
                | YouTube at the scale YouTube does it, but I would
                | insist they need to be plugged in to existing national
                | media regulators and bound by the same standards -
                | flawed as they are.
               | 
               | > You'll end up with only the powerful being able to push
               | content, you're taking the voice away from the masses.
               | 
                | I think the algorithms already do that. The channels
                | with the most followers have the loudest voice.
               | 
                | > Try to gradually improve content moderation and live
                | with the rare false-positives or false-negatives.
                | 
                | It's better than it was, and will likely improve, but
                | it's still drowning in shills and scammers. I think
                | this would be rooted out by a more community-centric
                | model. Exactly how that looks, I don't know - but I
                | think self-policing would play a part.
        
           | superjan wrote:
           | One problem is that there is a single process for all.
           | Certain types of journalism and research should be able to
           | ask for pre-approval of videos without risking a ban. The AI
           | can still deal with the bulk of those requests.
        
           | skybrian wrote:
           | There are often problems with Reddit moderators, but at least
           | you know who they are (sort of) and can talk to them. It
            | seems like there is a better relationship between
           | moderators and a community when the moderators are part of
           | the community, versus being an anonymous horde of
           | interchangeable personnel.
           | 
           | (And of course there is Hacker News for another example,
           | combining some automatic stuff with manual control.)
           | 
           | To get this at scale, though, it seems like you need a larger
           | social network to be divided into many smaller ones in some
            | logical way that's understood by everyone involved.
           | 
           | It seems like the inherent hierarchy (authors versus
           | subscribers) and subdivision of Substack is likely to work
           | out well for them.
        
           | [deleted]
        
           | kodablah wrote:
           | I think the argument is framed as wanting a human during take
           | down dispute, not pre-approval. There's no need to treat this
           | as an absolute. Why are human contact and reasonable
           | expediency considered mutually exclusive? There is a lot of
           | room in the middle. Can allow human dispute resolution for
           | non-free accounts (i.e. paid or earning). Can limit upload
           | frequency site wide while dispute/abuse queues are backed up
           | (knowing YouTube would be incentivized to prevent backup).
            | Can have a non-profit reviewer company that charges very
            | minimal amounts for human review. Etc.
           | 
           | Let's be clear, the lack of humans anywhere in the pipeline
           | for uploaders is because they don't have to, not because it
           | is impossible. Having said that, I really hope we do not see
           | government intervention here.
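One of the ideas floated above — site-wide upload throttling while dispute queues are backed up — can be sketched with a standard token bucket. All names, rates, and thresholds here are illustrative assumptions, not anything YouTube actually does:

```python
import time

class TokenBucket:
    """Minimal token bucket: allows a burst of `capacity`, refills at `rate`/s."""
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        # Refill proportionally to elapsed time, then try to spend one token.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

def upload_rate(queue_depth, normal=5.0, degraded=0.05):
    """Hypothetical policy: slow uploads while the dispute queue is deep."""
    return degraded if queue_depth > 10_000 else normal

# Deep backlog -> degraded rate: the burst passes, then uploads are throttled.
bucket = TokenBucket(rate=upload_rate(queue_depth=50_000), capacity=3)
results = [bucket.allow() for _ in range(5)]
print(results)  # [True, True, True, False, False]
```

Tying the refill rate to queue depth gives the platform the incentive the comment describes: the only way to restore full upload throughput is to work the dispute backlog down.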
        
             | AnthonyMouse wrote:
             | > Let's be clear, the lack of humans anywhere in the
             | pipeline for uploaders is because they don't have to, not
             | because it is impossible.
             | 
             | The real problem here is that people are envisioning human
             | interaction as something that can produce better outcomes,
             | but the cost of that may not be economical.
             | 
             | You can hire a generic human to look over a flagged video
              | for 15 seconds and render a verdict, but the verdict is
              | going to be _bad_. That's not enough time to make an
              | accurate evaluation.
             | 
             | Suppose you have an epidemiologist on YouTube making a
             | claim about virus propagation. It gets flagged because
             | there is a CNN article reporting that the CDC said
             | something contradictory. The epidemiologist argues that the
             | CNN article is a misinterpretation and if you look at the
             | original CDC publication, that's not really what they said.
             | 
             | At this point for YouTube to get it right they've basically
             | got to hire their own M.D. to study the CDC publication in
             | enough detail to understand the context. That is
             | prohibitively expensive. But the alternative is often
             | censoring an epidemiologist who is trying to _correct_
             | misinformation being disseminated by CNN.
             | 
             | And science is a moving target. The infamous example was
             | the CDC in 2020 telling people not to wear masks. But there
             | was a point in time when scientists were publishing studies
             | on the health benefits of tobacco consumption. Challenging
             | the established orthodoxy is inherently necessary for
             | progress.
             | 
             | The underlying problem here is that YouTube isn't competent
             | to make these determinations. They're attempting to do
             | something they have no capacity to succeed in.
        
               | posguy wrote:
               | What stops the YouTube reviewer from calling or emailing
               | the CDC to render a verdict on which interpretation of
               | their public information is accurate?
               | 
               | When it comes to government agencies, they are
               | contactable (usually with a simple email). You might not
               | get an immediate response, but a few hours to a few days
               | later you can expect to hear back.
        
               | vokep wrote:
               | Why would youtube be trying to make a determination like
               | that? Individuals are the arbiters of truth, not
               | organizations. Ministries of truth are not desirable.
        
               | eecc wrote:
                | So true. Automation cannot infer nuance. It's OK to
                | strike down obvious spam and trolls, but this pretense
                | that private actors should spend their resources to
                | cultivate a sane public space is just a broad-spectrum
                | reflex reaction to micro-targeted disinformation.
                | Nonsense.
        
             | b3kart wrote:
             | Agree wholeheartedly: there's definitely a spectrum between
             | what Google are doing atm and full human moderation. I am a
             | strong believer in using AI to augment humans, not replace
             | them. Nudging Google towards putting more humans in the
             | pipeline is what I am hoping posts like this are achieving.
        
               | prox wrote:
               | When was the last time Google listened to the public? In
               | spirit, not because there was a backlash? It's a techno
               | conglomerate with no human face. Google was cool and
               | interesting in 00's, but nowadays all they create is
               | fodder for those sweet ad dollars.
        
             | thu2111 wrote:
             | What's with the assumption here that there aren't humans
             | involved in this decision? The linked post literally quotes
             | Google saying:
             | 
             |  _" Hi Sun Knudsen, Our team has reviewed your content,
             | and, unfortunately, we think it violates our harmful and
             | dangerous policy."_
             | 
             | Google has always had humans in the loop. Not for every
             | decision but certainly the more impactful ones that affect
             | user generated content. When I worked there, there were
             | even humans reviewing requests for account un-suspensions
             | (yes, really! they didn't do a great job though, needless
             | to say).
             | 
             | Also, it's quite hard to understand what sort of algorithm
             | could result in this sort of suspension. What training data
             | would give it hundreds of thousands of examples of infosec
             | discussions labelled as harmful? But it's very easy to see
             | how a large army of low paid humans given vague advice like
             | "don't allow harmful or dangerous content" could decide
             | that linking to hacking information is harmful.
             | 
             | The problem here is not AI. The problem here is Google
             | allowing vague and low quality thinking amongst the ranks
             | of its management, who have decided to give in to political
             | activists who are determined to describe anything they're
             | ideologically opposed to as "harmful", "dangerous",
             | "unsafe" etc. When management give in to those radicals and
             | tell their human and AI moderators to suspend "harmful"
             | content, they inevitably end up capturing all kinds of
             | stuff that isn't political in nature, simply due to the
             | doublespeak nature of the original rules.
        
               | devmor wrote:
               | "Our team" is quite often a lie. Why do you assume that a
               | human was involved? Especially when it comes to Google,
               | who notoriously skimps on human support staff.
        
               | water8 wrote:
               | It's a bad idea to give liars the benefit of the doubt
        
               | dbuder wrote:
               | They lie both ways too, whenever there is a public
               | kerfuffle they always blame the algorithm / the AI.
        
               | thu2111 wrote:
               | Because I worked there and back then, when Google claimed
               | content had been manually reviewed, it had been. It might
               | have been by a low wage worker in India or Brazil and
               | they might have spent 20 seconds on it, but there _was_ a
               | person in the loop.
               | 
               | That's why I turned this around. Why do you assume
               | they're lying, without evidence?
        
               | SirSourdough wrote:
               | Why wouldn't we assume they are lying? Interacting with
               | their dispute process doesn't have any human qualities,
               | so what would make us think that humans are involved?
               | 
               | They are a massive monopoly and just the other day it was
               | reported that they manipulated their own ad markets for
               | their own profit. They constantly inaccurately accuse
               | people of malicious intent and copyright infringement,
               | often massively hindering peoples' careers and generally
               | providing zero recourse. To not assume lying and at best
               | self-serving intent given Alphabet/Google's position in
               | the world would be foolish.
               | 
               | On top of that, is it really that different to use
               | inaccurate AI or to use humans who are intentionally not
               | given the time or tools to do human-quality work?
               | 
               | The problem is obviously with the overall quality of
               | moderation, and the level of power exerted by platforms
               | over users. Whether a non-skilled human reviewer or an AI
               | is handling the broken process hardly matters given the
               | results.
        
               | hervature wrote:
               | Choosing to die on that hill of technicality seems like a
               | silly choice. People say "human" but they really mean "a
               | person who diligently considered the content, probably
               | watched the video more than once, understood what they
               | were watching, and possibly has a chance to debate with
               | another human". Or, you know, a reasonable definition for
               | "human in the loop".
        
               | admax88q wrote:
               | It seems you're defining "human in the loop" as "Google
               | makes what I consider to be the correct decision in this
               | case." I believe the point that thu2111 is trying to make
               | is that there _are_ humans in the loop, and having them
               | is no insurance against unilateral decisions that you
               | disagree with or even just plain stupid decisions.
               | 
               | The point is that involving more humans in the moderation
               | decisions at YouTube does not guarantee nor is there any
               | evidence to suggest, that it would result in better
               | moderation.
               | 
               | Somebody with the technical and industry knowledge to
               | know that talking about security vulnerabilities like
                | this shouldn't be removed can probably find a better
                | job than YouTube video moderation.
        
               | tremon wrote:
               | _Google makes what I consider to be the correct decision
               | in this case._
               | 
               | I'll settle for "Google makes a well-informed decision
               | and will communicate the weighted pros and cons that led
               | to that decision."
        
               | kelnos wrote:
               | I would accept that Google will sometimes make the wrong
               | decision, but it's unacceptable that the care given
               | toward making that decision stops at a low-wage worker in
               | another country spending 20 seconds on the decision.
               | 
               | There's a very wide chasm between "an AI decided" or
               | "someone spent 20 seconds on it", and "we hired an expert
               | in every possible field and they thoroughly research the
               | facts at hand before making a decision".
               | 
               | The latter is obviously not feasible, but there are many
               | options in between that will give markedly better
               | outcomes than the status quo while not making YT
               | moderation economically infeasible.
        
               | [deleted]
        
               | hervature wrote:
               | There are plenty of cases where I disagree with the
               | outcome of companies I interact with. When I spend 30
               | minutes chatting with an agent via text with Amazon or
               | airlines, I might leave the conversation disappointed,
               | but I never doubt their conscientiousness. I highly doubt
               | United beat me at the Turing Test.
        
               | kodablah wrote:
               | > What's with the assumption here that there aren't
               | humans involved in this decision? The linked post
               | literally quotes Google saying [...]
               | 
               | The post literally quotes Google as saying that for the
               | _last_ time this happened. Then the post literally states
               | they appealed and never heard back. There is nothing
               | concerning a human for this instance, though I suspect
               | once the uploader reaches out (were it not for the
               | popularity this is gaining), one could expect similar
               | dismissal. That there are barely humans involved, and in
               | a poor way when they are, doesn't mean there are never
                | humans involved, true. But for many, the bare minimum
                | of assistance is indistinguishable from no assistance.
        
               | thu2111 wrote:
               | Appealing and not hearing back doesn't mean the original
               | decision was entirely automated, it means they ignore
               | appeals. Plenty of people would love nothing more than to
               | argue with Google employees all day about every decision
               | the firm makes - just think about webmasters and SEO
               | alone - so the fact that YouTube actually has such a
               | process is already kind of remarkable.
        
               | kelnos wrote:
               | You seem to be arguing that because their dispute process
               | everywhere else is reprehensible, a slightly-less-
               | reprehensible process should be celebrated.
               | 
               | It shouldn't, and we need to hold Google and others to a
               | much higher standard here. I'm not sure how we do that,
               | but the status quo is garbage.
        
               | jscipione wrote:
               | Louis Rossman had his YouTube account temporarily
               | suspended for posting a video entitled "An angry clinton"
               | which was a video of his pet cat Clinton meowing. Human
               | moderation is needed in cases like this. There is
               | obviously no analysis of the video content, the algorithm
               | looks for keywords in the video title. The keywords used
               | by the algorithm are centered around protecting
               | Democratic politicians from criticism. There is no
               | advanced AI going on, the algorithm is simple pattern
               | matching.
        
               | hoppla wrote:
               | Knowledge is empowering, and one can also argue that the
               | lack of it is harmful. Not being able to link to
               | resources that help you understand how something works
               | can, in the big picture, mean that only criminals gain
               | access to that knowledge.
        
           | insert_coin wrote:
           | Apple manually reviews each app, and not just the app
           | itself but each and every one of its updates too.
        
           | WhompingWindows wrote:
           | Why not allow the majority of YTers with a legit posting
           | history to post videos as they wish? You wouldn't have to
           | wait days or weeks for these folks, they are trustworthy
           | already. The vast majority of channels I care about wouldn't
           | need moderation, they're hobbyists like woodworkers,
           | musicians, etc. with dozens of posts already.
        
           | ttt0 wrote:
           | > The thing is, we have to make sure we're OK with the
           | consequences of going this way. Would people be happy to wait
           | for days, maybe weeks, for their video to be approved for
           | upload to YouTube?
           | 
           | Or they could just stop doing that and go back to how it
           | used to be. Let people upload videos and review them only
           | after they're flagged. Novel idea, I know.
        
             | colejohnson66 wrote:
             | The media companies made sure that'll never happen again :/
             | That's how we got ContentID: to prevent music and video
             | piracy.
        
               | syshum wrote:
               | That is why we need Copyright reform to take the teeth
               | out of media companies.
               | 
               | Return copyright in the US back to the original
               | constitutional limits
        
               | kingsuper20 wrote:
               | It'll never happen.
               | 
               | "Congress shall have power... to promote the progress of
               | science and useful arts, by securing for limited times to
               | authors and inventors the exclusive right to their
               | respective writings and discoveries."
               | 
               | As I read it, the law should be formed to maximize
               | progress, not to maximize the value of an idea.
               | 
               | But, a large amorphous group of citizens will never
               | overcome a small, highly-concentrated self interested
               | group. I doubt you'll ever see the copyright-opening
               | version of the NRA or SPCA.
        
               | syshum wrote:
               | Well, the Google case put a huge dent in the prevailing
               | idea that maximizing value/profit is the constitutional
               | purpose.
               | 
               | The court clearly held that profitability alone should
               | not be a factor.
        
               | dragonwriter wrote:
               | > I doubt you'll ever see the copyright-opening version
               | of the NRA
               | 
               | Corrupt influence-peddling charlatans milking the rubes?
               | 
               | Or was there something else intended by this analogy?
        
               | kingsuper20 wrote:
               | Simply that the power of a highly focused group can be
               | remarkable. Substitute a Big Pharma trade organization
               | or a teachers' union for 'NRA' if you like.
        
               | ttt0 wrote:
               | That's its own thing and it's just as absurd as this, but
               | this person got a strike for "violating harmful and
               | dangerous policy".
               | 
               | I don't upload any videos, but if their AI for reviewing
               | videos is as bad as their AI for reviewing comments,
               | then both should be completely scrapped. Because it's
               | not regex matching or whatnot but AI, it's impossible to
               | predict which comment or livechat message will get
               | through. In practice it's virtually random. And the
               | worst part is that they don't even bother to mention
               | that your comment was 'problematic' to them; you have to
               | keep two windows open, one for posting and one in
               | private mode, to verify which of your comments actually
               | showed up.
               | 
               | Just as a disclaimer, I haven't tried it in a while, so
               | maybe it's somewhat better now, I don't know.
        
               | feanaro wrote:
               | Why is it a foregone conclusion that we cannot unmake
               | this?
        
               | Karunamon wrote:
               | Something I never quite understood... couldn't Google
               | have just said "no" and required them to submit DMCA
               | takedowns? The law doesn't require the level of broken
               | proactivity that Youtube has.
        
               | sigstoat wrote:
               | google wants the content up. it is in their interests for
               | there to be as much mundane fluffy ad-watch-generating
               | content as possible.
               | 
               | having it DMCA'd down is a problem for youtube. content
               | id is how most of the material can stay up, the riaa can
               | stay happy, and google can sell ads.
        
               | ttt0 wrote:
               | They don't care about your average Joe user. Just look at
               | their search results and see what comes up. What they
               | want is to transfer the 'legacy media' to their
               | platforms.
        
               | MereInterest wrote:
               | They absolutely could have, and have been entirely within
               | the rules of the DMCA. My best guess is that the
               | ContentID system is either (a) a bribe to get media
               | companies to partner with youtube's paid streaming, or
               | (b) a bribe to convince media companies that it isn't
               | worth lobbying yet again for stronger copyright
               | monopolies.
        
           | bg24 wrote:
           | > Is YouTube as we know it actually feasible without AI-based
           | moderation? Can it be a sustainable business with an army of
           | human moderators and/or a drastically reduced scale as a
           | result?
           | 
           | I am curious about the cost of a process that establishes
           | human review for conflict scenarios, i.e. situations like
           | this. Are those still planet-scale numbers?
        
           | tw04 wrote:
           | >The thing is, we have to make sure we're OK with the
           | consequences of going this way. Would people be happy to wait
           | for days, maybe weeks, for their video to be approved for
           | upload to YouTube? Would that work with the current society's
           | attention span?
           | 
           | Why should they have to? YouTube makes BILLIONS of dollars
           | and can afford to employ tens of thousands of human
           | moderators; they just don't want to because that would eat
           | into profits. If a video has X views or the publisher has X
           | followers, it should be reviewed within hours by a real
           | human.
           | 
           | >Is YouTube as we know it actually feasible without AI-based
           | moderation? Can it be a sustainable business with an army of
           | human moderators and/or a drastically reduced scale as a
           | result?
           | 
           | Nobody is saying there should be 0 AI moderation, just that
           | there should be a human that reviews a clip if AI flags it
           | and the end-user clicks a button saying "this shouldn't have
           | been flagged".
           | 
           | The entire premise of google is kind of broken and both
           | technical and non-technical people feel that way from what
           | I've heard. The fact it's nearly impossible to get to a real
           | human on anything, and if you do happen to get to a real
           | human, half the time they aren't even empowered to undo
           | whatever decision their AI has made... it's just not a good
           | way to do business. There's a balance to technology and human
           | interaction and Google (again IMO) has gone too far in the
           | technology direction. Heck, humanity as a whole might need
           | to find someplace in the middle if the studies about the
           | current generation of 20-somethings struggling with stunted
           | social interactions due to social media are to be believed.
        
           | ipaddr wrote:
           | Anyone can upload a video to their website and make it
           | viewable to all. The hosting is not the issue. You could use
           | dropbox, box, etc for easy video hosting and sharing.
           | 
           | It's the discovery piece that cannot be replicated. If
           | hosting happened elsewhere, moderation could move to an
           | automatic user/ML flagging system with a manual approval
           | process that follows.
        
           | coliveira wrote:
           | Your question is: would a company like YouTube be hugely
           | profitable without AI? Answer: probably not, but who cares
           | other than the shareholders of Alphabet? I don't think I
           | should base my decisions about software on how much the
           | owners of the company are making.
        
           | ravenstine wrote:
           | I'd say it's infeasible without AI moderation based on their
           | revenue model. They will never make enough money off
           | intellectual content in order to pay for enough humans to
           | moderate their system. Content for the lowest common
           | denominator is not only easy as hell for an AI to moderate
           | but it makes orders of magnitude more revenue than anything
           | about technology, science, philosophy, etc.
           | 
           | The answer is to simply not rely on YouTube for anything
           | besides lifestyle and self-help vlogs, marketing,
           | instructions on how to put together your IKEA bookshelf, and
           | of course cat videos. There are plenty of other video hosting
           | sites that won't ruin your day as a creator because you
           | accidentally used the word "doodie" or because the AI thinks
           | your video on writing "hello world" in Rust is teaching
           | people to hack computers.
        
           | harpiaharpyja wrote:
           | I think it would be great. There would be huge incentives to
           | find other solutions, and a lot of what goes into YouTube
           | would find other outlets. Especially since we've already
           | tasted what's possible. It could be the end of the YouTube
           | monopoly.
        
           | bluecalm wrote:
           | Maybe we could have that if the uploader is responsible, not
           | the platform.
           | 
           | Want to upload fast and not have your content taken down?
           | Deal with the legal issues yourself.
           | 
           | Want anonymous uploading and/or some protection? Deal with
           | automatic systems, as employing enough people is just too
           | expensive.
        
             | Silhouette wrote:
             | _Maybe we could have that if the uploader is responsible,
             | not the platform._
             | 
             | Maybe we could have the uploader host their own content,
             | and do away with hosting platforms and their gatekeeping
             | altogether?
             | 
             | I can see little reason today that well-established
             | technologies and individual hosting arrangements couldn't
             | offer similar value to whatever YouTube or some other
             | centralised hosting service offers.
             | 
             | We are talking about the World Wide _Web_. Linking to
             | other people's content is literally the central feature of
             | the real platform here. For a long time, we did just fine
             | with
             | community-operated facilities and everyone linking to other
             | sources of interesting content to help discoverability. We
             | still do, really, but now the links and other community
             | engagement tend to be locked up in places like different
             | YouTube tabs for each channel, not the sidebar of someone's
             | personal website. Of course, these days we also have entire
             | sites (HN among them) built around sharing links to content
             | that might be interesting to like-minded people. So I don't
             | buy the usual arguments about centralised content hosting
             | being the only way interesting content can gain exposure.
             | 
             | Numerous individual content creators make demos or
             | livestream games or present tutorials with decent-to-
             | excellent production values. They often use specialist
             | software to do it. I therefore don't buy arguments about
             | the technical difficulty of self-hosting either. There
             | could be hosting packages offered by ISPs or other hosting
             | services on a common carrier basis if individuals didn't
             | want to set up all the infrastructure themselves, and those
             | who were good/popular enough to make it more of a regular
             | income would still have the ability to do so but now on
             | their own terms.
             | 
             | Even at today's prices, you could serve tens of thousands
             | of videos directly for less than a lot of people are paying
             | for their ISP or mobile/cell phone plan each month. And of
             | course, we know lots about how to efficiently serve
             | interesting content via P2P networks that could be a big
             | amplifier for popular content producers who were happy to
             | share without wanting to monetise or otherwise collect
             | large numbers of viewers on their own site for some reason.
             | The available capacity and prices are surely only going to
             | improve as the technology improves. So that's cost taken
             | care of in most or all cases too.
             | 
             | So, if individuals hosting their own content is technically
             | and economically feasible, and if individual creators and
             | community groups facilitating content discovery is a viable
             | culture along with all the other interactions they tend to
             | have anyway, I struggle to see why the likes of YouTube are
             | so valuable or irreplaceable in our society today.
             | Obviously if YT disappeared overnight there would be a gap
             | tomorrow, but we've been adapting and filling gaps and
             | finding better ways to do things on the Internet since some
             | time in the last millennium, and I see no reason to think
             | we wouldn't do so again.
        
               | Goronmon wrote:
               | _...if individual creators and community groups
               | facilitating content discovery is a viable culture..._
               | 
               | That's a pretty big if, and from my limited perspective,
               | the most important problem content creators have and the
               | most useful service Youtube provides.
        
           | throwaway_kufu wrote:
           | > Would people be happy to wait for days, maybe weeks, for
           | their video to be approved for upload to YouTube?
           | 
           | Well, presumably Google, being one of the world's
           | wealthiest corporations, could hire and pay people a living
           | wage to match the demand of users looking to upload videos
           | to YouTube. It is in YouTube's interest, as it only exists
           | and makes money on this very user-generated content.
           | 
           | Yeah, it's great that YouTube exists and can market its
           | captive audience to content creators, but maybe it would be
           | a better system without the middleman, as anyone can upload
           | a video anywhere on the internet, making it instantly
           | viewable by billions.
        
             | [deleted]
        
           | SamoyedFurFluff wrote:
           | I'd be shocked if the best and the brightest that Alphabet
           | can buy can't figure out a reputation system, manually-
           | reviewed with AI assistance, or some other way to approve
           | content in a convenient manner.
           | 
           | Don't forget also that Alphabet is swimming in money. They
           | can certainly afford the manpower to make it relatively
           | seamless for the user. And I really doubt users can't be
             | trained to associate a few hours' wait for a completely
             | new, reputationless user, or a user on an unknown device
             | or VPN, with a better quality of site experience because
             | the videos are screened.
        
             | water8 wrote:
             | How about you don't believe everything you read on the
             | internet. Even 5 star people are wrong too.
        
             | MonsterBurger wrote:
             | This! They could have a page entirely of "reported
             | videos", flagged by a person or AI, with severity ranking
             | how high on the page the content appears.
             | 
             | Things like the timestamp of when in the video the report
             | was made might help for scanning content, e.g. "50 people
             | reported this 20-minute video around the 3-5 minute mark"
             | or a similar average.
        
           | libraryatnight wrote:
           | > The thing is, we have to make sure we're OK with the
           | consequences of going this way. Would people be happy to wait
           | for days, maybe weeks, for their video to be approved for
           | upload to YouTube?
           | 
           | People will eventually be fine with however the system works.
           | If that's how it works, they assume that's how it has to be.
           | Most users have no idea how this stuff functions. We're in
           | it now because the expectation has been set for instant
           | uploads, so there would be outrage, but eventually it becomes
           | "the way things are." As an example see any change users hate
           | - users complain, TPTB either concede completely (rare) and
           | revert, make minor concessions but change little, or
           | apologize for the inconvenience and do nothing. Eventually
           | nobody cares anymore and forgets they were angry.
           | 
           | Most of this stuff barely matters. Frankly it might be good
           | if my nephews had to wait longer between any number of
           | irritating youtube videos they binge and parrot.
        
         | [deleted]
        
         | swiley wrote:
         | The problem is that all of humanity's videos are going
         | through a single organization. PeerTube may provide a
         | solution to this, but people have to actually use it.
        
         | gumby wrote:
         | > >Using algorithms to flag content is totally fine... problem
         | is when humans cannot interact with humans anymore and AI gets
         | to choose what is right and wrong.
         | 
         | > Can't agree more.
         | 
         | This simply replaced a prior system in which humans could not
         | interact with humans and certain humans got to choose what is
         | right and wrong.
         | 
         | The key problem is the inability to provide human feedback,
         | more so than the mechanism for making the initial decision
         | itself (which is a different problem). I'm not sure how to
         | achieve that at scale.
        
           | indigochill wrote:
           | Federation is the answer at scale.
           | 
           | You need dictatorship for effective moderation. But you also
           | want those dictatorships small, uniform, and interchangeable
           | so that the "citizens" of those dictatorships can easily pick
           | the one that suits them best. Those dictatorships link up
           | (federate) with like-minded dictatorships to benefit from
           | network effects of sharing their content among themselves.
           | 
           | If your content doesn't suit a certain dictatorship, it will
           | suit another one.
           | 
           | PeerTube does federated video already. No, it's not as
           | polished as YouTube. But it is the answer to human-level
           | moderation at scale.
        
             | root_axis wrote:
             | The web is already federated, YouTube just happens to be
             | one of the most popular nodes. Fediverse systems like
             | PeerTube are great, but they don't change anything; in a
             | world where something like PeerTube is as popular as
             | YouTube we'd see the exact same complaints about the most
             | popular node.
        
             | amanaplanacanal wrote:
             | I'm unfamiliar with PeerTube, but it appears to me that
             | search and the recommendation engine is YouTube's secret
             | sauce. Does PeerTube have that?
        
             | water8 wrote:
             | Or you could just be mature and not need to suck on the
             | teat of authoritarianism and learn to tolerate
             | offensiveness.
        
               | xboxnolifes wrote:
               | Moderation isn't just for offensive content.
        
               | indigochill wrote:
               | There's always going to be some line people are unwilling
               | to cross. For instance, one might say Nazi rhetoric
               | should be tolerated for its historical value, but I know
               | zero advocates for allowing child porn.
               | 
               | But how about adult porn? Acceptable to some, not to
               | others (also depends on the context - even advocates of
               | adult videos are generally not arguing that YouTube
               | should host them). And there are shades of that.
               | 
               | In the end, no matter how mature and accepting you are,
               | unless you allow literally everything, you will need
               | moderators who are empowered to be the final authority on
               | that decision, which is what I mean by "dictatorship".
        
         | e40 wrote:
         | Baby steps to autonomous weapons.
         | 
         | Bring on the disagreement. It was a few short years ago the
         | idea of this would have seemed a really, really bad idea. Just
         | like autonomous weapons do now. And, hey, it doesn't even have
         | to be us (Americans). I'm not going to name the potential
         | baddie. There are many, including us. FOMO will drive so many
         | terrible things in our future. Inch by inch, march closer to
         | the reality many say will never happen. FSM save us.
        
         | throwawaysea wrote:
         | I think this is a distraction from the real problem, which is
         | that YouTube censors/demonetizes/deplatforms videos at all. The
         | problem is very widespread, ranging from censorship of
         | political content to firearm videos to COVID related content.
         | Most recently YouTube banned a panel held by the Florida
         | governor with medical experts from Stanford and Harvard
         | (https://www.politico.com/states/florida/story/2021/04/12/des...).
         | 
         | This is a company in the business of shaping political opinion
         | and manufacturing narratives, not providing a neutral platform.
         | Unless that fundamental issue is addressed, symptoms (like bad
         | AI flagging) will continue to persist.
        
           | benlivengood wrote:
           | Advertisers asked to not have their ads on certain content,
           | because normal people complained to them when they perceived
           | the advertisers to be supporting the content they didn't
           | like.
           | 
           | Advertising is the problem.
        
         | hodgesrm wrote:
         | > >Using algorithms to flag content is totally fine... problem
         | is when humans cannot interact with humans anymore and AI gets
         | to choose what is right and wrong.
         | 
         | Youtube is in the same economic vise as Walmart. It's hard to
         | be cheap and provide a human touch, especially as labor costs
         | rise. Automation only takes you so far in a free economy.
        
           | slt2021 wrote:
           | Walmart actually does have customer service in stores, and
           | Amazon also tries its best to provide customer service
           | (want a return? here's a shipping label).
           | 
           | The problem with YouTube is that you are not YouTube's
           | customer, so they don't care about you. Only advertisers
           | are customers, and YouTube cares only about them. I bet
           | there is a big customer support department staffed with
           | real people that serves only advertisers.
        
             | hodgesrm wrote:
             | It's hard to argue that Walmart and Amazon are much more
             | considerate of their "users" than Youtube. They are all
             | trying to maximize margins, which are dependent on
             | maintaining network effects and keeping costs as low as
             | possible.
        
         | intrasight wrote:
         | I'm of the complete opposite opinion. I look forward to the day
         | when our machine overlords make all decisions. Humans are
         | assholes with their own nefarious agendas. Machines are not. Of
         | course if the day comes where machines ask for bribes then I'll
         | change my opinion ;)
        
           | slt2021 wrote:
           | and who do you think develops code for machines and whats
           | their agenda?
        
           | tertius wrote:
           | Who controls the machines? Those people are a-holes too.
        
             | intrasight wrote:
             | If they are machine overlords, then they control themselves
             | ;)
        
           | MereInterest wrote:
           | Except that machines aren't making some grand objective
           | pronouncement, sent from on high and untainted by base human
           | nature. Machines are programmed to carry out some particular
           | human desire. They follow that single human desire, without
           | any of the restraint or competition of other desires, no
           | matter how deep that path may lead.
           | 
           | Machines are assholes with nefarious agendas given to them
           | either intentionally or unintentionally.
        
             | intrasight wrote:
             | We have a brief window to get this right before the
             | machines take over. Let's not waste it.
        
               | MereInterest wrote:
               | Though I agree, I would phrase it as "before people with
               | machines take over". The machines themselves are tools
               | enacting the will of whoever controls them.
        
         | sneak wrote:
         | Why does content need to be "flagged"?
        
         | esrauch wrote:
         | So I actually think this viewpoint vastly overestimates the
         | ability of human reviewers to come to the right conclusion. I
         | can tell you that's not really the case: paid, trained human
         | contractors doing content moderation very commonly make far
         | more problematic errors than "WPA2 cracking info is a
         | bannable offense, but not in this case because it's academic
         | research".
         | 
         | This is the kind of edge case that even after escalation to
         | higher tier human contractors will often still get wrong. I
         | don't really see how it's actually much less dystopian just to
         | have humans in the loop with the same outcomes.
        
         | [deleted]
        
       | HenryBemis wrote:
       | > Anyhow, I truly _believe_ humanity has to roll back to
       | operating at a human scale.
       | 
       | BYOReligion.. and Google believes that they need to increase
       | 'engagement', increase profit, and decrease headcount.. so..
       | there you have it dear Sun Knudsen.
       | 
       | Edit: I don't care for the 'virtual currency' of 'karma'. I
       | wonder though; the sky is blue, I point at the sky, I say "the
       | sky is blue", and people frown (?) upon me for stating the
       | obvious fact of life.. Or is it just Google-
       | trolls/fanboys/fangirls that don't like the truth being called
       | out? My above comment is also applicable to any entity that
       | automates "as much as possible" and maintains a small team to
       | manage operations. Something tells me that their internal Finance
       | team has "enough" staff to monitor the General Ledger. Because
       | General Ledger is critical to their $$$$$. A 'content creator'
       | though, is not as critical, thus gets far less attention, and
       | <insert FAANG> will only be bothered to fix this, if said person
       | yells loud enough. How is _this_ fact of life being downvoted?
       | (not me.. I don't grow taller/shorter based on the karma points)
       | :)
        
         | jodrellblank wrote:
         | > " _I say "the sky is blue", and people frown (?) upon me for
         | stating the obvious fact of life.._"
         | 
         | Yes; exactly this. If you were out in a crowd and called
         | everyone's attention to you - thousands of people quietened to
         | pay attention to what you were going to say for a moment, and
         | you announced the thing you yourself say is an "obvious fact",
         | that the sky is blue, the crowd would be annoyed at you for
         | wasting their time with obvious low-quality trivia.
         | 
         | "company seeks profit" is the same low-value annoying thing to
         | say. You're not being downvoted because it's _untrue_, you're
         | being downvoted because it's like a toddler going "hey, hey,
         | hey, hey, look, look at this, watch this" "okay?" "_throws
         | ball at ground_". Great, thanks for that, good contribution.
        
       | nromiun wrote:
       | Recently I also discovered that YouTube shadow bans comments for
       | using banned words (e.g. kill, coronavirus etc) or external links
       | (two of my comments got banned). I have seen some people asking
       | for sources on YouTube comments and it turns out you literally
       | can't do that. Maybe that's why only spam comments rise to the
       | top, any long or thoughtful comment simply gets shadow banned.
       | 
       | 0. https://support.google.com/youtube/thread/6273409?hl=en
       | 
       | 1.
       | https://support.google.com/youtube/forum/AAAAiuErobU70d28s1N...
        
       | haolez wrote:
       | Off topic: is there a YouTube equivalent in China? I'm just
       | curious.
        
         | [deleted]
        
       | arminiusreturns wrote:
       | Stop buying into the "the AI did it" whitewashing. Youtube is
       | selectively targeting more and more channels and videos for
       | daring to be any sort of contrarian. It's psyops under any
       | other name.
        
         | finnthehuman wrote:
         | "The AI did it" is the same cop-out as "A low level employee
         | did it of their own volition" and should be rejected as an
         | excuse for all the same reasons.
        
         | root_axis wrote:
         | It's not a "psyops", YouTube, like every gigantic corporation
         | is motivated entirely by money, and they would be happy to
         | monetize every video on the site if they could, but advertisers
         | pay the bills, and since advertisers are sensitive to their
         | public image, YouTube goes out of their way to appease them.
         | The problem here is the ad model, not AI or YouTube's values.
        
           | seriousquestion wrote:
           | That's an outdated view. Youtube now bends to activists and
           | political winds.
        
             | benlivengood wrote:
             | Advertisers (and PR departments) are fragile to activists
             | and political winds.
             | 
             | Society gives PR departments far too much power. We should
             | be electing governments that enforce reasonable standards,
             | not relying on what is essentially DDoS to achieve social
             | mores.
        
             | president wrote:
             | IMO bending to activists and political winds is within the
             | same realm of what OP mentioned. There is a LOT of money to
             | be made by following the trends and in today's world,
             | activism and politicizing is the name of the game.
        
             | root_axis wrote:
             | So what is the activist political position on WPA2
             | vulnerabilities?
             | 
             | You see what you want to see, but YouTube hosts an
             | overwhelming abundance of evidence that contradicts your
             | claim.
        
               | im3w1l wrote:
               | Internal employee pressure is the cause of some things
               | getting removed, but they are not behind everything. In
                | this particular case it might be driven by legal
                | concerns or something. But it should also be seen in
                | light of the status quo. With censorship being the new
                | default, it doesn't take a strong reason to add another
                | thing, such as hacking-adjacent content, to the no-go
                | pile. The
               | code is already written, and the routines are well-known.
               | There is no freedom-of-speech reputation left to protect.
               | So the cost of removing this content is very low, lower
               | than the legal risk.
        
               | root_axis wrote:
               | Without any evidence this is just speculation. The
               | reality is, there are many popular political YouTube
               | channels that represent viewpoints all across the
                | spectrum, and this is an easily verifiable fact. I
                | have yet to see any evidence that YouTube is motivated
                | by anything other than money.
        
               | [deleted]
        
               | im3w1l wrote:
               | I have seen evidence, but nothing I can share publicly.
        
               | tomschlick wrote:
               | > So what is the activist political position on WPA2
               | vulnerabilities?
               | 
               | Well we did just have a case with Twitter and IIRC
               | Youtube where they redefined their "hacked material"
               | policies to bend to the political will after the whole
               | Hunter Biden laptop fiasco. So this may just be
               | collateral damage due to those policy changes and the AI
               | behind enforcing them.
        
           | IronWolve wrote:
            | > they would be happy to monetize every video on the site
            | if they could, but advertisers pay the bills, and since
            | advertisers are sensitive to their public image
           | 
            | That's the excuse, but in reality, Google still runs ads
            | on demonetized channels. And there are companies that want
            | to advertise on these other channels. Google is actually
            | doing a disservice to stockholders by excluding those
            | profits, but many suspect that they want to be a cable
            | network in the long run, siding with media corps.
           | 
           | There are tiers of advertisers, lower tier companies will
           | gladly pay to advertise on lower quality channels.
        
         | ehsankia wrote:
         | Are you claiming that Youtube is running psyops trying to cover
         | up PhD research into WPA2 security? I don't want to distort
         | your argument, I honestly don't get how you went from a WPA2
         | video being accidentally taken down to "this is intentional
         | psychological warfare"...
        
         | t-writescode wrote:
         | YouTube is? Or _we as a society_ are?
         | 
         | The world of Fahrenheit 451 didn't arrive that way in a day, it
         | probably took decades.
        
       | mikaeluman wrote:
       | YouTube (and others) appear to have lost the plot.
       | 
       | I tend to avoid it and support other platforms to encourage
       | content creators to migrate.
        
       | throwaway120831 wrote:
       | tube has been censoring folks left and right since the start of
       | covid
        
       | tacosaretasty wrote:
       | The hypocrisy from Google here is palpable. First this content
       | creator got a strike for showing users how GPG works and now for
       | linking to academic research. On the other hand you have Google
       | project zero who openly publish vulnerability research to the
       | detriment of some software vendors.
       | 
       | You can't have it both ways, and as someone who works in this
       | space I'm pretty upset by this ham-fisted approach to censoring
       | content.
       | 
       | I look forward to google becoming the next AOL.
        
         | crocsarecool wrote:
         | I used to love Google - I can't believe I'm feeling the same
         | way. Everything is so downhill for them. Searching isn't as
         | great as it used to be. Popular services get killed to be
         | replaced by replicas (thinking of messaging apps). What the
         | heck is going on with Google? I used to think they were
         | innovators, and now I feel like they're just Big Brother.
        
       | FpUser wrote:
       | I am sure it is ok to use AI to cheapen initial screening. But
       | FAANG and similar companies are becoming so vital to people
       | that they can sometimes lose their income over decisions like
       | this. I see nothing wrong with those that serve as a source of
       | income being declared a utility-type service with a mandatory
       | staged conflict-resolution process. At the last stage (well
       | before an actual lawsuit), that process would include being
       | able to present proof to actual human beings who have the power
       | to reverse the decision and are paid for, but not employed by,
       | said utility. If the utility fails to comply, that watchdog
       | should be able to levy fines. This should also help prevent
       | cases where the utility's "moderation" is based on the
       | political (or similar) opinions of the owners.
       | 
       | All this "I am a private company and do as I please" is baloney
       | when it comes to very big companies. They have way too much
       | power and should be held responsible for how that power is
       | used. Preventing them from being able to lobby governments
       | should be a first priority as well.
       | 
       | Since companies of that type are international, it is of
       | course up to individual governments to implement whatever
       | measures, if any, they see fit.
        
       | 1cvmask wrote:
       | AI is a great excuse for censorship. But many economic actors
       | want censorship, including the advertisers, so it seems more
       | competition is the only answer. I don't see any credible
       | competition to Youtube on the horizon for a while, though.
       | 
       | The ultimate ironic censorship on the part of Youtube:
       | 
       | https://www.mintpressnews.com/media-censorship-conference-ce...
       | 
       | And the discussion thread:
       | 
       | https://news.ycombinator.com/item?id=26008217
        
       | Jan454 wrote:
       | Time to install GPT-3 on your own server and unleash it onto
       | Google Support, YouTube Support, Alpha Support, etc. to
       | complain about your situation ;-) You just need to answer the
       | captchas to keep it going.
        
         | joshxyz wrote:
         | Combine that with 2captcha.com, what do we have? Lol
        
         | abhishekbasu wrote:
         | With AI-based support automation, it might end up like
         | https://www.youtube.com/watch?v=WnzlbyTZsQY
        
       | pyronik19 wrote:
       | It's disheartening: in the 2000s, people used to regularly
       | talk about creating decentralized systems to prevent
       | megacorporations and governments from censoring the internet.
       | Now the tech community that used to deplore such tactics
       | actively supports mass AI censorship on a whole host of topics,
       | where the only version of the truth allowed to stand is
       | #thenarrative.
        
       | Black101 wrote:
       | I think that Google should hide all sites that display ads from
       | the search results... because after all, it goes against AMP's
       | principles of responsive websites.
       | 
       | They definitely should also block Gmail... because it has become
       | incredibly slow to load.
       | 
       | I bet the CEO is laughing a lot when he sees all of this
       | interacting together.
        
       | Zenst wrote:
       | Odd, more so given a whole video about this has been up on YT
       | since 2017.
       | 
       | https://youtu.be/Oh4WURZoR98
        
       | privatesgtrj wrote:
       | Google is a private company. It's fully within their rights to
       | remove any video they don't like, for whatever reason.
       | 
       | It comes with the territory if we want Google to be able to
       | censor hate speech and misinformation.
       | 
       | You can always use a competing site, or build your own.
        
         | danuker wrote:
         | Except the sheer scale of Google is hair-raising. (Sleep tight
         | dear FTC).
         | 
         | If Google loses a YouTube creator, it has almost no impact on
         | their bottom line.
         | 
         | If a creator gets kicked out however, their livelihood is
         | threatened.
         | 
         | I would argue this is the perfect scenario for a creator's
         | union (of course, it should use some other platform to
         | communicate).
        
           | freebuju wrote:
           | > If a creator gets kicked out however, their livelihood is
           | threatened
           | 
           | No. There are numerous providers that can host your videos.
           | For a small fee. This mentality is what gives Google such
           | pseudo-power.
        
             | derivagral wrote:
             | At the end of the day, I'd guess this is less about
             | alternatives and more about youtube being the best platform
              | for your investment. Hosting isn't the problem; it's the
              | audience and traffic (and ad monetization from your
              | vids).
             | 
             | Do the alternatives hold up in terms of things like market
             | reach, $ performance, and not banning the same users?
        
         | [deleted]
        
         | EasyTiger_ wrote:
         | Thanks for joining HN just to tell us this.
        
           | rscoots wrote:
           | That post really does read like a parody haha.
           | 
           | Such a desire to shield mega-corporations from any moral
           | culpability since they think it serves their personal
           | sensibilities. Very naive.
        
           | sumtechguy wrote:
           | It seems to be a common opinion, I see a lot.
           | 
           | What many do not realize is the 'build your own' _is_ going
           | on (I can name at least 4 YT replacements, some better than
           | others). But they are dismissing them with the same reasons
           | they are kicking the creators off the  'mainstream' sites. So
           | you see a lot of YT refugees and many existing YT creators
           | 'splitting the difference' basically they upload it to both
           | YT and several other sites. There is almost zero downside in
           | doing so really.
           | 
           | Eventually someone will click it all together in the right
           | way and create a decentralized uncensorable thing. At that
           | point the internet will become a lot more like 4chan.
           | 
           | The old adage is 'the internet sees censorship as damage and
           | routes around it'. It just does not happen overnight though.
        
         | AnimalMuppet wrote:
         | _Do_ we want Google to be able to censor hate speech and
         | misinformation? You assume the answer is  "yes", but I think
         | it's at least an open question, with me personally leaning
         | toward "no".
        
       | PedroBatista wrote:
       | YouTube, like most of the "Internet" these days, is just a
       | shopping center. It appears to be the Universe because these
       | sites/companies are so massive, but at the end of the day it's
       | their turf and they do what they want.
       | 
       | Now, once they reach a certain scale, why are they allowed to
       | operate like a "normal small company"?
        
         | infogulch wrote:
         | Leonard French recently reviewed a Supreme Court ruling on
         | Knight Institute v. Trump that includes a concurrence statement
         | by Justice Thomas that touches exactly on these issues (and
         | maybe even net neutrality by proxy).
         | 
         | https://youtu.be/IpkGzJYYQj4
         | 
         | https://www.supremecourt.gov/orders/courtorders/040521zor_32...
        
         | macspoofing wrote:
         | >Now, once they reach a certain scale, why are they allowed to
         | operate like a "normal small company"?
         | 
         | There was and is a push to make them behave more like a
         | utility (i.e. a platform). Mainstream media, along with a
         | major political party, are on a crusade to make these tech
         | platforms more and more of an editor, because they like that
         | currently their competition (in the case of media) and
         | political rivals (in the case of Democrats) are bearing the
         | brunt of the censorship.
         | 
         | And you can see this push even here on HN with constant
         | references to how evil Facebook and Twitter are because memes on
         | those platforms caused Hillary to lose in 2016, and therefore
         | how misinformation is such a big problem that more censorship
         | is needed. I don't know how you fix that. These platforms
         | didn't want to do this. They got bullied into it.
        
         | dmingod666 wrote:
         | If I can uninstall YouTube from my Android I think that would
         | be a good start towards a journey to a normal, if not small
         | company..
        
       | codeecan wrote:
       | Google has long been manipulating content, look up "blue anon".
       | You can compare searches with Bing and you'll find lots of
       | content that's been "moderated".
        
       | fencepost wrote:
       | I don't see any mention of the other thing that would be a
       | concern to me - don't ever be in a situation where any Google
       | product is _critical_ to you, because the behemoth may casually
       | destroy you as a side effect.
       | 
       | What if instead of locking him for a week they'd canceled his
       | account (and related accounts e.g. Gmail, plus linked accounts
       | because 'bypassing ban')?
       | 
       | And you don't have to be doing something like discussing security
       | - maybe you're talking about Pokemon Go and they ban you for "CP
       | videos" or you posted a bunch of red or green emoji in livestream
       | comments for a gamer. Goodbye email, sites logged into with a
       | Google account, Android phones (because they KNOW that phone
       | number is linked to your identity), etc.
       | 
       | And what are you going to do about it? Call customer service?
       | _snrk_
        
         | barbazoo wrote:
         | > And what are you going to do about it? Call customer service?
         | 
         | I know it's not perfect but I feel much better knowing I pay
         | for my email service, online storage and password management,
         | knowing I'll be a paying customer in someone's system deserving
         | of at least some degree of customer service.
        
       | Jan454 wrote:
       | Maybe it's time for a public, company-independent complaints
       | platform where all companies are obliged by law to respond.
       | Each complaint is of course hidden behind some anti-bot
       | captcha/protection so that the companies themselves can't use
       | AI to answer your complaints.
        
         | astrange wrote:
         | https://ec.europa.eu/info/law/law-topic/data-protection/refo...
         | 
         | I dunno, where's the evidence this case was even done by AI?
         | People just say Google does everything with computers when they
         | actually use armies of contractors.
        
         | dgudkov wrote:
         | A small claims court for the internet. I like that.
        
         | qayxc wrote:
         | That's basically what the author suggests when they say:
         | 
         | > Anyhow, I truly believe humanity has to rollback to operating
         | at a human scale.
         | 
         | It's impossible to operate a complaints platform on a global
         | scale if it's run by humans. According to GMI [0], 500 hours of
         | content are uploaded every minute.
         | 
         | Given an average video duration of about 12 minutes [1], that
         | would be 2500 videos per hour. That's just too much to manually
         | review and handle complaints.
         | 
         | [0] https://www.globalmediainsight.com/blog/youtube-users-
         | statis...
         | 
         | [1] https://www.statista.com/statistics/1026923/youtube-video-
         | ca...
        
           | ylere wrote:
            | Is it though? Let's do a very rough estimate: 500 hours of
            | content per minute in 2019; let's say 1 reviewer can
            | review 3 hours of video each hour (by increasing play
            | speed/skipping etc.) and we have a global workforce in
            | lower-income countries working 3x 8-hour shifts plus
            | weekends. That's 500*60/3*3 = 30,000 reviewers; at a
            | monthly cost of $1000 that's 30000*1000*12*7/5 =
            | $504m/year. Youtube had $15b in revenues in 2019, so this
            | represents around 3.5% of revenue. Now this is assuming
            | that we actually need to 100% review every video before
            | releasing it (which is not the case), and one reviewer can
            | probably review more than 3 hours of content per hour with
            | the right AI assistance, so the real cost would be quite a
            | bit lower. Even then, spending less than 5% of revenues on
            | content review, moderation and support sounds very
            | reasonable to me.
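            | The estimate above can be sanity-checked in a few lines of
            | Python (a sketch; every input is the comment's own
            | assumption, not a verified figure):

```python
# Rough sanity check of the comment's back-of-envelope estimate.
# All inputs (upload rate, review speed, wages) are the
# commenter's assumptions, not verified numbers.

HOURS_UPLOADED_PER_MINUTE = 500   # claimed 2019 upload rate
REVIEW_SPEED = 3                  # hours of video reviewed per reviewer-hour
SHIFTS_PER_DAY = 3                # 3x 8-hour shifts for 24h coverage
MONTHLY_COST_USD = 1000           # assumed monthly cost per reviewer
YOUTUBE_REVENUE_2019 = 15e9       # claimed 2019 YouTube revenue

# Reviewers needed on duty at any moment, times shifts for headcount.
hours_uploaded_per_hour = HOURS_UPLOADED_PER_MINUTE * 60
on_duty = hours_uploaded_per_hour / REVIEW_SPEED
headcount = on_duty * SHIFTS_PER_DAY

# Annual cost, scaled by 7/5 to cover weekends as well.
annual_cost = headcount * MONTHLY_COST_USD * 12 * 7 / 5

print(f"headcount: {headcount:,.0f}")
print(f"annual cost: ${annual_cost / 1e6:.0f}M")
print(f"share of revenue: {annual_cost / YOUTUBE_REVENUE_2019:.1%}")
```

            | which reproduces the 30,000 reviewers and ~$504m/year
            | figures claimed in the comment.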
        
             | qayxc wrote:
             | > Is it though?
             | 
             | Yes, it is - because it's actually 2500 videos per MINUTE,
             | not per hour, mea culpa. So your 30,000 reviewers would
             | actually have to be at least 1.8 MILLION.
        
               | a1369209993 wrote:
                | They didn't use your videos per hour/minute figure;
                | they used hours of content per minute, so it still
                | comes out to 30,000 reviewers.
        
             | Thaxll wrote:
              | The flaw is that you think humans would do a better job
              | than AI, which is definitely not the case. Especially
              | hiring 30k people in a low-income country, what could go
              | wrong... This is the kind of scale problem that can't be
              | fixed by human review.
        
           | dTal wrote:
           | It takes a lot longer to make a video than to watch it. It
           | therefore stands to reason that if humanity is capable of
           | _making_ all that content, humanity is capable of _watching_
           | it - if it decided that were a priority.
        
             | qayxc wrote:
             | 2500 videos per minute doesn't equal 500 hours of
             | _original_ content per minute, which is part of the
             | problem.
             | 
             | Just look at all the reaction channels and compilations
             | that simply reuse the same content over and over again. You
             | have one funny or shocking clip (often from 3rd party
             | sources such as TikTok) and you'll find the same video
             | snippet in at least 10,000 remix/compilation/reaction
             | videos. Not to mention reuploads and straight up copies.
             | 
             | Algorithms have a hard time catching up with this and
             | cropping, mirroring, tinting, etc. are often used to
             | confuse ContentID. Asymmetry is the problem. Bots and
             | software can both spam and flag content at superhuman
             | rates.
             | 
             | The inverse - e.g. deciding whether a complaint is legit,
             | fair use applies, whether monetisation is possible, etc. -
             | is actually a really hard problem and therein lies the
             | dilemma.
             | 
             | Certain parties are gaming the system and the scale is just
             | too much to handle manually.
        
           | danhor wrote:
           | *2500 videos per minute
        
             | qayxc wrote:
             | Yes, thanks.
        
           | Zambyte wrote:
           | Maybe we don't need to let people upload that much video
           | content.
        
           | carlhjerpe wrote:
            | I don't have any data to back this up, but I believe that
            | of those 2500 videos per minute, 2450 or so could be
            | classified as safe by the AI, not requiring human
            | interaction. The other 50 are scored from 0 to 100 on a
            | badness scale. The ones closer to "not that bad" (ToS
            | violations and such) get published automatically while
            | awaiting review. The illegal content (rape, gore, child
            | porn) and such gets blocked automatically until reviewed
            | by a human. Doesn't sound that far-fetched to implement
            | with $50B a year in profits?
        
             | zentiggr wrote:
             | I've thought that something like the spamassassin model
             | would be sufficient - calculate a 0.0 - 1.0 range of
             | likelihood to block, and set cutoffs on the 0 end to auto
             | approve, and toward the 1 end to auto block, and moderate
             | the middle.
             | 
             | Was good for spam for a long time.
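              | A minimal sketch of that spamassassin-style triage in
              | Python (the cutoff values and the score input are made
              | up for illustration, not real moderation thresholds):

```python
# Spamassassin-style triage: a badness score in [0.0, 1.0] from
# some upstream classifier is mapped to one of three moderation
# outcomes. The cutoffs below are illustrative, not real values.

AUTO_APPROVE_BELOW = 0.2   # confidently safe -> publish immediately
AUTO_BLOCK_ABOVE = 0.9     # confidently bad  -> block until human review

def triage(score: float) -> str:
    """Route a video by its badness score."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be in [0.0, 1.0]")
    if score < AUTO_APPROVE_BELOW:
        return "auto-approve"
    if score > AUTO_BLOCK_ABOVE:
        return "auto-block"      # held pending mandatory human review
    return "human-review-queue"  # only the middle band is moderated

# Example: humans only ever see the ambiguous middle band.
for s in (0.05, 0.5, 0.95):
    print(s, "->", triage(s))
```

              | The point of the two cutoffs is that human effort is
              | spent only on the ambiguous middle, which is what made
              | the model workable for spam.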
        
             | qayxc wrote:
             | But how would that help with complaints, ContentId and
             | copyright claims, though?
             | 
             | The problem isn't the automated review process, the problem
             | is complaints and disputes.
             | 
             | Even if only 1% of all videos had any issues of this sort
             | at all, that'd still be 25 complaints per minute about the
             | most complex topic in media no less - copyright law and
             | fair use.
             | 
             | The problem lies in the asymmetry - bots and automated
             | flagging campaigns can scan, mark and take down thousands
             | of videos per minute no problem.
             | 
             | But it's impossible for creators to get their issue
             | reviewed in time by a human, because we just don't have AI
             | capable of handling such decisions yet. And even then it's
             | often still not as clear-cut as one might think and both
             | sides need to be heard, etc.
        
         | roenxi wrote:
         | What do you think this will achieve? We know what Google
         | thinks; they don't want this content on their platform.
         | 
         | Forcing them to state that in a specific forum as some sort of
         | power play isn't going to help.
        
         | Werewolf255 wrote:
         | That sounds like something that the government should
         | oversee. Maybe they could also make some regulations so
         | that, I dunno, corporations might need to be responsible for
         | their actions.
        
       | cl3misch wrote:
       | Are we certain this was not because the vulnerability is called
       | "KRACK" which is similar to the drug "crack"? Youtube has been
       | very strict about such phrasing mishaps.
        
         | SV_BubbleTime wrote:
         | If your argument is that it would be reasonable for YouTube
         | to automatically ban anything that could be a mention of
         | crack cocaine, we're not going to find a lot of common ground
         | on what is accidental moderation vs intentional censorship.
        
       | Ian678 wrote:
       | I remember a survey by google about security research and one of
       | the questions was something like "What hinders learning/education
       | in this topic?", and I remember answering exactly _this_
       | behavior.
        
       | jb775 wrote:
       | Not a popular opinion here but selectively covering up research
       | is a play straight out of the Communist Manifesto. This sounds
       | more like concerns over the spread of hacking intel, but it's
       | worth noting as yet another example of big tech's selective
       | censorship over the past year.
        
       | williesleg wrote:
       | Boycott google while you still can. They almost have world
       | domination.
        
       | ddtaylor wrote:
       | One of the largest and most profitable companies in the world
       | can't hire enough low-paid humans to do content review. They
       | already have some of the highest margins in history and they
       | still can't figure out that sacrificing less than 1% of profits
       | could solve this problem?
        
         | zentiggr wrote:
         | Paying less attention to YouTube also solves this problem.
         | 
         | People still can't figure out that not watching YouTube solves
         | all moderation problems?
         | 
         | Google doesn't care about those posting on YouTube. They care
         | about advertisers pulling their money if whatever content
         | infringes on THEIR priorities.
        
           | ddtaylor wrote:
           | I think the reason most people are watching YouTube instead
           | of traditional media is _because_ of this type of thing. Take
           | this exact situation here as an example: this kind of content
           | would never make it to traditional TV because it can't find
            | an audience for such niche content, e.g. people smart
            | enough to warrant linking a PhD research paper in the
            | description.
           | 
           | I think Odysee is a good alternative. PeerTube doesn't really
           | seem to be a great option here as it seems to have been down
           | during this time.
           | 
           | > Google doesn't care about [...]
           | 
            | Mostly true, but also kind of not relevant. _Google_ isn't
            | doing anything here besides making poorly trained AI
            | models that try to solve this problem for them. It's more
            | negligence than it is malice, IMO.
        
       | broknbottle wrote:
       | We should start letting "AI" decide the outcome in any of
       | Google's pending court cases. We could set up a Twitter account
       | and direct them to complain, I mean appeal, by @-ing the
       | account, and hope the bot gods decide their tweet is worthy
       | enough for human review.
        
       | asimpletune wrote:
       | The first part of the post, where they said they got a strike
       | for teaching people how to use PGP...
       | 
       | Wow, that's um, wild
        
         | boomboomsubban wrote:
         | I believe it's still against US law to export pgp to certain
         | countries, as it's considered munitions. Wild is a great
         | description.
        
           | arsome wrote:
           | Didn't DJB v US pretty much end that? That would also impact
           | things like Firefox for the record.
        
             | boomboomsubban wrote:
             | Pretty much, but export to "rogue states and terrorist
             | organizations" is still technically illegal. I suspect it
             | wouldn't withstand challenge though.
        
       | kingsuper20 wrote:
       | It would be fun to do a quick survey of books breathlessly laying
       | out the future of the internet from 20 years ago.
       | 
       | Looking back, the evolution of social media/youtube was pretty
       | obvious. Back then, not so much.
       | 
       | 1) Begin with anything goes including illegal. Run at a loss,
       | Grow baby!
       | 
       | 2) Ads
       | 
       | 3) Bitching from some important people causes some removal of the
       | more flagrantly illegal stuff.
       | 
       | 4) Employees and PR departments apply a POV to what is now close
       | to a monopoly. Large scale censorship.
       | 
       | What makes the modern era interesting is the POV angle. I can't
       | imagine Rockefeller's Standard Oil restricting sales to people
       | with conflicting politics. Of course, there are people spending a
       | lot of time each day on r/politics and what used to be
       | r/thedonald shouting joyous insanity at each other...that's the
       | modern pool of workers.
        
       | Aunche wrote:
       | This is what people choose when they angrily tweet at
       | advertisers to pull their ads whenever YouTube doesn't take
       | down bad videos quickly enough. It tells Youtube that it should
       | tune its moderation to be overly aggressive. This is true
       | regardless of how much human moderation they use.
        
       ___________________________________________________________________
       (page generated 2021-04-14 23:01 UTC)