[HN Gopher] One woman's struggle to remove all traces of videota...
       ___________________________________________________________________
        
       One woman's struggle to remove all traces of videotaped sexual
       assault
        
       Author : barry-cotter
       Score  : 164 points
       Date   : 2021-04-05 12:38 UTC (10 hours ago)
        
 (HTM) web link (www.ctvnews.ca)
 (TXT) w3m dump (www.ctvnews.ca)
        
       | balozi wrote:
        | Definitely not an easy issue to address, but for argument's
        | sake, shouldn't the central claim be that of copyright? Is the
        | claimant
       | the copyright owner of the material, or can they become copyright
       | owner by virtue of being depicted in the material?
        
       | nunez wrote:
       | > Rachel searched for the username that her husband had used. It
       | led to a video uploaded to the world's biggest porn site,
       | Pornhub. In that video, it shows her, in her own bed, obviously
       | unconscious. She says her husband's hands can be seen reaching in
       | to move her, touch her and sexually assault her. The video titles
       | include "while sleeping" and "sleeping pills."
       | 
       | Isn't this a massive crime? How horrible.
        
         | pjc50 wrote:
         | Depends on the jurisdiction. The sexual assault is a crime,
         | obviously, but very hard to prosecute even with this rare video
         | evidence. Publishing nonconsensual porn surprisingly isn't a
         | crime in a lot of places. Scotland recently criminalised it.
         | Check your local jurisdiction.
        
       | S_A_P wrote:
       | No mention of it in the article but can't charges be brought on
       | the (ex) husband for this? Pornhub definitely owns some of the
       | responsibility but the dude that did this is the bad guy here.
        
         | oh_sigh wrote:
          | It seems difficult for pornhub to do anything about this
          | (from a hosting perspective), considering people will also
          | upload completely consensual videos along the same lines
          | (where the
         | "victim" is just pretending to be passed out and is fully
         | consenting).
         | 
         | Pornhub could make you sign an affidavit, but even then, it is
          | relying on trusting the uploading party - if they are already
          | breaking the law by uploading non-consensual sexual assault
          | videos, odds are they will also check the affidavit box
          | saying that everyone in the video has consented.
        
           | paxys wrote:
           | Pornhub has already mostly fixed this problem by only
           | allowing uploads from verified accounts. It won't 100%
           | eliminate it, sure, but when the site has your real name,
           | picture, ID etc. you will be much less likely to upload stuff
           | that you aren't completely sure is legal and consensual.
        
           | Spivak wrote:
           | Sure, but you can't just check the affidavit box when you
           | have to have all parties in the video identified, registered
           | with the site with government ID, and everyone in the video
           | has to approve it before it goes live.
           | 
           | It's wild that we just offload all responsibility to the
           | victims who have to scour the internet and issue takedown
           | requests for videos because hosting sites literally can't be
           | bothered to actually get consent beforehand.
        
             | oh_sigh wrote:
             | If there is a husband who is living with his wife, and he
             | drugs her and sexually assaults her, wouldn't he also have
             | access to her ID to scan it and upload it? I guess at a
              | certain point, any kind of roadblock will lower the
              | chances of this person uploading the content.
             | 
             | What's the alternative to victims scouring the internet and
             | issuing takedown requests? Some centralized porn database
             | where you can type in someone's name and see what porn
             | they've done?
        
               | totalZero wrote:
               | We already have an apparatus to do this for copyrighted
               | material that people go far further out of their way to
               | share and download. The vast majority of porn is a
               | commodity to most porn addicts/users. It should be far
               | easier to get some amateur porn video off the web than a
               | BDrip of Disney's _Coco_.
               | 
               | If the question is "who should shoulder the cost of
               | takedown," my suggestion is that offenders and porn
               | industry behemoths should pay into a fund that finances
                | red-flagged content takedown efforts.
        
               | Spivak wrote:
               | > What's the alternative to victims scouring the internet
               | and issuing takedown requests? Some centralized porn
               | database where you can type in someone's name and see
               | what porn they've done?
               | 
               | I mean this completely unironically: ContentID. It exists
                | _exactly_ for this use-case because copyright holders
                | don't want to scour the internet for violations either.
                | If
               | you're a victim and you find that someone posted a video
               | of you being assaulted online you should be able to
               | register that video with ContentID and have every site
               | immediately and automatically take down the video
               | everywhere.
               | 
               | Yes it will only affect above-board sites but broadly
               | speaking those are the sites with large audiences and the
               | ones you really care about.
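
Mechanically, a ContentID-style registry like the one described above is just fingerprint matching with a tolerance. A minimal sketch, purely illustrative (the class, method names, and threshold are invented here and do not reflect how YouTube's ContentID or any porn site actually implements this):

```python
# Hypothetical sketch of a ContentID-style takedown registry. The
# class name, methods, and threshold are invented for illustration.

THRESHOLD = 2  # max differing bits still counted as the same video
               # (toy value for 8-bit fingerprints; real 64-bit
               # perceptual hashes would use a larger cutoff)

class TakedownRegistry:
    def __init__(self):
        self._flagged = []  # fingerprints registered by victims

    def register(self, fingerprint):
        """Called once when a victim reports a video."""
        self._flagged.append(fingerprint)

    def is_blocked(self, fingerprint):
        """Checked on every new upload, before it goes live."""
        return any(
            bin(fingerprint ^ known).count("1") <= THRESHOLD
            for known in self._flagged
        )

registry = TakedownRegistry()
registry.register(0b10110010)           # victim registers the video once
print(registry.is_blocked(0b10110110))  # True: re-encode, 1 bit differs
print(registry.is_blocked(0b01001001))  # False: unrelated upload
```

The point of the tolerance is exactly the one made in the thread: sites that share such a registry can block re-uploads automatically, without the victim scouring each site herself.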
        
           | Retric wrote:
           | Requiring real names, age, and contact info from uploaders
           | and everyone in the video would discourage people well beyond
           | a simple checkbox. Similarly, adding video fingerprinting
           | should make it very easy to avoid someone uploading the same
           | video again.
        
             | nitwit005 wrote:
             | You could just toss anyone's contact information into a web
             | form. If you really want to verify ID, you have to
             | physically check it in the real world.
        
             | AnthonyMouse wrote:
             | Real name policies in this context have an obvious flaw
             | when the host gets hacked and now the ostensibly private
             | records fall into the hands of criminals who start
             | blackmailing everyone to reveal to their bosses and
             | families that they were involved in the creation of
             | pornography.
             | 
             | The ability to remain pseudonymous is more important in
             | this context than many others.
             | 
             | > Similarly, adding video fingerprinting should make it
             | very easy to avoid someone uploading the same video again.
             | 
                | Those systems don't really work, because uploaders can
                | tell when a file is rejected, so they keep messing with
                | it until it's accepted. Or they upload it to a
                | different host each time, or to a file host rather
                | than a video host, one that can't see the contents
                | because the file was encrypted and the key is
                | distributed to the downloaders with the link, so the
                | host doesn't have it.
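
For context on the fingerprinting being debated here: these systems typically use perceptual hashes rather than exact file hashes, precisely so that small edits don't defeat a match. A minimal sketch of a "difference hash", the general family behind such systems (this is NOT the actual PhotoDNA or Vobile algorithm; the toy frame below is invented for illustration):

```python
# Minimal sketch of perceptual "difference hashing". Each bit of the
# fingerprint records only whether a pixel is brighter than its
# right-hand neighbour, so a uniform brightness tweak leaves the
# fingerprint unchanged, and a mild re-encode flips only a few bits.

def dhash(pixels):
    """pixels: grid of grayscale rows; one bit per adjacent pair."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing fingerprint bits."""
    return bin(a ^ b).count("1")

# A toy 8x9 grayscale "frame" and a uniformly brightened copy of it.
frame = [[(r * 37 + c * 23) % 200 for c in range(9)] for r in range(8)]
brighter = [[v + 4 for v in row] for row in frame]

print(hamming(dhash(frame), dhash(brighter)))  # 0: still a match
```

This also illustrates the evasion argument above: an attacker can keep perturbing the file until the Hamming distance crosses the site's threshold, so robustness is a matter of degree, not a guarantee.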
        
               | Retric wrote:
               | > core issue
               | 
               | If people don't want to be associated with these videos
               | then perhaps they shouldn't be uploaded. I think you just
               | made my point for me.
        
               | AnthonyMouse wrote:
               | > If people don't want to be associated with these videos
               | then perhaps they shouldn't be uploaded.
               | 
               | I don't often use the word privilege, but there it is.
        
               | Thiez wrote:
               | So I take it Retric is your full legal name? Or are you
               | posting under a pseudonym? If you don't want to be
               | associated with your comments maybe they shouldn't be
               | made.
               | 
               | Is that the point you are trying to make? Why is porn
               | "special" in this regard?
        
           | supergirl wrote:
           | there are many things that PH can do. they could simply
           | require verification of both participants. or they could just
           | reject any such videos if there is even a doubt about
           | consent. I think recently they just axed the whole "community
           | videos" (only because VISA and Mastercard cut ties with
           | them), so you must be a pro to upload. or did that change
           | already?
           | 
           | I think the right approach is for gov to regulate this as
           | prostitution (in Europe at least). you can't just let anyone
           | make porn videos because women and children will get abused,
           | same as with prostitution. you regulate it so whoever wants
           | to do it can do it safely. if you're not a licensed porn
           | actor then you can't upload.
        
             | oh_sigh wrote:
             | If my fetish is people over-eating, does someone who stuffs
             | their face with 10 Big Macs (completely clothed) need to
             | get approval to upload to pornhub?
             | 
             | Community uploads to pornhub don't feel like prostitution
             | to me, because for the most part no one is doing it for
             | money, merely because they're horny or enjoy it.
        
               | supergirl wrote:
                | well if it falls under the legal definition of porn
                | then yes, otherwise no. I don't know what the legal
                | definition is. right now there is no regulation at all,
               | which is crazy. or there is but somehow it doesn't apply
               | to pornhub. we could start with something simple and
               | obvious that covers most videos. something is better than
               | nothing.
               | 
               | > Community uploads to pornhub don't feel like
               | prostitution to me, because for the most part no one is
               | doing it for money, merely because they're horny or enjoy
               | it.
               | 
               | I meant we can regulate for the same reason prostitution
               | is regulated, to make it safer.
        
               | oh_sigh wrote:
               | But if there is no financial incentive in sex, then isn't
               | it more like just attempting to regulate the private sex
               | lives of individuals? Prostitution is regulated, but
               | there is no regulation (at least in any Western country
               | that I know of), that prevents a person from just going
               | out and having sex with other consenting adults.
        
               | [deleted]
        
               | totalZero wrote:
               | > private
               | 
               | Well, this word simply doesn't apply to a company like
               | pornhub, which has a
               | 
               | > financial
               | 
               | interest in publishing sexual content.
        
         | sleepysysadmin wrote:
          | Yes, Canadian law makes this a crime and also allows for
          | civil damages.
        
       | gher-shyu3i wrote:
        | And now with companies making it easy to make self-produced
        | porn, we're going to see even more mental health issues in the
        | future when those women realize the mistakes they're making.
        
       | darkerside wrote:
       | This seems like an actual case where you could use machine
       | learning to detect and remove instances of this video, or
       | automate the sending of takedown requests. That would however
       | require that the aggregators actually care.
        
         | hilldude wrote:
         | Looks like they are already doing it:
         | https://help.pornhub.com/hc/en-us/articles/1260803955549-Tra...
         | 
          | CSAI Match - YouTube's proprietary technology for combating
          | child sexual abuse imagery. In 2020, we scanned all video
          | content previously uploaded to Pornhub against YouTube's
          | CSAI Match and continue to scan all new uploads.
          | 
          | PhotoDNA - Microsoft's technology that aids in finding and
          | removing known images of child sexual abuse material. In
          | 2020, we scanned all photos previously uploaded to Pornhub
          | through Microsoft's PhotoDNA, and continue to scan all new
          | uploads.
         | 
         | Google's Content Safety API - Google's artificial intelligence
         | (AI) technology designed to help identify online child sexual
         | abuse material. Initially built to help detect all "not safe
         | for work content" as well as illegal content, Google's Content
         | Safety API attributes a score to content, which in turn serves
         | as an additional tool available for our moderation team.
         | 
         | MediaWise - Vobile's cyber "fingerprinting" software that scans
         | all new user uploads in order to help prevent previously
         | identified offending content from being re-uploaded.
         | 
         | Safeguard - Safeguard is Pornhub's proprietary image
         | recognition technology designed with the purpose of combatting
         | both child sexual abuse imagery and non-consensual content,
         | like revenge pornography, and helping to prevent the re-
         | uploading of that content to our platform and any other
         | platform that uses Safeguard. We believe in sharing this
         | technology with other social media platforms, video sharing
         | platforms, non-profits, and governmental organizations, free of
         | charge, to help make our platform, as well as the Internet at
         | large, a safer place for all by helping to limit the spread of
         | this harmful content. We also will provide the Safeguard
         | technology to our Trusted Flaggers so that our Trusted Flaggers
         | can fingerprint content on behalf of victims or potential
         | victims.
        
         | Spivak wrote:
          | So yes, but I think the real answer would just be to have sites
         | support ContentID and once a victim registers a video with the
         | system it will automatically be taken down.
        
         | tgv wrote:
          | Just check this item from a few days ago:
         | https://news.ycombinator.com/item?id=26681936
        
       | supergirl wrote:
       | this title is pretty mild. it should be something like "One
       | woman's struggle to remove her drugged rape photos from pornhub"
        
       | xyst wrote:
        | The only way to get PH to act is to threaten their bottom
        | line. Visa and MC threatened to stop processing transactions
        | related to PH in response to a journalist's article, and in
        | less than a few days PH haphazardly deleted all of the non-
        | partnered content (which included innocuous videos as well).
        
         | bigmattystyles wrote:
          | ...and forced them onto bitcoin right before the latest
          | massive uptick. If they've been sitting on those coins -
          | while I'm sure only a tiny percentage of users will bother
          | converting their account to BTC, and likely not enough to
          | make up the difference - they could not have asked for
          | better timing.
        
         | ogurechny wrote:
          | I guess most posters praising this as if it were a good
          | thing have already forgotten the previous big case of Visa
          | and Mastercard declining to process payments for a client
          | organization. And, no, they couldn't care less about some
          | article, and they most likely deal with much sleazier
          | companies when it's "a matter of national/international
          | importance", or a sufficiently high-ranking official asks
          | politely for certain exemptions. Such corporate reactions
          | are agreed upon in advance.
         | 
          | Even counter-terrorism arguments still mean that someone
          | gets the right to define who is called a terrorist today
          | _for you_. I can't see how this is good news. If you think
          | this isn't a real danger, political activists in Russia have
          | routinely been placed on "extremist" lists, which means all
          | their bank accounts have been frozen. I know, I know, just
          | following the law, someone has to stop the bad guys, etc.
        
       | symlinkk wrote:
       | It sounds like other sites (not pornhub.com) have the porn video
       | she's trying to remove, and she doesn't understand that Pornhub
       | isn't in control of them. Doesn't really seem like Pornhub's at
       | fault here, and I question the motives of a journalist who writes
       | something like this
        
         | mcphage wrote:
         | > It sounds like other sites (not pornhub.com) have the porn
         | video she's trying to remove, and she doesn't understand that
         | Pornhub isn't in control of them. Doesn't really seem like
         | Pornhub's at fault here
         | 
         | (1) Given the nature of the pornography industry, most sites
         | are owned by a small number of players, so it's definitely not
         | clear that Pornhub isn't in control of them.
         | 
         | (2) As the article describes, Pornhub offers a service which is
         | exactly what she is demanding: 'Pornhub offers something called
         | its "exclusive model program," which promises that it will send
         | takedown notices to any website to "help protect your content
         | from being uploaded to other websites."' And so even if Pornhub
         | is not in control of the sites where the video is hosted, they
         | already advertise the ability to get content taken down from
         | other sites.
        
           | claudiawerner wrote:
           | >(1) Given the nature of the pornography industry, most sites
           | are owned by a small number of players, so it's definitely
           | not clear that Pornhub isn't in control of them.
           | 
            | It also doesn't mean they _are_ in control of them. I
            | think it's reasonable to assume that PornHub doesn't own
            | every site the video was uploaded on.
           | 
           | >Pornhub offers a service which is exactly what she is
           | demanding
           | 
           | This is almost certainly on the basis of copyright. As in,
           | those 'exclusive models' sign over the copyright to their
           | content to PornHub, or authorize PornHub to act on their
            | behalf as copyright holders. PornHub then issues DMCA
            | takedown notices or uses some other 'takedown notice'.
           | 
           | In order for this to work, she would have to be the copyright
           | owner of the video. If anything, in a horribly ironic twist,
           | her ex-husband is likely the copyright holder. She has no
           | standing (speaking in terms of copyright) to get PH to
           | protect her video like they do with the exclusive model
           | program.
           | 
           | If you know more about how it works, then please inform me,
           | but I can only assume it's copyright.
        
         | gccs wrote:
          | If you've been paying attention to the news in the past few
          | years, it's been increasingly obvious how little research
          | and effort goes into new stories. We live in the era of
          | social media, where the headline is the only thing that 99%
          | of people read.
         | The actual content of the article is irrelevant. People will be
         | outraged for a second and then continue scrolling and
         | completely forget it existed 5 minutes later. There is no
         | economic incentive for journalists to do any real journalism.
        
       | supergirl wrote:
       | I don't get how pornhub still exists considering it hosts so many
       | illegal videos: assaults, child porn, spy cams, etc. and this is
       | the premier porn site. imagine what you can find on less
       | mainstream porn sites.
       | 
        | the US gov went to full-on war against torrent websites and
        | closed every one of them no matter how small, yet this giant
        | website
       | that literally hosts rape videos is OK.
       | 
       | all providers of PH should follow VISA and Mastercard and cut
       | ties. I realize killing PH will not end sharing of videos but at
       | least there will be a lot less money made from it and a lot less
       | viewers.
        
         | totalZero wrote:
         | I wonder what percentage of lawmakers are regular users of
         | PornHub and its affiliates.
        
       | kjrose wrote:
        | The issue I find with this is that once it's online... it's
        | never going away. Not without some incredibly draconian,
        | overarching system. What could even be done in situations like
        | this? Short
       | of insisting that any content provider (pornhub included) be
       | liable for all content posted to their site in perpetuity. (Which
       | seems like a really dangerous idea and a good way to force
       | extreme censorship online.)
        
         | fencepost wrote:
         | It may not be possible to eradicate it completely, but it's
         | absolutely feasible to dramatically reduce its availability
         | until there's no practical difference.
         | 
         | Maybe a thousand people downloaded it and added it to personal
         | collections/datahoards. The vast majority of that thousand
         | would never consider uploading to a public site (possibly as a
         | response to a request). Of the remainder, most probably
         | wouldn't bother unless the video in question is an outstanding
          | example of that particular kink (in the story, sex with
          | someone unconscious).
         | 
          | The first step is undoubtedly getting and _keeping_ it off
          | public sites. Once that's done, even the few places it may
          | remain effectively disappear in a vast sea of other content.
        
         | worik wrote:
          | They offer paying customers a service that removes content
          | from all the affiliated sites.
          | 
          | So they have more capability than they suggest. Not perfect,
          | but much better.
        
           | kjrose wrote:
           | They have the ability to encourage content removal. Not that
           | they will be successful.
        
             | fencepost wrote:
             | Pretty sure the parent corporation owns a disturbingly high
             | percentage of the video sharing sites. At the very least
             | they own Pornhub, Redtube, Youporn as well as a bunch of
             | others (Wikipedia about Mindgeek).
        
               | kjrose wrote:
               | That, if true, is even more disturbing to me. Imagine the
               | blackmail potential of a corporation with that much data.
        
       | hirundo wrote:
       | Could this problem be largely resolved for Pornhub (at least) by
       | using GAN generation to replace faces in submissions? Sort of
       | "This porn star does not exist"? If done well, it could
       | _increase_ the average fap-worthiness of content, while
       | preserving privacy. It could also be applied to other identifying
       | characteristics, like tattoos and moles.
        
         | ljp_206 wrote:
          | "Solving" a problem deeply intertwined with bodily autonomy
          | and consent with 'obfuscate it until it's no longer your
          | body and face, absolving you of control' seems like... the
          | wrong approach. If actors/actresses are okay with it then
          | let's go
         | right ahead.
        
           | [deleted]
        
         | leetcrew wrote:
         | if they could identify the face with sufficient accuracy, why
         | not just take down the video?
        
         | dTal wrote:
         | You'd be happy with people jackin it to a video of you being
         | sexually assaulted if they just obscured your face? In that
         | case why bother with the fancy neural network stuff, just make
         | sure to wear a paper bag over your head when you're being
         | raped! Problem solved.
         | 
         | /s
         | 
         | Your proposal has merit as a privacy measure - but this isn't
         | about privacy. This is about consent.
        
           | dredmorbius wrote:
            | Whilst you've a point here, there's also the case that
            | revenge porn for which the subject isn't actually
            | identifiable ... loses some of its punch.
            | 
            | There's attacking the supply-side aspect.
            | 
            | Though yes, masked assault remains assault.
        
       | ogurechny wrote:
       | What a nice manipulative article perfectly targeted at clueless
       | general public.
       | 
       | So the criminal was her husband, the video was reposted by random
       | users or just bots (probably not illegal, but can start a long
       | talk about personal moral obligations), and the blame is on...
       | Pornhub (because all other porn sites are supposedly very noble).
       | First, because it is silently implied that all good websites
       | should collect and check the official IDs of all good citizens
       | uploading any data to them (what a great perspective), and,
       | second, because it had The Download Button (this lame Don't-Copy-
        | That-Floppy point gets stressed in most of the articles in the
        | anti-Pornhub campaign -- bravo, incorruptible journalists). But
        | what
       | is the alternative? Of course, it's good old trusted porn studios
       | that have all the papers to prove that women (and men) pretending
       | to have sex on camera and even destroying the functions of their
        | body parts have a legal contract. Hooray!
       | 
       | It is amusing how porn industry leveraged anti-porn, women's
       | rights, and other groups to kill the competition from amateur and
       | no-name internet content. It is clear that Pornhub owners
        | understood that they could not refuse the offer, and the goal was
       | reached: now Porn Site #1 is mostly a shop-front for big porn
       | studios (and you can be sure they get their share of those
       | horrible, horrible ad revenue money).
        
         | onion2k wrote:
         | _First, because it is silently implied that all good websites
         | should collect and check the official IDs of all good citizens
         | uploading any data to them (what a great perspective)..._
         | 
         | There's a significant difference between "uploading any data"
         | and "uploading porn". Suggesting someone who is uploading porn
         | should have to prove that they're entitled to, and that the
         | people in the video have consented to be put online in the
         | context of a porn video, is not unreasonable given the weight
         | of evidence that "revenge porn" is incredibly damaging to
         | victims.
        
           | ogurechny wrote:
           | You are probably guided by the false dilemma presented by
           | that or similar articles. I doubt any porn service on the
           | internet in the last 25 years did anything like that, just
           | like no web hosting service has checked that each image and
           | each piece of text on your web site is legally owned or
            | distributed by you. This is by no means a recent problem:
            | naked pictures were leaked intentionally or unintentionally
            | long before the internet existed. However, the discussion
            | implies that all of a sudden porn sites appeared, you can
            | be on them, and wegottadosomethingfast!
           | 
           | I don't think there is much difference between your naked
           | body and other information you might want to keep private.
           | "Porn" triggers people (most often American people, I have to
           | admit as a distant observer), makes them reason like there is
           | some inherent difference between "porn" and "not porn", and
           | makes manipulation easier. Basically, what you're saying is
           | that there should be more (indirect) censorship and real-life
           | identity matching on the Web -- but it's for an Obvious Good
           | Cause! The problem is that Good Causes get forgotten quite
           | fast, and the one who benefits from it is the entity in
            | control of the system that gained power. Here we have porn
            | studios dictating to previously independent Mindgeek (a
            | former pain in their asses) which content should be on the
            | biggest porn streaming sites under the new system. The
            | speed and scale of
           | the attack, and the concerned voices from various directions,
           | hint there was a lot of high-level lobbying involved.
           | 
           | Here's no less thrilling example. Suppose that Rick Astley
           | wakes up tomorrow and decides to point out that he has always
           | been a singer and not some kind of internet joke. Perfectly
           | understandable impulse. Then he proceeds to remove all the
           | non-musical uses of his songs (let's forget about corporate
            | ownership). How would you react to that? What would the
            | results be? There's certainly more people with similar
            | wishes --
           | basically the whole genre of "viral videos" is one big zero
           | consent heap. When you are having a laugh at someone's
           | expense, do you worry about them? When you watch videos of
           | Beirut explosion, do you think about all the dead people who
           | never gave consent for their last moment to be your
           | entertainment? Should we ban the uploading of such videos
           | without explicit source checks, then?
        
           | Mirioron wrote:
           | What's the significant difference? I've seen viral videos of
           | people in compromising or embarrassing situations. Some have
           | even lost their jobs as a result. Should all video uploading
           | sites demand documentation that every person in the video
           | consents to it?
        
             | toomuchtodo wrote:
             | If the law requires it, yes. The uploader should be the
             | liable party if the subject or subjects didn't consent.
             | 
             | Why is this a strange concept? Why should someone be able
             | to upload whatever they want to platforms with no
             | responsibility for what they're uploading?
        
               | Mirioron wrote:
                | The whole question here is whether the law _should_
                | require it. The reason it's a strange concept is that
                | it would severely limit the spread of information.
               | The same avenues that a regular person can get content
               | about them deleted can be used by politicians or
               | criminals to do the same. Not only that, these levers can
               | also be abused to simply target somebody with false
               | claims. Furthermore, it also places trust onto the video
               | hosting website to keep all of that information safe. If
               | they get hacked, then all of that documentation and
               | images of IDs get leaked.
               | 
               | > _Why should someone be able to upload whatever they
               | want to platforms with no responsibility for what they're
               | uploading?_
               | 
               | This isn't in question though. The question is whether
               | sites need to be preemptive or reactive. Illegal content
               | is illegal even right now and these websites will delete
               | it. If they don't then you can take them to court and the
               | content will get deleted. However, the thread suggests
               | that these websites should be forced to be preemptive -
               | to demand documentation upfront. That's a whole different
               | situation.
        
               | brayhite wrote:
               | I'm not comfortable being liable to being sued because my
               | cousin decided to let us know after the fact he didn't
               | want to be in any of the pictures uploaded to
               | Facebook/Twitter/Instagram/tumblr/Flickr/etc.
               | 
               | I admittedly didn't read the article, so maybe your
               | comment's sentiment isn't meant to include these
               | scenarios.
        
               | [deleted]
        
               | kbenson wrote:
               | What if your cousin is doing something frowned upon by
               | society (wearing blackface, spouting racial epithets,
               | etc)? What if it's not frowned upon now, but is at a
               | later date? What if you don't find anything
               | objectionable, but they wouldn't want it shown, and their
               | concerns are borne out later? What if it's completely
               | benign but out of context appears problematic (goofing
               | around with a good friend of a different race and it
               | comes across as something different when only a portion
               | of it is represented)? What if they just have a personal
                | objection to having their picture posted?
               | 
               | There are many reasons why it's a good idea to at least
               | check that the person is comfortable with it. You
               | shouldn't be liable if they've said it's okay.
        
               | toomuchtodo wrote:
               | Absolutely. Good UX would be the subject receiving a
               | notification that a photo of them has been uploaded,
               | requesting whether they should be scrubbed/blurred from
               | the photo or not before it's made public. It's about
               | consent and agency.
        
               | brayhite wrote:
               | I'd argue the tech to support this would require much
               | more to be known about me by another entity and system
               | than likely any picture taken by someone would share. No
               | thanks.
        
               | brayhite wrote:
               | And all those tourists in the background of your selfie
               | at the Statue of Liberty. Need their permission, too?
        
               | toomuchtodo wrote:
               | If it's out in public and non commercial, no. If
               | commercial, a release form is required. Assumes US law
               | as-is today.
               | 
               | https://www.format.com/magazine/resources/photography/pho
               | to-...
        
               | PhasmaFelis wrote:
               | Your link says:
               | 
               | "If someone is in a public space, like a park, beach, or
               | sidewalk, it is legal to photograph them. They shouldn't
               | expect privacy laws to prevent them from being
               | photographed. That means a street photographer can
               | publish candids taken in public spaces, as long as those
               | images are only being used for editorial purposes."
               | 
               | It'd be impossible to photograph a protest, rally,
               | sporting event, or basically any long shot of a public
               | area if you had to get individual consent from every
               | person in the frame.
        
               | toomuchtodo wrote:
               | Sporting events and similar usually have blanket photo
               | permission grants on them.
        
               | kbenson wrote:
               | I agree with this stance, but I also think it's
               | beneficial to explore whether "there's a significant
               | difference between 'uploading any data' and 'uploading
               | porn'" is true in part or whole, and whether the "porn"
               | part of it is a red-herring.
        
           | scarmig wrote:
           | It's a balance. Suppose you add this proof requirement. What
           | happens then? It's not as simple as lots of legitimate
           | amateur content producers facing additional friction. Most
           | will make the explicit decision not to have their legal name
           | and address associated with pornographic content. Others will
           | upload their content anyway, without doing an audit of the
           | different platforms' security postures or being aware of the
           | likelihood of and outcomes from breaches.
           | 
           | What happens in a breach? Now, you're probably thinking, just
           | require the platforms to be secure! But you can't simply will
           | that into existence, and I guarantee you it's impossible. So
           | ultimately you've made people vulnerable when they weren't
           | before, and you're setting up a situation where it's pretty
           | much inevitable that they'll be exposed and threatened by
           | stalkers, deranged fans, and ideological zealots.
           | 
           | Let's suppose you bite the bullet, and say turning all the
           | porn platforms into outlets for major porn production
           | companies is the way to go. At least we've eliminated revenge
           | porn, right? No. Large masses of people really hate that kind
           | of mass produced porn, so they jump to foreign platforms that
           | don't require complicated ID verification. These sites end up
           | with the large majority of amateur porn content, along with
           | the revenge porn. But they're now out of the reach of US law
           | enforcement, so the victims of revenge porn would then be in
           | a worse position than they are today.
           | 
           | Now we're getting to even more extreme measures. Maybe the US
           | needs to build a national firewall to prevent the foreign
           | criminals from penetrating our great nation and undermining
           | our morals with porn that's unapproved by the Feds? China is
           | way ahead of the game on this, in that pornography is
           | outright illegal, and it has sophisticated internet security
           | measures that are exceptional in breadth and depth by Western
           | standards. How's its war on porn going? Spoiler alert: it's
           | lost it. Perhaps it could be a moment to build cross-cultural
           | empathy, as Chinese netizens exchange tips with Americans on
           | how to use Shadowsocks to avoid the censors and access
           | amateur porn.
           | 
           | Requiring ID verification is one of those things that sounds
           | good and moral on its face, but has so many unintended
           | downstream consequences that exacerbate the original problem
           | while making things generally worse.
        
             | lostcolony wrote:
             | So all professional porn already requires identification
             | and age checks. I'm...not convinced it's the end of the
             | world if amateur ones do too. The considerations are the
             | same.
        
               | scarmig wrote:
               | People appearing in professional porn today are almost by
               | definition those who are most comfortable getting
               | identification and age checks; the ability to maintain
               | anonymity is a key concern for most amateurs. Adding the
               | same verification process to amateur porn is more or less
               | making all porn professional porn, not mildly tweaking
               | the nature of amateur porn. Amateurs would be faced with
               | a choice of 1) effectively going pro, 2) fleeing to a
               | different platform, or 3) exiting porn production
               | altogether. I'm fairly confident that 2) would be the
               | largest proportion of people, closely followed by 3),
               | with 1) a distant third.
        
           | Ceezy wrote:
           | Yes, any normal porn company makes performers sign specific
           | contracts; it should just be the same for porn sites.
        
             | pgsimp wrote:
             | How could that even work in practice? Everybody who is
             | visible in a video has to do some kind of VideoIdent and
             | clearly state their consent? YouTube should have to do the
             | same, btw.
        
         | Traster wrote:
         | You have this almost exactly backwards- PH now own (or are
         | responsible for) their entire industry because they take all
         | the profit.
        
           | ogurechny wrote:
           | That's what I'm talking about: real stories of regular people
           | are cynically used to negotiate new porn profit deals.
        
         | LanceH wrote:
         | The obvious step would be to require the uploader to be
         | responsible for their content. Less so than youtube, ph's
         | content would generally imply that the owner has secured the
         | rights of its participants.
        
         | pc86 wrote:
         | > _What a nice manipulative article perfectly targeted at
         | clueless general public._
         | 
         | This could describe approximately 100% of local news
         | articles/segments on any topic even remotely political or
         | controversial. And maybe 75% of the remaining.
        
         | mattbee wrote:
         | Is your diatribe an argument that Pornhub _should_ have the
         | right to distribute pictures of a sexual assault?
        
       | dustinmoris wrote:
       | I bet if someone uploaded a video of all the dodgy tax avoidance
       | these crooks do and spread it all over the internet, these
       | bastards would find a way to trace it and take it down in less
       | than 24h. Maybe someone needs to start physically harassing all
       | the CEOs of those shit companies and video document it so that
       | they can't live a single day of their lives without being
       | violently attacked and harassed before they will find a way to
       | take harassment seriously.
        
       | matheusmoreira wrote:
       | This problem is fundamentally impossible to solve. Once data has
       | been created, it's trivial to make and transmit copies. Complete
       | tyranny would be necessary to erase all illegal data from all
       | computers in the world. The best people can hope for is removal
       | of data from popular centralized services.
        
         | bigmattystyles wrote:
         | The 'good' news is that with deepfakes everyone will (soon) at
         | least have the veneer of plausible deniability. I know there
         | will always be ways to determine whether something is a
         | deepfake via adversarial detection, but perhaps in the future
         | so much content will be faked that it would only be worth
         | checking for a head of state or other renowned person.
        
       | blackbrokkoli wrote:
       | That is honestly just the same old big tech story I must have
       | read a thousand times now.
       | 
       | Sure, it is a little bit more sinister than usual even, but
       | that's just because the company in question serves a market with
       | some very dark corners, like the assault featured in the article.
       | 
       | People here argue pro or contra porn prohibition or platform
       | specifics, but it comes down to this: The Western world decided
       | that giant multinationals sitting above countries, laws, ethics
       | and responsibility holding _all_ the data are the ultimate
       | expression of the American Dream and should therefore exist
       | undisturbed forever, growing bigger and bigger. Shit like this is
       | just fallout from the gargantuan power distance between megacorp
       | and human you get, _by design_.
       | 
       | Sure, there are probably ways to get PornHub to be sorry for
       | that, make amends, change some rules. Whatever. Last week it was
       | Uber discriminating, here it's PornHub violating, maybe next week
       | it will be someone having his business destroyed because he
       | associated with a scammer in 2004 again.
       | 
       | Every time there is juicy drama and a micro-outcome in this or
       | that direction, but for some reason the macro-problem gets
       | mostly ignored.
        
         | grecy wrote:
         | Though it starts out feeling a little dismissive, I think your
         | comment does a good job of explaining that the problem we have
         | now is not limited to the _micro_ level, but that it's a
         | _macro_ level problem.
         | 
         | Now it's time to start talking about how we're going to fix
         | that problem you have so rightly pointed out.
         | 
         | It's clear the rules _need_ to change, and it's clear we need
         | to claw back the gargantuan power we gave over to mega-corps.
        
         | pgsimp wrote:
         | Or maybe it is just media companies trying to exert power and
         | blowing up issues out of proportion. Just because some media
         | outlet writes a dramatic article, doesn't mean politicians
         | should rush to institute new laws to appease the journalists.
         | 
         | I'm not defending the hosting of revenge porn, but that
         | specific issue does not seem to rely on the existence of porn
         | sites.
         | 
         | It does rely on sites where people can upload stuff. Maybe
         | media companies don't really like that such sites exist,
         | because it destroys their information monopoly.
        
         | true_religion wrote:
         | It is hardly just changing some rules. Pornhub killed off the
         | majority of their content, and removed the permissive copyright
         | system that enabled them to become dominant in the market in
         | the first place.
         | 
         | For them, it's a huge change.
        
           | [deleted]
        
         | routerl wrote:
         | > The Western world decided...
         | 
         | It's not so simple[1].
         | 
         | [1] https://en.wikipedia.org/wiki/Manufacturing_Consent
        
         | brailsafe wrote:
         | Perhaps a bit pedantic, but the American Dream isn't
         | necessarily wholly applicable in Canada. Uber was very slowly
         | allowed in, and developer salaries are kilometers behind the
         | US.
        
         | pjc50 wrote:
         | Serious question: there used to be in US law a record keeping
         | requirement
         | https://en.wikipedia.org/wiki/Child_Protection_and_Obscenity...
         | ; what happened to that which meant Pornhub could just publish
         | whatever with no attempt at verifying legitimacy?
        
           | 0cVlTeIATBs wrote:
           | Does PornHub "publish" the videos on their site?
        
             | totalZero wrote:
             | My gut tells me that yes, they do. What's the alternative?
             | 
             | "We're just a platform your honor. We are not responsible
             | for the rape content that we host on our website."
             | 
             | Seems like an unsatisfactory standard.
        
           | brailsafe wrote:
           | US law only applies to US companies
        
             | pjc50 wrote:
             | Well, up to a point; there are plenty of laws which apply
             | to anyone deemed to be marketing to the US, especially
             | financial services and online gambling, and Megaupload got
             | raided by the FBI despite being in New Zealand.
        
           | hluska wrote:
           | Mindgeek has a very interesting corporate past. This FT
           | article is the closest I've ever found to an investigation
           | but honestly, it reads more like a really bad soap opera than
           | an FT article:
           | 
           | https://financialpost.com/financial-times/the-secretive-
           | worl...
        
         | throwaway0a5e wrote:
         | This isn't a BigCo's specific failure.
         | 
         | They're just following in the footsteps of the precedent set by
         | the DMV et al who themselves are standing on a long and storied
         | tradition of bureaucratic ineptitude. Heck, this crap goes back
         | at least as far as the ancient Roman administrative state if
         | not further. The fact that the bureaucratic run-around flies
         | from department to department in the form of 1's and zeros
         | instead of dried tree pulp or waxed stone tablets doesn't
         | really make a substantial difference. This isn't anything new.
        
           | elmomle wrote:
           | I think the argument is that if companies feared consequences
           | from an engaged government that actively protected its
           | private citizens' rights, this specific issue would not exist
           | (since the company's continued functioning could then depend
           | on its efficiently cooperating with such requests).
        
             | abakker wrote:
             | I think the issue being discussed here is also, "how can we
             | make a government whose bureaucracy will be able to be
             | engaged?" So many examples show bureaucracy and the
             | abdication of responsibility go hand in hand. In order for
             | the government to be empowered, its workers need to be
             | empowered, and we have few cultural narratives of how rule
             | following enabled great outcomes.
             | 
             | Instead, almost fetishistically, we embrace rule breaking
             | as the ultimate expression of self actualization, from
             | forming businesses to bureaucrats who actually took time
             | during their breaks to help.
             | 
             | IMO, working for a company and working for the government
             | are both exercises in limiting personal liability, and I've
             | not yet encountered a construct that allowed people to
             | limit liability without also limiting their agency. I also
             | think it is important that we as a modern democracy do work
             | to solve this issue.
        
         | HenryBemis wrote:
         | > Pornhub has recently removed that download button.
         | 
         | News flash: there are many ( _many many many_ ) programs out
         | there that can 'catch'/download a video, mp3, etc. from a
         | website.
         | 
         | I listen to a daily radio show. Not live though. I wait till
         | the recording is done, then wait till they upload it on the
         | station's website, and I use a piece of software to 'catch'
         | the mp3, download it, and listen to it later at my convenience
         | (and offline). I have written and asked them to make a podcast
         | for each of their radio shows; they're in the process of
         | setting it up. I just cannot listen to them live, so I do this
         | 'illegal workaround'.
         | 
         | That same software can catch videos (and streams) from YT,
         | Vimeo, and most sites I visit. I don't 'pirate', but I do
         | download some old-time music videos because I fear that one day
         | they will disappear (e.g. Van Halen - Right now, Chicane -
         | Saltwater). I want to be able to watch them when I'm (very)
         | old, and I don't know how 'old music' videos will be treated. I
         | do pay for Spotify, so I do pay for the music I enjoy (to avoid
         | any responses to the contrary).
         | 
         | > Every time there is juicy drama and a micro-outcome
         | 
         | (not picking a fight with you): If that was your sister,
         | mother, daughter you wouldn't even think of writing something
         | like that. It is cases like this though, that can bring a $50m
         | penalty to the big bullies and their minions ("marios",
         | "kevin"). And large corps don't give a poop for the little
         | people. They care to _not lose money_. Slap a $50m penalty to
         | PornHub, and see them changing their tune within 24h and fixing
         | _this_ problem within 2-3 months.
        
         | Dirlewanger wrote:
         | The macro-problem gets ignored because we have baby boomer
         | politicians that are maliciously negligent in their concern for
         | the well-being of society and have selfishly pillaged America
         | for the past 50 years. You want to see change? Get rid of all
         | 535 members of Congress tomorrow, and don't allow anyone over
         | the age of 50 in their place.
        
       | imwillofficial wrote:
       | This is so messed up. I can't even imagine what that must be
       | like. The dark side of "the Internet never forgets."
        
       | cblconfederate wrote:
       | > However, when she searched for the name of the video on Google
       | in January, it still returned 1,900 results. It seems that
       | although Pornhub had removed the video, it still kept thumbnails
       | of the naked images. Because those thumbnails still existed, a
       | Google search would find - and display - those naked images. She
       | realized the only way to eliminate those was to get Pornhub to
       | remove all traces of the thumbnail images.
       | 
       | It seems the problem was / is Google (and not Pornhub), because
       | Google makes it impossible to remove stuff from the internet
       | fast enough to prevent damage. I wonder why the article doesn't
       | consider it. Google should have an on-demand mechanism to
       | instantly delete all text/images that match a fingerprint.
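       | 
       | A minimal sketch of what such a fingerprint mechanism could
       | look like (hypothetical; the class and method names here are
       | made up for illustration). One caveat it makes visible: an
       | exact cryptographic fingerprint only matches byte-identical
       | copies, so any re-encoded upload slips through:

```python
import hashlib

# Hypothetical on-demand takedown registry: once content is reported,
# its fingerprint is recorded, and any upload or indexed copy with a
# matching fingerprint can be purged automatically.
class TakedownRegistry:
    def __init__(self):
        self._banned = set()

    def fingerprint(self, data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def report(self, data: bytes) -> str:
        """Register content for removal; returns its fingerprint."""
        fp = self.fingerprint(data)
        self._banned.add(fp)
        return fp

    def is_banned(self, data: bytes) -> bool:
        return self.fingerprint(data) in self._banned

registry = TakedownRegistry()
video = b"\x00fake video bytes\xff"
registry.report(video)
print(registry.is_banned(video))         # True: exact copy is caught
print(registry.is_banned(video + b"!"))  # False: a re-encoded copy slips by
```

       | This is why systems like PhotoDNA and YouTube's Content ID use
       | perceptual fingerprints rather than exact hashes.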
        
         | mrweasel wrote:
         | Google does have to comply with the EU's right to be
         | forgotten, but I don't know how they deal with images.
         | 
         | You're right that we really can't remove stuff fast enough.
         | There will almost always be a copy somewhere.
        
         | zakki wrote:
         | If Google removed it from their search results, it is true
         | that most people would not see/find it. But her
         | video/thumbnail is still all over the internet. On the other
         | hand, if the (porn) sites removed the video/thumbnail, it
         | would be gone from Google search as well.
        
           | visarga wrote:
           | and only live on archive.org and other places
        
         | ggggtez wrote:
         | As far as I understand it, if PH stopped hosting the
         | thumbnails, then Google would stop showing them.
         | 
         | The problem is that PH didn't actually remove the thumbnails.
        
           | nunez wrote:
           | I don't even think they own all of them? There are a
           | billionty porn site aggregators that pull from Pornhub and
           | get ranked on Google, Bing, and other search engines.
           | 
           | Is removing _all_ traces of this video even a tractable
           | problem?
        
             | zepto wrote:
             | Yes - the video / thumbnails to be removed can be hashed in
             | various ways. The hashes can be made public.
             | 
             | A court can order videos or thumbnails matching the hash to
             | be removed, and levy penalties against anyone still hosting
             | them after a period of time.
             | 
             | It's a less complex problem than policing use of
             | copyrighted music on YouTube videos, and that has been
             | achieved.
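             | 
             | As a rough illustration of "hashed in various ways": a
             | perceptual hash such as aHash maps an image to 64 bits so
             | that near-duplicates (re-encoded or lightly edited copies)
             | land a small Hamming distance apart. The toy code below is
             | a from-scratch sketch, not any platform's actual matcher:

```python
# Toy "average hash" (aHash): downscale the image to 8x8 blocks, then
# emit one bit per block depending on whether the block is brighter
# than the overall mean. Lightly edited or re-encoded copies flip only
# a few of the 64 bits, so matching uses Hamming distance instead of
# exact equality. (Illustrative sketch only.)

def average_hash(pixels, size=8):
    """pixels: 2D list of grayscale values; dimensions divisible by size."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size
    # Downscale by averaging each bh x bw block.
    small = [
        [
            sum(pixels[y][x]
                for y in range(r * bh, (r + 1) * bh)
                for x in range(c * bw, (c + 1) * bw)) / (bh * bw)
            for c in range(size)
        ]
        for r in range(size)
    ]
    mean = sum(sum(row) for row in small) / (size * size)
    bits = 0
    for row in small:
        for v in row:
            bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A synthetic 64x64 "frame" and a mildly perturbed copy of it.
frame = [[(x * y) % 256 for x in range(64)] for y in range(64)]
noisy = [[min(255, v + 3) for v in row] for row in frame]

h1, h2 = average_hash(frame), average_hash(noisy)
print(hamming(h1, h2))  # small distance: the copies match as near-duplicates
```

             | Production systems use sturdier fingerprints (pHash,
             | PhotoDNA), but the matching principle -- compare hashes by
             | Hamming distance rather than exact equality -- is the same.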
        
         | elliekelly wrote:
         | Perhaps I'm not understanding but why couldn't PH just remove
         | the thumbnails so they couldn't be indexed anymore? Sure,
         | Google should have a mechanism so material like that isn't
         | indexed but Google isn't the only search engine and PH
         | shouldn't be hosting that content, not even as a thumbnail.
        
           | MattGaiser wrote:
           | > why couldn't PH just remove the thumbnails so they couldn't
           | be indexed anymore?
           | 
           | > Kevin responded again, insisting that Pornhub "can NOT"
           | remove content from other sites. However, that doesn't seem
           | to be completely accurate. Pornhub offers something called
           | its "exclusive model program," which promises that it will
           | send takedown notices to any website to "help protect your
           | content from being uploaded to other websites."
           | 
           | I am not sure that PornHub actually hosts those thumbnails or
           | necessarily controls them. Sending a takedown notice
           | indicates to me that they don't control that content. They
           | would just be requesting removal.
        
           | cblconfederate wrote:
           | Even if they did remove it, how long would it take for google
           | to remove their copy? That's why it's better to have an on-
           | demand mechanism to remove content.
        
       | dfxm12 wrote:
       | It sucks that this is a case where no amount of money & no
       | punishment of anyone involved can really make the victim whole.
       | Some things require a time machine to fix.
        
         | KozmoNau7 wrote:
         | It shows how much our societal norms and laws were - and still
         | are - completely unprepared for the implications of everyone
         | being able to effortlessly copy and distribute images and video
         | all over the world. We're still in a sort of local to national
         | mindset, but international and virtual issues completely
         | shortcut a lot of the foundations of that, and the people who
         | get caught in that friction get hurt, sometimes _very_ badly.
         | 
         | If someone wants to ruin your reputation and cause you immense
         | grief, they can very easily do that, and there is basically
         | nothing you can do once the material is out there, _especially_
         | if it is pornographic in nature.
         | 
         | Privacy is a basic human right, and yet we are trampling all
         | over it. I wish I knew what we as humanity could do about this,
         | because shitty people aren't just magically going to stop
         | existing, and the internet is an extremely powerful tool for
         | abuse in their hands.
         | 
         | Facilitating abuse by sharing abuse materials should be
         | subject to just as harsh a punishment as uploading them in
         | the first place.
        
           | ogurechny wrote:
           | The subjects of Nick Ut's "The Terror of War" didn't sign
           | consent papers, the same with countless other famous photos.
           | There is no doubt you have seen it. Aren't you one of the
           | "shitty people"? Will you repent?
        
         | hypertele-Xii wrote:
         | What a shitty attitude. Once a victim, forever a victim? I
         | propose to you a simple yet powerful idea: _Shit happens, get
         | over it._ Life goes on. New experiences.
        
           | qzx_pierri wrote:
           | I suspect you'll be downvoted, but I agree. People take the
           | internet waaaaaaay too seriously now. Not downplaying their
           | trauma. Crazy stuff happens online, but everyone's attention
           | span lasts for 30 seconds, so just move on, get some therapy,
           | and try to 'let it be'. I suspect stories like this will
           | facilitate additional mass-censorship via complaints similar
           | to "think of the children!!", but framed towards adults who
           | don't understand how web archival works.
           | 
           | But is this what we want? Don't we love the internet because
           | everything lives forever? Remove the story in the OP from
           | your mind and respond without any emotion. I know I sound
           | like a contrarian, but I really miss when the internet felt
           | more dangerous and unhinged. People starting these justice
           | campaigns for every little thing about the internet is not
           | only futile, but also short sighted. Do we want more "mass
           | deletions" of content like Tumblr, Pornhub, etc ? This is
           | what happens when the internet is slowly homogenized into
           | this business friendly, marketable, "safe space" for casual
           | users. Sorry for the rant, but I've noticed this so much
           | lately.
           | 
           | YES terrible stuff happens online. YES it has been happening
           | since before the author of this story was born. Deal with it
           | the best you can, but stop using emotion to convince
           | everyone that you won't be "fixed" until every "bad" site
           | is fully regulated and monitored.
        
             | SketchySeaBeast wrote:
             | I'd really rather my employer not stumble upon someone's
             | revenge porn of me, and I'd like the ability to get that
             | removed, yes.
        
               | qzx_pierri wrote:
               | Submit a request to have it removed. Lawyer up if the
               | site doesn't comply. If the site is overseas and doesn't
               | comply, then you'll have to just deal with it. Google
               | even has their own methods to have a URL removed. I've
               | done this before myself. I agree content should have
               | mechanisms which allow takedowns to occur. Taking
               | Google's approach and delisting certain websites from
               | their search results seems a bit much (this refers to the
               | mass censorship in my original comment).
               | 
               | Stories like this cater to people who aren't savvy enough
               | to do the work to get the content removed. I said I miss
               | when the internet felt more dangerous and wasn't catered
               | to casuals. I think both of us have the same beliefs, I
               | just think there should be a bit more freedom online, but
               | these "platforms" are publicly traded companies now, so
               | my thoughts don't matter.
        
             | notyourday wrote:
             | > I suspect you'll be downvoted, but I agree. People take
             | the internet waaaaaaay too seriously now. Not downplaying
             | their trauma. Crazy stuff happens online, but everyone's
             | attention span lasts for 30 seconds, so just move on, get
             | some therapy, and try to 'let it be'. I suspect stories
             | like this will facilitate additional mass-censorship via
             | complaints similar to "think of the children!!", but framed
             | towards adults who don't understand how web archival works.
             | 
             | This will stop being a big deal as the boomers die out and
             | Gen X reaches nursing-home age. My wife is a millennial.
             | She grew up with the internet and cameras -- the number of
             | photos she has that would make Gen Xers blush is insane,
             | and that's nothing compared to what the average Gen Zer
             | has.
        
             | pocket_cheese wrote:
             | Asking someone experiencing sexual trauma to 'let it be' is
             | such a dismissive take.
             | 
             | Being a victim of revenge porn means that you can be
             | blackmailed at any time. Most people do not want to be
             | sexualized, and it's humiliating to have to live with the
             | fact that your coworkers will now have ammo to harass you
             | if they discover your videos. Don't get me started on
             | what happens if you have kids, and their friends find out
             | you were in porn.
             | 
             | Realistically, you can't scrub the internet of a video. I
             | personally feel like the solution is that porn should be
             | highly regulated. A formalized content upload process,
             | licensing, required staff to deal with these types of
             | complaints. It should be impossible to upload a video
             | without a signed waiver from the participants. Sure, this
             | might remove pure anonymity, but you can still have
             | mechanisms in place to protect your identity if you want to
             | make amateur content without divulging your identity to the
             | public.
        
               | qzx_pierri wrote:
               | "Asking someone experiencing cyber bullying to 'let it
               | be' is such a dismissive take. Realistically, you can't
               | scrub the internet of cyber bullying. I personally feel
               | like the solution is that comments should be highly
               | regulated. A formalized posting process, licensing,
               | required staff to deal with these types of complaints. It
               | should be impossible to comment referencing someone else
               | without a signed waiver from the participants. Sure, this
               | might remove pure anonymity, but you can still have
               | mechanisms in place to protect your identity if you want
               | to post comments without divulging your identity to the
               | public."
               | 
               | Does this sound similar to any other regimes in recent
               | memory? Porn isn't the argument I'm responding to; it's
               | short-sighted band-aid solutions that require ADDITIONAL
               | regulation and government control. We all know these
               | systems of regulation bleed into other parts of the
               | internet. Are you willing to start this trend and put it
               | in the hands of someone you disagree with?
        
               | pocket_cheese wrote:
               | You are equating mean youtube comments to revenge porn.
               | This is the whole "We should just ban cars" response when
               | someone brings up gun control.
               | 
               | >It should be impossible to comment referencing someone
               | else without a signed waver from the participants
               | 
               | If the comment is a nude picture of myself and I didn't
               | give you permission to post it the law already agrees
               | that this should be illegal. All I'm asking is to make it
               | enforceable.
               | 
               | Porn is a shady industry rife with abuse and exploitation.
               | How do you know the person you are watching is over 18?
               | Let me answer that... You don't. Some industries should
               | be given more regulatory scrutiny than others.
        
             | AshamedCaptain wrote:
             | When I try to respond `without any emotion` to the question
             | `Do we want more "mass deletions" of content like Tumblr,
             | Pornhub, etc ?' , the answer that first comes up is `why
             | not?`.
             | 
             | The Spock couldn't care less about these websites -- they
             | are mostly entertainment, after all -- and on the other
             | hand the risk that they someday archive something that may
             | damage you is not nil. So there is one clear, cold, logical
             | conclusion.
        
             | AltruisticGapHN wrote:
             | I agree that she should move on, but the porn industry
             | also needs regulation; random people should not be able to
             | upload anything they wish at the click of a button.
             | 
             | While I agree with your sentiment, and we will miss the
             | early days, the internet was in its "teenager" phase:
             | wild, chaotic, fun, but not sustainable.
             | 
             | The fact that small sites have been swallowed by facebook &
             | co., the knowledge bubbles caused by "smart" search
             | engines, etc. -- those are another topic entirely.
        
               | qzx_pierri wrote:
               | "X needs more regulation"
               | 
               | This is not the internet that I remember from 2004, and
               | that was my only point. The internet feels too "safe"
               | now.
        
               | [deleted]
        
             | [deleted]
        
             | throwaway3699 wrote:
             | Takedowns are a perfectly valid thing to have for something
             | like pornography websites. It's not life-threatening if
             | content vanishes for a few hours, but it can be devastating
             | if it stays up long-term.
             | 
             | Another comment correctly pointed out that we would not
             | treat this as laissez-faire if it were a child, but we
             | should. There is a victim who has had harm done to them
             | during the creation of the video.
        
               | qzx_pierri wrote:
               | "Another comment correctly pointed out that we would not
               | treat this as laissez-faire if it were a child, but we
               | should. There is a victim who has had harm done to them
               | during the creation of the video."
               | 
               | 'Think of the children!' is such an overused moral
               | bulwark. The people in power who would administer the
               | regulation others in this thread are begging for are
               | using that same argument to defeat encryption.
        
           | Toutouxc wrote:
           | You should really reword that, because now it sounds like an
           | incredibly insensitive thing to say. She was a child when it
           | happened. Bad things hurt so much more when you're a child.
        
             | fastball wrote:
             | She was not a child when it happened, FWIW.
        
             | jhgb wrote:
             | > She was a child when it happened
             | 
             | I'm confused. The article is talking about her ex-husband.
             | Did she marry as a child? Or did some things happen to her
             | as a child and later her husband of all people got his
             | hands on some recordings? Or are you talking about some
             | other person than the woman in the article?
        
               | Toutouxc wrote:
               | I'm even more confused, I only saw a seemingly fullscreen
               | video, didn't know there was an article below.
        
               | jhgb wrote:
               | Well, _I_ only saw an article and didn't even know there
               | was a video. ;) I guess blocking JavaScript will do that.
        
           | 12ian34 wrote:
           | What a shitty attitude. Have you been raped on video and
           | had it posted online?
        
       | blfr wrote:
       | Pornography is bad for its consumers, bad for willing actors, and
       | obviously terrible for people featured without consent.
       | Tolerating it will be one of the great shames of our time.
        
         | vbezhenar wrote:
         | There are countries where porn is blocked. VPN software is very
         | popular in those countries.
        
         | GeoAtreides wrote:
         | I'm genuinely curious, how do you feel about erotica or written
         | smut? How about drawn porn?
        
         | sunsetMurk wrote:
         | Can you elaborate a bit more?
         | 
         | I'm very curious because I think the existence and availability
         | of porn and the sex industry is very important for the well-
         | being of lots of people. Definitely some problems.. but I'd
         | parallel those to problems with prohibition and suppression.
         | 
         | Have you always held this opinion? Are you religious?
        
         | nunez wrote:
         | I think pornography has also helped bolster sexual freedom to
         | new heights, which I think has been a good thing for humanity.
        
         | weswpg wrote:
         | Trying to decide for other people which parts of their
         | sexuality are "tolerable" or not is actually a far bigger
         | mistake, historically and presently.
        
         | pxue wrote:
         | Blaming porn is like blaming cheeseburgers for obesity.
         | You're not wrong, but it's hardly the root cause.
        
           | blfr wrote:
           | You're right, fast food is probably not the primary offender,
           | and there are many causes (HFCS, seed oils, not cooking,
           | abundance in general), and the future is overdetermined... but
           | fast food definitely carries some responsibility for the
           | obesity epidemic.
           | 
           | Same with porn and Pornhub. But it is the mental tobacco of
           | our time.
        
             | viraptor wrote:
             | > mental tobacco [of] our time
             | 
             | You may want to check how early porn (or sexualised
             | images) existed. "our time" may turn out to be "since we
             | discovered drawing".
        
               | blfr wrote:
               | I don't see your point. Tobacco use is prehistoric,
               | barely more recent than drawing.
        
               | viraptor wrote:
               | What was your intended meaning for "of our time" in that
               | case?
        
               | blfr wrote:
               | It was in relation to widespread use and availability of
               | pornography, like tobacco was in the 20th century. Or
               | like obesity is now.
               | 
               | We had obese people since at least the first settled
               | societies. But it was nothing like today, both in how
               | widespread and how extreme it has become.
               | 
               | Same with drugs. An opioid epidemic pushed by mega pharma
               | is nothing like a shaman serving ayahuasca.
               | 
               | A nude cave painting or a figurine is nothing like
               | thousands of hours of streaming porn. Barely related.
        
         | WindyLakeReturn wrote:
         | If the war on drugs has taught us anything, a war on porn is
         | going to lead to far worse porn being spread further than ever
         | before while innocent lives are ruined, much like how teenagers
         | are already being charged as adults for sexting while not
         | adults themselves. There are likely places where the law can be
         | improved, but a blanket ban would be a large step backwards
         | even if we ignore the freedom-of-speech implications.
        
           | blfr wrote:
           | In a world where entire sites are pulled over some silly Q
           | conspiracy theory, we can surely do more to combat porn.
           | We've largely dealt with smoking -- also addictive, also once
           | prevalent -- all while cigarettes are still available at
           | every gas station and grocery store.
        
             | IIAOPSW wrote:
             | >entire sites are pulled over some silly Q conspiracy
             | theory, we can surely do more to combat porn.
             | 
             | Remember that time the pornographers tried to overthrow the
             | US government? Me neither.
        
               | pvaldes wrote:
               | > Remember that time the pornographers tried to overthrow
               | the US government?
               | 
               | Like mass blackmailing top politicians with sexual videos
               | involving minors? Yep, I remember -- last week.
        
               | blfr wrote:
               | Trashing some offices is nowhere near overthrowing a
               | government.
               | 
               | And pornographers did much, much worse. They trafficked
               | underage women, misrepresented the contracts, routinely
               | provided drugs to dull their actors' senses, and engaged
               | in all kinds of underhanded or outright criminal conduct.
               | 
               | The guy who made (super mild by today's standards) Girls
               | Gone Wild videos has a whole Criminal section in his wiki
               | bio https://en.wikipedia.org/wiki/Joe_Francis#Criminal
               | 
               | Here's GirlsDoPorn
               | https://en.wikipedia.org/wiki/GirlsDoPorn#Legal_action
               | 
               | And these are the "legitimate" ones who can be sued.
        
               | watwut wrote:
               | The vote counting was target of the overthrow part.
        
               | ceejayoz wrote:
               | > Trashing some offices is nowhere near overthrowing a
               | government.
               | 
               | A comically bad attempt at murder is still attempted
               | murder.
        
               | blfr wrote:
               | I genuinely don't agree with this. Just like the FBI
               | providing an unfortunate loser with a 78 IQ with
               | explosives is not an attempted terrorist attack.
        
               | ceejayoz wrote:
               | Entrapment is an entirely different issue.
        
               | WindyLakeReturn wrote:
               | Those cases generally aren't entrapment. They get pretty
               | close, but they don't include the final push. The other
               | party is free to walk away without taking the bait.
               | Granted, I've only read the details on a few cases but in
               | the ones I read the FBI is clear to not cross the legal
               | boundary.
        
               | totalZero wrote:
               | Legally that's a fair argument, but there's an ethical
               | hazard in law enforcement catalyzing a crime that may not
               | otherwise occur, in order to bag a person who may not
               | otherwise be a criminal.
               | 
               | Some of those setups discriminate based on ethnicity,
               | such as those that target Islamic radicals and black
               | nationalists. In my mind, this further deepens the
               | ethical quandary.
               | 
               | Sometimes a solution in search of a problem is itself a
               | problem.
               | 
               |  _"...the FBI and Joint Terrorism Task Forces (JTTFs)
               | have approached multiple activists organizing for justice
               | for George Floyd--who was killed by Minneapolis police
               | officers--and have alternatively attempted to entrap them
               | or pushed them to work as informants. "_
               | 
               | http://hrlr.law.columbia.edu/hrlr-online/the-anatomy-of-
               | a-fe...
        
               | weswpg wrote:
               | Given the state of the war on drugs and the war on human
               | trafficking, do you think that there would be less drugs
               | and human trafficking if people were not allowed to watch
               | porn as you suggest?
               | 
               | I suspect that, because criminals tend to ignore the law
               | anyway, placing restrictions on pornography will
               | completely fail to reduce harm: bad people will continue
               | to do those things regardless of whether PornHub exists
               | or not.
               | 
               | If pornography were not legal, then production would move
               | underground and would probably involve even more harm.
               | 
               | Some might suggest that there needs to be heavier
               | regulation and more protection for the women involved but
               | banning porn would mean zero protection for the women and
               | an unregulated trade.
        
             | WindyLakeReturn wrote:
             | We can definitely do better, but you have to allow for
             | legal porn with consent to remain or else you will push
             | everything to an underground market where the end result
             | will be far fewer rules controlling content.
             | 
             | One thing that would be nice is an automated take down.
             | Anyone who no longer consents to their porn being hosted
             | (or who never consented) can have the selected files added
             | to a database and all porn sites would have to take down
             | based upon matches to this database. This technology
             | already exists with PhotoDNA for fighting known child porn
             | (though I think the technical details are kept secret to
             | avoid people finding workarounds).
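             | 
             | A minimal sketch of that takedown-database idea, using
             | exact SHA-256 matching as a hypothetical stand-in
             | (PhotoDNA's actual perceptual hashing, which survives
             | re-encoding and cropping, is not public):

```python
import hashlib

# Hypothetical shared takedown registry: hashes of files whose
# subjects never consented or have revoked consent.
takedown_hashes = set()

def register_takedown(file_bytes: bytes) -> str:
    """Add a file's hash to the shared takedown database."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    takedown_hashes.add(digest)
    return digest

def should_block(file_bytes: bytes) -> bool:
    """Check an upload against the database before hosting it."""
    return hashlib.sha256(file_bytes).hexdigest() in takedown_hashes

register_takedown(b"nonconsensual-video-bytes")
print(should_block(b"nonconsensual-video-bytes"))  # True
print(should_block(b"some-other-video-bytes"))     # False
```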
             | 
             | As long as one draws a line between consensual and
             | nonconsensual porn then I think you'll be able to crack
             | down on the non-consensual material without having to worry
             | about the failures of a 'war on x'.
             | 
             | Think of it like the difference between cracking down on
             | weed and cracking down on synthetic 'weed' that is killing
             | people. Or just look at stores that are able to sell
             | alcohol. Because it is generally allowed, specific bans are
             | much easier to enforce, because businesses like keeping
             | their legal status.
        
               | blfr wrote:
               | Alcohol and weed are not good examples because these are
               | failed interventions if you consider them harmful.
               | Consumption of both exploded in the last few decades.
               | 
               | This is why I use smoking: it's also legal or semi-legal,
               | it used to be prevalent but its popularity cratered.
        
               | PeterisP wrote:
                | "Consumption of both exploded in the last few decades."
                | -> I don't know about weed, but it's definitely not true
                | for alcohol, which is roughly stable; in the USA, per-
                | capita consumption has actually decreased somewhat since
                | a peak in the 1980s -- see
                | https://www.who.int/substance_abuse/publications/global_alco...
                | for example.
        
             | gsich wrote:
             | Why? Is porn bad per se?
        
               | totalZero wrote:
               | Philip Zimbardo, the psychologist behind the Stanford
               | Prison Experiment, seems to think so. He has written a
               | couple of books on the subject of deteriorating
               | development of young men, and has suggested a correlation
               | to the advent of high-speed internet.
               | 
               | Not sure if he's right, but it's certainly a reasonable
               | theory.
               | 
               | He also did a TED talk briefly touching on the subject.
               | 
               | https://www.ted.com/talks/philip_zimbardo_the_demise_of_g
               | uys
        
           | lamp987 wrote:
           | I bet you support gun control.
        
         | kjrose wrote:
         | Especially with environments where anyone can upload anything
         | without proof of consent. Essentially you have a system that
         | maximizes the production and distribution of this material,
         | consent be damned. However, even if pornhub et al. required
         | proof of consent for every video, it would still likely lead
         | to many situations where the actor was "willing" for bad
         | reasons, which will play out over time with an increasingly
         | negative effect.
         | 
         | As I get older, I increasingly expect a future generation to
         | look back at our era with quite a dark opinion.
        
           | jessaustin wrote:
           | "A future generation" will think poorly of our mass
           | incarceration and meat consumption, but they won't have our
           | body image hangups and literally every second of their lives
           | will be recorded from multiple angles so they won't care
           | about porn.
        
       | ChrisArchitect wrote:
       | 'normal' link: https://www.ctvnews.ca/w5/i-will-always-be-
       | someone-s-porn-on...
        
       | INTPenis wrote:
       | Not trying to belittle this woman's struggle but there are videos
       | online where people are being hurt real bad and those videos will
       | forever be someone's entertainment.
       | 
       | World star hip-hop, and other sites like it, have almost made a
       | business out of showing people being knocked out, kicked, punched
       | and assaulted.
       | 
       | That's what the internet is. A global network that spreads
       | information at light speed.
       | 
       | So I don't think attacking Pornhub specifically is the right
       | thing to do here.
       | 
       | That sort of smells of someone trying to make waves by going
       | after one of the more established players in the internet porn
       | business.
       | 
       | What happened was awful but it has nothing to do with Pornhub.
       | They're doing their best to police a giant platform that everyone
       | in the world wants to use and abuse. They're not alone in this
       | challenge.
        
         | elliekelly wrote:
         | > They're doing their best to police a giant platform that
         | everyone in the world wants to use and abuse.
         | 
         | It doesn't really sound like they're doing their best though,
         | when they offer an aggressive content take-down service to
         | their paying customers but not to victims of exploitation from
         | whom they've (knowingly or unknowingly) profited.
        
           | INTPenis wrote:
           | Sure but put yourself in their position for a minute.
           | 
           | How does pornhub even receive notice of this video being
           | posted? How many others contact ph through this channel? How
           | many of these cases are bogus and lead nowhere?
           | 
           | Remember that this is the internet. If you open up any
           | communication channel to your massive website you will be
           | flooded with junk.
           | 
           | So just to maintain a communication channel with the outside
           | world is an entire project in itself. Probably requires its
           | own manager and employees working full time with nothing but
           | handling cases.
           | 
           | And despite all this, PH did respond in this case; they even
           | tried sending takedown requests to OTHER SITES.
           | 
           | Imo they did truly do their best.
           | 
           | But the problem goes beyond pornhub. It's an internet
           | problem. There is no simple resolution to this problem,
           | unless you want to lock down the entire internet.
           | 
           | And despite all these difficulties the stories posted still
           | mention Pornhub as the problem.
           | 
           | Pornhub is not the problem here.
           | 
           | But I would not be surprised if Pornhub comes up with a
           | solution. If Coinbase can verify your identity to open an
           | account with them, then surely Pornhub can do the same for
           | uploaders.
        
             | symlinkk wrote:
             | They already have a program that verifies the identity of
             | uploaders called Verified Amateurs, and last year after the
             | NYT published their hit piece and Visa stopped processing
             | payments to them, they removed all amateur videos that
             | weren't Verified Amateurs, which was most of them.
             | 
             | What's sad is that you're having to read this from me
             | instead of from the original article linked above, which
             | someone with a college degree in journalism was paid to
             | write.
        
         | [deleted]
        
         | ceejayoz wrote:
         | The article pretty clearly indicates that they're not "doing
         | their best":
         | 
         | > Kevin responded again, insisting that Pornhub "can NOT"
         | remove content from other sites. However, that doesn't seem to
         | be completely accurate. Pornhub offers something called its
         | "exclusive model program," which promises that it will send
         | takedown notices to any website to "help protect your content
         | from being uploaded to other websites."
         | 
         | The logical step here would seem to be to extend that takedown
         | program to victims as well as their models.
        
           | weswpg wrote:
           | While I agree with your suggestion, the article notes that PH
           | did request removals for her and still concludes by
           | attributing blame to PH for the video being newly uploaded
           | elsewhere (despite the fact that her ex-husband likely has a
           | copy which he may have uploaded again).
        
         | WindyLakeReturn wrote:
         | I don't see why non-consensual gore isn't treated the same as
         | child porn.
         | 
         | Both are done without consent.
         | 
         | Both require someone to be hurt to be created.
         | 
         | Both either have a victim who is dead or one who is harmed by
         | the continued spread of the video for entertainment purposes.
         | 
         | Both cross the threshold for obscenity.
         | 
         | Political and historical exceptions would still apply, just
         | like the photo of Phan Thi Kim Phuc fleeing a napalm attack is
         | legal because of its significant political and historical
         | importance, despite being a literal picture of a naked child
         | being harmed.
        
           | MattGaiser wrote:
           | Is a lot of gore footage created for the sake of selling
           | videos? Allowing child porn has the consequence of
           | incentivizing more to be produced. Is anyone producing gore
           | videos in any quantity?
        
             | WindyLakeReturn wrote:
             | The incentivizing argument seems to be a red herring
             | because in no universe would we legalize some subset of
             | child porn that is shown to not incentivize more being
             | produced, no matter how clearly such a case was shown.
        
               | retrac wrote:
               | Drawn and computer generated images of that kind are
               | legal under the First Amendment in the USA. I mention it
               | because they are, in contrast, illegal in Canada.
        
               | claudiawerner wrote:
               | The other argument is that it inflames and encourages
               | desire to assault children in a significant subset (in
               | the sense of risk; i.e. the population doesn't have to be
               | large, only the risk) of those who consume it, and that
               | it does so in a unique way, compared to other forms of
                | media. A further argument is that it's a particularly
                | grave violation of the child's privacy, one they cannot
                | consent to.
               | 
               | Alternatively, we could just bite the bullet and conclude
               | (perhaps rightly) that maybe porn _in general_ has the
                | same negative effects we allege CP to have. I'm not sure
               | if that's true, but if it is, then I think it would make
               | a good case for banning it.
        
         | [deleted]
        
         | thereare5lights wrote:
         | > They're doing their best to police a giant platform that
         | everyone in the world wants to use and abuse.
         | 
         | There are many citations to the contrary.
         | 
         | This is just one
         | 
         | https://www.bbc.com/news/stories-51391981
         | 
         | They didn't just go and delete millions of videos for the fun
         | of it.
         | 
         | It's because they let their platform turn into a cesspool of
         | illegal content.
        
         | anigbrowl wrote:
         | If you have serious points to make about a porn issue, perhaps
         | you should use a throwaway account rather than 'INTPenis'. A
         | joke that's mildly amusing in other contexts seems tasteless
         | when it shows up in a discussion of sexual assault, and that's
         | probably outweighing the substance of your argument.
        
         | luckylion wrote:
         | > They're doing their best to police a giant platform that
         | everyone in the world wants to use and abuse.
         | 
         | [citation needed]
        
         | [deleted]
        
         | [deleted]
        
       | luckylion wrote:
       | The whole "we're just a platform" thing makes it incredibly easy
       | for Pornhub & co to just say "wasn't us, it was some user" and
       | not even handle the deletion process on their affiliates' sites
       | (who they provided the video + images to). It's like Megaupload,
       | which was obviously made for copyright infringement but
       | successfully hid behind the platform-excuse for years.
       | 
       | I'm not a friend of far-reaching regulation, but it seems that we
       | don't have sufficient processes in place to deal with these
       | companies unless there's gigantic financial interest involved
       | that makes the state feel motivated to intervene.
        
         | darkerside wrote:
         | You can say the same of Google in this instance
        
         | yummypaint wrote:
         | I would argue the copyright argument goes the other way. If
         | you're a big vertically integrated corporation that has
         | greased the right cogs, it's incredibly easy to get content
         | taken down wholesale without any oversight or accountability.
         | Whether it's having content removed or restored, the burden
         | somehow always falls on the individual.
        
       | mensetmanusman wrote:
       | Future prediction: this would be a good application of something
       | like an Ethereum NFT.
       | 
       | Verified individuals could license their content, and it would
       | be illegal to host anything unverified.
       | 
       | This would make it harder to widely distribute illegal content
       | and abuse. (It will never completely remove it, but it raises
       | the bar, just like any law.)
        
         | perlgeek wrote:
         | What would prevent a bad actor from getting verified, and then
         | uploading material that they don't have consent/license for?
         | 
         | What would prevent anybody from claiming they were the "talent"
         | in the material and giving consent?
         | 
         | Where would the "verified individuals" and their verification
         | even come from?
        
           | mensetmanusman wrote:
           | The same thing we do to try to limit fraud in the banking
           | system, use that.
           | 
           | We aren't going for perfect fraud protection, because that's
           | impossible. But just raising the bar would be useful.
        
             | stale2002 wrote:
             | But that has nothing to do with crypto.
             | 
             | I agree that it is possible to use the law to take down
             | infringing content. But cryptocurrency and NFTs do nothing
             | to help that.
        
               | mensetmanusman wrote:
               | It does when you automate takedowns, because it is just a
               | verified lookup table.
        
         | folli wrote:
         | Hosting illegal content is already illegal. How would NFTs
         | solve this?
        
           | anigbrowl wrote:
           | If every uploaded video were assigned a token, and one
           | turned out to be illegal, any matches against the original
           | token would trigger an auto-deletion.
        
             | mensetmanusman wrote:
             | This. The AI police of the future that automatically take
             | down copyrighted information will just check against the
             | chain.
        
           | datavirtue wrote:
           | The monetary barrier to attaining the NFT. I doubt her
           | husband would have uploaded it if he had to pay $100-200 to
           | do so. In general this is not a practical solution.
        
         | BuyMyBitcoins wrote:
         | Most NFTs are just a link to the content, or a hash of the
         | content. They're not literally storing the data.
        
           | mensetmanusman wrote:
           | One could set up a system that requires verified linkage.
        
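       [Ed.: The token-lookup scheme debated in this subthread boils down
       to checking each upload against a shared registry of known-bad
       fingerprints. A minimal sketch of that matching step, assuming
       exact cryptographic hashes and a hypothetical in-memory blocklist;
       real systems such as PhotoDNA or Content ID use perceptual hashes
       so that re-encoded copies still match:]

```python
import hashlib

# Hypothetical registry of hashes of videos already ruled illegal.
# In the proposal above, this lookup table would live on-chain.
BLOCKLIST = {
    # sha256 of the bytes b"test", standing in for a flagged video
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Compute the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_reject(upload: bytes) -> bool:
    """Return True if the upload matches a blocklisted fingerprint."""
    return sha256_of(upload) in BLOCKLIST
```

       [Ed.: Exact hashing is trivially defeated by re-encoding the file,
       which is why production matching relies on perceptual
       fingerprints; a blockchain here would serve only as the shared,
       append-only registry, not as the matching mechanism itself.]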
       ___________________________________________________________________
       (page generated 2021-04-05 23:01 UTC)