[HN Gopher] DeepCreamPy- Decensoring Hentai with Deep Neural Net...
       ___________________________________________________________________
        
       DeepCreamPy- Decensoring Hentai with Deep Neural Networks (2018)
        
       Author : acqbu
       Score  : 242 points
       Date   : 2021-12-30 13:46 UTC (9 hours ago)
        
 (HTM) web link (github.com)
 (TXT) w3m dump (github.com)
        
       | ramesh31 wrote:
        
         | spurtrenolds wrote:
         | What a ridiculous comment
         | 
          | So much of our technology has been built or adapted to
          | enhance and/or fulfil our desires, including sexual desires,
          | that it is puzzling to me why you think women are somehow
          | excluded from this. Your post smacks of a modern puritanism.
        
           | ramesh31 wrote:
            | > So much of our technology has been built or adapted to
            | enhance and/or fulfil our desires, including sexual
            | desires, that it is puzzling to me why you think women are
            | somehow excluded from this. Your post smacks of a modern
            | puritanism.
           | 
           | It's a question of venue. There's nothing wrong with the tech
           | itself. Don't bring this up on HN. We have enough trouble
            | fighting the stereotype of the misogynistic porn-watching
            | computer nerd as it is. Women are overwhelmingly offended
            | by pornography (particularly hentai, which is very commonly
            | focused on incest and pedophilia), especially in a setting
            | that is supposedly professional. Of course there are always
            | exceptions to any rule. It just really bothers me to see
            | the casual locker-room conversation here that is so toxic
            | to having an inclusive community.
        
             | throwaway425260 wrote:
             | >Don't bring this up on HN.
             | 
              | What? Why? This is interesting. As a woman, the _only_
              | thing that offends me in "tech" is people who don't treat
              | me as a person, but rather as a "woman in tech." I am a
              | _person._ I am just as good as you. I don't need some
              | strong man to protect me from the big, scary world of the
              | impolite hentai-masturbating nerds. Throwaway since I
              | want to tell you to go fuck yourself. Yes, even women
              | know those words, thank you very much.
        
               | ramesh31 wrote:
                | It's not just a question of male/female, but of
                | inclusiveness. As a minority, I see the same thing. The
                | problem with tech is the white/Asian male-dominated
                | frat-house culture that pervades it from college all
                | the way to large companies. And anything that
                | perpetuates that is unwelcome in a professional
                | environment, in my opinion.
        
             | smegsicle wrote:
             | > Women are overwhelmingly offended by pornography
             | 
              | This can be changed - it's merely cultural, right?
             | 
             | Bigots of either sex need to get over pornophobia for
             | culture to progress.
        
             | quincy_holden wrote:
             | How was it possible that you typed all this out, read it
             | over, and still didn't see the irony in what you were
             | saying?
             | 
             | We are absolutely incredibly offended by porn. So thank you
             | for speaking out to protect us women! I see that another
             | woman has already told you to fuck off, so I'll spare you
             | that.
        
         | yoyohello13 wrote:
         | Plenty of women enjoy Hentai.
        
           | [deleted]
        
         | mdni007 wrote:
         | What a strange thing to say. Do you even know any women? Women
         | watch porn and hentai too.
        
         | wiseowise wrote:
         | Oh yeah, because only men watch porn and hentai.
        
           | ramesh31 wrote:
           | >Oh yeah, because only men watch porn and hentai.
           | 
           | Nope. But what is the breakdown, 70/30 at best? There's
           | nothing wrong with watching porn. It's just that having a
           | casual "frat house" culture like this where people feel fine
           | sharing these things in a supposedly professional environment
           | inevitably leads to women being ostracized or alienated. The
           | majority who disagree get labelled as boring or buzzkills and
           | are excluded from the group, while the minority who go along
           | are held up as examples to make them feel bad (we're not
           | misogynistic! look!) like you pointed out.
        
             | 0x0nyandesu wrote:
             | You know what actually drove my fiance out of tech? Sitting
             | in front of a computer all day till her health, looks, and
             | worth as a women became threatened because instead of being
             | fit, taking care of her body, and practicing self care she
             | would instead have to do what most men do and just sit all
             | day.
             | 
             | She has made commits to the Linux kernel that have been
             | merged and she loves hentai and porn but is no longer in
             | tech because she literally makes more money being a cam
             | girl and it's way more fun.
             | 
             | Tldr: stop trying to white knight. It's neither warranted
             | nor welcome.
        
               | ramesh31 wrote:
               | >She has made commits to the Linux kernel that have been
               | merged and she loves hentai and porn but is no longer in
               | tech because she literally makes more money being a cam
               | girl and it's way more fun.
               | 
                | I'd argue this proves my point, though. Your fiancée is
                | someone who conformed to the existing male-dominated
                | culture and succeeded despite the structural roadblocks
                | in place. But the other 50% of women who may have
                | otherwise made a career in this industry were driven out.
        
               | 0x0nyandesu wrote:
               | > But the other 50% of women who may have otherwise made
               | a career in this industry were driven out.
               | 
               | [Citation Needed]
        
               | ramesh31 wrote:
               | My point being that there's no reason engineering teams
               | shouldn't be 50/50 male/female, but the reality is more
               | like 90/10. Why does the other 40% never show up despite
               | our best efforts? It's cultural.
        
               | wiseowise wrote:
               | Maybe because they don't apply?
        
               | 0x0nyandesu wrote:
               | It's wild seeing nerds on HN bend over backwards to
               | justify themselves when the truth is that fewer women
               | want to do the job. Not because they aren't capable but
               | because they just don't want to.
        
       | jptech wrote:
        | Not the latest on the censorship front: a couple of months ago
        | someone was arrested for decensoring porn and putting it on
        | their website.[1] With deepfakes and all the latest GANs,
        | anyone can probably do it with acceptable quality. It's just
        | that the sale and distribution of such material will remain
        | illegal in Japan for the foreseeable future.
       | 
       | [1] https://www.vice.com/en/article/xgdq87/deepfakes-japan-
       | arres...
        
       | [deleted]
        
       | swayvil wrote:
       | Doesn't work with nipples or anuses.
       | 
       | Is it just a matter of what the NN is trained for?
        
       | cs702 wrote:
       | A new front in the long-running battle between censors and those
       | who oppose censorship.
       | 
       | The use of deep neural networks for this purpose seems
       | inevitable, in hindsight.
       | 
       | We live in interesting times.
        
         | marderfarker2 wrote:
          | How do you recover information that is not there? The best
          | you can do is guess what's supposed to be there.
          | 
          | That's why I hate watching 4K AI "enhanced" historical films;
          | it is akin to rewriting history.
        
           | mayapugai wrote:
           | Long story short, these algorithms aren't recovering
           | information that was lost. They reconstruct the images by
            | guessing what should've been there. So no
            | information-theoretic limit is being violated.
           | 
           | But this means the guesses can be incorrect, although the
           | likelihood of that happening can be greatly reduced with good
           | training data.
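            | 
            | For a rough sense of the "guessing" step, here is a minimal
            | Python sketch using OpenCV's classical inpainting rather
            | than a neural network; the file names are placeholders and
            | this is not the project's code:
            | 
            |     import cv2
            | 
            |     # Placeholder file names, for illustration only.
            |     img = cv2.imread("damaged.png")
            |     # Mask is white (255) wherever pixels are missing.
            |     mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)
            | 
            |     # Classical inpainting fills the masked area from the
            |     # surrounding pixels; a neural model does the same
            |     # "guess what belongs here" step with a learned prior.
            |     restored = cv2.inpaint(img, mask, 3, cv2.INPAINT_TELEA)
            |     cv2.imwrite("restored.png", restored)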
        
           | feanaro wrote:
           | I never heard of this before. What kind of enhancing does the
           | AI do?
        
             | [deleted]
        
             | Jhsto wrote:
             | It tries to predict patterns from blurry shots. It might
             | introduce artifacts that were never there: some drawings I
             | upscaled turned distant woods into houses. You can see why
             | this might be bad when viewing something for historical
             | accuracy.
             | 
             | See e.g., https://www.topazlabs.com and
             | https://github.com/nagadomi/waifu2x
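              | 
              | For a concrete example of this kind of upscaler (not
              | waifu2x or Topaz specifically), here is a minimal Python
              | sketch using OpenCV's dnn_superres module. It assumes
              | opencv-contrib-python is installed and that a pretrained
              | ESPCN model file has been downloaded separately; the file
              | names are placeholders:
              | 
              |     import cv2
              | 
              |     sr = cv2.dnn_superres.DnnSuperResImpl_create()
              |     # Pretrained weights, downloaded separately.
              |     sr.readModel("ESPCN_x4.pb")
              |     sr.setModel("espcn", 4)  # model name, scale factor
              | 
              |     img = cv2.imread("blurry.png")
              |     # Learned 4x upscale; it may hallucinate detail that
              |     # was never in the original, as described above.
              |     out = sr.upsample(img)
              |     cv2.imwrite("upscaled.png", out)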
        
       | gaze wrote:
       | Imagine putting this on your resume.
        
         | delusional wrote:
         | I think the tone of the interview is going to depend heavily on
         | the first impression. A well rounded person with good personal
         | hygiene would make this seem like a fun and weird project. If
         | the person is a total misfit neckbeard, it's going to be less
         | fun and quirky, and more worrying.
        
         | vmception wrote:
         | Then you immediately pass the culture fit test
        
         | DaftDank wrote:
         | Plot twist: He's applying at Pornhub next, and this gets him
         | instantly hired.
        
           | luckman212 wrote:
           | Out of curiosity, how many Pornhub'ers are also here on HN?
           | (I mean employees, not users haha)
        
             | amitport wrote:
              | The company is MindGeek and AFAICT they outsource (or use
              | subcontractors / off-the-shelf tools) for most of their
              | operations.
        
               | smoe wrote:
                | They do have a lot of open positions in SRE and
                | development; from the job descriptions it doesn't sound
                | like just outsourcing.
                | 
                | https://www.mindgeek.com/careers/
                | 
                | They do a pretty good job of never mentioning anything
                | related to porn, so I guess in most cases it won't be a
                | problem on someone's CV.
        
             | Debug_Overload wrote:
             | I remember the AMA one of their devs did on Reddit a few
             | years ago, it was interesting.
        
             | marnett wrote:
             | I believe the company is referred to as MindGeek.
        
       | politelemon wrote:
        | > The user colors censored regions green in an image editing
        | program like GIMP or Photoshop. A neural network fills in the
        | censored regions.
        | 
        | This will have limited use, as it would require highly motivated
        | individuals with enough time to go through their collections.
        | 
        | Speculation: it also probably won't work with alien/creature
        | hentai of the green-skinned variety.
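        | 
        | The green coloring is essentially a hand-drawn mask. As an
        | illustration of that first step only (not the project's code,
        | and assuming pure green (0, 255, 0) is used), a minimal Python
        | sketch that turns the painted regions into a binary mask an
        | inpainting model could then fill:
        | 
        |     import cv2
        |     import numpy as np
        | 
        |     img = cv2.imread("colored_green.png")  # BGR image
        |     green = np.array([0, 255, 0], dtype=np.uint8)
        |     # 255 wherever a pixel is exactly pure green.
        |     mask = cv2.inRange(img, green, green)
        |     print("censored pixels:", int(np.count_nonzero(mask)))
        | 
        | Which is also why green-skinned characters could confuse it:
        | any pixel that happens to be exactly that green lands in the
        | mask.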
        
       | fork-bomber wrote:
       | 'Vagina' is not anatomically correct in the context of the
       | description within the github text. 'Vulva' would be more
       | appropriate. Vagina is the internal passage onwards from the
       | vulva and terminating at the cervix. The vagina wouldn't be
       | visible under normal circumstances, the vulva would.
        
         | unbanned wrote:
          | You significantly underestimate the sexual depravity of your
          | average deep learning enthusiast. The vagina may very well
          | be visible.
        
           | fork-bomber wrote:
            | Possibly. But surely not in the average case.
        
       | vmception wrote:
       | Only works with color images, so that's basically limited to
        | fanart and explicit covers, and not the actual manga pages
       | where the censoring appears
       | 
       | Very limited utility
       | 
       | Love the pun in the title
        
         | Mumps wrote:
         | > very limited utility
         | 
         | Easily one of the funniest things I've read on HN; as if
         | decensoring _any_ hentai isn't already a hilariously limited
         | use case.
        
       | rg111 wrote:
        | A man was very recently arrested for decensoring pixelated
        | Japanese porn with Deep Learning ("AI").
       | 
       | Man Arrested for Uncensoring Japanese Porn With AI in First
       | Deepfake Case - Vice
       | (https://www.vice.com/en/article/xgdq87/deepfakes-japan-arres...)
       | 
       | Note: This is not hentai, but real-life porn.
       | 
        | Whenever I see news like this, I feel like there is a lot of
        | work that can be done in Porn with AI. There are several
        | low-hanging fruits that could be picked very easily. I don't
        | know why no one is working on them.
       | 
       | I guess it would be difficult for people to say that they do AI
       | at PornHub, and it will be hard for people to find work
       | afterwards. Not to mention the ethical implications even when the
       | humans involved fully agree to everything.
        
         | bayindirh wrote:
          | > I feel like there is a lot of work that can be done in Porn
          | with AI.
         | 
         | Well, one of the de-facto test images for image processing
         | belongs to a Playboy model [0], so it looks like a natural
         | evolution in the field.
         | 
         | Actually it's amazing how image processing revolves around
         | sexual imagery.
         | 
         | [0]: https://en.wikipedia.org/wiki/Lenna
        
           | taylorius wrote:
            | Lenna is a great image; they shouldn't retire it. It's only
            | her head and shoulders after all, and she looks great. They
            | should get a good-looking male photo too, to balance things
            | out - use them both for testing.
        
         | netcan wrote:
         | >> a lot of work that can be done in Porn with AI.
         | 
          | I think that "can be" is probably an "is being."
          | 
          | Porn generally has a historical tendency to be a pioneering
          | adopter of media technology. A substantial chunk of early
          | commercial photography was pornographic. Early films. Early
          | e-commerce. Streaming video. Porn is currently the most
          | developed/active niche in VR filmmaking. I would even consider
          | Renaissance portraiture part of the pattern.
          | 
          | In any case, most of the "deepfake" content that currently
          | exists, beyond demonstrations, is pornographic.
        
           | Barrin92 wrote:
            | Ads and payment as well. One of the first uses of real-time
            | online credit card transactions was payment for the Pamela
            | Anderson / Tommy Lee sex tape.
        
           | rg111 wrote:
           | I also think the same.
           | 
            | Many things are possible with the AI technology that we have
            | _today, right at this very moment_, including some truly
            | wild things. Some of it would be illegal, some of it
            | unethical. Good things are possible, too.
            | 
            | There was a repo on GitHub that contained the JDK
            | environment variables (IIRC) to identify and box Uighurs in
            | CCTV footage. The news made headlines.
            | 
            | It getting out was probably not intended, I guess. So
            | imagine what other wild things are out there in non-public
            | repos.
            | 
            | "AI" is a very bad name, and it is often seen as synonymous
            | with AGI. People are either afraid of it or excited about
            | it, because much of the public-facing content is focused on
            | beyond-human AI. I guess this is because it makes it easier
            | to get funding, and there are private propaganda machines
            | running.
            | 
            | But there is so much that can be done today. Right now.
            | Good things, too. Where there are people willing to do it,
            | and people willing to fund it, things are getting done.
            | 
            | AI weaponry is among these. I believe many things are being
            | done that are far away from public view.
           | 
           | Let me give you an example.
           | 
           | If there is a drone, I guess, it is not hard for mechanical
           | engineers to rig a machine gun to it. Or with $$$ of defence
           | funding, maybe a drone _designed_ to host a machine gun. Then
           | all you need is basically a cheap webcam and a Raspberry Pi
            | to detect any motion, store the coordinates, and aim and
            | trigger the gun. When you have an enemy line, you deploy
            | the drone behind it and are done with it.
           | 
           | I am sure multiple countries have this _today_. And this
           | example is among the lowest-hanging fruits of AI weaponry.
        
         | warning26 wrote:
         | I believe the issue was not that he was de-censoring per se, it
         | was that he was reselling copyrighted works. That is, he could
         | have been arrested for the same crime even if he was reselling
         | them with no changes.
        
         | hwers wrote:
          | Despite the headline, the actual article admits that it
          | wasn't really the uncensoring he got arrested for, but
          | copyright violation (as well as, basically, distributing
          | porn in a country where uncensored porn is illegal).
        
           | [deleted]
        
           | rg111 wrote:
            | Copyright law is a favorite instrument for preventing stuff
            | that you do not want to happen.
            | 
            | In one country, making reaction videos or analysing news
            | with clips from proprietary news channels is very common.
            | 
            | Then came a guy who used news clips to create Before/After
            | clips of politicians of a certain party. In these clips, the
            | same person would be saying completely opposite things - on
            | camera, and publicly - some years apart. These videos went
            | viral. The party was in power. They forced the news channels
            | to file copyright claims with YouTube, and as the claims
            | were legitimate, YT did take down those videos.
            | 
            | So, yes, when they say that he was arrested over copyright
            | claims, I don't trust that.
        
         | zionic wrote:
         | >Porn and AI
         | 
         | This comment reminds me of a disturbing one I read a few months
         | back on an anonymous underwater basket weaving forum.
         | 
          | The poster was discussing those expensive "real doll" (?) sex
          | robots that creep me out. Anyway, he went on to mention that
          | he discovered the app FaceApp could take these things from
          | "uncanny valley" to essentially photorealistic. Makes sense
          | given what the app is supposed to do, but the real mind fuck
          | was that he said it also works on underage dolls... which,
          | again, I was surprised to learn are apparently legal in most
          | places.
          | 
          | So AI/ML can take an inanimate object and create a highly
          | illegal, fool-most-humans-level fake image from said
          | inanimate object. Seems pretty scary to me, and we might not
          | be far from an ML model that can essentially produce endless
          | amounts of illegal content.
          | 
          | What are the implications of that? How do you even begin to
          | police/filter that? Then I guess there's the question of if
          | we even should, since none of it's real anyways.
         | 
         | Porn and AI is gonna get real messy.
        
           | tomerv wrote:
           | Not sure if I want to get into this discussion... but is
           | underage porn still illegal when it's not technically
           | depicting a real person? And is it moral? I could see it
           | going either way (no child got hurt in the process / but it
           | may push someone with pedophile tendencies towards acting on
           | them / but it could also provide a safe outlet for someone
           | with those tendencies).
        
             | GistNoesis wrote:
              | It reminds me of the children's rights campaign Sweetie
             | 
             | https://en.wikipedia.org/wiki/Sweetie_(internet_avatar)
             | 
             | https://vimeo.com/86895084
             | 
              | where an org lured predators by posing as an animated
              | 10-year-old.
        
             | roomey wrote:
             | It's illegal in some places, like Ireland for example.
        
             | solarmist wrote:
              | I think it is illegal in the USA, but it is explicitly
              | not illegal in Japan.
              | 
              | Japan has a pretty bad reputation in the international
              | community for defining its child porn laws very narrowly.
              | 
              | For these reasons Japan has a very high level of child
              | pornography activity, but I know very little beyond that.
        
             | ta8902 wrote:
        
             | netcan wrote:
             | It's complicated, in that this depends on jurisdiction. In
             | some places, the age of the character matters, not just the
             | actor's age. Regardless of legality, I think this is very
              | prominent in self-censorship. A lot of filmmakers
              | wouldn't depict a nude 15-year-old sexually, regardless
              | of the actor's actual age.
        
             | zionic wrote:
             | I haven't researched it heavily, but AFAIK the gov can't
             | really criminalize art/drawings etc.
             | 
             | If it doesn't depict a real person, then what's the crime
             | exactly? It's still disgusting, but that's another topic.
        
               | dymk wrote:
               | Well sure they can, although it depends what government
               | you're talking about. Porn laws in Japan are weird, and
               | the government does require that distributed pornography
               | (including drawings) be censored.
        
             | rg111 wrote:
             | > but is underage porn still illegal when it's not
             | technically depicting a real person?
             | 
              | If we look at existing laws, it should be. Depicting an
              | underage person _even when the actor is an adult_ is
              | illegal, I think, in every country. So an adult actress
              | _portraying_ a minor is illegal. So why shouldn't pixels
              | depicting young girls be illegal too?
              | 
              | The pixels are just the conduit, just like the adult
              | actress depicting a minor in a sexual act.
              | 
              | So it should be.
              | 
              | But whether it is moral, I don't know. I don't know
              | enough psychology, public policy, etc. to express an
              | informed opinion.
             | 
             | But my personal choice? Yes. It should be illegal.
             | Hypothetically, if it ever comes to a vote, I would vote
             | against child porn even when there is no real human
             | involved.
             | 
             | That's why child porn in animations is banned as well.
        
               | bitwize wrote:
               | > If we look at existing laws, it should be. Depicting an
               | underage person even when the actor is an adult is
                | illegal, I think, in every country. So an adult actress
                | portraying a minor is illegal. So why shouldn't pixels
                | depicting young girls be illegal too?
               | 
               | Due to that pesky First Amendment, this is not illegal in
               | the USA. The USA bans child pornography under the rubric
               | that producing or even viewing it implicates the
               | perpetrator in the abuse of a child. If no actual child
               | abuse is taking place, they can't ban it.
               | 
               | Cartoon images and the like may be banned if they depict
               | an actual minor.
        
             | oriki wrote:
             | In the US in particular, it doesn't matter if the underaged
             | subject exists or not, but I think you'll find the law much
             | more loosely enforced in cases where the subject of the
             | pictures doesn't exist.
        
               | zionic wrote:
               | > it doesn't matter if the underaged subject exists or
               | not
               | 
               | Are you sure this is true? As weird as it is it seems
               | pretty ridiculous to suggest that someone drawing a
               | sketch could be guilty of the same thing as someone
               | abusing a real child.
        
               | sterlind wrote:
               | They're both illegal under the Child Pornography
               | Prevention Act of 1996:
               | 
               | > _The Child Pornography Prevention Act added two
               | categories of speech to the definition of child
               | pornography. The first prohibited "any visual depiction,
               | including any photograph, film, video, picture, or
               | computer or computer-generated image or picture" that
               | "is, or appears to be, of a minor engaging in sexually
               | explicit conduct." In Ashcroft case, the Court observed
               | that this provision "captures a range of depictions,
               | sometimes called 'virtual child pornography,' which
               | include computer-generated images, as well as images
               | produced by more traditional means."_
               | 
               | > _The second prohibited "any sexually explicit image
               | that was advertised, promoted, presented, described, or
               | distributed in such a manner that conveys the impression
               | it depicts a minor engaging in sexually explicit
               | conduct."_
               | 
               | (From https://en.wikipedia.org/wiki/Child_Pornography_Pre
               | vention_A... )
               | 
               | "Computer-generated" images was a remarkably prescient
               | thing to include in 1996, since it criminalizes deepfakes
               | and GAN images.
               | 
               | I think the standard argument is that even though actual
               | minors aren't harmed by the creation of such images, the
               | images trigger pedophiles to become more depraved or
               | something. There's probably an element of the old
               | obscenity laws to it - real or not, child porn is
               | obscene, and obscenity isn't covered by the first
               | amendment without some artistic merit.
               | 
               | I'm not sure of case law around enforcement of this.
        
               | zionic wrote:
               | Interesting, that seems to strictly criminalize cartoons
               | then. I wonder if that's ever been enforced or
               | challenged.
               | 
               | Not that anyone would ever put their name behind
               | attacking this.
        
               | CheezeIt wrote:
               | As the Wikipedia page states, it was struck down by the
               | Supreme Court in Ashcroft v. Free Speech Coalition in
               | 2002.
        
             | ta8902 wrote:
        
             | lr4444lr wrote:
             | It probably depends on whether they can prove the
             | individual was soliciting for the real stuff. Same legal
             | reason for why you're still committing a crime when you
             | fall for a sting operation with an actual adult.
        
           | rg111 wrote:
           | > we might not be far from an ML model that can essentially
           | produce endless amounts of illegal content
           | 
           | Like this [0] one?
           | 
            | Give me enough money and I can get you one that's a lot
            | better. But I don't want to work for a porn company - not
            | because of the perceived immorality of porn, but because of
            | the horrible things that come _with_ porn. And I am well
            | past the teenage years when I would have done this with my
            | own time and money.
           | 
           | [0]: https://thisvaginadoesnotexist.com/about.html
        
           | MomoXenosaga wrote:
           | Feminists are against it because they are afraid it could
            | replace real women. I have to admit that, considering the
            | general shittiness of men, it is not beyond the realm of
            | possibility that one day we might prefer the companionship
            | of robots. I read Asimov when I was a teen.
        
             | na85 wrote:
             | >Feminists are against it because they are afraid it could
             | replace real women.
             | 
              | Do you _really_ think that's why feminists are against it?
        
           | cbozeman wrote:
           | > Then I guess there's the question of if we even should,
           | since none of it's real anyways.
           | 
           | The idea of "policing" something that isn't even real should
           | be seen as just patently ridiculous to any thinking person.
           | 
           | What comes after that? People love to shit all over the
           | "slippery slope" and call it a "fallacy" even though there's
           | a _clear_ historical record of how the smallest of openings
           | for those in power eventually become handholds, then
            | footholds, until finally, you're living under a tyrannical
           | system.
        
             | dijonman2 wrote:
             | > those in power eventually become handholds, then
             | footholds, until finally, you're living under a tyrannical
             | system.
             | 
             | I immediately thought about the current US political
             | landscape. I believe this is exactly what we're seeing with
             | respect to radicalization.
        
             | alex_anglin wrote:
              | I would argue that not allowing the creation and
              | distribution of child pornography, even when it is
              | fictitious, is a reasonable limit on whichever freedom
              | may be cited.
        
               | somebodythere wrote:
               | That limit is probably where real harm was caused to a
               | real child.
        
           | aeturnum wrote:
           | > _Porn and AI is gonna get real messy._
           | 
           | From a practical standpoint I think it is going to be an
           | unmitigated nightmare: people creating AI revenge porn, fake
           | porn of famous people, etc.
           | 
            | From a legal standpoint, I don't think AI changes that
            | much. People still own their image, and while I would
            | support stronger penalties for creating fake media of
            | people, the law already protects against third parties
            | using your image without your permission. Revenge porn
            | already carries specific penalties in many
            | jurisdictions[1].
           | 
           | > _I guess there's the question of if we even should, since
           | none of it's real anyways._
           | 
            | I absolutely think you should have a say over the release
            | and distribution of porn that uses your image. Not because
            | it's wrong to make porn, but because you are impacted by
            | its existence.
           | 
           | [1] I am generally skeptical of the ability of the prison
           | system to deal with these kinds of things, but if we are
           | going to have one I am glad it's included in the list of
           | crimes.
        
             | setr wrote:
             | > From a practical standpoint I think it is going to be an
             | unmitigated nightmare: people creating AI revenge porn,
             | fake porn of famous people, etc.
             | 
             | I honestly don't see these becoming larger problems -- once
             | it's ubiquitous, then it'll quickly become meaningless; by
             | default, everyone will assume it's generated, the
             | equivalent of cutting out and pasting their face onto a
             | nude image today, and it'll have relatively little impact.
             | 
             | It'd probably still be rude and stupid, but I'm fairly
             | positive it'll be much less damaging than it is today.
             | 
             | Effectively: if everyone stars in a porno, no one does.
        
       | [deleted]
        
       | can16358p wrote:
       | I think the name of the library deserves an award on its own.
        
         | hwers wrote:
         | Hardly subtle
        
         | schleck8 wrote:
          | There is also DeepThroat, the name of 15.ai's speech
          | synthesis architecture (added later, after someone
          | recommended it on Twitter).
        
         | rg111 wrote:
         | Nah, that award goes to this project (Hent-AI)-
         | https://github.com/natethegreate/hent-AI
        
           | oneepic wrote:
           | Not to start a war, but Hent-AI is pretty basic by comparison
           | IMO.
        
             | dilap wrote:
             | Here are all of the double-meanings / associations I can
             | find in the DeepCreamPy name:
             | 
             | DeepCream sounds like deep dream, the AI technique
             | 
              | To cream is slang for to ejaculate
             | 
             | Thus, "deep cream" would be to ejaculate "deeply". In
             | general, "deep" is a richly evocative word in the context
             | of sex.
              | 
              | "CreamPy" sounds like "cream pie," a type of pie, the
              | pastry. Pies themselves also have a sexual connotation,
              | due to the wildly popular and influential sex comedy
              | "American Pie" (it gave us the word MILF, for example).
        
               | flobosg wrote:
               | This one is missing (NSFW):
               | https://en.wikipedia.org/wiki/Creampie_(sexual_act)
        
               | dilap wrote:
               | Haha, how did I miss that?! Thanks!
        
               | toomanydoubts wrote:
                | Yeah, and honestly, at least for me, the one with the
                | heaviest sexual connotation. I like the py/pie joke.
        
               | ComputerGuru wrote:
               | Can you still tag that with a huge NSFW? The link itself
               | is truncated on mobile so it's a big surprise.
        
               | amelius wrote:
               | I think Wikipedia should add NSFW in their links.
        
               | [deleted]
        
               | BTCOG wrote:
               | How did you go this Sheldon and still miss that it's
               | referring directly to a creampie? LOL
        
               | fsckboy wrote:
               | I can't believe I'm participating in this conversation,
               | but misapprehensions beg for clarification.
               | 
               | pie had the sexual connotation decades before American
               | Pie, see notably "hair pie" at least from the 1960s; also
               | earlier cherry pie with its jailbait associations; I
               | would imagine also dating back to the early part of the
               | last century in Black vernacular, see for comparison
               | jelly roll.
               | 
               | cream pie only recently took on its modern meaning; prior
               | to the last decade or so it referred to the same thing it
               | does now, but only in the context of eating it. search
               | usenet archives (see asstr.org) if you feel an
               | uncontrollable need to learn more about this.
               | 
               | creaming, your jeans for example, equally applies to
               | women.
        
               | [deleted]
        
           | hiyer wrote:
            | Incidentally, this depends on DeepCreamPy. From the GitHub
            | readme:
           | 
           | > You will need to download and install DeepCreamPy, which is
           | linked in the intro.
        
         | throwaway_bdcp wrote:
        
       | bgeeek wrote:
       | But that name. Ugh.
        
       | [deleted]
        
       | nomdep wrote:
        | It does NOT work with:
        |   - Censorship of nipples
        |   ...
        | 
        | What? Useless!
        
         | wiradikusuma wrote:
         | I heard hentai pictures don't censor nipples, only penises and
         | vaginas. Haven't seen it myself, of course.
        
           | micromacrofoot wrote:
           | you don't need to even go as deep as hentai, you see it in
           | anime (Ghost in the Shell is one prominent example)... it's
           | fairly common to see nipples and a complete lack of genitalia
        
           | toyg wrote:
           | They used to censor nipples too, and then at some point they
           | just didn't. I guess the Japanese government relaxed their
           | guidelines a bit - after all, they've now recognized that
           | manga and anime are effectively their Hollywood, and are very
           | busy promoting them around the globe. I expect at some point
           | they will quietly drop prosecutions of uncensored drawings
           | altogether.
        
       | hooande wrote:
       | this name is too much. there's a difference between cute and
       | vulgar. no star
        
         | marginalia_nu wrote:
         | To be fair, it's a pretty vulgar application.
        
           | computerfriend wrote:
           | It's not my thing, but I personally find censorship much more
           | vulgar.
        
           | goldcd wrote:
            | _shrugs_ Not to my taste, but you've got to admire somebody
            | so pissed off with pixelation that they've created a
            | project...
            | 
            | I was disappointed by the issues - I'd anticipated there
            | being amusing complaints.
        
       | monkeybutton wrote:
       | How popular are the censorship laws in Japan? Is it like cannabis
       | where the majority is in favor of legalization but politicians
       | are beholden to a vocal minority? The whole thing seems so dated
       | in the age of the internet. Anyone in Japan can easily access
       | uncensored porn from anywhere else in the world on their phone
       | now. What's the point?
        
         | [deleted]
        
         | makeitdouble wrote:
          | I'm not super versed in the question, but I would wager it's
          | a matter of who's willing to go fight the status quo to
          | change the law. Unlike cannabis, there's no strong science
          | behind decensoring, nor an obvious medical use. The field is
          | already under heavy attack on a lot of fronts; there are
          | better battles to fight than this one.
        
           | monkeybutton wrote:
            | I would argue that using science to prove that
            | cannabis/uncensored nips have benefits, as an argument in
            | favor of legalization, is the wrong approach. One should
            | not start from the position of a thing being illegal and
            | demand that others argue for its legalization. The correct
            | approach is to start from the position of a thing being
            | legal, and argue for making it illegal using science. So,
            | has anyone proven using science that censorship has
            | benefits? Reduces porn addiction? Reduces abuse?
        
             | teawrecks wrote:
             | iirc Japan's more conservative laws, including the ones
             | related to porn, originated after WW2 in an attempt to
             | appease the US's Christian sensibilities.
        
             | kodah wrote:
              | Much of our free speech (and porn) law is thanks to Larry
              | Flynt. His case, rather than being argued through
              | "science" or statistics, was argued via innate rights.
              | Whether it's "good" or "bad" is none of society's
              | business, and his case was about establishing the bar
              | where society is allowed to care from a legal
              | perspective.
             | 
             | Also worth mentioning that he lost his ability to walk over
             | that case.
        
               | solarmist wrote:
               | I didn't know that. Was he attacked by someone due to it?
        
               | singlow wrote:
               | From Wikipedia:
               | 
               | > On March 6, 1978, during a legal battle related to
               | obscenity in Gwinnett County, Georgia, Flynt and his
               | lawyer were shot on the sidewalk in Lawrenceville by
               | Joseph Paul Franklin. The shooting left Flynt partially
               | paralyzed with permanent spinal cord damage, and in need
               | of a wheelchair.
               | 
               | >Franklin, a militant white supremacist and serial
               | killer, also shot Vernon Jordan; he targeted other black
               | and Jewish people in a killing spree from 1977-1980.
               | Violently opposed to 'miscegenation,' he confessed to the
               | shootings many years later, claiming he was outraged by
               | an interracial photo shoot in Hustler.[12] About Flynt
               | and a Hustler pictorial, he stated, "I saw that
               | interracial couple ... having sex ... It just made me
               | sick ... I threw the magazine down and thought, I'm gonna
               | kill that guy."[13] Flynt himself suspected the attack
               | was part of a larger conspiracy involving ultra-right
               | elements surrounding U.S. Representative Larry McDonald
               | also behind the Karen Silkwood case with ties to the
               | Intelligence Community and that Franklin may have been
               | subject to MKULTRA-style Mind Control.[14]
        
               | kodah wrote:
                | It's worth noting that, after a lot of psychological
                | evaluation, he was deemed a paranoid schizophrenic as a
                | result of years of abuse and starvation at the hands of
                | his mother as a kid. That's detailed on Franklin's
                | Wikipedia page as well.
        
               | MomoXenosaga wrote:
                | Free speech is one of those things everyone pretends
                | to support until someone uses that free speech to do
                | something they disagree with. The mental gymnastics
                | and hypocrisy tend to go off the charts quickly.
        
               | syshum wrote:
                | I don't think that is limited to free speech; that
                | pretty much applies to all freedoms.
                | 
                | The number of people who can separate how they would
                | choose to act, or live their life, from how the law
                | should be applied to people is very small.
                | 
                | Most people want the law to support their choices in
                | life, so if they don't believe blue cars should exist,
                | there should be a law that prohibits blue cars.
                | 
                | Over my couple of decades of advocacy for more liberty,
                | I have found that in general people do not want
                | liberty; they want validation of their life choices.
        
       | criley2 wrote:
        | This project was deleted by the original author 9 months ago
        | after they were doxxed/compromised, potentially in relation to
        | this project. This repo is a copy. The PayPal donate link in
        | the readme still goes to the original author's PayPal, lol.
        
       | toxik wrote:
       | This is three years old, why is this being reposted now?
        
         | high_byte wrote:
         | the world needs to know
        
           | danuker wrote:
           | Urgently!
        
       | scubbo wrote:
       | Ah, I remember the terrible coworker who shared this in work-chat
       | within a week or so of being hired. Somehow he stuck around for a
       | couple of years despite continued terribleness (of both the "HR-
       | related" and the "actually-doing-his-job-badly" flavours), then
       | went to Facebook. Good riddance.
        
         | 0x0nyandesu wrote:
          | The funny part is that everyone I've ever known to be into
          | sex, porn, kink, etc. has not been the one doing the sexual
          | harassment in an organization, but rather the one calling
          | out bad behavior.
        
           | Subsentient wrote:
           | Just like the heretics are usually the most deeply
           | principled.
        
           | thewarrior wrote:
           | Because they understand consent a lot better.
        
       | mmiliauskas wrote:
       | Now we know what HN faps to..
        
       | tmp65535 wrote:
       | Here is the first (?) application of deep learning to improve the
       | pornography experience, an Android app called Melondream
       | published in early 2017:
       | 
       | http://driftwheeler.com
       | 
       | Surprisingly, almost 5 years after being published, this app
       | still serves thousands of user sessions per day. That rate
       | continues to slowly grow.
       | 
       | Except for its user interface, the whole app (client and server)
       | was written in Go
       | (https://pkg.go.dev/golang.org/x/mobile/cmd/gomobile).
       | 
       | It uses densenet feature vectors
       | (https://arxiv.org/abs/1608.06993) to allow "touch-to-search"
       | (ha!) with a fractional norm distance metric (https://citeseerx.i
       | st.psu.edu/viewdoc/download?doi=10.1.1.23...).
       | 
       | Citation: Discriminative Unsupervised Feature Learning with
       | Exemplar Convolutional Neural Networks
       | (https://arxiv.org/abs/1406.6909).
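        | 
        | For anyone curious what a "fractional norm" nearest-neighbour
        | lookup looks like, here is a rough sketch in Python/NumPy (not
        | the app's Go code; the dimensionality and data are made up):
        | 
        |     import numpy as np
        | 
        |     def fractional_distance(a, b, p=0.5):
        |         # Minkowski-style distance with p < 1, which tends to
        |         # discriminate better than L2 in high dimensions.
        |         return np.sum(np.abs(a - b) ** p) ** (1.0 / p)
        | 
        |     # query: features of the touched region; db: stored vectors
        |     query = np.random.rand(1024).astype(np.float32)
        |     db = np.random.rand(5000, 1024).astype(np.float32)
        |     dists = [fractional_distance(query, v) for v in db]
        |     nearest = np.argsort(dists)[:10]  # 10 most similar images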
        
         | vmception wrote:
         | How does it improve the pornography experience? The description
         | and examples on that website don't make it obvious. The
         | Register article linked seems to help but I really didn't get
         | that from the landing page or example video.
         | 
         | edit: now I understand, it is a continuous slideshow of
         | erotica, and you can tap any part of the image to be taken down
         | a path of similar images. that's kind of cool.
         | 
         | I thought it would have had something to do with decensoring,
         | but nevermind.
         | 
         | Does BlueStacks work on an Apple Silicon machine? I don't have
         | an android device.
        
       | [deleted]
        
       | jameswhite1994 wrote:
        
       | Invictus0 wrote:
       | (2018)
        
       | [deleted]
        
       | textcortex wrote:
        | Finally something that deep learning is useful for...
        
       ___________________________________________________________________
       (page generated 2021-12-30 23:01 UTC)