[HN Gopher] Show HN: Loti - Remove revenge porn using a facial r...
___________________________________________________________________
Show HN: Loti - Remove revenge porn using facial recognition on
adult sites
Author : lja
Score : 24 points
Date : 2023-01-11 16:53 UTC (6 hours ago)
(HTM) web link (goloti.com)
(TXT) w3m dump (goloti.com)
| hirak wrote:
| Hi, I am Hirak. I am a co-founder of Loti.
|
| Loti uses a state-of-the-art facial recognition algorithm built
| with proprietary models. It considers a dozen facial
| characteristics across multiple angles to find matches in a
| collection of more than 50 million records. We add millions of
| new records every month to make sure we don't miss any privacy
| breach involving your photos.
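| For the curious, here's a minimal sketch of how embedding-based
| face matching generally works (illustrative only: the embedding
| size, index size, and threshold below are placeholders, not our
| actual production values):
|
|     import numpy as np
|
|     # A trained face-embedding model (e.g. a CNN) would produce
|     # these vectors; random data stands in for illustration.
|     DIM = 512                              # placeholder size
|     query = np.random.rand(DIM)            # the user's face
|     gallery = np.random.rand(50_000, DIM)  # indexed images
|
|     # Cosine similarity of the query against every gallery row.
|     q = query / np.linalg.norm(query)
|     g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
|     scores = g @ q
|
|     THRESHOLD = 0.6  # placeholder; tuned per model in practice
|     matches = np.nonzero(scores > THRESHOLD)[0]
|     print(f"{matches.size} candidate matches above threshold")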
| braingenious wrote:
| How did you arrive at the figure of over 2.6 _billion_ victims
| of revenge porn? 33% of people seems like an incredible number.
| lja wrote:
| Here's the quick read:
| https://www.fastcompany.com/90467411/shocking-study-
| finds-1-...
|
| Here's the real study: https://www.naag.org/attorney-general-
| journal/an-update-on-t...
|
| Also, I don't think these numbers can be extrapolated to every
| country/region/culture on earth, but they do seem to hold for
| European and English-speaking markets.
| braingenious wrote:
| So you work for this company, and your opinion is that it's 33%
| of English-speaking and European people? Napkin math puts the
| number of victims at ~400 million ((population of Europe + North
| America) / 3), maybe double that if you charitably count all of
| Africa as "English-speaking".
| lja wrote:
| If you have any issues with the numbers, read the study listed
| above. Gather what you want from it and believe what makes
| sense to you. It doesn't matter to our mission.
| braingenious wrote:
| Separate question:
|
| We've established that your website's claim of 2.6 billion
| victims of revenge porn isn't shared even by people who work at
| Loti, and that ~400 million is more likely.
|
| Given that fact, if you were able to get 1% of those people to
| subscribe (4 million) at $8/mo, you would be taking in
| $32,000,000 per month ($384,000,000 per year).
|
| Would you not be the most profitable entity in the business of
| revenge porn?
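| For reference, the napkin math in code form (all inputs are my
| assumptions, not Loti's figures):
|
|     victims = 400_000_000        # my estimate above, not Loti's 2.6B
|     subscribers = victims * 0.01 # 1% conversion -> 4,000,000
|     monthly = subscribers * 8    # $8/mo -> $32,000,000
|     print(f"${monthly:,.0f}/mo, ${monthly * 12:,.0f}/yr")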
| toomuchtodo wrote:
| How do you confirm the identity of your customer to ensure your
| DMCA takedown notices are legitimate?
|
| How do you programmatically distinguish between non-consensual
| adult images being shared against the subject's wishes and
| images where the subject is somewhere public and doesn't own
| the copyright to the photo, but the law does not require the
| copyright holder to remove or take down the image?
|
| Cool idea, might need to pivot; regardless, an interesting
| space to be in these days.
|
| EDIT: Appreciate the replies!
| hirak wrote:
| We have an error-proof system in place that follows the
| specific protocols and instructions provided by each individual
| host website. This helps us filter out 90% of non-legitimate
| takedown notices.
| lja wrote:
| Identity confirmation is lighter right now while we work to get
| people into the app; requiring a DMV/government license before
| people know who we are may be too big a pill to swallow for a
| company just starting out.
|
| We do have ML models running in the background to detect abuse;
| flagged accounts are escalated to management, who reach out to
| those customers. I won't walk through all the cat-and-mouse
| strategies we use to keep people from gaming them.
|
| If someone has taken pictures for a company, signed a model
| release for those images, and the images have made their way to
| an adult site, then that individual should not initiate a DMCA.
| This is something we make our customers aware of before they
| DMCA content. Also, we know from first-hand experience that
| site operators are quick to deny a DMCA if they do own the
| rights to the content. We've talked to them about this very
| situation.
| thot_experiment wrote:
| I bet y'all would make a lot more money charging $8/mo to find
| porn with the exact facial characteristic someone is looking for.
|
| Don't get me wrong, revenge porn is bad, but charging people
| money to take it down seems... well, I guess better than
| nothing, but let's hope a nonprofit starts maintaining a
| database like this and completely eliminates whatever market
| this site may capture.
|
| Also, abusing the DMCA for censorship, even in pursuit of a
| noble cause, is harmful to society, though maybe not more
| harmful than revenge porn?
|
| I'm really curious how our culture will continue to reckon with
| the ideas that
|
| a) There are intimate images extant of many people. Many will
| become most.
|
| b) AI-powered face swapping and image generation will make
| arbitrary porn trivial on home-gamer GPUs in a few years.
| lja wrote:
| The idea of charging $8/mo for someone to look at anyone's
| photos online is pretty far off-brand for us. We're really
| looking to help people find just their own photos. I agree the
| market would be larger, but that's not something we're
| interested in doing. It breaks ethical barriers for us.
|
| We have to charge money to cover our expenses; most people
| don't realize how expensive GPUs are, and they're the only way
| to do this cost-effectively at scale. Even a non-profit would
| incur enormous expenses, because there is just no way to do
| this cheaply.
|
| There is no abuse of the DMCA process here. Customers are asked
| to sign an affidavit and we don't allow them to DMCA outside of
| their facial profiles.
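| As a rough sketch of what such a gate could look like (the
| names, threshold, and logic here are hypothetical, not a
| description of our internals):
|
|     from dataclasses import dataclass
|     import numpy as np
|
|     MATCH_THRESHOLD = 0.6  # placeholder, not a production value
|
|     @dataclass
|     class User:
|         affidavit_signed: bool
|         enrolled_embedding: np.ndarray  # from verified selfies
|
|     def face_match_score(a: np.ndarray, b: np.ndarray) -> float:
|         # Cosine similarity between two face embeddings.
|         return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
|
|     def can_submit_takedown(user: User, image_emb: np.ndarray) -> bool:
|         # Require the signed affidavit AND a match against the
|         # user's own enrolled face before a DMCA can be filed.
|         if not user.affidavit_signed:
|             return False
|         return face_match_score(user.enrolled_embedding,
|                                 image_emb) >= MATCH_THRESHOLD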
| thot_experiment wrote:
| I just don't feel like your motives can possibly be aligned
| with victims if you're trying to make money off victims.
|
| RE: DMCA abuse, it's probably within the letter of the law; the
| law itself is broken.
|
| Bonus question: How does this all deal with twins? What's the
| miss rate on facial recognition anyway?
| lja wrote:
| We're partnered with RAINN and other sexual abuse agencies
| to give victims in extreme situations free access to our
| site. Our alignment isn't perfect but we are working to be
| a useful tool at a low cost.
|
| Some things provide a social good but also have expenses; this
| happens to be one of them. Doctors charge patients, lawyers
| charge their clients, and we aren't anywhere near those
| margins.
|
| EDIT: Twins are still a weak spot for us, but we have a plan to
| build a model that can tell twins apart.
| thot_experiment wrote:
| Good answers. I would find this a lot more palatable if you
| went much harder on the "charity/help" angle. I didn't see any
| mention of that when I was browsing the site, and that's
| probably something you should be highlighting.
|
| > Doctors charge patients, lawyers charge their clients.
|
| Yeah, I'm aware, but just because that happens doesn't mean
| it's the most prosocial system we can implement. I think you're
| probably doing good on net, and I probably shouldn't have been
| so negative.
|
| I do worry, though, that inadequate capitalistic solutions to
| these sorts of problems serve to preserve a bad status quo in
| the long run, and may end up doing more harm than good.
| aliswe wrote:
| What the hell
| KomoD wrote:
| I feel like this can be easily misused
| lja wrote:
| We have protections in place to prevent misuse but we are aware
| that prevention will be a large hill for us to climb and we're
| working on it.
| thot_experiment wrote:
| I'm not sure what (1) alert per month means, but I _definitely_
| think that $8/mo to find porn with your preferred facial
| characteristics based on some uploaded images seems like a more
| viable business model. I'm certain there are people who would
| pay to have a digest of porn with lookalikes of their favorite
| actresses/whatever mailed to them regularly.
| lja wrote:
| 1 alert a month means that we'll automatically scan our
| database once a month and send you an email. It's more of an
| insurance policy: a confirmation that we didn't find anything
| new for you that month.
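| In scheduler terms it's roughly a monthly job like this sketch
| (scan_matches and send_email are hypothetical stubs, not our
| real code):
|
|     # e.g. run from cron on the 1st of each month: 0 0 1 * *
|     def scan_matches(embedding):
|         return []  # stub: would query the face-match index
|
|     def send_email(address, body):
|         print(f"To: {address}: {body}")  # stub: would send mail
|
|     def monthly_alert(user):
|         matches = scan_matches(user["embedding"])
|         if matches:
|             send_email(user["email"],
|                        f"{len(matches)} new matches found")
|         else:
|             send_email(user["email"],
|                        "Good news: no new matches this month")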
| gardenhedge wrote:
| Hopefully I never have to use this myself, but I'm glad it
| exists.
| lja wrote:
| Hi I'm Luke, co-founder.
|
| People are warned that an image lives on the internet forever;
| this has been especially true for images that are not
| consensually taken or shared. We're fixing that.
|
| We created Loti (https://goloti.com), a service that uses facial
| recognition to help users search, find, and reclaim non-
| consensual intimate images and videos using a streamlined DMCA
| process we facilitate in our software.
|
| Over 10 million people are victims of non-consensual image
| sharing in the United States alone. And those are just the
| people who were even aware their images were being shared;
| research shows that up to 30% of victims were hacked or
| recorded by hidden cameras and are unaware.
|
| Our goal is to bring peace of mind that your private images stay
| private.
| Am4TIfIsER0ppos wrote:
| > Submit clear detailed pictures of your face in order to find
| porn of yourself.
|
| Do you think that is a good idea?
| lja wrote:
| I'd love some clarity: what do you mean? That's the whole point
| of the service. If you don't submit a clear picture of
| yourself, there isn't any way for us to search.
| Centigonal wrote:
| I think this idea has potential, but HN might be far from your
| target audience.
| lja wrote:
| Thanks! I've been a member for a decade and just wanted to
| share what I'm working on in data science.
___________________________________________________________________
(page generated 2023-01-11 23:02 UTC)