“Am I In Porn?”: This Tool Searches Porn Sites to See if Your Images are Used in Videos
JULY 19, 2022
Created by a German AI company called deepXtech UG, “Am I In Porn?” is a search engine that exists to help you find out if you appear on porn sites.
According to research by the Cyber Civil Rights Initiative (CCRI), a nonprofit organization that offers services to victims of cyberbullying and cyber harassment, 1 in 8 social media-using survey respondents said they had been victims of nonconsensual porn, also known as image-based sexual abuse.
And another 1 in 20 even admitted to having shared a sexually graphic image of another person without their consent.
Related: The New York Times Exposé That Helped Spark the Possible Beginning of the End of Pornhub
Those stats clearly exhibit a massive problem, but some underlying issues are making things worse.
Let’s dive into what’s going on and discuss what tools exist to fight the problem.
Porn sites make loads of money from nonconsensual porn and rape tapes
It’s possible that porn sites are incentivized not to take down nonconsensual porn and rape tapes because they’re popular. More views mean more money from advertisements for the sites, after all.
Take the story of 14-year-old Rose Kalemba, for example. In 2009, she was abducted by men driving around her neighborhood and reportedly raped by them for hours. After being stabbed multiple times and nearly dying, she was able to escape.
Sadly, one nightmare was quickly replaced by another. Only months after the attack, she reportedly discovered six videos of her rape on Pornhub. Her schoolmates were sharing them, which led to intense bullying at school.
Related: How My Images were Stolen, Manipulated, and Nonconsensually Posted to a Porn Site
Rose spent the next half-year emailing Pornhub, requesting that they take down the videos. But even though she explained that she was a minor in the videos, her requests went nowhere: Pornhub didn't respond to her, and the videos stayed live on the site.
Finally, Rose set up an email account and contacted Pornhub posing as a lawyer threatening a lawsuit. Within 48 hours of the email, the videos of her were gone.
That’s right: the porn site reportedly ignored the “harmless” underage rape victim for months, but listened to the “credible” lawyer in a matter of days.
And why'd they do this? It might be because they had such a poor content moderation and review system, or because videos of her may have been helping them rake in more cash. Either way, it doesn't look good for one of the world's most popular free porn sites.
Tools and tips that’ll help you fight nonconsensual porn
Rose's story isn't a one-time thing; it's becoming a more frequent issue. With that in mind, here are some tips and tools that can help you protect yourself.
According to Caleb Chen, an internet privacy advocate at Private Internet Access, a virtual private network (VPN) service, one thing you can do is make sure your phone isn't automatically backing up to the cloud.
Related: “I Wasn’t in Control of My Body”: How the Porn Industry Cashes In on Nonconsensual Content
“When you take a photo on an iPhone, it encourages you to back it up on iCloud (a bunch of servers run by Apple) and many users have accepted having all their photos backed up onto the cloud, whether during their phone set-up or later, and then forgotten about it,” explains Chen. “When the photo is sent to the cloud, it is generally encrypted in some way so the cloud provider can’t see what the contents are. The issue is that cloud back-ups can be accessed with an email and password, and those are often not as secure as people think.”
Check out this guide to making sure your phone isn't automatically backing up to the cloud, if this is a precaution you'd like to take. Note that if you don't upload your photos to iCloud, you'll need to back them up elsewhere, or you risk losing them altogether if something happens to your phone.
Related: “I Didn’t Know If They’d Kill Me”: What Happened When This Jane Doe was Trafficked by GirlsDoPorn
Another thing you can do is check out the new site mentioned above, "Am I In Porn?", the search engine built to help you find out if you appear on porn sites.
What is “Am I In Porn?” and how does it work?
Victims often have no way of knowing whether "parasite porn" or revenge porn of themselves exists online until they, or someone they know, stumbles upon it. "Am I In Porn?" is intended for anyone who wants to check that no pornographic content of themselves is being distributed on porn platforms, without having to visit those platforms. That includes, but isn't limited to, people who have shared intimate content with third parties and people who fear that pornographic material has been created and distributed without their knowledge (e.g. via deepfake technology).
"Am I In Porn?" allows users over the age of 18 to match their face against millions of videos and find out within seconds whether anyone in those videos looks like them. All you have to do is upload a picture of yourself (which the site says is never saved) and check the results. The photo only needs to show your face clearly, and the site will only show you the videos with the highest probability of a match.
How does it make these matches?
Related: MindGeek, Pornhub’s Parent Company, Sued for Reportedly Hosting Videos of Child Sex Trafficking
The simple answer is mathematics. According to the site, every face has a unique arrangement of features, such as eyes, nose, and mouth, with certain distances between them. Mathematically, these distances and arrangements can be encoded as "vectors." The site built a database of millions of such vectors, extracted from millions of porn videos. As soon as you upload an image, it extracts your vectors and matches them against that database. The more similar the arrangement of the vectors, the higher the probability that the faces are the same.
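To make the idea concrete, here is a minimal sketch of how vector-based face matching works in principle. This is not deepXtech's actual code; the vectors, video names, and helper functions are all hypothetical, and real systems use learned embeddings with 128 or more dimensions rather than these toy 4-dimensional examples.

```python
import math

def face_distance(v1, v2):
    # Euclidean distance between two face feature vectors:
    # the smaller the distance, the more similar the faces.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))

def rank_matches(query, database, top_k=3):
    # Sort database entries by distance to the query vector
    # and return the names of the top_k closest matches.
    scored = sorted(database.items(), key=lambda kv: face_distance(query, kv[1]))
    return [name for name, _ in scored[:top_k]]

# Hypothetical pre-extracted vectors for faces appearing in videos.
db = {
    "video_a": [0.10, 0.90, 0.30, 0.50],
    "video_b": [0.80, 0.20, 0.70, 0.10],
    "video_c": [0.12, 0.88, 0.31, 0.49],
}

# Vector extracted from the uploaded photo.
query = [0.11, 0.90, 0.30, 0.50]

print(rank_matches(query, db, top_k=2))  # → ['video_a', 'video_c']
```

The key design point is that only these compact numeric vectors need to be stored and compared, not the photos themselves, which is consistent with the site's claim that uploaded pictures are never saved.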
If and when you find a match, “Am I In Porn?” will provide you with step-by-step instructions on how to quickly and easily report and remove the video on the appropriate platform.
Currently, the site charges a small fee (through PayPal or a SEPA direct debit mandate) for each search to cover their costs and is only available in European Union countries. They are working to change both of those things because they believe “things that make the world a better place should be free” and available to all.
Is nonconsensual porn even allowed?
On its face, the law strongly prohibits nonconsensual porn. However, a number of loopholes exist that make it extremely difficult to hold those sharing the illicit imagery responsible. Click here to learn more about how porn sites profit from nonconsensual imagery.
Survivors of nonconsensual image sharing face many disruptive mental health issues that affect their daily lives. And although not every survivor has experienced physical sexual assault, there are striking similarities in some cases between the mental health effects of sexual assault and those of nonconsensual video creation and sharing.
Related: Their Private Videos Were Nonconsensually Uploaded to Pornhub, and Now These Women are Fighting Back
Porn sites don’t care about your mental health. They don’t care whether or not you were raped. All that matters is money, and how they can profit from content—even if that content displays someone’s real suffering.
So, how many more must be exploited until society recognizes the harms of porn and the porn industry?
This is one of the many reasons we raise awareness of the harms of porn: many porn sites profit from the creation and distribution of nonconsensual content.