
Tag: fightthenewdrug

Ex-Performer Describes What BDSM and Abuse Porn Is Really Like

“They want the suffering.”

Have you ever wondered what really goes on in the world of extreme abuse porn?

Meet Theodosia, an ex-porn performer who spent years doing bondage and discipline, dominance and submission, and sadism and masochism (BDSM) porn. After surviving childhood abuse, the trauma she endured fed into violent and abusive romantic relationships, and eventually a boyfriend introduced her to the world of violent pornography. She came to learn that women who could endure a lot of pain on camera were valued in the BDSM porn world, and she was taught that her primary talent was suffering well.

One day, after years of being abused on camera, Theodosia realized she could no longer stomach the world she was thrown into. See how Theodosia got her start in BDSM porn, and why she eventually left the industry on her own terms.

Fight the New Drug is a non-religious and non-legislative organization that exists to provide individuals the opportunity to make an informed decision regarding pornography by raising awareness on its harmful effects using only science, facts, and personal accounts.

To learn more about how pornography impacts individuals, relationships, and society, visit http://ftnd.org/. This video was made possible by Fighter Club. To help us create more content like this, consider joining Fighter Club at http://ftnd.org/fc.


“Am I In Porn?”: This Tool Searches Porn Sites to See if Your Images are Used in Videos

JULY 19, 2022

Created by a German AI company called deepXtech UG, “Am I In Porn?” is a search engine that exists to help you find out if you appear on porn sites.

1 in 8.

According to research by the Cyber Civil Rights Initiative (CCRI), a nonprofit organization that offers services to victims of cyberbullying and cyber harassment, 1 in 8 social media-using survey respondents said they had been victims of nonconsensual porn, also known as image-based sexual abuse.

And another 1 in 20 even admitted to having shared a sexually graphic image of another person without their consent.

Related: The New York Times Exposé That Helped Spark the Possible Beginning of the End of Pornhub

Those stats point to a massive problem, but some underlying issues are making things worse.

Let’s dive into what’s going on and discuss what tools exist to fight the problem.

Porn sites make loads of money from nonconsensual porn and rape tapes

It’s possible that porn sites are incentivized not to take down nonconsensual porn and rape tapes because they’re popular. More views mean more money from advertisements for the sites, after all.

Take the story of 14-year-old Rose Kalemba, for example. In 2009, she was abducted by men driving around her neighborhood and reportedly raped by them for hours. After being stabbed multiple times and nearly dying, she was able to escape.

Sadly, one nightmare was quickly replaced by another. Only months after the attack, she reportedly discovered six videos of her rape on Pornhub. They were being shared by her schoolmates, which led to intense bullying at school.

Related: How My Images were Stolen, Manipulated, and Nonconsensually Posted to a Porn Site

Rose spent the next half-year emailing Pornhub, requesting that they take down the videos. But, even with her explaining that she was a minor in the videos, her requests came up empty—Pornhub didn’t even respond to her and the videos stayed live on the site.

Finally, Rose set up an email account and contacted Pornhub posing as a lawyer threatening a lawsuit. Within 48 hours of the email, the videos of her were gone.

That’s right: the porn site reportedly ignored the “harmless” underage rape victim for months, but listened to the “credible” lawyer in a matter of days.

And why’d they do this? It might be because they had such a poor content moderation and review system, or because videos of her were helping them rake in more cash. Either way, it doesn’t look good for one of the world’s most popular free porn sites.

Tools and tips that’ll help you fight nonconsensual porn

Rose’s story isn’t a one-time thing; it’s becoming a more frequent issue. With that in mind, here are some tips and tools that can help you protect yourself.

According to Caleb Chen, an internet privacy advocate at Private Internet Access, a virtual private network (VPN) service, one thing you can do is make sure your phone isn’t automatically backing up into the cloud.

Related: “I Wasn’t in Control of My Body”: How the Porn Industry Cashes In on Nonconsensual Content

“When you take a photo on an iPhone, it encourages you to back it up on iCloud (a bunch of servers run by Apple), and many users have accepted having all their photos backed up onto the cloud, whether during their phone set-up or later, and then forgotten about it,” explains Chen. “When the photo is sent to the cloud, it is generally encrypted in some way so the cloud provider can’t see what the contents are. The issue is that cloud back-ups can be accessed with an email and password, and those are often not as secure as people think.”

Check out this guide to making sure your phone isn’t automatically backing up into the cloud, if this is a precaution you’d like to take. Note that if you don’t upload your photos to iCloud, they’ll need to be backed up somewhere else, or you’ll lose them altogether if something happens to your phone.

Related: “I Didn’t Know If They’d Kill Me”: What Happened When This Jane Doe was Trafficked by GirlsDoPorn

Another thing you can do is check out the new site “Am I In Porn?”, a search engine created by the German AI company deepXtech UG that helps you find out if you appear on porn sites.

What is “Am I In Porn?” and how does it work?

Because victims cannot know whether “parasite porn” or revenge porn of themselves exists on the internet until they or someone they know stumbles upon it, the site is intended for anyone who wants to check that no pornographic content of themselves is being distributed on porn platforms, without having to visit those platforms themselves. That includes, but is not limited to, people who have passed on intimate content to third parties and people who fear that pornographic material has been created and distributed without their knowledge (e.g. by deepfake technology).

“Am I In Porn?” allows users who are over the age of 18 to match their face against millions of videos and find out within seconds whether anyone in those videos looks like them. All you have to do is upload a picture of yourself (which will never be saved) and check the results. The photo only needs to show your face clearly, and the site will only show you the videos with the highest probability of a match.

How does it make these matches?

Related: MindGeek, Pornhub’s Parent Company, Sued for Reportedly Hosting Videos of Child Sex Trafficking

The simple answer is mathematics. According to the site, every face has a unique arrangement of features, such as eyes, nose, mouth, etc., which have a certain distance between them. Mathematically speaking, these distances and arrangements are things called “vectors.” The site then built a database with millions of vectors, which they extracted from millions of porn videos. As soon as you upload an image, they extract the vectors and match them with their database. The more similar the arrangement of the vectors, the higher the probability that the face is the same.
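To make the matching step more concrete, here is a minimal sketch of how vector-based face matching generally works, assuming hypothetical precomputed face embeddings and cosine similarity as the comparison measure. The function and variable names are illustrative and are not deepXtech’s actual implementation.

```python
# Minimal sketch of vector-based face matching (illustrative only, not the site's actual code).
# Assumes face "vectors" (embeddings) have already been extracted from images and video
# frames by a face-recognition model; only the matching step is shown here.
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face vectors; values closer to 1.0 mean a closer match."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def top_matches(query_vector, database, k=5):
    """Return the k videos whose stored face vectors are most similar to the query."""
    scores = {video_id: cosine_similarity(query_vector, vec)
              for video_id, vec in database.items()}
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:k]

# Placeholder data: random 128-dimensional vectors standing in for real embeddings.
rng = np.random.default_rng(seed=0)
database = {f"video_{i}": rng.normal(size=128) for i in range(1_000)}
query = rng.normal(size=128)

print(top_matches(query, database, k=3))
```

In practice, a production system would index millions of vectors with approximate nearest-neighbor search rather than scanning the whole database, but the underlying idea is the same: the closer two face vectors are, the more likely they belong to the same person.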

If and when you find a match, “Am I In Porn?” will provide you with step-by-step instructions on how to quickly and easily report and remove the video on the appropriate platform.

Currently, the site charges a small fee (through PayPal or a SEPA direct debit mandate) for each search to cover their costs and is only available in European Union countries. They are working to change both of those things because they believe “things that make the world a better place should be free” and available to all.

Is nonconsensual porn even allowed?

On its face, the law strongly prohibits nonconsensual porn. However, a number of loopholes exist that make it extremely difficult to hold those sharing the illicit imagery responsible. Click here to learn more about how porn sites profit from nonconsensual imagery.

Survivors of nonconsensual image sharing face many disruptive mental health issues that affect their daily lives. And although, in some cases, they haven’t faced physical sexual assault, there are striking similarities between the mental health effects of sexual assault and those of nonconsensual video creation and sharing for survivors.

Related: Their Private Videos Were Nonconsensually Uploaded to Pornhub, and Now These Women are Fighting Back

Porn sites don’t care about your mental health. They don’t care whether or not you were raped. All that matters is money, and how they can profit from content—even if that content displays someone’s real suffering.

So, how many more must be exploited until society recognizes the harms of porn and the porn industry?

This is one of the many reasons we raise awareness on the harms of porn: many porn sites profit from the creation and distribution of nonconsensual content.


Real Rape Videos Reportedly Continue to Appear on World’s Most Popular Porn Site

XVideos reportedly receives over 3.3 billion site visits per month. The site also reportedly hosts non-consensual content like real rape and abuse videos.

AUGUST 8, 2022

XVideos, the world’s most popular porn site, reportedly receives over 3.3 billion site visits per month.

The site also reportedly still hosts real rape tapes and abuse videos, according to a recent report by a German media company.

This isn’t the first time XVideos has been reviewed and called out for illegal and illicit content; this latest scathing review was published in response to the countless sexually violent videos on its website.

The review came from Netzpolitik.org, a Berlin-based media company whose mission is to promote digital freedom and openness on the internet.

Related: XVideos, World’s Most Popular Porn Site, Reportedly Hosts Nonconsensual Content & Child Exploitation

The report showed that XVideos has taken steps to combat toxic content, such as banning the term “rape,” but that a number of videos still exist on the site that are extremely abusive and problematic.

Netzpolitik was able to find videos where people do not seem to be “fully conscious but are apparently being abused for sexual acts,” while other videos are of people who “don’t seem to know they’re being filmed, for example on the toilet.”

Moreover, slight changes to spelling and wording can still reportedly surface videos that a ban on rape searches is supposed to block. For example, the site has content categories like “against her will” or “unconscious and f—ed,” and closely related alternate search terms, typed instead of the banned word “rape,” return over 400,000 results.

It’s no wonder that Chris Köver, editor at Netzpolitik, said that “XVideos could certainly do more to prevent distribution of these recordings.”

Is XVideos moderating its content?

XVideos reportedly pays a bunch of people to stop toxic content from ever being uploaded to the site or to get rid of it the moment it’s found. Also, as we mentioned earlier, the porn site has stopped uploaders from using certain tags that hint at sexual abuse and violence.

However, it’s not difficult to get around such tags, and it’s nearly impossible for the moderators to catch everything that’s uploaded given that XVideos is the 7th-most visited website in the world and the most popular porn site.

Related: By the Numbers: Is the Porn Industry Connected to Sex Trafficking?

While XVideos does have an online form that can be filled out anonymously to flag videos to moderators (Netzpolitik‘s review noted 25 of the 30 videos it flagged were removed within a day), it’s still not enough.

Why? Because this kind of system puts the burden on people to report the things they see. It forces abuse victims, for example, to be re-victimized by searching for and seeing their violation broadcast to the world.

Related: How Porn Portrays Violence As a Sexual Fantasy

In other words, unless moderators or the occasional external reviewer happens to find abusive content among everything being uploaded on a minute-by-minute basis, much of it will never be reported.

That’s a problem because it opens the door for victims of all types to be traumatized again and again rather than preventing the disturbing content from being shared and consumed for “entertainment” in the first place.

What happens when victims are traumatized by sexual content shared against their will?

Take the tragic story of a 16-year-old girl from Perth, Australia recounted by journalist Nicholas Kristof in an investigative op-ed for The New York Times.

The teen Snapchatted a nude photo of herself to her then-boyfriend with a message, “I love you. I trust you.” Without consent, the boyfriend immediately took a screenshot of the snap and shared it with five of his friends, who then shared it with 47 other friends.

Before long, over 200 students at the teen’s school had seen the image, and one person uploaded it to a porn site along with her name and school.

Related: 20 Stats About the Porn Industry and its Underage Consumers

The teen stopped attending school and self-medicated with drugs. Her family moved to a different city and then a different state, but she felt she could not escape. At 21 years old, she died by suicide.

Sadly, this story is an all too common one. People’s lives are turned upside-down: some are forced to change their names, alter their appearance, or move. Others face mental health crises. And others still face all of these ramifications and more.

What if the content isn’t real sexual abuse imagery?

Most people would probably agree that if something was uploaded without consent, it should be taken down, but what about the content that is simply staged and scripted as if one person is being sexually abused?

For example, many of the links Google returns for a search term like “schoolgirl” will likely be of porn performers who are play-acting child abuse, but this blend of professional videos and nonconsensual, abusive content is problematic for a few reasons:

  1. This content makes it even more challenging for consumers to tell the difference between real and staged videos of abuse.
  2. These videos sexualize and fetishize real abuse scenarios, which can ultimately influence consumers’ sexual tastes.1

Related: How Can You Know for Sure if the Porn You Watch is Consensual?

Even if the video isn’t technically “real rape,” it normalizes the abuse that many do face and creates demand for more exploitative, violent content.

Exploitation, rape, sexual assault, and sex abuse are not sexual entertainment.

Why this matters

There are videos that don’t contain real or acted sexual abuse material, but it seems as though violence-free content is becoming rarer in the mainstream porn world.

One study analyzing the acts portrayed in porn videos suggests that as little as 35.0% and as much as 88.2% of popular porn scenes contain violence or aggression, and that women are the targets of that violence approximately 97% of the time.2,3

Related: Man Sets Up Fake “Sleep Study” To Rape 100 Women And Film It

Another study found the most common form of sexual violence shown was between family members, and frequent terms used to describe the videos included “abuse,” “annihilation,” and “attack.” The researchers concluded by saying that these sites are “likely hosting” unlawful material.4

And the few videos that don’t fall under the categories we’ve mentioned earlier still carry their own host of negative effects on viewers, including decreased enjoyment of sex, decreased empathy, lowered self-esteem, and more.

At the end of the day, porn is just not worth it. Protect yourself and protect others by refusing to click.

Citations

1. Downing, M. J., Jr., Schrimshaw, E. W., Scheinmann, R., Antebi-Gruszka, N., & Hirshfield, S. (2017). Sexually explicit media use by sexual identity: A comparative analysis of gay, bisexual, and heterosexual men in the United States. Archives of Sexual Behavior, 46(6), 1763–1776. https://doi.org/10.1007/s10508-016-0837-9

2. Bridges, A. J., Wosnitzer, R., Scharrer, E., Sun, C., & Liberman, R. (2010). Aggression and sexual behavior in best-selling pornography videos: A content analysis update. Violence Against Women, 16(10), 1065–1085. https://doi.org/10.1177/1077801210382866

3. Fritz, N., Malic, V., Paul, B., & Zhou, Y. (2020). A descriptive analysis of the types, targets, and relative frequency of aggression in mainstream pornography. Archives of Sexual Behavior, 49(8), 3041–3053. https://doi.org/10.1007/s10508-020-01773-0

4. Vera-Gray, F., McGlynn, C., Kureshi, I., & Butterby, K. (2021). Sexual violence as a sexual script in mainstream online pornography. The British Journal of Criminology. https://doi.org/10.1093/bjc/azab035
