“They made money off my pain and suffering.” These are the words of Taylor, an 18-year-old woman and victim of non-consensual intimate imagery (NCII). Without her knowledge, her boyfriend made an intimate video of her when she was 14, which ended up on Pornhub. The presence of the video changed her life forever. “I went to school the next day and everybody was looking at their phones and me as I walked down the hall. They were laughing.” Due to the trauma and humiliation she experienced at the hands of NCII, Taylor attempted suicide twice. Platforms like Pornhub regularly profit from NCII because no law requires them to ensure that all parties consent to the uploading of such sensitive content. This is why legislation such as the Take It Down Act is desperately needed to curtail NCII.
Unfortunately, NCII abuse is not an isolated tragedy but a widespread reality for many Americans. Currently, 1 in 25 Americans have been victims of NCII, while 1 in 10 women have been victims of, or threatened with, NCII. Since 2015, there have been 339,000 reports of image-based sexual abuse, and in 2023 alone, reports of such abuse increased 106% over the previous year. Victims of NCII face a variety of harms, including PTSD, anxiety, depression, suicidal thoughts, and other mental health issues.
The advent of AI tools capable of generating and manipulating video has contributed to the rise in NCII. More than 90% of AI-generated or manipulated videos are sexually explicit, and many are created and posted without the consent or knowledge of those depicted. In 2023, 21,000 deepfake pornographic videos were uploaded, a 460% increase from the previous year. AI also allows NCII to be wielded as a weapon against specific individuals; for example, 1 in 6 congresswomen have been victims of AI-generated NCII. Yet no federal legislation currently exists to protect victims and punish perpetrators of NCII.
To fight the spread of non-consensual intimate imagery, the bipartisan Take It Down Act was introduced in both the House and Senate, and it has already passed the Senate unanimously. The bill alters current law in two primary ways. First, it criminalizes the publication of NCII in interstate commerce. Under the bill, pornography depicting adults could not be published unless each party within the content agrees to its publication. In addition, the bill criminalizes pornography depicting the sexual abuse of minors. These restrictions apply to real content as well as AI-created content. Second, the bill requires platforms to set up measures that enable victims to request the removal of content involving them. Pursuant to a valid request from a victim, the platform would be required to take down the content within 48 hours. The act thus enables victims of NCII to effectively remove content they do not wish to be shared.
The Take It Down Act boasts a long list of over 80 supporting organizations, including the National Center for Missing and Exploited Children; the National Center on Sexual Exploitation; the Rape, Abuse, and Incest National Network; the Family Project Alliance; and the National Association of Chiefs of Police. These organizations all agree: sexual exploitation through NCII must stop, and federal action is required.
By criminalizing the publication of NCII, the bill would make those who abuse and exploit others criminally liable at the federal level. By requiring platforms to provide a means of taking down NCII, it would equip victims to remove such content from the internet. To empower victims and criminalize abusive pornography, including content involving minors, Congress should enact a federal solution to this widespread problem and pass the Take It Down Act.