
New Bill Protects Women and Girls from Deepfake Revenge Porn

One morning in October 2023, 14-year-old Elliston Berry woke up to find her cell phone flooded with text messages from friends telling her that nude images of her were all over the internet. She had never posed nude for anyone, so where had these images come from, and how could they be stopped?

According to WFAA.com, Elliston and her family quickly learned that a student at her school had taken her Instagram photos and run them through an artificial intelligence tool to create fake nude images. Nine other girls at her school were also victimized.

"It was so realistic. It is child porn," said Anna McAdams, Elliston's mom. "We really could not take care of her. We, in that moment, were helpless. [...] More and more pictures were coming out throughout that first day into the second day."

The school and the sheriff’s office were unable to stop the spread of the images, and Snapchat refused to take them down. For eight months the family fought to have the images removed, until they contacted their U.S. Senator, Ted Cruz (R-TX).

"It wasn't until I went to Washington a couple of weeks ago and Senator [Ted] Cruz realized those pictures are still up there,” Anna said. “I spent eight-and-a-half months, and he was able to get ahold of somebody at Snapchat and they immediately, within 24 hours, took the accounts down."

What happened to the culprit who orchestrated this horrendous breach of common decency?

"He has probation, a slap on the wrist, and then he gets done at 18. It'll be expunged. But these pictures could forever be out there of our girls," Anna said.

Thankfully, Cruz and other senators introduced the bipartisan “Take It Down Act” to protect and empower victims of non-consensual intimate image (NCII) abuse, also known as “revenge pornography.” The bill would criminalize the publication of NCII, including AI-generated NCII, and require social media and similar websites to have procedures in place to remove such content upon notification from a victim.

The internet is awash in NCII, in large part because of new generative artificial intelligence tools that can create lifelike but fake images of real people, known as “deepfakes.”

Disturbingly, this trend is increasingly affecting minors. A number of high-profile cases, such as Elliston’s, have involved young girls targeted by their classmates with deepfake NCII. Up to 95 percent of all internet deepfake videos depict NCII, with the vast majority targeting women and girls. The spread of these images—possibly in perpetuity if allowed to remain online—is having a profoundly traumatic impact on victims.

“In recent years, we’ve witnessed a stunning increase in exploitative sexual material online, largely due to bad actors taking advantage of newer technologies like generative artificial intelligence. Many women and girls are forever harmed by these crimes, having to live with being victimized again and again,” said Senator Cruz.

“While some states provide legal remedies for victims of non-consensual intimate imagery, states would be further supported by a uniform federal statute that aids in removing and prosecuting the publication of non-consensual intimate images nationwide. By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime.”

Even though nearly every state has a law protecting people from NCII, including 20 states with laws explicitly covering deepfake NCII, these laws vary in how they classify the crime and set penalties, and prosecution under them is uneven. Even in these states, victims struggle to have images depicting them removed from websites, increasing the likelihood that the images will continue to spread and that victims will be continually retraumatized.

The Take It Down Act makes it unlawful for a person to knowingly publish NCII on social media and other online platforms. NCII is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. The bill also clarifies that a victim’s consent to the creation of an authentic image does not mean the victim has consented to its publication.

The bill also requires websites to take down NCII upon notice from the victim: social media and other websites must have procedures in place to remove NCII within 48 hours of a valid request from a victim, and must make reasonable efforts to remove copies of the images.

“Right now, tech and social media platforms aren’t prioritizing taking down this kind of content, leaving families nowhere to turn for justice,” said Melissa Henson, vice president of the Parents Television and Media Council. “It is appalling that children are being subjected to this kind of abuse, which is magnified as it spreads on the internet. The tech industry must confront this, and the ‘Take It Down Act’ will ensure that tech has accountability for acting to remove deepfake pornography.”

She added: “It’s unfortunate that once again Congress will be forced to act in the absence of common sense and basic decency from social media companies, which should be more proactive in detecting and removing these images. These companies are using AI tools already; they know what the dangers are and how easily they can be exploited by bad actors.”

In addition to Senator Cruz, the legislation is cosponsored by many other senators, including Amy Klobuchar (D-MN), Cynthia Lummis (R-WY), Richard Blumenthal (D-CT), Shelley Moore Capito (R-WV), Jacky Rosen (D-NV), Ted Budd (R-NC), Laphonza Butler (D-CA), Todd Young (R-IN), John Hickenlooper (D-CO), Bill Cassidy (R-LA), and Martin Heinrich (D-NM).


© All Rights Reserved, Living His Life Abundantly®/Women of Grace®  http://www.womenofgrace.com
