
Ted Cruz Takes Aim at AI-Generated Revenge Porn With Bipartisan Bill

An Aledo high schooler testified in front of the U.S. Senate about a classmate turning her Instagram photos into porn.
The "Take It Down" Act would make distributing nonconsensual, deep fake pornography a federal crime punishable by jail time.
The "Take It Down" Act would make distributing nonconsensual, deep fake pornography a federal crime punishable by jail time. Getty Images

As Texas lawmakers work to pass stricter legislation targeting the pornography industry, Senator Ted Cruz has introduced a bill that would offer federal protections to victims of AI-generated deepfake pornography.


The bipartisan “Take It Down” Act would criminalize the publication of any “non-consensual, sexually exploitative images,” including AI-generated deepfakes, and would require social media websites to create processes to help victims remove the photos from the internet. Twelve other lawmakers signed on to the bill, six of them Democrats.


“Many women and girls are forever harmed by these crimes, having to live with being victimized again and again,” Cruz said in a statement following the bill’s introduction. “By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime.”


A 2019 study by the deepfake-detection firm Sensity found that 96% of deepfake material is sexual and that the majority of victims are women. Many states, including Texas, have seen a rise in deepfake “revenge pornography” targeting high-school-age girls.


Aledo high school student Elliston Berry testified before the U.S. Senate Committee on Commerce, Science, and Transportation last week about a classmate who created AI-generated sexually explicit photos of her when she was 14 years old. The images were taken from her Instagram account, and an app called DNGG was used to superimpose naked bodies onto the photos.

“I was just 14 years old when I feared my future was ruined,” Berry said. “To this day, the number of people that have these images or have seen them is still a mystery. As it took eight and a half months to get these images off Snapchat, that doesn’t wipe the photos off people’s devices. Every day, I will live in fear that these images will resurface.”

Texas has already passed laws banning the creation of deepfake videos meant to impersonate political figures during election periods. The state has also banned the creation of sexually explicit content that uses a person’s likeness without permission, but that offense is only a misdemeanor. The Take It Down Act proposes jail time.

The bill’s current draft introduces a “lot of uncertainty,” says North Texas attorney Steve Baker, who is skeptical of its impact because many of the victims and perpetrators of deepfake pornography are juveniles.

"Generally in federal jurisprudence, juveniles aren't prosecuted," Baker told the Observer. "I doubt that you're going to see juveniles and high-school kids prosecuted in federal court. I bet most of the time it'll be brought to the attention of federal prosecutors who send it back down to the states to handle it."


Baker said the bill also raises questions surrounding consent, which is typically difficult to prove or disprove in court. In addition to seeking stricter penalties for those who create and distribute sexually explicit deepfakes, Cruz believes federal regulation is necessary to pressure social media websites into taking down the images.

Berry’s mother, Anna McAdams, told lawmakers that Snapchat did not respond to a police warrant requesting that the images of her daughter be taken down, or to messages sent through the company’s communications channels. The images were removed from the platform after Cruz’s office contacted the company, she said.

“If we had been Taylor Swift, they would have come down immediately. This bill gives us a voice we didn’t have before,” McAdams said.

National attention was drawn to AI-generated pornography earlier this year after fake, sexually explicit photos of Swift began circulating on X. The platform stepped in soon after, blocking searches for “Taylor Swift” and removing the posts. Before they came down, the images had been viewed more than 45 million times.

Cruz’s bill would require social media companies to remove deepfake images within 48 hours of receiving a request from a victim, something he believes is a “critical remedy” for those who may not know which accounts have the images or where they originated. The Federal Trade Commission would be responsible for enforcing the mandate.

“As the Supreme Court stated in a 2014 case concerning restitution for possession of child pornography, ‘every viewing of child pornography is a repetition of the victim’s abuse.’ This is no less the reality for victims of non-consensual, so-called ‘revenge pornography’ and victims of realistic, but fake, computer-generated sexually explicit images,” Cruz said. “It is one of my top priorities that this bill is on the next committee markup so that it can receive Senate floor consideration as soon as possible.”
