TAKE IT DOWN Act will crack down on malicious uses of AI and protect victims of “deepfake porn”
Legislation passed the Senate this February and now heads to the President’s desk to be signed into law
WASHINGTON – Today, U.S. Senator John Hickenlooper celebrated the final passage of the bipartisan Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act, which he helped reintroduce and pass out of the Senate earlier this year. This legislation protects Americans by making it unlawful for a person to knowingly publish sexually explicit deepfake images of an identifiable individual and by requiring social media companies and websites to promptly remove such images.
The bill now heads to the President’s desk to be signed into law.
“AI innovation is changing so much about our world, but it can’t come at the cost of our children’s privacy and safety,” said Hickenlooper. “We have a narrow window to get out in front of this technology. We can’t miss it.”
New generative artificial intelligence tools can create lifelike but fake imagery depicting real people, known as deepfakes. Deepfakes have recently been used to target minors, including incidents in which classmates used AI tools to create sexually explicit images of other students and then shared them on social media.
The TAKE IT DOWN Act criminalizes the publication of non-consensual intimate imagery (NCII), including AI-generated “deepfakes,” and requires social media platforms and other websites to remove NCII within 48 hours of notice.
Specifically, the TAKE IT DOWN Act:
- Criminalizes the publication of NCII in interstate commerce. The bill makes it unlawful for a person to knowingly publish NCII on social media and other online platforms. NCII is defined to include realistic, computer-generated pornographic images and videos that depict identifiable, real people. The bill also clarifies that a victim consenting to the creation of an authentic image does not mean that the victim has consented to its publication.
- Protects good faith efforts to assist victims. The bill permits the good faith disclosure of NCII, such as to law enforcement, in narrow cases.
- Requires websites to take down NCII upon notice from the victim. Social media and other websites will be required to have procedures in place to remove NCII, pursuant to a valid request from a victim, within 48 hours. Websites must also make reasonable efforts to remove copies of the images. The FTC is charged with enforcement of this section.
- Protects lawful speech. The bill is narrowly tailored to criminalize knowingly publishing NCII without barring lawful speech. The bill respects First Amendment protections by requiring that computer-generated NCII meet a “reasonable person” test, meaning it must appear to realistically depict an actual individual.
Full text of the bill available HERE.