Office of the
Illinois Attorney General
Kwame Raoul

ATTORNEY GENERAL RAOUL URGES TECH COMPANIES TO STOP THE SPREAD OF DEEPFAKE NONCONSENSUAL INTIMATE IMAGERY

August 26, 2025

Chicago – Attorney General Kwame Raoul today, as part of a bipartisan coalition of 47 state attorneys general, called on major search engines and payment platforms to take stronger action against the growing spread of computer-generated nonconsensual intimate imagery (NCII), commonly known as “deepfakes.” In a letter to search engines, the coalition outlines these companies’ failures to limit the spread of deepfakes and calls for stronger safeguards – such as warnings and redirecting users away from harmful content – to better protect the public. In a separate letter to payment platforms, the coalition urges those companies to take bolder action by identifying deepfake NCII content and removing payment authorization for it.

 “Deepfake nonconsensual intimate imagery causes real harm,” Raoul said. “In particular, AI-generated child pornography is an increasing concern, as it is used by predators to lure and groom minors and to normalize their own reprehensible behavior. We all have a responsibility to protect our children from the trauma of exploitation. I join my colleagues in calling on tech companies to do their part.”

The spread of computer-generated NCII online poses significant harm to the public – particularly women and girls. It has increasingly been used to embarrass, intimidate and exploit people around the world, including in notable cases involving celebrities like Taylor Swift, as well as teenagers in New Jersey, Florida, Washington, Kentucky, South Korea and Spain. Although deepfake NCII overwhelmingly targets women and girls, men and boys have been victimized as well. A recent report found that 98% of deepfake videos online are deepfake NCII.

 In their letters, the coalition points to existing industry practices that can be deployed to address NCII. For example, search engines already limit access to harmful content such as searches for “how to build a bomb” and “how to kill yourself.” Raoul and the attorneys general urged the companies to adopt similar measures for searches such as “how to make deepfake pornography,” “undress apps,” “nudify apps” or “deepfake porn.” The coalition also urged payment platforms to deny sellers the ability to use their services when they learn of connections to deepfake NCII tools and content, and to remove those sellers from their network.

This is the latest in Raoul’s efforts to stop child exploitation, including exploitation involving AI. Attorney General Raoul initiated House Bill (HB) 4623, which was signed into law last year, to address AI-generated child pornography. The law prohibits the use of AI technology to create child pornography that either depicts real children or constitutes obscene imagery. It also separately prohibits the nonconsensual dissemination of certain AI-generated sexual images.

In 2023, Raoul also joined a bipartisan coalition urging Congress to study how AI can be and is being used to exploit children through child sexual abuse material.

 Joining Attorney General Raoul in sending these letters are the attorneys general of Alaska, American Samoa, Arizona, Arkansas, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Iowa, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Nebraska, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Puerto Rico, Rhode Island, South Carolina, South Dakota, Tennessee, U.S. Virgin Islands, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin and Wyoming.