In an unprecedented move, the attorneys general from all 50 states and four territories have sent a collective letter to Congress, calling for the establishment of an expert commission to address the exploitation of children through AI-generated child pornography.
Ars Technica reports that in a rare show of bipartisan unity, attorneys general from every state have come together to tackle a pressing issue: the exploitation of children through AI-generated child pornography. Their collective letter to Congress highlights the urgency of the situation, stating, “As Attorneys General of our respective States and territories, we have a deep and grave concern for the safety of the children within our respective jurisdictions.”
Light reflects off of the U.S. Capitol dome as the sun sets on Capitol Hill in Washington, Wednesday, Jan. 4, 2023. (AP Photo/Patrick Semansky)
The letter emphasizes the challenges posed by AI technologies, particularly open-source image synthesis tools, in creating child sexual abuse material (CSAM). “Creating these images is easier than ever,” the letter reads, “as anyone can download the AI tools to their computer and create images by simply typing in a short description of what the user wants to see.” The attorneys general express concern that the lack of regulation allows these tools to be “run in an unrestricted and unpoliced way.”
The issue extends beyond traditional CSAM to include AI-generated deepfakes. The letter outlines how these deepfakes can be created by “studying real photographs of abused children to generate new images showing those children in sexual positions.” This technology can also overlay the faces of unvictimized children onto the bodies of abused children, creating new, disturbing content.
Breitbart News reported in July on the spread of AI-generated child pornography:
Now that tech giants are allowing amateur coders to rip out safeguards from their chatbots, thousands of AI-generated child abuse images are flooding dark web forums, and predators are even sharing “pedophile guides” to AI along with selling their pornographic material, according to a report by Daily Mail.
AI chatbots, which have become increasingly sophisticated in recent years and are known in part for generating lifelike images, are now being exploited by pedophiles seeking to create very realistic child porn.
This perverted endeavor has been made possible by tech firms that have decided to release their code to the public, claiming that all they wanted to do was democratize AI technology.
The letter argues that AI-generated images, even those not involving real children, still constitute a form of abuse. The attorneys general contend that these technologies “support the growth of the child exploitation market by normalizing child abuse and stoking the appetites of those who seek to sexualize children.”
The letter concludes with two primary recommendations. First, it calls for the establishment of an expert commission focused on the exploitation of children through AI technologies, one that would operate on an ongoing basis to keep pace with the rapidly evolving tech landscape. Second, the attorneys general urge Congress to expand existing laws against CSAM to explicitly cover AI-generated materials.
The attorneys general acknowledged that weighing personal freedoms against the protection of vulnerable populations such as children is a complex issue, stating that striking the proper balance “may be difficult in practice,” which is why they recommend creating a commission to study any potential regulation.
Read more at Ars Technica here.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan