AI-generated child sexual abuse images are spreading. Law enforcement is racing to stop them – WTAP
Crackdown on Child Sexual Abuse Imagery
Introduction
Law enforcement agencies across the U.S. are taking action against the troubling spread of child sexual abuse imagery created with artificial intelligence technology. This imagery includes both manipulated photos of real children and graphic depictions of entirely computer-generated children. The Justice Department is aggressively pursuing offenders who exploit AI tools, while states are enacting laws to ensure that the creation of harmful imagery of children can be prosecuted locally.
Prosecutions and Legal Measures
The Justice Department has recently brought what is believed to be the first federal case involving purely AI-generated imagery. In another case, federal authorities arrested a U.S. soldier accused of using an AI chatbot to make innocent pictures of real children sexually explicit. Existing federal laws apply to such content, and states are enacting legislation to prosecute the creation of AI-generated “deepfakes” and sexually explicit images of children.
Preventing Misuse of Technology
Child advocates are urgently working to prevent the misuse of technology and curb the flood of disturbing images that could hinder the rescue of real victims. Law enforcement officials are concerned about wasting time and resources on identifying and tracking down exploited children who do not actually exist. Legislators are passing laws to ensure local prosecutors can bring charges for AI-generated content, and governors in over a dozen states have signed such laws this year.
Impact on Real Victims
AI-generated child sexual abuse images can be used to groom children, and even if they are not physically abused, the victims can be deeply affected when their image is morphed to appear sexually explicit. The emotional impact can be severe, as experienced by 17-year-old Kaylin Hayman, a victim of “deepfake” imagery.
Efforts by Technology Companies
Top technology companies, including Google, OpenAI, and Stability AI, are collaborating with the anti-child sexual abuse organization Thorn to combat the spread of child sexual abuse images. However, experts argue that more should have been done to prevent misuse before the technology was made widely available.
Realism of AI Images
The National Center for Missing & Exploited Children’s CyberTipline has seen an increase in reports of content involving AI technology. The images are often so realistic that it is difficult to determine whether they were AI-generated or depict real children.
Legal Framework and Charges
The Justice Department has the tools under existing federal law to prosecute offenders for AI-generated imagery. Although the Supreme Court struck down a federal ban on virtual child sexual abuse material, a separate federal law prohibits the production of visual depictions, including drawings, of children engaged in sexually explicit conduct. In cases involving “deepfakes” of real children, charges are brought under the federal “child pornography” law.
Conclusion
The crackdown on child sexual abuse imagery created through AI technology is a priority for law enforcement agencies. The existing legal framework allows for prosecution, and efforts are being made to prevent the misuse of AI tools. However, more needs to be done to ensure the safety of children and prevent the abuse of AI technology.
SDGs, Targets, and Indicators
- SDG 16: Peace, Justice, and Strong Institutions
  - Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
  - Indicator 16.2.3: Proportion of young women and men aged 18-29 years who experienced sexual violence by age 18
- SDG 5: Gender Equality
  - Target 5.2: Eliminate all forms of violence against all women and girls in the public and private spheres, including trafficking and sexual and other types of exploitation
  - Indicator 5.2.1: Proportion of ever-partnered women and girls aged 15 years and older subjected to physical, sexual, or psychological violence by a current or former intimate partner in the previous 12 months
- SDG 10: Reduced Inequalities
  - Target 10.2: By 2030, empower and promote the social, economic, and political inclusion of all, irrespective of age, sex, disability, race, ethnicity, origin, religion, or economic or other status
  - Indicator 10.2.1: Proportion of people living below 50 percent of median income, by age, sex, and persons with disabilities
Table: SDGs, Targets, and Indicators
| SDGs | Targets | Indicators |
|---|---|---|
| SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator 16.2.3: Proportion of young women and men aged 18-29 years who experienced sexual violence by age 18 |
| SDG 5: Gender Equality | Target 5.2: Eliminate all forms of violence against all women and girls in the public and private spheres, including trafficking and sexual and other types of exploitation | Indicator 5.2.1: Proportion of ever-partnered women and girls aged 15 years and older subjected to physical, sexual, or psychological violence by a current or former intimate partner in the previous 12 months |
| SDG 10: Reduced Inequalities | Target 10.2: By 2030, empower and promote the social, economic, and political inclusion of all, irrespective of age, sex, disability, race, ethnicity, origin, religion, or economic or other status | Indicator 10.2.1: Proportion of people living below 50 percent of median income, by age, sex, and persons with disabilities |
Source: wtap.com