Child sexual abuse content growing online with AI-made images, report says
Sustainable Development Goals (SDGs) and the Rise of Child Sexual Exploitation Online
Introduction
Child sexual exploitation is on the rise online and taking new forms such as images and videos generated by artificial intelligence, according to an annual assessment released on Tuesday by the National Center for Missing & Exploited Children (NCMEC), a US-based clearinghouse for the reporting of child sexual abuse material.
Increased Reports of Child Abuse Online
Reports to the NCMEC of online child abuse rose by more than 12% in 2023 compared with the previous year, surpassing 36.2m, the organization said in its annual CyberTipline report. The majority of tips concerned the circulation of child sexual abuse material (CSAM), such as photos and videos, but there was also an increase in reports of financial sexual extortion, in which an online predator lures a child into sending nude images or videos and then demands money.
New Threats: AI-generated CSAM
Some children and families were extorted for financial gain by predators using AI-made CSAM, according to the NCMEC.
The center received 4,700 reports of images or videos depicting the sexual exploitation of children made with generative AI, a category it began tracking only in 2023, a spokesperson said.
“The NCMEC is deeply concerned about this quickly growing trend, as bad actors can use artificial intelligence to create deepfaked sexually explicit images or videos based on any photograph of a real child or generate CSAM depicting computer-generated children engaged in graphic sexual acts,” the NCMEC report states.
“For the children seen in deepfakes and their families, it is devastating.”
Impact on Real Child Victims
AI-generated child abuse content also impedes the identification of real child victims, according to the organization.
Legal Implications and Global Reporting
Creating such material is illegal in the United States: producing any visual depiction of minors engaged in sexually explicit conduct is a federal crime, according to a Massachusetts-based prosecutor from the Department of Justice, who spoke on condition of anonymity.
In total in 2023, the CyberTipline received more than 35.9m reports of suspected CSAM, more than 90% of which was uploaded outside the US. Roughly 1.1m reports were referred to police in the US, and 63,892 were urgent or involved a child in imminent danger, according to Tuesday’s report.
Role of Tech Companies
The platform that submitted the most cybertips was Facebook, with 17,838,422 reports. Meta’s Instagram made 11,430,007 reports, and its WhatsApp messaging service made 1,389,618. Google sent the NCMEC 1,470,958 tips, Snapchat sent 713,055, TikTok sent 590,376, and Twitter reported 597,087.
Importance of Reporting and Quality
In total, 245 companies submitted CyberTipline reports to the NCMEC, out of the 1,600 companies around the world that have registered with the cybertip reporting program. US-based internet service providers, such as social media platforms, are legally required to report instances of CSAM to the CyberTipline when they become aware of them.
According to the NCMEC, there is a disconnect between the volume of reports and the quality of the reports submitted. The center and law enforcement cannot legally act on some reports, including those generated by content moderation algorithms, without human input. This technicality can prevent police from seeing reports of potential child abuse.
“The relatively low number of reporting companies and the poor quality of many reports marks the continued need for action from Congress and the global tech community,” the NCMEC report states.
SDGs, Targets, and Indicators
SDGs | Targets | Indicators |
---|---|---|
SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator: Increase in reports of child abuse online |
SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator: Increase in reports of financial sexual extortion |
SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator: Increase in reports of AI-made CSAM (child sexual abuse material) |
SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator: Increase in reports of deepfaked sexually explicit images or videos |
SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator: Increase in reports of AI-generated child abuse content |
SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Indicator: Increase in reports of online enticement |
SDG 17: Partnerships for the Goals | Target 17.17: Encourage and promote effective public, public-private, and civil society partnerships | Indicator: Number of companies submitting CyberTipline reports to the NCMEC |
1. Which SDGs are addressed or connected to the issues highlighted in the article?
SDG 16: Peace, Justice, and Strong Institutions
The issues highlighted in the article are connected to SDG 16, which focuses on promoting peaceful and inclusive societies, providing access to justice for all, and building effective, accountable, and inclusive institutions at all levels. The article specifically discusses child sexual exploitation, abuse, and the circulation of child sexual abuse material (CSAM), which are forms of violence against children that SDG 16 aims to address.
2. What specific targets under those SDGs can be identified based on the article’s content?
Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
Based on the article’s content, the specific target that can be identified is Target 16.2 under SDG 16. This target focuses on ending abuse, exploitation, trafficking, and all forms of violence against and torture of children. The article highlights the rise in child sexual exploitation online, including the circulation of CSAM, financial sexual extortion, and the creation of AI-generated child abuse content.
3. Are there any indicators mentioned or implied in the article that can be used to measure progress towards the identified targets?
Indicator: Increase in reports of child abuse online
The article mentions that reports to the National Center for Missing & Exploited Children (NCMEC) of child abuse online rose by more than 12% in 2023 compared to the previous year. This increase in reports can be used as an indicator to measure progress towards ending abuse and exploitation of children.
Indicator: Increase in reports of financial sexual extortion
The article also mentions an increase in reports of financial sexual extortion, where online predators lure children into sending nude images or videos and then demand money. The rise in these reports can serve as an indicator to measure progress in addressing this form of exploitation.
Indicator: Increase in reports of AI-made CSAM (child sexual abuse material)
The article highlights that the NCMEC received 4,700 reports of images or videos of the sexual exploitation of children made by generative AI. This new category of reports can be used as an indicator to track the increase in AI-made CSAM and measure progress in addressing this issue.
Indicator: Increase in reports of deepfaked sexually explicit images or videos
The article mentions that bad actors can use artificial intelligence to create deepfaked sexually explicit images or videos based on any photograph of a real child. The increase in reports of such deepfakes can be used as an indicator to measure progress in addressing this specific form of abuse.
Indicator: Increase in reports of AI-generated child abuse content
According to the article, AI-generated child abuse content impedes the identification of real child victims. The increase in reports of AI-generated child abuse content can be used as an indicator to measure progress in addressing this issue and ensuring the safety of children.
Indicator: Increase in reports of online enticement
The article mentions a significant increase in reports regarding online enticement, which involves individuals communicating online with someone believed to be a child with the intent to commit a sexual offense or abduction. The rise in these reports can be used as an indicator to measure progress in addressing online enticement and protecting children from exploitation.
Indicator: Number of companies submitting CyberTipline reports to the NCMEC
The article states that 245 companies submitted CyberTipline reports to the NCMEC out of 1,600 registered companies. The number of companies actively participating in reporting instances of child sexual abuse material (CSAM) can be used as an indicator to measure progress in promoting effective partnerships and collaboration in addressing the issue.
Source: theguardian.com