Google, Meta, Discord, and more team up to fight child abuse online
A new cross-platform information-sharing program called Lantern aims to keep predators from directing minors to other platforms for exploitation.
By Wes Davis, a weekend editor who covers the latest in tech and entertainment. He has written news, reviews, and more as a tech journalist since 2020.
- A new program called Lantern aims to fight online child sexual exploitation and abuse (OCSEA) with cross-platform signal sharing between online companies like Meta and Discord. The Tech Coalition, a group of tech businesses with a cooperative aim to fight online child sexual exploitation, wrote in today’s announcement that the program is an attempt to keep predators from avoiding detection by moving potential victims to other platforms.
- Lantern serves as a central database for companies to contribute data and check their own platforms against. When companies see signals, like known OCSEA policy-violating email addresses or usernames, child sexual abuse material (CSAM) hashes, or CSAM keywords, they can flag them in their own systems. The announcement notes that while the signals don’t strictly prove abuse, they help companies investigate and possibly take action like closing an account or reporting the activity to authorities.
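The matching described above — checking a platform's own records against signals contributed to a shared database — can be sketched as a simple set lookup. This is a hypothetical illustration only; Lantern's actual API, data model, and field names are not public, and the names below are invented for the sketch.

```python
# Hypothetical sketch: flag platform records whose content hash or
# username appears in a shared cross-platform signal set.

def match_signals(platform_records, shared_signals):
    """Return records matching any known signal (hash or username)."""
    flagged = []
    for record in platform_records:
        if (record["content_hash"] in shared_signals
                or record["username"] in shared_signals):
            flagged.append(record)
    return flagged

# Example usage with made-up data:
shared = {"deadbeef1234", "known_bad_user"}
records = [
    {"username": "known_bad_user", "content_hash": "aaaa"},
    {"username": "ok_user", "content_hash": "bbbb"},
]
flagged = match_signals(records, shared)
```

As the announcement notes, a match like this is an investigative lead rather than proof of abuse, so a real system would route flagged records to human review rather than act on them automatically.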
- Meta wrote in a blog post announcing its participation in the program that, during Lantern’s pilot phase, it used information shared by one of the program’s partners, Mega, to remove “over 10,000 violating Facebook Profiles, Pages and Instagram accounts” and report them to the National Center for Missing and Exploited Children.
- The coalition’s announcement also quotes John Redgrave, Discord’s trust and safety head, who says, “Discord has also acted on data points shared with us through the program, which has assisted in many internal investigations.”
- The companies participating in Lantern so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Members of the coalition have been developing Lantern for the last two years, and the group says that besides creating technical solutions, it had to put the program through “eligibility vetting” and ensure it jibes with legal and regulatory requirements and is “ethically compliant.”
- One of the big challenges for programs like this is ensuring they are effective without creating new problems. In a 2021 incident, a father was investigated by police after Google flagged photos of his child's groin infection as CSAM. Several groups warned that similar issues could arise with Apple's now-canceled automated iCloud photo library CSAM-scanning feature.
- The Tech Coalition wrote that it commissioned a human rights impact assessment from Business for Social Responsibility (BSR), a larger coalition of companies focused on global safety and sustainability issues. BSR will offer ongoing guidance as the program changes over time.
- The coalition will oversee Lantern and says it’s responsible for making clear guidelines and rules for data sharing. As part of the program, companies must complete mandatory training and routine check-ins, and the group will review its policies and practices regularly.
SDGs, Targets, and Indicators
SDG 16: Peace, Justice, and Strong Institutions
- Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
- Target 16.9: By 2030, provide legal identity for all, including birth registration
- Indicator 16.2.3: Proportion of young women and men aged 18-29 years who experienced sexual violence by age 18
SDG 17: Partnerships for the Goals
- Target 17.16: Enhance the Global Partnership for Sustainable Development
- Indicator 17.16.1: Number of countries reporting progress in multi-stakeholder development effectiveness monitoring frameworks that support the achievement of the Sustainable Development Goals
Analysis
The article discusses a new program called Lantern that aims to fight online child sexual exploitation and abuse (OCSEA) by sharing signals between online companies like Meta and Discord. Based on the content of the article, the following SDGs, targets, and indicators can be identified:
1. SDG 16: Peace, Justice, and Strong Institutions
The issue of online child sexual exploitation and abuse aligns with SDG 16, which focuses on promoting peace, justice, and strong institutions. The targets under this SDG that can be identified based on the article’s content are:
- Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
- Target 16.9: By 2030, provide legal identity for all, including birth registration
The article mentions that Lantern aims to fight online child sexual exploitation and abuse by preventing predators from evading detection by moving potential victims to other platforms. This aligns with the target of ending abuse and exploitation of children. The program's central database, which companies use to contribute data and check their own platforms, relates less directly to the target of providing legal identity for all.
The article does not explicitly mention any indicators related to SDG 16, but Indicator 16.2.3, which measures the proportion of young women and men who experienced sexual violence by age 18, could be relevant in measuring progress towards ending online child sexual exploitation and abuse.
2. SDG 17: Partnerships for the Goals
The issue of combating online child sexual exploitation and abuse requires partnerships and collaboration between various stakeholders. SDG 17 focuses on partnerships for the goals, and the target that can be identified based on the article’s content is:
- Target 17.16: Enhance the Global Partnership for Sustainable Development
The article mentions that Lantern is a program developed by The Tech Coalition, a group of tech businesses with a cooperative aim to fight online child sexual exploitation. This aligns with the target of enhancing global partnerships for sustainable development.
Indicator 17.16.1 measures the number of countries reporting progress in multi-stakeholder development effectiveness monitoring frameworks that support the achievement of the Sustainable Development Goals. While the article does not provide specific information about this indicator, the program's collaboration between multiple companies and stakeholders to address online child sexual exploitation and abuse is the kind of multi-stakeholder effort the indicator tracks.
Table: SDGs, Targets, and Indicators
| SDGs | Targets | Indicators |
|---|---|---|
| SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children; Target 16.9: By 2030, provide legal identity for all, including birth registration | Indicator 16.2.3: Proportion of young women and men aged 18-29 years who experienced sexual violence by age 18 |
| SDG 17: Partnerships for the Goals | Target 17.16: Enhance the Global Partnership for Sustainable Development | Indicator 17.16.1: Number of countries reporting progress in multi-stakeholder development effectiveness monitoring frameworks that support the achievement of the Sustainable Development Goals |
This article was produced with proprietary AI technology that analyzed data related to the Sustainable Development Goals. All rights reserved by SDG Investors LLC.
Source: theverge.com
Become a member and contribute at https://sdgtalks.ai/welcome.