Australia Fines X for Not Providing Information on Child Abuse Content
The Service Formerly Known as Twitter Fined for Failing to Disclose Its Efforts to Combat Child Exploitation
Summary
The service formerly known as Twitter told Australian regulators that its automated detection of abusive material had declined after Elon Musk bought it.
Australia said on Sunday that it would fine X for failing to provide information about its efforts to combat child exploitation and that the social media service had told officials that its automated detection of abusive material declined after Elon Musk bought the company.
The fine is 610,500 Australian dollars, or about $384,000.
X, formerly known as Twitter, did not comply with a national law that requires platforms to disclose what they are doing to fight child exploitation on their services, Australian officials said. They said they had sent legal notices to X, Google, Discord, TikTok and Twitch in February, asking the companies for details about their measures for detecting and removing child sexual abuse material.
“Companies can make empty statements like ‘Child exploitation is our top priority,’ so what we’re saying is show us,” Julie Inman Grant, Australia’s commissioner in charge of online safety, said in an interview. “This is important not only in terms of deterrence in the types of defiance we are seeing from the companies but because this information is in the public interest.”
Mr. Musk bought Twitter for $44 billion last October. Since then, he has renamed the service X and loosened the platform’s content moderation rules. The company said this year that it was suspending hundreds of thousands of accounts for sharing abusive material, but a New York Times review in February found that such imagery persisted on the platform.
X told Australian officials that its detection of child abuse material on the platform had fallen to 75 percent from 90 percent in the three months after Mr. Musk bought the company. The detection has since improved, X told them.
Google and X failed to answer all of the regulator’s questions, Australian officials said. While Google received a warning, they said, X’s failure to respond was more extensive.
Tech companies take varied approaches to detecting and eradicating child sexual abuse material. Some use automated scanning tools on all parts of their platforms, while others use them only in certain circumstances. Several of the companies said they responded to reports of abuse within minutes, while others said they took hours, according to a report from Australia’s eSafety Commissioner.
X can appeal the fine. The company did not immediately have a comment. Lucinda Longcroft, a director of government affairs and public policy for Google, said in a statement, “Protecting children on our platforms is the most important work we do.” She added, “We remain committed to these efforts and collaborating constructively and in good faith with the safety commissioner, government and industry on the shared goal of keeping Australians safer online.”
X also told the Australian regulator that it maintained a “zero-tolerance policy” on child sexual abuse material and was committed to finding and removing the content on its platform. The company said it uses automated software to detect abusive images and has experts who can review content shared on the platform in 12 languages.
Asked whether children might be targeted for grooming on X, the company told the regulator, “Children are not our target customer, and our service is not overwhelmingly used by children.”
Linda Yaccarino, X’s chief executive, recently said at a conference that Generation Z was the company’s fastest-growing demographic, with 200 million teenagers and young adults in their 20s visiting the platform each month.
SDGs, Targets, and Indicators
- SDG 16: Peace, Justice, and Strong Institutions
  - Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
  - Indicator: Percentage change in the detection of child abuse material on the platform
- SDG 17: Partnerships for the Goals
  - Target 17.16: Enhance the global partnership for sustainable development, complemented by multi-stakeholder partnerships that mobilize and share knowledge, expertise, technology, and financial resources
  - Indicator: Collaboration between X (formerly Twitter) and Australian regulators in addressing child exploitation on the platform
The article highlights the issue of X (formerly known as Twitter) failing to provide information about its efforts to combat child exploitation. This issue is directly connected to SDG 16, which aims to end abuse, exploitation, trafficking, and all forms of violence against children. Specifically, Target 16.2 focuses on ending violence against children, including child exploitation. The article mentions that X did not comply with a national law requiring platforms to disclose their measures for detecting and removing child sexual abuse material.
The article also mentions that X’s automated detection of child abuse material on the platform declined after Elon Musk bought the company. This decline in detection can be seen as a setback in achieving Target 16.2. The specific indicator mentioned in the article is the percentage change in the detection of child abuse material on the platform. X informed Australian officials that the detection had fallen to 75 percent from 90 percent after Musk’s acquisition of the company.
Additionally, SDG 17, which focuses on partnerships for the goals, is relevant to the article. Target 17.16 emphasizes the importance of global partnerships and collaboration in achieving sustainable development. The article mentions that Australian officials sent legal notices to X, Google, Discord, TikTok, and Twitch, asking for details about their measures for detecting and removing child sexual abuse material. The indicator related to this target is the collaboration between X and Australian regulators in addressing child exploitation on the platform.
| SDGs | Targets | Indicators |
|---|---|---|
| SDG 16: Peace, Justice, and Strong Institutions | Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children | Percentage change in the detection of child abuse material on the platform |
| SDG 17: Partnerships for the Goals | Target 17.16: Enhance the global partnership for sustainable development, complemented by multi-stakeholder partnerships that mobilize and share knowledge, expertise, technology, and financial resources | Collaboration between X (formerly Twitter) and Australian regulators in addressing child exploitation on the platform |
This article was generated using proprietary AI technology that analyzed data related to the Sustainable Development Goals. All rights reserved by SDG Investors LLC.
Source: nytimes.com
To become a member and contribute, visit https://sdgtalks.ai/welcome.