eSafety commissioner says YouTube ‘turning a blind eye’ to child abuse – Australian Broadcasting Corporation


Report on Social Media Platforms’ Failure to Uphold Child Safety and Sustainable Development Goals

Executive Summary

A report by Australia’s eSafety Commissioner has identified significant failures by major technology corporations, including Google’s YouTube and Apple, in protecting children from online sexual abuse material. These deficiencies represent a direct contravention of the principles outlined in the United Nations Sustainable Development Goals (SDGs), particularly SDG 16, which calls for an end to violence against children and the establishment of just and strong institutions.

Failure to Meet Sustainable Development Goal 16: Peace, Justice and Strong Institutions

The actions and, in some cases, inaction of the world’s largest social media firms undermine the core targets of SDG 16.

  • Violation of Target 16.2: The report indicates that companies are “turning a blind eye” to child sexual exploitation and abuse material on their services. This directly conflicts with the global objective to “end abuse, exploitation, trafficking and all forms of violence against and torture of children.”
  • Lack of Institutional Accountability (Target 16.6): The failure of platforms like YouTube and Apple to track, report on, or provide data regarding response times to user reports of abuse material demonstrates a severe lack of the accountability and transparency required to build “effective, accountable and transparent institutions at all levels.”
  • Inadequate Safety Mechanisms: The eSafety Commissioner, Julie Inman Grant, stated that no other consumer-facing industry would be permitted to operate while enabling such crimes, highlighting a governance gap that SDG 16 aims to close.

Broader Implications for Sustainable Development

The identified safety gaps have wider consequences for the achievement of other SDGs.

  • SDG 3 (Good Health and Well-being): The proliferation of and exposure to child abuse material poses a grave risk to the mental and physical well-being of children, undermining efforts to ensure healthy lives for all.
  • SDG 17 (Partnerships for the Goals): The unresponsiveness of some corporations to inquiries from a national regulatory body like the eSafety Commissioner signifies a breakdown in the public-private partnerships essential for achieving global safety and development objectives.

Specific Corporate Deficiencies and Regulatory Response

The eSafety Commissioner’s investigation, which mandated reporting from Apple, Discord, Google, Meta, Microsoft, Skype, Snap, and WhatsApp, uncovered a range of critical safety failures.

  1. Inadequate Reporting and Tracking: YouTube and Apple were unable to provide data on the number of user reports of child abuse material they received or how long it took them to respond.
  2. Failure to Implement Technology: Platforms were found to be inadequately using available tools, such as hash-matching technology, across all parts of their services to detect and remove known abuse material.
  3. Deficient Safety Systems: Key safety gaps included failures to prevent the live-streaming of abuse material and block links to known sources of such content.
  4. Insufficient Personnel: Both Apple and Google failed to disclose the number of trust and safety personnel they employ to handle these issues.
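The "hash-matching technology" cited in the list above works by comparing a fingerprint of each uploaded file against a database of fingerprints of known abuse material. A minimal sketch of the idea, assuming a cryptographic hash purely for illustration (production systems use perceptual hashes such as Microsoft's PhotoDNA or Meta's PDQ, which also match visually similar copies after resizing or re-encoding; the function names and the in-memory database here are hypothetical):

```python
import hashlib

def file_fingerprint(data: bytes) -> str:
    """Return a hex digest used as the lookup key.

    Illustration only: a cryptographic hash matches exact bytes, whereas
    real deployments use perceptual hashes that tolerate re-encoding.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_material(data: bytes, known_hashes: set[str]) -> bool:
    """Check an upload's fingerprint against a set of known fingerprints."""
    return file_fingerprint(data) in known_hashes

# Hypothetical usage: in practice the hash database is supplied by bodies
# such as NCMEC or the Internet Watch Foundation, never hard-coded.
known = {file_fingerprint(b"known-flagged-example")}
print(matches_known_material(b"known-flagged-example", known))  # True
print(matches_known_material(b"ordinary upload", known))        # False
```

The regulator's criticism is not that this technique is unavailable, but that platforms apply it inconsistently, leaving parts of their services unscanned.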

In response to these findings, the Australian federal government has acted on the Commissioner’s advice to include YouTube in its proposed social media ban for teenagers under 16, reversing a planned exemption.

Which SDGs are addressed or connected to the issues highlighted in the article?

SDG 16: Peace, Justice and Strong Institutions

  • This goal is directly addressed as the article focuses on combating crime (online child sex abuse), ensuring justice for victims, and strengthening institutions. The eSafety Commissioner acts as an institutional body trying to enforce regulations and hold powerful corporations accountable for “crimes occurring on their services” and protecting children from violence and exploitation.

SDG 3: Good Health and Well-being

  • This goal is relevant because the core issue of child sexual abuse has profound and devastating impacts on the mental and physical well-being of children. The article’s focus on “prioritising the protection of children” is a preventative measure aimed at safeguarding their health and well-being from the trauma associated with such material.

What specific targets under those SDGs can be identified based on the article’s content?

SDG 16: Peace, Justice and Strong Institutions

  1. Target 16.2: End abuse, exploitation, trafficking and all forms of violence against and torture of children.
    • The article is entirely centered on this target. The eSafety Commissioner’s report criticizes social media firms for “turning a blind eye” to online child sex abuse material and for failing to prevent “heinous crimes against children on their premises, or services.” The entire regulatory effort described is aimed at ending this specific form of abuse and exploitation of children.

SDG 3: Good Health and Well-being

  1. Target 3.4: By 2030, reduce by one-third premature mortality from non-communicable diseases through prevention and treatment and promote mental health and well-being.
    • The article connects to the “promote mental health and well-being” component of this target. By seeking to eliminate child abuse material from online platforms, the Australian regulator is working to prevent the severe psychological trauma and long-term mental health issues that result from such abuse, thereby promoting the well-being of children.

Are there any indicators mentioned or implied in the article that can be used to measure progress towards the identified targets?

  • Number of user reports of child sex abuse material: This is a direct indicator the eSafety Commissioner attempted to collect. The article states that YouTube and Apple failed to track the number of user reports of child sex abuse material appearing on their platforms.
  • Response time to reports of abuse: This is a key performance indicator of a platform’s effectiveness. The article notes that the companies “could not say how long it took them to respond to such reports.”
  • Implementation of safety technologies: Progress can be measured by the adoption and coverage of specific technologies. The report identified “safety gaps” including failures to use “hash-matching technology on all parts of their services to identify images of child sexual abuse.”
  • Number of trust and safety personnel: This indicates the level of resources a company dedicates to safety. The article mentions that the regulator asked “how many trust and safety personnel Apple and Google have on-staff,” but the companies did not answer.
  • Effectiveness of detection and prevention systems: The article points to “failures to detect and prevent live-streaming of the material or block links to known child abuse material” as a key deficiency, implying that the rate of successful detection and prevention is a critical indicator.

SDGs, Targets and Indicators Table

SDG 16: Peace, Justice and Strong Institutions
  Target 16.2: End abuse, exploitation, trafficking and all forms of violence against and torture of children.
  Indicators:
    • Number of user reports of child sex abuse material received.
    • Response time to reports of abuse.
    • Rate of implementation of safety technologies (e.g., hash-matching, AI).
    • Number of dedicated trust and safety personnel.
    • Rate of successful detection and prevention of live-streaming and links to abuse material.

SDG 3: Good Health and Well-being
  Target 3.4: Promote mental health and well-being.
  Indicators:
    • Effectiveness of measures to prevent exposure to traumatic material (measured by the same indicators as for Target 16.2, as these actions directly contribute to protecting children’s mental well-being).

Source: abc.net.au