Government sends notices to X, Telegram, YouTube to remove child sexual abuse material

The Ministry of Electronics and Information Technology (MeitY) Issues Notices to Social Media Intermediaries

The Ministry of Electronics and Information Technology (MeitY) issued notices on Friday to social media intermediaries X, YouTube, and Telegram, warning them to remove Child Sexual Abuse Material (CSAM) from their platforms on the Indian Internet.

The notices emphasize the prompt and permanent removal of, or disabling of access to, any CSAM on these platforms, MeitY said in a statement.

The notices also call for the implementation of proactive measures, such as content moderation algorithms and reporting mechanisms, to prevent the dissemination of CSAM in the future.
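Proactive detection of known abuse material is commonly built on matching uploads against curated hash lists of previously identified content (real deployments use perceptual hashes such as PhotoDNA rather than plain cryptographic hashes). A minimal, purely illustrative sketch in Python — all data, names, and the action labels are hypothetical:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Cryptographic hash, used here only as a stand-in for a perceptual hash."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical hash list of known prohibited content (stand-in sample bytes).
KNOWN_HASHES = {sha256_hex(b"known-prohibited-sample")}

def moderate_upload(data: bytes) -> str:
    """Return the action a platform might take on an upload."""
    if sha256_hex(data) in KNOWN_HASHES:
        return "block_and_report"  # remove/disable access and notify authorities
    return "allow"                 # passes automated check; user reports still apply

print(moderate_upload(b"known-prohibited-sample"))  # block_and_report
print(moderate_upload(b"ordinary photo"))           # allow
```

Hash matching only catches previously identified material; the reporting mechanisms the notices call for cover content not yet in any hash list.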

Strict Action

The notices state that non-compliance with these requirements will be deemed a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021. The Ministry has also warned the three intermediaries that any delay in complying will result in the withdrawal of their safe-harbour protection under Section 79 of the IT Act, which currently shields them from legal liability for third-party content.

“We have sent notices to X, YouTube, and Telegram to ensure there is no Child Sexual Abuse Material that exists on their platforms. The government is determined to build a safe and trusted Internet under the IT rules. The IT rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms,” said Rajeev Chandrasekhar, Minister of State for Skill Development & Entrepreneurship and Electronics & IT.

The IT Act, 2000, provides the legal framework for addressing pornographic content, including CSAM. Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content.

Published on October 6, 2023

SDGs, Targets, and Indicators Analysis

1. Which SDGs are addressed or connected to the issues highlighted in the article?

  • SDG 5: Gender Equality
  • SDG 16: Peace, Justice, and Strong Institutions

The issue highlighted in the article is the presence of Child Sexual Abuse Material (CSAM) on social media platforms. This issue is connected to SDG 5, which aims to achieve gender equality and empower all women and girls. CSAM is a form of violence against children, particularly girls, and addressing this issue is crucial for promoting gender equality.

The issue is also connected to SDG 16, which aims to promote peaceful and inclusive societies for sustainable development, provide access to justice for all, and build effective, accountable, and inclusive institutions at all levels. Removing CSAM from social media platforms helps create a safe and secure online environment and ensure justice for victims.

2. What specific targets under those SDGs can be identified based on the article’s content?

  • Target 5.2: Eliminate all forms of violence against all women and girls in the public and private spheres
  • Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
  • Target 16.3: Promote the rule of law at the national and international levels and ensure equal access to justice for all

Based on the article’s content, the specific targets that can be identified are related to eliminating violence against women and girls (Target 5.2) and ending abuse and violence against children (Target 16.2). The article highlights the importance of removing CSAM, which involves violence against children, particularly girls. Additionally, ensuring the implementation of proactive measures and content moderation algorithms aligns with promoting the rule of law and equal access to justice (Target 16.3).

3. Are there any indicators mentioned or implied in the article that can be used to measure progress towards the identified targets?

Yes, the article implies indicators that can be used to measure progress towards the identified targets. These indicators include:

  • Number of reported cases of CSAM on social media platforms
  • Number of CSAM content removals or disabling of access on social media platforms
  • Implementation and effectiveness of content moderation algorithms
  • Existence and utilization of reporting mechanisms for CSAM
  • Timeliness of response to CSAM reports

The article mentions the importance of prompt and permanent removal or disabling of access to any CSAM on social media platforms. This implies that the number of reported cases and the subsequent removal or disabling of CSAM content can be used as indicators to measure progress. Additionally, the implementation and effectiveness of content moderation algorithms and reporting mechanisms can also serve as indicators to assess the platforms’ efforts in preventing the dissemination of CSAM.
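As an illustration only, two of the implied indicators — the removal rate and the timeliness of response — could be computed from a platform's internal report log. The records below are invented, and the schema (report time, action time) is an assumption:

```python
from datetime import datetime
from statistics import median

# Hypothetical report log: (reported_at, actioned_at or None if still pending).
reports = [
    (datetime(2023, 10, 1, 9, 0), datetime(2023, 10, 1, 9, 30)),
    (datetime(2023, 10, 1, 10, 0), datetime(2023, 10, 1, 14, 0)),
    (datetime(2023, 10, 2, 8, 0), None),  # not yet actioned
]

# Indicator 1: share of reported items actually removed/disabled.
actioned = [(r, a) for r, a in reports if a is not None]
removal_rate = len(actioned) / len(reports)

# Indicator 2: median hours between report and removal.
median_response_h = median((a - r).total_seconds() / 3600 for r, a in actioned)

print(f"removal rate: {removal_rate:.0%}")        # removal rate: 67%
print(f"median response: {median_response_h} h")  # median response: 2.25 h
```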

4. Table: SDGs, Targets, and Indicators

SDG 5: Gender Equality
  • Target 5.2: Eliminate all forms of violence against all women and girls in the public and private spheres
    Indicators:
    • Number of reported cases of CSAM on social media platforms
    • Number of CSAM content removals or disabling of access on social media platforms

SDG 16: Peace, Justice, and Strong Institutions
  • Target 16.2: End abuse, exploitation, trafficking, and all forms of violence against and torture of children
    Indicators:
    • Number of reported cases of CSAM on social media platforms
    • Number of CSAM content removals or disabling of access on social media platforms
    • Implementation and effectiveness of content moderation algorithms
    • Existence and utilization of reporting mechanisms for CSAM
    • Timeliness of response to CSAM reports
  • Target 16.3: Promote the rule of law at the national and international levels and ensure equal access to justice for all
    Indicators:
    • Implementation and effectiveness of content moderation algorithms
    • Existence and utilization of reporting mechanisms for CSAM
    • Timeliness of response to CSAM reports

This analysis was produced with proprietary AI technology that examined the source article in light of the Sustainable Development Goals. All rights reserved by SDG Investors LLC.

Source: thehindubusinessline.com

 

To become a member and contribute, visit https://sdgtalks.ai/welcome.