Today's Editorial - 01 January 2023

The deplorable world of child sexual abuse

Source: Tejasvini Akhawat, Deccan Herald

The vast domain of the Internet brings with it several severe challenges, one of which is Child Sexual Abuse Material (CSAM), commonly referred to as child pornography. The United Nations recently declared 18 November as the day to spotlight the issue of child sexual abuse.

According to the US-based National Centre for Missing and Exploited Children (NCMEC), about two million cases of child sexual abuse are reported every year in India. The challenge is that the creation and distribution of CSAM occur both online and offline. The lack of reporting, poor awareness of cybercrime protocols, and the easy availability of Virtual Private Network facilities and end-to-end encryption hamper the ability of governments and intermediaries to track such content.

The malicious business of child sexual abuse requires tactful handling, especially in identifying victims without exposing their identities in the public domain. Furthermore, the use of the Internet has become inevitable for children owing to the shift to online learning, especially after the Covid pandemic.

According to the India Child Protection Fund’s (ICPF) Report of 2020, millions of paedophiles, child rapists, and child pornography addicts increased their activities online during the pandemic.

While the Internet remains largely uncharted territory for children, measures are required to ensure the safe usage of online content through adult supervision, counselling, and curriculum changes for digital literacy, including sections on social media regulations, cyber safety, and related laws such as the Protection of Children from Sexual Offences (POCSO) Act, 2012.

To combat the menace of CSAM, the ad hoc committee of the Rajya Sabha chaired by Jairam Ramesh, set up “to study the alarming issue of child pornography on social media and its effects on children and society as a whole”, submitted its report in January 2020. It recommended that the ministries of electronics and information technology and home affairs sign MoUs with industry partners to develop technological solutions, such as Artificial Intelligence (AI) tools, for dark-web investigation and proactive monitoring of CSAM. It also suggested partnerships with blockchain companies to track cryptocurrency transactions in the trading of child pornography content.

There is a dire need for capacity building, skill development, and better training of cyber cell experts and police personnel. Adopting and emulating the best practices of some states can fast-track the resolution of the dangers posed by CSAM. These include the Goa government’s tie-up with Google to deploy e-safety modules in school curricula, the Maharashtra Cyber Cell’s ‘Operation Black Face’, and the Kerala police’s ‘Operation Daddy’ to tackle paedophilia and CSAM.

Owing to the transnational nature of the crime, India, as a South Asian giant, should work towards closer partnership and cooperation with nations under similar threats to eliminate the creation, sustenance, and viewership of CSAM. An Online Child Sexual Abuse and Exploitation (OCSAE) Prevention/Investigation Unit was set up by the Central Bureau of Investigation in 2019 under its Special Crime Zone. The OCSAE’s recent access to Interpol’s intelligence and investigative tool, the International Child Sexual Exploitation Database, is a step in the right direction. A zero-tolerance policy is the only way to stop the production, circulation, and use of CSAM.

The National Human Rights Commission (NHRC), in its July 2020 virtual conference on ‘Online Child Sexual Abuse and Child Pornography’, cited various reasons for the wide circulation of CSAM, such as a lack of information and education on sexual expression and the trafficking of children for sexual exploitation. It also made several recommendations: an upgraded surveillance mechanism, improved inter-state and international coordination, and the establishment of a national database on CSAM, to name a few. The NHRC’s complaint management and redressal system squarely targets the victimisation of children through CSAM.

The world of CSAM is darker than any of us could have imagined. As per the ICPF’s report, the user base for CSAM content in India is more than 90% male. Furthermore, the key findings of the research indicated that Indian men are not ‘satisfied’ with generic child pornography and demand specific violent and exploitative content. Turning a blind eye to this tragic reality would only push its victims further into the shadows. The horrifying sexual acts perpetrated on children are antithetical to their innocence and development.