A new study by the non-governmental organizations ECPAT International, Eurochild, and Terre des Hommes Netherlands reveals that children often rely on their own instincts when faced with threats to their safety online. The study drew on focus group discussions with 483 children from 15 countries, including ten EU member states. Many children prefer to keep their online activities to themselves and struggle to talk with adults about the risks they face online. They also filter what they tell their parents and caregivers about the harms they encounter, which include cyberbullying, violent content, and negative mental health experiences. Across all the countries studied, online sexual abuse and exploitation, such as grooming, self-generated sexual material, and live-streamed child sexual abuse, emerged as the biggest threat to minors.

Children feel very alone in keeping themselves safe from sexual abuse and exploitation, according to the study. They try to self-censor their behavior and watch out for risks, but lack the tools and information they need to navigate the online world effectively. The report highlights the urgent need for EU countries to find a compromise on a planned new law that would crack down on online child exploitation by using emerging technologies to detect child sexual abuse material and grooming. Digital privacy advocates oppose the law, arguing that it infringes on the right to privacy online. The NGOs stress the need for regulatory frameworks that place the responsibility for protecting children from online sexual abuse on service providers rather than on children themselves.

Amid mounting concern about the use of AI to generate deep-fake child sexual abuse material, the NGOs are calling on digital platforms to play their part in fighting illegal content that puts children's safety at risk. Tomas Hartman, senior public policy manager at Snap Inc., emphasized that the safety and privacy of Snapchat users, especially minors, are a key priority for the app, which has safeguards in place to protect young users. Snapchat supports the planned EU law on child sexual abuse material and uses reliable technologies to scan proactively for such content. The app requires users to be at least 13 years old and applies additional privacy settings to users aged 13 to 17. Snapchat has faced scrutiny for failing to keep underage users off its platform and says it is working to comply with its obligations on the protection of minors.

In short, the NGOs want legal guardrails that shift the burden of protecting children online from the children themselves to online service providers, a concern made more urgent by the spread of AI-generated deep-fake child sexual abuse material. Companies like Snapchat see the planned EU law as crucial and use hash-matching technologies such as PhotoDNA to detect and remove known child sexual abuse material. Ultimately, digital platforms and regulators will have to work together to ensure children's safety and well-being online.
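For readers unfamiliar with how tools like PhotoDNA work: they compute a compact "perceptual hash" of an image and compare it against a database of hashes of known abuse material, so platforms can flag likely matches without storing or viewing the original images. The sketch below illustrates the general idea in Python using a simple difference hash. PhotoDNA's actual algorithm is proprietary, and the hash value, threshold, and file name here are purely hypothetical placeholders.

```python
# Minimal sketch of perceptual-hash matching, the general technique behind
# tools like PhotoDNA. The dHash used here is a simplified stand-in, not
# PhotoDNA's proprietary algorithm.
from PIL import Image


def dhash(image: Image.Image, size: int = 8) -> int:
    """64-bit difference hash: one bit per pixel, set if it is brighter than its right neighbor."""
    gray = image.convert("L").resize((size + 1, size), Image.LANCZOS)
    pixels = list(gray.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def matches_known_hash(image_hash: int, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Flag the image if its hash is within a small Hamming distance of any known hash."""
    return any(bin(image_hash ^ known).count("1") <= max_distance for known in known_hashes)


# Hypothetical usage: in practice the hash list would come from a vetted
# clearinghouse (e.g. NCMEC) and is distributed as hashes, never as images.
KNOWN_HASHES = {0x1A2B3C4D5E6F7081}  # placeholder value for illustration only
with Image.open("user_upload.jpg") as upload:  # hypothetical file name
    if matches_known_hash(dhash(upload), KNOWN_HASHES):
        print("Possible match: escalate to human review and reporting.")
```

Because the hash tolerates small distortions such as resizing or recompression, a near-duplicate of a known image still lands within the distance threshold, which is what makes proactive scanning at platform scale feasible.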
