The NSPCC is urging tech companies to embed technology on children’s phones that blocks nude images from being created, shared or viewed – and for the Government to take further action if they fail to act.
This comes as new data from the charity reveals reports of child sexual abuse image crimes are rising.
A Freedom of Information request submitted by the NSPCC, and answered by 42 of the 45 police forces, revealed that between 1 April 2024 and 31 March 2025 there were 36,829 recorded offences involving indecent and prohibited images of children across the UK.
Data from the Dyfed-Powys, Gwent and North Wales police forces show that 1,287 offences were recorded over the same period, a 3% increase on the previous year. In the Dyfed-Powys region, 298 offences were recorded, up from 202 the year before.
Overall, Meta platforms still accounted for almost a quarter of all offences (24%), with 8% on Instagram, 7% on WhatsApp, 5% on Facebook and 4% on Messenger. However, the figures in relation to these platforms only paint part of the picture, as end-to-end-encryption (E2EE) means the scale of abuse children are experiencing online is hidden – preventing detection and leading to under-reporting on these platforms.
Without adequate safety features designed to keep children safe online across all platforms, many young people are exposed to the risk of grooming, extortion, online child sexual abuse and having intimate images shared - all of which can have a devastating impact on a child's life, sometimes well into adulthood.
One 17-year-old boy who contacted Childline said, “I shared a nude online and it was leaked, so everyone at school saw it. I was in a really bad way, so I moved schools. The nude pictures still come up as random people message me and blackmail me with them. I'm worried about my new friends seeing them and how the leaked nudes will impact my career in the future."
The Government committed in the Violence Against Women and Girls (VAWG) Strategy to work with tech companies to stop children in the UK from taking or sharing nude images. This new data makes it clear that tech companies are failing to prioritise young people’s safety across their platforms.
The NSPCC believes that tech companies must act now and embed effective protections for children. It argues that using existing technologies on children’s phones to block illegal images in real time would help prevent these crimes from happening in the first place.
Chris Sherwood, CEO at the NSPCC, said: “It is utterly indefensible that we are still seeing around 100 child sexual abuse image offences recorded every single day.
“Children across the UK are being completely failed by tech companies that should be protecting them online. We cannot keep letting them off the hook when they can do more to prevent this from happening in the first place.
“Behind every one of these offences is a child who has been groomed, abused and manipulated. They are left to carry the trauma, whilst tech companies continue to profit handsomely.”