Following last week’s announcement that YouTube would be disabling the comments function on videos of children in an effort to keep younger users of the platform safe from predators, the NSPCC has reported that the use of social media for grooming children has increased sharply.
Using Freedom of Information data obtained from 39 of the 43 police forces in England and Wales, the charity revealed that 5,161 offences of sexual communication with a child were recorded in the 18 months following the introduction of the new offence.
The Surrey, Sussex, Northamptonshire and City of London police forces did not contribute data.
Facebook, Snapchat and Instagram were the platforms used in 70 per cent of these incidents and, although girls aged 12 to 15 were most commonly targeted, around one in five victims were under the age of 11, with some as young as five.
In the April to September period immediately following the introduction of the new law, Instagram was used by groomers to contact a child 126 times, a figure that rose to 428 for the same period in 2018. These figures are drawn only from incidents where police recorded the method of contact, suggesting that the real number may be even higher.
“These figures are overwhelming evidence that keeping children safe cannot be left to social networks,” said the NSPCC’s chief executive Peter Wanless. “We cannot wait for the next tragedy before tech companies are made to act.”
The government is due to release a white paper on internet safety, proposing new laws aimed at ensuring that 'the UK is the safest place in the world to be online', according to gov.uk. The NSPCC, hopeful that grooming will be among the issues addressed, is campaigning for a legal duty of care to be imposed on tech firms when it comes to the young people who use their platforms.
Accusing social media firms of “10 years of failed self-regulation,” Mr Wanless said: “It is hugely concerning to see the sharp spike in grooming offences on Instagram, and it is vital that the platform designs basic protection more carefully into the service it offers young people.”