LEADING ARTICLE

Groom Room

Government could and should be doing more to force social networks to end online grooming — but so could the networks themselves

The Times

If social media platforms can flag content as extremist or pornographic, and direct advertisements to users based on their browsing habits, why do they not alert children whenever there is a risk of online grooming? It’s an important question, and one the NSPCC is asking today.

The short answer is that these platforms could automatically send pop-up warnings of potential sexual grooming to moderators as well as children, but they would need to make adjustments to their software and have not done so. The longer answer is that this state of affairs has come about because the government does not yet have an internet safety strategy in force and has indicated that when it does the code of conduct for social networks will be voluntary, not mandatory.

It is not too late to change this approach, and changed it must be. In the six months after last year’s introduction of the new offence of sexual communication with a child, more than 1,300 instances of the crime were recorded by police. The number is higher than the NSPCC expected and the total would be higher still but for an unnecessary two-year delay in bringing this part of the new law into force. The figure also excludes London, where police failed to comply with a freedom of information request.

The primary purpose of police and the law is to keep the public safe. They failed in real-world grooming scandals in Rochdale and Rotherham, but failure is not inevitable online. It should be possible to prevent the cybergrooming of vulnerable children rather than waiting for young lives to be ruined first. The technology exists and it is the very technology that makes social networks so responsive to their users, and so profitable. All that is needed is for the networks to do the right thing, or, failing that, for government to force them to.

Nearly two thirds of the 1,316 online grooming cases logged between April and October last year involved the use of Facebook, Instagram (which Facebook owns) or Snapchat. Sixty-three per cent targeted girls aged 12 to 15; a fifth of victims were 11 or younger.

The platforms promote the idea that they are merely online versions of the informal networks that constitute civil society. This is an illusion. Social media platforms are fully in control of the parameters within which their users communicate while granting them anonymity and freedoms they do not have in the real world. The results are often far from civil. The networks have acknowledged this by starting to take down extremist material and child pornography, identified using a mix of human monitors and software fixes. The NSPCC is asking for further fixes to spot the telltale signs of groomers operating online. These signs include accounts with multiple “friend” requests rejected by children, and account-holders following young people with whom they have no mutual friends. It is not too much to ask. Much of the work has been done already by Swansea University researchers who have identified linguistic patterns commonly used by groomers to win children’s trust. Apps already exist to alert vulnerable children to online bullying and to offer help. Victims of grooming need it too.

The context for the charity’s appeal is a mood of growing international resentment towards the social networks because of their extensive control over users’ privacy, social interactions, news consumption and consumer choices. In the United States a bill is likely to be passed that would make platforms liable for online sex trafficking. In Germany they are subject to stiff fines if proscribed content is not taken down within a day.

In matters of regulation, as of tax, the British government has so far been a soft touch for tech giants. That cannot go on, especially if social networks continue to be part of the problem when they could so easily be part of the solution.