A third of 13 to 17-year-olds have seen footage of real-life violence on TikTok in the past year, research suggests today.

The poll of 7,500 teenagers for Home Office-backed charity the Youth Endowment Fund found a quarter had seen similar material on Snapchat, 20% on YouTube and 19% on Instagram. Across all social media platforms, the most common type of violent material was footage of fights, which 48% of the children polled had viewed.

Youth Endowment Fund executive director Jon Yates said: "Social media companies need to wake up - it is completely unacceptable to promote violent content to children. Children want it to stop. Children shouldn't be exposed to footage of fights, threats or so-called 'influencers' peddling misogynistic propaganda."

Some 36% of those quizzed had seen threats to beat someone up, while 29% had viewed people carrying, promoting or using weapons. A further 26% had seen posts showing or encouraging harm to women and girls. Asked how they had come across the material, 27% said the platform they were using had suggested it, while only 9% admitted deliberately seeking it out.

Mr Yates added: "This type of content can easily stoke tension between individuals and groups, and lead to boys having misguided and unhealthy attitudes towards girls, women and relationships. As a society, we have a duty to help children live their lives free from violence, both offline and online."

Children's Commissioner Dame Rachel de Souza said she was "deeply concerned by the findings of this report, which clearly show that children and young people are increasingly being exposed to violence". She added: "We need to focus on supporting children - both victims and perpetrators. We need a more consistent approach so that children don't fall through gaps, so that children - wherever they are, whatever background they're from - get the help they need before issues escalate, as well as focus on early intervention to stop young people perpetrating violence by diverting them away from criminal behaviour and offering intensive support to change their behaviour. This couldn't be more important to get right."

A TikTok spokesman said: "TikTok removes or age-restricts content that's violent or graphic, most often before it receives a single view, and provides parents with tools to further customise content and safety settings for their teens' account."

A Snapchat spokeswoman said: "Violence has devastating consequences, and there is no place for it on Snapchat. When we find violent content we remove it immediately. We have no open newsfeed, and the app is designed to limit opportunities for potentially harmful content to go viral. We encourage anyone who sees violent content to report it using our confidential in-app reporting tools. We work with law enforcement to support investigations and partner closely with safety experts, NGOs and the police to help create a safe environment for our community."

A YouTube spokeswoman said the site has strict policies prohibiting violent content and quickly removes material that violates them, with more than 946,000 videos taken down in the second quarter of 2023. Instagram was contacted for comment.