Molly Russell: Meta blamed for delay to schoolgirl’s inquest

Molly Russell died in 2017 but her father Ian, right, has still not been shown the Instagram posts she had viewed before her death

The family of a 14-year-old girl who killed herself after viewing Instagram posts about self-harm have expressed anger at the social media giant after her inquest was delayed again.

The inquest of Molly Russell has been adjourned until at least mid-September after more than a thousand posts were shared only with the coroner and not with her family. Meta, Instagram’s owner, disclosed the majority of 12,000 posts last year but handed over 1,500 posts from private accounts at yesterday’s hearing.

Molly, from Harrow, northwest London, died in November 2017 after viewing posts and images about depression and self-harm. Earlier hearings were told that in her last few days Molly used her Instagram account more than 120 times a day. She liked more than 11,000 pieces of content and shared material more than 3,000 times.

Meta, formerly Facebook, is the parent company of Instagram and WhatsApp.

Oliver Saunders QC, for Molly’s family, told North London coroner’s court that “frustratingly and regrettably we are in a position that the hearing in 12 working days is not going to be viable”. He added that Molly’s father, Ian Russell, could not finalise his witness statement and “give evidence on behalf of the family” until the posts had been reviewed.

The court was told that software had to be written to view the posts and videos, which would run to 36,000 pages if printed double-sided.

Caoilfhionn Gallagher QC, for Meta, said the material would be presented in the order in which Molly interacted with it, rather than by the dates on which it was created, which would lead to a delay of at least a month. She said Meta was only made an interested party in December 2021, had “engaged at great speed to meet disclosure requirements” and was not given notice to provide the posts until last month. Gallagher said the number of posts requested was unprecedented.

Molly’s death is one of the reasons the government is seeking to act against abuses of power by social media giants in its Online Safety Bill, which is going through parliament. The bill would impose a duty of care on the tech giants to stop them allowing users to view harmful material.

What is the Online Safety Bill?
It is an attempt to make Britain the “safest place in the world to be online while defending freedom of expression”. It will regulate companies that host user-generated content accessible in the UK.

Why is it needed?
Ministers have concluded that the codes of conduct agreed with social media companies do not work and children are seeing pornography, suffering online bullying and being driven towards self-harm.

What will it do?
Companies will need to show a duty of care to users by removing material such as child abuse images or terrorist content. Children must be prevented from seeing harmful or dangerous material.

Will it cover legal content?
In some circumstances. Some companies will need to act against content that is “legal but harmful”, such as some types of online abuse or the promotion of self-harm or eating disorders.

Who decides what legal material should be removed?
The government is drawing up a list of “priority” harms so that companies focus on material such as racist abuse and glorification of self-harm rather than political debate.

What are the penalties?
Fines of up to £18 million or 10 per cent of annual global turnover, whichever is greater.

Are there exemptions?
Content published by a “recognised media outlet” will be exempt.
