Children as young as 13 are being targeted online with adult sexual images and content promoting suicide and crash diets within 24 hours of setting up social media accounts, an investigation has found.

The research, to be published on Tuesday, found social media firms’ commercially orientated algorithms drove adverts towards children, as well as recommending ever more adult and harmful content to them in a bid to monetise their accounts.

It included sexual content from porn sites, requests from adults for contact, material on self-harm and suicide, and idealised bodies “so unachievable that they distort any sense of what a body should look like”.

This was uncovered by the research agency Revealing Reality, on behalf of the 5Rights Foundation, which created 10 fake accounts, or “avatars”, imitating children aged from 13 to 17 setting up social media accounts.

Within 24 hours, the accounts were directly targeted with harmful content by companies including Facebook, Instagram and TikTok, and received direct messages from strangers or were added by them to group chats linking them to sites with paid or pornographic content.

‘Profound carelessness and disregard for children’

Dame Rachel de Souza, the Children’s Commissioner, will on Tuesday call for urgent action by the Government and industry to “bring about an online world which is fit for children”.

“This research highlights the enormous range of risks children encounter online. We don’t allow children to access services and content that are inappropriate for them, such as pornography, in the offline world. They shouldn’t be able to access them in the online world either,” she said.

Ian Russell, whose daughter Molly took her life after being bombarded with self-harm content on Instagram, said the research showed “algorithmic amplification actively connects children to harmful digital content, sadly as I know only too well, sometimes with tragic consequences.”

Baroness Kidron, chairman of the 5Rights Foundation, said: “What this Pathways research highlights is a profound carelessness and disregard for children, embedded in the features, products and services of the digital world.”

Content designed to boost screen time

Designers told researchers they were tasked with creating services that maximised three things: the time users spent online, the reach of the site so as to draw in as many people as possible, and user activity, by encouraging them to interact with and generate as much content as possible.

This meant children’s attention was stimulated with push notifications, endless scrolling feeds, likes to quantify their popularity, shared content, in-app or in-game purchases, and easy links to connect with friends or followers.

The child avatars were not only targeted with child-focused, age-appropriate advertising, but were also served sexual content, requests from adults for contact, self-harm and suicide material, crash diets and other extreme body image content.

In one screenshot, a child receives not only adverts for a Nintendo Switch, a sweet shop and tampons aimed at teenagers, but at the same time pro-suicide material entitled “It is easy to end it all”.

Another avatar, registered as a 15-year-old, is targeted by a Home Office advert for an anti-child abuse campaign while at the same time being offered contact with, and content from, adults in pornographic poses.

“The company monetised that child account, but still they recommended, ranked, rated or offered up material that in many cases broke their own terms and, in every case, should not have been offered to a user registered as a child,” said Baroness Kidron.