Instagram has fuelled a 70 per cent rise in online grooming of children, leading to calls from the NSPCC for tech bosses to face prosecution if they put children in danger.

New figures show that police forces recorded 5,441 crimes of sexual communications with a child in the year to March 2021, a rise of 69 per cent from 3,217 in 2017/18, when the offence was first introduced after a campaign by the NSPCC.

The police data, obtained through Freedom of Information requests, showed Instagram was the most common site used by offenders, flagged by police in 32 per cent of cases where a platform was identified.

Since 2017/18, the number of offences linked to the picture-sharing platform has almost doubled from 418 to 808 in the last year, according to the data for 42 of the 43 police forces in England and Wales.

Facebook-owned apps including Instagram, WhatsApp and Messenger accounted for more than half of the offences, while Snapchat was the second most common, responsible for a quarter of the crimes where a platform was identified.

The NSPCC said the data underlined the need for tougher government action than currently planned in its draft online harms bill, which is due to be considered by a parliamentary committee next month.

The charity wants ministers to introduce criminal sanctions where bosses would be held personally liable for serious breaches of their companies’ duty of care to protect children from online harms.

The criminal sanctions are currently “reserve” powers that the Government will only introduce if its proposed regime of multi-billion pound fines, worth up to 10 per cent of the tech giants’ global turnover, fails to improve protections for children.


Andy Burrows, the head of child safety online policy at the NSPCC, said: “Year after year tech firms’ failings result in more children being groomed and record levels of sexual abuse.

“To respond to the size and complexity of the threat, the Government must put child protection front and centre of legislation and ensure the online safety bill does everything necessary to prevent online abuse.

“Safety must be the yardstick against which the legislation is judged and ministers’ welcome ambition will only be realised if it achieves robust measures to keep children truly safe now and in the future.”

The NSPCC said the figures were likely to underestimate the true scale of online grooming, as two technology failures had left Facebook detecting less than half the child abuse content it had done previously.

The charity said tech firms had failed to adequately respond to the increased risk children faced during lockdowns because of their historic inaction in designing their sites safely for young users.

The NSPCC acknowledged that companies including Instagram, Apple and TikTok had announced new safety measures, but claimed they were playing catch-up in responding to the threat after years of poorly designed sites.

Limits – one of the new features on Instagram – gives users the power to automatically hide comments and direct message requests from other users who do not already follow them, or who have only recently followed them.