Apple has delayed the launch of new child safety features that would have scanned children’s messages for sexually explicit photos, following a privacy backlash.

The iPhone maker had planned to introduce the tools in its upcoming iOS 15 software as part of measures aimed at curbing child sexual abuse and pornography.

Apple said: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material. 

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

New features were due to include software that would check photos uploaded from users’ iPhones to Apple’s online storage platform, iCloud, for known child abuse material.

When users uploaded photos, Apple would automatically compare each one against a database of digital “hashes”, unique fingerprints derived from an image’s contents. Known child abuse images are catalogued by safety watchdogs and marked with these hashes so they can be blocked across the web.

If a user tried to upload multiple suspect photos, Apple could manually review the flagged images, block the account and alert child protection authorities.
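The matching process described above can be sketched in a few lines of code. This is a deliberately crude illustration: the function names, the threshold value and the use of a plain SHA-256 cryptographic hash are all assumptions made for clarity. Apple’s announced design actually relied on a perceptual hash (“NeuralHash”) and an on-device private matching protocol, which are far more involved.

```python
import hashlib

# Hashes of known prohibited images, as might be supplied by a safety
# watchdog. The value here is fabricated purely for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

# Hypothetical threshold: escalate an account only after several matches,
# mirroring the "multiple suspect photos" condition described in the text.
REVIEW_THRESHOLD = 3


def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES


def should_escalate(uploads: list[bytes]) -> bool:
    """Return True if enough uploads match to trigger manual review."""
    match_count = sum(matches_known_hash(img) for img in uploads)
    return match_count >= REVIEW_THRESHOLD
```

Note one design point the sketch glosses over: a cryptographic hash like SHA-256 changes completely if a single pixel changes, which is why real systems use perceptual hashes that tolerate resizing and re-compression.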

A separate tool was planned to monitor the iMessage accounts of under-13s. When a child sent or received an image that appeared to be explicit, they would be warned against sending or opening it. If they proceeded, Apple would alert their parents.

However, the tools were criticised by privacy campaigners. Apple’s plans to check its iCloud service for child abuse images were described by activists as a “backdoor into people’s devices”.

The Electronic Frontier Foundation warned that the tools could open the door to authoritarian governments demanding that other types of photos be reported to the state.

Apple rejected these claims, arguing its technology was more secure than tools developed by other companies. Child safety advocates argued Apple’s safety technology would help stop the spread of child abuse images.

Technology companies have been launching new tools to protect children online ahead of new UK rules. The Age Appropriate Design Code, which came into force on September 1, requires technology companies to introduce further safeguards on how children use apps and websites.