Instagram’s algorithm will begin blocking "suspicious" accounts from contacting under-18s in the aftermath of grooming scandals.
From this week, adult accounts on the site deemed to be acting in a predatory way towards children will be barred from messaging, following or commenting on their profiles.
The social media giant admitted that innocent people may be accidentally blacklisted and left unable to follow their children, nieces and nephews on the site, but said its approach was "erring on the side of caution".
In an interview with The Telegraph, Tara Hopkins, Instagram’s head of public policy, said the automated watchlist would "be a powerful tool" in stopping groomers, but would never be "100 percent" accurate at identifying the right accounts.
However, the NSPCC warned that the measures would still leave children "unprotected" on Instagram as abusers would be able to easily create a new account if theirs were blocked.
The move comes after Instagram was caught up in a number of grooming scandals. A Telegraph investigation in May found more than 100 convicted paedophiles with accounts on the social network.
As well as the new blocked list, Instagram said that from this week all new accounts created by under-18s in the UK would be made private by default, meaning people will not be able to see their posts or message them without permission.
Meanwhile, the company’s algorithm will add accounts to its new "suspicious" list for behaviour such as messaging or being blocked by an unusual number of child accounts.
Instagram said accounts would not be notified that they had been deemed suspicious, and there is currently no process for appealing against the designation.
Ms Hopkins said accounts on the suspicious list would be reviewed by humans and removed from it if they had clearly been added by mistake.
She said: "I would caution that the technology doing the signalling [as to whether an account is suspicious] will be pretty good. It is never 100 percent – of course it’s not – which is why we are erring on the side of caution and safety."
Instagram’s move comes as the Government draws up legislation to impose a legal duty of care on social media companies that would see them fined billions if children come to harm on their sites – a measure for which The Telegraph has campaigned since 2018.
Phil Satherley, the NSPCC’s child safety online policy manager, said Instagram’s new measures were a "step in the right direction" but warned they would still "leave children unprotected and at risk".
He added: "It is all too simple for abusers to pose as children to bypass the new measures or to set up new accounts after being banned."