Parents will be able to ask Google to delete photographs of their children under new policies being brought in by the tech giant after criticism that it is failing to protect young people online.
Google, which also owns YouTube, said it will update the removal tools in its search engine, enabling parents and under-18s themselves to request that any image of them appearing in the company’s image search be deleted.
While the tool will not permanently delete images from the whole web, it will make them harder to find on the open internet. The privacy feature is expected to be launched in the coming weeks.
Google will also block advertising targeted at people under 18 on the basis of their age, gender or interests. This change will be implemented globally over the coming months, the company said.
The new tools will come into force as Britain’s data regulator tees up rules that will require UK firms to take greater steps to protect children browsing the web.
The Information Commissioner’s Office (ICO) is due to enforce new powers from September, under the Age Appropriate Design Code, which will place a duty on companies to ensure their websites are designed with the ages of the children who use them in mind.
The regulator will have the power to fine companies up to 4pc of global turnover or £17.5m for the worst breaches of the new code.
The Telegraph has campaigned since 2018 for a Duty of Care to be placed on technology companies to protect children online.
The tech giant will also turn on its SafeSearch filter – a tool that hides adult links and images – by default for users under the age of 18.
Google said it would also change the default settings of its YouTube app to the most private options for teenage users between the ages of 13 and 17. This will make videos private by default, meaning they can only be viewed by users with a specific link. The video site will no longer offer auto-playing videos to younger users by default.
James Beser of YouTube said it would also crack down on videos in its YouTube Kids app – designed for under-13s – that overtly promote commercial products.
“In the coming weeks, we’ll also begin to remove overly commercial content from YouTube Kids, such as a video that only focuses on product packaging or directly encourages children to spend money," he said.
Mindy Brooks, general manager of YouTube Kids and Family, said: “Some countries are implementing regulations in this area, and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens globally.”
Internet users already had the power to request that certain images be removed from search results under the “right to be forgotten”, which was previously part of EU rules and now forms part of the UK’s data regime.
However, that regulation requires internet users to write directly to tech companies and demand that their image, or a story about them, be taken down, a request which must be weighed against the public interest.
Google’s new tools will go beyond this right, making it easier and faster to remove images of young people.
It is understood Google believes the changes are compliant with the ICO’s new code, with some of the features going beyond what has been demanded by the regulator.