A group of US states is investigating how Instagram targets children despite the potential risks the platform poses to them.
The group – made up of both Democratic- and Republican-led states – is investigating Instagram and Facebook's parent company Meta to determine whether consumer protection laws were broken.
It comes after a company whistleblower testified in the US that the company knew its products could harm children.
A Meta spokesman on Thursday denied that their platforms are unhealthy.
Massachusetts Attorney General Maura Healey, a Democrat who first announced the inquiry, tweeted: "Facebook, or Meta, has known Instagram is linked to depression, eating disorders & suicide among young people."
"We will identify if any laws were broken and end the abuse for good."
Nebraska Attorney General Doug Peterson, a Republican, said that the companies "treat our children as mere commodities to manipulate for longer screen time engagement and data extraction".
"These social media platforms are extremely dangerous and have been proven to cause both physical and mental harm in young people," added New York Attorney General Letitia James.
Facebook, which owns Instagram and WhatsApp, changed its name to Meta last month after a series of scandals.
A Meta spokesman pushed back against the consortium's allegations.
"These accusations are false and demonstrate a deep misunderstanding of the facts," a spokesman said in a statement.
"While challenges in protecting young people online impact the entire industry, we've led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders," he added.
The announcement comes after a series of explosive reports based on documents leaked by former Facebook employee Frances Haugen.
In testimony to US lawmakers, she said the company pushed its platforms to young children despite knowing that they could cause health issues.
In September, the company abandoned plans for a child-focused app after a group of more than 40 state attorneys general wrote to it urging cancellation.
Instagram, like other platforms, requires users to be over 13, but the company has admitted that it knows some users are younger.