By Angus Crawford
Published: 6 hours ago

Image caption: Some self-harm images remain on Instagram despite being marked with users' own trigger warnings
Children's charity the NSPCC has said a drop in Facebook's removal of harmful content was a "significant failure in corporate responsibility".
Facebook's own records show that between April and June this year its Instagram app removed almost 80% less graphic content about suicide and self-harm than in the previous quarter.
Covid restrictions meant most of the company's content moderators were sent home.
Facebook said it prioritised the removal of the most harmful content.
Figures published on Thursday showed that as restrictions were lifted and moderators started to go back to work, the number of removals went back up to pre-Covid levels.