Governments are now discussing a ban on Instagram, following complaints about self-harm content on the platform.
Critics are increasingly vocal that social media companies are failing to make their services safe for users struggling with self-harm impulses.
The parents of British 14-year-old Molly Russell, who took her own life in 2017, said that they had no reason to believe she was suffering from mental health problems. After her death, however, they found that she had been viewing self-harm content on social media.
Rather than providing support to those who searched for self-harm content, Russell’s father, Ian, said Instagram had cultivated a “fatalistic” community around depression.
“I have no doubt that Instagram helped kill my daughter,” Ian said.
Head of Instagram Adam Mosseri will meet with Matt Hancock, the health secretary, this week to discuss efforts to minimise self-harm content.
In a letter, Hancock said, “It is time for internet and social media providers to step up and purge this content once and for all.”
Instagram has already committed to introducing “sensitivity screens” which will hide images until users actively choose to look at them.
In Australia, the crisis support service Lifeline is 13 11 14.
Do you think those in charge of social media platforms are doing enough to ensure online spaces remain safe? Let us know what you think in the comments.