What you need to know
- Instagram has announced it has extended restrictions on depictions of self-harm.
- It now covers fictional depictions of self-harm or suicide, including drawings, memes and graphic images from films or comics.
- Instagram previously announced tougher measures for self-harm images in February.
Instagram has announced that in the last month it has extended its ban on depictions of self-harm and suicide to include fictional content such as memes and drawings, as well as graphic images from film and comics.
In a blog post on Sunday, Instagram head Adam Mosseri said:
This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery. We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.
Accounts sharing this type of content will also not be recommended in search or in our discovery surfaces, like Explore. And we'll send more people more resources with localized helplines like the Samaritans and PAPYRUS in the UK or the National Suicide Prevention Lifeline and The Trevor Project in the United States.
Mosseri framed the latest updates by acknowledging that young people can be influenced both positively and negatively by what they see online.
Two things are true about online communities, and they are in conflict with one another. First, the tragic reality is that some young people are influenced in a negative way by what they see online, and as a result they might hurt themselves. This is a real risk.
But at the same time, there are many young people who are coming online to get support with the struggles they're having — like those sharing healed scars or talking about their recovery from an eating disorder. Often these online support networks are the only way to find other people who have shared their experiences.
Instagram says that since announcing new measures in February to curb content related to self-harm and suicide, it has removed, reduced the visibility of, or added sensitivity screens to more than 834,000 pieces of content, finding more than 77% of them before they were reported. Mosseri also recognised that this is an ongoing battle and that Instagram's work on the subject will never be finished, with its policies and technology evolving as new behaviours and trends emerge. The company says it is working with several academics and experts, as well as the Swedish mental health organization MIND. It expressed a desire to bring its improved technology to the EU, but said it first needed to review "important legal considerations under EU law" in partnership with its European regulator.