Facebook has addressed recent reports of murder and suicide on its platform. In a statement, the company said such content has no place on the platform.
“This is an appalling incident and our hearts go out to the family of the victim. There is absolutely no place for content of this kind on Facebook and it has now been removed,” the statement said.
Facebook recently updated the tools and resources it offers to people who may be thinking of suicide, as well as the support it offers to their concerned friends and family members. These updates include integrated suicide prevention tools designed to help people in real time on Facebook Live.
If someone posts something on Facebook that makes you concerned about their well-being, you can reach out to them directly or report the post to Facebook. The company has teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports like suicide.
Facebook provides people who have expressed suicidal thoughts with a number of support options. For example, it prompts people to reach out to a friend and even offers pre-populated text to make it easier to start a conversation. Facebook also suggests contacting a helpline and offers other tips and resources people can use to help themselves in that moment.
Suicide prevention tools have been available on Facebook for more than 10 years and were developed in collaboration with mental health organizations such as Save.org, National Suicide Prevention Lifeline, Forefront and Crisis Text Line, and with input from people who have personal experience thinking about or attempting suicide. In 2016 Facebook expanded the availability of the latest tools globally – with the help of over 70 partners around the world – and improved how they work based on new technology and feedback from the community.
Facebook is also aware of at least four recent instances in which attempted suicides streamed on Live were interrupted by friends who saw the streams and alerted authorities.
In addition to improving Facebook’s reporting flows, the company is constantly exploring ways that new technologies can help make Facebook a safe environment. Artificial intelligence, for example, plays an important part in this work, helping Facebook prevent such videos from being reshared in their entirety. The company is also working on improving its review processes.
Currently, thousands of people around the world review the millions of items that are reported every week in more than 40 languages. Facebook prioritizes reports with serious safety implications for its community and is working to make that review process even faster.
Keeping its global community safe is an important part of Facebook’s mission. The company is grateful to everyone who reported these videos and other offensive content, and to those who are helping to keep Facebook safe every day.