In line with promises made in Mark Zuckerberg's recent manifesto, Facebook is putting machine learning and artificial intelligence to work to help prevent suicide among members of its community. The platform plans to offer new tools and resources to people who may be at risk, alongside continued support from concerned friends and family members.
In addition to already available suicide prevention tools, the social media giant is now making more such resources available through Messenger and Facebook Live.
This means that one of the company's suicide prevention tools is now integrated into Facebook Live. If you're watching a friend's livestream and they say something worrisome, you can reach out and report the video. Once you report the live video as "suicide or self-injury" related, you'll be shown a set of resources to better help your friend in need. On this point, the blog post reads,
The person sharing a live video will see a set of resources on their screen. They can choose to reach out to a friend, contact a help line or see tips. If you or someone you know is in crisis, it is important to call local emergency services right away.
As for Facebook Messenger, the company is adding the ability for people in distress to connect instantly with support partners. The tool is available globally: users can visit a participating organization's Page and message with someone in real time. The resource is currently launching as a test, with Zendesk providing the backend service and support from organizations including Crisis Text Line, the National Eating Disorder Association, and the National Suicide Prevention Lifeline.
These tools and resources are also being accompanied by an AI-powered reporting process that can scan posts and identify those indicating thoughts of suicide. The AI has been trained using pattern recognition on posts already reported for suicide, with the goal of streamlining the reporting and community support process. It will not only make the suicide-reporting options more prominent but also help review posts that haven't been flagged as suicidal by other community members. The AI test is currently being deployed in the U.S. and will be expanded with help from suicide prevention experts.
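Facebook hasn't published details of its system, but as a rough illustration of the general approach described in the announcement (a classifier learning patterns from posts that people have previously reported), a minimal sketch in Python might look like the following. The training examples, features, and model here are placeholders, not the company's actual implementation.

```python
# Illustrative only: a toy text classifier trained on posts previously
# labeled by human reporters, mirroring the general "pattern recognition"
# idea in the announcement. Data and model choices are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training examples: 1 = previously reported as
# "suicide or self-injury", 0 = not reported.
posts = [
    "I can't do this anymore, nothing matters",
    "Had a great day hiking with friends",
    "I just want the pain to stop for good",
    "Excited for the concert this weekend",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus a linear classifier stand in for whatever
# model is used internally.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# New, unreported posts are scored; high-scoring ones could be routed
# to human reviewers and surfaced alongside prevention resources.
new_post = "I feel like giving up on everything"
score = model.predict_proba([new_post])[0][1]
print(f"estimated risk score: {score:.2f}")
```

In practice, a system like this would only surface candidates for human review rather than act on its own, which matches Facebook's description of making reporting options more prominent and helping its team prioritize cases.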
For those unaware, Facebook recently made a significant humanitarian change to its mission statement. Instead of focusing solely on connecting individuals, it is now working to build a safer and healthier community, the next step after connections are made. Today's announcement falls in line with that shift in Zuckerberg's mission, which explicitly acknowledged Live being used to stream suicides, a grim and unfortunate reality.
Now, however, individuals in crisis can more easily connect with their friends or other members of the community to seek guidance. You already have the option to report timeline posts in which someone expresses suicidal thoughts. The company's team prioritizes such cases and provides the person with a number of support options, including reaching out directly to friends and contacting a helpline for counseling.