Facebook uses Artificial Intelligence to help spot potentially suicidal users


Facebook has gradually grown from supposed social media fad to an everyday essential that has amassed a monthly base of 1.86 billion users. The ever-scaling operation frequently pushes out new features to keep its users interested, and at the moment, one of those features is Facebook Live, a service that lets users broadcast real-time videos to their followers. While it has found favor with professionals and laymen alike, it has also become an unfortunate platform for live suicides.



On Wednesday, Facebook announced a significant step toward building a safer and more supportive community: a major strengthening of its suicide prevention tools, which the company has offered in some form for a decade. The update includes a striking claim: Facebook will use artificial intelligence to identify members who may be contemplating suicide.

“I wrote a letter on building global community a couple weeks ago, and one of the pillars of community I discussed was keeping people safe,” Zuckerberg wrote on his personal Facebook feed on Wednesday. “These tools are an example of how we can help keep each other safe.”
Noting that live suicides had occurred on similar platforms before, Facebook is now testing a system that relies on pattern recognition based on posts previously reported for suicide risk. The AI tool looks at words in the post and, especially, comments from friends — such as “Are you okay?” and “I’m here to help” — that may indicate someone is struggling.
Now, when at-risk behavior is detected, Facebook will offer the user resources ranging from the option to contact a friend or a helpline to tips for coping with depression, all without interrupting their stream. On the other end, viewers can flag broadcasts they believe show at-risk behavior and receive guidance from Facebook on how to proceed.

While the system is rolling out worldwide, the option of contacting a crisis counselor helpline via Facebook Messenger will be available in the U.S. only.
Skeptics may argue that a message from Facebook might not be as effective as immediately involving a friend. However, Vanessa Callison-Burch, a Facebook product manager, told the BBC that the company hopes to avoid invading anyone's privacy or tampering with personal dynamics between friends. Facebook acknowledges how critical a fast response time is: as soon as the system identifies an at-risk user, a community operations team rapidly reviews the case.
The U.S. alone averages one suicide every 13 minutes, and suicide is the country's tenth leading cause of death. While Facebook's system is still new, it is reassuring to see the company dedicating itself to keeping its users from adding to this troubling statistic.

