More than a century ago, Émile Durkheim’s classic study of suicide drew attention to the problem, and ever since, people have pondered how to reduce suicide rates through proactive measures and social bonding. The social landscape has changed significantly since then, and Facebook now controls a major share of social media and social interaction.
It is no surprise that Facebook has seen it all. Recently, a person live-streamed their own suicide via the live.me platform. This was a disturbing moment for Facebook, which at first did not know how to respond. It was evident, however, that the company was thinking about it, and a month later Facebook released a statement on how it would tackle the issue of suicide.
The Facebook statement
The statement acknowledged that Facebook bears some responsibility regarding suicide, because its impact on people’s lives is immense; the company therefore feels obliged to extend a hand of solidarity and assistance in moments of crisis. Facebook said it believes in the goodness of people, which allows them to respond in an emergency, and that the platform must act accordingly.
What would be the first step?
While the statement only outlined the mission, the vision became clear when Facebook announced its first step toward a more supportive community, where suicide prevention is not merely a thought but a tool for action. Facebook has long offered features such as reporting a death on a profile, but its next move will surprise many.
Facebook plans to use artificial intelligence to counter suicidal tendencies among certain members. The company is deeply concerned about them, as it envisions a global community whose members interact with and look out for one another. The AI will use pattern recognition to detect Facebook posts that may signal suicide risk and so help prevent suicides.
The process of Artificial Intelligence recognition
As designed, the artificial intelligence will look not just at posts and suggestive words but also at friends’ comments on those posts. Phrases such as “Are you okay?” or “Ask me for any help” can be telling signs of a struggle with life. However, Facebook will not be collecting such reports, in order to keep the process transparent and avoid infringing on privacy.
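The detection step described above can be sketched in a very simplified, hypothetical form: a phrase matcher that scores a post together with its comments. Facebook has not published its actual model, phrases, or thresholds, so every pattern, weight, and function name below is purely illustrative:

```python
import re

# Hypothetical risk phrases; Facebook's real signals are not public.
POST_PATTERNS = [r"\bwant to die\b", r"\bend it all\b", r"\bno reason to live\b"]
COMMENT_PATTERNS = [r"\bare you ok(ay)?\b", r"\bask me for (any )?help\b",
                    r"\bi'?m here for you\b"]

def flag_for_review(post_text, comments):
    """Return True if the post plus its comments cross a risk threshold."""
    score = 0
    for pat in POST_PATTERNS:
        if re.search(pat, post_text.lower()):
            score += 2          # direct statements by the author weigh more
    for comment in comments:
        for pat in COMMENT_PATTERNS:
            if re.search(pat, comment.lower()):
                score += 1      # concerned friends are a softer signal
    return score >= 2           # threshold chosen arbitrarily for this sketch

# Example: a worried comment thread under a concerning post
post = "I feel like there is no reason to live anymore"
comments = ["Are you okay?", "Please ask me for any help"]
flagged = flag_for_review(post, comments)
```

A production system would use a trained text classifier rather than hand-written rules, but the sketch shows the key design point from the article: comments from friends are treated as a signal alongside the post itself.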
Rather, Facebook will be more proactive, so that support resources for any self-injury case are made prominent the next time the person concerned visits Facebook. This is exciting for developers, since saving lives is the ultimate goal of technology. There are also websites with large databases on suicide risk, and these may help the Facebook AI detect cases more accurately.
Engage people socially
The process is simple: Facebook aims to engage more people socially so that they stay away from suicidal thoughts. The more aware people become, the better the chance that such a tragedy can be averted. Facebook will also surface suicide-prevention resources to at-risk members so that they reconsider their decision. The company is likewise launching social campaigns and using Messenger as an instant crisis channel to provide emotional support.
Facebook is also partnering with suicide-prevention groups that members can contact through helplines to curb the problem. Crisis Text Line is one such group that people can reach in times of distress. By building a prevention network from scratch, Facebook intends to use its live features even more powerfully in the coming days.