
Assessing responses to suicide attempts broadcast on social media


A world-first research study mining social media reactions to suicide attempts broadcast on one of the biggest social media platforms in the world has shed light on how responses to such events can best be framed.

The number of people broadcasting their suicide attempts live on social media is of growing concern globally, particularly in China, where an estimated 193 people shared such video streams online between 2003 and 2016.

Popular Chinese platform Weibo has emerged as one of the most common spaces for the sharing of such actions. However, the sheer volume of Weibo posts (an estimated 100 million every day) has made it near impossible to identify the full scale of suicidal ideation on the platform and to reach those in need of help.

A cross-discipline research team, including personnel from Black Dog Institute and Beijing Forestry University, measured suicide stigma and literacy on Weibo, sampling almost 5,000 posts on the platform and classifying those posts as either stigmatising or non-stigmatising while also measuring levels of suicide literacy.

The researchers found that more than one-third of people who responded to suicidal posts did so by expressing stigmatising attitudes (35.1 per cent), with over 1 in 10 responses (11.6 per cent) conveying inaccurate knowledge about suicide.

Users who showed little or poor knowledge about mental health issues and suicide were, correspondingly, far more likely to express stigmatising attitudes in their responses (84.1 per cent).

"The words people use in their social media posts when responding to a suicide broadcast can provide vital clues into broader levels of suicide literacy," said lead author of the study Dr Ang Li, a former Black Dog research fellow who is now based at Beijing Forestry.

"With millions of social media users worldwide posting every day, finding ways to automate the detection of problematic language is essential if we are to improve suicide prevention campaign messages into the future."

Of the stigmatising posts, over half (55.9 per cent) contained incorrect knowledge about the signs and symptoms of suicide, while 44.1 per cent conveyed misinformation about treatment options, including messages such as "suicide happens without warning" and "people who want to attempt suicide cannot change their mind quickly".

The research team said the results could help improve future stigma reduction campaigns by harnessing social media user insights to develop targeted suicide prevention messaging.

"Used well, social media can be a first line of defence for helping people facing a suicidal crisis," Black Dog Institute researcher and study co-author Dr Bridianne O'Dea said.

The next step, she said, is to apply this language classification model to the early detection of suicide warning signs and risk factors among social media users.

"If we can detect with greater accuracy those who need help, and what the education needs are for those responding in a stigmatising way, we can develop more effective suicide prevention campaigns that encourage help-seeking through potentially life-saving responses," she concluded.
