AI's Role in the Push for Suicide Prevention

Janae Sharp

Mental healthcare faces data, access and money challenges, but some promising new tech is emerging.

In the second month of school, a Utah student talked about accidentally falling off a building. The child didn’t look anyone in the eye that day. Peers joked about killing themselves for weeks after that incident, and the student went on medical leave later that month.

In Herriman, Utah, seven students at one high school have already died by suicide during the 2018-19 school year. In health-tech circles, however, a hopeful question has appeared: Can artificial intelligence (AI) combat the suicide crisis? The question is as timely as ever today, Sept. 10, World Suicide Prevention Day.


News about AI being able to recognize signs of depression has created a narrative of Big Brother watching over us and “curing” depression and suicide. But the reality of mental health support in the U.S. is a shortage of affordable care and a general lack of resources. AI can help, in some small way, to overcome these problems.

HBI Solutions is a company that applies AI to data already available to providers to create clinical decision support tools, providing insights into suicide risk that are not otherwise obvious. Many studies indicate that people who intend to attempt suicide deny their intention or suicidal thoughts. But HBI’s predictive Spotlight Suicide Attempt Model has achieved results that show the importance of better risk visibility at the point of care, and why data science still has much to learn about risk context and mental health.

The Forces Working Against Suicide Prevention

The gap in mental healthcare manifests in two ways: access problems driven by financial constraints, and underserved populations with no providers in their area. In Utah, for instance, suicide rates have been increasing steadily since at least 1999, with many counties showing a lack of available resources across the board. Simply put, there are not enough providers to serve every patient who tries to get mental health support, and specific groups — such as those facing language barriers — have an even wider gap in care accessibility.

The year 2013 marked the first in which suicide became the leading cause of death for people between 10 and 19 years of age in Utah. Social factors such as limited education and weak social support negatively affect at-risk youth: those without positive family support had a 25 percent greater risk of dying by suicide. When serving an underserved population, healthcare providers need better risk assessment tools and greater resources to combat climbing suicide rates.

The dearth of providers is so severe that even patients who present at the emergency room with suicidal thoughts might not have access to immediate care or ongoing outpatient therapy afterward. Trauma-informed medical care in an emergency setting is complicated by narrow provider bandwidth and a lack of financial resources. Some risk factors for suicide attempts are readily apparent with good information, but missing social determinants of health data inhibit provider decision making in an overburdened system. Many risk factors are not self-evident to healthcare providers or are not reported by patients — for example, the level of family support for an at-risk LGBT youth with suicidal ideation. Many parents do not understand the seriousness of mental health issues for youths or lack the information to know how to get their children mental health help.

I spoke with Camille Cook, a school counselor, about some of the problems with mental health for kids in Utah. Some parents have told school counselors that they figure they can always take their children to the university health system or somewhere for emergency mental healthcare. She told me, however, that it’s not so simple. Here’s a paraphrased version of what Cook says:

If you think your children are in danger, a health system might be able to hold the child for a few days. But if you need long-term help, it seems like everywhere is full. If you need inpatient treatment or any type of residential treatment, the waiting list is long. I had a student that needed treatment last year, and the family was told to wait for six months. In six months, you might not have your kid anymore. If you are waiting for a psychiatrist, you will wait at least three months.

There are therapists — it can be really hard to find one that takes your insurance. So many clinics are going to self-pay, and when a family is already paying $600 a month for insurance, they can’t afford to pay for psychiatry and therapy out of pocket. One of the hardest issues is that if a child needs residential treatment, the insurance will only cover minimal expenses. You are still paying all living expenses.

How Tech, AI Can Support Suicide Prevention

Social context is critical to tailoring interventions. Social media use also provides insight into suicide risk, making it a useful addition to risk modeling.


Resources to help understand social media and how it affects suicide risk are increasing. Google recently altered searches for suicide-related issues to include crisis numbers, and Facebook is working with AI to promote suicide prevention. Providers should have access to social media-facilitated data for decision support. “One effort to better understand if social media can help prevent suicide is OurDataHelps.org, in which people can donate their anonymized social media data or that of their loved one who died by suicide,” says Julie Cerel, Ph.D., professor in the University of Kentucky College of Social Work and president of the American Association of Suicidology.

But the lack of access to care may create confusion about social solutions in underserved populations such as Utah. Healthcare providers need a better way to assess risk with the sometimes-fragmented data available at the point of care. Much of the information relevant to elevated suicide risk is not located in the electronic health record (EHR) and is not typically seen. The HBI Spotlight Suicide Attempt Model improves risk stratification despite scarce resources.

The data that we have don’t always include how people feel about social support and the like — and clinicians might not have that at the point of care. We get insights from the data that we have. There is other research that indicates that these pieces are important in predictive models, but hospitals and payers aren’t always going to have that. We have a way to predict the risk through information that is readily available at the point of care, such as EHR data and insurance claims data. The problem we are trying to solve is: How can we get insights from data that are readily available at the individual level, so we can explore some of the more personal areas?
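To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of point-of-care risk stratification — not HBI’s actual model, whose features and methods are proprietary. It trains a simple logistic-regression score on hypothetical EHR/claims-style features (the feature names and numbers are invented for illustration) and flags patients above a screening threshold for clinician follow-up.

```python
import math

def sigmoid(z):
    """Map a linear score to a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(rows, labels, lr=0.1, epochs=500):
    """Plain stochastic-gradient-descent logistic regression (stdlib only)."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss w.r.t. the linear score
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def risk_score(w, b, x):
    """Predicted probability of the outcome for one patient's features."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical features per patient: [prior ED visits for self-harm,
# depression diagnosis on record (0/1), missed outpatient appointments].
# Labels (1 = later suicide attempt) are synthetic, for illustration only.
train_x = [[2, 1, 3], [0, 0, 0], [1, 1, 2], [0, 1, 0], [3, 1, 4], [0, 0, 1]]
train_y = [1, 0, 1, 0, 1, 0]
w, b = train_logistic(train_x, train_y)

# Flag patients above a screening threshold for clinician follow-up.
for patient in ([2, 1, 2], [0, 0, 0]):
    score = risk_score(w, b, patient)
    print(patient, round(score, 2), "flag" if score > 0.5 else "ok")
```

In practice a production model would be trained on far larger de-identified datasets, validated against clinical outcomes, and calibrated so that the flag threshold balances missed cases against alert fatigue — the score is decision support for a clinician, never a diagnosis.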

For data scientists, outside experts can help interpret findings and connect the pieces. Take the HBI Spotlight Suicide Attempt Model, whose data suggested that a lower respiratory risk index was associated with an increase in suicide attempt risk. I also saw that language barriers did not have a strong impact on suicide risk. I asked Cerel what that might mean. She says much of the data related to differences between rural and urban settings: many rural white patients lack mental health support and have greater access to lethal means through firearms. Models of rural suicide statistics, then, should be viewed through the lens of decreased access to mental healthcare and increased access to guns and other lethal means.

Data science can provide decision support to providers and identify risk that isn’t immediately clear. AI models increase our understanding of suicide risk, and the more we know, the better providers will be at connecting patients with resources. In areas like Utah, where available resources don’t meet the demand for mental health support, we have many opportunities to improve. Better data science models will improve care, but they are only part of the puzzle in combating our mental health crisis. We can make things better, but much work is left to be done.

If you’re feeling suicidal, talk to somebody. Call the National Suicide Prevention Lifeline at 1-800-273-8255, the Trans Lifeline at 1-877-565-8860 or the Trevor Project at 1-866-488-7386. Text “START” to the Crisis Text Line at 741-741. If you don’t like the phone, connect to the Lifeline Crisis Chat at crisischat.org.

