
AI's Role in the Push for Suicide Prevention
Mental healthcare faces data, access and funding challenges, but some promising new tech is emerging.
In the second month of school, a Utah student talked about accidentally falling off a building. The child wouldn't look anyone in the eye that day. For weeks after the incident, peers joked about killing themselves, and the student went on medical leave later that month.
In Herriman, Utah, seven students at one high school have already died by suicide this school year.
News about AI being able to recognize signs of depression has created a narrative of Big Brother watching over us and "curing" depression and suicide. But the reality of mental health support in the U.S. is a system that lacks affordable care and adequate resources. AI can help, in some small way, to overcome these issues and more.
The Forces Working Against Suicide Prevention
The gap in mental healthcare manifests both as access problems driven by financial constraints and as underserved populations with no providers in their area. In Utah, for instance, suicide rates have been climbing.
The year 2013 was the first in which suicide became the leading cause of death for Utahns between 10 and 19 years of age.
The dearth of providers is part of the problem.
I spoke with Camille Cook, a school counselor, about some of the mental health challenges kids face in Utah. Some parents have told school counselors that they figure they can always take their children to the university health system or elsewhere for emergency mental healthcare. She told me, however, that it's not so simple. Here's a paraphrased version of what Cook said:
If you think your children are in danger, a health system might be able to hold the child for a few days. But if you need long-term help, it seems like everywhere is full. If you need inpatient treatment or any type of residential treatment, the waiting list is long. I had a student who needed treatment last year, and the family was told to wait six months. In six months, you might not have your kid anymore. If you are waiting for a psychiatrist, you will wait at least three months.
There are therapists, but it can be really hard to find one who takes your insurance. So many clinics are moving to self-pay, and when a family is already paying $600 a month for insurance, they can't afford to pay for psychiatry and therapy out of pocket. One of the hardest issues is that if a child needs residential treatment, insurance will only cover minimal expenses; you are still paying all the living expenses.
How Tech, AI Can Support Suicide Prevention
Social context is critical to tailoring interventions. Social media use also provides insight into suicide risk, making it a useful addition to risk modeling.
Resources to help understand social media and how it affects suicide risk are increasing. Google, for instance, has altered searches for suicide-related issues to surface the National Suicide Prevention Lifeline at the top of the results.
But the lack of access to care limits what social solutions can accomplish in underserved areas such as Utah. There, healthcare providers need better risk assessment tools and greater resources to combat climbing suicide rates; in particular, they need a better way to assess risk with the sometimes-fragmented data available at the point of care. Much of the information relevant to elevated suicide risk is not located in the electronic health record (EHR) and is not typically seen by clinicians. The HBI Spotlight Suicide Attempt model improves risk stratification despite these scarce resources.
The data that we have don’t always include how people feel about social support and the like — and clinicians might not have that at the point of care. We get insights from the data that we have. There is other research that indicates that these pieces are important in predictive models, but hospitals and payers aren’t always going to have that. We have a way to predict the risk through information that is readily available at the point of care, such as EHR data and insurance claims data. The problem we are trying to solve is: How can we get insights from data that are readily available at the individual level, so we can explore some of the more personal areas?
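To make the idea concrete, here is a minimal sketch of risk stratification from point-of-care data. It is not the HBI Spotlight model; the features, coefficients, and tier cutoffs are all invented for illustration, and the data are synthetic. The sketch fits a simple logistic regression on a few EHR/claims-style features and bins patients into actionable risk tiers:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Synthetic point-of-care features (names invented for this example):
# prior mental-health diagnosis flag, ED visits in the past year, age.
prior_dx = rng.integers(0, 2, n)
ed_visits = rng.poisson(1.0, n)
age = rng.integers(12, 80, n)

# Synthetic ground truth: risk driven mostly by prior diagnosis and ED use.
true_logits = -3.0 + 1.5 * prior_dx + 0.6 * ed_visits
y = rng.random(n) < 1 / (1 + np.exp(-true_logits))

# Design matrix: intercept plus lightly scaled features.
X = np.column_stack([np.ones(n), prior_dx, ed_visits, (age - 40) / 20])

# Fit logistic regression by plain gradient descent (no external libraries).
w = np.zeros(X.shape[1])
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.05 * X.T @ (p - y) / n

scores = 1 / (1 + np.exp(-X @ w))

# Stratify into tiers a care team could act on: 0=low, 1=moderate, 2=high.
tiers = np.digitize(scores, [0.1, 0.3])
print("high-risk patients flagged:", int((tiers == 2).sum()))
```

The point of the sketch is the input constraint, not the model: every feature above is the kind of structured field a hospital or payer already holds, which is the "readily available at the point of care" requirement described in the quote.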
For data scientists, outside experts can help interpret what we've found and connect the pieces. Take the HBI Spotlight Suicide Attempt model, whose data suggested that a lower respiratory risk index was associated with an increase in suicide attempt risk. I also saw that language barriers did not have a strong impact on suicide risk. I asked Cerel what that might mean. She said much of this relates to differences between rural and urban settings. Many rural white patients lack mental health support and have greater access to lethal means through firearms. Suicide risk models, then, need to account for these rural-urban differences.
Data science can provide decision support to providers and identify risk that isn't immediately clear. AI models increase our understanding of suicide risk, and the more we know, the better providers will be at connecting patients with resources. In areas like Utah, where the supply of mental health support doesn't meet the demand, we have many opportunities to improve. Better data science models will improve care, but they are only part of the puzzle in combating our mental health crisis. We can make things better, but much work is left to be done.
If you’re feeling suicidal, talk to somebody. Call the National Suicide Prevention Lifeline at 1-800-273-8255; the Trans Lifeline at 1-877-565-8860; or the Trevor Project at 1-866-488-7386. Text ‘START’ to the Crisis Text Line at 741-741. If you don’t like the phone, connect to the Lifeline Crisis Chat at crisischat.org.