Facebook and the Intertwining of Mental Health, AI, and Social Media


Several studies have used AI to detect mental health conditions from social media activity. Now, Facebook has announced it is implementing interventions.

Researchers have published a growing number of studies showing that machine learning algorithms can detect mental health conditions from social media behavior. Now, Facebook has announced that it will use artificial intelligence (AI) to take a more active role in suicide prevention.

In a post written by its vice president of product management, Guy Rosen, the world’s largest social network said it would use pattern recognition to detect posts or videos in which users may have expressed thoughts of suicide. The company will also work to identify appropriate first responders to notify when troubling signals are detected and to strengthen its lines of communication with those responders.

“We are starting to roll out artificial intelligence outside the US to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. This will eventually be available worldwide, except the [European Union],” Rosen wrote. Data privacy laws in the European Union make the technology more difficult to employ there than in other countries.

Comments on posts and videos will also be taken into account. Replies like “Are you ok?” and “Can I help?” are considered strong indicators, Rosen wrote, and the technology has already identified videos that might otherwise have gone unreported by human users.
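
Facebook has not published implementation details, but the setup Rosen describes (post text plus comment replies feeding a pattern-recognition classifier) maps onto a standard supervised text-classification pipeline. Below is a minimal sketch in Python, assuming TF-IDF features and logistic regression; the training data, labels, and "[c]" comment separator are illustrative assumptions, not Facebook's system.

```python
# Minimal sketch (not Facebook's system): score a post for possible risk,
# folding in comment text such as "Are you ok?" as additional signal.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: post text with its comments concatenated after "[c]"
# markers, plus a label (1 = concerning, 0 = benign). A real system would
# need far more data and careful clinical validation.
examples = [
    ("i can't go on anymore [c] are you ok? [c] can i help?", 1),
    ("feeling really hopeless tonight [c] please call me", 1),
    ("great hike today, new trail photos [c] looks amazing", 0),
    ("anyone up for pizza later? [c] count me in", 0),
]
texts, labels = zip(*examples)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score a new post together with its comments; high scores would be routed
# to human reviewers, never acted on automatically.
post = "nobody would miss me [c] are you ok? [c] please talk to someone"
risk = model.predict_proba([post])[0][1]
print(f"risk score: {risk:.2f}")
```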

Facebook also plans to expand its team of community operations reviewers to support the initiative, and AI will be used to prioritize the order in which flagged posts and videos are reviewed.
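
That prioritization step is, at its core, a review queue ordered by model score. A minimal sketch of the triage pattern, assuming a max-priority queue keyed on a hypothetical risk score (item names and scores below are invented for illustration):

```python
# Minimal sketch of score-based review triage (not Facebook's code):
# flagged items come off a heap so reviewers see the highest-risk
# posts first.
import heapq

queue = []  # heapq is a min-heap, so push negated scores for max-first order

def enqueue(item_id: str, risk_score: float) -> None:
    heapq.heappush(queue, (-risk_score, item_id))

def next_for_review() -> str:
    neg_score, item_id = heapq.heappop(queue)
    return item_id

enqueue("post-102", 0.31)
enqueue("video-547", 0.92)  # highest risk, reviewed first
enqueue("post-881", 0.58)
print([next_for_review() for _ in range(3)])
# ['video-547', 'post-881', 'post-102']
```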

In March, the network announced that it was running a program to study how AI can detect warning signs in text postings. That announcement focused more on tools that could connect potentially suicidal users with friends on the network or with prevention hotlines. This week’s statement acknowledges those options as well but places greater emphasis on direct intervention by first responders.

“With all the fear about how AI may be harmful in the future, it's good to remind ourselves how AI is actually helping save people's lives today,” Facebook founder Mark Zuckerberg wrote of the new initiative. He said the technology had already allowed the company to alert first responders to at-risk users more than 100 times in the preceding month.

Responding to comments on his post, Zuckerberg indicated that the technology would work in “many languages” and that rollout in “most countries around the world” would begin this week.

For years, social media has been seen as an invaluable tool for understanding and identifying potentially harmful mental health conditions. When combined with machine learning techniques, researchers may be able to distinguish “healthy” social media behaviors from those that may indicate a range of mental health conditions. In recent months, studies have found that machine learning could predict the presence of post-traumatic stress disorder (PTSD), depression, and attention deficit/hyperactivity disorder (ADHD) based on Twitter activity, often with respectable accuracy.
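
Studies in this vein typically aggregate each user’s posts into a single document, train a classifier against diagnostic labels, and report accuracy under cross-validation. A minimal sketch of that evaluation pattern follows; the data and labels are synthetic stand-ins, whereas the published studies draw on thousands of consenting users’ timelines and validated diagnostic labels.

```python
# Minimal sketch of the study pattern (synthetic data, not a real study):
# one document per user, then cross-validated accuracy of a bag-of-words
# classifier separating the labeled groups.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# One string per user (all of their posts joined); labels are illustrative.
user_docs = [
    "so tired again cant focus on anything everything feels heavy",
    "another sleepless night replaying it over and over",
    "marathon training week six feeling strong new playlist",
    "shipped the project on time team dinner to celebrate",
    "why bother getting up nothing changes anyway",
    "weekend plans hiking then a cookout with friends",
]
labels = [1, 1, 0, 0, 1, 0]  # 1 = condition group, 0 = control (synthetic)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
scores = cross_val_score(model, user_docs, labels, cv=3)
print(f"mean accuracy: {scores.mean():.2f}")
```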

Lyle Ungar, a computer scientist at the University of Pennsylvania who worked on the ADHD study, acknowledged to Healthcare Analytics News that “the whole thing has a creepy aspect to it,” but said that, overall, mental health interventions drawing on AI and social media can be positive. “It can actually help, we hope, eventually give people feedback and suggest micro treatments,” he said.

His colleague and co-author, Sharath Chandra Guntuku, told HCA News that the team was pivoting its detection studies toward anxiety and depression, rates of which are rising among American teenagers. Likewise, suicide and self-harm have become steadily more prevalent among US youths aged 10 to 24, according to a recent CDC report.
