The Pros and Cons of Using AI-Based Mental Health Tools
by Kashyap Kompella
Posted On September 27, 2022
The COVID-19 pandemic has highlighted the need for mental health resources and treatments. But underinvestment in mental healthcare has been pervasive not just in the U.S., but across the globe, and the data paints a grim picture. Fewer than half of the adults who have a mental illness receive treatment. There is a shortage of trained therapists and psychiatrists, and for those who have access to them, the wait for an appointment can stretch into months. Many mental health conditions go undiagnosed or are diagnosed long after they begin.

Given this context, there is a great deal of interest in digital tools for mental health. Digital tools are seen as inexpensive options that can increase access to mental healthcare across the globe. They may be able to identify symptoms and enable early diagnosis or identify disorders that are currently not being diagnosed. One of the barriers to seeking treatment is the stigma associated with mental disorders, and digital tools can reduce that stigma. And in the absence of therapists who speak a patient’s native language, it is possible to use digital tools to provide care in the patient’s language.

Not surprisingly, there is a wide variety of digital, and often artificial intelligence (AI)-based, well-being and mental health tools, ranging from simple mindfulness and meditation apps to therapy apps that are marketed as complements (or even alternatives) to in-person therapy. Typically, these latter options come in two forms (although they may not explicitly be labeled as such for regulatory reasons): psychotherapy chatbots (or interactive tools) and digital phenotypes. Woebot Health, Wysa, and myCompass are examples of interactive tools, and Cogito Companion, StudentLife, EmotionSense, MOSS, and Beiwe are examples of digital phenotypes (many of these are research projects rather than commercial apps, but they are being piloted by key stakeholders such as insurers, employers, and governments). Common to both types of apps is that their clinical effectiveness has not been established via long-term studies. Yet digital tools, particularly psychotherapy chatbots, have already gained millions of users.

Psychotherapy Chatbots

Apps such as myCompass and Woebot Health are intended to treat mild to moderate depression, stress, and anxiety. By sending text prompts and emails, myCompass nudges users to self-monitor their moods, behaviors, and lifestyle changes. Somewhat similarly, Woebot Health uses short daily chats, videos, games, and mood tracking and challenges people to examine their thoughts. These chatbots are mostly cognitive behavioral therapy (CBT) tools in digital avatars.

CBT is an evidence-based approach used by mental health professionals. A central tenet of CBT is that an individual’s reaction to an adverse event, not just the event itself, plays a key role in their mental well-being. Based on this insight, CBT therapists train a patient to observe their reactions and mental states and to reorient or reframe their responses to be less negative and more realistic. During CBT, patients record their thoughts (e.g., negative feelings) and are repeatedly challenged, over several sessions, to reframe them, so that the new way of thinking is reinforced and becomes habitual. Not surprisingly, CBT requires repeated interactions between the patient and their therapist to be effective.

The chatbots try to implement CBT interventions (which are originally meant for in-person sessions) digitally, but there are some challenges, including the following:

  • CBT is usually delivered via sessions that last 30 minutes to 1 hour. With the digital apps, the users spend smaller chunks of time but access the sessions more frequently. Will CBT be effective in such a scenario?
  • The therapist observes a lot of contextual cues and then tailors their sessions accordingly. Would a digital assistant be equal to this task?
  • The effectiveness of therapy depends on the level of trust and the relationship established between the counselor and the patient. Can this trust be replicated by digital apps? Even though they may be programmed to seem caring, ultimately, it is a simulation of empathy rather than real empathy, right?

One potential advantage of digital apps is that they can gather a more detailed picture of the users’ mental states based on daily (or even more granular) logs compared to the less-frequent (weekly or monthly) self-reporting that is the norm for in-person CBT. But this assumes that users are going to be conscientiously logging and reporting their moods and emotional triggers in the digital apps—and even then, the subjective bias of self-reporting remains. To be sure, psychotherapy chatbots have a role in mental health interventions, but their clinical effectiveness must be proven first, and then best practices of digital CBT must be codified.

Digital Phenotypes

Another active area of research and interest is mental health digital phenotypes. A digital phenotype is an AI model that infers a user’s mental states, emotions, and behavior patterns from data collected via their smartphone. Given the ubiquity of smartphones and the large amount of time users spend on them, a digital phenotype can be very useful for determining baseline behaviors at the individual level. For example, sleep cycles, speech patterns, social interactions, cognitive functioning, physical movements, and several other facets can all be inferred by analyzing the data from smartphones and wearable devices.

Two kinds of data act as inputs to a digital phenotype: active data and passive data. Active data refers to the data provided by the users in response to nudges, prompts, and questions while using the mental health app. Passive data is the data collected in the background as the users go about their daily lives. For example, the number of steps taken in a day, the number of hours of sleep, time spent on apps, time spent on phone calls, plus every digital interaction—clicks, taps, and scrolls on the phone—are automatically logged. The user’s call data, text message data, social data, and activity data contain a lot of clues about their mental state.
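As a concrete illustration, one day of passive data might be modeled as a simple record. This is a hypothetical sketch; the field names below are assumptions for illustration and are not taken from any particular app:

```python
from dataclasses import dataclass

@dataclass
class DailyPassiveSample:
    """One day of passively collected signals (hypothetical schema)."""
    steps: int             # from the accelerometer/pedometer
    sleep_hours: float     # inferred from screen-off time and motion data
    screen_minutes: int    # total time spent on apps
    call_minutes: int      # time spent on phone calls
    taps_and_scrolls: int  # count of digital interactions logged

# A sample day's worth of automatically logged data
sample = DailyPassiveSample(steps=6200, sleep_hours=7.5,
                            screen_minutes=210, call_minutes=12,
                            taps_and_scrolls=3400)
print(sample.steps)  # 6200
```

A real system would store weeks of such records per user, which is what makes individualized baselines possible.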

After baseline behavior patterns are established during an initial period of usage, any deviation can trigger an alert that somebody may be going through a rough patch. The sensors in the smartphone, such as the GPS, accelerometer, keyboard, and microphone, can pick up changes in speech patterns, activity rhythms, and the like, and these signals can be used to detect depressive tendencies.
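The baseline-and-deviation logic described above can be sketched as a simple z-score check on a single signal (daily step counts, here). This is a minimal illustration, not how any particular app works; production systems combine many signals with far more sophisticated models, and the threshold used below is an arbitrary assumption:

```python
from statistics import mean, stdev

def flag_deviation(baseline_values, today_value, threshold=2.0):
    """Flag today's reading if it deviates from the user's own
    baseline by more than `threshold` standard deviations."""
    mu = mean(baseline_values)
    sigma = stdev(baseline_values)
    if sigma == 0:
        return False  # no variation in baseline; nothing to compare against
    z = (today_value - mu) / sigma
    return abs(z) > threshold

# Two weeks of daily step counts establish the individual baseline.
baseline_steps = [8200, 7900, 8500, 8100, 7800, 8300, 8000,
                  8400, 7700, 8200, 8600, 7900, 8100, 8300]

print(flag_deviation(baseline_steps, 8150))  # typical day -> False
print(flag_deviation(baseline_steps, 2400))  # sharp drop  -> True
```

Note that the comparison is against the user's own history rather than a population norm, which is the core idea behind individualized digital phenotyping.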

Such signals can be used to personalize a digital CBT app or can lead to a recommendation to consult a human therapist. Advocates for mental health digital phenotypes contend that smartphone data can result in an early and more accurate diagnosis than traditional approaches. That is, data and AI can unlock better diagnoses, improved treatments, and better care.

Concerns About Digital Phenotypes

Digital phenotypes raise several concerns (perhaps even more so than CBT chatbots), including the following:

  • The user data collected is highly sensitive, but many of the current apps are coy about their data protection, data privacy, and data-sharing practices. Also, a high level of data security must be in place to safeguard against any data breaches or security attacks.
  • If (as is the current wisdom) mental health disorders fall on a spectrum, it is difficult to decide where to draw the line when classifying an individual as having an ailment or not. This risks reintroducing subjectivity into a seemingly objective, data-driven approach and raises the specters of false negatives and false positives.
  • False negatives deprive the deserving of access to treatments they need.
  • False positives can be triggered by a temporary change in a user’s activity or patterns for innocuous reasons (e.g., a regular flu virus). If such data is recorded and shared with other parties, perhaps it becomes a permanent part of their record.
  • The AI technologies that digital phenotypes rely on, such as natural language processing (NLP) and speech recognition, work well only for certain languages and regions. NLP works well for English and perhaps a dozen other languages, but it does not achieve the same level of accuracy for most of the world’s languages. The same is true of speech recognition.
  • Cultural norms and individual differences play a role in treatment efficacy. A skilled practitioner accounts for these differences; phenotypes that work in one context don’t necessarily work in a different context or culture.

In summary, digital tools and AI hold much promise for mental health diagnosis and treatment, but much work remains to be done. There’s a need for stricter clinical standards, regulatory oversight, strong data privacy and protection practices, transparency of AI methods, adherence to Responsible AI principles, and validation/evidence from large-scale clinical trials. Mental health is too important an area to take a “move fast and break things” attitude. We need to proceed with caution given what’s at stake.


Kashyap Kompella, CFA, is an industry analyst, author, educator, and advisor. He is the co-author of A Short and Happy Guide to Artificial Intelligence for Lawyers.





