This article is part of a limited series on artificial intelligence's potential to solve everyday problems.
Imagine a test as quick and easy as taking your temperature or measuring your blood pressure that could reliably identify an anxiety disorder or predict the onset of depression.
Health care providers have many tools for gauging a patient's physical condition, yet no reliable biomarkers – objective indicators of medical states observed from outside the patient – are used to diagnose mental health.
But some artificial intelligence researchers now believe that the sound of your voice may be the key to understanding your mental state – and that AI is perfectly suited to detecting changes that are difficult, if not impossible, to perceive otherwise. The result is a set of apps and online tools designed to track your mental status, as well as programs that deliver real-time mental health assessments to telehealth and call-center providers.
Psychologists have long known that certain mental health problems can be detected by listening not only to what a person says but how they say it, said Maria Spinola, a psychologist and assistant professor at the University of Cincinnati College of Medicine.
With depressed patients, Dr. Spinola said, "their speech is generally more monotone, flatter and softer. They also have a reduced pitch range and lower volume. They take more pauses. They stop more often."
Patients with anxiety feel more tension in their bodies, which can also change the way their voice sounds, she said. "They tend to speak faster. They have more difficulty breathing."
Today, these kinds of vocal features are being used by machine learning researchers to predict depression and anxiety, as well as other mental illnesses such as schizophrenia and post-traumatic stress disorder. Deep learning algorithms can uncover additional patterns and characteristics, captured in short voice recordings, that might not be evident even to trained specialists.
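The vocal characteristics described above – loudness, pauses, pitch-related qualities – can be measured with basic signal processing before any model sees them. As a toy illustration only (a minimal NumPy sketch, not any company's actual pipeline, with an invented feature set and a synthetic waveform standing in for a real recording):

```python
import numpy as np

def extract_voice_features(signal, frame_len=400, hop=160):
    """Compute a few simple prosodic features from a raw waveform."""
    # Slice the waveform into overlapping frames.
    frames = np.lib.stride_tricks.sliding_window_view(signal, frame_len)[::hop]
    # Loudness: root-mean-square energy per frame.
    rms = np.sqrt((frames ** 2).mean(axis=1))
    # Pauses: fraction of frames that are nearly silent.
    pause_fraction = float((rms < 0.02).mean())
    # Zero-crossing rate: a crude proxy for pitch/brightness.
    zcr = (np.diff(np.sign(frames), axis=1) != 0).mean(axis=1)
    return {
        "mean_loudness": float(rms.mean()),
        "pause_fraction": pause_fraction,
        "mean_zcr": float(zcr.mean()),
    }

# Synthetic one-second "voice" sampled at 16 kHz: a 200 Hz tone
# with a quarter-second silent gap simulating a pause.
sr = 16000
t = np.arange(sr) / sr
voice = 0.3 * np.sin(2 * np.pi * 200 * t)
voice[6000:10000] = 0.0
features = extract_voice_features(voice)
```

In a real system, features like these (or representations learned directly by a deep network) would be fed to a classifier trained on labeled clinical data; the sketch only shows why short recordings carry measurable signal.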
"The technology that we're using now can extract meaningful features that even the human ear can't pick up on," said Kate Bentley, an assistant professor at Harvard Medical School and a clinical psychologist at Massachusetts General Hospital.
"There is a lot of excitement around finding biological or more objective indicators of psychiatric diagnoses that go beyond the more subjective forms of assessment traditionally used, such as clinical rating interviews or self-report measures," she said. Other indicators that researchers are tracking include changes in activity levels, sleep patterns and social media data.
These technological advances come at a time when the need for mental health care is particularly acute: According to a report from the National Alliance on Mental Illness, one in five adults in the United States experienced mental illness in 2020.
Although AI technology cannot make up for the shortage of qualified mental health care providers – there are not nearly enough to meet the country's needs, Dr. Bentley said – there is hope that it can lower the barriers to receiving a correct diagnosis, help clinicians identify patients who may be hesitant to seek care and facilitate self-monitoring between visits.
"A lot can happen in between appointments, and technology can really offer us the potential to improve monitoring and assessment in a more continuous way," Dr. Bentley said.
To try out this new technology, I downloaded the Mental Fitness app from the health technology company Sonde Health to see whether my feelings of unease were a sign of something serious or whether I was just tired. Described as "a voice-powered mental fitness tracking and journaling product," the free app invited me to record my first check-in, a 30-second verbal journal entry that would rate my mental health on a scale of 1 to 100.
A minute later, my score came back: a not-great 52. "Pay attention," the app warned.
The app flagged that the level of liveliness in my voice was notably low. Did my voice sound flat simply because I had been trying to speak quietly? Should I heed the app's suggestions to improve my mental fitness by going for a walk or decluttering my space? (That first question may point to one of the app's potential flaws: as a consumer, it can be difficult to know why your vocal scores fluctuate.)
Later, feeling frazzled between interviews, I tried another voice-analysis program, this one focused on detecting stress. The StressWaves Test is a free online tool from Cigna, the health care and insurance conglomerate, developed with the AI specialist Ellipsis Health to evaluate stress levels using 60-second samples of recorded speech.
"What keeps you up at night?" the website prompted. After I spent a minute reciting my persistent worries, the program scored my recording and sent me an email verdict: "Your stress level is moderate." Unlike the Sonde app, Cigna's email offered no helpful self-improvement tips.
Other technologies add a potentially helpful layer of human interaction, such as Kintsugi, a company based in Berkeley, California, that raised $20 million in Series A funding earlier this month. Kintsugi is named after the Japanese practice of mending broken pottery with veins of gold.
Founded by Grace Chang and Rima Seiilova-Olson, who bonded over their shared past struggles to access mental health care, Kintsugi develops technology for telehealth and call-center providers that can help them identify patients who might benefit from further support.
Using Kintsugi's voice-analysis program, a nurse might be prompted, for example, to take an extra minute to ask a harried parent with a colicky infant about their own well-being.
One concern with the development of these types of machine learning technologies is the issue of bias – ensuring that the programs work equitably for all patients, regardless of age, gender, ethnicity, nationality and other demographic criteria.
"For machine learning models to work well, you really need to have a very large and diverse set of data," said Ms. Chang, who noted that Kintsugi used voice recordings from around the world, in many different languages, to guard against this very problem.
Another major concern in this nascent field is privacy – particularly around voice data, which can be used to identify individuals, Dr. Bentley said.
And even when patients agree to be recorded, the question of consent is sometimes twofold: In addition to assessing a patient's mental health, some voice-analysis programs use the recordings to develop and refine their own algorithms.
Another challenge, Dr. Bentley said, is consumers' potential distrust of machine learning and so-called black-box algorithms, which work in ways that even the developers themselves cannot fully explain – particularly which features they use to make predictions.
"There's creating the algorithm, and there's understanding the algorithm," said Dr. Alexander S. Young, the interim director of the Semel Institute for Neuroscience and Human Behavior and the chair of psychiatry at the University of California, Los Angeles, echoing the concerns many researchers share about AI and machine learning in general: that there is little, if any, human oversight during the program's training phase.
For now, Dr. Young remains cautiously optimistic about the potential of voice-analysis technologies, particularly as tools for patients to monitor themselves.
"I do believe you can model people's mental health status, or approximate it, in a general way," he said. "People like to be able to self-monitor their status, particularly with chronic conditions."
But before automated voice-analysis technologies enter mainstream use, some are calling for rigorous investigations of their accuracy.
"We really do need more validation of not only voice technology, but also AI and machine learning models built on other data streams," Dr. Bentley said. "And we need to achieve that validation through large-scale, well-designed, representative studies."
Until then, AI-driven voice-analysis technology remains a promising but unproven tool – one that may eventually become an everyday method for taking the temperature of our mental well-being.