Science

AI speech analysis may aid in assessing and preventing potential suicides, says researcher

Credit: Pixabay/CC0 Public Domain

Speech is vital to detecting suicidal ideation and a key to understanding the psychological and emotional state of individuals experiencing it. Suicide hotline counselors are trained to rapidly analyze speech variations to better assist callers through a crisis.

But just as no system is perfect, there is room for error in interpreting a caller's speech. In an effort to help hotline counselors properly assess a caller's situation, Concordia Ph.D. student Alaa Nfissi has developed a model for speech emotion recognition (SER) using artificial intelligence tools. The model analyzes and codes waveform modulations in callers' voices. This model, he says, can lead to improved responder performance in real-life suicide monitoring.

The research is published as part of the 2024 IEEE 18th International Conference on Semantic Computing (ICSC).

“Traditionally, SER was done manually by trained psychologists who would annotate speech signals, which requires high levels of time and expertise,” he says. “Our deep learning model automatically extracts speech features that are relevant to emotion recognition.”

Nfissi is a member of the Centre for Research and Intervention on Suicide, Ethical Issues and End-of-Life Practices (CRISE). His paper was first presented at the February 2024 IEEE 18th International Conference on Semantic Computing in California, where it received the Best Student Paper Award.

Immediate emotional reads

To build his model, Nfissi used a database of actual calls made to suicide hotlines, which was merged with a database of recordings from a diverse range of actors expressing particular emotions. Both sets of recordings were segmented and annotated by trained researchers, or by the actors who had voiced the recordings, according to a protocol tailored for this task.

Each segment was annotated to reflect a particular state of mind: angry, neutral, sad, or fearful/concerned/worried. The actors' recordings enhanced the original dataset's emotional coverage, in which angry and fearful/concerned/worried states were underrepresented.

Nfissi's deep learning model then analyzed the data using a neural network and gated recurrent units. These deep learning architectures are used to process data sequences and extract local and time-dependent features.

“This method conveys emotions through a time process, meaning we can detect emotions by what has come prior to one particular instant. We have an idea of what happened and what came before, and that helps us to better detect the emotional state at a certain time.”

This model improves on existing architectures, according to Nfissi. Older models required segments to be the same length in order to be processed, usually somewhere in the five- to six-second range. His model uses variable-length signal management, which can process different time segments without the need for hand-crafted features.
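The paper's actual architecture is far more elaborate, but the core idea described above can be illustrated with a toy sketch. In the following numpy snippet (all weights, filter values, and signal shapes are invented for illustration, not taken from the paper), a 1-D convolution extracts local features and a minimal gated recurrent unit folds a sequence of any length into a fixed-size state, so segments need not be padded or cut to a common duration:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def conv1d(signal, kernel):
    """Extract local features with a 1-D convolution (valid padding)."""
    k = len(kernel)
    return np.array([np.dot(signal[i:i + k], kernel)
                     for i in range(len(signal) - k + 1)])

def gru_encode(features, Wz=0.8, Wr=0.5, Wh=1.0, Uz=0.3, Ur=0.2, Uh=0.7):
    """Run a scalar GRU over a feature sequence of any length and
    return the final hidden state as a fixed-size summary."""
    h = 0.0
    for x in features:
        z = sigmoid(Wz * x + Uz * h)             # update gate
        r = sigmoid(Wr * x + Ur * h)             # reset gate
        h_cand = np.tanh(Wh * x + Uh * (r * h))  # candidate state
        h = (1.0 - z) * h + z * h_cand           # blend old and new state
    return h

# Two "call segments" of different durations: no padding or truncation needed.
kernel = np.array([0.25, 0.5, 0.25])       # simple smoothing filter
short = np.sin(np.linspace(0, 3, 40))      # shorter utterance
long_ = np.sin(np.linspace(0, 9, 160))     # longer utterance

summary_short = gru_encode(conv1d(short, kernel))
summary_long = gru_encode(conv1d(long_, kernel))
# Both summaries are single fixed-size values despite the differing lengths,
# which is what frees the model from fixed five- to six-second segments.
```

In a real SER system the summary state would feed a classifier over the four emotion labels; here it only demonstrates how recurrence removes the fixed-length constraint.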

The results validated Nfissi's model. It recognized the four emotions in the merged dataset accurately. It correctly identified fearful/concerned/worried 82% of the time; neutral, 78%; sad, 77%; and angry, 72% of the time.

The model proved particularly adept at correctly identifying the professionally recorded segments, with success rates between 78% for sad and 100% for angry.

This work is personal to Nfissi, who wanted to study suicide hotline intervention in depth while developing the model.

“Many of these people are suffering, and sometimes just a simple intervention from a counselor can help a lot. However, not all counselors are trained the same way, and some may need more time to process and understand the emotions of the caller.”

He says he hopes his model will be used to develop a real-time dashboard counselors can use when talking to emotional callers in order to help choose the appropriate intervention strategy.

“This will hopefully ensure that the intervention will help them and ultimately prevent a suicide.”

Professor Nizar Bouguila of the Concordia Institute for Information Systems Engineering co-authored the paper, together with Wassim Bouachir of Université TÉLUQ and CRISE and Brian Mishara of UQÀM and CRISE.

More information:
Alaa Nfissi et al, Unlocking the Emotional States of High-Risk Suicide Callers through Speech Analysis, 2024 IEEE 18th International Conference on Semantic Computing (ICSC) (2024). DOI: 10.1109/ICSC59802.2024.00012

Citation:
AI speech analysis may aid in assessing and preventing potential suicides, says researcher (2024, April 30)
retrieved 30 April 2024
from https://techxplore.com/news/2024-04-ai-speech-analysis-aid-potential.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.


