Chatbots see greater usage as mental health support tool but can’t deal with urgent or suicidal cases: NTU study

- More people are using mental health chatbots in recent years
- A new study by NTU researchers found that chatbots can be encouraging, nurturing and motivating
- However, they cannot give personalised advice
- They also cannot offer appropriate help or support in crisis situations
SINGAPORE — Chatbots used as a resource for mental health support have seen increased usage in recent years, especially during the Covid-19 pandemic, but they still have their weaknesses, researchers here have found.
A study by the Nanyang Technological University (NTU) published this month in the peer-reviewed Journal of Affective Disorders showed that such chatbots are unable to deliver personalised advice.
They are also unable to detect suicidal tendencies or offer appropriate help in crisis situations.
Professor Josip Car, director of the Centre for Population Health Sciences at NTU’s Lee Kong Chian School of Medicine, and Dr Laura Martinengo, a research fellow, gave more details on their findings.
WHAT DID THEY ANALYSE?
Chatbots, or conversational agents, are computer programmes designed to simulate human conversations. More advanced chatbots incorporate artificial intelligence.
- Nine mental health chatbots from leading mobile application stores were analysed
- Five had at least 500,000 downloads
- They were assessed on how effectively they managed symptoms of depression
- This was based on their ability to educate users about depression, offer cognitive behavioural therapy and provide suicide prevention support
WHAT WERE THE FINDINGS?
- The chatbots were able to effectively engage people with depression
- They empathise with users in conversations and help them to manage their symptoms
- The chatbots can display a “coach-like” personality that is nurturing and encouraging
- However, they fell short when it came to more serious issues such as suicide management
- They could not recognise suicidal tendencies or ideas that were expressed indirectly or through vague statements
- The chatbots may acknowledge the user’s suffering but not recognise it as a crisis situation
- The researchers thus advised that speaking to a counsellor in person would still be ideal
IN WHAT WAYS ARE CHATBOTS HELPFUL?
- The mental health chatbot apps were task-oriented
- They were focused on supporting self-management of depression or other mental health disorders
- They also provided a variety of activities such as mood tracking and cognitive behavioural therapy exercises
- The chatbots engaged in empathetic and non-judgemental conversations that offered support and guidance through psychotherapeutic exercises commonly used by psychologists and counsellors
- Users’ personal information was kept confidential and the chatbots did not transfer or store any of it
- However, this privacy safeguard prevented the chatbots from probing their users for more information about their conditions
WHY ARE CHATBOTS USED MORE OFTEN NOW?
Chatbots are increasingly used in healthcare, for instance to help manage mental health conditions such as depression and anxiety, and to support general well-being.
Prof Car said that healthcare systems are struggling to cope with the increased demand for mental health services.
Digital health tools, including chatbots, could assist in providing timely care to people who may be unwilling or unable to consult a healthcare provider.
One of the chatbots studied, Wysa, has notably been used by the Ministry of Health’s Office for Healthcare Transformation (MOHT) on its mental health platform Mindline.sg.
It is also used on the Ministry of Education’s (MOE) mental well-being portal, Mindline at Work for MOE, which drew attention earlier this year after some users labelled it unhelpful.
In response to TODAY's queries on the study, MOE referred to its previous comments in September when the chatbot drew criticism from some users.
It had said then that the chatbot was trialled with various groups of officers, including education officers, at its development stage.
MOE also said at the time that the mental well-being portal is among several initiatives by the ministry to provide mental well-being support for teachers.
Other initiatives include a whole-of-government counselling hotline, external professional counselling services, well-being committees for employees in schools and wellness talks.
TODAY has reached out to MOHT for comment on the study.

WHAT A RESEARCHER SAYS ABOUT CHATBOTS
Dr Martinengo said: “Chatbots are not yet able to provide personalised advice and do not ask enough personal questions — possibly to avoid breaching user anonymity.”
The chatbots’ automated, pre-programmed responses sometimes failed to properly understand a user’s input.
This could result in strange interactions such as this one:
- Chatbot – “Choose one thing from your list that you can do today”
- User – “I don’t have a list”
- Chatbot – “Bravo, (user name)!”
In another instance, the chatbot inappropriately responded to a statement reflecting a suicidal idea with: “I see. How about becoming a neuroscientist and digging deeper?”
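The researchers link such slips to the scripted, keyword-driven way many of these chatbots select replies. The sketch below is a hypothetical Python rule set, not drawn from any of the apps studied, showing how that approach can misfire: the script assumes the user has just picked an item from a list, so it responds with praise even when the message says the opposite.

```python
# Hypothetical, simplified illustration (not code from any app in the study):
# a scripted, keyword-based reply rule of the kind the researchers describe.

def reply(user_message: str) -> str:
    """Return a canned response chosen by simple keyword rules."""
    text = user_message.lower()

    # The script expects the user to have just picked an item from a list,
    # so any short answer without a crisis keyword is treated as success.
    if "suicid" not in text and len(text.split()) <= 5:
        return "Bravo! Keep going."

    # Only an explicit keyword reaches the crisis branch; indirect or vague
    # statements never get here, which mirrors the gap the study found.
    if "suicide" in text or "kill myself" in text:
        return "It sounds like you may be in crisis. Please call a helpline now."

    # Generic fallback that ignores what the user actually said.
    return "I see. Tell me more."


# The mismatch quoted in the article: the user says they have no list,
# but the first rule still fires and the bot replies with praise.
print(reply("I don't have a list"))  # prints: Bravo! Keep going.
```

The same mechanism would explain the suicide-management gap the study describes: if only explicit keywords trigger the crisis branch, indirect or vague statements fall through to a generic reply.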
Dr Martinengo said: “I don't think at this point in time, we could say that chatbots could in any way replace healthcare professionals... This is not a replacement for health professionals and they are nowhere near replacing or being able to provide full treatment for somebody that needs it.”
However, these chatbots could still be a useful alternative for individuals in need, especially those who are not able to access medical help.
“For some people, it’s easier to talk to a machine than a human being,” she added.