Chatbots, downloadable as apps, have for many become the most accessible therapists. But is this the right approach to a mental health problem?
Back in 2017, Aparna was struggling professionally and personally. The 32-year-old HR professional based in Gurgaon could barely meet work targets and felt guilty for not being able to spend more time with her three-and-a-half-year-old daughter. Aparna was diagnosed with severe anxiety. It wasn’t, however, a visit to a therapist but an Android application, IWill, that helped her understand the problem. The app, which Aparna discovered while looking for help online, asked her a set of questions about her age range, areas of concern, daily activities, sleeping patterns, and food habits. After a three-minute interaction, it paired her with a mental health professional. It was exactly the help she needed.
Of the 1.3 billion people in India, an estimated 56 million are suffering from depression and another 38 million from anxiety. Although there is no official data on the number of mental healthcare practitioners, according to a recent report by the World Health Organisation, there are only around 5,000 psychiatrists and fewer than 2,000 clinical psychologists in India, making it difficult for large sections of our population to approach experts.
But as more and more Indians move online, there has been a steady rise in the number of mobile applications and websites on mental health. According to a Kantar report released earlier this year, India’s internet user base is likely to reach 627 million by the end of 2019. For many of them, the most accessible therapist would be a friendly app-store bot, as easily downloadable as Swiggy or Amazon.
But sharing anxieties and worries with bots also raises an important question: who is listening to these conversations?
IWill, currently available only on Android, was developed by Gurgaon-based ePsyclinic. Its founder, Shipra Dawar, once lived with depression herself and started the company to bridge the gap between patients and mental healthcare professionals. “Depression or a feeling of anxiety can hit you anytime. Even if it is 1 am, you can chat on the app,” says Dawar. The app, according to IWill, has had nearly 50,000 downloads so far, and 10% of users have opted for paid counseling services after the initial stage of diagnosis. The company employs 35 psychologists and works with 15 freelancers.
IWill’s reliance on automation is confined to the detection stage. But some apps, like Wysa, use artificial intelligence to train chatbots as virtual mental fitness coaches. The Bengaluru-based company was founded in 2015 by a couple, Jo Aggarwal and Ramakant Vempati. Its founding premise was to reach people before they arrive at the crisis stage, Jo explains. The app, which claims a user base of 2.5 lakh in India, works on the principles of cognitive behavior therapy, which aims to change the way a person thinks or behaves; it tracks mood and monitors eating and sleeping habits to help break negative patterns.
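The tracking loop described here (log a self-reported mood, watch for a negative trend worth flagging to a coach) can be sketched very roughly in a few lines. The scoring scale, window, and threshold below are invented for illustration and do not reflect how Wysa actually works:

```python
from statistics import mean

# Hypothetical mood log: one self-reported score per day, 1 (low) to 5 (good).
def flag_negative_pattern(mood_scores, window=3, threshold=2.5):
    """Flag when the average mood over the last `window` days dips
    below `threshold` -- the kind of trend a coach might act on."""
    if len(mood_scores) < window:
        return False  # not enough data yet to call it a pattern
    return mean(mood_scores[-window:]) < threshold

week = [4, 3, 2, 2, 1]
print(flag_negative_pattern(week))  # True: the last three days average 1.67
```

A real app would of course weigh far richer signals (sleep, meals, language in chats), but the idea of turning daily check-ins into an actionable trend is the same.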
Woebot, developed by Stanford clinical research psychologist Alison Darcy, works similarly. It checks in on users every day and asks them to maintain a ‘gratitude journal’, a running list of things they are thankful for. When it senses a problem, the app can tell an uplifting story too.
Preeti (26), a teacher in Kolkata, found that initial conversations with the Wysa chatbot helped her overcome her apprehensions. “I could easily catalog my thoughts and emotions,” she says. User reviews are largely positive, pointing out several ways in which the app has helped. “Don’t have the family support I need for my moodiness and tantrums,” wrote one user on iOS.
COACHING THE COACH
Explaining how chatbots can help users create goals and track progress, Bengaluru-based psychologist Mahesh Natarajan says, “The ability of these apps to guide and hold a person’s chat history, assessment history, therapy worksheets, and records can be very helpful for specific analysis and solution-oriented therapies, and even to work through cognitive therapies. Increasingly, chatbots will play a bigger role as a wellness coach.”
Wysa employs 25 people, including eight mental health professionals who serve as coaches. “We have a database of over 80 million conversations to learn from. The conversations are arranged according to topics like relationship problems, depression, anxiety, and stress. The AI can recognize certain words to figure out which topic it should talk about. We test all our conversations with the internal team before launching them in the market,” says Jo.
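The routing step Jo describes, where the AI recognizes certain words to pick a topic before handing the conversation to a pre-scripted flow, amounts to keyword matching at its simplest. The topic vocabularies and function below are invented for illustration, not Wysa’s actual implementation:

```python
# Illustrative keyword-based topic router; vocabularies are made up.
TOPIC_KEYWORDS = {
    "relationships": {"partner", "breakup", "lonely", "family"},
    "anxiety": {"worried", "panic", "nervous", "overthinking"},
    "sleep": {"insomnia", "tired", "awake", "nightmare"},
}

def detect_topic(message: str) -> str:
    """Return the topic whose keyword set overlaps most with the message."""
    words = set(message.lower().split())
    scores = {topic: len(words & kws) for topic, kws in TOPIC_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # Fall back to a generic check-in script when nothing matches.
    return best if scores[best] > 0 else "general"

print(detect_topic("I feel so nervous and keep overthinking everything"))  # anxiety
```

Once a topic is detected, a system like the one described would hand over to a script written by clinical experts, which is also where the human testing Jo mentions comes in.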
Bengaluru-based Aishwarya Kamat (24), who works as a content writer for a chatbot, has to make sure interactions don’t feel mechanical. “The first job is to determine a topic a user wants to speak about. It could, for example, be about relationship problems. Once we identify the topic, our clinical experts put together a script for the bot,” says Kamat.
Asked if she wondered who she was talking to when she first reached out to the IWill bot, Aparna says, “At that point, it didn’t matter as I just wanted to talk my heart out. It felt safe because not many personal questions were asked.”
PRIVACY & DANGERS
However, as conversations around mental health move online, the interplay between data and security grows more complex, and troves of personal information, stored on servers and in the cloud and prone to misuse, are a concern worldwide. “There is a lack of transparency in the marketing and promotional material for chatbots that do not reveal what happens to the sensitive information being stored. Is this being used to train the algorithm for future users, fine-tune the technology, or for monitoring purposes?” asks behavioral scientist and AI expert Pragya Aggarwal. She also questions the notion that bots are ‘unbiased’. “There is a perception that technology is neutral and unbiased, and people are more likely to trust a chatbot than a human being. But a chatbot will inherit the prejudices of its makers. Bias in AI is not being given adequate attention, especially when such tools are being deployed in a sensitive domain,” Aggarwal adds.
Researchers argue it’s easier for AI to unlearn biases than it is for humans, but they are ambivalent about the bigger question: should anyone with a mental health problem be talking to a bot? “There is definitely potential for apps to be harmful, based on the way the chatbot responds,” says Aditya Vidyam, a research assistant at Harvard University whose work focuses on digital psychiatry. Speaking to a chatbot, he adds, can be hit or miss, and individuals with depression would not want to hear a computer say, ‘I know how you feel’ or ‘I have felt that way before too’.
Currently, there isn’t enough research to gauge the impact these virtual therapists have on users. But there is also no evidence that suggests patients should be deterred from interacting with a bot as long as there is human intervention in the process. Nimesh Desai, director of the Institute of Human Behaviour and Allied Sciences, says technology can bridge the huge gap between the need for therapists and availability.
“It would be stupid to not encourage the application of AI in the mental health sphere. But one should maintain caution. Regulations around it should be more stringent,” says Desai, who is also CEO of State Mental Health Authority, Delhi. “AI shouldn’t be a standalone activity. The expertise of AI should be combined with in-person guidance.”
IIT Delhi associate professor Arpan Kar says the diagnosis of any mental disorder through chatbots is not unethical, but prescribing a treatment or medication through AI is. “Chatbots in the wellness sector will always have higher outreach. If taught properly, they will have higher accuracy and the quality of assessment will also be top-notch. But the larger worry is, if the model is flawed, who should be held accountable? The user, the data (context) that trained it, the developer, or the company that commercialized it?” asks the professor.
Chats in the video are samples from conversations these correspondents had with the bots.
Reference: When your smartphone becomes your therapist
I am a blogger who shares views and stories to help the people around me. Reach out to me if there is something I can help with.