PsyKey – A Chatbot Providing Information on Mental Health and Location-Based Assistance

    “The only journey is the journey within.” – Rainer Maria Rilke

PREMISE

Exploring how Microsoft Power Platform tools can be used to create a chatbot that provides users with insights into mental health statistics, offers first-aid therapy and connects them with therapists in their geo-location.

SYNOPSIS

Mental disorders are among the leading causes of disease worldwide, yet most people are afraid to publicize their struggles for fear of stigmatization. Many young adults, including myself, have at some point suffered from some form of mental disorder and hidden it away instead of talking about it. In some cases, because mental health resources are not readily accessible, people resort to other outlets such as alcoholism, drug abuse and, at the extreme, suicide. I therefore chose this topic because I wanted to see how well I could create a trustworthy solution that is readily available and can provide users with relevant information, such as contacts or websites of therapists based on the user's location. I also wanted to experiment with making the system provide some form of "first aid" therapy through empathetic responses.

BACKGROUND

Mental health, as defined by the World Health Organization (WHO), is "a state of well-being in which the individual realizes his or her abilities, can cope with the normal stresses of life, can work productively and fruitfully, and can contribute to his or her community" (WHO, 2020). It has been estimated that mental disorders may affect 29% of people in their lifetime (Steel et al., 2014). Globally, mental disorders are considered one of the most common causes of disability, as they impair quality of life (Abd-Alrazaq et al., 2019). In most countries and cultures, access to mental health services, education and treatment remains a global issue (Vaidyam et al., 2019), which has made it difficult to provide one-on-one mental health interventions. Mental health services reach only 15% and 45% of those in need in developing and developed countries, respectively (WHO, 2020). Though the unavailability of mental healthcare is an important issue, another well-documented barrier is stigmatization: people are often afraid to externalize their problems and behaviours in front of others for fear of being judged and labelled as something they are not, particularly if they are still young and inexperienced (Chandra & Minkovitz, 2007).

Research has been done on technology-based interventions in mental health, such as cognitive-behavioural therapy (Fitzpatrick et al., 2017) and the detection and prevention of mental health problems (Robinson et al., 2013). Internet-based mobile applications (apps) and embodied conversational agents (chatbots) have been used to complement this research (Melia et al., 2020). People are now adopting technology-based treatments because of their availability, confidentiality and relative affordability (Martínez-Miranda, 2017). Although several chatbots have been developed to support the treatment and prevention of mental and behavioural disorders, research showed the importance of a system that can provide users with mental health information and contacts for both physical and virtual psychotherapists tailored to their location.

Therefore, I decided to create a solution similar to my final graduation project, Conversational Agents in Suicide Prevention, by researching mental health chatbots and their functionality further. I found that a few of these chatbots do provide contacts for professional help, but they are not customized to the user's location. Hence, my project inspiration was formed.

Research Question

How can a low-code environment be used to develop an intelligent and empathetic chatbot that can assist users with information about mental health and provide counselling recommendations?

Research and Experimentation Process

This project employs a qualitative research approach: desk research, reviews and experimentation. I experimented with several platforms and chose the one that seemed most appealing in terms of user experience. I also had weekly consultations with the professor in charge of this course and tested the prototype with my groupmates and friends. Feedback collected from these sessions was used to improve the prototype.

Research Steps

  •       Research on existing literature and technology-based solutions for mental disorders
  •       Project proposal and Pitch
  •       Experimenting with low-code environments
  •       Chatbot Development Process
  •       Testing the three iterations of the prototype
  •       Presenting and Deciding on the final prototype

Experimentation Process

First Iteration

A literature study was done on previous mental health research, which yielded the following insights:

  • Young people mostly shy away from speaking about their feelings for fear of being judged.
  • Most communities do not have the necessary resources to help with mental disorders.
  • People are increasingly adopting technology-based solutions and are willing to use a chatbot if it is empathetic and provides tailored responses.

I also reached out to people who openly advocate for mental health or are struggling with mental disorders. Though these were informal conversations rather than formal interviews, two out of three suggested that it would be good to have a system that is easily accessible, trustworthy, free or inexpensive, and easy to use. One suggested that, even though this was just an experiment, it would be advisable to make the chatbot empathetic. The professor of this course also suggested that I include empathy in the chatbot's functionality.

I then experimented with low-code environments such as Power Virtual Agents (PVA) and IBM Watson. PVA seemed to be a more user-friendly and less rigid environment than IBM Watson, so it became the preferred development tool. I created the first iteration (skeleton) of the chatbot, Factoid, by redefining pre-existing topics and creating new ones. In the first week, it could only give users facts and information about mental health topics and refer them to a generic helpline (113 Suicide Hotline).
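Conceptually, a PVA topic pairs a set of trigger phrases with a canned response, with unmatched input falling back to the generic helpline. A minimal sketch of how the first iteration behaved, with illustrative (not actual) topic names, phrases and responses:

```python
# Sketch of the first iteration's topic-matching behaviour.
# Topic names, trigger phrases and responses are illustrative only.
TOPICS = {
    "depression_facts": {
        "triggers": ["what is depression", "tell me about depression"],
        "response": "Depression is a common mental disorder that can affect anyone.",
    },
    "helpline": {
        "triggers": ["i need help", "helpline"],
        "response": "You can reach the 113 Suicide Hotline at any time.",
    },
}

# Unrecognized input falls back to the generic referral.
FALLBACK = "I'm not sure I understood. You can reach the 113 Suicide Hotline at any time."

def reply(user_input: str) -> str:
    """Return the response of the first topic whose trigger phrase matches."""
    text = user_input.lower()
    for topic in TOPICS.values():
        if any(trigger in text for trigger in topic["triggers"]):
            return topic["response"]
    return FALLBACK
```

In PVA this matching is handled by the built-in trigger-phrase model rather than substring checks, but the topic/trigger/response structure is the same.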

Reflection and Insights

In week 1, it became clear that the tool caters mostly to the development of business chatbots rather than social chatbots. Though PVA does provide a platform for creating an intelligent agent, it might not be the best platform for creating an empathetic bot that can be used for mental health therapy. Therefore, I had to build an agent from scratch and create new functions.

 

Second Iteration

In the second week, I explored ways to make the chatbot more intelligent and empathetic. Empathy in a chatbot means that the agent can detect human emotions, or the user's state of mind, and respond to the user accordingly (Augello et al., 2017). It has been argued that empathy in a chatbot can enhance its use by making users feel better: generating empathic responses involves detecting the user's emotional state and conveying an appropriate emotional response dynamically (Dewan, 2017).

I tried tailoring the responses to provide contacts for available therapists located in the same city as the user (intelligence) by connecting to the Google API, but this feature was only available in preview for a student account, so I used conditional statements instead. To improve the chatbot's empathy, keywords that convey sympathy were added to its library.
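The conditional-statement workaround amounts to branching on the user's stated city and returning a matching contact, with a generic referral when no match exists. A sketch of that logic, where the city names and contact strings are placeholders rather than real listings:

```python
# Placeholder directory standing in for the conditional branches
# built in PVA; cities and contacts are illustrative only.
THERAPIST_DIRECTORY = {
    "amsterdam": "Example Practice Amsterdam – www.example.com/amsterdam",
    "rotterdam": "Example Practice Rotterdam – www.example.com/rotterdam",
}

GENERIC_REFERRAL = (
    "I couldn't find a therapist in your city, "
    "but the 113 Suicide Hotline is always available."
)

def recommend_therapist(city: str) -> str:
    """Match the user's city to a contact, falling back to a generic referral."""
    key = city.strip().lower()  # normalise so "Amsterdam " and "amsterdam" both match
    if key in THERAPIST_DIRECTORY:
        return THERAPIST_DIRECTORY[key]
    return GENERIC_REFERRAL
```

A real geo-location feature (e.g. via the Google API) would resolve the user's position automatically; the conditional version instead relies on the user typing their city, which is why each supported city must be enumerated by hand.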

Reflections and Insights

From experimenting with PVA, it is clear that the tool does provide a platform for creating a somewhat empathetic agent, but because of its rigidity the chatbot is not as empathetic as it would be if developed in a full coding environment. Although it can remember some of the user's inputted variables, it might not be the best platform for creating an empathetic bot for mental health therapy.

Third Iteration

In the final week, sentiment analysis was introduced to improve the chatbot's empathy. The chatbot was able to successfully analyse the user's responses and detect negative and positive sentiments. Initially, I thought this would not be possible in PVA, but it can be done by linking the user's response to Power Automate (PA): the process involves calling an action and creating a flow in PA. I also added a preview of the Language Detection function, so that the chatbot can communicate with the user regardless of the input language. These improvements made the chatbot's responses more empathetic. The final prototype was renamed PsyKey and personalized with an icon.
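The flow itself is configured visually in PA, but its logic boils down to scoring the user's message and branching on the resulting sentiment. The keyword-based scorer below is a stand-in for the cloud sentiment service the flow calls; the word lists, thresholds and reply wordings are all illustrative:

```python
# Toy stand-in for the sentiment step of the PA flow.
# Word lists and canned replies are illustrative only.
NEGATIVE_WORDS = {"sad", "hopeless", "anxious", "lonely", "worthless"}
POSITIVE_WORDS = {"happy", "better", "hopeful", "calm", "grateful"}

def detect_sentiment(message: str) -> str:
    """Classify a message as positive, negative or neutral by keyword overlap."""
    words = set(message.lower().split())
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def empathetic_reply(message: str) -> str:
    """Branch on the detected sentiment, as the bot does after the flow returns."""
    sentiment = detect_sentiment(message)
    if sentiment == "negative":
        return "I'm sorry you're feeling this way. Would you like me to suggest someone to talk to?"
    if sentiment == "positive":
        return "I'm glad to hear that! Keep it up."
    return "Thank you for sharing. Can you tell me more about how you feel?"
```

In the actual prototype the scoring is done by the sentiment service invoked from the flow rather than by word lists, but the branch-on-sentiment structure of the bot's reply is the same.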

Conclusion and Reflection

To answer the research question: PVA does provide a platform for creating an empathetic and intelligent chatbot, though it handles sentiment analysis (intelligence) better than empathy. Most of the relevant features were available only in preview or behind a paywall, so I could not customize the prototype to be as empathetic and intelligent as I wanted; most of its functions had to be inputted manually. In the end, the chatbot can efficiently predict and analyse the sentiment of the user's responses and remember the user's details.

I have gained new skills and an understanding of PVA, and it does provide a very efficient way to integrate flows from PA, which can be used to automate the bot's functionality. Though the environment is most suitable for business entities, it is a capable tool for creating intelligent chatbots that can be integrated into different platforms. Further improvements could be made by integrating the chatbot with external APIs and natural language processing to enhance its functionality and empathy.

 

Mobile View of the Prototype

References

Abd-Alrazaq, A. A., Rababeh, A., Alajlani, M., Bewick, B. M., & Househ, M. (2019). Effectiveness and safety of using chatbots to improve mental health: Systematic review and meta-analysis (Preprint). Journal of Medical Internet Research. https://doi.org/10.2196/preprints.16021

Augello, A., Gentile, M., & Dignum, F. (2017). An overview of open-source chatbots social skills. In International Conference on Internet Science (pp. 236–248). Cham: Springer.

Chandra, A., & Minkovitz, C. S. (2007). Factors that influence mental health stigma among 8th grade adolescents. Journal of Youth and Adolescence, 36(6), 763–774. https://doi.org/10.1007/s10964-006-9091-0

Dewan, T. (2017, April 1). Implementation of a Bangla chatbot. http://hdl.handle.net/10361/8122

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785

Martínez-Miranda, J. (2017). Embodied conversational agents for the detection and prevention of suicidal behaviour: Current applications and open challenges. Journal of Medical Systems, 41(9). https://doi.org/10.1007/s10916-017-0784-6

Melia, R., Francis, K., Hickey, E., Bogue, J., Duggan, J., O’Sullivan, M., & Young, K. (2020). Mobile health technology interventions for suicide prevention: Systematic review. JMIR mHealth and uHealth, 8(1), e12516. https://doi.org/10.2196/12516

Robinson, J., Cox, G., Malone, A., Williamson, M., Baldwin, G., Fletcher, K., & O’Brien, M. (2013). A systematic review of school-based interventions aimed at preventing, treating, and responding to suicide-related behavior in young people. Crisis, 34(3), 164–182. https://doi.org/10.1027/0227-5910/a000168

Vaidyam, A. N., Wisniewski, H., Halamka, J. D., Kashavan, M. S., & Torous, J. B. (2019, March 21). Chatbots and conversational agents in mental health: A review of the psychiatric landscape. SAGE Journals. https://journals.sagepub.com/doi/full/10.1177/0706743719828977

World Health Organization. (2019, December 19). Mental health. WHO | World Health Organization. https://www.who.int/health-topics/mental-health#tab=tab_1
