Chatbot Therapists: Help for People Who Don’t Need People

You can’t help but wonder, as you’re using a bot to order pizza, buy a pair of jeans and book a taxi to meet a friend afterward – these guys are just so smart and so helpful and so here for me! I wish my therapist was this good!

Be careful what you wish for, because help is on the way. Facebook Messenger just mainstreamed Woebot, the most recent addition to the developing market of AI therapy assistants. For $39/month, you can share your most anxiety-provoking and depression-inducing thoughts and feelings with a bot that’s programmed to help you feel better.

Is it possible that it could actually help you – and be worth the price?

Let’s take a closer look, because if a bot can be a successful therapist … well, then bots have made a very big leap into formerly human-only activities. According to 2017 World Health Organization statistics, over 600 million people – close to ten percent of the world’s population – suffer from depression and/or an anxiety disorder; during emergencies, as many as one in five people are affected. Clearly, this is a large market. Given the astounding advances in chat-related technologies, developers interested in psychology began to explore the potential for AI-based therapy offered on a mass, worldwide scale.

A major breakthrough came just two years ago, in 2015, when Robert Morris, a psychology major from Princeton University, decided to pursue a PhD at MIT. His premise was that a common and effective “talk therapy”, Cognitive Behavioral Therapy (CBT), could be adapted as a chatbot to replace a live therapist.

CBT is a method used in behavioral and cognitive psychology to redirect destructive sequences of triggers, thoughts and actions into positive sequences. For example, CBT seeks to redirect something like “I’m worried about my business presentation today … I think the woman who hates me will be there … I know I’ll be a failure, I’d better call in sick” into something like “I’m excited about my presentation today … it’s an opportunity to get my foot in the door for a promotion … I’ll practice one more time in front of a mirror, so I feel really confident.”
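To make the chatbot adaptation concrete, here is a minimal, purely illustrative sketch of how a rule-based bot might spot a distorted thought and answer with a reframing question. The trigger phrases and prompts are invented for this example; they are not taken from Woebot’s or Koko’s actual systems.

    # Illustrative sketch only: a toy rule-based, CBT-style "reframer".
    # The patterns and prompts below are invented for this example and are
    # not Woebot's or Koko's actual rules.

    REFRAMES = {
        "i'll be a failure": "What's the evidence for that? Has a past presentation really gone that badly?",
        "hates me": "Is it really hate, or one awkward moment? What would you tell a friend who said this?",
        "call in sick": "Avoiding it may feel safer today. What's one small step you could take instead?",
    }

    def reframe(message: str) -> str:
        """Return a reframing question if a negative-thought pattern is found."""
        lowered = message.lower()
        for pattern, prompt in REFRAMES.items():
            if pattern in lowered:
                return prompt
        return "Tell me more about what's going through your mind."

    print(reframe("I know I'll be a failure, I'd better call in sick."))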

As a psychology major, however, Morris found himself ill-equipped to transform his ideas into an actual program. He turned to Stack Overflow, the online programmer community, for help. Then came the epiphany: “Whenever I had a bug or was stuck on something, I would go on there, and almost miraculously, this crowd of programmers would come and help me,” he said. He realized he could program a system of CBT “redirects” – his original idea – and combine it with the kind of crowd-sourced human help he had found at Stack Overflow, creating an automated-and-human therapeutic community where participants both seek and offer help. The result was a program called Koko, which was funded and launched shortly thereafter.


Woebot, launched on Facebook Messenger in June 2017, also relies on CBT but with a different premise. Woebot takes the view that a bot-only model is preferable because users are more inclined to be emotionally open with a non-human. Its innovation – and business model – is to use that bot-only approach to proactively contact subscribers every day and ask how they’re doing. Based on the response, Woebot offers practical feedback on a consistent, daily basis. This adds to the feeling that you’re actually speaking to a therapist.
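As a rough sketch of that daily check-in pattern – the wording, mood keywords and follow-ups below are invented for this illustration and are not Woebot’s published code or API:

    # Illustrative sketch only: a proactive daily check-in plus a simple
    # keyword-based follow-up, in the spirit the article describes.
    import datetime

    def daily_check_in() -> str:
        """Compose the message a bot might send each morning."""
        return f"Good morning! It's {datetime.date.today():%A}. How are you feeling today?"

    def feedback_for(reply: str) -> str:
        """Pick a short, CBT-flavored follow-up based on the user's reply."""
        text = reply.lower()
        if any(word in text for word in ("anxious", "worried", "stressed")):
            return "That sounds hard. What's the specific thought behind the worry?"
        if any(word in text for word in ("sad", "down", "low")):
            return "Thanks for telling me. Can you name one small thing that went okay today?"
        return "Glad to hear it. What's one thing you're looking forward to?"

    print(daily_check_in())
    print(feedback_for("Honestly, pretty anxious about a presentation"))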

Woebot’s founders are Stanford University-affiliated psychology and AI professionals. Alison Darcy, a psychologist, is CEO. Andrew Ng, a major Silicon Valley innovator, announced in October 2017 that he had joined as chairman of the board of directors.

The company is making a major push in marketing and development. This is from the pitch in Ng’s announcement:

“I think mental health may be “the” killer app for chatbots: If you are feeling depressed at 2am, you may not want to wake up your therapist…

Woebot is just a piece of software. It’s okay to let him see you on your worst day ….

Woebot has already had conversations with more users than a typical therapist will in an entire career…

AI is the new electricity: even with its current limitations, it is already transforming multiple industries. The transformation of mental health care will help millions of people who struggle with their mental health … Woebot will make high-quality mental health coaching globally accessible.”

Woebot looks like this: [screenshot of a sample Woebot conversation]

While other psychology bots with different key features are available, Koko and now Woebot are the ones capturing attention.

How do they stack up?

Both companies offer research published in the Journal of Medical Internet Research to support claims that users show improvement. However, the research is not what would be considered “rigorous”, even by the authors’ own disclaimers. Leaving aside statistical methods and control groups (both can be questioned in these studies), the research is questionable on an even simpler level. Koko measured the success of its users against a group using an expressive writing method (what do you think – would you rather write an essay or talk to a bot?). Woebot measured its success against a group using an e-book titled “Depression in College Students”; needless to say, the Woebot users emerged less depressed.

This isn’t to say the bots can’t help; it’s simply to say that the scientific case for whether they do has yet to be made.

Privacy is another concern, and one that has already gotten Koko into trouble – an early warning sign for the industry.

In January 2016, a 13-year-old from Virginia became suicidal after suffering severe online bullying. In this vulnerable emotional state, she fell prey to a college student who subsequently met and murdered her.

The incident brought attention to the good that psychology bots could do, given their ability to identify at-risk online users and intervene. Koko was recruited to help. It used its existing technology to identify keywords, classify the posts containing them as manageable, critical or troll, and respond accordingly. This technique, refined into a form of online triage, is a system Koko continues to use on its own site.
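As a rough illustration of that kind of keyword triage – the three categories come from the article, but the keyword lists below are invented for this sketch; Koko has not published its actual classifier, and a production system would rely on trained models rather than simple lists:

    # Illustrative sketch only: keyword-based triage into the three categories
    # the article mentions (manageable / critical / troll). Keyword lists are
    # invented for this example.

    CRITICAL_KEYWORDS = ("kill myself", "end it all", "suicide", "self harm")
    TROLL_KEYWORDS = ("get rekt", "nobody cares, bot")

    def triage(post: str) -> str:
        """Classify a post as 'critical', 'troll' or 'manageable'."""
        text = post.lower()
        if any(k in text for k in CRITICAL_KEYWORDS):
            return "critical"    # escalate: human review, crisis resources
        if any(k in text for k in TROLL_KEYWORDS):
            return "troll"       # ignore or rate-limit
        return "manageable"      # route to peer support or automated prompts

    print(triage("I just feel like I want to end it all"))   # -> critical
    print(triage("Rough day at work, feeling pretty low"))   # -> manageable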

Koko ran into trouble when it sought to scale up – something generally required for business success – by embedding the at-risk feature in other sites. For example, Koko surprised some Reddit users by sending unsolicited offers of help in response to selected keywords. The reaction was swift and negative from users who objected to a bot injecting itself into their posts and feeling their pain.

Woebot is too young to have faced such experiences. However, there are already privacy-related questions about its Facebook platform – it can only be used on Messenger with a Facebook account. Thus, there is concern that Facebook might use the stored information about users’ emotional issues for future ad targeting. (Koko can also be accessed via Messenger.)

AI psychology applications almost certainly can help at some level – who wouldn’t benefit from positive feedback that points them in a healthy direction?

At this point, people who want to give it a try can choose among free or paid options; subscriptions or a variety of intervention frequencies; and all-bot, some-bot or even therapist-assisted models, among other choices. All can be weighed in a cost/benefit analysis against a human therapist – and perhaps that will be the “killer research study”.

Looking way down the road from this very early stage of the technology, where might it lead?

Perhaps it will come full circle one day, all the way back to its roots in Freudian psychotherapy. You’ll look at a screenshot of the ceiling in a psychoanalyst’s office, imagine yourself lying on a couch looking up, and hear the voice of someone behind you saying “hmmmmmmm…”

That might be the easiest of all!
