Ever since chatbots appeared on websites and assistants arrived in our homes, our view of AI has been changing. The main driver of that change is the increasingly human-like quality of chatbot responses. Today, they can reply in a more humanistic way when we express a variety of emotions. One industry where more work still has to be done, though, is healthcare. Could we really expect a healthcare chatbot to show us empathy?
That may change in the near future. With around one fifth of the US population now owning a smart speaker, AI has been well received, and the global health assistant market is growing. Reports suggest the industry could be worth as much as $3.5bn by 2025.
Take a look at the various ‘skills’ available for your Amazon Alexa, for example, and you’ll notice plenty of options. One particular choice is the Mayo Clinic’s First Aid skill, which lets our bots provide “hands-free answers from a trusted source”. By delivering useful health information, tools like this are becoming a great solution for things like remote diagnosis of conditions and even long-distance monitoring of therapeutic results.
However, the question still remains – can we expect AI to show us empathy?
When you describe a condition that you have, or fear you might have, can you really expect a chatbot to feel empathy for your situation?
That is the next part of the discussion. At the moment, when we get a response from a chatbot or a smart speaker in the medical industry, it’s purely factual. Sometimes, this can leave us feeling that the exchange was impersonal, or even leave us feeling insulted or hurt.
This is a big reason why so many people are now working on making chatbots a touch more human. While there have already been huge improvements in areas like early diagnosis and helping people spot real risks, we’re still at an early stage when it comes to empathy emanating from our speakers.
At the moment, groups like the Mayo Clinic are trying to find ways to make their hardware and software friendlier to humans. Other challenges exist too, such as allowing someone who raises a concern to escalate it and connect with a medical professional, but empathy remains a big part of the discussion.
Until we can find a way to make our chatbots respond and react the way a real medical professional would, there will be limits. Make no mistake, though: the speed at which chatbot use has grown means that, in the near future, an empathetic response might not be the pipe dream it sounds like today. How are you feeling today?