Developing a Voice Assistant for Healthcare? Don’t Forget the HIPAA-cratic Oath



With news that Amazon Alexa is now HIPAA compliant, the market seems poised for digital voice assistants to take the sting out of common healthcare-related chores. Everything from booking appointments to checking on the status of prescription deliveries and gaining access to hospital post-discharge instructions is set to be made easier by the advent of voice technology.

See also: CI Enables New Healthcare Diagnostic Capabilities

Experts believe these are just the tip of the iceberg. A lineup of six new Alexa skills provides a taste of what’s possible as voice technology converges with healthcare management: skills for locating urgent care centers and scheduling same-day appointments, checking the recovery progress of hospitalized patients, and managing health improvement goals as part of insurance and wellness plans. While Amazon is testing the waters with this limited release program, it’s just a matter of time before the HIPAA-compliant Alexa development environment is made more broadly available and other voice assistant giants like Google, Apple, and Samsung follow suit. Demand for digital voice assistants is soaring (Juniper Research estimates about 8 billion voice assistants will be in use in 2023), and industry-specific use cases, including those in healthcare, are expected to be among the “killer apps” contributing to those rising numbers.

Regardless of industry, voice app quality remains a key challenge to keeping interest high and users actively engaged, and the stakes are even higher in healthcare, where an individual’s well-being is on the line. All it takes is one bad experience, say Alexa misinterpreting a question about a specific medication dosage or fumbling a doctor’s contact info, and a nervous patient is likely to pass on voice and opt for a more traditional communications channel. The same applies if the voice app is so difficult or cumbersome that a patient, perhaps one who is elderly or under the weather, can’t get the answer or action they need exactly when they need it. Real-time results and natural accessibility, after all, are what make voice apps so convenient in the first place.

But the reservations users have are not without merit. Some remain reluctant to trust voice-controlled devices, whether it be Amazon Echo, Google Home, or any other device in this emerging market. Tales of devices “listening” have led consumers to criticize the devices as “creepy.” Because of that, it falls on healthcare organizations to convince consumers that their applications are invaluable.

Case in point: a well-known manufacturer developed a pill dispenser that used artificial intelligence to dispense medications at the right time. Within a couple of days, 100 percent of users viewed the dispenser as an indispensable part of their lives rather than as a scary robot. Why did it succeed? The application solved a problem nearly everyone has faced: forgetting to take their medication. More importantly, the manufacturer demonstrated to its consumer base that the product worked reliably and was designed ethically, not just to make money.

And that’s the rub for healthcare companies: the ethical ramifications of getting consumers to rely upon technology are greater than in any other industry. Healthcare organizations don’t have the luxury of merely demonstrating that their apps do no harm; they must do good. Developers in other industries have time to iterate and permission from consumers to make mistakes; healthcare organizations do not. Their apps must work, and work reliably, from day one.

Developers building HIPAA-compliant Alexa skills or simpler digital assistant healthcare apps will need to plan for robust testing and quality control to ensure their voice experiences function as expected, don’t misinterpret common utterances, and engage users with the proper empathy and immersion. A combination of real-world and exploratory testing can help identify problems early on and ensure the highest-quality voice skill.
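To make that concrete, a skill team might back its real-world testing with automated checks on individual intent handlers. The sketch below is a minimal, hypothetical TypeScript example; the handler, intent name, and `medication` slot are invented for illustration and are not taken from the Alexa Skills Kit SDK. It asserts that a prescription-status handler re-prompts rather than guesses when it can’t identify which medication the user named, exactly the kind of miscue that erodes a nervous patient’s trust.

```typescript
// Minimal sketch of automated utterance checks for a hypothetical
// healthcare voice skill. All names here are illustrative assumptions,
// not part of any real SDK.
import { strict as assert } from "node:assert";

interface IntentRequest {
  intentName: string;
  slots: Record<string, string | undefined>;
}

// Hypothetical handler: when the medication slot is missing or
// unrecognized, it must re-prompt rather than guess, so a patient is
// never given status for the wrong prescription.
function handlePrescriptionStatus(req: IntentRequest): string {
  const medication = req.slots["medication"];
  if (!medication) {
    return "I didn't catch which medication you meant. Could you repeat the name?";
  }
  return `Your ${medication} refill shipped today and should arrive tomorrow.`;
}

// Exploratory-style checks: feed in the partial, awkward utterances real
// patients produce and assert the skill degrades gracefully.
assert.match(
  handlePrescriptionStatus({ intentName: "PrescriptionStatusIntent", slots: {} }),
  /repeat the name/,
  "a missing slot must trigger a re-prompt, never a guess"
);
assert.match(
  handlePrescriptionStatus({
    intentName: "PrescriptionStatusIntent",
    slots: { medication: "lisinopril" },
  }),
  /lisinopril/,
  "a recognized medication must appear in the response"
);
console.log("All utterance checks passed.");
```

Checks like these don’t replace testing with real users, but they catch regressions in slot handling and fallback behavior early, before a broken response ever reaches a patient.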

Healthcare is highly personal and often extremely emotional, and the bar for healthcare organizations providing voice applications is very high. With that in mind, in addition to following the traditional Hippocratic Oath of “First, do no harm,” healthcare providers should consider, as voice expert and former colleague Emerson Sklar suggested, a new Voice HIPAA-cratic Oath: “First, do good.”


About Inge De Bleecker

Inge De Bleecker is Senior Director of User Experience at Applause. She has been designing and testing web, mobile, voice, and multi-channel experiences for more than 20 years. She builds and leads UX teams and evangelizes customer experience principles throughout organizations. Her mantras are "design for everyone" and "test early and often." At Applause, Inge heads up the UX/CX and Voice practices. She holds an MA from the University of Texas at Austin.
