Communicating Care: Not An App, Not A Bot

With chatbots increasingly overtaking the more "natural" apps on our smartphones, when will enough be enough? After all, do you really want to talk to a chatbot?

 

Today's digital-led world has changed what people want and how they do things, with the parameters being "me, here and now." Healthcare is no exception. Patients today demand personalized care and communication.

 

Patients have become proactive in managing their own health, and the tech world has responded. The “on-demand” way of doing things is helping health apps and medical chatbots thrive.

 

Industry estimates show that of more than 3.4 billion smartphone and tablet users, 50 percent may have downloaded mobile health apps last year. This is why the U.S. Food and Drug Administration “encourages the development of mobile medical apps that improve health care and provide consumers and healthcare professionals with valuable health information.”

 

There are now an estimated 165,000 health and medical apps. These include software programs, accessories or a combination of both, installed on smartphones or similar communication devices. These can range from a simple pedometer or calorie counter for healthy weight maintenance, to a glucose meter or heart monitor that helps health care professionals improve and facilitate patient care.

 

While Google can provide information for self-care, a more informed decision is crucial for apps that are remotely controlled or accessed by health providers, or that store sensitive patient data.

 

The U.S. FDA focuses its oversight on medical apps "that are intended to be used as an accessory to a regulated medical device, or transform a mobile platform into a regulated medical device," reviewing those that may pose a risk to patients.

 

In the US, the Office of the National Coordinator for Health Information Technology oversees the implementation and use of the most advanced health information technology and the electronic exchange of health information. It operates under the Office of the Secretary of the U.S. Department of Health and Human Services (HHS).

 

In the UK, the National Health Service maintains a digital library of vetted health apps, tested against quality standards for clinical effectiveness, safety, usability and accessibility. Technical assessment includes compliance with the Data Protection Act, and evaluation requires evidence of improved patient outcomes.

 

But beyond monitors and trackers, these apps can also include what would be considered artificially intelligent medical assistants, more simply called chatbots.

 

In the case of AI chatbots used for patient engagement, the dialogue that happens at any particular moment is not necessarily regulated; only the software hosting the chatbot must pass the same privacy and security compliance as any other app used medically.

 

So as noted by the HealthWorks Collective, “knowledge-based management rules must be implemented to ensure that any information derived during the interaction with the patient is compliant to ethical medical practices.”

 

While chatbots cannot and should not replace the doctor for proper medical care, they can act like interns performing a pre-triage of issues: when intelligently trained, they can respond when prompted. But in this role they cannot decide on a course of action; that must first be cleared with the attending physician.

 

While medical chatbots continue to evolve, HealthWorks Collective believes they can be safely used to book doctor appointments based on symptoms, and to handle billing or other customer service tasks following the doctor's rules. Other similar uses include health monitoring that notifies a human nurse if parameters become unsafe, or informs homecare assistants about improvements in a patient.


However, we must remember that we are still dealing with lives. No matter how smart the functionality of health apps or medical chatbots becomes, they are still rules-based and only as good as the rules and content provided. Outside those strict parameters, anything can go wrong.
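To illustrate that point, here is a minimal sketch of what "rules-based" health monitoring looks like in practice. The vital signs, thresholds, and escalation hook below are entirely hypothetical, not any vendor's actual implementation; the takeaway is that the bot can only act on the ranges it was given and must hand anything else to a human.

# Hypothetical sketch of a rules-based health-monitoring check (Python).
# The readings, safe ranges, and notification hook are illustrative only;
# in a real system the rules would be set and cleared by the attending physician.

SAFE_RANGES = {
    "heart_rate_bpm": (50, 110),
    "blood_glucose_mg_dl": (70, 180),
}

def check_reading(name, value, notify_nurse):
    """Return True if the reading falls within its preset safe range.

    Anything the rules do not cover is escalated to a human, because the
    bot has no basis for a decision outside its programmed parameters.
    """
    if name not in SAFE_RANGES:
        notify_nurse(f"Unrecognized reading '{name}' = {value}; human review needed.")
        return False
    low, high = SAFE_RANGES[name]
    if not (low <= value <= high):
        notify_nurse(f"{name} = {value} is outside the safe range {low}-{high}.")
        return False
    return True

# Example: an out-of-range glucose reading triggers the human nurse.
check_reading("blood_glucose_mg_dl", 250, notify_nurse=print)

Everything the sketch does well comes from the two preset ranges; everything it cannot handle falls straight through to a person, which is exactly the limitation described above.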

 

And in a healthcare system that is becoming more and more patient-centric, patient-driven and patient-led, almost all of these apps and bots require an internet connection, a data plan, specific devices and specific apps, which can lead to both limited engagement and further security concerns.

 

While online connectivity may be considered a given in today's digital world, it is not necessarily so in rural areas, and even where it is available, it should still be a choice. Not all patients are comfortable discussing their health information through digital channels, and some patients do not have access to these technologies at all.

 

Patient engagement shouldn't rely solely on apps and bots; where bots and apps are not possible, not engaging or not desired, patient care still needs to be addressed.

 

Patient care is not communication derived from preset data; it is real and personal, and as such it needs a platform that can be customized and individualized in real time for everything from anesthesia care to wellness, PTSD, or surgery.

 

Communicating real care means engaging patients anytime, anywhere, on any device, at the patient's choosing, and that is what we need to be moving toward. With a communication platform you get unlimited engagement without the barriers of an app or device; it lets the patient control a "dialogue" that is secure and responsive in real time. Consider the approach a next-gen bot: an indefatigable health navigator championing quality patient care.

 

Patient engagement cannot afford technological boundaries to communicate real care, and that’s something we’re passionate about at LifeWIRE.

 

 
