Voice: the richest source of insight on the patient experience

by Steve Saunders, Australian healthcare lead, RingCentral

In the healthcare industry, we talk about the voice of the patient in two contexts: in the literal sense, when a patient picks up the phone to call their healthcare provider, and in the metaphorical sense, when we think about tracking the overall patient experience.

With the advent of AI, we've reached the tipping point where we can achieve far greater efficiencies through automation, and we no longer need to talk about patient experience in metaphors.

When I talk to CIOs and others in the healthcare sector, they tell me that voice is still their number one patient engagement channel. In fact, the CIO of a large public healthcare provider told me that voice is the number one application in their organisation, full stop.

That anecdotal evidence is borne out by our global statistics. RingCentral supports billions of inbound and outbound minutes across all clients every month, and our healthcare clients' consumption of voice minutes is ten times higher than that of our next closest vertical sector.

Relying on voice

Healthcare organisations are still largely reliant on the voice channel to communicate with patients because, compared to most other sectors, they have been conservative in adopting and deploying more modern technologies. There are some really good reasons for that: regulation, security, data privacy, patient demographics, accessibility and language, to name a few.

Also, even though we often have other channels we can use, as patients we typically prefer to pick up the phone and call.

These two factors reinforce each other and escalate the situation. As patients, we use voice as the number one channel because the healthcare provider hasn't given us an alternative, and because we prefer to call anyway, we will continue doing so.

However, voice is a very expensive medium for communication. It requires two human beings to hold a synchronous conversation: someone has to be available to answer the phone and talk to the patient or prospective patient, and if it's a very transactional interaction, a voice conversation is not the most efficient way to handle it.

As an example, one of my daughters was unwell last Sunday, but I had to wait until 8:30am on Monday, when our GP opened, to book an appointment. I called, along with everyone else who had got sick over the weekend, but the line was engaged and I had to ring back multiple times before I even got into the queue. By then my workday had started and I was about to go into a meeting. Voice is not very convenient for the patient, and it's certainly not very efficient for the service provider.

Booking automation

One of RingCentral's clients, a large healthcare organisation with almost 100 clinics, takes three million calls a year, and 80% of those calls are to make, change or cancel a booking. The organisation can't handle the phone calls quickly enough; some of its clinics have abandonment rates of over 30%. That leads to dissatisfaction: if I can't get through to that clinic, I'm going to go to your competitor.

If those clinics were using conversational AI to take the call, they could engage with the patient in natural language to choose and confirm a booking date and time, send an SMS to confirm the booking and finish that call without a human staff member needing to be on the other end. And that can work during lunchtime, during busy periods, or on a Sunday afternoon.
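As a sketch of what such an automated booking flow might look like, here is a minimal Python example. The CallSession and send_sms helpers are hypothetical stand-ins for a conversational AI voice agent and an SMS gateway, not RingCentral's actual API; it simply illustrates the shape of the interaction described above.

```python
# Illustrative sketch only: CallSession and send_sms are hypothetical stand-ins
# for a conversational AI voice agent and an SMS gateway, not a real vendor API.
from datetime import datetime


class CallSession:
    """Simulates a live call handled by a conversational AI agent."""
    def __init__(self, caller_id: str, scripted_replies: list[str]):
        self.caller_id = caller_id
        self._replies = iter(scripted_replies)

    def ask(self, prompt: str) -> str:
        # A real agent would synthesise speech here and transcribe the reply.
        reply = next(self._replies)
        print(f"AI: {prompt}\nPatient: {reply}")
        return reply

    def say(self, message: str) -> None:
        print(f"AI: {message}")


def send_sms(number: str, text: str) -> None:
    print(f"SMS to {number}: {text}")


def handle_booking_call(call: CallSession, free_slots: list[datetime]) -> None:
    """Make, change or cancel a booking with no human staff member on the line."""
    intent = call.ask("Would you like to make, change or cancel an appointment?")
    if "cancel" in intent.lower():
        call.say("Your appointment has been cancelled. Goodbye.")
        return

    # Offer available slots in natural language until the patient confirms one.
    for slot in free_slots:
        if call.ask(f"Is {slot:%A %d %B at %I:%M %p} suitable?").lower().startswith("y"):
            send_sms(call.caller_id, f"Booking confirmed for {slot:%d %b, %I:%M %p}.")
            call.say("All done. You'll receive an SMS confirmation shortly.")
            return

    call.say("No suitable time was found, transferring you to a receptionist.")


# A Sunday-afternoon call, booked for the Monday, with no one on reception.
handle_booking_call(
    CallSession("+61 400 000 000", ["Make a booking", "No", "Yes"]),
    [datetime(2025, 6, 2, 9, 0), datetime(2025, 6, 2, 14, 30)],
)
```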

If I can ring my doctor on a Sunday and book an appointment for the Monday, it's a win-win situation.

Voice is the richest data

I mentioned in the opening that we think about the voice of the patient as a metaphor for patient experience. Thanks to AI, the patient's actual voice now represents some of the richest data organisations can have on patient experience.

If I write a sentence of 10 words in an email, it means one thing; if I said those same 10 words in a telephone conversation, with a different context, a different tone and a different cadence, it would mean something else altogether. That's important, because until recently voice has been a very separate channel for organisations: it is completely unstructured, and it has been hard to do much more than record calls and store them to access later.

We used to have only the objective data from a call: who called us, when they called, how long the call lasted, how long they waited, and how many calls were abandoned. We might also have had a series of codes and other attributes that the customer service team logged against the call, but that's about all we had for a long time.

That's all changed with AI. Now we can mine voice as the rich data source that it is.

Using conversational intelligence, we can analyse from a phone recording what's going on in that front-door interaction with the patient. We can extract information about the sentiment of the call. Did we achieve the outcome of the call? Was the patient happy when they hung up? What were the key concepts discussed? Was a competitor mentioned? Was there dissatisfaction? All of this data sits within that phone conversation.
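As a toy illustration of the kind of structured record that can be pulled out of a call, here is a short sketch. Real conversational-intelligence tooling uses speech-to-text and language models rather than keyword lists; every field name and term below is an assumption made purely for the example.

```python
# Toy illustration only: real conversational intelligence uses speech-to-text
# plus language models, not keyword matching. All terms below are assumptions.

def analyse_call(transcript: str) -> dict:
    """Pull simple patient-experience signals out of a call transcript."""
    text = transcript.lower()

    negative_terms = ["frustrated", "unhappy", "complaint", "waited too long"]
    outcome_terms = ["booked", "confirmed", "resolved"]
    competitor_terms = ["other clinic", "somewhere else"]
    concepts = ["booking", "test results", "referral", "billing"]

    return {
        "sentiment": "negative" if any(t in text for t in negative_terms) else "neutral/positive",
        "outcome_achieved": any(t in text for t in outcome_terms),
        "competitor_mentioned": any(t in text for t in competitor_terms),
        "key_concepts": [c for c in concepts if c in text],
    }


print(analyse_call(
    "I waited too long on hold, but my booking was confirmed for Tuesday."
))
# {'sentiment': 'negative', 'outcome_achieved': True,
#  'competitor_mentioned': False, 'key_concepts': ['booking']}
```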

Taking the lead

Now that we have that insight, we can start being more proactive. The head of transformation at one of RingCentral's clients wants to stop people ringing his organisation; he wants to start ringing them instead. The organisation doesn't want to wait for the inbound call, because it will probably arrive at its busiest hour in the morning, at lunchtime, or in the late afternoon when it is understaffed. Instead, it can take control of the patient interaction and communicate with patients on their channel of choice, at their time of choice, or at a time that's convenient for the organisation, leaving frontline staff free to welcome patients into the clinic or take new bookings.

Imagine having information at your fingertips from conversational intelligence showing that 50% of your follow-up calls from patients are to find out when their test results will be ready. How do we turn that into something more proactive? We can make an outbound call to the patient to let them know their results will be available after 2pm today and to please call back then.

Healthcare organisations can also start to use AI to influence workflows. Take, for example, booking an ultrasound. For another of our RingCentral healthcare clients, that's typically a 12-minute call, and often a second or third call is needed to finalise the booking. Instead, we can send the patient a link via SMS to upload their referral form; AI scans the form and picks out the key pieces of information. Then, using conversational AI, the AI agent has the conversation with the patient: "Would you like an ultrasound for your right shoulder at this time, on this date?" To complete the booking, we might then transfer it to an agent or a receptionist. What we've done is take a 12-minute call down to six minutes and avoid any follow-up calls to collect all the information needed for the ultrasound.
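A rough sketch of that shortened workflow is below, again with hypothetical helpers: extract_referral_fields stands in for the AI document scan, and the SMS and transfer steps are simple prints. Nothing here is a real vendor API.

```python
# Illustrative sketch of the shortened ultrasound-booking workflow described above.
from dataclasses import dataclass


@dataclass
class Referral:
    patient_name: str
    procedure: str    # e.g. "ultrasound"
    body_part: str    # e.g. "right shoulder"
    referring_gp: str


def extract_referral_fields(uploaded_form: str) -> Referral:
    """Stand-in for an AI scan of the uploaded referral form (OCR + extraction)."""
    return Referral("Jane Citizen", "ultrasound", "right shoulder", "Dr A. Example")


def ultrasound_booking_workflow(patient_phone: str, uploaded_form: str) -> None:
    # 1. The patient uploads their referral via an SMS link; AI picks out the key fields.
    referral = extract_referral_fields(uploaded_form)

    # 2. Conversational AI proposes the booking built from those fields.
    print(f"AI to {patient_phone}: Would you like an {referral.procedure} "
          f"for your {referral.body_part} at this time, on this date?")

    # 3. Only the final confirmation is handed to a receptionist, with every
    #    detail pre-filled, roughly halving the call and avoiding follow-ups.
    print("Transferring to receptionist with referral details attached.")


ultrasound_booking_workflow("+61 400 000 000", "referral_scan.pdf")
```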

This is an edited extract of Steve Saunders' presentation at a recent RingCentral lunch in Sydney for healthcare leaders and IT executives.

