We successfully deployed this intervention with 22 providers at two primary care clinics; participation rates were 76% for providers and 78% for patients. The 22 participating providers included 15 internal medicine residents, 3 fellows, 3 attendings, and 1 nurse practitioner. The majority were male (n=14, 64%). Providers were primarily white (n=10, 45%) or Asian (n=10, 45%); the remainder were Hispanic (n=2, 10%).
Patients and providers indicated that the coaching intervention was not burdensome. Patients did not mind the recording; all said that it had no impact on their experience or their therapeutic relationship with the provider. Overall, >90% of providers would “probably” or “definitely” recommend the coaching program to other providers, and >90% reported that the length of the feedback session “was just right.”
Survey
Survey responses indicated high acceptability of the coaching intervention, with high scores across all domains: feedback quality, feedback content, and source credibility. Mean responses ranged from 6.2 to 6.8 on a 7-point scale (see Table 2). These high ratings provide a strong foundation for the qualitative results that follow.
Qualitative interviews
Key features of the coaching program
Our analyses of the interviews identified key features of the coaching program that relate to its acceptability and that providers said were integral: 1) coaches were credible and supportive, 2) feedback was useful, 3) video clips allowed for self-reflection, 4) getting feedback on the same day was useful, and 5) use of real patients was preferred over standardized patients. Each feature is detailed below.
1) Coaches were credible and supportive
When asked about their impressions, providers focused on the coaches’ supportive tone and credibility.
Coaches create a safe environment for feedback
All providers interviewed described coaches in a positive way. Feedback delivery was the highest rated program characteristic in the survey. Providers described coaches’ delivery of feedback as “nonconfrontational,” “nonjudgmental,” “friendly,” and done in “a tactful way.” One provider said, “She made me feel like it was completely nonjudgmental because when I first signed up I thought to myself ‘hmmm how open to feedback am I going to be?’” Another provider said a balance of affirming and constructive feedback made him feel more comfortable, noting that the coach “drew out both things that went well and things to work on, so that I felt good about what I did but also had some good targetable actions for the future.”
Coaches focus on communication, not medicine
The majority of providers described the coaches as a credible source of information and feedback. Because coaches were not clinicians, several providers thought that the feedback session was less prescriptive, and they “didn’t feel intimidated or anything like that.” One provider said non-clinician coaches were more likely to identify with a patient and focus solely on provider communication. He was concerned that clinician coaches, in contrast, might be distracted by the medical aspects of an encounter (e.g., diagnostic work-up, treatment decisions). Only one of the 22 providers found the coaches lacking in credibility because, he said, the appropriate coach is “someone who is doing patient care.”
Coaches come prepared
The coaches’ preparation contributed to providers’ impressions of their credibility. Providers who discussed coaches’ credibility pointed to the importance of coming prepared to feedback sessions: many found coaches credible because they focused on specific communication tasks and brought prepared video clips. Two providers noted that coaches could cite the research supporting the importance of specific communication tasks, as well as audit and feedback principles. Another provider noted that coaches “were well prepared [for] how to coach me…they had a lot of things written down.”
2) Feedback was useful
Several providers talked about the lack of specificity in feedback they had received in past training. One provider felt frustrated because she wanted to know specific things she could do to improve her practice. She said, “A lot of [feedback is] based on patient surveys. [T]he biggest complaint patients would have is that their doctor didn’t show enough enthusiasm or care enough.” In her case, she struggled to organize her thoughts and take notes during patient visits while also acting in a caring and attentive manner. She and others said they needed “specific actions to improve patient care.”
Almost all providers said the feedback they received from the coach was useful because it was specific and focused on concrete communication tasks. One provider added, “I liked that there were concrete things that were picked out that you could see and there were specific things she would refer to.”
3) Video clips allowed for self-reflection
Nearly all participants said that watching video clips during feedback sessions increased self-awareness and self-reflection.
“Just the fact that you’re doing this self-reflection, like ‘Oh how am I doing? I’m going to be on video’ …. And then seeing that one little minute clip here…. When are you going to have the opportunity for that?”
Many said the video clips were the most useful part of the feedback session, and for most, it was the first time they had ever seen themselves talking to a patient. In addition to learning new strategies, providers talked about the video clips reinforcing desired behaviors. One provider could see patients respond positively in the clips; she said that “seeing patients appreciate [effective communication and seeing] a benefit” motivated her to continue using the strategies.
Some providers who received more than one coaching session also liked watching video clips in follow-up sessions that showed changes in their practice:
“That’s what’s sticking to me the most. Going over the [most recent] video and then saying ‘here’s what we saw, here’s what we practiced, here’s what you did. What would you do differently?’”
4) Getting feedback on the same day was useful
Providers liked getting feedback on the same day. One provider said the program provided him “the opportunity to get some real-time feedback and…see my own interaction from a different person’s [perspective].”
Providers also felt that viewing the video clips soon after the patient encounter augmented feedback because their recollection of the situation and context was still fresh.
“I could immediately go back, like I remember this interaction. I remember what I was thinking, and I remember, here's how I said that.”
5) Use of real patients was preferred over standardized patients
Providers said it was useful to get feedback based on direct observation of encounters with real patients. When asked to compare observations of encounters with standardized versus real patients, providers said they preferred the latter. One provider said:
“The whole standardized patient interaction, the whole time you know it’s all artificial because this person is not a real patient with real symptoms or real problems…. So I think doing that same exercise with real patients… [is] more helpful.”
To that provider, interactions with a standardized patient felt artificial, and provider behaviors with a standardized patient may not reflect how he or she would act with a real patient.
Situational factors
Motivation to take part in coaching
When providers discussed motives for taking part in the program, a recurrent theme was the desire to be more effective communicators, though reasons for wanting to improve differed. Providers framed their motivation as a responsibility to self and patients. Most stressed the importance of clinical communication skills, saying effective communication is a “good thing to do” or the “right” thing to do as a provider, implying a sense of responsibility to their profession. These clinicians were interested in “anything that helps me become a better communicator.” Some clinicians expressed their sense of responsibility to their patients. One wanted to improve their own clinical skills so that “there’s nothing [more] I can be doing to [provide] a better experience for the patient.” Another clinician focused on his institution’s performance measures and used language common to quality improvement goals and metrics: “More effective communication…can help improve patient outcomes and satisfaction.”
Comments on program format
Brief, same-day format was effective
Two providers discussed the length of coaching sessions; they thought 10 minutes per session was a good length of time and that holding sessions on the same day was “efficient” for the coaching program. Two providers noted that the number of coaching sessions needed could vary by provider: one argued that 4 sessions might be “more than necessary” for a provider who masters a skill quickly, and another argued that the number of sessions should vary according to the provider’s workload.
Working around the provider’s schedule is key to uptake
Providers talked about the need for any coaching program to be mindful of time pressures, particularly during encounters with sick patients or in busy clinics. One provider pointed out that caring for a patient in serious condition limited willingness to participate in coaching that day because of the stress of addressing the patient’s more urgent needs. For example, one provider “had to send [a patient] to the ER, and so the video…I wouldn’t say it inhibited me, but it was just an extra thing.” Another provider echoed this sentiment by pointing out that what works well in a lower-volume clinic may not work in a busy clinic.
Providers want and appreciate strategies to save time
Some providers thought that incorporating the tips, such as agenda setting, saved time during visits. One described using agenda setting to keep a new patient visit on track, and “whenever [the new patient] started diverging or going off on tangents to talk about something else, we went back to the list.” Another thought that asking open-ended questions at the beginning of the visit saved time by better organizing the encounter, “…. it gave the patient the opportunity to ask all of the questions up front not to come up with a whole bunch of by the ways.”
Other providers, however, were concerned that incorporating communication tips would take too much time during a clinic visit. One provider was skeptical that he would use a technique such as asking open-ended questions: “I don’t think so, just because of time constraints.” Others also cited a lack of time during visits and suggested alternative, time-saving ways of using agenda setting. One said “primary care physicians…are too busy” to use the strategies, but “if we could improve [and for example] patients [could] already have a list of what they want to talk about…maybe it would be a little more attractive for them.” Another provider advocated for “anything that could help outside of the [examination] room.” She suggested, for example, that staff checking patients in at the clinic could ask them to make a list of questions; patients could also be asked when scheduling the appointment, while checking in, or while waiting in the waiting room.