Capturing experience data
As the FFT is a nationally mandated patient feedback tool (34, 35), we sought specific feedback from our PPIE group on how to enhance the FFT question digitally by changing the response scale to capture emotional feedback collected via emoticons, and adapting the FFT free text question to invite more meaningful comments.
Adapting the FFT response scale to capture emotional feedback
PPIE contributors liked the idea that the proposed FFT question and response could be enhanced to capture emotional feedback and so encourage more people to use it. The issue of how best to capture people’s feelings about the healthcare they received was tackled differently by PPIE contributors and research participants. In general, the PPG contributors in both primary care sites considered the use of emoticons appropriate and feasible, as their practices were already collecting feedback digitally in this way. They also recommended that four rather than five options should be offered on the FFT response scale, ‘as people tended to go for the middle ranking where 4 makes the respondent go one way or the other’ (researcher note). Likewise, the PPIE SMI contributors recommended making the FFT question and response via the interface simple, quick and friendly to use, as ‘some people like to use simple emoticons for feedback with an option for adding brief text’ (researcher note). Some PPIE MSK contributors preferred an alternative visual system, for example traffic lights, to express emotional feedback, as it was felt that patients with MSK conditions might find touching the iPad screen uncomfortable and that emoticons might not represent their pain. These views on the appropriateness of using emoticons to collect emotional feedback resonated with the majority of staff participants working in mental health services (site B), who talked about specific sensitivities of collecting feedback digitally in their context:
‘I don’t know whether necessarily in mental health they’re always the best tool for measuring satisfaction, you know, for somebody who is suffering with an episode of depression, there’s going to be nothing that makes them smile from ear to ear, so to see it as an emoticon to measure the most satisfied with a big smiley face, it’s not something that’s right at that time’ (Site B CMHT, ID349, Staff FG).
As a result of these initial discussions, different options were explored by PPIE contributors and in the subsequent co-design interviews with staff and patients. This led to a joint decision to test out emoticons with traffic light colours on the interface in four settings (Sites A, B OPT, C1 and C2: see Additional file 2), but not in the mental health setting, where emoticons were viewed as inappropriate.
Enhancing the FFT free text question to elicit meaningful feedback
All PPIE contributors and research participants voiced that it would be most useful for a new digital tool to capture both negative and positive comments within the FFT free-text box. This emerged from the PPIE suggestion of using a gratitude-journal form of feedback, where the option of giving one piece of positive feedback and one piece of negative feedback is offered:
‘The negative feedback is clearly what you want to know where things could be improved. The positive feedback is where you don’t want to make unnecessary changes when things are working well, but you need to identify those areas where the patient thinks things are working’ (Site A, ID107, patient).
Our PPIE contributors also felt that capturing both positive and negative feedback from different service elements within each setting would be useful to examine:
‘It was thought a good idea to capture the positive and negative feedback from different areas of the service e.g. experience at reception, experience with doctor etc. to explore the usefulness of feedback data…’ (PPG Site C2, researcher note).
As a result of these discussions the FFT free-text question was enhanced to capture positive and negative free-text comments via a digital interface to replicate the check-in process in place in three sites (B OPT, Sites C1 & C2: see Additional file 2).
Enhancing digital and non-digital tools
All PPIE contributors and research participants gave views on collecting feedback via the following tools: iPad with self-standing kiosk; text message; dedicated phone line; face-to-face discussion; pen and paper; and the site-specific website. This suite of FFT tools was based on data collected in phase 1. The possibility of building in a feedback period was initially suggested by our PPIE SMI contributors; this would allow patients and carers to provide comments before and after consultation in each NHS setting:
‘Waiting time could be used for feedback. Pre and post feedback may be based on previous experience, or lack of experience, or expectations for the appointment. Subsequent visits can include the after-service experience’ (researcher note).
This view was shared by our PPIE MSK contributors, who also felt that people should be invited to give feedback in their own time:
‘There was group agreement that people should be invited to give feedback pre, during and post appointment and in their own time… Some contributors described people on medication with side effects that might make them confused… Pictures alongside each instruction might be of help’ (researcher note).
In the PPG discussions, a few individuals asked if the digital Patient Access System that is now widely adopted in primary care could house our FFT feedback survey. Further alternatives were a phone app, a URL provided on the site website, or a text message following a consultation. Some contributors liked the idea of enabling SMS text messages for collecting feedback:
‘One PPG member said they get sent text messages via [site A] asking for patient feedback but find them very repetitive. The group thought flyers to be useful props to remind people to feedback via different methods, but ‘texts are a nightmare’’ (PPG site C2, researcher note).
This suggested that guidelines on how and when to send these messages should be developed if we were to test this tool. While enthusiasm for digital methods was high, PPIE contributors and research participants felt that non-digital tools should be offered in parallel. Physically situating different tools alongside each other was seen as another way to give people a choice of feedback method:
‘But I guess if you've got a [digital] stand like that, I mean I don't know, I'll show it to you when we go down, when you leave, because I was looking at that stand there, there's nowhere to put pen and paper, however, the one downstairs it's got like a little thing on it where you could actually put...you can have your screen but you could actually have pen and paper next to it’ (Site A, ID354, Staff FG).
Many PPIE contributors and research participants felt that this dual approach would enhance the level of participation as patients could choose their preferred method and that the feedback tools offered in each site should be ‘context sensitive’. Thus, the differences between health conditions and their impact, therapeutic relationships and care settings should be taken into account. Integrating feedback within the therapeutic relationship in Site B was seen as another option to respect the person in a holistic way:
‘I have a CPN [community psychiatric nurse] nurse and she comes to see me every 2–3 weeks about my tablets and talks to me . . . then you can discuss things, can’t you?’ (ID208, Site B, patient).
This demonstrates that trust and understanding of a patient’s life facilitate gaining feedback on the delivery of care, and the design of new tools should reflect this. Consequently, a new process for eliciting feedback within CMH services was co-designed to encourage the recording of verbal feedback from patients and carers that may have been excluded using current methods. In brief, each care coordinator would ask a patient a couple of trigger questions at the end of their home visit in order to allow comments on people’s experience of the service. The care co-ordinator would then give a brief explanation to say that the team wants to improve information they collect about people’s experiences of services. Responses to the trigger questions were to be recorded in a dedicated field within the electronic care record used by the CMHT (see Additional File 3).
Ensuring privacy and confidentiality
For PPIE contributors, privacy and confidentiality were considered equally important in encouraging patient feedback via the new tools. The key issue was the location of the new digital kiosk. Many contributors felt it should be located in a private cubicle in an appropriate reception or clinical area, clearly signposted and inviting:
‘PPGs liked the idea of using a digital screen for giving feedback as they use screens to check-in for appointments. Another contributor also liked the idea of having a private cubicle, would this be possible? Like a passport photograph cubicle, to enter and give feedback. There was collective agreement that privacy and confidentiality are equally important to encourage people to feedback to services’ (PPG Site C2, researcher note).
As a result of these insights and preferences, joint decisions were made to test the new tools, which were placed in private areas and monitored through the observation measures in place during the testing phase.
Co-design of a suite of new tools
The majority of PPIE contributors and some patient and carer participants talked about using guidance, information and signposting to motivate patients and carers to give feedback using the new tools:
‘The importance of presenting “feedback about the feedback” was met with group laughter but agreed to be a crucial element to the guidance produced to accompany the new tools’ (PPI MSK, researcher note).
Contributors also voiced that any guidance could encourage not only more but also better-quality feedback. This idea was adopted in the form of a colourful poster in four sites and, later, a larger poster to support use of the new tools in two sites (C1, C2: see Additional File 2):
‘It was thought that more striking colours and something simple on the poster needed to be used - even something like “Please give us your feedback” with a big arrow pointing to the kiosk’ (PPG Site C1).
The PPGs gave crucial insights on how to present and update the content of the poster ensuring it remained current amidst the plethora of information on display. Discussions about providing this information led to displaying examples of positive comments on a later version of the poster. Several PPIE contributors and research participants talked about the need to provide hands-on support, especially for older people or others who might be less confident in using digital devices:
‘I think it just depends on people’s . . . well, partly the age, isn’t it, and their abilities. I mean, obviously some older people will struggle. I mean, I do intend to get computer literate again, but I’ve not been able to use my computer, and I mean, my skills are quite basic . . . If there’s like some support available, so for example when you think of the supermarket when you use the . . . self-serve, and there’s usually someone there, and it comes up on screen and so if there’s someone there if you’re stuck or something flags up’ (Site A, ID115, patient).
PPIE contributors and staff participants explored ways of providing support for patients and carers to use the kiosk. Staff reported that the majority of patients and carers would not use the kiosk unless they were asked to do so. Use increased once the larger colourful laminated poster with guidance notes was placed above the kiosk and a PPG volunteer promoting the use of the kiosk was in post:
‘The PPG volunteer was active in promoting the kiosk in the reception area. Five people walked up to the poster (this has never happened before with the smaller A4 landscape poster), read it for a few minutes, then used the kiosk without being prompted by either of us’ (Site C2, observation note).
Two PPG members attached to Site C2 provided peer guidance on how to use the kiosk during the busiest clinics. Having this component in place allowed data capture on the acceptability of and continued engagement with digital feedback. We subsequently organised for a volunteer attached to Site B OPT to provide peer support during the testing phase. This role aided implementation in both settings as it responded to staff concerns about the impact of the new tools on their workload:
“. . . if I’m having to come away from my desk to talk them through it, even through the glass, that is going to take me away from the phones, it’s going to take me away from what I’m supposed to be doing” (Site B OPT, ID262, FG).
Our recommendation that information and bespoke guidance should be provided ready for the set-up of the new tools was to be monitored and adapted to incorporate the feedback data throughout the testing period (32, 48, 49). As part of the later co-design and implementation phase, visual feedback reports were produced that included summaries of quantitative data and free-text comments split by sentiment (positive and negative comments: (32)). In line with PPIE and staff preferences, these monthly reports were tailored to the different service contexts.
A summary of main points of feedback and recommendations made by the PPIE contributors to shape the research components by each theme through co-design is shown in Table 2 (Appendix 1). These crucial insights also helped us to tailor interview props for the co-design interviews.
Dissemination and evaluation
A special PPI dissemination and evaluation workshop, with presentations on the DEPEND study research findings, was held in February 2018. Nineteen members of the public attended, with representation from the study sites and multiple PPIE groups and networks across Greater Manchester. Members of our PPIE group co-presented at the workshop and co-facilitated discussions on our findings from DEPEND. Feedback showed that our reflections resonated with the public audience:
‘I found that the way those presenting conveyed how much this project has meant to them, the emotions of it, was very moving and inspiring’ (workshop attendee).
We also used this workshop to premiere an animated video reporting the study findings (59), which was received positively:
‘I loved the animation, it is a great visual way to present information in increments, that some people might not read through if it were just text, the humour in it helped to keep it engaging too’ (workshop attendee).
We were able to use the comments and contributions to make final improvements to the animation prior to general release.
Following the PPIE workshop, two of our PPI contributors, AML and DA, co-presented with NS our experiences of working together on DEPEND at an international PPIE conference (60), speaking about some of the challenges and successes. The presentation highlighted the importance of joint working between PPIE contributors, researchers and NHS staff in co-designing new tools that have the best chance of working in practice.
Following the co-design and testing phases of DEPEND, we held a meeting with members of our PPIE group to hear collective and individual views on their experiences of the DEPEND study, and in particular of the co-design model. Overall, the process and the outcome of PPIE in the DEPEND study were successful: strong and trusting relationships between PPIE partners and researchers had developed over the 2-year study. The PPIE partners made valuable contributions to, and provided important insights into, each work package, ensuring that research priorities aligned with those of patients and carers; this enabled recommendations for delivering future PPIE work to be developed. All of our PPIE contributors commented on how much they had enjoyed being part of the team and how rewarding the experience had been. Participation in the DEPEND study took various forms for our PPIE contributors, and examples are highlighted in a visual representation developed by one of our more experienced PPIE co-investigators, DA, a co-author on this paper (see Figure 1).
[Insert Figure 1: PPIE reflection model]
The model contains personal reflections on involvement and learning points useful for future work, and emphasises the nuanced aspects of involvement: what it may achieve and how. As the PPIE co-design meetings progressed, DA helped us to recruit carer participants within Site B through DA’s Patient Expert Group networks. This widened our recruitment, with DA spending time on site with the researcher at carer events held by Site B and attending meetings with the carer lead at this site. During year 2 of the study, DA spent dedicated time working at the University with NS and contributed to shaping the toolkit documents (see Additional Files 2 and 3). Our PPIE co-applicant, AML, also spent a significant amount of time reviewing components for the toolkit. Other PPIE contributors gave crucial support throughout the evaluation and dissemination phase of DEPEND. For instance, two PPIE contributors took a lead role in co-designing the animated film and co-delivered a public dissemination workshop. One PPG (Site C2) took a lead peer-support role that helped to promote and sustain kiosk use during the tools evaluation phase.