How and why we created an illustrated summary of recommendations for disseminating reviews to patients by Sarah Knowles, Joe Langley, and Lynn Laidlaw
In the summer of 2019, funded by the NIHR Centre for Engagement and Dissemination, KMA members Sarah, Joe and Krysia worked with two different groups of experts: public contributors and design specialists. We wanted to evaluate how easy evidence reviews were to find, and how useful they were, for patients, carers and members of the public, and to identify ways their presentation could be improved.
This meant working with public contributors who understood what patients and carers would be looking for, and had first-hand experience of trying to improve research, and working with design experts who had skills in visual communication design and digital services.
What did we do?
We used two older reviews selected from the NIHR Evidence website. This ensured we were examining reviews that represented typical academic outputs*: quite long documents in PDF form. We used a method called Think Aloud interviews to understand how the contributors looked for the reviews and how they navigated through the information once they found it. We then held a co-design workshop with contributors who had worked on a lay review, with researchers who produced reviews, and with design experts, so they could learn from each other’s perspectives and think about how things could be improved.
What did we find?
Drawing across our learning in the project, we produced an illustrated summary of recommendations for improving how review outputs are shared with patients, carers and the public.
There are three broad categories to the recommendations:
Visual: The most direct suggestions about how reviews should look – this includes the need for white space, and thinking about visual cues that help orient people, for example logos that help them quickly assess whether the information is trustworthy.
Content: Better organisation of content is key to helping people navigate reviews and find what they are looking for. Most importantly, outputs for patients shouldn’t be presented like a paper! Rather than background, methods and findings in that order, patients wanted first to know who the information was for and what the key findings were, and then to have the option of exploring in detail what was done.
Process: Probably the most important recommendation from the study is about the need to work on outputs collaboratively. Working out the value of a review for patients themselves, and thinking through how different groups may want to access that information, is something that researchers need to do in partnership with contributors. Working with experts in information design and visual communication is necessary to support researchers who don’t have those skill sets. We also encourage researchers to think about the process that patients go through to engage with a review – how do they find it? What do they want to happen afterwards? We need to think about a user journey of a patient seeking and reacting to a review, not just think about reviews as an isolated object.
Why an illustrated summary?
We wanted to produce a summary that could be shared as an example in action, to try to practise what we preach. So we tried where possible to have the recommendations acted out on the graphic – for example, we use blank space where we talk about white space, include a patient quote where we talk about patient stories, provide next-step links where we talk about forward paths, and so on. We wanted to share something brief and focused, given that one of the most common complaints about reviews is their excessive length. However, distilling the recommendations onto one page meant keeping them succinct, and so they could be criticised as lacking detail. We hope there’s enough information to act as a prompt for thinking and conversation. We recognise that anything more substantial would again require us to follow our own advice and co-create a resource or toolkit with the intended users, rather than trying to produce it ourselves in isolation.
We didn’t want to imply the recommendations were a template to copy, or a by-the-numbers formula that can be replicated every time. The illustration showing the graphic as being constructed and discussed was a deliberate choice, to try to emphasise that an output has to be actively produced, and that this should be a collaborative effort, bringing in additional and essential expertise (specifically, that of patients themselves, and of designers).
What happens next?
We want to keep trying to practise what we preach, by continuing to get feedback and input, to demonstrate that creating and improving outputs is an ongoing process and, most importantly, an ongoing conversation with the people you share your findings with. For that reason, we’d be really grateful if you would take a few minutes to fill out our survey here, which asks for your opinion of the graphic. We will also be running a tweet chat where we share some individual images, so we can compare two different ways of sharing our results. We look forward to sharing what we find out!
* The more recent products on the site have been much improved. We did some comparison with newer materials, for example with an easy read review, which the contributors thought was a huge improvement on the older themed review content: https://evidence.nihr.ac.uk/themedreview/https-evidence-nihr-ac-uk-themedreview-better-health-and-care-for-all-easy-read/
This post presents independent research funded by the National Institute for Health and Care Research (NIHR). The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health.