When I started in the Manson Unit (the medical unit based in MSF’s London office) as a Research Communications Advisor, my first question was: what research are we doing? That was easy to find out from the people who shared my office, but what about my colleagues in Amsterdam, the Democratic Republic of Congo, or Myanmar?
MSF is a highly dispersed organisation working across many disease topics and countries. From understanding why children might stop taking HIV medications in South Africa to assessing the potential surgical needs of women with complicated pregnancies in rural Burundi, research has become part of our daily business.
School children in South Africa perform a play about HIV/AIDS. Photo: Eric Miller
But our research portfolio was managed via a spreadsheet on our intranet that I and others could access (at best) intermittently. My work involves writing and repackaging MSF research into formats for different audiences so it can be disseminated widely and have more impact. So, if you don’t have access to a spreadsheet that tells you what research is going on, you feel like you’re pedalling backwards.
Rewind a few years to an earlier iteration of the project that I’m going to tell you about, when some colleagues of mine started trying to understand the impact of the research that MSF does.
For an example of impact, we can look at Karakalpakstan, where MSF research prompted changes in local guidelines, ensuring patients stand the best possible chance of being offered a successful treatment for tuberculosis. And in Bangladesh, where a parasitic disease called visceral leishmaniasis is endemic, MSF research has trialled shorter treatments with fewer side-effects, which has contributed to national policy to try to eliminate the disease.
MSF wants to ensure the research we invest in improves situations for our patients, but there hadn’t been any way of capturing when and where this has happened. MSF research has achieved a lot over the years, but these achievements have been kept in the heads of researchers. There has never been a centralised place for documenting the big (and not-so-big) successes, or for understanding what steps were taken to achieve change: what meetings were had, what conferences were attended, what reports were written, and so on.
Kim and colleagues on a research study into health-seeking behaviour for maternal and child health.
To me, documenting and analysing research impact is important, not just because it’s my job, but because it has the potential to make us do more relevant and better research. How do we know which research delivers the most benefit for our patients if we don’t assess impact? How are we ever going to improve the way we do research if we don’t understand how effective we are at it?
Research impact is a hot topic in academia, so various attempts have been made to develop impact assessment tools. Rather than reinventing the wheel, a colleague started the journey to assess impact in MSF by trialling an off-the-shelf product used by UK funding bodies to assess research impact – Researchfish.
After an initial pilot we stopped using Researchfish as it was clear it did not suit MSF’s needs. However, the pilot data were useful in helping shape our thinking on the type of tool we wanted. ‘Make it simple’ became our mantra – both in terms of usability of the system we wanted, and how we would categorise the data we collected. We now have research impact boiled down to three broad, almost tangible areas – a change in patients, policies, or programmes.
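As a rough sketch of what ‘make it simple’ can look like in practice, here is a minimal data model built around the three impact areas. The names and fields are my own illustration, not ReMIT’s actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum


class ImpactArea(Enum):
    """The three broad areas research impact was boiled down to."""
    PATIENTS = "patients"
    POLICIES = "policies"
    PROGRAMMES = "programmes"


@dataclass
class ImpactRecord:
    """One documented change attributed to a research study (illustrative)."""
    study_title: str
    area: ImpactArea
    description: str
    # The steps taken to achieve change: meetings, conferences, reports, etc.
    steps_taken: list[str] = field(default_factory=list)


# Example drawn from the article's Bangladesh case:
record = ImpactRecord(
    study_title="Shorter visceral leishmaniasis treatment trial",
    area=ImpactArea.POLICIES,
    description="Contributed to national policy to eliminate the disease",
    steps_taken=["Trial results shared with national programme"],
)
print(record.area.value)  # → policies
```

Keeping categories this coarse trades analytical precision for something researchers can actually fill in consistently, which matches the ‘make it simple’ mantra.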
A colleague’s computer-whiz husband, Geoff, created what is now affectionately known as GeoffFish: an Excel tool that took our learnings from Researchfish and turned them into something useful for MSF. Although this tool is about the most sophisticated Excel creation I’ve ever seen, we were still battling with accessibility and a clunky system that was difficult for researchers to buy into and use. Thankfully the process did not end in divorce, and it did give us a robust data schema to take forward.
After GeoffFish, we decided to do things properly: we realised we needed a custom solution, we resourced it properly, and we actually took the time to understand the needs of people involved in research.
Shining light in the information ‘black hole’
In actual fact, there were more unknowns about the research process than about what happened after a study had finished. At times there was an information ‘black hole’ between a study being approved to go ahead and its completion. The volatile nature of MSF’s work means studies sometimes don’t come to fruition, but managers had little understanding of why studies were postponed or delayed, or whether they had gone to plan. Managers wanted a lightweight way of overseeing projects more effectively, so we decided to build this functionality into the tool.
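The lightweight oversight described above might be sketched like this. The statuses and fields are hypothetical illustrations of the idea, not ReMIT’s real data model:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class StudyStatus(Enum):
    """Where a study sits between approval and completion (illustrative)."""
    APPROVED = "approved"
    IN_PROGRESS = "in progress"
    DELAYED = "delayed"
    POSTPONED = "postponed"
    COMPLETED = "completed"


@dataclass
class StudyUpdate:
    """A self-reported status update, with a reason when things stall."""
    status: StudyStatus
    reason: Optional[str] = None


# A manager can see at a glance not just that a study stalled, but why:
update = StudyUpdate(
    status=StudyStatus.POSTPONED,
    reason="Field team redeployed to an emergency response",
)
print(f"{update.status.value}: {update.reason}")
```

The point of keeping the update this small is the same as before: if reporting takes seconds rather than a form-filling session, researchers will actually do it, and the ‘black hole’ stays lit.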
We commissioned a user experience specialist to speak to various stakeholders involved in research. This was probably the most important part of the process, because by getting him to ask really simple questions about what everyone wanted, we ended up with a very clear project brief.
We then worked with some super-cool developers called mySociety who really understood the project. It was tempting to get the tool to do everything straight off, but mySociety became adept at quashing our more far-fetched ideas to ensure we didn’t end up with a product so complex it became unusable. The only way the system was going to be a success was if people actually used it. We could try out more radical ideas in version 2.0.
The ReMIT homepage.
Our end product is ReMIT (Research Management and Impact Tool), and we’re really happy with it. It’s open source and open access so now the whole world can explore the research that MSF is doing. It tracks the research process from conception through to impact and enables the lifecycle of our research studies to be thoroughly documented. The minimal administration burden means researchers are slowly integrating it into their ways of working and self-reporting where studies have been disseminated or had an impact. Our initial feedback from users has been positive.
So now, we have a way of managing our research process more effectively, a way of seeing how and where our research has led to change, and (selfishly) most importantly, a way of me seeing what my colleagues are working on!
Let’s see what version 2.0 holds…