For National Health and Medical Research Council (NHMRC) Investigator Grants, track record carries the heaviest weighting in the assessment criteria (70%), and 20% of that track record score is based on evidence that your previous (ideally recent) research generated impact (a weighting that may increase in future years, according to NHMRC's Track Record Working Group). You get just 6000 characters (around 800 words) to write your impact case study, so it is important to get it right.
Essentially, there are three things you need to do: first, provide evidence that your research generated significant and far-reaching benefits; second, show that this impact can be attributed to your research; and finally, show that you made a significant contribution to the research that generated the impact. This maps onto the three fields in the application form (2000 characters per field):
Field 1: Reach and significance of the research impact
Field 2: Research program’s contribution to the research impact
Field 3: Applicant’s contribution to the research program
It is not possible to analyse successful versus unsuccessful NHMRC proposals to provide an evidence-based guide to what works in the impact sections of these grants, as they are not publicly available. However, working with PhD student Bella Reichard and colleagues in Newcastle, we have analysed the largest available dataset of successful and unsuccessful impact track records: the impact case studies submitted to REF2014. Based on this ongoing analysis, it is possible to offer guidance on what to include in the impact sections of your NHMRC proposal.
1. Evidence your research generated significant and far-reaching impact
The biggest mistake people make when claiming impact is to write about their engagement rather than their impact. To keep this simple, I like to conceptualise impact as benefit: make sure each impact claim describes a clear benefit to someone or something. If there is no benefit, then it probably isn’t an impact. There is no harm in describing the engagement activities that led to your impacts, as long as you also explain the benefits that arose.
Media engagement, industry placements, consultation responses and policy seminars are all great forms of engagement, but are not evidence of impact unless you can explain the benefits that arose. Any claim to impact based on engagement alone is vulnerable to criticism as you cannot prove that the engagement led to impacts. For all the reader knows, your media work may have bored or offended those who engaged, and your industry placement may have created a harmful new product or a costly new process that failed to deliver hoped-for efficiency savings. Did anyone read your consultation response and did the right people come to your policy seminar?
Your engagement may have led to important impacts, but you need to evaluate what happened and present evidence of the benefits that arose from your engagement. Figure 1 shows how you can design an impact evaluation to test whether or not your engagement generated impact.
Figure 1: Methodological framework for evaluating research impact (from Reed MS, Ferre M, Martin-Ortega J, Blanche R, Dallimer M, Lawford-Rolfe R, Holden J (under review) Evaluating research impact: a methodological framework. Research Policy)
Find out more in this vlog or read Chapter 22 of The Research Impact Handbook.
Once it is clear your research has had impact, you need to evidence its significance and reach. To evaluate significance, ask why the benefits you generated are important; if you aren’t sure, ask the beneficiaries why it matters to them. An evidence-based problem statement can provide important context to further justify claims to significance. Evidence can range from randomised controlled trials to methods from the social sciences, arts and humanities, and mixed-methods approaches. You may draw on quantitative or qualitative data, but where possible provide evidence from more than one source to increase the robustness of your claims through triangulation. Finally, assess the reach of the benefits, bearing in mind that this could be expressed in terms of population size or geography, or more creatively, for example by emphasising how you benefited hard-to-reach groups. Again, a sentence of context can go a long way to justify seemingly limited reach: curing eleven people may sound modest, unless the disease is so rare that those eleven were all the known sufferers.
Remember that space is seriously constrained in an NHMRC application. Rather than cramming in multiple impact claims without evidencing any of them, it is more convincing to focus on a small number of your most significant and far-reaching impacts and evidence them properly.
NHMRC requires evidence of at least one of four types of impact: knowledge (including academic), health, economic and social. Although academic impacts (e.g. citations, prizes and use of your work in other disciplines) are easiest to demonstrate, your competitors are likely to have these metrics too. So, in addition to providing this evidence, including evidence from at least one of the three other domains is likely to make your application stand out.
The highest-scoring impact track records submitted to REF2014 articulated clear benefits to specific beneficiary groups and evidenced the significance and reach of each claim. The claims were specific: rather than writing vaguely about impacts on “policy and practice”, they described the particular policies and practices their research had shaped, and rather than claiming “a range of” or “a number of” benefits, they specified health benefits or cost savings precisely. Low-scoring track records were more likely to focus on engagement rather than impact, and this was often one-way dissemination of knowledge or technology transfer.
2. Establish attribution between your research and your impacts
Although this can seem intimidating, it is fairly straightforward if you take a systematic approach. The secret to attributing impacts to your research is to trace cause-and-effect relationships, or causal pathways, identifying how research outputs (cause) led to impacts (effect). Any claim to significance or reach will only be as strong as the weakest link in this causal chain, so it is important to evidence each link as well as possible within the character limit.
Also bear in mind that most impacts are multi-factorial: your research is likely to be one of many pieces of evidence, alongside other factors, that ultimately contributed to the impact. It is rare that you will be able to evidence sole or direct attribution from your research to a given impact. Instead, your task is to build an evidence-based argument that your research made a significant contribution to the impacts that arose.
To illustrate this, if you think your research influenced policy, the first step is to identify which part of the policy your research may have shaped (it is rare that an entire policy can be attributed to one person’s research). Provide some context to explain why the specific policy mechanism your research informed is important within the wider legislation in which it sits (e.g. does it make the policy enforceable, appropriately targeted or cost-effective to implement?). Next, look for citations to your research in the policy documents themselves (you are more likely to find these in some types of policy than others, e.g. clinical guidelines). If there is no explicit link to your research, identify the elements that you think reflect your research recommendations, and then look for evidence that your research shaped the discussions and documents that led up to the policy, including submissions to consultations and committees, policy briefs and seminars where your recommendations were clear. The more active your engagement, the more credible the claim that your work shaped the policy, but this is still not proof of a causal link. The final step, therefore, is to get a short testimonial from someone in the policy community who can explain how your research fed into the process and helped shape the policy. Although this might sound like a tall order when you have just 6000 characters for the whole track record, take a look at the individual impact claims in the “details of the impact” sections of these impact case studies to see how concisely claims like this can be evidenced.
Emerging evidence from the UK track record data shows that attributing claims of significance and reach to the researcher’s own work was crucial to success. Qualitative analysis found evidence of cause-and-effect relationships in almost all of the highest-scoring track records, but in only half of the low-scoring ones. Furthermore, Bella’s quantitative linguistic analysis shows that the best track records contained many more phrases attributing impacts to research and/or pathways to impact (e.g. “used to inform”, “to improve the”, “led to the”, “resulting in”, “evidence for”, “cited in”).
3. Articulate your role in the research that generated impact
Finally, you need to justify your role in the research that generated the impact. Again, this doesn’t need to be entirely down to you. Most of us work in teams, so you need to explain the role you played and justify how this contributed to the outcomes of the research. Given that this is part of an impact track record, it is important to make sure that at least some of your key contributions were to research outputs that directly underpin your claims to impact in the previous two sections.
Find out more
Arrange a webinar with your group to get training and advice, and to help troubleshoot the impact track records you are developing: find out more here.
Download a hypothetical example NHMRC impact track record based on my research (which is not in any way medical by the way, but it gives you a sense of how you might apply some of the advice above).
Check out The Research Impact Handbook for more on how to generate impact once you get your funding. In Australia, copies are only available in bulk or with an NHMRC webinar; contact us to find out more.
About the author
Mark is a recognised international expert in research impact with >150 publications that have been cited over 15,000 times. He holds a Research England and N8 funded Chair in Socio-Technical Innovation at Newcastle University and has won awards for the impact of his research. He regularly reviews research proposals and is a member or chair of funding panels for the UK Research Councils, EU Horizon 2020 and other national and international research funders. He has been commissioned to write reports and speak at international policy conferences by the United Nations, has acted as a science advisor to the BBC, and is research lead for an international charity.
Mark regularly advises research funders, helping to write funding calls, evaluate the impact of funding programmes and train their staff (his interdisciplinary approach to impact has been featured by UKRI, the largest UK research funder, as an example of good practice). He also regularly advises on impact to policymakers (e.g. evaluating the impact of research funded by the Scottish Government and Forest Research), research projects (e.g. via the advisory panel for the EU Horizon 2020 SciShops project) and agencies (e.g. the Australian Research Data Commons).