Choosing evaluation methods

When assessing the impact of research on public policy, the empirical methods of data collection to choose from are quantitative, qualitative, or mixed methods (i.e., a combination of quantitative and qualitative approaches).


  • Quantitative methods. The most common quantitative method is the use of surveys or questionnaires. For instance, surveys can be administered after events such as workshops to evaluate the influence of research on policy professionals’ perspectives and to track subsequent changes in policy driven by these insights. Surveys can help quantify the direct impacts of research on policy-making and gauge ongoing interest in engagement.
     

  • Qualitative methods. When evaluating policy engagement with qualitative methods, semi-structured interviews are the most widely used technique. For example, you might interview policy professionals to gather evidence of the research’s influence on a policy outcome. Such interviews can reveal how the research was used in decision-making processes and its perceived impact on the eventual policy change.
     

  • Mixed methods. Quantitative and qualitative methods can be combined to develop case studies that provide a well-evidenced account of how research influenced a particular policy decision or development. Case studies can include timelines, key players, decision points, and the role of research at key stages in the policy impact process.
     

Research can influence policy in many different ways, and no single method can effectively assess all types of policy impact. Researchers who aim to evaluate how their work has influenced public policy face distinct challenges that call for a customised approach. Here is how you can adapt your evaluation strategy to the specific circumstances you face:
 

  • Assess time and resources. Take stock of the time and resources available and decide whether to conduct the evaluation independently or seek assistance; this choice shapes the scope and depth of the analysis.
     

  • Align design with skills. Match the evaluation design to your own skills and your understanding of what counts as valid evidence, so the evaluation remains manageable.
     

  • Tailor methods to your goals. Choose methods suited to your specific aims, whether assessing the influence of research on new policies or the effectiveness of those policies after implementation.
     

  • Address the attribution challenge. Determine whether the research was the direct cause of a policy change or one significant contributor among many, and use methods that support accurate attribution.
     

  • Tailor evidence to audience interests. Gather evidence that speaks to the interests of your intended audience, such as funders focused on direct impacts like economic benefits, so that the research’s impact is communicated effectively.
