NHMRC’s Research Impact Track Record Assessment (RITRA) framework requires researchers to report on past research impacts in their applications for Investigator and Synergy Grants and requires peer reviewers to assess and score these reported impacts.
Implementation of this framework is intended to provide an incentive for researchers to consider future impact when planning and conducting research, ideally leading to an increase in the translation of NHMRC-funded research and improved public health.
The RITRA framework evaluation report describes the results of a process evaluation that sought to determine whether the RITRA framework has been implemented as intended.
Contents of the report's executive summary are provided below.
Questions and findings
The evaluation questions and associated findings are as follows:
A1 – How easy was it to provide the impact text?
Applicants found it difficult to provide the impact text, and both applicants and peer reviewers thought there was substantial overlap in the text provided across the three impact sub-sections. Applicants found NHMRC's examples of evidence to support impact statements helpful but thought they could be improved.
A2 – Are the impact types useful?
Most applicants consider that the four impact types (knowledge, health, social and economic) allow them to report all the impacts they would like to; however, some peer reviewers raised concerns that applicants had selected incorrect impact types.
Data analysis confirms that, across all applications, the impact types selected do relate to the impact text provided; however, applicants report knowledge impacts even when they have not selected this impact type.
A3 – What types of impacts are being reported?
Applicants more often report on knowledge impact than they do on benefits experienced by stakeholders beyond the research sector.
A4 – Is the impact text duplicating the publications text?
The impact and publications texts provided by applicants overlap substantially.
B1 – How easy was it to assess the impact text?
Peer reviewers expressed mixed views about how easy the impact text was to assess, but were largely united in finding the Category Descriptors unhelpful and offered a variety of suggestions for improving them.
B2 – Do applicant characteristics influence scoring of impact?
Applicant characteristics do not appear to be associated with differences in scoring.
B3 – What factors affect impact scoring?
Since more structured information began to be collected under the Relative to Opportunity policy (R2O) and the revised Statement of Expectations (SofE), Leadership level has become less predictive of the impact score. There is no apparent relationship between impact type and impact score.
Discussion
While noting the short time frame since RITRA was first implemented, and that applicants and peer reviewers are still learning how best to engage with the RITRA framework, the evaluation has identified several issues arising from the framework's implementation that need to be addressed. These include:
- duplication of information provided by applicants in the three impact sub-sections and between the impact and publication texts
- difficulty experienced by applicants, especially those early in their careers, in providing retrospective impacts
- possible confusion about the nature of research impact
- difficulty experienced by peer reviewers in assessing the information provided in the applications
- insufficient guidance provided by the Category Descriptors.
Based on these findings, the RITRA framework could be improved to ensure it is achieving its short-term outcomes and overall objectives.
Recommendations
In order to address the issues identified by this evaluation, some revisions could be made to the RITRA framework, as follows:
- To reduce duplication of content in the application, the three components of the Research Impact section could be combined into one or two components
- To provide more space for applicants to explain their research impact, all evidence for impact – in the form of URLs or document citations – could be included within a separate free-text field of the application form
- Given that most applicants describe the generation of knowledge that may lead to impact, applicants could instead outline their pathway(s) to impact and their engagement with research end users. NHMRC could develop advice about pathways to impact and describe various markers for each impact pathway type
- To help peer reviewers assess the applicant’s impact pathway, the Category Descriptors could be revised – ideally assisted by cognitive interviews – to ensure that applicants, peer reviewers and NHMRC all understand them in the same way.
Revision of the RITRA framework should be guided by an expert working group.
Next Steps
NHMRC has formed a Working Group to co-design improvements to the RITRA framework. Initial updates were piloted in the Investigator Grants 2025 funding round. Further updates are expected to be implemented from the Investigator Grants 2026 funding round, following a public consultation.