The Investigator Grants 2024 CEO Introduction webinar was held on 18 January 2024. NHMRC CEO, Professor Steve Wesselingh, and Research Foundations Executive Director, Dr Julie Glover, hosted the webinar. They shared tips for quality peer review and outlined the resources available. The video is available to watch on Vimeo and a transcript is below.
Question topics
- Importance of peer review
- Investigator Grants Scheme
- Peer review principles and starting early
- Resources
- Peer review mentors
- Assessing applications applied at the wrong level
- Overarching tips and tricks
- Track record and relative to opportunity
- Top 10 in 10 publications assessment
- Benchmarking applications
- Outlier scores
- Feedback to applicants
- Comment sharing with other reviewers
- Final note of advice
Transcript
Speakers
- Professor Steve Wesselingh – CEO, NHMRC
- Dr Julie Glover – Executive Director, Research Foundations Branch
- Dr Justin Graf – Director, Investigator Grants
Introduction
Dr Justin Graf – (0:07)
Hello and welcome to the 2024 Investigator Grants Peer Review introduction video.
I'm joining you here from Ngunnawal Country in Canberra. My name is Justin Graf, I'm the Director of the Investigator Grants section, and I'll be facilitating today's discussion.
I'm pleased to be joined by NHMRC CEO Professor Steve Wesselingh and Executive Director of Research Foundations Branch, Dr Julie Glover, who oversees NHMRC’s largest grants [schemes] including the Investigator Grants, Ideas Grants and Synergy Grants.
Today, we'll be taking you through some of the key responsibilities of your task as a peer reviewer [for Investigator Grants] and some of the key considerations for undertaking that task. Thank you for completing the conflict of interest and suitability declaration stage. This is a really important stage to ensure that the application-centric peer review process allocates the most suitably qualified peer reviewers to the broad range of fields of research applications that we receive.
We’ve just started the 4-week assessment period. So, this video will briefly cover some of the main considerations for assessing Investigator Grants applications. So, I’ll dive right in and ask Steve a question.
Importance of peer review
Why is peer review so important, Steve?
Professor Steve Wesselingh – (1:20)
So, peer review, as you all know, is essential to the integrity and the quality of research in Australia and I know that it's a really big commitment by you guys. I know that it will take time away from your family and your own research, but it is just so important: it gives confidence to the Government and to the public that we really are selecting the very best research for funding.
NHMRC is really proud of its peer review processes and we believe we're up there internationally with the very best and we believe our peer review is the gold standard but really, we can't do that without you. Peer review exists and develops because of the peer reviewers and you guys have done such a fantastic job over the last few years and again, we really thank you for committing to do it this year. So, thank you very, very much.
I must say also, I think that there's a professional development process involved in peer review. I think every time you get involved in peer review, you not only get better at peer review, but you get better at writing your own grants and you get better at thinking about research and really learning about research. I've also said publicly that I'd like the universities and the MRIs to acknowledge the peer reviewers in their organisations because it is such a big commitment. It really should be acknowledged broadly and widely that you are putting so much time and effort into it.
So, thank you again for all the time and effort you're putting into this process.
Investigator Grants Scheme
What makes the Investigator scheme different from other schemes, for those peer reviewers who are new to the scheme?
Professor Steve Wesselingh – (3:01)
Well, the Investigator [Grant] scheme really is our flagship. It consumes the largest amount of resources from NHMRC and what we're really trying to do is select the very best researchers in Australia, give them 5 years of funding, mostly salary and research funding.
Sometimes people don't take salary and just take the research funding if they already have a salary [elsewhere], but basically it frees them up for 5 years to do outstanding research and really contribute to health and medical research in Australia through that process.
I like to think of it as similar to the Howard Hughes in the United States. We really choose the very best people, identifying the area they're going to work in, but acknowledging that over the next 5 years they may move a little bit in terms of their research questions, and it gives them that flexibility and that freedom really to answer the most important questions in their area and really develop solutions for our very big health challenges.
Peer review principles and starting early
Julie, what should peer reviewers be mindful of when conducting the important task of peer review?
Dr Julie Glover – (4:11)
So it's really important to be aware that we have a set of peer review principles.
NHMRC’s Peer Review Principles are really important things to be aware of when you're doing your peer review. They cover fairness, transparency, independence, impartiality, and confidentiality, and they underpin the whole process. There are also obligations under the Privacy Act and under the Code for the Responsible Conduct of Research that you'll need to be aware of.
You can find out more information about these in the information [pack] that will be sent to you by the secretariats, and it's also available on our website. I think the most important thing to be aware of is just how important peer review is and that it requires time and energy. Make sure you start your assessments early, so that there's enough time to complete a thorough assessment and, if you identify any late conflicts, so that there is time for us to reallocate those applications. Our peer reviewers report that it typically takes between 2 and 4 hours per application to complete an assessment. So, it is a great time commitment and, as Steve has mentioned, thank you so much for your time. It's also important to let you know that some people like to work offline to do their peer review assessments. That's OK.
We would prefer you worked in Sapphire because it lessens the chances for errors, but if you do work offline, please make sure that you let our secretariats know. We monitor completion, so we will hassle you if we don't see that there are assessments coming in and we do try to keep to our time frames to make sure we can get outcomes out to applicants as soon as possible.
Additional information
NHMRC strongly encourages peer reviewers to work within Sapphire but understands this is not always possible. In Sapphire, it is possible to enter scores and applicant feedback as you work through your assessments because Sapphire autosaves these scores and comments on entry. You can continue to amend these until you submit the assessment for each application. You should only submit when you have finalised your assessment. After submission, you will then be able to see your overall score against the application.
Should you submit a score and subsequently realise you have made an error, please contact your secretariat. Due to the tight timeline for assessments this year, NHMRC would appreciate it if you enter your assessments into Sapphire as you complete them. This will enable us to avoid contacting you unnecessarily when you are working offline, as it gives secretariats the reassurance that you have commenced your allocated assessments. To assist peer reviewers who have difficulty working within Sapphire, or who prefer to work ‘offline’, a blank Excel assessment template has been provided to record scores, peer reviewer notes and applicant feedback, which then needs to be entered manually into Sapphire and submitted.
Resources
Steve, what resources are available to peer reviewers to help them with their assessments?
Professor Steve Wesselingh – (6:12)
So Justin, there are lots of resources and in fact I’m going to refer to my notes a little bit because otherwise I’ll forget some of them. The 2024 Peer Review Guidelines are the most important; they contain all of the key issues [and materials] that you need to look at.
There’s also the Peer Review Support Pack, which helps you in so many ways but particularly in navigating Sapphire. The other thing that we've brought in in the last couple of years, which I think is terrific, is the peer review mentors [PRMs] and these are experienced peer reviewers who are there to help answer questions and help you along your way.
We also understand that it’s very easy for bias to enter into the system and there are tools to counter bias [available]. It’s really interesting to utilise those tools and to see what your sort of biases are and sometimes your unconscious biases really surprise you. So have a look at the tools to counter bias.
We’ve got last year’s webinar with Anne Kelso, the CEO at the time, who did a terrific job in that webinar, so have a look at that as well as the detailed Q&A transcript, which gives you answers to a lot of the really common questions that you might be asking.
And of course, we're always here. The NHMRC secretariat is always here and the NHMRC help desk is always here and easy to call. So, lots of resources; call NHMRC anytime, send us emails, and however and wherever you can seek out that help, we're here to help you. Absolutely.
Additional information
- Read the Peer Review Guidelines to understand processes and your responsibilities (sent to you from the secretariat or available on Grant Connect).
- Familiarise yourself with the scheme aims, assessment criteria (Appendix C in the Peer Review Guidelines) and score descriptors (Appendix D in the Peer Review Guidelines).
- Review the Investigator Grants peer review mentor video and peer review support pack (sent to you from your secretariat).
- Review the 2023 CEO webinar and transcript.
- Undertake the online Implicit Association Test for gender and science and view The Royal Society’s video, Understanding Unconscious Bias, which you may find helpful prior to commencing assessments. Further information can be found in Peer Review: A guide supporting the Australian Code for the Responsible Conduct of Research.
- Seek advice from peer review mentors where required and attend the drop-in sessions.
- Contact your secretariat if you have any questions.
- Sapphire-related queries can be resolved by calling or emailing the NHMRC Helpdesk at help@nhmrc.gov.au.
Peer review mentors
Julie, who are the PRMs and how would you go about utilising them through the round?
Dr Julie Glover – (7:53)
So, this year we have 3 peer review mentors. We've got Professor Patsy Yates from QUT, Professor Kim Delbaere from Neuroscience Research Australia, and Associate Professor Thomas Ve from Griffith University.
So, they'll be available to answer questions that you might have about assessing the applications. It's important to realise they don't have access to the applications, so they're not answering questions about the specifics of the applications or the science, just about the process and how you would approach your task. You can hear directly from former peer review mentors, as we have some videos on the Investigator Grants website, and we'll also have drop-in sessions where the PRMs will be available to answer your questions directly. So, we'll provide information to you via email about how to contact the PRMs.
Additional information
Similar to the 2022 and 2023 Investigator Grants rounds, peer review mentors (PRMs) will be on hand during the assessment phase of peer review in 2024. Should you need to speak with a PRM, please contact your secretariat. PRMs are independent senior researchers with experience in conducting Investigator Grant/NHMRC peer review.
Professor Patsy Yates – Queensland University of Technology Distinguished Professor
Patsy Yates is the Executive Dean of the Faculty of Health, Queensland University of Technology. Professor Yates is a Registered Nurse with extensive experience as a leader in education and research in the health sector. Her research focuses mainly on developing workforce capacity in cancer, palliative and aged care; advancing the management of cancer-related symptoms and treatment side effects; and policy and practice in cancer, palliative and aged care. Professor Yates was a peer review mentor for Investigator Grants from 2021 to 2023 and is a current Investigator Grant fellow.
Professor Kim Delbaere – Neuroscience Research Australia Professor
Kim Delbaere is a Senior Principal Research Scientist at Neuroscience Research Australia and Director of Innovation & Translation at the Falls, Balance & Injury Research Centre, and is a Professor at the University of New South Wales. Her multidisciplinary approach, incorporating physiotherapy, psychology, brain imaging and software engineering, has contributed to the understanding of the physical, psychological and cognitive factors causing and preventing falls. Professor Delbaere is the recipient of 2 prestigious NHMRC excellence awards and numerous successful NHMRC applications, including a current NHMRC Investigator grant. Professor Delbaere contributed to Investigator Grant peer review in a chairing capacity from 2019 to 2020 and then as a peer reviewer from 2021 to 2022.
Associate Professor Thomas Ve – Griffith University
Associate Professor Thomas Ve is a structural biologist and a research leader at the Institute for Glycomics at Griffith University. He is an ARC Future fellow, an NHMRC Investigator fellow and a former ARC DECRA fellow. His group is focused on understanding how the essential metabolite NAD+ and related nucleotide products trigger nerve fibre (axon) degeneration in response to injury and disease, and how they activate immune responses in bacteria and plants upon pathogen infection. Associate Professor Ve has participated in Investigator Grants peer review for 3 consecutive rounds, from 2021 to 2023.
Assessing applications applied at the wrong level
Steve, one of the most common issues that peer reviewers report to us is difficulty assessing an application when they think the applicant has applied at the wrong level. So, what advice would you give to peer reviewers if this happens to them?
Professor Steve Wesselingh – (8:59)
So this is a really important issue and a really difficult issue. It's important that peer reviewers read the Statements of Expectations in the Guidelines really carefully, so that you understand what our expectations are for an EL1, an EL2, an L1, an L2 and an L3.
You can then align the description that the investigator has given you with those expectations, and if you think that someone might have applied at the wrong level, particularly if they're applying at a level lower than they really should be, then you need to take that into account. You should take that into account in your scoring, in terms of whether they have applied at, let's say, an L1 or an EL2 when they should have been applying at an L2.
So those sorts of things are really important to take into account, and if you look at the videos from the PRMs, they covered that as well. So that's really important. And the last point is, if you do think that someone's applying at a level lower than they should be, then it's important to make that point in the comments as well.
Additional information
Further guidance on assessing applications applied at the wrong level is provided in the following resources:
- Investigator Grants 2024 Peer Reviewer Guidelines, Appendix G: Statement of Expectations
- Investigator Grants 2024 Peer Reviewer Guidelines, Appendix G(I): Reviewing applications submitted at an inappropriate Category/Level
- Investigator Grants peer review mentor video.
Overarching tips and tricks
Julie, what are some tips and tricks you can pass on to first time peer reviewers?
Dr Julie Glover – (10:15)
I've already mentioned making sure you allow enough time, because it's such an important task. We do provide a lot of documentation, so it can be quite overwhelming, but please take the time to look through it because it has been developed over a number of years based on feedback from peer reviewers, and that material can really help with the task.
Other than that, the peer review mentors have also suggested some tips and tricks on the video that I mentioned, such as grouping your assigned applications by level and assessing those applications together.
Some peer reviewers also find it helpful to look at just one criterion at a time across all of the applications. But, as I've mentioned, we highly recommend completing your assessments in Sapphire to avoid any transcription errors, and if you do work offline, we'd be really grateful if you could keep your secretariat informed.
Track record and relative to opportunity
Steve, how can peer reviewers ensure that they’re consistent in their approach to track record assessment?
Professor Steve Wesselingh – (11:28)
Track record is obviously a really key part of what you're going to be looking at and you get a lot of information on someone's track record. But one of the key issues around track record is relative to opportunity and this is all about fairness and equity.
So, you really do have to look very carefully at the career disruptions and the career opportunities. It's important to look at the career context that the investigator lays out. They'll give you a plotted history of their career, and they'll give you a sense of the opportunities they've had, the lack of opportunities, and the disruptions they've had. You need to take that into account as you assess their track record.
You need to do that because that's fair and it provides equity, and so it is a difficult part of the process. I think you need to read all the information that you're given, take a holistic view of the opportunities that the investigator has had to develop a successful track record, and take those 2 aspects into account.
Additional information
Comprehensive guidance on NHMRC’s Relative to Opportunity Policy is provided at:
- Application-centric peer review
- Appendix H (Relative to opportunity policy) and Appendix I (Guide to evaluating industry-relevant experience) of the Investigator Grants 2024 Peer Reviewer Guidelines.
Top 10 in 10 publications assessment
One of the biggest changes in recent years, as you know, has been the introduction of applicants providing up to 10 of their top publications from the past 10 years in their track record, accompanied by a description of their contribution to each paper and that paper’s contribution to their field of research. In this way, peer reviewers should be assessing quality and contribution to science rather than the quantity of publications that someone has accrued during their career. So can you just outline NHMRC’s expectations for assessing the publications criterion?
Dr Julie Glover – (13:14)
Yeah, thanks. So we really ask you to rely on your expert knowledge and use your best judgement to provide a fair and consistent review of the nominated publications (up to 10), looking at their quality. NHMRC has put some guidance in our documentation around what we mean by quality: it's about contribution to science and the applicant’s claims for the publications, as well as the citation and publication practices of the field. Those are the sorts of things you need to take into account.
The characteristics of publication quality include rigour of experimental design, appropriate use of statistical methods, reproducibility of results, analytical strength of the interpretations, and significance of the outcomes or overall impact. And obviously those things will apply to different publications in different ways. It's important to note that NHMRC is a signatory to the San Francisco Declaration on Research Assessment, DORA, which is really about not basing the quality of a publication on the journal, but on the paper itself, the publication itself. There is some more guidance available on that on our website and also on DORA's website.
Professor Steve Wesselingh – (14:45)
Can I just add something to that? I mean I think I've been really proud of the change to the 10 in 10 years and I think it really has put us up there with all the international peer review and DORA has been a really important part of that.
I think the response to us doing that has been people looking much more carefully at the quality of their papers rather than the number of papers. And in the end, it's the impact of the individual paper that's most important: 10 papers that have terrific impact are far more important internationally than 200 papers that have little impact.
This is now the gold standard, and I think it's really important that NHMRC is meeting that.
Additional information
NHMRC expects peer reviewers to consider their expert knowledge of their field of research and to use their best judgement in providing fair and consistent review of the nominated publications’ quality, contribution to science and the applicant’s claims for them, as well as the citation and publication practices of that field, when assessing the publication component of an applicant’s track record. Further information can be located in the Investigator Grants 2024 Peer Review Guidelines section 4.3.6.5.
Given the Publications criterion makes up 35% of the total score for these significant grants (more than any other criterion and more than Knowledge Gain at 30%), NHMRC expects peer reviewers to expend proportionate effort in reviewing this criterion.
Publication quality refers to characteristics such as the rigour of experimental design, statistical significance of findings [read, appropriate use of statistical methods], reproducibility of results, analytical strength of interpretations and significance of outcomes or overall impact, and the quality and contribution to the field of the applicant's published journal articles, not just the standing of the journal in which those articles are published. It is not appropriate to use publication metrics such as Journal Impact Factors. Journal-based metrics, if included by an applicant, should not be taken into consideration in the assessment of publications.
Instead, peer reviewers are to focus on the creativity and innovation of ideas, rigour of experimental design, appropriate use of statistical methods, reproducibility of results, analytical strength of interpretations and significance of outcomes, all of which serve as surrogates for measuring research quality of a publication, irrespective of the field of research.
Applicants are advised that the explanation field is not to be used to provide additional track record information (for example, conference participation, awards, patents, publications not already nominated in the applicant’s Top 10) but can include field weighted metrics and citation metrics. Section 4.3.6.4 of the Peer Review Guidelines provides guidance on the use of metrics and states that peer reviewers are to take into account their expert knowledge of their field of research, as well as citation and publication practices of that field, when assessing the publications component of an applicant’s track record. Journal metrics, if included by an applicant, should not be considered in the assessment of publications, in line with the recommendations made in the San Francisco Declaration on Research Assessment (DORA), of which NHMRC is a signatory.
Benchmarking applications
Steve, how important is benchmarking in assessing applications?
Professor Steve Wesselingh – (15:33)
So in the end you need to look at every application independently and score it independently according to the Guidelines and the criteria in the Guidelines. That's the most important thing to do. But at the same time, we do know that there are hundreds of applications and we need to understand how they relate to each other. You will get a number of applications, and it's very reasonable for you to benchmark your applications against each other, particularly ones that are at the same level, such as L1s, L2s or EL1s, so thinking about where you rank your applications obviously is important.
One bit of advice that I think one of the PRMs might have given was about doing each section separately. So you might look at the 10 publications in a series of the applications that you're looking at and benchmark that part first, then do the next part and then the next part. That may work for you or it may not, but it was a tip that I saw in one of the videos that I thought was quite helpful.
Additional information
For further advice please see the Investigator Grants 2024 Peer Review Guidelines, which outline the assessment criteria (Appendix C), score descriptors (Appendix D) and Statements of Expectations (Appendix G) to guide your assessment of applications.
Outlier scores
Can you explain a little bit about what NHMRC does with outlier scores and checking scores?
Professor Steve Wesselingh – (16:43)
Yeah. So there's been a lot of concern in the sector about outlier scores and that people's grants might be pulled down by a single score, and we've actually had a separate committee look at that; they have done a large amount of statistical analysis on the consequences of outliers.
But in terms of peer review and the process that you're about to undertake, we will be looking at all the scores. On average there are going to be 5 reviewers per application, so we may see, say, 4 reviewers with scores that are close together and a 5th whose score is significantly different from the others, an outlier.
We'll probably contact that 5th person and just check that they meant to put that score there, and we'll also look at the comments and make sure the outlier score and the comments align. We're not saying that all the scores have to be close together, because obviously people will have different views and analyse different parts of the application.
But we are checking all of the outliers to make sure that there are no transcription errors or other errors that might have led to that score, which we would have to correct afterwards.
Feedback to applicants
Julie, do you have any advice for peer reviewers in providing their written feedback to applicants?
Dr Julie Glover – (18:02)
Yeah, thanks. So written feedback, those sorts of qualitative comments on the application, goes back verbatim to the applicants.
It's such an important part of the process, so please keep this in mind as you write your comments: focus on what the applicant could change to be stronger in a future application, and focus on why you scored as you did. Those are the 2 things to keep in mind, and we're really keen to ensure that our peer review feedback is of high quality.
So, we've got a lot of dos and don'ts about feedback, which are available in our Guidelines, so please have a look at those. For this scheme, comments aren't required against each criterion, so it is a chance to give an overall comment. Think about that as well and what you think would be of most value to the applicant. We do review feedback and we may come back to you if we have any questions or concerns.
Professor Steve Wesselingh – (19:10)
Again, can I just comment that I guess all of the peer reviewers, all of you out there, will be writing grants and getting your own peer review back.
Just think about the sort of comments that you'd like to get back and what would be helpful to you; that will help you think about the comments that you're going to write about these grants.
Additional information
Peer reviewers are required to provide constructive qualitative feedback to applicants that focuses on the strengths and weaknesses of the application. It's important to remember that this feedback will go to applicants verbatim, so please keep that in mind as you write your comments.
For further advice please see Appendix A: “Advice on Preparing Applicant Feedback” in the Peer Reviewer Support Pack.
Comment sharing with other reviewers
Did you want to comment a little bit about sharing assessments from other peer reviewers of the same application?
Dr Julie Glover – (19:41)
Yeah, so as part of increasing transparency and accountability, as well as a way to catch the odd error that might creep into the process, we've opened up the ability for peer reviewers to review the de-identified comments from other peer reviewers on the same application.
The main purpose of this step is not for you to revisit or change your mind about your assessment, but it's really to help identify any errors, but also to provide a learning opportunity for reviewers so that they can see what other reviewers thought of the same application.
So, we'll need peer reviewers to check that the comments relate to the application and that the comments are appropriate. If you see anything of concern in those, please come back to your secretariat. The period available for looking at those comments is rather short, and we do apologise for that, but we will give you notice, and the reason it is short is so that we can move on to the next steps in the process. The other thing I wanted to point out is that it's OK if other peer reviewers have different views from you about an application. We all have different expertise and we bring that to our assessments. So that's OK as well. And as Steve has already mentioned, we do have a checking process to look through and identify outliers.
Additional information
As for the 2023 Investigator Grants round, the de-identified applicant feedback provided by other assessors of the same applications will be available to you in Sapphire for the 2024 Investigator Grants round. This is to help increase the transparency and accountability of the peer review process, as well as provide a gauge for how your own assessments compare with those of other peer reviewers on the same application.
Final note of advice
And Steve, if you had just one final bit of advice for peer reviewers to leave with, what would that be?
Professor Steve Wesselingh – (21:26)
Well, I think we need to think about why we do all of this, and it's really the public. The public funds health and medical research. The public benefits from health and medical research, but the public also gives us our social licence to do health and medical research, and that social licence is based on the quality and integrity of our research.
If we lose the quality or the integrity of our research, we lose the confidence of the public, and the consequences are reduced funding and a whole lot of other problems.
So, think about peer review and the peer reviewer as being an essential component of quality and integrity, and the consequence of that is public confidence, public funding and, in the end, better health. So that's the way I would think about it. That's why I got involved in peer review and that's why I think you should get involved in peer review.
Dr Justin Graf – (22:17)
Thanks Steve, and thanks Julie. But most of all, thank you to the over 650 peer reviewers that will undertake assessments in the 2024 round. We look forward to working with you closely this year.
Thank you.