How will we judge scientists in 2030? #wetenschapper2030

Authors: Esther Plomp, Marta Teperek, Maria Cruz, Yan Wang, Yasemin Türkyilmaz-van der Velden

The conference on evaluation criteria for Dutch researchers, organised by NWO and ZonMw and held in The Hague on the 23rd of May, aimed to discuss the current rewards and incentives system and to rethink the evaluation criteria of the future.


The meeting was attended primarily by (female) university staff, and to a lesser degree by other stakeholders. The ratio of support to research staff was estimated to be 50/50. The main language, Dutch, may have prevented international researchers from attending, despite live translation efforts in the morning. The morning session consisted primarily of videos, panels and talks by funders and scientists discussing why researcher evaluation criteria need to change, while the afternoon session focussed on smaller breakout discussions looking at how to change the system and how to practically implement these changes (see here for the full programme).

Esther Plomp tweeting about the lack of diversity at the meeting. Diversity was identified by the participants as an important aspect of future evaluation, which demonstrates that diversity is also needed in terms of participant representation at these consultation meetings.

The day was introduced by a video message from Ingrid van Engelshoven, the Minister of Education, Culture and Science (OCW). She stated that the impact factor does not say anything about quality, is too narrow a criterion to evaluate researchers, and is irrelevant to some scientific fields. Instead, the focus should shift to generating scientific impact through science communication and collaborations between researchers.

Mismatch in the current reward system

The audience rated the urgency of addressing the current evaluation criteria of researchers at 8.6 out of 10. This is because there is a mismatch in the current reward system: we do not reward what is necessary to advance scientific research, as highlighted by Rianne Letschert (rector magnificus of Maastricht University and member of the VSNU committee that is leading the development of a new evaluation system). According to Rianne, scientists should no longer be expected to excel in all areas. These expectations have led to stress, frustration and burnouts among scientists, increased competition, academic abuse and harassment, and cases of fraud. Rianne argued that the focus should shift to a diverse range of career paths that play to the strengths of individual researchers in different areas (research, teaching, societal impact, or leadership), so that these talents are appreciated and can be better embedded within a team where required. It should also become possible to return to a scientific career after a “break” to focus on other demands (e.g. administration and management). To enable these changes, the VSNU is currently developing a toolkit of new evaluation methods, which are quantitative and qualitative in nature and focus on individual as well as team performance.

This slide by Rianne Letschert provided us with a great summary of the current reward system.
“For anyone who is still wondering why we need to talk about the scientific reward and incentive system” (tweet by Lotte Melenhorst).

What’s your story?

As scientific progress is a team effort, we need more flexibility and variety in the evaluation of scientists. It is unrealistic to evaluate research quality based on one very narrow output: publications. Hanneke Hulst, assistant professor at the VUmc and member of the Young Academy, argued that the story behind the scientist should become more important than the simple application of standard evaluation criteria. This includes treating science as a “team sport”, since researchers usually work within a team, a department or an institute, rather than in isolation. This makes it difficult to evaluate who has contributed what specifically, as Stan Bentvelsen, director of Nikhef and professor at the UvA, noted. Bas Borsje, assistant professor at the University of Twente, argued that team science is not the right metaphor: researchers work in communities of practice, and there should be more incentives to stimulate collaboration. Collaborative research requires good leadership, and, as Beatrice de Graaf, professor at Utrecht University, highlighted, the competences required from leaders are changing. There should be space for more diversity, and leaders should have leadership and interpersonal skills rather than holding a leadership position because of their research capabilities.

Evaluate the process, not just the end result

Sarah de Rijcke, director of the Centre for Science and Technology Studies at Leiden University, argued that the future focus should not shift to valuing only collaborative work, as that would simply create a new hoop to jump through. Instead, Sarah argued, we have to look at the content of scientific research. This requires transparency in evaluation criteria and a focus on the process rather than the end result. The question should be whether it is good quality science, whatever the end result. Rob van Gassel, PhD student at Maastricht University and PhD Network (PNN) representative, echoed these comments and argued that early career researchers want to generate meaningful research, not only focus on the end product of research. Evaluation based on content requires more flexibility in evaluation criteria, such as the additional category within the Veni proposal proposed by Bas Borsje: researchers would decide themselves what to fill into this category and what they would like to be evaluated on.

PostdocNL tweeting about the meeting.

Diversify career paths

In his talk, Barend van der Meulen, head of research at the Rathenau Institute, stated that we are currently selling the idea that there is a place for everyone at universities as long as they work hard enough, or as long as they fulfil the new criteria for scientists. But the number of research positions at universities is limited and, therefore, there needs to be more focus on diverse career paths, including collaboration with industry. Jet Bussemaker, professor at Leiden University, agreed that PhD students should be better prepared during their training for these diverse career paths and that they should also focus on the societal relevance of their research. She highlighted the importance of the Dutch National Research Agenda, which aims to build bridges and stimulate collaboration between scientists and the public.

Importance of societal relevance

Ionica Smeets, Professor of Science Communication at Leiden University, also highlighted the importance of connecting with the public and of greater appreciation for science communication, which is currently seen as a hobby and not taken seriously in the evaluation of scientists. Ionica also wrote a column about this topic, which you can read on the Volkskrant website. It is even unclear where the money allocated to promote science communication will go; it appears to disappear into science festivals rather than being used to support individuals who are already heavily involved in science communication. Even Ionica’s own yearly evaluation focuses on her scientific publications rather than her science communication activities. Ionica stated that doing research sometimes “feels like an endless battle across all fronts”. An evolution, or slow change of evaluation criteria, will take too long, and a revolution will result in more battle fronts, so perhaps we should focus on mutations and experiments to test which changes will work. Ionica hopes that the scientist of the future has fewer battles to fight and can instead focus on the areas in which they excel.

Tweet by Sanli Faez on the mutations proposed by Ionica Smeets.

What do the funders think?

Jeroen Geurts, chair of ZonMw, admitted that the funders have contributed to the creation of a very competitive environment that focuses on publications and generates ‘science divas’ (see also here for a more detailed statement). He hopes that a new system will allow for more breathing space. Stan Gielen, chair of NWO, added that, while the funders do their work, their reward system is not evidence-based: they do not measure the things that they find important in research proposals.

Jean-Eric Paquet, Director-General for Research and Innovation at the European Commission, discussed the progress in altmetrics and open science. He argued that scientists should open up their data and make it FAIR, as only 30% of research data is currently reused. The open science movement can also be seen in efforts to fight the paywalls that hide scientific outputs, such as Plan S, which supports immediate open access. He stated that evaluation needs to focus on content instead of publication metrics.

How to implement changes?

In the afternoon, the participants of the meeting were invited to join smaller parallel discussion sessions:

  1. Research vs. Education
  2. Team players vs. Superstars
  3. Skills
  4. Metrics vs. Narratives
  5. The Netherlands vs. the rest of the world
  6. Inclusivity vs. reality on the work floor

These sessions aimed to gather input from the participants on ways to improve the current system. The session chairs collected key messages and will report all outcomes at a later stage. The overall impression from the sessions attended by the authors is that, while most of them resulted in engaging and focused discussions, there were hardly any concrete suggestions or plans on how to implement the necessary changes. The formulation of the questions asked during these sessions could have been more pragmatic and action-oriented. As a result, during the last panel session the problems with the current system were emphasised rather than the focus being placed on how to move forward.

At the same time, some practical suggestions were shared by participants during the last panel session and on Twitter:

Sanli Faez tweets about the composition of review committees.
  • Terms such as excellence and quality need to be well defined, so that scientists know what the criteria are on which they are evaluated.
  • The use of metrics such as the h-index and impact factor should be avoided.
  • Review committees should change in composition to achieve a change in how proposals are evaluated.
  • Sharing of data and code, open science practices and science communication efforts should be rewarded appropriately, rather than treated as second-class activities in research. (This could be implemented through, for example, adding an additional “free choice” category in research proposals, as mentioned before.)
  • PhD candidates should not be required to publish four papers in order to defend their thesis. As Stan Gielen stated, the PhD is a training programme and you can be a good scientist with only one paper. Jeroen Geurts added that we should stop asking about publications. These requirements should be dropped by the universities as well as by the funding agencies.
  • Supervisors should be chosen for their leadership, social and management skills in order to be able to support future scientists, and there should be consequences for supervisors who are not up to these tasks.
  • The Spinoza prize should not be a prize for individuals but, like the Stevin prize, should be awarded to teams, as previously suggested by Stan Gielen during the Plan S consultation meeting in January 2019.
  • All author lists on publications should become alphabetical, so that people are forced to look up the contributions that each author has made to the publication (see the CRediT taxonomy).
Sarah de Rijcke tweets about academic leadership.

Next steps

The session organised by ZonMw and NWO was an important first step towards rewarding and recognising the diversity among scientists at Dutch universities. It was the first nation-wide consultation meeting to discuss the importance of introducing changes to the academic rewards system. The fact that the meeting was attended by so many interested colleagues (more than 400 participants from diverse stakeholder groups, so that the meeting had to be held in a building previously used as an aircraft hangar) demonstrates its timeliness and relevance.

Very importantly, ZonMw and NWO already signed the DORA declaration on the 18th of April 2019 and proposed changes and concrete actions aimed at reducing the dependence on bibliometric indicators in the evaluation of research and researchers. Next on the agenda is a position paper on research evaluation criteria by the VSNU, NFU, KNAW, ZonMw and NWO, which will be made available in September. However, driving an important systemic change that involves multiple stakeholder groups is not easy, and it is crucial that the process is done carefully, with thorough stakeholder consultations. Therefore, the publication of the position paper will be followed by pilots and smaller consultation meetings to discuss the modifications (or “mutations”, as described by Ionica Smeets) that are required to change the evaluation system. The Netherlands seems to be leading this change.

Tweet by Marta Teperek about the Netherlands as a leader in changing the current research evaluation system.

In the meantime, each one of us can act within our own spheres of influence in order to contribute to the change in research evaluation criteria – wherever and whenever possible.

You can also start by signing DORA yourself, as tweeted by Egon Willighagen.
