Science is shifting to focus on collaboration and the societal impact of research. The reward system for scientists needs to keep pace with that change
Many scientists are transitioning to a new way of working, known as open science, which will require new ways of evaluating researchers’ work. At Utrecht University we are adapting the reward system so it will incentivise this shift.
The change that has received the most public attention, ditching the publishing metric known as the journal impact factor, is important, but it’s just one step in a much larger transformation.
Through open science, researchers and research administrators seek to improve the quality, reproducibility and societal impact of research. Open science includes open-access publishing, so that citizens and peers can access the fruits of publicly funded research without paying for the privilege, and a move to FAIR data, making information findable, accessible, interoperable and reusable. Open science also includes software sharing.
The guiding principle behind all this is 'open if possible, closed if necessary'.
Finally, open science includes citizen science and public engagement. In this context, public engagement means identifying problems together with societal stakeholders, involving members of the public in setting the research agenda, and testing results and applications in the real world. The public is involved in iterative co-creation of research, rather than coming in at the end as passive consumers of results. Research can address international challenges, but also national or regional problems.
Open science is an international movement built on local initiatives that have recently been brought together in one programme. The European Commission has declared open science as the way of working when it comes to research and innovation.
Because open science requires a transition to a fundamentally different way of doing research, the process and products of research need to be evaluated differently as well. This is a true culture change: adapting the rewards cycle affects the social structures of academia, including the distribution of credit, reputation and funding.
That is why change in rewards is an integral and critical part of the new strategic plan of Utrecht University, effected through our Open Science programme.
Those who have been trained in academia over the past thirty years know the drill. The current rewards system is skewed heavily towards research, disregarding the important work academics do in domains such as education and public engagement. Within research itself, recognition is dominated by classical ideas about quality, excellence and reputation, with a focus on papers mainly read (if at all) by academic peers rather than the public, and on measures of quality based on metrics, some of which are deeply flawed.
The “journal impact factor” is the most commonly used metric. It measures how often, on average, an article in a given journal is cited over a defined period of time. The journal impact factor is fundamentally flawed because high values typically depend on a small minority of very highly cited papers. Despite this, it became widely abused as a supposedly objective measure of the quality of individual papers, conferring unjustified credit on their authors.
Using the journal impact factor to assess individual researchers has thus been shown to be seriously misleading. Not only is it a poor and imprecise measure of an individual research project or researcher, it has also been shown to distort research funding at many levels. The journal impact factor reinforces the well-known hierarchies in academia, giving more weight to work in the natural sciences than to work in the social sciences or the humanities.
However, the debate over the journal impact factor is anything but academic. Research that serves great social or clinical needs may be disadvantaged in impact-factor metrics. For example, the journal impact factor may underestimate the importance of work in social and preventive medicine or nursing science compared with work in cancer genetics. The eagerness to outdo colleagues in research endeavours often distracts from the questions that are actually relevant to society.
The rebellion against the frequent and often uncritical use of journal impact factors has been building for some time. In 2012, the San Francisco Declaration on Research Assessment (DORA) crystallised the movement against the misuse of the journal impact factor. More than 17,000 individuals and 2,000 institutions from 148 countries, including Utrecht University, have since signed the declaration; the European Research Council is its most recent signatory.
But if these easy-to-use 'objective' numerical metrics aren't working, how should research be evaluated? Impact needs to be evaluated, and attributed, by the stakeholders it affects. Overall, research will need to show that it has induced change on a meaningful level and time scale. To capture this, researchers will need to provide narratives covering the whole process of doing research, from defining questions to research output to the effects of their research and education in the real world. These narratives might be underscored by relevant articles irrespective of where they appeared (including blogs, newspaper articles and policy reports), but also by accounts of relevant change, such as altered government policies, clinical impact, changes to the university curriculum or contributions to one's specific field of research.
These narratives will be discussed with peers in grant panels and audit committees to avoid bias. We hear complaints that this will take more time than looking at journal impact factors, and that cannot be refuted. But the time invested in a proper evaluation of quality will have a huge return. It will take into account that excellence and impact come in many forms, and will properly acknowledge the diversity of excellence across applied and basic science, the social sciences and humanities, technology and the classical 'hard' sciences.
The way research and the team are organised has been shown to be critical. Among the important factors are building multidisciplinary teams and connecting relevant peers and non-academic stakeholders. At Utrecht University we are implementing a model based on these and other principles. This is hard work, but it’s worth the effort.
We are proud that the Netherlands has taken a strong and active position on this subject. Universities, funders, academic medical centres and the Royal Academy have drafted a new recognition and rewards system, and recently adopted a national Strategy Evaluation Protocol, very much in the spirit of open science.
Early-career academics in our country are positive about these changes, but also concerned: will their achievements be recognised abroad? Since the European Commission started its Open Science Programme, the movement has been gaining considerable international traction. Many major international organisations and institutes, including UNESCO, the Wellcome Trust, other charities, government funders and national institutes, are engaged. They understand that open science, and a reward system to match, will increase the impact of research for the societal stakeholders they want to serve.
At Utrecht University we are confident this shift in focus will change academia for the better. Changing the evaluation of academic work is only one of the necessary steps on the road to open science. We believe it will fuel a more inclusive and collaborative academic culture, one that engages with society and values academics for it. Not only because it will create more impact, but because it’s the right thing to do.
Frank Miedema is professor of open science and vice rector for research at Utrecht University. Paul Boselie, professor in public administration & organisation science and chair of the Utrecht University programme for Rewards & Recognition, Sicco de Knecht, open science programme coordinator at Utrecht University, and Judith de Haan, open science programme manager at Utrecht University, also contributed to this commentary.