As the process of reforming research assessment gains momentum, representatives of Widening countries share concerns that their perspectives will not be properly accommodated
Addressing uneven research performance in the EU is among the key objectives of moves to reform research assessment, which have the overall aim of replacing the focus on metrics such as citations and number of papers published with criteria that better reflect research impact.
Now, researchers in Widening countries are asking for a bigger say in how the reform is shaped.
Martin Bareš, Rector of Masaryk University, says that while the reform clearly reflects the expectations of the global scientific community, Widening countries appear not to be getting a proper say in the research evaluation debates, given the composition of the Coalition for Advancing Research Assessment (CoARA), the body steering the process.
The ‘Agreement on Reforming Research Assessment’, published in July 2022, lays out a road map backed by a “coalition of the willing” of stakeholders committed to implementing the reform and reviewing progress by the end of 2027.
Bareš said in the Czech Republic there is a certain level of mistrust and even fear surrounding the potential outcomes of the reform. Although this fear is to some extent irrational, it is crucial to allay it by considering the distinct social and historical contexts of Widening countries, he said.
In countries including the Czech Republic, Poland, and Slovakia, researchers in the social sciences had few opportunities to get involved in international collaborations due to communist central planning. Consequently, in these disciplines, the ability to communicate research to international peers is still relatively weak. That highlights the importance of not drafting the assessment reform from the perspective of western and Nordic countries, where there has historically been a tradition of using English and engaging in international cooperation, Bareš said.
Other factors could hinder the reform process, including resistance from scientists who may view current indicators as more predictable and therefore more acceptable. In addition, there may be a lack of engagement at various levels, with some institutes still requiring the use of traditional impact factors.
According to Bareš, there is a tendency for evaluation processes to focus on immediate outcomes, signaling to the academic community that meeting these key indicators is the ultimate goal. Researchers will need to learn how to communicate the impact of their work in a narrative manner and to connect their research to societal needs. “I perceive the ability to formulate impacts as very weak in our academia,” he said.
Another barrier is lower trust in peer assessment in Widening countries, where there is a tendency to view metrics as more objective. “In the Czech Republic, there is now more of a myth that expert opinion cannot be trusted, while metrics are perceived as an impartial and objective way of evaluation,” Bareš says.
Within this context, cultivating a culture of trust, diversity and openness at both interpersonal and inter-institutional levels can be challenging. As Bareš sees it, there is room for improvement in research culture in the Widening countries. He underlined the importance of assessment reform in setting international standards for the minimum expected level. However, despite the cultural and mindset differences, he thinks the gap between western and eastern member states is narrowing.
Traditional metrics are not being ditched altogether, and one of the major issues is striking a balance between quantitative and qualitative indicators while ensuring assessments are objective.
Maciej Żylicz, president of the executive board at the Foundation for Polish Science, wants to see a shift away from metrics-based assessment and towards evaluating the quality of science based on expert opinions. Under the current system scientists focus on scoring points, “but science is not collecting stamps,” he said.
Żylicz pointed out that other countries, including the US, do not rely on raw metrics. In Poland, the National Science Centre in Krakow awards grants for research projects through a peer review system.
That may be possible for a standalone institute, but the complex structure of the Polish higher education system, in which there are about 130 public universities and numerous private universities, makes it hard to implement across the board.
The Foundation for Polish Science proposes consolidating universities in individual regions of Poland and creating centres of scientific excellence, consisting of small units of three to five research teams established from scratch within existing, reformed universities. These centres of excellence would be judged on the quality of the science they produce.
Żylicz also drew attention to the problem of brain drain in Widening countries, with a significant portion of EU funds for science allocated to international cooperation. Centres of scientific excellence could be an incentive for Polish scientists who moved to western countries to return to their home country.
In order to demonstrate societal benefit, scientists have to convince the public their work is valuable, so the public can then influence politicians. Again, Widening countries are at a disadvantage because there is less of a tradition of popular science and communicating science to the public. Żylicz pointed to the UK as a positive example, where citizens understand more about the importance of science in their everyday lives and politicians listen more to what society says.
Artur Silva, vice-rector at the University of Aveiro, Portugal, does not think current indicators such as impact factor and citations will be completely replaced with new ones, stressing that performing evaluations only on the basis of qualitative criteria is not realistic.
Evaluation criteria can vary depending on the system in each country. For instance, in the Netherlands, researchers are assessed against prespecified objectives. If a scientist successfully meets these objectives, they are eligible for promotion. Such an approach is not feasible in Portugal because financial constraints mean there is a limited number of available positions.
Seeing the winds of change blowing through research assessment, the University of Aveiro started an internal reform in 2018 to make its system more flexible and less reliant on quantitative indicators. The new system offers researchers a choice of three different methods. The first relies solely on quantitative parameters, the second combines quantitative (80%) and qualitative (20%) elements, while the third involves evaluation by a jury, which is purely qualitative. Silva said the introduction of qualitative criteria allows those who have done excellent work in a specific area to get the recognition they deserve.
In January 2024, the university will implement the system for the first time, covering the period from 2021 to 2023.
Taken overall, the message from Widening countries is that, amid the ongoing reform, they must be given a significant voice to ensure the development of inclusive criteria that foster international cooperation.