First results of a new approach to research assessment, introduced as part of reforms to bring Poland's higher education system into line with EU standards, have led to multiple appeals. Earlier this month an expert group met education minister Przemysław Czarnek to consider improvements
In 2018, Poland adopted the Constitution for Science, a major reform of the higher education system aiming to bring it closer to EU standards. As part of the reform, new criteria for research assessment were introduced.
The first assessment under the new criteria, covering a four-year cycle, was published in August 2022. The standing of each discipline at each institution was evaluated against three major criteria: the quality of research output; success in securing external research funding; and impact on society and the economy.
The results are classified from A+ to C. While the evaluation is carried out by the Science Evaluation Committee, the Ministry of Education makes the final decision on which classification a particular discipline falls into.
That matters, because the classification affects the degree of freedom institutions have in setting up their own study programmes, the right to award doctorates, qualification as an academic institution or a vocational institution, and the amount of funding. All of this feeds into the status of an institution and how it is perceived by the general public.
When the Ministry of Education and Science published initial assessment results under the new system in August last year, almost half of those evaluated were not satisfied with the outcome.
Within individual disciplines, the Ministry awarded 65 A+ ratings; 323 A; 582 B+; 139 B; and 36 C.
Of a total of 1,145 disciplines assessed, about 500 requests were filed for reconsideration of the assigned category, according to Marcin Pałys, a former rector of the University of Warsaw, who is a member of the academic expert group that met education minister Przemysław Czarnek earlier this month to draw up proposals for adjustments to the system.
Pałys said the initial assessment outcome was rather unexpected. “The most striking part was that among the highest categories, there was a substantial group of institutions that have very few people who work in a given field. For example, the discipline of law in technical universities was ranked very highly,” he told Science|Business.
But with assignment to a category open to being contested, “for any reason deemed significant by the receiver,” with no risk that they could be marked down, and Czarnek encouraging all those who doubted the outcomes to ask for reconsideration, there was little reason not to ask for a reassessment.
In mid-September, Czarnek told the newspaper Dziennik Gazeta Prawna that "the overwhelming majority" of these requests were very well-founded, and that he expected them to be resolved positively.
In a statement after the January meeting, the Ministry underscored that the evaluation system should be transparent, understandable to all participants and reflect the nature of the institutions under evaluation.
Pałys said there is room for improvement, but stressed that any changes will focus on evolution of the system, rather than completely replacing it. Speaking about possible changes, he noted the current assessment algorithm relies on sophisticated calculations, which should be simplified so that participants can better predict outcomes.
Another important issue is adjusting the research output quality criterion to the needs of different disciplines. In the humanities, researchers tend to publish books and monographs, while in technology, scientific papers are more common. While the four-year evaluation cycle is sufficiently long for papers, it may not be enough for books, Pałys said. He also called for greater balance between bibliometric data, such as publication in high impact journals and the number of times a paper is cited, and peer review and the societal impact of research.
“Many discussions in Poland have a strong focus on the bibliometric part, but it is just one criterion, and the other two are important too. So, we need to have a balance,” said Pałys.
He also suggested looking again at how the number of people involved in a particular body of research impacts the assessment outcome. “At the moment, it appears that if the volume of output is artificially assigned to a smaller group of people than were actually involved, the score is higher,” he said. This issue is likely to be a reason for the high rankings of non-core disciplines in higher education institutions, said Pałys.
The meeting also focused on limiting the impact of random fluctuations in the assignment of the final category. "If the institution receives A category in one evaluation period, and C category in the next period, or vice versa, this would be inconsistent," Pałys said.
Finally, the group considered how publications in predatory journals, which publish any submitted paper without peer review as long as the author pays article processing charges, affect the outcome of the evaluation.
In November 2022, Włodzimierz Bernacki, deputy minister of Education and Science, told the Polish Press Agency that the amended evaluation criteria would be made public by mid-2023. He underscored that the goal was to improve the evaluation procedure without changing its basic structure.
However, Bernacki also stressed the need to take into account how universities contribute to the ongoing education of scientific staff. At the moment there is no such criterion in place.
Bernacki also suggested adjusting the criteria assessing societal impact by assigning scores for publications in popular science magazines, in addition to scores for publishing in scientific journals.
The Ministry of Education will publish final outcomes of the first research assessment under the new scheme after resolution of the appeals. The 2022-2025 assessment will take place in 2026.
Despite the many complaints about how the upgrade to the research assessment has played out so far, there is general agreement that such assessments are necessary.
Pałys also noted that Polish academics are interested in contributing to the European Commission's research assessment initiative, and in participating in discussions on how to improve methods for assessing research.
In December, when the Coalition for Advancing Research Assessment (CoARA) was set up to take the initiative forward, institutions from Poland were the second largest group, after Italian institutions. Rectors of German universities, in contrast, decided not to join CoARA.