Viewpoint: the world must wake up to threat of mistrust in science

24 Jan 2024 | Viewpoint

Polish professor Dariusz Jemielniak is leading a project on vaccine scepticism. He says long-term strategies and more resources are needed to counter the disinformation that is fuelling mistrust in science

Public scepticism about science - and medical science in particular - is on the rise and is being fuelled by disinformation, according to Dariusz Jemielniak, a specialist in networked and digital societies.

This issue is not being addressed seriously enough by public bodies around the world. “I don’t really know of a single country that is putting significant resources behind increasing trust in science,” says Jemielniak, professor of management at Kozminski University, and vice president of the Polish Academy of Sciences.

“In most countries, public health policy is short term, with politicians focusing on the next year or two, at best until the next elections. There needs to be a longer-term strategy. We’re reactive and I would say we’re not doing enough.”

Jemielniak is wrapping up a three-year project called Medfake, funded by Poland’s National Centre for Research and Development (NCBR) and set up to examine vaccine scepticism. The work started during the rush to develop and administer COVID-19 vaccines, but the project was approved in 2019, before there was a hint of the pandemic.

That coincidence provided Jemielniak and his team with a real-world context in which to study how people react to global vaccination programmes, and to medical advice in general. The conclusions are worrying.

“I was assuming that, since the [COVID-19] vaccines were such a huge scientific success, this would reinforce trust in science and medicine in general,” he said. “And we can now say very confidently, it's pretty much the opposite.”

Jemielniak ascribes this to a general expectation that science would deliver solutions. However, after months of waiting for vaccines to be approved, the public found that, once available, they did not immediately end the pandemic. Although medical experts regarded the COVID-19 vaccination roll-out as extremely quick and effective, that was not necessarily the perception of the general public.

Fuzziness and confusion

Partly, this is due to the nature of COVID-19. The minor negatives of the vaccines, such as causing sore arms or slight fevers, were more visible than the huge benefit of creating wider resistance to the virus, Jemielniak said. “There was a lot of fuzziness and confusion about what the vaccines were supposed to do,” he said.

This was not helped by mixed messages from public authorities and was heightened by malicious actors, often from more authoritarian countries, spreading misinformation and disinformation online.

The biggest source of disinformation, though, was the general public, with social media a key tool for spreading their views. Jemielniak said this comes back to a general lack of trust in science amongst the public: people often search online for answers to medical problems more readily than they would see a health expert.

“There is good research showing that anti-intellectual movements, distrust in science and anti-establishment thinking in general are on the rise, and that is coinciding with the growth of social media and meddling by authoritarian regimes,” he said.

Studies into trust in science around the world vary in their findings. A recent survey conducted by the Pew Research Center in the US shows that while 73% of adults have a great deal or fair amount of confidence in scientists to act in the public’s best interests, this is 14 percentage points lower than it was at the beginning of the COVID-19 pandemic. The proportion of US adults who expressed the highest level of trust in scientists also fell from 39% in 2020 to 23% in 2023.

In the UK, a study by the Genetics Society found that the pandemic led to increased trust in science in over one third of the 2,000 people surveyed. However, it also found that those most sceptical of science tended to be the most confident in their own knowledge of it.

Fixing broken trust in science

For Jemielniak, countering scientific disinformation and increasing trust starts in school. “We need to educate people in critical thinking and statistical thinking,” he said. For example, when asked what the difference is between 30% of 12 and 12% of 30, most people do not realise that the two are the same. Misunderstanding statistics was one reason it was hard to convince people of the effectiveness of COVID-19 vaccines, Jemielniak said.
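Spelled out, the two calculations do give the same result: 30% of 12 is 0.30 × 12 = 3.6, and 12% of 30 is 0.12 × 30 = 3.6.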

Another measure would be holding large, private social media companies more accountable for stopping the spread of disinformation.

In 2022, over 40 companies and associations, including social media platform owners such as Meta, TikTok and Twitch, signed up to a strengthened code of practice to fight disinformation, based on guidance from the European Commission. X Corp., owner of X (formerly Twitter), is a notable non-signatory.

This code of practice adds to a range of tools and initiatives brought forward by the EU to fight disinformation. Jemielniak says these are heading in the right direction but he is worried that the pace of regulation is too slow. He wants more stringent legislation to stop disinformation.

“If you look at, for example, the propagation of paedophilic materials, we see that society has the general agreement that this should be banned, and it is pretty much banned and social networks are able to take it down,” he said. “I think the same should go for at least the most extreme cases of medical disinformation, when it's causing death, when it's resulting in avoidable deaths.”

Another solution is arming experts with the knowledge to counter disinformation. The Medfake team is developing a tool that will help detect medical disinformation and alert healthcare professionals so they will be able to keep abreast of the latest conspiracies and prepare information to counter them.

A fast-developing contributor to disinformation is the use of artificial intelligence to generate fake content. Before AI’s advent, the lack of tools to create material that looked genuine held back the spread of disinformation. With AI, that barrier is gone. “Currently, it’s an absolute disaster,” Jemielniak said.

But looking longer term, he is more optimistic. “Maybe once we have this barrage of disinformation resulting from AI […] people will start realising that it is not just about the content, but about having a trusted source.”
