To get EU science grants, applicants should sit research integrity classes

16 Jul 2015 | Viewpoint
Research Commissioner Carlos Moedas wants an initiative to tackle science misconduct to be in place by the end of the year. It is an issue Europe is divided on, says Simon Godecharle, researcher and ethicist at KU Leuven

Headlines may lead people to believe the problem of science misconduct is getting worse, and the evidence appears to confirm this.

Researchers say the percentage of articles retracted from scientific journals because of fraud has increased ten-fold since 1975.

Last month, EU Research Commissioner Carlos Moedas said he is going to do something about it. A new ‘European Research Integrity Initiative’, setting clear standards and mechanisms to tackle the problem, will be established by the end of this year.

The precise details are not known, but there are a few sensible things it could include, says Simon Godecharle, a researcher who studies research integrity and misconduct within biomedical research at KU Leuven.

These include a requirement for applicants to sit research integrity classes before receiving grant money. “I think they should make it obligatory,” Godecharle says.

In the US, researchers applying for National Institutes of Health grants have to undergo research integrity training covering conflicts of interest, responsible authorship, policies for handling misconduct, data management, data sharing, and policies on the use of human and animal subjects.

This is more effective than just drawing up new guidelines, Godecharle believes. “I don’t think a new EU guideline alone will be the answer. Researchers don’t have time to read all the guidelines that are out there [already].”

Another thing Godecharle would like to see included in the initiative is a statement on data management. EU member states have different guidelines for how long researchers should keep their raw data, which may become important down the line for anyone who wants to verify research findings. The UK Research Integrity Office says researchers should hold on to data for at least three years; the Danish Committees on Scientific Dishonesty says five years; and the Austrian Agency for Research Integrity, ten years.

“Because of its influence, the European Commission can set the tone,” said Godecharle. “This might enable other institutions, research organisations and funders to follow.”

Ignoring evidence

Science misconduct comes in many forms and can involve inventing data, intentionally misrepresenting results, copying texts or ideas without referring to the original source, or the pursuit of a compelling story that ignores contradictory evidence.  

In biomedical research, misconduct can result in needless animal deaths, reputational damage to institutions and other researchers, and the loss of patient trust.

Pinpointing scientific misconduct as a problem is one thing; agreeing on the best way to deal with it among 28 member states is another.

In one of its Horizon 2020 competitions, the Commission set researchers the challenge “to assess the possibility to unify the codes, principles and methods [that exist] at [an] EU and international level” for tackling scientific misconduct. Two researchers from Radboud University, Hub Zwart and Willem Halffman, are managing the €2 million project, which will inform the Commission’s initiative.

They have their work cut out for them. Godecharle’s research shows that, apart from Denmark and Norway, no two countries in Europe share a definition of scientific misconduct.

“Some documents were published by ministries, others by national organisations; some are laws, some are ‘only’ guidelines”, according to one of Godecharle’s papers, published in The Lancet.

Denmark, for example, has a very clear definition of misconduct in law. Belgian scientists and lawmakers, on the other hand, chose to create a moral code based on values, rather than a legal document.

In many countries, the intention to deceive is an important element of misconduct, said Godecharle.

However, Sweden’s definition of misconduct is much narrower, and consequences, not motives, are what matter.

“If you are continuously careless in Finland, you might be fine; do it in Sweden and it could be very serious,” said Godecharle.

In the US and the UK, scientists found to have misrepresented data have even ended up in prison.

Before the EU’s upcoming initiative, the most significant effort to harmonise Europe’s thinking on science misconduct was the (now disbanded) European Science Foundation’s ‘European Code of Conduct for Research Integrity’, introduced in 2011.

Its text states, “it is not a body of law, but rather a canon for self-regulation” and “it is not intended to replace existing national or academic guidelines, but to represent a Europe-wide agreement on a set of principles and priorities for the research community”.

Its impact on the science world is doubtful. Godecharle points out that Hungary has since introduced its own research integrity guidelines, but arrived at a different definition of misconduct. 

Self-regulation

“Can science be self-cleaning? Many scientists were inclined to say yes up to 2011,” said Godecharle.

That was before Tilburg University’s Diederik Stapel, a social psychology researcher, was found to have falsified data on a huge scale.

He was an academic star who published intriguing behavioural studies purporting to show that eating meat can make people more aggressive, and being in a rubbish-filled environment can bring out racist tendencies. “Stapel had papers in the best journals,” noted Godecharle.

The network of watchdogs around him, including journal editors and reviewers, did not bark. The issue eventually came to light when a whistle-blower pointed the finger, raising the question of how so many people were duped. 

Detecting fraud is not always straightforward, Godecharle pointed out. In theory, a journal’s peer reviewers are supposed to detect errors. “But it’s hard for them to tell if a researcher really used 1,000 mice in his experiment instead of 100, for instance.”

One bad scientist can knock a whole research field off course, Godecharle said. Science is a cumulative enterprise, meaning researchers build their work on all that has gone before. Doctoral dissertations Stapel oversaw at Tilburg used his fabricated data, for example. And after finding evidence of fraud in 55 of Stapel’s papers, an investigation concluded that, “from the bottom to the top there was a general neglect of fundamental scientific standards and methodological requirements.”
