The largest study to date on scientific integrity has been conducted in the Netherlands. The self-report survey measuring researchers' integrity found that more than half of participating researchers regularly use questionable research practices: it is common, for example, to conceal flaws in the research design or to cite the literature selectively.
According to the survey, one in twelve researchers has, in the past three years, gone beyond breaches of research etiquette and fabricated or falsified research results, which already qualifies as research misconduct. This 8 percent rate is more than double the figure from earlier comparable work: a meta-analysis by Chinese researchers examining the frequency of research misconduct found a self-reported rate of around 2.9 percent (though that figure included plagiarism alongside fabricated or falsified findings). Meanwhile, when asked whether they had seen others commit such a serious violation, 15.5 percent of researchers in the studies covered by the meta-analysis answered yes.
John Ioannidis, the well-known professor of medical statistics at Stanford University, argued as early as the mid-2000s that most published research findings are false. Today it seems that not only minor deceptions but also publishing scandals reaching a wider audience follow one another in quick succession. Just a few from recent years:
- At the end of June, members of the editorial board of the journal Vaccines resigned over a scandalous vaccine study that the peer reviewers should have rejected.
- Even a Nobel laureate in chemistry can make mistakes serious enough that her studies later have to be retracted.
- Jack W. Szostak, winner of the 2009 Nobel Prize in Medicine, was also affected: as it turned out, he and his colleagues had not, after all, unlocked one of the secrets of the emergence of life.
- As the Neue Zürcher Zeitung revealed in 2019, more than 3,300 scientific publications had to be retracted over the preceding 10 years; a quarter of the affected co-authors were Chinese and a seventh were American.
- Psychologist Gary Lewis deliberately submitted a fake study on politicians' toilet habits to prove that at some journals nobody cares what gets published, since there is in fact no peer review at all.
It says a great deal that although more than 60,000 researchers from 22 Dutch universities, across all disciplines, were invited to take part in the 2020 Dutch survey, many institutions refused to participate, fearing the results would not reflect well on them publicly. In the end only about 6,800 completed questionnaires were received, yet it remains the most comprehensive survey ever conducted on the topic.
The Dutch national study took care to ensure that participants' identities could not be revealed, and this method helped elicit more honest answers. Because of this, the survey's results are much closer to the true situation than those of similar research so far, research leader Gowri Gopalakrishna, an epidemiologist at the Amsterdam University Medical Center, told Science.
The findings were published as two preprints on July 6. The first examines the frequency of research violations, from questionable practices — including, for example, careless refereeing, inadequate supervision of junior researchers, or selective citation of the literature — to more serious violations that amount to fraud. The second study examines responsible research behavior: for example, how often someone corrects their erroneous publications, shares the data underlying their research, or registers their hypotheses and methods in advance of the experiment, thereby avoiding the risk of methodological bias.
According to the survey, doctoral students are the least likely to adhere to research standards (or are simply the most honest): 53 percent admitted to having frequently resorted to at least one of the 11 questionable research practices listed in the survey over the past three years. Admittedly, the rate was not much better among participating professors and other university faculty either: 49 percent.
Tip of the iceberg
The survey also sought to uncover what drives researcher misconduct, so the team also asked about participants' experiences in the profession, such as how much pressure they felt from peers at their workplace or in their field. The Dutch researchers found that pressure to publish showed the strongest association with questionable research practices, while the likelihood of being caught out by peer reviewers was the strongest deterrent to fabrication and falsification.
Although the 8 percent rate for the more serious violations mentioned above is much higher than the values measured in previous research, it may still be an underestimate, said Daniele Fanelli, a research-ethics expert at the London School of Economics who was not involved in the study. The questionnaire asked about fabricated and falsified findings in plain terms, so respondents could not brush these off as mere minor infractions; for this reason, Fanelli believes, fewer people answered honestly. Even so, he considers the Dutch survey one of the best on the subject.
Elisabeth Bik, the microbiologist who fearlessly guards the integrity of science — and who recently drew the wrath of a star French epidemiologist after scrutinizing his research — is not surprised by the high rate of misconduct found in the Dutch survey. The researcher, who specializes in detecting manipulated images in publications, finds doctored images in an average of 4 percent of the studies she examines, but believes most manipulations go undetected, so "what we're seeing is the tip of the iceberg. The true rate is likely between 5 and 10 percent."
Bik cautions, however, that even outright fraudulent acts should not always be judged in black-and-white terms, let alone questionable practices. Removing ill-fitting data points from a result, for example, counts as falsification, yet researchers often have good reason to do so. Negative results are hard to publish, the expert says, because journals are not interested in them. "It's good that these rules exist and it's good to think about them from time to time, but not always following them doesn't make someone a bad researcher."
Quality instead of quantity
According to Fanelli, the higher rate of misconduct does not mean that Dutch researchers behave less ethically than their colleagues elsewhere. Since Dutch social psychologist Diederik Stapel fell from grace in 2011 for manipulating data and fabricating research results, the Netherlands has become a leader in scientific integrity, and the Dutch code of conduct for research integrity was adopted in 2018. For this reason, Fanelli believes, Dutch researchers are well aware of what research integrity entails.
Awareness alone, however, is not enough to push back undesirable practices, said the head of the Dutch survey, who sees researchers currently being judged not by quality but by quantity. As biologist Anna Fedor previously explained on Qubit, the mechanisms of science funding and publishing push researchers toward fraud and deception. "There are theories that experiments fail to support, and there are cases where the conclusion that can be drawn from the experiments is unclear — yet researchers have often worked on those experiments for years. And all that work may earn them no reward at all, since no publisher will publish their articles, their colleagues will not read them, and they will not increase their cumulative impact factor."
As Gábor Domokos, the architect-mathematician and inventor of the Gömböc, told Qubit earlier, he sees the compulsion to publish as harmful not only at the individual but at the institutional level: in his view it is bad even for large, prestigious universities. "It used to be enough to produce a Nobel laureate every fifty years; now you have to be the best every two months. But nobody can perform at their best every two months, because you also need time to think."
In 2019, hundreds of researchers backed an initiative to end the statistical practice that fuels the compulsive hunt for significance: the proposal was to retire the significance threshold based on the p-value, which can distort results and is very easy to game.
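Why the p < 0.05 threshold is so easy to game can be illustrated with a short simulation (a hypothetical sketch for illustration only, not part of the survey or the 2019 initiative; the experiment counts and function names below are my own): when many independent true-null hypotheses are each tested against the threshold, roughly one in twenty comes out "significant" by chance alone, so a researcher who tests enough variables will almost always find something publishable.

```python
import math
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def z_test_p(heads: int, n: int) -> float:
    """Two-sided p-value for 'the coin is fair', normal approximation."""
    z = (heads - n / 2) / math.sqrt(n / 4)
    return math.erfc(abs(z) / math.sqrt(2))

N_EXPERIMENTS = 200  # hypothetical: 200 independent null hypotheses tested
N_FLIPS = 1000       # sample size per "experiment"

# Every coin here is fair, so every 'significant' result is a false positive.
false_positives = sum(
    1
    for _ in range(N_EXPERIMENTS)
    if z_test_p(sum(random.random() < 0.5 for _ in range(N_FLIPS)), N_FLIPS) < 0.05
)

print(f"{false_positives} of {N_EXPERIMENTS} true-null tests came out 'significant'")
# With a 0.05 threshold, about 5 percent spurious hits are expected by chance alone.
```

The point of the sketch is not the exact count but the mechanism: the threshold guarantees a steady supply of spurious "findings", which is exactly what pre-registration and reporting exact p-values are meant to counteract.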
In a 2020 study, 70 research teams were asked to analyze the same fMRI dataset and assess how well the data supported nine hypotheses. Because of differing analysis methods and differing judgment calls, the 70 teams' results diverged considerably, as Tom Schonberg, a neuroscientist at Tel Aviv University and one of the leaders of that research, pointed out. The lesson is that applying several analysis approaches at once can help identify the results that really hold up. Pre-registering hypotheses serves the same purpose, since they then cannot be adjusted after the fact to fit the result.
Daniël Lakens, an experimental psychologist at Eindhoven University of Technology who was not involved in the Dutch survey, believes that research integrity could indeed be improved by introducing certain good practices, such as pre-registering studies or publishing raw data alongside manuscripts. In his view, the prevailing norms largely determine which research practices take hold.