Back in 2015, a landmark study reported that more than half of 100 peer-reviewed psychology studies could not be replicated. The quake that shook the field was answered this Thursday by a blistering rebuttal.
Scientists at the University of Virginia and Harvard University said the 2015 study’s conclusions are unwarranted because the methods used to reproduce the original studies were flawed on multiple counts.
According to Daniel Gilbert, a Harvard psychologist and the critique’s lead author, the replication project introduced statistical error into its data by applying poorly designed methods.
The takeaway here: the failure rate of psychology studies had been grossly overestimated.
After the 2015 meta-analysis conducted by the nonprofit Center for Open Science, the public’s trust in psychological studies dropped dramatically.
As Gilbert explained, the study’s conclusions not only damaged public opinion of the field but also prompted many funding agencies and scientific journals to change their policies.
His team at Harvard noted that the first problem with the center’s study was how it selected the studies to replicate.
According to the commentary released by the university, the center “created an idiosyncratic, arbitrary list of sampling rules that excluded the majority of psychology subfields from the sample.”
The replications also often departed from the original experiments rather than repeating them with fidelity. For example, one of the original studies examined how white and black students at Stanford University responded to the subject of affirmative action. Instead of keeping in line with the original study and heading back to Stanford, the center’s researchers first reran the experiment with students at the University of Amsterdam. The mismatch was eventually remedied, and the experiment was repeated at Stanford.
Gilbert’s team found that the Center for Open Science could eventually reproduce the results, a fact it never acknowledged when it released the controversial 2015 study.
Once they fixed the mistakes, the reproducibility rate was “about what we should expect if every single one of the original findings had been true,” according to co-author Gary King, head of Harvard’s Institute for Quantitative Social Science.
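King’s point rests on a basic statistical fact: even when every original finding reflects a true effect, replications only succeed at the rate set by their statistical power. The sketch below illustrates this with a simulation; the effect size, sample size, and significance threshold are illustrative assumptions, not figures from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions (not from the 2015 study or the critique):
d = 0.5          # true standardized effect size in every original study
n = 50           # participants per group in each replication attempt
trials = 10_000  # number of simulated replication attempts

# Two-sample z-approximation: with a true effect d and n per group, the
# test statistic is roughly Normal(d * sqrt(n/2), 1). A replication
# "succeeds" when it reaches two-sided significance at p < .05 (|z| > 1.96).
z = rng.normal(d * np.sqrt(n / 2), 1.0, size=trials)
replicated = float(np.mean(np.abs(z) > 1.96))

print(f"share of true findings that replicate: {replicated:.2f}")
```

With these assumed numbers, only about 70% of perfectly true findings replicate, so a replication rate well below 100% is exactly "what we should expect if every single one of the original findings had been true."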
So instead of misleading the public with the headline ‘Yet another psychology study that doesn’t replicate,’ the meta-review should have read ‘Yet another psychology study replicates just fine if you do it right and not if you do it wrong,’ King said.
The authors of the 2015 study responded to the critique by saying that the Harvard scientists blamed the replications themselves, whereas their own explanation was that the data are sometimes simply inconclusive.