The New York Times misses the real academic fraud: How academic research is biased towards finding statistically significant results that aren't really there

Suppose you run an experiment and don't get the "desired results," so you redo it until you do. Do you then report only the successful run? Apparently many researchers practice exactly this. The problem is that if you run an experiment many times but publish only the one run that works, you have to adjust the statistical significance for the number of attempts. Suppose you ran the experiment 10 times and one run came out significant at the 10 percent level. That is roughly what you would expect by chance alone, so you really don't have a statistically significant result. The New York Times, in its discussion of Stapel's fraudulent research, doesn't seem to understand this problem. Given that the other psychologists interviewed apparently view this as standard practice, and assuming that is indeed the correct implication of what they are saying, it means to me that research in psychology is usually fraudulent.
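A small simulation makes the point concrete. Assume, as in the example above, a nominal 10 percent significance level and a researcher who redoes a null experiment up to 10 times and reports only the best run. (This is an illustrative sketch, not anyone's actual study; under the null, each run's p-value is uniform on [0, 1].)

```python
import random

random.seed(0)

ALPHA = 0.10      # nominal significance level (10 percent)
REPEATS = 10      # number of times the experiment is redone
TRIALS = 100_000  # simulated researchers, each studying a null effect

# A researcher "finds" significance whenever the smallest p-value
# across REPEATS runs falls below ALPHA.
false_positives = 0
for _ in range(TRIALS):
    p_values = [random.random() for _ in range(REPEATS)]
    if min(p_values) < ALPHA:
        false_positives += 1

rate = false_positives / TRIALS
print(f"Nominal level: {ALPHA:.0%}")
print(f"False-positive rate after {REPEATS} tries: {rate:.1%}")
# Theory: 1 - (1 - 0.10)**10, about 65 percent, not 10 percent.
```

So a "10 percent" result reported after ten private attempts is nothing of the sort: the chance of at least one spuriously significant run is about 65 percent.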

Obviously there is a bias toward publishing research with statistically significant results in journals, and this bias creates the wrong incentives for academic authors.  It raises the question of what, if anything, one can learn from most academic research.  (BTW, this is one reason that I often try to publish results under different combinations of control variables in my regressions.)  From the New York Times:
In one experiment conducted with undergraduates recruited from his class, Stapel asked subjects to rate their individual attractiveness after they were flashed an image of either an attractive female face or a very unattractive one. The hypothesis was that subjects exposed to the attractive image would — through an automatic comparison — rate themselves as less attractive than subjects exposed to the other image. 
The experiment — and others like it — didn’t give Stapel the desired results, he said. He had the choice of abandoning the work or redoing the experiment. . . .


