Just how broken is peer review?

Came across this combination today:

I used to be the editor of the BMJ, and we conducted our own research into peer review. In one study we inserted eight errors into a 600-word paper and sent it to 300 reviewers. None of them spotted more than five errors, and a fifth didn’t detect any. The median number spotted was two. – Richard Smith

OK, sounds like peer review is pretty incompetent (and that these guys have a much easier time getting reviewers than GG does – this one fake paper was easily reviewed by 300?). But then down in the comments is this:

Richard Smith (THE 28/5/15, page 29) says that when he was editor of the BMJ, he deliberately inserted 8 errors into a paper and sent it to 300 reviewers. None of them spotted more than five errors. I was one of those reviewers, unaware at the time that this particular paper was part of an experiment. I sent a covering letter with my review that went something like this: “This paper is not only unpublishable in its present form; it contains a number of fatal flaws that suggest it will never be publishable. I list below three of those fatal flaws. If you would like me to take a closer look at the paper to provide a more comprehensive set of feedback for the authors, please let me know.” The BMJ’s staff – then as now – viewed peer review as a technical task (‘spotting errors’) rather than a scholarly one (interpretation and judgement). – Trisha Greenhalgh

This is then followed by discussion of the validity of each claim and its significance (and a link back to the original study), along with the note that the article makes a number of other points (for instance, that reviewers who wanted to accept the paper didn’t notice any more errors than those who wanted it rejected). You get the feeling that advocates on both sides are determined to plow forward.

GG kind of wonders what the world of scientific research would look like if it were just the wild west of thousands of preprints littering the internet. How would you find anything? Search engines, even within disciplines, are stunningly awkward to manipulate to really find everything; terminology often morphs over time, or the significance of terms is context sensitive. OK, now imagine you do have a common archive (for instance, arXiv); what is worth your time? Maybe the papers from your friends? Are you looking for social-media-style approvals?

When Smith calls for a return to the roots of science, where “scientists gathered together, presented their studies and critiqued them,” he harkens back to a time of a tiny research community where all were known quantities; is it realistic to expect the same behavior with the thousands of scientists in nearly any field today?

Yes, peer review is imperfect, and it is arguably most broken in the tabloid journals, where the emphasis leans too far toward “is this sexy?” and away from “is this solid?”. But it guarantees that some eyes looked at the work and helped to improve it. Some work sits on the sidelines for years before its significance is recognized: if there is no syn-publication review, it is far less likely that any remediation is possible, and good work poorly presented will not gain the audience it deserves. Again, if you really think that the sole purpose of peer review is as gatekeeper, you miss out on its other potential benefits.

