Chris Blattman


Annals of “All research is wrong”

A popular study from the 1970s that helps sell millions of dollars’ worth of fish oil supplements worldwide is deeply flawed, according to a new study being published in the Canadian Journal of Cardiology.

The original study, by Danish physicians H.O. Bang and D.J. Dyerberg, claimed Inuit in Greenland had low rates of heart disease because of their diet, which is rich in fish oil and omega-3 fatty acids from eating fish and blubber from whales and seals.

“I reviewed this original paper and it turned out to be that they actually never measured the frequency of heart disease in [Inuit],” said Dr. George Fodor, the new study’s lead researcher.

Story and study.

Published medical science is deeply flawed. More often than not, when I’ve looked up a study claiming X, the statistics are deeply problematic. I suspect poor training and poor refereeing are proximately to blame, but there must be some deeper absence of incentives. It’s a shameful state of affairs.

Political science and economics are (a little) better, but as I teach my students, the first thing you should say to yourself as you open every book or research paper is, “This is almost certainly wrong.” Depressing but important. Welcome to science.

56 Responses

  1. I agree that there are high profile examples of studies published in medical journals (ahem, LANCET) that were subsequently shown to be bunk. Assuming you don’t have any sort of economics superiority complex (I’ve seen Easterly’s comic strip), aren’t you drawing some hasty conclusions from a very small sample size? Or are you saying that there is something inherently better about the production of economics/political science knowledge compared to the production of medical knowledge? (In order to get a JPE paper you need to present it at 30 different brown bag lunches, post it on the Internet prior to publication, circulate it, etc — so by the time it actually gets published in JPE, the publication event is more or less an afterthought and all of the reputation-making has already occurred. Contrast that to getting a JAMA paper — you’re explicitly *barred* from presenting at more than 1 conference prior to publication, and you keep it under wraps until the publication event.)

  2. I’m curious about the broad claim that political science and econ are better. I imagine this may be true in terms of strict adherence to the appropriate use of statistical methods, but that’s only one particular way for a study to be “deeply flawed”. As a generalization, I think medical science has better, more reliable data than social science. (This is particularly true of the “development” sub-fields of social sciences. Those household surveys are crap.) Medical research also tends to work with more cogent, well-founded, testable theories than social science. Medical journals offer fewer examples of researchers selectively mining huge data sets in search of significant relationships.

    One final thought. I’m not sure if this is accurate, but it seems to me that there’s been a huge proliferation of 2nd and 3rd tier peer-reviewed publications in the social sciences, which produce a sheer volume of crap research that dwarfs anything in medicine. Maybe there are also a bunch of lower-tier medical journals that I don’t know about that are doing the same thing, but my impression is that the medical establishment has done a better “gatekeeper” job at stemming the profusion of shoddy journals publishing shoddy research than the social sciences have.

  3. Does this also apply to RCTs? Can they, or at least some of them, be interpreted as ridiculous-conclusion theory rather than randomised controlled trials?
    I remember reading somewhere about an economist who did not use complicated and bewildering mathematical calculations for his theory but used words and lateral thinking. But maybe he was not highly thought of. I can’t remember who he was.
