Science Magazine raises its statistical bar. Will we?

From the Editors of Science:

…unfortunately, there have been far too many cases where the quantitative analysis of those numbers has been flawed, causing doubt about the authors’ interpretation and uncertainty about the result. Furthermore, it is not realistic to expect that a technical reviewer, chosen for her or his expertise in the topical subject matter or experimental protocol, will also be an expert in data analysis.

For that reason, with much help from the American Statistical Association, Science has established, effective 1 July 2014, a Statistical Board of Reviewing Editors (SBoRE), consisting of experts in various aspects of statistics and data analysis, to provide better oversight of the interpretation of observational data.

…I have been amazed at how many scientists have never considered that their data might be presented with bias. There are fundamental truths that may be missed when bias is unintentionally overlooked, or worse yet, when data are “massaged.” Especially as we enter an era of “big data,” we should raise the bar ever higher in scrutinizing the analyses that take us from observations to understanding.

This is an important move. I would love to see medical journals do the same, where I think the problems are greater and the consequences for human welfare more immediate.

At the same time, if your research mainly deals with numbers, then I think it is time to expect people with substantive expertise to become better statisticians. They need this not only to produce better work, but to be effective users of what their peers produce. This cannot simply be exported to a committee in the top journal.

Raising the refereeing bar is going to get the incentives right, which is a step in the right direction. But something will need to change in graduate admissions requirements and training.

In particular, I think that a 21st century undergraduate degree in social science ought to require fluency in statistics. It’s such a fundamental part of science, medicine, social science, and even reading the newspaper. But even the Columbias and Yales of the world don’t impress this on their undergraduates, let alone require it. I’m a big supporter of a liberal arts education, but on the margin I’d trade a couple of courses in the humanities for courses in statistics and causal inference.

89 thoughts on “Science Magazine raises its statistical bar. Will we?”

  1. I believe that many social science programs require introductory stats and a course in research methods. While this may make for better newspaper readers, I wouldn’t suggest that it helps those students make sense of academic work employing quantitative methodology. A major assumption in your argument, Prof. Blattman, is that students will have the math background to be able to progress in stats at the graduate level.

  2. A basic statistics class should be required even in high school, IMO. Every person, even outside of research, is frequently faced with statistics-based arguments and the need to assess their validity.

  3. My own thinking in social science has been improved by advanced calculus, statistics, probability, and quantitative methods generally. Ironically, I got most of that, as Kris says above, in high school and then as a chemical engineering undergraduate. And my brothers were engineers who then became PhDs in Mechanical Engineering and who do advanced mathematical modeling, so it has always helped that the family raises the bar. I am not sure how that would translate to undergraduate teaching, though.

  4. It’s an interesting move, but I am not quite sure what the statistical reviewers will do. In statistics, disagreements pop up early. I do not mean how to deal with various sorts of dependence in panel regression, but relatively simple issues in cross-section OLS. On what basis do the statisticians decide whether an analysis is sound or not?
    What I certainly do like is that some journals (e.g., PSRM) check whether the scripts and data actually run and reproduce the results the author reports in the manuscript.
    I also agree that at least one course in statistics and research design/causal inference is warranted. Starting with this in the Master’s program is too late.

  5. All very well and good: bad stats is bad research, so statistical education should start earlier and be more thorough. But when social scientists all have the mathematical and statistical skills of engineers and physicists – or better, if you really want to do the job properly, ‘cos the data you deal with are more complex – then what has been left out of their educational experience to fit in all the maths? We already have economists who are pretty much wanna-be mathematicians and have forgotten about the real world; do we want to completely erase any actual understanding of what the numbers might mean from the social sciences as well? It’s a bit of a dilemma, I know, but whatever happened to David Freedman’s advocacy of “low-tech” research that emphasizes actually knowing your subject?

  6. Right, there is a limited number of courses students can take. Two or three solid courses on causal inference, research design and methods, and statistics could achieve a lot. For example, it would help if students understood significance testing correctly. Teaching them how to determine the “lowest-tech method” appropriate for a given research question could be part of a course too.
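To make the point about understanding significance testing concrete, here is a minimal simulation sketch (my own illustration, not drawn from any commenter's course materials). It shows one fact a good course would drive home: even when the null hypothesis is exactly true, a test at the 5% level will declare roughly 5% of comparisons "significant" — so a single p < .05 is weak evidence on its own, and many tested comparisons guarantee some false positives.

```python
import random

# Simulate many "experiments" in which the null hypothesis is true:
# both groups are drawn from the same N(0, 1) distribution. Even with
# no real effect, about 5% of comparisons come out "significant".
random.seed(42)

def z_statistic(a, b):
    """Two-sample z statistic (equal group sizes, sample variances)."""
    n = len(a)
    mean_a = sum(a) / n
    mean_b = sum(b) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (n - 1)
    se = ((var_a + var_b) / n) ** 0.5
    return (mean_a - mean_b) / se

n_experiments = 2000
n_per_group = 50
false_positives = 0
for _ in range(n_experiments):
    a = [random.gauss(0, 1) for _ in range(n_per_group)]
    b = [random.gauss(0, 1) for _ in range(n_per_group)]
    if abs(z_statistic(a, b)) > 1.96:  # two-sided test at alpha = .05
        false_positives += 1

rate = false_positives / n_experiments
print(f"false-positive rate under the null: {rate:.3f}")
```

The observed rate hovers near 0.05 by construction; the pedagogical payoff is seeing that "significant" results appear reliably even when nothing is there.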