Last week I posted my worries about the direction of social science experiments: that raising the quality bar will drop the quantity of studies (and what that sacrifice will cost). I did not expect tens of thousands of visits, tweets, replies, and suggested readings. Even Vox picked it up, causing me to immediately think: “Oh man, why do I say these things out loud?”
Clearly there is an untapped demand for mournful posts about feeling overworked, masquerading as critiques of science. And here all I thought you cared about was Star Wars and Trump.
My basic point was this: each study is like a lamp post. We might want to have a few smaller lamp posts illuminating our path, rather than the world’s largest and most awesome lamp post illuminating just one spot. I worried that our striving for perfect, overachieving studies could make our world darker on average.
There were lots of good comments, and I thought I’d summarize some of them here.
David McKenzie blogged that he thinks my concerns apply more to the big government evaluations than the new, smaller, flexible, lab-like experiments that more and more social scientists are running in the field.
I suppose that’s possible, but I’ve found that the more control I have over an experiment, the more able I am to add bells and whistles. The expectations of what is possible are higher. And I push myself harder, since I’m my own worst enemy. So, in my view, the incentives to over-invest are greatest in the experiments we control the most.
Meanwhile, Rachel Glennerster countered David with tips on how to work with government partners on randomized evaluations.
But most surprisingly to me, a huge number of commenters said something along the lines of: “I’d much rather have a couple of really good studies than a whole bunch of small and underpowered ones.” On some level: of course. But on another level, this kind of statement is exactly the problem I’m describing.
Think in extremes for a moment. On one extreme, we could have just one experiment in one place to answer an important question, and it would be big and amazing. On the other extreme, we could have thousands of tiny trials in as many different places, each with only a trivial sample.
Obviously neither of these extremes makes any sense. There’s a sweet spot in the middle. I read those commenters as saying “we know where we want to be, and we’re already there.” I’m not so confident in the status quo, and we should always be suspicious when “right here” feels like the best place in the world.
I think economics might be just a little too close to the first extreme, with just one big awesome study. Why do I say this? Because we obsess about internal validity all the time and almost ignore external validity. The profession is not optimizing. Meanwhile, my sense is that political science paper-writers are a little too far towards the other extreme, with more emphasis on quantity over quality (this does not apply to the books so much).
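The sweet-spot intuition can be made concrete with a toy simulation. Every number below is an assumption for illustration (the effect size, the cross-site heterogeneity, the fixed sample budget, the per-study overhead), not an estimate of anything real. The qualitative point: when treatment effects vary across sites and the total sample budget is fixed, splitting the budget across more sites first helps (you average over the heterogeneity) and then hurts (per-study fixed costs eat the sample).

```python
import numpy as np

rng = np.random.default_rng(0)

# All numbers below are assumptions for illustration, not estimates.
TRUE_MEAN = 0.20   # average treatment effect across all sites
SITE_SD = 0.15     # genuine cross-site heterogeneity in the effect
NOISE_SD = 1.0     # outcome noise within a site
TOTAL_N = 10_000   # fixed total sample budget
OVERHEAD = 100     # per-study fixed cost, paid in "lost" sample

def estimate_error(n_sites, reps=2000):
    """Root-mean-squared error for the cross-site average effect when
    the fixed budget is split evenly across n_sites experiments."""
    n_per = TOTAL_N // n_sites - OVERHEAD
    if n_per <= 0:
        raise ValueError("budget exhausted by per-study overhead")
    errs = np.empty(reps)
    for r in range(reps):
        site_effects = rng.normal(TRUE_MEAN, SITE_SD, n_sites)
        # each experiment estimates its own site's effect with sampling
        # noise ~ 2 * NOISE_SD / sqrt(n_per) (half treated, half control)
        ests = site_effects + rng.normal(0, 2 * NOISE_SD / np.sqrt(n_per), n_sites)
        errs[r] = ests.mean() - TRUE_MEAN
    return float(np.sqrt(np.mean(errs**2)))

for k in (1, 4, 16, 64, 90):
    print(f"{k:3d} studies: RMSE = {estimate_error(k):.3f}")
```

In this toy world the error first falls as the budget spreads across more sites, then rises once overhead dominates: neither extreme is optimal, and where the interior optimum sits depends entirely on the assumed heterogeneity and costs.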
Scientific and statistical analysis should be able to guide us to the optimal point. For this we probably need a better science of extrapolation. The fact that I am not aware of any such science doesn’t mean it doesn’t exist. But it’s not something in the mainstream that we discuss. This should worry you.
Actually there are a few things out there.
- Jorg Peters sent me this systematic review of all experimental papers published in top economics journals, with a good discussion of the many external validity problems.
My colleague Kiki Pop-Eleches and coauthors have a draft paper that looks at a natural experiment that has happened in every country—the gender combination of your children—and uses it to build models for understanding when we can and cannot extrapolate well, and how to design experiments to maximize generalizability.
- Lant Pritchett sent me a new paper arguing that, when social programs are complex, with many dimensions, we benefit from testing more elements of their design, even if that leads to small sample sizes and low statistical power.
I can’t say whether these papers are correct, or if the list is complete, but I enjoyed skimming through them, and plan to read them carefully sometime soon. All this strikes me as a pretty important area for more research and reflection.
I would love to get pointers to other work in the comments.
81 Responses
Thanks! This is an interesting perspective. Personally, I am someone who has not found RCTs to be useful in the work I do and support, primarily because I find them too rigid and oversimplified to capture anything close to the reality of complex behaviour in systems. Lately, I have come across this tool: http://cognitive-edge.com/sensemaker/. I know some practitioners are using it and are finding it enlightening. One interesting piece is that it removes the bias of the researcher (or the statistical framework, or a computer algorithm) from the interpretation of the results. What do you think?
I’m skeptical that either proliferation of small-scale experiments or meta-analysis provides a solution to the problem of external validity. In both cases, we only learn about the distribution of treatment effects in places where experiments are being done. And we increasingly have evidence that places with experiments differ from other contexts we may be interested in, and differ in ways that are correlated with the effect of the treatment being evaluated. Hunt Allcott calls this “site selection bias” and provides some of the empirical evidence for it here: https://www.dropbox.com/s/g9bxwzyhsxz7uri/Allcott%202015%20QJE%20-%20Site%20Selection%20Bias%20in%20Program%20Evaluation.pdf?dl=0 .
So just as we developed tools to account for selection bias in program evaluation (including randomization), we need to develop new tools to account for site selection bias. These tools could be developed either in a reduced-form framework or using a structural model of behavior. My own applied econometrics work, on bounding the average effect of an experimentally-evaluated treatment in a new context (http://www.personal.psu.edu/mdg5396/Gechter_Generalizing_Social_Experiments.pdf), has focused on the former. I believe the latter is a particularly exciting avenue for future research.
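Site selection bias is easy to see in a small simulation (all numbers made up for illustration): if the sites where a program works better are also the sites more likely to host an evaluation, then the average effect among study sites overstates the average effect across all sites, no matter how many experiments we run.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical world: 1,000 candidate sites, each with its own true effect.
n_sites = 1000
true_effects = rng.normal(0.10, 0.10, n_sites)

# Assumption: sites with bigger effects are likelier to host an experiment
# (e.g., motivated partners select in) -- the "site selection" channel.
p_host = 1 / (1 + np.exp(-(true_effects - 0.10) / 0.05))
hosts = rng.random(n_sites) < p_host

population_avg = true_effects.mean()
study_site_avg = true_effects[hosts].mean()

print(f"average effect, all sites:   {population_avg:.3f}")
print(f"average effect, study sites: {study_site_avg:.3f}")
```

The gap between the two averages is exactly the quantity that randomization within a site cannot fix, which is why tools like Gechter’s bounds or structural models are needed to extrapolate to non-study contexts.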
There is a science of extrapolation and we do talk about it: it’s meta-analysis. Good meta-analysis should always assess the validity of extrapolating from one site/study to another – that is, it should assess the genuine heterogeneity of the treatment effects (netting out sampling variation). For more detail: I discuss this, with links to papers that are trying to do this in economics, in my current working paper: http://economics.mit.edu/files/10595
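The commenter’s phrase “genuine heterogeneity, netting out sampling variation” is precisely what a random-effects meta-analysis estimates. Here is a minimal sketch using the standard DerSimonian–Laird estimator; the five study effects and standard errors are invented for illustration, not taken from any real literature.

```python
import numpy as np

def dersimonian_laird(effects, ses):
    """Random-effects meta-analysis: estimate between-study heterogeneity
    (tau^2), net of sampling variation, via DerSimonian-Laird."""
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    w = 1 / ses**2                           # fixed-effect (inverse-variance) weights
    fe_mean = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fe_mean)**2)   # Cochran's Q statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance, floored at 0
    w_re = 1 / (ses**2 + tau2)               # random-effects weights
    re_mean = np.sum(w_re * effects) / np.sum(w_re)
    return re_mean, tau2

# Toy data (made up): five studies of the same intervention.
effects = [0.30, 0.10, 0.25, -0.05, 0.18]
ses     = [0.08, 0.05, 0.10,  0.07, 0.06]
mean, tau2 = dersimonian_laird(effects, ses)
print(f"pooled effect: {mean:.3f}, tau^2: {tau2:.4f}")
```

A positive tau² says the studies disagree by more than sampling noise alone can explain, which is the meta-analytic warning sign that extrapolating any single study’s estimate to a new site is risky.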