
Taking the plunge into credible neuroscience 

Dorothy Bishop, Supernumerary fellow of St John’s College Oxford and former Credibility Advisory Board member

As someone who is prominent in trying to promote open, reproducible ways of working, I'm often asked how much time it takes. Many scientists are worried about doing anything that might slow down their production of research papers. In our competitive times, when a glittering CV is needed to be employable, why would anyone want to act in a way that would be equivalent to running a race with one's shoelaces tied together?


This is a valid concern. Once you know that your data and scripts are going to be open, so that anyone could check your results, you turn into an obsessive. I used to pride myself on being careful and thorough, but I now realise I wasn't. I check and re-check everything – and the horrifying truth is that I still find errors in my work.


I started to wonder if my brain was beginning to rot, but I have found that others have the same experience; indeed, it was engagingly described by David Donoho (1), who talked about 'the ubiquity of error'. There's no doubt that, once you realise just how error-prone we all are, you will have to slow down to check everything, and your rate of productivity will decline.

But, of course, there is an upside. 


The work you do publish still won't be error-free (I'm beginning to think that's impossible), but it will be far closer to that state than it would otherwise be. I have found that people who start working this way – trying to ensure that every detail of a paper can be reproduced from the raw data – become addicted to it because it feels more like 'proper science'. 


I can thoroughly recommend Donoho's article to anyone who is debating with themselves about taking the plunge into working in a more reproducible way. Quite apart from the higher quality of the work itself, he notes many advantages, all of which I can attest to.


One of these is that you can understand what you did when you return to your research after a few months. No longer do I find myself desperately searching for an old file and then trying to reconstruct what I did with it to produce a certain table of numbers. 


The fact that others too can reconstruct what you did leads to far more sharing, more building on your work by others, and a greater overall impact.


Career suicide? I don't think so.

1. Donoho, D. L. (2010). An invitation to reproducible computational research. Biostatistics, 11(3), 385-388. doi:10.1093/biostatistics/kxq028