A late-night thread on reproducibility and #openscience in cognitive neuroscience, including our upcoming series of (rather punchy) comment pieces at the journal Cortex. Gather round all ye.
Here is my editorial introducing the seven commentaries. I’m going to move through each of them in turn, so stick around to the end of the thread to hear about two new initiatives we’re launching this year in response. psyarxiv.com/shryx /1
First up, Huber et al. report how they tried to replicate a study published in @NatureNeuro. After being invited beforehand to run & submit the study by one editor, a different editor then desk rejected them once the (non-replication) results were in. /2
Sidebar: we later published Huber et al’s replication study at Cortex (thanks @NatureNeuro, we’re happy to help you out any time 👍). You can read the paper here: drive.google.com/file/d/1PhWHaA… /3
Good for them, but Huber & co believe the problem w/ replication in cog neurosci is deep & serious. They call for more stringent checks on reproducibility *before* publication & dynamic tracking of rep attempts & outcomes. Their full comment here: psyarxiv.com/5q9ma/ /4
Next, @sampendu pushes back at the suggestion that we should select what gets published based on results, even when doing so is based on replicability. Instead he calls for a “pending replication” stamp to be placed on unverified exploratory studies psyarxiv.com/h49a6 /5
But wait...what about the tools we’re using? @m_wall argues that the reliability of our research cannot exceed the reliability of the methods we employ. And in cognitive neuroscience this is poorly understood. It's not just about publication culture. psyarxiv.com/upynr/ /6
Nevertheless the often obstructive nature of peer review isn’t terribly helpful. @HannahMBuxton weighs in to point out the value of adversarial collaborations for reducing bias & encouraging better theory, esp. when submitted as Registered Reports psyarxiv.com/82sr3/ /7
Do reforms to how science works take into account the scientists who DO the work – the early career researchers? @LeahMaizey & @LTzavella argue that unless reforms work for #ECRs, they will fail. M&T suggest “replication & extension” as one solution psyarxiv.com/dzsh4 /8
But it’s not all about incentives. @minzlicht calls for cognitive neuroscientists to rise above their egos and fallibilities, embrace error correction & champion reproducibility over reputation. And he is someone who practices what he preaches psyarxiv.com/8zvc3 /9
In particular, you can read @minzlicht’s recent Registered Report at Cortex where he tests the reproducibility of one of his own previous findings & concludes that the original result may be a false positive osf.io/473kd/

Almost nobody ever does this in cogneuro. /10
And finally @NeuroMinded, a former @NatureNeuro editor, takes on the newsroom culture of sci publishing. Huber et al.’s fixes will help but only superficially. To really fix these problems, he says, scientists need to take back control from publishers psyarxiv.com/d59me /11
Where does all this leave us? Cortex has been at the forefront of #openscience initiatives such as @RegReports, Exploratory Reports, TOP guidelines & #openscience badges.

But these are NOT enough and this year we’ll be launching two new initiatives. /12
The first is an Accountable Replications policy – @hardsci’s now famous "pottery barn rule" of publishing, which we recently introduced at @RoyalSociety Open Science. In a nutshell: if Cortex published the original study we’ll publish the replications of that study. /13
The second is an entirely new initiative, again the creation of @hardsci: Verification Reports. Short articles with the sole purpose of testing the reproducibility & robustness of original studies using the exact SAME data. /14
These steps aren’t a total answer but they move us in the right direction. The recent launch of @ukrepro – together with the wide support the network is receiving from funders, publishers & regulators – means that reproducibility is going to be a Big Deal for many years. /15
That’s why cognitive neuroscientists need to be at the forefront of these discussions. And it’s why cog neurosci journals need to work harder to support reproducibility. That means adopting @RegReports, Exploratory Reports, TOP guidelines, replication initiatives & more. /16
I will end this (now repaired) thread here! Hope you enjoy the articles, which are all available as preprints in the tweets above and thanks to all the wonderful contributors for weighing in. Onward.

Full thread also unrolled here: neurochambers.blogspot.com/2019/03/the-ba…

/end