Given the complexity of delivering clinical trials, they are a fertile ground to gain from #interdisciplinary thinking. For example, the field of trial #recruitment has already gained enormously from insights from other disciplinary approaches 1/7 #MethodologyMonday
A recent paper highlighted the use of #StatedPreference methods in this space. It showed aspects of trial design can affect recruitment 2/7
#StatedPreference methods eg discrete choice experiments are more commonly used by health economists to value and quantify aspects of health care but can be used to determine preference priorities in any domain 3/7
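To make the idea concrete, here is a minimal sketch of the multinomial-logit model that underpins a discrete choice experiment. The attributes and weight values are invented purely for illustration; a real DCE would estimate the weights from respondents' choices rather than assume them.

```python
import math

# Hypothetical part-worth utilities for trial-design features.
# These numbers are invented for illustration only.
WEIGHTS = {"extra_visits": -0.8, "travel_reimbursed": 0.6, "results_shared": 0.9}

def utility(profile):
    """Linear utility of one trial-design profile (dict of 0/1 attribute levels)."""
    return sum(WEIGHTS[attr] * level for attr, level in profile.items())

def choice_probabilities(profiles):
    """Multinomial-logit probability of choosing each profile in a choice set."""
    exp_u = [math.exp(utility(p)) for p in profiles]
    total = sum(exp_u)
    return [e / total for e in exp_u]

# One choice set: design A reimburses travel and shares results;
# design B instead requires extra clinic visits.
design_a = {"extra_visits": 0, "travel_reimbursed": 1, "results_shared": 1}
design_b = {"extra_visits": 1, "travel_reimbursed": 0, "results_shared": 0}
probs = choice_probabilities([design_a, design_b])
print(probs)  # design A attracts the higher predicted uptake
```

Fitting such a model to many respondents' choices is what lets a DCE quantify which design features most affect willingness to take part.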
#BehaviouralScience methods have also been used to inform recruitment. Approaches such as the Theoretical Domains Framework have been used to diagnose recruitment barriers & introduce behaviour-informed interventions to help address them. 4/7 trialsjournal.biomedcentral.com/articles/10.11…
#Qualitative methods have also yielded useful insights and have underpinned the development of the #Quintet approach, now widely adopted across a range of trials. 6/7
These examples show that collaborating across disciplinary boundaries and adopting wider #interdisciplinary approaches can majorly advance our thinking. We should routinely ask what insights from other disciplines may help 7/7
It’s good to start a new year getting the basics right. The same goes for methods; important not to slip into common errors. The recent Xmas BMJ paper on the most common stats/methods errors is a great place to start 1/7 #MethodologyMonday
The BMJ stats editors highlighted the top 12 most common stats errors they come across. They are summarised in a neat infographic 2/7
All are important, but a couple particularly resonate. One is “dichotomania” (the term coined by Stephen Senn) where a perfectly good continuous measure eg blood pressure or weight is arbitrarily dichotomised into two categories - good/bad; high/low etc 3/7
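A small simulation makes the cost of dichotomania visible. This sketch (all numbers invented for illustration) compares how often a simple z-test on a continuous outcome detects a real treatment effect versus the same test after splitting the outcome into "high"/"low" at the control mean:

```python
import random
import statistics

random.seed(1)

def simulate_power(n=100, effect=0.4, sims=2000, z_crit=1.96):
    """Power of analysing a continuous outcome directly vs after
    dichotomising it - a sketch of why 'dichotomania' wastes information.
    Outcomes are simulated as standard normal with a mean shift of `effect`."""
    cont_hits = dich_hits = 0
    for _ in range(sims):
        ctrl = [random.gauss(0, 1) for _ in range(n)]
        trt = [random.gauss(effect, 1) for _ in range(n)]
        # z-test on the difference in means (known SD = 1)
        z = (statistics.mean(trt) - statistics.mean(ctrl)) / (2 / n) ** 0.5
        cont_hits += abs(z) > z_crit
        # dichotomise at 0 ('high' vs 'low') and compare proportions
        p1 = sum(x > 0 for x in ctrl) / n
        p2 = sum(x > 0 for x in trt) / n
        pbar = (p1 + p2) / 2
        se = (2 * pbar * (1 - pbar) / n) ** 0.5
        if se > 0:
            dich_hits += abs(p2 - p1) / se > z_crit
    return cont_hits / sims, dich_hits / sims

cont_power, dich_power = simulate_power()
print(cont_power, dich_power)  # the dichotomised analysis loses power
```

The continuous analysis reliably detects the effect more often; dichotomising throws away information and, in effect, shrinks your sample.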
Following the paper noted this week to have just added capital “T”s to a graph to depict standard errors 😱🤯, a short note on the importance of accurate data visualisation in any research report … 1/8 #MethodologyMonday
This was the tweet & thread which highlighted T-gate. There are lots of other issues with that paper, but data visualisation is a core element 2/8
The paper had attempted to use a #DynamitePlot (sometimes known as a Plunger Plot) to display the data. Even without adding T’s there are major issues with dynamite plots and frankly most statisticians would like them consigned to history! 3/8
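One reason dynamite plots mislead is that a bar topped with a standard-error whisker says almost nothing about the raw data. This sketch (with invented example data) shows how small the mean ± SEM interval is relative to the actual spread of observations:

```python
import math
import random
import statistics

random.seed(2)

# Hypothetical outcome data for one trial arm (invented for illustration).
data = [random.gauss(50, 10) for _ in range(40)]

mean = statistics.mean(data)
sd = statistics.stdev(data)           # describes the spread of the data
sem = sd / math.sqrt(len(data))       # describes the precision of the mean

# A dynamite plot of mean +/- SEM hides almost all of the raw spread:
inside_bar = sum(mean - sem <= x <= mean + sem for x in data) / len(data)
print(f"SD={sd:.1f}, SEM={sem:.1f}, "
      f"share of points inside the mean±SEM bar: {inside_bar:.0%}")
```

Plotting the individual data points (or a box plot) conveys the distribution that the bar-and-whisker conceals.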
We are all being rightly encouraged to be #efficient in our trial design & conduct. Efficiency comes primarily through design choices … whether classic or more modern efficient designs … a few reflections below 1/7 #MethodologyMonday
A #crossover design can be highly efficient. Each person acts as their own control, removing a large element of variation and making the design more powerful. However, the outcome needs to be short term & the intervention can’t have a lasting (carry-over) effect 2/7 bmj.com/content/316/71…
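The efficiency gain can be sketched with standard sample-size formulae (normal approximation, 5% two-sided alpha, 80% power). The effect size, SD, and within-person correlation below are invented for illustration:

```python
import math

def n_parallel(delta, sd, z_alpha=1.96, z_beta=0.84):
    """Per-arm sample size for a two-arm parallel-group trial
    (normal approximation; z_beta=0.84 gives ~80% power)."""
    return math.ceil(2 * (sd * (z_alpha + z_beta) / delta) ** 2)

def n_crossover(delta, sd, rho, z_alpha=1.96, z_beta=0.84):
    """Total participants for a 2x2 crossover: each person contributes a
    within-person difference with variance 2 * sd^2 * (1 - rho)."""
    sd_diff = math.sqrt(2 * (1 - rho)) * sd
    return math.ceil((sd_diff * (z_alpha + z_beta) / delta) ** 2)

# Hypothetical numbers: detect a 5-unit effect, outcome SD 10,
# within-person correlation 0.7.
per_arm = n_parallel(5, 10)            # parallel design, per arm
total_crossover = n_crossover(5, 10, 0.7)
print(per_arm, total_crossover)
```

With these illustrative inputs the parallel design needs 63 per arm (126 in total) while the crossover needs only 19 participants in total; the higher the within-person correlation, the bigger the saving.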
This is particularly the case when a cluster design is also in play. A #ClusterCrossover design can majorly reduce the sample size requirements compared with a std cluster design. A good primer on this was published by @karlahemming and colleagues 3/7 bmj.com/content/371/bm…