2/ This study’s results do not demonstrate that colonoscopy is effective, but also do not demonstrate that it is ineffective. Why? Because only 42% of those in the screening arm actually had a colonoscopy!
3/ What *does* this study show? Only that a screening program involving an invitation to get a one-time colonoscopy modestly reduced colon cancer cases but did not reduce colon cancer deaths or overall mortality at a median of 10 years of follow-up.
4/ So this is a trial of the *invitation* to get a colonoscopy, NOT the colonoscopy itself. What intervention study would be accepted when only 42% of those randomized to the intervention arm actually received the intervention?
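To make the dilution concrete, here is a minimal numerical sketch. All numbers are hypothetical assumptions chosen for illustration, not the trial's actual data; it only shows how 42% uptake shrinks the arm-level effect that an intention-to-treat comparison can detect.

```python
# Minimal sketch of intention-to-treat dilution under partial uptake.
# All inputs are hypothetical and chosen only to illustrate the arithmetic.

uptake = 0.42            # fraction of the invited arm who actually get the colonoscopy
risk_control = 0.010     # assumed 10-year outcome risk with no screening
risk_screened = 0.007    # assumed risk among those who actually get screened

# The invited arm's risk blends screened and unscreened participants
# (assuming non-attenders carry the control-arm risk).
risk_invited_arm = uptake * risk_screened + (1 - uptake) * risk_control

rr_screened = risk_screened / risk_control    # effect of the colonoscopy itself: 0.70
rr_invited = risk_invited_arm / risk_control  # effect of the *invitation*: ~0.87

print(f"Risk ratio among the actually screened: {rr_screened:.2f}")
print(f"Risk ratio for the invited arm (ITT):   {rr_invited:.2f}")
```

Under these made-up numbers, a real 30% risk reduction among the screened shows up as only about a 13% reduction at the arm level. That arm-level "invitation effect" is what this trial measured.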
5/ Let’s consider a few ludicrous examples to make this point. These are intended to be absurd, so don’t @ me if you miss the point. First, I might report that smoking doesn’t cause lung cancer – cough cough, are you sure, you might say?
6/ What if I randomized people to smoke or not (yeah yeah), and then followed them to see if they developed lung cancer? The lung cancer rate was 0.56% in the smoking group and 0.06% in the non-smoking group, p = NS. But only 42% of those in the smoking group actually smoked.
7/ Does this provide evidence that smoking doesn’t meaningfully increase lung cancer rates? Of course not, and I hope that’s obvious! More than half of the smoking group DIDN’T ACTUALLY SMOKE, so the study doesn’t assess the effect of smoking.
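Running the same arithmetic in reverse on the hypothetical smoking numbers shows what the arm-level rates hide. This is just a sketch, under the simplifying assumption that the 58% of the smoking arm who never smoked carry the control arm's rate:

```python
# Back-calculate the lung cancer rate among those who actually smoked,
# from the hypothetical arm-level rates above and 42% adherence.
adherence = 0.42
rate_smoking_arm = 0.0056   # 0.56% observed in the arm randomized to smoke
rate_control_arm = 0.0006   # 0.06% observed in the non-smoking arm

# Assume non-adherers in the smoking arm have the control arm's rate.
rate_actual_smokers = (rate_smoking_arm - (1 - adherence) * rate_control_arm) / adherence
print(f"Implied rate among actual smokers: {rate_actual_smokers:.2%}")  # ~1.25%
```

The diluted 0.56% vs 0.06% arm-level comparison badly understates the roughly 1.25% rate implied among those who actually smoked. That dilution is the whole problem with reading this kind of trial as a test of the intervention itself.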
8/ Second, I might report that super-loud rock concerts don’t affect hearing – say what now, you might say (literally)?
9/ What if I randomized people to attend rock concerts or not, and then followed them to see if they developed hearing loss? The hearing loss rate was 3% in the concert group and 2% in the control group, p = NS. But only 42% of those in the concert group actually went to concerts.
10/ Does this provide evidence that rock concerts don’t cause hearing loss? Of course not, and I hope again that’s obvious! More than half of the rock concert group DIDN’T ACTUALLY GO TO A CONCERT, so the study doesn’t assess the effect of attending rock concerts.
11/ Third, let’s get a little more risqué. I might report that sex doesn’t cause pregnancy – come again, you might say?
12/ What if I randomized people to have sex or not, and then followed them to see if they became pregnant? The pregnancy rate was 2% in the sex group and 0.5% in the no-sex group, p = NS. But only 42% of those in the sex group actually had sex.
13/ Does this provide evidence that sex doesn’t cause pregnancy? Of course not, and I hope for the final time that’s obvious! More than half of the sex group DIDN’T ACTUALLY HAVE SEX, so the study doesn’t assess the effect of having sex. Anti-climactic, huh?
14/ Back to the current study: this trial doesn’t tell us if colonoscopy is effective as a screening tool for colon cancer, especially to reduce mortality, ONE WAY OR THE OTHER! There is actually some signal of benefit in my assessment, but this is far from definitive.
15/ We still need better data to establish colonoscopy’s effect, unfortunately. In the meantime, I hope this trial does not push people away from colonoscopy; it really should have little to no effect on attitudes toward colonoscopy as a screening tool. /fin
1/ More on the colonoscopy trial: for those interpreting this as evidence that colonoscopy screening doesn’t work, read the protocol: thieme-connect.com/products/ejour…
2/ If you think this was a sound screening recruitment strategy, well, you have a different understanding of public health, primary care, and shared decision-making in medical practice than I do. Read on …
3/ From the protocol:
“Each individual in the screening group receives an invitation letter with the appointment date and time for a colonoscopy at the corresponding participating center.”