Quantitative Study Designs and Critical Appraisal
Intended to help you do independent critical appraisal of the data being released on #COVID19, including an overview of study designs.
Also to help you assess whether "experts" did critical appraisal or are just repeating the abstract.
Quantitative Study Designs
Only covering studies where the unit of analysis is an individual person.
There are also ecological studies, which use a population as the unit of analysis, and systematic reviews & meta-analyses, which quantitatively combine the results of several studies.
Observational Studies - Descriptive
Examples: case reports, case-series reports, surveillance studies, surveys
Cross-sectional studies - Describe the prevalence of a disease or other phenomena without looking for associations between variables
Observational Studies - Analytic Cross-Sectional
Can test hypotheses about associations between exposures (risks and possible causative factors) and outcomes.
Pros: easy, cheap, no loss to follow-up
Cons: limited causal inference; prone to selection bias, confounding, and effect modification
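The association measures an analytic cross-sectional study yields can be computed from a simple 2x2 table. A minimal sketch, using entirely hypothetical counts for illustration:

```python
import math

# Hypothetical 2x2 table: exposure vs. outcome, measured at one point in time
a, b = 40, 60    # exposed:   with outcome, without outcome
c, d = 20, 80    # unexposed: with outcome, without outcome

prevalence = (a + c) / (a + b + c + d)            # overall prevalence of outcome
prevalence_ratio = (a / (a + b)) / (c / (c + d))  # prevalence in exposed vs. unexposed
odds_ratio = (a * d) / (b * c)                    # prevalence odds ratio

# 95% CI for the odds ratio on the log scale (Woolf's method)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(prevalence, prevalence_ratio, odds_ratio, (ci_low, ci_high))
```

Note these are measures of association only -- because exposure and outcome are measured at the same moment, the design itself cannot establish which came first.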
Observational Studies - Analytic Cohort Studies
Prospective or Retrospective
Process: classify people by exposure (using a matched comparison group), follow them up, and observe outcomes
Pros: Better causality, calculate incidence, measure multiple outcomes
Cons: limited generalizability; susceptible to secular trends
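Because a cohort follows people forward from exposure, it can calculate incidence directly, and from incidence a risk ratio. A minimal sketch with hypothetical follow-up counts:

```python
# Hypothetical cohort: outcomes observed over a fixed follow-up period
exposed_cases, exposed_n = 30, 500
unexposed_cases, unexposed_n = 10, 500

# Cumulative incidence (risk) in each group
risk_exposed = exposed_cases / exposed_n        # 30/500 = 0.06
risk_unexposed = unexposed_cases / unexposed_n  # 10/500 = 0.02

risk_ratio = risk_exposed / risk_unexposed          # relative measure
risk_difference = risk_exposed - risk_unexposed     # absolute excess risk
print(risk_ratio, risk_difference)
```

A cross-sectional study cannot produce these numbers, because it never observes new cases arising over time -- only a cohort (or trial) can.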
Experimental Studies - Quasi-Experimental Studies
Synonymous with non-randomized intervention studies; they try to assess the impact of an intervention without randomization.
Overview: Difference-in-Differences, Interrupted Time Series (ITS), Regression Discontinuity, etc.
Pros: can study exposures where an RCT is infeasible
Cons: challenging to interpret
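Difference-in-differences, the first design listed above, is simple enough to show in a few lines. A sketch with hypothetical mean outcomes: it compares the pre-to-post change in a group that received an intervention against the change in a comparison group, under the (untestable) parallel-trends assumption.

```python
# Hypothetical mean outcomes (e.g. weekly case counts) before and after
treated_pre, treated_post = 12.0, 8.0    # group exposed to the intervention
control_pre, control_post = 11.0, 10.5   # comparison group

# DiD: the treated group's change, net of the change the comparison
# group experienced over the same period
did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # (-4.0) - (-0.5) = -3.5
```

The "challenging to interpret" con shows up here: the estimate is only an intervention effect if the two groups would have trended in parallel absent the intervention, which the data cannot prove.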
Randomized Controlled Trials
Types: treatment & prevention, individuals & communities, single-arm vs. multi-arm; normally 4 phases, from safety to efficacy to post-marketing studies
Pros: randomization balances confounders -- known and unknown -- across arms
Cons: expensive, time-consuming, limited generalizability
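Why randomization handles confounding can be seen in a toy simulation (purely illustrative; the variable names and numbers are mine): assign people to arms at random while ignoring a confounder entirely, and with a large enough sample the arms end up with nearly identical confounder distributions anyway.

```python
import random

random.seed(0)
n = 10_000

# An unmeasured confounder, e.g. age in years (hypothetical distribution)
confounder = [random.gauss(50, 10) for _ in range(n)]

# Random assignment to treatment (True) or control (False),
# done without looking at the confounder at all
arm = [random.random() < 0.5 for _ in range(n)]

n_treat = sum(arm)
mean_treat = sum(c for c, a in zip(confounder, arm) if a) / n_treat
mean_ctrl = sum(c for c, a in zip(confounder, arm) if not a) / (n - n_treat)

# The between-arm difference is small and shrinks as n grows
print(abs(mean_treat - mean_ctrl))
```

This balance holds only in expectation -- in any single (especially small) trial the arms can still differ by chance, which is why trials report baseline characteristics tables.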
Critical Appraisal -- ODDCHAIR
O: Objectives
D: Design
D: Definitions
C: Data Collection
H: Data Handling
A: Data Analysis
I: Interpretation
R: Reporting
Using a framework for a standardized approach to critically appraising studies can facilitate meaningful but rapid review.
Skip the conclusions in the abstract, as they will cloud your review.
1) Be a skeptic
Worst case scenario, the investigators are proven right.
2) All studies are flawed
The key is whether the flaws are declared and addressed as much as possible.
3) Appraising evidence is complex -- but that's another thread
ODDCHAIR
Objectives -- Understanding the objectives is critical, as they should guide the design, methods, analyses, and reviews
Design -- Does the design make sense for the objectives, given the trade-offs in the earlier tweets?
Definitions -- Clear definitions for inclusion, exclusion, exposure, and outcomes?
Data Collection -- Review how they collected data--in person, virtual, over what time frame, etc
Data Handling -- See if they report how data went from the individual to analyses (including biological samples or self-reported data)
Data Analysis -- More complex, but do the analyses make sense for the type of data available? This is where an expert could actually be useful, rather than just repeating the abstract
Interpretation -- All researchers are guessing, but is the interpretation consistent with the hypothesis and objectives?
Reporting -- Are all the inputs you need to decide whether the study makes sense available in the paper? Is the reporting consistent with best practices -- e.g. STROBE, MOOSE, etc.? Check equator-network.org
Pro tip: DATA ARE ALWAYS PLURAL! Or it is just a date :)
