Stories like this are sadly common. When MDs discourage everyone else (even STEM PhDs) from reading medical literature, this really does NOT help us create a society with higher scientific & medical literacy.
Problems with "appeal to authority" arguments:
- many ppl are more motivated when they know the underlying reasoning/mechanism/data
- doesn't build underlying scientific knowledge or critical thinking skills
- science is messy & always evolving
- discourages interdisciplinary work
I value interdisciplinary work, which entails encouraging others to learn about your field & engage with it, and recognizing that their other perspectives/skills/domains can be relevant & shine new light on your area.
To return to the 1st tweet, it's also a cruel way to treat patients (regardless of their credentials) to suggest that they don't have a right to even try to understand their own health, medical treatments, and high-stakes decisions.
It bothers me when doctors, journalists, & others lump together:
- disabled & chronically ill patients who read papers/research their own medical issues AND
- able-bodied anti-vaxxers/conspiracy theorists who use "do your own research" to justify anti-science 1/ #NEISvoid
Patients w/ chronic illnesses & disabilities have 3 types of expertise:
- physical experience of their illness/disability
- how to navigate medical system, which is traumatizing
- in many cases: deep scientific knowledge, from keeping up on relevant research 2/
Able-bodied anti-vaxxers/conspiracy theorists typically don't have any of the above 3 types of expertise, which is why I think we need to draw more careful distinctions when talking about "patients doing their own research" in a derisive way. 3/
It is funny when the "only doctors are allowed to speak about public health" crew suggests I wouldn't want outsiders weighing in on AI, because one of my core values/driving motivations is that I want people from all backgrounds involved with AI & AI ethics 1/
Most problems related to AI/automated systems are inherently interdisciplinary, benefiting from having experts from many domains involved, spanning STEM, social sciences, and the humanities.
And those who are most impacted by a system should be centered most. 2/
Too often, there has been unnecessary gatekeeping & credentialism in AI, narrow definitions of who has the "right" background. I talk about this more here, but our vision for fast.ai was a community for ppl w/ the "wrong" backgrounds. 3/
We need more nuanced ways to talk about medical & public health disagreements, without the simplistic black-and-white reductions that you either trust ALL doctors in ALL matters OR you must be anti-science. 1/
"Science is less the parade of decisive blockbuster discoveries that the press often portrays, and more a slow, erratic stumble toward ever less uncertainty." @zeynep 2/
My new essay: In topics ranging from covid-19 to HIV research to the long history of wrongly assuming women’s illnesses are psychosomatic, we have seen again and again that medicine, like all science, is political.
We are not prepared for the surge in disability due to #LongCovid. The physiological damage covid causes can include cognitive dysfunction, GI immune system damage, immune dysfunction, increased risk of kidney outcomes, dysfunction in T cell memory generation, pancreas damage, 2/
We are seeing concerted efforts to downplay the long-term health effects of covid using strategies straight out of the climate change denial playbook... Many have a significant financial interest in distorting the science around long term effects of covid. 3/
Designing recommender systems to depolarize:
- algorithmic social media isn't primary driver of polarization, but could be useful intervention
- goal: to transform conflict, not to suppress or eliminate it
- 3 stages: moderation, ranking, & user interface
paper by @jonathanstray
Polarization is involved in a variety of feedback loops:
- it leads to less intergroup contact, which causes polarization
- it is a precursor to violence, which causes polarization
- it leads to selective information exposure, which causes polarization
3 key places where changes to recommender systems could be used for depolarization:
- which content is available (moderation)
- which content is selected (ranking)
- how content is presented (interface)
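A minimal sketch of how the 3 intervention points above might compose in a recommender pipeline. All function names, fields, and the diversity-bonus heuristic here are illustrative assumptions for this thread, not from Stray's paper:

```python
# Hypothetical pipeline for the 3 stages: moderation (availability),
# ranking (selection), and interface (presentation).
# The diversity bonus is an illustrative heuristic, not the paper's method.

def moderate(items):
    """Availability: drop content flagged in moderation."""
    return [it for it in items if not it.get("flagged", False)]

def rank(items, user_leaning):
    """Selection: score by relevance, with a small bonus for
    viewpoints unlike the user's (a crude depolarization lever)."""
    def score(it):
        diversity_bonus = 0.2 if it["leaning"] != user_leaning else 0.0
        return it["relevance"] + diversity_bonus
    return sorted(items, key=score, reverse=True)

def present(items, n=3):
    """Presentation: surface the viewpoint label alongside each item."""
    return [f"[{it['leaning']}] {it['title']}" for it in items[:n]]

feed = [
    {"title": "A", "relevance": 0.9, "leaning": "in-group"},
    {"title": "B", "relevance": 0.8, "leaning": "out-group"},
    {"title": "C", "relevance": 0.7, "leaning": "out-group", "flagged": True},
]
print(present(rank(moderate(feed), user_leaning="in-group")))
```

With the sample feed, moderation removes the flagged item, the diversity bonus lifts the out-group item above a slightly more relevant in-group one, and the interface stage labels each result.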
Pundits urge people to “listen to the science,” as if “the science” is a tome of facts and not an amorphous, dynamic entity. The naive desire for science to remain above politics meant many researchers were unprepared for a crisis that was both scientific & political to its core.
The pandemic hasn’t just been a science story. It is an omnicrisis. One must understand not just virology, but also the history of racism & genocide, the carceral state, nursing homes, historical attitudes toward medicine, social media algorithms, & more. theatlantic.com/science/archiv…
Much of journalism is fragmentary. For science, that means treating individual papers as a sacrosanct atomic unit and writing about them one at a time. But for an omnicrisis, this approach leads only to a messy, confusing, & ever-shifting mound of jigsaw pieces. @edyong209