Serious question, and I don't mean this patronizingly, but do people not know how to research topics or follow scholarly or other literature through their chain of citation/creation? I forget that it's an acquired skill, so I wonder about its distribution generally.
In the past, for example, when I've given sources as authors or titles or topics or some other such manner, people have struggled to find the material unless I link to it directly. Since the people in question clearly aren't lazy (they're trying to find it), I've wondered if it's, like, not well taught
And, since people always come to me looking for research suggestions & sources (which I'm always happy to provide, if I've the time, for a MuFo or someone who needs help or who is inquiring about subjects I'm interested in), I'm curious
Since the site is social & asking questions publicly or pontificating is part of it, I get the impulse to ask Twitter/speculate on something w/o looking it up (indeed, I do it, openly, all the time), but sometimes people will ask specific questions or make specific speculations
And for a while it'd kinda annoy me, because normally my first impulse on such a topic would just be to search for the answer, or reach out to a specific person, or to publicly discuss it but admit I have absolutely no idea what I'm talking about
But then it sorta dawned on me--& I mean this seriously--is it *hard* for people to research something? Like do you know where to look? How to optimize your search query? Which links to follow in an article? What the indicators are of reliability vs. BS? etc.
I can give some quick pointers if you’re interested
Hands down, and I mean this totally un-ironically, best place to start to learn about something is Wikipedia
I use it 3 ways:
1. I follow wiki links in a path, after a quick skim, & open them in (way too many) new tabs
2. I read for claims I find bizarre or interesting or confusing & click their source links
3. I go to the bottom & open all available linked scholarly books/articles
To find an article someone cites:
1. Copy the name and/or author and put it into Google or Google Scholar (if you can’t find it, go to the academic’s page)
2. If it’s freely available as a PDF, just click it; otherwise copy the DOI number (sometimes hard to find), go to Sci-Hub dot tw, & paste
Whenever a journalist, government, business or pundit makes a claim (sorry if you’re any of these) I immediately assume it’s wrong & will search it. If no link is provided, assume it is just made up.
In general the flashier the title, the more sensational & the more convenient it is for you, the more likely it is that:
1. The journalist or pundit is misrepresenting or over-inferring from the article, or
2. The article itself is wrong
That said, articles discussing a study will USUALLY link to the article in question, but often in a confusing manner; any of the following may be the hyperlink:
1. Author’s name
2. Journal’s name
3. Study’s name
4. University’s name
Etc. Click all of the above.
Alternatively, sometimes at the end of the article they’ll say ‘Article X’ by author Y is published/forthcoming in journal Z.

NOTE: sometimes a so-called ‘study’ is an interview w/ the author & the study itself hasn’t been published.
This is fine but:
1. it is not a study
2. The author or the journalist could be overstating the case
3. Until publication it’s not truly a source
4. Where the interviewee is drawing on general expertise, note that, & go to their academic page, & click their ‘publications’ tab
Often journalists and pundits will list as their source a link to another piece by a journalist or pundit.

Similarly, wiki claims are often based on these or on dead links.

Or they may come from a press release, a government, a think tank, or an advocacy group.
If a journalist or pundit cites another one for a claim, especially a controversial or important one:
1. Do not trust it; immediately become skeptical
2. It’s tiresome, but follow the link chain until you get to the origin
3. 1/3rd of the time there will be no origin
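The link-chain idea in tip 2 above can be sketched as a tiny graph walk (all names here are hypothetical, just to show the shape of the process):

```python
# Toy sketch (hypothetical sources): the "link chain" modeled as a directed
# graph you walk until you hit a node that cites nothing further.
cites = {
    "news article": ["pundit blog post"],
    "pundit blog post": ["press release"],
    "press release": ["journal study"],
    "journal study": [],  # the origin, if one exists at all
}

def trace_to_origin(start, cites):
    """Follow the first citation link at each step until there are no more."""
    chain = [start]
    while cites.get(chain[-1]):
        chain.append(cites[chain[-1]][0])
    return chain

print(trace_to_origin("news article", cites))
# -> ['news article', 'pundit blog post', 'press release', 'journal study']
```

In practice the "graph" is just you opening links in new tabs, but the discipline is the same: keep walking until the node you're on is the primary source (or until the chain dead-ends, which, per tip 3, happens depressingly often).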
Press releases and advocacy-group and think-tank papers and studies are fine, and I will cite them if need be, but always read their intro & methods etc.

Good heuristic: if a finding is against the interest or bias of the advocate, it is more reliable
Even more problematic: a science journalist will cover something under a controversial title.

Someone on either woke Twitter or reactionary rage Twitter sees it, gets mad, and gives a gloss on it (‘Here, fixed it for you!’)
Always click the link and find the study; ignore the journalist’s & Twitter’s summary of it.

Keywords to look out for: ‘survey’, ‘polled’, ‘administered over internet’, ‘representative sample’, or terms for ‘correlation’ like ‘association’
Survey & polling research, unless incredibly well conducted, is usually just flatly wrong or incredibly misleading, and it does NOT tell you what you want it to, namely the prevalence of a belief or something in the relevant population.
Often ‘representative sample’ is a codeword for ‘non-random’ & is a red flag.
ASSOCIATIONAL STUDIES DO NOT PROVE CAUSATION

ASSOCIATIONAL STUDIES DO NOT PROVE CAUSATION

ASSOCIATIONAL STUDIES DO NOT PROVE CAUSATION

ASSOCIATIONAL STUDIES DO NOT PROVE CAUSATION

ASSOCIATIONAL STUDIES DO NOT PROVE CAUSATION
Studies that find a correlation are not causal, and most of them do not even claim to be. That is not the fault of the researcher.
For example, a tweet making exactly this mistake once got like 10,000 RTs. I clicked the Post article & went to the study. It did not even remotely claim the thing in the tweet.

I followed the chain like so, and voilà, we get to the article: ‘representative sample of German pop’ means statistically non-random, and 964 is a small sample
Here's a step by step if you're wondering
What'd I tell you?
'Computer assisted telephone interview'
'Accessible by landline or mobile phone'
'Stratified by region/provider'
'It was assessed' (how? what criteria?)
8,000 'identified', 3,500 'gave informed consent', 2,300 received questionnaires, 964 returned them; 60% did it online, the rest pencil & paper
1. Data is self-report survey on controversial topic
2. Provider, region, and contact method (phone type) are not truly random
3. Couples were told not to discuss their answers, but there's no way to prove they didn't
4. Assessment/question/interviewer data/method/phrasing is unclear
5. Method of administration (pencil or online or phone) affects results
6. BY THEIR OWN ADMISSION, REQUIREMENT OF STABLE COUPLEHOOD, DUAL INFORMED CONSENT & WILLINGNESS TO TAKE SEXUAL SURVEY ARE EACH NON-RANDOMLY CORRELATED WITH THE RELEVANT VARIABLES
7. Data is based on inventories, i.e. scored tests, whose reliability, ecological validity, and ability to solicit true 'traits' is entirely unclear, both in general & in context
8. Self-report, rather than observation, means people answer *how they want to be seen, or how they feel they're expected to be seen*
A. ~44% remaining (~56% attrition) from target population to agreed population
B. ~66% remaining out of that (~34% attrition)
C. ~42% actually returned them (~58% attrition)
D. 60% participated online, 40% pencil & paper
THEY LOST ALMOST 90% OF THEIR TARGETED SAMPLE, AND, EACH TIME, THROUGH A NON-INDEPENDENT METHOD--AGREEMENT, DUAL AGREEMENT, ACTUALLY RETURNING, AND METHOD OF ADMINISTRATION ARE ALL NON-RANDOM
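The attrition arithmetic above is worth checking yourself (the stage counts are the ones quoted from the study earlier in this thread):

```python
# Attrition chain, stage by stage: 8,000 identified -> 3,500 consented
# -> 2,300 received questionnaires -> 964 returned them.
stages = [("identified", 8000), ("gave informed consent", 3500),
          ("received questionnaires", 2300), ("returned them", 964)]

prev = stages[0][1]
for name, n in stages[1:]:
    print(f"{name}: {n} ({n / prev:.0%} of the previous stage)")
    prev = n

retention = stages[-1][1] / stages[0][1]
print(f"overall retention: {retention:.0%}")  # 12%, i.e. ~88% of the target sample lost
```

That's the compounding to watch for: three stages of roughly 40–65% retention each multiply down to barely a tenth of the target sample, and none of those drop-off steps is random.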
HERE, FIXED IT FOR YOU:

'Non-random associational study based on self-report data, unclearly administered, on a population sample with 90% non-independent attrition, showed the correlation in the sample between two scored inventories'
The study isn’t representative; it’s a sample of:
1. Coupled German people who are willing to discuss their sex life with random people on the phone
2. Who are stable & concordant enough to both give informed consent
3. Who have the follow-through to return or take a survey
What’s more, the study isn’t causal. It showed that women whose partners score lower on an agreeableness index report lower satisfaction (hmm, wonder why?): a correlation, not a causal analysis
The result the Post reported on is also misleading. The study found that men whose women partners (also, forgot to mention—>the gender assumptions are whack) report less emotional stability, report higher sexual satisfaction.
Sorry, the first claim is the other way around—>more self-reported agreeableness predicted less self-reported sexual satisfaction
Okay, so let’s just spitball here. Let’s say your partner ranks lower on emotional stability; this could index anything, from depression to bipolar to dissatisfaction to emotional abuse to physical illness etc.
Let’s make what I think is a reasonable assumption: people who are willing to discuss their sex life on the phone with randos are probably overcompensating. If a man’s wife is ‘emotionally unstable’, he may overcompensate & report higher sexual satisfaction.
Okay now that I’ve torn Woke Twitter, the NY Post and that article a new one, back to research tips.
When searching for research, don’t think like a human, think like one of the following:
1. A tech nerd
2. An academic
3. A journalist
4. An algorithm
5. A concept network
6. A topic model
Mind maps, mind palaces, concept networks, topic models, citation webs, etc. are more than just cutesy; they:
1. Accurately describe how algorithms work
2. Track free association &
3. Are genuinely good methods for learning & researching
I do this in the margins of my books, in written journals, and in notes, as an exercise.

1. WHENEVER something reminds you of a scholar or idea, write down the relation

2. Make literal webs of concepts & authors based in relation, citation, genealogy, analogy & prominence
3. Spatialize, enumerate, color code, draw, act out—i.e. whatever works best for you—the concepts at hand, you will understand & remember them better, as a fact

4. Recount, teach, explain & criticize ideas to others—> teaching is the best learning method
But for the purposes of research, this is also the best way to search for things. Remember, SEO is built to game this system (when a metric becomes a target, it ceases to function as a metric), BUT it does give you a good method.
So when I search for things in Google that I don’t know a lot about, I think like a concept map.

I omit words that might lead to overly general or irrelevant topics, and I try to go for specific terms.
If I have a particular question I will ask it in syncopated form.

Let’s say I am wondering about the ecological conditions that predict hierarchies in mammals, or let’s be specific, let’s say I just got challenged with some example about Dolphins or smth
So I’d type ‘Dolphins ecology research hierarchy field study’ (idk if this will work, let’s see)

First hit!
Also, you’ll be surprised how much adding ‘free PDF’ will strengthen your results. Similarly, if you think something sounds like bullshit and you search ‘is X bullshit?’, chances are you’ll come across a blog or something that links to material on exactly that.
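The query-building habit above (specific, concept-map-style terms rather than a natural-language question) can be sketched mechanically; the search terms here are just the dolphin example from earlier, and the Scholar URL pattern is the standard `?q=` query parameter:

```python
from urllib.parse import quote_plus

# Build a search-engine query from concept-map terms instead of a full
# sentence; tack on 'free PDF' per the tip above.
terms = ["dolphins", "ecology", "hierarchy", "field study", "free PDF"]
query = " ".join(terms)
url = "https://scholar.google.com/scholar?q=" + quote_plus(query)
print(url)
# -> https://scholar.google.com/scholar?q=dolphins+ecology+hierarchy+field+study+free+PDF
```

The point isn't the URL encoding itself but the discipline: a bag of specific, discriminating terms beats "why do dolphins have hierarchies?" almost every time.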
Additional places to look:
1. JSTOR
2. Web of Science
3. Specific journals
4. Amazon
5. University libraries
6. Disciplinary association & subdiscipline reading lists
7. University working groups
8. Google Scholar
Honestly that’s basically everything lol.