(1/8) Our study on questionable research practices (QRPs) and open science practices (OSPs) is now published in JQC. Data are public; link below is to the open access version. This thread summarizes the findings. @socpsychupdate @ceptional @siminevazire
(2/8) First, we review the evidence on the prevalence of QRPs and OSPs in disciplines other than criminology. Unfortunately, QRPs are common in every discipline where scholars have looked.
(3/8) We find that the vast majority of criminologists have used QRPs. The average criminologist has used multiple QRPs. One of the most common QRPs is selectively reporting significant studies/findings. Hence, publication bias is a real threat to evidence-based crime policies.
(4/8) OSPs are common in criminology, about as common as in other disciplines. The most common OSP is posting an article publicly, which is also similar to other disciplines. OSP users are also more likely to use QRPs, which is weird (we discuss some possible reasons why).
(5/8) Criminologists perceive a high prevalence of QRPs in their discipline, but the uniform distributions for many of the perceptual questions suggest that criminologists really don't know what their peers are doing in terms of research ethics (= weak descriptive norms).
(6/8) In terms of support, a non-trivial percentage of criminologists believe it is sometimes OK to do things like file-drawer null results (67%) or fill in missing values WITHOUT informing readers (18%). 😟 The latter is arguably data fraud.
(7/8) Support for open science is even higher. The vast majority of criminologists support each OSP. That is good news! The implication is that there is likely to be little resistance to changing journal policies. So, why don't we do it?
(8/8) Methods training appears to have little association with use of QRPs or OSPs. It also appears to have little association with views about QRPs or OSPs. The implication is that QRP use may not reflect ignorance; QRP users may be fully aware of what they are doing.
(1/2) We need more attention to selection bias in qualitative research. A new study in a top sociology journal examines "how young people experience policing," but it draws only on interviews of youth in an organization devoted to abolishing the police, one that bombards...
(2/2) ...its members with abolitionist messages. If you repeatedly expose youth to these messages (a few of the Coalition's social media posts are below), and then ask them how they feel about police, you are going in circles. They're going to tell you what you told them.
Given the responses, I need to add two posts. Although I'm not a qualitative researcher, the concerns I'm raising are discussed regularly by qualitative methodologists. I'll put one source here and one in the next post.
(1/6) Because we didn't find racial discrimination, reviewers thought our findings were wrong and our preregistered hiring experiments were flawed, leading to repeated rejections. Here are some lessons we learned, as outlined in our Discussion.
(2/6) There is publication bias in the field-study literature on racial discrimination in hiring, as shown in our Appendix. Discrimination definitely exists. But it seems clear that null and negative findings have been left in the file drawer.
(3/6) Some field studies of hiring discrimination report very large differences (>15 percentage points) in callback rates (e.g., Pager, 2003). Yet the best estimate, from Kline et al. (2022) with 83,000 applications, is that the difference in callback rates is about 2 percentage points.
(1/7) Whenever I say the evidence for evidence-based crime policies is weaker than we think because of QRPs, someone will say I'm speculating. Here's why I'm not. First, we know that many CJ researchers selectively report experimental outcomes in ways that inflate effect sizes.
(2/7) We know that sample size is negatively correlated with effect size in CJ experiments, when it shouldn't be, which is bad news: it suggests that whether studies are being written up depends on results being significant (which requires large effects in small samples).
(3/7) We know that most (87%) criminologists admit to using questionable research practices, with about half saying they have selectively reported results or only written up significant findings.
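The small-sample mechanism in (2/7) can be sketched with a quick power calculation. This is an illustrative sketch, not from the paper: the function below approximates the smallest standardized effect (Cohen's d) a two-group comparison can reliably detect, and the sample sizes are made up for illustration.

```python
import math

def min_detectable_d(n_per_group, z_alpha=1.96, z_power=0.84):
    """Approximate smallest Cohen's d detectable in a two-group test
    at alpha = .05 (two-sided) with 80% power (normal approximation)."""
    return (z_alpha + z_power) * math.sqrt(2.0 / n_per_group)

for n in (25, 100, 1000):
    # Small samples can only "find" large effects, e.g. n=25 needs d ≈ 0.79.
    print(n, round(min_detectable_d(n), 2))
```

If only significant results get written up, small studies enter the literature only when their effects are huge, which is exactly the negative sample size–effect size correlation described above.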
(1/9) Our new article on police-related fear (with @agrahamphd) shows that Black and White Americans live in different emotional worlds. This thread summarizes our findings.
(2/9) Most Black Americans live in fear of the police mistreating them. Few White Americans do so. In fact, the modal responses among Black and White respondents to most of the fear questions are mirror opposites: “very afraid” versus “very unafraid.”
(3/9) There is also a pronounced racial divide in altruistic fear—in worrying about the police hurting other people. For example, a majority (51%) of Black respondents worry “often” or “very often” about their family members being hurt by the police compared with 9% of Whites.
(1/5) New preprint examining personal and altruistic fear of the police (with @agrahamphd). Although most Whites are unafraid, most Blacks are more afraid of the police than of crime—58% fear being killed by the police vs. 34% by criminals.
(2/5) Altruistic fear of the police is also much higher among Blacks than Whites. For example, a majority (51%) of Blacks worry “often” or “very often” about their family members being hurt by the police, compared to only 9% of Whites.
(3/5) The racial divide in both personal and altruistic fear is partially mediated by past experiences (personal and vicarious) with police mistreatment. Blacks report experiencing far more police mistreatment than Whites (d = .76, t = 11.18, p < .001).