John Warner
Jun 17, 2021 · 16 tweets
For what it's worth, some additional context about a survey that the viewpoint diversity crowd is crapping themselves over this evening. Took me all of 10 minutes to look a little harder at the data and see how, as always, things are more complicated than they first appear.
First, this survey is produced by an NDSU university center that's funded largely by Koch money. There are dozens of these around the country. It's not nefarious, but they have a clear agenda behind their work. ndsu.edu/news/view/deta…
Second regarding that question about how students would respond to "offensive" speech, it's designed to be as vague as possible. There's no definition of what "offensive" is. It's purely in the eye of the beholder. An intentionally bad question.
If the survey were truly interested in understanding what's behind these attitudes, it would ask a simple open ended question of those who would report a professor for offensive speech. "Can you give an example of offensive speech that should be reported?"
But the survey doesn't ask that question, so we'll never know what students are thinking when they answer yes or no to reporting offensive speech. Lost opportunity there. Lost on purpose because it wouldn't be useful to the narrative.
There's all kinds of findings even in this dubious survey that tell a different story. For ex. the vast majority of students say "uncomfortable" material is fine.
An overwhelming majority of students support including readings they disagree with.
Vast majority of students don't think invited speakers should be withdrawn if students disagree with the speaker's views.
Here's a really interesting wrinkle. Students who believe that professors create a climate open to diverse views are MORE likely to say that offensive speech by a fellow student should be reported. Not sure how to interpret this, but it cuts against the snowflake narrative.
Another interesting tidbit. Liberal and Conservative students are nearly identical in believing things have gotten better over the last 50 years. I guess the libs haven't been taught to hate the country.
Liberal and Conservative students are nearly identical in the effect college has on their view of the future.
Major source of student optimism is their professors. Little difference by political affiliation.
Liberal students are more enthusiastic about the role of entrepreneurs than conservative students.
This should have the Kochs quaking in their boots. A full 1/3 of conservative students don't think capitalism can solve major societal problems.
Lastly, the sample is not particularly representative. White students are overrepresented (60%, should be about 50%), as are private school students (38%, should be about 25%).
It's a shame this survey was written by bought and sold hacks because with a little more curiosity and care, they really could've told us something interesting. As is, it's still worth puzzling over, rather than jerking our knees at a single question/answer.

More from @biblioracle

Jun 3
The study cited here is a perfect example of the horrible turn we made in teaching students to write. Whether or not ChatGPT can give comparable feedback to humans is immaterial when the underlying activity is so meaningless. We have to examine this stuff with more depth.
The study is a perfect example of what I call "academic cosplay" activities which resemble learning, but which really involve a performance of "schooling." If the goal is to learn to write, the assignment is meaningless, the feedback (both human and ChatGPT) is meaningless.
Check out the example. This is not a prompt that has any relationship to judging student writing ability. Writing involves acting inside a rhetorical situation. This is using sentences to judge subject comprehension. Perfectly fine, but it has nothing to do with learning to write.
Apr 26
Growing more and more convinced that if you're going to find any use for LLMs as an aid to writing, you need a robust and flexible writing practice that works without using AI. Suggesting that students integrate AI into their processes without these conditions is a flawed approach.
A huge complicating factor, as articulated by @Marc__Watkins is that these tools now just show up asking to be used without being actively summoned. Students quite sensibly see them as a normalized part of the writing process.
The result, and I think this is true of many if not most of the instructor experiments I've seen in integrating AI, is that the tool is used to complete an assignment without consideration for what is learned and built in doing so. This is not an unusual behavior when students...
Mar 26
I am increasingly distressed at how generative AI is being treated as a presence in student writing. To meet this huge challenge we have to get at the roots of what we want students to do and why. Thread of what I've been thinking & an invitation for other thoughts.
First, I believe that the primary byproduct of ChatGPT is to show us that much of the work students do in school is probably not worth doing if the goal is helping students learn. If ChatGPT can do it, it's not worth doing. insidehighered.com/opinion/blogs/…
Next, I believe generative AI has made it even more important to make students work at their writing. The key is to make sure they're actually writing, not just doing academic cosplay that passes for the purpose of having something to grade. insidehighered.com/opinion/blogs/…
Mar 23
If writing is thinking (as I believe it to be) the last thing you want to outsource to AI is the first draft because that's where the initial gathering of thoughts happens. It's why first drafts are hard. That difficulty signals their importance.
When I see ed tech people say that students can use AI to "get over the hump" of the first draft I know that they understand nothing about learning to write. They've invited students to abandon the most important part of the experience when it comes to learning.
Writing is not just a process by which to produce a text. It is a process through which we think, feel, communicate, and learn. Outsourcing the production of text to something that can do none of those things is really just saying none of that stuff matters. The output is all.
Dec 13, 2023
Propositions I'm pretty sure are true vis-à-vis smart phones and schools. A list until I run out of thoughts.

1. Smart phones are uniquely distracting, and for some (maybe many) a compulsion.
2. Students have always been distracted and disengaged in schools.
3. Increases in student disengagement with school pre-date the arrival of smart phones.
4. Some phone behaviors have negative consequences for student mental health.
5. Some phone behaviors have positive consequences for student mental health.
6. Phones can have positive utility as tools in the classroom.
7. The utility of the phone itself, as opposed to an internet-connected computer, is pretty limited.
8. Handy access to a phone makes it harder to resist a reflexive desire to "check" the phone.
Sep 16, 2023
Every time GPT "passes" some kind of test as compared to humans it mostly reveals the test is bad and we should rethink the human activity, rather than outsourcing the bad thing to GPT. That GPT can replicate generic feedback on student writing is unsurprising and not useful.
There was something where GPT aced an MBA without prompting or training, and that should've just alerted everyone to the academic cosplay that is much of MBA studies. That degree was invented in order to extract money from people already in white collar jobs.
Of course GPT can do fill-in-the-blank feedback from a rubric. It's been trained on reams of writing that fits those criteria. But that writing IS NOT GOOD. We have to get beyond students doing writing imitations for school and let them write as writers do.
