I talk a lot about the problems in both the graduate school experience and the academic job market, so I thought it would be worthwhile to shout out how my own PhD department, @UNChistory, is taking what I view as the right steps to respond as a department.
(Obligatory note that I also have a temporary adjunct gig in the department, but did not have any say in these changes).
What @UNChistory is doing - I can't give any exact numbers for obvious reasons - is cutting back grad admissions and using the savings to raise stipends.
That's tough for a lot of departments to do, because big graduate programs are a component of dept. prestige. Universities often don't like it for the same reasons.
But it's the right thing to do. While departments can't solve the jobs crisis, this is the one thing they can do.
It gets at the twin problems: on the one hand, graduate stipends are so low that graduate students face poverty-like conditions, which has all sorts of negative impacts (described here: acoup.blog/2021/10/01/col…)
On the other hand, the number of jobs in most fields in the humanities has utterly cratered, leaving programs that were well balanced for the job market of 2007 producing far, far too many PhDs for the job market of literally any year since then.
That produces a job market that is almost totally random, a hiring environment for many adjuncts that is also very bad, and just a lot of shattered dreams for people who committed almost a decade to a field.
So cutting back the size of incoming grad classes and using the extra money to raise stipends (albeit only modestly) is really the only responsible course for departments.
And I'm glad to see that @UNChistory is giving it a try; I'd love to hear of other programs doing the same.
Ideally, looking at the AHA's jobs data, what we'd need is for the entire field to cut graduate admissions by about half to bring new PhDs in line with jobs and give a little slack to allow the 2008-present glut to be absorbed: historians.org/ahajobsreport2…
This is a topic where I think pressure and guidance from the professional associations in various fields - e.g. @AHAhistorians, @scsclassics, @archaeology_aia - would serve a valuable role in getting entire fields to move together, given the collective action problem.
Of course that would require those professional associations to actually grapple with the scale of the problem, something that, from both attending and hearing about their annual meetings, they mostly seem as yet unwilling to do.
So at this stage it falls to individual departments to be the first movers here, and it's good to see at least one (and I hope more?) moving in this direction.
(I should note, I suppose, both that, again, I'm currently teaching for @UNChistory, which is why I knew about this, and also that I checked with the department before, you know, tweeting out their policy changes.)
Also, if you want to study Roman history at UNC like I did...well, you can't. They currently have no permanent Roman historian. But they are trying to raise money to hire one (won't be me, no self-dealing here), so if you have a spare few million dollars...history.unc.edu/the-richard-ta…
So for one, if Putin is going to own the libs by not invading Ukraine, wow, yes, I am totally owned. Pwned, even.
But seriously if Russia actually withdraws the troops from the border, that is a huge win for NATO and Biden should do a giant victory lap w/ other NATO leaders. 1/8
Figure it this way: assuming Russia is backing down, there are really two possibilities here.
Possibility one: Putin was bluffing. He moved forces to the border and made threatening noises (and then followed up with demands) in the hopes NATO would blink... 2/8
...or that the stress would divide NATO. It didn't work. Allies mostly handled it well - sure, some posturing from all of the majors (inc. USA) but no concessions, no fatal split.
If this was a bluff, Putin got called on it and folded. Embarrassing. 3/8
So, thinking a bit about choice in historical video games: we've got a fair bit of evidence that most players - like 90%+ - when given a choice, play as the 'good' character.
'Evil' gameplay choices thus mostly exist to give weight and consequence to the 'good' choices.
I think that puts a burden on developers to either 1) make it really clear why 'evil' options were chosen (@PdxInteractive is, I think, pretty good at this) or 2) not hide all of the historical cruelty behind 'evil' choices no one is going to take.
If your game has the player doing imperialism, you can't have the character of that imperialism depend on their choices, because most players are going to choose the 'good' option, and thus you get a game that presents a relatively benign imperialism, a thing which didn't really happen.
I actually think we agree on more than the article lets on - a lot of what Harper and Nagl are laying out strikes me as modern operationalization... 1/8
...of the 'socially embedded' option. As I laid out in my own piece, past examples suggest two choices when raising 'auxiliary' forces: either total deracination or learning to work within existing social institutions. 2/8
In arguing against military 'helicopter parenting,' Harper and Nagl seem to me to be saying that the more independent, socially embedded system - the way Rome treated the armies of Pergamum or even the Italian allies - is the way to go.
You know, I enjoy playing Hardspace: Shipbreaker as a sort of chill, zen experience.
But it really probably is an issue, storywise, that a game about being an exploited laborer in an ultra-capitalist dystopia is fun to actually play.
There's also a real tension between a story about being debt-trapped in an unsafe, brutal job and the gameplay fact that you can 'get gud' to safely navigate all of these hazards and also consequently steadily work down that massive debt.
The 'interest on debt' floor is 500k credits per work shift, but I regularly do 1.5-2m credits of salvaging per shift - sure, the debt is made comically huge to prevent it being zeroed out, but if each shift is a day, the implication is you could pay it off in about 4 years.
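To sanity-check that math, here's a minimal back-of-the-envelope sketch in Python. The per-shift figures come from the tweet above, the one-shift-per-day pacing is the same assumption, and I'm deliberately not assuming an exact in-game debt figure:

```python
# Back-of-the-envelope check of the claim above. Per-shift figures are from
# the tweet; one shift per day is the tweet's own pacing assumption; no
# exact in-game debt figure is assumed here.
interest_floor = 500_000                           # credits of interest per shift
salvage_low, salvage_high = 1_500_000, 2_000_000   # typical salvage range per shift

net_low = salvage_low - interest_floor             # 1.0m credits/shift after interest
net_high = salvage_high - interest_floor           # 1.5m credits/shift after interest

shifts = 4 * 365                                   # four years of daily shifts
print(f"debt paid down in ~4 years: "
      f"{shifts * net_low / 1e9:.2f}-{shifts * net_high / 1e9:.2f} billion credits")
# -> roughly 1.46-2.19 billion credits, i.e. 'about 4 years' implies a debt
#    of that comically huge order of magnitude.
```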
Today I've learned that if you tweet about Star Trek, one genre of replies you get is from folks who maybe don't know their Star Trek so well, but are really convinced someone must have made the whole thing internally consistent and scientifically rigorous.
And...I have bad news for those folks: Star Trek can't keep basic, plot-essential things like beaming through shields or how many shuttles Voyager has straight.
The 'kill' setting on phasers was variously 1/4, 3, and 10, because they couldn't keep that straight either.
Consistency was *never* a priority on the classic Star Trek shows.
Also the science was more or less entirely made up. Like, the scarce warp fuel they use is 'deuterium' - literally just a heavy isotope of hydrogen, the most common element in the universe.
One of my sci-fi pet peeves? Weapons that work like a magic spell that either takes effect or doesn't; Star Trek's phasers are a frequent offender.
What I mean is the sort where 'oh no, we fired our techtech beam, but they had their techtech shield up, so it did *nothing*.' 1/13
Actual weapons nearly all work by delivering some amount of energy to a target; there are some exceptions (chemical and biological weapons come to mind), but a sword, a javelin, a rifle and a nuclear bomb all work by delivering energy, either as kinetic energy or heat. 2/13
Even if those weapons fail to defeat a target's defensive systems (armor, whatever), they still deliver that energy, which, thermodynamics being what it is, has to go somewhere. It might degrade armor, or knock the target around a bit, or cause collateral damage. 3/13
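To make that energy bookkeeping concrete, here's a minimal sketch in Python; the masses and velocities are rough illustrative guesses of mine, not sourced specifications:

```python
# A minimal sketch of the point: wildly different weapons, same underlying
# quantity. Masses and velocities are rough illustrative guesses.
def kinetic_energy_joules(mass_kg: float, velocity_ms: float) -> float:
    """Kinetic energy: E = 1/2 * m * v^2."""
    return 0.5 * mass_kg * velocity_ms ** 2

examples = {
    "sword cut (~1 kg blade at ~20 m/s)":  kinetic_energy_joules(1.0, 20.0),
    "javelin (~0.8 kg at ~25 m/s)":        kinetic_energy_joules(0.8, 25.0),
    "rifle bullet (~4 g at ~900 m/s)":     kinetic_energy_joules(0.004, 900.0),
}
for weapon, energy in examples.items():
    print(f"{weapon}: ~{energy:,.0f} J delivered")
# Armor may keep that energy from wounding the target, but it can't make the
# energy vanish: it goes into deforming the armor, heat, or knocking the
# target around.
```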