American movies and TV seem to be doing a worse job of telling the stories of everyday Americans these days than in past decades...
Of course, these movies and shows weren't really realistic; they were fantasies. But their focus was on average folks in realistic situations.
So maybe it's just me, and there's no actual trend there.
And I want to see their stories get told.