(I really dislike chatty first-person-plural ledes that make casual, sweeping assumptions about one's readership)
Also interesting: "It might seem like paying close attention to this year’s Presidential election would lead to higher levels of stress, but paying close attention to the race is actually correlated with lower levels of perceived stress."
Pollsters should stand by their results, and journalists should help readers understand the broader context of the data they're providing. These are good examples of doing both at once.
New HuffPost/YouGov poll: Here's what voters say their top issues are for this year's election -- and how they compare to what they think the two campaigns care about.
I get into this more in the story, but: there are a lot of reasons to use caution when analyzing top-issue polls. One thing I DO think they can be useful for is gauging which campaign messages are and aren't resonating.
We've been asking this question biennially since 2014. A few things that stand out to me this year:
- Immigration is WAY down in salience from 2018
- Trump's messaging on crime is just not resonating, even with his supporters
New HuffPost/YouGov poll: Most voters think social media platforms have a responsibility to prevent users from spreading conspiracy theories/false information. Big partisan gap, but also an age divide.
Voters are pretty divided on whether it's a good or bad thing for elected officials to be on social media, but they're more likely to think Trump's tweets hurt than help him. huffpost.com/entry/poll-fac…
I realize this is beating a dead horse, but... SSRS, CNN's pollster, also conducts some of their polling online (their post-debate poll was conducted by phone, but included call-backs of voters originally reached through SSRS's panel). "Online" is not the problem here!
I realize this seems nitpicky, but I think it runs the risk of being genuinely confusing as the industry moves toward different modes. Gallup is doing online polling. Pew is doing online polling. AP is doing online polling. Etc.
Twitter polls are not representative of what people thought of the debate and focus groups are not representative of what people thought of the debate -- I know you all know this, but I just realized I'm not going to get to say this again for ages.
no worries, I can still be a condescending scold about the SotU
While I'm at it: snap polls are a more-or-less reasonable gauge of how debate watchers reacted, but "debate watchers" is not the same sample universe as "the electorate."
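To make that last point concrete, here's a toy simulation -- not drawn from any actual poll, with every number invented purely for illustration -- of how the "debate watchers" universe can tilt away from the electorate as a whole if one side just happens to tune in more:

```python
import random

random.seed(0)

# Toy electorate: each voter has a party lean, a chance of watching the debate,
# and a hypothetical approval rate for Candidate A's performance.
# Every number below is made up for illustration; nothing comes from real data.
N = 100_000
WATCH_PROB = {"D": 0.7, "R": 0.4, "I": 0.3}    # assume one side tunes in more
APPROVE_PROB = {"D": 0.85, "R": 0.15, "I": 0.50}

electorate = []
for _ in range(N):
    lean = random.choice(["D", "R", "I"])
    watched = random.random() < WATCH_PROB[lean]
    approves = random.random() < APPROVE_PROB[lean]
    electorate.append((lean, watched, approves))

def approval(voters):
    """Share of a group that approves of Candidate A's debate performance."""
    return sum(approves for _, _, approves in voters) / len(voters)

watchers = [v for v in electorate if v[1]]
print(f"Approval among all voters:      {approval(electorate):.1%}")
print(f"Approval among debate watchers: {approval(watchers):.1%}")
# The two numbers diverge whenever watching correlates with anything related
# to the opinion being measured -- which is exactly why a snap poll of
# "debate watchers" isn't a read on "the electorate."
```

With these made-up inputs, the watcher-only number lands several points away from the full-electorate number, even though every individual response is recorded perfectly.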