Pollsters should stand by their results, and journalists should help readers understand the broader context of the data they're providing. These are good examples of simultaneously doing both.
Polling errors are indeed real! They could favor either Trump or Biden! If they do favor Trump, it's still probably not because of social desirability bias!
I do not understand why this is the one concept onto which the Discourse has latched like a remora.
New HuffPost/YouGov poll: Here's what voters say their top issues are for this year's election -- and how they compare to what they think the two campaigns care about.
I get into this more in the story, but: there are a lot of reasons to use caution when analyzing top-issue polls. One thing I DO think they can be useful for is gauging which campaign messages are resonating or not.
We've been asking this question biennially since 2014. A few things that stand out to me this year:
-Immigration is WAY down in salience from 2018
-Trump's messaging on crime is just not resonating, even with his supporters
(I really dislike chatty first-person-plural ledes that make casual, sweeping assumptions about their readership)
Also interesting: "It might seem like paying close attention to this year’s Presidential election would lead to higher levels of stress, but paying close attention to the race is actually correlated with lower levels of perceived stress."
New HuffPost/YouGov poll: Most voters think social media platforms have a responsibility to prevent users from spreading conspiracy theories/false information. Big partisan gap, but also an age divide.
Voters are pretty divided on whether it's a good or bad thing for elected officials to be on social media, but they're more likely to think Trump's tweets hurt than help him. huffpost.com/entry/poll-fac…
I realize this is beating a dead horse, but...SSRS, CNN's pollster, also conducts some of its polling online (its post-debate poll was conducted by phone, but included call-backs of voters originally reached through SSRS's panel). "Online" is not the problem here!
I realize this seems nitpicky, but I think it runs the risk of being genuinely confusing as the industry moves toward different modes. Gallup is doing online polling. Pew is doing online polling. AP is doing online polling. Etc.