My next session is “Like it or Not? Survey Recruitment and Data Collection using Social Media”, presented by @trentbuskirk. #AAPOR
Trent starts by saying this is a pilot study designed to reach individuals with professional experience in privacy. He begins with a literature review of using social media as a recruitment method. #AAPOR
The sample was partially sourced from Twitter users who tweeted certain privacy-related keywords. A random sample of 1680 users was identified. #AAPOR
The timeframe for the study was 2020 to 2022. The target was users with fewer than 1000 followers. The experimental conditions: DM directly, follow then DM, or wait to DM until the user followed back. #AAPOR
Next Trent discusses the phases of the sample flow. 8399 users were first identified. Of those, only 7358 could be direct messaged; 5046 messages were sent but only 2137 were delivered, and 97 surveys were started. #AAPOR
The final response rate was about 1%, which was lower than expected. But there were lessons learned that could be applicable to other studies. #AAPOR
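Trent’s funnel works out to roughly the 1% he reported. A quick sketch of the arithmetic (stage names are my paraphrase of the slide; counts are the numbers as tweeted):

```python
# Recruitment funnel as reported in the talk; each rate is relative
# to the initially identified pool.
funnel = [
    ("identified", 8399),
    ("could be DMed", 7358),
    ("messages sent", 5046),
    ("messages delivered", 2137),
    ("surveys started", 97),
]
total = funnel[0][1]
for stage, n in funnel:
    print(f"{stage:>18}: {n:5d} ({n / total:5.1%})")
```

The last line works out to about 1.2% of the identified pool, consistent with the roughly 1% Trent reported.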
The next presentation is “Use of Social Media Advertisements to Increase Panel Recruitment and Survey Participation”, presented by Jon Schreiner from @Gallup. #AAPOR
Jon starts by pointing out that traditional survey recruitment response rates are declining, while social media is a place where many people spend time. Jon says this presentation will cover how to use social media advertising to recruit respondents. #AAPOR
Jon says the first step is to define the population and who to target. Then design the ad. Then decide how to target. There are two options: social matching, to match a name to a social media profile, or geofencing, to target people in a certain area. #AAPOR
The last step was to consult legal to make sure any recruitment complies with the law, as states such as California have special requirements. Next Jon describes an experiment showing an ad to increase awareness prior to sending a survey invitation via mail. #AAPOR
The first case study was to enhance survey recruitment via geofencing. There were about 14000 addresses and about 7800; 27k devices saw ads, with 121 clicks. #AAPOR
Overall, the first case study did not find an increase, but among people 18-24 there was a significant increase. #AAPOR
The second case study attempted to recruit young people using social matching. There was a 31% social match rate, with 31k ad displays and 96 clicks. Overall there was no difference, but in the youngest age group the increase was significant. #AAPOR
Next Jon provides some conclusions. In this case the experiments were not very successful, but this may be something that improves over time. #AAPOR
Next presentation is “Using Social Media for Experimental Research: A Case Study” presented by Ting Yan. Ting states her talk is focused on using social media for experiments. The study is sponsored by the FDA and compares Instagram recruitment vs a non-prob panel. #AAPOR
The two samples used different fictional drugs. The Instagram arm used an influencer to promote the survey, and the Instagram group had to be familiar with the influencer. The non-probability sample was a quota sample, but those respondents needed to be familiar with an endorser. #AAPOR
The Instagram sample was smaller, with a sample size of 140, while the non-probability sample size was about 440. The samples had different demographic compositions. #AAPOR
The first analysis assessed the effects of paid endorsements. The Instagram results did not always match the non-prob sample, and there were differences in sample composition across the samples. #AAPOR
Ting mentions that a challenge is that the Instagram algorithm could not be controlled, which limited the reach of the posts. #AAPOR
The next presentation is “Do You Have Two Minutes to Talk about Your Data? Using Data Donation to Collect Facebook Data” by @floriankeusch
Florian starts by defining digital trace data as records of activity on online devices. Digital traces provide information that surveys struggle to capture, such as smartphone use or online media consumption. #AAPOR
Florian discusses two ways meter data is collected. The first is users installing software that collects and reports the digital trace data. The second is data donation, where a respondent downloads their data from a platform and donates it to a researcher. #AAPOR
Florian mentions one benefit of data donation is that the data can be anonymized. Additionally the data available from the platform can have information not covered by traditional meters and panelists have more control over what is shared. #AAPOR
Florian now details the data collection, which was a web survey on a non-prob panel focused on Facebook. The requested info was account information and topics. Participants had to contribute the data on a PC. #AAPOR
79% of respondents were willing to donate, and Facebook use did not affect the rate of donation. The survey walked the respondents through how to upload the data. #AAPOR
48% of survey respondents who were willing to donate actually donated something that could be matched to the survey. Not all donations could be used. #AAPOR
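Combining the two rates Florian reported (79% willing; 48% of the willing donating matchable data) gives the overall usable-donation rate. This is my back-of-envelope, not a figure from the talk:

```python
willing = 0.79                # share of respondents willing to donate
donated_given_willing = 0.48  # share of willing respondents with a matchable donation
overall = willing * donated_given_willing
print(f"~{overall:.0%} of all respondents produced a usable donation")  # ~38%
```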
The next presentation is “Investigating Digital Advertising and Online Self-Response” presented by Brett Moran @uscensusbureau
Brett states this presentation discusses the impact of digital advertising on increasing online self-response in the 2020 Census. The Census Bureau ran online ads to encourage people to complete the Census. #AAPOR
The various referral sources had different links which allowed tracking of where people came from prior to going to the internet self-response site. #AAPOR
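The per-source links Brett describes are essentially campaign-tagged URLs. A minimal sketch of the idea (the base URL and parameter name here are my invention, not the Census Bureau’s actual scheme):

```python
from urllib.parse import urlencode

# Hypothetical self-response URL; each referral source gets its own link
# so arrivals can be attributed to mailers, digital ads, etc.
BASE = "https://example.gov/respond"

def tagged_url(source: str) -> str:
    return f"{BASE}?{urlencode({'src': source})}"

print(tagged_url("mailer"))      # https://example.gov/respond?src=mailer
print(tagged_url("digital_ad"))  # https://example.gov/respond?src=digital_ad
```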
Next Brett shows the breakdown of referral sources. The most common source was the mailers (35%) and the second most common source was digital ads (23%). #AAPOR
Another takeaway for me from this session is how weird I am, because I take way too many surveys I get invited to via ads. I know I’m weird, but I didn’t realize I’m doing something <0.1% of people do. Although I also respond to mail and phone surveys.
My next session at #AAPOR is “Things that Divide Us: Ideology, Identification, and Information”. The first talk is “Fake News Interventions: Effective for Both Strong and Weakly Identified Partisans?” presented by Joseph Sandor.
Joseph first explains that while people have a motivation to share accurate information on social media, sometimes people share things that are inaccurate. Joseph is presenting the results of an experiment where users rated a headline’s accuracy before sharing. #AAPOR
Next Joseph is walking through the literature on why people share fake news and how sometimes people are motivated to share fake news for political reasons. #AAPOR
I’m excited to be at #aapor this year. The first session I’m attending (and live tweeting) is “Assessing the Polls: Measuring Bias and Vote Choice”.
The first presentation is “It's Not Personal: Evaluating the Impact of Asking for Voters By Name” presented by Travis Brodbeck and Madeline Harland of @SienaResearch.
Siena College polls have been some of the most accurate and are based on the L2 voter file plus a likely voter model to determine the likelihood someone will vote. #AAPOR
I’ve looked into statistics about voter fraud and I find it hard to believe there is the necessary level of voter fraud to flip this election. If Biden wins AZ and GA you have to overturn the results of 37 electors. The easiest way to do that is to contest GA, AZ, PA.
Biden is ahead by this many votes in these states:
GA: 12338
AZ: 14746
PA: 45103
Total: 72187
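A quick check of that total (unofficial counts as tweeted above):

```python
# Biden's lead in each contested state, per the thread's unofficial numbers.
margins = {"GA": 12_338, "AZ": 14_746, "PA": 45_103}
print(sum(margins.values()))  # 72187
```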
Now these aren’t official numbers, but bear with me here. So we need about 72k fraudulent votes, all for Biden, to flip the election. That number needs to be put in perspective.
A Heritage Foundation study found 1298 proven cases of voter fraud going back to 1982 in all different types of elections. This includes fake registration that may not have resulted in voting. There may be unproven cases not counted. heritage.org/voterfraud/#ch…