Trump’s Claims About Illegal Votes Are Nonsense. I Debunked the Study He Cites as ‘Evidence.’

From Politico.com: Brian Schaffner is a political science professor at the University of Massachusetts Amherst and the founding director of the UMass Poll.

Donald Trump is making news with his false claim that he would have won the national popular vote if millions of non-citizens had not voted in November. As evidence, he and his staff are pointing to a study by Jesse Richman and his co-authors that was published in the journal Electoral Studies and advertised on the Washington Post’s Monkey Cage blog. As a member of the team that produces the datasets upon which that study was based and as the co-author of an article published in the same journal that provides a clear “take down” of the study in question, I can say unequivocally that this research is not only wrong, it is irresponsible social science and should never have been published in the first place. There is no evidence that non-citizens have voted in recent U.S. elections.

I first came across the Richman study in 2014 when I was sent a link to an article by the authors promoting their newly published work. Their chief claim, and the one that made headlines, was that as many as 14 percent of non-citizens living in the United States had cast votes in recent elections. As soon as I saw that figure, I knew it was almost certainly nonsense, but what was troubling was that the “evidence” the scholars were pointing to was from a survey that I coordinate along with my colleagues Stephen Ansolabehere of Harvard University and Samantha Luks from the survey research firm YouGov. The survey is the Cooperative Congressional Election Study—a project that interviews tens of thousands of respondents every election year about their views on politics. A wealth of excellent research has come from this dataset in the past decade, providing important insights about our political world. Unfortunately, the Richman study doesn’t fall into that category. It is bad research because it fails to understand basic facts about the data it uses.

Indeed, it took me and my colleagues only a few hours to figure out why the authors’ findings were wrong and to produce the evidence needed to prove as much. The authors were essentially basing their claims on two pieces of data associated with the large survey—a question that asks people whether they are citizens and official vote records to which each respondent has been matched to determine whether he or she had voted. Both of these pieces of information include small amounts of measurement error, as is true of all survey data. What the authors failed to consider is that this measurement error was entirely responsible for their results. In fact, once my colleagues and I accounted for that error, we found that there were essentially zero non-citizens who voted in recent elections.

The biggest source of error with the Richman study was its use of one of the survey questions to identify “non-citizens.” Survey respondents occasionally select the wrong response by accident—perhaps because they are rushing through and not reading the questions carefully, because they do not fully understand the terminology being used, or because they simply click on the wrong box on the page. Such errors are infrequent, but they happen in any survey. In this case, they were crucial, because Richman and his colleagues saw the very small number of people who answered that they were “immigrant non-citizens,” and extrapolated that (inaccurate) number to the U.S. population as a whole.

How do we know that some people give an inaccurate response to this question? Well, we actually took 19,000 respondents from one of the surveys that Richman used (the 2010 study) and interviewed them again in 2012. A total of 121 of the 19,000 respondents (0.64 percent) identified themselves as immigrant non-citizens when they first answered the survey in 2010. However, when asked the question again in 2012, 36 of the 121 selected a different response, indicating that they were citizens. Even more telling: 20 respondents identified themselves as citizens in 2010 but then in 2012 changed their answers to indicate that they were non-citizens. Since it is essentially impossible to go from being a citizen in 2010 to a non-citizen in 2012, this is even stronger evidence that some people were giving incorrect responses to this question for idiosyncratic reasons.
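
To make the scale of this response error concrete, here is a minimal arithmetic sketch using only the round numbers quoted above (the variable names are my own shorthand, not fields from the CCES):

```python
# Back-of-the-envelope check of the panel numbers quoted above.
# The four counts come straight from the paragraph; the rest is arithmetic.

panelists = 19_000         # respondents interviewed in both 2010 and 2012
noncit_2010 = 121          # said "immigrant non-citizen" in 2010
switched_to_citizen = 36   # of those 121, said they were citizens in 2012
citizen_to_noncit = 20     # said citizen in 2010 but non-citizen in 2012

print(f"Claimed non-citizen status in 2010: {noncit_2010 / panelists:.2%}")                      # ~0.64%
print(f"2010 'non-citizens' who changed their answer: {switched_to_citizen / noncit_2010:.1%}")  # ~30%

# Citizens vastly outnumber non-citizens, so even a tiny response-error rate
# among citizens produces a noticeable count of phantom "non-citizens".
citizens_2010 = panelists - noncit_2010
print(f"Implied citizen-to-non-citizen error rate: {citizen_to_noncit / citizens_2010:.2%}")     # ~0.11%
```

Roughly three in ten of the self-identified non-citizens changed their answer two years later, which is exactly the kind of low-frequency response error that can swamp an estimate built on such a small group.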

Because Richman was trying to extrapolate from a very small fraction of survey respondents, even these small amounts of measurement error could cause major problems for his analysis. To get a more valid estimate of non-citizen voting, we can look at the 85 respondents who said that they were non-citizens in both waves of the survey. Since this group answered the question the same way twice, we can be much more confident that they really are non-citizens. Among these 85 respondents, zero were matched to a valid vote record in 2010. That is, all of the non-citizen voters that Richman reports in his study for the 2010 election disappear once we account for measurement error.
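
To see why extrapolation from so small a group is fragile, here is a hypothetical sketch. Only the roughly 0.1 percent response-error rate echoes the panel evidence above; the sample size, true non-citizen share, and citizen turnout rate are illustrative assumptions of mine, not figures from the study or the survey:

```python
# Hypothetical illustration: how response error among citizens can masquerade
# as "non-citizen voting". Every input below is an assumption chosen for
# illustration except the ~0.1% error rate suggested by the panel re-interviews.

sample_size = 50_000           # assumed survey size
true_noncit_share = 0.005      # assumed true share of non-citizen respondents
citizen_turnout = 0.70         # assumed validated-vote rate among citizens
error_rate = 0.001             # ~0.1% of citizens tick the non-citizen box by mistake

citizens = sample_size * (1 - true_noncit_share)
misclassified = citizens * error_rate               # citizens recorded as non-citizens
true_noncitizens = sample_size * true_noncit_share  # assume none of them actually vote

apparent_noncitizens = true_noncitizens + misclassified
apparent_noncit_voters = misclassified * citizen_turnout

print(f"Apparent non-citizens in the data: {apparent_noncitizens:.0f}")                           # ~300
print(f"Of whom appear to have voted:      {apparent_noncit_voters:.0f}")                         # ~35
print(f"Apparent 'non-citizen turnout':    {apparent_noncit_voters / apparent_noncitizens:.1%}")  # ~12%
```

With these assumed inputs, every apparent “non-citizen voter” is actually a misclassified citizen, yet the naive calculation still reports a double-digit non-citizen turnout rate of the kind the Richman study extrapolated.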

In the 2012 election, we do find that one of these 85 non-citizens was matched to a vote record. However, given that this is just one individual out of 85, it is unlikely to be an actual non-citizen voter. One possibility is that this is a citizen who answered the citizenship question incorrectly twice. Another is that this individual was matched to the wrong vote record, which is another place where survey error comes in. When we match survey respondents to vote records, there is always some probability of making an incorrect match—that is, matching a respondent to a record that is actually somebody else’s. Even though the error rate is low, it could easily explain why we find a single voter in 2012 among 85 reported non-citizens.
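
As a rough illustration of the false-match point, a short sketch follows; the per-respondent match-error rates here are assumptions for illustration, not validated figures from the survey’s record-matching process:

```python
# Hypothetical illustration: even a small chance of linking a respondent to
# someone else's vote record makes one spurious "voter" among 85 people
# unremarkable. The error rates below are assumptions, not measured values.

def p_at_least_one_false_match(n_respondents: int, per_person_error: float) -> float:
    """Probability that at least one of n respondents is matched to the wrong record."""
    return 1 - (1 - per_person_error) ** n_respondents

for error in (0.005, 0.01, 0.02):
    prob = p_at_least_one_false_match(85, error)
    print(f"match error {error:.1%} -> P(at least one false match among 85) = {prob:.0%}")
```

Even at modest per-person error rates, seeing a single spurious match among 85 respondents is entirely plausible.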

In our article refuting the Richman study, we summarize our findings very plainly this way: “The results, we show, are completely accounted for by very low frequency measurement error; further, the likely percent of non-citizen voters in recent US elections is 0.” We are confident that such a conclusion would hold in 2016 as well.

Simply put, the claims Trump is making are false through and through. Fact checkers and major news organizations have consistently pointed to our study over the past few months to demonstrate that Trump’s claims are based on bad science, yet he continues to use this debunked information to demonize non-citizens as justification for his self-serving claims about voter fraud. Let’s hope the public stops paying attention.
