While some public opinion polling is still done the old-fashioned way – calling a home on a landline telephone or asking folks their opinions in face-to-face interviews – there’s a whole new, high-tech polling world out there, as there is in so many other professions and businesses.
Dan Williams, director of data services for Western Wats, says polling today is as accurate as in the past, or even more so, if – and this is the important part – surveys are conducted correctly, with the proper safeguards.
But with the ease of technology – especially the Internet and automated dialing – shortcuts that save money for the poll purchaser can also lead to less-than-accurate results.
The new catch phrase in inexpensive polling is IVR – interactive voice response.
You may have gotten some of these polls on your home phone or even your cell phone. A recorded voice asks you some questions and you press a number on your keypad to respond. Thousands and thousands of these calls can go out at a time, and even though a smaller share of folks respond rather than just hang up, the larger number of out-calls can yield about the same sample size as more traditional person-to-person telephone questioning.
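The trade-off is simple arithmetic: a lower response rate is offset by a higher call volume. A minimal sketch, using hypothetical response rates purely for illustration (the article gives no figures):

```python
import math

def calls_needed(target_completes, response_rate):
    """Outbound calls required to reach a target number of completed
    interviews, given the share of call recipients who respond."""
    return math.ceil(target_completes / response_rate)

# Hypothetical rates for illustration only.
print(calls_needed(500, 0.05))   # automated IVR, few respond -> 10000
print(calls_needed(500, 0.30))   # live interviewer -> 1667
```

Under these assumed rates, an automated system must dial roughly six times as many numbers to finish the same 500 interviews – cheap per call, expensive in volume.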
Williams says Western Wats can, and does, do some IVR work. “There are some appropriate circumstances” for IVRs in data collection, but too often low-cost survey firms are using IVRs in the wrong ways – especially in political/candidate polling, he says. The most obvious problem with IVRs is a minor child answering the phone and pretending to be a voting-age adult in answering the questions.
But there are more subtle problems, too. “There’s the intensity in the response,” says Williams.
For example, let’s say on an IVR poll two folks punch the keypad saying they are “very likely” to vote. But in a person-to-person call, one respondent may say: “If I have a broken back I’ll crawl to the polls to vote against that jerk incumbent.” The other respondent may say: “I don’t know the candidates or issues well, but, yeah, I’ll likely vote, and not vote for the incumbent, whoever that is.”
Both register as the same response in the IVR, but the person-to-person responses reveal very different intensity and motivation, and a very different real likelihood of voting, or of voting against the incumbent.
And just because a poll is conducted for a well-known media source, non-profit or government entity does not on its face mean the survey was properly conducted, he warns.
Western Wats is, for lack of a better term, one of the leading and largest “data gathering” wholesalers in the nation.
In fact, while any number of relatively small polling organizations may have higher name identification, in reality they may well be getting their raw data collection from Western Wats. “All kinds of firms, including polling firms, come to us to do the data collection,” says Williams. WW had 5.8 million completed questionnaires last year, making up thousands of individual polling surveys.
WW may charge a retail polling firm $5,000 for a low-cost survey, with the firm turning around and charging the ultimate customer $10,000 for the whole poll. WW does full-service, retail polling as well – no middleman.
WW still does polling the old-fashioned way, calling houses on landlines. But more and more, those answering their home phones don’t want to take a 10- to 15-minute survey. And they are also more likely not to want to answer any questions at all, says Williams.
In addition, larger segments of American society don’t have landlines, relying only on cell phones or even using their computers as telephones.
While it is possible to get cell phone numbers, says Williams, pollsters have to worry not only about cell users’ reluctance even to take their polls, but also about the safety of those answering the questions (you don’t want to ask thought-provoking questions of someone driving a car, for example). And you worry whether the person on the other end of the cell call can give the questions the concentration a proper sample requires.
So, since turnabout is fair play, many pollsters, including Western Wats, are using the disruptive force – the Internet – as part of the solution.
Pollsters are starting to build large data banks of web users. Through various means, people are invited to come to pollster sites and sign up to be constant, or regular, poll takers. It is kind of like signing up to be in the jury pool – while some certainly don’t want to be bothered with taking a poll, others seem to like it.
And – this is America after all – you can “earn” points by being in the poll pool and taking surveys online. The points can be redeemed in purchases on Amazon.com or other entities, explained Williams.
Of course, online you fill out an extensive demographic questionnaire, just as telephone poll takers answer questions about their political preferences, income, education, race, religion and so on – data the pollster needs to get a good sampling of the public in the geographic area being polled.
But there’s more. Pollsters are also using information gathered from some cell phone users and mixing that in with landline polling or even online polling – all in an effort to get a truer picture of the demographic group or geographic area.
For example, if you are trying to find out how often folks in their 20s go to the movies, and what kind of movies they like and what time or day they like to go to movies, you clearly have a problem – many of the folks you are trying to survey don’t have land lines, only cell phones.
If you are trying to poll senior citizens, you probably can’t use online polling, since many seniors don’t have computers or don’t often get on the web even if they do have computers.
For political junkies – like those reading this story – there are other concerns – a chief one being just how accurate are the issue and candidate head-to-head surveys that newspapers and TV stations love to publish?
Williams and Kelly Patterson, a Brigham Young University political science professor, say political polls can be very accurate – within their accepted margin of error.
Patterson is director of BYU’s Center for the Study of Elections and Democracy. BYU has for years operated a student-based election exit poll (called the Utah Voter Poll) that, over time, has proven itself to be one of the most accurate data collection operations run in the state.
The center is also involved in a number of collaborative exit polling efforts with other universities and groups across the nation.
“Utah is a great place” to conduct exit polling – the questioning of voters as they leave their polling places – “because people here are so open and willing” to answer questions, says Patterson.
That is especially true if those asking the questions are college students – not professional poll-takers – and identify themselves as students at local colleges and universities. “Those being questioned see it as helping out a college student,” says Patterson. “And they are generally glad to do it – although we have over time seen more and more” voters unwilling to answer questions.
“The new problem for us is early voting,” says Patterson. One fourth to one third of Utahns don’t go to the polls on Election Day, but vote early either through mail-in ballots or at regional polling places. And that early voting percentage will likely rise as time goes on – especially if some kind of online voting is allowed. (Some Utah legislators are promising to pass an online voting law in the 2011 Legislature.)
Patterson says there are still ways to accumulate early-voting data, but it’s tough and time-consuming.
“We have access to the early voting rolls; we know who has voted early,” he says. His group can then send a written questionnaire to that person’s home, and they can either fill it out and mail it back or get online and respond to the questionnaire.
In Utah, the Democratic Party has been proactive in getting its members to vote early. Republican Party officials are now trying to catch up in that effort – leaders of both parties believe early voting helps their cause because it means fewer party loyalists they have to get to the polls on Election Day.
Patterson says that properly tracking what early voters are doing is a real concern for exit polling accuracy. But having the members of one party voting early more than another party doesn’t in itself harm his efforts – for early voters usually cast ballots in the same manner as Election-Day voters, he says.
“But in close elections,” early voting can turn the tide one way or the other – and that is the concern in properly predicting winners and losers at 8 p.m. when the polls close.
If early-voting predictions can’t keep up with same-day voting, through technology or other methods, you may see pollsters less willing to call a close contest on election night, he believes.
“Generally, we don’t feel comfortable predicting an outcome if the sampling is within the margin of error,” he says.
In fact, Williams and Patterson say most election/candidate polling is accurate within the margin of error. But if a poll gets a final result wrong, even while remaining within the margin of error, most often the public doesn’t realize or pay attention to that.
It’s just remembered that pollster so-and-so said this candidate or ballot issue would lose, when in fact he or it won in a close vote.
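For readers who want the arithmetic behind “within the margin of error,” the standard formula for a simple random sample at 95 percent confidence is general polling math, not a calculation attributed to Williams or Patterson; a minimal sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case (p = 0.5) margin of error for a simple random
    sample of n respondents at 95 percent confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (400, 600, 1000):
    print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} points")
```

Under these assumptions, samples of 400, 600 and 1,000 respondents give margins of roughly plus or minus 4.9, 4.0 and 3.1 points – which is why a race sitting inside that band is so hard to call on election night.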
Finally, it seems every election year someone yells loudly that his opponent, or the opponent’s political party, is conducting “push polls.”
But, in fact, few true push polls are conducted, either in Utah or nationally, says Williams.
A push poll is a campaign tactic aimed at harming one’s opponent. Push polls are not accurate data-collection methods at all – and are not meant to be.
The difference between a push poll and a legitimate message-testing poll is clear to the trained eye, but perhaps not to the average voter, Williams notes.
A push poll is, by nature, short. It is only two or three questions or even just plain statements. It is highly negative, like “would it make a difference in your voting if you knew Candidate A beat his wife and abused his children?” Tens of thousands of people are called or contacted for a push poll, because the goal is to swing public opinion. And you need big numbers to do that.
A message-testing poll is small, with a sample of just 400 or 600. It’s longer, like a regular poll – 10 or 15 minutes, more than a dozen questions. It could ask if you would change your mind about a candidate if it were known that he had been twice divorced, had a bankruptcy, had been charged with a crime, or had been a lobbyist or investment banker.
But, says Williams, anyone troubled by a poll question can ask who is conducting the poll. If the caller doesn’t know – and for bias concerns, sometimes the person making the call really doesn’t know who is paying for the poll – a supervisor can be called to the phone to tell the person being questioned. And if the supervisor won’t, or can’t, say who the poll is for, the person being questioned should be skeptical of the whole situation.
“There are professional guidelines that legitimate pollsters follow,” says Williams. And Western Wats and many other members of the professional pollster association won’t conduct push polls.