Thanks to Insight Analytical for providing this amazing insider's view on the history of, and modern goings-on in, the world of polling.
IA is a former pollster and study director for companies such as Gallup, and even has some good inside scoop on Zogby. This is a must read for anyone interested in politics.
Scroll down for the latest update on Gallup’s new polling methods…
I’m SO HAPPY (snark) to report that the Zogby Interactive Survey has now re-appeared in my inbox! I am always curious about what’s being surveyed and how, because I have a “pollster past.” Over the last few months, the surveys had mysteriously disappeared. But now, with Obama the presumptive nominee, the folks at Zogby have decided to pay attention to me again. Perhaps they didn’t bother with me for months because I was one of those older women who weren’t coveted in their sampling.
Of course, online “interactive” surveys are subject to a big dose of suspicion. Signing up to participate is open to anyone and is ongoing, and Zogby picks respondents from this pool. A few times before the primaries began, I was contacted and occasionally told I wasn’t suitable for the survey after being asked a few screening questions. This is a normal part of polling, as clients sometimes need to focus on certain demographics. Which is why I noticed when, at the height of primary season, I wasn’t getting polled anymore. Was it my age, my sex, and the fact that I sometimes had answered that I was a moderate instead of a progressive or liberal? I never knew exactly how to answer that question because the terms were not defined as part of the question.
And therein lies one of the flaws of polling, whether respondent selection is deliberately skewed or not, and whether a poll is online or a supposedly “more reliable” telephone poll. If questions are poorly worded, unclear, or leave a lot of leeway for interpretation by the respondent, how accurate can the poll be?
Another problem area I’ve found with Zogby’s Interactive Survey is the omission of questions about a respondent’s experience with market research. Some polls ask right at the beginning whether a respondent has ever worked in a particular industry, to eliminate any bias that might affect a client’s survey; many pollsters specifically ask upfront if a respondent has ever worked in the market research field. Every time I receive a Zogby poll, I look to see whether they ask if I have. Even though I left the field ages ago, I still know a few tricks of the trade: I can see where a survey is going, and I can guess what type of client is polling and what emphasis they are looking for in the results.
So, am I capable of skewing a poll beyond just answering questions with choices that don’t really reflect my opinion? Sure I am. Especially when an interactive poll like Zogby’s allows you to go back and change answers once you’ve guessed who’s polling or the type of responses the poll is fishing for (and even allows the poll to be copied)! [I've done it for another piece I wrote on a related topic. I literally copied and saved the survey for later reference...see Pundits, Pollsters and Should We Be Getting Ready for the Next Play of the Race Card? (Clues in the latest Zogby survey I received?) (Updated 1X)]
Do I do this? Not deliberately. But if there’s a poorly worded question, or a question whose answers don’t allow me to REALLY GIVE my opinion, then I’m in the same box as every respondent to that poll…trying to do an honest job, but being led along by the pollster.
John Zogby’s telephone polling methods are also suspect. I have not studied this closely, but I have read that Zogby changed the methodology of his phone polls, shifting the sampling toward a heavier weighting of progressive/liberal respondents across the board… which would explain his string of primary polls that tended to show lower numbers for Clinton than was actually the case.
Also on some minds is the influence of John Zogby’s brother, James Zogby. James Zogby, who blogs at the Huffington Post, is “founder and president of the Arab American Institute (AAI), a Washington, D.C.-based organization which serves as the political and policy research arm of the Arab American community.” His organization is active in promoting voter registration and mobilization within this community.
His political work also involves Democratic politics:
Dr. Zogby has also been personally active in U.S. politics for many years. Most recently, Zogby was elected a co-convener of the National Democratic Ethnic Coordinating Committee (NDECC), an umbrella organization of Democratic Party leaders of European and Mediterranean descent. On September 24, 1999, the NDECC elected Dr. James Zogby as its representative to the Democratic National Committee’s Executive Committee. In 2005 he was appointed as chair of the DNC’s Resolutions Committee.
In his March 7, 2008 blog at The Huffington Post entitled “The Big Costs of Hillary’s ‘Big Wins,’” James Zogby basically accused Clinton of dividing the party and “wounding” Obama. James Zogby, of course, is entitled to his own opinion, but a question arises about whether his politics have any influence on the way brother John polls.
This is a legitimate question to raise, because buried in the depths of his Huffington Post bio is the fact that he “is a Senior Analyst for the polling firm Zogby International.” What does that title/job really encompass? Another question is how closely James Zogby is tied to the Obama camp. The Interactive Poll I received a few days ago seemed to be more than just a general survey on the current political scene. As I read through the questions, it seemed to be something that the Democratic Party may have paid Zogby to do. In other words, the DNC may have been the client. The questions were not neutral, in my opinion, but heavily weighted toward the types of questions used to figure out how to mold a campaign around public opinion on certain topics, notably one series of questions about “The American Dream” (undefined by Zogby, but focused in the questions on “material versus spiritual values”).
Here are a couple of links to discussions from 2006 that delve into one of the more interesting examples of flawed Zogby polling specifically related to the attitudes of American troops in Iraq:
Zogby’s Flawed Polls (Interesting tidbit from this piece: the poll in question was conducted in conjunction with Le Moyne College’s Center for Peace and Global Studies. James Zogby “received a Bachelor of Arts from Le Moyne College. In 1995, Le Moyne awarded Zogby an honorary doctor of laws degree, and in 1997 named him the college’s outstanding alumnus.”–from his bio, linked above.)
Opinion polling, in general, has other problems.
Once a survey is returned from the field, manipulation of the data is another area of potential mischief. In my time as a researcher, I witnessed several instances of questionnaire data being manipulated.
In my second job in marketing research, I was a coding supervisor for a now defunct company in New Brunswick, NJ. My job was to make sure all the questionnaires were properly coded and if I found problems, I would alert the project director. Questionnaires that didn’t meet demographic requirements or were incomplete were obvious problems and were eliminated if the respondent couldn’t be contacted again. But sometimes the clients themselves created a problem.
One part of my job was to create a codebook for all the “open-ended” questions. These are questions where the respondent is able to answer freely, without having to select from a list of answers. I would tally each answer, then write up the codes that the coding staff would use.
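For readers unfamiliar with the process, here is a rough sketch of what tallying open-ended responses into a codebook looks like (in those days it was done by hand; the question and responses below are made up for illustration):

```python
from collections import Counter

# Hypothetical open-ended responses to "Why do you prefer this bank?"
responses = [
    "low fees", "friendly staff", "low fees", "convenient location",
    "friendly staff", "low fees", "good interest rates",
]

# Tally each distinct answer; the most frequent answers become
# numbered codes that the coding staff applies to each questionnaire
tally = Counter(responses)
codebook = {
    answer: code
    for code, (answer, _count) in enumerate(tally.most_common(), start=1)
}

for answer, code in codebook.items():
    print(f"code {code}: {answer} ({tally[answer]} mentions)")
```

"Collapsing" the codes, as described below, would mean merging several of these distinct answers under a single code, with a corresponding loss of detail.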
I’ll never forget how one client, a major financial outfit, suddenly decided they wanted to save money on their project. I had created approximately 36 “answers” from the open-ended question on the survey. Orders came down to “collapse” the codes, which meant I was supposed to cut the number of answers down to about 18. The only way to do this was to combine answers. The result? An entirely meaningless set of answers which really didn’t accurately reflect the opinions of the respondents. Supposedly, the client was going to use these answers to plan their marketing, but how they would be of any real value was beyond me. (Immediately after this episode, I started looking for another job, and left a few months later.)
At another company, in Hoboken, NJ, I worked as a project director on a taste test for an imported beer. The interviewees weren’t reacting too favorably to the product. Once again, a client interceded: I received a phone call asking me if there was some way I could get more favorable answers from the taste testers!
In between these two jobs I worked at The Gallup Organization in Princeton, NJ. This was way back when the founder, George Gallup, Sr., was still roaming the halls. Phone interviewing there was in its infancy. The main source of general survey data was the “omnibus” Gallup survey which was conducted IN PERSON. Appointments were arranged with respondents and pages of questions from various clients were asked, as well as the presidential preference questions that Gallup included for its poll. Of course, as time went on and fewer people were at home, even Gallup was forced to do more phone surveys.
At Gallup, many of the project directors had advanced degrees and there was an “academic” feel to the place. There was a great deal of pride in doing good work. It was at Gallup that I learned how to create a series of questions, figure the statistical significance of results, and write a report from data. As a study director I also fielded studies and supervised people who recruited interviewees, for studies like the infamous microwave oven evaluation, in which ovens being shipped out for the California leg of a study that had begun in Princeton wound up in Japan instead of San Francisco. Luckily, I had an ace field specialist who was able to rearrange the interview schedules and make up the lost time after the ovens finally arrived back in California! And I personally had to make sure we had the correct pool of respondents in terms of demographics and experience, even as we were really pressed to refield the study.
At Gallup, there was never any question of altering a survey or slanting the data because of client pressure once the study was designed (with client requirements considered) and finally fielded. Furthermore, Gallup did not undertake any polling for special-interest groups, such as the Democratic or Republican parties, other political groups, or any organizations with a particular agenda. INDEPENDENCE was the hallmark of The Gallup Organization. In addition, the man who sat in the small office down the hall and designed the sampling criteria was an ex-Marine who was a stickler about sampling. Gallup was not to be messed with! And Andy Kohut, who was President of Gallup when I was there and later went on to run Pew, was someone else who was a great teacher and took the business of polling seriously.
(For an absolutely fascinating history of how Gallup became the “world’s pollster,” check out this story which describes the early years of Gallup and how he learned the hard way after being wrong about the Dewey-Truman outcome that “Lesson No. 1 was to keep polling, right up to Election Day.”)
Of course, “my” Gallup no longer exists. A few years after I left, following George Gallup Sr.’s death, Gallup was acquired in 1988 by a company in Nebraska, and it has since expanded greatly into other areas, including psychological “profiling” of people to see how they “fit” into an organization. The “new” Gallup also has “expanded its activities from tracking presidential approval to tracking consumer product and customer service approval.” It has since expanded further by developing “The Gallup Path,” which aimed to answer the question, “What is the role of human nature in driving business outcomes?”
“Gallup’s next major technical advance provided the answer to this question. Gallup sorted through billions of bits of economic information and analyzed more customer and employee data than had ever been studied before. The answer to the role of human nature in driving business outcomes is contained in the management theory known today as The Gallup Path.
And the next step in 2002 was even more ambitious:
Gallup has designed and engineered the world’s first Web storage system containing millions of records of what people have thought over the last 65 years. The Gallup Brain, introduced in October 2002, provides ongoing opinion tracking data concerning virtually all issues affecting humankind. The Gallup Brain is the first information or intelligence resource designed specifically for the world’s 20 million leaders. Access to the Gallup Brain offers these leaders the opportunity to significantly improve their decision-making ability in practically every area of their lives. (NOTE: “Gallup loosely defines a world leader as any individual with an (overlapping) personal constituency of at least 1,000 people.”)
In 2003, Gallup went one step further:
Gallup opened a 50-acre Gallup University campus on the Missouri River in Omaha, Nebraska, in August 2003. Gallup’s future efforts will focus on educating, informing, and advising the 1 million most influential people who lead, mentor, and determine the futures of the remaining 6 billion people who inhabit Earth.
Obviously, Gallup has morphed into a business that’s a far cry from the small organization that I worked for in Princeton years ago. And the question for me becomes, how does the emphasis on tracking people for business now affect the way Gallup polls politics?
Back in October 2004, Steve Soto at The Left Coaster studied the internal data and found that Gallup had oversampled Republicans, producing a partisan split far out of line with the 1996 and 2000 elections. And this was not the first time during 2004, either.
So, there we are. It appears that my beloved Gallup has gone the way of Zogby. Oh, the corporate history talks about independence and objectivity, of course:
Although Gallup has typically conducted its polling activities in collaboration with various media organizations and, on occasion, with worldwide associations and academic institutions, these polls have always been carried out independently and objectively.
This single, chosen ethical principle — independence — has made the Gallup name famous and among the most trusted brand names on Earth, synonymous with democracy and the democratic process.
But to me, it sounds as if the company has been absorbed by the borg of business interests and surveying for the world’s most powerful “leaders.” And Zogby’s mission statement sounds eerily similar:
“To offer the best polling, market research, information services, and business solutions worldwide based on accuracy and detailed strategic information.”
Do you think Zogby and Gallup are really interested in “surveying the will of the people” anymore without another agenda behind what they do?
I think George Gallup, Sr. must be turning over in his grave…
UPDATE/ADDITION July 2, 2008
In light of my post on Missouri and McCain, I’m adding this information about polling:
Smaller samples usually yield larger margins of error (at Gallup, we used to consider anything below 1,000 or 1,500 respondents to be a small sample). Beware of data from breakdowns of a fairly small total sample, since the number of respondents in the individual sub-samples will be very small at that point. For example, a total study size of 546 is small; results broken down into, let’s say, 6 or 7 income levels would then rest on very small sub-samples, and the data quoted would be unreliable.
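To put rough numbers on this, here is the standard 95%-confidence margin-of-error formula for a sample proportion (worst case at 50/50 opinion; the sample sizes use the 546-respondent example, with 546 split across roughly 7 income levels):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a sample proportion; p=0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# Full "large" sample vs. the small study vs. one income-level sub-sample
print(f"n=1000: +/-{margin_of_error(1000):.1%}")  # about +/-3.1%
print(f"n=546:  +/-{margin_of_error(546):.1%}")   # about +/-4.2%
print(f"n=78:   +/-{margin_of_error(78):.1%}")    # about +/-11.1%
```

In other words, a sub-sample of 78 respondents can't distinguish 45% from 55% support, which is why numbers quoted from small breakdowns are so unreliable.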
UPDATE/ADDITION October 17, 2008
The Gallup Poll continues its slide into a ruined reputation with this new twist in its methodology and its downplaying of the standard “likely voter” polling, which is the norm by this point in an election cycle:
Gallup is presenting two likely voter estimates to see how preferences might vary under different turnout scenarios. The “expanded” model determines likely voters based only on current voting intentions. This estimate would take into account higher turnout among groups of voters traditionally less likely to vote, such as young adults and minorities. That model has generally produced results that closely match the registered voter figures, but with a lower undecided percentage, and show Obama up by six percentage points today, 51% to 45%.
The “traditional” likely voter model, which Gallup has employed for past elections, factors in prior voting behavior as well as current voting intention. This has generally shown a closer contest, reflecting the fact that Republicans have typically been more likely to vote than Democrats in previous elections. Today’s results show Obama with a two-point advantage over McCain using this likely voter model, 49% to 47%; this is within the poll’s margin of error. — Frank Newport
In other words, in the new model they’re not taking into account how often people have actually voted in the past; instead, they’re estimating how people who are less likely to vote will vote now, and assuming this group will turn out in higher numbers this time around--essentially, they are pumping up the “registered voters” sample. The media can now conveniently choose to highlight this new polling method rather than the traditional “likely voter” model and, since it comes from Gallup, it has “credibility.”
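To see how much the turnout assumption alone can move a topline, here is a simple weighted-average sketch. The support levels and turnout shares are invented for illustration; they are not Gallup's actual figures:

```python
# Hypothetical support for one candidate within two voter groups
support = {"habitual_voters": 0.49, "irregular_voters": 0.60}

def topline(irregular_share):
    """Overall support, given the share of the electorate that is
    made up of traditionally less-likely ("irregular") voters."""
    return ((1 - irregular_share) * support["habitual_voters"]
            + irregular_share * support["irregular_voters"])

# A "traditional"-style screen: irregular voters are a small slice
print(f"traditional turnout assumption: {topline(0.10):.1%}")
# An "expanded"-style screen: assume higher turnout among that group
print(f"expanded turnout assumption:    {topline(0.25):.1%}")
```

With identical raw responses, merely shifting the assumed share of less-likely voters from 10% to 25% of the electorate moves the topline by well over a point--which is exactly why the choice of model matters so much to the headline number.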