University of Chicago
Type of paper: Thesis/Dissertation Chapter
Questions to Ask Before You Believe a Poll
Opinion polls and other sample surveys can produce accurate and useful information if the pollster uses good statistical techniques and also works hard at preparing a sampling frame, wording questions, and reducing nonresponse. Many surveys, however, especially those designed to influence public opinion rather than just record it, do not produce accurate or useful information. Here are some questions to ask before you pay much attention to poll results.
• Who carried out the survey? Even a political party should hire a professional sample survey firm whose reputation demands that it follow good survey practices.
• What was the population? That is, whose opinions were being sought?
• How was the sample selected? Look for mention of random sampling.
• How large was the sample? Even better, find out both the sample size and the margin of error within which the results of 95% of all samples drawn as this one was would fall.
• What was the response rate? That is, what percent of the original subjects actually provided information?
• How were the subjects contacted? By telephone? Mail? Face-to-face interview?
• When was the survey conducted? Was it just after some event that might have influenced opinion?
• What were the exact questions asked?
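The margin of error mentioned above can be sketched concretely. A common approximation (not stated in this chapter, but standard for a sample proportion) is z × √(p̂(1 − p̂)/n) with z = 1.96 for 95% confidence; the poll figures below are made up for illustration:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    Uses the normal approximation z * sqrt(p_hat * (1 - p_hat) / n);
    z = 1.96 corresponds to 95% confidence.
    """
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# Hypothetical poll: 1,000 respondents, 52% favoring a candidate.
moe = margin_of_error(0.52, 1000)
print(f"Margin of error: +/- {moe * 100:.1f} percentage points")
```

Note how the margin shrinks only with the square root of the sample size: quadrupling n merely halves the margin of error, which is why asking "how large was the sample?" matters so much.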
Academic survey centers and government statistical offices answer these questions when they announce the results of a sample survey. National opinion polls usually don’t announce their response rate (which is often low) but do give us the other information. Editors and newscasters have the bad habit of cutting out these dull facts and reporting only the sample results. Many sample surveys by interest groups and local newspapers and TV stations don’t answer these questions because their polling methods are in fact unreliable.
If a politician, an advertiser, or your local TV station announces the results of a poll without complete information, be skeptical. Even professional sample surveys don’t give exactly correct information about the population. There are many potential sources of error in sampling. The margin of error announced by a sample survey covers only random sampling error, the variation due to chance in choosing a random sample. Other types of error are in addition to the margin of error and can’t be directly measured. Sampling errors come from the act of choosing a sample. Random sampling error and undercoverage are common types of sampling error.
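The claim that the announced margin of error covers random sampling error can be checked by simulation. This sketch (the 60% population value and sample size are assumptions for illustration) draws many random samples and counts how often the sample result lands within the margin of error of the truth; it should be close to 95% of the time:

```python
import random

random.seed(2)

TRUE_P = 0.60      # hypothetical population proportion
N = 1000           # sample size for each simulated poll
MOE = 1.96 * (TRUE_P * (1 - TRUE_P) / N) ** 0.5

# Draw many random samples and count how often the sample
# proportion falls within the margin of error of the truth.
trials = 2000
hits = 0
for _ in range(trials):
    sample_p = sum(random.random() < TRUE_P for _ in range(N)) / N
    if abs(sample_p - TRUE_P) <= MOE:
        hits += 1

coverage = hits / trials
print(f"Within the margin of error in {coverage:.0%} of samples")
```

This is exactly what the margin of error promises, and all it promises: chance variation from random sampling, nothing else.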
Undercoverage occurs when some members of the population are left out of the sampling frame, the list from which the sample is actually chosen. The most serious errors in most careful surveys, however, are non-sampling errors. These have nothing to do with choosing a sample—they are present even in a census. The single biggest problem for sample surveys is nonresponse: subjects can’t be contacted or refuse to answer. Mistakes in handling the data (processing errors) and incorrect answers by respondents (response errors) are other examples of non-sampling errors.
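Nonresponse bias of the kind described above can also be simulated. In this illustrative sketch (the 50% support level and the response rates are invented numbers, not data), supporters of a policy respond less often than opponents, so the poll misses the true value by far more than any margin of error would suggest:

```python
import random

random.seed(7)

# Hypothetical population: 50% support a policy, 50% oppose it.
N_POP = 100_000
population = [1] * (N_POP // 2) + [0] * (N_POP // 2)

# Assumed response rates: supporters answer the poll 40% of the
# time, opponents 80% of the time.
respondents = [
    person for person in population
    if random.random() < (0.4 if person == 1 else 0.8)
]

# Poll 1,000 of the people who actually responded.
sample = random.sample(respondents, 1000)
estimate = sum(sample) / len(sample)
print(f"True support: 50%, poll estimate: {estimate:.0%}")
```

The estimate lands near 33% rather than 50%, and no amount of random sampling fixes it: the error comes from who responds, not from chance in choosing the sample.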