I have never been called by a political pollster and don’t know anybody who has, but I know some pollsters, who assure me they don’t make the numbers up, and I believe them.
Pollsters, or rather the phone-bankers who make call after call (or the computers that make robo-call after robo-call), do get people to talk to them. Not vast numbers of people, but pollsters do not require vast numbers.
We are a nation of nearly 313 million people. So how many people did the pollsters actually speak to? If you have extremely good eyes, you can find the answer in tiny type at the bottom of a chart: The Post-ABC poll was conducted by phone “among a random sample of 1,005 adults.”
That represents 0.0003 percent of the nation at large. (The number of Republicans and Republican-leaning independents was an even smaller sample of 395 people.)
This poll has a very good reputation and I “believe” the results in that I believe they were calculated carefully and (unlike some partisan or campaign polls) without any agenda.
Does Obama really lead Gingrich by 8 percentage points in a (currently) imaginary matchup?
I dunno. Sounds right to me. But I am an even smaller sample than 0.0003 percent.

You really don't need to be a statistician to understand this stuff. Why can a survey of 1,100 people be accurate in telling us how the whole nation is thinking? The metaphor I always liked is a blood test. For a doctor to determine whether there's a problem with your blood, she doesn't need to remove it all -- she can just extract a small vial. That vial represents the rest of your blood well because your blood is constantly being mixed, so a few cc's drawn from your arm look like the blood anywhere else in your body.
It's the same thing in surveys. You can poll a fairly small number of people as long as you can be confident you're getting a representative sample of American voters. (Talking to your friends and neighbors? Not representative. Calling people randomly across the country? Much better.) And some relatively simple math can tell you how close your sample is likely to be to the country as a whole. Polling 1,100 people gives you a margin of error of roughly 3 percentage points, which means there's a 95% chance that your sample's answer is within three points of the true answer for the whole population. Pollsters have settled on that as a pretty reliable margin. You could get it down to 2 points, but only by interviewing more than twice as many people, driving up the cost of the poll considerably without improving its accuracy by much.
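The "relatively simple math" here is just the standard formula for the margin of error of a proportion from a simple random sample. A minimal sketch in Python (the function name and the worst-case p=0.5 convention are mine, but the formula is the textbook one):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of size n. p=0.5 is the worst case,
    which is the figure pollsters conventionally quote."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 1,100: +/- {margin_of_error(1100):.1%}")  # roughly 3 points
print(f"n = 395:   +/- {margin_of_error(395):.1%}")   # the Republican subsample is fuzzier

# Sample size needed to shrink the margin to 2 points:
n_for_2pts = (1.96 / 0.02) ** 2 * 0.25
print(f"n for +/- 2 points: about {n_for_2pts:.0f}")
```

Note that the cost scales badly: cutting the margin from 3 points to 2 requires (3/2)² ≈ 2.25 times the interviews, which is why almost nobody bothers.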
The sad thing is that Simon has an audience who might really appreciate a better understanding of how polling works, but he decided to waste their time with some blather about how polls are magical and therefore beyond our understanding. They're not, and Simon's readers deserve better.
*Must credit Brendan Nyhan for the Insane Clown Posse reference.