Atlanta-based Strategic Vision, the GOP-oriented polling and strategy firm, is doubly in the news today.
This morning, SV released the latest statewide poll in the race for governor — the first since the departure of Lt. Gov. Casey Cagle from the fray.
Download the entire poll memo here, but this is the GOP line:
— State Insurance Commissioner John Oxendine, 33%;
— Secretary of State Karen Handel, 14%;
— U.S. Rep. Jack Kingston, 11%;
— U.S. Rep. Lynn Westmoreland, 7%;
— Cobb County Commission Chairman Sam Olens, 4%;
— House Speaker pro tem Mark Burkhalter, 2%;
— Ray McBerry, 2%;
— State Rep. Austin Scott, 2%;
— and undecided, 25%.
David Johnson, CEO of Strategic Vision, said Handel appears to be getting a good share of Cagle’s support. While Oxendine leads, Johnson said, the question is whether there’s a ceiling on his support.
The SV poll also gives Gov. Sonny Perdue a higher approval rating (51 percent) than a SurveyUSA poll released last week (40 percent).
The Democratic horse race is as follows:
— Former Gov. Roy Barnes, 56%;
— Attorney General Thurbert Baker, 29%;
— Former Labor Commissioner David Poythress, 4%;
— House Minority Leader DuBose Porter, 2%;
— and undecided, 9%.
The poll was conducted by telephone among 800 likely voters; the stated margin of error is ±3 percentage points.
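For those who want to check the arithmetic: under the textbook assumption of a simple random sample, the 95 percent margin of error for a proportion works out to roughly 3.5 points at n = 800 — pollsters' published figures often round, or reflect other design assumptions. A minimal sketch of the standard formula:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion, assuming simple random sampling.

    p = 0.5 is the worst case (widest interval); z = 1.96 is the 95% critical value.
    """
    return z * math.sqrt(p * (1 - p) / n)

# For 800 likely voters at the worst case p = 0.5:
print(f"{margin_of_error(800) * 100:.1f} percentage points")  # roughly 3.5
```

Note that the margin shrinks only with the square root of the sample size — quadrupling the interviews halves the error.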
Now, on to more complicated matters. You’ll remember that last year, Hillary Clinton dumbfounded pundits with the victory she pulled out of the New Hampshire primary. Polling was uniformly off-base.
As a result, the American Association of Public Opinion Research launched an inquiry, seeking data from every firm that conducted random-sample surveys in the run-up to the primary.
The audit was supposed to be finished in May 2008, but the AAPOR said a lack of transparency and cooperation from several polling firms delayed a conclusion.
Download the report here. Wrote the AAPOR:
…It is impossible to evaluate the performance of the pollsters without information about their methodology. That is why the AAPOR “Code of Professional Ethics and Practices” includes a set of elements that those who conduct polls and surveys should disclose so that other professionals can evaluate the quality of the research that they conduct and the results that they disseminate.
The committee’s experience suggests that some firms engaged in election polling pay only lip service to these disclosure standards, while others are operating at such a thin margin that they do not have the resources to devote to answering questions about their methods.
The AAPOR identified three firms that polled in New Hampshire in the days leading up to the primary but refused to cooperate.
“Despite repeated requests for information, at the time of the analysis of data for this report three firms never responded: Clemson University, Ebony/Jet magazine and Strategic Vision,” the report said.
Johnson, the CEO of Strategic Vision, said he received a single request from the organization. “I got the request for this two days before the report was released,” he said. “And I’ve got the e-mails to prove it.” Johnson said the AAPOR told him that the study group had sent a request by certified mail, but he never received the communication.
More firms were cited as providing only a minimum of detail. Another set of thick but worrisome paragraphs:
The world of survey research now uses more complicated and diverse sampling frames and selection techniques. We inhabit a world where the sampling frames used for studies of the same population are quite diverse, where the separation between volunteering and being approached because you were selected randomly is fuzzy. The field now uses technologies where the selection of respondents is not straightforward….
The world of survey research uses more complicated and diverse statistical adjustments for errors of non-observation. Propensity models are increasingly used as an adjustment tool; some firms claim that their models are trade secrets, not to be disclosed. The adjustment for non-response is combined with adjustment for coverage and likely voting in ways that cannot be disconnected.
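To make the report's point concrete: the nonresponse adjustments it describes boil down to reweighting respondents so that under-responding groups count for more. This is a hypothetical, textbook illustration — not any particular firm's model, and all numbers are invented:

```python
# Inverse-propensity-style weighting for nonresponse, in its simplest form.
# Hypothetical figures: suppose one demographic group answers the phone
# less often than another, and we know each group's true electorate share.
population_share = {"group_a": 0.40, "group_b": 0.60}
sample_share     = {"group_a": 0.55, "group_b": 0.45}  # group_b under-responds

# Weight each group by (population share / sample share), so the weighted
# sample matches the population on these groups.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Observed candidate support by group (again, made-up numbers):
support = {"group_a": 0.35, "group_b": 0.55}

unweighted = sum(sample_share[g] * support[g] for g in support)
weighted   = sum(sample_share[g] * weights[g] * support[g] for g in support)
print(f"unweighted {unweighted:.2f}, weighted {weighted:.2f}")  # 0.44 vs 0.47
```

The three-point swing in this toy example is exactly the kind of correction — and the kind of undisclosed modeling choice — the AAPOR committee says it could not evaluate without cooperation from the firms.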
As for substance, these were among the findings:
Respondents who required more effort to contact seemed more likely to support Senator Clinton, but most interviews were conducted on the first or second call, favoring Senator Obama.
Patterns of nonresponse, derived from comparing the characteristics of the pre-election samples with the exit poll samples, suggest that some groups that supported Senator Hillary Clinton were underrepresented in the pre-election polls.
Although no significant social desirability effects were found that systematically produced an overestimate of support for Senator Obama among white respondents or for Senator Clinton among male respondents, an interaction effect between the race of the interviewer and the race of the respondent did seem to produce higher support for Senator Obama in the case of a black interviewer. However, Obama was also preferred over Clinton by those who were interviewed by a white interviewer.
For instant updates, follow me on Twitter.