By using open records to obtain CRCT scores from the major metro districts and diving into the data, the AJC investigative team examined how schools under suspicion for test tampering in 2009 fared this year when testing protocols were tightened and answer sheets were guarded like gold bullion. (You can see the results for yourselves here as we have just posted them.)
Not very well, it turns out.
The newspaper’s analysis by AJC reporters John Perry and Heather Vogell found that score drops in the 39 “severe” Atlanta schools drove up the district’s overall failure rate in reading, English and math.
For all of you who have insisted that this was a witch hunt, consider this: Most schools that did not have unusual erasure patterns showed a slight gain in their CRCT scores. Is this also random?
Those schools with suspicious erasure rates last year and a jump in failures this year ought to be doing some soul searching. And Superintendent Beverly Hall ought to be searching for the reasons for the plummet in scores, and for the people responsible for any tampering.
Yes, there are fluctuations in test scores year to year. But abnormal fluctuations can’t be ignored in light of the state erasure analysis.
While the state won’t release final school results until mid July, the AJC used the state’s Open Records Act to obtain preliminary scores for nearly 500 schools in the six biggest metro districts. In general, the scores showed that students at schools where state officials said cheating concerns were “severe” suffered average score drops of as much as 11 points, while students at schools considered “clear” made slight gains.
Responding to evidence of cheating uncovered by an AJC investigation in 2008, the state reviewed every 2009 CRCT answer sheet to measure how often wrong answers were erased and changed to right ones. Because every test sheet was checked, the state was able to develop a reliable index of wrong-to-right answer changes and flag schools with inordinate numbers of such changes, right down to the classroom level.
The worst incidents occurred in Atlanta.
For some APS schools, the drop from last year was stark. At Gideons Elementary, 92 percent of fifth-graders passed math in 2009. This year, 39 percent did. And at Dunbar Elementary, about 87 percent of fourth-graders passed math last year, compared to 49 percent this spring.
In a statement prior to the AJC’s story, Hall acknowledged that scores dropped but maintained that the results overall demonstrated the district’s “continued academic progress.”
I am assuming that she will come out with a stronger statement soon and that it will not gloss over the very real evidence of test tampering.
According to the AJC:
Passing rates mirrored the sagging scores. In severe schools, for instance, the percentage of students failing math rose roughly 12 percentage points. In cleared schools, the percentage of failing students declined by 1 point this year.
School districts should view stark score drops at severe schools as potential evidence of tampering, said Kathleen Mathers, executive director of the Governor’s Office of Student Achievement.
“Generally, in schools where there proves to be a dramatic drop, we have concerns about what would cause that drop,” she said. “If it happens to be a school that had a high number of answers changed last year from wrong to right, that could be an indication that there had been intentional wrongdoing.”