Atlanta explains how it calculates value-added scores, saying it is using the scores for “improvement, not accountability.”

I ran a letter a few days ago from the principal of an Atlanta charter school expressing concerns about the value-added scores assigned to his school.

Atlanta is looking at both teacher and school-level value-added as part of its Effective Teacher in Every Classroom initiative. Using test scores, researchers are calculating how much “learning” Atlanta students gain in the standard school year. This sort of calculation is being made for school districts and teachers nationwide and will ultimately be done for every school system in Georgia as we move to accountability models that measure student progress over time.

There is great debate over whether any value-added system — and Atlanta has hired some of the nation’s top experts to help it develop accurate value-added metrics — can be trusted.

Under Atlanta’s analysis, students at Atlanta Neighborhood Charter were found to gain only 5.2 months of learning in a year, one of the lowest scores in the district.

In a letter to parents, Neighborhood Charter principal Matt Underwood wrote: “As much as these efforts try to boil ‘success’ down to a single number or letter, assessing students or teachers or schools is substantially more complicated than that, a fact that has been on my mind these past few weeks…All students–whether at ANCS or elsewhere–are complex individuals with differing strengths and weaknesses. They deserve ways of assessing their performance (and, by extension, the performance of their teachers or schools) that acknowledge just how unique they are as human beings and recognition that there is value in many skills and knowledge that cannot be shown by filling in a bubble.”

APS has written a response to Underwood. It is from Rubye Sullivan, director of research and evaluation for School Improvement. I received several emails from researchers asking how APS created its scores, and I hope Dr. Sullivan’s letter answers their questions.

Here it is:

By Dr. Rubye Sullivan

I read Matt Underwood’s letter on value added with great interest. I am very impressed by Matt’s efforts to understand both value-added and multiple measures of his students’ learning. I would like to clear up a few questions raised about value-added.

First, a technical concern: Matt raises the issue that not all CRCT exams have content alignment from year to year. When two exams do not have content alignment, we can think of value-added as measuring whether students demonstrate the same amount of knowledge one year later as students who scored similarly the year before. Although Social Studies exams measure different content areas in different grade levels, the predictive relationship between current and previous Social Studies exams is as high as or higher than the relationship for English or math exams. This allows for accurate Social Studies value-added models.
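APS has not published its exact model, but the general idea described here (predict each student’s current score from the prior-year score, then ask whether the student landed above or below that prediction) can be sketched in a few lines. This is a toy illustration only, not APS’s actual method; the scores, school labels, and the simple one-variable regression are all invented for the example.

```python
# Toy sketch of the idea above: NOT APS's actual model. Fit a least-squares
# line predicting this year's score from last year's, then average each
# school's residuals (actual minus predicted). Positive = beat expectations.
def toy_value_added(prior, current, school_ids):
    n = len(prior)
    mx, my = sum(prior) / n, sum(current) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(prior, current))
             / sum((x - mx) ** 2 for x in prior))
    intercept = my - slope * mx
    by_school = {}
    for x, y, s in zip(prior, current, school_ids):
        by_school.setdefault(s, []).append(y - (slope * x + intercept))
    return {s: sum(r) / len(r) for s, r in by_school.items()}

# Invented scores: school "A" students beat the prediction, "B" fall short.
scores = toy_value_added(
    prior=[800, 810, 820, 830, 800, 810, 820, 830],
    current=[815, 825, 835, 845, 795, 805, 815, 825],
    school_ids=["A", "A", "A", "A", "B", "B", "B", "B"])
```

Real value-added models are far richer (multiple prior years, demographic controls, shrinkage toward the mean), but many share this compare-to-prediction core.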

Atlanta Public Schools is also careful to be forthcoming about the accuracy of value-added results. Schools receive a value-added report that includes a point estimate and a confidence interval for each result. The confidence interval is similar to the plus/minus numbers associated with public opinion polls. If teacher or school effects are difficult to measure in a particular grade or subject, the confidence intervals will be wider. This helps prevent over-interpreting the data.
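The poll analogy can be made concrete. Assuming the familiar mean plus-or-minus 1.96 standard errors construction (APS’s exact interval method is not described here, so this is illustrative only), with invented per-student gains:

```python
# Rough 95% confidence interval for a school's average gain, using the
# familiar mean +/- 1.96 standard errors. (Illustrative only; APS's exact
# interval construction is not published in this letter.)
import math

def mean_with_ci(gains):
    n = len(gains)
    mean = sum(gains) / n
    var = sum((g - mean) ** 2 for g in gains) / (n - 1)  # sample variance
    half_width = 1.96 * math.sqrt(var / n)               # ~95% margin of error
    return mean, (mean - half_width, mean + half_width)

# Same average gain, but fewer students -> a wider interval.
small = mean_with_ci([4, 6, 5, 7, 3])
large = mean_with_ci([4, 6, 5, 7, 3] * 20)
```

The point estimate is identical in both cases; only the interval widens when fewer students are measured, which is exactly the over-interpretation guard the report describes.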

APS also uses information external to test scores to validate value-added models. Social Studies value-added scores are positively correlated with CLASS Keys evaluation scores; the same teachers that have high value-added scores tend to have high evaluation scores. The correlation is similar in magnitude to that found in other subjects and in external academic research on other school districts.

Mr. Underwood also highlights the very important point that test scores are only one facet of student outcomes. APS stresses this very point when distributing value-added scores. However, we do consider test scores and test-score growth to be important. As long as our students are held accountable for meeting proficiency standards, we are obligated to highlight the schools and teachers who are able to grow their students toward proficiency and beyond. Value-added is how we understand the successful instructional practices of our schools and teachers and replicate those practices so that all of our students experience academic growth.

I do not want to misconstrue Mr. Underwood’s point. I know he is very concerned about students with low test scores as well. Rather, his concern is that the increased emphasis on student test scores weakens instruction and other attempts to enrich students’ lives.

Recent studies by the Gates Foundation have worked to better understand exactly what value-added is telling us, including whether high value-added is a result of teaching to the test or of real, conceptual learning. In addition to state exams, students in six large public school districts were also administered exams that attempt to measure higher-order conceptual thinking. The study found that teachers who increase learning on state exams tend to be the same teachers who increase learning on more conceptual exams. For example, the correlation of value-added between the state exam and the Stanford 9 Open-Ended Reading assessment was 0.59. (Correlation values range from -1 to 1. A value of 1 means two measures are the same, zero means they are unrelated, and -1 means they are exact opposites.) If “teaching to the test” or memorization were the way to maximize value-added, we would expect a negative correlation. The positive correlation is encouraging, although, because it is not 1, it also emphasizes that accountability exams must measure what we expect our students to be learning. The current CRCT and EOCT exams clearly define curriculum objectives, and Georgia will adopt the PARCC exams, which seek to better measure higher-order conceptual learning, in the 2014-2015 school year.
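The bracketed definition of correlation can be checked directly with toy numbers (these values are invented for illustration, not data from the Gates study):

```python
# Quick illustration of the correlation scale described above: 1 for
# identical measures, -1 for exactly opposite ones. (Toy numbers only,
# not data from the Gates study.)
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

state_vam = [0.2, -0.1, 0.4, 0.0, -0.3]
same = pearson(state_vam, state_vam)                    # identical -> 1
opposite = pearson(state_vam, [-v for v in state_vam])  # reversed -> -1
```

An unrelated second list would land near zero, which is why a 0.59 between value-added on two different exams is read as substantial agreement rather than opposition.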

There are also concerns that value-added does not measure a school’s impact on other important results such as a student’s motivation or behavior. However, these outcomes are closely tied to test scores. A school that is unable to motivate its students or teach good behavior will also be ineffective in raising academic outcomes. Value-added is a powerful tool for a district to understand learning patterns across schools or a principal to understand learning within schools. It will help recognize areas of strength but will not diagnose what specific strategies are improving student learning. This requires additional investigation by school staff, a step that Mr. Underwood appears to already be taking.

We also know that raising test scores now has important long-run impacts for children. A recent study by researchers at Harvard and Columbia has helped us understand the long-term impacts of teachers who raise test scores. The paper links student test scores and teacher and school assignments with IRS wage data to show that the difference between a single year with a low-performing teacher versus a high-performing teacher has a substantial effect on students’ future earnings as well as outcomes such as college attendance and reduced likelihood of teenage pregnancy.

In Atlanta Public Schools, we are using value-added information for school improvement, not accountability. Please be reminded that similar to achievement data as reported in terms of proficiency rates, these data only tell one part of the complex story of teaching and learning. We will continue to improve our data systems such that our school leaders have access to multiple sources of data to utilize in the school improvement process.

–From Maureen Downey, for the AJC Get Schooled blog

71 comments Add your comment

Poor Boy from Alabama

October 18th, 2012
8:13 am

Dr. Sullivan’s explanation was not reassuring. It did not include a clear, concise explanation of how APS calculates value added. Nor did it provide much evidence to support the accuracy and relevance of the results.

Jarod Apperson’s analysis of the claims made by Superintendent Davis about NAHS’s academic performance is a good indication that APS’s value added numbers are less than robust and don’t hold up well under careful scrutiny.

As many have pointed out before, APS needs to develop an easily understood set of performance metrics for each school and the district overall. Performance should be tracked on a regular basis and the results should be made available to the public in a timely fashion.

taco taco

October 18th, 2012
8:17 am

Value added scores are in place because GA wants that RTTT money that will never see a classroom.

Phil from Athens

October 18th, 2012
8:18 am

It’s time to end the department of education.

Phil from Athens

October 18th, 2012
8:20 am

“Using test scores, researchers are calculating how much “learning” Atlanta students gain in the standard school year.”

That makes no sense at all considering teachers teach students how to take tests. Atlanta schools have been caught cheating and Georgia has one of the worst school systems in the nation.

Mortimer Collins

October 18th, 2012
8:28 am

Just more waste followed by more waste. The performance of these “egg-heads” is, at best, pitiful and sad.

indigo

October 18th, 2012
8:33 am

This “value-added scores” thing sounds like just another social experiment designed to make sure black students score at the same level as white ones.

Big Mama

October 18th, 2012
8:36 am

This whole “value-added” scoring sounds like a waste of money. Why is the school system wasting their resources on something so bogus when they could be using those resources in the classroom itself? How many staff members were required to pull this together?

bc

October 18th, 2012
8:46 am

Why does everybody associated with the education-industrial complex behave as if “accountability” is a bad word? It’s not very heartening to those of us paying a ton of money in taxes to fund this sham.

Bill

October 18th, 2012
8:51 am

1) I firmly believe that you cannot manage or improve anything if you do not measure it.
2) Value-added measures (as I understand them) compare a student’s performance at the end of the year with their performance at the beginning of the year. This seems much better than the previous method of comparing this year’s cohort to last year’s cohort.
3) A major shortcoming is that there is much that is not measured. For example, two of my three kids were high school debaters (the last one at Grady HS). Both of them learned much more from debate than from the core curriculum. If I were forced to choose between the core curriculum and debate (for these two kids), I would choose debate. Some of the learning in debate will show up on core curriculum tests; the learning spans history, science, statistics and more. But more importantly, they learned how to think critically and be self-directed learners. As I said, I think it is important to measure, but we have to remember that these measurements only give us part of the picture.

cy

October 18th, 2012
8:59 am

Back to basics is what we need. Uniqueness has nothing to do with it. Reading, Writing and Arithmetic first and foremost. There is nothing wrong with holding parents and teachers accountable. The idea that esteem has anything to do with it has been the tragedy of education for years.

d

October 18th, 2012
9:00 am

@taco taco is right. When Dade County signed on to RTTT, they indicated their funds would be used to hire two administrators. How does that help in the classroom?

bc

October 18th, 2012
9:02 am

@Bill
Great point. I was on a national championship debate team in Gwinnett public schools, and I attribute over half of my education to debate. I also took 7 AP exams my senior year, was in the top ten of my class, and scored above 1500 on the old SAT exam. If it weren’t for debate, the greatly enhanced education I received from that activity (and a full ride to college on debate scholarships), things would’ve turned out very differently for me. I had some great teachers also (especially my AP English teacher, who was phenomenal), but I remember several who were certainly not there for the noble cause we often hear about.

Don Grant

October 18th, 2012
9:19 am

I am a parent of a child at the worst middle school in Atlanta—at least, that’s what this measurement tells me. Surely, I should believe this over my child’s 2.5 years of actual experience at this school. And those on the waiting list to get in are fools—just look at how much better the traditional public middles are performing. I should just disregard the praise given to graduates of ANCS by teachers at 4 area high schools. And I will definitely throw out the high scores on CRCT and ITBS testing—it’s a new day in evaluation, y’know. I never expected our school would be the best—but the worst? What fools we’ve all been: thinking that we were succeeding with a grass-roots, community-led, non-corporate funded charter where for DECADES Atlanta Public Schools has failed. Thanks for opening our eyes. Up is down. Sweet is sour. Wet is dry

Or…

Maybe, there’s a flaw in the system…

Teacher

October 18th, 2012
9:23 am

Taco Taco is totally right. This is all about money. Nothing more.

Value-added? As a teacher, just so all readers know: we have never even seen our “value-added” results. If they are meant for “improvement,” not penalization, should we not learn from our own, as well as our students’, strengths and weaknesses?

Yet another hype that will soon fade away. The system we are using is undeniably BROKEN.

Unfortunately, for TEACHERS and STUDENTS, we are trivialized to numbers and scores. From one test. 5 days out of the year. In one format for all learners. Although we are taught to differentiate to meet the needs of all of our students… Day in and day out we modify, challenge, remediate, and individualize our instruction and assessment; however, the “one test to determine our value” is ONE test for all!

jarvis

October 18th, 2012
9:33 am

Does anyone have the formula on “value added”? Just curious to see how the system works.

bc

October 18th, 2012
9:35 am

@ Don Grant
You are my hero. Bravo.

jarvis

October 18th, 2012
9:37 am

@Teacher, how can you use multiple measurements to measure anything? I’m a firm believer that if it can’t be measured, it’s not worth doing.

Southside Parent

October 18th, 2012
9:48 am

@Maureen: I appreciate your follow-up on this story. However, after reading the title, I was quite disappointed in the letter itself. Dr. Sullivan’s letter does not provide much in the way of explanation of how APS calculated their VAM scores. Dr. Sullivan primarily gives information supporting the validity of the scores as a measure of assessment. I hope you will follow up and request that Dr. Sullivan provide an explanation, as she certainly has the credentials to be able to do so.

@Don Grant: What this score reflects is that ANCS Middle in 2012 had the lowest VAM in APS by the calculation method APS utilized. It does not erase that ANCS Middle also had among the highest levels of student performance. In this single year, by whatever process APS arrived at their VAM scores, ANCS middle is rated a high performance, low growth school while ANCS elementary is rated a high performance + high growth school.

There are questions to ask about these scores, and there are limits as to what they can do. However, I think it’s important to keep in mind that it was the AJC that released these scores as a single score per school, not APS. Dr. Sullivan’s letter provides some reassurance to me that APS views the VAMs as a valid but flawed instrument for understanding what is happening at a school. I hope everyone at Trinity understands those flaws.

I would still love to better understand how APS specifically arrives at their VAM calculation. From materials not yet reported by the AJC, we know that APS utilized demographic data to compare performance gains between demographically similar students. We do not know the demographic groups compared, the level of variance within the groups, or how APS determined valid variance ranges when some of those demographic groups likely would have been very small statistical samples.

Maureen Downey

October 18th, 2012
9:56 am

@Southside and all, I sent your comments/concerns and those of some earlier posters to APS asking for more details on value-added. Will post response if I get one.
Maureen

JW

October 18th, 2012
10:06 am

Your title is incorrect. She did not explain how they calculated the scores. She explained a bit about how APS interprets and uses the scores, nothing more. Try again.

Don't Poke The Wolf

October 18th, 2012
10:21 am

The idea that King Middle is performing better than ANC Middle in added value is a clear indication that the mystery/voodoo formula being used by Sullivan is another SHAM!

There are 8th graders at Jackson High via King who couldn’t write a sentence if their life depended on it! I’ve met them. The ANCS students at Jackson are thriving in IB and AP classes but King is teaching more?

So the idea that King was a functioning school last year, teaching kids more information than ANCS, is just stupid!!! Does anyone at APS really believe this? It’s irresponsible and borderline criminal to continue the Dr. Hall smoke and mirror games and it must stop!! Stop victimizing the kids at King!! Stop pretending they are getting educated when they’re not!! The future will be bright with the new principal Paul Brown but that’s still down the road.

DR SULLIVAN PUT YOUR COMPUTER DOWN! GET OUT OF YOUR IVORY TOWER AT CLL! AND VISIT THE SCHOOLS! ANCS MIDDLE SCHOOL IS JUST UP THE STREET FROM YOUR KIDS IN THE ELEMENTARY SCHOOL!

JR Garcia

October 18th, 2012
10:30 am

@Maureen Downey – Can you also inquire if the computerized adaptive assessment software tool APS purchased last year is a factor in the VAM scores? APS made a significant purchase in a computerized adaptive assessment tool, similar to the MAP(r) system used by the City Schools of Decatur. Thanks.

RickinATL

October 18th, 2012
10:31 am

This is an expensive piece of machinery crafted to do half the job, poorly. In the grand public-school tradition!

Here’s an idea: next time we want to evaluate teachers, let’s evaluate teachers. Let’s get a video camera, a parent panel, and an outside expert paid for with grant money or PTA funds, not paid by APS, and rigidly, thoroughly evaluate every teacher twice a year.

Let’s insist that the recommendations of this panel (to terminate, to reward with bonuses, to insist on continuing education for the teacher, or to just give ’em a satisfactory rating) be adopted by APS unless APS has compelling data of its own indicating otherwise.

The two pages of bureaucratese at the top of this post? Unimpressive.

Jarod Apperson

October 18th, 2012
10:52 am

I would like to understand more about the mechanics of calculating these progress metrics. When I looked at the original list posted by the AJC, a couple of things stood out to me. The measure doesn’t appear to be a good measure of elementary performance, for a few reasons.

First, the measure only considers fourth and fifth grade performance, ignoring all that is achieved in K-3. If schools are successful in K-3, the majority of grades served, they may receive lower scores.

Second, something just doesn’t seem to add up. SPARK’s fourth graders rank in the 92nd percentile statewide and their fifth graders rank in the 96th percentile statewide. How is the school one of APS’s worst according to progress measures? Even if all those students were in the 100th percentile in 3rd grade, something seems off. If the measure is based upon points rather than percentiles, perhaps the measure ignores the bell curve?

Third, the fact that the difficulty of the CRCT is so low may make growth figures a bad measure of progress among students at the high end of the scale.

Beyond the question of whether progress scores can add value, APS needs to explain how the scores are actually calculated so that the process can be vetted.

Suzanne from Summerhill

October 18th, 2012
10:58 am

Please remove my previous comment. I would like to edit it?

Maureen Downey

October 18th, 2012
11:10 am

Suzanne, Your first comment didn’t post as all new commenters go into moderation until I can approve their comment.
When you post again, the comment will go right up.
Maureen

Southside Parent

October 18th, 2012
11:51 am

@ Don’t Poke: I’m *thrilled* King Middle has a decent VAM. In 2011-12 (pre-redistricting), King drew students from some of the lowest performing elementary schools in APS. High dropout rates at Jackson demonstrate that King Middle students are not prepared for high school. King started trending up in 2010-11, not long after Dr. Battle took the helm before moving up to regional director. King now has a truly exciting principal who seems committed to staying with the school for many years to come. I look forward to more great VAM scores.

But there’s a difference between growth and performance. King’s solid 2012 VAM doesn’t erase the fact that King is one of the lowest performing middle schools in the state of Georgia. On the other hand, several years of good VAM K-8 may yield decent performance at King, including students who can competently write sentences when they matriculate at Jackson.

Private Citizen

October 18th, 2012
11:57 am

“Effective Teacher in Every Classroom” initiative

These kinds of terms make me cringe. For starters, propaganda tends to use appropriation-speak. I note the red-herring use of an “all / everything” quantifier.

This type of propaganda demands that everyone be forced into the swirling vortex.

Why is it that it has become acceptable to apply these types of terms, which read like a bad cartoon, to the profession of educators? It is like the management ruling caste that does not work in the classroom has gotten everyone used to accepting this type of bullying and harassment, and it seems like a bird call summoning other bully management birds-of-a-feather to the role of enforcing these buzzwords over the people who do the work.

There might be a message to the analytics, but using propaganda terms for it is not good. It is as if the label would fit: “marketing used to manipulate stupid people.” It is not an appropriate description of the initiative and should not be used for the activities of professionals. Whoever originates these types of terms is getting a big fat paycheck and is not doing the work. Therefore, endorsing such unsophisticated “all / everything” terms is unethical and enables a mafia. Bureaucracies use this type of terminology like gasoline to power other activities. It is a complete smoke screen and occupies the information space while real information is withheld or done behind closed doors and then done to people using the “all / everything” mode.

Private Citizen

October 18th, 2012
12:07 pm

Nice disclaimer in the last paragraph. What a jerk.

Private Citizen

October 18th, 2012
12:09 pm

“these data only tell one part of the complex story of teaching and learning. We will continue to improve our data systems such that our school leaders have access to multiple sources of data to utilize….”

This guy is getting a nice fat paycheck to be a re-delivery artist.

Private Citizen

October 18th, 2012
12:11 pm

Hey Rubye, How about some BOOKS in the classrooms?

Private Citizen

October 18th, 2012
12:20 pm

This is EXACTLY the kind of information teachers have to listen to in person for hours and hours during the school year at required “faculty meetings.” And if any teacher speaks at one of these meetings, or asks for clarification, or does anything other than be in awe of the sacred re-delivery from the people with the big fat paychecks, they will be SMASHED and otherwise marginalised and pushed away from the obedient caste cult.

One question, Rubye: Do you have health insurance, and can you go to the doctor in a dignified manner without it affecting your pocketbook? Now, is this the same for the students (and families) you teach? The ones you love so much and are so “caring” toward.

Private Citizen

October 18th, 2012
12:22 pm

Propaganda “We” speak, “we will….” meaning “me and Bill Gates.”

mountain man

October 18th, 2012
12:22 pm

Teachers should not be held accountable for student learning – ADMINISTRATORS should. Teachers should only be evaluated for their teaching. Think about it – who can control absenteeism – the administrators! Who can control discipline – the administrators! Who makes the decisions to advance students to the next level when they have not mastered the subject matter – the administrators (sometimes over the objection of the teacher)!

Private Citizen

October 18th, 2012
12:37 pm

For anyone wanting details and clarification, you will not get them from the local information re-delivery artist. If you want to find more detail, etc. do a search for “value-added gates foundation.”

Why can’t the local re-delivery person just say, “It’s from the Gates Foundation. Go ask them”? It’s because this would not justify executive compensation. $5 says the re-delivery artist puts more than $100k/year in the “income” box of their annual tax return.

PS Don’t take it personally.

Private Citizen

October 18th, 2012
12:43 pm

@ Mountain Man, I think you should go open up a private school and control all that stuff. Make a bunch of diagrams. Then go over and visit the apartment complex off Roswell Road, the place where the drives between the buildings go up and down like hills and you can pull in in the evening, roll down your windows, buy some drive-through dope and then be on your way. You ought to stop and park the car and visit with the folks; no doubt at least somebody will be standing around. Ask them these questions. Ask them about “who can control absenteeism.” I’m completely serious, and you can get some of the most profound answers asking regular people who walk the walk. Then come back here and tell the people of Atlanta / environs what they say.

Private Citizen

October 18th, 2012
12:52 pm

Education admin has made the mistake of thinking they have something in common with the corporate or business caste. This is not true. Business exists to make money. Government exists to serve the whole populace. Traditionally, government jobs are supposed to pay moderately and, in return, have great stability. Working in government is a trade-off, and a dynamic person gives up any dreams about living the high life or making the big deal, whereas if you are in business, you can open up new markets or think up a way to make an extra couple million dollars and enjoy it and be doing good work. But business and government are two different castes with different objectives. My point is, a lot of people have gotten into education management and want to play-act like they are now business class with business-class income. This is a misapplication of caste. If you go into service work, you’re supposed to get moderate pay and job stability.

Private Citizen

October 18th, 2012
12:56 pm

Or maybe you should take it personally because you’re running a game.

Private Citizen

October 18th, 2012
1:04 pm

@ Mountain Man, You should read this: https://www.rutherford.org/publications_resources/john_whiteheads_commentary/americas_schools_breeding_grounds_for_compliant_citizens

Good old Henry Giroux. ha. “Explain it to me like I’m a five year old.”

Beverly Fraud

October 18th, 2012
1:13 pm

“When two exams do not have content alignment, we can think of value added as measuring whether students illustrate the same amount of knowledge one year later as students who scored similarly the year before.”

In other words, your argument falls apart from the very start…

Sounds like a post RIPE for Jerry Eads to comment on.

Private Citizen

October 18th, 2012
1:19 pm

Pardon the numerous posts. I’m going to go disappear for a while into the mist.

Beverly Fraud

October 18th, 2012
1:52 pm

“Atlanta Public Schools is also careful to be forthcoming…”

APS? Forthcoming? Someone actually said that after the cheating and North Atlanta debacle?

HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA!!!!!!!!

Questions on VAM

October 18th, 2012
2:17 pm

It’s interesting that Atlanta Neighborhood Charter, which prides itself on intentionally not teaching to the test, and SPARK, which does intensive test prep even in pre-CRCT grades, both scored low on this (single, unexplained) VAM measure. Assuming these particular VAM scores actually mean anything at all, there are many, many different things they could mean.

Are VAM scores available for different demographics or different performance levels? Do some schools seem to be concentrating on raising the number of students who score with basic proficiency, while others seem to be focusing on raising the number of students who “exceed standards”?

Jarod Apperson: “the fact that the difficulty of the CRCT is so low, may make growth figures a bad measure for progress among students at the high end of the scale.” Seems likely. VAM isn’t some magical new test. It’s a measure that incorporates CRCT test results. And if it’s making fine distinctions among high-scoring schools based on scores on the CRCT, that’s really problematic. Does the research Rubye Sullivan points to show that CRCT scores are valid discriminators among the highest achieving students?

Ole Guy

October 18th, 2012
2:32 pm

Once again, it appears that (presumably) educated people wish to create mountains outa molehills. Rather than attack the problems and issues directly (and, unfortunately, in an extremely unpopular, un-PC manner), they go about generating far more problems than they could possibly solve within the time frames of their fraudulent careers.

These problem areas, as I have pointed out so many times, are embarrassingly simple…perhaps too simple for those whose primary agenda is none other than…job justification. I will not bore the readership with repetitions of previous remarks, for I am quite certain they stand out in stark contrast to the seemingly popular approach of, above all, remaining politically correct; ensuring that any corrective measures are attached to the “smiley faces” of moot intervention, where everyone…parents, kids, and, of course, the education elite…all walk away with the very same feeling as one who has wee-weed one’s dark-colored trousers…it gives one that warm feeling while nobody really notices any differences.

Many of you may take a dim view of my somewhat “off-color” means of expression; however, I repeat…the ONLY way of regaining any semblance of value within the educational system is to return to the tried-and-proven ways of old-school education…no frills, gimmicks, or quackery; no “special considerations”; no “at risk” BS. STANDARDS, STANDARDS, STANDARDS…establish ’em; insist upon them…PERIOD!

Leila Holmann

October 18th, 2012
4:44 pm

Like Bill, I firmly believe that you have to measure something to be able to evaluate it at a given time and then improve on it. I understand that it is not easy to put in place a fair, good measuring system, and I accept that it may take years of tweaking to get there. In the meantime, we parents and instructors have to be patient in dealing with a system that may have flaws (as long as it is acceptable and we are moving in the right direction). The measuring system and tests may never be perfect, but hopefully they will become “good enough.”

Like Southside Parent, I would like a better explanation of how VAM scores are calculated (details, please). One question I have regarding the current system is what happens when students at a “high achieving school” have great scores at the beginning of the year on a given test: will the system be able to measure growth at the end of the year? Say a student already excels at the test at the beginning of the year (say he has already mastered the material taught that school year); will the test be able to evaluate growth for this student (into the following year’s material, for instance)? I wonder how this could be possible. It seems to me the tests would have to be very sophisticated to allow for such things, such as a rule-based computer test that can bypass easy questions and jump to more advanced ones based on the student’s level. Is this the case for the current CRCT tests?

McGill

October 18th, 2012
5:05 pm

I am concerned with R. Sullivan’s comment, “A school that is unable to motivate its students or teach good behavior will also be ineffective in raising academic outcomes.” One APS middle school with a relatively high VAM has a significant problem with behavior. I suggest R. Sullivan consider the correlation between low expectations from administrators and staff AND academic success and behavior.

Matt Underwood

October 18th, 2012
5:14 pm

I appreciate the thought Rubye put into responding to my letter and the responses it generated. I’ll also say that, somewhat knowing Rubye, she is extremely sharp and cares deeply about students, so some of the comments that question her intellect or her motivations are off-base, in my opinion. I don’t agree with all she writes above, but it’s certainly not because I have any doubts about her as a skilled and committed educator.

A few quick points I wanted to make regarding what she wrote:

The RAND Corporation and several other independent organizations have conducted large-scale studies of the nascent use of value-added modeling in K-12 education, and there is substantial evidence of the shortcomings of value-added modeling, some of which I mentioned in my letter. I mention that because, while the Gates Foundation may have other evidence regarding value-added modeling, the Gates Foundation is also helping to fund the very initiatives using value-added modeling, so looking to more independent studies would seem to me to be useful. More importantly, though, I don’t really see much acknowledgement or explanation of the limits and potential errors in value-added modeling alongside the scores that are given. Whether that’s the responsibility of APS or the AJC is debatable, but, considering how much emphasis is apparently being put on these scores, someone should really be putting them in their proper context.

As for using content (rather than skill) driven tests as a part of assessing growth in learning from one year to the next, I still contend that there are some statistical challenges with that approach. For me, though, the real issue lies in what the “learning” is that is being measured. If you have not, for example, seen the Georgia social studies CRCT, I can tell you that the test in the middle school grades assesses very discrete, disconnected factual knowledge. Being able to recall such information from memory for a multiple choice test is, to me, much less important than assessing students’ research skills, ability to make arguments based on historical documents, and the like. If one thinks that preparing students to do both of these tasks–recalling facts from memory and practical application of skills–equally well is possible during the course of a regular school year, please come and show me how it is done.

Lastly, though there is mention of “not over-interpreting the data” and utilizing “multiple sources of data…in the school improvement process”, I haven’t heard of any other data being reported in a similar way nor with the same level of investment of time and money. My hope is that there might be something coming…

Tony

October 18th, 2012
5:25 pm

Please don’t be deceived. There are some things that cannot be measured, and student learning is one of them. The idea that learning can be placed on some kind of month-to-month continuum is ludicrous on its face. The only value you get from the scores is related to what the students were tested on. The test represents a very narrow slice of reality and should not be extrapolated to the extent these folks are taking it. This is an example of the blatant misuse of mathematics to justify an agenda.

Tony

October 18th, 2012
5:26 pm

Good response, Matt.