The National Assessment of Educational Progress — known as the Nation’s Report Card — released results today of how American students fared on a new component of its science test that included hands-on, interactive experiments and virtual labs.
The new component was added to the 2009 science assessment. In one example, 12th graders were asked to determine a location for a new town based on an assessment of the quality of the water flowing near each site. Students were asked to test water samples, determine levels of pollutants and then justify their decision about where to locate the new town using the data from the experiment they conducted.
Overall, students could conduct the experiments but were less skilled at using their data to justify conclusions or at writing reports. In one example cited in a webinar this morning on the results, 93 percent of fourth graders got the right answer in a science experiment, but only 32 percent could use the evidence from the experiment to justify their answer.
On the webinar announcing the results, National Center for Education Statistics Commissioner Jack Buckley said NAEP learned three key things through this new testing component:
• “Students are pretty good at doing some parts of science. The vast majority could use simulated laboratories to do the tests.”
• However, “students overall across all the tasks and across all grade levels were challenged by the parts of the test that required them to consider more than one variable at a time or if they had to make strategic decisions about how to collect the data.”
• “Students could select correct conclusions, but didn’t do so well when we asked them to explain their conclusions using the evidence from the data tables.”
Buckley noted that girls outscored boys in hands-on tasks, although boys outscored girls in the traditional NAEP science test for which results were released last month. There was no gender gap in the interactive computer segment.
Buckley said students loved these new assessment items, adding, “Kids said it was fun. It was hard to pry the computer and the tasks away from the kids. They really wanted to keep doing them. It was hard to get them to stop.”
Also taking part in the webinar was National Assessment Governing Board member Alan J. Friedman, a Georgia Tech graduate.
“With technology so close to the center of our society, we reward response to change and innovation,” he said. “So, testing to see how much students can memorize and how well they can follow instructions is no longer good enough. We need to know that students have the so-called higher order, 21st century skills…It is crucial to know if students know how an experiment or engineering task is designed, how data is analyzed and how to draw the best of multiple possible solutions. These are all critical to innovation.”
(Official definition: The National Assessment of Educational Progress (NAEP) is the largest nationally representative and continuing assessment of what America’s students know and can do in various subject areas. Assessments are conducted periodically in mathematics, reading, science, writing, the arts, civics, economics, geography, and U.S. history.)
Here is the official NAEP release:
Today’s results reveal that America’s fourth, eighth, and 12th graders can conduct science investigations using limited data sets, but many students lack the ability to explain results. The report shows that students were challenged by parts of investigations requiring more variables to manipulate, strategic decision-making in collecting data, and the explanation of why a certain result was the correct conclusion.
The new interactive computer tasks and updated hands-on tasks that involve more open-ended scenarios were administered as part of the 2009 science assessment by the National Center for Education Statistics to a nationally representative sample of more than 2,000 students in each of grades 4, 8 and 12. The findings provide important insights for educators and policymakers who are looking for academic approaches that support careers in science, technology, engineering and mathematics (STEM) fields, and encourage scientific inquiry.
“Science is fundamental to education because it is through scientific inquiry that students understand how to solve problems and ultimately how to learn,” said David Driscoll, chairman of the National Assessment Governing Board, which sets policy for NAEP. “So it’s tragic that our students are only grasping the basics and not doing the higher-level analysis and providing written explanations needed to succeed in higher education and compete in a global economy.”
The purpose of using hands-on and interactive computer tasks in testing is to determine whether students can solve problems as a scientist would. Hands-on tasks require students to perform actual science experiments, while interactive computer tasks require students to solve scientific problems in a computer-based environment, often by simulating a natural or laboratory setting.
“This innovative format allows for a richer analysis than a paper-and-pencil test,” Driscoll said. “Interactive computer tasks allow us to more deeply examine students’ abilities to solve problems because the tasks generate much more data.”
Only 53 percent of 12th graders reported that they were enrolled in a science course, and only 28 percent reported writing a report on a science project at least once a week. Ninety-two percent of fourth graders and 98 percent of eighth graders had teachers who reported doing hands-on science activities with students at least monthly. Thirty-nine percent of fourth graders and 57 percent of eighth graders had teachers who reported having at least a moderate emphasis on developing scientific writing skills.
The assessment measures science skills in a number of ways. Some questions use a model known as “predict-observe-explain” to examine students’ ability to combine their science knowledge with real-world investigative skills.
To correctly predict, students had to provide an accurate description of what might happen in a situation. For instance, when asked what kind of sunlight conditions were needed for a sun-loving plant and a shade-tolerant plant, 59 percent of fourth graders showed understanding that different plants have different sunlight needs.
Through the observe phase, students watched what happened as they conducted their experiments. Eighty percent of fourth graders made straightforward observations and tested how fertilizer and sunlight affected plant growth, but only 35 percent could perform a higher-level task that required them to make decisions about the best fertilizer levels for a sun-loving plant.
Students were then asked to explain what they had observed by interpreting data or drawing conclusions. Across all grade levels, a majority of students could observe, but far fewer could predict or explain. In fourth grade, fewer than 50 percent of students could explain why they selected a given fertilizer amount to support plant growth and use evidence to support their answer. At grade 8, 88 percent of students could correctly identify which liquid flowed at the same rate as water at a given temperature, while only 54 percent could support this answer with a written explanation of the evidence.
At grade 12, 64 percent of students could recommend the site for a new town based on information provided about water quality, and 75 percent could perform a straightforward investigation to test the water samples and accurately tabulate data. But only 11 percent were able to provide a valid recommendation and support their conclusions with details from the data.
More highlights from Science in Action include:
Overall achievement gaps
•There are gaps in average scores for all tasks between students from low-income families (those eligible for free or reduced-price lunch) and those from higher-income families.
•There are gaps by race/ethnicity. At all grade levels, white and Asian/Pacific Islander students outscored their black and Hispanic peers.
•At grades 4 and 12, Hispanic students scored higher than their black peers on interactive computer tasks and hands-on tasks.
•Female students outscored males on the hands-on tasks, but males scored higher on the traditional paper-and-pencil assessment. There was no gender gap for interactive computer tasks.
Grade 4 highlights
•Seventy-one percent of students could correctly select how volume changes when ice melts into water, but only 15 percent could support this conclusion with evidence from the investigation.
•Overall, students earned about 42 percent of the total points available from the questions they attempted on the interactive computer tasks.
•Overall, students earned about 47 percent of the total points available from the questions they attempted on the hands-on tasks.
Grade 8 highlights
•Eighty-four percent of eighth graders could correctly test how much water flowed to different soil samples during a simulated laboratory test.
•Overall, students earned about 41 percent of the total points available from the questions they attempted on the interactive computer tasks.
•Overall, students earned about 44 percent of the total points available from the questions they attempted on the hands-on tasks.
Grade 12 highlights
•Fifty-five percent of students could select the correct temperature changes occurring when a warm solid is placed in cool water, but only 27 percent were able to explain how heat was transferred from a warmer to a cooler substance.
•Overall, students earned about 27 percent of the total points available from the questions they attempted on the interactive computer tasks.
•Overall, students earned about 40 percent of the total points available from the questions they attempted on the hands-on tasks.
–From Maureen Downey, for the AJC Get Schooled blog