What to do with tainted PSSA scores?
By Benjamin Herold for NewsWorks, a Notebook news partner on Nov 20, 2012 03:39 PM
Howard Wainer poses a question:
If you measured the height of a group of kids and later found out some had been standing on a stool, what would you do?
“The statistical approach would be to try to estimate the height of the stool and make [an] adjustment,” says Wainer, answering himself. “If the stool is a foot [tall], and everybody is suddenly a foot shorter than they measured, then you have to do something.”
A leading national expert on testing and statistics, Wainer spent 21 years as the principal research scientist at the Educational Testing Service and is the author of “Uneducated Guesses: A Guide to Using Evidence to Uncover Misguided Education Policies.” He says the stool analogy is useful when considering the dilemma facing the Pennsylvania Department of Education (PDE) and the School District of Philadelphia.
A state-commissioned analysis found strong circumstantial evidence of adult cheating on state standardized tests in 2009, 2010, and 2011 at dozens of Pennsylvania schools, including 53 District-run schools and four area charters. After scores dropped precipitously last year, state Secretary of Education Ronald Tomalis made comments suggesting that PDE did not have confidence in the validity of the past results.
So what has been done?
A source familiar with the cheating investigation says officials from both the state and the District have quietly attempted to calculate how big an impact the cheating had on scores during the years in question. But so far, it appears that no one has acted on those calculations. The questionable, unadjusted scores continue to be used to hold schools accountable and to make a wide range of high-stakes policy decisions.
Through their spokespersons, both PDE and the District suggested they are waiting for investigations to conclude before taking action. Both declined interview requests.
The Ohio Department of Education recently took a different approach. After finding evidence that attendance records in the Lockland School District near Cincinnati had been manipulated in order to boost test results, the department downgraded the ratings of the district and some of its schools.
“We want to make sure we give the taxpayers, the parents, and the students an accurate picture of how well their school is doing,” said John Charlton, the associate director of communications for the department.
Wainer says it’s not difficult to understand why an agency might be slow to correct inflated test scores that created a false impression of improvement.
“Trust me, [a test score] wouldn’t be used if it was 20 percent lower than it should have been,” he says. “They would fix it.”