Cheating: Time for answers
For nine years in a row, up through 2011, every summer brought cheery news that test scores went up in Philadelphia schools. Sure, performance wasn’t where it should be, but things were on the right track. More and more students scored proficient and many schools met their adequate yearly progress targets.
To some skeptics, it didn’t compute. Why were we still seeing dismal results on other assessments like the SAT and the NAEP? Why were graduates still crowding remedial classes at Community College of Philadelphia? But the likes of former superintendents Paul Vallas and Arlene Ackerman kept trumpeting the rising test scores.
Now much of that progress has been cast in doubt.
In 2011, the Notebook uncovered a state report showing that nearly 100 Philadelphia schools had implausible numbers of wrong-to-right erasures, according to a forensic analysis of the 2009 PSSA exam. When the state also looked at the 2010 and 2011 tests, it apparently found further signs of adult cheating. The Department of Education ordered investigations at 56 city schools – 53 District, three charters – and put tough security measures in place for 2012. Add in some severe state budget cuts, and lo and behold, at half the schools, scores plunged.
Investigations of past wrongdoing are dragging on. But the state says this year’s results are valid, and with new security measures, everything should be fine going forward.
We don’t buy it.
First, the results of the investigations so far give us no confidence that the past messes have been cleaned up. We have heard no explanation of what happened, so how can we know it won’t happen again? Unlike Atlanta, where widespread cheating was uncovered, these investigations are low-profile and low-budget. Without subpoena power, investigators have no leverage to get those who witnessed cheating to talk. Some schools were allowed to investigate themselves. Surprise – they found no evidence of cheating.
Second, we do not have a consistent testing system in place. Experts say that uniform procedures are important for test integrity. Yet District schools must follow different rules for proctoring the exam than other schools in the city and state do.
Above all, the evidence of widespread cheating points to the lengths to which people will go under a high-stakes testing regime when so much is riding on one exam – from teacher evaluations to school closings. Perhaps the overt cheating can be curtailed by tough security. But there are other ways to game the system: manipulating which students are tested, coaching on past test questions, or simply engaging in the intensive test prep drills that are commonplace at low-performing schools.
Until we get a thorough accounting of all the ways that test results have been manipulated, how can anyone have confidence that these are valid assessments of how students and schools are doing? How can critical decisions like closing schools be based on suspect data?
We cannot resume testing business as usual. We need answers: details about how the scores were manipulated – and by how much. Where was the chain of command? Who participated, who encouraged it, who looked the other way? Those involved need to be held accountable. And we need to build some alternatives to this high-stakes assessment system so that we don’t keep repeating this folly.