A distorted reality
An ex-principal says inflated test scores skewed decision-making and hurt students. The problem isn’t fixed.
by Benjamin Herold for NewsWorks, a Notebook news partner
She says she tried talking to Comm Tech staff to get a handle on what was going on.
“‘Guys, help me understand this. What were we doing last year that accounted for the kind of academic performance the kids had?’”
In response, says Cruz, staff pointed to “Study Island,” a computer-based test prep program used at many District schools.
“It didn’t make any sense,” she said.
Despite her skepticism, Cruz says the 2010 PSSA results still led her to believe that only a small proportion of Comm Tech’s students needed remedial help. Rather than overhaul staffing patterns and course schedules to allow for a schoolwide intervention, she expanded use of Study Island.
But early indicators signaled disaster. Reports generated by Study Island suggested that students didn’t understand the material. Interim tests used to predict PSSA performance pointed to huge score drops. Cruz’s own eyes told her that students weren’t learning.
Some of her staff refused to believe any of it, she says.
“I got a lot of pushback,” said Cruz. “‘I don’t care what all this data is saying, our PSSA scores say something different.’”
Her efforts to get some staff to change their instruction or re-teach content were rebuffed.
“I felt like I was running into a brick wall,” said Cruz.
As a result, says Cruz, students at Comm Tech got a Band-Aid when they needed surgery.
“I don’t think the kids got the supports they needed,” she said flatly.
Shortly after beginning her second year at Comm Tech, Cruz left the District altogether.
Supovitz, the head of CPRE, has studied educational testing for 15 years.
A strong believer in standardized tests, he says that exams like the PSSA provide a reasonably accurate look over time at whether kids across a school or district are learning what they’re supposed to learn.
Shown the wild fluctuations in Comm Tech’s test score data between 2009 and 2011, Supovitz said the sharp spike in the 2010 scores should have provoked a closer look from the central office.
“That’s the usefulness of these kinds of data,” said Supovitz. “An administrator overseeing 250 schools can look and ask questions.”
That’s not generally how the scores have been used in Philadelphia, however.
Cruz says that if anyone inside District headquarters took a critical look at Comm Tech’s PSSA results, she wasn’t aware of it.
“I was not a part of any conversations like that,” she said.
Instead, District officials held up schools with improbable results as exemplars.
Huge gains lauded
Cruz recalls vividly a citywide principals’ meeting in 2010 at which former Roosevelt Middle School principal Stefanie Ressler was invited to present on her school’s astronomical test score gains.
“I’m sitting there going, ‘Well, how in the heck did she do that?’” recalled Cruz, who had just been removed as principal of West Philadelphia High. “I have the same resources, and I’m pulling my hair out, and I can’t make those kinds of leaps.”
Several members of Roosevelt’s staff later accused Ressler, now principal of Wilson Middle School, of cheating. The state-commissioned analysis found overwhelming signs of suspicious erasures in every tested grade and subject at Roosevelt between 2009 and 2011. An investigation is ongoing.
Accounts from the unfolding cheating scandal have been hard to swallow, says Cruz.
Despite making widely praised improvements in school climate during her time at West, Cruz was told during the 2009-10 school year that test scores weren’t rising fast enough. Superintendent Arlene Ackerman designated the school for a complete overhaul as part of her Renaissance turnaround initiative. Cruz was ousted.
“Slow, incremental growth got dismissed,” said Cruz, while questionable results were allowed to “distort what’s actually possible.”