In July 2010, when Saliyah Cruz was named principal of Communications Technology High, state test scores said the small citywide-admission school in Southwest Philadelphia was one of the best in the city.
Everything else said something different.
SAT scores were poor. Summer enrichment programs were empty. Loads of kids tested into remedial reading and math. According to Cruz, even the police complained that many students had spent much of the previous school year at the nearby Penrose Plaza strip mall instead of in class.
“Those kinds of things didn’t add up for me,” she said. “If my kids were out in the street when they belong in school, how were they scoring  percent proficient?”
Two years later, an answer appeared: A mountain of circumstantial evidence now suggests that Comm Tech’s results on the 2010 Pennsylvania System of School Assessment (PSSA) exams were inflated by adult cheating. The school is one of 53 District schools and four area charters involved in a state-led investigation that has prompted questions about the validity of test results between 2009 and 2011.
Cruz, now a middle school principal in Delaware, says the suspect scores at Comm Tech hurt students. She also believes they reflected a districtwide culture of rewarding improbable PSSA gains while dismissing steady improvement.
“The message quite clearly was, ‘Here’s what’s expected in the School District of Philadelphia,’” said Cruz. “All the principals, all the teachers, all the kids need to be able to make these giant leaps forward.”
Testing experts say the ripple effects of inflated scores go even wider, especially because the District continues to rely heavily on likely tainted data to measure success and make high-stakes policy decisions.
“I think the implications are pretty profound,” said Jonathan Supovitz, a University of Pennsylvania professor who co-directs the Consortium for Policy Research in Education (CPRE).
“If we can’t assume the stability of the data, then any sense of guidance about what we’re doing well or not well is broken down.”
District officials declined to be interviewed for this story.
“It is too early to say how the PSSA scores have been affected by the allegations of testing improprieties,” wrote spokesman Fernando Gallard in a statement, citing ongoing investigations.
Each year, students in grades 3-8 and 11 take the PSSA in reading and math. Their scores are used to determine whether schools meet federally mandated performance targets, known as adequate yearly progress (AYP). In Philadelphia, they’re also used to make big decisions, including which schools get closed or converted to charters.
In 2010, 75 percent of 11th graders at Comm Tech scored proficient or above in reading. That was a 22 percentage-point jump over the previous year.
In math, 70 percent of Comm Tech 11th graders scored proficient or above, 40 points higher than the year before.
An analysis commissioned by the Pennsylvania Department of Education suggested the results may be illegitimate. In both 2009 and 2010, a high number of student response sheets at Comm Tech had suspicious patterns of “wrong-to-right” erasures – a telltale sign of adult cheating.
Before the 2010-11 school year started, Comm Tech’s principal, Barbara McCreery, was replaced. That year, under Saliyah Cruz, the suspicious erasures went away. The school’s scores tanked, dropping 38 points in reading and 45 points in math.
McCreery, now the principal at Bok Technical High, declined to comment for this story.
Cruz says she wasn’t sure what to think when she walked into Comm Tech.
“I thought I was taking the helm of a high-performing school,” said Cruz. “Although there were some red flags.”
She says she tried talking to Comm Tech staff to get a handle on what was going on.
“‘Guys, help me understand this. What were we doing last year that accounted for the kind of academic performance the kids had?’”
In response, says Cruz, staff pointed to “Study Island,” a computer-based test prep program used at many District schools.
“It didn’t make any sense,” she said.
Despite her skepticism, Cruz says the 2010 PSSA results still led her to believe that only a small proportion of Comm Tech’s students needed remedial help. Rather than overhaul staffing patterns and course schedules to allow for a schoolwide intervention, she expanded use of Study Island.
But early indicators signaled disaster. Reports generated by Study Island suggested that students didn’t understand the material. Interim tests used to predict PSSA performance pointed to huge score drops. Cruz’s own eyes told her that students weren’t learning.
Some of her staff refused to believe any of it, she says.
“I got a lot of pushback,” said Cruz. “‘I don’t care what all this data is saying, our PSSA scores say something different.’”
Her efforts to get some staff to change their instruction or re-teach content were rebuffed.
“I felt like I was running into a brick wall,” said Cruz.
As a result, says Cruz, students at Comm Tech got a Band-Aid when they needed surgery.
“I don’t think the kids got the supports they needed,” she said flatly.
Shortly after beginning her second year at Comm Tech, Cruz left the District altogether.
Supovitz, the co-director of CPRE, has studied educational testing for 15 years.
A strong believer in standardized tests, he says that exams like the PSSA provide a reasonably accurate look over time at whether kids across a school or district are learning what they’re supposed to learn.
Shown the wild fluctuations in Comm Tech's test score data between 2009 and 2011, Supovitz said the sharp spike in the 2010 scores should have provoked a closer look from the central office.
“That’s the usefulness of these kinds of data,” said Supovitz. “An administrator overseeing 250 schools can look and ask questions.”
That’s not generally how the scores have been used in Philadelphia, however.
Cruz says that if anyone inside District headquarters took a critical look at Comm Tech’s PSSA results, she wasn’t aware of it.
“I was not a part of any conversations like that,” she said.
Instead, District officials held up schools with improbable results as exemplars.
Huge gains lauded
Cruz recalls vividly a citywide principals’ meeting in 2010 at which former Roosevelt Middle School principal Stefanie Ressler was invited to present on her school’s astronomical test score gains.
“I’m sitting there going, ‘Well, how in the heck did she do that?’” recalled Cruz, who had just been removed as principal of West Philadelphia High. “I have the same resources, and I’m pulling my hair out, and I can’t make those kinds of leaps.”
Several members of Roosevelt’s staff later accused Ressler, now principal of Wilson Middle School, of cheating. The state-commissioned analysis found overwhelming signs of suspicious erasures in every tested grade and subject at Roosevelt between 2009 and 2011. An investigation is ongoing.
Accounts from the unfolding cheating scandal have been hard to swallow, says Cruz.
Despite making widely praised improvements in the climate when she was at West, she was told during the 2009-10 school year that test scores weren’t rising fast enough. Superintendent Arlene Ackerman designated the school for a complete overhaul as part of her Renaissance turnaround initiative. Cruz was ousted.
“Slow, incremental growth got dismissed,” said Cruz, while questionable results were allowed to “distort what’s actually possible.”
Since 2010, 26 schools, including West, have been either converted to “Promise Academies” or handed over to charter operators, largely on the basis of poor test scores. Last year, the District closed eight schools, based in part on the same scores.
Although it is impossible to undo any of those decisions, it is not too late to “build a system that produces stable data we can have confidence in,” said Supovitz.
No action yet from city or state
To date, though, both the Pennsylvania Department of Education (PDE) and the School District have declined to address the distorting effects of artificially inflated PSSA scores.
Both continue to use the three years of questionable results to hold schools accountable and guide significant policy decisions.
In September of this year, Secretary of Education Ronald Tomalis contended that 2012 is the first year in which the public can be confident that “PSSA scores are a true reflection of student achievement and academic progress.”
Regardless, PDE does not appear to have adjusted the past AYP status of any district or school. Roosevelt Middle, for example, is still deemed to have met its performance targets in both 2009 and 2010 – which gives it a more favorable AYP status now – despite the likelihood that its results from those years were tainted by cheating.
Through a spokesman, Tomalis did not respond to interview requests.
In an email, PDE spokesman Timothy Eller suggested that the state is waiting for its cheating investigation to conclude before making any decisions about adjusting AYP determinations.
“The department is considering the various options, and decisions will be announced when they are made,” he said.
The District has taken a similar stance.
No moves have yet been made to either remove the suspect data from use or to adjust it. Officials in Philadelphia still plan to use AYP status to help determine which schools to close this year. They also will apparently continue feeding questionable PSSA results into their School Performance Index, used to rank schools.
“The District will wait for the [investigation] findings before providing further comment on this issue,” wrote Gallard.
Saliyah Cruz has been affected by it all as much as anyone.
From her office in Delaware, she still wonders where Philadelphia schools would be now if officials had set a “realistic target” for growth instead of touting implausible test score gains as the norm.
“When you set up a system like that,” Cruz concludes, “it’s only a matter of time before you get the issues that we have now.”