
Behind the scenes: Grading Our Schools

Accountability.

Thirteen years ago, it was the educational buzzword that stemmed from the No Child Left Behind Act of 2001. Then-President George W. Bush signed the act into law on Jan. 8, 2002, cementing the buzzword's permanence.

The law’s unrealistic goal was to require every child in the United States to become proficient on his or her state reading and math tests by 2014. In attempts to overhaul NCLB, the Obama administration waived some of the law’s requirements, and the U.S. House passed the Student Success Act last summer.

There are new buzzwords – such as Common Core and the Keystone Exams – but accountability remains stronger than ever.

Like it or not, in public education, accountability is measured by test scores. The state has tinkered with exactly how those test scores are measured, but the bottom line is still performance-based.

I don’t think anyone will argue that improving student achievement is a good idea. The problems, and the debate, emerge in the details of exactly how you go about trying to improve it.

One of the biggest problems with laws like NCLB is that they are one-size-fits-all. Districts are not. Some have higher percentages of low-income and ESL students. Others have high numbers of transferring students or low attendance rates. Funding has decreased at different levels. All of those things affect test scores, but none of them is really addressed in the bottom line of test scores and accountability.

Back in 2001, I wanted to come up with a way to better measure how students, and schools, were doing in meeting the NCLB goals. I spent weeks collecting all kinds of data from every public school district and every elementary, middle and high school in Pennsylvania. I met with a University of Scranton educational statistician, and we spent weeks crunching and analyzing all of that demographic and test-score data using the statistical software program SPSS.

I wanted to know what contributed to low and high SAT and PSSA test scores.

The analysis revealed that the factors most strongly correlated with high and low test scores are the percentage of low-income students in a district, student-attendance rates and per-pupil expenditures – in that order, and probably of little surprise to anyone in education. We were then able to come up with average predicted SAT scores and average predicted PSSA test scores for each school district. The predicted scores were tailored specifically to each district and painted a picture of how well each district was doing on those tests based on its resources, student and teacher demographics and other factors.

Then, by comparing the predicted test scores with the actual test scores for each district, I was able to see how close, or how far away, the districts came to their predicted scores. The results were interesting because they leveled the playing field.
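The approach described above – fitting test scores to district factors and then comparing each district's actual score with its prediction – can be sketched in a few lines. This is a rough illustration under my own assumptions, not the project's actual SPSS workflow: the districts, scores and figures below are invented, and ordinary least squares stands in for whatever model SPSS produced.

```python
import numpy as np

# Each row is one hypothetical district:
# [share of low-income students, attendance rate, per-pupil spending]
X = np.array([
    [0.45, 0.93, 11000.0],
    [0.20, 0.96, 13500.0],
    [0.60, 0.90, 10200.0],
    [0.35, 0.94, 12000.0],
    [0.10, 0.97, 15000.0],
])
actual_scores = np.array([480.0, 540.0, 450.0, 500.0, 570.0])  # invented averages

# Fit a linear model (intercept plus the three factors) by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, actual_scores, rcond=None)

# Predicted score for each district, given its demographics and spending.
predicted = A @ coef

# The interesting number: how far each district lands from its own prediction.
residuals = actual_scores - predicted  # positive = beating the prediction

for i, r in enumerate(residuals):
    label = "above" if r >= 0 else "below"
    print(f"District {i}: {abs(r):.1f} points {label} its predicted score")
```

Because each district is measured against its own prediction rather than a single statewide bar, a low-income district that outperforms its predicted score shows up as a success, which is the "leveled playing field" effect described above.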

After five years of doing this statistical analysis, the state altered the way it releases PSSA test scores.

Instead of an average raw test score, which was crucial to the analysis we were doing, the state would release only proficiency percentages. We fought, we appealed, and we lost in our efforts to keep getting the average raw scores so our SPSS analysis could continue.

With that change, we had to revamp Grading Our Schools. What you saw in yesterday’s paper and here on our site is the result of that. Instead of a statistical analysis, the report looks heavily at proficiency rates, using the state averages to see how districts are doing.
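The revamped comparison – measuring each district's proficiency rate against the state average – amounts to a much simpler calculation, sketched below. The district names, percentages and state average are invented for illustration.

```python
# Hypothetical statewide percent-proficient figure and district rates.
state_avg = 72.0
district_rates = {"District A": 78.5, "District B": 65.0, "District C": 72.0}

for name, rate in district_rates.items():
    diff = rate - state_avg  # percentage points above or below the state average
    status = "above" if diff > 0 else ("below" if diff < 0 else "at")
    print(f"{name}: {rate:.1f}% proficient, {status} the state average ({diff:+.1f})")
```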

Staff Writer Sarah Hofius Hall took the Grading Our Schools reins in 2008. Weeks before the special report appears, she crunches numbers, analyzes data and writes stories, not just for The Times-Tribune but also for our sister papers, The Citizens’ Voice in Wilkes-Barre and The Standard-Speaker in Hazleton. The work is exhaustive.

This year was even more challenging because the state failed to give us all the data we needed. We expect to have it later this week, and we will add it to our data center.

Key to the project is our searchable database. You can customize your search by selecting different districts statewide and comparing their demographics and other data, including percentages of special education, ESL and low-income students, teacher and administrator salaries, drop-out rates, graduation rates, per-pupil spending and total district spending.

In its 14th year of publication, our special report is just a snapshot. Test scores and demographics don’t measure all the other important aspects of a well-rounded education. Please keep that in mind as you dive into the data. Study it. Discuss it. And, please contact us if you have any questions about it (jmatthews@timesshamrock.com; shofius@timesshamrock.com).

We hope you recognize the report is an important tool. It is meant to foster discussion and debate with the goal of improving education, so our students have what they need to succeed. – Jess

