Thursday, March 10, 2016

Manhattan Institute Grades Schools Based on Unsubstantiated Norms


In the fall of 2015, the Manhattan Institute for Policy Research released a website, SchoolGrades.org, which claims to use an international standard of excellence to grade how well America’s schools prepare students in core subjects. Grading projects of this sort are only as useful as the transparency and merits of the underlying data and calculations.

In this case, a review reports that the site is rife with technical and logical shortcomings.

Jaime L. Del Razo, a Principal Associate at the Annenberg Institute for School Reform at Brown University and an Adjunct Assistant Professor in Brown University’s Department of Education, reviewed the website SchoolGrades.org for the Think Twice Think Tank Review Project at the National Education Policy Center, housed at the University of Colorado Boulder’s School of Education.

SchoolGrades.org uses reading and math test scores to evaluate and assign grades to U.S. schools, comparing schools across their respective states and to schools in other countries. The site’s creators never fully explain their approach, but it apparently uses a four-step process: (1) average two state test scores; (2) “norm” these results to the NAEP exam; (3) adjust this nationally normed measure using free and reduced-price lunch data to account at least partially for differences in socioeconomic status; and (4) “norm” these results to the international PISA exam.
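To make the inferred chain concrete, the following is a minimal illustrative sketch, in Python, of what such a four-step pipeline might look like if each “norming” step were a simple linear equating and the socioeconomic adjustment were a flat per-unit credit. The site never documents its actual formulas, so the equating method, the adjustment slope, and every number below are assumptions made only for illustration.

    # Purely illustrative sketch of the inferred four-step pipeline.
    # All constants and the linear-equating assumption are hypothetical;
    # SchoolGrades.org does not disclose its actual calculations.

    def state_average(reading_score, math_score):
        """Step 1: average the two state test scores."""
        return (reading_score + math_score) / 2.0

    def equate_linear(score, source_mean, source_sd, target_mean, target_sd):
        """Steps 2 and 4: map a score onto another test's scale by matching
        means and standard deviations (one common equating approach; the
        review notes the site never says which method, if any, it uses)."""
        z = (score - source_mean) / source_sd
        return target_mean + z * target_sd

    def adjust_for_frl(score, frl_share, slope=20.0):
        """Step 3: a hypothetical adjustment that credits schools with higher
        free/reduced-price lunch shares; the slope value is invented."""
        return score + slope * frl_share

    # Example with made-up numbers: state scores of 310 and 295, 60% FRL.
    avg = state_average(310, 295)
    naep_normed = equate_linear(avg, source_mean=300, source_sd=25,
                                target_mean=260, target_sd=35)   # to a NAEP-like scale
    adjusted = adjust_for_frl(naep_normed, frl_share=0.60)
    pisa_normed = equate_linear(adjusted, source_mean=260, source_sd=35,
                                target_mean=500, target_sd=100)  # to a PISA-like scale
    print(round(pisa_normed))

Even this simplified version shows how many unstated choices (equating method, reference means and spreads, adjustment size) lie behind each link in the chain, which is precisely the transparency problem the review identifies.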

The website claims that this process allows a parent to compare a local school to schools in other countries, yet the unsubstantiated norming chain is too tenuous, and the results too heavily extrapolated, to be of much value. The site fails to explain how international scores are equated to the national standard its creators developed, how letter grades were determined, and how free and reduced-price lunch counts were used to make socioeconomic adjustments. The central challenge facing the creators of this project was to equate scores across the various tests, yet none of the field’s abundant equating research is even cited.

Professor Del Razo concludes that the website’s reliance on aggregated test scores is far too narrow a base to serve as a useful evaluation of schools. He also points out that it perpetuates the misuse of testing as the best way to assess a school’s worth, despite ample research to the contrary. Thus, its approach to evaluating schools fails on technical grounds and, just as importantly, fails to consider the broader purposes of education in a democratic society.
