How good are DC's schools?
Austin Nichols

Public schools in DC have gained ground on national tests over the past 15 years, but much of that gain is due to the changing demographic composition of DC’s student body. Parents’ income and education are primary determinants of student performance, and as the income and education of DC residents have improved, so have incoming students’ scores. Comparing NAEP scores over time without accounting for incoming class scores is mere jackaNAEPery, and comparing proficiency levels is no better.

The best measure of school quality is how much students learn: their improvement from incoming to outgoing scores, not how well they test at a single point in time. Measuring how much students learn over time is difficult, and measures of teacher impact and school-specific test score growth compare DC public school students only with other DC public school students. Like most other measures of the value added by schools, this yields relative rankings only, not a measure of whether DC public schools as a whole are improving.

So how can we gauge how much a given school improves students’ scores relative to other schools? The median growth percentile (MGP) may be the best bet. Each student’s growth percentile equals the percentage of comparable students districtwide who performed worse on a later math or reading exam. A school’s MGP is then the typical student’s score: the median of its students’ growth percentiles.

For example, imagine Nicole earns a 45 on a math exam at the end of 4th grade. Nicole’s comparison group is students who also scored around 45 on that exam. Now imagine that at the end of 5th grade Nicole scores a 52. If that 52 is better than the scores of 70 percent of her comparison-group peers, then Nicole’s growth percentile is 70, which is a measurement of her relative growth. From there we can figure out each school’s MGP, the middle growth percentile of all of that school’s students.
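To make the mechanics concrete, here is a minimal sketch of that calculation in Python. The data frame, column names, and the simple "similar prior score" rule for defining the comparison group are all assumptions for illustration; the official growth percentile methodology used for DC schools is more elaborate.

```python
import numpy as np
import pandas as pd

def growth_percentile(df, student_id, prior_col="score_g4", later_col="score_g5", band=1):
    """Share of comparable students (similar prior-year score) whom this
    student outscored on the later exam -- a simplified growth percentile."""
    student = df.loc[df["id"] == student_id].iloc[0]
    # Comparison group: everyone districtwide with a similar prior-year score
    peers = df[(df[prior_col] - student[prior_col]).abs() <= band]
    peers = peers[peers["id"] != student_id]
    return 100 * (peers[later_col] < student[later_col]).mean()

def school_mgp(df, school, **kwargs):
    """Median growth percentile: the middle student's growth percentile."""
    gps = [growth_percentile(df, sid, **kwargs)
           for sid in df.loc[df["school"] == school, "id"]]
    return np.median(gps)

# Toy data: student 0 plays the role of Nicole (45 in grade 4, 52 in grade 5)
df = pd.DataFrame({
    "id": range(6),
    "school": ["A", "A", "A", "B", "B", "B"],
    "score_g4": [45, 45, 44, 46, 45, 45],
    "score_g5": [52, 48, 50, 47, 51, 49],
})
print(growth_percentile(df, 0))  # Nicole's growth percentile
print(school_mgp(df, "A"))       # school A's MGP
```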

A typical school will have a median growth percentile of 50; an MGP of 70 means a school is doing substantially better than typical, and 30 means substantially worse. So we can say one school is better than another if its MGP is higher, meaning the typical student’s growth is higher at that school (the lower-ranked school could still serve a specific subgroup much better than the higher-ranked one).

The main problem is that these scores are fraught with measurement error, so we can rarely say with confidence that one school is better than another. For example, Bancroft was in the top third on math MGP for 2011-12 and H.D. Cooke was in the bottom third, but their math MGP scores are statistically indistinguishable (their margins of error overlap).
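As a rough illustration of what "statistically indistinguishable" means here, the snippet below checks whether two schools' margin-of-error intervals overlap. The numbers are placeholders for illustration, not the published 2011-12 figures.

```python
def indistinguishable(mgp_a, moe_a, mgp_b, moe_b):
    """Two schools' MGPs are statistically indistinguishable if their
    margin-of-error intervals overlap."""
    low_a, high_a = mgp_a - moe_a, mgp_a + moe_a
    low_b, high_b = mgp_b - moe_b, mgp_b + moe_b
    return low_a <= high_b and low_b <= high_a

# Hypothetical values: intervals [50, 74] and [35, 55] overlap, so the
# apparent gap between the two schools is not statistically meaningful
print(indistinguishable(62, 12, 45, 10))  # True
```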

The graph below shows each school’s math MGP versus reading MGP for 2011 and 2012. A box around each dot indicates the range of statistically indistinguishable values for both math and reading in that year (2013 data were not released with these ranges). This lets us compare each school both to other schools and to the typical school, whose dot would sit at the intersection of the math and reading lines at 50. Each school’s box overlaps with the boxes of many other schools, and any two schools whose boxes touch are statistically indistinguishable. In many cases the dots representing best guesses for other schools fall inside the range for a given school, as for Barnard ES (Lincoln Hill) in the graphic below. Whenever we compare schools, we should remain appropriately skeptical about the relative strengths of signal and noise in these data.
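For readers who want to recreate a chart like this from the published data, here is one way it could be drawn with matplotlib. The table structure, column names, and values are assumptions for illustration, not the released dataset.

```python
import matplotlib.pyplot as plt
import matplotlib.patches as patches
import pandas as pd

# Hypothetical input: one row per school with MGPs and margins of error
schools = pd.DataFrame({
    "name":        ["School A", "School B", "School C"],
    "sector":      ["charter", "DCPS", "DCPS"],
    "math_mgp":    [63, 48, 38],
    "math_moe":    [10, 8, 9],
    "reading_mgp": [58, 52, 41],
    "reading_moe": [9, 7, 10],
})

fig, ax = plt.subplots()
colors = {"charter": "tab:blue", "DCPS": "dimgray"}
for _, s in schools.iterrows():
    # Box spans the range of statistically indistinguishable values
    ax.add_patch(patches.Rectangle(
        (s.math_mgp - s.math_moe, s.reading_mgp - s.reading_moe),
        2 * s.math_moe, 2 * s.reading_moe,
        fill=False, edgecolor=colors[s.sector], alpha=0.6))
    ax.plot(s.math_mgp, s.reading_mgp, "o", color=colors[s.sector])

# Reference lines at 50: the typical school sits at their intersection
ax.axhline(50, linestyle="--", color="gray")
ax.axvline(50, linestyle="--", color="gray")
ax.set_xlabel("Math MGP")
ax.set_ylabel("Reading MGP")
plt.show()
```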

So what can we say? Charter schools (in blue) tend to have higher growth scores, but traditional public schools (in dark gray) are overrepresented at both the highest and lowest ranks. The main difference between the sectors is that charters tend not to be observed among the lowest-performing schools, suggesting that the worst charters are better than the worst traditional public schools.

But there are surprises in both sectors. For example, what is the best high school in town? Ranked by the MGP for math, it is Thurgood Marshall Academy (a charter), east of the Anacostia, with McKinley Technology High School (DCPS) in Eckington running a close second. Only a handful of other high schools are statistically better than average on math in both years, but the sought-after high schools Wilson and School Without Walls are not on that list.

Among elementary schools, the top performers are scattered around the city, for example Ross in Dupont, Stanton in far Southeast DC, Tubman in Columbia Heights, Watkins on Capitol Hill, and Stoddert in Glover Park. The top middle schools are the KIPP DC AIM and KEY academies, with Cesar Chavez Prep and DC Prep’s Edgewood Middle charters not far behind. The highly regarded Deal Middle School in Ward 3 barely squeaks out a statistical advantage over the typical school.

Interested in learning more about how education in DC has evolved? Explore the schools chapter of Our Changing City, an interactive web feature that uses data to tell the story of change in the District of Columbia.

