Friday, September 20, 2013

The Boston.com MCAS School Rankings Stink!

This post originally appeared on October 10, 2012. However, with the annual rankings going up again today, I thought I would share these thoughts again.

Since the first time I saw the school rankings in the Boston Globe over a decade ago, I have been frustrated by the simplistic and misleading approach this news outlet has taken in publicizing the scores from our state's high-stakes test. The approach is simply to rank the schools from "Number One" down to whatever the final number is, depending on the grade level that was tested. For instance, if you were a school that had third graders in your building last spring, then you had 954 other schools to compare yourself with.

As I discuss these rankings, I need to make it clear that my intention is not to criticize or praise any school that I reference, but simply to clarify how the rankings work for those who take them too seriously.

Going back to third grade for a moment, the "number one" ranked school in the state in English Language Arts was the Richmond Consolidated School, which had 100 percent of its students score in either Advanced or Proficient. By the way, the Richmond Consolidated School tested only 19 students. Compare this to the school with the largest third grade population in the state, the Woodland School in Milford, MA, which tested 303 students and ranked 571st. Clearly we are comparing apples and oranges, and it is unfair to students and teachers to paint such a misleading picture. There are countless examples of these same types of comparisons at every grade level, and that is without even getting into the demographics of individual schools and communities.

Here's another thing that irks me about the Boston.com rankings.

Using the Grade 10 English Language Arts rankings as an example this time, I would like to ask this question: do you think that a school ranked "number one" clearly outperformed a school ranked 99th? While the answer is an emphatic NO, if I were a typical parent from Andover, Brookline, or any of the 23 schools that were ranked 99th, I would probably be wondering why my child's school is apparently so far away from "number one."

The explanation is pretty straightforward: there were 28 schools that had 100% of their students score either Advanced or Proficient, and they were therefore all ranked "number one." The next ranking was "number 29," a ranking shared by 22 schools that had 99% of their students scoring in the top two levels of the ELA MCAS. So, the good news for folks who ranked "number 99" is that 96% of their students scored either Advanced or Proficient.
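To make that arithmetic concrete, here is a minimal sketch of this "standard competition ranking" scheme, which appears to be what Boston.com uses: tied schools share a rank, and the next rank skips ahead by the number of schools tied above it. The school names and percentages below are made up for illustration; only the tie-handling logic is the point.

```python
# A toy sketch of "standard competition ranking": tied schools share a rank,
# and the rank below a tie skips ahead. All names and scores are made up.

def competition_rank(schools):
    """schools: list of (name, pct_advanced_or_proficient) tuples."""
    ranked = sorted(schools, key=lambda s: s[1], reverse=True)
    results = []
    for name, pct in ranked:
        # A school's rank is 1 + the number of schools scoring strictly
        # higher, so two schools at 100% share rank 1 and the first
        # 99% school lands at rank 3 (or rank 29, if 28 schools tied at 100%).
        rank = 1 + sum(1 for _, p in ranked if p > pct)
        results.append((rank, name, pct))
    return results

if __name__ == "__main__":
    schools = [("School A", 100), ("School B", 100),
               ("School C", 99), ("School D", 99), ("School E", 96)]
    for rank, name, pct in competition_rank(schools):
        print(f"#{rank:<3} {name}: {pct}% Advanced/Proficient")
```

Run against the real Grade 10 data, the same logic is what pushes a school with 96% of its students in the top two categories all the way down to "number 99."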



Growth Scores Are A Better Measure

Thankfully, our state's Department of Education has moved to a growth model for testing. What is a growth model?

Here is a quick definition from the DESE's website -
For K-12 education, the phrase "growth model" describes a method of measuring individual student progress on statewide assessments (tests) by tracking the scores of the same students from one year to the next. Traditional student assessment reports tell you about a student's achievement, whereas growth reports tell you how much change or "growth" there has been in achievement from year to year.
Shouldn't we be paying more attention to these measures? Isn't it more important to show where students started and how much they grew, compared to all of the students who had a similar score during the previous school year? For example, if we had a student in the lowest category (warning), shouldn't we get some credit for moving them along to the next level (needs improvement)? The obvious answer is - yes!
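For the curious, here is a toy sketch of the growth-percentile idea: a student's current score is compared only against students who earned a similar score the year before. To be clear, this is not the DESE's actual methodology (their student growth percentiles come from a more sophisticated statistical model); the scores below are invented purely to illustrate the concept.

```python
# A toy illustration of the growth-percentile idea (NOT the DESE's actual
# method): rank a student's current score only against students who had
# the same prior-year score. All score values here are hypothetical.

def growth_percentile(student, cohort):
    """student: (prior_score, current_score); cohort: list of such tuples."""
    prior, current = student
    # Peers are students who started from the same place last year.
    peers = [cur for p, cur in cohort if p == prior]
    below = sum(1 for cur in peers if cur < current)
    return 100 * below / len(peers)

# Five hypothetical students who all scored 210 ("warning") last year:
cohort = [(210, 212), (210, 218), (210, 224), (210, 230), (210, 236)]
print(growth_percentile((210, 230), cohort))  # 60.0 -> grew more than 60% of peers
```

The point of the model is visible even in this toy version: a student can post a modest absolute score and still show strong growth relative to peers who started in the same place.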

In addition, I am sure there are students who walk in the door in September already able to score at the advanced level on that year's MCAS test on day one of the school year. Therefore, it tells us very little when these students score advanced in May of the same school year. Again, we need to show that we are supporting student growth no matter where students are on day one of the school year.

One More Thing About Ranking Ourselves Based On Standardized Test Scores  

For those who aren't aware of the correlation between socioeconomics and standardized tests, there are clear connections between standardized test results and the median household income in a community or a state. Check out the graphic below depicting average NAEP scores across our country against the median household income in each state.

Source: http://www.edpolicythoughts.com/2012/10/why-does-massachusetts-rank-highly.html
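If you want a sense of how that kind of relationship gets quantified, here is a small sketch computing a Pearson correlation coefficient between state income and average test score. The figures below are invented for illustration and are not the actual NAEP or Census numbers.

```python
# A sketch of the correlation the graphic depicts, using made-up state-level
# numbers (not real NAEP scores or Census income figures).
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

median_income = [45000, 52000, 58000, 65000, 71000]  # hypothetical, by state
avg_test_score = [255, 260, 262, 268, 271]           # hypothetical, by state
print(f"r = {pearson_r(median_income, avg_test_score):.2f}")  # close to 1.0
```

A coefficient near 1.0, like the one these invented numbers produce, is the statistical shorthand for the pattern in the graphic: wealthier states tend to post higher average scores.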

Concluding Thoughts About Standardized Tests

In closing, I think that measuring student progress is critical. However, we have to keep standardized test results in the proper perspective. In Burlington, we are always of the opinion that we can do a better job for our students. There are certainly areas where we think our state test scores could be better, and we will have plans in place to accomplish this. However, we also have to be careful not to focus solely on these tests when we talk about our progress. Our feeling is that these tests are the floor, not the ceiling, for what we hope to see our students accomplish. As a community, we need to make sure that we are utilizing multiple measures to chart the progress of our schools and our students.

As a parent of three children in another district (grade 1, grade 7, and grade 9), I am less concerned about their standardized test scores and more interested in whether or not they are developing the skills they will need to be successful after their formal education is complete. I am fairly confident that their MCAS results, or their scores on whatever new federal or state standardized test comes down the pike, will not have a major impact on their success. If the major focus of their schools is on these results, then I am pretty sure I can find a computer program that can prepare them equally well.

Don't get me wrong, I think we need schools more than ever. The dilemma is that we need schools that realize the world we are preparing our students for has changed dramatically, and that we cannot prepare students with business as usual.

Here are a few blog posts that reference this idea:

An Interesting Question To Ponder - Are Schools Killing Your Child's Creativity?




A decade of No Child Left Behind: Lessons from a policy failure


