Before I changed my college major to Physics, I studied Electrical Engineering. One of the ways electrical engineers look at complex circuits is to replace them with imaginary black boxes. They can then measure what goes into the black box (inputs) and what comes out of it (outputs), and from those measurements learn how the black box behaves without knowing much about what goes on inside it. This sort of approach can be useful for understanding the behavior of other complicated systems, school districts included.
We spend a lot of time evaluating things like budgets, staffing, and enrollment for the district – these are all inputs. We can analyze spending areas and benefit costs over time, but those are just detailed views of some of these inputs. What we haven’t looked at yet are the outputs, and they are overdue for some attention.
So what are the outputs of a school district? Different people will have different answers to this question. For me, I’m going to say “educated students,” as that is the reason the district was created – to educate the children of Amherst. Measuring the level of education is an important but complicated task. Few professional educators I know agree on a satisfactory way to measure it, and there are plenty of disagreements about what the measurements that are made actually mean.
But we shouldn’t let the lack of a perfect measurement stop us from making any measurements at all, and we should look very closely at the ones we can make – or, in this case, already have been making. At the beginning of each school year, the New England Common Assessment Program, or NECAP, tests are given to children in grades 3-8 in reading and in math. Some other subject tests are given as well, but not to all grades each year. The results of these tests for Amherst School District are posted on the New Hampshire Department of Education website (see the NECAP Longitudinal Reports section toward the bottom of the page). There you will find the percentage of students scoring proficient or above in each subject, as well as the average scaled scores, each broken out by grade level tested. Now if you’ve been following along with anything else on this website, you know that I’m about to show them to you.
Below are the grade-level NECAP testing results for math and reading for teaching years 2006 through 2012. Plotted first are the percentages of students scoring proficient or above in grades 3-8. (Note: the NECAP test covering teaching year 2006 was given in the fall of 2007, and so on.)

Amherst School District NECAP proficiency data for math and reading. Scores are reported for teaching year by grade.
Apart from the steady decline in the percentage of 7th graders testing proficient in math, nothing really jumps out here. That is the problem. Can you pick out the year we hired the literacy coaches from the reading results? How about the year we started with math coaches? I’ll admit that I’m having trouble figuring those out myself.
Perhaps we will have better luck looking at the average test scores, so here they are broken down by grade, in the same style as above. NECAP has an odd scoring system: each scaled score puts the grade level in the hundreds place and adds a numeric score that ranges from 0 to 80, so a 5th grader’s scores fall between 500 and 580. This strikes me as a very poor way to represent test scores, but here they are on comparable scales.
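As an aside, and purely as an illustration (this is my own sketch, not necessarily how the charts below were produced), putting scores from different grades on a common footing is just a matter of stripping off the grade offset:

```python
def normalize_necap_score(scaled_score, grade):
    """Strip the grade-level offset from a NECAP scaled score.

    NECAP encodes the grade in the hundreds place (e.g. grade 5 scores
    run 500-580), so subtracting grade * 100 leaves the 0-80 portion
    that can be compared across grades.
    """
    return scaled_score - grade * 100

# Hypothetical example values, not actual district results:
print(normalize_necap_score(548, 5))  # 48
print(normalize_necap_score(751, 7))  # 51
```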

Amherst School District NECAP average scaled scores for math and reading. Scores are reported for teaching year by grade.
Similar to the proficiency percentages we looked at before, there are a few years with modest gains, but there are modest losses as well. What is quite clear is that since teaching year 2006 we have seen no significant improvement in reading or math standardized test scores, or in the percentage of students testing proficient. Now would you be surprised to find out that over the same period we’ve added about $4 million to the Amherst School District annual budget (going from roughly $20 million to $24 million, an increase of about 20 percent)? This is where the question of value comes in.
There may or may not be better ways than NECAP to measure individual student performance, but that is an issue for a different time. Standardized tests are generally accepted as a reasonable way to make year-to-year comparisons and establish progress on a broader scale. And I think the numbers speak for themselves: we have room for improvement in the district. It is reasonable to guess that improvement was the goal behind starting our teaching coaches program. Now I don’t want to sound like I’m picking on teaching coaches, but it is not clear that this experiment has paid off. If anything, it looks to me like quite the opposite.
I should point out that last year, during the Ways and Means budget meetings, I asked the director of curriculum development about metrics demonstrating the effectiveness of the coaching program. I was told they did not have them, but that they had a plan to get them (though the time scale was not clear). Now consider that in the school year starting this fall we will spend $435,000 (salaries plus benefits, etc.) on teaching coaches in the Amherst School District alone. Literally millions of Amherst taxpayer dollars have been spent on the coaches program since it began, with no discernible impact on measured education levels in either math or reading. That is enough to make me question the value the teaching coaches program has brought to the students or to the taxpayers of Amherst.