My Goal in Blogging

I started this blog in May of 2008, shortly after my election to the School Committee, because I believed it was very important both to give the community an opportunity to share their thoughts with me about our schools and to give me an opportunity to ask questions and share my thoughts and reasoning. I have found the conversation generated on my blog to be extremely helpful to me in learning community views on many issues. I appreciate the many people who have taken the time to share their views. I believe it is critical to the quality of our public schools to have a public discussion of our community priorities, concerns and aspirations.

Friday, December 11, 2009

Per Pupil Expenses: Amherst Versus Northampton

There is considerable talk around town about our dire budget situation and a possible March 23rd override. But as a parent with three kids in the elementary schools and a member of the School Committee, I believe my responsibility is to make sure the schools are using whatever money we have in the wisest way possible (whether we have lots of money or not so much money in a given year). As my blog readers know, I find looking at comparisons to other districts very useful in thinking about how we do education (e.g., what courses we offer/require in a given discipline). I also think such comparisons are very useful in thinking about finances.

So, using public records (http://finance1.doe.mass.edu/statistics/), I examined comparisons between the Amherst Regional Schools and Northampton. Note that Northampton is very similar to Amherst in multiple ways -- they serve just about the same number of kids (2,800 compared to our 3,086) in the same number of schools (6), and the populations are quite similar (except they have a higher percentage of low income kids -- 26.4% compared to our 17.3% -- and a higher percentage of special ed kids -- 21.8% compared to our 18.6%). Yet they spend almost $1,000 less than the state average per pupil ($11,613.94 compared to the state average of $12,448.78), and we (Regional Schools) spend more than $3,500 above the state average ($16,131.11). So you have to ask: do kids in Amherst get a better education for this considerable extra money?

And I don't think so. In the Northampton middle and high schools, class sizes are the same as ours, yet they have no mandatory study halls in high school (our kids have 2 per year right now, meaning 8 over the course of high school, and may well have 3 next year, meaning 12 over the course of high school). They also have more AP course offerings than we do (e.g., AP Chemistry and AP Statistics). So I don't see why we are spending so much more while our kids are getting so much less.

That led me to look at the detailed charts to see where our spending is high relative to Northampton. Here are all the areas in which you can see a real difference (again, you can go to the link I posted above to see the numbers for yourself; there is also a brief sketch after this list that puts the figures side by side):

1. We spend more on administration than Northampton (and both spend more than the state average) -- this is particularly true for Human Resources ($114.15 per pupil in Amherst-Pelham, $31.74 in Northampton, $32.69 state average) and IT ($218.19 in Amherst-Pelham, $146.43 in Northampton, $65.40 state average).

2. We spend more on instructional leadership ($1,237.41 per pupil versus $738.57 in Northampton; the state average is $800.14) -- that includes curriculum directors and school-building leadership (e.g., principals). Northampton is at the state average on principals -- we are much higher (this number may be slightly off, since our figure covers only middle and high school principals, who are paid more than elementary school principals, whereas the Northampton and state numbers include elementary school principals, which pulls those averages down).

3. We spend about the state average on classroom teachers, which includes actual classroom teachers as well as anyone teaching in a group setting (e.g., art, music, PE, etc.). But we spend far more on specialist teachers -- these are teachers who provide individual specialized attention to one or more students (academic support, intervention, special ed, reading recovery, ELL, etc.). I don't see why we spend $1,140.47 per pupil on these services when Northampton spends $214.50 (with more special ed and low income kids than we have) and the state average is $472.34.

4. We spend more than the state average on "other teaching services" but so does Northampton (these are substitutes, medical/therapeutic services, librarians, paraprofessionals).

5. We spend more on professional development than the state average -- Northampton spends less.

6. We spend more on instructional materials than the state average. Northampton spends less.

7. We spend more on guidance, counseling and testing. Northampton spends just about the state average. So, we spend $448.22 per pupil on guidance and adjustment counselors -- Northampton spends $275.06, and the state average is $232.84.

8. We spend more on pupil services (e.g., athletics, transportation) than the state average -- Northampton spends less.

9. We spend more than the state average AND more than Northampton on operations and maintenance.

10. We spend way more on insurance and retirement programs than the state average -- Northampton spends less. So, we spend $3,112.46 per pupil on retirement programs and insurance for employees -- Northampton spends $1,905.06, and the state average is $2,069.27.

11. We spend MORE than Northampton and MORE than the state average on payments to out of district schools. We pay $23,809.73 in this category -- Northampton spends $15,438.42, and the state average is $20,497.51.
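For anyone who wants to line these figures up side by side, here is a minimal sketch in Python using only the per-pupil dollar amounts quoted in the list above (the category names are my own informal shorthand, not the official DOE line-item labels):

```python
# Per-pupil dollar figures quoted in the list above:
# (Amherst-Pelham Regional, Northampton, state average).
# Category keys are informal shorthand, not official DOE line-item names.
categories = {
    "human resources":          (114.15,    31.74,    32.69),
    "information technology":   (218.19,   146.43,    65.40),
    "instructional leadership": (1237.41,  738.57,   800.14),
    "specialist teachers":      (1140.47,  214.50,   472.34),
    "guidance/adjustment":      (448.22,   275.06,   232.84),
    "retirement & insurance":   (3112.46, 1905.06,  2069.27),
    "out-of-district payments": (23809.73, 15438.42, 20497.51),
}

print(f"{'category':<26}{'Amherst':>11}{'Noho':>11}{'state':>11}{'vs Noho':>11}{'vs state':>11}")
for name, (amherst, noho, state) in categories.items():
    print(f"{name:<26}{amherst:>11,.2f}{noho:>11,.2f}{state:>11,.2f}"
          f"{amherst - noho:>11,.2f}{amherst - state:>11,.2f}")
```

Running it simply prints a small table of the quoted figures along with how far Amherst-Pelham sits above Northampton and the state average in each category.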

So, what do I conclude? We spend a lot more money than the state average (which includes many expensive Boston-area districts where salaries would have to be higher) in virtually all categories, and we spend far more per pupil than all other area districts. It is not clear to me that this money is being well spent. If we are spending so much more than other districts, why do our kids spend more time in study halls and have fewer course offerings than kids in other schools? And although I certainly believe that we need to spend extra resources on kids who need services (intervention, ELL, special ed), I don't see why it costs us more to provide these services than it costs other districts, nor do I see any evidence that we provide better services than other districts do.

85 comments:

Anonymous said...

Thank you Catherine for this analysis.

In my view it is absolutely critical that the District get some professional help - an accountant with a knowledge of public systems - to analyze Amherst's system.

Your post is cogent. The more hard data we have, not just comparative data but data about how we allocate and use resources, the better.
MA

Anonymous said...

This is really incredible. Your comment:
"I don't see why Northampton spends $214.50 on specialist teachers (with more special ed and low income kids than we have) and the state average is 472.34 and we spend $1,140.47."

So it's not about teacher bashing, but it is really about investigating and evaluating what we do and spend as a school system. Amherst should do something it doesn't like to do: ask Northampton how it does what it does. We have much to learn.

As an anonymous poster, I say thanks for this review, Catherine.

Anonymous said...

If what Alisa Brewer says is correct, then accounting practices may explain the differences -- i.e., some towns may allocate certain school costs (retirement, insurance) to the municipal ledger rather than the school ledger, which would inflate Amherst's school budget and depress other towns' school budgets. But it bears looking into.

Anonymous said...

Read today's commentary in the Bulletin by Bogartz - wondering whether Catherine can respond to the questions there. "Did the statistical effect of raising the Crocker Farm MCAS average have anything to do with the redistricting decision? Were Amherst data studied to assess the relationship between family income and academic achievement? How strong is the correlation? Are other variables such as parental language skills, parental level of education and parental occupation more strongly related to academic achievement here than who pays for lunch? Is there a composite score that is more predictive, and should we have been equalizing on the basis of that score?" Instead of doing it as an experiment, as he suggests, is it possible to study its outcomes? Have any other districts that made this change looked at outcomes?

Unknown said...

This is a duplication from another thread.

I don't know where comparable Amherst budget info is available, but the NoHo mayor's office and the public school websites post the school budget details:

A nice summary of proposed changes for 2010 at

http://www.northamptonma.gov/gsuniverse/httpRoot/mayor/uploads/listWidget/2619/doneSchoolsfy2010.pdf

While the "service line item budget" (each category and school by school) is available at:

http://www.nps.northampton.ma.us/budget.html

Joel said...

RE: Alisa Brewer's comments:

I responded to them on that thread. Simply put, I find it outrageous that someone so involved in town govt. would make such a claim. Our numbers are so far above regional and state averages, which include the much higher-cost Boston metro area. But Alisa wants us to believe that they aren't really that high because we are the only district, or one of only a few, that uses such accounting. There's no proof of that, just a claim.

Also, accounting nomenclature doesn't explain how we have two forced HS study halls while districts that spend less per pupil don't.

These issues are too serious to be dismissed with empty assertions that it's "probably" just an accounting difference.

Nina Koch said...

Northampton is on a block schedule. It actually saves a lot of money for a school district. Quite a few local districts are on block. It allows them to offer more courses for the same money, because the teachers teach six courses a year instead of five.

People who don't like the trimester probably will like the block even less. So be careful what you wish for.

Rick Hood said...

I like the idea of sitting down and comparing Noho’s real budget with our real budget – perhaps Hadley and Longmeadow also. I don’t really trust the DOE numbers when I see this:

Cambridge: $24,467 …does Cambridge really spend $24,467 per kid?

And for Amherst Regional, DOE shows ’08 total budget $30,523,280 when in fact the ARPS budget as published was $27,257,882. It also shows ’08 enrollment of 1,892 when in fact it was 1,957.

See: http://www.arps.org/files/RegionalBudgetFY10.pdf

So that means the $16,132 shown looks to be incorrect and was really $13,928, which is still above the $12,448 that DOE says is the state average, but knowing the Amherst numbers are so far off, I don’t believe DOE’s average either.
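Spelling that division out (a quick sketch using only the figures above -- per pupil is just total budget divided by enrollment; the couple of dollars' difference from DOE's published $16,131 is presumably rounding in the enrollment count):

```python
# FY08 figures quoted above: DOE's published totals vs. the ARPS budget book.
doe_budget, doe_enrollment = 30_523_280, 1_892
arps_budget, arps_enrollment = 27_257_882, 1_957

# Per-pupil spending is just total budget divided by enrollment.
print(f"DOE figures:  ${doe_budget / doe_enrollment:,.0f} per pupil")   # ~$16,134
print(f"ARPS figures: ${arps_budget / arps_enrollment:,.0f} per pupil") # ~$13,928
```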

I will check with Rob Detweiler to see if I am missing something – but these are the numbers as published, for both DOE and ARPS FY08.

Anonymous said...

Cambridge does spend a freakish amount on its schools for a whole variety of reasons.

Anonymous said...

Wondering about the validity of comparing the ARPS numbers (which include just the middle and high schools) with the numbers from the entire Northampton system (including the elementary schools). Seems like kind of an apples and oranges situation.

Catherine A. Sanderson said...

My responses:

Anonymous 10:47 - thanks for your comments ... and I frankly would hope that a professional accountant wouldn't be necessary ... I would hope that our administration is looking carefully into all areas of spending during this tight budget time, and in turn, finding ways in which our spending could look more like other area districts'.

Anonymous 11:19 - I agree that we have much to learn ... in many areas. I hope we decide to start looking to other districts to see how they are seemingly managing to provide more for less.

Anonymous 12:59 - Alisa may be right ... but I haven't seen any evidence that she is, other than that she has suggested this is a possibility. Do we really do accounting differently than all other towns? Perhaps -- but I'd like to see some evidence for that assertion.

Anonymous 1:54 - I'm glad to answer these questions (again -- they have been addressed on earlier blog posts). CF is now in corrective action, and therefore we did feel a change needed to be made -- that was clearly stated at multiple points. We did look at the association between family income and MCAS, and yes, low income students failed to make AYP in math (at FR, CF, and WW -- MM numbers were too small to count as a subgroup), and in ELA at CF (again, MM numbers were too small). We don't have data on parental education or occupation -- those just aren't things we collect from parents. ELL kids are likely to have parents with weaker English skills -- and at CF, ELL and low income status were HIGHLY correlated (I believe in the 90% range -- much more so than at the other three schools). I am certain we will be studying the outcome -- and hope to have all schools making AYP from now on. I do know of other districts in which such a change has been made and, as I believe is even noted in this piece, higher achievement among low income kids resulted (at no cost to achievement among higher income kids).

Jennifer - thanks so much for those links. I am in shock that the Northampton schools post more stuff on line for all to see than members of the SC get to have in Amherst. I've never seen a line by line budget of any school, and this is very informative. I will definitely make such a suggestion to our administration.

Joel - I too would like data to support this assertion -- and I still don't understand why it means we have more study halls.

Nina - I'm not sure how the difference in teaching load between a block schedule and our system explains these differences at all -- since the budget for classroom teachers is quite similar in both districts. The costs for things like specialist teachers (and retirement/insurance/HR/IT, etc.) are MUCH higher in Amherst, but that surely can't have anything to do with the block schedule, right? I can only see the block schedule influencing teacher pay -- am I missing something?

Rick - I think doing some careful budget comparisons with local districts would be a great idea -- and would be very informative.

I'm not sure why the numbers published on the ARPS website are necessarily more accurate than those published by the state ... but that should of course be checked. It is also possible that the state has inflated other districts' budgets as well, and hence we may still be out of the norm compared to other districts.

The enrollment data is tricky since sometimes we include preK enrollment and sometimes we don't.

Anonymous 9:27 - I've also heard that Cambridge spends huge amounts of money per student!

JPO - it definitely is problematic to make these comparisons between our schools (MS and HS) and Northampton's ES, MS, and HS. However, the Amherst ES numbers are also really high per pupil (over $15,000), so even if you average those two numbers, I don't think that helps reduce the discrepancy.

Anonymous said...

So I was just looking over some MCAS data from the most recent tests (Spring 2009, results available at http://profiles.doe.mass.edu/).

It's striking what large differences there are between Amherst's scores and Northampton's. I was just looking at MS/HS numbers, since the discussion here focuses on these schools. I looked at scores for grades 7, 8, and 10, averaging together the various components for each grade (English/Math for grade 7, English/Math/Science for grades 8 and 10), and combining the scores for the "Advanced/Above Proficient" and "Proficient" categories, excluding the "Needs Improvement" and "Warning/Failing" categories.
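For anyone who wants to reproduce that aggregation, here is a rough sketch of the method (the percentages below are placeholders, not the real 2009 component scores -- those are at http://profiles.doe.mass.edu/):

```python
# Placeholder component results for ONE grade in ONE district -- NOT the real
# 2009 numbers. Each entry is the percent of students in the Advanced/Above
# Proficient and Proficient bands on that test.
components = {
    "ELA":     {"advanced": 30, "proficient": 45},
    "Math":    {"advanced": 25, "proficient": 40},
    "Science": {"advanced": 20, "proficient": 42},
}

# Combine Advanced + Proficient for each test, then average across the grade's tests.
per_test = [bands["advanced"] + bands["proficient"] for bands in components.values()]
grade_average = sum(per_test) / len(per_test)
print(f"Advanced + Proficient, averaged across components: {grade_average:.0f}%")
```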

In seventh grade, the statewide average for the Advanced and Proficient categories is 60%. Northampton is slightly ahead of the state average, with 63%. Amherst is sixteen percentage points ahead of Northampton, with 79%.

In eighth grade, the statewide average for the Advanced and Proficient categories is 55%. Northampton is exactly equal to the state average, with 55%. Amherst is fourteen percentage points ahead of Northampton, with 69%.

In tenth grade, the statewide average for the Advanced and Proficient categories is 72%. Northampton is ahead of the state average, with 80%. Amherst is six percentage points ahead of Northampton, with 86%.

In all three grades, Amherst's scores are substantially ahead of Northampton's.

Looked at from another perspective -- the percentages of "Needs Improvement" and "Warning/Failing" scores -- in all three grades Northampton has about 50% more students than Amherst in these bottom two categories.

I know there's a lot more to evaluating educational performance than MCAS scores. But you're always advocating a clinical, data-driven approach, and I don't see how one can ignore the substantial disparity in test scores in trying to evaluate whether or not our budget is being spent effectively in comparison to other towns.

- Jonathan

Anonymous said...

"I am in shock that the Northampton schools post more stuff on line for all to see than members of the SC get to have in Amherst. I've never seen a line by line budget of any school, and this is very informative."

A call for transparency... put a detailed school budget online. It is our commonwealth and our common budget.

Rick Hood said...

We really need to make sure we are looking at the correct data before we conclude "we spend way more money in per pupil expenses than all other area districts." - especially the "WAY more" part.

We're getting everyone all worked up about it before knowing this for sure.

Anonymous said...

Jonathan - Thank you. Those are convincing numbers. I'd way rather have my kids in Amherst schools, learning more (and I'd rather pay more for it.) I know people very well in Noho who send their kids anywhere else but those schools.

Anonymous said...

There are so many, many levels to this topic and discussion. It is hard to figure out who or what system is right. I do think, however, that Amherst must be way more transparent than it has been up to now.

Nina Koch said...

well said, 8:43 am. I worry about the tendency to oversimplify complicated situations. I also agree that we can improve how we deliver information to the public.

Catherine, yes you are missing something about the cost savings of the block schedule. If Northampton wanted to offer the same number of courses under a traditional schedule it would cost them 20% more. You asked why they don't have forced study halls and the block schedule is part of the reason why. The block gives you more course offerings for your dollar.
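Roughly, the arithmetic behind that 20% figure looks like this (a simple illustration of the staffing trade-off, using the six-versus-five course loads mentioned earlier in the thread; the number of sections is made up):

```python
# Course sections one teacher covers per year under each schedule
# (six on a 4x4 block vs. five on a traditional schedule, as discussed above).
block_load, traditional_load = 6, 5

sections_needed = 120  # hypothetical number of course sections a high school wants to run

teachers_on_block = sections_needed / block_load              # 20 teachers
teachers_on_traditional = sections_needed / traditional_load  # 24 teachers

extra = teachers_on_traditional / teachers_on_block - 1
print(f"Traditional schedule needs {extra:.0%} more teachers for the same offerings")  # 20%
```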

The block has disadvantages too. I am not proposing that we move to it.

Abbie said...

Nina,

are modules the same as blocks?

Anonymous said...

Hi Catherine...First let me say I admire the tenacity you have shown as you try to get information which has not been forthcoming. I am hoping you can help me understand something. This past week a group from CORE came to "evaluate" our English Language Arts curriculum materials. They were apparently paid for this, though I'd like to know how much and from where the money came. But here's the bewildering part. As I understand it (I hope incorrectly), they are in business to sell their own Language Arts curriculum material. Please tell me Dr. Rodriguez did not pay a business to review our materials so that they can recommend that the district purchase CORE materials.
Thanks, Catherine
Anxiously Waiting to Know

Anonymous said...

To Rick: December 12, 2009 8:08 AM

It does sound like people are "worked up" about this, but it's because there are those of us who have brought this up in the past and it's been ignored. That may be why the tone sounds so angry. It's frustration and anger at being ignored. My kids went to Ft. River and the middle school. I was so frustrated with my concerns not being answered or being ignored. I ended up moving my kids out of the schools. Ali

Rick Hood said...

Ali,

OK that’s fine and I understand.

I just want to help get that info out there – accurately. I think Catherine is doing a fantastic job of pushing this issue (and others) and I am not knocking that in any way – it’s a job that needs to be done.

But we’ve got to make sure we have accurate data not only so we can come to accurate conclusions, but so we can really know where to focus our attention on doing better.

Now sometimes “it’s not accurate” is an excuse to do nothing. That is NOT what I am about at all. We need accurate comparisons to other schools.

It turns out to be a nightmare though. Some towns have things in the town budget that other towns have in their school budgets. For example, John Musante told me that virtually all of Northampton’s HR is in the town budget, so you don’t see it in the school budget (John used to be at Northampton, so he knows), whereas we have HR in our school budget.

I don’t know whether DOE tries to tease that out in their data or not and neither John nor Rob Detweiler knew either.

Lots of work left to do..

Bottom line:

1. We need accurate comparisons.
2. I don’t think we know that we have that yet.
3. I don’t think anyone thinks we need to be at the level of Northampton, whatever it is. We want better schools than Northampton, not the same.
4. But we also don’t want to be too far above either.

Rick

Anonymous said...

Anon 9:55 raises interesting questions. Having been subjected to such a "review", I too await the answers.

Nina Koch said...

Hi Abbie,

Hmm, I guess I am not sure what modules are, so I can't say if they are the same as blocks. A 4 by 4 block schedule is essentially like a college schedule, in the sense that an entire course (say US History) is completed in half a year. The school day consists of 4 long periods, and teachers teach 3 out of 4. The periods are usually around 85 minutes, which is considered a double period compared to a traditional schedule where the periods are between 40 and 45 minutes.

Most schools with block schedule do not have study halls. They sometimes split blocks or do some modification for music or electives, but that varies a lot from school to school.

It's a good example of a situation where there are trade-offs-- things you gain, things you lose. I think most schools that choose to go to block stick with it, but it might be due to the financial considerations more than educational considerations.

Anonymous said...

As a graduate of Northampton High School, I could have pushed to move back to Northampton when my wife and I moved from Boston with our then 2 year old daughter.

I didn't. We came to Amherst instead and paid more for less house than we would have in Northampton. And I don't regret it.

I don't regret it when I attend the District Music Programs in the spring featuring the top middle school and high school musicians and singers from the region, pull out my pen, and tally up in the program the participants from Amherst versus those from Northampton and other communities. It isn't even close.

And I note Jonathan O'Keeffe's (I assume that's "jpo") data analysis on MCAS scores, also. These are not little things.

But, when I see posters on here turning on Ms. Brewer for daring to suggest that there might be some other explanations for disparities in numbers, I can see where things are going on this blog. With the exception of Richard Hood, the skepticism all seems to be going in one direction: lots of sound and fury (with Joel in the lead wielding his torch and pitchfork) in order to free the conscience for a "no" vote on an override.

You see, I think that we need two things that are in no way mutually exclusive: more honest self-examination about how our schools operate (which I applaud and which happens intensively on this blog) AND an override to get us through the next few fiscal years. I think that we have aspirations for education that other communities don't have, and I don't think we need to be apologetic about that.

And I see the analysis going off the rails on this blog on that latter point. My sense is that Ms. Sanderson's public opposition to an override for the schools will be a devastating blow, and will come to define her first term come reelection time. (And I know that she'll huff and puff and predictably tell me that she doesn't worry about that.) With the leadership of Ms. Sanderson and others on SC, we can spend the additional revenues that we need wisely. We need not cut off our nose to spite our face by setting up unrealistic preconditions to support for an override.

All human endeavors, including our own households, involve some degree of waste. It's not an acceptance thing; it's a constantly try to do better thing.

Rich Morse

Anonymous said...

No override. No way, no how.

Joel said...

Rich,

I'll go on the record as supporting the override (I have two kids in elementary school) as soon as the district does one single, easy thing that has no immediate impact on policy.

It must produce a budget in the form Northampton does. Email me and I'll send you a pdf of the Northampton schools' budget. Every expense is listed, from the superintendent's salary to copier costs. Every cost is broken down by individual building. And, Alisa might actually be right about the way retirement costs are counted, but the detailed Noho budget shows exactly how much each ELL, SPED, and other teacher costs and where he/she is assigned.

Too many parents don't trust how our money is being spent. It isn't about the money, it's about the programs. There are so many more ELL/SPED, etc teachers at Fort River than there are classroom teachers that it's hard to believe they are all needed. The Noho budget shows each school's list of teachers by subject and salary. It is clear exactly how many administrators each school has.

I have little faith in the administrative capacity of our schools. An open and honest budget with that sort of detail would allow us to have informed debates going forward. I'll vote "yes" on an override once I can see how we spend. I'll vote "yes" even if I find the spending silly because I believe we'll be able to fix things once we have some modest level of real transparency.

Anonymous said...

Joel:

I would love to see that pdf. Is that something you can post here?

Joel said...

http://www.nps.northampton.ma.us/2010%20Budget/2010%20budget.pdf

Joel said...

That isn't the entire link. I'm breaking it up into two lines so that it will appear. Paste both parts, in order, into your browser as one link.

http://www.nps.northampton.ma.us/
2010%20Budget/2010%20budget.pdf

You can also probably find it with just the first part.

Joel said...

It's also the first green link available here:

http://www.nps.northampton.ma.us/budget.html

Abbie said...

Hi Nina,

here is the info about "modules" from my HS. It looks like it hasn't changed since I was there which was after 1967(!).

The following schedules illustrate our philosophy of scheduling, which includes large groups, small groups, laboratories, and independent study time. The schedules contain 14 mods each day. Six of these mods (1, 2, 3, 12, 13, & 14) are forty minutes in length, with the other eight mods (4, 5, 6, 7, 8, 9, 10 & 11) approximately 20 minutes long. You can gain some insight into your possible schedule by studying these examples.

Westside adopted a modular schedule in the fall of 1967. The old schedule was a traditional six period day. It restricted the number of courses a student could take, placed students in mass study halls, limited the opportunities for students to develop self-reliance, and presented many conflicts when students went into the community for learning experiences. The decision to change schedules was made after defining what the staff believed about learning and after studying the advantages of both the traditional and modular schedule. The Westside staff believed that:

1. Students and teachers should have more time during the day to meet on a one-to-one basis.
2. Students should be exposed to the fine and practical arts, engineering and technology, family and consumer studies, and business courses.
3. Students should be involved in making decisions regarding their use of time.
4. Subject area resource centers and open laboratories should be developed in place of study halls.
5. More opportunities should be provided for students to utilize community resources.
6. The program should encourage teachers and administrators to try new approaches to improve the instructional process.
7. Students should be able to take more courses.

The staff further believed that these objectives could best be accomplished through a modular schedule with four modes of instruction, called large group instruction, small group instruction, laboratory, and independent study time.

Anonymous said...

Thank you, Joel, for posting this link. I would hope that such a detailed budget exists for the Amherst schools. If it does, it should be posted for all to see immediately. If it does not exist, then we are in deep trouble as a school system.

Trust, but verify. For far too long the schools, and for that matter all of Amherst town government, have relied on "trust us, we know what we are doing." I hope the schools and all of town government get the message very soon that "trust us" is not going to fly any longer. As taxpayers we have a right to see how our money is being spent - especially if we are to be asked to vote yes on an override.

Trust - but verify should become the new Amherst mantra before any override vote.

We need a groundswell of parents demanding budget information such as is available to Northampton voters. I don't know if I will vote yes for an override...but I do know that I want to see that kind of budget detail posted on the ARPS website.

Joel said...

To Anon 9:41

Right. Absolutely.

Here's what we'll be told initially: "We don't have the people in place to produce this now. Trust us. We'll give it to you later."

Funny thing: We have many more administrators than Northampton and yet they produce just such a budget.

Moreover, each and every school/facility in town has such a budget in spreadsheet form. The schools just haven't wanted to share those numbers with the town.

I think everyone has to get behind the idea of demanding this. If we are going to continue to pay among the highest property taxes in the region and if we are being asked to pay even more in an override, then showing us exactly where this money goes is literally the least our administrators can do.

Nina Koch said...

wow, Abbie, that schedule sounds so interesting. I would really like to know more about it. I will check out the website.

It is definitely different than block scheduling and in fact different than any schedule I am aware of. I think this is a good example of a faculty getting together and asking themselves, what do we really want our kids to do? what needs to change here? It looks like they came up with a fairly unique solution, but it is based on sound principles.

I really like the idea of having more time to meet with kids one on one as well as kids being expected to do more work independently. I think we could possibly work both of those features into our mandated study requirement, to try to make the most out of our situation. I wish we could in fact offer a full schedule for kids who want it, but given that we can't afford it, maybe we can figure out something else that is a benefit.

I think people tend to forget that new practices often emerge when there is a perceived problem with current practice. That's why there are so many different schedules across the U.S. Schools are trying to solve a problem.

Thanks for the info about your school!

Anonymous said...

Here's my commitment as an override supporter:

I will vote for any new prospective TM candidates who are publicly announced as anti-override for TM on my Precinct 7 TM ballot on March 23, 2010. I got off TM this year to allow such new blood to run and get elected, but I don't think much happened.

But heretofore, other than the brave souls Mr. Kelley and Mr. Gawle, the anti-override contingent has been largely closeted in town, bitching anonymously on various blogs, sticking to the shadows, largely absent from the oligarchy in Town Meeting. I wonder why that is.

News flash: the time commitment argument won't work any more, thanks to members like Ms. Jensen and my wife calling the question with prudent dispatch to cut off debate and move on. Last year: 8 sessions in the spring and 2 in the fall for a modest total of 25 hours of meeting time. Let's not make a mountain of time out of this particular molehill commitment. As an excuse not to serve, that dog won't hunt any more.

Rich Morse

Anonymous said...

Rich Morse, I'm not sure I understand what you're saying.

But, I can tell you why people aren't openly anti-override. Do you remember what happened to Wei Ling Greeney? She was anti-override and was personally attacked in the Bulletin and accused of being anti-education. Which she certainly is NOT.

I appreciate the hours and dedication people put into town government - and at the same time I get frustrated at how unwilling people are to challenge or even ask for an explanation of what town leaders say and do. It takes a truly rare breed of person with high status and confidence of steel to stand up to the scrutiny that comes with elected positions in this town.

And I do agree with you that this chaotic "system" of "government" that we have is not effective.

Anonymous said...

In regards to:

As a graduate of Northampton High School, I could have pushed to move back to Northampton when my wife and I moved from Boston with our then 2 year old daughter.

I didn't. We came to Amherst instead and paid more for less house than we would have in Northampton. And I don't regret it.

I don't regret it when I attend the District Music Programs in the spring featuring the top middle school and high school musicians and singers from the region, pull out my pen, and tally up in the program the participants from Amherst versus those from Northampton and other communities. It isn't even close.


You're absolutely right - go to any neighboring community and see what their Music programs are like. They don't even compare to what we all have in Amherst. And yes, it is about what we ALL have. Music and the Arts should be an integral part of any child's education. You all wonder why the SAT scores are higher? Wonder why more kids in Amherst take AP classes - here are maybe a few reasons why:

1. There is also a causal link between music and spatial intelligence (the ability to perceive the world accurately and to form mental pictures of things). This kind of intelligence, by which one can visualize various elements that should go together, is critical to the sort of thinking necessary for everything from solving advanced mathematics problems to being able to pack a book-bag with everything that will be needed for the day.

2. Students of the arts learn to think creatively and to solve problems by imagining various solutions, rejecting outdated rules and assumptions. Questions about the arts do not have only one right answer.

3. Recent studies show that students who study the arts are more successful on standardized tests such as the SAT. They also achieve higher grades in high school.

So - when you are thinking about an override or no override, please think about what it will do to the kids. Think about what kids have access to in Amherst as compared to schools such as Noho, Holyoke, etc. Also - please remember that this is about our children's futures.... don't sell them short.

Abbie said...

to anon @8pm

all you say might be true but the most likely, strongest contributor to Amherst student achievement is the unusually high % of kids whose parents HIGHLY price education (not to mention the contribution of genetics)!!

Anonymous said...

I support Rich Morse. An override is needed so our kids can get the educational opportunities and experiences they deserve.

What does Northampton have that we don't have? Let's hope the answer to that question is not an electorate willing to pass an override for their kids.

Anonymous said...

To Abbie:

I'm wondering: Will those parents who, as you say, "HIGHLY price education" be willing to support the override when they are asked, as Northampton residents recently were, to contribute to the price of education?

Just wondering.

Anonymous said...

I will agree that our music programs in the middle/high school are excellent. I also think it is great that our kids get to take advantage of the music program from a very young age. As a parent of kids who are not interested in taking music, however, I would say that for some of us, having a great music program is not enough.

If you are not in an ensemble in the middle school (and not in a Math Plus class), there are no other electives for you to take. Great for the music program but not so great for the kid not interested in music.

A great music program alone is not enough to get all parents to vote for an override. Like many, I consider music an "extra," not an essential. Music is definitely something we do well here, but this is not a "music high school" (that is PVPA) but a "comprehensive high school." I would much rather see an increased emphasis on academics (which will serve all kids, even the musicians, well in their futures) and an increase in our core graduation requirements before I would consider an override.

Anonymous said...

And therein, in Anonymous 6:24 AM's response multiplied several times across Amherst, lies the potential for a race to the bottom in our town.

Music as an educational "extra": I'd like to put that notion to a town-wide vote and see how it does.

Like my experience listening to talk radio, I read this blog and pray that it is NOT a representative sampling of the sentiment in Amherst.

Rich Morse

Anonymous said...

Excuse me: let me get the poster's number right. That's 6:29, rather than 6:24 AM.

Sorry, I graduated from Northampton High.

Rich Morse

Anonymous said...

To Anon 6:29: I believe that strong music and art programs are an integral part of a comprehensive hs curriculum as are all those electives that serve kids who might not be so academically inclined or interested. That's what makes it a comprehensive high school.

As for electives at the middle school- they used to have them before budget cuts.

Joel said...

So, let's see a *detailed* budget like the one Northampton issues in order to see how to protect music and art.

We're in a crisis. The superintendent should open the books and show us why.

jm said...

In a town where 40% of adults have an advanced degree -- not just an undergraduate degree -- our children should perform at a higher level than those in surrounding towns. The question is whether Amherst students are outperforming their demographic. An MIT professor analyzes the performance of children in towns based on demographics, such as income, and often finds that children in high income Boston area towns generally perform as well as expected -- but no better. The surprises are Chelsea and Somerville, with large immigrant populations and much lower incomes. Children in these towns outperform their demographic, and this strongly suggests that something special is going on in their schools.

My child's brief experience in Somerville showed that the kids were getting an early and hard push to read and write. The K curriculum looked a lot like the 1st grade curriculum here, and the 1st grade curriculum looked like Amherst's 2nd grade curriculum. I think the schools knew that their kids were starting behind and worked hard to have them catch up.

I've always wondered if Amherst kids simply were doing as well as expected or better than expected.

Anonymous said...

How does not passing an override get us closer to getting the kind of education we want Amherst to provide?

Joel said...

I think passing the override could be a very good thing, if we know where the money is going.

A lot of us who actually have little kids in the schools are being painted as anti-override, when that isn't so. We'll pay plenty more as soon as we know what we're paying for.

Let's see a line by line, school by school budget, just like the one Northampton posts on the web.

Ken Pransky said...

In these times of budget difficulty, you are to be applauded for trying to get a handle on expenses. However, I believe your analysis of the data is very cursory, as I found with math (I posted regarding your comments about our elementary math programs yesterday).

In your analysis of Northampton, for instance, you left out the vocational school (Smith Voc) that syphons off many learners (I'm assuming most are Northampton students) who are likely to be less on an academic track, and therefore lower achieving, more likely to drop out, more expensive to educate, etc., if they weren't at a vocational school. MCAS scores at Smith Voc are quite a bit lower, in some cases (comparing subgroups) a CPI almost 20 points lower than Amherst (or Northampton HS). Amherst HS takes ALL learners, and must work to educate them all, without a "presorting mechanism" that separates some of the most struggling students (not that all voc ed students are struggling, but many are) out of the population of students in full academic programs that need to be well educated.

Another example of how the figures get skewed without incorporating Smith Voc: 71% of Amherst graduates go to 4-year colleges, while at Northampton HS, even as is, only 61% do, and at Smith Voc it's just 4%. If there were no Smith Voc, those students would be Northampton HS students, and that very low 4-year college percentage would be factored into Northampton HS stats. Comparing Amherst-Noho graduation rates, drop-out rates, SATs, attendance at 4-year colleges, etc., would all look substantially different.

To be accurate, you would need to figure all Northampton students together to compare to Amherst, not only achievement-wise but also regarding per pupil spending. Just because they are at a vocational school doesn't mean it doesn't cost the town to educate them. My question is, what would the costs be for educating ALL Northampton high school age students?

In Amherst, too, you did not mention the alternative programs in South Amherst that have a very low student-teacher ratio. Aren't they a part of the regional budget? That skews Amherst's per pupil costs higher when lumped in with the total cost of educating all students. Does Northampton HS have any such programs? If so, how much do they cost there? If not, it would only be fair to look at our per pupil costs without those programs factored in, so apples are compared to apples.

Amherst also has more ELLs/former ELLs and homes where another language is spoken; our secondary student population is overall quite a bit more diverse, even if the % of low income learners is a little smaller than Northampton's. If you're going to mention some demographic factors, you really should mention all of them, in the interest of a fair, dispassionate analysis of the situation.

Until ALL of this information is examined, you can't draw any conclusions one way or the other between us and Northampton, either about the quality of education or the cost.

Anonymous said...

Hmmmm....this sounds like Clinton when he asked "What exactly is the meaning of 'is'?". Can you say splitting hairs? NoHo does a better job than we do. Get over it.

Joel said...

Ken raises great points.

He neglects, however, to mention the fact that some ELL kids in Amherst are the children of foreign grad students (i.e., from high academic achieving households), that some Free and Reduced Lunch kids in Amherst are the children of grad students (again, high academic achievers), and that something like 40% of Amherst adults have a grad degree, making our baseline for achievement much different from that of a town like Northampton.

That doesn't mean Ken isn't right. I think he is. These are complex data.

Here's my complaint about Amherst: Before ACE, and Catherine, Steve, and Irv getting on the SC, and before this blog there was literally no discussion of such issues. We just told that the schools were literally great and that we should shut up and enjoy how wonderful everything is.

There was and is plenty of cheerleading about the Amherst schools and very little real leadership.

Catherine, Steve, and Irv have stepped into that vacuum and now people like Ken are stepping up with good data points.

There's a lot to figure out and open and honest debate is needed.

Joel said...

wanted to write:

"We *were* just told that the schools were literally great and that we should shut up and enjoy how wonderful everything is.

Anonymous said...

Ken, I am really offended by what you had to say about Smith Voc. students. Many come from Amherst and other towns besides Northampton, and you really made it sound like it is a place for losers. I would expect better from you.

Anonymous said...

Ken, you are completely off-base about whether or not Smith Voc should be included in the Northampton High numbers!!! This is totally absurd. First of all, Northampton High and Smith Voc are completely separate schools with different curricula. Secondly, Smith Voc is a REGIONAL vocational school which draws kids from throughout the area. Just like Franklin Tech. It might surprise you to learn that there are quite a number of students from Amherst at Smith Voc. Should we then include the "dismal numbers" from Smith Voc in the Amherst graduation, MCAS, etc numbers? No. No more than you should include them with the Northampton High numbers.

Please don't try to muddy the waters by bringing in a totally different and separate high school. Northampton High and Amherst Regional are comparable high schools which offer a comprehensive education. Smith Voc is a very separate entity. And I agree with the previous poster who thought you made the place sound like a school for losers. Which it is not. It is there to provide a good and much-needed vocational education for our teenagers who decide to pursue that career path. Students and their parents choose to enroll there because it provides what they want and need. Just like some choose to enroll in PVPA (you wouldn't include their numbers with those of South Hadley High, would you?).

Ken said...

Wow! Posting on this blog can be...entertaining...if you post an alternative opinion. But it's quite silly to call any more considered look at data just an attempt to show that Amherst schools are perfect or don't need changes--I mean, come on, grow up. Ironically, the agenda of many on this blog appears quite clearly to be just the opposite--striving mightily to prove that Amherst schools provide an inferior education.


Joel's thoughtful post, in contrast, is very appreciated. You're absolutely right, the data is very complex, and multiple perspectives are needed. As you note, it's true that some Amherst low income students are from families affiliated with the university. That is certainly true at the elementary level. It could be true at the secondary level, but because the % of kids who speak another language at the Region is so much less than in our elementary schools, I'm guessing it is not as much of a factor. But it's worth finding out. I'm sure I've missed things, too. My ONLY point is, the more accurately the data is examined, the better the comparison will be, whether it conforms to a predetermined agenda or not.

That's one reason I don't look at overall MCAS scores (and strongly urge no one else to, either), but rather at the scores of subgroups, comparing low income learners in Amherst to low income learners in Noho, etc., and why subgroup MCAS growth across grades should be the way data is interpreted at the elementary level.

Anonymous--I apologize if I came off as dissing voc ed students as losers, but that was not my intent whatsoever. I merely looked at the MCAS data and graduation data, noted the differences, and extrapolated. The data shows that many of those students would tend to struggle more in a strictly academic setting, all things being equal. I think voc ed is a fantastic route for many students, and I wish there were more voc ed schools. Academia is not for everyone, nor should it be. But the MCAS test and 4-year college data is what it is, and if comparisons of educational effectiveness are being made (I didn't propose that comparison, Catherine did), the data has to be used.

I realize that Smith Voc is a regional school but my understanding is that most do come from Noho. If I'm wrong, I stand corrected. That data should be available to anyone who seeks it out. I raised Smith Voc as an issue to be looked at because it apparently had not been, and gave my reasons why it should be. But even if just half, say, are Noho residents, it is NOT invalid to use that data; it distorts the comparison not to factor it into the mix. You couldn't know what MCAS scores that meant or which groups of students it was, but you would have to add that into the overall mix somehow.

The issue is that the data was used to draw conclusions about an Amherst HS vs. Noho HS education, both in educational effectiveness and cost. Fine, that's a valid thing to do, it just has to be done carefully, and it would be a distortion to compare MCAS scores, graduation data and per pupil expenditures for that purpose WITHOUT factoring in the pre-sorted students who, by the data, on average look like they would be in need of more academic support services (i.e., more $$) and not performing as well in a strictly academic setting. Otherwise, some of the data that Catherine herself chose to bring to the table is skewed without it.

I'm not saying it was deliberate, and I'm not trying to prove Amherst is better. I'm just saying, be as thorough as you can before you draw conclusions. If that offends some on this site, then that is itself quite revealing, but so be it.

Joel said...

Good points on Ken's comment. I said I thought he was right because I think he raises good data points that should be taken into account.

Like most defenders of the Amherst-way, however, he ignored those data points that shine a negative light on Amherst's schools.

I think it's important to remember that before ACE and the election of Catherine, Steve, and Irv to the SC we had literally no debates on anything, just a lot of hyperbole, much of which turned out to be wrong.

Part of what's behind Ken's post is a real shock and defensiveness that Northampton is surpassing Amherst in many ways. For years everyone assumed we had wonderful schools and that poor Northampton only dreamed of being half as good as Amherst.

Well, Northampton has a leadership that publishes a very detailed, school by school budget and does things like increase real APs. It has a curriculum much more in line with best practices from throughout the Commonwealth.

We have the nation's only mandatory 9th grade Environmental Science requirement -- Noho has Biology in 9th grade.

I would have a lot more faith in what we're doing if our leadership opened everything up to scrutiny. If things are going so well, show us don't just assert it.

If our unique ways of teaching ES math, our seemingly unique ELL program, and our highest in the region SPED spending really are working better than more standard programs elsewhere, show us how that's so. Many of us are tired of assertions, e.g., "it's state mandated," when other towns in the state do something different or spend substantially less money doing what the state is actually mandating, rather than what some people in Amherst tell us it mandates.

If we really are at a financial tipping point, show us the real, line by line, school by school budget.

Still, I like the fact that Ken posts using his real name and engages critics with data. I hope he considers the counter-arguments and data he ignored when he thinks about the condition of the Amherst schools now and going forward, rather than focusing on their glorious past.

Joel said...

Ken and I posted at the same time and so didn't see each other's reactions.

I think we agree on the complexity of this stuff and the need for open and honest debate.

I'm completely willing to change my view of our programs and spending, I just have to hear more from Ken, with his years of experience, and our administrators, with their access to the real budgets.

Thanks, Ken.

Abbie said...

Hi Ken,

I am going to post this here rather than in the math blog where you posted.

I am confused by your math blog (in the ES math posting). Are you defending Investigations? Are you doubting the legitimate concerns of parents about this choice? Did you participate in the choice of Investigations? Provide some data on why Investigations is BETTER than others that we could have chosen.

Here is a link to a pdf of a study http://www.google.com/#hl=en&q=elementary+math+evaluation+curriculum+department+of+education&aq=f&aqi=&oq=&fp=cbc2f75bf9d43a8f

True, it is just one that I found and it may have limitations (please point those out). BUT it clearly ranks Investigations as the WORST of the 4 curricula examined.

If you are going to defend Investigations, then please provide us with your rationale (which ought to be more than "I think it is good"). Provide links to valid studies of various curricula, then maybe I will reconsider my views.

Abbie said...

sorry here is the direct link to the pdf

ies.ed.gov/ncee/pdf/20094075.pdf

Abbie said...

here is the link to full study, which is still ongoing:

http://www.mathcurriculastudy.com/index.asp

Anonymous said...

The numbers are out there, somewhere.....

Anonymous said...

Per Joel: 'I think it's important to remember that before ACE and the election of Catherine, Steve, and Irv to the SC we had literally no debates on anything, just a lot of hyperbole, much of which turned out to be wrong.'
Did you go to school committee meetings then? 'Literally no debates on anything' -?? Really?

Joel said...

I did go. Elaine Brighty often refused to hear comments and questions from the community.

There was no debate by SC members on the 9th grade science curriculum. They didn't ask a single probing question.

Yes, I was there then. Were you?

Anonymous said...

Joel, me again. You're absolutely right, they did not ask smart questions and I thought that decision was flawed - a case of dumbing down the high school curriculum in order to adjust to the inadequacies of the achievement in the lower grades.
I think some questions about this decision should have been lobbed at the principal, and there should be (and is now, I guess) some hard thinking about having one person in charge of curriculum.
What I wish to say is that in the many years of school committee work beforehand, there WERE occasions of thoughtful debate, (not enough, we can agree) and to say, sweepingly, that before the present-day committee there was 'literally no debate on anything, just hyperbole,' is absurd and undermines the important criticisms you make. Which, though I dislike your approach, I can appreciate as important to our community.
Why don't you run for s.c.?

Anonymous said...

Just so you know...Northampton uses the Investigations math curriculum in their elementary schools, too.

Ken said...

Joel--I have never worked in the HS. I know teachers at the HS on a professional level as well, and respect their skill greatly. However, I have no stake in the HS, or its reputation. I'm sure some things are good, some could be tweaked to be better, and some things need substantial changing. For example, it's clear that graduation rates for our Latino students are low, that we don't have enough students of color and low income students in higher level classes, and that achievement gaps still persist (as they do in Northampton, by the way). I have no fear saying those things. I'm just going to be careful before I leap to conclusions about things that are not so clear. I wish everyone would do the same.

The questions for me are: 1) What data needs to be looked at to give us the best picture of the education we provide our children? 2) What is the best way to analyze that data? And 3) how transparent is the agenda that informs the way data is gathered, interpreted and translated into policy?

Concerning #3, there is clearly an agenda to "prove" that our schools are just average at best and getting worse. Only that sense of urgency justifies substantial changes, which, from what I can tell, will benefit the already-doing-well the most. Well, our schools may be going down the tubes--I've just not seen any data that shows it.

Take our elementary math program (Investigations), which Anonymous asked about. I am well aware of the anti-Investigations and pro-Investigations camps, where each thinks the other ruins math for kids. I know that some studies show Investigations is awful--but there are others that show it to be quite strong. So to me, all that is secondary (and often politically motivated, whichever side it supports). I want to know what MCAS data about OUR program shows.

And Anonymous, I can't compare our Investigations math program to what it might otherwise be like with a program we don't use. I can only look at MCAS data and compare us either to districts that are demographically similar in the aggregate, or to state averages for subgroups of students (e.g. race, income, ethnicity, Special Needs, etc).

Frankly, I had NO idea what the data would show when I went to look in response to hearing Catherine say our math programs were "weak." You can ask anyone at Fort River where I taught and they will tell you that I was not the biggest fan of the program. I felt strongly that it was not the best for struggling learners, especially ELLs. I have absolutely no stake in maintaining Investigations in our schools.

Catherine based her conclusion on AYP results, which is a very sloppy and inaccurate use of data for the reasons I described in my math post (well, I'm sure she really based her conclusion on philosophical differences, and AYP furthered the argument). So I was a bit surprised when I looked at the MCAS data to see that we outperformed the state average (in several cases, by a huge amount) in 3rd to 6th grade growth in math for EVERY subgroup category, including ELLs (I consulted with a Dept. of Ed data person to be sure I was using data appropriately).

What gets so twisted on this blog is that many have already made up their minds based largely on ideology, innuendo and anecdotes, so they ignore any data that shows otherwise, or just claim it can't be true! Then if I say that's what the data shows, I get charged with not facing reality and just wanting to prove Amherst schools are great when they're not. It's like Alice in Wonderland!

Conclusions are supposed to be drawn AFTER data is examined, not before.

Anonymous said...

Ken- great post. Prepared to get flamed.

Ken said...

Just two more bits of Smith Voc-related data, and their impact on comparing Noho to Amherst HS.

1) On their website, Smith Voc is described as a public high school for those students who reside in Northampton. There may be some open enrollment. But it is not a regional school as someone alleged--another example of the "untrue facts" that so frequently get tossed around on this blog.

What that data means is that the large majority of Smith Voc students, who by their MCAS scores are shown to be lower academic achievers, would otherwise be educated at Noho. You can't validly compare overall outcomes between two schools when one school enrolls all possible students and the other enrolls all students minus a significant group of academically lower-achieving students. (Unless, of course, you want to prove a predetermined point and this data gets in the way.)

2) The number of students per full-time equivalent teacher (FTE) in Amherst is 13, in Noho 16, and at Smith Voc--9. In other words, not only would MCAS score and college attendance data be skewed downwards at Noho High if there were no Smith Voc, but per pupil expenditures would go up. I have no idea how to figure out how much more, but either those students would be supported by more teaching staff ($$) so their academic achievement would be higher, or the per pupil cost would remain steady (i.e., their present 1:9 FTE support would jump to 1:13) but then their achievement would probably be lower.
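
For anyone who wants to see that trade-off in numbers, here is a rough back-of-the-envelope sketch. The 16:1 and 9:1 ratios come from the paragraph above; the enrollment figures are purely hypothetical placeholders, since actual enrollments aren't given here.

# Rough sketch of the "fold Smith Voc into Noho High" trade-off.
# The students-per-FTE ratios (16 and 9) are from the comment above;
# the enrollment figures below are HYPOTHETICAL placeholders, not real data.

noho_students = 900        # hypothetical Noho High enrollment
voc_students = 450         # hypothetical Smith Voc enrollment

noho_ratio = 16            # students per FTE teacher at Noho High (from comment)
voc_ratio = 9              # students per FTE teacher at Smith Voc (from comment)

# FTE teachers implied by each ratio
noho_fte = noho_students / noho_ratio
voc_fte = voc_students / voc_ratio

# Scenario A: keep all the staff -- per-pupil staffing (and cost) rises
combined_ratio_keep_staff = (noho_students + voc_students) / (noho_fte + voc_fte)

# Scenario B: hold the combined district at Noho's current 16:1 -- fewer FTEs,
# so the lower-achieving students get less support than they do now
fte_needed_at_16 = (noho_students + voc_students) / noho_ratio

print(f"Combined ratio if all staff kept: {combined_ratio_keep_staff:.1f} students per FTE")
print(f"FTEs needed to stay at 16:1: {fte_needed_at_16:.0f} vs {noho_fte + voc_fte:.0f} today")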

To be sure everyone understands what I'm NOT saying: I'm NOT saying that Northampton High isn't a fine school, or that Smith Voc students aren't hard-working, great kids studying what motivates them. I'm NOT saying that Amherst HS doesn't have issues to deal with--or even that it's "better" than Noho High.

I AM saying that there's a LOT more to data analysis in education than it may appear, and certainly than has been happening on this blog, and agendas need to be checked at the door when analyzing it.

Abbie said...

Ken,

you can't just compare our MCAS (with Investigations) with other comparable districts. (1) You don't know the support/training the teachers at other districts get for their various curricula. (2) It is not an actual experiment, like the one that I provided a link for. (3) I think you would be hard pressed to find a district comparable to Amherst (i.e., its unusual mix of parental education levels among its students).

Why would anyone value the kind of analysis you suggest over an actual experiment? In my opinion, the Dept of Education has wasted so much opportunity to provide leadership on US education because actual controlled experiments (like the one I cited and those the Gates Foundation supports) are rarely done.

I believe that Catherine also used the study (link provided previously), and probably her mailbox full of parent (and maybe teacher) complaints, to inform her conclusions about our Math program.

This is a real concern, Ken, and it sounds to me like you are discounting that concern (even though you might share it).


MCAS only shows how the students are doing given their current curriculum. We have no idea what they COULD be doing with another curriculum. Here we should rely on comparative data from a controlled experiment, unless you are pushing for Amherst to do its own. We could use a different curriculum at each of our 3 elementary schools and then decide based on MCAS results. I'd go for that...

Catherine A. Sanderson said...

My response:

Thanks, Abbie, for noting that my concerns about Investigations are based on an actual experiment showing it is bad as well as complaints from teachers and students. It is hard for me to imagine why Ken would assume I just decided I hated it and then looked for data to support that view (do I have stock in an alternative curriculum? do I secretly wish to have all the Amherst kids fail in math so I want to deprive them of the excellent Investigations curriculum we now have?).

In terms of the MCAS scores - as Abbie points out, we are a very unusual community, and MCAS scores may well be influenced by the teaching that parents do at home, the use of paid tutors (e.g., Kumon), and/or the use of supplemental math curricula by teachers.

Finally, I think an experiment is a great idea - and this is what Framingham, MA did two years ago. They have 8 elementary schools; four used Investigations (which they were already using) and four used a new curriculum called Think Math (which is also used in Brookline). The teachers recorded student scores and interest in all the schools, and the unanimous belief was that Think Math was a much better curriculum (which is what all schools in Framingham now use).

Anonymous said...

One problem I see in Amherst is that the school or teachers will point to kids that do well, even with a lousy curriculum. What doesn't get brought to light is that the kid is probably getting instruction/support at home, or going to Kumon or Sylvan several days a week. The curriculum here is very weak. Ali

Anonymous said...

So now any success in Amherst schools can't even be attributed to the schools? You have NO basis for making an assumption that MCAS scores are skewed by parental involvement, tutors, etc. And don't you think that parents in other affluent communities are paying for those things too?

Anonymous said...

Ali- I'm kind of tired of you making comments about the schools when your children haven't even attended in 5 years. You too have NO BASIS for making any comments about the current state of the schools.

Anonymous said...

Thanks Anon 11:35
I was thinking the same thing.

Anonymous said...

Oops, my bad, you're sooo right, since I have no kids in the school (thank God) I can't comment on them. Gotta love it, only in the Republic of Amherst. Ali

Anonymous said...

You can of course comment on the schools, as is your right, but your comments should be viewed for what they're worth- which in my opinion is not much, since you have no current experience with the schools. And deflecting differing opinions by making inane comments about the Republic of Amherst- well, nuff said.

Fed Up Parent said...

Personally, I appreciate comments from people who chose to educate their kids elsewhere. I think we need that perspective. I also think that someone like Ali, who clearly follows what is going on in town and on the School Committee, might have more informed opinions than some of our parents who might send their kids to our schools every day but who have absolutely no idea what goes on inside the classroom, school, or administration. So keep posting Ali!

Anonymous said...

Fed Up- Well then I guess we'll just have to disagree on this one. I think folks who removed their kids from the schools years ago cannot speak credibly about the current state of the schools.

Ken said...

Abbie,

I appreciate your concerns about making sure the district has a good math program for its students. I do not think, however, that creating experiments is necessary when one can find enough information within MCAS, which is what the state wants us to do. (Plus, Catherine's argument against Investigations brought up MCAS scores in the first place--what's good for the goose is good for the gander.) Here is my reasoning:

Looking at "subgroup growth" somewhat sidesteps the legitimate concern about parent education levels. You will not be 100% accurate in eliminating the "parent education effect" in Amherst, but I'm not sure you would ever get a 100% accurate data analysis about that, and by looking at specific groups of students and not the aggregate of all students, I think the view we get is good enough. By the way, White and non-low income were not our 2 highest growing subgroups in math from the 2006-2009 student cohort--SPED and Hispanic students were.

In terms of services affecting scores, all districts have interventions, etc. The state requires districts to look at MCAS scores in a variety of ways in spite of differences in institutional practices, services, etc. That is the mechanism by which districts are supposed to evaluate the quality of programs and services.

While I had concerns about Investigations as a teacher, I stress again that the data should inform my perspective, not vice versa. My feelings about the program before I looked at the data (for the first time, just this fall) should not affect my analysis of what the data showed me. And while our ELLs did, indeed, score the lowest of any subgroup in terms of growth, they were still stronger than ELLs in the state (which experienced negative CPI growth), and in fact higher than the average growth of ALL learners in the state, and every other subgroup in the state except Asian students. My response to this data can only be: "It's better than I thought it was going to be. Teachers just need to be trained to better teach ELLs (and African-American students, who were 2nd lowest) within this program."

The reason I don't think an experimental period (including the $$ it would take to train teachers, buy new materials, etc.) is worth it: why do it if there is no underperformance in math? By comparing growth across grades, you are measuring math learning as measured by the MCAS test. Sustained MCAS math growth means sustained math learning. The only reason to change programs given our results is to satisfy a philosophical bias about teaching math another way. It's valid to have a philosophical perspective about the best ways to teach math, but that can't trump what the evidence actually shows, even if it does not support one's philosophical bent.

The only conclusion from the evidence is that the way this program is taught in Amherst sustains growth at a greater degree than the state average, in the aggregate and within subgroups. This is in the context of the state in general being the highest scoring state in the nation on the NAEP test at 4th grade.

Ken said...

Abbie,

I'm kind of nuts about this stuff and the experiment interested me, so I did one more thing tonight, maybe the next best thing to doing the experiment. I looked at Framingham's MCAS math scores by subgroup and compared them for the same 3rd-6th grade growth to Amherst. By chance, Framingham (although bigger than Amherst) is demographically similar and one of the districts I have used to compare Amherst ELL programs to. So they're an excellent district to compare with, considering they chose to opt for a different math program. So in a way, this analyzes the results of the experiment--we don't know what their MCAS scores would have been if they had used Investigations instead, just like we don't know what ours would be if we used another one. But if their program were superior, it would be reflected in greater growth over the given time period. This is what I found:

Like Amherst, Framingham student growth is above the state average in every subgroup. Their CPI scores are generally lower than Amherst's, and their rate of growth is mostly lower, though higher in 3 cases. Here's the list of subgroups with Amherst score growth/Framingham score growth:

SPED +10/+6.6
ELLs & former ELLs +1.3/+6
Low Income +9.3/+5.3
African-American +3.4/-2.4
Asian +7.3/+7.9
Latino +14/+1.6
White +8.8/+8.6
Non Low Income +7.9/+8.7

ELLs--I'm not that surprised--did quite a bit better under the other program. Otherwise, other than the close advantage to Framingham for Asian and Non Low Income growth, Amherst effected more growth in all other subgroups (especially Latino students). Also, since Framingham's math CPIs started so low, you would hope to see greater growth because there is more ground to make up.

Overall, though, the results are pretty clear.
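
For anyone who wants the comparison laid out mechanically, here's a minimal sketch that simply re-tabulates the figures listed above and shows which district had the larger growth in each subgroup. The numbers are the ones quoted in this comment; nothing new is added.

# Side-by-side of the subgroup growth figures listed above (Amherst, Framingham).
growth = {
    "SPED":             (10.0,  6.6),
    "ELL & former ELL":  (1.3,  6.0),
    "Low Income":        (9.3,  5.3),
    "African-American":  (3.4, -2.4),
    "Asian":             (7.3,  7.9),
    "Latino":           (14.0,  1.6),
    "White":             (8.8,  8.6),
    "Non Low Income":    (7.9,  8.7),
}

for subgroup, (amherst, framingham) in growth.items():
    diff = amherst - framingham
    leader = "Amherst" if diff > 0 else "Framingham"
    print(f"{subgroup:18s} Amherst {amherst:+5.1f}  Framingham {framingham:+5.1f}  "
          f"edge: {leader} by {abs(diff):.1f}")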

Anonymous said...

Ken- Once again great post.

Anonymous said...

Ken- Don't you know that CS and the people in her amen corner only look at MCAS data when it shows the schools in a negative light.

ken said...

Yes, it appears they do. There's a lot of made-up stuff too. On another blog thread a "Joel" says "My kids are at FR and there are more SPED/ELL and other specials teachers than regular classroom teachers by a very large margin." Then he refers to intervention, not Specials teachers (like art, music, etc). I thought, "Wow, things have changed since I was there." Then I looked at the FR website. There are 23 classroom teachers that work in classrooms (which includes the 2 Building Blocks program classrooms) and there are...22 SPED/ELL and other intervention teachers. That even includes guidance, and specialized SPED interventions like Speech, OT, PT, etc.

Gee, no wonder I'm supporting this weird math elementary math program, because everyone knows that 22 is certainly much larger than 23 in standard math thinking.

lise said...

Ken.

1. I think you could debate whether the Building Blocks teachers are SPED or classroom - so whether you or Joel is right is really a matter of interpretation - not making up numbers.

2. It may be more useful to look at the student-teacher ratio differences between regular ed and kids that need services. If, for example, there are 350 kids at FR taught by the 23 classroom teachers (I used your number), that is a 1:15 ratio. Then there are another 22 teachers that teach maybe 20% of the kids, a 1:3 ratio, who are also taught by the classroom teachers. (I think 20% is a larger number than the percentage of ELL/SPED - but if not I am sure you will be quick to correct me.) I think it is a fair question to ask whether it is typical to have a 1:3 ratio for SPED/ELL. Intuitively a teacher/para for three kids seems high. Do you have any data on how that compares to other systems or state averages?
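
For what it's worth, here is the arithmetic behind those ratios as a small sketch. Every input is one of the assumptions stated above (350 kids, 23 classroom teachers, 22 intervention teachers, ~20% receiving services), so if the inputs are wrong the ratios change accordingly.

# Back-of-the-envelope ratios using the assumptions from the comment above.
total_kids = 350
classroom_teachers = 23
intervention_teachers = 22
share_receiving_services = 0.20   # a guess, as noted above; correct if wrong

classroom_ratio = total_kids / classroom_teachers
kids_with_services = total_kids * share_receiving_services
intervention_ratio = kids_with_services / intervention_teachers

print(f"Classroom ratio:    1:{classroom_ratio:.0f}")     # roughly 1:15
print(f"Intervention ratio: 1:{intervention_ratio:.1f}")  # roughly 1:3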

Abbie said...

Hi Ken,

I guess I don't care about "growth". And I admit to knowing almost nothing about MCAS. It seems to me the important data is the scores on MCAS; the data you are using is relative (i.e., improvement), if I understand correctly.

Wouldn't the appropriate data be the raw MCAS scores of those kids who got a pure Investigations curriculum and those that didn't? Didn't we just start Investigations in the last 3 years (correct me if I am wrong)? If I am correct, then you can't use 4th, 5th, and 6th grade results because those kids had a hybrid curriculum (Investigations later, the previous curriculum earlier).

Finally, if you are comparing Amherst to Framingham, I would need to know the FTE hours provided per kid (especially for SPED and ELL). Basically, what are the costs associated with the subgroup education and the benefits (the MCAS scores)? Without that it is hard to interpret. As an extreme example, if we provide 1 helper (define it as you like, para/special/?) per 3 SPED/ELL kids and Framingham provides 1 per 10 SPED/ELL, then that makes those growth scores very expensive. But like I said, I would like to see the raw data, not the changes... (Preferably in graph form with error bars.)
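
To make that cost/benefit framing concrete, here is a toy sketch. The 1:3 versus 1:10 staffing ratios are the hypothetical example given above, the subgroup size is a placeholder, and the growth figures are simply borrowed from the SPED row in Ken's list for illustration; none of these are actual district data.

# Toy illustration: the same growth looks very different depending on how much
# staffing produced it. All inputs are hypothetical or borrowed for illustration.

def growth_per_helper(growth_points: float, kids: int, kids_per_helper: float) -> float:
    """Growth points achieved per FTE helper, given a staffing intensity."""
    helpers = kids / kids_per_helper
    return growth_points / helpers

kids_in_subgroup = 60  # hypothetical subgroup size

amherst = growth_per_helper(growth_points=10.0, kids=kids_in_subgroup, kids_per_helper=3)
framingham = growth_per_helper(growth_points=6.6, kids=kids_in_subgroup, kids_per_helper=10)

print(f"Amherst:    {amherst:.1f} growth points per helper")
print(f"Framingham: {framingham:.1f} growth points per helper")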