tag:blogger.com,1999:blog-6270815429299703055.post2009217948075373238..comments2023-09-29T06:32:16.005-04:00Comments on My School Committee Blog: Per Pupil Expenses: Amherst Versus NorthamptonCatherine A. Sandersonhttp://www.blogger.com/profile/03523667921190365891noreply@blogger.comBlogger85125tag:blogger.com,1999:blog-6270815429299703055.post-55776540678292392812009-12-18T11:03:36.105-05:002009-12-18T11:03:36.105-05:00Hi Ken,
I guess I don't care about "grow...Hi Ken,<br /><br />I guess I don't care about "growth". And I admit to knowing almost nothing about MCAS. It seems to me the important data are the scores on MCAS; the data that you are using are relative (i.e., improvement), if I understand correctly.<br /><br />Wouldn't the appropriate data be the raw scores on MCAS of those kids who got a pure Investigations curriculum versus those who didn't? Didn't we just start Investigations in the last 3 years (correct me if I am wrong)? If I am correct, then you can't use 4th, 5th, and 6th grade results because they had a hybrid curriculum (later Investigations, earlier the previous curriculum).<br /><br />Finally, if you are comparing Amherst to Framingham, I would need to know the FTE hours provided/kid (especially the SPED and ELL). Basically, what are the costs associated with the subgroup education and the benefits (the MCAS scores)? Without that it is hard to interpret. As an extreme example, if we provide 1 helper (define it as you like, para/special/?) per 3 SPED/ELL kids and Framingham provides 1 per 10 SPED/ELL, then that makes those growth scores very expensive. But like I said, I would like to see the raw data, not the changes... (preferably in graph form with error bars).Abbiehttps://www.blogger.com/profile/02989627808442831131noreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-47003652893111278382009-12-18T10:53:16.281-05:002009-12-18T10:53:16.281-05:00Ken.
1. I think you could debate whether the Buil...Ken.<br /><br />1. I think you could debate whether the Building Blocks teachers are SPED or classroom - so whether you or Joel is right is really a matter of interpretation - not making up numbers.<br /><br />2. It may be more useful to look at the student-teacher ratio differences between regular ed and kids that need services. If, for example, there are 350 kids at FR taught by the 23 classroom teachers (I used your number), that is a 1:15 ratio. Then there are another 22 teachers who teach maybe 20% of the kids (a 1:3 ratio), kids who are also taught by the classroom teachers. (I think 20% is a larger number than the percentage of ELL/SPED - but if not I am sure you will be quick to correct me.) I think it is a fair question to ask whether it is typical to have a 1:3 ratio for SPED/ELL. Intuitively a teacher/para for three kids seems high. Do you have any data on how that compares to other systems or state averages?lisenoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-4894188563777551532009-12-18T08:09:46.735-05:002009-12-18T08:09:46.735-05:00Yes, it appears they do. There's a lot of mad...Yes, it appears they do. There's a lot of made-up stuff too. On another blog thread a "Joel" says "My kids are at FR and there are more SPED/ELL and other specials teachers than regular classroom teachers by a very large margin." Then he refers to intervention, not Specials teachers (like art, music, etc.). I thought, "Wow, things have changed since I was there." Then I looked at the FR website. There are 23 classroom teachers that work in classrooms (which includes the 2 Building Blocks program classrooms) and there are...22 SPED/ELL and other intervention teachers. That even includes guidance and specialized SPED interventions like Speech, OT, PT, etc. 
<br /><br />Gee, no wonder I'm supporting this weird elementary math program, because everyone knows that 22 is certainly much larger than 23 in standard math thinking.kennoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-54065101217009703922009-12-18T07:34:51.829-05:002009-12-18T07:34:51.829-05:00Ken- Don't you know that CS and the people in ...Ken- Don't you know that CS and the people in her amen corner only look at MCAS data when it shows the schools in a negative light.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-24243175134575742452009-12-18T07:24:01.540-05:002009-12-18T07:24:01.540-05:00Ken- Once again, great post.Ken- Once again, great post.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-22407380112142553852009-12-17T22:53:11.919-05:002009-12-17T22:53:11.919-05:00Abbie,
I'm kind of nuts about this stuff and ...Abbie,<br /><br />I'm kind of nuts about this stuff and the experiment interested me, so I did one more thing tonight, maybe the next best thing to doing the experiment. I looked at Framingham's MCAS math scores by subgroup and compared them for the same 3rd-6th grade growth to Amherst. By chance, Framingham (although bigger than Amherst) is a demographically similar district and one of the districts I have used to compare Amherst ELL programs to because of the similarity. So they're an excellent district to compare with, considering they opted for a different math program. So in a way, this analyzes the results of the experiment--we don't know what their MCAS scores would have been if they had used Investigations instead, just like we don't know what ours would be if we used another one. But if their program was superior, it would be reflected in greater growth over the given time period. This is what I found:<br /><br />Like Amherst, Framingham student growth is above the state average in every subgroup. Their CPI scores are generally lower than Amherst's, and their rate of growth is mostly lower, though higher in 3 cases. Here's the list of subgroups with Amherst score growth/Framingham score growth:<br /><br />SPED +10/+6.6<br />ELLs & former ELLs +1.3/+6<br />Low Income +9.3/+5.3<br />African-American +3.4/-2.4<br />Asian +7.3/+7.9<br />Latino +14/+1.6<br />White +8.8/+8.6<br />Non Low Income +7.9/+8.7<br /><br />ELLs--I'm not that surprised--did quite a bit better under the other program. Otherwise, other than the close advantage to Framingham for Asian and Non Low Income growth, Amherst effected more growth in all other subgroups (especially Latino students). Also, since Framingham's math CPIs started so low, you would hope to see greater growth because there is more ground to make up. 
<br /><br />Overall, though, the results are pretty clear.Kennoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-58127130242499973632009-12-17T22:03:55.263-05:002009-12-17T22:03:55.263-05:00Abbie,
I appreciate your concerns about making su...Abbie,<br /><br />I appreciate your concerns about making sure the district has a good math program for its students. I do not think, however, that creating experiments is necessary when one can find enough information within MCAS, which is what the state wants us to do. (Plus, Catherine's argument against Investigations brought up MCAS scores in the first place--what's good for the goose is good for the gander.) Here is my reasoning:<br /><br />Looking at "subgroup growth" somewhat sidesteps the legitimate concern about parent education levels. You will not be 100% accurate in eliminating the "parent education effect" in Amherst, but I'm not sure you would ever get a 100% accurate data analysis about that, and by looking at specific groups of students and not the aggregate of all students, I think the view we get is good enough. By the way, White and non-low income were not our 2 highest-growing subgroups in math from the 2006-2009 student cohort--SPED and Hispanic students were.<br /><br />In terms of services affecting scores, all districts have interventions, etc. The state requires districts to look at MCAS scores in a variety of ways in spite of differences in institutional practices, services, etc. That is the mechanism by which districts are supposed to evaluate the quality of programs and services.<br /><br />While I had concerns about Investigations as a teacher, I stress again that the data should inform my perspective, not vice versa. My feelings about the program before I looked at the data (for the first time, just this fall) should not affect my analysis of what the data showed me. And while our ELLs did, indeed, score the lowest of any subgroup in terms of growth, they were still stronger than ELLs in the state (which experienced negative CPI growth), and in fact higher than the average growth of ALL learners in the state, and every other subgroup in the state except Asian students. 
My response to this data can only be: "It's better than I thought it was going to be. Teachers just need to be trained to better teach ELLs (and African-American students, who were 2nd lowest) within this program."<br /><br />The reason why I don't think an experimental period (including the $$ it would take to train teachers, buy new materials, etc.) is worth it is: why do it if there is no underperformance in math? By comparing growth across grades, you are measuring math learning as measured by the MCAS test. Sustained MCAS math growth means sustained math learning. The only reason to change programs given our results is to satisfy a philosophical bias about teaching math another way. It's valid to have a philosophical perspective about best ways to teach math, but that can't trump what the evidence actually shows, even if it does not support one's philosophical bent. <br /><br />The only conclusion from the evidence is that the way this program is taught in Amherst sustains growth to a greater degree than the state average, in the aggregate and within subgroups. This is in the context of the state in general being the highest-scoring state in the nation on the NAEP test at 4th grade.Kennoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-13547535406286686602009-12-17T14:38:07.392-05:002009-12-17T14:38:07.392-05:00Fed u- Well then I guess we'll just have ...Fed u- Well then I guess we'll just have to disagree on this one. I think folks who removed their kids from the schools years ago cannot speak credibly about the current state of the schools.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-92231288508333692962009-12-17T13:40:31.639-05:002009-12-17T13:40:31.639-05:00Personally, I appreciate comments from people who ...Personally, I appreciate comments from people who chose to educate their kids elsewhere. I think we need that perspective. 
I also think that someone like Ali, who clearly follows what is going on in town and on the School Committee, might have more informed opinions than some of our parents who might send their kids to our schools every day but who have absolutely no idea what goes on inside the classroom, school, or administration. So keep posting, Ali!Fed Up Parentnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-11479049475478269722009-12-17T13:33:50.715-05:002009-12-17T13:33:50.715-05:00You can of course comment on the schools as is you...You can of course comment on the schools as is your right, but your comments should be viewed for what they're worth- which in my opinion is not much, since you have no current experience with the schools. And deflecting differing opinions by making inane comments about the Republic of Amherst- well, nuff said.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-29701290413647466362009-12-17T13:17:08.425-05:002009-12-17T13:17:08.425-05:00Oops, my bad, you're sooo right, since I have...Oops, my bad, you're sooo right, since I have no kids in the school (thank God) I can't comment on them. Gotta love it, only in the Republic of Amherst. AliAnonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-5175926913705497682009-12-17T12:28:44.548-05:002009-12-17T12:28:44.548-05:00Thanks Anon 11:35
I was thinking the same thing.Thanks Anon 11:35<br />I was thinking the same thing.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-50039411518834919082009-12-17T11:35:15.519-05:002009-12-17T11:35:15.519-05:00Ali- I'm kind of tired of you making comments ...Ali- I'm kind of tired of you making comments about the schools when your children haven't even attended in 5 years. You too have NO BASIS for making any comments about the current state of the schools.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-52915337021923565322009-12-17T11:33:36.515-05:002009-12-17T11:33:36.515-05:00So now any success in Amherst schools can't ev...So now any success in Amherst schools can't even be attributed to the schools? You have NO basis for making an assumption that MCAS scores are skewed by parental involvement, tutors, etc. And don't you think that parents in other affluent communities are paying for those things too?Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-21735915816894639832009-12-17T09:49:15.588-05:002009-12-17T09:49:15.588-05:00One problem I see in Amherst is that the school or...One problem I see in Amherst is that the school or teachers will point to kids that do well, even with a lousy curriculum. What doesn't get brought to light is that the kid is probably getting instruction/support at home, OR going to Kumon or Sylvan several days a week. The curriculum here is very weak. AliAnonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-92131466659277174712009-12-17T09:39:05.730-05:002009-12-17T09:39:05.730-05:00My response:
Thanks, Abbie, for noting that my co...My response:<br /><br />Thanks, Abbie, for noting that my concerns about Investigations are based on an actual experiment showing it is bad, as well as complaints from teachers and students. It is hard for me to imagine why Ken would assume I just decided I hated it and then looked for data to support that view (do I have stock in an alternative curriculum? do I secretly wish to have all the Amherst kids fail in math so I want to deprive them of the excellent Investigations curriculum we now have?). <br /><br />In terms of the MCAS scores - as Abbie points out, we are a very unusual community, and MCAS scores may well be influenced by the teaching that parents do at home, the use of paid tutors (e.g., Kumon), and/or the use of supplemental math curricula by teachers.<br /><br />Finally, I think an experiment is a great idea - and this is what Framingham, MA did two years ago. They have 8 elementary schools; four used Investigations (which they were already using) and four used a new curriculum called Think Math (which is also used in Brookline). The teachers recorded student scores and interest in all the schools, and the unanimous belief was that Think Math was a much better curriculum (which is what all schools in Framingham now use).Catherine A. Sandersonhttps://www.blogger.com/profile/03523667921190365891noreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-47917974275217587422009-12-17T09:05:01.161-05:002009-12-17T09:05:01.161-05:00Ken,
you can't just compare our MCAS (with In...Ken,<br /><br />you can't just compare our MCAS (with Investigations) with other comparable districts. (1) You don't know the support/training the teachers at other districts get for their various curricula. (2) It is not an actual experiment, like the one that I provided a link for. (3) I think you would be hard-pressed to find a comparable district to Amherst (i.e., its unusual level of parental education mixed with students). <br /><br /> Why would anyone value the kind of analysis you suggest over an actual experiment? In my opinion, the Dept of Education has wasted so much opportunity in providing leadership on US education because actual controlled experiments (like the one I cited and those the Gates Foundation supports) are rarely done. <br /><br />I believe that Catherine also used the study (link provided previously) to inform her conclusions about our Math program, and probably her mailbox full of parent (and maybe teacher) complaints about it.<br /><br />This is a real concern, Ken, and it sounds to me like you are discounting that concern (even though you might share it). <br /><br /><br />MCAS only shows how the students are doing given their current curriculum. We have no idea what they COULD be doing with another curriculum. Here we should rely on comparative data from a controlled experiment, unless you are proposing that Amherst do its own. We could use a different curriculum at each of our 3 ES and then decide based on MCAS results. I'd go for that...Abbiehttps://www.blogger.com/profile/02989627808442831131noreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-25561356545874358282009-12-17T08:49:57.758-05:002009-12-17T08:49:57.758-05:00Just 2 bits more Smith Voc related data, and their...Just 2 bits more Smith Voc-related data, and their impact on comparing Noho to Amherst HS. 
<br /> <br />1) On their website, Smith Voc is described as a public high school for those students who reside in Northampton. There may be some open enrollment. But it is not a regional school as someone alleged--another example of the "untrue facts" that so frequently get tossed around on this blog. <br /><br />What that data means is that the large majority of Smith Voc students, who by their MCAS scores are shown to be lower academic achievers, would otherwise be educated at Noho. You can't validly compare general outcomes between 2 schools when all possible students at one school are compared to all students minus a significant group of academically lower-achieving students at another school. (Unless, of course, you want to prove a predetermined point and this data gets in the way.) <br /><br />2) The number of students per full-time-equivalent teacher (FTE) in Amherst is 13, in Noho 16, and at Smith Voc--9. In other words, not only would MCAS score and college attendance data be skewed downwards in Noho High if there were no Smith Voc, but per pupil expenditures would go up. I have no idea how to figure out how much more, but either those students would be supported by more teaching staff ($$) so their academic achievement would be higher, or the per pupil cost would remain steady (i.e., their present 1:9 FTE support would jump to 1:13) but then their achievement would probably be lower.<br /><br />To be sure everyone understands what I'm NOT saying: I'm NOT saying that Northampton High isn't a fine school, or that Smith Voc students aren't hard-working, great kids studying what motivates them. I'm NOT saying that Amherst HS doesn't have issues to deal with--or even that it's "better" than Noho High. 
<br /><br />I AM saying that there's a LOT more to data analysis in education than it may appear, and certainly than has been happening on this blog, and agendas need to be checked at the door when analyzing it.Kennoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-9569698524179145402009-12-16T23:07:55.913-05:002009-12-16T23:07:55.913-05:00Ken- great post. Be prepared to get flamed.Ken- great post. Be prepared to get flamed.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-3803086967439538052009-12-16T21:02:25.824-05:002009-12-16T21:02:25.824-05:00Joel--I have never worked in the HS. I know teach...Joel--I have never worked in the HS. I know teachers at the HS on a professional level as well and respect their skill greatly. However, I have no stake in the HS, or its reputation. I'm sure some things are good, some could be tweaked to be better, and some things need substantial changing. For example, it's clear that graduation rates for our Latino students are low, that we don't have enough students of color and low-income students in higher-level classes, and that achievement gaps still persist (as they do in Northampton, by the way). I have no fear saying those things. I'm just going to be careful before I leap to conclusions about things that are not so clear. I wish everyone would do the same.<br /><br />The questions for me are: 1) What data needs to be looked at to give us the best picture of the education we provide our children? 2) What is the best way to analyze that data? And 3) how transparent is the agenda that informs the way data is gathered, interpreted, and translated into policy? <br /><br />Concerning #3, there is clearly an agenda to "prove" that our schools are just average at best and getting worse. Only that sense of urgency justifies substantial changes, which, from what I can tell, will benefit the already-doing-well the most. 
Well, our schools may be going down the tubes--I've just not seen any data that shows it. <br /><br />Take our elementary math program (Investigations), which Anonymous asked about. I am well aware of the anti-Investigations and pro-Investigations camps, where each thinks the other ruins math for kids. I know that some studies show Investigations is awful--but there are others that show it to be quite strong. So to me, all that is secondary (and often politically motivated, whichever side it supports). I want to know what MCAS data about OUR program shows. <br /><br />And Anonymous, I can't compare our Investigations math program to what it might otherwise be like with a program we don't use. I can only look at MCAS data and compare us either to districts that are demographically similar in the aggregate, or to state averages for subgroups of students (e.g., race, income, ethnicity, special needs, etc.). <br /><br />Frankly, I had NO idea what the data would show when I went to look in response to hearing Catherine say our math programs were "weak." You can ask anyone at Fort River, where I taught, and they will tell you that I was not the biggest fan of the program. I felt strongly that it was not the best for struggling learners, especially ELLs. I have absolutely no stake in maintaining Investigations in our schools.<br /><br />Catherine based her conclusion on AYP results, which is a very sloppy and inaccurate use of data for the reasons I described in my math post (well, I'm sure she really based her conclusion on philosophical differences, and AYP furthered the argument). So I was a bit surprised when I looked at the MCAS data to see that we outperformed the state average (in several cases, by a huge amount) in 3rd to 6th grade growth in math for EVERY subgroup category, including ELLs (I consulted with a Dept. of Ed data person to be sure I was using data appropriately). 
<br /><br />What gets so twisted on this blog is that many have already made up their minds based largely on ideology, innuendo and anecdotes, so they ignore any data that shows otherwise, or just claim it can't be true! Then if I say that's what the data shows, I get charged with not facing reality and just wanting to prove Amherst schools are great when they're not. It's like Alice in Wonderland! <br /> <br />Conclusions are supposed to be drawn AFTER data is examined, not before.Kennoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-86400010385935308932009-12-16T19:48:59.283-05:002009-12-16T19:48:59.283-05:00Just so you know...Northampton uses the Investigat...Just so you know...Northampton uses the Investigations math curriculum in their elementary schools, too.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-40131613057519407662009-12-16T10:26:04.131-05:002009-12-16T10:26:04.131-05:00Joel, me again. You're absolutely right, they ...Joel, me again. You're absolutely right, they did not ask smart questions and I thought that decision was flawed - a case of dumbing down the high school curriculum in order to adjust to the inadequacies of the achievement in the lower grades.<br />I think some questions about this decision should have been lobbed at the principal, and there should be (and is now, I guess) some hard thinking about having one person in charge of curriculum.<br />What I wish to say is that in the many years of school committee work beforehand, there WERE occasions of thoughtful debate, (not enough, we can agree) and to say, sweepingly, that before the present-day committee there was 'literally no debate on anything, just hyperbole,' is absurd and undermines the important criticisms you make. 
Which, though I dislike your approach, I can appreciate as important to our community.<br />Why don't you run for s.c.?Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-21167934972367029622009-12-15T22:30:07.798-05:002009-12-15T22:30:07.798-05:00I did go. Elaine Brighty often refused to hear co...I did go. Elaine Brighty often refused to hear comments and questions from the community. <br /><br />There was no debate by SC members on the 9th grade science curriculum. They didn't ask a single probing question. <br /><br />Yes, I was there then. Were you?Joelhttps://www.blogger.com/profile/13141742420717242724noreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-25644464535315822962009-12-15T22:13:47.830-05:002009-12-15T22:13:47.830-05:00Per Joel: 'I think it's important to remem...Per Joel: 'I think it's important to remember that before ACE and the election of Catherine, Steve, and Irv to the SC we had literally no debates on anything, just a lot of hyperbole, much of which turned out to be wrong.'<br />Did you go to school committee meetings then? 'Literally no debates on anything' -?? Really?Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-6270815429299703055.post-26752596148790159772009-12-15T17:53:34.636-05:002009-12-15T17:53:34.636-05:00The numbers are out there, somewhere.....The numbers are out there, somewhere.....Anonymousnoreply@blogger.com