By CATHERINE SANDERSON

Published on June 11, 2010

The Amherst Regional Schools pride themselves on offering students a vast array of electives. And based on public comment at both School Committee meetings and in the press, these choices are prized by students, parents, teachers and community members.

Offering students choices about which electives to pursue clearly has benefits in terms of keeping students engaged in and excited about school. However, requiring students to make choices regarding their pursuit of core academic disciplines also has some risks, especially when students may lack a full understanding of the longer-term consequences of making a particular choice. For example, the math departments at Amherst Regional Middle and High schools both require students to make decisions that may limit their ability to take higher level math classes. It is critical that both students and families make these choices with a full understanding of these ramifications.

ARHS students must choose between two distinct types of high school math programs: a traditional sequence (algebra, geometry, algebra II, trigonometry, calculus) and a reform-math sequence (Interactive Mathematics Program). IMP is an alternative approach to teaching math that organizes math topics in four- to six-week units around a central problem or theme.

Since it is quite difficult to move from the IMP track to the traditional sequence, it is critical that students have the information necessary to choose the right program. Although there has been no evaluation of the IMP program since it began at ARHS, research reveals that students attending California high schools using the IMP curriculum have lower math SAT scores than those using a traditional math sequence. Moreover, there are also concerns about whether IMP serves as an adequate preparation for college math. One UC Berkeley mathematics professor who studies math education has therefore recommended against IMP for students intending to go to college, particularly those potentially interested in quantitative fields.

ARMS also offers a choice between two types of math programs in seventh grade, when 12- and 13-year-olds must decide whether to complete advanced mathematics problems (extensions), whose mastery enables students to enter eighth-grade honors algebra. Completing algebra in eighth grade is necessary for students who want to complete calculus during high school, and the opportunity to complete extensions is theoretically designed to make advanced math available to more students (and thereby close the achievement gap). However, in practice, students with parents or guardians who understand the importance of taking algebra in eighth grade are much more likely to opt for completing the additional work, and to receive assistance with completing such work if required. Mathematically adept students without this family support may therefore limit their possibilities in high school by choosing not to do extensions.

Other districts have taken different approaches that actually give students fewer choices, but may yield better results. For example, in Rockville Centre (New York), a diverse suburban district on Long Island, the superintendent decided to require all middle school students to take eighth-grade algebra. The results were dramatic: the percentage of students completing trigonometry increased, scores in AP calculus for all students increased, and more than three times as many African-American and Latino students now take higher level and honors math classes, substantially closing the gap with the white and Asian students. Rockville educators believe that requiring higher standards conveyed the message that all students could perform at an advanced level, and that in turn, students (especially students of color) rose to meet these higher expectations.

Allowing students to make academic choices is empowering. However, it also magnifies the impact of inequality in family resources and educational backgrounds. As part of the upcoming review of the mathematics curriculum in Amherst, we need to objectively evaluate the effectiveness of both the IMP and extensions programs so that we can adequately advise students about both the benefits and costs of their choices, and ideally help all students make choices that expand rather than limit their possibilities.

*Catherine A. Sanderson is a professor at Amherst College, and a member of the Amherst and Regional School Committees. The views expressed in this column are hers alone, and not those of the School Committees.*

## 37 comments:

Curriculum reform, especially in math and science, draws the attention of educators who want their schools to be the best at preparing students for a future with opportunity.

How does math curriculum reform stack up against the problems Rodriguez identified in his postmortem letter in terms of priority? I believe we can do more than one thing at a time. I also believe we cannot do all things at once.

Do you have a sense that the SC is setting good priorities on the issues that must be addressed to improve the schools to the standard you seek?

Catherine,

I continue to be concerned about your selective use of research. You give the impression that there is such a thing as "the research" -- a monolithic body of evidence that leads to a single conclusion. That is rarely true, even in areas of research where results are fairly easily quantified and it is possible to use randomization in trials. Reaching consensus in an area like medicine takes years and years and many studies. In education, it's even harder for researchers to come to consensus. So, it's very difficult to say "the research shows." And yet, you and Steve frequently say it. I consider that approach to be unscientific and unscholarly. It doesn't proceed from a posture of inquiry.

I wonder what the source is that allows you to say "research reveals that students attending California high schools using the IMP curriculum have lower math SAT scores than those using a traditional math sequence." I am guessing that this research did not entail students being randomly assigned to a math sequence. That's one of the difficult things about doing educational research; we don't flip a coin to decide who gets what. (For example, in Amherst, no one is proposing that we randomly assign elementary school kids to take Spanish or not take Spanish.) I would like to look at the study you are referring to, because it's entirely possible that a selection effect is at work. I'd like to know how that was addressed.

If your source comes from the website "Mathematically Correct," you might want to be careful about relying on it. Look around a little more on that site and you'll find out that, according to them, Everyday Math is terrible. That site has a very strong bias in support of a traditional math curriculum. Are there some college math professors who subscribe to the beliefs espoused by Mathematically Correct? Yes. There are also plenty who don't. In fact, college math professors played a significant role in the development of IMP and in the development of the NCTM Standards. So, giving the statement of one college professor in your article really isn't very meaningful.

I can point to lots of studies that show the opposite of the research that you alluded to. You can see many of them here:

IMP research

Remember that the NSF-funded reform curricula had to submit extensive research to the NSF in order to get funded in the second round. I think the NSF knows something about research. Not all of the curricula made it through the second round, but IMP did. In fact, IMP was one of five curricula to receive an award from the Department of Education as an "Exemplary Curriculum," based on evidence of its effectiveness.

Exemplary Curriculum Award

But of course, there is still a lot to debate about this research. As with all educational research, we have to ask ourselves what we mean by student achievement before we start to measure it. It's not like it's a simple linear quantity that we can readily apply a yardstick to. This is where differences in definitions, basic beliefs and values come into the picture. The goals that I have for my students, the skills and capabilities that I consider most essential to develop, may not be captured by the measuring instrument that someone else wants to use.

Those of us who opted for IMP are following this discussion carefully. We got advice from a teacher outside of the Amherst system that IMP was the better choice. It seemed that most parents went the other way for a more traditional program, but we took a chance. On Thursday morning, I saw the statement about California in the column that Nina is challenging and wondered about it.

I will also say that this was a momentous decision about our child's math future that neither one of us as parents of an 8th grader going into 9th grade felt prepared to make. I wonder how other parents feel about being presented with it.

Agree or disagree, Nina is an indispensable commenter on this blog.

Rich Morse

Nina, I continue to be dismayed at your complete disdain for anything Catherine says. This is not the type of open-mindedness I hope to find in my children's teachers. This blog posting is particularly offensive to me, as you are accusing both Catherine and Steve (who, the last time I checked, both hold PhDs which required RESEARCH and also hold faculty positions at one of the top colleges in the country) of not understanding what research is!! That is too much!

Nina, of course ARHS Parent is correct. Open-mindedness is agreeing with whatever Catherine says!

Nina,

I am not familiar with the IMP, as my son is still in elementary school. To learn more about it, I did a quick search of the education literature in the ERIC database, considering only peer-reviewed publications (not those put out by the creators of IMP). I searched with "IMP" or "Interactive mathematics program" in combination with "evaluation," "experiment," or "outcome." I turned up five papers in total, four of which were simply descriptions of IMP outcomes without any comparison to other curricula (in other words, useless). The fifth paper compared IMP to a more traditional curriculum, but it was fatally flawed because both the students and the teachers in the study were self-selected into the IMP condition (i.e., no random assignment). Perhaps I used the wrong keywords in my search, but I did not find any evidence that IMP is a good approach to teaching math. (I will also note that the link to IMP research that you posted did not work for me - it timed out each time I tried it.)

I will admit to being skeptical of Amherst's math programs in general. I have noted on this blog in the past that the elementary math curriculum is one that a good experiment -- random assignment, large sample -- has shown to be quite a bit weaker than alternative curricula. The difference between weaker and stronger curricula in that study was about 10 percentile points (i.e., huge). Yet when I've talked with teachers and administrators about the weaknesses of our curriculum, I've heard a lot of nonsense about how that (exceptionally well-done) study isn't relevant to Amherst. So, rightly or wrongly, I interpret your reaction to Catherine as knee-jerk defensiveness, especially when you intimate that neither Catherine nor Steve is capable of evaluating the research. They do research for a living (as I do), you do not, and I have seen no evidence that either of them is out to destroy the Amherst schools. Quite the contrary.

Here goes:

Lots of parents are FED-UP with the math curriculum offered by ARPS and have been for many years. In response to this persistent and unaddressed anger, we hear from folks like Nina and Ken that the problem isn't the curriculum; it's more about (misdirected) perceptions and lack of teacher support. We hear from M. Hayes that the reason not as many students as he'd like take extensions is that many (most?) incoming 7th graders aren't prepared. And yet, he is a proponent (I believe) of the Investigations curriculum (and indeed held the position of curriculum director for a year).

I have yet to find any valid studies showing Investigations is *superior* to other options that are available. I have an email out to the National Science Board about their lack of leadership on this important issue. The US continues to lag behind other countries (I think we are 17th) in producing students for STEM fields. Our Math administration seems more interested in the idea that students 'like' math than in actual achievement. This is the mission statement from the NCTM website: 'We envision a world where everyone is enthused about mathematics, sees the value and beauty of mathematics, and is empowered by the opportunities mathematics affords.' Nothing mentioned about mastery...I invite everyone who is interested to visit the NCTM website, an organization that is frequently cited as the guide for our ARPS math.

What it has now come down to for me is: (1) is reform math, as practiced in Amherst (Investigations in ES and extensions in MS, IMP in HS), the best method to educate our kids? (2) who gets to decide the method of math education used: producers (teachers and administrators) or consumers (students' families)?

If school boards have the power to determine whether creationism is offered in place of (or in addition to) evolution, then surely it is within the power of the school committee to decide between reform math, traditional math, or something in between.

I have lost any confidence in the authority of the administration to make this decision.

Lighten up ARHS Parent: Catherine and Steve can take it. I'm sure that, on their respective academic journeys, many times they have been challenged at least as sharply as Nina does here.

If Nina really disdained everything that Catherine says, she would have stopped reading this blog and spent her time doing something that gives her infinitely more pleasure.

The Sanderson-Koch counterpoint, as well as the contributions of Jensen and Hood, are what keeps us coming back to read here. And unlike ignorant loudmouths like me, Nina comes back with good stuff: information, links to other research, all of which makes for a richer experience. (I went to the trouble of printing out the documents she links to above.)

I'm convinced that more people read this blog than ADMIT that they read this blog! There's a reason for that.

Rich Morse

So, I'll say a few things here, for what that is worth -- and most importantly, I think this is a really important dialogue to be occurring.

First, I do research for a living. I have a PhD from Princeton (based on conducting research), I have tenure at Amherst College (if you google Catherine Sanderson, many links to peer-reviewed articles in highly competitive journals will appear), and I'm on the review boards of major journals in psychology. I therefore do have a pretty good sense of how to conduct and evaluate research.

Second, I've reviewed A LOT of research on educational issues over the last few years (especially on math, but also on school size effects, class size effects, equity of low income kids effects, physics first, etc.), given my role on the school board. And I actually agree with Nina -- it is VERY hard to really empirically test a lot of education issues as much as we would like, GIVEN that these studies (typically for ethical/practical reasons) aren't done using the "gold standard" of research (e.g., random assignment).

That being said, the research on IMP is very, very lacking. The cites that Nina provided are produced by the makers of IMP (hardly an objective source), and many of these cites are actually to conference presentations or unpublished manuscripts (meaning they have not met the standard of peer-review). I haven't read a single published study in a peer-reviewed journal that shows this is a better approach (or even an as good approach) as the traditional sequence, and I find that concerning. Caren Rotello has also been unable to find such information, and she too is highly familiar with conducting and evaluating research. Nina, if you have actual studies showing IMP is better that have appeared in peer-reviewed journals, please send those cites.

But the thing that concerns me the most is that my column is criticized because the research I cite isn't using random assignment (which is a fair critique), but then when studies ARE done using random assignment that show, for example, Investigations isn't effective (there is a study showing this) or that professional development (e.g., math coaching) isn't effective (there are two recent studies showing this), Nina (and others) also criticize those studies (which are appearing in peer-reviewed journals) as not applying to Amherst.

And finally, if teachers and administrators within the Amherst system are unwilling to accept ANY research from ANY other district as being relevant whatsoever to Amherst, why aren't teachers and administrators coming to SC meetings regularly and DEMANDING that we examine the effectiveness of our programs within the Amherst system? I've been on the SC for 2 years, and I attended SC meetings for a year before that. In all of that time, I haven't heard a single teacher or principal saying to the SC "we really need to know how our unique 9th grade science class/unique approach to 7th grade math/elementary math curriculum/IMP HS math program is working out -- can you please devote some funds in the upcoming year to hiring an independent outside expert to evaluate our program so that we can really know if our program is working as well as it should for all kids?" Such a request would CERTAINLY be taken very seriously, and I believe would receive full support from the entire SC. Yet when SC members, including me and Steve, have asked for such evaluations, we have been criticized by teachers and administrators and community members for doing so.

In sum, what I've heard is that (a) we can't make decisions based on research with poor designs and no random assignment, (b) we can't make decisions based on research with good designs and random assignment used in districts other than Amherst, and (c) we aren't interested in having research done on our programs in Amherst (and will criticize those who think we should analyze the effectiveness of our programs). I guess I'm wondering if there is any evidence beyond teachers' and principals' own beliefs about the effectiveness of our programs (or lack thereof) that would convince those who continue to support our approach that a change is needed?

I do not always agree with you Mr. Morse, but I do always enjoy reading your thoughtful posts, and I also always appreciate Nina's contributions to this blog.

Others also contribute to the point/counterpoint discussions that make for thought-provoking reading. And that is what brings me back to this blog time and again. Thank you to all who contribute so thoughtfully. I am not in the education field, so I do not have the knowledge to contribute to the conversation. But I have learned a lot from the give and take and truly appreciate all who take the time to make such thoughtful posts.

I have always believed that gathering information is a good idea. We should evaluate IMP at ARHS. Also, in my experience, extensions are not "advanced mathematics problems", merely more mathematics problems. We have to change MS math.

Catherine,

I did not say that you don't understand research. I expressed a concern about how you present it to the general public. I don't understand why you would throw that result from California into your column, as though the research has spoken and now we know the answer. We don't know the answer. The answer is very hard to know. I notice that you have not provided a link to the study here, by the way. Was it from Mathematically Correct?

Since you bring up the example of your posting on professional development, I will identify that as another instance of an unscholarly approach. Professional development is a form of teaching: it's teaching teachers. It can be done in lots of different ways and I imagine that some are more effective than others. But would you post two studies about teaching and then say "Okay now we know teaching doesn't work"? Or two clinical drug trials and then say "Okay now we know medicine doesn't work"? It just doesn't make sense. (By the way, you report me as criticizing those studies as not applying to Amherst. I did not comment on that thread at all.)

And in fact, those two studies that you did post did not show a lack of effect. Instead, they showed a small positive effect that could not be declared statistically significant, due to the small sample size. The studies covered a period of only one year. If the same positive effect continued over a longer period of time or with a larger sample size, then the probability that it occurred due to the intervention (rather than due to chance) would go up. I believe one of the studies is going to present a two-year report and I will be interested to see that when it comes out.

(continued below)

(continued from above)

I think many people in the general public don't understand the use of the term statistically significant. They think of the common usage of the term "significant," which means noticeable, something worth remarking on. If I flip a coin a hundred times and get 51 heads, everyone would shrug their shoulders. It's not significant, either in common parlance or statistically. If I flip a coin a million times and get 51% heads, the average person would probably still shrug. But an experimenter would be astonished at that result. The probability of that occurring simply due to chance is very very low. Whatever I did to that coin to give it a bias was very effective. The result would be considered statistically significant. But still, it's only a small bias. Who needs a coin that comes up heads 51% of the time instead of 50% of the time? Would it help you win bar bets?
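The coin-flip arithmetic in the comment above can be checked directly. A minimal sketch, using a normal approximation to the binomial distribution (standard for sample sizes this large; the function name is my own):

```python
from math import erfc, sqrt

def p_at_least(n, k):
    """Probability of k or more heads in n fair-coin flips,
    via a normal approximation with continuity correction."""
    z = (k - 0.5 - n / 2) / (sqrt(n) / 2)
    return 0.5 * erfc(z / sqrt(2))

# 51 heads in 100 flips: happens by chance nearly half the time.
small = p_at_least(100, 51)

# 51% heads in a million flips: essentially never by chance.
large = p_at_least(1_000_000, 510_000)
```

The first probability comes out around 0.46 (shrug-worthy), while the second is astronomically small, which is exactly the contrast the comment draws: the same 51% proportion is unremarkable in a small sample and wildly "statistically significant" in a huge one.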

So, it is possible that small effects can be declared to be statistically significant. We saw this in the data that you presented about the effectiveness of extra science courses. Your headline for the posting is "What the Research Says," again suggesting that there is one answer and we just have to look it up. Here it is:

posting on high school science

I would direct people's attention to the graphic, which appears to show that an extra year of high school chemistry will improve student performance in college chemistry. The vertical axis says "Difference in College Grade." What it doesn't say is that the measurement scale on that axis is 1%. The students with an extra year of chemistry scored on average around 82%, whereas the kids without that extra year scored on average around 80%. It's a statistically significant difference because the sample size in the study was quite large. But I would hardly call it remarkable. I think the kids who took an extra year of chemistry should have more than a two point edge. And that doesn't even take into account the obvious selection effect which comes from the fact that the stronger chemistry students will tend to be attracted to taking the second year in high school.
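The significant-but-small pattern described above can be illustrated numerically. A hedged sketch: the 82% vs. 80% means come from the comment, but the standard deviation and sample sizes below are invented purely for illustration.

```python
from math import erfc, sqrt

def two_sample_z(mean1, mean2, sd, n1, n2):
    """z-statistic and two-sided p-value for a difference of two means,
    assuming a common (here, hypothetical) standard deviation sd."""
    se = sd * sqrt(1 / n1 + 1 / n2)
    z = (mean1 - mean2) / se
    return z, erfc(abs(z) / sqrt(2))

# 82% vs. 80% average grades; sd = 12 points and n = 2000 per group
# are made-up numbers, chosen only to show the pattern.
z, p = two_sample_z(82, 80, 12, 2000, 2000)

# Standardized effect size (Cohen's d): small despite the tiny p-value.
d = (82 - 80) / 12
```

With a couple of thousand students per group, even a two-point gap clears the significance bar (p well below 0.001 here), yet the standardized effect size stays small (d below 0.2), which is precisely the distinction between "statistically significant" and "practically remarkable."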

I am not challenging your credentials, Catherine. I am saying that because of those credentials you have an obligation to present research very carefully.

Something that Abbie said resonates for me:

“Who gets to decide the method of math education used: producers (teachers and administrators) or consumers (students' families)?”

Frankly, I am much more interested in exactly what we are doing here in Amherst, and whether or not we are pleasing our customers, than what general research shows (or does not show). Customer/consumer input is very important in this.

I think we all agree on the need for data. But the data I want to look at is data about our own system, not research papers. And I think we lack that. Note that a survey saying 90% of people like what we are doing is not good enough. I want to dig into the 10% to find out what specific problems those 10% are having.

What frustrates me is the following, and not just with math but with many areas:

We have parents who say “this is not working” and we have teachers who say “yes it is,” and that tells me absolutely nothing about what the real problem is. I wish we could all get together to discuss specific problems that parents/students are having instead of these general comments. I don’t see how we can know what to do to improve things without that. Because we have apparently not done that (have we?), we now find ourselves with parents who say “just do it this way,” proposing fixes that they want implemented immediately.

Catherine,

I am one of the math faculty at ARHS who teaches both IMP and traditional math courses. I read your editorial in the Amherst Bulletin and thought I would take this opportunity to share some facts about the Interactive Mathematics Program (IMP).

Students at ARHS who take IMP are not limited in their ability to take higher-level math courses. Students who take Algebra 1 in the eighth grade can go directly into IMP 2 in their ninth grade year. These students will finish IMP 4 in their junior year and can take AP Calculus at our school as seniors. If a student takes IMP 1 in the ninth grade year, they can take Calculus when they get to college.

We help students and families make the decision between IMP and traditional math in at least two ways. First, the IMP teachers prepare an IMP mini-lesson and visit each of the eighth grade math classes to demonstrate what IMP is like. After this mini-lesson, we have a question and answer session with these students and make it clear what their choices are and what their course sequence would look like for their four years in high school. The second thing we do is present information along with a question and answer session to the incoming ninth grade parents at an evening hosted at the high school. Beyond that, all of the IMP teachers offer their time to answer more questions and concerns by phone, email, and in person.

IMP is one of the few curricula which addresses the NCTM (National Council of Teachers of Mathematics) Standards which were first drafted in 1988.

At ARHS, it is not difficult to move from the traditional math sequence into IMP, or from IMP into the traditional math sequence. Roughly one quarter of the students in my IMP 3 classes this year were previously in our traditional math sequence. I have worked hard to make sure their transition was smooth, and I believe my students would agree. At the beginning of this year, a few students moved from IMP into our honors Algebra 2 course (which is challenging) and they have done well.

Very recently, many of my IMP 3 students took the SAT for the first time. Many have reported to me that they did fine (over 600) but would like to do better. In my traditional math courses, I hear the same comments. When these same students took the MCAS math test last year as sophomores, all but one of the IMP 2 students passed the math portion of the MCAS test. I cannot (unfortunately) report the same positive statistic about my traditional math students.

One of my IMP 3 students participated in two very challenging math contests this year: the Math Olympiad and the AMC math contest. She took the 11th and 12th grade test and scored as well as students from our school who are in traditional honors math courses.

Our IMP students are prepared for college level math courses. We cover all of the topics necessary for success in calculus. We even cover the derivative in IMP 3 as well as a six-dimensional linear programming problem using matrices. By the end of the four years in IMP, the students have covered a significant amount of probability and statistics, which prepares them for similar study in college. Our students who would like to go into engineering, psychology, marine biology, the humanities, and all other areas of study are prepared for college level math courses.

In addition to educating our students in high-level mathematics, the IMP curriculum offers many other positive aspects such as writing effectively, speaking knowledgeably, clearly, and persuasively, and working successfully in groups to solve problems. These additional attributes are a part of our high school’s learning expectations, and help set the stage for success as our students move forward in their lives.

Rich Eckart

To: Rich Eckart June 13, 2010 5:11 PM. My son did not go to this high school; but his senior year at his high school, he took differential equations, along with 15 other classmates. Is that an option at Amherst H.S.?

If differential equations are not available at ARHS they certainly are at one of the colleges a few blocks away.

Realistically, how many students are ready to take that course in HS?

According to Wikipedia, the Bronx HS of Science -- the Gold Standard in public science & math education -- didn't start offering linear algebra & diff eqs until 2007. ARHS should not try to be in that category of HS, given the limited resources it has and its far broader mission.

With the Five Colleges and two community colleges 1/2 hour away from Triangle Street, there is no valid reason to put money into courses that only 1% of students can take.

Is our IMP at ARHS a unique curriculum or is it a standardized one from elsewhere?

We looked into IMP to find out what it might offer that a traditional curriculum does not. I had a number of very good conversations with math teachers in the high school about it. They were very helpful and receptive to questions (I was unaware of any parent night to discuss this or I would have attended!). During these conversations, I heard a lot to recommend IMP but in the end, we chose the traditional curriculum. Largely, I have to say, because I was told "it was very difficult to move into the traditional sequence if you start with IMP." I didn't want to commit my daughter, in the ninth grade, to three years of IMP (she would enter at IMP 2) since she had never experienced anything like that before. Now I read that it is easy to transition back-and-forth! I think this part of the process needs more information (i.e. how many kids transition back and forth each year, each grade, etc. and to/from which courses) because I imagine there are other parents out there who rejected IMP for the same reason I did.

The one thing I DO like about IMP is that it is optional!! It is a different sort of curriculum but no one has to sign up for it. Unlike Investigations, which our kids are required to experience in elementary school. Or extensions, which our children are required to complete in order to access algebra in the 8th grade. So from that perspective, I think that the introduction of IMP at ARHS was done in a much more reasonable way than some other curricular changes (including mandatory study halls) we have seen.

To Richard Eckart:

I am the parent of two elementary students. Thank you so much for the very clear explanation of the IMP program in our high school.

It is much easier to think about and evaluate both the "traditional" and the IMP programs when we get to hear from the people who are actually working with the programs.

A few assorted responses:

Anonymous 11:51/12:18 - thank you both for your thoughtful comments. I agree that this blog does allow for very interesting back-and-forth, and also that having data is good!

Nina - a few thoughts. First, I've heard concerns from many college professors about the IMP approach to math in terms of preparing kids for college-level math. That includes professors at U Mass-Amherst, and also professors at other institutions. I frankly think that feedback matters a lot, because no matter what HS teachers think about its effectiveness, if college professors see IMP-trained students as less prepared than traditionally-trained students, that is a problem. Two professors in California have written about this extensively: Dr. Milgram (Stanford) and Dr. Wu (Berkeley). I will try to find a link -- or readers can google.

But read my column - I didn't say "so we should eliminate IMP immediately." This is how it ended: "As part of the upcoming review of the mathematics curriculum in Amherst, we need to objectively evaluate the effectiveness of both the IMP and extensions programs so that we can adequately advise students about both the benefits and costs of their choices, and ideally help all students make choices that expand rather than limit their possibilities." I would imagine you agree with the need for us to evaluate IMP at ARHS, right?

In terms of professional development: the two studies I posted involve random assignment and multiple schools/districts. They both show no statistically significant effects. And yes, if, in a medical field, a treatment for cancer showed no statistically significant effects after a year, that treatment would likely be stopped immediately! Can you post ANY (even one) link to a study (using random assignment) that shows professional development IS effective at increasing achievement? Again, you said that the studies on math I cited weren't using random assignment, but then you criticize the professional development studies as only studying the effects for one year -- so that isn't important information? And I'm sure you know that a non-statistically significant effect does NOT automatically become significant over time ... and that it is misleading to say that there were "small positive effects" since they weren't considered significant.

One more point: the data I presented about the benefits of extra science was actually NOT showing the benefits of extra science -- it was showing that the single best predictor of success in ALL sciences was the number of years of math taken. And in that sense, it is a really remarkable finding -- having more math in high school is a better predictor of success in college science THAN more high school science! That is one reason why I'm so concerned that the math department at ARHS has the lowest requirement (only 2 years) in the state. I assume you are pushing your colleagues to increase the requirement to 3 or 4 years?

More from me:

Rick - I'm glad you care about what the consumers think -- and I think this is important. And two more points:

1. I think research should inform all that we do -- and if data from other places is showing something (e.g., program X benefits students more than program Y), I would hope we would learn from that and not try to constantly re-invent the wheel. If I have cancer, I'd prefer to know what the research says about which treatments used across the world will help me live longer than to know only how the treatments at the hospital where I'm being treated are working!

2. I think we also of course need to know how things are working in Amherst (e.g., extensions, 9th grade ecology, IMP) -- and I believe we could do a lot better job of evaluating that. We have a lot of resources in our community, and I bet it wouldn't be hard to have professors (or even college students) take on this evaluation. I haven't seen a lot of interest/willingness in the district to do that, however. Perhaps Nina can push for more use of evaluation among teachers, which I think would be really helpful.

Rich Ekart - thank you so much for posting! I really appreciate your willingness to be involved (and using your own name) in this discussion. I found your information about IMP very interesting, and I'm sure other readers did as well. Given your positive experiences with IMP, I think it would be great to do a study in which we tested achievement (e.g., MCAS, IMP) on kids in both tracks, and then provided that information to parents/students in 8th grade when they need to make a choice. This is precisely the type of information we need to have -- does IMP work as well as (or better than) a traditional program, and does IMP work better for some students than for others? I'd be glad to work with you (or others) on that, if you would be interested! Thanks again for adding your thoughtful perspective here.

Anonymous 9:04 - let me clarify two important things about taking classes at the colleges. First, ARHS only allows students to take AFTERNOON classes, and many college math classes at Amherst College only take place in the morning. Thus, the options are more limited than you might imagine. Second, although Amherst College allows students to take classes for free, U Mass charges $1500 a class, which really is out of reach for many families. I don't think we should assume college-level math is available to all students, as you describe.

Anonymous 1:48 - IMP is used in other districts; not all, but some.

Alison - I agree that more information (on the success of IMP, the ease of transitioning in and out, etc.) should be available. As Rich Morse noted, this is a difficult decision for 8th graders to make, and more information would help families/kids make a more informed choice.

I think that an important piece of the puzzle that is missing is the way students think. After taking regular or "Standard" math in middle school and failing abysmally, I decided to try IMP, and it was the best decision I ever made. I don't think like a "Standard" math student; I can't simply memorize an equation and figure out how to apply it to a situation later on. The IMP program lets students figure out a math concept, like y = mx + b for example, using a story or central problem. Once students have practiced figuring out how to use said equation, then they know when to apply it. This is not the same in Standard math. As for college, several good schools, including Harvard, MIT, and Amherst College, have accepted IMP students, as shown by the website below:

http://www.mathimp.org/research/college.html

After further thought, I think that, should I go back to the standard math program, I would run into one serious problem: my teacher getting frustrated with me because I ask "Why?" too much. IMP teaches students to ask why, an important question not only in math but in real life. Further, I'd also like to say that, as a childhood math hater, after taking IMP I'm quite attached to math now, and if I had to go back to standard math I highly doubt that I'd be half as successful as I am in IMP now.

I am currently taking IMP and I love it. I took Algebra 1 my freshman year and found that it simply wasn't the right type of math for me. I am not someone who works well by simply memorizing formulas -- I do much better learning to understand the legitimacy of math and how it works in our present-day world. On the SATs I forgot the formula for a certain shape and was able to use what I had learned in IMP to make the formula myself.

Almost every day I find myself explaining to students taking traditional math how the math they are learning fits into their lives and everyday world. The other day my friends were complaining about matrices and saying that all they would ever be useful for was computer programming. I explained that matrices are actually useful in many situations in which you have more variables than are easily solvable by algebra. By using matrices, problems with 6, 7, 8 and more variables are easily solved.
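For readers curious about the point above, here is a small sketch of my own (an illustration, not anything taken from the IMP materials) of the matrix method that makes many-variable systems routine: write the equations as an augmented matrix and apply Gaussian elimination.

```python
# Illustrative sketch: solving a linear system by Gaussian elimination.
# The function name `solve` and the example system are my own choices.

def solve(a, b):
    """Solve the system a*x = b, where `a` is a list of coefficient
    rows and `b` is the list of right-hand sides."""
    n = len(a)
    # Build the augmented matrix [a | b].
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Pick the row with the largest pivot to keep arithmetic stable.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate this variable from every row below.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back-substitute from the last equation upward.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x

# Three equations, three unknowns:
#   x + y + z = 6,   2y + 5z = -4,   2x + 5y - z = 27
x = solve([[1, 1, 1], [0, 2, 5], [2, 5, -1]], [6, -4, 27])
```

The same elimination steps work unchanged for systems with 6, 7, or 8 variables; only the size of the matrix grows.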

By learning IMP students actually learn to understand and apply math beyond memorizing and spitting out formulas.

I did not take Calculus in the Eighth Grade -- and what I did so that I could take Calculus in High School was to "double up" in the 10th grade and take both geometry and algebra II. As geometry does not really build on algebra the way that all of the other courses in the sequence build on the ones before them, this is a way for a student to "catch up" if desired. It is not easy taking two math classes, but is the option of taking algebra II as an *elective* in the 10th grade given to ARHS students willing to do the work? (And as it was explained to me, if they fail it, they then get to take Algebra II again the next year.)

And if Amherst doesn't permit this, why not????

And could someone explain to me how the "extensions" are anything other than a combination of busywork and a system of rewards for "teachers' pets"?

Notwithstanding this, if the 2-math option is available, it does negate some of my concerns about the "extension" process...

"If a student takes IMP 1 in the ninth grade year, they can take Calculus when they get to college." No. Not someone like me. I was a liberal arts student, and you don't take calculus in college if you are that. Even one so inclined would be talked out of it.

Furthermore, calculus in college, like a foreign language in college, is a whole lot easier if you had some of it in high school.

I was Pre-Law in intent, and took courses that would be relevant to where I thought I was going -- and calculus wasn't one of them.

My point: High School is the absolute last exposure to a universal education that a child is ever going to have. The focus narrows in college, and a high school calculus course has a far more diverse audience than a college one.

And if not having calculus in K-12 is so acceptable, why then are we also seeing attempts to bring parts of it into third grade math? Not that I favor that either, but "reformers" appear to be at cross purposes...

Why are students the only ones being evaluated in the district?

I was at a high school meeting on their math program, and a parent asked the math department head if the IMP program has been evaluated. She said it had been nationally but not at ARHS. Then another parent asked if it would be. The math department head said two things: the kids that take IMP are generally lower performing kids, and she didn't know how they could evaluate the IMP program at the high school.

I left the meeting thinking that, in a way, it makes sense that high school teachers don't necessarily know how to evaluate their programs. This is a separate skill from teaching. The Science Department's evaluation of the Environmental Science and Ecology class is an example of a well-meaning but not so great evaluation. But there are many people in town and in New England who do evaluate educational programs. I sat next to such a professional yesterday as I watched a Little League game.

I, like a lot of parents, am wondering if the IMP program at the high school is effective. Is it an effective curriculum, and is it being well implemented? Are our kids doing better with this program than one would expect, or than they would with another program? I wondered this about the middle school Extensions math and Investigations as my children went through them.

Why are these questions so upsetting to some teachers and administrators? Are our children the only ones to be tested here? Are we, as parents, wrong for asking these questions or for asking for regular program evaluations? I frankly just don't understand it.

The hard push to examine these programs and others in Amherst is not coming from the administrators or teachers. It is really coming from parents -- and School Committee members are responding to parents' concerns that have been voiced for years and years.

We are now seeing some of the benefits of this push with the reports last year and the ones underway. Let's continue the good work of evaluation to make sure that our schools are doing the best by our children.

Janet McGowan

Abbie:

Could you consider the questions raised by Anon 5:32 on the "Amherst-Pelham officials pursue ..." thread about Mr. Hayes's comments and add your responses here, because you offer the same report of a comment by Mr. Hayes here that may be construed as a "fact" by one or many who read this blog.

@ Rich Eckart

Thank you for your excellent explanation of IMP. When my daughter was looking to transition to the HS we thought she might really enjoy the IMP approach. However, we were scared off by two things. 1. In contrast to what you say here we were explicitly told in the presentation that it was difficult to transition to the traditional math track and/or Calculus. 2. There was no information on how ARHS IMP students do on the SAT exams relative to kids in the traditional math track. This seems particularly important because colleges will likely consider these tests more important for applicants from a non-traditional curriculum.

We live in this small town with people from all around Massachusetts, the United States and the world. But when we talk about our schools, I feel like none of those other places exist. Here's a question: what do other schools do for math in middle school? Catherine Sanderson wrote a report for Superintendent Rodriguez looking at what comparable Minority Student Achievement Network (MSAN) schools do, and none of their math programs look like Amherst's.

Most of those MSAN middle schools have more kids taking algebra classes and algebra is offered in many grades. Students who do algebra in middle school are more likely to go on to take more math classes in high school, take higher level math classes, and go to 4 year colleges, than students who don't take algebra in middle school. Taking math in middle school is critically important. Also, no other schools offer Extensions or, as far as I know, ask 12 and 13 year olds to make decisions about math that will affect their entire high school math sequence.

While Extensions (and likely ample coaching at home by parents) has resulted in a good percentage (40%) of Amherst middle school students taking Honors Algebra in 8th grade, no Amherst middle schoolers take regular algebra. That class is not offered. So by the end of middle school, 60% of Amherst middle schoolers have not completed an algebra class.

That is a lot of kids not doing algebra -- especially when compared to almost all of the other comparison MSAN schools. While we have 40% of kids taking Honors algebra by the end of 8th grade, Brookline has 80% completing algebra (regular or honors).

Why the big gap? Why doesn't our middle school offer regular algebra classes? Why are so many fewer Amherst 7th and 8th graders "algebra ready?" Is there something wrong with Amherst kids that so many can't do algebra in middle school? And speaking of gaps, Extensions has not closed the achievement gap.

continued...

When looking at ways to assess algebra readiness, I am in no position to ruminate on the proper assessment tool. But since Amherst consistently comes in with the lowest numbers of kids found to be ready, it seems to me there is something wrong with the assessment. Why not look at how other schools successfully identify more students as ready to learn algebra -- and how they go on to teach more students algebra? We are not living in an isolated pioneer village on the Great Plains in 1840. Yet our middle school does not look to other schools for assessment -- and I'm not sure it sees that there are any problems in the math program. Years of parent complaints about the selection process (largely hidden from view until recently) and the amount of home help required by Extensions have been ignored.

I am happy that we are meeting with Principal Hayes and Mr. Zakon-Anderson, the head of the Math Department. They have gotten an earful. But these meetings were asked for by parents, including myself, who are impatient. We are ready to see these problems actually resolved. We are parents of graduating seniors and middle schoolers; most of us have 6th graders coming into the middle school, and some have younger children.

We want to see more Amherst middle school kids taught algebra, just like at schools in other places. We want to see changes implemented now that are shown to work elsewhere, because we don't think our kids are that different. None of us wants to wait for the K-12 math review. I have been waiting for this review for 7 years. (Strangely, the math review will be conducted mostly during the summer months, when schools aren't in session and many teachers, parents and students aren't available.) What will the review tell us that we don't already know -- or that the district can't find out on its own? This review won't be completed in time to help the entering 7th graders this fall.

These problems have been discussed for many, many years. It's time to take steps to increase the number of middle schoolers who complete algebra -- and please let's take the steps that other schools already know work.

Janet McGowan

What I don't understand is why this is so hard to figure out. Schools have been teaching math for generations. Why is it so hard to have a mapped-out math program that goes from K to HS? And why is it so hard to get all of our students ready for algebra in the Middle School?

What am I missing?

It's only hard when you have schools run by people who can't make decisions. It's all about leadership and the lack of it.

Catherine,

Here is a report with 9 studies about professional development:

research report on professional development

The studies meet WWC criteria, which is very hard to do. This report was referenced in one of the studies that you posted.

As you will see in the report, all nine of those studies did in fact find that professional development increased student achievement:

"This report finds that teachers who receive substantial professional development—an average of 49 hours in the nine studies—can boost their students' achievement by about 21 percentile points."
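For readers wondering what "21 percentile points" means in reports like this: one common convention (my assumption about this particular figure, not something stated in the quote) is to ask how far the average student, starting at the 50th percentile, would move up if scores rose by the study's average effect size, assuming normally distributed scores. An effect of roughly 0.54 standard deviations works out to about 21 points:

```python
# Illustrative sketch: converting a standardized effect size into
# "percentile points" for the average student. The 0.54 figure below
# is an assumption chosen to match the quoted 21-point gain.
import math

def percentile_gain(effect_size):
    """Percentile points gained by a 50th-percentile student when
    scores shift up by `effect_size` standard deviations, using the
    normal cumulative distribution (via the error function)."""
    new_pct = 0.5 * (1 + math.erf(effect_size / math.sqrt(2)))
    return 100 * new_pct - 50

gain = percentile_gain(0.54)  # roughly 21 percentile points
```

The conversion runs both ways: a reported percentile gain pins down the implied effect size, which makes it easier to compare such reports against studies that use other metrics.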

But I still wouldn't consider that we have a definitive answer on this. There are many different ways of offering professional development and of course many different ways of measuring student achievement. Standardized test scores are only one such measure.

And that is really my original point: we shouldn't kid ourselves that there is an answer out there and all we need to do is look it up.

I am an IMP student at ARHS, and naturally am somewhat biased. However, I would just like to say that I think it is incredibly great that we have two different types of math courses at ARHS. I'm the kind of person who needs more than just numbers in order to learn, and I like to ask a lot of questions. In class I always strive to understand what I am doing. IMP is very fitting for students who want to completely understand the math that they are doing; students such as myself.

I don't think that future success in college and beyond is limited by the math course we take. Honestly, that sounds a bit ridiculous. Each student is an individual and learns differently, so being able to choose between the styles of math classes is very helpful, because students can discover how they learn and what is best for themselves. I know students in the traditional math courses who have said they just couldn't do IMP because they don't learn that way. I also know students in IMP, including myself, who say that this learning style suits them, and that they wouldn't be able to learn the same way in traditional math. Each student is different, and I don't think it is right to assume one way or the other for the whole student body. Each and every student has the chance to succeed in their own style and for themselves, by having the opportunity to choose which math courses to follow.

As for myself, IMP has been the perfect math course. I am so glad to have chosen it, and I do not doubt that I will have a good college experience. I am in the process of applying right now, and I can say from experience that choosing the IMP course does not limit my options. I will find the college that is right for me, and my choices in math will have helped to get me there.

I can understand the concerns about switching from one course to another, but in the long run, what's right for the student as an individual and learner is what I think is right.

One last thought--IMP is a great course and I think I can vouch for my classmates when I say that we love what we do in IMP and are extremely proud to say that we are IMP students.

I too am an IMP student at ARHS. I am an honors student and most of my classes are just the right amount of challenging. I struggled with math all through elementary and middle school. Learning algebra was the worst. I could not understand what all those numbers and letters meant. I am sure that, if I had not had the choice to take IMP as a freshman, my GPA would not be nearly as high as it is now.

Not all students learn the same way. In Amherst, we pride ourselves on being a school that produces exemplary students. Part of that is teaching to the needs of the students. It is important to provide varied educational models for students who have different strengths. You do not put an engineering student in a performing arts class. Many students do not understand the IMP program and the way it is taught. That is fine. I am a student who doesn’t understand how “standard” math is taught. Why should I be penalized for the way that I think? English Language Learners have their program to help them learn our language. We have special education classes for those students who need them. Why should those of us who learn differently be treated any differently? We deserve our program as much as the next student.

IMP is a math program that gives the student the skills to solve any type of problem. We learn problem solving skills, we don’t just learn how to memorize the correct formulas. We learn what makes a problem, the basic rules to math, and then we build on them so that we can break a problem into its basic nature in order to solve it.

What other freshmen can say that they learned trig and understand it? What other students have the skills to write an equation for the area of a polygon based on the number of sides that the shape has? We learn probability and statistics, something that is extremely useful in life, and something that is also not taught in the standard math program.
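As an aside for readers: an equation of the kind mentioned above can be built by splitting a regular polygon into triangles around its center. The sketch below is my own illustration of that derivation, not the IMP materials' version:

```python
# Illustrative sketch: area of a regular polygon with n sides of
# length s, derived by cutting it into n identical isosceles
# triangles meeting at the center.
import math

def regular_polygon_area(n, s):
    """Each triangle has base s and height s / (2 * tan(pi / n)),
    so the total area is n * s**2 / (4 * tan(pi / n))."""
    return n * s * s / (4 * math.tan(math.pi / n))

# Sanity check against a shape everyone knows: a square (n = 4)
# with side 2 should have area 4.
area = regular_polygon_area(4, 2)
```

Plugging in n = 3 with side 1 gives the familiar equilateral-triangle area of sqrt(3)/4, a quick way to check the formula by hand.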

Many colleges accept IMP students. If you go to this link (http://www.mathimp.org/research/colleges/index.html) on the IMP website, they have a list of the colleges which have accepted IMP students. As you can see, there are many “good” colleges on that list.

I just finished my junior year and took SATs this past spring. I scored in the 81st percentile on the math portion and I am going back this fall to attempt to do better. IMP does not hinder, it helps. We learn skills for any type of math, skills that we can apply in all aspects of our life.
