
Letter to State College, PA Board President on Investigations Math

This is a copy of the letter I sent to the school board president in State College, PA on 5-12-09.

Dear Mr. Madore,
Steve Piazza forwarded me your email asking for research regarding Investigations math. You mentioned that you had seen a number of biased sites on the web and wanted something objective. Let me assure you that my website at www.oaknorton.com is 100% biased. However, 4 years ago it was completely neutral. It was only over time that I came to realize the damage being done by these programs and the lies being put out by the publishers to sell books. I apologize for the length of this email, but this is not a simple problem: it requires a little background to establish my "credentials" for writing to you, and a bit of information to specifically address your concerns.

A few years ago I was asked by the Utah State Director of Curriculum how he could get more parents involved in their children's math education. I replied, "Easy, just implement Investigations math."

I got started here in Utah (I'm a SCAHS '87 alum) when my 3rd grader wasn't being taught the times tables. I was told that they weren't part of the Investigations math curriculum and that "all the research showed this was the best way to teach kids." I was also told that even though this was different from how I was taught as a child, the kids turned out just fine. I can assure you without reservation that this is not the case: the local community college now has a remedial math *department* (not just classes) for the more than 40% of incoming freshmen who require basic math at pre-college levels.

When I got started, I did a non-scientific poll to ask parents (and even a school board member) what they thought about the math. You can see the questions and results starting here if you’re interested:
http://www.oaknorton.com/imathresults.cfm

Over time, I had many discussions with this school board member, and I think she grew quite weary of me, until one day her child's teacher broke her leg and was out of school for a while. A substitute came in and started teaching straight Investigations; previously, the older teacher had not followed the district plan to teach straight Investigations math. The board member called me after a few weeks and said, "Now I know why so many people don't like this program." This is the crux of the problem. Investigations versions 1 and 2 have such a lack of content that unless you have highly trained math teachers, the program will be a complete failure. These students are then pushed up through the system into algebra classes, where they become very frustrated and exhibit more behavioral issues. Eventually, in high school, your true math teachers (the ones who majored in math) pull their hair out because they have to spend so much time retraining students that they are prevented from covering their own material. Right now in our district (Alpine, Utah), the middle schools spend the first 10 minutes of class reviewing basic math facts before diving into pre-algebra and algebra, and this is deemed acceptable by our board. It's nothing less than insane, if I may show my bias clearly. 🙂

Now before I get into research, I'd like to ask you a question. Do you really want to implement a program that is among the most controversial available, one that causes so much community contention and leaves parents so angry that they start putting their children into private schools, charter schools, and home schools just to avoid it? You'd better have some extremely strong and compelling evidence from the other side (pro-Investigations) that it really produces results on par with Singapore and the best programs before moving forward with such a plan. My district in Utah has a dozen charter schools now, with applications for more. It has the highest number of charters per capita of any district in the state.

RESEARCH
This is where things get interesting. If you are an objective individual, and I have no reason to doubt it based on your email to Steve, then I think this will be a compelling argument.

Last year the national math panel released its first report on the early grades, and a secondary-grades report is due out this year (I think). The conclusion of the report was that there weren't many good, properly scientific studies available on specific curricula. That said, there is strong and compelling evidence about which types of programs are effective and which aren't, and there is a newly released study, absolutely scientific, that should provide what you need.

First, Project Follow-Through. This is the largest-scale federal education study ever performed, and it has been completely ignored by educators. The study produced conclusive evidence after tracking students from 180 schools (79,000 of them) for over 3 decades, into college. The results are astoundingly clear, as you can see from the graph on this page of my website. (I'm attaching the graph, but that page has more comments and background on the study should you want to read it.)
http://www.oaknorton.com/imathresults34.cfm
Note the comments from a researcher at the Fordham Foundation concerning Project Follow-Through:

"Until education becomes the kind of profession that reveres evidence, we should not be surprised to find its experts dispensing unproven methods, endlessly flitting from one fad to another. The greatest victims of these fads are the very students who are most at risk."
..."This is a classic case of an immature profession, one that lacks a solid scientific base and has less respect for evidence than for opinion and ideology."
...<snip>
"Education has not yet developed into a mature profession. What might cause it to? Based on the experience of other fields, it seems likely that intense and sustained outside pressure will be needed. Dogma does not destroy itself, nor does an immature profession drive out dogma.
The metamorphosis is often triggered by a catalyst, such as pressure from groups that are adversely affected by the poor quality of service provided by a profession."

If educators were truly concerned about education, they would look for what works and improve on that. Direct instruction programs were conclusively shown to work far better than constructivism and all the other fads.

Next, the federal government just recently completed a fantastic research project on first graders in a number of schools. The researchers will continue following these students in subsequent years, but they have already shown, with statistical significance, that Investigations math students were a full letter grade below Saxon and Math Expressions students. You can read the study here:

http://ies.ed.gov/ncee/pubs/20094052/pdf/20094053.pdf

Here is one paragraph from the conclusion:

“Student math achievement was significantly higher in schools assigned to Math Expressions and Saxon, than in schools assigned to Investigations and SFAW. Average HLM-adjusted spring math achievement of Math Expressions and Saxon students was 0.30 standard deviations higher than Investigations students, and 0.24 standard deviations higher than SFAW students. For a student at the 50th percentile in math achievement, these effects mean that the student’s percentile rank would be 9 to 12 points higher if the school used Math Expressions or Saxon, instead of Investigations or SFAW.”
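
To put those effect sizes in concrete terms (my own back-of-the-envelope arithmetic, assuming test scores are roughly normally distributed): about 62% of a normal distribution falls below a point 0.30 standard deviations above the mean, so a student who starts exactly at the 50th percentile and gains 0.30 SD ends up around the 62nd percentile, while a 0.24 SD gain puts that same student around the 59th or 60th. That is where the report's "9 to 12 points higher" figure comes from.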

On a side note, one of my co-workers, who thought I was nuts but was still a little concerned, took his daughter to a Sylvan center for testing. She was in the middle of her 2nd grade year and had a 94% A on her report card (Investigations math classroom). Sylvan tested her and said she had a kindergarten level of calculation skills and a mid-1st-grade level of comprehension skills. Needless to say, he was most upset at the grade inflation and at the lack of any real comprehensive teaching going on.

Last study: this is the ultimate study of all studies concerning constructivism. For years my district kept telling me "all the studies show this is the best thing since sliced bread," and I finally got fed up with it and called their bluff by filing a GRAMA records request to see what studies they had to support their math programs. They couldn't give me a single study. Then I found this gem.

http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf

Now I know I'm giving you a lot to read and digest, but the link above contains some great information: the authors detail how the learning process works under both instructivism and constructivism, and then they searched the literature for valid studies supporting constructivism and couldn't find any. There are none. In fact, the only studies that do exist, they document, show that these methods CAUSE HARM to children. If you want to read just the conclusion of this study, go to this page of my website. It's an impressive conclusion.

http://www.oaknorton.com/mathupdates/20080903.cfm

The only other thing you might look at is something I noticed on Steve's site just yesterday: a list of school districts that adopted Investigations but have since abandoned it. You can find it at the link below; scroll to the bottom to see the school administrators' comments.

http://parents-for-quality-math-education.googlegroups.com/web/Investigations_Success_Stories_Debunked.pdf

Now for the other side of research that is pro-Investigations…

To begin, as stated above, I can quite honestly tell you there are no studies that support it. (You are welcome to ask your district people to find you a study, and if they are able to produce one, I am quite confident it can be shot down, as the next examples will show.) What I have seen touted in support of Investigations is disturbing and inaccurate.

First, a few years ago one of my local board members sent me an email saying, "Oak, if you ever get open-minded you should go read the ARC study that showed conclusively that Investigations math works." I replied that I'd be happy to read it, since I'd been asking for any study showing our children would turn out OK. This was the best they could come up with; in fact, it was the only thing they could come up with.

I went to the ARC website and was amazed to see the most comprehensive blanket endorsement ever given regarding Investigations math, Everyday math, and Trailblazers (all NSF-funded programs).

“…The principal finding of the study is that the students in the NSF-funded reform curricula consistently outperformed the comparison students: All significant differences favored the reform students; no significant difference favored the comparison students. This result held across all tests, all grade levels, and all strands, regardless of SES and racial/ethnic identity. The data from this study show that these curricula improve student performance in all areas of elementary mathematics, including both basic skills and higher-level processes. Use of these curricula results in higher test scores.”

WOW, I thought. That's amazing. In no circumstance was this not the very best program. Then I started digging. Dr. Jim Milgram at Stanford (an expert in international math standards who is familiar with this research) pointed out to me that this center was founded, funded, and operated by TERC, the publisher of Investigations math (whoops). Then I contacted Sandra Stotsky, who was the Assistant Commissioner of Education in Massachusetts when the study was done, and she gave me these details:

"I am aware of several major problems with the MA part of the study. (1) As the Executive Summary admits, mostly high-income "white" schools were using the "reform" programs in the MA grade 4 sample, (2) no information is given on the supplemental tutoring that exists in these suburban communities (a hard factor to get information on without labor-intensive exploration at each school), (3) no information is given about supplemental curriculum materials the teachers themselves may have used–all we are told is that the schools that were contacted said they fully used the reform program. I know that many teachers in these high-income schools use supplemental materials to make up for the "reform" programs' deficiencies, (4) no information is given on the amount of professional development the "reform" teachers had (a huge amount in all probability) in comparison to the teachers in the comparison group (if no new math program, no professional development), (5) no information is given on the amount of time spent on math in the reform schools compared to the comparison group (the "reform" programs require a lot more time per week than most schools had been allotting math for many years. For example, I discovered that one Newton elementary school with top scores was considered a model because it taught math one hour each day!), and probably most important and relevant (6) the MCAS grade 4 math test was originally designed with a great deal of advice from TERC. TERC also shaped the math standards in the 1995 standards document that were being assessed by this test in 2000 (it is acknowledged in the intro to this document). TERC's supporters (and EM supporters) were on the assessment advisory committees that made judgments about the test items and their weights for the math tests. It is well-known that the grade 4 test reflects "constructivist" teaching of math. In other words, the grade 4 test in MA in 2000 favored students using a "reform" program."

Dr. Milgram at Stanford is the only educator to sit on NASA's advisory panel. He is there specifically to work on increasing the number of students capable of doing NASA-level math and on raising the level of top performers around the nation. He told me in an email that if these reform programs really worked, NASA, IBM, and others would be seeking out students who had gone through them K-12. He also said it is generally acknowledged that no valid study has ever been performed showing these programs work. I would be happy to put you in touch with Dr. Milgram if you would like to speak with him. He has helped write standards for many of the top-rated states and has vast knowledge about what works.

One other study touted by a BYU math ed professor was the Noyce study. It, too, claimed to show how well constructivism worked, but in the end Noyce refused to reveal which schools were tested, so the results could never be verified. Hardly an objective study.

CURRICULA

A few years ago, when I was curious to know what really worked, I asked my national contacts this question: "If someone held a gun to your head and made you name the 3 best math programs in the world, what would you say?" Amazingly, all of them responded the same for #1 and #2: Singapore math is #1 and Saxon is #2, unless you have weak math teachers, in which case Saxon is #1. Their #3 pick varied among a few other programs, but it was significant that they all said Singapore math #1 and Saxon math #2. In Utah, 8 of the 10 top-scoring schools are using Saxon. In Arizona, the top school in the state for several years was a Saxon school, until a couple of years ago when it was displaced by Benchmark elementary, which uses Singapore. Benchmark also polls its students each year about their favorite classes: 94% of ALL the students say math is their favorite subject. How would you like to have that kind of result in the SCASD? Can you imagine how parents would love you instead of cursing every time they see the words "Investigations math"? Believe me, it happens. I have 1,000 families on a petition, and most of the district doesn't even know it exists. One town in our district actually added a question about the program to a survey of its citizens, and 50% said they hated it. That's significant when a portion of the population doesn't have kids in school and doesn't even know what the program is.

Ask yourself whether you want a constant thorn in your side from parents like Steve and me. Do you want upset parents? No parent complains about the program when their child brings home solid math assignments and struggles with them. That's normal education. But as my children constantly brought home games to cut out and play with, and the stupidest assignments imaginable, it was more than I could handle, and this is what happens with your highly educated parents. They will take matters into their own hands, rebel against the district, and find alternative sources for their children to learn from. In our school district, this means the most involved parents pull their children out of the system, and the charter and private schools far outperform the local district because they get the children of the most involved, educated parents. Here are some graphs I did a couple of years ago that illustrate this:

http://www.oaknorton.com/imathresults29.cfm

VISION

Now for a totally separate plug… 🙂 There is nothing in any state law that says you can't achieve more than the current state standards require. There are better standards than what Pennsylvania is using. States in our country typify the "mile wide, inch deep" problem because we try to cover so many things during a school year, whereas a country like Singapore limits its standards to 15-20 items per year and spends the time to master them deeply. You may have a couple of alternatives. You may be able to petition your state board to adopt Singapore's standards and curriculum in your district as a pilot plan. If not, you could still use it coupled with PA's standards, and the students would still perform just fine on the state tests and probably outperform the other schools anyway. In our state there are only a few things that would have to be supplemented during the year, and they aren't significant. I really would love to see SCASD become one of the premier places in the country for math education. You are in the shadow of Penn State, an engineering powerhouse. Your opportunity is either to set the bar high, or to risk everything on a totally unproven but philosophically attractive program that has consistently enraged parents and hurt the poorer students who do not have the resources to supplement the program at home.

If you wish to contact me about anything, I am happy to discuss this with you, except not tonight, as I will be attending the new Star Trek movie. When I was younger and had lesser demons than Investigations math to battle, it never would have taken me this long after opening day to see a new Trek film. 🙂

Biased by information,

Oak Norton

Project Follow-Through Results

4 thoughts on "Letter to State College, PA Board President on Investigations Math"

  1. This is a magnum opus. It is rather amazing. Great research, good explanation, truth and logic all combined. Thanks.

  2. Excellent work. You have gained a new reader. I hope you keep up the good work and I await more of these absorbing posts.

  3. Hi, We are struggling against Investigations in our district as well. It is amazing how similar our administration's and Board's comments are to the comments made by your Admins and Board (the unsubstantiated claims, the vilifying of other math as "algorithmic, not conceptual," their ignoring of evidence, and the completely incomprehensible infatuation they demonstrate for MI).

    Thanks for collecting and giving synopses of the many studies.

    How are things progressing in your district?

    Ranjeet Tate

  4. Our district publicly dropped TERC but has quietly brought it back. We are working hard to get them out of office so we can finally put this monster to rest.
