Education Shock: Actually Teaching Students Math is Effective

This 2014 story, published in the Atlanta Journal-Constitution, reports the shocking news that teaching first-grade students math using the dreaded worksheets and other traditional modes of education was more effective than “group work, peer tutoring or hands-on activities that use manipulatives, calculators, movement and music.”

According to Maureen Downey in her article, “This is an important issue as I increasingly see schools – including those my children attend – tout group learning activities. In many classrooms now, you will see students working at tables together on math. A friend who teaches in a Title 1 school lamented that her students didn’t do as well in the math CRCT as the classroom next door where the teacher used worksheets all the time. My friend’s classroom was a beehive of fun activities around math, but the worksheet class continually outperformed hers. These new findings help us understand why that might have been.”

What I find interesting is the conclusion that direct, traditional instruction benefits students with math difficulties, implying that students without math difficulties do just fine with student-centered approaches. The possibility that difficulties with math may themselves be a result of the student-centered approaches is not discussed, even though the study by Paul Morgan and George Farkas of Pennsylvania State University indicates that “a higher percentage of [students with mathematics difficulties, or MD] in the first-grade classrooms were associated with greater use by teachers of manipulatives/calculators and movement/music to teach mathematics. Yet follow-up analysis for each of the MD and non-MD groups indicated that only teacher-directed instruction was significantly associated with the achievement of students with MD (covariate-adjusted effect sizes [ESs] = .05–.07).”

In an article I wrote called “Being Outwitted by Stupidity,” I suggested that the increase in students being diagnosed with learning difficulties in math raises the question of whether the shift in instructional emphasis over the past several decades has increased the number of low-achieving children. I also question whether the learning difficulties came about because of poor or ineffective instruction, and whether such students would have swum with the rest of the pack in previous eras when traditional math teaching prevailed. I stated, “I believe that what is offered as treatment for learning disabilities in mathematics is what we could have done—and need to be doing—in the first place.”

This article garnered about 80 comments, many of them hostile, including my all-time favorite, which called me a “conservative simpleton fraud.”

I continue to maintain that many of the difficulties we see students having in math may be attributed to insufficient and ineffective instruction. To put it as simply as I can, they may not be learning math because they aren’t being taught math. But the Morgan/Farkas study is being interpreted in the usual manner: “Teacher-directed instruction is also linked to gains in children without a history of math trouble. But unlike their math-challenged counterparts, they can benefit from some types of student-centered instruction as well – such as working on problems with several solutions, peer tutoring, and activities involving real-life math.”

Not mentioned is whether and to what extent such students receive additional help at home from parents, tutors, or learning centers. We might have to wait a while for that kind of study to surface.

23 thoughts on “Education Shock: Actually Teaching Students Math is Effective”

  1. Certainly they are not learning math because they are not being taught math. And students go on to internalize and personalize their difficulties with math. It is probable that math anxiety is actually being increased by progressive methods.

    It is important to tell students at a certain age that the problems they are having are not a result of individual inability or weakness, but are systemic. This helps stop students from blaming themselves; obviously this needs to be done tactfully, but students need to know they are not to blame and that they can learn.

    • I would agree with this, Teresa. I have seen this scenario in my own children and have heard from countless others who see children blaming themselves or calling themselves stupid because they don’t understand basic arithmetic. As we have all seen, these convoluted strategies are having a devastating impact on our children. And much of what is now being touted as “math anxiety” really would be eliminated if straightforward methods were used in the first place…something that is completely lost on today’s educrats and ed gurus.

  2. My son loved worksheets and flash cards. He loved having skills. When I mentioned worksheets to his Kindergarten teacher, I thought she was going to call the police.

    Hands-on group projects in class can only work (even though it’s not the best use of time) for those students who get the skills at home. Duh, that’s how it works in music. The best players are created through private lessons and individual skill/musicality “homework.” In school, they have group band or orchestra where the engaging fun is had. However, the only musicians to get into All-State are the ones who had private lessons. When my son played in our State’s Honors Recital (after audition selection by musicians), they finally allowed the listing of the private lessons teacher in the program, not the high school music teacher. This is how El Sistema works. Kids from the barrios get private lessons at their own level starting from an early age. They can and do then compete with kids from affluent families for the top regional and national orchestras. It’s all about content skills and knowledge from an early age. All schools have to do is allow for some opt-in direct math teaching classes and they will quickly see amazing results. My old high school was the best in music in Connecticut because we had this sort of process that started in the lower grades.

    Long ago, I came to the conclusion that this was all about their academic turf. What they believe is all about them, not the subject or skills. That’s why it changes in high school. Students have teachers with subject area turf.

  3. One more comment. All of these educators claim to believe in a balance between understanding and skills. However, they hope to achieve those skills indirectly, from a hands-on group process, just as they expect a balance of facts to happen via thematic learning and vocabulary to be built indirectly via reading. They are forced to do something more direct in math because of the pressure of CCSS, but that is a low hurdle and they still try to do it indirectly via understanding.

    They really hate direct instruction and skills tests that give specific, actionable feedback. Our K-8 school got state test feedback indicating a lower problem-solving score, so they decided to spend more time on problem solving. I was at that parent/teacher meeting. What they need are things like fraction tests that get sent home right away, not vague results that get sent home at the end of the year.

    • I’m not sure how this adds to the discussion. You’re suggesting there are flaws in the study, then arguing that DI is effective…something that has already been noted here. Can you please clarify what you’re trying to say?

      • When you comment on pieces like this, my sense is that everyone is trying to figure out whose side you’re on. I’m not pro- or anti-DI or whatever. I’m a teacher and a writer and I’ve got my own take on things.

        I think my take on the piece is NOT that it’s fundamentally flawed. It’s that the researchers made some debatable choices. What they found was that practice helps improve performance, and activities that aren’t practice (duh) didn’t help performance. It seems to me that they packaged this in a way that emphasizes the Math Wars angle. Maybe that was justified, maybe it wasn’t. Their data didn’t demand the teacher-directed v. student-directed frame. That was a choice they made.

        The most interesting thing to me, when I dug into the paper, was that there was covariation between student performance and whether kids worked on the “lower” skills like counting or “higher” skills like measurement or double-digit work. That strikes me as a really useful insight about teaching weak students — their teachers often never get past low-level skills. Could it be that struggling students benefit more from a variety of practice in different skills? (Or, perhaps, effective teachers get past the basics into the other skills? They didn’t discuss this in the paper.)

        By the way, given the effect sizes from effective instruction in this study it strikes me as sort of silly to suggest that ALL of their math difficulties are the result of ineffective teaching. Plus, we have lots of research about the gaps between kids by the time they enter schools for the first time. Genetics, home environment, etc.

      • Again, though, it seems your point is about semantics rather than about what constitutes effective instruction. Everything that has been itemized and discussed here can be categorized under the same umbrella: explicit instruction, direct instruction, teacher-led instruction. These are all the same thing, even though some like to nitpick the labelling. Are the kids receiving daily practice? Is the teacher a main part of the instruction, or are kids being encouraged to discover multiple strategies to explain their answers? The differences occur when we start emphasizing student-led (more inquiry-based) instruction vs. teacher-led/explicit instruction. And when we examine these 2 different ways of teaching, there are huge variances in student performance.

        Which, again, makes me curious about your initial post. What is it you’re asking specifically? That we read your blog to determine if there were flaws in the study? Or was there something else you’d like to suggest?

    • This is a response to YOUR reply to me – I am not technically savvy here. I am not really interested in which side of this debate people are positioned on, or where people might be coming from. What I am interested in is knowing what effective methods for kids might look like, i.e. the evidence behind effective practices. We all know there are many exceptions to the rule; however, the general rule of thumb here indicates that DI works best for kids, especially those kids who are struggling. I am sure you would agree with that. Many studies have their flaws, and debating the semantics of these studies is rather a side issue. These researchers are echoing numerous studies which have examined this phenomenon ad nauseam, and they all come to the same conclusion: DI works best. I am sure you are familiar with Project Follow Through, and the multiple studies that Greg Ashman https://gregashman.wordpress.com/2017/04/08/the-best-way-to-teach/ and other edu researchers have highlighted on this topic http://lexiconic.net/pedagogy/

      What Barry is indicating here is that we have even more studies coming to the same conclusion about DI. And yet many of the educrats in charge of mandating math education in North America refuse to acknowledge the evidence. This bodes poorly not only for students, but also for their parents, who then deal with the fallout by enrolling them in Kumon, and for teachers, who are being shortchanged on effective methods in the classroom.

      • I’m not saying that anyone here is right or wrong. All I’m trying to do is read the research report in a careful, thoughtful way.

        As far as I can tell, the study that Barry’s talking about here did NOT seek or find evidence for direct instruction. It found that working on worksheets, doing math from textbooks, routine practice, and kids doing math on a chalkboard helped struggling 1st Graders. The researchers called these “teacher-directed” activities, but they might as well have called them practice or Group A or Charlie. They can call them whatever they want to.

        These researchers found that struggling students whose teachers reported often doing these things tended to have less success: mixed-group math work, working on problems with several solutions, real-life math, math with a partner, peer tutoring, and explaining how a math problem is solved.

        They also found that teaching measurement and fractions correlated with helping struggling kids, though teachers with struggling students tended to double down on counting skills.

        Direct instruction has a lot to do with explanations. The stuff that worked in this study has a lot to do with practice. That seems like a distinction worth preserving when talking about this study, even if you’re an advocate for DI or whatever.

  4. “… But the thing that the headlines get wrong is that this sort of teaching is anything simple. It’s hard to find the right sort of practice for students. It’s also hard to find classroom structures that give strong and struggling students valuable practice to work on at the same time. It’s hard to vary practice formats, hard to keep it interesting. Hard to make sure kids are making progress during practice. All of this is craft.”

    It’s all about my turf.

    Teachers can’t “craft” their way out of the fundamental flaws of low-expectation CCSS math and full inclusion, where the distance between strong and struggling students is allowed to get wider each year. If you separate kids by levels in each classroom, then schools are just hiding the tracking while providing little direct instruction and push. Everyday Math “trusts the spiral,” but the best students get the hidden tracking push at home, where we STEM parents know not to trust any process or “craft.” We do whatever works. We push. Our kids fill the upper levels in the classroom, and whatever “craft” is used in class cannot bridge those gaps. “Craft” cannot overcome a low curriculum slope, low expectations, and no pushing, especially when skills are considered to be rote.

    There is no push in full inclusion except at a low CCSS level whose only expectation is no remediation in college algebra. CCSS only cares about a low statistical “proficiency” mean, not a process that pushes students to achieve their own individual best. This CCSS low slope starts in Kindergarten, and by the 7th-grade math tracking split, it’s all over for many capable students. I got to calculus in high school with absolutely no help from my parents. That could not have happened with my “math brain” son. “Craft” cannot overcome that fundamental systemic flaw. Just ask us STEM parents. We create all of the high-slope, STEM-prepared students with our practice work at home and with tutors. We even get notes sent home telling us to practice “math facts.” That just increases the academic gap. Is that “craft”?

    Schools cannot accept full inclusion and then expect that some sort of natural process or “craft” (even with an emphasis on skills) will allow students to achieve their potential. In math, it’s all over by 7th grade. CCSS has officially defined K-6 as a NO-STEM zone, and there is no “craft” that can transition students from the low-slope K-6 CCSS math curriculum into the high-slope AP Calculus track in high school. We STEM parents know that. We provide the required higher slope at home in K-6. The current educational solution to low expectations in K-6 is to take summer classes or double up in math in high school. Educators can’t “craft” their way out of these fundamental flaws.

    • I have no idea how any of this responds to my post, except to say “I’m not interested in what you’re interested in, I’m interested in what I’m interested in.”

      If I had written a post titled “Why Craft Can Overcome All Other Factors,” your comment would make more sense.

      • “All of this is craft.”

        “strong and struggling students”

        Yes, go ahead and ignore fundamental systemic flaws. You’re not interested in them.

      • Steve H: I wasn’t sure what progressive dog whistle you were picking up in my piece, but I’ve got it now — thanks!

        To clarify: I think that direct instruction, worksheets, flashcards, practice, teacher-directed stuff is important. I use all of these methods all the time in my teaching of multiplication to 3rd and 4th Graders.

        Sure, yes, there are places where you and I would disagree. I do think that there are tricky problems of teaching that I experience in my work — you seem to think it’s all a settled matter. All of THIS (meaning, trying to find ways to give kids important, teacher-directed practice that helps each kid) is craft. And, yeah, there are strong and struggling students in my classes.

        Sorry you’ve had shitty experiences with math teaching.

      • You can’t talk about craft until systemic K-6 problems are fixed, such as the low-slope CCSS math to nowhere. Craft is not about how to do the best you can under the circumstances.

  5. “And when we examine these 2 different ways of teaching, there are huge variances in student performance.”

    This is a good example of a statement that is not supported by the research piece. The effect sizes were significant, but modest.

    I really don’t have a larger point. I’m really just talking about this research piece.

    If I do have a larger point, it’s that we shouldn’t always need to have a larger point! We need to be able to read individual pieces of research without fitting them into larger arguments, at least not right away. We need to approach each piece of evidence and understand what it’s saying on its own terms.

    • My statement refers to the overall conclusions that we’ve seen with respect to DI/explicit/teacher-led vs. inquiry-based learning. I am sure your “opinion” on what the researchers concluded may differ. But I’m really not interested in opinions, rather in facts. And when I also read PFT and dozens of other studies, which all lead to the same conclusions as this particular study…what’s the issue then?

      Again I see nitpicking rather than meaningful or helpful points being made here. If you are suggesting we need to be better trained at dissecting research papers, I would also suggest that one shouldn’t ignore or neglect the evidence of poor math instruction that teachers such as Barry, Teresa, and Steve witness on a daily basis, or that parents like me and others battle regularly. The PISA data comparing DI vs. inquiry is also rather stunning in its findings – I am sure you are familiar with its conclusions. Shall I go on? The edubabble and constant overcorrecting without evidence to support the changes is mind-boggling. This cannot be debated, nor can it be denied. And, quite frankly, I simply don’t have the time to debate the “what ifs” or the “this is what it really means” scenarios if they are all simply based on opinion. You may not like this study. Noted. But that doesn’t change the overall conclusion, nor its implications for how best to teach our kids and what changes need to be made to ensure kids receive decent arithmetic instruction.

  6. When I first moved to Texas, a state with standardized testing, I made a decision to simply teach Algebra 1 and 2 and not look at the test. Before arriving, I was told that 33% of the students had passed the state math test the previous year. My students thought they were suffering, but all survived, and 85% of them passed the math test. At the end of the school year I was told, “Thank you for raising our math scores. We don’t need you anymore.” Ever since, the passing rate has been 15 to 30%.

  7. I suspect that Michael’s contribution to this thread represents a strategy of trying to sow doubt about broad conclusions by questioning details, definitions, semantics and so on. I suspect this is a deliberate strategy because he recently wrote a piece with Ben Riley arguing for researchers to adopt psychological principles in an attempt to manipulate people’s beliefs, rather than presenting them with the facts or the truth as researchers perceive it:

    https://deansforimpact.org/why-mythbusting-fails-a-guide-to-influencing-education-with-science/

    I believe that the following post is a good example of the strategy:

    https://www.google.com.au/amp/s/problemproblems.wordpress.com/2016/09/11/how-did-69-turn-into-29/amp/

    I therefore question whether Michael is arguing in good faith and whether it is worth investing the time in responding to him.
