How is Understanding Measured?

I have written about understanding in math, and the education establishment’s view of it. With all this talk about how it is important for students to “know math” and not just “do math,” the question arises: “How do we measure a student’s understanding, as opposed to their ability to go through procedures?” That is, how do we distinguish someone who truly understands from a “math zombie”?

In my opinion, the most reliable tests for understanding are proxies built on procedural fluency and factual mastery that also require some degree of mathematical reasoning. Here’s an example.

On a placement test for entering freshmen at California State University, a single multiple-choice item correlated extremely well with passing the exam and with subsequent success in non-remedial college math. The problem asked students to simplify an algebraic expression involving fractions.

Without verbalizing anything or explaining one’s answer, simply recalling the arithmetic properties of fractions, along with enough fluency in factoring to complete the task correctly, was a reasonable predictor of mathematics success at any CSU campus. For those who are curious, the answer is (y+x)/(y-x).
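The expression itself appeared as an image in the original post and has not survived here; one expression consistent with the stated answer, offered purely as a hypothetical reconstruction, is:

```latex
\frac{\frac{1}{x} + \frac{1}{y}}{\frac{1}{x} - \frac{1}{y}}
= \frac{\frac{y+x}{xy}}{\frac{y-x}{xy}}
= \frac{y+x}{y-x}
```

Note that no verbal explanation is required: combining the fractions over a common denominator and cancelling is itself the evidence of understanding.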

Yet, the education establishment often proceeds from the belief that “Getting answers does not support conceptual understanding.” In the teaching of math in K-12, we are seeing more interest in the process by which students obtain the answers to “authentic” problems. If students really understood, the thinking goes, then they could apply prior knowledge to problem types they have never seen before.

But we have to be aware of the level of understanding. Novices don’t think like that. Novices learn how to solve problems from worked examples. Subsequent problems are varied slightly beyond the initial worked example, forcing students to make connections to prior knowledge. This process is called “scaffolding”.

For example, if we ask for the perimeter of a rectangle with sides that are 5 and 7 inches, the student applies the formula he has (yes) memorized: 2W + 2L = perimeter, where W and L represent width and length, and comes up with 24 inches. Subsequent problems are variations on this theme: A rectangle has a side that is 7 inches and a perimeter of 24 inches. What is the length of the other side? … and so forth.
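The variation simply runs the same memorized formula in reverse, a small step beyond the worked example:

```latex
2W + 2L = 24, \quad L = 7
\;\Rightarrow\; 2W = 24 - 2(7) = 10
\;\Rightarrow\; W = 5 \text{ inches}
```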

But such scaffolding is sometimes held in disdain, viewed as rote memorization of procedure. To counter this, we have students working on problems that can’t be readily solved by formulas or previously learned procedures. These are called “rich problems”. The best I can do at a formal definition of “rich problems” comes from someone who disliked “algorithmic” problems: “It’s a problem that has multiple entry points and has various levels of cognitive demands. Every student can be successful on at least part of it.”

My definition might be a bit clearer: “One-off, open-ended, ill-posed problems that supposedly lead students to apply/transfer prior knowledge to new or novel problems that don’t generalize.” (See figure)

For example: “What are the dimensions of a rectangle with a perimeter of 24 units?” That a student may know how to find the perimeter of a rectangle but cannot provide answers to this question (which has infinitely many) is taken as evidence of a lack of “deep understanding”. In this view, the practice, repetition and imitation of known procedures, as illustrated in the original example about the perimeter of a rectangle and variations thereof, relies on “imitation of thinking”.

But imitation is key as one goes up the scale from novice to expert. As anyone knows who has learned a skill through initial imitation of specific techniques, such as drawing, bowling, or learning a dance step—watching something and doing it are two different things. What looks easy often is more complicated than it appears. So too with math. The final accomplishment often does not resemble how one gets there. Like playing a game of football or running a marathon, the building blocks of final academic or creative performance are small, painstaking and deliberate.

As the cognitive scientist Dan Willingham has said, only experts see beyond the surface level of a problem to its deeper structure. “But if students fall short of this, it certainly doesn’t mean that they have acquired mere rote knowledge and are little better than parrots.” Rather, they are making the small steps necessary to develop better mathematical thinking. Simply put, no one leaps directly from novice to expert.


8 thoughts on “How is Understanding Measured?”

  1. Nice post. I have seen some very clumsy attempts to explain understanding as knowing the connections between facts as if these were not also facts. I can’t imagine any fact that can be stated without connecting two or more things.

    Most of the examples, like the one you suggest, where a vague question is posed amount to an implicit claim: I understand perimeters, I know the answer to my new question, so if you can’t answer it you don’t understand perimeters. In this case, on page 13, the new question is whether the student knows that 0 and 1 have different meanings in two parts of the same sentence – in one case they are labels for various identities and in the other they are labels for elements of a set. In both these examples, if the notation were explained, or better, improved, it would be much easier to understand and answer the questions.

    I think it is true that if you can figure out what someone means by their ill-posed question, you probably do understand the topic. But what an unpleasant approach that is. The saddest part is that in math there is always a well-posed problem that would do the job.


    • I absolutely agree. The ill-posed question is a very ugly and inaccurate way to measure understanding. It is the antithesis of a mathematical approach. My point was just that, yes, it can serve as a crude test of understanding, but only someone with little interest in math and an interest in catching people out with mind games would deploy it.
      I am a huge fan of because in part they do an excellent job of offering a progression of challenging questions without playing games. In fact they go out of their way to point out common styles of math quiz question to avoid people missing the point because they don’t have the insight into wording styles. As a resource they are the counter example to any claim of a need for ill posed questions – that is you can always find a well posed challenging question on any math topic.


  2. Thanks. While it is true that if you can figure out the answer to an ill-posed problem that you probably do possess understanding, it also may be true that if you cannot, you still have some understanding. It may be at a lower level. Remember that students progress along a spectrum from novice to expert. To expect more expertise than a novice may possess is what I see as the problem here.


  3. “One-off, open-ended, ill-posed problems that supposedly lead students to apply/transfer prior knowledge to new or novel problems that don’t generalize.”

    Little to no transference. Exactly.

    No single in-class group problem (open-ended or otherwise) offers enough practice to achieve what a proper set of individual homework problem variations provides – unit after properly textbook-scaffolded unit. There are no magic concepts or understanding ideas that will accomplish that. Their magic-fairy-dust ideas are never presented as a complete longitudinal curriculum for K-8. They offer no examples of success, so we are constantly responding to shadows or mere ideas.

    I remember trying to help my son prepare for the AMC/12 test questions related to logarithms. He “understood” what logarithms are all about in concept and in many details. However, what he really needed was to study each and every logarithm problem from previous tests. There is no such thing as a “one-off, open-ended” problem that will achieve proper transference to all other problems. There are too many subtle variations that don’t generalize. Full understanding is hidden in all of those homework problem variations.

    One or a few special problems NEVER generalize for transference. Practice, practice, practice is not about speed for the same class of problems. It’s about problem variations and new understandings. This should be intuitively obvious to the most casual observer.


  4. Is there a strong K-5 or even K-8 curriculum you have found that meets the criteria you have outlined above? So many of the new math curriculums fail at every turn.


  5. I used Singapore Math’s “Primary Math” series (U.S. edition) with my daughter but some of it may be hard to execute if you’re not familiar with bar modeling as a means to solve problems. In general though, Grades 1-5 were helpful; grade 6 gets a bit confusing with how they explain percent problems via proportions, because of the notation they use. Other aspects of the 6th grade series are very good.

    Sadlier Oxford’s series of math books appears to be good, though they have succumbed to “aligning” with Common Core, so there are some aspects you will just have to use judgment on, and go to the standard algorithms rather than spending inordinate amounts of time on alternate “strategies”. They are better than most, though, and are used a lot in Catholic schools.

    Some parents are sending their children to learning centers like Sylvan, Kumon, Huntington and MathNasium. They use more traditional approaches, as well as teaching how to navigate the Common Core approaches one is seeing in schools. This can be expensive, unfortunately, but this option is proving to be popular.


    • is worth looking at. I used it with my son. They go out of their way to practice well-posed challenging problems. While they target those who don’t struggle with math, given the right entry point I think anyone would find it worthwhile.
      (I have no financial interest in them – just wished I’d seen it sooner so passing it on.)


  6. I’d also recommend AoPS as a series. Even if your student doesn’t use the textbooks, AoPS has excellent free online problems through Alcumus, and videos for material through Algebra 1. The problems and explanations are far superior to Khan Academy, IMO. My oldest is thriving with it. I will admit, though, that my 12-year-old son needs more practice for some of the basics, more p-sets as SteveH would say, so we’ve supplemented the pre-algebra and algebra series with an old Dolciani text and have used some of AoPS’s new Beast Academy series. BA is great for challenging students, but make sure they have the basics first. For instance, my 7-year-old son was asked to solve 46 x 67, not with the standard algorithm, which he can do, but rather by using the distributive property and breaking the numbers into components. He was not developmentally ready for that level. (Which is not a fault of the BA series, simply that my son really likes math and moves through material quickly.)

    Singapore’s Primary Math series is excellent for K-5, but I’d agree with Mr. Garelick that the 6th level wasn’t as useful, and we didn’t even look into their 7th-8th grade series. My 2 oldest used that series until jumping over to AoPS. Some of our friends love the Saxon series, but I didn’t like how it jumped around topics. Another favorite is RightStart math for K-4 material. I would not recommend Teaching Textbooks. It is too basic. Any new series touted by a school district would be tough to recommend because they have to meet so many “new math” requirements. If all else fails, find some very old textbooks from Dolciani or similar.

