Plant Possibilities

My last series of posts centered on standards-based assessment, which for several years has been the focus of much of my work with teachers. I’ve been somewhat dragged into this work. I think–a lot–about my experiences in my own classroom and the changes that I would make if I were to return. Assessment would come way down that list. It’s not that I had assessment figured out–I didn’t–but that shifts in other practices would take precedence. Throughout my SBA series, I addressed the same content learning standard: systems of linear equations. This got me thinking about one particular pedagogical do-over that I’d like to have.

A “concept-based” (and “competency-driven”) curriculum would highlight that solving a system requires finding an ordered pair (or set of ordered pairs) that satisfies both equations. In my classroom, I talked about this idea: “There are infinitely many pairs of numbers that make the first equation true. Also, there are infinitely many pairs that make the second equation true. But, there is only one pair that makes both equations true!” My students listened to this essential understanding; they didn’t experience it. Students as spectators.
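For readers who like to poke at this numerically, here’s a small sketch of the big idea using a made-up system (x + y = 10 and 2x + y = 16, my example, not one from my classroom): sample integer solutions to each equation separately, then intersect.

```python
# Made-up system for illustration: x + y = 10 and 2x + y = 16.
# Each equation alone has infinitely many solutions; sample integer
# x-values and collect the (x, y) pairs that satisfy each equation.
xs = range(-100, 101)
eq1 = {(x, 10 - x) for x in xs}       # pairs satisfying x + y = 10
eq2 = {(x, 16 - 2 * x) for x in xs}   # pairs satisfying 2x + y = 16
common = eq1 & eq2                    # pairs satisfying BOTH equations
print(len(eq1), len(eq2), common)     # 201 201 {(6, 4)}
```

Two hundred and one sampled pairs on each line, and exactly one pair survives the intersection–the experience I want students to have, not just hear about.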

A few years ago, the PAC at my kids’ school organized a plant sale fundraiser in time for Mother’s Day. Hanging baskets at one price, patio vegetables at another.

Simply translating this scenario into a word problem (like this one) doesn’t lead to students experiencing the big idea. Our actions, not posters, must communicate that “Learning takes patience and time.” A more patient problem-solving experience…

Give students time to play with possibilities: “What is a large/small amount that the PAC could make? What amounts could they not make?” etc.

Slowly reveal information. Remove one sticky note. How many of each did they sell?

Students will come to see, either within small–and visibly random–groups or through whole-class crowdsourcing, that there are many possibilities.

Students will observe that the eight (whole number) possibilities have become one. That is, they’ll experience the big idea about systems for themselves.

Remove a third sticky note. Ask “How confident are you?” This creates a need for students to check their work beyond catching incorrect calculations.

Remove the last sticky note.

In my Systems of Linear Equations videos from Surrey School’s video series for parents, I chose to apply this approach to a non-contextual naked-number problem.

Because of this choice, I might decide to use the plant sale scenario as an assessment item.

Tell students that the PAC has made the claim above. Ask “Do you agree? Why or why not? Convince the PAC.”

Admittedly, I haven’t added much to Dan Meyer’s systems of equations makeover from more than five years ago. (Man, I miss math ed blogs!) These ideas about teaching still interest me much more than “Is this Developing or Proficient?” For more patient problem solving see:

[SBA] Designing Assessment Items

In this series:

1. Writing Learning Standards
2. Constructing Proficiency Scales
3. Designing Assessment Items

Designing Assessment Items

There is a sentiment in BC that using tests and quizzes is an outdated assessment practice. However, these are straightforward tools for finding out what students know and can do. So long as students face learning standards like solve systems of linear equations algebraically, test items like “Solve: $5x + 4y = 13$; $8x + 3y + 3 = 0$” are authentic. Rather than eliminate unit tests, teachers can look at them through different lenses; a points-gathering perspective shifts to a data-gathering one. Evidence of student learning can take multiple forms (i.e., products, observations, conversations). In this post I will focus on products, specifically unit tests, in part to push back against the sentiment above.
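As a quick check of that test item–a sketch using Cramer’s rule as a shortcut, not the substitution or elimination work students would actually show:

```python
from fractions import Fraction

def solve_2x2(a1, b1, c1, a2, b2, c2):
    """Solve a1*x + b1*y = c1 and a2*x + b2*y = c2 (Cramer's rule)."""
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("no unique solution")
    return Fraction(c1 * b2 - c2 * b1, det), Fraction(a1 * c2 - a2 * c1, det)

# 5x + 4y = 13 and 8x + 3y + 3 = 0, i.e., 8x + 3y = -3
x, y = solve_2x2(5, 4, 13, 8, 3, -3)
print(x, y)  # -3 7
```

The item has a single, checkable answer, (−3, 7), which is exactly what makes it a straightforward data-gathering tool.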

In the previous post, I constructed proficiency scales that describe what students know at each level. These instruments direct the next standards-based assessment practice: designing assessment items. Items can (1) target what students know at each proficiency level or (2) allow for responses at all levels.

Target What Students Know at Each Level

Recall that I attached specific questions to my descriptors to help students understand the proficiency scales:

This helps teachers too. Teachers can populate a test with similar questions that reflect the appropriate complexity at each level of a proficiency scale. Keep in mind that these instruments are intended to be descriptive, not prescriptive. Sticking too closely to sample questions can emphasize answer-getting over sense-making. Questions that look different but require the same depth of knowledge are “fair game.” For example:

Prompts like “How do you know?” and “Convince me!” also prioritize conceptual understanding.

Allow For Responses at All Levels

Students can demonstrate what they know through questions that allow for responses at all levels. For example, a single open question such as “How are 23 × 14 and (2x + 3)(x + 4) the same? How are they different?” can elicit evidence of student learning from Emerging to Extending.
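One connection students might surface with that open question: evaluating (2x + 3)(x + 4) at x = 10 reproduces 23 × 14. A tiny sketch (my own illustration, not part of the item):

```python
# 23 × 14 mirrors (2x + 3)(x + 4): the expansion gives 2x² + 11x + 12,
# and substituting x = 10 turns the coefficients into place values
# (with carrying), landing on the same product.
def product_of_binomials(x):
    return (2 * x + 3) * (x + 4)

def expanded(x):
    return 2 * x**2 + 11 * x + 12

assert product_of_binomials(10) == expanded(10) == 23 * 14 == 322
```

A response at the Extending level might articulate exactly this base-ten connection; an Emerging response might simply compute 23 × 14.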

Nat Banting’s Menu Math task from the first post in this series is an example of a non-traditional assessment item that provides both access (i.e., a “low-threshold” of building a different quadratic function to satisfy each constraint) and challenge (i.e., a “high-ceiling” of using as few quadratic functions as possible). A student who knows that “two negative x-intercepts” pairs nicely with “vertex in quadrant II” but not with “never enters quadrant III” demonstrates a sophisticated knowledge of quadratic functions. These items blur the line between assessment and instruction.

Note that both of these items combine content (“operations with fractions” and “quadratic functions”) and competencies (i.e., “connect mathematical concepts to one another” and “analyze and apply mathematical ideas using reason”). Assessing content is my focus in this series. Still, I wanted to point out the potential to assess competencies.

Unit Tests

Teachers can arrange these items in two ways: (1) by proficiency level then learning outcome or (2) by learning outcome then proficiency level. A side-by-side comparison of the two arrangements:

Teachers prefer the second layout–the one that places the learning outcome above the proficiency levels. I do too. Evidence of learning relevant to a specific standard is right there on a single page–no page flipping is required to reach a decision. An open question can come before or after this set. The proficiency-level-above-learning-outcome layout works only if students demonstrate the same proficiency level across different learning outcomes. They don’t. And shouldn’t.

There’s room to include a separate page to assess competency learning standards. Take a moment to think about the following task:

Initially, I designed this task to elicit Extending-level knowledge of solve systems of linear equations algebraically. In order to successfully “go backwards,” a student must recognize what happened: equivalent equations having opposite terms were made. The p-terms could have been built from 5p and 2p. This gives $5p + 3q + 19 = 0$ for ① and $2p - 5q - 11 = 0$ for ②. (I’m second-guessing that this targets only Extending; $10p + 6q + 38 = 0$ for ① and $10p - 25q - 55 = 0$ for ② work too.) This task also elicits evidence of students’ capacities to reason and to communicate–two of the curricular competencies.
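To double-check my arithmetic on this task, here’s a sketch confirming that the two “originals” and the two scaled versions above all share the same solution (Cramer’s rule is my verification shortcut, not the intended student method):

```python
from fractions import Fraction

# All four equations, rewritten as a*p + b*q = c:
eqs = [
    (5, 3, -19),    # (1)  5p + 3q + 19 = 0
    (2, -5, 11),    # (2)  2p - 5q - 11 = 0
    (10, 6, -38),   # (1) x 2
    (10, -25, 55),  # (2) x 5
]
(a1, b1, c1), (a2, b2, c2) = eqs[0], eqs[1]
det = a1 * b2 - a2 * b1
p = Fraction(c1 * b2 - c2 * b1, det)
q = Fraction(a1 * c2 - a2 * c1, det)
print(p, q)  # -2 -3
assert all(a * p + b * q == c for a, b, c in eqs)
```

Every equivalent system a student builds by scaling must pass the same check: the solution (−2, −3) satisfies all four equations.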

Teacher Reflections

Many of the teachers I work with experimented with providing choice. Students self-assessed their level of understanding and decided what evidence to provide. Most of these teachers asked students to demonstrate two proficiency levels (e.g., the most recent level achieved and one higher). Blank responses no longer stood for lost points.

Teachers analyzed their past unit tests. They discovered that progressions from Emerging to Proficient (and sometimes Extending) were already in place. Standards-based assessment just made them visible to students. Some shortened their summative assessments (e.g., Why ask a dozen Developing-level solve-by-elimination questions when two will do?).

The shift to grading based on data, not points, empowered teachers to consider multiple forms (i.e., conversations, observations, products) and sources (e.g., individual interviews, collaborative problem solving, performance tasks) of evidence.

In my next post, I’ll describe the last practice: Determining Grades (and Percentages). Again, a sneak peek:

Update

Here’s a sample unit test populated with questions similar to those from a sample proficiency scale:

Note that Question 18 addresses two content learning standards: (1) solve systems of linear equations graphically and (2) solve systems of linear equations algebraically. Further, this question addresses competency learning standards such as Reasoning (“analyze and apply mathematical ideas using reason”) and Communicating (“explain and justify mathematical ideas and decisions”). The learning standard cells are intentionally left blank; teachers have the flexibility to fill them in for themselves.

Note that Question 19 also addresses competencies. The unfamiliar context can make it a problematic problem that calls for (Problem) Solving. “Which window has been given an incorrect price?” is a novel prompt that requires Reasoning.

These two questions also set up the possibility of a unit test containing a collaborative portion.

[E]valuation is a double edged sword. When we evaluate our students, they evaluate us–for what we choose to evaluate tells our students what we value. So, if we value perseverance, we need to find a way to evaluate it. If we value collaboration, we need to find a way to evaluate it. No amount of talking about how important and valuable these competencies are is going to convince students about our conviction around them if we choose only to evaluate their abilities to individually answer closed skill math questions. We need to put our evaluation where our mouth is. We need to start evaluating what we value.

Liljedahl, P. (2021). Building thinking classrooms in mathematics, grades K-12: 14 teaching practices for enhancing learning. Corwin.

[SBA] Constructing Proficiency Scales

In this series:

1. Writing Learning Standards
2. Constructing Proficiency Scales
3. Designing Assessment Items

Constructing Proficiency Scales

BC’s reporting order requires teachers of Grades K-9 to use proficiency scales with four levels: Emerging, Developing, Proficient, and Extending. Teachers of Grades 10-12 may use proficiency scales but must provide letter grades and percentages. Proficiency scales help communicate to students where they are and where they are going in their learning. But many don’t. When constructing these instruments, I keep three qualities in mind…

Descriptive

BC’s Ministry of Education defines Emerging, Developing, Proficient, and Extending as demonstrating initial, partial, complete, and sophisticated knowledge, respectively. Great. A set of synonyms. It is proficiency scales that describe these depths with respect to specific learning standards; they answer “No, really, what does Emerging, or initial, knowledge of operations with fractions look like?” Populating each category with examples of questions can help students–and teachers–make sense of the descriptors.

Positive

Most scales or rubrics are single-point scales posing as four-point ones. Their authors describe Proficient; that’s it. The text for Proficient is copied and pasted into the Emerging and Developing (or Novice and Apprentice) columns. Then, words such as support, some, and seldom are added. Errors, minor (Developing) and major (Emerging), too. These phrases convey to students how they come up short of Proficient; they do not tell students what they know and can do at the Emerging and Developing levels.

BC’s Ministry of Education uses this phrase to describe profiles of core competencies: “[Profiles] are progressive and additive, and they emphasize the concept of expanding and growing. As students move through the profiles, they maintain and enhance competencies from previous profiles while developing new skills.”

I have borrowed this idea and applied it to content learning standards. It was foreshadowed by the graphic organizer at the end of my previous post: Extending contains Proficient, Proficient contains Developing, and Developing contains Emerging. (Peter Liljedahl calls this backward compatible.) For example, if a student can determine whole number percents of a number (Proficient), then it is assumed that they can also determine benchmark percents (i.e., 50%, 10%) of a number (Emerging). A move from Emerging to Proficient reflects new, more complex, knowledge, not greater independence or fewer mistakes. Students level up against a learning standard.

Emerging and Extending

The meanings of two levels–Emerging to the left and Extending to the right–are open to debate. Emerging is ambiguous, Extending less so. Some interpretations of Extending require rethinking.

Emerging

“Is Emerging a pass?” Some see Emerging as a minimal pass; others interpret “initial understanding” as not yet passing. The MoE equivocates: “Every student needs to find a place on the scale. As such, the Emerging indicator includes both students at the lower end of grade level expectations, as well as those before grade level expectations. […] Students who are not yet passing a given course or learning area can be placed in the Emerging category.” Before teachers can construct proficiency scales that describe Emerging performance, they must land on a meaning of Emerging for themselves. This decision impacts, in turn, the third practice of a standards-based approach, designing assessment items.

Extending

A flawed framing of Extending persists: above and beyond. Above and beyond can refer to a teacher’s expectations. The result: I-know-it-when-I-see-it rubrics. “Wow me!” isn’t descriptive.

Above and beyond can also refer to a student’s grade level. Take a closer look at the MoE’s definition of Extending: “The student demonstrates a sophisticated understanding of the concepts and competencies relevant to the expected learning [emphasis added].” It is Math 6 standards, not Math 8 standards, that set forth the expected learning in Math 6. When reaching a decision about proficiency in relation to a Math 6 outcome, it is unreasonable–and nonsensical–to expect knowledge of Math 8 content.

Characterizing Extending as I can teach others is also problematic. Explaining does not ensure depth; it doesn’t raise a complete understanding of a concept to a sophisticated understanding. Further, I can teach others is not limited to one level. A student may teach others at a basic complexity level. For example, a student demonstrates an initial understanding of add and subtract fractions when they explain how to add proper fractions with the same denominator.

Example: Systems of Linear Equations

In my previous post, I delineated systems of linear equations as solve graphically, solve algebraically, and model and solve contextual problems. Below, I will construct a proficiency scale for each subtopic.

Note that I’ve attached specific questions to my descriptors. My text makes sense to me; it needs to make sense to students. Linear, systems, model, slope-intercept form, general form, substitution, elimination–all of these terms are clear to teachers but may be hazy to the intended audience. (Both logarithmic and sinusoidal appear alongside panendermic and ambifacient in the description of the turbo-encabulator. Substitute nofer trunnions for trigonometric identities in your Math 12 course outline and see if a student calls you on it on Day 1.) The sample questions help students understand the proficiency scales: “Oh yeah, I got this!”

Some of these terms may not make sense to my colleagues. Combination, parts-whole, catch-up, and mixture are my made-up categories of applications of systems. Tees and hoodies are representative of hamburgers and hot dogs or number of wafers and layers of stuf. Adult and child tickets can be swapped out for dimes and quarters or movie sales and rentals. The total cost of a gas vehicle surpassing that of an electric vehicle is similar to the total cost of one gym membership or (dated) cell phone plan overtaking another. Of course, runner, racing car and candle problems fall into the catch-up category, too. Textbooks are chock full o’ mixed nut, alloy, and investment problems. I can’t list every context that students might come across; I can ask “What does this remind you of?”

My descriptors are positive; they describe what students know, not what they don’t know, at each level. They are progressive and additive. Take a moment to look at my solve-by-elimination questions. They are akin to adding and subtracting quarters and quarters, then halves and quarters, then quarters and thirds (or fifths and eighths) in Math 8. Knowing $\frac{8}{3} - \frac{5}{4}$ implies knowing $\frac{7}{4} - \frac{3}{4}$.

Emerging is always the most difficult category for me to describe. My Emerging, like the Ministry’s, includes not yet passing. I would welcome your feedback!

Describing the Extending category can be challenging, too. I’m happy with my solve graphically description and questions. I often lean on create–or create alongside constraints–for this level. I’m leery of verb taxonomies; these pyramids and wheels can oversimplify complexity levels. Go backwards might be better. Open Middle problems populate my Extending columns across all grades and topics.

My solve algebraically… am I assessing content (i.e., systems of linear equations) or competency (i.e., “Explain and justify mathematical ideas and decisions”)? By the way, selecting and defending an approach is behind my choice to not split (👋, Marc!) substitution and elimination. I want to emphasize similarities among methods that derive equivalent systems versus differences between step-by-step procedures. I want to bring in procedural fluency:

Procedural fluency is the ability to apply procedures accurately, efficiently, and flexibly; to transfer procedures to different problems and contexts; to build or modify procedures from other procedures; and to recognize when one strategy or procedure is more appropriate to apply than another.

NCTM

But have I narrowed procedural fluency to one level? What about a system like this one?

$\frac{x}{3} + \frac{y}{2} = 3$
$\frac{x+3}{2} + \frac{y+1}{5} = 4$
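For what it’s worth, clearing the fractions reduces this system to a familiar one; a sketch of that check (the multipliers 6 and 10 are the natural choices here, not prescribed steps):

```python
from fractions import Fraction

# Multiply the first equation by 6 and the second by 10 to clear fractions:
#   x/3 + y/2 = 3          ->  2x + 3y = 18
#   (x+3)/2 + (y+1)/5 = 4  ->  5(x+3) + 2(y+1) = 40  ->  5x + 2y = 23
a1, b1, c1 = 2, 3, 18
a2, b2, c2 = 5, 2, 23
det = a1 * b2 - a2 * b1
x = Fraction(c1 * b2 - c2 * b1, det)
y = Fraction(a1 * c2 - a2 * c1, det)
print(x, y)  # 3 4

# Check against the original fractional form
assert x / 3 + y / 2 == 3
assert (x + 3) / 2 + (y + 1) / 5 == 4
```

The procedures transfer; the question is whether recognizing that they do belongs to one proficiency level or runs through all of them.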

Note that my model and solve contextual problems is described at all levels. Apply does not guarantee depth of knowledge. Separating problem solving–and listing it last–might suggest that problem solving follows building substitution and elimination methods. It doesn’t. They are interwoven. To see my problem-based approach, watch my Systems of Linear Equations videos from Surrey School’s video series for parents.

Next up, designing assessment items… and constructing proficiency scales has done a lot of the heavy lifting!

[SBA] Writing Learning Standards

For several years, standards-based assessment (SBA) has been the focus of much of my work with Surrey teachers. Simply put, SBA connects evidence of student learning with learning standards (e.g., “use ratios and rates to make comparisons between quantities”) rather than events (“Quiz 2.3”). The change from gathering points to gathering data represents a paradigm shift.

In this traditional system, experience has trained students to play the game of school. Schools dangle the carrot (the academic grade) in front of their faces and encourage students to chase it. With these practices, schools have created a culture of compliance. Becoming standards based is about changing to a culture of learning. “Complete this assignment to get these points” changes to “Complete this assignment to improve your learning.” […] Educators have trained learners to focus on the academic grade; they can coach them out of this assumption.

Schimmer et al., 2018, p. 12

In this series, I’ll describe four practices of a standards-based approach:

1. Writing Learning Standards
2. Constructing Proficiency Scales
3. Designing Assessment Items

Writing Learning Standards

In BC, content learning standards describe what students know and curricular competency learning standards describe what students can do. Describe is generous–more like list. In any mathematical experience a student might “bump into” both content and competency learning standards. Consider Nat Banting’s Quadratic Functions Menu Math task:

You could build ten different quadratic functions to satisfy these ten different constraints.

Instead, build a set of as few quadratic functions as possible to satisfy each constraint at least once. Write your functions in the form y = a(x − p)² + q.

Which constraints pair nicely? Which constraints cannot be paired?

Is it possible to satisfy all ten constraints using four, three, or two functions?

Describe how and why you built each function. Be sure to identify which functions satisfy which constraints.

Students activate their knowledge of quadratic functions. In addition, they engage in several curricular competencies: “analyze and apply mathematical ideas using reason” and “explain and justify mathematical ideas and decisions,” among others. Since the two are interwoven, combining competencies and content (i.e., “reason about characteristics of quadratic functions”) is natural when thinking about a task as a learning activity. However, from an assessment standpoint, it might be helpful to separate the two. In this series, I will focus on assessing content.

The content learning standard quadratic functions and equations is too broad to inform learning. Quadratic functions–nevermind functions and equations–is still too big. A student might demonstrate Extending knowledge of quadratic functions in the form y = a(x − p)² + q but Emerging knowledge of completing the square, attain Proficient when graphing parabolas but Developing when writing equations.

Operations with fractions names an entire unit in Mathematics 8. Such standards need to be divided into subtopics, or outcomes. For example, operations with fractions might become:

1. add and subtract fractions
2. multiply and divide fractions
3. evaluate expressions with two or more operations on fractions
4. solve contextual problems involving fractions

Teachers can get carried away breaking down learning standards, differentiating proper from improper fractions, same from different denominators, and so on. These differences point to proficiency levels, not new outcomes. Having too many subtopics risks atomizing curriculum. Further, having as many standards as days in the course is incompatible with gathering data over time. I aim for two to four (content) outcomes per unit.

In Foundations of Mathematics and Pre-calculus 10, systems of linear equations can be delineated as:

1. solve graphically
2. solve algebraically
3. model and solve contextual problems

My solve algebraically includes both substitution and elimination. Some of my colleagues object to this. No worries, separate them.

In my next post, I’ll describe constructing proficiency scales to differentiate complexity levels within these learning standards. Here’s a sneak peek:

What do you notice?

Open Middle Math

In my previous post, I shared some of the principles that guided Marc and me when creating a series of math videos for parents (Mathematics 6 & 7; 8–10): make it visual, make it conceptual, and make it inviting. In this way, we also set out to make these videos representative of math class. It was our hope that they presented parents with a view into their child’s classroom (“window”). Further, we hoped that Surrey teachers saw their classrooms in what was reflected (“mirror”). In that spirit, several videos in this summer’s collection included an open-middle problem.

In Open Middle Math, Robert Kaplinsky describes what makes a math problem an open-middle problem:

[M]ost math problems begin with everyone having the same problem and working toward the same answer. As a result, the beginning and ending are closed. What varies is the middle. Sometimes a problem’s instructions tell students to complete a problem using a specific method (a closed middle). Other times, there are possibly many ways to solve the problem (an open middle). Problems with open middles tend to be much more interesting and lead to richer conversations.

Robert Kaplinsky

This use of open-middle to describe problems has always irked me. There, I said it. To me, open- vs. closed-middle is not a characteristic of a problem itself. Robert argues that a problem’s instructions can close a problem’s middle. Agreed! But I go a step further. There are other ways through which students are told to use a specific method. It’s us. For example, consider a boilerplate best-buy problem. The middle is wide open! Doubling, scaling up, common multiples, unit rates — dollars per can or cans per dollar — and marginal rates are all viable strategies. However, we close the middle when we give this task after demonstrating how to use unit prices to determine best deals (i.e., “now-you-try-one” pedagogy). If students — and teachers! — believe that mathematics is about plucking numbers to place into accepted procedures then they are unlikely to experience the rich “open-middleness” of this task, regardless of its instructions. It’s no accident that the book’s introduction is titled “What Does an Open Middle Classroom Look Like?”
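To make the two unit-rate directions concrete, here’s a sketch with placeholder prices (the deals and dollar amounts are mine, not from any particular task):

```python
# Placeholder best-buy data, for illustration only:
deals = {"6 cans for $4.80": (6, 4.80), "8 cans for $6.00": (8, 6.00)}

for name, (cans, price) in deals.items():
    # Both unit-rate directions are viable strategies:
    print(f"{name}: ${price / cans:.2f} per can, {cans / price:.2f} cans per dollar")

best = min(deals, key=lambda d: deals[d][1] / deals[d][0])
print("Best buy:", best)  # Best buy: 8 cans for $6.00
```

Dollars per can, cans per dollar, doubling, common multiples–the task supports them all, which is precisely the open middle that now-you-try-one pedagogy closes off.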

Most of the problems posted on the Open Middle site involve number — or digit — tiles. But I get why Robert didn’t go with “Number Tile Math.” The boxes in 25 × 32 = ⬚⬚⬚ and 63 − 28 = ⬚⬚ give each a fill-in-the-blanks answer-getting feel. The routine nature misses the problem-solving mark (despite their open middles). So, “open-middle” as an adjective for problems it is. Besides, math class could use more openness, which needn’t come at the end for problems to be interesting and conversations to be rich.

When I look at an Open Middle problem from the site, the mathematical goal of the teacher who created the problem isn’t always clear to me. (The same is true, by the way, of wodb.ca.) What is the deep conceptual understanding that they anticipate their students will develop by working on the problem? What ideas will emerge? What misconceptions might be addressed? Throughout Open Middle Math, Robert describes how Open Middle problems can give us X-ray vision into our students’ mathematical understanding. Similarly, he provides readers with X-ray vision into his thinking during the process of creating these problems. Below, I’ll share a few of the open-middle problems from our video series (plus some that ended up on the cutting room floor) as well as a peek behind the curtain into my thinking.

Polynomials

Towards the end of the Math 10 Factoring Polynomials video, I present two open-middle/number-tile problems. Teachers will recognize these as familiar “find k” problems: For what value(s) of k is x² + kx − 8 factorable? x² − x − k? See the answer animations below.
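For the first problem, the possible values of k can be enumerated by hand or, as a quick sketch, in code (my check, not the video’s approach):

```python
# x² + kx − 8 factors over the integers as (x + m)(x + n)
# with m*n = -8 and m + n = k, so enumerate the factor pairs of -8:
divisors = [m for m in range(-8, 9) if m != 0 and -8 % m == 0]
ks = sorted({m + (-8 // m) for m in divisors})
print(ks)  # [-7, -2, 2, 7]
```

Four values of k, one for each way of splitting −8 into two integer factors–the conceptual understanding the number tiles are meant to surface.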

I think that the number tiles add an element of play to these problems. The tiles are forgiving. Make a mistake? No biggie, just move ’em around. (The decision to show an initial misstep in the first animation above was deliberate.) This upholds our third principle: make it inviting.

These two sample tasks above highlight the role of students’ prior knowledge in solving open-middle problems. My assumption here is that teachers have not “proceduralized” these problems — that students have not been provided with predetermined solution pathways (e.g., “First, list all the factors of the constant term c. Then, …”). Note the open end of the second problem. The intent of my animation is to convey that there are infinitely many solutions. The problem presents students with an opportunity to generalize.

Each of these problems can be classified as Depth of Knowledge Level 2 (Skill/Concept). In both, students need a conceptual understanding of factoring x² + bx + c where b and c are given. The second requires pattern-sniffing (or logic). I created a third problem that asks students to think about these two equations — and a third — simultaneously.

Note that x − 4 could be a factor of each trinomial. However, students need to determine where to put 4 so that the other digits can be placed in the remaining boxes. This twist might not be enough to raise it to DOK 3 (Strategic Thinking). Roughly speaking, Robert’s DOK 2 problems involve making statements true. Sometimes it’s satisfying an equation, sometimes it’s satisfying a condition (e.g., a system of equations having no solution). Robert’s DOK 3 problems call for optimizing a result — least, greatest, closest to.

In my Math 9 Polynomials video, I pose the following open question in the style of Marian Small: Two trinomials add to 3x² + 7x + 6. What could they be? Here’s a problem, adapted from Open Middle Math, that also tackles adding polynomials:

Both tasks can help reveal students’ understanding of combining like terms and manipulating coefficients and exponents. (In Task 1, I specify that the two polynomials be trinomials. This rules out responses such as (3x²) + (7x + 6) that sidestep like terms.) Task 2 is much more likely to show what students know about additive inverses, although a small tweak to Task 1 (e.g., Two trinomials add to 3x² + 7x + 6. What could they be?) bridges this gap.

Integers

I include one open-middle problem in the application section of each of my Math 7 integer videos:

Note that the first is DOK 2 whereas the second is DOK 3. Maybe. I don’t want to quibble. What matters more than the differences between DOK 2 and 3 is that these questions require a deeper understanding than DOK 1 problems such as Evaluate: (−9) + (+3) or Evaluate: (+3) − [(−5) + (−4)] × (+5).

In the first video, I ask “How might finding one solution help you find more? How are some of the solutions related to one another?” These questions aren’t answered in the video — an exercise left to the viewer. Here are just some of the ideas that I would anticipate to emerge in an Open Middle classroom discussion:

• addition and subtraction facts are related
  • e.g., (+6) + (+3) = +9 and (+9) − (+3) = +6 belong to the same “fact family”
• this relationship extends from whole numbers to integers
  • e.g., (−6) + (−3) = −9 and (−9) − (−3) = −6 also form a fact family
• subtracting a number can be thought of as adding its opposite
  • e.g., if (+6) + (+3) = +9 makes the equation true then so, too, does (+6) − (−3) = +9
• swapping the number being subtracted (subtrahend) and the number it is subtracted from (minuend) changes the sign of the result (difference)
  • e.g., (+9) − (+3) is equal to positive 6 whereas (+3) − (+9) is equal to negative 6

Order of operations is a natural fit for optimization problems. In the second video, the intent of my answer animation is to communicate my mathematical reasoning. Once more, note that I show a couple of slight missteps and revisions to my thinking.

In addition to performing the operations in the correct order, students must think about how to maximize sums and minimize products. They must consider how subtracting a negative number increases the result. See one of Marc’s Math 7 decimal videos for another open-middle order of operations example.

Percents

I did not include an Open Middle problem in my Math 7 Percents video. Rather, I chose to present a percents number talk: Estimate 78% of 49. Note that I show two strategies: one that makes use of quarters…

… and another that utilizes tenths.

Because there are many ways to solve this problem, it can be thought of as having an open middle despite it not having number tiles. A third, interesting solution pathway can be taken. Instead of 78% of 49, we can estimate 49% of 78: 50% — or one-half — of 78 is 39. The idea that x% of y is equal to y% of x should emerge from the following:

It’s for this reason that I did not add the constraint Each number can be used only once. You can always add it later, which should bring about doubling and halving — and tripling and “thirding”!
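The commutativity behind this strategy is easy to state: x% of y and y% of x are both xy⁄100. A quick sketch:

```python
def percent_of(x, y):
    """Return x percent of y, computed as x * y / 100."""
    return x * y / 100

# x% of y equals y% of x because multiplication is commutative.
assert percent_of(78, 49) == percent_of(49, 78)

# So estimating 49% of 78 as "half of 78" gives a quick estimate of 39
# for 78% of 49; the exact value is 38.22.
print(percent_of(78, 49))
print(percent_of(50, 78))
```

The estimate (39) lands within one of the exact value (38.22), which is the point of choosing the friendlier order.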

I like that the double number line problem below incorporates estimation; both 20 and 25 per cent are perfectly reasonable estimates. Also, it embraces our first principle — make it visual — which is largely missing from my other open-middle examples.

I wrestle a bit with whether or not to include the “only once” constraint. Does it enhance the problem above? I guess that it necessitates more practice; disqualifying 25% of 64 is 16 does compel students to seek out 25% of 68 is 17 or 25% of 76 is 19. But concentrating on unique tens and units digits of parts and wholes is irrelevant to percents. Again, you can always add this constraint later. (Update: Check out this slick Desmos activity from @TimBrzezinski!)

What might a DOK 3 Open Middle percent problem look like? Below is a possibility — or three! — that uses Robert’s optimization approach. (I haven’t played with the dollar value amounts so treat them as placeholders.)

Open Middle Math is a must-read that will help you implement these engaging tasks. Whether you’re new to Open Middle problems or think you know all about ’em, you’ll love the glimpse into how Robert designs opportunities for students to persevere in problem solving and for teachers to gain insights into what students really understand.

Get yourself 920 calories, playa

How many watermelons do you see?

You might be a math teacher if you answered five. Whereas a normal person sees produce, a math teacher sees fractions (4 × ½ + 4 × ¾) or perfect squares (3² − 2²).

As a student, I never asked Sally’s question. To me, math was like an action movie (minus the action). It required suspension of disbelief. I accepted that.

As a teacher, I want more for my students. I want my students to use mathematics to better understand the world around them (i.e., the real one).

So when world-class pool player and Korean War expert Dr. Tae craved a burger, I wanted to find a connection between mathematics and the real world.

The best that I could come up with:

Two burgers and one order of regular fries have 2020 calories. Four burgers and three orders of regular fries have 4660 calories. How many calories are there in each menu item?

There’s plenty not to like about this problem. As a real-world application of mathematics, like a Subway foot-long sandwich, it doesn’t measure up. For starters, it asks students to pretend that the total number of calories is known while the number of calories in each menu item remains a mystery. “We all use math everyday” meets “Yeah, right.” It would be easy to dismiss this problem. Pseudocontext. Full stop.

However, I also want to use the real world to have my students better understand mathematics. Whereas I’m critical of this problem presented as an application, I’m much more accepting of it as an investigation.

To introduce solving systems of linear equations, I have asked similar questions. My students would reason that if two burgers and one order of fries have 2020 calories, then four burgers and two orders of fries must have 2 times 2020, or 4040, calories. Comparing this with the number of calories in four burgers and three orders of fries means one extra order of fries adds 4660 − 4040, or 620, calories. Two burgers must then have 2020 − 620, or 1400, calories. Each burger has 1400 ÷ 2, or 700, calories. Students have solved a system of linear equations using the elimination method. All before having x's and y's thrown at them. My role was to help my students link their ideas within this context to this:

Solve using elimination.
2x + y = 2020
4x + 3y = 4660
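The students’ elimination reasoning traces directly into a few lines of code, one step per line:

```python
# System: 2b + f = 2020 and 4b + 3f = 4660
# (b = calories per burger, f = calories per order of fries)

# Double the first equation: 4b + 2f = 4040 calories.
doubled_total = 2 * 2020

# Subtract from the second equation: the one extra order of fries.
fries = 4660 - doubled_total       # 620 calories

# Back-substitute: 2b = 2020 - 620, so each burger is half of 1400.
burger = (2020 - fries) // 2       # 700 calories

# Both original equations should check out.
assert 2 * burger + fries == 2020
assert 4 * burger + 3 * fries == 4660
print(burger, fries)  # 700 620
```

The code adds nothing mathematically — that’s the point. The elimination method is the students’ own reasoning, formalized.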

Is that the real issue?

Hey, I just met you and I wanna rock your gypsy soul

Carly Rae Jepsen’s “Call Me Maybe” passed Van Morrison’s “Into the Mystic”.

I’m referring to my iTunes library, of course.

It wasn’t me. Meet the culprits:

First, “Van the Man”. On October 13, 2008, I added “Into the Mystic” to my library (‘Date Modified’ in iTunes). I’m calling this t = 0. I’ve played it 62 times. I last played this “song of such elemental beauty and grace” 1284 days later on April 19, 2012.

Jepsen’s up next. “Call Me Maybe” was added (not by me) on February 28, 2012. This is 1233 days after I added “Into the Mystic”. Seventy-five days later, on May 13, 2012, I listened to this sugary pop tune for the 63rd time. This is 1308 days after adding “Into the Mystic”.

NB: Screenshots of the iTunes Summaries for both songs would make a better first act. Here’s the summary for “Call Me Maybe”:

My initial questions were:

• When did this happen?
• Could I have predicted this?
• How will the number of plays compare in the future?

I modelled this situation using a system of linear equations. For the Irish singer-songwriter, we get p = 0.05d, where p is the number of plays and d is the number of days. For the Canadian Idol, we get p = 0.84d − 1035.72.

Comparing slopes is an obvious discussion topic. The line for “Call Me Maybe” is much steeper than the line for “Into the Mystic”; the rate of change is 0.84 plays/day versus 0.05 plays/day.
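The “When did this happen?” question falls out of intersecting the two models. A sketch, treating plays as continuous and days as measured from October 13, 2008:

```python
from datetime import date, timedelta

# Solve 0.05d = 0.84d - 1035.72 for d, the day the play counts tie.
d = 1035.72 / (0.84 - 0.05)   # ≈ 1311 days
plays = 0.05 * d              # ≈ 65.6 plays at the crossover

# Day 0 is the date "Into the Mystic" was added to the library.
crossover = date(2008, 10, 13) + timedelta(days=round(d))
print(round(d), crossover)    # 1311 2012-05-16
```

So, per the models, “Call Me Maybe” caught up around day 1311 — May 16, 2012, just a few days after that 63rd play on May 13.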

This problem can also be used to explore unit rates. Unit rates can be expressed in more than one way; it’s about deciding what “one” is.

I wanted to express the equation p = 0.84d − 1035.72 in the form p − 63 = 0.84(d − 1308). Slope-point form tells a better story than slope-intercept form in this situation, but my GeoGebra skills are rusty.

Having students look at their own iTunes libraries might make a better investigation than practicing solving catch-up problems like this:

I assumed that this situation could be modelled using linear relations. For “Into the Mystic”, fair enough. I think this reasonably approximates the real data. Outside of perhaps when I was commenting on Michael Pershan’s blog, I didn’t go through a Van Morrison phase. Van Morrison is in my wheelhouse and “Into the Mystic” is just in the rotation. The number of plays per day is (almost) constant.

For “Call Me Maybe”, this assumption is likely incorrect. The song’s got legs but the instantaneous rate of change has to be decreasing, right? For my mental health, I hope it is. That many plays would surely take its toll.

And what if Carly has competition?

What if I modelled this using a logarithmic function? Check this out:

Note that roughly 5½ years after first being added to my library, “Into the Mystic” can be expected to pass “Call Me Maybe”. The natural state of the universe is restored.
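The GeoGebra parameters aren’t shown here, so as an illustration only, one crude choice is to anchor a logarithmic curve for “Call Me Maybe” at the same two data points the linear model used (0 plays on day 1233, 63 plays on day 1308) and find where the linear “Into the Mystic” model overtakes it:

```python
from math import log

# Assumed log model for "Call Me Maybe": p = a * ln(d - 1232),
# anchored so that p(1233) = 0 and p(1308) = 63. This is a stand-in,
# not the parameters from the GeoGebra construction.
a = 63 / log(1308 - 1232)

def call_me_maybe(d):
    return a * log(d - 1232)

def into_the_mystic(d):
    return 0.05 * d

# Bisection: find the later day where the linear model catches the log model.
lo, hi = 1400.0, 10000.0
while hi - lo > 0.5:
    mid = (lo + hi) / 2
    if into_the_mystic(mid) < call_me_maybe(mid):
        lo = mid
    else:
        hi = mid

crossover_day = round((lo + hi) / 2)
print(crossover_day, round(crossover_day / 365, 1))  # day, and years
```

Under this stand-in model the crossover lands in the same ballpark — a bit over five years — though the exact day depends on the fitted parameters, which the post doesn’t give.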

Update: I learned how to animate my GeoGebra construction. Also, I corrected a math mistake. (What was my misconception?)