Friday, June 28, 2013

Misunderstanding by Design

One of the most popular frameworks for designing curriculum today is Understanding by Design from Jay McTighe and Grant Wiggins. Their approach makes complete sense. You start by identifying the enduring understandings that you want students to construct, design assessments that will tell you whether you have met your goals, and then design the curriculum backward around the assessments. Iterate as necessary. The problem is that the quality of curriculum developed using Understanding by Design is all over the map, with the vast majority of it being fairly awful.

This inconsistency can be partially explained by how Understanding by Design is typically implemented: through one-off workshops where the trainer or facilitator is often reluctant to push back too hard or give critical feedback. But implementation issues aren't the whole story. Many of the exemplars, including curriculum units highlighted by McTighe and Wiggins themselves, are not that great either. The problem, as I see it, is that there is no way to know whether the curriculum you are designing is any good when using Understanding by Design. The curriculum could be awesome; the curriculum could be terrible. There is no mechanism in Understanding by Design that lets you know either way... and our brains are very good at fooling us.

When I was working in Holliston, I teamed up with three 7th-grade science teachers over the summer to develop a chemistry unit. We started by identifying learning objectives and then designing an assessment. Identifying the learning objectives went fairly smoothly; it felt like we were all on the same page. But designing the assessment was a struggle. I'd suggest an assessment, and the three teachers would kind of exchange looks and then try to undermine what I had proposed. They were very respectful about it, but also very persistent. I was a little puzzled because they were very strong science teachers who were generally up for new ideas.

Finally, one of the teachers (Melissa) said to me, "Dave, our students can't do these assessments." This came at the end of two 8-hour days during which we had been quietly butting heads. And she only said this to me because we had three more days left together and there was no sign that I was going to cave and accept an assessment that was nice but did not truly get at the heart of the learning objectives we had all agreed to. This ended up being a breakthrough moment for us because we went on to design a curriculum unit that was a huge success when we implemented it for the first time in the fall. (I am so proud of the work that we did together that the first product I developed after founding Vertical Learning Labs was an interactive textbook based on that unit, Chemistry from the Ground Up.)

Could we have achieved the same result using Understanding by Design? We could have used it, but it wouldn't have made the difference between an awesome unit and a merely good one. What made the difference was our commitment to a set of learning objectives and the realization that we could not reach those outcomes with our regular set of tools. I would not accept an assessment that did not actually measure whether students reached our outcomes, and the teachers would not water down the outcomes. Understanding by Design does not provide that kind of feedback. And once we realized that our regular tools were not good enough to do the job, we were freed to acquire and try new ones. It was like we could put all this baggage aside and engage in the task of designing a new chemistry unit with inquiry, not egos, driving the process.

Something similar happened when I was in Groton-Dunstable. I was hired because the administration had been trying for years to get the math department at the middle school to adopt a standards-based math curriculum... without much success. I led a math program adoption committee of eight teachers, and we ended up deadlocked in June. I had actually convinced all eight teachers to adopt a standards-based math program, but four of them wanted CMP 2 and four wanted Math Thematics. While the learning in both programs was similar below the surface, on the surface Math Thematics felt like a more traditional approach; it was more accessible to the teachers who were nervous about changing their practices.

Something I've found at every middle school I've worked at is that middle school math teachers hate it when students always ask the teacher to walk them through how to solve a problem instead of grappling with it themselves. Tasked with breaking the deadlock before the school year ended, I asked this question: Which is more likely to get students thinking for themselves, a procedural approach or a conceptual approach? And as a follow-up: How much are you prepared to risk to try to make that happen? The committee voted 7-0, with one abstention, for CMP 2.

The teachers at Groton-Dunstable took a risk because they realized that they could either accept the status quo or try something different. For them, the potential gain outweighed the pain. There is an adage about a hammer making every problem look like a nail. A corollary is that you won't learn or use new tools if you think the hammer might be able to do the job. Three things are needed to create disruptive curriculum: (1) a good coach, (2) a commitment to achieving something that feels impossible, and (3) a willingness to treat anything less than success as failure. Remember: try not. Do or do not. There is no try. Feedback is a bitch.
