Ah, the interminable design cycles that we as teachers put ourselves through! Something that I often find challenging is how lengthy these cycles can be–I mean, if the day after you’ve shared a lesson with kids you have a great idea about how it could have been way better, it could be up to a year or more before you get a chance to try it out. And that’s if you remember your idea.
The cycle that has thrown me for the biggest loop is the one I’ve been in about SBG (or Standards-Based Grading). When I first got into this mathedtweetblogiverse two years ago, I was excited by the work Dan Meyer and others had done to make their expectations about skills clear to their students. Until that point, my own assessment arc had not been going well. From the start, it had been really hard to reconcile my previous experiences with assessment and Saint Ann’s culture and ideals. Giving quizzes and tests in arbitrary, knee-jerk fashion after we had covered “enough” material fizzled in the face of not giving grades. Also, neither the tests themselves nor the feedback and corrections I labored over seemed to improve anyone’s understanding.
When I pulled back from those traditional assessment methods, however, I found myself in something of a vacuum. The fact that I’m in charge of my own curricula and evaluations with little to no constraint–coupled with the fact that I tend toward spontaneity and disorganization–often meant that I did hardly any formal assessments at all. I knew that my students were learning things from the work they were doing for my class. I could make records of my observations of their activities to include in my anecdotal reports. Still, I couldn’t help but think something was missing–my students just weren’t well served without clear expectations, a systematic way of pursuing them, and a feedback cycle.
Enter SBG.
Trying to bring Standards-Based Grading into a no-grades school was an interesting adventure. Suffice it to say that after cycling through several different formulations over the past two years, I’m really excited about the approach I’ll be trying out very soon. I’ve decided to go binary with respect to my skills quizzes, since trying to measure progress toward understanding numerically never felt fruitful to me in practice, and there’s no need for me to establish a final “average” for each kid. (Shawn Cornally’s thoughts here also helped to get me there.) I’ll continue to have skills lists for my kids and weekly quizzes for them to choose from in order to demonstrate their mastery. I’ll be giving them copious feedback and letting them know whether they nailed it or still have work to do, and we’ll both keep track of the skills they’ve mastered.
Still, I really wanted to find a way to encourage students to see that skills mastery is the beginning of the story, rather than the end. Skills are tools that let you do new things, that empower you, that even give you a new bit of social capital. With these thoughts in mind, I designed the following sheet to help kids to track their progress toward skills mastery and to inspire them to use their knowledge in fruitful ways. I’ll be using the same document to track their progress.
That first column gets checked off once a kid aces a weekly skills quiz–that’s the binary got-it-or-don’t. The space below is for me and my students to keep track of feedback that I give them and reminders they might make for themselves. The other three columns are by no means sequential and don’t represent “stages” past mastery. Rather, they are suggested aspirations and goals for the newly minted master geometric-series-summer. Would you like to try a non-routine problem that involves geometric series? Just ask me for one. Does someone you know–in our class or out of it–need help with this topic, or want to learn some new math just out of curiosity? Share your new knowledge and document it by journaling, snapping a photo, or making a video. Did you recognize three months later that knowing how to sum geometric series opened up a route to solving a problem as you worked on a project? Sweet! Include it in your project write-up.
The point is that those other columns are an ever-present alert: You know things! You can seek out ways to use your knowledge! All three of the “choice prongs” are here–the suggested tasks are big and open-ended, the timeframe is as long as needed, and students can choose these for themselves as goals and record and reflect on their successes as they happen.
A final thought: it seems to me that something like this could be easily adapted to a grades environment. I’m not well-practiced at designing grading schemes, but I’m thinking:
- non-mastery of a skill in isolation is a high F
- mastery of a skill in isolation is a high B or low A
- mastery of a skill in isolation plus a further use of the skill is a high A
And then average them up.
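If it helps to see the arithmetic, here’s a rough sketch in Python of how that averaging might work. The grade-point values and the sample skill list are entirely invented, just to make the idea concrete; any real scale would need tuning.

```python
# A rough sketch of the grading scheme above, with invented grade points.
# "not yet"      -> high F (non-mastery of a skill in isolation)
# "mastered"     -> high B / low A (mastery in isolation)
# "mastered+use" -> high A (mastery plus a further use of the skill)
GRADE_POINTS = {
    "not yet": 0.7,
    "mastered": 3.4,
    "mastered+use": 3.9,
}

def course_grade(skills):
    """Average each skill's grade points into a single course number."""
    points = [GRADE_POINTS[status] for status in skills.values()]
    return sum(points) / len(points)

# A hypothetical student partway through the term:
skills = {
    "sum a geometric series": "mastered+use",
    "similar triangles": "mastered",
    "law of sines": "mastered",
    "circle theorems": "not yet",
}
print(round(course_grade(skills), 2))  # 2.85 on a 4.0 scale
```

Whether a straight average is even the right way to combine skills is its own question; a median, or weighting the most recent status of each skill, would be easy variations on the same idea.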
Thoughts on the practicality of such a grading scheme? Comments on the set-up I’m going to try out? Ideas for other ways of building and sharing skills mastery beyond use in isolation?

