The role of rubrics in teaching is not up for debate. Complex tasks need to be analyzed by categories and clear criteria. However, I have found that rubrics sometimes become little more than a checklist of instructions for completing a task rather than tools for understanding, reflection, and assessment.
My solution is to use blank rubrics. You might think that a blank rubric isn’t a rubric at all, and you would be correct if the purpose of the rubric were only to evaluate a learning artifact. If the rubric itself is a learning tool, then a blank rubric is a rich opportunity for discussion and critical evaluation.
Summative assessment tasks, in particular, benefit from this type of rubric. The categories have been in focus throughout the unit, and have usually been assessed in a more prescribed manner in a previous task. As a summative assessment task should be an opportunity for students to exercise choice and creativity in how they present their understanding, it would be impractical to create specific criteria that could apply to any artifact.
Assessment as learning
Students work in groups to experience a peer’s presentation of their learning and discuss the success of the artifact according to each category. They agree on a score and write in the appropriate boxes the specific elements that support their evaluation.
The assessments are completed in groups of three or four, so every presenter receives at least six separate rubrics completed in this way. The results are consistently honest and accurate, especially when averaged and analyzed in detail.
When an assessment seems mistaken or varies notably from the norm, usually because a group hasn't focused or applied enough thought to its findings, a teachable moment arises to review the categories and criteria.
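The averaging and outlier-spotting described above can be sketched in a few lines of Python. The category names, scores, scale, and the 1.5-standard-deviation threshold are all illustrative assumptions, not part of my actual process:

```python
from statistics import mean, stdev

# Hypothetical peer-assessment scores for one presenter: one mark per
# reviewing group, per rubric category (an 8-point scale is assumed).
scores = {
    "Knowledge": [6, 7, 6, 7, 6, 2],
    "Communication": [5, 5, 6, 5, 6, 5],
}

for category, marks in scores.items():
    avg = mean(marks)
    spread = stdev(marks)
    # Flag any group's mark that sits far from the rest: a prompt to
    # revisit the criteria with that group.
    outliers = [m for m in marks if abs(m - avg) > 1.5 * spread]
    print(f"{category}: average {avg:.1f}, outliers {outliers}")
```

With these sample numbers, the lone score of 2 in Knowledge stands out against the average, which is exactly the kind of discrepancy that opens a teachable moment.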
I have observed that students enhance their conceptual understanding of a unit immensely through this process of peer assessment with these creatively negotiated rubrics.
Data has been an undercurrent in my teaching since my first classroom in 2007. Of course, in that year, I struggled to gather data and there was virtually no chance of utilizing much of it to inform and enrich instructional planning. For good or ill, data is not essential to the survival of a first year teacher.
Each year after, I slowly improved, including a variety of experiments like the one shared in the post Student Empowerment | COETAIL final project. I tried different forms, organizers, notebooks, etc., until finally unveiling an integrated digital system last year. I shared it as a presenter at the GAFE Summit 2016 in Kobe, Japan, and used it for the school year to publish students’ ongoing assessment data, and other key information such as website usernames and passwords, directly to them as web pages. Yet after celebrating and discussing the system, I felt that it was still terribly unsatisfying.
Inspiration came in the form of media such as Jack Norris’ keynote presentation from Strata + Hadoop World in San Francisco, Let’s Get Real: Acting on Data in Real Time, embedded below.
The concept of ‘data agility’ through converged data and processing appealed to me because I sought a tool that would organize all assessment data in a way that could be searched, shared, and analyzed. Over the years I had been introduced to many ‘tracking systems’, only to discover that they were utterly unmanageable at scale. Ticking boxes on scope and sequence documents or highlighting learning objectives almost arbitrarily seemed like a show at best. In fact, a colleague who shared such a system with me admitted that at the end of a term, due to a lack of hard data, he would simply choose outcomes to highlight on every student’s document regardless of their actual progress or learning. To quote Mr Norris, I wanted my data to ‘get real’.
‘Small data observes the details or small clues that uncover large trends. The idea is that by honing in on the elements that make up relationships and narratives in schools, education can be enriched.’ The Edvocate
What I wanted to do was bring transparency to the relationships between myself, students, parents, and administrators. Further readings within the big data and data science trends, like Data Quality Should Be Everyone’s Job by Thomas C. Redman, directed my attention toward the purpose for the data. Before data is collected, it should already have a purpose, and that purpose dictates the design of the collection, publishing, and analysis tools.
The next piece of the design puzzle was my school’s Assessment Handbook. In it were the categories, criteria, and descriptors on top of which my system would function.
Using a system of Google Sheets, I enter data and students can view their progress in potentially real time, depending on the efficiency of my data entry. As we began using the system, I shared a video, Tour of your data book, embedded below, which illustrates the details of the user experience much better than I can in words.
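The core idea behind the per-student pages can be sketched as a simple grouping step, assuming a flat tabular export from the data-entry sheet. The column names, student names, and scores here are all invented for illustration:

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical CSV export of the assessment data-entry sheet:
# one row per recorded assessment, keyed by student.
SHEET_CSV = """student,unit,category,score
Aiko,Fractions,Knowledge,7
Aiko,Fractions,Communication,6
Ben,Fractions,Knowledge,5
"""

def pages_by_student(csv_text):
    """Group the flat rows into one view per student, the way the
    published web pages present each student's ongoing data."""
    pages = defaultdict(list)
    for row in csv.DictReader(StringIO(csv_text)):
        pages[row["student"]].append(
            (row["unit"], row["category"], int(row["score"]))
        )
    return dict(pages)

print(pages_by_student(SHEET_CSV)["Aiko"])
```

In the real system the grouping and publishing happen inside Google Sheets rather than in a script, but the transformation from one shared entry table to many per-student views is the same.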
This system has been remarkably effective and unlike last year, I only plan to make minor tweaks, especially to the user interface. Feedback from students and parents revealed, as I expected, that there are too many graphs and that it’s difficult to know which are more or less important.
Another feature I plan to add is a Google Form that mirrors the data entry document, allowing teaching assistants, specialists, and even parents or students themselves to contribute data to the system.
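Opening data entry to more contributors makes validation matter: a submitted response should match the sheet's schema before it is merged in. A minimal sketch of such a check, where the category names and score range are assumptions rather than my school's actual handbook values:

```python
# Hypothetical set of valid rubric categories and score range.
CATEGORIES = {"Knowledge", "Communication", "Application"}

def valid_response(row):
    """Accept a contributed Form response only if its category is
    recognized and its score parses to an in-range integer."""
    try:
        score = int(row["score"])
    except (KeyError, ValueError):
        return False
    return row.get("category") in CATEGORIES and 1 <= score <= 8

print(valid_response({"category": "Knowledge", "score": "7"}))
print(valid_response({"category": "Spelling", "score": "7"}))
```

A check like this could run on each Form submission before the row is appended to the main data sheet.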
One of the best ways I’ve found for infusing inquiry into my approach to teaching mathematics is to introduce new topics and concepts with a challenge. Recently, we explored Division by asking volunteers to share on the whiteboard techniques for visually modelling the division expression 21÷3.
Taking the time to do this has many benefits, including:
providing an opportunity to build confidence through peer to peer teaching
introducing, highlighting, and discussing a variety of accessible strategies
developing mental models which can be referred to as learning continues
initially assessing general understanding of the concept in a zero-pressure setting
reviewing learning that may need to be refreshed before engaging more deeply.
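The whiteboard challenge above, modelling 21÷3, can also be sketched as equal groups, which is one of the visual strategies students typically volunteer. The function name and the counter-drawing are illustrative, not a transcript of any student's work:

```python
def equal_groups(total, groups):
    """Model division as sharing `total` counters into `groups`
    equal groups, the way students might draw it on a whiteboard."""
    quotient, remainder = divmod(total, groups)
    picture = [["o"] * quotient for _ in range(groups)]
    return quotient, remainder, picture

q, r, picture = equal_groups(21, 3)
print(f"21 ÷ 3 = {q} remainder {r}")
for group in picture:
    print(" ".join(group))  # one row of counters per group
```

Running this draws three rows of seven counters, mirroring the equal-groups model of 21÷3 = 7.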