Inspired by a colleague’s presentation during the KIST ‘Teach together; learn together’ professional development event, I took a more formal approach to the impact cycle than I have in the past.
First, I copied the raw data from my students’ diagnostic assessments into an Excel spreadsheet. I added a row at the bottom showing the average result for each test item as a percentage, then used conditional formatting to create a visual overview of the data.
This allowed me to identify a general area of need: Reading. Then I simply copied and pasted the test items with average results of less than 50% along with the corresponding learning outcome indicated in the test documentation.
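The spreadsheet steps above can be sketched in code. This is a minimal illustration with made-up item names and scores, not my actual class data:

```python
# Hypothetical diagnostic results: one list per test item, one entry per student.
# A 1 means the student answered the item correctly, a 0 means they did not.
results = {
    "R1 identify main idea":      [1, 0, 1, 1, 0, 1],
    "R2 describe author purpose": [0, 0, 1, 0, 0, 1],
    "R3 explain inference":       [0, 1, 0, 0, 1, 0],
}

# Mirror the 'average row' at the bottom of the spreadsheet.
averages = {
    item: 100 * sum(scores) / len(scores)
    for item, scores in results.items()
}

# Items averaging below 50% flag a general area of need,
# just as the conditional formatting did visually.
weak_items = [item for item, pct in averages.items() if pct < 50]

for item in weak_items:
    print(f"{item}: {averages[item]:.0f}%")
```

The real workbook had one column per item and one row per student, but the logic is the same: average each item, then pull out the items under the 50% threshold along with their learning outcomes.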
The common weak thread, in my analysis, can be expressed by the verbs ‘describe’ and ‘explain’. Surprisingly, in Bloom’s Taxonomy, these terms are associated with Knowledge and Comprehension, or ‘lower’ order thinking.
One issue involves the outcomes related to author’s purpose. Put bluntly, there is no such learning outcome in the standards for Grade 4 at KIST. The students are being assessed in a high-stakes manner on learning outcomes that the school doesn’t explicitly teach. I, of course, can add standards about author’s purpose to my working documents. Indeed, that is the purpose of this impact intervention. However, it’s clear that teachers’ voices are needed in the development of the school’s assessment and planning documents to ensure that they are relevant and in alignment with one another.
My plan for having a measurable impact on student learning is to ensure that they are exposed to the idea of author’s purpose, and explore it in a variety of ways in our guided reading sessions. This can be done by direct mini lessons and reinforced by revisiting the concept whenever we encounter a novel or remarkable example in the texts we explore.
Another approach would be through précis writing and close reading of a master text. For this, the grade four team selected an abridged version of Swiss Family Robinson to be integrated with our unit of inquiry in April and May. This plan might be our students’ first opportunity to read a novel together. The text uses rich vocabulary and imagery, so I believe there will be many opportunities to analyze and summarize selections, and hypothesize about Mr Wyss’s purpose for various literary choices.
To avoid over-assessing my students, I plan to use the end-of-year English Diagnostic Assessment, of the same type as the one at the start of the year, to measure impact. Throughout the school year, I have assessed and gathered data on a wide variety of learning outcomes informally during guided reading sessions, but this will be the only formal assessment of the learning outcome of author’s purpose.
At KIST, students complete two important diagnostic assessments at the beginning of the school year. One is academic from the United Kingdom Standards and Testing Agency. The other is a Student Survey which allows the learning community to evaluate our classroom environment.
On the academic tests, only 12% of my class scored ‘just below expectations’, and only 8% did so in reading and math. That result indicated to me that academics were an area of strength and that interventions would be needed only on a limited, individual basis. With differentiation strategies in place, a classroom culture cultivating peer support and collaboration would help increase the depth and quality of learning.
Turning attention toward the student survey, I identified two major areas of concern that could potentially derail academic progress and achievement.
This post will focus on an action plan to improve classroom climate and morale with the goal of increasing academic achievement through increased enthusiasm and positive engagement.
As detailed in the post Elementary mindfulness, daily meditation is one strategy that could contribute to a more reflective classroom climate. However, the negative survey results showed a need for a targeted intervention with the goal of helping students to be more Reflective.
Another important opportunity for reflection is our weekly Community Circle. To help my class understand the importance of reflecting together, we elevated Community Circle to a top priority. On top of never cancelling or shortening our sessions, I devised an evaluation system by which active participation results in a ‘meeting expectations’ grade in Listening and Speaking. Knowing that their contributions as members of a community were being monitored, students practiced more intent listening and thoughtful speaking.
I set a goal to award at least one IB Learner Profile Award or PYP Attitude Certificate to each student as quickly as their actions and choices would allow. The result was over 100 awards given, and every student received at least one. To prompt encouragement from parents, every award was accompanied by an email to the student’s parents with a photo of the student receiving it and a description of how it was earned.
The importance of being reflective
The most precise tool in this plan was to create an opportunity for students to reflect on the way they listen and speak to each other. After collaborating with my grade level team about the questions, the result was a G4B Daily kindness and respect reflection form. Completing the form was assigned as home learning every school day for three months. My assumption was that over time, regular reflection would increase students’ mindfulness and help them improve their communication and interpersonal interactions.
The form was submitted over 800 times, and the results show a satisfying upward trend. A short-term intervention might produce more dramatic results, but would not necessarily produce a lasting outcome. These data demonstrate collective and gradual improvement. They also show that students were generally more critical of themselves than of the class as a whole, and that each student improved in relation to their peers.
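The shape of this analysis can be illustrated with a toy version. The weekly averages below are invented for the sketch; only the pattern (gradual upward trend, self rated below class) mirrors the real data:

```python
# Hypothetical weekly averages from the daily reflection form (1-4 scale).
# 'self' is how students rated their own listening and speaking;
# 'class' is how they rated the class as a whole.
self_ratings  = [2.8, 2.9, 3.0, 3.1, 3.2, 3.3]
class_ratings = [3.0, 3.0, 3.1, 3.2, 3.3, 3.5]

def trend(series):
    """Crude trend check: how much the last average exceeds the first."""
    return series[-1] - series[0]

# Both series drift upward, and students rate themselves below the class,
# matching the pattern of collective, gradual improvement described above.
assert trend(self_ratings) > 0 and trend(class_ratings) > 0
assert all(s <= c for s, c in zip(self_ratings, class_ratings))
```

A regression slope would be a more robust trend measure over 800 submissions, but with weekly averages, a first-to-last comparison makes the point.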
The most encouraging results were in the domain of listening. The class showed greatest improvement in listening actively and intently, two skills with a clear connection to academic achievement.
High risk cases
Using the academic diagnostic assessment results to identify ‘high risk’ students, I made a point of checking their reflections occasionally and conferencing with them to increase awareness of their own behavior.
The first case is a student who is well known for attention challenges and socially awkward patterns of behavior, and who scored ‘just below expectations’ on at least one diagnostic assessment.
Interestingly, the results clearly converge, indicating that this student believes that their behavior has improved to more closely match their perception of the class. I have observed this to be true anecdotally, as well, as students in the class have taken responsibility for helping this student to interact more productively and follow directions more consistently.
Another ‘at risk’ student took a very different journey. This may be the only example of a student rating the class lower than themself at the beginning of the survey.
There are students who could reasonably evaluate their own behavior as better than the class as a whole. Unfortunately, this student is not one of them. We discussed their reflections in detail, and there were many instances when I pointed out that choices, ranging from playing with a pencil case to shouting over group members during discussions, were examples of poor listening. The result seems to be increased awareness of their own actions: a dramatic drop in scores, followed by improvements illustrated by increases in some areas.
Another student who is not achieving academically has also had several issues outside of class related to inappropriate use of language. This is another case in which these reflections may have served as a ‘reality check’.
What is most interesting about this case for me is which areas this student felt they were doing well in, compared to their evaluation of the class. At first, two speaking categories were rated higher than the class, yet those scores converge at the end while the remaining areas dropped.
Are results like these desirable? If the goal is increased awareness, and there is a clear problem, then reflections that become gradually more negative could show increased awareness or acceptance of the problem.
Some students were not ‘at risk’ based on their diagnostic assessments, but warrant special attention for other reasons. The next student is well known, if not notorious, for being at the center of most episodes of misbehavior and interpersonal drama in our class.
Interestingly, they seem to accurately assess that their behavior is less kind and respectful than the class as a group. Yet, I am struck by the ambiguity of the self reflections. There doesn’t appear to be any strong trend and the averages of the scores simply converge at 3.5 at the end. This is a case that raises more questions than answers, the most important being whether the student is very aware of their choices, but simply failed to make or observe any progress. It’s also possible that these results could indicate a deep lack of mindfulness about the student’s own actions and interactions with others.
It is possible that a differently designed reflection tool could reveal more insights into this case.
The following graphs are included simply because they look fascinating. The first shows a strange consistency, yet also a clear trend of improvement.
Next, here’s another example of consistency based on category and gradual progress.
At the end of the three months, I asked the students to answer the original questions of concern: ‘Students are respectful to each other in my class.’ and ‘Students behave appropriately in my class.’ This survey was anonymous, like the initial one.
The results improved, and much more dramatically than I expected.
There has been a fundamental shift in behavior and the perception of behavior in my class since the beginning of the school year. While it is impossible to attribute the change to any one variable, it is safe to say that all efforts to increase kindness and respect had a cumulative effect.
Data has been an undercurrent in my teaching since my first classroom in 2007. Of course, in that year, I struggled to gather data and there was virtually no chance of utilizing much of it to inform and enrich instructional planning. For good or ill, data is not essential to the survival of a first year teacher.
Each year after, I slowly improved, including a variety of experiments like the one shared in the post Student Empowerment | COETAIL final project. I tried different forms, organizers, notebooks, and more, until finally unveiling an integrated digital system last year. I shared it as a presenter at the GAFE Summit 2016 in Kobe, Japan, and used it for the school year to publish students’ ongoing assessment data, and other key information such as website usernames and passwords, directly to them as web pages. After celebrating and discussing the system, however, I felt that it was terribly unsatisfying.
Inspiration came in the form of media such as Jack Norris’ keynote presentation from Strata + Hadoop World in San Francisco, Let’s Get Real: Acting on Data in Real Time, embedded below.
The concept of ‘data agility’ through converged data and processing appealed to me because I sought a tool which would organize all assessment data in a way that could be searched, shared, and analyzed. Over the years I had been introduced to many ‘tracking systems’, only to discover that they were utterly unmanageable at scale. Ticking boxes on scope and sequence documents or highlighting learning objectives almost arbitrarily seemed like a show at best. In fact, a colleague who shared such a system with me admitted that at the end of a term, due to a lack of hard data, he would simply choose outcomes to highlight on every student’s document regardless of their actual progress or learning. To quote Mr Norris, I wanted my data to ‘get real’.
‘Small data observes the details or small clues that uncover large trends. The idea is that by honing in on the elements that make up relationships and narratives in schools, education can be enriched.’ The Edvocate
What I wanted to do was bring transparency to the relationships between myself, students, parents, and administrators. Further reading in the big data and data science space, like Data Quality Should Be Everyone’s Job by Thomas C. Redman, directed my attention toward the purpose for the data. Before data is collected, it should already have a purpose, and that purpose dictates the design of the collection, publishing, and analysis tools.
The next piece of the design puzzle was my school’s Assessment Handbook. In it were the categories, criteria, and descriptors on top of which my system would function.
Using a system of Google Sheets, I enter data, and student progress can be viewed in near real time, depending on the efficiency of my data entry. As we began using the system, I shared a video, Tour of your data book, embedded below, which illustrates the details of the user experience much better than I can describe in words.
This system has been remarkably effective and unlike last year, I only plan to make minor tweaks, especially to the user interface. Feedback from students and parents revealed, as I expected, that there are too many graphs and that it’s difficult to know which are more or less important.
Another feature I plan to add is a Google Form which mirrors the data entry document which would allow teaching assistants, specialists, and even parents or students themselves to contribute data to the system.
This year, my Student Survey results held few surprises (link to view last year’s Student survey analysis). Items directly related to me, such as ‘My teacher cares about me’, were positive. Generally, 70-80% of students answered ‘usually’ with very few, most often only one student, answering ‘no’.
Listening to students
One surprise was the response to the statement ‘My teacher listens to me.’, to which 48% of my students responded that I only ‘sometimes’ listen to them. Slightly baffled, I reflected on my practice and identified a few of my behaviors that could lead to this result.
First, as a rule, I ignore students when they suddenly shout across the classroom, begin asking a question without saying ‘excuse me’ or otherwise catching my attention, or interrupt other students. I can easily understand how a child could perceive that I am not listening, because in some cases I intentionally don’t listen in order to cultivate a classroom culture of politeness.
Among students who responded ‘sometimes’ or ‘no’, the overall average of their responses was only 69% positive, meaning that those who responded negatively to this item also tended to respond negatively to most of the other items. Of those who don’t feel that I ‘usually’ listen to them, 69% also don’t feel free to ask and answer questions, though with so few respondents this is a tenuous correlation.
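The cross-tabulation behind percentages like these is simple to sketch. The response rows below are hypothetical, not the actual survey export:

```python
# Hypothetical anonymised survey rows: one dict per student, values are
# 'usually', 'sometimes', or 'no'. The exact rows are illustrative only.
responses = [
    {"listens": "usually",   "questions": "usually"},
    {"listens": "sometimes", "questions": "sometimes"},
    {"listens": "sometimes", "questions": "no"},
    {"listens": "no",        "questions": "sometimes"},
    {"listens": "usually",   "questions": "usually"},
    {"listens": "sometimes", "questions": "usually"},
]

# Students who don't feel I 'usually' listen to them...
negative = [r for r in responses if r["listens"] != "usually"]

# ...and, of those, the share who also don't feel free to ask
# and answer questions.
also_negative = sum(1 for r in negative if r["questions"] != "usually")
share = 100 * also_negative / len(negative)
print(f"{share:.0f}% of negative responders were also negative on questions")
```

Filtering on one item before averaging another is all the ‘correlation’ here amounts to, which is why a small class makes the figure tenuous.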
As a simple action plan, I would follow the steps below.
1. Observe if and when I don’t listen to students.
2. Make it more explicit that I sometimes ignore students speaking to me if they are acting disrespectfully or impolitely.
3. Reinforce our classroom essential agreement – which was composed, synthesized, and signed by all of the students – about being Open-minded Communicators.
We are Open-Minded Communicators.
We have a right to share our opinions and feelings.
We have a responsibility to show respect by listening and practicing empathy.
I would also note that of all of the classes I have taught in nine years, this is by far the most needy. During any written assessment, there is a constant queue at my desk and a barrage of hands in the air asking for help. My inbox is also consistently populated by emails from students asking me to send PDFs of lost homework and other requests for favors, which I politely decline. It is possible that their concept of the role of a teacher is significantly different from mine.
Choice and agency
A difference in expectations might illuminate another perplexing survey item result to the statement, ‘My teacher allows me to demonstrate my understanding in various ways.’
For their first unit Summative Assessment Task, students had the instruction to ‘Present your research findings in an appropriate medium of your choice (written report, video, poster, dance, cooking, etc).’
Almost everyone in the class chose to do an oral presentation with a poster or PowerPoint for visual support. The remaining two students submitted written reports. Although this may only be a case of carefully reading and following instructions, I feel justified in being somewhat annoyed.
Respect and classroom behavior
I was shocked to discover their responses to the statement, ‘Students are respectful to each other in my class.’
Only two students think that their peers ‘usually’ treat each other with respect, and almost a quarter feel that their class is always disrespectful. The same holds true for their perceptions of classroom behavior.
When I asked if anyone wanted to learn in a class like the one shown above, no one responded.
I have discussed these results with my grade level team, administration, and the previous grade’s teachers. All assured me that the students’ feelings about their community stem from complex social dynamics. In brief, this class has too many ‘alphas’ and not enough empathy. This is a case study to test my ability to cultivate social and emotional intelligence. And a fair and timely challenge it is.
A future post will detail the reflection and data informed action plan I have set into motion to help this learning community to become more Caring.
I would certainly appreciate anecdotes and suggestions that might more brightly illuminate a path forward.
One admirable feature of professional development at KIST is the annual Impact on Learning study. Teachers design a data driven experiment based on a pedagogical approach or strategy and then analyze the data to reflect on the efficacy of that aspect of their teaching.
To start, I formulated a question and answer dialogue:
On which group of students do I want to have the greatest impact?
All of them. Inclusive practices and thoughtfully designed learning experiences which emphasize student choice and voice should provide opportunities for all students to excel.
Which group of students are most difficult to reach with inclusive practices and learning experiences that emphasize student choice and voice?
Students who are reluctant to share their ideas in class or participate actively in learning engagements are the most difficult to reach.
Why don’t those students participate?
The reasons they don’t participate are as diverse as the students themselves. However, if they don’t participate now, they likely didn’t before either, meaning their opportunities for practice have been limited, possibly severely.
Often, students (and people in general) with little experience speaking in a group feel shamed by their lack of fluency. Lack of confidence leads them to withdraw more, causing them to practice even less.
Being fairly introverted myself, I sympathize with many people’s preference to remain in the shadows of a crowd, but nine year old introverts, preferences aside, need to practice public speaking in a safe environment.
Being introverted doesn’t make you averse to collaboration. Everyone should practice social skills. https://t.co/P87RzlBL68
What I needed was a strategy to encourage the students to grow as courageous Communicators by sharing their ideas with the whole class.
My methodology for gathering data is simple. At various times in class, I propose an open ended question. For example, I might ask for interpretations of an idiom, impressions of an image, or opinions about a famous quote. That there are no correct or incorrect answers is made clear to students, as is the fact that ‘I don’t know’ is an acceptable response. Students may also ‘pass’. Sometimes the provocations are directly connected to our unit of inquiry, sometimes not.
Using a deck of laminated cards with the students’ names written on them, I ensure that every student has an equal opportunity to speak. My response to every contribution is ‘thank you’, and I very rarely paraphrase or ask clarifying questions in this context. To students who ‘pass’, I simply respond with ‘OK’.
Cards are separated into two categories, and data about who contributed an answer and who did not is then entered into a spreadsheet.
The raw data is relatively easy to process to produce interesting graphs.
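The processing looks roughly like this. A minimal sketch with invented names and sessions, mirroring the spreadsheet step:

```python
# Hypothetical card-sort records: for each session, the set of students
# who contributed an answer (students who 'passed' are simply absent).
sessions = [
    {"Aiko", "Ben", "Chika"},
    {"Aiko", "Chika", "Daan"},
    {"Aiko", "Ben", "Chika", "Daan"},
]
roster = ["Aiko", "Ben", "Chika", "Daan", "Emi"]

# Per-student participation rate across all sessions, spreadsheet-style.
rates = {
    name: 100 * sum(name in s for s in sessions) / len(sessions)
    for name in roster
}

for name, rate in rates.items():
    print(f"{name}: {rate:.0f}%")
```

Once the data is in this shape, the filtered graphs below are just the same calculation restricted to a subset of the roster.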
Some students always talk and some never talk by default. Filtering out those students makes the graph more readable, but still not very revealing.
Referring to diagnostic assessment data in Reading from the beginning of the school year, I included only students who scored ‘just below expectations’ or ‘below expectations’.
Next, I filtered for students who scored ‘Just below expectations’ or ‘Below expectations’ on the August diagnostic assessment in Writing.
Again, this graph doesn’t instantly reveal anything other than a general upward trend in participation.
Next, I wanted to explore a possible correlation between this exercise and improvement. The next graph is students with more than 10% improvement on ongoing informal and formal assessments in Reading from Q1 to Q3.
Next, students with more than 10% improvement on ongoing assessments in Writing from Q1 to Q3.
This is the first graph which indicates a clear correlation. With only two exceptions, students whose writing has improved are also increasing their participation.
Since the objective is to improve language skills, I tried including only students who consistently achieve below 80% on ongoing assessment in English language.
Compare to consistently strong achievers in language.
It is interesting that consistently higher performers seem to participate almost at random, while consistently lower performers are participating progressively more and more.
If only for the purpose of having more data visualization, here are the members of the most advanced guided reading group.
And the least advanced guided reading group.
It’s exciting to see that this activity is impacting the exact group of students it was designed to benefit.
When the class completes its end of year diagnostic assessment in Language, I expect to see similar improvements among students who have gained confidence as communicators through this simple activity.
Finally, here is the whole class average.
Rather than analyzing individual students, this graph reveals something I hadn’t expected. If I compare the number of opportunities to speak with the average rate of participation, there is a stark correlation.
Number of data points (% participation):
October: 4, in one week (61%)
November: 16 (61%)
December: 3 (41%)
January: 7 (60%)
February: 7 (74%)
March: 7 (71%)
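As a rough check on this relationship, a Pearson correlation over the six monthly points can be computed from scratch. Note that October’s four sessions fell in a single week, which this simple monthly count ignores, and that November’s 16 sessions don’t stand out in participation, so with so few points the number is only suggestive:

```python
from math import sqrt

# Monthly figures from above: (month, speaking opportunities, % participation).
months = [("October", 4, 61), ("November", 16, 61), ("December", 3, 41),
          ("January", 7, 60), ("February", 7, 74), ("March", 7, 71)]

counts = [c for _, c, _ in months]
rates = [p for _, _, p in months]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed without external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(counts, rates)
print(f"r = {r:.2f}")  # positive, but modest with only six data points
```

Six points is far too few for statistical confidence, but the positive sign is consistent with the idea that more frequent opportunities encourage participation.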
It would seem that the more we do this activity, the more participation there is. Thinking of a student trying to build confidence, it makes perfect sense. If one hesitates, one loses an opportunity. However, missing a chance might be just the motivation one needs to seize the next one. If that next opportunity comes sooner rather than later, one is more likely to take it.
And so, the data comes full circle from thinking of individual students, back to individual students.
Peter Gow’s post, The Data Challenge for Schools – What Problem Are You Trying to Solve?, reminds me that the importance of data is not about averages, it’s about outliers. The greatest impact can often be made where there are cracks or gaps in the data. What is important is being intentional when gathering data so that when it is organized and interpreted, it answers the initial question.
It’s also important to remember that while ‘data’ and ‘gut’ are not the same, as Doug Johnson notes in his post, Data or gut?, through investment in time and training, it is possible to align the gut more precisely through data.
My first eye opener was Jim Sill’s session, Google Views – Lessons in 360º, in which I was introduced to Cardboard. This is a realistic, accessible iteration of virtual reality that could be easily integrated into schools. Although I haven’t had other VR experiences, I wonder if Cardboard offers a majority of the sensory experience.
Overall, I was most inspired by Stephen Taylor’s Formatting the Flow session. As an inquiry teacher, I have always wrestled with the impulse to manage students’ learning. What Stephen showed was how formatted documents can make processes visual and focus students on their learning rather than their presentation and reporting media.
My group was beta testing BreakoutEDU with augmented reality and was not able to open the box like some other groups.
I began with a brief introduction to the concept of transparency as it is viewed in practice in government, business, and education. Then, following a generally ‘less to more’ transparent framework according to the slides embedded below, I shared the tools that I use to make planning, teaching, and assessment in my classroom as transparent as possible. Included in the demonstrations were my weekly planners. I use a template in Google Sheets that allows me to plan to five-minute accuracy and include relevant details such as differentiation. These documents are published as a webpage, and the link is shared on our class Moodle site. Having the plans published via a link allows easy access from any internet-connected device. A classroom computer at the front of our classroom is dedicated to our projector, but it also has all of our links saved as bookmarks in the web browser. Throughout the day, students check these links. This increases the amount of time that I can devote to learning by minimizing questions like ‘what are we doing next?’ or ‘what’s after lunch?’.
A teacher in the workshop asked if there was added stress from publishing all of my planning. I replied that this level of transparency adds a component of accountability that is its own reward. Using the publishing capability of Google Apps, I also publish slides of our daily warm ups and home learning assignments. They are embedded on our class Moodle and require no additional maintenance, updating automatically when new slides are added. If a parent or other member of our learning community uses them even once to have a conversation with their child or keep up to date on home learning, it’s worth the minimal effort to set up.
Finally, I shared my data workbook. This is a system of spreadsheets that provides me with real time data from assessments and then publishes the same data to individual pages, published as websites, for students and families.
This works extremely well for parents to keep up to date on their child’s learning and for sharing web addresses, usernames, and passwords. All materials for the workshop are shared in a public Google Drive folder, Transparency | GAFE Summit Kobe 2016. Strangely, as soon as my session ended, I felt the urge to develop a new data management system that could provide more possibilities for data visualization and analysis. I’ve already begun sketching ideas and look forward to designing and programming this summer.
I’ve completed tons of online professional development, and nothing compares to the invigorating social and interactive experience of a face-to-face conference. Ironically, this can be especially true in technology, where digital isn’t necessarily the best way to learn something new. The tools which I have put to work immediately are Quizizz and SafeShare. Since introducing Quizizz, my students constantly ask when we will be taking the next quiz. Reflecting on my own presentation, I feel that I probably learned more than my participants! It is easy to feel that the time and energy spent preparing to conduct a conference or workshop session is wasted, but I found the opposite. By deeply analyzing and presenting my approaches to technology in the classroom, I deepened my understanding. Being inspired to expand my strategies was a welcome surprise! If you’re curious to explore the conference, follow this link to view the full schedule. I’ve already been contacted by Google-related colleagues about organizing an event in Tokyo, so I look forward to putting some of that inspiration into action.
At KIST, we conduct an annual student survey to assess classroom climate. Students complete the survey electronically and anonymously by evaluating statements about me and the class as ‘usually’, ‘sometimes’, or ‘no’. The resulting data is later shared with teachers. It’s an informative method of receiving feedback which can be used to refine approaches to teaching.
Overall, my survey results were very positive. Responses to statements which I would consider critical, like ‘My teacher cares about me.’ and ‘My teacher shows respect to all students.’, were especially encouraging.
One question which I find very useful for evaluating my differentiation strategies is ‘I am able to do the work given to me.’
An 81% ‘usually’ and 0% ‘no’ outcome shows that my ongoing formative assessment cycle ensures that every student is both challenged and capable of success. One surprise was an unexpectedly positive response to ‘My teacher allows me to demonstrate my understanding in various ways.’
Through the first half of the school year, providing multiple pathways to understanding and accepting diverse assessment artifacts have been areas I feel I have neglected. However, my students seem to think otherwise, so perhaps what I have been doing has been satisfying for them.
The first disturbing result in the student survey was to the statement ‘Students are respectful to each other in my class.’
The majority of students responded ‘sometimes’, indicating that I could be doing more to foster respect in my class. Fortunately, they don’t seem to think that I lack respect.
My biggest disappointment is the mere 62% ‘usually’ response for ‘My teacher teaches about and demonstrates the Learner Profile attributes.’
Planning and executing IB Primary Years Program units of inquiry has been my greatest frustration this year as I adapt to a new teaching environment. In the past, I have been very careful to explicitly teach the elements of the PYP embedded within units of inquiry. Thus far in this school year, such careful planning and execution simply hasn’t happened. Fortunately, my class is in the formative stages of a new unit, so it’s a perfect time to bring IB Learner Profile attributes back to the forefront. To get started, I’ll access another data source, our Learner Profile reflections, by looking at the analytics.
This data visualization reveals that at this time, according to their self assessments, Caring is my students’ least developed Learner Profile attribute. This evidence also supports their feeling that they are not respectful to each other in class.
By making Caring a focal point in our next unit of inquiry, these issues may be addressed.
The statement ‘I feel free to ask and answer questions.’ is the most perplexing to me.
Contrasting that with a more favorable response to ‘My teacher gives me help when I need it.’ suggests some confusion.
In general, school has traditionally been a place where teachers have answers to students’ questions. While that is a culture that I reject, it doesn’t mean my students haven’t grown up with that mindset. As in Dan Meyer’s TED talk, Math class needs a makeover, I aim to promote ingenuity and inquiry by being ‘less helpful’. Students constantly ask me questions that could be researched in a book, on the Internet, or by asking their peers. For example, I almost never answer spelling questions for students.
I also avoid the traditional hand raising routine that most classrooms follow, preferring to call for responses randomly or in an open forum setting. So it’s little surprise that many only ‘sometimes’ feel free to ask questions, since they are likely still processing the cognitive dissonance of a new culture of ‘less helpful’ classroom participation.
This is an interesting exercise, although doing it any more than once per year would be too much. I can also see how it might be difficult to compare results year to year, as the perspectives of different classes can vary so widely.
I am happy to take away some actionable data and look forward to seeing what results it might lead to.