Data evolution & revolution

The past

Data has been an undercurrent in my teaching since my first classroom in 2007. Of course, in that year, I struggled to gather data, and there was virtually no chance of utilizing much of it to inform and enrich instructional planning. For good or ill, data is not essential to the survival of a first-year teacher.

Each year after, I slowly improved, including a variety of experiments like the one shared in the post Student Empowerment | COETAIL final project. I tried different forms, organizers, notebooks, etc., until finally unveiling an integrated digital system last year. I shared it as a presenter at the GAFE Summit 2016 in Kobe, Japan, and used it throughout the school year to publish students’ ongoing assessment data, along with other key information such as website usernames and passwords, directly to them as web pages. Yet after all the celebrating and discussing, I found the system terribly unsatisfying.

The present

Inspiration came in the form of media such as Jack Norris’ keynote presentation from Strata + Hadoop World in San Francisco, Let’s Get Real: Acting on Data in Real Time, embedded below.

The concept of ‘data agility’ through converged data and processing appealed to me because I sought a tool that would organize all assessment data in a way that could be searched, shared, and analyzed. Over the years I had been introduced to many ‘tracking systems’, only to discover that they were utterly unmanageable at scale. Ticking boxes on scope and sequence documents or highlighting learning objectives almost arbitrarily seemed like a show at best. In fact, a colleague who shared such a system with me admitted that at the end of a term, due to a lack of hard data, he would simply choose outcomes to highlight on every student’s document regardless of their actual progress or learning. To quote Mr Norris, I wanted my data to ‘get real’.

While designing my own system, I became somewhat of an amateur data scientist. The implications of the article Putting the science back in data science got me thinking about the flow from data entry to visualization and publishing. A quote from the post Can Small Data Improve K-12 Education? helped to clarify the objective for the project.

‘Small data observes the details or small clues that uncover large trends. The idea is that by honing in on the elements that make up relationships and narratives in schools, education can be enriched.’ The Edvocate

What I wanted to do was bring transparency to the relationships among myself, students, parents, and administrators. Further reading within the big data and data science trends, like Data Quality Should Be Everyone’s Job by Thomas C. Redman, directed my attention toward the purpose of the data. Before data is collected, it should already have a purpose, and that purpose dictates the design of the collection, publishing, and analysis tools.

[Screenshot: copious data entry (lots of dragging)]
The next piece of the design puzzle was my school’s Assessment Handbook. In it were the categories, criteria, and descriptors on top of which my system would function.

[Screenshot: student data visualization via Google Sheets]
Using a system of Google Sheets, data is entered and student progress can be viewed in near real time, depending on the efficiency of my data entry. As we began using the system, I shared a video, Tour of your data book, embedded below, which illustrates the details of the user experience much better than I can describe in words.
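
For the technically curious, the sketch below approximates the kind of aggregation the sheets perform, assuming the data book were exported as a CSV with hypothetical columns student, criterion, date, and level; it is an illustration of the idea, not the actual spreadsheet logic.

```python
import csv
from collections import defaultdict

def latest_levels(csv_path):
    """Return each student's most recent recorded level per criterion.

    Assumes a hypothetical CSV export of the data book with columns:
    student, criterion, date (ISO format), level.
    """
    latest = defaultdict(dict)  # student -> {criterion: (date, level)}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            student, criterion = row["student"], row["criterion"]
            date, level = row["date"], row["level"]
            current = latest[student].get(criterion)
            # Keep only the most recent entry for each criterion.
            if current is None or date > current[0]:
                latest[student][criterion] = (date, level)
    return {
        student: {crit: level for crit, (date, level) in crits.items()}
        for student, crits in latest.items()
    }

if __name__ == "__main__":
    # Hypothetical file name for an exported data book.
    for student, levels in latest_levels("data_book_export.csv").items():
        print(student, levels)
```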

The future

This system has been remarkably effective, and unlike last year, I plan to make only minor tweaks, especially to the user interface. Feedback from students and parents revealed, as I expected, that there are too many graphs and that it’s difficult to know which are more or less important.

Another feature I plan to add is a Google Form that mirrors the data entry document, which would allow teaching assistants, specialists, and even parents or students themselves to contribute data to the system.
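
Purely as a sketch of how such contributions might be reconciled, and assuming both the data book and the form responses could be exported as CSVs sharing hypothetical columns (student, criterion, date, level, contributor), a merge might look something like this:

```python
import csv

# Hypothetical column layout shared by the data book and the mirrored form.
FIELDS = ["student", "criterion", "date", "level", "contributor"]

def merge_contributions(master_path, form_path, out_path):
    """Combine master rows with form-contributed rows, skipping exact duplicates."""
    rows, seen = [], set()
    for path in (master_path, form_path):
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                key = tuple(row.get(field, "") for field in FIELDS)
                if key not in seen:
                    seen.add(key)
                    rows.append({field: row.get(field, "") for field in FIELDS})
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    # Hypothetical file names for the two exports and the merged result.
    merge_contributions("data_book_export.csv", "form_responses.csv", "merged.csv")
```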

If articles like The Three Ways Teachers Use Data—and What Technology Needs to Do Better by Karen Johnson and 7 Steps to Becoming a Data-Driven School by Eric Crites are any indication of the direction that data utilization is heading in education, I’m glad to be along for the ride.

Student survey analysis

At KIST, we conduct an annual student survey to assess classroom climate. Students complete the survey electronically and anonymously by evaluating statements about me and the class as ‘usually’, ‘sometimes’, or ‘no’. The resulting data is later shared with teachers. It’s an informative method of receiving feedback which can be used to refine teaching approaches.
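
As a rough illustration of the arithmetic behind the percentages quoted below, and assuming the anonymous responses could be exported as a CSV with one hypothetical column per statement and one row per student, the tally might look something like this:

```python
import csv
from collections import Counter

def tally_survey(csv_path):
    """Compute the percentage of 'usually'/'sometimes'/'no' answers per statement.

    Assumes a hypothetical CSV export with one column per survey statement
    and one row per (anonymous) student response.
    """
    counts = {}
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for statement, answer in row.items():
                counts.setdefault(statement, Counter())[answer.strip().lower()] += 1
    percentages = {}
    for statement, tally in counts.items():
        total = sum(tally.values())
        percentages[statement] = {
            answer: round(100 * n / total) for answer, n in tally.items()
        }
    return percentages

if __name__ == "__main__":
    # Hypothetical file name for the exported survey responses.
    for statement, pct in tally_survey("survey_responses.csv").items():
        print(f"{statement}: {pct}")
```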

Positives

Overall, my survey results were very positive, including responses to statements I would consider critical, like ‘My teacher cares about me.’ and ‘My teacher shows respect to all students.’


One question which I find very useful for evaluating my differentiation strategies is ‘I am able to do the work given to me.’


An 81% ‘usually’ and 0% ‘no’ outcome suggests that my ongoing formative assessment cycle keeps every student both challenged and capable of success.

One surprise was an unexpectedly positive response to ‘My teacher allows me to demonstrate my understanding in various ways.’


Through the first half of the school year, providing multiple pathways to understanding and accepting diverse assessment artifacts have been areas I feel I have neglected. However, my students seem to think otherwise, so perhaps what I have been doing has been satisfying for them.


Negatives

The first disturbing result in the student survey was the response to the statement ‘Students are respectful to each other in my class.’


The majority of students responded ‘sometimes’, indicating that I could be doing more to foster respect in my class. Fortunately, they don’t seem to think that I lack respect.

My biggest disappointment is the mere 62% ‘usually’ response to ‘My teacher teaches about and demonstrates the Learner Profile attributes.’


Planning and executing IB Primary Years Programme units of inquiry has been my greatest frustration this year as I adapt to a new teaching environment. In the past, I have been very careful to explicitly teach the elements of the PYP embedded within units of inquiry. Thus far in this school year, such careful planning and execution simply hasn’t happened.

Fortunately, my class is in the formative stages of a new unit, so it’s a perfect time to bring the IB Learner Profile attributes back to the forefront. To get started, I’ll access another data source, our Learner Profile reflections, by looking at the analytics.

This data visualization reveals that, at this time, according to their self-assessments, Caring is my students’ least developed Learner Profile attribute. This evidence also supports their feeling that they are not always respectful to each other in class.

By making Caring a focal point in our next unit of inquiry, I hope to address both issues.

Interesting

The response to the statement ‘I feel free to ask and answer questions.’ is the most perplexing to me.

Contrasting that with the more favorable response to ‘My teacher gives me help when I need it.’ suggests some confusion.

In general, school has traditionally been a place where teachers have the answers to students’ questions. While that is a culture I reject, it doesn’t mean my students have not grown up with that mindset.

As in Dan Meyer’s TED talk, Math class needs a makeover, I aim to promote ingenuity and inquiry by being ‘less helpful’. Students constantly ask me questions that could be researched in a book, on the Internet, or by asking their peers. For example, I almost never answer spelling questions for students.

I also avoid the traditional hand raising routine that most classrooms follow, preferring to call for responses randomly or in an open forum setting.

So it’s little surprise that many only ‘sometimes’ feel free to ask questions since they are likely still processing the cognitive dissonance of a new culture of ‘less helpful’ classroom participation.

Conclusion

This is an interesting exercise, although doing it any more than once per year would be too much. I can also see how it might be difficult to compare results year to year, as the perspectives of different classes can vary so widely.

I am happy to take away some actionable data and look forward to seeing what results it might lead to.