Measuring what matters

Exams and university entrance are now only part of the whole picture of what a good education should look like.
Dr James Dalziel
Former Head of East Campus

James joined UWCSEA in August 2006, moving from the Canadian International School, where he had worked since 1999. Prior to coming to Singapore, he worked in Canada as Special Programmes Coordinator with the Durham Board of Education in Ontario.

James has taught in a variety of educational settings, including special needs classrooms and outdoor education programmes. He holds Master's and Doctoral degrees in Education from the University of Western Australia, with a focus on the leadership and management of change within international schools, and a Master's in Business Administration and Leadership from the Helsinki School of Economics in Finland, with an emphasis on leading for innovation. He is an active workshop leader, author and trainer in the areas of leadership and international school operations management.

James and his wife Nancy have two children, Claire and William, who are learning their love of the outdoors from their parents. In July 2017, James and his family relocated to Switzerland, where James is now Executive Director of Education, Continental Europe at GEMS.

 


Measuring the success of a learning programme depends on what it is trying to achieve

In the early 1970s, Bhutan’s fourth Dragon King, Jigme Singye Wangchuck, introduced the term ‘Gross National Happiness’ to impress on the world the need to expand the way we measure a country’s success. Decades later, many other heads of state have initiated an expansion of the traditional measure of Gross Domestic Product per person towards a series of metrics that provide a more accurate representation of how a country, and its citizens, are doing as a whole. The question of ‘how we are doing,’ and the call for a shift in our understanding of what matters and what we should measure, are also being played out within our schools.

Our understanding of what matters in education has changed dramatically in the last few decades. We have shifted from a focus on exams and university entrance towards a broader focus on what knowledge, skills and qualities our students need to develop in order to be successful. Exams and university entrance are now only part of the whole picture of what a good education should look like.

There is a common analogy used in business schools that can be helpful as we think about how to measure the success of a particular learning programme. When we purchase a power drill from the hardware store, we are investing in more than a single item: we are investing in the promise of a hole. The drill is just a means to an end, and the value of the drill resides solely in its capacity to produce the hole. So it is with a school’s learning programme. It is a means to an end, the promise of an education. But the value of the learning programme can only be judged by the impact it has on our students. So, if we say that our learning programme will educate individuals to embrace challenge and take responsibility for shaping a better world, how do we know that we are achieving this objective? What data do we use to measure it?

Throughout the course of an academic year, schools collect an enormous amount of data. In order to know how our students are doing, we collect evidence of their learning in a variety of forms. This evidence can be organised into three broad categories: academic results, participation levels and measurements of skills and qualities.

Data regarding academic results is easily gathered: internal and external assessments allow us to measure how well a student is doing against agreed standards and benchmarks. But this data is limited, partly because academic testing sometimes tells us more about the child’s performance than it does about the learning that is taking place in the classroom, but mainly because it is such a narrow measure of learning. Academic results do not take into account either the holistic nature of the programme at UWCSEA or the breadth of our ambition and objective.

Participation levels tell us something about how involved a student is with the learning programme as a whole and are a useful guide to understanding the experience an individual student is having. However, they are only a measure of success if participation is the purpose of the programme. At UWCSEA, becoming involved is not enough; we must ensure that participation is having an impact on learning. If the objective of the sports programme is to develop teamwork and commitment in students, then knowing how many of them participate in sports does not tell us anything about the success of the programme (or the students).

But what of the third type of measurement: the development of skills and qualities as described in the UWCSEA profile? Unfortunately, while this arguably tells us more about the impact of the education we are offering, it is also more difficult to measure than academic achievement or levels of participation.

Consider a skill such as critical thinking: it can readily be embedded within a learning programme, and we can measure a student’s critical thinking within a discipline such as mathematics. But to be sure that a child is becoming a critical thinker, we would need to see that skill transferred into other areas, such as scenarios in outdoor education or service. Evidence of broad competence as a critical thinker can only be built up by monitoring students over years of application and in a wide variety of settings.

The qualities within the profile present a different set of challenges. How can we measure the development of a quality such as ‘principled’ in students? A common scenario illustrates the issue. Two students walking across the school grounds stop to pick up pieces of litter (observable actions); both place their litter in the bin (observable results). Judging by observable actions and results alone, we could conclude that each has demonstrated the quality of ‘principled,’ and put a check in that box on the next report. In reality, each student in this scenario may be motivated by very different things: one may be picking up litter because they know it is the right thing to do, the other because their homework for the day is to pick up a piece of litter. To know whether students are really developing the quality of ‘principled,’ we must therefore invest in the cumbersome and imperfect process of trying to understand an individual student’s motivation.

These difficulties should not prevent us from trying to measure the immeasurable. At UWCSEA, we have designed a programme that offers multiple, age-appropriate opportunities for students to develop important skills and qualities, and for teachers and students to assess the outcomes. As the popular axiom states, “If you want to create change, you need to meet people where they are, and take them where they need to be.” We also need a series of measures so that we know when they have arrived.

The College strategic plan includes a major review of our assessment and reporting. This will include the creation of a system that allows teachers to plan lessons, record student activity and report on student achievement against agreed standards in all five elements of the programme. While reporting may look different in each of the five elements (assessment in service, for example, will probably differ from assessment in mathematics), it is important that we are accountable for student learning in all areas. The system should also allow for student reflection, particularly on those qualities in the UWCSEA profile that students develop throughout their time at the College. We are about one third of the way through this review; it is a complex process, but it will have a positive impact on student learning.

Sources

“Measuring What Matters.” The Economist, September 2009.

Hubbard, Douglas W. How to Measure Anything: Finding the Value of ‘Intangibles’ in Business. John Wiley and Sons, 2010.

Hattie, John. Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Routledge, 2009.

9 Jun 2013