Mr G Online
Oct 27

I’ve been involved in a lot of discussions and professional development sessions this year, in my role as Maths lead learner, that have revolved around the use of standardised testing and the role of data in improving outcomes. Twitter and education blogs are awash with concerns about the “dumbing down” of education today because of the direction to “teach to the test” so our schools’ publicly available data can improve (which, phrased more positively, would be “so our student outcomes can improve”).

At the same time, we are led to “worship at the altar” of the Ken Robinsons of the world, who are leading us in this inevitable education revolution of personalised learning, creativity, student autonomy, and voice and choice. Entire education systems have published documents directing us to follow this revolution (which, to be clear, I am a big proponent of), only for this lofty goal to consciously or unconsciously hit the proverbial brick wall when our latest NAPLAN (insert your own country’s national testing program here) results come a-visiting to inform us we didn’t score as well as we needed to. Suddenly, we as a system turn our curriculum into a series of “how-tos” in comprehending test questions.

Can we get the balance right? Should we balance it? Is it possible to reconcile the unfortunate reality of needing decent test scores to feel good about your school’s achievements with our far more worthy, yet politically undervalued, aim of developing creative, critical-thinking, community-minded, self-driven, connected, lifelong learners? Can we counteract the undoubted power of the once-a-year standardised test score with our own data reflecting year-long achievements? In a single reflective blog post I can’t answer the question definitively, nor do I have all the data the research experts can throw at us to support their viewpoint. Nevertheless, here are my views, and my views alone, on some of the burning issues we face in our ideological battle between standardisation and personalised learning.

The question about questions
One of the biggest issues is this whole perception that we must “teach to the test.” How often have you been directed to give the children more practice in comprehending the types of questions they will face on the test? The mistake we often make here is that we try to teach the children strategies for answering questions: looking for key words, identifying reasonable and unreasonable answers. What we fail to do is look at the curriculum content behind the question and analyse how effectively we are teaching that content.

In our Mathematics Leadership team, we are spending a lot of time doing just that. We are making use of the data and the actual test questions to find possible gaps, not just in content, but also in the way we present that content. What language is used in the question – do we use that language? How was the mathematical concept represented – do we represent it that way? (An example from this year’s NAPLAN test: “I doubled a number, then subtracted 4. I was left with 8. What was the original number?” Have we used this worded, problem-based representation, or have we only used function machines or algebraic forms? It’s a perfectly reasonable question that we had simply never presented to our students in that way.)
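For reference, the working that question expects, calling the original number n:

2n − 4 = 8   (double the number, subtract 4, leaving 8)
2n = 12
n = 6

The point isn’t the difficulty of the arithmetic; it’s whether students have ever met the concept dressed in those words.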

So our aim in our team is not to “teach to the test” but to understand how we can teach effectively what might be in the test. That might sound like a subtle rephrasing of the same thing, but it’s not. We need to improve our knowledge of how concepts can be represented. We need to build the vocabulary and contextual range of teachers and students so that in our Maths classes we can provide all the possible thinking experiences that may arise in “that test” – not by doing practice tests, but by incorporating the language and representations found in these tests into our normal, engaging lessons. The same applies to Literacy tests. Do we have a great enough range of questions in our day-to-day English teaching that are similar to the types of questions being asked in the test?

We also use the data from standardised testing (not just NAPLAN but also an On Demand Testing system) to plan for personalised learning for our students in Maths. We are able to identify student skill levels in different areas of the Maths curriculum and cater for differing needs. I’ve written about this elsewhere on my blog.

Having said that, I still have concerns about how the impost of the testing regime affects teaching, especially in Maths. As an education system, we have spent years extolling the virtues of Habits of Mind, Multiple Intelligences and Learning Styles. Mounds of research have emphasised that today’s students are very much visual learners. We have implemented technology integration across curriculum areas. And yet, despite all this, the ONLY data collection system we rely on is heavily skewed towards a linguistic test: word problems and written responses. Maths is not just word problems. Life is not full of word problems and comprehension questions.

We have to get the balance right so that we have a curriculum that values verbal, physical, hands-on, real-life experiences above test questions. Primary school isn’t just a means of progressing to the textbook world of secondary school, where the student is surrounded by word problems and nothing else. We still need to problem-solve, not just answer word problems.

And we can’t just put all our eggs in the standardised-test basket in terms of assessment. There is a real danger that teachers will lose faith in their own judgement, their own assessment tasks, and the quality work the children produce before and after “that test”. It’s all too easy to accept the score of the national test as a reflection of a student’s achievement, especially when it is public and known to parents. While data from tests can be effective in planning programs for extension and intervention, the tests are still just an indication of performance on that given day. We have to trust the worth of all the other data we collect during the year – the students’ work – which means we have to make that data work better for us. This leads to my next point.

Data vs. Data
Two things are evident about the data from standardised tests:
1. it gets analysed; and
2. it gets publicised.
Therefore, we have to make sure our other data can be analysed effectively, and we have to make it public so it can be used in a positive light.

Consistent, methodical use of checklists, rubrics, annotated samples, assessment spreadsheets and the like, coupled with effective means of sorting and presenting the data so that it can be analysed and used for improvement, is the first part of the process. The second is counteracting the power of the publication of standardised test results. We can easily bemoan the unfairness of the misuse and misrepresentation of the data. We can cry foul at the cold, limited process of league tables making us look like underachievers because we’re one percentage point away from moving from below average to average. Or we can fight back and proactively publicise our year-round data, showing the students are far better achievers than that one-off test suggests.
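As a minimal sketch of what “making our data analysable” could look like in practice (the file name, column names and scoring scale here are hypothetical, not drawn from any particular school system), a short script can roll a year’s worth of assessment records up into per-student growth summaries for each strand:

```python
import csv
from collections import defaultdict

# Hypothetical CSV of year-round assessment data, one row per task:
# student, term, strand, score   (score assumed to be on a 0-100 scale)
scores = defaultdict(list)
with open("assessment_data.csv", newline="") as f:
    for row in csv.DictReader(f):
        scores[(row["student"], row["strand"])].append(
            (int(row["term"]), float(row["score"]))
        )

# Report each student's growth per strand across the year, so year-long
# progress can sit alongside the one-off standardised test score.
for (student, strand), results in sorted(scores.items()):
    results.sort()  # order results by term
    first, last = results[0][1], results[-1][1]
    print(f"{student:15} {strand:12} start={first:5.1f} "
          f"end={last:5.1f} growth={last - first:+5.1f}")
```

Even a summary this simple puts the year-round evidence into the same shareable, comparable form as the published test results.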

This is where the power of the Internet comes in. If you’re concerned that prospective parents are put off by the red mark on the website showing national test results, start advertising your school’s achievements online. Utilise the school website and class/student blogs to post exemplary work publicly. Use digital portfolios to showcase student progress year round, so parents can compare what their child has achieved all year with that test from May. Kill off the bad publicity of a low ranking on the test-score website with the good publicity of getting student work published in public forums like local newspapers, community radio or other education-based websites. Enter Writing, Maths or Arts competitions that judge student achievement not by 40-minute test papers but by in-depth thinking and creativity, and celebrate the results. Get your school involved in community, local, national and global projects to show what your students can achieve beyond a multiple-choice question booklet.

One set of data should not outweigh multiple sets. Be organised. Be proactive. Know what data you have. Make it clear and accessible. Stop saying “our children are better than these results” and start proving they are. We live in a data-driven age of education. Take the wheel yourself instead of being driven by external data.

There is a place for standardised testing. The data it provides can help us plan for personalised learning. We – and by “we” I mean teachers, principals, education departments, parents and, most of all, political leaders – have to get the balance right. We can’t talk of 21st-century education revolutions and then be judged by 19th- and 20th-century methods. Politicians need to LISTEN to us and TRUST our data. Schools and education systems have to create, collect and publicise data that CAN be TRUSTED. That’s the balance we need to find. I’m no expert. I’m just a teacher with an opinion. Sometimes we need to be heard.

What do you think? Have you found the balance between the standardised national test and a creative, purposeful school curriculum? Join the conversation.


10 Responses to “Can we reconcile standardised testing with Personalised Learning?”

  1. Brendt Evenden Says:

    I agree with you, Mr G, that we need to be proactive in making it known what our students are really learning and achieving. Before I got to your great list of ways schools can do this, the idea popped into my head of student projects that connect students to the community, so that word of mouth becomes a form of publicity. Love reading your blog and tweets. Keep up the great work!
    Mr E

    • mgleeson Says:

      Thanks, Brendt. NAPLAN data can be useful for whole-school planning but shouldn’t be seen as a true reflection of a school’s standard. That’s why every school should find ways to publicise its true achievements over the year. But we also have to make sure we are painting an accurate picture.

  2. Viv Says:

    It is a massive conflict in education today, and I do not pretend to know the answer. There has to be a balance, and I agree we need to integrate the language in the NAPLAN tests more broadly across our lessons.
    I used to mark the SNAP test quite a lot and taught the School Certificate. I admit that I did teach to the test a lot with my Year 10s, particularly because they were a middle-ability class with no intention of doing higher maths at any stage.
    Hence, for every topic test, I would find old questions from the School Certificate exams and give them a pre-test, and then another ‘real’ topic test using much the same questions, worded in the same way, but with different figures, etc.
    When they got to do the SC, they would perform a lot better than if we had not done this.
    Is this good or bad? I always thought the SC was just a jumped-up SNAP test anyway, and the content I had to teach them was basically real-world numeracy. There was very little algebra in the old papers, and I did not concentrate on that topic a great deal. Lots of consumer arithmetic, a lot of data and measurement, etc. This is what they needed in the ‘real world’ anyway. So I am not as big a critic of standardised testing as some, in relation to maths at any rate, with those middle-ability kids.
    In English and History I like to get very creative and student-centred, but of course that is just my own preference :)

    • mgleeson Says:

      As primary school teachers, we probably have an even bigger problem with testing than you do in secondary school. Testing is part and parcel of secondary (for better or worse ;) ), but for a Grade 3 kid to walk into a standardised test after three years of hands-on, orally based support and teaching, it’s a big ask. We have so much to develop in these kids, and spending Term 1 “preparing them for NAPLAN” takes time away from teaching for permanent improvement. I’m not so much an opponent of the concept of standardised testing as I am an opponent of the system’s obsession with the results and the reaction to them. We’ve got some good outcomes out of using the on demand testing this year, mainly because its instant reporting means we can respond to student needs straight away, not wait five months to find out. That’s why I want schools to get better at showcasing and analysing their own data rather than just NAPLAN’s. It should be about student improvement, not school comparison.

      • Viv Says:

        Yes, the comparison aspect is horrible and unfair. Nor do I like having to devote a whole term to practising for the test.

        The pre-test when students come into the school is good for identifying gaps in basic numeracy and literacy, and then gearing your teaching to that. A secondary school I taught at had a special class for Year 7 students who had been deemed to be lacking in literacy and numeracy – a low-ability group.
        Part of my job was to run the Counting On numeracy program with the kids in this class and withdraw a group of five or so to work with every day.
        I had to pre-test them and found a child whose numeracy skills were better than most in the ‘gifted class’. His rapid capacity for accurate mental arithmetic was phenomenal. He was in the class because of his difficulties with literacy, but he couldn’t be put in another maths class, so his teacher was able to give him work to extend him. This was a great outcome of conducting a numeracy test first up.
        I was then able to test them again after a few months, and their maths teacher had a lot more information about the students’ numeracy skills and could build on them.
        Many students have poor numeracy and are not identified when they enter high school, and the number topic is not given enough emphasis after a brief review in early Year 7. Pre-testing is much more efficient than NAPLAN in that you have information straight away.

        • mgleeson Says:

          Spot on, Viv, about the lack of emphasis on Number past Year 7. I’ve tutored students from Years 7–11 in the past, and their biggest issue each time wasn’t a lack of understanding of the high-end concepts – it was basic number knowledge. One Year 11 student knew all the steps of the process for complex equations but was making errors and stressing because she was still counting on her fingers! She obviously needed that addressed at primary level, but at secondary there still has to be a way to identify the issue and do something about it. This is what focused, personalised, timely testing can address. A standardised national test with results delivered five months later doesn’t pick that up.

  3. Donna Says:

    Beautifully said. Your article has given me an excellent starting point for a discussion I have organised with my Year 7 and 8 teachers. These are all issues that I want us to talk about. Some Year 7 teachers are already on your wavelength; I hope we can get the rest on board. Thanks.

  4. Brett Groves Says:

    I very much like the idea of getting alternative analyses of normative assessments in front of parents and the community – I will pinch that one if I may.
    I think the problems we face are not the result of standardised testing per se, but rather the corporatisation and politicisation of education in general. NAPLAN data is incredibly useful as a diagnostic and planning tool for individual teachers and whole-school planners. After all, if you are planning a differentiated curriculum (as you should!), how can you do this without a reasonably accurate idea of the reading levels in your cohort? Of course you can’t, other than by guesses made from observation. And that is the point: NAPLAN and its ilk are diagnostic tools that are being used as high-stakes summative assessments. Instead of informing the relationship between learners and educators, they are informing the relationship between parents and schools, and in fact whole communities and schools. An utterly ridiculous situation.
    How has it come about? It came about because we as educators rolled over; the paradigm that teachers had to prove their own worth against a narrow metric by which education is measured (and the absurd acceptance that this is somehow valid) swept across the education landscape in the 90s like a tsunami, and we made not a peep. Driven, I have to say, largely out of America – a nation that, if you remove a couple of key states like California from its achievement data, languishes lower than Eastern Europe. Even taking the US as a whole, the last lot of UNESCO data doesn’t paint a pretty picture of the system.
    This misuse of essentially internal data reflects a wider move by regional educational bodies towards modelling themselves on the corporate world, under the erroneous assumption that corporations can inform educational practice. As with economic rationalism, you would be hard-pressed to find actual real data to support that proposition, yet it persists. Cui bono? Politically, it’s highly expedient to be able to pick a number and say, hey presto, we moved education from 5 to 6 and my amplifier goes to 11! Far easier to do this than to open a dialogue with the nation about how education is complex, organic, holistic, hard to really measure and sometimes messy… a bit like life :-)
