Before we get into this episode, are you subscribed yet to the free weekly Teacher bulletin? Get a weekly wrap of our latest content straight to your inbox. Join our community by clicking on the sign-up button at our website, teachermagazine.com.
Thanks for downloading this podcast from Teacher, I’m Andrew Broadley. The OECD’s Programme for International Student Assessment, or PISA, measures the knowledge and skills of 15-year-old students in mathematics, science and reading, and how well they are prepared to use these to meet real-life opportunities and challenges. Since late last year, the OECD has released their findings from the latest assessment cycle – PISA 2022 – over a number of reports. At Teacher, we’ve been diving into some of those reports, which you can read more about at teachermagazine.com.
Today, I’m joined by ACER Senior Research Fellow, and the National Project Manager for PISA, Lisa De Bortoli, to discuss some of what we’ve learned about the experiences of Australian students, based on the latest PISA cycle. We’ll be discussing topics such as whether Australian students are getting better or worse at mathematics, how they feel about the disciplinary climate in their classrooms, and whether or not Aussie kids are creative thinkers. Let’s begin.
Andrew Broadley: Thanks for joining us, Lisa. To start things off today, can you provide a bit of background to PISA, and what sets it apart from other assessments?
Lisa De Bortoli: The Programme for International Student Assessment, otherwise known as PISA, measures 15-year-old students’ ability to apply their knowledge and skills in maths, reading and science to real-life situations and challenges.
Since 2000, the assessment has been conducted every 3 years, except for the last cycle, when the assessment was postponed by a year due to the pandemic.
In PISA 2022, there were 81 participating countries and economies, and the students who took part represented around 29 million 15-year-olds.
PISA allows for cross-country comparisons and benchmarking of educational outcomes. It gives each country’s education system the ability to identify its strengths and weaknesses and also to learn from other countries.
To put PISA in context – in the Australian education system, PISA is one of the assessments in the National Assessment Program that evaluate the educational outcomes of students across the country. Within the National Assessment Program (or NAP, as we know it), you might be familiar with NAPLAN, but there are also some NAP sample assessments and NAP international assessments, and that's where PISA comes into play. It's one of the international assessments, along with TIMSS (the Trends in International Mathematics and Science Study) and PIRLS (the Progress in International Reading Literacy Study).
Now, while most of the NAP assessments assess specific content knowledge or curriculum-based skills, PISA (as I mentioned previously) assesses how students apply what they've learned at school in unfamiliar contexts. There's also another difference between most of the NAP assessments and PISA, and that’s that PISA uses an age-based sample rather than a year-level-based sample. This means that although most of these students in Australia will be in Year 10, some will be enrolled in Year 9 or Year 11. And the reason we have an age-based sample is that PISA is really looking at the effectiveness of education systems as students near the end of compulsory education, and 15 is really considered the age at which compulsory education is coming to an end.
AB: You’ve already touched on it briefly there in your answer, this latest PISA cycle was pretty unique, right? The assessment period was longer, it was far more disrupted – with school closures and remote learning – how has that impacted the findings? You know, what trends or common themes have we seen in this latest cycle?
LDB: I think we have to start off by acknowledging the impact that the COVID-19 pandemic had on learning disruptions, recovery efforts and wellbeing. We know there were varied learning disruptions across countries and that some schools adapted better to remote learning than others and this would have influenced overall performance.
We know that school efforts to mitigate learning loss – such as having targeted interventions and tutoring programs – played a role in shaping the outcomes for PISA 2022. And we also know that the pandemic had an impact on mental health and wellbeing, with some students reporting higher levels of stress and anxiety, which could have influenced their academic performance.
So, if we look at performance between consecutive PISA cycles up until PISA 2018, the OECD average score never changed by more than 4 points in maths or 5 points in reading. But, if we actually look at the [changes in] performance between PISA 2018 and 2022, it does show a different picture. The average performance in OECD countries declined by 15 points in maths and 10 points in reading. And, interestingly, the average performance in science didn't actually change over the 4-year period.
And to help contextualise when I'm speaking about ‘points’: we can actually convert score points into years of learning – [just over] 20 score points equates to about one year of learning. So, what we've found is that, for example, the 15-point decline across the OECD in maths between 2018 and 2022 equates to around three-quarters of a year of learning.
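To make that rule of thumb concrete, here is a minimal sketch of the conversion (assuming the roughly 20-points-per-year figure quoted above; the differences plugged in are simply the ones mentioned elsewhere in this conversation, and the constant is an approximation rather than an exact figure from the PISA reports):

```python
# Rule-of-thumb sketch: assume just over 20 PISA score points corresponds to
# roughly one year of learning. This is an approximation, not an exact constant.

POINTS_PER_YEAR_OF_LEARNING = 20

def points_to_years(score_point_difference: float) -> float:
    """Convert a PISA score-point difference into approximate years of learning."""
    return score_point_difference / POINTS_PER_YEAR_OF_LEARNING

print(points_to_years(15))   # ~0.75 -> the OECD-average maths decline, about 3/4 of a year
print(points_to_years(100))  # ~5.0  -> the socioeconomic gap mentioned later
print(points_to_years(60))   # ~3.0  -> the disciplinary climate and perseverance gaps
```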
AB: Wow, okay. So, that’s pretty significant – across the OECD we’ve more or less seen a sharp decline in performance. Now, is that the same for Australia?
LDB: The results show that between 2018 and 2022, Australia’s results for maths, reading and science remained stable. But we do know, for example, that over the longer term there actually has been a larger decline [in Australia’s average performance].
And the other thing is that when we're talking about performance, we're not only looking at performance in terms of averages; in PISA we're also able to look at students’ performance using proficiency levels. These basically provide a description of the skills students are able to demonstrate, and there are proficiency levels in each of the maths, reading and science scales.
So, at the high end of the scale, we've got the high performers, and these are students who have typically attained proficiency Level 5 or 6 and, generally, these are the students who consistently apply their advanced knowledge and skills in a variety of real-life situations. In Australia, 12% of students were high performers in maths and 13% [were] high performers in science. Over time, the percentage of high performers has decreased by about 8 percentage points for maths, 5 percentage points for reading, and 2 percentage points for science.
And at the other end of the scale, we have the low performers, and these are the students who fail to reach Level 2. Level 2 is really defined as the baseline of proficiency at which students begin to demonstrate the competencies that will enable them to engage effectively and productively across a wider range of situations. And for Australia, about a quarter of students were low performers in maths and around 20% were low performers in science and reading. And over time, the percentage of low performers has increased. So, we now have more low performers in PISA; there's been an increase of about 12 percentage points for maths, 9 for reading, and 7 for science.
AB: So, while Australia has managed to, I guess, somewhat buck the trend and hold steady – at least for the latest PISA cycle – it’s not exactly a win, right, because we do have some worrying trends at both the top and bottom ends of the scale. I do, kind of, want to delve into that a little bit more. Because, when we look at high performers and low performers, you can’t help but see the link between performance and things like socioeconomic status, for example. So, how does Australia look in this area?
LDB: I think that, you know, equity in education is about schools and their education systems being able to provide equal learning opportunities for all students, and that's regardless of their background, or their economic, or their social circumstances. And one of the key aspects that PISA investigates is equity in education.
At the international level, PISA looks at how socioeconomic status, gender and immigrant background affect student performance. And at a national level, we also look at geographic location and First Nations background, and how those affect student performance.
If we look at socioeconomic background – in PISA it's measured by looking at students’ parents (their occupational status and their education) and also some of the students’ home possessions. So, if we look across all the assessment domains – maths, reading and science – the results show that students from higher socioeconomic backgrounds perform at a significantly higher level than students from lower socioeconomic backgrounds; and this difference is about 100 score points, which is around 5 years of learning. And if we look at the results across the OECD countries, the difference is about 93 points – if we look, for example, at maths in particular.
And we know that disadvantaged students in OECD countries are, on average, 7 times more likely than advantaged students to not achieve basic maths proficiency. And look, this is also the same for science. When it comes to reading, the odds of low performance are more than 5 times higher for disadvantaged students than for their advantaged peers.
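As an aside, a statement like ‘the odds of low performance are more than 5 times higher’ refers to an odds ratio rather than a simple ratio of percentages. Here is a small illustrative sketch of how such a figure is calculated, using invented proportions (they are not the actual PISA 2022 results):

```python
# Illustrative only: the proportions below are invented to show how an odds
# ratio is computed; they are not taken from the PISA 2022 data.

def odds(probability: float) -> float:
    """Odds of an outcome given its probability."""
    return probability / (1 - probability)

# Suppose (hypothetically) 40% of disadvantaged students and 10% of advantaged
# students fall below proficiency Level 2 in reading.
odds_ratio = odds(0.40) / odds(0.10)
print(round(odds_ratio, 1))  # 6.0 -> the odds of low performance are about 6 times higher
```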
When we look at equity, we can also look at some of the other demographic groups. For example, if we look at gender, we know that males tend to perform higher than females in maths, and it's the other way round for reading, where females perform higher than males, and there are no gender differences in science. Between 2018 and 2022, the only difference was for females in reading, where performance decreased by 10 points.
And if we look at First Nations students, from the beginning of PISA (when we first started assessing First Nations students) they have performed lower across maths, reading and science compared to non-First Nations students. And this difference is about 80 score points, which equates to around 4 years of learning.
I think that we do need to recognise that there is a small group of students in Australia – about 10% of Australian students, which is similar to the OECD average – who are academically resilient. These are students who come from a disadvantaged socioeconomic background but are actually high performers. This is an area of interest because what we want to do is explore the reasons why these students are defying the odds.
AB: Mmm, that’s interesting. I know that Teacher editor Jo Earp actually discussed something similar in a recent podcast special with ACER CEO Geoff Masters. That podcast series looked at some of the findings from Geoff Masters’ research into world-class learning systems, and how they create the conditions for all students to learn successfully. For those interested, I’ll be sure to put a link to that in the transcript of this episode.
If you’re enjoying the Teacher podcast channel, have you subscribed yet to the free Teacher bulletin? It’s the best way to get our latest stories straight to your inbox. In addition to a weekly wrap up of our latest content, subscribers to the bulletin can receive Teacher Trending, where we share our top 5 pieces of content on a trending topic, and special editions where we share a bonus piece of content with you. The sign-up process is easy – just visit our website, teachermagazine.com, and click on the sign-up button on the righthand side of our homepage.
AB: I do want to move on a little bit from our academic results, because I know that PISA looks at so much more than that. For example, each PISA cycle includes a student and principal questionnaire, and one of the, I guess, big talking points from the latest PISA cycle has been Australia’s questionnaire responses. In particular, there’s been some talk about what students had to say about the disciplinary climate in classrooms. Now, students in Australia actually rated it quite unfavourably. Can you provide some context to that?
LDB: So, the purpose of collecting this background data is to actually help inform what's happening from a performance perspective. And as you mentioned, PISA does collect information on the disciplinary climate; and students were asked about how frequently certain behaviours occurred in their maths classes.
So, they were asked about whether students don't listen to what the teacher says, whether there's noise and disorder, whether the teacher has to wait a long time for students to quieten down, whether students can't work well, and whether students don't start working for a long time after the lesson begins. And there are also a couple of questions where students are asked about getting distracted by using digital resources, such as their smartphones or websites, and about getting distracted by other students who are using digital resources.
As you mentioned, the results show that Australian students reported a less favourable disciplinary climate than the OECD average. If we look at that from a perspective of comparing states and territories: students in Tasmania reported a less favourable disciplinary climate than students in all jurisdictions except the Northern Territory. And students in New South Wales and Victoria reported a more favourable disciplinary climate than students in Tasmania, South Australia and the ACT.
And if we look at disciplinary climate across different demographic groups, we find that students from advantaged backgrounds, students attending schools in major cities, non-First Nations students, and also first generation and foreign-born students report a more favourable disciplinary climate than their peers.
And if we look at the relationship between disciplinary climate and maths performance, we find that those students who are in the highest quarter – so those are the students who report the most favourable disciplinary climate scores – on average, perform about 60 points higher, which is around 3 years of learning, than those students in the lowest quarter (so those are the students who are reporting the least favourable disciplinary climate).
AB: Wow, okay so there’s a really clear link there, right, between the disciplinary climate of a classroom, and the academic performance of the students in that class. Now, of course, that wasn’t the only big talking point from the questionnaire responses. For example, for the first time in PISA, student curiosity was measured. Now, the report found that student curiosity has a pretty major impact on academic performance, and it also found that Australian students are some of the most curious?
LDB: Look, they are. And I think curiosity does enhance academic performance, because it drives students to delve deeper into subjects, leading to a better understanding. It also benefits students by encouraging active learning, boosting their critical thinking, increasing motivation, supporting lifelong learning, and helping to build resilience.
Curiosity was measured for the first time in PISA 2022; it looks at students’ behaviours and asks them questions about developing hypotheses, knowing how things work, and learning new things as well. And, when we compare the different constructs that PISA examines, curiosity shows a difference of about 80 score points (which is around 4 years of learning) between students with the greatest levels of curiosity and students with the lowest levels of curiosity.
And if we look at, for example, the different demographic groups, we find that males, students from advantaged backgrounds, students attending schools in major cities, non-First Nations students, and foreign-born students reported more curiosity than their counterparts.
AB: Curiosity has a huge hand in a student’s academic performance, as you’ve just mentioned there, but, I’m guessing that it’s also pretty important for students to follow up on that curiosity, particularly when, perhaps, the answers aren’t so easy to find.
LDB: PISA also looked at perseverance. Curiosity and perseverance are deeply interconnected, and they often reinforce each other in the pursuit of learning and achievement.
And if we look at the relationship between perseverance and performance, those students who reported greater perseverance scored around 60 points higher (or around 3 years of learning) than students who reported the least perseverance. And those students who demonstrate greater perseverance will, for example, tend to keep working on a task until it's finished, they'll apply additional effort when work becomes challenging, they'll finish tasks that they've started even when they become boring, they don't stop when work becomes too difficult, and they also tend to be more persistent than most people they know.
And from an international perspective, Australian students reported having less perseverance than students in Austria, Switzerland, Chinese Taipei and Singapore, but reported greater perseverance [than] students in Poland, Hong Kong, the United Kingdom and New Zealand.
AB: Before you go, Lisa, I do quickly want to talk about the most recent PISA report that’s come out. Now, this report looked at creative thinking – which again, was actually a first for PISA. So, can you just provide a little bit of an overview of, I guess, why they’ve looked at this metric, and what the findings were?
LDB: One of the interesting things that PISA does is, in every cycle, in addition to assessing those traditional core subjects of reading, maths and science, PISA also aims to assess what they call ‘an innovative domain’. It's something that's considered a 21st-century skill – an assessment that captures a broader range of skills and competencies that are increasingly relevant in the modern world. And in PISA 2022, as you mentioned, the innovative domain was creative thinking.
And when we're assessing creative thinking in PISA, we're really wanting to see how well 15-year-old students can engage in creative thinking processes, and this includes their capacity to generate diverse and original ideas, evaluate and improve those ideas, and also apply creative solutions to real-world problems.
So, in terms of creative thinking, it's essential for innovation and for addressing complex challenges in a rapidly changing world. Thinking creatively involves coming up with original and effective solutions to problems and generating new ideas. So, for example, a student thinking creatively might invent a new method to study for exams by combining, say, digital flash cards with a memory game, so it makes the learning process more engaging and [effective].
And, as you mentioned, in June this year, the OECD released their report on creative thinking, and this involved students from 64 countries and economies that participated in the Creative Thinking Assessment. Australia performed really well – we were only outperformed by Singapore, we were on par with Korea and Canada, and we performed higher than the other 61 participating countries and economies.
Interestingly, females from all countries and economies performed higher than their male peers.
And we're currently preparing a national report on creative thinking that's going to be released at the end of October. And, similar to our national reports, it will focus not only from an international and a national perspective, but it will also look at the performance of the different demographic groups and also look at the contextual data from students, principals and teachers.
AB: Amazing. I look forward to reading that when it comes out. Thank you, Lisa.
LDB: Thank you.
Thanks for listening, that’s all for this episode. If you’d like to keep listening, you can find over 300 podcasts in the Teacher archives, wherever you get your podcasts from. If you’d like to learn more about the latest PISA results, be sure to visit teachermagazine.com. And remember to please leave a rating and review on this episode, as it helps more people like you to find our podcast. Thanks again.
Before you go, are you currently subscribed to the weekly Teacher bulletin? It’s a free weekly wrap of our latest content straight to your inbox. Join the more than 40,000 educators who are already part of the community by clicking on the sign-up button at our website, teachermagazine.com