Hello, Teacher editor Jo Earp here and I’m going to be your host for today, but before we get into this episode a reminder that if you’re looking for more free content from Teacher head over to the teachermagazine.com website. There are thousands of articles, infographics, videos and podcasts (including transcripts of all the episodes from all our series) in our archive – they’re all online, they’re all open access, and we publish fresh content throughout the week!
Thanks for downloading this podcast from Teacher – I’m Jo Earp. There’s so much potential for artificial intelligence and generative AI tools to support teaching and learning, but alongside these exciting opportunities there are challenges and risks. If you’re a teacher or a school leader, you may already be using or experimenting with AI tools – and you certainly won’t be surprised to hear that many of your students definitely are! My guest for this special episode is Dr Katie Richardson, Senior Research Fellow at the Australian Council for Educational Research. We’ll be talking about 3 broad areas where AI can enable improvements in education, and what that might look like. Also, what the tools can and can’t currently do, or replace. And we’ll discuss how teachers need to be careful and purposeful in using the different AI tools out there with their students to promote thinking and learning. Okay, let’s dive in.
Jo Earp: Hi Katie, thanks for joining Teacher for this special episode. Now, we're going to be chatting about the integration of artificial intelligence (that's AI, of course) into teaching and learning, both as a tool for students but also for teachers as well. Now, you're a Senior Research Fellow at ACER and you've been involved in a few panel discussions, workshops and webinars on this topic – so, really looking forward to getting your insights today. I thought, to kick things off then, AI and particularly the use of Gen AI (generative AI) tools is becoming more common now in all areas of life, including schools of course. And that's really exploded in the last couple of years; I'm thinking about the release of things like ChatGPT and then more recently DeepSeek. But it's not just that, is it, when we say AI, what do we actually mean?
Katie Richardson: Yeah, thanks Jo. In fairly simple terms AI refers to technology that mimics or simulates human intelligence. For example, it can mimic human activities such as learning, reasoning, problem solving, summarising a meeting, explaining concepts, etcetera. But it doesn't do this in the same way we humans do. So, it's not actual intelligence, it uses what we call neural networks and data sources to compile responses. So, in actual fact, it's limited to the information pool through which it's been trained – and that includes all of the limitations and biases within that pool of information. So currently, we know that it can't truly innovate, it can't create, it doesn't contextualise well, and it doesn't necessarily understand nuance.
JE: Yeah. So, it really does rely on how it's being trained; and we hear a lot about that don’t we, about the training of it and various aspects of that and copyright and so on. Is one of the difficulties that people are (you know, it's human nature), people are already looking for clear answers on things, but this is going so quickly. How much research and guidance is out there at the moment in terms of AI, in this K-12 education space?
KR: Well, it's starting to grow. You see, back in 2022-2023 there was this massive burst of, this flurry of, research when ChatGPT and a few other LLMs [Large Language Models] just suddenly burst into common use. And so, around that time, I think it was 2023, UNESCO published their Guidance for generative AI in education and research (Holmes & Miao, 2023), which is a bit of an interesting read, and that focuses on a human-centred approach to AI. Not long after that, or around about the same time, the Australian Government also published the Australian Framework for Generative Artificial Intelligence in Schools (Department of Education, 2023). I also note AERO [Australian Education Research Organisation] has done a bit of work in this area, and from ACER we've also been researching AI, and we've developed the PATH framework to support the use of AI for learning and teaching (Brazil et al., 2025).
But we also need to realise that generative AI as we know it hasn't been around for long, so there's a lot that we actually don't know. And this means that we still need to use AI really critically and think about the impact that it has, or may have, on learning and teaching. So, the thing that we need to keep foremost in our minds is that AI needs to support and enhance learning and not replace it.
JE: And you mentioned a few of those documents there – and for listeners, as always, I'll put links to all those documents and things that we talk about today in the transcript to this podcast, you can go to teachermagazine.com to find that. As you say, we've got to be critical users of this because it's clear that these tools are already being used by students outside of schools – and possibly in the classroom as well, depending on the teacher and the school, but definitely outside of schools. There are lots of educators out there who are also using the tools to help with teaching in the classroom, but also tasks like lesson planning, admin, things like that. So, then the question becomes how the tools can be best used. As with any tool, as with any teaching and learning tool, how can it be best used? I understand there are 3 broad areas then where AI can enable improvements in education. Those are efficiency, effectiveness and transformative learning opportunities. So, let's take the first one – what kind of things come under that efficiency then?
KR: So, efficiency is about streamlining. So, looking at how we can do tasks more effectively, efficiently, automating tasks. In the context of teaching this might include things like using AI to help us shape a lesson plan or a unit plan – to speed up the process, not to replace our thinking as teachers but to make it more efficient.
JE: The second one then – effectiveness, that's actually more about better outcomes, though, isn't it?
KR: Sort of. It's really about enabling the students to learn more effectively and a consequence of that may be better outcomes.
JE: Right.
KR: So, for example, we might see this in AI programs where feedback is generated to support the learning process.
JE: Yeah. So, effectiveness, that's the second one. We've had efficiency, effectiveness – the third area is transformative learning opportunities. What do we mean by that?
KR: Yeah. So, this is a really interesting space. It's about providing opportunities for students to interact with learning in ways that we previously couldn't in a real-world kind of environment. So, it's about simulation; so, for example, you've probably seen on TV shows doctors using simulations to practise things like operations, so that's one kind. Virtual reality, augmented reality. So, for example, one of the apps I've come across is a history program that's developed to help you see what ancient buildings used to look like when they were first built. So, historians and engineers worked together and used the evidence from the archaeological digs to then reestablish, or put together, what this building looked like way, way, way back thousands of years ago. Another one, for example, is another app that is about historical figures – and so, students can actually ‘have a conversation with’ (I guess in inverted commas) an historical figure, someone like Queen Elizabeth I or something like that, and find out about their life.
JE: Well, that's interesting because again the first thing I'm thinking is it’s what you said earlier, just going back to that, like, who's training this? Who’s training this? And where's this information coming from? What perspective is it coming from? And then also, you know, what are the questions they [the students] are going to ask, and are they getting the right information out of it? So, we'll talk about that after the break. Before we go on to that then, the other thing I just wanted to touch on with that transformative opportunities – I'm thinking that could be a massive leveller in terms of accessibility, you know, resourcing, that kind of thing? If you can't physically get to somewhere; if, you know, your students are remote; if your students are ill; you know, there's all kinds of sort of ‘level the playing field’ effects there?
KR: Potentially, potentially, or it could widen it, we don't know yet.
JE: Right, yeah.
KR: And so, there's a lot of discussion at the moment about how we support students with additional needs to engage with AI effectively.
JE: OK, then. So, plenty already to think about. We're going to dig a bit deeper into some of the tools, and the things that teachers need to consider when they're using them, after the break.
You’re listening to a special podcast from Teacher. If you’ve just discovered our podcasts and Teacher magazine, I want to let you know about the additional free content on the teachermagazine.com website. You’ll find more than 3,000 articles, infographics, videos and podcasts (including all the transcripts of these episodes) in the archive – they’re all online and open access. And it’s also where you can sign up to our regular bulletin to get the latest content and trending topics, delivered straight to your inbox. We publish fresh content through the week; access everything and sign up to the bulletin at teachermagazine.com.
JE: Welcome back, I'm here with Dr Katie Richardson and before the break we were talking about the 3 areas where AI can enable improvements in education. A reminder then, we've got efficiency, effectiveness and [transformative] learning opportunities. Now, Katie, you've been digging around a little bit, experimenting with some of these tools, which sounds exciting. What are some of the questions that teachers need to be asking when they're looking at possible tools? I want to focus on ethics and privacy, in particular, to start with.
KR: Yeah, and this is a really tricky one, because if we access an app, our first automatic thought is not ‘oh let's go and read the end user agreement before we check this program out!’ But I think consideration about how our personal information can be used is really, really important. There were some programs that I found that required me to provide a relatively large amount of personal information before I could even look at the program and, interestingly, they didn't tell me what they would be using this information for.
JE: Yeah, so read the fine print, in other words.
KR: And don't just sign up.
JE: Yeah. But that's definitely a discussion for the school, isn't it?
KR: Absolutely.
JE: Rather than individual teachers going off and signing up.
KR: Yes, what are appropriate programs to use and what aren’t.
JE: Yeah, the other thing that I've seen you talk about was that, you know, you’re stressing that AI tools need to augment, not replace, human intelligence. So, we've got to find that balance really then around what the teacher role should be and what shouldn't be replaced. So, start from a point of what should absolutely still be the teacher’s role, and then as we sort of touched on earlier what could be enriched by AI. Can you explain a bit more about that?
KR: OK, so I'm actually going to give an example for this one. A couple of weeks ago I was in a taxi, I had to go to Sydney. I was in a taxi, this old taxi driver, awesome guy, he'd been driving the Sydney streets for 50 years. He knew them like the back of his hand. He knew them as they changed, you know, the one-way streets that changed to the other direction and everything, as it moved and changed through the city. He had his GPS on and all of a sudden he looked at the GPS and he goes ‘I'm not going that way’. And he said, ‘I know these Sydney streets and that is a really bad way because this street here is closed and the GPS doesn't understand that, and this street here is one-way’. And so, this taxi driver had a deep knowledge of the nuance of those roads – how the traffic flowed, where the best ways were to get to the final destination. The GPS didn't have that nuance. It didn't have the knowledge to be able to give us the best information.
And so, I think if we bring it back to teaching and learning – as teachers, number one, we need to know our stuff; we really need to know our stuff. Because if we don't know our stuff, how are we going to be able to pick out the best directions to take our students in? The other part of that is that our students really have to know their stuff in order to engage effectively and critically with these new tools.
JE: So, just going back to the teaching aspect. Again, if you are asking an AI program to help you with some lesson planning and it's fairly basic stuff and, you know, like we've spoken about before, it's speeding up, it's efficient, yeah. But then, if you say, ‘no, create some things that will suit this group and this group and this group’, that's potentially a bad idea, isn't it? In terms of … will the AI know that group? Will they know what the students can handle? Do they know exactly where they're at? It might be a certain time of the day where that's not an effective activity. There’s all kinds of things isn’t there?
KR: Absolutely. And when I was test running some of these programs I ran into those issues. And so, being a teacher myself and having experience in the classroom, I was able to sort of simulate what a classroom with, you know, a diversity of needs might look like. And so, I started to prompt the program to help me to do the planning. What I found was that it could give me fairly generic answers, but it wasn't anything that I couldn't come up with myself. When I tried to prompt it for particular needs for students, again, the answers were not what I needed in order to meet the needs of those students. And so, as teachers, I don't think we can hand over our role of planning for learning and teaching to the AI. We can use it, you know, to trigger thinking – it might bounce an idea that we hadn't considered before – but generally what I've found is it's not innovative, it doesn't give me more than what I could do fairly quickly myself, yet.
JE: Yeah, that’s an interesting point. And in terms of the students then, knowing their stuff as well and how they use it.
KR: Yeah, and this is really tricky. And this is a question that we get from teachers a lot, ‘How do we get the students to actually engage in learning?’ Because we all know learning can be really hard work. We often find, as students, we can try to shortcut the learning process because it is hard work; it's easier to let someone or something else do the thinking for us. This is what we call cognitive offloading – when we get the AI to do the hard thinking for us – and we don't want this. The problem with it is that we don't actually learn the vital skills that are necessary for critical thinking, problem solving, creative thinking, and all of that kind of stuff that we’re really wanting to foster in our students.
So, we've already seen examples of this happening many years ago when they introduced calculators into the classroom. So over time, students have become less and less fluent in mathematical concepts; so, let's take times tables for example, because calculators can do it for them. This actually has a big impact as students move through schooling. It interrupts their ability to actually access those higher levels of thinking in mathematics, particularly as they get to the higher levels of schooling. So, when fluency isn't there, and the knowledge of how the mathematical concepts work isn't developed well, it places a ceiling on how far and how deep those students can go in mathematics. I hope that makes sense, this is something that we already know about.
JE: Yeah, yeah, that makes total sense. And just on that teacher note as well, about differentiation and you knowing your students, you know links in well with our columnist Martin Westwell – his first Teacher column which was published earlier this year (I'll put a link into that as well) that's all about the power of professional judgment. So, yeah, we definitely need to keep that kind of thing in mind.
The other thing then is that explicit teaching and the really important role of that. So, it's not just a quick fix, really – there's a lot of preparation and skill development that we need for students or teachers to actually use these tools effectively. You know, to fully understand what they're capable of and also really importantly the limitations. And like I say, that requires explicit teaching of the skills, doesn't it?
KR: Yeah. And we need to be careful and purposeful in how we use it with students so it can promote thinking and learning and not replace it. So, there's a couple of different aspects that we need to think about in terms of explicit teaching. Number one, we need to teach explicitly the metacognitive skills that the students need in order to engage with and interact with particular concepts and skills – so we've got to get them to do that. And these are concepts and skills that, particularly at [lower] schooling levels, the AI could easily sort of come in and replace, but probably not replace well. So, we want the students to actually have a chance to develop those skills without the AI.
Then with AI, we have to also explicitly teach them how to use the AI. Because students don't natively just come in and go ‘oh, this is ChatGPT, I have to give it really explicit prompts and I have to tell it exactly what I need it to do and then say no, that's not correct, can you please do this, and the AI will respond’. If they haven't been taught how to do it, they'll give it a prompt and ‘oh yeah, that's a great answer’. Hand it in. So, number one, we have to teach them explicitly those skills that they need to have in order to be critical of the AI. Number 2, we need to teach them how to use the AI explicitly. And number 3, we have to teach them how to be critical of the AI outputs.
JE: And have a think about the information that's being spat out there.
KR: 100%.
JE: And what they're going to do with it next.
KR: Yes.
JE: And what that means. I was at a webinar that the OECD did recently and one of the things that was touched on was concerns about cognitive overload. That, you know, maybe when students are putting this in, and maybe they’re new to it, and it's spitting out say 20 things each of 5 paragraphs, and it's just like it's just too much, it’s actually too much for them to process. So, then they have to think, you know, ‘I need to give it a better prompt, I need to be a bit more critical in the information that I'm wanting to get out of it and what I want to do with that and then what the next step is.’
KR: Yeah. And that takes quite a bit of skill. I've found it challenging as well as I've been learning about how to use AI, to be able to steer it in the right directions. We also need to know how to utilise AI critically in terms of, what do we actually believe? So, I know of teachers who have been using 2 or 3 different AI platforms to argue against each other and to actually develop a range of different arguments and trigger different lines of thinking. But, while the AI comes up with it, the way that that teacher is utilising it is playing them off against each other, so there is purpose in there.
JE: Of course, one of the big questions I think you get asked a lot – and I know this is a big question in a lot of the discussions that I've seen as well from teachers – when do I introduce this, in terms of age groups?
KR: And that's a really tricky question because it depends. And I know that's not a great answer, but it really does depend. It depends on the knowledge and maturity of the students. It depends on the programs that you are actually using, and how those programs are generating information and what their intent is. So, for example, we may want to hold off on enabling our younger students to engage with particular types of AI because we want to actually create a better foundation of knowledge and skills in particular areas. That's a possibility, but we also want to be able to support our students from a fairly young age to be able to engage critically with the AI and how they're working with it. And so, there's a bit of a balancing act I think in terms of how and when we introduce it.
So, there are a number of educational programs now being developed by ed tech companies. And those who are designing them are working to develop them to give prompts or feedback, rather than answers and things like that, and support students to engage in ways that will support the learning process, and of course not replace it. But having said that, we still don't know enough, I don’t think. I think that over the years we need to find out more information from the metadata from these companies, how the students are engaging with it. We need to understand more about student learning outcomes and whether or not they’re impacted. Whether or not the breadth of student creativity is impacted. All of this type of thing, we need to just keep our ear to the ground, I think.
JE: So, the key message then that I'm getting from today is that there's lots of discussion and debate about this currently, and there's more to come. And we do have to kind of go cautiously in some areas, and we definitely need to keep having those discussions at school level, too, and reflect on some of those questions and considerations that we've talked about today. So, again, ‘What's our own context?’, ‘What are the needs of staff (if you're a school leader)?’ and ‘What are the needs of our own students, what are their capabilities?’ What would your advice be to listeners, then, who are in K-12 settings and who either want to start using AI (so they've not tried it out yet), or get more out of it than they currently are?
KR: Well, this is a really interesting question, I think. And I think if we go back a step and we look at the students – the students know way more programs than we could ever think of; they're already online finding different programs for different things and churning through. And, you know, I've spoken with teachers who have brought in this really cool AI app and the students go ‘yeah, that's great Miss but look at this one, this one's even better’. And so, as teachers I think we have to realise that we're not always the expert in the room in this, but we are the expert in learning and teaching.
JE: Yeah.
KR: So, there's a few things that I think we need to think about. Perhaps the most important is that we need to be really purposeful in how we incorporate AI into our teaching practice. We need to consider a range of different questions. What skills do the students need to develop before they use the AI tool in order to engage with it critically? So, for example, I also know a maths teacher who uses ChatGPT to demonstrate mathematical misconceptions to students, because ChatGPT is not great with its maths.
JE: No, I’ve seen that.
KR: And so, he uses it because the data sources that ChatGPT draws from replicate the common misconceptions in maths. So, what he does is he utilises this issue with ChatGPT to demonstrate those misconceptions to the students and help the students step through the process of rectifying those misconceptions in the AI. Now, the interesting thing is they get to the end of the process, ChatGPT spits out the right answer, finally, and then he gives it another equation that is exactly the same just with different numbers. And you know what?
JE: It can’t do it.
KR: ChatGPT can't do it. So, it's really good as teachers for us to be thinking about this in creative ways, how can we actually utilise it to help the students become more critical of what they’re seeing. So, this example of mathematics sort of leads us to this point, the question of how will students need to engage with the AI in order for them to experience and enhance learning and not replace it? So, how are they engaging with it? What do you want them to learn from the process of using AI? So, considering now with AI use that process – how are you going to ensure that what you want them to learn is actually what they learn in this process, as well? And, also, how will you model effective use of AI to your students?
That’s all for this episode – thanks to Dr Katie Richardson, and thanks to you for joining me. If you want to keep listening now there are more than 300 episodes from the last 10 years to choose from, including our Research Files series, Behaviour Management and School Improvement. Find those wherever you get your podcasts from. Hit the follow button to make sure you don’t miss out on new episodes. And please leave a rating and a review while you’re there. Bye!
The Teacher bulletin is your free, weekly wrap of our latest content – not just the podcasts, but also the articles, infographics and videos – straight to your inbox. Click on the sign-up button at our website, teachermagazine.com.
References
Brazil, J., Yang, S., & van der Kleij, F. (2025). AI systems in teaching and learning: Principles and practical examples. Building a PATH forward. Australian Council for Educational Research. https://doi.org/10.37517/978-1-74286-785-4
Department of Education. (2023). Australian Framework for Generative Artificial Intelligence in Schools. Australian Government. https://www.education.gov.au/download/17416/australian-framework-generative-artificial-intelligence-ai-schools/35400/australian-framework-generative-ai-schools/pdf (PDF, 2.3MB)
Holmes, W., & Miao, F. (2023). Guidance for generative AI in education and research. UNESCO Publishing. https://doi.org/10.54675/EWZM9535
Westwell, M. (2025, March 17). The power of professional judgement. Teacher magazine. https://www.teachermagazine.com/au_en/articles/the-power-of-professional-judgement
Thinking about a planned future use of AI with students: What do you want them to learn from the process of using AI? What skills will they need to develop before they use the AI tool in order to engage with it critically and effectively?