AI in the classroom – ‘master prompting’ as a crucial skill

We know that teachers in Australia are using artificial intelligence more than their international counterparts, and that younger teachers are more likely to embrace the technology. But they also have concerns – including about their own skills in this area, and how they can support students to use it effectively (Friedman & Davies, 2025).

Professor Ken Purnell is Head of Educational Neuroscience at CQUniversity. He says when teachers use it effectively, AI can help with tasks such as summarising non-essential reading and generating practice questions – freeing up time for them to focus on relational and higher-order tasks.

But he adds that for some tasks, it’s vital for teachers and students to recognise that AI isn’t an all-knowing oracle but a ‘fallible helper’ whose responses can easily miss the mark on accuracy. The key to unlocking its full potential? Learning to ask it the right questions – a skill known as ‘master prompting’.

Prompting as a new form of literacy

‘Prompting is a new literacy: it’s the skill of steering AI with clear thinking, not outsourcing thinking,’ Professor Purnell tells Teacher.

Over the last few months, he has been working with schools on the responsible and effective use of AI in the classroom. He warns about the dangers of cognitive offloading – the handing over of mental tasks – which an MIT Media Lab study (Kosmyna et al., 2025) of Large Language Model-assisted writing has linked with lower neural engagement and weaker immediate recall of a student’s own work.

He also shares how responses from general conversational AI tools contain inaccuracies roughly 30% of the time (Google FACTS Team, 2025). ‘If we had a student, or someone in the workplace, that had a one-in-three chance of lying or misrepresenting something, would you trust it? To me, that is an astounding figure,’ he tells Teacher.

‘AI is world-renowned for its decency on medical-related things, but for general things it’s just a predictor: What does Jo want? What does Ken want? And it tries to please us. That’s why you have to be a smart prompter and start to interrogate, and you’ll suddenly find that it will actually answer you much more truthfully.’

This is where prompt literacy comes in. ‘Prompt literacy is the ability to guide generative AI intentionally and critically.’ Rather than replacing existing literacy skills, the academic explains, it functions as an ‘interface layer’ above them. ‘It combines clarity of thought with technical accuracy, helping humans to make machines more useful. It is not about machines thinking like humans, but humans thinking clearly enough to steer the machine.’

Professor Purnell cautions that prompt theory is still emerging, but the fact it’s in its early stages is actually a benefit, as it gives teachers and students a chance to learn together. ‘There is no established pedagogy, no definitive best practice, and no complete theory has been developed. Potentially, what works well today might not be effective tomorrow. Educators must acknowledge this uncertainty honestly.

‘We are experimenting, not implementing a fully developed discipline, and students should be involved in that process rather than being offered false certainty. It involves genuine co-learning between teacher and student. At times, this can resemble a flipped classroom, with students leading exploration while teachers guide direction, standards, and evaluation.’

His advice to schools is to ‘stay alert’. ‘Expertise has to come first, with AI as support, not replacement. You need enough knowledge to judge its output and reshape it when it misses the mark (Purnell, 2025).’

The difference between a basic and master prompter

Being a good AI prompter is a skill you can teach and learn. Professor Purnell says the key difference between a basic and master prompter is supervision and intent, giving the following examples:

  • Basic prompter: Uses AI as a shortcut by saying ‘Give me the answer ...’ This encourages cognitive offloading and prioritises plausibility over accuracy. They are rewarded for taking shortcuts and often delegate tasks to AI.
  • Master prompter: Views AI as a helpful assistant. They set the context, review outputs, and refine them. They rely on AI to identify flaws, not conceal them. AI serves as a tool to enhance their work. For example, instead of asking for an essay, a master prompter might ask the AI to critique a draft, pinpoint weak claims, or evaluate arguments from various perspectives.

Getting student buy-in – from shortcut to critical use

So, how do you go about getting student buy-in if they’re already using AI to offload tasks in their entirety (‘give me the answer’, ‘write me an essay’)?

Professor Purnell says teachers can support students by showing them that true control and learning come not from shortcuts, but from better prompting and the use of critical thinking skills. ‘When students ask AI to review, critique, or verify their reasoning, the tool supports learning instead of hindering it.’

When AI produces inaccurate information – such as making false claims or citing non-existent evidence and studies – researchers call this AI hallucinations. In a newly published paper (2026), Professor Purnell says this is a big problem for students because it’s difficult for them to spot, so they could unintentionally turn in work that gets them into trouble. ‘Unlike an obviously incorrect or nonsensical response, a hallucinated claim can sound credible, especially to learners who lack the domain knowledge to identify the error.’

The fact that prompt theory is still in its infancy offers another way for teachers to get student buy-in – share that fact with them and allow them to explore and test out their own ideas. ‘They can even surpass their teachers. Some of my Masters students do this to me! That makes them contributors rather than mere users,’ he tells Teacher.

As Estonia’s Education Minister Dr Kristina Kallas highlighted on her visit to Melbourne to share details of the country’s AI Leap program, education systems can’t afford to wait and do nothing – ‘students are already using AI and they’re not using it for the purposes of deep thinking’ (Earp, 2025).

Professor Purnell says the Socratic approach to AI taken by a master prompter aligns with Estonia’s strategy. ‘The AI Leap strategy explicitly seeks to prevent cognitive laziness by positioning AI as a challenger rather than an answer machine. The classroom issue isn’t whether students use AI; they already are. The problem is whether we teach them to use it critically.’ Like Dr Kallas, he also stresses the importance of teacher agency: ‘AI should support educators and students, not replace professional judgement’.

Five tips for teachers and school leaders 

The academic shares 5 tips for prioritising human-in-the-loop engagement.

  • Experimentation and reflection: Motivate teachers to try, learn from failures, and share their experiences.
  • Question, rather than accept: Demonstrate how to scrutinise AI outputs. This demands comprehensive disciplinary understanding, not merely technical skill.
  • Humility in upskilling: Follow models like Estonia’s, where ethics and pedagogy take precedence over scale, while making it clear that no one has all the answers.
  • Avoid anthropomorphising: Use AI tools critically – don’t pretend they are partners or friends; they’re not.
  • Counting the costs: Leaders must weigh both cognitive and environmental impacts and be transparent about where AI diminishes rather than enhances human expertise.

Using AI to critique, not complete – a quick classroom routine

Here for Teacher readers, Professor Purnell shares a quick, 5–10-minute AI critique routine that you can use with students.

  • Use AI as a challenger, not an answer machine.
  • Students paste their draft, then prompt: ‘Find the weak claims, missing evidence, and hidden assumptions.’
  • They must accept, reject, or revise. AI can suggest; the student must decide.

Additional prompts teachers can suggest for student use:

  • ‘What are the 3 strongest claims here, and what evidence is missing?’
  • ‘List likely misconceptions or errors a reader could infer.’
  • ‘Give a counterargument from an alternative perspective.’
  • ‘Where is the writing plausible but not actually justified?’

What gets assessed (and what makes it safe):

  • Teachers mark the audit trail: prompt → AI critique → student decisions → revised work.
  • Reward verification and reasoning, not just polished prose.
  • If a student can’t explain why they kept a claim, it doesn’t pass.

References

Earp, J. (2025, December 1). Teacher exclusive: Podcast special with Estonian Education Minister Dr Kristina Kallas. Teacher magazine. https://www.teachermagazine.com/au_en/articles/teacher-exclusive-podcast-special-with-estonian-education-minister-dr-kristina-kallas 

Friedman, T., & Davies, B. (2025, November 27). AI in the classroom – evidence, teacher insights and action. Teacher magazine. https://www.teachermagazine.com/au_en/articles/ai-in-the-classroom-evidence-teacher-insights-and-action 

Google FACTS Team. (2025, December 11). The FACTS Leaderboard: A comprehensive benchmark for large language model factuality. Google. https://storage.googleapis.com/deepmind-media/FACTS/FACTS_benchmark_suite_paper.pdf 

Kosmyna, N., Hauptmann, E., Yuan, Y. T., Situ, J., Liao, X.‑H., Beresnitzky, A. V., Braunstein, I., & Maes, P. (2025). Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant for an essay writing task [Preprint]. arXiv. https://doi.org/10.48550/arXiv.2506.08872

Purnell, K. (2025, December 4). Future-ready brain-smart teaching: Neuroeducation, AI, and practical strategies for classrooms [Presentation]. St Brendan’s College, Yeppoon, QLD, Australia.

Purnell, K. (2026). AI’s secrets: What students and educators need to know about chatbots. Zenodo. https://doi.org/10.5281/zenodo.18452249

How confident do you feel in your own ‘prompt literacy’? What small steps could you take to strengthen it this term?

Professor Ken Purnell says school leaders should ‘motivate teachers to try, learn from failures, and share their experiences.’

If you’re a school principal or in a leadership role, are teachers encouraged to experiment with AI? Do you provide regular opportunities for them to share and discuss their experiences with colleagues, so everyone can learn from successes and failures?