Cast your mind back to the first time you used a virtual assistant, unlocked your phone with biometrics, or discovered a streaming service that recommended content tailored to your viewing behaviour. At that moment, could you have possibly imagined how artificial intelligence (AI) would evolve into such a ubiquitous technology and permeate so many aspects of our daily lives?
With the advent of ChatGPT, created by OpenAI and released to the public in late 2022, and the emergence of similar generative AI tools, the teaching and learning landscape faces a major challenge as schools consider how to respond.
It raises the question, ‘How will generative AI impact teaching and learning, and how will you choose to engage with this change?’
ChatGPT and other AI-powered tools with similar capabilities have the potential to reshape how educators deliver content and how students engage with learning materials. These AI systems don’t just respond to queries or search for information; they generate bespoke content. For example, instead of finding an image on the web, an AI image creator can produce a new image based on your prompts. Generative AI models such as ChatGPT can produce all sorts of text responses, from emails to essays to book summaries, and can even generate computer code.

Generative AI models create their outputs by analysing vast datasets drawn from sources including music, literature and the visual arts. The models are trained to recognise patterns, styles and structures within that data, allowing them to generate bespoke content. Though unique, these outputs are not entirely original; they are shaped by the specific data and information the model has been trained on. This is the main reason potential copyright issues can arise: AI-generated outputs often reflect and draw from copyrighted material in the training data, which could lead to intellectual property claims. To avoid copyright infringement, some AI developers restrict their training data to images from their own image libraries, Creative Commons sources and other publicly available images. Disclosure of this type of information by the developers of generative AI tools will help schools promote transparency within their school community and assess the potential impacts of these tools.
Generative AI has been a talking point in many educational settings. To guide schools with their investigation and implementation of generative AI, the Australian Government released the ‘Australian Framework for Generative Artificial Intelligence in Schools’ (the Framework). The aim of the Framework is to guide the responsible and ethical use of generative AI tools.
When schools engage in learning about generative AI tools, it is important to introduce the various types of AI and provide a basic understanding of how they work. For example, as students learn about large language models (LLMs), they will begin to appreciate the vast datasets these models have been trained on. As you investigate the responsible use and application of AI with students, encourage them to focus on understanding its potential limitations and biases. Students need to understand that AI can sometimes generate inaccurate information and may reflect biases, such as gender or cultural stereotypes in generated images. It is important to support students to develop and apply critical thinking when using generative AI tools. When students understand that training datasets are likely to contain biases, they are more likely to critically analyse the information an AI generates. If students learn that an AI tool was trained on content predominantly from one cultural perspective, they may question whether the responses they receive accurately reflect a wide range of viewpoints.
When choosing and using generative AI tools, it is essential that schools carefully consider privacy, safety and security, and ensure all users have their privacy and data protected. Make it clear that personal details should never be entered into an AI tool. Another aspect of safety is ensuring these tools do not generate inappropriate or explicit content. When using a generative AI tool, check its ‘Content Policy’ to confirm the developer has measures in place so that it does not create inappropriate content. In some tools, for example, a pop-up warns the user that they have requested prohibited content.
The Framework also highlights a need for transparency, fairness and accountability, and ensuring that generative AI tools are used to benefit all members of the school community.
Consider where you personally sit on a continuum about the use of generative AI tools in the classroom.
At one end of the continuum is ‘apprehensive about generative AI’, in the middle ‘mixed views about generative AI’ and at the other end ‘embrace generative AI’. Reflect on what your point of view is based on. Apprehension might stem from wanting more information and evidence that AI can be used safely in the classroom, and a sense that the implications of generative AI tools need further investigation. On the other hand, embracing AI may indicate that you can see the benefits of incorporating AI into student learning while acknowledging the need for its responsible and safe use. Perhaps you are already teaching about AI or have explored various AI tools.
Where is your school in their journey of engaging with generative AI and its potential use in schools?
As we know, schools are invariably diverse settings, with changing staff arrangements, staff from different backgrounds and levels of experience, and potentially varying views on using generative AI in schools.
In response to this emerging technology and growing interest in AI, in 2020 the Australian Government funded the Digital Technologies Hub (DT Hub) to create resources for learning about AI and to support schools to incorporate its use. Lessons have been created to support teachers to use AI as a context for learning in Digital Technologies. One of these lessons, ‘Can AI guess your emotion?’, guides teachers to use an AI tool, Google’s Teachable Machine, to train a machine learning model to recognise images, in this case emojis. Students learn about machine learning and image recognition in a hands-on, practical way.
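The idea behind that lesson can be sketched in a few lines of code. The following is a minimal illustration, not how Teachable Machine itself is implemented: a nearest-centroid classifier that ‘learns’ each class from a handful of labelled examples. The 4-value ‘images’ and the happy/sad labels are invented stand-ins for real emoji photos.

```python
def centroid(images):
    """Average a list of images (flat lists of pixel values) pixel by pixel."""
    return [sum(pixels) / len(images) for pixels in zip(*images)]

def distance(a, b):
    """Squared Euclidean distance between two images."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(examples):
    """One averaged centroid per label: this is the whole 'model'."""
    return {label: centroid(images) for label, images in examples.items()}

def predict(model, image):
    """Return the label whose centroid is closest to the new image."""
    return min(model, key=lambda label: distance(model[label], image))

# Labelled training data: brightness values for two classes of 'emoji'.
examples = {
    "happy": [[0.9, 0.1, 0.8, 0.2], [1.0, 0.2, 0.9, 0.1]],
    "sad":   [[0.1, 0.9, 0.2, 0.8], [0.2, 1.0, 0.1, 0.9]],
}

model = train(examples)
print(predict(model, [0.8, 0.2, 0.9, 0.2]))  # → happy
```

The key point students encounter in the lesson is the same as here: the model’s behaviour is determined entirely by the examples it was trained on.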
In other lessons, teachers and their students are introduced to natural language processing (NLP), the technology that powers many AI applications. NLP is what makes chatbots chat, gives virtual assistants their ability to help, performs sentiment analysis to classify the emotions in text, and translates languages to break down language barriers.
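One of those NLP tasks, sentiment analysis, can be demonstrated with a deliberately simple sketch. The tiny word lists below are purely illustrative; real systems learn these associations from large datasets rather than from a hand-made lexicon.

```python
# Illustrative, hand-made sentiment lexicon (not from any real NLP library).
POSITIVE = {"great", "good", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "sad"}

def sentiment(text):
    """Classify text by counting positive versus negative words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this great lesson"))  # → positive
print(sentiment("The weather was terrible"))  # → negative
```

Even this toy version surfaces a useful classroom discussion: the classifier is only as good as its word lists, just as a real model is only as good as its training data.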
Another lesson, ‘Filter bubbles, bias, rabbit holes and nudging’, helps teachers explore the highly relevant topic of recommender systems, as used in many streaming services. A recommender (or recommendation) system is a type of information-filtering system that attempts to predict the rating or preference a user would give to an item; through this process it learns your behaviours and recommends content. The lesson helps students learn about this concept in a way that very likely connects with their everyday experience.
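The prediction step can be sketched with a small user-based collaborative filtering example. This is one common recommender technique, not necessarily the one any particular streaming service uses, and the names, shows and ratings below are invented for illustration.

```python
# Invented ratings: each user scores shows from 1 to 5.
ratings = {
    "alex": {"Show A": 5, "Show B": 4},
    "bree": {"Show A": 4, "Show B": 5, "Show C": 2},
    "chen": {"Show A": 1, "Show B": 2, "Show C": 5},
}

def similarity(u, v):
    """Simple similarity: inverse of mean rating difference on shared shows."""
    shared = set(ratings[u]) & set(ratings[v])
    diff = sum(abs(ratings[u][s] - ratings[v][s]) for s in shared) / len(shared)
    return 1 / (1 + diff)

def predict(user, show):
    """Similarity-weighted average of other users' ratings for the show."""
    others = [(similarity(user, v), ratings[v][show])
              for v in ratings if v != user and show in ratings[v]]
    return sum(sim * r for sim, r in others) / sum(sim for sim, _ in others)

# Alex hasn't rated Show C; bree (similar tastes) disliked it, chen liked it.
print(predict("alex", "Show C"))  # → 3.0
```

Because bree’s tastes are closer to alex’s, her low rating carries more weight than chen’s high one, which is exactly the ‘users like you’ behaviour students will recognise from streaming services.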
The DT Hub also expanded its AI offering with a series of 4 AI explainer videos. Equipped with these videos, teachers can discuss machine learning, share examples of image and speech recognition, and explore how AI works, including how a recommender system learns from users with similar preferences.
To complement these explainers and lessons, the DT Hub has also developed professional learning for teachers on how to implement these lessons and how best to engage their students in learning about AI. Furthermore, our webinar series delivered in late 2022 on ‘Implementing AI in the classroom’ helped those early adopters to enhance their understanding of AI's benefits while also addressing potential challenges to consider when discussing and teaching this important topic. In late 2023, we revamped the webinar series and offered 7 self-paced modules for teachers of varying experience and backgrounds to learn about AI.
Undoubtedly, generative AI will continue to impact education in many ways, and for schools it presents yet another change to negotiate in ensuring the safe and responsible use of these tools to support and enhance learning. The Safe Technology for schools initiative (ST4S) is currently trialling an AI module within the ST4S Framework to reduce risks for schools when choosing AI products and services.
Within this environment of change, the DT Hub will continue to provide schools with content to support the use of AI. As these technologies continue to evolve, we will update our resources to address these ongoing advancements.