How can we teach students to use ChatGPT responsibly?

AI tools like ChatGPT bring dangers as well as opportunities. This article explores some ideas for keeping the balance tilted in the right direction by teaching students how to use AI responsibly. It starts by defining the problem, then presents some general principles for student use of AI, and concludes with a few specific ideas for lesson activities that put these principles into practice.

A Skeptical Start

When I first heard colleagues sharing their fears that ChatGPT would spark a mass wave of student cheating, I was skeptical. Surely, I thought, it would be too obvious if a student submitted a piece of work that had been authored by an AI? Surely I, as a teacher who prides himself on knowing his students, would be able to tell?

I was wrong.

I don't say that because I have caught any students cheating in this way - which would be difficult to do anyway, given the way my course and my lessons are planned. But I have heard students talking about AI so much, and seen so many students on social media asking for tips on circumventing AI-detection systems, that I am sure I will come across this reality soon.

It is clear to me that this fear (of student cheating with AI) is well-founded, but also that the benefits of AI in education outweigh the risks. There have always been new technologies that unleash new ways of cheating, and an accompanying arms race as educators and institutions try to stop them.

There is a broader problem though, and again history provides an example. Within my own living memory, one technology that brought benefits but which was also a missed opportunity in many schools was the rise of the internet and web search tools like Google.

Does the following situation sound familiar?

You are teaching a lesson that requires students to carry out internet research. Perhaps the question is, “How will climate change affect the price of tea and coffee?” In any case, students have teacher permission to use devices and find information online. Partway through the activity, you are circulating round the room. You discover that many students have simply copied down the text from the first website they found, word-for-word. In one case, you even catch a student typing the exact question you have asked into Google, then uncritically copying the summary of the first search result onto their worksheet!

Any experienced teacher will have encountered some version of the situation I have just described. And if you haven't already, you will soon find students copying information and answers from their Snapchat AI or from ChatGPT.

The missed opportunity of the internet and Google search was a failure in many schools to teach students how to use these tools responsibly. We must avoid making the same mistake again, and ensure that in implementing AI tools in our lessons we do so in a way that supports - rather than undermines - students' critical thinking skills, creativity, and knowledge.


General Principles

Here are some initial thoughts to help move teachers in the direction of responsible and effective use of AI:

1) Make sure students understand the limitations of ChatGPT.

There are some things that AI tools just can't do well yet. It's important that both teachers and students understand what those things are, and know that some of these weaknesses bring real risks in the classroom.

We would never recommend that math teachers use ChatGPT to create lesson worksheets, for example, because ChatGPT's mathematical reasoning abilities are shockingly poor. It makes frequent, basic mathematical and logical errors, and simply isn't fit for this purpose.

More relevant to all subjects, though, is the problem of 'hallucinations', which is what it is called when an AI simply makes something up. I don't mean writing something that is obviously or deliberately fictional: I mean statements that look, feel, and sound right but which are in fact totally wrong or even fabricated. This still happens far too often for teachers to use AI-generated lesson resources without first reading through them as carefully as they would a piece of student work, and editing out any errors before using them in teaching. By the same token, students need to understand that they can't simply trust everything an AI tells them.

2) Encourage students to see ChatGPT as just one more tool in their toolbox - not a one-stop-shop every time they are asked to do something.

Just as too many students jump straight into a Google search when they are given a task, many students will instinctively jump to AI helpers in future. Making sure that students are aware of the best tool for a particular task is key. It might be that they should take some time to brainstorm or plan their response before accessing help, use a traditional website or textbook to get information that is tailored to their course, or even that a completely different AI tool exists that would be more appropriate! No educational technology is one-size-fits-all, and AI tools like ChatGPT and Teaching AI are no exception.

3) Always have students finish a task with their brains, not with AI.

ChatGPT is amazing for providing inspiration and ideas, as well as answering factual questions in an accessible way. It can even mark students' work - something we are working on adding to the Supporters area of Teaching AI. However, it can't replace the self-reflection, retrieval, and metacognition processes that need to happen within students' brains to make knowledge 'sticky', allow them to continually improve, and help them understand how they learn. Even AI-heavy lessons should, in our view, include a significant consolidation element that removes the AI tool so that students must use their own brains to retrieve or reflect.


Specific Suggestions for Lesson Activities

How, then, can we do all the things I have suggested above? To me, it is very simple and relies on a technique that effective teachers have always used: modelling, or demonstrating how students can carry out a key task or develop a specific skill. There are many models for how this teacher modelling can be structured, but at their heart all teacher models rely on the following sequence:

1) Tell the students what the activity or skill is that they are learning.

2) Explicitly show them how to do that activity, step by step, flagging up any pitfalls or potential difficulties.

3) Let students try the activity themselves.

4) As a class, reflect on the students' practice and see how they found the activity.

We could apply this model to educating students on the responsible use of AI in many ways. Here are a few ideas:

  • Identify a few prompts related to your subject that tend to result in hallucinations when entered into ChatGPT. Present the prompts and responses to the class, and ask students to identify why these responses are factually incorrect.

  • Provide students with an overarching question that they would typically use internet research to answer. Ask students to work in pairs to brainstorm questions they would like to ask ChatGPT to get the necessary information. Guide students towards high quality questions that will help them to build an answer to the question, without simply asking the whole question!

Then, ask the students' questions to ChatGPT in front of the class - noting down which ones provide the highest quality responses and those that are perhaps less useful.

Finally, ask students to use the information to write a response to the original question - and then have them reflect on what types of questions are most and least effective at helping them get the information they need.

  • Before setting students off on a task, ask them to consider what information they need and where they could get it. Make sure all possibilities are considered: students' own notes, textbooks, teacher-shared resources, Google search, specific websites, and AI tools. Note that AI is only one of many options here! Have students weigh up the pros and cons of each option for the specific task they are doing.

  • Organise a whole-class debate on the statement: “ChatGPT speeds up learning so much it makes a four-day school week possible.” (The reduced school week should serve as a 'hook' to make students care more than they otherwise might!)


Conclusions

By modelling responsible and critical use of AI in lessons, teachers can encourage responsible use of tools like ChatGPT in education. Explicit modelling is one of the most effective ways of helping students to understand and consolidate key skills, and repeated use will help them to get into good habits. However, it is essential to be consistent in using these types of techniques so that students don't slip back into uncritical or ineffective ways of using AI: the temptation will always be there to see AI as a magical one-stop-shop, rather than just another tool that can enhance students' lives and learning.

Teaching AI is an online tool created by teachers, for teachers, to use the power of AI to create lesson resources in seconds. It's totally free to try today, or you can become a Supporter for full access to all 30+ lesson resource types.
