Planning a Workshop on AI Tools (like ChatGPT) for Your School

ChatGPT by OpenAI and similar AI tools have been a hot topic for educators the past few months. Instructors are excited, concerned, or both. Many educational developers are running or planning workshops and webinars on how educators can adapt and respond to the existence of these tools. Heck, there are at least five webinars on the topic just this week. And one might say with good reason: without being more informed about the new AI tools out there, educators may have students using them to cheat on assignments without their knowledge (like apparently 17% of Stanford students in a recent survey), or instructors might require students to use these tools without being aware of the student privacy issues involved or of methods for working around them.

I’m planning a workshop on the topic in March, and I thought I’d share some planning notes in case they are of use to others. I’ve bolded and put asterisks around some “go-to” resources that you might wish to start with. And at the end is a potential list of key takeaways: “do’s and don’ts” when using ChatGPT and similar AI generation tools in education.

Update: I’ve also created a 2-sided handout on Adapting Your Teaching to AI Tools.

Keeping Up with AI Tool Developments and Issues

Every day there are new technology developments, new teaching examples, and new concerns and issues with artificial intelligence (AI) related tools. It’s very difficult to keep up, but there are some resources to help do so:

Planning a Workshop: Audience

I’m thinking folks who attend a workshop on this topic might fall into one or more of the following three camps:

  • Folks excited about the potential uses of these tools for enhancing their teaching and student learning
  • Folks concerned about the issues and limitations with these tools, especially students using them to plagiarize/cheat (academic integrity), as well as these tools collecting student information and data (student privacy)
  • Folks who haven’t heard much about these tools yet and just want to learn more about what these tools are (and are not)

Planning a Workshop: Goals, Outcomes

At the moment, I’m envisioning these goals and outcomes for a workshop. Of course, feel free to adapt these (and share if you wish) for your own. I hope you’ll forgive the use of unmeasurable verbs.

  • Understand what ChatGPT and similar tools are (and what they are not) 
    • Through some quick demos, mentioning the FuturePedia directory if folks want to explore and try any tools for themselves.
    • See also Ten Facts about ChatGPT.
    • These tools are not true artificial intelligence. They can’t think and act like human beings, and they often make mistakes that a person would (hopefully) never make, such as simple math errors or making up fake references and facts.
      • One analogy is that these AI generation tools are like auto-complete on steroids. A more correct term for some of these tools is large language models (LLMs).
    • Despite that, ChatGPT can do some impressive things.
  • Become aware of some positive uses of these tools, for instructors and students
  • Become aware of some of the issues & limitations with these tools 
    • Not always free, not always available (servers overloaded or restricted)
    • Copyright infringement, exploiting labor and data
    • Out of date, not aware of current events or closed-access references and books
    • Student privacy issues, collecting and potentially selling student data
    • Not concerned with the truth – can ‘hallucinate’ and generate fake information and references
  • Discuss and become aware of educational/teaching concerns with ChatGPT 
    • Plagiarism / cheating
    • AI detectors – which are easy to beat
  • Apply strategies for adapting your teaching and assignments and activities 
  • Become aware of future possibilities and trends with AI tools
    • Competitors to ChatGPT are rapidly emerging
    • Hopefully more open and ethical AI tools will emerge
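To make the “auto-complete on steroids” analogy concrete, here is a toy sketch in Python. This is purely illustrative (ChatGPT uses a neural network over subword tokens, not word counts), but the core task is the same: predict the next word from patterns in training text.

```python
from collections import Counter, defaultdict

# A toy "autocomplete": count which word follows which in a tiny corpus,
# then predict the most frequent follower. LLMs perform this same task
# (predict the next token) at a vastly larger scale, using neural networks
# instead of counts.
corpus = ("the cat sat on the mat . the cat ate the fish . "
          "the dog sat on the rug .").split()

next_words = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_words[current][following] += 1

def predict(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return next_words[word].most_common(1)[0][0]

print(predict("the"))  # "cat" -- the most common follower of "the" here
print(predict("sat"))  # "on"
```

Notice that such a model has no notion of truth; it only knows which words tend to follow which, which is one way to understand why these tools can confidently “hallucinate” facts and references.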

Other Related Workshops & Resources

Here are some workshops and compiled resources on this topic, including some I already mentioned, which you might find useful for planning your own:

Some Positive Uses for AI Tools

There are legitimate and serious concerns and issues with AI tools, which are summarized below, but some potentially positive uses for these tools have also emerged. Just a few examples I’ve come across include:

What is ChatGPT best used for?

Given the limitations with ChatGPT, such as its inability to discern truth, for what type of tasks is ChatGPT most useful? This article suggests:

  • Tasks where it’s easy for the user to check if the bot’s answer is correct, such as debugging help.
  • Tasks where truth is irrelevant, such as writing fiction.
  • Tasks for which there does in fact exist a subset of the training data that acts as a source of truth, such as language translation.

Issues, Limitations, and Concerns with ChatGPT and Similar Tools

This video shares five limitations of ChatGPT, but here also is a list of some issues and concerns, including a chief one among educators, plagiarism and cheating:

  • ChatGPT “hallucinates” fake references and information
  • ChatGPT was only trained on data through 2021.
    • ChatGPT is not aware of more recent events or data. But there are emerging tools that will allow you to train these tools on your own custom text sources and data (such as GPT Index, described at the bottom of this post).
  • These tools aren’t really “Artificial Intelligence”
    • These tools can’t act or think like human beings. They are large language models (LLMs) that act more like (as some have described) auto-complete on steroids. This can help explain why, for example, ChatGPT generates false information and references.
    • Stop Calling Everything AI, Machine-Learning Pioneer Says
  • Several AI tools were trained on copyrighted content
    • Artists are suing image generation tools, for example, because users can plagiarize their art and style.
  • AI tools are biased and can still generate content that is toxic, offensive, and harmful
    • Microsoft released an AI chatbot 7 years ago. It was taken down in less than a day because it was generating extremely toxic content.
    • AI tools are trained on data that itself contains biases and toxicity. ChatGPT was trained in part on reddit posts, for example.
    • The makers of many of today’s AI tools use reinforcement learning with human feedback to help reduce, but not eliminate, the chances of them generating toxic content. A controversy emerged when it was discovered that OpenAI exploited Kenyan workers, paying them less than $2 an hour, to help with this process.
    • There are several examples of people bypassing OpenAI’s filters to get ChatGPT to generate harmful and toxic content.
  • Students may use AI tools for plagiarism and cheating
    • A big concern of educators is that students will use ChatGPT and similar tools to generate writing or code which they will then submit for their assignments with little or no modification. As I mentioned earlier, approximately 17% of students at Stanford admitted using ChatGPT to cheat.
    • Caktus has several tools designed specifically for students to generate content for assignments, for example.
    • Cheating is a systemic issue and an issue of trust. See Alfie Kohn’s 2007 essay: Who’s Cheating Whom? Some quotes:
      • “when teachers don’t seem to have a real connection with their students, or when they don’t seem to care much about them, students are more inclined to cheat”
      • “Cheating is more common when students experience the academic tasks they’ve been given as boring, irrelevant, or overwhelming”
      • “when students perceive that the ultimate goal of learning is to get good grades, they are more likely to see cheating as an acceptable, justifiable behavior”
    • Another article by Marc Watkins: Our Obsession with Cheating is Ruining Our Relationship with Students.

Adapting Your Assignments in Response to AI Tools: 5 Strategies

What, if anything, should instructors do in response to the existence of these new AI tools? How should they adapt their instruction and assignments? I tried to boil it down to 5 potential strategies, ordered by how much is involved in adapting one’s assignments and activities.

One analogy that might be useful to keep in mind here is students’ use of calculators (or Wikipedia, Google, cell phones, spell checkers, and other tools). We (at least initially) worried about students using calculators in math classes, but student mathematics learning can actually benefit from the use of calculators.

  1. Check for AI-generated Text
    • Most likely, you will be able to tell yourself if a student uses AI-generated text the same way you can with other plagiarized text, as the writing style will be different. Hopefully you have a variety of assignments and activities to help notice changes in writing style.
    • You could incorporate recent events or have your students make personal reflections in your writing assignments, which AI tools wouldn’t be able to help them with.
    • But there are tools emerging that can detect AI-generated text, like GPT Zero, AI Writing Check, and DetectGPT. You can even use ChatGPT itself to try to detect AI-generated text, but all of these detectors are very easy to beat. Turnitin is also planning to add a detector, and OpenAI is working on its own detector, but it has a 9% false positive rate at the moment. These detectors work in part by measuring the “burstiness” and “perplexity” of the text.
    • But remember, by using AI detectors, you’re also allowing these tools to collect student work and data. So hopefully you can avoid using these tools or use them as sparingly as possible. Perhaps open and ethical alternatives that don’t share student data with OpenAI or other third-party entities will emerge in the future.
    • In preparation for evaluating your students’ writing, you could try submitting your assignment prompts to ChatGPT yourself and getting familiar with the responses generated. See How Well Can AI Respond to My Assignment Prompts? by Anna Mills, as an example.
    • But what do you do if you detect that a student submitted AI-generated text? See this Professor Guide to AI Text Detection for some initial advice, although note, it is a post from a vendor. It’s important that you consider addressing the use of these AI generation tools in advance by incorporating a policy in your syllabus. There are some example course policies below.
  2. Synchronous Teaching & Learning 
  3. Show the Process 
    • Instead of having students submit a final product, such as homework answers or a paper, have them submit something that reveals the process they used to create that product. Examples:
      • Google Docs & Microsoft Word online show the editing history, so you can see if students just pasted in text generated elsewhere.
      • Microsoft Flip is a video discussion board. Students can also record videos in a regular Canvas discussion board. You could have students record videos explaining how they solved a math or science problem, for example.
  4. Authentic Assessment and Open Pedagogy
  5. Incorporating AI Tools into Your Course Activities 
    • One rule of thumb is to try to think about how or if this task would be done in the real world or workplace. Would they use an AI tool to help them do some task? Then perhaps let students use it, too. Of course, that may not always be feasible or desirable. 
    • An AI tool can act as a learning assistant or student exemplar. You or your students can generate an initial answer or text via AI, and then discuss how to improve it or compare it to other writings. 
    • Addressing issues of student privacy:
      • You might not want to require students to use ChatGPT or similar tools, as they may collect and use student data.
      • Some alternative solutions might involve having your students submit their prompts to you, the instructor, who then submits them to the tool and shares the response back with the students. See this AI Bot Essay Writing Plan video, for example.
      • Another alternative that might be more feasible in the future would be to use a custom interface to an AI tool, so that only you have access to student data (see GPT Index below, for example). 
    • Some other examples of utilizing AI tools in teaching are below.
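The “perplexity” measure mentioned under strategy #1 can be illustrated with a toy model. This is only a rough sketch with a made-up corpus; real detectors like GPT Zero use a full language model. The idea: text whose words a model finds predictable scores low perplexity (a signal it may be machine-generated), while unusual human writing tends to score higher.

```python
import math
from collections import Counter

def unigram_perplexity(text: str, corpus: str) -> float:
    """Perplexity of `text` under a unigram model estimated from `corpus`.

    Add-one (Laplace) smoothing gives unseen words a small nonzero
    probability. Lower perplexity means more predictable text.
    """
    counts = Counter(corpus.lower().split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 slot for unseen words

    words = text.lower().split()
    log_prob = sum(math.log((counts[w] + 1) / (total + vocab)) for w in words)
    return math.exp(-log_prob / len(words))

corpus = "the cat sat on the mat the dog sat on the rug"
predictable = unigram_perplexity("the cat sat on the mat", corpus)
surprising = unigram_perplexity("quantum turbines devour lavender", corpus)
# `predictable` is lower than `surprising`: the first text reuses common
# corpus words, while the second is entirely out-of-vocabulary.
```

Real detectors also factor in burstiness (how much predictability varies from sentence to sentence), and as noted above, all of them are easy to beat with light editing.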

Utilizing AI Tools in Your Teaching

Example Course and School Policies Regarding AI Tools

Some school systems, colleges, and research journals are already banning the use of AI tools, but you may wish to include a policy statement about these tools in your syllabus. Your school or college might also wish to draft a policy.

What the Future Holds for AI Tools

  • Competitors to ChatGPT are Rapidly Emerging
    • Like Apprentice Bard or Sparrow from Google and Claude from former OpenAI employees. And again, see this directory of hundreds of new AI tools: FuturePedia. Six months or a year from now, ChatGPT may already be obsolete (see second-mover advantage), so don’t get too laser-focused on just ChatGPT.
  • These Tools are Rapidly Evolving and Addressing Issues
    • Many of the limitations of these tools (such as generating fake information and references) are being addressed. These tools are continually evolving.
    • But issues like bias and racism, using them for cheating, etc. are not likely to ever go away.
  • Legal and Political Systems are Responding
    • Some AI tool developers are being and will be sued for violating copyright and other reasons.
    • Governments and organizations, including school systems and academic organizations, are developing policies in response to these tools, including banning their use in schools or research articles.
  • Open, Ethical Tools will Hopefully Emerge
    • I hope to see more open source AI tools emerge (like Stable Diffusion) that are also trained (or can be trained) on openly-licensed content, such as images and text with Creative Commons licenses. Some open source alternatives to ChatGPT in the works include Flan-T5 by Google and Open Assistant by LAION, but they’re not ready for general use yet.

Key Takeaways: Do’s and Don’ts When Using ChatGPT & Similar AI Generation Tools in Education

Here’s an initial attempt at some key takeaways, some “do’s and don’ts” when using ChatGPT and similar AI generation tools in your courses. Eventually perhaps we can develop a more refined list, similar to Doug Duncan’s tips for successful clicker use.

  • Do try ChatGPT and similar tools yourself if they are of interest to you and/or you are concerned about how students may use them. You might try submitting some of your own assignment prompts and seeing the types of responses generated. This at the very least will make you more familiar with the style of writing these tools generate by default, which tends to be grammatically perfect but monotonous and positive or neutral in tone. You can also check for any errors, mistakes, and fake information and references generated in the responses.
  • Don’t use AI detectors (or use them sparingly), as they may collect student work and possibly re-use or even sell the data you submit to them. These tools are also very easy for students to beat; it’s a losing battle. You can check their privacy policies, but these tools are so new that they may not have developed (or intend to develop) privacy practices appropriate for use in schools and colleges. I’m not a writing instructor, but I suspect you’ll often be able to tell yourself when a student has submitted AI-generated text, the same way you can often tell when they have plagiarized, due to the sudden change in their writing style and grammar.
  • Do consider adding an AI policy to your syllabus and discussing the use of these tools with your students. See these sample course policies for guidance. You might think it best to not mention these tools for fear that it might encourage students to use them to cheat who would otherwise be unaware, but that cat is out of the bag. ChatGPT reached a million users in just five days, faster than any app or website ever developed, and after two months had over 100 million active users. Students around the globe are becoming aware of these tools and using them in various ways, both positively and negatively.
  • Do consider making your assessments more authentic, collaborative, or process-focused. My older advice was that if students can Google the answer to something, perhaps it doesn’t belong on an assignment or test; the same may apply to ChatGPT and similar tools. 89% of calculus exam questions and 65% of biology exam questions involve just rote memorization or low-level knowledge, and apps like ChatGPT, Photomath, and Socratic can answer these questions easily. Consider redesigning your assessments to be more authentic or collaborative (#4 in the above list of strategies), which can make the issue of using AI tools moot and improve student learning and motivation. Alternatively, consider assessing the process rather than the product of learning: at a simple level, you could have students submit videos explaining their process or use tools that track their process, such as Google Docs (#3 in the list). When in doubt, think about how people in the real world or the workplace would accomplish a task, or whether it is even a task people would face in the real world. And I know instructor workload is a serious and valid concern, but options like peer assessment might help with that issue. One adage of active and collaborative learning is that the person doing the work is the one doing the learning. After redesigning and practicing new types of assessment, they can eventually be less work, not more, for an instructor.
  • Don’t be scammed – in many cases, it’s not really necessary to pay for some of these tools. Watch out for snake oil. Some of these third-party tools may really be just thin wrappers for ChatGPT and similar tools. You can often accomplish the same thing using the main tools directly, once you become more familiar with writing better prompts. Look for future tools that may have more customizability, privacy, and usability, or become familiar with how to effectively use free and open alternatives.

If you are using ChatGPT or similar tools in your courses

  • Do explain to students why you are using ChatGPT or similar tools if you do intend to use them with students. How will this be of benefit to them? What real world knowledge and skills will they gain?
  • Don’t require their use, or else provide alternatives – consider not requiring student use of these tools, largely for the same reasons you might not want to require students to use Facebook or Twitter or similar tools (such as student privacy reasons). Provide alternative activities for students who want to opt-out, or:
  • Do consider student privacy – have students submit their prompts to you to submit to these tools, instead of submitting them directly. Ensure there is no personally identifying information in the content submitted to these tools.
  • Don’t depend on these tools working in a live class. The tools mostly run online in the cloud, on remote servers. The free version of ChatGPT is very popular and often will not work because its servers are overloaded. The maker of the tool, OpenAI, is also progressively restricting the features of the free version.
  • Do be on the lookout for the emergence of open source alternatives that are also trained on openly-licensed content. Some open source alternatives to ChatGPT, like Open Assistant by LAION & PaLM + RLHF, are not yet ready for general use, however.
  • Do consider issues with these tools generating biased, toxic, and harmful content – this problem has not been, and probably never will be, completely eliminated from these tools, despite their developers’ efforts. Check out the AI Incident Database and read about some of these issues on Critical AI.

AI Developer Resources

This isn’t really something for a workshop for educators, but for those interested in some AI-related programming tools/libraries, see:

  • GPT Index lets you train these tools with your own custom data sources, including documents, video transcripts, ebooks, etc. See their App Showcase for some examples of applications built with GPT Index.
  • Langchain – building applications with LLMs through composability
  • Andrej Karpathy has some videos and sample code (nanoGPT) to better understand how tools like GPT work.

Sharing Other Resources

Even though there are a lot of resources in this post, I know there are a ton more missing. Feel free to share any other resources you think would be helpful to educators below, or reply to this post on Mastodon or on Twitter. Or use #ChatGPT, #AIEd, or other hashtags when sharing your thoughts or resources online.

Faculty developer. Interests: developing educational technology; faculty & student development; learning sciences & psychology.

Posted in Educational Technology
Doug Holton
