Learning Technologies, Course and Assignment Design
Recent Experiments in Teaching with and about AI
Highlights from a recent reading group discussion of AI and learning
Monday, December 8, 2025
This fall I facilitated a reading group for the book Teaching Effectively with ChatGPT by Dan Levy and Angela Pérez Albertos. I like that the book is so incredibly practical, with dozens of examples of how instructors and students use generative AI to support teaching and learning. The authors are keenly aware that AI use can undermine our learning goals and objectives, and they provide lots of advice on how to use these technologies productively in higher ed.
The book also generated very healthy discussions among the reading group participants. During our discussion of Chapter 9 ("Nudging Students to Learn with ChatGPT"), several members of the group shared their recent experiments in teaching with and about generative AI. I thought they were fascinating, and I received permission from a few of the participants to write up their stories here on the CTE website.
I arrived at the reading group having just come from another meeting where BoodleBox came up. That's an AI tool marketed to educators, and one of its selling points is that you can set up group chats for students with AI agents. Most AI tools offer only individual chats, so I wondered if this group format would be useful. Kristen Wells shared that she had actually tried it, inviting her public health students to ask questions (in a group chat) of an AI agent as they reviewed materials before class. She found that the students engaged well in the BoodleBox group chats, more so than on Canvas discussion boards. She conjectured that the group AI chat felt more interactive or authentic. BoodleBox also gives the instructor visibility into the student-AI interactions, so she could see what questions the students were bringing to the AI before class and then address those questions during class, making the tool a useful pre-class formative assessment mechanism.
Andrew Simon shared an assignment that seemed to help his economics students treat AI output not as an oracle, but as an option. Students read an academic paper (a long one, maybe 60 pages), then a 3-page policy brief the paper's authors had written on the same topic. Selecting materials for a 3-page brief from a 60-page academic paper involves a fair degree of subjectivity, which left room for students to consider ways to condense the research other than the choices the authors made. Students did this kind of analysis themselves (e.g., what would you have put in the brief?), then asked an AI chatbot to do the same. Andrew reported that this helped students see that there were lots of potential answers here, which in turn helped them approach the chatbot's response not as definitive but as one possible answer.
Lysandra Cook shared another activity that helped prime students to build some AI literacy. She first had her education students compare the representation of disability in two movies, one from the 1980s and one from the last couple of years. They noted, for example, that the older movie cast a non-disabled actor as a character with a disability, while the newer movie cast an actor with a disability. Seeing this kind of change over time prepared students for a subsequent activity in Lysandra's course: considering how generative AI's representation of disability has changed in the three years since ChatGPT launched. The sequence connected media literacy with critical AI literacy for her students.
These last two examples remind me of the approach that the PAIRR Project (Peer and AI Review + Reflect) has taken to writing instruction. The faculty in that project found that when students received feedback on their drafts from a peer before receiving AI feedback, they tended to approach the AI feedback with a bit more skepticism than when the AI feedback came first. One of our reading group participants mentioned sharing the PAIRR Project with her students, and they responded well to the idea, perhaps because this kind of structure made AI use "permissible" for students who would otherwise be hesitant to turn to AI out of academic integrity concerns.
Ethan King shared a fun (and useful) assignment from his Writing and Generative AI course in which students used AI to produce a song. First, students created a musical persona, a kind of artificial artist, implemented as a student-designed chatbot. Then they talked with the chatbot to collaboratively produce the lyrics for a new song, which required thinking at both the macro and micro levels about song structure. Once they had a set of lyrics they liked, they turned to Suno, an AI music generator, to set the lyrics to music. Ethan said that putting the students in the role of music producer (instead of, say, lyric writer) helped them see that every prompt iteration was a rhetorical choice. They also developed more awareness of genre, particularly when they saw how generic the AI's output was (the same verse structure from the chatbot, the same major key from Suno). It helped that students were already consumers of music in the genre in question, and this activity shifted them to become producers. Might they then carry some of their insights into, say, academic writing?
I'm sad to report that even though I live in Nashville, it was Ethan who informed me that an AI-generated song, "Walk My Walk," by a fictional artist named Breaking Rust had topped one of the Billboard country music charts! The comments on the YouTube posting of the song are particularly entertaining. "I will never forget where I was when I first heard him sing this in concert." "Must have been hard to walk that walk with no legs." And: "This takes me back to my year in high school chorus. I remember having to breathe from time to time. Clearly, that was why I never went pro."
Thanks to all the participants in this fall's reading group for their active engagement with the book and with our discussions! And if you missed the reading group, stay tuned to CTE channels for information about spring opportunities to talk with colleagues about AI and teaching and learning.