Putting Generative AI in Classrooms
By Bodong Chen in blog
June 9, 2023
It was more than a year ago that I mused on the topic of integrating AI in education. Now, it feels like that post was written a decade ago!
Yesterday, I got a chance to be on a panel on this topic, moderated by my great colleague Dr. Susan Yoon and joined by Drs. James Lester and Ryan Baker, two well-known experts in the area of AI in Education. The panel was part of the launch event for the new McGraw Center for Educational Leadership at Penn GSE. We had a good conversation, followed by rich discussions with a room full of educational leaders from the Penn community and school districts.
Susan asked a series of thought-provoking questions that could take hours to dig into. Here I’d like to expand on a few points without making any predictions, because prediction is hard.
Promising aspects of ChatGPT in education
We often hear from Ed Tech circles about how AI, and now Generative AI (GAI), is making good teaching more scalable and cheaper. That’s true. It’s encouraging to see Khan Academy’s Khanmigo, and we will see many similar tools in other learning domains.
To me, another promising aspect of GAI is its potential to make learning “harder.” This point is based on a new study my lab has been conducting with a group of high-school students. They used ChatGPT in their collaborative knowledge building to help them generate research questions, find information, summarize group dialogues, and so on. But as they became aware of issues with ChatGPT, they found themselves doing extra work to verify its responses. To some extent, ChatGPT’s responses provided them entry points to ideas and terms they would not have considered otherwise, and as a result they expanded beyond their original plan to cover more topics in their work.
This is only one example of using ChatGPT to make learning harder; there are surely other examples in other contexts.
Here, the key is not to be locked into traditional views of education but to use this opportunity to seek new visions of learning. If the end result of interacting with ChatGPT is to fill out a worksheet, the usage is probably missing the point.
It will be important to be aware of the narratives and value propositions people construct around integrating GAI in education, to question the soundness of those narratives, and to consider alternative visions. I would argue it’s important for more people (if not everyone) to do so, so that the agenda and human-AI interfaces in education are not set by a handful of start-ups or organizations.
Resource allocation and teacher PD
There was a great question about how leaders in education systems – e.g., school districts, universities – should allocate resources in their responses to GAI.
I brought up the infrastructure perspective that a group of scholars, including myself, has been working on. (We will have a symposium at the ISLS meeting next week if you happen to be attending the conference.)
One point made in the infrastructure literature is that infrastructure has relational properties. For a thing to become infrastructure, it needs to serve an infrastructural function for a certain group of people. A thing can be an infrastructure for one group and an obstacle for another, making it especially important to inspect the relational properties of an infrastructure before and when it is deployed.
Now we are facing a key “point of infrastructuring” for education, not due to infrastructural breakdowns but to disruptions of core educational practices (such as assessment) caused by GAI. Well, maybe these disruptions are indeed breakdowns. At such a point of infrastructuring, it is important to identify the extant infrastructure relevant to GAI – which can include information systems, district policies, organizational structures, and classroom routines – and to find potential alignments, frictions, and areas to be newly designed.
The process of infrastructuring is never only about the product (or the initial design ideas embodied in the product) but also about design-in-use, when the product is put into practice. Therefore, humans who get to interact with the product – who are often called “users” – actually need to do creative design work to make the product work in their particular contexts. From the infrastructuring point of view, these “users” need to be recognized as designers as well, and their design work needs to be better supported.
For these reasons, it is probably wise to prioritize investment in human and organizational infrastructures instead of treating the next purchase of AI products as an isolated technical decision.
Do we need more teacher PD on integrating AI in classrooms? Absolutely. Good things are already happening in various teacher communities (see the “ChatGPT for teachers” Facebook group). How do we support teachers’ integration of AI in ways that give them agency while promoting their professionalism and creativity? These considerations need to go hand-in-hand with making sound product purchase decisions.
Response to AI replacing jobs
McKinsey has been pumping out reports on job automation for many years now. This topic is not new, even though people are understandably more concerned now that creative jobs are in danger of being automated away.
I’ve been following some dialogues on the risks posed by AI development (e.g., this one featuring Yoshua Bengio and Yuval Noah Harari). These concerns are quite real. Just as climate scientists have been warning us about climate change, we should listen to AI experts working at the frontier of AI research when they say ChatGPT is only the first living organism crawling out of the organic soup.
As educators, we cannot do much to slow down AI development, but there are probably two important things to think about.
First, consider integrating (generative) AI literacy into student learning so that students are capable of using GAI, aware of its issues, and reflective about the relationships they are forming with it. Intimacy is the next arena for big tech after (or as) they fight for human attention. Humans are even more vulnerable when facing GAI-powered agents deployed to every possible corner of society. Banning GAI in the classroom does not help students get prepared.
Second, at some point, educators will realize it is not enough to talk only about knowing (e.g., students knowing how to write an essay); we also need to talk about being (e.g., being a biological human while AI brings forth another kind of being). This topic used to be sci-fi territory, but not anymore. This is not to say AI is going to destroy humanity, but simply to recognize that we should not use humans’ biological systems as the only reference points when discussing AI’s computational systems. If AI keeps evolving this quickly, even with regulations and guardrails in place, the objectives of education will need to be revisited, probably sooner than we would hope.