No topic dominated the 2021–22 school year more than critical race theory (CRT). Was it a force for good or evil? Was it even being taught? If so, by whom? To whom? And to what effect? What, if anything, should be done about it? Scholars, classroom teachers, and “experts” of all stripes weighed in on all sides.
CRT mania had many of the characteristics of a typical moral panic, beginning with the creation of what the late criminologist Stanley Cohen called “folk devils”—bad actors (in this case, teachers) with no redeeming qualities who are bent on predation (Burke, n.d.). We then saw “distorted mass media campaigns that create[d] fear and reinforce[d] previously held or stereotyped beliefs” about these teachers. We watched politicians “presenting themselves as the safeguards of the high moral ground” (Burke, n.d.). And finally, we stood back as school and district policies were crafted and state laws were passed to ban CRT. Some were challenged and, in rare cases, overturned (Associated Press, 2022), but few could argue that the 2021–22 CRT panic hasn’t changed the educational landscape in ways we’ll be dealing with for a long time.
As impactful as it was, however, CRT has nothing on 2023’s topic du jour, ChatGPT. This artificial intelligence language model can, on one end, write realistic-sounding student essays and, on the other, write emails to parents, college recommendations, and (gasp!) whole lesson plans. ChatGPT has seemingly caught much of the educational world flat-footed. We have arrived at a world that we thought might one day exist, but only in some far-off Star Trek future. The platitude that we are “preparing students for a world we cannot yet imagine” feels much less empty in 2023. Many of us (myself included) cannot imagine what the world will be like when every former student can access a digital clerk who can write everything for them—from unique-sounding grant proposals to detailed expense reports to wedding vows that make their partners weep.

An unfortunately popular reaction to this national conversation has been an odd form of self-congratulation. Overconfident teachers (and, worse, armchair theorists who aren’t even teaching anymore) are claiming that they will not be impacted because they are (or were) so student-centered. Their pedagogy will not be heavily impacted because it is (or was) so forward-thinking. Some revel in their tech savviness. They, unlike numerous educational Luddites across the country, realize that AI is just the next cool tool, just like the calculator or the internet. Therefore, the only permissible reaction to it is starry-eyed wonder.
You’ll read none of that from me. I understand, for instance, that AI writing actually presents many more complications for teachers who are student-centered, inquiry-driven, and project-based than for others. After all, AI is not about completing students’ fill-in-the-blank worksheets or taking their bubble tests. For better or worse, it stands to impact the “good” stuff: the powerful stories, the incisive essays, the passionate speeches, the moving art—the hard and meaningful work that we often want students to wrestle with and be proud of mastering. Frankly, it’s cheap to dismiss teachers’ concerns about this new player that stands to so thoroughly change the game.
Back to the Future
I cannot predict what language models like ChatGPT will do for (or to) writing instruction. We’re all going to spend the next few decades figuring that out. What I can say, with absolute certainty, is that classroom discussion is about to catapult back into a central instructional role that it hasn’t held in a very long time. Pretty soon, if we don’t know how to lead a good class discussion about a book, or a time period, or a scientific or mathematical theory, that deficit will matter in ways it has not mattered before.
Yes, we’ll try to hang onto the pre-ChatGPT world for as long as we can. Many of us will have kids do the same traditional writing assignments on paper, or in class with their laptop screens under the watchful eye of either their teachers or some sort of spying software. These adjustments will be at times both useful and toxic. But these temporary “fixes” will not forestall an eventual AI-driven devaluation of “formal” student writing. It will become that much harder for us to answer “Why do I need to learn this?” when it comes to the writing that students know they’ll always be able to farm out to their personal AI secretaries. The required instructional adjustment, then, will have to be more social. We will have to be able to lead students in meaningful classroom discourse about big ideas.
At their best, our classrooms have always been places where there was a tangible reason for us all to be there together. This reflects other areas of life as well. Consider the difference between watching a movie alone and seeing it in a theater. We choose the latter for the same reasons that our society has created “live tweeting” and YouTube reaction videos. We want to experience things together and feel a part of each other’s energy, even if we are sitting at home alone in our pajamas. As the practical (not spiritual [Merrow, n.d.], not cathartic) significance of “formal” student writing diminishes, it is crucial that teachers let shared social learning experiences take up more of that space.
“The future” of classroom instruction is here, a little sooner than many of us expected. But, in a few important ways, it must look like the past. We must be a little like West African griots, creating energetic and fascinating educational moments meant to be experienced collectively. We must be a little like Socrates, asking groups of students question after question until gaps in thinking are exposed. Our classrooms must be places where students both talk to humans and are listened to by humans; for in a world that will soon have ubiquitous AI, this discourse will become the main reason for our classrooms to exist.