Over the past few months, RE’s Chief Technology Officer Linda Lawrence has found herself thinking about “big, uncomfortable questions” of her own: “What is the world where these tools are ubiquitous? How do we assist students in preparing for their future?”
It didn’t take long for our task force to determine that dreaming up ever more elaborate ways to defend against ChatGPT, or to catch students using it, wasn’t a productive course of action. For one thing, it would go against the culture of honor and accountability that we aspire to create in our classrooms, and that students create with us. For another, it would be a perpetually losing battle.
We realized, however, that we could counter a significant threat posed by the increasing prevalence of AI by working to ensure that RE students retained a sense of agency over their learning.
What happens to our sense of agency when we allow machines to do our thinking for us? In March, Matthew G. Kirschenbaum, a professor of English and digital studies at the University of Maryland, College Park, asked that question on a grand scale in an essay for The Atlantic titled “Prepare for the Textpocalypse.” Kirschenbaum imagined a dark future in which we have allowed AI to take on so much of the world’s communication that the internet itself becomes a cesspool of AIs talking to other AIs, “flooding the internet with synthetic text devoid of human agency or intent: gray goo, but for the written word.” As in the garbage-filled and sun-scorched Earth of the movie WALL-E, the world of knowledge becomes a barren wasteland. Meanwhile, humans, having outsourced all thinking to machines long ago, become pure consumers of content, their intellectual muscles atrophying to the point that they can no longer think at all.
Kirschenbaum’s vision of the AI-powered future is, admittedly, a dour one. But it does raise questions about the intellectual muscles of our students as AI tools become more and more ready-at-hand, and as it becomes easier and easier to automate the thinking routines that we want to cultivate in the classroom. How do we face the challenge of – as Kirschenbaum phrased it in an interview via Google Meet – “a kind of estrangement from human writing”? How do we prevent students from feeling disempowered by this incredibly powerful technology?
Maybe take technology out of the equation – at least at times, in controlled situations in the classroom. Kirschenbaum studies digital writing professionally, but he also runs a scriptorium at UMD: a deliberately low-tech approximation of the places where monks used to produce illuminated manuscripts.
“They get a goose quill. They get iron gall ink. They get a pretty good facsimile of parchment paper, and we turn off the lights and use LED candles,” he explained. “One scenario is that writing instruction becomes that kind of scriptorium, this kind of excruciatingly artificial environment where you draw the blinds and lock the doors and light the candles.”
It might seem backward-facing to re-envision the 21st-century classroom as a scriptorium. But in certain contexts, a degree of strategic Luddism seems necessary to ensure unassisted critical thinking – and to ensure that by the time students step into the AI-dominated world, they know how to think for themselves, distinguish the true from the false, and chart a meaningful path.
Humanities Department Chair Jen Nero uses a different metaphor: she imagines the classroom becoming an “intellectual gym.” This year, she plans to implement more in-class writing, fewer take-home papers and more oral exams. Above all: an even greater investment in Socratic dialogue, Harkness discussions and other modes of discourse that have provided the foundation of Ransom Everglades pedagogy for 120 years.
“Now that we know that AI can generate so much in the written word, we’re going to expect you to be able to sit in a room and respond in real time. To make an argument, to counteract it, to be able to keep your cool. Make eye contact. It’s going to force students to be more present – and teachers, too,” she said.
In my own course, Research into Anglophone Literature, we have gone full medieval this fall, not quite by replicating a scriptorium but by replacing a traditional essay with a storytelling contest based on The Canterbury Tales. Jester hats, turkey legs, ribaldry – and not a laptop in sight.
AI as a thinking partner
And yet, Nero, like many teachers at RE, finds herself contemplating not just low-tech plans, but loftier, more exciting ambitions: opportunities for students to unleash AI as a thinking partner, harnessing its capabilities.
“We cannot sit. We cannot be flat-footed. We have to go out there and show our students that we’re not afraid to use it. Even if it means we’re going to make mistakes, too,” she said.
That was the main focus of the AIRE Task Force presentation to the faculty: not strategies for circumventing AI, but strategies for embracing it. At one point, world languages faculty members Felipe Amaro and Alfredo Palacio took the stage to demonstrate how ChatGPT can function as a bespoke language tutor for students at any level, conversing fluently and correcting the student’s grammar in real time. In early May, Khan Academy CEO Sal Khan delivered a TED Talk that celebrated the potential of AI to democratize this kind of one-on-one feedback. Even RE students, who receive significant one-on-one support from faculty, stand to benefit.
The potential for democratization goes far beyond tutoring. In April, I allowed my students to use AI-generated images in a project that involved creating Brave New World-inspired propaganda posters. One of my students, Nicolas Poliak ’24, confessed that art wasn’t his strength. But he had a vision for a poster that combined elements of Soviet realism and 1950s American beauty ads. AI brought his vision to life in a way that would not have been possible even six months earlier.
In a similar way, AI tools have liberated Skye McPhillips ’24 as she works to complete her Bowden Fellowship in the Humanities project, a study of creativity among Dominican tabaqueros (cigar rollers). She’s used ChatGPT to digest and index long academic papers, to provide feedback on her writing, and even – with the help of plugins and the machine learning expertise of Elliott Gross ’24 – as a translator and data analysis tool for her more than 30 interviews.
“AI allows [students] to skip phases of searching for a quote for hours or waiting to ask a teacher about their paper right before it’s due, giving [them] more time to think about the next step of their endeavors and pushing them to innovate,” McPhillips said.
In a survey that the AIRE Task Force sent out in May, 33 percent of faculty members said that they were already “actively encouraging” their students to use ChatGPT in various projects. That percentage stands to increase significantly in the 2023-24 school year, which promises to be a massive petri dish of pedagogical experimentation.
The experiments will cut across disciplines. In the sciences, students in Paul Natland ’02’s physics classes will use ChatGPT “as a coding partner to create visualizations and analyze data” – to the point that Natland is planning to make “effective use of AI tools for problem-solving, brainstorming and drafting” one of his core standards. Students in all sections of biology will use AI tools to do something their peers already did in May: imagine an entirely new species – complete with illustrations.
In the humanities, faculty member Kate Bloomfield will allow students in Political Culture in the United States and Understanding the Abrahamic Religions to use ChatGPT to fill out review sheets before assessments – with the caveat that it will be their responsibility to then “tailor” the chatbot’s “often generic” responses to each course’s specific curriculum. Faculty member Cameron Ferguson plans to teach his middle school students how to use AI as a cultural and historical research tool, one that can give them baseline knowledge of a place or time before they delve into more specific lines of inquiry.
For upper school photography teacher Matt Stock, the most exciting prospect involves the old meeting the new. In the spring, he helped Bryce Sadler ’24, a student in his Experimental Photography class, enhance cyanotypes – “a contact-printed analog process from the 1860s” – with the image-generation capabilities of Adobe Firefly, an AI tool that now lives inside Photoshop. As the image transformed from analog to digital and back again, a new way of making art started to take shape.
“This year, I want to allow students to mix cutting-edge digital tools with antiquarian methods, and see what they can come up with using a combination of methodologies,” Stock said.
From the vantage point of the school’s CTO, being “future-forward” is the only option – and Lawrence said she’s here to support teachers as they use AI tools to try radically new things.
“It’s a time where we have to stay ahead of the curve, because we are Ransom Everglades,” she said. “We’ve got such knowledgeable people and such creative people. We have to encourage them to take different strategic approaches and experiment with these tools, so that they can show students not just how to use them to pass this course, but how to use them to create pathways to understanding.”
Some version of the WALL-E-style Textpocalypse seems inevitable as AI tools continue to embed themselves into work, communication and daily life. But I have also thought a lot about another Disney movie: 2014’s Big Hero 6. In that film, brilliant young people at an engineering academy bring outlandish inventions to life with the aid of AI tools that automate the dirty work: coding, manufacturing, prototyping. The main protagonist, Hiro Hamada, simply has to imagine an invention, and a swarm of “microbots” brings his vision into physical form.
My five-year-old son already thinks the Fernandez STEM Center literally is the school in that movie. What if he ends up being right?
If one way to empower students involves creating a space for them to think for themselves, another involves helping them use the technology to think and create in ways that wouldn’t have been possible before – and in ways that we have yet to imagine.