Earlier this semester, I tweeted, “Historic note: This opening day of Spring Semester 2023 marks the first time I had to tell students that no AI may be used in the writing of any assignment. It felt like an explanatory flashback scene in some sci-fi apocalypse.”
I was sort of joking, trying to be playfully self-aware of the tendency to fear what’s unfamiliar, especially with new technology. The “Kindles will kill physical books” kind of thing. But I was also being serious about our tendency to misuse tools, especially shiny new ones, and to be unaware of how we’re messing up.
Can AI be used legitimately in the process of writing an academic assignment? My own institution has assembled a task force to determine what we consider legitimate and illegitimate use. In the meantime, here’s my own initial, not-yet-fully-processed take: Using AI for early research is OK. Using AI to write is wrong.
Here’s how I get there: The purpose of education is to form students with the necessary capacities of thought and performance appropriate to their fields. Christian education sees this as a way of forming people to know God in what he has made and in what he has said (Prov 1:1–7).
So the question I’d want my students to ask themselves is grounded in that purpose: How does my use of AI fulfill or fail the purpose of education to form my personal capacities as a thinker and performer in God’s world?
Writing Is Formative Labor
While the thinking and performance appropriate to different fields of study will vary, the point of education is to expand students' personal capacities for these tasks. That expansion is an internal change in the student, and bringing it about is what education is for.
Two different yet related exercises are required: research and writing. Research involves gathering and organizing information and then reflecting on it until you acquire knowledge and understanding. Writing hones and expresses that understanding for others to receive and evaluate. Through the laborious process of expressing an idea precisely, we come to know that idea precisely.
It can be tempting to recruit AI to make this process easier, but there are limits to what the technology can achieve. AI can help gather information and, to some degree, organize it. But it cannot cause the internal change we call understanding.
If we rely on AI to produce written material that imitates understanding, we undermine the very purpose of education: the formation of thinkers and doers. Outsourcing word-craft is outsourcing thought-craft. Like every human capacity, the abilities to comprehend and to express that comprehension are gained through hard work. Embodied experience requires us to toil with something in order to know it. This is true neurologically and spiritually.
Input vs. Output Tasks
Tools such as AI are appropriate for input tasks—gathering and, to some degree, organizing information. AI is not appropriate for output tasks: producing evidence of a student’s understanding. It can pull ideas from resources to make a student familiar with an existing conversation on his topic and maybe even suggest an organizational scheme for processing it. But it shouldn’t be used to produce evidence a student is performing as a participant in that conversation.
Even for input tasks, we need to keep in mind the limitations of our tools. Like everything created by human beings, AI is neither objective nor omnicompetent. The data it produces will be as partial as the sources it draws from, and its ways of relating concepts will be limited by the algorithms it's programmed with. AI isn't capable of arriving at understanding itself, let alone of forming persons made in the image of God as embodied souls.
Perhaps we should think about AI in a similar way to other information tools like Boolean search operators or card catalogs. As long as users know the limitations of the tool and therefore the specific purposes it can and cannot serve, they can make sound judgments about its use.
So when a student asks herself the question I posed earlier, a good answer would be: "I'm consulting AI as an initial step of research on a topic, but I'm not depending on it to write anything I would claim as my own, or even to draw conclusions I'd credit as my own understanding."