Debatt ● Daumantas Bloznelis
Cheating ourselves with AI. A response to Atle G. Guttormsen
If we agree that the writing process is more important than its result, we understand that using AI for structuring one’s thoughts amounts to shooting oneself in the foot.
This text is an opinion piece. Its content expresses the author’s own views.
In a recent opinion piece, Professor Atle Guttormsen criticizes the rejection of AI as a legitimate tool for students to use in their schoolwork. According to him, getting help from AI is about as innocuous as using a calculator, a spell checker, or an electronic dictionary, and AI should not be seen as a threat to independent thinking or authentic work. I am not persuaded.
The author likens rejecting AI as a legitimate tool to denying students the use of calculators in mathematics. I can agree with that comparison. As far as I remember, calculators were never used in my mathematics education through high school, and for good reason. The point of learning mathematics is not to obtain the correct answer with the least effort. Rather, it is to gain a fundamental understanding of how that answer is obtained and, not least, of what it cannot be.
I do know some students who were trained to use calculators in their school mathematics classes. Working through the multiplication 99*99 as freshman economics students, they would not blink when the calculator showed 98901, even though the answer must be just short of 100*100 = 10000. This reveals the difference between having learned mathematics and having learned to rely on technology (here, a calculator).
Mind you, we did use calculators in physics and chemistry classes. However, the focus of physics and chemistry is not on arithmetic, so calculating everything by hand would have wasted time better spent on the core material.
The opinion piece maintains that AI can be an essential tool for students in structuring their thoughts, improving their writing skills, and developing better academic texts. This is not so different from the calculator example. One of the main achievements of higher education is the continued development of the students’ ability to reason, and the process of writing is almost indistinguishable from the process of reasoning. Wrestling with loose thoughts and trying to put them into a coherent sequence that builds an argument is perhaps the principal way of learning to reason.
Meanwhile, the end result of the writing process at the bachelor’s or master’s level is usually of limited value in itself. When was the last time you read a term paper that substantially advanced the research frontier or made you change your mind on an important issue?
If we agree that the writing process is more important than its result, we understand that using AI for structuring one’s thoughts amounts to shooting oneself in the foot (a bjørnetjeneste in Norwegian). Yes, it may well make the term paper look better, but at what cost? We risk losing the skill of building a coherent argument – and recognizing one that is not, such as a persuasive hallucination generated by a large language model.
I am sympathetic to the call to adopt AI in education to empower students. But as we do so, we must find ways to avoid the unintended consequences that technology enthusiasts may too easily dismiss.