Since the explosive entrance of ChatGPT onto the scene in 2022, artificial intelligence (AI)-powered writing programs have become wildly popular with students in writing courses, and, while the full ramifications of AI-written work have yet to be seen, many students are at least artificially profiting from the latest developments in this high-tech Mercurial sorcery.
What are AI writing programs?
AI writing programs – such as ChatGPT, Anyword, Jasper, and even Grammarly – produce mechanically-correct writing and even whole essays in a matter of seconds. Some programs do it for free.
Why are AI writing programs dangerous?
With their seemingly superhuman powers, AI writing programs destroy students’ ability to write and think well on their own; in doing so, they also tempt students to be dishonest, and they attack the image of God embedded in students’ souls.
How do I approach AI writing programs?
As a writing teacher—and nearly all of my courses require writing—I am horrified and grieved that this new technology poses such a threat to my students. My goal as a teacher is not merely to show students how to think and write well, but also to help them to become men and women of integrity and to deeply appreciate the fact that the Lord has made them in His image.
To counter the threat of AI writing programs, I will be adjusting some aspects of how I teach and how I approach homework. For more specific details, please see the “Regarding AI Writing Programs” page.
How do AI writing programs promote dishonesty?
Because they can both manufacture and plan long-form written pieces, AI writing programs promote two kinds of dishonesty:
First, they allow students who are skilled-but-lazy writers, or who are unskilled writers wanting to look good, to hide their laziness and their lack of ability. Strong writers can type in a prompt, and AI will furnish them with a complete essay in less than a minute. Weak writers can do the same. In both cases, if uncaught, they may be able to get decent grades on an assignment they did not consider for even five minutes. Both students have been dishonest—and both have made a mockery of the efforts of thoughtful, hard-working students.
Granted, AI-produced essays are not always high-quality writing. A student who usually earns “A”s on papers could end up with a “C” by submitting a raw AI essay; but a struggling student who scrapes by with “D”s or worse can certainly benefit. And AI writing programs are rapidly improving. Articles from January of this year that inform writing instructors about clues that may “give away” the fact that a piece was written by AI are already out of date: after spending a mere two hours testing just one free AI program with more than 20 distinctly different and highly complex prompts, I have seen firsthand that many of those clues are largely irrelevant. AI writing programs can now provide citations, search and quote primary sources, produce outlines, and even write personal responses.
However, the second kind of dishonesty that AI writing programs promote is more sinister. According to a must-read article by Columbia University student Owen Kichizo Terry in The Chronicle of Higher Education, students are using AI not just to write their papers but to do all the thinking necessary to write an effective persuasive argument.
Here are just a few of the most alarming quotes from Terry’s article (emphasis added):
- “In reality, it’s very easy to use AI to do the lion’s share of the thinking while still submitting work that looks like your own.”
- “[M]assive structural change is needed if our schools are going to keep training students to think critically.”
- “The…increasingly popular strategy is to have AI walk you through the writing process step by step.”
- “Already, a major chunk of the thinking had been done for me. ….[O]ne of the main challenges of writing an essay is just thinking through the subject matter and coming up with a strong, debatable claim. With one snap of the fingers and almost zero brain activity, I suddenly had one.”
- “[W]riting is no longer much of an exercise in thinking. …. The ideas on the paper can be computer-generated while the prose can be the student’s own.”
- “[W]e neglect the essay’s value as a method for practicing critical thinking. When we want students to learn how to think…assignments become essentially useless once AI gets involved.”
In sum, Terry says, essays are crucial tools for teaching students to think well, but students now have the ability to avoid thinking altogether when they write them.
Since this is the case, the question then becomes: How can thinking and writing be taught in a world in which students have almost constant access to the very programs that allow them not to think and write at all?
How can we help students avoid the temptation to be dishonest about their writing?
While it’s impossible to prevent all of this type of dishonesty, some characteristics of AI-produced writing will likely continue to be easy to spot. Additionally, I hope that adjusting how thinking and writing skills are taught in the classroom can reduce the likelihood that students would be tempted to be dishonest by using AI.
To see why this could be true, compare the temptation and shortcut of using AI-writing programs to other types of temptations and shortcuts.
Take the temptation to be gluttonous as an example. Since I’m surrounded by readily-available and cheap food, I’m surrounded by the temptation to snack. If I regularly give in to this temptation, I will gradually get fat, though not noticeably in a short time—but the consequences could be lethal. What prevents me from constantly overeating? Physically, I need to drink plenty of water, eat nutrient-rich foods, and remove myself from areas where I might be tempted to give in. Metaphysically, I need to pray for God’s help and focus my mind on the truths about this issue: people around the world are starving, so I ought to have some self-control when it comes to food; junk food snacks pale in comparison to a homemade Italian dinner; and I want to stay healthy. Further, if I have experienced going without food, or with very little, for long periods of time, then I know that it is possible for me to have self-control in my everyday eating habits.
Now consider the shortcut of using microwave dinners to feed the family. Microwave dinners do feed the family, and they do save time and effort. But they are neither as nutritious nor as delicious as a homemade Italian dinner using fresh ingredients. As a seasoned cook—pun intended—I can, with a bit of effort, avoid the microwave dinner shortcut and make a homemade Italian dinner myself. While I do need to think carefully about when I will need to start each dish so that all the dishes are ready to eat at the same time, I’ve made these dishes so often that I don’t have to spend extra time reading the recipes or studying the directions.
How does all this relate to helping students avoid the temptation and shortcut of using AI writing programs?
From the example of avoiding temptation, we can see that physical and metaphysical steps can be taken. These would include, in the case of writing:
- Getting away from computers and cellphones that allow access to AI writing programs
- Using hard copies of books and sources
- Writing by hand in a notebook
- Working at a time when I feel most familiar with the material instead of when I am tired or have been thinking about something very different
- If necessary, working in an area where others can see me and hold me accountable
- Praying for God’s help, wisdom, and protection against temptation
- Reminding myself that dishonesty—even if it’s popular or efficient—is still a sin
- Reminding myself that some students are acting rightly and that my being dishonest by using AI would mock their hard and honest work
- Reminding myself that not doing the thinking and writing myself will in the long run weaken my ability to think and write
- Remembering times when I have thought and written successfully without using AI as a reminder that I can do this assignment on my own
From the example of avoiding the shortcut, I can see that tasting the difference between the microwave meal and the fresh-cooked Italian dinner encourages me to desire the better food, and that repeatedly practicing the skills involved in cooking the Italian dinner gives me the confidence that I can make one with relative ease. For teaching writing, these points imply that:
- Students need to see the difference between poor thought and writing and excellent thought and writing so that they can be encouraged to desire to think and write excellently themselves
- Students need to have plenty of practice in all the skills of thinking and writing in a setting where they can get immediate feedback and gain confidence that they are on the right track on their own because past success is an essential form of encouragement
Some may think that my concern about student integrity is excessive. But to those who find that I’m turning a Gimli into a Gargantua, I would say that student integrity and virtue are far more important than any information I could teach. If I do not take student integrity and virtue more seriously than academic content, I have failed as a tutor, because I have taught students to appear wise while they are in fact doing something wrong. With the changes I plan to make to my teaching approach, both students who are more likely to be tempted to use AI writing programs and those who are less likely will benefit: my revised approach is intended to equip all students with stronger thinking and writing skills, which should give them the confidence that they can think and write without using AI.
How do AI writing programs ultimately dehumanize those who use them?
Some may think that AI writing programs are simply the future and that any way we can make thinking and writing more efficient justifies any loss that may be incurred by using these technologies.
To those who look to AI writing programs as the modern equivalent of the printing press and the steam engine, I would point out that no previous invention has been able to so thoroughly destroy not only our ability to trust one another as fellow human beings but also our uniquely human capacity for independent thought.
“But is independent thought really so important?” it may be asked.
My case rests on the thoughtful writing of four observers of the rise of totalitarian regimes in the twentieth century: C.S. Lewis, Milton Mayer, Friedrich Hayek, and George Orwell.
While my point is not to emphasize the vital political connections between thought, use of language, totalitarianism, and the popularization of immorality, these four authors realized that the crucible of totalitarianism clarifies some essential truths about the relationship of thought and language to human nature and virtue.
Lewis, at the close of his essay, “Men Without Chests”, points out the irony of modern society which “continue[s] to clamour for those very qualities we are rendering impossible.… In a sort of ghastly simplicity we remove the organ and demand the function…. We laugh at honor and are shocked to find traitors in our midst.” We daily hear calls for “critical thinking”, “creativity”, a “good work ethic”, and “integrity”, but relying on AI writing programs is a sure-fire way to end up with a new generation of young people who lack these very skills and virtues. If we discourage careful thought, considered writing, and enduring virtue, we are cutting students off from three aspects of life that make them most human.
Mayer, who interviewed former Nazis about the development of Nazism, writes in They Thought They Were Free that, “Men under pressure are first dehumanized and only then demoralized, not the other way around.” While I’m not saying we stand on the brink of totalitarianism, this generation of students is clearly under pressure to use the latest technology and to cut corners so they can quickly move on to more pleasant and comfortable activities. But allowing computers to think and write for you is dehumanizing: by doing so, you disregard your God-given logos, the highest level of reasoning and deliberation that sets you apart from animals and from machines. Mayer connects this dehumanization to demoralization: if we come closer and closer to being machines, what moral standards can we claim to uphold or hold others accountable to? It is men, not machines, who face God at the Last Judgment.
Hayek’s Road to Serfdom repeatedly refers to the connection between poor thought and immoral choices. He states that evil “obtain[s] the support of all the docile and gullible, who have no strong convictions of their own but are prepared to accept a ready-made system of values if it is only drummed into their ears sufficiently loudly and frequently.” Those most likely to be deceived into accepting evil ideas are those “whose values and imperfectly formed ideas are easily swayed and whose passions and emotions are readily aroused”. While Hayek is referring to a political context, the same type of thought-less person—who has not taken the time to deeply consider the nature of virtue, of sin, of the pressure to conform, of the desire to fit in and be liked, and of the ability to resist temptation—is likely to struggle morally in any other immoral context. Without depth of thought, we are less-prepared for difficult moral tests. And if we have trained ourselves to take the path of least resistance by farming out our thought and our work to AI, we are that much less likely to resist evil when faced with an unexpected choice.
Orwell’s “Politics and the English Language” emphasizes that poor use of language is caused by our laziness, and he begins his essay by pointing out that “the English language…becomes ugly and inaccurate because our thoughts are foolish, but the slovenliness of our language makes it easier for us to have foolish thoughts.” It’s a vicious cycle: our inability to think promotes our inability to use language effectively. But language is one of the hallmarks of humanity. Language, as Aristotle points out in the first book of his Politics, is part of what distinguishes man from animals. AI writing programs are only able to use language so effectively because human beings enabled them to do so in the first place.
Ultimately, AI writing programs blast an un-crossable chasm between the ideas and words a man may speak and the soul and actions of the man himself. An AI writing program may be able to produce a love letter, but it cannot overflow with delighted, genuine praise of the beauty and virtue of a beloved woman. An AI writing program may be able to produce a stirring call to action, but it cannot slog through muck and lose blood with its fellow soldiers. An AI writing program may be able to write a sermon, but it cannot walk patiently with its parishioners through a kidney donation, a family member with cancer, or the excruciating death of an adored mom of three. An AI writing program may be able to write a code of conduct, but it has no soul: unlike students, AI will give no account to God for its sins; unlike students, AI can have no fellowship with God through Christ Jesus.
“…Purposes mistook / Fall’n on th’ inventors’ heads…”
Will all this concern prove to be overblown? Will Scriptorium go out of business because all students decide to use AI not only to write and think for them but also to teach them things? Will my revised approach for teaching thinking and writing work next year?
The future is not in my hands. But I am responsible before God to be wise and to act in the best way I know how to help students know truth and do right.
And there is, highly ironically, one bright point about the whole AI monstrosity.
Despite all the dishonesty Artificial Intelligence promotes, its very name reveals it for what it truly is: artificial.
Those who use it have not truly learned to think or write.
And there is a chance that their dishonesty could remain unidentified in the short term.
But this kind of artificial success may backfire at any time.
(And yes, I am the one who actually did write this article—without using AI.)
(c) Grace Hughbanks, 2023