Wednesday, August 02, 2023

When AI answers the wrong questions

Writing instruction is irrelevant, they said, or something very close to that. Artificial Intelligence excels at so many writing tasks, they said, that in the long term we won't need to waste time teaching students how to write effectively because they won't need those skills in the so-called Real World.

Imagine a world, they said, in which you never again have to explain to a student how to use an apostrophe--because the AI can take care of that. Students who rely on AI will turn in papers without annoying spelling errors, without misplaced or missing commas, and without subject/verb agreement flaws.

And without humanity, I wanted to interject, but there's no point in yelling at the radio. Without creativity, without playfulness, without depth of ideas, without so many things that make writing worth reading.

To my mind, the AI cheerleaders seem intent on providing answers to the wrong questions. If the question is "What's the most efficient way to get students to produce error-free essays?" then AI is a pretty good answer. But what about other kinds of questions?

Go ahead and ask ChatGPT about the purpose of suffering. It can instantly produce an error-free, well-organized essay outlining various philosophies of suffering, with a caveat at the end:

It's important to note that these explanations are not exhaustive, and different individuals and cultures may hold unique beliefs about the origin and purpose of suffering. While understanding the reasons behind suffering can offer insight, addressing and alleviating suffering remain important goals for individuals and societies.

All true! And if I ever need a brief summary of different approaches to understanding suffering, I'll know where to look. But what I don't see in this passage is any evidence of original thought: given all these approaches, which do you find most relevant to your life or to a particular text we're discussing? If, as Richard E. Miller insists, writing is "a technology for thinking," then I want writers to do some thinking in their writing, not just parrot back what others have thought.

But apparently I'm missing the point. According to the experts on the radio, I need to think about writing tasks from the perspective of students, who may suffer extreme anxiety at the thought of putting words to paper. AI will help them overcome that anxiety by enabling them to produce error-free prose fulfilling the requirements of the assignment.

Again, all true! If the main purpose of education is to ease students' anxieties about writing, then AI is a pretty good tool. I don't want to minimize the impact of writing anxiety--heck, I've been writing professionally for over 50 years and I still get tense and jittery when I have a deadline. But in my experience, we don't overcome writing anxiety by avoiding writing but by writing. Just as a performer can learn to transform stage fright into energy that connects with the audience, the anxious writer can learn to transform anxiety into energy on the page.

But this transformation takes time, and it takes effort, and it takes writing--a lot of it. The AI cheerleaders seem delighted that these new tech tools will allow students (and others) to write less when what they really need is to write--and think!--more. 

But I can't tell the radio experts all that, and even if I could, why would they listen to me? After all, I'm a dinosaur facing certain extinction in a world in which writing and thinking will no longer matter.
