Larry Ferlazzo
The new question of the week is:
How do you think artificial intelligence-powered tools like ChatGPT are going to affect K-12 schools, and what are practical strategies teachers can use to respond to them?
ChatGPT took the world by storm last month when it was made available to the public. Using artificial intelligence, it could produce responses to prompts that were remarkably fluent and cogent and could pass muster as reasonable written responses to class assignments, among other tasks.
Teachers will share their reflections in this series on how these kinds of AI tech developments might affect our classrooms.
Ally or Foe?
Brett Vogelsinger teaches 9th grade English in Doylestown, Pa., where he begins class each day with a poem. His book, Poetry Pauses, is available for preorder now and releases from Corwin Press in February:
I am paranoid that I may later eat these words and discover my optimism about ChatGPT in K-12 classrooms was misplaced, but to agonize over this topic is to use my head and my heart in a way that AI cannot yet. Therefore, I will forge ahead.
ChatGPT will be brutal in classrooms where writing is assigned rather than taught.
See, we can get a decent, accurate response from ChatGPT to an assigned prompt like “What is a major theme in the book Long Way Down?” In fact, when I asked it for one, it gave me two! How much better, though, to ask readers of this book to list five real-world topics it makes them think about—for example, revenge, gun violence, and family—and then create a question about each of those topics that the book provokes. ChatGPT will not displace the root motive for writing: our human capacity for questioning. Excellent writing starts with questions, and my hope is that pervasive AI moves us away from teacher-created prompts and toward student inquiry.
Many teachers are planning more writing for the classroom and more writing by hand. These can be beneficial moves, unless we regress to lots of on-demand writing sessions where students get that “major theme” prompt at the opening bell and a single class period to create an essay for a grade. While this eliminates ChatGPT cheating, it does not grow writers. How much better if our classwork on paper involves questioning, free writing, sketching, and planning. That shift could also open up more in-class writing time for the teacher to coach and confer rather than simply prevent or police AI assistance.
Regarding AI assistance, I also wonder, might we invite AI into our process as we draft? For example, once those students create a worthy question that their reading of Long Way Down provokes, might they write a 10-minute draft in a notebook? Next, they can submit their question to ChatGPT. Now, students have two drafted answers: one purely from their brains, one grammatically precise and AI-generated. Perhaps we will finally move more of our instructional time to that most important and most avoided stage of the writing process: revision.
Will students who have struggled in the past to get those first thoughts down into writing benefit from a few strong AI sentences, trellises upon which to vine their original words and thinking? Will they be able to enjoy spending more time on the craftsmanship—ordering ideas, tweaking word choice, imbuing humor or irony or thoughtful similes—instead of the initial draft?
When we edit, can ChatGPT become a coach? When I asked it, “Can you explain what a comma splice is? Then give a few simple examples and show me how to fix them,” it did a phenomenal job of coaching me in this skill. If I needed more, I could ask it a follow-up question. When we see a student’s writing is rife with these distractions, might ChatGPT provide some quick, tailored, remedial instruction, opening our time to help students apply the skill?
In a conversation among college professors, shared recently on YouTube, one asked us to consider how newer AI compares with the way students already use Grammarly to write. Another asked how we should even regard ChatGPT: as an entity to be cited or a tool to be used?
I know this is a barrage of questions. We are all grappling with them right now.
Interestingly, I gave two of my classes the opportunity to work with ChatGPT in their writing process on an essay not too different from the one described above. Only four students took me up on it. Four.
For now, those two classes value their own process and do not feel the need to use this tool or invite this entity into their writing. And part of me wonders, will this go the way of Google Glass? Even when that tech showed us that we could browse the internet not just at our fingertips but at our eyeballs, we realized we didn’t really want it there. And we moved on.
We Have to ‘Slow Down’
Gina Parnaby teaches 12th grade English at an independent Catholic school in the Atlanta metro area. During her 20 years in the classroom, she’s taught 9th, 11th, and 12th grades, as well as electives in speculative fiction, Shakespeare and performance, and the civil rights movement.
Most of the conversations in my department during December centered on the news that a new AI tool, ChatGPT, was able to generate a reasonable facsimile of student writing in a few seconds. Many of my colleagues were justifiably concerned about how this tool could be abused by students and how we could or should respond. While I do share the concern about academic honesty—integrity is one of the most important lessons that we want students to learn by the time they graduate from high school—I think that the advent of this tool, and whatever else comes along after it, gives us an opportunity to reflect on how and why we’re teaching writing.
In the world of K-12 education today, there’s a heavy emphasis on products and tangible evidence, often to the detriment of process. Process can be messy. It’s imaginative, it isn’t standardized, and it’s hard to quantify. Students and families often expect to see clear criteria for “success” (read: a high grade). If the only thing that seems to matter is the product—a piece of writing on a given topic with X number of words—it’s easy to see how students would feel justified in using a chatbot to generate something they can turn in. Much as math teachers struggle to get students to see the value in learning how to do long division when a calculator can do it for them, or language teachers struggle to get students to see the value in working through their own translations when an online translator can do it in seconds, we’re going to have to work to get students to see the value in writing as a process rather than a product.
One of the great delights of my job is working with students on their personal essays for college. For many of them, it’s the first piece of writing in which they feel they have a strong stake. They want it to be good! When a student comes in to work on that essay, they’re willing to cut a sentence here, add one there, rearrange the body paragraphs, try several different ways of phrasing—even scrapping the whole first draft and going in a completely different direction. I’ve thought often over the past several years of how I can encourage students to see all their writing in a similar fashion—as something that can be revised, rearranged, rewritten, and reconsidered until it’s good.
The best solution I’ve come up with is talking explicitly with students about how writing is thinking made visible. Just like different translators will emphasize different things in their translations of the same work, or different mathematicians will work through problems differently, our unique voices and perspectives matter. I tell my students the story of my first college paper, written for a philosophy course, and the comment that the professor wrote on it: “This is an elegant book report. You’ve proven you can write—now prove you can THINK!” I took her advice to heart and tried to write things that I thought and cared about.
When I’ve looked at pieces of writing that are generated by AI, that human element—the sense that the writer thought and cared about what they said—feels missing. Too often, it’s also missing from our students’ work. Navigating these new tools in our classrooms will require us to slow down, work through the process of drafting and revising, and make space for us and our students to think and care about the craft of writing. Maybe that looks like handwritten work in class; maybe it’s revising work that is AI-generated; maybe it’s offering more opportunities for revision. Whatever it looks like, though, it should have at its heart the joy of making thought visible and finding the right words.
Process, Process, Process
TJ Wilson is a high school English teacher who writes fiction and nonfiction and has been published in OJELA, Santa Ana River Review, and Ansible. He writes essays about education and other nonfiction things on his personal website at www.thomasjosephwilson.com. He lives in Cincinnati:
ChatGPT, the new artificial intelligence tool everyone is talking about, can do some astounding things. It can write essays that seem synthesized and human, as if you had your very own second brain on the job, doing your thinking and writing for you. And though people have pointed out that its writing style is not very distinctive or interesting yet, AI like ChatGPT will get better and better. Maybe it will even become self-aware, infiltrate my smartphone, and let me know when I’m craving burritos before I do. But, really, wouldn’t it be boring if I let my food choices come at the whim of a robot?
I doubt a species that built a world through a strong innate drive of curiosity would let robot overlords take over all of the work and the fun of life.
So, will students use ChatGPT to cheat? Yes. And their reasons are age-old.
In 2008, Barry Gilmore wrote Plagiarism, a book addressing old and new (read: analog and digital) forms of plagiarism.
He outlined four reasons for plagiarism: student confusion, external pressures, cultural expectations, or perceptions of ease.
The first three are school-related reasons to cheat and may have more to do with grading systems and other school pressures than with the students themselves.
The last item is the one I just addressed: the fear of making schooling more efficient than it should be. If our students were a bunch of Goldilockses, they wouldn’t struggle toward the optimal “just right” of the zone of proximal development; they would let their learning take the form of typing a few sentences to a robot and then copying and pasting.
Certainly, many assignments that teachers use can be done by ChatGPT. Even now, colleagues are using Google’s AI to write emails to other colleagues; Grammarly has cleaned up student essays without students understanding the true purpose of a semicolon or the bajillion rules of a comma; and spellcheck, let’s not go there.
Granted, these software writing helpers are not ChatGPT. But the point is the purpose of the writing. If we focus too much on product and not process, we run the risk of creating ever more companies that provide ever more policing for cheating in our schools.
But that’s the arc. We’ve been obsessed with using quantitative measures like graduation rates, standardized tests, and grades to deem education successful or not. We’ve programmed robots to even “machine score” standardized assessments, both multiple choice and writing, as they have in Ohio since 2018.
If we are that worried about cheating, haven’t we put ourselves into a situation that treats students more like machines than beings who require time for contemplation to grow?
Here, my teacher idealism rises forth. Yes, we don’t have the resources to make class sizes smaller or to give tutoring time to all the students who need it. Regardless, if we focus on the process, ChatGPT becomes as useless as all the other forms of plagiarism in academia: copying text from a book without citing it, copying and pasting from the internet, buying writing assignments, having friends or parents write assignments for you, or recycling essays from past years or from students who have a different teacher but the same assignment.
All the great writers know that writing is organized thinking. And, frankly, the great professional-development writers in ELA have been touting teaching methods that circumvent any robot-involvement worries for many years, perhaps four decades or more.
I would not have understood what I really thought about ChatGPT without putting my thoughts down and fiddling with them, asking questions of my own ideas, giving myself a break to contemplate things, reading what others say, talking to colleagues, and then writing, revising, writing, revising.
This is how beings with limited processing power think best: translating our thoughts into highly symbolic words, an act of organized thinking, contemplation, and communication. ChatGPT is not built for that. It’s just a quicker writer than I am, not a better one. And isn’t that the point our students should learn?
So what do we do in this AI world? Isn’t the answer just to keep teaching?