One more post about ChatGPT
I've been invited to be a panelist on a webinar on ChatGPT and its likely impact on teaching and learning. Here are a few of my thoughts.
This is going to be a short post. I need to get some ideas down on paper for a panel I’ve been asked to present on tomorrow (Wednesday, January 18) from 3:00pm to 4:00pm ET on…you guessed it: ChatGPT.
Hosted by IUPUI’s Center for Teaching and Learning, tomorrow’s virtual webinar explores the potential affordances and disruptions of ChatGPT and AI in the classroom and what it might mean for teaching and learning both now and in the future. My role is to play the part of the writing pedagogy expert, so I will be talking about the anxiety that ChatGPT has caused among writing teachers—and really anyone who regularly assigns writing assessments as part of their teaching—and what the future may hold for how we teach and assess student writing.
My general thoughts on ChatGPT are as follows: first, we’ve been here before. When the web became a reality on college campuses, when smartphones first appeared in the hands of every student, or when online learning started to gain traction in the academy, there were pockets of intense resistance. We all had to learn how to work with and think through these technologies, even though the initial knee-jerk response, more often than not, was to ban them from the classroom outright. (Heck, in some corners of higher ed, it wasn’t until the pandemic forced us all online that folks finally acquiesced that online learning might not be such a bad thing after all.)
As Dennis Baron so eloquently pointed out years ago in his classic essay “From Pencils to Pixels” (an essay I still assign in my graduate seminar on mis- and disinformation in the postdigital era), any time a new literacy technology hits the scene, reactions follow a fairly predictable arc: fear and anxiety, followed by misguided attempts to ban or refuse the technology, followed by a gradual (if reluctant) acceptance, followed by a wholesale integration into our daily lives such that, before long, we’re left wondering how we ever got through a meeting without our iPhone there to distract us. (One of the things I adore about Baron’s essay is how he works through the history of such literacy technologies as the lead/graphite pencil, showing how in its earliest stages it provoked anxiety among teachers who feared that if students were able to simply erase their mistakes, they would never truly be able to learn from them.)
Second, ChatGPT has re-opened a robust conversation in higher education circles (and even beyond) about the nature and purpose of writing instruction. This conversation has highlighted much of the anxiety around plagiarism and cheating, naturally, but it has also highlighted questions we should be asking, like Why do we teach writing in the first place? What capacities and habits of mind do we hope students will develop through the act of writing, which includes invention, drafting, feedback, and revision?
Writing is the best tool we have for helping students develop the capacities of critical thought that are absolutely central to intellectual exploration at a high(er) level: summary of complex ideas, analysis, synthesis, metacognition (why did I make the choices I made and what other appropriate choices could be made?), precision of thought, identity-building, and more. The challenge for anyone who teaches writing or uses writing as an assessment in their teaching will be to figure out how to preserve writing’s ability to develop these skills without allowing ChatGPT (and similar AI tech that is sure to come) to hijack and short-circuit this learning process. I tend to think that folks with training and backgrounds in writing pedagogy will have an easier time than folks who simply assign five-page papers on “a topic of your choice, with appropriate evidence and sources.” These folks will need to cozy up with their local writing program administrator or WAC/WID specialist or rhetoric and composition person on their campus to learn some strategies for developing better assessments, because I do believe that AI is going to be the death knell for those kinds of (let’s face it) not-very-exciting-or-well-thought-out writing assignments.
I liken it to the rise of online teaching. In the early days of online instruction, there were some really high quality online courses and a whole lot of really low quality online courses. It took time for people to figure out how to teach in this new virtual medium, how to adapt their assessments, how to communicate effectively with students, and how to get students to interact effectively with them and each other. This didn’t just happen overnight. The instructors who were better prepared to teach online helped out those who weren’t, but many of the old methods of assessment from the days of exclusive face-to-face teaching and paper-based grading went by the wayside because they had to. Those teachers had been teaching for a world that was rapidly changing and now no longer exists. (Imagine teaching a course without Canvas or Blackboard, or demanding that students submit paper-only assignments.)
Similarly, I think traditional writing assessments that ask students to “pick a theme from The Great Gatsby and write a five-page paper about it with appropriate textual evidence” will have to be revised, if not done away with altogether. These are precisely the kinds of assessments that ChatGPT can mimic in a fairly convincing fashion. We will need to assign more in-class invention and drafting, students will need to make guided revisions based on careful feedback from their instructors and peers, and then they will need to explain why they made the changes they made based on (or inspired by) this feedback.
Students will also need to do more personal writing—or writing that uses elements of their own backgrounds, thoughts, ideas, and experiences—and then synthesize this with larger social, cultural, and political topics and issues. I have a podcast assignment that I use in my first-year writing courses (ENG-W 131 at IU) that does precisely this—and it is successful because it asks students to both look within themselves (and their own experiences, prejudices, predilections, passions, etc.) and then connect these to an external issue of some significance to a wider audience—poverty in the US, the high cost of college, the importance of getting enough sleep, pandemic challenges to the healthcare profession, growing up Black and transgender in an overwhelmingly white (conservative) community in the Midwest, and so forth. (These are all topics that students have written about and developed podcasts about in my courses.)
In short, ChatGPT and AI are going to force people to really think hard about their writing assessments and change what they’ve always done, in the same way that online teaching forced us to become better teachers and savvy users of technology, or mobile phones forced us to reckon with digital distraction and then became a tool for learning and exploration. The same old stuff we’ve done in the past will have to be revised—and I have a hard time seeing how this is a bad thing, though I do get the anxiety. We’ve been here before, and anxiety brought on by new literacy technology is as old as Plato’s Phaedrus. (And probably much older.)
Lucky for us we have pencils with erasers, so we can revise.