Reflecting on the Conference on Teaching and Learning in Mobile

Earlier this week, I went down to the 16th annual Conference on Teaching and Learning, hosted by the University of South Alabama. A colleague and I drove down from Montgomery to learn more about this (somewhat) new technology that’s affecting our work as teachers of writing on a near-constant basis. What most people simply call “AI” usually refers to large language models, like ChatGPT, which are today the most commonly used tools of this sort. That includes the students who drop their assignment prompt into an AI tool for the same reason that I load my dishwasher: it’s just easier. Well, after a year or two of doing everything from proselytizing to punishing, teachers and professors are coming to a few realizations that no one loves but that we’re all accepting, some more slowly than others: AI is here, it’s growing in prevalence, and because it’s free and accessible, students are using it.

The most common sentiment that I heard during our two days at the conference was that it’s time to start teaching “AI literacy.” Most of us seemed to agree that AI can be a useful tool, some were vocal about their real disdain for it, and a scant few admitted to being neophobes and Luddites. Personally, there are aspects of AI that I like: the maps apps on my phone and the product suggestions on sites like Amazon. What I don’t like, and I think I’m speaking for most teachers of writing here, is when students use AI to manufacture and submit work that isn’t their own. There are myriad problems with using it that way, from failing to learn relevant skills to committing honor code infractions. I liken this practice of generating papers for classes with AI to the online scam called catfishing. In both cases, a person uses digital tools to pass himself or herself off as something he or she isn’t in real life. In the case of academic work, students who submit AI-generated papers are portraying themselves as competent writers and thinkers when they may not be. Unfortunately, some students have chosen to use this new tool in even more nefarious and manipulative ways. And now the general consensus seems to be that it will be up to educators to do the hard, grinding work of showing them the right way to use it.

The most common solution to our conundrum that I heard, in multiple sessions, was to employ AI tools during the drafting and composition process. Several presenters described a writing process that involved, and even required, using AI tools to generate text that the student would then compare and contrast with his or her own writing. This would necessitate submitting the AI-generated text to the professor, creating opportunities to review the differences, give advice, and teach students how to use AI as a thinking tool within the process. Sometimes that comparison yields the recognition that the AI-generated response is bad, wrong, or even fake. (These are called “hallucinations”: when AI can’t come up with a real answer, it just makes shit up.) This kind of incorporation, which will apparently be dubbed “teaching AI literacy,” is going to become part of our jobs as educators.

As a teacher of writing who is also an open-minded person, I am not opposed to change. I ask my students to learn and grow, and I expect that I should, too. I walked into a classroom in 2003 with no training and no experience, and I learned how to teach. I adapted when the Great Recession cut teacher jobs and classroom funding. I adapted as students got smartphones and brought them to school. I adapted when COVID forced us all to work from home. Now, I’m adapting to the emergence of artificial intelligence. In every case, though, what I want to point out is this: society has put a heavy burden on educators, expecting us to respond to problems we didn’t create and insisting that we know what to do about problems we haven’t been trained to address. Educators didn’t cause the Great Recession, we didn’t give kids smartphones, and we certainly didn’t create COVID. Society dropped those things on us like hot coals. I can say with certainty that society is doing that again with AI. Educators didn’t invent it, and we didn’t ask for it, but we’re being made to serve on the front lines of dealing with it.

I want to note here that it will be up to educators to do that, since parents, grandparents, aunts and uncles, older siblings and cousins, friends, pastors, coaches, neighbors, community leaders, politicians, business leaders, mentors, and other adults are, once again, going to leave that work to educators. We will be expected to tame what someone else unleashed. Every adult with a conscience knows that cheating in school is wrong, and manufacturing schoolwork that will be submitted for a grade is cheating. Yet somehow the task of making that clear is once again falling to educators, as we have to convince young people that they should come to school, do their own work, and learn . . . instead of just plugging assignment instructions into a prompt box and turning in the output from that minimal effort.

So what do I want? Help us! Society needs to help us. In Catholic culture, we say that parents are the primary educators. We need parents teaching their children to be responsible with AI in the same way that they’d teach their kids to be responsible with other tools, like a power saw or a car or a gun. In the wider culture, we also say, “It takes a village.” We need everyone in our children’s lives to do better than muttering the thoughtless adage, “Well, you know, it’s here to stay,” and sending them to school for someone else to figure it out. If complacency and passing the buck define our societal attitude on this, then we really do have to worry about AI taking our jobs and controlling our lives. In the meantime, educators are going to pull our weight (probably more than our weight!), but just like before, we shouldn’t be expected to pull everyone else’s weight, too.

Got anything to say about this?