On March 6, QCC faculty held a panel discussion in collaboration with Worcester Polytechnic Institute faculty on the use of ChatGPT, the artificial intelligence program that went viral in November 2022, and its effect on higher education. The panel was moderated by Brooks Winchell, Executive Director of Distance/Online Learning and the Center for Academic Excellence, and consisted of:
- QCC Dean of the School of English and Humanities Brady Hammond
- QCC Dean of the School of Business, Engineering, and Technology Betty Lauer
- WPI Associate Professor of Computer Science Gillian Smith
- QCC Professor of English John Stazinski
- WPI Assistant Professor of Anthropology and Rhetoric Yunus Telliel
The panel discussed concerns, advantages, and strategies for adapting to the new technology, which is capable of generating written content through a chatbot.
"I was struck by the novelty at first. You can ask it to write an essay in the voice of a known writer and it comes pretty close," said Professor Stazinski, adding, "It can also be a useful tool to get students over the hump who are scared of a blank page."
Winchell commented, "It's one of the first systems to have a million users within a week. There's a really cool feature of interaction with a chatbot that allows users to refine a piece of writing by changing the tone or elaborating on concepts. Definitely try it if you haven’t yet."
Beyond the initial excitement, concerns have arisen about the repercussions of automated writing in higher education settings, including plagiarism and academic integrity. Programs have already been designed to detect the use of AI in written text.
"Part of the trick is figuring out where it's appropriate to use and where it isn't," said Dean Lauer.
"I typed in a prompt and immediately thought, 'Uh-oh. This changes everything.' I was amazed a how readily available and how quickly it produced things," Dean Hammond recalled.
Professor Smith has worked with generative AI before.
"My 'uh-oh moment' was worrying that people are going to read more into what it’s capable of doing. It doesn’t understand what it’s saying and has no agency; it’s a fancy math program. Yes, people are going to use it to skirt boundaries but we’re in a lot of trouble if people think this thing is actually intelligent," Professor Smith added.
Other concerns included bias and a lack of transparency as commercial competition infiltrates this type of technology.
"I think we’ll see companies and institutions try to change that database and get their ideas out there. I not only worry about commercialization but also the aspect of social and political impacts. Different political groups could use programs like this to manipulate information," Lauer said.
The panel's consensus was that programs such as ChatGPT can't be ignored and that educators must find creative ways to engage students.
Professor Telliel described an assignment he gave to his writing classes at WPI, in which students had to utilize ChatGPT and develop learning objectives and assessments. One student compared the actual speeches of Donald Trump to AI-generated speeches.
"It made them think about rhetoric and use of evidence," he said, adding, "When they encounter ChatGPT as something they can analyze, it gives them another perspective."
Smith noted that when she fed her activities and assignments into ChatGPT to see what it would create, the results weren't good. The output often relied on advanced programming concepts that wouldn't have been covered in the class and would have caused confusion even if a student tried to use the text as a starting point.
"I found myself changing how I was teaching the assignments. Making prompts that wouldn't easily compute, like coding a rainbow octopus with googly eyes," Smith said.
Smith also noted that it's important to consider what is motivating students to plagiarize in the first place.
"Maybe we can actually fix the problems that lead to engaging in plagiarism by removing high stressors, pushing back on the commodification of higher education, and making it desirable for students to share their own ideas," she said.
"This forces us to assess actual learning instead of output. Once you realize that a robot can do your assignment, you have to think about what you’re teaching and what you’re signaling to students that you value. Do I value their voice or value jumping through a hoop just to get skills and get a job? There should be joy in our conversations about learning because if there's no pleasure in what they're doing, they're going to use AI," Stazinski commented.
"I'm most excited for the potential here that it will force all of us to really listen to the voices of our students," said Hammond, who compared the fear of ChatGPT to the fear of TI83 graphing calculator, which has now become commonplace. "We can talk all we want about programs that detect the use of AI but we really need to hear our students' voices and what they say. And if we don't adapt and grow, we're behind."
"I want to appeal to administrators to support their faculty as they navigate this. My colleagues often don’t have time to learn new technology and new techniques in a short amount of time. That's why this panel is a great one, we need to have regular discussions about public interest technology and education related technology. That’s the only way we won’t be just reacting but leading the conversation as a community," Telliel said.