When ChatGPT, the large language model (LLM) that can generate long, detailed answers to complicated questions, was first made accessible to the public in November 2022, it changed the landscape of education forever.
“As you can imagine, people were and still are apprehensive. I would describe faculty as on a spectrum between sanguine and despairing,” says Dr. Joseph F. Brown, director of academic integrity at The Institute for Learning and Teaching at Colorado State University (CSU). Faculty members, Brown recalls, were worried that students would use ChatGPT to cheat and bypass any difficulties they encountered, negatively impacting learning.
A perfect solution was not easy to find. Some institutions turned to other AI software to assess assignments’ originality in the same way institutions assess for plagiarism. But this method led to its own problems, including students’ work being falsely flagged as AI-generated.
Now, almost a year after ChatGPT's release, Brown says CSU faculty and administration are coming to terms with the fact that "policing students' use [of ChatGPT] is probably impossible, usually ineffective, and definitely inefficient."
Instead, CSU faculty and educators across the nation are discovering ways to adjust their pedagogy to accommodate this brave new world, not only by creating AI-proof assignments but also by designing assignments that purposefully incorporate AI use. Those assignments acknowledge both the usefulness and the limitations of the technology, which cannot recognize or filter out the human biases and negative stereotypes embedded in the almost limitless body of data and information from which it draws its answers.
Experts say educators have a responsibility to rethink how they assess learning and help their students gain mastery over AI tools like ChatGPT and other LLMs so they graduate into the world fully ready for their future in a technologically dense workforce. Using AI as a tool will not only better prepare students for the future, experts note, but can also help ease the workload of faculty and administration.
“Faculty need to be clear when they are working with students that one of the primary limitations of AI, generative AI, and LLMs is they are crowdsourcing information — by definition, providing for you a majoritarian view of the world,” says Dr. Inara Scott, senior associate dean in the College of Business at Oregon State University.
Detecting and recognizing biases
ChatGPT and other LLMs are trained on enormous amounts of text scraped, or gathered, from the Internet, and they draw on that training data to produce solutions or answers to a query at lightning speed. This means that incorrect information can be presented just as confidently as true information, and that the perspectives of marginalized persons or groups are unlikely to be present in any solution AI offers to its user, which "has clear limitations and implications," says Scott.
“We need to teach our students the places we’re likely to see bias in a particular discipline,” says Scott. “I teach business law, and I know about areas of law where there is structured bias built into that law. I know about the history of legal bias and racism — so when I teach law, I need to instruct my students in ways that bias might show up and what to think about when they ask an LLM a question.”
Dr. Vaughn A. Calhoun, assistant vice president and dean of the Center for Academic Success at Seton Hall University, agrees.
“As long as there are biased humans, AI will also turn out data that’s biased, which is why AI literacy needs to be a natural part of higher education,” says Calhoun. “If we don’t think about the ways it can be used, we’re setting ourselves up for failure.”
Provost Dr. Janice L. Nerger “led the charge” in educating faculty at CSU, Brown says, hosting campus-wide events that increased their understanding of and facility with ChatGPT and other generative AIs. But Brown still has his concerns, especially regarding learning.
“I’m concerned about students taking the first topic, outline, or feedback suggested [by ChatGPT]. I’m concerned that they don’t have the expertise to know poor output from good,” says Brown. “I worry that in our attempts to seem innovative and embrace something new and exciting, we won’t spend much time asking if students are, in fact, learning with this new tool.”
Calhoun investigated various ways the professoriate across the nation has adjusted pedagogy to incorporate or acknowledge AI. Some assignments purposefully ask students to use ChatGPT to generate answers for a question. Then, students are required to show how the AI might be presenting false or misleading information. Assessments like these encourage critical thinking, something Calhoun points out is as much the institutions’ responsibility as it is the students’.
“It’s about reshaping mindsets, updating teaching methodologies, and revamping institutional structures,” says Calhoun. “AI and ChatGPT is here for the long haul, and this whole shift has pushed administration and educators to rethink how to teach, new ways to help students think critically and make smart decisions, using AI tools for help, without letting our own thinking skills take a backseat.”
Students and faculty members who know how to properly use AI, Calhoun adds, will be able “to navigate a world where information is everywhere, and not all of it is reliable.”
“In a world where data is king,” says Calhoun, “AI is a superpower.”
The need to embrace AI
The sooner faculty members become better acquainted with AI, its limitations, and its advantages, the sooner they, too, can benefit from it, says Calhoun. That's why some AI-detection software isn't looking for evidence of plagiarism or learning shortcuts. Instead, companies like GPTZero are offering their clients a detailed and nuanced look at how students are engaging with AI during their homework assignments, even identifying which assignments triggered more AI use than others.
“To a lot of our schools, we’re helping you take control by knowing where AI is happening, but not framing it as adversarial,” says Alex Cui, chief technology officer at GPTZero. “We give people the ability to make informed policy decisions, giving them resources on why or what you should do when students see AI, how to make AI-proof assignments more interesting.”
Many faculty have found ways to blend AI with human-driven writing, marrying the two skill sets.
“It’s realistic to say, in a lot of knowledge economy jobs we’re doing in the future, people are going to be using AI in their daily workflow,” says Cui. “Teachers need to work with students to ask, ‘how did you get to this answer, what was the process,’ and see if they demonstrate the skills they need to be in the real world. The assessment part of schooling has shifted more from the result to the process.”
The ability to make these pedagogical shifts will vary from institution to institution and greatly depends on the resources available to faculty, says Scott.
“You’re always gonna have those first round of folks who are nimble, agile in the classroom, interested in adopting new techniques and able to flex in unique ways, which comes with some privilege around small class sizes, upper division courses, maybe different student body types,” says Scott.
But faculty members with heavy workloads — perhaps working at multiple institutions with large numbers of students, teaching courses they did not design, or overwhelmed and resistant to new technology — will need extra support from administration to make the transition, notes Scott. And it’s precisely those faculty members who might benefit the most from higher education’s embrace of AI.
“Faculty who have large class sizes, they’re under enormous amounts of strain going into this transition. They don’t have the time or resources to have individual relationships with students that they might get at a private institution, that faculty would love to give,” says Scott. “What AI might allow us to do is provide more individual attention to students who can’t afford and aren’t able to access that other, magical, lovely version of higher education.”
AI can also be useful in identifying areas where students might be struggling, and it can offer lightning-quick feedback on assignments that students might otherwise have to wait weeks to receive. AI can likewise supplement advising and help keep students on track for graduation.
“There’s also a possibility that, if we don’t do a good job of figuring out how to harness this, it doesn’t go well,” says Scott. “If we don’t change the way we teach to account for these tools, don’t think about the ways this impacts our students, there are potentially bad outcomes.”