
As AI Continues to Progress, Opportunities and Warnings Abound

The rapid advance of artificial intelligence in the world of higher education has continued with the report that Harvard University has plans to use an AI chatbot as part of its introductory computer science course. The bot is designed to help students understand code and improve it, as well as to answer common basic questions, freeing up teaching assistants and professors to deal with more complex concerns.

Some believe that it’s a first step toward a reimagining of the role of human instructors.

“There are things that humans are uniquely positioned to do, and a lot of times, we can’t do those things as effectively because we’re expecting humans to be more like computers,” said Dr. Ivory A. Toldson, a professor of counseling psychology at Howard University and the national director of education innovation and research for the NAACP. “If we have computers that can operate certain aspects of the higher education environment, then we can re-prioritize what humans do. The human instructor can place more emphasis on things like giving emotional support to their students, encouraging them, [and] mentoring them.”

Beyond applications like chatbots, the technology is already making an impact in the classroom, according to Dr. Trey Conatser, director of the Center for the Enhancement of Learning and Teaching at the University of Kentucky and a member of the school’s ADVANCE committee, which studies the opportunities and challenges of AI in university settings.

Professors in creative fields, like writing, have found ways to incorporate AI into their teaching, cleverly taking advantage of the technology’s limitations. Some, for example, are having students use tools like ChatGPT to respond to prompts and then having them analyze the flaws in the bot’s responses.

“It can show students that the writing is mechanically pretty correct and stylistically pretty clean, [but] there’s lots of different choices we can make when we’re writing, and a lot of that might fall through the cracks for some of these platforms,” said Conatser. “So, the students learn a little bit about how you can strategize as a writer, and they learn about how you can make different moves in your writing that respond to specific needs.”

Others are teaching information literacy by having students fact-check bots’ responses to research questions.

“They make sure that the things that are coming out of the AI are actually correct,” said Conatser. “And in the process learn a bit about how, as researchers, we’re always following up on information and making sure that things are accurately representing the situation that we’re addressing.”

The technology can also help professors plan syllabi and customize courses for different sorts of learners. An instructor could tell an AI what sort of class he or she is teaching, to what level of students, the learning goals, and the mandatory textbook, and ask it how best to fit the content into a specific number of sessions. The AI could return a syllabus with assignments designed to help students with the trickiest topics.

AI also has institutional implications. With the use of a plug-in, ChatGPT can do data analysis and visualizations very quickly, saving human labor. It can also look at data in an open-ended way, to find interesting trends and patterns. (Of course, potential privacy issues would have to be resolved.) AI could also help with student experience management, sending customized information and reminders to different sorts of students, such as adult learners and residential ones.

But the advance of AI brings up equity concerns as well. Students may have unequal access to the technology, and the use of AI in the classroom could privilege those who are more comfortable with it, much as the pandemic-era shift to remote learning advantaged those with more access to and familiarity with tools like videoconferencing.

“If we’re going to use generative AI for instruction, we need to make sure that our students all understand how to use the technology effectively,” said Conatser. “[That] they have equal access to the technology via devices and access points like the internet and so on. And that they’re all coming from the same place in terms of literacy.”

There are also concerns that AI tools could be biased. Large language models are generally trained on text pulled from the internet, which disproportionately represents certain groups.

“The critique is that this favors a certain part of the world, the Northern post-industrial countries that have the most internet access,” said Conatser. “If you’re going to ask questions that have anything to do with identity and experience, the data set is going to have a much more sophisticated understanding of stuff that comes from those parts of the world.”

However, Toldson believes that AI could be especially beneficial for schools that focus on minoritized students.

“If artificial intelligence allows professors to reprioritize and focus on the more humanistic aspects of the position, I think that HBCUs and MSIs could use that more than anybody else,” he said. “A lot of times, they come in with less resources, sometimes they have more first-generation college students, and they have professors that teach more classes.”

But these under-resourced schools could be left out of the AI wave.

“All of these systems cost money,” said Toldson. “Technology becomes cheaper, but you’re always going to have that time period where only the ones who can afford it can use it. If some institutions are having advantages because they are early adopters of something that’s inaccessible to institutions that have less resources, then that could be problematic.”

It’s clear that although the AI revolution may be unstoppable, colleges and universities will have to be extremely careful to ensure it is deployed fairly and equitably.

Said Conatser, “This is going to be something that we have an obligation to reckon with.”

Jon Edelman can be reached at