
Higher Ed Leaders and Scholars Discuss Promises and Pitfalls of AI Tool Usage

Artificial intelligence (AI) tools such as ChatGPT can prove valuable and promising in the realm of higher education but come with their own set of issues that need to be considered, according to higher ed leaders and faculty who participated in a panel discussion on Wednesday.

"Our use of AI in our teaching can be seen both as a threat to how students acquire foundational knowledge, but it can also be an enabler to enhance student learning,” said panelist Dr. Gilda A. Barabino, president of the Olin College of Engineering.

The Jan. 10 online panel – hosted by the American Association of Colleges and Universities (AAC&U) – invited a number of scholars in higher ed to weigh in on the potential and challenges that AI tools may bring to the field.

Moderated by AAC&U President Dr. Lynn Pasquerella, the discussion comes at a time when the topic of AI tools continues to make its rounds through public discourse nationally and internationally, particularly in sectors such as art, writing, and labor.

The arrival of consumer-grade AI tools in the higher ed space has been swift and “immediate” in a way few other developments have been, said panelist Dr. C. Edward Watson, AAC&U’s associate vice president for curricular and pedagogical innovation and executive director for open educational resources and digital innovation.

"ChatGPT launched in November of 2022,” Watson said. “Within two months, there was a survey of students that found 89% had already given ChatGPT a try."

And with its introduction into the mainstream come a number of promising use cases. But faculty need to be properly trained in AI so that they can in turn prepare students to be both competent in using AI and ready for workplaces where AI is present, Barabino said.

Olin offers courses in which faculty and students incorporate AI tools to review and assist with schoolwork, as well as a course on the ethical implications of AI in engineering, she said.

“We really emphasize the importance of social considerations in every technology course throughout the entire curriculum,” Barabino said. “In doing that, we've created this atmosphere where questions of how and when to use a technology are paired with questions of whether to use a technology. These new AI tools ... are within that wider context."

AI shows promise in recognizing patterns in student behavior and how those patterns relate to persistence and retention, said panelist Dr. William J. McKinney, senior director of higher education initiatives for the Council for Adult and Experiential Learning. The speed at which AI tools can gather and process data and help develop early warning systems for student advisers is valuable, he said, adding that human analysts will still be needed to distinguish correlation from causation.

However, generative AI tools such as ChatGPT also come with issues of factual inaccuracy and sourcing, according to the panelists. For instance, although AI can be used to create open educational resources (OER) that are free and even more accessible, the content such tools produce often lacks attribution, Watson said.

And given how these tools amass their data collections by pulling from numerous sources, concerns over copyright are present as well, said panelist Dr. Bryan Alexander, a senior scholar at Georgetown University.

“All of these tools are trained on huge amounts of digital content, some swathe of which was copyrighted,” Alexander said. “So already we have a whole series of lawsuits.”

He also mentioned how AI tools – which have been met with ample skepticism and suspicion from the general public – can bring about concerns over labor and automation, and how AI may replace human workers in certain markets.

In the context of pedagogy, particularly in the humanities and social sciences, it can sometimes be difficult to discern whether students have used AI for their work. But that means educators should be asking different questions, ones that don’t have straightforward answers that can be duplicated by machines, said panelist Dr. Michael S. Roth, president of Wesleyan University.

"We don't want the answer. We want to see the student thinking and expressing their thinking,” Roth said, adding that he wants to see more methods of getting students to show their independent thinking while listening to ideas from other sources, including AI.

He suggested oral exams and class conversations as possible solutions but pointed out that they may disadvantage smart students who don’t excel in such activities.
