AI in Admissions Can Reduce or Reinforce Biases

Updated Sep 15, 2021

Admissions offices have been rolling out new technologies that use artificial intelligence (AI) to engage with or evaluate prospective students. But experts and enrollment professionals point out that AI holds the power to close equity gaps as much as to widen them, depending on how these emerging tools are used.

“When you’re using AI, you’re usually asking the data to find out what you have said is important based on the data that is in your possession,” said Dr. Kirsten Martin, a professor of technology and ethics at the University of Notre Dame’s Mendoza School of Business. “The key is the data that I have in my possession. Data is not clean of the sins of the past.”

Martin noted how the same AI program could reduce or reinforce biases in admissions recruiting. A program, for example, could find high schools in marginalized communities that a university has not historically reached out to, so that offices can recruit more students from underrepresented backgrounds.

“But the exact same program could be used to find students who can save the college money, not increase diversity, if you ask who is expensive to admit and who is not,” she said. “So, you could find students at boarding schools in the Northeast who your program maybe historically says don’t use as many expensive academic support services on campus.”

Adrienne Amador Oddi, vice president of strategic enrollment and communications at Queens University of Charlotte, agreed that admissions inequalities can be heightened or mitigated depending on how AI technology is used.

“The whole admissions process can be reimagined in a more inclusive way. Each year, we get a bit better. But we code inequality in new terms,” said Oddi, who pointed out that anonymizing student information with AI can help reduce subconscious human biases in reviewers.

That coded inequality could come up when assessing a student’s curricular rigor, even under need-blind admissions. If the student attends a lesser-resourced high school with fewer advanced placement classes, then curricular rigor would appear differently than for a student at a wealthy high school with several advanced classes.

“That’s the hard part of saying that AI is all good or all bad. It’s not neutral,” said Martin. “When you’re deploying it, you need to be thoughtful about how you’re deploying it.” 

Some technologies measure prospective students’ engagement on college websites to gauge their interest.

“AI can bring some visibility to those activities that students are doing that we may not see,” said Oddi of this tool. “On the other hand, schools still need to access technology to create that visibility. The most marginalized communities don’t know they should be poking around a school’s website because they’re being tracked.”

Just because something has been quantified into a dataset doesn’t mean it wasn’t created in a biased or discriminatory way, added Martin. “We sometimes call data ‘objective,’ but that is not appropriate. Really, data is easily quantifiable information,” she said. “It is not objective to look at GPAs, for instance.”

In 2016, Dr. Tim Renick at Georgia State University (GSU) looked into another use of AI: chatbots that let accepted students ask basic enrollment questions. Like many institutions, GSU faced a “summer melt” problem, in which accepted students miss key enrollment steps in the months before the fall term starts.

“We found that the students who were being weeded out or eliminated by the summer months were disproportionately from underserved backgrounds, mostly non-White and first-generation students,” said Renick, who today directs GSU’s National Institute for Student Success. “So, when we talk about equity gaps at the college level, those gaps begin even before the first day of college.”

With education technology company Mainstay, formerly AdmitHub, Renick’s team brought in chatbots for simple questions that saved the university time and responded quickly to students.

“We built up a knowledge base that responded to texts from students with vetted answers about things like how to fill out the FAFSA,” said Renick. “But the AI we’re using does not write the answers to the questions. We’ve personally written them. The AI uses algorithms to correctly select the right answers in the knowledge base to respond.”

And if no answer fits the student’s question, Renick said, that question is routed to a staff person, who writes a vetted response back. In recent years, summer melt at GSU has dropped by about 20 percent.
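The workflow Renick describes, where human-written, vetted answers are stored in a knowledge base, an algorithm selects the best match, and anything below a confidence threshold is escalated to staff, can be sketched in a minimal, hypothetical form. Mainstay’s actual system is far more sophisticated; the example questions, answers, and threshold below are invented for illustration:

```python
from difflib import SequenceMatcher

# Hypothetical knowledge base. As at GSU, every answer is written and
# vetted by humans; the algorithm only selects among them, never writes them.
KNOWLEDGE_BASE = {
    "how do i fill out the fafsa": "Vetted, staff-written answer about completing the FAFSA.",
    "when is tuition due": "Vetted, staff-written answer about the tuition deadline.",
}

def best_answer(question: str, kb: dict = KNOWLEDGE_BASE, threshold: float = 0.6):
    """Return the vetted answer whose stored question best matches the
    student's text, or None so the question can be routed to a staff member."""
    question = question.lower().strip()
    # Score every stored question against the incoming one (0.0 to 1.0).
    scored = [(SequenceMatcher(None, question, q).ratio(), a) for q, a in kb.items()]
    score, answer = max(scored)
    # Below the threshold, escalate to a human rather than guess.
    return answer if score >= threshold else None
```

A close paraphrase of a stored question returns its vetted answer, while an unmatched question returns `None`, mirroring the human fallback Renick describes.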

While Renick agrees there are serious potential harms with using AI and other technologies for enrollment, he sees the equity benefits as too powerful to ignore, if programs are vetted.

“I would encourage my colleagues not to turn their backs on this technology. But to make sure that you test it continually,” he said. “The benefits can be truly transformative. To say we’re not going to use these systems because there could be problems seems to me shortsighted. There could be ways to understand and prevent those abuses to see real change.”

Oddi’s enrollment team at Queens University is similarly implementing technology that could make a difference in equity.

“We’re trying to see who is staying on our financial aid pages longer and do some proactive outreach to families,” said Oddi. “That’s a huge benefit for folks who see the sticker price and don’t know how to make sense of it. But I still need to think about reaching those students that are not being coded, that are not lingering on financial aid pages yet still could have questions.”

A human touch remains important to her.

“Systems have limitations and people have limitations, but in partnership, we should be able to reduce the limitations across the board,” said Oddi. “I don’t think it’s an either/or on whether to always go with a human decision or an automated decision. People have biases like equations do, so we need to work together.”

But Oddi brought up another point in this debate about AI’s use in admissions decision-making.

“This is sort of a luxury conversation,” she said. “Most colleges and universities say yes to almost anyone who applies, so we’re really only talking about a small subset of schools using this technology in a decision-making process. I think there are more benefits with AI in how to help campuses prepare for students coming in to give them support.”

For example, if AI picks up that incoming undergraduate classes in recent years have taken more marine biology than physics in high school, and those marine biology classes have a different laboratory component, then admissions could work with the university to make sure the first lab course gives students the basic introductions they need.

“That’s the greater value of AI to me than using it to say yes or no,” said Oddi, who noted that AI’s predictive powers are still important to the job of admissions officers. “We have physical limitations with our campus. We can’t all of a sudden enroll 600 students when we need 300 students, so that power is important. But we are also vision makers trying to make changes.”

Rebecca Kelliher can be reached at rkelliher@diverseeducation.com