A group of Democratic senators, led by Sen. Richard Blumenthal, called on three online proctoring companies to respond to equity and privacy concerns raised by students last month, setting a deadline of Dec. 17. The companies – ExamSoft, Proctorio and ProctorU – sent letters back to the senators detailing how their services work, with ExamSoft and ProctorU responding on Dec. 17, followed by Proctorio on Jan. 7.
The senators’ inquiry stemmed from reports that, in some cases, facial recognition software failed to identify students of color and students who wear religious garb, like a hijab. Students with disabilities also said online proctoring technology flagged their involuntary movements, like muscle spasms, as possible signs of cheating.
The exchange between senators and companies shines a spotlight on an industry that’s boomed since the COVID-19 pandemic shifted courses online, raising questions about the benefits and ethical challenges of using technology to monitor test-takers remotely.
People, ideally with bias training, should be the ones making decisions about test-takers “at the end of the day,” to counterbalance potential biases in the software, said Dr. Shayan Doroudi, an assistant professor of education at the University of California, Irvine, who studies equity and education technology.
He called this a “socio-technical system.”
The three companies all meld technology and human decision-makers in different ways.
ExamSoft’s online proctoring service ExamMonitor, an optional feature within its test-taking application Examplify, records students’ test-taking sessions. Institutions that adopt ExamMonitor opt into facial recognition technology that verifies the student’s identity and artificial intelligence that flags behaviors that could look like cheating for review after the fact. Even if the facial recognition fails, however, the student can take the exam and a person can verify the student’s identity afterward.
With Proctorio’s automated proctoring software, instructors or administrators choose what behaviors the technology should flag and receive a report of potentially suspicious activity, which can range from irregular head movements to disappearing from the frame. Students can be locked out of tests if their faces are not recognized by the face and gaze detection software.
“Proctorio is aware of media reports containing allegations of remote proctoring providers having greater difficulty detecting the faces of test takers of color,” Proctorio Founder and CEO Mike Olsen wrote in a 22-page response letter to the senators. “Proctorio is committed to building technology that not only recognizes, but deeply respects the diverse student populations at each of our international partner institutions.”
According to the letter, Proctorio partnered with BABL AI, an independent AI and ethics consulting company, in September to work on its face detection technology. It’s also initiating an independent research study to investigate possible biases in its algorithms and hiring My Blind Spot, a nonprofit consultancy, to conduct an independent accessibility audit of the company’s technology every six months.
Unlike Proctorio, ProctorU has its own live proctors watching the test-taking process, but artificial intelligence technology alerts the proctor when it flags a student’s behavior as suspicious. If facial recognition technology struggles to identify a student, a human proctor can override the alert.
“We use technology like a smoke detector,” said Jarrod Morgan, founder and chief strategy officer at ProctorU. “When a smoke detector goes off, it doesn’t call the fire department. It goes off and there’s a human in the house who has to figure out if there’s a fire or did you burn toast. That’s how we kind of think about our technology. We have tools that can alert our proctors to different things or highlight areas where they likely need to focus their attention, but it doesn’t make the decision.”
Company leaders argue that institutions need online proctoring to preserve the integrity of their tests during a pandemic when it’s easier for students to cheat at home and unsupervised.
“After a little less than a year of us having to do this in a modified way, a lot of people who didn’t use online proctoring are realizing that the honor system doesn’t work,” Morgan said. “And the long-range issue with that is it slowly erodes the value of the credential people are paying so much money for and working so hard to get.”
Within the industry itself, however, there’s debate about whether proctoring software goes too far in tracking minute behaviors, like irregular eye movements, that a proctor in a physical classroom wouldn’t pick up.
In recent years, “the focus has been really on how many events are we catching, how many students are we catching cheating,” said Don Kassner, a former CEO and founder of ProctorU and now president of MonitorEDU, an online proctoring company he started in 2018. “There seems to be a presumption that anyone who sits down to take a test is going to cheat, and it’s just a matter of putting together the tools that are going to be able to catch them.”
Kassner said he created his own company in part because he didn’t want to use some of the more “invasive” cheating detection methods now popular in the industry. He thinks they over-monitor students and risk amplifying biases.
“There’s always a little bit of bias somewhere in life,” he said. “Even humans aren’t perfect. [But with individual proctors,] it isn’t systematic. It’s not going to look at someone’s characteristics and flag them for a reason it shouldn’t flag them for. A human being is going to have judgment. They’re going to look at the situation.”
There are, however, steps companies can take to better ensure their technology works for all students, said Doroudi. For example, they can use more diverse data sets to train facial recognition technology so the software is better able to identify students with different skin tones and clothing.
He also suggests companies think “more proactively” about predicting potential equity problems and develop an “action plan” for when they do happen.
“It’s not enough to say there’s no evidence [of a problem],” Doroudi said. “The onus should be on the company to actively see where the outcome might fail.”
Nonetheless, unforeseen equity issues are bound to arise, he added, especially in a pandemic, when these technologies are being swiftly deployed at larger scales. Plus, institutions don’t know all the “ins and outs” of how the software they’re using works, but he thinks it’s in everyone’s interest to admit it.
“We need to be willing to act quickly and acknowledge that these kinds of mistakes will happen but that we’ll try to come up with solutions to try to fix it,” he said. “Sometimes you have to release something that’s not perfect but be ready to catch those imperfections.”
Sara Weissman can be reached at firstname.lastname@example.org.
Editor’s note: ExamSoft’s test-taking service is called Examplify and its proctoring service, an optional feature, is called ExamMonitor.