WASHINGTON — Colleges and universities should embrace assessments of student learning to prove their worth as college costs rise and the job market remains tough.
That was the proposition proffered during a panel discussion Monday at the New America Foundation, a nonprofit think tank.
It came by way of Fredrik deBoer, a limited-term lecturer at Purdue University—where he recently earned his Ph.D. in rhetoric and composition—and a writer who deals with higher education policy.
“There’s no enterprise in the world where we pay hundreds of millions of dollars and no one asks to see how well we’re doing, except the defense industry,” deBoer said.
His message was directed largely at faculty, particularly those who may be resistant to assessments in the world of higher education.
“Faculty can take an active role in assessment and we can get out ahead of these problems and we can become a major force in shaping how assessment happens,” deBoer said. If not, he said, “it’s going to happen anyway, and it’s going to happen in a way that does not reflect faculty interests.”
deBoer’s remarks come at a time of continued concern about how much college students actually learn—an issue brought to the fore by the 2011 book “Academically Adrift,” which claimed colleges were not making much of a difference in terms of what students know.
deBoer said the book, which relied on an assessment known as the CLA, or Collegiate Learning Assessment, was “deeply flawed” because it failed to look at what students have learned in their specific majors.
deBoer said it’s important to note that assessments are already taking place, but not in any systematic way.
He said he sees nothing wrong with a “college NAEP”—or National Assessment of Educational Progress, a federally administered test that examines student performance in a variety of subjects, including mathematics, reading, science and writing. He described the NAEP as a “minimally invasive” procedure because a relatively small number of students take it, and they don’t take it over and over again as K-12 students do with other tests.
“NAEP for college sounds good,” deBoer said. “No kid is coming home because they had to go through another round of NAEP. Teachers aren’t complaining that it’s using up time and forcing them to teach to the test.”
Other efforts to assess what kind of job higher education is doing have fallen short, deBoer said.
For instance, the College Scorecard—a web-based tool that the Obama administration created to give students and families data such as a college’s average cost, graduation rate and expected salary for graduates—is “great,” deBoer said, but it also has gaps that don’t allow for the comparison of institutions with different types of students.
Similarly, he said, the college rankings put out annually by US News & World Report fail to capture how well institutions—particularly elite institutions—actually educate undergraduates, since they tend to “skim off the top and take only the truly elite high school students.”
“You could probably put those students into any university and see them excel,” deBoer said. “It’s like having a height minimum and then bragging how tall your student body is.
“A lack of assessment data allows elite universities to maintain the fiction that they teach better without having to provide any evidence that they do so.”
deBoer’s paper looked at a variety of assessments already in place. In the end, he said, he favors an assessment known as the CLA+ because it has a written performance task that he says is a “more authentic mechanism” than the multiple-choice structures that other tests employ.
deBoer said that, while most students enroll in colleges close to where they grew up, one hope is that those who “cross-shop” will begin to base their college choices on how much students learn rather than on whether a given college has nice dorms, climbing walls and the like.
“I can foresee a situation in which actual educational quality becomes a factor in determining a college, because right now it simply isn’t,” deBoer said.
But one problem lies in motivating students to put real effort into doing well on the test, as they do with the SAT, where they know the stakes of the college admission exam are high.
“We can motivate them perhaps to take the test, but motivating them to invest their top effort is one of the hardest things,” deBoer said. “That to me is one of the major empirical challenges.”
Another challenge is whether employers will ever value CLA+ scores or other assessment scores.
“Right now employers don’t know what it means,” deBoer said, because the scores “aren’t interpretable.”
“You would hope that, over time, as more of these tests happen, you could say, ‘I scored in the X percentile of the test that I took,’ and that would become more meaningful to employers,” deBoer said.
Jamaal Abdul-Alim can be reached at firstname.lastname@example.org or follow him on Twitter @dcwriter.36