Whenever a semester ends at the University of Texas at El Paso, Donna Ekal, Associate Provost in the Office for Undergraduate Studies, asks the registrar to create a spreadsheet that lists all the students who failed a first-semester course.
Then she begins to investigate the reasons behind the F’s. The explanations vary, and not all of them have to do with what does or doesn’t go on in class.
“Students were failing a class but they weren’t failing for academic reasons,” Ekal said she has found.
At UTEP, where roughly three out of four students are Hispanic and many live in nearby Mexico, the factors ranged, she said, from family obligations, such as being the only English speaker at home and having to interpret for a relative during a doctor's visit, to transportation problems or significant work hours off campus.
Knowing those particular circumstances, Ekal said, enables the university to develop solutions so that student success isn’t hindered strictly because of family or socioeconomic situations.
“What we can do is work with academic advisers to help pull that information from students and see if this is indeed an issue and, if it is, talk about some alternatives,” Ekal said.
The data-driven approach being taken at UTEP is one of several featured in a new policy brief issued by the Institute for Higher Education Policy, or IHEP, titled “Using Data To Improve Minority-Serving Institution Success.”
The brief, which focuses on the Lumina Foundation for Education's MSI-Models of Success project, adds to the growing calls for universities to use technology not just to store information the law requires them to keep, but to identify problem areas where intervention can help smooth the path to academic success. (MSI is an acronym for minority-serving institution.)
Dr. Michelle Asha Cooper, president of IHEP, said getting institutions to switch from collecting data to satisfy legal requirements to collecting data to help students represents a much-needed paradigm shift in higher education.
“We want to change the mentality of collecting data for compliance purposes to collecting data for decision-making,” Cooper said.
However, important questions remain about how to bring about that reality, especially given the costs of collecting data and analyzing it in a student-centered way, not to mention the staff hours needed to follow up on any warranted intervention.
In the case of the Lumina MSI-Models of Success project, grant money covered, or at least helped cover, those costs.
Cooper said institutional leaders must decide how much time and money to invest in using data to help improve outcomes for students.
“We’re in a time of budgetary constraints,” Cooper said. “We’re also at a time when college completion is at the forefront. If institutions that serve underrepresented populations really want to make sure they are helping students move through the pathways, they’re going to have to figure out how to use data for internal decision-making but more importantly for overall improvements.”
At UTEP, Ekal said, interventions took on a variety of forms.
For students who must make the time-consuming trip across the Mexican border, Ekal said, the university developed guides on carpooling and on studying while riding the bus. It also produced podcasts for various courses that students can listen to during the ride.
For students who work significant hours off campus, efforts are made to find employment opportunities on campus. In other cases, students are offered workshops on study skills and note-taking.
It’s too early to say how much these efforts are paying off. UTEP received a $500,000, three-year grant for its work in the MSI-Models of Success project and is now entering the final six months.
The ultimate question is whether students graduate in four to six years, Ekal said.
Cooper said it’s also important for universities to look at “interim measures,” which are also addressed in the new IHEP brief.
Interim measures include 1) placement, or making sure students are placed in appropriate courses at the outset of their college careers; 2) persistence, or the degree to which students remain continuously enrolled; 3) progression toward a credential; and 4) completion.
The IHEP brief says these interim measures are particularly important at MSIs, which don’t always fare well based on data collected in the federally maintained Integrated Postsecondary Education Data System (IPEDS).
For example, the brief states, graduation rates are calculated using only first-time, full-time students who graduate from the institution where they started.
“Although this measurement is not unique to MSIs, it is problematic because MSIs educate and graduate many part-time, nontraditional, and transfer students who may not be counted in the standard graduation rate calculation,” the brief states. “Consequently, MSIs may appear to have low graduation rates, which can affect their funding and peer competitiveness.”
Cooper said the interim measures tell the more nuanced story about success at MSIs and can help institutions get better at helping students graduate.
“If you’re driving down the road, there are mile markers at every mile that tell you how much closer you are,” Cooper said. “That’s what these interim measures do.”