Incorporating Predictive Analytics into a Student Success Strategy
Predictive analytics can serve as the foundation of student success efforts. By drawing together data from disparate campus sources and systems, predictive analytics software can enable institutional leaders to predict the likelihood of student attrition, identify at-risk students, and match them with resources that can help them succeed.
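To illustrate the kind of modeling described here, the sketch below fits a simple logistic regression that estimates attrition risk from variables pulled from several campus systems. This is a minimal, hypothetical example: the predictors, data, and threshold are invented for illustration and are not Bellarmine's actual model or schema.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500

# Invented predictors, each standing in for a different campus system
gpa = rng.uniform(1.5, 4.0, n)            # registrar
events_attended = rng.poisson(3, n)       # student-life engagement
unmet_need = rng.uniform(0.0, 10.0, n)    # financial aid, $ thousands

# Synthetic outcome: attrition more likely with low GPA,
# low engagement, and high unmet financial need
logit = 2.0 - 1.2 * gpa - 0.3 * events_attended + 0.25 * unmet_need
attrited = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Fit one model on the combined variables rather than any single one
X = np.column_stack([gpa, events_attended, unmet_need])
model = LogisticRegression().fit(X, attrited)

# Score a new student; outreach staff might flag anyone above a threshold
risk = model.predict_proba([[2.1, 0, 8.0]])[0, 1]
print(f"attrition risk: {risk:.2f}")
```

The point of the sketch is the workflow, not the coefficients: data from multiple offices feed one probability per student, which is what lets outreach focus on the students who need it most.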
This web seminar outlined strategies for incorporating predictive analytics into a student success strategy. The director of institutional research and effectiveness at Bellarmine University in Louisville, Kentucky, discussed how he is using predictive modeling software in innovative ways to connect the IR department with the Student Success Center, to better focus outreach efforts on the students who need it the most.
Associate Dean of Academic Support, Student Success Center
Director of Institutional Research & Effectiveness
Dean of Student Success
Kristin Wallitsch: We want to be sure that we are making changes for the better using data-driven decisions. The goal for the Student Success Center is for all Bellarmine students to find success in their first year and persist to graduation—all by using robust data to evaluate the services we are offering.
Drew Thiemann: We’ve had growth, and our retention had been very steady in the first year, as well as the second, third and fourth. We’ve had a graduation rate above 65 percent for most of the last two decades. But there was one outlier in the fall 2016 cohort: We brought in a class that had more difficulty, not feeling as engaged, not showing up for events, and lacking on academic preparation metrics. We didn’t know which variables influenced the student behavior. All we knew was that retention was much lower, and we were hearing from the board and the president’s cabinet asking what we could do to intervene and address this. So we needed to understand ourselves better.
Kristin Wallitsch: There was a lot of anecdotal information spreading around the institution. We wanted to be able to describe what was actually happening and share that with our stakeholders.
Drew Thiemann: We’d had Rapid Insight since 2011, but it was used mainly by our admissions team to predict enrollment and matriculation. Then 2017 was our pilot year for using it as a multifaceted solution, with many offices convening and compiling data to help us with this project.
Kristin Wallitsch: Our Student Success Task Force has evolved over the years. It began as a group that sat around with a list of students and put them into particular buckets: students with financial concerns, with health concerns, with academic concerns. That approach looked at just one variable for each student. Now predictive modeling has let us take these conversations to a new level.
Drew Thiemann: That’s where the new math comes in. We need to understand the intersectionality of these data as opposed to one bucket about a student or one characteristic that drives a whole office’s work for a year.
We also have a homegrown case management system for monitoring and documenting our efforts, not just one-on-one efforts with individual students, but also how responsive we are to these different types of issues. We were able to position this well so that everyone on campus can contribute data: coaches, residence life coordinators, and all staff and faculty.
Kristin Wallitsch: Our student alert system is called Focused Interventions for Rapid Engagement, or FIRE. A lot of our stakeholders across campus see this and value it for individual outreach. That’s fantastic. The more submissions we have, the better we’re able to impact and support our students.
It’s also important to recognize that it helps us with a holistic view. FIRE informs us not only about the whole person, but also about our entire cohort. We’re able to run reports that give us trends and tell us about some of the things our students of concern are experiencing.
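The cohort-level reporting described here can be pictured as a simple aggregation over individual alert submissions. The sketch below is hypothetical: the column names, concern categories, and records are invented, not FIRE's actual data model.

```python
import pandas as pd

# Invented FIRE submissions: one row per alert, with month and concern type
fire = pd.DataFrame(
    [
        ("2017-09", "academic"), ("2017-09", "financial"),
        ("2017-10", "academic"), ("2017-10", "academic"),
        ("2017-10", "health"),   ("2017-11", "financial"),
    ],
    columns=["month", "concern"],
)

# Count submissions by month and concern type to surface cohort trends
trend = fire.groupby(["month", "concern"]).size().unstack(fill_value=0)
print(trend)
```

Each row of the resulting table is a month and each column a concern type, so the same submissions that drive individual outreach also show which issues are rising across the cohort.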
Jim Breslin: A lot of the conversations we’ve had while leading this effort have involved a layered approach. Throughout this pilot year of using our predictive analytics tools, it’s been about: How do we make this operational and meaningful at the individual student level? How can we segment our student population to understand things that are going on within specific subgroups? How do we have a better understanding of our cohort as a whole?
Looking at those different layers, combined with the conversation we’re having now about how to enhance our efforts for the coming academic year, gets into how we integrate FIRE more fully across all those different pieces. We learned a lot about our students and our campus that maybe we did not fully understand before.
But the other thing is that we have started to change some of the narratives and conversations on our campus about how we support students. Previously we’d made a lot of assumptions based on rather simplistic descriptive statistics that the university had been relying on for years.
We blew past that this year, and now we understand that a single variable is not enough to label a student at risk. That was a conversation-changer, a game-changer. It makes things murkier, but also a lot more nuanced. As a researcher, that’s very encouraging to me. As a practitioner, it creates some new challenges, but good ones.
To watch this web seminar in its entirety, please visit universitybusiness.com/ws051018