Building a Data-Focused Campus Culture
Every institution has access to data that can help to drive more effective decision-making; the challenge is that often it resides in silos around campus. By democratizing data access across the institution and building a data-focused campus culture, staff are empowered to make more effective decisions.
This web seminar explored ways to build a data-focused culture on any campus. The chief operating officer of the Community College System of New Hampshire (CCSNH) discussed the institution’s ambitious new goal that 65 percent of adults in the state hold a credential of educational and economic value by the year 2025, and how data-driven decision-making is central to this ‘65 By 25’ initiative.
Jon MacMillan
Senior Data Analyst
Rapid Insight

Charles Ansell
Chief Operating Officer
Community College System of New Hampshire
Jon MacMillan: An analysis released earlier this year by NASPA, AIR and Educause examined the use of data and analytics for student success, surveying over 1,000 individuals across the country in IT, IR and student affairs offices to see how they are using data to make data-informed decisions.
In that survey, nearly one-third of respondents said they were concerned about or unsure of the algorithms or metrics they’re using to identify at-risk students. What may be even more concerning is that 40 percent reported that they were not implementing the results of their student success studies effectively, while 54 percent said that wrong conclusions were being drawn.
Rapid Insight provides easy, affordable, powerful analytics. Our platform—Veera—enables users to access data from any location, blend it all together, merge it, cleanse it, and then create predictive models as well as reports, dashboards and other analyses. We work with businesses in many different industries, but our main focus has been higher education, where we do a lot of work around student success and enrollment. We enable users to access data from different silos to create these analyses and share these insights.
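The blend, cleanse, and analyze workflow described above can be illustrated with a minimal sketch in plain Python. Everything here is a hypothetical stand-in: the field names, records, and at-risk rule are invented for illustration and do not represent Veera’s actual API or any real predictive model.

```python
# Records from two hypothetical campus silos: the registrar and the LMS.
registrar = [
    {"student_id": "S001", "gpa": 3.4},
    {"student_id": "S002", "gpa": 1.9},
    {"student_id": "s003", "gpa": 2.8},  # inconsistent ID casing to cleanse
]
lms = [
    {"student_id": "S001", "logins_last_week": 12},
    {"student_id": "S002", "logins_last_week": 1},
    {"student_id": "S003", "logins_last_week": 0},
]

def cleanse(records):
    """Normalize student IDs so rows from different silos can be matched."""
    return {r["student_id"].upper(): r for r in records}

def blend(registrar_rows, lms_rows):
    """Merge the two silos into one record per student."""
    reg, act = cleanse(registrar_rows), cleanse(lms_rows)
    return [{**reg[sid], **act[sid], "student_id": sid}
            for sid in reg if sid in act]

def flag_at_risk(row):
    """A toy threshold rule standing in for a predictive model."""
    return row["gpa"] < 2.0 or row["logins_last_week"] < 2

merged = blend(registrar, lms)
at_risk = [r["student_id"] for r in merged if flag_at_risk(r)]
print(at_risk)  # ['S002', 'S003']
```

The point of the sketch is the shape of the pipeline rather than the specifics: pull from each silo, normalize keys so records join cleanly, merge, then score the blended rows.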
Charles Ansell: Our mission—like most community colleges, as well as many four-year colleges—is all about student access, to make sure that we are not just serving the state by having increased enrollment at our colleges, but also making sure that students are successful, that our programs are relevant, and that they earn credentials of economic value.
When we think about our goals, we consider which metrics are the leading indicators we need to track, report on, and get the entire institution on the same page about in order to make student success work. And we don’t look at that in a silo: there are many different agencies to which we furnish data, and then we consider our own leading indicators.
That leads to the KPIs that make up our scorecards, as well as leading indicators for those KPIs. I keep using the term “leading indicators,” but the measures that will tell us whether we’re driving toward the student success numbers we want all revolve around student GPA.
A lot of community colleges and four-years are embracing pathways initiatives, and we’re no exception. With that, we track what percentage of students are on semester-by-semester plans to completion, how intentional those plans are, the program codes they’re associated with, and so on. You’ve got retention, but then you’ve got early indicators of retention that happen within the classroom as well as outside it.
In a metrics-driven culture, we need to continue managing the quantitative targets for key measures, and more pertinent to this call, we need to build a data infrastructure to ensure that execution of strategic initiatives doesn’t lag behind culture. We want data to prove points and to make sure that we’re living our values and we’re living our mission statement.
It’s all well and good to be examining best practices for student success, retention and completion, but if the data is not actually tied to the measures that feed your key performance indicators, those practices run the risk of becoming fads, of burning out, of dying from initiative fatigue.
What needs to happen is a constant cycle back to your main measures through what I’m terming “performance dialogues”—essentially, regular meetings at the proper levels of the organization around those measures. Then you’ve got the leading indicators, which give an overview of the factors that influence those outcomes and identify the actions that improve them.
Some KPIs change weekly—such as enrollment throughout the registration cycle. You would also have indicators around enrollment of certain cohorts—a cohort tracked for graduation rate or proximity to graduation, say, or one in a program for which we expect high median earnings. We want to use those as leading indicators of progress toward that main measure.
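The relationship between a weekly-changing headline KPI and its cohort-level leading indicators can be sketched with a few lines of Python. The targets, cohorts, and numbers below are entirely hypothetical, not CCSNH data:

```python
# Hypothetical weekly enrollment snapshots during a registration cycle.
weekly_enrollment = [1200, 1450, 1700, 1910]
target = 2400  # illustrative headcount target for the term

def percent_to_target(current, goal):
    """Progress on the headline KPI, recomputed each week."""
    return round(100 * current / goal, 1)

# Cohort-level leading indicators that feed the same KPI.
cohorts = {
    "near_graduation": {"enrolled": 310, "eligible": 400},
    "high_earnings_programs": {"enrolled": 220, "eligible": 500},
}

def cohort_rate(cohort):
    """Share of an eligible cohort that has actually enrolled."""
    return round(100 * cohort["enrolled"] / cohort["eligible"], 1)

print(percent_to_target(weekly_enrollment[-1], target))  # latest week: 79.6
for name, c in cohorts.items():
    print(name, cohort_rate(c))
```

A dashboard built on measures like these is what a performance dialogue would review week over week: the headline number, plus the cohort rates that explain where it is or isn’t moving.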
Our guiding principle around performance dialogues is to make sure that we’re continually revealing the work and insisting on the work getting done, that we’re associating measures with actions, and that these interventions are themselves leading indicators, traceable back to the main measures. The idea is to create leaders at all levels of the organization who can rely on data to work with teams to address problems proactively and—more importantly—effectively.
To watch this web seminar in its entirety, please visit universitybusiness.com/ws061318