In exit interviews with students planning to transfer, the issue of affordability almost always tops the list of stated reasons for leaving. Institutions whose leaders take such responses at face value, however, could end up spending additional financial aid on upperclassmen without seeing any improvement in retention.
Often, saying "I can't afford it" is easier than saying "I'm homesick," or "I don't know what I want to do with my education," or "I'm not able to handle the work." Consequently, it is important to assess the role affordability actually plays in retention using data.
National and international studies, as well as research conducted at other institutions, can shed light on possible influences of aid on retention.
Statistical models can help measure how each variable contributes to the final retention outcome.
For example, a number of institutional studies have shown that working on campus (as long as the student puts in no more than 20 hours of work a week) can have a positive influence on retention, probably because the job provides another connection to the institution. In August, the Educational Policy Institute (www.educationalpolicy.org) published a summary of research studies in a variety of countries (including the United States) that showed grants are more effective than loans at improving retention, but only for lower-income students.
Finding out whether such findings are true at your own institution, however, requires the use of institutional data, which is readily available from student records.
The first step in understanding the factors most important in retention at a school is to examine retention rates by subpopulation for cohorts of entering students.
As a sample analysis ("Examination of Retention Rates," p. 32) demonstrates, this approach can shed light on whether high-need students are retained at lower rates than low-need students. Other financial-aid-related measures that could be included: levels of "unmet need" (i.e., need minus grant aid from all sources); parent income levels; levels of total grant aid; types of aid received (e.g., merit only, versus merit plus need, versus need only); and student and parent debt levels.
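For institutions with access to a student-level cohort file, this kind of subpopulation breakdown is straightforward to script. The sketch below uses pandas; the column names (retained, unmet_need, aid_type) and the need bands are hypothetical stand-ins, not fields from any particular student information system.

```python
import pandas as pd

# Hypothetical cohort file: one row per entering student, with a
# retained flag (1 = returned for the second year) and aid fields.
cohort = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "retained":   [1, 0, 1, 1, 0, 1],
    "unmet_need": [0, 12000, 3000, 0, 15000, 2000],
    "aid_type":   ["merit", "need", "merit+need",
                   "merit", "need", "merit+need"],
})

# Band unmet need so small subpopulations remain readable.
cohort["need_band"] = pd.cut(
    cohort["unmet_need"],
    bins=[-1, 0, 5000, 10000, float("inf")],
    labels=["none", "low", "medium", "high"],
)

# Retention rate and headcount by subpopulation.
by_need = cohort.groupby("need_band", observed=True)["retained"].agg(
    rate="mean", n="count"
)
by_aid = cohort.groupby("aid_type")["retained"].agg(rate="mean", n="count")
print(by_need)
print(by_aid)
```

Reporting the headcount (n) alongside each rate matters: a subpopulation of a dozen students can swing wildly from cohort to cohort, and the bands should be chosen so no cell is too small to interpret.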
This analysis can also be helpful in understanding the impact of specific financial aid policies. For example, leaders often wonder if they are setting the GPA bar for renewing merit scholarships too high. Examining retention rates of merit recipients by first-year GPA can help in assessing the potential enrollment and financial implications of changing renewal policies.
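The merit-renewal question can be examined with the same banding technique: group scholarship recipients by first-year GPA relative to the renewal cutoff and compare retention rates. The records, GPA bands, and assumed 3.0 threshold below are all illustrative.

```python
import pandas as pd

# Hypothetical merit-recipient records; a 3.0 renewal GPA cutoff
# is assumed for this sketch.
merit = pd.DataFrame({
    "first_year_gpa": [2.4, 2.8, 2.95, 3.1, 3.4, 3.8],
    "retained":       [0, 1, 1, 1, 1, 1],
})

# Band GPAs around the renewal threshold to see how many retained
# students sit just below the bar and would lose their award.
merit["gpa_band"] = pd.cut(
    merit["first_year_gpa"],
    bins=[0, 2.5, 3.0, 3.5, 4.0],
    labels=["<2.5", "2.5-2.99", "3.0-3.49", "3.5-4.0"],
)
summary = merit.groupby("gpa_band", observed=True)["retained"].agg(
    rate="mean", n="count"
)
print(summary)
```

If the band just under the cutoff shows strong retention, those students are returning anyway; lowering the bar for them raises aid costs without buying additional retention, while a cliff in retention at the cutoff would suggest the opposite.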
Similarly, some institutions allow students not initially offered merit awards to "earn" a scholarship based on their GPAs at the institution. Often, however, students performing well academically already have higher retention rates, and increasing financial aid to those students based on their high performance does not further enhance retention. Again, using a data-driven approach to setting such policies can help enhance institutional net tuition revenues.
The disadvantage of simply reviewing retention rates by subpopulation is that some variables can be strongly correlated to each other, thus giving a false read on the influence of any one factor.
For example, average per-pupil expenditures on K-12 education are a function of the local tax base, and thus need and academic preparation are often strongly correlated. Consequently, if retention of high-need students is low, it could be because those students are not performing as well academically. To understand financial aid's influence while holding all other factors constant, therefore, it is often helpful to build statistical models that will accurately predict retention. Such models then enable administrators to measure the unique contribution of each variable to the final retention outcome.
Because academic performance in the first term of enrollment is often a highly significant variable in an overall retention model, it is also instructive to build a model only for those students who achieved at a satisfactory level in Term I. This allows for a better understanding of the factors influencing "voluntary" attrition.
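A common way to build such a model is logistic regression on the retained/not-retained outcome. The minimal sketch below uses scikit-learn; the predictors (term-one GPA, unmet need, distance from home) and the tiny illustrative dataset are assumptions, and a real model would draw many more variables from institutional records.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical student-level predictors (all names are assumptions):
# term-one GPA, unmet need in $1,000s, distance from home in 100s of miles.
X = np.array([
    [3.5,  0.0, 1.0],
    [2.1, 12.0, 3.0],
    [3.0,  3.0, 0.5],
    [3.8,  0.0, 2.0],
    [1.9, 15.0, 4.0],
    [3.2,  2.0, 1.5],
    [2.5,  8.0, 2.5],
    [3.6,  1.0, 0.8],
])
y = np.array([1, 0, 1, 1, 0, 1, 0, 1])  # 1 = retained

model = LogisticRegression().fit(X, y)

# Each coefficient estimates a variable's contribution to the odds of
# returning, holding the other variables constant.
for name, coef in zip(["term1_gpa", "unmet_need", "distance"],
                      model.coef_[0]):
    print(f"{name}: {coef:+.3f}")

# Refitting on only students in good standing after Term I (a 2.0
# threshold is assumed here) isolates "voluntary" attrition.
good_standing = X[:, 0] >= 2.0
model_vol = LogisticRegression().fit(X[good_standing], y[good_standing])
```

The signs and relative sizes of the coefficients, together with their significance levels in a full statistical package, are what let administrators separate, say, the effect of unmet need from the effect of weak academic performance.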
The output of these models can be helpful not only in assessing the impact of grant, unmet need, family income, and other financial variables on retention, but also in suggesting targeted intervention strategies. Darin Wohlgemuth, director of research for enrollment at Iowa State University and co-author of an article on regression research (which will be published in the January 2007 issue of Journal of Student Retention), notes that "using research helps identify the characteristics of students who are most at risk and to determine if that risk is related to their financial aid awards or not."
For example, if the model demonstrates that students who come to campus from the farthest away are much more likely to leave before graduation, additional efforts to connect these students to the campus and the community early on could help improve retention. Similarly, such models could suggest changes in recruitment strategies or admission policies. The model developed for one Scannell & Kurz client, for example, found that students from private high schools were less likely to be retained than students from public high schools, even holding academic profile, socioeconomic background, and ethnicity constant. This finding could suggest a different approach to both recruitment and the review of files from students who went to private high schools.
Another valuable source of information in studying retention is the National Student Clearinghouse. Many institutions already participate in the clearinghouse's enrollment verification services, which are free to higher education institutions and save Registrar and Financial Aid office staff time in meeting reporting requirements related to student loan deferments.
In addition, the clearinghouse now offers a number of fee-based services for colleges and universities, including the StudentTracker service that allows administrators to query a national database of higher education enrollment records to understand where students who have withdrawn are now attending college. Not only does this help in understanding the competition better (both before and after matriculation), but the data can also be linked to student demographic and quality data. Thus, leaders at a private institution can learn whether they're losing low-need or high-need students to public institutions or to other high-cost private colleges.
Public institutions can benefit from these data, as well. "One way we use the clearinghouse is to determine if our nonpersisting, out-of-state students are returning to their 'home state' public institutions to avoid paying the higher tuition," says Kathy Jones, assistant vice president for enrollment and registrar at Iowa State University.
Institutions that take a data-driven approach to understanding retention often find that the analysis dispels campus "lore." As Jacquelyn Nealon, vice president for enrollment services at New York Institute of Technology, points out, "Sometimes the data can shine a flashlight on an issue that is very difficult to accept. When students left NYIT and told us that the reason was financial, it was easy and safe to assume that if we just increased our scholarship and financial aid budget, we could resolve our attrition issue. This was a widely held belief on campus for years. But when we dug deeper into the data, and got to the root of the 'financial reasons' students cited, it turned out that they didn't simply believe we cost too much. They felt that we weren't a good value for the money. Ouch. That's tough to hear, but oh, so important. It triggered a whole series of self-studies that have since strengthened the core academic and social experiences for NYIT students. We make it our business to highlight the value of an NYIT education to our students. Without the data, we might not have been focused on the core issue or realized the retention improvement progress we have seen."
In short, a little bit of analysis can go a long way in ensuring that retention interventions are well targeted and can help institutions avoid costly knee-jerk reactions to anecdotal information.
Samantha Veeder was formerly the director of Financial Aid at Hobart and William Smith Colleges (N.Y.). She joined partners Kathy Kurz and Jim Scannell at their enrollment management consulting firm Scannell & Kurz in July. They can be reached via their website, www.scannellkurz.com.