Survey Savvy

The power of survey results and how institutions are using them

In 1999, the North Dakota University System coordinated a roundtable discussion inviting its board of directors, K-12 administrators, employers, and others to address their expectations of the university.

"It was a landmark event in North Dakota's higher ed history," explains Michel Hillman, vice chancellor of academic and student affairs at NDUS in Bismarck, which has 11 campuses. "What was recommended was a consistent set of accountability measures."

NDUS now surveys current and former students, graduates, employees, faculty, employers, and others every two years, using national surveys created by higher ed consulting firm Noel-Levitz, so it can better track its progress and provide state legislators with fresh data.

One survey focused on why more than 30 percent of its 45,000 students left or transferred to other institutions. Hillman suspected that state legislators were blaming poor academic services. But the initial survey, which attracted 431 responses, told NDUS administrators what they needed to know—that their educational processes and delivery system were just fine. Students were leaving due to a variety of factors beyond NDUS' control, such as poor health or family relocation.

Higher ed institutions have been conducting surveys for years. If done well, they offer valuable information that can help guide the future direction of programs, streamline processes, introduce new services, enhance the quality of instruction, or even build a college's reputation. While many surveys are offered in both paper and web formats, some institutions have changed their tactics to better understand the needs of their students and community while applying survey data in more meaningful ways.

At NDUS, Hillman says the survey provided important insight about current university students—that they're very mobile. "So if we have archaic policies that say we don't accept transfer students or that we really want students to start as a freshman and graduate from here, then we're going to serve fewer students," he says. "Without this data, the legislature could have assumed that we're doing a poor job because we have so many transfers."

Hillman has learned it's important to promote the value of each survey to senior administrators. Once they see how the survey's data can help them make key decisions, buy-in becomes a no-brainer.

Notre Dame de Namur University (Calif.) also uses surveys to help in decision-making. Three years ago, student enrollment was 1,300, says Hernan Bucheli, VP for enrollment management. Enrollment has since jumped to 1,600 and is slated to reach 1,700 next year. "We use survey data to get us where we need to be," he explains. One marketing survey asked students a series of questions, such as why they chose the school and whether they planned to graduate from it. He says their responses enabled the university to boost enrollment by better promoting itself and recruiting students who were a good fit for the institution.

University officials promote the added value of the school's surveys in the student newspaper by explaining their purpose and how results will be applied, as well as by showcasing outcomes. One survey resulted in dorm improvements. Another placed the business, registrar, and financial aid offices all under one roof. A summary of each survey's results and action plan is shared with everyone on campus. Bucheli says that if no action is taken after survey deadlines, fewer people will participate in future surveys, eroding their value and validity.

But sometimes results can be misleading. That's why the school recently invited students at all levels to participate in focus groups, offering each participant $10 and pizza.

The groups were formed last fall in response to an annual survey conducted the previous spring. One key question had drawn low ratings: Are you getting value for your tuition? The focus groups revealed that timing was the culprit. Students had been reacting to a recent tuition hike. When the same question was raised in the focus groups, ratings came back high.

However, the challenge that all higher ed institutions face is something Bucheli calls "survey fatigue." Surveys have saturated people's mailboxes and inboxes to the point where they're ignored, deleted, or discarded. He says that if his university doesn't receive at least a 20 percent response rate, the results carry too little statistical weight, and it becomes a harder sell to convince administrators to use them in data-driven decisions.
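The arithmetic behind that concern is straightforward. As a rough illustration (the invitation count below is invented, and sampling error is only part of the story, since a low response rate also raises the risk of nonresponse bias), the standard margin-of-error formula shows how thin samples widen the uncertainty around any result:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a proportion:
    z * sqrt(p * (1 - p) / n). p = 0.5 is the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical campaign: 2,000 invitations at a 20 percent response rate.
responses = int(2000 * 0.20)                    # 400 completed surveys
print(f"+/- {margin_of_error(responses):.1%}")  # about +/- 4.9%
```

In this sketch, doubling the response rate would shrink the margin of error by about 30 percent, since precision improves with the square root of the sample size.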

Whenever a university administrator approaches John Kennedy for advice on developing a survey, his response is often the same: "Talk to your friends at the Big Ten schools to see if they have something similar."

Kennedy is the director of the Center for Survey Research at Indiana University in Bloomington. He says borrowing surveys is something he always recommends, since most schools are willing to share in exchange for being credited for the survey's design.

Instead of hiring his center, which can be costly, Kennedy gently pushes faculty and administrators at IU to conduct simple surveys themselves, like those that ask faculty in one department about a new policy. Complex or sensitive surveys, such as those covering work-life issues or requiring a high level of security in data collection, are a different matter. That's when his center or another professional organization needs to get involved, he says.

Every survey must have a central purpose and be outcomes-based rather than satisfaction-based, the older approach, adds Jennie Robinson Kloos, director of institutional research, planning, and assessment at St. Catherine University ("St. Kate's") in St. Paul, Minn.

"If somebody tells you that they're satisfied with the library's holdings, does that mean you continue to buy books at the same rate or buy fewer books because you've reached that point of satisfaction?" she says. "The most important thing is not to ask about general satisfaction but to look at what is it you're trying to accomplish. Go all the way back to your mission and goals and how well you're achieving them."

While many national survey tools exist, Robinson Kloos believes they offer limited value. She says a growing number of administrators are creating their own custom instruments. Although the reason to buy off-the-shelf surveys is to obtain benchmarking data, she says comparing differently ranked schools is like comparing apples to oranges. Results may be misleading because they're less about institutional influences and more about the desires, personal characteristics, skills, and abilities of students. For example, she says students at St. Kate's have a different set of expectations than those attending Michigan State University, so why compare the two schools?

"The trend is more to look within and compare ourselves over time," says Robinson Kloos. "Let's design questions that give us the information we need to measure our desired goals and outcomes."

Besides an annual employer survey addressing emerging needs, the school conducts several others throughout the year. Freshmen complete a mandatory survey during their orientation, addressing their values about education and the university. Every February, approximately 40 percent of the school's nearly 5,300 students complete a different survey covering everything from their professors to campus services. One month before graduation, roughly 90 percent of students complete another survey about their overall experience. Lastly, one year and then five years after graduation, approximately 30 percent fill out two more surveys about the value of their college experience.

Her staff also conducts a pilot session for every new survey. A small group completes the survey, then based on participant feedback, staff may clarify questions or make other changes. Next comes "over-the-shoulder" observation. With staff in the room, several people complete the survey, speaking their thoughts out loud. What they say often provides strong clues about how to word questions differently to avoid misinterpretation and solicit candid responses. But she warns against tackling everything at once.

"You don't have to pay attention to everything every year," she says, adding that since the process can be overwhelming, her school uses a theme-based approach. "Instead of looking at 100 items, you're just looking at the 15 that you've earmarked for that year."

While online surveys are nothing new, they have taken the next evolutionary step by becoming interactive.

"It's very easy to lose someone when showing a blank screen with a type-written question on the page," says Scott Bodfish, vice president of market research at Noel-Levitz. "The more interactive, the more likely they are to complete the survey. I've seen general literature that shows the greater engagement, the greater validity to the responses."

Take students who are asked to view three pictures of residence halls, then rank them in order of preference. They click on the picture that reflects their favorite style and drag it down to a box that says, "I like this best." Then they do the same with their second and third choices.

Bodfish adds that campus and market researchers are also using more sophisticated statistical designs that go well beyond a Likert scale (where people rate their preference from 1 to 5). Through a series of structured questions, he says, administrators can more precisely discriminate people's top preferences.

Consider a survey that addresses an institution's five different health care plans. Employees are first asked to choose which plan they prefer: Plan A with a $250 deductible and $25 copay or Plan B with a $250 deductible and $50 copay. The rest of the questions match up every possible combination of copays and deductibles while always asking which plan they prefer. "When you look at their whole pattern of choices, you will know for every employee whether a low copay or low deductible is more important," Bodfish says.
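This is essentially a paired-comparison design. As a rough sketch of how such answers might be scored (the plan names and dollar values below are invented, since the article doesn't enumerate the institution's actual five plans), each head-to-head choice is tallied by the attribute it favors:

```python
from itertools import combinations

# Hypothetical plans built from two deductibles and two copays.
plans = {
    "A": {"deductible": 250, "copay": 25},
    "B": {"deductible": 250, "copay": 50},
    "C": {"deductible": 500, "copay": 25},
    "D": {"deductible": 500, "copay": 50},
}

def infer_priority(choices):
    """Tally one employee's pairwise picks: does a low copay or a low
    deductible win more of their head-to-head choices?"""
    copay_wins = deductible_wins = 0
    for (left, right), picked in choices.items():
        other = right if picked == left else left
        if plans[picked]["copay"] < plans[other]["copay"]:
            copay_wins += 1
        if plans[picked]["deductible"] < plans[other]["deductible"]:
            deductible_wins += 1
    return "low copay" if copay_wins >= deductible_wins else "low deductible"

# Invented respondent who always chooses the lower-copay plan,
# breaking ties by lower deductible.
def pick(pair):
    return min(pair, key=lambda p: (plans[p]["copay"], plans[p]["deductible"]))

answers = {pair: pick(pair) for pair in combinations(plans, 2)}
print(infer_priority(answers))  # -> "low copay"
```

Real conjoint tools estimate attribute utilities statistically, but even a simple tally like this reveals each respondent's dominant priority.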

Other surveys involve open-ended questions, which he believes are very helpful. In the health care example above, what if the majority of respondents wrote something like, "I'm more concerned about the type of doctors in my plan and what my insurance will and won't cover"? That response tells the school to focus more on the quality of its health plan and less on deductibles or copays.

These tactics can help administrators dig deep instead of accepting answers at face value.

"It helps avoid coming up with answers that make it seem like one-size-fits-all when more often than not, your employees are a diverse group of employees and your students are a diverse group of students," Bodfish says. "You have to look at other questions surrounding the issues that may explain their choices or why they feel the way they do."

Although it's less costly to e-mail surveys and reminders, recent research shows that a combination of e-mail and snail mail produces significantly higher response rates.

At Washington State University, Don Dillman, regents professor and deputy director for research development at the school's Social and Economic Sciences Research Center, says using e-mail alone produces about a 20 percent response rate. However, those rates can be pushed past 60 percent when combining snail mail and e-mail (see sidebar for his five-step process).

"Very few schools engage in this process," he says, adding that with postal letters, schools can also enclose cash incentives and explain the survey in more detail. "In theory, [e-mail] sounds good. However, you send an e-mail, they don't know who it's from, they're not interested in it, and it's immediately deleted. Reminders [also] get deleted."

In yet another experiment, Dillman sent the same survey to half of his sample by snail mail and to the other half by e-mail. Guess which drew more responses? Snail mail, at almost 55 percent, compared to 45 percent for e-mail.

The same combination works when surveying alumni, adds Tom Guterbock, director of the Center for Survey Research and professor of sociology at the University of Virginia.

"Because of the speed and cheapness, there's a temptation to just survey those you have e-mail addresses for," he says. "You could be leaving out 40 percent of your alumni. It's imperative to combine sampling by snail mail with e-mail; [otherwise] you'll have a coverage bias."

Another common mistake is paying no attention to the interface between commercial survey packages, like Survey Monkey or Zoomerang, and your school's statistical software.

"I see people buy the cheapest version, then are surprised they get this unusable set of Excel data that are in words instead of numbers and then spend hours trying to convert the data," he says.

By contrast, Guterbock says, some schools are purchasing site licenses for upgraded survey packages. Instead of faculty or administrators with limited budgets each buying the cheapest tool, they now share a tool with more advanced features than any of them could afford individually.

While survey design and administration practices keep evolving, what hasn't changed is the labor and time they require.

"It's a fairly big deal, not something you do in one afternoon," says Guterbock, comparing the process to coordinating an elementary school play. "It could be a disaster if not done well or turn out great. Don't underestimate the steps."

Carol Patton is a Las Vegas-based freelance writer and University Business columnist.

