Do you recall this popular fast-food commercial from 1984? Three elderly ladies gaze at an oversized hamburger bun until one poses the question that became a cultural catchphrase: "Where's the beef?" By the end of the '80s, the playful tagline had been so overexposed that it became passé.
Today, it is the question of the hour for college and university IT managers, as institutional administrators demand proof that their large technology investments are actually yielding results. IT administrators may find the demand similarly overdone, but proof of return on investment (ROI) is the reality of today's technology applications at institutions of higher education. (Here, references to technology and ROI are specific to higher education only.)
In light of this ROI reality, I want to suggest four strategies that administrators of instructional technology (IT, as the term is used throughout this article) can employ to respond to demands for ROI data. My examples are drawn from just one instructional technology, the course management system (CMS), but these strategies are applicable across IT.
An understanding of how ROI is measured underpins the discussion of the four strategies administrators can take. Most IT programs measure ROI in terms of cost-per-user (CPU) metrics that match expenditures to the number of end users in order to determine and manage costs and support systems.
For example, given the fixed costs of implementing and supporting CMS, it is more cost effective to maintain the application if 50 percent of faculty, rather than 25 percent, utilize the application in their courses. While this simple method of accounting provides supporting data for budgets, personnel management, and product licensing, the quantitative analysis has limitations.
Simply noting that 50 percent of faculty members use a given tool says nothing about how frequently, effectively, or successfully they use it. More importantly, CPU analysis invites follow-up questions about the learning outcomes associated with a given product's use. How can such results be quantified and measured?
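The CPU metric itself is simple arithmetic, which is precisely its limitation. A minimal sketch (the license cost, faculty count, and adoption rates below are hypothetical illustrations, not real campus figures) shows why adoption rate alone drives the cost-per-user number:

```python
# Cost-per-user (CPU) sketch: fixed costs spread over active users.
# All figures are hypothetical illustrations, not real campus data.

def cost_per_user(fixed_annual_cost: float, faculty_count: int,
                  adoption_rate: float) -> float:
    """Annual fixed cost divided by the number of faculty using the CMS."""
    active_users = faculty_count * adoption_rate
    return fixed_annual_cost / active_users

fixed_cost = 120_000  # hypothetical annual license + support cost
faculty = 600

low = cost_per_user(fixed_cost, faculty, 0.25)   # 25% adoption
high = cost_per_user(fixed_cost, faculty, 0.50)  # 50% adoption

print(f"CPU at 25% adoption: ${low:,.2f}")   # $800.00
print(f"CPU at 50% adoption: ${high:,.2f}")  # $400.00
```

Doubling adoption halves the CPU figure, yet nothing in the calculation captures how frequently or effectively faculty actually use the tool.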
A 2004 EDUCAUSE study finds that across all institutions of higher education (IHEs), while only 19 percent of faculty use a CMS for nearly all their courses, 77 percent report using one selectively. What are the implications of these varying levels of use for staffing, training, and student support services? While a 77 percent usage rate suggests that a CMS program is operating profitably with regard to the campus licensing agreement, it reveals nothing about learning outcomes. What is the impact of these high usage levels on retention and graduation? (Here, retention is defined as an institution's ability to retain students from year to year, culminating in graduation. Graduation rate is defined as the percentage of enrolled students who successfully complete a degree.)
You can't manage what you can't count, so before you can answer the really important questions, you need to implement a systematic current-state analysis regime, or benchmarking study.
Benchmarking is an overused term in higher ed. Generally, it describes the process by which we compare data about our own institution with that of peer institutions. The standards by which these comparison groups are constructed vary widely and are subject to constant revision depending on the need in question.
For years, the gold standard among institutional researchers was the institution's athletic conference. Such generalized comparisons do have value for issues like tuition, enrollment, and financial aid, but because of broad institutional differences in IT applications and infrastructure, they have limitations.
In addition to identifying external peer groups, internal comparisons can also be powerful tools for demonstrating the value that IT brings to the institution. Common internal comparisons include colleges, departments, support programs, courses, and personnel. All of this sounds easy, but internal data collection frequently proves much more challenging than anticipated due to time, resource, and political/cultural constraints.
Benchmarking IT services is more than just a sophisticated form of CPU analysis. Not only does benchmarking take into consideration measurable learning outcomes, but the data generated should also drive faculty development and support services-the outcomes of which contribute to the overall measure of ROI.
Despite the popularity of "academic excellence" initiatives that all but defy rigorous assessment efforts, too few IHEs practice benchmarking in ways that demonstrably contribute to the accomplishment of the institution's mission, i.e., student learning. Instead, they focus on financial costs and benefits almost to the exclusion of everything else.
There are reams of research and publications extolling the benefits of benchmarking IT, so why don't more IHEs utilize benchmarking as part of their strategic planning process?
In most cases, I believe, it is a matter of insufficient expertise, staffing, and time. IT directors, like other academic deans and department heads, are generally chosen from the ranks of faculty or former faculty. They are selected for their teaching and technological experience, rather than their management or business experience.
So it is no great surprise that common business functions such as process management, project management, and benchmarking are often underutilized or ignored. This is especially true of IT departments at smaller schools and of decentralized IT programs at larger institutions.
Conversely, centralized programs often have more diversified resources and staff and are therefore better able to draw upon a range of professional experiences and skills to accomplish benchmarking projects.
Finally, effective benchmarking takes time, which is frequently in short supply in offices where staffers are asked/required to wear multiple hats. Again, this situation is more noticeable at smaller schools. In some cases the campus political/cultural environment does not support instructional data collection and analysis. Over the years, I have personally had chief academic officers tell me, "We don't use that kind of data here." Fear of offending colleagues, fear of creating distrust between faculty and administrators, fear of union lawsuits, and even fear for one's job are all reasons cited.
How many college and university presidents can readily locate the folder on their desk/desktop where the cost/benefit and performance analyses of their campus IT programs are stored? How many can call the chief information, financial, or academic officer and expect the report to be delivered that day or week?
IT alignment is one mechanism by which institutional ROI can be readily enhanced, but contrary to popular opinion, IT alignment starts at the top and requires leadership from senior executives and board members. After years of debate about the role and cost of IT, particularly Enterprise Resource Planning (ERP), CMS, and other learning technologies, institutions have slowly begun to acknowledge that IT "investments" should be treated like other utility "expenditures" such as electricity and heat. IT costs need to be included in the annual operating budget rather than languishing on capital funding and donation wish lists. This is especially true for two-year and small to midsized four-year institutions that depend disproportionately on grants and extracurricular funding to meet basic IT needs.
Inconsistent funding yields inconsistent results, which makes it more difficult to build a case in favor of normalized funding. Normalizing IT funding is the first step in aligning IT with strategic institutional goals.
The market also plays an important role in shaping funding models. One example of these pressures, and of how IHEs attempt to meet them, is the "port-per-pillow" high-speed internet connectivity and integrated learning communities that schools have been forced to build or risk losing students to savvier competitors. Of schools that have dorms, 97 percent provide high-speed connections in residence halls. To the extent that aging and obsolete technology impedes student learning and retards enrollment in highly competitive markets, institutions really have no choice but to normalize IT spending. And yet only 52 percent of IHEs operate with a funding model that includes renewal of the capital IT infrastructure as part of the annual budget, according to EDUCAUSE data.
My point is that the shift from extraordinary spending to utility status often reflects changes in the higher education market and the expectations of prospective students and their parents as much as "visionary thinking" and leadership. Recognizing the role of the market in the development and delivery of IT services is the second step in aligning IT with strategic institutional goals.
The overwhelming majority of IHEs have one or more commercial or home-grown CMS, so let's take a quick look at what information can be gleaned from the CMS servers situated on nearly every campus across the nation. The administrative logs of the CMS are a rich source of data about the campus learning environment. CMS applications keep detailed logs of users based on their personal login and password, so overall quantitative and qualitative product use is always readily at hand. When combined with other campus data from the institutional research office, such as faculty demographics and course withdrawal and failure (WF) rates, one is able to create detailed data maps of IT use by college, department, faculty rank, gender, age, course level, degree of CMS integration, instructional features used, and student success rates.
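In practice, building such a data map amounts to joining CMS usage records with institutional-research records on a common key. The sketch below illustrates the idea with invented field names and figures; a real CMS exposes its logs in its own format, so treat this as an assumption-laden outline rather than any vendor's actual schema:

```python
# Hypothetical sketch: joining CMS usage logs with institutional-research
# data to build a data map of IT use. All fields and records are invented.

cms_logs = [  # per-course usage summaries drawn from CMS server logs
    {"course": "BIO101", "instructor": "jdoe", "logins": 420, "quizzes": 12},
    {"course": "HIS200", "instructor": "asmith", "logins": 35, "quizzes": 0},
]

ir_records = {  # institutional-research office data, keyed by course
    "BIO101": {"dept": "Biology", "level": 100, "wf_rate": 0.32},
    "HIS200": {"dept": "History", "level": 200, "wf_rate": 0.08},
}

# Merge the two sources into a single record per course.
data_map = [{**row, **ir_records[row["course"]]} for row in cms_logs]

for rec in data_map:
    print(rec["course"], rec["dept"], rec["logins"], rec["wf_rate"])
```

Once merged, the records can be grouped along any of the dimensions mentioned above: department, course level, degree of CMS integration, and so on.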
This wealth of instructional data can also be used to help shape strategic IT initiatives.
Examining the major campus constituencies and their definitions of successful learning outcomes, as well as their varying ways of determining outcome measures, can be telling. (See chart, ROI for IT.)
Based on benchmarking results, an IT program administrator might elect to focus on improving the instructional design of large lecture courses in a given discipline with a 50 percent or higher WF rate, which would have a direct impact on student retention and graduation rates. Or, the administrator might create a customized training course targeted at faculty who meet specific requirements (female assistant professors teaching 100-level introductory courses that use multimedia and online quizzes), or choose to create personalized professional development programs for faculty based on the analysis of their teaching styles and effectiveness as demonstrated by the benchmarking data.
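A benchmarking query of the kind described, flagging large lecture courses at or above a 50 percent WF rate, reduces to simple filtering. The thresholds and records below are hypothetical:

```python
# Hypothetical filter: flag large lecture courses with a WF rate of 50%
# or higher as candidates for instructional-design attention.
# Enrollment threshold and all course data are invented.

courses = [
    {"course": "CHM101", "enrollment": 250, "wf_rate": 0.55, "discipline": "Chemistry"},
    {"course": "ENG210", "enrollment": 30,  "wf_rate": 0.10, "discipline": "English"},
    {"course": "PHY150", "enrollment": 300, "wf_rate": 0.51, "discipline": "Physics"},
]

LARGE_LECTURE = 100   # hypothetical enrollment threshold for "large lecture"
WF_THRESHOLD = 0.50   # 50 percent withdrawal/failure rate

targets = [c for c in courses
           if c["enrollment"] >= LARGE_LECTURE and c["wf_rate"] >= WF_THRESHOLD]

for c in targets:
    print(f"{c['course']} ({c['discipline']}): WF rate {c['wf_rate']:.0%}")
```

The same pattern, with different predicates, supports the faculty-cohort targeting described above: filter the data map on rank, gender, course level, and features used.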
Regardless of what action the IT program takes, administrators are in a much stronger position to improve learning if they have benchmarking data to help illuminate the pathway to success. Institutionalizing the collection, analysis, and dissemination of benchmarking data is the third step in aligning IT with strategic institutional goals.
No amount of executive enthusiasm, financial support, or data can produce results if the IT program and team are dysfunctional, disorganized, or misguided. One institution with which I am familiar employs 900 faculty and three instructional designers, a 300:1 ratio. Even spending only eight hours with each faculty member (a ridiculously low figure), each instructional designer would have to work 46 hours a week, 52 weeks a year, to meet with every faculty member. Since hiring the requisite number of additional instructional designers is unlikely for fiscal reasons, IT programs must find creative ways of addressing the very personal needs of individual faculty while demonstrating the value of IT in the process.
As noted above, developing and adhering to sound business and project management processes is essential to success; benchmarking and alignment are out of the question otherwise. Maintaining a functional IT program also depends on being able to cover the basic competencies of IT support, including CMS administration, instructional design, graphic design, digital imaging, video production, computer/multimedia programming, applications training, web development, and AV support. Does every IHE have an IT staff reflective of this list? Of course not, but most are trying to provide the services on this list with available resources. Using the data generated by the CPU and benchmarking studies, as well as data gleaned from institutional faculty and student technology surveys, IT program administrators can better leverage their strengths to target instructional services to strategic projects. Implementing effective programs based on the benchmarking is the final step in aligning IT with strategic institutional goals.
My message is this: The highest order of ROI is student learning, and IT alignment in combination with benchmarking is one of the surest paths to meeting that goal. By moving beyond basic CPU analysis to examining CMS data and employing sophisticated benchmarking practices, it is possible to measurably increase student retention and graduation rates.
Institutions that focus on student learning outcomes and normalize their IT budgets can achieve strategic success even within a constrained learning environment. Clearly, ROI metrics for IT do not produce singular results; on the contrary, successful benchmarking reveals the value of IT across the spectrum of stakeholders in distinctly measurable terms. The ability to articulate and demonstrate results is becoming more important as pressure from legislators and the public for increased transparency and accountability forces higher education to become more responsive to ROI concerns. Documenting the learning outcomes produced by IT programs is a first step toward demonstrating the value of higher education's technology investments.
Anthony R. Bichel is the proprietor of Leading Edge Learning, which partners with leaders from higher education, business, and nonprofit organizations to identify, organize, plan, develop, and deliver custom professional development programs. He can be reached via his website, www.anthonybichel.com.