Don't Fear IT Assessment
Few among us enjoy the prospect of assessment, I think. It can feel like a lonely and difficult endeavor. Some of us shy away after catching a glimpse of an inhospitable landscape pockmarked with huge data craters, thinking “I simply have too much work to do to get dragged into that!”
Others navigate across the shifting sands and craggy outcrops of data collection and analysis with great trepidation. What if I get it wrong? Will I even know if I get it wrong? Most of us are compelled to assess at some level and just try to get through it as quickly as possible. Whew! Now back to my real work.
But my experience gives me confidence that assessment of the campus information technology organization doesn’t have to be feared or avoided, but rather, can and should be a meaningful and rewarding experience.
Like any healthy organization, an IT division depends upon thoughtful assessment at multiple levels. Fortunately, technology makes accomplishing all kinds of assessment far less onerous than in decades past. Nor do you need to be positioned at the top of the organization to contribute in a meaningful way: leadership often happens from the middle, and evaluation is a prime example. Truly, there is so much work to do that you can start almost anywhere. Your boss will be glad you did.
Benchmarks form the foundation of a comprehensive assessment plan—both external and internal. One of the most powerful tools for benchmarking IT externally is also among the simplest to use: the EDUCAUSE Core Data Service (CDS). In the spring of 2010, nearly 1,000 institutions contributed to this data repository. With resources like the CDS at your disposal, valuable information can be accessed with modest effort and minimal expertise.
A few years ago, EDUCAUSE leadership made a key decision to incorporate the Costs of Supporting Technology Services (COSTS) Project into the CDS. COSTS provided a framework for understanding the IT organization through standardized formulas drawing heavily on budget, staffing, and equipment resource levels. Most importantly, CDS provided the flexibility to gather a tremendous amount of data and filter an independently selected subset of institutions for true apples-to-apples comparisons among like institutions. The labor saved and errors averted through the seamless integration of COSTS into CDS, and the rich data analysis that results, are well worth the price of basic EDUCAUSE membership for anyone interested in IT assessment.
Here at St. Lawrence University (N.Y.), we began analyzing our IT organization in earnest externally via the CDS several years ago. We created three distinct groups of institutions, with some overlap among them, for comparative purposes. We’ve used our annual findings to make important strategic decisions. For example, we discovered early on that some of our staffing allocations varied dramatically from peer organizations.
This raised important questions. Closer analysis led to a long-term staffing plan that reduced administrative overhead, repositioned staff, and developed new position rationales aligned with an improved organizational structure. We have used external benchmarking to identify where resources were lacking, which unique strategies were worth preserving, and where unmet potential was ripe for growth.
In the end, however, our external benchmarking analysis is only as good as the data supplied by the IT professionals at the institutions with whom we compare ourselves. I've often wondered whether each makes a similar effort to engage IT staff in the process of CDS data collection. Happily, a redesign allows greater flexibility, so that multiple individuals may be assigned rights, with varying degrees of responsibility, to enter or review data. If ever there was an opportunity primed to draw others into the process and to help staff understand the value of the data, this is it.
Undergirding our assessment efforts over the past few years has been a concerted effort to deepen our understanding of the role the IT organization plays in helping a liberal arts college achieve its mission as an institution of higher learning. IT must provide highly reliable infrastructure and effective support services, enabling the entire campus community, students included, to be productive and successful. Benchmarking gives us confidence on this front before we extend resources toward areas of innovation important to our institution.
A Look Inside
The conundrum is that it’s much more difficult to know how well IT is working in the nooks and crannies of the institution than it is to capture a big picture snapshot. Internal benchmarking requires a different type of data collection and analysis and is likely to be much more time consuming than external benchmarking. How lucky for those who discover the gem of the Merged Information Services Organizations (MISO) Survey.
We’ve used MISO to measure satisfaction with IT and library resources across constituencies—faculty, staff, and students—and to delve deeper using local questions on special topics. MISO is amazingly flexible. As a result of two survey cycles, we’ve made much more confident decisions or established new strategies for wireless, mobile computing, and classroom technology implementations. We’ve adjusted training and other service points to improve impact. Sometimes we’ve used MISO data to defend critical capital decisions. Perhaps most importantly, in the midst of the 24x7 grind, peppered with typical complaints and common frustrations that technology tends to generate, there is great comfort in knowing that overall computing services are rated exceptionally high by all of our constituencies. As an IT leader, sometimes you just need to know that much.
We’ve learned a lot about assessment over the past decade, mostly by doing a lot of programmatic and project-level evaluations, setting annual goals and objectives, and conducting staff evaluations. However, we’ve also learned that one of our most important partners is the Institutional Research office. Our IR colleagues provide a credentialed safety net, a place to check in before pronouncing what we think we know about what the data suggest, or before we publish results too broadly. As we often say in IT, a little knowledge can be a dangerous thing, and I’d bet the IR folks say the same. Developing a strong working relationship with institutional research is a good first step in any assessment plan.
Our director of IR made this collaboration easy for IT, extending an invaluable invitation to link up with broader institutional efforts. She asked hypothetically, “What if we asked incoming students about their experiences with technology? And then what if we asked outgoing seniors about their experiences with technology? What might we learn?”
By tying a few IT-specific questions consistently into our standard survey cycle, we’ve learned a lot: about students and about our organization. Leveraging the HERI/CIRP surveys for first-year students, HEDS for seniors, and the MISO Survey described above, we’ve become much more informed about our students’ technology experiences and can better anticipate the technologies students will bring with them or require from us to succeed at St. Lawrence.
For example, based on CIRP data we could see both the laptop and Macintosh trends building early and fast, which gave us plenty of time to train IT staff and, eventually, to adjust semester start-up programming and support services appropriately. We’ve also learned what matters to students, with regard to technology, in selecting St. Lawrence and thereafter in their undergraduate careers. All of this is critically important for strategic purposes as an organization dedicated to serving the institutional mission.
With proper resources and internal buy-in, any IT department can do a solid job of assessing its effectiveness and also learn new ways of providing critical services to a campus learning community. There are amazing resources at your disposal, a mere click or two away, for this work. But first, buy your IR director a cup of coffee, because there’s no better place to start this conversation.
—Sondra Hedger Smith joined the St. Lawrence University (N.Y.) IT staff in 2000 as a teacher and trainer. A 2009 Frye Leadership Institute Fellow, Smith has been active in EDUCAUSE and as a private consultant and co-founder of the New York Six (Liberal Arts Consortium) Technology Directors group.