Virtualizing desktops vs. apps
The University of Central Florida, a campus of 60,000, decided to virtualize applications rather than entire desktops.
UCF Apps lets users access the specific software needed for coursework. After downloading and installing a Citrix Receiver client, students can log in and get the apps that have been provisioned to their accounts based on their area of study.
“It’s a lot more seamless for them because they’re not logging into an environment that looks alien to them,” says JP Peters, the university’s business relationship manager for IT. “They’re opening up an application just like they would do on their desktop.”
Virtualizing apps, rather than entire desktops, can be easier on IT departments and servers because it saves time and resources: users can be given access to just the apps they actually need rather than being provisioned a complete desktop environment. It also spares students from having to buy applications or puzzle out which version to download and update.
Application virtualization does, however, have its limits.
“If you need an actual OS environment where you need to integrate applications, and you need to use multiple applications to get the job done, you really can’t do that in any other way than a virtual desktop,” says Peters. “In an application environment, each application is locked out from other applications—at least that’s how we have our environment set up—so if there’s dependency on other applications, a virtual desktop is the way to go.”