Back from the Breach

IHEs find that recovery from security breaches must be part of every IT plan.

The headlines are enough to strike fear in the heart of any campus IT manager: personal data of alumni and students exposed at institutions like Boston College, Tufts (Mass.), George Mason University (Va.), Stanford (Calif.), and scores of other schools, putting credit card information, Social Security numbers, and medical records into the hands of digital miscreants.

But if the news seems troubling to read, it's even scarier to be the one in the story.

"You put all these protections in place, all these tools and processes," says Elaine David, assistant vice president for Information Services at the University of Connecticut, which had to notify 72,000 students, faculty, and staff of a potential breach last June. "But even with all that security, it just takes one vulnerability for a malicious person to get in. And they will, because they think like criminals, and we don't."

Security experts agree that the recent spate of security breaches isn't a run of isolated incidents that can be cured with an ounce of prevention. Rather, sophisticated hacking tools and the porous nature of campus server environments make breaches a matter of "when" instead of "if" for just about every IHE.

But that doesn't mean campuses have to simply brace for the onslaught and try to clean up as best they can. Many schools that have been hit are leading the way in showing how to recover from breaches, minimize damage, and prevent future headlines.

Unlike corporate networks, which can be controlled and monitored through strict IT policies, IHE setups have to be flexible, allowing for multiple types of devices and often for decentralized pockets of IT management. That makes schools tempting to hackers, who can crack networks through system flaws, viruses, and spyware.

The Privacy Rights Clearinghouse recently stated that of the 113 data breaches reported since February 2005, almost half took place at colleges, universities, and university-related medical centers.

The prevalence of breaches is likely to continue, according to the security firm Symantec. In its annual threat report, released last fall, the company noted that education is now the most attacked industry, ahead of small business, financial services, and government. IHEs are attractive targets due to their large, diverse networks and stores of highly sensitive information. Also, a false sense of ownership exists among students and faculty. They often install wireless access points or tap into campus networks without firewalls in place, the report notes.

Sometimes, even seemingly bulletproof protection isn't enough. After a worm disrupted its systems in 2003, the University of Washington School of Medicine installed tough firewalls and intrusion systems. But when another virus attacked, IT staff found they couldn't identify where the threat had originated, so cleaning infected departments before the infection spread was difficult.

The news isn't all dire. Despite the many data breaches, there has yet to be any widespread identity theft tied to the exposed information. Attackers sometimes find themselves with data, but no idea how to exploit it.

"Data can be stolen or lost, but without an application that can tie that information into other databases, usually it's not useful," says Tom Chomicz, a network security engineer at CDW-G, a technology provider to government and educational institutions. "Selling it takes time and connections, and if any part of it is encrypted, it's just not worth it to the attacker."

Most hackers don't break into campus networks specifically to get sensitive data, Chomicz adds, but instead to create channels for sending spam. Purveyors of unsolicited mail pay hackers for these "zombie" connections, so spam can't be traced back to them. Much like breaking into a bank and emptying the cash drawer but neglecting to peek into the open vault, hackers take advantage of vulnerabilities to exploit networks, yet don't always use data that is right in front of them.

At Boston College, for example, letters had to be sent in March 2005 to 120,000 alumni describing an exposed database that contained Social Security numbers. College officials noted that the attacker's real motive seemed to be embedding a program that could be used to attack other computers.

"It's an odd situation," says Joy Hughes, vice president for Information Technology at George Mason University and co-chair of the Education Security Task Force for EDUCAUSE. "Most schools feel that if there's sensitive data they need to notify people that the hacker was after the data, but in almost all cases, that's not what's happening. Unfortunately, it takes months to know that, though, and in the meantime you have to do notification just to be safe."

"It just takes one vulnerability for a malicious person to get in. And they will, because they think like criminals, and we don't."
-Elaine David, University of Connecticut

If a breach has taken place, it's imperative to act as quickly as possible, say officials at IHEs that have been through the wringer. Although forensic evidence takes time to collect, immediate steps include shutting down servers in question and notifying those affected, if necessary (see "Notify or Not?").

Notification involves bringing together departments such as Public Affairs, Human Resources, and Student Services, says Gordon Wishon, associate provost and chief information officer at the University of Notre Dame (Ind.), which is investigating the January hack of its Development Office server.

"What needs to be determined right away is the nature and severity of the incident, and how it will be communicated to those affected," Wishon says. "We used to treat every incident alike, but we've learned that there are different levels of severity, and we need to have a hierarchy of actions."

Included in such letters and press releases should be guidance about how to deal with any subsequent impact of the exposure, Wishon notes. Clearly stating that data has been compromised, but that there's no evidence yet it's being used to pursue identity theft, can keep panic to a minimum.

At George Mason, 35,000 student and faculty records were exposed when the ID card server was hacked, leading the university to notify individuals through a note from the IT department as well as a web page set up for updates on the investigation. The correspondence couldn't say with certainty that any data had been taken, but it did convey that it was "highly unlikely" the data was being used, according to Hughes.

George Mason then took the next step in post-breach strategy, launching an extensive forensic examination on several levels, says Hughes. The school made copies of its logs for internal inspection, for the FBI, and for a private computer forensic investigation company. "Doing full forensics was the only way for us to know for sure what happened," says Hughes. "The FBI has many other things to do, so it took them nine months to complete their report. People who have exposed data can't wait that long to find out that they have nothing to worry about."
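
One common way to preserve such evidence copies, though not something George Mason describes in detail, is to record a cryptographic hash of every file handed over, so each recipient can later verify that nothing was altered. A minimal sketch, assuming a Python environment and with hypothetical file paths standing in for a school's real log locations:

```python
# Sketch: copy server logs and record SHA-256 hashes so each recipient
# (internal staff, law enforcement, outside examiners) can verify integrity.
# Paths are hypothetical placeholders, not details from the article.
import hashlib
import shutil
from pathlib import Path

SOURCE_DIR = Path("/var/log/idcard-server")      # assumed log location
EVIDENCE_DIR = Path("/forensics/case-2005-01")   # assumed evidence store

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def preserve_logs() -> None:
    """Copy each log file and write a manifest of hashes alongside it."""
    EVIDENCE_DIR.mkdir(parents=True, exist_ok=True)
    manifest = EVIDENCE_DIR / "MANIFEST.sha256"
    with manifest.open("w") as out:
        for log_file in sorted(SOURCE_DIR.glob("*.log")):
            copy = EVIDENCE_DIR / log_file.name
            shutil.copy2(log_file, copy)   # preserve timestamps and metadata
            out.write(f"{sha256(copy)}  {copy.name}\n")

if __name__ == "__main__":
    preserve_logs()
```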

When recovering from a breach, IHEs can expect to spend more than just time on the issue. Every breach requires staff resources and usually demands external forensic examiners who can double-check the evidence and issue reports to be used in court. Even if the perpetrator is known, it can take months, and even years, to resolve the issue for good.

In 2003, the University of Texas at Austin discovered that someone had learned how to extract Social Security numbers from an internal database, acquiring about 37,000 numbers belonging to students, faculty, and staff. The university alerted those affected, but it was two years, and about $165,000, before the perpetrator was brought to court.

To prove its case, UT Austin had to create detailed forensic analysis records, work with the U.S. Secret Service to confiscate computer equipment, pore over server logs, and hire attorneys to handle prosecution. Because the hacker was one of its own students, the university also put in time dealing with the public relations side of the incident. "It was expensive, and made worse by the fact that there's no joy in prosecuting one of your own students," says Dan Updegrove, vice president of Information Technology at the university. "But we felt like somebody needed to send a message to the hacker community that there are consequences to aggressive and irresponsible behavior."

The school learned the importance of creating a "fire drill" team, Updegrove notes. "It would have been great if we could have gone through some exercises before this happened, so we would have been prepared," he says. "It would have made our response much smoother." Although the university has deep experience now with breach recovery, Updegrove believes that much can still be learned, both by UT Austin and other schools, by creating different types of "incidents" and responding to them. "Declare a security breach drill," he advises. "Dummy up some memos with descriptions based on real breaches, and for the next three hours pretend it's the real thing. Then, if and when a breach happens, at least you won't be taken by surprise."

On the drill team should be members of departments that are pulled together during a real breach, and even attorneys and local law enforcement. Updegrove notes that incidents at other IHEs can provide the framework necessary to run through a drill, and that different types of incidents should be covered. "Security breaches come in lots of different flavors, from hackers getting into ID card databases to a virus that steals data from a medical school," he says. "Create an incident response plan and it will give you a great base to work from if there's a breach."

Fire drill teams can also spread the word about security, says Casey Green, director of the Campus Computing Project, which studies the role of IT in higher ed. "There is such a need for communication about the role of individual users," he notes. "The responsibility isn't just on IT, it carries over to every department, every student, every faculty and staff member."

Many schools that have gone through incidents have also responded by changing technology environments and looking at staff roles. At George Mason, Hughes realized that server logs would be useless for forensic examination if the hacker was clever enough to modify them. The university invested in separate, secure servers for storing logged files. Like forensic work itself, the strategy didn't come cheap, since servers cost money and require trained, high-level IT employees to run them. "Any time you put in a new set of servers, it means someone has to manage it," says Hughes. "It's costly, but it's worth it."
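
The principle behind the separate servers is that logs written to a machine the intruder controls can't be trusted. A rough sketch of one way to forward entries to a dedicated log host, with the hostname and port as hypothetical examples rather than George Mason's actual setup:

```python
# Sketch: send application log entries to a separate, hardened log host
# so an intruder on this server cannot quietly rewrite the stored history.
# The hostname and port below are invented examples.
import logging
import logging.handlers

logger = logging.getLogger("campus-app")
logger.setLevel(logging.INFO)

# Local copy for day-to-day troubleshooting.
logger.addHandler(logging.FileHandler("campus-app.log"))

# Remote copy on a dedicated, locked-down syslog server.
remote = logging.handlers.SysLogHandler(address=("loghost.example.edu", 514))
remote.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(remote)

logger.info("User lookup from 10.1.2.3 against the ID card database")
```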

Notre Dame's Wishon can't comment on his university's incident, since evidence is still being collected, but he does note that the university is concentrating on hiring more security staff and beefing up preventive efforts. A breach has a tendency to focus attention on areas that are unguarded, he says, and a first step for any IHE in preventing more attacks is to get an idea of what IT activities are taking place. "The greatest challenge is the fact that our IT functions are so distributed across the campus," he says. "It's difficult to even identify locations where processing and maintenance might be taking place. So, our prevention efforts going forward will start with rigorous risk assessment and a thorough inventory."

UConn officials realize that they must change procedures in terms of where, how, and what data is stored. Before the breach, the university was using Social Security numbers as primary identifiers for students; it has since begun creating random ID numbers.
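
The mechanics of issuing random identifiers are straightforward. A minimal sketch, assuming a Python environment and a nine-digit format chosen purely for illustration (UConn's actual scheme isn't described):

```python
# Sketch: issue random campus identifiers so the Social Security number
# never has to serve as a primary key. The ID format is an assumption.
import secrets

def new_campus_id(existing_ids: set[str]) -> str:
    """Generate an unused nine-digit ID using a cryptographically secure RNG."""
    while True:
        candidate = f"{secrets.randbelow(10**9):09d}"
        if candidate not in existing_ids:
            existing_ids.add(candidate)
            return candidate

issued: set[str] = set()
print(new_campus_id(issued))  # e.g. "482915037"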

Encryption is another option, although it can be a budget breaker in some cases. But for the most sensitive data, encryption adds another layer of protection, one that offers real comfort if someone does get into the server, notes Chomicz of CDW-G.
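
As an illustration of field-level encryption, the sketch below uses the third-party Python "cryptography" package, an assumption rather than anything the article's sources name; a real deployment would also need a key-management plan so the key isn't stored next to the data:

```python
# Sketch: encrypt only the most sensitive fields (here, an SSN) before they
# are written to disk, so a copied database file is not immediately usable.
# Key handling is deliberately simplified for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, kept in a separate key vault
cipher = Fernet(key)

ssn = "123-45-6789"
token = cipher.encrypt(ssn.encode())        # store this value in the database
print(token)

recovered = cipher.decrypt(token).decode()  # only possible with the key
assert recovered == ssn
```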

IHEs may also want to create smaller zones in the network, says Andy Salo, director of Product Management at TippingPoint, a division of 3Com. If network segments have fewer connections to each other, they have less chance of being infected, he says, and attackers won't be able to use one compromised server as a stepping stone to the next.

In general, while it takes a bite out of the IT staff budget, it's a good idea to hire a "network traffic cop," Salo says. "We've noticed that the universities that are vigilant about monitoring notice anomalies faster, and can prevent some breaches. It's the ones that think they're protected that never see it coming."
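
A toy example of the kind of baseline watching a traffic cop might automate, with invented numbers and a deliberately crude threshold; commercial monitoring tools are far more sophisticated:

```python
# Sketch: flag hosts whose connection counts jump far above their historical
# average -- a crude stand-in for the "network traffic cop" role.
# The sample data below is invented for illustration.
from statistics import mean, pstdev

history = {  # connections per hour observed over the past week (assumed data)
    "dorm-subnet-12": [40, 55, 48, 60, 52, 47, 50],
    "idcard-server":  [5, 4, 6, 5, 5, 4, 6],
}
current = {"dorm-subnet-12": 58, "idcard-server": 540}  # this hour

for host, samples in history.items():
    baseline, spread = mean(samples), pstdev(samples) or 1.0
    if current[host] > baseline + 4 * spread:
        print(f"ALERT: {host} at {current[host]} connections/hr "
              f"(baseline {baseline:.0f})")
```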

Elizabeth Millard, a Minneapolis-based freelance writer, specializes in covering technology.

