
On Topic

Higher ed cybersecurity requires a touch of paranoia

How to protect our schools—and ourselves—from internet attacks
University Business, March 2018
Joanne Martin, a member of the Women in Technology International Hall of Fame, will share her perspectives as a keynote speaker in June at the UBTech 2018 conference in Las Vegas.

Editor's Note: University Business welcomes the insights and opinions of educators and administrators on all topics. If you would like to contribute a guest column, please contact Tim Goral at tgoral@promediagrp.com.

Joanne Martin was a witness to and participant in the development of what we know now as the internet. As the former chief information security officer and vice president for IT risk at IBM, she ensured the firm’s information assets were protected.

While she is an enthusiastic supporter of digital technologies, she also cautions that boundaries must be set to keep these emergent technologies from being abused.

“We’ve seen what happened as social networks were developed without bounds,” Martin says. “A lot of people are quite concerned about that, but it’s not very clear how you put that one back in the bag.”

Martin is also a passionate advocate for women in technology and is a member of the Women in Technology International Hall of Fame. She will share her perspectives as a keynote speaker in June at the UBTech 2018 conference in Las Vegas.

Schools have to balance the need for security with the desire for openness and sharing. That must present many challenges for cybersecurity.

Yes, it is a very big challenge. The professors have tenure and don’t want to lose control of what they’re doing, so the issues around how to manage that open and private conversation while still staying secure and protecting privacy are key.

In a government or a business world it’s much easier to say, “These are the rules. You have to follow them.” In universities there’s a larger need to pull people along, to help them understand why things are the way they are and how to handle them.  

What is the biggest threat? Is it malware? Phishing? Bots?

I’ll tell you that almost everywhere phishing is the No. 1 threat. It hits businesses. It hits universities. We have a lot of personal data and a lot of intellectual property to protect. Focusing on what data to maintain where and putting specific controls around those areas becomes important.

Clearly, a university system could be brought down by malware. It could do reputational damage—maybe not as much as to some businesses, but it could hurt if their systems were down for a while. And universities that are attached to medical centers have a whole host of other issues because they have HIPAA confidentiality issues around that part of the database.  

You wrote, “Cybersecurity awareness is not just knowledge. Knowing isn’t doing.” What is cybersecurity awareness?

I think we all need to be a touch paranoid on a personal level.

I have grandkids and I am very careful to not post pictures of them on my Facebook, except to a very limited group of people because I wouldn’t want someone to surf and to see what my grandkids look like, what school they go to, and be able to go to the school and say, “I know your grandmother,” for example.

We almost have to live with an air of paranoia around our own electronic communications, no matter what environment we’re in.

Cyberawareness is understanding that these tools are not just benign tools that help us. They are tools that could cause us great problems. Be aware of the fact that there is a very well-funded adversarial community that would like to do damage.

It’s like the posters you see in public spaces: “If you see something, say something.”  

How can a university impress this message on students—and faculty for that matter?  

I think that having the faculty believe it first is really important. Because students will look to them as a model, and sometimes you have to work on the faculty to make sure they know that there are some real threats there. Beyond that, you need to make people as aware of this as they are aware of other threats on campus.

Sometimes campuses have problems with robberies or other things, and there is awareness of what you should do if that happens. The same holds true of cybersecurity. I recommend doing regular simulated phishing tests. These tests purposely catch students off guard—that's the lesson—without the risk of causing damage to a network.

You can send out fake emails to students that look like they’re from the dean or from the faculty, and see how they respond. If they click the link, they see a screen that says, “You just got caught!” It’s a very good way to inform and to educate because no one likes to feel they just did something stupid.

And if you can catch them, right at that moment you make them aware that next time it could be an attack.
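The exercise Martin describes can be sketched in a few lines of code. This is an illustration only: the addresses, the landing-page URL, and the handler names are hypothetical, and a real campaign would require the institution's mail relay and explicit authorization from campus IT leadership.

```python
# Minimal sketch of a benign phishing exercise, assuming a hypothetical
# campus training site and spoofed-looking but internal sender address.
import uuid
from email.message import EmailMessage

LANDING_PAGE = "https://training.example.edu/caught"  # hypothetical URL

def build_test_email(recipient: str) -> tuple[EmailMessage, str]:
    """Build a simulated phishing email with a per-recipient tracking token."""
    token = uuid.uuid4().hex
    msg = EmailMessage()
    msg["From"] = "dean@example.edu"  # looks official, stays in-house
    msg["To"] = recipient
    msg["Subject"] = "Action required: verify your account"
    msg.set_content(
        "Your account will be suspended.\n"
        f"Click here to verify: {LANDING_PAGE}?t={token}\n"
    )
    return msg, token

clicked: dict[str, str] = {}  # token -> recipient who followed the link

def record_click(token: str, tokens: dict[str, str]) -> str:
    """Landing-page handler: log the click and deliver the lesson."""
    clicked[token] = tokens.get(token, "unknown")
    return "You just got caught! This was a training exercise."
```

In practice the built messages would go out through the campus mail server, and the landing page would show the "You just got caught!" screen while recording which tokens were clicked, so IT staff can follow up with targeted training rather than punishment.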

Are there schools that do that?

I don’t know how many schools might be doing it at the moment, but fake phishing emails are fairly standard in industry. I have a number of clients that run monthly phishing exercises because it keeps that level of sensitivity high, and people respond to that.

Your career rise was concurrent with the development of the internet. Are you surprised at what it has become?  

Absolutely. I don’t think any of us involved in the early development appreciated where it could go. It looks very exciting, so we drive forward and don’t realize what it has done until it’s too late.

We do that with many technologies, but this is one where we pushed to the wall to grow, to make it bigger, to make it better, without recognizing that there was this other side that was also being unleashed.

I don’t think much focus was given to that as development was done. Maybe that’s a bit of a cautionary tale for all of us as we continue to develop technologies like artificial intelligence.

There’s a real cultural downside to that, and we should be careful about the developments there. I don’t know where it will go, but I think it’s a healthy discussion for researchers to contemplate what it means if AI is developed without boundaries.

You are a longtime advocate for women in technology. Why is it that girls exhibit such a strong interest in math and science at a young age, but then that interest wanes?

If I knew the answer to that, I would be so happy. I’ve been on a number of groups trying to figure that out. I will tell you that I think for girls it becomes socially unacceptable to excel. I had girlfriends who hid their report cards because they were very bad. I hid mine because it was very good.

When I was in graduate school, I taught a class in mathematics for elementary school teachers and it was a distressing class for me. I had 25 young women who were terrified of math. They just hated it. They only wanted to take this one-credit course and get through it so they could get their elementary education degree.

All I could see was 25 elementary school teachers who were going to go out in the world and convince their kids that math was a terrible thing.

So, how do we get past that to where we help our kids recognize that they need math as much as anybody, and it’s a good thing? Time, maybe. I don’t know.

I honestly don’t have a lot of hope in this area right now, because I think mostly we’ve backslid as a country and it’s going to take a while to help to recover that in terms of roles and who should do what and who has what capabilities.

On the other hand, I have two nieces who are math and engineering majors in college today. So, somehow, those messages are getting through. It’s a tough one, and I think our social norms have a lot to do with it. Until we can break down what that means, we are only going to put Band-Aids on this thing. 


Tim Goral is senior editor of UB.