THE ALUMNI MAGAZINE OF NORWICH UNIVERSITY
Photo: Michael Battig looks up from his office desk

Professor Michael Battig, Ph.D., returns to Norwich to lead the School of Cybersecurity, Data Science, and Computing

INTERVIEW BY SEAN MARKEY
NORWICH RECORD | Spring 2021

Prof. Michael Battig spent a decade working as a software engineer1 in industry before his desire to teach college-level computer science inspired him to return to school and earn his doctorate. In August, Battig rejoined the Norwich faculty after a 20-year hiatus2 to lead the newly reconfigured School of Cybersecurity, Data Science, and Computing. Battig specializes in software design and how to teach computer science. But he knows enough about cybersecurity to describe the onslaught of hacks, security breaches, and data thefts by Russia, China, and other state actors against U.S. companies and government agencies as nothing less than a new Cold War. Albeit one that’s gone digital. “The work we’re doing here couldn’t be more important,” he says of the program he now leads. In a wide-ranging interview, Battig talks about the future of technology, the wisdom of playing to your strengths, and why he teaches a course on the ethics of computing.

How did you get interested in software engineering?
I’ve always had an aptitude for problem solving and love the work that engineers do. The thing that has drawn me to this work is that, whether you’re talking about cybersecurity or software engineering, this field is one that marries a student’s need to be technically astute with being relationally savvy. Good engineers relate well to people, and they’re also great problem solvers. You can’t overstate the value of either one of those. Because no matter what you do, if you’re building software, you’re in the people business. You can’t be this quintessential geek who sits in a cubicle and doesn’t relate to anybody. You have to be a good communicator. Those soft skills will take you a long way.

When you look at new technologies coming on the horizon, what do you see?

As the saying goes, “It’s tough to make predictions, especially about the future.” One of the authors who has really influenced me, and whose work I use in my ethics class, is a guy named Neil Postman3, who said the inventor of a technology is not necessarily the best predictor of what will become of it. Postman was not a computer scientist. He was in the humanities.

The driving force when we start talking about things like quantum computing, in my mind, is just an extension of Moore’s Law4. A lot of people have been predicting the end of Moore’s Law for integrated circuit chips, for example, arguing that we’re hitting a wall in terms of the physics. We’re getting down to the nanometer scale, where the electrons start interfering with one another on the integrated circuit chip. You just can’t miniaturize any further. But I view quantum computing as just the next paradigm in the continuation, really the unbridled continuation, of [the doubling of computing power every 1.5 to 2 years].

So how does a small school like Norwich stay relevant when it comes to machine learning, artificial intelligence, and similar innovations?

That’s a great question. Machine learning and artificial intelligence are hot areas, but they’re not new. When I did my Ph.D. in software engineering, I minored in artificial intelligence, and I’ve been teaching AI for a good 20 years.

But I think that you have to focus on your strengths. I don’t care if it’s in a marriage, your career, or work here in the school. Our strengths are not going to be cutting-edge research. If you start focusing on your weaknesses, you’re going to do what Peter Lynch warned about. He was the phenomenally successful investor who managed the Magellan Fund for Fidelity Investments back in the ’80s and ’90s. Lynch said the problem with a lot of companies is that they “di-worse-ify.” What he meant by diworsification is that they get into businesses that they’re worse at. I saw this firsthand when I worked in industry. I had a short stint with JCPenney, one of the top retail companies in America at the time. They decided to get into casualty insurance. They were terrible at it. The whole thing imploded, and I saw firsthand what diworsification can do. We’re not going to diworsify Norwich University.

We need to focus on application. We’re not going to go out there and try to compete with Carnegie Mellon or MIT, trying to assemble a world-class research faculty. Instead, we’re going to ask: How can we graduate students who are knowledgeable about these technologies and understand how they can be applied to the set of problems or knowledge domains that they want to attack?

So what’s your biggest challenge ahead?

My vision is that we continue to attract really good students and really good faculty members. I think we’ve done a good job doing that. But because the world is changing so fast, the competition for faculty members and students in cyber is so fierce that I have to be really vigilant. We have to really work hard at continuing our alliances with the senior military colleges5, with NUARI, with DoD, with NSA. There are a lot of relationships that have to be continuously maintained, so that we really make the most of these opportunities. I took this position knowing that this is a very successful program. It’s the fastest-growing program at Norwich. But complacency would kill it in a minute.

Are there things that concern you about the increased role technology plays in our lives today, especially since the pandemic?

Well, I think that there are a lot of things that concern me about technology. I have five adult children. A couple of them contacted me this fall and said, “Dad, you need to watch The Social Dilemma6 on Netflix.” So my wife and I watched it. Literally, almost every major point in the documentary is something that I’ve been teaching in my ethics class for years: In short, there’s also a downside to technology.

I made the executive decision to teach the school’s ethics class, CS 330. It hasn’t been taught in many, many semesters. I have 20 students enrolled; most are cyber majors. I’m super excited about it. We won’t shy away from hard questions about tech’s downside. Because I don’t want students graduating with this muddle-headed idea that technology is just all wonderful. Anybody who thinks there’s not a downside doesn’t understand what happened on 9/11. It’s always a two-edged sword. I want students to have the maturity to understand what’s lost in the technology, not just what’s gained.

That’s where having the traditional view of the liberal arts is important to me. Because the notion of the liberal arts is that students should be liberally educated. I don’t mean as capital “L” political Liberals. But rather, that they’re not afraid to look at other disciplines and look for the connections between them. What is there in my political science class that I can connect to my computer science class? Are there other common threads? Can I build a coherent worldview that understands not only human nature but technology?

I would like to graduate Norwich students who understand that life is multifaceted and that it’s a lifelong endeavor to try to understand the world and to be a positive citizen. This is why the Norwich values mean a lot to me, particularly the idea that we will dialogue with and understand people with different viewpoints. The echo chambers of social media today are creating a weird dynamic where we’re ignorantly talking past one another in many ways. In a small way, I’d like to make a dent in that so that our graduates are as liberally educated as they can be. That includes having a realistic assessment of not only what’s gained in technology, but what’s lost.

Can you unpack that a little more? How do you examine that in your class?

Technology is great at making things more efficient. To paraphrase a famous author, “When you optimize everything, nothing is fun.” So what we’re trying to do is look at the human element in the picture. For example, in what ways do we depersonalize life and humanity with technology? That’s one of the big questions that I want students to wrestle with.

I grew up near Wayne and Holmes Counties in Ohio, which have the largest population of Old Order Amish in the world. I grew up in the midst of that community. I’m not Amish, but I always watched them with intrigue. There were a lot of rumors that would fly around about why they did different things. I remember they didn’t use pneumatic tires on their tractors. I asked a mechanic once why not. He said, “They think the devil is in the air.” That actually turned out to be nonsense. As an adult, when I began to study the Amish, I found that they’re very reasonable people. They spend a lot of time thinking about questions like, “Hey, if we speed this process up, what will we lose in the process?”

I’m not going to become Amish, because the Amish don’t use electricity and they certainly don’t have a need for computer scientists. But I like that thought process. I really like the fact that they very intentionally think about things like, “If we adopt internal combustion engines on our balers, what will be the impact on our family life?” That’s a question Americans don’t even think about. I mean, what college student at Norwich is thinking about, “What am I losing through the constant use of my smartphone?”

Students aren’t thinking about that, and I want them to. I want them to think about questions like, what are the downsides here? At least begin to open it up for a conversation. For my generation, growing up it was the television. We talked a lot about what was on television, but we never talked about what television was doing to us. What was it doing to our view of education and our view of the political process? Those were bigger questions than, “Hey, what do you think is going to happen on ‘The Waltons’ tonight?”

What about the upside of technology? Do you explore that in your class?
It’s sort of a question that doesn’t need to be asked. I mean, everybody understands that. Look at the Pfizer and the Moderna vaccines. Right there is an upside to technology. When I was a graduate student, my appendix started to burst. A hundred years earlier, I probably would have died. Instead, I was given intravenous antibiotics; I had surgery a few hours later; and, boom, I was back in class before long. On the medical-tech side, it’s undeniable. Usually, we’re talking about saving lives. How about traction control systems on modern automobiles? Most people have no idea how that works, or what that technology is. But that’s a computer-based technology that basically prevents you from rolling your car due to oversteering.

The list on the upsides of technology is endless. Americans are really, really good at citing those things. In my ethics class, I don’t need to teach the upside of technology. In the American technological society that we grow up in, we’re taught implicitly to see all the upsides of technology. But it’s a very different gear to look critically at the question, “What am I losing with this technology?”

Interview condensed and edited for length and clarity.

1 Battig worked as a systems analyst for J.C. Penney Co. in Ohio and Texas.
2 He joined Norwich’s faculty in 1998 as an associate professor of computer science and left in 2000 to teach at St. Michael’s College.
3 Postman, who died in 2003, was an educator, media theorist, and cultural critic who wrote about the negative influence of personal computers and technology on education, among other topics.
4 The observation, first made by Silicon Valley engineer Gordon Moore in 1965 and later revised, that the number of transistors in computer chips (a rough proxy for computing power) doubles roughly every two years.
5 Norwich is the principal investigator of a $10 million Defense Department cybersecurity grant awarded in September to six senior military colleges. The project aims to stand up cybersecurity institutes to act as pipelines for diverse, next-generation cybersecurity talent.
6 The 2020 documentary explores how Silicon Valley technology companies use persuasion psychology to hijack our attention span and gather unprecedented personal data in service of predicting and influencing consumer behavior for profit.
