Having coffee with Mike Chapple

Published: June 27, 2018

Navigating the news from an interconnected world can feel like drinking from a fire hose. High-profile hacks target a U.S. presidential election and expose consumers’ financial information. Companies gobble up seemingly innocuous data once shared with a social network to serve up manipulative political ads. Emails deluge inboxes with updates on new privacy policies that leave people wondering about how their information is — or has been — used.

Photo by Matt Cashore '94

But if you had to pick a guide to help you parse the mind-bending blur of headlines and hot takes, you would do well to chat with Mike Chapple ’97, ’09 Ph.D.

Throughout his career, Chapple has focused on cybersecurity, business analytics, cloud computing, and IT compliance. He’s worked for the National Security Agency, served as an active duty intelligence officer in the U.S. Air Force, and earned advanced degrees in business, computer science and engineering. And he publishes expert analyses for such media outlets as Forbes, CNBC, and USA Today.

“We live in interesting times,” says Chapple, the academic director of Notre Dame’s Master of Science in Business Analytics program and associate teaching professor of IT, Analytics, and Operations. He’s ticking off a series of high-profile breaches as he points out how security constantly evolves to thwart hackers.

“We’re probably at a high-water mark in terms of publicity for these issues,” Chapple says. “They’re not going to go away. It’s not like we’re going to invest millions of dollars and tomorrow everything will be solved. There’s no silver bullet when it comes to cybersecurity. What makes it harder and harder every day is, our technology gets more and more complex. And as things get more complex, there naturally then become more avenues of attack.”

It’s a topic Chapple takes seriously in his teaching. Cybersecurity was once standard fare for IT Management majors, but it’s increasingly important for his students in the University’s Chicago-based MSBA program. They’ll go on to both prevent and deal with security breaches, and he wants to arm them with case studies that will enable them to understand the technical side of things and communicate effectively with customers, regulators and news media.

When he’s not talking security, Chapple is equally comfortable explaining the nuances of digital privacy. Anyone who’s checked email in recent months has noticed how companies and organizations are rapidly updating their privacy policies — a response to the European Union’s General Data Protection Regulation.

The GDPR highlights the different ways data is treated internationally, Chapple says. In the U.S., people have become accustomed to a patchwork of privacy regulations that create compliance headaches for companies and organizations. A university like Notre Dame, for instance, must comply with one set of regulations for financial aid, another for student health, and still another for academic records.

“If you think of yourself as a digital entity, your healthcare records are covered by HIPAA, your college and high school records are covered by FERPA, and your financial records are covered by GLBA, but there’s nothing covering you, the person,” Chapple says. “The European Union takes a completely different approach. They regulate personal information.”

Europe’s umbrella approach to regulating data has created a global ripple effect, Chapple says. Companies want to avoid the steep fines they could face if they mishandle a Belgian or French consumer’s data, so they’ve decided it’s just easier to follow the European regulations across the board. He sees parallels with the American auto industry, which now finds it cheaper to make cars that meet California emissions standards, even if most of them will be sold elsewhere.

Ultimately, Chapple says, the GDPR is a win for individual consumers, no matter where they live.

“It’s caused every organization to think about their privacy practices, and that’s always a good thing,” he says. “They’re going through and thinking about what information they have. Do they need it? Should they keep it? What should they do with it? And everyone benefits when that happens.”

Companies aren’t the only ones thinking about privacy. Chapple has noticed how his students have become increasingly sophisticated at navigating their own digital footprint. Edward Snowden taught them how governments eavesdrop and collect information on citizens. Cambridge Analytica reminded them that companies are keen to gather data and exploit it for profit in ways they may not have anticipated.

“I think today’s students are more privacy-conscious than previous generations, and I think it’s because they understand these issues and they’ve lived through it,” Chapple says. “They also in some cases may be free with their information — being privacy-conscious doesn’t necessarily mean you’re not going to give information to people — but you’re going to think about it and you’re going to understand what people are doing with your information. I think today’s students, because they’ve grown up in this world, have more of an understanding of the ways data can be connected across different sources and the types of things that companies and organizations can learn about them.”

Chapple’s students will also think about analytics from a business perspective, and he wants them to do so in socially responsible ways. It will be important, he says, for them to develop models that both protect privacy and help their companies succeed.

For instance, Chapple says, today’s students may one day need to gather data in a way that allows researchers and analysts to draw conclusions and recommend improvements without uniquely identifying people.
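The article doesn’t describe a specific method, but the idea — letting analysts draw conclusions without uniquely identifying people — can be sketched with standard de-identification steps: replace direct identifiers with salted one-way hashes and generalize quasi-identifiers before analysis. The records, field names, and salt below are invented for illustration.

```python
import hashlib
import statistics

# Hypothetical customer records -- names and values invented for illustration.
records = [
    {"name": "Alice Smith", "zip": "46556", "spend": 120.0},
    {"name": "Bob Jones",   "zip": "46556", "spend": 80.0},
    {"name": "Carol Diaz",  "zip": "60601", "spend": 200.0},
]

SALT = b"rotate-me-regularly"  # stored separately from the data set

def pseudonymize(name: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256(SALT + name.encode()).hexdigest()[:12]

def deidentify(record: dict) -> dict:
    """Drop the name; keep a pseudonym and a coarsened location."""
    return {
        "id": pseudonymize(record["name"]),
        # Generalizing ZIP to its prefix reduces re-identification risk.
        "zip3": record["zip"][:3] + "XX",
        "spend": record["spend"],
    }

safe = [deidentify(r) for r in records]

# Analysts can still compute useful aggregates without seeing who is who.
print(round(statistics.mean(r["spend"] for r in safe), 2))
```

This is only a sketch: real de-identification also has to consider how fields combine across data sets, which is exactly the cross-source linkage Chapple says his students have come to understand.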

In addition, he says, graduates may eventually need to help design transparent algorithms or at least interpret their results with a degree of sophistication to ensure they don’t simply absorb and perpetuate human biases. In a world of big data, that’s something Chapple sees as a crucial skill.

“All we do when we build automations and build models that make decisions for us based on data that we already have is teach the model what’s already happened,” he says. “And if what’s already happened has some kind of inequality or other undesirable characteristic to it, we’re just going to wind up automating that same problem.”
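Chapple’s point — that a model trained on biased history automates that bias — can be made concrete with a toy sketch. The data below is entirely invented: past reviewers approved group “A” applicants more often than identically qualified group “B” applicants, and a naive model “learned” from those decisions simply replays the pattern.

```python
from collections import defaultdict

# Toy historical decisions -- invented to illustrate the point.
# Identically qualified applicants, but group "B" was approved less often.
history = [
    ("A", "qualified", 1), ("A", "qualified", 1), ("A", "qualified", 1),
    ("B", "qualified", 0), ("B", "qualified", 0), ("B", "qualified", 1),
]

def train(data):
    """'Learn' the historical approval rate for each (group, qualification) pair."""
    totals = defaultdict(lambda: [0, 0])  # (approvals, count)
    for group, qual, decision in data:
        totals[(group, qual)][0] += decision
        totals[(group, qual)][1] += 1
    # Predict approval whenever the historical rate is at least 50%.
    return {key: (approvals / count) >= 0.5
            for key, (approvals, count) in totals.items()}

model = train(history)

# Same qualifications, different outcomes: the model has faithfully
# automated yesterday's inequality.
print(model[("A", "qualified")])  # True
print(model[("B", "qualified")])  # False
```

Nothing in the code is malicious; the model is accurate about the past, which is precisely the problem Chapple describes.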