Why Cybersecurity Matters To All Leaders – Including You

As the world grows more digital, organizations and governments continue to spend billions of dollars on cybersecurity. Yet as their investment has grown, the frequency and consequences of cyber breaches have not decreased noticeably. Stories of data breaches and ransomware attacks appear more regularly in the news, impacting organizations of all sizes in every sector.

In this episode...

In this episode, we explore why leaders at all levels and across all settings should consider cybersecurity an organizational lifestyle in which everyone plays an important role. We set out to unpack murky or ambiguous decisions about cybersecurity and discuss how leaders’ responsibilities change in light of digital stewardship principles.

Our guests include Mark Segal, Chief Information Officer at CBI Health, Dr. Jeff Curtis, Chief Privacy Officer for Sunnybrook Health Sciences Centre and Dr. Laurel Austin, Adjunct Research Professor in Management Science. Alongside our host, Ivey Professor Mazi Raz, guests discuss the myths, missteps, miscalculations, and misguided voices about cybersecurity, how to approach cyber risks as a leader, as well as how to integrate cybersecurity management and response into your organizational character.


Additional Reading:

If you're interested in learning more about cybersecurity, we've linked the following resources recommended by our livestream panelists Dr. Laurel Austin and Mark Segal:

  • CyberSecure Canada is the country's cybersecurity certification program for small and medium-sized organizations. Certification can enhance your competitive advantage by letting your supply chain know you're a trusted business partner.
  • Krebs on Security, a blog authored by Brian Krebs, an ex-reporter for the Washington Post, provides news on cybersecurity and investigations surrounding breaches and threats.

 

Follow The Ivey Academy on LinkedIn, Twitter, and Instagram for future virtual event announcements.

About The Ivey Academy at Ivey Business School
The Ivey Academy at Ivey Business School is the home for executive Learning and Development (L&D) in Canada. It is Canada’s only full-service L&D house, blending Financial Times top-ranked university-based executive education with talent assessment, instructional design and strategy, and behaviour change sustainment. 

Rooted in Ivey Business School’s real-world leadership approach, The Ivey Academy is a place where professionals come to get better, to break old habits and establish new ones, to practice, to change, to obtain coaching and support, and to join a powerful peer network. Follow The Ivey Academy on LinkedIn, Twitter, and Instagram.

Full Transcript:

MAZI RAZ: Hello, everyone. Welcome to our livestream, Why Cybersecurity Matters to All Leaders, Including You. My name is Mazi Raz. I'm the Director of Learning Design and Strategy at The Ivey Academy. The Ivey Academy and the Ivey Business School are located on the traditional lands of the Anishinaabe, Haudenosaunee, Lūnaapéewak, and Attawandaron peoples. This land continues to be home to diverse Indigenous peoples, whom we recognize as contemporary stewards of the land and vital contributors to our society.

Cyberattacks are now very common. A recent report by Security magazine showed that, on a global scale, hackers attack a computer approximately every 30 seconds. In the news, we hear more and more about cybersecurity incidents affecting companies in various sectors. Many more incidents never make it to the news at all, for all sorts of reasons.

We know the effects of such security breaches can be wide: impacting individuals, damaging businesses, and costing jobs. Our focus today is why cybersecurity is important to all leaders. Today we are joined by three guests, Dr. Laurel Austin, Mark Segal, and Dr. Jeff Curtis. Laurel is an Ivey faculty member in the management science department. The focus of her research and teaching is on decision sciences and behavioral science methods to understand how people perceive risk. We are very proud to have Laurel with us at Ivey.

Mark is a senior technology and digital executive. He has over 25 years of experience in telecommunications, media, and other industries. He is super smart, and he most recently served as Senior Vice President of Business IT at Rogers Communications. Mark is a graduate of another business school in Ontario, which is not Ivey. We will not hold that against him and will welcome him with open arms here.

Jeff is the Chief Privacy Officer for Sunnybrook Health Sciences Centre. He too is super smart, and he's the director of the hospital's legal, risk, and compliance group. Jeff has impressive and extensive experience in privacy and information technology governance. He holds a doctorate of business administration from the United Kingdom. His research focus is on decision making under uncertainty.

And we have a poll ready for you. The question asks: to reduce cybersecurity risk, the key thing an organization needs to do is what? And we have four options. The first is to have the latest technology in place. The second is to increase cybersecurity awareness to change behaviors. The third is to have the latest cybersecurity expertise on staff or on call. And finally, to have emergency plans and appropriate resources in place for a possible breach. Let me start with Jeff.

JEFF CURTIS: It's an interesting poll, because what I think people have recognized is that the hardest thing to do on that list is the most important thing to do. I can do A, C, and D as a matter of a cybersecurity program. And of course, IT folks are used to doing some version of those things in many dimensions of IT. When we're talking about security, often people will say, well, if you think about it, it's sort of a people problem in some respects.

So immediately, we're in a dimension where the most important thing to do, obviously, is to make people aware and then get them to act. That's a tough thing to do, and we'll talk about it today. It probably involves more than just the IT group. It involves a range of factors within the organization that we need to pay attention to.

MAZI RAZ: So that's a really good point. And I agree with you that the answer here is, yes, increased cybersecurity awareness, but that's possibly a very difficult thing to do. Mark, in your line of work, how does this increasing awareness manifest? What have you seen work in increasing awareness?

MARK SEGAL: It's a lot of education, frankly. I think education in people's personal lives and in their corporate lives-- everyone loves it when the team runs a phishing campaign against you and then sends you for remedial training. But the reality is, where I've seen it done well, it becomes part of business training. So annual business training includes some understanding of cyber risk. It also involves educating the rest of the leadership and the board about the risk.

And I think one of the challenges-- historically, we used to keep these breaches really, really quiet, on a need-to-know basis. But having people understand that it is happening, and why it's happening, matters. It happens with different levels of outcome.

Sometimes the breaches have no effect, right? It was a non-production environment with very, very little data. It showed you a breach in the technology. But teaching people about where the breaches are coming from, looking at your competitors, looking at businesses elsewhere-- and I would say the media has done a great job in the last couple of months in coverage, making awareness just kind of part of normal discourse.

MAZI RAZ: That's quite interesting. Laurel, the answer most of our guests here today picked, which is increased cybersecurity awareness to change current behaviors-- I don't suppose that's enough on its own. I suppose there are other things that an organization may also need to take into account. What are your thoughts about that?

LAUREL AUSTIN: Yeah. Certainly all four elements that were listed are important, and we don't want to forget the others. But one thing that's certainly true is that it's increasingly important for organizations to be thinking about how to respond when there is an attack, because more and more, it seems it's not a question of if there will be an attack. It's when there will be an attack.

And more and more organizations are experiencing these kinds of problems. So in order to prepare for that, one of the things organizations want to do is really think about what they have at risk.

Traditionally, with cybersecurity, we talk about loss of data, customer data, customer privacy, and maybe employee privacy. But more and more, with the things we're seeing in the news, there's a lot more at risk. There's business continuity. If you're part of a supply chain, it's your business continuity and that of those you supply, or those who supply you.

If you're part of a large infrastructure system, even if you're just a small piece in it, something happening to you can impact not just many organizations but an entire community, and put people's lives or safety at risk. So start to think about what's at risk through our little portal, or big portal, to the internet. What kinds of things are placed at risk because of our business? And how do we prioritize those?

And then once we've prioritized those risks and taken stock of where we are, we're in a starting position to plan our strategy. What's prioritized? Sometimes we focus on the day-to-day, the easy things. And with these bigger risks, we might not even have really thought about our role in the bigger system, and our potential losses, obligations, and responsibilities to others.

MAZI RAZ: We're very lucky that all three of you today are experts in risk. So let's open this topic up a little bit more. Jeff, Laurel mentioned that it's necessary for an organization to actually have a bigger picture view of what is at risk and then also figure out priorities of how to respond to it. Have you experienced success working with senior executives in having this dialogue? And what actually contributed to the successful dialogue with the rest of the senior executives in trying to understand the big picture of what is at risk?

JEFF CURTIS: We have. And this, of course, has been a journey of almost two decades for myself, certainly here, on a couple of levels. If we're talking top down, there was a time when it was a new topic to the board, whether or not it had been a new topic in their own personal business lives. But it became pretty apparent that you had to tell the story centered on this business.

So we are in an academic health sciences center, and therefore specific types of things are at risk. We have certain business value that we have to deliver, in a context that's not often seen as business value, if you will: value to patients, researchers, the various stakeholders that are here. So it's a complex environment.

Then, whether you realized it or not, we've put a lot onto the security of our information technology and computing systems. Not that we just woke up one day and it was like that-- this has obviously been evolving for decades. But we actually used to talk about the real issue, which was networked computing, or internet-exposed computing.

So it's one thing, years ago, to have an internal network where everybody could talk to everybody else; the concept of an external bad actor invading the operations was sort of off in the distance, if present at all. So one of the key discussions, and it still bears discussing, is the connectivity that we enjoy now-- the always-on, personal ability to compute from a mobile device anywhere in the world.

On the one hand, it's delivering huge value. This is where we want to go. On the other hand, you then have to pay attention to the commensurate risks in that deal, if you will. There are other ways to operate, but it's almost inconceivable now that you would go backwards. So if the commitment is to go forwards-- to connect everything up, bring data wherever it's needed all of the time, and indeed introduce machine processing, automated transaction processing that happens at a much greater pace than what some of those folks on the board might have experienced in their careers-- then the story that has to be told is quite a bit different than it was a generation ago at least, if not even five or 10 years ago.

So we've had pretty good success bringing folks along there. The other thing I wanted to mention just quickly was that this also has to happen at the vision or product level, or the chief level of the various programs-- certainly in our hospital, where they are often their own CIOs these days. There's certainly a corporate CIO presence, or a CXO.

But all of these folks are buying their own technologies these days, and we're letting them plug them in. What's new is a resurgence of governance, of command and control. So you can plug in just about anything you want, but you have to be able to understand and describe what you're plugging in, and you have to go through some checkpoints. You can't just park any car you want in the garage. You know where it says no propane vehicles? There's a reason for that. And so we use that kind of analogy around here too, and that's been very successful.

[MUSIC PLAYING]

 

MAZI RAZ: So if I understand this correctly, what you're proposing is that it's a feature of the way that we do business, and it's here to stay. It's not just something that's a matter of 2020, 2021, 2022, and then in three years we're probably going to figure out a way of overcoming it.

JEFF CURTIS: It's a feature of the fact that this is how you do business now. It's not something off to the side.

MAZI RAZ: So then in that case, what you're proposing is that we may have to think about the way we do business. We have to approach it slightly differently and somehow embed responses to cybersecurity in the way that we do daily business.

JEFF CURTIS: Sort of maximally, or even on a micro basis, every transaction counts. Things are different today than they were yesterday. Today we have way more transactions, and the transactions are happening in a way that's not always entirely knowable or transparent. You set up these systems and you let them run, if you will. And so you'd better be sure about how they're architected. You'd better be sure about the people that are running them-- you've got to be sure. And you can never be 100% sure. But that's what then introduces us to the risk management scenario. It's not a perfect world.

And this is what, for example-- Mark can maybe talk a little bit more about this-- this is what then introduces a concept like zero trust computing. Your leadership often says, what do you mean we don't trust our people? Everyone from the board on down is trusted. Maybe there's a person out there that's not trusted. What do you mean?

Well, the answer is: how sure do you need to be? And that's the question that needs to be asked. If you don't need to be sure, or you don't care-- like everything else in life-- then you can ignore it. But at the scale and scope that our organizations are running, even the smallest organization now, you can't afford not to know about this topic. And that's really the theme of what we're talking about today.

MAZI RAZ: This is a really good point. And thank you for pointing to Mark to join us in this conversation. Mark, if I'm hearing Jeff right-- and he's raising some really good points-- Jeff is helping us understand three interrelated themes together. He starts giving us a list of all the things that we need to be sure of. We are using the term security. What does that really mean? What do we really mean by that?

MARK SEGAL: Generally, I think that security is not-- everyone thought of security as the firewall, back to Jeff's point. Everybody was in an office, and the data was safe within the building as long as no one could walk in and sit at a computer. Well, that cat's out of the bag. It's been coming out of the bag for a while, and I would say COVID has destroyed any idea of firewalls.

And so firewalls were the old security methodology: build a little wall, and put in holes where you needed things to be connected. And there'd be a conversation about it and a governance board, and off you went. Zero trust security models are really more around: you have to prove who you are, and you get access to what you need. So instead of what happens today, where you're on the network and you can get access to all the data in the data repository, it starts with, what data do you need specifically? And that's the data you get access to.

So really, I think when people look at security posture, it starts with a different way of looking at how to grant access. I'm oversimplifying, but that's what it is. But I think security is compute, and it's also people. And if you think about most breaches right now-- breaches before were very much around someone stealing your data and then publicizing it. It had brand implications, and possibly your customers not believing you. But it really has now become-- and Laurel will talk about this-- it's impacting operations, right?

The model has changed. If you look now, most breaches are coming through phishing, which has nothing to do with security technology. It's not about scanning attachments to emails. It's people responding to what they think is the normal course of business and email, and then getting a piece of virus software that locks you out of some important file, or files, or network equipment, or whatever the case may be, that stops the business from operating. You can't package meat, for example, or, as we saw with the pipeline stoppage, fuel stops flowing. It's impacting businesses.

And so now the question is, how do you fix this and get to a somewhat different security model? But also, how do you educate people so they understand that they are always the biggest vector, right? So really getting people to understand what's going on, and making it harder for them to actually divulge something they didn't need to see in the first place.

MAZI RAZ: If I understand this correctly, you're proposing that this is no longer just a technical problem. And if it were a technical problem, we probably would have had some technical solution for it. But you're suggesting that it's a behavioral issue. How do we, in that case, consider security from that point of view?

MARK SEGAL: Yeah. I mean, look, it is a technical problem and it is a behavioral problem. The security and IT folks will have to work, as part of their job description, to determine how we put the right security model in place. And I think it's a risk conversation. You can become as secure as a government agency-- no one's allowed to bring their own phone into the office and all that-- or you can decide we're going to allow for some flexibility in the workforce to do their job. [INAUDIBLE] people with mobile devices are more productive because they can see-- [INAUDIBLE] business process their copy of an X-ray on an iPad next to a client, right? Maybe at home, when they have time, that makes them more productive.

Now, the risk is you've exposed that customer's private data, their X-ray, externally. And so I think you have to understand: should we expose it? It's a risk conversation. And then, do the people who have it understand? Most people who work in call centers know credit card data is pretty important. They've all been taught not to write it down on a scrap of paper. Don't put it in a note. Don't email it to people. So really helping people understand that all data is critical. Even infrastructure design is critical. And all these things can let in a bad actor-- and in most of the breaches that you see, there are different levels of complexity. The really complex ones are not one hole.

It's people social engineering, understanding how things work. That's why SolarWinds was so scary: it gave attackers the ability to see into the structure of the IT infrastructure of companies, which makes it much easier to understand where to target. And so the answer is, yes, both. That's why I think we're here today to talk about it. It really isn't just a [INAUDIBLE] problem anymore.

MAZI RAZ: Jeff, you wanted to also comment on this.

JEFF CURTIS: If people are interested in bullet points, I think it still bears true that a good way to start these conversations, regardless of the sophistication of the audience, is to say: traditionally, information security, or IT security, was about confidentiality, integrity, and availability of the IT computing environment and/or the data assets involved there. We can talk about what those might be-- so, CIA. Any security professional can talk about those three things.

Is the data in a transaction accessible only by those who need to know, at the time they access it? That's confidentiality. And this is what tips into the privacy world. Confidentiality is not the same thing as privacy, but there it is.

Integrity is the accuracy, completeness, or computability of the systems or data. I mean, if your spreadsheet, after somebody hacks it, says 1 plus 1 equals 3, the spreadsheet might still operate. You can get to it. But it's really not acting the way it should. That's going to cause problems for your business.

And then availability-- are the systems just technically on? That's usually job number one for any CXO, but it's not the only issue in security. So: CIA. And then, classically, you can talk about people, process, and technology dimensions of either problems or controls.

LAUREL AUSTIN: One thing I wanted to just go back to for a minute was something Mark said about phishing, and that it's the primary vector for getting in. It's not so simple anymore to train people on what phishing looks like. It used to be simple. It was those messages-- I had an aunt who has a lot of money, and you--

MARK SEGAL: I'm a Nigerian prince.

LAUREL AUSTIN: Right. And those are easy to spot. But training for phishing now is much harder, because there's targeted phishing, where they have information about who works for you, what they're like, and who you might communicate with, and they can design emails that look very real. And it's coming from your boss. And of course you're going to click on the link that your boss sends you. One thing that's always happening is the bad guys are always a step ahead. They're trying new things, and they're more sophisticated than us. And so the learning is continuous, and the need to understand what's going on, and the behavior changes that are needed, is continuous.

[MUSIC PLAYING]

 

MAZI RAZ: Might I suggest a slightly different direction for this conversation? What we're doing really well here is painting a good picture of what the threat is and how it's become more sophisticated. But maybe we can slightly change the conversation and look at the responses that we as managers can actually give, and the things that we need to do to prepare for this new way of operating in the world.

Laurel, you started us on this path. Early on, you said that it's not just about security and the breaches, but also, it's preparedness and then responses. The event that we had on Tuesday, the simulation, was partly about that. Can you help the audience know a little bit more about what took place on Tuesday?

LAUREL AUSTIN: Yeah, for sure. Absolutely, that simulation is about response. How would you respond if there's an incident and you're not sure what's going on? And I think that's typically the case. Something maybe is going on, but you're not sure. And you can't shut down every time you think maybe something's going on, because if you're a big company like Amazon, maybe something's going on all the time. So the backdrop for the sim that we did was a small startup. They're growing, you're the chief technology officer, you're in New York, your headquarters is in Seattle, three time zones away. It's 8:00 AM in New York, 5:00 AM in Seattle. And you get a phone call.

We've got a distributed denial of service attack underway, and it looks like maybe there's something more to this, and we aren't sure. And so now you're virtually managing this cyber incident from afar. You're getting all of these phone calls, all of these text messages. People are being woken up at 5:00 AM in Seattle, because maybe there's a problem. So they're all calling you. Different people have different bits and pieces. You're getting advice from people.

We should shut down. We shouldn't shut down. We should make this change. We should make that change. Everyone sort of acting independently. You're trying to coordinate them from afar. Everyone's virtual, which we're all experiencing right now.

So then you go through this exercise. And some of the really interesting or great things about this sim is that it's a safe way to experience typical behaviors and typical responses. The typical things people say-- we should shut down, we shouldn't shut down. We should do A, we should do B, we should do C. We're planning a meeting. We're all going to talk in 30 minutes. Can we just wait until then?

But everyone's under pressure, right? It's very stressful. So some of the key takeaways-- first, Mazi, as you said at the beginning, the feedback we get when we do this with organizations is: that was so real. It felt so real.

We've been in a situation like that, and that was just what it was like. And my heart was pounding as I went through this simulation. So there's that. One of the great things is it helps us with scenario planning, which Jeff has talked about. And the importance of being prepared-- it helps if you can plan and practice, in a safe place, how people might respond, what they might say, and how you are going to respond to that. I think it was Jeff or Mark who said you need to know: in this situation, we're going to respond with A, B, or C. And be prepared for that. So, the importance of emergency plans.

So the value of scenario planning and preparedness, emergency plans, communication plans, and the importance of updating those comes out in the simulation. And then I think what people experience is that managing behavior is really hard. Everyone's in a panic, everyone's really stressed, they're all trying to do their best, and you're finding things out-- like maybe something an employee did in secret. They didn't want to tell anybody, but now they're thinking maybe they ought to fess up as to what happened. Or they're investigating something and they don't know yet, but now they're going to tell you.

So all of that comes out. And then finally, and then we'll move on-- I'll take it back to you, Mazi. But we talked about the importance of communication in the aftermath of an event. Because if you're a publicly traded organization, you have legal obligations to disclose things you know that might affect your company. But you also don't want to be alarming. And how do you balance those? And what do you say when you aren't actually sure what happened?

How do you admit that? But how do you not admit that? And so in the sim, people have a chance to practice and get some feedback on that.

MAZI RAZ: Laurel, thank you. And Jeff and Mark, I actually want to get your reactions to this in a second. While we're doing this, might I propose we look at the initial poll we had, in which a lot of people suggested increasing awareness about cybersecurity to change behavior. So let's also think about the last part of that phrase-- to change behavior. What are some proposed behavioral changes that we need to see in light of what Laurel has talked to us about? Jeff, let's start with you, and then I'll go to Mark.

JEFF CURTIS: Yeah, sure. So just a couple of quick comments on that. There's this whole idea that emails are risky because anybody could click on anything and anything can happen. One answer to that is that email really shouldn't be a transaction medium. It should just be a messaging medium, and you shouldn't actually be able to click on anything in an email. People will say-- and this is the internal conversation that will ensue-- what do you mean we can't attach things to email anymore? But this is going on in the large banks and among the senior players right now. The question is, what's your business? Think about that.

And if you don't even know that there are other ways of working, no wonder you're still suffering from phishing, because there are solutions to that. And there are vendors lined up out the door able to help you with that kind of thing. It was interesting to me-- the analogy to the simulation, if you think about it, is that anybody who's been on a cruise has gone through a simulation. It's the first thing you do after you get on the cruise ship. Everybody wants to go to the bar. They don't let you do that. What they do is say: come to this room, put on a life jacket, and then go to your muster station. And we're going to describe to you how to get on a boat in case we sink.

And then they open the bar, and everybody has fun. So there's sort of a reward, if you will, for doing the simulation. But it bears saying that the last thing most people think of when they get on a ship is that it's going to sink. Obviously, though, it's pretty important to have gone through some of this before the first time it happens. So this is analogous to what you want to do within your own corporation.

And this just underlines the value of simulation. Simulation, relatively speaking, is cheap. It's fun. It's a better training method, I would suggest, or at least perfectly complementary to traditional rote learning. For phishing as well, there are lots of tools out there that will let you run your own phishing simulations or tests inside-- create your own emails that are almost perfect except for one little thing in them. Anybody in finance is going to think it's a request to please send this check to this person, but it's still false.

And so that's increasingly easily done. The question is, will the culture accept the idea that we're going to simulate or test people on these various dimensions, when most people will say, yeah, I know about phishing? I don't think that's acceptable anymore. I think you have to actually go through and watch what happens when people make these kinds of decisions.

MAZI RAZ: Mark, in addition to simulation rehearsals for preparedness, what other behavioral changes would you advocate for any organization that actually wants to equip itself?

MARK SEGAL: Just a quick comment. Laurel talked about the way the simulation went down. I had a little bit of PTSD going on, and not just for security. That's kind of how things fall apart in operations as well, when something goes bad and you can't transact in a store, or whatever.

I think the notion of a breach playbook-- talking about what happens when, how you muster, what the communications pattern will be. Will we communicate every four hours in this kind of breach, to this level, to that level? Here's what we're going to do. We're going to tell you what we don't know and what we do know. We're going to tell you what we're looking at next, which people we're going to call and when, including third parties.

Understanding when something happens, are you going to use third parties? Who are those third parties? Are contracts already in place? And how does the business owner get involved in those conversations? How do they get involved in making the call about, you know what, we're going to shut down that emergency room, recently as we talk about the [? Humber ?] issue, or we're going to shut down a plant, or whatever the case may be-- parts of the network if you're in a telecom company.

And so it's about being very clear and having those conversations not in a time of crisis, to provide a framework. And it's never-- you don't call it exactly. But at least it gets everybody swimming in the same direction. And I've seen it not just in security; operations is the classic example. You come into areas which don't have great operational governance and cadence, and something goes wrong, and everyone's got a different story of what happened. No one knows who's making decision calls. Do you inform the board, do you not? This is just something you need to practice ahead of time. So it's not just education of people. It's building the process I think Jeff alluded to earlier.

MAZI RAZ: Mark, whose responsibility is it to help the organization go through all of these preparations?

MARK SEGAL: Honestly, I think this is a new, emerging enterprise risk. And it might be the CISO's accountability to ensure that it gets done. But it's a leadership responsibility to understand their role, understand how it proceeds, understand who gets decision rights and why. I don't think it belongs to one individual.

And I think everybody needs to understand their part and their right to make decisions. I think, Jeff, you talked a little bit earlier about shadow IT-- what I call shadow IT. And historically, the issue was you didn't know what you were spending on IT. That was kind of the joke, right?

We spend 4% of your budget [? IT goal ?] [INAUDIBLE]. Where are those Amazon bills coming from? [INAUDIBLE] [? corporate account? ?] And then suddenly, now it just becomes this kind of idea of well, you don't really know where your data is. And who has access to it? What controls are in place?

I think data risk and confidentiality-- training people on what's confidential information and what's not, what they can leave lying around, practicing not leaving your laptop in your car when you go somewhere-- all those things are just part of building a good, robust process in a company around what risk you're willing to take, and what the individual's accountability is in understanding their role.

MAZI RAZ: Jeff, you worked in one of the most critical and important hospitals in Toronto, in Midtown Toronto. And my question is, all of this new behavior that we're proposing to embody in a business-- how do we make sure that it doesn't become disruptive to the daily business? And how do we make sure that it still gets woven into the fabric of daily work?

JEFF CURTIS: Yeah, so this is obviously a process. You're probably never finished doing what you suggested. And so there is a business management concept-- a security framework, privacy framework concept-- called capability maturity, for example. A zero means you've never heard of the issue, don't know the questions, would never know what the answer is-- and who are you, and why are you asking me?-- all the way to a five, where you're measured, managed, optimized, and there's some type of plan-do-check-act improvement over time, which got you to the five presumably.

So I think it's a matter of course then of addressing the areas of most need first. So this means then-- and Mark alluded to this-- is you have to step up. I mean simulation is one thing, and it presumes that you've understood phishing is a problem. It presumes you understand that that is a problem and you can characterize the problem.

But I think you also have to make sure that you've surveyed the business enough to know where these problems are. Phishing is not your only problem, nor are the results of somebody clicking on a phishing email. There are myriad problems. And any good security controls framework can frankly be read by most levels of senior management through operations management without knowing the technical terms.

They should be exposed to the range of not only the issues, but the problems, and then the possible controls. Well, who knew you were sending data from A to B? Have you considered that the data should be encrypted? And then you start a conversation about confidentiality and transmission. And then they say, well, tell me more. And of course, there's a cost to that, because people have to manage encryption keys and all of this kind of good stuff.

But very quickly, the business unit becomes much more sophisticated. As we say on a capability maturity framework, going from a zero to a one, you're 10 times more sophisticated than you were; from a one to a two, 10 times more again.

It's sort of an exponential scale. So you're really scaling up the knowledge base of the organization exponentially. But this can be done. You can start small, but start where it's most important. So that usually means doing some type of asset survey or audit: where's your data, which systems are acting on that data, and who are the actors? Who are the people who should or shouldn't be able to transact here, including externals?

And then you can start to prioritize and make your decisions or put your emphasis where it should be. I mean, it's commonly understood, you cannot spend your way out of this problem. And so whether you've got lots of money or not a lot of money, the point is you need to focus your attention on what's most important. And that's based on where the data is and how at risk is it.

[MUSIC PLAYING]

 

MAZI RAZ: Well, there are two things that I'm actually detecting from this very interesting dialogue here. One, I guess-- and please correct me if I'm wrong-- is that as technologies are becoming more and more complex in the way that we're working, problems are also becoming more complex. So phishing is no longer what it used to be. And we don't know in two years what the new problems are going to be.

So this is a live process. And it's constantly changing, which I suspect relates to what Laurel said at the beginning, which is about a risk mindset, organization having in general a risk mindset. But there's also something else that Jeff is saying, which is about, it doesn't matter how deep your pockets are. So this could happen to anyone.

So cybersecurity is not a problem of big companies alone. It's not just about big banks or big hospitals. It could be anyone's. Let me go to Laurel first about the risk mindset, and then Mark, maybe you can help us navigate this question of, is it an issue for everyone or just for specific types of companies or sectors?

LAUREL AUSTIN: Yeah, so your question is, how do we develop a risk mindset?

MAZI RAZ: Exactly, exactly. Thank you.

LAUREL AUSTIN: And I took a quick peek at the questions, and someone asked, are we going to have more education on this in business schools? And I think we're starting to see more of it, but it's not in all business schools. And it's also relevant to IT. Think about IT: in business schools, traditionally, that's kind of a, well, maybe we don't do that. That's not our thing. But now-- and maybe people aren't so cognizant of it-- it's the backbone of maybe every business. And so it's so important.

So some of the things we're starting to focus on teaching at Ivey-- I teach some courses on decision making and risk management. And it's so important to understand the constraints we have to consider: the organizational constraints, the technical constraints, the human resources constraints, the financial constraints. But it's also the cognitive constraints, the things that lead us to make less than useful decisions and engage in less than useful behaviors.

And so, training people to think with a risk mindset, training people to recognize that we all fall prey to wishful thinking. So there's an initial hypothesis: it's not hackers, I've seen this before. "Oh, that's great. I really want that to be true." And then I tend to look for confirming evidence of that. We call it confirmation bias.

So how do we help people see those kinds of decision traps, those cognitive biases in decision making, and then plan to overcome them? One of the problems we have is short-term versus long-term thinking. We all tend to focus on the problem of today. It's very immediate, and there's a lot of pressure, and so these bigger issues like cybersecurity are back of mind.

I'll get to it tomorrow. And tomorrow there's a new immediate problem that I'll focus on. And it's still back of mind. So how do we help people? So one of the things we want to instill is more long term thinking, long term risk management, that it's not just the short term.

And I think one of the things too is how we prioritize, because we want to prevent things. In risk management, we're often thinking about, how do we prevent, how do we prevent, how do we prevent? But risk management is also about, how do we mitigate when something happens? How do we minimize the losses? OK, something's happened. What do we do to minimize the harm that's being done?

And there's a risk of overreacting as well as underreacting. So there was an agency in the US several years ago that thought they'd been hacked. They thought something was in their system. They actually threw everything away and rebuilt. And they spent, I think it was their entire annual budget doing this, only to later discover there had never been a breach.

So we all have limited resources, and so you want to use them appropriately both for prevention and for mitigation. And so those are some of the things we're starting to sort of build courses around.

MAZI RAZ: Laurel, thank you. Mark, I know that you have spent quite a bit of your professional life in very large organizations. If you were to have a dialogue with a counterpart of yours who's maybe in a smaller organization about cybersecurity, what advice would you have for them?

MARK SEGAL: It's not a problem of big companies. It isn't a problem of small companies. It's a problem of individuals. Think about your own privacy. And if you haven't turned on multi-factor authentication on your Google account, I'd recommend you do it. That's the easiest way I've stopped my wife from having issues with her account. But generally, what I would say is, historically-- if I think back, say, 15 or 20 years-- at the beginning it was people that were just having fun. Smart kids, smart adults, testing and probing. And then it became organized, and governments got involved, and you can pretty much buy anything you want now on the dark web, whether that's a toolkit, or access to data, or someone to do the work. But it's become a business. There's an interesting question around cybersecurity and governments playing a role-- maybe we'll take that later.

But what I would tell you is, phishing is a business now. Ransomware only works if, when you pay the ransom, you get the outcome you expect. Early on, paying the ransom was crazy-- they wouldn't unlock your data, and you'd kind of just wasted your money. And it's not just nation-states. It's individuals that get out of school; they have a call center-- you can call the center and negotiate with a manager for a lower fee. I heard just a couple of days ago that in the US, it might even be a tax write-off to pay ransom. So it's a business now. They're going to target people that are easier to get access to. And I would say anyone who has a network or has data is a target.

The question is how much to spend-- and I think it's a bottomless pit of spend. I've heard numbers anywhere in the tens to 50% of your budget, whatever that budget is relative to revenue. Figure out what the right thing is. If you were looking into this, I would go and get an external opinion, because there are a lot of people smarter than me and smarter than you who do this as a business.

Can you get a couple of opinions on where you sit on a maturity curve in all of the different categories, whether it's your data, your firewalls, your employee training, your playbook? And you get an understanding of where you are. And then try to figure out, OK, where do I want to invest, for what risk-reward? Do I want to reevaluate why I have 600 websites? Because you've got to maintain all of them. Why are they different technologies?

Well, the answer is, well, there's no business case to consolidate them. The business case is, what happens if your business stops for an hour or a month? What happens if your customers refuse to trust you? There are certain cases-- I think Target never fully recovered. For what, two or three years, people just refused to go there and give them a credit card.

I think you really have to ask, how much do I want to spend? It is a bottomless pit, and it is a logarithmic scale-- you can never really get to that last position. And now with spear phishing, targeted emails-- I know we shared a file earlier; a much better way to do this is to work on a file together, versus sending the file around. It gets people out of the habit of doing things that are insecure by nature.

But I would say every company, small, medium, large, you've got to figure out what you want to do here. And you can't completely outsource it, because it's not an IT problem.

MAZI RAZ: And that's a really interesting point-- the concept of outsourcing it. As I'm understanding it, this problem will be here with us for some time. And it might even get more complex as we go forward. But it's also quite likely that the whole sector will find different ways of responding to it, and we're probably going to have more and more cybersecurity professionals and experts and consultants, legislation, and all of this coming out. I think the key message, if I understand it correctly, is that we probably should not wait until there are proper responses from the sector to help us develop security. We need to take steps ourselves as we go forward.

MARK SEGAL: You know, it's interesting about legislation-- one thing I'd also like to add is, we compete a lot in industries with our competitors, and we'd like to keep everything secret. I think one of the biggest trends, which I think is awesome, is this notion of sector-specific people getting together and sharing information about the threats they're seeing. Because a breach at a hospital makes everyone afraid of every hospital. A breach at a telecom makes everyone afraid of every telecom.

And I can tell you, back at the previous company I worked at, the CSOs were all talking to each other every day about what they were seeing and protecting each other. So this notion of, use your network, talk to people-- it's not something to be afraid of.

What happens somewhere is going to happen elsewhere-- [? Laurel, ?] to your point around supply chains, most supply chains supply more than one company in the same industry. And so a breach at one of them is going to impact more than one place.

MAZI RAZ: That's a fantastic point. Sean, Laurel suggested that we actually have quite a bit of dialogue happening on the Q&A. Why don't you perhaps help us figure out a question that is interesting for our panelists to answer?

SEAN: Sure. Well, Mark, you actually referenced the question here in the Q&A from Mike, which talks about how government needs to change its approach to cybercrime and how organizations can start lobbying government to do more. So I'll read the question. Even companies that are doing everything right, or likely doing everything right, like SolarWinds and Microsoft, have been breached.

There's a comparison made between a physical attack from a nation-state or a sub-state actor being equivalent to a cyberattack in the amount of damage it can cause. So is there an onus there on governments? And how can that conversation be moved forward?

MARK SEGAL: I'll start with the first part: it's not always nation-states. It's often just organized crime, for lack of a better term. And sometimes it's just individuals having fun-- some teenager who is smart with computers, bouncing around the dark web, who tried things and figured it out. What I would tell you, though, is governments are responsible for policing areas. But if you don't put locks on your office or your warehouse-- there is an onus on the company to do what used to be called minimum-level security, or basic protections, and to get better and smarter about the risks it faces. If you've got gold in the warehouse, you have a different security company outside than if it's nuts and bolts.

So I think companies shouldn't hope that the government will solve it. It's also moving so quickly, and governments aren't necessarily quick moving. And you can see some of that noise starting in the US. In Canada, I think I read an article just recently that for the government itself, there's a hierarchy of what people go after first-- government infrastructure, health care probably, and then business disruption, in terms of where things are targeted. I would say that the government needs help as well from its partners and other people, from infrastructure providers. I can tell you, as a telecom provider, there's lots of stuff that we would do under confidentiality with the government to protect the government and other people.

So for example, if we provided phones to certain government officials, they weren't recorded anywhere in our network. So if someone breached us, they couldn't find the location of somebody all the time. So there are things that business can do working with the government. The government is going to take a long time to solve this problem for you. And they're going to solve it with legislation.

So you're still going to have to deal with this, is kind of my gut feeling. Now, that doesn't mean they can't be active participants in targeting people remotely, in securing the borders. We're in a global technology environment. You're never going to put the genie back in the bottle.

MAZI RAZ: Thank you. Jeff, do you also have a response to this?

JEFF CURTIS: Just quickly. My experience has been, obviously, that government is a full partner in my domain. It's very helpful when they identify something called a critical infrastructure risk. Whether or not you believe your business is part of a critical infrastructure or a critical infrastructure risk, this is something that the federal or provincial governments have their eye on all the time anyway. So that dialogue should be expressed in those kinds of terms, which again is sort of telling that story of playing bad movies for people, unfortunately.

You know, the boat could sink, but that's definitely the role of government. They manage that discussion, because that's-- well, because that's their job. And we're going through that right now for example with COVID.

So that type of framing is important. I think, obviously, statutes and regulations are the government's purview as well. I believe it would be a good idea to agree on some type of information security or cybersecurity laws or regulations. You will notice there are privacy laws, but there's not much said within those laws about security, if you will.

And I think that can be improved. And then finally, interoperability and the standards therein-- certainly, again, everybody's talking to everybody these days. No one's an island unto themselves. And certainly in health care, it's obvious that unless we move to a completely commercial model, which I don't think is going to happen any time soon here, we have to work with government to make sure that they understand what's happening on the ground, and that we understand the vision of the leaders who are elected to have a vision about the future of health care. That's at the very broadest stroke.

MAZI RAZ: Jeff, that's really good advice. And recognizing that we are almost out of time, I want to hear from Laurel and Mark as well, in very short form, if you don't mind sharing practical advice that you may have for our audience here on the call. Laurel, let's start with you.

LAUREL AUSTIN: Well, listening to Jeff-- I don't know if this is advice, but something we need to recognize is that every business is at risk. And the big companies-- the big telecoms, the big hospitals-- have a lot of money and time and more resources to think about it. But every small to medium-sized business is also part of a supply chain or part of an infrastructure chain. And they don't have the money and resources.

And I think we need to be giving more thought to that level, because it's reaching down to that level more and more. And I do think that's a key sector-- I don't know if "sector" is the word-- but a type of business that we need to be thinking about, and thinking about how to help and bring along.

MAZI RAZ: That's a really good point. Mark, last thoughts.

MARK SEGAL: Quickly, honestly, I would say go get an assessment-- make sure there's an assessment of where you are, at least to help you understand where you want to invest and what areas you can improve. You can improve everywhere, but some areas will have a higher rate of return. And then have an honest conversation, not in IT, but amongst the leadership of the company, around what risk you're willing to take.

MAZI RAZ: Mark, Laurel, Jeff, thank you so very much. This was quite illuminating. I personally learned quite a bit from this. We shall continue this conversation as [? we ?] [? come. ?]

[MUSIC PLAYING]

NARRATOR: Thank you for tuning in and listening to this episode. We'd like to extend further thanks to our guests, Laurel, Mark, and Jeff, for taking the time to share their insights and expertise with us. Additionally, I want to thank Melissa Walsh, our Associate Director of Alumni Relations and Corporate Development, for all her tireless efforts behind the scenes to bring these learning events and episodes to you. If you liked this episode, make sure to subscribe for similar content in the future. If you want to learn more about cybersecurity, either for yourself as an individual or for your organization, we've provided resources and links in a blog post on our website. You can also follow us on LinkedIn, Twitter, or Instagram using the handle @IveyAcademy to view our upcoming events, services, and programs. Thanks again for listening. We look forward to having you with us for the next episode.

[MUSIC PLAYING]

Associated Faculty

Laurel Austin

Professor

Mazi Raz

Professor