Distilling Complexity Through Design Thinking

In this episode of Cybersecurity (Marketing) Unplugged, Barnier also discusses:

  • How to apply critical and design thinking to cybersecurity;
  • How cyber risk differs from other business continuity risks;
  • The cybersecurity initiatives coming from Washington and predictions on their effectiveness.

Brian Barnier is the director of analytics at ValueBridge Advisors. Barnier is also a professor of operations, finance and economics at the graduate level across several U.S. universities; he has been a guest lecturer in Russia and Mexico and served on the faculty of the Wharton/ABA Stonier Graduate School of Banking. Additionally, Barnier has written a best-selling business book entitled The Operational Risk Handbook for Financial Companies and has been a contributing author to several other books on risk management. Prior to his work at ValueBridge Advisors, Brian led teams to nine U.S. patents in technology at AT&T, Nokia and IBM. In 2021, Barnier earned the coveted Joseph J. Wasserman Award, presented by ISACA for outstanding achievement in information technology risk, governance and security.

Barnier, who is developing a course on critical thinking and design thinking in cybersecurity for a cybersecurity education platform, describes his coursework and the importance of critical thinking today:

We start with the notion of intellectual curiosity, as people have to be thinking, and thinking differently, given all the pressures that are going on in the world today. And from intellectual curiosity, we go into critical thinking in general … That all rolls into design thinking. And design thinking, in some ways, you can think of as a lens that provides a way to understand and to simplify the complexity of all these systems in which we work.

Full Transcript

This episode has been automatically transcribed by AI; please excuse any typos or grammatical errors.


Steve King 00:13
Good day, everyone. I’m Steve King, the managing director at CyberTheory. Today’s episode is going to focus on critical and design thinking as they should be applied in cybersecurity systems design. Joining me today is Brian Barnier, the distinguished director of analytics at ValueBridge Advisors and a professor of operations, finance and economics at the graduate level across several U.S. universities, who has also been a guest lecturer in Russia and Mexico and has served on the faculty of the Wharton/ABA Stonier Graduate School of Banking. In addition to all of that, Brian has written a business book, a best seller, entitled The Operational Risk Handbook for Financial Companies, and has been a contributing author to several other books on risk management. In prior lives, Brian led teams to nine U.S. patents in technology at AT&T, Nokia and IBM. Finally, Brian earned the coveted Joseph J. Wasserman Award this year from ISACA for outstanding achievement in information technology risk, governance and security, and by doing so joins other notable folks like past winners Phil Venables, Ron Ross, Gene Spafford and Rob Clyde, a couple of whom we’ve had on our program before. So welcome, Brian. I’m glad you could join me today.

Brian Barnier 01:48
Hey, Steve, absolutely delighted to be here. This is a fantastic topic. And you know, never more needed than now.

Steve King 01:55
You’re right. So before we jump into risk and cybersecurity, I want to give us a shameless plug about you designing and teaching your custom class in critical and design thinking for cybersecurity on our soon-to-be-launched platform, which will be the best platform for cybersecurity education in the world. Please tell us about the coursework and why this topic is so important right now.

Brian Barnier 02:25
Steve, it is indeed so important. The coursework is a build-up. We start with the notion of intellectual curiosity, as people have to be thinking, and thinking differently, given all the pressures that are going on in the world today. And from intellectual curiosity, we go into critical thinking in general, now asking key questions, and not just questions that we’re dreaming up, but questions that have been there in the world for millennia. These are Socratic questions that continue through and permeate all walks of life today. And then from there, we go into systems thinking to take a look at how these adaptive, complex, dynamic, integrative systems are actually functioning. And cybersecurity is certainly one of those, but we’re surrounded by these complex systems, just, you know, our planet Earth and beyond. So we take a look at that.

Brian Barnier 03:22
The next thing we do is we bring in design. We start with design from sort of a pure perspective that we might get in aesthetics, fashion, style, but then we realize that there is this over-a-century and, again, millennia-long use where design interacts with technology and systems engineering in order to create things, whether that’s way back in time, like the pyramids, or whether that is the latest and greatest features that we can interact with online: non-fungible tokens, adaptive AIs, whatever it is. But we need to understand how those systems work, and how to understand systems, to then carry us forward. That all rolls into: now we can do design thinking. And design thinking, in some ways, you can think of as sort of an easy button or a lens, in that it provides a way to understand and to simplify the complexity of all these systems in which we work. So that’s sort of the arc of the program. And it’s not just the technical stuff; it then goes into how to scale this in an organization. Because so many people try to come in with a great idea. They’re motivated, they’re fired up. They’re just waiting and ready to make change. But then they run into the great clash of the machine that’s around them. So we want to empower them with some of the best of organizational change and transformation as well. And to answer your question about why [it’s so important], all we have to do is read the headlines: breaches and spend are up, and yet forensic analysis shows us that the root cause of these breaches is cybersecurity math and method. We are our own worst enemy. So it’s not until we fix this that we’re gonna get this problem under control.

Steve King 05:08
Yeah, no kidding. And it has been missing from our public education systems, both at the high school and college level, for a long time. It’s important not just in cybersecurity, but in all kinds of innovation, and technology innovation as well. You’ve historically lectured on issues in operational risk management. How different is cyber risk compared to all of the other organizational or business risks that we face today? And what can professionals in cybersecurity do differently or better than we have in the past?

Brian Barnier 05:47
Yeah, so great question. You know, the differentiator in cyber, when we look at this from a sort of risk perspective, is that it’s different in the scale of human actors seeking to do harm, short of physical combat. Because we have all kinds of other hazards that are extremely destructive; you just go through the list of wildfires, hurricanes, floods, whatever it is, that are out there, that are physically destructive to property and especially to people. So that’s not new. And then, of course, we’ve got, you know, the whole sad history of the human race of physical combat, killing people and doing property damage. What’s coming in is the technology piece. But again, that’s not that new, either, right? As soon as you had technology, and go back to the original Greek word there for a minute, you either wanted to destroy with it or figure out how to destroy it. And so all we’re saying is that, you know, new versions of technology are simply being weaponized or attacked, in the same way that you had, you know, the proverbial cavemen going at each other with their stone hammers. For me, you know, I’ve been interacting with this for a long time, going back to launching IBM’s first secure messaging product, where we were focused not just on the financial sector; we were looking at SCADA and industrial operations and securing that. And we’re sitting back and talking about how cyber threats can be used to take down, interestingly enough, 20 years ago, we were focused on taking down pharmaceutical manufacturing facilities. And here we are, in the days of COVID. We were talking about taking down water systems, right, which has happened recently. And so all these things aren’t new. I can remember receiving an award in Washington, DC, seven years ago, talking about the attacks on water systems. So this is not new. Remember, the first ransomware was 1989, right? This is not new.
And then the whole notion in the military of disrupting command and control is not new. And that was taken to a technological level in sort of the first Gulf War, in a way that had not really been seen at scale. So this stuff is not new. And that’s where you come back to: how did we ever lose critical thinking in our public schools, so that people cannot piece together history to see how this unfolds? Great point, Steve.

Steve King 08:20
And the OT world seems to me to be, you know, kind of an uncooperative partner in the attempt to better prepare our security defenses for inbound cyber attacks. And I wonder, you know, what it’s going to take to get the folks that are responsible for ICS and SCADA systems to sort of stand up and say, “Hey, you know what, maybe we can shut down our production line for eight hours to install some protective shield or an air-gapping alternative that actually will work,” and so forth. But it’s a huge risk now, and it will be an imminent risk; I’m sure in the next several months we will see many more Colonial attacks. You spent a lot of your background in financial and FinTech, and folks, when they look at the different industry sectors, are sometimes confused about how they differ in terms of risk and cyber defense. So tell us: in your estimation, in what ways do financial and FinTech businesses differ from manufacturers or healthcare providers?

Brian Barnier 09:31
Yeah, I mean, when we go through and look at your great question here, it really breaks down to, first, the nature of the system; second, attacker motivations; and third, human errors that come from the humans that are working within that system, whether that’s healthcare, FinTech or, as you just mentioned a moment ago, the OT or operational technology world. And so when we map these out, when we do scenarios, when we do wargaming, those are the elements that we’re working with. The monster gap is that cybersecurity risk management math and method is so deficient compared to other disciplines. And that’s a really interesting thing, because when you look at somebody that’s in the OT world, for example, that’s focused on uptime, which you just mentioned, and then they come in and they see the math and methods that are being used by cybersecurity people, in a lot of ways they say this is amateur hour. I mean, just think about somebody that’s in a big IT company, say an online retailer, that’s concerned with system uptime. And they look at the cybersecurity people and go, “Boy, you’re doing none of the math, you’re doing none of the systems modeling, you’re doing none of the analysis, you’re doing none of the holistic aggregation aspects that we’re doing. And you’re coming in here, and you’re using your amateur-hour toolkit to try to do cybersecurity.” And this is the gap that we’ve got there. And when you look across the industries, another big piece is what’s reinforcing the gap. And a lot of times, it’s a compliance environment. Take healthcare; I’m just picking on them. But how many times do you see healthcare where the person in charge of risk is an attorney, because they’ve got to comply with HIPAA? Right? Or any regulated industry; you can pick on financial services or whatever.
And so they’re not really doing critical thinking, systems thinking, design thinking to apply to solve a problem. They’re sitting back and saying, “What do we do to have our paperwork in order?” And that’s really a scary thing.

Steve King 11:40
Yeah, no kidding. And to your point, you know, about 114 years ago, I was running IT for a major U.S. manufacturing company that had both discrete and process manufacturing facilities. And we also manufactured computers, oddly enough, and so the folks on the engineering side used to tell us, “Listen, just give us the key to the data center; you actually don’t have to do anything, we’ve got it.” And that was their view, and rightly so, by the way; that was their view of the competency of my IT teams and organizations. I mean, there’s a huge difference between, you know, back in those days and even today, because we don’t have the kind of engineering rigor on the IT side that we have historically had on the electrical and mechanical side, and we don’t understand how to do structured development in information technology; the guys that are building products, that are designing chips and boards, those guys do understand that. The same thing is true, I think, across all those sectors. Healthcare, oddly, you know, spends a tremendous amount of money on what you said, compliance, legal, yada yada, and yet they are also among the most heavily hit. I think their response typically is, you know, “Hey, guys, we’re saving lives here; we can’t be bothered with this stuff.” Well, going back to the OT issue, we have the internet of medical devices, which are more and more dependent upon these remote internet connections and software connections and so forth that are supposedly driving these things. So I can imagine, today, I could have somebody’s life in my hands and ask for 100 million bucks. And what are they going to do?

Brian Barnier 13:36
Yeah, I mean, you are so, so very right, Steve. I mean, as a person who has worked in cybersecurity for several decades, I’m constantly reminded of that, because my brother has a medical implant. You look at the sophistication of the engineering in the implant, and then you look at the lack of sophistication in thinking about how all that connects together from a cybersecurity perspective. You know, I’ve got some sympathy for the engineers designing those devices, because nobody’s ever approached them in a systems context to, you know, have the light bulb go on. They get hit with all this compliance stuff that they’re supposed to do, and they say, “How are we supposed to engineer for a legal document? What is our objective, again?” You know, in the course, I quote people like W. Edwards Deming and Russell Ackoff, great systems thinkers, over and over. And they keep emphasizing this point that you cannot tell what a system does, or the intent of the system, by analyzing its piece parts. The famous example from Ackoff is: if you were to look at a bunch of cars in the UK and a bunch of cars in the US, or, you know, right-hand- and left-hand-drive cars wherever you are in the world, will anything about analyzing those cars tell you why one has the steering wheel on the right and the other has the steering wheel on the left? No. And again, it goes back to Aristotle and his list of causes, or explanatory factors. We’re not communicating with people. And that’s why, when we work through these outcomes accelerator workshops that we’re discussing in this curriculum with you, we see these just amazing results. I mean, my metric has always been six months of work in six weeks, and cut costs and get better outcomes. Because it’s been proven and practical in other areas for decades and centuries; this stuff works. Yes, I completely agree, Steve, I completely agree.

Steve King 15:39
You know, we’ve seen this recent surge of cybersecurity attacks on businesses, on these pipelines, on food distribution networks, and so forth and so on. We’ve had a sort of a shift in administration, in government administration, here in the United States in the last six months. What is your take on the initiatives that you see coming out of Washington to address some of these issues? Do you feel like we’re gonna get some traction out of this? Or is it just more lip service to problems that nobody’s addressed up until now?

Brian Barnier 16:19
You know, I’m cautiously optimistic. The problem is, every time you see some specifics come out, they’re not going in the right direction, and they’re reinforcing the complications. You know, for example, zero trust is sort of being compartmentalized: there’s zero trust applied in the grand systems context, and then there’s zero trust as just another piece part. And then we look at why we have breaches, and this compliance problem, and so much of the answer is, “We’re Washington, we know how to do compliance, so we’ll do more compliance.” And that just locks people in. I mean, when you look at it, again, you’ve got to do the forensic analysis on these things. And you’ve got to see, when you’re doing proper forensic analysis like in the systems world, with Ishikawa diagrams, fishbone diagrams, that kind of stuff, where you’re actually understanding causes, like after the space shuttle disasters, that kind of thing: when you break it down, it’s the math and the method. And a lot of that method is driven by compliance. And the compliance method is largely driven by accounting and auditing, financial reporting auditing in specific. And that came out of SOX, the Sarbanes-Oxley Act, which was adapted from the Foreign Corrupt Practices Act; you know, the history of accounting goes back in time to Egyptian grain accounting, according to accounting historians. That whole thing says: we’re going to take the method that we use to audit an airplane pilot’s hotel bill and taxi bill, or whatever it is, and then apply it to flying an airplane in a storm, expecting the passenger airliner to arrive safely. That is just a monster category error.
And to the extent that Washington gives us more financial reporting approaches to solving a complex, dynamic, integrated systems problem, we’re going to be spending more money, and we’re going to be just opening the door to more breaches. Because that is just unsustainable; that is a category error.

Steve King 18:40
It’s interesting, right? We have adversaries in various countries who we’re accusing of attacking us, China and Russia, of course. And at the same time, there are lots of voices in Washington that want to increase sanctions on China and want to impose trade restrictions on China, in many cases for very good reasons. And I guess this becomes existential, I suppose, if you stand back far enough. But down in the physical moment of reality, you’ve got just as many companies, most of them in high tech, most of them in Silicon Valley, that are pushing Congress and our government very, very hard to undo those sanctions and release those trade restrictions so that we can do more business with China. And yet, you know, the FCC has declared that a company like Huawei can’t be relied upon to provide, you know, safe telecommunications equipment to the United States. And so it’s an interesting contrast, right? I mean, do we err on the side of safety, or do we err on the side of economic progress, I guess, for us?

Brian Barnier 20:01
I mean, you’re making great points. And one of the things that we should give kudos here: you mentioned Ron Ross already, and his NIST 800-160, systems engineering for trustworthy secure systems, is an example of good systems thinking; that’s the right direction to go. But that gets caught in the middle of the geopolitics you just mentioned. And if we go back to sort of the history, we’ve always had this issue of technology and geopolitics. I mean, I’m sitting here in New England; you look at Boston, you look at New York, and, you know, go back 200 years: they’re trying to find surreptitious ways to get technology from their former colonial power by hiring the employees to teach them how to do these new manufacturing things that are starting to come along. Especially when you get to, like, 1860 or thereabouts, and we had the U.S. Civil War, and we needed to increase manufacturing capability. And so here’s an issue where the United States was on the beggar side of that equation. But then you go back and look at the Levant, and you look at, you know, the whole history there, and what today we think of as the Middle East, and ancient Persia and Greece and the Medo-Persian Empire. And these are the same kinds of geopolitical issues that we saw then, except now we’re playing them out with Huawei systems and telecommunications, instead of the methods and technologies that were there. And the concern is that we get the right kind of integrated view of these things in order to solve the problem that you just mentioned. But it comes back to that critical thinking, systems thinking and design thinking, so that people can see how the whole works. And again, that’s why kudos to Ron and his 800-160 at NIST.

Steve King 22:04
Yeah, sure. You know, and God bless Ron; for every one of him there are 100 others …

Brian Barnier 22:11
Oh, yeah, maybe 1000.

Steve King 22:13
Yeah, right. Right. So, final question. I want to get back to you as a risk professional, which is, among several other things, what you indeed are. I’m curious as to your view on the recent surge of these attacks. And again, if you look back over the last six months, it’s like twice or three times the amount of activity we had in the second half of 2020, and I assume we’re going to have twice as much in the second half of 2021 as well. What is the impact on business generally, and specifically, how does the recent critical infrastructure storm that we talked about earlier inform risk professionals who are dependent upon natural resources to power their businesses? Like, we know, for example, that Biden’s executive order about getting to, you know, electric cars by 2035 is all well and good, except guess who owns all of the battery technology and all of the natural resources for battery technology in the world? China. Right. So what are the risk discussions that are going on in businesses now? And how are they thinking about accommodating, you know, a real big new business continuity threat?

Brian Barnier 23:33
I would like to say that there’s all this robust conversation going on. If there is, you know, in all the circles in which I participate, I’m not seeing it. The business continuity community, you know, where I’m fairly active, they’re not there; they’re dealing with the individual critical infrastructure disruption threats. So they’re saying, you know, what happens if our assembly line goes down? Or, you know, what happens if we lose water access, or, you know, any of those things, electrical utilities. But we come quickly into this issue of the methods problems, right? Because the stuff we’ve been using, or, say, NERC, the North American Electric Reliability Corporation, is using for the electric utility grid is not the same kind of method being used for cybersecurity, which is bringing in way too much of these compliance and financial reporting audit aspects. So you’ve got an issue there. Now, when you jump to the other end of the spectrum, around the geopolitics, that’s where you go back to the piece before, and you’re getting a lot more geopolitical analysis that’s being wrapped around technology. I mean, I can think of a couple firms that are doing more of that. Some are scrambling to bring in a sufficient amount of the technology aspect in order to understand how these are going, because they haven’t traditionally had to do that in geopolitical analysis. At least not going back to, you know, World War II, where we were very concerned about resource limitations and constraints: can we get our hands on the raw materials we need to make war materiel? And you had the War Production Board, for example, that Franklin Roosevelt created. So that was a category of issue. Now they’ve got to sort of work their way to both recovering that knowledge and then applying it to the geopolitical situation that we’re in.
And, again, it’s critical thinking and systems thinking, because all these piece parts are not being connected enough to see the whole implication. That’s where the ball is hitting the ground in this conversation, and that is, to your point, very much a concern, and a threat, when we’re bringing this together. Because you’re always trying to increase economic entanglement as a way to have peace, right? But at the same time, whether it’s the Trump administration, the Biden administration or an administration in some other country, you’re faced with: what’s my trigger? What are the lines that I’m going across? What are the trip wires? If you take this island in the Pacific, will I go to war over that or not? And it’s salami tactics, as some people have called it in geopolitical history. You slice the salami, right? One piece: is that enough to go to war? Another piece: is that enough to go to war? They’re never going to confront completely; it’s always salami tactics. That’s the big geopolitical threat thing: how do you articulate what your threat lines are? It’s a game of articulation. “This is really a threat line; do not go there.” How do we make that compelling? And how do we illustrate that we are willing to go to war? If you take down X number of hospitals, if you kill so many people in hospitals, we will go to armed combat over that. That is the issue. And Graham Allison and Philip Zelikow’s book on this is the one that I would recommend. Allison wrote it after the Cuban Missile Crisis, and then they did a 20-year anniversary version with Philip Zelikow. It’s a very good book on that topic.

Steve King 26:10
Yes, well, understanding history is always helpful.

Brian Barnier 27:27
You got it, Steve, you got it, positively.

Steve King 27:31
That’s a free nugget of knowledge, right? Brian, this has been fantastic. I’m afraid we’re out of time; I’m over time, actually. But, you know, these are topics that we could talk about for days and weeks and still not scratch the surface; there’s so much to unpack here, and so much that affects our daily lives. And we’re now seeing that cybersecurity is, like, kind of a wake-up call to, you know, average walk-around citizens who had to stand in gas lines for six hours to fill up a plastic bag full of– garbage bag full of gasoline. I mean, my God, now what? So in a way, that’s terrific; I’m glad that folks, regular folks, are paying attention here. So I want to thank you, Brian, again, for taking time out of your schedule to join me in what I hope was an interesting exchange. And we need to schedule something in, you know, the October-ish timeframe to come back and talk some more about what’s happened between now and then.

Brian Barnier 28:32
Yeah, I’m delighted to be here, Steve; it is a privilege. You are just such a great thinker and writer in the way that you communicate this stuff to people. I hope I’ve done justice to your literary skill. These are truly topics that deserve critical thinking, as you open up this conversation. So thank you. Delighted to be here, and I look forward to more.

Steve King 28:55
Very kind words, Brian; I’ll put a check in the mail today for 100 bucks [laughter]. Thank you to our listeners for joining us in another one of CyberTheory’s unplugged reviews of that complex, freaky and scary world of cybersecurity technology and the new digital reality in which we all find ourselves. Until next time, I’m your host, Steve King, signing out.