Putting The Brakes on Physical Data

David Kruger is the VP of strategy and co-founder of Absio, the fourth software company he has founded or co-founded. Kruger is a leader in the military-grade data hardening space and a certified GDPR practitioner, knowledgeable in all aspects of data privacy, with highly specialized skills in encryption for regulatory compliance, and he has served as a client liaison to both OSHA and the EPA. Kruger has published and contributed to numerous white papers and articles about safety, cybersecurity and data privacy, and is an official member of the Forbes Technology Council.

Kruger uses his knowledge and experience to weigh in on data hardening and its impact on identity and access management, data protection, private messaging, secure file sharing and digital rights management.

When we talk about thinking of your data as a car that you have controls on, we’re talking about being able to, in effect, put steering and brakes on a physical data object. That’s part of what we do when we cryptographically bind these instructions to data objects. Think of them as the analog of the steering wheel and the brakes: we want to go in this direction, but not that direction; we want that data to be able to stop. We think of encryption as the keys to the car that determine who gets in and who can drive it.

 

In this episode of Cybersecurity Unplugged, Kruger discusses:

  • Military intelligence and Absio: creating a new tactical battlefield communication system;
  • The concept of data ownership, and the difference between data and information;
  • The future of the software engineering community and treating software as a manufacturing process.

This episode has been automatically transcribed by AI; please excuse any typos or grammatical errors.

 

Steve King  00:13

Good day everyone, I’m Steve King, the managing director of CyberTheory. Today’s episode is going to focus on data hardening and its impact on identity and access management, data protection, private messaging, secure file sharing and digital rights management. Joining me today is Dave Kruger, the VP of strategy and a co-founder of Absio, a leader in the military-grade data hardening space. Absio is the fourth software company David has founded or co-founded. He’s a certified GDPR practitioner, knowledgeable in all aspects of data privacy, with highly specialized skills in encryption for regulatory compliance, and he has served as a client liaison to both OSHA and the EPA. Dave has also published and contributed to numerous white papers and articles about safety, cybersecurity and data privacy, and is an official member of the Forbes Technology Council. So welcome, David. I’m glad you could join me today.

 

David Kruger  01:22

Thank you for having me, Steve.

 

Steve King  01:24

Sure. So, Absio. You got your start in military intelligence, and to commercialize the product you had to move it out beyond the fortress, if you will. Can you tell our listeners a little bit about that story?

 

David Kruger  01:38

We were tasked by US Army intelligence, which was trying to create a new tactical battlefield communication system, because we knew that the technology we had had been thoroughly penetrated. And we had kind of a tough assignment with them: they needed a system where we had to basically assume compromise. We had coalition partners that we couldn’t adequately vet, so we couldn’t know who was on the inside with the normal degree of certainty. And they said, well, assume that credentials are compromised, people are inside the networks, devices have been stolen. Assume all that’s true; you still have to come up with a way to keep the data secure and under our control. And again, not all this data was inside the US Army enclave; they had to share data with coalition computer systems and so forth. So, long story short, we invented a new technology called software-defined distributed key cryptography that enabled us to do that job. But we finished that up as the war in the Middle East was beginning to ramp down a little bit, and unfortunately that whole project, which was ready to go to field, was canceled. We ran up against sequestration. So we took what we had learned from this military exercise and repackaged everything into basically two things: a multi-language software development kit, so the technology could be embedded in any sort of new or existing software application, and then a Python-based broker application to handle PKI, public key infrastructure, and to back up and transfer encrypted content between devices, across platforms, and so forth. So that’s a little bit of the backstory.

 

Steve King  03:28

And you guys have been in business for how long?

 

David Kruger  03:31

Yeah, we actually started out officially under another company name, but we started this on a full-time basis in 2009.

 

Steve King  03:41

There are several companies that I’m aware of that claim to provide mechanisms for self-protection of data. How do you approach the problem, and what is it that you do differently than these other companies?

 

David Kruger  03:56

Well, there are a lot of companies that do encryption, and obviously the way to protect data, generally, is to encrypt it, right? There’s nothing new about that; encryption has been around literally for millennia, in one form or another. But when people talk about self-protecting data, typically they’re talking about data that gets encrypted very quickly after creation, stored encrypted, and transmitted encrypted by default. We do that, and lots of other people can do that kind of thing; that’s not particularly exotic nowadays. Unfortunately, it’s not something people do often enough, but the technology is out there. Where we’re a little bit different is that we don’t only make the data self-protecting, we make it self-directing, by cryptographically binding authentications and usage rules to whatever data we’re encrypting. So we’re not only protecting the data when it’s at rest or in motion; when it’s decrypted for use, that cryptographically bound set of instructions, so to speak, enables the user to say who, on what, where, when, for how long, and for what purposes their decrypted data can be used. And I think that’s where we differ from some of the other folks in this field.
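The core idea Kruger describes, usage rules that travel with the ciphertext and cannot be silently altered, can be sketched in a few lines of Python. This is a simplified illustration using a standard-library HMAC, not Absio’s actual scheme; the placeholder ciphertext, field names, and policy keys are all invented for the example:

```python
import hmac
import hashlib
import json
import os

def bind(key: bytes, ciphertext: bytes, policy: dict) -> dict:
    """Package ciphertext together with its usage rules, bound by an
    HMAC tag so the rules can't be changed without detection."""
    policy_bytes = json.dumps(policy, sort_keys=True).encode()
    tag = hmac.new(key, ciphertext + policy_bytes, hashlib.sha256).hexdigest()
    return {"ciphertext": ciphertext, "policy": policy, "tag": tag}

def verify(key: bytes, obj: dict) -> bool:
    """Recompute the tag; any edit to ciphertext or policy breaks it."""
    policy_bytes = json.dumps(obj["policy"], sort_keys=True).encode()
    expected = hmac.new(key, obj["ciphertext"] + policy_bytes,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, obj["tag"])

key = os.urandom(32)
obj = bind(key, b"<encrypted payload>",
           {"who": ["alice", "bob"], "expires": "2030-01-01", "purpose": "read"})
assert verify(key, obj)                 # intact object checks out
obj["policy"]["who"].append("mallory")  # attacker edits the rules...
assert not verify(key, obj)             # ...and the binding breaks
```

A real system would also encrypt the payload (for example with an AEAD cipher, which binds associated data as part of the cipher mode itself), but the sketch shows the principle: the rules and the data are one cryptographic unit.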

 

Steve King  05:17

So you retain those capabilities after the handoff and the decryption. So that data, if you decide you want to kill it, or withdraw it, or...

 

David Kruger  05:30

You can revoke it, you can modify it. So let’s say that I’m sharing it with Alice and Bob, and I need to also share it with Steve. I need to modify the instructions, and we can modify the instruction set, the access controls, even though that data is on somebody else’s device. As long as that software utilizes the actual technology, then we can control it throughout its lifecycle. Our view of the world, stated simply, is that if you don’t solve the problem of how to control decrypted data at the same time that you protect the data in storage and in motion, then you really don’t have a complete solution. So our view of the world is you have to do both.
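The revoke-and-amend lifecycle Kruger describes can be sketched as a toy key broker: the data stays encrypted everywhere, and the broker releases the decryption key only while a grant is unrevoked, unexpired, and names the requester. All class and method names here are illustrative, not Absio’s API:

```python
import datetime as dt

class KeyBroker:
    """Toy broker: releases a decryption key only while the object's
    grant is unrevoked, unexpired, and the requester is authorized."""

    def __init__(self):
        self._grants = {}  # object_id -> {"key", "who", "expires"}

    def grant(self, object_id, key, who, expires):
        self._grants[object_id] = {"key": key, "who": set(who),
                                   "expires": expires}

    def amend(self, object_id, who):
        # e.g. add Steve to an object already shared with Alice and Bob
        self._grants[object_id]["who"] = set(who)

    def revoke(self, object_id):
        # data already on other devices stays ciphertext: no key, no use
        self._grants.pop(object_id, None)

    def request_key(self, object_id, requester, now=None):
        now = now or dt.datetime.now(dt.timezone.utc)
        g = self._grants.get(object_id)
        if g is None or now >= g["expires"] or requester not in g["who"]:
            return None
        return g["key"]

broker = KeyBroker()
exp = dt.datetime(2030, 1, 1, tzinfo=dt.timezone.utc)
now = dt.datetime(2025, 1, 1, tzinfo=dt.timezone.utc)
broker.grant("doc-1", b"secret-key", ["alice", "bob"], exp)
assert broker.request_key("doc-1", "alice", now) == b"secret-key"
assert broker.request_key("doc-1", "steve", now) is None
broker.amend("doc-1", ["alice", "bob", "steve"])   # share with Steve later
assert broker.request_key("doc-1", "steve", now) == b"secret-key"
broker.revoke("doc-1")
assert broker.request_key("doc-1", "alice", now) is None
```

The sketch assumes, as Kruger notes, that the software on the receiving device actually honors the broker; that cooperation between client software and key service is what makes control "throughout the lifecycle" possible.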

 

Steve King  06:16

Yeah, I guess if the data is useless, that’s one way to disincentivize folks from stealing data.

 

David Kruger  06:28

Yeah, yeah. We liken it to turning the gold into lead with encryption. But to use it, you’ve still got to turn it back into gold, and you’ve still got to be able to control that use. Yeah.

 

Steve King  06:39

So talk to us a bit about data ownership. You use an automotive analogy to describe data ownership, which is good, because most folks can’t grasp the idea of data ownership in its entirety. Can you share that thinking a little bit with our audience?

 

David Kruger  06:58

A little bit. You know, part of the problem, and it’s just a general conceptual problem that we have, is that we conflate the concepts of information and data on a computer as being the same thing. And they are not the same thing. Information is human-usable. Somebody’s listening to this podcast, right? The information is conveyed by the sound waves of our voices going back and forth. That’s the physical medium, but the information happens in our minds. Data is physical. It’s an impression, a string of binary values, zeros and ones, that’s impressed on transistors, or on radio waves, or light waves, or on pits on a DVD. But it’s all physics, right? It’s all physical, and anything that you can manufacture, which is what a software application does (it manufactures physical data objects and manages them), can be controlled. And we know this is true because you’ve got things like data formats, right? A docx file or an Excel file or a PDF. Those are all formal, rigorous structures that are controlled in the domain of physics. So when we talk about thinking of your data as a car that you have controls on, we’re talking about being able to, in effect, put steering and brakes on a physical data object. And that’s part of what we do when we cryptographically bind these instructions to data objects. Think of them as the analog of the steering wheel and the brakes: we want to go in this direction, but not that direction; we want that data to be able to stop. We think of encryption as the keys to the car that determine who gets in and who can drive it. But then you also have to have the controls once the driver is in place, so to speak. So that’s the whole analogy there.
And again, it’s not a fanciful analogy, because you are actually talking about physical things. You can add controls to them if you need to control them.

 

Steve King  09:13

Sure. That makes a lot of sense. In the process of looking at the whole software development cycle, some folks, I think, often mistake software engineering as a discipline that’s kind of up there with mechanical engineering or electrical engineering or what have you. But I think you’ve pointed out several times that our software engineering disciplines are comparatively new compared with mechanical or civil or electrical engineering, and as such are immature in terms of the art and science of software engineering. I know firsthand from my years in the business that we’re not very good at it. Can you expand on that whole idea and the impact that you think it has on cybersecurity generally?

 

David Kruger  10:08

Oh, I think it lies at the heart of the problem of cybersecurity. In other engineering disciplines (my background is in safety engineering, not in IT) one of the things that we learned from day one is that we had a duty of care; that’s the legal term. And you see that across engineering disciplines, in the legal profession, in the medical profession: a duty of care. You have to make sure that what you’re designing or what you’re making is not going to do harm to other people. And that duty of care, to a large extent, does not exist in the software engineering trade. And you can tell that every day, because the way that software makes data and allows that data to be used, or maybe better put, the way it fails to keep it from being misused, results in a great deal of harm. That’s where the professionalization of software engineering as a trade is different: we don’t have the explicit duty of care. And so you regularly go into software development (again, I’ve been doing software development and leading software development since the late 80s, so this is not theory), and we simply didn’t go in and start off with, okay, job number one is safety: how do we build safety thinking, good professional engineering, into the design process and into the products that our designs make? That thinking, unfortunately, just doesn’t exist nearly to the level that it should in the professional software development world.

 

Steve King  11:47

Yeah, indeed. And there’s a certain esprit de corps in the software engineering community that takes its leadership principles from outlier guidance: we’re kind of mavericks, we do our own thing, we like to be left alone to create, and all the rest of that nonsense. That’s just kind of the way it grew up, right? I mean, there was no...

 

David Kruger  12:14

Yeah, I mean, that’s a mark of immaturity. And look, this is not a new cycle. Whenever you get a new invention, something that’s really valuable but also hazardous to handle, history takes us through this cycle. A great example is nitroglycerin. It was known to be extremely useful, and people really wanted to harness its power, but it wasn’t until Nobel figured out how to produce it safely, by putting it in diatomaceous earth and making dynamite, which was a lot more controllable. It went through a cycle where a bunch of people got hurt before people figured out how to make it better. And the impetus to make it better was the fact that the harms from nitroglycerin were obvious. We don’t have that state of obviousness yet in software engineering. It’s obvious, if you stop and think about it even for a minute, that if what is hazardous about computing, in terms of cybersecurity, in terms of privacy, things like that, is what happens with the data, then it’s pretty obvious we need engineering controls on the data. But that message to software development engineers, that it’s up to you to turn this nitroglycerin into dynamite, something that still has the potential but that we control, that sort of thinking, that sort of professionalization of engineering based on duty of care, just hasn’t sunk in yet. It will. It will probably come more as a result of legislation and litigation, because that’s sort of the history of engineering. But we’re not there yet.

 

Steve King  13:50

Yeah, exactly. And I don’t think we have that kind of discipline in open source software design either. You know, it’s a challenge for the space. I mean, do we see any movement, as you described?

 

David Kruger  14:08

You’re not going to see any movement, Steve, until you have that. The root cause, if you peel it back, is that if you treat software as a manufacturing process, and an application as something that’s manufacturing a physical product or managing data, right, the software industry (and this doesn’t go to the engineers, it goes to the ownership of the companies) has managed to transfer all the third-party risk to the users. You sign a EULA. The licensing says, hey, we know that this stuff is nitroglycerin, it’ll, you know, blow up and do a lot of harm, but hey, you bought it, so you’re on the hook for that, even though the user has no capability to make the use of that stuff safe. That’s really up to the software, right? Because people don’t control data, software does; in other words, whoever is holding the software does. So the precipitating event, I think, that will get us to the eventual professionalization of software development is litigation or legislation, most likely, that says to software companies: you can’t hide behind the license to offset third-party liability. Now, you can transfer that risk to insurance companies, which is what everybody else who engineers, manufactures and sells manufactured products does; they transfer it to insurance. So software companies will be able to do the same thing. But until they’re no longer able to just transfer that risk to users simply because the EULA says so, we’re probably not going to see the kind of movement that we need to see. However, given the amount of problems that we’re having with cybersecurity and privacy, I think that day is approaching. I just don’t know when it’s going to be.

 

Steve King  15:52

The other question is, will any insurance companies be around that are able or willing to underwrite those kinds of risks? I doubt that that’s going to be the case. So, you know, we’ve seen with Log4j, I mean, there is no better poster child than that vulnerability for open source software in Java. That was just, you know...

 

David Kruger  16:17

Log4j, and, you know, fill in the blanks, is still a failure in engineering. And the failure in the engineering is not in the fact that you had the vulnerability; you’re always going to have imperfections in any kind of complex process that can be exploited. The question is not how do we keep stuff from being exploited. The question is, from a duty of care standard, how do we make it non-exploitable in the first place? That rarely has as much to do with the process as it does with the design thinking. So if you encrypt things by default, and if you have a sufficient amount of authentication and other things, you know, rules around the use of the data, do we care if people get inside the system? Do we care if they exfiltrate the data? And the answer is no. And you know, that goes back to this whole Army exercise, right? It is rational to assume compromise, because if there’s one thing we’ve learned in the last 30 years, it’s that you can’t plug enough holes in the dike fast enough to keep the bad guys out. So assume that they’re going to get in, and then design accordingly.

 

Steve King  17:25

We don’t act as if we’ve learned that in the last...

 

David Kruger  17:29

Week? Yeah, we’ve learned that. We’ve learned that they’re going to get in, right? But our response has been to try to plug the holes in the dike better, rather than, hey, let’s produce dikes without holes. Right? And again, you don’t have the duty of care thinking that drives that. And you also don’t have the financial incentive, because software companies have absolved themselves of liability in a way that no other industry does, so they don’t have an impetus to fix it either. Their emphasis is to get the next cool release out, and in the meantime the users, the people who have absolutely no defenses, are paying the bill for that, literally.

 

Steve King  18:12

Right. And so your product kind of addresses that gap, I think, right?

 

David Kruger  18:18

That’s, well, that’s exactly what we intend to do, right? I mean, our view of the world is to assume compromise: insecure software, stolen credentials, you know, they’re in the network, they’re on the device, they have possession of the actual device. Assume that all of that is true, and still keep the data secure and controlled.

 

Steve King  18:44

Yeah, as many folks, people that I respect, have said many times here: we need to rethink the way that we architect, organize, orchestrate, however you want to describe it, cybersecurity defense. But no one seems to be listening. I mean, we continue to do the same thing over and over and over again; not a day goes by without learning about some horrendous breach. And of course now, with Log4j everywhere in the Java language, we have a vulnerability in...

 

David Kruger  19:14

In military strategy, there’s defense, which we have tried and failed at miserably. And then there’s denial, right, as a military doctrine? Defense says protect the target, and denial says don’t have a target. And we fall on the denial side: don’t have a target.

 

Steve King  19:36

Right. How are you going to get that message across to the market?

 

David Kruger  19:41

Oh, hopefully by doing interviews like this, and writing for Forbes and other magazines. And I think you’re absolutely correct that we need to rethink this problem. We need to rethink it profoundly. There’s something fundamentally wrong with our approach, and the news verifies that every day.

 

Steve King  20:02

You know, you actually just uncovered the kernel of a really good story there. The whole notion of military defend and deny captured my imagination for a minute, and it will capture our audience’s imagination as well, because no one’s ever talked about that before, and no one’s ever used that as a metaphor for this condition that we’re in.

 

David Kruger  20:34

There’s a third useful D in, you know, this sort of military strategy 101: defend, deny, deter. And deterrence is, we’re going to punish you if you do something that we don’t like. And so we have cybersecurity laws and privacy laws and things like that, which are intended to deter, but they have no teeth. And because we can’t defend, the one strategy you have left is denial. That’s sort of the military thinking.

 

Steve King  21:03

Well, that’s correct. And as far as deterrence goes, we’re never going to be able to get agreement, either through the whole of international government, let alone the whole of national government here, to go on offense, if you will, and prosecute the cybersecurity bad guys. And there’s a whole bunch of reasons for that, and we all understand what they are. But the defend part hasn’t been working. That leaves you with denial. And I love that notion. I mean, you guys, that’s the core of your story: deny the target.

 

David Kruger  21:38

Right. If you can’t defend, you must deny.

 

Steve King  21:41

Yeah, exactly. I mean, that’s a great tagline, too, by the way. So this is pretty cool: in half an hour, not only did we do a little podcast, we also created a marketing strategy for you, at no extra cost. I mean, it’s truly an amazing day. Yes, it is. All right, listen, I’m conscious of the time; we could probably talk, I’m sure, for hours about this. I love what you guys are doing, I love the idea of "if you can’t defend, deny," and I wish you all the best. So thanks for taking time out of your schedule, David Kruger, and joining me for what I thought was a pretty thought-provoking exchange.

 

David Kruger  22:26

Well, I appreciate the opportunity, I really do.

 

Steve King  22:29

Sure. And, you know, happy to do it again in three or six months, something like that, and see how you guys are doing and how that tagline is working out for you. You bet. All right. Thanks to our listeners also for joining us for another one of CyberTheory’s unplugged reviews of the wild world of cybersecurity technology in our new digital reality. Until next time, I’m your host Steve King, signing off.

 
