Why Are We Here and What to Do About It?

Over the last 12–18 months we have witnessed exponential growth in cybersecurity breaches. Whereas a few years ago it was mainly small-to-medium businesses that suffered most from ransomware, today we are seeing major enterprises and government departments breached. This includes the U.S. Department of Homeland Security, a number of airlines and a number of cybersecurity companies, and the list is growing daily.

Why Is It Happening?

As my good friend Andy Jenkinson rightly points out, a lot of infiltrations happen due to a lack of basic cybersecurity hygiene, mainly around mismanagement of domains (insecure domains) and poor PKI management (SSL/TLS certificates). Infiltration, however, is only one part of the problem; the other part is execution.

But what is the actual root cause of this recent spike in infiltrations and executions?

It is difficult to pinpoint a single root cause. We are probably seeing the cumulative effect of decisions made since 1945, combined with the hygiene failures Andy highlights. Let’s have a look at what has actually happened since then. The brief discussion below does not pretend to be a comprehensive, all-inclusive list; it simply points to some of the issues that led to what we are observing now.

How We Got Here: A Timeline of Events

1945

Von Neumann Architecture of Modern Computers

1. Devised by John von Neumann around 1945, the von Neumann architecture is insecure by design: the lack of separation between instruction memory and data memory (unlike in the Harvard architecture) allows control to be transferred into data space that may contain malicious code. No one at that stage could have predicted the sheer number of computers in use today, nor the creation of the Internet. In the famous words of Thomas J. Watson, then president of IBM: “I think there is a world market for maybe five computers.”
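To make the point concrete, here is a minimal, deliberately unsafe C sketch of my own (not anything von Neumann wrote): because code and data share one memory, an over-long input can spill past a buffer, corrupt the stack frame and ultimately redirect execution into attacker-supplied bytes.

```c
/* Illustrative sketch only: the classic stack-smashing pattern made possible
   by shared code/data memory. An input longer than 15 characters spills past
   `buf` into adjacent stack memory, including the saved return address.
   Modern mitigations (NX/DEP, ASLR, stack canaries) retrofit some of the
   separation that a Harvard-style design provides natively. */
#include <stdio.h>
#include <string.h>

static void vulnerable(const char *input) {
    char buf[16];
    strcpy(buf, input);          /* no bounds check: overflows for long input */
    printf("copied: %s\n", buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        vulnerable(argv[1]);     /* a long enough argv[1] smashes the stack */
    return 0;
}
```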

1960s

Insecure Nature of the TCP/IP Protocol

2. The TCP/IP suite grew out of work funded by DARPA from the late 1960s onwards (ARPAnet went live around 1969), designed to overcome the risk of losing specific point-to-point communication lines. Moving from circuit switching to packet switching allowed robust communication over a meshed network, with multiple possible paths and automatic error recovery. Security, however, was never part of that design: the core protocols provide no built-in authentication or encryption.
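As a small illustration of mine (not part of the original specification work), the sketch below lays out an IPv4 header in C: the checksum only protects against transmission errors, and nothing in the protocol verifies the source address, which is why spoofing is possible and why security had to be bolted on later (IPsec, TLS).

```c
/* A minimal sketch of why classic IP is "trusting by design": the header
   itself carries no authentication, so the source address is simply a
   field the sender fills in. Addresses below are documentation ranges. */
#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>   /* inet_addr, ntohl */

struct ipv4_header {
    uint8_t  version_ihl;    /* version (4) and header length */
    uint8_t  tos;
    uint16_t total_length;
    uint16_t id;
    uint16_t flags_fragment;
    uint8_t  ttl;
    uint8_t  protocol;
    uint16_t checksum;       /* integrity check only, not authentication */
    uint32_t src_addr;       /* nothing verifies this is really the sender */
    uint32_t dst_addr;
};

int main(void) {
    struct ipv4_header h = {0};
    h.version_ihl = 0x45;                      /* IPv4, 20-byte header */
    h.ttl         = 64;
    h.protocol    = 6;                         /* TCP */
    h.src_addr    = inet_addr("203.0.113.7");  /* arbitrary, spoofable value */
    h.dst_addr    = inet_addr("198.51.100.1");
    printf("claimed source: %u (unauthenticated)\n",
           (unsigned) ntohl(h.src_addr));
    return 0;
}
```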

1980s

Ubiquitous Access to the Internet

3. The word “internet” was first used in 1974 in RFC 675, the specification of the Transmission Control Program. By the mid-to-late 1980s the Internet had replaced a number of private networks, since it was cheaper, more convenient, more flexible and offered better interoperability.

Poor Software Design and Code

4. When the focus is on functionality and time to market (and often also on $$$), it is very difficult to achieve proper security by design, especially in large projects with frequently changing requirements and code. Often, what was originally designed securely becomes insecure over time, as changes to functionality and code are made locally without enough attention to the overall security of the software.

Outsourcing

5. Outsourcing came into vogue in the late 1980s. Everyone knows that accountability cannot be outsourced, yet in practice that is what often happens: the contract says all the right things, but the actual controls are more often than not missing.

5a. In the software development space, this typically results in a loss of control over the code in general, not to mention the increased risk of supply-chain attacks or of an unknown back door left open (accidentally or intentionally) by the third party that will not be caught in code review.

5b. In the ICT management space, it brings a loss of holistic, end-to-end expertise in the deployed ICT infrastructure and, with it, a loss of adequate controls.

2000s

Agile Approach

6. The Agile approach took shape around 2000, and the Agile Manifesto was published in February 2001. I believe the now-fashionable Agile approach has a lot to answer for in this space. The deviation from the traditional Waterfall approach (whose shortcomings I acknowledge) effectively leaves no room for proper top-down software security design. In a sense, development often becomes a bottom-up or “starting in the middle” exercise, focused mainly on UX/UI, time and money, but not security.

Complexity of Enterprise ICT Ecosystems

7. The ever-growing complexity of enterprise ICT ecosystems has brought us to the point where, in any reasonably sized enterprise, nobody has a full and detailed understanding of the ecosystem and its interdependencies. An enormous amount of effort goes into simply maintaining these ecosystems, and software is often left unpatched in order to preserve interoperability, because replacement or upgrade is costly and slow. This also sharply increases the risk of supply-chain attacks and makes PKI management even more challenging.

8. Unjustified trust in certificate-issuing authorities: any one of the hundreds of CAs trusted by an operating system or browser can issue a certificate for any domain, so a single compromised or negligent CA undermines the whole chain of trust.

Digital Transformation

9. Digitization and digital transformation probably started around 2000, and by the end of 2011 about a third of companies around the world had some form of digital transformation program. This increased both the number of systems directly connected to the internet and the incentive for the “bad guys” to breach them.

Cloud Computing

10. The term “cloud computing” was first used in 1996, but actual offerings started to appear between 1999 and 2006. Cloud computing is often seen as a silver bullet, but it requires even more stringent cybersecurity management, especially in the public cloud (see Outsourcing above). At the end of the day, the cloud is somebody else’s computer.

11. The “bad guys” are learning fast and now use NSA-grade exploits, some of which are available in the wild, and the proliferation of the internet has erased geographical borders: an attack can be launched against any country from any other country in the world.

So, What Does the Future Hold?

Quantum computing is going to break current asymmetric cryptography (Shor’s algorithm will make RSA and elliptic-curve schemes tractable once sufficiently large quantum computers exist), and the cost and time required to secure the existing and still-growing ICT footprint are prohibitive. Add to this the continued acceleration of digital transformation, combined with an almost complete lack of understanding of the problem at senior executive and board levels.

So, the future looks pretty bleak.

What Can Be Done?

There are certain measures that can lower the risk of infiltration, but as the Stuxnet episode showed, infiltration can never be ruled out with a 100% guarantee. That does not mean we should not close as many loopholes as possible, primarily by maintaining strong cybersecurity hygiene, especially around PKI and domain management. We should also plan for an accelerated shrinking of the “zoo” of systems and make a concerted effort to simplify existing ICT ecosystems as much as possible. We may need to develop a next generation of secure internet protocol, as IPsec, in my personal opinion, just won’t cut it. Quantum encryption will assist in this space too, hopefully sooner rather than later, but it is not yet available. And we need to move very fast to universal use of a zero trust model, no matter how difficult that is.
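To give a flavour of what basic PKI hygiene can look like in practice, here is a minimal sketch of my own (not a prescribed tool) that reports how many days remain before a locally stored certificate expires. It assumes OpenSSL 1.1.0+ and a PEM file path supplied on the command line; real hygiene would also cover full-chain validation, revocation, weak keys and forgotten domains.

```c
/* Sketch: warn when a certificate is close to expiry.
   Build with: cc check_expiry.c -lcrypto */
#include <stdio.h>
#include <openssl/pem.h>
#include <openssl/x509.h>

int main(int argc, char **argv) {
    if (argc != 2) {
        fprintf(stderr, "usage: %s cert.pem\n", argv[0]);
        return 1;
    }

    FILE *fp = fopen(argv[1], "r");
    if (!fp) { perror("fopen"); return 1; }

    X509 *cert = PEM_read_X509(fp, NULL, NULL, NULL);
    fclose(fp);
    if (!cert) { fprintf(stderr, "could not parse certificate\n"); return 1; }

    int days = 0, secs = 0;
    /* NULL "from" means "now"; a negative result means already expired. */
    if (!ASN1_TIME_diff(&days, &secs, NULL, X509_get0_notAfter(cert))) {
        fprintf(stderr, "could not compute expiry\n");
        X509_free(cert);
        return 1;
    }

    printf("certificate expires in %d day(s)\n", days);
    X509_free(cert);
    return (days < 30) ? 2 : 0;   /* non-zero exit if renewal is due soon */
}
```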

Execution is a much harder problem to deal with, as addressing it may involve redeveloping billions of lines of existing code, or wrapping them in a highly secure “gladwrap”, and perhaps a departure from the traditional von Neumann architecture.

There is a huge amount of work that needs to be done in this area and it needs to be done very, very fast.
