We continue to do it to ourselves. I have been around the cybersecurity space for a long time. And if I had a dollar for every time a self-assured software and network engineer told me in no uncertain terms that “I’ve got this”, I could endow a cybersecurity chair at MIT. Turns out that every time, he (almost always) didn’t really have this at all. And it was always later, sometimes well into the distant future, that we would discover the “oops” that led to a system crash or logic failure.
These days it shows up most frequently in DevOps or server configurations, network design, cloud implementations, or SIEM tuning. Every critical aspect of cybersecurity relies on some form of human decision-making and is, as a result, doomed.
Micro-Segmentation Mishaps are the Norm
Micro-segmentation is a good example. In this case, we have well-intentioned network engineers and operations staff designing and configuring segmented networks to reduce our attack surfaces and corporate LoB owners pushing for productivity gains and business goals that are promised by the adoption of digitalization strategies.
It’s the constant fallibility of human decision-making that inflames a highly complex and increasingly combustible problem space. And here’s why:
Fragmenting the Attack Surface Is Unwise
The typical user base and the devices and applications on a corporate network are increasingly geographically dispersed. As we have accommodated mobile and Internet-of-Things (IoT) technologies and adopted Software-as-a-Service (SaaS) applications in multiple public clouds over which we have limited security control, our attack surfaces have become increasingly difficult to protect.
These rapidly expanding and fragmenting attack surfaces have created an array of new paths through which criminals can attack. The threats are increasingly more sophisticated, automatically seeking and taking advantage of any exposed vulnerabilities.
It’s almost as if we felt that the cybersecurity challenge wasn’t big enough, so we decided to throw in cloud services, IoT connections, open APIs, and third-party threats just for laughs. What’s next? Oh. Security for wireless sensor networks using identity-based cryptology. What could possibly go wrong?
So Cybersecurity’s A Reactive Exercise
In most organizations, cybersecurity has become a reactive exercise, as we have thrown in the towel trying to predict behavior from the monster we unleashed. In this new, highly dispersed, and heavily connected environment, IT is unable to prevent lateral movement of intrusions across the devices and applications connected to and traversing the network, resulting in complete proactive impotence.
Historically, the response from smart network engineering and operations leaders has been to move to micro-segmented networks. The best techniques for doing so were based on IP addresses augmented with VLAN segmentation and VMware NSX segmentation for virtualized workloads. Networks based on Cisco hardware relied on Cisco ACI segmentation using physical switches and VXLANs.
These micro-segmentation techniques enabled access control policies to be defined by workloads, applications, or by architectural attributes such as the virtual machines (VM) upon which the applications, data, and operating systems resided.
In these segmentation approaches, firewalls were commonly used to separate the network resources for each group. In theory, this approach was to prohibit any unauthorized traffic from moving between segments. When an attack breached network security in one area, micro-segmentation techniques would prevent the spread of attacks laterally to other areas of the network.
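The default-deny behavior described above can be sketched in a few lines. This is an illustrative model only, not any vendor's actual policy engine; the segment names, ports, and allow-list are hypothetical.

```python
# Hypothetical allow-list of (source segment, destination segment, port) flows.
# Anything not explicitly listed is denied, mirroring how a segmentation
# firewall is meant to block lateral movement between network segments.
ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),
    ("app-tier", "db-tier", 5432),
}

def is_allowed(src_segment: str, dst_segment: str, port: int) -> bool:
    """Default-deny: cross-segment traffic passes only with an explicit rule."""
    if src_segment == dst_segment:
        return True  # intra-segment traffic is typically unrestricted
    return (src_segment, dst_segment, port) in ALLOWED_FLOWS

# A compromised web server probing the database directly is blocked,
# even though the app tier may reach the database legitimately:
print(is_allowed("web-tier", "db-tier", 5432))  # False
print(is_allowed("app-tier", "db-tier", 5432))  # True
```

The key design choice is that the deny is the default and the allow-list is the exception, which is exactly what limits lateral spread when one segment is breached.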
In reality, as is the case with so many of the other well-intentioned paths to the cybersecurity promised land, micro-segmentation isn’t the panacea that many sought. It turns out that unless the network infrastructure is designed properly, dividing a complex corporate network into a large number of small segments may limit visibility into threats and attack mitigation activities across the entire network.
Current Segmentation Practices are not Cyber-Suitable
There are three primary problems with the segmentation techniques currently in vogue:
1. Access control for internal network segments is designed from the architecture up, a tactical approach that cannot easily adapt to changing business needs.
2. The trust valuations on which access policies are based tend to be static and become quickly outdated.
3. Access control policies cannot be effectively enforced due to a lack of advanced (Layer 7) security components spanning the data center to the network edge, and the inability to see and control those components efficiently.
These all stem from the fact that IT network engineering and operations folks plan the segmentation architecture without adequate attention to cybersecurity best practices and without consulting their counterparts in information security. This is not a criticism of IT network engineering or operations personnel. We never have put sufficient emphasis on cybersecurity in the design stages of anything we do, and we still don’t, whether it be DevOps, network engineering, or cloud migration.
If we want a more risk-wise approach to network segmentation, we need to start at the beginning.
Segmentation Done the Right Way
On paper, the design of the corporate network is dictated by the needs of the organization as it evolves. The rules governing who and what can access which network resources are determined by business policies, industry standards, and government regulations. The network operations team should follow these rules when configuring the access control settings that permit users, devices, or applications to access specific network resources.
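The idea that access rules should derive from business policy rather than network architecture can be sketched as follows. The role and resource names here are invented for illustration; the point is simply that the policy names who needs what, independent of where a resource happens to sit in the network.

```python
# Hypothetical business policy: each role maps to the set of resources
# it is entitled to, per organizational rules and compliance requirements.
BUSINESS_POLICY = {
    "finance-analyst": {"erp-reporting", "payroll-db"},
    "contractor-dev": {"ci-pipeline"},
}

def can_access(role: str, resource: str) -> bool:
    """Grant access only when business policy names the role/resource pair."""
    return resource in BUSINESS_POLICY.get(role, set())

print(can_access("finance-analyst", "payroll-db"))  # True
print(can_access("contractor-dev", "payroll-db"))   # False
```

In this model, network operations translates the policy into access control settings; the policy itself is owned by the business, standards, and regulators, as the paragraph above describes.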
Network engineering and operations leaders will immediately recognize two downsides to this approach.
Understand the Imbalance Between GRC and the Network Structure
First, the business processes, compliance requirements, and network access needs of an organization are vastly more complex than the structure of its network. Consequently, it is very difficult to use the network architecture to define secure segments for network resources that will be simultaneously accessible to all authorized users and applications and completely inaccessible to all others.
In practice, there will be security gaps: access scenarios that the network architects did not envision, which bad actors can take advantage of. With today’s advanced, sophisticated malware, they are doing so already.
Change Management is a Real-Time Trust Policy
Second, any process, regulation, or organizational structure is liable to change. So, even if the optimally secure network design were achieved, it would have to be amended. Once again, there are numerous opportunities for security gaps, not to mention the time and cost involved in the reconfiguration, which few networking teams can afford.
To effectively manage cyber-risk, network engineering and operations leaders need to have current and accurate information on the trustworthiness of users, applications, and network assets. Their internal firewalls or other access control mechanisms that enable or prohibit traffic flow between network segments must always operate from up-to-date trust data. If trust assessments are out of date, the segmentation technologies become useless at preventing potential threats from moving laterally through the network.
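One way to make stale trust data harmless is to have the access decision fail closed when the assessment is too old. The sketch below assumes a hypothetical trust score, freshness threshold, and staleness window; none of these come from a specific product.

```python
import time

# Assumed staleness window: trust data older than 15 minutes is treated as
# unreliable and the decision fails closed (deny) rather than trusting blindly.
MAX_TRUST_AGE_SECONDS = 15 * 60

def allow_cross_segment(trust_score: float, assessed_at: float,
                        now: float, threshold: float = 0.8) -> bool:
    """Permit inter-segment traffic only on fresh, sufficiently high trust."""
    if now - assessed_at > MAX_TRUST_AGE_SECONDS:
        return False  # stale trust assessment: deny by default
    return trust_score >= threshold

now = time.time()
print(allow_cross_segment(0.95, assessed_at=now - 60, now=now))    # True
print(allow_cross_segment(0.95, assessed_at=now - 3600, now=now))  # False
```

Denying on staleness is deliberately conservative: a high score from an hour ago says nothing about a credential stolen ten minutes ago.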
The quality of trust data is becoming a pressing issue in network segmentation security because the actual trustworthiness of network resources can change unexpectedly. Lots of companies have been surprised by attacks from within the ranks of their trusted employees and contractors.
More than one-third of reported breaches involve internal users, and another third involve stolen credentials.
Zero Trust Not the Answer
Some businesses have responded to these dangers by locking down their networks, trusting no user or application and creating layers of verification before permitting access. This is, of course, not a practical solution: network engineering and operations must protect sensitive assets, but not at the expense of imposing unnecessary burdens on those who legitimately require access to those assets.
Access control policies cannot work as expected if the network is missing key elements of an effective security infrastructure. Traditional approaches to network segmentation assume that all the necessary network security components are in place to execute whatever access control policies the IT team defines.
However, this assumption is usually wrong.
Not all Segments Treated Equally
The network engineering and operations team driving segmentation may decide that some network segments with smaller attack surfaces are adequately protected without Layer 7 advanced enforcement.
Due to lack of budget or simply because deployment and management requires too many resources, network engineering and operations teams seldom deploy next-generation firewalls (NGFWs) and other advanced threat-protection solutions everywhere they are needed (like in every cloud in which they operate, and at every endpoint and IoT device).
The security components that are in place are often not fully functional. Examples abound but the most common is a network team that intentionally turns off secure sockets layer (SSL)/transport layer security (TLS) inspection in their NGFWs in order to optimize network performance.
We all want fast networks but opening the door (front or back) to illegitimate traffic negates all other efforts to secure the computing environment and is a really bad idea. Again, this is not a critique of network engineers or operations folks. In almost every case, these folks are simply trying to do the impossible job of serving a multi-headed monster with conflicting requirements under extreme pressure.
The effectiveness of cybersecurity components is reduced if they are not tightly integrated.
Segmentation Without Integration is Risky
Lack of integration has several implications. For example, when one firewall detects a suspicious packet, it can take hours before the information is picked up by the security team and disseminated across the rest of the network. Additionally, disparate security solutions cannot easily share threat intelligence, neither globally acquired intelligence on known and emerging threats nor zero-day intelligence on newly discovered threats.
It is the principal reason that the mean time to identify a breach remains high, at 197 days.
These conditions leave network engineering and operations leaders who believe their segmented network is well-protected with a false sense of security. An ongoing end-to-end security assessment would tell them how their security platform is performing and whether their access control policies are achieving their business intent. But without breadth of security and end-to-end visibility, a reliable assessment is just not possible, which prevents network engineering and operations leaders from reporting accurately on their company’s cybersecurity posture.
As we’ve acknowledged, networks in which the segmentation architecture constrains business intent do not support progress toward organizational goals, but if performance priorities supersede security concerns, segmentation will likely result in reactive and ineffective threat mitigation.
Without adequate visibility into the exposures caused by network segmentation, well-intentioned network engineering leaders may be adding fuel to an already combustible attack surface. We are all humans and humans make mistakes. Especially under pressure. And in today’s cybersecurity world, we operate under constant and unreasonable pressure.
Having said all of that, no amount of warning or lengthy arguments in favor of cautious planning and testing are ever going to stop or slow this train. We see it every day.
So, as I always try to offer some positive steps that we might take to avoid a train wreck, my advice with network segmentation is to accept the inherent dangers in the design and implementation processes but invest in a third-party audit so that another set of eyes can review and certify that the work was done correctly and makes technical sense. An external audit of your segmentation design and a related risk assessment might be the best money you can spend.