The Hidden Costs of Appliance-Based Models
By Martin Roesch, CEO
In the network security game, deep packet inspection (DPI) technologies are primarily delivered on appliance-based architectures, an approach under major evolutionary pressure due to pervasive network encryption and the dispersion of enterprise networks across multiple cloud providers – what we call the Atomized Network. Even setting that aside, there are many additional reasons why organizations extending or updating security for their Atomized Network should hit the pause button before making additional investments in this trailing-edge architecture. The ongoing operational costs of appliance-based deployments are considerable.
As an industry, we talk a lot about the total cost of ownership (TCO) and return on investment (ROI) associated with deployment, so the racking and stacking costs of appliances are understood. However, the lifecycle management and curation costs of appliance-based architectures are not something that gets much attention. I refer to these as the externalities – analogous to the pollutants a coal-fired power plant throws off that result in acid rain a thousand miles away, while generating power that keeps local customers happy. Users like the capabilities of their intrusion prevention system (IPS) or on-premises network detection and response (NDR) tools. But externalities, like the amount of configuration management needed to get DPI appliances to perform and deliver value, and the expertise required to handle events and analyze data, can result in massive operational costs.
Spec’ing and sizing
Spec’ing and sizing appliances can also carry hidden costs. Say you’ve been struggling to get your 10-gigabit appliance to perform due to a variety of factors like traffic mix and configuration. Three years down the road from the initial purchase there’s a new model with more headroom, so you decide to bite the bullet and upgrade. Once again, you face the costs of moving boxes around, racking and stacking, getting power and packets, and configuring and incorporating the appliance into the overall security architecture. Or maybe you have an appliance you believe is spec’d properly for your environment, and then the vendor adds features that leave it under-spec’d. The spec’ing and sizing game is tricky and fraught with hidden costs. There’s also the issue that an appliance, as a physical device, has a limited purview into a network environment, and it is very difficult to change that in the face of emerging needs – like an attack in a part of the network with no coverage from the appliance-based technology.
Let’s not forget the constant drumbeat of software upgrades. You can kick the can down the road for a while by delaying updates, but when you finally decide to catch up to the latest and greatest version, the QA and testing done long ago to transition from X.y to Z.y might not be as solid when you try to jump ahead. The technical debt you accumulate the farther you drift off the update path translates into additional costs later because you didn’t service that debt within the vendor’s suggested timeframe. When you finally decide to get back on track, you discover that your marginally supported configuration doesn’t transition smoothly, and you might find yourself wishing you’d brought your sleeping bag to the office when the upgrade takes all weekend to complete.
These factors may be somewhat nebulous, but they happen and impose real costs on the appliance-based architectural model for security. Granted, there are some baseline capabilities that require being on an appliance. You need switches to plug Ethernet cables into to build wired networks, and you need an actual firewall somewhere in the path of traffic. But today, applications and data are scattered across a complex and fluid environment – the Atomized Network. If you want to gain situational awareness across your ever-expanding enterprise network and are considering deploying a big appliance-based architecture to do so, you need to think about these hidden factors and honestly assess your capabilities for success.
Few organizations have enough geeks on board interested in dealing with an interrelated set of sophisticated technologies with massive externalities, as well as the spec’ing, sizing, and software updates required to get them to operate as promised. Not to mention the additional costs and complications of international deployments when you have operations in other parts of the world. Boxes can get hung up on loading docks for months, and there are customs, tariffs, taxes, and other fees and regulatory processes to navigate. Technologies and tools that are critical to securing your networks can sit idle for a very long time.
Recognizing the limitations and expense of appliance-based models in an increasingly cloud-first world, some legacy vendors tell you they can put appliance-based architectures in the cloud by using a virtual machine (VM). But in many cases, all that VM does is serve as a collector for cloud data. Instead of leveraging the data capabilities of the cloud, data is shipped back to an on-premises appliance for analysis and action. As appliance-based models shift to the cloud, the hidden costs follow.
To secure your Atomized Network, you can try to hold on to the appliance-based way, which is rife with costs and limitations, or you can choose the Netography way: a cloud-native, SaaS-based universal platform that uses metadata (not packets) to provide complete network visibility for threat detection and advanced analytics with no hardware, no software, and nothing to install. When we recognize the hidden costs of appliances and their diminishing effectiveness, the choice is obvious.