When I started in cybersecurity back in the early '90s, entire cybersecurity capabilities were just a couple of megabytes in size and would fit on a couple of floppy disks. Today, many cybersecurity capabilities are so big and complex that they rely on cloud computing, either for processing speed or for the ability to leverage large data sets, for example with machine learning as part of detection and response capabilities.
Whilst we live in an ever more interconnected world, however, there is a growing subset of systems that simply cannot connect to the public internet. This may be due to expanding regulatory requirements (be those industry-specific or broader country requirements). It's often a simple business risk decision to have closed systems, and sometimes the choice is taken away because of how the system was engineered to function; healthcare systems and energy production and management systems, to name a few. The reality is, in an ever more connected world, the scope of systems that simply can't use many of today's rich, cloud-based cybersecurity capabilities is also growing. So what's the alternative?
Well, one answer is to do nothing if it's truly a closed system. I remember talking to one energy sector organization many years ago that told me the environment was too sensitive to be secured. There was an oxymoron if ever I heard one! The reality is, and experience has shown, threats still get into such environments through physical data movements, such as updates, or a user plugging in their own USB stick just for a moment to move some data across.
Many other environments may have small connections, either required by the supply chain of the process, or by the manufacturer's back door to apply updates and patches, along with many other requirements for a thin connection. All of these further increase the risk.
So how do you secure this space? Historically, the answer has been to install a security appliance: a dedicated security device within the infrastructure. In many of the formative security companies I worked with, we produced these, and during those times I learned through bitter experience the limitations of such an approach.
One such case was when a company was breached by a relatively simple attack, and we had to help them understand how this had happened. When you limit yourself to a fixed capacity, you need to make decisions about what gets processed first and what happens when you run out of capacity. Typically you need to decide whether you prioritize analysis by complexity or by risk. When you are at capacity, do you drop some bits of analysis or queue them up? And if you queue them, what gets processed first: things in the queue or new things coming in? In the instance I was involved in, they were running beyond capacity, and a really simple threat was deprioritized and got through. Trust me, that was a hard conversation to have. It's easy to say "just buy more hardware," but most sizing works on capacity averages, and unpredictable spikes can always occur, as was the case here.
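The failure mode above can be sketched abstractly. The following is a minimal toy model (not any vendor's actual logic, and the risk scores and event names are invented for illustration): a fixed-capacity queue that keeps the highest-risk work and silently sheds the lowest-risk work when full. Notice how a "simple" low-scored threat is exactly what gets dropped under load.

```python
import heapq

class BoundedAnalysisQueue:
    """Toy model of an appliance's analysis queue: fixed capacity,
    highest-risk events analyzed first, lowest-risk dropped when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._heap = []  # min-heap on risk score: lowest-risk item sits at the root

    def submit(self, risk_score, event):
        """Queue an event; return whichever event (if any) was dropped."""
        if len(self._heap) < self.capacity:
            heapq.heappush(self._heap, (risk_score, event))
            return None  # accepted, nothing dropped
        # At capacity: keep the new event only if it outranks the
        # lowest-risk event already queued.
        if risk_score > self._heap[0][0]:
            dropped = heapq.heappushpop(self._heap, (risk_score, event))
            return dropped[1]
        return event  # the new event itself is shed

    def next_to_analyze(self):
        """Highest-risk queued event, or None if the queue is empty."""
        return max(self._heap)[1] if self._heap else None

# A spike beyond capacity: the low-scored event is never analyzed.
queue = BoundedAnalysisQueue(capacity=2)
queue.submit(90, "ransomware-indicator")
queue.submit(80, "lateral-movement")
dropped = queue.submit(10, "simple-usb-autorun")  # deprioritized and shed
```

The point of the sketch is that the drop is invisible to the operator: nothing fails loudly, the queue simply never gets to the cheap-looking event.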
The second big limitation is capability. This occurs not only due to hardware constraints but also, in my experience, because of much slower version update cycles for dedicated appliances. Indeed, rather than keeping pace with new capabilities, all too often appliances fall into maintenance mode (i.e., only what is required to keep the appliance functioning). The hardware constraints I saw, again through the formative security companies I worked for, meant that capabilities had to be stripped back to fit on the hardware provided. Often key features had to be cut, and computationally or data-heavy capabilities in particular, such as machine learning-based functions, simply could not run. In my career I have had to explain why we just couldn't include a capability because of hardware constraints, something that doesn't placate a company, especially when they have just suffered a breach.
When I joined Cybereason, I discovered what I can only describe as a hidden gem: what is now known as Cybereason On-Prem, a different approach that I'm sure others will follow, if their code base allows.
Learn more about Cybereason On-Prem
Rather than providing an appliance, the solution is delivered as virtualized containers, the same way the EDR/EPP services are run from the cloud.
This allows customers to re-use their own hardware, which reduces the cost of running offline cybersecurity capabilities! (If you don't have spare hardware, it can of course be installed on any new hardware, or if you prefer, you could have a third party install and manage the hardware for you.) The key point here is you have limitless flexibility.
It also breaks away from the traditional limitations of the appliance model. The only limit on scalability is how much hardware you choose to use.
This means you can now have an on-premises solution with near parity to modern cloud-based solutions, as you're no longer limited by the vendor's predefined hardware.
This approach also makes it easy to add new capabilities as the cybersecurity world evolves: when new features come out, they can be added easily, because you don't have the concern of overloading the hardware. It can be run on your premises or in your private environment, either with no outside connection (so that solution updates must be manually applied across the air gap) or with a thin link to help ensure you get access to updates and capabilities much faster. One of the great advantages of cloud-based security is the ability to continually evolve security capabilities; if you can enable this secure thin link (a hybrid approach), you achieve the best of both worlds.
If you are looking for offline cybersecurity, I would recommend you always consider the following: