The Problem With 'Defense-in-Depth'
It is most often invoked in the image of battlements around a castle: first the wall around the town, then a moat around the castle, then the castle wall, then an inner bailey, and finally the castle keep. The idea is that each layer of defense is individually inadequate for warding off a concerted attack and is sacrificial.
Does this metaphor really represent the proper response to IT threats? I don't think so. Let's examine a few of the historical reasons defense-in-depth has become the mantra of so many security professionals and vendors.
Firewalls & AV
Two great examples are the firewall and anti-virus. The firewall, as traditionally deployed, was meant to block connections based on source, destination, and port. That made it easy to block services such as telnet or FTP, or to allow access only from trusted sources.
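That traditional model can be sketched in a few lines of pseudo-firewall logic. This is an illustrative sketch only, not any real product's rule syntax; the trusted addresses and the `allow` function are hypothetical.

```python
# Illustrative sketch of the traditional firewall model: decisions are
# made purely on source, destination, and port. All names and addresses
# here are made up for the example.

TRUSTED_SOURCES = {"10.0.0.5", "10.0.0.6"}  # hypothetical admin hosts
RESTRICTED_PORTS = {21, 23}                 # FTP and telnet

def allow(source_ip: str, dest_ip: str, port: int) -> bool:
    """Return True if the connection should be permitted."""
    if port in RESTRICTED_PORTS:
        # Blocked services are reachable only from trusted sources.
        return source_ip in TRUSTED_SOURCES
    # Everything else passes; this model inspects nothing deeper.
    return True

# Telnet from an untrusted host is dropped; from a trusted host it passes.
print(allow("192.0.2.9", "10.0.0.1", 23))  # False
print(allow("10.0.0.5", "10.0.0.1", 23))   # True
```

The point of the sketch is what it leaves out: nothing about the payload is examined, which is exactly why attackers moved to vectors such rules never see.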
Before the turn of the century many firewalls were deployed on standard Unix machines and some even on NT servers. The risk of maintaining a critical service such as a firewall on notoriously vulnerable servers led many organizations to deploy multiple firewalls from multiple vendors. That way, if a hacker exploited a weakness in one, they would still be blocked by the second, or even a third, firewall.
As firewalls matured and moved to proprietary platforms, the need for redundant manufacturers went away. Today, the risk of misconfiguration inherent in maintaining two separate brands of firewalls outweighs the marginal increase in security.
Anti-virus (AV) solutions were notoriously faulty a decade ago. Don't forget that the primary way of transferring files in those days was floppy disks, not email, so viruses spread much more slowly. AV vendors had yet to invest in the research staffs that now find viruses and create definitions in the blazing four to six hours they do today; sometimes it was as long as a week before a definition was made available.
To counter this, many enterprises had relationships with multiple vendors in the hope that one of them would have a definition in time to avoid the next virus-induced meltdown. That was one aspect of defense-in-depth in AV.
The other was the idea that a gateway anti-virus solution should sit in front of the mail servers, and yet another AV solution should run on the mail server itself. Along with desktop AV, that made three layers of AV from three vendors. But once again, the complexity and expense of dealing with multiple vendors now outweigh the benefit.
Many organizations still have three layers of AV but they are consolidating their suppliers thanks to huge investments on the part of vendors to discover new viruses and push out definitions in a timely manner.
If it weren't for the fact that there are so many laptops in the enterprise, there would not be a need for so many layers. But regardless of how effective network AV is, there still has to be desktop AV to protect those machines when they are off the network.
Plugging the Breach
The problem with defense-in-depth as a guiding strategy is that it causes too much spending on redundant systems while neglecting new attack vectors. It is not that the bad guys are systematically looking for ways past each of your defenses; it is that the bad guys are evolving as they discover completely undefended gaps in security.
A recent trip to the beach gave me a better metaphor than defense-in-depth. I watched a child build a dam across a small rivulet. First, she stopped the water from flowing with a nicely formed wall of sand. As the water backed up, she had to keep extending the walls to keep the water from finding a way around her dam.
She then found she had to make the dam higher as the water level rose. And all the time she had to maintain her porous sand foundations by beefing them up with more and more sand.
That depicts the battle we are fighting today: cyber crime is a rising flood. Any crack in our dam will admit the criminals, and where one succeeds, a thousand more follow. We must be ever vigilant to find and block each newly uncovered weakness.
Richard Stiennon is the former vice president of Threat Research at Webroot Software and now the founder of IT Harvest, an IT security research firm. He is a holder of Gartner's Thought Leadership award for 2003 and was named "One of the 50 Most Powerful People in Networking" by Network World Magazine.