The Last Security Tool
Until now, IT teams have relied on perimeter defenses, or "security prevention," to keep attackers out. The reality, however, is that the bad guys are still getting in. Organizations are also recognizing that more threats now come from within, driven by employee error and the BYOD trend, either of which can compromise the network.
A complicating factor is the rapid increase in network speeds, which now routinely reach 100 Gigabits per second, roughly 70 million times faster than the typical network connection when firewalls were first introduced. This poses a number of challenges, particularly for security. Network growth, combined with the data deluge, puts enormous pressure on organizations to combat cyber threats and analyze cyber-attacks in real time so that necessary actions can be taken with minimum delay.
Consequently, there is a growing emphasis on "security detection," which uses network analysis to detect anomalies as a first indicator of new types of threats, whether zero-day attacks or threats that originate inside the network. Gartner's research note, Shift Cybersecurity Investment to Detection and Response, projects that by 2020, 60 percent of enterprise information security budgets will be allocated to rapid detection and response approaches, up from less than 20 percent in 2015. It is in this context that post-analysis comes into its own, as it is not always easy to catch threats as they happen.
The ability to do deep analysis offline or even post-attack to determine what happened, then, is crucial. The analysis allows management to make decisions and take actions in response to an attack. More importantly, it is needed to ensure that a cyber event has been truly resolved so that all public disclosure, notification of impacted parties and internal remediation can be completed.
Too Much Information
Attacks on high-speed networks are at their highest level to date – and they are only increasing. However, in most cases the attacks are discovered weeks later. Network security solutions are facing a two-fold growth challenge: Data traffic is increasing exponentially, so there is more to analyze at faster speeds. At the same time, cyber-attacks are also growing in number and complexity.
As part of the ongoing process of defeating network attacks, there is no shortage of security alerts or events in an organization's environment; in fact, just the opposite. Entire industries have been created to process the tens of billions of events generated every day in a typical large enterprise. The security team faces the huge task of collecting these events from all of its tools and then prioritizing them by severity.
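In practice, prioritizing events by severity often reduces to a max-priority queue over normalized severity scores. A minimal sketch in Python, where the tool names, events, and numeric severity scale are all hypothetical examples, not taken from any real product:

```python
import heapq

# Hypothetical severity scale: higher number = more severe.
# Tools and messages are illustrative only.
events = [
    {"tool": "IDS",      "severity": 3, "msg": "port scan from 10.0.0.5"},
    {"tool": "SIEM",     "severity": 9, "msg": "credential reuse on file server"},
    {"tool": "firewall", "severity": 1, "msg": "blocked outbound DNS"},
    {"tool": "EDR",      "severity": 7, "msg": "unsigned binary executed"},
]

# heapq is a min-heap, so negate severity to pop the worst event first.
# The index i breaks ties without comparing the dicts themselves.
heap = [(-e["severity"], i, e) for i, e in enumerate(events)]
heapq.heapify(heap)

# Analysts would work this queue from the top down.
triaged = [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

A real SIEM adds normalization across tools and de-duplication of related alerts, but the core ordering problem is the same.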
The trouble is that, although alert management tools generate, organize, and prioritize massive numbers of events, they often give incomplete or contradictory information about a given event. Worse, once an attacker is inside, he will often compromise the credentials of a legitimate user, disguising himself as an employee to search for and extract sensitive data.
Breaches Get More Costly
The 2015 Cost of Data Breach Study revealed that the main cause of data breaches in 2015 (47 percent) was malicious or criminal attack. The average cost of a data breach was $3.79 million, a 23 percent increase since 2013. Worse, these direct expenses are often only the first of many legal, shareholder, employee, regulatory, customer and reputational ripples. The Ponemon Institute's The Post Breach Boom revealed that while detection and escalation costs rose to $417,700, up from $395,262 over the past year, post-data breach response and detection costs rose to $1.6 million, up from approximately $1.41 million.
In light of all this downside potential, organizations need to deploy a diverse strategy that:
- Ensures all security prevention solutions have the necessary bandwidth and capacity to handle high-speed, high-volume attacks.
- Ensures security detection solutions are in place not only to detect anomalies in real time but also to record network activity for deeper analysis and/or later detection of a past breach.
It is no longer possible to rely on one single security solution. Traditional point defenses cannot adequately address the new, faster-moving, multi-layer threats and more sophisticated attackers. What’s required is a layered approach with defense-in-depth, where an organization not only relies on network security appliances for indications of data breaches but also network behavior analysis.
Consequently, the "last security tool" is continuously recorded network data. A network forensics solution should capture all data 24x7, regardless of whether anything interesting is happening at a particular moment. Then, in conjunction with alerts from the other tools, the security team can investigate whether an event was a false alarm or something that requires action. Moreover, they can see what happened after the breach and achieve the ultimate goal: determining all the assets the attacker may have accessed and whether he has truly been eliminated from the environment.
The Bigger Picture
While tools currently exist that can record a partial slice of network traffic based on an event, that data is inevitably incomplete if the recording tool did not see anything it considered interesting. For effective network forensics, best practices today suggest complementing these tools with a solution that records everything continuously at high speed. Such a solution must be purpose-built, since the storage and indexing demands of this data volume differ greatly from the architecture of other security tools.
Real-time data capture can be taken a step further by introducing retrieval-on-demand. The network forensics solution must give an investigator pursuing an event an immediate, indexed answer. It is crucial that security officers can go quickly to the time and place of the event to begin analysis; waiting several hours for this initial answer can cause serious delays while the attacker may still be inside.
IT security teams are challenged by the massive tsunami of data they must deal with. It can be tedious and often quite expensive to store and analyze every single data packet.
This reality drives the need to retrieve data on demand with a few simple commands. In this scenario, users can get to the root of the problem by accessing the packets from a specific server or time period. Ultimately, as critical as it is to capture data, it is equally critical to have rapid retrieval speed and data on demand. Using this kind of layered approach, organizations can build the security prevention perimeter higher and wider. And whether attacks emerge from outside the network or within it, teams can be confident they have the security detection capabilities, in real time and on demand, to see what is going on and determine the root source of the breach.
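The retrieval pattern described above, pulling only the packets for a given server and time window, can be illustrated with a simple in-memory index. This is a sketch of the idea only: the class and method names are hypothetical, and a production forensics appliance would index to disk at line rate rather than hold packets in memory.

```python
from collections import defaultdict


class PacketIndex:
    """Toy sketch of an indexed capture store supporting
    retrieval-on-demand by server IP and time range."""

    def __init__(self, bucket_seconds=60):
        self.bucket_seconds = bucket_seconds
        # (server_ip, time_bucket) -> list of (timestamp, payload)
        self._index = defaultdict(list)

    def record(self, ts, server_ip, payload):
        """Continuously record every packet into a time bucket."""
        bucket = int(ts // self.bucket_seconds)
        self._index[(server_ip, bucket)].append((ts, payload))

    def retrieve(self, server_ip, start_ts, end_ts):
        """Return packets for one server within [start_ts, end_ts],
        touching only the buckets that overlap the window."""
        first = int(start_ts // self.bucket_seconds)
        last = int(end_ts // self.bucket_seconds)
        hits = []
        for bucket in range(first, last + 1):
            for ts, payload in self._index.get((server_ip, bucket), []):
                if start_ts <= ts <= end_ts:
                    hits.append((ts, payload))
        return sorted(hits)


# Usage: capture everything, then query on demand during an investigation.
idx = PacketIndex()
idx.record(10, "10.0.0.7", b"login attempt")
idx.record(70, "10.0.0.7", b"file transfer")
idx.record(20, "10.0.0.8", b"dns query")
window = idx.retrieve("10.0.0.7", 0, 60)
```

The bucketing keeps a query from scanning the whole capture: only the time buckets overlapping the requested window are examined, which is what makes an "immediate, indexed answer" possible at scale.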
About the Author
Daniel Joseph Barry is VP Positioning and Chief Evangelist at Napatech and has over 20 years' experience in the IT and Telecom industry. Prior to joining Napatech in 2009, Dan Joe was Marketing Director at TPACK, a leading supplier of transport chip solutions to the Telecom sector. He has an MBA and a BSc degree in Electronic Engineering from Trinity College Dublin.
Edited by Peter Bernstein