As data streams inside networks continue to grow and change in nature, IT will increasingly need to adjust security strategies and adopt data mining and analysis approaches. According to Palo Alto Networks, hiding malicious activity has become a cornerstone of cybercrime, and analyzing unknown data flows is essential to keeping network defenses effective.

Network security is often reactive rather than focused on actively preparing an organization for emerging threats. The latest example may be a relatively new but quickly maturing trend in the security software industry, data-stream mining and analysis, which responds to a threat that is already common and a foundation of effective attacks on networks today: in cybercrime, making sure that an attack is planted and remains hidden within a target is a critical requirement.

"It is important for attacks to live quietly within a target," said Wade Williamson, security analyst with Palo Alto Networks, in a conversation with Tom's IT Pro. "The attack process has many layers, but obscuring data and undetected communication out to a server is the foundation for a longer attack cycle and the success of an attack."

High-profile examples include Stuxnet and Flame, highly sophisticated espionage malware that allegedly extracted data from the networks they had infected for several years. Security researchers believe that Flame has at least three siblings that have been deployed but not yet identified; even if they are found, they may, like Flame, integrate a kill switch that immediately removes the malware along with its traces.

Mary Landesman, senior security researcher at Cisco, considers dark data a developing and rapidly growing trend. "It raises the bar for enterprise security," she told Tom's IT Pro. Landesman said that "there is no single approach" to effectively halt this threat, but tools like Cisco's NetFlow, which allows forensic analysis of network traffic, may be used more often by IT departments to monitor the data that goes into and out of an organization. While she noted legal issues that prevent the use of such tools in some jurisdictions, Landesman believes that data analysis requires a standardized process to keep up with developing threats.
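The kind of flow-level monitoring Landesman describes can be sketched in a few lines. The snippet below is a minimal illustration, not NetFlow itself: it assumes flow records have already been exported and parsed into simple dictionaries, and the field names (`src`, `dst`, `bytes_out`), the allow-list, and the threshold are all hypothetical stand-ins.

```python
# Minimal sketch: flag suspicious egress in already-parsed flow records.
# Field names (src, dst, bytes_out) are illustrative, not NetFlow spec fields.

KNOWN_DESTINATIONS = {"10.0.0.5", "203.0.113.10"}  # hypothetical allow-list
EGRESS_THRESHOLD = 50_000_000  # bytes; tune to the environment

def flag_suspicious(flows):
    """Return flows that send unusually large volumes to unknown hosts."""
    suspicious = []
    for flow in flows:
        unknown_dst = flow["dst"] not in KNOWN_DESTINATIONS
        large = flow["bytes_out"] > EGRESS_THRESHOLD
        if unknown_dst and large:
            suspicious.append(flow)
    return suspicious

flows = [
    {"src": "10.0.0.2", "dst": "203.0.113.10", "bytes_out": 1_000},
    {"src": "10.0.0.2", "dst": "198.51.100.77", "bytes_out": 80_000_000},
]
print(flag_suspicious(flows))  # only the large transfer to the unknown host
```

A real deployment would baseline normal traffic per host first, so "unusual" is measured against history rather than a fixed threshold; that baselining is what "weeds out background noise and exposes what is unique."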

"IT departments can greatly benefit from tools that minimize the noise in network traffic, weed out background noise and expose what is unique," she said.

Williamson recommended classifying all network traffic at the application level, an approach that can help identify encrypted tunnels as well as traffic that should not be flowing through the network at all. Separating known from unknown traffic, whitelisting approved traffic, and treating unknown traffic with suspicion is generally a good idea: "Half of all unknown traffic is typically connected to malware," Williamson said. He even recommended blocking all unknown data by default. Landesman suggests the use of exclusion lists of traffic "you don't want to see."
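The default-deny policy Williamson describes can be reduced to a short sketch. This is an assumption-laden toy, not a real firewall: genuine application identification inspects payloads and behavior, while the port map, whitelist, and field names below are purely illustrative.

```python
# Sketch of "block unknown by default": classify flows by application,
# allow only whitelisted applications, drop everything unidentified.
# The port-based classifier is a crude stand-in for real app identification.

WHITELIST = {"https", "dns", "smtp"}  # hypothetical approved applications

PORT_TO_APP = {443: "https", 53: "dns", 25: "smtp"}  # illustrative port map

def classify(flow):
    """Map a flow to an application name, or 'unknown' if unrecognized."""
    return PORT_TO_APP.get(flow["dst_port"], "unknown")

def policy(flow):
    """Default-deny: permit only whitelisted, identified applications."""
    app = classify(flow)
    return "allow" if app in WHITELIST else "block"

print(policy({"dst_port": 443}))   # allow
print(policy({"dst_port": 6667}))  # block (unknown application)
```

The design point is the final `return`: anything the classifier cannot name falls through to "block," which is exactly the posture Williamson recommends for unknown traffic.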

Wolfgang Gruener is a contributor to Tom's IT Pro. He is currently principal analyst at Ndicio Research, a market analysis firm that focuses on cloud computing and disruptive technologies, and maintains the conceivablytech.com blog. An 18-year veteran in IT journalism and market research, he previously published TG Daily and was managing editor of Tom's Hardware news, which he grew from a link collection in the early 2000s into one of the most comprehensive and trusted technology news sources.