Premium Research

You can't buy a hybrid cloud as a product or as a service, and even if you could, you would need to customise it for your unique requirements and constraints. The reality today is that you need to buy the ingredients from a supplier and then roll your own hybrid cloud; to manage this process, you need to put in place a Hybrid Cloud Manifesto.

SPC-2 is a useful benchmark for bandwidth-intensive sequential workloads such as backup, ETL (extract, transform, load), and large-scale analytics. Wikibon performs a deep comparative analysis of the SPC-2 results, time-adjusting the pricing information to correct for different publication dates. Wikibon then analyses performance and price-performance together and develops a guide to help practitioners understand the business options and best strategic fit. Wikibon concludes that the Oracle ZS4-4 storage appliance dominates this high-bandwidth segment, offering the best combination of good performance and great price-performance at the high end and mid-range of this market.
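As a toy illustration of the time-adjustment idea, the sketch below deflates each published price by an assumed annual price-decline rate before computing dollars per MBPS. The decline rate, array names, and all figures here are invented for illustration; this is not Wikibon's actual model or published data.

```python
# Sketch: time-adjusting SPC-2 pricing to a common date so that
# price-performance ($ per MBPS) is comparable across publications.
# The 15%/year price-decline rate and the sample entries are assumptions.

ANNUAL_PRICE_DECLINE = 0.15  # assumed annual storage price erosion

def adjusted_price(list_price, years_since_publication,
                   decline=ANNUAL_PRICE_DECLINE):
    """Deflate an older published price to today's equivalent."""
    return list_price * (1.0 - decline) ** years_since_publication

def price_performance(list_price, years_since_publication, mbps):
    """Time-adjusted dollars per MBPS (lower is better)."""
    return adjusted_price(list_price, years_since_publication) / mbps

# Hypothetical entries: (name, published price $, years old, SPC-2 MBPS)
results = [
    ("Array A", 400_000, 2.0, 17_000),
    ("Array B", 350_000, 0.5, 12_000),
]
for name, price, age, mbps in results:
    print(f"{name}: ${price_performance(price, age, mbps):.2f}/MBPS")
```

Adjusting prices to a common date is what makes results published years apart comparable; without it, an older array's higher list price unfairly penalises its price-performance.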

The thesis of the overall Wikibon research in this area is that within two years, the majority of IT installations will move to combining workloads so they share data, with NAND flash as the only active storage media. This will save on IT budget and improve IT productivity, especially in the IT development function. Our research shows that these changes have the potential to reduce the typical IT budget by 34% over a five-year period while delivering the same functionality to the business. The projected IT savings of moving to a shared-data all-flash datacenter for an organization with a $40M IT budget are $38M over 5 years, with an IRR of 246%, an annual ROI of 542%, and a breakeven of 13 months. Future research will look at the potential to maximize the contribution of IT to the business, and will conclude that IT budgets should increase to deliver historic improvements in internal productivity and increased business potential.
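To make the breakeven and IRR metrics concrete, here is a minimal sketch of how such figures fall out of a cash-flow model. The cash flows below are hypothetical and deliberately do not attempt to reproduce Wikibon's $38M / 246% IRR results.

```python
# Sketch: payback (breakeven) and IRR from an invented cash-flow stream.
# These flows are illustrative only, NOT Wikibon's underlying model.

def payback_months(monthly_flows):
    """First month at which cumulative net cash flow turns non-negative."""
    total = 0.0
    for month, flow in enumerate(monthly_flows, start=1):
        total += flow
        if total >= 0:
            return month
    return None

def irr(annual_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Annual IRR via bisection on NPV (assumes a single sign change)."""
    def npv(rate):
        return sum(f / (1 + rate) ** t for t, f in enumerate(annual_flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid  # NPV still positive: the IRR lies above mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical project: $2M up-front cost, then growing annual savings.
flows = [-2_000_000, 1_500_000, 2_000_000, 2_500_000, 2_500_000, 2_500_000]
print(f"IRR: {irr(flows):.1%}")

# Monthly view for breakeven: same up-front cost, $200k/month savings.
print("payback month:", payback_months([-2_000_000] + [200_000] * 24))
```

The pattern to note is that a modest up-front cost followed by large recurring savings produces the very high IRR and short breakeven that characterise the projections above.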

The Public Cloud market is still forming, but it seems poised to soon enter the Early Majority stage of its development, in which user behavior, preferences, and strategies become more stable. Large enterprises are more discerning about Public Cloud IaaS offerings. Test and development appears to be a key entry point for them, since scale, operational complexity, and security/compliance/regulatory demands require a more nuanced approach to Public Cloud IaaS. Small and medium enterprises have the greatest need for Public Cloud; to help them navigate an increasingly complex and costly IT infrastructure environment, they should consider well-established, lower-risk entry points such as SaaS, email, and web applications before venturing into mission-critical and IaaS workloads.

Big Data Security and Intelligence into 2013

The Information Age is quickly becoming the age of information overload. To combat this, we've seen search, curation, and even intelligent agents start to take shape out of the desire to make sense of all the available data. For big business it's a different problem altogether: the complexity of the data that flows through their systems creates numerous new ways for attackers to exploit loopholes or amplify flaws in operations. In the current culture of cybersecurity, this could mean leaking internal and confidential customer data, losing millions of dollars to fraud, or worse.

To combat this, Big Data solutions have been deployed across numerous enterprises to help prevent or discover fraud and shore up potential security holes.

Wikibon's Jeff Kelly recently released his Big Data market revenue report, and in it many different outfits can be seen pouring hard-earned profits into this new technology. Many of them have their own projects that reflect the use of Big Data for security.

One detail about the IBM solution that fits nicely into Kelly's market report is the increased adoption of Hadoop for Big Data products: IBM roundly and proudly trumpets Hadoop-based analytics as part of its core product set.

Splunk

For log analysis, maintenance, and intrusion detection, Splunk has been at the forefront of analytics for a very long time. I'm personally familiar with how Splunk has been used for cybersecurity, and how its approach to Big Data analysis (specifically, unstructured data from a multitude of otherwise unconnected sources) could be used to protect critical infrastructure.

Splunk has a deep relationship with security, and its primary products in that area center on enterprise security, advanced persistent threats, and log analysis.

Its presence in the market list isn't at all surprising, especially given how Splunk describes the necessity of Big Data analysis for advanced persistent threats (APTs). Many attackers take advantage of holes left in the infrastructure or culture of an enterprise to sneak in and do their damage, but they don't often do it just once: they frequently leave behind malware or backdoors, as we've seen in the recent hacks against US journalism sites such as the Wall Street Journal and the New York Times.

Big Data is an important part of watching for patterns that reveal the activity of malicious actors inside a network. Sometimes this means collecting data not just on what computers are doing, but also on what people are doing and how information is moving to and from networks. Huge amounts of unstructured data need to be processed, context sifted through, and rules designed to identify patterns that are out of the norm.
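As a toy sketch of that kind of rule-based pattern detection (scaled down far below Big Data size), the following counts failed logins per source from raw log lines and flags sources whose counts sit far above the norm. The log format with its `src=` field and the three-sigma threshold are assumptions for illustration, not any vendor's actual rule set.

```python
# Sketch: flagging out-of-the-norm activity from unstructured log lines.
# The log format and threshold rule are invented for this example.
import re
from collections import Counter
from statistics import mean, stdev

LOG_PATTERN = re.compile(r"src=(?P<src>\S+)")  # assumed log field

def failed_logins_per_source(log_lines):
    """Count failed-login events per source address."""
    counts = Counter()
    for line in log_lines:
        if "login failed" in line:
            m = LOG_PATTERN.search(line)
            if m:
                counts[m.group("src")] += 1
    return counts

def anomalous_sources(counts, sigmas=3.0):
    """Flag sources whose count exceeds mean + sigmas * stdev."""
    values = list(counts.values())
    if len(values) < 2:
        return []
    threshold = mean(values) + sigmas * stdev(values)
    return [src for src, n in counts.items() if n > threshold]
```

Real deployments replace the simple threshold with learned baselines over many signals, but the shape is the same: reduce unstructured events to per-entity features, then flag deviations from the norm.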

Others in the Big Data security sphere

Also in the list, EMC stands out as a big name in storage, and its acquisition of RSA positions it nicely in Big Data security as well. The company released an excellently concise white paper on how Big Data can transform security by bringing together infrastructure, analytics, and intelligence (the big three of every security management perspective).

Cisco recently acquired Cognitive Security, a Czech network security startup, to develop a Big Data solution for detecting anomalies. As an outfit that primarily sells and builds networks, Cisco is well placed to collect data from distant nodes and analyze it to determine whether anything out of the ordinary is going on.

Overall, it's obvious that in 2013 outfits that work with Big Data are continuing to see how effective it is in supplementing standard security practices, primarily because analysis of both spontaneous and persistent threats requires real-time analysis of current and historical data. It also shows that much of the market is preparing for more businesses seeking solutions that deliver not just intrusion detection but swift answers on how to deal with intruders once discovered, and Big Data is leading that charge.

About Kyt Dotson

Kyt Dotson is a Senior Editor at SiliconAngle, covering beats that include DevOps, security, gaming, and cutting-edge technology. Before joining SiliconAngle, Kyt worked as a software engineer, starting at Motorola in QA and eventually settling at Pets911.com, where he helped build a vast database for pet adoption and a lost-and-found system. Kyt is a published author whose science fiction and fantasy works incorporate ideas from modern-day technological innovation and explore the outcome of living with those technologies.