Monthly Archives: November 2014

Open source software built the Internet, but for years was distrusted by large enterprises. That’s obviously changed, with Linux servers, LAMP stacks and Hadoop clusters now common in enterprise data centers. But despite a history of strident debates, open source versus proprietary is no longer an either-or decision. Indeed, the distinction between the two is nuanced and hard to identify as vendors leverage freely available code, fund open source projects and embrace open APIs within proprietary products. As I write in this Network Computing cover story, the era of public cloud services and software-defined infrastructure means that philosophical debates over OpenStack versus vCloud or OpenDaylight versus Cisco ACI miss the point. For IT, ideological purity is neither possible nor desirable. When it comes to technology decisions, it’s business, not dogma.

Contention between open source groups freely releasing code and commercial vendors capitalizing on proprietary products started the minute software became a profit-generating industry. The latest battleground is enterprise data centers where the fights focus on cloud stacks, software-controlled networks, and big data systems. The lines are far from clear-cut since established IT vendors incorporate open source code, APIs, and standards in their products. On the flip side, startup companies are happy to commercialize public open source projects if they see demand for a polished version of rough, complex, not easily configured open source code. IT must take a pragmatic approach to software and vendor selection. Indeed, InformationWeek survey data shows that few IT organizations have black-and-white policies, with only 14% of respondents stating that they can’t or won’t use open source software.

The open source/proprietary distinction spans every corner of the data center, and the full NWC digital issue analyzes survey questions focused on the software alternatives in three emergent and important areas of IT infrastructure: cloud stacks, SDN and network automation, and big data systems. There’s plenty of innovation on both sides of the software development divide. For IT, that means taking a pragmatic approach to software and vendor selection, focusing on requirements, features and results, not the development process and philosophy. The question isn’t OpenStack versus VMware vCloud or OpenDaylight versus Cisco ACI, but how each can contribute to meeting your IT goals.

Open source proponents, like Red Hat CEO Jim Whitehurst, argue that it’s a much more efficient software development model that leads to better, faster and cheaper code, and that open source’s distributed, collaborative nature fosters greater innovation. Says Whitehurst, “If you look at where most new applications are getting built, and therefore where so much of the innovation around languages, frameworks and management paradigms are happening, it’s around an open infrastructure.”

Most of those leading large enterprises based on proprietary software acknowledge the importance of open source in today’s software ecosystem, but would agree with VMware CEO Pat Gelsinger that proprietary systems can build added features without the design-by-committee infighting that sometimes reduces open source projects to the lowest common denominator. In an exclusive interview after his 2014 Interop keynote, Gelsinger says VMware embraces openness through support of open, standard APIs. “We’re embracing APIs at multiple levels of our products and saying we will increasingly support these open interfaces into our products. And you’ll see us make more announcements on that as we go forward.”

As networking increasingly becomes defined by software, not hardware, the line between open source and proprietary systems is perhaps the blurriest of all. Not only have established vendors like Cisco begun using merchant switching silicon available to anyone (Lippis Report has an excellent review), but they are supporting open standards like Open vSwitch and VXLAN while simultaneously building proprietary alternatives to OpenFlow (OpFlex) and OpenDaylight (ACI).

Ultimately, the debate about open source versus proprietary software is academic. For enterprise IT, open source purity is neither possible nor desirable. Instead, demand interoperable, flexible products resistant to lock-in that use standard or published APIs and data formats. The focus should be on results not process.

A year ago, Target was in the midst of being pwned by cyber criminals who turned the season into anything but a Merry Christmas. The retailer ultimately discovered that more than 70 million customers had personal and payment card information stolen in an exploit that cost the company upwards of $400 million and the CEO his job. The year since has been filled with cyber-breach-of-the-week headlines (we’re looking at you, JP Morgan and Home Depot) to the point of giving the general public hack fatigue. Yet people rely on smartphones for more than just Facebook updates; executives are increasingly leaving the laptop at home and working strictly off mobile devices.

While consumers and employees may be tuning out the exploit overload, cyber security has IT professionals in a state of high anxiety. When asked about project priorities, IT pros put security at or near the top of the list. An InformationWeek survey of IT executives found 88% have security initiatives and major implementation projects planned for the next year or two. Although enterprise-wide security is top-of-mind, dealing with mobile app security issues is a close second, and it’s no longer optional. In the same InformationWeek survey, 80% of respondents say they plan to build mobile apps, while 58% are deploying tablets.

As the full column points out, Silicon Valley VCs have noticed and see a huge business opportunity. Last summer, Lookout, a developer of one of the first consumer-oriented smartphone security apps, landed $150 million to fund expansion into the enterprise market. This comes on top of earlier funding rounds from a who’s who of Silicon Valley VCs totaling $130 million. Other startups like Bluebox Security, Nok Nok Labs and Wickr have raised tens of millions in venture funding this year alone. Money like that clearly shows that security is a critical piece of the mobile ecosystem and economy.

Mobile security innovation is happening on several fronts. The key technology categories are:

strong authentication and password management

application sandboxing (also called application containerization): keeping business data safe from rogue applications employees might download on personal devices by controlling the code execution environment and interaction with the rest of the system

secure, encrypted, time-limited messaging, aka Snapchat for business: a place to exchange sensitive information without risking that the messages live in perpetuity
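Of those categories, strong authentication is the easiest to make concrete. As an illustration (not tied to any vendor above), here is a minimal sketch of an RFC 6238 time-based one-time password, the mechanism behind most authenticator apps, using only Python’s standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per the RFC
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, digits: int = 6, step: int = 30, now=None) -> str:
    """RFC 6238 time-based OTP: HOTP over the current 30-second window."""
    t = int((time.time() if now is None else now) // step)
    return hotp(key, t, digits)

# RFC 6238 test vector: at T=59 seconds the 8-digit SHA-1 code is 94287082
print(totp(b"12345678901234567890", digits=8, now=59))
```

The printed value matches the published RFC 6238 test vector, a handy sanity check for any implementation.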


As I point out in the column, a password vault is a great starting point for reducing your overall threat profile on both mobile devices and PCs. For mobile users password vaults mean never having to remember (and trying to type) a long, random password. Apps like 1Password, Dashlane and LastPass can either automatically populate login screens or be used to copy/paste into the password field. Some even support Apple’s Touch ID, meaning opening the password vault is just a thumbprint away.
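The practical upside is the kind of credential a vault lets you use. A short sketch with Python’s `secrets` module (the `generate_password` helper is hypothetical, not from any of the apps above) shows how cheap a long, random, untypeable password is to produce:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a password from cryptographically secure random choices."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # 20 characters drawn from a ~94-symbol alphabet
```

At 20 characters over roughly 94 symbols, each password carries well over 100 bits of entropy, far beyond anything a user would memorize or thumb-type on a phone keyboard, which is exactly why the vault, not the human, should hold it.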

There are several interesting password alternatives under development that I will be discussing in the coming months, but given the inertia involved in changing security procedures at millions of sites with billions of users, passwords will continue to be a part of our online existence for quite a while. Until that day of password-free bliss arrives, a password vault should be part of everyone’s mobile tool chest.

There’s an interesting dichotomy between old-tech and new-tech companies that crosses several dimensions: speed and level of innovation, financial performance and cultural work environment. It’s the contrast between tech stalwarts that gave us the PC, Internet and modern database and those from the post-PC generation whose origins date from the era of smartphones, social networks and cloud services. However, as companies like a revitalized Microsoft under its cloud-savvy new CEO have demonstrated, the distinction needn’t be an unbridgeable chasm, and of old-tech companies, the former Computer Associates is perhaps the perfect evangelist of new-tech business transformation. The company that started out selling mainframe software loaded from punched cards has successfully navigated several major technology epochs. But as I learned last week at the firm’s premier customer event, CA World, and detail in this column, CA is in the midst of another rebirth that rivals its prior evolution from mainframe software developer to full-line supplier of IT infrastructure and development software.


CA sees the confluence of cloud systems and services, high-velocity, API-enabled app development and mobile device ubiquity as the catalysts for its next transition into hoped-for preeminence as the arms merchant for the next generation of business software. In sum, CA wants to be your app store and SaaS provider for a new generation of applications for the software-defined business. CA execs were on message with the refrain “software is the business,” and while it over-simplifies the complexities of modern business, it served to drive home a key message: Marc Andreessen was right, software is eating the world and business success will go to those that exploit it.

As the column illustrates, there are plenty of trends in support of CA’s contention that software in all its forms, but particularly applications that engage with customers (increasingly mobile) and enable new business processes, is perhaps the most important element of business strategy. Collectively, as CA CEO Mike Gregoire pointed out, it means “applications now define a business’ relationship with its customers and fuel the productivity of its employees”, what CA terms the application economy. In this app-centric era, it means that proficiency at agile software development using DevOps methodologies, exploiting cloud services and seamlessly integrating tight security is critical to future business success.

While CA’s visionary message occupied the headlines, there was plenty of old-tech on display at CA World: mainframe management (IBM even had the guts of a mainframe on display!), central user identity control and SSO, IT portfolio management with KPI dashboards, asset tracking, etc. Indeed, listen to the Q&A during the technical sessions and it was clear most of the assembled masses were concerned about the here and now of managing existing IT systems, working with legacy CA products and learning about the latest feature upgrades.

Multi-chip CPU module for latest IBM mainframes.

Indeed, this apparent disconnect between the short-term needs and goals of most IT practitioners and the strategic direction that business competition compels old-tech IT vendors to embrace poses a significant challenge for executives across the industry, whether Cisco, HP, Oracle, or CA. Leaders at CA used business case studies from forward-thinking customers as the carrot, enticing attendees with stories of rapid new product development, revenue growth and ROI. Whether coming from new-tech leaders like Twitter or century-old household names like Smuckers, the message was similar: innovative use of new technology like cloud services, mobile apps, DevOps processes and rapid, API-driven software development leads to competitive advantage via faster product and service development, new revenue streams, more satisfied and engaged customers and re-energized employees.

It’s a powerful message since in the software defined business, the race (and success) does go to the swift.

Typical CA World session walking through mobile app design and system integration.

Cyber security news has been almost universally dispiriting for the last few years as the barrage of new exploits has created a sort of ‘breach fatigue’. As I wrote in this column, cyber security has been a losing game of whac-a-mole for years as the malefactors manage to pop out of new security holes faster than IT and their software suppliers can plug the last batch. With the knee-jerk IT response of reflexively adding another security product to patch the latest hole, the game has also been a costly one for businesses and end users, which have collectively spent billions of dollars on an increasing array of products and annual upgrades to address each new threat category and set of exploits. Of course, this has made the escalation of breaches quite lucrative for the security-industrial complex, both established multi-billion dollar firms like Symantec and McAfee (now Intel Security) and startups like FireEye and Palo Alto Networks, that have racked up multi-billion dollar sales and stock valuations in the past few years.


Yet the spending has arguably been a waste. The onslaught of cyber criminals has been unrelenting, with their successful escapades demonstrating the need for dramatic changes to security product designs and substantial upgrades to enterprise systems and practices. Fortunately, some in the industry see the need, as Intel and its McAfee division illustrated by outlining a new security architecture at the recent FOCUS conference.

As I detail in the column, Intel used keynotes by GM Pat Calhoun and CTO Mike Fey to explain the firm’s integrated security architecture and the power of automated information collection and sharing between myriad security systems, what Gartner Research Director Lawrence Pingree calls “intelligence awareness”. The idea is to build security systems that continuously inform each other of newly detected threats and adapt their behavior in real time, preventing detected and remediated threats from spreading via alternative distribution channels. For example, once a PC anti-malware system detects a dangerous PDF attachment, why not have a content filter block all files matching its signature at the network edge?
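That PDF example can be sketched as a shared signature exchange. This toy model (the class and method names are my own, and real products use a message bus and richer indicators than a bare file hash) shows the detect-once, block-everywhere idea:

```python
import hashlib

class ThreatIntelExchange:
    """Toy model of shared threat intelligence: any sensor that detects a
    threat publishes its signature, and every other enforcement point
    consults the same list. Illustrative only; production systems
    distribute indicators over a message bus, not an in-memory set."""

    def __init__(self):
        self.blocked_signatures = set()

    def publish(self, payload: bytes) -> None:
        """Called by whichever sensor detected the threat first."""
        self.blocked_signatures.add(hashlib.sha256(payload).hexdigest())

    def is_blocked(self, payload: bytes) -> bool:
        """Called by every other control, e.g. an edge content filter."""
        return hashlib.sha256(payload).hexdigest() in self.blocked_signatures

exchange = ThreatIntelExchange()
malicious_pdf = b"%PDF-1.4 ...exploit bytes..."
exchange.publish(malicious_pdf)            # endpoint anti-malware flags it
print(exchange.is_blocked(malicious_pdf))  # network edge now rejects the same file: True
```

The point of the sketch is the shared state: one detection immediately changes the behavior of every subscribed control, rather than each product re-discovering the threat on its own.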

Intel Security Data Exchange. Source: Intel Security

Another critical aspect of a data-driven security design is linking incident detection with response. Says Fey, “Don’t just ring bells and blow whistles, but take action on indicators of attack.” It seems obvious, but so many security systems are just noise machines, logging alert after alert, making it impossible for stressed and overworked security staffers to analyze them all. Indeed, that’s one factor that led to the Target incident: the company’s security team either missed or didn’t appreciate the severity of warnings issued by its newly deployed FireEye system.

Intel’s strategy, and similar approaches using aggregated data from myriad security systems, promises to finally give businesses and IT security teams the upper hand against cyber attackers. I agree with Fey that by basing security systems on an extensible architecture of linked and communicating systems, security researchers and operations teams can accelerate the rate of defense innovation and deployment and finally squelch the wily moles before they have a chance to pop up from another hole.

In a city where “doubling down” evokes images of smoke-filled rooms where gamblers donning shades push stacks of chips around a green felt table, NetApp, as I detailed in this column, used its Insight conference in Las Vegas to do exactly that with a cloud-centric strategy that moves the company further away from its heritage as a storage box builder. It’s ironic that a company whose name derives from the word “appliance” has now firmly planted its flag on the promised land of software virtualization and cloud services, but like other enterprise box builders, NetApp must react to the inexorable advance of commodity hardware and cloud services on the heart of its business. Insight is NetApp’s highest-profile event, featuring the company’s entire C-suite wooing thousands of its best customers, and of the five main product announcements, four dealt with software and the fifth highlighted new professional services designed to facilitate customers’ move to the cloud. Speeds and feeds are a thing of the past.

After a series of press briefings and individual interviews with NetApp’s top executives, it’s clear the company sees itself “as a provider of enterprise-class data management systems”, not just big storage arrays, and the cloud is the change agent. Storage technologies and products pioneered by the likes of Amazon, Google, Facebook and the OpenStack ecosystem are now seen as safe for corporate use, and businesses increasingly question the need to spend six or seven figures on a new storage system when they can get the same or better performance and capacity at a fraction of the cost using cloud design principles. Indeed, NetApp’s revenues have been flat for the last three years while 2014 operating income is 11% below that of three years ago.

NetApp’s answer is what they call a data fabric stitching cloud and managed services, public IaaS and on-premise virtual systems into an information quilt used by all applications, regardless of where they run. The strategy is to evolve NetApp’s storage software and management services into the unifying broker between business applications and multiple cloud platforms. The vision is of a data-centric cloud architecture, built upon a consistent storage layer that facilitates inter-cloud usage and workload movement and eliminates cloud lock-in. It’s an ambitious plan that starts by decoupling NetApp’s sophisticated data management software from the hardware, allowing it to run on third-party disk arrays, directly on virtual servers and now, natively on cloud services.
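NetApp hasn’t published its fabric as code, of course, but the broker idea can be sketched in the abstract. The classes below are hypothetical and only illustrate the decoupling of a single data namespace from wherever the bytes physically live:

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Common data-management interface; real backends might wrap an
    on-premise array, a public cloud object store, or a virtual appliance."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryBackend(StorageBackend):
    """Stand-in backend so the sketch runs without any infrastructure."""
    def __init__(self):
        self._store = {}
    def put(self, key, data):
        self._store[key] = data
    def get(self, key):
        return self._store[key]

class DataFabric:
    """Broker presenting one namespace over many backends, able to move
    data between them -- the kind of decoupling the strategy describes."""

    def __init__(self, backends):
        self.backends = backends
        self.placement = {}  # key -> name of the backend holding it

    def put(self, key, data, backend):
        self.backends[backend].put(key, data)
        self.placement[key] = backend

    def get(self, key):
        # Applications never know (or care) which backend answers.
        return self.backends[self.placement[key]].get(key)

    def migrate(self, key, target):
        """Move data between clouds without the application noticing."""
        self.backends[target].put(key, self.get(key))
        self.placement[key] = target

fabric = DataFabric({"on_prem": InMemoryBackend(), "cloud": InMemoryBackend()})
fabric.put("report.pdf", b"quarterly numbers", backend="on_prem")
fabric.migrate("report.pdf", "cloud")
print(fabric.get("report.pdf"))  # same bytes, new location: b'quarterly numbers'
```

The sketch also makes the lock-in question concrete: applications are free of any one cloud, but every read and write now flows through the fabric itself.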

The column provides more details and analysis; however, using storage as a foundation for a hybrid cloud architecture is hardly unique to NetApp. Indeed, its main competitor, EMC, announced a set of hybrid cloud services on the same day. While I like NetApp’s approach of augmenting bulk cloud or white box storage with a rich set of services, I’m skeptical of the long-term business value since it seems to trade one form of lock-in (to cloud services like AWS) for another (to NetApp’s fabric). Indeed, one could argue NetApp would have users in more of a bind with all of their information assets locked within a single storage pool. How the company mitigates or rationalizes this lock-in potential is something I plan to explore in the coming months. In the meantime, the collective rush of IT vendors to embrace (and define) the hybrid cloud is breathtaking and fun to watch.

Now that both Amazon and Apple have made streaming an included perk of their primary products, I’m wondering how it changes the market. Will enough people see the combo as good enough, ditching NFLX?

Apple's Streaming Strategy Is The Ultimate Magic Trick
buff.ly/2O2QmNM