So it appears the Internet went down, or so many claimed when they were presented with 404 errors while attempting to watch “Georgia Hillbilly Massacre 17: The Return of the Banjo Man” on Netflix – since Netflix is selective about what you can stream, they certainly weren’t queuing up the latest and greatest new releases, but that is a totally different rant – or while attempting to declare themselves Mayor of “who gives a rat’s ass where you are right now” on Foursquare.

Last time this happened some started to claim that it rocked the very foundation of confidence in cloud-computing (here), yet they failed to juxtapose Amazon’s operational failures against the universe of enterprise operational failures, security compromises, and general administrative stupidity that plagues nearly 99.98% of every organization on Earth (minus the DPRK’s website; there’s really not much more you can do to fudge that one up).

In these posts Hoff posits that the mass centralization of information will benefit the industry and that monitoring tools will experience a boon, especially those that leverage a cloud-computing architecture…

This will bring about a resurgence of DLP and monitoring tools using a variety of deployment methodologies via virtualization and cloud that was at first seen as a hindrance but will now be an incredible boon.

As Big Data and the databases/datastores it lives in interact with the proliferation of PaaS and SaaS offers, we have an opportunity to explore better ways of dealing with these problems — this is the benefit of mass centralization of information.

Hoff then goes on to describe how new data warehousing and analytics technologies, such as Hadoop, would positively impact the industry…

Even when we do start to be able to integrate and correlate event, configuration, vulnerability or logging data, it’s very IT-centric. It’s very INFRASTRUCTURE-centric. It doesn’t really include much value about the actual information in use/transit or the implication of how it’s being consumed or related to.

This is where using Big Data and collective pools of sourced “puddles” as part of a larger data “lake” and then mining it using toolsets such as Hadoop come into play…
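Stripped of cluster-scale machinery, the pattern Hoff is describing is essentially a map/reduce pass over pooled log "puddles," keyed on the information being touched rather than on the infrastructure emitting the event. A toy sketch of that shape (every field name and event below is invented for illustration; a Hadoop job would run the same map/reduce structure distributed across the data lake):

```python
from collections import defaultdict

# Hypothetical pooled log events from several sources. Note the keys
# we care about are the data asset and the user -- information-centric --
# not the host, which is the usual infrastructure-centric view.
log_events = [
    {"host": "web01", "user": "alice", "data": "customer_db"},
    {"host": "web02", "user": "alice", "data": "customer_db"},
    {"host": "app01", "user": "bob",   "data": "payroll"},
]

def map_phase(events):
    # Emit (data asset, user) -> 1 pairs, discarding the host key.
    for e in events:
        yield (e["data"], e["user"]), 1

def reduce_phase(pairs):
    # Sum the counts for each (data asset, user) key.
    counts = defaultdict(int)
    for key, n in pairs:
        counts[key] += n
    return dict(counts)

access_by_asset = reduce_phase(map_phase(log_events))
# e.g. {("customer_db", "alice"): 2, ("payroll", "bob"): 1}
```

The point of the sketch is only the change of key: once the lake holds everything, you can re-aggregate the same events around who consumed which information, which is exactly the correlation the IT-centric tooling above can’t express.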

So apparently a group of technologists and vendors working under the cloak of digital darkness drew out a pentagram and locked arms as they called out to Cthulhu to manifest and drive out those that would oppose their ultimate aims of total and complete world domination. Domination brought about through a set of cloud computing solutions that would revolutionize antiquated IT infrastructures and deliver agility, scalability, and operational efficiencies through an open platform at a really, really good price. Blood was spilled, virgins were killed, and apparently an “open” cloud-computing manifesto was drafted. (more…)

I had an interesting conversation with a peer recently that started with his statement that “innovation was all but dead in security.” The implication was that we had done all we could do and that there was very little more to accomplish. Of course I felt this was an overly simplistic and narrow view, not to mention one that completely ignores the rather dramatic impact changes in computing infrastructures will have over the next 5-10 years and beyond.

How have enterprise architectures evolved over the past 10 years, and how will they continue to evolve? Simply put, we are pushing more of our computing assets, and the infrastructure that supports them, out into the Internet / cloud. It began with mobile computing devices, remote offices, and telecommuters and is now moving into aspects of the traditional internal infrastructure, such as storage, application / service delivery, and data management. This has forced IT to, in some cases, radically redefine the technologies and processes it implements to even provide the basics of availability, maintenance, and security. How does an IT organization maintain the health and availability of the evolving enterprise while securing the environment? How does it ensure visibility into and control over an increasingly complex and opaque infrastructure? (more…)

Windows Azure, previously code-named “Red Dog,” is a hosted suite of services, including a highly scalable virtualization fabric (a what?), scalable storage, and an automated service management system. It is pretty close to Amazon Web Services’ EC2 (Elastic Compute Cloud), except for the whole “only Microsoft” thing. Hoff was on the ball and posted his thoughts earlier today (here).

Look, when I’m forced into vendor lock-in in order to host my applications and I am confined to one vendor’s datacenters without portability, that’s not “the cloud” and it’s not an “open architecture,” it’s marketing-speak for “we’re now your ASP/XaaS service provider of choice.”

You can “experience” Azure (here); also check out Manuvir Das, Director on the Windows Azure team, explaining the Windows “Cloud OS” (here), or Steve Marx’s presentation, Azure for Developers (here).

You can read my previous thoughts on cloud-computing (here) and (here).