
Dec 15, 2013

I like asking questions, and I like getting good answers even better. Because of that, I now have a love/hate relationship with search engines. Most of the time they give me a 50% answer: a kind of direction, a suggestion, a bit of coaching towards the real answer. It is like the joke about the consultant; “the right answer must be in there somewhere, because he or she gives me so many responses”.

In spite of all kinds of promises, search engines have not really increased their intelligence. Complex questions with multiple variables are still nearly impossible to get answered, and the suggestions to improve my question are mostly about my spelling, or because the search engine would have preferred to be asked about a different subject.

So nothing really good has come from search engines then? Well, arguably search engines have brought us cloud computing and very powerful access to lots and lots and lots of data, otherwise known as ‘the world wide web’.

No wonder, then, that I envision powerful access and cloud computing as the two most important values we want to keep while increasing the capacity and intelligence to do real analytics on large data sets.

Data Analytics needs cloud computing to create an “Analytics as a Service” model, because that model best addresses how people and organizations want to use analytics.

This Data Analytics as a Service (DAaaS) model should not behave as an application; it should be available as a platform for application development.

The first statement, on the need for cloud computing, suggests we can expect analytics to become easily deployed, widely accessible and not dependent on deep investments by single organizations; ‘as a service’ implies relatively low cost and certainly a flexible usage model.

The second statement, about the platform capability of data analytics, however, has far-reaching consequences for the way we implement and build the analytic capabilities for large data collections.

“Architecturally, and due to the intrinsic complexities of analytical processes, the implementation of DAaaS represents an important set of challenges, as it is more similar to a flexible Platform as a Service (PaaS) solution than a more “fixed” Software as a Service (SaaS) application”

It is relatively easy to implement a single application that will give you an answer to one complex question; many mobile applications are built on this model (take, for example, the many applications for public transport departure and arrival times and connections).

This “1-application-1-question” approach is, in my opinion, not a sustainable model for business environments; we need some kind of workbench and toolkit that is based on a stable and well-defined service.
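The difference between the two approaches can be sketched in a few lines of code. This is purely illustrative: the class and analysis names below are made up for this post and do not come from the white paper. The point is that a platform registers reusable analyses once, and any new question is answered by composition rather than by building yet another dedicated application.

```python
# A minimal sketch (hypothetical names) of the DAaaS "workbench" idea:
# analyses are registered once as reusable services, then applied to any
# data set, instead of hard-wiring one application per question.
from statistics import mean

class AnalyticsPlatform:
    """Toy stand-in for a DAaaS platform: a registry of reusable analyses."""
    def __init__(self):
        self._analyses = {}

    def register(self, name, func):
        # Register a generic analysis once; it becomes part of the toolkit.
        self._analyses[name] = func

    def run(self, name, data):
        # Any client question is answered by invoking a registered analysis.
        return self._analyses[name](data)

platform = AnalyticsPlatform()
platform.register("average", mean)
platform.register("peak", max)

# One data set, two questions - no new application needed for either.
departures = [4, 7, 5, 12, 6]  # e.g. minutes between departures
print(platform.run("average", departures))  # 6.8
print(platform.run("peak", departures))     # 12
```

A real DAaaS platform would of course add multi-tenancy, data access control and scaling, but the registration-and-composition pattern is the core of what makes it a platform rather than an application.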

The white paper describes a proof of concept that has explored such an environment for re-usability, cloud aspects and flexibility. It also points to the technology used and how the technology can work together to create ‘Data Analytics as a Service’.

Nov 05, 2013

The change in the IT landscape brought about by the introduction of cloud computing is now driving a next generation of IT enablement. You might call it Cloud 2.0, but the term 'Liquid IT' covers much better what is being developed.

In a recently published white paper by the Atos Scientific Community, Liquid IT is positioned not only as a technology or architecture; it is also very much focused on the results of this change on the business you are doing day to day with your customer(s).

"A journey towards Liquid IT is actually rather subtle, and it is much more than a technology journey"

The paper explains in detail how the introduction of more flexible IT provisioning, now done in real time, allows for financial transparency and agility. A zero-latency provisioning and decommissioning model, complete with genuine utility pricing based on actual resources consumed, enables us to drive the optimal blend of minimizing cost and maximizing agility. Right-sizing capabilities and capacity to the needs of the users at all times will impact your customer relationship – but, very importantly, designing such a system starts with understanding the business needs.
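The utility-pricing point can be made concrete with a back-of-the-envelope calculation. The numbers and rates below are invented for illustration only; they are not from the white paper. With a short demand peak, paying per server-hour actually consumed beats provisioning for the peak around the clock.

```python
# Illustrative comparison (made-up numbers) of fixed-capacity pricing
# versus the utility pricing that zero-latency provisioning enables.

def fixed_capacity_cost(peak_servers, cost_per_server_hour, hours):
    # Traditional model: provision for the peak and pay for it constantly.
    return peak_servers * cost_per_server_hour * hours

def utility_cost(servers_used_per_hour, cost_per_server_hour):
    # Liquid model: resources appear and disappear with demand,
    # so you pay only for server-hours actually consumed.
    return sum(servers_used_per_hour) * cost_per_server_hour

usage = [2, 2, 3, 10, 10, 3, 2, 2]  # hourly demand with a short peak
rate = 0.50                         # hypothetical cost per server-hour

print(fixed_capacity_cost(10, rate, len(usage)))  # 40.0
print(utility_cost(usage, rate))                  # 17.0
```

The gap between the two figures is exactly the idle capacity the fixed model forces you to buy, which is why right-sizing all of the time matters financially as well as technically.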

"Liquid IT starts from the business needs: speed, savings, flexibility, and ease of use"

Existing examples of extreme flexibility in IT (think Gmail, Hotmail or other consumer-oriented cloud offerings) have had to balance standardization against scale: the more standard the offering, the better it scales. This has always been a difficult scenario for more business-oriented applications. The paper postulates that with proper care for business needs and the right architecture, similar flexibility is achievable for business processes.

Such a journey to 'Liquid IT' indeed includes tough choices in technology and organization, but also forces the providers of such an environment to have an in-depth look at the financial drivers in the IT provisioning and the IT consumption landscape.

"The objectives of financial transparency dictate that all IT services are associated with agreed processes for allocation, charging and invoicing"

There are two other aspects that need to change in parallel with this move to more agility in IT: the role of the CIO will evolve, and the SLAs that the CIO is either buying or selling will change accordingly.

Change management will transform into information management, as the use of IT as a business enabler is no longer the concern of the CIO. IT benchmarking will become an increasingly important tool for business owners to measure the level of agility achieved. The contribution to business performance will be measured and needs to be managed in line with business forecasts.

The white paper authors conclude that "Business agility is the main result of Liquid IT" – sounds like a plan!

Jan 22, 2013

Atos just announced the publication of Ascent Journey 2016 - Enterprise Without Boundaries

"Ascent Journey 2016 is a unique and comprehensive document where Atos’ Scientific Community presents its predictions and vision for the technology that will shape business through to 2016.

It builds on Journey 2014 – Simplicity with Control and is enriched by the new challenges which have now emerged in reshaping both society and business alike.

Our research suggests that the convergence of key issues affecting Demographics, Globalization and Economic Sustainability, underpinned by Trust, will see a new way of working emerge in which traditional barriers no longer exist, but where security and privacy are more important than ever."

Exciting stuff, and I am honoured to say I was part of the editorial board that produced this document.

Jan 13, 2013

Cloud computing has changed from being an easy choice into being a difficult decision.

The reason is the proliferation of cloud offerings at all layers; today we find not only ‘everything-as-a-service’ cloud solutions, but also ‘everything-is-tailored-for-your-specific-situation-as-a-service’ offerings tagged as cloud solutions.

Is this good? I do not think so.

My main objection is that you will end up with a cloud solution that is no different from any solution you have previously designed and installed yourself, only at a cheaper rate and with a lower-quality SLA.

True cloud solutions should not only focus on cost reduction, increased agility and flexible capabilities. You should also be buying something that supports portability between the private and public computing domain, and across different vendor platforms.

In early cloud solutions, mainly the ones focussing on Infrastructure-as-a-service, this portability has been heavily debated (remember the ‘Open Cloud Manifesto’?) and in the end we concluded that server virtualization solved a lot of the portability issues (I am simplifying of course).

We also had Software-as-a-service, and some publications showed that portability could be addressed by looking at standardized business process definitions and data normalisation (again, I am simplifying).

Now the Atos Scientific Community has published a whitepaper that looks at the most complex form of cloud computing: Platform-as-a-service.

“PaaS offerings today are diverse, but they share a vendor lock-in characteristic. As in any market for an emerging technology, there is a truly diverse array of capabilities being offered by PaaS providers, from supported programming tools (languages, frameworks, runtime environments, and databases) to various types of underlying infrastructure, even within the capabilities available for each PaaS”

So a common characteristic that can be extracted from all this diversity is that PaaS users are currently bound to the specific platform they use, making it difficult to port the software (and data) created on top of these platforms.

As a result we see slow adoption of PaaS in the enterprise; only those groups that have a very well-defined end-user group are looking at PaaS – and mostly for the wrong reason: ‘just’ cost saving through standardization.

In the Atos Scientific Community whitepaper they are identified as:

“Two primary user groups which benefit from using Cloud at the Platform as a Service level: Enterprises with their own internal software development activities and ISVs interested in selling SaaS services on top of a hosted PaaS.”

The current situation, where PaaS mostly results in vendor lock-in scenarios, is holding back the full potential of applications on a PaaS.

By introducing a general-purpose PaaS, we would enable a comprehensive, open, flexible, and interoperable solution that simplifies the process of developing, deploying, integrating, and managing applications in both public and private clouds.
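One way to picture how a general-purpose PaaS avoids lock-in is the adapter pattern: applications talk to a neutral deployment contract, and each provider supplies an adapter behind it. The sketch below is my own illustration; the interface and adapter names are hypothetical and do not correspond to any real PaaS API or to the architecture in the whitepaper.

```python
# Hedged sketch of PaaS portability via a neutral deployment interface.
# All names here are illustrative, not from any real PaaS product.
from abc import ABC, abstractmethod

class PaasAdapter(ABC):
    """Neutral contract that every provider-specific adapter implements."""
    @abstractmethod
    def deploy(self, app_name: str, runtime: str) -> str: ...

class PrivateCloudAdapter(PaasAdapter):
    def deploy(self, app_name, runtime):
        # In reality: calls into the private cloud's provisioning API.
        return f"deployed {app_name} ({runtime}) on private cloud"

class PublicCloudAdapter(PaasAdapter):
    def deploy(self, app_name, runtime):
        # In reality: calls into a public provider's API.
        return f"deployed {app_name} ({runtime}) on public cloud"

def release(app_name: str, runtime: str, target: PaasAdapter) -> str:
    # The application only ever sees the neutral interface, so moving
    # between clouds means swapping the adapter, not rewriting the app.
    return target.deploy(app_name, runtime)

print(release("order-service", "java11", PrivateCloudAdapter()))
print(release("order-service", "java11", PublicCloudAdapter()))
```

The hard part, of course, is agreeing on the neutral contract itself; that is precisely where standardization efforts have to do their work.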

Such an architecture is proposed and explained in detail in the whitepaper. It describes the desired capabilities and building blocks that need to be established, and it also offers an analysis of market trends and existing solutions, in order to establish a future vision and direction for PaaS, as well as outlining the business potential of such a solution.

We can all continue to feel positive about the power and the business potential of cloud computing.

Changing your cost base from capex to opex, increasing the speed of your go-to-market strategies, and gaining flexibility in capacity and location are very important for your business.

We should not however confuse vendor specific solutions with cloud solutions only because they promise flexibility in cost and easy deployment; being able to shift and shop around is always better – also in cloud computing.

A year ago the Atos Scientific Community published a whitepaper on Cloud Computing. In the paper the concept was explained and we predicted that interoperability among clouds was going to be a major headache.

The paper also showed the result of a proof of concept in which we connected multiple private and public clouds to perform a single business workflow.

The hypothesis of the paper is that organizations will end up with multiple cloud environments:

“This will be driven by what is most fit for purpose for any given application (or part of it), based on an SLA trade-off between cost and business criticality. The corporate application landscape will therefore also fragment into those layers and into many business processes, requiring access to multiple applications and data connections that will need to span those layers. Unless enterprises consider these implications in advance, they risk building a heterogeneous IT infrastructure, only to discover that their key business processes can no longer be plugged together or supported.”

I think the authors (full disclosure: I was one of them) were right in their assumption, and the situation nowadays is no better than one year ago.

There are a couple of reasons I wanted to bring this to your attention again.

First, because the paper has been re-launched on www.Atos.net; second, because the paper has been accepted as a submission to the yearly Internet conference WWW2012 (www.www2012.org); and third, because on February 10, 2012 the United Nations announced they will take initiatives to “Aim for Cloud Interoperability”.

At least for me this was a surprise as I saw the UN mainly as an intergovernmental body looking to create lasting world peace.

But if you think this through it actually makes sense. The UN’s International Telecommunication Union (source: www.itu.int) “is committed to connecting all the world’s people – wherever they live and whatever their means. Through our work, we protect and support everyone’s fundamental right to communicate.” And there is a lot more on vision, collaboration, achieving standards etcetera, etcetera.

During the January meeting of the Telecommunication Standardization Advisory Group, an initiative was taken to start a study on the subject of cloud interoperability.

Apparently this was done on request of “leading CTO’s” to “investigate the standardization landscape in the cloud computing market and pursue standards to lead to further commoditization and interoperability of clouds”. (Source: ITU-T Newslog January 17, 2012).

This is good news and it is not the only initiative that came out recently.

The Organization for the Advancement of Structured Information Standards (OASIS), previously known as SGML OPEN, has started a technical committee on “Topology and Orchestration Specification for Cloud Applications” (TOSCA) aiming to make it easier to deploy cloud applications without vendor lock-in, “…while maintaining application requirements for security, governance and compliance” (source: www.oasis-open.org/news/pr/tosca-tc).

The newly formed committee is supported by vendors like CA, Cisco, EMC, IBM, Red Hat, SAP and others.

In addition, I recently Googled “Cloud interoperability” and, when filtering for the last month, got about 60,000 hits – so I can say that the subject is very much alive on the internet.

The point of all this? I firmly believe that in addition to your cloud strategy, you need to have a cloud interoperability strategy. You need to be aware of emerging standards and you need to talk to your vendor about it.

It is inevitable that some parts of your business will be run “in the cloud”; nowadays it is not only important how to get there, but also how to (securely) stay there while maintaining the flexibility to move around, interconnect your processes and still take end-to-end responsibility.

Oct 14, 2012

If you are thinking about the number 42 after reading the title of this blog entry, I compliment you on your knowledge of classic science fiction literature – for you there is no reason to read on, as you already know everything.

For all others, please keep reading, because I am about to give you access to a better answer. In late 2009 a group of smart people in Atos sat together and defined 10 challenges for our IT industry that would play an important role in the coming 5 years. Each of these challenges was thoroughly discussed and examined. The reason we did this was to support Atos in its changing strategy to become a more global organization with a clear view on the future. Since then the results have also been shared with customers, and in 2011 they were bundled in the book “Journey 2014” (available as a download on the Atos website).

The challenges were not set in any particular order of priority, so I will present them in alphabetical order and quote from the book to give you a preview of the conclusions – after that I will give you a view on how this all comes together:

1. Alternative Delivery Models

“Organizations should make rapid progress on realizing the benefits of cloud services…”

“Cloud computing is such a broad and diverse phenomenon that it is easy to become confused about its many forms and the way organizations can benefit…”

2. Business Process Management

“…Within 3 to 5 years, Business Process Management will become the dominant process change tool used by business stakeholders, working at two levels: first on Business Process within an organization (Orchestration) but also considering End to End processes involving interaction among different players (partners, customers and suppliers) and their systems (Choreography).…”

“A close eye has to be kept on the BPMN 2.0 evolution which may address BPEL and BPMN 1.0 shortcomings…” “An increasing number of BPM vendors are starting to offer BPM software-as-a-service (BPMSaaS). BPM services represent the highest level in the Cloud services. BPMaaS provides the complete end to end business process management needed for the creation and follow-on management of unique business processes.”

3. Context Aware Computing

“The Hyper Inter-Connected world faces an even greater challenge (…) to make sense of the literally trillions of data sources that could influence any given situation. Coupling this with the maturing of the smart phone (…) it paves the way for a new generation of intelligent applications that adapt to the user’s context on time to enrich the delivered experience…”

“…services enabled by context aware computing will anticipate and react to the needs of user, providing relevant, useful information to be able to make better informed decisions. These services will supersede the existing (…) applications and revolutionize how providers interact with consumers, organizations with employees, governments with employees and people with their social networks.”

4. Collaboration

“It is time for companies to catch up and stop ignoring modern collaboration methods that have proved to be very effective in the consumer world. The same way that social networks connect people with common interests, organizations have to take advantage of these solutions to connect people for a given purpose. It is not only a matter of cost saving it is also about improving the Decision Process, empowering employees and reaching consistent and supported consensus.”

“Information Management remains a key priority for enterprises to compete in local and global markets and collaboration is expected to generate even more strategic information which will need to be managed."

5. Control and Command

“Several strategies are being devised to synthesize a large system into a not-too-complex model, such as filtering events based on relevance, or aggregating data at different hierarchical level. Dealing with events coming out too fast is a stressful situation where an operator is more likely to make a mistake. Providing him with the appropriate information, at the right time and the right level of detail is a requirement to have him make an informed decision in time.”

“As the next generation of connected devices has started coalescing into an Internet of Things, control-command techniques will be required to bridge gaps and monitor the massive amount of information these will generate.”

6. Decision Support

“Decision Support has to deal with huge amounts of information, often unstructured, that change dynamically, and whose relevance and timeliness depend on the problem to be solved.”

“By combining Business Intelligence capability for analytical insights and measures with collaboration tools and social software, they allow decision on non-structured problems to be made in a collective way.”

7. Electronic Entertainment and Gaming

“Media consumers tend to become actors while consuming media, which has an important impact on the way media is consumed and edited…”

“The trends and technologies developed for the electronic entertainment and gaming market tend to gain other markets, benefitting from the mass market effect to become affordable in the industrial or business world.”

8. Green IT

“The know-how obtained in these practical experiences, if appropriately transferred, would enable IT departments and IT companies to accelerate their capability to serve clients in designing, engineering and operating IT for Green services.”

“There is a need for Business Transformation capabilities to manage the necessary behavioral change to leverage benefits from Green for IT and IT for Green.”

9. Social Networking

“Effectively using social platforms will be a key objective for companies coping with changing customer and employee relations…”

“Creating a reward program for an agile, social engagement that boosts user interaction is not so much a technical as a philosophical or political problem, going from authority to collaboration, from obscurity to transparency, from direct marketing to community management.”

10. Working Environment

“For the foreseeable future, offshoring will remain an effective strategy for reducing cost of service delivery and hence attracting and retaining talent is an issue that equally applies to offshore locations. Organizations must extend the working environment vision to apply to offshore locations.”

“Organizations will have to go beyond traditional financial incentives as the majority of employees look beyond money to find a meaning for their lives. With work life encroaching on home life, benefits from employers must reflect personal needs too.”

Bringing it all together

When we look at the various challenges (and BTW there is much more info in the book), there is a need to understand how we can connect the dots – what is the overall idea, or even vision, that drives our behavior towards these challenges. While we were discussing all of the different components, it became very clear that two things are at the heart of our preferred way of interacting with the challenges: handling the results should be simple and should allow for a level of control.

This statement of “Simplicity with Control” became a mantra for further investigation and has driven many proof of concepts since.

The second point of clarity came when we made the decision to put the user at the heart of our set of challenges (and the underlying building blocks). Through collaboration and social networking, the user wants to reach their objectives. If we look at the challenges in this way, we conclude that they are not about solving technology questions, but about addressing the user’s needs.

By combining simplicity, control and the needs of the user we have defined the starting point and the context for answering the question that is in the title of this blog. The philosophical statement is that the answer lies within ourselves; and to be honest, I prefer it that way.

If you run out of floor-space in your house, the problem is a bit more complex from a financial point of view – but the solution is similar.

I think we had, for many years, the same expectation in IT. If we ran out of storage, we would buy additional storage. Well, it seems we need to wake up and face the problem, because the solution is not that simple anymore. In a published whitepaper from the Atos Scientific Community (“Open Source Solutions for Big Data Management”) I read:

“[…] several major changes in the IT world have dramatically increased [data storage and processing needs] rate of growth.

[…] Computer capabilities have not increased fast enough to meet these new requirements. When data is counted in terabytes or petabytes, traditional data and computing models can no longer cope.”

This problem forces us to have a different view on storage and database technologies.

Traditional databases that use a relational model cannot process the data quickly enough, and adding more computing power and memory is not the solution.

Luckily, storage and database vendors are addressing the issue – they coined the term “Big Data” and are developing new solutions to make sure we can cope with the rapid increase in the information we want to have available online.

Unfortunately the impact of these new technologies is big (no pun intended) and there is limited experience in applying the technology successfully and sustainably.

Some vendors are looking towards changes in hardware and provide dedicated storage-boxes that are hardwired to handle large databases or large data-files. Others are looking to provide solutions using new database software.

Most of the software developers and vendors that are facing big data issues are reconsidering the ‘traditional’ relational database model and are bringing new ‘NoSQL’ database models into view.

Based on the amount of marketing and buzz, ‘NoSQL’ seems to be the way to go for these types of solutions.

So, do we really need all of this stuff? The Scientific Community whitepaper claims:

“In most situations, using NoSQL solutions instead of RDBMS (relational database management systems - paj) does not make sense in cases where the limits of the database have not been reached. Although, given the current exponential growth of data storage requirements, these limits are increasingly likely to be reached in the future. Unless RDBMS evolves quickly to include more advanced data distribution features, NoSQL solutions will become more and more important.”
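The "advanced data distribution features" the paper mentions are worth a small illustration. The sketch below is my own toy example, not tied to any specific NoSQL product: a key-value store that spreads records over several nodes by hashing the key, so capacity grows by adding machines instead of by buying an ever-bigger single server (which is exactly the move a traditional single-node RDBMS cannot make).

```python
# Toy sketch of horizontal data distribution (sharding), the core idea
# behind NoSQL scalability. Each "node" is just a dict here; in reality
# it would be a separate server in the cluster.
import hashlib

class ShardedStore:
    def __init__(self, node_count):
        self.nodes = [{} for _ in range(node_count)]

    def _node_for(self, key):
        # Hash the key to pick a node deterministically, so reads and
        # writes for the same key always land on the same machine.
        digest = hashlib.md5(key.encode()).hexdigest()
        return self.nodes[int(digest, 16) % len(self.nodes)]

    def put(self, key, value):
        self._node_for(key)[key] = value

    def get(self, key):
        return self._node_for(key).get(key)

store = ShardedStore(node_count=4)
store.put("customer:42", {"name": "Ada"})
print(store.get("customer:42"))  # {'name': 'Ada'}
```

Real systems refine this with consistent hashing and replication so that nodes can join or fail without reshuffling all the data, but the principle is the same: no single machine ever has to hold, or query, the whole data set.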

The specialists have spoken – it is important. We need to care and we need to take action.

An additional problem is that the field is evolving quickly; good solutions are provided by small companies that will soon become part of large providers through acquisitions or other business activities.

I also expect some patent-conflicts (do we not love those?) and maybe some bad choices leading to loss of data.

My recommendation is that you start looking for areas in your organization where this challenge will become a problem very soon. Ask your systems administrators how long database backups and restores take. Ask your system developers if they foresee issues with your next generation document management or transaction processing system.

And while you are at it – ask your business analyst about the data they need to create meaningful business intelligence reports (and how much time it takes to create them). This will give you a good overview of your Big Data improvement areas.

Do not ask your vendor before having done an internal assessment. You do not want to be stuck with the wrong technology.

Mar 20, 2012

Global warming is certainly a topic that the IT industry should care about, especially if you consider the way we contribute to rising CO2 levels and power consumption. Good news, though; apparently we already care about it. After his earlier report in 2008, a follow-up report was presented on August 1st, 2011 by Jonathan Koomey, a consulting professor at Stanford University.

“In summary, the rapid rates of growth in data-center electricity use that prevailed from 2000 to 2005 slowed significantly from 2005 to 2010, yielding total electricity use by data-centers in 2010 of about 1.3% of all electricity use for the world, and 2% of all electricity use for the US.”

Still, that is a lot of power, and since we can expect growth in the amount of data-center space, we need to spend considerable time thinking about further lowering the CO2 footprint of data-centers. A white paper by the Atos Scientific Community takes an interesting view on the subject. The authors claim that the two worlds of data-centers and of 'Control and Command' have up to now lived in relatively separate spaces, although:

“… the data-center can be seen as an industrial equipment with processes like electricity management, temperature control, humidity control, physical security, and finally the management of IT equipment themselves…”

After a short description of technology that is deployed in data-centers, it is concluded that:

“…computers kept in the room are not the dumb heaters that the physics rules would describe. They have their own operational rules and constraints, acting at a logical level rather than a physical one, and providing software solutions to manage that logical level. A communication channel between the computers or blades and the Control and Command could be mutually beneficial.”

This introduces a new way to look at datacenters, using end-to-end monitoring to manage the facility as a whole, not as a collection of separate components. The fact that all components interact and should be managed as such opens new ways to bring power consumption under control. A better future is possible, but a lot of work still needs to be done.

Mar 18, 2012

Readers of this blog know that I am always looking for new ways in which different computers (or computer clouds) can interact to execute a complex task. Nowadays we see people work and play with a variety of different devices, and most of the time struggle when these devices need to share information.

It becomes an even bigger problem when the devices need to work together. But it seems we are getting there: using existing technologies, the video below shows a glimpse of the type of task we can accomplish through such combinations of devices and different interfaces.

It is not a ‘wow-look-at-the-future’ video, showing us all kinds of things that still need to be invented. Instead it is using existing technology, combined in a smart way.

I bet that doing the task that is shown in the video, in a ‘normal’ way, would take several hours and a lot of conversions.