Can you start by telling us a little about how the data archiving project first came about?

Back in 2000, the NUI (Norwich Union Insurance) claims and workflow systems had been running on the then-current platform for seven years, and the policy system for two. The Capacity Management team began to have concerns about the rate of data growth and the cost a serious system failure would impose on a rapidly increasing user base. That kick-started the project, beginning with problem definition and research into the options.

You say that the levels of data within NUI were growing at that time, what were the main drivers for that growth?

Data from legacy systems was being converted to the new platform, and this was also the period in which Norwich Union was merging with CGU. The user base was increasing, and with it the need for greater availability. The whole business was starting to move towards where it is today.

What sort of pressures were IT facing from around the business with regards to the growing data volumes?

The business weren't experiencing any issues as a result of the increased data at that time. The Capacity Management team could see problems on the horizon, so the real pressure was convincing the business that action was required. We were a little ahead of the game.

Excellent, so you weren't facing pressure from the business. That must have made your lives easier whilst you were researching solutions.

Not really, because the pressure was reversed in a way. As an IT department we needed a business case to justify running the project, and the fact that the business were not experiencing any issues relating to data growth at that time meant that we faced some resistance.

So how were you able to demonstrate to the business that a data archiving strategy was needed?

The two main drivers were the Data Protection Act and system recovery times.

The DPA convinced the business that we needed to implement an archiving strategy. The business was fully aware of the need to comply, and of the potential costs and damage to the brand if Norwich Union were found to be in breach. DPA compliance was high profile and resulted in a separate group-wide project being set up to deal with the issues.

The other major factor was system recovery times. Given the large amounts of data we were dealing with, we concluded that, should we suffer a major system failure, we would not be able to recover within our SLA. And given the increased user base mentioned earlier, any outage would cost far more than it would have previously.
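The recovery-time argument lends itself to a back-of-envelope check: estimated restore time is simply data volume divided by restore throughput, compared against the SLA window. The sketch below illustrates the shape of that calculation; all figures are hypothetical assumptions for illustration, not the project's actual numbers.

```python
# Back-of-envelope check: can a full restore fit inside the recovery SLA?
# All figures below are illustrative assumptions, not real NUI values.

def restore_hours(data_gb: float, throughput_gb_per_hour: float) -> float:
    """Estimated time to restore the full database from backup."""
    return data_gb / throughput_gb_per_hour

SLA_HOURS = 8       # assumed recovery-time objective
DATA_GB = 2_000     # assumed database size after years of growth
THROUGHPUT = 150    # assumed restore rate in GB/hour

estimate = restore_hours(DATA_GB, THROUGHPUT)
print(f"Estimated restore: {estimate:.1f} h (SLA: {SLA_HOURS} h)")
if estimate > SLA_HOURS:
    print("SLA breach likely: archiving older data would shrink the restore set")
```

With these assumed numbers the restore takes roughly 13 hours against an 8-hour SLA, which is exactly the kind of gap that makes archiving (reducing the live data set) attractive independently of any compliance driver.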

When presented with these two strong arguments the business agreed and the project began in earnest.

With the advent of Software as a Service (SaaS), more businesses are relying on the ability to access their business data through web-based applications. In addition to the rise of SaaS and Cloud Computing, our businesses are increasingly operating on a global scale. Where once you could schedule your maintenance updates for Sunday night, that window now falls during the working day for users on the other side of the globe.

When downtime is unplanned, however, these issues multiply tenfold. Such outages are far more visible to users and the public at large, with potential ramifications for revenue, brand image and customer satisfaction.

In this paper we will look at the various solutions to the application availability problem for DB2 databases, and at how they meet the demands of our ever-changing global operations.

As BT famously pronounced, "It's good to talk" - and we've been doing a lot of that recently! In particular we've been focussing our attention on LinkedIn. Because we enjoy talking, we thought it quite likely that some of our customers do too, so we started our very own group especially for zSeries users.

The idea of the group is that we get as many z/OS people together as we can to discuss industry trends, best practice and to share knowledge. We've currently got discussions running about managing data growth and data security.

Our partners, xkoto, have been named a "Cool Vendor" in the recently released Gartner report, "Cool Vendors in IT Operations and Virtualization, 2009."

Released on March 11, the report profiles vendors applying new techniques to manage the changing IT infrastructure, including SaaS, search and even virtualization itself. In the report, Gartner states, "The vendors highlighted in this research provide a variety of new approaches to address existing and emerging challenges in IT infrastructure." The report goes on to say, "Virtualization technology is not relegated just to servers and storage; virtualization of databases can also deliver many of the same availability and performance attributes."

For more information on database virtualization take a look at our website.

The need for operational BI in a tough economic climate

Our partners, Datawatch, recently spoke with eWEEK about Operational BI and why this is where we need to be concentrating our efforts in the downturn. Listen to the podcast here.

CA IDUG Award for Outstanding work in DB2

CA has announced the CA IDUG Award for Outstanding Work in DB2.

There is a panel of 7 judges (see below) and the closing date for European entrants is August 17th 2009.

Prizes include an IDUG conference pass; a half-day consulting engagement from each of the four consultant judges; expense reimbursement of up to $1,500 for travel to an IDUG conference or an on-site consulting engagement; and a plaque.

The criteria take into account the human aspects of delivering great results in DB2, weighted as much toward the people doing the work as toward the technology supporting their projects.