ITOA News

The Right Analytics Solution Should Address These 5 IT Operations Challenges

The importance of IT in organisations has increased dramatically in the last decade. Technology departments have become intrinsically tied to the key business objectives of generating revenue, cutting costs and protecting the organisation from outside threats.

There are five broad challenges facing the IT operations world today, all of which can be mitigated through analytics – if you know how.

1. Data is exploding – How do we cope?

The volume of data is expanding at an ever-increasing rate. That’s a fact. But the question of how to store all this data is far less settled.

Traditional relational database systems are not always best placed to cope with the high ingest rates, varying data types, and large quantities of operational data. This means most organisations have to discard, archive, or in the worst case, not even collect the data in the first place.

Enter NoSQL (or non-relational) distributed databases. They run on commodity hardware, making them a cost-effective option, and they scale easily with demand (by seamlessly adding nodes to, or removing them from, a cluster).

Imagine a small investment fund. It seeks a cost-effective solution for tick databases to collect, process and manage market data. Since some traditional software solutions come with a hefty price tag, an affordable, scalable NoSQL technology may prove to be the best option. The fund could capture granular real-time pricing data and query the original ticks or daily/weekly averages. The quantitative analysts in the fund could conduct back-testing on vast amounts of historical data as well as write real-time queries for automated trading signals and predictions.
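The aggregation step above can be sketched in a few lines. This is an illustrative stand-in, not a real tick database: the `ticks` tuples represent rows pulled from a hypothetical tick store, and the function collapses them into per-day average prices.

```python
from collections import defaultdict

def daily_averages(ticks):
    """Aggregate raw (date, symbol, price) ticks into per-day average prices.

    `ticks` is an iterable of (date_str, symbol, price) tuples -- a stand-in
    for rows pulled from a hypothetical tick store.
    """
    sums = defaultdict(lambda: [0.0, 0])  # (date, symbol) -> [price_sum, count]
    for date, symbol, price in ticks:
        entry = sums[(date, symbol)]
        entry[0] += price
        entry[1] += 1
    return {key: s / n for key, (s, n) in sums.items()}

ticks = [
    ("2024-01-02", "ABC", 100.0),
    ("2024-01-02", "ABC", 102.0),
    ("2024-01-02", "XYZ", 50.0),
    ("2024-01-03", "ABC", 104.0),
]
avgs = daily_averages(ticks)
# avgs[("2024-01-02", "ABC")] == 101.0
```

In a real deployment the same query shape would run inside the database rather than in application code, so the raw ticks never leave the store.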

2. IT infrastructure is becoming increasingly distributed.

Today’s IT infrastructure is a complex beast; different servers run different processes or applications. This poses both a monitoring and storage challenge. How do you keep track of big data sourced from varied applications, servers and devices? Better yet, how do you store all of this data?

Analytics solutions must be capable of collecting and aggregating data from thousands of different sources in a non-intrusive way to provide a centralised point of access for all relevant stakeholders in an organisation.

Take an equities or FX department of a Tier 1 investment bank. The team is responsible for pre-market checks, supporting order management and execution, as well as monitoring multiple pricing and event data streams across regions from trading platforms. The end goal is a single interface which provides comprehensive real-time visibility into trading activity.

3. Higher customer expectations and a growing risk of failure mean IT departments need to access data and respond to it faster.

Traditional relational database companies are increasingly making acquisitions to expand their data storage options – particularly in-memory and streaming database start-ups. The broad trend is clear: data storage is pivoting towards faster access and faster analysis.

Streaming analytics meets these demands by processing events in real time. Multiple streams of data from applications, infrastructure and network devices can be queried as the events arrive.

Fraud is perhaps the area where speed of response matters most. Many tools simply are not fit for purpose because they do not operate in real time, so fraudulent transactions are spotted too late or not at all. With real-time analytics, payment transactions can be compared against an account's 'normal' behaviour as they happen. This approach quickly identifies abnormal events that can be flagged for further action.
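The comparison against 'normal' behaviour can be sketched as a simple statistical check. This is a minimal illustration only, assuming a z-score against recent transaction amounts; production fraud systems use far richer features than amount alone.

```python
import statistics

def is_anomalous(amount, history, threshold=3.0):
    """Flag a payment whose amount deviates from the account's history.

    `history` is a list of recent transaction amounts for the same account;
    an amount more than `threshold` standard deviations from the mean is
    treated as abnormal and flagged for further action.
    """
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return amount != mean
    return abs(amount - mean) / stdev > threshold

normal_spend = [20.0, 25.0, 22.0, 19.0, 24.0, 21.0]
print(is_anomalous(23.0, normal_spend))   # typical amount -> False
print(is_anomalous(950.0, normal_spend))  # far outside normal behaviour -> True
```

Run inside a streaming engine, this check fires per event, which is what lets abnormal payments be blocked before they settle rather than discovered in a nightly batch.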

4. Processing IT operations data can eat up significant money and IT resources.

Building on the NoSQL storage solution above, adding a streaming layer makes storage even more cost-efficient and less demanding to maintain in terms of IT resources.

Whereas you may have had to repeatedly run the same batch query over time, streaming queries are memory-efficient and can be left to run indefinitely (or at least until the user decides to stop them). In the streaming paradigm, the raw real-time data need not even be stored. It is now possible to set a continuous query to take 10-minute averages of a particular metric, such as CPU utilisation, and store only the results of this query (instead of the highly granular raw data stream).
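The continuous-query idea can be sketched with a generator that folds a `(timestamp, value)` stream into fixed-window averages. This is a sketch of the concept, not a real streaming engine; it assumes timestamps arrive in order, and only the per-window aggregates are emitted for storage.

```python
def windowed_averages(samples, window_secs=600):
    """Collapse an ordered (timestamp, value) stream into fixed-window averages.

    Emits one (window_start, average) pair per completed window, so only the
    aggregates need storing rather than the raw stream. `window_secs=600`
    gives the 10-minute windows described in the text.
    """
    window_start, total, count = None, 0.0, 0
    for ts, value in samples:
        bucket = ts - ts % window_secs  # start of the window this sample falls in
        if window_start is None:
            window_start = bucket
        elif bucket != window_start:
            yield window_start, total / count  # window closed: emit its average
            window_start, total, count = bucket, 0.0, 0
        total += value
        count += 1
    if count:
        yield window_start, total / count  # flush the final, partial window

cpu_samples = [(0, 40.0), (300, 60.0), (600, 80.0), (900, 90.0), (1200, 10.0)]
print(list(windowed_averages(cpu_samples)))
# [(0, 50.0), (600, 85.0), (1200, 10.0)]
```

Because the generator holds only a running sum and count, its memory footprint is constant no matter how long the stream runs, which is the efficiency the paragraph describes.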

5. Finding incidents is often like finding a needle in a haystack.

Every few months a major bank suffers a critical outage, exposing millions of its customers to uncertainty and commercial distress. Banks have developed complex IT infrastructures over the years and need to continuously monitor thousands of devices, servers, database systems, trading applications and numerous other tools. This, in itself, is no easy feat. But simply monitoring data is not enough.

Becoming more resilient to IT failures in today’s environment requires sophisticated analytics, data mining and visualisation, beyond the old days of simply looking at isolated metrics. ITOA (IT Operations Analytics) technologies are essential for monitoring the health of a whole IT stack and, through anomaly detection, catching more incidents at an earlier stage.

First, in order to gauge the health of the whole IT infrastructure, ITOA allows an IT operations team to store data from multiple applications (as well as from different monitoring tools), consolidating hundreds or thousands of disparate data streams within one centralised data hub.

Using these techniques enables users to speed up root cause discovery and remediation times, and stops support teams from drowning in a deluge of alerts. Using the right algorithms for anomaly detection ensures that the number of false positives is minimised.
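One common family of such algorithms tracks an adaptive baseline rather than a fixed threshold. The sketch below, an illustrative assumption rather than any vendor's method, maintains an exponentially weighted moving mean and variance of a metric and flags points that stray too far from the baseline; tuning `alpha` and `k` is how false positives are traded off against missed incidents.

```python
def ewma_anomalies(values, alpha=0.3, k=3.0):
    """Flag outliers against an exponentially weighted moving baseline.

    Tracks an EWMA estimate of the metric's mean and variance; a point more
    than `k` estimated standard deviations from the baseline is flagged.
    Returns the indices of flagged points.
    """
    mean, var, flagged = values[0], 0.0, []
    for i, v in enumerate(values[1:], start=1):
        stdev = var ** 0.5
        if stdev > 0 and abs(v - mean) > k * stdev:
            flagged.append(i)
        # update the baseline after testing the point against it
        diff = v - mean
        mean += alpha * diff
        var = (1 - alpha) * (var + alpha * diff * diff)
    return flagged

latency_ms = [100, 102, 99, 101, 100, 350, 98, 101]
print(ewma_anomalies(latency_ms))  # flags index 5, the 350 ms spike
```

Because the baseline adapts as the metric drifts, routine daily variation stops generating alerts, which is precisely how a well-chosen algorithm keeps the false-positive count down.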

Embarking on a journey of ITOA transformation is not easy. But investing in the right analytics infrastructure has large payoffs, ensuring that a bank can keep the lights on 24/7, maintain a healthy reputation among its customers, and keep the regulators from knocking on the door.