TOP DATA TRENDS FOR 2017

Predicting the future is always a risky business (just ask President Hillary), but we’re pretty sure data analysis in the following fields will see plenty of activity in 2017.

1. Data and the monetisation of ‘The Internet of Things’

The Internet of Things (IoT) is the phrase that encompasses the connection of a vast array of devices to the internet (and consequently to each other). IoT includes the obvious, such as cell phones and laptops, but extends well beyond this to things like refrigerators, cars, air conditioners and even people with smart watches and glasses. Once connected, these devices have the potential to generate data and feed it back to interested parties. Early business applications of IoT have typically been in the field of monitoring — for example, drill bits on oil rigs advising operators that they are about to fail, aircraft engines alerting maintenance engineers to potential issues, or vending machines able to order their replenishment based on actual sales.

Gartner predicts that by 2020 there will be over 26 billion connected devices, all generating data. The challenge for industry, though, is how to analyse and find opportunities in these massive data sets. One suggestion, as Cisco head of strategy Macario Namie told TechRepublic recently, may be to move away from a centralised analytics department towards a more widely distributed analytics model. “As IoT gains momentum,” Namie says, “the volume of data generated will be stratospheric. Not only will there be more data, but there will be different types of data, and data from sources that have yet to be considered. All this new data points to fresh opportunities for revenue generation.” Namie thinks the main benefit of big data analytics evolving into a distributed model would be that the ways to monetise IoT data will be more readily apparent to those closer to the customer interaction. “We will see more devices capable of analysing data locally, processing and capturing the most important data for more real-time IoT services.”
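Namie’s point about devices “analysing data locally, processing and capturing the most important data” can be sketched in a few lines. The sliding-window anomaly filter below is a hypothetical illustration (not anything Cisco has described): the device keeps its full stream to itself and transmits only the readings that deviate sharply from recent behaviour.

```python
from statistics import mean, stdev

def filter_anomalies(readings, window=20, threshold=3.0):
    """Keep only readings that deviate sharply from the recent local
    average -- the kind of on-device filtering that lets an IoT sensor
    transmit a handful of interesting points instead of its whole stream."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append((i, readings[i]))
    return anomalies

# A steady sensor stream with one spike: only the spike gets sent upstream.
stream = [20.0 + 0.1 * (i % 5) for i in range(40)]
stream[30] = 35.0
print(filter_anomalies(stream))  # only the reading at index 30
```

In a distributed analytics model, logic like this runs on the device itself, so the central platform receives a trickle of high-value events rather than the raw firehose.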

2. Artificial intelligence moves the goalposts again

As they have done every year since the 90s, the AI goalposts will continue to move in 2017. The ‘AI effect’, as it is known, is the phenomenon whereby once an activity has been mastered by a machine, it is no longer considered to be ‘intelligent’. The idea was neatly captured by Larry Tesler in 1970 when he said: “Intelligence is whatever machines haven't done yet.” Certainly, what was once considered AI is now perceived as routine and everyday. Sat-nav route planning, for example, is AI. So are book and movie recommendation engines and voice-response personal assistants like Apple’s Siri. Phone answering systems that ask what you’re calling about, then step you through security questions before transferring you to a human being, are AI. These were all considered futuristic once, but are now just part of everyday life. Some things, like driverless cars, we still marvel at. But Google’s driverless cars have been operating for nearly 10 years, and Christchurch Airport has recently begun trials of a driverless shuttle. Soon enough we won’t marvel at these things either, and children born today will be perplexed a decade from now as to why their parents still insist on driving cars manually.

The great leap forward in AI that looms large in 2017, however, is its likely impact on industries and roles that were formerly considered ring-fenced from it. In a 2016 think piece entitled Artificial Intelligence - Opportunities and Challenges, the Institute of Directors sought to focus the government’s attention on the many issues raised by the impending arrival of ubiquitous artificial intelligence.

“What determines vulnerability to automation is not so much whether the work concerned is manual or skilled, but whether or not it is routine. Since the age of the steam engine, machines have increasingly performed many forms of routine manual labour. Now those machines may also use AI technologies to perform some routine cognitive tasks too. Just as the Industrial Revolution reduced the demand for human labour in manufacturing and agriculture, AI technologies have the potential to reduce the need for skilled professionals in service fields that have been largely insulated from disruption, such as finance, accounting, law and medicine. AI-related industrial applications will replace humans in a number of readily disrupted fields, including call centres, customer services, legal document review, or any other industry involving other routine tasks.”

The report isn’t reluctant in its assertion that these industries will be disrupted by AI — and it’s easy to see why. Watch this demonstration of IBM’s Watson, for example, and then imagine that instead of a product manager creating a marketing plan, there was a lawyer asking Watson to research precedents and recommend legal strategies for an upcoming case. Or a doctor asking the AI to review brain scans for any anomalies (IBM’s Watson is already being trialled by Counties Manukau District Health Board — assisting with acute care optimisation).

While for lawyers and doctors the reality of mass layoffs due to AI is still some way off, for the ‘somewhat’ skilled that reality is much closer. As identified in the report, call centres are big employers of semi-skilled workers in NZ, but many of those roles are now at risk as AI and machine learning continue to get better at understanding and generating natural language. Currently, the majority of ‘Chat Now’ customer service interfaces are staffed by people. While there are always some unique human variables, the majority of these interactions are routine in nature: address changes, card replacements, cancellations, renewals and so on. Given this comparatively narrow range of interaction, and with machine learning applied to the data from hundreds of thousands of previously recorded interactions, chat bots and other dialogue systems are expected to begin impacting call centre staffing in the near term. Could be time to get your CV up to date (robots can write those too, by the way).
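To see why routine chat interactions are so exposed, consider how little it takes to route them. The sketch below is a deliberately crude keyword matcher, with intents and keyword lists invented for illustration; a production system would instead train a classifier on those hundreds of thousands of recorded interactions, but the routing step looks much the same.

```python
# Toy intent router for routine 'Chat Now' requests. Real systems learn
# these patterns from recorded interactions; the hand-picked keywords
# here exist purely to illustrate the routing step.
ROUTINE_INTENTS = {
    "address_change": {"address", "moved", "moving", "relocate"},
    "card_replacement": {"card", "lost", "stolen", "replace"},
    "cancellation": {"cancel", "close", "terminate"},
    "renewal": {"renew", "renewal", "extend"},
}

def route_message(message):
    """Return the best-matching routine intent, or 'human_agent' when
    nothing routine matches and the chat should go to a person."""
    words = set(message.lower().split())
    scores = {intent: len(words & kw) for intent, kw in ROUTINE_INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "human_agent"

print(route_message("Hi, I've moved and need to update my address"))
print(route_message("Why was my claim from last March declined?"))
```

The first message routes straight to the address-change flow; the second falls through to a human, which is exactly the split between the routine majority and the genuinely human cases described above.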

3. Increased demand for quality data visualisation

According to German research consultancy BARC, 2017 will see growing demand for effective data and analytics visualisation. Driving this trend is the continued shift from written to visual communication and the proliferation of new visualisation tools. While there’s still a place for the classics (pie charts and bar graphs, for example), the new tools are enabling more engaging data storytelling, and the adage “a picture is worth a thousand words” has never been more relevant.

The issue, however, is that greater graphics capacity does not automatically result in increased comprehension — and the focus for data artists must remain on the clarity of their data messaging. To that end, a quick refresher as to the most important components of good data visualisation might be timely. Keep these four principles in mind when creating data visualisations:

i. Relevance: A good data artist must always ask one simple question: ‘does this answer the question it was intended to?’ If the answer isn’t ‘yes’, it’s time to go back to the drawing board. Answering in the affirmative means your information is fit for purpose and it’s time to start building the data story.

ii. Storytelling: There is a distinction between telling a story with data and communicating the story of the data with visualisations. Anyone can look at some data and put together a story, but oftentimes it will be one that conforms to a predetermined narrative. Instead, a competent data artist must be able to recognise the story that inherently exists within the data and bring it to light visually. This involves maintaining integrity about what the data does and does not say; if holes remain, more questions need to be asked.

iii. Visualisation tools: Now that you have a data story to tell, it’s time to find the best way to display it. Tableau, for example, is an excellent tool for many different types of static data and can organise a visualisation using story points, which help to move readers through the data, uncovering each relevant piece sequentially (it’s the data artist’s job to make sure the data at each story point leads on to the next). D3, on the other hand, is especially good for working with moving images, as data can be made interactive using transitions, transformations, zooming and panning.

Within the selected medium it’s imperative to use the correct type of chart for each point. A pie chart is of no use, for example, when trying to display information over time; likewise, a 100% stacked bar chart won’t show the relative sizes of different groups.
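Those rules of thumb are mechanical enough to write down. A minimal decision helper, with categories and defaults that are ours rather than any standard, might look like this:

```python
def suggest_chart(over_time, compare_sizes, parts_of_whole):
    """Map the basic questions a data artist asks of their data onto a
    sensible default chart type, following the rules of thumb above."""
    if over_time:
        return "line chart"          # pie charts can't show change over time
    if parts_of_whole and not compare_sizes:
        return "100% stacked bar"    # composition, when group totals don't matter
    if compare_sizes:
        return "bar chart"           # absolute sizes across groups
    return "table"                   # when in doubt, just show the numbers

print(suggest_chart(over_time=True, compare_sizes=False, parts_of_whole=False))
```

No such helper replaces judgment, of course; the point is that chart choice should be driven by the question the visualisation answers, not by which chart looks most impressive.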

iv. Know your audience: Finally, ensure you’re not leaving your audience in the dust. If they can’t understand the medium or the complexity of the visualisations, then the work isn’t fit for purpose. Keep in mind that the type of information to show the C-suite is different from that shown to a product manager: the C-suite is typically interested in high-level overview figures, whilst a product manager will want more detail, for example output broken down to individual categories with the ability to filter to suit their needs.

ABOUT THE AUTHOR

Sally Carey is a director of Datamine and has over 20 years’ experience consulting on data analytics solutions across a range of industry sectors. Carey specialises in delivering clarity from the complexity of big data, advising organisations on a host of predictive analytics disciplines, including quantitative decision making, loyalty programmes, organisational change and marketing strategy.