TotalData

The concept of TotalData™ brings together the four dimensions of data and information – reach, richness, assurance and agility. But where did these dimensions come from?

I first encountered these four dimensions in discussions of net-centricity, which spilled out from the US defence world into the commercial world over ten years ago. Trying to dig up the original material recently, I found a military version in a report written in 2005 by the Association for Enterprise Integration (AFEI) for the Net-Centric Operations Industry Forum (NCOIF).

Going further back, the first two dimensions – reach and richness – had been discussed by Evans and Wurster before the turn of the millennium. They argued that old technologies had forced you to choose (either/or) between reach and richness, whereas the new technologies emerging at that time allowed you to have both/and.

The authors also introduced the concept of affiliation, by which they meant transparency of relationships – for example, knowing whether the intermediary agent is working for you or working for the other side. Or both. And knowing who really wrote all those “customer reviews”.

According to the authors, it would be these three factors – reach, richness and affiliation – that would determine the success of e-commerce. Clearly some sectors would be more open to these factors than others – according to The Economist in February 2000, online trade was then dominated by business-to-business (B2B). The three factors identified some of the challenges facing other sectors, including professional services, in going online. As Duncan, Barton and McKellar argued for legal firms, “The Web provides Reach, but offering Richness and the sense of community required for creating and sustaining relationships with visitors could be difficult.”

Meanwhile, new architectural thinking had shown ways of resolving the traditional trade-off between speed (agility) and quality (assurance). (A very early version of this was known as Bimodal IT. Some industry analysts are still pushing this idea.)

When agility and assurance were added to reach and richness to produce the four dimensions of net-centricity, affiliation appears to have been divided between community (reach) and trust (assurance). But the importance of affiliation was never entirely forgotten. As Commander Chakraborty observes, “organisational affiliations and culture … play very significant roles in a networked environment.”

So whatever happened to net-centricity? It has been replaced by data-centricity, which, as Dan Risacher argues, is probably a more accurate term anyway. Or as we call it at Reply, TotalData™.

Notes and References

Much of the original material for the NCOW Reference Model is no longer available. This includes the pages referenced from Wikipedia: NCOW (retrieved 8 August 2017). Net-centric concepts were incorporated into DODAF Version 1.5 (April 2007).

The TotalData™ value chain is about the flow from raw data to business decisions (including evidence-based policy decisions).

In this post, I want to talk about an interesting example of a flawed data-driven policy. The UK Prime Minister, Theresa May, is determined to reduce the number of international students coming to the UK. This conflicts with the advice she is getting from nearly everyone, including her own ministers.

As @Skapinker explains in the Financial Times, there are a number of mis-steps in this case.

Distorted data collection. Mrs May’s policy is supported by raw data indicating the number of students who return to their country of origin. These are estimated measurements, based on daytime and evening surveys taken at UK airports. As a result, students travelling on late-night flights to such countries as China, Nigeria, Hong Kong, Saudi Arabia and Singapore are systematically excluded from the data.
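The effect of this kind of sampling gap can be illustrated with a toy calculation. (The numbers below are invented for illustration – they are not the actual survey figures.)

```python
# Hypothetical numbers, purely for illustration.
N = 10_000               # students whose visas have expired
true_departures = 9_500  # of whom this many actually left the UK

# Suppose 30% of genuine departures use late-night flights,
# which the airport survey never covers.
late_night_share = 0.30
observed_departures = true_departures * (1 - late_night_share)

# The survey-based estimate of "overstayers" counts every
# unobserved departure as a student who never left.
true_overstayers = N - true_departures
estimated_overstayers = N - observed_departures

print(true_overstayers)            # 500
print(int(estimated_overstayers))  # 3350
```

Because the missing departures are all silently reclassified as overstayers, the estimate is not just noisy but biased in one direction – which is exactly what makes this kind of data collection flaw so convenient to overlook.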

Disputed data definition. Most British people do not regard international students as immigrants. But as May stubbornly repeated to a parliamentary committee in December 2016, she insists on using an international definition of migration, which includes any student who stays for more than 12 months.

Conflating measurement with target. Mrs May told the committee that “the target figures are calculated from the overall migration figures, and students are in the overall migration figures because it is an international definition of migration”. But as Yvette Cooper pointed out “The figures are different from the target. … You choose what to target.”

Refusal to correct baseline. Sometimes the easiest way to achieve a goal is to move the goalposts. Some people are quick to use this tactic, while others instinctively resist change. Mrs May is in the latter camp, and appears to regard any adjustment of the baseline as backsliding and morally suspect.

If you work with enterprise data, you may recognize these anti-patterns.

When Complex Event Processing (CEP) emerged around ten years ago, one of the early applications was real-time risk management. In the financial sector, there was growing recognition of the need for real-time visibility – continuous calibration of positions – in order to keep pace with the emerging importance of algorithmic trading. This is now relatively well-established in the banking and trading sectors; Chemitiganti argues that the insurance industry now faces similar requirements.
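To give a flavour of the kind of rule a CEP engine evaluates, here is a minimal in-memory sketch of a sliding-window exposure check. (The class and its logic are invented for illustration – real CEP products such as Apama have their own event-processing languages.)

```python
from collections import deque

class ExposureMonitor:
    """Toy sliding-window rule: flag when the net position in a
    symbol exceeds a risk limit within the last N trades."""

    def __init__(self, limit, window=100):
        self.limit = limit
        self.trades = deque(maxlen=window)  # old trades fall out of the window

    def on_trade(self, symbol, qty):
        self.trades.append((symbol, qty))
        net = sum(q for s, q in self.trades if s == symbol)
        if abs(net) > self.limit:
            return f"ALERT: {symbol} net position {net} exceeds limit {self.limit}"
        return None

monitor = ExposureMonitor(limit=1000)
# Five identical buys of 300: the limit is breached on the fourth trade.
alerts = [a for a in (monitor.on_trade("XYZ", 300) for _ in range(5)) if a]
print(alerts[0])  # ALERT: XYZ net position 1200 exceeds limit 1000
```

The point of the continuous, per-event evaluation is that the alert fires on the trade that breaches the limit, not at the end of a daily batch run.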

In 2008, Chris Martins, then Marketing Director for CEP firm Apama, suggested considering CEP as a prospective “dog whisperer” that can help manage the risk of the technology “dog” biting its master.

But “dog bites master” works in both directions. In the case of Eliot Spitzer, the dog that bit its master was the anti-money-laundering software that he had used against others.

And in the case of algorithmic trading, it seems we can no longer be sure who is master – whether black swan events are the inevitable and emergent result of excessive complexity, or whether hostile agents are engaged in a black swan breeding programme. One of the first CEP insiders to raise this concern was John Bates, first as CTO at Apama and subsequently with Software AG. (He now works for a subsidiary of SAP.)

from Dark Pools by Scott Patterson

And in 2015, Bates wrote that “high-speed trading algorithms are an alluring target for cyber thieves”.

So if technology is capable of both generating unexpected events and amplifying hostile attacks, are we being naive to imagine we can use the same technology to protect ourselves?

Perhaps, but I believe there are some productive lines of development, as I’ve discussed previously on this blog and elsewhere.

1. Organizational intelligence – not relying either on human intelligence alone or on artificial intelligence alone, but seeking to establish sociotechnical systems that allow people and algorithms to collaborate effectively.

2. Algorithmic biodiversity – maintaining multiple algorithms, developed by different teams using different datasets, in order to detect additional weak signals and generate “second opinions”.
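A minimal sketch of the “second opinion” idea: run independently developed detectors over the same event, and treat disagreement between them as a weak signal in its own right. (The three detector rules and their thresholds are invented for illustration.)

```python
# Three detectors, imagined as coming from different teams
# trained on different datasets.
def detector_volume(event):
    return event["volume"] > 10_000            # team A: volume spike

def detector_price(event):
    return abs(event["price_move"]) > 0.05     # team B: large price move

def detector_pattern(event):
    return event["repeated_cancels"] >= 3      # team C: order-book pattern

DETECTORS = [detector_volume, detector_price, detector_pattern]

def assess(event):
    votes = [d(event) for d in DETECTORS]
    if all(votes):
        return "alert"
    if any(votes):
        return "second opinion needed"  # disagreement is itself a weak signal
    return "clear"

# One detector fires, the other two do not.
print(assess({"volume": 15_000, "price_move": 0.01, "repeated_cancels": 0}))
# second opinion needed
```

A single model would either fire or stay silent here; the diverse ensemble surfaces the ambiguity instead of hiding it.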

Announced and rapidly withdrawn, Admiral’s proposed collaboration with Facebook was supposed to give drivers a discount on their car insurance premiums if their Facebook posts indicated the right kind of personality. According to some reports, the i…

@mrkwpalmer (TIBCO) invites us to take what he calls a Hyper-Darwinian approach to analytics. He observes that “many algorithms, once discovered, have a remarkably short shelf-life” and argues that one must be as good at “killing off weak or vanquished…

It should be pretty obvious why Microsoft wants 85 million faces. According to its privacy policy:

Microsoft uses the data we collect to provide you the products we offer, which includes using data to improve and personalize your experiences. We also may use the data to communicate with you, for example, informing you about your account, security updates and product information. And we use data to help show more relevant ads, whether in our own products like MSN and Bing, or in products offered by third parties. (retrieved 25 October 2016)

Facial recognition software is big business, and high quality image data is clearly a valuable asset.

But why would 85 million people go along with this? I guess they thought they were just playing a game, and didn’t think of it in terms of donating their personal data to Microsoft. The bait was to persuade people to find out how old the software thought they were.

The Daily Mail persuaded a number of female celebrities to test the software, and printed the results in today’s paper.

Computer”tell yr age” programme on my face puts me 69 https://t.co/EhEog5LQcN Haha!But why are those judged younger than they are so pleased

I recently went into a High Street branch of my bank and moved a bit of money between accounts. I could have done more, but I didn’t have any additional forms of identification with me. At the end, the cashier asked me for my nationality. British, as it…

There are several useful ways that an algorithm might contribute to the collective intelligence of a Board of Directors. One is to provide an automated judgement on some topic, which can be put into the pot together with a number of human judgements. This is what seems to be planned by the company Deep Knowledge Ventures, whose Board of Directors is faced with a series of important investment decisions. Although each decision is unique, there are some basic similarities in the decision process that may be amenable to automation and machine learning.
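One simple way to put an automated judgement “into the pot” alongside human judgements is a weighted vote, where the algorithm’s weight reflects its track record. (Everything here is a hypothetical sketch, not Deep Knowledge Ventures’ actual process.)

```python
def board_decision(human_votes, algo_vote, algo_weight=1.0):
    """Combine director votes (+1 invest / -1 pass) with one
    algorithmic vote, weighted by its historical accuracy."""
    score = sum(human_votes) + algo_weight * algo_vote
    return "invest" if score > 0 else "pass"

# Three directors in favour, two against; the algorithm votes against,
# with a weight earned from its past performance.
print(board_decision([+1, +1, +1, -1, -1], algo_vote=-1, algo_weight=1.5))
# pass
```

With a lower weight the same algorithmic vote would have been outvoted – which is the point: the machine participates in the decision without automatically dominating it.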

Another possible contribution is to evaluate other board members. According to the BBC article, IBM Watson could be programmed to analyse the contributions made by each board member for usefulness and accuracy. There are several ways such a feedback loop could enhance the collective intelligence of the Board.

Retrain individuals to improve their contributions in specific contexts.

Identify and eliminate individuals whose contribution is weak.

Identify and eliminate individuals whose contribution duplicates that of other members. In other words, promote greater diversity.

Enable trial membership of individuals from a wider range of backgrounds, to see whether they can make a valuable contribution.

Organizational Intelligence is about an effective combination of human/social intelligence and machine intelligence. Remember this when people try to frame the debate as us versus them.