In search of context, balance and perspective in a rapidly evolving world.

Escalating Ignorance in the Information Age

Oxymoronic? Perhaps, but true. The more ‘information’ that we have produced in the past forty years of networked information systems and the internet, the less we seem to know or trust. We are in an era of information entropy in which more is less.

I remember six years ago when an acquaintance of mine mentioned that she did not have cable TV. I wondered how she could possibly keep informed of current events. Two years later, I dropped it myself, and never regretted my 'loss'. Subsequently, I have become progressively more selective in my reading, particularly on the web, finding that much of what I consume provides little insight.

The information age has provided a wealth of data, but not a corresponding wealth of insight. Why is that? Let’s review.

Reality is changing at warp speed. Yesterday’s facts and truisms are being rapidly rendered obsolete. This ain’t your granddaddy’s nothin’!

We are producing mountains of data, but proportionately less 'information' (remember: data and information are not the same).

The information that we do produce from the data is often without meaningful context or perspective, and therefore of limited utility, relevance or reliability in a world where context can change as quickly as facts, and perspectives proliferate.

The institutions and information intermediaries (the press, government, academia, science, the professions, and unprofessional organizations such as Facebook and Google) that we depend upon for reliable and trustworthy information have almost all been diminished by scandal as they have become 'monetized', or otherwise compromised, directly or indirectly, by economic forces that have bent their values to serve other objectives.

Concerted efforts to distort, undermine, or repudiate otherwise valid information have been refined and deployed with devastating effectiveness.

We have become conditioned, if not programmed, to suspend, if not avoid, critical thinking in preference to simple or comforting dogmas, also known as ‘thought on auto-pilot’. We have willingly become prisoners of our own illusions, or those which too many are willing to sell us, in a world where there are now too many factoids to make sense of very much for very long.

One of the interesting consequences of all this is that in many subtle ways we take more time to do things that once seemed so simple, or to make decisions that are now more difficult in an increasingly complex world. I remember standing in the soap aisle of the local supermarket gazing at the various offerings of dishwasher detergent. There before me were New and Improved, Extra New, Super Improved, and You Won't Believe Your Eyes, all in similar but different containers by the same manufacturer, all at nearly the same price. Along came a lady who engaged in the same exercise as I. After a few minutes, we looked at each other and asked 'What's the difference?'. I could have just grabbed one off the shelf and been done, but I've been programmed to optimize: best value for the price. Ultimately, I just grabbed one off the shelf. Now multiply this simple example across the plethora of shopping transactions, and recognize that this phenomenon applies to information as much as to dishwasher soap. The default for decision gridlock is snap judgment, which often leads to the unintended consequence of buyer's remorse, and the oft-resultant lament: 'What was/were I/they thinking?'

At a higher level of consequence, business and governmental decisions become similarly captive of a world that is devolving from long-assumed perceptions of homogeneity into ever more complex and finite sub-groups, sub-cultures, sub-markets, and subdivisions; each with its peculiarities and potential risks to the unwary, and few of which we really understand. Middle East peace? Climate change? Healthcare policies? Renewable energy strategies? Transportation strategy? Tax reform? Nuclear energy?

So here we are at the pinnacle of the data-pile, at which our economic elite, blessed with all the raw data and algorithms they possess, are too risk-averse to invest their parked trillions for fear of risks they cannot effectively define, and therefore cannot effectively hedge.

And our 'intelligence services', with their army of server farms, cannot reliably act in advance; they can only react once the threat has manifested itself. You don't need big data to set up a sting for the witless. But all their data isn't helping them to preempt the wily.

And government, which is more reactive than proactive by nature, runs on old and fragmented systems that evolved from the vastness of its enterprise and the granularity of its operations, as defined by ever more complex regulations; systems too big and too complex to upgrade, but too critical to let die. This also applies to large corporations, which are bureaucratically not far removed from government.

I do not consider myself an information Luddite. By virtue of the very nature of my profession, I love good data; I crave good data; I pine for good data; but I also distrust all data until its reliability can be proven. More is not necessarily better.

Our data and its infrastructure are steadily holding us captive while we perpetuate the delusion that they are setting us free. Unwilling to accept this possibility, we double down on our bet on artificial intelligence (AI) as the means to master the data-pile and set us free. No doubt, AI will bring many advantages.

But it also holds the risk that in seeking to outsource our thinking and judgments to so-called sentient machines, we are inviting a concentration of power (think Amazon, Facebook and Google) and a potential for manipulation that enslaves rather than liberates us. Given our own individual and collective imperfections as citizens, professionals and societies, is it reasonably plausible that we can create AI that transcends our manifest imperfections and biases, rather than AI that is vastly more capable of the harm we can already do without its assistance? Stated more simply: can imperfect humans create perfect machines, or merely machines more capable of leveraging our imperfections to greater consequence?

We need not look far to preview the risks. Darkness is descending as the Trump administration seizes the reins of power and systematically draws the shades on the windows of government. Today it seeks to withhold information; to render us less informed. Today, as it has for the past two years, it perpetrates blatant lies, increasingly devoid of any subtlety, to propagate its world view. Imagine what it might do once it has implanted its partisans where all the levers of information creation or influence are located. Consider a modification of the adage: ‘To err is human; to really screw up takes a computer’.