Evidence-Based Information Governance

We Don’t Know Ourselves

Imagine that you want to lose weight. You have tried cutting back, but it hasn’t really helped. Maybe you should get a little more exercise. Maybe you should eat less fat. Or is it sugar? Greek yogurt is supposed to help. Are you really drinking enough coconut water? Who knows? So you mention it to your doctor, or make an appointment with a dietitian, or perhaps sign up with a club or clinic that specializes in weight loss. What is the first thing they ask you to do? Keep a food log. A diary. Write down what you eat, record the exercise you do, and then report back in a couple of weeks so they can give you a customized recommendation.

Great! You have made a positive decision to take charge of your health.

The first day, you are on top of it, and even pretty honest. That corned beef hash you accidentally ate at the diner when you stopped in for coffee? In the diary. The late-night bowl of sugary flakes? In the diary. By day two, only the good stuff goes in the diary, and a few days later you are still making a half-hearted attempt, until finally you find yourself scribbling down a bunch of made-up stuff in the waiting room moments before your next appointment.

Sound familiar?

For decades, the self-reported diary has been the primary research tool for studying and measuring our eating, sleeping, and other behaviors, and the foundation of efforts to help us change those behaviors. But it doesn’t really work. It is a fantasy.

The Quantified Self

New technology offers a different approach. In the past few years we have spent millions of dollars on a host of devices and apps that passively track our behaviors – products from Fitbit, Nike, Jawbone, Garmin, and others. The theory behind this technology, or movement (called “The Quantified Self” by some), is that more data – and more accurate data – about our behavior will help us understand ourselves better, and thus provide a foundation and methodology for improving ourselves.

Today’s technology tracks our steps, sleep patterns, communication habits, and more. Tomorrow’s technology will automatically log the food we eat, its caloric and nutritional components, and its effect on our bodies. This passive tracking is clearly a more realistic approach for us fragile, distracted, willpower-exhausted humans. The machine collects the data in clever ways. The algorithms automate the analysis of that data, giving us insight into our habits and patterns and helping us track our progress toward a goal.

Of course this approach to problems – any kind of problem – is de rigueur. We know it as Big Data and it is prescribed as a solution to everything from unemployment to world hunger.

We are bringing the Quantified Self philosophy to companies, governments, and entire nations. Tomorrow we will have the Quantified Organization, with the promise that decisions based on tradition and superstition will be replaced by decisions based on facts and evidence.

The Quantified Organization

It is easy to be cynical about Big Data. Sometimes I am. But mostly I get it and I believe it. Clearly it raises a host of business, policy, legal, ethical, and societal issues. In any case, it doesn’t matter whether I get it or not: it will be the way that we function as organizations – and increasingly, as individuals – moving forward.

The idea that we should make decisions based on facts or evidence as opposed to tradition, intuition, and superstition of course derives from the Enlightenment and the scientific method itself. But even in areas where you might expect that this approach is already baked in, there has been a push to focus on the evidence. In the 1990s, for example, the concept of “evidence-based medicine” (or “evidence-based practice”) was introduced into the medical field and has since taken hold as an operating philosophy in branches of medicine from optometry to dentistry.

Applying the best available research results (evidence) when making decisions about health care. Health care professionals who perform evidence-based practice use research evidence along with clinical expertise and patient preferences. Systematic reviews (summaries of health care research results) provide information that aids in the process of evidence-based practice.

If the practice of medicine – which has embraced the scientific method for over a century – can benefit from a heightened focus on evidence-based decisions and policy, then surely there are other practices that could benefit from it as well. Any come to mind?

How about Information Governance?

Evidence-Based Practice and Information Governance

Today in IG we make so many decisions, and craft so many policies, based on nothing more than tradition and superstition. This is especially prevalent in the records and information management (RIM) facet of IG, but it exists elsewhere as well. Why do we have 1,000 categories in our records retention schedule? Because that’s the way the last guy did it. Because we inherited the schedule from a company we acquired. Because Janice liked it that way. Because that’s the right way. Because that’s what makes the most sense to me. Because that’s what my old boss told us to do. Because that’s what the consulting company sold us.

Where is the evidence?

What is true?

Are these justifications based on anything more than tradition, superstition, or office politics?

I propose a new focus for IG practitioners – a focus on Evidence-Based Information Governance. This philosophy should be embedded in everything we do in IG. It is egregious that we wave our hands and rely on purely anecdotal evidence to create fear around information risk. The risk of a spoliation charge in litigation, for example. How often does it actually happen? What is the real risk of it happening? Go look it up for yourself.

We need to bring evidence into the practice of IG. We need evidence to quantify value. To quantify risk. Evidence to make decisions about how much time, money and effort we should put into managing specific kinds of information.
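As a toy illustration of what quantifying risk can look like, consider comparing the annualized expected loss from an adverse event against the cost of the control meant to prevent it. All of the figures below are hypothetical placeholders, not real benchmarks – which is exactly the point: the probability and impact inputs must come from evidence, not anecdote.

```python
# Toy expected-loss comparison for an information-risk decision.
# Every number here is a made-up placeholder, not a real benchmark.

def annualized_expected_loss(probability_per_year: float, impact: float) -> float:
    """Expected yearly cost of an adverse event: probability x impact."""
    return probability_per_year * impact

# Hypothetical inputs: how often does a spoliation charge actually occur,
# and what does it cost when it does?
sanction_probability = 0.002   # 0.2% chance per year (illustrative only)
sanction_impact = 500_000      # $500k in fines and fees (illustrative only)

expected_loss = annualized_expected_loss(sanction_probability, sanction_impact)
control_cost = 25_000          # yearly cost of a proposed retention program

print(f"Expected annual loss: ${expected_loss:,.0f}")
print(f"Control cost:         ${control_cost:,.0f}")
print("Control justified by expected loss alone?", expected_loss > control_cost)
```

With these invented inputs, the control costs far more than the expected loss it mitigates – a conclusion that swings entirely on the probability estimate, which is precisely why that estimate needs to rest on evidence rather than hand-waving.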

It is shameful that today, in 2014, this is the exception rather than the rule in IG.

Today we have incredible tools that can easily shed light on our information, giving us the visibility and the evidence we need to make good decisions. As a starting point, take a look at the providers who support the IGI.

Anyway, this post is getting a little long.

But I am passionate about this idea, and I will keep writing and working to advance it.

Comments

You are completely right: every organization is influenced by its history and the other factors you mention. But I wonder whether it is really possible to wipe those factors out and get “absolute evidence.” Isn’t “evidence” always relative to the intrinsic characteristics of an organization?

For me, “evidence” depends on what we might call “corporate personality”: a combination of corporate culture and the organization’s main focus (growth, financial results, technology, and so on), along with the hierarchy of those elements. Because of this, companies consider only part of the available information as “evidence,” without always weighing the other areas at the same level. The value of information is relative to this main focus, so priorities around IG and RIM depend on it. And my experience as a consultant tells me that, most of the time, companies are not even conscious of the focus that influences their strategy and decisions.

The second factor is one I already mentioned: corporate culture. For example, whether a company has a culture based on trust, or one where all key decisions are made centrally, its decisions about IG and RIM will be different.

So its IG focus, policies, and procedures will differ as well.

Can we really speak, then, of “evidence,” let alone “absolute evidence”? Is it possible to make it absolute?

If you like, I can write a more complete paper on this subject, explaining how relative “evidence” is and how it links to corporate strategy.

Jean-Marc, thanks for taking the time to share your thoughts. You raise a number of useful questions. From your comment, I see at least two major tasks: 1) making a commitment to base IG decisions on evidence, and 2) deciding what “evidence” means. On the latter, probably the most important thing is that everyone agrees on it, even if the evidence is not absolutely true. One of the lies of the Big Data world, of course, is that decisions about which data are important and which analytical results are important still rely on intuition.

Great post, Barclay. You raise some excellent points about the need for intelligence and facts around data decisions. I’ll be sure to share it out with my network. We think alike on this: http://bit.ly/1ql4MYL