data management

March 01, 2010

Today I am working on my presentations for Burton Group's Catalyst conference, "Forging Ahead: Navigating the New Normal," in Prague, April 19-22. Joe Bugajski and I will be delivering the keynote for the business intelligence (BI) track – "What Can IT Do to Deliver Business Insight – Not Just Intelligence?"

BI is 25 years old, yet its promise of important insight eludes most organizations. CEOs list BI as a top initiative and simultaneously lament poor results from large BI investments. One reason for this incongruity is the detrimental effect of the automation mindset and its pursuit of efficiency. BI is aimed at the wrong thing.

For the last 239 years organizations have been applying automation as their tool of choice to improve production and efficiency. An automation mindset erupted with the dawn of the steam engine in 1771 and was honed to razor sharpness by the management revolution started by Frederick Winslow Taylor. This mindset has shaped how many organizations apply BI. For most, BI is nothing more than a glorified reporting tool for examining efficiency. They use BI to look in the rear-view mirror and learn about the past, but most are unable to predict upcoming events or draw lessons that will help them make decisions to navigate uncertainty.

The need for insight is not the tail (technology) wagging the dog (business). Management innovation is reshaping organizations' perspectives on the devotion to efficiency, where the goal is to reduce human involvement. Profitable innovators are not driven solely by efficiency; they look for ways to magnify human involvement. They realize that the reshaping of organizations and management called for by the new normal thrives on how well they foster the development of insight by humans.

Joe and I believe that BI must trend toward a new definition – Business Insight. Insight represents the opposite goal of automation, and it should reshape the primary purpose of most information technology organizations. We hope you can attend Catalyst to discuss this significant shift in thinking.

If you would like information about presenting at our Catalyst conference, or to submit a proposal, see our presenters page.

To get a discounted price of €995 use the promo code “INSIDER” during registration.

February 04, 2010

Part of the Gartner/Burton integration plan calls for, of course, integration of benefit plans. I have some particular insight here, since I have been on both sides of that equation – in health insurance IT, and as a health care provider CIO.

Insurers have always been on the leading edge of technology usage – and that continues to be the case. There is extensive use of websites, online modeling, customer-based data entry, and self-help. It’s cool stuff.

But it’s also an example of a nascent problem with “cloud computing” – or even its red-headed stepchild: e-business.

As businesses have found that an internet-based delivery model solves many issues (and reduces many costs), they have gone at it with a vengeance. So has government – health plans are becoming increasingly complex (HSAs, HFSAs, and CDHPs are all in the alphabet soup), because a side effect of increasing automation is that it is (ostensibly) easier to develop more complex plans, and easier to deliver them. Hence, like water flowing into an empty glass, the level of complexity rises.

This issue is not unlike a situation I found myself in a few years ago. After months of bake-offs, evaluations, vendor grilling, RFIs, RFPs, etc., we were about to make a major software decision and were reviewing vendor contract terms (it was a $15-20 million purchase). Of course, the contract was internet-accessible, in PDF format; and it was littered with hyperlinks to terms on web pages, references to commitments on web pages, and detailed product features on web pages.

I wouldn't sign it; they were reluctant to change it, and it almost cost the deal.

Besides the complexity of referencing material outside of the signed document, there was the issue of audit – clearly those web pages change frequently, almost at whim – with no clear audit trail of who/when/where/why. God forbid we should get into an argument at a later date (inevitable) and both parties look at the terms, only to find they were written on beach sand scoured by hundreds of successive tides of changes… In the end, they dutifully printed out all of the referenced web pages, we reviewed them, and explicitly included them in the contract.
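These days, a lighter-weight alternative to printing everything might be to snapshot each referenced page and record a content hash and retrieval time alongside the signed contract. A minimal sketch of that idea (the URLs and output file are hypothetical, not from any real deal):

```python
import hashlib
import json
import time
from urllib.request import urlopen

# Hypothetical list of pages the contract references by hyperlink.
REFERENCED_URLS = [
    "https://vendor.example.com/terms/support-policy",
    "https://vendor.example.com/terms/product-features",
]

def snapshot(url: str) -> dict:
    """Fetch a referenced page and record what it said, when, and a hash."""
    body = urlopen(url).read()
    return {
        "url": url,
        "retrieved_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "sha256": hashlib.sha256(body).hexdigest(),  # any later edit changes this
        "content": body.decode("utf-8", errors="replace"),
    }

if __name__ == "__main__":
    # Write an audit file to attach to the signed contract.
    with open("contract_snapshots.json", "w", encoding="utf-8") as f:
        json.dump([snapshot(u) for u in REFERENCED_URLS], f, indent=2)
```

If a page later changes, the stored hash no longer matches, so at least the who/when/what argument has a fixed baseline.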

I submit this is an example of yet another potential unintended consequence of cloud computing; we are attracted by the dynamic and elastic aspects of cloud, but those same features can be detriments to a long-term, sustainable business model.

A potential way to deal with this may come out of design thinking and hybrid thinking approaches. Gartner research is exploring these ways of thinking about IT solutions, and their impact on business strategy…

When I had 20 different websites open from 5 different companies, and multiple time-sensitive modeling and data capture pages in process, it occurred to me that the benefits enrollment process is approaching lunacy – not any one business’s fault, just a zeitgeist. I am not sure whether health care reform (if it ever happens) will worsen or improve this situation. But issues around appropriate automation, user interface design, and silo integration are at the heart of what should be on an IT leader’s agenda for the coming years… and cloud computing will only add fuel to the fire…

Now if I could only get Fidelity to stop insisting that I rely solely on web-based statements and alerts…

January 18, 2010

In my last post I stated, "This is my first post as a Gartner employee. In the spirit of business as usual, here we go". Truthfully, this is anything but business as usual, and that is good. The New Normal is filled with business as un-usual.

Taking a page from my own research in April 2009, "Real Transformation: Why IT Change Is Not Enough"... "Your organization’s success has been built on a set of assumptions. Those assumptions have driven choices related to every aspect of what you do and how you do it. But fundamental assumptions upon which you have built success may have dramatically altered. Like any life-changing event, this may alter what customers you serve, the products and services you provide, and the processes and information required to support a new normal".

In the new normal, all organizations need to release the death grip on what they think is important based on assumptions about the past, and they must keep in mind two uniquely human characteristics.

First is our ability to derive comfort from 'business as usual' and to let it obscure anything other than what we 'want to see'. As Michael Disabato writes, "Processes provide a framework and guidance, but should not be relied on as a universal panacea. While processes must be executed to be effective, blind reliance on them in the face of changing requirements is a recipe for disaster. Likewise, metrics can be a trap for the unwary". Humans have an ability to develop blind reliance on the status quo and 'facts' -- we manufacture reality.

Many facts are 'facts' because we have declared them to be 'fact'. The analysis that provided comfort to the managers of financial instruments leading up to the sub-prime mortgage crisis is one example. Another piece of this puzzle about 'fact' is in front of your eyes right now -- the device that allows you to read this blog post. The device allows you to search and find your own facts, but it also has the power to obscure them. Many of the business and information systems within organizations present a fact that was determined by someone (or something) else. Do you really know why the information should be trusted? And when a data environment holds multiple pieces of data that could all be considered the same 'fact', the one selected is just one of many possibilities. It really makes you think, doesn't it?
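To make that last point concrete, here is a minimal sketch of the selection problem (the sources, values, and survivorship rule are all hypothetical): when several systems hold candidate values for the same 'fact', the value you are shown is simply the survivor of a precedence rule that someone chose.

```python
from datetime import date

# Hypothetical: three systems disagree about the same customer's address.
candidates = [
    {"source": "CRM",     "updated": date(2009, 11, 2),  "address": "12 Main St"},
    {"source": "Billing", "updated": date(2010, 1, 15),  "address": "7 Oak Ave"},
    {"source": "Support", "updated": date(2009, 6, 30),  "address": "12 Main St."},
]

# One common survivorship rule: the most recently updated source wins.
fact = max(candidates, key=lambda c: c["updated"])
print(fact["address"])  # "7 Oak Ave" -- one of several defensible answers
```

A different rule (trust Billing over CRM, say) yields a different 'fact' from the same data.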

The over-reliance on ‘facts’ also applies to planning by past patterns. Don't assume the situation is the same and that the old business playbook still applies. A pattern may render an important piece of the puzzle, but assuming it is the same puzzle is dangerous. (Also see "'Fixing Intel' is a wake-up call for analysts everywhere".)

The second uniquely human characteristic is our ability to combine information from multiple contexts to create new insight. This is an amazing gift that we all possess. But as Gary Hamel states in "The Future of Management", "The machinery of modern management gets fractious, opinionated, and free-spirited human beings to conform to standards and rules, but in so doing it squanders prodigious quantities of human imagination and initiative. It brings discipline to operations, but imperils organizational adaptability". Hamel's statement is not a call for anarchy; it is a call for all of us to engage the human mind and spirit to do amazing things that are otherwise impossible, and to reorient organizations so that this can happen.

I believe we need to reorient our technology focus too. Our IT focus for the last 50 years has been to automate and to fuel the credo of modern management: pursue efficiency ahead of every other goal. This is not surprising, since modern management was invented to solve the problem of inefficiency.

Frederick Taylor, who is regarded as the father of modern management, believed that an empirical, data-driven approach to the design of work would yield big productivity gains. Hamel quotes Taylor, who maintained that efficiency came from “knowing exactly what you want men to do, and then seeing that they do it in the best and cheapest way.” But as Hamel further illustrates, this has become a liability for organizations trying to drive innovation and tap into the human capacity for discovery. An overt focus on efficiency and its brother, automation, can hobble our ability to benefit from human insight.

The new normal requires forward-looking insight versus an over-reliance on historical patterns and 'facts' that may be irrelevant. The new normal is business as un-usual. As we proceed, we need to account for these two uniquely human characteristics to make the most of what lies ahead.

November 17, 2009

How many emails do you get daily? I remember when my response of 50-100 per day would elicit a “yeah, right… do you get anything else done?”

Sad to say, 50-100 per day is probably the low end of the norm for most people these days, and my workload has topped 150-200 a day… then add my personal inbox of 50-100.

Our collaboration and content group has a lot to say on information overload. Craig Roth has for years talked about the concept of information pull, instead of push. That is, set up a system so that you can easily get what interests you, rather than just get a torrent of unsolicited stuff that screams for attention.

One of the ways is to move some of that load to wikis or discussion groups. The jury is still out for me on that, and my take on the use of wikis (for anything other than joint authoring, as with Wikipedia) is skeptical. But some of the benefits of SharePoint discussion groups, or even wikis, are (as Craig pointed out):

it enables simple subscription/unsubscription to the topic

it maintains a historical record of how decisions were made for future justification

it allows new employees to gain visibility to the whole thread

it provides thread management, such as a tree structure

it surfaces tribal knowledge so the corporation can retain it (rather than this knowledge residing in our personal inboxes, which are unsearchable and destroyed when we leave)

I guess so.

But it’s still yet another place to go, yet another place for MORE information, yet another place vying for your attention. Add to that blogs, news feeds, Facebook, Twitter… well… it’s just piling on. Where will it all lead?

We may be at a tipping point where the processes for managing information are far more valued than the delivery mechanisms. One can only hope. Lyn Robison (from our Data Management group) loves to rant about how business people put up with bad information from their IT departments. No doubt. The sad thing is businesses want more.

Like the old shtick about the two senior citizens in the Catskills: “This pie tastes awful!” says the first, to which the friend replies, “YES! And the pieces are too small!”

Dan’s approach, and his back-handed chastisement of traditional IT, certainly fit the trends that we at Burton Group talk about.

Specifically, the consumerization and democratization of IT. A recent Computerworld report brought the point home: Internet access is now a human right, according to the EU.

So, as information technology continues to become integrated in everyday life, the trend toward “organic IT” will only get worse – or better depending on your POV.

That’s why I often talk about IT integration versus IT alignment.

But the flip side of organic is chaos (both data and process) and unintended consequences. Think invasive species.

The business has ultimate control and responsibility for both data and process, and the organic approach, if not managed, can negatively impact both.

Usually what that means is that the organization looks for someone to mediate decision-making about organization and systems… and the only part of the organization that has the expertise and (ostensibly) an umpire-like lack of line-responsibility skin-in-the-game is IT – so IT ends up managing the use of technology and all that goes along with it -- including governance processes.

Whether governance fully involves the business, or IT just takes on the mantle of responsibility, is a tricky balance. The latter often occurs at the urging/delegation/abdication of business units. I would submit that IT needs another set of skills – that of making sure that technology (read: data and process) decision-making is business-driven.

It’s the proverbial “give a fish vs. teach to fish” scenario – but teaching to fish assumes a willingness to learn on the part of the student, and some level of skill by the teacher. And these skills are not technical skills – they are relationship and influence skills. Not the usual courses taken by a rising C programmer or a database administrator.

When getting the business fully involved in technology decision-making doesn’t work well, it's just a short hop to blaming IT, or to IT overreacting (again as proxy for the business) and setting inflexible standards that squelch business innovation.

That’s why architecture (and governance) is so important.

It’s all about collaboration and communication, and neither a Soviet-style centralized approach (which, incidentally, has strong similarities to how capitalist companies are managed), nor a total laissez-faire anarchy (toward which organic has strong tendencies) will work.

Which reminds me of a t-shirt…

“Give a man a fish and you’ll feed him for a day, teach a man to fish and he’ll sit in the boat drinking beer all day…”

October 29, 2009

Rocky Horror Picture Show fans will immediately sing the title of this post in their head. It's funny how the mind associates things. The same thing happened when I read a recent article:

(Edited excerpt) "The initial reaction was anything but positive. They thought it was a horrible idea... every institution he funded demanding ever more computer power and duplicating research on those machines. At the time computers were completely incompatible and moving data was a huge chore. The resistance came about because those institutions wanted to keep control of their computer resources. But, they soon saw that hooking up to it meant a huge increase in the potential computer power they had at their disposal. They quickly learned that there was a tremendous gain for them, and it also fulfilled the goal of cutting spending on computers".

I read this and thought it could apply to a discussion about cloud computing, decisions about shared corporate computing environments, or just about any transformative change.

What did this quote apply to? Forty years ago today, data flowed between the first nodes of what was then known as Arpanet -- the beginning of the Internet. Mark Ward of BBC News interviewed Dr Larry Roberts, the MIT scientist who was fundamental to Arpanet. The article "Celebrating 40 years of the net" describes what happened.

Happy Birthday Internet! The story is just another reminder that human behavior is a critical aspect of change.

September 19, 2009

Sometimes intriguing stories come from unexpected meetings. Because of a canceled flight, I needed to share a cab from Basel to Zurich with two other young men – they looked slightly older than college age to me, although I suspect they were in their 30s.

Rupert had a backpack and an IBM laptop. After brief introductions, his excellent English led into an engrossing conversation on linguistics, language extinction, and programming. Rupert’s undergrad degree is in computer science, but he is now involved in university work in linguistics. I presume he teaches – but we never got far enough to find out. Our conversation led to a discussion of languages, language extinction, and his current research focus. Being from the Boston area, I referred to MIT's Noam Chomsky – which elicited much disdain from Rupert: “he’s not a real linguist”. I sensed he felt that Chomsky bordered on being a fraud.

Rupert is particularly interested in Slavic languages, specifically those in danger of extinction because so few people speak them. I presumed at first it was because of the predominance of American English (the global lingua franca); but in the Czech-Polish-Russian belt it is actually the slaughter of languages and dialects by Russian-speaking hegemony.

Even more interesting was how he is trying to save endangered languages. He has spent some of his research effort using XML style sheets and XSLT – web programming technologies – to codify the structure of languages that are bordering on extinction. In one specific case, he is working from two written guides to the language, published a hundred years ago or more, and capturing the language's rules and constructs using style sheets and XSLT transformations.
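I don't have Rupert's actual style sheets, but the general shape of the technique might look something like this: capture lexicon entries as XML, then apply an XSLT transform to render or query the encoded rules. A minimal sketch using Python's lxml library (the element names and the sample entry are my inventions, not his):

```python
from lxml import etree

# Hypothetical lexicon entry captured from a century-old written guide.
LEXICON = etree.XML("""
<lexicon language="endangered-slavic-dialect">
  <entry lemma="voda" pos="noun" gender="f">
    <form case="nominative" number="singular">voda</form>
    <form case="genitive" number="singular">vody</form>
  </entry>
</lexicon>""")

# An XSLT transform that renders the encoded grammar as readable text.
TRANSFORM = etree.XSLT(etree.XML("""
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:strip-space elements="*"/>
  <xsl:template match="entry">
    <xsl:for-each select="form">
      <xsl:value-of select="../@lemma"/>: <xsl:value-of select="@case"/> = <xsl:value-of select="."/><xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>"""))

print(TRANSFORM(LEXICON))
# voda: nominative = voda
# voda: genitive = vody
```

The same encoded structure could just as easily be transformed into a dictionary, a paradigm table, or a teaching aid, which is presumably the appeal.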

So he is using what is basically a modern computer language to preserve legacy spoken languages. Brilliant. He promises to send along his research when complete. I can’t wait.

May 04, 2009

Isn't it nice to sit down for a helping of comfort food? I grew up with pot roast, pork chops and gravy, fried chicken, mashed potatoes, and other wonderfully tasty foods. I don't eat them all the time, but when I make them now I have fond memories of my childhood. It is... comforting.

The way IT implements business solutions today is loaded with comfort food. The solution paradigm has incrementally improved from the days of the mainframe. Sure there have been some major improvements that give the illusion of a transformation, but the dominant paradigm has been to build solutions inside the four walls of an environment protected at the perimeter. We are comfortable with it!

Like blocking the holes in a sieve, we keep trying to incrementally improve our approach, and stuff keeps falling through. Identity, security, services, integration, information management... it is all a mess, and it becomes more of a mess when you try to apply it to a new paradigm like cloud computing. Complexity is constantly increasing, and IT staff just continue to cope with the burden. I think it is a sign that you can become comfortable with anything. As long as it is incremental, you can put up with a lot of pain.

Two areas where the effect of comfort food is evident are the application portfolio and information quagmires. Many organizations have been ordering high-fat functionality from IT. But IT—not the people and departments ordering this unhealthy fare—has been piling on the pounds. The result is a bloated application portfolio, redundant data, and complex, duplicative infrastructures. The business does not see the weight gain. These are two areas that cannot transform without the business at the helm pushing for a new lifestyle.

Cloud computing requires a new lifestyle.

When the assumptions fundamentally shift, incremental improvement that addresses “change” is the less-able brother of transformation. Incremental change typically occurs by examining what’s not working in the organization and then taking a modest step forward. Like a house inspector looking for cracks in the foundation and other imperfections, you assemble a list of problems. Estimates are then formed for all the repairs, and you may find that you have neither the time nor the resources to fix them all. Therefore, a small number of achievable modifications are chosen to make the house more livable, slightly more efficient, and less prone to breakdowns. But many assumptions and constraints remain the same.

Transformation is different. Life changes present the opportunity for individuals to transform themselves. Yet many look to incremental change when it is incapable of yielding the transformation they seek. Sometimes we need a new house, yet we falsely believe that incremental improvement will achieve the same thing. The IT castle we have created...

prone to leaks...

constantly needs repair...

only keeps out the casual bad guy...

costly to maintain...

difficult to remodel, and

no matter how much we remodel it...

It still doesn't seem to be what we need. It's time for something really new.

Cloud computing could provide the vision for a new lifestyle, but my concern is that our ravenous hunger for comfort food will just keep packing on the pounds for IT. Cloud must support a paradigm where interoperability is a forethought; where security, identity, and entitlements are a fundamental part of a service's invocation; where the compromises we make about data do not expose us more than we already are; and where the utility we dream it to be does not fade like Camelot.

Look at IBM's announcement of a private cloud appliance as an example. Create an appliance that runs a virtual machine, runs WebSphere applications, and slap a cloud sticker on it -- PRESTO -- Cloud Computing is Here! NOT!!! It seems to leave out the policy enforcement and operations management aspects of cloud.

Then listen to Larry Ellison. His sarcasm is evident as he says cloud computing describes everything we've ever done or will do. I agree that if it is not truly different, then we are only having another heaping serving of comfort food.

Mr. Ellison continues on to say, "I don't know what we would do differently in light of cloud computing". It seems to me that many vendors do not either, and sales of "cloud ready" stickers are set to skyrocket.

If we truly intend cloud to revolutionize IT then it must be revolutionary. Beware of new stickers that suggest doing the same thing will give you a new result.

The Chinese government has a new identity card. The character recognition software used can only recognize 32,252 of the roughly 55,000 Chinese characters. As a result, 60 million Chinese citizens cannot get new cards. But the unique combinations of characters that form names make it even worse. The government list of names will include only 8,000 characters. As a result, people with unique names that do not make the cut are being asked to change their names. "Mike, we have determined that our IT systems can only recognize one letter in your name, so we would like you to be named M."

This is not just a folly of a government's inability to protect its culture; it is a profound example of IT limitations imposed on a fundamentally human quality - individuality. But IT limitations do not need to be as profound as human individuality for us to take note of this problem. We don't need to look far for other examples where a technical limitation imposes sometimes-hidden restrictions.

One example is the limitation imposed by the carrying capacity of the data formats used inside the IT systems we create. Field-length limits in many IT transaction systems force organizations to reduce the meaning of the information the format is meant to convey, because the information does not fit the format's carrying capacity. The impact is compounded as data flows from one system to another, each with its own format and length quirks, adding to the reduction in information fidelity.
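To illustrate the carrying-capacity point (the field width here is made up), consider what a fixed-width byte field does to a name; a multi-byte character can even be split in half on the way through:

```python
# Hypothetical downstream system: a 10-byte fixed-width name field.
FIELD_BYTES = 10

def store(name: str) -> str:
    """Truncate to the field's carrying capacity, as many legacy systems do."""
    raw = name.encode("utf-8")[:FIELD_BYTES]      # chop at the byte limit
    return raw.decode("utf-8", errors="ignore")   # any split character is lost

print(store("Christopher"))  # 'Christophe' -- meaning quietly reduced
print(store("王秀英女士"))     # '王秀英' -- the fourth character was split mid-byte and dropped
```

Run the result through a second system with a different limit and the fidelity drops again, which is exactly the compounding described above.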

Information fidelity is a problem for almost all organizations, but the business is unaware of just how messy the information environment is. My friends in our Data Management Strategies coverage area have a deep understanding of data quality and data interoperability issues like this.

Back to the Chinese ID card for a moment... Maybe the IT project started with the fact that 100 surnames cover 85 percent of China's citizens, versus 70,000 surnames covering 90 percent of Americans. So what's the big deal? We increased the number of names people can have - deal with it! The IT assumption being that an increase in the number of recognized names is good enough.

I'm sure there was a reason for the limitation, but I think the original assumption needs rethinking. If you assumed citizens could have any name, and you have an ID card with a unique number for every citizen, do you really need a registered name that can be recognized as an authorized character combination? Couldn't you just treat it as a string of Chinese characters until the software improves? Perhaps there is another way around this issue.
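A sketch of the two designs side by side (the character list and the name are stand-ins, not the actual government list):

```python
# Hypothetical allow-list design: only registered characters are legal in names.
AUTHORIZED_CHARS = set("王李张刘陈")  # tiny stand-in for the 8,000-character list

def register_validated(name: str) -> str:
    if not all(ch in AUTHORIZED_CHARS for ch in name):
        raise ValueError("please change your name")  # the policy in the article
    return name

def register_opaque(name: str) -> str:
    # Treat the name as an opaque Unicode string keyed to the unique ID number;
    # recognition software can catch up later without anyone being renamed.
    return name

rare_name = "王喆"  # hypothetical name containing a character off the list

print(register_opaque(rare_name))   # stored as-is
try:
    register_validated(rare_name)
except ValueError as e:
    print(e)                        # "please change your name"
```

The unique ID number already provides identity; the opaque design defers the recognition problem to the software instead of to the citizen.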

Maybe the assumptions have been questioned, but it is concerning in a broader respect. Has IT become complacent about these types of limitations, to the point where we unconsciously obscure their effects and, as in this case, undermine an ancient culture?

April 14, 2009

I've been thinking about Joe Bugajski's recent blog post about his health ordeal and resulting skepticism about electronic medical records. OK, skepticism is perhaps a bit mild compared to what Joe actually feels.

Joe had a life-threatening health emergency in early 2009 that was exacerbated by inaccuracies and user experience problems in the electronic medical records system. He calls out weak data modeling as the culprit preventing effective insight into his medical history, medications, and courses of action.

As I read deeper into Joe's personal blog, some other things emerged that I want to call attention to. Health care culture and styles of work, especially in fast-paced emergency settings, may resist the intrusion of technology beyond that which attaches to the patient's body. Asking a doctor or nurse to divert their attention to data entry or retrieval takes them away from their primary goal, even though that data may contribute to the overall care of the patient. In effect, the technology becomes a distracting third party in the provider-patient relationship. It is not a passive technology, and as a result it may actually increase the risk of medical error.

To be effective, medical records technology must be passive and comprehensive. It must become transparent through use, so that the provider-patient relationship is strengthened, not weakened. As Joe suggests, data modeling is an essential component of ensuring that the correct data is surfaced at the correct time to the correct device and person. The data model is a logical representation of the patient drawn from various sources—some historical and some real-time. Some data is more critical than other data, and that criticality can shift in the midst of care. It is not a replacement for human insight; automation of patient data will not improve care on its own. Automation of correctly modeled data should facilitate and accelerate human insight.
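As a thought experiment (this is my sketch, not Joe's model or any real EMR schema), criticality and provenance could be carried in the model itself, so the system decides what to surface for the current context rather than leaving the clinician to dig:

```python
from dataclasses import dataclass

@dataclass
class Observation:
    patient_id: str
    kind: str          # e.g. "allergy", "medication", "lab_result"
    value: str
    source: str        # provenance: which system or chart it came from
    criticality: int   # 0 = background detail, 3 = must surface immediately

def surface(observations, context_threshold: int):
    """Return only what matters right now; the threshold shifts with context."""
    return sorted(
        (o for o in observations if o.criticality >= context_threshold),
        key=lambda o: -o.criticality,
    )

history = [
    Observation("p1", "allergy", "penicillin", "clinic EHR, 2004", 3),
    Observation("p1", "lab_result", "HbA1c 6.1%", "lab feed, 2008", 1),
]

# In an emergency context, raise the bar so only critical items appear.
for obs in surface(history, context_threshold=3):
    print(obs.kind, obs.value, "(" + obs.source + ")")
```

The point is not the code but the shape: criticality is data, not a display decision bolted on afterward.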

Electronic medical record technology should fit the various ways that medical professionals work. To the extent possible, EMR approaches should be woven into the ergonomics of care. But at this early stage, EMR processes are bolted onto hospital procedures. Perhaps the early demand for EMR is driven more by liability protection than by improved care. The paradox is that unless EMR entry and retrieval are integrated into overall work practice, risk and liability increase.

The German philosopher Martin Heidegger wrote about "thrownness", "breakdown", and "readiness-to-hand", and these concepts apply in this and many other human-computer interaction (HCI) scenarios.

Thrownness is the ability to react intuitively in highly fluid situations. Thrownness is the result of learning and assimilating actions so that they can be performed without active analysis or thought. When it comes to golf, for example, Tiger Woods exhibits thrownness.

Breakdown occurs when intuition is interrupted and the objects present in the scenario stop functioning as a cohesive set. Breakdown occurs, for example, when your car blows a tire at high speed and your attention is suddenly drawn to the control of the car. In effect, the actions that were suppressed into your mid-consciousness come to the fore. Breakdown occurs in HCI when your intuitive actions are interrupted by system issues (like crashing programs or annoying data entry).

Readiness-to-hand occurs when a tool perfectly matches its function. You see a nail, you reach for a hammer. There is no analysis necessary. The act of hammering a nail is not intruded upon.

It is difficult to design sophisticated tools that allow for human "thrownness", avoid breakdown (or handle it gracefully), and reach the stage of readiness-to-hand. This is especially true in a highly human-dependent work environment that moves at an accelerated, interrupt-driven pace. Unless the technology you place into that environment is "ready-to-hand", it will inhibit effective work rather than support it. I'm afraid that's what happened to Joe.

So, healthcare will not be improved by EMR alone. Good data modeling and significant user experience design will work together to make humans more effective, and healthier too.