I am a photographer. Wherever I go in the world, my camera comes with me and after my work day is over I use the camera as a creative outlet. Capturing how light falls in the world around us is for me a way to express myself and create art, to be an artist in addition to an IT professional, husband, father, friend.

Unfortunately for me, I am also a Linux user, and thus my options for post-production photography management are extremely limited. There are some open source tools that show promise but do not yet meet my needs. Most photographers use Adobe Photoshop or Lightroom, but neither of these tools runs on Linux yet. Fortunately, there is one program, called Bibble 5, that is in many ways just as good as the Adobe tools, and it runs on Linux. I had been using Bibble 5 for a couple of years when Corel decided to acquire Bibble and its developers in January 2012. Bibble had been built by a group of five or six very skilled developers who created a wonderful tool. They were like open source developers - open, available, interactive, and proactive about communication. They maintained blogs, hosted forums, answered emails, and were extremely responsive to bug reports, customer needs, and new ideas. They created a third-party plugin model for their software and had an enthusiastic ISV developer ecosystem that provided excellent additional tools for the product.

The Corel acquisition was warmly welcomed by the Bibble customer and ISV community, and Corel won immediate goodwill by quickly releasing a new version of the product, now called Aftershot Pro (ASP), in February, just two months after the close of the acquisition. The new version had the same power as Bibble, but with a nicer interface, some new tools, and additional camera support. However, Corel was not quick to migrate the communication practices that Bibble had used so effectively to maintain customer loyalty.

Corel had a different communication culture. They talked to us when they had new product versions to announce or other tools to sell us, and otherwise nothing - silence. The user community, used to vibrant interactions amongst themselves to solve problems and trade advice, set up their own user forum. On the forum, users posted general discussions about their uses of the product, reported bugs and improvement suggestions, posted and tested plugins, and discussed photography in general. Corel set up a Facebook page to promote their software, and users gratefully posted photography examples created with Aftershot. A small point release was announced and shipped in May, and it seemed that the product would become a real Adobe competitor for the Linux world.

Over the summer, one of the plugin developers learned that all of the former Bibble developers had been terminated by Corel. This information was posted in the ASP Forum and it caused immediate alarm among users. The Bibble developers were extremely popular among end users, and their termination was seen as an ominous sign of cost-cutting at Corel that would have dire consequences for the future of the product. Some users expressed their fears that the Linux version of ASP would soon be discontinued. Others analyzed past Corel acquisitions and noted historical declines in innovation that followed similar terminations of acquired development teams. Still others fumed about outsourcing in general and discussed fears that new developers would not understand customer needs.

This conversation deteriorated over the summer. In August, one customer started a thread called "Goodbye ASP," discussing the features he was missing and why he would move to another product. Another customer started a thread called "What are you switching to?", where many of us discussed competing open source products and running Adobe in virtual machines. The fear and anger in these threads accelerated in September, and more customers started threads bashing the ASP product and its lack of innovation and of new camera and lens support. One thread invited users to post photos with unattractive artifacts created by ASP, illustrating how the product ruined photos. Another thread helped users transition to competing products, with tips on migrating photo catalogs. Soon, plugin developers started posting on competitors' forums, discussing their projects to port their plugins.

You can see this conversation here: http://forum.corel.com/EN/viewforum.php?f=90

All the while, over three months of rising rumors, customer fear, and anger, Corel was absolutely silent. Customers posted threads asking Corel to comment on the Bibble terminations and product improvement plans. No answer.

The damage that was done to this company's reputation is enormous. I am a customer who has already learned to use other tools to process my photos. Hundreds of other current customers have been turned into vocal antagonists. Corel has lost customer trust and goodwill not 10 months after spending millions of dollars to acquire a terrific product and its loyal customer base.

For all we know, Corel is working on a terrific upgrade that will soon ship and do amazing things for us. But I can say clearly that few existing customers will recommend it because the company has demonstrated complete ignorance of customer communication.

What should they be doing, and what can they do now to change the situation? Corel should be using Big Data to analyze social media feeds from their user forums, Facebook, and Twitter. They should be looking for negative feedback, and they should respond immediately with information and reassurances. Companies build sophisticated call-center applications to answer questions from customers who proactively call. They need to build similar systems to read customer sentiment passively, as customers discuss their product and competing products online. This is extremely valuable information that must be captured, analyzed, and used proactively every day. There is also important information about product features and customer needs to be found on other products' forums. Companies today need to be seen as responsive, sensitive, and caring. They need to behave like open source communities, sharing bugs, discussing features, and working with their user base as an extended community of experts and product architects who know the product as well as the developers do.
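As a rough illustration of what such passive listening might look like, here is a minimal sketch that scores forum posts with the open source NLTK VADER sentiment analyzer and surfaces the most negative ones for a human response. The sample posts, threshold, and handling are illustrative assumptions, not anything Corel actually runs:

```python
# A minimal sketch of passive sentiment monitoring, assuming forum, Facebook,
# and Twitter posts have already been collected as plain-text strings.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

def flag_negative_posts(posts, threshold=-0.5):
    """Return (score, post) pairs whose compound sentiment is at or below threshold."""
    flagged = []
    for post in posts:
        score = analyzer.polarity_scores(post)["compound"]  # -1.0 (negative) to +1.0 (positive)
        if score <= threshold:
            flagged.append((score, post))
    return sorted(flagged)  # most negative first

sample_posts = [
    "I am angry and disappointed - the silence and lack of communication is terrible.",
    "Love the new camera support, thank you!",
    "Has anyone tried the latest lens profiles?",
]
for score, post in flag_negative_posts(sample_posts):
    print(f"{score:+.2f}  {post}")  # route these to a human who responds publicly
```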

Companies who do not integrate user base communications and social media transparently into their customer service solutions via big data won't long have customers to serve.

Take my advice. Learn from Corel's mistakes. You don't want users complaining like this without company communication:

"Hello,

more than one month since the beginning of the crisis and
still no news from COREL. The post on facebook seems to be just words
and only words but actually no real fact were shown. No a single beta,
not a single concrete action were presented to the customer (for all of
the customer, the corel one let'say the new comers and the ""old""
former customer of Bibble)

I am really considering to switch
now. I was a customer of Bibble 4, then I went to Bibble 5, with its pro
and cons but at least I was happy with this SW, and its community, its
evolution, and its third party plugins. Today it s only rumours, there
is no evolution, the old users are leaving, the company seems be
completely static, and thirds parties developers are loosing complete
interest into the product.

I don't want to waste more time in a
software which goes nowhere, where there is absolutely no concrete plan
for the future, and might disappear from one day to the other. As a
linux users I wonder if I should switch to a solution with a virtual
machine - that one will be dedicated to RAW processing, without any
internet access neither crazy anti-virus running on it... - what are your experience: VirtualBox or Vmware ? -
which software to put on the virtual machine ? lightroom, Capture one,
DXO ... ? I was still happy with the speed of ASP and its
responsiveness. I really want a solution which doesnt freezes every few
seconds before applying a change. Once again what are your experience
in this virtualized world ?

I know that there are still some
solution under linux, at least 2 : rawtherapee and darktable. But from
my point of view they are not yet ready. - rawtherapee is quite
good in terms of rendering, but it s so slow that it become unusable for
a long and correct workflow. More there are some important features
for me which are missing, such as layer, region correction, grad filter,
black and white tools, catalogues ...- darktable is a little bit
too young, it might become a solution. The quality output is still not
there, and the user interface is a bit unbelievable sometimes.

I
was a really happy user of bibble 4, an happy user of bibble 5. I
wished that ASP could have been a real alternative... That's the past, I
will not recommend this software as I could have done."

The US Freedom of Information Act (FOIA) is celebrating its 45th Birthday this year. It was signed into law by Lyndon Johnson on July 4, 1966, and its goal was to provide Americans with the right to petition their government to release documents deemed in the public interest which might not otherwise ever see the light of day. Since FOIA was enacted, hundreds of thousands of public records have been released. However, the process is not easy. Requests must be made in writing, documents must be found and analysed by the government, and FOIA requests can often take many years to fulfil if the government has an interest in withholding the information.

All requested records are provided on paper because in 1966 that's all there was. Computers occupied huge glass rooms and were not used for document archival or retrieval. Today, of course, huge amounts of data and documents are stored on computers, and the process of document retrieval should be much faster. But the government doesn't really want to make it easier, because that would make government accountability far more transparent, and let's face it: people on the inside don't like transparency.

But the opacity of information is damaging. It creates asymmetries that favour organisations and disadvantage individuals.

Examples. You go to your doctor when you are sick and they pull out a big file on you. If you see a specialist, the doctor forwards your file to them. Get surgery and that goes in the file too. How come doctors have your data but you don't get a copy by default?

Answer: the medical industry just hasn't worked that way in the past and giving up your data means they give up some control over you as a patient. If you had your own data, you could share it online anonymously and ask many other patients and practitioners around the world to offer you options that might help you deal with your illness and find a cure beyond the scope and capabilities of your local practitioners. That capability isn't in their interest as care providers, but is in your interest. Unfortunately, your health information isn't free to obtain, and the continued opacity of your own data hurts you.

Another example. Congress is being lobbied today to pass all kinds of new restrictions on copyright infringement. Websites may be taken down if infringement is alleged, and there's even a new bill proposed to make the streaming of copyright material illegal. Why is it that corporations get so much protection for their content but you and I enjoy almost none? Why can't we copyright our Personally Identifiable Information and force organisations to pay us to use it?

You don't have Information Freedom if you can't even control your own information.

Let's reform FOIA before we celebrate its 50th Anniversary and make Freedom of Information a universal human right. Our government should be transparent; access to trusted information should be unhindered, cheap, and universal; and we as citizens and consumers should be able to exercise far more control over our own information as a fundamental right of freedom.

On March 22-23, Information Governance Community Members will meet in Asheville, North Carolina to conduct a Maturity Model Workshop. We will use the Maturity Model to self-assess our capabilities. But if you can't make it to Asheville, join us on-line as we broadcast the Workshop live so you can participate virtually.

You can follow our progress on the phone, see the presentations in a WebEx, and perform your own Maturity Model Self-Assessment at www.infogovcommunity.com. It's a lot more fun doing it together with your peers.

There will be a webmeeting for this presentation:
1. Go to the URL - http://www.webdialogs.com
2. Click the 'Join a Meeting' button in the top right corner of the page
3. Enter the Conference ID: 3208928
4. Enter your name and email address
5. Click the 'Log In' button

The moderators, times, and descriptions of each section follow.

Metadata: Amy Pfaff, TIAA-CREF

Self-Assessing Information Governance Maturity
Date: Tuesday, March 22, 2011 @ 2:30pm ET
Classification and Metadata are fundamental in any information-related initiative. They are the connection between abstract data (business processes, rules, and concepts) and the people who use it to meet their goals and objectives. The Metadata category reflects both the capabilities and tools surrounding metadata and the ease with which information can be used and understood.

Data Architecture: Tim Enten, Wells/Wachovia

Self-Assessing Information Governance Maturity
Date: Tuesday, March 22, 2011 @ 4:00pm ET
Data Architecture is the foundation of any solid data governance program. As an organization progresses through Data Architecture, a number of specific goals and benefits will be accomplished on an enterprise-wide basis:
1. Alignment between business and technology
2. Adherence to architecture standards
3. Data is viewed and leveraged as a common asset across the enterprise

Data Quality: Bill Haase, Logic Trends

Self-Assessing Information Governance Maturity
Date: Wednesday, March 23, 2011 @ 9:00am ET
As businesses move toward data-driven decision-making, Data Quality becomes paramount to the ability to succeed. The improvements outlined in the transitional layers between each stage bring additional value creators to the core business. As an organization matures through the levels, the following objectives become attainable:
* Enterprise measurement of the quality, classification, and value of the data being provided
* Data quality metrics become core to business processes
* Data issues are easily addressed via strong lineage and stewardship
* Data quality becomes integrated into the organization's culture, business, and technology processes.

I've been watching this Madoff scandal unfold with incredulous amusement. Ponzi schemes aren't new but the sheer size and scale of this one I think perfectly epitomizes the gap between financial innovation and regulatory oversight capacity.

Mr. Madoff was certainly a financial innovator. He figured out a way to get thousands of people to part with their money on the promise of consistently good returns - without ever executing any investment transactions.

Am I the only one wondering how it is possible that an investment fund can take investor dollars, send out an investment prospectus, never really execute any trades, and yet no regulatory authority connects the dots between investment claims and results?

I think the answer is that there are many regulatory authorities that have the mandates to monitor these gaps but that none of them have the information infrastructure necessary to connect corporate actions, financial disclosures, regulatory filings, and exchange transactions.
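As a sketch of what "connecting the dots" could mean in practice, the fragment below reconciles the returns a fund reports against the returns implied by its recorded exchange transactions, and flags funds where the two don't line up. The data structures, figures, and tolerance are hypothetical illustrations, not any regulator's actual system:

```python
# Hypothetical reconciliation of reported fund returns against recorded trades.
# Fund names, figures, and the tolerance are illustrative assumptions only.

def implied_return(trades):
    """Net gain from recorded trades as a fraction of capital deployed."""
    invested = sum(t["buy"] for t in trades)
    proceeds = sum(t["sell"] for t in trades)
    return (proceeds - invested) / invested if invested else 0.0

def flag_discrepancies(reported_returns, trade_records, tolerance=0.02):
    """Flag funds whose reported return cannot be traced to actual trades."""
    flags = []
    for fund, reported in reported_returns.items():
        implied = implied_return(trade_records.get(fund, []))
        if abs(reported - implied) > tolerance:
            flags.append((fund, reported, implied))
    return flags

reported_returns = {"Fund A": 0.12, "Fund B": 0.07}       # from prospectuses and filings
trade_records = {
    "Fund A": [],                                         # no trades on record
    "Fund B": [{"buy": 100.0, "sell": 106.5}],            # from exchange data
}
for fund, reported, implied in flag_discrepancies(reported_returns, trade_records):
    print(f"{fund}: reported {reported:.1%}, implied by trades {implied:.1%}")
```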

While the Madoff scandal is stunning, one does wonder how many other smaller scandals have not been discovered, or could have been discovered before all investor money was lost. After Congress exults in a round of blame and recrimination, I do hope efforts are directed at enabling regulatory authorities to build a 21st Century Information-Driven Regulatory Infrastructure.

Data=Information=Knowledge. Or so we would like to say. In theory, data is unorganized information, and knowledge is information put to use by human beings. But theories are for academics. And this theory is super convenient if semantic consistency is important. There are Data Architects who only think about data in databases, Information and Content Architects who only work with unstructured repositories, and even Knowledge Architects who I suppose work with information taken out of human brains and put into... structured or unstructured repositories on computers...

In real life, in real companies, these are artificial distinctions. Organizations want to control data/information supply chains because they are full of quality control problems, security vulnerabilities, compliance challenges, and operational exposures. Those risks imperil decision-making, increase operational costs, and reduce revenue opportunities. Quality control and risk mitigation are challenges for every data type.

Five years ago, "Data Governance" seemed like a great name for a new discipline to help transform organizational behavior from vertical to horizontal, because information is transformational. What we meant then and mean now is not just about "Data" in the purest structured sense. We mean Data in the most plural and unlimited sense. People want to govern other people's use of all kinds of information in every form.

No data stovepipes please! We need Data Governance Solutions for all human uses of information regardless of their form or structure, use or abuse.

"Three Nordic central banks unveiled an unprecedented €1.5bn emergency funding package on Friday to support Iceland’s troubled currency and stabilise its banking system as the tiny north Atlantic nation tries to fend off the effects of the global credit crisis.

The plan allows Iceland’s central bank to acquire up to €500m ($775m, £400m) each from the central banks of Sweden, Denmark and Norway in the case of an emergency, the first time the region’s central banks have joined forces to help a troubled neighbour."

http://www.ft.com/cms/s/0/56a76dd4-2327-11dd-b214-000077b07658.html

This story illustrates the downstream impact of polluted data in the global economy. But of course, for the rest of us not living in Iceland, the global credit crunch has impacted our lives in other, indirect ways. Since September 2007, when the US Federal Reserve started cutting interest rates in a drastic program that shaved 3.25% off the Discount Rate in 7 months, the price of oil (valued in depreciated dollars) has increased nearly 60%, from $80 to $127 per barrel. Food costs have skyrocketed, and countries around the world are challenged to find credit for government bonds. Inflation, thanks to Subprime, is a growing threat to the world economy and to the lives of poor people living at the edge of subsistence.

But how is this related to Toxic Content and Data Governance, you ask?

Well, of course the public Subprime narrative states that Banks invested in fancy hybrid home loans extended to subprime borrowers and created inherent risk in the market that was compounded through exotic derivatives that no one understood. This is partially true, and many banks have since admitted that they had poor internal risk governance.

But there is another part of the story that doesn't attract as much publicity. In 2005, at the peak of the Housing Bubble in the US, Alan Greenspan went before Congress to declare that the US housing market was "frothing." At about the same time, US Regulators decided to relax underwriting guidelines on new mortgage applications for a key segment of the marketplace - self-employed individuals.

Self-employed individuals face a moral hazard when they apply for a mortgage. This hazard is well known in the residential mortgage marketplace. It occurs when a self-employed individual has to demonstrate their income to obtain a loan. People who are employed by big companies get direct deposit pay checks and have income tax statements which closely match their real income. Self-employed individuals don't get regular pay checks and have tax statements that, shall we say, may frequently differ from real income.

This is especially true for the segment of the population that is paid in cash. Producing documentation of "real" income for these people is a challenge that typically caused the loan underwriting process to take longer for self-employed individuals than employed. And in 2005, as housing prices peaked and interest rates rose .25% each month, mortgage volume started to decline and for some reason US regulators chose to remove income documentation standards for the self-employed. From that time forward, they only had to make an income declaration.

Case in point. I have a friend who is a mortgage broker. He had a customer who owns a Pizza Parlor and wanted to buy a house. This customer had a good credit score and was a prime buyer. His Loan-to-Value Ratio was good. As a self-employed individual he was paid in cash, and he declared his income to be $10K a month. But when my friend input the numbers into the super-fast online loan application it turned out that his debt ratio was too high. He had some car loans and credit card debt that put the ratio above 41%, and the loan could not get through. So my friend simply changed his declared income to $12k per month and the loan got approved.
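The arithmetic behind that 41% cutoff is simple. Here is a minimal sketch of the debt-to-income check, with payment figures that are illustrative assumptions of mine, not numbers from the actual application:

```python
# Debt-to-income (DTI) check like the 41% rule the automated underwriting
# system enforced. All payment figures below are illustrative assumptions.

def debt_to_income(monthly_debt_payments, monthly_income):
    return monthly_debt_payments / monthly_income

monthly_debts = 4300.0   # proposed mortgage + car loans + credit cards (assumed)
dti_limit = 0.41

for declared_income in (10_000.0, 12_000.0):
    dti = debt_to_income(monthly_debts, declared_income)
    verdict = "approved" if dti <= dti_limit else "rejected"
    print(f"declared ${declared_income:,.0f}/mo -> DTI {dti:.1%} -> {verdict}")
# declared $10,000/mo -> DTI 43.0% -> rejected
# declared $12,000/mo -> DTI 35.8% -> approved
```

With these assumed debts, the only number that changed between rejection and approval was the declared income, which nobody had to document.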

In 2007, what I described above was a compliant business process for a self-employed mortgage loan application. Income only had to be declared, not verified. In fact, by this time in the marketplace most banks had automated underwriting applications that turned out a rate quote in 40 seconds for conforming rate mortgages. But what was obviously dangerous about this process is that the Pizza Parlor owner made an income declaration without documentation. $10K might have been his best income in his best month in the year. $12K per month might have been his fantasy income. Maybe his real income is closer to $8500 a month.

But now he owns a home with an adjustable rate mortgage that he can barely afford at the current rate he's paying and certainly can't afford when the rate adjusts up.

This is a story that was repeated thousands of times in 2005-7, which is one reason why delinquency and foreclosure rates on those vintages of prime AND subprime loans are at 12-16%.

The fateful regulatory decision in 2005 to relax documentation standards in loan underwriting allowed vast amounts of Toxic Loan Content (poisonously polluted data) to enter the banking system through automated underwriting systems that got their business rules from the regulators. That created systemic risk that was entirely opaque to the MBS issuers, Rating Agencies, CDO issuers, and the marketplace. And unless credit risk is transparent to investors, the market can't price risk correctly and default is an inevitable outcome.

By 2005, banks were already aware of rising risk from documented subprime loans and were raising interest rates to collateralize their risk. They just weren't aware of the undocumented risks, which left their reserves deficient to cover their exposures.

But it didn't have to end this way. If regulators in 2005 had just left underwriting documentation regulations in place, or even strengthened them, the Housing market would have seen a soft-landing and the credit crisis would not have happened.

I see quite a few important lessons here for the future:

1. Regulations are not Holy Scripture. Automating Compliance in IT can just as easily automate exposure as it can value.

2. We must learn to measure data quality and validate before we trust it. Data can pollute our businesses, our societies, and our lives and we must invest in methods and technologies to certify its quality on a continual basis to enhance and protect the value of our businesses.

3. The marketplace needs new tools to measure and price business risk. Regulators should not measure risk in businesses and force process changes. This is a reactive and inefficient method.

4. Transparency creates its own rules. Businesses should be required to report and capitalize (self-insure) their risks regularly to the marketplace so that self-regulating market economics can arbitrate between good business stewardship and folly. That arbitration will be reflected in stock prices, which is far more efficient than regulatory sanctions.

We will need many new Data Governance Solutions to help banking institutions across the world adjust to the increased scrutiny the post-Subprime world will bring. But most of all, we will need international forums, like the Data Governance Council, to discuss these issues and bring different perspectives forward, because this crisis was eminently avoidable. And it is only through communication that we can develop more mature practices to prevent it from happening again.

Eight years ago, I was in La Hulpe, Belgium for a training seminar. La Hulpe is a leafy village in one of the posh suburbs of Brussels, near Waterloo, where Napoleon lost his final battle in 1815. Back then, IBM owned a Danish-designed training facility nestled in the woods. It had spartan hotel rooms - single beds in mid-sixties teak tucked against the walls, small night tables, and large windows facing mossy oaks and elms. There was a large auditorium, many good-sized classrooms, a decent cafeteria, a good bar where I first tasted Belgian ale, and wonderful trails through the woods. I remember this place as a part of IBM that sadly doesn't exist anymore, a collegial era when education took place face to face. So quaint in retrospect.

The training seminar spanned two weeks, and I had a weekend in-between. In 1995, when I first joined IBM, I spent my first six weeks of employment in La Hulpe, studying the Insurance Application Architecture - an object-oriented data model for insurance optimization. Over the next five years of my IBM career in EMEA (working in Copenhagen) I spent months in Brussels and La Hulpe and got to know it extremely well. So on this weekend, I decided on something new.

I rented a turbo-diesel Alfa Romeo 146 and took off on the motorway - direction Bruges. Ghent lies on the motorway between Brussels and Bruges, and I made a breakfast pitstop in that city on a sunny Sunday in 2002. Ghent is an ancient city like Bruges, with wonderful streets, canals, old buildings, and churches. It is perhaps less famous than Bruges but no less charming. My decision to stop there eight years ago would prove more important than I could have realized at the time. In many ways, one short experience in that city changed my world view in profound ways.

It was about 11am by the time I got to the city center and found parking. I had a camera in hand, and was prepared for sightseeing. My parking spot was on the outskirts of town and I had to walk far to reach the center. Modern balcony apartment buildings on tree lined streets with shops below gave way to older buildings with stucco walls and flower boxes. Along the way, I stumbled upon a darkened 15th Century Church. The walls were black, the flying buttresses grey, and the outer door was open.

As I stepped inside, I heard voices and I froze. It was Sunday. 11am. Mass was being celebrated. I speak German and Danish. I can understand some French and Dutch. But Flemish is beyond me. During the day, an empty church is a museum of art and architecture. It requires no involvement. You stroll through and admire ancient imagination.

At this moment, I thought about turning back. "I don't belong, I'm not part of this community, I don't speak the language, it's not my religion."

I went in anyway. When you travel you can either be an observer and stay in your bubble, or you can jump into other people's lives and be a participant - to try and understand the world through someone else's eyes, ears, and heart. You can't do that on the outside looking in. You need to be on the inside.

It was a massive space inside, and only a small group of 20 or so was seated in the first three pews near the priest. No one paid any attention to me sitting several rows behind. It was a Catholic service. I'm not Catholic. It was in Flemish. Mircea Eliade calls such moments historical archetypes - the temporary suspension of historical time - because each group repeats a ritual and in doing so connects back in time to other groups who repeated the same ritual. There was incense, standing and sitting, singing and praying, communion and kissing. And it was beautiful.

After 30 minutes or so, everyone got up, shook hands, and walked out into the bright light of a sunny spring Sunday. I told some Jewish friends about this experience later and they were shocked that I attended a Christian service. The history of Jews and Christians in this world is not a good one. Blood and recrimination, persecution and ignorance mark most of it. My own grandparents fled Christian pogroms against Jews in Russia that killed many of their relatives. But I told them what I saw in that church was transformative. Not that I felt Christian at that moment, but that for the first time I experienced what they experienced in a service, transforming the church from a museum into a community, a living replica of human experience that the paintings fail to adequately describe.

You can't describe the world in a painting. No reference compares with being there in the moment.

Some years later, I was in Istanbul speaking at a conference on Compliance and Data Governance. The venue was in Maslak, the business center of Istanbul, and after my speech I took the subway to Taksim and walked across the bridge to the old city. Constantinople it was called from 330 to 1453, when the Eastern Roman Empire ruled a vast swath of Africa, Arabia, and Southern Europe. They built the Hagia Sophia (Church of Holy Wisdom) in 537, and for centuries it was the largest free-standing dome in the world. Nearly 1,500 years later it still stands, but in 1453 the Ottomans converted it from a church into a mosque as the city and the empire changed rulers and religions.

There are five other magnificent mosques in Istanbul, and one can't possibly understand the city, the history, the Ottomans, or the Turks without visiting them. My first was the Blue Mosque, just across from Hagia Sophia. It is just as massive, and like all mosques it requires you to take your shoes off to enter. You sit on the floor in a mosque. There are no pews. The floor space is covered with rugs, and the walls and dome are adorned with mosaics. There are no paintings, statues, or physical representations. Electric lights hang 30 feet above the floor in chandeliers that must once have held candles.

Most tourists huddle together in the back, gaping at the space. But there are no restrictions on where you can sit, how you can sit, or what you can do - except that you shouldn't talk much. It is very quiet. Moslems pray five times a day, and I was there in the afternoon between prayers. There were a few people praying quietly, and the overall impression was one of private contemplation.

I visited five other mosques in Istanbul that week. Outside, in the streets, protesters were railing against the Islamic government of Recep Erdogan as he was installing his foreign minister as President. There were pro-democracy rallies and many anti-American protests. The Turkish Army was sitting on the sidelines contemplating a coup, and the courts were considering constitutional appeals.

The Mosques were silent inside. People prayed, life went on. And it was beautiful.

The next week, I flew to Israel to visit customers in Tel Aviv and talk with IBM Researchers in Haifa. After the meetings, I drove to Jerusalem. We parked in East Jerusalem, near the Damascus Gate. Our tour guide was Palestinian and we were Jews. We did the Via Dolorosa (walk of Jesus), prayed at the Wailing Wall, and walked on the Temple Mount near the Dome of the Rock.

At the Wailing Wall, I met a Lubavitcher from Brooklyn. We talked about NY, Judaism, and Jerusalem for a few minutes, and he asked me if I would like to pray at the wall. I said yes, and he led me to his trolley, where he laced me up with Tefillin on my head and forearm. Tefillin are small leather boxes holding miniature copies of verses from the Torah, with leather bands that tie them to your body. They put a Tallit around my shoulders and a Kippah on my head. I can no longer read ancient Hebrew, so the men from the Chabad gave me an English phonetic version of a few prayers, and together we walked to the Wall.

The Wailing Wall is the last remnant of the Second Temple complex of ancient Jerusalem, standing since the Romans destroyed the Temple in 70 AD during their sack and plunder of the city (from which they took enough gold to finance the construction of the Colosseum in Rome). The lowest blocks weigh 70 tons; they are massive. For centuries, people have written small notes to God and inserted them between the blocks.

I put my hand on the wall and said the prayers. There were many other people nearby doing the same thing, repeating a ritual, suspending historical time. And it was beautiful.

Above us, Moslems were praying at the Dome of the Rock, where Abraham was persuaded to spare his son Isaac some 4,000 years ago. And nearby, Christians were worshipping at the Church of the Holy Sepulchre, on the spot where Jesus was crucified 2,000 years ago. All so close.

These are three tribes with derivative roots. They share common ancestry and ideas. And each believes they are right and the others are wrong. They have fought and continue to fight savage wars against each other in the name of their beliefs.

Is there something that unifies them? Perhaps. Is there one way of living or doing things that is better than any other way for every human being and every human culture on the planet? I think not.

The best we can hope for is that people from different tribes, cultures, religions, and beliefs learn to appreciate what's beautiful about life regardless of how it is described.

The two most historically important developments of the last two decades are the growth of global markets and the speed of information technology development. Markets and IT are transforming the world at a faster rate than any other developments in human history. And they are also challenging Governance Models in ways that are equally profound. Kings and Crowds fought it out politically at the dawn of the 20th Century when the ancient Russian, Chinese, Austrian, and Ottoman empires fell, and they are battling commercially today in many markets in which Crowds are winning again.

Two examples:

1. Product Development

I'm an audiophile. I buy expensive audio equipment in the hope of reproducing in my home the emotional connection with music that people feel when they attend a live concert. Being on a limited budget, I'm also a cheap audiophile. I like the best product for the lowest cost, which is one reason I applaud globalization. Over the last decade, high-quality, low-cost audiophile equipment has been coming out of China that rivals the best high-cost gear manufactured in North America and Europe. Some companies have set up local design and Chinese manufacturing with online distribution that brings incredible bargains to mainstream US and European consumers. Two such companies are Oppo Digital and Emotiva.

Oppo makes DVD and Blu-ray players that are designed in San Francisco and manufactured in China. I've owned their products for several years and am always impressed with their price/performance ratio. But now I'm even more impressed with their product development process. In 2008, they announced the development of a new Blu-ray player, the BDP-83. These days, consumer electronics are more like computers than audio equipment, with complex digital signal processors, graphics chips, and CPUs interacting in intricate designs. Oppo knew product development would be difficult and testing even more so. With the complexity of hardware and software in one appliance, it is really difficult for a small team of product designers and marketing professionals in San Francisco to test against every potential usage scenario. And when manufacturing is outsourced to China, it is even harder. Distance, language, and culture create barriers that make communication a new challenge.

In this environment, Oppo decided to outsource product testing to its customers by using a Crowdsourcing solution. Several hundred customers received pre-production units of the Blu-ray player and tested it in their homes. Their product feedback went to the design team, who translated it into design changes for the manufacturer. The Crowd was given the option to vote on final product readiness. The first vote sent the product back for more changes and fixes in late 2008, and the second vote in Spring 2009 released it for GA in June.

I bought the product in July 2009 and it is superb. I contrast this to Emotiva, also a small design team, based in Tennessee and manufacturing in China. They make outstanding AV amplifiers, speakers, processors, and other equipment. In 2007, Emotiva announced a new AV processor, the UMC-1, for delivery in 2008. That slipped to early 2009, when it was announced that the product would ship in June. In July, the company announced it had discovered bugs in the production units from China and would need a couple of months to fix them. By October, more than a couple of months had gone by, and customers were fuming on the company's forums about the delays and the poor communication. In November, the company announced it would begin shipping to the pre-order list, and many customers anticipated units before Thanksgiving. By early December, no units had shipped and the company had to start censoring its forum because customer rants were getting abusive. The Emotiva CEO promised some customers would receive their units by Christmas, and when that didn't materialize many forum members started talking about buying alternatives.

Last week, Emotiva finally began shipping a handful of units to pre-order customers without manuals. The first reviews appeared over the weekend and talked about stunning video quality but also a few audio and connectivity glitches. The CEO posted a very nice note on the Forum describing the company's pride in the product but also that a firmware release would soon be forthcoming.

So what this company did was use its customers for an unannounced Beta Testing program. They shipped their product very late to market, after a year of inconsistent market communication, with bugs they were probably aware of but couldn't fix without suffering more brand damage.

Contrast the two companies. Oppo used a market based Crowdsourcing mechanism to recruit customers to beta test the new product. The customers who participated in the testing provided open feedback which was visible to all members of the company forum. They fixed bugs quickly and used customers to determine when the product was ready for shipment. That process created customer loyalty and ensured a bug-free product that shipped only six months late. Emotiva used a hierarchical mechanism of in-house testing and opaque customer communication to ship a product more than 18 months late and filled with bugs that alienated customers and reduced brand loyalty.

Some people might say these companies simply have different approaches to product development or customer service. I abstract these situations as examples of governance models in complex social systems. Oppo used a market-based governance (coordination and cooperation) model and succeeded in satisfying the needs and interests of its market participants. Needs and wants are at a primary market level; feedback information about the product is at a secondary market level. Emotiva used a hierarchical governance model (command and control) and failed to satisfy secondary market interests in information and primary market needs for products.

This doesn't mean that market mechanisms always trump hierarchical control. But when a small number of people are trying to govern complex systems for consistent outcomes, a market-based model can be more efficient and produce better results.

2. Cost Containment.

My boss sent me a note over the weekend reminding me to use our AT&T Calling Card from land lines when I am travelling abroad. It seems my cell phone bill in November was higher than the accounting police think necessary. All calls above $100 qualify for an immediate audit. It's not clear from my bill if any of my calls were or could be audited, but my boss, who is altogether a terrific guy, wants me to avoid that root canal and work smart abroad. Being a Governance Guy, I do have to question the intelligence of a governance system that controls costs through managerial oversight of cell phone bills and automatic audits for $100 calls.

If there is already a trigger for automatic audit at the $100 per call threshold then someone has already noticed a pattern of calls that exceed $100. That kind of pattern calls for a policy change, but automatic audits require a fair degree of manual labor - both from my boss and the auditors. Wouldn't it be far Smarter to develop policies that cause the cell phone users themselves to police their own usage by giving them alternative means to reduce costs?

Some might argue that the warning note from my boss is a policy tool being used to change my behavior. But because the billing system is deliberately opaque in IBM, it isn't possible for me to evaluate the impact of each of my calls on the overall phone bill I incur each month. I can't see the incremental impact of my decisions as Risks to The System as a whole.

A more intelligent approach to cost containment in this case would be to toss the issue out to the Crowd of cell phone users in IBM and get them to come up with ideas to mitigate costs for each user. That process would include users in the decision-making process, getting them to brainstorm ways to reduce costs instead of treating them like cost creators in a hierarchical model to impose control.

Like, shouldn't IBM have a Skype strategy for global travelers who make calls in cars and trains, so that productivity isn't imperiled while costs are contained?

In the past 18 months, I've written extensively on the need for risk measurement standards, but this IBM Blog interface doesn't make it that easy to find all the articles. To make that easier, I've collected the salient points and made a chronology of them to illustrate the pieces and how they fit together:

On September 18, 2007 the US Federal Reserve cut the Federal Funds rate by half a percent in response to the looming sub-prime loan scandal. The markets had lost confidence and Banks were holding debt they could not sell. Write offs ensued, and the market forecast looked questionable at best.

At the time, this rate cut was seen as a dramatic response to worsening market conditions and proof that the Fed would act aggressively to protect the economy from the housing bubble. In the next two months, the Fed intervened again to cut rates .25% in October and .25% again in December. Each rate cut was seen as a prudent response to market conditions.

In January 2008, just a few weeks after the last rate cut, the Fed had to intervene again with a very sudden 1.25% cumulative rate cut to stem an Asian-driven equity market sell-off following more sub-prime write-offs and loss disclosures. In just five months, the Federal Reserve had to intervene five times with a combined interest rate cut of 2.25%, following 17 consecutive quarter-point rate increases over the preceding two years.

This was an incredible see-saw of macro-economic policy - gradual rate increases were followed immediately by sudden rate cuts.

In hindsight, the half-point cut in September 2007 was not very dramatic in comparison to the 1.75% in cuts that followed in the next four months. No one then could have foreseen the volatility in the markets that was to come, or could they have?

Why is it that the US Federal Reserve rate policy was reactive to market volatility? Why didn't their monetary policy, which had run up rates from 1% in June 2003 to 5.25% in June 2006, anticipate the looming housing bubble and bank losses that would surely ensue? Hadn't Alan Greenspan warned of this outcome in 2005? Didn't we all know the housing joyride would end at some point?

Today, we can see banking and financial market data that shows the risk trends in our rear-view mirror. Unfortunately, no one has a mirror that forecasts the future, but they could if capitalized risk data were collected on a systemic basis by banks and shared with the Federal Reserve. The Federal Reserve does an excellent job of studying catastrophic risks and running sophisticated macroeconomic loss models on everything from terrorist attacks to coastal hurricanes. The Fed uses this catastrophic loss data to capitalize insurance loss reserves for the US economy - i.e., they print more money when very bad things happen.

The insurance reserves got tapped after 9/11 and Hurricane Katrina, when the Fed injected huge amounts of liquidity into the economy to stabilize markets and restore confidence. Of course, the timing of catastrophic events can't be forecast, but the monetary response can be estimated based on a variety of risk factors. The Fed constantly analyzes and war-games these risk factors, and the success of its liquidity and monetary responses to 9/11 and Katrina attests to the diligence of its planning and the value of risk-based forecasting models.

What does this have to do with the sub-prime loan meltdown you ask?

Well, if the Fed had non-catastrophic risk-data forecasting models they could possibly pre-empt loss events with macroeconomic policy tools that could even out some of the worst aspects of the business cycle. Unfortunately, that kind of non-catastrophic risk-data has to come from banks, who until recently were totally incapable of providing that kind of data, let alone using it themselves for their own risk-based policy-making.

That's changing. In the last two years banks around the world have been working to assess and collateralize market, credit, and operational risks as part of the Basel II compliance process. That data isn't normalized across banks, and there are wide disparities in how risks are assessed, calculated, and capitalized from bank to bank, country to country. But the raw data, and the beginnings of the knowhow are, for the first time in history, there. And that data and knowhow can be leveraged to provide new macroeconomic tools for Central Bank policymakers around the world.

What's needed are standards for risk assessment, classification, calculation, and the reporting of capitalized risk data from US banks to the Federal Reserve. This may take some years yet to accomplish, but the time is right to begin discussing these issues. As US banks reach Basel II compliance they will be in a position to leverage risk data for their own self-insurance against non-catastrophic losses, and if they were willing to share their capitalized risk data they could help the Federal Reserve to reduce market volatility and improve macroeconomic performance for everyone.
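To make the idea concrete, here is a minimal sketch of what a standardized capitalized-risk report from banks to a central bank might look like, and how such reports could be rolled up into a systemic view. The field names, risk categories, and figures are hypothetical illustrations, not an actual Basel II or Federal Reserve schema:

```python
# Hypothetical standardized risk report from a bank to the central bank.
# Field names, categories, and figures are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class RiskReport:
    bank: str
    period: str              # e.g. "2008-Q1"
    credit_risk: float       # capital held against credit risk, in $bn
    market_risk: float       # capital held against market risk, in $bn
    operational_risk: float  # capital held against operational risk, in $bn

    @property
    def total_capitalized_risk(self) -> float:
        return self.credit_risk + self.market_risk + self.operational_risk

def systemic_view(reports):
    """Aggregate capitalized risk across reporting banks, per period."""
    totals = {}
    for r in reports:
        totals[r.period] = totals.get(r.period, 0.0) + r.total_capitalized_risk
    return totals

reports = [
    RiskReport("Bank A", "2008-Q1", credit_risk=4.2, market_risk=1.1, operational_risk=0.6),
    RiskReport("Bank B", "2008-Q1", credit_risk=2.8, market_risk=0.9, operational_risk=0.4),
]
print(systemic_view(reports))  # combined capitalized risk per reporting period
```

The value is not in any one report but in the normalization: once every bank reports the same categories the same way, a regulator can watch the aggregate trend instead of reacting after losses appear.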

Here's a case where regulatory compliance really can improve business performance.

The IMF put out the Global Financial Stability Report last week and it contains a very accurate and sobering description of the systemic failures involved in the Subprime Financial Crisis. It has an institutional focus, and makes some solid observations and recommendations.

The entire report is worth a read, but the Executive Summary contains most of the key points if you just want the meat of the matter:

I will summarize the findings and recommendations that have Data Governance implications:

"The events of the past six months have demonstrated the fragility of the global financial system and raised fundamental questions about the effectiveness of the response by private and public sector institutions. While events are still unfolding, the April 2008 Global Financial Stability Report (GFSR) assesses the vulnerabilities that the system is facing and offers tentative conclusions and policy lessons.

Some key themes that emerge from this analysis include:

• There was a collective failure to appreciate the extent of leverage taken on by a wide range of institutions—banks, monoline insurers, government-sponsored entities, hedge funds—and the associated risks of a disorderly unwinding.

What follows are a number of short- and medium-term recommendations relevant to the current episode. Several other groups and fora—such as the Financial Stability Forum, the Joint Forum, the Basel Committee on Banking Supervision—are concurrently developing their own detailed standards and guidance, much of which is likely to address practical issues at a deeper level than the recommendations proposed below.

In the short term...

The immediate challenge is to reduce the duration and severity of the crisis. Actions that focus on reducing uncertainty and strengthening confidence in mature market financial systems should be the first priority. Some steps can be accomplished by the private sector without the need for formal regulation. Others, where the public-good nature of the problem precludes a purely private solution, will require official sector involvement.

Areas in which the private sector could usefully contribute are:

• Disclosure. Providing timely and consistent reporting of exposures and valuation methods to the public, particularly for structured credit products and other illiquid assets, will help alleviate uncertainties about regulated financial institutions’ positions.

• Overall risk management. Institutions could usefully disclose broad strategies that aim to correct the risk management failings that may have contributed to losses and liquidity difficulties. Governance structures and the integration of the management of different types of risk across the institution need to be improved. Counterparty risk management has also resurfaced as an issue to address. A re-examination of the progress made over the last decade and gaps that are still present (perhaps inadequate information or risk management structures) will need to be closed.

• Consistency of treatment. Along with auditors, supervisors can encourage transparency and ensure the consistency of approach for difficult-to-value securities so that accounting and valuation discrepancies across global financial institutions are minimized. Supervisors should be able to evaluate the robustness of the models used by regulated entities to value securities. Some latitude in the strict application of fair value accounting during stressful events may need to be more formally recognized.

• More intense supervision. Supervisors will need to better assess capital adequacy related to risks that may not be covered in Pillar 1 of the Basel II framework. More attention could be paid to ensuring that banks have an appropriate risk management system (including for market and liquidity risks) and a strong internal governance structure. When supervisors are not satisfied that risk is being appropriately managed or that adequate contingency plans are in place, they should be able to insist on greater capital and liquidity buffers.

In the medium term...

More fundamental changes are needed over the medium term. Policymakers should avoid a “rush to regulate,” especially in ways that unduly stifle innovation or that could exacerbate the effects of the current credit squeeze. Moreover, the Basel II capital accord, if implemented rigorously, already provides scope for improvements in the banking area. Nonetheless, there are areas that need further scrutiny, especially as regards structured products and treatment of off-balance-sheet entities, and thus further adjustments to frameworks are needed.

The private sector could usefully move in the following directions:

• Standardization of some components of structured finance products. This could help increase market participants’ understanding of risks, facilitate the development of a secondary market with more liquidity, and help the comparability of valuation. Standardization could also facilitate the development of a clearinghouse that would mutualize counterparty risks associated with these types of over-the-counter products.

• Transparency at origination and subsequently. Investors will be better able to assess the risk of securitized products if they receive more timely, comprehensible, and adequate information about the underlying assets and the sensitivity of valuation to various assumptions.

• Reform of rating systems. A differentiated rating scale for structured credit products was recommended in the April 2006 GFSR. Also, additional information on the vulnerability of structured credit products to downgrades would need to accompany the new scale for it to be meaningful. This step may require a reassessment of the regulatory and supervisory treatment of rated securities.

• Transparency and disclosure. Originators should disclose to their investors relevant aggregate information on key risks in off-balance-sheet entities on a timely and regular basis. These should include the reliance by institutions on credit risk mitigation instruments such as insurance, and the degree to which the risks reside with the sponsor, particularly in cases of distress. More generally, convergence of disclosure practices (e.g., timing and content) internationally should be considered by standard setters and regulators.

• Tighten oversight of mortgage originators. In the United States, broadening 2006 and 2007 bank guidance notes on good lending practices to cover nonbank mortgage originators should be considered. The efficiency of coordination across banking regulators would also be enhanced if the fragmentation across the various regulatory bodies were addressed. Consideration could be given to devising mechanisms that would leave originators with a financial stake in the loans they originate."

******************************************************

New standards and banking practices will clearly be needed moving forward. But we already have most of the regulations we need to mitigate most risks identified in the report. Indeed, one of the great ironies of the crisis is how little Banks used their own fraud and risk management systems to catch underwriting errors and omissions in Loan Origination applications, House Assessments, risk capitalization, etc.

I suspect that the IMF's warning on regulation will not be heeded in Washington, though I do hope regulators will listen to the seasoned advice of some Data Governance veterans because this is a crisis with so many Data Governance challenges.

May 17, 2008: Nordic banks step in to back Iceland

The Financial Times reported today

"Three Nordic central banks unveiled an unprecedented €1.5bn emergency funding package on Friday to support Iceland’s troubled currency and stabilise its banking system as the tiny north Atlantic nation tries to fend off the effects of the global credit crisis.

The plan allows Iceland’s central bank to acquire up to €500m ($775m, £400m) each from the central banks of Sweden, Denmark and Norway in the case of an emergency, the first time the region’s central banks have joined forces to help a troubled neighbour."

This story illustrates the downstream impact of polluted data in the global economy. But of course, for the rest of us not living in Iceland the global credit crunch has impacted our lives in other indirect ways.

Since September 2007, when the US Federal Reserve started cutting interest rates in a drastic program that shaved 3.25% of the Discount Rate in 7 months, the price of oil (valued in depreciated dollars) has increased 50% from $80 to $127 per barrel. Food costs have skyrocketed, and countries around the world are challenged to find credit for government bonds. Inflation, thanks to Subprime, is a growing threat to the world economy and to the lives of poor people living at the edge of subsistence.

But how is this related to Toxic Content and Data Governance you ask?

Well, of course the public Subprime narrative states that Banks invested in fancy hybrid home loans extended to subprime borrowers and created inherent risk in the market that was compounded through exotic derivatives that no one understood. This is partially true, and many banks have since admitted that they had poor internal risk governance.

But there is another part of the story that doesn't attract as much publicity. In 2005, at the peak of the Housing Bubble in the US, Alan Greenspan went before Congress to declare that the US housing market was "frothing." At about the same time, US Regulators decided to relax underwriting guidelines on new mortgage applications for a key segment of the marketplace - self-employed individuals.

Self-employed individuals face a moral hazard when they apply for a mortgage.

This hazard is well known in the residential mortgage marketplace. It occurs when a self-employed individual has to demonstrate their income to obtain a loan. People who are employed by big companies get direct-deposit paychecks and have income tax statements that closely match their real income. Self-employed individuals don't get regular paychecks and have tax statements that, shall we say, may frequently differ from real income.

This is especially true for the segment of the population that is paid in cash. Producing documentation of "real" income for these people is a challenge, and it typically made the loan underwriting process take longer for self-employed individuals than for employed ones.

And in 2005, as housing prices peaked and interest rates rose .25% each month, mortgage volume started to decline, and for some reason US regulators chose to remove income documentation standards for the self-employed. From that time forward, self-employed applicants only had to make an income declaration.

Case in point. I have a friend who is a mortgage broker. He had a customer who owns a Pizza Parlor and wanted to buy a house. This customer had a good credit score and was a prime buyer. His Loan-to-Value Ratio was good. As a self-employed individual he was paid in cash, and he declared his income to be $10K a month.

But when my friend input the numbers into the super-fast online loan application, it turned out that his debt ratio was too high. He had some car loans and credit card debt that put the ratio above 41%, and the loan could not go through. So my friend simply changed his declared income to $12K per month and the loan got approved.

In 2007, what I described above was a compliant business process for a self-employed mortgage loan application. Income only had to be declared, not verified.

In fact, by this time most banks had automated underwriting applications that turned out a rate quote in 40 seconds for conforming mortgages. But what was obviously dangerous about this process is that the Pizza Parlor owner made an income declaration without documentation. $10K might have been his best income in his best month of the year. $12K per month might have been his fantasy income. Maybe his real income was closer to $8,500 a month.
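To make the arithmetic concrete, here is a minimal sketch of the kind of debt-to-income check an automated underwriting system might run. The 41% ceiling and the income figures come from the story above; the function names, the debt figure, and the rest are hypothetical illustrations, not a description of any real lender's system.

```python
# Hypothetical sketch of an automated debt-to-income (DTI) check.
# The 41% ceiling and the income figures come from the anecdote above;
# everything else (names, debt figure, structure) is illustrative only.

def dti_ratio(monthly_debt_payments: float, declared_monthly_income: float) -> float:
    """Debt-to-income ratio as a fraction of declared (not verified) income."""
    return monthly_debt_payments / declared_monthly_income

def approve(monthly_debt_payments: float, declared_monthly_income: float,
            max_dti: float = 0.41) -> bool:
    """Approve when the DTI stays at or below the configured ceiling."""
    return dti_ratio(monthly_debt_payments, declared_monthly_income) <= max_dti

# Illustrative total of car loans, credit cards, and the proposed mortgage payment
monthly_debts = 4_300

print(approve(monthly_debts, 10_000))  # False -> 43% DTI, application rejected
print(approve(monthly_debts, 12_000))  # True  -> ~36% DTI, application approved
```

The point is not the particular threshold; it is that the one input the system could not verify, declared income, is also the input that flips the decision.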

But now he owns a home with an adjustable rate mortgage that he can barely afford at the current rate he's paying and certainly can't afford when the rate adjusts up.

This is a story that was repeated thousands of times in 2005-7, which is one reason why delinquency and foreclosure rates on those vintages of prime AND subprime loans are at 12-16%.

The fateful regulatory decision in 2005 to relax documentation standards in loan underwriting allowed vast amounts of Toxic Loan Content (poisonously polluted data) to enter the banking system through automated underwriting systems that got their business rules from the regulators. That created systemic risk that was entirely opaque to the MBS issuers, Rating Agencies, CDO issuers, and the marketplace. And unless the credit risk is transparent to investors, the market can't price risk correctly and default is an inevitable outcome.

By 2005, banks were already aware of rising risk from documented subprime loans and were raising interest rates to collateralize their risk. They just weren't aware of the undocumented risks, which left their reserves deficient to cover their exposures.

But it didn't have to end this way. If regulators in 2005 had simply left underwriting documentation regulations in place, or even strengthened them, the Housing market would have seen a soft landing and the credit crisis would not have happened.

I see quite a few important lessons here for the future:

1. Regulations are not Holy Scripture. Automating Compliance in IT can just as easily automate exposure as it can value.

2. We must learn to measure data quality and validate it before we trust it. Data can pollute our businesses, our societies, and our lives, and we must invest in methods and technologies to certify its quality on a continual basis to enhance and protect the value of our businesses (a minimal sketch of such a validation check follows this list).

3. The marketplace needs new tools to measure and price business risk. Regulators should not measure risk in businesses and force process changes. This is a reactive and inefficient method.

4. Transparency creates its own rules. Businesses should be required to report and capitalize (self-insure) their risks regularly to the marketplace so that self-regulating market economics can arbitrate between good business stewardship and folly. That arbitration will be reflected in stock prices, which is far more efficient than regulatory sanctions.
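As a rough illustration of lesson 2, below is a minimal sketch of the kind of data quality validation that could be run on loan application records before anyone downstream trusts them. The field names, thresholds, and sample records are hypothetical, invented only to show the shape of the check.

```python
# Minimal sketch of data quality validation on loan application records.
# Field names, thresholds, and sample records are hypothetical.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LoanApplication:
    applicant_id: str
    declared_monthly_income: float
    documented_monthly_income: Optional[float]  # None when no documentation exists
    monthly_debt_payments: float

def validate(app: LoanApplication) -> List[str]:
    """Return a list of data quality issues; an empty list means the record passes."""
    issues = []
    if app.documented_monthly_income is None:
        issues.append("income declared but not documented")
    elif app.declared_monthly_income > 1.1 * app.documented_monthly_income:
        issues.append("declared income exceeds documentation by more than 10%")
    if app.declared_monthly_income <= 0:
        issues.append("non-positive declared income")
    return issues

applications = [
    LoanApplication("A-001", 12_000, None, 4_300),
    LoanApplication("A-002", 6_000, 5_900, 1_800),
]

for app in applications:
    print(app.applicant_id, validate(app) or "ok")
```

The rules themselves are trivial; the discipline of running them continuously, and refusing to pass unvalidated records downstream, is the part that was missing.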

We will need many new Data Governance Solutions to help banking institutions across the world adjust to the increased scrutiny the post-Subprime world will bring. But most of all, we will need international forums, like the Data Governance Council, to discuss these issues and bring different perspectives forward, because this crisis was eminently avoidable. And it is only through communication that we can develop more mature practices to prevent it from happening again.

The US Federal Reserve announced new mortgage lending standards today that are designed to address so-called deceptive business practices among lenders.

Those measures include:

• Bar lenders from making loans without proof of a borrower's income.

• Require lenders to make sure risky borrowers set aside money to pay for taxes and insurance.

• Restrict lenders from penalizing risky borrowers who pay loans off early. Such "prepayment" penalties are banned if the payment can change during the initial four years of the mortgage. In other cases, a penalty cannot be imposed in the first two years of the mortgage.

• Prohibit lenders from making a loan without considering a borrower's ability to repay a home loan from sources other than the home's value.

The borrower does not have to prove that the lender engaged in a "pattern or practice" for this to be deemed a violation. That marks a change, sought by consumer advocates, from the Fed's initial proposal and should make it easier for borrowers to lodge a complaint.

“Rates of mortgage delinquencies and foreclosures have been increasing rapidly lately, imposing large costs on borrowers, their communities and the national economy,” Mr. Ben Bernanke, the Federal Reserve Chairman, said.

“Although the high rate of delinquency has a number of causes, it seems clear that unfair or deceptive acts and practices by lenders resulted in the extension of many loans, particularly high-cost loans, that were inappropriate for or misled the borrower,” he added.

Excellent. Markets around the world can feel confident again that the US Federal Reserve has rooted out the major mortgage lending problems confronting the US Economy and has the entire situation under control.

NOT!

It is beyond shocking that deceptive lending practices like this even exist in the most "efficient mortgage market in the world" (according to a 2006 IMF Mortgage Market Survey). What's more shocking is that the Fed knew about these practices, had data attesting to their impact on rising rates of mortgage fraud going back to 2005, and did nothing about it until today.

And how do I know that, you ask?

Well, the Fed's own economists put out an insightful summary of what went wrong in the current credit crisis and you can read it here:

The first draft of this report was published in October 2007. And in perfect hindsight, the economists concluded,

"Were problems in the subprime mortgage market apparent before the actual crisis showed signs in 2007? Our answer is yes, at least by the end of 2005. Using the data available only at the end of 2005, we show that the monotonic degradation of the subprime market was already apparent. Loan quality had been worsening for five consecutive years at that point. Rapid appreciation in housing prices masked the deterioration in the subprime mortgage market and thus the true riskiness of subprime mortgage loans. When housing prices stopped climbing, the risk in the market became apparent."

Now the US Federal Reserve would not be the first organization in history to have better hindsight than foresight, but shouldn't we expect them at least to move faster with policy controls when the global credit market is facing the second worst crisis in history? And if it takes the Federal Reserve two years to study market data and write a telling report more than three months after the crisis had hit, and eight months more to digest it and issue lending guidelines to restrict fraud in the mortgage marketplace, how long exactly will it take them to react when Hank Paulson consolidates all financial regulation in their hands?

To me this story offers some important lessons that I do hope Congress recognizes:

1. Regulatory Consolidation is not a panacea. Consolidated bureaucracies do not historically produce operational efficiency. Witness the Department of Homeland Security and the performance of FEMA during Hurricane Katrina.

2. Data is useless without people empowered to act. The Fed had ample data to control mortgage lending fraud and prevent the worst aspects of the current credit crisis. Either it chose not to act, or its internal governance is so poor that there was no mechanism in place to forecast non-monetary economic risks and make micro-policy adjustments.

3. More important than regulatory consolidation, Congress should review the operational procedures and data governance practices at the Fed itself. A GAO Audit of Fed operational procedures and internal, below the Board, decision-making would be a great start!

4. The Subprime Credit Crisis was preventable! The Fed had the data and they had the economic skills to use it. They proved today that they even had the regulatory mandate to effect the necessary changes in lending guidelines.

Congress, the US Public, and the world at large have a right to know what took the Fed so long to use its own data before we entrust that organization with even more regulatory responsibility.

The stunning market downturn this week has revealed the breathtaking lack of Enterprise Risk Management in Banks, Brokers, and Washington.

Let me be clear; Risk Taking is not the same as Risk Management.

We've had 7 years of extraordinary risk taking in our mortgage markets, financial markets, tax policies, and war policies.

No one, in Banking or Government, can say with any confidence what their risk profile is, calculate probability of loss, or forecast future exposures.

This makes claims of Enterprise Risk Management one of the great myths of modern business. Anyone telling you they are doing it is either lying or a rare breed indeed.

I do hope that someone is recording the history of losses at the institutions failing and surviving every day, because the historical study of that record of loss - and new regulations mandating real Enterprise Risk Management and loss capitalization - is the only thing that will prevent those losses from happening again.

November 20, 2008: Three Things Basel Forgot

Today, The Basel Committee on Banking Supervision announced a new strategy to address shortcomings in its own global regulatory structure. The proposal creates new capital requirements, leverage ratios, and risk measurements designed to more carefully regulate banking practices across the globe.

The proposal includes the following elements:

• strengthening the risk capture of the Basel II framework (in particular for trading book and off-balance sheet exposures);

• enhancing the quality of Tier 1 capital;

• building additional shock absorbers into the capital framework that can be drawn upon during periods of stress and dampen procyclicality;

• evaluating the need to supplement risk-based measures with simple gross measures of exposure in both prudential and risk management frameworks to help contain leverage in the banking system;

• strengthening supervisory frameworks to assess funding liquidity at cross-border banks;

• leveraging Basel II to strengthen risk management and governance practices at banks;

• strengthening counterparty credit risk capital, risk management and disclosure at banks; and

• promoting globally coordinated supervisory follow-up exercises to ensure implementation of supervisory and industry sound principles.

Strengthening liquidity and solvency requirements seems like a regretful afterthought during a time of historically low liquidity and high insolvency, but better late than never. One does wish the Basel Committee had applied these measures as forethoughts rather than afterthoughts, but that's human nature.

There are three elements missing that I hope to see emerge in 2009:

1. A Global Loss History DB of anonymous credit, market, and operational incidents, events, and losses from every Basel-conforming institution. Individual institutions do not have enough loss history of their own to trend their past exposures and "claims" and forecast from them. Industry and geographic loss information is needed to better inform decision-making at banking institutions. 3rd party loss data is available to every insurance company for all lines of business. Only the banking community could conceive of risk measurement programs without 3rd party institutional validation.

The Operational Risk Exchange (ORX) has been aggregating operational risk loss data from the 41 banks that participate in that consortium for three years. That model is valid, but the sample size is too small, even for ORX. I hope the Basel Committee sees ORX as a valid archetype that should be replicated worldwide, with each Central Bank collecting anonymized loss data from its member institutions and sharing that data worldwide, so that all financial institutions can compare their own loss trends to global trends and forecast future exposures more accurately.
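As a sketch of what that shared, anonymized loss history might look like at its simplest, the fragment below pools loss events from several fictional institutions and aggregates them by risk category, so a member bank could compare its own trend to the pooled one. The schema, categories, and figures are entirely made up for illustration.

```python
# Hypothetical sketch of pooling anonymized loss events across institutions,
# in the spirit of the consortium model described above.
# Schema, categories, and figures are illustrative only.

from collections import defaultdict
from statistics import mean

# Each record: (anonymized institution id, risk category, loss in EUR millions)
loss_events = [
    ("bank-017", "operational", 2.4),
    ("bank-017", "credit", 11.0),
    ("bank-042", "operational", 0.7),
    ("bank-042", "credit", 35.5),
    ("bank-103", "market", 8.2),
]

def aggregate_by_category(events):
    """Pool losses per risk category so a member bank can compare its own
    loss trend to the anonymized industry-wide trend."""
    buckets = defaultdict(list)
    for _institution, category, amount in events:
        buckets[category].append(amount)
    return {
        category: {"count": len(amounts), "total": sum(amounts), "mean": mean(amounts)}
        for category, amounts in buckets.items()
    }

print(aggregate_by_category(loss_events))
```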

2. An XBRL Taxonomy for Risk Reporting. Banks can't report loss events without a global taxonomy that lets everyone agree on what to call things and what those things mean when they are reported. Even within banks, the word Risk means different things to different people. For business people, Risk is an omnipresent feature of life, an attribute used to calculate potential returns or losses on investments. Many business careers are made by taking risks. For an IT person, Risk is something to be avoided at all costs, the result of flaws in architecture that lead to vulnerabilities and loss. Many IT careers are lost by taking risks.

Business and IT can sit at the same table and have exhaustive conversations about Risk, each thinking they understand the other, and walk away having fundamentally different idiomatic understandings of what was discussed.

That misunderstanding is often a source of new risk.

XBRL (Extensible Business Reporting Language) is an XML language for describing business terms, and the relationship of terms, in a report. It enables semantic clarity of terminology, and that clarity is absolutely essential for the accurate recording and reporting of credit, market, and operational incidents, loss events, and losses.

A Risk Taxonomy is like an alphabet - the letters alone convey no meaning, but they are the foundational elements that allow humans to understand each other. We desperately need a new alphabet to describe Risk - incidents, events, losses, claims, exposures, forecasts, reserves - so that firms everywhere can aggregate loss information, analyze it with standard actuarial methods, compare past exposures to present conditions and opportunities, and forecast potential outcomes to illuminate options.
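As a toy illustration of what a shared vocabulary buys, the sketch below tags a single loss event with XBRL-style element names so that any recipient parses the same semantics. The namespace and element names are invented for this example; the real vocabulary is exactly what a standards process would have to define.

```python
# Toy sketch: tagging one loss event with XBRL-style element names.
# The namespace and element names are invented; a real taxonomy would be
# produced by a standards body, not hard-coded like this.

import xml.etree.ElementTree as ET

NS = "http://example.org/risk-taxonomy/2009"  # hypothetical namespace
ET.register_namespace("risk", NS)

event = ET.Element(f"{{{NS}}}lossEvent")
ET.SubElement(event, f"{{{NS}}}incidentType").text = "unauthorized trading"
ET.SubElement(event, f"{{{NS}}}grossLoss", unit="EUR").text = "1250000"
ET.SubElement(event, f"{{{NS}}}recovery", unit="EUR").text = "300000"
ET.SubElement(event, f"{{{NS}}}exposureClass").text = "operational"

print(ET.tostring(event, encoding="unicode"))
```

The XML itself is trivial; the value is that "grossLoss" and "exposureClass" would mean exactly the same thing at every reporting institution.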

A year ago, I wrote on this page about the need for new macro-economic tools to enable Central Banks to measure aggregate risk taking in the financial world. An XBRL Taxonomy of Risk is a fundamental building block to enable interoperability and standard practices in the measuring and reporting of risk.

Those standards in turn will enable Central Banks to manage vast databases of loss history and trend analysis that will inform policymakers and member banks to make better decisions that produce better returns. We will still need new information management software and governance models to make sure the right information gets to the right people at the right time, but none of that is possible without a standard alphabet and vocabulary to describe what's being recorded and read.

Recently, I announced an IBM Data Governance Council initiative to develop an XBRL Taxonomy for Risk. We are inviting all interested parties - banks, broker/dealers, hedge funds, consortia, think tanks, and regulators - to participate in this initiative. We will be working closely with XBRL International and XBRL.US to share ideas in an open and transparent process to bring forward a standards proposal quickly. If you are interested in participating, please drop me a line.

3. Let's bring back Glass-Steagall. Gee, what a great idea. No leverage ratios needed, because investment banks couldn't leverage with bank deposits at all. Banks, Brokerages, Hedge Funds, and Insurance Companies all need to have their activities segregated. It isn't enough to insist on new solvency, liquidity, and risk measures. We need to separate temptation from action. And when all three of these things are done - new solvency requirements to shore up assets on the balance sheet, risk taxonomies and loss history data to forecast future exposures, and Glass-Steagall V2 - we'll have risk tied up in a knot... until it's not.

Does the European Union "promise to be true in good times and in bad, in sickness and in health?" Will the Union survive the current Debt Crisis and become more integrated, or will it break apart under the pressure and allow insolvent states to exit the common currency?

Can the United States maintain its high standard of living and reduce its debt burden at the same time?

You may read these questions in the press every day and never believe they have anything to do with Data Governance, but they very much do. Governments make tactical decisions every day to increase debt amounts by small fractions, thinking that their incremental spending is nothing in comparison to what others have done in the past - failing to see the correlations between current consumption and long-term systemic instability.

With 7 billion people on planet Earth, our societies have become so complex that it is impossible, with past methods of governance, to foresee how policies impact even the smallest ecosystems. So we rely on blunt cause-and-effect relationships to over-simplify our options and fit our ideas into media soundbites. And the result is non-correlated policies that are anything but smart or predictive.

We seek to change this. We know that without new tools and techniques to see beyond the next effect, every cause will yield policies that fail. We are the IBM Data Governance Council, and we see that Data is the raw material of the Information Age and that effective Governance relies on conceptual thinking, integrated approaches, correlated analysis, and a relentless search for truth.

We call this Predictive Governance, and this meeting will explore what it means, how it works, and how we as a Community can create predictive models that:

1. See the relationships between Data Quality, Security & Privacy, Data Architecture, ILM, Metadata, Audit and Reporting, Stewardship, Policy, Organizational Awareness, and Business Outcomes - the Forest and the Trees in our Information Ecosystems.

2. Model and simulate how new integrated policies, people, and technologies can govern in these complex Ecosystems.

3. Understand and articulate these relationships to laymen who only see the problems at hand and have no patience for larger integrated discussions.

Please join us for this important two-day event. Participation is open only to members of the IBM Data Governance Council. Organizations wishing to join the Council may sign up for this event and execute a Council Agreement in New York at the meeting.