A node between the physical and digital.
The rants and raves of Simon Wardley.
Industry and technology mapper, business strategist, destroyer of undeserved value. "I like ducks, they're fowl but not through choice"

Friday, February 29, 2008

Last year, Google invested in 23andMe, a company that specialises in genetic sequencing. Apparently, the company intends to create a genetic database that people can search for both personal and scientific reasons. As they say on their site: "unlock the secrets of your own DNA, today!"

This year, Google has unveiled its personal health records service. According to Nick Carr, future partners will include a "slew of hospitals and care providers, medical testing companies, pharmacy chains, and health insurers".

For some reason, I always get nervous whenever "the secrets of my DNA", "search" and "health insurers" are mentioned in close proximity to each other.

As an aside, the 15th Amendment states: "The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State on account of race, color, or previous condition of servitude".

According to the Guardian, Andrew Keen's view of the blogosphere is that "these 'monkeys' are not producing Shakespeare, they're deluging us with 'everything from uninformed political commentary .... to unreadable poems, reviews, essays, and novels'"

Every now and then, I come across a post that makes me agree with him.

[Update: Unfortunately, I've found another such post today, full of horrendous ad hominem attacks. Normally, I'm quite opposed to Keen but with these two bloggers he's got a point.]

Last night on Question Time, the fracas about the speaker of the house, Michael Martin, was raised.

Michael has been claiming expenses (apparently within the letter of the rules) that are not quite within the spirit of them. It reminds me of the old tax quote:-

"the difference between tax efficiency and tax avoidance is normally a matter of time".

However, there is an accusation that most of the arguments against the speaker are based upon snobbery and class hatred. According to the New Statesman, "Quentin Letts of the Daily Mail and Simon Hoggart of the Guardian" are the main offenders.

Well, this may or may not be the case; however, one of the panelists on Question Time thought that this sorry saga could end up damaging the reputation of the Daily Mail. I'm surprised; I didn't think it had one. I've always thought the Daily Mail was the perfect counter-argument to Keen's horror story that the cult of the amateur is destroying our culture. You want to keep the pre-internet culture and its elitism? You want to keep the Mail?

Far from being an equitable society, according to the Sutton Trust, the UK is "very low on the international rankings of social mobility when compared with other advanced nations". It could even be declining.

As Peter Lampl, Chairman of the Sutton Trust, said :

"It is appalling that young people’s life chances are still so tied to the fortunes of their parents, and that this situation has not improved over the last three decades."

If you're born rich, congratulations. If you're born poor, get used to it. Whilst ability is evenly distributed in our society, much of that ability is squandered by the uneven distribution of opportunity. This is an appalling economic and human waste.

Fortunately, there will always be small glimmers of hope and this time it comes from tax reforms. With the non-domicile changes being suggested, some well-heeled individuals are threatening to leave the country. Their leaving should create some new but small opportunities for social mobility.

It's a minor but noble sacrifice those wealthy individuals make. We should cheer them "bon voyage" as they board their jets.

Thursday, February 28, 2008

Over 3 million children live in poverty in the UK. The government plans to cut this by 50% by 2010; however, it needs an extra £3.4bn to do so.

Well, on the 3rd of March, Tesco runs its annual "Computers for Schools" advertising campaign. Through the scheme, which is now in its 17th year, you can donate a £700 computer to your school if you spend £379,000 at Tesco.
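A back-of-the-envelope sketch of what that offer is actually worth, using only the figures quoted above:

```python
# Rough value of the "Computers for Schools" offer, using the figures
# quoted in this post: a £700 computer for every £379,000 spent.
computer_value = 700       # pounds
required_spend = 379_000   # pounds

rebate = computer_value / required_spend
print(f"Effective rebate: {rebate:.3%}")  # roughly 0.185%
```

In other words, well under a fifth of a percent of the spend finds its way back to the school.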

I have a really radical idea for them. Since Tesco likes to play "our part in local communities", why don't they instead pay the £1 billion in tax which allegedly they are trying to reduce through the use of tax efficient off-shore investment vehicles?

Tuesday, February 26, 2008

Thank you for your previously expressed interest in {name removed} beta testing which has now begun. We would greatly appreciate your participation and subsequent feedback. Please click on the following link to create your account and begin your involvement: http://{url removed}

If you did not previously request to be involved, please do not click the above link but instead contact us immediately at {email removed}.

Thanks
The {name removed} Team

My immediate thought was Social Engineering Attack! Except, of course, I knew I had expressed an interest. I still had to check the raw source and headers just in case.

I was suspicious because of the text that I've highlighted. Why would you ask someone who has not contacted you, to contact you? Doesn't that say something awful about the quality of your service? Are you just randomly spamming people as well?

If you don't mean to contact me, please don't. Certainly don't contact me and then ask me, that if I didn't want you to contact me, to then contact you to say so.

As I said before, I don't want to be pushed or pulled, I want to draw.

Monday, February 25, 2008

Rather than discussing a topic, I thought I'd just choose some random quotes which mean something to me.

"I would define the Valley of Death as when the amount of money you’re starting to ask for—the bill—starts to add up to the point where management says, ‘What are you guys up to, what are you doing, and what am I going to get out of it?’ But yet it is sufficiently early in the process that you don’t feel you can answer that question. If you are fortunate enough that the questions come when you have an answer, you, in fact, have scooted over the Valley. If not, you are squarely in that Valley."

Gerald Adolph, Booz Allen Hamilton.

"Innovations always threaten someone in power"

Scott Berkun, The Myths of Innovation, 2007

“The ability to make the kinds of generalisations and predictions that are typically associated with science and models is consistently being undermined by the phenomenon of complexity”

Downs and Mohr, Towards a Theory of Innovation, 1979

"paradox is at the heart of innovation. The pressing need for survival in the short term requires efficient exploitation of current competencies and requires 'coherence, coordination and stability'; whereas exploration / innovation requires the discovery and development of new competencies and this requires the loosening and replacement of these erstwhile virtues"

RE: "assumption that innovation == business"

From the work of Joseph Schumpeter, innovation is the "future source" of long term profits for any organisation. Innovation is not necessary for a business today, merely for a business tomorrow. This is wrapped up in the concepts of creative destruction. The assumption should be that "innovation == future business".

RE: "open source does not innovate"

Innovation is the first attempt to put an idea into practice (Prof. Jan Fagerberg). The licensing arrangements are hence irrelevant. However, open source does promote distribution of an innovation by reducing any barriers to adoption. This distribution results in further commoditisation (the movement towards more of a commodity) and hence the "commoditization trend" that you highlight. I provided a link to my talk at Web 2.0 which covers this, just in case it is of interest.

Open source therefore promotes commoditisation. However, new stuff (aka "creative destruction") is built upon existing commoditised services. Without commodity-like power (electricity), data processing (silicon chip) and communications (internet) there would be no Google. Hence open source promotes further innovation by accelerating the innovation process in much the same way that the internet, network effects and standards have.

The other effect that open source has, is to do with leveling the inequality between the distribution of ability and the distribution of opportunity. Rich companies and countries don't have a natural monopoly on ability, they just have more opportunity. So you can say "open source promotes innovation".

RE: "commodity and innovation are inverse"

Yes, they are, but they are also part of the same overall process.

So overall :

"open source promotes innovation"

"innovation == future business"

"commodity and innovation are inverse and inter-dependent"

Ideally, the question should be: "Are there any profitable businesses which would not exist today if there was no concept of 'open source'?"

Which is more important: open source or open standards? According to Opera Software's CEO Jon S. von Tetzchner, it's open standards. A lot of people seem to agree:

"open standards ARE more important than the licensing of an individual piece of software. Who cares what license the software someone uses is, as long as its always possible to replace it, and freely compete, which a free and open standard ensures."

I disagree, and I'm starting to feel like a lone voice here; maybe I'm wrong? Well, let's go through my reasoning. I happen to agree with Bob Sutor that "the easy availability of software can accelerate the adoption of a standard" and that adoption is what makes something a standard or not. For me, the fastest way of achieving such adoption is through providing an operational means of implementing an open standard - which means open source. So in my view, if you want portability you will need BOTH open standards AND open source. BOTH are important.

Now there are many definitions of what an "open standard" is. I'll just note the OSI's comment that: "We don't try to define it ourselves, but we know that if you can't implement an Open Standard under an Open Source License, it's not open enough for us."

That's a curious statement. Are they concerned that some standards could be touted as open but actually contain design flaws that create an advantage for some company? Of course they are. There are lots of evil strategies for subverting standards, even open standards, to your advantage. But then this is what the debate is really all about ... advantage.

There are two main areas for competitive advantage in the software business - the product I sell or the service I provide. For example:

If I believe my competitive advantage is in the product, then the last thing I want is a level playing field in terms of technology. I'm certainly happy for consumers to be able to easily move to my product, but I want my product to have advantages over others. Open standards are desirable; open source is not.

If I believe my competitive advantage is in the service provided, then having a level playing field in terms of the technology offered benefits me. It enables me to differentiate my company against all others on service quality alone. I certainly want consumers to be able to move to my service, but I don't want a competitor to have some technological feature which I can't offer. Both open standards and open source are desirable.

Now a service industry requires a commonly used and well-defined activity. Many of the once novel and new software systems that we use are rapidly becoming well defined and common. Even standards for service delivery are more widespread (whether ITIL or ISO 20000) and we have an emerging Software as a Service industry. The biggest obstacles faced by this industry are adoption fears, which can be reduced by encouraging portability between service providers.

As this trend continues, we will move further along the path of service delivery for much of IT. Service will become the key competitive ground. For many IT sectors the differentiator will become "how" IT is delivered (the service) rather than "what" is delivered (the product). For those in the business of utility computing provision, operations will be a source of competitive advantage. For the rest of us, such competition will keep quality high and prices keen for what is a commodity-like, cost-of-doing-business activity.

Whilst such a change will benefit consumers, it is a complete nightmare for those who see their product as their source of competitive advantage. I'll use this reasoning to respond to the original comment that I included:

"open standards ARE more important than the [open source] licensing of an individual piece of software" - yes, to a manufacturer of a proprietary product.

"as long as its always possible to replace it, and freely compete, which a free and open standard ensures" - Open standards don't do that. You can use dependencies, secrets, meta data lock-in and limitations of function to subvert this.

"who cares what license the software someone uses" - I do, I'm a consumer of software services and I want a competitive service based industry.

In my view if you want portability you will need BOTH open standards AND open source. BOTH are important. What seems to be happening feels more like a marketing campaign to describe products as "open" to the enterprise and to undermine "open source".

Sunday, February 24, 2008

At Web 2.0 summit in 2006, I was concerned that a lack of portability between SaaS (software as a service) providers would prove a stumbling block to adoption. During 2007, from E-Tech to IGNITE to OSCON to FOWA to Web 2.0 Expo, I emphasised this point and argued that if you want wide-scale adoption of SaaS, you need a competitive utility computing market. Such a market requires an ecosystem of providers with portability between them.

This stuff is old hat, however, I thought I'd do one last impression of a stuck record just in case it was beneficial to someone new to the field.

First, I'm going to define some terms including software as a service. Then I'm going to go through the benefits of such services and the main reasons given for not adopting them. Lastly I'll explain why open standards are not enough and why we need competitive utility markets. Let's start with some definitions:

Software as a Service is "a software application delivery model where a software vendor develops a web-native software application and hosts and operates the application for use by its customers over the Internet". Such applications include CRM (such as Salesforce), development and deployment frameworks (such as Ning) and even operating environments (such as Amazon EC2).

SaaS can be built upon SaaS. For example, an application (such as ERP, or Enterprise Resource Planning) can be built upon a development framework (such as Bungee Labs), which can be built upon an operating environment (such as Amazon EC2 + S3). I personally find it useful to consider this as a stack of software, framework and hardware. However, due to the explosion of XaaS (X as a Service) terms, I now agree that the term software as a service is sufficient, as the stack is applicable whatever the delivery model is.

Utility computing is "the packaging of computing resources, such as computation and storage, as a metered service similar to a physical public utility. This system has the advantage of a low or no initial cost to acquire hardware; instead, computational resources are essentially rented". In the most narrow sense (ignoring the overall comparison to the utility industry), it is a billing and provisioning model.

Software as a Service can be provided on a utility computing basis. For example an operating environment can be delivered and operated online (software as a service) and charged on the basis of consumption of CPU, bandwidth and storage (utility). This is exactly what happens with Amazon EC2 & S3.
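As a concrete sketch of that billing model - the rates below are invented for illustration, not Amazon's actual EC2 or S3 prices:

```python
# Illustrative utility-computing bill: pay only for metered consumption
# of CPU, storage and bandwidth. All rates here are assumed examples.
RATE_PER_CPU_HOUR = 0.10      # per instance-hour (assumed)
RATE_PER_GB_MONTH = 0.15      # per GB of storage per month (assumed)
RATE_PER_GB_TRANSFER = 0.18   # per GB of bandwidth (assumed)

def monthly_bill(cpu_hours: float, gb_stored: float, gb_transferred: float) -> float:
    """No capex, no idle capacity: the bill simply tracks actual usage."""
    return (cpu_hours * RATE_PER_CPU_HOUR
            + gb_stored * RATE_PER_GB_MONTH
            + gb_transferred * RATE_PER_GB_TRANSFER)

# One small instance running all month, with modest storage and traffic.
print(round(monthly_bill(720, 50, 100), 2))
```

The point is not the numbers but the shape of the model: consumption in, bill out, with no provisioning decision made by the consumer.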

So now we have some basic terms, let's look at reasons why people should adopt these services.

Benefits of Software as a Service

Software as a Service works by providing a standard service to many consumers in a multi-tenanted architecture. The principal idea is that management, operation, monitoring, security, network, capex and other costs are shared more efficiently among many consumers. This allows for an overall reduction in the cost of service for any particular quality of service. As the service already exists, it allows for faster deployment and reduces the amount of commonly repeated and tedious tasks (aka "yak-shaving") normally involved. Finally, in order for this to be practical, the service must be well defined and commonly used. Hence the services offered are likely to cover cost of doing business (CODB) and non-strategic activities. The provision of services in these areas will also encourage further standardisation and hence result in increasing cost reductions.
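The multi-tenant cost-sharing argument can be sketched numerically - the figures below are illustrative assumptions, not real provider costs:

```python
# Fixed costs (management, monitoring, security, capex) are shared
# across tenants, so cost per consumer falls towards the marginal
# cost as tenants are added. All numbers are invented for illustration.
def cost_per_consumer(fixed_costs: float, marginal_cost: float, tenants: int) -> float:
    return fixed_costs / tenants + marginal_cost

for n in (1, 10, 100, 1000):
    print(n, cost_per_consumer(100_000, 50, n))
# falls from 100050.0 for a single tenant towards 150.0 at a thousand
```

This is the whole economic case in one line: the more consumers share the standard service, the closer each one's cost gets to the marginal cost of serving them.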

The overall benefits are:-

Reduction in cost and / or improvement in quality of service, due to economies of scale.

Faster deployment compared to self build models.

Reduction in Yak-Shaving.

Shifts non-strategic activities to a third party provider.

Encourages standardisation for CODB-like activities.

Benefits of Utility Computing

With utility computing, resources such as bandwidth, storage and computer operations are paid for on a per-use basis. There is no need for initial capex to acquire hardware, or the costs associated with setup. There is no need to over-provision capacity for demand spikes, as the provider balances supply and demand across many customers. As Chaki Ng stated: "it is more efficient to have multiple network services share a common infrastructure that can absorb failures and bursts in client demand than it is to have every service over provision resources to accommodate peak requirements".

There are also benefits for the consumer in terms of reducing the financial risk and complexity involved in any new business venture. Furthermore, there are none of the delays normally involved in acquiring and installing physical hardware, air conditioning, racking and power.

The overall benefits are therefore:-

Reduction in business risk in terms of capital outlay and planning.

Reduction in Yak-Shaving.

More efficient energy and resource usage.

Minimal delays in provisioning.

Overall, using Software as a Service on a utility computing basis should mean:-

Reduction in cost and / or improvement in quality of service, due to economies of scale.

Faster deployment.

Reduction in business risk in terms of capital outlay and planning.

Reduction in Yak-Shaving.

Shifts non-strategic activities to a third party provider.

Encourages standardisation for CODB-like activities.

More efficient energy and resource usage.

Why on earth would anyone say no? Well, these are the main reasons:-

Excuse No.1: We have concerns over availability.

"I can only access the service if the internet and the provider are available."
"I need access to the service when there is no internet connection."

I completely understand this concern when you are talking about desktop applications such as word processors, spreadsheets or maybe a presentation machine for a roadshow. However, most employees don't have the company's accounting system or CRM package on their desktop, and mail services are fairly pointless without the internet. A huge number of systems are remotely accessed from the desktop, and the real question here is whether your internal systems are more reliable than the internet and the provider. Well, the internet is almost certainly more reliable than your own corporate network, simply because there are more routable nodes. As for the provider, that brings us on to:-

Excuse No.2: We have concerns over the reliability of the vendor.

"We'd never use a vendor; if something goes wrong, my guys will fix it ... I trust them."

Whilst this argument has some validity in the short term, in the long term it is utter nonsense. If we ALL operated on the basis of this argument, then there would be no banks, no railways, no airlines, no supply chains, no power supply, no commodities and no change through commoditisation. Instead every company would be trying to do everything on the grounds that "our guys do it best". It's simple-minded, protectionist drivel. Reliability in the software as a service world will increase over time, especially as competitive markets form and as we see third-party assurance services and computer resource brokerages emerge.

Excuse No.3: We can't use a standard system.

"Our systems are tailored to us and our way of doing business."
"Our systems are a source of competitive advantage."

The majority of activities and processes that organisations undertake are common within their industry or the market as a whole. Few activities are a genuine innovation or a source of competitive advantage. For your average company, electronic book-keeping is not a source of competitive advantage, nor are health and safety forms, holiday request services, payroll payment systems, and the list goes on. Even where such a system is a competitive advantage, the provisioning of resources for such a system is a commonly repeated problem. If builders built houses in the same way that software engineers build systems, then every house would have its own power station, sewage works and brick factory. More often than not, upon investigation, "can't" turns out to be "won't". This brings me to one of my favourite excuses:-

Excuse No.4: It's not worth it.

"The amount of money we spend on IT is small compared to the value of the data it holds. It's just not worth us considering using a vendor; what if they lost our data or someone else got hold of it?"

Ask yourself: does your company use banks, or does it keep all its money in its own guarded safe? Even banks use banks. Any investigation shows that we've been using third-party providers in many industries, such as manufacturing, for a considerable length of time. Such outsourcing is not a new phenomenon. Which leaves the "it's not worth it". I find this alarming, as it is akin to saying "cost is not important in our business" or "why spend less when you can spend more". By not accepting the same or better quality for a commodity service with a provider at a lower cost, you're actively putting yourself at a cost disadvantage (no matter how small). As a shareholder, it annoys me when I hear management talk in such a manner. This leads me to the final reason.

Excuse No.5: We have concerns over lock-in.

"We'd be tied into a particular vendor."

This is a real and genuine concern. Without portability between vendors there is no competitive pressure to keep prices and quality keen, and there is a genuine issue of lock-in and vendor dependency.

Now for me, portability is the key for adoption of these services. Taking the example of banking, we have portability between banks as we are able to move our account and hence our money from one bank to another. When we transfer our balance, our money means the same thing at one bank as it does to another - a pound still means a pound. However our statements don't transfer, this additional "meta data" on activity stays with the original bank. This isn't really a problem as we use secondary systems, accounting packages, to collate such information.

Imagine that the accounting package was from a Software as a Service provider and we decided to move to another provider. If we wanted to move, then we'd need our data to move, and our data to be interpreted by the new provider in the same way as by the old. We don't want our data to change meaning when we move providers, in much the same way that I don't want my £100 to become $100 because I switched banks. However, since what we are moving here is data rather than currency, we'd want ALL our data, including any "meta data", to move.

In order to have such "true portability", and to avoid the necessity for further systems, we would need:-

A choice of providers of the same service.

Portability of all data (including any meta data) from one provider to another.

Interpretation of all data (including any meta data) to be identical in the new provider.

The switching from one provider to another to be a useable process.
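One minimal sketch of what "portability of all data including meta data" could look like in practice - the format and field names here are hypothetical, not any provider's actual export mechanism:

```python
# A self-describing export bundle: records AND meta data travel
# together under a published schema version, so the new provider can
# interpret them identically. All names here are hypothetical.
import json

def export_account(records, metadata):
    return json.dumps({
        "format_version": "1.0",  # assumed open, published schema
        "records": records,       # the data itself (e.g. transactions)
        "metadata": metadata,     # tags, history, audit trail
    })

def import_account(payload):
    bundle = json.loads(payload)
    if bundle["format_version"] != "1.0":
        raise ValueError("unknown schema version")
    return bundle["records"], bundle["metadata"]

# Round trip: nothing is lost or reinterpreted in the move.
records, meta = import_account(export_account([{"amount": 100}], {"tag": "rent"}))
```

The open standard specifies the bundle's meaning; an open sourced reference implementation of the export and import is what lets a new provider honour it on day one.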

Now, I agree that open standards are necessary to solve these problems but I don't believe they are sufficient. Rishab Ghosh's 2005 paper describes the economic effect of an open standard as when:

"a natural monopoly arises (de facto) or a monopoly is defined and agreed upon (de jure) in a technology, but the monopoly in the technology is accompanied by full competition in the market for products and services based on the technology, with no a priori advantage based on the ownership of the rights for the rights holder. This occurs when access to the technology is available to all (potential) economic actors on equal terms providing no advantages for the rights holders. In particular, rights to the standard are made available to economic actors other than the rights holders under terms that allow all potential competitors using all potential business models to compete with the same degree of access to the technology as the rights holders themselves."

I highlighted the phrase above because open sourced implementations of open standard services provide a means for other providers to access and rapidly implement a compliant service. A provider can do so without losing any strategic control of their business and on equal terms to any other. Furthermore with an open sourced system, any meta data issues become more transparent and any consumer can implement their own in-house version in order to ease adoption fears.

Open source is an essential part of the portability conundrum. It is only when you have such portability that we are likely to see the development of competitive utility computing markets and adoption fears truly overcome.

We're not concerned about using a bank, as long as we believe we can move our money from one to another and to ourselves. The movement of our accounts is also what keeps the banks competitive with one another and provides some element of market regulation. Without such portability, if you were locked into a bank and couldn't transfer your money in a useable form to either yourself or another bank, then you'd be more inclined to hide it under your own bed.

Software as a Service and utility computing when combined together have a bright and rosy future. By co-operating on open sourced implementations of services with portability in mind, SaaS providers could create a competitive ecosystem where they could enjoy a slice of a very large pie. Unfortunately, it seems likely that many companies will try to control and lock-in their customers and open standards alone will not create the portability desired. Eventually the whole marketplace could well require Government intervention.

I've long held views on the commoditisation of IT and how we are heading towards a future where a mix of small and large "computing power plants" feed computer resources into a grid (or a few grids) and multiple brokerages sell those computing resources, bundled as services, to customers. Before we get there, I expect there will be many more outages, as happened with Amazon's S3 service, as well as a lot of arguments and dubious practices.

Saturday, February 23, 2008

Northern Rock was a building society whose management gambled on a strategy of borrowing from other financial institutions, rather than using customer deposits. This strategy went disastrously wrong as the sub-prime mortgage crisis spread around the money markets. The Government was then expected to bail them out with huge wads of public money, which it did. It was then discovered that most of Northern Rock's valuable mortgage assets had been used as securities for loans. The securities vehicle raising the finance was an offshore fund called 'Granite', and the money raised had already been used by Northern Rock.

If you have a 100% mortgage on your home, then you really don't "own" it. In much the same way, Northern Rock doesn't really "own" most of its most valuable assets.

This is a company in which the Government has risked £100bn of public money. The government is now planning to nationalise Northern Rock which of course has the current shareholders screaming that they should still be paid for this car crash of a company. It should be remembered that some of these shareholders were less than supportive in attempts to find an alternative option and most of the value in the company has already been wiped out by reckless management. I wonder why they think they should be rewarded? Prudence? Good Governance?

Not everyone has lost out: some of the lawyers and bankers who have been advising the Government are demanding a huge £100 million windfall, and apparently the CEO made a bob or two by trousering the profits from the sale of 1.5 million shares at the beginning of the year. The biggest loser seems to be the taxpayer.

Let's be clear, the only people responsible for the failed strategy were the management and the shareholders. The management chose the strategy and the tactics, which crashed the bank when the environment changed. The shareholders chose the management.

As of 2006, we know that Northern Rock had £86 billion in mortgage and loan assets and £40 billion in securitised notes. As mentioned above, the securitised notes were against the mortgage and loan business. Now, we don't actually know how much this increased to in 2007, but according to the Guardian it includes over £53 billion of residential mortgages. So I'll have to guess that this leaves an actual mortgage and loan book of, say, £40 billion or less. Some would argue the value of this could be much lower.

This system of securitising mortgages worked as long as other investors were willing to buy the securities. Unfortunately, the wobbles about the sub-prime market changed the game and the bank ran out of cash. Northern Rock had to borrow from the Bank of England, and this caused the run. The bank has had to continuously borrow more, and apparently the loan now stands at £30 billion.

A £40 billion mortgage and loan business to cover customer deposits, a government loan and any existing bonds! This loan and the Government guarantee is what is keeping Northern Rock afloat.

Public money is at risk, and we, the public, want our money back with a good return. I'd also like to know more about PwC's role in all of this, and how come £40m was distributed to preference shareholders.

"It's a shame that good old market economics are dandy as long as things are going up, but when it's on the way down the banks want to be bailed out by good old state intervention."

Whilst the financial market is quite happy to make a fast buck out of the state as per the QinetiQ fiasco and Black Wednesday, it is also quite prepared to shriek about "minimalist intervention" (aka Friedman's folly) and the need for "self-regulation" when that is more advantageous. We even have the EU Internal Market Commissioner, Charlie McCreevy, accusing the banks of "failing standards".

It's the same old story: "Government interference in the market is wrong, the market should regulate itself" UNTIL "the market can't regulate itself, that's the Government's job".

Adam Smith knew that markets were no more than a tool to be used by society and that they needed control. The market is part of society, and not the other way around. The Government should be a firm and visible hand, providing consistency but remaining suitably distant from business. Unfortunately, we have allowed MG and other manufacturing companies to go to the wall whilst we bail out financial services - so no consistency. We even hold meetings with oil companies to discuss post-war economic opportunities in Iraq before the first bombs have dropped - too cosy a relationship for my liking.

There needs to be some distance put between Government and business, some consistency in intervention, a realisation that the markets need firm regulation (as per Keynes and Smith) and an understanding that Government's role isn't for the benefit of the market but for the benefit of society.

Judging by the recent howls about the changes to non-doms and other recent interventions, it looks like this might just be starting to happen.

Friday, February 22, 2008

I've been preparing for my upcoming talk at Enterprise 2.0 Summit, hence my silence in the blogosphere. However during the last two weeks, the phrase "caveat emptor" has been ringing in my ears.

Let's start with groceries.

Under new rules put forward by the Competition Commission, planning proposals for new supermarket stores will face a competition test. This will, in effect, bar supermarkets with a certain number of stores in an area from building new shops there. It is designed to increase competition.

According to the FT, Lucy Neville-Rolfe (a director of Tesco, the dominant supermarket in the UK) said of the proposals:

"We are against the competition test because we think it's wrong in terms of bureaucracy and economics." and "We're against growth cap-style regulation."

In other words, Government interference in the market is wrong, the market should regulate itself.

Hot on the heels of this, Tesco was again criticising the Competition Commission's proposal to introduce an ombudsman to ensure a level playing field between retailers and farmers. According to Farmers' Weekly, Lucy Neville-Rolfe said:

"Tesco considers that introducing a new ombudsman could be bureaucratic and an unnecessary cog in a supply chain which has worked well for consumers ..."

In other words, Government interference in the market is wrong, the market should regulate itself.

So we come to the latest fuss, which is supermarkets selling low cost alcohol, often as a loss leader to boost other grocery sales. According to the Times, Lucy Neville-Rolfe said:

“We can’t put up our prices because people will simply shop elsewhere – it could be commercial suicide - and we can’t act together to put up prices because that would be against competition law." and "The only safe solution is for the Government to initiate and lead these discussions and to bring forward legislative proposals which Tesco and others in our industry can support”

In other words, the market can't regulate itself, that's the Government's job.

This is a definite case of heads the market wins, tails the Government loses. As for Tesco's offer to work with the Government on possible legislation, I hope the Competition Commission remembers the words of Keynes:

"Capitalism is the astounding belief that the most wickedest of men will do the most wickedest of things for the greatest good of everyone"

Tuesday, February 05, 2008

Housing minister Caroline Flint's attempt to end the culture of "no one works round here" has been taking a bit of flak. Apparently some MPs' families actually do some work for the huge wads of public cash they get given.

I missed this announcement, but it seems that the High Court said "that the Patent Office was incorrectly applying the law in automatically rejecting claims for computer programs".

According to IAM (Intellectual Asset Management) magazine, the situation in the UK is still confusing but there is hope of a single European patent jurisdiction.

Unfortunately, the European Patent Office has tended to be more lenient to software patents. Though this may favour patent lawyers, it would be a significant setback for society and innovation.

For those of you who have not read Nobel Prize-winning Eric Maskin's paper on Sequential Innovation and Patents, it conservatively summarises with the statement that:-

"In a dynamic world firms have plenty of incentive to innovate without patents and patents may constrict complementary innovation”

Unfortunately, some people see patents as purely a financial opportunity and have somewhat forgotten that the original purpose of patents was to be a fair exchange, designed to promote and disseminate innovation in society in return for a short-lived monopoly.

If we were Golgafrinchams, then we'd probably have started building that B Ark by now.

Monday, February 04, 2008

I thought I'd talk about the use of open innovation markets and the outsourcing of innovation. However, in order to do so, I'll need to define a few terms first and explain what innovation is. Let's start with those definitions:-

From Jan Fagerberg, innovation is the first-ish attempt to carry an idea into practice. It is the embodiment, combination, or synthesis of knowledge in original, relevant or new products, processes or services.

An idea is an image or a concept or an abstraction. As John Locke said “it being that term which, I think, serves best to stand for whatsoever is the object of understanding when a man thinks”.

Discovery or invention are processes that result in the generation of new concepts or postulated entities or devices. Both of these processes involve serendipity, questioning and the use of analogy. So for example, you can have:-

Invention: a Turing Machine (a postulated entity)

Idea: the use of a Turing Machine to solve business problems (an abstraction of how to use the postulated entity)

Innovation: LEO, the world's first business computer (putting the idea into practice)

This general movement from invention to idea and then on to innovation, I've summarised in the diagram below.

Figure 1 - Invention to Innovation

Now, this is a very neat, cosy and comfortable view of innovation which implies an ordered logical flow from one to another. Of course, innovation rarely works like that. Prior to the innovation, "pre-event" so to speak, there is a massive amount of uncertainty as to what will be invented or discovered, what ideas will be created and what will actually get put into practice. The Wright brothers apparently believed that powered flight would stop future war by removing the element of surprise. Unfortunately, others had different ideas.

Research is a highly uncertain activity, and most large research groups have a plethora of incidental creations. The often cited question is - "how do you turn these creations into something of value?" One possible solution is to use an open innovation market, such as InnoCentive, an exchange of problem seekers and problem solvers.

Let's consider such an exchange in terms of creations that are either "post-event" (i.e. something already discovered or invented, or an existing idea or innovation) or "pre-event" (i.e. something which has not yet been discovered, invented, thought of or implemented).

Now, for a "post-event" creation which is not directly useful to the owning company, an exchange that allows this incidental to be sold to another party is beneficial. An alternative form of this exchange is when a company with a problem can announce it and let other companies see if they have a ready-made solution. An exchange of "post-event" creations makes a great deal of commercial sense.

Now let's consider "pre-event" requests, such as "we want a miracle cure for aging". The problem with pursuing such a request is that you don't know how to get there; you will need a lot of trial and error. The danger with such a request is that you are proposing someone undertake a "pre-event" service but you are intending to buy a "post-event" good, such as the miracle cure itself. Who will end up paying for all the failures, the time spent on unsuccessful work and chasing down blind alleys?

At this moment in time, a fair amount of that chasing down blind alleys occurs in corporate research labs. It is tempting to think that such "non profitable" effort could be outsourced, that somehow we can outsource innovation to a marketplace and pay for only what we use in terms of results.

It is worth remembering:-

There are no economies of scale with the creation of innovations as they are novel and uncertain.

Any "innovation" providers will not only require a suitable price to cope with the risk of failure, but they will also benefit from the incidental discoveries.

Any company handing over innovation is in effect handing over its future source of profits to a market place. It becomes dependent on the marketplace.
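The pricing point above can be sketched with a toy calculation (the numbers and the `margin` parameter are my own illustrative assumptions, not from the post): if a provider is only paid for successful "post-event" results, its price must cover the cost of all the failed attempts too.

```python
# Toy model: a success-only contract must price in the expected failures.

def fair_price(cost_per_attempt: float, p_success: float, margin: float = 0.2) -> float:
    """Price a success-only contract.

    With success probability p_success per attempt, the provider expects
    1 / p_success attempts per success, plus a margin for bearing the risk.
    """
    expected_attempts = 1 / p_success
    return cost_per_attempt * expected_attempts * (1 + margin)

# A highly uncertain "pre-event" request: 5% chance of success per attempt.
print(fair_price(cost_per_attempt=100_000, p_success=0.05))  # roughly 2.4 million

# A well-understood "post-event"-like request: 50% chance per attempt.
print(fair_price(cost_per_attempt=100_000, p_success=0.5))   # far cheaper
```

The buyer ends up paying for the blind alleys either way; they are simply bundled into the price.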

So whilst such markets are a useful "post-event" tool, the outsourcing of "pre-event" innovation could well result in higher cost and a loss of control in the long term. Though they do offer the possibility of addressing the imbalance between the distribution of ability and the distribution of opportunity, there is also the danger that such markets could be exploitative of universities.

Whilst I would agree that innovation markets have a positive role to play, the idea that innovation can be outsourced entirely (post and pre-event) without creating some permanent form of exploitation would appear to be little more than youthful exuberance with the latest plaything.

Sunday, February 03, 2008

A couple of months ago, a friend of mine from "a galaxy far, far away" mentioned how the policy of outsourcing that they were following was causing all sorts of problems. The promised benefits and cost savings were much less than expected and they felt they had not only lost significant amounts of talent but were dependent upon the vendor.

What had gone wrong? I can't talk about the details but I'll try and explain.

If you take a snapshot of a company at any time, you will find that it consists of a mass of different activities (where each activity is either a process, or part of a process, or related to the result of a process, such as a product or service). Each of these activities is on its own pathway from innovation (something new and relatively unknown) to commodity (something common and well defined).

In figure 1 - I've provided a very simplistic and hypothetical example of such activities in a profile form. The components are actually all connected but the profile helps to highlight that many are far more widespread (ubiquitous) and well defined (certain) than others.

Figure 1 - Hypothetical examples of organisational activity

Now, many companies organise themselves by type of activity such as marketing, finance, operations and IT. Each "function" of the business will therefore consist of a range of activities from innovation to commodity. Ideally when you outsource a "function", you want to outsource those activities which are well defined and common in your industry i.e. the commodities, see figure 2.

Figure 2 - Focus of outsourcing

Such activities are more likely to be suitable for economies of scale, standards and an ecosystem of providers. This is where outsourcing can work and for a business consumer this roughly translates to lower prices, reliability, increases in speed and potential for portability in a competitive market.

However if you outsource a function of the business, such as marketing or finance or IT, then along with the commodity like activities you will be outsourcing some innovative and transient activities - see figure 3. For such activities there will be no economies of scale, standards or an ecosystem of providers.

Figure 3 - Additional activities lost in outsourcing

The net result of such an outsourcing would be a loss of innovative activities, potential future profits, capability and talent. Furthermore such outsourced activities are unlikely to show any economies of scale and are more likely to result in lock-in and cost overruns.

Outsourcing is a sensible approach for commodity-like activities, for all others, buyer beware. If you are intending to outsource a function of the company, then ideally you should only be outsourcing those commodity activities, and ideally to multiple service providers.
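The rule of thumb above can be sketched in a few lines (the activity names and the threshold are my own illustrative assumptions, not from the post): profile a function's activities by ubiquity and certainty, and only mark the commodity-like ones as candidates for outsourcing.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    ubiquity: float   # 0 = novel in the industry, 1 = ubiquitous
    certainty: float  # 0 = ill-defined problem, 1 = well defined, best practice

def outsourcing_candidates(activities, threshold=0.8):
    """Keep innovative and transitional work in-house; flag only commodities."""
    return [a for a in activities
            if a.ubiquity >= threshold and a.certainty >= threshold]

# A hypothetical IT function: a mix of commodity and novel activities.
it_function = [
    Activity("payroll processing", 0.95, 0.90),
    Activity("desktop provisioning", 0.90, 0.85),
    Activity("novel recommendation engine", 0.10, 0.20),
]

for a in outsourcing_candidates(it_function):
    print(a.name)  # payroll processing, desktop provisioning
```

The novel activity fails both tests and stays in-house; outsourcing it would mean no economies of scale and a likely loss of future differentiation.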

I mention this because the same conversation came up again recently. I was a little bit harsh in my response by making numerous references to the Golgafrinchan B Ark and why management consultants were included.

In general though, if you are considering outsourcing, then :-

For goodness sake, make sure you are fully aware of ALL the activities you are outsourcing and whether those activities are suitable for outsourcing. Not everything is.

Saturday, February 02, 2008

Derek Conway has been caught out handing public money to members of his family for non existent services. The total amount seems unclear, but at least £70,000 was given to his two sons.

There is all sorts of talk about whether he should resign or not. Personally I think he should but that isn't the issue that concerns me.

Taking public money and giving it to another under false pretences is fraud, it's theft. If an employee of the DWP handed £70K to family members under false pretences then both they and the recipients would be facing criminal prosecution.

In my opinion, it would appear that both Derek and his sons have some serious questions to answer, as this looks like fraud against the public purse.

The question of whether he should resign is a matter of honour, however this would appear to be a matter of theft.

Friday, February 01, 2008

On Monday, a friend asked me to recommend a project management methodology for IT. I wasn't trying to be awkward, but my answer was "that depends".

Let me explain.

Any activity that a business performs (whether it's a process, or part of a process, or related to the result of a process such as a product or service) is transient. In other words an activity's characteristics change over time.

For example, the use of CRM (customer relationship management) systems was a new activity in the mid-80s; today, however, it has become widespread. CRM has transformed from something novel, with little or no literature and a relatively undefined problem space, to something common, backed up by a wealth of literature and a well-defined problem space. I've marked this transition on figure 1, which shows a graph of ubiquity (from new to common) against certainty (from an undefined problem that few have solved to a well-defined and specified problem that has been solved many times before).

Figure 1 - The transition of CRM

If you take a snapshot of a company at any time, you will find that it consists of a mass of different activities. At one end, you have those activities that are truly novel within the industry and, because of their novelty, relatively undefined. Such activities are the innovations, the differentiators and the potential sources of competitive advantage. At the other extreme, you have those activities that are common or ubiquitous within the industry and hence well defined (to the point of best-practice solutions and step-by-step how-to guides). These are the commodity-like activities and a likely cost of doing business. I've shown this in figure 2.

Figure 2 - Organisation as a mass of activities

Now this is only a snapshot in time. Any activity (as per CRM) is on a path from innovation to commodity and hence its position on this graph will change over time. As an activity moves down this path, it changes from a highly variable dynamic class of problem to a more fixed static class of problem. Its characteristics change and hence the method by which it is best managed. I've tried to show this in figures 3-4.

Figure 3 - Characteristics at different stages of the activity lifecycle

Figure 4 - Effective methods for dealing with an activity at different stages of its lifecycle

In order to know how to manage something, you need to know where it is on this curve. Hence my original answer. The project methodology depends upon what you are doing and, most importantly, when you intend to do it.

The methodology I would use to manage a CRM project has changed drastically, but that is only because CRM is no longer an innovation - it has become more of a commodity.

This is also why there are no single, magic-bullet solutions to project management. When dealing with a commodity, your focus is to eliminate variation. No business wants variability in its power supply or telephone or internet connection. Innovation, however, requires deviation from the accepted norms of today - it's something new. So on one extreme you need methods to eliminate variation, whilst on the other you need methods to encourage and cope with variation.
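The "different methods for different stages" idea can be sketched as a simple lookup (the stage boundaries and method labels are my own guesses at the intent, not a canonical mapping from the post):

```python
def suggest_method(ubiquity: float, certainty: float) -> str:
    """Pick a management style from an activity's position on the curve."""
    if ubiquity < 0.3 and certainty < 0.3:
        # Novel, undefined: you want deviation and fast experiments.
        return "experimental / agile (encourage variation)"
    if ubiquity > 0.7 and certainty > 0.7:
        # Common, well defined: you want repeatability.
        return "process-driven / structured (eliminate variation)"
    # Somewhere in transition between the two extremes.
    return "lean / transitional (reduce waste as it settles)"

print(suggest_method(0.1, 0.2))   # a genuinely novel activity
print(suggest_method(0.9, 0.95))  # a commodity, like CRM today
```

The point is not the thresholds, which are arbitrary here, but that the method is a function of position on the curve rather than a single universal choice.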

In my opinion, the simple solution is just to use different methods and to learn when to use them.