It’s easy to fail fast. It’s harder to succeed speedily

There is no simple formula for ensuring successful innovation but neither are the biggest breakthroughs just lucky accidents emerging from chaos. As the outgoing DMA fellows move on to new challenges and a new cohort prepare to run experiments of their own, this post looks at some of the factors that can help make successful innovation more likely.

Over the past few months all the DMA fellows have explored new ways of connecting their organisations with audiences. I’ve worked with some as a mentor while at the same time helping other organisations manage digital experimentation.

Most recently I’ve been developing processes to support the BBC Connected Studio programme. Much like the DMA, Connected Studio is a series of experiments testing new digital features, formats and services that aim to reach new audiences or existing audiences in new ways.

Most of the Connected Studio projects are several orders of magnitude larger than their DMA counterparts in terms of budget and likely audience reach, but what’s striking are the similarities between the most successful ones, regardless of scale.

There are six factors in particular that are common to the most successful experiments in both programmes:

1. Speed – make it fast

When my colleagues Jon, Mike and I recommended to the Arts Council that they should fund the Arts Marketing Association DMA programme, our overarching criterion was that the programme should be ‘lean’ with ‘rapid iteration’ allowing ‘innovative ideas to be tested cheaply with real audiences’. The AMA has done this well.

The mantra ‘fail fast’ is often used in digital startups but as no-one really likes failure ‘succeed speedily’ is more appropriate and certainly more optimistic. The key in either case is pace: build quickly, test quickly, learn, move on.

Connected Studio has launched the new BBC Taster platform which makes it even quicker to get a new idea in front of audiences. But such an approach isn’t the preserve of a large broadcaster – clearly labelled public experiments are one of the best ways to gauge user reaction to new ideas.

2. Scope – keep it small

Another concept used by digital startups is the ‘minimum viable product’ (MVP). Which core features must be there before you can put your new product in front of users? This echoes the point in my earlier post about finding the quickest way to move things forward:

If you need some audience feedback, do you really need an ethnographic study, or would asking a few people outside on the street be a good first step? Paper sketches are often a good way of bringing ideas to life at very minimal cost, and they don’t need to be beautifully designed to help you gather quick feedback; if anything, the sketchier the better.

3. Novelty – don’t try everything at once

When scientists run experiments, when car mechanics look for faults, when chefs adapt their recipes, one thing is always true: some parameters can be varied but others must remain constant. If you test too many things at the same time, your results will be harder, if not impossible, to decipher.

Added to this, if you try too many new things at once – let’s say a technology you’ve not tried before on a new platform with a new audience in a new genre – it only takes one element to fail and you lose the opportunity to learn anything valuable.
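To make the one-variable-at-a-time principle concrete, here is a minimal sketch of a controlled comparison. The variant names, visitor counts and conversion figures are all hypothetical; the point is that everything except the headline is held constant, so any difference in the result can be attributed to that single change.

```python
# A minimal sketch of a one-variable experiment. The data below is
# illustrative, not real: audience, platform and page layout are held
# constant, and only the headline is varied between the two versions.

def conversion_rate(conversions, visitors):
    """Fraction of visitors who completed the target action."""
    return conversions / visitors if visitors else 0.0

control = {"headline": "Original", "visitors": 1000, "conversions": 50}
variant = {"headline": "New",      "visitors": 1000, "conversions": 65}

# Because only one parameter differs, the lift is interpretable.
lift = (conversion_rate(variant["conversions"], variant["visitors"])
        - conversion_rate(control["conversions"], control["visitors"]))
print(f"Conversion lift from the new headline: {lift:.1%}")
```

If the new headline had also shipped alongside a new platform and a new audience, the same lift figure would tell you almost nothing about which change caused it.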

4. Rigour – innovation needs managing

Innovation should not be used as an excuse for a lack of project management rigour. Nobody likes an overblown process, and nobody wants multiple tiers of meetings and reporting lines, but innovation needs managing, if anything more closely than business-as-usual activity, which is often familiar enough that the manual doesn’t need to be referenced.

When I was Head of Online for the UK Parliament we relied on the Parliamentary ICT service for computer hardware and software support. They used agile methodologies and the fact that ‘Parliament is unique’ as excuses for a lack of rigour, were widely criticised for a string of failures and outages, and have since been restructured under new management. Which teaches us:

If someone tells you that what they’re doing is so new or different it can’t be managed in a normal way, then alarm bells should ring. There’s no reason why a project – even an agile, experimental, innovative digital project – should not have a plan. All projects are a trade-off between cost, time, quality and scope, and it’s essential to pin these things down.

5. Focus – keep your research question in mind

The DMA work plan and experiment log asks: what will success look like and how will it be analysed, measured and shared? The Connected Studio project approval document asks: what questions or problems are we answering, how can these be measured, and by what metrics?

In both cases the answer to these fundamental questions is itself a question – the research question – and the most successful projects keep this research question in mind at all times.

There is a risk that, as an online experiment starts to show signs of success, the service will be expanded or the idea extended before the original research question has been answered.

You must keep coming back to why you are running a trial to begin with. You should also frame the question and the experiment so that not only do you get a result but you also gather sufficient evidence to convince others the idea is worth pursuing further.
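One way to keep the research question in view is to write it down alongside an agreed success threshold before the trial starts. The sketch below is hypothetical: the question, metric, baseline and threshold are invented for illustration, but the structure shows how agreeing the numbers up front makes the eventual verdict unambiguous.

```python
# A sketch of pinning down the research question and success criteria
# before an experiment runs. All names and figures are illustrative.

experiment = {
    "research_question": "Do short clips reach under-25s our text doesn't?",
    "metric": "share of weekly visitors aged under 25",
    "baseline": 0.08,           # measured before the trial began
    "success_threshold": 0.12,  # agreed in advance with stakeholders
}

def question_answered(observed_share):
    """Judge the result only against the threshold agreed up front."""
    return observed_share >= experiment["success_threshold"]

print(question_answered(0.10))  # an uplift, but not the one we asked about
print(question_answered(0.14))  # evidence strong enough to share
```

Deciding the threshold in advance also gives you the evidence the paragraph above calls for: a result judged against pre-agreed criteria is far more convincing to others than one interpreted after the fact.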

6. Stop, collaborate and listen

Finally, and most importantly of all, the most successful innovation projects have teams where people stop, collaborate and listen. Keep talking to each other, keep reminding yourselves of your goal, and your project will be more likely to succeed.

Hopefully, in that spirit of collaboration, this post will trigger some discussion about other factors that have made the DMA experiments successful. I look forward to hearing these as the year one cohort collate their final results and lessons learned.