Abstract

The phrase 'Web 2.0', now so well known as to be
generally considered 'mainstream', has taken hold
online, first as a catchphrase and now as a way of life for many
of the bigger, content-rich providers. No longer are users
content simply to consume; instead they want to take part
in creating content, to personalise it and to share experiences with others. In
the museum sector, however, uptake has typically been low. Some
notable exceptions exist, of course, but the key question
remains: why has deployment of this 'new' approach to
content been slow? What barriers exist in museums and how can we
go about addressing these?

This paper attempts firstly to identify why Web 2.0 is of
particular importance to our sector, then to examine common
barriers in our own context, and finally to consider the ways in which
practitioners might go about addressing these barriers in their
organisations.

Keywords: Web 2.0, policies, cultural change

Web 2.0 and Why Museums Should Care

Web 2.0 is often lauded, sometimes derided and almost
always subject to scrutiny. What exactly does this phrase mean?
Does it mean anything, or nothing? Has Web 2.0 been and gone,
already bloomed, about to be the next big thing, a fad? Is it
finally the Berners-Lee vision of the 'Read-Write
Web'? Or is it 'just' about technology, and
not, actually, about content? Aren't we better off waiting
for Web 3? And last, but not least, as people who produce Web
content, should we actually care?

The fact that 'Web 2.0' is a catchphrase, a
meaningless marketing slogan to some, is immaterial here. What
the phrase has done, usefully, is to draw a dotted line around a
series of aspects of Web experience. The finer points about the
extent of 'Web 2-ness' are also unimportant – to
some, 'Web 2.0' may encompass any kind of Web site
which gives users the means to author their experience in some
way; to others, it may be about an 'AJAX-ified' user interface.

Trawling the Web finds the following phrases recurring around
Web 2.0: 'mashup', 'de-centralisation', 'non-Web like', 'user generated content',
'permission based activity', 'collaboration', 'Creative Commons', ...
What sits at the heart of all of these, and one of the
reasons Web 2.0 has been difficult for bigger, established
organisations (including museums) to embrace, is that almost all
the things talked about put users and not
the organisation at the centre of the equation.
Organisational structures, departmental ways of naming things,
the perceived 'value' of our assets, in fact everything the organisation has to say
about itself, are all being challenged.

These are difficult things to question, particularly in an
organisation which is historically highly respected, or has a
long-standing way of doing things. Web 2.0 does uncomfortable
things: it releases assets into the wild, it empowers users to
speak their mind, it asks people to share and collaborate in
ways which are unprecedented. For museums, the
challenges are even more profound: What about dumbing down? Who
is going to moderate? What if they don't like our exhibition?
Surely our curators are the
experts, not some random bloke who rode one of those bikes when
he was a kid ...?

Questions like this often form the basis of the barriers to
'doing' Web 2.0. Later on, we'll examine these
barriers in detail and try to find some strategies for addressing them.

As mainstream writers start to bring YouTube and MySpace into
the public eye, there is also of course a major risk. It is no
different to the risk with any new technology, particularly those
that are climbing the Gartner Hype Curve. The risk is
that we do these things just because we can, or because everyone
else is doing them, or even more dangerously, because they attract funding.

These are not good reasons for 'doing Web 2.0'.

There are, however, many synergies between some of the key
tenets of Web 2.0 and what museums are trying to do, both
on-gallery and on-line. For years, we have tried as a sector to
appeal to a mass market, to solicit opinions about our objects, to
not 'just be a repository of dusty stuff in cases'.
At the centre of Web 2.0 is the promise of richer, more relevant,
more personal content; content which can make a difference to
users – ultimately, content which goes at least some
distance to answering these challenges. This is a Web where the 'reactive consumers'
become 'public producers'. Experiences are shared and opinions given.

In this light, 'Web 2.0' brings some very
interesting things to the party, especially in our particular environment.

Doing Web 2.0 just because we can is wrong; however, doing it
because our users expect it, because it adds real value to what
we - and they - have to say, or because it extends the content
experience in real and meaningful ways is, of course, right.

We Should Care Because ... It's the Way the Web is Going

User Generated Content (UGC)

Users are no longer content to just surf. Media consumers are
instead becoming 'prosumers' - spending time
either producing content themselves or reading content produced
by other users. Although current thinking suggests that content
producers are still far outnumbered by content consumers, as
the tools become available and confidence grows, people will
produce more and more. At the same time, the means by which value
is attributed to this content will become better defined.

Museum Web sites should not be an exception to this
groundswell of UGC. Of course, there are still many places where
museum Web site content (like any Web site content) should just
be published, and where UGC is irrelevant. Opening times and
location maps don't typically need comments, opinions or
other external input. Museological content, however -
objects, stories, games - is often ripe for UGC. What
better way to make that object relevant to today's users
than by offering the opportunity to write about it, comment on
it, compare it with what is in their lives now?

Opinions ...

Opinions count, and although authority is important (more on
this later ...), real people are attracted by real people's
opinions. Sites like eBay are an example of user generated
content at its most lucrative. eBay would not exist without the
core understanding of trust and what it means. This trust is
built by and between users; eBay provides a mechanism
for expressing this trust, but it has absolutely nothing to do
with authoring or moderating this trust. At a secondary level is
of course trust of the eBay brand, or trust of their e-commerce
mechanism, but the means by which users rate each other is
absolutely fundamental to the operation of the site.

Countless examples exist of the amplifying power of the web for user opinion:
from RatherGate (http://en.wikipedia.org/wiki/Rathergate) to
iPod Nano Screens (www.flawedmusicplayer.com
- no longer live, following requests from Apple: see http://www.google.com/search?q=flawedmusicplayer).
Real people are changing the real world through the Web.

Syndication - We're Already Doing It, But May Not Realise ...

Mashups, feed sharing, APIs – these are still on the
edge of the curve, only really understood and used by early
adopters. RSS is one area where syndication is starting to become
mainstream, but what is slightly less well understood is that
almost every Web site is essentially already a source of
syndicated content. Many people browse your content without even
visiting your Web site.

Take Google Images. Here is a tool which sets out specifically
to side-step on-site search engines and to provide a much more
powerful user-focussed interface. Why search one
(museum) site for information on George Stephenson when you can
search the entire web? Once images have been found in this way,
people are increasingly then embedding them in their blogs and
MySpace pages. Around 9% of referring traffic to
the Science Museum Web site is from MySpace, with people
'borrowing' images in this way. And quite possibly,
none of these people have ever actually been to
the Science Museum Web site!
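
As an illustration of how a figure like this might be derived, the sketch below (in TypeScript, run under Node.js) estimates the share of referring traffic coming from a given host by scanning a Web server access log. The log path, the combined-log-format assumption and the choice of host are purely illustrative, not a description of the Science Museum's actual reporting setup.

import { readFileSync } from "fs";

// Sketch: estimate what share of referred requests name a given host as the
// referrer, by scanning a combined-format access log. The field layout and
// the log path are assumptions made for illustration only.
function referrerShare(logPath: string, host: string): number {
  const lines = readFileSync(logPath, "utf-8").split("\n");
  let referred = 0; // requests carrying any external referrer
  let matched = 0;  // requests referred from the host of interest

  for (const line of lines) {
    // In the combined log format the referrer is the second-to-last quoted field.
    const quoted = line.match(/"([^"]*)"/g);
    if (!quoted || quoted.length < 2) continue;
    const referrer = quoted[quoted.length - 2].replace(/"/g, "");
    if (referrer === "-" || referrer === "") continue;
    referred++;
    if (referrer.includes(host)) matched++;
  }
  return referred === 0 ? 0 : (100 * matched) / referred;
}

// e.g. referrerShare("access.log", "myspace.com") would report the kind of
// percentage quoted above.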

Sites like ClipMarks (www.clipmarks.com/)
or Google Notebook (www.google.com/notebook/)
let users 'clip' (essentially, copy and paste) content from any site - again,
the consumers of this content probably never even get as far as your site, let alone
the homepage or some means of contextualising what they've
just read. These sites, and RSS, are popular because people are
busy – often far too busy to want to dig deep into pages of description.

Later on, we will describe why this is an important point,
particularly when trying to justify the production of feeds and
APIs, as well as when supporting more open content sharing standards.
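
To make the idea of 'producing feeds' concrete, the sketch below (TypeScript) assembles a minimal RSS 2.0 feed of object stories. The item data, URLs and channel description are invented placeholders; a real feed would be generated from the collections database and validated against the RSS specification.

// Sketch: building a minimal RSS 2.0 feed of object stories. All item data
// and URLs are placeholders invented for illustration.
interface FeedItem {
  title: string;
  link: string;
  description: string;
}

function escapeXml(text: string): string {
  return text.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function buildRss(channelTitle: string, channelLink: string, items: FeedItem[]): string {
  const body = items
    .map(
      (item) => `  <item>
    <title>${escapeXml(item.title)}</title>
    <link>${escapeXml(item.link)}</link>
    <description>${escapeXml(item.description)}</description>
  </item>`
    )
    .join("\n");

  return `<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
<channel>
  <title>${escapeXml(channelTitle)}</title>
  <link>${escapeXml(channelLink)}</link>
  <description>Recently published object stories</description>
${body}
</channel>
</rss>`;
}

// Placeholder usage:
// buildRss("Object stories", "https://museum.example.org/stories", [
//   { title: "Stephenson's Rocket",
//     link: "https://museum.example.org/stories/rocket",
//     description: "How one locomotive changed rail travel." },
// ]);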

Museums are a perfect example of the 'long tail'
- our popular content is probably eclipsed in overall
popularity by the weirdnesses lurking at the end of the tail; the
artefacts and stories which make our curators and our
organisations real and interesting to users. Google isn't,
in many ways, great at surfacing long tail content (by its nature it
promotes pages which are 'popular' - i.e. by
definition, not at the thinner end of the tail), but it does get
people to content on sites they probably wouldn't normally
visit. People increasingly use Google to find stuff (surprise!)
- but it is easily forgotten that they may well not have
set out to look at your museum Web site but instead were just
searching for a biography of Charles Babbage or an image for their homework essay.

User Experience

For some people, Web 2.0 also seemed to define a whole new
look and feel. For a while, it seemed as if the entire world had
reflected logos, beta badges and names ending in the letter 'r'
(Flickr, Mappr ... Objectr,
anybody?). Any serious examination of what Web 2.0 means should
ignore these more superficial aspects, but there are some
interesting things which this 'new' way of working uncovers.

Probably most interesting here is that
'technologies' such as AJAX can be used to change the
users' relationship with the online environment. The
'rich user experience' as explained by the original
O'Reilly 'meme map' (O'Reilly, 2005)
covers not only the ways in which users are empowered to engage
with content, but also the environment in which
they engage. AJAX, as one example, provides new paradigms in Web
browsing where drag and drop shopping baskets can co-exist with
accessibility, where external content can be fed seamlessly onto
the page, or where a pop-up window can work (elegantly!) to
display large versions of images. In short, an environment can
now be created where the users' experience is much closer
to that of a desktop program.
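
A minimal sketch of this kind of in-page interaction is shown below in TypeScript, using the browser's fetch API as a stand-in for the XMLHttpRequest calls that underpinned early AJAX. The endpoint, response shape and element id are invented for illustration; the point is simply that the larger image arrives without a full page reload.

// Sketch: fetching a larger version of an object image into the current page
// without a full reload. The endpoint, response shape and element id are
// invented for illustration only.
async function showLargeImage(objectId: string): Promise<void> {
  const response = await fetch(`/api/objects/${objectId}/large-image`);
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const data: { url: string; caption: string } = await response.json();

  // Swap the image in place rather than navigating to a new page.
  const img = document.getElementById("object-image") as HTMLImageElement;
  img.src = data.url;
  img.alt = data.caption;
}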

The Barriers ...

Like any change, 'Web 2-ness' can be a
source of fear. This is compounded by the fact that new
technologies - in particular those that impact on core
aspects of what an organisation is and does - are particularly
subject to reservations from all levels of an organisation. From
the fears that a Web developer might have over learning a new
technology to clashes at management level over ownership, to
director-level fears about the public and sponsor perceptions of
an organisation, Web 2.0 is a difficult beast to
tame.

'Simply, Why Should We Bother?'

Challenging the status quo of an activity which is already
well established (or not, in some cases!) is always a difficult
task. Hearing about major Web 2.0 companies in the popular media
as they continue to thrive certainly helps the drive to adopt
these technologies, provided expectation can be managed, of course.

Often, though, finding justification to 'do Web
2.0' is a challenge in itself: measurement of success is
ill-defined (see 'Measuring It', below), and the
actual tangible benefit is hard to articulate. 'Content
ROI', as in the Web many of us worked in 5-8 years ago, is
very hard to pin down.

Cultural and Political: 'It's My Content ...'

Cultural and political barriers are often the most challenging
to respond to, because they require a hard-to-define
'soft' approach which is not about proving new
technologies, but is instead about working with people who feel
challenged by them. Often, the Web function occupies an
already contested position within an organisation. Marketing, IT
and education departments often lay claim to the Web, with
'matrix managed' lines between them: Web teams are
frequently divided between these departments and not firmly bedded
into any one.

Throwing contestable technologies into this mix can prove
extremely challenging, especially when these technologies go
against much of what has gone before. Education and Marketing
functions often have trouble with the UGC elements of Web 2.0;
both find the concept of an external party editing content on the
site difficult from both a brand and a 'trusted
organisation' perspective. Curatorial staff have
additional, deep-seated concerns about authority once user
content is brought into the mix.

Technical

Over in IT, the questions are often instead about security,
denial of service attacks, or how to manage accessibility or
metrics within these new frameworks of content delivery.

As well as these concerns, members of the IT community may
well also focus on issues such as interoperability; import and
export of data; management of users' IDs; and the
scalability and the reliability of a service. This becomes
particularly relevant when external services are also a part of
the delivery framework: depending on one's own IT
infrastructure is one thing; depending on a (sometimes unknown)
third party is something else altogether.

There may also be understandable concerns regarding the levels
of technical expertise which may be required in order to develop
innovative services.

Resource and Cost

Resource concerns are common: surely the organisation requires
additional money, time or expertise to invest in the deployment
of new services? There is a perception that all UGC requires
intensive moderation, or that any new technology is, by its
definition, expensive and difficult to implement. In the early
stages of implementation of technologies, figures and comparisons
are hard to come by – case studies often only exist in the
form of privately funded start-ups, which are often loath to publish
details of their implementations.

Content: Legality, Privacy, Liability

Data protection, privacy, liability and accessibility issues, and
uncertainties regarding the lack of any formal contractual
agreements are often the fears which surround Web 2.0. This may
also give rise to concerns regarding the sustainability of such
services, and disaster recovery strategies which may be needed if
an external provider of a service becomes bankrupt or changes the
terms and conditions governing use of the service to the
detriment of the user organisation.

At the same time, commercial arms of organisations - which
obviously guard their content jealously and often charge for
access - are of course nervous when a new approach is suggested
which appears to give away that content for free, or apparently
dilutes the value of that content by allowing users to edit.

Liability, and the legal question of who is responsible for content,
is also an area of much confusion. Negotiating terms with users
and funders is often key to this, and this negotiation is
frequently done by those outside the group which is driving the Web
2.0 work. Internal relationship building is vital, and where disparate
groups are negotiating for different things, major setbacks can
easily occur.

Measuring It

Even basic Web metrics are often difficult to pin down:
already, defining terms such as 'visits' to internal
groups, let alone government or sponsors, is very often hard to do.

Throwing Web 2.0 into the mix confuses things much further.
Not only are there ill-defined ways of measuring success,
technically, but agreed standards are often non-existent. Couple
with this the fact that technologies such as AJAX actually
change the way that people interact with a page
(and hence change visit analysis), and things become much more
complex: an AJAX style approach to pages means that the
relationship between page views, visits and hits is skewed in
as-yet unknown ways. A third layer of confusion is added when you
consider off-site content. How do you measure syndicated content:
either RSS feeds or 'borrowed' resources such as
images or 'clips'? And even if you can find a way of
measuring them technically, how do you then measure their
effectiveness? Do they count as 'your' content? And
what if your content is mashed up with someone else's?
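
To make the problem concrete, the TypeScript sketch below shows what recording an in-page interaction as an explicit event might involve, so that activity which never triggers a conventional page view is not lost entirely. The /log endpoint and the event fields are hypothetical; in practice this would be wired into whatever analytics package the organisation already uses.

// Sketch: recording in-page (AJAX) interactions as explicit events so that
// they are not invisible to visit analysis. The /log endpoint and the event
// fields are hypothetical.
function logInteraction(action: string, detail: string): void {
  const payload = JSON.stringify({
    action,                         // e.g. "image-zoom" or "comment-posted"
    detail,                         // e.g. the object id involved
    page: window.location.pathname, // the page on which the interaction happened
    at: new Date().toISOString(),
  });
  // Fire and forget: the user's interaction is not blocked on the result.
  void fetch("/log", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: payload,
  });
}

// e.g. calling logInteraction("image-zoom", objectId) from an AJAX handler
// means each zoom is counted even though no new page is loaded.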

Some Answers ...

Simply, Why Should We Bother...

Museums must continue to pioneer on the Web.
We have extraordinary content: niche, long tail content as well
as high-profile 'exhibition friendly' content. We
also have people who are among the best in the world and
certainly the most knowledgeable about their fields.

The opportunities we have as a sector for touching real people
with what we do are immense. To do this we need to find
technologies which bridge the gap between 'us' and 'them'.

Obviously, the environment needs first and foremost to be
right for Web 2.0: it needs to fit the
organisation, the process and the particular application. But,
provided this is the case, examples from the bigger players
(YouTube, MySpace, Google, Yahoo, Flickr) can demonstrate the
immense power, and profitability, of user generated content, the
mashup environment and distributed ways of working. Real-world
examples are starting to emerge of how UGC can boost Web site
traffic and profits. Museums are also beginning to contribute
specific examples of how to use these tools effectively in our sector.

Provided expectation can be managed, one of the strongest
things about Web 2.0 is its media visibility. Unlike server-side
technologies, or 'deep tech' such as hardware, Web
2.0 is now talked about, debated, and endlessly cited in the
public media. Museums can look to this publicity and use it to
their own advantage. Having already said that 'doing Web
2.0' just for the funding is wrong, it is also fair to
point out that funding follows significant social
movements: money is usually available for technologies
that pioneer new ways of engaging users.

Cultural and Political: 'It's My Content ...'

Ownership of Web sites, content, and user-facing resources has
always been a challenge and it is unlikely that this is going to
change in the short term. On the one hand, Web 2.0 approaches can
be considered difficult in these environments; on the other, it can
also be demonstrated that UGC and associated Web 2.0 technologies
can bring real benefits to educational audiences and also provide
a powerful marketing tool. Increasingly as museums and other
respected organisations such as the BBC produce this kind of
content and begin to embrace what it means, case studies are
becoming available which show that users feel empowered and
engaged by these new content approaches.

Often concerns about how UGC may damage a brand or the
authority of that brand can be alleviated very simply by good
graphic and interface design: for example, users understand now
that certain reviews on Amazon are from the publisher whilst
others are from other users. There is no confusion here about
'authority'. We can afford to mix and match our curatorial 'expert' content
with contributions from those who have different sets of experiences.
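
A minimal sketch of the underlying idea, in TypeScript, is shown below: each piece of content carries an explicit source, and the interface labels curatorial and visitor contributions differently. The types and markup are illustrative only.

// Sketch: tagging each piece of content with its source so that the interface
// can label curatorial text and visitor contributions differently. The types
// and markup are illustrative only.
type Source = "curator" | "visitor";

interface ContentItem {
  source: Source;
  author: string;
  text: string;
}

function escapeHtml(text: string): string {
  return text.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
}

function renderItem(item: ContentItem): string {
  // Visitor contributions must always be escaped before being output.
  const label =
    item.source === "curator"
      ? `Curator's note by ${escapeHtml(item.author)}`
      : `Visitor comment from ${escapeHtml(item.author)}`;
  return `<article class="${item.source}">
  <h3>${label}</h3>
  <p>${escapeHtml(item.text)}</p>
</article>`;
}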

Technical

Technical concerns are often difficult to unpick, but in many
scenarios the core technical and IT teams are themselves the first
users (the 'Early Adopters' - see below) of
these kinds of technology. As such, these teams are incredibly
useful sources of information when looking for innovative
solutions to Web 2.0 issues.

There are understandable concerns with distributed computing,
but often technologies exist which can be brought to bear on
these issues. The 'API approach' – where
systems provide data via Web services or other network
connectivity – is the way ahead for Web
development, and few deny this common-sense approach. Museums
should look to apply pressure to software suppliers to provide
well-documented APIs, and even embargo those who don't have
this functionality on their development roadmaps.
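
From the consuming side, the 'API approach' might look something like the TypeScript sketch below. The endpoint and the shape of the response are invented for illustration; the point is that a well-documented Web service lets your own front end, and third parties, reuse the same data.

// Sketch: consuming a documented collections API. The endpoint and response
// shape are invented for illustration only.
interface CollectionObject {
  id: string;
  title: string;
  maker: string;
  imageUrl: string;
}

async function searchCollection(term: string): Promise<CollectionObject[]> {
  const response = await fetch(
    `https://api.museum.example.org/objects?q=${encodeURIComponent(term)}`
  );
  if (!response.ok) {
    throw new Error(`Search failed: ${response.status}`);
  }
  const data: { results: CollectionObject[] } = await response.json();
  return data.results;
}

// A mashup might plot searchCollection("Stephenson") results on a map, place
// them on a timeline, or feed them into another site entirely.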

Resource, Cost and Content Legality

One of the most interesting things about Web 2.0 is that it
does not call for a sea-change: small-scale solutions can, and
should, be rolled out very easily. Benefits can be measured and
fed back quickly, and used as input into a virtuous cycle of
support for these technologies. This is Rapid Application Development
(RAD) for the Web: build it, test it, amend it, then rinse and repeat ...

From a human resource perspective, User Generated Content is
usually not the scary 'all we'll end up doing is
editing endless obscene comments 24/7' beast that it first
appears. The key here is to provide user-facing platforms which
encourage users to get involved but at the same
time have a certain barrier to entry to
discourage spam. The classic example here is asking users to
provide a valid email address. Not only does this kind of
approach discourage bots, it also means that the user is probably
serious about writing some content for your site. Users who
genuinely want to take part will cross these barriers. Those who
are simply looking to abuse your institution probably won't.
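
The 'barrier to entry' idea can be as simple as the TypeScript sketch below: a comment is only queued for moderation when a plausible email address accompanies it. The check is deliberately loose and the types are invented for illustration; it is a deterrent, not a security measure.

// Sketch: a deliberately simple barrier to entry for user comments. The
// types are invented for illustration.
interface UserComment {
  email: string;
  text: string;
}

function looksLikeEmail(value: string): boolean {
  // Intentionally loose: something either side of an "@", with a dot in the domain.
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value);
}

function acceptComment(comment: UserComment, queue: UserComment[]): boolean {
  if (!looksLikeEmail(comment.email) || comment.text.trim() === "") {
    return false; // prompt the user to try again rather than publishing
  }
  queue.push(comment); // held for moderation rather than published blind
  return true;
}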

Other strategies can always be brought to bear, which often
answer legal concerns. The BBC for example has a 'tell us
and we'll remove it within X hours' disclaimer on
their UGC. External moderation companies such as eModeration
(www.emoderation.com)
can be employed purely to moderate content. Part-moderation (pre-
and post-moderation) and asking end-users themselves to moderate
content have also been employed successfully on many sites. As
more institutions, including museums, start to work with these
kinds of techniques, the legal boundaries will become clearer and
better defined.

Measuring It

Our sector could, and should, play a leading role in helping
define what Web 2.0 metrics look like. By engaging with
government and other funding bodies early, we could begin to
shape some of the emerging standards. Sites like FeedBurner
(www.feedburner.com)
already provide (free) tools for measuring RSS usage. As a
sector, we should start to report on these usage figures for our
own feeds and begin defining what we think success looks like for
these different ways of producing and consuming content.

More Answers: Shaping the Curve

The Gartner hype curve, a modified version of which is shown
in Figure 3, provides a useful mechanism for understanding how
new technologies may be perceived. This can help us to develop
generic strategies which are appropriate at particular points on
the curve when dealing with any new technology.

Figure 3: Gartner Hype Curve (modified)

Early adopters present little or no challenge. These are
the people who are technically savvy; those who 'get
it' and are prepared to make do with (in fact, relish) beta
versions which are often buggy, incomplete, or don't quite
deliver in other ways. As the technology becomes more widespread,
often with increased media coverage, it moves up the curve,
before peaking at a point called the Peak of Inflated
Expectations. Here, media hype has expanded and extended the
original reach of the technology to realms often way beyond those
which are actually possible. Typically here we see the technology
presented as somehow being a panacea for everything; funding is
widely available; and, to quote a well-known blog,
'... your mum has one in her living room'.

At around this time, or earlier, voices of dissent begin to
be heard, typically in the non-mainstream media, blogosphere
or technical community. The technology isn't as good as the
hype, the screen tends to break ... the battery life is
short ... the wheels fall off ... Shortly afterwards the
technology begins an inevitable descent into the trough
of despair, before levelling out sometime later
on the service plateau.

The growth and bursting of the 'first Internet bubble' is a typical example of
Gartner hype, albeit one with a quite unprecedented and widespread impact.

The technology chasm which is indicated on
the graph is a critical time for any new technology. Given the
widespread understanding of how hype ebbs and flows, it is often
during the rising phase that technology is at the highest risk of
failing to gain a foothold in the mainstream. During this time,
the challenges to organisations (as outlined above) are at their
most acute.

In order to avoid potentially useful technologies failing to
bridge this chasm and to reduce the time it takes for useful
technologies to move from use by early adopters to more
widespread usage, there is a need to adopt an appropriate set of
strategies. There is also a need to manage expectations, so that
organisations are realistic both about the
capabilities of a particular technology and about the difficulties which
may be experienced in realising them. Similarly
there is a need to minimise the 'trough of despair' and to ease
the transition to a stable service plateau – until, of
course, the next disruptive technology arrives.

Avoiding the Chasm

The following approaches may help to shape the Gartner hype in
a Web 2.0 environment:

Advocacy:

It is not necessarily true to
say that IT innovation should only be deployed in response to
clearly articulated user requirements. The take-up of the Web in
the early to mid 1990s came about because
organisations, once they had seen the Web, recognised its potential
to support current business requirements
and also to provide new services which hadn't been considered previously.

Listening to and addressing concerns:

The advocacy of the potential benefits should be followed up by a period of listening to
concerns and addressing issues which may be raised. There need
not, however, be a clearly identified solution to all of the
concerns. Solutions may emerge as more experience is gained.
Alternatively it may be that concerns are not as significant as
initially thought.

Supporting enthusiasts:

It would probably be naive to expect everyone to be willing to accept a major
new technological development. Rather than waiting for general acceptance, an
alternative approach may be to support those who are enthusiastic:
people who may still have concerns but are willing to experiment.

Refining approaches:

It is important to ensure that the experiences (positive and negative) gained by the initial
adopters are noted and fed into refinements of the final service deployment.

Risk assessment:

It may be a mistake to expect innovation to
be completely risk free. Rather, any potential risks should be
identified and assessed. There will be a balance between the
risks associated with deploying an innovative service and the
risks of doing nothing. The latter, for example, could be that
one's competitors take the risks, leaving one's own organisation marginalised.

Managing expectations:

The need to promote potential benefits in
order to overcome inertia has to be balanced against the danger of
overselling the benefits of a technology, or of understating the effort
needed to ensure that the technology can be used in a sustainable fashion.

Sharing experiences and expertise:

Conferences and events (such as the Museums and the Web and the
Museums and the Web UK conferences), mailing lists (such as the MCG JISCMail
list) and resources such as the QA Focus briefing documents
(http://www.ukoln.ac.uk/qa-focus/documents/briefings/)
can help developers learn about innovations and share
implementation experiences.

Avoiding the Trough

Once technologies become over-hyped there is a danger that
disillusionment will set in when the technologies fail to live up
to expectations. The 'trough of despair' can be avoided by:

Low risk and low cost solutions: In a Web 2.0 environment in
particular, there may be low risk solutions which can be
deployed for little cost. Hosted services such as Blogger provide
resilient, flexible and, most importantly, free means to build Web 2.0 platforms.

Flexible business cases: At these early stages it can be
useful to examine existing business models and reflect on the opportunities which
new technologies may provide. For example, providing RSS feeds
about the museum can allow third party aggregators to expose this
information to their user community – which may result in
visits to your physical museum from groups who might otherwise
have proved difficult to reach.

Quality assurance: There will be a need to develop and deploy
quality assurance processes which document both the policies for the
service and the systematic procedures which will ensure that those
policies are being correctly implemented.

Managed transition into a service environment: The enthusiasm which innovators
and early adopters may have is not likely to be sustainable when
the innovation is deployed in a service environment. There will
be a need to manage this transitional stage.

Migration: The
planning stage for the deployment of a new service should also be
the time when plans are made for the migration of the service to
a new environment. This should include the export of data held in
the system and testing processes for importing the data into
alternative services.

Risk management: Migration of data is one aspect of a risk
management strategy. A risk management strategy should also
include aspects such as planning for server unavailability,
performance problems, etc.

Openness and Transparency: A simple technique for minimising possible
risks associated with innovation is to be open with one's
user community. If a new service is being trialled, inform the
users of this, and be honest about possible dangers. You may find
that they appreciate being informed and involved in the
experimentation.

Professional
development: There will be a need to ensure that those
who are involved in the development work have suitable training.
There will also be a need to ensure that other members of the
organisation have a better understanding of how Web 2.0 is being
used and how possible risks are being managed.

Conclusions

This paper has argued that the museum community needs to
continue to strengthen its understanding of the Web 2.0
phenomenon. Although 'doing Web 2.0' just for the
sake of it is a danger, the opportunities which Web 2.0 presents
are incredibly exciting, particularly given the content that
museums have and the audiences that we seek to engage.

It is only by working with these technologies 'in the
wild' that we will begin to understand exactly what the
benefits and risks of these approaches can be. This paper has
identified some of the more common barriers to engaging with
these 'new' approaches and has suggested strategies for
overcoming these barriers. Continued peer dialogue will be the
strongest means of building engaging and relevant Web 2.0
experiences within the museum sector.

References

Biographical Details

Brian Kelly works for UKOLN, a centre of
expertise in digital information funded by the Museums, Libraries
and Archives Council (MLA) and Joint Information Systems
Committee (JISC) of the Further and Higher Education Funding
Councils. Brian's job title is UK Web Focus - a national Web
coordination and advisory post. His areas of work include Web
standards, Web accessibility and quality assurance for digital
library development activities. A current key area of work is in
describing what Web 2.0 is and developing strategies for
exploiting the benefits which Web 2.0 can provide whilst
minimising potential risks.

Mike Ellis is Website Manager at The Science
Museum, London. He looks after several websites for the Museum,
which between them attract well over a million visits a month. As
well as managing the operational running of the sites, he spends
a lot of time building e-strategy and policy frameworks. He is
particularly keen on developing innovative multi-channel content
which puts users at the centre of the equation and which crosses
real-virtual boundaries.