

Opinion

A monitoring and evaluation activity for all think tanks: ask what explains your reach

Nick Scott has published an interesting post on the ODI blog: The 2012 ODI online awards – and some insights they offer into ‘success’. On Think Tanks was spared his critique of organisations and people who publish lists of their most popular posts of the year (probably because our post offered some analysis of the list), but his detailed analysis of the various reports, posts, events and other content still poses a challenge for us.

Nick’s post focuses on ODI’s content, so it should not be taken as an assessment of the best out there. Rather, it illustrates an interesting monitoring (and learning) exercise for any think tank, one that should be at the top of any communications lead’s to-do list. Or, if we follow CGD’s example, something researchers themselves should pay attention to.

Thinking about the goals of altmetrics — identifying content that’s more relevant, more interesting, novel, or important, and doing so as quickly as possible after publication or, better yet, helping authors to find the best match for their works — made me wonder if we’re missing some obvious alternatives to metrics, ones based on words rather than numbers. Not alternative metrics (or altmetrics), but alternatives to metrics (which I’ll call alt2metrics).

He argues that:

There is an obsession with metrics that borders on fetish and can be rather unhelpful;

Often the people and organisations promoting them are interested in selling them to us, in making us play ‘their’ game; and

Intangibles can matter a great deal more than the things we can measure.

This last point he expands with the following idea:

Perhaps key to all this is the fact that metrics take time to assemble — they are delayed, and secondary to activity. Non-measurement-dependent signals are more important, anticipatory, and upstream from metrics. And they are what scientists rely on every day to guide them and their searches for information. It would be a shame to spend all our time on secondary, derivative measurements while primary, original signals of value are ignored or downplayed inappropriately.

His analysis is worth considering when reading through Nick’s own (and do remember that Nick has argued something quite similar).

Global audiences are increasingly relevant on certain occasions, so where audiences come from should feature prominently in any analysis:

Events were broadcast around the world through video streaming and reached viewers around the globe. There were registrations to watch from as far afield as Zimbabwe, Iraq, Pakistan, the Philippines and the three countries themselves. Overall, we logged registrations from 48 different countries in all five continents

VIPs still matter:

The blog around the graphic received 2,017 views. The big story, however, is not how many times it was viewed but who it was viewed by. We happen to know that this info-graphic was shared among the negotiators on climate finance at the Doha summit.

Alternative metrics demand alternative analyses, too; things are not always what they appear to be:

The ODI Centre for Aid and Public Expenditure conference saw 113 tweets mentioning the event page. However, this again has a lesson, and it is one that will become increasingly important to take on board as AltMetrics become more important to judging academic success. The issue is what makes up a figure – the 113 tweets about the conference include a number of tweets by ODI researchers and staff promoting the conference. Given this, a more likely winner for the most tweeted piece of content would be Inconvenient truths about corruption and development, a November 2012 blog written by Marta Foresti, which was tweeted about 99 times.
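The adjustment Nick describes, stripping out an organisation's own promotional tweets before ranking content, is easy to sketch. The handles, content labels and counts below are entirely made up for illustration; this is just a minimal example of the filtering step, not anyone's actual pipeline:

```python
# Hypothetical sketch: recompute a "most tweeted" ranking after
# excluding tweets sent by an organisation's own accounts.
# All handles, labels and counts here are invented for illustration.

from collections import Counter

STAFF_HANDLES = {"org_staff1", "org_staff2"}  # hypothetical in-house accounts

# Each tweet is a (author_handle, content_id) pair.
tweets = [
    ("org_staff1", "cape-conference"),
    ("org_staff2", "cape-conference"),
    ("reader_a", "cape-conference"),
    ("reader_b", "corruption-blog"),
    ("reader_c", "corruption-blog"),
]

def external_mentions(tweets, staff):
    """Count mentions per content item, ignoring staff self-promotion."""
    counts = Counter(cid for author, cid in tweets if author not in staff)
    return counts.most_common()

print(external_mentions(tweets, STAFF_HANDLES))
# With the sample data above, 'corruption-blog' outranks 'cape-conference'
# once the two staff tweets are excluded.
```

The point of the sketch is simply that the raw count and the externally-driven count can produce different winners, which is exactly the reversal Nick reports.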

Regular updating builds an audience:

That is the fastest growing research site I’ve seen in the six years I’ve been at ODI. Perhaps that is because it has, in those months, published 207 posts – that is over a post a day. This includes posts from people at numerous organisations working on the subject. This regular updating has made it a vital source for information in the sector, a fact also evidenced by the 719 followers it has quickly accrued on Twitter, including leading figures in the post-2015 debates.

Third-party posting, and using other people’s networks, can be more effective than attempting to keep it all in-house; this works for publishing in other spaces, too:

This points to another of the issues with top 10 lists, especially for ODI given our ‘being there’ digital communications strategy that encourages posting to large or sector-specific blogs wherever possible to reach a wider or more directly relevant audience.

Most people are looking for jobs (this appears to be also true of some of the most-read newspapers in many countries), so we should not be fooled by ‘visits’:

around 70 percent of visitors to the ODI site come to research or look for jobs. They’re not necessarily in an interacting mood.

Action beyond reading is an important indicator of success:

And my final award of the year goes to the blog that managed to get most comments on the ODI site – a key indication that the blog has generated engagement and interest.

Don’t forget the old stuff; it is often work done many years ago that keeps getting the most downloads and the most interest:

For many years now, this 2003 online toolkit has been by far the most viewed piece of content. In 2012, the toolkit received 30,310 views and the downloadable version was accessed 8,410 times. Why is this still getting so much attention? One word: Google.

Beyond the specific lessons, this kind of analysis is valuable for any organisation. I would even argue that it is really all that is necessary: asking questions about impact on policy simply goes too far. It escapes the legitimate role of most think tanks and demands assumptions that make any assertion highly unreliable, and therefore useless. This kind of analysis, by contrast, offers many useful ideas for immediate and future action.