Don’t forget that if your research publications are on LRA, they can be accessed by anyone in the world, unlike those behind publisher paywalls. Simply by sharing the unique identifier (handle) on an email list, on a webpage or via social networks, you should find that your access rates and citations climb yet further.

As a special Monday treat – here’s the third guest post from my team, this time from Valérie Spezi.
______________________________________________________________________________

I went to a very interesting conference last Friday on Open Access and its impact on libraries and librarians, organised by the Repository Support Project (RSP). As usual with RSP events, the conference was very well organised, venue and catering included, and the selection of speakers was of great quality.

The conference was well-attended, mostly by librarians from all over the UK and across sectors, which wasn’t really surprising given the title of the conference. Librarians from the Higher Education sector formed the bulk of the audience; but it was nice to see that there were also a few publisher and funding agency representatives, as well as information and library consultants.

The conference objective was to provide librarians with greater knowledge of Open Access: what it is and what it isn’t, how it is currently developing and shaping scholarly communication systems, and what impact it’s having on libraries. Far from preaching Open Access at any cost, the selection of speakers offered a balanced view of what is achievable within Open Access and what isn’t.

Bill Hubbard started off the conference day with a presentation setting out the background to Open Access (OA): what OA is and what it isn’t, and what the rationale and drivers supporting OA are. For most of us, Open Access simply means ‘open to read’ (including cache, save and print), and most material available today on Open Access is open in this way. But it was interesting to learn that the original Budapest vision started out with a more unifying view of an open world in which material would be open to read and to re-use. The reality, however, proved much more complex and made this ideal of a totally open information world difficult to achieve. Bill then presented the usual drivers for OA (the serials crisis, the moral case, the financial rationale and academics’ need for speedy access to scholarship). More importantly, he added a further driver: ‘because we can!’ We are witnessing major changes in the information environment; Open Access is one of those changes, and it can’t be ignored when we have both the technology and all the good reasons to do it. Finally, Bill went through the usual misconceptions about OA (that it subverts peer review, replaces publication, invites plagiarism or attacks copyright) and explained why these arguments against OA were erroneous or sometimes even illogical. In conclusion, he suggested that what is needed to build OA are systems and workflows to support researchers, institutions and funders, who all seem to be in favour of Open Access but face great challenges individually.

The second speaker was Alma Swan, from Key Perspectives Ltd, who has done a great deal of work on scholarly communication and Open Access (it is only fair to note that she and her business partner pioneered key research studies on researchers’ attitudes towards Open Access in 2004/05). Alma’s talk was about the economics of OA, and its aim was to present the costs and savings at system level (Houghton et al. report [2009]) and at university level (Swan [2010]). She first presented, in great detail, the results of the Houghton et al. report, which looks at costs and savings under three different OA scenarios (self-archiving, or green OA; repository archiving with overlay services; and open access journals, or gold OA). She then moved on to her own work, which builds upon the Houghton report: she used the Houghton methodology to build a limited number of case studies looking at costs and savings at institutional level.

In conclusion, the main idea from Alma’s talk was that universities differ greatly, and therefore results differ greatly from one institution to another: whereas the Houghton report indicated that the UK scholarly communication system as a whole could enjoy substantial savings across the sector, Alma’s work indicates that research-intensive institutions may end up yielding negative savings (i.e. paying a cost) in the move towards Open Access, and therefore the national OA savings case needs to be managed so that some universities are not individually disadvantaged.

Following on was an interesting talk from Wim van der Stelt, Executive Vice President of Corporate Strategy at Springer, one of the biggest STM publishing houses. Besides presenting the various open access options offered by Springer, Wim made a few interesting points:

Firstly, Springer has adopted a strong Open Access position, meaning that all its open access content is open to read and to re-use as in the Budapest initiative, save for commercial purposes. According to Wim, the main reason for this is that end-users generally don’t care about copyright, hence Springer’s decision to go strong OA in order to simplify its copyright policies. Wim emphasised that Springer leaves copyright with authors and is generally happy with just a licence to publish.

Secondly, Wim bluntly took on the role of publishers in today’s scholarly communication system. While publishers’ role in distributing or disseminating scholarship was acknowledged to be a thing of the past, it was said that publishers create value-added services for their authors, though what exactly those value-added services consist of was not really debated. In short, Wim reminded the audience that publishers are in the scholarly communication market first and foremost to make money, hence the legitimacy of open access embargoes enabling publishers to ‘rightly’ monetise content. So, yes to Open Access as long as there is a business model, and BMC, recently acquired by Springer, has certainly proved that Open Access can be extremely lucrative. But Wim also insisted that publishers are equally eager to please their customers, i.e. the research community, and that Springer wouldn’t go down the open access route if it felt there was no demand for it from the research community it serves.

Thirdly, Wim talked about the difficulty of introducing new OA journals, as they often carry high fees while they do not yet have an impact factor.

Wim concluded his talk by saying that OA is believed to be here to stay as a complementary business model, a conclusion I understood was also shared by some of the other speakers at the conference. As Bill repeated several times, Open Access is not THE solution to today’s scholarly communication failures; it is only one component of the changes scholarly communication is going through.

Susan Ashworth, Assistant Director for Research and Learning Support Services, presented the development of Enlighten, the University of Glasgow’s research repository, which holds just over 35,000 records (but only 3,600 full-text items). Susan talked about the organisational/staff structure of Enlighten, and it was interesting to see that the University of Glasgow went for a manager-less unit with a very strong support team consisting of cataloguers, a service development manager and an advocacy manager. Of the many common drivers listed by Susan, one stood out: the imperative need for research management that the RAE brought about in Higher Education, a role Enlighten seems to fulfil just fine. Other interesting aspects of Susan’s talk were the excellent work the Library has done on author disambiguation using the Glasgow Unique Identifier (GUID), the delivery of subject feeds via Twitter, and the hot-linking of publications to information that is valuable for research management, such as grant funding and funder names.

Dave Carr, the Open Access Adviser at the Wellcome Trust (WT), was the next speaker. The WT’s OA policy was introduced in 2005 and made mandatory in 2006, requiring WT-funded researchers to make their publications available within six months of publication. Dave estimated compliance with the policy to be close to 50%. But the WT is aiming high, and raising the compliance rate is one of the trust’s top priorities. Dave’s talk was therefore all about how the WT can persuade researchers of the benefits of Open Access and help its researchers make their publications available on OA, with a particular focus on the need to establish communication with researchers. Too many researchers are still unaware of OA funds or self-archiving practice, which is why the WT is working hard to get the message across. In conclusion, it was felt that sanctioning non-compliant researchers was not the way to go for now, and that the WT would strive to persuade researchers rather than adopt a punitive approach.

Chris Middleton, from the University of Nottingham, presented a case study on institutional funding for OA publishing. The main driver for setting up the OA fund was the looming REF. Some figures were provided: 353 OA fund requests were made over four years, of which 140 were made in 2009/10, representing circa £171,179. The average cost per article at the University of Nottingham was estimated at £1,317, but payments varied widely, ranging from £277 to £2,990. The message Chris tried to get across to the audience was the paramount importance of budgeting for those OA costs (where is the break-even point?), but also that this is a very difficult task for a university or library to set itself. Chris based her calculations of OA costs and savings on a slightly modified version of Alma Swan’s economic model.

Jackie Wickham, the RSP Open Access Adviser, reported on a survey of repository staff she conducted over the summer of 2010, distributed via the UKCoRR list. Besides some useful data on repository staffing (such as that 76.2% work part-time, that 73.8% work as part of a team, and that repository staff tend to be highly educated), Jackie also offered a summary of the skills repository staff (managers and administrators) thought were important: communication (getting the message across), interpersonal skills, project management, determination, perseverance and patience.

Finally, the conference day came to an end with Paul Ayris, Director of Library Services at UCL and President of LIBER, who presented a selection of Open Access projects he is involved in, such as:

DART-Europe, the principal gateway for discovery and retrieval of OA European theses, and how the EThOS project fits into this.

Europeana Libraries, a sort of Google equivalent (in itself an ambitious endeavour…) for European quality-assured research content.

LERU (League of European Research Universities) – a consortium of 22 research-intensive universities in Europe lobbying at the European level for the promotion of research.

In conclusion, this RSP Open Access conference was very informative and enlightening, and helped us understand the debate, the drivers, and also the challenges of Open Access. In the words of Bill, the conclusion would be that OA enables publishers and librarians to channel the huge changes currently taking place and to ensure quality control of the research made freely available to anyone. Far from delivering anarchy in scholarly communication, OA helps stakeholders organise and channel the mass of open research content.

It won’t come as a big shock to anyone that the new Government has made good on one of its pre-election promises to push back the REF-2012. You can read more about it, and the response from the sector, at the following locations:

On Wednesday 17th March, several colleagues from Leicester and I visited the Pilkington Library at Loughborough University for the EMALink event Subject Librarians… defining their mission, measuring their impact, preparing for the future.

Subject Librarians are feeling a little uneasy about their job security these days. This is due to events such as those at Bangor University, where several subject librarians lost their jobs, and the more recent events at Warwick University, where subject librarians had to re-apply for their jobs at a lower grade. So, the aim of this event was to look at what we do and how we can show our worth.

Loughborough did a survey of academics in the Departments of Civil & Building Engineering, English & Drama, and Materials Engineering to assess the impact subject librarians have on their communities.

They got a 27% return rate and felt that they were probably preaching to the converted, as the respondents were generally those already known to library staff.

25 out of 29 respondents knew they had a subject librarian and 22 could name their subject librarian.

What was interesting was the difference in how academics rated the skills they thought subject libs should have, compared with how subject libs themselves rated the same skills.

Subject knowledge (not just knowledge of information resources) was rated highly by academics, as was the ability to keep up to date, whereas subject libs rated subject knowledge highly, but not as highly, and thought presentation skills were pretty important.

When asked which services subject libs should be able to help with, the academics rated the top three as copyright advice, putting content into the institutional repository, and finding journal impact factors.

Copyright came as a surprise, as the University has a copyright officer who is not based in the library.

What also surprised the subject libs were the glowing testimonials that accompanied the surveys, which they hope to use in marketing their services at a later date: comments such as “Invaluable”, “Important” and “Skilled Professionals”.

They tried to do some social network analysis based on the responses (i.e. how the academics and subject libs were related, who knew whom, etc.), but the sample was too small.

They hope to take the research further with a new bid for funding, widening the survey to non-users, measuring departmental use of the library management system and analysing subject libs’ communications with academics.

From the findings of this initial survey, they are looking at the issue of offering copyright advice, offering research impact training to departments (which has raised the usage of JCR), and marketing the subject libs better to the academics.

Chris gave a short introduction to this session, musing on the question: what is a subject librarian?

Are we there to improve services? As experts in our field? As a gateway to collections?

Are we endangered? Should we have functional skills or subject skills? Do we suffer from poor job definitions (Pinfield, S, 2001)?

We should be positive in response to change/challenges.

What do we do? Are we moving into new roles? Do we need new ways of working (Roberts & Levy, 2005)? How do we demonstrate value?

We then split into groups to try to write a subject librarian mission statement (see photos). Our group got distracted by talking about the differences in what we did, whether we taught and how we supported research.

One of the things we identified with in Chris’ talk was being compared to “middleware”, as we sat between the library and the department, and had to represent the views of each to the other.

We then broke for a speed-dating lunch, but I was too busy chatting to people to do any speed-dating!

Average age of articles accessed varies between disciplines (3-4 years old for bioscience, typically 8 years for History)

Life Scientists are the biggest users of ejournals overall, Economists work more at weekends than others, and Historians are the biggest users of Google as an access route (?!)

Average download cost (calculated from total cost of 06/07 subscriptions) was 80p

In the end the study concludes that yes, ejournals do represent good value for money, and that “per capita expenditure and uses of e-journals strongly correlated with numbers of papers published, of PhD awards, and of research grants and contracts income.” Interestingly though they end with a question: does good ejournal provision enable an effective research environment, or does a strong research environment create the need for good library services?

The h-index (Hirsch Number) is a metric that is increasingly becoming of interest to researchers, especially in the light of the REF. An h-index is “a number that quantifies both the actual scientific productivity and the apparent scientific impact of a scientist”. You can work it out manually, but to be honest you’d need to be mad or a bibliometrics fiend to want to.
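If you do fancy working it out by hand (or by script), the calculation itself is simple: for a list of citation counts, the h-index is the largest h such that h of the author’s papers each have at least h citations. A minimal sketch in Python (the function name and example figures are mine, just for illustration, not anything from WoK):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    # Sort citation counts from most-cited to least-cited.
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        # The paper at position `rank` still has `rank` or more citations,
        # so at least `rank` papers clear the bar.
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author with six papers:
print(h_index([10, 8, 5, 4, 3, 0]))  # → 4: four papers have ≥ 4 citations
```

The same counting logic is what the Citation Report does for you behind the scenes, only across every record the search returns.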

I’ve been asked by a few people how to find it, and each time I totally forget how! So in the light of this, here’s my step by step guide to discovering an author’s h-index automatically using that wonderful Web of Knowledge tool!

1. Enter the author’s name in the format surname initial* (e.g. raven e*)

2. Change the search option in the drop-down menu to Author

3. Click Search

4. At the top right of the results is the option to Create Citation Report. Click this.

5. The analysis appears, along with the person’s h-index.

It seems simple, but I was scratching my head using WoK until I discovered that I needed to search just Web of Science, not the whole of WoK, in order to get the value. And so, now you know! It is worth noting that you do have to be fairly exact in your author naming conventions, as the citation report will not run for more than 10,000 result records.

I did wonder whether selecting individual papers from the list of results, between running the search and creating the citation report, would make a difference, but it appears this has no effect on the citation analysis; for example, selecting 5 papers from a list of 120,000 doesn’t enable me to run the citation report. It appears to run in an all-or-nothing manner. Or maybe there’s a trick here I’m missing?