Friday, January 27, 2012

For a successful career
Find something to measure
And measure the f**k out of it.

Simple targets work best. For example, if you're measuring interactivity, count the number of Twitter @ replies or followers. And definitely count the number of hits on your web properties originating from Twitter, Google+ or wherever. If you insist on fancier tools, there are plenty available, although I'm slightly dubious of the value of tools such as SocialBro, and outraged at the spurious influence claims made by Klout. So keep it simple and measure your impact by setting clear targets to aim for.
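As a minimal sketch of the "keep it simple" approach, the snippet below tallies hits arriving from social networks by counting referrer domains in a web server access log. The log lines and the list of "social" domains are illustrative assumptions, not tied to any particular analytics tool.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical access-log lines: "timestamp<TAB>referrer URL".
LOG_LINES = [
    "2012-01-27T09:15\thttp://twitter.com/ajcann/status/1",
    "2012-01-27T09:20\thttps://plus.google.com/u/0/stream",
    "2012-01-27T10:02\thttp://t.co/abc123",
    "2012-01-27T10:30\thttp://www.google.com/search?q=elearning",
]

# Referrer domains we treat as "social" traffic (an illustrative list).
SOCIAL_DOMAINS = {"twitter.com", "t.co", "plus.google.com"}

def social_hit_counts(lines):
    """Count hits per social-network domain in the given log lines."""
    counts = Counter()
    for line in lines:
        referrer = line.split("\t", 1)[1]
        domain = urlparse(referrer).netloc.removeprefix("www.")
        if domain in SOCIAL_DOMAINS:
            counts[domain] += 1
    return counts

print(social_hit_counts(LOG_LINES))
```

A weekly run of something like this gives you a simple number to put against a target, which is all the measurement most of us need.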

For more in the same vein, be sure to follow the live video stream from Martin Weller's keynote talk at #dr12vitae.

Wednesday, January 25, 2012

Aim: To help you understand how social media can support your research and which tools are the most appropriate for you to use.
Date: 26 Jan 2012
Time: 10:30 - 13:30
Location: University of Bath 1E 3.6
Speakers: Alan Cann, University of Leicester, with Jez Cope and Geraldine Jones, University of Bath.
Funded by the EPSRC Knowledge Transfer Account.
This workshop will show you how you can use social media to help your research and your career. Social tools have important implications for how researchers (and educators) communicate and collaborate. This session will provide you with information to make informed decisions about using social media and help you select from the vast range of tools available. Social media has downsides as well as upsides, but on balance there is real value for researchers, from information discovery, through dissemination of your research, to impact metrics.

The hashtag for the session is #bathcr. I don't know how many of the participants will be tweeting, but if you'd like to follow along and contribute between 10am and 1pm tomorrow, that would be great.

Tuesday, January 24, 2012

Last week I put up a manuscript for open peer review (It's academic publishing Jim, but not as we know it). In that post, I explained my reasons for doing this rather than going down the conventional (journal) academic publishing route. The review process, which I arbitrarily set at 14 days, is still running, but in this post I want to discuss my reflections on the process to date.

As I expected, reviews started to come in rapidly, 7 within the first 48 hours, then stopped equally rapidly. Internet attention is transitory, but in part this is a reflection of the fact that I drew the blog post to the attention of a number of people by email, inviting reviews. However, this pattern is typical for Internet content - a fast decay phase followed by a longer, slower tail (The Spread of Scientific Information: Insights from the Web Usage Statistics in PLoS Article-Level Metrics. (2011) PLoS ONE 6(5): e19917). There were none of the "spam" comments I had anticipated, and even though I had attempted to make clear in the post that anonymous reviews were entirely acceptable, all reviewers chose to identify themselves. Ironically, this is a concern: while I suspect that reviewers consider named reviews to be somehow more "valid", I am worried that potential negative reviews are simply not posted, rather than being contributed anonymously. Interestingly, relatively few colleagues from my own institution, whom I had alerted by email, contributed a review. In part this may be because they were wary of possible conflicts of interest. When I repeat this process in future, I will simply post the article and reviewing guidelines online, without individual email notifications. Another concern for the future is the possibility that familiarity may breed indifference, limiting the number of reviews received.

I am grateful to Martin Weller for his additional comment on the review process:

"I tried to put my official reviewer hat on and review it as if I was doing a standard (blind) peer review. It may be that this is an inappropriate transfer of process, and instead I should adopt a different style for open, informal review. But we fall back on what we know. My review may be a bit harsh, but I was conscious that 'asking your mates to review' isn't really comparable to anonymous peer review at all. I might be far less likely to criticise a friend. My colleague Gill Kirkup maintains that anonymity in the peer review process is essential because it protects the reviewer, particularly a young reviewer who is reviewing a paper by someone eminent in the field. Of course, it also allows people to be ruder than they would be otherwise, and often to say incorrect judgements because there is no debate or come back. So this may be a good way to get feedback on a paper, but would it equate to peer review? I don't think so, but then maybe it's a sufficient filter to allow publication and then post-review. It's also quite a brave thing to do and I suspect many colleagues might be reluctant to go this route. If you write a crap paper that gets rejected by a journal, only a handful of people have seen it - if you do it this way, potentially hundreds will."

A number of people commented on various forums that I was "brave" to expose my work in this way. It doesn't feel brave to me, it feels liberating, although possibly foolish. Specifically, it feels far less brave than exposing my work to non-transparent peer review. Maybe I've just had a run of bad luck, with editors taking capricious cost-based decisions to refuse to even send my work out for review. Entering that lottery - now that's brave (or foolish). Accepting that my peers may tell me that my work is of little or no value (and I have no doubts about the honesty of people who responded, so I feel confident they would), the whole process feels right to me. If some papers are slammed, then I either work on them further or abandon the concepts they contain.

So will I repeat this exercise in future? Most definitely - I already have a manuscript in mind, although this one is perhaps more of a technical report than an investigation. Will this become my sole future publication channel? No, not because I do not believe in it, but because there are circumstances (collaboration with junior colleagues, for example) where the alleged kudos attaching to publication in conventional journals is important for their careers. Should you repeat my experiment? That's up to you, but if you feel your circumstances permit, I would encourage you to try it for yourself. As I commented on Frances Bell's blog, "... I am not suggesting the approach I have taken is the “best” solution, nor necessarily appropriate for everyone – I have already identified a number of flaws. I do suggest that it is an improvement on the current model of closed, and frequently capricious, peer review. Open is good. If we support open access, why not open peer review?".

Monday, January 23, 2012

The fuss over Apple's launch of iBooks last week obscured what could have been much more important - the launch of the "new" version of iTunesU, together with the accompanying free iPad/iPhone/iPod Touch app.

At first, I was excited by this, because it appeared that this was iVLE, aka VLE in the cloud. And the iPad app is very nice. But sadly, the app functionality is not replicated well in iTunes, thus cutting out students who do not own iPads, and all Windows users. iPhones/iPods are OK for listening to a couple of podcasts, but no one in their right mind is going to attempt a full-blown statistics course on an iPhone. And the content on iTunesU is still as variable in quality as it ever was.

Presumably Apple could not see a revenue angle in iVLE. Oh, what might have been.

Thursday, January 19, 2012

In my module questionnaires our students say they want face-to-face sessions - but do they? Very few students attend voluntary help sessions intended to supplement detailed notes online. In the past I have tried "Office Hours", but again there were very few takers. I put this down to the fact that our students are not familiar with the "Office Hours" culture.

So what is the way forward? How do I give the "personal touch" with >250 students? Any suggestions? (Google+ discussion)

Tuesday, January 17, 2012

At our monthly PedR meeting yesterday we discussed the following paper:

Mark Huxham, Fiona Campbell, Jenny Westwood (2011) Oral versus written assessments: a test of student performance and attitudes. Assessment and Evaluation in Higher Education 37(1): 125-136
Student performance in and attitudes towards oral and written assessments were compared using quantitative and qualitative methods. Two separate cohorts of students were examined. The first larger cohort of students (n=99) was randomly divided into ‘oral’ and ‘written’ groups, and the marks that they achieved in the same biology questions were compared. Students in the second smaller cohort (n=29) were all examined using both written and oral questions concerning both ‘scientific’ and ‘personal development’ topics. Both cohorts showed highly significant differences in the mean marks achieved, with better performance in the oral assessment. There was no evidence of particular groups of students being disadvantaged in the oral tests. These students and also an additional cohort were asked about their attitudes to the two different assessment approaches. Although they tended to be more nervous in the face of oral assessments, many students thought oral assessments were more useful than written assessments. An important theme involved the perceived authenticity or ‘professionalism’ of an oral examination. This study suggests that oral assessments may be more inclusive than written ones and that they can act as powerful tools in helping students establish a ‘professional identity’.

I enjoyed reading the paper and was happy to see oral assessment "winning out" over writing as the sole means of assessment. Nevertheless, I was disappointed not to see any accounting of the time taken for oral versus written tests - in reality, this is the factor likely to scupper any back-to-the-future return to the Socratic method.
Our discussion at the PedR meeting pulled out a number of statistical errors and other possible confounding factors not discussed, but overall we agreed this is a good paper worthy of note. What a shame the authors did not subject the manuscript to open peer review to make it an even better paper.

Vitae in partnership with The British Library are running Digital Researcher 2012: an innovative, thought-provoking one day event to help researchers make the most of new technologies and social media tools in their research. Designed for both postgraduate researchers and research staff within any UK institution, this interactive event will be held at the British Library on Monday 20th February 2012, and will provide an opportunity for researchers to think about how they undertake research and to consider whether new technologies could improve their research.

Although this popular event is now full, there is a waiting list, and this year, we are looking to make the online event better than ever so that everyone can participate.

Monday, January 16, 2012

I have a manuscript currently in press with an academic journal which describes work that we performed three years ago. In part, the fault for the delay in publication lies at my door, but the original version of the manuscript now in press was written 18 months ago and first submitted for publication over a year ago. There followed a catalog of errors, some due to me, others due to editors and journals. The current incarnation of the paper was submitted six months ago to the journal where it will shortly appear. It is still not published. I should feel lucky - others have had worse experiences than this:

"Because the work described in the paper had already been talked about in public forums and included in grant applications, and because publication was important for moving forward with our grant applications, job applications and other papers, we felt we could not spend another year in the review process. The very essence of the scientific process is to challenge paradigms and share the experimental details with other scientists who can then reproduce or refute the findings. Publication is key for this process. We needed to publish."

An efficient and effective system for interactive student feedback using Google+ to enhance an institutional virtual learning environment (PDF download via Dropbox)
Update: Final version now published
Abstract:
Whether or not you take a constructivist view of education, feedback on performance is inevitably seen as a crucial component of the process. However, experience shows that students (and academic staff) often struggle with feedback, which all too often fails to translate into feed-forward actions leading to educational gains. Problems get worse as student cohort sizes increase. By building on the well-established principle of separating marks from feedback and by using a social network approach to amplify peer discussion of assessed tasks, this paper describes an efficient system for interactive student feedback. Although the majority of students remain passive recipients in this system, they are still exposed to deeper reflection on assessed tasks than in traditional one-to-one feedback processes.

How it works:

Please read the manuscript then leave your review as a comment on this blog post. Please use page and paragraph numbers to refer to specific sections of the manuscript.

Reviews may be named or anonymous as you wish.

To expedite the publication process, this manuscript will be open for review for 14 days from today.

Following the review period, all substantive reviews will be taken into account and the manuscript revised accordingly. (My best estimate from blog stats is that between 1,000 and 2,000 unique visitors view the content on this site. If 1% of visitors take the trouble to leave a substantive review, that's a much more rigorous review process than any academic journal I am aware of.)

If the majority view is generally positive, the revised manuscript (including reviews and author responses) will be published on the Leicester Research Archive.

I think this is as efficient and transparent as I am able to make the academic publishing process, but if you have any comments or suggestions, I welcome them. Most of all, I would welcome your review of the manuscript as a comment here. I cannot offer you any payment or other inducement beyond the knowledge that you will be helping to fix the broken model of academic publishing. And of course, given the opportunity, I will be happy to reciprocate your time in reviewing any papers I feel competent to comment on should you wish to participate in a similar process.
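As a back-of-the-envelope check on the numbers above, the sketch below works out the expected range of substantive reviews from the visitor estimate stated in the process (the 1,000-2,000 unique visitors and the 1% response rate are both figures from the post; the function name is illustrative).

```python
def expected_reviews(visitors_low, visitors_high, response_rate):
    """Expected range of substantive reviews for a range of unique visitors."""
    return (round(visitors_low * response_rate),
            round(visitors_high * response_rate))

low, high = expected_reviews(1_000, 2_000, 0.01)
print(f"Expected substantive reviews: {low}-{high}")  # prints "Expected substantive reviews: 10-20"
```

Ten to twenty substantive reviews would indeed compare favourably with the two or three a typical journal manuscript receives.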

Notes:
Other options considered for sharing the provisional PDF were Slideshare and Google Docs. These were rejected due to problems with PDFs being reformatted, and Dropbox was selected as the best general-purpose solution, but potentially any site which allows free PDF downloads would be suitable. If this blog had been hosted on WordPress, that would have been a suitable choice, but Blogger does not allow PDF uploads.

Thursday, January 12, 2012

"As an outreach vehicle, blogs with well-structured messages and delivery mediums reach beyond the uni-directional information provision typical of many scholarly communication efforts to connect with readers and compel them to look critically at sources of information; to search out more information; and, ultimately, to influence practices. The flexibility and ease of publishing a blog allows for greater engagement between researchers, stakeholders, and the public through rapid dissemination of commentary and analysis on research. The accessibility of new media, such as blogs, helps create a multi-way dialogue and exchange of ideas so as to complement traditional communication avenues used in research, teaching, learning, and extension work carried out at higher education institutions.
Recognition and reward frameworks used at higher education institutions to evaluate scholarly activities have been structured around traditional forms of academic publication. New media, such as blogging, provide new channels for conducting and disseminating scholarly work. We suggest that ample evidence can be provided for new media practice and products to be considered for promotion and tenure within an academic portfolio."

I was particularly interested in Ostrom's discussion of the "theory of the firm" (entrepreneurs) and the "theory of the state" (rulers), and how this relates to academics working in universities struggling with OER production and use. All I have to do now is read Hobbes' Leviathan, something I had an inkling I might need to do when all my university contemporaries had it on their shelves while I had biochemistry textbooks. Ostrom's solution to common-pool resource (CPR) allocation is to address three problems: supply, commitment, and mutual monitoring, giving us a framework for addressing OER issues. What is apparent from reading the many case studies analysed is the absence of heavy-handed institutional intervention in successful and stable CPR allocation.

Although Ostrom states that "in a highly competitive environment, those that do not search for and select ... rules that enhance net benefits will lose out to those who are successful in adopting better rules", the question remains, in terms of OER adoption, whether universities are competitive or are in fact a cartel.

Monday, January 09, 2012

The recently published second edition of Steve McKillup's Statistics Explained: An Introductory Guide for Life Scientists is an excellent introductory textbook, theory-based and nicely contextualized in the life sciences. I'm adopting it for my R-based statistics module next term, where it will sit nicely alongside the practical aspects of using R.