Tuesday, April 27, 2010

Helen Muir's "UKSG 2010, Librarians and Open Access"

I promised you more snapshots of UKSG conference summaries, and here's one from Helen Muir, Research Support Librarian at Queen Margaret University (excerpted from Helen's post here). Helen's piece adds further useful examples from experience to some of the issues discussed at the conference.
I attended UKSG on Tuesday 13th April, and the first three speakers (Dorothea Salo, Eelco Ferwerda and Jill Russell) were all keen advocates of open access, and from what I could gather, open access had been discussed during the previous day also. Jill's presentation included details of the pilot project that she has been involved with at the University of Birmingham, where three colleges have been allocated funding to publish their research findings following a gold open access publishing model. At my own institution, following the gold route has already been ruled out - we simply do not have the funds both to pay for journal subscriptions and to pay for researchers to publish with journals, and despite funder mandates to make research publicly available, I know that researchers here are not as yet keeping money aside to pay an open access publication fee. I imagine that this is the case for most other institutions; indeed, Jill was quick to point out that her institution could not afford to pay these fees for the majority of researchers there, leaving some of those not taking part in the pilot somewhat disgruntled.

All of this has led me to look at open access from a different angle. In my own role, I am actively encouraging researchers at my institution both to publish their research following an open access model and to deposit their work in our repository. I've been doing this for about a year and a half now, we have our open access mandate in place, and I thought that encouraging researchers to make their papers open access would eventually lead to the tipping point where open access journal publishing would overtake subscription-based publishing. It's a lot more complicated than that though, isn't it? As well as the researchers needing to be convinced, there is also the much less talked about hurdle (and from what I've seen it's a big one) of librarians who are still happy to follow the subscription-based journal publishing model. I don't mean this as a criticism of library budget holders and serials librarians, who are working very hard to negotiate with publishers to retain access to as many journal titles as they can with their ever-decreasing budgets. Academic libraries have students who are paying for their education and expect to have access to journal articles, and academics who expect access to current research to do their jobs - the idea of simply stopping paying subscriptions as an individual institution, or even as part of a consortium, is unthinkable at present, when the backlash from students and academics at losing access to journals would be so great.

Discussion of open access does have to be opened up further to the whole academic and research library community, and not just remain mostly within the world of repository practitioners and developers. I also wonder about the effects of CILIP's reporting on open access to the wider library community, with another less-than-positive review in this month's Update entitled Open access could cost some universities dear, says Jisc report. (I'm not putting the link to this in here as CILIP members will know where to look, and for the rest of you CILIP Update is not open access - sorry.) Hopefully the more positive coverage that open access has received at UKSG10 will help to redress the balance of this.
Helen's post was commented on by Stevan Harnad, who concluded that "Universities need to commit to mandating Green OA self-archiving before committing to spend their scarce available funds to pay for Gold OA publishing. ... Journal subscriptions cannot be cancelled unless the journals' contents are otherwise accessible [via green, not gold, OA] to a university's users." Helen's response adds further useful experience to this discussion (emphasis is mine):
I understand and support your argument fully, but also witness some drawbacks as well. Queen Margaret was a very early mandate adopter, but unfortunately this has not meant a dramatic increase in deposits in our repository. Indeed, most of the material that finds its way in gets there because I've actively searched for it, then chased researchers up for the papers. Only if I'm very lucky do I find a researcher who has kept a draft that the publishers' copyright policies will allow me to put up following the green model. I do believe that the message is slowly getting through to some of our researchers however, and think that this is encouraging (I was sent a green OA paper today that I can deposit, but I have to honour a six-month embargo first. The embargo is quite off-putting for some researchers and was a key reason for Birmingham piloting a gold model, according to Jill Russell at UKSG10).

One of my main concerns is my perception that librarians in academic libraries are not directing students and academics towards OA resources, e.g. when the institution does not subscribe to a particular journal - it is not yet within our culture to explore databases of institutional repositories such as OAIster (I've not had a chance to have more than a quick glance at http://mimas.ac.uk/irs/demonstrator/ yet, but was very pleased to hear of its existence earlier today). It seems to me that I am contradicting myself when I recommend to students that they use the bibliographic databases for the best journal results, but then tell researchers that search engines such as Google will index their articles in the repository, thus making them accessible to a much wider audience. Searching for open access resources has to become an integral part of information search strategies.
Helen's posting and the resulting discussion add some additional nuances to the paper given at UKSG by Jill Russell, and the wider open access debate. Can readers of this blog add comments from their own experience? (And in this context I propose that we stick to experience / evidence-based comment since discussions elsewhere are giving sufficient focus to the theoretical aspects.)

(Syndicated with permission from http://libraryresearchsupport.blogspot.com/2010/04/uksg-2010-librarians-and-open-access.html.)


Monday, April 26, 2010

UKSG Conference Summary

This is the summary I put together for my SAGE colleagues. I am reposting here in case it is of use. Comments on theme omissions welcome.

The 33rd Annual UKSG Conference was held in Edinburgh a couple of weeks ago, with a varied programme and over 850 attendees. Themes gleaned from the sessions and discussions I attended are summarised below. The very active Twitter stream from the conference can be found here. I was amazed at the ability of delegates to listen, assimilate and then tweet or blog all at the same time!


Social Media
An increased focus on social media and new ways of sharing information and research was very evident, along with much higher engagement with, and understanding of, newer communication channels. Interestingly, it was not younger researchers who were reported as spending more time blogging or commenting online, but more established researchers who have already secured a reputation and can afford to spend time in this way. They are contributing to knowledge and growth in their field, but not just through the tradition of peer-reviewed high impact journals. An interesting development would be a unique researcher identifier, such as ORCID, which would help tie all of a researcher's output together.

ResearchBlogging is interesting: its focus is on serious posts about peer-reviewed academic research. It covers over 1,200 blogs and has doubled in size in the last year. Adam Bly from ResearchBlogging explained that more collaborative science and the creation of new knowledge have the scientific information industry running to keep up. We’ll be contacting them to ensure posts on SAGE content can be linked back to the original research.


Open Access
OA feels like it is slowly gaining traction as a publishing model. The vibe from various sessions was less 'if' than 'when'. Tony Hirst was emphatic about the new openness of communication channels - why use traditional journal articles to share ideas, he asked? If he or his colleagues do decide to publish an article, it will be in OA journals. However, no clear wide-ranging path for institutions to fund OA was apparent yet.

Usage
JISC Collections have secured funding to progress with the JISC usage stats portal, or “make it real”. We'll be contacted for our input; the prototype involved data from Elsevier, OUP and Springer. Its aim is to present usage data in user-friendly ways for librarians and to include new ways to benchmark usage. There is still no clarity on confidentiality issues, and no resolution was proposed on the confidential nature of usage data. It’s early days.


Big Deals, Value & Pricing
An interesting session on value had Ted Bergstrom from UCSB explaining that he is securing information about big deal pricing so he can publish information about outliers in the public domain. His reason: “As citizens of the academic community, we are interested in helping librarians to understand the dynamic economic problem that they face and aiding them in negotiating effectively with large publishers. We plan to release a collection of information and analyses that will serve this purpose.” See his Big Deal Contract Project page.

As in previous years, the mood seemed to be that the Big Deal bubble must burst, as it is unsustainable for many institutions, but still no clear way forward was proposed.

Carol Tenopir talked about developing real tools for librarians to demonstrate value of their collections. She pointed us to the ARL website for more information.

Jill Emery from Texas talked about patron-driven ebook and journal article acquisition, stating 'the age of the article is here'. She explained that libraries need “aggregated article access”, and she wants publishers to listen, as libraries have to purchase “just in time, not just in case”.


Quality Metrics
Frustration with the reliance on the Impact Factor, and the fact that it ranks journals and not articles, was apparent. There is a desire to produce new ranking metrics; the Australian research assessment exercise is no longer using the Impact Factor, we were told.

Pete Binfield (PLoS) ran a session on how they have introduced article-level metrics - it’s worth a look if you haven’t seen it. They recognise they are very much at the beginning, but are very keen to do whatever they can to help users decide which content is highly valued by the community. Also, these metrics are not just about evaluation; they also help users filter and discover articles of value.

Pete talked about how they still have work to do on how to measure “influence”. It’s important to demonstrate influence beyond the scientific community. This ties in with our work to show the value of social science research. How do we ensure the research we publish is credited appropriately when it influences Government policy, for instance?


As Hannah Whaley says on her blog: The discussion around these issues is healthy, as is the growing volume with which librarians and researchers are willing to speak them out loud. However these key themes are notable for representing problems, not solutions. It is clear that licensing models, researcher metrics, electronic and open access still have some way to evolve to meet the growing needs and expectations of the community.


Hannah Whaley's UKSG summary

Hannah Whaley is an Assistant Director in the University of Dundee’s Library and Learning Centre, with responsibility for Research and Systems. She specialises in system design, service development and innovation within HE teaching and research. Hannah recently wrote a great blog posting identifying key themes at this year's UKSG conference, and she's kindly agreed that we can syndicate her posting here. Read on for a snapshot of the conference from Hannah's point of view - does it tally with yours? More snapshots coming soon!
The 33rd Annual UKSG Conference was in Edinburgh this week, with a varied programme and over 850 attendees. A number of themes started to recur through the sessions and discussions, as summarised:

  • Big deal bubble must burst, as it is unsustainable for many institutions
  • We must move further towards open access, but it is not yet clear how
  • Journal impact factor isn’t good enough anymore; we need to review the commentary and produce new ranking factors
  • Linked information is nearly here, allowing informal and pre-publish conversations to be viewed and measured in a structured way on the web
  • The age of the article is here, meaning metrics, usage and discoverability will increasingly be at article level rather than the ‘journal container’
  • Just-in-time must replace just-in-case, as no one can maintain a full array of items that may only occasionally be required

The discussion around these issues is healthy, as is the growing volume with which librarians and researchers are willing to speak them out loud. However these key themes are notable for representing problems, not solutions. It is clear that licensing models, researcher metrics, electronic and open access still have some way to evolve to meet the growing needs and expectations of the community.

(Syndicated with permission from http://www.hannahwhaley.com/2010/04/18/uksg-main-themes/)


Monday, April 19, 2010

The cringe parade: conference photos online now

This year's conference was fabulously photographed by my very charming nemesis, Simon Williams. (He is my nemesis because I do not like to be photographed unawares, and he was very good at being unobtrusive as he sidled about capturing the conference in full flow).

Enjoy the photographs over on Simon's site. Some personal favourites:

And many more, from the very moment the conference opened, through all the plenaries and lots of breakouts, and a good number of the parties ... go and see if you can find yourself! (Not in a new-age sense.)

Thursday, April 15, 2010

A library for the 21st Century - is e-only finally a possibility?

Monica Crump and Neil O'Brien presented on their work at NUI Galway, where they have taken the decision to move towards the 'promised land' of a library with e-only journals wherever possible.

They outlined the recent history in Irish politics and how research has become a key strategy for the Irish Government in recent years. Ireland is a relatively small country with a population of only 4 million people, and the NUI Galway Library (previously Sunday Times University of the Year) serves 17,000 students. They have a tradition of being early adopters of new technology (SFX, MetaLib, Primo etc) and can see the potential for great benefits - such as savings on storage, binding and staff costs - in moving towards e-only.

Over time there has been an evolution in academics' attitudes towards e-journals, and the collection management policy of the library has evolved too, to a position where online subscription is now recommended over print where available.

In Ireland the cost of going e-only is amplified by VAT rates (print 13.5% and electronic 21%), but this has not deterred the NUI Galway Library, who believe they save on the costs associated with print (i.e. storage, shelf space, staff time and binding) and are committed to going e-only wherever possible.
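To put the VAT gap in concrete terms, here is a minimal sketch of the calculation - the €1,000 net price is entirely hypothetical, and only the two rates (13.5% print, 21% electronic) come from the presentation:

```python
# Hypothetical illustration of the Irish VAT gap between print and
# electronic journal subscriptions (rates as quoted at UKSG 2010).

PRINT_VAT = 0.135       # Irish VAT rate on print journals
ELECTRONIC_VAT = 0.21   # Irish VAT rate on electronic journals

def gross_cost(net_price: float, vat_rate: float) -> float:
    """Subscription cost including VAT, rounded to the cent."""
    return round(net_price * (1 + vat_rate), 2)

net = 1000.00  # hypothetical net subscription price, in euros
print_cost = gross_cost(net, PRINT_VAT)            # 1135.00
electronic_cost = gross_cost(net, ELECTRONIC_VAT)  # 1210.00
vat_penalty = round(electronic_cost - print_cost, 2)  # 75.00
```

At these rates, an e-only copy of a €1,000 title carries €75 more VAT than its print equivalent - multiplied across hundreds of titles, a real budget consideration, even if NUI Galway judge the print-handling savings to outweigh it.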

As part of the project they set up a Collection Management Committee to consider any issues raised by academics such as quality of images used online being inferior to print and to ensure core titles (JAMA, the Lancet, BMJ etc) are kept on display. Academics were informed of planned print cancellations and given time to submit any pleas for retention by a set deadline.

This is a legacy project and when complete they hope to be up to 75% e-only by this summer. Moving away from practices which began in the 19th Century and creating a library for the 21st century has meant making big changes (staffing; job descriptions etc) to ensure the library team have the necessary skills. The renewals process too has been so complex that it has taken Swets longer than usual to finish...with some still outstanding.

However, at NUI Galway they appear to be making steady progress towards an e-only library - something that they (and others I'm sure) have been talking about for the past decade. It was interesting to see how far they have come.


Real Challenges in a virtual world

Philippa Sheail gave a fascinating insight into the challenges of information provision in the virtual world as a part-time (16 hours a week outside work) student on the postgraduate MSc in E-Learning at the University of Edinburgh.

The course is taught completely online and she is particularly interested in the debate surrounding ‘e-learning’. Because it’s a huge variety of different things, there needs to be further discussion about what e-learning is and isn’t, she said, and referred to the 2007 and 2009 JISC reports on the subject.

Philippa talked about the different interfaces she uses (VLEs, online discussion boards, Second Life etc) and talked the audience through how she (and fellow students) use them. Students can see who else is online when they need help at any given time, which is useful as students are based globally and work in different time zones. Discussion boards can be a bit intimidating to students not confident in their level of subject knowledge, but can be really useful. For some modules on the course, students have to keep a blog which is assessed. Tutorials and virtual corridor chat are via Skype and Second Life. Students really appreciate the feeling of being together virtually even when not together in the ‘real’ world. Different University Schools (Management, Education etc) have buildings in Second Life, and graduation ceremonies take place in Second Life and real life simultaneously, with the latter filmed and streamed into the virtual environment. Students can talk to each other in Second Life with microphones, but prefer to send text messages as it is easier to refer back to earlier ‘conversations’ and keep track of online discussion.

On one particular module of the course, ‘E-Learning and Digital Cultures’, public Twitter feeds are used, and generally students are increasingly following academics ‘live’ as a way of developing a greater awareness of the subject and discovering what the academics are particularly interested in. The trail can lead to other academics in the subject area and creates a wider teaching group for students. Zotero is used to see what others are reading too… both peers and tutors.

Advice from the library is to ‘use the advanced search on Google’ at the least, but Philippa says she uses Google for cross-searching all the time (sometimes Google Scholar) and, interestingly, said that sometimes Google Scholar doesn’t take her to the article she wants but Google will - via a link on the author’s blog, for example, to a ‘free to download’ PDF from their book. She complained that it can be problematic to access journal content sometimes, as she needs to remember to log in first or think about where she is trying to access it from. Is the content in a journal or a database; is an abstract enough or does she need the full article? Often she just needs to get to the full text quickly and doesn’t care how.

‘I love e-books,’ she said, highlighting Dawsons and NetLibrary in particular, and said that she uses Amazon all the time as a quick and easy interface for finding what exists before going to the library to do further research.

All in all, Philippa’s session was very interesting…so much so that I asked her if she would come and visit us at SAGE. I’m sure others would be interested to hear her views too.


Wednesday, April 14, 2010

E-book readers in a mobile-friendly library

Breakout Session B: E-book readers in a mobile-friendly library / Alison Brock, Open University

Although e-books have hit critical mass amongst consumers in terms of awareness and increasing interest, libraries have yet to progress their e-book agendas due to hardware limitations and licensing restrictions.

Alison Brock reported on the joint Open University and Cranfield University e-reader project. The main aim was to explore students' working practices with e-readers, to see what kinds of content were available to them, and to see how this could inform the development of library services.

Six participants, from undergraduate to PhD level, were chosen at each institution, using four Sony PRS-505 and two iPod Touch (8GB) devices. Students were given the devices for three months starting from August 2009. The devices were not preloaded with content. Since the study, the Kindle has become available in the UK and the UK launch of the iPad is imminent.

Prior to the survey less than half the group had used e-readers.

Usability
Students found the devices easy to use and easy to read although no specific testing was carried out with visually impaired students. They also liked the portability and lightweight feel of the devices. The colour screen of the iPod Touch and its multipurpose functionality was appreciated, although it’s highly reliant on wi-fi access.

Barriers
The weaknesses predictably centred on difficulties in finding and uploading content to the devices, and the single-purpose nature of the Sony. Most participants would not buy either model tested, even if they would consider an e-reader. Presumably different answers would be received in a post-iPad landscape. The main barriers to use were formatting issues, navigation, the inability to annotate or interact with text, and how tiring the devices are to use. There is also no way to link between books or link out to other content.

Similar studies undertaken in the United States and the UK found similar results. The business model for e-readers is still aimed at single users purchasing individual titles for their devices. Although ePub is the most common format, it is not used on all devices - Amazon’s Kindle, for example. Users cannot transfer purchases from one device to another. Licensing restricts libraries and library users from downloading e-books to mobile devices. Obviously cost is a factor: if e-readers were priced more competitively - at £50, or at least below £100 - it would encourage more rapid take-up, and experimentation among libraries to see how they can best be used for educational purposes.

Most of all, it is an area of constant flux. Other manufacturers and consumers are waiting to see how the Apple iPad performs in the market; Google has announced a new tablet device and other companies are following suit. Until the hardware develops to meet the study needs of learners, licensing terms change, and it becomes clearer how e-readers will impact the development of library services, academic libraries are unlikely to invest heavily in them on a large scale.


A rollicking end to the conference with Marc Abrahams, founder of the Ig Nobel Prizes and the Annals of Improbable Research (which, as I recall, used to be hosted alongside lots of very serious research on IngentaConnect!). Marc focusses on achievements that make people *laugh* - but then make them *think*. He and his team choose ten Ig Nobel Prize winners every year, from around 8,000 nominations. "It's not easy to win one of these prizes, and in most cases, if you win, we give you the opportunity to decline this 'honor' ... most people choose to accept."

The Ig Nobel Experience
Ig Nobel Prize certificates are signed by 'real' Nobel prize winners, and are accompanied by a specially-designed award that reflects the annual theme. The annual awards ceremonies are packed - 1,100 people (bigger than even this year's monster UKSG!) - and the awards are presented by a raft of Nobel prize winners. (Winner's speeches are chaired by a feisty 8-year-old girl who repeats "Please stop. I'm bored." at ramblers.)

Some Ig Nobel winners
  • Cows who have names give more milk than cows who are nameless
  • Whether it is better to be smashed over the head with a full bottle of beer or with an empty bottle (the Peace Prize!)
  • The directors of four Icelandic banks (economic prize)
  • Creation of diamonds from tequila (chemistry prize)
  • Does knuckle-cracking lead to arthritis? for 60 years of left-hand-only knuckle-cracking (medicine)
  • Analytically determining why pregnant women don't tip over (physics)
  • The bra that converts to a pair of facemasks in an emergency
Marc demonstrates the journal's old distribution method (prior to online hosting) by chucking a bunch of hard copies into the audience. Not quite as fought over as a ball at a baseball game, but pretty popular nonetheless.

(I have to run away now so hope that I am not about to miss something very funny!)

Sense and Sensibility

For the closing plenary Brendan Dawes starts by telling us that love and art don't make sense - to a machine. We make decisions that make no sense at all, and enjoy things that are completely pointless. One of the reasons we have such an emotive reaction to the iPhone is that Apple are not afraid to include fun, pointless things in their design process.

App designers are picking up on the need for technology to have a human element. Brendan highlights 'It's A Clock', which tells the time the way humans do - it's just coming up to 12.30, by the way :-) We need to continually question accepted practices and methods, which means not necessarily always designing the most efficient thing. To quote Hendrix - you've got to know what goes between the notes, not just the notes.

Brendan then takes this concept and applies it to data - the transition of the data is very important. DoodleBuzz is a great example of this. It moves away from the 'click' paradigm of using the web and encourages people to doodle across the canvas (webpage) to pull and sort information in different ways. This concept is taken further forward by the Magnetic North website, which allows you to draw to pull up snapshots of information about the company, rather than just presenting pages with CV-like descriptions of it. This is more typical of human-to-human interaction, rather than human-to-machine interaction. A site I would recommend you all go and look at - now!

Brendan ends by quoting Muriel Rukeyser - the Universe is made of stories not atoms.


And now, the end is near ... taking digital risks

As we approach the end of the conference, I am under the influence of the usual mixture of exhaustion, elation, a faint sense of regret at everything I didn't manage to do, and a definite sadness that it's all over for another year. So I'm looking forward to being cheered up by our Light Programme, which kicks off with Brendan Dawes from magneticNorth, imploring us to "Stop Making Sense". That should be easy enough, given my mental and emotional state.

Those pesky robots
Machines, on the other hand, don't have emotions. Brendan shows us "desire lines" to remind us of the random nature of human desire, in comparison to machines' programmed processes. He distinguishes between human and machine communications using "It's a Clock" (which tells the time like a human, rendering "it's nearly quarter to nine" rather than "The time is eight forty three a.m." - *BIG laugh*).

Humanising through risk-taking
To create connections between human beings and objects, for example by creating beautiful architecture, you don't always have to create the most efficient thing, says Brendan, and this goes against machine logic. He shows us a Cecil Balmond bridge with a big kink in the middle - "bonkers... but everyone loves it. And the bridge was much stronger, as a side effect." Why aren't we taking more of these risks digitally? "No-one's going to die, taking risks, digitally."

Enhancing data with context
Next up he quotes Jimi Hendrix (yes, Jimi Hendrix, not Ginny Hendricks). "You've got to know more than just the technicalities of notes; you've got to know what goes between the notes." It's about what he *doesn't* play. And we get a comedy rubbish robot reading of "Under Milk Wood", by Brendan's Mac, versus a sonorous rendition by the lovely Richard Burton. "If you apply that logic to data in an RSS feed, the way you present is very important. The flow of the information is important - transitions. The iPhone makes things more human because of the transitions."

Rethinking how we interact with the web
"I like the idea of discovering things over time" - objects, people, persons. The rules of information design are not written in stone, and there is always more to discover. Brendan's company have created a gestural interface for navigating content - forget clicking, it's all about drawing shapes and lines to reflect the way people naturally do things in the way that they browse. "It was just offering an alternative." The iPhone has been a game-changer - "it's an exciting time".

"The universe is made of stories, not atoms" - Muriel Rukeyser. Focus on the human elements, not the science. (As a marketer, I completely agree - and in the context of a conference, for example, the best presentations are those that focus on a story, a structured narrative, not just a wealth of information to be imparted.) "Wonderful things come out of serendipity."

E-book readers in a mobile friendly library

Alison Brock, Open University, talked about a joint project with Cranfield University to look at how e-book readers could be used in a library setting. OU have a “digi-lab” of technology such as ebook readers, even a Wii console, to help tutors explore ways of using new technology in teaching.

The aim of the project was to explore students' working practices with e-books. There were a total of 12 participants, using a mixture of Sony e-readers and iPod Touches (Kindles weren't available in Europe at the time of the project). Students covered a mix of levels and subjects and were given the ebook reader to use for three months. The project team conducted a pre-pilot survey and start-up workshops on how to download books etc. A Ning forum was also set up for blogging, news about the project, and technical help, and end-of-project surveys and interviews were also held.

Less than half the participants had used e-books at all before the project, and those who had used them had only done so on PC/laptops. Participants hoped e-books would help save paper, be more portable and lightweight than books and help them find things more easily.

Sony reader strengths were that it was:
  • Good for sequential, narrative reading
  • Lightweight, portable
  • Easy on the eyes

Weaknesses:
  • Slowness of navigation
  • A bit “clicky and clunky”
  • Only does one thing, e-books only

The verdict on the iPod was that it's:
  • A “nice gadget”, it does other things
  • Portable, pocket sized
  • Page turning easy on touch screen
  • Coloured pages aided reading

Weaknesses:
  • Tricky to get content on
  • Screen size just a bit small
  • Reliant on wifi

The post-pilot survey found that most participants had used the reader for more than just study, including listening to music and audio books, reading fiction and games. However, overall they found that the devices were limited in their functionality. The students said it was tricky to get content onto the devices, and use for study was difficult even for tech-savvy users: they were lukewarm about the idea of borrowing e-readers from the library. Most would not consider buying the model they'd used. The main barriers (particularly for study purposes) were formatting issues (eg PDFs, diagrams, images), navigation, not being able to annotate or highlight text, and the fact they found the devices tiring to use.

The OU also found that library-subscribed e-books were only licensed for PC use, not for downloading onto e-book readers. They even found that it could be impossible for libraries to buy suitable downloadable copies: in one situation, a student had to buy the book themselves and claim back the cost, as the library couldn't buy it even with a credit card due to the licensing issues.

Participants also complained that it was difficult to locate suitable e-book content to use, as it's available across so many places.

With text-based, sequential reading, they did see the advantages of portability, and felt they could work more on the move and print less. The iPod was more popular than the Sony Reader, but most still preferred the idea of a laptop which could do multiple things.

Conclusions of the project:
  • Ebook readers are designed for reading fiction not academic texts (may change with arrival of iPad etc)
  • They will only play a part in how people study, not replace textbooks altogether
  • Potential for loaning out pre-loaded e-book readers? Potentially, but there have been issues in the US around the Kindle, and conflicting advice on whether loaning pre-loaded readers infringes terms of service
  • Potential role for libraries in facilitating and guiding students to e-book content, and also negotiating better licence agreements for commercial e-book content

Students' wish list for an ideal e-book reader would be:
  • Screen A4-A5 size
  • Touch screen
  • Ability to highlight/make notes
  • Internet access
  • Easier to transfer content quickly direct to device
  • Lower retail price

They thought the OU could help by:
  • Loaning out e-book readers with course materials and readings pre-loaded
  • Offering help with finding appropriate e-book content
  • Having better systems for transferring existing course materials onto reader eg OU courses being turned into ePub format

So, is 2010 the year of the e-book?

Similar e-reader projects have been run at Penn State University Library, North West Missouri State University, Princeton University, and the Darden School of Business, University of Virginia. However, there are still some big issues. There's the more general question of how e-textbooks will be made available in terms of licensing and pricing (mobile e-readers haven't even been part of that discussion yet). Most manufacturers and content providers are still working on the one-reader, one-book model, aimed at individuals not libraries. The technology is still being developed, and is still dependent on proprietary formats.


Research quality: responses from the floor

Questions from the audience, and answers from the panel, relating to Richard "call me Dimbleby" Gedye and the Four Horses of the Research Quality Apocalypse

Ed Pentz (audience): is there a common definition / agreement on what quality or impact actually is?


Jim Pringle: we rarely articulate this. Quality is in the eye of the beholder - subjective. Is the work making a significant contribution to the field, changing it? Is it well constructed, convincing? From the funders' point of view, was it worthwhile?

Hugh Look: ye-es. I'm pessimistic about this. We can come up with words that everyone agrees on but will it really mean anything? We just manufacture more specialist language. (Rick Anderson summarises this on Twitter as "Shared language" doesn't necessarily equal "common understanding." - exactly.)

Alain Peyraube: who decides what is making a significant contribution to the field? It depends on judgements that inherently don't achieve consensus. And it changes over time.

Peter Shepherd: the easiest way to describe quality is that you know it when you see it! It helps create insight, and enable others to take things further forward.

Hugh (again): if peer review and metrics lead to risk-averse decisions, what do we do? Encourage plurality and diversity. Metrics have a chilling, crushing effect on plurality and diversity. We do need to do riskier things, that we can't foresee the outcome of. That is being lost from the system because we are focussing on targets.

Jim: quality is associated with value and worth. We talk about it because of related funding decisions. Challenge: what would be the evidence of worth and value that we should look for? The issues we have discussed are a failure of management to use tools correctly; they don't mean there should be no search for measurable value.

Judy Luther (audience): what's your sense of the future of the journal as a signifier of research quality?

Peter: journals are a service to authors, and a service to readers. On the harvesting side, journals will continue to be an important measure of quality (from the author's point of view) - represented by editors and editorial boards. Within the great mass of information, most of us need something like the journal's personality to trust our research to. From the reader's point of view, the journal as quality is "becoming more shady". The collection (database) or the individual article might become more meaningful than the journal, as a proxy for quality.

Alain: the journal will continue to exist. The problem is that by the time a paper is published in a journal with a good reputation, the content of the paper has already been discussed, criticised etc. (e.g. at conferences, several months prior to publication), so the article is too late to improve science - there is nothing new for specialists in the field; they already know the content, and they don't read it again. (On Twitter, Ed Pentz argues that "research may have moved on but in many fields old articles can have a huge impact".)

Hugh: we might start to see metrics relating to interestingness, relevance, rather than long-term quality. "Perhaps we need to look at temporal variation in the significance and meaning of metrics, to understand in a more sophisticated way what these things are doing for us."

Hazel Woodward (audience): the focus for assigning value is the article / journal, but research funds are allocated to researchers (individuals and institutions). Should we move to allocating research quality points to individuals / institutions, or is it not practical to pursue this?

Hugh: I feel uncomfortable - that this is an ill-thought out idea. It seems to be further atomisation, further subjective judgements, further bureaucracy. I instinctively feel that practitioners should take a fairly aggressive stance against this.

Richard Gedye: do you worry that it's already happening (RAE etc)?

Hugh: I worry about its use. In the end, it rewards the compliant, not the difficult - those who are performing. It fundamentally supports a structure that does not reward those who go against the grain.

Alain: It's already happening (in institutions) - Shanghai rankings etc. Attributing research quality points to researchers would invoke strong reactions.

Jim: if you're a researcher, and you have a body of work that has never been downloaded, never visibly has been read or shown or cited, shouldn't we ask some questions about the value of your research?

Peter: is this Question Time or I'm Sorry I Haven't a Clue? Points mean prizes! I don't in principle have a problem with awarding points, but the problem is the scale of the enterprise sitting behind this. A huge bureaucracy outweighs the benefit of the metric. The cost, in many senses, of playing this game, has become too much.


Richard "call me Dimbleby" Gedye and the Four Horses of the Research Quality Apocalypse

We're not too thin on the ground for this morning's 9am session, which suggests either that the conference dinner was a washout (it wasn't, as witnessed by my slight queasiness and faint headache) or that this morning's session is a big draw (it is - a new, Question-Time-style format for UKSG).

Jim Pringle from Thomson Reuters opens the show. We're talking about the value / quality of research, and the ways in which it can be measured. Jim suggests:
  • attention (citations, but also more generally)
  • aggregation (researcher -> institution, article -> journal)
  • relation (links, related content, metadata - lots of data stored around the institution can be useful in assessing researcher value / impact).
The Impact Factor is "foundational" in this space, and "will continue to be the touchstone for the evaluation of journals for a great deal of time to come." But the area is a "playground for mathematicians, these days". Eigenfactor, for example (a network-based factor), considers each journal as a node in a network, and looks at the relationships between different journals. These metrics show different results, in part due to the way in which different citations are weighted. Normalised aggregates enable us to compare collections across disciplines, and are arguably more flexible / useful.
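The network-based approach Jim describes can be made concrete with a toy sketch: treat each journal as a node and iterate a PageRank-style score, which is the flavour of calculation behind Eigenfactor (the real metric also excludes self-citations and uses a five-year citation window, which this sketch ignores). All journal names and citation counts below are invented for illustration.

```python
# Toy sketch of a network-based journal metric: a journal's score depends
# on the scores of the journals citing it, weighted by citation counts.

def journal_scores(citations, damping=0.85, iterations=100):
    """citations: dict mapping citing journal -> {cited journal: count}."""
    journals = set(citations)
    for targets in citations.values():
        journals.update(targets)
    n = len(journals)
    scores = {j: 1.0 / n for j in journals}
    for _ in range(iterations):
        new = {j: (1 - damping) / n for j in journals}
        for citing, targets in citations.items():
            total = sum(targets.values())
            for cited, count in targets.items():
                # each journal passes on its score, split by citation weight
                new[cited] += damping * scores[citing] * count / total
        scores = new
    return scores

toy_network = {
    "Journal A": {"Journal B": 10, "Journal C": 2},
    "Journal B": {"Journal A": 8},
    "Journal C": {"Journal A": 1, "Journal B": 1},
}
scores = journal_scores(toy_network)
```

Because the weighting flows through the network, a citation from a highly-cited journal counts for more than one from a rarely-cited journal - the key difference from a raw citation count like the Impact Factor.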

Tools are emerging that make use of metadata relationships (funders, co-authors, patents, etc) to more powerfully and accurately measure value. "But metrics are only as good as the people who use them, and can support judgements but should never be the sole ground for judgements."

COUNTER's Peter Shepherd walks us through PIRUS2, a project that is developing a standard for reporting usage at the individual article level. Journal-level metrics may not be representative of individual articles and therefore authors, and citation data is not adequate for measuring some fields. The H-Index is author-centred but can be biased towards older researchers. Overall, reliance on any one metric is misleading, and distorts author behaviour.

Article-level usage is becoming more relevant because:
  • more journal articles in repositories
  • interest from authors and funding agencies
  • online usage growing in credibility e.g. PLoS reporting, Knowledge Exchange
  • increased practicality - COUNTER, SUSHI
Challenges include the widespread distribution of articles - dispersed usage data to be captured. PIRUS aims to create guidelines for creating and consolidating article-level usage reports. Ongoing issues are technical (scaling previous work), organisational (underpinning business models), economic (allocating costs among stakeholders) and political (engaging stakeholders). Peter closes by quoting Joel Best - statistics are not magical, and need to be considered in context.
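The consolidation challenge PIRUS describes - dispersed usage data for the same article, captured from publisher sites and repositories alike - can be illustrated with a minimal sketch: events for the same article (keyed by DOI) are summed into one per-article, per-month report. The field names and sample records below are invented for illustration, not the PIRUS or COUNTER schema.

```python
from collections import defaultdict

def consolidate(usage_records):
    """usage_records: iterable of (doi, platform, month, downloads) tuples."""
    report = defaultdict(lambda: defaultdict(int))
    for doi, platform, month, downloads in usage_records:
        report[doi][month] += downloads  # sum across platforms per month
    return {doi: dict(months) for doi, months in report.items()}

records = [
    ("10.1000/xyz123", "publisher", "2010-03", 40),
    ("10.1000/xyz123", "repository", "2010-03", 12),
    ("10.1000/xyz123", "publisher", "2010-04", 25),
]
print(consolidate(records)["10.1000/xyz123"])
# {'2010-03': 52, '2010-04': 25}
```

The hard parts in practice are exactly the ones Peter lists - agreeing the schema, deduplicating robots and double-clicks per the COUNTER rules, and deciding who pays for the central consolidation - rather than the arithmetic itself.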

Alain Peyraube from CNRS picks up the baton, talking about the European Reference Index for the Humanities project. "We need appropriate tools (for measuring impact) or we are going to lose valuable grants." In the humanities in particular, such a methodology needs to address monographs, book chapters, edited volumes etc as well as journals. The ERIH steering committee took some time to set up the project, identifying which disciplines would be considered (lots of overlap with social sciences), selecting peer review as "the only practicable method of evaluation in basic research", setting up panels for each of 15 disciplines, providing guidelines and soliciting categorised lists of journals. The project suffered from considerable misunderstanding and criticism, especially from the UK scientific community, and particularly around the categorisation of journals. Both the panels and the journal lists have been revised since the project's inception.

Hugh Look takes the stage in a fabulous red blazer (Good Morning Campers!). Trying to attach numbers to things, says Hugh, is a difficult business - numeric targets are always open to abuse, and he points us to the LSE's Michael Power for a broader analysis of this. What's the benefit of measuring - and who benefits? What are the underlying structures, and what behaviours do they engender? How strong is the link between measurements and real-world impact? We're not currently measuring performance or quality - we're setting up a structure of control, and this risks unproductive use of public money. "The managerial class are the primary beneficiaries of the measurement culture." They manage risk to the institution - but also the risk to themselves, so they use metrics to safeguard their own position / avoid bad PR, and coerce practitioners into compliance by making it part of annual reviews. There is a flight from judgement in the way that metrics are used. Managers cannot do peer review themselves, so they mistrust it.

Hugh acknowledges that measuring things can break "who you know" elitism, but both he and Peter referred to "fetishism" around statistics. Some organisations, e.g. the RAF, are stepping back from using metrics to assess research quality - a "dawning of common sense". Metrics divert attention away from "other things that are more useful to do" - "what aren't we doing?"

(Questions / floor discussion I will post separately, or I'll be verging on LiveSerials' longest posting).


Anti-acquisitions librarians in the era of economic downsizing

Jill Emery opened the University of Texas Libraries' perspective with the comment that 'anti-acquisitions' isn't necessarily about cuts; it can be about core collection development, which is a much more positive message. What's essential for the library service to keep in-house?

Jill posed a model of collection development based on four levels:
  • Core collections: essential, won't be cut (including some big deals, when tied to consortial access)
  • Librarian selected content: small funds mostly based on endowment accounts and linked to specific subject areas
  • Patron selected content: mostly in the area of e-books at present, although also encompassing some rush print orders coming from the ILL request system
  • Print-on-demand content: as POD expands, this may become more of an option for supplying out of print content

Texas have also been developing “disapproval” plans, actively telling vendors what they don't want to buy, which is leading to a more granular selection of approval plan stock; they are also considering separating out approval plans and e-notifications (slip plans).

Book acquisitions are a target for a group of big US universities who want patron-driven purchasing of both print and electronic books. E-book vendors are on the whole nearly there, but print is further behind, although the fact that print still comes out before the e-version in many cases means there still needs to be the option.

For patron-driven acquisitions, there need to be set thresholds for purchase and cost, and it also needs interoperable vendor systems and LMS (something which is improving). Texas' experience does show that brief MARC records are OK: they find that users manage to find books and reserve them before they've come in, even just from basic title and author information.

Texas held a pilot project on patron driven article access, but felt that the level of customization wasn't good enough to go further, eg in terms of branding and making sure it was clear access was provided by the university. Other US libraries have moved further in this direction.

Print on demand is “lurking in the background”: their campus bookshop has bought an Espresso Book Machine, primarily for textbooks, but it opens up new options. Texas are finding that even new books coming in on approval plans are starting to be print-on-demand copies.

Dana Walker from University of Georgia Libraries was unfortunately unable to attend the conference in person to present Georgia's work on journal management.

University of Georgia faced a significant budget shortfall in 2008, so were forced to re-negotiate some deals and bring in selective pay per view. They also looked closely at usage statistics data, cost per use, cancellation restrictions, ISI impact factors and aggregator availability for all subscriptions (a task complicated by the difficulties in matching ISSNs between usage data and subscription data, so they ended up with a “family” of ISSNs for each journal).
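The ISSN-matching problem Georgia describes - the print and online versions of one journal carrying different ISSNs in usage data and subscription data - amounts to grouping identifiers into families. A minimal sketch, using a union-find over known same-journal pairs; the ISSNs below are invented examples, not Georgia's data or code.

```python
# Group ISSNs known to identify the same journal into "families", so rows
# keyed by any member ISSN can be matched to the same journal record.

def issn_families(pairs):
    """pairs: iterable of (issn_a, issn_b) known to be the same journal."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for a, b in pairs:
        parent[find(a)] = find(b)  # union the two groups

    families = {}
    for issn in list(parent):
        families.setdefault(find(issn), set()).add(issn)
    return list(families.values())

# e.g. print, online and a changed ISSN for one journal:
pairs = [("0000-0019", "1930-7799"), ("1930-7799", "2150-4008")]
families = issn_families(pairs)  # one family containing all three ISSNs
```

In practice the pairings would come from sources such as publisher title lists or knowledge-base records; once built, a usage row and a subscription row match whenever their ISSNs fall in the same family.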

After the initial work, Georgia decided that they needed to create a web-based application, more dynamic than spreadsheets, if it was to be used for ongoing decision making. They created their own “journal list”, bringing together data from multiple sources using WinPerl, which allowed them to, e.g., connect orders with cost data and usage data, jump to entries in the OPAC and e-journals A-Z, produce alerts if a journal is non-cancellable, link to ISI impact factors etc. Their next step was to add licence information. The Georgia “journal list” is effectively a home-grown ERMS, but based around collection development needs and making decisions about renewals and cancellations, rather than focusing solely on lifecycle management. Interestingly, the university had previously purchased a commercial ERM as part of a consortial deal, but never implemented it due to the time costs of populating it with data [!].


Tuesday, April 13, 2010

Life Scientists Go Online: collaboration, communication and credit

Lucy Power at Oxford Internet Institute, University of Oxford


Following Tony Hirst’s talk on Network Ecology & the Knowledge Economy, Lucy Power took a look at the roots of scientific communication and linked early scientists’ sharing of marginalia to today’s potential version of the same collaboration – friendfeed.


Marginalia, making notes in the margin of books, was how our early scientists (or ‘Natural Philosophers’) communicated with each other and built on others’ research; sharing by posting their margin notes around the world, forming an ‘invisible college’. This actually continues in the present day, online.


Power is currently conducting research into life scientists' use of online tools for their work; in particular she is looking at FriendFeed, and is also conducting interviews with the life science community. FriendFeed aggregates feeds from any number of sources – videos, images, blogs. The most common sources are Twitter and personal blogs, but users also post directly to FriendFeed itself. The main features include:


The concept of liking. This is a very simple and instant way of giving a thumbs-up to useful content, articles, data or images. It was introduced by FriendFeed and has now also been adopted by Facebook. One of Power’s interviewees liked the like button for its “instant karma”.

Commenting – just like marginalia, adding to and building upon others’ work. An interviewee liked the conversational aspect; that you can interact just as much as with the person sitting next to you in the lab.

Sharing – feeding content to other groups within FriendFeed. For example, The Life Scientists group has about 1,350 members and is very active.


The main use of FriendFeed among The Life Scientists group is to discuss science and pose questions. Members do their research first – they are not lazy; the questions are not necessarily being addressed elsewhere, and can generate a lot of answers.


Power presents some examples of these researchers, through friendfeed, securing funding, getting published, and sharing ideas at conferences:


One chemist at Drexel University posted a quick query in August 2008. About 16 people ‘liked’ it and some commented; one comment was from someone involved with Open Notebook Science. By November, the researcher was able to announce that he had secured funding.

The same chemist said he had not met half of the people he collaborated with; everything happened online, with FriendFeed’s The Life Scientists group kicking it off but also helping to move it forward. He was able to publish – ‘publishing’ might be informal, through blogs, or formal, in journals.

Conference reporting. During the ISMB 2008 (Intelligent Systems for Molecular Biology) conference, all the hashtagged tweets and FriendFeed comments were aggregated, written up and published. It was a summary of the conference, but also a ‘meta-conversation’ about microblogging and its use at conferences.


So, linking the 17th-century sharing of marginalia to current sharing such as FriendFeed – what are the benefits?


It’s much faster – obviously!

Network effects – serendipity; people have even found jobs through the network. An interviewee also mentioned the connections they made with their information managers & librarians (a lot of librarians were feeding their Twitter through to FriendFeed). Cross-disciplinary connections were also made.

Informality – low barriers to entry.

Openness – ideas are traceable from genesis to publication.

The global distribution of discussion is quite astonishing – you can fire off a question before bed in one timezone, then wake up to lots of answers from around the world.


Not drawbacks but aspects to be managed:


Field and disciplinary clashes/differences

Work habits, managing time

Information selection – filtering the volume (the disadvantages of scale)

Ephemerality – there is no formal archiving, which is of concern to the community: how can we preserve this and keep it? Still needs to be solved.


Questions:


Q: With all the talk of social discussion and blogs evolving into journal articles, it was interesting to note that both examples used by Lucy Power were open access articles.

Power: getting into Nature would be the ultimate aim, if they think they might have half a chance, but yes a few people did say that they seek to publish in open access journals first.


Q: Is it no longer critical to publish in journals, given the way in which reputation building is changing?

Tony Hirst: a lot of young researchers don’t feel they can just go for informal publishing as they don’t have an established rep yet. I was publishing five years ago but stopped – all my conversations take place online and through conferences, all real-time.


Q: What will be the role of librarian in all this?

Hirst: There are lots of tools for processing information and RSS feeds. One important thing is for librarians to get involved in building tools (e.g. Yahoo Pipes) for processing RSS feeds, or to build apps that help curate and aggregate.

Dorothea Salo: search for librarian groups on FriendFeed too, e.g. the LSW room/group – they really communicate with each other there. It can also be a great marketing tool, so do just try it out.


A big long list of cool social tools to check out. Thanks Tony Hirst!

Charlie Rapple introduced the fourth plenary session - this one on Researchers' Social Behaviour - by announcing an experiment to take questions via Twitter.


Audience members (and those following remotely) sent questions to @UKSG and it appeared to me to be the most efficient Q&A session so far at the conference. Perhaps the size and formality of the Pentland Auditorium has discouraged some in-person questions via roving mic, and the virtual posing of questions within a 140 character limit was less daunting and more considered?


Network ecology and the knowledge economy: why researchers need to get online and social.

Tony Hirst, Open University.


Tony gave a super-speedy rundown of what social tools are out there and being used by scientists. He showed examples of the many methods by which researchers are engaging online, how they are creating objects that are themselves social, and how publishers could embrace this.


Hirst started by setting the scene of the traditional environment, where if you don’t publish, you are no one. Metrics such as the h-index give an indication of an individual author’s reputation based on how embedded their formally-published work is within other articles – the citation network.
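The h-index mentioned here is simple to state concretely: an author has index h if h of their papers each have at least h citations. A minimal sketch, with invented citation counts:

```python
def h_index(citation_counts):
    """Return the largest h such that h papers have >= h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

print(h_index([25, 8, 5, 4, 3, 1]))  # → 4: four papers with ≥4 citations each
```

The bias Peter mentions elsewhere is visible in the definition: the index can only grow over a career, so older researchers with larger bodies of work tend to score higher regardless of recent activity.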


These days, citations are also being generated informally (particularly in the physics field):

An example of this is Nature Precedings, a forum for scientists to discuss preliminary findings pre-publication.

Researchblogging.org has had good exposure with its (at least) fourth plenary mention at UKSG 2010; it allows blogs to be linked back to cited articles in a well-defined way, so that cited articles can be informed by the discussion, and vice versa.

PostRank looks at the social activity around blog posts, e.g. numbers of views, comments, backlinks, FriendFeed shares and votes.

F1000 allows academics to filter content, generating post ranks even before those articles become cited. Quite a formal social network essentially providing ‘pre-citation ratings’.


So whole networks of discussion are being built around an article.


Increasingly, these social networks drive how work can be discovered. Google’s search engine was originally distinctive because it used PageRank to determine how important a page was, but that is just one factor for them now. If you have a Google account, all your Google searches are logged, and they now customise results based on your previous use of results.


Social content is very connected – you can associate your Google profile with, e.g., your Twitter account, so information such as who you follow is used to inform the results you get from Google. So your results are really yours, including for example results from people you follow on Twitter. Social information is being mined heavily, and it’s therefore very important to nurture and curate one’s social circle (true in real life too ;-)) and choose your friends wisely, as you’re making a public declaration of who you are and what you like.


A brief warning about the risk of ‘deanonymisation’, where thought-to-be-anonymous data can be tracked to individuals. For example, Netflix (the DVD rental service) released data for a competition, and someone managed to identify some of the borrowers based on their borrowing behaviour and the patterns of their searches. This will be an issue with the release of government data. BUT these fears should not prevent people from releasing data.

There are some services that do filtering for you, such as ResearchGate, which is a kind of Facebook for researchers where you can recommend a friend in a similar area.


How can publishers learn from these networks?


Participate in the conversation. Some people think Twitter is a toy for the kids – not necessarily a good opinion. Twitter can be used to identify projects, to relate projects, to crystallise work, and to discover people at and during an event; for example, Hirst created a ‘Hashtag Community’ for this #uksg conference. Also...


Tweets can be used to automate captions on YouTube.

Jiscri (JISC Rapid Innovation programme) - generating references for only a small element, paragraph or dataset of the full article

Gapminder – set up different datasets

Embedded video is increasingly popular on third party sites

Many eyes – generate visualisations and support commenting around the data and embed in your own publications.

Wolfram has introduced the idea of ‘living documents’.

Tagging means that users are generating metadata for us

Contribute back to the community – we saw yesterday how the BBC is using Wikipedia to populate their site, using its information but also reciprocally contributing to and enhancing Wikipedia. The chemistry community is also doing that – a CAS number on a Wikipedia article means that the community has vetted it.

Look at trails, like Tesco does with its Clubcard. Google gives us trails to follow for its advertising. And it follows you – it remembers where you’ve been. Leave trails on del.icio.us, Slideshare etc.

Utilise and integrate with academic citation sites like mendeley and zotero.

Use Yahoo Pipes to aggregate and mash up the web’s content
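The kind of aggregation Yahoo Pipes popularised - pulling several RSS feeds into one stream - can be sketched in a few lines of standard-library Python. The two feed snippets below are invented, minimal examples of RSS structure.

```python
import xml.etree.ElementTree as ET

def merge_feeds(feed_xml_strings):
    """Parse several RSS documents and merge their item titles into one list."""
    items = []
    for feed_xml in feed_xml_strings:
        root = ET.fromstring(feed_xml)
        for item in root.iter("item"):   # every <item> in every <channel>
            items.append(item.findtext("title"))
    return items

feed_a = "<rss><channel><item><title>Post A1</title></item></channel></rss>"
feed_b = "<rss><channel><item><title>Post B1</title></item></channel></rss>"
print(merge_feeds([feed_a, feed_b]))  # → ['Post A1', 'Post B1']
```

Pipes added filtering, deduplication and remixing steps on top of this basic merge, which is where the real curation value - the role Hirst suggests for librarians - comes in.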


Publishers can get as much value as they want from this.


Questions from Twitter & Audience:


Q: You’ve shown a lot of social networking sites and tools, how do we keep all our networks up to date?

Hirst: Through proactivity. Every so often, prune your Twitter network – remove people who don’t tweet, or who only retweet what you already know. Cull them.


Q: You talked about a lot of services, but didn’t include your own opinion on some quite contentious subjects. On the subject of personalised searches, do you think it is a good thing to have Google filter out results based on your past behaviour? It’s not just the privacy issue – what if you want stuff from the wider world, influences you haven’t come across or wouldn’t normally consult? Does it need an ‘unrecommend’ button?

Hirst: There is a danger yes but your networks are your own to cultivate so you can shape what you see. People in networks are also in other networks and these scale very quickly. But sure, I subscribe to things that are tangential to my work and want to see alternative opinions to mine, I strongly believe in serendipitous discovery. When you build networks you take responsibility.


Q: With all these sites can you recommend a good password keeper!?

Hirst: I don’t trust them and have actually developed my own algorithm which wouldn’t be wise to share with you all!