Tuesday, March 31, 2009

Survival of the fittest: big deals at risk of extinction

Lucky Jill Taylor-Roe has a relatively healthy-sized audience for her graveyard slot at 9am the morning after the conference dinner. She wakes us up with a bit of the Byrds as she posits her theme: to every thing, there is a season. Jill is talking about change in the context of the Big Deal.

Background
In the early days of the Big Deal, the purchase model was based on maintained print spend plus an additional e-access fee that provided access to (almost) all of a publisher's collection. In the period since, Big Deals have been hugely advantageous, with a huge growth in full text downloads (Newcastle's Library is at around 1.5m downloads per year) and a huge drop in ILL and photocopying. NESLi Big Deals have become the major supply model for acquiring new journals.

Problems
But libraries are still not able to buy everything that is requested; in some subject areas (in Newcastle's case it's engineering and the humanities) new titles are not forthcoming because they are not available via any big deals - and the big deals are taking up the lion's share of the budget. This means the collection is compromised - and things will not improve as the credit crunch impacts the value of sterling against the currencies in which most journals are bought (the Euro and the dollar). Libraries are already having to ask for additional funds simply to maintain the current portfolio. In this context we need to reconsider the value of the big deal.

Research findings
Jill has been researching librarians' views of the big deal - early results show some obvious findings, e.g. that the big deal simplifies administration and reduces ILL spend. But there is frustration with its limitations (cancellations) and the impact of titles moving publisher. There is a sense that the pricing model, based on retained print spend, is no longer satisfactory. Some are still happy with the big deal, but up to a third have cancelled big deals recently - in the newer cases due to budget pressures caused by currency weakness - a challenge that will not go away.

Current solutions
In terms of managing the shortfall (median: £100k per annum), many librarians have been cutting the book budget. This is in direct contravention of stated student demand for more textbooks and leaves libraries open to poor ratings from students. Only one library noted that it was making up the shortfall with a reserve fund set up for this purpose.

Future challenges
The VAT issue remains and, while the recent cut to 15% has helped to moderate the effect of currency fluctuations, this is only a short-term benefit - and there's a fear it will be raised to a higher rate than the original 17.5%.

The economic problems we face are not short term. In 2010 and beyond, libraries still plan to raid the book fund but will also tackle the serials fund. Jill's research shows an increase in plans to cut big deals, which will "no longer be sacrosanct". This is driven by other factors beyond currency fluctuations - budgets not keeping up with inflation, growing dissatisfaction with pricing models, an awareness that the books budget cannot be raided indefinitely.

Next steps
Jill's survey is ongoing and further results will become available but it is clear that inflexible deals that don't offer value for money will be vulnerable. Publishers must not bury their heads in the sand but acknowledge the warning signs and "think seriously about this - don't be complacent - there are hard times ahead for all of us and survival of the fittest is not just an empty phrase".


Library marketing: a strategic approach to an interactive library experience

Olin College of Engineering is a young college (founded within the last ten years) set up to bring more hands-on training, entrepreneurial spirit, cross-disciplinary learning and design concepts into engineering education. Students are a diverse group with a range of talents beyond their academic excellence. Dee Magnoni is its Library Director, with a background in advertising.

The interactive library: escaping temporal exhaustion
Olin's library is not huge but Dee believes that you learn by more than just reading and writing; the interactive collections reflect this. The library is a 24/7 space to escape "temporal exhaustion" (when we're too busy to pause and contemplate, our creativity suffers). The small staff at Olin doesn't sit behind a desk. The virtual collection is much larger and deeper than the physical collection. The library is full of games (chess), modelling kits and other interactive "realia" to encourage creativity and thought. It sounds like an inspiring and fun place to learn.

Encouraging e-resource usage
Olin puts most of its budget into e-resources, and makes sure they're used by holding a vendor fair. Olin advocates four steps: goal, timeline, budget, communicate. Dee allows herself six months to plan a fair! She works with other departments (IT, facilities) and external partners (caterers, balloons, photographer). In attracting vendors Dee communicated her own excitement along with the benefits for vendors, who have been "fabulous" in partnering on costs and prizes. In the run-up to the event she put together and distributed publicity posters and flyers, and re-confirmed all the vendors and suppliers.

Dee's event coincided with "Talk like a pirate" day and so they used this theme and the seasonal treats (caramel apples and cider) to theme the decor and catering. In order to enter the raffle, users had to answer the question "What did you learn?" - a great way to elicit feedback such as "I learnt where and how to do my research". She also gathered feedback from vendors as to the value of the event to them. "One event is not going to solve all my PR challenges," she notes, and tells us the lessons she learned:
  • always communicate more, more, more
  • be ready for something to go wrong, because something will
  • don't do it alone - get all the support you can from vendors, suppliers, internal depts
  • make sure you, as well as everyone else, have fun!
A multichannel approach
Olin does other forms of marketing and Dee cites Springshare's LibGuides as a useful tool for helping students to find resources in specific areas. The library has a Facebook page, uses Wikis, blogs, instant messaging and news feeds to reach its users (I applaud this multi-channel approach). Dee has also carried out considerable research among faculty and students to inform her strategic planning, and has created an external library advisory board (including vendors, researchers, a copyright expert, a consortia director and faculty from other colleges) to visit regularly and provide strategic advice as well as occasional tactical input. Isn't this a great idea - I wonder how many other libraries are capturing the skills of those around them in this way? Dee rates conferences as an opportunity to pick up on the zeitgeist and share experiences with others.
Exemplum, exemplum, an example from your own life ... One library realised its students weren't taking in the guidance they had received from the library as freshers, and were calling their parents for the kind of help the library should provide. So the library scrapped its freshers event and invited the parents to tea, so they would later tell their offspring to use the library.
The library as part of the bigger study picture
At Olin they talk about information fluency, not information literacy. Dee gathered together relevant standards and worked with students to come up with their own curriculum (which they called Lifehacks) with modules on sleep, nutrition, relaxation, and "everything else you need to be successful to study". She paints a compelling picture of a library that has been able to grow itself into being precisely what its students need it to be - I guess the challenge for others is to be able to evolve from a more traditional library into an interactive and welcoming environment such as Olin has managed to create.


Moving to e-only from a library perspective

I attended this breakout session yesterday and was reminded how useful it is to come to UKSG each year. I was drawn to the title of this session and in particular the '...from a library perspective' bit. That, it seems to me, is the best thing about coming to UKSG. You get to hear the view from librarians and, working in publishing as I do, find this insight invaluable.


Sarah Pearson from the University of Birmingham gave a great overview of the many challenges facing her library (and many others too, I'm sure) as they move further towards an e-only model. I hope she doesn't mind me giving the ending away, but as Sarah herself admitted, this e-only model is not likely to arrive at her university library any day soon. Instead a hybrid of print and electronic content exists.


She outlined the collection development principles currently in place saying that web-based resources are the preferred medium and how important it is to have a flexible budget that responds to changes in course content and research directions. She also said it was key to negotiate great value for money.


The University of Birmingham has 24,000 free and subscribed e-journals plus 1,000 e-resources (340 subscribed) and 4,000 e-books. Sarah highlighted the many benefits of opening up greater access to their collection and offering e-access to library users, including distance learners. E-delivery adds value such as alerting services, citation links and discussion forums, which are not available in print, of course.

Sarah outlined the benefits, such as opening up a much bigger collection to users for a lower fee, and the drawbacks of big deals, such as taking some journals that may not be used extensively and having less control over collection development. She summed up with some useful learning points to take away:
  • don't expect to go completely e-only
  • usage is an important tool but don't forget about feedback
  • big deals have benefits but there are trade offs
  • negotiate, negotiate, negotiate.

All in all I found the session useful and it certainly gave me a library perspective on the potential for moving to an e-only model and the pitfalls that entails.


Data Analysis Will Drive Decision-Making in Research – Jay Katzen, Elsevier

Katzen starts off by reviewing the reactions to the recent economic crisis. World leaders are making grand statements about continuing to invest in R&D - e.g. Gordon Brown said that innovation is the way out of the economic crisis. But we know that lean times are here: there is a hiring freeze at many US universities, such as Harvard; funding levels are dropping, so that e.g. Stanford is basing its budget on a 5% decrease; and private institutions such as the Wellcome Trust plan to decrease their endowments.

Aside from the economic crisis, other factors to take into account are that governments are playing a bigger role in research assessment; there is an increased competition for funding; as well as a clear drive for multidisciplinary research. And these pre-existing issues with performance measurement are now exacerbated by the economy.

The UK’s Research Assessment Exercise (RAE) released its analysis in 2008 and as a result fourteen universities in the UK are getting less investment than the previous year, and forty institutions are receiving funding increases below inflation. So more than fifty UK universities have to make cutbacks somewhere. These include University College London, King's College, Imperial College and the University of Cambridge.

The Australian Research Council also aims to use metrics to monitor and measure research to base funding decisions on.

So Lean times = Lean research. But no one can do less research. We have to do more, with less, and the key tools to help us deal with this will be around data and analytics. More data will be needed for evaluation and decision-making at every level of the academic institution.


The scholarly landscape will change, and technology will be key

The United States is ranked number 1 with 310k published articles in 1997 and 340k in 2007. In that decade China has moved from the number 10 spot to the second, and it is predicted that China will surpass the US in research output quantity within the next few years. We need to be aware of this, particularly as research continues to cross national boundaries, making measurement even harder.

So one question is: is research optimized and efficient?

The average researcher spends 6.5 hours a week searching for information, and 5 hours a week analysing it. They spend more time looking for information than they do actually using it, and this has to be reversed.

We also know that forty-two is the average age when a biomed researcher gets his/her first research grant from the NIH but the approval rate is only 15%. So researchers spend a significant amount of time identifying ideas and proposals, but there is a lot of time wasted when not much is then approved for funding.

Katzen describes the investigation they have done into the researchers’ workflow; an exercise that aimed to spot gaps and see where tools can be developed to improve efficiencies, leading to increased published output and ultimately institutional ranking.

Katzen therefore disagrees with Derk Haank’s view that there is no information overload and that technology will not play a significant role in the publishing industry during the next decade. He argues that we can now make revolutionary changes to the whole research environment by using technology to connect the mass of information through new tools and methodologies to really analyse and evaluate performance through data. This will change the landscape for how people practice research, and technology will facilitate that. Katzen argues that “we are moving from traditional publishers to information solution providers”.


New Mapping Methodology For Reputation and Performance Measurement

The tools for linking reputation and performance rankings don't yet exist; we cannot look solely at journal-based classification anymore, especially with multidisciplinary research and non-English-language research both increasing. The number and scope of journals is too limiting and the level of aggregation is ineffective when we cannot see how different disciplines interconnect.

There is a new mapping methodology, presented at the National Science Foundation (NSF), which used co-citation analysis to look at the quality of output. There are thirteen categories but more than 40,000 subcategories so if institutions want to understand where their competencies really lie, they should look much deeper into the data and use this mapping methodology.
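The co-citation idea behind this kind of mapping is simple: two documents are related if a third document cites them both, and the strength of the relationship is how often that happens. A minimal sketch of the counting step (illustrative code only, not the NSF's actual methodology):

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(citing_to_cited):
    """Count how often each pair of documents is cited together
    by the same citing paper (the raw input to a co-citation map)."""
    pairs = Counter()
    for cited in citing_to_cited.values():
        # every unordered pair in one reference list is a co-citation
        for a, b in combinations(sorted(set(cited)), 2):
            pairs[(a, b)] += 1
    return pairs

# Three hypothetical papers and their reference lists
refs = {
    "paper1": ["A", "B", "C"],
    "paper2": ["A", "B"],
    "paper3": ["B", "C"],
}
print(cocitation_counts(refs))
# ("A", "B") and ("B", "C") are each co-cited twice, ("A", "C") once
```

Clustering these pair counts across millions of articles is what yields the fine-grained subcategory maps the talk describes.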



You can also use this mapping technique to see which authors are driving certain areas, and it works at a national level too: the NSF created these maps for the UK - previously it was thought that the UK had two areas of strength, in social sciences and in health services. But this doesn't make any sense - where is the map that shows physics and maths are actually key drivers? It's also useful for seeing where you are vulnerable internationally and where you should be careful. The NSF says that this new mapping methodology "gives us vivid insight into rapidly evolving research areas and the relationships among them".

The people thinking about these things are generally Deputy Vice-Chancellors, Deans etc, but librarians can and should play a key and critical role in supporting their universities to become leaner organisations. One Library Director Katzen spoke with said that people view her as a procurement centre - she just buys it and switches it on and the job is done. But Katzen argues that the library role is significantly undervalued; libraries can and should be looking into the performance of their institution in order to adapt their services and become more involved in the research process.


Audience question 1: That analysis is all great but who will actually be the first to move to make the paradigm shift? Katzen answers that it will be the research councils and university deans who drive the need for this methodology (they already are) and publishers will support it with data and tools.

Audience question 2: Peter Shepherd asks whether, since there is a lot of historical citation data, this could be applied retrospectively to trace what triggered critical and sudden changes in the past. Katzen replies yes - there's no reason why you can't look beyond today's performance; the method would indeed allow you to trace that. Shepherd states that this would be something scientists would really get excited about.


Publishing and cataloguing datasets: it's time everyone got involved

Presented by Toby Green from OECD.

He gave an overview of OECD and a definition of data sets and data access.

From the PDF, OECD provides links to the raw data.

OECD provides raw data via DOI in xls files for researchers to get at. The next step is to provide access to data cubes/sets of underlying data.

OECD is working on building this access into its platform, OECD iLibrary, which is cross-searchable.

The Economist uses these data sets regularly but citation is very poor. Other authors also have trouble citing data sets, and OPACs are not good at providing access to data sets either.

The data sets become like black sheep that cannot be found. There are some scholarly publishing networks for journals and books that provide hard links to data sets, but these are not at all perfect and miss lots of sets.

OECD will create a dataset with an authorized title, ISSN, DOI and MARC record attached.

Challenges:
  • dynamic data sets (they change)
  • versioning (recalculations all the time)
  • preservation (not happening)

OECD is issuing a white paper on publishing standards for data sets.

OECD is speaking with CrossRef about citation standards for dynamic objects.

By mid-2009 there will be MARC records, ONIX records and citation records.

Question: what about licensing the data? The problem is where/how it is cited.

Question: how would it be discoverable? Answer: these metadata channels would allow for discoverability.

Statement: it will be interesting to see if data set discovery will increase full-text usage. Currently, full text is the only way to get to the data, so it will be interesting to see.

Being cool with data: starting in 2007, two companies - Swivel and Many Eyes ("Flickr for data") - have let users tag data, send it to blogs, etc.

OECD loaded up Factbook data sets to see what would happen. Traffic has been slow, but the sites do offer visualization tools that are pretty nifty; it was thought they could make a good teaching tool. The tools are free and users are encouraged to use them.

OECD is creating a Factbook for iPhone, which will be offered for free, and also the NCVA regional data eXplorer: hunt data using maps and graphs instead of access being just textual. Users can then develop stories based on data retrieval - the example shown was the ageing population.

It was noted that this product is similar to Gapminder. A direct feed can be set up.

Question: does OECD see itself creating/selling data management platforms? With eXplorer, it will be open source, but other services may be for-fee.

The IMF Data Mapper was presented - based in part on OECD data.

www.gapminder.org can show trends and gaps occurring, and provides podcasts of data.

Newspaper websites have gone crazy with data/charts/graphs.

Loads of visualization resources are being developed.


Where's My Jetpack

The UK federation was launched in November 2006 and has reached 690 members and membership is still growing. It is a deliberately inclusive federation - it includes all of the education sector and anyone providing services to that sector. Federations are essentially enablers of communication between this vast membership.

Ian demonstrates that current implementations by institutions show an even split between in-house and outsourced identity management. Service Provider implementations came much later in the day for the UK federation but are now outstripping institutional implementations.

The Jetpack refers to Ian's assertion that the future is already here, it is just not widely distributed yet.

The UK is the first adopter of concepts such as outsourced identity management and uptake in the schools sector; a similar pattern of adoption is expected in other countries. Scale is important for adoption. The UK federation is now seen as a 'must have' - a position that has not yet been reached in the US.

Software diversity based on standards is important in the UK federation. This provides choice, business models and sustainability. It is also noted that people get support for their software choices from a variety of other places than the UK federation itself.

There is a problem with helping users find the right place to log in - known as the 'discovery problem'. Although the UK federation provides a WAYF (Where Are You From) process to help guide users, it is better if this is integrated into the Service Provider interface - Service Providers know how best to present information about their customers.

Authentication processes are changing - usernames and passwords will not be the mechanism used in the future. People are starting to use cards, tokens or USB devices for access, and this will become more common. It will quickly be followed by interfederation - federations talking to federations - making the experience more seamless for Service Providers and Identity Providers.

"The best way to predict the future is to invent it" - Alan Kay.


From Timbuktu to Here

How many electronic journals will have the shelf-life of the famous texts of Timbuktu, which are still readable and preserved today? This question set the theme for the session on Access and Preservation for Electronic Journals, led by Terry Morrow.

Continued access to journals is a shared problem, Morrow argues - everyone who gains a benefit from the availability of e-journals should take a part in solving the problem. The specific problems for e-journals are:
  • Continued access after a subscription has been cancelled, but does subscription = ownership?
  • What do you save? Articles are often built on the fly from various components, including complex metadata files.
  • Technology: don't assume that PDF will be around for ever.

Preservation is never a free option, but can be viewed as insurance cover for the future. The costs will be ongoing, not one-off, so a full risk analysis against costs should be done before preservation decisions are made.

The current economic climate is making the loss of journals very real with concerns about loss of vulnerable publishers and the ability to maintain subscriptions: the impact of the euro exchange rate on UK subscriptions is causing problems now for institutions.

Chris Rusbridge, via the power of Twitter and blogging, argues that this is a very real problem. Chris outlines a variety of different scenarios with regard to the loss of journals, but the crux of the question is: who is responsible for taking action to preserve journals? This ties in with questions from the room about how we assess the value of journals. A serious question for attendees at UKSG.

A range of preservation systems were described: LOCKSS, CLOCKSS, Portico, e-Depot, OCLC ECO, and British Library developments. Managing the trigger events for all of these systems and the roles and the responsibilities for all the stakeholders are different in each system.

There is a challenge for all of the attendees at UKSG to answer the question, am I responsible?


Library marketing: running an event to promote usage

"Marketing isn't taught in library school, and I think we're at the point where it should be," says Ruth Wolfish from the IEEE. I have just followed coloured footprints along the hallway to Ruth's session so it was clear before I even arrived that it would be a break from the norm. I think the guys next door in the API session were jealous.

Ruth starts with her tips for a successful event: make it meaningful, time it right for your audience, promote it well to the right people, get the endorsement of influencers in your target audience, make the benefits clear - and make it fun.

She then proceeds to set us a task list, starting with checking for conflicts and scheduling your project tasks. She suggests involving students in creating promotional materials, and seeking assistance from vendors and support staff around the university. She emphasises the importance of food in attracting attendees, and suggests a quiz or a raffle to keep people there until the end. Ruth's guidance even extends to design tips for your promotional posters - "uniform and easy to read" fonts, making primary messaging more prominent, avoiding too much text, being careful with colour combinations.

A really good event makes library staff more accessible - Ruth cites one library's Halloween event where librarians dress up; students see it as a "don't miss" event and remember the librarians personally afterwards. Ultimately the objective is encouraging more usage of the library and its resources (I was a tiny bit late for this session, so I hope objectives were brought up at the beginning as well - all marketing has to start with clear objectives against which success can later be measured).

Communicating your event
Library blogs are taking off - particularly in the US, I think - and Ruth shows us lots of examples, commenting on the layout of the text (make sure your offers are clear). She also shows examples of how universities are using Twitter "to communicate with our users more effectively" - library hours, catalogue updates, "whatever you want to say". It doesn't take the place of existing communication channels (website, newsletters) but adds to the library's means of publicising the e-resources on which it spends such a considerable amount. We look at one library's Facebook page which highlights all their events ("pizza in the library") and incorporates applications added by the library e.g. catalogue search, find articles, news feeds etc. Use the photo galleries to help build your library's presence and character.

Ruth moves on to "Little Ideas with Big Impact" - with examples from librarians all over the US, including "flyers in places people can't avoid (back of toilet doors)" and tear-off slips to remind people of the dates and times of your next library events.

I'm pleased that Ruth closes with measurement. I couldn't agree more with her assertion that you need to "make sure that you measure your success - that you have metrics when you're asked for them." She suggests thinking along the following lines:
  • What does the library do for the school?
  • Has usage gone up since you started running events?
  • Have you had more research requests?
  • Did you make new contacts?
  • Have you been invited to speak at classes?
Ref. Bhatt, J., Wolfish, R. "A successful collaborative partnership among the Faculty and Librarians at Drexel University and IEEE" - a study Ruth co-authored that may provide further insight into the value of library marketing.


Monday, March 30, 2009

No Revolutions For Scholarly Publishing – Derk Haank, Springer

In contrast to Hindawi's prediction of a ‘Journal Commoditization’ revolution in the next decade, Haank’s view is that the major advancements in scholarly publishing have already taken place. For a revolution to occur, people have to be very dissatisfied with the current situation, and that is no longer the case: the shared publishing goals of 1998 have already been achieved. These were:

1) improving access;

2) seamless linking; and

3) improving value for money.


The CrossRef initiative has solved one of the biggest problems by providing pure linking to enable seamless access to everything, for everyone. The fear and excitement of the late nineties meant that publishers “invested heavily - too much in my view - in technology” and this resulted in having to charge much higher fees for publishers’ platforms, rendering content inaccessible to some users.


The technology will not be important in the next decade as people are no longer concerned with how systems work, only with the end product/use. “The techies are back in the cellar where they belong”. So Haank doesn’t care about the next Web (2.0/3.0 or 99.9) as we’ve already achieved a lot and it will not be possible to invest much further anyway.


We’ve talked about Open Access for ten years but only 3% of articles are published under the OA model - hardly a revolution. Of course OA will not disappear (note his recent investment in BioMed Central!) but it will build slowly alongside and in parallel to traditional publishing business models - an evolution, not a revolution.


More content is produced each year than the previous year but library budgets do not increase so we just need to get much more efficient every year instead of looking for the next big development.


Haank’s conclusion for 2014-2019 is that “we’re in for a boring decade”, but he pointed out that while he and Hindawi may disagree, they could both be right!


PS – Haank was asked about the "elephant in the room" and said that Springer is not up for sale but is looking for a third additional partner, not a replacement for current shareholders.


OPAC 2.0... and beyond!

Presentation from Dave Pattern of Huddersfield University.

Started with a brief overview of the history of the library catalogue - from the card catalogue, to "OPAC 1.0" in the 1980s and web-based OPACs in the 1990s, which were arguably just displaying the card catalogue in a web browser.

Dave talked about MARC21 being a format optimised for printing catalogue cards, and introduced his first "conspiracy theory" - that cataloguers are gearing up for sabotage of web-based OPACs and the return of the card catalogue!

Looked at how OPACs are currently designed, and what librarians think users want. Showed some examples of ludicrously complex advanced search forms, requiring pages of instructions. Also mentioned the problems with expecting users to use Boolean logic in their searching - example of searching for "Oranges are not the only fruit" via the BL catalogue - title only came up if you searched "Oranges are the only fruit"! Catalogue search interfaces that require expert searchers, and over-complicated notification systems, led Dave to his second conspiracy theory - that we are trying to turn our users into mini librarians!
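To see why the "Oranges" search fails, here is a toy sketch (hypothetical code, nothing like the BL's actual system) of a search that naively treats "not" as a Boolean operator and so swallows it from the title:

```python
def naive_boolean_search(query, titles):
    """Toy OPAC search that treats 'not' as a Boolean NOT operator."""
    include, exclude = [], []
    negate = False
    for token in query.lower().split():
        if token == "not":
            # 'not' is consumed as an operator, never matched as a word
            negate = True
            continue
        (exclude if negate else include).append(token)
        negate = False
    results = []
    for title in titles:
        words = title.lower().split()
        if all(w in words for w in include) and not any(w in words for w in exclude):
            results.append(title)
    return results

titles = ["Oranges Are Not the Only Fruit"]
# 'not' negates 'the', and the title contains 'the', so there is no match:
print(naive_boolean_search("oranges are not the only fruit", titles))  # []
# Omitting the 'not' finds the book:
print(naive_boolean_search("oranges are the only fruit", titles))
```

The user typed the title exactly and got nothing, which is precisely the kind of behaviour that forces users to become "mini librarians".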

Went on to a reminder of Ranganathan's 4th law - saving the time of the user. Quoted Roy Tennant - "create a system that doesn't need to be taught". If the OPAC is too complex for our users, that is our fault [sidenote - thoroughly agree with that, it irritates me to hear colleagues criticizing the students for not knowing how to use the catalogue. Why should it be their responsibility to find ways around our clunky and antiquated OPAC?].

Talked about the results of a 2007 survey on OPACs - on a scale of 1-10 (ten being best), the average rating for "how happy are you with your OPAC" was 5.1; the average for "how well does your OPAC meet your users needs" was 4.5.

Moving on to OPAC 2.0, began by mentioning the Ann Arbor District Library catalogue, which has features like tagging, rating and reviews, and the option to bring up an image of the catalogue card for the item - which users can add "graffiti" to. Then went on to talk about the work done on the OPAC at Huddersfield University. In deciding what features to add, they looked at user suggestions, web 2.0 inspired features, and successful ideas from elsewhere. Dave described the OPAC as being in "perpetual beta" - new features are added as and when they come up, and are removed if unsuccessful.

Keyword searching was monitored, to find out how the system was being used. This showed that 23% of searches gave no results, and that users frequently gave up if they didn't get any results from their first search - probably because there was nowhere else to go from that point. Users expect suggestions and prompts for unsuccessful searches, not dead ends. Introduced features such as a spell checker, keyword suggestions for terms not in the catalogue (cross-referenced with answers.com) and suggestions of popular combined search terms for general searches with too many results. Also added a keyword cloud of recent searches on the first page, and a "virtual shelf browser" - originally as "eye candy" but both tools turned out to be very popular and useful. Several other features mentioned such as recommendations, RSS feeds for new titles, and the ability to add ratings and comments.
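The zero-results fallback Dave describes can be sketched roughly like this (illustrative only - Huddersfield's real system cross-references answers.com rather than the local vocabulary, and the index shown is invented):

```python
import difflib

def search_with_suggestions(query, index):
    """Return matching titles; on zero hits, offer 'did you mean' terms
    instead of a dead end."""
    hits = [title for title, words in index.items() if query.lower() in words]
    if hits:
        return {"results": hits, "suggestions": []}
    # No results: suggest close spellings drawn from the keyword vocabulary
    vocabulary = {w for words in index.values() for w in words}
    suggestions = difflib.get_close_matches(query.lower(), vocabulary, n=3, cutoff=0.7)
    return {"results": [], "suggestions": suggestions}

# A tiny hypothetical keyword index: title -> set of indexed keywords
index = {
    "Library Management Basics": {"library", "management", "basics"},
    "Marketing for Libraries": {"marketing", "libraries"},
}
print(search_with_suggestions("managment", index))
# no exact hits, so 'management' is suggested rather than a blank page
```

A real OPAC would run this against its full keyword index and, as Huddersfield did, monitor which suggestions users actually follow.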

Looking at usage statistics - borrowing peaks in October, but usage of keyword suggestions and borrowing recommendations peaks in November - maybe when all the reading list books are checked out so users appreciate the recommendations. Also showed increase in the range of unique titles borrowed - suggests users are checking out books they may not have looked at before.

Dave concluded this section by pointing out that what is required is more than just cosmetic changes, otherwise it's just "putting lipstick on a pig". He also pointed out that the changes need to come from vendors, as many libraries do not have the resources to do the kind of work he has by themselves.

He went on with some suggestions for libraries who did want to try 2.0-ing their OPACs themselves, including encouraging ideas from staff, listening to user feedback, not being afraid to make mistakes, and monitoring usage - if a feature isn't being used, get rid of it!

Dave talked about what is needed for OPAC 2.0 - relevancy ranking by default, faceted browsing, spellcheck, RSS feeds - and what is still missing - more serendipity, in the form of tailored borrowing suggestions and "just in time" recommendations, and social features to allow users to build a community.

Session finished with suggestions of commercial products, open source tools and web services available for OPAC 2.0, and a reminder of the benefits of OPAC 2.0 for students (better recommendations), librarians (collection development) and academics (improved reading lists).

Slides available: www.slideshare.net/daveyp


2020: a publishing odyssey

Ahmed Hindawi opened the afternoon plenary session by talking about the "three big changes" that will affect scholarly publishing in the next ten years. He started by looking at other types of publishing and the issues facing those industries:

Newspapers have traditionally had reader payments and advertising revenues, but lose reader revenue when they go online and make their content freely available. Per-page revenue is a fraction of what it was. But newspaper publishers made the decision to go free, and did so because the content they publish is reproduced in many places. This would result in price wars that would only end at zero anyway. There are some exceptions - the Wall Street Journal has managed to keep subscribers because its content is differentiated.

Trade book publishers are just starting to embrace digital. Challenges: no print, less need for publishers? Anyone can get a digital book into a book store, when they couldn't with print. 30 million out-of-print books are "coming back from the dead" and becoming available online - the long tail will vie for space with new titles.

The music industry has seen well-documented problems with piracy.

Scholarly journals publishing doesn't have these problems: unlike newspapers, the content of scholarly journals is highly differentiated, and you're unlikely to just go and read a different article if the one you want is too expensive or behind access control. Scholarly journals are bought by organisations, so there's still a "middle man" in the sale as compared to author to reader trade book sales. And piracy isn't a big issue.

The three changes that Ahmed predicts will affect scholarly publishing:
  1. Open access vs toll
  2. The journal as a brand on the author side
  3. The journal as a brand on the librarian side

(Blogs and wikis have their place, but won't significantly impact scholarly publishing.)

Drivers for open access
  • Recognition of merits of OA by researchers
  • Serials crisis = difficult to expand toll publications
  • Green open access - publishers will realise gold is more secure and more financially viable.

The journal as brand on author side
  • Citation databases could lead to the creation of author impact factors that become more important than journal impact factors
  • This highlights the need for author identifiers! Scopus, Researcher ID, Contributor ID.

Journal as brand on library side
  • Budgets - librarians can't consider individual titles, and will go for Big Deals instead when budgets are tight.


So, five possible futures for scholarly publishing.

1. The Near Past
Journals are toll access, and are important to authors and librarians. It's what we have or have just had, and has resulted in the serials crisis.

Possible Future 2: Here comes the Big Deal
Journals are toll, are important to authors, but are in big deals and their brands are not important to libraries. Will see consolidations. Unlikely?
Should expect intervention from external markets.

Possible Future 3: Journal commoditization
Journals are toll, but lose their brands on the author side. Publishers will have to work hard to keep authors. Publishers will accept more manuscripts (all that are factually correct). New pricing models will emerge - based on subject and downloads. There will be an ongoing market price, so competition for profit will be all about saving costs.

Possible Future 4: Open Access
Journals still have a strong brand with authors, but libraries don't need to purchase journals. High impact journals will be able to demand higher author publishing charges. Will be more competition between journals and publishers.

Possible Future 5: Commoditization 2.0
Open access and lost journal brand on the author side. All journals are like PLoS ONE journals, publishing all rigorous articles. A&I databases will be the only place to navigate content.

What will materialize will be more complex than any one of these examples. Open Access is important, but isn't the only issue. Commoditization can bring benefits. Scholarly journals have many stakeholders. It's important to be as "humble and objective as possible" and consider all of the stakeholders. There will be winners and losers.


"If we invented the scholarly journal today, what would it look like?"

"Disorientation," says the University of Washington's Joe Janes. "And Dairy Milk."

(I love Joe for loving Dairy Milk. Having given it up for Lent, I'm also a bit growly at having had Dairy Milk brought into my frame of reference so early in the day.)

He talks about his strolls around Torquay in the last couple of days, and the "busman's holiday" treat of checking out the library ... which was closed (it was Sunday) - unlike the Tesco opposite with its poor excuse for a BLT.

"How much more disorienting things are," says Joe, "when things seem familiar, but are just a little off." He describes his first trip to Britain where everything looked normal but - wasn't. In scholarly communication, we're currently in the process of leaving a country that we know really well (because we built it) and entering one that seems familiar, but isn't. This is harder than just starting all over again - and the transitions we see before us will be fast, profound, radical and forever. "Your future," he notes to the students in the room, "will be nothing like this. Except the parts that are."

When we're disoriented, we look for guideposts and parallels to work out where to go from here. Scholarly communications matters in guiding future research activity - and all our pieces must fit together well for it to work (cites a story about a woman researching asthma who died because she did not find crucial information in PubMed - need Joe to write a comment expanding on this story!)

Editing, peer review, tenure, pricing and all these other functions around scholarly communications are currently up for grabs - access, e-science and a million other developments. The way in which scholarly artefacts are created, the form and structure they take on, the way they're searched, used, distributed and preserved - these are all changing as we speak - some will even change as a result of this conference. How much longer will an article be called an article? As we live in an increasingly digital, networked world, so the outputs of our research will be increasingly digital and networked. What about an article that includes a live satellite feed, or live peer-review? The containers of scholarly communications are cracking apart, the object itself is beginning to crack open, and new aspects (video, audio, social networking) can become a part of it.

The scholarly journal looks like it does based on what was the common medium of communication in the 18th century. If we were to invent it today, what would it look like? Scholarship itself will take a dramatic leap in terms of authenticity and genuineness now that researchers can express more effectively what their results are - leading to different kinds of research endeavours and questions. Our new and forthcoming capabilities will change the face of knowledge itself - "a boon for all of us".

A lot of what we build into what we do is based on an assumption of permanence and endurance - giant buildings of bound journal runs. If we didn't have these, or put them somewhere else, it would change how we build our services and even our professional ethics. "I'm the token American here so I have to say change, hope and 'yes we can'!"

Some of us will make it happen and some of us will be cleaning up after it has happened. One approach to figure out where we'll go from here is to look for our signposts. We mustn't shoehorn new developments into old pigeonholes - a blog is not a scholarly journal, Wikipedia is not Encyclopaedia Britannica. The further down the road we get, the more we'll see what our current harbingers of change mean. We all have to be mindful of the long haul - careful not to put our eggs in a basket that's not going to be around (remember Gopher?). It probably works to base our strategies on incremental change, but it might work better to think about starting over: what is the right way for us, together, to design the right system to engender, distribute, collect, scan and use the results of scholarly work?

"Maybe the question isn't 'where do we go from here', but 'how do we get there?'"


The future of learning: starting now

The journal and the book, suggests Professor Timothy O'Shea, will not die but will inevitably mutate as we find new modes of knowledge sharing and use - and ownership (individuals up to open collectives).

Technology is changing learning and research in universities - centralised systems, e.g. for authentication or records management; distributed software that's installed and used by random academics regardless of whether they are supposed to or not. All sorts of innovative uses of technologies for students e.g. audience-participation style clickers for lecturers to take quick polls during lectures (does this add much value over the old hands-up method?).

Dead horse of the week
Students are very at ease with technology and "view ICT in education positively and confidently". Interesting applications in veterinary sciences - virtual sick cats, dogs, cows - virtual "dead horse of the week" (first big audience chuckle of the day). Vet students also construct their own virtual subjects ("imaginary sick dogs") to support their own studies. Virtual sick animals are archived and can be reviewed in later years.

Vicarious learning
Vicarious learning happens by watching other students in action - even when the student is not actively participating in a discussion. YouTube is a great mechanism for organising vicarious learning and has been used for example in computing science lessons. Elearning students use Edinburgh's "best of breed" platform for elearning (Virtual University of Edinburgh - VUE) - they use Wikis, Second Life (constructing exhibitions within it) and "assess co-created artefacts". VUE is used by all sorts - staff, students, alumni - for awareness, e-learning, PhD projects etc. The people using it may never meet but share virtual spaces - often in hybrid form (real people in real offices connecting in a virtual shared space).

Speckled computing
Speckled computing is also changing research; it's based on specks: "miniature programmable semiconductor devices which can sense, compute and network wirelessly". These are e.g. placed all over a person to track their movement and reflect it in an avatar - enables a person to teach a robot how to dance. Tim also gave a good example of using specks in capturing movements of shy, nocturnal creatures like badgers - place the speck and it will wake up and start to monitor activity when it senses some movement.
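
[To make the wake-on-movement idea concrete: a toy sketch, nothing to do with real speck firmware - the threshold value and function are invented. The speck idles until it senses movement above some calibrated level, then starts logging:]

```python
MOTION_THRESHOLD = 0.2  # invented value; a real speck would calibrate this

def monitor(readings, threshold=MOTION_THRESHOLD):
    """Stay 'asleep' (record nothing) until movement is sensed, then log activity."""
    awake = False
    log = []
    for value in readings:
        if not awake and value > threshold:
            awake = True          # movement detected: wake up
        if awake:
            log.append(value)     # monitor activity once awake
    return log
```

[So a badger shuffling past would wake the speck and everything from that moment on gets recorded - which is presumably the point of the "wake up when it senses some movement" design: no battery wasted watching an empty sett.]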

Collaboration
Through collaborative activities like SAGES, we're seeing academics across a range of institutions (with different computing sources and sources of money) engaging in research which could not be done without the computational facilities their technical collaboration enables - intensive and large-scale data analysis that requires massive computing power. This is akin to the Large Hadron Collider, the data from which could not have been analysed prior to the super-computing era.

Innovation
Procurement is increasingly innovative and driven by the needs of learners. Scholarly communication is also evolving with Open Access presenting challenges as well as benefits - how do you motivate researchers to engage, control versions, respect copyright, etc.? Libraries are evolving as universities around the world invest in library spaces and move away from "librarians roaming the corridors shouting 'silence in the library'!" - Seriously, the library is a good place to remind students they are in the university; even if they're not using the resources they like to come in and soak up the atmosphere (particularly non-science students, who don't get to hang out in labs, and those not living in halls of residence). Edinburgh keeps its library open till midnight and still has to hustle out a few hundred students at that point - recognises the importance of "not having silly signs stuck up" and pointless legacy rules; using zones to allow for different work styles from vibrant to quiet. "Obviously, one has to support mobile computing" and recognise how many people will want to bring their own laptops - allow for enough workstations.

Conclusions
Student learning has changed - group work and digital assets. Research has changed, using technology to drive achievements that would not have been possible in the past. Technology's not just changing how people produce things but how people own things (more collective ownership). Libraries have changed and are continuing to change - mostly for the positive. More social learning - and considerable social benefits from learning and teaching with computers. Computers have not dehumanised learning just as email has not dehumanised communications. We'll see more research-led learning - because research is expressed in digital form and students have very easy access to the research output of the academics around them; research is published on websites, conference supp data etc - don't need to wait for it to be published in a journal now.

And the next 10 years will see even more dramatic change than the last 10. Eeeep.


Solving organisation underload: rethinking scholarly communications to add new conceptual value

"Open Access is going now," says Jan Velterop. "So I feel I can talk about something else - Beyond Open Access." That 'something' is organisation underload.

Too much of our data is too deeply hidden - we struggle to get the most out of it. Jan suggests the problem is not information overload but 'organisation underload'; a lack of organisational conceptual structures to manage all this information. The information overload aspect is going to increase; think expanding communication mechanisms e.g. blogs, peer-reviewed wikis - and why, says Jan, are these not being initiated by publishers?

He uses water as an analogy for information. When there's just a bit, we take it in (we drink it). When there's too much, we have to devise a means to navigate it - a boat. We need to find ways of presenting knowledge that helps us to do something useful and immediate with it. This means not just publishing articles but creating visualisations of conceptually connected data - "this is where the future lies". And as the communications process changes, we will need to think again about what skills and workflows are required to manage it.

As scientists we have traditionally focussed on the detail; now we complement this with a step back to see the bigger picture of how things are connected. This isn't feasible in the traditional manner of ingesting research, but it's this lateral thinking that produces breakthroughs (revealing new conceptual connections).

Jan talks about Knewco's software that mines data for concepts rather than simply words, and (I paraphrase, but I think the essence is that it) breaks the data down into triples that codify the relationships between lots of different pieces of data. Mapping these connections can reveal a powerful picture but once you scale this to millions of triples there will be redundancy that needs to be removed in order to focus on the valuable connections. This kind of semantic data analysis and mapping can make scientific literature even more useful by helping users find new, valuable sources of information - using the connections between literature to support new browse options in library catalogues and publisher websites.
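
[My rough understanding of the triples idea, as a toy sketch - the example triples and function names here are my own invention, not Knewco's actual software. Each statement is broken into a (concept, relation, concept) triple, and mapping which concepts share connections is what surfaces the non-obvious links:]

```python
from collections import defaultdict

# A handful of invented triples of the kind a concept-mining tool
# might extract from the literature (illustrative only).
TRIPLES = [
    ("aspirin", "inhibits", "COX-2"),
    ("COX-2", "involved_in", "inflammation"),
    ("ibuprofen", "inhibits", "COX-2"),
]

def build_graph(triples):
    """Index each subject concept against its (relation, object) pairs."""
    graph = defaultdict(set)
    for subj, pred, obj in triples:
        graph[subj].add((pred, obj))
    return graph

def connected_via(graph, a, b):
    """Concepts that link a and b in one hop - the 'new connections' idea."""
    a_targets = {obj for _, obj in graph[a]}
    b_targets = {obj for _, obj in graph[b]}
    return a_targets & b_targets
```

[Scale that to millions of triples and, as Jan says, the challenge flips from finding connections to pruning the redundant ones so the valuable links stand out.]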

[I have not done Jan's paper justice and would urge you to check out the wealth of additional interesting comments rippling around the Twittersphere (#UKSG09) - and must acknowledge that if I have managed to grasp his thesis at all it is because this is almost exactly what powers Publishing Technology's pub2web platform, which I have spent a good long time getting my head round!]


Lift off: opening of the 32nd (not the 31st) UKSG annual conference

We've all made it (just about) and the auditorium is packed as retiring chair Paul Harwood steps up and taps the mike. Paul welcomes us all and in particular thanks our student attendees, the committee and the exhibitors - pleased to see we've still got such a full exhibition hall despite 'the current climate'. Paul hands over to NASIG president Jill Emery who gives us an update on activities across the pond.

Jill references Twitter and as I sit here the Twitter channels are already going mad with #UKSG09 updates. I've got that UKSG flutter in my stomach - here we go again!


Sunday, March 29, 2009

Train chaos, not the best start…

In the hour I’ve just spent at Exeter St Davids station waiting for a connection to Torquay, I’ve had plenty of time to think about what to expect at the conference, as a first-timer. Since I’ve finally made it onto a train which I hope is heading in the right direction, I thought I’d use this time to jot down some pre-conference thoughts.

I’m massively grateful to have been given the opportunity, as a student, to come to UKSG. As this is the first conference I’ve ever attended, and I’ve only been working with serials for about six months, I actually have no idea what to expect (I’m also new to the whole blogging thing, so apologies if this is a little clumsy!). I’m looking forward to chatting to other professionals in the field, meeting up with my fellow student delegates, and hearing what the speakers have to say (some of the breakout sessions look fantastic – I had a really hard time only choosing four!).

I’m also told this is a fantastic opportunity for “networking”, which I think is professional-speak for “talking to people”. I recently decided to see what all the hype was about Twitter, and signed up… and have rarely been off it since! I’ve been following the buzz about UKSG on Twitter, and am looking forward to meeting some fellow Tweeple in person. I liked Todd’s suggestion about using Twitter to record your thoughts during a conference, but I’m not sure I’m organised enough for that just yet!

Well, I’m almost in sunny Torquay (it really is! Really enjoyed seeing the scenery get greener and the sky get bluer as I got further from London…) so I’d better power my laptop down and find my hotel. Will have to post this when I get there. Looking forward to seeing some of you at the buffet tonight!


Friday, March 27, 2009

Twittering conferences: public notes, the back channel conversation, and other uses

Twitter has entered the mainstream in a big way over the past few months. It seems like everywhere one turns someone mentions twittering this or tweeting that. For those who were at the Tools of Change Conference in New York earlier this year, the fascination with the service was such that you might think that twitter will replace the book, the e-book, the blog, and every other form of human communication. Those of us in the book and serials worlds can rest assured that twitter won’t displace publishers or libraries anytime soon.

A lot of people who haven’t used twitter ask, “What’s the point?” A fair question, at first glance. With a limit of 140 characters and streamed to the whole world, tweets are too short for meaningful analysis, and distributed so widely as to be too diffuse to be considered a “conversation”. And yet, their use and value can be quite amazing, particularly in the context of a conference.

Many people at meetings take notes, scribbling down their thoughts, noting interesting things that people said, or links to follow up on for more information. Why keep those notes in your hands only? Others might benefit from sharing these ideas – credited to the speaker if they are theirs. You might benefit too, by broadcasting your interest in those ideas. Several times, when I’ve expressed fascination or interest in some comment, others monitoring my tweets have passed along additional information. Similarly, people who I don’t know and have never met who are following the meeting have reached out to me with additional information or to talk further. This is part of the reason we all attend conferences, and twitter can help us find others with similar interests and business needs.

One of the most interesting uses of twitter is in the back channel discussions and commentary that can take place during a meeting. Much of the twittering during a meeting is parroting or paraphrasing what the speaker is saying. This can be useful, particularly in retrospect or for those who aren’t present, as I just noted. However, the simultaneous chatting that can occur on twitter is, I think, more valuable. It gives people an opportunity to think more critically about what is said, link to other resources and enrich the conversation. Certainly, this takes some skill at multi-tasking and perhaps detracts from the attention due the speaker. However, my feeling is that lecturing to people is a decidedly one-way form of communication, and while challenging in an in-person forum, virtual dialog, if respectful, can enhance the experience of the lecture format.

One can’t be everywhere all the time. Often at a busy conference, there are multiple sessions of interest that overlap, and choosing one session over another can be a challenge. At a meeting where many people are twittering, you can get a sense of what was being said and what people thought about it, as well as find and follow links to more information.

One of the benefits to twitter is that it’s short, sweet, and quick. A long blog post can take a long time to write and (hopefully) be well written. A tweet, with its tight size limit forces you to be judicious in what you say.

I’m a fairly active twitter-er, but by no means one of the chattiest on the network. Among the most active people I follow, Dave Winer has posted more than 13K tweets, and Robert Scoble has posted more than 19K tweets. How do these people find the time to do anything else, I wonder? But you can expect that I, and several others in the community, will be tweeting away as fast as our laptops, iPhones or Blackberries (or just possibly the network speed at the conference centre) can keep up.

Follow the UKSG 2009 conference on twitter by using the search button at the bottom of the twitter page with the hashtag #UKSG09. Not surprisingly, there’s already a lot of tweeting going on there.


Thursday, March 26, 2009

Counting down ... your blogging team and Twitter tags!

A quick note to confirm that LiveSerials will be springing back into action again over the next few days as the 32nd UKSG Annual Conference gets going down in sunny Torquay.

Your blogging team this year:
  • Bernie Folan (SAGE)
  • Bev Acreman (Taylor & Francis)
  • Charlie Rapple (TBI Communications)
  • Ginny Hendricks (Ardent Marketing)
  • Jill Emery (University of Texas Libraries)
  • Karen Halliday (Wiley-Blackwell)
  • Kirsty Meddings (CrossRef)
  • Laura Woods (City University)
  • Mandy Phillips (Edge Hill University)
  • Mark O'Loughlin (SAGE)
  • Nadine Edwards (University of Greenwich)
  • Todd Carpenter (NISO)
We've also got a UKSG Twitter channel for you to follow (@UKSG), and we're asking any other tweeters to join us in using the conference's Official Hash Tag, #UKSG09.

As usual we're looking forward to your feedback via the blog's comments and now via Twitter. Join us there in spirit if not in person!


Tuesday, March 17, 2009

Fancy joining the blogging team?

We're once again counting down the days to the UKSG's annual conference - registrations are closed and we're expecting another mammoth number of delegates to be descending on sunny Torquay at the end of this month for a couple of jam-packed days of learning, sharing, networking ... and blogging.

As ever, we'll be reporting back on all the action here on the LiveSerials blog. If you fancy joining us in our prime bloggers' seats, then do drop me a line. You don't need to be an experienced blogger (we'll provide all the instructions you need to get started), and you don't even need to have a laptop with you at the conference - we'd also welcome follow-up thoughts and opinions once you're safely back at your desk.

If you want to get a flavour of how we roll, take a look at some of the postings from previous conferences - 2006, 2007 and 2008.
