Monday, April 04, 2011
In 1994, Cameron Neylon's PhD supervisor told him he needed to spend half a day a week in the library reading the new journals. Five years later, when he finished his doctorate, he had Google. This, he says, makes him part of the last generation to remember the library as the place you go to access information, and the last generation to think of journals as paper objects. The idea of physically searching a paper index is now almost a joke.
Neylon defines his audience this afternoon pretty neatly as "the people who have to deliver for research and education, and also have to add value". In this task we - and researchers - have a shared problem: there's "too much stuff", information overload.
However, Neylon takes issue with Clay Shirky's claim that "it's not information overload, it's filter failure". This isn't a good way to think about the problem, he argues, because filters block. But surely it's good to block the stuff no one wants, to spare researchers a deluge of information they can't deal with, to apply standards, and so on? Neylon suggests that the filters we apply actually limit the researcher's ability to explore.
Filters are a problem when the researcher doesn't know what they are blocking and why. They can be useful, but the researcher needs to be allowed to choose the filters that they want to apply, and doesn't want publishers or librarians applying the filters for them. Google allows you to set your own filters.
Neylon gives the example of a chemistry paper that claimed to show something fairly revolutionary. Within hours of publication the experiment had been recreated by several researchers and proved wrong - one of the samples was contaminated. The paper was retracted, labelled only as withdrawn "for scientific reasons". The reaction of those unintended chemicals would be of interest to many, but the paper has been retracted with no further explanation. Failed experiments are as useful as successful ones, but they don't get published - they are filtered out.
The number of retractions is going up. That's a lot of failed experiments that could be useful to someone's research. Researchers don't know when they are repeating failed experiments - but they could, if that information were shared.
The gatekeeper was needed in a broadcast world - expensive printing and distribution needed centralising. Decisions needed to be made about what to publish and what to collect. The current flood of information is the "central research opportunity of our age".
"Every book its reader" - Ranganathan's third law of libraries. Filtering is not adding value. Rather than filter failure, Neylon believes we've got a "discovery deficit".
People can be the filter - social aggregation, annotation, critique. A network of linked objects - blogs, tweets, and RSS feeds - can all be found using Google, and together they form the researcher's own personal collection. Neylon doesn't want a collection that has been chosen for him by someone else - he wants to choose his own filters from those available to him.
Neylon's closing advice: we need to connect people with people so they can build discovery systems. Enable, don't block. Build platforms, not destinations; sell (or provide) services, not content. The content game is dead. Forget about filtering and control, and enable discovery.
Q: Are there any publishers who are currently enabling?
A: Most are making some effort, but we need to think about how to make them more effective. We're feeling our way together.
Q: There's a lot of rubbish on the web - are you saying publishers should be publishing this too?
A: No - there's nothing wrong with authorities labelling things as trustworthy; the problem is that no one publishes all of the experiments that didn't work. Publishers can mark up, validate, and so on. Just don't block the other stuff.