Data Analysis Will Drive Decision-Making in Research – Jay Katzen, Elsevier
Katzen starts off by reviewing reactions to the recent economic crisis. World leaders are making grand statements about continuing to invest in R&D: Gordon Brown, for example, has said that innovation is the way out of the economic crisis. But we know that lean times are here. There are hiring freezes at many US universities, Harvard among them; funding levels are dropping, so that Stanford, for example, is basing its budget on a 5% decrease; and private funders such as the Wellcome Trust expect their endowments to shrink.
Aside from the economic crisis, there are other factors to take into account: governments are playing a bigger role in research assessment, competition for funding is increasing, and there is a clear drive towards multidisciplinary research. These pre-existing issues with performance measurement are now exacerbated by the economy.
The UK’s Research Assessment Exercise (RAE) released its results in 2008, and as a consequence fourteen UK universities are getting less investment than the previous year and another forty institutions are receiving funding increases below inflation. So more than fifty UK universities have to make cutbacks somewhere, among them University College London, King’s College London, Imperial College and the University of Cambridge.
The Australian Research Council likewise aims to use metrics to monitor and measure research and to base funding decisions on them.
So lean times = lean research. But no one can do less research: we have to do more with less, and the key tools to help us deal with this will be built around data and analytics. More data will be needed for evaluation and decision-making at every level of the academic institution.
The scholarly landscape will change, and technology will be key
The United States is ranked number one, with 310k published articles in 1997 and 340k in 2007. In that decade China has moved from tenth place to second, and it is predicted that China will surpass the US in research output within the next few years. We need to be aware of this, particularly as research continues to cross national boundaries, which makes measurement even harder.
So one question is: is research optimised and efficient?
The average researcher spends 6.5 hours a week searching for information and 5 hours a week analysing it: more time is spent looking for information than actually using it, and this has to be reversed.
We also know that the average biomedical researcher is forty-two when he or she receives a first NIH research grant, and that the approval rate is only 15%. So researchers spend a significant amount of time developing ideas and proposals, much of which is wasted when so little is then approved for funding.
Katzen describes the investigation Elsevier has done into researchers’ workflows, an exercise that aimed to spot gaps and see where tools could be developed to improve efficiency, leading to increased published output and ultimately to improved institutional ranking.
Katzen therefore disagrees with Derk Haank’s view that there is no information overload and that technology will not play a significant role in the publishing industry over the next decade. He argues that we can now make revolutionary changes to the whole research environment by using technology to connect the mass of information, with new tools and methodologies to really analyse and evaluate performance through data. This will change how people practise research, and technology will facilitate that. As Katzen puts it, “we are moving from traditional publishers to information solution providers”.
New Mapping Methodology For Reputation and Performance Measurement
The tools for linking reputation and performance measurement don’t yet exist, and we cannot rely solely on journal-based classification any more, especially with multidisciplinary research and non-English-language research both increasing. The number and scope of journals is too limiting, and that level of aggregation is ineffective when we cannot see how different disciplines interconnect.
There is a new mapping methodology, presented at the National Science Foundation (NSF), which uses co-citation analysis to look at the quality of output. There are thirteen top-level categories but more than 40,000 subcategories, so if institutions want to understand where their competencies really lie, they should look much deeper into the data and use this mapping methodology.
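To make the idea concrete, here is a minimal sketch of co-citation analysis, the counting principle behind maps like these. The input format (each citing paper reduced to its list of references) and the paper names are toy assumptions for illustration, not the NSF’s actual data model or implementation.

```python
from collections import Counter
from itertools import combinations

# Toy input: each citing paper is reduced to the list of papers it references.
reference_lists = [
    ["paperA", "paperB", "paperC"],  # paper 1 cites A, B and C
    ["paperA", "paperB"],            # paper 2 cites A and B
    ["paperA", "paperB"],            # paper 3 cites A and B
    ["paperB", "paperC"],            # paper 4 cites B and C
]

# Two papers are co-cited whenever a third paper cites them both;
# the number of such citing papers is the strength of the link.
cocitation_strength = Counter()
for refs in reference_lists:
    for pair in combinations(sorted(set(refs)), 2):
        cocitation_strength[pair] += 1

# Strongly linked pairs cluster into research areas; those clusters,
# and the links between them, are what the maps visualise.
for pair, strength in cocitation_strength.most_common():
    print(pair, strength)
# ('paperA', 'paperB') 3  -- A and B are the most tightly linked pair
```

Clustering these links, rather than relying on which journal a paper happened to appear in, is what lets the map cut across journal-based classification.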
The same mapping technique can show which authors are driving certain areas, and it works at a national level too. When the NSF created these maps for the UK, the previous view had been that the UK had two areas of strength, in social sciences and in health services; but that doesn’t make sense, and the maps show that physics and maths are actually key drivers. The maps are also useful for seeing where you are vulnerable internationally and where you should be careful. The NSF says that this new mapping methodology “gives us vivid insight into rapidly evolving research areas and the relationships among them”.
The people thinking about these things are generally Deputy Vice-Chancellors, Deans and so on, but librarians can and should play a key and critical role in supporting their universities in becoming leaner organisations. One library director Katzen spoke with said that people view her as a procurement centre: she just buys the content, switches it on, and the job is done. But Katzen argues that the library role is significantly undervalued; libraries can and should be looking into the performance of their institution in order to adapt their services and become more involved in the research process.
Audience question 1: the analysis is all very well, but who will actually be the first to move and make the paradigm shift? Katzen answers that it will be the research councils and university deans who drive the need for this methodology (they already are), with publishers supporting it with data and tools.
Audience question 2: Peter Shepherd asks whether, given the wealth of historical citation data, the method could be applied retrospectively to trace what triggered critical and sudden changes in the past. Katzen replies yes: there is no reason why you cannot look beyond today’s performance, and the method would indeed allow you to trace that. Shepherd notes that this is something scientists would get really excited about.
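Since the underlying data is just dated citation records, the retrospective version Shepherd asks about is the same counting exercise run once per time window. The sketch below assumes a hypothetical record format of (year of citing paper, its reference list) and is an illustration of the principle only, not any product’s implementation.

```python
from collections import Counter
from itertools import combinations

# Toy citation records: (publication year of the citing paper, its references).
records = [
    (1995, ["paperA", "paperB"]),
    (1996, ["paperA", "paperB"]),
    (2005, ["paperB", "paperC"]),
    (2006, ["paperB", "paperC"]),
]

def cocitation_map(records, start, end):
    """Co-citation strengths counted over citing papers published in [start, end)."""
    counts = Counter()
    for year, refs in records:
        if start <= year < end:
            for pair in combinations(sorted(set(refs)), 2):
                counts[pair] += 1
    return counts

# Rebuilding the map window by window makes shifts visible: links that appear,
# strengthen or vanish between windows mark the sudden changes in a field.
print(cocitation_map(records, 1990, 2000))  # the A-B link dominates the 1990s
print(cocitation_map(records, 2000, 2010))  # by the 2000s B-C has taken over
```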
Labels: uksg09