Bibliometrics in the library Wageningen UR Library experience

Presentation for the bibliometricians meeting, March 5th at the Open University, Milton Keynes.

Published in: Education

Transcript

  • 1. Bibliometrics in the library: Wageningen UR Library experience. Milton Keynes, March 5th 2013. Wouter Gerritsma
  • 2. Contents
       - Research Evaluation in the Netherlands
       - CRIS & Repository @ Wageningen
       - Bibliometrics module
       - Research questions
       - Developments in the marketplace
       - Lessons learned
       - Some advice
  • 3. Research assessment in the Netherlands
       - Supervised by VSNU/QANU
         ● 6-year cycle of external peer reviews
         ● Midterm review after 3 years
         ● Unit of analysis (in Wageningen): Graduate schools
       - Citation analyses are not stipulated in the current Standard Evaluation Protocol (SEP)
         ● This has nevertheless become mandatory at Wageningen UR, also at the social sciences department and for the research institutes
  • 4. SEP criteria
       - Quality (including international academic reputation and PhD training)
       - Productivity (the relationship between input and output)
       - Societal relevance (including valorisation)
       - Vitality and feasibility (the ability to react adequately to important changes in the environment)
  • 5. Metis, our CRIS
       - Metis is a Current Research Information System (CRIS)
       - Data entry at chair group level
       - Quality control by the library
         ● Locating full text (uploading to the e-depot)
         ● Maintenance of journal lists
         ● Document type assignment and inclusion of DOIs
       - Compulsory output registration
         ● Research assessments are based only on Metis-registered publications
       - Information on all labour relations of faculty and staff
       - Information on all projects
  • 6. Repository or Institutional Bibliography?
       - Wageningen Yield (WaY) is the repository of Wageningen UR
         ● Synchronized overnight with the updates from Metis
         ● WaY contains metadata descriptions of all Wageningen UR publication output: >190,000 items
         ● WaY is our OA repository: >40,000 items
         ● WaY is our tool for citation analyses: >22,000 publications
         ● Advanced bibliometrics
  • 7. Full screen image with title
  • 8. How do we compare numbers?
       - Scientist Z. Math has a publication from 2002 with 17 citations
       - Scientist M. Biology has a publication from 2008 with 32 citations
  • 9. Baselines for Mathematics
  • 10. Baselines for Molecular Biology
  • 11. For a single publication
       - Zee, F.P.v.d., G. Lettinga & J.A. Field (2001). Azo dye decolourisation by anaerobic granular sludge. Chemosphere 44: 1169-1176.
         ● Citations from WoS: 94
       - Journal: Chemosphere, categorised by ESI in Environment/Ecology
       - Baseline data for Environment/Ecology, articles from 2001:
         ● On average: 19.36 citations
         ● Top 10%: 44 citations; Top 1%: 141 citations
       - Relative impact: 94 / 19.36 = 4.9
       van Veller, M.G.P. et al. (2010). Bibliometric analyses on repository contents for the evaluation of research at Wageningen UR. In: Qualitative and Quantitative Methods in Libraries: Theory and Applications. A. Katsirikou and C.H. Skiadas. p. 19-26. http://edepot.wur.nl/7266
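The arithmetic on this slide generalizes directly. Below is a minimal Python sketch of that field-normalized comparison, assuming the ESI baseline figures are available; the dictionary and function names are illustrative, and only the Environment/Ecology 2001 baselines quoted above are filled in.

```python
# Minimal sketch of the relative-impact calculation from slide 11.
# Baseline values are the ESI figures quoted above for Environment/Ecology, 2001;
# the data structure and function names are illustrative, not part of any real API.

ESI_BASELINES = {
    # (ESI field, publication year) -> (world average, top-10% threshold, top-1% threshold)
    ("Environment/Ecology", 2001): (19.36, 44, 141),
}

def relative_impact(citations: int, field: str, year: int) -> float:
    """Citations of one paper divided by the world average for its field and year."""
    world_avg, _, _ = ESI_BASELINES[(field, year)]
    return citations / world_avg

def percentile_class(citations: int, field: str, year: int) -> str:
    """Place a paper relative to the ESI top-10% and top-1% thresholds."""
    _, top10, top1 = ESI_BASELINES[(field, year)]
    if citations >= top1:
        return "top 1%"
    if citations >= top10:
        return "top 10%"
    return "below top 10%"

# The Chemosphere article above: 94 WoS citations, Environment/Ecology, 2001
print(round(relative_impact(94, "Environment/Ecology", 2001), 1))  # 4.9
print(percentile_class(94, "Environment/Ecology", 2001))           # top 10%
```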
  • 12. Example of results
  • 13. The actual publications and their impact are provided
  • 14. Research credits report (partly)
  • 15. Slice and dice any way you want
  • 16. Advanced bibliometric indicators
       - Follow Moed (1995) as closely as possible, but...
       - Web of Science is used for citation data
         ● We can't make corrections for self-citations
       - Essential Science Indicators for baseline data (world average, Top 10% and Top 1%)
         ● Limited number of research fields (22), BUT:
       - We can determine the representativeness of the citation analysis!
  • 17. Representativeness
         Publication type               #Pubs
         Refereed articles                324
         Non-refereed articles              7
         Books                              1
         Refereed book chapters            36
         Non-refereed book chapters        13
         PhD Theses                        45
         Conference papers                137
         Total Academic Publications      563
  • 18. Representativeness
         Publication type               #Pubs    WoS    Repr.
         Refereed articles                324    288     89%
         Non-refereed articles              7
         Books                              1
         Refereed book chapters            36
         Non-refereed book chapters        13
         PhD Theses                        45
         Conference papers                137
         Total Academic Publications      563
         (The slide also carries a "Google Scholar" annotation for the output not covered by WoS.)
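The representativeness figure in this table is a simple ratio. A small sketch of how it could be computed per publication type follows; the counts are the ones shown on the slide, and the data structure is illustrative rather than the library's actual workflow.

```python
# Representativeness per publication type: the share of WaY items of that type
# that could be matched in Web of Science. Counts are the ones on slide 18.

counts = {
    # publication type -> (items in WaY, items matched in WoS)
    "Refereed articles": (324, 288),
}

def representativeness(in_way: int, in_wos: int) -> float:
    return in_wos / in_way

for pub_type, (in_way, in_wos) in counts.items():
    print(f"{pub_type}: {representativeness(in_way, in_wos):.0%}")  # Refereed articles: 89%
```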
  • 19. Prospective versus retrospective analyses
       - CWTS normally performs prospective analyses
         ● Current researchers, 10 years back
         ● Missing some retired bigshots!
       - For retrospective analyses you need to keep track of the actual publication record. This is difficult for external parties.
         ● Head-tail problems
       - No research on differences in outcomes of prospective versus retrospective analyses
         ● We need research in this area!
  • 20. Self-citations
       - CWTS performs corrections for self-citations
       - Correcting for self-citations in Web of Science is incomplete
         ● As long as ResearcherID is not fully adopted this will remain impossible in WoS
       - Correcting for self-citations in Scopus is possible
       - Belgian research has shown that self-citations do not have a tremendous influence
       Glänzel, W., K. Debackere, B. Thijs & A. Schubert (2006). A concise review on the role of author self-citations in information science, bibliometrics and science policy. Scientometrics, 67(2): 263-277. http://dx.doi.org/10.1007/s11192-006-0098-9
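As a sketch of what an author-level self-citation correction amounts to (feasible with Scopus data according to the slide, not with WoS while ResearcherID uptake is incomplete): drop every citing paper that shares at least one author with the cited paper. The data shapes below are illustrative, not an actual WoS or Scopus response format.

```python
# Illustrative self-citation filter: a citation counts only if the citing paper
# shares no author with the cited paper. Author-name matching is the hard part
# in practice (hence the ResearcherID caveat above); exact string match is used
# here purely for the sketch.

def non_self_citations(cited_authors: set, citing_author_lists: list) -> int:
    """Count citing papers with no author overlap with the cited paper."""
    return sum(1 for citing in citing_author_lists if not (set(citing) & cited_authors))

cited = {"van der Zee, F.P.", "Lettinga, G.", "Field, J.A."}
citing = [
    {"Field, J.A.", "Colleague, A."},   # shares an author -> treated as a self-citation
    {"Other, B.", "Another, C."},       # independent citation
]
print(non_self_citations(cited, citing))  # 1
```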
  • 21. Some observations on commercial analytical tools
  • 22. Web of Science
       - Citation data (you can include citations from other databases on WoK)
       - API to download citation data
       - Baselines from ESI
       - InCites is more advanced, but where do you manage the information?
       - Author disambiguation is a major problem
  • 23. Scopus
       - Citation data obtainable through an API
       - Benchmarking with SciVal Strata, no API yet
       - Not yet fully developed, major changes coming up
  • 24. CWTS monitor
       - So far the most elegant and comprehensive citation analysis tool (still in beta), to be launched soon
       - Citation database agnostic!
  • 25. Google Scholar
       - Very popular with social scientists and arts & humanities researchers
       - Have you ever retrieved more than 1000 results from any Google product?
       - Google Scholar can't count
       - Harzing's Publish or Perish software does a decent job
  • 26. Altmetrics
       - Quickly developing
         ● ScienceCard
         ● Total-Impact
         ● Readermeter
         ● Microsoft Academic Search
         ● etc.
       - We are looking into adding article-level metrics on top of WaY
       Wouters, P. & R. Costas (2012). Users, narcissism and control. Utrecht, NL: SURFfoundation. http://www.surffoundation.nl/en/publicaties/Pages/Users_narcissism_control.aspx
  • 27. CRIS and bibliometric analysis tools
       - If you maintain a CRIS, why should you maintain your researchers and organisation structure in a bibliometric analysis tool as well?
       - Do commercial packages have ways to publicize the results for scrutiny by the researchers?
  • 28. Lessons learned
  • 29. Matching Wageningen Yield and WoS
       - WoS: 9577 articles; WaY: 10933 articles
       - Missing in WaY: 807 articles; missing in WoS: 1159 articles
       - 1161 peer-reviewed articles not in ISI journals
       - It is a lot of hard work to keep track of all publications. The library can and should do a better job than commercial service providers.
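A rough sketch of the kind of matching behind these numbers: pair WaY records with WoS records on DOI first, then fall back to a normalized title plus publication year. The record fields ("doi", "title", "year", "id") are assumptions for the sketch, not the actual Metis/WaY or WoS schema, and real exports need more normalization than this.

```python
# Illustrative WaY <-> WoS matching on DOI with a title+year fallback.

import re

def norm_title(title: str) -> str:
    """Lower-case and strip punctuation so small formatting differences don't block a match."""
    return re.sub(r"[^a-z0-9 ]", "", title.lower()).strip()

def match_way_to_wos(way_records: list, wos_records: list) -> dict:
    by_doi = {r["doi"].lower(): r for r in wos_records if r.get("doi")}
    by_title = {(norm_title(r["title"]), r["year"]): r for r in wos_records}
    matched, missing_in_wos = {}, []
    for rec in way_records:
        doi = (rec.get("doi") or "").lower()
        hit = by_doi.get(doi) or by_title.get((norm_title(rec["title"]), rec["year"]))
        if hit:
            matched[rec["id"]] = hit
        else:
            missing_in_wos.append(rec["id"])
    return {"matched": matched, "missing_in_wos": missing_in_wos}
```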
  • 30. Journal selection and impact @WUR
  • 31. Why in the library?
       - Library is the functional manager of Metis / WaY because of its wide experience with bibliographic metadata
       - Library manages the contracts with the publisher(s) of the external databases that are being used
       - Library has experience in developing and maintaining large databases
       - Library has ample experience in searching complicated databases such as Web of Science
  • 32. Advantages of using Metis / WaY
       - Improvements in publication lists, etc. are recorded
       - Knowledge of, and experience with, bibliometric analyses is better institutionalized
       - More visibility through Open Access management
       - Clarity / transparency for researchers
       - Analysis of a single unit within the institute offers advantages for the organization as a whole
       - Better understanding of our own researchers
         ● We know where they publish
         ● We know what they cite
         ● We know something about their impact
  • 33. Raising library awareness
       - Improvement of the (metadata) quality in the repository
       - Quality has led to compulsory registration for research assessments
       - Presentations for research groups during the preparation for peer reviews
       - Presentations based on detailed studies of single groups
       - Library gives advice on elements of publication strategies for groups and individuals
         ● There is a huge demand for these workshops
  • 34. Closing the circle
  • 35. My advice
       - Start small, gain experience
       - Show you can pull it off
       - Be transparent!
       - How much is your university spending on research evaluations?
       - Invest those resources in your own systems
  • 36. Thank you
       http://viaf.org/viaf/285392263/
       http://orcid.org/0000-0001-7274-0698
       http://wu.academia.edu/WouterGerritsma
       http://www.researcherid.com/rid/A-4161-2008
       http://www.mendeley.com/profiles/wouter-gerritsma
       http://www.researchgate.net/profile/Wouter_Gerritsma
       http://scholar.google.com/citations?user=3iDBE-MAAAAJ
       http://academic.research.microsoft.com/Author/34373815
       http://www.narcis.nl/person/info:eu-repo/dai/nl/33714253X
