Providing Tools for Author Evaluation - A case study

Elsevier/Scopus' presentation at InSciT2006.

  1. Providing Tools for Author Evaluation - A case study
     Merida, October 27, 2006. Jaco Zijlstra, Director, Scopus
  2. Why do we evaluate scientific output?
     • Who evaluates: government, funding agencies, institutions, faculties, libraries, researchers
     • What for: funding allocations, grant allocations, policy decisions, benchmarking, promotion, collection management
  3. Criteria for effective evaluation
     • Objective
     • Quantitative
     • Relevant variables
     • Independent variables (avoid bias)
     • Globally comparative
  4. Why do we evaluate authors?
     • Promotion
     • Funding
     • Grants
     • Policy changes
     • Research tracking
     It is important to get this right.
  5. Author evaluation in Scopus
     • Scopus is designed as an end-user tool.
     • Scopus end-user research identified the most common tasks users have to fulfil when doing literature research:
       • Find (new) articles in a familiar subject field
       • Stay up to date
       • Get an overview or understanding of a new subject field
       • Find information that helps in evaluating a specific author
     • Author evaluation is a key task within literature research.
  6. Challenges in evaluation
  7. Data requirements for evaluation
     • Core metrics: citation counts, article counts, usage counts
     • Supporting data: broad title coverage, affiliation names, author names (including co-authors), references, subject categories, ISSN (electronic and print), article length (page numbers), publication year, language, keywords, article type, etcetera
     There are limitations that complicate author evaluation.
  8. Data limitations
     • Author disambiguation
     • Normalising affiliations
     • Subject allocations may vary
     • Matching authors to affiliations
     • Deduplication/grouping
     • Etcetera
     Finding and matching all the information relevant to evaluating authors is difficult.
  9. The Author Identifier
  10. The challenge: finding an author
      • How do you distinguish results belonging to one author from those belonging to other authors who share the same name?
      • How can you be confident that your search has captured all results for an author whose name is recorded in different ways?
      • How can you be sure that names with unusual characters, such as accents, have been included in all their variants?
  11. The solution: author disambiguation
      • We approach these problems by using the data available in the publication records, such as:
        • Author names
        • Affiliation
        • Co-authors
        • Self-citations
        • Source title
        • Subject area
      • ... and use this data to group articles that belong to a specific author.
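The grouping idea on this slide can be illustrated with a short sketch. This is not Scopus's actual algorithm: the signals, weights, and threshold below are invented for illustration, and real disambiguation would weigh far more evidence (self-citations, shared references, name variants).

```python
# Minimal sketch of grouping records for author disambiguation.
# NOT the Scopus algorithm: signals, weights, and the threshold are
# hypothetical, chosen only to illustrate the approach.

def similarity(a: dict, b: dict) -> float:
    """Score the likelihood that two records belong to the same author."""
    score = 0.0
    if a["name"] == b["name"]:
        score += 0.3
    if a["affiliation"] and a["affiliation"] == b["affiliation"]:
        score += 0.3
    if set(a["coauthors"]) & set(b["coauthors"]):
        score += 0.3  # shared co-authors are strong evidence
    if a["subject"] == b["subject"]:
        score += 0.1
    return score

def group_records(records: list[dict], threshold: float = 0.6) -> list[list[dict]]:
    """Greedily add each record to the first group it matches well enough."""
    groups: list[list[dict]] = []
    for rec in records:
        for group in groups:
            if any(similarity(rec, member) >= threshold for member in group):
                group.append(rec)
                break
        else:
            groups.append([rec])
    return groups

records = [
    {"name": "S. Albracht", "affiliation": "U Amsterdam",
     "coauthors": ["J. Smith"], "subject": "Biochemistry"},
    {"name": "S. Albracht", "affiliation": "U Amsterdam",
     "coauthors": ["J. Smith", "K. Lee"], "subject": "Biochemistry"},
    {"name": "S. Albracht", "affiliation": "TU Delft",
     "coauthors": ["M. Jansen"], "subject": "Physics"},
]
print([len(g) for g in group_records(records)])  # -> [2, 1]: two distinct "S. Albracht"s
```

A high threshold in such a scheme is exactly what the later slide on precision versus recall describes: uncertain pairs stay split rather than being merged wrongly.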
  12. Step 1: Searching for an author. Enter the name in the Author Search box.
  13. Step 2: Select your author. Which S. Albracht are you looking for? The available information is shown for each candidate.
  14. Step 3: The author details, with the unique author ID and the matched documents.
  15. No 100% recall... the same author can appear with different author IDs.
  16. Why were these not matched?
      • Quality above all: precision (>99%) was given priority over recall (>95%).
      • Some records carry too little information to match with enough certainty: for instance, affiliations missing or from different departments, all co-authors different or absent, and no shared references.
      • With many millions of authors, there will always be some unmatched papers and authors.
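The precision-over-recall choice amounts to setting a high match threshold. A toy calculation (scores and ground truth entirely made up) shows how raising the bar trades missed matches for fewer wrong merges:

```python
# Toy illustration of the precision-vs-recall trade-off when deciding
# whether two records belong to the same author. All numbers are invented.

pairs = [  # (similarity score, are these truly the same author?)
    (0.95, True), (0.80, True), (0.65, True), (0.60, False),
    (0.55, True), (0.40, False), (0.20, False),
]

def precision_recall(threshold: float) -> tuple[float, float]:
    tp = sum(1 for s, same in pairs if s >= threshold and same)
    fp = sum(1 for s, same in pairs if s >= threshold and not same)
    fn = sum(1 for s, same in pairs if s < threshold and same)
    return tp / (tp + fp), tp / (tp + fn)

print(precision_recall(0.5))  # (0.8, 1.0): finds every true match, but merges wrongly
print(precision_recall(0.7))  # (1.0, 0.5): never merges wrongly, misses some matches
```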
  17. Solution: author feedback
  18. The feedback loop includes a Scopus check: a dedicated team investigates feedback requests to guarantee quality.
  19. Author Identifier feedback
      • Invited 600,000 authors to check their profiles.
      • 2,000 feedback queries have been handled since the release this summer:
        • 85% concern attributing several articles to one author
        • 15% are other requests, including changing the preferred spelling and splitting incorrectly matched articles
      • Feedback has been overwhelmingly positive.
  20. Step 3: The author details. We have matched the author to documents via the unique author ID; now what?
  21. The Citation Tracker: to evaluate authors
  22. Why the Scopus Citation Tracker?
      • Extensive research with hundreds of users and librarians showed a clear need to spot and track citation patterns easily and visually.
      • Scientists told us they want to move away from pre-defined metrics and analyse article-level data in order to more easily:
        • Identify trends in a particular subject field
        • Determine the influence of specific research groups
        • Evaluate individual authors
  23. Evaluation data: an instant citation overview for an author.
  24. Step 4: The citation overview, with the option of excluding self-citations.
  25. Excluding self-citations.
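The slides show the self-citation filter in the interface but not its rule. One plausible definition, used here purely as an assumption, is that a citation counts as a self-citation when the citing paper shares at least one author identifier with the cited paper:

```python
# Sketch of excluding self-citations. The rule (any shared author ID
# makes a citation a self-citation) and all IDs are assumptions for
# illustration, not a documented Scopus definition.

cited_paper_authors = {7004212771, 7102388801}  # hypothetical author IDs

citing_papers = [
    {"id": "doc1", "authors": {7004212771, 6602745662}},  # shares an author
    {"id": "doc2", "authors": {8901234567}},              # independent
    {"id": "doc3", "authors": {7102388801}},              # shares an author
]

external = [p for p in citing_papers
            if not (p["authors"] & cited_paper_authors)]
print(f"{len(citing_papers)} citations, {len(external)} excluding self-citations")
# -> 3 citations, 1 excluding self-citations
```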
  26. Export to Excel for further analysis.
  27. But not: [screenshot marked with an X]
  28. The h-index in Scopus: an alternative approach to evaluating an author.
  29. Step 1: The author details.
  30. Step 2: Ranked author records.
  31. Step 3: Scroll to the h-index. "Dr. Albracht has index 30, as 30 of his 127 papers have at least 30 citations each, and the remaining papers have no more than 30 citations each."
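The quoted definition maps directly onto a few lines of code. In this sketch the citation counts are invented, arranged only so that they reproduce the h-index of 30 across 127 papers described on the slide:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the rank-th paper still has >= rank citations
        else:
            break
    return h

# Invented distribution: 30 well-cited papers, 97 papers below 30 citations,
# mirroring the slide's description of Dr. Albracht's 127 papers.
counts = [60] * 30 + [10] * 97
print(h_index(counts))  # -> 30
```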
  32. Additional tools for evaluating authors: PatentCites and WebCites.
  33. PatentCites and WebCites
      • PatentCites are citations of Scopus articles that appear in official patent documents from the US Patent Office (USPTO), the European Patent Office (EPO), or the World Intellectual Property Organisation (WIPO).
      • WebCites are citations of Scopus articles that appear in selected web sources:
        • Theses and dissertations (NDLTD)
        • Courseware (MIT)
        • Institutional repositories (DiVA, U of Toronto, Caltech, ...)
        • More to follow
  34. PatentCites per document: Dr. Albracht has additional citations...
  35. PatentCites for the document.
  36. More citations...
  37. WebCites for the document.
  38. Information available in Scopus for Dr. Albracht
      • 1,375 citations received
      • h-index = 30
      • Matched to 149 co-authors
      • 127 published papers matched to the ID
      • Cited 8 times in patents
      • 24 credible web citations in high-quality sources
  39. The quality of evaluation depends on the quality of the underlying data
      • Scopus is an end-user tool, so it must address this key task within literature research.
      • We have invested heavily in making author information:
        • Objective
        • Quantitative
        • Rich in relevant variables
        • Based on independent variables (to avoid bias)
        • Globally comparative
  40. Evaluating authors is at the root of any evaluation of scientific output
      • It matters to governments, funding agencies, institutions, faculties, libraries, and researchers.
      • Starting with authors, we can aggregate.
      • Aggregated data is the domain of bibliometrics, and the next step for Scopus.
