Providing Tools for Author Evaluation - A case study

Elsevier/Scopus' presentation at InSciT2006.

    Presentation Transcript

    • Providing Tools for Author Evaluation A case study _______________________________________ Merida, October 27, 2006 Jaco Zijlstra Director Scopus
    • Why do we evaluate scientific output?
      • Government
      • Funding Agencies
      • Institutions
      • Faculties
      • Libraries
      • Researchers
      • Funding allocations
      • Grant Allocations
      • Policy Decisions
      • Benchmarking
      • Promotion
      • Collection management
    • Criteria for effective evaluation
      • Objective
      • Quantitative
      • Relevant variables
      • Independent variables (avoid bias)
      • Globally comparative
    • Why do we evaluate authors?
      • Promotion
      • Funding
      • Grants
      • Policy changes
      • Research tracking
      Important to get it right
    • Author evaluation in Scopus
      • Scopus is designed as an end user tool
      • Scopus end user research identified the most common tasks users must fulfil when doing literature research
        • Find (new) articles in a familiar subject field
        • Stay up-to-date
        • Get an overview or understanding of a new subject field
        • Find information that helps in evaluating a specific author
      • Author evaluation = key task within literature research
      • Challenges in evaluation
      • Broad title coverage
      • Affiliation names
      • Author names
      • Including co-authors
      • References
      • Subject categories
      • ISSN (e and print)
      • Article length (page numbers)
      • Publication year
      • Language
      • Keywords
      • Article type
      • Etcetera …
      Data requirements for evaluation: there are limitations that complicate author evaluation
      • Citation counts
      • Article counts
      • Usage counts
    • Data limitations
      • Author disambiguation
      • Normalising Affiliations
      • Subject allocations may vary
      • Matching authors to affiliations
      • Deduplication/grouping
      • Etcetera
      Finding/matching all relevant information to evaluate authors is difficult
      • The Author Identifier
    • The Challenge: finding an author
      • How to distinguish results between those belonging to one author and those belonging to other authors who share the same name?
      • How to be confident that your search has captured all results for an author when their name is recorded in different ways?
      • How to be sure that names with unusual characters, such as accents, have been included, along with all their variants?
    • The Solution: Author Disambiguation
      • We have approached these problems by using the data available in publication records, such as:
        • Author Names
        • Affiliation
        • Co-authors
        • Self citations
        • Source title
        • Subject area
      … and used this data to group articles that belong to a specific author
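The grouping step described above can be pictured as a pairwise similarity score over those signals, merging two records when the score clears a threshold. The sketch below is illustrative only: the field names, weights, and threshold are assumptions for the example, not Scopus' actual disambiguation algorithm.

```python
def match_score(rec_a, rec_b):
    """Count the evidence that two publication records belong to the
    same author. Fields and weights are hypothetical."""
    score = 0
    if rec_a["affiliation"] == rec_b["affiliation"]:
        score += 2  # shared affiliation is strong evidence
    # each shared co-author adds one point
    score += len(set(rec_a["coauthors"]) & set(rec_b["coauthors"]))
    if rec_a["subject"] == rec_b["subject"]:
        score += 1  # same subject area is weaker evidence
    return score

a = {"affiliation": "UvA", "coauthors": ["J. Smith"], "subject": "Biochem"}
b = {"affiliation": "UvA", "coauthors": ["J. Smith", "K. Lee"], "subject": "Biochem"}
grouped = match_score(a, b) >= 3  # records merged under one author ID
```

A real system would also weigh self-citations, source titles, and name-variant distance, as the slide lists; the point here is only that independent weak signals are combined before grouping.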
    • Step 1: Searching for an author Enter name in Author Search box
    • Step 2: Select your author Which S. Albracht are you looking for? Available information
    • Step 3: The Author details Unique Author ID & matched documents
    • No 100% recall… The same author with different author IDs
    • Why were these not matched?
      • Quality above all:
        • Precision (>99%) was given priority over recall (>95%)
      • Not enough information to match with enough certainty
        • For instance: missing affiliations or different departments, entirely different (or no) co-authors, and no shared references
      • As there are many millions of authors, some papers and authors will remain unmatched
    • Solution: Author Feedback
    • Feedback loop includes a Scopus check: a dedicated team investigates feedback requests to guarantee quality
    • Author Identifier Feedback
      • Invited 600,000 authors to check their profiles
      • 2,000 feedback queries handled since release this summer
        • 85% are attributing several articles to one author
        • 15% other, including:
          • Changing preferred spelling
          • Split incorrectly matched articles
        • Feedback overwhelmingly positive
    • Step 3: The Author details … we have matched the author to documents – now what? Unique Author ID & matched documents
      • The Citation Tracker
      • - to evaluate authors
    • Why the Scopus Citation Tracker?
      • Extensive research with hundreds of users and librarians showed a clear need for easily spotting and tracking citation patterns, visually
      • Scientists told us:
        • They want to move away from pre-defined metrics
        • Analyze article-level data in order to more easily:
          • Identify trends in a particular subject field
          • Determine influence of specific research groups
          • Evaluate individual authors
    • Evaluation Data Instant citation overview for an author
    • Step 4: The citation overview Excluding self citations
    • Excluding Self-Citations
    • Export to excel for further analysis
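Excluding self-citations amounts to discarding citations that come from papers the evaluated author (co-)wrote. A minimal sketch, assuming each record lists its authors and the IDs of papers that cite it (hypothetical field names, not the Scopus data model):

```python
def citations_excluding_self(papers, author_id):
    """Count citations to an author's papers, skipping citations that
    originate from the author's own papers."""
    # IDs of papers (co-)written by the author being evaluated
    own = {p["id"] for p in papers if author_id in p["authors"]}
    total = 0
    for p in papers:
        if p["id"] in own:
            # count only citing papers the author did not (co-)write
            total += sum(1 for c in p["cited_by"] if c not in own)
    return total

papers = [
    {"id": "p1", "authors": ["A1"], "cited_by": ["p2", "x1"]},
    {"id": "p2", "authors": ["A1"], "cited_by": ["x2"]},
    {"id": "x1", "authors": ["B9"], "cited_by": []},
]
count = citations_excluding_self(papers, "A1")  # p2 -> p1 is a self-citation
```

Here the raw count is 3, but the citation from p2 to p1 is a self-citation, so the adjusted count is 2.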
    • But not: X
    • The H-index in Scopus An alternative approach to evaluating an author
    • Step 1: The Author details
    • Step 2: Ranked Author records
    • Step 3: Scroll to H-index “Dr. Albracht has an index of 30, as 30 of his 127 papers have at least 30 citations each, and the remaining papers have no more than 30 citations each”
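The quoted definition can be computed directly from a list of per-paper citation counts; a minimal sketch:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

example = h_index([10, 8, 5, 4, 3])  # -> 4: four papers with >= 4 citations
```

For Dr. Albracht, feeding the 127 per-paper citation counts into this function would return 30, matching the slide.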
    • Additional tools for evaluating authors PatentCites and WebCites
    • PatentCites and WebCites
      • PatentCites are citations of articles in Scopus that appear in official patent documents
        • US Patent Office (USPTO), European Patent Office (EPO) or World Intellectual Property Organisation (WIPO)
      • WebCites are citations of articles in Scopus that appear in selected web sources
        • Theses and Dissertations (NDLTD)
        • Courseware (MIT)
        • Institutional Repositories (DiVA, U of Toronto, Caltech…)
        • More to follow
    • PatentCites per document Dr. Albracht has additional citations…
    • PatentCites for the document
    • More citations …
    • WebCites for the document
    • Information available in Scopus for Dr. Albracht
      • 1375 Citations received
      • H-Index = 30
      • Matched to 149 Co-Authors
      • Published 127 papers matched to ID
      • Cited 8 times in patents
      • 24 credible web citations – in high quality sources
    • Quality of evaluation is dependent on quality of underlying data
      • Scopus = end user tool
        • Must address this key task within literature research
      • Invested heavily in making author information
        • Objective
        • Quantitative
        • Providing relevant variables
        • Independent variables (avoid bias)
        • Globally comparative
    • Evaluating authors is at the root of any evaluation of scientific output
      • Government
      • Funding Agencies
      • Institutions
      • Faculties
      • Libraries
      • Researchers
      • Starting with authors we can aggregate
      • Aggregated data = field of bibliometrics
      • Next step for Scopus