ResEval: Resource-oriented Research Impact Evaluation platform
Transcript

  • 1. An Open and Resource-oriented Research Impact Evaluation platform
    By: Muhammad Imran
  • 2. OUTLINE
    Brief introduction to research evaluation
    Problems and motivation
    Primary users of citation data in research evaluation
    Existing solutions and their problems
    Proposed solution and novelties
    Languages & Architecture
    Conclusion
  • 3. HOW IS RESEARCH EVALUATION DONE?
    • Examples of data sources / methodologies:
    • 4. Number and value of grants awarded
    • 5. Number of awards (e.g. Nobel Prizes)
    • 6. Number of patents
    • 7. Number of post-graduate researchers
    • 8. Publication counts
    • 9. Citation counts
    • 10. Bibliometric indices (H-index, G-index, W-index; see the sketch after this list)
    • 11. Peer Evaluation
    • 12. Expensive, time consuming
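    A minimal computation sketch (not part of the original slides) of how two of the bibliometric indices listed above are derived from an author's per-paper citation counts; the example citation list is made up:

        def h_index(citations):
            # h-index: the largest h such that h papers have at least h citations each
            ranked = sorted(citations, reverse=True)
            return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

        def g_index(citations):
            # g-index: the largest g such that the g most cited papers together have at least g*g citations
            ranked = sorted(citations, reverse=True)
            total, g = 0, 0
            for rank, c in enumerate(ranked, start=1):
                total += c
                if total >= rank * rank:
                    g = rank
            return g

        papers = [42, 18, 18, 10, 7, 5, 3, 1, 0]   # hypothetical citation counts
        print(h_index(papers), g_index(papers))    # prints: 5 9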
  • WHY DO WE NEED RESEARCH EVALUATION?
    Evaluation criteria for faculty recruitment and promotion
    Funding of research projects
    Finding experts in a specific area
    Government policy for future investment
    Evaluation of countries, institutes, departments, and individuals
  • 13. PRIMARY USERS OF CITATION DATA IN RESEARCH EVALUATION
    External Entities
    Government agencies/funding organizations
    University Management
    Management, including committees, provost, vice provosts
    University Departments
    Institutional research, academic affairs, tech transfer, etc.
    Individuals
    Faculty, staff, students
  • 14. EVALUATING COUNTRIES
    Number of papers for the country as a whole
    Number of citations for the country as a whole
    Citation impact for the country as a whole
  • 15. EVALUATING UNIVERSITIES
  • 16. EVALUATING INDIVIDUALS
    • Number of articles: 148
    • 17. Sum of the Times Cited: 16,037
    • 18. Average Citations / Item: 108.36 (see the quick check after this list)
    • 19. h-index: 59
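    A quick arithmetic check (not part of the original slides) relating the per-author figures shown above:

        articles = 148
        times_cited = 16037
        print(round(times_cited / articles, 2))    # 108.36, the "Average Citations / Item" value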
  • EXISTING SOLUTIONS: Web of Science
    Consists of 5 databases
    Pros
    Citation report provides clear data and graphs over time
    ‘My Citation Alerts’ allows personalization
    Results page allows for refining
    Cons
    Difficult to differentiate authors with the same name
    Cannot produce author-based citation reports from a list of articles
    Limited coverage for non-English journals
    Cited reference search is limited to Thomson Reuters journals
  • 20. EXISTING SOLUTIONS: Scopus
    • Scientific, technical, medical and social science literature coverage
    • 21. References go back to 1996
    • 22. Automatic alerts
    Other Tools
    Publish or Perish Tool
    Google Scholar
  • 23. PROBLEMS IN GENERAL
    Data incompleteness (data sources)
    Predefined metrics for evaluation
    Only fixed, interface-based queries are allowed
    Cannot evaluate groups of researchers
  • 24. PROPOSED SOLUTION
    A common platform to access various kinds of scientific resources (a sketch of resource-oriented access follows this list)
    Support for the definition of personalized metrics
    A research evaluation language (natural-language queries)
    Capable of evaluating individuals, contributions, and groups of researchers
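    To make the resource-oriented idea concrete: in a REST-style design, authors, publications, and metrics are exposed as addressable resources. The sketch below is illustrative only; the endpoint URL and JSON field names are assumptions, not ResEval's actual API.

        import requests

        BASE = "http://example.org/research-api"   # placeholder endpoint, not the real service

        def author_citation_total(author_id):
            # Fetch the author's publications resource and derive a citation total from it.
            # The response shape (a list of objects with a "citations" field) is assumed.
            resp = requests.get(BASE + "/authors/" + str(author_id) + "/publications", timeout=10)
            resp.raise_for_status()
            return sum(p.get("citations", 0) for p in resp.json())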
  • 25. ARCHITECTURE
    • The architecture is organized in three well-differentiated layers
  • Languages
    • We proposed languages for:
    • 26. Defining personalized metrics (see the sketch after this list)
    • 27. Group formation
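    The slides do not show the metric-definition language itself; purely as an assumed illustration, a personalized metric could be written as a small function over an author's publication records, which the platform would then evaluate for any researcher:

        # Hypothetical user-defined metric: citations gathered by journal papers published
        # in the last five years; the record fields ("year", "venue_type", "citations") are assumed.
        def recent_journal_impact(publications, current_year):
            recent = [p for p in publications
                      if p["venue_type"] == "journal" and current_year - p["year"] <= 5]
            return sum(p["citations"] for p in recent)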
  • PROTOTYPES
    We have implemented two prototypes:
    ResEval, for individual researchers and contributions
    Available at http://project.liquidpub.org/reseval/
    Group Comparison, for groups of researchers (a minimal aggregation sketch follows this list)
    Available at http://project.liquidpub.org/groupcomparison/
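    For the Group Comparison prototype, one natural reading is that a group-level score aggregates the individual metrics of its members; a minimal sketch under that assumption (group membership and the choice of aggregation are not taken from the slides):

        # Compare groups of researchers by the mean h-index of their members.
        def mean_h_index(author_ids, h_index_of):
            # author_ids: iterable of researcher identifiers
            # h_index_of: function mapping an identifier to that researcher's h-index
            values = [h_index_of(a) for a in author_ids]
            return sum(values) / len(values) if values else 0.0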
  • 28. RESULTS
    H-index of an individual
    Combined results
    Distribution of indices
  • 29. CONCLUSION & FUTURE WORK
    We proposed an architecture for:
    Personalized metric definition
    Resource-oriented access
    Impact evaluation for individuals and groups
    Future work:
    The language module needs more work
    More options in the prototypes