
ResEval: Resource-oriented Research Impact Evaluation platform

  1. An Open and Resource-oriented Research Impact Evaluation platform
     By: Muhammad Imran
  2. OUTLINE
     - Brief introduction to research evaluation
     - Problems and motivation
     - Primary users of citation data in research evaluation
     - Existing solutions and their problems
     - Proposed solution and novelties
     - Languages & architecture
     - Conclusion
  3–12. HOW IS RESEARCH EVALUATION DONE?
     Examples of data sources / methodologies:
     - Number and value of grants awarded
     - Number of awards (e.g., Nobel Prizes)
     - Number of patents
     - Number of post-graduate researchers
     - Publication counts
     - Citation counts
     - Bibliometric indices (h-index, g-index, w-index)
     - Peer evaluation (expensive and time-consuming)

     WHY DO WE NEED RESEARCH EVALUATION?
     - Evaluation criteria for faculty recruitment and promotion
     - Funding of research projects
     - Finding experts in a specific area
     - Government policy for future investment
     - Evaluation of countries, institutes, departments, and individuals
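The bibliometric indices listed above (h-index, g-index) can be made concrete. A minimal sketch using the standard definitions follows; the citation counts are invented for illustration and are not taken from the deck.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers together have >= g^2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

papers = [10, 8, 5, 4, 3, 0]   # hypothetical per-paper citation counts
print(h_index(papers))  # 4: four papers each have >= 4 citations
print(g_index(papers))  # 5: top 5 papers have 30 >= 25 citations
```

Both indices depend only on the sorted citation counts, which is why evaluation tools can compute them from any bibliographic source that reports citations per paper.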
  13. PRIMARY USERS OF CITATION DATA IN RESEARCH EVALUATION
     - External entities: government agencies / funding organizations
     - University management: committees, provost, vice provosts
     - University departments: institutional research, academic affairs, tech transfer, etc.
     - Individuals: faculty, staff, students
  14. EVALUATING COUNTRIES
     - Number of papers per country
     - Number of citations per country
     - Citation impact per country
  15. EVALUATING UNIVERSITIES
  16–19. EVALUATING INDIVIDUALS
     - Number of articles: 148
     - Sum of the times cited: 16,037
     - Average citations / item: 108.36
     - h-index: 59

     EXISTING SOLUTIONS: Web of Science
     Consists of 5 databases.
     Pros:
     - Citation report provides clear data and graphs over time
     - 'My Citation Alerts' allows personalization
     - Results page allows refining
     Cons:
     - Difficult to differentiate authors with the same name
     - Cannot produce author-based citation reports from a list of articles
     - Limited coverage of non-English journals
     - Cited-reference search is limited to Thomson Reuters journals
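The per-individual figures above are related by simple arithmetic: the average citations per item is the total times cited divided by the number of articles. A quick check of the slide's numbers:

```python
# Verify the slide's "Average Citations / Item" figure:
# total times cited divided by number of articles.
articles = 148
times_cited = 16_037
average = round(times_cited / articles, 2)
print(average)  # 108.36, matching the slide
```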
  20–22. EXISTING SOLUTIONS: Scopus
     - Scientific, technical, medical, and social-science literature coverage
     - References go back to 1996
     - Automatic alerts

     Other tools:
     - Publish or Perish
     - Google Scholar
  23. PROBLEMS IN GENERAL
     - Data incompleteness (data sources)
     - Only predefined metrics for evaluation
     - Only fixed, interface-based queries allowed
     - Cannot evaluate groups of researchers
  24. PROPOSED SOLUTION
     - A common platform to access various kinds of scientific resources
     - Support for defining personalized metrics
     - A research-evaluation language (natural-language queries)
     - Ability to evaluate individuals, contributions, and groups of researchers
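The deck does not show what a personalized metric definition looks like, so the following is only a hypothetical reading of the idea: a metric as a composable filter-project-reduce function over a researcher's publication records. All names and record fields here are invented for illustration, not ResEval's actual language.

```python
# Hypothetical sketch of "personalized metrics": a metric is a function
# built from a filter, a projected field, and a reduction, so users can
# compose their own. Records and field names are illustrative only.

publications = [
    {"title": "Paper A", "year": 2008, "citations": 40, "venue": "journal"},
    {"title": "Paper B", "year": 2009, "citations": 12, "venue": "conference"},
    {"title": "Paper C", "year": 2010, "citations": 3,  "venue": "journal"},
]

def metric(select=None, reduce=sum, field="citations"):
    """Build a custom metric: filter records, project a field, reduce."""
    def run(pubs):
        rows = [p for p in pubs if select is None or select(p)]
        return reduce(p[field] for p in rows)
    return run

# Two user-defined metrics composed from the same building blocks:
journal_citations = metric(select=lambda p: p["venue"] == "journal")
recent_papers = metric(select=lambda p: p["year"] >= 2009,
                       reduce=lambda xs: sum(1 for _ in xs))

print(journal_citations(publications))  # 43 (40 + 3)
print(recent_papers(publications))      # 2
```

The point of the sketch is the design choice: if metrics are first-class, composable values rather than a fixed menu, the "predefined metrics" and "fixed interface-based queries" problems from the previous slide disappear.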
  25–27. ARCHITECTURE, LANGUAGES & PROTOTYPES
     Architecture:
     - Defined in three well-differentiated layers
     Languages (we propose languages for):
     - Defining personalized metrics
     - Group formation
     Prototypes (two implemented):
     - ResEval: individual researchers and contributions; available at http://project.liquidpub.org/reseval/
     - Group Comparison: groups of researchers; available at http://project.liquidpub.org/groupcomparison/
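The deck does not detail what "resource-oriented" means concretely; one plausible reading is REST-style addressing, where each researcher and each metric is reachable under its own URI. The sketch below is entirely hypothetical (the URI scheme, data, and routing are invented, not ResEval's real API) and shows only the routing idea in-memory.

```python
# Hypothetical sketch of the resource-oriented idea: metrics exposed as
# addressable resources under REST-style paths. URI scheme and data are
# invented for illustration.

CITATIONS = {"imran": [10, 8, 5, 4, 3, 0]}  # fake per-paper citation data

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    return max([r for r, c in enumerate(cites, 1) if c >= r], default=0)

# Each metric name maps to a function over the researcher's citations.
ROUTES = {
    "citation-count": lambda cs: sum(cs),
    "h-index": h_index,
}

def get(path):
    """Resolve GET /researchers/{id}/{metric} to a metric value."""
    kind, rid, metric_name = path.strip("/").split("/")
    if kind != "researchers" or rid not in CITATIONS:
        raise KeyError(path)
    return ROUTES[metric_name](CITATIONS[rid])

print(get("/researchers/imran/h-index"))         # 4
print(get("/researchers/imran/citation-count"))  # 30
```

Making each metric a resource means new metrics can be added by registering a new path, which fits the platform's goal of user-defined, personalized evaluation.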
  28. RESULTS
     - h-index of an individual
     - Combined results
     - Distribution of indices
  29. CONCLUSION & FUTURE WORK
     We proposed an architecture for:
     - Personalized metric definitions
     - Resource-oriented access
     - Impact evaluation of individuals and groups
     Future work:
     - The language module needs more work
     - More options in the prototypes