ResEval: Resource-oriented Research Impact Evaluation platform


Published in: Education, Technology
  1. An Open and Resource-oriented Research Impact Evaluation platform<br />By: Muhammad Imran<br />
  2. OUTLINE<br />Brief introduction of research evaluation<br />Problems and motivation<br />Primary users of citation data in research evaluation<br />Existing solutions and their problems<br />Proposed solution and novelties<br />Languages & Architecture<br />Conclusion<br />
  3. HOW IS RESEARCH EVALUATION DONE?<br /><ul><li>Examples of data sources / methodologies:
  4. Number and value of grants awarded
  5. Number of awards (e.g. Nobel Prizes)
  6. Number of patents
  7. Number of post-graduate researchers
  8. Publication counts
  9. Citation counts
  10. Bibliometric indices (H-index, G-index, W-index)
  11. Peer evaluation
  12. Expensive and time-consuming</li></ul>WHY DO WE NEED RESEARCH EVALUATION?<br />Evaluation criteria for faculty recruitment and promotion<br />Funding of research projects<br />Finding experts in a specific area<br />Government policy for future investment<br />Evaluation of countries, institutes, departments, and individuals<br />
  13. PRIMARY USERS OF CITATION DATA IN RESEARCH EVALUATION<br />External Entities<br />Government agencies/funding organizations<br />University Management<br />Management, including committees, provost, vice provosts<br />University Departments<br />Institutional research, academic affairs, tech transfer, etc.<br />Individuals<br />Faculty, staff, students<br />
  14. EVALUATING COUNTRIES<br />Number of papers per country<br />Number of citations per country<br />Citation impact per country<br />
  15. EVALUATING UNIVERSITIES<br />
  16. EVALUATING INDIVIDUALS<br /><ul><li>Number of articles : 148
  17. Sum of the Times Cited : 16,037
  18. Average Citations / Item : 108.36
  19. h-index : 59</li></ul>EXISTING SOLUTIONS: Web of Science<br />Consists of 5 databases<br />Pros<br />Citation report provides clear data and graphs over time<br />‘My Citation Alerts’ allows personalization<br />Results page allows refining<br />Cons<br />Difficult to differentiate authors with the same name<br />Cannot produce author-based citation reports from a list of articles<br />Limited coverage of non-English journals<br />Cited-reference search is limited to Thomson Reuters journals<br />
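The bibliometric figures on these slides can be reproduced directly from per-paper citation counts. The sketch below, in plain Python and independent of any of the tools discussed here, implements the standard h-index and g-index definitions and checks the citations-per-item figure from slide 18 (16,037 / 148 ≈ 108.36); the five-paper citation list is toy data invented for illustration.

```python
def h_index(citations):
    """h-index: the largest h such that at least h papers
    have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    return max((rank for rank, c in enumerate(ranked, 1) if c >= rank), default=0)

def g_index(citations):
    """g-index: the largest g such that the g most-cited papers
    together have at least g**2 citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, 1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Toy record with 5 papers:
papers = [10, 8, 5, 4, 3]
print(h_index(papers))        # 4: four papers have >= 4 citations each
print(g_index(papers))        # 5: top 5 papers total 30 >= 5**2
print(round(16037 / 148, 2))  # 108.36, the slide's average citations per item
```

Because the g-index weights highly cited papers more than the h-index does, the two can diverge sharply for authors with a few blockbuster papers.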
  20. EXISTING SOLUTIONS: Scopus<br /><ul><li>Scientific, technical, medical and social-science literature coverage
  21. References go back to 1996
  22. Automatic alerts</li></ul>Other Tools<br />Publish or Perish Tool<br />Google Scholar<br />
  23. PROBLEMS IN GENERAL<br />Data incompleteness (data sources)<br />Only predefined evaluation metrics<br />Only fixed, interface-based queries allowed<br />Cannot evaluate groups of researchers<br />
  24. PROPOSED SOLUTION<br />A common platform to access various kinds of scientific resources.<br />Support for the definition of personalized metrics.<br />A research-evaluation language supporting natural-language queries.<br />Capable of evaluating individuals, contributions, and groups of researchers.<br />
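To make "personalized metrics" concrete, the sketch below models one as a user-supplied function over publication records. The record schema (dicts with `year` and `citations` fields) and the `make_recent_impact_metric` helper are hypothetical, invented for this example; they are not ResEval's actual data model or API.

```python
# Hypothetical publication schema: {"title": str, "year": int, "citations": int}.
# A personalized metric is then just a user-defined function over such records.
def make_recent_impact_metric(since):
    """Build a metric: total citations of papers published in or after `since`."""
    def metric(publications):
        return sum(p["citations"] for p in publications if p["year"] >= since)
    return metric

pubs = [
    {"title": "A", "year": 2004, "citations": 120},
    {"title": "B", "year": 2008, "citations": 40},
    {"title": "C", "year": 2010, "citations": 15},
]
recent_impact = make_recent_impact_metric(since=2008)
print(recent_impact(pubs))  # 55: counts papers B and C only
```

The point of treating metrics as first-class definitions is that users are not limited to the fixed indices (h-index, g-index, ...) that existing tools precompute.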
  25. ARCHITECTURE<br /><ul><li>The architecture is defined in three well-differentiated layers</li></ul>Languages<br /><ul><li>We proposed two languages:
  26. one for defining personalized metrics
  27. one for group formation</li></ul>PROTOTYPES<br />We have implemented two prototypes<br />ResEval<br />Individual researcher<br />Contribution<br />Available at http://project.liquidpub.org/reseval/<br />Group Comparison<br />Group of researchers<br />Available at http://project.liquidpub.org/groupcomparison/<br />
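To illustrate what "resource-oriented" means here: in REST terms, researchers, publications, and computed metrics would each be addressable resources. The toy in-memory store below imitates that addressing scheme; the URI paths and payloads are invented for illustration and are not ResEval's real API.

```python
class ResourceStore:
    """Toy in-memory resource layer: every scientific entity is
    addressable by a URI-style path, as in a REST architecture."""
    def __init__(self):
        self._store = {}

    def put(self, path, representation):
        """Create or replace the representation stored at `path`."""
        self._store[path] = representation

    def get(self, path):
        """Return the representation at `path`, or fail like an HTTP 404."""
        if path not in self._store:
            raise KeyError(f"404: no resource at {path}")
        return self._store[path]

store = ResourceStore()
# Hypothetical paths echoing a resource-oriented design:
store.put("/researchers/imran", {"name": "Muhammad Imran"})
store.put("/researchers/imran/metrics/h-index", 59)
print(store.get("/researchers/imran/metrics/h-index"))  # 59
```

Uniform addressing is what lets a common platform pull data from heterogeneous sources and expose derived metrics through the same interface as raw publication data.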
  28. RESULTS<br />H-index of an individual<br />Combined results<br />Distribution of indices<br />
  29. CONCLUSION & FUTURE WORK<br />We proposed an architecture that<br />supports personalized metric definitions<br />is resource-oriented<br />provides impact evaluation for individuals & groups<br />Future work<br />The language module needs more work<br />More options in the prototypes<br />
