This is a presentation on library assessment at the Pitt University Library System, delivered to iSchool Academic Librarianship graduate students in December 2015.
Prof. S. P. Singh, PhD course work 2020-21: citation index, journal impact factor... (Saurashtra University)
Citation index, Journal Impact Factors, h-index and Impact Factor
-------
RESEARCH, PUBLICATIONS AND QUALITY ASSESSMENT
WIDE VARIATION IN THE ASSESSMENT AND QUALITY JUDGMENT
DIFFERENTIAL LEVEL OF RESEARCH OUTPUT - Reflected by number/frequency/quality of the publications
LACK OF INTEREST
DIFFERENCES IN OVERALL OBJECTIVES
TYPES OF PUBLICATIONS
TYPES AND QUALITY OF THE JOURNALS
In this presentation prepared for the 2016 STM conference in Frankfurt, Stephen Pinfield presents the latest developments of the Open-Access Mega-Journals project, funded by the Arts and Humanities Research Council (UK)
h index: Benchmark of productivity and impact of a researcher, by Ajay Semalty
In the Indices of Research series, the h-index is discussed here. The h-index (sometimes called the Hirsch index or Hirsch number) is one of several research indices used to measure the productivity and impact of a researcher, research group, or institution. It is an index that grows over time with the number of papers and the citations they receive. It is a major benchmark used by employers for the selection, recruitment, and assessment of researchers. This e-module covers the what, how, who, and why of the h-index. The next video covers how to find a researcher's h-index on various platforms. (URL link for video: https://youtu.be/BAhPzxWVtVE) For any query, feel free to write to us at openknowledgeok@gmail.com, and please subscribe to our YouTube channel. Thanks for giving your time. --- Team OK
A combination of PowerPoint presentations on bibliometrics in higher education, originally presented at CONCERT (Council on Core Electronic Resources in Taiwan), November 2008, and modified for a paper on bibliometrics and university rankings.
http://ir.library.smu.edu.sg/record=d1010558
Citation analysis: State of the art, good practices, and future developments, by Ludo Waltman
Presentation at Bibliometrics & Research Assessment: A Symposium for Librarians & Information Professionals. Bethesda, MD, United States, October 31, 2016.
Makes the case that we should let metrics do the "heavy lifting" in the UK REF [Research Excellence Framework]. I show that a university-level ranking based on metrics (Microsoft Academic citations for all papers published with the university's affiliation between 2008 and 2013) correlates at 0.97 with the REF power rating taken from Research Fortnight's calculation. Using metrics to distribute research-related funding would free up a staggering amount of time and money and would allow us to come up with more creative and meaningful ways to build a research quality component into the REF.
Metrics vs peer review: Why metrics can (and should?) be applied in the Socia..., by Anne-Wil Harzing
Reviews the debates on metrics vs peer review and suggests that we are comparing the idealised version of peer review to the reductionist version of metrics. Instead we should compare the reality of peer review with the inclusive version of metrics.
Research proposal and assessment of outputs, Jan 2021, by Prof. S. P. Singh (Saurashtra University)
This is about the preparation of research proposals for PhD research and research projects. It also includes the metrics and indexes used to evaluate research outputs.
Can Bibliometric and Scientometric Measures be Used to Assess Research Quali..., by Yasar Tonta
The quality of research output has traditionally been assessed by peer review. Yet bibliometric and scientometric measures are increasingly being used to support or even supplant peer review for research assessment. The main reasons for the popularity of such measures are that they can be obtained easily and that they are considered more "objective" than peer review. This paper explores the misuse of bibliometric and scientometric measures to assess research quality, provides the recommendations of the San Francisco Declaration on Research Assessment and the Leiden Manifesto for Research Metrics, and briefly addresses the "responsible metrics" introduced in "The Metric Tide", the report of an independent review chaired by James Wilsdon (Wilsdon et al., 2015).
Presentation at the Data Science seminar at the Faculty of Social and Behavioural Sciences, Leiden University, Leiden, The Netherlands, December 7, 2018.
VNSGU pre-PhD course work, 27 Aug 2021: a talk on 'Quality evaluation and ethic... (Saurashtra University)
V N South Gujarat University: a presentation in PhD course work on Quality Evaluation and Ethics in Research and Publications; citation index, journal impact factors, h-index and impact factor
Although the main bibliometric databases (Web of Science (WoS) and Scopus) claim to include journals on the basis of scientific and publication standards, there have long been concerns that its coverage is biased in favour of journals from industrialised countries and towards topics relevant to these countries. This webinar presents an investigation of this claim for research on rice, comparing the database CAB Abstracts with the mainstream databases. We find clear evidence that for a field such as rice, statistics based on WoS and Scopus strongly under-represent the scientific production by developing countries, and over-represent production by industrialised countries. More importantly, we also find a substantial bias in coverage of different research topics. The study suggests that statistics based on mainstream databases provide a significantly distorted view of the amount of research and diversity of agendas in most countries. Given that bibliometric statistics are often used for benchmarking and evaluation purposes, the database biases may translate into policy framings that undervalue domestic capabilities and research agendas more attuned to local needs in the global south.
Similar to What is your h-index and other measures of impact (20)
3. They are used to measure/demonstrate impact of…
• National science systems
• Research-producing institutions
• Research groups
• Individual researchers
In order to…
• Understand impact of investment
• Award funding
• Make employment and promotion decisions
• Apply for funding, jobs, promotions
• Identify experts, collaborators, etc.
4. Learning outcomes
At the end of the session, participants will be able to:
• identify paper-level and author-level indicators of impact
• extract these indicators from Scopus, WoS and SciVal
• understand how individual researchers may use these indicators to demonstrate impact
5. “Citizen” bibliometrics…
• Productivity (publication counts)
• leads to “salami slicing”, maybe
• quantity vs quality
• Impact (citation counts)
• but what impact?
• Impact factor
• Speaks to the prestige of the outlet, not the quality of an individual paper
• 20% of papers in Nature get 80% of all citations (averages do not work when distributions are skewed)
• h-index
• arbitrary, simplistic and lacks consistency
• it’s always highest in Google Scholar
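The skew point is easy to see with a toy example (hypothetical citation counts, not data from the slides): when a couple of papers collect most of the citations, the mean citation rate, which is the logic behind the Impact Factor, describes almost no individual paper.

```python
# Toy example (hypothetical counts): a "journal" where two papers
# collect most of the citations, loosely mirroring the Nature statistic above.
citations = [120, 80, 40, 10, 4, 3, 2, 1, 0, 0]

mean = sum(citations) / len(citations)           # impact-factor-style average
median = sorted(citations)[len(citations) // 2]  # a more typical paper

print(f"mean: {mean}, median: {median}")  # mean: 26.0, median: 4
```

The mean is pulled far above what a typical paper in the set actually achieves, which is exactly why averages mislead on skewed citation distributions.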
7. Story of one publication
Kato, M., Han, TW., Xie, S., Shi, K., Du, X., Wu, LC., Mirzaei, H., Goldsmith, EJ., Longgood, J., Pei, J., Grishin, N.V., Frantz, DE., Schneider, JW., Chen, S., Li, L., Sawaya, MR., Eisenberg, D., Tycko, R. and McKnight, S. (2012) “Cell-free Formation of RNA Granules: Low Complexity Sequence Domains Form Dynamic Fibers within Hydrogels.” CELL, 149 (4): 753-67. (citations: 325; IF: 34.242)
8. Story of one publication
• Journal characteristics
• JIF or SNIP or another indicator of impact in context
• Acceptance rates (e.g. from Cabell's Directory of Publishing Opportunities)
• Co-author characteristics
• Names, institutions, countries
• Citations
• Number of citations in context (normalised baselines)
• Ratio
• Percentile distribution
• Characteristics of citing papers
• authors, institutions, countries, subject fields, journals
• Altmetrics
• Predictor of citation impact (???)
• Indicator of attention
• Indicator of impact outside the academic realm (e.g. citations in policy documents or clinical guidelines)
13. Baselines
• Web of Science (WoS) (at Pitt, publication and citation data going back to 1980)
• Highly Cited? Hot? Impact Factor value
• Essential Science Indicators (ESI) (publication and citation data going back 10 years + current year)
• across 22 broad disciplines
• Baselines (averages and percentiles)
• Journal Citation Reports (JCR)
• IF values of thousands of journals and much more (citation networks, rates of obsolescence, etc.)
• Scopus (complete citation data going back to 1996; a project to extend coverage back to 1970 is under way, to be completed by end of 2016)
• discipline-adjusted citation rates (more granular disciplinary and sub-disciplinary divisions), also adjusted for all document types
• SciVal (complete publication and citation data going back to 1996, based on Scopus-indexed publications)
• Over 20 indicators of productivity and impact
14. Story of one publication
My paper was published in Cell, the 2nd top-ranked journal in the field of biochemistry and molecular biology and the 3rd top title in the field of cell biology (JCR, 2015 ed.).
To date it has been cited 325 times, which places it in the top 1% of all the world's 2012 biochemistry and molecular biology papers. The paper has an international and multidisciplinary reach: citations come from authors in 34 different countries and from across 95 different journals in 35 different fields (including 7 citations from Nature and 6 from Science). (WoS, 20 Sept. 2016)
My paper also attracted attention from non-scholarly audiences, …
Kato, M., Han, TW., Xie, S., Shi, K., Du, X., Wu, LC., Mirzaei, H., Goldsmith, EJ., Longgood, J., Pei, J., Grishin, N.V., Frantz, DE., Schneider, JW., Chen, S., Li, L., Sawaya, MR., Eisenberg, D., Tycko, R. and McKnight, S. (2012) “Cell-free Formation of RNA Granules: Low Complexity Sequence Domains Form Dynamic Fibers within Hydrogels.” CELL, 149 (4): 753-67. (citations: 325; IF: 34.242)
15. Exercise
• Murphy, S.V. and A. Atala (2014) “3D bioprinting of tissues and organs” NATURE BIOTECHNOLOGY; 32(8):773-785. DOI: 10.1038/nbt.2958
• Describe it using the bibliometric indicators discussed today
• Citation count
• Normalised citation count
• Percentile position
• Impact of publishing journal
• Characteristics of citing publications
• Use WoS or Scopus and Altmetric.com
16. Web of Science and Essential Science Indicators
• Murphy, S.V. and A. Atala (2014) “3D bioprinting of tissues and organs” NATURE BIOTECHNOLOGY; 32(8):773-785.
• 337 citations
• 337/5.31 = 63.45 (normalised citation impact)
• In top 1% of all 2014 Biology and Biochemistry publications (Highly Cited paper)
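The normalisation shown here is simply the paper's citation count divided by the ESI baseline, i.e. the average citations of publications from the same field and year. A minimal sketch using the slide's figures (the slide reports 63.45, presumably from a more precise baseline value):

```python
# Field-normalised citation impact: citations to a paper divided by the
# average citation count for publications of the same field and year.
citations = 337       # citations to Murphy & Atala (2014)
esi_baseline = 5.31   # ESI average for 2014 Biology & Biochemistry papers

normalised_impact = citations / esi_baseline
print(round(normalised_impact, 2))  # 63.47: cited ~63x more than an average paper
```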
20. Story of one author
• Sources of information about publications
• CV
• Pitt Faculty Info System (Elements)
• ORCID
• Google Scholar profile
• How much output is captured in databases?
• Scopus Author Identifier
• ResearcherID (WoS)
21. Scholarly communication practices
Share of research output by publication type (per cent):

Subject area            Books & chapters   Conference papers   Journal articles
History                       45.6                3.8               50.6
Politics and Policy           43.1               10.8               46.1
Language                      40.5                7.6               51.8
Human Society                 31.3                5.6               63.0
Philosophy                    29.8                5.4               64.8
Economics                     27.4                8.0               64.5
Law                           26.2                1.9               71.9
The Arts                      25.2               20.3               54.5
Education                     21.8               23.6               54.5
Architecture                  20.8               43.6               35.6
Psychology                    18.9                4.9               76.2
Journalism, library           18.6               24.2               57.2
Management                    13.0               34.0               52.9
Earth Sciences                 8.6                9.2               82.2
Medical & Health Sci           6.6                2.9               90.5
Biological Sciences            6.6                2.7               90.7
Agriculture                    6.3               14.7               79.0
Computing                      5.0               62.3               32.8
Mathematical Sciences          5.0               11.2               83.8
Engineering                    2.9               45.1               52.0
Physical Sciences              2.7                7.3               90.0
Chemical Sciences              2.3                1.9               95.7
(L. Butler, 2006)
And what about all other research outputs?
22. Author metrics
• Overview of productivity and impact
• Use size-dependent variables, as these take the volume of outputs into consideration
• Publication (and other output) counts and characteristics
• Total citation counts in context (discipline, age, output type)
• Excellence measures (top 1, 5, 10% of distributions)
• ESI Highly Cited author?
• h-index (if you must), though it is inconsistent (example later)
24. SCOPUS view
The citation-per-publication rate for my 200 publications indexed in Scopus is 91.2; only 5 of my publications (2.5%) were not cited to date. (Scopus, 8 June 2016)
My research is multidisciplinary, spanning biochemistry, genetics and molecular biology, physics, chemistry, materials science and engineering.
I have been publishing consistently since 1982, averaging 7 publications per year in high-quality, peer-reviewed outlets. These publications collectively receive upwards of 1,000 citations per year. (Scopus, 8 June 2016)
25. Story of one author
Citations to my publications in the last 10 years placed me in the group of the top 1% of biochemistry researchers in the world, and 5 of these publications placed in the top 1% of publications in their field. (ESI, 20 Sept 2016)
26. SciVal view - overview
In the last 5 years, 68% of my publications placed in the top 10% of all similar publications (by field and age), and 32% of them were published in the top 10% of scientific journals. (SciVal, 8 June 2016)
27. My publications in biochemistry consistently outperform those of the NIH and the USA as a whole in terms of impact and the percentage of publications in the top 1% in the world. (SciVal, 8 June 2016)
29. Citation counts per publication for three authors:

Author 1   Author 2   Author 3
   15         150        15
   10         100        10
   10          50        10
    5          25         5
    5           5         5
    4           1         0
    4           0         0
    3           0         0
    3           0         0
    1           0         0

All three authors have the same h-index (5), because the h-index does not account for:
• highly cited publications (it is insensitive to them)
• citation characteristics of publication outlets
• citation characteristics of fields of science
• age of publications
• type of publications
• co-authorship
• self-citations
• scientific age of the author
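The identical scores for these very different citation profiles can be checked with a few lines of Python; this is a straightforward implementation of Hirsch's definition (the function name is mine):

```python
def h_index(citations):
    """Largest h such that at least h publications have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):
        if c >= rank:  # the rank-th most cited paper still has >= rank citations
            h = rank
        else:
            break
    return h

author1 = [15, 10, 10, 5, 5, 4, 4, 3, 3, 1]
author2 = [150, 100, 50, 25, 5, 1, 0, 0, 0, 0]
author3 = [15, 10, 10, 5, 5, 0, 0, 0, 0, 0]

print([h_index(a) for a in (author1, author2, author3)])  # [5, 5, 5]
```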
30. More problems with h-index
Ludo Waltman and Nees Jan van Eck, JASIST 2012
Consistency requirement: if two scientists achieve the same absolute performance improvement, their ranking relative to each other should remain unchanged.

Before (citations per publication):
Publication   Scientist X   Scientist Y
     1             5             6
     2             5             6
     3             5             6
     4             5             6
     5             5             3
     6             2             3
     7             2             3
h-index: Scientist X = 5, Scientist Y = 4

After each adds two publications and 16 citations:
Publication   Scientist X   Scientist Y
     1             8             8
     2             8             8
     3             5             6
     4             5             6
     5             5             6
     6             5             6
     7             5             3
     8             2             3
     9             2             3
h-index: Scientist X = 5, Scientist Y = 6
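The ranking reversal in Waltman and van Eck's example can be verified in a few lines (again a direct implementation of Hirsch's definition; the names are mine):

```python
def h_index(citations):
    """Largest h such that at least h publications have >= h citations each."""
    return sum(rank <= c for rank, c in enumerate(sorted(citations, reverse=True), 1))

x_before = [5, 5, 5, 5, 5, 2, 2]
y_before = [6, 6, 6, 6, 3, 3, 3]
# Each scientist then gains two publications and 16 citations in total:
x_after = [8, 8, 5, 5, 5, 5, 5, 2, 2]
y_after = [8, 8, 6, 6, 6, 6, 3, 3, 3]

print(h_index(x_before), h_index(y_before))  # 5 4 -> X ranks above Y
print(h_index(x_after), h_index(y_after))    # 5 6 -> Y now ranks above X
```

The same absolute improvement leaves X's h-index unchanged but lifts Y past X, which is exactly the inconsistency the slide describes.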
31. Exercise: create a bibliometric profile of a researcher
• Ludo Waltman, Leiden U
• Krzysztof Matyjaszewski, CMU
• Ervin Sejdic, Pitt
• Anne-Wil Harzing, U Melbourne
• Try Scopus, WoS and SciVal to find out:
• Number of publications
• Total number of citations
• Normalised citations (e.g. within a discipline)
• Subject distributions
• Characteristics/impact of citing publications
• For an entire career span / for the last 5 years
Editor's Notes
Researchers usually provide:
Raw citation count (lacks context)
JIF (says nothing about the impact of an individual publication)
View journal metrics
Co-authors (institutions and countries)
Citation count
Highly Cited? If not: check ESI baselines and percentiles
Highly Cited author?
Citing pubs – authors, institutions,