Journal-level metrics
Metrics have become a fact of life in many, if not all, fields of research and scholarship. In an age of information abundance (often termed ‘information overload’), shorthand signals for where in the ocean of published literature to focus our limited attention have become increasingly important.
Research metrics are sometimes controversial, especially when in popular usage they become proxies for multidimensional concepts such as research quality or impact. Each metric may offer a different emphasis based on its underlying data source, method of calculation, or context of use. For this reason, Elsevier promotes the responsible use of research metrics, encapsulated in two “golden rules”: always use both qualitative and quantitative input for decisions (i.e. expert opinion alongside metrics), and always use more than one research metric as the quantitative input. The second rule acknowledges that performance cannot be expressed by any single metric and that every metric has specific strengths and weaknesses. Using multiple complementary metrics therefore helps to provide a more complete picture and to reflect different aspects of research productivity and impact in the final assessment. (Source: Elsevier)
Impact Factor Journals as per JCR, SNIP, SJR, IPP, CiteScore
1. Impact factor journals as per Journal Citation Report, SNIP, SJR, IPP, CiteScore
Dr. S. Ghosh
Associate Professor
Department of Library & Information Science, University of North Bengal, West Bengal 734013
2. Publish or Perish?
“Publish or perish” is an aphorism describing the pressure to publish academic work in order to succeed in an academic career. ... The pressure to publish has been cited as a cause of poor work being submitted to academic journals.
12/6/2020 @sghoshnbu
3. The Harsh Consequences of “Publish or Perish”
The culture of “publish or perish” is clearly pervasive and appears to be here to stay. Calls for instant distribution and transparency of both authorship and peer review may help to address problems with research quality, but as long as researchers are threatened by the publication venue of their research, the system will remain fundamentally broken.
4. Perspectives of impact
ACADEMIC IMPACT — traditional metrics: Journal Impact Factor, citation counts.
SOCIETAL IMPACT — alternative metrics (“altmetrics”): download counts, page views, mentions in news reports, mentions in social media, mentions in blogs, reference manager readers, etc.
Traditional metrics and altmetrics complement each other; altmetrics are more article-centric, as opposed to journal-centric.
6. Why metrics?
Quantification of research impact
Multidimensional array of stakeholders
Calculation of fuzzy concepts and associative activities
7. What are the different metrics?
Scholars have combined standard research metrics, like scholarly output and citation counts, into formulas to measure and assess author and journal impact in new ways. Some of these metrics include:
Journal Impact Factor
h-index
g-index
Eigenfactor score
Altmetric
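The h-index listed above has a simple definition: a scholar has index h if h of their papers have at least h citations each. A minimal sketch of the computation (the citation counts below are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # still h papers with >= h citations
        else:
            break
    return h

# Hypothetical author with six papers and these citation counts:
papers = [25, 8, 5, 3, 3, 1]
print(h_index(papers))  # -> 3 (three papers with at least 3 citations each)
```

The g-index and Eigenfactor score weight citations differently, but all such author-level indicators start from the same ranked citation list.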
8. Ways of Measuring Impact
Article Impact - citation count and analysis using Web of Science and Google Scholar
Journal Impact - journal data and standard measures for journals
Author Impact - common measures of author impact (h-index) and other metrics scholars might encounter
Altmetrics - what are altmetrics? Altmetric badges and altmetrics tools
Book and Book Chapter Impact - book citation counts, holdings, book reviews and other qualitative indicators
Maximize Impact - unique researcher identifiers and profiles, academic communities, and other strategies to maximize impact
9. Journal-level metrics
Metrics have become a fact of life in many, if not all, fields of research and scholarship. In an age of information abundance (often termed ‘information overload’), shorthand signals for where in the ocean of published literature to focus our limited attention have become increasingly important.
Research metrics are sometimes controversial, especially when in popular usage they become proxies for multidimensional concepts such as research quality or impact. Each metric may offer a different emphasis based on its underlying data source, method of calculation, or context of use. For this reason, Elsevier promotes the responsible use of research metrics, encapsulated in two “golden rules”: always use both qualitative and quantitative input for decisions (i.e. expert opinion alongside metrics), and always use more than one research metric as the quantitative input. The second rule acknowledges that performance cannot be expressed by any single metric and that every metric has specific strengths and weaknesses. Using multiple complementary metrics therefore helps to provide a more complete picture and to reflect different aspects of research productivity and impact in the final assessment.
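The second golden rule, never relying on a single metric, can be made concrete with a small sketch. The journal names and metric values below are entirely invented for illustration:

```python
def rank_by(journals, metric):
    """Return journal names ordered best-first by a single metric."""
    return sorted(journals, key=lambda j: journals[j][metric], reverse=True)

# Hypothetical journals scored on several complementary metrics
# (all names and values are made up).
journals = {
    "Journal A": {"JIF": 4.2, "CiteScore": 5.1, "SNIP": 1.3, "SJR": 1.8},
    "Journal B": {"JIF": 4.5, "CiteScore": 3.9, "SNIP": 0.9, "SJR": 1.1},
}

print(rank_by(journals, "JIF"))        # -> ['Journal B', 'Journal A']
print(rank_by(journals, "CiteScore"))  # -> ['Journal A', 'Journal B']
```

Here JIF alone would rank Journal B first, while CiteScore, SNIP and SJR all point the other way: exactly the disagreement that makes a single-metric assessment unreliable.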
10. Journal Citation Reports™ (JCR)
Journal Citation Reports™ (JCR) provides you with the transparent, publisher-neutral data and statistics you need to make
confident decisions in today’s evolving scholarly publishing landscape, whether you’re submitting your first manuscript or
managing a portfolio of thousands of publications.
Quickly understand a journal’s role within and influence upon the global research community by exploring a rich array of
citation metrics, including the Journal Impact Factor™ (JIF), alongside descriptive data about a journal’s open access
content and contributing authors.
Web of Science does not depend on the Journal Impact Factor alone in assessing the usefulness of a journal, and neither
should anyone else. The Journal Impact Factor should not be used without careful attention to the many phenomena that
influence citation rates – for example, the average number of references cited in the average article. The Journal Impact
Factor should be used with informed peer review. In the case of academic evaluation for tenure, it is sometimes
inappropriate to use the impact of the source journal to estimate the expected frequency of a recently published article.
Again, the Journal Impact Factor should be used with informed peer review. Citation frequencies for individual articles are
quite varied.
Journal Citation Reports now includes more article-level data to provide a clearer understanding of the reciprocal
relationship between the article and the journal. This level of transparency allows you to not only see the data, but also see
through the data to a more nuanced consideration of journal value.
11. Journal Impact Factor (JIF)
Journal Impact Factor (JIF) is calculated by Clarivate Analytics as the sum of the citations received in a given year to a journal's previous two years of publications (linked to the journal, but not necessarily to specific publications), divided by the number of "citable" publications in those two years. Owing to the way in which citations are counted in the numerator and the subjectivity of what constitutes a "citable item" in the denominator, JIF has received sustained criticism for many years for its lack of transparency and reproducibility and the potential for manipulation of the metric. Available for only 11,785 journals (Science Citation Index Expanded plus Social Sciences Citation Index, as of December 2019), JIF is based on an extract of Clarivate's Web of Science database and includes citations that could not be linked to specific articles in the journal, so-called unlinked citations.
12. Metrics in a nutshell (Impact Factor)
Impact Factor (source: Journal Citation Reports)
Take a two-year period and divide the number of times articles were cited by the number of articles that were published.
Example:
200 = the number of times articles published in 2008 and 2009 were cited by indexed journals during 2010.
73 = the total number of "citable items" published in 2008 and 2009.
200/73 = 2.74 = the 2010 impact factor.
The impact factor reflects only how many citations articles in a specific journal receive on average. A journal with a high impact factor has articles that are cited often.
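The two-year calculation above can be sketched as a small function. This is an illustrative sketch of the published formula, not Clarivate's implementation; the figures are the hypothetical ones from the worked example.

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Impact factor for year Y: citations received in Y to items published
    in Y-1 and Y-2, divided by the citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# 2010 impact factor from the worked example:
# 200 citations in 2010 to 2008-2009 articles; 73 citable items in 2008-2009.
jif_2010 = impact_factor(200, 73)
print(round(jif_2010, 2))  # 2.74
```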
13. Traditional metrics for journals
Impact Factor and citation counts: created to measure journals and journal articles, i.e. scholarly (journal) impact. Initially created for librarians, then largely adopted by STEM fields.
Image from Journal Citation Reports (library database)
14. Source Normalized Impact per Paper (SNIP)
Source Normalized Impact per Paper (SNIP) is a sophisticated metric
that intrinsically accounts for field-specific differences in citation
practices. It does so by comparing each journal’s citations per
publication with the citation potential of its field, defined as the set of
publications citing that journal. SNIP therefore
measures contextual citation impact and enables direct comparison
of journals in different subject fields, since the value of a single
citation is greater for journals in fields where citations are less likely,
and vice versa. SNIP is calculated annually from Scopus data and is
freely available alongside CiteScore and SJR
at www.scopus.com/sources. Unlike the well-known journal impact
factor, SNIP corrects for differences in citation practices between
scientific fields, thereby allowing for more accurate between-field
comparisons of citation impact. Centre for Science and Technology Studies (CWTS) Journal Indicators also provides stability intervals that indicate the reliability of a journal's SNIP value. SNIP was created by Professor Henk F. Moed at the Centre for Science and Technology Studies (CWTS), Leiden University.
15. CiteScore metrics
CiteScore metrics are a suite of indicators calculated from data in Scopus, the world’s leading abstract
and citation database of peer-reviewed literature. CiteScore itself is an average of the sum of the
citations received in a given year to publications published in the previous three years divided by the
sum of publications in the same previous three years. CiteScore is calculated for the current year on a
monthly basis until it is fixed as a permanent value in May the following year, permitting a real-time
view on how the metric builds as citations accrue. Once fixed, the other CiteScore metrics are also
computed and contextualise this score with rankings and other indicators to allow comparison.
CiteScore metrics are:
Current: A monthly CiteScore Tracker keeps you up to date on the latest progression towards the next annual value, making the next CiteScore more predictable.
Comprehensive: Based on Scopus, the leading scientific citation database.
Clear: Values are transparent and reproducible down to the individual articles in Scopus.
The scores and underlying data for more than 25,000 active journals, book series and conference
proceedings are freely available at www.scopus.com/sources or via a widget (available on each
source page on Scopus.com) or the Scopus API.
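The CiteScore calculation described above (a three-year citation window) can be sketched in the same way. The figures below are hypothetical; this is a sketch of the formula as stated in this section, not Elsevier's implementation.

```python
def citescore(citations_received, publications_prev_three_years):
    """CiteScore as described above: citations received in a given year to
    publications from the previous three years, divided by the number of
    publications in those same three years."""
    return citations_received / publications_prev_three_years

# Hypothetical journal: 450 citations in 2020 to its 2017-2019 papers,
# of which there were 150 in total.
print(citescore(450, 150))  # 3.0
```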
16. SCImago Journal Rank (SJR)
SCImago Journal Rank (SJR) is based on the concept of a transfer of
prestige between journals via their citation links. Drawing on a similar
approach to the Google PageRank algorithm - which assumes that
important websites are linked to from other important websites - SJR
weights each incoming citation to a journal by the SJR of the citing
journal, with a citation from a high-SJR source counting for more than a
citation from a low-SJR source. Like CiteScore, SJR accounts for journal
size by averaging across recent publications and is calculated annually.
SJR is also powered by Scopus data and is freely available alongside
CiteScore at www.scopus.com/sources.
17. The impact per publication(IPP)
The impact per publication (IPP) is calculated as the number of citations given in the present year to publications in the past three years, divided by the total number of publications in the past three years. IPP is fairly similar to the well-known journal impact factor. Like the journal impact factor, IPP does not correct for differences in citation practices between scientific fields.
IPP was previously known as RIP (raw impact per publication).
18. Immediacy Index
The Immediacy Index measures how frequently the average article from a
journal is cited within the same year as publication. This number is useful
for evaluating journals that publish cutting-edge research.
Immediacy Index Numerator - Cites to recent items:
The numerator looks at citations in a particular JCR year to a journal's
content from the same year. For example, the 2015 Immediacy Index for a
journal would take into account 2015 citations to the journal's 2015 papers.
The numerator includes citations to anything published by the journal in that
year.
Immediacy Index Denominator - Number of recent items:
The denominator takes into account the number of citable items published
in the journal in 2015. Citable items include articles and reviews.
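The numerator and denominator described above give a simple ratio, sketched below with hypothetical 2015 figures (the function is illustrative, not JCR's implementation).

```python
def immediacy_index(same_year_citations, same_year_citable_items):
    """Immediacy Index: citations in a JCR year to content the journal
    published in that same year, divided by the citable items (articles
    and reviews) it published that year."""
    return same_year_citations / same_year_citable_items

# Hypothetical: 60 citations in 2015 to the journal's 2015 content,
# and 120 citable items published in 2015.
print(immediacy_index(60, 120))  # 0.5
```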
20. H-index variant H5-Index
h-index (sources: Web of Science, Google Scholar, Scopus)
1) Create a list of all your publications, in descending order by the number of times each was cited (you can get this information from any of the sources named above). The first article should have the most citations. Number the entries.
2) Go down the list to find the last point at which the number of times a publication has been cited is equal to or larger than its position (rank) in the list.
Example:
Paper 1: 13 citations
Paper 2: 7 citations
Paper 3: 4 citations
h-index = 3
*Many databases will give you this number; the manual procedure is shown only in case you'd like to calculate it yourself. You can also often find calculators online.
The h-index focuses on the impact of a single scholar rather than an entire journal. A higher h-index reflects both greater scholarly output and greater citation impact.
Jorge E. Hirsch, an Argentine-American professor of physics at the University of California, San Diego, is known for inventing the h-index in 2005.
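The manual steps above translate directly into code. This is a sketch of the standard definition; the citation counts are the ones from the worked example.

```python
def h_index(citation_counts):
    """h-index: the largest h such that h of the papers have at least h
    citations each. Mirrors the manual steps above: sort descending, then
    find the last rank at which citations >= rank."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([13, 7, 4]))  # 3, matching the worked example above
```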
21. G-index
g-index (source: Harzing's Publish or Perish)
Given a list of articles ranked in decreasing order of the number of citations they received, the g-index is the largest number g such that the top g articles together received at least g² citations.
The g-index can be thought of as a continuation of the h-index. The difference is that this index puts more weight on highly cited articles. The g-index was created because scholars noticed that the h-index ignores the number of citations to each individual article beyond what is needed to achieve a certain h-index. This number often complements the h-index and isn't necessarily a replacement.
Leo Egghe of Hasselt University, Belgium, suggested the g-index in 2006.
23. Eigenfactor score
Eigenfactor score (source: Eigenfactor.org)
• The Eigenfactor score is calculated by
eigenfactor.org.
• However, their process is very similar to
calculating impact factor and they pull
their data from the JCR as well.
• The major difference is that the
Eigenfactor score deletes references
from one article in a journal to another
in the same journal.
• This eliminates the problem of self-
citing.
• The Eigenfactor score is also a five-year
calculation.
• More information can be found
through Journal Citation Reports.
A high Eigenfactor score signals that a journal is influential within the citation network of its discipline; because journal self-citations are excluded, the score cannot be inflated by self-citing. It's useful to look at a scholar's h-index as well as the Eigenfactor score of the journals they publish in to get a broad sense of their impact as a researcher.
The Eigenfactor score was developed by Jevin West, Carl T. Bergstrom, Ted C. Bergstrom, and Ben Althouse.
24. i10-index
The i10-index is used by Google
Scholar and indicates the
number of publications that have
been cited at least 10 times.
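The i10-index is the simplest of these counts and can be sketched in one line; the citation counts below are hypothetical.

```python
def i10_index(citation_counts):
    """i10-index (used by Google Scholar): the number of publications
    that have been cited at least 10 times."""
    return sum(1 for cites in citation_counts if cites >= 10)

print(i10_index([25, 12, 10, 9, 3]))  # 3
```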
25. Altmetrics
The tweet by Jason Priem that coined the term "altmetrics".
The term "altmetrics" (alternative metrics) is used to describe
approaches to measure the impact of scholarship by using new
social media tools such as bookmarks, links, blog postings,
inclusion in citation management tools, mentions and tweets to
measure the importance of scholarly output.
Proponents of altmetrics believe that using altmetrics will help
measure the impact of an article in a more comprehensive and
objective way than was done with more traditional scholarly
impact measures such as journal impact factor. However, there
are limits to this approach and caution should be used to not rely
on any one particular measure in evaluating the importance of
scholarship.
26. “The Umbrella Classification of Non-Citation-based Metrics”
“alternative metrics”
• new ways of measuring different, non-traditional
forms of impact.
• “alternative to only using citations”, not
“alternative to citations”.
• complementary to traditional citation-based
analysis.
Article-level metrics have come to refer to
any metrics (e.g., including altmetrics) that
surround a scholarly article.
27. An article-centric approach
Measure online attention surrounding journal articles (and datasets).
Collect and deliver article-level metrics to journal publishers.
30. How do we collect data for altmetrics?
Directly from the individual tools (e.g. SlideShare views)
From publishers (views, download data; e.g. PLOS article metrics)
From (some) library databases (e.g. Web of Science usage)
From scholarly networks (e.g. ResearchGate metrics)
Through aggregating tools (e.g. Altmetric metrics)
31. Altmetrics
Measures
Usage: clicks, downloads, views
Social media: likes, shares, or tweets
Captures: bookmarks, favorites, followers
Mentions: blog posts, reviews, comments, or ratings
Altmetrics are often used to measure the impact of gray literature or
materials that are not formally published, such as posters and working
papers. They can also be used to provide more information about the
reach of published articles and books.
It is unlikely that altmetrics will supplant traditional metrics as the measure of research impact. However, altmetrics can demonstrate the reach of, and interest in, a topic among the public, practitioners, and policy makers.
Authors should refrain from judging the impact of a work based on the
altmetrics numbers. Digging into who is saying what about the work
may provide more reliable information about the quality and influence
of a work.
38. Strategies to Maximize Your Impact
Create Unique Researcher Identifiers
Create Researcher Profiles
Share Your Research Online
Take Steps to Broaden Your Impact
39. Take Steps to Broaden Your Impact
Contribute: Contribute to Wikipedia, either in a new entry or in the text and references of an existing entry.
Discuss: Discuss your research findings on a blog or through Twitter.
Link: Add a link to your most recent research in your email signature.
Publish: Publish in open access journals, or pay to have the work available open access in a subscription journal.
Craft: Craft a work's title and abstract carefully; repeat keywords so the work ranks as highly relevant in search engines.
Add: Add postprints/white papers/drafts of work to your institutional repository, DigitalCommons@EMU, or to a disciplinary repository.
40. Identity Exploration
Google Scholar Profile
A Google Scholar Profile tracks your publications listed in Google Scholar,
provides the number of citations and links to the items citing your work, and
calculates your h-index. (Note: You need to have a Gmail account to track
your profile. Once you are logged in to your Gmail account, click on "My
citations" to view and edit your profile.)
Impactstory
This web-based service collects metrics and displays them with a link that
can be added to CVs. Join free with an ORCID account.
Share Your Research Online
The process of writing for publication often creates several outputs in
addition to the final journal article, book, or book chapter. Consider posting
slides from presentations, brief videos of presentations, data sets, or other
materials online with a link to the official publication.
Postprints/White Papers/Drafts of work - DigitalCommons@EMU or
subject/disciplinary repositories.
Presentation Slides - SlideShare or Speaker Deck
Videos - Vimeo or YouTube
Data Sets - Dryad or figshare (figshare can handle other outputs as well)
Code & Software - GitHub
It’s easy to dismiss “publish or perish” as an old maxim that academics use to complain about their terrible working conditions, but research has shown that the longer this culture of pressure persists, the greater the risk to academic research integrity. As the players in this publishing game start to suffer, and the cracks begin to appear, we can see real consequences:
Focus has been shifting to metrics at the article level. Why should the value of a work be judged by the journal in which it has been published?