Citation Index, Journal Impact Factors, H-Index and Impact Factor
-------
1. PhD Course Work.2020-21
Citation Index, Journal Impact Factors,
H-Index and Impact Factor
Prof. Satya P. Singh
Department of Biosciences,
Saurashtra University
2. RESEARCH, PUBLICATIONS AND QUALITY ASSESSMENT
WIDE VARIATION IN THE ASSESSMENT AND QUALITY JUDGMENT
DIFFERENTIAL LEVEL OF RESEARCH OUTPUT - reflected by the
number/frequency/quality of publications
LACK OF INTEREST
DIFFERENCES IN OVERALL OBJECTIVES
TYPES OF PUBLICATIONS
TYPES AND QUALITY OF THE JOURNALS
3. Citation index
• A citation index is an index of citations between publications, allowing the
user to easily establish which later documents cite which earlier documents.
• First used for legal citations: Shepard's Citations, 1873
• In 1960, Eugene Garfield's Institute for Scientific Information (ISI)
introduced the first citation index for papers in academic journals.
• Starting with the Science Citation Index (SCI) and later expanding to produce
the Social Sciences Citation Index (SSCI).
• The first automated citation indexing was done by CiteSeer in 1997. Other
sources for such data include Google Scholar.
• Science ► Social Sciences ► Humanities
• Information Retrieval ► Research Evaluation ► Basis for IF
4. News
Private Investment in Higher Education Must be Increased to Achieve a
5 Trillion USD Economy: Prof. C. Raj Kumar, Founding Vice Chancellor of
O.P. Jindal Global University
PTI
Financial investment in higher education is lagging behind other nations
such as China and South Korea.
"Two decades ago, Indian and Chinese educational institutions were at par."
Today China is one of the leading education miracles and has several
universities featuring in world rankings.
Not a single university from India is among the top 100 universities of
the world, while there are several universities from Asia: China,
Japan, Hong Kong, Singapore, South Korea, and Taipei.
What is needed: sound policies, consistent financial commitment and
international partnerships.
5. Major citation indexing services
• ISI (now part of Thomson Scientific), which
publishes the ISI citation indexes in print and
on compact disc.
• Elsevier, which publishes Scopus.
• They differ widely in cost: the ISI databases and
Scopus are subscription databases; the others are
freely available online.
6. Citation analysis
• Citation indexes were originally designed for information
retrieval purposes, but are increasingly used for bibliometric
and other studies involving research evaluation.
• Citation is the basis of the popular journal impact
factor.
• The large body of literature on citation analysis is sometimes
called scientometrics or, more specifically, bibliometrics.
• Free citation tools include CiteBase, CiteSeerX, Google
Scholar and Windows Live Academic.
7. Cont.
• The use of citation counts to rank journals was a
technique used in the early part of the twentieth century.
• Irving Sher showed the correlation between citation
frequency and eminence by demonstrating that
Nobel Prize winners:
published 5 times the average number of papers, and
were cited 30-50 times the average.
8. Journal Impact Factor
• The impact factor is a measure reflecting the average
number of citations to articles published in science
and social science journals.
• Journals with higher impact factors are deemed more
important than those with lower ones.
• The impact factor was devised by Eugene Garfield,
the founder of the Institute for Scientific
Information (ISI), now part of Thomson Reuters.
9. Thomson Reuters Corporation
A Canada-based multinational media conglomerate, founded in Toronto,
Ontario, Canada. Thomson Reuters was created by the Thomson Corporation's
purchase of the British company Reuters Group in April 2008 and is
majority owned by The Woodbridge Company, a holding company for
the Thomson family.
10. Calculation of Impact Factor
• In a given year, the impact factor of a journal is the average
number of citations received per paper published in that
journal during the two preceding years.
• For example, if a journal has an impact factor of 3 in 2008,
then its papers published in 2006 and 2007 received 3 citations
each on average. The 2008 impact factor of a journal would be
calculated as follows:
• A = the number of times articles published in 2006 and 2007
were cited by indexed journals during 2008
B = the total number of "citable items" published by that
journal in 2006 and 2007.
2008 impact factor = A/B
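The A/B calculation above can be sketched in a few lines of Python (an illustrative helper with hypothetical numbers, not part of any citation database's API):

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Impact factor for year Y: A / B, where A is the number of citations
    received in Y to items published in Y-1 and Y-2, and B is the number
    of citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 120 citations in 2008 to its 2006-2007 output of
# 40 citable items gives the impact factor of 3 from the slide's example.
print(impact_factor(120, 40))  # 3.0
```

Note that B counts only "citable items" (typically research articles and reviews), which is one of the editorial levers discussed later in the deck.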
11. Use & Validity of Impact factor
• To compare different journals within a certain field
• Highly discipline-dependent
• Could not be reproduced in an independent audit
• It refers to the average number of citations per paper,
but citations do not follow a normal distribution
• Self-citation: the frequency with which authors cite their own work
• Journal self-citation is common, owing to high overlap in
readerships and authors
12. Editorial policies which alter the impact
factor
• Publishing a larger percentage of review articles, which are generally
cited more than research reports: review journals have the highest
impact factors in their respective fields.
• Publishing methods articles, which tend to attract many citations,
can likewise enhance a journal's IF.
• "Citable items": journals may change the fraction of items counted as citable.
• The impact factor of the journal Acta Crystallographica Section A
rose from 2.051 in 2008 to 49.926 in 2009, more than Nature
(31.434) and Science (28.103), driven by a single, very frequently
cited paper on crystal-structure determination software.
13. Incorrect application of IF
• Using the IF to evaluate the significance of an individual publication,
or to evaluate an individual researcher, is an incorrect application.
• A small number of publications are cited much more than
the majority: for example, about 90% of Nature's 2004
impact factor was based on only 25% of its publications.
• This underestimates the most-cited articles and
exaggerates the citations of the majority of the articles.
• Assess the quality of the content of individual articles, not the
reputation of the journal.
• Individual articles vs. journal reputation
14. Impact Factor Distortions
• Science 17 May 2013:
Vol. 340 no. 6134 p. 787
DOI: 10.1126/science.1240319
• Editorial by Bruce Alberts, Editor-in-Chief of Science
• San Francisco Declaration on Research Assessment (DORA): cautions
against the use of the "journal impact factor" in judging an individual
scientist's work.
• The Declaration states that the impact factor must not be used as "a
surrogate measure of the quality of individual research articles,
to assess an individual scientist's contributions, or in hiring, promotion, or
funding decisions."
15. The misuse of the journal impact factor:
It biases journals against publishing important papers in fields (such as social
sciences and ecology) that are much less cited than others (such as
biomedicine).
It takes years to create a new approach in a new experimental context,
during which no publications should be expected.
Such metrics further block innovation because they encourage scientists to
work in areas of science that are already highly populated, as it is only in
these fields that large numbers of scientists can be expected to reference
one's work, no matter how outstanding.
16. Nature NEWS 06 FEBRUARY 2020
Highly cited researcher banned from journal board
for citation abuse
Investigation finds that biophysicist Kuo-Chen Chou repeatedly suggested
dozens of citations be added to papers.
Richard Van Noorden
A US-based biophysicist who is one of the world’s most highly cited
researchers has been removed from the editorial board of one journal and
barred as a reviewer for another, after repeatedly manipulating the peer-
review process to amass citations to his own work.
On 29 January, three editors at the Journal of Theoretical
Biology (JTB) announced in an editorial that the journal had
investigated and barred an unnamed editor from the board for “scientific
misconduct of the highest order”.
17. Quantifying Long-Term Scientific Impact
Dashun Wang1,2,*, Chaoming Song1,3,*, Albert-László Barabási1,4,5,6,†
Science 4 October 2013: Vol. 342 no. 6154 pp. 127-132
The lack of predictability of citation-based measures frequently
used to gauge impact, from impact factors to short-term
citations:
Is there long-term predictability in citation patterns?
A mechanistic model for the citation dynamics of individual papers that
allows us to collapse the citation histories of papers from different
journals and disciplines into a single curve,
Indicating that all papers tend to follow the same universal temporal
pattern.
The observed patterns not only help us uncover basic mechanisms that
govern scientific impact but also offer reliable measures of influence
that may have potential policy implications.
20. Atypical Combinations and Scientific Impact (Report)
Science 25 October 2013: Vol. 342 no. 6157 pp. 468-472
DOI: 10.1126/science.1240474
Brian Uzzi, Satyam Mukherjee, Michael Stringer, Ben Jones*
(Kellogg School of Management and Northwestern Institute on Complex Systems,
Northwestern University; Datascope Analytics; National Bureau of Economic Research)
*Corresponding author. E-mail: bjones@kellogg.northwestern.edu
ABSTRACT
Novelty is an essential feature of creative ideas, yet the building blocks of new ideas are often embodied in existing
knowledge. From this perspective, balancing atypical knowledge with conventional knowledge may be critical to the
link between innovativeness and impact. Our analysis of 17.9 million papers spanning all scientific fields suggests
that science follows a nearly universal pattern: the highest-impact science is primarily grounded in exceptionally
conventional combinations of prior work yet simultaneously features an intrusion of unusual combinations. Papers of
this type were twice as likely to be highly cited works. Novel combinations of prior work are rare, yet teams are
37.7% more likely than solo authors to insert novel combinations into familiar knowledge domains.
Received for publication 14 May 2013; accepted for publication 20 September 2013.
21. No more first authors, no more last authors
WORLD VIEW
25 September 2018, Nature 561, 435 (2018) doi: 10.1038/d41586-018-06779-2
Gretchen L. Kiser
If we really want transdisciplinary research, we must ditch the ordered listing of authors that stalls collaborative science,
says Gretchen L. Kiser.
Every academic scientist has heard a tale of someone being shafted on an authorship list, or had it happen to them. Less appreciated is how much the attribution of credit impedes cross-disciplinary
approaches to difficult questions. It creates a negative feedback loop that hinders research. Most scientists agree that research questions and approaches have become more complex, so the need
to engage in expanded team science has increased. I’ve found, however, that there is great reluctance among faculty members to join such efforts. I find myself asking, ‘What if we completely blow
up the way in which we attribute authorship?’ I suspect that if we got rid of first authors, last authors and the fight for asterisks, we might interrupt the negative feedback loop and see more innovation.
Since 2012, I’ve led the Research Development Office at the University of California, San Francisco (UCSF). One of our goals is to bring together researchers of varying backgrounds to encourage
innovative thinking and new approaches. My team identifies and cajoles ‘champions’ to invite colleagues to participate in team-building events. We offer financial and logistical support; we bring in
interesting speakers; we provide drinks and food (and not just pizza!) — all to get scientists to talk to each other about their research, needs and ambitions. But the resource that really matters is not
mine to dispense: credit for scientific contributions.
There are real successes: one of our ‘speed-networking’ events at UCSF introduced neurologist Dena Dubal, who investigates the molecular mechanisms of longevity and neurodegenerative
disease, to psychologist Aric Prather, who researches the effects of stress on health. That led to a project that revealed an association between chronic psychological stress and lower levels of a
longevity hormone. They published that work and continue to collaborate (A. A. Prather et al. Transl. Psychiatr. 5, e585; 2015).
Other teams we’ve helped have received follow-on support from external funders such as the US National Institutes of Health. Surveys tell me that faculty members enjoy our team-building events,
even when they did not expect to, and that they would recommend them to others. Nevertheless, there seems to be an undeclared disincentive for researchers to build unconventional collaborations.
I get frustrated with the disconnect between what we say about the need for transdisciplinary teams to solve complex problems and the reluctance to try something new to build those teams. The
assessment of publications during promotion and tenure decisions is a big part of the problem. Although these processes often have some mechanism to recognize a researcher’s team contributions,
the culture remains largely unchanged from 50 years ago. The gravitas associated with ‘first’ and ‘senior’ authorship is entrenched. What about the middle author who might have significantly altered
the approach? Or the fourth-place author who linked different disciplines? Often these researchers are left to find only self-satisfaction. Many journals now allow, and even require, statements that
explain contributors’ roles in their publications. Taxonomies and standardized vocabularies for describing authors’ roles have been developed. Similarly, promotion and tenure committees are using
contribution narratives in their assessments. These changes are helping. They capture a fuller spectrum of a researcher’s productivity so that evaluators can consider more than where someone sits
in an author list. Still, I’ve had senior faculty members tell me that, even though they look at the contribution narratives, they still expect to see first-author and then senior-author papers when
assessing candidates.
Meanwhile, research projects are starting to incorporate data that no one on the immediate team collected, and there are no settled conventions for crediting outside researchers or incentivizing that
valuable work.
We need a cultural shift to recognize and reward scientists who make their work useful to others, including researchers who might never meet but whose data are used. One way to make this happen
is to get rid of ordered author lists. By developing author contribution taxonomies and narratives, we have already acknowledged the need to reflect the multifaceted nature of authorship. Large
consortia and organizations are adopting contribution frameworks to reflect author roles and participation more accurately. We are also moving to use repository tools that assign authorship to
different types of research output, such as data sets. More effort, creativity and diversity of thought are needed. We should stop trying to apply old attribution models to the innovative ways we now
generate data.
If we can reveal the shape of proteins at atomic resolutions, tweak genes to order and detect cosmic signals from the beginning of time, then surely we can work out better ways to represent author
contributions. We already send complex basic research and clinical data into ‘information commons’ and build computational ‘knowledge network’ tools to inform patient diagnostics and therapeutics.
A well-annotated data set might be combined with other data to expand its impact synergistically. Can we imagine an author attribution method that would use cutting-edge computational tools similar
to those being applied to scientific research itself? A tool that gives credit where credit is due?
If we acknowledge the products of research in more-innovative ways, the value of ‘team-
ness’ might grow in academic culture and the cutting edge will get sharper. Perhaps, then, I
won’t have to cajole anyone to participate in team-building activities.
23. Responses and Other measures of impact
• According to the European Association of Science Editors (EASE),
"journal impact factors are used only - and cautiously -
for measuring and comparing the influence of entire journals,
but not for the assessment of single papers, and certainly not for the
assessment of researchers or research programmes".
• Cited half-life: the median age of the articles that were cited in
Journal Citation Reports each year.
• Aggregate impact factor for a subject category: it is calculated
taking into account the number of citations to all journals in the
subject category and the number of articles from all the journals in
the subject category.
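The cited half-life defined above can be illustrated with a small sketch (a simplified interpretation using linear interpolation within the crossing year; JCR's exact procedure differs in detail, and the numbers are hypothetical):

```python
def cited_half_life(citations_by_age):
    """Age (in years) by which half of all citations to the journal's
    articles have accumulated. citations_by_age[0] counts citations to
    one-year-old articles, citations_by_age[1] to two-year-old, etc."""
    half = sum(citations_by_age) / 2
    running = 0
    for age, cites in enumerate(citations_by_age, start=1):
        if running + cites >= half:
            # interpolate linearly within the year that crosses 50%
            return (age - 1) + (half - running) / cites
        running += cites

# Hypothetical citation-age profile: 40 citations to 1-year-old articles,
# 30 to 2-year-old, 20 to 3-year-old, 10 to 4-year-old.
print(round(cited_half_life([40, 30, 20, 10]), 2))  # 1.33
```

A short half-life suggests a fast-moving field (citations concentrate on recent papers); a long one suggests literature that stays relevant for years.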
24. H - Index
• Productivity and impact of the published work of
a scientist or a scholar
• Applied to the productivity and impact of a group
of scientists, such as a department or university or
country.
• The index was suggested by Jorge E. Hirsch, and is
sometimes called the Hirsch index or Hirsch number.
26. • Comparing scientists working in the same field: citation
conventions differ widely among different fields
• Highly cited articles contribute to the h-index: its
determination is a relatively simpler process
• The h-index grows as citations accumulate and thus it depends
on the 'academic age' of a researcher.
• Hirsch suggested that, for physicists, a value for h of about:
10-12 might support tenure decisions at major research universities,
18 a full professorship,
15-20 a fellowship in the American Physical Society, and
45 membership in the United States National Academy of Sciences.
27. Calculating h
• The h-index can be manually determined using citation
databases or using automatic tools
• Subscription-based databases such as Scopus and the
Web of Knowledge provide automated calculators.
• A scientist has index h if h of his or her Np papers have
at least h citations each
• In addition, specific databases, such as the Stanford
Physics Information Retrieval System (SPIRES) can
automatically calculate h-index for researchers working in
High Energy Physics.
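Hirsch's definition can be computed directly from a list of per-paper citation counts, which is why the databases above can automate it. A minimal sketch with made-up numbers (not a database API):

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each
    (Hirsch's definition)."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank   # this paper still supports a larger h
        else:
            break      # remaining papers are cited too rarely
    return h

# Hypothetical researcher with 6 papers: 4 papers have >= 4 citations,
# but only 3 papers have >= 5 citations, so h = 4.
print(h_index([10, 8, 5, 4, 3, 1]))  # 4
```

Sorting citation counts in descending order and walking down the list is the standard manual procedure: h is the last rank at which the count is still at least the rank.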
28. Advantages
• The h-index was intended to address the main disadvantages of
other bibliometric indicators, such as total number of papers or
total number of citations.
• It captures quality and sustainability of the scientific output
simultaneously.
• It is much less affected by methodological papers
proposing successful new techniques, methods or
approximations, which can be extremely highly cited.
29. Criticism
• Number of authors: the h-index does not account for the number
of co-authors on a paper
• Does not account for the typical number of citations
in different fields: different fields, or journals,
traditionally use different numbers of citations
• Bounded by the total number of publications, and it counts
citations made in a negative context and citations made
to fraudulent or retracted work
• Researchers in the same stage of their careers: the
index is intended as a tool to evaluate researchers in the
same stage of their careers
30. Impact Factor: Indian Context
• Impact factor is an index based on the frequency with
which a journal’s articles are cited in scientific
publications, a marker of journal quality
• It provides quantitative tools for ranking, evaluating,
categorizing and comparing journals
• A tool for the management of library journal
collections
• In research, the impact factor provides quantitative
evidence for editors and publishers
31. Limitations
• Review articles generally are cited more frequently than
typical research articles.
• Method articles: It is widely believed that methods
articles attract more citations than other types of articles
• Journal self-citation.
• It does not distinguish between letters, reviews, or
original research.
• The coverage is very uneven.
32. References
• The Thomson Scientific Impact Factor. Available at
http://scientific.thomson.com/free/essays/journalcitationreports/
• Journal self-citation in the Journal Citation Reports - Science
Edition (2002): A Citation Study from The Thomson Corporation.
• Impact Factor. Available at
http://en.wikipedia.org/wiki/
34. Citation Calculation
The meaning of citations is not simple and citation‐based statistics are not nearly as
"objective" as proponents assert.
Research usually has multiple goals and it is therefore reasonable that its value must be
judged by multiple criteria.
35. Cited Reference Searching
[Diagram: a traditional search moves backward in time (e.g., from a 1982
paper to the 1957 paper it cites), whereas a cited reference search also
moves forward, from a 1957 paper to the later papers (1982, 1987, 1993,
1996, 2001, 2004) that cite it.]
36. Why use this type of search?
Forward and backward in time: allows you to move forward
and backward in time, discovering relationships between published
works as determined by the articles' authors
Find new, unknown information based on older, known
information
Find variant citations
Search cites to non-journal literature
Move backward through "Cited References"
Uses cited references as subject terms
Explore hidden connections between research papers
Strengthens the researcher's command of the literature
37. Citation Reports
• Create reports
• Available for:
– Search
– Search within Results
– Author Finder
– Refine/Analyze
• Not available for:
– Cited Reference Search
– Citing Articles Summary
– Related Records Search Results
38. Use
• A common use of citation analysis is to determine the impact
of a single author on a given field by counting the number of
times the author has been cited by others
• To explore the impact of a field
• To examine the effects of a set of researchers
• To study the impact of a particular paper
• To help the government find top experts in a field in
order to set up a group of scholars
40. Web of Science
• Consists of 5 databases:
– Science Citation Index Expanded: 6,650 journals since 1900
– Social Sciences Citation Index: 1,950 (+3,300) journals since 1956
– Arts & Humanities Citation Index: 1,160 journals (+6,800)
– Index Chemicus (chemical structures)
– Current Chemical Reactions (synthetic methods)
41. Web of Science: citation analysis
• 100,000s of NEW cited references added each week
• Generate citation reports
• Eliminate self-citations
• 'My Citation Alerts'
• Author-finder (The Distinct Author Set feature)
– "A discovery tool showing sets of papers likely written by
the same person. Citation data is analyzed to create these
sets. This feature should be used as a tool to focus your
search rather than as a definitive list of a specific author's
works."
42. Web of Science: methodology
• Search for relevant articles on the topic
• Limit articles by location and institution
• Rank articles by times cited
• List authors of most often cited articles
• Create citation reports on individual authors
• Rank authors according to citations
• Choose appropriate scholars from list
46. Web of Science: Evaluation
Pros:
• Citation report provides clear data and graphs over time
• Author finder helps differentiate different authors with the same name
• 'My Citation Alerts' allows personalization
• Results page allows for refining
Cons:
• Still difficult to differentiate authors with the same name
• Cannot produce author-based citation reports from a list of
articles
• Can limit by country but cannot exclude countries
• Difficult to locate controlled vocabulary used in subject
indexing
47. Scopus: Multidisciplinary Database
Coverage of:
• social sciences
• life sciences
• health sciences
• physical sciences
Scopus includes research literature published in:
• 15,000 peer-reviewed academic journals
• 1,200 Open Access journals
• 500 conference proceedings
• over 600 trade publications
• book chapters and 200 book series
• 386 million web sources (author homepages, university sites, Open
Archives Initiative)
• 22 million patents
Includes citation analysis for journals and authors from 1996 on.
48. Citation Tracker Tool: What is the Scopus Citation Tracker?
• It enables users to track citation data year by year for a specific author or topic.
With the Scopus Citation Tracker, users can:
• Identify the most highly cited author in a field
• Find real-time citation data of articles and authors of interest
• Find out what topics are hot in various subject areas
• Find out what subjects are being cited by other subjects
• See the top cited works in 26 pre-selected fields
It also generates the h-graph:
• The h-index was developed by J.E. Hirsch and is described as follows:
"The h-index is based on the highest number of papers included that have had at least the same
number of citations."
• For example: an h-graph for a group of selected documents or selected author(s) with an
h-index of 12 means that, out of the total number of documents selected to produce the graph,
12 of the documents have been cited at least 12 times.
49. Citation Analysis Step 1
Once you have refined your results, you can select one, some or all of the hits and run a citation
analysis using the "citation tracker" option.
50. Citation Analysis Step 2
The Citation Overview page allows exclusion of "self-citations", date-range manipulation and 4
different ways to sort the results.
52. Author Tracking - Decision Point
At this point, the searcher must make a decision: to track the authors whose works
appear most often in the results, or the authors whose works are most often
cited?
56. Scopus: Personal observations
Cons:
• Citation analysis covers 1996 onward, so landmark articles from earlier years
may not show up.
• Does not allow automatic generation of a top-cited authors' list from search
results.
• Poor ability to limit to a country (academics move around and publish in
countries other than their country of residence).
• The citation analysis page does not display the author's name.
Pros:
• Generates a list of keywords relevant to the term searched (enriches the
searcher's vocabulary and assists in refining the search query).
• Allows very precise control and manipulation of results through "refine
results" options.
• Allows personalization, list-saving and automatic alerts.
• Good multidisciplinary coverage of topics in one search (e.g., see who is
writing about sustainable agriculture in a variety of fields).
• Has a function to eliminate self-citations by the authors being analyzed.
• Precise author indexing (few instances of numerous variations in authors'
names).
57. Google Scholar
• Articles from some of CrossRef's participating publishers and others that
have made content available to Google Scholar
• More medicine and scientific resources than humanities and social
science
• Preprints, e-prints, university publications
• Books from OCLC's Open WorldCat: "the most accessible content, i.e.,
open access (OA) and public domain collections. ... WorldCat Local helps
us prioritize which copyright holders to approach."
• Citations extracted from crawled articles using "special algorithms"
58. How scholarly is Google Scholar?
• Google says it has “peer-reviewed papers, theses, books,
abstracts and articles…”
• Unscholarly items find their way into GS
• Ranking of items gives weight to number of times cited,
and the number of versions (preprints, conference papers,
mirror sites)
• Never assume…
59. Citations in Google Scholar
Go to Google Scholar - http://scholar.google.com/
Do a search for a particular name and topic, restricting the search to an
author with the "author:" operator.
Example search:
author:S.P.Singh
Steps: sign in with a Gmail account, add the name of the author, and get
the results.
61. Comparison: Web of Science vs. Scopus vs. Google Scholar
Web of Science:
◦ Science Citation Index Expanded: 6,650 journals since 1900
◦ Social Sciences Citation Index: 1,950 (+3,300) journals since 1956
◦ Arts & Humanities Citation Index: 1,160 journals (+6,800)
◦ Index Chemicus (chemical structures)
◦ Current Chemical Reactions (synthetic methods)
Scopus:
◦ Social sciences, life sciences, health sciences, physical sciences
◦ 15,000 peer-reviewed academic journals; 1,200 Open Access journals;
500 conference proceedings; over 600 trade publications; book chapters
and 200 book series
◦ 386 million web sources (author homepages, university sites, Open
Archives Initiative); 22 million patents
◦ Includes citation analysis for journals and authors from 1996 on
Google Scholar:
◦ Articles from some of CrossRef's participating publishers and others
that have made content available to Google Scholar
◦ More medicine and scientific resources than humanities and social science
◦ Preprints, e-prints, university publications; books from OCLC's Open WorldCat
◦ Citations extracted from crawled articles using "special algorithms"
◦ Also includes non-scholarly material as well as books
◦ Wider coverage of Open Access (OA) web documents and non-journal
documents; more useful for citation tracking across full-text documents
63. Assignment
Question 1: What are the different reasons for the variability in publications
among scientists/students? (One para or 5-7 points)
Question 2: In the light of the historical background of citations, discuss its
implications. (One para or 5-7 points)
Question 3: Discuss the impact factors of journals in the context of the
assessment of the credentials of scientists. (One-two paras)
Question 4: Highlight the merits of the H-Index. (Up to 5 points)
Question 5: List 10 journals, with impact factors and publishers, from your
research area.