This document discusses the rapidly changing scholarly communications environment and issues surrounding publishing research. It notes debates around making federally funded research openly accessible and proposed legislation. It also covers tools for tracking citations and measuring impact, such as the Journal Impact Factor, Eigenfactor, Article Influence Score, and Hirsch index. Various publishing models and players in the field, including open access options, are outlined. Evaluation criteria like the CRAAP test for assessing information sources are presented.
Publishing your Work in a Rapidly Changing Scholarly Communications Environment
1. Publishing your Work in a Rapidly Changing Scholarly Communications Environment
Courtney Mlinar
March 2012
2. Current Publishing Model Debates
Research Works Act - defeated February 2012
• would have made it illegal to require that published, federally funded research be made freely available (Elsevier was a major backer)
Federal Research Public Access Act (FRPAA) 2012
• requires any research supported by federal funds to be deposited in a publicly accessible online repository within 6 months of publication
• expands public access to federally funded research (like NIH-funded research) to 11 new agencies (opposed by the Association of American Publishers, which calls it "intellectual eminent domain"*)
http://www.taxpayeraccess.org/issues/frpaa/index.shtml
*Grant B. (2012) Publishers fight open access bill. Available at http://www.taxpayeraccess.org/issues/frpaa/index.shtml. Accessed March 4, 2012.
3. Agencies affected by FRPAA
• Department of Agriculture
• Department of Commerce
• Department of Defense
• Department of Education
• Department of Energy
• Department of Health and Human Services
• Department of Homeland Security
• Department of Transportation
• Environmental Protection Agency
• National Aeronautics and Space Administration
• National Science Foundation
4. Similar to NIH Public Access Policy:
Public Access Policy:
• public has access to federally funded NIH research
• researchers submit final peer-reviewed manuscripts to PubMed Central upon acceptance for publication, within a 12-month period (submission site is online)
http://publicaccess.nih.gov/
5. PubMed Central
http://0-www.ncbi.nlm.nih.gov.novacat.nova.edu/pmc/
PubMed Health
http://0-www.ncbi.nlm.nih.gov.novacat.nova.edu/pubmedhealth/
• Reviews of Clinical Effectiveness Research
• Links out to Wiley Cochrane Library (HPD Library has EBSCO Cochrane Library - go to PubMed to link out)
• Systematic Reviews of Clinical Trial Research (2003)
• PubMed searches PubMed Health
6. Tools:
PubMed:
• My NCBI for PIs > My Saved Data > Manage > Public > Add a Delegate to share bibliography
• My Bibliography
Electronic Research Administration (eRA)
• New awards: register an account
• Link the account to My NCBI
http://era.nih.gov/
7. Author issues:
• Copyright
• Publishing model
• Promotion and Tenure requirements
• Grant publishing requirements
8. Copyright
Asher A. Bucknell Scholarly Communications Practices Survey: Summary Results. Accessed from http://goo.gl/EQUe4 March 18, 2012.
9. Journal Citation Reports
• Journal impact on scholarship in your specialty
• Devised by Eugene Garfield, ISI (1955)
• ISI Web of Knowledge (Thomson Reuters)
• Average number of citations per article
• For the last 2 years
10. JCR Measures
About 10,000 influential journals (not articles or researchers)
• Science Index
• Social Science Index
Search by subject or specific journal
14. New in 2010 edition
• Impact Factor controlled for self-citations
• 1,075 titles receiving an Impact Factor for the first time:
– 1,300 regional titles
• Total of over 10,000 journals representing:
– 2,500 publishers
– 84 countries
15. Impact Factor
• A = number of times articles published in 2008 and 2009 were cited by JCR-indexed journals during 2010
• B = total number of "citable items" published by that journal in 2008 and 2009
• 2010 impact factor = A/B
"Citable items" = articles, reviews, proceedings, or notes; not editorials or letters to the editor
16. Impact factor = 5.395
Divide the number of citations in 2008 to articles published in the past 2 years (2006-2007), 1,133, by the total number of articles published in those 2 years (2006-2007), 210.
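The arithmetic on slides 15-16 is simple enough to sketch in a few lines of Python; the numbers below are the slide's worked example, not real JCR data:

```python
# Worked example from the slide: a journal's 2008 impact factor.
citations_to_recent = 1133  # A: 2008 citations to articles the journal published in 2006-2007
citable_items = 210         # B: "citable items" the journal published in 2006-2007

impact_factor = citations_to_recent / citable_items  # IF = A / B
print(round(impact_factor, 3))  # 5.395
```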
17. Number of citations in 2008 to articles published in the previous five years (2003-2007), divided by the total number of articles published in the previous five years (2003-2007)
19. More…
• New journals are assigned an impact factor after 2 years
• Journals indexed starting with a volume other than the first volume: no impact factor for 3 years
• Annuals and other irregular publications sometimes publish no items in a particular year, affecting the count
20. Eigenfactor (2008)
Eigenfactor Score calculation: www.eigenfactor.org
• Highly cited journals = greater influence in the community ---> weighted citations
• Journal self-citations removed
The Eigenfactor™ Algorithm (2008) was developed by the Eigenfactor™ Project: a bibliometric research project conducted by Professor Carl Bergstrom and his laboratory at the University of Washington.
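The full Eigenfactor algorithm runs over Journal Citation Reports data, but its core idea (weighted citations computed as an eigenvector centrality, with journal self-citations removed) can be sketched on a toy citation matrix. All numbers here are invented for illustration:

```python
import numpy as np

# Toy citation matrix: C[i, j] = citations from journal j to articles in journal i.
C = np.array([[0, 3, 5],
              [2, 0, 1],
              [4, 6, 0]], dtype=float)

np.fill_diagonal(C, 0)        # journal self-citations removed
M = C / C.sum(axis=0)         # normalize each column: where journal j's citations go

score = np.full(3, 1 / 3)     # start with equal influence
for _ in range(100):          # power iteration converges to the leading eigenvector
    score = M @ score
    score /= score.sum()

# Citations from highly cited journals now count for more
# than citations from rarely cited ones.
print(np.round(score, 3))
```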
23. Article Influence Score
Measures a journal's average per-article influence in the scientific community
• Greater than 1 = above average
• Less than 1 = below average
24. Hirsch index (h-index)
Measures the impact of an individual scientist
• Can also be applied to the productivity and impact of a group of scientists, such as a department, university, or country
• Depends on the "academic age" of the researcher
• Only used for comparing scientists in the same field
Suggested by Jorge E. Hirsch, a physicist at UCSD, as a tool for determining theoretical physicists' relative quality; sometimes called the Hirsch index or Hirsch number.
Hirsch JE. Proceedings of the National Academy of Sciences of the United States of America 102 (46): 16569-16572, November 15, 2005.
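As a sketch, the h-index described on slide 24 can be computed from a list of per-paper citation counts (the counts below are hypothetical):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Hypothetical researcher with five papers:
print(h_index([25, 8, 5, 3, 1]))  # 3: three papers have at least 3 citations each
```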
25. JCR faults?
• Western bias?
• Missing high-impact conferences or meetings?
• Popularity vs. Prestige?
26. Editors and Impact Factor
• Recruiting high-profile researchers
• Improved author services
• Boosting media profiles
• Fewer articles per issue
• Increases in article self-citations
28. New Players:
• BMJ Updates: newsworthy articles
• BioMed Central Faculty of 1000 (2002)
• Faculty of Medicine (2006)
• PLoS App - Public Library of Science: Biology, Genetics…
29. Scopus
• Originates from Elsevier - 2004
• No citation analysis before the 1990s
• Slightly more coverage of high-impact conferences
• Coverage begins 1966
• Western bias still present
30. SCImago Journal Rank
• Similar to PageRank
• Variant of Eigenfactor
• Measure of prestige: the SJR indicator
31. SCImago
• Developed from Scopus (Elsevier)
• SCImago research group from the Consejo Superior de Investigaciones Científicas (CSIC) and the Universities of Granada, Extremadura, Carlos III (Madrid), and Alcalá de Henares
http://www.scimagojr.com/index.php
32. Open Access
• Gold OA - full text made freely available electronically on the publisher's web site
• Green OA - author archives an electronic version in a repository for open archival access
• Public model OA - public institutions and government: J-Stage (Japan government-funded research), SciELO (Brazil), Nursing.com, FindArticles…
33. How to Evaluate?
Similar to web site evaluation: the CRAAP test (California State University, Chico)
Meriam Library, California State University, Chico. (2010). Evaluating Information - Applying the CRAAP Test, from http://www.csuchico.edu/lins/handouts/eval_websites.pdf
34. Web Site: CRAAP test
• C = Currency
• R = Relevance
• A = Authority
• A = Accuracy
• P = Purpose
Meriam Library, California State University, Chico. (2010). Evaluating Information - Applying the CRAAP Test. Accessed February 21, 2012 from http://www.csuchico.edu/lins/handouts/eval_websites.pdf
37. Red Flags in OA
• Less than 21-day turn-around peer-review process
• Web site doesn't pass the CRAAP test
• No information about the Editorial Board
• Unreasonable author or processing fees
• Not indexed in many databases (check Ulrich's)
39. Publishing Models
• Traditional Print and Online subscription
• Online only subscription
• Open Access- Gold, Green, Public
• Gray Literature: Conferences, Proceedings
• Wikis, Blogs, Social Networking
40. Open Access
• Investigate the publisher and fees thoroughly
• Some open access journals have higher impact factors than traditional journals
• Can you access the articles?
• Check database indexing for the publication
41. Thank you!
Courtney Mlinar, M.L.S.
Reference/ Academic Support Services Librarian
Liaison College of Pharmacy
(954) 262-3121
cm1470@nova.edu
Editor's Notes
The impact factor, often abbreviated IF, is a measure reflecting the average number of citations to recent articles published in science and social science journals. It is frequently used as an indicator of the relative importance of a journal within its field, with journals with higher impact factors deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI), now part of Thomson Reuters. Impact factors are calculated yearly for those journals that are indexed in Thomson Reuters Journal Citation Reports.
Editors court researchers by hiring experts to attend research presentations (to learn what is hot) and commission papers at these meetings and conferences. Quicker turn-around times (fast-tracking high-impact papers), finding niches in the market, covering topics which interest the general public as well as the readership, and publishing fewer articles are other things that can be done to increase the IF. Most felt Open Access did not affect the IF.
The SJR indicator is a measure of the scientific influence of scholarly journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals where such citations come from. It is a variant of the eigenvector centrality measure used in network theory. Such measures establish the importance of a node in a network based on the principle that connections to high-scoring nodes contribute more to the score of the node. Inspired by the PageRank algorithm, the SJR indicator was developed for use in extremely large and heterogeneous journal citation networks. It is a size-independent indicator; its values order journals by their "average prestige per article" and can be used for journal comparisons in science evaluation processes.