The traditional way to understand and demonstrate your impact, through citation counts, doesn't meet the needs of today's researchers. What Generation Open needs is altmetrics.
In this presentation, we cover:
- what altmetrics are and the types of altmetrics today’s researchers can expect to receive,
- how you can track and share those metrics to get all the credit you deserve, and
- real-life examples of scientists who used altmetrics to get grants and tenure.
Traditional metrics, such as the h-index and journal impact factors, are used to measure the scholarly impact of research. However, in the current climate of accountability by funding providers, fund recipients would benefit from a more comprehensive impact management system (IMS) to facilitate the capture and reporting of narratives (including metrics) about research impact in the academy, on social policy, in industry, and ultimately with the public.
Librarians have always been good at telling and facilitating stories. Research support librarians can use their storytelling skills to contribute to the implementation and administration of an impact management system. Being able to translate research impact into harvestable and reportable metadata is the key.
Assessing Research Impact: Bibliometrics, Citations and the H-Index (Fintan Bracken)
Talk presented by Dr. Fintan Bracken at the Mary Immaculate College Research Day on 1st September 2015. The talk looked at assessing and maximising the impact of the arts and humanities research conducted at Mary Immaculate College in Limerick, Ireland.
LITA’s Altmetrics and Digital Analytics Interest Group is proud to present Heather Coates, Richard Naples, and Lauren Collister in our second free webinar of the season. Heather will introduce the concept of altmetrics with a quick "Altmetrics 101," Richard will discuss the Smithsonian's implementation of Altmetric, and Lauren will share the University of Pittsburgh's experience with Plum Analytics.
Research impact metrics for librarians: calculation & context (Library_Connect)
Slides from the May 19, 2016, Library Connect webinar "Research impact metrics for librarians: calculation & context" with Jenny Delasalle and Andrew Plume.
Watch the webinar at: https://libraryconnect.elsevier.com/library-connect-webinars?commid=199783
ALTMETRICS: A HASTY PEEP INTO NEW SCHOLARLY MEASUREMENTS (Saptarshi Ghosh)
The term 'altmetrics' was proposed in a tweet by Jason Priem, then a PhD student at the School of Information and Library Science at the University of North Carolina at Chapel Hill [https://twitter.com/asnpriem/status/25844968813].
'Altmetrics' is a blend of 'alternative' and 'metrics': the 'alt-' refers to alternatives to traditional metrics such as citation analysis, the impact factor, and download and usage data.
Altmetrics is the creation and study of new metrics, based on the Social Web, for analyzing and informing scholarship (http://altmetrics.org/about/); in other words, the study of new indicators of academic activity grounded in Web 2.0.
Brace for Impact: New Means for Measuring Research Metrics (Mary Ellen Sloane)
As open access journals and repositories gain a foothold in scholarly communication, researchers are finding that the traditional impact factor and citation count metrics only reflect a portion of the dissemination of scholarly works.
New technology, research, and citation tools aid our ability to measure the influence of research. Tools and initiatives such as PLoS Article-Level Metrics, BePress's Author Dashboard, Mendeley, Altmetrics, and ImpactStory are providing a more robust picture of scholarly communication today.
This presentation provides an overview of the impact factor system and new tools for gathering metrics and their relevance for librarians and researchers.
Presentation given at the Library Information Technology Association (LITA) Forum in Louisville, KY, in November 2013.
Webinar slides from the June 8 Library Connect webinar "Researcher profiles and metrics that matter" with: Chris Belter, Bibliometrics Informationist, NIH Library; Andrea Michalek, VP of Research Metrics, Elsevier | Managing Director of Plum Analytics; Ellen Cole, Scholarly Publications Librarian, Learning and Research Services, Northumbria University.
View the webinar at: http://libraryconnect.elsevier.com/library-connect-webinars?commid=257883
Citation Metrics: Established and Emerging Tools (Linda Galloway)
An overview of established and emerging citation analysis tools including Scopus, Web of Science, Google Scholar Citations and altmetric tools used to measure scholarly influence. The presenter will compare and contrast these tools and provide an example of a basic search in each resource.
This presentation was provided by Emma Warren-Jones of Scholarcy, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Finding the Right Journal at the Right Time for the Right Work (Saptarshi Ghosh)
JournalFinder helps you find journals that could be best suited for publishing your scientific article. Please also consult the journal’s Aims and Scope for further guidance. Ultimately, the Editor will decide on how well your article matches the journal.
h index: Benchmark of productivity and impact of researcher (AJAY SEMALTY)
This installment of the Indices of Research series covers the h-index. The h-index (sometimes called the Hirsch index or Hirsch number) is one of several research indices used to measure the productivity and impact of a researcher, research group, or institution. The index grows over time as papers and their citations accumulate, and it is a major benchmark used by employers when recruiting or assessing researchers. This e-module answers the what, how, who, and why of the h-index; the next video covers how to find a researcher's h-index on various platforms (video: https://youtu.be/BAhPzxWVtVE). For any query, please write to us at openknowledgeok@gmail.com, and please subscribe to our YouTube channel. Thanks for your time. --- Team OK
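As a concrete illustration of the definition above, the h-index can be computed in a few lines. This is a minimal sketch: the function name and the sample citation counts are illustrative, not taken from the e-module.

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    h = 0
    # Walk the citation counts from most- to least-cited paper.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# Example: five papers cited 10, 8, 5, 4, and 3 times.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

Note that the h-index can never exceed the number of papers, which is one reason it favors sustained productivity over a single highly cited work.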
Increasingly, many aspects of scholarly communication—particularly publication, research data, and peer review—undergo scrutiny by researchers and scholars. Many of these practitioners are engaging in a variety of ways with Alternative Metrics (#altmetrics in the Twitterverse). Alternative Metrics take many forms but often focus on efforts to move beyond proprietary bibliometrics and traditional forms of peer referencing in assessing the quality and scholarly impact of published work. Join NISO for a webinar that will present several emerging aspects of Alternative Metrics.
The Impact Factor, Eigenfactor, and Altmetrics: From Theory to Analysis (Molly Keener)
Altmetrics is an emerging area encompassing broader assessment of scholarly impact through downloads, links, and online conversations to fill gaps in assessing research. Bibliometrics is the traditional form of measuring the impact of scholarly research through citation rates. The Research & Instruction Librarian for Sciences and the Scholarly Communication Librarian at Wake Forest University will compare bibliometrics and altmetrics, and discuss their applications in science information literacy and research assessment in higher education.
This presentation gives a quick insight into how Scopus can benefit the scientific community and what value it adds to research institutions.
Increasing the speed of discovery and making resources more visible are just a few key drivers for the worldwide success of www.scopus.com.
Read more at http://info.scopus.com
Software Repositories for Research -- An Environmental Scan (Micah Altman)
Presented at the Software Preservation Network Forum:
"We discuss the results of an environmental scan characterizing the current landscape of software repositories, hubs, and publication venues used in research and scholarship. The study aims to characterize the research and scholarship use cases supported by exemplar repositories, their models for sustainability, and the related key affordances (the significant properties each repository offers and maintains). We supplement this with a scan of funder and publisher policies toward software curation and citation, and a summary of key policy resources and guidelines. Using this environmental scan, we discuss a preliminary gap analysis. It is hoped that by addressing these key questions, new insights will emerge into the types of decisions research libraries can expect to make when designing future pilot software curation services."
Citations, often described as intellectual transactions, acknowledgments of intellectual debts, or conceptual associations, link an author's current study to already published work. Citing not only lends credibility to the author's work but also helps funders evaluate the impact of the research. Citation indexes support retrieval of both cited and citing works, facilitating literature searches, and help authors identify how many citations their papers have received. Citation data is widely treated as a legitimate measure for ranking authors, journals, and publishers. Through this webinar, we aim to provide information about citation indexing and how authors and publishers can get indexed in established citation databases.
Quality Assurance for Journal Guidance (Smriti Arora)
- Definitions
- What is the need for quality assurance in journals?
- Types of journals
- Bibliometric indicators
- How to identify credible journals?
- Predatory/cloned journals
Altmetrics attempts to provide timely measures of impact using metrics drawn from HTML views and downloads of scholarly articles, blog posts, tweets, bookmarks, and more. Publishers of scientific research have enabled altmetrics on their articles, open-source applications let platforms display altmetrics on scientific research, and subscription models have been created to measure the online use that research articles receive. This presentation reviews some current models for providing altmetrics, along with information on a selection of the providers that have made altmetrics available for general use.
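As a hedged illustration of how such providers expose their data, the sketch below parses the kind of JSON record returned by Altmetric's public v1 DOI endpoint (https://api.altmetric.com/v1/doi/&lt;DOI&gt;). The field names (`score`, `cited_by_tweeters_count`, `cited_by_posts_count`) follow that API's documented response shape, but the helper function and the sample record values are illustrative assumptions, not taken from any presentation above.

```python
import json

# Sample of the kind of JSON record the Altmetric v1 DOI endpoint returns
# (illustrative values, not a real article's data).
SAMPLE_RECORD = json.loads("""
{"title": "An example article",
 "score": 12.5,
 "cited_by_tweeters_count": 30,
 "cited_by_posts_count": 42}
""")

def summarize_altmetrics(record):
    """Reduce a provider record to the headline numbers a librarian
    might report: attention score, tweeters, and total online posts."""
    return {
        "score": record.get("score", 0),
        "tweeters": record.get("cited_by_tweeters_count", 0),
        "posts": record.get("cited_by_posts_count", 0),
    }

print(summarize_altmetrics(SAMPLE_RECORD))
# -> {'score': 12.5, 'tweeters': 30, 'posts': 42}
```

Using `dict.get` with a default of 0 matters in practice, since provider records omit counters for sources that never mentioned the article.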
Digital strategies to find the right journal for publishing your research (SC CTSI at USC and CHLA)
Date: Apr 3, 2019
Speaker: Duncan Nicholas, Former Development Editor at international academic publisher Taylor and Francis Group, and now Director of DN Journals research publishing consultancy, and Senior Consultant for Enago Academy.
Overview: This webinar will provide an overview of digital tools and initiatives that help researchers select the right journal for their manuscript to ensure the best chance of article acceptance.
Stepping out of the echo chamber - Alternative indicators of scholarly commun... (Andy Tattersall)
These slides were presented at Sheffield Hallam University and the London School of Hygiene and Tropical Medicine. They showcase the many ways academics can leverage digital scholarly communication tools to discover what is being said about their research and how best to respond to that conversation.
Modern research metrics and new models of evaluation have risen high on the academic agenda in the last few years. In this session, two UK institutions that have adopted such metrics across their faculties will share their motivations and experiences of doing so, and explain how they are integrating these data into existing models of review and analysis.
Measuring and Enhancing Your Academic Medical Impact (Marion Sills)
Overview of measuring and enhancing the impact of your scholarly work in academic medicine. The talk reviews how impact is defined and measured, how to improve your own impact metrics and how to describe the impact of your scholarly contributions to science.
Practical applications for altmetrics in a changing metrics landscape (Digital Science)
"Practical applications for altmetrics in a changing metrics landscape" - Sara Rouhi, Altmetric product specialist, and Anirvan Chatterjee, Director Data Strategy for CTSI at UCSF
Altmetrics are here: are you ready to help your faculty? [ALA Research & Stat... (Impactstory Team)
Scholarship is changing, along with the way we measure impact. This webinar explores altmetrics and the crucial role librarians have in helping faculty navigate these changes.
Librarians & altmetrics: Tools, tips and use cases (Library_Connect)
Altmetrics are becoming an integral part of looking at the impact and reach of research. Tracking social and online outlets, altmetrics provide quick feedback from a wide range of sources. In this webinar, library experts will discuss how altmetrics work, tools available, and the application of altmetrics in a range of institutions and for various user groups. Watch the webinar: http://ow.ly/vNeax
Altmetrics: the movement, the tools, and the implications (KR_Barker)
The October 2015 iteration of the class created and taught by Andrea Denton and Kimberley R. Barker, both of the UVA Claude Moore Health Sciences Library.
Disseminating Scientific Research via Twitter: Research Evidence and Practica... (Katja Reuter, PhD)
About one-fifth of current scientific papers are being shared on Twitter. With nearly 69 million active U.S. Twitter users (24% of the U.S. adult population) and 328 million monthly active users worldwide, Twitter is one of the biggest social networks worldwide. Understandably, hopes are high that tweets mentioning scientific articles and research findings can reach peers and the general public. Studies show that most of the engagement with scientific papers on Twitter takes place among members of academia and thus reflects visibility within the scientific community rather than impact on society. However, there are ways to reach the broader public. This webinar will provide an overview of using Twitter to reach peers and non-specialist groups, the relationship between tweets and citations, and provide tips for building an academic Twitter presence.
Speaker: Katja Reuter, PhD, Assistant Professor of Clinical Preventive Medicine at the Institute for Health Promotion and Disease Prevention Research in the Department of Preventive Medicine at the Keck School of Medicine of USC; Director of Digital Innovation and Communication for the Southern California Clinical and Translational Research Institute (SC CTSI).
Learning objectives:
1. Describe the strengths and limitations of using Twitter for the dissemination of scientific research.
2. Describe practical approaches for building an academic presence on Twitter.
3. Describe approaches to identify and reach different audiences on Twitter.
Slides from the workshop on social media for impact presented at the Economic and Social Research Council final year conference, Edinburgh, 25 April 2014: http://www.socsciscotland.ac.uk/events/esrc_fyc_2014
Engaging with Patients Online: The do's and don'ts, and what's to gain (Katja Reuter, PhD)
These slides were presented at the Annual Meeting of the American College of Rheumatology (ACR) and the Association of Rheumatology Health Professionals (ARHP) on Nov 15, 2016, in Washington, DC. The presentation highlights ways in which physician-scientists may reach and engage patients online for different purposes such as health promotion, study recruitment, attracting patients, and reputation building. It also touches on tracking online activities for performance reviews and responding to negative reviews.
To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
This presentation explores a brief idea about the structural and functional attributes of nucleotides, the structure and function of genetic materials along with the impact of UV rays and pH upon them.
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...Travis Hills MN
Travis Hills of Minnesota developed a method to convert waste into high-value dry fertilizer, significantly enriching soil quality. By providing farmers with a valuable resource derived from waste, Travis Hills helps enhance farm profitability while promoting environmental stewardship. Travis Hills' sustainable practices lead to cost savings and increased revenue for farmers by improving resource efficiency and reducing waste.
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...
The Right Metrics for Generation Open [Open Access Week 2014]
1. The Right Metrics
for Generation Open
a guide to getting credit for practicing Open Science
Stacy Konkiel
Open Access Week 2014
All content in this work is licensed under a
Creative Commons Attribution 4.0 International
License unless otherwise indicated.
2. Dr. Holly Bik, computational biologist (Univ. of Birmingham, UK)
13. But what about...
Different audiences? Different types of engagement?
Engagement types: views, discussion, saves, citation, recommendation
Audiences: scholars, practitioners, public
15. We return to our question...
HOW CAN YOU CAPTURE IMPACT?
16. Altmetrics measure impact...
             | scholarly            | public
recommended  | Faculty of 1000      | popular press
cited        | traditional citation | Wikipedia
discussed    | scholarly blogs      | blogs, Twitter
saved        | Mendeley, CiteULike  | Delicious
read         | PDF views            | HTML views
19. Jason Priem, Heather A. Piwowar, Bradley M. Hemminger. (2012). Altmetrics in the wild: Using social media to explore scholarly impact. http://arxiv.org/abs/1203.4745
20. Impact categories
Economic or commercial
Societal
Public policy or services
Health
Environmental
Professional services
http://www.sfi.ie/funding/sfi-research-impact/impacts-and-outputs/types-of-impact.html
Welcome! My name is Stacy Konkiel, and I’m the director of marketing & research at Impactstory, a non-profit devoted to altmetrics.
Before we get started, please “mute” your computer microphones for the presentation portion of this chat. After I conclude, you can unmute ‘em so we can talk freely.
You’re here today because you know that the traditional means of sharing your impact–citation counts for journal articles–aren’t meeting your needs. What you need is a way to understand how your outputs, beyond the journal article, are being shared, saved, discussed, and more on the social web. Altmetrics can give you that information.
Over the next half-hour, I’m going to share with you the key ways that you can get credit for practicing Open Science. I’ll describe what altmetrics are and the types of altmetrics you can expect to receive as someone who practices Open Science. We’ll also cover real life examples of scientists who used altmetrics to get grants and tenure--and how you can do the same.
So, let’s get started!
This is Holly Bik.
Holly is a researcher in metagenomics & environmental sequencing
She’s a research fellow--the equivalent of an Assistant Professor--at the University of Birmingham. In the next few years, as she makes progress towards tenure, Holly will need to publish, get grants, and win awards. Those are some of the ways that she can benchmark her success, and prove it to others.
Oftentimes, as she applies for those ever-important grants and awards, she’ll need to document how her research is making an impact on other scholars and the public.
It used to be that she’d come up with a research idea, conduct her research with a handful of colleagues she already knows, publish in a journal with a high impact factor, and wait a year or two for citations to accumulate (to understand the reception of her work, and also to get tenure).
Dr. Bik would want to come across as being as productive as she possibly can, since the more publications she has, the more favorably she is sometimes looked upon. She might even publish as much as possible on a single study--a practice that some call “salami slicing”--in order to fatten up the publications section of her CV.
Then she’d start all over again with a new or related idea.
This system worked alright for many decades, but it had several disadvantages:
it takes months or years for Holly’s paper to be published, delaying the dispersal of important findings
it takes months or years for Holly’s paper to be cited after publication, making it hard for her to share with others the importance of her paper--something that can be damaging when she’s got to document that paper’s importance when applying for grants or tenure
All of the other research products she created in the course of the study that led to this single paper--her data, the software she used to analyze the data, the slides and posters she presented at conferences when sharing her preliminary results, and so on--never see the light of day, meaning that other researchers cannot build upon her data, reuse her software, and so on, leading to a lot of duplicated efforts and waste.
This system incentivizes “playing it safe”--publishing primarily in journals with high impact factors, and publishing only positive results that are “sexy,” so to speak. Incremental advances, and negative results that could help other scholars rule out inefficient methods, also never see the light of day.
Luckily, things have changed a LOT in the last ten years.
Nowadays, she’ll come up with an idea, start a literature review and save a heap of citations to Mendeley, find an author on Academia.edu, email them and end up becoming collaborators,
Then she’ll start her study and begin to collect genomic data from her samples, post that data to a repository like Figshare or GenBank, and use research software and scripts that she created (and has shared on a website like GitHub or SourceForge) to analyze the data,
She’ll share her preliminary results at a conference, and the talk will be recorded and posted to YouTube and talked about on Twitter by other scientists as it’s happening. After the meeting, she’ll post the slides to SlideShare so researchers who didn’t attend have a chance to see her results.
Once she gets enough feedback to write a preprint on the study, she posts it to the arXiv.org repository to get even more feedback, often submitted via Twitter and blogs by other bioinformaticians. They help her hash out some problems with the study and shape it up so that it’s good enough to submit as a formal journal article, which, along with her preprint, accumulates citations. It’s also bookmarked on Mendeley, shared and discussed on Google+, and even makes it into a Wikipedia article related to the study subject.
Holly then blogs about her study to help communicate it to the public and other researchers.
All this time, while she’s gathering feedback, presenting, etc--she’s getting altmetrics for her work. And she’s seeing that her work is being used and discussed not only among other bioinformaticians, but also among the public and policymakers--two audiences that she couldn’t have known about if she were only tracking citations to her work. She then includes that information in her grant progress reports, so her funder--the National Science Foundation--can see the impacts she’s having and--hopefully--give her more money to continue her work in the coming years.
Here is what Holly’s story has to do with you.
Holly is you.
Early career researchers are changing the way science is done, making Open Science the “New Normal.” And the systems that are in place to recognize and reward researchers, awarding them jobs, giving them tenure, promoting them, and securing grants so they can conduct their research, are starting to change, too.
In the US, the National Science Foundation recently changed their award guidelines to recognize not just important publications, but also important products--making it possible for Holly to showcase research software she’s created that’s been very useful to others in her community, and so on.
And the Wellcome Trust in the UK is using altmetrics to understand the reach and impacts of the research they fund.
Also in the UK, faculty, funders and universities are exploring the role of metrics, including altmetrics, in the Research Excellence Framework exercise, which helps determine how funding is allocated to universities at a national level.
We’re also beginning to hear from faculty, who are using altmetrics to document their impacts for the purposes of jobs, tenure, and promotion.
We’ll get into specific examples later on in the presentation, but I want to make it clear, as we begin: universities like University of North Carolina and Indiana University are increasingly writing guidelines for evaluating so-called “digital scholarship”--that is, web native research products like datasets, software, and so on--into tenure & promotion guidelines. And researchers are beginning to use altmetrics to show just how widely their digital scholarship is being used, by whom, and how.
So. It’s clear that altmetrics are an important part of documenting scholarly impact for cutting edge researchers like you, and that universities and funders are taking these supplementary metrics seriously.
Let’s get down to brass tacks and talk about the basics of altmetrics and how they can be used.
Altmetrics measure the attention surrounding the work you’ve made available online, for all sorts of scholarly products beyond the journal article.
That’s because the science communication lifecycle has many components. It starts with an idea--and data. Then it goes on through the thinking and analysis phase. Eventually, stories are told in order to communicate your findings to others--those stories are often in the form of journal articles, but can also be in the form of blogposts, videos, and so on. And then conversation surrounding your work happens--sometimes in the literature, where other researchers might discuss your articles or data, citing your past work; sometimes it happens out in the open, on blogs and Twitter, or on journal club websites like PubPeer or post-publication peer review sites like Publons. Eventually, those conversations may lead to future research.
Point is, a lot is happening with your work.
So, how can you capture the impacts your work is having?
It used to be simpler. We charted the path of scholarly communication using citations to journal articles. They served as an important breadcrumb, helping us to credit work that came before our own. And they were also an important indicator for us as authors, helping us to know whether our articles were considered useful, important, or interesting to our colleagues.
In the past, when we were only measuring citations, we could easily manage our scholarly identities. It was like sipping from a small data stream. We’d put citations onto our CVs and into our annual reviews and be done with it. Impact backed up with data. Job done.
But now, administrators and funders often want to know: how’s your work making an impact in the quote-unquote real world? What effect is it having not only on other scholars, but also on practitioners, the public, and policymakers?
And it’s great that others are citing your work, but are they reading it? Do they consider it worthy of saving and revisiting, recommending to others, discussing in their spare time?
As we showed from our example with Holly, there’s a lot of engagement that it’s now possible to measure, and you can measure it for research outputs other than journal articles.
Now it’s like drinking from a firehose--so much information we sometimes can’t make sense of it all!
So, How can we measure the impact of web native scholarship?
That’s where altmetrics come in.
Altmetrics can help you measure and understand your impact, for all of your diverse research outputs, among many different stakeholder audiences.
The term altmetrics refers specifically to social media metrics and other things that can be measured online, like pageviews and downloads, related to scholarship.
Altmetrics aren’t only for papers, but also all the other outputs that are created along every step of the research lifecycle.
And we can use altmetrics to understand the many varied flavors of impact of our work, among different stakeholder audiences.
For example,
this 2012 study, “Altmetrics in the Wild”, introduced several example “flavors” by looking at correlations between different types of altmetrics for articles published in PLOS journals. They saw correlations between:
How often something was read and then later cited--you can see here in the heatmap that there are moderate correlations between how often something is viewed and downloaded and how often it is later cited in the literature.
Another flavor is the “popular hit,” which was discovered by looking at correlations between Facebook and Twitter shares
And finally, there’s the expert pick, seen when we examine the moderate correlations between post-publication expert peer review on Faculty of 1000 Prime and later citations.
Other researchers are using altmetrics to find other types of flavors. Lutz Bornmann recently found that research articles tagged on Faculty of 1000 as being “Good for teaching” are often more widely shared on non-scholarly social networks--a useful indicator of popularity among non-scientists.
One can imagine finding other flavors for non-article outputs like datasets and software. This is an area that hasn’t been widely studied yet, but we’re starting to explore it. For example, what if your dataset is being used in the classroom to teach concepts to undergraduates? You’d see your dataset included in syllabi and shared in lesson plans--a great contribution you’re making to training future scientists. And what if your software is being installed and reused by hundreds of other scientists in your domain, since it’s so cleverly created and does so much? Those installation statistics, and mentions of your software in journal articles, are two useful indicators of the scholarly impact your software is having.
Now, the goal is not to compare flavors: one flavor is not objectively better than another. However, recognizing different types of contributions might help us appreciate scholarly products for the particular needs they meet.
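At its core, the flavor analysis described above is just pairwise correlation between per-article metric counts. Here is a minimal sketch of that idea; all of the counts are made up for illustration (the 2012 study used real PLOS article-level data), and the metric names are hypothetical:

```python
# Sketch of a "flavors" correlation analysis: correlate each pair of
# altmetric types across a set of articles. All numbers are invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical counts for five articles:
metrics = {
    "html_views": [120, 450, 80, 300, 95],
    "citations":  [3, 12, 1, 8, 2],
    "tweets":     [5, 40, 2, 10, 60],
}

# Correlate every pair of metric types, as in a flavor heatmap:
names = list(metrics)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(a, "vs", b, round(pearson(metrics[a], metrics[b]), 2))
```

A strong views-vs-citations correlation would suggest a "read then cited" flavor, while tweets decoupled from citations would suggest a "popular hit."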
And there are a lot of needs to be met. These impact categories were created by Science Foundation Ireland to help researchers and the public understand the many ways in which funded research can have an impact.
Economic or commercial
A new business sector or activity has been created.
A business or sector has adopted a new or significantly improved technology or process
Societal
The awareness, attitudes, education and understanding of the public have been enhanced by engaging them with research of social or cultural significance.
The work of an NGO has been improved or enhanced by the research.
Public policy or services
Policy debate has been stimulated or informed by research evidence.
Policy decisions or changes to legislation, regulations or guidelines have been informed by research evidence.
and so on. So, what types of metrics can you expect to find for your work, based on what type of work you’re sharing?
This is not comprehensive--it’s a primer.
Let’s look at metrics for papers first.
For other scholarly products, the type of metrics you receive are often related to the platform where it’s being shared.
So, for software that’s often shared on sites like GitHub, here’s what you can expect
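As a concrete illustration, basic usage metrics for software hosted on GitHub can be read straight from GitHub’s public REST API. The repository name and sample payload below are hypothetical; the field names match the real `/repos/{owner}/{repo}` response:

```python
# Sketch: collecting basic GitHub metrics for a software product.
# The repo name and sample payload are illustrative assumptions; the
# JSON field names come from GitHub's REST API repository endpoint.

def repo_api_url(owner, repo):
    """Build the GitHub REST API URL for a repository."""
    return f"https://api.github.com/repos/{owner}/{repo}"

def extract_metrics(payload):
    """Pull the altmetrics-relevant counts out of a repo API response."""
    return {
        "stars": payload["stargazers_count"],
        "forks": payload["forks_count"],
        "watchers": payload["subscribers_count"],
    }

# A trimmed-down example response for an imaginary repository:
sample = {"stargazers_count": 142, "forks_count": 31, "subscribers_count": 12}
print(repo_api_url("example-lab", "example-tool"))
print(extract_metrics(sample))
```

Forks in particular are a reuse signal: each fork is another researcher taking a copy of your code to build on.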
So, how do you go about finding these metrics?
You can use altmetrics aggregators to discover where your research is being talked about, cited, shared, and bookmarked online.
There are three main independent aggregators out there today.
Altmetric.com is a London-based startup that is primarily used by publishers and institutions to track the impact of publications. But they also track anything else with a permanent ID like a DOI, such as data and figures shared on Figshare. Their strengths are that they do publications *really* well, and they’re also great at mining mainstream media mentions and mentions in public policy documents--so they’re great for showing the impact of your publications. Their weaknesses are that they are focused mainly on publications, and that they don’t aggregate metrics for your work into a single profile--at least not via their browser bookmarklet, which is free for authors. If you want to track multiple publications, you have to use the bookmarklet to sign up for email updates for each paper.
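For the curious, Altmetric.com also exposes a free public API keyed by DOI, which is how you can check a single paper programmatically. The endpoint shape (`api.altmetric.com/v1/doi/{doi}`) is real; the DOI and the response fields mentioned in the comment are illustrative assumptions:

```python
# Sketch: querying Altmetric.com's free public API for one DOI.
# The DOI below is a placeholder; a 404 response means Altmetric has
# no attention data recorded for that DOI.
import json
from urllib.request import urlopen

def altmetric_url(doi):
    """Build the Altmetric v1 details URL for a DOI."""
    return f"https://api.altmetric.com/v1/doi/{doi}"

def fetch_attention(doi):
    """Return the parsed JSON attention record for one DOI."""
    with urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)

# Example (commented out to avoid a live network call); the record is
# a dict with fields such as an attention score and per-source counts:
# record = fetch_attention("10.1371/journal.pone.0000000")
# print(record.get("score"))
print(altmetric_url("10.1371/journal.pone.0000000"))
```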
PlumX is a US-based company that was recently acquired by EBSCO. They track metrics for a variety of research outputs, and are oriented towards meeting the needs of funders and institutions. Their strengths are that they aggregate the most diverse data sources for all types of research outputs, making them the altmetrics provider with the most complete coverage. A downside to PlumX is that it’s not possible for individuals to sign up for profiles at this time--they only sell institutional subscriptions.
I work for Impactstory.org. We’re a US and Canada-based non-profit startup, and we also track metrics for all your research outputs--Articles Datasets Figures Posters Slides Software & more. We offer researcher-based profiles, built specifically to meet the needs of scientists. We’re committed to open data, open source software, and operate our non-profit on the premise of radical transparency. Individual subscriptions are the only subscriptions you can get, which is a downside for some, who’d like to use our platform to track impacts for their institutions. But it’s also our biggest strength--with the laser-sharp focus on researchers, we’re able to meet the needs of individual scientists really well.
That said, let’s finish this webinar by looking at individuals who have used altmetrics data from Impactstory and other platforms to advance their career.
Steven Roberts studies the effects of environmental change on shellfish at a university in the United States.
Steven used altmetrics in his tenure dossier at the University of Washington.
Here’s how he did it:
He included this table in the Education and Outreach section of his dossier.
He also included metrics in the CV itself, to showcase the impacts of his scholarly work.
For example, the SlideShare views of his slides for this invited presentation,
and the article views and supplemental data views for this paper.
Ahmed’s a biologist who has made a significant contribution to the understanding of the origin of photosynthesis in eukaryotes.
Ahmed used altmetrics in his tenure dossier at the American University in Cairo in 2013.
Here’s how he did it: in the narrative section of his dossier, he included a section on the quote-unquote total impact of his work, showing a screencap of his most impactful publications from his Impactstory profile, alongside badges that showed the relative attention his work has received (highly discussed by scholars, highly viewed by the public, and so on).
He also described a software product he created, JAligner, that is used often by others in his field. In his description, he included a link to all the citations that JAligner has received on Google Scholar--over 160 citations!
Ahmed actually asked a member of his review committee what they thought about the inclusion of altmetrics, and they said they were impressed!
Holly has developed a bioinformatics data visualization tool called Phinch, which was funded by the Alfred P Sloan foundation.
When reporting back to Sloan on the success of her project, she included Figshare views for related posters and talks, Github statistics for the Phinch software, and other altmetrics related to the varied outputs that the project created over the last few years.
Holly’s hopeful that these metrics, in addition to the traditional metrics she’s reported to Sloan, will make a great case for renewal funding, so they can continue their work on Phinch.
Funders and an increasing number of universities are accepting altmetrics as supplementary measures of impact nowadays. Whether or not you use them is up to you--if you think your situation warrants a more traditionally-minded approach, by all means, be traditional. But in situations where you need to make your diverse impacts clear, and when you need to make a case for non-traditional impacts, altmetrics can help.
Thanks very much!
I’ll make a short plug here for using Impactstory when aggregating altmetrics.
You can sign up for a free, 30-day trial at Impactstory to track the impacts of your work. And if you’re interested in subscribing, we’re offering a “2 years for the price of 1” deal. Just use coupon code “OAW_241” when signing up!
Now, questions? Thoughts? Let’s talk about how altmetrics might be useful for you.