"Practical applications for altmetrics in a changing metrics landscape" - Sara Rouhi, Altmetric product specialist, and Anirvan Chatterjee, Director Data Strategy for CTSI at UCSF
Research information management: making sense of it all - Digital Science
"Research information management: making sense of it all" - Julia Hawks, VP North America, Symplectic
Slides from Shaking It Up: Challenges and Solutions in Scholarly Information Management, San Francisco, April 22, 2015
This presentation was provided by Vincent Cassidy of The IET during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
The Kaleidoscope of Impact: same data, different perspectives, constantly cha... - Kudos
Scholars, scientists, academic institutions, publishers and funders are all interested in impact. We have different roles and goals, and therefore different reasons for needing to understand impact; we are therefore asking different questions about impact, and those questions continue to evolve, much as the concept of impact itself is evolving. To answer our different questions, do we need different data, in separate silos, or are we looking at the same data, from different angles? This session gathered researcher, library, publisher and metrics provider perspectives to consider who has an interest in impact, what data they are interested in, how they use it, and how the situation is evolving as, for example, business models and technical infrastructures shift.
This presentation was provided by Emma Warren-Jones of Scholarcy, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Feb 26 NISO Training Thursday
Crafting a Scientific Data Management Plan
About the Training
Addressing a data management plan for the first time can be an intimidating exercise. Join NISO for a hands-on workshop that will guide you through the elements of creating a data management plan, including gathering necessary information, identifying needed resources, and navigating potential pitfalls. Participants explore the important components of a data management plan and critique excerpts of sample plans provided by the instructors.
This session is meant to be a guided, step-by-step session that will follow the February 18 NISO Virtual Conference, Scientific Data Management: Caring for Your Institution and its Intellectual Wealth.
About the Instructors
Kiyomi D. Deards, MSLIS, Assistant Professor, University of Nebraska-Lincoln Libraries
Jennifer Thoegersen, Data Curation Librarian, University of Nebraska-Lincoln Libraries
This presentation was provided by Bert Carelli of TrendMD, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
This presentation was provided by Joe Zucca of the University of Pennsylvania, during Session Five of the NISO event "Assessment Practices and Metrics for the 21st Century," held on November 22, 2019.
Capturing and Analyzing Publication, Citation and Usage Data for Contextual C... - NASIG
Libraries have long sought to demonstrate the value of their collections through a variety of usage statistics. Traditionally, a strong emphasis is placed on high usage statistics when evaluating journals in collection development discussions. However, as budget pressures persist, administrators are increasingly concerned with looking beyond traditional usage metrics to determine the real impact of library services and collections. By examining journal usage in the context of scholarly communication, we hope to gain a more holistic understanding of the use and impact of our library’s resources. In this session, we begin by outlining our methodology for gathering comprehensive publication and citation data for authors affiliated with Northwestern University’s Feinberg School of Medicine, utilizing Web of Science as our primary data source and leveraging a custom Python script to manage the data. Using this data we discuss various potential metrics that could be employed to measure and evaluate journals in institutional and field-specific contexts, including but not limited to: number of publications and references per journal, co-citation networks, percentage of references per journal, and increases or decreases of references over time per title. We then consider the development of normalized benchmarks and criteria for creating field-specific core journal lists. We also discuss a process for establishing usage thresholds to evaluate existing journal subscriptions and to highlight potential gaps in the collection. Finally, we apply and compare these metrics to traditional collection development tools like COUNTER usage reports, cost-per-use analysis, Inter-Library Loan statistics and turnaway reports, to determine what correlations or discrepancies might exist. 
We finish by highlighting some use-cases which demonstrate the value of considering publication and citation metrics, and provide suggestions for incorporating these metrics into library collection development practices.
Speakers: Joelen Pastva and Jonathan Shank, Northwestern University
Project GitHub page: https://goo.gl/2C2Pcy
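The abstract above mentions a custom Python script for managing Web of Science data and metrics such as the percentage of references per journal. As a rough illustration only (the record structure, field name, and function below are my assumptions, not the presenters' actual script), the per-journal reference share might be computed like this:

```python
from collections import Counter

# Hypothetical cited-reference records; in the presenters' workflow these
# would be parsed from Web of Science exports for Feinberg-affiliated authors.
cited_refs = [
    {"journal": "JAMA"}, {"journal": "JAMA"}, {"journal": "NEJM"},
    {"journal": "NEJM"}, {"journal": "NEJM"}, {"journal": "Lancet"},
]

def reference_share(refs):
    """Percentage of all cited references attributable to each journal."""
    counts = Counter(r["journal"] for r in refs)
    total = sum(counts.values())
    return {journal: 100 * n / total for journal, n in counts.items()}

shares = reference_share(cited_refs)
# In this toy sample, NEJM accounts for half of all cited references.
```

Tracking the same shares across publication years would give the "increases or decreases of references over time per title" the abstract describes.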
Presentation given at the British Library Turing workshop on Software Citation, considering what lessons could be learned from the world of data citation
Wouter Haak's presentation on open science and research data management from the Elsevier Library Connect Event 2016 "Navigating the new publishing & open science terrain: what librarians need to know." Wouter is Elsevier's Vice President of Research Data Management Solutions.
This presentation was provided by Carly Strasser of the Chan Zuckerberg Initiative during the NISO hot topic virtual conference "Effective Data Management," which was held on September 29, 2021.
February 18 2015 NISO Virtual Conference
Scientific Data Management: Caring for Your Institution and its Intellectual Wealth
Network Effects: RMap Project
Sheila M. Morrissey, Senior Researcher, ITHAKA
June 18, 2014
NISO Virtual Conference: Transforming Assessment: Alternative Metrics and Other Trends
Assessing and Reporting Research Impact – A Role for the Library
- Kristi L. Holmes, Ph.D., Director, Galter Health Sciences Library, Northwestern University, Feinberg School of Medicine
RDAP 16: If I could turn back time: Looking back on 2+ years of DMP consultin... - ASIS&T
Research Data Access and Preservation Summit, 2016
Atlanta, GA
May 4-7, 2016
Part of Panel 5, "DMPs and Public Access: Agency and Data Service Experiences"
Presenter:
Angi Ogier, Virginia Tech
Panel Lead:
Margaret Henderson, Virginia Commonwealth University
This presentation was provided by Dr. Paul Burton of the University of Bristol during the NISO Symposium, Privacy Implications of Research Data, held on September 11, 2016, in conjunction with the International Data Week in Denver, Colorado.
February 18 2015 NISO Virtual Conference Scientific Data Management: Caring for Your Institution and its Intellectual Wealth
Keynote Address: Data Management Plan Requirements at the US Department of Energy
Laura J. Biven, Ph.D., Senior Science and Technology Advisor, Office of the Deputy Director for Science Programs, Office of Science, US Department of Energy
RDAP 16: DMPs and Public Access: Agency and Data Service Experiences - ASIS&T
Research Data Access and Preservation Summit, 2016
Atlanta, GA
May 4-7, 2016
Outline for Panel 5, "DMPs and Public Access: Agency and Data Service Experiences"
Panel Lead:
Margaret Henderson, Virginia Commonwealth University
RDAP 16: Perspective on DMPs, Funders and Public Access (Panel 5: DMPs and Pu... - ASIS&T
Research Data Access and Preservation Summit, 2016
Atlanta, GA
May 4-7, 2016
Part of Panel 5, "DMPs and Public Access: Agency and Data Service Experiences"
Presenter:
Jonathan Petters, Johns Hopkins University
Panel Lead:
Margaret Henderson, Virginia Commonwealth University
This presentation was provided by Micah Altman of MIT during the August 10 NISO webinar, How Libraries Use, Support and Can Implement Researcher Identifiers
RDAP 16: Data Management Plan Perspectives (Panel 5, DMPs and Public Access) - ASIS&T
Research Data Access and Preservation Summit, 2016
Atlanta, GA
May 4-7, 2016
Part of Panel 5, "DMPs and Public Access: Agency and Data Service Experiences"
Presenter:
Laura J. Biven, US Department of Energy
Panel Lead:
Margaret Henderson, Virginia Commonwealth University
Modern research metrics and new models of evaluation have risen high on the academic agenda in the last few years. In this session two UK institutions who have adopted such metrics across their faculty will share their motivations and experiences of doing so, and explain further how they are integrating these data into existing models of review and analysis.
Academics must provide evidence to demonstrate the impact and outcomes of their scholarly work. This webinar, presented by librarians, will help faculty explore various forms of documentary evidence to support their case for excellence. Sponsored by the IUPUI Office of Academic Affairs.
Note: The webinar included demonstrations of Web of Science & Scopus, which the slides do not reflect.
Gather evidence to demonstrate the impact of your research - IUPUI
This workshop is the 3rd in a series of 4 titled "Maximize your impact" offered by the IUPUI University Library Center for Digital Scholarship. Faculty must provide strong evidence of impact in order to achieve promotion and tenure. Having strong evidence in year 5 is made easier by strategic dissemination early in your tenure track. In this hands-on workshop, we will introduce key sources of evidence to support your case, demonstrate strategies for gathering this evidence, and provide a variety of examples. These sources include citation metrics, article level metrics, and altmetrics as indicators of impact to support your narrative of excellence.
Altmetrics: the movement, the tools, and the implications - KR_Barker
The October 2015 iteration of the class created and taught by Andrea Denton and Kimberley R. Barker, both of the UVA Claude Moore Health Sciences Library.
Altmetrics: the movement, the tools, and the implications - KR_Barker
Measuring scholarly impact has traditionally been tied to the calculation of a scholarly article’s number of citations and the Impact Factor of its journal. Today, however, scholarly contributions take many forms: computer code, data sets, blog postings, tweets, practice guidelines and beyond. As the products of research evolve, so will the way in which credit is measured. This class will provide an overview of “altmetrics”, the movement to assess influence of both traditional and non-traditional scholarly contributions. We will define altmetrics, discuss why it is important in today’s digital scholarly environment, and demonstrate tools available to measure influence. After completing this course, the learner will be able to define altmetrics and compare it to traditional forms of measuring scholarly impact; name examples of scholarly contributions that are alternatives to traditional methods (e.g. datasets, blog postings, tweets, etc.); name examples of alternative means of measuring scholarly contributions (e.g. download counts, tweets about, etc.); discuss why today’s online, social environment necessitates a change in the way scholarly contributions are measured; name resources to learn more about altmetrics such as altmetrics.org; and name tools to measure alternative scholarly contributions such as Altmetric.com, Impact Story, Plum Analytics, etc.
Understanding impact through alternative metrics: developing library-based as... - Kristi Holmes
There’s never been a more critical need to better understand the impact of research efforts. The challenging state of funding models (1) and an enhanced pressure on young investigators to stand out from the crowd magnify this need as well as the perceived value of locally based impact services. These services are leveraged by a diverse range of stakeholders, from individuals to university-level decision makers and strategists. Individuals often wish to better demonstrate impact of published works to promotion committees or describe the impact of research studies to funding agencies when applying for funding or complying with institution-level or federal reporting exercises. Research groups, departments, and institutions often wish to discover how research findings are being used to promote science and gain a better overall view of research publications and outputs.
Libraries are particularly well positioned to provide a more nuanced view of impact. Libraries are trusted, neutral parties with a tradition of service and support, and they often act as technology hubs on campus with IT and data expertise. Librarians are trained information professionals with information and searching skills and a keen understanding of the research, education, and clinical landscape of their institution. This presentation will discuss general trends in the field, including an overview of resources, assessment frameworks, and tools; strategies for partnering with stakeholders; and examples of library-based service models, from basic services to highly integrated library-based core research units.
(1) http://dx.doi.org/10.1126/scitranslmed.aac5200
LITA’s Altmetrics and Digital Analytics Interest Group is proud to present Heather Coates, Richard Naples, and Lauren Collister in our second free webinar of the season. Heather will introduce the concept of altmetrics with a quick "Altmetrics 101," Richard will discuss the Smithsonian's implementation of Altmetric, and Lauren will share the University of Pittsburgh's experience with Plum Analytics.
Digging for data: opportunities and challenges in an open research landscape_... - Platforma Otwartej Nauki
“Open Research Data: Implications for Science and Society”, Warsaw, Poland, May 28–29, 2015, conference organized by the Open Science Platform — an initiative of the Interdisciplinary Centre for Mathematical and Computational Modelling at the University of Warsaw. pon.edu.pl @OpenSciPlatform #ORD2015
ALTMETRICS: A HASTY PEEP INTO NEW SCHOLARLY MEASUREMENTS - Saptarshi Ghosh
The term ‘Altmetrics’ was proposed in a tweet by Jason Priem, then a PhD student at the School of Information and Library Science at the University of North Carolina, Chapel Hill [https://twitter.com/asnpriem/status/25844968813].
Altmetrics is a blend of two words, ‘alternative’ and ‘metrics’, in which the ‘alt-’ part refers to alternative types of metrics (that is, alternatives to traditional metrics such as citation analysis, the impact factor, and download and usage data).
Altmetrics is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship (http://altmetrics.org/about/). It is the study of new indicators for the analysis of academic activity based on Web 2.0.
On November 21st 2014 at the Tufts University Medford campus and November 25th 2014 at the campus of the University of Massachusetts Medical School in Worcester, the BLC and Digital Science hosted a workshop focused on better understanding the research information management landscape.
Jonathan Breeze, CEO of Symplectic, reflected on the emergence of research information management systems and the resulting benefits they can provide.
Lecture on "Altmetrics: An Alternative View-Point to Assess Research Impact" in the five-day Advanced Training Programme on Bibliometrics and Research Output Analysis, held 15-20 June 2015 at the INFLIBNET Centre, Gandhinagar.
Overview to: BBSRC Oxford Doctoral Training Partnership - Dr Sansone - July 2014 - Susanna-Assunta Sansone
What BBSRC DTP first-year students need to know when planning a data management strategy and preparing a data management statement for a research proposal
How altmetrics can help researchers broaden the reach of their work
Slides from workshop to pepnet (Public Engagement network) at the University of Leeds on 28th November 2018
The increased availability of biomedical data, particularly in the public domain, offers the opportunity to better understand human health and to develop effective therapeutics for a wide range of unmet medical needs. However, data scientists remain stymied by the fact that data remain hard to find and to productively reuse because data and their metadata i) are wholly inaccessible, ii) are in non-standard or incompatible representations, iii) do not conform to community standards, and iv) have unclear or highly restricted terms and conditions that preclude legitimate reuse. These limitations require a rethink of how data can be made machine- and AI-ready - the key motivation behind the FAIR Guiding Principles. Concurrently, while recent efforts have explored the use of deep learning to fuse disparate data into predictive models for a wide range of biomedical applications, these models often fail even when the correct answer is already known, and fail to explain individual predictions in terms that data scientists can appreciate. These limitations suggest that new methods to produce practical artificial intelligence are still needed.
In this talk, I will discuss our work in (1) building an integrative knowledge infrastructure to prepare FAIR and "AI-ready" data and services along with (2) neurosymbolic AI methods to improve the quality of predictions and to generate plausible explanations. Attention is given to standards, platforms, and methods to wrangle knowledge into simple, but effective semantic and latent representations, and to make these available into standards-compliant and discoverable interfaces that can be used in model building, validation, and explanation. Our work, and those of others in the field, creates a baseline for building trustworthy and easy to deploy AI models in biomedicine.
Bio
Dr. Michel Dumontier is the Distinguished Professor of Data Science at Maastricht University, founder and executive director of the Institute of Data Science, and co-founder of the FAIR (Findable, Accessible, Interoperable and Reusable) data principles. His research explores socio-technological approaches for responsible discovery science, which includes collaborative multi-modal knowledge graphs, privacy-preserving distributed data mining, and AI methods for drug discovery and personalized medicine. His work is supported through the Dutch National Research Agenda, the Netherlands Organisation for Scientific Research, Horizon Europe, the European Open Science Cloud, the US National Institutes of Health, and a Marie-Curie Innovative Training Network. He is the editor-in-chief for the journal Data Science and is internationally recognized for his contributions in bioinformatics, biomedical informatics, and semantic technologies including ontologies and linked data.
Slide 1: Title Slide
Extrachromosomal Inheritance
Slide 2: Introduction to Extrachromosomal Inheritance
Definition: Extrachromosomal inheritance refers to the transmission of genetic material that is not found within the nucleus.
Key Components: Involves genes located in mitochondria, chloroplasts, and plasmids.
Slide 3: Mitochondrial Inheritance
Mitochondria: Organelles responsible for energy production.
Mitochondrial DNA (mtDNA): Circular DNA molecule found in mitochondria.
Inheritance Pattern: Maternally inherited, meaning it is passed from mothers to all their offspring.
Diseases: Examples include Leber’s hereditary optic neuropathy (LHON) and mitochondrial myopathy.
Slide 4: Chloroplast Inheritance
Chloroplasts: Organelles responsible for photosynthesis in plants.
Chloroplast DNA (cpDNA): Circular DNA molecule found in chloroplasts.
Inheritance Pattern: Often maternally inherited in most plants, but can vary in some species.
Examples: Variegation in plants, where leaf color patterns are determined by chloroplast DNA.
Slide 5: Plasmid Inheritance
Plasmids: Small, circular DNA molecules found in bacteria and some eukaryotes.
Features: Can carry antibiotic resistance genes and can be transferred between cells through processes like conjugation.
Significance: Important in biotechnology for gene cloning and genetic engineering.
Slide 6: Mechanisms of Extrachromosomal Inheritance
Non-Mendelian Patterns: Do not follow Mendel’s laws of inheritance.
Cytoplasmic Segregation: During cell division, organelles like mitochondria and chloroplasts are randomly distributed to daughter cells.
Heteroplasmy: Presence of more than one type of organellar genome within a cell, leading to variation in expression.
Slide 7: Examples of Extrachromosomal Inheritance
Four O’clock Plant (Mirabilis jalapa): Shows variegated leaves due to different cpDNA in leaf cells.
Petite Mutants in Yeast: Result from mutations in mitochondrial DNA affecting respiration.
Slide 8: Importance of Extrachromosomal Inheritance
Evolution: Provides insight into the evolution of eukaryotic cells.
Medicine: Understanding mitochondrial inheritance helps in diagnosing and treating mitochondrial diseases.
Agriculture: Chloroplast inheritance can be used in plant breeding and genetic modification.
Slide 9: Recent Research and Advances
Gene Editing: Techniques like CRISPR-Cas9 are being used to edit mitochondrial and chloroplast DNA.
Therapies: Development of mitochondrial replacement therapy (MRT) for preventing mitochondrial diseases.
Slide 10: Conclusion
Summary: Extrachromosomal inheritance involves the transmission of genetic material outside the nucleus and plays a crucial role in genetics, medicine, and biotechnology.
Future Directions: Continued research and technological advancements hold promise for new treatments and applications.
Slide 11: Questions and Discussion
Invite Audience: Open the floor for any questions or further discussion on the topic.
Cancer cell metabolism: special reference to the lactate pathway - AADYARAJPANDEY1
Normal Cell Metabolism:
Cellular respiration describes the series of steps that cells use to break down sugar and other chemicals to get the energy we need to function.
Energy is stored in the bonds of glucose and when glucose is broken down, much of that energy is released.
Cells utilize energy in the form of ATP.
The first step of respiration is called glycolysis. In a series of steps, glycolysis breaks glucose into two smaller molecules of a chemical called pyruvate. A small amount of ATP is formed during this process.
Most healthy cells continue the breakdown in a second process, called the Krebs cycle. The Krebs cycle allows cells to "burn" the pyruvate made in glycolysis to get more ATP.
The last step in the breakdown of glucose is called oxidative phosphorylation (Ox-Phos).
It takes place in specialized cell structures called mitochondria. This process produces a large amount of ATP. Importantly, cells need oxygen to complete oxidative phosphorylation.
If a cell completes only glycolysis, only 2 molecules of ATP are made per glucose. However, if the cell completes the entire respiration process (glycolysis - Krebs cycle - oxidative phosphorylation), about 36 molecules of ATP are created, giving it much more energy to use.
IN CANCER CELL:
Unlike healthy cells that "burn" the entire molecule of sugar to capture a large amount of energy as ATP, cancer cells are wasteful.
Cancer cells only partially break down sugar molecules. They overuse the first step of respiration, glycolysis. They frequently do not complete the second step, oxidative phosphorylation.
This results in only 2 molecules of ATP per each glucose molecule instead of the 36 or so ATPs healthy cells gain. As a result, cancer cells need to use a lot more sugar molecules to get enough energy to survive.
Introduction to the Warburg Phenomenon:
WARBURG EFFECT: Cancer cells are usually highly glycolytic (glucose addiction) and take up more glucose from outside than normal cells do.
Otto Heinrich Warburg (8 October 1883 – 1 August 1970) was awarded the Nobel Prize in Physiology or Medicine in 1931 for his "discovery of the nature and mode of action of the respiratory enzyme."
The tendency of cancer cells under aerobic (well-oxygenated) conditions to metabolize glucose to lactate (aerobic glycolysis) is known as the Warburg effect. Warburg made the observation that tumor slices consume glucose and secrete lactate at a higher rate than normal tissues.
"Observation of Io’s Resurfacing via Plume Deposition Using Ground-based Adapt..." - Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes on Io’s surface have been monitored from both spacecraft and ground-based telescopes. Here, we present the highest spatial resolution images of Io ever obtained from a ground-based telescope. These images, acquired by the SHARK-VIS instrument on the Large Binocular Telescope, show evidence of a major resurfacing event on Io’s trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images show that a plume deposit from a powerful eruption at Pillan Patera has covered part of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high-resolution imaging of Io’s surface using adaptive optics at visible wavelengths.
"Seminar on U.V. Spectroscopy" - SAMIR PANDA
Spectroscopy is a branch of science dealing with the study of the interaction of electromagnetic radiation with matter.
Ultraviolet-visible spectroscopy refers to absorption or reflectance spectroscopy in the UV-VIS spectral region.
Ultraviolet-visible spectroscopy is an analytical method that can measure the amount of light absorbed by the analyte.
"The Importance of Martian Atmosphere Sample Return" - Sérgio Sacani
The return of a sample of near-surface atmosphere from Mars would facilitate answers to several first-order science questions surrounding the formation and evolution of the planet. One of the important aspects of terrestrial planet formation in general is the role that primary atmospheres played in influencing the chemistry and structure of the planets and their antecedents. Studies of the martian atmosphere can be used to investigate the role of a primary atmosphere in its history. Atmosphere samples would also inform our understanding of the near-surface chemistry of the planet, and ultimately the prospects for life. High-precision isotopic analyses of constituent gases are needed to address these questions, requiring that the analyses are made on returned samples rather than in situ.
Professional air quality monitoring systems provide immediate, on-site data for analysis, compliance, and decision-making. They monitor common gases, weather parameters, and particulates.
Practical applications for altmetrics in a changing metrics landscape
1. Practical applications for
altmetrics in a changing
metrics landscape
Sara Rouhi, @RouhiRoo
Product Specialist, Altmetric
sara@altmetric.com
Anirvan Chatterjee, @anirvan
Director, Data Strategy
Clinical & Translational Science Institute, UCSF
anirvan.chatterjee@ucsf.edu
2. Today we’ll cover
• Need and origins
• Definitions
• Where you can find altmetrics
• How they’re being used at UCSF
• How they’re being used at Duke University
• What the future may bring…
5. If metrics are about filtering the good research from the bad, traditional metrics* aren’t working
*Peer review, journal impact factor, citation counting
8. 44K online mentions of scholarly articles every day (1 mention every 2 seconds!)
50K unique articles are shared each week.
>3.5M articles with tracked attention data.
The conversation has moved online.
Source: Altmetric internal data, March 2015
9. Funders want evidence of societal impact
Grant funders are looking for proof of “broader impacts,” often defined as “an effect, change, or benefit to the economy, society, culture, public policies, health, the environment, etc.”
Research Excellence Framework, http://www.ref.ac.uk/panels/assessmentcriteriaandleveldefinitions/
“Broaden dissemination to enhance scientific and technological understanding, for example, by presenting results of research and education projects in formats useful to students, scientists and engineers, members of Congress, teachers, and the general public.”
http://www.nsf.gov/pubs/2007/nsf07046/nsf07046.jsp
10. Expanded administrative remits
• Strategic planning
• Supervision of academic affairs
• Fundraising
• Grants administration
• Public affairs
12. Who are we?
Altmetric is a data science company that tracks attention to research outputs, delivering output-level metrics via visually engaging, intuitive interfaces.
In other words, we help give credit where credit is due.
14. Multifaceted picture of engagement: Audiences
• Scholars
• Practitioners
• General Public
• Professional Communicators
• Interested Parties
15. Multi-faceted picture of engagement: Interaction
• Scholars: downloads, citations, bookmarks/saves
• Early career: social media, blogs
• General public: news, blogs, social media
• Practitioners: policy documents, field-specific blogs/social media
• Research communicators: news, blogs, social media
• Interested parties: policy docs, blogs
16. Who on campus needs to track this?
• Administrators (grants, departmental, institutional)
• Library
• Marketing/PR/Communications
• Research Groups
17. Why? Administrators
• Are we in compliance with grant/government mandates?
• Do our research outputs work toward our group/department/institution mission?
• Does our campus have global reach?
• Does our research influence policy, legislation, best practices?
18. Why? Libraries
• Do our collections reflect where our research gets the most attention? (i.e. are we missing anything? Are we purchasing the wrong things?)
• Does our OA policy bring more attention to our work?
• How does our institutional repository bring attention to campus research?
19. Why? Marketing/PR/Communications
• Is anyone out there getting it wrong?
• Have we missed opportunities to get in front of a PR/communications storm?
• Can we benchmark our outreach efforts?
• Are we reaching the target markets we want?
• Are we using the right media?
20. Why? Research Groups
• Are we reaching the audiences we want to see our work?
• Is anyone misrepresenting/confused by our work?
• How do we demonstrate “broader impact” to grant funders?
• How can we reach more people with our research?
• Are we engaging unexpected communities?
25. Where will you see our data? Author Tools
“A CV that documents alternative metrics […] offers a much more compelling argument to a tenure committee of their research impact than a traditional publication list.”
- Donald Samulack, Editage
26. Where will you see our data? Platforms
• Recommendation engine integration for medical research apps
• Integration of the Altmetric service into publishing platforms
• Altmetric integration for JAMA and others to monitor research impact
• Integration of Altmetric data for over 1 million articles
27. Where will you see our data? Institutional repositories/discovery systems
• Institutional repository badge embeds
• Badge integration with discovery systems
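A badge embed of the kind described above is typically just a loader script plus a placeholder element. A minimal sketch in Python that generates the markup, assuming the publicly documented `embed.js` loader URL and `data-*` attributes (verify against Altmetric's current embed documentation before deploying):

```python
# Generate the HTML snippet for an Altmetric donut badge.
# The script URL and data-* attribute names are assumptions based on
# Altmetric's public embed docs; confirm them before relying on this.

EMBED_SCRIPT = (
    "<script type='text/javascript' "
    "src='https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js'></script>"
)

def altmetric_badge(doi, badge_type="donut"):
    """Return a placeholder div that embed.js replaces with a badge."""
    return (
        f"<div class='altmetric-embed' "
        f"data-badge-type='{badge_type}' "
        f"data-doi='{doi}'></div>"
    )

# Example: markup for one repository record page (hypothetical DOI)
print(EMBED_SCRIPT)
print(altmetric_badge("10.1038/nature12373"))
```

Because the badge is resolved client-side from the DOI, repositories and discovery systems can add it to any record that carries an identifier Altmetric supports.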
30. Three Experiments with Altmetric data
April 22, 2015
Anirvan Chatterjee
Director, Data Strategy
Clinical & Translational Science Institute
32. UCSF Profiles profiles.ucsf.edu
Research networking system (like VIVO, Symplectic Elements)
Research profiles of 7,000 people on campus
• Bios, publications, NIH grants, awards, etc.
Not just a directory
Publications automatically kept current
• Heavily used — 100,000+ visits per month from on/off campus
• Data reuse — APIs used by 25 other campus systems
36. Why altmetrics?
Show early impacts of research
Attempt to measure/visualize impact, rather than just anecdata
Doesn’t displace traditional metrics of research output (e.g. citations, journal rankings, etc.)
52. Lessons learned
Altmetrics advocates were supportive
Zero pushback from campus community
Because of easy Altmetric integration, we could add altmetrics even before we added citation data
54. Background
Many researchers focus on a handful of key journals, but may miss out on trending stories on non-core topics of interest
• e.g. a cardiologist interested in digital health
We know UCSF researchers’ research topics/interests…
• Hand-entered
• Algorithmically derived from publications
Our recommendation engine shares new articles of interest that match researchers’ known areas of interest
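The matching step described above can be sketched as a toy keyword-overlap recommender. This is a deliberately simple illustration with hypothetical data; the actual UCSF engine is not public and also draws on algorithmically derived topics:

```python
def recommend(articles, interests, min_overlap=1):
    """Rank articles by how many of a researcher's interest keywords
    appear in the title -- a crude stand-in for real topic matching."""
    interest_terms = {term.lower() for term in interests}
    scored = []
    for article in articles:
        title_words = set(article["title"].lower().split())
        overlap = len(title_words & interest_terms)
        if overlap >= min_overlap:
            scored.append((overlap, article["title"]))
    # Highest-overlap articles first
    return [title for overlap, title in sorted(scored, reverse=True)]

# Hypothetical example: a cardiologist interested in digital health
articles = [
    {"title": "Digital health apps in cardiology"},
    {"title": "Soil chemistry of arid regions"},
    {"title": "Wearable sensors for cardiology patients"},
]
print(recommend(articles, ["cardiology", "digital", "wearable"]))
```

In practice the relevance-matching step is the hard part, which matches the "need to improve relevance matching" lesson reported later in the talk.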
55. Altmetric API
Details at http://api.altmetric.com/
Free to use basic data for apps and mashups, with rate limits
Generous free access for noncommercial academic research projects
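A minimal sketch of using the v1 DOI endpoint from Python. The endpoint shape and response field names (`score`, `cited_by_tweeters_count`, `cited_by_msm_count`) are taken from the public API docs as of this writing; verify them against the current documentation, and note that live calls are subject to the free tier's rate limits:

```python
import json
from urllib.request import urlopen

API_BASE = "https://api.altmetric.com/v1"

def doi_url(doi):
    """Build the v1 lookup URL for a DOI."""
    return f"{API_BASE}/doi/{doi}"

def summarize(record):
    """Extract a few commonly used fields from an API response dict.
    Field names follow the public v1 docs; treat them as assumptions."""
    return {
        "title": record.get("title"),
        "score": record.get("score"),
        "tweeters": record.get("cited_by_tweeters_count", 0),
        "news": record.get("cited_by_msm_count", 0),
    }

def fetch_summary(doi):
    """Live lookup; returns a small summary dict for the article."""
    with urlopen(doi_url(doi)) as resp:
        return summarize(json.load(resp))
```

Keeping the URL-building and response-parsing separate from the network call makes the integration easy to test offline, which fits the "easy to integrate" experience reported in the next slide.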
57. Lessons learned so far (work in progress)
Altmetric API made it easy to integrate altmetrics data
Among first round of beta testers:
• Most hadn’t seen recommended papers
• Some questions about article level metrics vs. journal reputation
• Need to improve relevance matching
• Enough positive feedback for us to keep exploring
58. Takeaways from our three experiments…
When it comes to altmetrics, researchers aren’t monolithic
• Some bullish, others guardedly positive, few/none offended
Altmetrics data doesn’t yet solve a burning institutional need
• We’re hearing more about altmetrics from early adopters, rather than leadership
Low barriers to experimentation
• It’s very easy to get started and integrate into our processes
• We’re able to keep tossing around ideas to find the best fit
61. NSF Broader Impacts Criterion
To what extent will [the research] enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships?
Will the results be disseminated broadly to enhance scientific and technological understanding?
What may be the benefits of the proposed activity to society?
http://www.nsf.gov/pubs/2007/nsf07046/nsf07046.jsp
67. Demonstrating “broader impact” with international news coverage = 60%
[Bar chart: total number of stories, total number of outlets, and number of international outlets for Articles 1–10. Data from Article Details Pages.]
70. Many many more eyeballs
[Bar chart: Twitter reach by article for Articles 1–10, ranging from 5,121 to 3,941,227; total upward bound: 5,630,639. Data from Article Details Pages.]
Even if 1% click on the article, that’s 56,000 eyes that never would have seen it before Twitter.
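The slide's back-of-envelope estimate can be reproduced directly from the figures it cites:

```python
# Upper bound on Twitter reach across the ten articles (from the slide)
total_reach = 5_630_639

# Assume only 1% of potential viewers click through to the article;
# integer division by 100 gives the 1% estimate
estimated_clicks = total_reach // 100

print(estimated_clicks)  # 56306, i.e. the "56,000 eyes" on the slide
```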
71. Saved Terrie time; saved her program manager time…
NIH Program Manager: “[This Altmetric data is] fantastic information for [our] budget report.”
72. Before Altmetric data she didn’t know…
• How broadly her work was disseminated (news vs. policy vs. blogosphere)
• The difference in interest by source (methodology papers via Twitter)
• That all this data could be aggregated to save time
73. Recap of where we are…
• Education is critical
• Tenure/promotion paradigm
• “Here one day, gone the next”
• Need for sentiment analysis (so it’s not just more numbers)
• Facilitating industry standards (NISO Altmetrics Whitepaper)
76. Attention exists on a spectrum
From lighter-weight to heavier-weight engagement: tweets/bookmarks → holdings/saves/shares → usage → citations, with blog coverage, news coverage, post-publication peer review, and policy document citations along the way.
At the lighter end: superficial; the article may or may not have been read; many potential readers but few actual; cost-light(er).
At the heavier end: the article is more likely to have been read; cost-heavy(ier); readers = practitioners (?); actionable (?).
Editor's Notes
Welcome Attendees.
We’ll be taking a look at how Altmetric can help researchers, departments and institutions evaluate their research impact in a timely and comprehensive manner.
Running time of 30 mins, plus questions at the end. Use the raise the hand feature for asking questions.
Please feel free to ask questions via the Q&A feature. Please also use this for any audio issues you may have.
Let’s get started
Not only that, but impact factors and citation counts are very laggy.
They take several years to accrue.
They don’t tell the whole story; they provide only part of the impact profile of your research.
As this gap has become wider, funders are now noticing that judging impact of research funded by them requires new tools, and the monitoring of new platforms.
Need more evidence about attention their work is receiving
Is funding going towards engaging work?
What other areas are getting traction within alternative metrics?
What are particular other funders getting traction with?
Can new avenues for funding be approached, or engaged with?
Altmetric is a data science company
Why
What are “altmetrics”?
“alternative metrics”
new ways of measuring different, non-traditional forms of impact.
“alternative to only using citations”, not “alternative to citations”.
complementary to traditional citation-based analysis.
Article-level metrics have come to refer to any metrics (e.g., including altmetrics) that surround a scholarly article.
Altmetric has also been rapidly adopted as the industry standard amongst publishers for demonstrating article impact on their web pages. Publishers display our badges and widgets on abstract pages to show overall impact, and easily link users to the news, conversations and commentary happening right now for their articles.
Marketing materials/driving sales strategy/reporting to editorial boards
Your researchers are already seeing our data and visualisations on a lot of articles, becoming interested in and familiar with the idea of altmetrics.
Altmetric supports the altmetrics community. Providers like Kudos and Impactstory feature Altmetric data within their own data.
Get sussex screenshot
The integration of Altmetric badges in a discovery service allows the user a deep dive into the interactions a result has had. Integration works with anything that has an identifier Altmetric supports.
Lots of institutions are already using our data, and we’ll come to AFI later
Being in San Francisco, it's hard not to be influenced by Silicon Valley culture — build minimum viable products, fail fast, see what sticks
Final boiler plate language: 2 versions
Through its singular focus on health, UCSF is leading revolutions in health.
UCSF is driven by the idea that great breakthroughs are achieved when the best research, the best education and the best patient care converge.
Introduce yourself and your two postings
This is what sent me looking at altmetric data in the first place….
Brief explanation of what it is and what you’re asked to do…
A researcher from Duke started using the Bookmarklet when a former student sent it to her, telling her “this is ultra cool.”
It’s basically a free bookmark you add to your browser that shows you altmetric data for any article with an identifier IF it has attention.
Basically visit altmetric.it
(CLICK) drag and drop the bookmark into your toolbar
(CLICK arrow appears)
Visit any article of your choice
CLICK (EST article popsup)
CLICK (Red circle pops up) Click on the bookmark and the article details drop down as you see here.
CLICK (Red arrow appears) If you click on “Click for more details” then (CLICK and details page appears) you get the details for each mention specified in the donut. You can read every news mention, policy document mention, blog post, or tweet.
And this tool is totally free at altmetric.it
But I thought, there has got to be a way to summarize impact across ALL of the publications from a lab, or all from a funded project. I’d also like to be able to see one set of altmetrics covering all the publications on the CV of scientists applying for a job in my department. So, I sent Altmetrics a question:
How do I see my research in aggregate – this brought me to Sara
This is the summary report for the 82 articles of mine that had attention. I had a total of 2,933 mentions across all those different sources — sources I had NO idea covered me, including 109 mentions and mentions in three different policy documents.
She also could show interesting spikes in attention over the time period of the articles I provided. The two big ones correlate with major papers I published at the time. The massive amount of Twitter data also really surprised me. Altmetric allows you to read every single mention, so I was able to read every news article that mentioned my work and every policy document. Even every tweet!
I have never been a twitter user but all this attention made me think maybe I should approach this kind of exposure differently.
Little did I know that my research was being used in policy recommendations by two different UK organizations, Mental Health Foundation – a mental health research and policy charity as well as the UK government.
If I wanted to make a case for broader impact, this is bona fide data demonstrating that practitioners – not researchers – but folks who can affect lives through legislation, health care, and education, are using my research to better their work.
Similarly I had no idea about the scope of news coverage and how much of the coverage around my work was international. 6 of my top 10 papers had international news coverage.
Similarly, I wasn’t aware how large and active the health and psychiatry blog community is. Researchers, doctors, patients and the general public are actively using blogs as ways to learn more, disseminate findings and find solutions to debilitating mental illness affecting everyday lives.
Twitter itself astonished me. I had no idea that non-researchers were looking at my work. The Twitter data allowed me to see that real practitioners – folks who might actually make decisions based on my research – were reading and commenting on my work.
This graph shows the upward bound of twitter reach (that is the maximum # of followers that might have potentially seen the tweet) for my top ten most attention drawing articles.
I had no idea about the impact of Twitter in terms of reach and how many potential new audiences, including journalists, bloggers, the news media, practitioners, and legislators, could see my work.
A recent piece of research out of the University of Wisconsin – called Building Buzz – makes the case that coverage in news media and Twitter is not to be underestimated as a crucial outlet for getting one’s work out to the general public.
I have asked my students and postdocs to work on a strategy surrounding Twitter to continue these positive trends.