Assessing and Reporting Research Impact – A Role for the Library - Kristi L. Holmes, Ph.D., Director, Galter Health Sciences Library, Northwestern University, Feinberg School of Medicine
June 18, 2014
NISO Virtual Conference: Transforming Assessment: Alternative Metrics and Other Trends
Keynote Speaker: Altmetrics at the Portfolio Level
- Paul Groth, Ph.D., Assistant Professor at the VU University Amsterdam
Scholarly Metrics in Specialized Settings - Elaine Lasda
Presentation for the Bibliometric and Research Impact Community (BRIC) of Canada on case studies of research impact in specialized settings, with a focus on Michigan Publishing from co-presenter Rebecca Welzenbach.
Research Impact in Specialized Settings: 3 Case Studies - Elaine Lasda
Presentation of three case studies in which research impact metrics are used to further the missions of institutions and organizations outside the traditional academic milieu.
The Kaleidoscope of Impact: same data, different perspectives, constantly cha... - Kudos
Scholars, scientists, academic institutions, publishers and funders are all interested in impact. We have different roles and goals, and therefore different reasons for needing to understand impact; we are therefore asking different questions about impact, and those questions continue to evolve, much as the concept of impact itself is evolving. To answer our different questions, do we need different data, in separate silos, or are we looking at the same data from different angles? This session gathered researcher, library, publisher and metrics provider perspectives to consider who has an interest in impact, what data they are interested in, how they use it, and how the situation is evolving as, for example, business models and technical infrastructures shift.
This presentation was provided by Vincent Cassidy of The IET during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Your Systematic Review: Getting Started - Elaine Lasda
Presentation for the University at Albany-SUNY community on best practices for conducting systematic reviews and other evidence synthesis methods.
Academics must provide evidence to demonstrate the impact and outcomes of their scholarly work. This webinar, presented by librarians, will help faculty explore various forms of documentary evidence to support their case for excellence. Sponsored by the IUPUI Office of Academic Affairs.
Note: The webinar included demonstrations of Web of Science & Scopus, which the slides do not reflect.
This presentation was provided by Emma Warren-Jones of Scholarcy, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Capturing and Analyzing Publication, Citation and Usage Data for Contextual C... - NASIG
Libraries have long sought to demonstrate the value of their collections through a variety of usage statistics. Traditionally, a strong emphasis is placed on high usage statistics when evaluating journals in collection development discussions. However, as budget pressures persist, administrators are increasingly concerned with looking beyond traditional usage metrics to determine the real impact of library services and collections. By examining journal usage in the context of scholarly communication, we hope to gain a more holistic understanding of the use and impact of our library’s resources. In this session, we begin by outlining our methodology for gathering comprehensive publication and citation data for authors affiliated with Northwestern University’s Feinberg School of Medicine, utilizing Web of Science as our primary data source and leveraging a custom Python script to manage the data. Using this data we discuss various potential metrics that could be employed to measure and evaluate journals in institutional and field-specific contexts, including but not limited to: number of publications and references per journal, co-citation networks, percentage of references per journal, and increases or decreases of references over time per title. We then consider the development of normalized benchmarks and criteria for creating field-specific core journal lists. We also discuss a process for establishing usage thresholds to evaluate existing journal subscriptions and to highlight potential gaps in the collection. Finally, we apply and compare these metrics to traditional collection development tools like COUNTER usage reports, cost-per-use analysis, Inter-Library Loan statistics and turnaway reports, to determine what correlations or discrepancies might exist. 
We finish by highlighting use cases that demonstrate the value of considering publication and citation metrics, and we provide suggestions for incorporating these metrics into library collection development practices.
Speakers: Joelen Pastva and Jonathan Shank, Northwestern University
Project GitHub page: https://goo.gl/2C2Pcy
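The abstract above mentions per-journal metrics such as publications per journal and the percentage of references per journal. As a minimal sketch only (not the project's actual script; the record layout and field names here are invented for illustration), those tallies might be computed like this:

```python
from collections import Counter

# Hypothetical records of the kind exportable from Web of Science:
# each publication names its own journal and the journals it cites.
publications = [
    {"journal": "J Clin Invest", "cited_journals": ["Nature", "Cell", "Nature"]},
    {"journal": "Nat Med", "cited_journals": ["Cell", "Lancet"]},
    {"journal": "J Clin Invest", "cited_journals": ["Lancet", "Nature"]},
]

# Number of publications per journal.
pubs_per_journal = Counter(p["journal"] for p in publications)

# References per cited journal, and each journal's share of all references.
ref_counts = Counter(j for p in publications for j in p["cited_journals"])
total_refs = sum(ref_counts.values())
ref_share = {j: n / total_refs for j, n in ref_counts.items()}
```

Extending tallies like these over time per title would support the trend analysis the abstract describes.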
NISO Altmetrics Initiative: A Project Update
- Martin Fenner, Technical Lead for the PLOS Article-Level Metrics project
LITA’s Altmetrics and Digital Analytics Interest Group is proud to present Heather Coates, Richard Naples, and Lauren Collister in our second free webinar of the season. Heather will introduce the concept of altmetrics with a quick "Altmetrics 101," Richard will discuss the Smithsonian's implementation of Altmetric, and Lauren will share the University of Pittsburgh's experience with Plum Analytics.
Snowball Metrics: University-owned Benchmarking to Reveal Strengths within All Activities
- Dr. Lisa Colledge, Snowball Metrics Program Director, Elsevier
An introduction to open science for the Library Journal webcast Case Studies for Open Science on February 9, 2016.
http://lj.libraryjournal.com/2016/01/webcasts/case-studies-for-open-science/
This presentation was provided by Sarah Young of Cornell University during a NISO webinar on the topic of Compliance with Funder Mandates, held on September 14, 2016.
Promoting Open Access and Open Educational Resources to Faculty - NASIG
Heather Crozier, presenter
Student debt is a compelling issue, and many institutions are investigating solutions to ease the financial burdens of their students. Increasing the use of open educational resources benefits students by reducing course costs. Adopting OER in the classroom allows faculty more freedom in choosing instructional tools. Faculty also benefit from open access publishing by increasing their exposure. However, on the campus of a small, private institution, attendance at workshops to spread awareness and increase the use of these materials was minimal. Faculty perceived that free resources could not match the quality of traditional resources. To dispel this myth, the Electronic Resources Librarian and Educational Technology Manager collaborated to create custom one-hour sessions for individual departments, leveraging library/faculty liaison relationships and the expertise of the office of educational technology. In the session, faculty learn more about open access publishing options, the value of open educational resources, the quality of many open educational resources, and where to find these resources. The session uses the course management system both to disseminate the information shared in the session and to create a forum for departments to share resources with each other. Through the CMS, faculty gain access to vetted resources. All attendees have editing privileges within the site after the workshop, allowing them to curate course-specific lists for sharing and future reference. Pilot sessions have been well received, and wider implementation is planned for the next academic year.
This presentation was provided by Pamela Shaw of Northwestern University during the NISO Webinar, Compliance with Funder Mandates, held on September 14, 2016.
Practical applications for altmetrics in a changing metrics landscape - Digital Science
"Practical applications for altmetrics in a changing metrics landscape" - Sara Rouhi, Altmetric product specialist, and Anirvan Chatterjee, Director Data Strategy for CTSI at UCSF
Research information management: making sense of it all - Digital Science
"Research information management: making sense of it all" - Julia Hawks, VP North America, Symplectic
Slides from Shaking It Up: Challenges and Solutions in Scholarly Information Management, San Francisco, April 22, 2015
Is what's 'trending' what's worth purchasing? - NASIG
Presenters:
Stacy Konkiel, Outreach & Engagement Manager, Altmetric
Rachel Miles, Kansas State University Libraries
Sarah Sutton, Assistant Professor in the School of Library and Information Management at Emporia State University
New forms of usage data like altmetrics are helping librarians make smarter decisions about their collections. A recent nationwide study administered to 13,000+ librarians at R1 universities shines light on exactly how these metrics are being applied in academia. This presentation will share survey results, including as-yet-unknown rates of technology and metrics uptake among collection development librarians, the most popular citation databases and altmetrics services being used to make decisions, and surprising factors that affect attitudes toward the use of metrics. This presentation will also offer actionable insights on how altmetrics are being paired with bibliometrics and usage statistics to form a more complete picture of “trending” scholarship that’s worth purchasing. By sharing the survey results and opening a discussion about the potential altmetrics hold for informing collection development, the presenters aim to provide a learning opportunity that will enhance attendees’ competencies for e-resource management, specifically core competencies for e-resource librarians 3.5, use of bibliometrics for collection assessment, and 3.7, identify and analyze emerging technologies.
Similar to Assessing and Reporting Research Impact – A Role for the Library - Kristi L. Holmes, Ph.D., Director, Galter Health Sciences Library, Northwestern University, Feinberg School of Medicine
Understanding impact through alternative metrics: developing library-based as... - Kristi Holmes
There’s never been a more critical need to better understand the impact of research efforts. The challenging state of funding models (1) and an enhanced pressure on young investigators to stand out from the crowd magnify this need as well as the perceived value of locally based impact services. These services are leveraged by a diverse range of stakeholders, from individuals to university-level decision makers and strategists. Individuals often wish to better demonstrate impact of published works to promotion committees or describe the impact of research studies to funding agencies when applying for funding or complying with institution-level or federal reporting exercises. Research groups, departments, and institutions often wish to discover how research findings are being used to promote science and gain a better overall view of research publications and outputs.
Libraries are particularly well poised to meet the need for a more nuanced view of impact. Libraries are trusted, neutral parties with a tradition of service and support, and they often act as technology hubs on campus with IT and data expertise. Librarians are trained information professionals with information and searching skills and a keen understanding of the research, education, and clinical landscape of their institution. This presentation will discuss general trends in the field, including an overview of resources, assessment frameworks and tools; strategies for partnering with stakeholders; and examples of library-based service models, from basic services to highly integrated library-based core research units.
(1) http://dx.doi.org/10.1126/scitranslmed.aac5200
This presentation was provided by Holly Falk-Krzesinski of Elsevier during the NISO event, "Is This Still Working? Incentives to Publish, Metrics, and New Reward Systems," held on February 20, 2019.
Gather evidence to demonstrate the impact of your research - IUPUI
This workshop is the 3rd in a series of 4 titled "Maximize your impact" offered by the IUPUI University Library Center for Digital Scholarship. Faculty must provide strong evidence of impact in order to achieve promotion and tenure. Having strong evidence in year 5 is made easier by strategic dissemination early in your tenure track. In this hands-on workshop, we will introduce key sources of evidence to support your case, demonstrate strategies for gathering this evidence, and provide a variety of examples. These sources include citation metrics, article level metrics, and altmetrics as indicators of impact to support your narrative of excellence.
Librarians and research impact - Download and share the new infographic - Library_Connect
More and more librarians are being called upon to help track and report on the outputs and impact of research. From a landscape of article and book citations, the vista has broadened to a range of research outputs, measures and applications.
Download and share the new infographic from Elsevier's Library Connect and Jenny Delasalle, a freelance consultant and librarian. It tells the story of how librarians are working with researchers and the research office to measure research impact and to explore the application of these measurements. See more at: http://libraryconnect.elsevier.com/articles/2014-06/librarians-and-research-impact-download-and-share-new-infographic-0
Webinar slides from June 8 Library Connect webinar "Researcher profiles and metrics that matter" with: Chris Belter, Bibliometrics Informationist, NIH Library; Andrea Michalek, VP of Research Metrics, Elsevier | Managing Director of Plum Analytics; Ellen Cole, Scholarly Publications Librarian, Learning and Research Services, Northumbria University.
View the webinar at: http://libraryconnect.elsevier.com/library-connect-webinars?commid=257883
Let's Talk Research Annual Conference - 24th-25th September 2014 (Professor R... - NHSNWRD
"Introduction to Evidence Synthesis": Professor Rumona Dickson's presentation provided an overview of evidence synthesis and a platform to refine questions that participants wanted to answer related to their own clinical practice. The workshop also included information detailing how teams of health care professionals might access support for addressing their clinical review questions through the CPD programme of the CLAHRC NWC.
On November 21st 2014 at the Tufts University Medford campus and November 25th 2014 at the campus of the University of Massachusetts Medical School in Worcester, the BLC and Digital Science hosted a workshop focused on better understanding the research information management landscape.
Kevin Gardner, Director of Strategic Initiatives, Office of the Senior Vice Provost for Research, University of New Hampshire, described UNH's decision to implement a research information management system and the lessons learned.
Jonathan Breeze, CEO of Symplectic, reflected on the emergence of research information management systems and the resulting benefits they can provide.
Vicky Scott: Implementing research into practice - THL
Presentation by Dr Vicky Scott, Clinical Associate Professor, RN, PhD, BC Injury Research and Prevention Unit, BC Ministry of Health, Canada, at the Safety 2016 World Conference, 18-21 September 2016, Tampere, Finland
#Safety2016FIN
This presentation was made at a large pharmaceutical company's R&D and corporate affairs campus, going a little more in-depth than the one from the prior Science of Team Science Conference.
This presentation was provided by Steph Pollock of The American Psychological Association’s Journals Program, and Damita Snow, of The American Society of Civil Engineers (ASCE), for the initial session of NISO's 2024 Training Series "DEIA in the Scholarly Landscape." Session One: 'Setting Expectations: a DEIA Primer,' was held June 6, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the closing segment of the NISO training series "AI & Prompt Design." Session Eight: Limitations and Potential Solutions, was held on May 23, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the seventh segment of the NISO training series "AI & Prompt Design." Session 7: Open Source Language Models, was held on May 16, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the sixth segment of the NISO training series "AI & Prompt Design." Session Six: Text Classification with LLMs, was held on May 9, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the fifth segment of the NISO training series "AI & Prompt Design." Session Five: Named Entity Recognition with LLMs, was held on May 2, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the fourth segment of the NISO training series "AI & Prompt Design." Session Four: Structured Data and Assistants, was held on April 25, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the third segment of the NISO training series "AI & Prompt Design." Session Three: Beginning Conversations, was held on April 18, 2024.
This presentation was provided by Kaveh Bazargan of River Valley Technologies, during the NISO webinar "Sustainability in Publishing." The event was held April 17, 2024.
This presentation was provided by Dana Compton of the American Society of Civil Engineers (ASCE), during the NISO webinar "Sustainability in Publishing." The event was held April 17, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the second segment of the NISO training series "AI & Prompt Design." Session Two: Large Language Models, was held on April 11, 2024.
This presentation was provided by Teresa Hazen of the University of Arizona, Geoff Morse of Northwestern University, and Ken Varnum of the University of Michigan, during the Spring ODI Conformance Statement Workshop for Libraries. This event was held on April 9, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, during the opening segment of the NISO training series "AI & Prompt Design." Session One: Introduction to Machine Learning, was held on April 4, 2024.
This presentation was provided by William Mattingly of the Smithsonian Institution, for the eighth and final session of NISO's 2023 Training Series on Text and Data Mining. Session eight, "Building Data Driven Applications" was held on Thursday, December 7, 2023.
This presentation was provided by William Mattingly of the Smithsonian Institution, for the seventh session of NISO's 2023 Training Series on Text and Data Mining. Session seven, "Vector Databases and Semantic Searching" was held on Thursday, November 30, 2023.
This presentation was provided by William Mattingly of the Smithsonian Institution, for the sixth session of NISO's 2023 Training Series on Text and Data Mining. Session six, "Text Mining Techniques" was held on Thursday, November 16, 2023.
This presentation was provided by William Mattingly of the Smithsonian Institution, for the fifth session of NISO's 2023 Training Series on Text and Data Mining. Session five, "Text Processing for Library Data" was held on Thursday, November 9, 2023.
This presentation was provided by Todd Carpenter, Executive Director, during the NISO webinar on "Strategic Planning." The event was held virtually on November 8, 2023.
This presentation was provided by Rhonda Ross of CAS, a division of the American Chemical Society, and Jonathan Clark of the International DOI Foundation, during the NISO webinar on "Strategic Planning." The event was held virtually on November 8, 2023.
This presentation was provided by William Mattingly of the Smithsonian Institution, for the fourth session of NISO's 2023 Training Series on Text and Data Mining. Session four, "Data Mining Techniques" was held on Thursday, November 2, 2023.
for Anti-inflammatory, Antiulcer, Anticancer, Wound healing, Antidiabetic, Hepatoprotective, Cardio protective, Diuretics and
Antifertility, Toxicity studies as per OECD guidelines
Acetabularia Information For Class 9 .docxvaibhavrinwa19
Acetabularia acetabulum is a single-celled green alga that in its vegetative state is morphologically differentiated into a basal rhizoid and an axially elongated stalk, which bears whorls of branching hairs. The single diploid nucleus resides in the rhizoid.
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
Landownership in the Philippines under the Americans-2-pptx.pptx
Assessing and Reporting Research Impact – A Role for the Library - Kristi L. Holmes, Ph.D., Director, Galter Health Sciences Library, Northwestern University, Feinberg School of Medicine
1. Assessing and Reporting Research
Impact – A Role for the Library
Kristi L. Holmes, PhD
Director, Galter Health Sciences Library
Associate Professor, Preventive Medicine – Health and Biomedical Informatics
Northwestern University, Feinberg School of Medicine
http://orcid.org/0000-0001-8420-5254
NISO Virtual Conference:
Transforming Assessment: Alternative Metrics and Other Trends
June 18, 2014
2. overview
• Impact metrics – the usual suspects and more
• A view from the library
• Identifying needs
• Roles played by libraries
• Developing services
• Moving this space forward on the local and larger level
3. impact metrics – why do we care?
• Quantify and document research impact
• Justify future requests for funding
• Quantify return on research investment
• Discover how research findings are being used
• Identify similar research projects
• Identify possible collaborators
• Determine if research findings are duplicated, confirmed, corrected, improved, or repudiated
• Determine if research findings were extended (different human populations, different animal models/species, etc.)
• Confirm that research findings were properly attributed/credited
• Demonstrate that research findings are resulting in meaningful health outcomes
• Discover community benefit as a result of research findings
• Progress reports
• Tenure
• Promotion dossiers
https://becker.wustl.edu/impact-assessment/model
4. alternative metrics
• Twitter
• Facebook
• Blogs
• Research Highlights
• Google+
• Mainstream Media
• Reddit
• Forums
• Q&A – StackExchange
• Pinterest
• LinkedIn
• FigShare
• F1000 Reviews
• GitHub
Thelwall M, Haustein S, Larivière V, Sugimoto CR (2013) Do Altmetrics Work? Twitter and Ten Other Social Web Services. PLoS ONE 8(5): e64841. doi:10.1371/journal.pone.0064841
Konkiel S (2013) Altmetrics: A 21st-Century Solution to Determining Research Quality. Information Today. Available at http://www.infotoday.com/OnlineSearcher/Articles/Features/Altmetrics-A-stCentury-Solution-to-Determining-Research-Quality-90551.shtml
Articles
Datasets
Slides
Software
Webpages
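Counts from sources like these can be pulled programmatically. A minimal sketch, assuming the public Altmetric.com details API (api.altmetric.com/v1/doi/{doi}) and its response field names; verify both against the current documentation before relying on them:

```python
# Hedged sketch: summarizing per-source counts for one DOI from the
# Altmetric.com details API. Field names are assumptions to verify.
API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the details-API URL for a DOI."""
    return API_BASE + doi

def summarize_counts(record: dict) -> dict:
    """Pick a few per-source counts out of a details-API record."""
    return {
        "twitter": record.get("cited_by_tweeters_count", 0),
        "facebook": record.get("cited_by_fbwalls_count", 0),
        "blogs": record.get("cited_by_feeds_count", 0),
        "score": record.get("score", 0),
    }

# Hypothetical response fragment (values invented for illustration):
sample = {"cited_by_tweeters_count": 102, "cited_by_feeds_count": 4, "score": 57.3}
print(summarize_counts(sample))
# A live lookup would fetch altmetric_url("10.1371/journal.pone.0064841")
# (the Thelwall et al. article above) and pass the parsed JSON in.
```

Keeping the URL construction and the parsing separate makes the parsing testable without network access.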
5. Understanding the impact of research is a massive task.
How do we understand the impact of a published work?
A person?
A research center?
A university?
What IS impact?
6. Ioannidis JA, Khoury MJ. Assessing Value in Biomedical Research: The PQRST of Appraisal and Reward. JAMA. Published online June 09, 2014. doi:10.1001/jama.2014.6932.
• We need to move assessment toward “desired outcomes: research that is productive, high-quality, reproducible, shareable, and translatable (PQRST)”
• Productivity metrics should reward high-influence science rather than least publishable units and decrease publication bias against negative results.
Research impact and assessment
7. go beyond counts!
https://becker.wustl.edu/impact-assessment
Investigate and adapt frameworks to help put things into context.
More and more organizations and efforts are considering the impact of their work through frameworks:
The Becker Model
Frameworks from the IOM, CDC, and NIEHS
Snowball Metrics
Discipline-specific approaches
and more…
9. The Becker Model
• Supplements publication analysis with a more robust and comprehensive perspective on biomedical research impact
– reporting templates, a glossary of resources and terms, examples of relevant indicators of impact across the research process, and readings
• Straightforward framework for tracking diffusion of research outputs and activities to locate indicators that demonstrate evidence of biomedical research impact
– individual, core, and institutional level; modify for different disciplines
• Guidance for quantifying and documenting research impact, as well as resources for locating evidence of impact
• Strategies for enhancing the impact of research
10. strategies for enhancing the impact of research
https://becker.wustl.edu/impact-assessment/strategies
Repetition, consistency, and an awareness of the intended audience form the basis of most of the strategies.
The strategies focus on Preparing for Publication, Dissemination, and Keeping Track of Your Research.
Optimizing the discoverability and accessibility of your research is the surest way to enhance its visibility and impact.
Suggestions for researchers, with recommendations to reach out to their library for assistance.
11. Understanding the impact of research is a massive task.
How do we understand the impact of a published work?
A person?
A research center?
A university?
What IS impact?
How do we scale up and operationalize this process?
How do we support it locally from the library?
12. leverage ongoing efforts on the local level and beyond
• ORCID
• CASRAI
• CERIF
• VIVO-ISF ontology
• NISO altmetrics work
• CV/Biosketch
• Large-scale semantic search
• Institutional repositories
• Research networking systems
• Institutional research information and management systems
• Analytical and benchmarking efforts
• Tracking and evaluation
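Several of these building blocks expose public APIs that a library can build on. As one hedged illustration, assuming the ORCID v3.0 public API (pub.orcid.org/v3.0/{id}/works) and the works-summary JSON shape as I understand it, work titles for a researcher could be listed like this:

```python
# Hedged sketch: extracting work titles from an ORCID v3.0 /works response.
# The endpoint and JSON paths are assumptions to verify against ORCID docs.
ORCID_API = "https://pub.orcid.org/v3.0"

def works_url(orcid_id: str) -> str:
    """URL for one researcher's works summary (request JSON via Accept header)."""
    return f"{ORCID_API}/{orcid_id}/works"

def work_titles(works_json: dict) -> list:
    """Walk the group / work-summary / title structure and collect titles."""
    titles = []
    for group in works_json.get("group", []):
        for summary in group.get("work-summary", []):
            title = summary.get("title", {}).get("title", {}).get("value")
            if title:
                titles.append(title)
    return titles

# Hypothetical response fragment shaped like a v3.0 works summary:
sample = {"group": [{"work-summary": [
    {"title": {"title": {"value": "Do Altmetrics Work?"}}}]}]}
print(work_titles(sample))
```

A live call would fetch works_url for an ORCID iD such as the one on the title slide and pass the parsed JSON to work_titles.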
13. VIVO lends itself to research assessment and impact
• Facilitates a more robust research ecosystem
– Diffusion of research products
– Discovery of shared products
– Reproducibility
– Translation
– Enables credit and visibility by showcasing individual achievements and expertise
• Supports team-based science and collaboration
• Allows better large-scale understanding of the research enterprise
– Temporal relationships (career development, time from publication to research synthesis)
– Peer comparisons
– Strategic planning and visualization
– Identification of emerging trends
• Enables more efficient means of collecting & representing meaningful outputs en masse
• New experimental protocols
• Datasets
• Instrumentation
• Software code
• New diagnostic criteria
• New standard of care
• Curriculum guidelines
• Measurement instruments
• Continuing education materials
• Clinical/practice guidelines
14. VIVO lends itself to research assessment and impact
• Facilitates a more robust research ecosystem
– Diffusion of research products
– Discovery of shared products
– Reproducibility
– Translation
– Enables credit and visibility by showcasing individual achievements and expertise
• Supports team-based science and collaboration
• Allows better large-scale understanding of the research enterprise
– Temporal relationships (career development, time from publication to research synthesis)
– Peer comparisons
– Strategic planning and visualization
– Identification of emerging trends
• Enables more efficient means of collecting & representing meaningful outputs en masse
• New experimental protocols
• Datasets
• Instrumentation
• Software code
• New diagnostic criteria
• New standard of care
• Curriculum guidelines
• Measurement instruments
• Continuing education materials
• Clinical/practice guidelines
VIVO can help operationalize this process: enhance profiles by incorporating meaningful outputs, adding content and value, and facilitating the dissemination and discovery of scholarship.
The ontology is the key…
15. • Plum Analytics gathers metrics across five categories—usage, mentions, captures, social media, and citations.
• Metrics are gathered around artifacts.
• Collected information is displayed in visualizations, dashboards, and widgets.
• Customized for the institution or organization.
ARTIFACTS: articles, blog posts, book chapters, books, cases, clinical trials, conference papers, datasets, figures, grants, interviews, letters, media, patents, posters, presentations, source code, theses/dissertations, videos, webpages
http://www.ebscohost.com/newsroom/stories/plum-analytics-becomes-part-of-ebsco-information-services
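To make the five-category rollup concrete, here is a hedged sketch of summing raw per-source counts into usage, captures, mentions, social media, and citations; the source-to-category mapping is purely illustrative, not PlumX's actual taxonomy:

```python
# Hedged sketch: roll raw metric events up into five PlumX-style categories.
# The CATEGORY_OF mapping is invented for illustration only.
from collections import Counter

CATEGORY_OF = {
    "html_views": "usage", "pdf_downloads": "usage",
    "bookmarks": "captures", "readers": "captures",
    "blog_posts": "mentions", "news": "mentions",
    "tweets": "social media", "facebook_likes": "social media",
    "scopus_citations": "citations",
}

def rollup(events: dict) -> dict:
    """Sum per-source counts into categories; unknown sources are skipped."""
    totals = Counter()
    for source, count in events.items():
        category = CATEGORY_OF.get(source)
        if category:
            totals[category] += count
    return dict(totals)

print(rollup({"tweets": 12, "pdf_downloads": 40, "html_views": 100}))
# -> {'social media': 12, 'usage': 140}
```

Keeping the mapping as data makes it easy to adapt the rollup to whatever taxonomy an institution settles on.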
IR integration
17. IR integration
Benefits?
1. Create a feedback loop for researchers that gives them timely insight into the impact of their research right from the start
2. Easy access to advanced metrics from PlumX can help build buy-in from researchers, increasing repository support by your contributors
3. Aggregated metrics and reports from the PlumX dashboard put the repository at the center of communicating about the impact of your organization’s research
Courtesy of Plum
18. a spectrum of possibilities…
http://commons.wikimedia.org/wiki/File:Linear_visible_spectrum.svg
• Integrate altmetrics in your IR
• Lead or support an RNS project (VIVO, etc.)
• Run citation reports
• ORCID services and support
• Encourage prudent use of online tools
• Welcome deposit of alternative outputs in your IR
• Keep current
• Collaborate with the TT office to track outputs
• Develop standard reports/visualizations/analyses
• Hire an impact and evaluation librarian
• Workshop on strategies for enhancing research impact
• Run publication data reports
• Establish a consultation service
• Help researchers obtain DOIs and understand data options
• Understand motivations of funding agencies
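One of these services, helping researchers find or verify DOIs, can start from the Crossref REST API. A sketch, assuming the api.crossref.org/works endpoint and its standard message/items response shape:

```python
# Hedged sketch: bibliographic DOI lookup against the Crossref works endpoint.
# Parameters and response paths should be verified against current Crossref docs.
from urllib.parse import urlencode

def crossref_query_url(citation: str, rows: int = 3) -> str:
    """Build a bibliographic query URL against the Crossref works endpoint."""
    params = urlencode({"query.bibliographic": citation, "rows": rows})
    return f"https://api.crossref.org/works?{params}"

def candidate_dois(response: dict) -> list:
    """Pull (DOI, title) pairs out of a Crossref works response."""
    items = response.get("message", {}).get("items", [])
    return [(it.get("DOI"), (it.get("title") or [""])[0]) for it in items]

# Hypothetical response fragment in the Crossref message/items shape:
sample = {"message": {"items": [
    {"DOI": "10.1371/journal.pone.0064841", "title": ["Do Altmetrics Work?"]}]}}
print(candidate_dois(sample))
```

A consultation workflow would fetch crossref_query_url for a pasted citation and present the candidate (DOI, title) pairs for the researcher to confirm.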
19. local success depends on stakeholder engagement
• Stay up to date on scholarly issues and the scholarly workflow
• Stay up to date on funding/reporting requirements
• Brainstorm to identify and understand motivations of stakeholders
• Anticipate needs and present solutions
• Call on advocates
• Think beyond ‘business as usual’
• Be persistent!
20. Acknowledgements
Support:
• Northwestern University Clinical and Translational Sciences Institute, NIH award 8UL1TR000150-05
• VIVO – DuraSpace
Thanks:
• Cathy Sarli, MLS, AHIP
• Karen Gutzman – NLM Fellow
• Andrea Michalek at Plum @amichalek
• VIVO Community @VIVOcollab
• Galter Health Sciences Library
Questions:
@kristiholmes
kristi.holmes@northwestern.edu
Thank you!
Editor's Notes
How do we scale up and operationalize this process
Our work in this area really orbits around efforts to develop a library-based framework for understanding meaningful research impact, The Becker Model.
Project with Cathy Sarli, originally motivated by work she did for one of our researchers at Washington University, Dr. Mae Gordon.
Consistency enhances retrieval.
A lot of the suggestions lie directly in our wheelhouse – makes this a great opportunity for the library
How do we scale up and operationalize this process
Multiple platforms
VIVO can play a big role in helping us operationalize this process – as well as leverage the data at the enterprise level for higher-order analyses, visualizations, and strategic planning efforts.
Much of this is not possible to tease out from bibliographic or grant data
Applying the pathways, we are able to uncover outputs and impacts that may not be visible if one only follows the papers – things related to
Diagnosis, clinical care, clinical environment.
Cost-effective interventions
Even materials and improvements related to how we educate doctors to practice medicine
KH
Let’s use Plum Analytics as a good example for a deep dive on the types of things you can track…
Trackable artifact metadata includes: subjects, trade names, manufacturers, publisher, abstract, author keywords, index keywords, date, grant funding support, language, DOI, source, document type, peer review status, authors, author affiliations, corresponding author, group authors, chemicals, institutions, countries, references, and citation counts.
Just a few notes of thanks.
First to Cathy Sarli, a dear colleague and collaborator. We work together on the Becker Model.
Cathy, Karen Gutzman (NLM second year fellow) and I work together at Becker Library and we are also members of the WU ICTS Tracking and evaluation team. We’re in the midst of a pilot project to apply the Becker Model to ICTS investigators and look forward to learning more from this process.
Andrea Michalek from Plum – very cool work – for their work with VIVO and information for this presentation
Finally, the VIVO team and open source community – a fantastic group of friends and colleagues from all over the world doing very cool work and spreading Linked Open Data love wherever they go.
Librarians may be interested to read more about a complementary effort, Linked Data for Libraries, a recent award from the Mellon Foundation to Cornell, Stanford, and Harvard Libraries.