Whether preparing a portfolio or reassessing a research topic, metrics help quantify scholarly impact. Traditional metrics such as the h-index or Impact Factor assist in this endeavor but often fall short in capturing all sides of the story. This informational session, tailored to faculty but open to all, focuses on Alternative Metrics: what they are, where they can be accessed, and how they can be used to demonstrate impact.
Joining the ‘buzz’: the role of social media in raising research visibility (Eileen Shepherd)
Traditional bibliometric methods of evaluating academic research, such as journal impact factors and article citations, have been supplemented in the past 5-10 years by the development of altmetrics (alternative metrics, or article-level metrics). Altmetrics measure the impact of research, data and publications through indicators such as references in data and knowledge bases, article views, downloads, and mentions in social and news media. This presentation gives a brief background to altmetrics and demonstrates how Rhodes University librarians are using social media to raise the visibility of their institution's research output. (Rhodes University is in Grahamstown, South Africa.)
I apply Ranganathan's 5 laws of library science to altmetrics, as part of a holistic research impact support service. I discuss what altmetrics are, what they measure, their uses throughout the research lifecycle, and where you can get them. I then apply Ranganathan's 4th law, saving the time of the user, to the harvesting of altmetrics by research information systems, embedding them at the point of need. The challenge of altmetrics is to change our concept of what an institutional repository is, from a simple container of research outputs to a smart system that harvests and catalogs a much broader range of output and impact data elements.
Increasingly, many aspects of scholarly communication—particularly publication, research data, and peer review—undergo scrutiny by researchers and scholars. Many of these practitioners are engaging in a variety of ways with Alternative Metrics (#altmetrics in the Twitterverse). Alternative Metrics take many forms but often focus on efforts to move beyond proprietary bibliometrics and traditional forms of peer referencing in assessing the quality and scholarly impact of published work. Join NISO for a webinar that will present several emerging aspects of Alternative Metrics.
Citation metrics versus peer review: Google Scholar, Scopus and the Web of Sc... (Anne-Wil Harzing)
This presentation reports on a systematic and comprehensive comparison of the coverage of the three major bibliometric databases: Google Scholar, Scopus and the Web of Science. Based on a sample of 146 senior academics in five broad disciplinary areas, we provide both a longitudinal and a cross-disciplinary comparison of the three databases.
Our longitudinal comparison of eight data points between 2013 and 2015 shows a consistent and reasonably stable quarterly growth for both publications and citations across the three databases. This suggests that all three databases provide sufficient stability of coverage to be used for more detailed cross-disciplinary comparisons.
Our cross-disciplinary comparison of the three databases includes four key research metrics (publications, citations, h-index, and hI,annual, an annualised individual h-index) and five major disciplines (Humanities, Social Sciences, Engineering, Sciences and Life Sciences). We show that both the data source and the specific metrics used change the conclusions that can be drawn from cross-disciplinary comparisons.
This presentation first outlines five different aspects of impact. I then look at what we can learn from the measurement of academic impact, usually operationalised as citations. I show that four key recommendations for academic impact (multiple sources, multiple metrics, cross-disciplinary focus, and long term perspective) can be applied to non-academic impact as well. In addition, I argue that the four C's of citation impact (competence, communication, collaboration, and care) also apply to non-academic impact.
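The metrics compared above can be made concrete with a small sketch (not taken from the presentation): the h-index computed from per-paper citation counts, plus a rough annualised individual h-index along the lines of Harzing's hI,annual. The exact formula used below (h-index of co-author-normalised citation counts, divided by career length in years) is an assumption based on how the hIa metric is commonly described.

```python
# Illustrative sketch: h-index from per-paper citation counts, and a
# rough annualised individual h-index (hIa). The hIa formula below
# (h-index of co-author-normalised citation counts, divided by career
# length in years) is an assumption based on common descriptions of
# Harzing's hI,annual metric.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def hia(citations, coauthor_counts, career_years):
    """Annualised individual h-index (assumed formula, see above)."""
    normalised = [c / a for c, a in zip(citations, coauthor_counts)]
    return h_index(normalised) / career_years

# e.g. five papers cited 10, 8, 5, 4 and 3 times give h = 4:
# h_index([10, 8, 5, 4, 3]) == 4
```

Note how sensitive the annualised variant is to both co-authorship and career stage, which is exactly why the presentation stresses that the choice of metric changes cross-disciplinary conclusions.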
Tweet Your Pubs: How Altmetrics are Changing the Way We Measure Research Impact (Robin Featherstone)
Presentation given to the Northern Alberta Health Libraries Association (NAHLA) Trends Mini Conference in Edmonton at the University of Alberta on May 2, 2014
10 SIMPLE STEPS TO BUILDING A REPUTATION AS A RESEARCHER, IN YOUR EARLY CAREER (Micah Altman)
A talk sponsored by the MIT Postdoctoral Association with support from the Office of the Vice President for Research.
In the rapidly changing world of research and scholarly communication, researchers face a fast-growing range of options to publicly disseminate, review, and discuss research—options which will affect their long-term reputation. Junior scholars must be especially thoughtful in choosing how much effort to invest in dissemination and communication, and what strategies to use.
In this talk, I briefly discuss a review of bibliometric and scientometric studies of quantitative research impact, a sampling of influential qualitative writings advising in this area, and an environmental scan of emerging researcher profile systems. Based on this review, and on professional experience on dozens of review panels, I suggest some steps junior researchers may consider when disseminating their research and participating in public review and discussion.
Integrating ORCID, Funding, and Institutional Identifiers (Micah Altman)
Presented at the "Twelfth Annual ARIES EMUG Users Group Meeting".
The presentation embedded below provides an overview of ORCID researcher identifiers; their role in integrating systems for managing, evaluating, and tracking scholarly outputs; and the broader integration of researcher identifiers with publication, funder, and institutional identifiers.
Citation metrics across disciplines - Google Scholar, Scopus, and the Web of ... (Anne-Wil Harzing)
Key conclusions:
1. Will the use of citation metrics disadvantage the Social Sciences and Humanities?
* Not if you use a database that includes publications important in those disciplines (e.g. books, national journals)
* Not if you correct for differences in co-authorship
2. Is peer review better than metrics for the Social Sciences and Humanities?
* Yes, in a way… the ideal version of peer review (informed, dedicated, and unbiased experts) is better than a reductionist version of metrics
* However, an inclusive version of metrics is probably better than the likely reality of peer review (hurried semi-experts, potentially influenced by journal outlet and affiliation)
Keynote speech at the European Academy of Management, at a panel on the future of business schools. Discusses the case for and against becoming more relevant.
The case for:
Engagement leads to better research
Ranking-mania leads us astray
Engagement through new media is easy
The case against:
Has the quest for relevance gone too far?
Are we asking too much of (junior) academics?
Let’s not create opposing “camps”
Building your academic brand through engagement with social media (Anne-Wil Harzing)
What constitutes social media in an academic context?
Why do you (not) use social media?
Five key types of social media with different functions
Brief overview of key purpose and functionality
Look at a real-life example
Recommendations for how to use social media
Research impact metrics for librarians: calculation & context (Library_Connect)
Slides from the May 19, 2016, Library Connect webinar "Research impact metrics for librarians: calculation & context" with Jenny Delasalle and Andrew Plume.
Watch the webinar at: https://libraryconnect.elsevier.com/library-connect-webinars?commid=199783
Presented by Dom Mitchell, Community Manager for DOAJ, to the 35th Conference of the International Association of Scientific and Technological University Libraries (IATUL).
A presentation exploring how DOAJ is using crowdsourcing to evaluate the ~9700 journals currently in DOAJ. Using a network of volunteers, every journal will be reassessed and evaluated against the new criteria.
This version contains a handful of extra slides that were originally removed due to time restrictions.
How to measure research impact on the web (Kinga Hosszu)
This presentation explains how research impact measurement has changed with the advent of the internet, and provides examples of how impact can be measured using several online tools.
Making an Impact: The Impact Factor's Intent, Benefits, Limitations, and Comp... (Erin Owens)
The Impact Factor is popularly viewed as a representation of a scholarly journal's quality and desirability for publication. But this metric is frequently misused, while other metrics more suitable to a goal may be overlooked. This presentation will help researchers understand the purpose of the Impact Factor, analyze its benefits and limitations, and evaluate available alternatives.
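As a concrete illustration of what this metric actually computes: the standard two-year Impact Factor is the citations received this year to a journal's items from the previous two years, divided by the citable items it published in those two years. A minimal sketch, with made-up numbers:

```python
# Illustrative sketch: the standard two-year Journal Impact Factor.
# All numbers in the example are invented.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Citations this year to the journal's previous two years of items,
    divided by the citable items published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# e.g. 250 citations in 2015 to items published in 2013-2014, out of
# 100 citable items published in those years:
# impact_factor(250, 100) == 2.5
```

Seeing the calculation spelled out makes one of its limitations obvious: it is a journal-level average over a short window, not a statement about any individual article.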
Joining the ‘buzz’: the role of social media in raising research visibility ... (Eileen Shepherd)
[This presentation is based on my previous presentation, of the same title, at the LIASA 2014 conference. It was presented as a webinar for LIASA Higher Education Libraries Interest Group on 6/11/2014]
Traditional bibliometric methods of evaluating academic research, such as journal impact factors and article citations, have been supplemented in the past 5-10 years by the development of altmetrics (alternative metrics, or article-level metrics). Altmetrics measure the impact of research, data and publications through indicators such as references in data and knowledge bases, article views, downloads, and mentions in social and news media. This presentation gives a brief background to altmetrics and demonstrates how Rhodes University librarians are using social media to raise the visibility of their institution's research output. (Rhodes University is in Grahamstown, South Africa.)
Joining the ‘buzz’ : the role of social media in raising research visibility at Rhodes University, Grahamstown, South Africa - HELIG Webinar presented by Eileen Shepherd
WEBINAR: Joining the "buzz": the role of social media in raising research vi... (HELIGLIASA)
Joining the ‘buzz’: the role of social media in raising research visibility: Traditional bibliometric methods of evaluating academic research, such as journal impact factors and article citations, have been supplemented in the past 5-10 years by the development of altmetrics (alternative metrics/article-level metrics). Altmetrics measure aspects of the impact of a work, such as references in data and knowledge bases, article views, downloads, and mentions in social and news media.
This webinar (based on a presentation of the same name at the LIASA conference on 24th September 2014) gives a brief background to altmetrics and demonstrates how librarians at Rhodes University, Grahamstown, are using social media to raise the visibility of the research output of their institution.
Presented by Eileen Shepherd, Principal Librarian, Science & Pharmacy, Rhodes University Library
Research-Open Access-Social Media: A winning combination (Eileen Shepherd)
This presentation endeavours to show that social media and open access are a great couple, to provide a brief introduction to altmetrics (a non-traditional form of measuring scholarly impact), and to demonstrate the use of social media in raising awareness and visibility of Rhodes University research.
ALTMETRICS: A HASTY PEEP INTO NEW SCHOLARLY MEASUREMENT (Saptarshi Ghosh)
The term ‘Altmetrics’ was proposed by Jason Priem, a PhD student at the School of Information and Library Science at University of North Carolina, Chapel Hill through a tweet. [https://twitter.com/asnpriem/status/25844968813].
‘Altmetrics’ is a combination of two words, ‘alternative’ and ‘metrics’, in which the ‘alt-’ part refers to alternative types of metrics (that is, alternative to traditional metrics such as citation analysis, impact factors, and downloads and usage data).
Altmetrics is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship (http://altmetrics.org/about/). It is the study of new indicators for the analysis of academic activity based on Web 2.0.
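To make the contrast with a single citation count concrete, here is a minimal illustrative sketch (hypothetical data, not from the presentation): an altmetric profile is essentially a tally of attention events per source, rather than one number.

```python
# Illustrative sketch with hypothetical data: an altmetric profile as a
# tally of attention events per source, rather than one citation count.
from collections import Counter

def altmetric_profile(events):
    """events: iterable of (source, detail) pairs; returns counts per source."""
    return Counter(source for source, _ in events)

events = [
    ("tweet", "..."), ("tweet", "..."), ("blog", "..."),
    ("news", "..."), ("mendeley_reader", "..."), ("tweet", "..."),
]
profile = altmetric_profile(events)
# profile["tweet"] == 3, profile["blog"] == 1
```

Real altmetrics providers aggregate such events from many platforms; the point of the sketch is only that the result is a multidimensional profile, not a single score.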
The academic impact of research: Current and the future citation trends in de... (Nader Ale Ebrahim)
Writing an article for online distribution in a way that maximises the chances of citation differs from preparing one for print journals in some small but important respects. To be cited, articles have to be visible in an electronic environment. Publishing a high-quality paper in a scientific journal is therefore only halfway to receiving citations; the rest of the way is advertising and disseminating the publication using the proper “Research Tools”. Familiarity with these tools allows researchers to increase their h-index in a short time.
The number of citations carries an over 30% share in academic rankings; hence, most scientists are looking for a method to increase their citation record. Nader developed and introduced a method for increasing the visibility and impact of research, which directly affects the number of citations. This talk introduces some of the key points for improving citation trends in developing countries by presenting the current situation and future trends.
Introduction to “Research Tools”: Tools for Collecting, Writing, Publishing, ... (Nader Ale Ebrahim)
“Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. I have collected over 700 tools that enable researchers to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1) Searching the literature, (2) Writing a paper, (3) Targeting suitable journals, and (4) Enhancing visibility and impact of the research. This presentation will provide an overview of the most important tools, from searching the literature to disseminating research outputs. The e-skills learned from the workshop are useful across various research disciplines and research institutions.
Research-Open Access-Social Media: a winning combination, presented by Eileen Shepherd at the Open Access Symposium on 21 October 2014 - Rhodes University Library
Introduction to Altmetrics for Medical and Special Librarians (Linda Galloway)
Altmetrics (or alternative citation metrics) provide new ways to track scholarly influence across a wide range of media and platforms. This presentation covers altmetric fundamentals, tips on connecting your users with altmetrics, and an overview of newly published research. Presented as part of the NN/LM MAR Boost Box Series; http://nnlm.gov/mar/training/boost_mar2014.pdf
Analysis of Bibliometrics information for selecting the best field of study (Nader Ale Ebrahim)
Bibliometrics can be defined as the statistical analysis of publications. Bibliometrics has focused on the quantitative analysis of citations and citation counts, which is so complex and specialized that personal knowledge and experience are insufficient tools for understanding trends and making decisions. We need tools for analyzing bibliometric information in order to select the best field of study, one attracting promising attention. This presentation will provide tools to discover the new trends in our field of study in order to select an area for research and publication that promises the highest research impact.
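The simplest form of the trend analysis described above can be sketched as a publication count per year (the records below are made up for illustration):

```python
# Illustrative sketch with made-up records: publication counts per year
# as the most basic bibliometric trend indicator for a field of study.
from collections import Counter

def yearly_counts(records):
    """records: iterable of (year, title) pairs; returns {year: count}."""
    return dict(Counter(year for year, _ in records))

records = [(2013, "a"), (2014, "b"), (2014, "c"),
           (2015, "d"), (2015, "e"), (2015, "f")]
counts = yearly_counts(records)
# counts == {2013: 1, 2014: 2, 2015: 3}: publication output is growing
```

Bibliometric tools apply the same idea at scale, layering citation counts and field normalisation on top of these raw per-year tallies.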
Using Bibliometrics Tools to Increase the visibility of your publications (Ciarán Quinn)
Strategies to increase the visibility of your research, including using keywords, bibliometric resources, measuring your h-index, journal impact, article-level metrics, altmetrics, and academic social networks.
Metrics vs peer review: Why metrics can (and should?) be applied in the Socia... (Anne-Wil Harzing)
Reviews the debates on metrics vs peer review and suggests that we are comparing the idealised version of peer review with the reductionist version of metrics. Instead, we should compare the reality of peer review with the inclusive version of metrics.
Google Scholar: Can it Really Be Used for Bibliometrics? by Isobel Stark and Michael Whitton, University of Southampton. Presentation at Research Evaluation: Is It Our Business? The Role of Librarians in the Brave New World of Research Evaluation, 29 June 2011, University of Birmingham, Edgbaston Campus.
Similar to Beyond Citations - NEIU NETT Day Presentation on Altmetrics (20)
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation is no small feat: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring of JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
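JMeter streams its sample metrics to InfluxDB through a Backend Listener; under the hood, each sample becomes a point in InfluxDB's line protocol. A minimal sketch of that wire format — the measurement, tag, and field names here are illustrative, not the listener's exact schema:

```python
import time

def influx_line(measurement, tags, fields, ts=None):
    """Format one InfluxDB line-protocol point:
    measurement,tag1=v1 field1=v1,field2=v2 timestamp(ns)"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    ts = ts if ts is not None else int(time.time() * 1e9)
    return f"{measurement},{tag_str} {field_str} {ts}"

# An illustrative JMeter-style sample: average response time and count
# for a "login" transaction, written at timestamp 0 for reproducibility.
point = influx_line("jmeter", {"transaction": "login"},
                    {"avg": 42.0, "count": 10}, ts=0)
print(point)  # jmeter,transaction=login avg=42.0,count=10 0
```

Grafana then queries these points from InfluxDB to draw the real-time dashboards shown in the webinar.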
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an “infrastructure container Kubernetes guy”, how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our lovely cloud native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to make it work on our own infrastructure from an enterprise perspective. I give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working in practice.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
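As a taste of how an Object Calisthenics constraint reinforces a tactical DDD pattern: the rule "wrap all primitives and strings" leads naturally to a Value Object. A minimal sketch — the Money class and its rules are illustrative, not taken from the talk:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """Value Object: immutable, validated at construction, compared by value."""
    amount_cents: int
    currency: str

    def __post_init__(self):
        if self.amount_cents < 0:
            raise ValueError("amount must be non-negative")
        if len(self.currency) != 3:
            raise ValueError("currency must be an ISO 4217 code")

    def add(self, other: "Money") -> "Money":
        # Domain rule lives with the type, not scattered across callers.
        if other.currency != self.currency:
            raise ValueError("cannot add different currencies")
        return Money(self.amount_cents + other.amount_cents, self.currency)

price = Money(1999, "EUR").add(Money(100, "EUR"))
print(price.amount_cents)  # 2099
```

Because the primitive is wrapped, invalid states (negative amounts, mixed currencies) are unrepresentable — exactly the "mechanical" guidance the talk describes.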
Beyond Citations - NEIU NETT Day Presentation on Altmetrics
1. Beyond Citations
Demonstrating your Impact through
Alternative Metrics
October 14, 2014
Kelly Grossmann, Science Librarian
MS Information, MS Bioinformatics
2. Outline
1. How and why do we assess impact?
2. Methods and Metrics for Impact Assessment
- Traditional
- Alternative
3. Improving your Impact
3. How do we assess the impact of research?
… and why should we?
10. For Faculty & Researchers
“Publish or Perish”
- Showing academic value in a sea of information.
Increased importance of the relevance and significance of research
11. For Students
Evaluate sources and identify core papers
Follow key research trends
Perhaps even demonstrate your scholarly impact for graduate school
17. Impact Factor
Journal level metric
Average number of citations to articles within journal
Easily Googled
Allows comparison by subject in Journal Citation Reports
Proprietary
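The slide's definition can be made concrete: the 2-year Impact Factor is the mean number of citations received in year Y by items the journal published in years Y−1 and Y−2. A toy sketch with invented numbers (the real JIF is computed from Thomson Reuters' proprietary Journal Citation Reports data):

```python
def impact_factor(citations_in_year, citable_items_prev_two_years):
    """2-year Journal Impact Factor: citations in year Y to items
    published in Y-1 and Y-2, divided by the number of citable items
    published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Invented numbers: 420 citations in 2014 to the 150 articles a journal
# published across 2012 and 2013.
print(impact_factor(420, 150))  # 2.8
```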
23. Google to Find h-index
http://scholar.google.com
Use author profile to find author h-index
Use Journal metrics to find journal h-index (average)
Allows you to compare impact within your field
25. Criticism of Traditional Methods
Depend on counters
“Gaming the system”
Slow
Vary by time and length of career
Gender bias in referencing practices
Skew towards favoring the popular
Do not distinguish negative from positive references
Vary greatly by discipline
Often only available at the journal level (IF & EF)
28. Altmetrics
Short for Alternative Metrics
Methods of analyzing impact beyond citations
Made possible by our ability to quickly, electronically share information.
42. Where to Share
GitHub - Code
FigShare - Datasets, images, videos
SlideShare - Presentations
Twitter - Social networking
Facebook - Social networking
43. In summation...
Many metrics, many tools.
Combination of metrics is best.
Be sure to compare within your discipline.
Be sure to share your work to improve your impact.
45. References
[1] Sci2 Team. Science of Science (Sci2) Tool. User Manual. http://wiki.cns.iu.edu/pages/viewpage.action?pageId=2200066
[5] Priem, J. "MEDLINE-indexed articles published per year." R Chart. Jason Priem/blog. 18 Oct. 2010. Accessed 10 Oct. 2014. <http://jasonpriem.org/2010/10/medline-literature-growth-chart/>
[6] Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C, et al. (2011) “The Development of Open Access Journal Publishing from 1993 to 2009.” Figure 2: The development of open access publishing. PLoS ONE 6(6): e20961. doi:10.1371/journal.pone.0020961
[7] Van Noorden, R. “Science publishing: The trouble with retractions.” Box: Rise of the retractions. Nature. Published online 5 Oct. 2011. Accessed 10 Oct. 2014. doi:10.1038/478026a <http://www.nature.com/news/2011/111005/full/478026a/box/2.html>
[7] Bohannon, J. “Who’s Afraid of Peer Review?” Science. 4 Oct. 2013. 342(6154): 60-65. doi:10.1126/science.342.6154.60 <http://www.sciencemag.org/content/342/6154/60.full>
[8] McGovern, V. “Foundation funding and chemical biology.” Trends in research funding by agency. Nature Chemical Biology. 4: 519-522. 2008. doi:10.1038/nchembio0908-519 <http://www.nature.com/nchembio/journal/v4/n9/fig_tab/nchembio0908-519_F1.html>
[12, 13] Sci2 Team. Science of Science (Sci2) Tool. Indiana University and SciTech Strategies, (2009). https://sci2.cns.iu.edu
[19] University of Washington. “Overview”. Eigenfactor information page. <http://www.eigenfactor.org/methods.php> Image: A model of research. <http://www.eigenfactor.org/images/animatedfigure.gif> Accessed 10 Oct. 2014.
[20] Vulpecula (user name). "h-index." h-index from a plot of decreasing citations for numbered papers. Wikipedia. 02 Jan. 2008. Accessed 10 Oct. 2014. <http://en.wikipedia.org/wiki/H-index>
[25] Wendl, M.C. “H-index: however ranked, citations need context.” Nature. (2007) 449: 403.
[25] Kelly, C.D., Jennions, M.D. “H-index: age and sex make it unreliable.” Nature. (2007) 449: 403.
[25] Pagel, P.S., Hudetz, J.A. “H-index is a sensitive indicator of academic activity in highly productive anaesthesiologists: results of a bibliometric analysis.” Acta Anaesthesiologica Scandinavica. (2011) 55(9): 1085-1089. doi:10.1111/j.1399-6576.2011.02508.x <http://onlinelibrary.wiley.com/doi/10.1111/j.1399-6576.2011.02508.x/abstract>
[26] West Side Story. Dir. Jerome Robbins and Robert Wise. Perf. Natalie Wood. Comp. Leonard Bernstein. Mirisch Corporation, 1961. Film.
[29] Careless, J. (2013). Altmetrics 101: A Primer. (cover story). Information Today, 30(2), 1-36. Accessed October 2014.
[32] Lowman, M. (2014). How to Raise a Woman Scientist. Retrieved 13 Oct. 2014, from http://www.huffingtonpost.com/meg-lowman/how-to-raise-a-woman-scie_b_5928644.html
[34, 35] Mendeley. <http://www.mendeley.com/catalog/modularity-community-structure-networks-27/> Accessed 10 Oct. 2014.
[37] Thomson, J. “Altmetrics added to Royal Society of Chemistry Journals”. RSC Publishing Blog. 12 Sept. 2013. Accessed 10 Oct. 2014. http://blogs.rsc.org/rscpublishing/2013/09/12/altmetrics-added-to-royal-society-of-chemistry-journals/
46. Further Reading
Altmetrics Manifesto http://altmetrics.org/
University of Maryland Altmetrics LibGuide
http://lib.guides.umd.edu/altmetrics
HLWiki International:
http://hlwiki.slais.ubc.ca/index.php/Author_impact_metrics
Full citations available in presentation notes.
# of articles indexed in MEDLINE (the database behind PubMed) has grown almost exponentially since 1950, trending sharply upward in the late ’90s.
Source:
Priem, J. "MEDLINE-indexed articles published per year." R Chart. Jason Priem/blog 18 Oct. 2010. Accessed 10 Oct. 2014 <http://jasonpriem.org/2010/10/medline-literature-growth-chart/>.
The web has allowed for the growth of open access journals and thus an increase in open access articles, via a more rapid rate of production and review. This graph was developed by a group of economics researchers and published in an open access journal. The analysis focused on the rate of publication of articles and the increasing prevalence of open access journals. Here we see that the number of articles published has grown greatly along with the number of open access journals.
Source:
Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C,et al. (2011) “The Development of Open Access Journal Publishing from 1993 to 2009.” Figure 2: The development of open access publishing. PLoS ONE 6(6): e20961. doi:10.1371/journal.pone.0020961.
With more articles come more retractions. Retractions have actually grown at a faster rate than article output itself.
I am not attributing this trend to open access publication practices (although read John Bohannon’s “Who’s Afraid of Peer Review?” to learn more about open access journals with unacceptable publication practices). The increase in retractions more likely has to do with better research oversight committees (Van Noorden) and the culture of academic publication (publish-or-perish pressure).
Source:
Van Noorden, R. “Science publishing: The trouble with retractions.” Box: Rise of the retractions. Nature. Published online 5 Oct. 2011. Accessed 10 Oct. 2014. doi: 10.1038/478026a <http://www.nature.com/news/2011/111005/full/478026a/box/2.html>
Bohannon, J. “Who’s Afraid of Peer Review?” Science. 4 Oct. 2013. 342(6154). pages 60-65. doi: 10.1126/science.342.6154.60 <http://www.sciencemag.org/content/342/6154/60.full>
After inflation is accounted for, we see a downward trend in public funding for the sciences since 2004.
McGovern, V. “Foundation funding and chemical biology.” Trends in research funding by agency. Nature Chemical Biology. 4, 519-522. 2008. doi: 10.1038/nchembio0908-519. <http://www.nature.com/nchembio/journal/v4/n9/fig_tab/nchembio0908-519_F1.html>
The study of scholarly communication has become a science in and of itself. Sci2 is an example of a data analysis tool that can be used to examine networks of scholarly communication.
Sci2 Team. Science of Science (Sci2) Tool. Indiana University and SciTech Strategies, (2009). https://sci2.cns.iu.edu.
Sci2 Team. Science of Science (Sci2) Tool. Indiana University and SciTech Strategies, (2009). https://sci2.cns.iu.edu.
The following tools are a handful of the traditional metrics used to analyze scholarly impact.
The most traditional way of assessing impact. An article gets published, the more that additional articles refer to it, the more citation counts it has.
Article level
Note the differences in citations.
These are limited to the journals indexed in Web of Science and can only be located for articles listed in Biological Abstracts (due to our access restrictions). More articles can be found on Web of Science, but you must access them as a visitor of the UIC Library.
The differences in citation counts can also be attributed to inaccurate indexing in Google Scholar, which double counts some citations.
Google Scholar also counts the citations of questionable ‘scholarly’ journals.
Cons
Proprietary, algorithm is not public
Available only at the Journal level
Available only for journals in the ISI lists.
Should be compared to other journals in your field for a more accurate representation of impact.
Can be gamed by self citations.
An example of a very high impact factor.
Pros:
A more robust algorithm than IF, accounts for the impact of citations from influential journals (determined by the citations of the citing journals).
Scores journals from 0.01 to 100, with all journals' scores summing to 100.
Really beautiful data visualizations.
Not proprietary.
Easily found via Google or the Eigenfactor website.
Makes it easy to compare to other Eigenfactors in your field.
Cons:
Only available for ISI Journals.
Only on the journal level.
Source:
University of Washington. “Overview”. Eigenfactor information page. <http://www.eigenfactor.org/methods.php> Image: A model of research. <http://www.eigenfactor.org/images/animatedfigure.gif> Accessed on 10 Oct. 2014.
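The "more robust algorithm" described above is PageRank-like: journals cited by influential journals score higher, self-citations are excluded, and the final scores sum to 100. A simplified power-iteration sketch on an invented three-journal citation matrix — the real method, described at eigenfactor.org, uses five-year citation data and an article-weighted teleport vector, so this is only an illustration of the idea:

```python
def eigenfactor_sketch(cites, alpha=0.85, iters=200):
    """cites[i][j] = citations from journal j to journal i.
    Returns PageRank-style influence scores normalized to sum to 100."""
    n = len(cites)
    cols = []
    for j in range(n):
        # Eigenfactor ignores self-citations: zero the diagonal, then
        # normalize each column so a journal distributes one unit of "vote".
        col = [cites[i][j] if i != j else 0 for i in range(n)]
        s = sum(col)
        cols.append([c / s for c in col] if s else [1 / n] * n)
    score = [1 / n] * n
    for _ in range(iters):  # power iteration with damping
        score = [alpha * sum(cols[j][i] * score[j] for j in range(n))
                 + (1 - alpha) / n for i in range(n)]
    total = sum(score)
    return [100 * s / total for s in score]

# Invented 3-journal citation matrix:
scores = eigenfactor_sketch([[0, 5, 1],
                             [2, 0, 3],
                             [1, 1, 0]])
print([round(s, 1) for s in scores])  # three scores summing to 100
```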
AKA the Hirsch index.
Traditionally used in science.
Pros:
Author level
Can be calculated for any author in any discipline (not limited to the ISI list).
Cons:
Varies greatly by discipline
Image Source:
User name: Vulpecula. "h-index" h-index from a plot of decreasing citations for numbered papers. Wikipedia. 02 Jan. 2008. Accessed 10 Oct. 2014 <http://en.wikipedia.org/wiki/H-index>.
h-index: 4 articles each with at least 4 citations, so h-index = 4
i10-index: 2 articles with at least 10 citations, so i10 = 2
[Thank you to Lisa Wallis for allowing us to view her publications]
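Both indexes are easy to compute directly from a list of per-paper citation counts; a short sketch using hypothetical counts chosen to reproduce the slide's example numbers:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def i10_index(citations):
    """Google Scholar's i10-index: number of papers with >= 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical per-paper citation counts:
papers = [33, 30, 4, 4, 2, 1, 0]
print(h_index(papers), i10_index(papers))  # 4 2
```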
Can be used by anyone, including students to evaluate the impact of a journal.
Can also be used to compare the average h-indexes of journals.
Step 1. Visit scholar.google.com
Step 2. Click “Metrics”
Step 3. Adjust by field using the links in the left column of the page.
You may view the profiles of some authors (if they are set up) by clicking on the author’s name in the Google Scholar citation.
If you are an author, you can setup your google scholar profile to help collect metrics.
Wendl, M.C. “H-index: however ranked, citations need context.” Nature. (2007) 449: 403.
Kelly, C. D. , Jennions, M.D. “H-index: age and sex make it unreliable.” Nature. (2007) 449: 403.
Pagel, P.S., Hudetz, J.A. “H-index is a sensitive indicator of academic activity in highly productive anaesthesiologists: results of a bibliometric analysis.” Acta Anaesthesiologica Scandinavica. (2011) 55(9): 1085-1089. doi:10.1111/j.1399-6576.2011.02508.x <http://onlinelibrary.wiley.com/doi/10.1111/j.1399-6576.2011.02508.x/abstract>
Image from: West Side Story. Dir. Jerome Robbins and Robert Wise. Perf. Natalie Wood. Comp. Leonard Bernstein. Mirisch Corporation, 1961. Film.
Originally, article-level metrics failed to capture the vast array of additional, non-traditional measures of an individual's impact.
Fittingly enough, the term was first coined in a tweet. - J. Priem https://twitter.com/jasonpriem/status/25844968813
Careless, J. (2013). Altmetrics 101: A Primer. (cover story). Information Today, 30(2), 1-36. [Accessed October, 2014].
Screen cap from: http://www.huffingtonpost.com/jeremy-scheinberg/fostering-an-early-love-o_b_5941804.html?utm_hp_ref=girls-in-stem
An example of a non scholarly publication that could still have a public or scholarly impact.
Lowman, M. (2014). How to Raise a Woman Scientist. Retrieved 13 Oct. 2014, from http://www.huffingtonpost.com/meg-lowman/how-to-raise-a-woman-scie_b_5928644.html
Most well known, most traditional. Can collect the number of readers of an article and examine the article’s popularity in various fields.
Criticism: should the count go down if an item is removed from a library?
From Mendeley. <http://www.mendeley.com/catalog/modularity-community-structure-networks-27/> Accessed 10 Oct. 2014
From Mendeley. http://www.mendeley.com/research-papers/ Accessed 10 Oct. 2014.
Very handy tool for collecting altmetrics on articles.
Visit the URL. Drag the bookmarklet to your bookmarks bar.
Open an article online.
Click the Altmetric Bookmark.
A popup should appear with Altmetrics for the article.
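Besides the bookmarklet, Altmetric exposes the same per-article data through a free public API keyed by DOI. A minimal sketch — the v1 endpoint below is from Altmetric's public documentation, but response field names may change, so verify before relying on them:

```python
import json
import urllib.request

ALTMETRIC_V1 = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the public v1 lookup URL for a DOI."""
    return ALTMETRIC_V1 + doi

def fetch_altmetrics(doi):
    """Fetch the attention data for one DOI (network call; returns a dict)."""
    with urllib.request.urlopen(altmetric_url(doi)) as resp:
        return json.load(resp)

# Example DOI taken from reference [6] of this deck:
print(altmetric_url("10.1371/journal.pone.0020961"))
# A live lookup would look like:
#   data = fetch_altmetrics("10.1371/journal.pone.0020961")
#   print(data.get("score"), data.get("cited_by_tweeters_count"))
```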
Publishers are now making altmetrics available.
Thomson, J. “Altmetrics added to Royal Society of Chemistry Journals”. RSC Publishing Blog. 12 Sept. 2013. Accessed on 10 Oct. 2014. http://blogs.rsc.org/rscpublishing/2013/09/12/altmetrics-added-to-royal-society-of-chemistry-journals/
Some are even encouraging authors to promote using social media.
http://journalauthors.tandf.co.uk/pdfs/socialmedia-infographic.pdf
For the authors, there are a number of data aggregation tools to help you with your impact assessment.
We are no longer limited to one type of publication (scholarly articles) and one type of data (citation counts)
Enhance impact by increasing readership. Publicize your research!