Research evaluation is relevant to librarians because they can provide expertise and data to various stakeholders evaluating research performance. Key stakeholders include university rankings, research funders, institutions, and researchers themselves. There are several tools and data sources librarians can leverage, such as journal rankings and metrics, citation data from databases, and altmetrics. Librarians can advise on using these evaluation methods and managing research information and outputs through repositories and current research information systems.
Supporting Bibliometrics by Jenny Delasalle, Academic Support Manager (Research), University of Warwick. Presentation at the Research Evaluation: Is It Our Business? The Role of Librarians in the Brave New World of Research Evaluation 29 June 2011, University of Birmingham, Edgbaston Campus.
Liam Cleere, University College Dublin's Senior Manager for Research Analytics (Irish Humanities Alliance)
From the IHA Impact in the Humanities event, held on 8 June at QUB and co-sponsored by InterTradeIreland.
Panel Three Impact: How should we capture it?
From the perspectives of analytics, science and policy: how should we capture and measure Impact, how should the definition of Impact incorporate academic perspectives and what role can the humanities play in policy?
This presentation was provided by Rachel Lewellen of Harvard University during the NISO event, Assessment Practices and Metrics in the 21st Century Training Session Four held on Friday, November 9th.
Poster Presentation for 4:am Altmetrics Conference, Toronto ON, CA and National Institutes of Health Bibliometrics and Assessment Conference, Bethesda MD, US
Research Impact in Specialized Settings: 3 Case Studies. Elaine Lasda
Presentation of three case studies in which research impact metrics are used to further the mission of institutions and organizations outside the traditional academic milieu.
This talk presents a mashup platform for research evaluation. It was given at the 2nd Search Computing Workshop in Como, Italy, on 27 May 2010.
The Evolving Role of the Library in Institutional and Faculty Assessment. State of Innovation
A Discussion of Research Metrics - June 2016
Kim Powell, Life Sciences Informationist, Emory University
Holly Miller, Associate Dean, Scholarly Content and Faculty Engagement, Florida International University
Joey Figueroa, Solutions Specialist, Thomson Reuters
Academic Library Impact: Improving Practice and Essential Areas to Research. Lynn Connaway
Connaway, Lynn Silipigni, William Harvey, Vanessa Kitzie, and Stephanie Mikitish. 2017. “Academic Library Impact: Improving Practice and Essential Areas to Research.” Presented at the Update on Value of Academic Libraries Initiative (ACRL) at the ALA Annual Conference, Chicago, Illinois, June 25.
Bibliometric solutions for identifying potential collaborators. Torres-Salinas
EC3metrics is taking part in the 2017 European Summer School for Scientometrics (ESSS), held in Berlin, Germany, from 17 to 22 September 2017. The event has been held annually since 2010 and is organised by the University of Vienna, the German Centre for Higher Education Research and Science Studies (DZHW), the Katholieke Universiteit Leuven, and EC3metrics, which joined the organising committee in 2017. ESSS was created in 2010 in response to a lack of training in scientometrics, especially in German-speaking countries, and to growing demand for such training from policymakers, directors, research managers, scientists, information specialists and librarians. Following the model of previous editions, this year's course theme is "Identification of Research Focuses: National & Institutional Profiles and Strategic Partnerships".
Daniel Torres-Salinas and Nicolás Robinson-García are members of the organising committee on behalf of EC3metrics and also take part as lecturers. On Thursday 21 September, Nicolás Robinson-García and Daniel Torres-Salinas will present the seminar "Bibliometric solutions for identifying potential collaborators".
Abstract: Bibliometric indicators and methodologies are commonly used for benchmarking institutions and individuals and for analysing their research performance. Their potential for identifying partners and promoting collaboration is often overlooked by research institutions. In this presentation we will discuss different indicators and methodologies that can be used to spot institutions, research groups and individuals working on similar research fronts. Using different visualization techniques, we will provide examples of how to present these data in an appealing way that can inform university and research managers. These types of analyses are useful when searching for potential partners or designing strategies to establish scientific collaboration networks.
Libraries routinely gather and report data about their budgets, collections, staff, services, and so forth. But libraries need to do a better job of using these data to help them improve their existing services and communicate value to their stakeholders.
Assessing Students' Information Literacy Skills Using MAP-Works. Millstein Library
Poster presented for the Association of College & Research Libraries (ACRL) Assessment in Action (AiA) program at the American Library Association (ALA) Annual Conference on Friday, June 24, 2016 in Orlando, FL
Attendees will understand what the COUNTER initiative is and how it helps to provide the librarian community with reliable, consistent and compatible online usage statistics reports that can be used to inform collection management decisions. Attendees will be familiar with the different COUNTER reports available for journals, databases and ebooks, and how to access them. They will learn about the COUNTER Code of Practice, which advises vendors on how to become COUNTER compliant and why this is important.
Join us for a comprehensive insight into COUNTER and the COUNTER Code of Practice including:
What is COUNTER?
Why COUNTER is important to library customers
Why COUNTER is important to publishers
How to become COUNTER compliant and the COUNTER Code of Practice
COUNTER reports for books, journals and databases
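As a rough illustration of the material the session covers, the sketch below totals Total_Item_Requests from a minimal, hand-written fragment in the JSON shape used by COUNTER 5 journal reports; the titles and counts are invented, and real reports carry many more header and item fields.

```python
import json

# Minimal illustrative fragment of a COUNTER 5 journal report (JSON form).
# Titles and counts are made up for this sketch.
report_json = """
{
  "Report_Items": [
    {"Title": "Journal of Examples",
     "Performance": [
       {"Period": {"Begin_Date": "2023-01-01", "End_Date": "2023-01-31"},
        "Instance": [{"Metric_Type": "Total_Item_Requests", "Count": 42},
                     {"Metric_Type": "Unique_Item_Requests", "Count": 30}]}]},
    {"Title": "Annals of Samples",
     "Performance": [
       {"Period": {"Begin_Date": "2023-01-01", "End_Date": "2023-01-31"},
        "Instance": [{"Metric_Type": "Total_Item_Requests", "Count": 17}]}]}
  ]
}
"""

def total_item_requests(report: dict) -> dict:
    """Sum Total_Item_Requests per title across all reported periods."""
    totals = {}
    for item in report.get("Report_Items", []):
        title = item.get("Title", "(unknown)")
        for perf in item.get("Performance", []):
            for inst in perf.get("Instance", []):
                if inst["Metric_Type"] == "Total_Item_Requests":
                    totals[title] = totals.get(title, 0) + inst["Count"]
    return totals

print(total_item_requests(json.loads(report_json)))
# → {'Journal of Examples': 42, 'Annals of Samples': 17}
```

A tally like this is the kind of per-title usage figure that feeds collection management decisions.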
The application of linked data concepts and technologies promises great benefits for the retrieval and access of library and cultural heritage resources, but requires new ways of thinking by professionals and end-users. This presentation will use linked data examples from RDA: resource description and access, and will discuss:
- A basic introduction to data structures: triples, chains, and clusters
- What is a linked data record?
- Global, multilingual linked data
- Linking data from multiple sources
- The Semantic Web: a paradigm shift?
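The triple structure listed above can be sketched in a few lines of Python. The URIs and resource names below are invented for illustration, though the predicate prefixes echo real vocabularies (Dublin Core Terms, FOAF):

```python
# A toy in-memory triple store showing the (subject, predicate, object)
# structure behind linked data, and how triples chain into longer paths.
triples = [
    ("ex:work/moby-dick", "dcterms:creator", "ex:person/melville"),
    ("ex:work/moby-dick", "dcterms:title", "Moby Dick"),
    ("ex:person/melville", "foaf:name", "Herman Melville"),
]

def objects(subject, predicate):
    """Return all objects matching a subject/predicate pair."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Chaining: follow the creator link, then look up that resource's name.
creator = objects("ex:work/moby-dick", "dcterms:creator")[0]
print(objects(creator, "foaf:name"))  # → ['Herman Melville']
```

The chained lookup is the key idea: because the object of one triple can be the subject of another, records dissolve into clusters of linked statements.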
Persistent Identifiers (PIDs) are an increasingly important tool and technology that enable new services for research. They are attached to outputs, grant funding, people and more. They provide a way to connect data, improve accuracy, and help the flow of information. Researchers can benefit from time savings, and more accurate attribution (think better citation data…) and institutions can gain efficiency savings and a better understanding of their research portfolio.
This webinar will provide an overview of the current PID landscape and will offer guidance on how PIDs for people (ORCID iDs) can be integrated in your systems, helping your researchers to take advantage of new and emerging services.
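As one concrete example of working with PIDs: the final character of an ORCID iD is a check character computed with the ISO 7064 MOD 11-2 algorithm described in ORCID's documentation. A minimal validator, as a sketch, might look like this (0000-0002-1825-0097 is the sample iD used in ORCID's own docs):

```python
def orcid_checksum_ok(orcid_id: str) -> bool:
    """Validate the final check character of an ORCID iD using the
    ISO 7064 MOD 11-2 algorithm."""
    digits = orcid_id.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for ch in digits[:-1]:
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    expected = "X" if result == 10 else str(result)
    return digits[-1].upper() == expected

print(orcid_checksum_ok("0000-0002-1825-0097"))  # → True
print(orcid_checksum_ok("0000-0002-1825-0096"))  # → False
```

A check like this catches transcription errors before an iD is stored, one small way systems integration keeps citation and attribution data accurate.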
The British Library was one of the first national libraries to create and offer linked data in 2011 as part of its wider open data strategy. Since that point the organisation has gained considerable experience of the issues involved in the development and maintenance of a sustained linked data service.
This presentation addresses:
- Why are libraries interested in offering linked data?
- What are some of the basic concepts involved in linked data?
- How can linked data be created from library MARC data?
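A hedged sketch of that last step, MARC to linked data: the record below is a hand-written stand-in for parsed MARC (in practice a parser such as pymarc would supply it), and the field-to-predicate mapping is illustrative rather than a published standard.

```python
# Illustrative MARC-like record: tag -> subfield code -> value.
marc_record = {
    "001": "bib123",                      # control number
    "100": {"a": "Austen, Jane"},         # main entry, personal name
    "245": {"a": "Pride and Prejudice"},  # title statement
}

# Hypothetical mapping from (tag, subfield) to an RDF predicate.
FIELD_TO_PREDICATE = {
    ("100", "a"): "dcterms:creator",
    ("245", "a"): "dcterms:title",
}

def marc_to_triples(record: dict) -> list:
    """Emit (subject, predicate, object) triples for mapped MARC fields."""
    subject = "ex:bib/" + record["001"]
    triples = []
    for tag, subfields in record.items():
        if not isinstance(subfields, dict):
            continue  # skip control fields with scalar values
        for code, value in subfields.items():
            predicate = FIELD_TO_PREDICATE.get((tag, code))
            if predicate:
                triples.append((subject, predicate, value))
    return triples

print(marc_to_triples(marc_record))
# → [('ex:bib/bib123', 'dcterms:creator', 'Austen, Jane'),
#    ('ex:bib/bib123', 'dcterms:title', 'Pride and Prejudice')]
```

Production conversions are far richer (authority links, language tags, datatypes), but the core move is the same: each mapped field becomes one statement about the bibliographic resource.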
Elizabeth Gadd, Research Policy Manager - Loughborough University
Journal publications currently play a significant role in research assessment: in rankings, REF and recruitment. They are assessed by quantity, citedness, the journal they appear in, or peer review. However, assessing journal publications can be difficult and can drive wider system failures such as questionable research practices, rising publication costs, and delays. Increasingly there are calls to evaluate a broader diversity of roles, inputs, processes, and outputs through our research assessments but it’s not clear to what extent these approaches truly value a wider range of research 'qualities'.
This participative session will explore the appropriate place of journal publications in research and researcher assessment with the assistance of the INORMS SCOPE framework.
Presentation from a University of York Library workshop on bibliometrics. The session covers how published research outputs are measured at the article, author and journal level; with discussion of the limitations of a bibliometric approach.
Todd Carpenter's presentation at the 3:AM conference in Bucharest, Romania on September 29, 2016 describing the NISO Alternative Assessment Project final output and next steps.
Leveraging a Library CMS and Social Media to promote #openaccess (OA) to inst... (Nick Sheppard)
The confluence of various technologies and Open Access (OA) initiatives makes it easy to share research outputs via social media and to assess the reach and impact of dissemination. The Library at Leeds Beckett utilises LibGuides as our CMS and supports the institutional research management infrastructure comprising Symplectic Elements and EPrints, and we have developed a dedicated series of LibGuides around selected themes comprising a range of relevant information, including institutional research outputs. For World Diabetes Day, for example, we curated a collection of research outputs and utilised the Elements API to display a date-ordered list of citations including, where available, links to author versions, self-archived and openly accessible in EPrints, alongside an embedded Twitter feed from @WDD, the official Twitter account of the campaign from the International Diabetes Federation. The page was disseminated via Twitter from accounts operated by the Library, @BeckettLibrary and @BeckettResearch, including targeted tweets to @WDD and individual academics. With over 4,500 and 1,500 followers respectively, these accounts are well subscribed, and the page received several retweets. The guide, whilst highlighting and strengthening the role of the library as a tool for researchers, was also an advocacy tool to engage academics in OA. This paper will explore the context and technology of this initiative and present data from Twitter analytics and so-called "altmetrics" as a means of visualising how research is shared and disseminated online; these are potential indicators of impact beyond the traditional readership of scholarly material, especially in conjunction with OA.
This session will demystify (generative) AI by exploring its workings as an advanced statistical modelling tool (suitable for any level of technical knowledge). Not only will this session explain the technological underpinnings of AI, it will also address concerns and (long-term) requirements around ethical and practical usage of AI. This includes data preparation and cleaning, data ownership, and the value of data generated, but not owned, by libraries. It will also discuss the potential of (hypothetical) use cases for AI in collections environments and for making collections data AI-ready, providing examples of AI capabilities and applications beyond chatbots.
Christina Dinh Nguyen, University of Toronto Mississauga Library
In the world of digital literacies, liaison and instructional librarians are increasingly coming to terms with a new term: algorithmic literacy. No matter the liaison or instruction subject – computer science, sociology, language and literature, chemistry, physics, economics, or other – students are grappling with assignments that demand a critical understanding, or even use, of algorithms. Over the course of this session, we'll discuss the term 'algorithmic literacies' (AL), explore how it fits into other digital literacies, and see why, as a curriculum, it might belong at your library. We'll also look at some examples of practical pedagogical methods you can implement right away, depending on what types of AL lessons you want to teach and who your patrons are. Lastly, we'll discuss how librarians should view themselves as co-learners when working with AL skills. This session seeks to bring together participants from across the different libraries, with diverse missions, visions and mandates, to explore ways we can all benefit from teaching AL. If time permits, we may discuss how text and data librarians (functional specialists) can support the development of this curriculum.
David Pride, The Open University
In this paper, we present CORE-GPT, a novel question-answering platform that combines GPT-based language models and more than 32 million full-text open access scientific articles from CORE. We first demonstrate that GPT3.5 and GPT4 cannot be relied upon to provide references or citations for generated text. We then introduce CORE-GPT which delivers evidence-based answers to questions, along with citations and links to the cited papers, greatly increasing the trustworthiness of the answers and reducing the risk of hallucinations.
Cath Dishman, Cenyu Shen, Katherine Stephan
Although scholarly communication has become more open, problems with predatory and problematic publishers remain. There are commercial providers of lists, start-up/renegade internet lists of good and bad publishers, and the researchers, publishers and assessors who try to understand what being on or off a list means for themselves, their careers and their institutions. Still, these problems persist and leave many asking: where is the list?
This plenary panel will discuss the problems of “predatory” publishing and what, if anything, publishers, our community and researchers can do to try and help minimise their prevalence and impact.
Beth Montague-Hellen, Francis Crick Institute; Katie Fraser, University of Nottingham
Open Access is a foundational topic in Scholarly Communications. However, when information professionals and publishers talk about its future, it is nearly always Gold open access we discuss. Green was seen as the big solution for providing access to those who couldn’t afford it. However, publishers have protested that Green destroys their business models. How true is this, and are we even all talking the same language when we talk about Green?
Chris Banks, Imperial College London; Caren Milloy, Jisc
Transitional agreements (TAs) were developed in response to funder policy and institutional demand to constrain costs and facilitate funder compliance. They have since become the dominant model by which UK research outputs are made open access. In January 2023, Jisc instigated a critical review of TAs and the OA landscape to provide an evidence base to inform a conversation on the desired future state of research dissemination. This session will discuss the key findings of the review and its impact on a sector-wide consultation and concrete actions in the UK and beyond.
Michael Levine-Clark, University of Denver, Jason Price, SCELC Library Consortium
As transformative agreements emerge as a new standard, it is critical for libraries, consortia, publishers, and vendors to have consistent and comprehensive data – yet data around publication profiles, authorship, and readership has been shown to be highly variable in availability and accuracy. Building on prior research around frameworks for assessing the combined value of open publishing and comprehensive read access that these deals provide, we will address multi-dimensional perspectives to the challenges that the industry faces with the dissemination, collection, and analysis of data about authorship, readership, and value.
Hylke Koers, STM Solutions
Get Full Text Research (GetFTR) launched in 2020 with the objective of streamlining discovery and access of scholarly content in the many tools that researchers use today, such as Dimensions, Semantic Scholar, Mendeley, and many others. It works equally well for open access content as it does for subscription-based content, providing researchers with recognizable buttons and indicators to get them to the most up-to-date version of content with minimal effort. Currently, around 30,000 OA articles are accessed every day via GetFTR links.
Gareth Cole, Loughborough University, Adrian Clark, Figshare
Researchers face more pressure than ever to share their research data, owing to a rise in funder policies and momentum towards greater openness across the research landscape. Although policies for data sharing are in place, librarians undertake engagement work to ensure repository uptake and compliance.
We will discuss a particular strategy implemented at Loughborough University that involved the application of conceptual messaging frameworks to engagement activities in order to promote and encourage use of our Figshare-powered repository. We will showcase the rationale behind the adoption of messaging frameworks for library outreach and some practical examples.
Mark Lester, Cardiff Metropolitan University
This talk will outline how a completely accidental occurrence led to brand-new avenues for open research advocacy and reasons for being. This advocacy has occurred within student communities such as trainee teachers, student psychologists and (especially) those soon losing access to subscription-based library content. Alongside these new forms of advocacy, these ethical examples of AI use cases have begun to form a cornerstone of directly connecting the work of the library to new technology.
Simon Bell, Bristol University Press
The UN SDG Publishers Compact, launched in 2020, was set up to inspire action among publishers to accelerate progress to achieve the Sustainable Development Goals by 2030, asking signatories to develop sustainable practices, act as champions and publish books and journals that will “inform, develop and inspire action in that direction”.
This Lightning Talk will discuss how our new Bristol University Press Digital has been developed as part of our mission to contribute a meaningful and impactful response to this call to action as well as the global social challenges we face.
Using thematic tagging to create uniquely curated themed eBook collections around the Global Social Challenges, Bristol University Press Digital responds directly to the need to provide the scholarly community with access to a comprehensive range of SDG-focused content, while minimising the time and resource required at the institution end to collate content and maintain collection relevance to rapidly evolving themes.
Jenni Adams, University of Sheffield, Ric Campbell, University of Sheffield
Academic researchers are becoming increasingly aware of the need to make data and software FAIR in order to support the sharing and reuse of non-publication outputs. Currently there is still a lack of concise and practical guidance on how to achieve this in the context of specific data types and disciplines.
This presentation details recent and ongoing work at the University of Sheffield to bridge this gap. It will explore the development of a FAIR resource with specialist guidance for a range of data types and will examine the planned development of this project during the period 2023-25.
Tasha Mellins-Cohen, COUNTER & Mellins-Cohen Consulting; Joanna Ball, DOAJ; Yvonne Campfens, OA Switchboard; Adam Der, Max Planck Digital Library
Community-led organizations like DOAJ (Directory of Open Access Journals), COUNTER (the standard for usage metrics) and OA Switchboard (information exchange for OA publications) are committed to providing reliable, not-for-profit services and standards essential for a well-functioning global research ecosystem. These organizations operate behind the scenes, with low budgets and limited staffing – no salespeople, marketing teams, travel budgets, or in-house technology support. They collaborate with one another and with bigger infrastructure bodies like Crossref and ORCID, creating the foundations on which much scholarly infrastructure relies.
These organizations deliver value through open infrastructure, data and standards, and naturally services and tools have been built by commercial and not-for-profit groups that capitalize on their open, interoperable data and services – many of which you are likely to recognize and may use on a regular basis.
Hear from the Directors of COUNTER, DOAJ and OA Switchboard, as well as a library leader, on the role of these organizations, the challenges they face and why support from the community is essential to their sustainability.
Camille Lemieux, Springer Nature
What is the current state of diversity, equity, and inclusion in the scholarly publishing community? It's time to take a thorough look at the 2023 global Workplace Equity (WE) Survey results. The C4DISC coalition conducted the WE Survey to capture perceptions, experiences, and demographics of colleagues working at publishers, associations, libraries, and many more types of organizations in the global community. Four key themes emerged from the 2023 results, which will be compared to the findings from the first WE Survey conducted in 2018. Recommendations for actions organisations can consider within their contexts will be proposed and discussed.
Rob Johnson, Research Consulting
Angela Cochran, American Society of Clinical Oncology
Gaynor Redvers-Mutton, Biochemical Society
Since 2015, the number of self-published learned societies in the UK has decreased by over a third, with the remaining societies experiencing real-term revenue declines. All around the world, society publishers are struggling with increased competition from commercial publishers and the rise of open access business models that reward quantity over quality. We will delve into the distinctive position of societies in research, examine the challenges confronting UK and US learned society publishers, and explore actionable steps for libraries and policymakers to support the continued relevance of learned society publishers in the evolving scholarly landscape.
Simon Bell, Clare Hooper, Katharine Horton, Ian Morgan
Over the last few years we have witnessed a seismic shift in the scholarly ecosystem. Three years on from the outset of the COVID pandemic and the establishment of the UN SDG Publishers Compact, this discussion-led presentation will look at how four UK university presses have adopted a consultative and collaborative approach on projects to support their institutional missions and engage with the wider scholarly community, while building on a commitment to make a meaningful difference to society.
This panel discussion will combine the perspectives of four UK-based university presses, all with distinct identities and varied publishing programmes drawn from the humanities, arts and social sciences, yet with a shared recognition of the importance of collaborating and co-operating on a shared vision: to support accessibility and inclusivity within the wider scholarly community and to maintain a rich bibliodiversity.
While research support teams are generally small and specialist in nature, increased demand for their services has been observed across the sector. This is particularly true for teaching-intensive institutions. As a pilot to expand research support across ARU library, the library graduate trainee was seconded to the research services team for a month. This dialogue between the former trainee and their manager will discuss the experience and outcomes of the secondment from different perspectives. The conversation will also explore the exposure Library and Information Studies students have to research services throughout their degree.
TIM FELLOWS & EMILY WILD, Jisc
Octopus.ac is a UKRI funded research publishing model, designed to promote best practice. Intended to sit alongside journals, Octopus provides a space for researcher collaboration, recording work in detail, and receiving feedback from others, allowing journals to focus on narrative.
The platform removes existing barriers to publishing. It's an entirely free, open space for researchers, without editorial and pre-publication peer review processes. The only requirement for authors is a valid ORCID iD. Without barriers, Octopus must provide feedback mechanisms to ensure the community can self-moderate. During this session, we'll explore Octopus' aims to foster a collaborative environment and incentivise quality.
David Parker, Publisher and Founder, Lived Places Publishing
Dr. Kadian Pow, Lecturer in Sociology and Black Studies & LPP Author, Birmingham City University
Natasha Edmonds, Director, Publisher and Industry Strategy, Clarivate
Library patrons want to search for and locate authors by particular identity markers, such as gender identification, country of origin, sexual orientation, nature of disability, and the many intersectional points that allow an author to express a point of view. Artificial intelligence, skilled web researchers, and data scientists in general struggle to achieve accuracy on single identity markers, such as gender. And what right does anybody have to affix identity metadata to an author other than the author themselves? And what of the risks in disseminating author identity metadata in electronic distribution platforms and in library catalog systems? Can a "fully informed" author even imagine all the possible misuses of their identity metadata?
3. University rankings
Academic Ranking of World Universities (Shanghai)
QS World University Rankings
Times Higher Ed World University Rankings
Webometrics
… various others
4. REF 2014
36 Units of Assessment
Research Outputs: 65%
Impact: 20%
Environment: 15%
5. Funders, eg RCUK
RCUK’s Research Outcomes System (ROS)
An evidence base: reports to Government, the public and other organisations.
Publications
Other outputs, e.g. new materials, exhibitions and websites
Staff development
Collaborations and partnerships
Communication and dissemination activities
Summaries of impact
6. Institution / Head of Department
recruitment stage / performance review
Looking for:
1. Number of recent (quality) publications
2. Grant income
3. PhD students supervised
4. Things measured by funders & important rankings
Snowball Metrics project
Showcasing their researchers
7. Researchers for themselves / as peers
1. What to put on CV/in performance review documents.
2. What should they do to reach their target audience? To enhance citations?
3. Peer review for journals (and on REF panels)
4. Editors of journals review the reviewers!
9. Journal title: short-hand for quality?
Cachet, because of:
History.
Rigorous peer review & editing.
Attract many submissions & choose the best.
Circulation/audience.
Investment in a high profile.
Good indexes/discovery tools.
10. Journal info sources & rankings
Ulrich’s Periodicals Directory
DOAJ
JCR – 2/5-year IF, Immediacy Index, Cited Half-Life, Eigenfactor.
SJR & SNIP
Google Scholar’s “Metrics”
European Reference Index for the Humanities
Harzing.com
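The two-year Impact Factor listed in JCR is a simple ratio: citations received in year Y to items the journal published in the two preceding years, divided by the number of citable items it published in those years. A minimal sketch with hypothetical figures (all numbers below are illustrative, not real JCR data):

```python
# Two-year Journal Impact Factor, sketched with made-up counts.
# IF(Y) = cites in Y to items from Y-1 and Y-2
#         / citable items published in Y-1 and Y-2

citations_received = {2009: 210, 2010: 150}  # cites counted in 2011, by publication year
citable_items = {2009: 120, 2010: 110}       # articles/reviews published per year

def two_year_impact_factor(cites, items):
    """Sum citations and citable items over the two-year window."""
    return sum(cites.values()) / sum(items.values())

print(round(two_year_impact_factor(citations_received, citable_items), 3))  # 1.565
```

The five-year IF works the same way over a five-year window; the Immediacy Index uses only the current year's citations and items.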
11. Article level measures
Sorting of results by citations.
Display of the number of visitors/downloads/citations per article.
Reviews and comments by readers: on the journal site/on blogs/SSRN/Twitter, etc.
Likes & bookmarks.
Scores or ratings by readers?
News/media coverage.
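Sorting results by article-level metrics is the kind of search tactic librarians can demonstrate. A small sketch with hypothetical records (titles and counts are invented for illustration):

```python
# Rank search results by citations, using downloads to break ties.
articles = [
    {"title": "Paper A", "citations": 12, "downloads": 340},
    {"title": "Paper B", "citations": 30, "downloads": 150},
    {"title": "Paper C", "citations": 12, "downloads": 900},
]

# Highest citations first; among equal citation counts, most downloads first.
ranked = sorted(articles,
                key=lambda a: (a["citations"], a["downloads"]),
                reverse=True)

print([a["title"] for a in ranked])  # ['Paper B', 'Paper C', 'Paper A']
```

The choice of tie-breaker matters: downloads and citations measure different things, so a display should make clear which metric drove the ordering.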
12. (screenshot slide: a researcher profile displaying author IDs, h-indexes and collaboration networks)
13. (screenshot slide: article-level metrics on a PLoS article, via the Altmetric tool)
14. Librarians as...
Repository/CRIS managers: a source of info. for others
Guides to Library subscription sources: citation data, publishing patterns & trends
Expert advisers on alternative metrics and how they are best used
15. The future?
An article-level economy.
Availability of altmetrics will support interest in other kinds of outputs than journal articles.
My title could just as easily be, “Why are librarians relevant to research evaluation?”
LIBRARIANS: They use citation data from sources like Thomson Reuters Web of Knowledge (TR WoK) and Elsevier’s Scopus. These sources are bought and provided by the Library. Management might ask Librarians how to make the most of these tools, or indeed be interested to hear from Librarians how the tools can be used to analyse and predict performance in elements of such rankings.
Becoming more sophisticated: there is already talk about what will happen in REF 2020!
1. Research Outputs (65%): assessed for ‘originality, significance and rigour’ with reference to international research quality standards. Panels will also recognise the significance of outputs beyond academia wherever appropriate.
2. Impact (20%): the reach and significance of research on the economy, society and/or culture. This specifically excludes impact on, or advancement of, academic knowledge within the HE sector.
3. Environment (15%): the ‘vitality and sustainability’ of the research environment, including its contribution to the wider discipline or research base.
(ERA 2012 plays a similar role in Australia.)
LIBRARIANS: Research outputs reporting often comes from the institutional repository or CRIS, which is sometimes run by the Library on behalf of the institution. Library provision is also relevant to the Environment statements. Researchers are looking for advice about “impact”, as this is new: an opportunity for the Library? Especially for those who understand social media and teach its use to students: they can adapt this advice for researchers.
The US National Science Foundation, for example, has already begun requesting “products” rather than “articles” in the biographical-sketch section of grant applications (see this Nature article: http://www.nature.com/nature/journal/v495/n7442/full/495437a.html). Also at stake: further funding, and impact in the sense of recognition for researchers’ work and exploitation of it by others.
LIBRARIANS: The repository/CRIS links with the research office and with reporting to ROS, and is likely to draw on the Library’s metadata expertise in recording and sharing data. This is similar to the role in support of REF reporting. (One source of data on collaborations and partnerships is co-authorship, and many Library-subscribed products will help to track such partnerships and thus be of use to those who report to funders.)
Snowball Metrics: Elsevier is involved, along with 8 UK universities. It is a recipe book of methodologies. E.g., for No. 2 a metric might count the number of applications made, the number of awards, or the level of income: they have chosen both number and value of awards, by year and by quarter, and both flat and per FTE at the institution, as a metric that would allow benchmarking between participating institutions.
LIBRARIANS: No. 1: the source of data might be the repository or CRIS, managed by the Library. The quality of the publications, and how to measure it, is a big topic! (No. 2: we host workshops… ) (No. 3: librarians support PhD students.) No. 4: one source of data on collaborations and partnerships is co-authorship, and many Library-subscribed products will help to track such partnerships. Look for internal partnerships, international collaborations, disciplinary collaborations, etc.: identify good practice, grow such practice.
LIBRARIANS: 1) Offer training sessions on the h-index and its ilk, and how to look it up. Might need to consider whether to display metrics on articles in the institutional repository. How to tell a good story about your research! 2) Help researchers to think about a “publication strategy” and which elements are important to them: impact factors and where to find them, along with the other measures available at journal-title level, and other characteristics of a journal that might interest them. Librarians know journals and have been evaluating them for years, albeit for a different purpose! 3) REF panel members in the sciences will get citation data about the outputs: therefore understanding citation data is an important skill for a researcher.
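For a training session on the h-index, it can help to show exactly how it is computed: a researcher has h-index h if h of their papers have each been cited at least h times. A minimal sketch, using an invented list of citation counts:

```python
# Hypothetical citation counts for one researcher's papers (illustrative only).
citations = [25, 8, 5, 3, 3, 1, 0]

def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:   # the paper at this rank still clears the threshold
            h = rank
        else:
            break
    return h

print(h_index(citations))  # 3: three papers have at least 3 citations each
```

Walking through the sorted list makes the metric's quirks visible: the 25-citation paper counts no more toward the h-index than the 5-citation one, which is exactly the kind of limitation a training session should point out.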
Most of those relevant to Librarians relate to measuring the quality of journals or outputs.
REF panels: do they read every output? Are they predisposed to like certain articles because of the significant journal each appears in? Journal brands: “Nature, Science, The Lancet”… What if the best research was not in the article in Nature but somewhere less well known yet academically prestigious: which gets put forward for the RAE/REF? Audience: English language? Open Access? Just because people can read it on the web doesn’t mean that they will! A recent blog article by a researcher who published in PLoS ONE, talking through his decision not to publish there again for a while, indicates that researchers do care about the brand and reputation of a journal, but that one journal title might mean different things to researchers of different generations.
LIBRARIANS: Explaining the different types of peer review and what acceptance/rejection rates mean. Which journal indexing sites should the journal appear on? Not every journal has such a clearly recognisable brand, so how do you differentiate?
LIBRARIANS: Classic territory! Find quality information sources and help researchers to use them! Not all journal titles have an IF. How to look up Impact Factors and other measures in these tools. The subject categories make a difference. “Data should tell a story”: are the journals increasing their IF year on year? Understanding citation measurement’s strengths and weaknesses is important to using such tools.
SSRN = Social Science Research Network! These things are otherwise known as “altmetrics”: alternatives to citation metrics, or “measures of scholarly impact mined from activity in online tools and environments” (Jason Priem). Mendeley scores, ResearchGate scores: proprietary, and not much better than citation measurement?! They are open to manipulation, not verifiable, unclear about how the measures are produced, and lack an authoritative source. But is citation measurement any better?!
Potential: a role in Impact assessment. Good if you want to measure an open, collaborative approach to research? As well as, not instead of, other measures. (The IRUS project collates article download stats.) Not only for articles: relevant for scholarly blogs/tweets and other kinds of output. Downloads for internal views and analysis only: the basis of awards/privileges, e.g. Springer OA. Or shared with authors only, e.g. repository stats?? A role in post-publication peer review systems? National Information Standards Organization (NISO) involvement… long term.
LIBRARIANS can recommend such sorting to researchers when searching. Journals, repositories and authors can all display metrics at this level.
Displays his author IDs (more to come!). More than one h-index. Microsoft Academic Search. The left-hand side shows his networks of collaboration, grants and other information.
PLoS ONE and other PLoS titles are often given as an example of this. The Altmetric tool provides the data for this; it is described as a measure of the “social impact of scholarly literature”. Other tools can be found on the altmetrics.org website.
How are Librarians relevant to Researcher Performance measurement?
This really answers the question, “How is researcher performance measurement relevant to Librarians?” Article level, in an OA mega-journal? Readers will need to evaluate the articles, and to understand the metrics displayed about them: LIBRARIANS as guides. Other kinds of outputs: how are these presented, collated/acquired, discovered and used? LIBRARIANS as collectors and as guides. The publishing landscape is changing and researchers do get worked up. Librarians with technical expertise and answers are reassuring to them.