Standards to facilitate information exchange have always been a subject of concern.
To provide a flexible exchange format that could be used for converting data from libraries and information services of all types, UNESCO developed the Common Communication Format (CCF). The main aim of this format was to produce a method of organising bibliographic descriptions which could be exchanged between institutions. This format was to act as a link between the databases produced in different internal formats of libraries.
CLASSIFIED CATALOGUE
Entries in Classified Catalogue
Different Sections of the Main Entry
Main Entry (Back Portion): Tracing Section
These notes may be of the following kinds:
The following kinds of added entries are formed in a classified catalogue:
These entries can be of the following types:
There are many kinds of cross-reference index entries, of which the following five are the main ones:
References
Presentation on "Cataloguing" delivered during a training workshop in library science for the staff of Muktangan school libraries, organised by the Muktangan School Teacher Reference Library, Mumbai, on 15 November 2010.
The Anglo-American Cataloguing Rules were revised as AACR2 to acquire international adaptability.
AACR1: Anglo-American Cataloging Rules. North American text. Chicago: American Library Association, 1967.
AACR1, Chap. 12: Anglo-American Cataloging Rules. North American text. Chapter 12, revised. Chicago: American Library Association, 1975.
AACR2: Anglo-American Cataloguing Rules. 2nd ed. Chicago: American Library Association, 1978.
This presentation covers Z39.50, a protocol used for information retrieval, and is useful for library science students. A list of different types of protocols is given at the end.
Introduction to MARC
History (MARC to MARC 21)
Why MARC 21/Need of MARC 21
Characteristics
Design principles for MARC 21
MARC 21 Documentation
MARC 21 Record System
MARC 21 Communication formats
MARC 21 Format for Bibliographic Data
Components of a bibliographic record
Communication Standard
Mapping of MARC 21
MARC 21 Translation
Maintenance Agency
MARC 21 Regulation
Advantages of MARC 21
Problems with MARC 21
Future of MARC 21
Canons of cataloguing are the specific normative principles applicable to cataloguing, that is: drafting a catalogue code, including the formulation of each rule, and interpreting the rules to meet new situations brought about by a particular document or by changes in the practice of book production.
A comparative analysis of library classification systems, by Ali Hassan Maken
We use classification at every moment of life, intentionally or unintentionally. Classification has always been the backbone of all library operations; without it, a library would suffer in its resources, and finding a particular piece of information in an unorganized heap of knowledge would be almost impossible. Library classification is the core instrument for organizing and retrieving the documents stored in a library. In the present era, classification schemes are navigation tools for locating and retrieving documents more precisely and relevantly. The electronic versions of the DDC, the UDC, and other classification schemes make it possible to realize the potential of library classification to improve subject retrieval.
Features of the Dewey Decimal Classification. The UDC is peculiar in the sense that it combines both enumerative and analytical approaches.
An introductory presentation on the concept of Library Classification by Dr. Keshava, Professor, Department of Studies and Research in Library and Information Science, Tumkur University, Karnataka, INDIA.
Classified Catalogue Code (CCC), by S.R. Ranganathan. Covers information systems, OPACs, database management systems (DBMS), and card and online catalogues, and emphasises the need to develop computer-based library information systems and services. It describes database technology, kinds of databases, database management systems, computerised library information systems, and management information systems. It covers in detail database design and the compatibility of cataloguing codes for developing databases for computer-based library information systems.
Presented at the seminar Libraries and the Semantic Web: the role of International Standard Bibliographic Description (ISBD), National Library of Scotland, Edinburgh, 25 Feb 2011
The FAO Open Archive: Enhancing Access to FAO Publications Using Internationa..., by Romolo Tassone
To improve the effectiveness of the proposed FAO repository, it is necessary to streamline the current electronic publishing workflow. The merger of the two systems will strengthen FAO's role as a knowledge-dissemination organization, especially since one of the principal tasks of the FAO is to efficiently collect and disseminate information regarding food, nutrition, agriculture, fisheries, and forestry.
We would like to thank Anne Aubert, Johannes Keizer, Giorgio Lanzarone, Romolo Tassone and Jim Weinheimer for their valuable contributions.
Bibliographical information plays an important role in information retrieval for the research community, particularly in the fields of science and technology. But certain problems arise during bibliographic information exchange, more so when the interchange is on magnetic tape or CD-ROM. International organisations such as UNESCO/PGI, UNISIST, ICSU, IFLA, and ISO have taken many steps towards the standardisation of bibliographic exchange formats. The process of standardisation follows a set of codes given by the International Organization for Standardization (ISO).
FRBR (Functional Requirements for Bibliographic Records) is a 1998 recommendation of the International Federation of Library Associations and Institutions (IFLA) to restructure catalog databases to reflect the conceptual structure of information resources.
This presentation was provided by Bobbi Patham of Springer Nature, as part of the NISO Standards Update on "KBART (Knowledge Bases and Related Tools)" held during ALA Annual on June 25, 2023.
Jabes 2011, plenary session of 18 May: "The new Italian cataloguing code REICAT (Regole italiane di catalogazione): applications and provisions within the framework of the SBN (Servizio Bibliotecario Nazionale) union catalogue", by Cristina Magliano, Central Institute for the Union Catalogue of Italian Libraries and for Bibliographic Information (ICCU, Italy), presented at the Journées Abes 2011 (ABES).
UiTM IM110 IMD253: Organization of Information (IMD253), individual assignment, by Kumprinx Amin
FINAL PROJECT INDIVIDUAL:
ANALYZE AND REPORT
TABLE OF CONTENTS
Z39.50: An information Retrieval Protocol
• Introduction
• History and Background
• Objective & Purpose
• Function
• Benefit
• Conclusion
MARC Standard
• Introduction
• History and Background
• Objective & Purpose
• Function
• Benefit
• Conclusion
4. INTRODUCTION
• CCF is a structured format for creating bibliographic records and for exchanging records between groups of information agencies and libraries.
• The main aim of this format was to produce a method of organising bibliographic descriptions which could be exchanged between institutions.
• This format was to act as a link between the databases produced in different internal formats of libraries.
5. DEFINITION
The Common Communication Format (CCF) is a structured format for creating bibliographic records and for exchanging them between libraries, information agencies, and abstracting and indexing services.
It is intended to make consistent record exchange possible between institutions whose databases use different internal formats.
6. BACKGROUND TO CCF
In April 1978 the UNESCO General Information Programme (UNESCO/PGI) sponsored an international symposium on bibliographic exchange formats.
• First edition of the CCF was published in 1984 under the editorship of Peter Simmons and Alan Hopkinson.
• Second edition: published in 1988.
• Third edition: published in two volumes, CCF(B) for bibliographic information and CCF(F) for factual information.
7. PURPOSE OF CCF
The CCF provides a standardized format for recording and exchanging bibliographic information. Its purpose is to ensure that bibliographic records are consistent across agencies and systems, making it easier for libraries, abstracting and indexing services, and other institutions to exchange records and collaborate effectively.
8. SCOPE AND USE OF CCF
The CCF is designed to provide a standard format for three major purposes:
1. To permit the exchange of bibliographic records between groups of libraries and the abstracting and indexing services.
2. To permit a bibliographic agency to use a single set of computer programs to manipulate bibliographic records received from both libraries and abstracting and indexing services.
3. To serve as the basis of a format for an agency's own bibliographic database, by providing a list of useful data elements.
9. STRUCTURE OF CCF
The record structure of the Common Communication Format is a specific implementation of the international standard ISO 2709. Each CCF record consists of four major parts:
1) Record label: a fixed-length label of 24 characters, including a record status code:
• a = new record
• b = replacement record
• c = deleted record
2) Directory (variable length), whose entries contain:
a) tag, b) length of data field, c) starting character position, d) segment identifier, e) occurrence identifier
3) Data fields: the variable-length fields themselves, each ended by a field separator.
4) Record separator: a single character marking the end of the record.
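The assembly of an ISO 2709-style record can be sketched in a few lines. The following is a minimal illustration, not a full CCF implementation: the tags and field values are invented for the example, the directory entries here carry only tag, length, and starting position (omitting the CCF's segment and occurrence identifiers), and the label is filled with blanks outside the record-length, status, and base-address positions.

```python
# Minimal sketch of an ISO 2709-style record: 24-character label,
# directory of (tag, length, start) entries, then the data fields.
FT = "\x1e"  # field terminator
RT = "\x1d"  # record terminator

def build_record(fields):
    """fields: list of (tag, value) pairs; returns the record as a string."""
    directory = ""
    data = ""
    for tag, value in fields:
        field = value + FT
        # directory entry: 3-char tag, 4-digit field length, 5-digit start
        directory += f"{tag:>3}{len(field):04d}{len(data):05d}"
        data += field
    directory += FT
    base_address = 24 + len(directory)           # where the data fields begin
    record_body = directory + data + RT
    record_length = 24 + len(record_body)
    # label: record length (5), status 'a' = new record, blank filler,
    # base address of data (5), blank filler up to 24 characters
    label = f"{record_length:05d}a{' ':6}{base_address:05d}{' ':7}"
    return label + record_body

record = build_record([("200", "Title proper"), ("300", "General note")])
print(record[:24])  # the 24-character record label
```

Reading such a record reverses the process: the first 5 characters of the label give the total record length, and the directory tells a parser where each field starts relative to the base address.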
10. GUIDELINES OF CCF
The CCF manual provides detailed guidelines for creating and interpreting records, covering the mandatory and optional data elements, how each element is tagged and recorded, and how records are structured for exchange. The purpose of these guidelines is to ensure that records created by different agencies remain consistent and can be exchanged reliably.
11. CONCLUSION
If two or more organizations wish to exchange records with one another, each of these organizations must agree upon a common standard format for exchange purposes.
Each must be able to convert from an internal-format record to an exchange-format record, and vice versa.
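The conversion step described above can be sketched as a simple round trip between an internal record and tagged exchange fields. This is a toy illustration: the internal field names and the tag numbers below are invented for the example and do not follow the CCF's actual tag definitions.

```python
# Toy sketch: map an internal-format record to tagged exchange fields
# and back. Field names and tag numbers are hypothetical.
TO_EXCHANGE = {"title": "200", "author": "300", "year": "440"}
TO_INTERNAL = {tag: name for name, tag in TO_EXCHANGE.items()}

def to_exchange(internal):
    """Internal dict -> list of (tag, value) exchange fields."""
    return [(TO_EXCHANGE[k], str(v)) for k, v in internal.items()
            if k in TO_EXCHANGE]

def to_internal(fields):
    """List of (tag, value) exchange fields -> internal dict."""
    return {TO_INTERNAL[tag]: value for tag, value in fields
            if tag in TO_INTERNAL}

rec = {"title": "Prolegomena to Library Classification",
       "author": "Ranganathan"}
fields = to_exchange(rec)       # ready to be packed into an exchange record
assert to_internal(fields) == rec  # the round trip preserves the record
```

The point of a common exchange format is that each agency only needs this one pair of mappings (internal to exchange and back), rather than a separate converter for every partner's internal format.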