This document discusses engaging researchers in research data management (RDM) through data reference interviews. It provides an overview of EDINA and the University of Edinburgh Data Library and their roles in assisting researchers. It then describes the data reference interview process, highlighting the importance of understanding the researcher's field and data. Recommendations are provided for interviewing researchers and tools for assessing data are introduced. The document concludes by discussing the University's RDM strategy and engagement tools.
IASSIST40: Data management & curation workshop – Robin Rice
The document summarizes Edinburgh DataShare, an open access data repository at the University of Edinburgh that supports the university's research data management policy. It stores a wide range of research data across disciplines. The repository uses the DSpace platform and is promoting open data, though getting some academics to deposit data can be challenging. It focuses on making metadata and data discoverable through various search tools and indexes. Basic quality assurance checks are performed during the self-deposit process.
‘Good, better, best’? Examining the range and rationales of institutional dat... – Robin Rice
Introduction to panel presentations from Universities of Edinburgh, Southampton, Yale, Cornell at IPRES 2015 conference, Chapel Hill, North Carolina, 3 Nov 2015
The document summarizes a workshop on geospatial metadata and spatial data. It discusses the importance of metadata for managing and sharing spatial datasets, providing key information about the data. It also covers metadata standards like FGDC, ISO 19115, and application profiles. The workshop includes presentations on the UK Academic Geospatial Metadata Application Profile and tools for creating metadata like the Geodoc Metadata Editor and Go-Geo portal.
Presented by Stuart Macdonald at the College of Science and Engineering "What's new for you in the Library" event, Murray Library, King's Buildings, University of Edinburgh, 28 May 2014.
Covers research data, research data management, funder policies and the University's RDM policy, RDM services and support, awareness raising, training, progress so far.
SDA (Survey Documentation and Analysis) is software that allows users to access and analyze numeric microdata from repositories without needing specialized statistical software. It generates descriptive and inferential statistics, and basic visualizations. SDA benefits researchers by providing statistical analysis capabilities and easy access to metadata. It benefits repositories by facilitating secondary use of data while protecting sensitive information. SDA shows the value of numeric data for teaching and research.
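The kind of output described above, descriptive statistics and frequency tables over numeric microdata, can be sketched in a few lines of Python. The response data here is invented, and this illustrates the style of output rather than SDA's actual interface:

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical numeric microdata: responses to a 1-5 agreement scale.
responses = [1, 2, 2, 3, 3, 3, 4, 4, 5, 5]

# Descriptive statistics of the kind an online analysis tool reports.
summary = {
    "n": len(responses),
    "mean": round(mean(responses), 2),
    "sd": round(stdev(responses), 2),
}

# A simple frequency table, the staple output for categorical survey data.
frequencies = Counter(responses)
print(summary)                              # {'n': 10, 'mean': 3.2, 'sd': 1.32}
print(dict(sorted(frequencies.items())))    # {1: 1, 2: 2, 3: 3, 4: 2, 5: 2}
```

Tools like SDA add inferential statistics and disclosure control on top of this, but the value to teaching is exactly this kind of immediate, software-free summary of deposited data.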
Research Data Management: Approaches to Institutional Policy – Robin Rice
This document summarizes research data management policies from several universities. It discusses the purpose statements, tones, roles and responsibilities outlined in the policies of universities in the UK, Australia, and US. The University of Edinburgh policy takes a partnership approach, sharing responsibilities between the university and researchers. It aims to support research excellence through managing data to high standards across the research lifecycle.
1) The University of Edinburgh drafted an 18-month Research Data Management Roadmap in August 2012 to address institutional research data management and comply with their RDM policy.
2) The Roadmap outlines governance, data management planning support, development of an active data infrastructure including a data store, and data stewardship services such as a data repository and registry.
3) Services under the Roadmap include tailored data management plan assistance, customizing an online DMP tool, infrastructure for storing and accessing research data, and a data repository for depositing and long-term management of completed research outputs.
The document summarizes a pilot project at the University of Edinburgh supporting the development of a UK Research Data Discovery Service. PhD interns engaged researchers from various schools to describe and deposit research datasets in the university's systems, to be harvested by the discovery service. Results were mixed: humanities researchers were less comfortable sharing data because of copyright concerns and a reluctance to share interpretations, while schools with established disciplinary repositories showed less interest in the university's system. Building research data management practice will require tailored approaches and sustained training over time.
February 18 2015 NISO Virtual Conference
Scientific Data Management: Caring for Your Institution and its Intellectual Wealth
Network Effects: RMap Project
Sheila M. Morrissey, Senior Researcher, ITHAKA
This document discusses the importance of preserving both research data and literature for future use. It quotes two scientists emphasizing the value of original data. It then makes three recommendations: 1) Include research literature as part of the record of science; 2) Make data available for future unknown uses; 3) Regard assured access to digital content as a grand challenge. Several organizations are working to archive e-journals and digital content to ensure long-term preservation and access.
February 18 2015 NISO Virtual Conference
Scientific Data Management: Caring for Your Institution and its Intellectual Wealth
Learning to Curate Research Data
Jennifer Doty, Research Data Librarian, Emory Center for Digital Scholarship, Emory University, Robert W. Woodruff Library
Practical and Conceptual Considerations of Research Object Preservation – SEAD
This document discusses research object (RO) frameworks for preserving digital research data. It addresses the challenges of research spanning long periods of time and involving complex, heterogeneous data that changes states. The research object framework aims to capture agents, states, relationships, and content to enable automation, reproducibility, and reuse of research. The framework defines three states for research objects - live, curated, and published. Live objects are works in progress, curated objects are packaged for preservation, and published objects are immutable and citable. The framework allows documentation of research processes and outputs to build trust and facilitate reuse.
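The three-state lifecycle described above can be modelled as a small state machine. The sketch below is illustrative only; the class and method names are invented, not SEAD's actual API:

```python
from enum import Enum

class ROState(Enum):
    LIVE = "live"            # work in progress, mutable
    CURATED = "curated"      # packaged for preservation
    PUBLISHED = "published"  # immutable and citable

# Legal lifecycle transitions: live -> curated -> published.
TRANSITIONS = {ROState.LIVE: ROState.CURATED, ROState.CURATED: ROState.PUBLISHED}

class ResearchObject:
    def __init__(self, title):
        self.title = title
        self.state = ROState.LIVE

    def advance(self):
        # Published objects are immutable, so they have no outgoing transition.
        if self.state not in TRANSITIONS:
            raise ValueError("published objects are immutable")
        self.state = TRANSITIONS[self.state]
        return self.state

ro = ResearchObject("field-sensor-dataset")
ro.advance()  # live -> curated
ro.advance()  # curated -> published
```

Making the transitions explicit is what lets a framework enforce, for example, that only curated packages can be published and that published objects never change.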
ESA14 Workshop on SEAD's Data Services and Tools – SEAD
This document provides an overview of the SEAD (Sustainable Environment Actionable Data) services and tools for data curation, preservation, and sharing. It outlines the SEAD workshop agenda, which demonstrates how to use project spaces to manage research data, metadata, and social features. It also describes how to publish and preserve data, connect with other researchers through profiles and a research network, and find data within a project space. The goal of SEAD is to provide secure, team-controlled spaces to manage research data throughout the data lifecycle and to promote sharing and discovery.
RDAP14 Poster: openICPSR: a public access repository for storing and sharing ... – ASIS&T
openICPSR is a service that provides public access to social and behavioral science research data. It aims to meet requirements for public access to federally funded data by enabling depositors to fulfill public access mandates. Researchers can deposit data through self-deposit, professional curation, or as part of a full topic archive. Deposits are preserved and accessible through openICPSR for at least 10 years. Fees help sustain the service and ensure long-term access and preservation of deposited data.
The DFC (DataNet Federation Consortium) project aims to federate data grids to enable collaboration. It uses iRODS to build a federated data grid that supports reproducible science, with workflows as first-class objects and provenance capture. The project focuses on interoperability by allowing iRODS grids to interface with other systems like DataONE, and it develops tools for data discovery, access, manipulation, transformation, subsetting, and visualization from workflows. Current work involves client-side tools for ingestion, access control, and integrated analysis, along with standards, policies, and repository management tools to support trustworthy and sustainable data curation practices.
Map Styling Tools and Interactive maps on the web with OpenLayers - Addy Pope... – JISC GECO
Presentation given as part of the DevCSI/JISC GECO Open Mapping Workshop which was held at the Electron Club, CCA, Glasgow on Thursday 25th August 2011. The event was connected to the OpenStreetMap State of the Map Scotland event.
Introduction to an ICT based cross curricular resource for BEd students at the University of Strathclyde. Delivered by Anne Robertson and Carol Blackwood at the School of Education, University of Strathclyde, on 5 October 2015.
This document discusses three outputs from a project exploring methods for authorizing access to web services through federated identity systems:
1. A modified Digimap production service that provides access to geospatial data via OGC standards using UK federation single sign-on for registered users.
2. Instructions for a simple method to control access to existing web services from non-browser clients without modifying the web service, using Apache and scripting.
3. A demonstration of deploying new Shibboleth delegation software to allow a JSP to access a web service after authenticating with an identity provider without additional logins.
The MANTRA project created open online learning materials in research data management for postgraduate students and early career researchers at the University of Edinburgh. The materials include eight units covering topics like data management plans, file formats, and sharing data. They are grounded in three academic disciplines and include practical exercises and video interviews with researchers. The university also approved a new research data policy to improve research data practices and support for data management across the institution in response to issues identified by previous projects. The policy establishes principles for managing data responsibly and providing training and services to implement best practices.
This document provides an overview of a geospatial metadata and spatial data workshop held at the University of Oxford. The workshop covered topics such as metadata standards, application profiles, geospatial metadata tools and portals for sharing spatial data and metadata. Hands-on sessions demonstrated how to create metadata using the Geodoc Metadata Editor tool and access spatial data repositories through the Go-Geo portal and ShareGeo open data portal.
The state of play currently with the preservation of all things webby and concrete actions to take. Delivered by Peter Burnhill at the ALSP event "Standing on the Digits of Giants: Research data, preservation and innovation" on 8 March 2015 in London.
Presented by Anne Robertson and Carol Blackwood for the Scottish Association of Geography Teachers in Perth, on 25 October 2014. An overview of some of the features of the online mapping tool for schools.
The document proposes the AGILE Data Access Initiative to address issues researchers face in accessing core geospatial data from National Mapping and Cadastral Agencies across Europe. Surveys of NMCAs and academic users found that while most NMCAs make data available, barriers include cost, licensing restrictions, and difficulties obtaining cross-border data. The initiative seeks to negotiate national agreements for academic access and develop reciprocal licenses and access controls to enable easier transnational research.
The MIMAS workshop discussed the RepositoryNet infrastructure and components including aggregation, text mining, search, benchmarking and statistics, registries, deposit tools, and metadata quality. It provided updates on components outside RepositoryNet like IRS Search and NAMES 2. A demonstration of IRUS showed its current functionality for benchmarking and statistics and future plans for funding, APIs, international scope, and business models. Developing service level agreements for RepNet services was also discussed.
This document provides an overview of a geospatial metadata and spatial data workshop. It discusses the importance of metadata for discovering and managing spatial datasets. It introduces common geospatial metadata standards like FGDC, ISO 19115, and INSPIRE and the concept of application profiles. The document outlines tools and resources for UK academics to create and publish metadata, including the UK AGMAP profile, Geodoc editor, GoGeo portal, and ShareGeo repository. Hands-on sessions demonstrate using these resources to generate metadata and access open spatial data.
The document reports on the progress of the IASSIST Latin Engagement Strategic Action Group. It summarizes the group's findings from surveying data professionals in Spain. It found that while data library roles are not prominent, interest in research data management is growing. The document recommends that IASSIST provide multilingual resources, training events in Spain, and opportunities for Latin American members to attend conferences to further engage Latin members.
This document discusses using social media to develop an academic profile and engage others in research. It defines social media as websites that allow contribution and connection. Examples include blogs, Twitter, YouTube, Facebook and LinkedIn. The benefits of social media are that it allows researchers to share their expertise, engage in dialogue, and potentially generate interest in their work. The document provides tips on which social media tools to use and how to plan an effective strategy, including considering goals, audience, and content. It also discusses maintaining privacy and professionalism online.
This document summarizes a seminar on data management for undergraduate researchers. It discusses what data is, why it needs to be managed, and key aspects of the data management process such as data organization, metadata, storage, and archiving. Topics covered include file naming best practices, version control, documentation, metadata standards, storage options, and long-term archiving. The goal is to help researchers organize and document their data so it can be understood, preserved, and reused.
The state of global research data initiatives: observations from a life on th... – Projeto RCAAP
The document discusses research data management and provides guidance on best practices. It defines research data management as the active management of data over its lifecycle. It recommends writing a data management plan to document how data will be created, stored, shared, and preserved. It also provides tips for making data accessible and reusable through use of metadata standards, documentation, open licensing, and depositing data in repositories with persistent identifiers. The goal is to help researchers manage and share their data effectively to increase access and reuse.
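As a concrete illustration of the documentation and persistent-identifier advice, here is a minimal dataset-level metadata record using Dublin Core-style element names. The record and its DOI are invented for illustration:

```python
import json

# A minimal dataset description using Dublin Core-style element names;
# all values, including the DOI, are placeholders.
record = {
    "title": "Example Survey of River Water Quality",
    "creator": "Doe, Jane",
    "date": "2016-03-01",
    "format": "text/csv",
    "license": "CC-BY-4.0",
    "identifier": "doi:10.0000/example",  # persistent identifier (placeholder)
    "description": "Monthly pH and nitrate readings, 2014-2015.",
}

# Serialising to JSON yields a documentation file that can travel
# alongside the data in a repository deposit.
metadata_json = json.dumps(record, indent=2, sort_keys=True)
print(metadata_json)
```

Even this much, deposited with the data, answers the reuse questions a repository catalogue needs: who made it, when, under what licence, and how to cite it persistently.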
Data Management for Undergraduate Researchers (updated - 02/2016) – Rebekah Cummings
This document summarizes a seminar on data management for undergraduate researchers. It discusses what data is, why it needs to be managed, and key aspects of effective data management including data organization, metadata, storage and archiving. Specific topics covered include creating data management plans, file naming conventions, structuring folders, describing data through codebooks and documentation, backup strategies, and long-term archival options. The goal is to help researchers organize and document their data so it can be understood and preserved over time.
This document provides an overview of data management best practices for graduate students presented in a workshop. It discusses what constitutes research data, the importance of managing data, how to create a data management plan, file naming conventions, metadata, data storage and backup strategies, and archiving options. The workshop covers topics like using a structured folder system, creating codebooks and documentation to describe data, and ensuring long-term access and preservation of research data. University librarians are available to help students with all aspects of responsible data management.
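The file naming advice in such workshops can be captured in a small helper. The pattern below (project_date_description_vNN) is one common convention, not a universal standard:

```python
import re
from datetime import date

def build_filename(project, description, version, ext, when=None):
    """Compose a name like 'soilsurvey_2016-02-01_ph-readings_v02.csv'.

    Encodes project, ISO date, a portable slug of the description,
    and a zero-padded version number, one common naming convention.
    """
    when = when or date.today()
    # Lower-case and hyphenate free text so names stay portable
    # across operating systems and avoid spaces.
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{project}_{when.isoformat()}_{slug}_v{version:02d}.{ext}"

name = build_filename("soilsurvey", "pH readings", 2, "csv", date(2016, 2, 1))
print(name)  # soilsurvey_2016-02-01_ph-readings_v02.csv
```

The payoff is that files sort chronologically in a directory listing and the version history is visible without opening anything.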
Lec20.pptx: Introduction to databases and information systems – samiullahamjad06
The document provides an overview of databases and information systems. It defines what a database is, how data is organized in a hierarchy from bits to files, and the different types of database models including hierarchical, network, and relational. It also discusses how structured query language and query by example are used to retrieve data in relational databases. Finally, it outlines different types of computer-based information systems used in organizations like transaction processing systems, management information systems, and decision support systems.
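A structured query language example makes the retrieval model concrete: you declare what to retrieve, not how. The sketch below uses Python's built-in sqlite3 module with an invented employee table:

```python
import sqlite3

# An in-memory relational database with one invented table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (name TEXT, dept TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employee VALUES (?, ?, ?)",
    [("Ada", "IT", 60000), ("Ben", "HR", 45000), ("Cho", "IT", 52000)],
)

# SQL retrieval: filter and order declaratively; the DBMS plans the access.
rows = conn.execute(
    "SELECT name FROM employee WHERE dept = ? ORDER BY salary DESC", ("IT",)
).fetchall()
print(rows)  # [('Ada',), ('Cho',)]
```

Query-by-example interfaces build the same declarative query from a filled-in form rather than typed SQL.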
Rebecca Raworth presented a workshop on research data management. The presentation covered:
- Why research data management plans are important, such as satisfying funder requirements and increasing research efficiency.
- Current requirements for data management plans in Canada.
- Tools for research data management, including Portage for creating data management plans and Dataverse for data storage and access.
- Best practices for organizing, documenting, storing and sharing research data, including using metadata standards, file naming conventions, and choosing appropriate data repositories.
Detailed slides on data resource management. The relationships among the many individual data elements stored in databases are based on one of several logical data structures, or models.
Data Analytics: HDFS with Big Data: Issues and Application – Dr. Chitra Dhawale
This document provides information about a course on data analytics. It outlines the course outcomes, which include developing scalable systems with Apache Hadoop, writing MapReduce applications, differentiating SQL and NoSQL, and analyzing and developing big data solutions using Hive and Pig. The document also describes some of the topics that will be covered in the course, including distributed file systems and their issues, an introduction to big data, characteristics and types of big data, and comparisons between traditional and big data approaches.
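The MapReduce model the course covers can be illustrated without Hadoop at all. Below is a minimal pure-Python word count following the classic map/shuffle/reduce stages; the function names are illustrative, not Hadoop's API:

```python
from collections import defaultdict

def map_phase(document):
    # Emit (word, 1) pairs, as the classic word-count mapper does.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group intermediate values by key, as the framework does between stages.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts for each word.
    return {key: sum(values) for key, values in grouped.items()}

docs = ["big data big ideas", "data beats opinions"]
pairs = [p for d in docs for p in map_phase(d)]
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 2, 'ideas': 1, 'beats': 1, 'opinions': 1}
```

In a real cluster each document list would be a partition of HDFS blocks and the map and reduce functions would run in parallel across nodes; the programming model is the same.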
The document provides guidance on early planning for data management, including becoming familiar with funder requirements, planning for the types and formats of data that will be created, designing a system for taking notes, organizing files through consistent naming schemes and use of folders, adding metadata to files to aid in documentation and discovery, and using RSS feeds to organize web-based information. It also touches on issues like plagiarism, data protection, intellectual property rights, and remote access to and backup of data.
Prerequisites of DBMS
Course objectives of DBMS
Syllabus
What is the meaning of data and database?
DBMS
History of DBMS
Different databases available in the market
Storage areas
Why learn DBMS?
People who work with databases
Applications of DBMS
Database management system lecture notes – UTSAHSINGH2
A DBMS manages an organized collection of interrelated data stored and retrieved digitally in a computer system. It uses SQL to allow users to define, create, maintain and control access to the database, and it consists of several components including a query processor, data dictionary, runtime database manager and data manager. The three-schema architecture separates the external, logical and physical levels to provide data independence and abstraction.
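The data independence promised by the three-schema architecture can be demonstrated with a SQL view: applications query an external view while the underlying logical schema remains free to change. A minimal sketch using Python's built-in sqlite3 module with an invented staff table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Logical schema: the base table, which applications need not see directly.
conn.execute("CREATE TABLE staff (name TEXT, dept TEXT, salary INTEGER)")
conn.execute("INSERT INTO staff VALUES ('Ada', 'IT', 60000)")

# External schema: a view exposing only non-sensitive columns.
conn.execute("CREATE VIEW staff_public AS SELECT name, dept FROM staff")

# Applications written against the view never touch the salary column,
# and survive changes to the base table that preserve the view.
rows = conn.execute("SELECT * FROM staff_public").fetchall()
print(rows)  # [('Ada', 'IT')]
```

Columns could later be added to or reorganised in `staff` without breaking any query written against `staff_public`: that insulation is what "logical data independence" means in practice.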
Tools and Techniques for Creating, Maintaining, and Distributing Shareable Me... – Jenn Riley
This document discusses tools and techniques for creating, maintaining, and distributing shareable metadata. It emphasizes that metadata should be structured to be understandable outside of local contexts and useful for other institutions. Key aspects of shareable metadata include using appropriate content and vocabularies, ensuring records are coherent, providing useful context, and conforming to standards. The document also outlines example workflows and considerations for making metadata shareable.
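One routine step toward shareable metadata is mapping local, idiosyncratic field names onto a standard vocabulary and adding the context that is implicit at home. A hedged sketch, in which the local field names and the mapping are invented:

```python
# Map local field names onto Dublin Core-style terms so records
# remain intelligible outside their home institution.
LOCAL_TO_DC = {
    "item_name": "title",
    "made_by": "creator",
    "year": "date",
}

def to_shareable(local_record, collection_context):
    # Rename known fields; pass unknown ones through unchanged.
    shared = {LOCAL_TO_DC.get(k, k): v for k, v in local_record.items()}
    # Shareable records must carry context that is implicit locally,
    # e.g. which collection the item belongs to.
    shared.setdefault("isPartOf", collection_context)
    return shared

record = to_shareable(
    {"item_name": "Glass plate negative, High Street",
     "made_by": "Unknown", "year": "1903"},
    "City Photographic Archive",
)
print(record)
```

Real workflows do this with crosswalks and XSLT or harvesting middleware rather than a dictionary, but the principle is the same: standard terms plus explicit context make a record coherent to a stranger.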
This document provides an overview of a workshop on good practice in research data management held at the University of Tartu, Estonia. The workshop covered various topics including defining research data, research data management and data management plans, organizing and documenting data, file formats and storage, metadata, security, and sharing and preserving data. The workshop was led by Stuart Macdonald from the University of Edinburgh and included presentations, introductions, and discussions around each of these research data management topics.
This document provides an overview and summary of key topics related to database design and management. It outlines the course contents, which include concepts of database management, database modeling, SQL, distributed databases, and database administration. It also discusses database terminology, the advantages of using a database management system (DBMS) compared to file-based systems, including improved data sharing and reduced redundancy. The components of a DBMS environment are identified as hardware, software, data, procedures, and people.
The document discusses database systems and provides an overview of key concepts. It begins with a brief history of databases, from early file-based systems to modern relational databases. It then defines what a database is, the components of a database system including data, software, hardware and users. The roles of different database users are identified. Database management systems are introduced as the software that allows users to store, organize, update and protect data.
The document provides information about a database course including:
1) An overview of the course content which covers database fundamentals, the relational model, normalization, conceptual modeling, query languages, and advanced SQL topics.
2) Details about the lecturer including their academic background and publications.
3) Assessment details for the course including exams, labs, and project work accounting for 100% of the grade.
This session covers topics related to data archiving and sharing. This includes data formats, metadata, controlled vocabularies, preservation, archiving and repositories.
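Preservation and archiving workflows of the kind covered here typically record a fixity checksum at deposit so that later copies can be verified. A minimal sketch using Python's hashlib; the file content is invented:

```python
import hashlib

def fixity(data: bytes) -> str:
    """Return a SHA-256 checksum, recorded at deposit and re-checked
    later to detect silent corruption of archived files."""
    return hashlib.sha256(data).hexdigest()

original = b"station,ph\nA,7.1\n"
stored_checksum = fixity(original)

# Years later: recompute and compare to verify the archived copy.
assert fixity(original) == stored_checksum

# Any change to the bytes, however small, changes the checksum.
tampered = b"station,ph\nA,9.9\n"
print(fixity(tampered) == stored_checksum)  # False
```

Repositories run exactly this comparison on a schedule (a "fixity audit") so that bit rot is detected while a clean replica still exists.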
The document provides an introduction to database management systems (DBMS). It defines what a database and DBMS are, and explains that a DBMS allows users to define, create, and manipulate databases for applications. It also discusses some key components of a DBMS environment, including software, hardware, data, procedures, and database access languages like SQL. The document compares traditional file-based data storage with DBMS approaches and outlines some benefits DBMS provide like reduced redundancy, improved data integrity and sharing, and increased accessibility.
A look at the research being carried out by Dr Stuart Dunn at Kings College London. This includes his work on rediscovering Corpse Paths in Great Britain.
The Land Cover Map 2015 (LCM2015) is a map of land cover classes across the UK produced every 5-10 years. It is based on classification of Landsat satellite imagery from the summer and winter and additional data layers. The LCM2015 contains over 7.5 million land parcels classified into 21 land cover classes. It is an important resource used widely in research, commercial, government and nonprofit applications related to agriculture, ecology, climate, planning and more.
A presentation by John Murray from Fusion Data Science given at EDINA's GeoForum 2017 about the use of Lidar Data and the technology and techniques that can be used on it to create useful datasets.
Slides accompanying the presentation:"Reference Rot in Theses: A HiberActive Pilot", a 10x10 session (10 slides over 10 minutes) presented by Nicola Osborne (EDINA, University of Edinburgh). This presentation was part of Repository Fringe 2017 (#rfringe17) held on 3rd August 2017 in Edinburgh. The slides describe a project to develop Site2Cite, a new (pilot) tool for researchers to archive their web citations and ensure their readers can access that archive copy should the website change over time (including "Reference Rot" and "Content Drift").
This document provides an overview of managing digital footprints. It discusses what a digital footprint is, research conducted at the University of Edinburgh on digital footprints, and factors that contribute to one's digital footprint such as social media, location data, and online searches. The document notes that digital footprints can impact professional and personal reputation. It provides tips for taking ownership of one's digital footprint such as regularly searching for oneself online and reviewing privacy settings. Resources for further information and managing digital footprints are also listed.
The document discusses using digital technology and maps to represent the HMS Iolaire tragedy, a maritime disaster in 1919 where 205 men from the Isle of Lewis died after returning from World War I. It describes adding photos, text, and showing change over time to maps to help tell the story and create a sense of place. Specific details are provided about the journey the men took from England to the Western Isles on New Year's Day 1919 and how maps at different scales can portray events in different ways.
This document introduces Digimap for Schools, an online mapping service designed for schools to use in geography and other subjects. It has Ordnance Survey maps of Great Britain at different zoom levels, as well as historic maps and aerial photography. Students can add their own labels, markers, and other elements to maps. The service allows measuring distances and areas. It is browser-based and can be accessed from school or home. Over 2,690 schools in Britain currently use the service, including 185 Scottish secondary schools. The document outlines how Digimap for Schools can support teaching and learning in subjects beyond geography like numeracy, social studies, sciences, and more. Examples of lessons and activities using the mapping service are provided.
This document provides an introduction to Digimap for Schools, an online mapping service designed for use in UK schools. It highlights key features such as access to historic maps from the 1890s and 1950s, aerial photography, and tools for annotating, measuring, and analyzing maps. Schools subscribe to the service, which allows unlimited users per school to access maps and tools through a web browser on any device. The presenter emphasizes how Digimap for Schools can support teaching and learning across the Scottish curriculum, particularly for geography, by facilitating hands-on activities with maps, data, and spatial analysis. Examples are given of how schools have used the service for topics like land use change, density calculations, and proportional mapping. Teachers observing the presentation
"Managing your Digital Footprint : Taking control of the metadata and tracks and traces that define us online" invited presentation for CIG Scotland's 7th Metadata & Web 2.0 Seminar: "Somewhere over the Rainbow: our metadata online, past, present & future", which took place at the National Library of Scotland, 5th April 2017.
Slides accompanying Nicola Osborne's(EDINA Digital Education Manager) session on "Social media and blogging to develop and communicate research in the arts and humanities" at the "Academic Publishing: Routes to Success" event held at the University of Stirling on 23rd January 2017.
"Enhancing your research impact through social media" - presentation given by Nicola Osborne, EDINA Digital Education Manager, at the Edinburgh Postgraduate Law Conference 2017 (19th January 2017).
Social Media in Marketing in Support of Your Personal Brand - Nicola Osborne, EDINA Digital Education Manager, for Abertay University (Dundee) 4th Year Marketing Students.
Best Practice for Social Media in Teaching & Learning Contexts, slides accompanying a presentation by Nicola Osborne, EDINA Digital Education Manager, for Abertay University (Dundee). The hashtag for this event was #AbTLEJan2017.
Big Just Got Bigger! discusses the challenges of managing large map collections through the Digimap service. Digimap provides access to geospatial data from various sources, including Ordnance Survey, British Geological Survey, aerial imagery, and more. It has grown significantly over time to include more data sources and users. Managing such large datasets and meeting user expectations of current data and performance presents challenges. Issues include keeping data current while sharing across platforms, disk storage needs increasing exponentially over time, and ensuring data can be accessed and used through various tools and formats.
This document summarizes new and enhanced features in Digimap services from 2015-2016. Key updates include a refreshed homepage, responsive design for tablets, a new historic downloader application, marine chart roam with updated data, additions to ancient roam, land cover vector data, and improvements to geology, marine, and OS data. Usability and performance enhancements were also made, such as improved geo-referencing, easier use of 3D data, and a more reliable backend system. Feedback from users helped inform priority quality improvements.
Thinking of getting a dog? Be aware that breeds like Pit Bulls, Rottweilers, and German Shepherds can be loyal and dangerous. Proper training and socialization are crucial to preventing aggressive behaviors. Ensure safety by understanding their needs and always supervising interactions. Stay safe, and enjoy your furry friends!
Strategies for Effective Upskilling is a presentation by Chinwendu Peace in a Your Skill Boost Masterclass organisation by the Excellence Foundation for South Sudan on 08th and 09th June 2024 from 1 PM to 3 PM on each day.
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...PECB
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: ISO/IEC 27001 Information Security Management System - EN | PECB
ISO/IEC 42001 Artificial Intelligence Management System - EN | PECB
General Data Protection Regulation (GDPR) - Training Courses - EN | PECB
Webinars: https://pecb.com/webinars
Article: https://pecb.com/article
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
Assessment and Planning in Educational technology.pptxKavitha Krishnan
In an education system, it is understood that assessment is only for the students, but on the other hand, the Assessment of teachers is also an important aspect of the education system that ensures teachers are providing high-quality instruction to students. The assessment process can be used to provide feedback and support for professional development, to inform decisions about teacher retention or promotion, or to evaluate teacher effectiveness for accountability purposes.
The simplified electron and muon model, Oscillating Spacetime: The Foundation...RitikBhardwaj56
Discover the Simplified Electron and Muon Model: A New Wave-Based Approach to Understanding Particles delves into a groundbreaking theory that presents electrons and muons as rotating soliton waves within oscillating spacetime. Geared towards students, researchers, and science buffs, this book breaks down complex ideas into simple explanations. It covers topics such as electron waves, temporal dynamics, and the implications of this model on particle physics. With clear illustrations and easy-to-follow explanations, readers will gain a new outlook on the universe's fundamental nature.
How to Fix the Import Error in the Odoo 17Celine George
An import error occurs when a program fails to import a module or library, disrupting its execution. In languages like Python, this issue arises when the specified module cannot be found or accessed, hindering the program's functionality. Resolving import errors is crucial for maintaining smooth software operation and uninterrupted development processes.
How to Manage Your Lost Opportunities in Odoo 17 CRMCeline George
Odoo 17 CRM allows us to track why we lose sales opportunities with "Lost Reasons." This helps analyze our sales process and identify areas for improvement. Here's how to configure lost reasons in Odoo 17 CRM
This slide is special for master students (MIBS & MIFB) in UUM. Also useful for readers who are interested in the topic of contemporary Islamic banking.
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
South African Journal of Science: Writing with integrity workshop (2024)
Organising and Documenting Data
1. Organising and Documenting Data
Stuart Macdonald
EDINA & Data Library
stuart.macdonald@ed.ac.uk
RDM Academic Liaison Librarian Training, 15 November 2012
2. Organising your data
•RDM is one of the essential areas of responsible conduct of research.
•Research data files and folders need to be organised in a systematic way so that they are:
 • identifiable and accessible for yourself,
 • identifiable and accessible for colleagues and for future users.
•It is therefore important to plan the organisation of your data before a research project begins.
•Doing so will prevent confusion while research is underway, or when multiple individuals will be editing and/or analysing the data.
3. This can be achieved through:
•Directory structure & file naming conventions
•(File naming conventions for specific disciplines)
•File renaming
•File version control
For this to be successful, a consistent and disciplined approach is required.
It is easier to accomplish as and when data files are generated than to attempt to implement retrospectively.
When organisation methods become too time-consuming, consider automated methods.
4. File naming conventions
•Naming datasets according to agreed conventions should make file naming easier for colleagues, because they will not have to ‘re-think’ the process each time.
•File names should provide context for the contents of the file, making it distinguishable from files with similar subjects or from different versions of the same file.
•Many files are used independently of their file or directory structure, so provide sufficient description in the file name.
•Suggested strategies: identify the project; avoid special characters; use underscores rather than spaces; include the date of creation or modification in a standard format (e.g. YYYY_MM_DD or YYYYMMDD); use a project number.
•Be consistent! Avoid being cryptic!
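The strategies above can be sketched as a small helper. This is an illustrative example only (the function and project code are hypothetical, not from the slides): it identifies the project, uses a YYYYMMDD date, replaces spaces with underscores, and drops special characters.

```python
from datetime import date

def make_filename(project, description, ext, when=None):
    """Build a file name following the suggested conventions: project
    identifier first, a YYYYMMDD date, underscores instead of spaces,
    and no special characters."""
    when = when or date.today()
    # Replace spaces with underscores, then drop any character
    # that is not alphanumeric or an underscore.
    safe = "".join(c for c in description.replace(" ", "_")
                   if c.isalnum() or c == "_")
    return "{}_{:%Y%m%d}_{}.{}".format(project, when, safe, ext)

print(make_filename("RDM101", "survey responses raw", "csv",
                    date(2012, 11, 15)))
# RDM101_20121115_survey_responses_raw.csv
```

Encoding the convention in one place like this also keeps naming consistent across a team, since nobody has to re-think the scheme.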
5. Batch (or bulk) renaming
• Software tools exist that can organise data files and
folders in a consistent and automated way through
batch renaming.
• There are many situations where batch renaming may
be useful, such as:
– where images from digital cameras are automatically
assigned filenames consisting of sequential numbers
– where proprietary software or instrumentation generate
crude, default or multiple filenames
– where files are transferred from a system that supports
spaces and/or non-English characters in filenames to one
that doesn't (or vice versa). Batch renaming software can
be used to substitute such characters with acceptable
ones.
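The character-substitution case can be scripted directly. A minimal sketch (assuming a flat folder of files; the function name is hypothetical) that replaces spaces with underscores and strips non-ASCII characters, as dedicated batch-renaming tools do:

```python
from pathlib import Path

def batch_rename(folder, dry_run=True):
    """Rename every file in `folder`, replacing spaces with
    underscores and stripping non-ASCII characters."""
    for path in Path(folder).iterdir():
        if not path.is_file():
            continue
        new_name = (path.name.replace(" ", "_")
                    .encode("ascii", "ignore").decode())
        if new_name != path.name:
            print(path.name, "->", new_name)  # report each change
            if not dry_run:
                path.rename(path.with_name(new_name))
```

The `dry_run` default lets you preview the renames before committing to them, which is prudent when touching many files at once.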
6. Benefits of consistent data file
labelling are:
•Data files are not accidentally overwritten or deleted
•Data files are distinguishable from each other within their
containing folder
•Data file naming prevents confusion when multiple people are
working on shared files
•Data files are easier to locate and browse
•Data files can be retrieved both by creator and by other users
•Data files can be sorted in logical sequence
•Different versions of data files can be identified
•If data files are moved to another storage platform, their names will retain useful context
7. Version Control
It is important to consistently identify and distinguish versions
of data files.
This ensures that a clear audit trail exists for tracking the
development of a data file and identifying earlier versions
especially if data is frequently updated by multiple users.
Suggested strategies:
• Use a sequential numbered system: v1, v2, v3, etc.
• Don't use confusing labels: revision, final, final2, etc.
• Record all changes -- no matter how small
• Discard obsolete versions (but never the raw copy)
• Use auto-backup instead of self-archiving, if possible
The alternative is to use version control software (e.g. Subversion, TortoiseSVN, Bazaar).
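The sequential-numbering strategy can be automated so that version labels never drift into "final2" territory. A small illustrative helper (hypothetical, not from the slides) that bumps a `_v1`, `_v2`, … suffix:

```python
import re

def next_version(filename):
    """Return the next sequential version of a file name using the
    v1, v2, v3 scheme; files without a version suffix become _v2."""
    match = re.search(r"_v(\d+)(\.\w+)$", filename)
    if match:
        n = int(match.group(1)) + 1
        return filename[:match.start()] + "_v{}".format(n) + match.group(2)
    stem, dot, ext = filename.rpartition(".")
    return "{}_v2.{}".format(stem, ext) if dot else filename + "_v2"

print(next_version("analysis_v3.csv"))  # analysis_v4.csv
print(next_version("analysis.csv"))     # analysis_v2.csv
```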
8. Documenting Data
There are many reasons why you need to document
your data:
•To help you remember the details later
•To help others understand your research
•To verify your findings
•To replicate your results
•To archive your data for access and re-use
Some examples of data documentation are:
•Laboratory notebooks
•Field notes
•Questionnaires
•SOPs
•Methodologies
9. Documenting Data
Laboratory or field notebooks, for example, play an important role in supporting claims relating to intellectual property developed by University researchers, and even in defending against claims of scientific fraud.
Research data need to be documented at various
levels:
•Project level
• study background, methodologies, instruments,
research hypothesis
•File or database level
• formats, relationships between files
•Variable or item level
• How variable was generated & label descriptions
10. Metadata – ‘data about data’
The difference between documentation and metadata is that the former is meant to be read by humans, while the latter implies computer processing (though it may also be human-readable), assisting location of and access to data through search interfaces.
Three broad categories of metadata are:
•Descriptive - common fields such as title, author, abstract,
keywords which help users to discover online sources through
searching and browsing e.g. DC, MARC
•Administrative - preservation, rights management, and technical
metadata about formats.
•Structural - how different components of a set of associated
data relate to one another, such as a schema describing relations
between tables in a database.
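A descriptive metadata record of the kind described above might look as follows. This is an illustrative sketch using Dublin Core ("DC") element names; the field values are invented, not taken from a real dataset:

```python
import json

# A minimal descriptive metadata record using Dublin Core element
# names; these are the common discovery fields (title, creator,
# subject, ...) that search interfaces index.
record = {
    "dc:title": "Household Survey 2012, Raw Responses",
    "dc:creator": "Example Research Group",
    "dc:subject": ["households", "survey data"],
    "dc:description": "Anonymised responses to the 2012 survey.",
    "dc:date": "2012-11-15",
    "dc:format": "text/csv",
}

# Serialising the record (e.g. as JSON) makes it machine-processable
# for a repository's search index.
print(json.dumps(record, indent=2))
```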
11. Need for metadata
Metadata may not be required if you are working alone on your own computer, but become crucial when data are shared online.
Metadata help to place your dataset in a broader context, allowing those outside your institution, discipline, or research environment to understand how to interpret your data.
[Diagram: nested audience levels – Researcher, Project, Research Community, Public]
Is there a file-naming convention for your specific discipline (e.g. the Open Biological and Biomedical Ontologies, DOE's Atmospheric Radiation Measurement (ARM) program)?
For qualitative data or small-scale surveys, the documentation might exist only in your head. Take the time to write it down while it is fresh in your mind. This may include writing methodology reports, creating codebooks with full variable and value labels, documenting decisions about software, tracking changes to different versions of the dataset, recording assumptions made during analysis.
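A codebook with full variable and value labels, as mentioned above, can be kept as structured data rather than prose. A minimal sketch (the variable names and labels are invented for illustration):

```python
# A variable-level codebook recording labels, types, value labels
# and missing-value codes; contents are illustrative only.
codebook = {
    "q1_age": {
        "label": "Respondent age in completed years",
        "type": "integer",
        "missing": -9,
    },
    "q2_employ": {
        "label": "Employment status",
        "values": {1: "Employed", 2: "Unemployed", 3: "Retired"},
    },
}

def value_label(var, code):
    """Look up the human-readable label for a coded value."""
    return codebook[var].get("values", {}).get(code, "Unknown")

print(value_label("q2_employ", 3))  # Retired
```

Keeping the codebook machine-readable means the same labels can drive analysis scripts, documentation, and deposit metadata without re-typing.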
METS, a Digital Library Federation initiative, attempts to build upon the work of MOA2 and provide an XML document format for encoding the metadata necessary both for management of digital library objects within a repository and for exchange of such objects between repositories (or between repositories and their users). Administrative metadata provides the information necessary to allow a repository to manage objects, such as when, how and by whom a resource was created and how it can be accessed (provenance and licensing).