This use case shows the mapping and transformation of land use plan data from the Province of Trento, Italy, to the INSPIRE Land Use data specification. The source data includes shapefiles of land use polygons and classifications, as well as documentation. The data is mapped to INSPIRE-compliant feature types, including ZoningElement, SpatialPlan, and OfficialDocumentation, using operations such as retype, merge, and property mappings in the HALE tool. Attributes are renamed, reclassified, and linked between the source and target schemas to satisfy INSPIRE requirements.
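The attribute renaming and reclassification described above can be sketched in plain Python. This is only an illustration of the kind of mapping HALE expresses declaratively; every field name and code value below is hypothetical, not taken from the Trento dataset or the INSPIRE code lists.

```python
# Minimal sketch of a rename + reclassify mapping, in the spirit of HALE's
# property mappings. All field names and code values are hypothetical.

RENAME = {"uso": "hilucsLandUse", "id_zona": "inspireId"}   # source -> target field
RECLASSIFY = {"residenziale": "residentialUse",             # source -> target code
              "industriale": "industrialUse",
              "verde": "greenUrbanAreas"}

def map_feature(src: dict) -> dict:
    """Apply property renames, then reclassify the land use code."""
    tgt = {RENAME.get(k, k): v for k, v in src.items()}
    if "hilucsLandUse" in tgt:
        tgt["hilucsLandUse"] = RECLASSIFY.get(tgt["hilucsLandUse"], "otherUse")
    return tgt

feature = {"id_zona": "TN-042", "uso": "residenziale"}
print(map_feature(feature))  # {'inspireId': 'TN-042', 'hilucsLandUse': 'residentialUse'}
```

In HALE the same logic is captured as retype and classification cells in the alignment rather than as code, which keeps the mapping declarative and reusable.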
HALE is a tool for creating mappings between different data models and schemas. It allows users to transform and harmonize spatial data, with a focus on complex datasets. HALE provides both a graphical and textual interface for GIS experts to define logical and semantically consistent mappings. It also enables validating transformations step-by-step and comparing source and target data. The document then provides an example usage of HALE to map land use data from a regional plan in Trento, Italy to the INSPIRE data specification.
This document discusses several empirical approaches for analyzing large geographic datasets:
1) Matching datasets in ArcGIS by spatially joining points and polygons using tools like Extract Values to Points for large raster datasets.
2) Reading shapefile and database files into Stata using the shp2dta command to link geographic and attribute information.
3) Comparing the costs of action vs. inaction in biomodeling by predicting outcomes under different land management scenarios.
4) Geocoding villages without geographic coordinates using online tools to assign latitude and longitude for merging external geographic data.
5) Standardizing country names in Stata using the kountry command to facilitate linking datasets based on country information.
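The spatial join in step 1 boils down to a point-in-polygon test. As a library-free sketch of that idea (real workflows would use ArcGIS or GeoPandas; the polygon and points below are made up):

```python
# Ray-casting point-in-polygon test, the core of a point-to-polygon
# spatial join. District and village coordinates are invented.

def point_in_polygon(x, y, poly):
    """Count crossings of a horizontal ray from (x, y); odd count means inside."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge straddles the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

district = [(0, 0), (4, 0), (4, 4), (0, 4)]            # square polygon
villages = {"A": (1, 1), "B": (5, 2)}                  # point features
joined = {name: point_in_polygon(x, y, district) for name, (x, y) in villages.items()}
print(joined)  # {'A': True, 'B': False}
```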
2017 GIS in Emergency Management Track: Situational Awareness: Building an O..., by GIS in the Rockies
The document describes plans to create a situational awareness dashboard for emergency management. The dashboard will integrate real-time data from sources such as 911 calls, weather, and traffic to give public safety agencies and the public visibility into current emergency and event impacts. It will be built with Esri's Operations Dashboard software, leveraging existing GIS data and tools. Key steps include identifying appropriate data sources, obtaining access to real-time data, designing the dashboard interface, and planning for future enhancements.
The document discusses the three phases of developing a Plantation Decision Support System (DSS) for the Forest Department. Phase I involves formulating project reports for selecting suitable plantation sites. Phase II involves formulating plantation journals with yearly field data. Phase III involves monitoring and evaluating the plantations physically and through remote sensing data and software.
This document discusses land administration and the role of GIS. It describes how land administration involves recording ownership and other attributes of land. GIS helps with land administration by providing digital maps and data for tasks like land registration, valuation for taxation, planning, and dispute resolution. The document also presents a case study of a GIS-based land information system developed for rural areas in India, which digitized paper maps, integrated satellite imagery, and allowed for more accurate planning and management of land use.
Lect 1 & 2: Introduction to GIS & RS, by Rehana Jamal
The document provides an introduction to remote sensing including:
- A definition of remote sensing as collecting information about objects or areas from a distance without physical contact.
- An overview of the history of remote sensing from early aerial photography using balloons and airplanes to modern satellite imagery.
- An explanation that remote sensing is a spatial data acquisition technique that collects remotely sensed data from various platforms and sources.
EuroPython 2019: GeoSpatial Analysis using Python and JupyterHub, by Martin Christen
1) The document discusses using Python and JupyterHub for geospatial analysis and visualization. It provides an overview of geospatial data types and important open source Python libraries for working with vector and raster data.
2) Examples are shown of loading geospatial data into GeoPandas and performing spatial queries. Folium is used to visualize the results on interactive maps.
3) The last part demonstrates loading live earthquake data from the USGS into a GeoPandas DataFrame and plotting multiple data sources.
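The USGS feed mentioned in point 3 is GeoJSON, which the talk loads straight into GeoPandas. As a rough, self-contained illustration of the same filtering step, a made-up two-feature feed (standing in for the live URL) can be handled with the standard library alone:

```python
# Parse a tiny, invented USGS-style GeoJSON earthquake feed and filter by
# magnitude, mimicking a GeoDataFrame attribute query.
import json

feed = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"mag": 4.6, "place": "example region"},
     "geometry": {"type": "Point", "coordinates": [13.4, 46.2, 10.0]}},
    {"type": "Feature",
     "properties": {"mag": 2.1, "place": "another region"},
     "geometry": {"type": "Point", "coordinates": [11.1, 44.5, 8.0]}}
  ]
}""")

# Keep only events above a magnitude threshold, with lon/lat for mapping
strong = [(f["properties"]["place"], f["geometry"]["coordinates"][:2])
          for f in feed["features"] if f["properties"]["mag"] >= 4.0]
print(strong)  # [('example region', [13.4, 46.2])]
```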
The document discusses using big data analysis techniques to analyze land use statistics in urban planning. It describes analyzing data on land parcels classified into categories like residential, green space, and industrial to understand land utilization patterns and their effects. The analysis uses a frequent pattern mining algorithm to discover patterns in the data, showing regions with high industrial/residential use and low green space. Challenges included finding suitable datasets and preprocessing the data for analysis.
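As a toy illustration of the frequent-pattern idea, co-occurrences of land use categories across a handful of invented neighbourhood records can be counted directly; a real analysis would run an Apriori or FP-growth implementation over the parcel data.

```python
# Count frequent category pairs across land-use records (all data invented),
# the simplest form of the frequent pattern mining the analysis describes.
from collections import Counter
from itertools import combinations

# Each record: the land use categories present in one neighbourhood
records = [
    {"residential", "industrial"},
    {"residential", "industrial", "green"},
    {"residential", "industrial"},
    {"residential", "green"},
]

min_support = 3  # a pattern must appear in at least 3 records
pair_counts = Counter()
for r in records:
    for pair in combinations(sorted(r), 2):
        pair_counts[pair] += 1

frequent = {p: c for p, c in pair_counts.items() if c >= min_support}
print(frequent)  # {('industrial', 'residential'): 3}
```

The surviving pattern mirrors the reported finding: industrial/residential use co-occurs often, while green space does not reach the support threshold.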
Geographical Information System (GIS) Georeferencing and Digitization, Bihar ..., by Kamlesh Kumar
This work shares the Geographical Information System steps of georeferencing, digitization, and map making using QGIS 2.0.1:
Georeferencing
Digitization of Topographical sheet
Point
Line
Area
Bihar Map
District Headquarters
Railway of Bihar
District Boundaries
Thematic Maps (Literacy & Sex Ratio)
This document provides an introduction to using ArcGIS software to create and edit maps. It describes ArcMap, ArcScene, and ArcGlobe as the three main ArcGIS applications for 2D, 3D, and global mapping. It explains how to add shapefiles to a map, work with attribute tables, select features, and download additional GIS data files. The document demonstrates how to organize layers, set labels and symbols, and export selected data. It provides guidance on effective data display through techniques like color coding and label placement. Overall, the document serves as a tutorial for basic GIS mapping and visualization using Esri's ArcGIS suite of products.
First GIS Software that Converts GIS Shape Files to HTML Google Map Web Site a..., by Gis Gis
This document describes a free and open source Arabic GIS and GPS software. It includes desktop and web-based versions for mapping, editing shapefiles, analyzing spatial data, and tracking mobile GPS devices. The software allows converting between file formats, performing spatial analyses, and integrating with open data sources and web mapping services like Google Maps. It aims to provide Arabic users with free and compatible GIS tools for storing, analyzing, and visualizing geographic information.
The USGS National Geospatial Program is scanning and georeferencing published USGS topographic maps from 1884 to 2006 to create a digital repository that is freely available to the public. Scanning the historic maps preserves an irreplaceable collection and makes them accessible online through the USGS Store and The National Map Viewer. Georeferencing the maps ties them to a coordinate system and allows overlaying the historic maps with current geospatial data, enabling analysis of changes over time. Future plans include continuing to scan and release more historical quadrangles and providing the maps in GeoTIFF format for additional uses.
APPLICATION OF GEOGRAPHIC INFORMATION SYSTEM FOR EXPLORATION ACTIVITIES IN SO..., by Yudi Syahnur
First published at the 2016 Indonesia Petroleum Association (IPA) Technical Symposium, this paper illustrates how GIS best practices have been employed at Saka Indonesia Sesulu, from the planning and execution of a 550 km² 3D seismic survey to rig-move monitoring.
GIS has also helped explorationists effectively distinguish trends and find patterns and anomalies in surface and subsurface structures. GIS allows people from multiple disciplines and different backgrounds to collaborate easily and contribute to the success of oil and gas exploration in the South Sesulu PSC.
Raster data is commonly obtained by scanning maps or collecting aerial photographs and satellite images. Scanned map datasets don't normally contain spatial reference information (either embedded in the file or as a separate file). With aerial photography and satellite imagery, sometimes the location information delivered with them is inadequate, and the data does not align properly with other data one has. Thus, to use some raster datasets in conjunction with other spatial data, we need to align or georeference them to a map coordinate system. A map coordinate system is defined using a map projection (a method by which the curved surface of the earth is portrayed on a flat surface). Georeferencing a raster dataset defines its location using map coordinates and assigns the coordinate system of the data frame. Georeferencing raster data allows it to be viewed, queried, and analyzed with other geographic data.
Generally, we georeference raster data using existing spatial data (target data)—such as georeferenced rasters or a vector feature class—that resides in the desired map coordinate system. The process involves identifying a series of ground control points—known x,y coordinates—that link locations on the raster dataset with locations in the spatially referenced data (target data). Control points are locations that can be accurately identified on the raster dataset and in real-world coordinates. Many different types of features can be used as identifiable locations, such as road or stream intersections, the mouth of a stream, rock outcrops, the end of a jetty of land, the corner of an established field, street corners, or the intersection of two hedgerows. The control points are used to build a polynomial transformation that will shift the raster dataset from its existing location to the spatially correct location. The connection between one control point on the raster dataset (the from point) and the corresponding control point on the aligned target data (the to point) is a link.
Finally, the georeferenced raster file can be exported for further usage.
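The polynomial transformation built from control-point links can be illustrated with the simplest case: a first-order (affine) fit by least squares. The pixel and map coordinates below are invented; in practice the GIS solves this when you place the links.

```python
# Fit a first-order (affine) georeferencing transform to control-point links
# by least squares. All coordinates are invented for illustration.
import numpy as np

# "from" points: pixel coordinates on the unreferenced raster
src = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], dtype=float)
# "to" points: the same locations in map coordinates
dst = np.array([[500000, 4000000], [500500, 4000000],
                [500000, 4000500], [500500, 4000500]], dtype=float)

# Solve dst = [x, y, 1] @ params for the 6 affine parameters
ones = np.ones((len(src), 1))
G = np.hstack([src, ones])                 # design matrix, one row per link
params, *_ = np.linalg.lstsq(G, dst, rcond=None)

def georeference(px, py):
    """Map a raster pixel to map coordinates with the fitted transform."""
    return tuple(np.array([px, py, 1.0]) @ params)

mx, my = georeference(50, 50)
print(f"{mx:.1f}, {my:.1f}")  # 500250.0, 4000250.0
```

With more links than parameters, the least-squares fit also yields residuals per link, which is how the GIS reports the RMS error used to judge control-point quality.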
This presentation is intended to help you perform the task step by step.
This document discusses applications of geographic information systems (GIS) including urban planning, 3D modeling, environmental analysis, and hydrocarbon exploration. It provides examples of how GIS has been used for urban planning tasks like siting a daycare, modeling population change, and analyzing transportation networks. 3D modeling applications include generating high-resolution digital models from laser scanning data for uses like mapping, education, and engineering. Environmental analysis examples include examining the relationship between toxic sites and disadvantaged communities. The document also discusses GIS applications in hydrocarbon exploration like mapping fields and reservoirs, seismic interpretation, and production analysis to optimize resource development.
This presentation provides an overview of Land Information Systems (LIS). It discusses that a LIS is a digital system that contains both spatial and non-spatial land data. It then reviews the background of LIS in Western countries and how they differ from Nepal's system. The presentation outlines the key concepts of LIS including its methodology, current problems, and future planning. It aims to introduce LIS and provide context around its use and development in Nepal.
R is a free software environment for statistical computing and graphics. It can be used for spatial data analysis and GIS tasks. Spatial data such as points, polygons, and raster files can be imported and analyzed in R using specialized packages. Two case studies demonstrated using R for spatial interpolation of temperature data, LiDAR data processing to create digital elevation models, and developing online viewers for spatial datasets. R allows for reproducible analysis through scripting and has numerous packages that implement statistical procedures, graphics, and interfaces with GIS software like GRASS and ArcGIS.
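The case study above used R for the spatial interpolation; the underlying inverse distance weighting (IDW) idea is only a few lines in any language. This Python sketch uses invented station readings.

```python
# Inverse distance weighting: estimate a value at (x, y) from nearby
# stations, weighting each by 1/distance^power. Station data is invented.
import math

stations = [((0.0, 0.0), 10.0), ((10.0, 0.0), 20.0), ((0.0, 10.0), 14.0)]

def idw(x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from known stations."""
    num = den = 0.0
    for (sx, sy), value in stations:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return value            # exactly at a station: return its value
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

print(round(idw(0.0, 0.0), 1))  # 10.0
```

R packages such as gstat implement this (and kriging) over proper spatial objects; the point here is only that the estimator itself is a weighted average.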
The document presents a presentation on Geographic Information Systems (GIS). It includes sections on what GIS is, its capabilities and components. GIS is a computer system for capturing, storing, analyzing and managing geographic information and spatial data. The key components of a GIS include hardware, software, data and people. GIS has many applications and uses spatial data and analysis to solve problems across many different domains.
This document summarizes a GIS training session at IOE Pulchowk covering adding various data types to projects, including vector shapefiles, coordinate data from Excel, and image data from online sources like the Shuttle Radar Topography Mission. It discusses importing Excel data, defining the coordinate system, classifying data as points, lines or polygons, and the differences between shapefile and layer file formats. The training covered using shapefiles to store location and attribute data, while layer files primarily store symbology and properties without the raw data.
1. The document discusses how GIS can be used to aid in selecting optimal routes for transcontinental natural gas pipelines by analyzing cost and environmental factors.
2. GIS specialists use data to evaluate potential routes and determine the most suitable path between starting and ending points.
3. A case study found that routes developed using GIS facilitated greater cost reductions than manually developed routes.
A group of 15 developers and experts held a Code Camp in 2012 at the University of West Bohemia in Pilsen to develop a web platform called plan4business. They combined user requirements with engines for integration, storage, and analysis of pan-European spatial datasets, soon to be demonstrated in a pilot application enabling spatial analysis. plan4business is an EU-funded project running from 2012 to 2014 that aims to offer harmonized planning data and analyses in an INSPIRE-compliant, open-standard platform.
The document outlines the 5 steps to mapping with OpenStreetMap, which are: 1) Collect data using GPS trackers, data loggers, or aerial photos; 2) Upload collected data by creating a free account on OpenStreetMap.org; 3) Create and edit map data within OpenStreetMap; 4) Label and tag the data to add details; 5) Render and use the completed map. It provides examples of GPS logging devices and links for collecting aerial photo data with GPS tags to upload.
Geographic Information Systems in the Oil & Gas Industry, by Francois Viljoen
GIS is a tool that can aid decision making in the gas and petroleum industry. GIS allows users to capture, store, analyze and display geospatial data to locate oil and gas resources. It is used throughout the exploration, production, distribution and conservation processes. GIS integrates data on seismic surveys, pipelines, facilities and more to improve efficiency, save costs and support better decision making. The gas and petroleum industry is under pressure to adopt GIS and green technologies to enhance sustainability, environmental monitoring and benchmarking.
The document provides an overview of geographical information systems (GIS). It defines GIS as a system for capturing, storing, manipulating, analyzing and presenting spatial or geographic data. It describes the core components of GIS as hardware, software, data, people and methods. It outlines several applications of GIS in fields such as agriculture, natural resource management, transportation, military, business and more. It also discusses concepts such as data types, map scale and resolution, and provides examples of GIS terminology.
Exploratory Data Analysis of 2017 US Employment Data using R, by Chetan Khanzode
Data science use case: exploratory data analysis of 2017 US employment data using R. It uses R libraries to visualize employment data by state, county, and industry sector, including simple geospatial visualization of the employment data.
This document provides an introduction to Geographic Information Systems (GIS). It defines GIS as a computer system for capturing, storing, manipulating, analyzing and presenting spatially-referenced data. The document discusses examples of GIS applications, the history of GIS from the 1970s to present, and its use in fields like urban planning, hydrological modeling and the water sector. It also compares open source GIS software like QGIS to proprietary software like ESRI ArcGIS, and reviews some key open source GIS tools including GDAL, Python and OSGeo4W.
What to Do with the Existing Spatial Data in Planning, by Karel Charvat
The document discusses harmonizing spatial planning data from different regions in Europe to be compliant with INSPIRE directives. It describes the Plan4all project which developed conceptual data models for several INSPIRE themes including land cover, land use, and natural risk zones. The harmonization process involves describing source data, defining transformations between source and target structures, and publishing the harmonized data through web services and applications. Lessons learned include clearly defining source data and code lists to aid transformation and addressing differences in how terms are used across countries.
The Role of Metadata and GI in Spatial Planning and SDI, by Karel Charvat
The document discusses the role of metadata and geospatial information in spatial planning and spatial data infrastructures. It notes that spatial plans contain important information that should be included in INSPIRE themes and made interoperable. Metadata can help manage spatial planning processes by providing evidence of documents and tracking their status. Metadata profiles can also help catalog and discover spatial plans. Feature-level metadata is important to maintain traceability when spatial plan data is converted to other schemas or combined from multiple sources.
The document discusses using big data analysis techniques to analyze land use statistics in urban planning. It describes analyzing data on land parcels classified into categories like residential, green space, and industrial to understand land utilization patterns and their effects. The analysis uses a frequent pattern mining algorithm to discover patterns in the data, showing regions with high industrial/residential use and low green space. Challenges included finding suitable datasets and preprocessing the data for analysis.
Geographical Information System (GIS) Georeferencing and Digitization, Bihar ...Kamlesh Kumar
This work is an effort to share Geographical Information System: Georeferencing, digitization and map making steps through QGIS 2.0.1
Georeferencing
Digitization of Topographical sheet
Point
Line
Area
Bihar Map
District Headquarters
Railway of Bihar
District Boundaries
Thematic Maps (Literacy & Sex Ratio)
This document provides an introduction to using ArcGIS software to create and edit maps. It describes ArcMap, ArcScene, and ArcGlobe as the three main ArcGIS applications for 2D, 3D, and global mapping. It explains how to add shapefiles to a map, work with attribute tables, select features, and download additional GIS data files. The document demonstrates how to organize layers, set labels and symbols, and export selected data. It provides guidance on effective data display through techniques like color coding and label placement. Overall, the document serves as a tutorial for basic GIS mapping and visualization using Esri's ArcGIS suite of products.
First GIS Software that Convert GIS Shape files to HTML Google Map Web Site a...Gis Gis
This document describes a free and open source Arabic GIS and GPS software. It includes desktop and web-based versions for mapping, editing shapefiles, analyzing spatial data, and tracking mobile GPS devices. The software allows converting between file formats, performing spatial analyses, and integrating with open data sources and web mapping services like Google Maps. It aims to provide Arabic users with free and compatible GIS tools for storing, analyzing, and visualizing geographic information.
The USGS National Geospatial Program is scanning and georeferencing published USGS topographic maps from 1884 to 2006 to create a digital repository that is freely available to the public. Scanning the historic maps preserves an irreplaceable collection and makes them accessible online through the USGS Store and The National Map Viewer. Georeferencing the maps ties them to a coordinate system and allows overlaying the historic maps with current geospatial data, enabling analysis of changes over time. Future plans include continuing to scan and release more historical quadrangles and providing the maps in GeoTIFF format for additional uses.
APPLICATION OF GEOGRAPHIC INFORMATION SYSTEM FOR EXPLORATION ACTIVITIES IN SO...Yudi Syahnur
First published in 2016 Indonesia Petroleum Association (IPA) Technical Symposium, this paper will illustrate how GIS Best Practices have been employed in Saka Indonesia Sesulu. From planning and execution of 550 km square 3D Seismic Survey to Rig Move monitoring activity.
GIS has also helped explorationist to effectively distinct trends, find patterns and anomalies of surface and subsurface structures. GIS allows people from multi-discipline and different backgrounds to collaborate easily, and contribute to the success of Oil & Gas Exploration in South Sesulu PSC.
Raster data is commonly obtained by scanning maps or collecting aerial photographs and satellite images. Scanned map datasets don't normally contain spatial reference information (either embedded in the file or as a separate file). With aerial photography and satellite imagery, sometimes the location information delivered with them is inadequate, and the data does not align properly with other data one has. Thus, to use some raster datasets in conjunction with other spatial data, we need to align or georeference them to a map coordinate system. A map coordinate system is defined using a map projection (a method by which the curved surface of the earth is portrayed on a flat surface). Georeferencing a raster data defines its location using map coordinates and assigns the coordinate system of the data frame. Georeferencing raster data allows it to be viewed, queried, and analyzed with other geographic data.
Generally, we georeference raster data using existing spatial data (target data)—such as georeferenced rasters or a vector feature class—that resides in the desired map coordinate system. The process involves identifying a series of ground control points—known x,y coordinates—that link locations on the raster dataset with locations in the spatially referenced data (target data). Control points are locations that can be accurately identified on the raster dataset and in real-world coordinates. Many different types of features can be used as identifiable locations, such as road or stream intersections, the mouth of a stream, rock outcrops, the end of a jetty of land, the corner of an established field, street corners, or the intersection of two hedgerows. The control points are used to build a polynomial transformation that will shift the raster dataset from its existing location to the spatially correct location. The connection between one control point on the raster dataset (the from point) and the corresponding control point on the aligned target data (the to point) is a link.
Finally, the georeferenced raster file can be exported for further usage.
THIS PRESENTATION IS TO HELP YOU PERFORM THE TASK STEP BY STEP.
This document discusses applications of geographic information systems (GIS) including urban planning, 3D modeling, environmental analysis, and hydrocarbon exploration. It provides examples of how GIS has been used for urban planning tasks like siting a daycare, modeling population change, and analyzing transportation networks. 3D modeling applications include generating high-resolution digital models from laser scanning data for uses like mapping, education, and engineering. Environmental analysis examples include examining the relationship between toxic sites and disadvantaged communities. The document also discusses GIS applications in hydrocarbon exploration like mapping fields and reservoirs, seismic interpretation, and production analysis to optimize resource development.
This presentation provides an overview of Land Information Systems (LIS). It discusses that a LIS is a digital system that contains both spatial and non-spatial land data. It then reviews the background of LIS in Western countries and how they differ from Nepal's system. The presentation outlines the key concepts of LIS including its methodology, current problems, and future planning. It aims to introduce LIS and provide context around its use and development in Nepal.
R is a free software environment for statistical computing and graphics. It can be used for spatial data analysis and GIS tasks. Spatial data such as points, polygons, and raster files can be imported and analyzed in R using specialized packages. Two case studies demonstrated using R for spatial interpolation of temperature data, LiDAR data processing to create digital elevation models, and developing online viewers for spatial datasets. R allows for reproducible analysis through scripting and has numerous packages that implement statistical procedures, graphics, and interfaces with GIS software like GRASS and ArcGIS.
The document presents a presentation on Geographic Information Systems (GIS). It includes sections on what GIS is, its capabilities and components. GIS is a computer system for capturing, storing, analyzing and managing geographic information and spatial data. The key components of a GIS include hardware, software, data and people. GIS has many applications and uses spatial data and analysis to solve problems across many different domains.
This document summarizes a GIS training session at IOE Pulchowk covering adding various data types to projects, including vector shapefiles, coordinate data from Excel, and image data from online sources like the Shuttle Radar Topography Mission. It discusses importing Excel data, defining the coordinate system, classifying data as points, lines or polygons, and the differences between shapefile and layer file formats. The training covered using shapefiles to store location and attribute data, while layer files primarily store symbology and properties without the raw data.
1. The document discusses how GIS can be used to aid in selecting optimal routes for transcontinental natural gas pipelines by analyzing cost and environmental factors.
2. GIS specialists use data to evaluate potential routes and determine the most suitable path between starting and ending points.
3. A case study found that routes developed using GIS facilitated greater cost reductions than manually developed routes.
A group of 15 developers and experts held a Code Camp in 2012 at the University of West Bohemia in Pilsen to develop a web platform called plan4business. They combined user requirements with engines for integration, storage, and analysis of pan-European spatial datasets. This will soon be demonstrated in a pilot application enabling spatial analysis. Plan4business is an EU-funded project running from 2012 to 2014 that aims to offer harmonized planning data and analyses for users in an INSPIRE-compliant and open standard platform.
The document outlines the 5 steps to mapping with OpenStreetMap, which are: 1) Collect data using GPS trackers, data loggers, or aerial photos; 2) Upload collected data by creating a free account on OpenStreetMap.org; 3) Create and edit map data within OpenStreetMap; 4) Label and tag the data to add details; 5) Render and use the completed map. It provides examples of GPS logging devices and links for collecting aerial photo data with GPS tags to upload.
Geographic Information Systems in the Oil & Gas Industry, by Francois Viljoen
GIS is a tool that can aid decision making in the gas and petroleum industry. GIS allows users to capture, store, analyze and display geospatial data to locate oil and gas resources. It is used throughout the exploration, production, distribution and conservation processes. GIS integrates data on seismic surveys, pipelines, facilities and more to improve efficiency, save costs and support better decision making. The gas and petroleum industry is under pressure to adopt GIS and green technologies to enhance sustainability, environmental monitoring and benchmarking.
The document provides an overview of geographical information systems (GIS). It defines GIS as a system for capturing, storing, manipulating, analyzing and presenting spatial or geographic data. It describes the core components of GIS as hardware, software, data, people and methods. It outlines several applications of GIS in fields such as agriculture, natural resource management, transportation, military, business and more. It also discusses concepts such as data types, map scale and resolution, and provides examples of GIS terminology.
Exploratory data analysis of 2017 US Employment data using R, by Chetan Khanzode
Data Science: exploratory data analysis of 2017 US employment data using R, presented as a use case. R libraries are used to visualize employment data by state, county and industry sector, including simple geospatial visualization of the employment data.
This document provides an introduction to Geographic Information Systems (GIS). It defines GIS as a computer system for capturing, storing, manipulating, analyzing and presenting spatially-referenced data. The document discusses examples of GIS applications, the history of GIS from the 1970s to present, and its use in fields like urban planning, hydrological modeling and the water sector. It also compares open source GIS software like QGIS to proprietary software like ESRI ArcGIS, and reviews some key open source GIS tools including GDAL, Python and OSGeo4W.
What to do with the existing spatial data in planning, by Karel Charvat
The document discusses harmonizing spatial planning data from different regions in Europe to be compliant with INSPIRE directives. It describes the Plan4all project which developed conceptual data models for several INSPIRE themes including land cover, land use, and natural risk zones. The harmonization process involves describing source data, defining transformations between source and target structures, and publishing the harmonized data through web services and applications. Lessons learned include clearly defining source data and code lists to aid transformation and addressing differences in how terms are used across countries.
The role of metadata and GI in spatial planning and SDI, by Karel Charvat
The document discusses the role of metadata and geospatial information in spatial planning and spatial data infrastructures. It notes that spatial plans contain important information that should be included in INSPIRE themes and made interoperable. Metadata can help manage spatial planning processes by providing evidence of documents and tracking their status. Metadata profiles can also help catalog and discover spatial plans. Feature-level metadata is important to maintain traceability when spatial plan data is converted to other schemas or combined from multiple sources.
Plan4all Newsletter, Issue 4, December 2010, by plan4all
The newsletter provides information on the Plan4all project and its results. It introduces the Plan4all metadata profile, data model, and networking architecture. It also details an upcoming thematic workshop in Rome and introduces several project consortium partners, including CEIT ALANOVA and NASURSA. The high-level goal of the Plan4all project is to harmonize spatial planning data according to the INSPIRE Directive.
The document describes a software called EUROPOF that was developed to support land management projects in Portugal. EUROPOF integrates GIS, CAD, and DSS technologies to provide a single file for designing new land allocations. It allows quick and easy reallocation of land parcels compared to conventional tools. EUROPOF has been used successfully in over 16,000 hectares of land consolidation projects in Portugal, reducing project timelines by 25-30% and costs by 70-80%.
Precision Farming (PF) is introduced and its history is briefly reviewed. The essential activities of GPS locating, soil mapping, GIS data processing and presentation, and VRT application are described. The basic principles of PF are shown to be:
• Precision Farming is the management process of within-field variability.
• This management must bring profit, or at least reduce the risk of loss.
• This management must reduce the impact of farming on the environment.
Techniques used in Precision Farming are described, and the economics of Precision Farming is discussed. A general cost/benefit analysis and the profitability of PF are reviewed, along with the cost of PF adoption facing a farmer. Methods of process analysis and activity-based costing are shown to be useful instruments for PF process analysis and model building. The PF process is analysed and a process graph is developed.
This document provides an introduction to Geographic Information Systems (GIS) and Remote Sensing. It discusses what GIS and remote sensing are, their applications in civil engineering like transportation planning, terrain mapping, watershed analysis and environmental impact studies. It also outlines data sources for GIS like Natural Earth and Global Map. Finally, it discusses uses of GIS/remote sensing in business for industries like dairy, pest control, banking and electricity distribution.
Cutter et al. 2007: Systematic Conservation Planning Module, by Peter Cutter
This document provides an introduction to using decision support tools for conservation planning. It outlines the goals of gaining experience with conservation planning and tools like ArcGIS, Marxan, and CLUZ. The document describes the landscape and data for a case study in Oregon, including planning units, species occurrence data, and cost data. It provides overviews of the ArcGIS, Marxan, and CLUZ software, and includes step-by-step instructions for opening and navigating ArcGIS and initializing the CLUZ extension for conservation planning analyses.
S.2.e Specifications for Data Ingestion via Sunshine FTP, by SUNSHINEProject
This document provides specifications for ingesting pilot consumption and sensor data via FTP into Sunshine's data repository. It describes the required meter mapping file format and consumption/sensor data file formats, including naming conventions and metadata to include. Meter mappings relate devices to buildings and define reading types and frequencies. Data files group timestamped readings and costs into CSVs with specified naming and formatting.
First online hangout SC5: Big Data Europe first pilot presentation, by BigData_Europe
This document describes a pilot project to support data-intensive climate research. The pilot aims to provide researchers with an intuitive interface to search, download, and dynamically downscale climate model and observational data. It will orchestrate the downscaling process on institutional computational resources while managing data products and lineage. Currently, acquiring and preprocessing climate data for downscaling is an ad-hoc process. The pilot seeks to improve research productivity by facilitating efficient data access, model execution, and reuse of experiments through the Big Data Europe platform. It may help climate impact assessments in other societal challenge areas like energy, food, and agriculture.
Overview of the world of geospatial metadata, and the role of the EDINA service GoGeo in creating, saving, and discovering it. Presented on 19 June 2014 by Tony Mathys in Aberdeen, Scotland.
This document discusses how to create custom concurrent requests that are integrated into Oracle Applications Release 11. It describes three main types of concurrent requests - SQL*PLUS, PL/SQL, and host programs - and the steps to register each type. These steps include defining an executable, defining the concurrent program, assigning parameters, and assigning the program to a request group. It provides examples of SQL*PLUS and PL/SQL code for concurrent requests and discusses considerations for implementing each type.
INSPIRE Data Specification: Utility and Governmental Services v3.0, by Maksim Sestic
The INSPIRE Directive came into force on 15 May 2007 and will be implemented in various stages, with full implementation required by 2019. It aims to create a spatial data infrastructure which enables the sharing of spatial information among public sector organisations and facilitates public access to spatial information across Europe.
This document describes the results of the first stage of the Plan4all data deployment task. It provides details on the LandUse and LandCover conceptual models used, describes the harmonization process, and reports on data deployment in each partner region. Key outputs include transformed local data published using web maps and services following the common models. Lessons learned and recommendations for the next stage are also provided.
S.1.a Data Model for Energy Map Data Collection, by SUNSHINEProject
This document presents a data model for collecting building energy performance data as part of the Sunshine project. It includes classes, attributes, data types and domains for representing buildings and their characteristics. The model is based on INSPIRE specifications for buildings data. It defines two groups for basic building data needed for the project and an extended model with additional optional attributes. Mandatory attributes include a building ID, construction period, height, elevation, main use and energy performance validation data.
The document discusses functional modeling of air traffic control (ATC) and integration perspectives. It presents a model of the ATC information system that systematically maps managed data to functions and services. The model identifies necessary information for ATC tasks and provides a basis for integrated systems to jointly manage aviation data, increasing efficiency and safer operations. Key aspects covered include identifying ATC functions and data, assigning them to services, and analyzing relationships to develop a detailed system specification for aviation management.
TUM seminar: Specification of Usage Control Requirements, by Bibek Shrestha
The document summarizes Specification of Usage Control Requirements. It discusses two policy languages - Obligation Specification Language (OSL) and Ponder - that can be used to specify usage control requirements. OSL is designed specifically for obligations. It has a formal semantics defined in Z notation and allows translating between OSL and other rights expression languages. Ponder does not distinguish between provisions and obligations and can specify both. The document focuses on describing the basics and formalization of OSL, including how it defines events, traces, indexed events and uses a refinement relation to order events.
GIS Automation in Ground Water Analysis, by IJRESJOURNAL
This document describes a GIS tool developed to automate groundwater analysis in the Jaunpur Branch Sub Basin pilot area of India. The tool interpolates groundwater level data from over 475 piezometers using inverse distance weighting and produces outputs in raster maps and tables summarizing water levels for each of 146 Sub Irrigation Units. The outputs show changing groundwater trends over different months, helping analyze spatial and temporal variations. The automated tool is more efficient than previous manual methods and allows extracting information in various formats like maps, tables, and charts for monitoring and management purposes.
A Cost Estimation Model For Reuse Based Software Program, by IOSR Journals
This document presents a cost estimation model for a software reuse program. It proposes estimating costs at multiple levels - component, domain, application, and corporate. Costs include development, overhead, integration, and maintenance. The model is demonstrated using a hypothetical software company that develops reusable components from 2007-2010. Component size, availability, strategies, scale, standardization, and complexity are identified as factors that affect costs. Equations are provided to estimate costs at each level based on variables like component size, salary, reuse metrics, and probability of component availability. Graphs show relationships between various cost-affecting factors and total application costs.
This document provides an analysis of INSPIRE requirements that will inform the development of spatial planning standards and recommendations in the Plan4all project. It describes the project scope and objectives, outlines the INSPIRE directive and implementation process, analyzes key INSPIRE themes related to spatial planning, and provides recommendations for metadata, data models, and networking services based on INSPIRE. The analysis was informed by a seminar with external experts to review spatial data infrastructure and planning issues.
Similar to "Step by Step Mapping with HALE for INSPIRE"
The document provides instructions for creating forms in Geopaparazzi using the HortonMachine application. It describes how to:
1. Create sections, tabs, and widgets like text fields, checkboxes, and dropdowns within the form builder application to design a custom survey form for university buildings.
2. Add fields for general information like name, faculty, and number of enrolled students.
3. Add additional tabs for structural details and images.
4. Populate dropdowns by specifying options in the form builder.
5. Designate certain fields like name as mandatory for the survey.
The form can then be exported and used to collect geospatial data on university buildings using the Geopaparazzi application.
The HortonMachine library is an open source geospatial library focused on hydro-geomorphological analysis and environmental modeling. It provides tools to analyze environmental processes like floods, debris flows, wood floods, and landslides. The library is integrated into gvSIG and also available as a standalone suite of applications. It contains models for tasks like calculating maximum discharge, evaluating debris flow hazards, and predicting large wood transport during floods. Case studies demonstrate how the tools have been used to model past natural hazard events.
This document discusses the HortonMachine library, which is a set of open-source tools for modeling natural hazards. It was created by HydroloGIS, an engineering company in Bolzano, Italy. The HortonMachine library contains tools for hydrology, hydraulics, geomorphology, and forestry modeling. It allows users to analyze erosion processes, drainage networks, and landslides. The library contains commands organized into categories like basin analysis, DEM manipulation, and geomorphology. It provides functions for tasks like pit removal, calculating slope and curvature from a DEM, and extracting drainage networks.
Geopaparazzi and gvSIG allow for digital field mapping and data synchronization. Geopaparazzi is used on Android devices to collect geotagged notes, photos, and GPS tracks. gvSIG prepares background data and forms, and imports Geopaparazzi projects. The Geopaparazzi Survey Server syncs project data to a central server, accessible via a web interface.
Application of a pattern recognition algorithm for single tree detection from..., by silli
This document evaluates algorithms for detecting individual trees from LiDAR data. It compares local maxima (LM) algorithms on raster and point cloud data to a new pattern recognition algorithm based on geomorphons. All algorithms were tested on a study area in Italy containing different forest structures. The pattern recognition algorithm detected trees most accurately but the point cloud LM algorithm performed best overall. Particle swarm optimization calibration improved detection rates over manual calibration. The algorithms show potential for estimating forest parameters like volume from remote sensing data at large scales.
Geopaparazzi is a free and open source mobile application for collecting geospatial data and taking geotagged photos. It allows users to easily create notes, bookmarks, and track GPS data. Geopaparazzi supports a variety of basemaps and spatial databases. Data can be edited, imported, exported, and viewed in GIS software like gvSIG using plugins. Geopaparazzi has a variety of uses including engineering surveys, emergency management, and field data collection.
Geopaparazzi is a tool for digital field mapping on Android devices. It allows users to take georeferenced photos and notes during surveys and log GPS tracks. The main features include georeferenced notes, photos, sketches and forms; GPS track logging; map viewing of collected data; and import/export of projects. It provides an easy to use interface for collecting field data that can later be integrated into GIS applications.
The document discusses tools for modeling water supply systems using JGrassTools and gvSIG. It summarizes that JGrassTools is an open source geospatial library containing modules for vector/raster processing, geomorphology, forestry, and more. It also includes bindings to the EPANET hydraulic modeling library. A gvSIG plugin was developed to provide a graphical interface for preparing EPANET input data in a GIS environment. This allows generating shapefiles, synchronizing attributes, running EPANET simulations, and visualizing results on maps and charts within gvSIG without needing to install EPANET separately. The tools can help evaluate alternative management strategies to improve water system performance.
A simplified GIS-based model for Large Wood recruitment and connectivity in m..., by silli
This document describes a GIS-based tool called JGRASSTOOLS for modeling large wood recruitment and connectivity in mountain river basins. The tool uses inputs like digital terrain models, vegetation data, and channel networks to model three main processes: 1) wood recruitment from unstable slopes and hillslopes, 2) wood transport along the river network, and 3) identification of critical sections where wood may accumulate. The overall workflow involves modeling wood sources, propagation along the network, and identifying accumulation points to predict patterns of large wood recruitment and transport during flood events. Future improvements include enhancing the propagation modeling and connecting to LiDAR data.
Basic operations with Geopaparazzi (start, import, export), by silli
Geopaparazzi is a tool for digital field mapping on Android devices. It allows users to take geotagged photos and notes during surveys. Key features include geotagged notes, GPS tracking, map viewing, and easy export of collected data. The document provides instructions on installing Geopaparazzi from the Android Market, taking notes and photos with geotagging, viewing maps and data, and exporting data for use in GIS software.
This document describes LESTO, an open source toolbox for analyzing LiDAR data related to forestry. It contains modules for pre-processing LiDAR point cloud data, interpolating rasters, extracting buildings, separating flightlines, identifying individual trees, and estimating forest structure and biomass. The toolbox is released under an open source license and open to contributions related to LiDAR analysis. It is developed to help analyze a case study area in the Aurina Valley using field plot data and LiDAR collected in 2012.
The document discusses various geomorphological analysis tools available in the open-source software HortonMachine. It describes how HortonMachine can be used to analyze digital elevation models (DEMs), calculate terrain attributes, extract stream networks, and delineate catchment boundaries. Specific commands are mentioned for calculating flow directions, drainage networks, slope, curvature, catchment attributes and more. The goal is to provide quantitative and qualitative tools for understanding catchment morphology.
1.
HALE APPLICATION:
THE USE CASE OF THE PLU OF
THE PROVINCE OF TRENTO
ing. Silvia Franceschi
ing. Andrea Antonello
INSPIRE Conference, Firenze, 24th
June 2013
2.
THE USE CASE
This use case shows a real example of mapping and
transformation of the data of a regional land use plan
from the original (national) format to the requested
INSPIRE format.
3.
THE DATA
The plan is the PGUAP plan of the Province of Trento,
approved in February 2006. Two different updates of the
land use geometries are considered for the same plan.
Considered data:
● shapefile of land use: two updates of this layer are
considered, the most recent one, approved with a
regional decree of February 2013, and the previous
one, approved in July 2011
● shapefile of the hydro-geomorphological risk maps: only
the geometries of the latest update
● main official documentation
5.
PLU Province of Trento
The attribute table shows the list of attributes and some examples
of their content. The meaning of the attributes is:
● AREA: the area of each polygon
● PERIMETER: the perimeter of each polygon
● USO_POL_: a polygon ID kept for legacy databases
● USO_POL_ID: a polygon ID kept for legacy databases
● COD_TOT: the land use classification code
● PESOPOL: the weight of the object in the evaluation of the
hydro-geomorphological risk maps
● PLAN_FROM: the date of approval of the general plan
● AGGIORN_6: the date of the update of the geometries
● PLAN_NAME: the name of the plan
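The renaming step from these source attributes to INSPIRE properties can be sketched in a few lines. The target property names below come from the INSPIRE Planned Land Use schema; the exact source-to-target pairing is an assumption for illustration, not the mapping actually defined in HALE.

```python
# Hypothetical mapping from PGUAP shapefile attributes (left) to INSPIRE
# PLU property names (right). hilucsLandUse and validFrom belong to
# ZoningElement, officialTitle to the parent SpatialPlan.
ATTRIBUTE_MAP = {
    "COD_TOT": "hilucsLandUse",           # after HILUCS reclassification
    "PLAN_FROM": "validFrom",             # date of approval of the plan
    "AGGIORN_6": "beginLifespanVersion",  # date of the geometry update
    "PLAN_NAME": "officialTitle",
}

def rename_attributes(record):
    """Return a new record keyed by target property names.

    Source attributes with no target counterpart (AREA, PERIMETER,
    USO_POL_, USO_POL_ID, PESOPOL) are dropped, since geometry
    measures can be derived from the geometry itself.
    """
    return {target: record[source]
            for source, target in ATTRIBUTE_MAP.items()
            if source in record}

# Illustrative source record (values invented)
src = {"COD_TOT": "E1", "PLAN_FROM": "2006-02-15",
       "AGGIORN_6": "2013-02-01", "PLAN_NAME": "PGUAP", "AREA": 1234.5}
target = rename_attributes(src)
print(target)
```

In HALE the same effect is obtained interactively with rename/retype property mappings; the dict above only makes the correspondence explicit.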
7.
PLU TARGET SCHEMA
“The INSPIRE PLU conceptual schema corresponds to a dataset
that corresponds to a spatial planning document.
Geographic information, as well as the informative or
descriptive parts contained in a spatial planning
document, is taken into consideration in the LandUse
application schema.
Only the spatial planning documents that are or have to be
legally adopted by an authority and are opposable to third
parties are considered within INSPIRE.”
9.
PLU TARGET SCHEMA
A spatial planning document
corresponds to the featureType
SpatialPlan.
The SpatialPlan has specific
attributes for its name, the
dates of the legal documents, and
the level of the administrative
hierarchy at which it has been
adopted.
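As a minimal sketch, the SpatialPlan feature type just described can be modelled as a plain data class. Only the three attributes mentioned on this slide are included, and the code-list subset and the exact day of approval are assumptions for illustration.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class LevelOfSpatialPlan(Enum):
    """Illustrative subset of the levelOfSpatialPlan code list."""
    NATIONAL = "national"
    REGIONAL = "regional"
    LOCAL = "local"

@dataclass
class SpatialPlan:
    # The real INSPIRE feature type carries more attributes
    # (identifier, extent, plan documents, ...); this is a sketch.
    officialTitle: str
    validFrom: date
    levelOfSpatialPlan: LevelOfSpatialPlan

# The PGUAP plan of the Province of Trento, approved in February 2006
# (the day of the month is invented for the example).
pguap = SpatialPlan("PGUAP", date(2006, 2, 1), LevelOfSpatialPlan.REGIONAL)
print(pguap.officialTitle, pguap.validFrom.isoformat())
```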
10.
PLU TARGET SCHEMA
The zoning is composed of mutually exclusive
polygons. Zoning provides regulations on how land use
can evolve.
The featureType ZoningElement has several specific
attributes (nature of regulation, dimension, rules, …)
and carries the land use classification in the attribute
hilucsLandUse.
11.
PLU TARGET SCHEMA
Supplementary information delimits locations where a
specific regulation applies and supplements the
regulations of the zoning. It is
implemented in the featureType
SupplementaryRegulation.
12.
PLU TARGET SCHEMA
Official documentation is
in the featureType
OfficialDocumentation.
13.
HILUCS CLASSIFICATION
INSPIRE requires PLU data to be classified following the
categories of the HILUCS classification (Hierarchical
INSPIRE Land Use Classification).
The PLU of the Province of Trento uses a local
classification.
We reclassified all the local (specific) land
use classes to HILUCS and stored this correspondence in a
CSV file.
A classification operation in HALE can be done manually
on the fly or by loading a CSV file containing the
correspondence between the original classes and the HILUCS
classes.
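The lookup-with-fallback logic behind such a CSV-driven classification can be sketched in a few lines. This is an illustrative sketch, not HALE itself; the local codes and the CSV rows are invented, and only the fallback value 6_6_NotKnownUse comes from the slides:

```python
import csv
import io

# Invented correspondence CSV: local code -> HILUCS class
csv_text = """source,target
E1,1_1_Agriculture
D2,5_1_ResidentialUse
"""

# Build the lookup table from the correspondence file
lookup = {row["source"]: row["target"]
          for row in csv.DictReader(io.StringIO(csv_text))}

def to_hilucs(code, default="6_6_NotKnownUse"):
    # hilucsLandUse is mandatory, so unmapped codes fall back to a fixed value
    return lookup.get(code, default)

print(to_hilucs("E1"))   # 1_1_Agriculture
print(to_hilucs("XYZ"))  # 6_6_NotKnownUse
```

The fixed default mirrors the "fixed value for unmapped source values" option of HALE's classification function.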
15.
DOCUMENTATION
Another preliminary operation that helps during
the mapping is the creation of a library of information and
links for the documentation of the plan.
This information is stored in a CSV file containing the basic
properties required by the INSPIRE schema for
OfficialDocumentation:
● an identifier
● legislationCitation: a reference to the document that
contains the text of the regulation
● DocumentCitation: a citation of scanned plans and
structural drawings, sometimes georeferenced
and sometimes not (raster images, vector drawings or
scanned text).
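A hypothetical documents CSV with these columns might look as follows; the identifiers follow the Doc_* naming used later in the slides, while the URLs are invented placeholders:

```
INSPIREID,legislationCitation,documentationCitation
Doc_1,https://example.org/trento/regulation_2013.pdf,https://example.org/trento/plan_2013.zip
Doc_2,https://example.org/trento/regulation_2011.pdf,https://example.org/trento/plan_2011.zip
```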
18.
ALIGNMENT AND MAPPING
The very first operation in HALE is to load the source schema,
the source data and the target schema. This is straightforward:
select the corresponding element to import from the menu
File → Import → …
HALE can load source schemas from different sources, in
particular from shapefiles and CSV files.
Considering the source data and the target schema, several
feature types have to be considered in this example, in
particular:
● ZoningElement: contains the geometries of the
land use
● SpatialPlan: contains all the information related to the official plan
● OfficialDocumentation: contains the links and references for the
available documentation
● SupplementaryRegulation: contains the additional
regulations that supplement the zoning.
20.
ALIGNMENT AND MAPPING
The first type to map is the land use shapefile, which contains the
basic information for both ZoningElement and SpatialPlan.
All the mapping is based on the retype operation.
The retype function expresses that a source and a target type are
semantically equal: for each instance of the source type, an instance
of the target type is created.
Property relations only take effect in the context of a type relation:
first a type relation must be defined, then property relations between
the involved types can be specified.
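The retype semantics can be sketched as a plain loop: one target instance per source instance, with property relations filling in the target fields. This is an illustration, not HALE's API; the sample rows and mapping functions are invented stand-ins for uso_pol_part features:

```python
# Sketch of retype: one target instance per source instance
def retype(source_instances, property_mapping):
    """property_mapping: target property -> function of a source instance."""
    return [{prop: fn(src) for prop, fn in property_mapping.items()}
            for src in source_instances]

# Hypothetical rows standing in for uso_pol_part features
rows = [{"the_geom": "POLYGON((...))", "COD_TOT": "E1"},
        {"the_geom": "POLYGON((...))", "COD_TOT": "D2"}]

zoning = retype(rows, {
    "geometry": lambda r: r["the_geom"],        # rename the_geom -> geometry
    "specificLandUse": lambda r: r["COD_TOT"],  # rename COD_TOT -> specificLandUse
})
print(len(zoning))  # 2: one ZoningElement per source polygon
```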
21.
ZoningElement
●do a retype mapping between uso_pol_part and ZoningElement
●do a rename mapping between the_geom and geometry (choose
structural rename)
●do a date extraction between attribute AGGIORN_6 and
validFrom
●do a rename between COD_TOT and specificLandUse
●create an Inspire Identifier using the information in USO_POL_ID
to generate inspireId
22.
ZoningElement
● assign the other mandatory properties:
● add an assign to regulationNature (definedInLegislation)
● add a generate unique ID for id in ZoningElement
● add an assign to hilucsLandUse.codeSpace with
http://inspire.ec.europa.eu/codeList/HILUCSValue
● add a reclassification using the CSV list between COD_TOT and
hilucsLandUse: since hilucsLandUse is a mandatory property it
cannot be null, so it is important to set a fixed value to use for
unmapped source values, for example 6_6_NotKnownUse
● add a Groovy script to assign the validTo property to the
features of the two different updates: validTo is
defined as the date before the date of the new update, so it is
based on the AGGIORN_6 attribute of the land use shapefile,
and the syntax is the following:
if(AGGIORN_6.equals("2011-07-18")) { return "2013-02-26"; } else { return null; }
27.
SpatialPlan
Considering the same source type it is possible to map the
SpatialPlan.
In this case a merge operation has to be used to set a relation
between the types.
The merge function merges multiple instances of the source type into
one instance of the target type, based on one or more matching
properties.
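The merge semantics can be sketched as grouping by the matching property: all source instances that share its value collapse into one target instance. Again this is an illustration rather than HALE's API, and the plan name is an invented example value:

```python
# Sketch of merge: collapse source instances by a matching property
def merge(source_instances, match_key):
    merged = {}
    for src in source_instances:
        # keep one representative instance per value of the matching property
        merged.setdefault(src[match_key], src)
    return list(merged.values())

rows = [{"PLAN_NAME": "Piano Regionale", "COD_TOT": "E1"},
        {"PLAN_NAME": "Piano Regionale", "COD_TOT": "D2"}]

plans = merge(rows, "PLAN_NAME")
print(len(plans))  # 1: all polygons of the same plan yield one SpatialPlan
```

Because every land use polygon carries the same PLAN_NAME, the whole shapefile maps to a single SpatialPlan instance.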
28.
SpatialPlan
● do a merge mapping between the land use shapefile uso_pol_part
and SpatialPlan using PLAN_NAME
● use a date extraction to associate the attribute PLAN_FROM with
the property validFrom
● create an Inspire Identifier using PLAN_NAME as reference to
generate inspireId
● do a rename between PLAN_NAME and officialTitle
● do a rename between PLAN_NAME and id of SpatialPlan
● do a compute extent from the geometry of the shapefile the_geom
to extent in SpatialPlan (bounding box)
● add some additional assignments for the mandatory fields of
SpatialPlan
● add an assign to levelOfSpatialPlan (regional)
● add an assign to planTypeName (PianoRegionale)
30.
OfficialDocumentation
A very important requirement in the INSPIRE schema for PLU is the link
and reference to the official documentation of the plan. The basic
operation to start with here is retype.
● do a retype between documents and OfficialDocumentation
● do a rename between documentationCitation and planDocument →
DocumentCitation → DocumentCitation → link (choose structural
rename)
● create an Inspire Identifier using the information contained in
INSPIREID to generate inspireId
● do a rename between INSPIREID and id of OfficialDocumentation
● do a rename between legislationCitation and LegislationCitation →
LegislationCitation → link (choose structural rename)
● add the assignments for the missing mandatory properties
● add an assign to legislationCitation → level (SubNational)
● add a generate unique ID for id in LegislationCitation
● add a generate sequential ID for name in LegislationCitation
(Official_Document_)
● add a generate unique ID for id in DocumentCitation
● add an assign to name in DocumentCitation (Trentino Planned Land Use)
33.
LINK BETWEEN TYPES
Link between ZoningElement and SpatialPlan
Since all land use polygons are part of one plan, we can directly use
the assign operation to set plan → href in ZoningElement to the
name of the reference plan.
Link between SpatialPlan and OfficialDocumentation
Since all the documents included in the OfficialDocumentation are
related to the plan, each of them has to be linked to SpatialPlan →
officialDocumentation → href.
In this case we have to create two more instances of the same
property officialDocumentation and assign to each of them the link to
a document, using # before the name of the document (e.g.
#Doc_3)
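In the encoded GML, such links typically become xlink:href attributes pointing at in-document gml:id values. A sketch of how the three document references might appear (namespace prefixes and nesting are simplified; only the Doc_* IDs come from the slides):

```xml
<!-- sketch only: simplified encoding of the officialDocumentation links -->
<plu:officialDocumentation xlink:href="#Doc_1"/>
<plu:officialDocumentation xlink:href="#Doc_2"/>
<plu:officialDocumentation xlink:href="#Doc_3"/>
```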
34.
LINK BETWEEN TYPES
Link between ZoningElement and OfficialDocumentation
Use a Groovy script to assign the reference to the
officialDocumentation of the features in ZoningElement. Since
ZoningElement contains two different updates of the land use
polygons, the official documentation differs between the two; the
choice is based on the update date contained in the attribute
AGGIORN_6.
The two entities are AGGIORN_6 and officialDocument → href of
ZoningElement; the script is:
if(AGGIORN_6.equals("2011-07-18")) { return "#Doc_2"; } else { return
"#Doc_1"; }
where #Doc_1 and #Doc_2 are the references to the IDs of the two
documents as inserted in the OfficialDocumentation.
36.
SupplementaryRegulation
The last information to be mapped is the additional regulation on hydro-
geomorphological risk mapping. This information is stored in another
shapefile, and the retype operation is used to map the contained elements
to SupplementaryRegulation.
● do a retype mapping between rispol_part and SupplementaryRegulation
● do a rename mapping between the_geom and geometry (choose
structural rename)
● create an Inspire Identifier using RISPOL_ID as reference to generate
inspireId
● do a rename between RISPOL_ID and id of SupplementaryRegulation
● do a date extraction between the attribute UPDATE and validFrom
● do a rename between CLASSE and specificSupplementaryRegulation →
codeSpace
● add some additional assignments for the mandatory fields of
SupplementaryRegulation
● add an assign to regulationNature (generallyBinding)
● add an assign to supplementaryRegulation (2_1_2_FloodRiskManagementZone)
● add an assign to supplementaryRegulation → codeSpace with
http://inspire.ec.europa.eu/codeList/SupplementaryRegulationValue
37.
LINK BETWEEN TYPES
Link between SupplementaryRegulation and SpatialPlan
Since all risk zone polygons are part of one plan, we can directly use
the assign operation to set plan → href in
SupplementaryRegulation to the name of the reference plan.
Link between SupplementaryRegulation and
OfficialDocumentation
Since all risk zone polygons belong to the same update of the land
use plan, we can directly use the assign operation to set
officialDocument → href in SupplementaryRegulation to the name of the
reference documentation (#Doc_1).
- Why not simply with GIS? The tablets support everything. It is about the interpretation, history and uncertainty of the survey. DFM is not just about surveying and bridging the re-digitalization of field data: it is about surveying data, persisting as far as possible all the data that lead to the final map.