The document introduces Takahiro Miyoshi and his work using GPS and GIS technologies for rural development in Zambia and Japan, including his background and experience. It then outlines an agenda for a workshop on using GPS and GIS, with sections on basics, applications for rural development projects, and exercises.
This document discusses the development of software tools called Global Soil Information Facilities (GSIF) for global soil mapping. It describes existing GSIF components, such as global soil databases, and proposes new modules for tasks like data entry, harmonization, spatial analysis, and visualization. Key proposed packages include the Global Soil Mapping package, plotKML for visualization, and the Soil Reference Library package. The document outlines the status of current work and next steps, such as releasing initial packages and continuing development through user feedback. It encourages participation in the GSIF workshop to help develop the software's functionality.
Hengl & Reuter poster at Geomorphometry.org/2011 (Tomislav Hengl)
This document proposes the creation of an open database of digital elevation model (DEM) derivatives from around the world. The database would provide precision, be multi-scale, have an open structure, and provide web access to DEM data and derived products like slope, aspect, and drainage patterns. It would support geomorphometry research through standardized algorithms and allow testing and comparison of methods. The global collection of DEM data and derivatives could advance knowledge and become a platform for improving data standards over time.
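As an illustration of the derived products such a database would serve, slope and aspect can be computed from a DEM grid with simple finite differences. This is a minimal sketch on a made-up tilted plane, not code from the document; production geomorphometry packages use more robust stencils (e.g. Horn's method):

```python
import numpy as np

def slope_aspect(dem, cellsize=1.0):
    """Compute slope (degrees) and aspect (degrees, downslope direction)
    from a DEM grid using central finite differences."""
    dz_dy, dz_dx = np.gradient(dem, cellsize)       # elevation change per cell
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect

# A plane rising 1 m per 1 m cell toward increasing x has a 45-degree
# slope everywhere.
dem = np.tile(np.arange(5, dtype=float), (5, 1))
slope, aspect = slope_aspect(dem)
print(round(float(slope[2, 2]), 1))  # → 45.0
```

Note that aspect conventions (north reference, raster row direction) vary between packages; the formula above is one common choice, not a standard the document prescribes.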
Uncertainty Analysis and Data Assimilation of Remote Sensing Data for the Cal... (Hans van der Kwast)
This document discusses using remote sensing data and data assimilation techniques to calibrate cellular automata based land-use models. It aims to improve land-use simulations by lowering uncertainties compared to other automatic calibration methods. A simplified MOLAND land-use model for Dublin is used to test error propagation modeling and a particle filtering data assimilation approach. Preliminary results seem promising, but differences between spatial metrics from remote sensing and model outputs may hamper the analysis. The overall goal is to develop robust and reliable tools for land-use change modeling and calibration to inform policy contexts.
Kevin Byrne’s Presentation: Sustainability Storyboarded and Geovisualized Acr... (J. Kevin Byrne)
This document summarizes a presentation on sustainability storyboarded and geovisualized across three scales. It includes tables highlighting anniversaries in geovisualization and sustainable development, and tables describing research on water, ecological, and curbside-recycling footprints at different scales and media complexities. Maps and figures analyze recycling participation data in a neighborhood. Finally, it advertises a QuickTime movie analyzing scale and complexity scores for sustainability initiatives.
This document summarizes a workshop presented by PCI Geomatics on applications using Kompsat-5 satellite data. PCI Geomatics develops geospatial software and has supported Kompsat-5 data in their software for several years. The workshop consisted of demonstrations of orthorectifying Kompsat-5 data, land use classification, change detection between images, and fusing Kompsat-5 radar data with optical Landsat images. It provided an overview of current Kompsat-5 support in PCI Geomatics software and capabilities for processing the different acquisition modes.
Google Analytics is a free tool from Google that offers a wealth of information about your website's visitors. Where do they come from, how long do they stay on your site, which pages do they view, and which search terms do they use? It is great to have that data, but how do you then make sure you do something useful with it? Web analyst Gabriël Ramaker shows the way through Google's jungle of statistics.
This document offers advice on dealing with adolescent rebellion. It explains that rebellion is normal during adolescence, since young people are seeking independence. It identifies four types of rebellion (regressive, aggressive, transgressive, and progressive) with different causes and characteristics. It also points to factors such as stress, excessive discipline, and imitation of impulsive parents as causes of abnormal rebellion. Finally, it recommends showing affection, listening, respecting privacy, and having
Africa is a diverse continent with varied landscapes ranging from coastal areas to deserts to tropical rainforests. The continent has over 50 countries and over 1,000 distinct ethnic groups that together speak over 2,000 languages. Despite challenges such as poverty and political instability in some areas, Africa has many developing economies and is working to address issues through initiatives in fields such as education, health care, and infrastructure.
IBL Online Services provides online business promotion and web services. They aim to empower customers to achieve their dreams through digital media and online marketing. Their franchise development program allows associates to earn income from direct sales, business listings, meeting revenue targets, and renewals. Associates receive training and marketing support, and can earn quarterly and annual bonuses based on revenue targets. The franchise requires an investment of Rs. 1-3 lakhs and the associate's own office space, enabling associates to develop their business independently and profitably through IBL's online platform and support.
The document discusses the results of a study on the effects of exercise on memory and thinking abilities in older adults. The study found that regular exercise can help reduce the decline in thinking abilities that often occurs with age. Older adults who exercised regularly performed better on cognitive tests, and brain scans showed greater activity in areas important for memory and learning compared with less active peers.
Facebook. With more than 900 million (!) users, by far the largest website in the world. People of all ages and from all walks of life spend hours on Facebook every day. Because it is genuinely fun. And an extremely powerful medium for reaching your target audience. How? Concept developer Brenda Dekkers gives an inspiring workshop on exactly that. We are still looking for a volunteer whose Facebook campaign can be devised during the workshop.
Matter exists in various states and is composed of fundamental particles. It can be combined to form different elements, molecules, and compounds with unique properties. Matter can transition between solid, liquid, and gas states depending on temperature and pressure. Nuclear reactions can convert matter into energy according to Einstein's mass-energy equivalence formula E = mc². Antimatter is a substance that annihilates with normal matter to produce energy. Matter in all its various forms is essential for use in small-scale industries through machinery, tools, storage, transportation, and packaging.
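As a quick worked example of the mass-energy formula mentioned above (the values are standard physical constants, not figures from the document):

```python
# E = m * c**2: the energy released if one gram of matter were
# converted entirely to energy.
c = 299_792_458.0   # speed of light in m/s (exact, by definition)
m = 0.001           # mass in kg (one gram)
E = m * c ** 2      # energy in joules
print(f"{E:.3e} J")  # roughly 9.0e13 J, on the order of 21 kilotons of TNT
```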
The document discusses weather, climate, and how animals adapt to different climates. It defines weather as the daily conditions of temperature, humidity, and other atmospheric factors in a place, while climate refers to the average weather patterns over a long period, like 25 years. Animals in different climates, such as polar and tropical regions, have adapted traits that help them survive the extreme conditions, like thick fur for polar bears and the ability to climb trees for red-eyed frogs.
Google Analytics is a free tool from Google that offers a wealth of information about your website's visitors. But how do you use it to increase the number of conversions on your website?
TELUS Case Study: iVAULT implementation improved corporate intelligence (eventspat)
This document summarizes a webinar about TELUS's implementation of the iVAULT content management system. Some key points:
- iVAULT was implemented to improve TELUS's corporate intelligence by creating a centralized spatial data store and new FieldView application. This consolidated data from disparate legacy systems and maps like ArcGIS, MapGuide, and Google Maps.
- The new architecture included an Oracle spatial data store replicating TELUS infrastructure data from their Intergraph Framme system. This cleaned up issues and standardized the data.
- A new FieldView application was developed to provide customized analysis tools for various departments through a web interface on both desktop and mobile.
- The
This document provides guidelines for generating spatial datasets for a hydrology project in India. It outlines 13 themes that will be mapped, including land use, soils, geology, geomorphology, administrative boundaries, hydrologic units, settlements, transportation, drainage, and contours. Standards are provided for mapping scale, projection, accuracy, and database organization. Spatial data will be generated through interpretation of satellite imagery and digitization of existing paper maps. Resulting data will be integrated into surface and groundwater data centers in participating states.
The document provides course descriptions and prerequisites for training sessions at NCTC in Shepherdstown, WV on June 14, 2011. The first course is on ArcGIS 10 and requires a basic understanding of ArcGIS 9.x. It will cover new features, tools, editing workflows and more. The second course is on coordinate systems and best GPS practices, requiring experience with GPS and GIS software. It will cover aligning GPS and GIS data, selecting appropriate coordinate systems and datums, and best practices for GPS data collection. The last course listed is an internet mapping overview with no prerequisites that will discuss trends in internet mapping and how to communicate geographic data online.
The document discusses how the CSTI methodology enables automated ANSI-748 EVMS compliance within Microsoft Project. It provides tools to manage EV budgets and forecasts within MSP and seamlessly integrate them into an organization's EVMS. The tools automate earned value calculations in MSP, export MSP data to an EVMS, and provide features like profiling, tracing and notifications to streamline the EV reporting cycle. The benefits include increased efficiency, simplified processes, and setting industry standards for program scheduling and analysis.
Crude-Oil Scheduling Technology: moving from simulation to optimization (Brenno Menezes)
Scheduling technology, whether commercial or homegrown, in today's crude-oil refining industry relies on complex simulation of scenarios in which the user alone is responsible for making many decisions manually while searching for feasible solutions over a limited time horizon, i.e., trial-and-error heuristics. As a typical outcome, schedulers abandon these solutions and return to their simpler spreadsheet simulators because of (i) the time-consuming effort required to configure and manage numerous scheduling scenarios, and (ii) the need to keep updating premises and situations that change constantly. Moving toward solutions based on optimization rather than simulation, the lecture describes the next steps in refactoring the scheduling technology at PETROBRAS, treating separately the graphical user interface (GUI) and data-communication developments (non-modeling related) from the modeling and process-engineering work, aiming at automated decision-making with built-in problem-representation facilities and integrated data handling, among other techniques, in a smart scheduling frontline.
FME World Tour 2015 - Around the World - Ken Bragg (IMGS)
The document discusses how Pelmorex leverages FME Cloud to generate over 880,000 web map tiles from meteorological data every 12 hours in near real time. Using AWS services such as S3, SQS, and Lambda to dynamically provision compute capacity, Pelmorex processes this large data volume more cost-effectively than it could with on-premises servers, producing the time-sensitive map tiles faster and at a lower annual cost than maintaining its own hardware infrastructure.
TELUS Case Study: GIS for Telecommunications (eventspat)
This document describes how TELUS implemented an iVAULT system to improve access to and use of their spatial data. Key points:
1. TELUS integrated their disparate GIS systems and data into a single iVAULT system with a spatial data store, allowing unified access for field users and departments.
2. The iVAULT system included a new FieldView application for viewing, searching, analyzing and editing spatial and attribute data via web and mobile.
3. The spatial data store cleaned up TELUS' IMAGE database and consolidated over 1,000 design files, improving data quality and access.
4. The unified system allows TELUS to better analyze customer and network data
This document provides an overview of a workshop on land cover mapping using high resolution satellite images, OpenStreetMap data, and open source software tools. The workshop will involve preprocessing SPOT satellite imagery and OSM data, performing supervised image classification, and comparing the classification results to OSM features to identify areas for updating OSM. Key steps include extracting relevant OSM features to use as training data, preprocessing images, computing indices like NDVI, training and applying a classification algorithm, and assessing accuracy by comparing to OSM polygons. The goal is to demonstrate an approach for leveraging OSM as reference data for land cover mapping with satellite imagery.
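Two steps from that workflow, computing NDVI from red and near-infrared bands and classifying pixels against labelled (e.g. OSM-derived) training samples, can be sketched as below. The band values, class names, and the nearest-centroid classifier are illustrative stand-ins; the workshop's actual SPOT preprocessing, tooling, and accuracy assessment are not shown:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index, (NIR - red) / (NIR + red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-9, None)

def nearest_centroid_classify(features, train_feats, train_labels):
    """Assign each pixel the label of the closest class mean in feature space."""
    classes = sorted(set(train_labels))
    centroids = np.array([train_feats[np.array(train_labels) == c].mean()
                          for c in classes])
    idx = np.argmin(np.abs(features[..., None] - centroids), axis=-1)
    return np.array(classes)[idx]

# Toy 2x2 scene: top row vegetated (high NIR), bottom row built-up.
nir = np.array([[200, 210], [60, 55]], dtype=float)
red = np.array([[50, 40], [70, 80]], dtype=float)
nd = ndvi(nir, red)

# Training pixels: high NDVI labelled "veg", low NDVI labelled "built".
train_feats = np.array([0.7, 0.65, -0.1, -0.15])
train_labels = ["veg", "veg", "built", "built"]
print(nearest_centroid_classify(nd, train_feats, train_labels))
```

A real run would use many more training pixels extracted from OSM polygons and a stronger classifier, but the structure (index computation, training, per-pixel prediction) is the same.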
Tim Malthus: Towards standards for the exchange of field spectral datasets (TERN Australia)
This document discusses the development of standards for the exchange of field spectral datasets. It notes the importance of metadata for determining the quality and representativeness of spectral data obtained in the field. A workshop was held in 2012 to discuss best practices for data collection and exchange; key conclusions included the need for standards to enable accurate comparison across studies and the role of thorough metadata. Work is ongoing to enhance the SPECCHIO system for hosting spectral libraries and metadata and to establish it as the international tool for storage and exchange of spectral datasets.
This document provides an overview of a geospatial metadata and spatial data workshop. It discusses the importance of metadata for discovering and managing spatial datasets. It introduces common geospatial metadata standards like FGDC, ISO 19115, and INSPIRE and the concept of application profiles. The document outlines tools and resources for UK academics to create and publish metadata, including the UK AGMAP profile, Geodoc editor, GoGeo portal, and ShareGeo repository. Hands-on sessions demonstrate using these resources to generate metadata and access open spatial data.
Presented by Tony Mathys at a Current Issues and Applications of the Geospatial Technologies Lecture, Department of Geography and Environment, Aberdeen University, 24 February 2012
PCI Geomatics is a leading geospatial software and solutions company with over 70 employees and 25,000 licenses installed worldwide. They provide powerful and scalable image processing solutions to extract information from satellite imagery such as SAR (Synthetic Aperture Radar) and LIDAR. Their capabilities include orthorectification, image classification, change detection, and digital elevation model extraction. They support a variety of sensors and applications in areas such as maritime surveillance, disaster response, and natural resource monitoring.
The document summarizes a workshop on geospatial metadata and spatial data. It discusses the importance of metadata for discovering and managing spatial datasets. It introduces geospatial metadata standards like FGDC, ISO 19115, and INSPIRE. It also describes the UK AGMAP profile, Geodoc metadata editor tool, GoGeo portal, and ShareGeo open data repository for sharing spatial resources in academia. Hands-on sessions demonstrate creating metadata and accessing datasets.
The document summarizes a workshop on geospatial metadata and spatial data. It discusses the importance of metadata for managing and sharing spatial datasets, providing key information about the data. It also covers metadata standards like FGDC, ISO 19115, and application profiles. The workshop includes presentations on the UK Academic Geospatial Metadata Application Profile and tools for creating metadata like the Geodoc Metadata Editor and Go-Geo portal.
This document provides an overview of a geospatial metadata and spatial data workshop held at the University of Oxford. The workshop covered topics such as metadata standards, application profiles, geospatial metadata tools and portals for sharing spatial data and metadata. Hands-on sessions demonstrated how to create metadata using the Geodoc Metadata Editor tool and access spatial data repositories through the Go-Geo portal and ShareGeo open data portal.
IRJET - An Efficient Technique to Improve Resources Utilization for Hadoop Mapr... (IRJET Journal)
This document discusses techniques to improve resource utilization for Hadoop MapReduce in heterogeneous systems. It proposes implementing classification at the job level using SVM to assign jobs to appropriate nodes. It also proposes using the PRISM fine-grained scheduling algorithm to schedule tasks at the phase level to increase parallelism and reduce job running times by 10-30%. Finally, it proposes a dynamic slot configuration algorithm to optimize the number of map and reduce slots on each node. The authors claim these techniques improved performance and reduced running times by up to 30% depending on job characteristics.
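The dynamic slot-configuration idea can be illustrated with a toy calculation: split each node's slot capacity between map and reduce tasks in proportion to the pending workload. The proportional formula below is an assumption for illustration only, not the paper's actual algorithm:

```python
def dynamic_slots(total_slots, pending_map, pending_reduce):
    """Split a node's slots between map and reduce tasks by pending demand."""
    pending = pending_map + pending_reduce
    if pending == 0:
        # No backlog: fall back to an even split.
        return total_slots // 2, total_slots - total_slots // 2
    map_slots = round(total_slots * pending_map / pending)
    # Keep at least one slot per phase so neither side starves.
    map_slots = min(max(map_slots, 1), total_slots - 1)
    return map_slots, total_slots - map_slots

print(dynamic_slots(8, pending_map=30, pending_reduce=10))  # → (6, 2)
```

Reconfiguring slots as the map phase drains and the reduce phase fills is what lets a heterogeneous cluster keep all nodes busy, which is the effect the reported 10-30% runtime reductions depend on.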
This document proposes a system to implement common image processing routines for both single and distributed processors. It will develop algorithms for routines like thresholding, brightness/contrast adjustments, inversion, smoothing, edge detection, and morphological operations. The system will execute the routines sequentially on one processor and in parallel across multiple processors. It will analyze and compare the execution times to evaluate the performance benefits of single versus distributed computing for image processing.
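One of the listed routines (thresholding) can be sketched in both sequential and partitioned form. Thread workers are used here only to show the row-partitioning structure; the speedups such a system would measure come from separate processors or machines, and the image values are made up:

```python
from concurrent.futures import ThreadPoolExecutor

def threshold_row(row, t=128):
    """Binarize one row of pixel intensities at threshold t."""
    return [255 if px >= t else 0 for px in row]

def threshold_sequential(image, t=128):
    return [threshold_row(row, t) for row in image]

def threshold_parallel(image, t=128, workers=4):
    # Each worker thresholds a subset of rows independently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda r: threshold_row(r, t), image))

image = [[10, 200, 128], [127, 0, 255]]
assert threshold_sequential(image) == threshold_parallel(image)
print(threshold_sequential(image))  # → [[0, 255, 255], [0, 0, 255]]
```

Because each row is independent, the routine is embarrassingly parallel; comparing the two execution times over large images is exactly the kind of measurement the proposed system would report.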
ArcGIS Pro, ArcGIS Online, Story Maps, Web App Builder
Language Skills
English: Fluent in Speaking, Reading and Writing
Tok Pisin: Fluent in Speaking
Hiri Motu: Basic Speaking
Other Skills
Project Management, Training, Mentoring, Report Writing, Presentation Skills, Analytical Skills,
Environmental Awareness, Health Safety Environment (HSE) Awareness, Leadership, Teamwork,
Communication, Problem Solving, Adaptability, Attention to Details, Self-Motivated, Self-Starter,
Organized, Meticulous, Hardworking, Reliable, Honest and
Performance of Weighted Least Square Filter Based Pan Sharpening using Fuzzy ...IRJET Journal
This document proposes a new algorithm for pan sharpening images that combines weighted least squares filtering with fuzzy logic. It summarizes previous research on image fusion techniques like principal component analysis and discrete cosine transformation. The proposed algorithm applies fuzzy logic to evaluate membership functions and attain a pan sharpened image. It then uses a weighted least squares filter for pan sharpening. The algorithm is implemented in MATLAB and evaluated based on metrics like root mean square error, peak signal-to-noise ratio, and mean square error. Results show the proposed technique improves upon existing methods by reducing errors and increasing quality measurements of the fused images.
Similar to Official and crowdsourced geospatial data integration (20)
Este documento presenta una innovación educativa sobre webmapping. El objetivo es aprender herramientas y casos de aplicación de webmapping. Explica brevemente la evolución de los mapas a los SIG y webmapping. Luego describe los conocimientos previos y recursos educativos abiertos que se usarán, incluyendo cursos en Coursera y el uso práctico de herramientas de webmapping.
Semantic integration of authoritative and VGIJimena Martínez
This document discusses integrating authoritative and volunteered geographic information using an ontological approach. It presents the problems caused by semantic heterogeneity between different data sources. The proposed approach uses a domain ontology and R2RML mappings to provide a common conceptualization and allow flexible integration of datasets in RDF. Current work is analyzing semantic heterogeneity in OpenStreetMap tags by studying how tags used for real-world features vary over spatial scale and time. Future work includes developing the ontology further and creating a user interface for the R2RML mappings.
Taller de aplicaciones de software libre para las humanidades y acciones huma...Jimena Martínez
Este documento describe un taller sobre aplicaciones de software libre para proyectos humanitarios y de las humanidades. El taller será impartido por Geomun2, una asociación dedicada a la promoción del software libre y las tecnologías de la información geográfica. El taller mostrará herramientas útiles como SIG, presentará proyectos de éxito y realizará ejercicios prácticos utilizando datos y software libre.
Ampliando los horizontes de la formación relacionada con las Tecnologías de l...Jimena Martínez
Este documento propone ampliar el acceso a las Tecnologías de la Información Geográfica (TIG) mediante dos líneas de actuación: 1) Ofrecer formación básica en TIG con un enfoque amplio de aplicaciones para acercar los mapas a personas sin conocimientos previos. 2) Apoyar técnicamente proyectos sociales, culturales y solidarios que requieran TIG para su desarrollo. Se detallan diversos talleres y cursos de formación, así como un proyecto de mapeo colaborativo de fos
E-learning SINFOGEO: a new e-learning paradigm in Geographic Information Tech...Jimena Martínez
E-learning SINFOGEO is an e-learning platform about Geographic Information Technologies, with the aim to offer a better, practical and useful knowledge of the Geo Technologies to our students.
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Trusted Execution Environment for Decentralized Process MiningLucaBarbaro3
Presentation of the paper "Trusted Execution Environment for Decentralized Process Mining" given during the CAiSE 2024 Conference in Cyprus on June 7, 2024.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
A Comprehensive Guide to DeFi Development Services in 2024Intelisync
DeFi represents a paradigm shift in the financial industry. Instead of relying on traditional, centralized institutions like banks, DeFi leverages blockchain technology to create a decentralized network of financial services. This means that financial transactions can occur directly between parties, without intermediaries, using smart contracts on platforms like Ethereum.
In 2024, we are witnessing an explosion of new DeFi projects and protocols, each pushing the boundaries of what’s possible in finance.
In summary, DeFi in 2024 is not just a trend; it’s a revolution that democratizes finance, enhances security and transparency, and fosters continuous innovation. As we proceed through this presentation, we'll explore the various components and services of DeFi in detail, shedding light on how they are transforming the financial landscape.
At Intelisync, we specialize in providing comprehensive DeFi development services tailored to meet the unique needs of our clients. From smart contract development to dApp creation and security audits, we ensure that your DeFi project is built with innovation, security, and scalability in mind. Trust Intelisync to guide you through the intricate landscape of decentralized finance and unlock the full potential of blockchain technology.
Ready to take your DeFi project to the next level? Partner with Intelisync for expert DeFi development services today!
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
6. Ideas and approaches to help build your organization's AI strategy.
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Digital Marketing Trends in 2024 | Guide for Staying AheadWask
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
FREE A4 Cyber Security Awareness Posters-Social Engineering part 3Data Hops
Free A4 downloadable and printable Cyber Security, Social Engineering Safety and security Training Posters . Promote security awareness in the home or workplace. Lock them Out From training providers datahops.com
This presentation provides valuable insights into effective cost-saving techniques on AWS. Learn how to optimize your AWS resources by rightsizing, increasing elasticity, picking the right storage class, and choosing the best pricing model. Additionally, discover essential governance mechanisms to ensure continuous cost efficiency. Whether you are new to AWS or an experienced user, this presentation provides clear and practical tips to help you reduce your cloud costs and get the most out of your budget.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Skybuffer AI: Advanced Conversational and Generative AI Solution on SAP Busin...Tatiana Kojar
Skybuffer AI, built on the robust SAP Business Technology Platform (SAP BTP), is the latest and most advanced version of our AI development, reaffirming our commitment to delivering top-tier AI solutions. Skybuffer AI harnesses all the innovative capabilities of the SAP BTP in the AI domain, from Conversational AI to cutting-edge Generative AI and Retrieval-Augmented Generation (RAG). It also helps SAP customers safeguard their investments into SAP Conversational AI and ensure a seamless, one-click transition to SAP Business AI.
With Skybuffer AI, various AI models can be integrated into a single communication channel such as Microsoft Teams. This integration empowers business users with insights drawn from SAP backend systems, enterprise documents, and the expansive knowledge of Generative AI. And the best part of it is that it is all managed through our intuitive no-code Action Server interface, requiring no extensive coding knowledge and making the advanced AI accessible to more users.
Skybuffer AI: Advanced Conversational and Generative AI Solution on SAP Busin...
Official and crowdsourced geospatial data integration
1. OFFICIAL AND CROWDSOURCED GEOSPATIAL DATA INTEGRATION
Searching for solutions to improve cartography updating processes
By Jimena Martínez
Supervisors: Antonio Vázquez and Marianne de Vries
2. Table of contents
Background
Problems
The idea
The steps to develop the idea, and an example to show it
3. Background
Comparison of the datasets:
- BCN200: scope National (Spain); cartography units: provinces; scale 1/200.000; updating cycle 2 years; budget 300.000 €
- BTN25: scope National (Spain); cartography units: sheets; scale 1/25.000; updating cycle 4 years; budget 3.500.000 €
- BTA5: scope Local (Spanish provinces); cartography units: sheets; scale 1/5.000; updating cycle 4 years; budget 800.000 €
- MGCP: scope International (Africa, Middle East); cartography units: cells (208; Spain: 6 countries); scale 1/50.000; updating cycle 4 years; Spanish budget 27.000.000 €
4. Problems
1. Why is official cartography never up to date?
Update process timeline (2011 version):
- Dec. 2010: 1st real change on the ground
- Feb. 2011: satellite/aerial images collecting date
- May 2011: 2nd real change on the ground
- Dec. 2011: release date; the official data reflects only the 1st change
5. Problems
2. Why is the updating process so long and expensive?
Traditional updating process:
- Start from the vector cartography of the previous year.
- Compare it against a set of data sources (images, maps, raster, vector).
- Review the whole cartography unit.
- Result: too much time spent reviewing, not much time left to edit features.
6. Problems
2. Why is the updating process so long and expensive?
Madrid case (1/200k):
- Time to update: 4 weeks, 1 person
- Features edited: 30%
- Time to edit these features: 1.5 weeks
- Would it be possible to save the other 2.5 weeks?
7. Problems
1. Why is official cartography never up to date?
- The traditional process is based on different data sources.
- Those data sources have different collecting dates.
2. Why is the updating process so long and expensive?
- Reviewing the whole cartography against different data sources is needed to detect changes.
As a result:
- A long process
- An expensive process
- A result that is not always useful (when highly updated cartography is needed)
8. The idea
To develop a general methodology to decide whether crowdsourced (OpenStreetMap) and official geodata can be integrated or not, in order to use OSM to improve the official cartography updating process.
A system that finds where the official dataset needs to be updated, and which type of update each feature needs, without reviewing the whole cartography unit.
The goal: saving costs and obtaining better updated cartography.
9. The idea
Data sources in the updating process:
- Official data: vector format.
- OSM data: not complete, not homogeneous; used to indicate where to update.
- Result: better updated features (though not always).
10. The idea
Differences in updating processes:
- NMA official data: the government (NMA) produces MAP v.1; updating processes are tendered out to companies, producing MAP v.2. Cycle: months to years.
- Crowdsourced data (OSM): users, NMAs and companies run updating and production processes continuously, producing MAP v.1 ... v.n. Cycle: hours to days.
11. The idea
Differences in updating processes (2011 version timeline):
- Dec. 2010: 1st real change; the OSM update of Jan. 2011 reflects it.
- Feb. 2011: satellite/aerial images collecting date.
- May 2011: 2nd real change; the OSM update of June 2011 reflects it.
- Dec. 2011: official release date; the official data reflects only the 1st change.
12. The idea: differences
Which dataset is "better"?
- Some studies (Haklay 2008; Zielstra and Zipf 2010) take the official data set as the "truth" against which OSM is compared.
- As a result, OSM is not 100% complete. But what happens with that? Which one is really better?
- The desired result: types of updates derived from both the official data and the OSM data.
13. The idea
Questions to answer:
WHY OSM? "Given enough eyeballs, all bugs are shallow" (Linus's Law). Data accuracy, amount of data, updated data, comparative studies.
WHAT features from OSM? OSM not as features to take, but as indicators to use. If not useful, not used: types of updates (AIM 3).
HOW to integrate OSM and official data? Matching the data models in a reference semantic model, a domain ontology (AIM 1); quality indicators, both traditional and crowd quality parameters (AIM 2).
14. The idea: the proposed system
INPUT: the official data set (with its semantic specifications), the OSM data set (with its specifications), and a reference domain ontology.
Matching process (feature-classes filter): the feature classes of both data sets are matched through the domain ontology, yielding candidate feature classes with their feature counts.
QC and QA (features filter): candidate features are filtered using ISO 19157 quality measures and crowd quality parameters.
Output: "updating gaps" and their types of updates, feeding the updating process (VGI teams or online updating), with updates pushed back to OSM via the web.
15. The steps to reach the goal, and an example to show them
1. Making the matching between data models and features (ontology approach).
2. Studying quality parameters to decide which features could be used.
3. Proposing a new updating process based on flags and types of updates.
16. 1st step: making the matching
Comparing data models (NMA data model / OSM data model):
- Format: database, shp / XML (.osm)
- (Geometric) primitives: node, arc, face / node, way, relation (plus tags)
- Feature class: table, file / primary tag (key)
- Feature (each object): row / primary tag (value)
- Attribute: column / tag (key)
- Values (domains): cells / tag (value)
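The correspondence above can be sketched in a few lines of Python. The tag-to-class table and the class names (ROAD_MOTORWAY, ...) are hypothetical illustrations, not the actual official schema:

```python
# Hypothetical mapping from OSM primary tags (key, value) to official (NMA)
# feature classes, following the model correspondence above.
OSM_TO_NMA = {
    ("highway", "motorway"): "ROAD_MOTORWAY",
    ("highway", "motorway_link"): "ROAD_MOTORWAY",
    ("building", "church"): "BUILDING_WORSHIP",
    ("building", "school"): "BUILDING_EDUCATION",
}

def feature_class(osm_tags):
    """Return the official feature class an OSM element maps to, if any."""
    for key, value in osm_tags.items():
        cls = OSM_TO_NMA.get((key, value))
        if cls is not None:
            return cls
    return None

print(feature_class({"highway": "motorway", "ref": "A-1"}))  # ROAD_MOTORWAY
print(feature_class({"amenity": "bench"}))                   # None
```

A real system would derive this table from the ontology matching described in the following slides rather than hard-coding it.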
17. 1st step: making the matching
An approach (based on H. Uitermark):
1. Official dataset legend: A1: building of interest; C1: motorway; D1: toll motorway. Real-world objects: A, B, C, D.
2. OpenStreetMap legend: A2: building, church; B2: building, school; C2, D2: highway, motorway.
Candidates:
{[(A1,A2), (A1,B2)], [(C1,C2), (C1,D2)], [(D1,C2), (D1,D2)]}
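The candidate step above can be sketched as follows; the compatibility relation here is a toy stand-in for what a real system would derive from the domain ontology:

```python
# Toy semantic compatibility between official legend classes and OSM classes.
COMPATIBLE = {
    "building of interest": {"building, church", "building, school"},
    "motorway": {"highway, motorway"},
    "toll motorway": {"highway, motorway"},
}

official = {"A1": "building of interest", "C1": "motorway", "D1": "toll motorway"}
osm = {"A2": "building, church", "B2": "building, school",
       "C2": "highway, motorway", "D2": "highway, motorway"}

def candidates(official, osm):
    """For each official object, list the semantically compatible OSM objects."""
    result = {}
    for oid, ocls in official.items():
        result[oid] = sorted(
            vid for vid, vcls in osm.items()
            if vcls in COMPATIBLE.get(ocls, set())
        )
    return result

print(candidates(official, osm))
# {'A1': ['A2', 'B2'], 'C1': ['C2', 'D2'], 'D1': ['C2', 'D2']}
```

This reproduces the candidate set shown on the slide; geometric filtering would then prune each list further.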
18. 1st step: making the matching
The example: motorways (BCN Spain-OSM)
Something "superior" and semantic is needed to compare the two data models.
19. 1st step: making the matching
Using ontologies, first approach: the official data set is matched to its ontology, and the OSM data set to its ontology (OSMONTO); both ontologies are then matched to a common domain ontology.
20. 1st step: making the matching
Using ontologies, second approach: the official data set and the OSM data set (through OSMONTO) are mapped directly to the domain ontology using R2RML mappings (ODEMapster).
21. 2nd step: quality study
Studying the quality: traditional parameters
van Oort (2006) / Haklay (2008) / ISO 19157 (2011):
- Completeness / Completeness / Completeness
- Logical consistency / Logical consistency / Logical consistency
- Positional accuracy / Positional accuracy / Positional accuracy
- Attribute accuracy / Attribute accuracy / Thematic accuracy
- Temporal quality / Temporal quality / Temporal quality
- Semantic accuracy / Semantic accuracy / (not included)
- Usage, purpose and constraints / Usage, purpose and constraints / Usability element
- Lineage / Lineage / Lineage (ISO 19115)
- Variation in quality / (not included) / Meta-quality
- Resolution (approx. scale) / (not included) / (not included)
22. 2nd step: quality study
Studying the quality: (some) crowd quality parameters
- Maué (2007), PGIS: reputation of contributors; information asymmetry.
- Haklay (2008): longevity of engagement; number of editions on a feature; number of contributors on a feature; number of bugs fixed.
- van Exel (2010): user quality (local knowledge, experience, recognition); feature-related quality (lineage, positional accuracy, semantic accuracy).
- Others: lineage; homogeneity in quality; time between editions on a feature.
23. 2nd step: quality study
Some methods to measure traditional quality (positional accuracy), ordered from higher-quality to lower-quality settings:
- Perkal (1966): positional accuracy via an interpretation of the epsilon band. Buffer width: grow the buffer until the tested line (blue) is totally inside the reference buffer (orange).
- Goodchild and Hunter (1997): positional accuracy. Complete data sets are needed, and a higher-quality reference dataset is needed. Buffer width: until the tested line is 90-95% inside the reference buffer.
- Haklay (2008), OS-OSM: positional accuracy. Complete data sets are needed (he completed OSM), and OS is supposed to be of higher quality than OSM. Two buffers are built and the overlap areas are compared.
- BCN Spain-OSM (OSM to update BCN): no complete data (and nobody is going to complete it), neither BCN nor OSM; we don't know which data set is better; it could be impossible to reach the 90-95% threshold.
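The buffer tests above can be approximated in plain Python by sampling points along one polyline and measuring their distance to the other; a real implementation would use a GIS library, and the example geometries below are synthetic:

```python
import math

def _dist_point_segment(p, a, b):
    """Distance from point p to segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def _dist_point_line(p, line):
    return min(_dist_point_segment(p, line[i], line[i + 1])
               for i in range(len(line) - 1))

def share_within_buffer(line_a, line_b, width, samples=200):
    """Fraction of line_a (sampled points) lying within `width` of line_b,
    i.e. inside a buffer of that width around line_b."""
    pts = []
    for i in range(len(line_a) - 1):
        (ax, ay), (bx, by) = line_a[i], line_a[i + 1]
        for k in range(samples):
            t = k / samples
            pts.append((ax + t * (bx - ax), ay + t * (by - ay)))
    inside = sum(1 for p in pts if _dist_point_line(p, line_b) <= width)
    return inside / len(pts)

# Two parallel synthetic "motorways" 10 m apart: a 20 m buffer captures
# all of line a, a 5 m buffer captures none of it.
a = [(0, 0), (1000, 0)]
b = [(0, 10), (1000, 10)]
print(share_within_buffer(a, b, 20))  # 1.0
print(share_within_buffer(a, b, 5))   # 0.0
```

Sweeping `width` over a range of values produces curves like the ones on the next two slides.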
24. 2nd step: quality study
Example: measures of positional accuracy on motorways
[Maps: BCN Spain motorways vs. OSM motorways]
25. 2nd step: quality study
Example: measures of positional accuracy on motorways
[Chart: % length of BCN motorways within the OSM buffer, for buffer widths from 1 to 500 m]
A 500 m buffer around OSM is needed to reach 80% of the BCN length within the buffer: a sign of the lack of completeness of the OSM dataset.
At the BCN scale of 1/200k the buffer should be approx. 20 m, which yields only 73% of the BCN length within the OSM buffer.
26. 2nd step: quality study
Example: measures of positional accuracy on motorways
[Chart: % length of OSM motorways within the BCN buffer, for buffer widths from 1 to 500 m]
A 25 m buffer around BCN is needed to reach 90% of the OSM length within the buffer.
In this direction the method works, because every OSM motorway is also in the BCN dataset.
27. 2nd step: quality study
Some methods to measure traditional quality (completeness):
- OSL Musical Chairs algorithm (streets): based on a bounding box around each feature, with a 300 m radius to find candidate matches; if the bounding boxes match, the street names are additionally compared using the Levenshtein distance. A higher-quality data set is needed for the comparison. (http://humanleg.org.uk/code/oslmusicalchairs/)
- BCN Spain-OSM: not useful for motorways or other long features, but useful for streets or polygons; a convex hull could be used instead of the bounding box.
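The idea above can be sketched as follows; the street names, bounding boxes and thresholds are illustrative, not taken from the actual OSL implementation:

```python
def levenshtein(s, t):
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (cs != ct)))  # substitution
        prev = cur
    return prev[-1]

def bbox_overlaps(b1, b2, pad=300):
    """Bboxes are (minx, miny, maxx, maxy); pad plays the 300 m search radius."""
    return (b1[0] - pad <= b2[2] and b2[0] - pad <= b1[2] and
            b1[1] - pad <= b2[3] and b2[1] - pad <= b1[3])

def match_streets(official_streets, osm_streets, max_dist=2):
    """Pair streets whose (padded) bboxes overlap and whose names are close."""
    matches = []
    for name_o, bbox_o in official_streets:
        for name_v, bbox_v in osm_streets:
            if (bbox_overlaps(bbox_o, bbox_v)
                    and levenshtein(name_o.lower(), name_v.lower()) <= max_dist):
                matches.append((name_o, name_v))
    return matches

official_streets = [("Calle Mayor", (0, 0, 100, 10))]
osm_streets = [("Calle Maior", (5, 2, 105, 12)),
               ("Gran Via", (5000, 5000, 5100, 5010))]
print(match_streets(official_streets, osm_streets))
# [('Calle Mayor', 'Calle Maior')]
```

Official streets left without any match after this pass are candidates for the completeness gap, which is exactly what the slide argues breaks down for long features such as motorways.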
28. 2nd step: quality study
Conclusions about traditional quality:
- Which parameter comes first, completeness or positional accuracy? A complete data set (not a measure of completeness) is needed before positional accuracy can be measured.
- It has been proved that OSM is not complete.
- This brings us back to the first statement: OSM not as features to take, but as indicators to use. So it does not matter that OSM is not complete.
- A new approach: "updating gaps", which include the lack of completeness of OSM.
29. 3rd step: proposed updating process
Traditional classification of updates:
- Add
- Delete
- Modify (geometry, attributes)
30. 3rd step: proposed updating process
Proposed classification of updates (comparing official data and OSM data):
- ROAD_G (official) = ROAD_G (OSM)?
  - YES: ROAD_ATT (official) = ROAD_ATT (OSM)?
    - YES: the feature doesn't need to be updated.
    - NO: updating gap, type I (attribute updating).
  - NO: ROAD_G (official) = OTHER_G (OSM)?
    - YES: updating gap, type II (classification updating).
    - NO: the feature doesn't exist in OSM: updating gap, type III (OSM can't be used, but is advised).
- The feature doesn't exist in the official dataset: updating gap, type IV (automatic updating from OSM?).
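The decision tree above can be sketched as a small classifier. The field names (geom, attrs, class) and the exact-equality geometry test are simplifying assumptions; a real comparison would use a spatial tolerance:

```python
def classify(official_feature, osm_features):
    """Classify one official feature into the proposed updating-gap types,
    given the OSM features of the same area. `official_feature` is None when
    the object exists only in OSM."""
    if official_feature is None:
        # Exists in OSM only: candidate for automatic updating from OSM?
        return "updating gap, type IV"
    same_geom = [f for f in osm_features if f["geom"] == official_feature["geom"]]
    if not same_geom:
        # Missing from OSM: OSM can't be used here, only advised.
        return "updating gap, type III"
    match = same_geom[0]
    if match["class"] != official_feature["class"]:
        # Same geometry, different feature class.
        return "updating gap, type II (classification updating)"
    if match["attrs"] == official_feature["attrs"]:
        return "no update needed"
    # Same geometry and class, different attributes.
    return "updating gap, type I (attribute updating)"

road = {"class": "road", "geom": "G1", "attrs": {"lanes": 2}}
osm = [{"class": "road", "geom": "G1", "attrs": {"lanes": 3}}]
print(classify(road, osm))  # updating gap, type I (attribute updating)
print(classify(road, []))   # updating gap, type III
```

Running such a classifier over a whole cartography unit yields the flagged "updating gaps" that replace the full manual review.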
31. The result
Madrid case (1/200k):
- Time to update: 1.5 weeks, 1 person
- Features edited: 30%
- Time saved: 2.5 weeks
- Costs saved: 40%
32. Next steps
- Find the best method to compare both data sets and try it on different data sets (based on traditional quality, TQ, and crowd quality, CQ).
- Obtain the different types of updating gaps automatically.
- Look for a better way to compare data models (ontology approach).
- Try an automatic method to update the updating gaps based on OSM.