This talk opened the geospatial track of the Apache Big Data conference. The track aimed to increase the benefits of implementing open source software consistently with open geospatial standards.
After introducing the geospatial track, the talk focused on these topics:
- Applications of Big Geo Data
- Geospatial Open Standards
- Big Geo Use Cases
- Open Source and Open Standards
Analysis Ready Data workshop - OGC presentation George Percivall
The Open Geospatial Consortium (OGC) has activities relevant to the workshop scope of "the current state-of-the-art in satellite data interoperability". This presentation focuses on two areas of development, with the option to discuss other relevant topics raised by participants, e.g., WFS3: 1) Geospatial Datacubes and 2) Earth Observation Exploitation Platforms. 1) A geospatial datacube provides access to and analytics on analysis ready data (ARD), organized along coordinate axes of space and time, with cells in the cube containing data on geospatial features, e.g., imagery. OGC members implementing geospatial datacubes are documenting common practices to spur development, leading toward the possibility of federated geospatial datacubes. 2) OGC is forming an Earth Observation Exploitation Platform Domain Working Group with the goal of defining a standards-based framework for cloud-based access to and analysis of EO data. An ad-hoc meeting was held in March 2018 to scope the working group, with the results issued in a request for comment: http://www.opengeospatial.org/pressroom/pressreleases/2792
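As a hedged illustration of the datacube concept described above, a minimal sketch in Python follows. The class name, cell size, and vegetation-index-style values are purely illustrative and not from any OGC specification; real datacubes, such as those OGC members are documenting, use far richer data models.

```python
from datetime import date

class GeoDatacube:
    """Toy geospatial datacube: cells addressed by (lat index, lon index,
    time) hold analysis-ready values such as imagery-derived statistics."""

    def __init__(self, cell_size_deg):
        self.cell_size = cell_size_deg
        self.cells = {}  # (i, j, t) -> value

    def _index(self, lat, lon):
        # Map a coordinate to its grid cell indices.
        return int(lat // self.cell_size), int(lon // self.cell_size)

    def put(self, lat, lon, t, value):
        i, j = self._index(lat, lon)
        self.cells[(i, j, t)] = value

    def slice_time(self, t):
        # A simple "analytic": all cell values observed at time t.
        return {k[:2]: v for k, v in self.cells.items() if k[2] == t}

cube = GeoDatacube(cell_size_deg=0.25)
cube.put(51.05, -114.07, date(2018, 3, 1), 0.42)  # e.g. a vegetation index
cube.put(51.30, -114.07, date(2018, 3, 1), 0.37)
print(cube.slice_time(date(2018, 3, 1)))
```

The key design point the sketch captures is that space and time together form the cube's coordinate axes, so a time slice is itself a spatial grid ready for analysis.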
OGC Update for State of Geospatial Tech at T-Rex - George Percivall
An update on OGC activities in three time horizons (Now, Next, and After Next), finishing with how to stay updated on OGC activities.
Now
Recently approved OGC standards
Implementation of approved standards
Next
Standards Program
Innovation Program
After Next
Tech Forecast
How to keep in touch
"The Golden Age of Geospatial Data Science and Engineering" presented as the inital lecture in the Geospatial Data Science Distinguished Speaker Series at the University of Illinois, Urbana-Champaign. Series organized and presented by Professor Shaowen Wang, Head of the Geography and Geographic Information Science Department.
"Data Science is in a golden age. The mathematical foundations of Data Science, known for many years, are now seeing broad applicability due to engineering advances in cloud and big data computing and due to the explosive availability of data about nearly every aspect of human activity coming from mobile devices, remote sensing and the Internet of Things. Nearly all of this data has components of location and time leading to stunning advances in geospatial data science. Development of intelligent systems using knowledge models leading to insights and understanding have the potential to significantly transform geospatial data sciences. To achieve the fullest extent of their potential, these innovations require establishment of open consensus standards. This talk will review recent developments in innovations, standards, and applications of geospatial data science and engineering."
Presentation at Location and Context World 2015, Palo Alto, CA, November 3-4, 2015.
Abstract: Creating useful local context requires big data platforms and marketplaces. Contextual awareness is relevant to location-based marketing, first responders, urban planners, and many others. Location-aware mobile devices are revolutionizing how consumers and brands interact in the physical world. Situational awareness is a key element in efficiently handling any emergency response. In all cases, big data processing and high-velocity streaming of location-based data create the richest contextual awareness. Data from many sources, including IoT devices, sensor webs, surveillance, and crowdsourcing, are combined with semantically rich urban and indoor data models. The resulting context information is delivered to and shared by mobile devices in connected and disconnected operations. Standards play a key role in establishing context platforms and marketplaces. Successful approaches will consolidate data from ubiquitous sensing technologies on a common space-time basis to enable context-aware analysis of environmental and social dynamics.
Time, Change and Habits in Geospatial-Temporal Information Standards - George Percivall
Keynote for HIC 2014 – 11th International Conference on Hydroinformatics, New York, USA August 17 – 21, 2014
Time and change are fundamental to our scientific understanding of the world. Standards for geospatial-temporal information exist, but new needs outstrip current standards. Geospatial-temporal information includes capturing change in features and coverages and modeling the processes that inform change. Key standards for time, calendars, and temporal reference systems are in place. Time series modeling from the WaterML standard is a recent advance of high value to hydrology. The OGC Moving Features standard will establish an encoding format for changes in “rigid” features. Interoperability standards are needed for coverages with values that change based on observations, analytical expressions, or simulations. Applying a coverage model to time-varying, fluid Earth systems was the topic of the groundbreaking GALEON Interoperability Experiment. Standards development for spatial-temporal process models is progressing with WPS, OpenMI, and ESMF, supporting a Model Web concept. A robust framework for sharing geospatial-temporal information is now coming into place, based on developments captured in standards by ISO, WMO, ITU, ICSU, and OGC, including the newly established OGC Temporal Domain Working Group. The new framework will enable capabilities in expressing and sharing scientific investigations, including research on the emergence of forms over time. With these new capabilities we may come to understand Peirce’s observation that over time “all things have a tendency to take habits.”
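The moving-features idea mentioned above, capturing change in "rigid" features over time, can be illustrated with a toy trajectory structure. This is a sketch only: the OGC Moving Features standard defines its own encodings, and the timestamps and coordinates below are invented.

```python
from datetime import datetime

# A moving feature as a timestamped trajectory: the feature itself is
# "rigid", but its position is a function of time (illustrative data).
trajectory = [
    (datetime(2014, 8, 17, 9, 0), (40.71, -74.00)),
    (datetime(2014, 8, 17, 10, 0), (40.76, -73.98)),
]

def position_at(traj, t):
    """Linearly interpolate between the bracketing trajectory points."""
    for (t0, p0), (t1, p1) in zip(traj, traj[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)  # fraction of the segment elapsed
            return (p0[0] + f * (p1[0] - p0[0]),
                    p0[1] + f * (p1[1] - p0[1]))
    return None  # t outside the observed interval

print(position_at(trajectory, datetime(2014, 8, 17, 9, 30)))
```

The design point is that a time-parameterized geometry, rather than a static one, is what an encoding for moving features must be able to exchange.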
Innovation in Geospatial Technology and Standards - George Percivall
All predictions are wrong; some are useful. This presentation offers a slate of geospatial trends developed in discussion with the OGC Board of Directors and expanded in an OGC blog series. These geospatial technology issues were developed by reviewing over 200 articles from geospatial publications as well as from information technology journals (IEEE, ACM, etc.).
These "Ripe Issues" of geospatial technology identify areas where further development of open standards can lead to great benefit:
* The Power of Location
* Internet of Things
* Mobile Development
* Indoor Frontier
* Cartographers of the future
* Big Processing of GeoData
* Smart Cities
The OGC is an international consortium where members participate in a consensus process to develop publicly available geospatial standards. OGC has a history of developing anticipatory standards. OGC is a leader in balancing consensus with innovation: OGC members actively design standards while implementing running software. In the role of OGC Chief Engineer, George Percivall identifies technology and market trends relevant to open standards development.
UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable. But the diversity of UAV platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source, and open standards.
CyberGIS Architectures for Collaborative Problem Solving - OGC Perspective - George Percivall
1. What is CyberGIS:
- collaboration; open data, open source, open standards
2. The plumbing for CyberGIS collaboration is available:
- Processing, Workflow, Model interoperability as web services are “solved” several times;
- the concepts for collaboration need to be made explicit
3. Need for “decision” and “hypothesis” objects including modeling and linked data
- Ontology for decision types. Templates for Decisions and Hypothesis
- Recommender systems - a guess at the riddle
- If I see these conditions then consider this decision template
- If I am researching these conditions then consider this hypothesis
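The "if I see these conditions then consider this decision template" recommender idea above can be sketched as a few lines of Python. The rule contents and condition names are invented for illustration; they are not drawn from any CyberGIS or OGC artifact.

```python
# Each rule maps a set of observed conditions to a decision template.
# Conditions and templates below are hypothetical examples.
RULES = [
    ({"river_gauge": "rising", "rainfall": "heavy"},
     "flood-evacuation decision template"),
    ({"soil_moisture": "low", "season": "growing"},
     "irrigation-scheduling decision template"),
]

def recommend(observed):
    """Return every decision template whose conditions all hold."""
    return [template for conditions, template in RULES
            if all(observed.get(k) == v for k, v in conditions.items())]

print(recommend({"river_gauge": "rising", "rainfall": "heavy",
                 "wind": "calm"}))
```

Extending this toward the "hypothesis" case is the same pattern with hypothesis templates in place of decision templates; an ontology of decision types would give the condition keys shared, machine-readable meaning.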
TITLE: Open Standards Role in EarthCube (Invited)
AUTHORS (FIRST NAME, LAST NAME): Luis E Bermudez1, David K Arctur2, 1, George Percivall1
INSTITUTIONS (ALL): 1. Open Geospatial Consortium, Gaithersburg, MD, United States.
2. University of Texas at Austin, Austin, TX, United States.
ABSTRACT BODY: EarthCube is an NSF initiative that will enable sharing of data in an open and transparent manner, improving access to and use of data and allowing scientists to better understand the Earth. EarthCube is based on a network of enthusiasts willing to make the sharing of data a reality. But is just having open data enough? Open data alone will not accelerate the process a science team must go through to understand, reformat, and use the data. However, agreements among colleagues, or the adoption of existing agreements, can make a big difference. These agreements also need to be published, freely available, and unencumbered by intellectual property rights issues. The system design requirements for developing cyberinfrastructure for the geosciences need to take these open agreements into account, including open interfaces and open encodings. Once open agreements are in place, it is essential to have policies, procedures, and a governance body for maintaining those agreements. This presentation will explore these issues and suggest ways that standards development organizations, such as the Open Geospatial Consortium (OGC), and other coordinating organizations, such as the Earth Science Information Partners (ESIP) and the Research Data Alliance (RDA), could be involved in this process.
http://www.opengeospatial.org
In AGU 2013 Session: IN43B. Emerging Concepts for Cyberinfrastructure in the Geosciences
Scientific Knowledge from Geospatial Observations - George Percivall
Presentation to IGARSS 2015 Conference, July 2015, Milan, Italy.
Part of invited session: Why Data Matters: Value of Stewardship and Knowledge Augmentation Services
Keynote presentation to New Zealand Geospatial Research Conference 2015. This presentation covered emerging topics for geospatial research in four areas:
- Spatial Representation: urban models, CityGML, indoor and DGGS
- New Data Sources: sensors everywhere, IoT, UAVs, citizen observations, social media
- Computer Engineering: Big data, moving features, spatial analytics, mobile, 3D portrayal, augmented reality
- Application Areas: Soils Interoperability Experiment, Urban Climate Resilience in OGC Testbed 11.
Progress towards Open Standards-Based Agro-Geoinformatics - George Percivall
Keynote presentation to Agro-Geoinformatics Conference
20 July 2015, Istanbul, Turkey
http://agro-geoinformatics.org/
** What is agro-geoinformatics, and why is exchange of agriculture geo-information needed?
Efficient exchange of data on utilization of farmland, soil and crop characteristics, water availability, environmental impacts, …
Many user roles: growers, advisors, landowners, foodstuff processors, regulators and all levels of government
Major challenges to agriculture: climate change, increasing population, shortage of water and arable land
Increasing need for information standards to support transparency in agricultural goods and services markets
** Projects showing the progress of standards-based agro-geoinformatics technology
SoilML for information exchange
Soil information platforms
Precision Agriculture and In-situ networks
Remote sensing from satellites and drones
Big Data processing for decision support
Climate - Food - Water nexus
** OGC support of Agro-Geoinformatics
- Agriculture Domain Working Group
Identify geospatial interoperability challenges in the agriculture domain
Forum to identify standards-based solutions, new standards
- Discrete Global Grid Systems standards development
Geometric partitioning of the Earth's surface into cells with identifiers
Enable fusion of disparate data for spatial analysis and modeling
- Soil Data Interoperability Experiment (SoilIE)
Testing standards for exchange of soils data
Results to converge and mature soil information standards.
Get involved as participant or an observer, contact:
David Medyckyj-Scott Medyckyj-Scottd@landcareresearch.co.nz
…and others
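The Discrete Global Grid Systems item above, geometric partitioning of the Earth's surface into cells with identifiers to enable fusion of disparate data, can be sketched as a toy equal-angle grid. Real DGGS implementations use more sophisticated, typically equal-area, partitions; the identifier format and data values here are invented.

```python
def dggs_cell_id(lat, lon, level):
    """Toy equal-angle grid: cell width in degrees halves at each
    refinement level; the string identifier names one cell."""
    cell = 180.0 / (2 ** level)
    row = int((lat + 90.0) // cell)
    col = int((lon + 180.0) // cell)
    return f"L{level}/R{row}/C{col}"

# Fusion sketch: observations from disparate sources join on the
# shared cell identifier (values are illustrative).
soil = {dggs_cell_id(41.0, 29.0, 3): "clay-loam"}
rainfall = {dggs_cell_id(41.1, 29.2, 3): 12.5}  # mm; nearby point, same cell
shared = soil.keys() & rainfall.keys()
print(shared)
```

The design point is that once every dataset is keyed by the same cell identifiers, spatial fusion reduces to an ordinary join.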
Raj Singh talks about the history of OGC standards such as the Sensor Web Enablement suite -- Sensor Planning Service, Sensor Observation Service, SensorML, Observations & Measurements -- and its IoT companion, SWE for IoT, and how the geospatial industry is uniquely positioned to take leadership in the emerging Internet of Things space.
All predictions are wrong; some are useful. This presentation offers a slate of "ripe issues" that were developed in discussion with the OGC Board of Directors and expanded in a blog series. The issues were developed by reviewing over 200 articles from geospatial industry publications as well as from information technology journals (IEEE, ACM, etc.).
These Ripe Issues of geospatial technology identify areas where further development of open standards can lead to great benefit. The OGC is an international consortium where members participate in a consensus process to develop publicly available geospatial standards.
The ripe issues of geospatial technology identified in March 2013 are:
• The Power of Location
• Internet of Things
• Mobile Development
• Indoor Frontier
• Cartographers of the future
• Big Processing of Geospatial Data
• Smart Cities Depend on Smart Location
• Policy implementation
SDSC Technology Forum: Increasing the Impact of High Resolution Topography Da... - OpenTopography Facility
High-resolution topography is a powerful tool for studying the Earth's surface, vegetation, and urban landscapes, with broad scientific, engineering, and educational applications. Over the past decade, there has been dramatic growth in the acquisition of these data for scientific, environmental, engineering, and planning purposes. In the US, the U.S. Geological Survey is undertaking the 3D Elevation Program (3DEP) to map the entire lower 48 states with lidar by 2023.
The richness of these topography datasets makes them extremely valuable beyond the applications that drove their acquisition, and thus they are of interest to a large and varied user community. A cyberinfrastructure platform that enables users to efficiently discover, access, and process these massive volumes of data increases the impact of investments in data collection and catalyzes scientific discovery. It also informs the critical decisions that depend on elevation data and are made across our Nation every day, ranging from the immediate safety of life, property, and the environment to long-term planning for infrastructure projects.
Join us to hear about the motivations, technology, and data assets behind the National Science Foundation funded OpenTopography platform, which aims to democratize access to high resolution topographic data. OpenTopography’s innovation is in co-locating massive volumes of topographic data with processing tools that enable users with varied expertise and application domains to quickly and easily access and process data, to enable innovation and decision making.
Application packaging and systematic processing in earth observation exploita... - Terradue
An overview of Terradue's solutions supporting Earth Observation (EO) Exploitation Platforms across multiple domains.
Presentation given as part of the Open Geospatial Consortium (OGC) Technical Committee ad-hoc meeting for the setup of a new domain working group on EO Exploitation Platforms.
Paradigm Shift of Geospatial Information Service - Sanghee Shin
"Paradigm Shift of Geospatial Information Service:From Mass Production to Mass Customization, A Case Study of Korea NGII"
This talk was given at the 2nd Eurasian SDI conference, held in Astana, Kazakhstan, from 27 to 29 July 2016. The presentation contains parts of a consulting report submitted to Korea NGII (National Geospatial Information Institute), Korea's national mapping agency.
Field Data Collecting, Processing and Sharing: Using Web Service Technologies - Niroshan Sanjaya
Collecting, distributing, and analyzing field data is a crucial part of any geospatial study. Field data collection tools and methods have developed significantly due to the advancement of technologies such as Global Navigation Satellite Systems (GNSS) and the development of smartphones. Accurate field data collection is also a necessary task for broad spatial data analysis and proper decision making. The development of web technologies has made it possible to share data and information effectively. This study develops a framework based on Geospatial Semantic Web technologies for disseminating and processing field data. Experimental results from an implemented prototype show that the proposed framework allows users to visualize and process field data in any context. The system is capable of distributing and processing field data using a web application. Moreover, the study demonstrates the importance and capabilities of web services for spatial data gathering and processing. The system has been developed using Free and Open Source Software (FOSS) packages such as the ZOO-Project and Open Data Kit, and it enables users to further improve or deploy the system for a variety of studies.
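As a hedged sketch of how a single field record might be packaged for web-based sharing, the snippet below encodes an observation as a GeoJSON Feature, a widely used web payload, though not necessarily the encoding used in this particular study. The property names are illustrative.

```python
import json

def field_observation(lat, lon, properties):
    """Encode one field record as a GeoJSON Feature (hypothetical
    helper; property names are illustrative)."""
    return {
        "type": "Feature",
        # GeoJSON coordinate order is [longitude, latitude].
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": properties,
    }

obs = field_observation(6.9271, 79.8612, {"crop": "rice", "soil_ph": 6.4})
print(json.dumps(obs))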
TITLE: Open Standards Role in EarthCube (Invited)
AUTHORS (FIRST NAME, LAST NAME): Luis E Bermudez1, David K Arctur2, 1, George Percivall1
INSTITUTIONS (ALL): 1. Open Geospatial Consortium, Gaithersburg, MD, United States.
2. University of Texas at Austin, Austin, TX, United States.
ABSTRACT BODY: EarthCube is an NSF initiative that will enable sharing of data in an open and transparent manner, improving access and use of data, allowing scientists to better understand the Earth. EarthCube is based on a network of enthusiasts willing to make the sharing of data a reality. But is just having open data enough? Open data will not accelerate the process a scientist team needs to go through to understand, reformat and use the data. However, agreements among colleagues or adoption of agreements can make a big difference. These agreements also need to be published, freely available, and unpolluted from intellectual property rights issues. The system design requirements to develop cyberinfrastructure for Geosciences need to take into account these open agreements, including open interfaces and open encodings. Once open agreements are in place, it is essential to have in place policy and procedures, and a governance body for maintaining those agreements. This presentation will explore these issues and suggest ways the standard development organizations, like the Open Geospatial Consortium (OGC), and other coordinating organizations, such as the Earth Science Information Partners (ESIP) and the Research Data Alliance (RDA), could be involved in this process.
http://www.opengeospatial.org
In AGU 2013 Session: IN43B. Emerging Concepts for Cyberinfrastructure in the Geosciences
Scientific Knowledge from Geospatial ObservationsGeorge Percivall
Presentation to IGARSS 2015 Conference, July 205, Milan Italy.
Part of invited session: Why Data Matters: Value of Stewardship and Knowledge Augmentation Services
Keynote presentation to New Zealand Geospatial Research Conference 2015. This presentation covered emerging topics for geospatial research in four areas:
- Spatial Representation: urban models, CityGML, indoor and DGGS
- New Data Sources: sensors everywhere, IoT, UAVs citizen observations, social media
- Computer Engineering: Big data, moving features, spatial analytics, mobile, 3D portrayal, augmented reality
- Application Areas: Soils Interoperability Experiment, Urban Climate Resilience in OGC Testbed 11.
Progress towards Open Standards-Based Agro-GeoinformaticsGeorge Percivall
Keynote presentation to Agro-Geoinformatics Conference
20 July 2015, Istanbul, Turkey
http://agro-geoinformatics.org/
** What is agro-geoinformatics and why need for exchange of Agriculture Geo-Information?
Efficient exchange of data on utilization of farmland, soil and crop characteristics, water availability, environmental impacts, …
Many user roles: growers, advisors, landowners, foodstuff processors, regulators and all levels of government
Major challenges to agricultural: climate change, increasing population, shortage of water and arable land
Increasing need for information standards to support transparency in agricultural goods and services markets
** Projects showing the progress of standards-based agro-geoinformatics technology
SoilML for information exchange
Soil information platforms
Precision Agriculture and In-situ networks
Remote sensing from satellites and drones
Big Data processing for decision support
Climate - Food - Water nexus
** OGC support of Agro-Geoinformatics
- Agriculture Domain Working Group
Identify geospatial interoperability challenges in agriculture domain
Forum to identify standards-based solutions, new standards
- Discrete Global Grid Systems standards development
Geometric partitioning of Earth surface into cells with identifiers
Enable fusion of disparate data for spatial analysis and modeling
- Soil Data Interoperability Experiment (SoilIE)
Testing standards for exchange of soils data
Results to converge and mature soil information standards.
Get involved as participant or an observer, contact:
David Medyckyj-Scott Medyckyj-Scottd@landcareresearch.co.nz
…and others
Raj Singh talks about the history of OGC standards such as Sensor Web Enablement Suite -- Sensor Planning Service, Sensor Observation Service, SensorML, Observation & Measurements -- and its IoT companion -- SWEforIoT, and how the geospatial industry is uniquely positioned to take leadership in the emerging Internet of Things space.
All predictions are wrong; some are useful. This presentation offers a slate of "ripe issues" that were developed in discussion with the OGC Board of Directors and expanded in a blog series. The issues were developed by reviewing over 200 articles from geospatial industry publications as well as from information technology journals (IEEE, ACM, etc.).
These Ripe Issues of geospatial technology identify areas where further development of open standards can lead to great benefit. The OGC is an international consortium where members participate in a consensus process to develop publicly available geospatial standards.
The ripe issues of geospatial technology identified in March 2013 are:
• The Power of Location
• Internet of Things
• Mobile Development
• Indoor Frontier
• Cartographers of the future
• Big Processing of Geospatial Data
• Smart Cities Depend on Smart Location
• Policy implementation
SDSC Technology Forum: Increasing the Impact of High Resolution Topography Da...OpenTopography Facility
High-resolution topography is a powerful tool for studying the Earth's surface, vegetation, and urban landscapes, with broad scientific, engineering, and educational-based applications. Over the past decade, there has been dramatic growth in the acquisition of these data for scientific, environmental, engineering and planning purposes. In the US, the U.S. Geological Society is undertaking the 3D Elevation Program (3DEP) to map the entire lower 48 with lidar by 2023.
The richness of these topography datasets makes them extremely valuable beyond the applications that drove their acquisition, and they are thus of interest to a large and varied user community. A cyberinfrastructure platform that enables users to efficiently discover, access and process these massive volumes of data increases the impact of investments in data collection. It catalyzes scientific discovery and informs the critical decisions that depend on elevation data and are made across our Nation every day, ranging from the immediate safety of life, property, and the environment to long-term planning for infrastructure projects.
Join us to hear about the motivations, technology, and data assets behind the National Science Foundation funded OpenTopography platform, which aims to democratize access to high resolution topographic data. OpenTopography’s innovation is in co-locating massive volumes of topographic data with processing tools that enable users with varied expertise and application domains to quickly and easily access and process data, to enable innovation and decision making.
Application packaging and systematic processing in earth observation exploita... - Terradue
An overview of Terradue's solutions supporting Earth Observations (EO) Exploitation Platforms across multiple domains.
Presentation done as part of the Open Geospatial Consortium (OGC) Technical Committee ad-hoc meeting for the setup of a new domain working group on EO Exploitation Platforms.
Paradigm Shift of Geospatial Information Service - SANGHEE SHIN
"Paradigm Shift of Geospatial Information Service:From Mass Production to Mass Customization, A Case Study of Korea NGII"
This talk was given at the 2nd Eurasian SDI conference held at Astana, Kazakhstan from 27th to 29th July 2016. This presentation contains parts of a consulting report submitted to Korea NGII (National Geospatial Information Institute), a sort of national mapping agency in Korea.
Field Data Collecting, Processing and Sharing: Using Web Service Technologies - Niroshan Sanjaya
Collecting, distributing and analyzing field data is a crucial part of any geospatial study. Field data collection tools and methods have developed significantly with the advancement of technologies such as Global Navigation Satellite Systems (GNSS) and the spread of smartphones. Accurate field data collection is also necessary for broad spatial data analysis and sound decision making, and the development of Web technologies has made it possible to share data and information effectively. This study develops a framework based on Geospatial Semantic Web technologies for disseminating and processing field data. Experimental results from an implemented prototype show that the proposed framework allows field data to be visualized and processed in any context, and that the system can distribute and process field data through a web application. Moreover, the study demonstrates the importance and capabilities of web services for spatial data gathering and processing. The system was built on Free and Open Source Software (FOSS) packages such as the ZOO-Project and Open Data Kit, which enables users to further improve the system or deploy it for a variety of studies.
Lucidata Titan geo-server data container and services - Geoffrey Clark
A Linux server or Docker container ready to help you automate data integration and visualization for supply chain, logistics and transportation freight data flows. Packed with market intelligence data, metadata, a geospatial data analysis portal with many templates, and supply chain web analytics. Let us help you wrangle your data and provide insights through visualization.
NoSQL: Non-relational databases for high scalability in web applications - Stelios Karabasakis
Download the original PPTX presentation with speaker notes in Greek from: http://www.mediafire.com/?me3h3zfqkny
NoSQL Grunge Logo designed by me and released to the public domain. Download as PSD or PNG from: http://www.mediafire.com/?sharekey=2644cf1d57cb17d6ab1eab3e9fa335cace0f768f8ef0a62b
---------
Presentation given on 26/5/2010 at the Department of Informatics and Telecommunications, National and Kapodistrian University of Athens, as part of the postgraduate course "Topics in Database Applications".
Although NoSQL databases are relatively new, they've quickly adopted geo, from basic bounding box queries to full geospatial indexing, query and projection on a par with PostGIS. This presentation introduces NoSQL to the Geo developer, describing the pros and cons of NoSQL vs. relational, and what Geo functionality exists in the leading products.
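As a rough illustration of the kind of geo query these stores expose, here is a minimal pure-Python sketch of a bounding-box filter over GeoJSON-style documents. The collection contents are made up for illustration; in MongoDB, for example, the equivalent would be a `$geoWithin`/`$box` query against an indexed location field.

```python
# Tiny in-memory "collection" of GeoJSON-like documents (illustrative data).
docs = [
    {"name": "cafe",    "loc": {"type": "Point", "coordinates": [-0.12, 51.50]}},
    {"name": "museum",  "loc": {"type": "Point", "coordinates": [2.34, 48.86]}},
    {"name": "harbour", "loc": {"type": "Point", "coordinates": [-0.10, 51.51]}},
]

def within_box(doc, min_lon, min_lat, max_lon, max_lat):
    """True if the document's point falls inside the bounding box."""
    lon, lat = doc["loc"]["coordinates"]
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

# Query: everything in a small box around central London.
hits = [d["name"] for d in docs if within_box(d, -0.2, 51.4, 0.0, 51.6)]
print(hits)  # ['cafe', 'harbour']
```

Production stores speed up exactly this kind of filter with a spatial index rather than a linear scan, but the query semantics are the same.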
Review this Geospatial Technology Trends 2015 presentation to learn more about GIS, GPS, UAV, LiDAR, remote sensing, Earth observation, policy and education trends and directions for 2015.
Using Big Data techniques to query and store OpenStreetMap data. Stephen Knox... - huguk
This talk will describe his research into using Hadoop to query and manage big geographic datasets, specifically OpenStreetMap (OSM). OSM is an "open-source" map of the world, growing rapidly and currently around 5TB of data. The talk will introduce OSM, detail some aspects of the research, and discuss his experiences using the SpatialHadoop stack on Azure and Google Cloud.
In this on-demand webinar, you'll hear from Grant Ingersoll, co-founder of Lucid Imagination and chairman of the Apache Lucene PMC, for an in-depth technical workshop on the potential and application of the newly released Lucene and Solr geo-search functions. Grant will be joined by thought leaders: Ryan McKinley, co-founder of Voyager GIS and Apache Lucene PMC member; and Sameer Maggon, of Lucid Imagination customer AT&T Interactive, which manages and delivers online and mobile advertising products across AT&T's media platforms.
Presentation for the ISPRS Congress 2012, Melbourne.
Over the last decade, standards have played a key role in the expansion of the market for Earth Observation (EO) products and services. Standards become increasingly important as geospatial technologies and markets continue to evolve in an increasingly complex technology ecosystem. OGC and ISPRS work jointly to further the development of this vital information industry.
We continue to see global growth in the supply of geometrically controlled image-based geodata. On the data supplier side, most end-use EO information products use data from multiple EO sources (aerial and satellite) as well as from ground-based sources. On the customer side, customers’ business models involving EO data require easy connections between multiple data suppliers and multiple technology platforms. Typically, new markets create stovepiped, proprietary solutions that persist until market forces create demand for standards that in turn enhance market opportunity. The OGC’s standards meet this demand in the geospatial markets.
OGC leads worldwide in the creation and establishment of standards that allow geospatial content and services to be seamlessly integrated into business and civic processes, the spatial web and enterprise computing. OGC accelerates market assimilation of interoperability research through collaborative consortium processes.
OGC has both domain-focused and technology-focused activities. For example, the Meteorology & Oceanography Domain Working Group ensures that OGC standards and profiles allow the meteorological community to develop effective interoperability for web services and content across the wider geospatial domain. These needs are met, for example, by standards such as netCDF, which was brought into the OGC to encourage broader international use and greater interoperability among clients and servers interchanging data in binary form.
Most OGC standards specify open interfaces or encodings that apply to imagery. Some of these are:
o Web Coverage Service (WCS)
o Web Coverage Processing Service (WCPS)
o Web Map Service (WMS)
o Geography Markup Language (GML)
o GML in JPEG 2000 Encoding
o OGC Network Common Data Form (NetCDF)
o Sensor Observation Service (SOS)
o Sensor Planning Service (SPS)
o Sensor Model Language Encoding Standard (SensorML)
o Catalogue Service for the Web (CSW)
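To give a flavour of how these interfaces work in practice, a WMS GetMap call is just an HTTP request with well-known key-value parameters. A minimal sketch follows; the endpoint and layer name are hypothetical, while the parameter names follow the WMS 1.1.1 GetMap interface.

```python
from urllib.parse import urlencode

# Hypothetical WMS endpoint; key-value parameters per WMS 1.1.1 GetMap.
base = "https://example.org/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "imagery",           # assumed layer name
    "STYLES": "",                  # default styling
    "SRS": "EPSG:4326",            # lon/lat coordinate reference system
    "BBOX": "-180,-90,180,90",     # min lon, min lat, max lon, max lat
    "WIDTH": "800",
    "HEIGHT": "400",
    "FORMAT": "image/png",
}
url = base + "?" + urlencode(params)
print(url)
```

Fetching that URL from a live WMS server would return an 800x400 PNG rendering of the requested layer over the whole globe.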
Interoperability and Standards for Disaster Risk Management - Luis Bermudez
Presentation at the Strengthening Disaster Risk Reduction across the Americas: A Regional Summit on the Contribution of Earth Observations - https://disasters.nasa.gov/argentina-summit-2017
Geospatial Data and Key Characteristics of Geospatial Data Analysis and Science - Luis Bermudez
The growing complexity and interdisciplinarity of research and applied science questions require the development of standards to exchange data within continuously growing communities as well as across domains. In most domains, geospatial data is the fundamental base layer for data science and analysis, as the vast majority of data has some spatial characteristic or applies to elements in space. Using the available standards, a good level of interoperability can already be realized. Nevertheless, the increasing complexity of research questions, the growing amount of available data, and the widening range of data providers, from citizen scientists to fully automated sensor networks publishing their data directly on the Internet, require even richer models to be developed to enhance the level of interoperability.
Slides of my PhD presentation @ Eurecom, presenting our work on publishing and consuming geo-spatial data and government data using Semantic Web technologies.
Overview of GEO activities to promote broad open Earth observations data and information, as well as insight into GEO engagement priorities and links to ISPRS.
Geospatial Temporal Open Standards for Big Data from Space (BiDS2014) - George Percivall
Presentation to ESA Big Data From Space (BiDS2014), November 2014.
Big data from space requires processing large amounts of data in a distributed environment. For efficient, quality and cost-effective deployment, these environments must be based on open standards. The Open Geospatial Consortium (OGC) open standards for geospatial-temporal information have been tuned through implementations to meet the needs of big data.
Presentation by Mattia Santoro, CNR-IIA, Italy
GEOSS (Global Earth Observation System of Systems) is presented and its main constituent services and components are described. The application of the FAIR principles in GEOSS is analyzed. On-going and future developments are also introduced.
Presentation from EuroSDR 113th meeting, Cardiff, October 2008. An overview of some of the geospatial research carried out by the different departments, centres and groups at UCL.
A major challenge for the next decade is to design virtual and augmented reality systems (VR at large) for real-world use cases such as healthcare, entertainment, e-education, and high-risk missions. This requires VR systems to operate at scale, in a personalized manner, remaining bandwidth-tolerant whilst meeting quality and latency criteria. One key challenge to reach this goal is to fully understand and anticipate user behaviours in these mixed reality settings.
This can be accomplished only by a fundamental revolution of the network and VR systems, which have to put the interactive user at the heart of the system rather than at the end of the chain. With this goal in mind, in this talk we describe our current research on user-centric systems. First, we describe our viewport-based streaming strategies for 360-degree video. Then, we present in more detail our research on users' behaviour analysis when users interact with 360-degree content. Specifically, we describe a set of metrics that allows us to identify key behaviours among users and to quantify the level of similarity of these behaviours, and we present our clique-based clustering methodology along with information-theoretic and trajectory-based in-depth analysis. Finally, we conclude with an overview of the extension of this work to navigation within volumetric video sequences.
The implementation of the INSPIRE Directive in Europe and similar efforts around the globe to develop spatial data infrastructures and global systems of systems have been focusing largely on the adoption of agreed technologies, standards, and specifications to meet the (systems) interoperability challenge. Addressing the key scientific challenges of humanity in the 21st century requires, however, a much increased inter-disciplinary effort, which in turn makes more complex demands on the type of systems and arrangements needed to support it. This paper analyses the challenges for inter-disciplinary interoperability using the experience of the EuroGEOSS research project. It argues that inter-disciplinarity requires mutual understanding of requirements, methods, theoretical underpinning and tacit knowledge, and this in turn demands a flexible approach to interoperability based on mediation, brokering and semantics-aware, cross-thematic functionalities. The paper demonstrates the implications of adopting this approach and charts the trajectory for the evolution of current spatial data infrastructures.
My special talk on 'GIS & Remote Sensing - Introduction to the Primer' is part of the 'Learn from the Leaders - 2' webinar series organized by IEEE SIGHT, Bombay Section, on May 25th, 2021.
The NextGEOSS project, a European contribution to GEOSS (Global Earth Observation System of Systems), proposes to develop the next generation data hub for Earth Observations, where the users can connect to access data and deploy data-driven applications.
Defining Digital Earth as a virtual representation of all digital information with a geospatial component, this geography attempts to delineate the scope and elements of Digital Earth. The framework for this geography is a set of layers applicable to describing an information system. From bottom to top the layers are physical, data, information, knowledge, decisions and actions. Conclusions of this geography are that some technologies are sufficient for a Digital Earth to come into existence, but some technologies, in particular in the upper layers, need to be developed. Three conclusions are listed in this abstract.
In the physical and data layers, the explosive growth of the Internet provides access to much Digital Earth data. However, the bandwidth necessary for high-end Digital Earth clients will not be widely deployed for some time. In the near term it will be necessary to have Digital Earth access points in public places, like museums, where high bandwidth is available.
Digital Earth information volume is estimated by assuming that a fraction of all digital information has a geospatial component. Estimates place the total volume of recorded information at several thousand petabytes, i.e., several exabytes. It has been regularly postulated in the geographic community that half or more of all information has a geospatial component. Even though we will soon have the capacity to digitally record this volume of information, most of it will never be looked at by a human. Tools are needed for auto-summarization, distilling the information into knowledge with lower volume and higher semantic content.
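The arithmetic behind these estimates is straightforward; a sketch with illustrative numbers (the 3,000 PB total and 50% geospatial fraction are simply round versions of the figures quoted above):

```python
# Back-of-the-envelope check of the volumes quoted above.
PB_PER_EB = 1000            # 1 exabyte = 1000 petabytes

total_pb = 3000             # "several thousand petabytes"
geospatial_fraction = 0.5   # "half or more of all information"

total_eb = total_pb / PB_PER_EB
geospatial_eb = total_eb * geospatial_fraction

print(total_eb, geospatial_eb)  # 3.0 1.5
```

So even the conservative end of these estimates implies on the order of an exabyte of geospatially referenced information.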
To allow decisions and actions based on the knowledge of Digital Earth requires analysis of the knowledge using tools particular to the geospatial domain. As Digital Earth will exist in a distributed service environment based on standards for interoperability, the standards must address the particulars of geospatial semantics. Syntax standards for transporting semantic information (e.g., XML) have been defined and extended with geospatial structures. Standards for achieving shared understandings ("domain semantics") are yet to be developed. Beyond domain semantics, the validity of chaining services on geospatial features ("process semantics") is less developed.
Climate Data Sharing for Urban Resilience - OGC Testbed 11 - George Percivall
OGC Testbed 11:
Delivering on our commitment to the Climate Data Initiative
In December 2014 the US White House Office of Science and Technology (OSTP) released a Policy Fact Sheet titled "Harnessing Climate Data to Boost Ecosystem & Water Resilience." The Fact Sheet includes OGC’s commitment to increase open access to climate change information using open standards. Testbed 11 results are now available delivering on that commitment.
The results of this major interoperability testbed contribute to development and refinement of international standards that are critical for the communication and integration of geospatial information. http://www.opengeospatial.org/projects/initiatives/testbed11
• Nine sponsors provided requirements and funding for Testbed 11.
• Thirty organizations participated in Testbed 11 by contributing prototypes and engineering reports and by participating in a scenario-driven demonstration of the technical advances.
Technical results of Testbed 11 relevant to the Climate Data Initiative include:
• Analysis and prediction based on open climate data accessed using open standards
• Making predictive models more accessible with OGC Web Processing Service (WPS)
• Verifying model predictions using mobile operations, in-situ gauges and social media.
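To give a flavour of how a WPS makes a predictive model accessible, a client issues an Execute request over plain HTTP. A minimal sketch follows; the endpoint, process identifier and inputs are hypothetical, while the key-value parameters follow the WPS 1.0.0 Execute interface.

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint, process id and inputs; parameters per WPS 1.0.0.
base = "https://example.org/wps"
params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "CoastalInundation",                   # assumed process id
    "datainputs": "scenario=storm-surge;sealevel=1.5",   # assumed inputs
}
url = base + "?" + urlencode(params)
print(url)
```

A live WPS server would run the named process with the supplied inputs and return an XML response referencing the model outputs.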
Climate adaptation, resilience and security planning based on technology from OGC Testbed 11:
• Estimating the geographic extent of coastal inundation in dynamic weather conditions
• Assessing social unrest due to populations displaced by climate change
• Integrating spatial and non-spatial models of human geography and resilience
• Predictive models and verification to support planning and response phases
Manual on Remote Sensing v4 - Chapter 6 archive and access - George Percivall
Presentation on ASPRS Manual on Remote Sensing, v4 (MRSv4)
John Faundeen, USGS/EROS and I are editors of the Archiving and Access chapter.
My focus is on visualization, access, processing and workflow.
MRSv4 is planned for release at the ISPRS congress next year.
MRSv4 Chap 6 at ASPRS Annual Meeting 2015
Urban IoT for Smart Cities: New Pathways to Business and Location Intelligenc... - George Percivall
Presentation to Location Intelligence 2014 on 20 May 2014, during the Opening Plenary/LI Vision Panel: "Location Analytics & Visual Data Discovery … New Pathways to Business Intelligence." My presentation identifies how rich location information is vital to the success of smart cities. Topics addressed include the benefits of location infrastructure in Smart Cities; spatial architecture spanning geospatial data, infrastructure and buildings across indoor, outdoor and urban settings; and the OGC Smart Cities Testbed as a convergence of many technologies to meet the needs of urban citizens and services.
http://www.locationintelligence.net/dc/agenda/
Mobile World Congress 2014 was again a huge display of the power of location information. OGC standards for mobile applications are key to exploiting the value of geospatial information. OGC has several open standards that enable accurate and robust sharing of geospatial information in mobile environment.
Variations of this presentation were made at the OGC Workshop at MWC, at the OMA Demo Day and at the Small Cell Zone exhibit space.
Note the slide calling for a Smart Cities - Urban IoT Testbed concept that builds on OGC Interoperability Program capabilities.
The Open Landscape of Geospatial Information: Open data, open source, open standards
Presented at ASPRS GeoTech 2013 conference: http://www.asprspotomac.org/geotech2013/
Abstract:
The many dimensions of "open" provide users with higher quality geospatial information. Open Standards ensure interoperability with information whether it is served by proprietary or open source software. Open Source software benefits the development of open standards and leads to a business ecosystem that includes more providers, more partnerships and more customers.[1] In the end the user does not care whether the code is open or proprietary. Users care about access to data and the quality of the data. Open Data has advanced with the recent policies from GEOSS Data-CORE [2] and the US Open Government Initiative [3]. Open Earth Observation data from government sources benefits industry and users. Open standards, open source and open data can result in higher quality information: the fusion of data from multiple sources results in higher quality, and fusion is possible when multiple data sources can be interrelated [4]. Improving data quality by knowing the uncertainty and provenance of derived information depends on an open landscape of geospatial information.
[1] http://wiki.osgeo.org/wiki/Open_Source_and_Open_Standards
[2] http://www.earthobservations.org/geoss_dsp.shtml
[3] http://www.whitehouse.gov/open
[4] http://www.opengeospatial.org/projects/initiatives/fusion2
Location Based Services update for Small Cell Forum - George Percivall
Presentation about OGC activities on location based services with an emphasis on indoor location and IndoorGML.
Agenda of talk:
- The power of location
- Mission of OGC
- OGC standards
- OpenLS - OGC Open Location Services
- New developments: IndoorGML and others
Responding to an oil spill requires access and understanding of many types of information. Effective, coordinated operations for the response are based on a shared, common picture of the situation. Interoperability provides shared situational awareness of the crisis and the response activities.
The OGP and IPIECA are conducting a Joint Industry Project to produce a recommended practice for an Oil Spill Response Common Operating Picture (COP) for management of the response. The presentation provides an overview of the plans and status of the OGP/IPIECA project, which is being conducted with support from RDA and OGC.
Providing Globus Services to Users of JASMIN for Environmental Data Analysis - Globus
JASMIN is the UK’s high-performance data analysis platform for environmental science, operated by STFC on behalf of the UK Natural Environment Research Council (NERC). In addition to its role in hosting the CEDA Archive (NERC’s long-term repository for climate, atmospheric science & Earth observation data in the UK), JASMIN provides a collaborative platform to a community of around 2,000 scientists in the UK and beyond, providing nearly 400 environmental science projects with working space, compute resources and tools to facilitate their work. High-performance data transfer into and out of JASMIN has always been a key feature, with many scientists bringing model outputs from supercomputers elsewhere in the UK, to analyse against observational or other model data in the CEDA Archive. A growing number of JASMIN users are now realising the benefits of using the Globus service to provide reliable and efficient data movement and other tasks in this and other contexts. Further use cases involve long-distance (intercontinental) transfers to and from JASMIN, and collecting results from a mobile atmospheric radar system, pushing data to JASMIN via a lightweight Globus deployment. We provide details of how Globus fits into our current infrastructure, our experience of the recent migration to GCSv5.4, and of our interest in developing use of the wider ecosystem of Globus services for the benefit of our user community.
How to Position Your Globus Data Portal for Success: Ten Good Practices - Globus
Science gateways allow science and engineering communities to access shared data, software, computing services, and instruments. Science gateways have gained a lot of traction in the last twenty years, as evidenced by projects such as the Science Gateways Community Institute (SGCI) and the Center of Excellence on Science Gateways (SGX3) in the US, The Australian Research Data Commons (ARDC) and its platforms in Australia, and the projects around Virtual Research Environments in Europe. A few mature frameworks have evolved with their different strengths and foci and have been taken up by a larger community such as the Globus Data Portal, Hubzero, Tapis, and Galaxy. However, even when gateways are built on successful frameworks, they continue to face the challenges of ongoing maintenance costs and how to meet the ever-expanding needs of the community they serve with enhanced features. It is not uncommon that gateways with compelling use cases are nonetheless unable to get past the prototype phase and become a full production service, or if they do, they don't survive more than a couple of years. While there is no guaranteed pathway to success, it seems likely that for any gateway there is a need for a strong community and/or solid funding streams to create and sustain its success. With over twenty years of examples to draw from, this presentation goes into detail for ten factors common to successful and enduring gateways that effectively serve as best practices for any new or developing gateway.
Enhancing Research Orchestration Capabilities at ORNL - Globus
Cross-facility research orchestration comes with ever-changing constraints regarding the availability and suitability of various compute and data resources. In short, a flexible data and processing fabric is needed to enable the dynamic redirection of data and compute tasks throughout the lifecycle of an experiment. In this talk, we illustrate how we easily leveraged Globus services to instrument the ACE research testbed at the Oak Ridge Leadership Computing Facility with flexible data and task orchestration capabilities.
In software engineering, the right architecture is essential for robust, scalable platforms. Wix has undergone a pivotal shift from event sourcing to a CRUD-based model for its microservices. This talk charts the course of that journey.
Event sourcing, which records state changes as immutable events, provided robust auditing and "time travel" debugging for Wix Stores' microservices. Despite its benefits, the complexity it introduced in state management slowed development. Wix responded by adopting a simpler, unified CRUD model. This talk will explore the challenges of event sourcing and the advantages of Wix's new "CRUD on steroids" approach, which streamlines API integration and domain event management while preserving data integrity and system resilience.
Participants will gain valuable insights into Wix's strategies for ensuring atomicity in database updates and event production, as well as caching, materialization, and performance optimization techniques within a distributed system.
Join us to discover how Wix has mastered the art of balancing simplicity and extensibility, and learn how the re-adoption of the modest CRUD has turbocharged their development velocity, resilience, and scalability in a high-growth environment.
Exploring Innovations in Data Repository Solutions - Insights from the U.S. G... - Globus
The U.S. Geological Survey (USGS) has made substantial investments in meeting evolving scientific, technical, and policy-driven demands on storing, managing, and delivering data. As these demands continue to grow in complexity and scale, the USGS must continue to explore innovative solutions to improve its management, curation, sharing, delivery, and preservation approaches for large-scale research data. Supporting these needs, the USGS has partnered with the University of Chicago-Globus to research and develop advanced repository components and workflows leveraging its current investment in Globus. The primary outcome of this partnership is the development of a prototype enterprise repository, driven by USGS Data Release requirements, through exploration and implementation of the entire suite of Globus platform offerings, including Globus Flow, Globus Auth, Globus Transfer, and Globus Search. This presentation will provide insights into this research partnership, introduce the unique requirements and challenges being addressed, and report on relevant project progress.
Understanding Globus Data Transfers with NetSage - Globus
NetSage is an open, privacy-aware network measurement, analysis, and visualization service designed to help end-users visualize and reason about large data transfers. NetSage has traditionally used a combination of passive measurements, including SNMP and flow data, as well as active measurements, mainly perfSONAR, to provide longitudinal network performance data visualization. It has been deployed by dozens of networks worldwide, and is supported domestically by the Engagement and Performance Operations Center (EPOC), NSF #2328479. We have recently expanded the NetSage data sources to include logs for Globus data transfers, following the same privacy-preserving approach as for flow data. Using the logs for the Texas Advanced Computing Center (TACC) as an example, this talk will walk through several example questions that NetSage can answer, including: Who is using Globus to share data with my institution, and what kind of performance are they able to achieve? How many transfers has Globus supported for us? Which sites are we sharing the most data with, and how is that changing over time? How is my site using Globus to move data internally, and what kind of performance do we see for those transfers? What percentage of data transfers at my institution used Globus, and how did the overall data transfer performance compare to the Globus users?
Code reviews are vital for ensuring good code quality. They serve as one of our last lines of defense against bugs and subpar code reaching production.
Yet, they often turn into annoying tasks riddled with frustration, hostility, unclear feedback and lack of standards. How can we improve this crucial process?
In this session we will cover:
- The Art of Effective Code Reviews
- Streamlining the Review Process
- Elevating Reviews with Automated Tools
By the end of this presentation, you'll have the knowledge to organize and improve your code review process.
Innovating Inference - Remote Triggering of Large Language Models on HPC Clus... - Globus
Large Language Models (LLMs) are currently the center of attention in the tech world, particularly for their potential to advance research. In this presentation, we'll explore a straightforward and effective method for quickly initiating inference runs on supercomputers using the vLLM tool with Globus Compute, specifically on the Polaris system at ALCF. We'll begin by briefly discussing the popularity and applications of LLMs in various fields. Following this, we will introduce the vLLM tool, and explain how it integrates with Globus Compute to efficiently manage LLM operations on Polaris. Attendees will learn the practical aspects of setting up and remotely triggering LLMs from local machines, focusing on ease of use and efficiency. This talk is ideal for researchers and practitioners looking to leverage the power of LLMs in their work, offering a clear guide to harnessing supercomputing resources for quick and effective LLM inference.
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...informapgpstrackings
Keep tabs on your field staff effortlessly with Informap Technology Centre LLC. Real-time tracking, task assignment, and smart features for efficient management. Request a live demo today!
For more details, visit us : https://informapuae.com/field-staff-tracking/
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...Globus
The Earth System Grid Federation (ESGF) is a global network of data servers that archives and distributes the planet’s largest collection of Earth system model output for thousands of climate and environmental scientists worldwide. Many of these petabyte-scale data archives are located in proximity to large high-performance computing (HPC) or cloud computing resources, but the primary workflow for data users consists of transferring data, and applying computations on a different system. As a part of the ESGF 2.0 US project (funded by the United States Department of Energy Office of Science), we developed pre-defined data workflows, which can be run on-demand, capable of applying many data reduction and data analysis to the large ESGF data archives, transferring only the resultant analysis (ex. visualizations, smaller data files). In this talk, we will showcase a few of these workflows, highlighting how Globus Flows can be used for petabyte-scale climate analysis.
Globus Compute wth IRI Workflows - GlobusWorld 2024Globus
As part of the DOE Integrated Research Infrastructure (IRI) program, NERSC at Lawrence Berkeley National Lab and ALCF at Argonne National Lab are working closely with General Atomics on accelerating the computing requirements of the DIII-D experiment. As part of the work the team is investigating ways to speedup the time to solution for many different parts of the DIII-D workflow including how they run jobs on HPC systems. One of these routes is looking at Globus Compute as a way to replace the current method for managing tasks and we describe a brief proof of concept showing how Globus Compute could help to schedule jobs and be a tool to connect compute at different facilities.
Globus Connect Server Deep Dive - GlobusWorld 2024Globus
We explore the Globus Connect Server (GCS) architecture and experiment with advanced configuration options and use cases. This content is targeted at system administrators who are familiar with GCS and currently operate—or are planning to operate—broader deployments at their institution.
Enhancing Project Management Efficiency_ Leveraging AI Tools like ChatGPT.pdfJay Das
With the advent of artificial intelligence or AI tools, project management processes are undergoing a transformative shift. By using tools like ChatGPT, and Bard organizations can empower their leaders and managers to plan, execute, and monitor projects more effectively.
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
Large Language Models and the End of ProgrammingMatt Welsh
Talk by Matt Welsh at Craft Conference 2024 on the impact that Large Language Models will have on the future of software development. In this talk, I discuss the ways in which LLMs will impact the software industry, from replacing human software developers with AI, to replacing conventional software with models that perform reasoning, computation, and problem-solving.
12. What is Geodesy?
Latitude is not unique (φ1 ≠ φ2), nor is longitude, due to different geodetic datums.
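The datum effect can be sketched numerically. The Python snippet below (a minimal sketch, not from the slides) interprets the same Earth-centred Cartesian (ECEF) point on two reference ellipsoids, WGS84 and the International 1924 (Hayford) ellipsoid, and shows that the geodetic latitude and height both change. Note this only captures the ellipsoid-change part of a datum difference; real datum transformations also shift the ellipsoid's origin and orientation, which this sketch omits.

```python
import math

# Ellipsoid parameters: semi-major axis a (metres) and flattening f.
WGS84 = (6378137.0, 1 / 298.257223563)
INTL_1924 = (6378388.0, 1 / 297.0)  # International (Hayford) ellipsoid

def geodetic_to_ecef(lat_deg, lon_deg, h, ellipsoid):
    """Convert geodetic coordinates to Earth-centred Cartesian (ECEF)."""
    a, f = ellipsoid
    e2 = f * (2 - f)  # first eccentricity squared
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - e2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_geodetic(x, y, z, ellipsoid):
    """Invert ECEF to geodetic coordinates by fixed-point iteration."""
    a, f = ellipsoid
    e2 = f * (2 - f)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - e2))  # initial guess
    for _ in range(10):
        n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - e2 * n / (n + h)))
    return math.degrees(lat), math.degrees(math.atan2(y, x)), h

# The same physical point, interpreted on two different ellipsoids:
xyz = geodetic_to_ecef(52.0, 5.0, 0.0, WGS84)
lat2, lon2, h2 = ecef_to_geodetic(*xyz, INTL_1924)
dlat_m = abs(lat2 - 52.0) * 111_320  # rough metres per degree of latitude
print(f"latitude shift ≈ {dlat_m:.0f} m, height shift ≈ {h2:.0f} m")
```

The latitude shifts by several tens of metres and the height by roughly a couple of hundred metres, which is consistent with the error magnitudes quoted later in the deck.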
13. What is Geodesy?
[Figure: a familiarly shaped ‘continent’ shown in different map projections: Mercator, Globular, Orthographic, and Stereographic.]
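The shape differences between projections come from latitude-dependent distortion. As a minimal illustration (my own sketch, assuming a spherical Earth rather than any projection used in the figure), the forward spherical Mercator formulas below show why Mercator stretches features away from the equator:

```python
import math

R = 6371000.0  # mean Earth radius in metres (spherical model)

def mercator(lat_deg, lon_deg):
    """Forward spherical Mercator projection: (lat, lon) -> (x, y) in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = R * lon
    y = R * math.log(math.tan(math.pi / 4 + lat / 2))
    return x, y

def scale_factor(lat_deg):
    """Local scale: how much Mercator stretches distances at a latitude."""
    return 1 / math.cos(math.radians(lat_deg))

print(scale_factor(0))   # 1.0 at the equator: no stretching
print(scale_factor(60))  # ≈ 2: lengths doubled at 60° N/S
```

The scale factor sec(φ) grows without bound toward the poles, which is why the same ‘continent’ can look dramatically different from one projection to the next.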
14. What is Geodesy?
What errors can you expect?
- Wrong geodetic datum: several hundreds of metres
- Incorrect ellipsoid:
  - horizontally: several tens of metres
  - height: not affected, or tens to several hundred metres
- Wrong map projection:
  - entirely the wrong projection: hundreds, even thousands of kilometres (at least easy to spot!)
  - partly wrong (i.e. one or more parameters are wrong): several metres to many hundreds of kilometres
- No geodetic metadata (datum, ellipsoid, prime meridian, map projection): coordinates cannot be interpreted at all
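The “partly wrong projection” case is easy to demonstrate. In this hypothetical sketch (spherical Mercator, parameters chosen for illustration), mis-setting a single projection parameter, the central meridian, by just 3° displaces every easting by hundreds of kilometres:

```python
import math

R = 6371000.0  # mean Earth radius in metres (spherical model)

def mercator_x(lon_deg, central_meridian_deg=0.0):
    """Spherical Mercator easting, measured from the central meridian."""
    return R * math.radians(lon_deg - central_meridian_deg)

lon = 5.0
correct = mercator_x(lon, central_meridian_deg=0.0)
wrong = mercator_x(lon, central_meridian_deg=3.0)  # parameter mis-set by 3°
print(f"easting error ≈ {abs(correct - wrong) / 1000:.0f} km")
```

A 3° error in one parameter shifts coordinates by over 300 km at the equator, squarely in the “many hundreds of kilometres” range quoted above, and far harder to spot than using the entirely wrong projection.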