Defining Digital Earth as a virtual representation of all digital information with a geospatial component, this geography attempts to delineate the scope and elements of Digital Earth. The framework for this geography is a set of layers applicable to describing an information system. From bottom to top the layers are physical, data, information, knowledge, decisions and actions. This geography concludes that some existing technologies are sufficient for a Digital Earth to come into existence, but others, particularly in the upper layers, still need to be developed. Three conclusions are listed in this abstract.
In the physical and data layers, the explosive growth of the Internet provides access to much Digital Earth data. However, the bandwidth necessary for high-end Digital Earth clients will not be widely deployed for some time. In the near term it will be necessary to have Digital Earth access points in public places like museums where high bandwidth is available.
Digital Earth information volume is estimated by assuming that a fraction of all digital information has a geospatial component. Estimates place the total volume of recorded information at several thousand petabytes, i.e., several exabytes. It has been regularly postulated in the geographic community that half or more of all information has a geospatial component. Even though we will soon have the capacity to digitally record this volume of information, most of it will never be looked at by a human. Tools are needed for auto-summarization, distilling the information into knowledge with lower volume and higher semantic content.
To allow decisions and actions based on the knowledge of Digital Earth requires analysis of that knowledge using tools particular to the geospatial domain. As Digital Earth will exist in a distributed service environment based on standards for interoperability, those standards must address the particulars of geospatial semantics. Syntax standards for transporting semantic information (e.g., XML) have been defined and extended with geospatial structures. Standards for achieving shared understandings ("domain semantics") are yet to be developed. Beyond domain semantics, standards for the validity of chaining services on geospatial features ("process semantics") are even less developed.
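The point about XML syntax extended with geospatial structures can be made concrete: GML is the OGC's XML grammar for geographic features. The sketch below parses a minimal GML-style point with Python's standard library; the feature and coordinate values are invented for illustration.

```python
import xml.etree.ElementTree as ET

# A minimal GML-style fragment (illustrative feature, not from a real dataset)
GML = """
<gml:Point xmlns:gml="http://www.opengis.net/gml" srsName="EPSG:4326">
  <gml:pos>53.35 -6.26</gml:pos>
</gml:Point>
"""

def parse_gml_point(xml_text):
    """Extract (lat, lon) and the spatial reference system from a gml:Point."""
    ns = {"gml": "http://www.opengis.net/gml"}
    root = ET.fromstring(xml_text)
    lat, lon = (float(v) for v in root.find("gml:pos", ns).text.split())
    return (lat, lon), root.get("srsName")

coords, srs = parse_gml_point(GML)
```

The XML syntax carries the geometry, but the shared meaning of `srsName` and coordinate order is exactly the kind of domain semantics the standards must pin down.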
Providing geospatial information as Linked Open Data - Pat Kenny
ADAPT is revolutionising the way people can seamlessly interact with digital content, systems and each other and enabling users to achieve unprecedented levels of access and efficiency. - Prof. Declan O'Sullivan, Trinity College Dublin. Address given at Ordnance Survey Ireland GI R&D Initiatives, Tuesday, 22 March 2016, 13:00 to 20:30 (GMT), Maynooth University.
Geographic Information Management Transformation - Pat Kenny
GI Management Transformation: from geometry to data-based relationships. - Dr Tracey P. Lauriault, School of Journalism and Communication, Carleton University & Programmable City, Maynooth University. Address given at Ordnance Survey Ireland GI R&D Initiatives, Tuesday, 22 March 2016, 13:00 to 20:30 (GMT), Maynooth University.
presented at the 2011 SemTech
Open government data and related services/applications are growing quickly on the Web. Although most agree that government data has great potential for solving real-world problems, there are still many challenges that must be addressed. This talk will describe several representative domain applications and provide concrete examples of the technical challenges that remain. We will show solution paths that have proven useful and make recommendations on the corresponding Semantic Web best practices.
• Scalability. How can we handle (e.g., search and cleanse) the 3,000+ raw/tool datasets, and the additional 300,000+ geo datasets from data.gov?
• Interoperability. Multi-scale open government data comes from city governments, state governments, and national governments. How can one compare the GDP of the US and China, and later link to state-level financial data? Open government data covers many domains. How can one associate open government data with domain knowledge to build a cancer prevention application?
• Provenance and quality. How should provenance be leveraged to facilitate high-quality data management interactions (e.g. reuse, mash-up and feedback) between the government and the public?
Five Technology Trends Every Nonprofit Needs to Know - Azavea
Are you tired of hearing about big data, social media, web 2.0, and other buzzwords? This session will introduce five emerging technology trends that will fundamentally impact the independent sector. Join us and learn how to incorporate them into your current plans to better reach your donors, engage your constituents, and maximize your impact.
Big Data - Yesterday, Today and Tomorrow by John Mashey, Techviser - Angela Hey
A history of Big Data by Dr John Mashey. It discusses 1890 census data, IBM mainframes, DEC minicomputer clusters, Silicon Graphics (SGI) applications and a history of Open Source computing.
GIS 2.0: Impacts on Humanitarian Affairs and Genocide Studies - Joshua Campbell
Presentation given to the Geography 571: Geography of Genocide and Geography 526: Remote Sensing of the Environment I class at the University of Kansas on 22 March 2010
Visualizing history - A proposal for Augmentive Drones in Archaeology - Clinton Jones
A report I'm working on for archaeology sites, designed to incorporate drone technology and ubiquitous computing during the excavation process.
The Use of GIS in Local Government - The City of Monash - Steven Truman
The City of Monash as a case study in the use of Geographic Information Systems (GIS) and geographic data in Local government. The city of Monash is located in the south eastern suburbs of Melbourne, Victoria, Australia.
Drupal Day 2011 - Thinking spatially with your open data - DrupalDay
Talk by Juan Arevalo & Marco Giacomassi | Drupal Day Roma 2011
The Open Data movement is now moving a step forward: many governments, institutions and businesses have recently started making information available to citizens and customers. Data is now seen as a powerful instrument for increasing the transparency of public administration and business policies. About 80% of this information has a spatial component that is not yet fully exploited. A range of open source solutions are now available to address this challenge; in this session we will explore their potential and possible applications. The so-called "data deluge" is here... but we can build good umbrellas. Please come to learn more about it!
Scientific Knowledge from Geospatial Observations - George Percivall
Presentation to IGARSS 2015 Conference, July 2015, Milan, Italy.
Part of invited session: Why Data Matters: Value of Stewardship and Knowledge Augmentation Services
This presentation accompanied a joint keynote address given by AAM's Brian Nicholls and Singapore Land Authority's (SLA) Dr Victor Khoo at the Locate17 Conference. AAM and the SLA are working together to capture and deliver an accurate and up-to-date 3D digital map for the entire country of Singapore, providing the digital framework for Singapore's visionary Smart Nation program. This presentation outlines the processes and technologies used to create the 3D digital map and highlights the many applications stemming from it such as Property Management systems, Solar Potential Studies, the development of Driverless Vehicle systems and more (many yet to be discovered!).
The presentation looks at some of the key capabilities that are required, whether at a campus-wide, regional or national level to make sure that digitisation happens effectively, as rapidly as possible and offers value for money in the medium and long term.
In recent years governments and research institutions have emphasized the need for open data as a fundamental component of open science. But we need much more than the data themselves for them to be reusable and useful. We need descriptive and machine-readable metadata, of course, but we also need the software and the algorithms necessary to fully understand the data. We need the standards and protocols that allow us to easily read and analyze the data with the tools of our choice. We need to be able to trust the source and derivation of the data. In short, we need an interoperable data infrastructure, but it must be a flexible infrastructure able to work across myriad cultures, scales, and technologies. This talk will present a concept of infrastructure as a body of human, organisational, and machine relationships built around data. It will illustrate how a new organization, the Research Data Alliance, is working to build those relationships to enable functional data sharing and reuse.
Dealing with Semantic Heterogeneity in Real-Time Information - Edward Curry
Tutorial at the EarthBiAs 2014 Summer School on Dealing with Semantic Heterogeneity in Real-Time Information
Part I: Large Scale Open Environments
Part II: Computational Paradigms
Part III: RDF Event Processing
Part IV: Theory of Event Exchange
Part V: Approaches to Semantic Decoupling
Part VI: Example Application: Linked Energy Intelligence
"The Golden Age of Geospatial Data Science and Engineering" presented as the initial lecture in the Geospatial Data Science Distinguished Speaker Series at the University of Illinois, Urbana-Champaign. Series organized and presented by Professor Shaowen Wang, Head of the Geography and Geographic Information Science Department.
"Data Science is in a golden age. The mathematical foundations of Data Science, known for many years, are now seeing broad applicability due to engineering advances in cloud and big data computing and due to the explosive availability of data about nearly every aspect of human activity coming from mobile devices, remote sensing and the Internet of Things. Nearly all of this data has components of location and time, leading to stunning advances in geospatial data science. Development of intelligent systems using knowledge models, leading to insights and understanding, has the potential to significantly transform the geospatial data sciences. To achieve the fullest extent of their potential, these innovations require the establishment of open consensus standards. This talk will review recent developments in innovations, standards, and applications of geospatial data science and engineering."
OGC Update for State of Geospatial Tech at T-Rex - George Percivall
An update on OGC activities in three time horizons: Now, Next and After Next, finishing with how to stay up to date on OGC activities.
Now
Recently approved OGC standards
Implementation of approved standards
Next
Standards Program
Innovation Program
After Next
Tech Forecast
How to keep in touch
Analysis Ready Data workshop - OGC presentation - George Percivall
The Open Geospatial Consortium (OGC) has activities relevant to the workshop scope of "the current state-of-the-art in satellite data interoperability". This presentation will focus on two main topics, with the option to discuss other relevant topics raised by participants, e.g., WFS3. The two focus areas of development are: 1) Geospatial Datacubes and 2) Earth Observation Exploitation Platforms. 1) A Geospatial Datacube provides access to and analytics on analysis ready data (ARD), organized with coordinate axes of space and time and with cells in the cube containing data of geospatial features, e.g., imagery. OGC members implementing geospatial datacubes are documenting common practices to spur development, leading to the possibility of federating geospatial datacubes. 2) OGC is forming an Earth Observation Exploitation Platform Domain Working Group with the goal of defining a standards-based framework for cloud-based access to and analysis of EO data. An ad-hoc meeting was held in March 2018 to scope the working group, with the results issued in a request for comment: http://www.opengeospatial.org/pressroom/pressreleases/2792
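The datacube idea described above can be sketched in a few lines of Python: a dense array indexed by (time, lat, lon) axes with cell lookup and a simple analysis along the time axis. This is a toy illustration of the concept, not an implementation of any OGC datacube API; all names and axis labels are invented.

```python
class DataCube:
    """Toy space-time datacube: values indexed by (time, lat, lon) axes."""

    def __init__(self, times, lats, lons, fill=0.0):
        self.times, self.lats, self.lons = list(times), list(lats), list(lons)
        # Dense storage: cells[t][y][x], one value per space-time cell
        self.cells = [[[fill for _ in lons] for _ in lats] for _ in times]

    def set(self, time, lat, lon, value):
        t, y, x = self.times.index(time), self.lats.index(lat), self.lons.index(lon)
        self.cells[t][y][x] = value

    def get(self, time, lat, lon):
        t, y, x = self.times.index(time), self.lats.index(lat), self.lons.index(lon)
        return self.cells[t][y][x]

    def mean_over_time(self, lat, lon):
        """Analytics on the cube: average one cell's value along the time axis."""
        y, x = self.lats.index(lat), self.lons.index(lon)
        series = [self.cells[t][y][x] for t in range(len(self.times))]
        return sum(series) / len(series)

cube = DataCube(times=["2018-01", "2018-02"], lats=[45.0, 46.0], lons=[9.0, 10.0])
cube.set("2018-01", 45.0, 9.0, 2.0)
cube.set("2018-02", 45.0, 9.0, 4.0)
```

Federation, as discussed in the abstract, would mean agreeing on the axis definitions and cell semantics across independently operated cubes so that queries like `mean_over_time` can span providers.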
This talk opened the geospatial track of the Apache Big Data conference. The geospatial track aimed to increase the benefits of implementing open source consistent with open geospatial standards.
After an introduction of the geospatial track this talk focused on these topics:
- Applications of Big Geo Data
- Geospatial Open Standards
- Big Geo Use Cases
- Open Source and Open Standards.
Keynote presentation to New Zealand Geospatial Research Conference 2015. This presentation covered emerging topics for geospatial research in four areas:
- Spatial Representation: urban models, CityGML, indoor and DGGS
- New Data Sources: sensors everywhere, IoT, UAVs citizen observations, social media
- Computer Engineering: Big data, moving features, spatial analytics, mobile, 3D portrayal, augmented reality
- Application Areas: Soils Interoperability Experiment, Urban Climate Resilience in OGC Testbed 11.
Presentation Location and Context World, 2015. Palo Alto, CA November 3-4, 2015.
Abstract: Creating useful local context requires big data platforms and marketplaces. Contextual awareness is relevant to location based marketing, first responders, urban planners and many others. Location-aware mobile devices are revolutionizing how consumers and brands interact in the physical world. Situational awareness is a key element to efficiently handling any emergency response. In all cases, big data processing and high velocity streaming of location based data creates the richest contextual awareness. Data from many sources including IoT devices, sensor webs, surveillance and crowdsourcing are combined with semantically-rich urban and indoor data models. The resulting context information is delivered to and shared by mobile devices in connected and disconnected operations. Standards play a key role in establishing context platforms and marketplaces. Successful approaches will consolidate data from ubiquitous sensing technologies on a common space-time basis to enable context-aware analysis of environmental and social dynamics.
Climate Data Sharing for Urban Resilience - OGC Testbed 11 - George Percivall
OGC Testbed 11:
Delivering on our commitment to the Climate Data Initiative
In December 2014 the US White House Office of Science and Technology (OSTP) released a Policy Fact Sheet titled "Harnessing Climate Data to Boost Ecosystem & Water Resilience." The Fact Sheet includes OGC’s commitment to increase open access to climate change information using open standards. Testbed 11 results are now available delivering on that commitment.
The results of this major interoperability testbed contribute to development and refinement of international standards that are critical for the communication and integration of geospatial information. http://www.opengeospatial.org/projects/initiatives/testbed11
• Nine sponsors provided requirements and funding for Testbed 11.
• Thirty organizations participated in Testbed 11 by contributing prototypes and engineering reports and by participating in a scenario-driven demonstration of the technical advances.
Technical results of Testbed 11 relevant to the Climate Data Initiative include:
• Analysis and prediction based on open climate data accessed using open standards
• Making predictive models more accessible with OGC Web Processing Service (WPS)
• Verifying model predictions using mobile operations, in-situ gauges and social media.
Climate adaptation, resilience and security planning based on technology from OGC Testbed 11:
• Estimating the geographic extent of coastal inundation in dynamic weather conditions
• Assessing social unrest among populations displaced by climate change
• Integrating spatial and non-spatial models of human geography and resilience
• Predictive models and verifications to support planning and response phases
UAVs are a disruptive technology bringing new geographic data and information to many application domains. UASs are similar to other geographic imagery systems, so existing frameworks are applicable. But the diversity of UAV platforms, along with the diversity of available sensors, presents challenges in the processing and creation of geospatial products. Efficient processing and dissemination of the data is achieved using software and systems that implement open standards. The challenges identified point to the need both to use existing standards and to extend them. Results from the use of the OGC Sensor Web Enablement set of standards are presented. Next steps in the progress of UAVs and UASs may follow the path of open data, open source and open standards.
Progress towards Open Standards-Based Agro-Geoinformatics - George Percivall
Keynote presentation to Agro-Geoinformatics Conference
20 July 2015, Istanbul, Turkey
http://agro-geoinformatics.org/
** What is agro-geoinformatics, and why the need for exchange of agriculture geo-information?
Efficient exchange of data on utilization of farmland, soil and crop characteristics, water availability, environmental impacts, …
Many user roles: growers, advisors, landowners, foodstuff processors, regulators and all levels of government
Major challenges to agriculture: climate change, increasing population, shortage of water and arable land
Increasing need for information standards to support transparency in agricultural goods and services markets
** Projects showing the progress of standards-based agro-geoinformatics technology
SoilML for information exchange
Soil information platforms
Precision Agriculture and In-situ networks
Remote sensing from satellites and drones
Big Data processing for decision support
Climate - Food - Water nexus
** OGC support of Agro-Geoinformatics
- Agriculture Domain Working Group
Identify geospatial interoperability challenges in agriculture domain
Forum to identify standards-based solutions, new standards
- Discrete Global Grid Systems standards development
Geometric partitioning of Earth surface into cells with identifiers
Enable fusion of disparate data for spatial analysis and modeling
- Soil Data Interoperability Experiment (SoilIE)
Testing standards for exchange of soils data
Results to converge and mature soil information standards.
Get involved as participant or an observer, contact:
David Medyckyj-Scott Medyckyj-Scottd@landcareresearch.co.nz
…and others
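The DGGS idea above, geometric partitioning of the Earth surface into cells with identifiers, can be illustrated with a toy equal-angle grid in which each level doubles the resolution. This is a sketch of the concept only; it is not an implementation of the OGC DGGS standard, and the identifier scheme is invented.

```python
def cell_id(lat, lon, level):
    """Map a lat/lon to a cell identifier in a simple equal-angle grid.

    Each level doubles the number of cells per degree. This illustrates the
    DGGS idea of partitioning the Earth surface into identified cells; it is
    NOT the OGC DGGS standard, whose grids are typically equal-area.
    """
    cells_per_degree = 2 ** level          # resolution doubles per level
    row = int((lat + 90.0) * cells_per_degree)
    col = int((lon + 180.0) * cells_per_degree)
    return f"L{level}/R{row}/C{col}"
```

Because nearby observations fall in the same cell at a coarse level, disparate data tagged with such identifiers can be fused for spatial analysis without re-projection, which is the point of the DGGS work item above.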
Manual on Remote Sensing v4 - Chapter 6 archive and access - George Percivall
Presentation on ASPRS Manual on Remote Sensing, v4 (MRSv4)
John Faundeen (USGS/EROS) and I are editors of the Archiving and Access chapter.
My focus is on visualization, access, processing and workflow.
MRSv4 is planned for release at the ISPRS congress next year.
MRSv4 Chap 6 at ASPRS Annual Meeting 2015
CyberGIS Architectures for Collaborative Problem Solving - OGC perspective - George Percivall
1. What is CyberGIS:
- collaboration; open data, open source, open standards
2. The plumbing for CyberGIS collaboration is available:
- Processing, Workflow, Model interoperability as web services are “solved” several times;
- the concepts for collaboration need to be made explicit
3. Need for “decision” and “hypothesis” objects including modeling and linked data
- Ontology for decision types. Templates for Decisions and Hypothesis
- Recommender systems - a guess at the riddle
- If I see these conditions then consider this decision template
- If I am researching these conditions then consider this hypothesis
Geospatial Temporal Open Standards for Big Data from Space (BiDS2014) - George Percivall
Presentation to ESA Big Data From Space (BiDS2014), November 2014.
Big data from space requires processing large amounts of data in a distributed environment. For efficient, quality and cost-effective deployment, these environments must be based on open standards. The Open Geospatial Consortium (OGC) open standards for geospatial-temporal information have been tuned through implementations to meet the needs of big data.
Time, Change and Habits in Geospatial-Temporal Information Standards - George Percivall
Keynote for HIC 2014 – 11th International Conference on Hydroinformatics, New York, USA August 17 – 21, 2014
Time, Change and Habits in Geospatial-Temporal Information Standards
Time and change are fundamental to our scientific understanding of the world. Standards for geospatial-temporal information exist, but new needs outstrip current standards. Geospatial-temporal information includes capturing change in features and coverages and modeling the processes that inform change. Key standards for time, calendars, and temporal reference systems are in place. Time series modeling from the WaterML standard is a recent advance of high value to hydrology. The OGC Moving Features standard will establish an encoding format for changes in "rigid" features. Interoperability standards are needed for coverages with values that change based on observations, analytical expressions, or simulations. Applying a coverage model to time-varying, fluid Earth systems was the topic of the groundbreaking GALEON Interoperability Experiment. Standards development for spatial-temporal process models is progressing with WPS, OpenMI and ESMF, supporting a Model Web concept. A robust framework for sharing geospatial-temporal information is now coming into place based on developments captured in standards by ISO, WMO, ITU, ICSU and OGC, including the newly established OGC Temporal domain working group. The new framework will enable capabilities in expressing and sharing scientific investigations, including research on the emergence of forms over time. With these new capabilities we may come to understand Peirce's observation that over time "all things have a tendency to take habits."
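The notion of encoding change in "rigid" features over time can be illustrated with a minimal sketch: a trajectory as timestamped positions, with linear interpolation between samples. This is an illustration of the idea only, not the OGC Moving Features encoding; the trajectory data are invented.

```python
def position_at(trajectory, t):
    """Linearly interpolate a moving feature's (x, y) position at time t.

    trajectory: list of (t, x, y) samples sorted by time. Illustrates
    time-varying features; NOT the OGC Moving Features encoding format.
    """
    if t <= trajectory[0][0]:
        return trajectory[0][1:]
    # Find the bracketing pair of samples and interpolate between them
    for (t0, x0, y0), (t1, x1, y1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    return trajectory[-1][1:]

# A hypothetical track: east for 10 time units, then north for 10 more
track = [(0, 0.0, 0.0), (10, 10.0, 0.0), (20, 10.0, 5.0)]
```

A standard encoding of such trajectories is what lets independently written clients and servers agree on where a feature was at any instant.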
Urban IoT for Smart Cities: New Pathways to Business and Location Intelligenc... - George Percivall
Presentation to Location Intelligence 2014 on 20 May 2014, during the Opening Plenary/LI Vision Panel: "Location Analytics & Visual Data Discovery … New Pathways to Business Intelligence". My presentation identifies how rich location information is vital to the success of smart cities. Topics addressed included the benefits of location infrastructure in Smart Cities; spatial architecture spanning geospatial data, infrastructure and buildings across indoor, outdoor and urban settings; and the OGC Smart Cities Testbed as a convergence of many technologies to meet the needs of urban citizens and services.
http://www.locationintelligence.net/dc/agenda/
Presentation to the ISPRS Congress 2012, Melbourne
Over the last decade, standards have played a key role in the expansion of the market for Earth Observation (EO) products and services. Standards become increasingly important as geospatial technologies and markets continue to evolve in an increasingly complex technology ecosystem. OGC and ISPRS work jointly to further the development of this vital information industry.
We continue to see global growth in the supply of geometrically controlled image-based geodata. On the data supplier side, most end-use EO information products use data from multiple EO sources (aerial and satellite) as well as from ground-based sources. On the customer side, customers’ business models involving EO data require easy connections between multiple data suppliers and multiple technology platforms. Typically, new markets create stovepiped, proprietary solutions that persist until market forces create demand for standards that in turn enhance market opportunity. The OGC’s standards meet this demand in the geospatial markets.
OGC leads worldwide in the creation and establishment of standards that allow geospatial content and services to be seamlessly integrated into business and civic processes, the spatial web and enterprise computing. OGC accelerates market assimilation of interoperability research through collaborative consortium processes.
OGC has both domain focused and technology focused activities. For example, the Meteorology & Oceanography Domain Working Group ensures that OGC standards and profiles allow the meteorological community to develop effective interoperability for web services and content across the wider geospatial domain. These needs are met for example by the technology of standards such as netCDF which was brought into the OGC to encourage broader international use and greater interoperability among clients and servers interchanging data in binary form.
Most OGC standards specify open interfaces or encodings that apply to imagery. Some of these are:
o Web Coverage Service (WCS)
o Web Coverage Processing Service (WCPS)
o Web Map Service (WMS)
o Geography Markup Language (GML)
o GML in JPEG 2000 Encoding
o OGC Network Common Data Form (NetCDF)
o Sensor Observation Service (SOS)
o Sensor Planning Service (SPS)
o Sensor Model Language Encoding Standard (SensorML)
o Catalogue Service for the Web (CSW)
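As an illustration of how these interface standards are used in practice, a WMS GetMap request is an HTTP query whose parameter names the WMS 1.3.0 standard defines. The sketch below builds such a URL with Python's standard library; the server URL and layer name are placeholders, not a real service.

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, width, height, crs="EPSG:4326",
               fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL (parameter names per the spec)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),   # min/max corners of the map
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer, for illustration only
url = getmap_url("https://example.org/wms", "coastlines",
                 bbox=(-90, -180, 90, 180), width=800, height=400)
```

Because every conforming server accepts the same parameters, any client can request a map from any WMS without server-specific code, which is the interoperability point the list above is making.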
Innovation in Geospatial Technology and Standards - George Percivall
All predictions are wrong; some are useful. This presentation offers a slate of geospatial trends developed in discussion with the OGC Board of Directors and expanded in an OGC blog series. These geospatial technology issues were developed by reviewing over 200 articles from geospatial publications as well as from information technology journals (IEEE, ACM, etc.).
These "Ripe Issues" of geospatial technology identify areas where further development of open standards can lead to great benefit:
* The Power of Location
* Internet of Things
* Mobile Development
* Indoor Frontier
* Cartographers of the future
* Big Processing of GeoData
* Smart Cities
The OGC is an international consortium whose members participate in a consensus process to develop publicly available geospatial standards. OGC has a history of developing anticipatory standards. OGC is a leader in balancing consensus with innovation, with OGC members actively designing standards while implementing running software. In the role of OGC Chief Engineer, George Percivall identifies technology and market trends relevant to open standards development.
Mobile World Congress 2014 was again a huge display of the power of location information. OGC standards for mobile applications are key to exploiting the value of geospatial information. OGC has several open standards that enable accurate and robust sharing of geospatial information in mobile environments.
Variations of this presentation were made at the OGC Workshop at MWC, at the OMA Demo Day and at the Small Cell Zone exhibit space.
Note the slide calling for a Smart Cities - Urban IoT Testbed concept that builds on OGC Interoperability Program capabilities.
TITLE: Open Standards Role in EarthCube (Invited)
AUTHORS (FIRST NAME, LAST NAME): Luis E Bermudez1, David K Arctur2, 1, George Percivall1
INSTITUTIONS (ALL): 1. Open Geospatial Consortium, Gaithersburg, MD, United States.
2. University of Texas at Austin, Austin, TX, United States.
ABSTRACT BODY: EarthCube is an NSF initiative that will enable sharing of data in an open and transparent manner, improving access and use of data and allowing scientists to better understand the Earth. EarthCube is based on a network of enthusiasts willing to make the sharing of data a reality. But is just having open data enough? Open data alone will not accelerate the process a science team must go through to understand, reformat and use the data. However, agreements among colleagues, or the adoption of existing agreements, can make a big difference. These agreements also need to be published, freely available, and unencumbered by intellectual property rights issues. The system design requirements for developing cyberinfrastructure for the geosciences need to take these open agreements into account, including open interfaces and open encodings. Once open agreements are in place, it is essential to have policy, procedures, and a governance body for maintaining those agreements. This presentation will explore these issues and suggest ways that standards development organizations, like the Open Geospatial Consortium (OGC), and other coordinating organizations, such as the Earth Science Information Partners (ESIP) and the Research Data Alliance (RDA), could be involved in this process.
http://www.opengeospatial.org
In AGU 2013 Session: IN43B. Emerging Concepts for Cyberinfrastructure in the Geosciences
The Open Landscape of Geospatial Information: Open data, open source, open standards
Presented at ASPRS GeoTech 2013 conference: http://www.asprspotomac.org/geotech2013/
Abstract:
The many dimensions of "open" provide users with higher-quality geospatial information. Open standards ensure interoperability of information whether it is served by proprietary or open source software. Open source software benefits the development of open standards and leads to a business ecosystem that includes more providers, more partnerships and more customers.[1] In the end the user does not care whether the code is open or proprietary. Users care about access to data and the quality of the data. Open data has advanced with recent policies from the GEOSS Data-CORE [2] and the US Open Government Initiative [3]. Open Earth observation data from government sources benefits industry and users. Open standards, open source and open data can together yield higher-quality information: fusing data from multiple sources improves quality, and fusion is possible when multiple data sources can be interrelated [4]. Improving data quality by knowing the uncertainty and provenance of derived information depends on an open landscape of geospatial information.
[1] http://wiki.osgeo.org/wiki/Open_Source_and_Open_Standards
[2] http://www.earthobservations.org/geoss_dsp.shtml
[3] http://www.whitehouse.gov/open
[4] http://www.opengeospatial.org/projects/initiatives/fusion2
Location Based Services update for Small Cell Forum, by George Percivall
Presentation about OGC activities on location based services with an emphasis on indoor location and IndoorGML.
Agenda of talk:
- The power of location
- Mission of OGC
- OGC standards
- OpenLS - OGC Open Location Services
- New developments: IndoorGML and others
1. A Geography of Digital Earth
George Percivall
Digital Earth Office
percivall@gsfc.nasa.gov
2. 14-Nov-01 p1
Digital Earth
• A virtual representation of our planet that enables a person to explore and interact with the vast amounts of natural and cultural information gathered about the Earth
– multi-resolution
– three-dimensional representation
• Surface of the earth as the organizing metaphor for a vast amount of data
3. [Diagram: Digital Earth links the EARTH to citizens and communities across Public and Private sectors, through Collecting Data, Interoperability, Digital Resources, Applications, and Tools & Technology]
4. A Geography for Digital Earth
Layers, from bottom to top: Data, Information, Knowledge, Decisions
– Data: a representation subject to interpretation
– Information: data with meaning assigned
– Knowledge: integrated model of information
– Decisions: pragmatic application of knowledge; goals of multiple stakeholders
(Diagram annotations on the transitions between layers: representation is described; compression of redundancies)
Adapted from "A Theory of Computer Semiotics", Peter Andersen, Cambridge Press, 1997
Definitions from ANSI Dictionary of Information Technology, www.ncits.org
5. A Geography for Digital Earth
Topics in this talk, arranged across the Data, Information, Knowledge, and Decisions layers: The world is wired; How big is DE?; Content is King?; Where in DE are you? (Phenomena Location); Fusion and Assimilation; Models, Agents, and Knowledge; Decision Support; Vladimir and Estragon
6. The World is Wired and Wireless
• Internet is pervasive
– IPv6: nearly 1,600 addresses per square meter of the earth, about 1 per square inch
• Gilder's law: deployed bandwidth triples every year
• Wired bandwidth
– Gigabit Ethernet now, Terabit Ethernet in 2008
– Landsat scene transfer: 1 per second now, 1000 per second in 2008
• Wireless bandwidth
– 4th generation wireless: 10 to 100 megabits/second (2010)
– Full motion video on wireless devices
• Deployment will take time
– Necessary to have Digital Earth access points in public places like museums
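The slide's figures can be sanity-checked with a little arithmetic: the raw IPv6 address space gives an enormously higher density than 1,600 addresses per square meter (that figure reflects a deliberately conservative allocation-efficiency estimate), and tripling bandwidth yearly from 2001 to 2008 yields roughly the quoted thousand-fold Landsat speedup. A quick check:

```python
# Earth's surface area in square meters (approximate)
EARTH_AREA_M2 = 5.1e14

# Raw IPv6 address space is 2^128 addresses
raw_density = 2**128 / EARTH_AREA_M2
print(f"raw IPv6 addresses per m^2: {raw_density:.2e}")  # ~6.7e23

# The slide's ~1,600/m^2 assumes very conservative allocation
# efficiency; even then, that is about 1 address per square inch:
per_sq_inch = 1600 * 0.0254**2
print(f"conservative estimate per sq inch: {per_sq_inch:.2f}")  # ~1.03

# Gilder's law: deployed bandwidth triples every year.
# 2001 to 2008 is seven tripling periods:
growth = 3**7
print(f"bandwidth growth over 7 years: {growth}x")  # 2187, roughly 1000x
```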
7. The Digital Earth: Varying Scales (Vision, Initiative, NASA Program)
• HUMANITY, e.g.: Land Use, Demographics, Global Climate, Sea Surface Temperature, Digital Elevation
• COUNTRY, e.g.: Food and Fiber, Disaster Preparedness, Biodiversity, Coastal Sensitivity
• COMMUNITY, e.g.: Smart Growth, Public Health, Disaster Response, Transportation Planning, Weather
• INDIVIDUAL, e.g.: Education, Decision Making, Enhanced Living, Information
8. How big is DE?
• All recorded information: ~ several exabytes
• Most data has a geographic component
• DE ~ an exabyte
[Chart: data volumes on a Kilo-to-Yotta scale, placing a photo, a map, a satellite image, a book, a movie, all LoC books as words (20 TB), all books, multimedia, a NASA EOSDIS data archive, Digital Earth, and everything recorded]
www.lesk.com/mlesk/ksg97/ksg.html
• Archive organization
– Image products in collections
– CEOS census: millions of collections
– On-line collections: tens of thousands
– Collections cataloged
• Need to get data on-line
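The sizing argument on this slide can be reproduced in a few lines. The "several exabytes" total and the one-half geospatial fraction are the talk's stated assumptions, not measured values; "several" is taken as 3 here for concreteness.

```python
PB = 10**15   # petabyte
EB = 10**18   # exabyte

total_recorded = 3 * EB        # "several exabytes" (assumed: 3)
geospatial_fraction = 0.5      # "half or more has a geospatial component"

digital_earth = total_recorded * geospatial_fraction
print(f"Digital Earth volume: {digital_earth / EB:.1f} EB")  # ~1.5 EB

# For scale: all Library of Congress books as words ~ 20 TB
loc_words = 20 * 10**12
print(f"DE is ~{digital_earth / loc_words:,.0f}x the LoC text collection")
```

The result lands near the slide's "DE ~ an exabyte" estimate for any reasonable reading of "several".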
9. Moving Digital Earth online
• Progression of Storage Technology
– Disk technology is overtaking tapes
– EOSDIS data moving from tape to disk
– In ten years RAM will cost what disk costs today
• Hurdles to moving online: technical, legal, business
– Technical issues of accessibility: geocoding, geographic access standards
– "Government commercialization" concerns
– Owners fear intellectual property will be stolen
– Need for copy-protection and payment schemes
– Privacy Impact Assessment
“Rules of Thumb in Data Engineering,” March 2000, Microsoft Technical Report, MS-TR-99-100
“Data Policy Issues and Barriers to Using Commercial Resources” Rand Doc No: DB-247-NASA/OSTP, 1999
10. Finding value in DE Data
• Technology will allow us to store everything
– In only a few years, we will be able to save everything; no information will have to be thrown out
– Human attention is the scarce resource
– Most of this data will never be looked at by a human being
• Domain Semantics
– Needs to have representation with the data, i.e., information
–Geography Markup Language
–Information communities, semantic nets, RDF
• Enables auto-summarization
–Mining based on geographic concepts
–Feature Tracking
–Feature Classification - Feature Type Catalogs
ADaM data mining engine, http://datamining.itsc.uah.edu/environment.html
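As a sketch of "representation with the data", a feature carried as GML can be read with ordinary XML tooling. The feature type and property names below are invented for illustration; only the `gml:Point`/`gml:pos` structure is from the GML standard.

```python
import xml.etree.ElementTree as ET

# A hypothetical GML-encoded feature: a named station with one property.
GML = """<ex:Station xmlns:ex="http://example.org/ex"
                     xmlns:gml="http://www.opengis.net/gml">
  <ex:name>Buoy 42</ex:name>
  <ex:seaSurfaceTemp>287.4</ex:seaSurfaceTemp>
  <ex:location>
    <gml:Point><gml:pos>38.99 -76.84</gml:pos></gml:Point>
  </ex:location>
</ex:Station>"""

ns = {"ex": "http://example.org/ex", "gml": "http://www.opengis.net/gml"}
root = ET.fromstring(GML)

# Because the meaning travels with the data, a client can pull out
# named properties and coordinates without out-of-band agreements.
name = root.findtext("ex:name", namespaces=ns)
temp = float(root.findtext("ex:seaSurfaceTemp", namespaces=ns))
lat, lon = map(float, root.findtext(".//gml:pos", namespaces=ns).split())
print(name, temp, lat, lon)
```

Mining and auto-summarization tools can then operate on typed features like this rather than on raw pixels or opaque files.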
11. Is Content King in Digital Earth?
• Economics of a telecommunications medium
– "Content is King" vs. "Commodity Pipelines"
– Broadcast content is most of the data volume
– Point-to-point communications is most of the dollars
• Content is King in DE
– Professionally developed content from data centers
• Commodity Pipelines prevail in DE
– Individuals providing information from a geographic location
• Need for geographic device directory, e.g., dot-geo
• Cellular cams, GPS and Virtualized Reality (CMU)
– Sensor webs
• The need for geographic reference
“The history of communications and its implications for the Internet,” Odlyzko, AT&T Labs
12. Knowing where you are in DE
• Coordinate Reference Systems
– Mature technology including ISO standards
– Registries for specific coordinate reference systems
• Overlay of multi-site 2-D data
– Standardized using OGC Web Mapping Service (WMS)
– "Digital Earth will do for georeferenced information what the World Wide Web did for text and multimedia" - Jeff de La Beaujardiere
• Merging multi-site 3-D and 4-D data
– Demonstrated as research
– Standards needed
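Coordinate reference systems are the mature part of this stack; a minimal example of moving between two common CRSs (geographic WGS 84 longitude/latitude and spherical "web" Mercator) needs only the standard projection formulas:

```python
import math

R = 6378137.0  # spherical Mercator earth radius, meters

def lonlat_to_mercator(lon_deg, lat_deg):
    """Project geographic coordinates to spherical Mercator (x, y) meters."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

def mercator_to_lonlat(x, y):
    """Inverse projection back to longitude/latitude in degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat

x, y = lonlat_to_mercator(-77.0, 38.9)   # near Washington, DC
lon, lat = mercator_to_lonlat(x, y)
print(round(lon, 6), round(lat, 6))      # round-trips to the input
```

Because these transforms are standardized and registered, any two 2-D layers referenced to known CRSs can be overlaid; the open problem noted on the slide is doing the same for multi-site 3-D and 4-D data.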
13. Knowing where you are in DE (continued)
Space Time Toolkit: http://vast.uah.edu/
• Merging multi-site 3-D and 4-D data
– Demonstrated as research
– Standards needed
14. Fusion and Assimilation
• Data Fusion
– Combine remote sensing data with other sources of geospatial information to improve the understanding of specific phenomena
– Fusion levels: pixel, features, decisions
• Data Assimilation
– Melding observations with model simulations to provide accurate estimation of the state of the atmosphere, oceans, land-surface, etc.
• Models
– Simulations of a given topic
– Geographic models utilize coordinate reference systems
– Metadata for models to enable interoperability
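In the simplest scalar case, the "melding" in data assimilation reduces to a variance-weighted average of a model forecast and an observation. A toy sketch (the temperatures and error variances are invented):

```python
def assimilate(forecast, var_f, observation, var_o):
    """Minimum-variance blend of a model forecast and an observation
    of the same scalar quantity, each with its error variance."""
    gain = var_f / (var_f + var_o)             # Kalman-style weight
    analysis = forecast + gain * (observation - forecast)
    var_a = (1 - gain) * var_f                 # analysis variance shrinks
    return analysis, var_a

# Toy example: model forecasts 290 K with variance 4, a satellite
# observation gives 288 K with variance 1. The analysis sits closer
# to the more certain observation.
analysis, var_a = assimilate(290.0, 4.0, 288.0, 1.0)
print(analysis, var_a)   # 288.4 0.8 (approximately)
```

Real assimilation systems apply the same idea to millions of state variables at once, but the principle, weighting each source by its certainty, is the one above.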
15. Models, Agents, and Knowledge
• Agents
– Software that runs without direct human control to accomplish goals provided by a user
– Agents collect, filter and process information found on the Web, sometimes with the help of other agents
• Process Semantics
– Define automated interaction of Agents and Models
– W3C activity, the Semantic Web: use rules to make inferences, choose courses of action and answer questions
– Digital Earth is a Geographic Semantic Web: geospatial linkages are well understood semantics
• Needed technology
– Geographic Ontologies
– Software interoperability stack
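The Semantic Web bullet ("use rules to make inferences") can be illustrated with one well-understood geospatial linkage, the transitivity of spatial containment, applied to a toy triple store; the place names and the store itself are invented for illustration:

```python
# Toy triple store of (subject, predicate, object) facts.
facts = {
    ("Georgetown", "within", "Washington DC"),
    ("Washington DC", "within", "United States"),
    ("United States", "within", "North America"),
}

def infer_within(facts):
    """Forward-chain the rule: within(a,b) and within(b,c) => within(a,c)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(derived):
            for (b2, p2, c) in list(derived):
                if p1 == p2 == "within" and b == b2:
                    triple = (a, "within", c)
                    if triple not in derived:
                        derived.add(triple)
                        changed = True
    return derived

closure = infer_within(facts)
print(("Georgetown", "within", "North America") in closure)
```

An agent holding this rule can answer "is Georgetown in North America?" even though no source ever asserted it directly, which is exactly the kind of inference a Geographic Semantic Web would automate.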
17. Decision Support Systems
"Interactive system to help decision makers select options"
– Large quantities of space-time data
– Models for predicting results of alternative policy choices
– Display the results in easily understood ways to multiple communities
• Technology Progression
– Many models and DSS environments exist as closed solutions
– Need for distributed DSS, scripting, workflow
• DE supports multiple decision making methods
– Content is King: professional data bases, models, and decision processes
– Simple Pipes: point to point communications about the state of the earth will inform the democratic decision process
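A minimal sketch of "helping decision makers select options": score alternative policy choices against weighted criteria whose values would come from predictive models. The options, criteria, scores, and weights here are all invented for illustration:

```python
# Hypothetical model-predicted scores for flood-response options,
# each criterion already scaled to 0..1 (higher is better).
options = {
    "build levee":      {"cost": 0.3, "protection": 0.9, "ecology": 0.4},
    "restore wetlands": {"cost": 0.6, "protection": 0.7, "ecology": 0.9},
    "relocate housing": {"cost": 0.2, "protection": 0.8, "ecology": 0.8},
}
# Stakeholder weights; different communities would set these differently.
weights = {"cost": 0.2, "protection": 0.5, "ecology": 0.3}

def score(option):
    """Weighted sum of an option's criterion scores."""
    return sum(weights[c] * v for c, v in options[option].items())

ranked = sorted(options, key=score, reverse=True)
for name in ranked:
    print(f"{name}: {score(name):.2f}")
```

Displaying how the ranking shifts as communities adjust the weights is one "easily understood way" to present model results to multiple stakeholder groups.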
18. A Geography for Digital Earth (recap)
Topics across the Data, Information, Knowledge, and Decisions layers: The world is wired; How big is DE?; Content is King?; Where in DE are you? (Phenomena Place); Fusion and Assimilation; Models, Agents, and Knowledge; Decision Support; Vladimir and Estragon
19. Waiting for Godot, by Samuel Beckett
Final scene: two men along a country road
Vladimir: Well? Shall we go?
Estragon: Yes, let's go
They do not move
Curtain
20. Waiting for Godot, in Digital Earth
Final scene: two men along a country road
Vladimir: Well? Shall we go?
Estragon: Yes, let's go
They depart
Curtain