This presentation provides an overview of current functionality, techniques, and tips for visualization and query of HDF and netCDF data in ArcGIS, as well as future plans. Hierarchical Data Format (HDF) and netCDF (network Common Data Form) are two widely used formats for storing and manipulating scientific data. The netCDF format also supports temporal data through multidimensional arrays. The basic structure of data in this format, and how to work with it, will be covered in the context of standardized data structures and conventions. The presentation demonstrates tools and techniques for ingesting HDF and netCDF data efficiently in ArcGIS, along with common workflows that use the visualization capabilities of ArcGIS for effective animation and analysis of your data.
NCAR Command Language (NCL) is an interpreted language designed for scientific data analysis and visualization with high-quality graphics, especially for atmospheric science. NCL has supported NetCDF 3/4, GRIB 1/2, HDF-SDS, HDF-EOS, shapefiles, binary, and ASCII files for years. HDF-EOS5 support is now in the released version, and HDF5 support is in beta testing.
The NCL team is now extending NCL to write HDF5 files and to read HDF-EOS5 data via OPeNDAP.
The NCL team will share their experience visualizing and analyzing HDF-EOS5 and HDF5 data.
Tutorial on processing salinity data using SeaDAS, Excel and ODV – Dwi Fitria Ningsih
Salinity is the amount of dissolved salt in water, in this context seawater. Where does the salt come from? From outgassing, the process by which salts seep up from the sea floor. Here we will look at salinity conditions in the waters south of East Java using SeaDAS, Excel and ODV, and then interpret the results.
Python has long been established in software development departments of research and industry, not least because of the proliferation of libraries such as SciPy and Matplotlib. However, when processing large amounts of data, in particular in combination with GUI toolkits or three-dimensional visualizations, it seems that Python, as an interpreted programming language, may be reaching its limits.
This presentation shows how visualization applications with special performance requirements can be designed on the basis of the GR framework, a "lightweight" alternative to Matplotlib. It aims to show in detail how to implement real-time applications or compute-intensive simulations in Python by using current software technologies. The responsiveness of animated visualization applications and their resulting frame rates can be improved, for example, by the use of just-in-time compilation with Numba (Pro).
Using concrete examples, the presentation aims to demonstrate the benefits of the GR and GR3 frameworks in conjunction with C wrappers, JIT compilers, graphical user interfaces (GUIs) and OpenGL. Based on selected applications, the suitability of the GR framework especially in real-time environments will be highlighted and the system’s performance capabilities illustrated using demanding live applications. In addition, the special abilities of the GR and GR3 frameworks are emphasized in terms of interoperability with current web technologies.
PEARC17: Visual exploration and analysis of time series earthquake data – Amit Chourasia
Earthquake hazard estimation requires systematic investigation of past records as well as of the fundamental processes that cause quakes. However, detailed long-term records of earthquakes at all scales (magnitude, space and time) are not available. Hence a synthetic method based on first principles can be employed to generate such records and bridge this critical gap of missing data. RSQSim is such a simulator: it generates seismic event catalogs spanning several thousand years at various scales. This synthetic catalog contains rich detail about the earthquake events and associated properties.
Exploring this data is vitally important for validating the simulator, for identifying features of interest such as quake time histories, and for conducting analyses such as calculating the mean recurrence interval of events on each fault section. This work describes and demonstrates a prototype web-based visual tool that enables domain scientists and students to explore this rich dataset, and discusses refinement and streamlining of data management and analysis so that it is less error prone and more scalable.
HDF4 and HDF-EOS format reading has recently been added to the NetCDF-Java 4.0 library, while HDF5 / NetCDF-4 format reading has been improved. This talk will summarize the status of reading the HDF family of formats through the NetCDF-Java library, with particular attention to the mapping between these formats and the Common Data Model.
DataStax and Esri: Geotemporal IoT Search and Analytics – DataStax Academy
Internet of Things (IoT) data frequently has a location and time component. Getting value out of this "geotemporal" data can be tricky. We'll explore when and how to leverage Cassandra, DSE Search and DSE Analytics to surface meaningful information from your geotemporal data.
Data systems in NASA's Earth Science Division are primarily focused on providing stewardship of the products of remote sensing and are manifested as Digital Active Archive Systems. Each Instrument Team has a related Science Team which defines the algorithms and monitors the processing of the output of the instruments to produce the related data products and to ensure their format and standards compliance. These teams are also influenced by the research and applied sciences components of the programs, but the primary focus is on proving the ongoing validity of the products. Across the distributed system, every product is different. However, this is not conducive to analytics. NASA's Advanced Information Systems Technology (AIST) program is developing an entirely new approach to creating Analytic Centers which focus on the scientific investigation and harmonize the data, computing resources and tools to enable and accelerate scientific discovery. Stay tuned to find out how. A major element, in today's science interests, is the comparison of multi-dimensional datasets; this warrants considerable experimentation in trying to understand how to do so meaningfully and quantitatively; asked another way, "What do you mean by similar?" Uncertainty quantification has evolved considerably in the arenas of data reduction and full physics models; however, the emerging demand for machine learning and other artificial intelligence techniques has failed to keep uncertainty quantification and error propagation in mind, and there is considerable work to be done.
DELPH is a software suite for geophysical data acquisition, processing and interpretation. It features side-scan sonar, seismic and sub-bottom profiler as well as magnetometer analysis. Capable of working with very large multi-sensor datasets, it automates data processing and simplifies operations for geophysicists and hydrographers.
USING E-INFRASTRUCTURES FOR BIODIVERSITY CONSERVATION - Module 2 – Gianpaolo Coro
An e-Infrastructure is a distributed network of service nodes, residing on multiple sites and managed by one or more organizations. e-Infrastructures allow scientists residing at distant places to collaborate. They offer a multiplicity of facilities as-a-service, supporting data sharing and usage at different levels of abstraction, e.g. data transfer, data harmonization, data processing workflows, etc. e-Infrastructures are gaining an important place in the field of biodiversity conservation. Their computational capabilities help scientists to reuse models, obtain results in shorter time and share these results with colleagues. They are also used to access many heterogeneous biodiversity catalogues.
In this course, the D4Science e-Infrastructure will be used to conduct experiments in the field of biodiversity conservation. D4Science hosts models and contributions by several international organizations involved in the biodiversity conservation field. The course will give students an overview of the models, the practices and the methods that large international organizations like FAO and UNESCO apply by means of D4Science. At the same time, the course will introduce students to the basic concepts under e-Infrastructures, Virtual Research Environments, data sharing and experiments reproducibility.
Providing Globus Services to Users of JASMIN for Environmental Data Analysis – Globus
JASMIN is the UK’s high-performance data analysis platform for environmental science, operated by STFC on behalf of the UK Natural Environment Research Council (NERC). In addition to its role in hosting the CEDA Archive (NERC’s long-term repository for climate, atmospheric science & Earth observation data in the UK), JASMIN provides a collaborative platform to a community of around 2,000 scientists in the UK and beyond, providing nearly 400 environmental science projects with working space, compute resources and tools to facilitate their work. High-performance data transfer into and out of JASMIN has always been a key feature, with many scientists bringing model outputs from supercomputers elsewhere in the UK, to analyse against observational or other model data in the CEDA Archive. A growing number of JASMIN users are now realising the benefits of using the Globus service to provide reliable and efficient data movement and other tasks in this and other contexts. Further use cases involve long-distance (intercontinental) transfers to and from JASMIN, and collecting results from a mobile atmospheric radar system, pushing data to JASMIN via a lightweight Globus deployment. We provide details of how Globus fits into our current infrastructure, our experience of the recent migration to GCSv5.4, and of our interest in developing use of the wider ecosystem of Globus services for the benefit of our user community.
The geophysical validation of satellite borne atmospheric chemistry instruments requires a large number of independent observations by a variety of in-situ, remote-sensing and satellite instruments. In the case of the Aura and Envisat missions more than 300 instruments have been formally included in the validation efforts.
In order to extract maximum information from the "ground-truth" measurements, these independent validation datasets must be readily accessible to all investigators in a simple format with standard metadata, variable naming and data structures.
Beginning in 2000, NASA and ESA have been collaborating under the auspices of the CEOS Working Group Calibration/Validation (WGCV) Atmospheric Chemistry Sub Group (ACSG) on the file format definition, metadata formulations, and variable definitions for validation datasets.
This standard, known as the AVDC/Envisat HDF format, is the basis of the Aura Validation Data Center (AVDC) and the Envisat Cal/Val Data Center implementations. Recently, the AVDC/Envisat HDF formulation has been accepted as the new reporting standard of the Network for the Detection of Atmospheric Composition Change (NDACC), an international network of high-quality remote sensing stations, and for the next ESA GMES Earth observing missions. The concept behind the AVDC/Envisat HDF reporting standard, its implementation and future applications will be presented in detail.
NASA Earth Exchange (NEX) is a virtual collaborative that brings scientists and researchers together in a knowledge-based social network and provides the necessary tools, computing power, and data to accelerate research and innovation and to provide transparency.
Working with HDF and netCDF Data in ArcGIS: Tools and Case Studies
1. Working with HDF and netCDF Data in ArcGIS
Tools and Case Studies
Nawajish Noman
Jeff Donze
HDF and HDF-EOS Workshop XIV, September 28-30, 2010, Champaign, IL
2. Outline
• HDF and netCDF in ArcGIS
• Time in ArcGIS
• Performing Analysis
• Use Cases and Applications
• Script Tools
• Future Directions
4. Scientific Data and ESRI
• Direct support – NetCDF and HDF
• THREDDS – a data server technology for multidimensional array data, integrated use by our customers
• Examples using ESRI technology
• National Climate Data Center
• National Weather Service
• National Center for Atmospheric Research
• Air Force Weather
• Australian Navy
• Australian Bureau of Meteorology
• UK Met Office
7. HDF4 and HDF5 Support in ArcGIS
http://help.arcgis.com/en/arcgisdesktop/10.0/help/
8. What is NetCDF?
• NetCDF (network Common Data Form)
• A platform-independent format for representing multidimensional array-oriented scientific data.
• Self-describing: a netCDF file includes information about the data it contains.
• Direct access: a small subset of a large dataset may be accessed efficiently, without first reading through all the preceding data.
• Sharable: one writer and multiple readers may simultaneously access the same netCDF file.
NetCDF is relatively new to the GIS community but has been widely used by scientific communities for many years.
9. Storing Data in a netCDF File
Time = 1 to 3
Y = 1 to 4
X = 1 to 4
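The layout on this slide (a Time × Y × X cube) can be sketched with a small array. This is a minimal sketch: the variable name and values are invented here, and a real netCDF variable would also carry units and coordinate variables alongside the array.

```python
import numpy as np

# A 3 x 4 x 4 cube with dimensions (time, y, x), as on the slide.
rainfall = np.arange(3 * 4 * 4, dtype=float).reshape(3, 4, 4)

# "Direct access": read one time step without touching the others.
step = rainfall[1]            # the Time = 2 slice, shape (4, 4)

# A single cell is addressed by a (time, y, x) index.
cell = rainfall[2, 0, 3]
```

Slicing along the first axis is what "accessing a small subset of a large dataset efficiently" looks like in practice: only the requested time step needs to be read.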
10. NetCDF Support in ArcGIS
• ArcGIS reads/writes netCDF since version 9.2
• An array-based data structure for storing multidimensional data
• N-dimensional coordinate systems
• X, Y, Z, time, and other dimensions
• Variables – support for multiple variables
• Temperature, humidity, pressure, salinity, etc.
• Geometry – implicit or explicit
• Regular grid (implicit)
• Irregular grid
• Points
12. Ingesting netCDF data in ArcGIS
• NetCDF data is accessed as
• Raster
• Feature
• Table
• Direct read
• Exports GIS data to netCDF
13. CF Convention
Climate and Forecast (CF) Convention
http://cf-pcmdi.llnl.gov/
• Initially developed for
• Climate and forecast data
• Atmosphere, surface and ocean model-generated data
• Also for observational datasets
• The CF conventions generalize and extend the COARDS (Cooperative Ocean/Atmosphere Research Data Service) convention.
• CF is now the most widely used convention for geospatial netCDF data. It has the best coordinate system handling.
14. NetCDF and Coordinate Systems
• Geographic Coordinate Systems (GCS)
• X dimension units: degrees_east
• Y dimension units: degrees_north
• Projected Coordinate Systems (PCS)
• X dimension standard_name: projection_x_coordinate
• Y dimension standard_name: projection_y_coordinate
• Variable has a grid_mapping attribute.
• CF 1.4 conventions currently support twelve predefined coordinate systems (Appendix F: Grid Mappings)
• Undefined
• If not GCS or PCS
• ArcGIS writes (and recognizes) PE String as a variable attribute.
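The decision rule above can be sketched as a small helper. The function name and the attribute dictionaries are hypothetical stand-ins for what a netCDF reader would return for the X and Y dimension variables; only the attribute values themselves come from the CF rules on the slide.

```python
def classify_crs(x_attrs, y_attrs):
    """Classify a coordinate system from X/Y dimension attributes,
    following the CF rules above."""
    # GCS: X/Y units are degrees_east / degrees_north
    if (x_attrs.get("units") == "degrees_east"
            and y_attrs.get("units") == "degrees_north"):
        return "GCS"
    # PCS: X/Y carry projection_* standard names
    if (x_attrs.get("standard_name") == "projection_x_coordinate"
            and y_attrs.get("standard_name") == "projection_y_coordinate"):
        return "PCS"
    # Neither convention matched
    return "Undefined"

print(classify_crs({"units": "degrees_east"}, {"units": "degrees_north"}))
```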
15. NetCDF Tools
Toolbox: Multidimension Tools
• Make NetCDF Raster Layer
• Make NetCDF Feature Layer
• Make NetCDF Table View
• Raster to NetCDF
• Feature to NetCDF
• Table to NetCDF
• Select by Dimension
18. Using NetCDF Data
Behaves the same as any layer or table
• Display
• The same display tools for raster and feature layers will work on netCDF raster and netCDF feature layers.
• Graphing
• Driven by the table, just like any other chart.
• Animation
• Multidimensional data can be animated through a dimension (e.g. time, pressure, elevation)
• Analysis Tools
• A netCDF layer or table will work just like any other raster layer, feature layer or table (e.g. create buffers around netCDF points, reproject rasters, query tables, etc.)
22. Temporal GIS Patterns
• Dynamic – something that moves: Planes, Vehicles, Animals, Satellites, Storms
• Discrete – something that "just happens": Crimes, Lightning, Accidents
• Stationary – stands still but records changes: Weather Stations, Traffic Sensors
• Change – change or growth: Population, Distribution, Fire Perimeter
23. GIS Integration of Time
New Ways to Manage, Visualize & Analyze Geography
• Multidimensional Data (netCDF) – files, DBMS
• Visualize Change
• Real-Time Sensor Networks – mobile, stationary
• Simulation Modeling
24. Time is now built-in to ArcGIS
• Simple Temporal Mapping
• Unified experience for Time
• Configure time properties on the layer
• Use Time Slider to visualize temporal data
• Share temporal visualization
• Time-enabled Map Services
• Export videos or images
• Generate temporal map books using ArcPy scripting
• Layer and map packages
25. Visualizing temporal data using graphs
• Create a graph using a layer or table
• Create an animation in the usual way, attaching the layer or table to a time layer track
• When the animation is played, the graph will animate
28. Spatial and Temporal Analysis
• Several hundred analytical tools available for rasters, features, and tables
• Temporal Modeling
• Looping and iteration in ModelBuilder and Python
29. Generate Rainfall Statistics
• Calculates specified statistics for all time steps
• Outputs a raster catalog
• Optionally outputs a netCDF file
30. Generate Rainfall Statistics Table
• Calculates statistics for all time steps
• Outputs a table
• Optionally creates a graph
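The idea behind these two statistics tools (collapse the time dimension per cell, or summarize each time step into a table row) can be sketched in a few lines. This is an illustrative sketch, not the tools' implementation; the array and field names are invented:

```python
import numpy as np

# Synthetic rainfall cube with dimensions (time, y, x).
rain = np.array([[[2.0, 0.0], [1.0, 3.0]],
                 [[4.0, 2.0], [1.0, 1.0]],
                 [[0.0, 4.0], [4.0, 2.0]]])

# Raster-style output: one statistic grid computed over all time steps.
mean_grid = rain.mean(axis=0)                 # shape (2, 2)

# Table-style output: one row of statistics per time step.
table = [{"time": t, "mean": float(s.mean()), "max": float(s.max())}
         for t, s in enumerate(rain)]
```

The `axis=0` reduction is the "all time steps" case; iterating over the first axis is the "per time step" case that feeds a table or graph.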
32. Use Cases and Applications
• Hydrography and METOC Branch, Royal Australian Navy
• Applied Science Associates, Inc.
• The Nature Conservancy
• The University of Washington
• The University of Southern Mississippi
33. Observations and Model Forecasts
• Near real-time observations
• Weather satellite imagery
• Sea Surface Temperature
• Significant Wave Height (altimeter)
• Ocean Winds
• Moisture and Precipitation
• Numerical model forecasts
• Fixed-domain Global, Regional, Tropical and Local atmospheric models (ACCESS)
• Fixed-domain Global, Regional, and Local wave models (WaveWatch 3)
• Global Ocean Model (BLUElink)
• Relocatable high-resolution atmospheric and ocean models
Source: Hydrography and METOC Branch, Royal Australian Navy.
34. Climatology – CCMP Winds
Source: Hydrography and METOC Branch, Royal Australian Navy.
35. Climatology – CCMP Winds
Source: Hydrography and METOC Branch, Royal Australian Navy.
36. Ocean Model Forecast – Custom Display
Source: Hydrography and METOC Branch, Royal Australian Navy.
37. Web Services - GHRSST
Source: Hydrography and METOC Branch, Royal Australian Navy.
38. Environmental Data Connector (EDC)
[Diagram: THREDDS (XML, DAP) → NetCDF (Java app, Unidata) → ArcGIS extension (features, rasters)]
Source: Applied Science Associates, Inc.
39. EDC Screen and Time Slider Toolbar
Synchronize and animate time-varying data.
http://www.pfeg.noaa.gov/products/EDC/
http://www.asascience.com/TimeSlider/index.htm
Source: Applied Science Associates, Inc.
44. Community Developed Tools
• Geoprocessing Resource Center
http://resources.arcgis.com/geoprocessing/
• Marine Geospatial Ecology Tools
• Developed at Duke Univ.
• Over 180 tools for import, management, and analysis of marine data
• Australian Navy tools (not publicly available)
45. Script Tools of interest
• Python is used to build custom tools for specific tasks or datasets
46. NEXRAD Geoprocessing Tools
• Currently 6 geoprocessing script tools
• Designed to work with NEXRAD netCDF files
• Can be easily modified for other datasets
• Customized tools for various workflows
• Simplify repetitive work
• Automate GIS processes
51. Things to Consider…
• Embrace the Common Data Model (netCDF, HDF, etc.)
• Use data and metadata standards (OGC, CF, etc.)
• Make your data "spatial" (by specifying a geographic or projected coordinate system)
• Clearly define workflow and requirements
• Create sample tools where possible
52. Tools Under Consideration
Analysis
• Temporal Statistics
• Get Variable Statistics
Data Management
• Clip
• Extract By Variable
• Extract By Dimension
• Append By Dimension
53. Future directions
• Multidimension data management
• Temporal analysis tools
• Additional support for HDF5 using netCDF 4.x library
• What do you need?
55. Sharing Temporal Maps & Data with ArcGIS 10
• Publish time-aware maps
• Export videos or images, layer and map packages
• Visualize data
• Access via REST API
• Web APIs: FLEX, JavaScript, Silverlight
• Time Slider web control
56. How to store temporal data?
• DATE is a special field type specific to time
• GeoDatabase provides DATE
• If at all possible – use DATE type
• DATE field should be indexed for faster query performance
• Numeric and String fields
• YYYY
• YYYYMM
• YYYYMMDD
• YYYYMMDDhhmmss
• YYYY/MM/DD
• YYYY/MM/DD hh:mm:ss
• YYYY-MM-DD
• YYYY-MM-DD hh:mm:ss
• Time dimension in a netCDF variable
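When time is stored in numeric or string fields, the formats listed above must be parsed explicitly before use. A stdlib sketch follows; the format list mirrors the slide, while the function name is our own invention:

```python
from datetime import datetime

# strptime patterns for the field formats on the slide,
# most specific first so e.g. YYYYMMDD is not read as YYYY.
_FORMATS = ["%Y%m%d%H%M%S", "%Y%m%d", "%Y%m", "%Y",
            "%Y/%m/%d %H:%M:%S", "%Y/%m/%d",
            "%Y-%m-%d %H:%M:%S", "%Y-%m-%d"]

def parse_field_time(value):
    """Try each supported format; return a datetime or None."""
    text = str(value)
    for fmt in _FORMATS:
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            pass
    return None
```

Note that bare-year or year-month values parse to the first day of the period, which is why a real DATE field (with an index) is preferable when the data allows it.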
57. Animation examples
Sea ice concentration
Data source: Cavalieri, D., C. Parkinson, P. Gloerson, and H.J. Zwally. 1996, updated 2005. Sea ice concentrations from Nimbus-7 SMMR and DMSP SSM/I passive microwave data, June to September 2001. Boulder, CO, USA: National Snow and Ice Data Center.
Solar Insolation
Data provided courtesy of Declan Butler – http://declanbutler.info/blog/
Stream flow analysis
Avian Influenza