This document discusses interoperability between HDF5 files and the netCDF-4 format. It begins with background on netCDF-3, netCDF-4, and the Climate and Forecast (CF) metadata conventions. It then demonstrates different use cases for accessing HDF5 data via netCDF-4, including when the HDF5 file follows the netCDF data model and CF conventions compared to when it does not. The document shares experiences working with HDF-EOS5 and JPSS data products in HDF5 format through this netCDF-4 interface. In particular, it finds that following the netCDF data model and CF conventions improves visualization of HDF5 data in tools like IDV that expect netCDF files.
The tool takes HDF-EOS 5 data as input, and generates COARDS-compatible output - if the input file has enough metadata to be COARDS-compliant, the output file will be COARDS-compliant. The tool is written in portable C, and ought to run on any platform where the HDF-EOS and netCDF libraries are available.
This year, we have made two major enhancements to the converter:
It now automatically detects whether its input is HDF-EOS2 or HDF-EOS5 format, and handles either one. The previous tool worked with HDF-EOS5 only.
Its netCDF output attempts to conform to the new CF conventions (a superset of the COARDS conventions). This is primarily an improvement in its translation of Swath datasets, which CF handles much better than COARDS.
This is an introductory slide on accessing NASA HDF/HDF-EOS data for beginners. NASA distributes much of its Earth Science data in the HDF/HDF-EOS file formats, and new users often struggle to understand these formats and use NASA HDF/HDF-EOS data properly. This brief presentation will help new users understand the basic concepts of HDF/HDF-EOS and learn about the tools available for easily accessing NASA data.
This slide will demonstrate how to use OPeNDAP Java clients such as IDV and Panoply, via the HDF OPeNDAP data handlers, to access various NASA HDF products such as AIRS, OMI, MLS, MODIS, TRMM, CERES, and SeaWiFS. Various features of these tools that help users easily access HDF data will also be explored.
The HDF Group provides NCL/IDL/MATLAB example code and plots for many NASA HDF-EOS2 and HDF4 products. These examples can be found at http://hdfeos.org/zoo. This slide addresses some common issues in using these tools to visualize NASA HDF-EOS2 and HDF4 products.
NCAR Command Language (NCL) is an interpreted language designed for scientific data analysis and visualization with high-quality graphics, especially for atmospheric science. NCL has supported NetCDF 3/4, GRIB 1/2, HDF-SDS, HDF-EOS, shapefiles, binary, and ASCII files for years. HDF-EOS5 support is now in the released version, and HDF5 support is in the beta-test stage.
The NCL team is now extending NCL to write HDF5 files and to read HDF-EOS5 data via OPeNDAP.
The NCL team will share its experience visualizing and analyzing HDF-EOS5 and HDF5 data.
This tutorial is designed for new HDF5 users. We will go over a brief history of HDF and HDF5 software, and will cover basic HDF5 Data Model objects and their properties; we will give an overview of the HDF5 Libraries and APIs, and discuss the HDF5 programming model. Simple C and Fortran examples, and Java tool HDFView will be used to illustrate HDF5 concepts.
In this presentation, we will give an update on the HDF OPeNDAP project. We will describe the new features of the HDF5 OPeNDAP data handler. We will also introduce the enhanced HDF4 OPeNDAP data handler and demonstrate how it can help users view and analyze remote HDF-EOS2 data. A demo that uses OPeNDAP client tools to handle AIRS and MODIS Grid/Swath data with the enhanced handler will be presented.
This tutorial is designed for new HDF5 users. We will cover HDF5 abstractions such as datasets, groups, attributes, and datatypes. Simple C examples will cover the programming model and basic features of the API, and will give new users the knowledge they need to navigate through the rich collection of HDF5 interfaces. Participants will be guided through an interactive demonstration of the fundamentals of HDF5.
This tutorial is for new HDF5 users.
Numerous scientific teams use the HDF5 format to store very large datasets. Efficient use of this data in a distributed environment depends on client applications being able to read any subset of the data without transferring the entire file to the local machine. The goal of the HDF5-iRODS Project was to develop an HDF5-iRODS module for the iRODS datagrid server that supported this capability, and to apply the technology to an NCSA/SDSC Strategic Applications Program (SAP) project, FLASH.
A joint team from The HDF Group (representing NCSA) and the SDSC SRB group collaborated to accomplish the project goal. The team implemented five HDF5 microservices functions on the iRODS server, and developed an iRODS FLASH slice client application. The client implementation also includes a JNI interface that allows HDFView, a standard tool for browsing HDF5 files, to access HDF5 files stored remotely in iRODS. Finally, three new collection client/server calls were added to the iRODS APIs, making it easier for users to query the content of an iRODS collection.
In this talk, we will give an update on the HDF5 OPeNDAP project. We will describe the new features of the OPeNDAP HDF5 data handler. We will also introduce a new HDF5-friendly OPeNDAP client library and demonstrate how it can help users view and analyze remote HDF-EOS5 data served by the OPeNDAP HDF5 handler. A demo will be presented with a customized OPeNDAP visualization client (GrADS) that uses the library.
The HDF Group provides support for NPP/NPOESS in a number of ways, including development and maintenance of software capabilities in HDF5 libraries and tools that help NPP/NPOESS data producers and users, software testing on platforms of importance to NPP/NPOESS, high quality rapid response user support for NPP/NPOESS, and performance of special projects. The purposes of this presentation are to apprise attendees of the areas of emphasis for FY 2010, and to solicit ideas and opinions that will help the project understand how best to use its resources in order to best serve the needs of NPP/NPOESS.
This tutorial targets NetCDF application developers and users who are interested in the NetCDF-4 library features based on the underlying HDF5 library and file format. We will discuss how to use new NetCDF-4/HDF5 features and APIs to achieve optimal I/O performance.
Accessibility and usability of NPP/NPOESS data in HDF5 can be enhanced by providing tools that simplify and standardize how data is accessed and presented. In this project, The HDF Group is creating such tools in the form of software to read and write certain key data types and data aggregates used in NPP/NPOESS data products, and extending HDFView to extract, present and export these data effectively. In particular, the work will focus on NPP/NPOESS use of HDF5 region references and quality flags. The HDF Group will also provide high quality user support for the project.
This 2009 tutorial slide will cover basic HDF5 Data Model objects and their properties. It will include an overview of the HDF5 Libraries and APIs, and describe the HDF5 programming model. Simple programming examples and the HDFView data browser will be used to illustrate HDF5 concepts and start developing your own HDF5 based applications.
This tutorial is for new HDF5 users.
This tutorial is designed for users with some HDF5 experience. It will cover advanced features of the HDF5 library that can be used to achieve better I/O performance and more efficient storage. The following HDF5 features will be discussed: partial I/O; compression and other filters, including the new n-bit and scale+offset filters; and data storage options. Significant time will be devoted to complex HDF5 datatypes such as strings, variable-length datatypes, array datatypes, and compound datatypes.
This tutorial will introduce the three levels of the HDF-Java products: the HDF-Java wrapper (or Java Native Interfaces to the standard HDF libraries), the HDF-Java object package, and the HDFView. The Java wrapper provides standard Java APIs that allow applications to call the C HDF libraries from Java. The HDF-Java object package implements HDF data objects, e.g. Groups and Datasets, in an object-oriented form and makes it easy for applications to use the libraries. The HDFView is a visual tool for browsing and editing HDF4 and HDF5 files.
We will introduce the basic features of and recent updates to the HDF5 tools. The intention of the talk is to keep users informed about HDF5 tool development, such as new features and new tools, and to help new users get familiar with the HDF5 tools. Three new tools, h5check, h5copy, and h5stat, will be introduced.
It is important that HDF5 files created and modified by the HDF5 library are fully compliant with the defined HDF5 File Format Specification (File Format) to ensure the data model integrity and long term compatibility between evolving versions of the HDF5 library. The h5check tool verifies that the content of an HDF5 file is encoded according to the File Format. The verification role also makes h5check act as a watchdog for the implementation correctness of the HDF5 library. This presentation explains the features of the tool and time permitting, shows a demonstration of the tool.
h5copy is a command-line tool that copies an HDF5 object (group, dataset, or named datatype) from one location to another, within a file or across files. h5copy uses the HDF5 API function H5Ocopy, which gives users several options for how a data object is copied and performs the copy efficiently.
h5stat is a new utility for viewing various statistics about an HDF5 file. The tool is still under development. The goal of the talk is to introduce the tool and gather feedback from HDF5 users.
HDF5 (with Nexus) is becoming the de facto standard in most X-ray facilities. However, it is not always easy to navigate such files to get quick feedback on the data, due to the peculiar structure of Nexus files. HDF5 file viewers are one way to solve this issue. They allow for the browsing and inspecting of the hierarchical structure of HDF5 files, as well as visualising the datasets they contain as basic plots (1D, 2D, 3D).
This presentation will focus on h5web, the open-source web-based viewer being developed at the European Synchrotron Radiation Facility. The intent is to provide synchrotron users with an easy-to-use application and to make open-source components available for other similar web applications. `h5web` is built with React, a front-end web development library. It supports the exploration of HDF5 files, requested from a separate back-end (e.g. HSDS) for modularity, and the visualisation of datasets using performant WebGL-based visualisations.
Interoperability with netCDF-4 - Experience with NPP and HDF-EOS5 products
1. The HDF Group
Interoperability with netCDF-4
Kent Yang, Larry Knox, Elena Pourmal
The HDF Group
The 15th HDF and HDF-EOS Workshop
April 17-19, 2012
Apr. 17-19, 2012
HDF/HDF-EOS Workshop XV
1
www.hdfgroup.org
2. Outline
• Background
• netCDF-4
• CF
• Use cases
• Experience with HDF-EOS5 products
• Experience with JPSS products
• Current Status and future directions
Apr. 17-19, 2012
HDF/HDF-EOS Workshop XV
2
www.hdfgroup.org
3. Clarification – netCDF format
• netCDF-3 format
• Simple self-describing data format based on the netCDF classic data model
• netCDF-4 format
• Uses HDF5 as a storage layer
• Exploits compression, chunking, parallel I/O, group hierarchy, user-defined datatypes, etc.
• Supports both the netCDF enhanced and netCDF classic data models
• Interoperability in this talk means interoperability with the netCDF-4 format
4. Clarification – netCDF packages
• netCDF software packages
• netCDF-C
• Supports both netCDF-3 and netCDF-4 formats
• C++/Fortran wrappers
• netCDF-Java
• Supports both netCDF-3 and netCDF-4 formats
• The implementation of the Common Data Model
• "netCDF version 4" generally means version 4 of the netCDF-C library (which supports more than just the netCDF-4 format)
5. Why netCDF-4
• Big user community
• User-friendly data models
• Tools
• Home-grown and third-party visualization and analysis tools: ncdump, ncgen, IDV, Panoply, Ferret, etc.
8. CF conventions
• Metadata conventions for earth science data
• Promote sharing of files created with the netCDF APIs, but are not specific to netCDF
• The CF conventions are now increasingly gaining acceptance
• URL: http://cf-pcmdi.llnl.gov/
9. • In this tutorial, we only review the key CF attributes that affect access to NASA and other Earth Science HDF and HDF-EOS data via popular visualization tools.
10. Key CF data description attributes
units: A string that represents the quantity of measurement. A variable with no units attribute is assumed to be dimensionless.
long_name: A descriptive name that indicates a variable's content.
standard_name: A standard name that references a description of a variable's content in the standard name table of the CF conventions.
_FillValue: A value used to represent missing or undefined data.
valid_min: Smallest valid value of a variable.
valid_max: Largest valid value of a variable.
valid_range: Smallest and largest valid values of a variable.
Use these attributes where possible; in particular, use _FillValue, valid_min, and valid_max if you have missing values.
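The attributes listed on slide 10 can be illustrated with a small CDL sketch, the kind of listing `ncdump -h` produces; the variable name, dimensions, and attribute values below are hypothetical:

```
variables:
    float Temperature(lat, lon) ;
        Temperature:units = "K" ;
        Temperature:long_name = "surface air temperature" ;
        Temperature:standard_name = "air_temperature" ;
        Temperature:_FillValue = -9999.f ;
        Temperature:valid_min = 150.f ;
        Temperature:valid_max = 350.f ;
```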
11. Reduction of dataset size
scale_factor: If present for a variable, the data are to be multiplied by this factor after the data are read by an application.
add_offset: If present for a variable, this number is to be added to the data after it is read by an application. If both scale_factor and add_offset attributes are present, the data are first scaled before the offset is added.
The equation that describes the usage of scale_factor and add_offset is:
final_data_value = scale_factor * raw_data_value + add_offset
12. "units" for coordinate variables
• Horizontal
• Latitude – "degrees_north"
• Longitude – "degrees_east"
• Vertical
• Pressure – "hPa"
• Height (depth) – "meter" (m) or "kilometer" (km)
• Time
• seconds (minutes, etc.) since a reference time point
• An example: "seconds since 1992-10-8 15:15:42.5 -6:00"
Use the units attribute with the CF values if possible; without following the CF conventions for this attribute, some tools cannot properly visualize the data.
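In CDL, coordinate variables carrying the units above might look like this (dimension names and sizes are hypothetical):

```
dimensions:
    lat = 180 ; lon = 360 ; lev = 17 ; time = 24 ;
variables:
    float latitude(lat) ;
        latitude:units = "degrees_north" ;
    float longitude(lon) ;
        longitude:units = "degrees_east" ;
    float pressure(lev) ;
        pressure:units = "hPa" ;
    double time(time) ;
        time:units = "seconds since 1992-10-8 15:15:42.5 -6:00" ;
```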
13. "coordinates" attribute
• Lists the associated coordinate variable names of a variable
• An example
• Variable: Temperature
• Associated coordinate variables: "latitude", "longitude", "pressure"
• coordinates = "latitude longitude pressure"
Include this attribute if possible. For some data products, this is the key attribute for specifying the coordinates of a variable.
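A hypothetical CDL sketch of the Temperature example above, with 2-D latitude/longitude and a 1-D pressure coordinate attached via the coordinates attribute (dimension names are assumptions):

```
variables:
    float Temperature(lev, y, x) ;
        Temperature:coordinates = "latitude longitude pressure" ;
    float latitude(y, x) ;
        latitude:units = "degrees_north" ;
    float longitude(y, x) ;
        longitude:units = "degrees_east" ;
    float pressure(lev) ;
        pressure:units = "hPa" ;
```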
14. The HDF Group
Interoperability of HDF5 with netCDF-4
General Information
15. Review Concepts
• netCDF classic model
• Shared dimension
• netCDF enhanced model
• Group hierarchy
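The two concepts can be contrasted with a hypothetical CDL sketch: the classic model is a flat file with shared dimensions, while the enhanced model adds group hierarchy (shown as ncdump would print a netCDF-4 file; names and sizes are illustrative):

```
netcdf enhanced_example {
dimensions:
    lat = 180 ; lon = 360 ;              // shared dimensions (classic concept)
variables:
    float Temperature(lat, lon) ;

group: Forecast {                        // group hierarchy (enhanced concept)
  variables:
    float Temperature2m(lat, lon) ;      // may use dimensions of ancestor groups
  } // group Forecast
}
```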
16. Use cases to access HDF5 via netCDF-4
1. General HDF5
• Follow neither netCDF data models nor CF conventions
2. netCDF-4 HDF5
• Follow netCDF enhanced data model
3. netCDF-4 CF HDF5
• Follow netCDF enhanced model and CF conventions
4. netCDF (classic) HDF5
• Follow netCDF classic model
5. netCDF (classic) CF HDF5
• Follow netCDF classic model and CF conventions
17. How to demonstrate
• Simple HDF5 files
• netCDF tools
• netCDF C
• ncdump
• netCDF Java
• IDV
18. • There are some limitations when using netCDF-4 to access HDF5 files
19. General HDF5
• Add phony dimension names to variables
• Generally cannot be opened by IDV
20. netCDF-4 HDF5
• HDF5 that follows netCDF enhanced model
• ncdump can pick up the dimension information
• cannot be opened by IDV
21. netCDF-4 CF HDF5
• HDF5 that follows the netCDF enhanced model and CF conventions
• cannot be opened by IDV
22. • We will use several demos to show the differences between the last two cases
• netCDF (classic) HDF5
• netCDF (classic) CF HDF5
• ncdump can dump all demo files
24. Use case 4 and 5 IDV demo 1
CF category: units for latitude and longitude
netCDF classic (Case 4): units is "degrees"
netCDF classic CF (Case 5): latitude units is "degrees_north"; longitude units is "degrees_east"
25. Use case 4 and 5 IDV demo 1
CF category: units for latitude and longitude
netCDF classic (Case 4): IDV cannot open the file
netCDF classic CF (Case 5): IDV correctly displays the data
26. Use case 4 and 5 IDV demo 2
CF category: _FillValue
netCDF classic (Case 4): no _FillValue attribute
netCDF classic CF (Case 5): has a _FillValue attribute
27. Use case 4 and 5 IDV demo 2
CF category: _FillValue
netCDF classic (Case 4): IDV treats the _FillValue as real data
netCDF classic CF (Case 5): IDV correctly filters out the _FillValue
28. Use case 4 and 5 IDV demo 3
CF category: scale_factor and add_offset (scale_factor = 10.0, add_offset = 1000.0)
netCDF classic (Case 4): attribute names for scale_factor and add_offset don't follow CF conventions
Attributes: wrong_scale_name = 10.0, wrong_offset_name = 1000.0
netCDF classic CF (Case 5): has the correct scale_factor and add_offset attribute names
Attributes: scale_factor = 10.0, add_offset = 1000.0
Raw data: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16
After scale and offset: 1010, 1020, 1030, ...
29. Use case 4 and 5 IDV demo 3
CF category: scale_factor and add_offset
netCDF classic (Case 4): IDV doesn't apply the scale_factor and add_offset
netCDF classic CF (Case 5): IDV correctly applies the scale_factor and add_offset
30. Use case 4 and 5 IDV demo 4
CF category: coordinates
netCDF classic (Case 4): no "coordinates" attribute
netCDF classic CF (Case 5): has a "coordinates" attribute
31. Use case 4 and 5 IDV demo 4
CF category: coordinates
netCDF classic (Case 4): IDV may not pick up the correct coordinates
netCDF classic CF (Case 5): IDV correctly picks up the right coordinates
32. netCDF HDF5 vs netCDF CF HDF5
• CF attributes are the key to making IDV display the data correctly
33. Summary of use cases
• There are some limitations for netCDF-4 to access HDF5 files

Use case | ncdump | IDV
General HDF5 | Can view (with phony dimensions) | Generally cannot view
netCDF-4 HDF5 | Can view | Generally cannot view
netCDF-4 CF HDF5 | Can view | Generally cannot view
netCDF Classic HDF5 | Can view | Can view some files, but the visualization may not be right
netCDF Classic CF HDF5 | Can view | Can view
34. The HDF Group
Interoperability of HDF5 with netCDF-4
Experience with HDF-EOS5
35. netCDF4 to access HDF-EOS5 files
• Augmentation
• One file can be accessed by both the HDF-EOS5 and netCDF-4 libraries
• When accessed by netCDF-4, the netCDF data model should be followed
[Diagram: the netCDF-4 and HDF-EOS5 APIs both sit on top of HDF5; augmentation adds HDF5 objects to the HDF-EOS5 file so that either API can read it]
36. An HDF-EOS5 file structure
HDFEOS
  GRIDS
    CloudFractionAndPressure
      Data Fields
        CloudFraction
        CloudPressure
Because of the group hierarchy, we can only augment the HDF-EOS5 file by following the netCDF enhanced model
37. An example: Augment an HDF-EOS5 Grid
• HDF-EOS5 saves the coordinate information for XDim and YDim in an equation
• The augmentation tool retrieves the values of XDim and YDim
• It creates coordinate variables XDim and YDim with the raw values
• Then it associates the coordinate variables with the data variables
• Then netCDF-4 can follow the netCDF enhanced model to access the HDF-EOS5 data
Resulting file structure:
HDFEOS
  GRIDS
    CloudFractionAndPressure
      Data Fields
        CloudFraction[XDim][YDim]
        CloudPressure[XDim][YDim]
        XDim
        YDim
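After augmentation, ncdump might show the grid roughly like this. The dimension sizes are hypothetical, and the HDF-EOS5 "Data Fields" group is written as Data_Fields here for CDL readability (ncdump escapes special characters in real names):

```
netcdf augmented_grid {
group: HDFEOS {
  group: GRIDS {
    group: CloudFractionAndPressure {
      group: Data_Fields {
        dimensions:
            XDim = 360 ; YDim = 180 ;
        variables:
            float XDim(XDim) ;   // coordinate variables created by the tool;
            float YDim(YDim) ;   // names match the dimensions, so netCDF-4
                                 // clients associate them automatically
            float CloudFraction(XDim, YDim) ;
            float CloudPressure(XDim, YDim) ;
        } // group Data_Fields
      } // group CloudFractionAndPressure
    } // group GRIDS
  } // group HDFEOS
}
```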
38. • How does the augmented HDF-EOS5 file follow CF conventions?
• The Aura teams (HIRDLS etc.) add key CF attributes when creating the original HDF-EOS5 file.
39. Use cases to access HDF5 via netCDF-4
1. General HDF5
  • Follows neither the netCDF data model nor CF conventions
2. netCDF-4 HDF5
  • Follows the netCDF enhanced data model
3. netCDF-4 CF HDF5
  • Follows the netCDF enhanced model and CF conventions
4. netCDF (classic) HDF5
  • Follows the netCDF classic model
5. netCDF (classic) CF HDF5
  • Follows the netCDF classic model and CF conventions
40. The HDF Group
Interoperability of HDF5
with netCDF-4
Experience with JPSS
41. Applications for JPSS files
• Many potentially useful applications are netCDF-based
• The structure of JPSS files allows for effective modification: data are separated from objects that are unknown to netCDF-4
42. JPSS obstacles to using netCDF-4 tools
1. Limitations of netCDF-4 – HDF5 may have
objects unknown to netCDF-4 (Use case 1)
• References
• Multi-dimensional attributes
• Chunked datasets (variables) with unlimited
maximum size
43. JPSS obstacles to using netCDF-4 tools
1. Limitations of netCDF-4 - objects unknown to
netCDF-4
2. Files are not netCDF Classic Model
conformant (Use case 4)
a) Group structure
b) Important information, including dimensions, is in a separate XML file
c) Geolocation data in separate file or group
44. JPSS obstacles to using netCDF-4 tools
1. Limitations of netCDF-4 - objects unknown to
netCDF-4
2. Not netCDF Classic Model conformant
3. Missing key CF attributes (Use case 5)
• Latitude / Longitude
• Measurement units
• valid_min / valid_max (determined by data type and fill values)
• Scale factors
45. Modification of JPSS files to overcome obstacles
47. JPSS file structure
1. Hide problem objects

/
  /All_Data (group)
    VIIRS-M3-SDR_All (group)
      raw data (datasets)
  /Data_Products (group)
    VIIRS-M3-SDR (group)
      references to raw data (datasets)
48. JPSS file structure
2. Hide structure that does not conform to the Classic Model

/
  /All_Data (group)
    VIIRS-M3-SDR_All (group)
      raw data (datasets)
  /Data_Products (group)
    VIIRS-M3-SDR (group)
      references to raw data (datasets)
49. JPSS file structure
3. Import external information from product profiles

/
  /All_Data (group)
    VIIRS-M3-SDR_All (group)
      raw data datasets
      geolocation datasets
      dimension names, lengths, and other attributes from product profiles
  /Data_Products (group)
    VIIRS-M3-SDR (group)
      reference datasets and attributes
50. Tool for obstacles 1 – 3: H5augjpss
1. Hides problem objects
2. Makes structure conform to Classic Model
3. Imports external information
Files are modified! Copy to preserve the original.
• Final obstacle: CF compliance
  Manual additions with HDFView or h5edit
51. Key CF data description attributes

  Variable  | Required Attribute | Type   | Value
  ----------|--------------------|--------|---------------------
  Latitude  | units              | string | degrees_north
  Longitude | units              | string | degrees_east
  Radiance  | coordinates        | string | Latitude Longitude
  Radiance  | add_offset         | float  | -0.08
  Radiance  | scale_factor       | float  | 2.8339462E-4
  Radiance  | valid_min          | ushort | 0
  Radiance  | valid_max          | ushort | 65527
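A hedged sketch of how a CF-aware reader applies the Radiance attributes above: raw counts outside the valid range are treated as missing, and the rest are unpacked with scale_factor and add_offset. The unpack helper and the raw counts are illustrative, not part of any JPSS tool.

```python
# CF packing attributes, matching the Radiance row of the table above.
SCALE_FACTOR = 2.8339462e-4
ADD_OFFSET = -0.08
VALID_MIN, VALID_MAX = 0, 65527

def unpack(raw):
    """Unpack one raw ushort count per CF conventions.

    Counts outside [valid_min, valid_max] (e.g. a 65535 fill value)
    are treated as missing and return None.
    """
    if not (VALID_MIN <= raw <= VALID_MAX):
        return None
    return raw * SCALE_FACTOR + ADD_OFFSET

print(unpack(65535))  # None (fill value above valid_max)
print(unpack(1000))   # ~0.20339462
```

This is why valid_min and valid_max can filter out multiple distinct fill values, whereas _FillValue names only one.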
52. Summary
Obstacles to making JPSS files readable by
netCDF-4 can be addressed by:
• hiding file structure and objects unknown to
netCDF-4.
• importing information to interpret the data from
external files.
Product specific information for CF compliance is
currently added with HDFView or h5edit.
Hidden objects can be unhidden using their saved object addresses. No tool is planned to remove imported data.
53. The HDF Group
Interoperability of HDF5
with netCDF-4
Current Status and Future Directions
54. Current Status
• Opportunities for interoperability primarily involve reading HDF5-produced files using netCDF-4 and netCDF-4-based tools.
• Issues are periodically identified, and some have been solved.
55. Future Directions
Continue working with Unidata to reduce
obstacles to HDF5/netCDF-4 interoperability
56. The HDF Group
Thank You!
57. Acknowledgements
This work was supported by Subcontract number
114820 under Raytheon Contract number
NNG10HP02C, funded by the National Aeronautics
and Space Administration (NASA) and by
cooperative agreement number NNX08AO77A from
the NASA. Any opinions, findings, conclusions, or
recommendations expressed in this material are
those of the authors and do not necessarily reflect
the views of Raytheon or the National Aeronautics
and Space Administration.
At the end of the slide, ask how many people are familiar with netCDF-4.
May skip this slide quickly; do not mention the data model. Several popular tools are used widely by the Earth Science community.
The shared dimensions can be easily used by Earth Science applications to specify the coordinate variables.
Group hierarchy and user-defined datatypes are the key concepts added to the enhanced model. After this slide, ask the audience how many people have heard of the CF conventions and how many would like to know how CF attributes can affect the visualization results.
May quickly go through this slide if the audience knows CF attributes. _FillValue only specifies one distinct missing or undefined value; using valid_min, valid_max, or valid_range, multiple distinct missing or undefined values can be filtered out.
Be careful when using these two attributes. Joe will share MODIS examples with you.
Also make sure to use the exact format. Some tools are picky.
At the end of the slides, mention another CF requirement: variable and attribute names can only contain letters, digits, and underscores. No other characters are allowed.
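The naming rule in this note can be sketched as a small check. The is_cf_name helper is hypothetical; it assumes names must begin with a letter and contain only letters, digits, and underscores, per the CF conventions.

```python
import re

# CF naming rule (sketch): names begin with a letter and use only
# letters, digits, and underscores.
CF_NAME = re.compile(r"^[A-Za-z][A-Za-z0-9_]*$")

def is_cf_name(name):
    """Return True if name satisfies the CF naming rule."""
    return bool(CF_NAME.match(name))

print(is_cf_name("CloudFraction"))  # True
print(is_cf_name("valid-range"))    # False: hyphen not allowed
```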
HDFView demo: for t_int.h5, "ncdump -h t_int.h5" is not working; for tdset.h5, "ncdump -h tdset.h5" is working.
Demos: Lat/Lon screenshots and IDV demo. With units that do not follow CF, IDV cannot open the file; with units that follow CF, it can.
Demos: _FillValue: 1) Use HDFView to show the data. 2) IDV first shows the data without the _FillValue attribute. 3) Add the _FillValue attribute in HDFView and change the HDF5 dataset name to another name. 4) Save the file (not to another HDF5 file). 5) Open with IDV again; this time the _FillValue is not printed.
Demos: 1. Show scale_factor and add_offset, the initial value and the final value. 2. Show the wrong attribute name and the right attribute name. 3. Demo: a file with the wrong attribute names (1, 2, 3, ...) and a file with the right attribute names (1010, 1020, etc.).
Demos: Screenshots: have the audience note the "coordinates" attribute. Show IDV; do not display the value plot (IDV has a bug), just show the vertical coordinate.
netCDF-4 requires the association of dimensions with coordinate variables.