The document discusses providing easy access to HDF data via NCL, IDL, and MATLAB. It presents examples and code snippets for reading HDF data from various NASA data centers like GES DISC, MODAPS, NSIDC, and LP-DAAC into the three software packages. Common issues when working with HDF files like HDF-EOS2 swaths with dimension maps and different ways metadata is stored are also addressed. The overall goal is to help lower the learning curve for users who want to analyze HDF data in their favorite analysis packages.
This slide will demonstrate how to use OPeNDAP Java clients such as IDV and Panoply, via the HDF OPeNDAP data handlers, to access various NASA HDF products such as AIRS, OMI, MLS, MODIS, TRMM, CERES, SeaWiFS, etc. Various features of these tools that can help users easily access HDF data will also be explored.
The NCAR Command Language (NCL) is an interpreted language designed for scientific data analysis and visualization with high-quality graphics, especially for atmospheric science. NCL has supported NetCDF 3/4, GRIB 1/2, HDF-SDS, HDF-EOS, shapefiles, binary, and ASCII files for years. HDF-EOS5 support is now in the released version, and HDF5 support is in the beta-test stage.
The NCL team is now extending NCL to write HDF5 files and to read HDF-EOS5 data via OPeNDAP.
The NCL team will share its experience visualizing and analyzing HDF-EOS5 and HDF5 data.
This is an introductory slide on accessing NASA HDF/HDF-EOS data for beginners. NASA distributes much of its Earth Science data in the HDF/HDF-EOS file formats, and new users often struggle to understand the formats and use the NASA HDF/HDF-EOS data properly. This brief presentation will help new users understand the basic concepts of HDF/HDF-EOS and learn about the available tools that can access the NASA data easily.
This slide will provide an overview of current functionality, techniques, and tips for visualization and query of HDF and netCDF data in ArcGIS, as well as future plans. Hierarchical Data Format (HDF) and netCDF (network Common Data Form) are two widely used data formats for storing and manipulating scientific data. The NetCDF format also supports temporal data by using multidimensional arrays. The basic structure of data in this format and how to work with it will be covered in the context of standardized data structures and conventions. This slide will demonstrate the tools and techniques for ingesting HDF and netCDF data efficiently in ArcGIS, as well as some common workflows to employ the visualization capabilities of ArcGIS for effective animation and analysis of your data.
The MathWorks introduced MATLAB support for HDF5 in 2002 via three high-level functions: HDF5INFO, HDF5READ, and HDF5WRITE. These functions worked well for their purpose, providing simple interfaces to a complicated file format, but MATLAB users requested finer control over their HDF5 files and the HDF5 library. MATLAB 7.3 (R2006b) adds this precise level of support for version 1.6.5 of the HDF5 library via a close mapping of the HDF5 C API to MATLAB function calls.
This presentation will briefly introduce the earlier, high-level HDF5 interface (and its limitations) before showing in detail the low-level HDF5 functions. It will show how to interact with the HDF5 library and files using the thirteen classes of functions in MATLAB, which encapsulate groupings of functionality found in the HDF5 C API. But because MATLAB is itself a higher-level language than C, we will also present MATLAB's extensions and modifications of the HDF5 C API that make it more MATLAB-like, work with defined values, and perform ID and memory management.
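As a quick illustration (not taken from the presentation itself; the file name and dataset path below are hypothetical), the low-level MATLAB functions mirror the HDF5 C API almost call for call:

```matlab
% Minimal sketch: open a file read-only and read one dataset with the
% low-level interface.  'example.h5' and '/g1/temperature' are hypothetical.
fid     = H5F.open('example.h5', 'H5F_ACC_RDONLY', 'H5P_DEFAULT');
dset_id = H5D.open(fid, '/g1/temperature');
% 'H5ML_DEFAULT' asks MATLAB to choose an appropriate memory datatype --
% one of the MATLAB-friendly extensions to the raw C API mentioned above.
data = H5D.read(dset_id, 'H5ML_DEFAULT', 'H5S_ALL', 'H5S_ALL', 'H5P_DEFAULT');
% IDs must be released explicitly, just as in C.
H5D.close(dset_id);
H5F.close(fid);
```

Compare H5Fopen/H5Dopen/H5Dread/H5Dclose/H5Fclose in the C API: the grouping into H5F, H5D, etc. is exactly the "classes of functions" the presentation describes.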
Wrapping a library like HDF5 requires a great deal of effort and design, and we will briefly present a general-purpose mechanism for creating close mappings between library interfaces and an application like MATLAB. One of our goals in this presentation is to facilitate communication with The HDF Group about how The MathWorks builds our HDF5 interfaces in order to ease adoption of future versions of the HDF5 library in large, general-purpose applications.
HDF5 is a powerful and feature-rich creature, and getting the most out of it requires powerful tools. The MathWorks provides a "low-level" interface to the HDF5 library that closely corresponds to the C API and exposes much of its richness. This short tutorial will present ways to use the low-level MATLAB interface to build those tools and tackle such topics as subsetting, chunking, and compression.
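As a hedged sketch of the subsetting topic, assuming a hypothetical file 'example.h5' containing a 2-D dataset '/g1/temperature': a hyperslab selection reads only the requested window from disk.

```matlab
% Read a 100x100 window starting at row 11, column 21 (0-based offsets).
% The low-level interface uses C-style (row-major) dimension order.
fid     = H5F.open('example.h5', 'H5F_ACC_RDONLY', 'H5P_DEFAULT');
dset_id = H5D.open(fid, '/g1/temperature');
fspace  = H5D.get_space(dset_id);
offset  = [10 20];
count   = [100 100];
% [] for stride and block selects the defaults (1 and 1).
H5S.select_hyperslab(fspace, 'H5S_SELECT_SET', offset, [], count, []);
mspace  = H5S.create_simple(2, count, []);
subset  = H5D.read(dset_id, 'H5ML_DEFAULT', mspace, fspace, 'H5P_DEFAULT');
H5S.close(mspace); H5S.close(fspace);
H5D.close(dset_id); H5F.close(fid);
```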
This tutorial is designed for anyone who needs to work with data stored in HDF and HDF5 files.
The first part of the tutorial will focus on the HDF5 utilities to display the contents of HDF5 files, to extract data from and import data to HDF5 files, to compare two HDF5 files, and more. Participants will be guided through hands-on examples and will learn about different tool options. New changes and advanced features will be covered in a separate session (Updates on HDF tools) on Wednesday.
The second part of the tutorial includes a hands-on session on the HDF (4 & 5) Java browsing tool, HDFView. The tool and special plug-ins will be used to work with existing HDF, HDF-EOS, and netCDF-4 files, and to create a new HDF5 file. The tutorial will cover the basic features of HDFView.
The tool takes HDF-EOS 5 data as input and generates COARDS-compatible output: if the input file has enough metadata to be COARDS-compliant, the output file will be COARDS-compliant. The tool is written in portable C and ought to run on any platform where the HDF-EOS and netCDF libraries are available.
This year, we have made two major enhancements to the converter:
It now automatically detects whether its input is HDF-EOS2 or HDF-EOS5 format, and handles either one. The previous tool worked with HDF-EOS5 only.
Its netCDF output attempts to conform to the new CF conventions (a superset of the COARDS conventions). This is primarily an improvement in its translation of Swath datasets, which CF handles much better than COARDS.
This tutorial is designed for new HDF5 users. We will go over a brief history of HDF and HDF5 software, and will cover basic HDF5 Data Model objects and their properties; we will give an overview of the HDF5 Libraries and APIs, and discuss the HDF5 programming model. Simple C and Fortran examples, and Java tool HDFView will be used to illustrate HDF5 concepts.
This tutorial is designed for new HDF5 users. We will cover basic HDF5 Data Model objects and their properties, give an overview of the HDF5 Libraries and APIs, and discuss the HDF5 programming model. Simple C and Fortran examples will be used to illustrate HDF5 concepts.
This tutorial is designed for anyone who needs to work with data stored in HDF5 files. It will cover functionality and useful features of the HDF5 utilities, which include h5dump, h5diff, h5repack, h5stat, h5copy, h5check and h5repart. The tutorial will also introduce recent changes and new features of the utilities.
HDFView is a visual tool for browsing and editing HDF4 and HDF5 files. Some basic features of and recent changes to HDFView will be presented. Details of recent development in the HDF-Java products will be discussed in a separate presentation.
This tutorial is designed for new HDF5 users. We will cover HDF5 abstractions such as datasets, groups, attributes, and datatypes. Simple C examples will cover the programming model and basic features of the API, and will give new users the knowledge they need to navigate through the rich collection of HDF5 interfaces. Participants will be guided through an interactive demonstration of the fundamentals of HDF5.
This tutorial is for new HDF5 users.
This tutorial is designed for HDF5 users with some HDF5 experience. It will cover properties of the HDF5 objects that affect I/O performance and file sizes. The following HDF5 features will be discussed: partial I/O, chunking and compression, and complex HDF5 datatypes such as strings, variable-length arrays and compound datatypes.
We will also discuss references to objects and dataset regions and how they can be used for indexing. Participants will work with the tutorial examples and exercises during the hands-on sessions.
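As one concrete illustration of the chunking and compression topic (the file name and sizes below are hypothetical), creating a chunked, gzip-compressed dataset with MATLAB's low-level interface might look like this; note the dimension flip between MATLAB's column-major arrays and HDF5's row-major dataspaces:

```matlab
% Create 'chunked.h5' (hypothetical name) with one chunked, compressed dataset.
matlab_dims = [100 200];                    % size of the MATLAB array
h5_dims     = fliplr(matlab_dims);          % HDF5 uses row-major order
fid   = H5F.create('chunked.h5', 'H5F_ACC_TRUNC', 'H5P_DEFAULT', 'H5P_DEFAULT');
space = H5S.create_simple(2, h5_dims, []);
dcpl  = H5P.create('H5P_DATASET_CREATE');
H5P.set_chunk(dcpl, fliplr([50 100]));      % chunk dims, also flipped
H5P.set_deflate(dcpl, 6);                   % gzip at level 6
dset  = H5D.create(fid, 'data', 'H5T_NATIVE_DOUBLE', space, dcpl);
H5D.write(dset, 'H5ML_DEFAULT', 'H5S_ALL', 'H5S_ALL', 'H5P_DEFAULT', ...
          rand(matlab_dims));
H5D.close(dset); H5P.close(dcpl); H5S.close(space); H5F.close(fid);
```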
This tutorial is designed for anyone who needs to work with data stored in HDF5 files. The tutorial will cover functionality and useful features of the HDF5 utilities h5dump, h5diff, h5repack, h5stat, h5copy, h5check and h5repart. It will also cover HDFView, the HDF5 Java browsing and editing tool.
In this presentation, we will give an update on the HDF OPeNDAP project. We will describe the new features of the HDF5 OPeNDAP data handler. We will also introduce the enhanced HDF4 OPeNDAP data handler and demonstrate how it can help users view and analyze remote HDF-EOS2 data. A demo that uses OPeNDAP client tools to handle AIRS and MODIS Grid/Swath data with the enhanced handler will be presented.
We will summarize the current status of HDF-EOS and associated tools. Updates on HDF-EOS, the HDFView plug-in, and the HDF-EOS to GeoTIFF (HEG) conversion tool, including recent changes to the software, ongoing maintenance, upcoming releases, future plans, and issues, will be discussed.
We will also summarize the status of the HDF-EOS RFC. The HDF-EOS plug-in for the THG-developed tool, HDFView, has been enhanced. The plug-in offers browse capability for both HDF4- and HDF5-based HDF-EOS files. HDFView can also process vanilla HDF4 and HDF5 files. New features, including support for Point and Zonal Average objects, have been added. A Mac OS X port will be available in the next release.
The HDF-EOS to GeoTIFF (HEG) conversion tool has been augmented to include new projections and support for additional AMSR-E and AIRS products. Subsetting features have also been augmented. The tool is available in both stand-alone and EOS DAAC online versions.
It will cover features of the HDF5 library for achieving better I/O performance and efficient storage. The following HDF5 features will be discussed: datatypes and partial I/O.
This tutorial is for persons who are already familiar with HDF5 and wish to take advantage of some of its advanced features.
We will introduce the basic features of, and recent updates to, the HDF5 tools. The intention of the talk is to keep users updated on HDF5 tool development, such as new features and new tools. It also helps new users get familiar with the HDF5 tools. Three new tools, h5check, h5copy, and h5stat, will be introduced.
It is important that HDF5 files created and modified by the HDF5 library are fully compliant with the defined HDF5 File Format Specification (File Format) to ensure data model integrity and long-term compatibility between evolving versions of the HDF5 library. The h5check tool verifies that the content of an HDF5 file is encoded according to the File Format. This verification role also makes h5check act as a watchdog for the implementation correctness of the HDF5 library. This presentation explains the features of the tool and, time permitting, shows a demonstration of the tool.
h5copy is a command-line tool that copies an HDF5 object (group, dataset, or named datatype) from one location to another, within a file or across files. h5copy uses the HDF5 API function H5Gcopy to copy objects, which provides users with several options for how a data object is copied, and high efficiency in data copying.
h5stat is a new utility for viewing various statistics about an HDF5 file. The tool is still under development. The goal of the talk is to introduce the tool and gather feedback from HDF5 users.
This 2009 tutorial slide will cover basic HDF5 Data Model objects and their properties. It will include an overview of the HDF5 libraries and APIs and describe the HDF5 programming model. Simple programming examples and the HDFView data browser will be used to illustrate HDF5 concepts and help you start developing your own HDF5-based applications.
This tutorial is for new HDF5 users.
This tutorial will introduce the three levels of the HDF-Java products: the HDF-Java wrapper (or Java Native Interfaces to the standard HDF libraries), the HDF-Java object package, and the HDFView. The Java wrapper provides standard Java APIs that allow applications to call the C HDF libraries from Java. The HDF-Java object package implements HDF data objects, e.g. Groups and Datasets, in an object-oriented form and makes it easy for applications to use the libraries. The HDFView is a visual tool for browsing and editing HDF4 and HDF5 files.
This tutorial is designed for HDF5 users with some HDF5 experience.
It will cover advanced features of the HDF5 library for achieving better I/O performance and efficient storage. The following HDF5 features will be discussed: partial I/O, chunked storage layout, compression and other filters including new n-bit and scale+offset filters. Significant time will be devoted to the discussion of complex HDF5 datatypes such as strings, variable-length datatypes, array and compound datatypes.
This tutorial is designed for users with some HDF5 experience. It will cover advanced features of the HDF5 library that can be used to achieve better I/O performance and more efficient storage. The following HDF5 features will be discussed: partial I/O; compression and other filters, including new n-bit and scale+offset filters and data storage options. Significant time will be devoted to the discussion of complex HDF5 datatypes such as strings, variable-length datatypes, array datatypes, and compound datatypes.
Usage of NCL, IDL, and MATLAB to access NASA HDF4/HDF-EOS2/HDF-EOS5 data
1. The HDF Group
Easy Access of HDF data via
NCL/IDL/MATLAB
Kent Yang, Tong Qi, Ziying Li, Yi Wang, Shu Zhang, Joe Lee
The HDF Group
September 28, 2010
HDF/HDF-EOS Workshop XIV
1
www.hdfgroup.org
2. Motivation
• Many heterogeneous NASA HDF data products
• To visualize the data, different products need to be handled differently
• Users need to spend extra time figuring out the solutions
• Individual data centers have already provided data services for the data they distributed
• Some end-users prefer to use their favorite tools to access HDF data
3. Learning Curve of accessing HDF data
[NCL example chart from the ESIP wiki page "Making Science Data Easier to Use with OPeNDAP":
http://wiki.esipfed.org/index.php/Making_Science_Data_Easier_to_Use_with_OPeNDAP]
6. The HDF Group
Basic Examples
7. Introduction to NCL/IDL/MATLAB
• Interpreted languages
• Visualization, analysis and computation
• NCL
  - Free package, developed by NCAR
  - Supports HDF-EOS2, HDF-EOS5 and HDF4
• IDL
  - Widely used by the Earth Science community
  - Supports HDF-EOS2, HDF4 and HDF5
• MATLAB
  - Widely used by the computation and engineering communities
  - Supports HDF-EOS2, HDF4 and HDF5
8. A simple NCL example
load "$NCARG_ROOT/lib/ncarg/nclex/gsun/gsn_code.ncl"
load "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
begin
cdf_file = addfile("AMSR_E_L3_RainGrid_B05_200707.he2","r")
rrland = cdf_file->RrLandRain_MonthlyRainTotal_GeoGrid(:,:)
rrland@_FillValue = -1
resources = True
xwks = gsn_open_wks("pdf","AE_RnGd.hdfeos2")
plot = gsn_csm_contour_map_ce(xwks,rrland,resources)
end
9. • Complete code can be found under http://hdfeos.org/software/ncl.php#ref_sec:ncl-hdfeos2-grid-1d-unabridged
10. A simple IDL example
PRO AMSR_E_L2A_BrightnessTemperatures
FILE_NAME="AMSR_E_L2A_BrightnessTemperatures_V09_200206190029_D.hdf"
SWATH_NAME='Low_Res_Swath'
DATAFIELD_NAME='23.8H_Approx._Res.3_TB_(not-resampled)'
file_id = EOS_SW_OPEN(FILE_NAME)
swath_id = EOS_SW_ATTACH(file_id, SWATH_NAME)
status = EOS_SW_READFIELD(swath_id, DATAFIELD_NAME, data)
status = EOS_SW_READFIELD(swath_id, 'Longitude', lon)
status = EOS_SW_READFIELD(swath_id, 'Latitude', lat)
status = EOS_SW_DETACH(swath_id)
status = EOS_SW_CLOSE(file_id)
MAP_SET, /GRID, /CONTINENTS
CONTOUR, data, lon, lat, /OVERPLOT, NLEVELS=20, /CELL_FILL
END
12. • More information on the descriptions of these examples
• Check hdfeos.org
  • NCL: http://hdfeos.org/software/ncl.php
  • IDL: http://hdfeos.org/examples/idl.php
  • MATLAB: http://hdfeos.org/examples/matlab.php
14. More helpful
• Comprehensive NCL/IDL/MATLAB example codes and plots for sample data from most NASA data centers:
  GES DISC
  MODAPS (LAADS)
  NSIDC
  LP-DAAC
  P.O. DAAC
  GHRC
  OBPG (Ocean Color)
  LaRC
15. Where are these examples located?
• http://hdfeos.org/zoo/
• We welcome you to send us feedback on these examples. You can use the HDF-EOS forum (http://hdfeos.org/forums) to share your comments or contact us at eoshelp@hdfgroup.org.
16. Common Issues for Tools
• MATLAB and IDL
  - IDL: for IDL 7.x and before, cannot add a color bar by using scripts
  - MATLAB: for data arrays > 1 MB, one needs to use 64-bit MATLAB to generate plots; takes a very long time
17. Common Issues for Tools
• MATLAB and IDL
  - HDF-EOS2 non-geographic projection Grids: need to provide additional latitude and longitude files
  - HDF-EOS2 geographic projection Grids: need to obtain parameters to calculate latitude and longitude
• All Tools
  - HDF-EOS2 Swaths with dimension maps: need to provide additional latitude and longitude files
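For the dimension-map case, the idea is that the swath stores geolocation at a coarser resolution than the data, with an offset and increment recorded in the dimension map. A hedged MATLAB sketch (the offset, increment, and sample values below are hypothetical) of expanding a coarse 1-D latitude array to full resolution:

```matlab
% Hypothetical dimension-map parameters and coarse geolocation samples.
offset    = 2;                       % full-res index of the first sample
increment = 5;                       % full-res indices between samples
lat_coarse = [10.0 10.5 11.0 11.5];
n = numel(lat_coarse);
idx_coarse = offset + increment*(0:n-1);   % where the samples sit
idx_full   = 0:(increment*n - 1);          % every full-resolution index
% Linear interpolation (and extrapolation at the edges) fills the gaps.
lat_full = interp1(idx_coarse, lat_coarse, idx_full, 'linear', 'extrap');
```

In practice the pre-computed MOD03/MYD03 geolocation files mentioned later in this deck are the more reliable route; this sketch only shows the interpolation idea.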
18. Other Issues
• Different ways to store metadata in an HDF file
  - Some HDF4 products don't provide lat/lon
  - Some HDF4 products provide attributes to calculate lat/lon
• Users not familiar with HDF4 and HDF-EOS2 file structures
19. A Tip that you need to remember
• You can always use HDFView to quickly examine any HDF files.
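If HDFView is not at hand, MATLAB's own inquiry functions give a similar quick look; a minimal sketch with a hypothetical HDF4 file name:

```matlab
% List the SDS datasets in a (hypothetical) HDF4 file without reading data.
info = hdfinfo('example.hdf');   % use h5info/hdf5info for HDF5 files
disp({info.SDS.Name});           % names of the scientific data sets, if any
```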
21. The HDF Group
Now we will walk through examples for each NASA data center
22. The HDF Group
GES DISC
23. GES DISC AIRS Swath
• Directly read the lat/lon and use the polar view
…
data=eos_file->radiances_L2_Standard_cloud_cleared_radiance_product(:,:,0) ; read specific subset of data field
; In order to read the radiances data field from the HDF-EOS2 file, the group
; under which the data field is placed must be appended to the data field in NCL.
; For more information, visit section 4.3.2 of http://hdfeos.org/software/ncl.php.
data@lat2d=eos_file->Latitude_L2_Standard_cloud_cleared_radiance_product ; associate longitude and latitude
data@lon2d=eos_file->Longitude_L2_Standard_cloud_cleared_radiance_product
data@_FillValue=-9999
…
res@gsnCenterString="radiances at Channel=567"
plot(2)=gsn_csm_contour_map_polar(xwks,data_2,res)
res@gsnCenterString="radiances at Channel=1339"
plot(3)=gsn_csm_contour_map_polar(xwks,data_3,res)
delete(plot) ; cleaning up resources used
delete(data)
NCL
24. GES DISC AIRS Swath
[IDL and MATLAB plots of the AIRS swath]
25. GES DISC AIRS Grid
• A typical global grid. Lat. and Lon. are provided.
…
%Reading Data from a Data Field
GRID_NAME='ascending';
grid_id = hdfgd('attach', file_id, GRID_NAME);
DATAFIELD_NAME='RelHumid_A';
[data1, fail] = hdfgd('readfield', grid_id, DATAFIELD_NAME, [], [], []);
…
%Reading Lat and Lon Data
GRID_NAME='location';
grid_id = hdfgd('attach', file_id, GRID_NAME);
%Reading Lat Data
DATAFIELD_NAME='Latitude';
[lat, status] = hdfgd('readfield', grid_id, DATAFIELD_NAME, [], [], []);
lat=double(lat);
[fillvalue,status] = hdfgd('getfillvalue',grid_id, DATAFIELD_NAME);
lat(lat==fillvalue) = NaN;
%Reading Lon Data
DATAFIELD_NAME='Longitude';
[lon, status] = hdfgd('readfield', grid_id, DATAFIELD_NAME, [], [], []);
lon=double(lon);
[fillvalue,status] = hdfgd('getfillvalue',grid_id, DATAFIELD_NAME);
lon(lon==fillvalue) = NaN;
…
Matlab
26. GES DISC AIRS Grid
[NCL and IDL plots of the AIRS grid]
27. GES DISC TRMM Swath
• The global view doesn't show much information; need a zoom view.
1B21_CSI.990906.10217.KORA.6_binDIDHmean.idl
28. GES DISC TRMM Swath
[MATLAB plot of the TRMM swath]
29. GES DISC TRMM Grid
• Grid, 3B43:
• Calculate lat/lon based on the formula
• Add “units”
http://disc.sci.gsfc.nasa.gov/additional/faq/precipitation_faq.shtml#lat_lon
NCL
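The FAQ formula amounts to computing cell-center coordinates from the grid spacing. A hedged MATLAB sketch, assuming the 3B43 grid is the 0.25-degree, 400 x 1440 grid covering 50S-50N and 180W-180E (verify the constants against the linked FAQ):

```matlab
% Cell-center latitudes and longitudes for the (assumed) 0.25-degree grid.
step = 0.25;
lat  = -50  + step/2 + step*(0:399);    % -49.875, -49.625, ..., 49.875
lon  = -180 + step/2 + step*(0:1439);   % -179.875, ..., 179.875
```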
30. GES DISC MERRA Grid
• A typical global grid
…
data=eos_file->PLE_EOSGRID(1,72,:,:) ; read specific subset of data field
;
; In order to read the PLE data field from the HDF-EOS2 file, the group
; under which the data field is placed must be appended to the data field in NCL.
; For more information, visit section 4.3.2 of http://hdfeos.org/software/ncl.php.
data@lon1d=eos_file->XDim_EOSGRID ; associate longitude and latitude
data@lat1d=eos_file->YDim_EOSGRID ; since the XDim/YDim arrays are 1-D, we use lon1d instead of lon2d
data@units="Pa"
data@long_name="Edge pressures"
…
data_4=eos_file->PLE_EOSGRID(7,70,:,:) ; read specific subset of data field
data_4@lon1d=eos_file->XDim_EOSGRID ; associate longitude and latitude
data_4@lat1d=eos_file->YDim_EOSGRID
data_4@units="Pa"
data_4@long_name="Edge pressures"
…
res@gsnCenterString="PLE at TIME=1, Height=72"
plot(0)=gsn_csm_contour_map_ce(xwks,data,res)
…
NCL
31. GES DISC TOMS Grid
…
;retrieve data
grid_id = EOS_GD_ATTACH(file_id, GRID_NAME)
status = EOS_GD_READFIELD(grid_id, DATAFIELD_NAME, data)
;close file
status = EOS_GD_DETACH(grid_id)
status = EOS_GD_CLOSE(file_id)
…
;retrieve lat
;field name should be defined as "YDim:TOMS Level 3" instead of "YDim:TOMS Level 3 (dimension)"
DATAFIELD_NAME="YDim:TOMS Level 3"
index=HDF_SD_NAMETOINDEX(newFileID,DATAFIELD_NAME)
thisSdsID=HDF_SD_SELECT(newFileID, index)
HDF_SD_GETDATA, thisSdsID, lat
;retrieve lon
;field name should be defined as "XDim:TOMS Level 3" instead of "XDim:TOMS Level 3 (dimension)"
DATAFIELD_NAME="XDim:TOMS Level 3"
index=HDF_SD_NAMETOINDEX(newFileID,DATAFIELD_NAME)
…
CONTOUR, BYTSCL(data, /NAN), lon, lat, /OVERPLOT, /FILL, C_Colors=Indgen(levels)+3, Background=1, NLEVELS=levels, Color=Black
MAP_GRID, /BOX_AXES, COLOR=255
MAP_CONTINENTS, COLOR=255
…
TOMS-EP_L3-TOMSEPL3_2000m0101_v8_Ozone.idl
32. The HDF Group
MODAPS
MODIS Level 1 Swath
34. MODAPS (LAADS)
• MODIS Swath using a dimension map
• It needs the additional latitude and longitude files provided by NASA.
• Where to obtain the corresponding latitude and longitude files?
- ftp://ladsweb.nascom.nasa.gov/allData/5/MYD03 or
- ftp://ladsweb.nascom.nasa.gov/allData/5/MOD03
MODIS Swath with dimension map
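The role of the dimension map can be sketched in code. This is a toy illustration with made-up offset/increment and sample values; in practice the full-resolution geolocation should be taken from the MOD03/MYD03 files listed above rather than interpolated this way.

```python
import numpy as np

# A MODIS swath with a dimension map stores geolocation at reduced
# resolution: samples start at 'offset' and repeat every 'increment'
# pixels along the full-resolution dimension. Linear interpolation
# gives an approximate full-resolution latitude array.
offset, increment = 2, 5          # hypothetical dimension-map values
ncols_full = 20                   # hypothetical full-resolution size
lat_reduced = np.array([10.0, 10.5, 11.0, 11.5])  # hypothetical samples

x_reduced = offset + increment * np.arange(lat_reduced.size)
x_full = np.arange(ncols_full)
lat_full = np.interp(x_full, x_reduced, lat_reduced)  # clamps at the ends
```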
35. MODAPS (LAADS)
• The scale and offset must be applied, and MODIS requires a special formula rather than the normal one:
Normal: data = scale * data + offset
Matlab
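The difference between the two conventions can be shown in a few lines of Python. The form data = scale * (data - offset) follows the MODIS Level 1B documentation; the attribute values below are made up.

```python
import numpy as np

packed = np.array([100.0, 200.0, 300.0])  # hypothetical packed counts
scale, offset = 0.01, 50.0                # hypothetical attribute values

usual = scale * packed + offset     # the normal unpacking convention
modis = scale * (packed - offset)   # the MODIS Level 1B convention
# Applying the wrong formula silently yields wrong physical values.
```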
38. NSIDC Polar Stereographic Grid
NCL
39. The HDF Group
LP- DAAC
40. LP-DAAC Sinusoidal Grid
• The lat/lon are calculated by the hdfeos2 dumper tool
http://hdfeos.org/software/eosdump.php
MOD09GA.A2007268.h10v08.005.2007272184810_sur_refl_b01_1.idl
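The dumper tool writes the latitude/longitude out for you, but the inverse sinusoidal mapping itself is short. A minimal Python sketch, assuming the spherical radius used by the MODIS sinusoidal grid (6371007.181 m):

```python
import numpy as np

R = 6371007.181  # sphere radius (m) used by the MODIS sinusoidal grid

def sinu_inverse(x, y):
    """Map sinusoidal projection coordinates (m) to degrees lat/lon."""
    lat = y / R                  # latitude in radians
    lon = x / (R * np.cos(lat))  # longitude in radians
    return np.degrees(lat), np.degrees(lon)
```

A real tool must also guard against cos(lat) approaching zero near the poles and handle the ±180° longitude wrap.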
41. The HDF Group
PO. DAAC
42. PO.DAAC Geographic Grid
• Need to calculate the lat/lon based on the information provided by the product document.
NCL
Matlab
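As a sketch of that kind of calculation: for a global equal-angle grid, the product document supplies the dimensions and origin, and cell-center latitude/longitude follow directly. The 1-degree, 180 × 360 layout below is a hypothetical example, not the actual product's resolution.

```python
import numpy as np

nlat, nlon = 180, 360                    # hypothetical grid dimensions
dlat, dlon = 180.0 / nlat, 360.0 / nlon  # cell size in degrees

# Cell centers: north-to-south rows, west-to-east columns.
lat = 90.0 - dlat / 2 - dlat * np.arange(nlat)
lon = -180.0 + dlon / 2 + dlon * np.arange(nlon)
```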
54. LaRC_CERES cross-section
;Open file
FILE_NAME='CER_ZAVG_Aqua-FM4MODIS_Edition2B_007005.200503.hdf'
newFileID=HDF_SD_START(FILE_NAME, /READ)
;Define datafield
DATAFIELD_NAME='Ice Particle Diameter'
index=HDF_SD_NAMETOINDEX(newFileID,DATAFIELD_NAME)
;Retrieve data
thisSdsID=HDF_SD_SELECT(newFileID, index)
HDF_SD_GETDATA, thisSdsID, data
…
;generate lat
lat=FINDGEN(180)*(-1)+89.5
;generate ngmt
ngmt=FINDGEN(8)*1+1
…
; Start off generating the plot
levels = 250
device, decomposed=0
LoadCT, 33, Ncolors=levels, Bottom=3
WINDOW, title='Ice Particle Diameter at Stats=0'+' '+'units:'+units, XSIZE=800
CONTOUR, data2D, ngmt, lat, /Fill, C_Colors=Indgen(levels)+3, Background=1, NLEVELS=levels, Color=Black, XTITLE='Monthly 3-hourly GMT time increments', YTITLE='latitude', POSITION=[0.1, 0.1, 0.82, 0.95]
…
CER_ZAVG_Aqua-FM4-MODIS_Edition2B_007005.200503.idl
55. CERES Nested Grid
• http://eosweb.larc.nasa.gov/PRODOCS/ceres/SRBAVG/Quality_Summaries/srbavg_ed2d/nestedgrid.html
• This projection is not supported by the tools; we have to emulate it.
57. Limitations
• No tool can generate a LAMAZ (Lambert Azimuthal Equal-Area projection) grid properly.
• No latitude and longitude files can be found for the 250-meter and 500-meter MOD and MYD swaths using dimension maps distributed by MODAPS.
• Some tools don’t support all projections.
58. Again
• Example code and plots are available at http://hdfeos.org/zoo
• We welcome your feedback on these examples. Use the HDF-EOS forum (http://hdfeos.org/forums) to share comments, or contact us at eoshelp@hdfgroup.org.
59. The HDF Group
Thank you !
60. Acknowledgements
This work was supported by cooperative agreement
number NNX08AO77A from the National
Aeronautics and Space Administration (NASA).
Any opinions, findings, conclusions, or
recommendations expressed in this material are
those of the author[s] and do not necessarily reflect
the views of the National Aeronautics and Space
Administration.
The Earth Observing System Project Science Office: eospso.gsfc.nasa.gov (information about the Earth Observing System).
SDP Toolkit/HDF-EOS: http://newsroom.gsfc.nasa.gov/sdptoolkit/toolkit.html
HDF Group website: http://hdfgroup.org
HDF-EOS Tools and Information Center: http://hdfeos.org or http://hdfeos.net (screenshot of http://hdfeos.org).
Explain two reasons for this work, done in the past few months: some tool information is out of date and needs updating, and there have been requests for more information, such as examples.
NCL is developed by the CISL at NCAR. It is an interpreted language designed for visualization and analysis of scientific data.