This document introduces the HDF5 format and The HDF Group software. It covers:
- HDF5 is a data model, library, and file format for managing large amounts of numerical data. Development started in 1996, and HDF5 was designed as an improved successor to the HDF4 format.
- The HDF5 data model defines objects such as datasets, groups, and attributes that provide a flexible way to organize and describe stored data. Properties of these objects can be modified to control storage and performance.
- The HDF5 library provides interfaces for working with HDF5 files and objects from C, C++, Fortran, Java, and other languages. A set of command-line utilities and the HDFView browser can be used to inspect and edit HDF5 files.
In this tutorial we will discuss different storage methods for HDF5 files (split files, family of files, multi-files) and datasets (compressed, external, compact), together with the related filters and properties. The tutorial will introduce advanced features of HDF5, including:
o Property lists
o Compound datatypes
o Hyperslab selections
o Point selections
o References to objects and regions
o Extendable datasets
o Mounting files
o Group iterations
The status of HDF-EOS and access tools will be summarized. Updates on HDF-EOS, the TOOLKIT, the HDFView plug-in, and the HDF-EOS to GeoTIFF (HEG) conversion tool will be discussed, including recent changes to the software, ongoing maintenance, upcoming releases, future plans, and open issues.
A preponderance of data from NASA's Earth Observing System (EOS) is archived in the HDF Version 4 (HDF4) format. The long-term preservation of these data is critical for climate and other scientific studies going many decades into the future. HDF4 is very effective for working with the large and complex collection of EOS data products. Unfortunately, because of the complex internal byte layout of HDF4 files, future readability of HDF4 data depends on preserving a complex software library that can interpret that layout. Having a way to access HDF4 data independent of a library could improve its viability as an archive format, and consequently give confidence that HDF4 data will be readily accessible forever, even if the HDF4 library is gone.
To address the need to simplify long-term access to EOS data stored in HDF4, a collaborative project between The HDF Group and NASA Earth Science Data Centers is implementing an approach to accessing data in HDF4 files based on the use of independent maps that describe the data in HDF4 files and tools that can use these maps to recover data from those files. With this approach, relatively simple programs will be able to extract the data from an HDF4 file, bypassing the need for the HDF4 library.
A demonstration project has shown that this approach is feasible. It involved an assessment of NASA's HDF4 data holdings, and development of a prototype XML-based layout mapping language along with tools to read layout maps and to read HDF4 files using them. Future plans call for a second phase of the project, in which the mapping tools and XML schema are made production quality, the mapping schema is integrated with existing XML metadata files in several data centers, and outreach activities are carried out to encourage and facilitate acceptance of the technology.
This tutorial is designed for new HDF5 users. We will cover HDF5 abstractions such as datasets, groups, attributes, and datatypes. Simple C examples will cover the programming model and basic features of the API, and will give new users the knowledge they need to navigate through the rich collection of HDF5 interfaces. Participants will be guided through an interactive demonstration of the fundamentals of HDF5.
This tutorial is for new HDF5 users.
These 2009 tutorial slides cover basic HDF5 Data Model objects and their properties. They include an overview of the HDF5 libraries and APIs, and describe the HDF5 programming model. Simple programming examples and the HDFView data browser are used to illustrate HDF5 concepts and to help you start developing your own HDF5-based applications.
This tutorial is for new HDF5 users.
This tutorial is designed for new HDF5 users. We will go over a brief history of HDF and HDF5 software, and will cover basic HDF5 Data Model objects and their properties; we will give an overview of the HDF5 Libraries and APIs, and discuss the HDF5 programming model. Simple C and Fortran examples, and Java tool HDFView will be used to illustrate HDF5 concepts.
This tutorial is designed for new HDF5 users. We will cover basic HDF5 Data Model objects and their properties, give an overview of the HDF5 Libraries and APIs, and discuss the HDF5 programming model. Simple C and Fortran examples will be used to illustrate HDF5 concepts.
This tutorial gives a brief introduction to HDF5 for people who have never used it. It covers the HDF5 Data Model, including HDF5 objects and their properties. It also briefly describes the HDF5 Programming Model and prepares participants for further self-study of HDF5 and for hands-on sessions.
In this presentation, we will give an update on the HDF OPeNDAP project. We will describe new features of the HDF5 OPeNDAP data handler. We will also introduce the enhanced HDF4 OPeNDAP data handler and demonstrate how it can help users view and analyze remote HDF-EOS2 data. A demo that uses OPeNDAP client tools to handle AIRS and MODIS Grid/Swath data with the enhanced handler will be presented.
This tutorial is designed for new HDF5 users. We will cover basic HDF5 Data Model objects and their properties; we will give an overview of the HDF5 Libraries and APIs, and discuss the HDF5 programming model. Simple C and Fortran examples will be used to illustrate HDF5 concepts. Participants will work with the tutorial examples and exercises during the hands-on sessions.
An update on HDF, including a status report on The HDF Group, an overview of recent changes to the HDF4 and HDF5 libraries and tools, plans for upcoming releases, and HDF Group projects, collaborations, and future plans.
It will cover features of the HDF5 library for achieving better I/O performance and efficient storage. The following HDF5 features will be discussed: datatypes and partial I/O.
This tutorial is for persons who are already familiar with HDF5 and wish to take advantage of some of its advanced features.
The HDF-Java products include three components: the HDF4 and HDF5 Java wrappers, the HDF-Java object package, and HDFView. The Java wrappers provide standard Java APIs that allow applications to call the C HDF4 and HDF5 libraries from Java. The HDF-Java object package implements HDF data objects, e.g., Groups and Datasets, in an object-oriented form and makes it easy for applications to use the libraries. HDFView is a visual tool for browsing and editing HDF4 and HDF5 files.
This presentation will include recent work on supporting the HDF5 1.8 APIs and new features. As part of the HDF-NPOESS project, enhancements have been added to HDFView to support region references and quality flags. The presentation will show these features along with other new features added to HDFView since the HDF-Java 2.5 release.
The HDF Group provides support for NPP/NPOESS in a number of ways, including development and maintenance of software capabilities in HDF5 libraries and tools that help NPP/NPOESS data producers and users, software testing on platforms of importance to NPP/NPOESS, high quality rapid response user support for NPP/NPOESS, and performance of special projects. The purposes of this presentation are to apprise attendees of the areas of emphasis for FY 2010, and to solicit ideas and opinions that will help the project understand how best to use its resources in order to best serve the needs of NPP/NPOESS.
The HDF Group is in the process of updating the HDF-EOS web site. During the workshop, we would like to share with the audience some useful information from the new website that can help users gain easy access to NASA HDF and HDF-EOS data.
The presentation includes three parts:
EOS User Forum: introduces the EOS user forum and how users can benefit from it.
Tools: presents information on how to use several widely used tools to access NASA HDF and HDF-EOS data.
Examples: presents several examples of how to use C, Fortran, and IDL to access NASA HDF and HDF-EOS data.
This tutorial is designed for users with some HDF5 experience. It will cover advanced features of the HDF5 library that can be used to achieve better I/O performance and more efficient storage. The following HDF5 features will be discussed: partial I/O; compression and other filters, including the new n-bit and scale+offset filters; and data storage options. Significant time will be devoted to complex HDF5 datatypes such as strings, variable-length datatypes, array datatypes, and compound datatypes.
It will cover features of the HDF5 library for achieving better I/O performance and efficient storage. The following HDF5 features will be discussed: chunked storage layout.
This tutorial is for persons who are already familiar with HDF5 and wish to take advantage of some of its advanced features.
1. www.hdfgroup.org
The HDF Group
Introduction to HDF5
Barbara Jones
The HDF Group
The 13th HDF & HDF-EOS Workshop
November 3-5, 2009
HDF/HDF-EOS Workshop XIII
2. Before We Begin …
HDF-EOS Home Page: http://hdfeos.org/
Workshop Info:
http://hdfeos.org/workshops/ws13/workshop_thirteen.php
The HDF Group Page: http://hdfgroup.org/
HDF5 Home Page: http://hdfgroup.org/HDF5/
HDF Helpdesk: help@hdfgroup.org
HDF Mailing Lists: http://hdfgroup.org/services/support.html
3. HDF5 is the second HDF format
• Development started in 1996
• First release was in 1998
HDF4 is the first HDF format
• Originally called HDF
• Development started in 1987
• Still supported by The HDF Group
HDF = Hierarchical Data Format
5. HDF5 is designed …
• for high volume and/or complex data
• for every size and type of system (portable)
• for flexible, efficient storage and I/O
• to enable applications to evolve in their use of HDF5 and to accommodate new models
• to support long-term data preservation
7. HDF5 Technology
• HDF5 (Abstract) Data Model
- Defines the “building blocks” for data organization and specification
- Files, Groups, Datasets, Attributes, Datatypes, Dataspaces, …
• HDF5 Library (C, Fortran 90, C++ APIs)
- Also Java Language Interface and High Level Libraries
• HDF5 Binary File Format
- Bit-level organization of the HDF5 file
- Defined by the HDF5 File Format Specification
• Tools For Accessing Data in HDF5 Format
- h5dump, h5repack, HDFView, …
8. HDF5 Abstract Data Model
a.k.a. HDF5 Logical Data Model
a.k.a. HDF5 Data Model
9. HDF5 File
An HDF5 file is a container that holds data objects.

lat | lon | temp
----|-----|-----
12 | 23 | 3.1
15 | 24 | 4.2
17 | 21 | 3.6

Experiment Notes:
Serial Number: 99378920
Date: 3/13/09
Configuration: Standard 3
10. HDF5 Groups and Links
HDF5 groups and links organize data objects.

lat | lon | temp
----|-----|-----
12 | 23 | 3.1
15 | 24 | 4.2
17 | 21 | 3.6

Experiment Notes:
Serial Number: 99378920
Date: 3/13/09
Configuration: Standard 3

(Figure: a root group “/” with groups such as “SimOutViz” organizing these objects.)
11. HDF5 Objects
The two primary HDF5 objects are:
• HDF5 Group: a grouping structure containing zero or more HDF5 objects
• HDF5 Dataset: raw data elements, together with information that describes them
(There are other HDF5 objects that help support Groups and Datasets.)
12. HDF5 Groups
• Used to organize collections
• Every file starts with a root group “/”
• Similar to UNIX directories
• The path to an object defines it
• Objects can be shared: /A/k and /B/l are the same
(Figure: a tree with root “/” containing groups A, B, and C, and datasets k, l, and temp; legend: Group, Dataset.)
13. HDF5 Datasets
HDF5 Datasets organize and contain your “raw data values”. They consist of:
• Your raw data
• Metadata describing the data:
- The information to interpret the data (Datatype)
- The information to describe the logical layout of the data elements (Dataspace)
- Characteristics of the data (Properties)
- Additional optional information that describes the data (Attributes)
15. HDF5 Dataspaces
An HDF5 Dataspace describes the logical layout of the data elements:
• Array: multiple elements organized in a multi-dimensional (rectangular) array; the maximum number of elements in each dimension may be fixed or unlimited
• NULL: no elements in the dataset
• Scalar: a single element in the dataset
16. HDF5 Dataspaces
A dataspace plays two roles:
• It holds spatial information (logical layout) about a dataset stored in a file: the rank and dimensions, a permanent part of the dataset definition
• In partial I/O, it describes the application’s data buffer and the data elements participating in I/O
(Examples: Rank = 2, Dimensions = 4x6; Rank = 1, Dimension = 10)
17. HDF5 Datatypes
The HDF5 datatype describes how to interpret individual data elements. HDF5 datatypes include:
− integer, float, unsigned, bitfield, …
− user-definable (e.g., 13-bit integer)
− variable-length types (e.g., strings)
− references to objects/dataset regions
− enumerations (names mapped to integers)
− opaque
− compound (similar to C structs)
19. HDF5 Properties
• Properties (also known as Property Lists) are characteristics of HDF5 objects that can be modified
• Default properties handle most needs
• By changing properties, one can take advantage of the more powerful features in HDF5
20. Storage Properties
• Contiguous (default): data elements stored physically adjacent to each other
• Chunked: better access time for subsets; extensible
• Chunked & Compressed: improves storage efficiency and transmission speed
21. HDF5 Attributes (optional)
• An HDF5 attribute has a name and a value
• Attributes typically contain user metadata
• Attributes may be associated with:
- HDF5 groups
- HDF5 datasets
- HDF5 named datatypes
• An attribute’s value is described by a datatype and a dataspace
• Attributes are analogous to datasets except…
- they are NOT extensible
- they do NOT support compression or partial I/O
22. HDF5 Abstract Data Model Summary
• The Objects in the Data Model are the “building blocks” for data organization and specification
• Files, Groups, Links, Datasets, Datatypes, Dataspaces, Attributes, …
• Projects using HDF5 “map” their data concepts to these HDF5 Objects
24. HDF5 Software Layers & Storage
(Figure: the HDF5 software stack. At the top, applications use language interfaces (C, Fortran, C++, Java), High Level APIs, and tools such as h5dump, h5repack, and HDFView. These sit on the HDF5 Data Model: objects (Groups, Datasets, Attributes, …) with tunable properties (chunk size, I/O driver, …). The library internals provide memory management, datatype conversion, filters, chunked storage, version compatibility, and so on. A Virtual File Layer with POSIX I/O, split-file, MPI I/O, and custom drivers maps the HDF5 File Format onto storage: a single file, split files, a file on a parallel filesystem, or other I/O targets.)
25. HDF5 API and Applications
(Figure: applications such as a climate model or MATLAB, and domain libraries such as the HDF-EOS library, work with domain data objects through the HDF5 Library, which maps them onto storage.)
26. HDF5 Home Page
HDF5 home page: http://hdfgroup.org/HDF5/
• Two releases: HDF5 1.8 and HDF5 1.6
HDF5 source code:
• Written in C, and includes optional C++, Fortran 90 APIs,
and High Level APIs
• Contains command-line utilities (h5dump, h5repack,
h5diff, ..) and compile scripts
HDF pre-built binaries:
• When possible, binaries include the C, C++, F90, and High
Level libraries. Check the ./lib/libhdf5.settings file.
• Built with, and requiring, the SZIP and ZLIB external libraries
27. www.hdfgroup.org
Useful Tools For New Users
h5dump:
Tool to “dump” or display contents of HDF5 files
h5cc, h5c++, h5fc:
Scripts to compile applications
HDFView:
Java browser to view HDF4 and HDF5 files
http://www.hdfgroup.org/hdf-java-html/hdfview/
28. www.hdfgroup.org
h5dump Utility
h5dump [options] [file]
-H, --header Display header only – no data
-d <names> Display the specified dataset(s).
-g <names> Display the specified group(s) and
all members.
-p Display properties.
<names> is one or more appropriate object names.
30. www.hdfgroup.org
HDF5 Compile Scripts
• h5cc – HDF5 C compiler command
• h5fc – HDF5 F90 compiler command
• h5c++ – HDF5 C++ compiler command
To compile:
% h5cc h5prog.c
% h5fc h5prog.f90
31. www.hdfgroup.org
Compile option: -show
-show: displays the compiler commands and options
without executing them
% h5cc -show Sample_c.c
Will show the correct paths and libraries used by
the installed HDF5 library.
Will show the correct flags to specify when
building an application with that HDF5 library.
37. www.hdfgroup.org
Operations Supported by the API
• Create objects (groups, datasets, attributes, complex data
types, …)
• Assign storage and I/O properties to objects
• Perform complex subsetting during read/write
• Use a variety of I/O “devices” (parallel, remote, etc.)
• Transform data during I/O
• Make inquiries on file and object structure, content,
properties
38. www.hdfgroup.org
General Programming Paradigm
• Properties of an object are optionally defined
Creation properties
Access properties
• Object is opened or created
• Object is accessed, possibly many times
• Object is closed
39. www.hdfgroup.org
Order of Operations
• An order is imposed on operations by
argument dependencies
For example:
A file must be opened before a dataset can be
opened, because the dataset open call requires
a file handle as an argument.
• Objects can be closed in any order.
40. www.hdfgroup.org
The General HDF5 API
• Currently C, Fortran 90, Java, and C++
bindings.
• C routines begin with the prefix H5?, where ? is a
character corresponding to the type of object the
function acts on
Example Functions:
H5D : Dataset interface e.g., H5Dread
H5F : File interface e.g., H5Fopen
H5S : dataSpace interface e.g., H5Sclose
41. www.hdfgroup.org
HDF5 Defined Types
For portability, the HDF5 library has its own defined
types:
hid_t: object identifiers (native integer)
hsize_t: size used for dimensions (unsigned long or
unsigned long long)
herr_t: function return value
hvl_t: variable length datatype
For C, include hdf5.h in your HDF5 application.
42. www.hdfgroup.org
The HDF5 API
• For flexibility, the API is extensive
300+ functions
• This can be daunting… but there is hope
A few functions can do a lot
Start simple
Build up knowledge as more features are
needed
[Image: Victorinox Swiss Army Cybertool 34]
43. www.hdfgroup.org
Basic Functions
H5Fcreate (H5Fopen) create (open) File
H5Screate_simple/H5Screate create dataSpace
H5Dcreate (H5Dopen) create (open) Dataset
H5Dread, H5Dwrite access Dataset
H5Dclose close Dataset
H5Sclose close dataSpace
H5Fclose close File
NOTE: The order specified above is not required.
44. www.hdfgroup.org
Other Common Functions
DataSpaces: H5Sselect_hyperslab (Partial I/O)
H5Sselect_elements (Partial I/O)
H5Dget_space
Groups: H5Gcreate, H5Gopen, H5Gclose
Attributes: H5Acreate, H5Aopen_name,
H5Aclose, H5Aread, H5Awrite
Property lists: H5Pcreate, H5Pclose
H5Pset_chunk, H5Pset_deflate
45. www.hdfgroup.org
High Level APIs
• Included along with the HDF5 library
• Simplify steps for creating, writing, and
reading objects.
• Do not entirely ‘wrap’ HDF5 library
47. www.hdfgroup.org
Steps to Create a File
1. Decide on properties the file should have and
create them if necessary:
• Creation properties, like size of user block
• Access properties (improve performance)
• Use default properties (H5P_DEFAULT)
2. Create the file
3. Close the file and the property lists, as needed
48. www.hdfgroup.org
Code: Create a File
hid_t file_id;
herr_t status;
file_id = H5Fcreate("file.h5", H5F_ACC_TRUNC,
H5P_DEFAULT, H5P_DEFAULT);
status = H5Fclose (file_id);
Note: Return codes not checked for errors in code samples.
[Diagram: the resulting file contains only the root group “/”]
50. www.hdfgroup.org
Steps to Create a Dataset
1. Define dataset characteristics
a) Datatype – integer
b) Dataspace - 4x6
c) Properties if needed, or use H5P_DEFAULT
2. Decide where to put it
• Obtain location ID:
- Group ID puts it in a Group
- File ID puts it in Root Group
3. Create dataset in file
4. Close everything
[Diagram: dataset “A” under the root group “/”]
51. www.hdfgroup.org
HDF5 Pre-defined Datatype Identifiers
HDF5 defines* a set of Datatype Identifiers per HDF5
session.
For example:
C Type   HDF5 File Type                    HDF5 Memory Type
int      H5T_STD_I32BE / H5T_STD_I32LE     H5T_NATIVE_INT
float    H5T_IEEE_F32BE / H5T_IEEE_F32LE   H5T_NATIVE_FLOAT
double   H5T_IEEE_F64BE / H5T_IEEE_F64LE   H5T_NATIVE_DOUBLE
* Value of a datatype is NOT fixed
52. www.hdfgroup.org
Pre-defined File Datatype Identifiers
Examples:
H5T_IEEE_F64LE Eight-byte, little-endian, IEEE floating-point
H5T_STD_I32LE Four-byte, little-endian, signed two's
complement integer
NOTE: These are what you see in the file. The name is the same
everywhere and explicitly defines a datatype.
Name structure: H5T_<Architecture*>_<Programming Type>
*STD = “an architecture with a semi-standard type, like 2’s complement integer, unsigned integer, …”
53. www.hdfgroup.org
Pre-defined Native Datatypes
Examples of predefined native types in C:
H5T_NATIVE_INT (int)
H5T_NATIVE_FLOAT (float )
H5T_NATIVE_UINT (unsigned int)
H5T_NATIVE_LONG (long )
H5T_NATIVE_CHAR (char )
NOTE: Memory types.
Different for each machine.
Used for reading/writing.
54. www.hdfgroup.org
Storage Properties
[Diagram: dataset storage layouts]
• Contiguous (default): data elements stored physically adjacent to each other
• Chunked: better access time for subsets; extensible
• Chunked & Compressed: improves storage efficiency and transmission speed
55. www.hdfgroup.org
Code: Create a Dataset
hid_t   file_id, dataset_id, dataspace_id;
hsize_t dims[2];
herr_t  status;

file_id = H5Fcreate ("file.h5", H5F_ACC_TRUNC,
                     H5P_DEFAULT, H5P_DEFAULT);

/* Define a dataspace: rank 2, current dims 4x6 */
dims[0] = 4;
dims[1] = 6;
dataspace_id = H5Screate_simple (2, dims, NULL);

dataset_id = H5Dcreate (file_id, "A", H5T_STD_I32BE,
                        dataspace_id, H5P_DEFAULT,
                        H5P_DEFAULT, H5P_DEFAULT);

status = H5Dclose (dataset_id);
status = H5Sclose (dataspace_id);
status = H5Fclose (file_id);
56. www.hdfgroup.org
Code: Create a Dataset (annotated)
hid_t   file_id, dataset_id, dataspace_id;
hsize_t dims[2];
herr_t  status;

file_id = H5Fcreate ("file.h5", H5F_ACC_TRUNC,
                     H5P_DEFAULT, H5P_DEFAULT);
dims[0] = 4;                              /* size & shape */
dims[1] = 6;
dataspace_id = H5Screate_simple (2, dims, NULL);
dataset_id = H5Dcreate (file_id,          /* where to put it */
                        "A",
                        H5T_STD_I32BE,    /* datatype */
                        dataspace_id,
                        H5P_DEFAULT,      /* properties: link creation, */
                        H5P_DEFAULT,      /* dataset creation,          */
                        H5P_DEFAULT);     /* and access                 */
58. www.hdfgroup.org
Example Code - H5Dwrite
status = H5Dwrite (dataset_id,      /* dataset ID from H5Dcreate/H5Dopen */
                   H5T_NATIVE_INT,  /* memory datatype */
                   H5S_ALL, H5S_ALL, H5P_DEFAULT, wdata);
59. www.hdfgroup.org
Partial I/O
status = H5Dwrite (dataset_id, H5T_NATIVE_INT,
                   H5S_ALL,         /* memory dataspace      */
                   H5S_ALL,         /* file dataspace (disk) */
                   H5P_DEFAULT, wdata);
To modify a dataspace selection:
H5Sselect_hyperslab
H5Sselect_elements
60. www.hdfgroup.org
Example Code – H5Dwrite
status = H5Dwrite (dataset_id, H5T_NATIVE_INT,
                   H5S_ALL, H5S_ALL,
                   H5P_DEFAULT,     /* data transfer property list
                                       (MPI I/O, transformations, …) */
                   wdata);
61. www.hdfgroup.org
Example Code – H5Dread
status = H5Dread (dataset_id, H5T_NATIVE_INT,
H5S_ALL, H5S_ALL, H5P_DEFAULT, rdata);
62. www.hdfgroup.org
High Level APIs: HDF5 Lite (H5LT)
#include "hdf5_hl.h"
.
.
file_id = H5Fcreate ("file.h5", H5F_ACC_TRUNC,
                     H5P_DEFAULT, H5P_DEFAULT);
status = H5LTmake_dataset (file_id, "A", 2, dims,
                           H5T_STD_I32BE, data);
status = H5Fclose (file_id);
63. www.hdfgroup.org
Steps to Create a Group
1. Decide where to put it – “root group”
• Obtain location ID
2. Define properties or use H5P_DEFAULT
3. Create group in file.
4. Close the group.
65. www.hdfgroup.org
Code: Create a Group
hid_t  file_id, group_id;
herr_t status;
...
/* Open "file.h5" */
file_id = H5Fopen ("file.h5", H5F_ACC_RDWR,
                   H5P_DEFAULT);
/* Create group "/B" in file. */
group_id = H5Gcreate (file_id,"B", H5P_DEFAULT,
H5P_DEFAULT, H5P_DEFAULT);
/* Close group and file. */
status = H5Gclose (group_id);
status = H5Fclose (file_id);
66. www.hdfgroup.org
HDF5 Tutorial and Examples
HDF5 Tutorial:
http://www.hdfgroup.org/HDF5/Tutor/
HDF5 Example Code:
http://www.hdfgroup.org/ftp/HDF5/examples/examples-by-api/
68. www.hdfgroup.org
Acknowledgements
This work was supported by cooperative agreement
number NNX08AO77A from the National
Aeronautics and Space Administration (NASA).
Any opinions, findings, conclusions, or
recommendations expressed in this material are
those of the author[s] and do not necessarily reflect
the views of the National Aeronautics and Space
Administration.
A Data Array is an ordered collection of identically typed data items distinguished by their indices.
Metadata:
Dataspace – Rank, dimensions; spatial info about dataset
Datatype – Information on how to interpret your data
Storage Properties – How array is organized
Attributes – User-defined metadata (optional)
To create this file, we would start by creating the file itself. When you create a file, the root group is created with it, so every file has at least that one group.