This tutorial was presented at a Boston KNIME User Meetup in 2014 and offers a crash course in KNIME, text processing, text mining, and topic classification.
The tool takes HDF-EOS 5 data as input, and generates COARDS-compatible output - if the input file has enough metadata to be COARDS-compliant, the output file will be COARDS-compliant. The tool is written in portable C, and ought to run on any platform where the HDF-EOS and netCDF libraries are available.
This year, we have made two major enhancements to the converter:
It now automatically detects whether its input is HDF-EOS2 or HDF-EOS5 format, and handles either one. The previous tool worked with HDF-EOS5 only.
Its netCDF output attempts to conform to the new CF conventions (a superset of the COARDS conventions). This is primarily an improvement in its translation of Swath datasets, which CF handles much better than COARDS.
netCDF-LD - Towards linked data conventions for delivery of environmental dat... by Jonathan Yu
This presentation was given at the ISESS'15 conference in Melbourne.
Abstract. netCDF is a well-known and widely used format to exchange array-oriented scientific data such as grids and time-series. We describe a new convention for encoding netCDF based on Linked Data principles called netCDF-LD. netCDF-LD allows metadata elements, given as string values in current netCDF files, to be given as Linked Data objects. netCDF-LD allows precise semantics to be used for elements and expands the type options beyond lists of controlled terms. Using Uniform Resource Identifiers (URIs) for elements allows them to refer to other Linked Data resources for their type and descriptions. This enables improved data discovery through a generic mechanism for element type identification and adds element type expandability to new Linked Data resources as they become available. By following patterns already established for extending existing formats, netCDF-LD applications can take advantage of existing software for processing Linked Data and supporting more effective data discovery and integration across systems.
See http://link.springer.com/chapter/10.1007%2F978-3-319-15994-2_9
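To make the prefix-expansion idea in the abstract concrete, here is a minimal sketch in Python. This is not the official netCDF-LD API; the prefix map and vocabulary URIs below are assumptions invented for illustration. The pattern mirrors JSON-LD contexts: an attribute value written as `prefix:term` expands to a full Linked Data URI, while plain strings pass through unchanged.

```python
# Illustrative sketch (not the official netCDF-LD tooling): expanding
# prefixed attribute values into full Linked Data URIs, following the
# context pattern netCDF-LD borrows from JSON-LD.

# A hypothetical prefix map, as might be declared via a file's prefix
# attributes; these URIs are placeholders, not real vocabulary endpoints.
PREFIXES = {
    "cf": "http://example.org/cf/standard_name/",
    "unit": "http://example.org/unit/",
}

def expand(value: str, prefixes: dict) -> str:
    """Expand 'prefix:term' to a full URI; pass other strings through."""
    prefix, sep, term = value.partition(":")
    if sep and prefix in prefixes:
        return prefixes[prefix] + term
    return value

# A plain string attribute stays a string; a prefixed one becomes a URI.
print(expand("sea_surface_temperature", PREFIXES))
print(expand("cf:sea_surface_temperature", PREFIXES))
```

The benefit described in the abstract follows directly: once the value is a URI, generic Linked Data software can dereference it for the element's type and description.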
The eReefs Data Brokering Layer (DBL) is a mediating web service that provides integrated discovery of, and access to, a range of known data services. It extends the idea of web catalogues and uses semantic web technologies (RDF, OWL, SPARQL, JSON-LD) to provide a linked data view of data providers, web services, and dataset metadata.
The DBL aims to support the broader eReefs information architecture. It allows flexible 'plug-n-play' data provider nodes to participate in the initiative. The DBL also provides client applications and end users with enhanced discovery of the underlying datasets that is traceable, because the details are captured as linked data.
This is an introductory slide deck on accessing NASA HDF/HDF-EOS data for beginners. NASA distributes many Earth Science datasets in the HDF/HDF-EOS file format, and new users struggle to understand the format and use the data properly. This brief presentation will help new users understand the basic concepts of HDF/HDF-EOS and learn about the available tools that make the NASA data easy to access.
In this presentation, we will give an update on the HDF OPeNDAP project. We will describe the new features in the HDF5 OPeNDAP data handler. We will also introduce the enhanced HDF4 OPeNDAP data handler and demonstrate how it can help users view and analyze remote HDF-EOS2 data. A demo that uses OPeNDAP client tools to handle AIRS and MODIS Grid/Swath data with the enhanced handler will be presented.
The HDF Group provides support for NPP/NPOESS in a number of ways, including development and maintenance of software capabilities in HDF5 libraries and tools that help NPP/NPOESS data producers and users, software testing on platforms of importance to NPP/NPOESS, high quality rapid response user support for NPP/NPOESS, and performance of special projects. The purposes of this presentation are to apprise attendees of the areas of emphasis for FY 2010, and to solicit ideas and opinions that will help the project understand how best to use its resources in order to best serve the needs of NPP/NPOESS.
An update on HDF, including a status report on The HDF Group, an overview of recent changes to the HDF4 and HDF5 libraries and tools, plans for future releases, HDF Group projects and collaborations, and future plans.
The HDF-Java products include three components: HDF4 and HDF5 Java wrappers, HDF-Java object package, and HDFView. The Java wrappers provide standard Java APIs that allow applications to call the C HDF4 and HDF5 libraries from Java. The HDF-Java object package implements HDF data objects, e.g. Groups and Datasets, in an object-oriented form and makes it easy for applications to use the libraries. The HDFView is a visual tool for browsing and editing HDF4 and HDF5 files.
This presentation will include recent work on supporting HDF5 1.8 APIs and new features. As part of the HDF-NPOESS project, some enhancements have been added to HDFView to support region references and quality flags. The presentation will show these features along with other new features added to HDFView since HDF-Java 2.5 release.
A preponderance of data from NASA's Earth Observing System (EOS) is archived in the HDF Version 4 (HDF4) format. The long-term preservation of these data is critical for climate and other scientific studies going many decades into the future. HDF4 is very effective for working with the large and complex collection of EOS data products. Unfortunately, because of the complex internal byte layout of HDF4 files, future readability of HDF4 data depends on preserving a complex software library that can interpret that layout. Having a way to access HDF4 data independent of a library could improve its viability as an archive format, and consequently give confidence that HDF4 data will be readily accessible forever, even if the HDF4 library is gone.
To address the need to simplify long-term access to EOS data stored in HDF4, a collaborative project between The HDF Group and NASA Earth Science Data Centers is implementing an approach to accessing data in HDF4 files based on the use of independent maps that describe the data in HDF4 files and tools that can use these maps to recover data from those files. With this approach, relatively simple programs will be able to extract the data from an HDF4 file, bypassing the need for the HDF4 library.
A demonstration project has shown that this approach is feasible. This involved an assessment of NASA's HDF4 data holdings, and development of a prototype XML-based layout mapping language, along with tools to read layout maps and to read HDF4 files using them. Future plans call for a second phase of the project, in which the mapping tools and XML schema are made production quality, the mapping schema is integrated with existing XML metadata files in several data centers, and outreach activities are carried out to encourage and facilitate acceptance of the technology.
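As a rough illustration of the map-based recovery idea described above (this is not the project's actual schema or tooling; the XML element names, offsets, and toy data are invented for the sketch), a simple reader can recover a dataset from raw bytes using only a layout map, with no HDF4 library involved:

```python
# Minimal sketch of library-free recovery driven by a layout map.
# The map records where each dataset's bytes live; the reader just
# seeks to the offset and decodes. Element/attribute names are
# hypothetical, not the production mapping schema.
import struct
import xml.etree.ElementTree as ET

# A toy "layout map": one dataset stored as 4 big-endian 32-bit
# integers starting at byte offset 8.
MAP_XML = """
<fileMap>
  <dataset name="temperature" offset="8" count="4" type="int32be"/>
</fileMap>
"""

def read_mapped(data: bytes, map_xml: str):
    """Recover datasets from raw file bytes using only the layout map."""
    results = {}
    for ds in ET.fromstring(map_xml).iter("dataset"):
        offset = int(ds.get("offset"))
        count = int(ds.get("count"))
        # Only one toy type is handled here; a real reader would
        # dispatch on the full set of HDF4 number types.
        values = struct.unpack_from(">%di" % count, data, offset)
        results[ds.get("name")] = list(values)
    return results

# Fake "file": 8 junk header bytes, then the four values 1..4.
raw = b"HDRJUNK!" + struct.pack(">4i", 1, 2, 3, 4)
print(read_mapped(raw, MAP_XML))  # {'temperature': [1, 2, 3, 4]}
```

The point of the sketch is the one made in the text: a relatively simple program plus a map suffices to extract the data, even if the HDF4 library itself is gone.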
In this talk, we will give an update on the HDF5 OPeNDAP project. We will describe the new features in the OPeNDAP HDF5 data handler. We will also introduce a new HDF5-Friendly OPeNDAP client library and demonstrate how it can help users view and analyze remote HDF-EOS5 data served by the OPeNDAP HDF5 handler. A demo will be presented with a customized OPeNDAP visualization client (GrADS) that uses the library.
Accessibility and usability of NPP/NPOESS data in HDF5 can be enhanced by providing tools that simplify and standardize how data is accessed and presented. In this project, The HDF Group is creating such tools in the form of software to read and write certain key data types and data aggregates used in NPP/NPOESS data products, and extending HDFView to extract, present and export these data effectively. In particular, the work will focus on NPP/NPOESS use of HDF5 region references and quality flags. The HDF Group will also provide high quality user support for the project.
The HDF Group is in the process of updating the HDF-EOS web site. During the workshop, we would like to share useful information from the new site that helps users access NASA HDF and HDF-EOS data easily.
The presentation includes three parts:
EOS User Forum: introduces the EOS user forum and how users can benefit from it.
Tools: presents information on how to use several widely used tools to access NASA HDF and HDF-EOS data.
Examples: presents several examples of using C, Fortran, and IDL to access NASA HDF and HDF-EOS data.
An update on HDF, including a status report on the HDF Group, an overview of recent changes to the HDF4 and HDF5 libraries and tools, plans for future releases, HDF Group projects and collaborations, and future plans.
Update on HDF, including recent changes to the software, upcoming releases, collaborations, future plans. Will include an overview of the upcoming HDF5 1.8 release, and updates on the netCDF4/HDF5 merge, HDF5 support for indexing, BioHDF, the HDF5-Storage Resource Broker project, and the HDF spin-off THG.
PHP Frameworks: I want to break free (IPC Berlin 2024) by Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk encourages a more independent use of PHP frameworks, moving towards more flexible and future-proof PHP development.
SAP Sapphire 2024 - ASUG301 Building better apps with SAP Fiori by Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Enhancing Performance with Globus and the Science DMZ by Globus
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... by UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
The Metaverse and AI: how can decision-makers harness the Metaverse for their... by Jen Stirrup
The Metaverse is popularized in science fiction, and now it is becoming closer to being a part of our daily lives through the use of social media and shopping companies. How can businesses survive in a world where Artificial Intelligence is becoming the present as well as the future of technology, and how does the Metaverse fit into business strategy when futurist ideas are developing into reality at accelerated rates? How do we do this when our data isn't up to scratch? How can we move towards success with our data so we are set up for the Metaverse when it arrives?
How can you help your company evolve, adapt, and succeed using Artificial Intelligence and the Metaverse to stay ahead of the competition? What are the potential issues, complications, and benefits that these technologies could bring to us and our organizations? In this session, Jen Stirrup will explain how to start thinking about these technologies as an organisation.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... by James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or CI/CD process) involves many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has left gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, continuous software delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Generative AI Deep Dive: Advancing from Proof of Concept to Production by Aggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex Proofs by Alex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
Epistemic Interaction - tuning interfaces to provide information for AI support by Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe by Paige Cruz
Monitoring and observability aren't traditionally found in software curricula; many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silos continue to crumble, many organizations still relegate monitoring and observability to ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
UiPath Test Automation using UiPath Test Suite series, part 4 by DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... by SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
ESIP Summer Meeting, July 8 – 11, 2014 (www.hdfgroup.org)

Why OPeNDAP?
• Check metadata remotely (in various forms)
• Obtain a subset of the data easily and efficiently
• Hide the original data sources
  - netCDF, HDF4, HDF5, GeoTIFF, GRIB
• Many popular earth science tools can visualize and analyze the data via OPeNDAP
• OPeNDAP output (including subsets) can be downloaded in other formats
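As a hedged sketch of the subsetting point above (the server URL and variable name are invented; only the standard DAP2 hyperslab syntax `[start:stop]` per dimension is assumed), an OPeNDAP subset request is simply a constraint expression appended to the dataset URL:

```python
# Sketch of building a DAP2 subset request: the client asks the server
# for a slice of a variable by appending a constraint expression.
# The base URL and variable name below are hypothetical.

def dap2_subset_url(base_url: str, var: str, slices) -> str:
    """Build a DAP2 ASCII-response URL selecting var[start:stop] per dimension."""
    constraint = var + "".join("[%d:%d]" % (start, stop) for start, stop in slices)
    return base_url + ".ascii?" + constraint

url = dap2_subset_url(
    "http://example.org/opendap/airs_grid.hdf",
    "Temperature",
    [(0, 9), (100, 199)],   # first 10 rows, columns 100 through 199
)
print(url)
# http://example.org/opendap/airs_grid.hdf.ascii?Temperature[0:9][100:199]
```

This is why subsetting is "easy and efficient" from the client's point of view: the server extracts only the requested hyperslab, so only that slice crosses the network.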
What makes HDF OPeNDAP support special?
• The visualization of HDF(5) data by OPeNDAP
• What users request the most
HDF4 handler update
• Add TRMM 7 support
  - Level 1 and level 2 swath
  - Level 3 grid
• Improve the performance of handling AIRS version 6 grid and MOD08_M3 products
• Improve the performance of handling MODIS products with non-CF scale and offset rules
• Reduce the number of file open/close calls when using the file netCDF module to access HDF data
• Map HDF4 char arrays to DAP String
MOD08_M3 and AIRS version 6 performance
• Disable the generation of StructMetadata as a DAP attribute for the CF option
• Use the special features in these products to build the DDS and DAS efficiently
HDF5 handler update
• Support GPM level-1 products
• Make the products follow CF
• Reduce the number of file open/close calls when using the file netCDF module to access HDF data
Hyrax file-netCDF module
• Can work with the HDF handlers to convert HDF files to netCDF-3 or netCDF-4 classic files that follow the CF conventions
• We also help debug and provide fixes for this module to help NASA
File netCDF module demo
• Can use the besstandalone program
• End users can install Hyrax and use this program to convert HDF4 and HDF5 files to netCDF-3 or netCDF-4
More challenges
• CF conventions evolve, and tools evolve
• New versions of existing HDF products and new HDF products may require significant updates
• The data aggregation service requires decent performance of the data service per file
The HDF Group Earth Science Group
Ted Habermann
Aleksandar Jelenak
H. Joe Lee
Joel Plutchak
John Readey
Kent Yang
Editor's Notes
MOD08_M3, AIRS version 6 grid
Use ncdump to dump the file
The Earth Science Group is a subgroup within The HDF Group formed to concentrate on issues like these. We're excited by the opportunity to join the conversation and help shape the emerging landscape of Earth Science software, data and metadata conventions, and their uses in current and upcoming missions.