The HDF format is the foundation for sharing data in many communities that have created domain-specific conventions on top of HDF. This presentation was given at the Winter Meeting of the Earth Science Information Partners (ESIP).
Resource Description Framework (RDF) has entered the metadata scene for libraries in a major way over the last few years. While the promise of its Linked Data capabilities is exciting, the realities of changing data models, encoding practices, and even ontologies can put a check on that excitement. This session will explore these issues and discuss when this is worth doing and how to go about doing it.
The NASA Earth Science Data and Information System (ESDIS) is migrating documentation for their data and products towards International Standards developed by ISO Technical Committee 211 (ISO/TC211). In order to do this effectively, NASA must understand and participate in the ISO process. This presentation was given at a NASA ISO Seminar during November 2012. It outlines the ISO standards process and describes some extensions to the ISO standards that are being proposed to address ESDIS requirements not addressed in the original standard.
New data access paradigms support a variety of human and machine access paths with data servers (THREDDS, https://www.unidata.ucar.edu/software/thredds/current/tds/ and Hyrax, http://opendap.org) that support multiple services for a given dataset. We need metadata that can describe those services and unambiguously differentiate between access paths for humans and for machines. The ISO 19115 metadata standard includes service metadata and allows data and services for that data to be described in the same record. I propose that we use the service metadata for machine access and the more traditional distribution information for human access. This talk was presented at the ESIP (esipfed.org) meeting during January 2014.
Wikis, Rubrics and Views: An Integrated Approach to Improving Documentation (Ted Habermann)
For many years scientists and data managers have focused on creating metadata that supports the discovery of available data. This is important, but once data sets are discovered, users need metadata that supports use and understanding of those data. This talk describes a system developed to support the required metadata improvements using wikis, rubrics, and metadata views. The wikis provide a mechanism for the community to record experiences and lessons learned and to provide high-quality examples. Rubrics provide a mechanism for consistent and clear quantitative evaluation of the completeness of metadata records, and the results displays include integrated links to the wiki. Views present the metadata with connections to the wiki, supporting on-going interactive learning. These tools can be used with metadata from any standard and can facilitate translation of the metadata between multiple standards.
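As a rough illustration of how a rubric can turn completeness into a number, the sketch below scores a metadata record (represented as a plain Python dict) against a hypothetical set of required and recommended fields; the field names and the scoring rule are invented for this example and are not the rubric described in the talk.

```python
# Hypothetical completeness rubric; the field names and tiers are invented
# for illustration and are not the rubric used in the talk.
RUBRIC = {
    "required":    ["title", "abstract", "spatial_extent", "temporal_extent"],
    "recommended": ["lineage", "contact", "use_constraints"],
}

def completeness_score(record):
    """Return the fraction of rubric fields that are present and non-empty."""
    fields = RUBRIC["required"] + RUBRIC["recommended"]
    present = sum(1 for field in fields if record.get(field))
    return present / len(fields)

record = {"title": "Sea Surface Temperature", "abstract": "Daily global SST", "contact": "help@example.org"}
print(f"completeness: {completeness_score(record):.0%}")  # 3 of 7 fields -> 43%
```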
For many years metadata development activities have focused on developing and sharing metadata for discovering data. This is important. Once data are discovered, metadata supporting use and understanding become important. Efforts to encourage scientists and data providers to create those metadata have had limited success. This talk describes some approaches and tools for supporting the organizational change efforts required to integrate use and understanding metadata into organizational cultures. These approaches are described in terms of the ideas presented in Switch: How to Change Things When Change is Hard.
Communities use many different dialects to document their data. We need to be able to translate between these dialects and to understand how much is lost in translation.
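To make the translation problem concrete, here is a minimal, purely hypothetical crosswalk between two documentation dialects; the element names are loosely modeled on real dialects but chosen for illustration, and the entry with no target shows the kind of content that gets lost in translation.

```python
# Hypothetical crosswalk between a source dialect and an ISO-like target dialect.
# A None target means the concept has no home in the other dialect and is lost.
CROSSWALK = {
    "Entry_Title":      "identificationInfo/citation/title",
    "Summary":          "identificationInfo/abstract",
    "Quality":          "dataQualityInfo/report",
    "Instrument_Notes": None,  # no equivalent element: lost in translation
}

def translate(record):
    translated, lost = {}, []
    for element, value in record.items():
        target = CROSSWALK.get(element)
        if target is None:
            lost.append(element)
        else:
            translated[target] = value
    return translated, lost

translated, lost = translate({"Entry_Title": "MODIS LST", "Instrument_Notes": "detector 5 noisy"})
print(translated)
print("lost in translation:", lost)
```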
NASA's Earth Observing System (EOS) archive includes data collected over many years by many satellite instruments. These data are stored in the HDF format, which includes data and metadata. The content of the metadata was examined for compliance with a set of conventions developed by the NASA science community at the beginning of the EOS Project (the HDF-EOS conventions). The initial results show that ~50% of the data files and 76% of the datasets have metadata that allows them to be used easily in standard tools. This talk was presented at the ESIP (esipfed.org) meeting during January 2014.
The HDF Product Designer – Interoperability in the First Mile (Ted Habermann)
Interoperable data have been a long-time goal in many scientific communities. The recent growth in analysis, visualization and mash-up applications that expect data stored in a standardized manner has brought the interoperability issue to the fore. On the other hand, producing interoperable data is often regarded as a sideline task in a typical research team, for which resources are not readily available. The HDF Group is developing a software tool aimed at lessening the burden of creating data in standards-compliant, interoperable HDF5 files. The tool, named HDF Product Designer, lowers the threshold needed to design such files by providing a user interface that combines the rich HDF5 feature set with applicable metadata conventions. Users can quickly devise new HDF5 files while at the same time seamlessly incorporating the latest best practices and conventions from their community. That is what the term interoperability in the first mile means: enabling generation of interoperable data in HDF5 files from the onset of their production. The tool also incorporates collaborative features, allowing a team approach to file design, as well as easy transfer of best practices as they are being developed. The current state of the tool and plans for future development will be presented. Constructive input from interested parties is always welcome.
Science platforms are made up of (at least) four planks: data formats, services, tools and conventions. I focus here on formats and conventions, specifically the HDF5 format, already used in many disciplines, and the Climate-Forecast and HDF-EOS conventions. Many science disciplines have already agreed on HDF as the preferred format for storing and sharing data. It is well established in high-performance computing and supports arbitrary grouping and annotation. Community conventions layered on top of the format are critical for making data useful. The Climate-Forecast (CF) conventions were created for relatively simple gridded data types, while the HDF-EOS conventions originally considered more complex data (swaths). Making simple conventions more complex makes adoption more difficult. Community input and the need for stable data processing systems must be balanced in the governance of conventions.
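As a small illustration of what a convention layered on top of the format looks like, the sketch below writes a few Climate-Forecast (CF) style attributes onto an HDF5 dataset with h5py. It is a minimal sketch, not a complete or validated CF encoding; coordinate variables, bounds, and most required attributes are omitted, and the file and variable names are placeholders.

```python
import h5py
import numpy as np

# Minimal sketch of CF-style attributes on an HDF5 dataset (not a complete,
# validated CF file; coordinates and many required attributes are omitted).
with h5py.File("sst_example.h5", "w") as f:
    dset = f.create_dataset("sea_surface_temperature",
                            data=np.full((180, 360), 288.0, dtype="f4"))
    dset.attrs["units"] = "K"
    dset.attrs["long_name"] = "sea surface temperature"
    dset.attrs["standard_name"] = "sea_surface_temperature"
    dset.attrs["_FillValue"] = np.float32(-9999.0)
    f.attrs["Conventions"] = "CF-1.6"
```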
We are interested in developing a standard method for writing ISO TC211-compliant metadata into HDF data files. This presentation shows some initial workflows for this using the HDF Product Designer.
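The presentation shows the actual workflows; purely to illustrate the underlying idea, the sketch below embeds an ISO/TC211 XML record inside an HDF5 file as a string dataset using h5py. The /METADATA/ISO_19115 location and the skeleton XML are assumptions made for this example, not the convention proposed in the presentation or produced by the HDF Product Designer.

```python
import h5py

# Illustrative only: one possible way to carry an ISO XML record inside an
# HDF5 file. The /METADATA/ISO_19115 path is an assumption for this sketch.
iso_xml = """<gmi:MI_Metadata xmlns:gmi="http://www.isotc211.org/2005/gmi">
  <!-- ISO/TC211 metadata content goes here -->
</gmi:MI_Metadata>"""

with h5py.File("product_with_iso.h5", "w") as f:
    metadata_group = f.create_group("METADATA")
    metadata_group.create_dataset("ISO_19115", data=iso_xml)  # scalar string dataset
```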
The ISO Metadata Standards include the capability to add citations to many kinds of external resources. This is very important for providing complete documentation required to understand and reproduce scientific results.
HDF Augmentation: Interoperability in the Last Mile (Ted Habermann)
Science data files are generally written to serve well-defined purposes for a small science team. In many cases, the organization of the data and the metadata is designed for custom tools developed and maintained by and for the team. Using these data outside of this context often involves restructuring, re-documenting, or reformatting the data. This expensive and time-consuming process usually prevents data reuse and thus decreases the total life-cycle value of the data considerably. If the data are unique or critically important to solving a particular problem, they can be modified into a more generally usable form or metadata can be added in order to enable reuse. This augmentation process can be done to enhance data for the intended purpose or for a new purpose, to make the data available to new tools and applications, to make the data more conventional or standard, or to simplify preservation of the data. The HDF Group has addressed augmentation needs in many ways: by adding extra information, by renaming objects or moving them around in the file, by reducing the complexity of the organization, and sometimes by hiding data objects that are not understood by specific applications. In some cases these approaches require re-writing the data into new files; in other cases the augmentation can be done externally, without affecting the original file. We will describe and compare several examples of each approach.
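A minimal sketch of the lightweight, in-place flavour of augmentation, assuming h5py: it adds missing documentation attributes and exposes an awkwardly named dataset under a more conventional path via a soft link, without rewriting the data. The object names are invented for the example; the augmentation work described in the talk covers many more cases.

```python
import h5py

# Sketch of in-place augmentation (object names are hypothetical).
# The original data are not rewritten; we only add annotation and an alias.
with h5py.File("legacy_product.h5", "r+") as f:
    dset = f["/Team/Internal_Name_07"]             # existing, team-specific dataset
    dset.attrs["units"] = "K"                      # add missing documentation
    dset.attrs["long_name"] = "brightness temperature"
    # Expose the dataset under a more conventional path without copying it.
    f["/brightness_temperature"] = h5py.SoftLink("/Team/Internal_Name_07")
```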
ISO Metadata Improvements - Questions and Answers (Ted Habermann)
The ISO standards for describing geospatial data, services, and other resources are changing. These slides describe a few of these changes in terms of documentation needs and how the new standards address those needs. I presented these slides at a recent webinar, which is available at https://www.youtube.com/watch?v=un-PtJLclIM&feature=youtu.be
Can ISO 19157 support current NASA data quality metadata? (Ted Habermann)
ISO 19157 provides a powerful framework for describing quality of Earth science datasets. As NASA migrates towards using that standard, it is important to understand whether and how existing data quality content fits into the ISO 19157 model. This talk demonstrates that fit and concludes that ISO 19157 can include all existing content and also includes new capabilities that can be very useful for all kinds of NASA data users.
This presentation is intended to familiarize current HDF users and those considering adopting HDF with the range of technical documentation that we provide.
We will discuss documents that are shipped with the product and those that are available from the HDF websites. We will touch on user documentation, supporting technical documents, RFCs and design documents, other papers and presentations, etc.
Exploration of a virtual product containing all the scripts and elements needed to spin up a working IIIF environment for scholars, instructors, and institutions alike.
EUDAT & OpenAIRE Webinar: How to write a Data Management Plan (EUDAT)
www.eudat.eu | 1st Session: July 7, 2016.
In this webinar, Sarah Jones (DCC) and Marjan Grootveld (DANS) talked through the aspects that Horizon 2020 requires from a DMP. They discussed examples from real DMPs and also touched upon the Software Management Plan, which for some projects can be a sensible addition.
This tutorial is designed for new HDF5 users. We will cover HDF5 abstractions such as datasets, groups, attributes, and datatypes. Simple C examples will cover the programming model and basic features of the API, and will give new users the knowledge they need to navigate through the rich collection of HDF5 interfaces. Participants will be guided through an interactive demonstration of the fundamentals of HDF5.
This tutorial is for new HDF5 users.
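The tutorial's own examples are in C; for orientation only, the same core abstractions (groups, datasets, attributes, and datatypes) look like this in a few lines of Python with h5py. This is a sketch, not part of the tutorial materials.

```python
import h5py
import numpy as np

# Groups, datasets, attributes, and datatypes in a few lines of h5py.
with h5py.File("tutorial_example.h5", "w") as f:
    grp = f.create_group("observations")                     # group
    data = np.arange(12, dtype="f8").reshape(3, 4)
    dset = grp.create_dataset("temperature", data=data)      # dataset with a datatype
    dset.attrs["units"] = "degrees_C"                        # attribute

with h5py.File("tutorial_example.h5", "r") as f:
    dset = f["observations/temperature"]
    print(dset.shape, dset.dtype, dset.attrs["units"])
    print(dset[1, :])   # read a single row
```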
A preponderance of data from NASA's Earth Observing System (EOS) is archived in the HDF Version 4 (HDF4) format. The long-term preservation of these data is critical for climate and other scientific studies going many decades into the future. HDF4 is very effective for working with the large and complex collection of EOS data products. Unfortunately, because of the complex internal byte layout of HDF4 files, future readability of HDF4 data depends on preserving a complex software library that can interpret that layout. Having a way to access HDF4 data independent of a library could improve its viability as an archive format, and consequently give confidence that HDF4 data will be readily accessible forever, even if the HDF4 library is gone.
To address the need to simplify long-term access to EOS data stored in HDF4, a collaborative project between The HDF Group and NASA Earth Science Data Centers is implementing an approach to accessing data in HDF4 files based on the use of independent maps that describe the data in HDF4 files and tools that can use these maps to recover data from those files. With this approach, relatively simple programs will be able to extract the data from an HDF4 file, bypassing the need for the HDF4 library.
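To make the idea concrete, here is a toy sketch of what reading "by map" could look like for the simplest case: a contiguous, uncompressed dataset whose byte offset, datatype, and shape are recorded in a map. The map fields and values below are hypothetical; the project's actual XML mapping language is richer and also covers chunked, compressed, and linked-block layouts.

```python
import numpy as np

# Hypothetical map entry for one contiguous, uncompressed dataset; the real
# project records this kind of information in an XML layout map.
map_entry = {"offset": 2948, "dtype": ">f4", "shape": (180, 360)}

def read_mapped_dataset(path, entry):
    """Read a contiguous dataset from an HDF4 file using only its map entry."""
    nbytes = int(np.prod(entry["shape"])) * np.dtype(entry["dtype"]).itemsize
    with open(path, "rb") as f:
        f.seek(entry["offset"])
        raw = f.read(nbytes)
    return np.frombuffer(raw, dtype=entry["dtype"]).reshape(entry["shape"])

# data = read_mapped_dataset("legacy_granule.hdf", map_entry)
```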
A demonstration project has shown that this approach is feasible. This involved an assessment of NASA's HDF4 data holdings, and development of a prototype XML-based layout mapping language and tools to read layout maps and read HDF4 files using layout maps. Future plans call for a second phase of the project, in which the mapping tools and XML schema are made production-quality, the mapping schema are integrated with existing XML metadata files in several data centers, and outreach activities are carried out to encourage and facilitate acceptance of the technology.
Advances in web technologies have made it possible to democratize the production of open, reusable, remixable textbooks without sacrificing quality. The panelists will actively demonstrate three advances made possible by new web technologies:
1. User-friendly authoring tools that make it easy to produce and adapt remixable open textbooks.
2. An innovative production pipeline that enables beautiful and engaging textbook content to be distributed seamlessly to any student on any device in many formats.
3. New interactive content visualizations that enable students to interact with their books, explore rich data sets without downloading specialized tools, and view beautiful figures in printed media without additional work. The panel will explore examples from Connexions, Siyavula, Booktype, Quadbase, and FullMarks.
OSFair2017 Training | FAIR metrics - Starring your data sets (Open Science Fair)
Peter Doorn, Marjan Grootveld & Elly Dijk talk about FAIR data principles and present the assessment tool that DANS is developing for data repositories | OSFair2017 Workshop
Workshop title: FAIR metrics - Starring your data sets
Workshop overview:
Do you want to join our effort to put the FAIR data principles into practice? Come and explore the assessment tool that DANS, Data Archiving and Networked Services in the Netherlands, is developing for data repositories.
The aim of our work is to implement the FAIR principles into a data assessment tool so that every dataset which is deposited or reused from any digital repository can be assessed in terms of a score on the principles Findable, Accessible, Interoperable, and Reusable, using a ‘FAIRness’ scale from 1 to 5 stars. In this interactive session participants can explore the pilot version of FAIRdat: the FAIR data assessment tool. The organisers would like to inform you about the project, and look forward to all feedback to improve the tool, or to improve the metrics that are used.
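By way of illustration only, reducing the four principles to a 1-to-5 star score could look something like the toy sketch below; the individual checks and the averaging rule are invented for this example and are not how the FAIRdat tool actually computes its scores.

```python
# Toy illustration of a 1-5 star "FAIRness" score (not the FAIRdat algorithm).
def fair_stars(dataset):
    scores = {
        "findable":      5 if dataset.get("persistent_id") else 2,
        "accessible":    5 if dataset.get("open_access") else 3,
        "interoperable": 5 if dataset.get("standard_format") else 2,
        "reusable":      5 if dataset.get("license") else 1,
    }
    overall = round(sum(scores.values()) / len(scores))
    return scores, overall

scores, overall = fair_stars({"persistent_id": "doi-placeholder", "license": "CC-BY"})
print(scores, "overall:", overall, "stars")
```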
DAY 3 - PARALLEL SESSION 7
2. Python and HDF
From the Introduction: "As a graduate student I didn’t care that much about the details of how my data were stored… As a scientist, I eventually came to recognize that the choices we make for organizing and storing our data are also choices about communication. Not only do standard, well-designed formats make life easier for individuals, but they make it possible to share data with a global audience."
New book on Python and HDF by Andrew Collette. The quote from the Introduction indicates that Andrew understands the important role of standard formats in data sharing.
Andrew works at the Colorado Center for Lunar Dust and Atmospheric Studies (http://lasp.colorado.edu/home/ccldas/). This large NASA facility is used by scientists at NASA and around the world to study very high-speed dust impacts on instrument windows and other objects. Scientists take their observations home in HDF5 files.
The Colorado Center for Lunar Dust and Atmospheric Studies (http://lasp.colorado.edu/home/ccldas/) has #HDFInside.
The NASA Earth Observing System Data and Information System (https://earthdata.nasa.gov) has #HDFInside.
netCDF4 (http://www.unidata.ucar.edu/software/netcdf/) has #HDFInside
The Bathymetry Attributed Grid (BAG), developed by the Open Navigation Surface project (http://www.opennavsurf.org) and used by the National Ocean Service (http://www.ngdc.noaa.gov/mgg/bathymetry/hydro.html), has #HDFInside.
The NeXus Scientific Format (http://www.nexusformat.org) has #HDFInside.
HDF5-FastQuery, developed by the Visualization Group at Lawrence Berkeley National Lab (http://www-vis.lbl.gov/Events/SC05/HDF5FastQuery/), has #HDFInside.
h5py (http://www.h5py.org), developed at the Colorado Center for Lunar Dust and Atmospheric Studies (http://lasp.colorado.edu/home/ccldas/), and PyTables (http://www.pytables.org/moin) have #HDFInside (a short h5py example follows this list).
GLOBE Claritas (http://www.globeclaritas.com) seismic data processing software and h5vc software for scalable nucleotide tallies (http://www.bioconductor.org/packages/2.14/bioc/html/h5vc.html) have #HDFInside.
The KAIRA (Kilpisjärvi Atmospheric Imaging Receiver Array) dual array of omni-directional VHF radio antennas, a project of the Sodankylä Geophysical Observatory, and the Low Frequency Array (LOFAR, http://www.lofar.org) have #HDFInside.
MathWorks (MATLAB, http://www.mathworks.com) and Exelis (IDL, ENVI, http://www.exelisinc.com/Pages/default.aspx) have #HDFInside.
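Since h5py appears in the list above, here is a short example of what "Python and HDF" looks like in practice when reading a file; the file and dataset names are placeholders.

```python
import h5py

# Open an HDF5 file and read one dataset with h5py (names are placeholders).
with h5py.File("example.h5", "r") as f:
    print(list(f.keys()))              # top-level groups and datasets
    data = f["some_dataset"][...]      # read the whole dataset into memory
    print(data.shape, data.dtype)
```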