This whitepaper from CDG provides a high-level overview of the background and benefits of the S1000D publications standard. This standard has been widely used in Europe for many years, and is now gaining momentum in the US as well.
In this webinar Thomas Cook, Sales Director, AnzoGraph DB, uses real-world flight data to discuss RDF and its newer property-graph-functionality iteration, RDF*, wrapping up with a pair of real-world demonstrations via Zeppelin notebooks.
ELVIS asset and operation management - ELVIS event, Marcus Stenstrand - Fingrid Oyj
In 2008, Fingrid launched a project, the aim of which was to build a new information system that supported asset and operation management and was based on product-based solutions. This new data system that supports asset and operation management goes by the name of ELVIS (an acronym for ELectricity Verkko Information System).
GDPR compliance application architecture and implementation using Hadoop and ... - DataWorks Summit
The General Data Protection Regulation (GDPR) is legislation designed to protect the personal data of European Union citizens and residents. The main requirement is to log personal data accesses/changes in customer-specific applications. These logs can then be audited by owning entities to provide reporting to end users indicating usage of their personal data. Users have the "right to be forgotten," meaning their personal data can be purged from the system at their request. The regulation goes into effect on May 25, 2018, with significant fines for non-compliance.
This session will provide insight on how to approach and implement a GDPR compliance solution using Hadoop and streaming for any enterprise with heavy volumes of data. This session will delve into deployment strategies, the architecture of choice (Kafka, NiFi, and Hive ACID with streaming), implementation best practices, configurations, and security requirements. Hortonworks Professional Services System Architects helped the customer on the ground to design, implement, and deploy this application in production.
Speaker
Saurabh Mishra, Hortonworks, Systems Architect
Arun Thangamani, Hortonworks, Systems Architect
Metadata management is critical for organizations looking to understand the context, definition and lineage of key data assets. Data models play a key role in metadata management, as many of the key structural and business definitions are stored within the models themselves. Can data models replace traditional metadata solutions? Or should they integrate with larger metadata management tools & initiatives?
Join this webinar to discuss opportunities and challenges around:
How data modeling fits within a larger metadata management landscape
When can data modeling provide “just enough” metadata management
Key data modeling artifacts for metadata
Organization, Roles & Implementation Considerations
These slides accompany a 1.5 hour webinar sponsored by the Western New York Library Resources Council, presented by Dan Gillean of Artefactual Systems on February 15th, 2017.
The session was intended to introduce participants to some of the key standards, services, and tools available to support digital preservation planning and activities. Part 1 focused on DP101, and how to begin tackling digital preservation in your institution. Part 2 introduced the Archivematica project's history, philosophy, and aims, while Part 3 was a live demonstration of Archivematica in action.
Thank you to WNYLRC for sponsoring this event!
Some background and thoughts on Metadata Mapping and Metadata Crosswalks. A collection of online sources and related projects. Comments are more than welcome, as is reuse!
Presentation by Antonio Dias de Figueiredo at the Workshop on Philosophy and Engineering, Royal Academy of Engineering, London, November 10-12, 2008. These slides are complemented by the text with the same title available at SlideShare.
Petrophysics and Big Data by Elephant Scale training and consulting - elephantscale
Presented at the annual petrophysics software (SPWLA) show in Houston, TX, by Mark Kerzner. How Oil & Gas should approach Big Data, and how Elephant Scale can help in training and implementation.
A short presentation about wireline logs, showing their function and the technology they use.
Ruhr-Universität Bochum, Petroleum Geology II, Winter Semester 2013/2014.
WELL LOG: Types of Logs, The Bore Hole Image, Interpreting Geophysical Well Logs, Applications, Production Logs, Well Log Classification and Cataloging
Summary of Accelerate - 2019 State of DevOps report by Google Cloud's DORA - Ragavendra Prasath
The detailed 82-page report is abridged to five pages. Access the DORA report here: https://services.google.com/fh/files/misc/state-of-devops-2019.pdf
Inspiration and Courtesy to the authors.
Data Governance for the Cloud with Oracle DRM - US-Analytics
Ready to move away from “hope, email and spreadsheets” as a strategy for maintaining system alignments? There’s a better way. Find out how to bring people, processes, and technology together for control over ever-changing enterprise reporting hierarchies and data.
Make A Stress Free Move To The Cloud: Application Modernization and Managemen... - Dell World
Delivering IT services that keep the business running from day to day is always challenging. Delivering these services while simultaneously moving your IT infrastructure to the cloud can be almost impossible without the right tools and support. Attend this session to hear directly from leaders at Dell who specialize in application management and learn how Dell migration tools and services accelerate your move to the cloud while maintaining the high quality access to web and mobile services that your users demand.
Rapidly Enable Tangible Business Value through Data Virtualization - Denodo
Watch full webinar here: https://bit.ly/3EEU2vK
Uber, the world’s largest taxi company, owns no fleet; Airbnb, the largest accommodation provider, owns no real estate. The extraordinary way these companies grew fast, globally and with little investment, was with thin layers on top of a complex system of others’ goods or services, while owning the customer interface. In digital transformation, data minimization is sometimes very useful for delivering business value rapidly without physical data redundancy, especially for seamless data migration from OLTP, OLAP, and legacy platforms, giving quick access to data domains/data products for incremental value until the desired architecture/data estate evolves. To achieve this, data virtualization logically allows an application to retrieve and manipulate data without requiring technical details about the data, such as how it is formatted at source or where it is physically located, and can provide a single customer view of the overall data. When implementing a next-gen solution leveraging DV, there is a set of key considerations and caveats, requiring a focused long-term strategy, a target-state architecture, and clear use cases.
Watch full webinar here: https://bit.ly/2N1Ndz9
How is a logical data fabric different from a physical data fabric? What are the advantages of one type of fabric over the other? Attend this session to firm up your understanding of a logical data fabric.
Data Summit Connect Fall 2020 - Rise of DataOps - Ryan Gross
Data governance teams attempt to apply manual control at various points for consistency and quality of the data. By thinking of our machine learning data pipelines as compilers that convert data into executable functions and leveraging data version control, data governance and engineering teams can engineer the data together, filing bugs against data versions, applying quality control checks to the data compilers, and other activities. This talk illustrates how innovations are poised to drive process and cultural changes to data governance, leading to order-of-magnitude improvements.
ATAGTR2017 Performance Testing and Non-Functional Testing Strategy for Big Da... - Agile Testing Alliance
The presentation on Performance Testing and Non-Functional Testing Strategy for Big Data Applications was done during #ATAGTR2017, one of the largest global testing conferences. All copyright belongs to the author.
Author and presenter : Abhinav Gupta
GlobalSoft is an MDM-focused software consultancy, specializing in Informatica MDM. GlobalSoft has been a long-term strategic partner of Informatica since the days of Siperian, providing project delivery and training services, as well as support and engineering services from our US and India offices. Today, GlobalSoft has leveraged its deep product knowledge, gained over the past 8 years and more than 40 MDM projects, to become the preeminent service provider for Informatica MDM, and has used this knowledge to develop and offer specialized services and products for MDM. GlobalSoft, headquartered in San Jose, CA, maintains expert staff in the US and in India and is capable of managing and delivering projects or augmenting existing project teams.
How to add security in DataOps and DevOps - Ulf Mattsson
The emerging DataOps is not Just DevOps for Data. According to Gartner, DataOps is a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and consumers across an organization.
The goal of DataOps is to create predictable delivery and change management of data, data models and related artifacts. DataOps uses technology to automate data delivery with the appropriate levels of security, quality and metadata to improve the use and value of data in a dynamic environment.
This session will discuss how to add Security in DataOps and DevOps.
IOD session 3423 - Analytics patterns of expertise, the fast path to amazing ... - Rachel Bland
Session content from IBM Information On Demand 2013 provides an overview of the IBM Business Intelligence Pattern with BLU Acceleration and explains the underlying technology employed to deliver high speed analysis more quickly and easily than ever before.
From Chaos to Compliance: The New Digital Governance for DevOpsXebiaLabs
DevOps and related trends (cloud-native, digital transformation, etc.) are unquestionably mainstream, but they still come with difficulties. Many organizations are struggling with outdated governance models that slow down digital innovation, while not effectively reducing risk. Plan/build/run, stage-gated checklists, and approval boards are losing favor, but what will replace them? Risk management is still critical.
Special guest Charles Betz, Forrester Principal Analyst, joined Dan Beauregard, VP, Cloud & DevOps Evangelist at XebiaLabs, to discuss:
• The role of an integrated, end-to-end release pipeline in ensuring auditability and standards compliance
• The evolution and automation of change and release management and the decline of the Change Approval Board
• Chaos and resilience engineering as the basis for a new governance model
Presentation on Data Mesh. The paradigm shift is a new type of ecosystem architecture: a shift left towards a modern distributed architecture that allows domain-specific data, views “data-as-a-product,” and enables each domain to handle its own data pipelines.
Salesforce Platform: Governance and the Social Enterprise - James Hindes
The road to the Social Enterprise is transformative, but IT departments need to ensure a smooth transition. Join us to learn first-hand from customers who have made the journey how they defined and managed the change with a well-crafted governance strategy.
Detail on the WITSML to PPDM mapping project, a joint initiative between the PPDM Association and Energistics to standardise the movement of E&P data in the Oil & Gas industry. We outline the project and place it in the context of a data management approach to E&P data.
An example of a successful proof of concept - ETLSolutions
In this presentation we explain how to create a successful proof of concept for software, using a real example from our work in the Oil & Gas industry.
Data integration case study: Automotive industry - ETLSolutions
Our Automotive consultants use our data integration software to integrate data from the varied systems used by Automotive dealers. Read on to find out how we have streamlined communications across a major manufacturer's network.
Automotive data integration: An example of a successful project structure - ETLSolutions
In this presentation we show our system for integrating Automotive dealer data, using examples of two projects for a major manufacturer. The presentation includes sample reports and the process used.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Securing your Kubernetes cluster: a step-by-step guide to success! - KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview - Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Generating a custom Ruby SDK for your web service or Rails API using Smithy - g2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl - ... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes much work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
A 5-step methodology for complex E&P data management
1. A 5-step methodology for complex E&P data management
Raising data management standards
www.etlsolutions.com
2. The increasing complexity of E&P data
New devices are being used in every phase of Exploration & Production (E&P) in the Oil & Gas industry, gathering more data with which better decisions can be made.
Timescales are collapsing. Once, drilling and logging data were distinct activities separated by days, but now they happen simultaneously.
These changes are being factored into the development of industry standards (such as PPDM), driving their evolution to ensure continued use.
Metadata (in the Dublin Core and ISO 19115 sense) are becoming ever more important in providing context. This has a direct impact on proprietary database design and functionality.
The price of progress is growing data complexity.
3. A 5-step methodology for managing this data
To make a robust and repeatable approach work, we use Transformation Manager, our data integration toolset.
The Transformation Manager software is coupled with the approach we have adopted over many years in the Oil & Gas industry.
The result is a five-stage methodology.
4. Step 1
Separate source and target data models and the logic which lies between them.
• This means that we can isolate the pure model structure and clearly see the elements, attributes and relationships in each model.
• We can also see detail such as database primary keys and comments.
• As exposing relationships is the key in handling PPDM and other highly normalized models, this is a critical step.
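The separation described in Step 1 can be illustrated with a small sketch. This is plain Python standing in for a model repository; the entity and attribute names (`WELL`, `WELLBORE`, `UWI`) are illustrative PPDM-style examples, not Transformation Manager's actual API.

```python
from dataclasses import dataclass, field

# A model declared on its own, with no mapping logic attached: the pure
# structure (elements, attributes, keys, relationships) is directly inspectable.

@dataclass
class Attribute:
    name: str
    primary_key: bool = False
    comment: str = ""          # database comments are kept with the model

@dataclass
class Entity:
    name: str
    attributes: list                                   # list of Attribute
    relationships: dict = field(default_factory=dict)  # role -> entity name

well = Entity("WELL", [Attribute("UWI", primary_key=True),
                       Attribute("WELL_NAME")])
wellbore = Entity("WELLBORE",
                  [Attribute("UWI", primary_key=True),
                   Attribute("WELLBORE_ID", primary_key=True)],
                  relationships={"parent_well": "WELL"})

source_model = {e.name: e for e in (well, wellbore)}

# Because relationships are explicit, a highly normalised model can be
# navigated without consulting any transformation code.
assert source_model["WELLBORE"].relationships["parent_well"] == "WELL"
```

Keeping the mapping logic out of these declarations is what later lets the same model be reused against different sources and targets.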
5. Step 2
Separate the model from the mechanics of data storage.
• The mechanics define physical characteristics such as ‘this is an Oracle database’ or ‘this flat file uses a particular delimiter or character set’. It is the model that tells us things like ‘a well can have many bores’, ‘a wellbore many logs’, and that ‘log trace mnemonics’ are catalogue controlled.
• At a stroke, this separation abolishes a whole category of complexity.
• For both source and target we need a formal data model, because this enables us to read or write to database, XML, flat file, or any other data format.
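One way to picture Step 2 is an abstract record-source interface: the storage mechanics (delimiter, character set, database driver) live behind the interface, while consuming code only ever sees records shaped by the model. The class names below are illustrative, not part of any real toolset.

```python
from abc import ABC, abstractmethod
import csv
import io

class RecordSource(ABC):
    """Model-facing interface: yields dicts keyed by attribute names."""
    @abstractmethod
    def records(self, entity: str):
        ...

class DelimitedFileSource(RecordSource):
    """Mechanics of one storage format: a delimited flat file."""
    def __init__(self, text: str, delimiter: str = ";"):
        self.text = text
        self.delimiter = delimiter

    def records(self, entity: str):
        yield from csv.DictReader(io.StringIO(self.text),
                                  delimiter=self.delimiter)

# An Oracle- or XML-backed source would implement the same interface;
# code working at the model level never sees the difference.
flat = DelimitedFileSource("UWI;WELL_NAME\n100;Alpha-1\n200;Beta-2")
rows = list(flat.records("WELL"))
assert rows[0]["WELL_NAME"] == "Alpha-1"
```

Swapping the backend then means swapping one class, which is the "whole category of complexity" the slide says this separation abolishes.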
6. Step 3
Specify relationships between source and target.
• In all data integration projects, determining the rules for the data transfer is a fundamental requirement, usually defined by analysts working in this field, often using spreadsheets.
• Based on these or other forms of specification, we can create the integration components in Transformation Manager using its descriptive mapping language. This enables us to create a precisely defined description of the link between the two data models.
• From this we can generate a runtime system which will execute the formal definitions. Even if we choose not to create an executable link, the formal definition of the mappings is still useful, because it shows where the complexity in the PPDM integration is, and the formal syntax can be shared with others to verify our interpretation of their rules.
7. Step 4
Follow an error detection procedure.
• To ensure that only good data is stored, Transformation Manager has a robust process of error detection that operates like a series of filters. For each phase, we detect errors relevant to that phase and we don't send bad data to the next phase, where detection becomes even more complex.
• We detect mechanical and logical errors separately. If the source is a flat file, a mechanical error could be malformed lines; logical errors could include dangling foreign key references or missing data values.
• Next, we can detect errors at the mapping level, inconsistencies that are a consequence of the map itself. Here, for example, we could detect that we are trying to load production data for a source well which does not exist in the target.
• Finally there are errors where the data is inconsistent with the target logical model. Here, simple tests (a string value is too long, a number is negative) can often be automatically constructed from the model. More complex tests (well bores cannot curve so sharply, these production figures are for an abandoned well) are built using the semantics of the model.
• A staging store is very useful in providing an isolated area where we can disinfect the data before letting it out onto a master system. Staging stores were an integral part of the best practice data loaders we helped build for a major E&P company, and it is now common practice that suspect records are held there until issues are resolved.
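The series-of-filters idea in Step 4 can be sketched as chained checks, with a staging list quarantining anything that fails. The record layout (`UWI;PRODUCTION` lines) and error cases are illustrative examples taken from the bullets above, not a real loader.

```python
# Four filter phases: bad data never reaches the next, more complex, phase.

def mechanical_check(line: str) -> dict:
    # e.g. malformed line in a flat file
    if line.count(";") != 1:
        raise ValueError(f"malformed line: {line!r}")
    uwi, prod = line.split(";")
    return {"UWI": uwi, "PRODUCTION": prod}

def logical_check(rec: dict) -> dict:
    # e.g. missing data values
    if not rec["UWI"]:
        raise ValueError("missing UWI")
    return rec

def mapping_check(rec: dict, target_wells: set) -> dict:
    # e.g. production data for a well which does not exist in the target
    if rec["UWI"] not in target_wells:
        raise ValueError(f"unknown well {rec['UWI']}")
    return rec

def model_check(rec: dict) -> dict:
    # e.g. a simple test constructed from the model: no negative numbers
    if float(rec["PRODUCTION"]) < 0:
        raise ValueError("negative production")
    return rec

def load(lines, target_wells, staging):
    clean = []
    for line in lines:
        try:
            rec = mechanical_check(line)
            rec = logical_check(rec)
            rec = mapping_check(rec, target_wells)
            clean.append(model_check(rec))
        except ValueError as err:
            staging.append((line, str(err)))  # quarantine until resolved
    return clean

staging = []
good = load(["100;50.5", "999;10", "100;-1"],
            target_wells={"100"}, staging=staging)
assert len(good) == 1 and len(staging) == 2
```

Only the clean record reaches the master system; the staging list plays the role of the isolated "disinfection" area described above.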
8. Step 5
Execute a runtime link to generate the code required to perform the integration.
• This will generate integration components, in the form of Java code, which can reside anywhere in the architecture.
• This could be on the source, target or any other system to manage the integration between PPDM and non-PPDM data sources.
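Step 5's generation of an integration component from the formal mapping definition can be sketched in miniature. The real toolchain emits Java classes; here Python source is generated purely for illustration, and the mapping table is a hypothetical example.

```python
# Generate a deployable component from a declarative mapping definition,
# rather than hand-writing it.
MAPPINGS = [("well_id", "UWI"), ("well_name", "WELL_NAME")]

def generate_component(mappings) -> str:
    """Emit source code for a transform function from the mapping rules."""
    body = ", ".join(f"{tgt!r}: row[{src!r}]" for tgt, src in mappings)
    return f"def integrate(row):\n    return {{{body}}}\n"

source_code = generate_component(MAPPINGS)

# Stand-in for compiling and deploying the generated Java component:
namespace = {}
exec(source_code, namespace)

result = namespace["integrate"]({"UWI": "100", "WELL_NAME": "Alpha-1"})
assert result == {"well_id": "100", "well_name": "Alpha-1"}
```

Because the component is just emitted code, it can be deployed on the source, the target, or any other system, which is the point made in the bullets above.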
9. Our offerings: E&P data management
• Transformation Manager software
• Transformation Manager data loader developer kits
• Support, training and mentoring services
• Data loader and connector development
• Data migration packaged services
10. Why Transformation Manager?
For the user:
• Everything under one roof
• Greater control and transparency
• Identify and test against errors iteratively
• Greater understanding of the transformation requirement
• Automatically document
• Re-use and change management
• Uses domain specific terminology in the mapping
11. Why Transformation Manager?
For the business:
• Reduces cost and effort
• Reduces risk in the project
• Delivers higher quality and reduces error
• Increases control and transparency in the development
• Single product
• Reduces time to market
12. Contact information
Karl Glenn
kg@etlsolutions.com
+44 (0) 1912 894040
Raising data management standards
www.etlsolutions.com