This presentation is an overview of a recent project where we assisted a large client with a complex migration of ECO data to PTC Windchill. We used an agile-like collaboration process to implement more functionality in 75% of the expected time.
You've heard about TotalAgility 7.0, the world's first unified platform for the development and deployment of smart process applications. But did you know it is available both on-premise and in the Cloud? In this presentation you will understand when it makes sense to deploy TotalAgility as a service, and the benefits this type of deployment delivers. You will also learn about the Cloud-specific features and licensing available in the newly announced TotalAgility 7.1 release.
Organizations are now trying to make themselves much more operational with easily interoperable data. Data is the most important asset of any organization: it is the backbone of every report, and reports are the baseline on which all vital management decisions are taken. Yet data often coexists in different, sometimes heterogeneous, data sources.
Get an Overview of IT/OT Systems and Upgrade Needs, Leif Poulsen - NNE Phar... – Mediehuset Ingeniøren Live
Larger manufacturing companies often use several hundred IT and OT (automation) systems to manage daily operations. Come and hear how Novo Nordisk creates an overview of its systems and upgrade needs with a smart tool developed by NNE Pharmaplan.
This talk follows up on the keynote from Novo Nordisk.
Technology is evolving and changing at a very rapid pace, and it is more important than ever to ensure that mission-critical back-end mainframe applications can exploit these new and disruptive technologies to transform digitally and deliver real value to the business and to customers. DevOps on z Systems is a key enabler for the API economy and hybrid cloud. In this session we will discuss how DevOps can transform application delivery on z Systems, mitigate risk, and elevate the ability to respond quickly to customer expectations through continuous improvement.
Connexus Energy needed to find a new way to report outages to their Responder™ OMS beyond traditional IVR. With a combination of a web portal, middleware, and a MultiSpeak-based web service, customers can now log into the portal and report outages. This allows Connexus to handle more outage calls by alleviating the restriction on the number of phone lines required, and reduces the time it takes to report an outage. Learn more about the technologies used and the outcome in this informative session.
State Zero: Middle Tennessee Electric Membership Corporation – SSP Innovations
MTEMC recently completed a major project to merge multiple geodatabases into a single new GDB, apply data model changes along with the corresponding data migration, and implement voltage levels with Feeder Manager 2.0 to provide connectivity upstream of a circuit breaker. Several of these changes required the geodatabase to be at state 0 (no versions). MTEMC utilized SSP Innovations’ All Edits State 0 technology to successfully complete this project while maintaining their 1700+ design versions.
Geometric provides an intelligent approach to migrating PDM data in the context of applications and processes, assisting the customer with planning and assessment and suggesting the right migration approach.
Automated software modernisation is the best solution: it is fast, low cost, preserves legacy value, and is less risky than the traditional methodology of a re-write or replacement by packaged ERP. The Object Management Group's (OMG's) Model Driven Architecture (MDA) methodology provides an automated model-driven reverse and forward engineering process called Architecture Driven Modernisation (ADM), which has already been successfully adopted by a variety of high-profile organisations such as Boeing, the U.S. Air Force, Raytheon, EDS, Thales (European aerospace) and numerous governments worldwide.
WORPCLOUD LTD is focused on being an automated software modernisation expert. We use OMG-compliant tools and parsing techniques to extract all system information, business semantics and software artifacts into an XML repository called the Abstract Syntax Tree Metamodel. Next we use MDA's automated transformation procedures to generate new source code of your choice. Manual architecting of the target system is also performed before the transformation, ensuring the speed, low cost and accuracy of the automated process combined with the flexibility and insight of human analysis.
Research reveals that application modernisation and migration budgets are currently very strong, covering between 25% and 71% of most companies' IT budgets in 2013/2014. This clearly indicates that application modernisation is one of the most significant issues affecting companies, due to high software maintenance costs, low business flexibility and crippled integration and interoperability. Software modernisation is the remedy for these problems, and your organisation can make huge savings by modernising.
Discover New Spatial Insights with Spectrum 2020.1: Experience Enhanced User ... – Precisely
Turn data into spatial insights by accessing highly flexible location data management and analytics, underpinned by improvements in interoperability, user experience, and accuracy.
Helping organizations make better, faster, location-based decisions is at the heart of Precisely’s product development approach. To achieve this, we know customers need software that supports user experience, interoperability, and accuracy to drive optimum insights - characteristics that are central to the release of Spectrum 2020.1.
Join Precisely product experts as we showcase Spectrum 2020.1 Location Intelligence solutions and the key customer-driven enhancements that impact our Location Intelligence and Data Quality portfolios. In this webcast, we will discuss:
• How Spectrum Spatial 2020.1 further enables access to spatial insights across the enterprise with improved ease-of-use and interoperability
• A summary of the key Spectrum 2020.1 enhancements to Spectrum Spatial Routing, Spectrum Global Geocoding, and Spectrum Enterprise Tax
• Highlights from the Verify portfolio side of the release and how the latest updates help ensure data is accurate, consistent, and complete for confident business decisions
Maximizing ROI on Utility Work Management Systems – SSP Innovations
Utility IT Departments must carefully assess several factors when searching for and acquiring new enterprise systems that ultimately affect the overall success or failure of the system. This presentation will review important high value considerations of Work Management Systems at utility organizations. Considerations such as implementation personnel expertise, extensibility, configurability, support, and system data models all play a role in the end cost of a system. Each of these factors and more will be examined in light of examples provided by SSP’s Work Management System, WFM aka Workforce Management.
Data Segregation for Remedyforce SaaS Help Desk and High-Speed Digital Servic... – BMC Software
Today we will be looking at record-level data segregation for a SaaS help desk solution in the cloud: Remedyforce. Remedyforce is built on the Salesforce App Cloud for IT Service Management. This functionality allows us to filter access at the record level. For Remedyforce, this means controlling who can see which help desk tickets and, more importantly, who can edit those help desk and service management tickets.
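The record-level model described above can be sketched in a few lines. This is a simplified, hypothetical illustration of per-record visibility and edit rights, not Salesforce's actual sharing engine:

```python
from dataclasses import dataclass

# Hypothetical sketch of record-level segregation: each ticket carries an
# owning account, and a user's view/edit rights are resolved per record.

@dataclass
class Ticket:
    ticket_id: str
    account: str      # which client account the ticket belongs to

@dataclass
class User:
    name: str
    account: str      # the account this agent is scoped to
    can_edit: bool    # whether the agent may modify records

def visible_tickets(user, tickets):
    """A user sees only tickets belonging to their own account."""
    return [t for t in tickets if t.account == user.account]

def may_edit(user, ticket):
    """Editing requires both visibility and the edit permission."""
    return ticket.account == user.account and user.can_edit

tickets = [Ticket("T-1", "acme"), Ticket("T-2", "globex"), Ticket("T-3", "acme")]
alice = User("alice", "acme", can_edit=True)
bob = User("bob", "globex", can_edit=False)

print([t.ticket_id for t in visible_tickets(alice, tickets)])  # ['T-1', 'T-3']
print(may_edit(bob, tickets[1]))  # False: bob can see T-2 but not edit it
```

The point of the split between `visible_tickets` and `may_edit` is that viewing and editing are separate checks, which is the distinction the session draws.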
Informatica provides the market's leading data integration platform. Tested on nearly 500,000 combinations of platforms and applications, the data integration platform interoperates with the broadest possible range of disparate standards, systems, and applications. This unbiased and universal view makes Informatica unique in today's market as a leader in data integration. It also makes Informatica the ideal strategic platform for companies looking to solve data integration issues of any size.
App modernization projects are hard. Enterprises are looking to cloud-native platforms like Pivotal Cloud Foundry to run their applications, but they’re worried about the risks inherent to any replatforming effort.
Fortunately, several repeatable patterns of successful incremental migration have emerged.
In this webcast, Google Cloud’s Prithpal Bhogill and Pivotal’s Shaun Anderson will discuss best practices for app modernization and securely and seamlessly routing traffic between legacy stacks and Pivotal Cloud Foundry.
xRM is the natural evolution of CRM. Businesses are expanding their use of new-generation CRM solutions to manage a wider range of scenarios, including asset management, prospect management, citizen management, and many more. Microsoft CRM sits on the .NET platform and, because of that, it is much more than a traditional CRM product. Instead, think of Microsoft CRM as a rapid-development application platform with out-of-the-box CRM functionality. The purpose of this session is to understand Microsoft's CRM strategy and how you can get to market first with world-class business solutions.
A Journey to Enterprise Agility: Migrating 15 Atlassian Instances to Data Center – Atlassian
How do you coordinate the work of thousands of users, balance the need for teams to innovate, optimize performance, and comply with reporting standards and industry regulations?
At Johnson & Johnson we were faced with such a challenge. With 15 Atlassian application instances and tens of thousands of users, we needed to find a viable way to manage applications and our users efficiently. Come hear about our journey—the challenges, best practices, lessons learned, and ROI during one of the largest data transformation migrations we've ever embarked on.
Data Virtualization Journey: How to Grow from Single Project and to Enterpris... – Denodo
In this presentation, Intel presents their journey: starting small and growing data virtualization into an enterprise IT capability, enabling use cases such as samples management, cloud, and big data for sales and marketing.
This presentation is part of the Fast Data Strategy Conference; you can watch the video here: goo.gl/jiYOHw.
Wipro is one of India's largest publicly traded companies and the seventh-largest IT services firm in the world. In this session, we showcase the structured methods that Wipro has used in enabling enterprises to take advantage of the cloud. These cover identifying workloads and application profiles that could benefit, re-structuring enterprise application and infrastructure components for migration, rapid and thorough verification and validation, and modifying component monitoring and management.
Several of these methods can be tailored to the individual client or functional context, so specific client examples are presented. We also discuss the enterprise experience of enabling many non-IT functions to benefit from the cloud, such as sales and training. More functions included in the cloud increase the benefit drawn from a cloud-enabled IT landscape.
Session sponsored by Wipro.
Why Data Virtualization? An Introduction by Denodo – Justo Hidalgo
Data virtualization means real-time data access and integration. But why do you need it? This presentation tries to answer that question in a simple yet clear way.
By Alberto Pan, CTO of Denodo, and Justo Hidalgo, VP Product Management.
My presentation from MMS 2011 in Las Vegas. Would you like to gain additional insight into the various best practices that other Application Virtualization customers follow? This session provides an opportunity to learn from Application Virtualization MVPs with over 10 years of experience in the field. The objective of the session is to expose you to the numerous best practices, challenges, and solutions that have been witnessed in the field of Application Virtualization.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... – BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Key Trends Shaping the Future of Infrastructure.pdf – Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud, and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 – Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf – 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview – Prayukth K V
The IoT and OT threat landscape report was prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across more than 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
The Art of the Pitch: WordPress Relationships and Sales – Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out: practical tips and strategies for successful relationship building that lead to closing the deal.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality – Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and the Azure OpenAI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
JMeter webinar - integration with InfluxDB and Grafana – RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring of JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
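As background for the demonstration: JMeter's Backend Listener ships metrics to InfluxDB in InfluxDB's line protocol ("measurement,tags fields timestamp"). The sketch below parses one such record; the tag and field names are illustrative assumptions, since the exact set depends on the JMeter version and listener configuration:

```python
# Parse one InfluxDB line-protocol record of the kind JMeter's InfluxDB
# Backend Listener emits. Tag/field names here are illustrative only.

def parse_line(line):
    """Split a line-protocol record into (measurement, tags, fields)."""
    head, fields_part, _ts = line.rsplit(" ", 2)
    measurement, *tag_parts = head.split(",")
    tags = dict(p.split("=", 1) for p in tag_parts)
    fields = {}
    for p in fields_part.split(","):
        k, v = p.split("=", 1)
        fields[k] = float(v)
    return measurement, tags, fields

sample = "jmeter,application=demo,transaction=login count=42,avg=118.5 1700000000000000000"
measurement, tags, fields = parse_line(sample)
print(measurement, tags["transaction"], fields["avg"])  # jmeter login 118.5
```

Grafana dashboards then query these measurements back out of InfluxDB, which is what makes the real-time visualization in the webinar possible.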
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
INSPIRE Annex Testing
1. Practical Experience of INSPIRE Annex I Testing: Transforming Data into INSPIRE Data Specifications and Making Data Accessible via Download Services – Debbie Wilson, debbie.wilson@snowflakesoftware.com
2. Overview of INSPIRE Testing Call
The objectives of INSPIRE testing were to:
- Understand the feasibility of transforming and publishing data into the proposed INSPIRE Annex I data specification
- Demonstrate the ability to access data via INSPIRE Download Services
- Evaluate the costs and benefits of publishing data into the INSPIRE data specification via Download Services
89 testing reports were received from 16 Member States, covering more than 70 organisations.
3. Overview of INSPIRE Testing Call
Only 60 of the 89 tests involved a full transformation test; the rest were paper exercises. Of the organisations using COTS software, around 40% used Snowflake's GO Publisher Desktop & WFS.
5. Transforming Data into INSPIRE Themes
Two key approaches are advocated for transforming and publishing data into INSPIRE themes:
- Offline transformation: data is transformed and stored in the INSPIRE data specification (i.e. flat files or a separate database)
- On-the-fly transformation: data is stored once and transformed into the INSPIRE data specification on request by the download service
Offline transformation may be the most suitable option for datasets that are not updated regularly. On-the-fly transformation is most suitable for datasets that are updated regularly and where organisations need to support data access for a wide range of end users, in different data specifications defined for different use cases. Both approaches were evaluated during the testing phase.
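The trade-off between the two approaches can be sketched schematically. The source records and target field names below are hypothetical stand-ins, not the real INSPIRE schema:

```python
# Schematic contrast of offline vs on-the-fly transformation.
# Source records and target field names are hypothetical.

source = {"parcel_1": {"ref": "AB123", "area": 250}}

def to_inspire(record):
    """Stand-in for the source-to-INSPIRE mapping."""
    return {"nationalCadastralReference": record["ref"], "areaValue": record["area"]}

# Offline: transform once, store the result, serve the stored copy.
offline_store = {fid: to_inspire(rec) for fid, rec in source.items()}

def offline_download(fid):
    return offline_store[fid]          # may lag behind the source

# On-the-fly: store the source once, transform at request time.
def on_the_fly_download(fid):
    return to_inspire(source[fid])     # always reflects the current source

source["parcel_1"]["area"] = 300       # the source dataset is updated
print(offline_download("parcel_1")["areaValue"])     # 250 (stale until re-run)
print(on_the_fly_download("parcel_1")["areaValue"])  # 300 (current)
```

This staleness-versus-compute trade-off is exactly why the slides recommend offline transformation for rarely updated datasets and on-the-fly transformation for frequently updated ones.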
6. Making INSPIRE Data Accessible via INSPIRE Download Services
GO Publisher Desktop, Agent and WFS were also used to demonstrate how organisations can develop download and direct access services. The INSPIRE Implementing Rules define two types of Download Service:
- Basic Download Service: files can be downloaded for local use via HTTP/FTP
- Advanced Download Service: the user can define the extent (geographic, temporal, attribute) of the data to be downloaded, through either Data Ordering Services or Web Feature Services (WFS)
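For the WFS flavour of an advanced download service, the user's extent constraint is typically expressed as KVP parameters on a WFS 2.0 GetFeature request. A minimal sketch, with a hypothetical endpoint and feature type name:

```python
from urllib.parse import urlencode

# Build a WFS 2.0 GetFeature URL restricted to a bounding box.
# The endpoint and feature type name below are hypothetical.

def getfeature_url(endpoint, type_name, bbox):
    params = {
        "SERVICE": "WFS",
        "VERSION": "2.0.0",
        "REQUEST": "GetFeature",
        "TYPENAMES": type_name,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
    }
    return endpoint + "?" + urlencode(params)

url = getfeature_url("https://example.org/wfs", "cp:CadastralParcel",
                     (440000, 112000, 441000, 113000))
print(url)
```

The BBOX parameter is what distinguishes this from a basic download service: the server returns only the features inside the requested extent rather than a whole pre-packaged file.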
7. Demonstration: Transforming HMLR Data into INSPIRE Cadastral Parcels
- Test the feasibility and benefits of using Commercial Off-The-Shelf (COTS) software for INSPIRE
- Develop the translation without software customisation or development of bespoke scripts
- Work quickly and productively to reduce costs
- Refine the translation over several iterations within a limited time period
- Implement Simple and Advanced Download Services to explore the practical issues of implementing real business requirements
- Use on-the-fly translation to avoid replicating database infrastructure
- Source data from the existing HMLR data model to avoid disruption to existing business processes (“manage once, publish many times”)
- Implement an “industrial strength” solution
8. Defining the Translation: GO Publisher Desktop
The translation is defined by mapping database tables and columns to XML schema elements; pull-down lists are populated from the XML schema, and a preview panel shows the resulting output.
12. Issues: Managing identity and feature lifecycles, as the INSPIRE GML application schema requires three different identifiers: the INSPIRE identifier, the national identifier, and the gml:id.
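A minimal sketch of a feature carrying all three identifiers; the element names are simplified stand-ins, not the exact INSPIRE application schema:

```python
import xml.etree.ElementTree as ET

# Sketch of the three identifiers a feature must carry. Element names are
# simplified stand-ins for the INSPIRE/GML application schema.

GML = "http://www.opengis.net/gml/3.2"
ET.register_namespace("gml", GML)

parcel = ET.Element("CadastralParcel")
parcel.set(f"{{{GML}}}id", "CP.1234")                # 1) gml:id attribute

inspire_id = ET.SubElement(parcel, "inspireId")      # 2) INSPIRE identifier
ET.SubElement(inspire_id, "namespace").text = "https://example.org/ids/cp"
ET.SubElement(inspire_id, "localId").text = "1234"

ET.SubElement(parcel, "nationalCadastralReference").text = "AB123"  # 3) national id

print(ET.tostring(parcel, encoding="unicode"))
```

Keeping these three identifiers consistent across updates is what makes lifecycle management hard, especially when features are generated on-the-fly at request time.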
13. Benefits of GO Publisher
- High productivity was achieved: configuration alone was sufficient, no programming or scripting skills were needed, and several iterations of the translation were achieved in a limited time-frame
- Supported progression from simple to advanced services: initially deployed as simple file creation, the translation was then re-deployed as a WFS to support user querying
- Mature “industrial strength” solution: although the INSPIRE Quality of Service requirements were not evaluated in this test, scalability and performance have already been proven in previous deployments; the number of independent evaluations and our existing operational deployment base means the technology is well tested and reliable
14. Benefits of On-the-Fly Translation using GO Publisher
- Re-use existing database infrastructure
- Minor disruption to existing business processes
- Extra translations added at low cost
- Low initial investment: costs scale with increasing levels of data traffic
(Slide includes an example architecture of an SDI.)
15. Summary of Key Outcomes and Experiences
All organisations using GO Publisher were able to successfully demonstrate that they could transform their data into the INSPIRE Annex data specifications. However, several issues relating to data transformation and publication were identified by many organisations:
- Insufficient time to adequately undertake testing (few application tests)
- Complexities involved in the conceptual mapping of their data model to the INSPIRE GML application schema (XSD) or UML model
- Inexperience of staff in transforming data from their source format into GML
- Lack of harmonisation in the way common concepts were modelled and were to be implemented in v2.0 (e.g. identifiers, lifecycle information, naming conventions)
16. Summary of Key Outcomes and Experiences
Many organisations identified areas where further work would be needed to better understand how to operationally publish data into the INSPIRE specifications, particularly where this is achieved on-the-fly:
- Identifier management
- Managing feature lifecycles, particularly for features generated on-the-fly
- Translating between different codelist values
- Measuring and quantifying the quality of the transformed output (geometric and attribute) to ensure it is consistent with source data quality levels
- Metadata: how can organisations integrate the creation and publication of metadata into operational data management and publication workflows?
17. Conclusions
- A number of technical and business issues exist that need to be addressed at various levels (organisational, inter-agency and Member State)
- Organisations need to perform more extensive testing to better understand how to incorporate requirements of INSPIRE data specifications into their business processes, datasets, products and infrastructure
- Responsibility for creation and maintenance of some themes falls across multiple agencies or has been devolved to multiple organisations responsible for a specific geographic region (e.g. transport: road, rail, aviation, water, devolved administrations, local/regional authorities):
  - Need to better understand impact of cross-border/edge-matching issues
  - How can data be seamlessly integrated when combining data from multiple organisations for a single INSPIRE data specification?
- Organisations need facilities to support each other so they can share experiences and ensure everyone can better understand what they need to do to publish their data
Today I'm going to give you a quick summary of our experiences from testing the INSPIRE Annex I data specifications, including a demonstration of the work that we did with HMLR.
The objectives of INSPIRE testing were to:
- understand the feasibility of transforming and publishing data into the proposed INSPIRE Annex I data specifications
- demonstrate provision of data access via INSPIRE Download Services
- evaluate the costs and benefits of publishing data into the INSPIRE data specifications via Download Services
89 testing reports were received from 16 Member States, from over 70 participating organisations (LMOs, SDICs, software vendors, research institutes, geographical institutes and associations). As you can see, the UK was one of the most active Member States involved in the testing.
There was a fairly even split of test reports received by the Commission across all Annex themes. However, it should be noted that for most themes the majority of test reports came from organisations involved in Commission projects related to INSPIRE: EURADIN (Addresses); ESDIN (Geographic Names, Administrative Areas, Cadastral Parcels, Transport Networks, Hydrography); NATURE-GIS (Protected Areas); and GIS4EU and Humboldt. Fewer than 20% of all reports received were undertaken or commissioned directly by organisations (LMOs). Due to the short period given for testing, not all organisations were able to perform a full test actually transforming and publishing data and metadata to the INSPIRE data specifications and download services. Only 60 of the 89 reports described the results of an actual transformation and publication test; the rest were paper-based exercises mapping source data to the output schema using MS Excel.
Snowflake were involved directly (through the ESDIN project, working with HMLR, EDINA, OS and Registers of Scotland) and indirectly, through organisations downloading evaluations or using academic licences of our software. Consequently, we were able to demonstrate that our software is capable of integrating, modelling, transforming and publishing data into all INSPIRE Annex themes.
There are two key approaches for transforming and publishing data into the INSPIRE Annex themes:
- Offline transformation: data is transformed and stored in the INSPIRE data specification (i.e. flat files or a separate database). This may be the most suitable option for datasets that are not updated regularly or that will only be made accessible via simple download services.
- On-the-fly transformation: data is stored once and transformed into the INSPIRE data specification on request by the download service. This is most suitable for datasets that are updated regularly and where organisations need to support access to data by a wide range of end users in different data specifications defined for different use cases.
On-the-fly transformation will probably be the main transformation approach for many organisations, as many have to support communities outside of INSPIRE (i.e. the environment domain) which may require data to be published in other data specifications (e.g. the aviation domain has developed WXXM, which may differ from the INSPIRE Annex III meteorological geographic features data specification). It was also noted at the INSPIRE Conference that the INSPIRE data specifications would only define core feature types and properties that support a broad range of use cases across the range of the environmental acquis. It is therefore anticipated that the INSPIRE data specifications will provide the base specifications, which are extended to develop more specialised data specifications under the remit of SEIS (Shared Environmental Information Systems) to meet more specific use cases for the information and information systems required to deliver the obligations within individual environmental Directives, i.e.:
- information needed to satisfy reporting requirements between Member States and the Commission
- information needed to be shared between public authorities to successfully deliver policy objectives
- developing public information systems (e.g. near-real-time air quality monitoring applications and alert services, or systems to enable public engagement in environmental policy making, as required by the Aarhus Convention)
GO Publisher was used by organisations involved in testing to test both types of transformation.
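The on-the-fly idea can be sketched in a few lines: the source data stays in its native schema, and each download request triggers a per-feature translation into INSPIRE GML. This is a minimal illustration only; the namespace URI, element names and column names are invented for the sketch and are not GO Publisher's internals or HMLR's actual schema.

```python
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml/3.2"
# Hypothetical target namespace standing in for the INSPIRE Cadastral Parcels schema
CP_NS = "urn:example:inspire:cp"
ET.register_namespace("gml", GML_NS)
ET.register_namespace("cp", CP_NS)

def row_to_gml(row: dict) -> str:
    """Transform one source-database row into an INSPIRE-style GML feature.

    In an on-the-fly deployment this runs per request: only the response
    is INSPIRE GML, so the maintenance database never changes.
    """
    feature = ET.Element(f"{{{CP_NS}}}CadastralParcel",
                         {f"{{{GML_NS}}}id": f"CP_{row['parcel_id']}"})
    ref = ET.SubElement(feature, f"{{{CP_NS}}}nationalCadastralReference")
    ref.text = row["title_number"]
    area = ET.SubElement(feature, f"{{{CP_NS}}}areaValue", {"uom": "m2"})
    area.text = str(row["area_m2"])
    return ET.tostring(feature, encoding="unicode")

gml = row_to_gml({"parcel_id": 101, "title_number": "NK123456", "area_m2": 842.5})
```

The key design point is that the translation is a pure function of the source row, so the same code serves both approaches: run it over the whole table once for an offline file extract, or per request for an on-the-fly service.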
Working with HMLR, we aimed to demonstrate the feasibility of developing transformational download services and direct access services (for use within client applications). The INSPIRE Implementing Rules define two types of Download Service:
- Basic Download Service: files can be downloaded for local use via HTTP/FTP
- Advanced Download Service: users can define the extent (geographic, temporal, attribute) of the data they need to download, through either Data Ordering Services or Web Feature Services (WFS)
It should be noted that direct access services (WFS for use within applications) were deemed beyond the scope of the Implementing Rules for Download Services. It is expected that the requirements for these will be defined by Member States or by thematic data working groups established for individual environmental Directives (e.g. the CAFE Directive, Marine Strategy Directive, Water Framework Directive).
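The difference between the two service types is visible in the request itself: a basic service serves a fixed file, while an advanced service lets the client constrain the extent, for example with a WFS GetFeature request carrying a BBOX parameter. The sketch below builds such a request using standard WFS 2.0 KVP encoding; the endpoint URL and type name are assumptions for illustration.

```python
from urllib.parse import urlencode

# Hypothetical WFS endpoint; "cp:CadastralParcel" stands in for a type
# from the INSPIRE Cadastral Parcels application schema.
ENDPOINT = "https://example.org/inspire/wfs"

def get_feature_url(type_name: str, bbox: tuple, srs: str = "EPSG:4258") -> str:
    """Build a WFS 2.0 GetFeature request (HTTP GET, KVP encoding).

    An advanced download service lets the user restrict the geographic
    extent with a BBOX parameter rather than downloading whole files.
    """
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "bbox": ",".join(str(v) for v in bbox) + "," + srs,
    }
    return ENDPOINT + "?" + urlencode(params)

url = get_feature_url("cp:CadastralParcel", (51.0, -1.5, 51.1, -1.4))
```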
The aim of our involvement in the INSPIRE testing with HMLR was to test the feasibility of using COTS software to deliver the requirements of INSPIRE. Our objectives were to:
- develop the translation without the need for software customisation or bespoke scripts
- demonstrate that transforming data into the INSPIRE data specifications, via a range of different download and direct access services, could be achieved quickly
- demonstrate that data can be transformed on-the-fly, enabling organisations to manage once, publish many times, and to maintain existing data maintenance infrastructures, minimising future costs (although some changes to data capture/business processes may be required)
- demonstrate that we have an industrial-strength, scalable solution, enabling organisations to start small (i.e. simple download services) and extend their services to full enterprise or SOA/SDI level when demand increases
Configuration, or authoring, of the transformations required to integrate, model and transform source data into a pre-defined output schema, such as the INSPIRE Cadastral Parcels specification, is performed within GO Publisher Desktop. GO Publisher Desktop is a powerful, intuitive graphical user interface that enables users to map database tables and columns to the respective elements within a GML application schema that has been parsed and validated prior to use. GO Publisher provides users with a wide range of functions for manipulating and transforming the source data:
- inserting new values into the output where data doesn't currently exist in the source data (useful for inserting codespace/namespace values)
- geometric operations
- logical, comparison and arithmetic operations
- coordinate reference system transformations
It also provides a preview panel to evaluate the output during the mapping process, and validates the mapping against the schema to ensure that all mandatory elements are mapped. Once the mapping is complete, GO Publisher Desktop validates the output against the output schema to test for logical consistency, which is the only data quality requirement/conformance test specified in all data specifications. If the output passes this validation test, the user can state in the metadata that the data passes the data specification's conformance quality measure.
Visual analysis of the GML can also be performed by viewing a sample GML file in the GML Viewer.
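The configuration-rather-than-coding approach described above can be illustrated with a minimal sketch: the mapping is expressed as data (column name to output element, plus constants for values absent from the source), and a single generic function applies it. All names here are invented for illustration; they are not GO Publisher's configuration format or HMLR's actual columns.

```python
# Hypothetical column-to-element mapping in the spirit of GO Publisher
# Desktop: the translation is described as configuration, not code.
MAPPING = {
    "TITLE_NO":   "cp:nationalCadastralReference",
    "AREA_SQM":   "cp:areaValue",
    "VALID_FROM": "cp:validFrom",
}

# Constant values inserted where the source holds no data, e.g. the
# codespace/namespace values required by the output schema.
CONSTANTS = {"cp:namespace": "https://data.example.org/hmlr"}

def apply_mapping(row: dict) -> dict:
    """Map one database row onto output-schema element names."""
    out = {elem: row[col] for col, elem in MAPPING.items() if col in row}
    out.update(CONSTANTS)
    return out

result = apply_mapping({"TITLE_NO": "NK123456", "AREA_SQM": 842})
```

Because the mapping is pure configuration, iterating on it (as we did with HMLR's domain experts) means editing the table above, not rewriting code.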
Our approach was to demonstrate how to transform HMLR data into INSPIRE Cadastral Parcels GML via a simple download service, and via advanced download and direct access services (i.e. WFS). HMLR provided us with a subset of their land parcel data, which was loaded into an Oracle database. We then used GO Publisher Desktop to author the transformation and mapping project, in conjunction with domain experts from HMLR. After several iterations improving the transformation and mapping project, it was used to generate individual files, which can be combined into a zip file and made accessible via an HTTP/FTP server. Alternatively, once the transformation and mapping project is configured, GO Publisher Desktop can be used to configure a WFS (i.e. create service metadata). GO Publisher Desktop then deploys the WFS as a WAR file into an application server, which is then accessible to WFS clients (HTTP GET or POST).
The INSPIRE model for cadastral parcels is more extensive than the HMLR model; however, the HMLR model did contain all the mandatory feature types and properties (i.e. cadastral boundaries or index sets). In the UK we don't have a cadastral mapping agency. Instead, the Ordnance Survey provides a topographic mapping database which includes objects that form properties (or cadastral parcels), while the Land Registry manages and maintains the land titles for properties in England and Wales. Consequently, the data model of the Land Registry differs from the INSPIRE Cadastral Parcels data specification. The primary feature type within the Land Registry dataset is the land title, which is not modelled in INSPIRE, and one or more objects (buildings, gardens, car parking, etc.) are associated with a title. In the INSPIRE Cadastral Parcels model, by contrast, the primary feature type is the cadastral parcel. Despite these different viewpoints, it is possible to map each component object within a land title to a cadastral parcel. However, this reveals a data management problem for operational transformation of HMLR data to the INSPIRE specification: how does HMLR manage the identity and lifecycle of each of these component objects? This is further complicated because the INSPIRE GML application schema requires three different identifiers:
- INSPIRE identifier: unique, persistent identifier for international use
- national identifier: unique, persistent identifier for national use
- gml:id: non-persistent identifier needed to uniquely identify features and objects within the GML file
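The three-identifier requirement can be sketched as follows. The namespace URI and identifier shapes are assumptions for illustration, not HMLR's or INSPIRE's actual scheme; the point is the split between the two persistent identifiers and the per-document gml:id.

```python
import uuid

# Hypothetical namespace for the publishing organisation
NAMESPACE = "https://data.example.org/hmlr/cp"

def make_identifiers(national_id: str) -> dict:
    """Derive the three identifiers the INSPIRE GML schema requires
    from a single persistent national identifier.

    The INSPIRE identifier pairs a namespace with the persistent local
    id; the gml:id only has to be unique within one GML document, so
    it can be regenerated for every file or response.
    """
    return {
        "inspire_id": {"namespace": NAMESPACE, "localId": national_id},
        "national_id": national_id,
        # gml:id must be an XML NCName, so it may not start with a digit
        "gml_id": "f-" + uuid.uuid4().hex,
    }

ids = make_identifiers("NK123456")
```

Note what this sketch cannot solve: when component objects of a land title are generated on-the-fly, there is no stored national identifier to derive from, which is exactly the lifecycle problem described above.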
All organisations that used GO Publisher were able to successfully transform their data into the mandatory requirements of the INSPIRE Annex data specifications for their respective themes.
- Insufficient time to adequately undertake testing: this resulted in few application tests; some organisations are still performing and submitting test reports to the Commission.
- Complexity of the INSPIRE data specifications: many organisations reported that the data specifications were too complex, causing problems with transformation and publication (e.g. FME struggles with nesting below two levels).
- Lack of experience: many organisations reported that although they could perform the required transformations, they would need further staff training to gain a better understanding of how to transform their data into the INSPIRE specifications operationally and of the types of download services they need to provide.
- Lack of harmonisation between specifications: this comment has been taken on board, and additional time is being provided, once the next versions of the data specifications have been finalised, to perform cross-harmonisation of the data specifications.
Many organisations identified areas where further work would be needed to better understand how to operationally publish data into the INSPIRE specifications, particularly where this is achieved on-the-fly:
- Identifier management
- Managing feature lifecycles, particularly for features generated on-the-fly
- Translating between different codelist values: not all codelist values can be mapped 1:1, so some may not be transformable on-the-fly/automatically, as domain expertise may be required to assign individual features to the appropriate INSPIRE codelist value. Encoding of alternate code values may therefore have to be incorporated into data maintenance workflows.
- Measuring and quantifying the quality of transformed output (geometric and attribute) to ensure it is consistent with source data quality levels and other data quality levels: this should be performed as part of an extensive pilot to demonstrate and understand the impacts of the transformation to the INSPIRE data specifications, which can then be expressed in the metadata.
- Metadata: more work needs to be done to understand how metadata creation and publication can be integrated into the transformation and publication workflow, semi-automating the creation of metadata that meets the requirements for publication within discovery services, is accessible to download services, and is disseminated within files downloaded for local use.
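The codelist problem above can be made concrete with a small sketch. The source and target values are invented, not real INSPIRE codelists; the useful pattern is flagging values that have no automatic mapping so they are routed to a domain expert in the maintenance workflow rather than mistranslated on-the-fly.

```python
# Hypothetical source-to-INSPIRE codelist translation table.
# Values with no safe automatic mapping are marked None.
CODELIST_MAP = {
    "FREEHOLD":  "ownership",   # clean 1:1 mapping
    "LEASEHOLD": "ownership",   # many-to-one: reverse mapping is lossy
    "OTHER":     None,          # no automatic mapping: needs expert review
}

def translate_code(source_value: str):
    """Translate a source codelist value to its INSPIRE equivalent.

    Returns (value, needs_review): unmappable values are flagged so a
    domain expert can assign them in the data maintenance workflow
    instead of the on-the-fly transformation guessing.
    """
    target = CODELIST_MAP.get(source_value)
    return (target, target is None)

value, needs_review = translate_code("OTHER")
```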