This document discusses Datometry, a virtualization platform that allows enterprises to replatform their legacy data warehousing workloads to the cloud at a fraction of the typical cost, time, and risk of conventional replatforming methods. It notes that Datometry keeps existing applications and logic, allowing businesses to move rapidly to the cloud while preserving years of infrastructure investment. Charts show that Datometry reduces both the upfront switching costs and ongoing annual costs of replatforming projects compared to conventional approaches. The document promotes Datometry as a way to redefine the economics of modernizing data management applications and proprietary systems like Teradata.
EnterpriseDB CEO and President Ed Boyajian opened Postgres Vision 2018 with this presentation providing a look at enterprise activity in the cloud and how Postgres can extend across the IT infrastructure, from on-premises to the cloud.
At Postgres Vision 2018, Lauren Nelson, Principal Analyst, Forrester, provided a look into the practical considerations that are influencing modern cloud strategies, including existing skill sets and technology limitations, the assortment of current and future cloud workloads, and the economics and realities of today's technology options.
Postgres Vision 2018: How to Consume your Database Platform On-premises - EDB
The usual on-premises model for a database platform is to run it the way IT has usually been operated: siloed, capital-intensive, and labor-intensive. In the cloud, consumption means you pay for what you use, with less heavy lifting to operate the platform. Presented at Postgres Vision 2018, this session covers how HPE can deliver EDB Postgres in the data center or at the edge in a consumption model that is pay-per-use, elastic, operated for you, migrated, and integrated.
Postgres Vision 2018: The Changing Role of the DBA in the Cloud - EDB
Not that long ago, DBAs were the gateway to all things database-related for enterprises. With the advent of the cloud, automation, and DevOps, the DBA's role and responsibilities are rapidly evolving. In this presentation delivered at Postgres Vision 2018, Ken Rugg, Chief Product & Strategy Officer at EDB, explored the 10 most significant ways the role of the DBA has changed and what new, higher-value skills a DBA will need to be ready for epic change.
As open source databases become the enterprise standard, making all data available and accessible for AI has become an even bigger challenge. In the presentation delivered at Postgres Vision 2018, Rob Thomas, General Manager of IBM Analytics, provided answers for how companies can prepare their Information Architecture for AI, leverage containers and multi-cloud for innovation, and deliver a data and analytics strategy at scale.
Postgres Vision 2018: Making Modern an Old Legacy System - EDB
A New England insurance company had aging hardware, a database that was out of support, an older operating system, rising costs, and no disaster recovery plan. Craig Bogovich of NTT Data tackled this massive website backend, used by the company's insureds, providers, and partners, architected a complete overhaul, and ultimately deployed it into the cloud. Presented at Postgres Vision 2018, this session shows how the project unfolded and details the strategies and methods used to modernize this legacy system with open source software and cloud technology.
Postgres Vision 2018: Your Migration Path - Isabel Case Study - EDB
Benny Rutten, Senior Database Administrator at Isabel Group, presented a case study at Postgres Vision 2018 about building a Service Hub on OpenStack with EDB Postgres that any Oracle DBA could manage with zero transition time.
In this webinar you'll learn how to quickly and easily improve your business using Snowflake and Matillion ETL for Snowflake. Webinar presented by Solution Architects Craig Collier (Snowflake) and Kalyan Arangam (Matillion).
In this webinar:
- Learn to optimize Snowflake and leverage Matillion ETL for Snowflake
- Discover tips and tricks to improve performance
- Get invaluable insights from data warehousing pros
Simplifying Your Journey to the Cloud: The Benefits of a Cloud-Based Data War... - Matillion
As companies grow, so does the volume of their data. Without the proper solutions in place to quickly store, measure and analyze that data, its usefulness quickly declines.
See our latest webinar to learn about how companies are increasingly turning towards cloud-based data warehousing to derive more value out of their data and apply their findings to make smarter business decisions. The webinar covers core topics including:
- The benefits of using Snowflake’s unique architecture for interacting with data.
- How Matillion can help you quickly load and transform your data to maximize its value.
- Expert advice on how to apply data warehousing and ETL best practices.
Watch the full webinar: https://youtu.be/mIOm3j431OQ
Altis Webinar: Use Cases For The Modern Data Platform - Altis Consulting
Several organisations have reported issues that stemmed from choosing the wrong use cases to start their journey with a modern data platform.
In this session, NZ Regional Manager Alex Gray will cover some of those issues faced by organisations and how to pick the right use cases to get you started successfully on your journey.
Postgres, the leading open source relational database, is positioned as the centerpiece of a pivot from traditional architectures to a microservices-based approach that fully supports a DevOps motion.
Presented by Marc Linster, Senior Vice President of Product Development at EnterpriseDB, this session explores how Postgres meets the key requirements for DevOps. Linster explains how Postgres is developer-friendly, supporting the process with a versatile data model using JSONB and integrating other data sources using Foreign Data Wrappers, and how Postgres supports rapid deployment in the cloud and on premises.
Continuous Data Replication into Cloud Storage with Oracle GoldenGate - Michael Rainey
Continuous flow. Streaming. Near real-time. These are all terms used to identify the business’s need for quick access to data. It’s a common request, even if the data must flow from on-premises to the cloud. Oracle GoldenGate is the data replication solution built for fast data. In this session, we’ll look at how GoldenGate can be configured to extract transactions from the Oracle database and load them into a cloud object store, such as Amazon S3. There are many different use cases for this type of continuous load of data into the cloud. We’ll explore these solutions and the various tools that can be used to access and analyze the data from the cloud object store, leaving attendees with ideas for implementing a full source-to-cloud data replication solution.
Presented at ITOUG Tech Days 2019
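The pattern the session describes, capturing committed database transactions and continuously landing them in a cloud object store, can be sketched in a few lines. This is a hypothetical illustration, not GoldenGate code: `ObjectStore` is an in-memory stand-in for a real bucket client such as Amazon S3, and the change-record shape is assumed.

```python
import json
from datetime import datetime, timezone

class ObjectStore:
    """In-memory stand-in for a cloud object store bucket (e.g. S3)."""
    def __init__(self):
        self.objects = {}

    def put_object(self, key, body):
        self.objects[key] = body

def replicate(transactions, store, prefix="trail"):
    """Group change records by table, write one JSON-lines object per table."""
    by_table = {}
    for txn in transactions:
        by_table.setdefault(txn["table"], []).append(txn)
    ts = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    for table, rows in by_table.items():
        key = f"{prefix}/{table}/{ts}.jsonl"
        store.put_object(key, "\n".join(json.dumps(r) for r in rows))
    return sorted(store.objects)

store = ObjectStore()
keys = replicate(
    [{"table": "orders", "op": "INSERT", "id": 1},
     {"table": "orders", "op": "UPDATE", "id": 1},
     {"table": "customers", "op": "INSERT", "id": 7}],
    store,
)
```

Once change records land as time-bucketed files under a per-table prefix, downstream tools can query them directly from the object store, which is the source-to-cloud flow the session explores.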
For those contemplating re-architecting or greenfields data lakes/data hubs/data warehouses in a cloud environment, talk to our Altis AWS Practice Lead - Guillaume Jaudouin about why you should be considering the "tour de force" combination of AWS and Snowflake.
Hassle-Free Data Lake Governance: Automating Your Analytics with a Semantic L... - Tyler Wishnoff
Simplify data lake governance, no matter how much data you work with and how many data sources and BI tools you manage. This presentation offers all you need to develop your own strategy for smarter data lake governance. Learn more at: https://kyligence.io/
Beyond Batch: Is ETL still relevant in the API economy? - SnapLogic
Industry thought leaders Gaurav Dhillon and David Linthicum discuss the future of cloud integration and data management in the API economy. Topics from this webinar and the accompanying slides include: key considerations of today's CIOs, approaching the reality of the multi-cloud world and new solutions for managing cloud and on-premise data.
To learn more, visit: http://www.snaplogic.com/.
Every day, businesses across a wide variety of industries share data to support insights that drive efficiency and new business opportunities. However, sharing that data today takes great effort, both for the providers who publish it and for the customers who consume it.
Existing approaches to data sharing (such as e-mail, FTP, EDI, and APIs) carry significant overhead and friction. Legacy approaches such as e-mail and FTP were never intended to support today's big data volumes, and the other methods involve enormous effort as well. All of them require not only that the data be extracted, copied, transformed, and loaded, but also that the related schemas and metadata be transported. This places a burden on data providers to deconstruct and stage data sets, and that burden is mirrored for the data recipient, who must reconstruct the data.
As a result, companies are handicapped in their ability to fully realize the value in their data assets.
Snowflake Data Sharing allows companies to grant instant access to ready-to-use data to any number of partners or data customers without any data movement, copying, or complex pipelines.
Using Snowflake Data Sharing, companies can derive new insights and value from data much more quickly and with significantly less effort than current data sharing methods. As a result, companies now have a new approach and a powerful new tool to get the full value out of their data assets.
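As a rough sketch of the provider-side setup, Snowflake shares are created and granted object access with a handful of SQL statements. The snippet below just assembles those statements as strings; the database, schema, table, and consumer account names are illustrative, and the connection/execution layer is omitted.

```python
def build_share_statements(share, database, schema, table, consumer_account):
    """Assemble the provider-side SQL for exposing one table via a share."""
    return [
        f"CREATE SHARE {share};",
        f"GRANT USAGE ON DATABASE {database} TO SHARE {share};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO SHARE {share};",
        f"GRANT SELECT ON TABLE {database}.{schema}.{table} TO SHARE {share};",
        # Makes the share visible to the consumer account, which can then
        # create a read-only database from it; no data is copied or moved.
        f"ALTER SHARE {share} ADD ACCOUNTS = {consumer_account};",
    ]

stmts = build_share_statements(
    "sales_share", "sales_db", "public", "orders", "partner_acct"
)
```

The key point, consistent with the abstract above, is that granting access is a metadata operation: the consumer queries the provider's data in place rather than receiving extracted copies.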
Smart application on Azure at Vattenfall - Rens Weijers & Peter van 't Hof - GoDataDriven
During GoDataFest 2019, Rens Weijers, manager of data & strategy, and Peter van 't Hof, data engineer, share the story of how Vattenfall develops smart applications on Azure. Vattenfall has the ambition to transition to fossil-free living within one generation. But what about decentral energy solutions in the Customers & Solutions business unit? Data is key to helping customers reduce their CO2 footprint. Azure enables Vattenfall to be personal and relevant towards customers.
Understanding and Rightsizing Container Resources With Datadog and ProphetStor - JadeCampbell13
OpenShift enables organizations to accelerate delivery cycles and rapidly scale their operations to meet the demands of today's fast-paced market. For example, individual application teams can deploy multiple versions every day on common infrastructure, and scale their applications to meet the demand of their users. However, Datadog's recent containers study found that the majority of OpenShift and Kubernetes workloads are underutilizing CPU and memory resources.
In this webinar, we show how Datadog and ProphetStor help teams to solve the challenges in deploying containerized applications on OpenShift by bringing end-to-end visibility and resource optimization recommendations to meet application performance and cost requirements.
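The core idea behind a rightsizing recommendation can be illustrated with a simple calculation: size a container's resource request to cover a high percentile of observed usage plus headroom, instead of an arbitrary large number. The percentile choice and the 20% buffer below are assumptions for illustration, not Datadog or ProphetStor defaults.

```python
def recommend_request(samples_millicores, percentile=0.95, headroom=1.2):
    """Recommend a CPU request (millicores) covering `percentile` of
    observed usage samples, with a safety headroom multiplier."""
    ordered = sorted(samples_millicores)
    idx = min(len(ordered) - 1, int(percentile * len(ordered)))
    return int(ordered[idx] * headroom)

# A container requesting 1000m whose observed usage is ~90-200m:
usage = [110, 120, 150, 90, 130, 200, 170, 140, 160, 100]
recommendation = recommend_request(usage)  # → 240
```

Here a 1000m request would be cut to 240m, reclaiming the kind of unused CPU and memory the containers study found in most OpenShift and Kubernetes workloads.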
Webinar: The 5 Most Critical Things to Understand About Modern Data Integration - SnapLogic
In this webinar, we talk to industry analyst, author, and practitioner David Linthicum, who provides a state-of-the-technology explanation of big data integration.
David also covers five critical and lesser-known data integration requirements, how to understand today's requirements, and guidance for choosing the right approaches and technology to solve these problems.
To learn more, visit: www.snaplogic.com/big-data
Data lakes are changing the way we store and process data. In this webinar, Matillion explores data lakes in detail: what they are, how they function, and how you can leverage them to benefit your business.
Watch the webinar to:
- Learn the differences between data lakes and data warehouses
- Get tips for building a data lake in the cloud
- See real-world use cases for data lakes in Amazon Redshift, Google BigQuery, and Snowflake
- Find out how Matillion ETL can help you with your data lake strategy
Master the Multi-Clustered Data Warehouse - Snowflake - Matillion
Snowflake is one of the most powerful, efficient data warehouses on the market today—and we joined forces with the Snowflake team to show you how it works!
In this webinar:
- Learn how to optimize Snowflake
- Hear insider tips and tricks on how to improve performance
- Get expert insights from Craig Collier, Technical Architect from Snowflake, and Kalyan Arangam, Solution Architect from Matillion
- Find out how leading brands like Converse, Duo Security, and Pets at Home use Snowflake and Matillion ETL to make data-driven decisions
- Discover how Matillion ETL and Snowflake work together to modernize your data world
- Learn how to utilize the impressive scalability of Snowflake and Matillion
Snowflake + Power BI: Cloud Analytics for Everyone - Angel Abundez
Learn how Power BI and Snowflake can work together to bring a best-in-class data and analytics experience to your enterprise. You can combine Snowflake’s easy to use, robust, and scalable data platform with Power BI’s data visualization, built-in AI, and collaboration platform to create a data-driven culture for everyone.
Creating Agility Through Data Governance and Self-service Integration with S... - SnapLogic
In July of 2015, SnapLogic announced the summer 2015 release of its SnapLogic Elastic Integration Platform, providing broader support for self-service cloud and big data integration in the enterprise. This ENTERPRISE MANAGEMENT ASSOCIATES® (EMA) impact brief details the announcement and recognizes SnapLogic’s innovation in the data integration and data governance spaces.
Maximizing Oil and Gas (Data) Asset Utilization with a Logical Data Fabric (A... - Denodo
Watch full webinar here: https://bit.ly/3g9PlQP
It is no news that Oil and Gas companies face constant, immense pressure to stay competitive, especially in the current climate, while striving to put data at the heart of the business so they can scale and gain greater operational efficiencies across the organization.
Hence the need for a logical data layer that helps Oil and Gas businesses move towards a unified, secure, and governed environment, efficiently realize the potential of data assets across the enterprise, and deliver real-time insights.
Tune in to this on-demand webinar where you will:
- Discover the role of data fabrics and Industry 4.0 in enabling smart fields
- Understand how to connect data assets and the associated value chain to high impact domain areas
- See examples of organizations accelerating time-to-value and reducing NPT
- Learn best practices for handling real-time/streaming/IoT data for analytical and operational use cases
Data Driven Advanced Analytics using Denodo Platform on AWS - Denodo
Watch full webinar here: https://buff.ly/3JC8gCS
Accelerating cloud adoption and modernizing analytics in the cloud have become a necessity for timely, insightful, and impactful decision making. However, data spread across an organization's disparate hybrid cloud sources poses a challenge for real-time, well-governed analytics. Data virtualization is a modern data integration technique in which a single semantic layer is built to help drive data democratization and speed up analytics in an efficient and cost-effective manner.
Watch this session to learn:
- How various AWS services (Redshift, S3, RDS) can be quickly integrated using Denodo Platform’s logical data management by implementing a logical data fabric (LDF)
- How an LDF helps you manage and deliver your data for data science and analytics programs, supporting your business users
- How a governed data services layer enables self-service analytics in your complex AWS data landscape
Verizon Centralizes Data into a Data Lake in Real Time for Analytics - DataWorks Summit
Verizon – Global Technology Services (GTS) was challenged by a multi-tier, labor-intensive process when trying to migrate data from disparate sources into a data lake to create financial reports and business insights. Join this session to learn more about how Verizon:
• Easily accessed data from multiple sources including SAP data
• Ingested data into major targets including Hadoop
• Achieved real-time insights from data leveraging change data capture (CDC) technology
• Reduced costs and labor
Horses for Courses: Database Roundtable - Eric Kavanagh
The blessing and curse of today's database market? So many choices! While relational databases still dominate the day-to-day business, a host of alternatives has evolved around very specific use cases: graph, document, NoSQL, hybrid (HTAP), column store, the list goes on. And the database tools market is teeming with activity as well. Register for this special Research Webcast to hear Dr. Robin Bloor share his early findings about the evolving database market. He'll be joined by Steve Sarsfield of HPE Vertica, and Robert Reeves of Datical in a roundtable discussion with Bloor Group CEO Eric Kavanagh. Send any questions to info@insideanalysis.com, or tweet with #DBSurvival.
Big Data LDN 2018: STREAMING INTEGRATION: POWERING HYBRID CLOUD, MACHINE LEAR... - Matt Stubbs
Date: 13th November 2018
Location: Fast Data Theatre
Time: 15:10 - 15:40
Speaker: Steve Wilkes
Organisation: Striim
About: Organisations realise that modernisation and adoption of new technologies are key to business success. Whether it is adopting hybrid cloud services for elastically scalable solutions, utilising machine learning to optimise business operations, or leveraging fast analytics for instant insight, they all rely on one thing: data. Modernisation requires moving away from batch-oriented architectures to streaming technologies, taking advantage of fast data, as well as integrating new and legacy resources.
The Importance of DataOps in a Multi-Cloud World - DATAVERSITY
There’s no denying that Cloud has evolved from being an outlying market disruptor to a mainstream method for delivering IT applications and services. In fact, it’s not uncommon to find that Enterprises use the services of more than one cloud at the same time. However, while a multi-cloud strategy offers many benefits, it also increases data management complexity and consequently reduces data availability. This webinar defines the meaning of DataOps and why it’s a crucial component for every multi-cloud approach.
Accelerate Design and Development of Data Projects Using AWSDelphix
What if you could stand up your AWS EC2 development and test environments near instantly with fresh, secure, and masked data, while slashing your EBS storage usage at the same time?
It's what Dentegra, one of the largest US health benefits providers, achieved by adding Delphix to their DevOps/AWS stack, enabling them to release new features to market faster and more efficiently across their hybrid cloud environment.
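Delphix's masking engine is proprietary, but the core idea behind deterministic data masking can be sketched in a few lines of plain Python. This is an illustrative sketch only; the function name, salt, and email format below are hypothetical, not Delphix's API:

```python
import hashlib

def mask_email(email: str, salt: str = "demo-salt") -> str:
    """Deterministically mask an email address: the same input always
    yields the same masked value, so referential integrity across tables
    and environments is preserved, while the original address cannot be
    recovered without the salt."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode()).hexdigest()[:12]
    return f"user_{digest}@{domain}"

# Masking is repeatable, so joins on the masked column still work
# across freshly provisioned dev/test copies.
masked = mask_email("jane.doe@example.com")
assert masked == mask_email("jane.doe@example.com")
assert masked != "jane.doe@example.com"
```

Determinism is the key design choice: random masking would break cross-table joins in the provisioned test data.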
Top Trends in Building Data Lakes for Machine Learning and AI Holden Ackerman
Presentation by Ashish Thusoo, Co-Founder & CEO at Qubole, exploring big data industry trends in the move from data warehouses to cloud-based data lakes. This presentation covers how companies today are seeing a significant rise in the success of their big data projects by moving to the cloud to iteratively build more cost-effective data pipelines and new products with ML and AI.
It also uncovers how services like AWS, Google, Oracle, and Microsoft Azure provide the storage and compute infrastructure to build self-service data platforms that can enable all teams and new products to scale iteratively.
Enabling Next Gen Analytics with Azure Data Lake and StreamSetsStreamsets Inc.
Big data and the cloud are perfect partners for companies who want to unlock maximum value from all of their unstructured, semi-structured, and structured data. The challenge has been how to create and manage a reliable end-to-end solution that spans data ingestion, storage and analysis in the face of the volume, velocity and variety of big data sources.
In this webinar, we will show you how to achieve big data bliss by combining StreamSets Data Collector, which specializes in creating and running complex any-to-any dataflows, with Microsoft's Azure Data Lake and Azure analytic solutions.
We will walk through an example of how a major bank is using StreamSets to transport their on-premises data to the Azure Cloud Computing Platform and Azure Data Lake to take advantage of analytics tools with unprecedented scale and performance.
Slides: Accelerate and Assure the Adoption of Cloud Data Platforms Using Inte...DATAVERSITY
Greater agility, scalability, and lower total cost of ownership made the decision to move key elements of your organization’s data capability to the cloud easy. The real challenge is migrating data from your legacy systems to your new cloud platform so you can unleash its potential and value while minimizing the migration risks.
By combining erwin's data modeling, governance, and intelligence solutions with Snowflake's modern cloud data platform, organizations can realize a scalable, governed, and transparent enterprise data capability.
In this session, we’ll show you how enterprise stakeholders with different skills and needs can work together to accelerate and assure the success of cloud migration projects of any size. You’ll learn how to:
• Reduce costs and mitigate risks when migrating legacy applications to Snowflake with erwin’s model-driven schema design and transformation capabilities
• Increase the precision, speed, and agility of Snowflake deployments with erwin data automation
• Assure transparency, compliance, and governance for Snowflake data and processes
• Increase the efficiency and accuracy of analytics and other data usage on the Snowflake Cloud Platform
SendGrid Improves Email Delivery with Hybrid Data WarehousingAmazon Web Services
When you received your Uber ‘Tuesday Evening Ride Receipt’ or Spotify’s ‘This Week’s New Music’ email, did you think about how they got there?
SendGrid’s reliable email platform delivers over 20 billion transactional and marketing emails each month on behalf of many of your favorite brands, including Uber, Airbnb, Spotify, Foursquare, and NextDoor.
SendGrid was looking to evolve its data warehouse architecture in order to improve decision making and optimize customer experience. They needed a scalable and reliable architecture that would allow them to move nimbly and efficiently with a relatively small IT organization, while supporting the needs of both business and technical users at SendGrid.
SendGrid’s Director of Enterprise Data Operations will be joining architects from Amazon Web Services (AWS) and Informatica to discuss SendGrid’s journey to a hybrid cloud architecture and how a hybrid data warehousing solution is optimized to support SendGrid’s analytics initiative. Speakers will also review common technologies and use cases being deployed in hybrid cloud today, common data management challenges in hybrid cloud and best practices for addressing these challenges.
Join us to learn:
• How to evolve to a hybrid data warehouse with Amazon Redshift for scalability, agility and cost efficiency with minimal IT resources
• Hybrid cloud data management use cases
• Best practices for addressing hybrid cloud data management challenges
Webinar | Data Management for Hybrid and Multi-Cloud: A Four-Step JourneyDataStax
Data management may be the hardest part of making the transition to the cloud, but enterprises including Intuit and Macy’s have figured out how to do it right. So what do they know that you might not? Join Robin Schumacher, Chief Product Officer at DataStax, as he explores best practices for defining and implementing data management strategies for the cloud. He outlines a four-step journey that will take you from your first deployment in the cloud through to a true intercloud implementation, and walks through a real-world use case in which a major retailer evolved through the four phases over a period of four years and is now benefiting from a highly resilient multi-cloud deployment.
View webinar: https://youtu.be/RrTxQ2BAxjg
Simplifying Your Cloud Architecture with a Logical Data Fabric (APAC)Denodo
Watch full webinar here: https://bit.ly/3dudL6u
It's not if you move to the cloud, but when. Most organisations are well underway with migrating applications and data to the cloud. In fact, most organisations - whether they realise it or not - have a multi-cloud strategy. Single, hybrid, or multi-cloud…the potential benefits are huge - flexibility, agility, cost savings, scaling on-demand, etc. However, the challenges can be just as large and daunting. A poorly managed migration to the cloud can leave users frustrated at their inability to get to the data that they need and IT scrambling to cobble together a solution.
In this session, we will look at the challenges facing data management teams as they migrate to cloud and multi-cloud architectures. We will show how the Denodo Platform can:
- Reduce the risk and minimise the disruption of migrating to the cloud.
- Make it easier and quicker for users to find the data that they need - wherever it is located.
- Provide a uniform security layer that spans hybrid and multi-cloud environments.
Streaming Real-time Data to Azure Data Lake Storage Gen 2Carole Gunst
Check out this presentation to learn the basics of using Attunity Replicate to stream real-time data to Azure Data Lake Storage Gen2 for analytics projects.
Modernizing to a Cloud Data ArchitectureDatabricks
Organizations with on-premises Hadoop infrastructure are bogged down by system complexity, unscalable infrastructure, and the increasing burden on DevOps to manage legacy architectures. Costs and resource utilization continue to go up while innovation has flatlined. In this session, you will learn why, now more than ever, enterprises are looking for cloud alternatives to Hadoop and are migrating off of the architecture in large numbers. You will also learn how elastic compute models’ benefits help one customer scale their analytics and AI workloads and best practices from their experience on a successful migration of their data and workloads to the cloud.
Oracle GoldenGate Cloud Service OverviewJinyu Wang
This new PaaS solution in Oracle Public Cloud extends real-time data replication from on-premises to the cloud, leading innovation in real-time data movement with powerful data streaming capabilities for enterprise solutions.
Real life use cases from across Europe (Walid Aoudi - Cognizant)
This presentation covers return-on-experience stories from Cognizant Big Data clients in continental Europe and the UK. The main focus is on use cases, presented through the business drivers behind these projects. Key highlights of the big data architectures and solution approaches will be presented. Finally, the business outcomes, in terms of the ROI delivered by the implemented solutions, will be discussed.
The Tanzu Developer Connect is a hands-on workshop that dives deep into TAP, giving attendees hands-on experience. This is a great program to leverage accounts with current TAP opportunities.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what Testing in DevOps means. We also held a lovely workshop in which participants explored different ways of thinking about quality and testing in the different parts of the DevOps infinity loop.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
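Under the hood, JMeter ships sampled metrics to InfluxDB via its Backend Listener, which writes points in the InfluxDB line protocol (`measurement,tags fields timestamp`). As an illustrative sketch in plain Python (the measurement, tag, and field names below are simplified, not JMeter's exact schema):

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns):
    """Encode one metric point in InfluxDB line protocol:
    measurement,tag1=v1 field1=v1,field2=v2 timestamp"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(
        f'{k}="{v}"' if isinstance(v, str) else f"{k}={v}"
        for k, v in sorted(fields.items())
    )
    return f"{measurement},{tag_str} {field_str} {timestamp_ns}"

# One JMeter-style sample: a transaction's average response time and count.
line = to_line_protocol(
    "jmeter",
    {"application": "demo", "transaction": "login"},
    {"avg": 142.0, "count": 10},
    1700000000000000000,
)
print(line)
# jmeter,application=demo,transaction=login avg=142.0,count=10 1700000000000000000
```

Grafana then queries these points from InfluxDB to plot throughput and latency in real time while the load test runs.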
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
- Create a campaign using Mailchimp with merge tags/fields
- Send an interactive Slack channel message (using buttons)
- Have the message received by managers and peers, along with a test email for review
But there’s more. In a second workflow supporting the same use case, you’ll see:
- Your campaign sent to target colleagues for approval
- If the “Approve” button is clicked, a Jira/Zendesk ticket created for the marketing design team
- If the “Reject” button is pushed, colleagues alerted via a Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply applying machine learning to just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains are only realised when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses.
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
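PowSyBl itself is driven through its Java API or the pypowsybl Python binding. To give a self-contained flavour of what one of its core features, a (DC) power flow, actually computes, here is a toy three-bus example in plain Python; the network data is invented for illustration and this does not use PowSyBl's API:

```python
def dc_power_flow_3bus(p1, p2, x01, x02, x12):
    """Toy DC power flow on a three-bus network. Bus 0 is the slack
    (angle fixed at 0), p1/p2 are net power injections (per unit) at
    buses 1 and 2, and x01/x02/x12 are line reactances. Solves
    B * theta = P for the reduced 2x2 susceptance matrix, then
    recovers the per-line flows."""
    b01, b02, b12 = 1 / x01, 1 / x02, 1 / x12
    # Reduced nodal susceptance matrix (slack row/column removed).
    B11, B12, B22 = b01 + b12, -b12, b02 + b12
    det = B11 * B22 - B12 * B12
    theta1 = (B22 * p1 - B12 * p2) / det  # Cramer's rule, 2x2 system
    theta2 = (B11 * p2 - B12 * p1) / det
    # Flow on each line = angle difference times line susceptance.
    return {
        "0->1": b01 * (0.0 - theta1),
        "0->2": b02 * (0.0 - theta2),
        "1->2": b12 * (theta1 - theta2),
    }

# Bus 1 injects 0.5 p.u. of generation; bus 2 draws 0.9 p.u. of load.
flows = dc_power_flow_3bus(p1=0.5, p2=-0.9, x01=0.1, x02=0.2, x12=0.1)
# The slack bus covers the 0.4 p.u. net deficit through its two lines.
assert abs(flows["0->1"] + flows["0->2"] - 0.4) < 1e-9
```

Real tools like PowSyBl solve the same kind of system for thousands of buses, with full AC models, contingency screening, and sensitivity analysis on top.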
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.