This presentation discusses the challenges and problems people experience during migration to the cloud. We examine the tools AWS offers to overcome those problems and show how to use AWS Database Migration Service (DMS) and the AWS Schema Conversion Tool. We walk through the process, supported engines, different modes, options, and possible problems. The session focuses primarily on Oracle database migration, but we also touch on other engines and areas where the tools can be used.
Couchbase Singapore Meetup #2: Why Developing with Couchbase Is Easy! - Karthik Babu Sekar
This session provides an overview of Couchbase solutions and the latest and greatest in the new release. It also covers how easy it is to develop with Couchbase and query the database.
Couchbase Chennai Meetup: Developing with Couchbase - Made Easy - Karthik Babu Sekar
This session provides an overview of Couchbase solutions and the latest and greatest in the new release. It also covers how easy it is to develop with Couchbase and query the database.
AWS re:Invent 2016: How Mapbox Uses the AWS Edge to Deliver Fast Maps for Mob... - Amazon Web Services
Ian Ward, Platform and Security Engineer from Mapbox, discusses how the AWS global edge network helps improve the availability and performance of delivering hundreds of billions of map tiles to hundreds of millions of end users across the globe on mobile devices, in cars, and over the web. In this session, Ian shares insights on how Mapbox manages day-to-day edge operations using Amazon CloudFront logs, dashboards, and ad hoc queries, and how Mapbox has configured CloudFront with dozens of behaviors and origins to customize their content delivery. Mapbox has grown from using a single AWS region to using several regions, so Ian also explains how his team uses Amazon Route 53 and open source tools to simplify complexity around regional failover, and how Mapbox leverages AWS WAF to deter attacks and abuse.
As GoPro expands into content networks and launches new products, new challenges have appeared. One of the most critical challenges facing GoPro during this period of rapid growth is their ability to make effective use of massive amounts of data. Every day, GoPro collects increasing amounts of data generated by internet connected consumer devices (smart cameras, smart drones), GoPro mobile apps, GoPro content networks, GoPro e-commerce sales, and social media. This data ranges from raw camera logs to refined and well-structured e-commerce datasets. In the past, it took GoPro months to understand new inbound data and determine how to transform or augment it for analysis. To streamline this process and bridge the gap between tech-savvy engineers and data-savvy analysts, GoPro is creating an analysis loop, which informs product usage trends and product insights. This analysis loop serves a large ecosystem of GoPro executives, product managers, engineers, data scientists, and business analysts through an integrated technology pipeline consisting of Apache Kafka, Apache Spark Streaming, Cloudera’s distribution of Hadoop, and Tableau’s Data Visualization Software as the end user analytical tool. Session sponsored by Tableau Software.
How to Migrate from Cassandra to Amazon DynamoDB - AWS Online Tech Talks - Amazon Web Services
Learning Objectives:
- Learn how to migrate from Cassandra to DynamoDB
- Learn about the considerations and pre-requisites for migrating to DynamoDB
- Learn the benefits of DynamoDB, a fully managed NoSQL database
In addition to running databases in Amazon EC2, AWS customers can choose among a variety of managed database services. These services save effort, save time, and unlock new capabilities and economies. In this session, we make it easy to understand how they differ, what they have in common, and how to choose one or more. We explain the fundamentals of Amazon DynamoDB, a fully managed NoSQL database service; Amazon RDS, a relational database service in the cloud; Amazon ElastiCache, a fast, in-memory caching service in the cloud; and Amazon Redshift, a fully managed, petabyte-scale data-warehouse solution that can be surprisingly economical. We’ll cover how each service might help support your application, how much each service costs, and how to get started.
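As a concrete illustration of the data reshaping a Cassandra-to-DynamoDB migration involves, here is a minimal sketch. The table and column names are invented for the example, not taken from the talk, and the actual write call is left as a comment so the sketch runs without AWS credentials.

```python
# Reshape Cassandra-style row dicts into DynamoDB's low-level
# attribute-value item format before a batch write.

def to_dynamodb_item(row):
    """Map a plain row (dict) to a DynamoDB low-level item."""
    type_map = {str: "S", int: "N", float: "N", bool: "BOOL"}
    item = {}
    for key, value in row.items():
        tag = type_map[type(value)]
        # DynamoDB's low-level API encodes numbers as strings.
        item[key] = {tag: str(value) if tag == "N" else value}
    return item

rows = [
    {"user_id": "u1", "score": 42, "active": True},
    {"user_id": "u2", "score": 17, "active": False},
]
items = [to_dynamodb_item(r) for r in rows]
print(items[0])

# With boto3 (not run here), the write would look roughly like:
# boto3.client("dynamodb").batch_write_item(
#     RequestItems={"users": [{"PutRequest": {"Item": i}} for i in items]})
```

In a real migration, DMS or a custom exporter would feed rows into a loop like this in batches of up to 25 items per `batch_write_item` call.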
NYC* 2013 — "Using Cassandra for DVR Scheduling at Comcast" - DataStax Academy
Comcast is developing a highly scalable cloud DVR scheduling system on top of Cassandra. The system is responsible for managing all DVR data and scheduling logic for devices on the X1 platform. This talk will cover the overall architecture of the scheduling system, data model, message queue and notification software that have been developed as part of this ambitious project. We'll take a deep dive into the details of our data model and review the implementation of Comcast's open-source, Cassandra-based clones of Amazon SQS and SNS.
Manage Microservices & Fast Data Systems on One Platform w/ DC/OS - Mesosphere Inc.
The application landscape inside our data centers is changing: along with the trend toward microservices and containers, new distributed data processing frameworks such as Kafka and Cassandra are being released on a weekly basis. These changes have implications for how we think about infrastructure. With the growing need for computing power and the rise of distributed applications comes the need for a reliable, easy-to-use cluster manager and programming abstraction.
In this presentation, Mesosphere explains how to use DC/OS to manage microservices and fast data systems on a single platform. We will look at how container orchestration, including resource management and service management, can be streamlined to process fast data in a matter of seconds, allowing for predictive user interfaces, product recommendations, and billing charge back, among other modern app components.
AWS re:Invent 2016: Relational and NoSQL Databases on AWS: NBC, MarkLogic, an... - Amazon Web Services
Learn how the AWS Marketplace brings together customers who have challenges with ISVs who have solutions to those challenges. See how to use relational and NoSQL technologies on AWS to build enterprise and consumer apps. NBC used MarkLogic to deliver an award-winning app that can handle high traffic levels and unexpected usage spikes. NBC's popular, Emmy-winning "SNL 40" app was launched to celebrate the 40th anniversary of Saturday Night Live and delivers four decades of sketches and performances. Hosted on AWS, the app (as well as a browser-based platform) is powered by the MarkLogic Enterprise NoSQL database. Come learn from the team who collaborated on this project how to run your own database on AWS and how to integrate with Amazon RDS and other data stores. A world-recognized automotive brand needed to deliver real-time responses about its worldwide fleet vehicles. You will learn how they used a combination of AWS services and FileMaker Cloud (from FileMaker, an Apple subsidiary, procured through AWS Marketplace) to deliver high-scale dealer-facing applications.
Day 2 - Amazon RDS - Letting AWS Run Your Low-Admin, High-Performance Database - Amazon Web Services
Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and re-sizable capacity while managing time-consuming database administration tasks, freeing you up to focus on your applications and business. In this webinar we review the different types of Amazon RDS available and how to move your existing databases to Amazon RDS with minimum disruption.
Reasons to attend:
- Learn how Amazon RDS can reduce the overhead of running high-performance, mission-critical databases.
- Learn how to migrate your existing database workloads into Amazon RDS running on the AWS Cloud.
- Learn how to scale up and scale down your Amazon RDS instance and save money with reserved instances.
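The provisioning step this webinar describes amounts to a parameter set handed to the RDS API. The sketch below builds such a request as plain data (the identifier, instance class, and storage size are hypothetical examples); with boto3 you would pass the result to `client("rds").create_db_instance(**params)`.

```python
# Build a create-db-instance request for Amazon RDS as plain data,
# so the example runs without AWS access or credentials.

def rds_instance_params(identifier, engine, instance_class, storage_gb,
                        multi_az=True):
    return {
        "DBInstanceIdentifier": identifier,
        "Engine": engine,                   # e.g. "postgres", "mysql", "mariadb"
        "DBInstanceClass": instance_class,  # e.g. "db.r5.large"
        "AllocatedStorage": storage_gb,     # gibibytes
        "MultiAZ": multi_az,                # standby replica for high availability
        "StorageEncrypted": True,
    }

params = rds_instance_params("demo-db", "postgres", "db.r5.large", 100)
print(params["Engine"])
```

Scaling up or down, as mentioned above, is the analogous `modify_db_instance` call with a new `DBInstanceClass` or `AllocatedStorage`.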
by Darin Briskman, Technical Evangelist, AWS
Database Freedom means being able to use the database engine that's right for you as your needs evolve. Being locked into a specific technology can prevent you from achieving your mission. Fortunately, AWS Database Migration Service makes it easy to switch between database engines. We'll look at how to use the AWS Schema Conversion Tool with DMS to switch from a commercial database to open source. You'll need a laptop with a Firefox or Chrome browser.
AWS re:Invent 2016: Workshop: Converting Your Oracle or Microsoft SQL Server ...Amazon Web Services
In this workshop, you migrate a sample sporting event and ticketing database from Oracle or Microsoft SQL Server to Amazon Aurora or PostgreSQL using the AWS Schema Conversion Tool (AWS SCT) and AWS Database Migration Service (AWS DMS). The workshop includes the migration of tables, indexes, procedures, functions, constraints, views, and more. We run SCT on an Amazon EC2 Windows instance; bring a laptop with Remote Desktop (or some other method of connecting to the Windows instance). Ideally, you should be familiar with relational databases, especially Oracle or SQL Server and PostgreSQL or Aurora, to get the most from this session. Attendees should also be familiar with SCT and DMS. Familiarity with SQL Developer and pgAdmin III is helpful but not required.
Prerequisites:
- Participants should have an AWS account established and available for use during the workshop.
- Please bring your own laptop.
AWS re:Invent 2016: Migrating Enterprise Messaging to the Cloud (ENT217) - Amazon Web Services
Enterprises rely on messaging to integrate services and applications and to exchange information critical to running their business. However, managing and operating dedicated message-oriented middleware and underlying infrastructure creates costly overhead and can compromise reliability. In this session, enterprise architects and developers learn how to improve scalability, availability, and operational efficiency by migrating on-premises messaging middleware to a managed cloud service using Amazon SQS. Hear how Capital One is using SQS to migrate several core banking applications to the cloud to ensure high availability and cost efficiency. We also share some exciting new SQS features that allow even more workloads to take advantage of the cloud.
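The core pattern when moving from broker-based middleware to Amazon SQS is the receive/process/delete loop: messages stay invisible for a visibility timeout and are deleted only after successful processing, giving at-least-once delivery. This sketch injects the transport as callables (a hypothetical stub stands in for SQS) so it runs without AWS; with boto3 they would wrap `sqs.receive_message` and `sqs.delete_message`.

```python
# Receive/process/delete loop, the consumer side of an SQS migration.

def drain_queue(receive, delete, handle):
    """Poll until empty; delete each message only after handle() succeeds."""
    processed = 0
    while True:
        batch = receive()          # SQS returns up to 10 messages per call
        if not batch:
            return processed
        for msg in batch:
            handle(msg["Body"])
            delete(msg["ReceiptHandle"])  # ack: safe to remove now
            processed += 1

# Stub transport standing in for SQS:
pending = [{"Body": "order-1", "ReceiptHandle": "rh1"},
           {"Body": "order-2", "ReceiptHandle": "rh2"}]
deleted, seen = [], []
count = drain_queue(
    receive=lambda: [pending.pop(0)] if pending else [],
    delete=deleted.append,
    handle=seen.append,
)
print(count, seen, deleted)
```

Because deletion happens only after handling, a consumer crash mid-loop means the message reappears after its visibility timeout rather than being lost, which is the reliability property the session emphasizes.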
This deck is from a seminar held jointly by Cathay Bank and the AWS User Group in Taiwan. It offers an overview of Amazon EMR and AWS Glue and presents CDK management of those services through practical scenarios.
Demystifying Storage on AWS | AWS Public Sector Summit 2017 - Amazon Web Services
You wouldn't use a scooter to help someone move; likewise, there is no one-size-fits-all data storage solution. AWS provides a wide variety of storage services to address the full spectrum of needs, from casual users saving photos to the mission-critical, specialized databases used by the largest private and public sector entities. This session gives you an overview of these storage offerings and provides the groundwork to match them to your use cases. Learn more: https://aws.amazon.com/government-education/
Benchmarking Aerospike on the Google Cloud - NoSQL Speed with Ease - Lynn Langit
Deck from a blog post detailing our work with Aerospike to verify their performance benchmark of 4 million TPS on the Google Cloud, using GCE (Google Compute Engine) instances. The blog post is here: http://googlecloudplatform.blogspot.com/2015/10/speed-with-Ease-NoSQL-on-the-Google-Cloud-Platform.html
Modernize Legacy and Enterprise Application Through Implementation of Cloud N... - Amazon Web Services
Many federal agencies are taking on initiatives to consolidate data centers, modernize legacy and enterprise applications, and transform their digital portfolios to enable agility and other advantages through cloud-based delivery models. Deloitte provides a breadth of services to help federal government agencies select the right cloud solutions to accelerate their missions and derive value while staying at the forefront of technology and innovation. Join us as we demonstrate specific client use cases in which we have successfully helped clients select a cloud service model for migrating and transforming their current data centers to the cloud, enable agility by taking advantage of Agile, DevOps, and continuous delivery, and modernize legacy and enterprise applications through cloud-native solutions on AWS infrastructure, delivering value at the speed of your mission. Learn more: https://aws.amazon.com/government-education/
Event Streaming with Kafka Streams and Spring Cloud Stream | Soby Chacko, VMware - Hosted by Confluent
Spring Cloud Stream is a framework built on top of the foundations of Spring Boot, the foremost JVM framework for developing microservice applications. It brings the familiar patterns and philosophies that Spring has championed for years to its programming model, allowing developers to focus primarily on the business logic of their applications. Kafka Streams is a powerful stream processing library built on top of Apache Kafka that attracts many developers because of its simplicity and its deployment model as microservice applications. By developing Kafka Streams applications with Spring Cloud Stream, application developers get the best of both worlds: the simpler stream processing execution model of Kafka Streams and the battle-tested microservices foundations of Spring Boot via Spring Cloud Stream. This talk explores:
- The integration points and capabilities of Spring Cloud Stream's touchpoints with Kafka Streams
- How to build event streaming applications using Spring's programming model on top of Kafka Streams, including a demo of a stateful application using Kafka Streams and Spring Cloud Stream's functional support
- How to use interactive queries to expose materialized views from the state stores in the application
- How such a Kafka Streams application can run as part of a data pipeline using Spring Cloud Data Flow on Kubernetes
How will my on-premises data migrate to the cloud? How can I make the migration transparent to my users? Afterward, how will on-premises and cloud data interact? In this session you will learn about the AWS Database Migration Service (DMS) and the AWS Schema Conversion Tool (SCT). You can use these tools to convert your commercial databases and data warehouses to open-source engines or AWS-native services such as Amazon Aurora and Amazon Redshift.
Speakers:
Saurabh Saxena - Principal Technical Account Manager, AWS
Chris England - Sr. Technical Account Manager, AWS
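A DMS migration task is configured largely through JSON table mappings. The sketch below builds a minimal selection rule of the kind you would pass to a replication task (the schema name is a hypothetical example); the boto3 call itself appears only in a comment so the sketch runs without AWS access.

```python
# Build a DMS table-mapping document: a selection rule that includes
# every table in one source schema.
import json

def dms_table_mapping(schema, tables="%"):
    """Selection rule including all tables in `schema` by default."""
    return {"rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-schema",
        "object-locator": {"schema-name": schema, "table-name": tables},
        "rule-action": "include",
    }]}

mapping = json.dumps(dms_table_mapping("SALES"))
print(mapping[:40])

# With boto3 (not run here), the task would be created roughly like:
# boto3.client("dms").create_replication_task(
#     ReplicationTaskIdentifier="sales-migration",
#     MigrationType="full-load-and-cdc",  # full load plus ongoing replication
#     TableMappings=mapping,
#     ...)
```

Transformation rules (renaming schemas, changing table names, and so on) follow the same JSON structure with `"rule-type": "transformation"`.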
Heterogeneous Migration with DMS and SCT: Database Week San Francisco - Amazon Web Services
Database Week at the San Francisco Loft: Heterogeneous Migration with DMS and SCT - Oracle RDS to Aurora PostgreSQL migration lab using DMS and SCT.
Speakers:
Ramya Kaushik - Database Engineer, DMS/SCT, AWS
Jatin Singh - Partner Solutions Architect, AWS
DAT317: Migrating Databases and Data Warehouses to the Cloud - Amazon Web Services
In this introductory session, we look at how to convert and migrate your commercial databases and data warehouses to the cloud and gain your database freedom. AWS Database Migration Service (AWS DMS) and AWS Schema Conversion Tool (AWS SCT) have been used to migrate tens of thousands of databases. These include Oracle and SQL Server to Amazon Aurora, Teradata and Netezza to Amazon Redshift, MongoDB to Amazon DynamoDB, and many other data source and target combinations. Learn how to easily and securely migrate your data and procedural code, enjoy flexibility and cost savings, and gain new opportunities.
From Mainframe to Microservices: Vanguard's Move to the Cloud - ENT331 - re:I... - Amazon Web Services
Maintaining control of sensitive data is critical in the highly regulated financial investments environment that Vanguard operates in. This need for data control complicated Vanguard's move to the cloud. They needed to expand globally to provide a great user experience while at the same time maintaining their mainframe-based backend data architecture. In this session, Vanguard discusses the creative approach they took to decouple their monolithic backend architecture to empower a microservices architecture while maintaining compliance with regulations. They also cover solutions implemented to successfully meet their requirements for security, latency, and end-state consistency.
Database Freedom is an AWS initiative that accelerates enterprise migrations from commercial database engines to AWS native database services or managed open-source systems. We review the basics of the Amazon purpose-built database strategy and cover our Workload Qualification Framework, which helps you determine a good database migration candidate and predict the level of effort. In the hands-on lab, you use AWS Schema Conversion Tool and AWS Database Migration Service to migrate your databases to Amazon Aurora PostgreSQL. Bring a laptop with Firefox or Chrome and a working AWS account. We provide an AWS CloudFormation template to configure the lab environment.
Database migration doesn’t need to be difficult or time-consuming. Learn about the AWS Database Migration Service, which helps you migrate databases with minimal downtime from on-premises and cloud environments to Amazon RDS, Amazon Redshift, Amazon Aurora, Amazon DynamoDB, and Amazon EC2. We discuss homogeneous (same database engine) and heterogeneous migrations, as well as migrations from data warehouse platforms. We’ll also talk about the AWS Schema Conversion Tool, which saves you development time when migrating your Oracle, SQL Server, and data warehouse schemas and procedural code and exporting your data to the cloud. You'll hear from GumGum, an artificial intelligence company with deep expertise in computer vision that uses DMS to replicate its dimension data from different sources into a cohesive data warehouse.
Accelerate Oracle to Aurora PostgreSQL Migration (GPSTEC313) - AWS re:Invent ... - Amazon Web Services
There is a lot of interest these days in migrating data from commercial relational databases to open-source relational databases. PostgreSQL is a great choice for migration, offering advanced features, high performance, rock-solid data integrity, and a flexible open-source license. PostgreSQL is compliant with ANSI SQL. It supports drivers for nearly all development languages, and it has a strong community of active committers and companies to provide support. In this talk, we demonstrate an overall approach for migrating an application from your current Oracle database to an Amazon Aurora PostgreSQL database.
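A core step in any Oracle-to-PostgreSQL conversion is mapping data types. The sketch below is a simplified, illustrative lookup table of common mappings, not the complete conversion logic of AWS SCT.

```python
# Simplified sketch of common Oracle-to-PostgreSQL data type
# mappings of the kind applied during schema conversion.
# Illustrative only; real tools handle precision, scale, and
# many more types.
ORACLE_TO_POSTGRES = {
    "NUMBER": "numeric",
    "VARCHAR2": "varchar",
    "NVARCHAR2": "varchar",
    "DATE": "timestamp",  # Oracle DATE also stores a time component
    "CLOB": "text",
    "BLOB": "bytea",
    "RAW": "bytea",
}

def convert_column_type(oracle_type: str) -> str:
    """Return a PostgreSQL equivalent, defaulting to text for unknown types."""
    return ORACLE_TO_POSTGRES.get(oracle_type.upper(), "text")

print(convert_column_type("VARCHAR2"))  # varchar
```

Note the `DATE` mapping: because Oracle `DATE` carries a time-of-day, mapping it to PostgreSQL `date` would silently lose data, which is why `timestamp` is the safer target.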
Migrating Your Databases to AWS: Deep Dive on Amazon RDS and AWS Database Mig... - Amazon Web Services
Amazon RDS allows you to launch an optimally configured, secure and highly available database with just a few clicks. It provides cost-efficient and resizable capacity, automates time-consuming database administration tasks, and provides you with six familiar database engines to choose from: Amazon Aurora, Oracle, Microsoft SQL Server, PostgreSQL, MySQL and MariaDB. In this session, we will take a close look at the capabilities of Amazon RDS and explain how it works. We’ll also discuss the AWS Database Migration Service and AWS Schema Conversion Tool, which help you migrate databases and data warehouses with minimal downtime from on-premises and cloud environments to Amazon RDS and other Amazon services. Gain your freedom from expensive, proprietary databases while providing your applications with the fast performance, scalability, high availability, and compatibility they need.
AWS Speaker: Andrew Kane, Solutions Architect - Amazon Web Services
Database Week at the San Francisco Loft: Modernizing DMS
Migrate your databases to AWS with minimal downtime. More than 80,000 databases have been migrated using AWS Database Migration Service.
Speakers:
Ramya Kaushik - Database Engineer, DMS/SCT, AWS
Arun Thiagarajan - Support Engineer, AWS
Migrating Databases to the Cloud with AWS Database Migration Service (DAT207)... - Amazon Web Services
Learn how to convert and migrate your relational databases, nonrelational databases, and data warehouses to the cloud. AWS Database Migration Service (AWS DMS) and AWS Schema Conversion Tool (AWS SCT) can help with homogeneous migrations as well as migrations between different database engines, such as Oracle or SQL Server, to Amazon Aurora. Hear from Verizon about how they intend to migrate critical databases to Amazon Aurora with PostgreSQL compatibility from their current on-premises Oracle databases, and learn how they intend to deal with challenges such as conversion of legacy code and complex data types, supporting business resiliency, and maintaining data synchronization during the transition phase.
Database Week at the San Francisco Loft
Modernizing Databases with DMS
Migrate your databases to AWS with minimal downtime. More than 80,000 databases have been migrated using AWS Database Migration Service.
Speaker: Ben Willett - Solutions Architect, AWS
Database migration doesn’t need to be difficult or time-consuming. Learn about the AWS Database Migration Service, which helps you migrate databases with minimal downtime from on-premises and cloud environments to Amazon RDS, Amazon Redshift, Amazon Aurora, Amazon DynamoDB, and Amazon EC2. We will discuss homogeneous (same database engine) and heterogeneous migrations, as well as migrations from data warehouse platforms. We’ll also talk about the AWS Schema Conversion Tool, which saves you development time when migrating your Oracle, SQL Server, and data warehouse schemas and procedural code and exporting your data to the cloud.
Some might think Docker is for developers only, but that is not really the case. Docker is here to stay, and we will only see more of it in the future.
In this session, learn what Docker is and how it works. The session covers core areas such as volumes, then steps it up with tips and tricks to help you get the most out of your Docker environment, and dives into examples of how to create a database environment within just a few minutes - perfect for testing, development, and possibly even production systems.
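As a small sketch of the "database in minutes" idea, the snippet below assembles a `docker run` command for a throwaway PostgreSQL container with a named volume; the image tag, password, and port are placeholder values for illustration.

```python
import shlex

# Build (but do not execute) a docker run command for a disposable
# PostgreSQL instance. The named volume keeps data across container
# restarts - one of the "volumes" topics the session covers.
def postgres_run_command(name="dev-db", password="example", port=5432):
    return [
        "docker", "run", "-d",
        "--name", name,
        "-e", f"POSTGRES_PASSWORD={password}",
        "-p", f"{port}:5432",
        "-v", f"{name}-data:/var/lib/postgresql/data",  # named volume
        "postgres:16",
    ]

print(shlex.join(postgres_run_command()))
```

Running the printed command (with Docker installed) starts a local PostgreSQL reachable on the chosen host port within seconds.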
Machine Learning explained with Examples
Everybody is talking about machine learning. What is it actually and how can I use it?
In this presentation we will see examples of solving real-life use cases using machine learning. We will define tasks and see how each task can be addressed using machine learning.
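One of the simplest "task first, then model" examples is predicting a number from another number. Below is a toy illustration with made-up data: fitting a least-squares line by hand to predict a price from floor area.

```python
# Toy machine learning task: predict apartment price (in thousands)
# from floor area (m^2). The data are invented and perfectly linear,
# so the fitted line recovers the underlying rule exactly.
areas = [30, 45, 60, 75, 90]
prices = [150, 225, 300, 375, 450]

n = len(areas)
mean_x = sum(areas) / n
mean_y = sum(prices) / n

# Ordinary least squares: slope = covariance / variance.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(areas, prices)) \
    / sum((x - mean_x) ** 2 for x in areas)
intercept = mean_y - slope * mean_x

def predict_price(area):
    return slope * area + intercept

print(round(predict_price(50)))  # 250
```

Real use cases replace the hand-rolled fit with a library model, but the framing is the same: define the task (inputs, output), fit on examples, then predict.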
SQL Server 2017 added support for Linux, and building on that, SQL Server can now run in Docker and in highly available configurations on Kubernetes. SQL Server 2019, which is close to release, is slated to offer a Big Data Cluster feature built on Kubernetes, expanding the range of container use even further.
This session provides the foundational knowledge you need to get started with SQL Server containers, along with hands-on procedures and samples for trying them yourself.
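As a first hands-on step in the spirit of the session, the sketch below assembles a `docker run` command for a SQL Server 2019 container; the SA password is a placeholder and must satisfy SQL Server's complexity rules.

```python
import shlex

# Build (but do not execute) a docker run command for a local
# SQL Server 2019 container. ACCEPT_EULA=Y is required by the image;
# the password here is a placeholder for illustration.
mssql_cmd = [
    "docker", "run", "-d",
    "--name", "sql2019",
    "-e", "ACCEPT_EULA=Y",
    "-e", "MSSQL_SA_PASSWORD=Str0ng!Passw0rd",
    "-p", "1433:1433",
    "mcr.microsoft.com/mssql/server:2019-latest",
]
print(shlex.join(mssql_cmd))
```

Running the printed command (with Docker installed) exposes SQL Server on the standard port 1433 for local experimentation.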
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing toolchains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
The Metaverse and AI: how can decision-makers harness the Metaverse for their... - Jen Stirrup
The Metaverse is popularized in science fiction, and now it is becoming closer to being a part of our daily lives through the use of social media and shopping companies. How can businesses survive in a world where Artificial Intelligence is becoming the present as well as the future of technology, and how does the Metaverse fit into business strategy when futurist ideas are developing into reality at accelerated rates? How do we do this when our data isn't up to scratch? How can we move towards success with our data so we are set up for the Metaverse when it arrives?
How can you help your company evolve, adapt, and succeed using Artificial Intelligence and the Metaverse to stay ahead of the competition? What are the potential issues, complications, and benefits that these technologies could bring to us and our organizations? In this session, Jen Stirrup will explain how to start thinking about these technologies as an organisation.
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
This talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help mitigate climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We closed with a lovely workshop in which the participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
The Art of the Pitch: WordPress Relationships and Sales - Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that lead to closing the deal.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 - Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Welcome to the first live UiPath Community Day Dubai! Join us for this unique occasion to meet our local and global UiPath Community and leaders. You will get a full view of the MEA region's automation landscape and the AI Powered automation technology capabilities of UiPath. Also, hosted by our local partners Marc Ellis, you will enjoy a half-day packed with industry insights and automation peers networking.
📕 Curious about our agenda? Wait no more!
10:00 Welcome note - UiPath Community in Dubai
Lovely Sinha, UiPath Community Chapter Leader, UiPath MVPx3, Hyper-automation Consultant, First Abu Dhabi Bank
10:20 A UiPath cross-region MEA overview
Ashraf El Zarka, VP and Managing Director MEA, UiPath
10:35 Customer Success Journey
Deepthi Deepak, Head of Intelligent Automation CoE, First Abu Dhabi Bank
11:15 The UiPath approach to GenAI with our three principles: improve accuracy, supercharge productivity, and automate more
Boris Krumrey, Global VP, Automation Innovation, UiPath
12:15 Discover how Marc Ellis leverages tech-driven solutions in recruitment and managed services.
Brendan Lingam, Director of Sales and Business Development, Marc Ellis
A tale of scale & speed: How the US Navy is enabling software delivery from l... - sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
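To illustrate what an automated policy check on container images can look like, here is a hedged sketch of a promotion gate over a vulnerability report. The report format is invented for illustration; real pipelines consume scanner output (an SBOM plus a vulnerability scan) in a tool-specific format.

```python
# Sketch of a policy gate: block promotion of a container image
# when its vulnerability report contains findings at or above a
# configured severity. The report structure here is hypothetical.
BLOCKING_SEVERITIES = {"critical", "high"}

def violates_policy(findings):
    """Return the findings that should block promotion to production."""
    return [f for f in findings if f["severity"].lower() in BLOCKING_SEVERITIES]

report = [
    {"package": "openssl", "severity": "Critical", "id": "CVE-0000-0001"},
    {"package": "bash", "severity": "Low", "id": "CVE-0000-0002"},
]

blocked = violates_policy(report)
print([f["package"] for f in blocked])  # ['openssl']
```

In a pipeline, a non-empty `blocked` list would fail the stage and attach the offending findings as evidence for the Authorizing Official, in line with the ATO artifacts the webinar describes.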
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Enhancing Performance with Globus and the Science DMZ - Globus
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... - SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf - Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What SAP Fiori is and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps