The document discusses Amazon Redshift, a fully managed data warehousing service. It describes how Redshift lets users analyze vast amounts of data stored in Amazon S3 quickly, simply, and cost-effectively using SQL. Redshift offers benefits such as massively parallel processing, automatic backup and restore, security, and integration with other AWS services and third-party tools.
by Mike Labib, In-Memory NoSQL Solutions Architect, AWS
Redis has exploded in popularity to become the de facto standard in-memory key-value store, used by customers for fast data storage to accelerate databases and applications. In this talk, we will discuss how to leverage Redis to achieve blazing-fast performance in a variety of use cases, from database caching to messaging, queuing, IoT, and more. Both high-level architecture considerations and implementation (with code snippets) will be covered. We will also see how Amazon ElastiCache makes it easy to power your Redis workloads in a robust, secure, and fully managed way. Level: 200 (requires understanding of database services)
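The database-caching use case mentioned above is usually implemented with the cache-aside pattern. A minimal sketch, assuming a client object with Redis-style `get`/`setex` methods (in practice a `redis.Redis` connection from redis-py pointed at an ElastiCache endpoint; the key scheme and loader function here are hypothetical):

```python
import json

def get_product(product_id, cache, load_from_db, ttl_seconds=300):
    """Cache-aside read: try the cache first, fall back to the database.

    `cache` is any client exposing get/setex (e.g. redis.Redis from redis-py);
    `load_from_db` is a callable that fetches the record on a cache miss.
    """
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)          # cache hit: skip the database
    record = load_from_db(product_id)      # cache miss: query the database
    cache.setex(key, ttl_seconds, json.dumps(record))  # store with an expiry
    return record
```

The expiry on `setex` bounds staleness: a changed row is served from cache for at most `ttl_seconds` before the next miss reloads it.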
Best Practices for Migrating Your Data Warehouse to Amazon Redshift, Amazon Web Services
by Darin Briskman, Technical Evangelist, AWS
You can gain substantially more business insights and save costs by migrating your existing data warehouse to Amazon Redshift. This session will cover the key benefits of migrating to Amazon Redshift, migration strategies, and tools and resources that can help you in the process. We’ll learn about AWS Database Migration Service and the AWS Schema Conversion Tool, which were recently enhanced to import data from six common data warehouse platforms. Level: 200
Amazon Redshift is a managed service that gives you a data warehouse that is ready to use. You only need to worry about loading data and using it; the details of infrastructure, servers, replication, and backup are handled by AWS.
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools. You can start small for just $0.25 per hour with no commitment or upfront costs and scale to a petabyte or more for $1,000 per terabyte per year, less than a tenth of most other data warehousing solutions.
See a recording of the webinar based on this presentation here on YouTube: https://youtu.be/GgLKodmL5xE
Masterclass series webinars, including on-demand access to all of this year's recorded webinars: http://aws.amazon.com/campaigns/emea/masterclass/
Journey Through the Cloud webinar series, including on-demand access to all webinars so far this year: http://aws.amazon.com/campaigns/emea/journey/
by Rajeev Srinivasan, Sr. Solutions Architect, AWS
Amazon DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit millisecond latency at any scale. It is a fully managed cloud database and supports both document and key-value store models. Its flexible data model, reliable performance, and automatic scaling of throughput capacity, makes it a great fit for mobile, web, gaming, ad tech, IoT, and many other applications. We’ll take a look at how DynamoDB works and how it can be accelerated by DAX, the DynamoDB Accelerator.
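The key-value and document models described above map onto DynamoDB's typed attribute-value wire format. As a small illustrative sketch (not an official helper; the boto3 resource API performs this conversion automatically, while the low-level client API expects this shape):

```python
def to_attribute_values(item):
    """Convert a flat Python dict into DynamoDB's low-level AttributeValue
    format, as expected by the boto3 *client* API. The higher-level
    resource API accepts plain dicts and does this conversion itself."""
    def encode(v):
        if isinstance(v, bool):           # check bool before int: bool subclasses int
            return {"BOOL": v}
        if isinstance(v, (int, float)):
            return {"N": str(v)}          # numbers travel as strings on the wire
        if isinstance(v, str):
            return {"S": v}
        if isinstance(v, list):
            return {"L": [encode(x) for x in v]}
        raise TypeError(f"unsupported type: {type(v).__name__}")
    return {k: encode(v) for k, v in item.items()}
```

For example, `to_attribute_values({"pk": "user#1", "age": 30})` yields `{"pk": {"S": "user#1"}, "age": {"N": "30"}}`, the shape `put_item` on the low-level client expects.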
Analyzing big data quickly and efficiently requires a data warehouse optimized to handle and scale for large datasets. Amazon Redshift is a fast, petabyte-scale data warehouse that makes it simple and cost-effective to analyze big data for a fraction of the cost of traditional data warehouses. By following a few best practices, you can take advantage of Amazon Redshift’s columnar technology and parallel processing capabilities to minimize I/O and deliver high throughput and query performance. This webinar will cover techniques to load data efficiently, design optimal schemas, and use workload management.
Learning Objectives:
• Get an inside look at Amazon Redshift's columnar technology and parallel processing capabilities
• Learn how to migrate from existing data warehouses, optimize schemas, and load data efficiently
• Learn best practices for managing workload, tuning your queries, and using Amazon Redshift's interleaved sorting features
Who Should Attend:
• Data Warehouse Developers, Big Data Architects, BI Managers, and Data Engineers
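The efficient-loading technique referred to above is the COPY command, which ingests S3 files in parallel across the cluster's slices instead of row-by-row INSERTs. A minimal sketch that assembles such a statement (the table name, S3 prefix, and IAM role ARN are hypothetical placeholders; run the resulting SQL through any Redshift-connected client):

```python
def build_copy_statement(table, s3_prefix, iam_role, fmt="CSV", compression="GZIP"):
    """Build a Redshift COPY statement. Loading via COPY from Amazon S3 lets
    every slice in the cluster ingest input files in parallel, which is the
    recommended alternative to row-by-row INSERTs."""
    parts = [
        f"COPY {table}",
        f"FROM '{s3_prefix}'",
        f"IAM_ROLE '{iam_role}'",
        fmt,
    ]
    if compression:
        parts.append(compression)         # e.g. GZIP-compressed source files
    return "\n".join(parts) + ";"

sql = build_copy_statement(
    "sales",                                      # hypothetical target table
    "s3://my-bucket/sales/2017/",                 # hypothetical S3 prefix
    "arn:aws:iam::123456789012:role/RedshiftCopy" # hypothetical IAM role
)
```

Splitting the input into multiple compressed files under one S3 prefix lets COPY spread the load evenly across slices.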
This is the general session for Amazon DynamoDB and will cover newly announced features, as well as provide an end to end view of recent innovations. We will also share some of our successful customer stories and use cases. Come to this session to learn all about what’s new for DynamoDB!
Best Practices for Migrating Your Data Warehouse to Amazon Redshift, Amazon Web Services
You can gain substantially more business insights and save costs by migrating your existing data warehouse to Amazon Redshift. This session will cover the key benefits of migrating to Amazon Redshift, migration strategies, and tools and resources that can help you in the process.
In this talk, Ian will talk about Amazon Redshift, a managed petabyte-scale data warehouse, give an overview of integration with Amazon Elastic MapReduce, a managed Hadoop environment, and cover some exciting new developments in the analytics space.
In this Masterclass presentation we will:
• Explore the architecture and fundamental characteristics of Amazon Redshift
• Show you how to launch Redshift clusters and to load data into them
• Explain how to use the AWS Console to monitor and manage Redshift clusters
• Help you to discover best practices and other resources to help you get the most from Redshift
Watch the recording here: http://youtu.be/-FmCWcxRvXY
In this presentation, you will get a look under the covers of Amazon Redshift, a fast, fully managed, petabyte-scale data warehouse service for less than $1,000 per TB per year. Learn how Amazon Redshift uses columnar technology, optimized hardware, and massively parallel processing to deliver fast query performance on data sets ranging in size from hundreds of gigabytes to a petabyte or more. We'll also walk through techniques for optimizing performance, and you’ll hear how a specific customer's use case takes advantage of fast performance on enormous datasets by leveraging economies of scale on the AWS platform.
Learn how Amazon Redshift, our fully managed, petabyte-scale data warehouse, can help you quickly and cost-effectively analyze all your data using your existing business intelligence tools. Get an introduction to how Amazon Redshift uses massively parallel processing and scale-out architecture to ensure compute resources grow with your dataset size, and columnar, direct-attached storage to dramatically reduce I/O time. Learn how top online retailer RetailMeNot moved their largest Vertica cluster from Amazon EC2 to Amazon Redshift. See how they gain insights from clickstream, location, merchant, marketing, and operational data across desktop and mobile properties.
by Edin Zulich, NoSQL Solutions Architect, AWS
Explore Amazon DynamoDB capabilities and benefits in detail and learn how to get the most out of your DynamoDB database. We go over best practices for schema design with DynamoDB across multiple use cases, including gaming, IoT, and others. We explore designing efficient indexes, scanning, and querying, and go into detail on a number of recently released features, including DynamoDB Accelerator (DAX), DynamoDB Time-to-Live, and more. We also provide lessons learned from operating DynamoDB at scale, including provisioning DynamoDB for IoT. Level: 200
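One of the features named above, DynamoDB Time-to-Live, expires items once a per-item Unix-epoch-seconds attribute passes; DynamoDB then deletes them in the background at no write cost. A small sketch of stamping that attribute onto an item before writing it (the attribute name `ttl` is an assumption; it is configured per table when you enable TTL):

```python
import time

def with_ttl(item, days, now=None, attr="ttl"):
    """Return a copy of `item` carrying a DynamoDB TTL attribute.

    TTL expects a Unix epoch timestamp in *seconds*; once it passes,
    DynamoDB removes the item in the background, with no write capacity
    consumed. The attribute name is configured per table (assumed 'ttl')."""
    now = int(time.time()) if now is None else int(now)
    expiry = now + days * 24 * 60 * 60
    return {**item, attr: expiry}
```

This pattern suits session records, IoT readings, and other data with a natural shelf life, since expired items age out without scheduled cleanup jobs.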
Amazon RDS makes it easy to set up, operate, and scale a relational database in the cloud. We’ll look at what RDS does (and does not do) to manage the “muck” of database operations.
by Darin Briskman, Technical Evangelist, AWS
Microsoft SQL Server is a commonly-used commercial relational database, especially for organizations that use Microsoft development tools. We’ll look at how to run SQL Server on the AWS Cloud, with examples of organizations using it.
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools.
This webinar will provide an overview of Redshift with an emphasis on the many changes we recently introduced. In particular, we will address the newly released DW2 instance types and what you can do with them.
This content is designed for database developers and architects interested in Amazon Redshift.
Lessons from Migrating Oracle Databases to Amazon RDS or Amazon Aurora, Datavail
Learn and leverage database migration best practices from moving off commercial Oracle databases to Amazon RDS or Aurora. We’ll cover common pitfalls, issues, the biggest differences between the engines, migration best practices, and how some of our customers have completed these migrations.
Take an in-depth look at data warehousing with Amazon Redshift and get answers to your technical questions. We will cover performance tuning techniques that take advantage of Amazon Redshift's columnar technology and massively parallel processing architecture. We will also discuss best practices for migrating from existing data warehouses, optimizing your schema, loading data efficiently, and using workload management and interleaved sorting.
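Schema optimization in Redshift largely comes down to choosing distribution and sort keys in the table DDL. As a hedged sketch of what such DDL looks like (the table and column names are hypothetical; the statement would be run through any Redshift SQL client):

```python
def build_create_table(table, columns, distkey=None, sortkeys=(), interleaved=False):
    """Build Redshift CREATE TABLE DDL with distribution and sort keys.

    DISTKEY co-locates rows that join on the same column on the same node;
    SORTKEY (compound by default, INTERLEAVED when queries filter on several
    columns with equal weight) lets the planner skip blocks during scans."""
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    ddl = f"CREATE TABLE {table} (\n  {cols}\n)"
    if distkey:
        ddl += f"\nDISTKEY ({distkey})"
    if sortkeys:
        style = "INTERLEAVED SORTKEY" if interleaved else "SORTKEY"
        ddl += f"\n{style} ({', '.join(sortkeys)})"
    return ddl + ";"
```

For example, a clickstream table joined on `user_id` and filtered by time might use `distkey="user_id"` with `sortkeys=("event_time",)`, so joins avoid network shuffles and time-range scans skip unsorted blocks.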
In this session, you get an overview of Amazon Redshift, a fast, fully-managed, petabyte-scale data warehouse service. We'll cover how Amazon Redshift uses columnar technology, optimized hardware, and massively parallel processing to deliver fast query performance on data sets ranging in size from hundreds of gigabytes to a petabyte or more. We'll also discuss new features, architecture best practices, and share how customers are using Amazon Redshift for their Big Data workloads.
AWS June Webinar Series - Getting Started: Amazon Redshift, Amazon Web Services
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service for less than $1,000 per TB per year. In this presentation, you'll get an overview of Amazon Redshift, including how it uses columnar technology, optimized hardware, and massively parallel processing to deliver fast query performance on data sets ranging in size from hundreds of gigabytes to a petabyte or more. Learn how, with just a few clicks in the AWS Management Console, you can set up a fully functional data warehouse that is ready to accept data without learning any new languages, and easily plug in the existing business intelligence tools and applications you use today. This webinar is ideal for anyone looking to gain deeper insight into their data without the usual challenges of time, cost, and effort.
In this webinar, you will learn how to:
• Understand what Amazon Redshift is and how it works
• Create a data warehouse interactively through the AWS Management Console
• Load data into your new Amazon Redshift data warehouse from S3
Who Should Attend:
• IT professionals, developers, and line-of-business managers
Explore DynamoDB capabilities and benefits in detail and learn how to get the most out of your DynamoDB database. We go over schema design best practices with DynamoDB across multiple use cases, including gaming, AdTech, IoT, and others.
Building a Modern Data Warehouse: Deep Dive on Amazon Redshift - SRV337 - Chi..., Amazon Web Services
In this chalk talk, we take a deep dive on Amazon Redshift architecture and the latest performance enhancements that give you faster insights into your data. We also cover Amazon Redshift Spectrum, a feature of Amazon Redshift that enables you to analyze data across Amazon Redshift and your Amazon S3 data lake to deliver unique insights not possible by analyzing independent data silos.
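Redshift Spectrum works by registering S3 data as external tables, which queries then scan in place alongside local Redshift tables. A hedged sketch of the DDL involved (schema, table, columns, and S3 location are hypothetical; the external schema itself must already be mapped to a data catalog):

```python
def build_external_table(schema, table, columns, s3_location, fmt="PARQUET"):
    """Build a Redshift Spectrum CREATE EXTERNAL TABLE statement.

    The data never leaves S3: Spectrum scans it in place at query time,
    so the same SQL can join external S3 data with local Redshift tables."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE EXTERNAL TABLE {schema}.{table} ({cols})\n"
        f"STORED AS {fmt}\n"
        f"LOCATION '{s3_location}';"
    )
```

Columnar formats such as Parquet matter here: because Spectrum charges by bytes scanned, reading only the referenced columns keeps both latency and cost down.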
Explore Amazon DynamoDB capabilities and benefits in detail and learn how to get the most out of your DynamoDB database. We go over best practices for schema design with DynamoDB across multiple use cases, including gaming, IoT, and others. We explore designing efficient indexes, scanning, and querying, and go into detail on a number of recently released features, including DynamoDB Accelerator (DAX), DynamoDB Time-to-Live, and more. We also provide lessons learned from operating DynamoDB at scale, including provisioning DynamoDB for IoT.
by John McGrath, Startup Solutions Architect, AWS
Database Week at the AWS Loft is an opportunity to learn about Amazon’s broad and deep family of managed database services. These services provide easy, scalable, reliable, and cost-effective ways to manage your data in the cloud. We explain the fundamentals and take a technical deep dive into Amazon RDS and Amazon Aurora relational databases, Amazon DynamoDB non-relational databases, Amazon Neptune graph databases, and Amazon ElastiCache managed Redis, along with options for database migration, caching, search, and more. You'll learn how to get started, how to support applications, and how to scale.
A closer look at the fast, fully managed data warehouse that makes it simple and cost-effective to analyze all your data using standard SQL and your existing Business Intelligence (BI) tools. We'll show how to run complex analytic queries against petabytes of structured data, using sophisticated query optimization, columnar storage on high-performance local disks, and massively parallel query execution.
Level: Beginner
Speakers:
Jay Formosa - Solutions Architect, AWS
Aser Moustafa - Data Warehouse Specialist Solutions Architect, AWS
ABD201 - Big Data Architectural Patterns and Best Practices on AWS, Amazon Web Services
In this session, we simplify big data processing as a data bus comprising various stages: collect, store, process, analyze, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Companies, from startups to enterprises across the globe, are looking to migrate data warehousing to the cloud to increase performance and lower costs. Data engineers, data analysts, and developers also need to access and consume this important data. The landscape is constantly evolving and there are many solutions available for enterprises of all sizes. In this workshop, we dive deep into architectural patterns, use cases, and best practices when designing an enterprise data warehouse in the cloud. We also address key issues such as data governance and democratization. At the end of this workshop, you’ll be equipped to design and implement a cloud enterprise data warehouse platform that provides the most benefit for your enterprise, data consumers, and customers.
Learn how to reduce development time and innovate on AWS. In this webinar, Beachbody - sellers of fitness, weight loss, and muscle-building home-exercise videos - talks about their experience migrating to a data lake on Amazon Simple Storage Service (Amazon S3) using Talend. Beachbody will describe how they created an open enterprise data platform, giving their employees access to secure, well-governed data, and increasing DevOps efficiency across the entire company.
What’s New in Amazon RDS for Open-Source and Commercial Databases, Amazon Web Services
by Kwesi Edwards, Business Development Manager, AWS
In the past year, Amazon RDS has continued to expand functionality, scalability, availability, and ease of use for all supported database engines: PostgreSQL, MySQL, MariaDB, Oracle, and Microsoft SQL Server. We’ll take a close look at RDS use cases and new capabilities, splitting the time between open-source and commercial database engines. Level: 300
DAT309_Best Practices for Migrating from Oracle and SQL Server to Amazon RDS
You can significantly reduce database licensing and operational costs by migrating from commercial database engines to Amazon RDS. In addition, you can gain flexibility and operational efficiency by avoiding the frustrating usage constraints that accompany commercial database licenses. Amazon RDS is a fully managed database service, so you no longer need to worry about complex database management tasks. Launch a single database instance or thousands of them in just a few minutes, and pay only for what you use. Learn how AWS Database Migration Service and AWS Schema Conversion Tool help you migrate commercial databases like Oracle and Microsoft SQL Server to Amazon RDS and Aurora easily and securely with minimal downtime.
Running Oracle Databases on Amazon RDS - DAT313 - re:Invent 2017
Amazon Relational Database Service (Amazon RDS) simplifies setup, operation, and management of databases in the cloud. In this session, we will explore Amazon RDS features and best practices that offer graceful migration, high performance, elastic scaling, and high availability for Oracle databases. You will also learn from the Chief Architect for Intuit’s Small Business Division how the QuickBooks Online team is using Amazon RDS for Oracle to scale the world's largest online accounting platform.
Migrating to 21st Century Analytics: Zopa Story
Speakers:
Shafreen Sayyed, Solution Architect, AWS
Varun Gangoor, Senior Big Data Engineer, Zopa
Data makes the world go around these days, and 21st Century Data Analytics means you can store, process, and analyze massive amounts of data, often in real time, whilst making that data consumable across diverse groups in your organization. Many traditional tools lock data away in inflexible silos, making this impossible. This session will look at what is needed in a Financial Services organization to achieve a flexible and scalable data architecture, and we will also hear from Zopa, the UK's first peer-to-peer lending company, about how they migrated their data analytics estate to AWS and what new insight that has given them.
by Kwesi Edwards, Business Development Manager, AWS
Database migration doesn’t need to be difficult or time-consuming. Learn how AWS Database Migration Service provides an easy, secure migration from on-premises and Amazon EC2 environments to Amazon RDS, Amazon Redshift, Amazon DynamoDB, and EC2 databases, with minimal downtime. We’ll also see how the AWS Schema Conversion Tool automatically converts your schema and a majority of the custom code, so you can get up and running in the cloud quickly and inexpensively. We’ll discuss alternative data migration strategies for special use cases. Level 200
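To make the minimal-downtime flow the abstract describes concrete, here is a small sketch that assembles the parameters for a DMS "full load plus change data capture" replication task. The ARNs and schema name are hypothetical placeholders; the actual API call (shown commented out) requires AWS credentials and real resources.

```python
import json

def build_dms_task(source_arn, target_arn, instance_arn, schema):
    """Assemble parameters for an AWS DMS replication task in
    full-load-and-cdc mode: bulk copy first, then ongoing change
    capture, which is what enables a minimal-downtime cutover."""
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-schema",
            # Select every table in the chosen schema
            "object-locator": {"schema-name": schema, "table-name": "%"},
            "rule-action": "include",
        }]
    }
    return {
        "ReplicationTaskIdentifier": f"migrate-{schema}",
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "ReplicationInstanceArn": instance_arn,
        "MigrationType": "full-load-and-cdc",
        "TableMappings": json.dumps(table_mappings),
    }

# Placeholder ARNs for illustration only
params = build_dms_task("arn:src", "arn:tgt", "arn:inst", "hr")
# boto3.client("dms").create_replication_task(**params)  # real call, needs AWS credentials
print(params["MigrationType"])
```

The table-mapping rules are where you narrow a migration to specific schemas or tables; everything else is wiring between the source endpoint, target endpoint, and replication instance.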
For organizations looking to glean insights from their data, it is essential to deploy the right environment to successfully support analytics workloads. Learn about the different storage options from AWS, and discuss with our experts how to select the best option for your big data analytics workloads. Hear how one customer, ViaSat, used Amazon EBS for their Apache Kafka and Apache Hadoop workload to improve cost and performance. We also describe best practices and share tips for success throughout.
Increasingly, valuable customer data sources are dispersed among on-premises data centers, SaaS providers, partners, third-party data providers, and public datasets. Building a data lake on AWS offers a foundation for storing on-premises, third-party, and public datasets cost effectively with high performance. This workshop introduces AWS tools and technologies you can use to analyze and extract value from petabyte-scale datasets, including Amazon Athena and Amazon Redshift Spectrum.
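As a minimal illustration of the in-place querying model behind Amazon Athena, the sketch below generates the DDL for an external table over CSV files in S3. The database, table, columns, and bucket path are hypothetical; submitting the statement (commented out) would require AWS credentials.

```python
def athena_external_table(database, table, s3_location):
    """Build DDL for an Athena external table over CSV files in S3.
    Athena queries the files where they sit; no data is loaded."""
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {database}.{table} (\n"
        "  event_time string,\n"
        "  user_id    string,\n"
        "  amount     double\n"
        ")\n"
        "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n"
        f"LOCATION '{s3_location}'"
    )

# Hypothetical data-lake layout
ddl = athena_external_table("salesdb", "orders", "s3://my-data-lake/orders/")
# boto3.client("athena").start_query_execution(QueryString=ddl, ...)  # needs AWS credentials
print(ddl.splitlines()[0])
```

Once the table is defined, ordinary SQL `SELECT` statements against it scan the S3 objects directly, which is the same pattern Redshift Spectrum uses for external tables.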
Self-Service Analytics with AWS Big Data and Tableau - ARC217 - re:Invent 2017
As one of the thought leaders in Expedia's cloud migration, the Expedia Global Payments Business Intelligence group architected, designed, and built a complete cloud data mart solution from the ground up using AWS and Tableau Online. In this session, we will discuss our business challenge, the journey to the solution, the high-level technical architecture (using S3, EMR, data pipelines, Redshift, and Tableau Online), and lessons learned along the way, including best practices and optimization methods.
Session sponsored by Tableau
Migrate your databases to AWS with minimal downtime. More than 80,000 databases have been migrated using AWS Database Migration Service.
Speakers:
Rajanikanth Bhargava Chilakapati - Solutions Architect, AWS
Chris England - Sr. Technical Account Manager, AWS
by Ben Willett, Solutions Architect, AWS
Database Week at the AWS Loft is an opportunity to learn about Amazon’s broad and deep family of managed database services. These services provide easy, scalable, reliable, and cost-effective ways to manage your data in the cloud. We explain the fundamentals and take a technical deep dive into Amazon RDS and Amazon Aurora relational databases, Amazon DynamoDB non-relational databases, Amazon Neptune graph databases, and Amazon ElastiCache managed Redis, along with options for database migration, caching, search, and more. You will learn how to get started, how to support applications, and how to scale.
DAT317_Migrating Databases and Data Warehouses to the Cloud
In this introductory session, we look at how to convert and migrate your commercial databases and data warehouses to the cloud and gain your database freedom. AWS Database Migration Service (AWS DMS) and AWS Schema Conversion Tool (AWS SCT) have been used to migrate tens of thousands of databases. These include Oracle and SQL Server to Amazon Aurora, Teradata and Netezza to Amazon Redshift, MongoDB to Amazon DynamoDB, and many other data source and target combinations. Learn how to easily and securely migrate your data and procedural code, enjoy flexibility and cost savings, and gain new opportunities.
Optimizing Storage for Enterprise Workloads and Migrations (STG202) - AWS re:Invent
In this session, we focus on best practices for AWS block and file storage when supporting enterprise workloads (like SAP, Oracle, Microsoft applications, and home directories). We discuss migrating mission-critical workload data, selecting volumes or file systems, optimizing performance, and designing for durability and availability. We also review optimizing for cost to ensure that your lift-and-shift project is a success.
How to Build Forecasting Services Using ML and Deep Learning Algorithms
Forecasting is an important process for many companies and is used in many areas to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a temporal component and then use an algorithm that produces an accurate forecast based on the type of data analyzed.
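The pre-process-then-forecast flow described above can be illustrated with a deliberately simple sketch: forward-filling missing points in a time series, then producing a naive moving-average forecast. The demand figures are invented, and the session itself covers far more sophisticated algorithms; this only shows the shape of the pipeline.

```python
from statistics import mean

def forward_fill(series):
    """Replace missing points (None) with the last observed value,
    a basic pre-processing step for time-indexed data."""
    filled, last = [], None
    for x in series:
        last = x if x is not None else last
        filled.append(last)
    return filled

def moving_average_forecast(series, window=3):
    """Naive forecast: the next value is the mean of the last `window` points."""
    return mean(series[-window:])

# Hypothetical monthly demand with one missing observation
demand = forward_fill([120, 130, None, 140, 150, 145])
print(moving_average_forecast(demand))  # mean of the last three points
```

Real forecasting services replace the moving average with learned models, but the cleaning step and the notion of a trailing window carry over directly.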
Big Data for Startups: How to Create Big Data Applications in Serverless Mode
The variety and quantity of data created every day is accelerating faster and faster and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment accessible only to established companies. But the elasticity of the cloud, and Serverless services in particular, let us break through these limits.
Let's see how it is possible to develop Big Data applications quickly, without worrying about infrastructure, dedicating all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago Amazon went through a radical transformation aimed at increasing the pace of innovation. Over this period we learned how changing our approach to application development allowed us to greatly increase agility and release speed and, ultimately, to build more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not only application architecture but also organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to Spend Up to 90% Less with Containers and Spot Instances
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, delivering average savings of 70% compared to On-Demand instances. In this session we will look at the characteristics of Spot Instances and how they can easily be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of different kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda:
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make Your Startup's Offering Unique in the Market with Machine Learning Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative, purpose-built components.
AWS provides ready-to-use services and, at the same time, lets you customize and create the differentiating elements of your own offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, with the help of a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: Automate the Management and Deployment of Your EC2 Instances
With the traditional approach to IT, implementing DevOps techniques was difficult for many years; they have often involved manual activities, occasionally leading to application downtime that interrupted users' work. With the advent of the cloud, DevOps techniques are within everyone's reach at low cost for any kind of workload, ensuring greater system reliability and significantly improving business continuity.
AWS provides AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances through Chef and Puppet.
Learn how to use AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to Support Your Windows Workloads
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss the options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we will explore what AWS services make possible when applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are hosting a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in cloud environments based on VMware vSphere® and access a broad range of AWS services, fully exploiting the potential of the AWS cloud while protecting existing VMware investments.
Many organizations reap the benefits of the cloud by migrating their Oracle workloads, securing significant gains in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, and performance risks can be introduced when moving applications out of on-premises data centers.
Build Your First Serverless Ledger-Based App with QLDB and NodeJS
Many companies today build applications with ledger-like functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply-chain flow of their products.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB removes the need to build custom, complex systems by providing a fully managed serverless ledger database.
In this session we will see how to build a complete serverless application that uses QLDB's capabilities.
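To show what "cryptographically verifiable" means in a ledger, here is a minimal hash-chain sketch. It is an illustration of the underlying idea, not the QLDB API: each entry's hash covers the previous entry's hash, so tampering with any record breaks verification of everything after it. The account data is invented.

```python
import hashlib
import json

def append_entry(chain, data):
    """Append a transaction whose hash covers the previous entry's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(data, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"data": data, "prev": prev_hash, "hash": entry_hash})
    return chain

def verify(chain):
    """Recompute every hash; any tampered entry breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        payload = json.dumps(entry["data"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

ledger = []
append_entry(ledger, {"account": "A-123", "debit": 50})
append_entry(ledger, {"account": "A-123", "credit": 20})
print(verify(ledger))   # the intact chain verifies
ledger[0]["data"]["debit"] = 5000  # tampering with history...
print(verify(ledger))   # ...is detected
```

A managed ledger database layers journaling, digests, and query support on top of this basic property so you do not have to build and operate it yourself.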
With the rise of microservices architectures and rich mobile and web applications, APIs are more important than ever for delivering a great user experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dive into several scenarios, showing how AppSync can help solve these use cases by building modern APIs with real-time and offline data-update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Databases and VMware Cloud™ on AWS: Myths to Debunk
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads while accelerating the transformation to the cloud; they take a deeper look at the architecture and show how to fully exploit the potential of VMware Cloud™ on AWS.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies managing Docker containers through an orchestration layer controlling deployment and lifecycle. In this session we will present the service's main features, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.