The document discusses Amazon Elasticsearch Service (Amazon ES) and how it can be used for log analytics. Amazon ES is a fully managed service that makes it easy to deploy, manage, and scale Elasticsearch and Kibana in AWS. It allows users to ingest and analyze log data in real time to gain valuable insights from machine-generated data. The document provides examples of how various organizations use Amazon ES for infrastructure monitoring, application monitoring, container monitoring, and security information and event management. It also covers best practices for scaling Amazon ES as data volume increases.
Implementing advanced design patterns for Amazon DynamoDB - ADB401 - Chicago ... | Amazon Web Services
Amazon DynamoDB is an internet-scale database that offers single-digit millisecond performance. In this session, intended for those who already have some familiarity with DynamoDB, you learn how to apply the design patterns covered in the DynamoDB deep dive session and hands-on labs for DynamoDB. We discuss patterns and data models that summarize a collection of implementations and best practices used by the Amazon CDO to deliver highly scalable solutions for a wide variety of business problems. We examine strategies for GSI sharding and index overloading, scalable graph processing with materialized queries, relational modeling with composite keys, executing transactional workflows on DynamoDB, and much more.
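As a minimal sketch of the write-sharding idea behind GSI sharding (the names and shard count here are illustrative assumptions, not taken from the session): a hot partition key is spread across N suffixed keys so writes distribute across partitions, and readers fan out across all suffixes and merge results.

```python
import hashlib

N_SHARDS = 10  # hypothetical shard count; tune to your write throughput


def sharded_key(base_key: str, item_id: str) -> str:
    """Append a deterministic shard suffix so writes to a hot key
    spread across N_SHARDS partitions instead of hitting one."""
    shard = int(hashlib.md5(item_id.encode()).hexdigest(), 16) % N_SHARDS
    return f"{base_key}#{shard}"


def all_shard_keys(base_key: str):
    """Readers query every suffix and merge results client-side."""
    return [f"{base_key}#{s}" for s in range(N_SHARDS)]
```

Because the suffix is derived deterministically from the item ID, point reads for a known item can still compute the single shard to query, while scans over the logical key fan out across all `N_SHARDS` physical keys.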
Best practices for migrating big data workloads to Amazon EMR - ADB204 - Chic... | Amazon Web Services
Today, more and more organizations are saving IT costs by moving their analytics, data processing (Extract, Transform, and Load [ETL]), and data science workloads currently running on-premises on Apache Hadoop, Spark, and data warehouse appliances to Amazon EMR. In this session, we show how your organization not only can save money, but increase availability and improve performance with Amazon EMR. We demonstrate how to identify the components and workflows in your current environment and present you with the best practices to help you plan your migration of these workloads to AWS.
Database Freedom: Migrate a relational database to Amazon Aurora - ADB308 - N... | Amazon Web Services
Database Freedom is an AWS initiative that accelerates enterprise migrations from commercial databases, like Oracle and SQL Server, to AWS native database services or managed open-source systems. We review the basics of the Amazon purpose-built database strategy and cover our Workload Qualification Framework, which helps you determine a good database migration candidate and predict the level of effort. In the hands-on lab, you use AWS Schema Conversion Tool and AWS Database Migration Service to migrate your databases to Amazon Aurora PostgreSQL. Bring a laptop with Firefox or Chrome and a working AWS account. We provide an AWS CloudFormation template to configure the lab environment and help you practice a database migration scenario.
Deep dive on Amazon S3 Glacier Deep Archive - STG301 - Santa Clara AWS Summit | Amazon Web Services
Many organizations need to retain multiple PBs of data to meet business and regulatory compliance requirements, and many choose on-premises magnetic tape libraries or off-premises tape archival services, which are expensive and onerous to maintain. In this session, we dive into Amazon S3 Glacier Deep Archive, which enables customers with large datasets to eliminate the cost and management of tape infrastructure while ensuring that data is preserved for future use and analysis. S3 Glacier Deep Archive is Amazon S3’s lowest-cost storage class. Learn how it supports long-term retention and digital preservation of data that won’t be regularly accessed, if ever.
Building with AWS Databases: Match Your Workload to the Right Database | AWS ... | AWS Summits
In this session, we discuss the ideal use cases for relational and nonrelational data services, including Amazon ElastiCache for Redis, Amazon DynamoDB, Amazon Aurora, Amazon Neptune, Amazon Elasticsearch Service, Amazon Timestream, Amazon QLDB, and Amazon DocumentDB. We focus on how to evaluate a new workload for the best managed database option.
“Lift and shift” storage for business-critical applications - STG203 - New Yo... | Amazon Web Services
Among your company’s top priorities should be ensuring that its data is safely and securely persisted. But beyond data integrity, you also need to ensure availability. In this session, learn best practices for AWS block and file storage when supporting business-critical applications such as SAP HANA, Oracle RAC, Microsoft SQL Server, MySQL, Cassandra, and home directories. We discuss migrating mission-critical workload data, selecting volumes or file systems, maximizing performance, and designing for durability and availability. You also learn how to optimize for cost to make sure your “lift and shift” project is a complete success.
Best practices for running Spark jobs on Amazon EMR with Spot Instances | AWS... | Amazon Web Services
In this session, we focus on cost-optimizing and efficiently running Spark applications on Amazon EMR by using Spot Instances. There are several best practices you should follow to increase the fault tolerance of your Spark applications and make use of Spot Instances without compromising availability or impacting the performance or duration of your jobs.
Drive innovation in Financial Services with Amazon EC2 - CMP204 - New York AW... | Amazon Web Services
The landscape of the Financial Services industry is changing through new types of risks, an explosion of data, and evolving customer expectations. Join this session to learn how compute solutions powered by Amazon EC2 are helping financial services companies drive innovation and transform their businesses. We highlight how financial services companies are leveraging machine learning and grid computing to minimize risk, optimize investments, and meet customer demands. We also cover how AWS Outposts extends the AWS Cloud on-premises for a consistent hybrid cloud experience.
Machine learning for developers & data scientists with Amazon SageMaker - AIM... | Amazon Web Services
Machine learning (ML) offers innovation for every business. But until recently, developing ML models took time and effort, making it difficult for developers to get started. In this session, we demonstrate how Amazon SageMaker, a fully managed service that enables developers and data scientists to build, train, and deploy ML models at scale, overcomes those challenges. We review its capabilities, including data labeling, model building, model training, tuning, and production hosting.
Modernize your data warehouse with Amazon Redshift - ADB305 - New York AWS Su... | Amazon Web Services
Can you set up a data warehouse and create a dashboard in less than 60 minutes? You can with Amazon Redshift, a fully managed cloud data warehouse that provides first-rate performance at a low cost. In this workshop, you learn the steps and best practices to deploy your data warehouse in your organization. You also see how to query across petabytes of data in your data warehouse and exabytes of data in your Amazon S3 data lake. Finally, you learn how to easily migrate from traditional or on-premises data warehouses.
Running Amazon Elastic Compute Cloud (Amazon EC2) workloads at scale - CMP202... | Amazon Web Services
Amazon EC2 Fleet makes it easy to optimize compute performance and cost by blending Amazon EC2 Spot, On-Demand, and Reserved Instances purchasing models. In this session, we learn how to use the power of Amazon EC2 Fleet with AWS services such as AWS Auto Scaling, Amazon Elastic Container Service (Amazon ECS), Amazon Elastic Container Service for Kubernetes (Amazon EKS), Amazon EMR, AWS Batch, AWS Thinkbox Deadline, and AWS OpsWorks to programmatically optimize costs while maintaining high performance and availability. We also discuss cost-optimization patterns for workloads such as containers, web services, CI/CD, and big data.
What's new in Amazon Aurora - ADB207 - New York AWS Summit | Amazon Web Services
Amazon Aurora is a fully managed relational database that runs on Amazon RDS and offers versions compatible with MySQL and PostgreSQL. Aurora provides the speed, reliability, and availability of commercial databases at a fraction of the cost and is faster than standard MySQL and PostgreSQL databases. In this session, we provide an overview of Aurora, exploring recently announced features, such as serverless, multi-master, and performance insights. We also discuss what you need to get your organization started with Aurora.
Increase the value of video with machine learning & AWS Media Services - SVC3... | Amazon Web Services
With the advancement of machine learning applications, new business opportunities are rapidly emerging in media. In this session, you learn how the AWS Media2Cloud solution can save time and reduce costs by setting up a serverless end-to-end ingest workflow that moves your video assets and associated metadata to the cloud. You gain insight into making those assets even more valuable by enabling search and indexing on your video library, and you learn how to use Amazon Transcribe and Amazon Translate to take your live-streaming workflows to the next level by automatically creating multilanguage subtitles.
This session covers best practices, features, and capabilities that users of Microsoft products can leverage in AWS. We emphasize Windows Server, Microsoft SQL Server, Active Directory, and .NET capabilities available and deeply integrated in AWS. With these learnings, you can extend the value of your Microsoft investments, lower total cost of ownership (TCO), and keep users working in familiar environments.
Analyzing and processing streaming data with Amazon EMR - ADB204 - New York A... | Amazon Web Services
Customers regularly use Apache Spark running on Amazon EMR to process large amounts of data. As time to insight and the ability to act quickly based on those insights become core differentiators for customers, there is a greater need to be able to analyze data in real time. In this session, we teach you several design patterns to process and analyze real-time streaming data using Amazon EMR and Amazon Kinesis data services.
Microservices on AWS: Architectural Patterns and Best Practices | AWS Summit ... | AWS Summits
This session is the first of five sessions covering a fully functioning system we built to demonstrate how to rapidly develop systems on the AWS platform. We start with a demo and an architecture review in which we break the system into its different subsystems; in the second part of the session, we zoom in on the microservices part of the solution. Microservices are an architectural and organizational approach to software development in which software is composed of small, independent services that communicate over well-defined APIs. This session demonstrates the use of services like Amazon ECS, AWS Cloud Map, and Amazon API Gateway, and it can help you understand where a microservices architecture fits in your own organization and where it offers potential savings and increased agility.
Resiliency and Availability Design Patterns for the Cloud | Amazon Web Services
We have traditionally built robust software systems by trying to avoid mistakes, by dodging failures when they occur in production, or by testing parts of the system in isolation from one another. Modern methods and techniques take a very different approach based on resiliency, which promotes embracing failure instead of trying to avoid it. Resilient architectures enhance observability and leverage well-known patterns such as graceful degradation, timeouts, and circuit breakers. In this session, we review the most useful patterns for building resilient software systems and show how you can benefit from them.
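The circuit-breaker pattern mentioned above can be sketched in a few lines (a toy illustration under assumed thresholds and names, not a production library or the session's own code): after a run of consecutive failures, calls fail fast instead of hammering a struggling dependency, and after a cool-down one trial call is allowed through.

```python
import time


class CircuitBreaker:
    """Minimal circuit breaker: after `max_failures` consecutive failures
    the circuit opens and calls fail fast until `reset_timeout` seconds
    pass, at which point one trial call is allowed through (half-open)."""

    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # any success closes the circuit again
        return result
```

Failing fast while open is what enables graceful degradation: the caller gets an immediate, cheap error it can translate into a fallback response rather than a slow timeout.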
Solutions for data migration and management in Amazon Web Services | Amazon Web Services
AWS Summit Milano 2019 - Solutions for data migration and management in Amazon Web Services - Antonio Aga Rossi, Global Accounts Solutions Architect, AWS
What’s new with Amazon Redshift, featuring ZS Associates - ADB205 - Chicago A... | Amazon Web Services
No organization can afford a data warehouse that scales slowly or forces tradeoffs between performance and concurrency. Amazon Redshift scales to provide consistently fast performance with rapidly growing data as well as high user and query concurrency for more than 10,000 customers, including ZS Associates, a professional-services firm serving primarily the Pharmaceutical and Healthcare industries. In this session, we learn how they migrated data-warehousing workloads to Amazon Redshift for scale, agility, cost savings, and performance gain. In addition, they describe their pilot-based approach to migration and the key outcomes achieved. Finally, we highlight recently released and soon-to-come features in Amazon Redshift.
Optimize your workloads with Amazon EC2 and AMD EPYC - DEM03-SR - New York AW... | Amazon Web Services
Customers are always looking to optimize the performance and cost of their workloads. With new Amazon EC2 instances featuring AMD EPYC processors, customers can do just that. Join AMD and AWS as they jointly showcase how Amazon EC2 M5a, R5a, and T3a instances can save you 10% on infrastructure cost for right-sized workloads. Learn the benefits, use cases, and customer successes of these new instances. This presentation is brought to you by AWS partner AMD (Advanced Micro Devices).
No Hassle NoSQL - Amazon DynamoDB & Amazon DocumentDB | AWS Summit Tel Aviv ... | AWS Summits
NoSQL databases are a great fit for many modern applications, such as mobile, web, and gaming, that require flexible, scalable, high-performance, and highly functional databases to deliver great user experiences; however, they can be hard to manage and demand high proficiency and attention. In this session, we present Amazon DynamoDB, a fully managed, multi-region, multi-master database that provides consistent single-digit millisecond latency at any scale.
High Performance Computing on AWS and Industry Simulation | Amazon Web Services
High Performance Computing (HPC) on AWS enables engineers, analysts, and researchers to think beyond the limitations of on-premises HPC infrastructure. AWS HPC solutions address the infrastructure capacity, secure global collaboration, technology obsolescence, and capital expenditure constraints associated with on-premises HPC clusters, giving you the freedom to tackle the most challenging HPC workloads and get to your results faster. In this session, we provide a quick overview of the services that make up the HPC on AWS solution and share customer success stories across multiple industries, such as Financial Services and Life Sciences.
Modernize your data warehouse with Amazon Redshift - ADB305 - Atlanta AWS Summit | Amazon Web Services
Can you set up a data warehouse and create a dashboard in under 60 minutes? In this workshop, we show you how with Amazon Redshift, a fully managed cloud data warehouse that provides first-rate performance at the lowest cost for queries across your data warehouse and data lake. Learn the steps and best practices for deploying your data warehouse in your organization. Also, learn how to query petabytes of data in your data warehouse and exabytes of data, without loading or moving, in your Amazon S3 data lake. Finally, learn how to easily migrate from traditional or on-premises data warehouses.
As the official MongoDB-as-a-Service offering from MongoDB Inc., the maker of MongoDB, Atlas is becoming a very popular service for those who wish to build their applications in the cloud, whether on AWS, Azure, or GCP. A lesser-known product offered on the Atlas platform is Stitch, a group of services designed to interact with Atlas in every conceivable way, including creating endpoints, triggers, user authentication flows, and serverless functions, with a UI to handle all of this. Put together, you have a serverless solution running on top of the MongoDB cloud.
Amazon EC2 A1 instances, powered by the AWS Graviton processor - CMP303 - San... | Amazon Web Services
Amazon EC2 A1 instances are the first EC2 instances powered by Arm-based AWS Graviton processors. They deliver significant cost savings for scale-out and Arm-based applications, such as web servers, containerized microservices, caching fleets, and distributed data stores.
Introducing Open Distro for Elasticsearch - ADB201 - New York AWS Summit | Amazon Web Services
Open Distro for Elasticsearch is a 100% open-source distribution of Elasticsearch, the popular search and analytics engine. In this session, we explore its many new advanced features—previously available only in commercial software—including encryption in transit, role-based access control (RBAC), event monitoring and alerting, SQL support, cluster diagnostics, and more. We also show you how you can join the Open Distro for Elasticsearch community to accelerate open innovation for Elasticsearch.
Ask me anything about building data lakes on AWS - ADB209 - New York AWS Summit | Amazon Web Services
Bring your questions and learn how AWS delivers an integrated suite of services that provide everything needed to build and manage a data lake for analytics. Discover how AWS-powered data lakes can handle the scale, agility, and flexibility required to combine different types of data and analytics approaches. You can also learn how to set up security and granular access controls for multiple analytics services.
Build your own log analytics solution on AWS - ADB301 - Atlanta AWS Summit | Amazon Web Services
With the simplicity of Amazon Elasticsearch Service (Amazon ES) comes many opportunities to use it as a backend for real-time application and infrastructure monitoring. With this wealth of opportunities comes sprawl; developers in your organization are deploying Amazon ES for many different workloads. Should you centralize into one Amazon ES domain? What are the tradeoffs in scale and cost? How do you control access to the data and dashboards? Do you structure your indexes as single tenant or multi-tenant? In this session, we explore whether, when, and how to centralize logging across your organization and discover how Autodesk built a unified log analytics solution using Amazon ES.
Searching for patterns: Log analytics using Amazon ES - ADB205 - New York AWS... | Amazon Web Services
Amazon Elasticsearch Service gives customers many options for log analytics. From small environments with a single application to large environments where multiple teams log five terabytes or more per day with retention periods that span months, Amazon ES provides a tool kit that gives organizations a holistic view of their application logs. In this session, we discuss effective patterns leveraged by organizations across the AWS ecosystem and give you foundational knowledge and deployment architectures that will accelerate your goals of building a cost-effective logging solution.
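A foundational pattern behind retention periods like those above is time-based indices: each day's logs go into their own index (e.g. `logs-2019.07.15`, a common community naming convention, not something prescribed by this session), so enforcing retention means deleting whole old indices rather than running expensive per-document deletes. A minimal sketch of the naming and expiry logic:

```python
from datetime import date, timedelta


def daily_index(prefix: str, day: date) -> str:
    """Build the conventional time-based index name, e.g. logs-2019.07.15."""
    return f"{prefix}-{day:%Y.%m.%d}"


def expired_indices(prefix: str, today: date, retention_days: int,
                    lookback: int = 365):
    """List index names older than the retention window, newest first.
    Dropping these indices wholesale is how Elasticsearch log clusters
    typically enforce retention cheaply."""
    return [
        daily_index(prefix, today - timedelta(days=d))
        for d in range(retention_days, lookback)
    ]
```

In practice a scheduled job (or a tool such as Curator) computes this list and issues delete-index calls; the helper above only shows the date arithmetic.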
Machine learning for developers & data scientists with Amazon SageMaker - AIM...Amazon Web Services
Machine learning (ML) offers innovation for every business. But until recently, developing ML models took time and effort, making it difficult for developers to get started. In this session, we demonstrate how Amazon SageMaker, a fully managed service that enables developers and data scientists to build, train, and deploy ML models at scale, overcomes those challenges. We review its capabilities, including data labeling, model building, model training, tuning, and production hosting.
Modernize your data warehouse with Amazon Redshift - ADB305 - New York AWS Su...Amazon Web Services
Can you set up a data warehouse and create a dashboard in less than 60 minutes? You can with Amazon Redshift, a fully managed cloud data warehouse that provides first-rate performance at a low cost. In this workshop, you learn the steps and best practices to deploy your data warehouse in your organization. You also see how to query across petabytes of data in your data warehouse and exabytes of data in your Amazon S3 data lake. Finally, you learn how to easily migrate from traditional or on-premises data warehouses.
Running Amazon Elastic Compute Cloud (Amazon EC2) workloads at scale - CMP202...Amazon Web Services
Amazon EC2 Fleet makes it easy to optimize compute performance and cost by blending Amazon EC2 Spot, On-Demand, and Reserved Instances purchasing models. In this session, we learn how to use the power of Amazon EC2 Fleet with AWS services such as AWS Auto Scaling, Amazon Elastic Container Service (Amazon ECS), Amazon Elastic Container Service for Kubernetes (Amazon EKS), Amazon EMR, AWS Batch, AWS Thinkbox Deadline, and AWS OpsWorks to programmatically optimize costs while maintaining high performance and availability. We also discuss cost-optimization patterns for workloads such as containers, web services, CI/CD, and big data.
What's new in Amazon Aurora - ADB207 - New York AWS SummitAmazon Web Services
Amazon Aurora is a fully managed relational database that runs on Amazon RDS and offers versions compatible with MySQL and PostgreSQL. Aurora provides the speed, reliability, and availability of commercial databases at a fraction of the cost and is faster than standard MySQL and PostgreSQL databases. In this session, we provide an overview of Aurora, exploring recently announced features, such as serverless, multi-master, and performance insights. We also discuss what you need to get your organization started with Aurora.
Increase the value of video with machine learning & AWS Media Services - SVC3...Amazon Web Services
With the advancement of machine learning applications, new business opportunities are rapidly emerging in media. In this session, you learn how the AWS Media2Cloud solution can save time and reduce costs by setting up a serverless end-to-end ingest workflow to move your video assets and associated metadata to the cloud. You gain insight into how to make those assets even more valuable by enabling searching and indexing on your video library and learn how to use Amazon Transcribe and Amazon Translate to take your live-streaming workflows to the next level with expert instruction on how to automatically create multilanguage subtitles.
This session covers best practices, features, and capabilities that users of Microsoft products can leverage in AWS. We emphasize Windows Server, Microsoft SQL Server, Active Directory, and .NET capabilities available and deeply integrated in AWS. With these learnings, you can extend the value of your Microsoft investments, lower total cost of ownership (TCO), and keep users working in familiar environments.
Analyzing and processing streaming data with Amazon EMR - ADB204 - New York A...Amazon Web Services
Customers regularly use Apache Spark running on Amazon EMR to process large amounts of data. As time to insight and the ability to act quickly based on those insights become core differentiators for customers, there is a greater need to be able to analyze data in real time. In this session, we teach you several design patterns to process and analyze real-time streaming data using Amazon EMR and Amazon Kinesis data services.
Microservices on AWS: Architectural Patterns and Best Practices | AWS Summit ...AWS Summits
This session is the first of 5 sessions that will cover a fully functioning system we have built to demonstrate how to rapidly develop systems using the AWS platform. This session we will start with a demo and an architecture review in which we will break into the different subsystems. In the second part of the session we will zoom into the Microservices part of the solution.Microservices are an architectural and organizational approach to software development where software is composed of small independent services that communicate over well-defined APIs. This session demonstrates the use of services like Amazon ECS, AWS Cloud Map and Amazon API Gateway and can help you understand where you can utilize microservices architecture in your own organization and understand areas of potential savings and increased agility.
Resiliency-and-Availability-Design-Patterns-for-the-CloudAmazon Web Services
We have traditionally built robust software systems by trying to avoid mistakes and by dodging failures when they occur in production or by testing parts of the system in isolation from one another. Modern methods and techniques take a very different approach based on resiliency, which promotes embracing failure instead of trying to avoid it. Resilient architectures enhance observability, leverage well-known patterns such as graceful degradation, timeouts and circuit breakers. In this session, will review the most useful patterns for building resilient software systems and especially show the audience how they can benefit from the patterns.
Soluzioni per la migrazione e gestione dei dati in Amazon Web ServicesAmazon Web Services
AWS Summit Milano 2019 - Soluzioni per la migrazione e gestione dei dati in Amazon Web Services - Antonio Aga Rossi, Global Accounts Solutions Architect, AWS
What’s new with Amazon Redshift, featuring ZS Associates - ADB205 - Chicago A...Amazon Web Services
No organization can afford a data warehouse that scales slowly or forces tradeoffs between performance and concurrency. Amazon Redshift scales to provide consistently fast performance with rapidly growing data as well as high user and query concurrency for more than 10,000 customers, including ZS Associates, a professional-services firm serving primarily the Pharmaceutical and Healthcare industries. In this session, we learn how they migrated data-warehousing workloads to Amazon Redshift for scale, agility, cost savings, and performance gain. In addition, they describe their pilot-based approach to migration and the key outcomes achieved. Finally, we highlight recently released and soon-to-come features in Amazon Redshift.
Optimize your workloads with Amazon EC2 and AMD EPYC - DEM03-SR - New York AW...Amazon Web Services
Customers are always looking to optimize the performance and cost for their workloads. With new Amazon EC2 instances featuring AMD EPYC processors, customer can do just that. Join AMD and AWS as they jointly showcase how Amazon EC2 M5a, R5a, and T3a instances can save you 10% on infrastructure cost for right-sized workloads. Learn the benefits, use cases, and customer successes of these new instances. This presentation is brought to you by AWS partner, AMD (Advanced Micro Devices).
No Hassle NoSQL - Amazon DynamoDB & Amazon DocumentDB | AWS Summit Tel Aviv ...AWS Summits
NoSQL databases are a great fit for many modern applications, such as mobile, web, and gaming, that require flexible, scalable, high-performance, and highly functional databases to provide great user experiences, but they can be hard to manage and require high proficiency and attention. In this session we will present Amazon DynamoDB, a fully managed, multi-region, multi-master database that provides consistent single-digit-millisecond latency at any scale.
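As a sketch of the key-value access pattern DynamoDB is built around, the helper below marshals an item for the low-level API, in which every attribute is explicitly typed. The table schema (a `player_id`/`game_id` composite key) and the table name in the comment are hypothetical.

```python
def make_game_item(player_id, game_id, score):
    """Build a DynamoDB low-level-API item for a hypothetical scores table.
    S = string, N = number; numbers travel as strings to preserve precision."""
    return {
        "player_id": {"S": player_id},  # partition key (illustrative schema)
        "game_id": {"S": game_id},      # sort key
        "score": {"N": str(score)},
    }

# With credentials configured, the item could be written with boto3:
#   boto3.client("dynamodb").put_item(
#       TableName="GameScores",
#       Item=make_game_item("player-1", "game-42", 9800))
```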
High Performance Computing on AWS and Industry SimulationAmazon Web Services
High Performance Computing on AWS enables engineers, analysts, and researchers to think beyond the limitations of on-premises HPC infrastructure. AWS HPC solutions address the infrastructure capacity, secure global collaboration, technology obsolescence, and capital expenditure constraints associated with on-premises HPC clusters to give you the freedom to tackle the most challenging HPC workloads and get to your results faster. In this session, we will provide a quick overview of the services that make up the HPC on AWS solution, and share customer success stories across multiple industries, such as Financial Services and Life Sciences.
Modernize your data warehouse with Amazon Redshift - ADB305 - Atlanta AWS SummitAmazon Web Services
Can you set up a data warehouse and create a dashboard in under 60 minutes? In this workshop, we show you how with Amazon Redshift, a fully managed cloud data warehouse that provides first-rate performance at the lowest cost for queries across your data warehouse and data lake. Learn the steps and best practices for deploying your data warehouse in your organization. Also, learn how to query petabytes of data in your data warehouse and exabytes of data, without loading or moving, in your Amazon S3 data lake. Finally, learn how to easily migrate from traditional or on-premises data warehouses.
As the official MongoDB-as-a-Service offering from MongoDB Inc., the maker of MongoDB, Atlas is becoming a very popular service for those who wish to build their applications in the cloud, whether on AWS, Azure, or GCP. One lesser-known cloud product offered on the Atlas platform is Stitch, a group of services designed to interact with Atlas in every conceivable way, including creating endpoints, triggers, user-authentication flows, serverless functions, and a UI to handle all of this. Put together, these give you a serverless solution running on top of the MongoDB cloud.
Amazon EC2 A1 instances, powered by the AWS Graviton processor - CMP303 - San...Amazon Web Services
Amazon EC2 A1 instances are the first EC2 instances powered by Arm-based AWS Graviton processors. They deliver significant cost savings for scale-out and Arm-based applications, such as web servers, containerized microservices, caching fleets, and distributed
Introducing Open Distro for Elasticsearch - ADB201 - New York AWS SummitAmazon Web Services
Open Distro for Elasticsearch is a 100% open-source distribution of Elasticsearch, the popular search and analytics engine. In this session, we explore its many new advanced features—previously available only in commercial software—including encryption in transit, role-based access control (RBAC), event monitoring and alerting, SQL support, cluster diagnostics, and more. We also show you how you can join the Open Distro for Elasticsearch community to accelerate open innovation for Elasticsearch.
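Open Distro's SQL support accepts queries over HTTP. A minimal sketch of building such a request follows; the `logs` index name is illustrative, and authentication and transport details are omitted.

```python
import json

def sql_request(query):
    """Build the endpoint path and JSON body for Open Distro's SQL plugin,
    which accepts a POST to /_opendistro/_sql with a {"query": ...} body."""
    return "/_opendistro/_sql", json.dumps({"query": query})

# Example: aggregate log entries by HTTP status (index name is illustrative).
path, body = sql_request("SELECT status, COUNT(*) FROM logs GROUP BY status")
# The pair could then be sent with any HTTP client against the cluster URL.
```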
Ask me anything about building data lakes on AWS - ADB209 - New York AWS SummitAmazon Web Services
Bring your questions and learn how AWS delivers an integrated suite of services that provide everything needed to build and manage a data lake for analytics. Discover how AWS-powered data lakes can handle the scale, agility, and flexibility required to combine different types of data and analytics approaches. You can also learn how to set up security and granular access controls for multiple analytics services.
Build your own log analytics solution on AWS - ADB301 - Atlanta AWS SummitAmazon Web Services
With the simplicity of Amazon Elasticsearch Service (Amazon ES) come many opportunities to use it as a backend for real-time application and infrastructure monitoring. With this wealth of opportunities comes sprawl; developers in your organization are deploying Amazon ES for many different workloads. Should you centralize into one Amazon ES domain? What are the tradeoffs in scale and cost? How do you control access to the data and dashboards? Do you structure your indexes as single tenant or multi-tenant? In this session, we explore whether, when, and how to centralize logging across your organization and discover how Autodesk built a unified log analytics solution using Amazon ES.
Searching for patterns: Log analytics using Amazon ES - ADB205 - New York AWS...Amazon Web Services
Amazon Elasticsearch Service gives customers many options for log analytics. From small environments with a single application to large environments where multiple teams log five terabytes or more per day with retention periods that span months, Amazon ES provides a tool kit that gives organizations a holistic view of their application logs. In this session, we discuss effective patterns leveraged by organizations across the AWS ecosystem and give you foundational knowledge and deployment architectures that will accelerate your goal of building a cost-effective logging solution.
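One common pattern in such deployments is daily index rotation, which turns retention into simply dropping old indexes, and per-team prefixes to keep tenants separable in a shared domain. A minimal sketch, with an illustrative naming scheme:

```python
from datetime import datetime, timezone

def index_name(app, ts, prefix=None):
    """Daily-rotated index name, e.g. 'logs-checkout-2019.07.10'.
    An optional per-team prefix separates tenants in a shared
    (multi-tenant) Amazon ES domain. The scheme is illustrative."""
    day = ts.strftime("%Y.%m.%d")
    parts = [prefix] if prefix else []
    parts += ["logs", app, day]
    return "-".join(parts)
```

With this scheme, enforcing a 30-day retention policy means deleting every index whose date suffix is older than 30 days, rather than deleting individual documents.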
Need for Speed – Intro To Real-Time Data Streaming Analytics on AWS | AWS Sum...AWS Summits
AWS provides multiple ways to ingest and process real-time data generated from sources such as edge devices, logs, websites, mobile apps, IoT devices, and more.
In this session we will compare the different tools and technologies and share best practices for when to use what.
The session will cover: Apache Kafka, Kinesis Data Streams/Firehose, MSK (Managed Kafka), Kinesis Data Analytics for SQL and Java (Flink), Apache Spark and more.
Real-time data has traditionally been analyzed using batch processing in DWH/Hadoop environments. Common use cases involve data lakes, data science, and machine learning (ML). Building serverless data-driven architectures and serverless streaming solutions with services like Amazon Kinesis, AWS Lambda, and Amazon Athena can solve real-time ingestion, storage, and analytics challenges and help you focus on application logic without managing infrastructure. Learn design patterns and best practices for serverless stream processing.
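As one concrete ingestion detail: the Kinesis PutRecords API accepts at most 500 records per call, so producers typically batch before sending. A sketch of the batching step (the stream name and record shape in the comment are hypothetical, and per-record and per-call byte limits are ignored here):

```python
def batch_records(records, max_batch=500):
    """Split records into batches that respect PutRecords' 500-records-per-call
    limit. Size limits (1 MB per record, 5 MB per call) are not handled in
    this sketch."""
    return [records[i:i + max_batch] for i in range(0, len(records), max_batch)]

# Each batch would then be sent with boto3, e.g.:
#   kinesis = boto3.client("kinesis")
#   kinesis.put_records(StreamName="clickstream", Records=[
#       {"Data": json.dumps(r).encode(), "PartitionKey": r["user_id"]}
#       for r in batch])
```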
Cyber Data Lake: How CIS Analyzes Billions of Network Traffic Records per DayAmazon Web Services
As network traffic increased exponentially, the Center for Internet Security (CIS) needed a way to cost-effectively scale Albert, its IP traffic-monitoring tool. With over 10 terabytes of data and more than 10 billion logs for daily analysis, its existing on-premises architecture could no longer meet the performance requirements for providing low-latency analytics. Learn how CIS worked jointly with AWS Professional Services to develop an architecture that uses native AWS services, such as Amazon Athena, Amazon S3, Amazon EC2, and AWS Lambda, to build a scalable, cost-effective system. This new architecture provides ingestion, enrichment, and access to all of CIS's data in near real time, delivering in minutes answers that previously took days.
Serverless Stream Processing Pipeline Best Practices (SRV316-R1) - AWS re:Inv...Amazon Web Services
Real-time data has traditionally been analyzed using batch processing in DWH/Hadoop environments. Common use cases involve data lakes, data science, and machine learning (ML). Building serverless data-driven architectures and serverless streaming solutions with services like Amazon Kinesis, AWS Lambda, and Amazon Athena can solve real-time ingestion, storage, and analytics challenges and help you focus on application logic without managing infrastructure. In this session, we introduce design patterns and best practices, and share customer journeys from batch to real-time insights in building modern serverless data-driven architectures. Hear how Intel built the Intel Pharma Analytics Platform using a serverless architecture. This AI cloud-based offering enables remote monitoring of patients using an array of sensors, wearable devices, and ML algorithms to objectively quantify the impact of interventions and power clinical studies in various therapeutic conditions.
Everything You Need to Know About Big Data: From Architectural Principles to ...Amazon Web Services
In this session, we discuss architectural principles that help simplify big data analytics. We apply these principles to the various stages of big data processing: collect, store, process, analyze, and visualize. We discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, and durability. Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Nearly everything in IT - servers, applications, websites, connected devices, and more - generates discrete, time-stamped records of events called logs. Processing and analyzing these logs to gain actionable insights is log analytics. We'll look at how to centralize log analytics across multiple sources with Amazon Elasticsearch Service.
Speakers:
Lex Crosett - Solutions Architect, AWS
David Simcik - Solutions Architect, AWS
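To make "log analytics" concrete, the first step is usually turning raw log lines into structured, timestamped records that a search engine can index. A sketch for an Apache-style access log line (the format is the standard one; the sample line itself is made up):

```python
import re

# Apache-style access log line (the sample values are invented).
LINE = '203.0.113.7 - - [10/Jul/2019:06:25:24 +0000] "GET /index.html HTTP/1.1" 200 5213'

PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) (?P<size>\d+)'
)

def parse(line):
    """Turn one access-log line into a structured, timestamped record,
    converting numeric fields so they can be aggregated downstream."""
    m = PATTERN.match(line)
    return {k: int(v) if k in ("status", "size") else v
            for k, v in m.groupdict().items()}
```

Records in this shape can then be bulk-indexed into Amazon ES and aggregated in Kibana, for example by status code or request path.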
In this session, we show you how to understand what data you have, how to drive insights, and how to make predictions using purpose-built AWS services. Learn about the common pitfalls of building data lakes and discover how to successfully drive analytics and insights from your data. Also learn how services such as Amazon S3, AWS Glue, Amazon Redshift, Amazon Athena, Amazon EMR, Amazon Kinesis, and Amazon Machine Learning (Amazon ML) services work together to build a successful data lake for various roles, including data scientists and business users.
Building Data Lakes for Analytics on AWS - ADB201 - Anaheim AWS SummitAmazon Web Services
AWS provides the most comprehensive, secure, scalable, and cost-effective portfolio of services for building data lakes for analytics. In this session, learn how to discover, load, store, catalog, prepare, and secure your data in a data lake. Then, learn to analyze with the largest choice of analytics approaches, including big data, data warehouse, operational, real-time streaming analytics, and even ML and AI. Ensure that your needs are met for existing and future analytics use cases, and discover how leading companies found success with their data lake initiatives.
AWS Portfolio: highlights of the AWS product categories with examplesAmazon Web Services
Amazon Web Services offers a broad range of global, cloud-based tools and services, giving customers the right tool for every need. This session presents an overview of the services AWS offers, with the most interesting examples. In particular, it covers core services such as computing, containers, serverless, storage, and databases, along with an introduction to Machine Learning and IoT.
Optimize data lakes with Amazon S3 - STG302 - Santa Clara AWS SummitAmazon Web Services
In this session, AWS experts dive into the benefits of Amazon S3 that customers are leveraging to build and manage their data lakes in the AWS Cloud. Learn about the Amazon S3 integrations with the AWS analytics suite and Amazon FSx for Lustre. Learn how to seamlessly run big data analytics, high performance computing applications, machine learning training models, and media data processing workloads across your Amazon S3 data lakes. We also cover the range of features that enable you to manage data with object-level granularity, configure and enforce finely tuned access policies, make changes to billions of objects with just a few clicks, enable cost efficiencies with the S3 storage classes, and audit and report on your data and Amazon S3 activities across your entire data lake.
Keynote: Customer Journey with Streaming Data on AWS - Rahul Pathak, AWSFlink Forward
Amazon Web Services (AWS) offers over 165 fully featured cloud services from data centers globally. AWS launched its first data streaming service, Amazon Kinesis Data Streams, over five years ago. Now, customers are using streaming data across most AWS services including two that support running Apache Flink, Amazon EMR and Amazon Kinesis Data Analytics. In this keynote, we will describe how customers and their use of streaming data has evolved on AWS. We will look at how streaming data and Apache Flink are used externally and internally on AWS, and where we see usage of Apache Flink growing.
Similar to Scalable, secure log analytics with Amazon ES - ADB302 - Chicago AWS Summit
How to build forecasting services using ML and deep learn...Amazon Web Services
Forecasting is an important process for a great many companies and is used in many areas to try to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a temporal component and then use an algorithm that, starting from the type of data analyzed, produces an accurate forecast.
Big Data for Startups: how to create Big Data applications in a Server...Amazon Web Services
The variety and volume of data created every day is accelerating ever faster and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment within reach only of established companies. But the elasticity of the cloud and, in particular, serverless services let us break through these limits.
Let's see, then, how to develop Big Data applications quickly, without worrying about infrastructure, devoting all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in just a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing its pace of innovation. Over that time we learned how changing our approach to application development allowed us to dramatically increase agility and release speed and, ultimately, build more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not only application architecture but also organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the one used by Amazon.com itself.
How to spend up to 90% less with containers and Spot Instances Amazon Web Services
Container usage keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, yielding average savings of 70% compared with On-Demand Instances. In this session we will look at what Spot Instances are and how they can easily be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have asked us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. AWS and FinConecta therefore invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda :
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make your startup's market offering unique with the Machine Lea...Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative, purpose-built components.
AWS provides ready-to-use services and, at the same time, lets you customize and build the differentiating elements of your offering.
Focusing on Machine Learning technologies, we will see how to choose among the artificial intelligence services AWS offers and, with the help of a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: automate the management and deployment of...Amazon Web Services
With the traditional approach to IT, implementing DevOps techniques was difficult for many years; they often involved manual work that occasionally led to application downtime and interrupted users' operations. With the advent of the cloud, DevOps techniques are now within everyone's reach, at low cost, for any kind of workload, ensuring greater system reliability and significantly improving business continuity.
AWS provides AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances through Chef and Puppet workloads.
Learn how to use AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to support your Windows WorkloadsAmazon Web Services
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and maturing at a rapid pace. In this webinar we will explore what AWS services make possible when applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are holding a free virtual event next Wednesday, October 14, from 12:00 to 13:00 dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in VMware vSphere®-based cloud environments and access a wide range of AWS services, taking full advantage of the AWS cloud while protecting existing VMware investments.
Many organizations reap the benefits of the cloud by migrating their Oracle workloads, securing significant gains in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, on top of performance risks that can be introduced when moving applications out of on-premises data centers.
Build your first serverless ledger-based app with QLDB and NodeJSAmazon Web Services
Many companies today build applications with ledger-style functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply-chain flow of their products.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB removes the need to build complex custom systems by providing a fully managed, serverless ledger database.
In this session we will see how to build a complete serverless application that uses QLDB's features.
With the rise of microservice architectures and rich mobile and web applications, APIs matter more than ever for giving end users a great user experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dig into several scenarios, understanding how AppSync can help solve these use cases by building modern APIs with real-time and offline data-update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Database and VMware Cloud™ on AWS: myths to debunkAmazon Web Services
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads, accelerating the transformation to the cloud; they dive into the architecture and show how to take full advantage of VMware Cloud™ on AWS.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies managing Docker containers through an orchestration layer controlling deployment and the related lifecycle. In this session we will present the service's main features, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.