This document provides instructions for building a big data application on AWS that collects and analyzes web server logs. It discusses using Amazon Kinesis to collect logs, with a Firehose delivery stream delivering them into an S3 bucket. It then covers using Kinesis Analytics to process the logs in real time by writing SQL queries that compute metrics and detect anomalies. Finally, it discusses loading the processed logs into Amazon Redshift for interactive querying and visualizing insights with Amazon QuickSight.
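Before logs can be delivered to S3 or queried with SQL, each raw access-log line is typically parsed into a structured record. The sketch below shows one way to do that in plain Python, assuming Apache-style Common Log Format input; the field names and the newline-delimited JSON shape are our own illustrative choices, not a format Firehose requires.

```python
import json
import re

# Common Log Format, e.g. from Apache; the named groups are our own choice.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def log_line_to_record(line: str) -> bytes:
    """Parse one access-log line into a newline-delimited JSON record,
    a shape commonly delivered to S3 for later SQL analysis."""
    match = LOG_PATTERN.match(line)
    if match is None:
        raise ValueError(f"unparseable log line: {line!r}")
    fields = match.groupdict()
    fields["status"] = int(fields["status"])
    fields["bytes"] = 0 if fields["bytes"] == "-" else int(fields["bytes"])
    return (json.dumps(fields) + "\n").encode("utf-8")

line = '203.0.113.10 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
record = log_line_to_record(line)
```

In a real pipeline, `record` would then be passed to a Firehose client (for example, boto3's `put_record` call with a delivery stream name) rather than kept in memory.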
ABD301 – Analyzing Streaming Data in Real Time with Amazon Kinesis – Amazon Web Services
Amazon Kinesis makes it easy to collect, process, and analyze real-time, streaming data so you can get timely insights and react quickly to new information. In this session, we present an end-to-end streaming data solution using Kinesis Streams for data ingestion, Kinesis Analytics for real-time processing, and Kinesis Firehose for persistence. We review in detail how to write SQL queries using streaming data and discuss best practices to optimize and monitor your Kinesis Analytics applications. Lastly, we discuss how to estimate the cost of the entire system.
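The SQL queries the session describes typically aggregate over time windows. As a rough intuition for what a streaming GROUP BY over a tumbling window computes, here is a plain-Python stand-in (a sketch, not the Kinesis Analytics engine): events are bucketed into fixed, non-overlapping windows and counted per key.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per key -- the same aggregation a streaming SQL
    GROUP BY over a tumbling window produces."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Timestamps in seconds; keys are illustrative request labels.
events = [(5, "GET /"), (30, "GET /"), (65, "GET /cart"), (70, "GET /")]
result = tumbling_window_counts(events, window_seconds=60)
# result: {0: {"GET /": 2}, 60: {"GET /cart": 1, "GET /": 1}}
```

A sliding window would differ only in that each event contributes to every window covering it, rather than exactly one.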
To win in the marketplace and provide differentiated customer experiences, businesses need to be able to use live data in real time to facilitate fast decision making. In this session, you learn common streaming data processing use cases and architectures. First, we give an overview of streaming data and AWS streaming data capabilities. Next, we look at a few customer examples and their real-time streaming applications. Finally, we walk through common architectures and design patterns of top streaming data use cases.
A Look Under the Hood – How Amazon.com Uses AWS Services for Analytics at Mas... – Amazon Web Services
Amazon’s consumer business continues to grow, and so does the volume of data and the number and complexity of the analytics done in support of the business. In this session, we talk about how Amazon.com uses AWS technologies to build a scalable environment for data and analytics. We look at how Amazon is evolving the world of data warehousing with a combination of a data lake and parallel, scalable compute engines such as Amazon EMR and Amazon Redshift.
Reducing the time to get actionable insights from data is important to all businesses, and customers who employ batch data analytics tools are exploring the benefits of streaming analytics. Learn best practices to extend your architecture from data warehouses and databases to real-time solutions. Learn how to use Amazon Kinesis to get real-time data insights and integrate them with Amazon Aurora, Amazon RDS, Amazon Redshift, and Amazon S3. The Amazon Flex team describes how they used streaming analytics in their Amazon Flex mobile app, used by Amazon delivery drivers to deliver millions of packages each month on time. They discuss the architecture that enabled the move from a batch processing system to a real-time system, overcoming the challenges of migrating existing batch data to streaming data, and how to benefit from real-time analytics.
What is Amazon OpenSearch Service?
OpenSearch is a distributed, open-source search and analytics suite used for real-time application monitoring, log analytics, and website search, among other things. Together with OpenSearch Dashboards, an integrated visualization tool that makes it easy for users to explore their data, OpenSearch provides a highly scalable solution for fast access and response to large volumes of data. It is powered by the Apache Lucene search library, the same library that underpins Elasticsearch and Apache Solr. OpenSearch and OpenSearch Dashboards were originally derived from Elasticsearch 7.10.2 and Kibana 7.10.2. All software in the OpenSearch project is released under the Apache License, Version 2.0 (ALv2).
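For log analytics, clients typically send a JSON query-DSL body to an index's `_search` endpoint. The helper below builds such a body; the `message` and `@timestamp` field names and the size/sort choices are illustrative assumptions about how the index is mapped, not fixed OpenSearch requirements.

```python
import json

def error_logs_query(since: str, term: str) -> str:
    """Build an OpenSearch query-DSL body for a _search request:
    full-text match on `term`, filtered to documents newer than
    `since`. Field names here are illustrative."""
    body = {
        "query": {
            "bool": {
                "must": [{"match": {"message": term}}],
                "filter": [{"range": {"@timestamp": {"gte": since}}}],
            }
        },
        "size": 20,
        "sort": [{"@timestamp": {"order": "desc"}}],
    }
    return json.dumps(body)

# Date-math expressions like "now-1h" are resolved by the server.
payload = error_logs_query("now-1h", "error")
```

The returned string would be POSTed to `https://<domain-endpoint>/<index>/_search` with a `Content-Type: application/json` header.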
Deep Dive on Amazon S3 Storage Classes: Creating Cost Efficiencies across You... – Amazon Web Services
Amazon S3 supports a range of storage classes that can help you cost-effectively store data without impacting performance or availability. Each storage class offers different data-access levels, retrieval times, and costs to support various use cases. In this session, Amazon S3 experts dive deep into the different Amazon S3 storage classes, their respective attributes, and when you should use them.
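The cost trade-off between classes comes down to arithmetic over per-GB monthly rates. The sketch below makes that concrete; the prices are illustrative placeholders, not current AWS pricing, and it deliberately ignores request, retrieval, and transfer charges, which often dominate for infrequently accessed data.

```python
# Per-GB monthly prices are ILLUSTRATIVE PLACEHOLDERS, not real AWS
# pricing -- always check the S3 pricing page for current numbers.
PRICE_PER_GB_MONTH = {
    "STANDARD": 0.023,
    "STANDARD_IA": 0.0125,
    "GLACIER": 0.004,
}

def monthly_storage_cost(gb: float, storage_class: str) -> float:
    """Rough monthly storage-only cost for one class (no request,
    retrieval, or transfer charges included)."""
    return round(gb * PRICE_PER_GB_MONTH[storage_class], 2)

# Compare 500 GB held for one month across the three classes.
costs = {cls: monthly_storage_cost(500, cls) for cls in PRICE_PER_GB_MONTH}
```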
ABD318 – Architecting a data lake with Amazon S3, Amazon Kinesis, AWS Glue and ... – Amazon Web Services
Learn how to architect a data lake where different teams within your organization can publish and consume data in a self-service manner. As organizations aim to become more data-driven, data engineering teams have to build architectures that can cater to the needs of diverse users, from developers to business analysts to data scientists. Each of these user groups employs different tools, has different data needs, and accesses data in different ways.
In this talk, we will dive deep into assembling a data lake using Amazon S3, Amazon Kinesis, Amazon Athena, Amazon EMR, and AWS Glue. The session will feature Mohit Rao, Architect and Integration lead at Atlassian, the maker of products such as JIRA, Confluence, and Stride. First, we will look at a couple of common architectures for building a data lake. Then we will show how Atlassian built a self-service data lake, where any team within the company can publish a dataset to be consumed by a broad set of users.
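A recurring detail in S3-based data lakes is the object-key layout: Hive-style `key=value` path segments let engines like Athena and Glue prune partitions at query time. The helper below sketches that convention; the table name, filename, and year/month/day granularity are illustrative choices, not requirements.

```python
from datetime import datetime, timezone

def partitioned_key(table: str, event_time: datetime, filename: str) -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so that
    query engines can skip partitions that fall outside a WHERE clause.
    The layout is common practice, not something S3 itself enforces."""
    return (
        f"{table}/year={event_time.year:04d}"
        f"/month={event_time.month:02d}"
        f"/day={event_time.day:02d}/{filename}"
    )

key = partitioned_key(
    "clickstream",
    datetime(2023, 10, 10, 13, 55, tzinfo=timezone.utc),
    "part-0000.json.gz",
)
# key: "clickstream/year=2023/month=10/day=10/part-0000.json.gz"
```

Zero-padding the month and day keeps keys lexicographically sortable, which makes listing and range scans predictable.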
The volume of data businesses create and process is growing every day. To get the most value out of this data, companies often invest in traditional BI tools. These tools, however, require investments in costly on-premises hardware and software. It takes weeks or months of data engineering time to build complex data models, not to mention the additional infrastructure needed to maintain fast query performance as data sets grow. Amazon QuickSight is built from the ground up to solve these problems by bringing the scale and flexibility of the AWS Cloud and by providing a business-user-focused experience for business analytics. This session will provide you with the relevant capabilities, benefits, and use cases for Amazon QuickSight.
AWS Cost Management Workshop at the San Francisco Loft
AWS offers a number of products that allow you to access, organize, understand, optimize, and control your AWS costs and usage. This workshop will help you get started using AWS Cost Explorer to visualize your usage patterns and identify your underlying cost drivers. From there, you can take action on your insights by learning how to set custom cost and usage budgets and receive alerts via email or Amazon SNS topic using AWS Budgets.
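As a sketch of what "set custom cost and usage budgets" looks like programmatically, the function below assembles the request shape used by the AWS Budgets `create_budget` API (as called via boto3): a monthly cost budget with an email alert at 80% of actual spend. The budget name, limit, and address are hypothetical example values.

```python
def monthly_cost_budget(name: str, limit_usd: float, email: str) -> dict:
    """Keyword arguments for a Budgets create_budget call: a MONTHLY
    cost budget that emails a subscriber when actual spend exceeds
    80% of the limit. All concrete values are illustrative."""
    return {
        "Budget": {
            "BudgetName": name,
            "BudgetLimit": {"Amount": str(limit_usd), "Unit": "USD"},
            "TimeUnit": "MONTHLY",
            "BudgetType": "COST",
        },
        "NotificationsWithSubscribers": [
            {
                "Notification": {
                    "NotificationType": "ACTUAL",
                    "ComparisonOperator": "GREATER_THAN",
                    "Threshold": 80.0,
                    "ThresholdType": "PERCENTAGE",
                },
                "Subscribers": [
                    {"SubscriptionType": "EMAIL", "Address": email}
                ],
            }
        ],
    }

request = monthly_cost_budget("analytics-dev", 200.0, "team@example.com")
```

A real call would also pass the twelve-digit `AccountId` and go through an authenticated boto3 `budgets` client.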
Big Data per le Startup: come creare applicazioni Big Data in modalità Server... – Amazon Web Services
The variety and quantity of data created every day is accelerating faster and faster, and it represents an unrepeatable opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment affordable only for established companies. But the elasticity of the Cloud and, in particular, Serverless services allow us to break through these limits.
In this session we look at how to develop Big Data applications quickly, without worrying about infrastructure, dedicating all our resources to developing our ideas and creating innovative products.
The Zen of DataOps – AWS Lake Formation and the Data Supply Chain Pipeline – Amazon Web Services
Many organizations have adopted or are in the process of adopting DevOps methodologies in their quest to accelerate the delivery of software capabilities, features, and functionalities to support their organizational objectives. By applying the same practices, DataOps aims to provide the same level of agility in delivering data and information to the organization. AWS Lake Formation, in coordination with other AWS Services, enables DevOps methodologies to be realized through the Data Supply Chain Pipeline.
Learning Objectives:
- Learn the common use cases for Athena, AWS' interactive query service on S3
- Learn best practices for creating tables and partitions and performance optimizations
- Learn how Athena handles security, authorization, and authentication
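One of the partition best practices above is registering new partitions as data lands in S3. Athena supports this with `ALTER TABLE ... ADD PARTITION` DDL; the helper below generates such a statement. The table and bucket names are hypothetical, and the year/month partition scheme is an illustrative assumption about how the table was defined.

```python
def add_partition_ddl(table: str, bucket: str, year: int, month: int) -> str:
    """Generate Athena DDL that registers one S3 prefix as a partition
    of an existing partitioned table. Names are hypothetical examples."""
    location = f"s3://{bucket}/{table}/year={year:04d}/month={month:02d}/"
    return (
        f"ALTER TABLE {table} ADD IF NOT EXISTS "
        f"PARTITION (year={year}, month={month}) "
        f"LOCATION '{location}'"
    )

ddl = add_partition_ddl("access_logs", "my-data-lake", 2023, 10)
```

`IF NOT EXISTS` makes the statement safe to re-run, which matters when a scheduled job registers partitions idempotently.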
Getting Started with AWS Enterprise Applications: WorkSpaces, WorkMail, WorkDocs – Amazon Web Services
AWS Enterprise Applications deliver managed, secure desktop and productivity capabilities that run in the AWS Cloud. Amazon WorkSpaces allows customers to easily provision cloud-based desktops that let end users securely access the documents, applications, and resources they need from the device of their choice. Amazon WorkMail is a secure, managed business email and calendaring service that gives users the ability to seamlessly access their email, contacts, and calendars while allowing IT to maintain control over encryption and the location of data. The speakers also dive into Amazon WorkDocs, a fully managed and secure enterprise storage and sharing service with strong administrative controls and feedback capabilities. In this session, we explore each of these services, explain how your organization can benefit from them, and provide a brief demo to show how they work together.
Amazon Web Services gives you fast access to flexible and low cost IT resources, so you can rapidly scale and build virtually any big data application including data warehousing, clickstream analytics, fraud detection, recommendation engines, event-driven ETL, serverless computing, and internet-of-things processing regardless of volume, velocity, and variety of data.
https://aws.amazon.com/webinars/anz-webinar-series/
How Amazon.com Uses AWS Analytics: Data Analytics Week SF – Amazon Web Services
Data Analytics Week at the San Francisco Loft
How Amazon.com Uses AWS Analytics
An inside look at how a global e-commerce firm uses AWS technologies to build a scalable environment for data and analytics. We'll look at how Amazon is evolving the world of data warehousing with a combination of a data lake and parallel scalable compute engines including Amazon EMR and Amazon Redshift.
Speakers:
Saurabh Shrivastava - Partner Solutions Architect, AWS
Andre Hass - Specialist Technical Account Manager (Redshift), AWS
Big Data Analytics Architectural Patterns and Best Practices (ANT201-R1) – AW... – Amazon Web Services
In this session, we discuss architectural principles that help simplify big data analytics.
We apply these principles to the various stages of big data processing: collect, store, process, analyze, and visualize. We discuss how to choose the right technology for each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on.
Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
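The collect/store/process/analyze staging described above can be sketched as a toy pipeline, with plain Python functions standing in for the AWS service you would pick at each stage (the status codes and comments are illustrative only):

```python
def collect():            # stand-in for stream ingestion (e.g. Kinesis)
    return ["200 /home", "500 /cart", "200 /home", "404 /missing"]

def store(records):       # stand-in for durable storage (e.g. S3)
    return list(records)

def process(records):     # stand-in for a transform step (e.g. Glue/EMR)
    return [tuple(r.split()) for r in records]

def analyze(rows):        # stand-in for aggregation (e.g. Athena/Redshift)
    errors = sum(1 for status, _path in rows if status.startswith("5"))
    return {"requests": len(rows), "server_errors": errors}

summary = analyze(process(store(collect())))
# summary: {"requests": 4, "server_errors": 1}
```

The point of the staging is that each function boundary is a place where you can swap technologies independently, using the selection criteria (latency, cost, volume, durability) the session lists.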
by Michael St. Onge, Global Cloud Security Architect, AWS
Join us for this hands-on lab where you will learn about the new service Amazon GuardDuty by walking through its capabilities and some real-world attack scenarios. You will need an AWS account to do the lab. This should be your own personal account and not an account through your company given the activity in the lab. AWS Credits will be provided to help cover any costs incurred in the lab. Level 300
Today’s organisations require a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is a new and increasingly popular way to store all of your data, structured and unstructured, in one centralised repository. Since data can be stored as-is, there is no need to convert it to a predefined schema, and you no longer need to know in advance what questions you want to ask of your data.
In this webinar, you will discover how AWS gives you fast access to flexible and low-cost IT resources, so you can rapidly scale and build your data lake that can power any kind of analytics such as data warehousing, clickstream analytics, fraud detection, recommendation engines, event-driven ETL, serverless computing, and internet-of-things processing regardless of volume, velocity and variety of data.
Learning Objectives:
• Discover how you can rapidly scale and build your data lake with AWS.
• Explore the key pillars behind a successful data lake implementation.
• Learn how to use the Amazon Simple Storage Service (S3) as the basis for your data lake.
• Learn about the new AWS services recently launched, Amazon Athena and Amazon Redshift Spectrum, that help customers directly query that data lake.
Deep Dive - Amazon Kinesis Video Streams - AWS Online Tech Talks – Amazon Web Services
Learning Objectives:
- Get an overview of Amazon Kinesis Video Streams and key use cases
- Learn how to use the Kinesis Video Streams producer SDK to securely stream video to AWS
- Discover how to use the Kinesis Video Streams parser library to retrieve video fragments for analytics and processing
Build Data Lakes & Analytics on AWS: Patterns & Best Practices – Amazon Web Services
With over 90% of today’s data generated in the last two years, the rate of data growth is showing no sign of slowing down. In this session, we step through the challenges and best practices for capturing data, understanding what data you own, driving insights, and predicting the future using AWS services. We frame the session and demonstrations around common pitfalls of building data lakes and how to successfully drive analytics and insights from data. We also discuss architecture patterns that bring together key AWS services, including Amazon S3, AWS Glue, Amazon Athena, Amazon Kinesis, and Amazon Machine Learning. Discover the real-world application of data lakes for roles including data scientists and business users.
Stephen Moon, Sr. Solutions Architect, Amazon Web Services
James Juniper, Solution Architect for the Geo-Community Cloud, Natural Resources Canada
NEW LAUNCH! Stream video from edge devices to AWS for playback, storage and p... – Amazon Web Services
Amazon Kinesis Video Streams is a video ingestion and storage service for analytics, machine learning, and video processing use cases. In this workshop, you will learn how to stream video from devices to Kinesis Video Streams for playback, storage, and subsequent processing. For the workshop, we will provide you with a Raspberry Pi 3 with a camera module, pre-loaded with the Kinesis Video Streams producer SDK. First, you will create and configure a Kinesis video stream in the AWS management console. Next, you will stream video from the Raspberry Pi to Kinesis Video Streams, view the live video feed in the console, and retrieve stored videos. Lastly, you will pull key operating metrics to understand the performance characteristics of your video stream. For this workshop, you need to create an AWS account and bring your own development laptop.
AWS delivers an integrated suite of services that provide everything needed to quickly and easily build and manage a data lake for analytics. AWS-powered data lakes can handle the scale, agility, and flexibility required to combine different types of data and analytics approaches to gain deeper insights, in ways that traditional data silos and data warehouses cannot. In this session, we will show you how you can quickly build a data lake on AWS that ingests, catalogs and processes incoming data and makes it ready for analysis. Using a live demo, we demonstrate the capabilities of AWS provided analytical services such as AWS Glue, Amazon Athena and Amazon EMR and how to build a Data Lake on AWS step-by-step.
Do you want to ramp up your knowledge of AWS analytics services and launch your first big data application on the cloud? In this session, we walk you through simplifying big data processing as a data bus comprising ingestion, storage, processing, and visualization. You build a big data application using AWS managed services, including Amazon Athena, Amazon Kinesis, Amazon EMR, AWS Glue, Amazon Redshift, Amazon QuickSight, and Amazon S3. Along the way, we review architecture design patterns for big data applications and give you access to a take-home lab so you can rebuild and customize the application yourself. To get the most from this session, bring your own laptop and have some familiarity with AWS services.
I Want to Analyze and Visualize Website Access Logs, but Why Do I Need Server... – Amazon Web Services
Nowadays, it’s common for a web server to be fronted by a global content delivery service, such as Amazon CloudFront, to accelerate delivery of websites, APIs, media content, and other web assets. Website administrators and developers want to generate insights in order to improve website availability through bot detection and mitigation, by optimizing web content based on the devices and browser used, by reducing perceived latency by caching a popular object closer to its viewer, and so on. In this session, we dive deep into building an end-to-end serverless analytics solution to analyze Amazon CloudFront access logs, both at rest and in transit, using Amazon Athena and Amazon Kinesis Analytics, respectively, and we generate visualization insights using Amazon QuickSight. Join a discussion with AWS solution architects to learn more about the various ways to generate insights to improve the overall perceived experience for your website users.
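One concrete enrichment mentioned above is bot detection. A minimal sketch: classify each request by its User-Agent header before aggregating traffic for visualization. The token list is deliberately naive and illustrative; real bot mitigation combines curated agent lists with behavioral signals, not a single regex.

```python
import re

# Illustrative token list only -- production bot detection uses curated
# lists and behavioral signals, not just the User-Agent header.
BOT_TOKENS = re.compile(r"bot|crawler|spider|curl|wget", re.IGNORECASE)

def classify_user_agent(user_agent: str) -> str:
    """Label a log record's User-Agent value as 'bot' or 'human' -- the
    kind of enrichment you might apply to access logs in transit before
    visualizing the traffic mix."""
    return "bot" if BOT_TOKENS.search(user_agent) else "human"

labels = [classify_user_agent(ua) for ua in [
    "Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
]]
```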
ABD318_Architecting a data lake with Amazon S3, Amazon Kinesis, AWS Glue and ...Amazon Web Services
"Learn how to architect a data lake where different teams within your organization can publish and consume data in a self-service manner. As organizations aim to become more data-driven, data engineering teams have to build architectures that can cater to the needs of diverse users - from developers, to business analysts, to data scientists. Each of these user groups employs different tools, have different data needs and access data in different ways.
In this talk, we will dive deep into assembling a data lake using Amazon S3, Amazon Kinesis, Amazon Athena, Amazon EMR, and AWS Glue. The session will feature Mohit Rao, Architect and Integration lead at Atlassian, the maker of products such as JIRA, Confluence, and Stride. First, we will look at a couple of common architectures for building a data lake. Then we will show how Atlassian built a self-service data lake, where any team within the company can publish a dataset to be consumed by a broad set of users."
The volume of data businesses create and process is growing every day. To get the most value out of this data, companies often invest in traditional BI tools. These tools however require investments in costly on-premises hardware and software. It takes weeks or months of data engineering time to build complex data models; not to mention the additional infrastructure needed to maintain fast query performance as data sets grow. Amazon QuickSight is built from the ground up to solve these problems by bringing the scale and flexibility of the AWS Cloud and by providing a business user focused experience to business analytics. This session will provide you with the relevant capabilities, benefits and use cases for AWS Quicksight.
AWS Cost Management Workshop at the San Francisco Loft
AWS offers a number of products that allow you to access, organize, understand, optimize, and control your AWS costs and usage. This workshop will help you get started using AWS Cost Explorer to visualize your usage patterns and identify your underlying cost drivers. From there, you can take action on your insights by learning how to set custom cost and usage budgets and receive alerts via email or Amazon SNS topic using AWS Budgets.
Big Data per le Startup: come creare applicazioni Big Data in modalità Server...Amazon Web Services
La varietà e la quantità di dati che si crea ogni giorno accelera sempre più velocemente e rappresenta una opportunità irripetibile per innovare e creare nuove startup.
Tuttavia gestire grandi quantità di dati può apparire complesso: creare cluster Big Data su larga scala sembra essere un investimento accessibile solo ad aziende consolidate. Ma l’elasticità del Cloud e, in particolare, i servizi Serverless ci permettono di rompere questi limiti.
Vediamo quindi come è possibile sviluppare applicazioni Big Data rapidamente, senza preoccuparci dell’infrastruttura, ma dedicando tutte le risorse allo sviluppo delle nostre le nostre idee per creare prodotti innovativi.
The Zen of DataOps – AWS Lake Formation and the Data Supply Chain PipelineAmazon Web Services
Many organizations have adopted or are in the process of adopting DevOps methodologies in their quest to accelerate the delivery of software capabilities, features, and functionalities to support their organizational objectives. By applying the same practices, DataOps aims to provide the same level of agility in delivering data and information to the organization. AWS Lake Formation, in coordination with other AWS Services, enables DevOps methodologies to be realized through the Data Supply Chain Pipeline.
Learning Objectives:
- Learn the common use-cases for using Athena, AWS' interactive query service on S3
- Learn best practices for creating tables and partitions and performance optimizations
- Learn how Athena handles security, authorization, and authentication
Getting Started with AWS Enterprise Applications: WorkSpaces, WorkMail, WorkDocsAmazon Web Services
AWS Enterprise Applications deliver managed, secure desktop and productivity capabilities run in the AWS cloud. Amazon WorkSpaces allows customers to easily provision cloud-based desktops that allow end-users to securely access the documents, applications, and resources they need with the device of their choice. Amazon WorkMail is a secure and managed business email and calendaring service that gives users the ability to seamlessly access their email, contacts, and calendars while allowing IT to maintain control over encryption and location of data. The speakers also dive into Amazon WorkDocs, a fully managed and secure enterprise storage and sharing service with strong administrative controls and feedback capabilities. In this session, we explore each of these services, explain how your organization can benefit from them, and also provide a brief demo to show how they work together.
Amazon Web Services gives you fast access to flexible and low cost IT resources, so you can rapidly scale and build virtually any big data application including data warehousing, clickstream analytics, fraud detection, recommendation engines, event-driven ETL, serverless computing, and internet-of-things processing regardless of volume, velocity, and variety of data.
https://aws.amazon.com/webinars/anz-webinar-series/
How Amazon.com Uses AWS Analytics: Data Analytics Week SFAmazon Web Services
Data Analytics Week at the San Francisco Loft
How Amazon.com Uses AWS Analytics
An inside look at how a global e-commerce firm uses AWS technologies to build a scalable environment for data and analytics. We'll look at how Amazon is evolving the world of data warehousing with a combination of a data lake and parallel scalable compute engines including Amazon EMR and Amazon Redshift.
Speakers:
Saurabh Shrivastava - Partner Solutions Architect, AWS
Andre Hass - Specialist Technical Account Manager (Redshift), AWS
Big Data Analytics Architectural Patterns and Best Practices (ANT201-R1) - AW...Amazon Web Services
In this session, we discuss architectural principles that helps simplify big data analytics.
We'll apply principles to various stages of big data processing: collect, store, process, analyze, and visualize. We'll disucss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on.
Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
by Michael St. Onge, Global Cloud Security Architect, AWS
Join us for this hands-on lab where you will learn about the new service Amazon GuardDuty by walking through its capabilities and some real-world attack scenarios. You will need an AWS account to do the lab. This should be your own personal account and not an account through your company given the activity in the lab. AWS Credits will be provided to help cover any costs incurred in the lab. Level 300
Today’s organisations require a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lake is a new and increasingly popular way to store all of your data, structured and unstructured, in one, centralised repository. Since data can be stored as-is, there is no need to convert it to a predefined schema and you no longer need to know what questions you want to ask of your data beforehand.
In this webinar, you will discover how AWS gives you fast access to flexible and low-cost IT resources, so you can rapidly scale and build your data lake that can power any kind of analytics such as data warehousing, clickstream analytics, fraud detection, recommendation engines, event-driven ETL, serverless computing, and internet-of-things processing regardless of volume, velocity and variety of data.
Learning Objectives:
• Discover how you can rapidly scale and build your data lake with AWS.
• Explore the key pillars behind a successful data lake implementation.
• Learn how to use the Amazon Simple Storage Service (S3) as the basis for your data lake.
• Learn about the new AWS services recently launched, Amazon Athena and Amazon Redshift Spectrum, that help customers directly query that data lake.
Deep Dive - Amazon Kinesis Video Streams - AWS Online Tech TalksAmazon Web Services
Learning Objectives:
- Get an overview of Amazon Kinesis Video Streams and key use cases
- Learn how to use the Kinesis Video Streams producer SDK to securely stream video to AWS
- Discover how to use the Kinesis Video Streams parser library to retrieve video fragments for analytics and processing
Build Data Lakes & Analytics on AWS: Patterns & Best PracticesAmazon Web Services
With over 90% of today’s data generated in the last two years, the rate of data growth is showing no sign of slowing down. In this session, we step through the challenges and best practices for capturing data, understanding what data you own, driving insights, and predicting the future using AWS services. We frame the session and demonstrations around common pitfalls of building data lakes and how to successfully drive analytics and insights from data. We also discuss the architecture patterns brought together key AWS services, including Amazon S3, AWS Glue, Amazon Athena, Amazon Kinesis, and Amazon Machine Learning. Discover the real-world application of data lakes for roles including data scientists and business users.
Stephen Moon, Sr. Solutions Architect, Amazon Web Services
James Juniper, Solution Architect for the Geo-Community Cloud, Natural Resources Canada
NEW LAUNCH! Stream video from edge devices to AWS for playback, storage and p...Amazon Web Services
Amazon Kinesis Video Streams is a video ingestion and storage service for analytics, machine learning, and video processing use cases. In this workshop, you will learn how to stream video from devices to Kinesis Video Streams for playback, storage and subsequent processing. For the workshop, we will provide you a Raspberry Pi 3 with camera module and pre-loaded with the Kinesis Video Streams producer SDK. First, you will create and configure a Kinesis video stream in the AWS management console. Next, you will stream videos from the Raspberry Pis to Kinesis Video Streams, view the live video feed in the console, and retrieve stored videos. Lastly, you will pull key operating metrics to understand the performance characteristics of your video stream. For this workshop, you need to create an AWS account and bring your own development laptop.
AWS delivers an integrated suite of services that provide everything needed to quickly and easily build and manage a data lake for analytics. AWS-powered data lakes can handle the scale, agility, and flexibility required to combine different types of data and analytics approaches to gain deeper insights, in ways that traditional data silos and data warehouses cannot. In this session, we will show you how you can quickly build a data lake on AWS that ingests, catalogs and processes incoming data and makes it ready for analysis. Using a live demo, we demonstrate the capabilities of AWS provided analytical services such as AWS Glue, Amazon Athena and Amazon EMR and how to build a Data Lake on AWS step-by-step.
Do you want to ramp up your knowledge of AWS analytics services and launch your first big data application on the cloud? In this session, we walk you through simplifying big data processing as a data bus comprising ingestion, storage, processing, and visualization. You build a big data application using AWS managed services, including Amazon Athena, Amazon Kinesis, Amazon EMR, AWS Glue, Amazon Redshift, Amazon QuickSight, and Amazon S3. Along the way, we review architecture design patterns for big data applications and give you access to a take-home lab so you can rebuild and customize the application yourself. To get the most from this session, bring your own laptop and have some familiarity with AWS services.
I Want to Analyze and Visualize Website Access Logs, but Why Do I Need Servers? - Amazon Web Services
Nowadays, it’s common for a web server to be fronted by a global content delivery service, such as Amazon CloudFront, to accelerate delivery of websites, APIs, media content, and other web assets. Website administrators and developers want to generate insights in order to improve website availability through bot detection and mitigation, optimize web content based on the devices and browsers used, reduce perceived latency by caching popular objects closer to their viewers, and so on. In this session, we dive deep into building an end-to-end serverless analytics solution to analyze Amazon CloudFront access logs, both at rest and in transit, using Amazon Athena and Amazon Kinesis Analytics, respectively, and we generate visualization insights using Amazon QuickSight. Join a discussion with AWS solutions architects to learn more about the various ways to generate insights that improve the overall perceived experience for your website users.
RET301 - Build Single Customer View across Multiple Retail Channels using AWS S... - Amazon Web Services
A challenge faced by many retailers is how to form an integrated single view of the customer across multiple retail channels to better understand purchasing behavior and patterns. In this session, we present a solution that merges web analytics data with customer purchase history, built on Amazon API Gateway, AWS Lambda, and Amazon S3. Learn how to track customer purchase behavior across different selling channels to better predict future needs and make relevant, intelligent recommendations.
AWS X-Ray: Debugging Applications at Scale - AWS Online Tech Talks - Amazon Web Services
Learning Objectives:
- Learn how to reduce time to resolution for errors and performance bottlenecks from days/hours to minutes
- Learn how to detect latency distribution and pinpoint issues to specific service(s)
- Learn how to quantify customer impact
Deliver Voice Automated Serverless BI Solutions in Under 3 Hours - ABD325 - re:Invent 2017 - Amazon Web Services
Use AWS tools to discover meaningful key performance indicators (KPIs) for your organization. Data sitting in raw form, such as JSON, can be published to S3 and queried using Athena, the AWS clusterless query engine. To visually explore your data, Amazon QuickSight enables your organization to create KPIs such as “How many unique user visits did we get in the last quarter?” or “How many tweets has our company had from AsiaPac in the last day?”. In this workshop, we use these big data technologies along with AWS serverless tools to deliver metrics through voice. To do this, we walk through the process of enabling and testing these metrics for a custom skill on Alexa-enabled devices (no Echo device needed for the workshop). This will give you the skills to deliver creative voice-powered analytics to your organization’s stakeholders.
Are you running multiple workloads in AWS and growing? Are you looking for best practices to help you manage your accounts as you scale? In this workshop, we explore various strategies to manage your growing AWS account portfolio. We explore best practices around security, including creating accounts for identity and access management, logging, and shared services, and implementing federated access and single sign-on (SSO). From a cost-management perspective, we review best practices surrounding account creation for business units, environment lifecycle, and individual projects. Some of the services we use in this workshop include AWS Organizations, AWS CloudTrail, and AWS IAM.
Deploying Business Analytics at Enterprise Scale - AWS Online Tech Talks - Amazon Web Services
Learning Objectives:
- Deploy business analytics to thousands of users using Active Directory and Federated SSO
- Securely access data sources in Amazon VPCs or on-premises and build data marts with SPICE
- Control access to your data sources, implement row-level security, and audit access to your data
Image recognition is a field of deep learning that uses neural networks to recognize the subject and traits of a given image. In Japan, Cookpad uses Amazon ECS to run an image recognition platform on clusters of GPU-enabled EC2 instances. In this session, hear from Cookpad about the challenges they faced building and scaling this advanced, user-friendly service to ensure high availability and low latency for tens of millions of users.
How Nextdoor Built a Scalable, Serverless Data Pipeline for Billions of Events... - Amazon Web Services
In this session, learn how Nextdoor replaced their home-grown data pipeline based on a topology of Flume nodes with a completely serverless architecture based on Kinesis and Lambda. By making these changes, they improved both the reliability of their data and the delivery times of billions of records of data to their Amazon S3–based data lake and Amazon Redshift cluster. Nextdoor is a private social networking service for neighborhoods.
Join us to learn what's new in serverless computing and AWS Lambda. Dr. Tim Wagner, General Manager of AWS Lambda and Amazon API Gateway, will share the latest developments in serverless computing and how companies are benefiting from serverless applications. You'll learn about the latest feature releases from AWS Lambda, Amazon API Gateway, and more. You will also hear from FICO about how it is using serverless computing for its predictive analytics and data science platform.
Interstella 8888 is an intergalactic trading company that deals in rare resources, but their antiquated monolithic logistics systems are causing the business to lose money. In this workshop, you’ll help Interstella 8888 build a modern microservices-based logistics system to save the company from financial ruin.
We’ll give you the hands-on experience you need to run microservices in the real world. This includes implementing advanced container scheduling and scaling to deal with variable service requests, implementing a service mesh, tracing issues with AWS X-Ray, container- and instance-level logging with CloudWatch, and load testing.
AWS credits are provided. Bring your laptop, and have an active AWS account.
Moving to Amazon ECS – the Not-So-Obvious Benefits - CON356 - re:Invent 2017 - Amazon Web Services
If you ask 10 teams why they migrated to containers, you will likely get answers like ‘developer productivity’, ‘cost reduction’, and ‘faster scaling’. But teams often find there are several other ‘hidden’ benefits to using containers for their services. In this talk, Franziska Schmidt, Platform Engineer at Mapbox, and Yaniv Donenfeld from AWS discuss the obvious, and not so obvious, benefits of moving to a containerized architecture. These include using Docker and ECS to achieve shared libraries for dev teams, separating private infrastructure from shareable code, and making it easier for non-ops engineers to run services.
Using AWS Management Tools to Enable Governance, Compliance, Operational, and Risk Auditing - Amazon Web Services
In this session, you learn how to enable governance, compliance, operational, and risk auditing of your AWS account. Approaches discussed include a combination of continuous monitoring and assessing, auditing, and evaluating your AWS resources. With AWS management tools, you can view a history of AWS API calls for your various accounts, review changes in configurations and relationships between AWS resources, dive into detailed resource configuration histories, determine your overall compliance against the configurations specified in your internal guidelines, and give developers and systems administrators a secure and compliant means to create and manage AWS resources.
How Chick-fil-A Embraces DevSecOps on AWS - SID306 - re:Invent 2017 - Amazon Web Services
As Chick-fil-A became a cloud-first organization, their security team didn't want to become the bottleneck for agility. But the security team also wanted to raise the bar for their security posture on AWS. Robert Davis, security architect at Chick-fil-A, provides an overview about how he and his team recognized that writing code was the best way for their security policies to scale across the many AWS accounts that Chick-fil-A operates. The use of DevSecOps within Chick-fil-A led to the creation of a set of account bootstrapping tools, auditing capabilities, and event-based policy enforcement. This session goes over these tools and how they were built on AWS.
Design, Build, and Modernize Your Web Applications with AWS - Donnie Prakoso
The cloud makes it easy to spin up the IT resources you need. But the true value of the cloud lies in its ability to provide a set of building blocks for your applications. Join us in this hands-on session to understand how to use Amazon Virtual Private Cloud (VPC) and Amazon Elastic Compute Cloud (EC2), along with Amazon EC2 Auto Scaling and Elastic Load Balancing, to design a scalable architecture and build your applications in no time. We will also discover how to modernize your application with the help of our serverless service, AWS Lambda.
Motion detection triggers have reduced the amount of video recorded by modern devices. But maybe you want to reduce that further—maybe you only care if a car or a person is on-camera before recording or sending a notification. Security cameras and smart doorbells can use Amazon Rekognition to reduce the number of false alarms. Learn how device makers and home enthusiasts are building their own smart layers of person and car detection to reduce false alarms and limit video volume. Learn too how you can use face detection and recognition to notify you when a friend has arrived.
Similar to Building Your First Big Data Application on AWS - ABD317
How to build forecasting services using ML and deep learning algorithms - Amazon Web Services
Forecasting is an important process for a great many companies and is used in various contexts to try to accurately predict the growth and distribution of a product, the resources needed on production lines, financial presentations, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session, we show how to pre-process data that contains a temporal component and then use an algorithm that, based on the type of data analyzed, produces an accurate forecast.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session, we present the main features of the service and how to deploy your application in a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing the pace of innovation. Over this period, we learned how changing our approach to application development dramatically increased agility and release velocity and, ultimately, allowed us to build more reliable and scalable applications. In this session, we explain how we define modern applications and how building modern apps affects not only the application architecture but also the organizational structure, development release pipelines, and even the operating model. We also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to spend up to 90% less with containers and Spot Instances - Amazon Web Services
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, leading to average savings of 70% compared to On-Demand Instances. In this session, we look at the characteristics of Spot Instances and how easily they can be used on AWS. We also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have been asking us how to monetize open APIs, simplify fintech integrations, and accelerate the adoption of various Open Banking business models. AWS and FinConecta therefore invite you to the Open Finance marketplace presentation on October 20th.
Event agenda:
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to the Open Finance marketplace
• Scope
• Features
• Tech overview and demo
The role of the cloud
The future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a strategic approach
Q&A
Make your startup’s offering unique in the market with AWS Machine Learning services - Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative, purpose-built components.
AWS provides ready-to-use services and, at the same time, lets you customize and create the differentiating elements of your own offering.
Focusing on machine learning technologies, we will see how to select the AI services offered by AWS and, with the help of a demo, how to build custom machine learning models using SageMaker Studio.
OpsWorks Configuration Management: automate the management and deployment of your EC2 instances - Amazon Web Services
With the traditional approach to IT, implementing DevOps practices was difficult for many years; they often involved manual activities that occasionally caused application downtime and interrupted users’ work. With the advent of the cloud, DevOps practices are now within everyone’s reach at low cost for any kind of workload, guaranteeing greater system reliability and resulting in significant improvements to business continuity.
AWS provides AWS OpsWorks as a configuration management tool that aims to automate and simplify the management and deployment of EC2 instances by means of Chef and Puppet workloads.
Learn how to leverage AWS OpsWorks to guarantee the reliability of your application installed on EC2 instances.
Microsoft Active Directory on AWS to support your Windows workloads - Amazon Web Services
Want to know the options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we discuss options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar, we explore what AWS services make possible when applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are hosting a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in cloud environments based on VMware vSphere® and access a wide range of AWS services, taking full advantage of the AWS cloud while protecting your existing VMware investments.
Build your first serverless ledger-based app with QLDB and NodeJS - Amazon Web Services
Many companies today build applications with ledger-style functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply-chain flow of their products.
At the core of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but these are complex and costly tools to manage.
Amazon QLDB removes the need to build custom, complex systems by providing a fully managed serverless ledger database.
In this session, we will see how to build a complete serverless application that uses QLDB’s capabilities.
With the rise of microservices architectures and rich mobile and web applications, APIs are more important than ever for giving end users a great user experience. In this session, we learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We dig into several scenarios, understanding how AppSync can help address these use cases by creating modern APIs with real-time and offline data-update capabilities.
We also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle databases and VMware Cloud™ on AWS: myths to debunk - Amazon Web Services
Many organizations take advantage of the cloud by migrating their Oracle workloads, gaining significant benefits in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, and performance risks can be introduced when moving applications out of on-premises data centers.
In these slides, AWS and VMware experts present simple, practical tips that ease and simplify the migration of Oracle workloads while accelerating your cloud transformation; they also dive into the architecture and show how to take full advantage of VMware Cloud™ on AWS.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies managing Docker containers through an orchestration layer controlling deployment and lifecycle. In this session, we present the main features of the service, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.
During the hands-on labs, AWS experts show you which tools help you develop serverless applications locally and in the AWS cloud, and help you plan the next steps to start using this technology in your company.
81. Activity 4B: ETL Job in Glue
• Close the Script Editor tips window (if it appears)
• In the Glue Script Editor, copy in the ETL code by clicking on the “Open Glue ETL Code” link in Student Resources
• Ensure that the database name (db_name) and table name reflect the database and table name created by the Glue Crawler
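The heart of a Glue-generated ETL script is an ApplyMapping transform that renames and casts columns. As a rough illustration of what that transform does, here is a plain-Python stand-in; the field names and types below are hypothetical examples, not the lab’s actual weblog schema, and the real script uses the awsglue PySpark library rather than this stdlib sketch:

```python
# Illustrative stand-in for Glue's ApplyMapping transform.
# Each mapping tuple is (source_field, source_type, target_field, target_type);
# the fields shown here are made-up examples, not the lab's schema.
MAPPINGS = [
    ("host", "string", "request_ip", "string"),
    ("request", "string", "request_url", "string"),
    ("response", "int", "status_code", "int"),
]

def apply_mapping(record, mappings):
    """Rename and cast a record's fields, roughly as ApplyMapping does."""
    out = {}
    for src, _src_type, dst, dst_type in mappings:
        if src in record:
            value = record[src]
            out[dst] = int(value) if dst_type == "int" else str(value)
    return out

sample = {"host": "192.0.2.1", "request": "/index.html", "response": "200"}
print(apply_mapping(sample, MAPPINGS))
# {'request_ip': '192.0.2.1', 'request_url': '/index.html', 'status_code': 200}
```

In the actual Glue script, the equivalent step is ApplyMapping.apply(frame=..., mappings=...) on a Glue DynamicFrame, with the mappings list taking the same tuple shape.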
99. Activity 5C: Interactive Querying with Amazon Athena
• Run interactive queries (copy the SQL queries from “Athena SQL” in Student Resources) and see the results on the console
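The queries copied from “Athena SQL” are ordinary SQL run against the table the Glue Crawler registered in the Data Catalog. As a hypothetical example of the shape such a query takes (the database and table names below are placeholders, not the lab’s exact SQL):

```sql
-- Placeholder names: replace my_weblogs_db.weblogs with the
-- database and table created by your Glue Crawler.
SELECT request_url,
       COUNT(*) AS hits
FROM my_weblogs_db.weblogs
GROUP BY request_url
ORDER BY hits DESC
LIMIT 10;
```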
114. Activity 6A: Open the Zeppelin Interface
1. Copy the Zeppelin endpoint from the Student Resources section in Qwiklabs
2. Click on the “Open Zeppelin Notebook” link in the Student Resources section to open the Zeppelin link in a new window
3. Download the file (or copy and save it to a file with a .json extension)
4. Import the notebook using the Import Note link on the Zeppelin interface
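The file downloaded in step 3 is a Zeppelin note export, a JSON document of roughly the following shape (a minimal sketch of the note format; the name and paragraph text here are illustrative, and the lab’s notebook carries its own content):

```json
{
  "name": "Example Note",
  "paragraphs": [
    { "text": "%sql\nSELECT COUNT(*) FROM weblogs" }
  ]
}
```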
130. Activity 7B: Connect to Amazon Redshift
Note: Use “dbadmin” as the username. You can get the Amazon Redshift database password from Qwiklabs by navigating to the “Connection details” section (see below).
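With those connection details, connecting from a SQL client typically looks like the following psql invocation; the endpoint and database name are placeholders to be replaced with the values shown in Qwiklabs, and 5439 is Redshift’s default port:

```
psql -h <cluster-endpoint>.redshift.amazonaws.com -p 5439 -U dbadmin -d <database-name>
```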