We have recently seen some convergence of different database technologies. Many customers are evaluating heterogeneous migrations as their database needs have evolved or changed. Evaluating the best database to use for a job isn’t as clear as it was ten years ago. In this session, we discuss the ideal use cases for relational and nonrelational data services, including Amazon ElastiCache for Redis, Amazon DynamoDB, Amazon Aurora, and Amazon Redshift. This session digs into how to evaluate a new workload for the best managed database option.
DAT324_Expedia Flies with DynamoDB Lightning Fast Stream Processing for Trave...Amazon Web Services
Building rich, high-performance streaming data systems requires fast, on-demand access to reference data sets in order to implement complex business logic. In this talk, Expedia will discuss the architectural challenges the company faced, how DAX and DynamoDB fit into the overall architecture, and how they met the design requirements. You will also hear how DAX enabled Expedia to add caching to its existing applications in hours, a task that previously took much longer. Session attendees will walk away with three key takeaways: 1) Expedia's overall architectural patterns for streaming data; 2) how the company leverages DynamoDB, DAX, Apache Spark, and Apache Kafka to solve these problems; and 3) the value DAX provides and how it enabled Expedia to improve performance and throughput and reduce costs, all without having to write any new code.
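The "without any new code" claim rests on one property: a DAX client exposes the same read/write interface as a plain DynamoDB client, so only the client construction changes. The sketch below illustrates that idea with stand-in classes; all names are hypothetical and nothing here uses the real AWS SDK.

```python
# Illustrative sketch (not the real AWS SDK): a caching client that shares
# the plain client's interface can be swapped in without touching the
# application code that reads items.

class PlainClient:
    """Stands in for a DynamoDB client: every read hits the backing store."""
    def __init__(self, store):
        self.store = store
        self.backend_reads = 0

    def get_item(self, key):
        self.backend_reads += 1
        return self.store.get(key)

class CachingClient(PlainClient):
    """Stands in for a DAX client: same interface, read-through cache."""
    def __init__(self, store):
        super().__init__(store)
        self.cache = {}

    def get_item(self, key):
        if key not in self.cache:                 # miss: fall through
            self.cache[key] = super().get_item(key)
        return self.cache[key]                    # hit: served from memory

def lookup_airport(client, code):
    """Application code: unchanged whichever client it is handed."""
    return client.get_item(code)

store = {"SEA": "Seattle-Tacoma", "LHR": "Heathrow"}
plain, cached = PlainClient(store), CachingClient(store)
for _ in range(3):
    lookup_airport(plain, "SEA")
    lookup_airport(cached, "SEA")
print(plain.backend_reads, cached.backend_reads)  # 3 backend reads vs 1
```

Because `lookup_airport` never changes, swapping `PlainClient` for `CachingClient` is a one-line change at construction time, which is the shape of the migration Expedia describes.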
WIN301-Migrating Microsoft SQL Server Databases to AWS-Best Practices and Pat...Amazon Web Services
Migrating databases to the cloud is a critical part of an organization's cloud journey and requires careful planning and architectural consideration, including the choice of migration method. This session will provide you with best practices and guidelines for migrating databases to AWS and for architecting hybrid database architectures on AWS, with a focus on Microsoft SQL Server. We will review the current capabilities of SQL Server on Amazon RDS and SQL Server on Amazon EC2, and compare and contrast various migration methods, including SQL export, backup and restore, and the AWS Database Migration Service (AWS DMS). We will also look at how Expedia is migrating monolithic SQL Server databases to AWS using a hybrid approach that leverages SQL Server's Distributed Availability Group architecture. Expedia will share lessons learned during the initial test and deployment phases, followed by a demo of its existing architecture and deployment.
What's New for AWS Purpose Built, Non-relational Databases - DAT204 - re:Inve...Amazon Web Services
In this session, Shawn Bice, VP of NoSQL and QuickSight, will cover what's new in AWS non-relational data services, such as Amazon DynamoDB, Amazon ElastiCache, and Amazon Elasticsearch Service. We will discuss how developers might select different data services to solve different aspects of an application, and demo which application use cases lend themselves to which data services. If you're a developer building massively scaled applications that require flexibility and consistent millisecond performance, and you're trying to understand which non-relational data service you might use, this is a great introductory session.
This presentation compares three modern architecture patterns that startups are building their businesses around. It includes a realistic analysis of the cost, team management, and security implications of each approach. It covers AWS Elastic Beanstalk, Amazon ECS, Amazon API Gateway, AWS Lambda, Amazon DynamoDB, and Amazon CloudFront. Attendees will also hear from venture capital investor Third Rock Ventures (TRV), which has launched 40+ biotech startups over the last 10 years. TRV will outline how it launches cloud-native startups that turn bleeding-edge science into new treatments across the spectrum of disease, with highlights drawn from Relay Therapeutics and Tango Therapeutics.
AWS Database and Analytics State of the Union - 2017 - DAT201 - re:Invent 2017Amazon Web Services
In this session, we discuss the evolution of database and analytics services in AWS, the new database and analytics services and features we launched this year, and our vision for continued innovation in this space. We are witnessing an unprecedented growth in the amount of data collected, in many different forms. Storage, management, and analysis of this data require database services that scale and perform in ways not possible before. AWS offers a collection of database and other data services—including Amazon Aurora, Amazon DynamoDB, Amazon RDS, Amazon Redshift, Amazon ElastiCache, Amazon Kinesis, and Amazon EMR—to process, store, manage, and analyze data. In this session, we provide an overview of AWS database and analytics services and discuss how customers are using these services today.
MySQL is the world's most popular open source relational database and is used by dozens of popular open source applications. AWS provides several ways to run a MySQL application in the cloud, including Amazon EC2, Amazon RDS for MySQL, Amazon RDS for MariaDB, and Amazon Aurora. This session presents the different options for running MySQL in the AWS Cloud, discusses the different ways to migrate your MySQL database to AWS, and provides tips and tricks for optimizing your MySQL workloads in AWS. Also, a customer presents lessons learned while migrating their MySQL databases to AWS.
CMP217_Scale In-Memory Workloads on Amazon EC2 X1 and X1e Instances with up t...Amazon Web Services
Amazon EC2 X1 and X1e instances are designed for demanding, memory-optimized enterprise workloads, including production installations of SAP HANA, Microsoft SQL Server, Apache Spark, and Presto. The recently released x1e.32xlarge is our largest cloud instance yet, offering 4 TB of DDR4 memory per instance. Join this session for a detailed look at this new instance, and learn how enterprise customers are using these instances to run mission-critical workloads, such as SAP HANA, to realize greater speed and agility.
GPSWKS407-Strategies for Migrating Microsoft SQL Databases to AWSAmazon Web Services
Data is king! In this workshop, we explore different strategies and options for migrating Microsoft SQL Server databases to AWS. We cover the migration process, mechanisms, best practices, version and object compatibility, and licensing. We walk you through the migration plan and activities, landing zone, and what you need to consider before migrating. We also discuss the different tools you can use for migration, monitoring, and backup and recovery.
ARC329_Optimizing Performance and Efficiency for Amazon EC2 and More with Tur...Amazon Web Services
The document discusses how Turbonomic can optimize performance and efficiency for Amazon EC2 and other cloud resources. Turbonomic provides complete visibility across hybrid environments and automates actions to continuously optimize performance, efficiency, and compliance. It enables real-time scaling of EC2 instances and storage, optimized sizing of databases on Amazon RDS, and planning for cloud migrations and use of reserved instances. Turbonomic aims to help organizations accelerate success with AWS through self-managing hybrid cloud management.
GPSTEC315_GPS Optimizing Tips Amazon Redshift for Cloud DataAmazon Web Services
This document summarizes optimization tips for Amazon Redshift data warehousing including: loading data efficiently using COPY and compression; designing tables with optimal distribution keys, sort keys and column sizes; identifying and addressing disk-based queries, unnecessary transactions, and stale table statistics; configuring WLM queues; and using Redshift Spectrum to query external data in Amazon S3. The presentation provides an overview of Redshift architecture and services, demonstrates optimization techniques, and invites questions from attendees.
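Two of these tips can be made concrete: declaring distribution and sort keys in the table DDL, and loading compressed files from S3 with COPY. The sketch below builds both statements as strings; the table, columns, bucket, and IAM role are invented for illustration, though the SQL shapes follow standard Redshift syntax.

```python
# A minimal sketch of two Redshift loading/design tips. Table name,
# columns, S3 path, and IAM role are hypothetical.

def clickstream_ddl():
    return """
CREATE TABLE clickstream (
    user_id    BIGINT,
    event_time TIMESTAMP,
    page       VARCHAR(256)
)
DISTKEY (user_id)
SORTKEY (event_time);
""".strip()
    # DISTKEY co-locates rows joined on user_id on the same slice;
    # SORTKEY lets range-restricted scans on event_time skip blocks.

def copy_command(table, s3_prefix, iam_role):
    # COPY from a key prefix so every slice loads files in parallel;
    # GZIP tells Redshift the input files are compressed.
    return (f"COPY {table} FROM '{s3_prefix}' "
            f"IAM_ROLE '{iam_role}' GZIP FORMAT AS CSV;")

sql = copy_command("clickstream", "s3://my-bucket/clicks/2017/",
                   "arn:aws:iam::123456789012:role/RedshiftCopy")
print(sql)
```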
NET309_Best Practices for Securing an Amazon Virtual Private CloudAmazon Web Services
This workshop will provide practical advice and guidance for designing and building secure Amazon Virtual Private Clouds (VPCs). Using a hands-on approach, we'll take you through Amazon VPC features such as subnets, security groups, network ACLs, routing, flow logs and service endpoints. The AWS team will also provide some guidance around best practices for VPC design and management, based on our experience of supporting customers running large-scale infrastructures.
ABD201-Big Data Architectural Patterns and Best Practices on AWSAmazon Web Services
In this session, we simplify big data processing as a data bus comprising various stages: collect, store, process, analyze, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Many customers want a disaster recovery environment, and they want to use this environment daily and know that it's in sync with and can support a production workload. This leads them to an active-active architecture. In other cases, companies like Netflix and Lyft serve users distributed over large geographies, and multi-region active-active deployments are not optional. Designing these architectures is more complicated than it appears, as data generated at one end needs to be synced with data at the other end. There are also consistency issues to consider, and trade-off decisions to make on cost, performance, and consistency. Further complicating matters, the variety of data stores used in the architecture results in a variety of replication methods. In this session, we explore how to design an active-active multi-region architecture using AWS services, including Amazon Route 53, Amazon RDS multi-region replication, AWS DMS, and Amazon DynamoDB Streams. We discuss the challenges, trade-offs, and solutions.
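One concrete consistency trade-off behind those decisions: when two regions accept writes to the same key, replication must resolve conflicts somehow. The sketch below applies a last-writer-wins rule keyed on a timestamp, which is simple but silently discards the losing write, exactly the kind of cost/consistency trade-off the session weighs. The record shapes and keys are hypothetical.

```python
# Last-writer-wins merge of two region replicas. Each replica maps a key
# to a (value, timestamp) pair; the newer timestamp wins per key.

def merge_regions(local, remote):
    """Merge a remote replica into a copy of the local one."""
    merged = dict(local)
    for key, (value, ts) in remote.items():
        if key not in merged or ts > merged[key][1]:
            merged[key] = (value, ts)     # remote write is newer: take it
    return merged

us_east = {"booking:42": ("confirmed", 100), "booking:43": ("pending", 90)}
eu_west = {"booking:42": ("cancelled", 120)}   # later write to the same key
print(merge_regions(us_east, eu_west))
```

Note that the "confirmed" write at timestamp 100 is lost entirely; tolerating that is the price of this simple strategy, and richer schemes (vector clocks, CRDTs) exist precisely to avoid it.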
AWS Commercial Management and Cost Optimisation - Dec 2017Amazon Web Services
Technical levers and strategic mechanisms for AWS Commercial Management and Cost Optimisation. Includes 2017 commercially relevant updates.
Speaker: Peter Shi, Commercial Architect, BD AWS APAC
Hybrid Cloud Data Management: Using Data for Business Outcomes - STG308 - re:...Amazon Web Services
Today, data backup isn’t enough. IT teams with a cloud data management strategy become a data broker for the business. Data helps the business improve company reputation, drive revenue, and satisfy customers. With a hybrid architecture approach to managing data on-premises and in the cloud, the business can be more agile and more responsive than today. Find out what your IT peers are doing with cloud data management (hint: it’s more than backup). Learn how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. See what your peers are doing to best move, manage, and use data across on-premises storage and cloud services. In this session, you learn steps for seamless, risk-free migration to different AWS services (Amazon EC2, Amazon RDS, Amazon S3, Amazon S3 - Infrequent Access class, Amazon Glacier and AWS Snowball); tactics for streamlined, enterprise-class disaster recovery; ways to save money by retiring expensive alternatives like tape storage; single view e-discovery across hybrid locations with dynamic data indexing across on-premises and cloud storage; and how to achieve holistic data protection across storage locations.
Session sponsored by Commvault
Increasingly, valuable customer data sources are dispersed among on-premises data centers, SaaS providers, partners, third-party data providers, and public datasets. Building a data lake on AWS offers a foundation for storing on-premises, third-party, and public datasets cost effectively with high performance. This workshop introduces AWS tools and technologies you can use to analyze and extract value from petabyte-scale datasets, including Amazon Athena and Amazon Redshift Spectrum.
A modern big data architecture involves extending your on-premises data management to AWS; implementing a data pipeline to stream real-time data into the Amazon Redshift cloud data warehouse; performing data transformation, discovery, and predictive analytics through machine learning; visualizing complex information; and being notified so you can respond to business events. This session is for APN Consulting Partners and organizations looking for ways to accelerate and modernize their big data projects. You will learn how to deploy and integrate AWS services with third-party solutions in AWS Marketplace, and how to reduce your time to market by combining AWS services, open source software, and ready-to-run solutions on AWS. Familiarity with database technologies is required. The session includes demonstrations and cooperative learning group activities.
Design patterns and best practices for data analytics with amazon emr (ABD305)Amazon Web Services
Amazon EMR is one of the largest Hadoop operators in the world, enabling customers to run ETL, machine learning, real-time processing, data science, and low-latency SQL at petabyte scale. In this session, we introduce you to Amazon EMR design patterns such as using Amazon S3 instead of HDFS, taking advantage of both long and short-lived clusters, and other Amazon EMR architectural best practices. We talk about lowering cost with Auto Scaling and Spot Instances, and security best practices for encryption and fine-grained access control. Finally, we dive into some of our recent launches to keep you current on our latest features.
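The design patterns named above (S3 instead of HDFS, short-lived clusters, Spot Instances for task capacity) all show up in the cluster launch configuration. The sketch below builds such a configuration as a plain dict mirroring the shape accepted by boto3's EMR `run_job_flow`; bucket names, instance sizing, and bid price are invented, and nothing here actually calls AWS.

```python
# A transient-EMR-cluster configuration sketch. The dict's shape follows
# boto3's emr run_job_flow parameters, but all values are hypothetical.

def transient_cluster_config(log_bucket):
    return {
        "Name": "nightly-etl",
        "LogUri": f"s3://{log_bucket}/emr-logs/",   # logs outlive the cluster
        "ReleaseLabel": "emr-5.10.0",
        "Instances": {
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m4.large",
                 "InstanceCount": 1, "Market": "ON_DEMAND"},
                {"InstanceRole": "CORE", "InstanceType": "m4.xlarge",
                 "InstanceCount": 2, "Market": "ON_DEMAND"},
                # Task nodes hold no HDFS data, so losing a Spot node
                # costs only compute -- a natural fit for Spot pricing.
                {"InstanceRole": "TASK", "InstanceType": "m4.xlarge",
                 "InstanceCount": 4, "Market": "SPOT", "BidPrice": "0.10"},
            ],
            # Transient cluster: terminate when the steps finish. Input
            # and output live in S3, so nothing on-cluster must survive.
            "KeepJobFlowAliveWhenNoSteps": False,
        },
    }

cfg = transient_cluster_config("my-data-lake")
print(cfg["LogUri"])
```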
ARC306_High Resiliency & Availability Of Online Entertainment Communities Usi...Amazon Web Services
As online engagement has grown in popularity as a means of entertainment, a wide range of online communities has flourished. These communities need to be highly available and resilient at scale: a loss of availability can be fatal to the product customers use. We will share the process you should use to develop architectural principles that allow you to reap the benefits of reduced complexity.
STG311_Deep Dive on Amazon S3 & Amazon Glacier Storage ManagementAmazon Web Services
Learn best practices for Amazon Simple Storage Service (Amazon S3) performance optimization, security, data protection, storage management, and much more. Learn how to optimize key naming to increase throughput, apply the appropriate AWS Identity and Access Management (IAM) and encryption configurations, and leverage object tagging and other features to enhance security.
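The key-naming tip reflects S3 guidance from this era: sequential key prefixes (dates, counters) can concentrate request load on one index partition, so a short randomized prefix spreads it. A minimal sketch, with made-up object names; the prefix length is an arbitrary choice here.

```python
# Spread S3 request load by prepending a stable hash prefix to each key.
# Deterministic, so the same logical name always maps to the same key.

import hashlib

def spread_key(name, prefix_len=4):
    """Return the key with a short hex hash prefix, e.g. '3f2a/logs/...'."""
    digest = hashlib.md5(name.encode()).hexdigest()[:prefix_len]
    return f"{digest}/{name}"

for name in ("logs/2017-11-27/host1.gz", "logs/2017-11-27/host2.gz"):
    print(spread_key(name))
```

The trade-off: hashed prefixes defeat listing objects by date range, so this pattern suits write-heavy workloads where keys are looked up individually.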
In this popular session, discover how Amazon EBS can take your application deployments on Amazon EC2 to the next level. Learn about Amazon EBS features and benefits, how to identify applications that are appropriate for use with Amazon EBS, best practices, and details about its performance and volume types. The target audience is storage administrators, application developers, applications owners, and anyone who wants to understand how to optimize performance for Amazon EC2 using the power of Amazon EBS.
ARC303_Running Lean Architectures How to Optimize for Cost EfficiencyAmazon Web Services
This document discusses best practices for optimizing AWS architectures for cost efficiency. It provides strategies across business goals, architecture, and operations. Key recommendations include using AWS cost tools, Reserved Instances, automation to avoid idle instances, Spot Instances, optimizing databases through caching, splitting architectures into focused stacks, and leveraging managed services. The presentation provides customer examples and emphasizes letting AWS handle undifferentiated heavy lifting to save money and focus on core business needs.
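The Reserved Instance recommendation is ultimately arithmetic: an RI pays off once an instance's expected running hours per term exceed the break-even point. A sketch with invented prices; real rates vary by region, instance type, and term.

```python
# Break-even analysis for a Reserved Instance vs. on-demand pricing.
# All prices below are hypothetical round numbers for illustration.

def breakeven_hours(on_demand_hourly, ri_upfront, ri_hourly):
    """Hours of use per term at which the RI becomes cheaper."""
    return ri_upfront / (on_demand_hourly - ri_hourly)

# Hypothetical pricing: $0.10/hr on demand vs. a 1-year RI with
# $250 upfront and a $0.03/hr effective rate.
hours = breakeven_hours(0.10, 250.0, 0.03)
year_hours = 365 * 24
print(f"break even at {hours:.0f}h = {hours / year_hours:.0%} of the year")
```

Under these numbers the RI wins if the instance runs more than roughly 40% of the year, which is why always-on workloads are the usual RI candidates while spiky ones stay on demand or move to Spot.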
DAT339_Replicate, Analyze, and Visualize Datasets Using AWS Database Migratio...Amazon Web Services
Customers often have disparate datasets within their data centers and on AWS. As a result, they find it challenging to replicate and analyze the data and drive positive business outcomes. In this workshop, we use AWS managed database services and serverless technologies to help replicate, analyze, and visualize data. We replicate an on-premises database to Amazon RDS and Amazon S3 using AWS Database Migration Service and the AWS Schema Conversion Tool. Then, we use Amazon Athena to interactively analyze data using SQL. Finally, we use Amazon QuickSight to visualize the data and enable better business decisions.
DAT332_How Verizon is Adopting Amazon Aurora PostgreSQL for Enterprise WorkloadsAmazon Web Services
Learn how Verizon is adopting the Amazon Aurora PostgreSQL-compatible edition for its mission-critical applications. Verizon has a history of adopting best-of-breed database technologies as it continues to serve its 140M+ customers. As Verizon moves its enterprise applications to the cloud, database performance and reliability are the key considerations. With heavy dependence on commercial databases, learn how a large enterprise like Verizon evaluated the performance, reliability, and operational characteristics of Amazon Aurora, and was able to create internal momentum behind the adoption of open source technologies by showcasing early wins. This session also highlights best practices for using Amazon Aurora and the newly announced RDS Performance Insights.
FINRA uses big data and data science technologies to detect fraud, market manipulation, and insider trading across US capital markets. As a financial regulator, FINRA analyzes highly sensitive data, so information security is critical. Learn how FINRA secures its Amazon S3 Data Lake and its data science platform on Amazon EMR and Amazon Redshift, while empowering data scientists with tools they need to be effective. In addition, FINRA shares AWS security best practices, covering topics such as AMI updates, micro segmentation, encryption, key management, logging, identity and access management, and compliance.
ATC303-Cache Me If You Can Minimizing Latency While Optimizing Cost Through A...Amazon Web Services
This document summarizes a presentation about caching strategies to minimize latency and optimize costs. It discusses caching at the edge with CloudFront, caching at the web tier and app tier with solutions like Varnish and ElastiCache, and caching database queries with DynamoDB Accelerator. The presentation provides examples from Team Internet and recommends caching everything possible at each layer to improve performance.
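The "cache everything possible at each layer" advice boils down to the same read-through pattern at every tier, usually with a time-to-live. A minimal app-tier sketch, standing in for what ElastiCache provides as a managed service; the clock is injected so the expiry behavior is easy to see.

```python
# A read-through cache with a TTL. On a miss or a stale entry the loader
# (standing in for a database query) runs; otherwise the cached value is
# returned. All names are illustrative.

import time

class TTLCache:
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl, self.clock, self.data = ttl_seconds, clock, {}

    def get_or_load(self, key, loader):
        entry = self.data.get(key)
        now = self.clock()
        if entry is None or now - entry[1] >= self.ttl:   # miss or stale
            entry = (loader(key), now)                    # reload + restamp
            self.data[key] = entry
        return entry[0]

calls = []
def slow_lookup(key):              # stands in for a database query
    calls.append(key)
    return key.upper()

t = [0.0]                          # fake clock we can advance by hand
cache = TTLCache(ttl_seconds=60, clock=lambda: t[0])
cache.get_or_load("price", slow_lookup)   # miss: hits the "database"
cache.get_or_load("price", slow_lookup)   # hit: served from cache
t[0] = 61.0                               # TTL expires
cache.get_or_load("price", slow_lookup)   # stale: reloads
print(len(calls))                         # 2 backend calls, not 3
```

The TTL is the knob that trades freshness for load: a longer TTL absorbs more reads but serves staler data, which is exactly the tuning decision at every caching layer the talk covers.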
This document provides a deep dive on Amazon Elastic Block Store (EBS) and includes:
- An overview of EBS, which provides block storage as a service that can be attached to EC2 instances and persists independently of them.
- Descriptions of the different EBS volume types, including General Purpose SSD (gp2), Provisioned IOPS SSD (io1), Throughput Optimized HDD (st1), and Cold HDD (sc1), and how to choose the appropriate type.
- Details on modifying EBS volumes, including increasing size and provisioned IOPS with EBS Elastic Volumes, and the modification process.
- Tips for automating volume modifications using services like AWS Lambda and Amazon CloudWatch.
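The automation tip reduces to: watch a utilization metric, decide a new size, then ask EC2 to modify the volume. The decision step is plain logic and is sketched below with invented thresholds; in a real Lambda function the decision would be followed by a call like `boto3.client("ec2").modify_volume(VolumeId=..., Size=new_size)`.

```python
# Sizing decision for an automated EBS grow-on-threshold workflow.
# Threshold, growth factor, and the 16 TiB cap are assumptions chosen
# for illustration; tune them for a real deployment.

def next_volume_size(current_gib, used_gib, threshold=0.8, growth=1.5,
                     max_gib=16384):
    """Return a larger size when utilization crosses the threshold,
    else the current size. EBS volumes can only grow, never shrink."""
    if used_gib / current_gib < threshold:
        return current_gib
    return min(int(current_gib * growth), max_gib)

print(next_volume_size(100, 50))   # 50% used -> stays at 100 GiB
print(next_volume_size(100, 85))   # 85% used -> grows to 150 GiB
```

Growing by a factor rather than a fixed increment keeps the number of modifications low, which matters because EBS enforces a cool-down period between modifications of the same volume.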
Choosing the Right Database for My Workload: Purpose-Built Databases AWS Germany
The document discusses choosing the right database for different types of workloads. It covers operational databases like Amazon DynamoDB, Amazon RDS, Amazon ElastiCache and Amazon Neptune that are well-suited for transactional workloads. It also discusses analytic databases like Amazon Redshift, Amazon Athena, Amazon Kinesis Analytics and Amazon Elasticsearch Service that are well-suited for large-scale analytics and business intelligence workloads. The document emphasizes that AWS offers a variety of purpose-built databases and there is no need to pick just one, as different databases can be combined to solve different aspects of a problem.
In this session, we discuss the evolution of database and analytics services in AWS, the new database and analytics services and features we launched this year, and our vision for continued innovation in this space. We are witnessing an unprecedented growth in the amount of data collected, in many different forms. Storage, management, and analysis of this data require database services that scale and perform in ways not possible before. AWS offers a collection of database and other data services—including Amazon Aurora, Amazon DynamoDB, Amazon RDS, Amazon Redshift, Amazon ElastiCache, Amazon Kinesis, and Amazon EMR—to process, store, manage, and analyze data. In this session, we provide an overview of AWS database and analytics services and discuss how customers are using these services today.
ARC329_Optimizing Performance and Efficiency for Amazon EC2 and More with Tur...Amazon Web Services
The document discusses how Turbonomic can optimize performance and efficiency for Amazon EC2 and other cloud resources. Turbonomic provides complete visibility across hybrid environments and automates actions to continuously optimize performance, efficiency, and compliance. It enables real-time scaling of EC2 instances and storage, optimized sizing of databases on Amazon RDS, and planning for cloud migrations and use of reserved instances. Turbonomic aims to help organizations accelerate success with AWS through self-managing hybrid cloud management.
GPSTEC315_GPS Optimizing Tips Amazon Redshift for Cloud DataAmazon Web Services
This document summarizes optimization tips for Amazon Redshift data warehousing including: loading data efficiently using COPY and compression; designing tables with optimal distribution keys, sort keys and column sizes; identifying and addressing disk-based queries, unnecessary transactions, and stale table statistics; configuring WLM queues; and using Redshift Spectrum to query external data in Amazon S3. The presentation provides an overview of Redshift architecture and services, demonstrates optimization techniques, and invites questions from attendees.
NET309_Best Practices for Securing an Amazon Virtual Private CloudAmazon Web Services
This workshop will provide practical advice and guidance for designing and building secure Amazon Virtual Private Clouds (VPCs). Using a hands-on approach, we'll take you through Amazon VPC features such as subnets, security groups, network ACLs, routing, flow logs and service endpoints. The AWS team will also provide some guidance around best practices for VPC design and management, based on our experience of supporting customers running large-scale infrastructures.
ABD201-Big Data Architectural Patterns and Best Practices on AWSAmazon Web Services
In this session, we simplify big data processing as a data bus comprising various stages: collect, store, process, analyze, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Many customers want a disaster recovery environment, and they want to use this environment daily and know that it's in sync with and can support a production workload. This leads them to an active-active architecture. In other cases, users like Netflix and Lyft are distributed over large geographies. In these cases, multi-region active-active deployments are not optional. Designing these architectures is more complicated than it appears, as data being generated at one end needs to be synced with data at the other end. There are also consistency issues to consider. One needs to make trade-off decisions on cost, performance, and consistency. Further complicating matters is the variety of data stores used in the architecture results in a variety replication methods. In this session, we explore how to design an active-active multi-region architecture using AWS services, including Amazon Route 53, Amazon RDS multi-region replication, AWS DMS, and Amazon DynamoDB Streams. We discuss the challenges, trade-offs, and solutions.
AWS Commercial Management and Cost Optimisation - Dec 2017Amazon Web Services
Technical levers and strategic mechanisms for AWS Commercial Management and Cost Optimisation. Includes 2017 commercially relevant updates.
Speaker: Peter Shi, Commercial Architect, BD AWS APAC
Hybrid Cloud Data Management: Using Data for Business Outcomes - STG308 - re:...Amazon Web Services
Today, data backup isn’t enough. IT teams with a cloud data management strategy become a data broker for the business. Data helps the business improve company reputation, drive revenue, and satisfy customers. With a hybrid architecture approach to managing data on-premises and in the cloud, the business can be more agile and more responsive than today. Find out what your IT peers are doing with cloud data management (hint: it’s more than backup). Learn how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. See what your peers are doing to best move, manage, and use data across on-premises storage and cloud services. In this session, you learn steps for seamless, risk-free migration to different AWS services (Amazon EC2, Amazon RDS, Amazon S3, Amazon S3 - Infrequent Access class, Amazon Glacier and AWS Snowball); tactics for streamlined, enterprise-class disaster recovery; ways to save money by retiring expensive alternatives like tape storage; single view e-discovery across hybrid locations with dynamic data indexing across on-premises and cloud storage; and how to achieve holistic data protection across storage locations.
Session sponsored by Commvault
Increasingly, valuable customer data sources are dispersed among on-premises data centers, SaaS providers, partners, third-party data providers, and public datasets. Building a data lake on AWS offers a foundation for storing on-premises, third-party, and public datasets cost effectively with high performance. This workshop introduces AWS tools and technologies you can use to analyze and extract value from petabyte-scale datasets, including Amazon Athena and Amazon Redshift Spectrum.
A modern Big Data architecture involves extending your on-premises data management to AWS: implementing a data pipeline that streams real-time data into the Amazon Redshift cloud data warehouse, performing data transformation and discovery, running predictive analytics through machine learning, visualizing complex information, and receiving notifications so you can respond to business events. This session is for APN Consulting Partners and organizations looking for ways to accelerate and modernize their Big Data projects. You will learn how to deploy and integrate AWS services with third-party solutions in AWS Marketplace, and how to reduce your time to market by combining AWS services, open-source software, and ready-to-run solutions on AWS. Familiarity with database technologies is required. The session includes demonstrations and cooperative learning group activities.
Design Patterns and Best Practices for Data Analytics with Amazon EMR (ABD305) - Amazon Web Services
Amazon EMR is one of the largest Hadoop operators in the world, enabling customers to run ETL, machine learning, real-time processing, data science, and low-latency SQL at petabyte scale. In this session, we introduce you to Amazon EMR design patterns such as using Amazon S3 instead of HDFS, taking advantage of both long and short-lived clusters, and other Amazon EMR architectural best practices. We talk about lowering cost with Auto Scaling and Spot Instances, and security best practices for encryption and fine-grained access control. Finally, we dive into some of our recent launches to keep you current on our latest features.
ARC306_High Resiliency & Availability Of Online Entertainment Communities Usi...Amazon Web Services
With the increasing popularity of online engagement as a form of entertainment, a wide range of online communities has become popular. These communities need to be highly available and resilient at scale. A loss of availability can be fatal to the products customers rely on. We will share the process you should use to develop architectural principles that let you reap the benefits of reduced complexity.
STG311_Deep Dive on Amazon S3 & Amazon Glacier Storage ManagementAmazon Web Services
Learn best practices for Amazon Simple Storage Service (Amazon S3) performance optimization, security, data protection, storage management, and much more. Learn how to optimize key naming to increase throughput, apply the appropriate AWS Identity and Access Management (IAM) and encryption configurations, and leverage object tagging and other features to enhance security.
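One key-naming idea from sessions like this is spreading object keys across many prefixes so requests fan out. A minimal sketch of that technique follows; the hash choice and prefix width are assumptions for illustration, and note that current S3 scales per-prefix automatically, so this pattern matters less than it once did:

```python
import hashlib

def prefixed_key(original_key: str, width: int = 2) -> str:
    """Prepend a short, deterministic hash so keys spread across prefixes.

    The md5 choice and 2-character width are illustrative assumptions,
    not official AWS guidance.
    """
    digest = hashlib.md5(original_key.encode()).hexdigest()[:width]
    return f"{digest}/{original_key}"

# Example: "logs/2017/app.log" gains a pseudo-random leading prefix,
# while remaining reproducible for later lookups.
spread = prefixed_key("logs/2017/app.log")
```

Because the prefix is derived from the key itself, readers can recompute it at lookup time rather than storing a mapping.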
In this popular session, discover how Amazon EBS can take your application deployments on Amazon EC2 to the next level. Learn about Amazon EBS features and benefits, how to identify applications that are appropriate for use with Amazon EBS, best practices, and details about its performance and volume types. The target audience is storage administrators, application developers, applications owners, and anyone who wants to understand how to optimize performance for Amazon EC2 using the power of Amazon EBS.
ARC303_Running Lean Architectures How to Optimize for Cost EfficiencyAmazon Web Services
This document discusses best practices for optimizing AWS architectures for cost efficiency. It provides strategies across business goals, architecture, and operations. Key recommendations include using AWS cost tools, Reserved Instances, automation to avoid idle instances, Spot Instances, optimizing databases through caching, splitting architectures into focused stacks, and leveraging managed services. The presentation provides customer examples and emphasizes letting AWS handle undifferentiated heavy lifting to save money and focus on core business needs.
DAT339_Replicate, Analyze, and Visualize Datasets Using AWS Database Migratio...Amazon Web Services
Customers often have disparate datasets within their data centers and on AWS. As a result, they find it challenging to replicate and analyze the data and drive positive business outcomes. In this workshop, we use AWS managed database services and serverless technologies to help replicate, analyze, and visualize data. We replicate an on-premises database to Amazon RDS and Amazon S3 using AWS Database Migration Service and the AWS Schema Conversion Tool. Then, we use Amazon Athena to interactively analyze data using SQL. Finally, we use Amazon QuickSight to visualize the data and enable better business decisions.
DAT332_How Verizon is Adopting Amazon Aurora PostgreSQL for Enterprise WorkloadsAmazon Web Services
Learn how Verizon is adopting the Amazon Aurora PostgreSQL-compatible edition for their mission-critical applications. Verizon has a history of adopting best of breed database technologies as they continue to serve their 140M+ customers. As Verizon moves its enterprise applications to the cloud, database performance and reliability are the key considerations. With heavy dependence on commercial databases, learn how a large enterprise like Verizon evaluated performance, reliability and operational characteristics of Amazon Aurora, and was able to create internal momentum behind adoption of open source technologies by showcasing early wins. This session also highlights best practices for using Amazon Aurora and the newly-announced RDS Performance Insights.
FINRA uses big data and data science technologies to detect fraud, market manipulation, and insider trading across US capital markets. As a financial regulator, FINRA analyzes highly sensitive data, so information security is critical. Learn how FINRA secures its Amazon S3 Data Lake and its data science platform on Amazon EMR and Amazon Redshift, while empowering data scientists with tools they need to be effective. In addition, FINRA shares AWS security best practices, covering topics such as AMI updates, micro segmentation, encryption, key management, logging, identity and access management, and compliance.
ATC303-Cache Me If You Can Minimizing Latency While Optimizing Cost Through A...Amazon Web Services
This document summarizes a presentation about caching strategies to minimize latency and optimize costs. It discusses caching at the edge with CloudFront, caching at the web tier and app tier with solutions like Varnish and ElastiCache, and caching database queries with DynamoDB Accelerator. The presentation provides examples from Team Internet and recommends caching everything possible at each layer to improve performance.
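The cache-aside pattern the talk applies with ElastiCache and DAX can be sketched as a tiny in-process TTL cache; the class name, TTL value, and loader shape below are illustrative assumptions, not the API of either service:

```python
import time

class TTLCache:
    """Minimal cache-aside store: entries expire after ttl seconds."""

    def __init__(self, ttl: float):
        self.ttl = ttl
        self._store = {}  # key -> (value, expiry timestamp)

    def get_or_load(self, key, loader):
        """Return the cached value, or call loader() on a miss and cache it."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]          # cache hit: skip the expensive load
        value = loader()             # cache miss: fetch from the source
        self._store[key] = (value, now + self.ttl)
        return value

# Usage: the loader stands in for a slow database query.
calls = []
cache = TTLCache(ttl=60.0)
cache.get_or_load("user:1", lambda: calls.append(1) or "alice")  # miss: loads
cache.get_or_load("user:1", lambda: calls.append(1) or "alice")  # hit: no load
```

The same shape applies at every layer the talk covers: only the backing store (CloudFront, Varnish, ElastiCache, DAX) changes.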
This document provides a deep dive on Amazon Elastic Block Store (EBS) and includes:
- An overview of EBS including that it provides block storage as a service that can be attached to EC2 instances and persists independently.
- Descriptions of the different EBS volume types including General Purpose SSD (gp2), Provisioned IOPS SSD (io1), Throughput Optimized HDD (st1), Cold HDD (sc1), and how to choose the appropriate type.
- Details on modifying EBS volumes including increasing size and provisioned IOPS using EBS elastic volumes and the modification process.
- Tips for automating volume modifications using services like AWS Lambda and Amazon CloudWatch
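Choosing among the volume types listed above can be expressed as a small decision function. The thresholds below are illustrative assumptions for a sketch, not official AWS sizing guidance:

```python
def pick_ebs_volume(iops_needed: int, throughput_mbs: int,
                    latency_sensitive: bool) -> str:
    """Map rough workload needs to the EBS volume types named in the talk.

    Thresholds are assumptions: io1 for very high provisioned IOPS,
    gp2 for general latency-sensitive work, st1 for high-throughput
    sequential workloads, sc1 for cold, infrequently accessed data.
    """
    if latency_sensitive:
        return "io1" if iops_needed > 16000 else "gp2"
    return "st1" if throughput_mbs > 100 else "sc1"
```

In practice the decision also weighs cost per GB and burst behavior, which this sketch ignores.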
Choosing the Right Database for My Workload: Purpose-Built Databases AWS Germany
The document discusses choosing the right database for different types of workloads. It covers operational databases like Amazon DynamoDB, Amazon RDS, Amazon ElastiCache and Amazon Neptune that are well-suited for transactional workloads. It also discusses analytic databases like Amazon Redshift, Amazon Athena, Amazon Kinesis Analytics and Amazon Elasticsearch Service that are well-suited for large-scale analytics and business intelligence workloads. The document emphasizes that AWS offers a variety of purpose-built databases and there is no need to pick just one, as different databases can be combined to solve different aspects of a problem.
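The "right tool for the job" idea above can be caricatured as a tiny selector. The attribute names and rules are illustrative assumptions made for this sketch, not AWS guidance, and real evaluations weigh many more dimensions:

```python
def pick_database(workload: dict) -> str:
    """Toy selector reflecting the operational/analytic split in the talk.

    Keys like 'analytic', 'graph', and 'latency_ms' are hypothetical
    workload attributes invented for illustration.
    """
    if workload.get("analytic"):
        # Analytic side: ad-hoc S3 queries vs. a managed warehouse.
        return "Amazon Athena" if workload.get("ad_hoc") else "Amazon Redshift"
    if workload.get("graph"):
        return "Amazon Neptune"
    if workload.get("latency_ms", 10) < 1:
        return "Amazon ElastiCache"   # sub-millisecond needs in-memory
    if workload.get("key_value"):
        return "Amazon DynamoDB"
    return "Amazon RDS"               # default: relational, transactional
```

As the document stresses, these are not mutually exclusive: one application often combines several of these services.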
In this session, we discuss the evolution of database and analytics services in AWS, the new database and analytics services and features we launched this year, and our vision for continued innovation in this space. We are witnessing an unprecedented growth in the amount of data collected, in many different forms. Storage, management, and analysis of this data require database services that scale and perform in ways not possible before. AWS offers a collection of database and other data services—including Amazon Aurora, Amazon DynamoDB, Amazon RDS, Amazon Redshift, Amazon ElastiCache, Amazon Kinesis, and Amazon EMR—to process, store, manage, and analyze data. In this session, we provide an overview of AWS database and analytics services and discuss how customers are using these services today.
What are the different options for a developer to run his DB in the Cloud? This session will look into the different options and how to choose the right DB for your workload.
Learn how to reduce development time and innovate on AWS. In this webinar, Beachbody - sellers of fitness, weight loss, and muscle-building home-exercise videos - talks about their experience migrating to a data lake on Amazon Simple Storage Service (Amazon S3) using Talend. Beachbody will describe how they created an open enterprise data platform, giving their employees access to secure, well-governed data, and increasing DevOps efficiency across the entire company.
The document discusses AWS database and analytics services. It provides an overview of the portfolio of AWS services for databases and analytics, including both relational and non-relational databases as well as data analytics services. It also discusses several specific AWS services in more detail, including Amazon Aurora, DynamoDB, ElastiCache, Neptune, and analytics services like Redshift.
The document discusses managed NoSQL databases, including Amazon DynamoDB, Amazon Neptune, and Amazon ElastiCache. It provides an overview of each service, highlighting key features such as DynamoDB being a fast and flexible key-value and document database, Neptune being a fully managed graph database, and ElastiCache providing an in-memory cache. It also discusses why organizations are adopting non-relational databases to address needs for massive scale, low latency, and schema flexibility for highly connected internet applications.
The document discusses migrating big data workloads from on-premises environments to AWS. It describes deconstructing current workloads, identifying challenges with on-premises architectures, and how to migrate components to AWS services like Amazon EMR and Amazon S3. The document also shares the experience of Vanguard migrating their big data workload to AWS.
Using AWS Purpose-Built Databases to Modernize your ApplicationsAmazon Web Services
As you look to modernizing your applications, you will need to consider your database options to meet the new application requirements. AWS offers a series of purpose-built databases that include relational, key value, document, graph and cache use cases to help you deliver new and enhanced functionalities. In this webinar session, we share the different modern application architectures, and how to combine different database services to meet your requirements. Understand how to modernize your relational databases through easy upgrades with Amazon Relational Database Service and learn how to migrate from one database to another with AWS Database Migration Service and AWS Schema Conversion Tool.
Speaker:
Blair Layton, Business Development Manager, Amazon Web Services
Building low latency apps with a serverless architecture and in-memory data I...AWS Germany
In-memory data stores such as ElastiCache for Redis enable applications with response times in microseconds. Using Aurora, DynamoDB, DAX, Lambda, and ElastiCache, we explore how to design and deploy high-performance applications. Learn more here: https://aws.amazon.com/products/databases/
This is the general session for Amazon DynamoDB and will cover newly announced features, as well as provide an end to end view of recent innovations. We will also share some of our successful customer stories and use cases. Come to this session to learn all about what’s new for DynamoDB!
Applying AWS Purpose-Built Database Strategy - SRV307 - Toronto AWS SummitAmazon Web Services
In this session, we dive deep into applying the "AWS Purpose-Built Database Strategy" to determine which databases to use for which components of your application. Learn how to evaluate a new workload for the best managed database option based on specific application needs related to data shape, data size at limit, computational requirements, programmability, throughput and latency needs, etc. This session explains the ideal use cases for relational and non-relational database services, including Amazon Aurora, Amazon DynamoDB, Amazon ElastiCache for Redis, Amazon Neptune, and Amazon Redshift.
AWS Purpose-Built Database Strategy: The Right Tool For The Right JobAmazon Web Services
This document provides an overview of Amazon Web Services database strategies and services. It discusses how to choose the right database based on data structure, volume, and scaling needs. It also describes several AWS database offerings, including Amazon RDS, DynamoDB, Redshift, ElastiCache, and DMS. Hands-on demonstrations are provided for creating an RDS MySQL database and migrating a database using AWS Database Migration Service.
Migrating your traditional Data Warehouse to a Modern Data LakeAmazon Web Services
In this session, we discuss the latest features of Amazon Redshift and Redshift Spectrum, and take a deep dive into its architecture and inner workings. We share many of the recent availability, performance, and management enhancements and how they improve your end user experience. You also hear from 21st Century Fox, who presents a case study of their fast migration from an on-premises data warehouse to Amazon Redshift. Learn how they are expanding their data warehouse to a data lake that encompasses multiple data sources and data formats. This architecture helps them tie together siloed business units and get actionable 360-degree insights across their consumer base.
Technology Trends in Data Processing - DAT311 - re:Invent 2017Amazon Web Services
In this talk, Anurag Gupta, VP for AWS Analytic and Transactional Database Services, discusses some of the key trends we see in data processing and how they shape the services we offer at AWS. Specific trends include the rise of machine-generated logs as the dominant source of data, the move towards serverless, API-centric computing, and the growing need for local access to data from users around the world.
SRV307 Applying AWS Purpose-Built Database Strategy: Match Your Workload to ...Amazon Web Services
In this session, Tony Petrossian, director of engineering, AWS Database Services, dives deep into what databases to use for which components of your application. Learn how to evaluate a new workload for the best managed database option based on specific application needs related to data shape, data size at limit, computational requirements, programmability, throughput and latency needs, etc. This session explains the ideal use cases for relational and non-relational database services, including Amazon Aurora, Amazon DynamoDB, Amazon ElastiCache for Redis, Amazon Neptune, and Amazon Redshift.
Migrating to 21st Century Analytics: Zopa Story
Speakers:
Shafreen Sayyed, Solution Architect, AWS
Varun Gangoor, Senior Big Data Engineer, Zopa
Data makes the world go around these days, and 21st Century Data Analytics means you can store, process and analyze massive amounts of data, often in real time, whilst making that data consumable across diverse groups in your organization. Many traditional tools lock data away in inflexible silos, making this impossible. This session will look at what is needed in a Financial Services organization to achieve a flexible and scalable data architecture, and we will also hear from Zopa, UK's first peer-to-peer lending company, about how they migrated their data analytics estate to AWS and look at what new insight that has given them.
This document discusses data warehousing and analytics using Amazon Redshift. It provides an overview of Redshift's capabilities such as its columnar data storage, automatic scaling, integration with data lakes in Amazon S3, and query performance. It also covers best practices for optimizing Redshift performance through techniques like compression, sorting, and distribution of data.
FINRA's Managed Data Lake: Next-Gen Analytics in the Cloud - ENT328 - re:Inve...Amazon Web Services
FINRA faced challenges with their on-premises data infrastructure, including difficulty tracking data, limited scalability, and high costs. They migrated to a managed data lake on AWS to address these issues. This provided centralized data management with a catalog, separation of storage and compute, encryption, and cost optimization. It enabled faster analytics through Presto querying, machine learning model development, and reduced TCO by 30% compared to their on-premises environment. Lessons learned included embracing disruption, automating infrastructure, and treating infrastructure as code. FINRA is exploring additional AWS services like Athena, Lambda, and Step Functions to continue improving their analytics capabilities.
How TrueCar Gains Actionable Insights with Splunk Cloud PPTAmazon Web Services
The vast amount of big data that today’s companies generate makes it difficult to separate the signal from the noise. Organizations need to derive meaningful insights into operations and business to take action. TrueCar needed a better way to manage, search, and analyze their hybrid environment. In this webinar, you’ll learn how TrueCar centralized all of their data in one place using Amazon Kinesis and Splunk Cloud, gaining deep visibility, scalability, and the ability to monitor and troubleshoot operational issues – all while migrating to AWS.
Let the data decide!
Amazon Relational Database Service (RDS)
Demo - Deploy Multi-AZ database in VPC
Amazon DynamoDB (NoSQL)
Intro to AWS Athena and Redshift
How to Build Forecasting Services Using ML and Deep Learning Algorithms - Amazon Web Services
Forecasting is an important process for many companies and is used in many areas to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session, we show how to pre-process data that contains a temporal component and then apply an algorithm that, based on the type of data analyzed, produces an accurate forecast.
Big Data for Startups: How to Build Serverless Big Data Applications - Amazon Web Services
The variety and volume of data created every day is growing ever faster and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment accessible only to established companies. But the elasticity of the cloud and, in particular, serverless services let us break through these limits.
We show how to develop Big Data applications quickly, without worrying about infrastructure, dedicating all resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we present the main features of the service and how to deploy your application in a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing its pace of innovation. Over that period we learned how changing our approach to application development dramatically increased agility and release velocity and, ultimately, allowed us to build more reliable and scalable applications. In this session we explain how we define modern applications, and how building modern apps affects not just application architecture but also organizational structure, development release pipelines, and even the operating model. We also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to Spend Up to 90% Less with Containers and Spot Instances - Amazon Web Services
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can all take advantage of Spot Instances, yielding average savings of 70% compared to On-Demand instances. In this session we explore the characteristics of Spot Instances and how easily they can be used on AWS. We also learn how Spreaker uses Spot Instances to run applications of many kinds, in production, at a fraction of the on-demand cost!
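The savings figure quoted above reduces to a one-line calculation; the prices below are illustrative assumptions, not actual Spot or On-Demand rates:

```python
def spot_savings(on_demand_hourly: float, spot_hourly: float) -> float:
    """Percentage saved by running on Spot instead of On-Demand."""
    return round(100 * (1 - spot_hourly / on_demand_hourly), 1)

# Illustrative prices: $0.10/hr On-Demand vs. $0.03/hr Spot -> 70.0% saved
savings = spot_savings(0.10, 0.03)
```

Real Spot prices fluctuate with capacity, so savings vary by instance type, Region, and time.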
In recent months, many customers have been asking us how to monetize Open APIs, simplify fintech integrations, and accelerate the adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda:
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make Your Startup's Offering Unique in the Market with Machine Learning Services - Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative components built ad hoc.
AWS provides ready-to-use services and, at the same time, lets you customize and build the differentiating elements of your own offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, including through a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: Automate the Management and Deployment of... - Amazon Web Services
With the traditional approach to IT, implementing DevOps techniques was difficult for many years; they often involved manual work that occasionally led to application downtime and interrupted user operations. With the advent of the cloud, DevOps techniques are now within everyone's reach at low cost for any kind of workload, guaranteeing greater system reliability and delivering significant improvements in business continuity.
AWS offers AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances using Chef and Puppet.
Discover how to use AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to Support Your Windows Workloads - Amazon Web Services
Want to learn about the options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we discuss the options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and running Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we explore the possibilities offered by AWS services for applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are hosting a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in cloud environments based on VMware vSphere® and access a wide range of AWS services, fully exploiting the potential of the AWS cloud while protecting existing VMware investments.
Build Your First Serverless Ledger-Based App with QLDB and NodeJS - Amazon Web Services
Many companies today build applications with ledger-like functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply-chain flow of their products.
At the core of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB removes the need to build custom, complex systems by providing a fully managed serverless ledger database.
In this session we will see how to build a complete serverless application that uses QLDB's capabilities.
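The ledger idea described above, an append-only log that is cryptographically verifiable, can be sketched in a few lines. This is a toy model of the concept, not the QLDB API; the field names are assumptions:

```python
import hashlib
import json

def append_entry(chain: list, data: dict) -> list:
    """Append a block whose hash covers the data and the previous hash,
    mimicking the verifiable log a ledger database maintains."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(data, sort_keys=True) + prev
    chain.append({"data": data, "prev": prev,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})
    return chain

def verify(chain: list) -> bool:
    """Recompute every hash in order; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        payload = json.dumps(block["data"], sort_keys=True) + prev
        if block["prev"] != prev or \
           block["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True

# Usage: record two banking-style transactions, then verify the log.
ledger = []
append_entry(ledger, {"txn": "credit", "amount": 100})
append_entry(ledger, {"txn": "debit", "amount": 40})
```

Because each block's hash includes the previous block's hash, editing any past entry invalidates every later hash, which is the property that makes the log tamper-evident.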
With the rise of microservice architectures and rich mobile and web applications, APIs are more important than ever for delivering an exceptional user experience to end users. In this session we learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We dive into several scenarios, understanding how AppSync can help solve these use cases by building modern APIs with real-time and offline data-update capabilities.
We also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Databases and VMware Cloud™ on AWS: Debunking the Myths - Amazon Web Services
Many organizations take advantage of the cloud by migrating their Oracle workloads, securing significant benefits in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, and performance risks can be introduced when moving applications out of on-premises data centers.
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads while accelerating the transformation to the cloud; they dive into the architecture and demonstrate how to fully exploit the potential of VMware Cloud™ on AWS.
1) The document discusses building a minimum viable product (MVP) using Amazon Web Services (AWS).
2) It provides an example of an MVP for an omni-channel messenger platform that was built from 2017 to connect ecommerce stores to customers via web chat, Facebook Messenger, WhatsApp, and other channels.
3) The founder discusses how they started with an MVP in 2017 with 200 ecommerce stores in Hong Kong and Taiwan, and have since expanded to over 5000 clients across Southeast Asia using AWS for scaling.
This document discusses pitch decks and fundraising materials. It explains that venture capitalists will typically spend only 3 minutes and 44 seconds reviewing a pitch deck. Therefore, the deck needs to tell a compelling story to grab their attention. It also provides tips on tailoring different types of decks for different purposes, such as creating a concise 1-2 page teaser, a presentation deck for pitching in-person, and a more detailed read-only or fundraising deck. The document stresses the importance of including key information like the problem, solution, product, traction, market size, plans, team, and ask.
This document discusses building serverless web applications using AWS services like API Gateway, Lambda, DynamoDB, S3 and Amplify. It provides an overview of each service and how they can work together to create a scalable, secure and cost-effective serverless application stack without having to manage servers or infrastructure. Key services covered include API Gateway for hosting APIs, Lambda for backend logic, DynamoDB for database needs, S3 for static content, and Amplify for frontend hosting and continuous deployment.
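The API Gateway + Lambda pairing described above boils down to a handler function that receives a request event and returns a response object. A minimal proxy-style sketch follows; the event shape mirrors the API Gateway proxy format, but the field values and greeting logic are illustrative assumptions:

```python
import json

def handler(event, context=None):
    """Hypothetical Lambda handler behind an API Gateway proxy integration.

    Reads an optional 'name' query parameter and returns a JSON greeting.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a hand-built event, as one would do in a unit test.
response = handler({"queryStringParameters": {"name": "dev"}})
```

Locally invoking the handler with a hand-built event is also how such functions are typically unit-tested before deployment.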
This document provides tips for fundraising from startup founders Roland Yau and Sze Lok Chan. It discusses generating competition to create urgency for investors, fundraising in parallel rather than sequentially, having a clear fundraising narrative focused on what you do and why it's compelling, and prioritizing relationships with people over firms. It also notes how the pandemic has changed fundraising, with examples of deals done virtually during this time. The tips emphasize being fully prepared before fundraising and cultivating connections with investors in advance.
AWS_HK_StartupDay_Building Interactive websites while automating for efficien...Amazon Web Services
This document discusses Amazon's machine learning services for building conversational interfaces and extracting insights from unstructured text and audio. It describes Amazon Lex for creating chatbots, Amazon Comprehend for natural language processing tasks like entity extraction and sentiment analysis, and how they can be used together for applications like intelligent call centers and content analysis. Pre-trained APIs simplify adding machine learning to apps without requiring ML expertise.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies managing Docker containers through an orchestration layer controlling deployment and lifecycle. In this session we present the main features of the service, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.
Analytics database dimensions (slide comparing Amazon Redshift, Amazon Athena, and Kinesis Analytics):
- Streaming analytics
- Serverless ad-hoc query
- Process, prepare and index in-place
- Low latency for reporting and BI dashboards
- Pay per query
- Data warehouse with multiple enterprise data sources
- Query data directly in S3 without format conversions
- Directly query CSV, JSON, TSV or text files