by Drew Meyer, Sr. Product Marketing Manager, AWS
This session will provide an overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. We will touch on new offerings, outline some of the most common use cases, and prepare you for the individual deep dive sessions, customer sessions, and new announcements. The session will also address our partner network and what it means for a storage provider to have the APN Storage Competency.
This document summarizes a presentation on data lifecycle and storage management techniques for Amazon S3. It discusses lifecycle management rules for transitioning or expiring objects based on age, S3 inventory for listing objects, object tagging for classification and policy filtering, storage class analysis for monitoring usage and optimizing storage, and monitoring tools like CloudWatch and CloudTrail. The presentation provides an overview and best practices for these S3 management features.
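As a concrete illustration of the lifecycle and tagging features the presentation covers, the sketch below builds a lifecycle configuration that transitions tagged objects to Glacier after 90 days and expires them after a year. The bucket name, prefix, tag, and day thresholds are all hypothetical; the dict structure matches what boto3's `put_bucket_lifecycle_configuration` accepts.

```python
# Hypothetical lifecycle configuration: transition objects under logs/ that
# carry the tag class=archive to Glacier after 90 days, and expire them
# after 365 days. Tag-based filtering is the "object tagging for policy
# filtering" feature described above.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-then-expire-logs",
            "Filter": {
                "And": {
                    "Prefix": "logs/",
                    "Tags": [{"Key": "class", "Value": "archive"}],
                }
            },
            "Status": "Enabled",
            "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 365},
        }
    ]
}

# Applying it requires AWS credentials, so the call is shown but not run:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-bucket", LifecycleConfiguration=lifecycle_config)

rule = lifecycle_config["Rules"][0]
print(rule["Transitions"][0]["StorageClass"])  # GLACIER
```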
by Dave Stein, Business Development Manager, AWS
Discover how EBS can take your application deployments on EC2 to the next level. You will learn service features and benefits, how to identify applications that are appropriate for use with EBS, best practices, and details about its performance and volume types.
by Henry Zhang, Sr. Product Manager, AWS
Compared to storing long-term datasets on-premises, archiving in the cloud is a smart alternative, whether you’re looking for an active archive solution, a tape replacement, or a way to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier.
by PD Dutta, Sr. Product Manager, Object Storage, AWS
We will explain how to design and build an IoT cloud platform on top of Amazon S3. You will get to review the best practices for architecting a cost-effective, durable, and secure storage solution to store and analyze your IoT data on Amazon S3. In addition, we’ll cover how to collect, ingest and analyze the data in-place using different AWS Services such as AWS IoT, Amazon Kinesis, Amazon Athena, and Amazon Redshift Spectrum.
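One common pattern for storing IoT telemetry in S3 so it can be analyzed in place is to lay objects out under Hive-style date partitions, which lets engines like Athena and Redshift Spectrum prune by date. The helper below is an illustrative sketch; the `telemetry` prefix and naming scheme are assumptions, not a fixed AWS convention.

```python
from datetime import datetime, timezone

def iot_object_key(device_id: str, ts: datetime, prefix: str = "telemetry") -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so that
    query-in-place tools such as Athena can prune partitions by date.
    The prefix and layout here are illustrative choices."""
    return (f"{prefix}/year={ts.year}/month={ts.month:02d}/day={ts.day:02d}/"
            f"{device_id}-{int(ts.timestamp())}.json")

key = iot_object_key("sensor-42",
                     datetime(2017, 11, 27, 12, 0, tzinfo=timezone.utc))
print(key)
```

A device writing one object per reading under this layout lets a date-bounded Athena query scan only the matching partitions instead of the whole bucket.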
Backup and Recovery with Cloud-Native Deduplication and Use Cases from the Fi... (Amazon Web Services)
by Hugh Emberson, CTO, StorReduce
Designing and deploying cloud-enabled backup and recovery solutions often creates opportunities to reduce storage requirements and increase efficiency. Effective cloud-native deduplication as part of your backup and recovery strategy can streamline migration, reduce the need for purpose-built backup appliances (such as Data Domain) and large tape archives, and enable cost reductions of up to 95%. In this session, StorReduce provides best practices for data deduplication when designing and deploying solutions for backup, archive, and general unstructured file data. They also demonstrate how a cloud-native interface with scale-out deduplication enables generic cloud services, such as search, across all backups moved to the cloud, and they guide the audience through two customer use cases from the financial services and healthcare industries.
Disaster Recovery Best Practices and Customer Use Cases: CGS and HealthQuest (Amazon Web Services)
This document provides an agenda for a CloudEndure presentation at an AWS storage day event. The agenda includes an introduction to CloudEndure and how it works with AWS, enterprise disaster recovery strategies, a deep dive into CloudEndure's DR technology with a demo, customer case studies, and a Q&A session. It discusses CloudEndure's key technology pillars for OS-based continuous replication, along with disaster recovery benefits such as cost reductions and improved recovery objectives. The document also includes case studies of how CloudEndure helped customers like CGS and HealthQuest implement DR solutions on AWS to reduce costs and improve recovery times, and it promotes the CloudEndure DR and migration products available on AWS Marketplace, closing with a call to action.
Building Hybrid Cloud Storage Architectures with AWS @scale (Amazon Web Services)
The document discusses building hybrid cloud storage architectures with AWS. It provides an overview of AWS storage services including Amazon S3, Glacier, EBS, and EFS. It also describes the AWS Storage Gateway family of on-premises appliances that enable hybrid storage between on-premises and AWS cloud storage. Specifically, it covers the File Gateway for accessing S3 storage as files, Volume Gateway for iSCSI volumes, and Tape Gateway for migrating tape backups to S3.
Using AWS for Backup and Restore (backup in the cloud, backup to the cloud, a... (Amazon Web Services)
Companies are using AWS to create and deploy efficient, fast, and cost-effective backup and restore capabilities to protect critical IT systems without incurring the infrastructure expense of a second physical site. In this session, we will talk about cloud-based services AWS provides to enable robust backup and rapid recovery of your IT infrastructure and data.
This document discusses optimizing storage for big data workloads on AWS. It provides an overview of various AWS storage options for different big data use cases like Hadoop, data warehousing, NoSQL databases and streaming. It also shares several customer examples using EBS volumes for big data workloads like Hadoop, Cassandra and Splunk. The document recommends choosing the right EC2 instance type and EBS volume type based on the workload's input/output patterns and throughput/capacity needs to optimize performance and costs.
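The matching of I/O profile to volume type that the document recommends can be sketched as a toy heuristic. This is not an official AWS sizing rule, and the 10,000-IOPS ceiling for gp2 reflects the limits of that era; it simply encodes the spirit of the guidance: io1 for latency-sensitive high-IOPS workloads, gp2 for general-purpose random I/O, st1 for throughput-oriented sequential scans, and sc1 for cold data.

```python
def suggest_ebs_volume_type(iops_heavy: bool,
                            sequential_throughput: bool,
                            required_iops: int) -> str:
    """Toy heuristic mapping a workload's I/O profile to an EBS volume
    type, in the spirit of the guidance above (not an AWS sizing rule):
    - io1 for latency-sensitive workloads needing more IOPS than gp2 offered
    - gp2 for general-purpose random I/O
    - st1 for throughput-oriented sequential scans (e.g. Hadoop)
    - sc1 for cold, infrequently scanned data
    """
    if iops_heavy:
        return "io1" if required_iops > 10000 else "gp2"
    if sequential_throughput:
        return "st1"
    return "sc1"

print(suggest_ebs_volume_type(iops_heavy=False,
                              sequential_throughput=True,
                              required_iops=0))  # st1
```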
Best Practices for Building a Data Lake in Amazon S3 and Amazon Glacier, with... (Amazon Web Services)
Learn how to build a data lake for analytics in Amazon S3 and Amazon Glacier. In this session, we discuss best practices for data curation, normalization, and analysis on Amazon object storage services. We examine ways to reduce or eliminate costly extract, transform, and load (ETL) processes using query-in-place technology, such as Amazon Athena and Amazon Redshift Spectrum. We also review custom analytics integration using Apache Spark, Apache Hive, Presto, and other technologies in Amazon EMR. You'll also get a chance to hear from Airbnb & Viber about their solutions for Big Data analytics using S3 as a data lake.
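To make the query-in-place idea concrete, the hypothetical Athena SQL below (held in Python strings; the bucket, table, and columns are made up for illustration) defines an external table over data sitting in S3 and queries it directly, with no load step:

```python
# Hypothetical Athena DDL + query showing "query in place": the data stays
# in S3 and Athena reads it there, avoiding a separate ETL/load pipeline.
create_table = """
CREATE EXTERNAL TABLE IF NOT EXISTS clickstream (
  user_id string,
  url     string,
  ts      timestamp
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://example-data-lake/clickstream/'
"""

query = """
SELECT url, COUNT(*) AS hits
FROM clickstream
GROUP BY url
ORDER BY hits DESC
LIMIT 10
"""
```

The same table definition is also usable from Amazon Redshift Spectrum via an external schema, which is what lets both engines share one copy of the data in S3.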
Hybrid Cloud Data Management: Using Data for Business Outcomes - STG308 - re:... (Amazon Web Services)
Today, data backup isn’t enough. IT teams with a cloud data management strategy become a data broker for the business. Data helps the business improve its reputation, drive revenue, and satisfy customers. With a hybrid architecture approach to managing data on-premises and in the cloud, the business can be more agile and more responsive than ever before. Find out what your IT peers are doing with cloud data management (hint: it’s more than backup). Learn how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. See what your peers are doing to best move, manage, and use data across on-premises storage and cloud services. In this session, you learn steps for seamless, risk-free migration to different AWS services (Amazon EC2, Amazon RDS, Amazon S3, the Amazon S3 Infrequent Access class, Amazon Glacier, and AWS Snowball); tactics for streamlined, enterprise-class disaster recovery; ways to save money by retiring expensive alternatives like tape storage; single-view e-discovery across hybrid locations with dynamic data indexing across on-premises and cloud storage; and how to achieve holistic data protection across storage locations.
Session sponsored by Commvault
by Darryl Osborne, Solutions Architect, AWS
Learn how to get started with a scalable file system with a simple interface for use with Amazon EC2 instances in the AWS Cloud. We’ll cover the basics, go through customer use cases to illustrate key features, and run active demos that show you how EFS supports application workflows.
DAT324_Expedia Flies with DynamoDB Lightning Fast Stream Processing for Trave... (Amazon Web Services)
Building rich, high-performance streaming data systems requires fast, on-demand access to reference data sets in order to implement complex business logic. In this talk, Expedia discusses the architectural challenges the company faced and how DAX and DynamoDB fit into the overall architecture and met their design requirements. You will also hear how DAX enabled Expedia to add caching to their existing applications in hours, a task that previously took much longer. Session attendees will walk away with three key takeaways: 1) Expedia’s overall architectural patterns for streaming data; 2) how they leverage DynamoDB, DAX, Apache Spark, and Apache Kafka to solve these problems; and 3) the value DAX provides and how it enabled them to improve performance and throughput and reduce costs, all without writing any new code.
STG311_Deep Dive on Amazon S3 & Amazon Glacier Storage Management (Amazon Web Services)
Learn best practices for Amazon Simple Storage Service (Amazon S3) performance optimization, security, data protection, storage management, and much more. Learn how to optimize key naming to increase throughput, apply the appropriate AWS Identity and Access Management (IAM) and encryption configurations, and leverage object tagging and other features to enhance security.
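The key-naming optimization mentioned above refers to the S3 guidance of that era: prepending a short random-looking prefix so that high request rates spread across S3's index partitions. (S3 has since relaxed this requirement, but the technique is what the session describes.) A minimal sketch, where the 4-character hash length is an arbitrary choice:

```python
import hashlib

def randomized_key(original_key: str, prefix_len: int = 4) -> str:
    """Prepend a short, deterministic hash prefix to an S3 key so that
    sequential keys (e.g. date-based names) spread across index partitions.
    Deterministic hashing keeps the mapping reproducible for reads."""
    digest = hashlib.md5(original_key.encode("utf-8")).hexdigest()[:prefix_len]
    return f"{digest}/{original_key}"

print(randomized_key("2017/11/27/events.log"))
```

Because the prefix is derived from the key itself, a reader can recompute it and fetch the object without a lookup table.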
by Everett Dolgner, Business Development Manager, AWS
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum and optimizing your overall capital expense can be challenging. This session presents AWS features and services along with disaster recovery architectures that you can leverage when building highly available and disaster-resilient strategies.
Increasingly, valuable customer data sources are dispersed among on-premises data centers, SaaS providers, partners, third-party data providers, and public datasets. Building a data lake on AWS offers a foundation for storing on-premises, third-party, and public datasets cost effectively with high performance. This workshop introduces AWS tools and technologies you can use to analyze and extract value from petabyte-scale datasets, including Amazon Athena and Amazon Redshift Spectrum.
Hybrid Cloud Storage for Recovery & Migration with AWS Storage Gateway (STG30... (Amazon Web Services)
In this workshop, we provide hands-on experience using the AWS Storage Gateway service to protect on-premises data in AWS, recover it locally or in the cloud in minutes, and migrate it when the time is right. You work with the File Gateway and Microsoft SQL Server native tools to back up to Amazon S3, and then recover or migrate that database in AWS rapidly. In addition, you use Volume Gateway and Amazon EBS Snapshots to protect and migrate block-based volumes. Use this session to hone your skills with backup and DR, and prepare for application migrations.
We will cover the core AWS storage services, which include Amazon Simple Storage Service (Amazon S3), Amazon Glacier, Amazon Elastic File System (Amazon EFS), and Amazon Elastic Block Store (Amazon EBS). We also discuss data transfer services such as AWS Snowball, Snowball Edge, and AWS Snowmobile, and hybrid storage solutions such as AWS Storage Gateway.
AWS offers numerous services to migrate data at petabyte scale. You can easily move large volumes of data from on-premises locations to the cloud, and use the cloud as a backup target, with data transfer services such as AWS Snowball, AWS Snowball Edge, or AWS Storage Gateway. Learn about the available data migration options and which one is the right fit for your requirements.
Companies typically overpay for data archiving. First, they're forced to make an expensive upfront payment for their archiving solution (which does not include ongoing operational expenses such as power, facilities, staffing, and maintenance). Second, because companies have to guess what their capacity requirements will be, they understandably over-provision to ensure they have enough capacity for data redundancy and unexpected growth. The result is under-utilized capacity and wasted money. With Amazon Glacier, you pay only for what you use. Join us for this webinar to learn how Amazon Glacier changes the game for data archiving and backup: you pay nothing upfront, pay a very low price for storage, and can scale your usage up or down as needed.
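The upfront-versus-pay-per-use comparison above reduces to simple arithmetic. The numbers below are hypothetical (a $400/TB upfront appliance price and roughly $4 per TB-month for Glacier-style billing are illustrative assumptions, not quotes):

```python
def provisioned_cost(capacity_tb: float, upfront_per_tb: float) -> float:
    """Upfront cost of buying capacity outright (excludes power,
    facilities, staffing, and maintenance, as the text notes)."""
    return capacity_tb * upfront_per_tb

def pay_per_use_cost(used_tb_per_month, price_per_tb_month: float) -> float:
    """Total paid over time when billed only for what is actually stored."""
    return sum(tb * price_per_tb_month for tb in used_tb_per_month)

# Hypothetical scenario: provision 100 TB upfront at $400/TB, versus paying
# ~$4/TB-month for a year in which usage grows from 10 TB to 32 TB.
usage = [10 + 2 * m for m in range(12)]       # TB stored in each month
print(provisioned_cost(100, 400))             # 40000.0
print(round(pay_per_use_cost(usage, 4.0)))    # 1008
```

The gap narrows as utilization approaches the provisioned capacity, which is exactly the guess the text says companies are forced to make.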
Is Your SaaS Covered? Best Practices for Preventing Data Loss in Microsoft Of... (Amazon Web Services)
This document discusses best practices for preventing data loss in Microsoft Office 365. It notes that while SaaS usage is growing, many SaaS providers do not offer robust data backup and recovery capabilities. NetApp's Cloud Control solution can be used to back up Office 365 data in AWS, offering features like granular restores that Office 365 alone does not provide. This addresses the risks organizations face around accidental or malicious data loss from their Office 365 deployment.
Deep Dive: Building Hybrid Cloud Storage Architectures with AWS Storage Gatew... (Amazon Web Services)
Are you tired of the treadmill of deploying on-premises storage? Join this session to learn how to use AWS Storage Gateway to shift storage for on-premises apps to the cloud, reducing your infrastructure and management challenges. Storage Gateway connects your apps to AWS storage services, including Amazon S3, using standard block, file and tape storage protocols. You can use Storage Gateway for hybrid cloud use cases for file-based application data storage, backup, analytics with data lakes, machine learning (ML), and migration. Learn about best practices from a customer using Storage Gateway for Microsoft SQL Server data protection.
We have recently seen some convergence of different database technologies. Many customers are evaluating heterogeneous migrations as their database needs have evolved or changed. Evaluating the best database to use for a job isn’t as clear as it was ten years ago. In this session, we discuss the ideal use cases for relational and nonrelational data services, including Amazon ElastiCache for Redis, Amazon DynamoDB, Amazon Aurora, and Amazon Redshift. This session digs into how to evaluate a new workload for the best managed database option.
AWS offers storage, networking, and data transfer services so you can build and deploy solutions to extend backup and archive targets to the AWS Cloud, increasing scalability, durability, security, and compliance.
Files in AWS: Overcoming Storage Challenges for Common File Use Cases, with S... (Amazon Web Services)
Running out of capacity on your NAS? Tired of buying and maintaining storage systems for file shares, media archives, or high-performance shared file systems? Learn how you can use AWS storage services to help eliminate the capital expense and operational complexity of on-premises file storage. We provide guidance on how to use AWS’s in-cloud file storage service, Amazon EFS, as well as how to connect on-premises file workloads to data stored in Amazon S3 via the AWS Storage Gateway. Hear examples from customers such as Celgene Corporation, who are using these services in hybrid and in-cloud architectures to take advantage of AWS durability, performance, and economics.
Compared to storing long-term datasets on-premises, archiving in the cloud is a smart alternative, whether you’re looking for an active archive solution, a tape replacement, or a way to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier.
by Everett Dolgner, Business Development Manager AWS
With a hybrid architecture approach to managing data on-premises and in the cloud, organizations can be more agile and responsive than ever before. Find out what your peers are doing with cloud and how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. We will also cover dynamic data indexing across on-premises and cloud storage and holistic data protection.
Hightail is a file sharing and collaboration platform that was formerly known as YouSendIt. In 2015, Hightail transitioned from solely file sending to broader file sharing and collaboration tools. This required overhauling Hightail's technical stack, which previously ran on-premises. Hightail evaluated AWS, Google, and IBM for cloud compute and storage and chose AWS in late 2016 due to AWS's tiered storage options, data lifecycle management, competitive pricing and financial incentives, lower operational costs and risks compared to on-premises infrastructure, and AWS's experience supporting other companies through similar migrations. Hightail completed migrating its infrastructure and data to AWS by August 2017.
This session will provide an overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. We will also touch on new offerings, outline some of the most common use cases, and prepare you for the individual deep dive sessions, customer sessions, and new announcements.
Building a modern data platform in the cloud. AWS DevDay Nordics (Javier Ramirez)
This presentation introduces the problems of data engineering and the AWS services you can use to make your life easier. It featured a live demo, given in Stockholm and Oslo, which is shown here as screenshots; the URL for running the demo yourself is included in the slides.
This document discusses optimizing storage for big data workloads on AWS. It provides an overview of various AWS storage options for different big data use cases like Hadoop, data warehousing, NoSQL databases and streaming. It also shares several customer examples using EBS volumes for big data workloads like Hadoop, Cassandra and Splunk. The document recommends choosing the right EC2 instance type and EBS volume type based on the workload's input/output patterns and throughput/capacity needs to optimize performance and costs.
Best Practices for Building a Data Lake in Amazon S3 and Amazon Glacier, with...Amazon Web Services
Learn how to build a data lake for analytics in Amazon S3 and Amazon Glacier. In this session, we discuss best practices for data curation, normalization, and analysis on Amazon object storage services. We examine ways to reduce or eliminate costly extract, transform, and load (ETL) processes using query-in-place technology, such as Amazon Athena and Amazon Redshift Spectrum. We also review custom analytics integration using Apache Spark, Apache Hive, Presto, and other technologies in Amazon EMR. You'll also get a chance to hear from Airbnb & Viber about their solutions for Big Data analytics using S3 as a data lake.
Hybrid Cloud Data Management: Using Data for Business Outcomes - STG308 - re:...Amazon Web Services
Today, data backup isn’t enough. IT teams with a cloud data management strategy become a data broker for the business. Data helps the business improve company reputation, drive revenue, and satisfy customers. With a hybrid architecture approach to managing data on-premises and in the cloud, the business can be more agile and more responsive than today. Find out what your IT peers are doing with cloud data management (hint: it’s more than backup). Learn how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. See what your peers are doing to best move, manage, and use data across on-premises storage and cloud services. In this session, you learn steps for seamless, risk-free migration to different AWS services (Amazon EC2, Amazon RDS, Amazon S3, Amazon S3 - Infrequent Access class, Amazon Glacier and AWS Snowball); tactics for streamlined, enterprise-class disaster recovery; ways to save money by retiring expensive alternatives like tape storage; single view e-discovery across hybrid locations with dynamic data indexing across on-premises and cloud storage; and how to achieve holistic data protection across storage locations.
Session sponsored by Commvault
by Darryl Osborne, Solutions Architect, AWS
Learn how to get started with a scalable file system with a simple interface for use with Amazon EC2 instances in the AWS Cloud. We’ll cover the basics, go through customer use cases to illustrate key features, and run active demos that show you how EFS supports application workflows.
DAT324_Expedia Flies with DynamoDB Lightning Fast Stream Processing for Trave...Amazon Web Services
Building rich, high-performance streaming data systems requires fast, on-demand access to reference data sets, to implement complex business logic. In this talk, Expedia will discuss the architectural challenges the company faced, and how DAX + DynamoDB fits into the overall architecture and met their design requirements. Additionally, you will hear how DAX that enabled Expedia to add caching to their existing applications in hours, which previously was taking much longer. Session attendees will walk away with three key outputs: 1) Expedia’s overall architectural patterns for streaming data 2) how they uniquely leverage DynamoDB, DAX, Apache Spark, and Apache Kafka to solve these problems 3) the value that DAX provides and how it enabled them to improve our performance and throughput, reduce costs, and all without having to write any new code.
STG311_Deep Dive on Amazon S3 & Amazon Glacier Storage ManagementAmazon Web Services
Learn best practices for Amazon Simple Storage Service (Amazon S3) performance optimization, security, data protection, storage management, and much more. Learn how to optimize key naming to increase throughput, apply the appropriate AWS Identity and Access Management (IAM) and encryption configurations, and leverage object tagging and other features to enhance security.
by Everett Dolgner, Business Development Manager, AWS
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum and optimizing your overall capital expense can be challenging. This session presents AWS features and services along with disaster recovery architectures that you can leverage when building highly available and disaster-resilient strategies.
Increasingly, valuable customer data sources are dispersed among on-premises data centers, SaaS providers, partners, third-party data providers, and public datasets. Building a data lake on AWS offers a foundation for storing on-premises, third-party, and public datasets cost effectively with high performance. This workshop introduces AWS tools and technologies you can use to analyze and extract value from petabyte-scale datasets, including Amazon Athena and Amazon Redshift Spectrum.
Hybrid Cloud Storage for Recovery & Migration with AWS Storage Gateway (STG30...Amazon Web Services
In this workshop, we provide hands-on experience using the AWS Storage Gateway service to protect on-premises data in AWS, recover it locally or in the cloud in minutes, and migrate it when the time is right. You work with the File Gateway and Microsoft SQL Server native tools to back up to Amazon S3, and then recover or migrate that database in AWS rapidly. In addition, you use Volume Gateway and Amazon EBS Snapshots to protect and migrate block-based volumes. Use this session to hone your skills with backup and DR, and prepare for application migrations.
We will cover the core AWS storage services, which include Amazon Simple Storage Service (Amazon S3), Amazon Glacier, Amazon Elastic File System (Amazon EFS), and Amazon Elastic Block Store (Amazon EBS). We also discuss data transfer services such as AWS Snowball, Snowball Edge, and AWS Snowmobile, and hybrid storage solutions such as AWS Storage Gateway.
AWS offers numerous services to migrate data at a petabyte scale. You can easily move large volumes of data from onsite to the cloud and utilize the cloud as a backup target using data transfer services, such as AWS Snowball, AWS Snowball Edge, or AWS Storage Gateway. Learn about available data migration options and which one is the right fit for your requirements.
Companies typically over-pay for data archiving. First, they're forced to make an expensive upfront payment for their archiving solution (which does not include the ongoing cost for operational expenses such as power, facilities, staffing, and maintenance). Second, since companies have to guess what their capacity requirements will be, they understandably over-provision to make sure they have enough capacity for data redundancy and unexpected growth. This set of circumstances results in under-utilized capacity and wasted money. With Amazon Glacier, you pay only for what you use. Join us for this webinar where you will learn how Amazon Glacier changes the game for data archiving and backup as you pay nothing upfront, pay a very low price for storage, and can scale your usage up or down as needed.
Is Your SaaS Covered? Best Practices for Preventing Data Loss in Microsoft Of...Amazon Web Services
This document discusses best practices for preventing data loss in Microsoft Office 365. It notes that while SaaS usage is growing, many SaaS providers do not offer robust data backup and recovery capabilities. NetApp's Cloud Control solution can be used to back up Office 365 data in AWS, offering features like granular restores that Office 365 alone does not provide. This addresses the risks organizations face around accidental or malicious data loss from their Office 365 deployment.
Deep Dive: Building Hybrid Cloud Storage Architectures with AWS Storage Gatew...Amazon Web Services
Are you tired of the treadmill of deploying on-premises storage? Join this session to learn how to use AWS Storage Gateway to shift storage for on-premises apps to the cloud, reducing your infrastructure and management challenges. Storage Gateway connects your apps to AWS storage services, including Amazon S3, using standard block, file and tape storage protocols. You can use Storage Gateway for hybrid cloud use cases for file-based application data storage, backup, analytics with data lakes, machine learning (ML), and migration. Learn about best practices from a customer using Storage Gateway for Microsoft SQL Server data protection.
We have recently seen some convergence of different database technologies. Many customers are evaluating heterogeneous migrations as their database needs have evolved or changed. Evaluating the best database to use for a job isn’t as clear as it was ten years ago. In this session, we discuss the ideal use cases for relational and nonrelational data services, including Amazon ElastiCache for Redis, Amazon DynamoDB, Amazon Aurora, and Amazon Redshift. This session digs into how to evaluate a new workload for the best managed database option.
AWS offers storage, networking, and data transfer services so you can build and deploy solutions to extend backup and archive targets to the AWS Cloud, increasing scalability, durability, security, and compliance.
Files in AWS: Overcoming Storage Challenges for Common File Use Cases, with S...Amazon Web Services
Running out of capacity on your NAS? Tired of buying and maintaining storage systems for file shares, media archives, or high-performance shared file systems? Learn how you can use AWS storage services to help eliminate the capital expense and operational complexity of on-premises file storage. We provide guidance how to use AWS’s in-cloud file storage service Amazon EFS, as well as how to connect on-premises file workloads to data stored in Amazon S3 via the AWS Storage Gateway. Hear examples from customers such as Celgene Corporation, who are using these services in hybrid and in-cloud architectures, to take advantage of AWS durability, performance, and economics.
by Everett Dolgner, Business Development Manager, AWS
With a hybrid architecture approach to managing data on-premises and in the cloud, organizations can be more agile and responsive than ever before. Find out what your peers are doing with cloud and how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. We will also cover dynamic data indexing across on-premises and cloud storage and holistic data protection.
Hightail is a file sharing and collaboration platform that was formerly known as YouSendIt. In 2015, Hightail transitioned from solely file sending to broader file sharing and collaboration tools. This required overhauling Hightail's technical stack, which was previously hosted on-premises. Hightail evaluated AWS, Google, and IBM for cloud compute and storage and chose AWS in late 2016 due to AWS's tiered storage options, data lifecycle management, competitive pricing and financial incentives, lower operational costs and risks compared to on-premises infrastructure, and AWS's experience supporting other companies through similar migrations. Hightail completed migrating its infrastructure and data to AWS by August 2017.
Building a modern data platform in the cloud. AWS DevDay Nordics, by javier ramirez
This presentation introduces the problems of data engineering and the AWS services you can use to make your life easier. It featured a live demo in Stockholm and Oslo, which is shown as screenshots. The URL for running the demo yourself is included in the slides.
Modern data is massive, quickly evolving, unstructured, and increasingly hard to catalog and understand from multiple consumers and applications. This session will guide you through the best practices for designing a robust data architecture, highlighting the benefits and typical challenges of data lakes and data warehouses. We will build a scalable solution based on managed services such as Amazon Athena, AWS Glue, and AWS Lake Formation.
Building Data Lakes and Analytics on AWS. IPExpo Manchester, by javier ramirez
Over 90% of today's data was generated in the last 2 years, and the rate of data growth isn't slowing down. In this session, we'll step through the challenges and best practices on how to capture all the data that is being generated, understand what data you have, and start driving insights and even predict the future using purpose built AWS Services.
We'll frame the session around common pitfalls of building data lakes and how to successfully drive analytics and insights from the data. This session focuses on architecture patterns that bring together key AWS services, rather than a deep dive on any single service. We'll show how services such as Amazon S3, AWS Glue, Amazon Redshift, Amazon Athena, Amazon EMR, Amazon Kinesis, and even Amazon Machine Learning services come together to build a successful data lake for various roles, including both data scientists and business users.
Backup & Recovery - Optimize Your Backup and Restore Architectures in the Cloud, by Amazon Web Services
This document discusses optimizing backup and restore architectures in the cloud. It begins by noting the rapid growth of digital data and the importance of backup and recovery. Common terms like RPO and RTO are defined. Traditional on-premises backup is compared to approaches using cloud connectors, gateways, and services like S3, Glacier, and EBS. Benefits of cloud backup include cost savings, automation, and analytics. A variety of AWS storage services and partners are presented as solutions for different backup use cases.
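Since the summary leans on the RPO term, a one-line check makes the definition concrete: worst-case data loss equals the gap between backups. A minimal sketch with hypothetical numbers:

```python
from datetime import timedelta

def meets_rpo(backup_interval: timedelta, rpo: timedelta) -> bool:
    """Worst-case data loss equals the time between backups, so a
    schedule meets the RPO only if the interval fits inside it."""
    return backup_interval <= rpo

# Nightly backups against a 4-hour RPO: too much potential data loss.
print(meets_rpo(timedelta(hours=24), timedelta(hours=4)))  # False
# Hourly backups against the same RPO are fine.
print(meets_rpo(timedelta(hours=1), timedelta(hours=4)))   # True
```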
NetApp Cloud Data Services & AWS: Empower Your Cloud Champions, by Amazon Web Services
The document discusses enabling cloud champions with AWS and NetApp cloud data services. It highlights how hyperscale computing is leading the way for government agencies to use data to innovate and reduce costs. NetApp cloud volumes provide enterprise-level file services on AWS to accelerate all types of cloud workloads. Examples are given of how NetApp solutions help with data migration, disaster recovery, and meeting data storage needs on AWS.
How to Migrate Your SaaS Apps to AWS for Increased Agility and Availability, by Amazon Web Services
SoftNAS Cloud helped Modus easily move its app to AWS without application re-engineering. This frictionless experience helped increase agility and gave time back to operations teams, while enabling Modus to drive business value, add new customers and streamline company acquisitions. Join us for this special webinar to learn more about how your organization can benefit from SoftNAS Cloud on AWS.
The document discusses the speaker's journey in building analytics capabilities over time, from using a separate server for reporting to using Hadoop and NoSQL databases to address scaling issues. It notes the challenges of maintaining these systems and data preparation work. The presentation then outlines Amazon Web Services offerings for building a modern data lake and analytics platform in AWS, including services for data storage, movement, processing, analytics and machine learning.
As the volume and types of data continues to grow, customers often have valuable data that is not easily discoverable and available for analytics. A common challenge for data engineering teams is architecting a data lake that can cater to the needs of diverse users - from developers to business analysts to data scientists. In this session, dive deep into building a data lake using Amazon S3, Amazon Kinesis, Amazon Athena and AWS Glue. Learn how AWS Glue crawlers can automatically discover your data, extracting and cataloguing relevant metadata to reduce operations in preparing your data for downstream consumers.
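As a rough illustration of the cataloguing step, the following toy sketch (not how AWS Glue is actually implemented) shows the kind of crawler-style pass that infers a column catalog from sample records:

```python
def infer_schema(records):
    """Infer a {column: type-name} catalog from a list of dicts,
    the way a crawler samples objects to build table metadata."""
    schema = {}
    for rec in records:
        for col, val in rec.items():
            schema.setdefault(col, type(val).__name__)
    return schema

sample = [
    {"device_id": "a1", "temp": 21.5},
    {"device_id": "a2", "temp": 19.0, "battery": 87},
]
print(infer_schema(sample))
# {'device_id': 'str', 'temp': 'float', 'battery': 'int'}
```

A real crawler also handles nested types, partitions, and schema drift; the sketch only captures the discover-and-catalog idea.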
This document provides a summary of Amazon Web Services (AWS) storage solutions and customer use cases. It discusses AWS storage services like Amazon S3, EBS, EFS, Snowball, and Storage Gateway. It also highlights new storage features for data movement, security, management and analytics. Several case studies describe how customers are using AWS storage for backup/disaster recovery, media workflows, scientific computing, and other workloads.
When Fujirebio Diagnostics, a leading producer of in vitro diagnostics, shifted to virtualization and the cloud, it wanted to replace its costly, unreliable, and cumbersome backup solution. Fujirebio turned to Amazon Web Services (AWS) and Rubrik for a more modern solution. The company used Rubrik Cloud Data Management to eliminate complex tape backup and archive mission critical production systems on AWS, as well as extend on-site storage capacity. The solution automates backup, recovery, and archival on AWS, helping the company drive operational efficiency and resilience. In this webinar, you will learn how Fujirebio Diagnostics used AWS and Rubrik to simplify data protection, achieve fast recovery, reduce management time, and lower total cost of ownership by 75 percent.
AWS Speaker: Mike Ruiz, Partner Solutions Architect
Rubrik Speakers: Kenneth Hui, Technical Marketing Engineer & Mark Haus, Sales Engineer
BDA306 Building a Modern Data Warehouse: Deep Dive on Amazon Redshift, by Amazon Web Services
In this session, we take a deep dive on Amazon Redshift architecture and the latest performance enhancements that give you faster insights into your data. We also cover Redshift Spectrum, a feature of Redshift that enables you to analyze data across Redshift and your Amazon S3 data lake to deliver unique insights not possible by analyzing independent data silos. A customer is joining us to share how they were able to extend their data warehouse to their data lake to encompass multiple data sources and data formats. This modern architecture helps them tie together data sources to get actionable insights across their business units.
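The Redshift-plus-data-lake pattern described above boils down to querying a local table and a Spectrum external table in a single statement. A hedged sketch that merely composes such a query (all table, schema, and column names are hypothetical):

```python
def spectrum_union_query(local_table, external_schema, external_table):
    """Compose a query that reads warm rows from a Redshift table and
    cold rows from the S3 data lake via a Spectrum external schema."""
    return (
        f"SELECT order_date, amount FROM {local_table} "
        f"UNION ALL "
        f"SELECT order_date, amount FROM {external_schema}.{external_table};"
    )

print(spectrum_union_query("sales_recent", "spectrum", "sales_archive"))
# SELECT order_date, amount FROM sales_recent UNION ALL SELECT order_date, amount FROM spectrum.sales_archive;
```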
Move Data to AWS Faster for Migrations, DR, & Bidirectional Workflows (STG382..., by Amazon Web Services
There are a myriad of ways to get your file-based data into Amazon S3 and Amazon EFS, from the simple AWS command line tools and similar script-based approaches to proprietary commercial tools built for media workflows. Attend this session to learn about an architecture and associated tips and tricks that enable you to improve data transfer performance—scaling out with parallel streams—without the manual labor of extensive scripting or the cost of third-party licensed software.
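The scale-out idea in this abstract, many parallel streams instead of one serial copy, can be sketched without any AWS dependency. The file names are hypothetical, and a real worker would invoke an S3 client upload instead of the stand-in function:

```python
from concurrent.futures import ThreadPoolExecutor

def transfer_one(name):
    """Stand-in for uploading a single file; a real version would
    call an S3 client here instead of returning a string."""
    return f"uploaded {name}"

def transfer_all(names, streams=8):
    # Fan the file list out across parallel worker streams;
    # map() preserves the input order of results.
    with ThreadPoolExecutor(max_workers=streams) as pool:
        return list(pool.map(transfer_one, names))

print(transfer_all([f"part-{i}.csv" for i in range(4)]))
```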
Overview of AWS Services for Data Storage and Migration - SRV205 - Anaheim AW..., by Amazon Web Services
In this session, we explore the features and functions of AWS storage services. We provide context on the portfolio, and we cover the most common use cases for AWS offerings for object, file, block, and migration technologies, including the partner ecosystem. We then describe each service through customer case studies. Expect to leave this session understanding how to select a storage service and start moving workloads or building new ones.
How Fannie Mae Processes over a Quarter Million Loans per Day with Amazon S3 ..., by Amazon Web Services
Fannie Mae processes over 250,000 loans per day with Amazon S3 by optimizing performance, availability, and durability. They implemented caching to improve GET and PUT latency, uncoupled writes from transactions with compensation, and encrypted data at rest. These changes improved response times by over 100% and tightened latency percentiles. Additional challenges involved retrying occasional slow GETs and rightsizing instances.
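One plausible reading of "retrying occasional slow GETs" is a latency-budget retry: time the first attempt and try again if it ran long. A simulated, single-threaded sketch (not Fannie Mae's actual code; the timings are invented):

```python
import time

def get_with_retry(fetch, budget_s=0.2, attempts=2):
    """Retry a fetch whose latency exceeds the budget, returning the
    first reply that comes back fast enough (or the last one tried)."""
    result = None
    for _ in range(attempts):
        start = time.monotonic()
        result = fetch()
        if time.monotonic() - start <= budget_s:
            return result
    return result

# Simulate a slow first GET followed by a fast retry.
calls = iter([0.5, 0.01])
def fake_fetch():
    time.sleep(next(calls))
    return "object-bytes"

print(get_with_retry(fake_fetch))  # object-bytes
```

Production systems often run the retry concurrently with the first request (a "hedged request") rather than waiting for it to finish.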
AWS Data Transfer Services: Deep Dive - SRV302 - Chicago AWS Summit, by Amazon Web Services
In this session, we provide IT pros and application owners with an overview of AWS options for building hybrid storage architectures or even migrating an entire data center to the AWS Cloud. AWS Storage Gateway connects existing on-premises block, file, or tape storage systems to AWS Cloud storage over the WAN in a hybrid model. The AWS Snow family of physical devices can capture, pre-process, and migrate data into and out of AWS without any network connection. Join us to learn how you can close down data centers, reduce storage footprints, and build solutions for tiering, data lakes, backup, disaster recovery, and migration.
From raw data to business insights. A modern data lake, by javier ramirez
In this talk I spoke about the pitfalls of trying to build a data lake, and how you can solve them either with unmanaged open source or with the managed and/or native solutions on AWS. Delivered at the Madrid Data Engineering meetup in May 2019.
Database Week at the San Francisco Loft
Amazon Aurora
A closer look at the MySQL and PostgreSQL compatible relational database built for the cloud that combines the performance and availability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases. We’ll explore how Aurora uses the AWS cloud to provide high reliability, high durability, and high throughput.
Level: 200
Speakers:
Mahesh Pakala - Solutions Architect, AWS
Arabinda Pani - Partner Solutions Architect, Database Specialist, AWS
Grid computing in the cloud is driven by numerous motivations: scalability, agility, security, price, and enterprise data governance, to name a few. The result is the ability to innovate faster and experiment with fewer risks and lower costs. AWS Financial Services Technology Leader Yinal Ozkan and Intel Sales Development Manager Steve Conn go through the specific benefits grid computing in the cloud can bring to all organizations (particularly in Financial Services), share customer use cases, and discuss different computing consumption models.
Speaker: Diaa Radwan, AWS
Level: 300
When migrating applications to the AWS Cloud, it’s important to architect cloud environments that are efficient, secure, and compliant. AWS now offers simple services for data and application migration. In this session, we explore ways to cost-effectively reinvent disaster recovery so it can extend to applications and workloads as a first step in migrating to the AWS Cloud. We discuss customer use cases and review the different applications customers used with our data migration services to cut their IT expenditures and the management time spent on hardware and backup solutions.
Similar to AWS Storage State of the Union & APN Storage Ecosystem
How to Build Forecasting Services Using ML Algorithms and Deep Learn..., by Amazon Web Services
Forecasting is an important process for many companies and is used in various domains to try to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a time component and then use an algorithm that, based on the type of data analyzed, produces an accurate forecast.
Big Data for Startups: How to Create Big Data Applications in Server... Mode, by Amazon Web Services
The variety and volume of data created every day keeps accelerating and represents a unique opportunity to innovate and create new startups.
Yet managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment accessible only to established companies. But the elasticity of the cloud, and Serverless services in particular, lets us break through these limits.
Let’s see, then, how it is possible to develop Big Data applications quickly, without worrying about infrastructure, dedicating all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago Amazon went through a radical transformation aimed at increasing the pace of innovation. Over this period we learned how changing our approach to application development allowed us to greatly increase agility and release velocity and, ultimately, to build more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not only application architecture but also organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to Spend Up to 90% Less with Containers and Spot Instances, by Amazon Web Services
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, leading to average savings of 70% compared to On-Demand Instances. In this session we will look at the characteristics of Spot Instances and how they can easily be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
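The savings percentages quoted in abstracts like this one follow from simple arithmetic. A sketch with hypothetical hourly prices:

```python
def spot_savings(on_demand_price, spot_price):
    """Fractional saving of a Spot price versus the On-Demand price."""
    return 1 - spot_price / on_demand_price

# Hypothetical hourly prices: $0.10 On-Demand vs $0.03 Spot.
print(f"{spot_savings(0.10, 0.03):.0%}")  # 70%
```

Actual Spot discounts vary by instance type, Region, and time; the trade-off is that Spot capacity can be reclaimed, which is why stateless container workloads are a good fit.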
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda:
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make Your Startup’s Offering Unique in the Market with Machine Lea... Services, by Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative components built ad hoc.
AWS provides ready-to-use services and, at the same time, lets you customize and create the differentiating elements of your offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, with the help of a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: Automate the Management and Deployment of..., by Amazon Web Services
With the traditional approach to IT it was difficult for many years to implement DevOps techniques, which until now have often involved manual activities, occasionally leading to application downtime that interrupted users’ work. With the advent of the cloud, DevOps techniques are now within everyone’s reach at low cost for any kind of workload, guaranteeing greater system reliability and delivering significant improvements to business continuity.
AWS provides AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances using Chef and Puppet.
Learn how to leverage AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to Support Your Windows Workloads, by Amazon Web Services
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we will explore what AWS services make possible when applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are organizing a free virtual event next Wednesday, October 14th, from 12:00 to 13:00 dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in cloud environments based on VMware vSphere® and access a broad range of AWS services, fully exploiting the potential of the AWS cloud while protecting existing VMware investments.
Create Your First Serverless Ledger-Based App with QLDB and NodeJS, by Amazon Web Services
Many companies today build applications with ledger-style functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply chain flow of their products.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB eliminates the need to build complex custom systems by providing a fully managed serverless ledger database.
In this session we will see how to build a complete serverless application that uses QLDB’s capabilities.
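The "transparent, immutable, and cryptographically verifiable" log that ledger databases provide is, at its core, a hash chain: each entry's hash covers its data and the previous hash, so tampering anywhere breaks verification. A toy illustration (not QLDB's actual storage format):

```python
import hashlib, json

def append(ledger, entry):
    """Append an entry whose hash covers both its data and the
    previous entry's hash, making later tampering detectable."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    payload = json.dumps(entry, sort_keys=True)
    digest = hashlib.sha256((prev + payload).encode()).hexdigest()
    ledger.append({"data": entry, "hash": digest})
    return ledger

def verify(ledger):
    """Recompute the chain from the start; any mismatch means tampering."""
    prev = "0" * 64
    for row in ledger:
        payload = json.dumps(row["data"], sort_keys=True)
        if hashlib.sha256((prev + payload).encode()).hexdigest() != row["hash"]:
            return False
        prev = row["hash"]
    return True

log = []
append(log, {"account": "A", "debit": 100})
append(log, {"account": "B", "credit": 100})
print(verify(log))           # True
log[0]["data"]["debit"] = 1  # tamper with history
print(verify(log))           # False
```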
With the rise of microservice architectures and rich mobile and web applications, APIs are more important than ever for offering end users a great experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dig into several scenarios, understanding how AppSync can help solve these use cases by creating modern APIs with real-time and offline data-update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Databases and VMware Cloud™ on AWS: Myths to Debunk, by Amazon Web Services
Many organizations reap the benefits of the cloud by migrating their Oracle workloads, securing significant gains in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, on top of which performance risks can be introduced when moving applications out of local data centers.
In these slides, AWS and VMware experts present simple, practical tips to ease and streamline the migration of Oracle workloads while accelerating the transformation to the cloud; they also dive into the architecture and show how to fully exploit the potential of VMware Cloud™ on AWS.
1) The document discusses building a minimum viable product (MVP) using Amazon Web Services (AWS).
2) It provides an example of an MVP for an omni-channel messenger platform, built starting in 2017, that connects ecommerce stores to customers via web chat, Facebook Messenger, WhatsApp, and other channels.
3) The founder discusses how they started with an MVP in 2017 with 200 ecommerce stores in Hong Kong and Taiwan, and have since expanded to over 5000 clients across Southeast Asia using AWS for scaling.
This document discusses pitch decks and fundraising materials. It explains that venture capitalists will typically spend only 3 minutes and 44 seconds reviewing a pitch deck. Therefore, the deck needs to tell a compelling story to grab their attention. It also provides tips on tailoring different types of decks for different purposes, such as creating a concise 1-2 page teaser, a presentation deck for pitching in-person, and a more detailed read-only or fundraising deck. The document stresses the importance of including key information like the problem, solution, product, traction, market size, plans, team, and ask.
This document discusses building serverless web applications using AWS services like API Gateway, Lambda, DynamoDB, S3 and Amplify. It provides an overview of each service and how they can work together to create a scalable, secure and cost-effective serverless application stack without having to manage servers or infrastructure. Key services covered include API Gateway for hosting APIs, Lambda for backend logic, DynamoDB for database needs, S3 for static content, and Amplify for frontend hosting and continuous deployment.
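The backend-logic piece of that stack is just a function returning the API Gateway proxy response shape. A minimal hedged sketch of a Lambda-style handler (the event is abbreviated to the one field used; route and field names are invented for the example):

```python
import json

def handler(event, context=None):
    """Handle an API Gateway proxy request: greet the caller named in
    ?name=... and return the standard proxy integration response."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

print(handler({"queryStringParameters": {"name": "dev"}})["body"])
# {"message": "hello, dev"}
```

In the full stack, API Gateway routes the HTTP request to this function, DynamoDB would back any stateful reads and writes, and S3 plus Amplify serve the frontend that calls the API.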
This document provides tips for fundraising from startup founders Roland Yau and Sze Lok Chan. It discusses generating competition to create urgency for investors, fundraising in parallel rather than sequentially, having a clear fundraising narrative focused on what you do and why it's compelling, and prioritizing relationships with people over firms. It also notes how the pandemic has changed fundraising, with examples of deals done virtually during this time. The tips emphasize being fully prepared before fundraising and cultivating connections with investors in advance.
AWS_HK_StartupDay_Building Interactive websites while automating for efficien..., by Amazon Web Services
This document discusses Amazon's machine learning services for building conversational interfaces and extracting insights from unstructured text and audio. It describes Amazon Lex for creating chatbots, Amazon Comprehend for natural language processing tasks like entity extraction and sentiment analysis, and how they can be used together for applications like intelligent call centers and content analysis. Pre-trained APIs simplify adding machine learning to apps without requiring ML expertise.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies the management of Docker containers through an orchestration layer controlling deployment and lifecycle. In this session we will present the main features of the service, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.