by Henry Zhang, Sr. Product Manager, AWS
Compared to storing long-term datasets on-premises, archiving in the cloud is a smart alternative, whether you’re looking for an active archive solution, a tape replacement, or a way to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier.
Backup and Recovery with Cloud-Native Deduplication and Use Cases from the Fi... - Amazon Web Services
by Hugh Emberson, CTO, StorReduce
Designing and deploying cloud-enabled backup & recovery solutions often creates opportunities to reduce storage requirements and increase efficiency. Effective cloud-native deduplication as part of your backup & recovery strategy can optimize migration, reduce the need for purpose-built backup appliances such as Data Domain and large tape archives, and enable cost reductions of up to 95%. In this session, StorReduce will share best practices for data deduplication when designing and deploying solutions for backup, archive, and general unstructured file data. They will also demonstrate how a cloud-native interface with scale-out deduplication enables generic cloud services, such as search, across all backups moved to the cloud. They will guide the audience through two customer use cases from the financial services and healthcare industries.
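The deduplication savings described above come from storing each unique data block only once and keeping a per-backup list of block references. A minimal fixed-size-block sketch (StorReduce's actual implementation is proprietary and likely uses variable-length chunking; the 4 KiB block size and SHA-256 fingerprint here are illustrative assumptions):

```python
import hashlib

def dedup_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, storing each unique block once.

    Returns (store, recipe): the unique-block store keyed by SHA-256 digest,
    and the ordered list of digests needed to reconstruct the original data.
    """
    store: dict[str, bytes] = {}
    recipe: list[str] = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # keep only the first copy of a block
        recipe.append(digest)
    return store, recipe

# A highly repetitive backup stream: the same 4 KiB block repeated 100 times.
data = b"x" * 4096 * 100
store, recipe = dedup_blocks(data)
ratio = 1 - (len(store) * 4096) / len(data)
print(f"unique blocks: {len(store)}, dedup savings: {ratio:.0%}")
# unique blocks: 1, dedup savings: 99%
```

Real backup streams are far less uniform than this toy input, which is why vendor savings figures (such as the "up to 95%" above) depend heavily on workload.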
by Everett Dolgner, Business Development Manager, AWS
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum and optimizing your overall capital expense can be challenging. This session presents AWS features and services along with disaster recovery architectures that you can leverage when building highly available and disaster-resilient strategies.
by Drew Meyer, Sr. Product Marketing Manager, AWS
This session will provide an overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. We will touch on new offerings, outline some of the most common use cases, and prepare you for the individual deep dive sessions, customer sessions, and new announcements. The session will also address our partner network and what it means for a storage provider to have the APN Storage Competency.
Disaster Recovery Best Practices and Customer Use Cases: CGS and HealthQuest - Amazon Web Services
This document provides an agenda for a CloudEndure presentation at an AWS storage day event. The agenda includes an introduction to CloudEndure and how it works with AWS, enterprise disaster recovery strategies, a deep dive into CloudEndure's DR technology and demo, customer case studies, and a Q&A session. It discusses CloudEndure's key technology pillars for OS-based continuous replication and disaster recovery benefits like cost reductions and recovery objectives. The document also includes case studies of how CloudEndure helped customers like CGS and HealthQuest implement DR solutions on AWS to reduce costs and improve recovery times. It promotes CloudEndure DR and migration products available on AWS Marketplace and provides a call to action to become a CloudEnd
Using AWS for Backup and Restore (backup in the cloud, backup to the cloud, a... - Amazon Web Services
Companies are using AWS to create and deploy efficient, fast, and cost-effective backup and restore capabilities to protect critical IT systems without incurring the infrastructure expense of a second physical site. In this session, we will talk about cloud-based services AWS provides to enable robust backup and rapid recovery of your IT infrastructure and data.
This document discusses optimizing storage for big data workloads on AWS. It provides an overview of various AWS storage options for different big data use cases like Hadoop, data warehousing, NoSQL databases and streaming. It also shares several customer examples using EBS volumes for big data workloads like Hadoop, Cassandra and Splunk. The document recommends choosing the right EC2 instance type and EBS volume type based on the workload's input/output patterns and throughput/capacity needs to optimize performance and costs.
This document summarizes a presentation on data lifecycle and storage management techniques for Amazon S3. It discusses lifecycle management rules for transitioning or expiring objects based on age, S3 inventory for listing objects, object tagging for classification and policy filtering, storage class analysis for monitoring usage and optimizing storage, and monitoring tools like CloudWatch and CloudTrail. The presentation provides an overview and best practices for these S3 management features.
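The lifecycle management rules mentioned above amount to an age-to-storage-class mapping evaluated per object. A simplified simulation of that behavior (the 30/90/365-day thresholds mirror a typical S3 lifecycle configuration but are illustrative assumptions, not a real policy):

```python
from datetime import date, timedelta

# Hypothetical rule set, sorted by descending age threshold:
# transition to STANDARD_IA after 30 days, GLACIER after 90, expire after 365.
RULES = [(365, "EXPIRED"), (90, "GLACIER"), (30, "STANDARD_IA")]

def storage_class_for(created: date, today: date) -> str:
    """Return the storage class a lifecycle policy would have applied by now."""
    age_days = (today - created).days
    for threshold, storage_class in RULES:
        if age_days >= threshold:
            return storage_class
    return "STANDARD"

today = date(2018, 1, 1)
print(storage_class_for(today - timedelta(days=45), today))   # STANDARD_IA
print(storage_class_for(today - timedelta(days=120), today))  # GLACIER
```

In S3 itself, the equivalent rules are declared as a bucket lifecycle configuration and applied automatically; the point of the sketch is only to show how age thresholds select the tier.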
by Dave Stein, Business Development Manager, AWS
Discover how EBS can take your application deployments on EC2 to the next level. You will learn service features and benefits, how to identify applications that are appropriate for use with EBS, best practices, and details about its performance and volume types.
by Isaiah Weiner, Sr. Manager of Solutions Architecture, AWS
Surveys consistently rank backup & restore as one of the first workloads to move to the cloud. But what does it really look like? This session provides best practices for streamlining AWS integration with existing on-premises data backup software, tape processes, virtual tape libraries, third-party snapshots, file servers, and archives. Learn how to choose the right integration with varying degrees of disruption, how to automatically migrate data for cost reductions and compliance, and how to recover individual files or many files quickly.
AWS offers storage, networking, and data transfer services so you can build and deploy solutions to extend backup and archive targets to the AWS Cloud, increasing scalability, durability, security, and compliance.
by Everett Dolgner, Business Development Manager, AWS
AWS offers numerous services to migrate data at a petabyte scale. You can easily move large volumes of data from onsite to the cloud and utilize the cloud as a backup target using data transfer services, such as AWS Snowball, AWS Snowball Edge, or AWS Storage Gateway. Learn about available data migration options and which one is the right fit for your requirements.
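A quick back-of-the-envelope calculation shows why physical transfer devices like Snowball matter at petabyte scale. A sketch (the 80% link-utilization figure and decimal terabytes are assumptions):

```python
def transfer_days(data_tb: float, mbps: float, utilization: float = 0.8) -> float:
    """Days needed to push `data_tb` terabytes over a `mbps` megabit/s link."""
    bits = data_tb * 1e12 * 8                      # decimal TB -> bits
    seconds = bits / (mbps * 1e6 * utilization)    # effective throughput
    return seconds / 86400

# 1 PB over a dedicated 1 Gbps link at 80% utilization:
print(f"{transfer_days(1000, 1000):.0f} days over the wire")  # 116 days
```

At that timescale, shipping several 80 TB Snowball devices in parallel is usually faster and cheaper than upgrading the network, which is the trade-off this session walks through.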
Hybrid Cloud Data Management: Using Data for Business Outcomes - STG308 - re:... - Amazon Web Services
Today, data backup isn’t enough. IT teams with a cloud data management strategy become a data broker for the business. Data helps the business improve company reputation, drive revenue, and satisfy customers. With a hybrid architecture approach to managing data on-premises and in the cloud, the business can be more agile and more responsive than today. Find out what your IT peers are doing with cloud data management (hint: it’s more than backup). Learn how data backup, recovery, management, and e-discovery capabilities can help maximize your use of AWS. See what your peers are doing to best move, manage, and use data across on-premises storage and cloud services. In this session, you learn steps for seamless, risk-free migration to different AWS services (Amazon EC2, Amazon RDS, Amazon S3, Amazon S3 - Infrequent Access class, Amazon Glacier and AWS Snowball); tactics for streamlined, enterprise-class disaster recovery; ways to save money by retiring expensive alternatives like tape storage; single view e-discovery across hybrid locations with dynamic data indexing across on-premises and cloud storage; and how to achieve holistic data protection across storage locations.
Session sponsored by Commvault
by PD Dutta, Sr. Product Manager, Object Storage, AWS
We will explain how to design and build an IoT cloud platform on top of Amazon S3. You will review best practices for architecting a cost-effective, durable, and secure storage solution to store and analyze your IoT data on Amazon S3. In addition, we’ll cover how to collect, ingest, and analyze the data in place using AWS services such as AWS IoT, Amazon Kinesis, Amazon Athena, and Amazon Redshift Spectrum.
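A recurring pattern for analyzing IoT data in place with Athena or Redshift Spectrum is to lay out S3 keys as date partitions, so queries can prune by partition instead of scanning the whole bucket. A hypothetical key-layout helper (the `iot-data` prefix and `dt=`/`hour=` partition names are illustrative assumptions, not an AWS-mandated scheme):

```python
from datetime import datetime

def s3_key(device_id: str, ts: datetime, prefix: str = "iot-data") -> str:
    """Build a date-partitioned S3 key so query engines can prune
    partitions (dt=YYYY-MM-DD/hour=HH) rather than scan every object."""
    return (f"{prefix}/dt={ts:%Y-%m-%d}/hour={ts:%H}/"
            f"{device_id}-{ts:%Y%m%dT%H%M%S}.json")

print(s3_key("sensor-42", datetime(2017, 11, 27, 14, 30, 5)))
# iot-data/dt=2017-11-27/hour=14/sensor-42-20171127T143005.json
```

With keys shaped like this, a query filtered on `dt` reads only the matching day's objects, which is what keeps Athena scan costs proportional to the data queried rather than the data stored.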
The document discusses various backup and archival strategies using AWS services like Amazon S3, EBS, Glacier, and Snowball. It provides examples of using S3 lifecycle policies to transition data between storage tiers, taking EBS snapshots for EC2 instance backups, and using Snowball for large-scale data transfers to the cloud. Backup and archival solutions can provide durability, scalability, cost savings, and reduce risks compared to on-premises options.
Deep Dive: Building Hybrid Cloud Storage Architectures with AWS Storage Gatew... - Amazon Web Services
Are you tired of the treadmill of deploying on-premises storage? Join this session to learn how to use AWS Storage Gateway to shift storage for on-premises apps to the cloud, reducing your infrastructure and management challenges. Storage Gateway connects your apps to AWS storage services, including Amazon S3, using standard block, file and tape storage protocols. You can use Storage Gateway for hybrid cloud use cases for file-based application data storage, backup, analytics with data lakes, machine learning (ML), and migration. Learn about best practices from a customer using Storage Gateway for Microsoft SQL Server data protection.
Disaster Recovery with AWS: Tiered Approaches to Balance Cost with Recovery O... - Amazon Web Services
Learn how to take advantage of AWS for disaster recovery. In this session, we examine how traditional disaster recovery concepts can be adapted to the cloud. We also explore ways to cost-effectively reinvent disaster recovery, so it can extend to applications and workloads that have never had it before. This session walks you through tiered technology approaches to apply as part of a disaster recovery strategy that aligns costs to intended business outcomes.
Improving Backup & DR – AWS Storage Gateway - AWS Online Tech Talks - Amazon Web Services
Learning Objectives:
- Understand how AWS Storage Gateway works to enable application recoveries on EC2
- Determine which type of storage gateway makes sense for your different backup and DR needs
- Learn how to connect your applications and backup systems to AWS
Increasingly, valuable customer data sources are dispersed among on-premises data centers, SaaS providers, partners, third-party data providers, and public datasets. Building a data lake on AWS offers a foundation for storing on-premises, third-party, and public datasets cost effectively with high performance. This workshop introduces AWS tools and technologies you can use to analyze and extract value from petabyte-scale datasets, including Amazon Athena and Amazon Redshift Spectrum.
AWS Data Transfer Services - AWS Gateway, AWS Snowball, AWS Snowball Edge, an... - Amazon Web Services
by Everett Dolgner, Business Development Manager, AWS
AWS offers a suite of tools to help you overcome the limitations associated with migrating data from on-premises environments to the cloud. Attend this session to learn about moving data by using networks, roads, and AWS technology partners. We will also discuss how to move data into and out of the cloud in batches, increments, and streams.
STG311_Deep Dive on Amazon S3 & Amazon Glacier Storage Management - Amazon Web Services
Learn best practices for Amazon Simple Storage Service (Amazon S3) performance optimization, security, data protection, storage management, and much more. Learn how to optimize key naming to increase throughput, apply the appropriate AWS Identity and Access Management (IAM) and encryption configurations, and leverage object tagging and other features to enhance security.
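The key-naming optimization mentioned above reflects S3 throughput guidance from this era: randomize key prefixes so sequential names (timestamps, counters) don't concentrate requests on one index partition. A sketch of one common approach, hash-prefixing (the four-character prefix width is an arbitrary assumption):

```python
import hashlib

def prefixed_key(key: str, width: int = 4) -> str:
    """Prepend a short hash of the key so sequential names spread evenly
    across S3's key space instead of hot-spotting a single prefix."""
    prefix = hashlib.md5(key.encode()).hexdigest()[:width]
    return f"{prefix}/{key}"

print(prefixed_key("logs/2017-11-27-00-00-01.gz"))
print(prefixed_key("logs/2017-11-27-00-00-02.gz"))
# Adjacent timestamps now land under unrelated hash prefixes.
```

Note that AWS later raised per-prefix request limits substantially, so this technique matters less on modern S3; it is shown here because it was the recommended practice when this session was given.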
Backup & Recovery - Optimize Your Backup and Restore Architectures in the Cloud - Amazon Web Services
This document discusses optimizing backup and restore architectures in the cloud. It begins by noting the rapid growth of digital data and importance of backup and recovery. Common terms like RPO and RTO are defined. Traditional on-premises backup is compared to approaches using cloud connectors, gateways, and services like S3, Glacier, and EBS. Benefits of cloud backup include cost savings, automation, and analytics. A variety of AWS storage services and partners are presented as solutions for different backup use cases.
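The RPO and RTO terms the document defines can be made concrete with a small worked example: RPO is the data-loss window (time from the last good backup to the failure), and RTO is the downtime (time from the failure to restored service). The timestamps below are hypothetical:

```python
from datetime import datetime

last_backup = datetime(2017, 6, 1, 2, 0)    # nightly backup finished at 02:00
failure     = datetime(2017, 6, 1, 14, 0)   # outage begins at 14:00
recovered   = datetime(2017, 6, 1, 15, 30)  # service restored at 15:30

rpo_hours = (failure - last_backup).total_seconds() / 3600
rto_hours = (recovered - failure).total_seconds() / 3600
print(f"data loss window (RPO achieved): {rpo_hours:.1f} h")  # 12.0 h
print(f"downtime (RTO achieved): {rto_hours:.1f} h")          # 1.5 h
```

Tightening RPO means backing up (or replicating) more frequently; tightening RTO means keeping warmer standby resources, which is exactly the cost trade-off the cloud architectures in this document address.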
by Chris Proto, DevOps Engineer, Craftsy
Craftsy is the leading online destination for passionate makers to learn, create, and share. With online classes, popular supplies, and indie patterns, over ten million creative enthusiasts are taking their skills to new heights. By working with AWS and using data transfer services, Craftsy was able to bounce back from a massive storage outage that impacted numerous teams. By using AWS storage services, the company was able to minimize the outage and speed up data restores three-fold. Learn more by attending this session.
AWS and Panzura provide cost-efficient storage resources for your Amazon EC2-based storage targets, such as Amazon EBS and Amazon S3, while maintaining existing legacy NAS connections. Attend this session to learn how enterprises can use Panzura and AWS to drastically reduce data storage costs while keeping the performance and feel of an on-premises NAS. Presented by Panzura.
Hybrid Cloud Storage for Recovery & Migration with AWS Storage Gateway (STG30... - Amazon Web Services
In this workshop, we provide hands-on experience using the AWS Storage Gateway service to protect on-premises data in AWS, recover it locally or in the cloud in minutes, and migrate it when the time is right. You work with the File Gateway and Microsoft SQL Server native tools to back up to Amazon S3, and then recover or migrate that database in AWS rapidly. In addition, you use Volume Gateway and Amazon EBS Snapshots to protect and migrate block-based volumes. Use this session to hone your skills with backup and DR, and prepare for application migrations.
Public sector IT teams, like their commercial counterparts, face the daunting task of meeting ever-increasing storage demand from users with a status quo budget. The cloud provides scale-out storage with pay-as-you-go pricing, but how can you take advantage of it without rewriting your applications or disrupting the current storage experience? AWS Storage Gateway provides a swift path to using cloud storage without interrupting your business workflows. It enables a wide variety of hybrid storage use cases, such as backup and archiving, disaster recovery, cloud data processing, storage tiering, and migration. In this session, you will learn about the File, Volume, and Tape configurations of the AWS Storage Gateway family, and their features and benefits. You will also hear from customers about how they use AWS Storage Gateway to overcome their on-premises storage challenges and chart paths to using the cloud for storage and processing.
Disaster Recovery on AWS Webinar December 2017 - IL Webinar - Amazon Web Services
Learn about the use of the AWS Cloud as a disaster recovery (DR) environment and explore how architectural approaches to DR and business continuity on AWS give you the skills and experience you need to start building cloud-based production applications.
- Create DR environments for your existing systems to minimize technology and business risks
- Reduce your infrastructure costs and pay only for the DR resources you use
- Test your DR provision more frequently to ensure your critical systems and data are protected
We have recently seen some convergence of different database technologies. Many customers are evaluating heterogeneous migrations as their database needs have evolved or changed. Evaluating the best database to use for a job isn’t as clear as it was ten years ago. In this session, we discuss the ideal use cases for relational and nonrelational data services, including Amazon ElastiCache for Redis, Amazon DynamoDB, Amazon Aurora, and Amazon Redshift. This session digs into how to evaluate a new workload for the best managed database option.
In this session, learn about all of the AWS storage solutions, and get guidance about which ones to use for different use cases. We discuss the core AWS storage services. These include Amazon Simple Storage Service (Amazon S3), Amazon Glacier, Amazon Elastic File System (Amazon EFS), and Amazon Elastic Block Store (Amazon EBS). We also discuss data transfer services such as AWS Snowball, Snowball Edge, and AWS Snowmobile, and hybrid storage solutions such as AWS Storage Gateway.
This document provides a summary of Amazon Web Services (AWS) storage solutions and customer use cases. It discusses AWS storage services like Amazon S3, EBS, EFS, Snowball, and Storage Gateway. It also highlights new storage features for data movement, security, management and analytics. Several case studies describe how customers are using AWS storage for backup/disaster recovery, media workflows, scientific computing, and other workloads.
Deploy and Enforce Compliance Controls When Archiving Large-Scale Data Stores...Amazon Web Services
Learning Objectives:
- Learn what storage regulations to be aware of when developing and deploying a cloud based storage solution
- Gain awareness of various strategies to address compliance
- Examine solutions available from AWS, including Amazon S3, Amazon Glacier with its Vault Lock feature, AWS Snowball, and related data ingestion services.
We will cover the core AWS storage services, which include Amazon Simple Storage Service (Amazon S3), Amazon Glacier, Amazon Elastic File System (Amazon EFS), and Amazon Elastic Block Store (Amazon EBS). We also discuss data transfer services such as AWS Snowball, Snowball Edge, and AWS Snowmobile, and hybrid storage solutions such as AWS Storage Gateway.
100 Billion Data Points With Lambda_AWSPSSummit_SingaporeAmazon Web Services
The document discusses the Genome Institute of Singapore's use of AWS services like Lambda and Batch for processing and analyzing genomics data. It describes their journey from using basic compute and storage to implementing serverless architectures on AWS to automate complex genomics pipelines and scale to process over 100 billion data points daily from sequencing experiments. This enabled GIS to achieve their objective of characterizing genetic variation in 10,000 Singaporeans through whole genome sequencing and create genomic references and controls for disease studies.
How to Migrate Your SaaS Apps to AWS for Increased Agility and AvailabilityAmazon Web Services
SoftNAS Cloud helped Modus easily move its app to AWS without application re-engineering. This frictionless experience helped increase agility and gave time back to operations teams, while enabling Modus to drive business value, add new customers and streamline company acquisitions. Join us for this special webinar to learn more about how your organization can benefit from SoftNAS Cloud on AWS.
This is your chance to learn directly from top CTOs and Cloud Architects from some of the most innovative AWS customers. In this lightning round session, we'll have an action-packed hour, jumping straight to the architecture and technical detail for some of the most innovative data storage solutions of 2017. Hear how Insitu collects and analyzes data from drone flights in the field with AWS Snowball Edge. See how iRobot collects and analyzes IoT data from their robotic vacuums, mops, and pool cleaners. Learn how Viber maintains a petabyte-scale data lake on Amazon S3. Understand how Alert Logic scales their massive SaaS cloud security solution on Amazon S3 & Amazon Glacier.
I Want to Analyze and Visualize Website Access Logs, but Why Do I Need Server...Amazon Web Services
Nowadays, it’s common for a web server to be fronted by a global content delivery service, such as Amazon CloudFront, to accelerate delivery of websites, APIs, media content, and other web assets. Website administrators and developers want to generate insights in order to improve website availability through bot detection and mitigation, by optimizing web content based on the devices and browser used, by reducing perceived latency by caching a popular object closer to its viewer, and so on. In this session, we dive deep into building an end-to-end serverless analytics solution to analyze Amazon CloudFront access logs, both at rest and in transit, using Amazon Athena and Amazon Kinesis Analytics, respectively, and we generate visualization insights using Amazon QuickSight. Join a discussion with AWS solution architects to learn more about the various ways to generate insights to improve the overall perceived experience for your website users.
Cloud computing gives you a number of advantages, such as the ability to scale your web application or website on demand. If you have a new web application and want to use cloud computing, you might be asking yourself, "Where do I start?" Join us in this session to understand best practices for scaling your resources from one to millions of users. We show you how to best combine different AWS services, how to make smarter decisions for architecting your application, and how to scale your infrastructure in the cloud.
Storage Data Management: Tools and Templates to Seamlessly Automate and Optim...Amazon Web Services
This document discusses storage management strategies for Amazon S3 and Amazon Glacier. It provides an overview of S3 architecture and storage classes. It also describes tools for organizing, monitoring, securing, and taking action on stored data using object tagging, inventory, metrics, lifecycle policies, cross-region replication, encryption, and event notifications. The document aims to help users understand their stored data and automate storage management.
Strategic Uses for Cost Efficient Long-Term Cloud StorageAmazon Web Services
Compared to storing long-term datasets on-premises, archiving in the cloud is a smart alternative whether you’re looking for an active archive solution, tape replacement, or to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier. Hear how customers have evolved their backup and disaster recovery architectures and replaced tape solutions by turning to AWS for a more cost-efficient, durable, and agile solution. We will showcase Sony DADC's active archive deployment on Glacier and demo how some of our financial services customers have set up compliant archives to meet their regulatory objectives.
This document provides an overview of building mobile backends on AWS. It discusses using AWS services like AWS Mobile Hub, Amazon Cognito, DynamoDB, Lambda and API Gateway to easily develop, deploy and scale mobile backends. It emphasizes starting simply, integrating services easily, securing data by default, and scaling globally across AWS' infrastructure. The document also covers using AWS for analytics, machine learning, media and real-time features in mobile apps.
Surveys consistently rank backup as one of the first workloads to move to the cloud. But what does it really look like? This session gives backup managers and admins the straight story on streamlining AWS Cloud integration with existing on-premises data backup software, tape processes, virtual tape libraries, third-party snapshots, file servers, and archives. Learn how to choose the right integration with varying degrees of disruption, how to automatically migrate data for cost reductions and compliance, and how to recover individual files or many files fast. We discuss Amazon S3, Amazon Glacier, Amazon EFS, AWS Snowball, AWS Storage Gateway (both as VTL and File Gateway), and third-party partner integrations.
Case Study: Learn how to Choose and Optimize Storage for Media and Entertainm...Amazon Web Services
The document discusses choosing and optimizing storage for media and entertainment workloads on AWS. It provides an overview of AWS storage services like S3, EBS, EFS, and Glacier and how they map to different M&E segments and workloads. It also covers best practices for storage alignment, capacity planning, security, and a case study of Theory Studios using AWS for VFX workloads.
Many customers want a disaster recovery environment, and they want to use this environment daily and know that it's in sync with and can support a production workload. This leads them to an active-active architecture. In other cases, users like Netflix and Lyft are distributed over large geographies. In these cases, multi-region active-active deployments are not optional. Designing these architectures is more complicated than it appears, as data being generated at one end needs to be synced with data at the other end. There are also consistency issues to consider. One needs to make trade-off decisions on cost, performance, and consistency. Further complicating matters, the variety of data stores used in the architecture results in a variety of replication methods. In this session, we explore how to design an active-active multi-region architecture using AWS services, including Amazon Route 53, Amazon RDS multi-region replication, AWS DMS, and Amazon DynamoDB Streams. We discuss the challenges, trade-offs, and solutions.
When Fujirebio Diagnostics, a leading producer of in vitro diagnostics, shifted to virtualization and the cloud, it wanted to replace its costly, unreliable, and cumbersome backup solution. Fujirebio turned to Amazon Web Services (AWS) and Rubrik for a more modern solution. The company used Rubrik Cloud Data Management to eliminate complex tape backup and archive mission critical production systems on AWS, as well as extend on-site storage capacity. The solution automates backup, recovery, and archival on AWS, helping the company drive operational efficiency and resilience. In this webinar, you will learn how Fujirebio Diagnostics used AWS and Rubrik to simplify data protection, achieve fast recovery, reduce management time, and lower total cost of ownership by 75 percent.
AWS Speaker: Mike Ruiz, Partner Solutions Architect
Rubrik Speakers: Kenneth Hui, Technical Marketing Engineer & Mark Haus, Sales Engineer
Tape Is a Four Letter Word: Back Up to the Cloud in Under an Hour (STG201) - ...Amazon Web Services
Tape backups. Yes, they're still a thing. If you want to stop using tapes but need to store immutable backups for compliance or operational reasons, attend this session to learn how to make an easy switch to a cloud-based virtual tape library (VTL). AWS Storage Gateway provides a seamless drop-in replacement for tape backups with its Tape Gateway. It works with the major backup software products, so you simply change the target for your backups, and they go to a VTL that stores virtual tapes on Amazon S3 and Amazon Glacier. Come see how it works.
Similar to Deep Dive on Archiving and Compliance
Come costruire servizi di Forecasting sfruttando algoritmi di ML e deep learn...Amazon Web Services
Forecasting is an important process for many companies and is used in many areas to try to accurately predict the growth and distribution of a product, the resources required on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a time component and then use an algorithm that produces an accurate forecast based on the type of data analyzed.
Big Data per le Startup: come creare applicazioni Big Data in modalità Server...Amazon Web Services
The variety and volume of data created every day keeps accelerating and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment affordable only for established companies. But the elasticity of the cloud, and Serverless services in particular, lets us break through these limits.
Let's see how it is possible to develop Big Data applications quickly, without worrying about infrastructure, dedicating all our resources instead to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing its pace of innovation. Over this period we learned how changing our approach to application development dramatically increased our agility and release velocity, and ultimately allowed us to build more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not just application architecture, but organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the one used by Amazon.com itself.
Come spendere fino al 90% in meno con i container e le istanze spot Amazon Web Services
Container usage keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
The AWS services ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, leading to average savings of 70% compared to On-Demand Instances. In this session we will look at the characteristics of Spot Instances and how they can easily be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. AWS and FinConecta would therefore like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda :
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Rendi unica l’offerta della tua startup sul mercato con i servizi Machine Lea...Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative components built ad hoc.
AWS provides services that are ready to use and, at the same time, lets you customize and create the differentiating elements of your offering.
Focusing on Machine Learning technologies, we will look at how to select the artificial intelligence services offered by AWS and, including through a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: automatizza la gestione e i deployment del...Amazon Web Services
With the traditional approach to IT, implementing DevOps techniques has long been difficult: until now they have often involved manual activities, occasionally leading to application downtime that interrupted users' work. With the advent of the cloud, DevOps techniques are now within everyone's reach at low cost for any kind of workload, guaranteeing greater system reliability and resulting in significant improvements to business continuity.
AWS provides AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances by means of Chef and Puppet workloads.
Learn how to use AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory su AWS per supportare i tuoi Windows WorkloadsAmazon Web Services
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we will explore the possibilities offered by AWS services for applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are hosting a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in VMware vSphere®-based cloud environments and access a wide range of AWS services, making full use of the AWS cloud while protecting existing VMware investments.
Crea la tua prima serverless ledger-based app con QLDB e NodeJSAmazon Web Services
Many companies today build applications with ledger-like functionality, for example to verify the history of credits and debits in banking transactions, or to track the flow of their products through the supply chain.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB removes the need to build complex custom systems by providing a fully managed, serverless ledger database.
In this session we will see how to build a complete serverless application that uses QLDB's capabilities.
With the rise of microservices architectures and rich mobile and web applications, APIs are more important than ever for giving end users a great user experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dig into several scenarios, understanding how AppSync can help address these use cases by building modern APIs with real-time and offline data update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Database Oracle e VMware Cloud™ on AWS: i miti da sfatareAmazon Web Services
Many organizations are reaping the benefits of the cloud by migrating their Oracle workloads and securing significant gains in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, and performance risks can be introduced when moving applications out of on-premises data centers.
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads while accelerating the transformation to the cloud; they take a deeper look at the architecture and demonstrate how to make full use of VMware Cloud™ on AWS.
1) The document discusses building a minimum viable product (MVP) using Amazon Web Services (AWS).
2) It provides an example of an MVP for an omni-channel messenger platform, built starting in 2017, that connects ecommerce stores to customers via web chat, Facebook Messenger, WhatsApp, and other channels.
3) The founder discusses how they started with an MVP in 2017 with 200 ecommerce stores in Hong Kong and Taiwan, and have since expanded to over 5000 clients across Southeast Asia using AWS for scaling.
This document discusses pitch decks and fundraising materials. It explains that venture capitalists will typically spend only 3 minutes and 44 seconds reviewing a pitch deck. Therefore, the deck needs to tell a compelling story to grab their attention. It also provides tips on tailoring different types of decks for different purposes, such as creating a concise 1-2 page teaser, a presentation deck for pitching in-person, and a more detailed read-only or fundraising deck. The document stresses the importance of including key information like the problem, solution, product, traction, market size, plans, team, and ask.
This document discusses building serverless web applications using AWS services like API Gateway, Lambda, DynamoDB, S3 and Amplify. It provides an overview of each service and how they can work together to create a scalable, secure and cost-effective serverless application stack without having to manage servers or infrastructure. Key services covered include API Gateway for hosting APIs, Lambda for backend logic, DynamoDB for database needs, S3 for static content, and Amplify for frontend hosting and continuous deployment.
This document provides tips for fundraising from startup founders Roland Yau and Sze Lok Chan. It discusses generating competition to create urgency for investors, fundraising in parallel rather than sequentially, having a clear fundraising narrative focused on what you do and why it's compelling, and prioritizing relationships with people over firms. It also notes how the pandemic has changed fundraising, with examples of deals done virtually during this time. The tips emphasize being fully prepared before fundraising and cultivating connections with investors in advance.
AWS_HK_StartupDay_Building Interactive websites while automating for efficien...Amazon Web Services
This document discusses Amazon's machine learning services for building conversational interfaces and extracting insights from unstructured text and audio. It describes Amazon Lex for creating chatbots, Amazon Comprehend for natural language processing tasks like entity extraction and sentiment analysis, and how they can be used together for applications like intelligent call centers and content analysis. Pre-trained APIs simplify adding machine learning to apps without requiring ML expertise.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies the management of Docker containers through an orchestration layer controlling deployment and the container lifecycle. In this session we will present the main features of the service, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.
Amazon EBS provides highly available, consistent, low-latency block storage for Amazon EC2, to help tune applications with the right storage capacity, performance and cost. EBS is designed for workloads that require persistent storage accessible by single EC2 instances. Typical use cases include Big Data analytics engines (like the Hadoop/HDFS ecosystem and Amazon EMR), relational and NoSQL databases (like Microsoft SQL Server and MySQL or Cassandra and MongoDB), stream and log processing applications (like Kafka and Splunk), and data warehousing applications (like Vertica and Teradata).
Amazon EFS provides simple, scalable, fully managed file system storage for sharing data between Amazon EC2 instances in the AWS Cloud. It delivers a file system interface with standard file system access semantics for Amazon EC2 instances. Amazon EFS grows and shrinks capacity automatically, and provides high throughput with consistently low latencies. Amazon EFS is designed for high availability and durability, and provides performance for a broad spectrum of workloads and applications, including Big Data and analytics, media processing workflows, content management, web serving, container storage, and home directories.
Amazon S3 is object storage designed to store and access any type of data over the Internet. It is secure, 99.999999999% durable, and scales past tens of trillions of objects. Amazon S3 is used for backup and recovery, tiered archive, user-driven content (like photos, videos, music and files), data lakes for Big Data analytics and data warehouse platforms, or as a foundation for serverless computing design.
Amazon Glacier is an extremely low-cost, highly durable storage for long-term backup and archive. Amazon Glacier is a solution for customers who want low-cost storage for infrequently accessed data. It can replace tape while assisting with compliance in highly regulated organizations like healthcare, life science, and financial services.
Amazon Cloud Data Migration services help customers migrate data into and out of the AWS Cloud in offline, online, or streaming models.
In addition to media files, we also see healthcare and life science customers archiving long-term data on AWS. Philips Healthcare runs its HealthSuite Digital Platform on AWS and stores PBs of patient data, which must be retained for the lifetime of the patient and beyond. Philips Healthcare supports over 1,500 hospitals in the U.S., which produce patient records and medical images daily. Philips uses a number of HIPAA-eligible services on AWS that help them meet healthcare-specific compliance requirements for storing Personal Health Information (PHI).
Finally, we also see customers in the public sector storing long-term archive data on AWS. King County is the largest county in Washington State and is also home to the AWS headquarters. They replaced tape-based backup solutions in 17 agencies with AWS storage services, which not only allowed them to meet their existing compliance requirements but also helped them reduce tape management overhead and increase agility – they saved $1MM in the first year after switching to AWS.
Across many industry segments, we see more data produced every day and an increasing desire to store and retain more of it for longer (if not keep everything forever), as long as it is operationally and financially feasible. More data is created due to higher-resolution cameras and 4K/8K video, advances in medical imaging and genomics sequencing technology, and the growing breadth and depth of regulations that require more firms to retain more data for potential audits (call logs, voice mails, now even social media). On the other hand, customers now have access to new tools to analyze massive amounts of historical data (the ad-tech industry, for example). They want to retain user activity logs for longer periods so that they can go back and run a new algorithm to derive new insights in the future that may create new avenues for monetization. All of this business demand is putting pressure on reliable, scalable, and cost-effective data archiving solutions.
Traditionally, customers have relied on on-premises storage arrays (NAS, SAN, and tape) to archive data, as well as expensive, purpose-built compliance storage hardware to meet regulatory retention requirements. These solutions typically require a hefty upfront capex investment along with ongoing maintenance and capacity planning, and for those that use tape there is always the burden of a tape refresh every few years to go from gen N-2 to N. Many customers have found the traditional purchasing and operational model burdensome, and it often hampers business growth and time to market.
Amazon Web Services gives you reliable, durable long-term storage options without the up-front capital expenditures and complex capacity-planning burden of on-premises storage. Amazon storage services remove the need for complex and time-consuming capacity planning, ongoing negotiations with multiple hardware and software vendors, specialized training, and maintenance of offsite facilities or transportation of storage media to third-party offsite locations. You only pay for what you use, and you can choose any of the N AWS worldwide regions based on your compliance and data sovereignty objectives. AWS does not move your data out of a region unless you specifically request it.
The other thing to note is our strong history of price cuts. Normally, when you buy capital equipment and the price is reduced, no one calls you and offers a refund for what you’ve already purchased. AWS frequently cuts pricing as we continue to gain scale and realize efficiencies in our operational model. Just last year we cut S3 pricing by 65%. This year we introduced S3-IA, which provides roughly 60% savings compared to S3, and cut Glacier pricing by 30%.
S3 is highly durable. Your data is stored across three separate facilities, giving you geo-redundancy: we can sustain the simultaneous loss of data in two facilities and your data is still safe, providing a statistical measure of 11 9’s of durability. Consider what it would take to architect for that level of durability in your own data centers.
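To put 11 9’s in perspective, here is a quick back-of-the-envelope calculation (an illustrative model only, not an official AWS durability figure) of what that loss rate implies for a large object store:

```python
# Illustrative arithmetic only: treat 99.999999999% annual durability as an
# expected object-loss rate of 1e-11 per object per year.
annual_loss_rate = 1 - 0.99999999999  # = 1e-11

objects_stored = 10_000_000
expected_losses_per_year = objects_stored * annual_loss_rate
years_per_single_loss = 1 / expected_losses_per_year

print(f"Expected object losses per year: {expected_losses_per_year:.6f}")
print(f"On average, one object lost every {years_per_single_loss:,.0f} years")
```

In other words, storing ten million objects, you would on average expect to lose a single object once every ten thousand years under this simple model.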
Across the board, we provide 3 storage options with 3 different performance characteristics and price points. On the left, we have S3 Standard, our high-performance object storage for the internet, designed for very active, hot workloads. Data in S3 Standard is available in milliseconds and costs $0.03/GB/month (starting at). On the right-hand side, we have Glacier, our cold storage service designed for long-term archival and infrequently accessed data. Data in Glacier has a 3-5 hour access latency, and Glacier costs $0.007/GB/month (starting at). Between the hot and cold options, we have a “warm” option – S3 Infrequent Access (S3-IA), designed for data you plan to access maybe a few times a year, or what we think of as “active archive”. S3-IA costs $0.0125/GB/month (starting at). From an archiving perspective, customers typically use S3-IA and Glacier together.
Just a quick note on terminology: S3 stores data in buckets, and each piece of data is an object; Glacier stores data in vaults (the equivalent of S3 buckets), and each piece of data is called an archive (similar to an object). You will hear me use bucket/vault/object/archive later on.
Customers like the three storage options and find it flexible and easy to pick the one that suits their needs. What's more, they also like how we help them tie the options together with the Data Lifecycle Management feature, which lets you use the three storage options in tandem and tier data from hot to warm to cold as it ages.
If you think about the typical lifecycle of data, newly created active data is accessed very frequently.
Think about a new video clip you create and share with your friends and family on S3. At first, people will consume this new data actively: the video will be played back, shared, and commented on very frequently.
As the video gets older, fewer people will engage with it; it becomes less frequently accessed and can be moved to S3-IA.
As time goes on, the video becomes colder and can be archived in Glacier for the lowest cost.
Beyond tiering data, Data Lifecycle Management can also automate expiration/deletion of data and supports storing multiple versions of the same object.
For example, you can transition objects from S3 Standard to S3-IA after 30 days and then on to Glacier after 365 days.
One of the key advantages of automated storage tiering is cost reduction: moving from S3 Standard to S3-IA saves 58%, and moving from S3-IA to Glacier saves a further 44%.
Many media customers store video archives on AWS and keep them indefinitely. Video assets are typically the core creative work of these professionals and are sometimes referred to as their "crown jewels." Sony DADC is Sony's new media division and recently launched Ve.nue, a media processing and distribution service powered by AWS. Sony DADC stores petabytes of video assets on AWS and intends to keep the data indefinitely. Across many media customers, we see high-resolution master/mezzanine files stored in the cloud so they can be transcoded to new delivery formats for monetization when necessary, say for a new iPhone or a new director's cut/special edition release.
Now, these customers can use Glacier as compliance storage with Glacier's Vault Lock capability. We launched Vault Lock in summer 2015; it allows customers to set compliance controls on Glacier storage containers (vaults) via a lockable policy. For example, customers who used to buy WORM storage/drives for records retention can now easily set up a Vault Lock with, say, a 7-year retention period, and Glacier will enforce that control so no archive stored in the vault can be deleted until it has been stored for 7 years.
We recognize that data retention is one of the most common archive use cases, and we launched Vault Lock to make life simpler for these customers. However, Vault Lock does more than data retention (WORM). It can be used to enforce a number of compliance objectives, such as protection of data access. For example, a pharmaceutical company could lock its top-secret drug formula in a vault that requires three-way multi-factor authentication for access.
Let's take a closer look at how Vault Lock helps you achieve compliance-archiving objectives. To start, it lets you quickly configure non-rewritable, non-erasable records, so you can use it as WORM storage. You can then specify time-based retention, defined by the "ArchiveAgeInDays" control. The Vault Lock policy is immutable after you test and lock it down, providing a strong form of governance. Finally, Vault Lock supports legal hold, for cases where a firm receives a subpoena and must retain all related records for as long as the investigation continues; the legal hold overrides the underlying time-based retention, so a record cannot be deleted even after it has exited the planned retention window.
Those in the financial industry, such as broker-dealers, also need to designate a third party (D3P) with read access to their regulatory data for continuity reasons. Glacier makes it easy to set up your D3P, and our financial services page lists a few partners that can provide this service.
We understand that choosing a compliance storage offering, or switching from one to another, requires internal alignment, such as convincing your compliance officer. For financial customers, we made this easier by obtaining a third-party compliance assessment from a reputable independent audit firm, Cohasset Associates, which has been in the compliance industry for over 40 years and has produced similar evaluations for many compliance storage products on the market. Cohasset Associates found that Amazon Glacier with Vault Lock can be used to meet the requirements of financial-services records-retention rules, specifically SEC Rule 17a-4(f) and CFTC 1.31(b)-(c). You can download a copy of the report from the Glacier website and share it with your compliance officer or relevant decision maker.
Talk about our data hierarchy: a customer maps to a vault, and a social post is stored in an archive. Retention and legal hold are set at the vault level.
Walk through the policy. Note that we set it with fewer than 20 lines of JSON.
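As a sketch of what a time-based retention policy of this kind can look like, the document below denies archive deletion until an archive is at least 7 years (2,555 days) old, using the "ArchiveAgeInDays" control discussed earlier. The region, account ID, and vault name are placeholders, and this is an illustrative shape rather than the exact policy from the slide.

```python
import json

# Hypothetical Vault Lock policy: deny DeleteArchive on any archive
# younger than 2,555 days (~7 years). Resource ARN uses placeholder
# region/account/vault values.
vault_lock_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "deny-deletes-for-7-years",
            "Principal": "*",
            "Effect": "Deny",
            "Action": "glacier:DeleteArchive",
            "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/compliance-vault",
            "Condition": {
                "NumericLessThan": {"glacier:ArchiveAgeInDays": "2555"}
            },
        }
    ],
}

print(json.dumps(vault_lock_policy, indent=2))
```

Serialized with two-space indentation, a policy like this comes in under 20 lines of JSON. Locking it in is a separate step: once the policy is attached and tested, completing the vault lock makes it immutable, which is what gives the retention control its governance strength.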