We are excited to announce Amazon Glacier, a fully managed archive service in the cloud that allows customers to store data in 'cold storage' at an extremely competitive price point. Glacier is built to support the same 11 nines of durability as S3. We'll take you through what Glacier is, how it works, where it sits in the storage spectrum, and our planned integration with S3.
Companies typically over-pay for data archiving. First, they're forced to make an expensive upfront payment for their archiving solution (which does not include the ongoing cost for operational expenses such as power, facilities, staffing, and maintenance). Second, since companies have to guess what their capacity requirements will be, they understandably over-provision to make sure they have enough capacity for data redundancy and unexpected growth. This set of circumstances results in under-utilized capacity and wasted money. With Amazon Glacier, you pay only for what you use. Join us for this webinar where you will learn how Amazon Glacier changes the game for data archiving and backup as you pay nothing upfront, pay a very low price for storage, and can scale your usage up or down as needed.
by Robbie Wright, Head of Amazon S3 & Amazon Glacier Product Marketing, AWS
Learn from AWS on how we've designed S3 and Glacier to be durable, available, and massively scalable. Hear how customers are using these services to enhance the accessibility and usability of their data. We will also dive into the benefits of object storage, its applications, and some best practices to follow.
(SPOT209) State of the Union: AWS Simple Storage and Glacier Services | AWS r... | Amazon Web Services
General Manager of Amazon Simple Storage Service, Mai-Lan Tomsen Bukovec, will share our learnings running and growing the AWS storage services. You will hear the interesting ways customers are using S3 and Glacier, learn about the new Amazon features that we launched this year, and how we think about evolving the Storage services.
Amazon Glacier provides low-cost cloud storage for archiving and backup, with a durability of 99.999999999% and storage costs starting at $0.01/GB. It has experienced rapid growth since its 2012 launch, with Amazon Web Services revenue increasing from $3.8 billion in 2013 to $8.8 billion in 2015. While competition is increasing, Amazon Glacier is well-positioned for continued success due to its low costs, flexibility, integration with other AWS services, and programmer-friendly design.
Data has gravity: your workloads and processing are dependent on where your data lives and how it is stored. With AWS, you have a host of storage options, and the key to leveraging them successfully is knowing when to use which option. This session will explain each of the AWS storage offerings in detail, along with data ingestion options into the cloud using Snowball and Snowmobile.
Marc Trimuschat,
Head of Business Development, AWS Storage, AWS APAC
AWS Webcast - Archiving in the Cloud - Best Practices for Amazon Glacier | Amazon Web Services
Join our webinar to learn more about how to build a cost-effective archive application using Amazon Glacier, an extremely low-cost, secure, highly durable, and easy-to-use storage service in the AWS cloud.
We will explain how Amazon Glacier works and walk through some best practices to get the most out of the service.
We will also highlight how to choose between Amazon Glacier and Amazon S3’s Glacier storage option.
Learn more: http://aws.amazon.com/glacier/
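One common way to use S3's Glacier storage option (rather than the native Glacier API) is an S3 lifecycle rule that transitions objects into the GLACIER storage class automatically. The sketch below shows what such a rule looks like; the helper name, prefix, and day thresholds are illustrative assumptions, not from the source.

```python
# Sketch: an S3 lifecycle rule that moves objects into the Glacier storage
# class, as an alternative to calling the native Glacier API directly.
# The helper name and day thresholds here are illustrative assumptions.

def glacier_lifecycle_rule(prefix, transition_days=90, expire_days=2555):
    """Build an S3 lifecycle rule dict that transitions objects under
    `prefix` to GLACIER after `transition_days` days and expires them
    after `expire_days` days (roughly seven years, a common compliance
    retention window)."""
    return {
        "ID": f"archive-{prefix.rstrip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": transition_days, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": expire_days},
    }

# The rule would then be applied with boto3, e.g.:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-archive-bucket",
#       LifecycleConfiguration={"Rules": [glacier_lifecycle_rule("logs/")]})
```

With this approach, objects stay addressable through the familiar S3 API while their storage cost drops to Glacier rates after the transition.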
by PD Dutta, Sr. Product Manager, Object Storage, AWS
We will explain how to design and build an IoT cloud platform on top of Amazon S3. You will get to review the best practices for architecting a cost-effective, durable, and secure storage solution to store and analyze your IoT data on Amazon S3. In addition, we’ll cover how to collect, ingest and analyze the data in-place using different AWS Services such as AWS IoT, Amazon Kinesis, Amazon Athena, and Amazon Redshift Spectrum.
Backup and Recovery with Cloud-Native Deduplication and Use Cases from the Fi... | Amazon Web Services
by Hugh Emberson, CTO, StorReduce
Designing and deploying cloud-enabled backup & recovery solutions often creates opportunities to reduce storage requirements and increase efficiency. Effective cloud-native deduplication as part of your backup & recovery strategy can optimize migration, reduce the need for purpose-built backup appliances (such as Data Domain) and large tape archives, and enable cost reductions of up to 95%. In this session, StorReduce will share best practices for data deduplication when designing and deploying solutions for backup, archive, and general unstructured file data. They will also demonstrate how a cloud-native interface with scale-out deduplication enables generic cloud services, such as search across all backups moved to the cloud. They will guide the audience through two customer use cases from the financial services and healthcare industries.
SRG302 Archiving in the Cloud using Amazon Glacier - AWS re:Invent 2012 | Amazon Web Services
The document discusses archiving files in Amazon Glacier. It outlines the basic steps: (1) create a vault in Glacier to store archives, (2) configure access policies for the vault, (3) upload files as archives to the vault, and (4) retrieve archives later via asynchronous retrieval jobs, which typically take 3-5 hours to complete. It also describes using services like DynamoDB or S3 to index archive metadata and retrieve it alongside the archive files.
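The vault/upload/retrieve steps above can be sketched with a boto3-style Glacier client passed in by the caller (e.g. `client = boto3.client("glacier")`). The function names and parameters here are illustrative; the response keys (`archiveId`, `jobId`) match boto3's documented Glacier responses.

```python
# Sketch of the archiving workflow described above, assuming a boto3-style
# Glacier client is supplied by the caller. Helper names are illustrative.

def create_vault(glacier, vault_name):
    # Step 1: create a vault to hold archives.
    return glacier.create_vault(vaultName=vault_name)

def upload_archive(glacier, vault_name, data, description=""):
    # Step 3: upload bytes as an archive. Glacier returns an opaque archive
    # ID, which you must index yourself (e.g. in DynamoDB or S3) because
    # Glacier has no native listing suitable for interactive lookup.
    resp = glacier.upload_archive(
        vaultName=vault_name, archiveDescription=description, body=data)
    return resp["archiveId"]

def request_download(glacier, vault_name, archive_id):
    # Step 4: retrieval is asynchronous. Initiate a retrieval job and poll
    # it; the job typically completes in 3-5 hours, after which the archive
    # bytes can be fetched with get_job_output.
    resp = glacier.initiate_job(
        vaultName=vault_name,
        jobParameters={"Type": "archive-retrieval", "ArchiveId": archive_id})
    return resp["jobId"]
```

Because retrieval is job-based rather than immediate, the metadata index kept in DynamoDB or S3 is what lets an application decide which archive ID to request hours before the data is needed.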
AWS offers storage, networking, and data transfer services so you can build and deploy solutions to extend backup and archive targets to the AWS Cloud, increasing scalability, durability, security, and compliance.
Training for AWS Solutions Architect at http://zekelabs.com/courses/amazon-web-services-training-bangalore/. This slide deck describes the features of the Simple Storage Service: S3 buckets, S3 static web hosting, cross-region replication, storage classes and their comparison, Glacier, transfer acceleration, lifecycle management, and security and encryption.
___________________________________________________
zekeLabs is a technology training platform. We provide instructor-led corporate training and classroom training on industry-relevant, cutting-edge technologies such as Big Data, Machine Learning, Natural Language Processing, Artificial Intelligence, Data Science, Amazon Web Services, DevOps, and Cloud Computing, as well as frameworks like Django, Spring, Ruby on Rails, Angular 2, and many more.
Reach out to us at www.zekelabs.com or call us at +91 8095465880 or drop a mail at info@zekelabs.com
by Drew Meyer, Sr. Product Marketing Manager, AWS
This session will provide an overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. We will touch on new offerings, outline some of the most common use cases, and prepare you for the individual deep dive sessions, customer sessions, and new announcements. The session will also address our partner network and what it means for a storage provider to have the APN Storage Competency.
This document provides an overview of Amazon Web Services storage options, including scalable object storage with Amazon S3, inexpensive archive storage with Amazon Glacier, persistent block storage with Amazon EBS, and a shared file system with Amazon EFS. It discusses the growth of data production across industries and how AWS storage services provide scalable, cost-effective solutions. Key features and use cases are described for each storage service.
Disaster Recovery Best Practices and Customer Use Cases: CGS and HealthQuest | Amazon Web Services
This document provides an agenda for a CloudEndure presentation at an AWS storage day event. The agenda includes an introduction to CloudEndure and how it works with AWS, enterprise disaster recovery strategies, a deep dive into CloudEndure's DR technology with a demo, customer case studies, and a Q&A session. It discusses CloudEndure's key technology pillars for OS-based continuous replication, and disaster recovery benefits such as cost reductions and improved recovery objectives. The document also includes case studies of how CloudEndure helped customers like CGS and HealthQuest implement DR solutions on AWS to reduce costs and improve recovery times, and it promotes the CloudEndure DR and migration products available on AWS Marketplace.
by Henry Zhang, Sr. Product Manager, AWS
Compared to storing long-term datasets on-premises, archiving in the cloud is a smart alternative, whether you're looking for an active archive solution, a tape replacement, or to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier.
An Overview of AWS Services for Data Storage and Migration - SRV205 - Atlanta... | Amazon Web Services
In this session, we explore the features and functions of AWS storage services. We provide context on the AWS storage portfolio, and we cover the most common use cases for AWS offerings for object, file, block, and migration technologies, including the AWS Partner Network (APN) ecosystem. Then we examine each service, using customer case studies as examples. You gain an understanding of how to select storage and start moving workloads or building new ones.
AWS May Webinar Series - Getting Started: Storage with Amazon S3 and Amazon G... | Amazon Web Services
If you are interested to know more about AWS Chicago Summit, please use the following to register: http://amzn.to/1RooPPL
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives:
• Business value of Amazon S3 and Amazon Glacier
• Leveraging S3 for web applications, media delivery, big data analytics, and backup
• Leveraging Amazon Glacier to build cost-effective archives
• Understanding the lifecycle management of AWS storage services
by Everett Dolgner, Business Development Manager, AWS
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum and optimizing your overall capital expense can be challenging. This session presents AWS features and services along with disaster recovery architectures that you can leverage when building highly available and disaster-resilient strategies.
This document discusses optimizing storage for big data workloads on AWS. It provides an overview of various AWS storage options for different big data use cases like Hadoop, data warehousing, NoSQL databases and streaming. It also shares several customer examples using EBS volumes for big data workloads like Hadoop, Cassandra and Splunk. The document recommends choosing the right EC2 instance type and EBS volume type based on the workload's input/output patterns and throughput/capacity needs to optimize performance and costs.
With AWS, you can choose the right storage service for the right use case. This session shows the range of AWS choices, from object storage to block storage, that is available to you. We include specifics about real-world deployments from customers who are using Amazon S3, Amazon EBS, Amazon Glacier, and AWS Storage Gateway.
(STG312) Amazon Glacier Deep Dive: Cold Data Storage in AWS | Amazon Web Services
This session explores some of the key features of Amazon Glacier, including security, durability, and configuration for storing compliance and regulatory data. It covers best practices for managing your cold data, including ingest, retrieval, and security controls. Other topics include: how to optimize storage, upload, and retrieval costs; how to identify the most applicable workloads; and recommended optimizations based on a few sample use cases from a number of industry verticals.
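One of the retrieval-cost levers mentioned above is the choice of retrieval tier. The tier names (Expedited, Standard, Bulk) are real Glacier job parameters; the helper function and the validation logic below are illustrative assumptions.

```python
# Sketch: selecting a Glacier retrieval tier to trade speed against cost.
# Tier names are real Glacier jobParameters values; the helper itself is
# an illustrative assumption, not from the source.

RETRIEVAL_TIERS = ("Expedited", "Standard", "Bulk")  # fastest -> cheapest

def retrieval_job_params(archive_id, tier="Standard"):
    """Build jobParameters for glacier.initiate_job. Bulk is the cheapest
    (retrievals in hours), Standard is the default (typically 3-5 hours),
    and Expedited is the fastest (minutes, for archives up to 250 MB) but
    the most expensive per GB retrieved."""
    if tier not in RETRIEVAL_TIERS:
        raise ValueError(f"unknown retrieval tier: {tier}")
    return {"Type": "archive-retrieval", "ArchiveId": archive_id, "Tier": tier}
```

For large compliance restores where timing is flexible, defaulting to Bulk and reserving Expedited for small, urgent archives is the usual way to keep retrieval costs down.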
This document summarizes a presentation on data lifecycle and storage management techniques for Amazon S3. It discusses lifecycle management rules for transitioning or expiring objects based on age, S3 inventory for listing objects, object tagging for classification and policy filtering, storage class analysis for monitoring usage and optimizing storage, and monitoring tools like CloudWatch and CloudTrail. The presentation provides an overview and best practices for these S3 management features.
AWS S3 | Tutorial For Beginners | AWS S3 Bucket Tutorial | AWS Tutorial For B... | Simplilearn
This AWS S3 presentation will help you understand what cloud storage is, the types of storage, life before Amazon S3, what S3 (Amazon Simple Storage Service) is, the benefits of S3, objects and buckets, and how Amazon S3 works, along with an explanation of the features of AWS S3. Amazon S3 is a storage service for the Internet. It is a simple storage service that offers software developers a highly scalable, reliable, low-latency data storage infrastructure at a relatively low cost. Amazon S3 provides a simple web service interface that can be used to store and retrieve any amount of data. Using it, developers can easily build applications that make use of Internet storage. Amazon S3 is designed to be highly flexible and scalable. Now, let's dive into this presentation and understand what Amazon S3 actually is.
Below topics are explained in this AWS S3 presentation:
1. What is Cloud storage?
2. Types of storage
3. Before Amazon S3
4. What is S3
5. Benefits of S3
6. Objects and buckets
7. How does Amazon S3 work
8. Features of S3
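The "how does Amazon S3 work" topic above boils down to a simple model: objects are stored in buckets under keys, through a put/get interface. A minimal sketch, assuming a boto3-style S3 client is supplied by the caller (e.g. `s3 = boto3.client("s3")`); the helper names are illustrative.

```python
# Minimal sketch of the S3 object model: store bytes under a key in a
# bucket, then retrieve them by the same key. The client is passed in by
# the caller; helper names are illustrative assumptions.

def store_object(s3, bucket, key, data):
    # Upload bytes as an object identified by (bucket, key).
    s3.put_object(Bucket=bucket, Key=key, Body=data)

def fetch_object(s3, bucket, key):
    # Download the object's bytes back; boto3 returns the payload as a
    # streaming body under the "Body" response key.
    return s3.get_object(Bucket=bucket, Key=key)["Body"].read()
```

Everything else in the feature list (static hosting, replication, storage classes, lifecycle rules) layers on top of this same bucket/key/object model.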
This AWS certification training is designed to help you gain in-depth understanding of Amazon Web Services (AWS) architectural principles and services. You will learn how cloud computing is redefining the rules of IT architecture and how to design, plan, and scale AWS Cloud implementations with best practices recommended by Amazon. The AWS Cloud platform powers hundreds of thousands of businesses in 190 countries, and AWS certified solution architects take home about $126,000 per year.
This AWS certification course will help you learn the key concepts, latest trends, and best practices for working with the AWS architecture, and become an industry-ready AWS Certified Solutions Architect, helping you qualify for a position as a high-quality AWS professional.
The course begins with an overview of the AWS platform before diving into its individual elements: IAM, VPC, EC2, EBS, ELB, CDN, S3, EIP, KMS, Route 53, RDS, Glacier, Snowball, CloudFront, DynamoDB, Redshift, Auto Scaling, CloudWatch, ElastiCache, CloudTrail, and Security. Those who complete the course will be able to:
1. Formulate solution plans and provide guidance on AWS architectural best practices
2. Design and deploy scalable, highly available, and fault tolerant systems on AWS
3. Identify the lift and shift of an existing on-premises application to AWS
4. Decipher the ingress and egress of data to and from AWS
5. Select the appropriate AWS service based on data, compute, database, or security requirements
6. Estimate AWS costs and identify cost control mechanisms
This AWS course is recommended for professionals who want to pursue a career in Cloud computing or develop Cloud applications with AWS. You’ll become an asset to any organization, helping leverage best practices around advanced cloud-based solutions and migrate existing workloads to the cloud.
Learn more at: https://www.simplilearn.com/
With AWS you can choose the right storage service for the right use case. Given the myriad of choices, from object storage to block storage, this session will profile details and examples of some of the choices available to you, with details on real world deployments from customers using Amazon Simple Storage Service (Amazon S3), Amazon Elastic Block Store (Amazon EBS), Amazon Glacier and AWS Storage Gateway. In addition, this session will also cover all the new AWS storage features introduced in the last 12 months.
Best Practices for Backup and Recovery: Windows Workloads on AWS | Amazon Web Services
Backing up Windows workloads can be challenging and cumbersome for many companies. Backup and recovery for Windows workloads on AWS, however, can be easy. This session will cover best practices for backup and recovery; how to configure Windows workloads to back up to AWS; pitfalls to look out for; and recommended reference architectures.
Using AWS for Backup and Restore (backup in the cloud, backup to the cloud, a... | Amazon Web Services
Companies are using AWS to create and deploy efficient, fast, and cost-effective backup and restore capabilities to protect critical IT systems without incurring the infrastructure expense of a second physical site. In this session, we will talk about cloud-based services AWS provides to enable robust backup and rapid recovery of your IT infrastructure and data.
This document discusses using AWS for storage and archive solutions. It begins by outlining the business and technical benefits AWS can provide, such as reducing costs, reducing on-premises infrastructure needs, changing processes, and retiring aging technologies. It then covers fundamental AWS storage services such as EBS, S3, and Glacier. Examples show how these services can be used for different storage and archive use cases such as backups, data distribution, databases, and long-term archives. Finally, it discusses getting data into AWS and using database services like RDS and DynamoDB.
This presentation discusses the use of AWS as a storage and archive platform. A wide range of assets can be cost effectively held in highly durable storage systems within the AWS cloud, for global distribution, long term storage or cold archive. Learn about a range of use cases for the Amazon Simple Storage Service (S3) beyond simple object storage, and how Amazon Glacier can revolutionise long term archive economics and technology.
James Brown, Business Development Manager, AWS
SRG302 Archiving in the Cloud using Amazon Glacier - AWS re: Invent 2012Amazon Web Services
The document discusses archiving files in Amazon Glacier. It outlines the basic steps: (1) create a vault in Glacier to store archives, (2) configure access policies for the vault, (3) upload files as archives to the vault which takes 3-5 hours to complete, and (4) download the archives from the vault later. It also describes using services like DynamoDB or S3 for indexing archive metadata and retrieving it alongside the archive files.
AWS offers storage, networking, and data transfer services so you can build and deploy solutions to extend backup and archive targets to the AWS Cloud, increasing scalability, durability, security, and compliance.
Training for AWS Solutions Architect at http://zekelabs.com/courses/amazon-web-services-training-bangalore/.Training for AWS Solutions Architect at http://zekelabs.com/courses/amazon-web-services-training-bangalore/. This slide describes about features of simple storage service, s3 buckets, s3-static web hosting, cross region replication, storage classes and comparison, glacier, transfer acceleration, life cycle management, security and encryption
___________________________________________________
zekeLabs is a Technology training platform. We provide instructor led corporate training and classroom training on Industry relevant Cutting Edge Technologies like Big Data, Machine Learning, Natural Language Processing, Artificial Intelligence, Data Science, Amazon Web Services, DevOps, Cloud Computing and Frameworks like Django,Spring, Ruby on Rails, Angular 2 and many more to Professionals.
Reach out to us at www.zekelabs.com or call us at +91 8095465880 or drop a mail at info@zekelabs.com
by Drew Meyer, Sr. Product Marketing Manager, AWS
This session will provide an overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. We will touch on new offerings, outline some of the most common use cases, and prepare you for the individual deep dive sessions, customer sessions, and new announcements. The session will also address our partner network and what it means for a storage provider to have the APN Storage Competency.
This document provides an overview of Amazon Web Services storage options, including scalable object storage with Amazon S3, inexpensive archive storage with Amazon Glacier, persistent block storage with Amazon EBS, and a shared file system with Amazon EFS. It discusses the growth of data production across industries and how AWS storage services provide scalable, cost-effective solutions. Key features and use cases are described for each storage service.
Disaster Recovery Best Practices and Customer Use Cases: CGS and HealthQuestAmazon Web Services
This document provides an agenda for a CloudEndure presentation at an AWS storage day event. The agenda includes an introduction to CloudEndure and how it works with AWS, enterprise disaster recovery strategies, a deep dive into CloudEndure's DR technology and demo, customer case studies, and a Q&A session. It discusses CloudEndure's key technology pillars for OS-based continuous replication and disaster recovery benefits like cost reductions and recovery objectives. The document also includes case studies of how CloudEndure helped customers like CGS and HealthQuest implement DR solutions on AWS to reduce costs and improve recovery times. It promotes CloudEndure DR and migration products available on AWS Marketplace and provides a call to action to become a CloudEnd
by Henry Zhang, Sr. Product Manager, AWS
Compared to storing long-term datasets on-premise, archiving in the cloud is a smart alternative whether you’re looking for an active archive solution, tape replacement, or to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier.
An Overview of AWS Services for Data Storage and Migration - SRV205 - Atlanta...Amazon Web Services
In this session, we explore the features and functions of AWS storage services. We provide context on the AWS storage portfolio, and we cover the most common use cases for AWS offerings for object, file, block, and migration technologies, including the AWS Partner Network (APN) ecosystem. Then we examine each service, using customer case studies as examples. You gain an understanding of how to select storage and start moving workloads or building new ones.
AWS May Webinar Series - Getting Started: Storage with Amazon S3 and Amazon G...Amazon Web Services
If you are interested to know more about AWS Chicago Summit, please use the following to register: http://amzn.to/1RooPPL
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives: • Business value of Amazon S3 and Amazon Glacier • Leveraging S3 for web applications, media delivery, big data analytics and backup • Leveraging Amazon Glacier to build cost effective archives • Understand the life cycle management of AWS' storage services
by Everett Dolgner, Business Development Manager, AWS
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum and optimizing your overall capital expense can be challenging. This session presents AWS features and services along with disaster recovery architectures that you can leverage when building highly available and disaster-resilient strategies.
This document discusses optimizing storage for big data workloads on AWS. It provides an overview of various AWS storage options for different big data use cases like Hadoop, data warehousing, NoSQL databases and streaming. It also shares several customer examples using EBS volumes for big data workloads like Hadoop, Cassandra and Splunk. The document recommends choosing the right EC2 instance type and EBS volume type based on the workload's input/output patterns and throughput/capacity needs to optimize performance and costs.
With AWS, you can choose the right storage service for the right use case. This session shows the range of AWS choices-from object storage to block storage-that is available to you. We include specifics about real-world deployments from customers who are using Amazon S3, Amazon EBS, Amazon Glacier, and AWS Storage Gateway.
(STG312) Amazon Glacier Deep Dive: Cold Data Storage in AWSAmazon Web Services
This session explores some of the key features of Amazon Glacier, including security, durability, and configuration for storing compliance and regulatory data. It covers best practices for managing your cold data, including ingest, retrieval, and security controls. Other topics include: how to optimize storage, upload, and retrieval costs; how to identify the most applicable workloads; and recommended optimizations based on a few sample use cases from a number of industry verticals.
Compared to storing long-term datasets on-premise, archiving in the cloud is a smart alternative whether you’re looking for an active archive solution, tape replacement, or to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategies and meeting compliance needs using Amazon Glacier.
This document summarizes a presentation on data lifecycle and storage management techniques for Amazon S3. It discusses lifecycle management rules for transitioning or expiring objects based on age, S3 inventory for listing objects, object tagging for classification and policy filtering, storage class analysis for monitoring usage and optimizing storage, and monitoring tools like CloudWatch and CloudTrail. The presentation provides an overview and best practices for these S3 management features.
AWS S3 | Tutorial For Beginners | AWS S3 Bucket Tutorial | AWS Tutorial For B...Simplilearn
This presentation AWS S3 will help you understand what is cloud storage, types of storage, life before Amazon S3, what is S3 ( Amazon Simple Storage Service ), benefits of S3, objects and buckets, how does Amazon S3 work along with the explanation on features of AWS S3. Amazon S3 is a storage service for the Internet. It is a simple storage service that offers software developers a highly-scalable, reliable, and low-latency data storage infrastructure at a relatively low cost. Amazon S3 gives a simple web service interface that can be used to store and restore any amount of data. Using this, developers can build applications that make use of Internet storage with ease. Amazon S3 is designed to be highly flexible and scalable. Now, lets deep dive into this presentation and understand what Amazon S3 actually is.
Below topics are explained in this AWS S3 presentation:
1. What is Cloud storage?
2. Types of storage
3. Before Amazon S3
4. What is S3
5. Benefits of S3
6. Objects and buckets
7. How does Amazon S3 work
8. Features of S3
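To make the objects-and-buckets model above concrete, here is a toy in-memory sketch (this is not the AWS API, just an illustration) of S3's flat key/value semantics, including the fact that "folders" are really just key prefixes:

```python
class Bucket:
    """Toy model of an S3 bucket: a flat map from string keys to object bytes."""

    def __init__(self, name: str):
        self.name = name
        self._objects = {}  # key -> body bytes

    def put_object(self, key: str, body: bytes) -> None:
        # A PUT replaces the whole object; there are no partial writes.
        self._objects[key] = body

    def get_object(self, key: str) -> bytes:
        return self._objects[key]

    def list_objects(self, prefix: str = "") -> list:
        # S3 has no real directories; listing by prefix is what makes
        # keys like "logs/2024/a.txt" look like a folder hierarchy.
        return sorted(k for k in self._objects if k.startswith(prefix))
```

In the real service the same three operations map to the PUT Object, GET Object, and LIST Objects calls of the S3 web service interface.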
This AWS certification training is designed to help you gain an in-depth understanding of Amazon Web Services (AWS) architectural principles and services. You will learn how cloud computing is redefining the rules of IT architecture and how to design, plan, and scale AWS Cloud implementations with best practices recommended by Amazon. The AWS Cloud platform powers hundreds of thousands of businesses in 190 countries, and AWS Certified Solutions Architects take home about $126,000 per year.
This AWS certification course will help you learn the key concepts, latest trends, and best practices for working with the AWS architecture – and become an industry-ready AWS Certified Solutions Architect, qualified for a position as a high-quality AWS professional.
The course begins with an overview of the AWS platform before diving into its individual elements: IAM, VPC, EC2, EBS, ELB, CDN, S3, EIP, KMS, Route 53, RDS, Glacier, Snowball, CloudFront, DynamoDB, Redshift, Auto Scaling, CloudWatch, ElastiCache, CloudTrail, and Security. Those who complete the course will be able to:
1. Formulate solution plans and provide guidance on AWS architectural best practices
2. Design and deploy scalable, highly available, and fault tolerant systems on AWS
3. Plan the lift and shift of an existing on-premises application to AWS
4. Decipher the ingress and egress of data to and from AWS
5. Select the appropriate AWS service based on data, compute, database, or security requirements
6. Estimate AWS costs and identify cost control mechanisms
This AWS course is recommended for professionals who want to pursue a career in Cloud computing or develop Cloud applications with AWS. You’ll become an asset to any organization, helping leverage best practices around advanced cloud-based solutions and migrate existing workloads to the cloud.
Learn more at: https://www.simplilearn.com/
With AWS you can choose the right storage service for the right use case. Given the myriad of choices, from object storage to block storage, this session will profile details and examples of some of the choices available to you, with details on real world deployments from customers using Amazon Simple Storage Service (Amazon S3), Amazon Elastic Block Store (Amazon EBS), Amazon Glacier and AWS Storage Gateway. In addition, this session will also cover all the new AWS storage features introduced in the last 12 months.
Best Practices for Backup and Recovery: Windows Workloads on AWS – Amazon Web Services
Backing up Windows workloads can be challenging and cumbersome for many companies. Backup and recovery for Windows workloads on AWS, however, can be easy. This session will cover best practices for backup and recovery; how to configure Windows workloads to back up to AWS; pitfalls to look out for; and recommended reference architectures.
Using AWS for Backup and Restore (backup in the cloud, backup to the cloud, a... – Amazon Web Services
Companies are using AWS to create and deploy efficient, fast, and cost-effective backup and restore capabilities to protect critical IT systems without incurring the infrastructure expense of a second physical site. In this session, we will talk about cloud-based services AWS provides to enable robust backup and rapid recovery of your IT infrastructure and data.
This document discusses using AWS for storage and archive solutions. It begins by outlining the business and technical benefits AWS can provide, such as reducing costs, reducing on-premise infrastructure needs, changing processes, and removing aging technologies. It then covers fundamental AWS storage services like EBS, S3, and Glacier. Examples of how these services can be used for different storage and archive use cases like backups, data distribution, databases, and long-term archives are provided. Finally, it discusses getting data into AWS and using database services like RDS and DynamoDB.
This presentation discusses the use of AWS as a storage and archive platform. A wide range of assets can be cost effectively held in highly durable storage systems within the AWS cloud, for global distribution, long term storage or cold archive. Learn about a range of use cases for the Amazon Simple Storage Service (S3) beyond simple object storage, and how Amazon Glacier can revolutionise long term archive economics and technology.
James Brown, Business Development Manager, AWS
This document discusses three fundamental storage options from AWS: Simple Storage Service (S3), Elastic Block Store (EBS), and Glacier. S3 provides scalable object storage, EBS provides block-level storage volumes for EC2 instances, and Glacier provides low-cost archival storage. The document compares the performance, redundancy, security, pricing and typical use cases of each service. It also discusses SteelStore, a cloud-integrated storage solution that aims to reduce backup time, costs and data volumes by up to 80% through data deduplication and compression.
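SteelStore's actual implementation is proprietary, but the general technique it relies on – deduplicating repeated blocks by content hash and compressing what remains – can be sketched in a few lines (illustrative only; real products add chunking strategies, indexes, and encryption):

```python
import hashlib
import zlib

def dedup_and_compress(blocks):
    """Store each unique block once, compressed, keyed by its SHA-256 digest.

    Repeated blocks (common across nightly backups of mostly-unchanged
    data) cost nothing beyond their first occurrence.
    """
    store = {}
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:          # duplicate blocks are skipped
            store[digest] = zlib.compress(block)
    return store
```

With three 100-byte blocks of which two are identical, the store holds only two compressed entries – the same effect that lets such appliances cut backup volumes substantially.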
A brief introduction of different storage options available on AWS platform. And what is the value proposition of AWS in the Disaster Recovery (DR) scenario.
The document discusses various storage options on Amazon Web Services (AWS) including Simple Storage Service (S3), Elastic Block Store (EBS), and Glacier. It then provides details on how to configure NetBackup to leverage these AWS storage services for backup and recovery. Specific scenarios are presented on backing up on-premises and cloud-based workloads to S3, EBS, and Glacier using different NetBackup and AWS configurations. Reporting and monitoring capabilities are also demonstrated.
Learn more about the tools, techniques and technologies for working productively with data at any scale. This session will introduce the family of data analytics tools on AWS which you can use to collect, compute and collaborate around data, from gigabytes to petabytes. We'll discuss Amazon Elastic MapReduce, Redshift, Hadoop, structured and unstructured data, and the EC2 instance types which enable high performance analytics.
Introduction to Storage on AWS - AWS Summit Cape Town 2017 – Amazon Web Services
With AWS, you can choose the right storage service for the right use case. This session shows the range of AWS choices that are available to you: Amazon S3, Amazon EBS, Amazon EFS, Amazon Glacier and Cloud Data Migration solutions.
Optimizing Data Management Using AWS Storage and Data Migration Products | AW... – Amazon Web Services
DigitalGlobe, Inc., the world’s leading provider of high-resolution Earth imagery, data, and analysis, is migrating its IT infrastructure, supporting imagery production and storage as well as satellite flight operations, to AWS with plans to close its commercial data centers within four years. DigitalGlobe has utilized AWS Snowmobile to move its 100PB image archive to the cloud. DigitalGlobe built its Geospatial Big Data platform, GBDX, natively on AWS. GBDX utilizes the image archive and combines geospatial big data and analytic tools, partner and customer data and tools, and dynamic cloud compute all in one place. This session will explore cost optimization for data management on AWS, highlighting various storage tiers and data import opportunities. We will focus on cost optimal usage of S3, S3-IA, Glacier, Snowball Edge and Snowmobile – balancing imagery access time with storage costs. Hear how DigitalGlobe utilized some of the newest features of the AWS platform to minimize their costs from storage. Learn More: https://aws.amazon.com/government-education/
IT systems and applications are producing and consuming content at a rapidly growing rate. This could significantly impact costs and agility of IT organizations if not planned for appropriately. Organizations of all sizes have seen significant benefits from utilizing cloud services in their business. One early area of focus for companies has been the highly durable, low cost and massively scalable benefits that come with cloud storage services. Today, thousands of developers and businesses around the globe rely on Amazon Web Services (AWS) for their backup, archival and disaster recovery requirements. This session covers best practices on proven designs from real world customer use cases and discuss topics such as capacity planning, durability, cost, security, as well as content categorization and transfer.
Learn how AWS customers save money, time and effort by using AWS's backup and archive services. Organizations of all sizes rely on AWS services to durably safeguard their data off-premises at a surprisingly low cost. This session will illustrate backup and archive architectures that AWS customers are benefitting from today.
Types of Cloud Storage and choosing the right solution – Vrishali Sanglikar
Cloud computing and technology – popularly referred to as the cloud – has redefined the way we store and share our information. It has helped us transcend the limitations of using a physical device to share, and opened a whole new dimension of the internet. We shall shortly see the why and how of the above. The providers making such services available are known as Cloud Service Providers, Hyperscalers, Cloud Providers, or simply Providers. The leaders in this space are AWS, GCP, Azure, etc.
Cloud Computing has been around for close to two decades now (AWS, the first Cloud Service Provider, started in 2006 and was the only hyperscaler in the market for a full four years after inception). So by now cloud computing is widely recognized by name, but few people really understand how it works. This whitepaper is focused on AWS, but other providers have similar services. Cloud computing had its early beginnings in the form of Grid Computing, where resources were up and running on a network of connected computers. The same concept has since evolved, abstracted even further and spread across a wider geographical area, leading to the emergence of what we today call the Cloud. Why is it called a Cloud? Because the location of the resource – the server, computing device, or data center hosting it – does not matter. We simply say that our 'Database is hosted on the Cloud' or 'our Compute Resources are hosted on the Cloud'.
So then how do we use these digital resources stored in the virtual space – it is by way of networks. It allows people to share information and applications without being restricted by their physical location. We can say that Cloud Computing is the ‘on-demand delivery of IT services and resources over the Internet with a pay-as-you-go pricing model’. Instead of buying, owning, and maintaining physical Data Centers and Servers, you can access technology services, such as computing power, storage, and databases, on an as-needed basis from a cloud provider.
Organizations of every type, size, and industry are using the cloud for a wide variety of use cases, such as data backup, disaster recovery, email, virtual desktops, software development, big data analytics, and customer-facing web applications.
Lunch and Learn - Store and Move your Data To & From the AWS Cloud, Markku Le... – Amazon Web Services
This document provides an overview and summary of options for storing and moving data to and from AWS cloud storage services. It discusses the problems with traditional on-premise storage solutions and how AWS storage services can help by providing scalable, cost-effective storage across different services optimized for various data and access needs such as block storage, file storage, archive storage, and backup storage. The document also covers data transfer options, choosing the right data store, disaster recovery strategies using AWS, and examples of companies using AWS storage services successfully.
AWS Summit 2014 Melbourne - Breakout 1
Businesses of all sizes are archiving their data to the AWS Cloud in order to reduce costs while taking advantage of highly secure, highly durable, and simple cloud based storage services. With AWS, you pay as you go and you can scale up and down as required. With your data stored in the AWS Cloud, it’s easy to use other Amazon Web Services to take advantage of additional cost savings and benefits. Amazon storage services remove the need for complex and time-consuming capacity planning, ongoing negotiations with multiple hardware and software vendors, specialized training, and maintenance of offsite facilities or transportation of storage media to third party offsite locations. Amazon Web Services now offers a robust set of hybrid storage solutions for customers that currently operate and maintain data centers. Our Next Generation Enterprise Storage strategy has at its heart Amazon S3. This highly scalable, extremely durable storage service combines with a diverse set of Cloud Storage Gateways to provide businesses with a new approach to Enterprise storage.
Presenter: Jeff Putt, Business Development Manager, APAC, Amazon Web Services
This document provides an overview and summary of backup and disaster recovery strategies using AWS cloud services. It discusses how AWS services like S3, Glacier, EBS snapshots, and Storage Gateway provide more durable, scalable and cost-effective backups compared to traditional on-premise solutions. Specific AWS tools covered include S3 for backups, Glacier for archival storage, EBS snapshots, Import/Export for large data transfers, Storage Gateway options, and using a "Pilot Light" strategy in AWS for fast disaster recovery compared to traditional tape-based approaches.
AWS Sydney Summit 2013 - Technical Lessons on How to do DR in the Cloud – Amazon Web Services
1. The document discusses backup and disaster recovery (DR) lessons learned from implementing backup and DR solutions using AWS for Ausenco Limited. It provides definitions of archiving, backup, and DR.
2. It then describes Ausenco's IT environment and challenges with unreliable backups, lack of DR, and limited local storage. Their initial approach involved consulting various vendors before shifting to leverage AWS cloud services.
3. The results section outlines key lessons around backup including ensuring it is accessible, able to scale, safe, works with DR policies, and that ownership is clearly defined. For DR, lessons include having a plan, testing regularly, and that different solutions can meet varying needs.
AWS provides a variety of storage services including object storage with Amazon S3, inexpensive archive storage with Amazon Glacier, persistent block storage with Amazon EBS, and a shared file system with Amazon EFS. The document discusses these services and how they can address different storage needs through scalable, cost-effective cloud storage options. It also highlights new features like cross-region replication for S3 and introduces the AWS Storage Gateway for on-premises backup and archival solutions into AWS storage services.
Active Archiving with Amazon S3 and Tiering to Amazon Glacier - March 2017 AW... – Amazon Web Services
Most organizations have data that they need to retain but access infrequently, if ever. In cases where this data needs to be accessible at a moment's notice, it's hard to save money by moving to archival storage, because access times on those platforms are slower. Now, customers are using Amazon S3 & Glacier for “Active Archiving” to reduce storage costs while maintaining the flexibility of instant access. In this tech talk, we'll show you how to implement Active Archiving with AWS Object Storage services, and we'll provide some real world examples of how AWS customers are saving money with these capabilities today.
Learning Outcomes:
• Define Active Archiving, and understand how it is different from traditional cold archiving
• Review the cost modeling tools available to determine if Active Archiving is a good fit for your organization
• Learn about best practices for using AWS Object Storage features & functionality to enable Active Archiving
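A back-of-the-envelope model shows why this kind of tiering pays off. The prices below are illustrative assumptions only (Glacier launched at roughly $0.01/GB-month; check the current AWS pricing pages), and retrieval fees are omitted for simplicity:

```python
# Assumed illustrative monthly prices per GB (not current AWS pricing):
S3_STANDARD = 0.03   # instantly accessible
GLACIER = 0.01       # archival tier

def monthly_cost(gb, hot_fraction):
    """Compare keeping everything in S3 vs. archiving the cold fraction.

    `hot_fraction` is the share of data that must stay instantly
    accessible; the rest is tiered down to Glacier.
    """
    tiered = gb * hot_fraction * S3_STANDARD + gb * (1 - hot_fraction) * GLACIER
    all_s3 = gb * S3_STANDARD
    return {"tiered": round(tiered, 2),
            "all_s3": round(all_s3, 2),
            "savings": round(all_s3 - tiered, 2)}
```

Under these assumed prices, a 1 TB dataset where only 10% is hot would cost $12/month tiered versus $30/month entirely in S3 – the kind of arithmetic the cost modeling tools mentioned above automate.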
Best practices: Backup and Recovery for Windows Workloads – Amazon Web Services
Backing up Windows workloads can be challenging and cumbersome for many companies. Backup and recovery for Windows workloads on AWS, however, can be easy. This session will cover best practices for backup and recovery; how to configure Windows workloads to back up to AWS; pitfalls to look out for; and recommended reference architectures.
AWS re:Invent 2016: High Performance Cinematic Production in the Cloud (MAE304) – Amazon Web Services
The process of making a film is highly complex and comprises multiple workflows across story development, pre-production, production, post-production, and final distribution. Given the size and amount of media and assets associated with each stage, high performance infrastructure is often essential to meeting deadlines.
In this session we will take a deeper dive into running a full cinematic production in the cloud, with a focus on solutions for each of the production stages. We will also look at best practices around design, optimization, performance, scheduling, scalability, and low latency utilizing AWS technologies such as EC2, Lambda, Snowball, Direct Connect, and Partner Solutions.
This webinar discussed the use of the AWS Cloud as a disaster recovery (DR) environment. It also explored how the architectural approaches to DR in the AWS Cloud make DR and BCP a great scenario for familiarising yourself with AWS before moving on to production application deployments in the cloud.
Similar to AWS Update | London - Amazon Glacier (20)
How to build Forecasting services using ML and deep learn... algorithms – Amazon Web Services
Forecasting is an important process for a great many companies and is used in many areas to try to accurately predict the growth and distribution of a product, the use of resources needed on production lines, financial presentations, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a temporal component, and then use an algorithm that, starting from the type of data analyzed, produces an accurate forecast.
Big Data for Startups: how to create Big Data applications in Server... mode – Amazon Web Services
The variety and quantity of data created every day is accelerating ever faster and represents a unique opportunity to innovate and create new startups.
However, managing large quantities of data can appear complex: building large-scale Big Data clusters seems like an investment accessible only to established companies. But the elasticity of the Cloud and, in particular, Serverless services allow us to break through these limits.
Let's see, then, how it is possible to develop Big Data applications quickly, without worrying about the infrastructure, dedicating all our resources instead to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in just a few steps.
Twenty years ago, Amazon went through a radical transformation with the goal of increasing the pace of innovation. Over this period we learned how changing our approach to application development greatly increased our agility and release velocity and, ultimately, allowed us to create more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not only the application architecture, but also the organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to spend up to 90% less with containers and Spot Instances – Amazon Web Services
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, leading to an average saving of 70% compared to On-Demand Instances. In this session we will explore the characteristics of Spot Instances and how easily they can be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda :
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make your startup's offering unique in the market with Machine Lea... services – Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative components built ad hoc.
AWS provides ready-to-use services and, at the same time, lets you customize and create the differentiating elements of your offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, partly through a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: automate the management and deployments of... – Amazon Web Services
With the traditional approach to IT, it was difficult for many years to implement DevOps techniques, which have often involved manual activities, occasionally leading to application downtime that interrupted user operations. With the advent of the cloud, DevOps techniques are now within everyone's reach, at low cost, for any kind of workload, guaranteeing greater system reliability and resulting in significant improvements in business continuity.
AWS offers AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances by means of Chef and Puppet workloads.
Discover how to leverage AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to support your Windows Workloads – Amazon Web Services
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss the options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and running Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we will explore the possibilities offered by AWS services for applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are organizing a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in VMware vSphere®-based cloud environments and access a wide range of AWS services, taking full advantage of the AWS cloud while protecting your existing VMware investments.
Many organizations reap the benefits of the cloud by migrating their Oracle workloads, securing significant gains in agility and cost efficiency.
Migrating these workloads, however, can create complexity during application modernization and refactoring, compounded by performance risks that can be introduced when moving applications out of local data centers.
Build your first serverless ledger-based app with QLDB and NodeJS – Amazon Web Services
Many companies today build applications with ledger-style functionality, for example to verify the history of credits and debits in banking transactions, or to track their products through the supply-chain flow.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable log of transactions, but they are complex and costly tools to manage.
Amazon QLDB eliminates the need to build complex custom systems by providing a fully managed, serverless ledger database.
In this session we will discover how to build a complete serverless application that uses the features of QLDB.
With the rise of microservice architectures and rich mobile and web applications, APIs are more important than ever for offering end users an exceptional user experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dig into several scenarios, understanding how AppSync can help address these use cases by creating modern APIs with real-time and offline data update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to the users of its web portal.
Oracle Database and VMware Cloud™ on AWS: myths to debunk – Amazon Web Services
Many organizations reap the benefits of the cloud by migrating their Oracle workloads, securing significant gains in agility and cost efficiency.
Migrating these workloads, however, can create complexity during application modernization and refactoring, compounded by performance risks that can be introduced when moving applications out of local data centers.
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads while accelerating the transformation to the cloud; they dig into the architecture and demonstrate how to take full advantage of VMware Cloud™ on AWS.
1) The document discusses building a minimum viable product (MVP) using Amazon Web Services (AWS).
2) It provides an example of an MVP for an omni-channel messenger platform that was built from 2017 to connect ecommerce stores to customers via web chat, Facebook Messenger, WhatsApp, and other channels.
3) The founder discusses how they started with an MVP in 2017 with 200 ecommerce stores in Hong Kong and Taiwan, and have since expanded to over 5000 clients across Southeast Asia using AWS for scaling.
This document discusses pitch decks and fundraising materials. It explains that venture capitalists will typically spend only 3 minutes and 44 seconds reviewing a pitch deck. Therefore, the deck needs to tell a compelling story to grab their attention. It also provides tips on tailoring different types of decks for different purposes, such as creating a concise 1-2 page teaser, a presentation deck for pitching in-person, and a more detailed read-only or fundraising deck. The document stresses the importance of including key information like the problem, solution, product, traction, market size, plans, team, and ask.
This document discusses building serverless web applications using AWS services like API Gateway, Lambda, DynamoDB, S3 and Amplify. It provides an overview of each service and how they can work together to create a scalable, secure and cost-effective serverless application stack without having to manage servers or infrastructure. Key services covered include API Gateway for hosting APIs, Lambda for backend logic, DynamoDB for database needs, S3 for static content, and Amplify for frontend hosting and continuous deployment.
This document provides tips for fundraising from startup founders Roland Yau and Sze Lok Chan. It discusses generating competition to create urgency for investors, fundraising in parallel rather than sequentially, having a clear fundraising narrative focused on what you do and why it's compelling, and prioritizing relationships with people over firms. It also notes how the pandemic has changed fundraising, with examples of deals done virtually during this time. The tips emphasize being fully prepared before fundraising and cultivating connections with investors in advance.
AWS_HK_StartupDay_Building Interactive websites while automating for efficien... – Amazon Web Services
This document discusses Amazon's machine learning services for building conversational interfaces and extracting insights from unstructured text and audio. It describes Amazon Lex for creating chatbots, Amazon Comprehend for natural language processing tasks like entity extraction and sentiment analysis, and how they can be used together for applications like intelligent call centers and content analysis. Pre-trained APIs simplify adding machine learning to apps without requiring ML expertise.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies the management of Docker containers through an orchestration layer controlling deployment and the related lifecycle. In this session we will present the service's main features, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.
2. Getting to Glacier…
Why AWS for storage & archive?
AWS fundamental services
Storage & archive – examples & patterns
Amazon Glacier
3. Storage & Archive
AWS is used in a variety of ways…
• Powers applications that allow customers to access historical stock price information
• Stores its vast repository of music to feed to over 15 million active users
• Estimates it has saved $500,000 in storage expenditures and cut its disk storage array costs in half
• Digital assets and usage data behind publication sites and mobile applications
4. Business & technical drivers (built up across slides 4–8)
You might be able to:
Reduce costs
• Slash storage & archive budgets by up to 50%
• Reduce CAPEX while dramatically increasing scalability
Reduce on-premise
• Eliminate on-premise equipment to manage archives
• Eliminate the need for secondary sites
• Eliminate 30%+ of your storage footprint
• Consolidate on-premise and augment with cloud
Change processes
• Remove the need to do capacity planning
• Eliminate provisioning for peak demand
Remove aging technologies
• Eliminate tape for backup and archive
• Remove tape archives
• Cycle out aging disk arrays
10. Fundamental Storage Options
Elastic Block Store, S3 and Glacier
Elastic Block Store: high performance block storage device; 1 GB to 1 TB in size; mount as drives to instances with snapshot/cloning functionality. Very fast 'instance' disks.
Simple Storage Service: highly scalable object storage; 1 byte to 5 TB in size; 99.999999999% durability. Fast web object storage.
Glacier: long term object archive; extremely low cost per gigabyte; 99.999999999% durability. Slow, rare access.
13. Fundamental Storage Options
Archive, backup and DR across the tiers
Amazon EBS snapshots: data accessed more than ~10% per month; rapid RTO.
Amazon S3: 11 9s durability; shorter term data backup with rapid RTO; expiration policies.
Amazon S3 RRS: lower cost when 11 9s durability is not required.
Amazon Glacier: infrequently accessed data (less than ~10% of data per month); use policies to move cold backup data into long term retention; long term archiving at lower cost; retain a "write once - read never" copy in case of a worst case scenario.
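The tiering rule of thumb above (roughly 10% of data touched per month as the dividing line between S3 and Glacier) can be sketched as a small decision helper. This is purely illustrative code, not an AWS API; the names and thresholds are assumptions drawn from the slide.

```java
// Illustrative tier chooser based on the slide's rule of thumb:
// data accessed more than ~10%/month (or needing rapid RTO) stays in S3;
// colder data goes to Glacier. Not an AWS SDK class.
public class TierChooser {
    enum Tier { S3, GLACIER }

    static Tier choose(double fractionAccessedPerMonth, boolean needsRapidRTO) {
        if (needsRapidRTO || fractionAccessedPerMonth >= 0.10) {
            return Tier.S3;      // fast web access, 11 9s durability
        }
        return Tier.GLACIER;     // hours-long retrieval, lowest cost per GB
    }

    public static void main(String[] args) {
        System.out.println(choose(0.25, false)); // hot data -> S3
        System.out.println(choose(0.01, false)); // cold data -> GLACIER
        System.out.println(choose(0.01, true));  // cold but DR-critical -> S3
    }
}
```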
14. Use case journey
On-premise: locally accessible file systems; workloads with local data. (High IO performance, high network performance.)
On-instance: EC2 based applications and disks; DR deployments; database data. (High IO performance, Provisioned IOPS, backup & restore.)
Object level: data distribution; durable media storage. (Good performance, high durability, scalability.)
Long term: system images; database backups; data archives. (Very low price, high durability, slow access.)
Step 1: getting data into the cloud
20. Getting data into the cloud
Direct Connect, Import/Export and Storage Gateway
AWS Direct Connect: dedicated bandwidth between your site and AWS
AWS Import/Export: physical transfer of media into and out of AWS
Amazon Storage Gateway: shrink-wrapped gateway for volume synchronization
21. Use case journey (continued)
Step 2: on-instance disks for EC2 based applications
24. Curiosity
The mars.jpl.nasa.gov website is based on the open-source Content Management System (CMS) Railo, running on Amazon EC2.
Shared storage for Railo is provided by Amazon EC2 instances running Gluster on a pool of Amazon Elastic Block Store (EBS) volumes for consistently high performance disk I/O.
25. Use case journey (continued)
Step 3: database as a service
28. Use case journey (continued)
Step 4: object serving and storage
31. You put it in S3
AWS stores it with 99.999999999% durability
32. Highly scalable web access to objects
Multiple redundant copies in a region
34. "Spotify needed a storage solution that could scale very quickly without incurring long lead times for upgrades. This led us to cloud storage, and in that market, Amazon Simple Storage Service (Amazon S3) is the most mature large-scale product. Amazon S3 gives us confidence in our ability to expand storage quickly while also providing high data durability."
Emil Fredriksson, Operations Director, Spotify
35. Use case journey (continued)
Step 5: cold storage & archiving
37. What we heard from you
You love Amazon S3 for its simplicity, security, durability, and performance.
38. What we heard from you
You wanted a highly secure, extremely durable, and extremely cost effective option for archiving data for years.
39. The need…
Reliable and cheap storage of data: data with long retention periods; multi-PB, infrequently accessed data sets.
42. Our goals with Glacier…
Redefine data archiving and backup: no upfront payments; a very low price for storage; the ability to scale up and down as needed; designed for an annual average durability of 99.999999999% per saved object; for as little as $0.01 per gigabyte per month.
Replace physical media for archiving: an easy to use storage service that is infinitely scalable; a secure service for important data assets.
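To make the $0.01/GB-month figure concrete, a quick back-of-the-envelope calculation shows what large archives cost at this rate. The helper below is purely illustrative arithmetic, not a billing tool, and ignores retrieval and request charges.

```java
// Back-of-the-envelope storage cost at Glacier's launch price of $0.01 per
// GB per month. Illustrative only; excludes retrieval and request charges.
public class GlacierCost {
    static double monthlyCost(double terabytes) {
        double gigabytes = terabytes * 1024; // TB -> GB
        return gigabytes * 0.01;             // $0.01 per GB per month
    }

    public static void main(String[] args) {
        System.out.println(monthlyCost(100)); // 100 TB stored -> $1024.0/month
    }
}
```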
48. Glacier use cases
Offsite archive: Glacier allows you to cost-effectively and securely store enterprise data offsite, making it simple, inexpensive and safe to retain archived data for as long as desired. Common use cases include enterprise data, media assets, and research and scientific data.
Digital preservation: libraries, historical societies, non-profit organizations and governments are increasing their efforts to preserve valuable but aging digital content such as websites, software source code, video games, user-generated content and other digital artifacts.
Tape replacement: Amazon Glacier is cost competitive, even at scale, and eliminates pain points like capacity planning, capital budgeting and investments, media formats, hardware refreshes, and off-site storage, shipping and retrieval costs.
51. Good reasons to replace off-site tape archives
100% restore success rate – no broken or missing
tapes
No lost tapes and improved security posture
No device or media admin or handling
No capacity planning
Pay as you go
No need for recurrent and risky data migrations
56. What is an archive?
Any object, such as a photo, video, document or
compressed collection
It is a base unit of storage in Amazon Glacier
Upload an archive in a single request
For large archives use multipart upload API
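When using the multipart upload API, the uploader must pick a part size: Glacier accepts part sizes that are power-of-two multiples of 1 MB (1 MB up to 4 GB), and an upload may have at most 10,000 parts. The helper below is a sketch of one reasonable strategy, smallest valid part size that fits the part limit; it is not part of the AWS SDK.

```java
// Sketch: choosing a part size for a Glacier multipart upload.
// Glacier constraints: part sizes are power-of-two multiples of 1 MB
// (1 MB .. 4 GB), and an upload has at most 10,000 parts.
// Hypothetical helper, not an AWS SDK class.
public class PartSizeCalculator {
    static final long MB = 1024L * 1024L;
    static final long MAX_PARTS = 10_000;

    // Smallest valid part size that keeps the upload under 10,000 parts.
    static long choosePartSize(long archiveBytes) {
        long partSize = MB; // start at the 1 MB minimum
        while (partSize * MAX_PARTS < archiveBytes) {
            partSize *= 2; // stay on power-of-two MB boundaries
        }
        return partSize;
    }

    static long partCount(long archiveBytes, long partSize) {
        return (archiveBytes + partSize - 1) / partSize; // ceiling division
    }

    public static void main(String[] args) {
        long archive = 50L * 1024 * MB;            // a 50 GB archive
        long part = choosePartSize(archive);
        System.out.println(part / MB);             // 8  (MB per part)
        System.out.println(partCount(archive, part)); // 6400 parts
    }
}
```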
57. Upload an archive (Java)
// Glacier client with API credentials (keys) and a region endpoint
client = new AmazonGlacierClient(credentials);
client.setEndpoint("https://glacier.us-east-1.amazonaws.com/");
// Transfer manager: upload by vault name, archive description, and file to upload
ArchiveTransferManager atm = new ArchiveTransferManager(client, credentials);
UploadResult result = atm.upload(vaultName, "MyArchive", new File(archiveToUpload));
58. Upload an archive (.NET)
// Transfer manager with a region endpoint
var manager = new ArchiveTransferManager(Amazon.RegionEndpoint.USEast1);
// Upload by vault name, archive description, and file to upload
string archiveId = manager.Upload(vaultName, "MyArchive", archiveToUpload).ArchiveId;
62. Initiate a retrieval job (Java)
// Job parameters: which archive to retrieve
JobParameters jobParameters = new JobParameters()
    .withArchiveId("*** provide an archive id ***")
    .withDescription("archive retrieval")
    .withType("archive-retrieval");
// Glacier client initiates the job against the vault
InitiateJobResult initiateJobResult = client.initiateJob(new InitiateJobRequest()
    .withJobParameters(jobParameters)
    .withVaultName(vaultName));
// JobID to track the retrieval
String jobId = initiateJobResult.getJobId();
63. Track the job
After 3-5 hours:
1. SNS topic notification
2. Call describeJob using the JobID
64. Download the job output (Java)
// Glacier client with API credentials (keys) and a region endpoint
client = new AmazonGlacierClient(credentials);
client.setEndpoint("https://glacier.us-east-1.amazonaws.com/");
// Transfer manager downloads by vault name and archive id to a download path
ArchiveTransferManager atm = new ArchiveTransferManager(client, credentials);
atm.download(vaultName, archiveId, new File(downloadFilePath));
65. Download the job output (.NET)
var manager = new ArchiveTransferManager(Amazon.RegionEndpoint.USEast1);
var options = new DownloadOptions();
options.StreamTransferProgress += ArchiveDownloadHighLevel.progress;
manager.Download(vaultName, archiveId, downloadFilePath, options);

static int currentPercentage = -1;
static void progress(object sender, StreamTransferProgressArgs args)
{
    if (args.PercentDone != currentPercentage)
    {
        currentPercentage = args.PercentDone;
        Console.WriteLine("Downloaded {0}%", args.PercentDone);
    }
}
66. "Every day our genome sequencers produce terabytes of data. As our company moves into the clinical space, we face a legal requirement to archive patient data for years that would drastically raise the cost of storage. Thanks to Amazon Glacier's secure and scalable solution, we will be able to provide cost-effective, long-term storage and thereby eliminate a barrier to providing whole genome sequencing for medical treatment of cancer and other genetic diseases."
67. "An organization like ours thinks in centuries when it comes to content retention, and long term preservation of our Master Archives is a critical part of our mission here at NYPR. Storing these core assets on traditional media such as local disk and off-site tape exposes us to corruption and even outright loss of data. We are excited to move our archives to Amazon Glacier, which will be a better long-term solution."
Steve Shultis, CTO, New York Public Radio
70. Storage, Retrievals, Data In, Data Out
Storage: from $0.01 per GB per month
Retrievals: free up to 5% of average monthly storage, then tiered fees
Data in: free
Data out: tiered (1st GB free)
71. Storage, Retrievals, Data In, Data Out
The anticipation is that archives will be accessed infrequently: storage is cheap, with a trade-off on retrieval pricing.
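The 5% free retrieval tier is easier to reason about per day: under the original pricing model, the monthly allowance is prorated daily. The sketch below shows the arithmetic only; it is illustrative, assumes a simple 5%-per-month proration, and is not a billing calculator.

```java
// Sketch of Glacier's free retrieval allowance under the original pricing
// model: up to 5% of average monthly storage per month, prorated daily.
// Illustrative arithmetic only, not a billing tool.
public class RetrievalAllowance {
    // Free gigabytes retrievable per day, given average storage for the month.
    static double freeDailyAllowanceGB(double storedGB, int daysInMonth) {
        return storedGB * 0.05 / daysInMonth;
    }

    public static void main(String[] args) {
        double stored = 12_000; // 12 TB kept in Glacier, in GB
        // 12,000 GB * 5% / 30 days = 20 GB/day retrievable at no charge
        System.out.printf("Free per day: %.1f GB%n", freeDailyAllowanceGB(stored, 30));
    }
}
```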
72. Benefits of Amazon Glacier
Low cost: as little as $0.01/GB/month with no up-front capital commitments.
Secure: secure and durable technology platform with industry-recognized certifications and audits.
Durable: average annual durability of 99.999999999% per archive.
Simple: eliminate hardware, software, and capacity planning.
Flexible: add any amount of data, quickly; easily expire and delete without handling media.
Use multiple services: easily leverage other AWS services once your data is in the AWS cloud.