This document provides an overview and comparison of various AWS data transfer and storage services. It discusses Internet/VPN ingest which uses standard HTTP/HTTPS to ingest data into S3 via existing internet connections. It also covers S3 Transfer Acceleration which optimizes data transfer between edge locations and S3 for improved throughput. Larger data transfers are supported using Snowball which ships physical storage devices. Direct Connect provides dedicated private connections for consistent network performance. Storage Gateway caches frequently accessed data locally while storing all data in AWS for backup, disaster recovery, and data migration use cases.
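The choice among these options often comes down to how long a network transfer would take. As a rough sketch (the seven-day threshold and the 80% utilization factor are illustrative assumptions, not AWS guidance):

```python
def transfer_days(total_gb: float, bandwidth_mbps: float, utilization: float = 0.8) -> float:
    """Estimate the days needed to push data over a network link.

    Converts gigabytes to megabits (1 GB = 8,000 Mb, decimal units) and
    assumes the link is only partially usable (default 80%).
    """
    seconds = (total_gb * 8_000) / (bandwidth_mbps * utilization)
    return seconds / 86_400


def suggest_service(total_gb: float, bandwidth_mbps: float) -> str:
    """Toy decision rule: ship a device when the network transfer takes too long."""
    if transfer_days(total_gb, bandwidth_mbps) <= 7:
        return "internet ingest / S3 Transfer Acceleration"
    return "Snowball (or Direct Connect for ongoing transfers)"
```

For example, 500 TB over a 100 Mbps link works out to well over a year of transfer time, which is the kind of case where shipping a Snowball device wins.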
Cloud Economics: Transform Businesses at Lower Costs - AWS Summit Bahrain 2017 | Amazon Web Services
Most likely, your organization is not in the business of running data centers, yet a significant amount of time and money is spent doing just that. Amazon Web Services provides a way to acquire and use infrastructure on demand, so that you pay only for what you consume. This puts more money back into the business, so that you can innovate more, expand faster, and be better positioned to take advantage of new opportunities. Learn from the CEO of DevFactory how they saved money and redirected their resources toward boosting innovation after taking advantage of the cloud.
Using AWS for Backup and Restore (backup in the cloud, backup to the cloud, a... | Amazon Web Services
Companies are using AWS to create and deploy efficient, fast, and cost-effective backup and restore capabilities to protect critical IT systems without incurring the infrastructure expense of a second physical site. In this session, we will talk about cloud-based services AWS provides to enable robust backup and rapid recovery of your IT infrastructure and data.
by Isaiah Weiner, Sr. Manager of Solutions Architecture, AWS
There are options beyond a straightforward lift and shift into Azure IaaS. What are your options? Learn how Azure helps modernize applications faster with containers and how you can use serverless to add additional functionality while keeping your production codebase 'clean'. We'll also learn how to incorporate DevOps throughout your app's lifecycle and take advantage of data-driven intelligence. A demo-intensive session integrating the likes of Service Fabric, AKS, VSTS, and more.
Domain-Driven Design provides not only strategic guidelines for decomposing a large system into microservices, but also the main tactical pattern that helps decouple them. The presentation will focus on how domain events can be implemented using Kafka, and on the trade-offs between consistency and availability that Kafka supports.
https://youtu.be/P6IaxNcn-Ag?t=1466
Amazon API Gateway and AWS Lambda: Better Together | Danilo Poccia
Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. AWS Lambda is a compute service that runs your code in response to events and automatically manages the compute resources for you, making it easy to build applications that respond quickly to new information. Together they help you build a serverless, event-driven backend that is easy to manage and scale.
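As a sketch of this pattern, here is a minimal Lambda handler for an API Gateway proxy integration; the greeting logic and the `name` parameter are made up for illustration, but the event and response shapes follow the proxy-integration contract (string `body`, integer `statusCode`):

```python
import json


def handler(event, context):
    # API Gateway (proxy integration) passes the HTTP request as `event`;
    # the return value must include statusCode and a JSON-string body.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

API Gateway maps the returned dict back onto an HTTP response, so the function itself contains no server or routing code.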
The benefits of running databases in the cloud are compelling, but how do you get the data there? In this session we will explore how to use the AWS Database Migration Service and the AWS Schema Conversion Tool to migrate, or continuously replicate, your on-premises databases to AWS.
Speaker: Jarrod Spiga, Solutions Architect, Amazon Web Services
AWS delivers an integrated suite of services that provide everything needed to quickly and easily build and manage a data lake for analytics. AWS-powered data lakes can handle the scale, agility, and flexibility required to combine different types of data and analytics approaches to gain deeper insights, in ways that traditional data silos and data warehouses cannot. In this session, we will show you how you can quickly build a data lake on AWS that ingests, catalogs and processes incoming data and makes it ready for analysis. Using a live demo, we demonstrate the capabilities of AWS provided analytical services such as AWS Glue, Amazon Athena and Amazon EMR and how to build a Data Lake on AWS step-by-step.
(DAT303) Oracle on AWS and Amazon RDS: Secure, Fast, and Scalable | Amazon Web Services
AWS and Amazon RDS provide advanced features and architectures that enable graceful migration, high performance, elastic scaling, and high availability for Oracle database workloads. Learn best practices for realizing the benefits of the cloud while reducing costs, by running Oracle on AWS in a variety of single- and multi-instance topologies. This session teaches you to take advantage of features unique to AWS and Amazon RDS to free your databases from the confines of the conventional data center.
Open API and API Management - Introduction and Comparison of Products: TIBCO ... | Kai Wähner
In October 2014, I had a talk at Jazoon in Zurich, Switzerland: "A New Front for SOA: Open API and API Management as Game Changer"
Open APIs represent the leading edge of a new business model, providing innovative ways for companies to expand brand value and routes to market, and to create new value chains for intellectual property. In the past, SOA strategies mostly targeted internal users; Open APIs mostly target external partners.
This session introduces the concepts of Open API and its challenges and opportunities. API Management will become important in many areas, whether for business-to-business (B2B) or business-to-customer (B2C) communication. Several real-world use cases will show how to gain leverage through API Management. The session closes by showing and comparing API management products from vendors such as TIBCO API Exchange, IBM, Apigee, 3scale, WSO2, MuleSoft, Mashery, Layer 7, and Vordel.
(DEV203) Amazon API Gateway & AWS Lambda to Build Secure APIs | Amazon Web Services
Amazon API Gateway is a fully managed service that makes it easy for developers to create, deploy, secure, and monitor APIs at any scale. In this presentation, you’ll find out how to quickly declare an API interface and connect it with code running on AWS Lambda. Amazon API Gateway handles all of the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls, including traffic management, authorization and access control, monitoring, and API version management. We will demonstrate how to build an API that uses AWS Identity and Access Management (IAM) for authorization and Amazon Cognito to retrieve temporary credentials for your API calls. We will write the AWS Lambda function code in Java and build an iOS sample application in Objective C.
Amazon Athena is a new serverless query service that makes it easy to analyze data in Amazon S3 using standard SQL. With Athena, there is no infrastructure to set up or manage, and you can start analyzing your data immediately. You don’t even need to load your data into Athena; it works directly with data stored in S3.
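As an illustration of this "query S3 in place" workflow, the sketch below builds the arguments for Athena's StartQueryExecution API. The table, bucket, and query are assumptions for the example, and the boto3 calls are left commented out because they require AWS credentials:

```python
# Illustrative query over an external table assumed to be defined on S3 data.
QUERY = """
SELECT status, COUNT(*) AS hits
FROM access_logs
GROUP BY status
ORDER BY hits DESC
"""


def build_start_args(query: str, database: str, output_s3: str) -> dict:
    """Build the keyword arguments for athena.start_query_execution (boto3)."""
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": output_s3},
    }


# With credentials configured, the call itself would look like:
# import boto3
# athena = boto3.client("athena")
# qid = athena.start_query_execution(
#     **build_start_args(QUERY, "default", "s3://my-bucket/athena-results/")
# )
```

Athena writes query results back to the S3 output location, so no cluster or loading step is involved.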
Cloud migrations are hardly one size fits all. It can be challenging to migrate from a large-scale data center to an optimized AWS environment without draining IT resources. By leveraging CSC, organizations are able to determine exactly what they need from their IT infrastructure and efficiently migrate to a customized cloud environment on AWS that meets those needs. With 400+ AWS certified architects and 30+ experts with AWS professional-level certification, CSC helps organizations experience seamless, results-oriented migrations. Register for the upcoming webinar to hear speakers from CSC and AWS discuss the ins and outs of a successful large-scale migration to AWS.
Join us to learn:
How CSC helped a large federal systems integration company migrate their workloads to the AWS Cloud in less than three months
How CSC has helped customers split from their shared IT environments in less than three months
The step-by-step process of an efficient data center migration
Who Should Attend:
IT Manager, IT Security Manager, Solution Architect, Cloud App Architect, System Administrator, IT Project Manager, Product Manager, Business Development
[AWS re:Invent 2022 re:Cap for financial customers] 3. AWS re:Invent 2022 Technical Highlights... | AWS Korea Financial Services Team
AWS re:Invent 2022 Technical Highlights: Innovation continues.
We summarize the key services announced at AWS re:Invent 2022 that are most useful in the financial sector. Continuous innovation has never been more important for surviving in a rapidly changing market. This session covers the technical details of the IT innovation that AWS is driving.
송규호, Solutions Architect, AWS
AWS offers a variety of data migration services and tools to help you easily and rapidly move everything from gigabytes to petabytes of data. We can provide guidance and methodologies to help you find the right service or tool to fit your requirements, and we share examples of customers who have used these options in their cloud journey.
Introduction to Amazon Kinesis Firehose - AWS August Webinar Series | Amazon Web Services
Streaming data applications can deliver compelling, near real-time user experiences, but building the back-end infrastructure to collect and process streaming data is difficult. Amazon Kinesis Firehose makes it easy for you to load streaming data into AWS without having to build custom stream processing applications. In this webinar, we will introduce Amazon Kinesis Firehose and discuss how to ingest streaming data into Amazon S3, Amazon Redshift, and Amazon Elasticsearch Service. We will also highlight key use cases based on real-world examples from IoT, AdTech, E-Commerce, and Gaming.
Join us to:
• Get an introduction to streaming data and an overview of Amazon Kinesis Firehose
• Learn about common streaming data use cases from IoT, Ad Tech, E-Commerce, and Gaming
• Understand how to use Amazon Kinesis Firehose to load streaming data into Amazon S3, Amazon Redshift, and Amazon Elasticsearch Service
Who should attend: Developers, data analysts, data engineers, architects
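One practical detail when loading streams in bulk: Firehose's PutRecordBatch call caps each request (up to 500 records and roughly 4 MiB per call). A small helper that chunks encoded records to respect those caps might look like the sketch below; check the current service quotas before relying on the exact numbers:

```python
MAX_BATCH_RECORDS = 500        # PutRecordBatch record-count limit
MAX_BATCH_BYTES = 4 * 1024**2  # ~4 MiB per call


def batch_records(records: list[bytes]) -> list[list[bytes]]:
    """Group encoded records into batches that respect PutRecordBatch limits."""
    batches, current, size = [], [], 0
    for rec in records:
        # Start a new batch if adding this record would exceed either limit.
        if current and (len(current) >= MAX_BATCH_RECORDS or size + len(rec) > MAX_BATCH_BYTES):
            batches.append(current)
            current, size = [], 0
        current.append(rec)
        size += len(rec)
    if current:
        batches.append(current)
    return batches
```

Each batch would then be sent with `firehose.put_record_batch(...)` via boto3; the batching logic itself is independent of the client.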
AWS re:Invent 2016: Deep Dive on AWS Cloud Data Migration Services (ENT210) | Amazon Web Services
When evaluating and planning migrating your data from on premises to the Cloud, you might encounter physical limitations. Amazon offers a suite of tools to help you surmount these limitations by moving data using networks, roads, and technology partners. In this session, we discuss how to move large amounts of data into and out of the Cloud in batches, increments, and streams.
AWS Data Transfer Services: Data Ingest Strategies Into the AWS Cloud | Amazon Web Services
Different types and sizes of data require different strategies. In this session, learn about the various features and services available for migrating data, be it small ongoing transactional data or large multi-petabyte volumes. Come learn how customers are using the latest network, streaming, and large-scale ingest features for their cloud data migrations to AWS storage services.
Cloud Data Migration Strategies - AWS May 2016 Webinar Series | Amazon Web Services
AWS offers a variety of methods to migrate your data into the cloud. You may want to perform regular backups, start collecting device streams, migrate a single large datastore, or simply establish dedicated connectivity and figure out what to do next. Which AWS cloud data migration offering is right for your needs?
This webinar will give you an overview of the six data migration tools we offer, including the strengths and weaknesses of each, as well as their complementary opportunities.
Learning Objectives:
• An overview of cloud data migration
• The basics of the six services (Direct Connect, Storage Gateway, Snowball, Transfer Acceleration, Firehose, 3rd party partners)
• An overview of the Amazon Content Distribution network and how it can help with long distance transfers into and out of the cloud
• Special emphasis on the new Amazon S3 Transfer Acceleration feature
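The six services above cover distinct scenarios. A toy lookup distilled from the descriptions in this document can make the mapping concrete; the pairings are simplifications for illustration, not official AWS guidance:

```python
# Service-to-scenario map distilled from the session descriptions above.
MIGRATION_OPTIONS = {
    "Direct Connect": "dedicated private link for consistent, ongoing transfers",
    "Storage Gateway": "hybrid access: cache hot data locally, store all data in AWS",
    "Snowball": "ship petabyte-scale datasets on physical devices",
    "Transfer Acceleration": "speed up S3 uploads via edge locations",
    "Kinesis Firehose": "continuously load device and event streams",
    "3rd party partners": "specialized or managed migration tooling",
}


def options_for(need: str) -> list[str]:
    """Naive keyword lookup over the scenario descriptions."""
    return [svc for svc, use in MIGRATION_OPTIONS.items() if need.lower() in use.lower()]
```

This is just a reading aid: the real decision depends on data volume, available bandwidth, and whether the transfer is one-off or ongoing.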
AWS Data Transfer Services - AWS Storage Gateway, AWS Snowball, AWS Snowball ... | Amazon Web Services
AWS offers a suite of tools to help you surmount the limitations associated with migrating data from on-premises environments to the cloud. Attend this session to learn about moving data by using networks, roads, and AWS technology partners. We will also discuss how to move data into and out of the Cloud in batches, increments, and streams.
• Overview of AWS Storage Services including block, file, and object
• AWS data migration tools and approaches
• Description of AWS data migration programs aimed at accelerating your journey to the cloud
Data migration at petabyte scale is now a simple service from AWS. You can easily migrate large volumes of data from on-premises environments to the cloud, quickly get started with the cloud as a backup target, or burst workloads between your on-premises environments and the AWS Cloud. Learn about AWS Snowball, AWS Snowball Edge, AWS Snowmobile and AWS Storage Gateway, and understand which one is the right fit for your requirements. We will go through customer use cases, review the different applications used, and help you cut IT spend and management time on hardware and backup solutions.
AWS as a Data Platform for Cloud and On-Premises Workloads | AWS Public Secto... | Amazon Web Services
This session discusses the set of data services that AWS offers for managing all types of data, including files, objects, databases, and data warehouses. We will discuss use cases for each AWS data service, including unique capabilities that the cloud enables and hybrid scenarios for integrating and migrating on-premises data to AWS. This session discusses Amazon S3, AWS Storage Gateway, Amazon EBS, Amazon RDS, Amazon Redshift, and native databases running on AWS. It also covers some of the key data and storage capabilities provided by AWS partners, and considerations for integrating with and migrating enterprise data to the cloud.
Storage is the clearest requirement for digital media. The AWS Cloud has customized solutions that cater to digital media storage and present an array of options to ingest, store, and move digital media, using the Cloud as a transport and storage mechanism.
Erik Durand, Principal Business Development Manager for AWS Storage, walks us through the options, benefits, and characteristics of each one.
Presented during the AWS Media and Entertainment Symposium in Toronto
Getting Started with the Hybrid Cloud: Enterprise Backup and Recovery | Amazon Web Services
This session is for architects and storage admins seeking simple and non-disruptive ways to adopt cloud platforms in their organizations. You will learn how to deliver lower costs and greater scale with nearly seamless integration into your existing B&R processes. Services mentioned: S3, Glacier, Snowball, 3rd party partners, storage gateway, and ingestion services.
AWS January 2016 Webinar Series - Cloud Data Migration: 6 Strategies for Gett... | Amazon Web Services
AWS offers a variety of methods to migrate your data into the cloud. You may want to begin performing regular backups, start collecting device streams, migrate a large datastore once, or just gain dedicated connectivity and then figure out what to do next. How do you know which option works best with your architecture(s)?
This webinar will give you an overview of the six data migration tools we offer, including the strengths and weaknesses of each, as well as their complementary opportunities.
Learning Objectives:
Gain an overview of cloud data migration
Learn the basics of the six transfer services (Direct Connect, Gateway, Snowball, Disk transfer, Firehose, 3rd party partners)
Understand the strengths and weaknesses of each service, and opportunities to layer them together
Who Should Attend:
Developers, IT professionals, and storage and backup administrators who are familiar with the concept of cloud storage but unsure how to move their data in effectively.
Learn the basics of getting started with AWS and migrating your data to AWS. This session will also cover core AWS services, such as Amazon EC2 and Amazon S3, and provide demonstrations of how to set up and utilize those services to launch virtual machines in the cloud.
AWS Services Overview and Quarterly Update - April 2017 AWS Online Tech Talks | Amazon Web Services
Learning Objectives:
• Overview of AWS New & Existing Services
• Advice for Getting Started
Join the “AWS Services Overview and Quarterly Update” webinar to take a fast-paced 45-minute tour through our broad range of new and existing services. We will also provide an update so you can review and catch up on the biggest updates from the past quarter. During the webinar, you will have the opportunity to propose questions for the live Q&A session following the presentation.
How to Build Forecasting Services Using ML and Deep Learn... Algorithms | Amazon Web Services
Forecasting is an important process for many companies and is used in many areas to try to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a time component and then use an algorithm that, based on the type of data analyzed, produces an accurate forecast.
Big Data for Startups: How to Build Big Data Applications in Server... | Amazon Web Services
The variety and volume of data created every day is growing faster and faster, and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters appears to be an investment accessible only to established companies. But the elasticity of the Cloud, and Serverless services in particular, let us break through these limits.
We will therefore see how to develop Big Data applications quickly, without worrying about infrastructure, dedicating all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing the pace of innovation. Over this period we learned how changing our approach to application development allowed us to greatly increase agility and release velocity and, ultimately, to build more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not only application architecture, but also organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
Come spendere fino al 90% in meno con i container e le istanze spot Amazon Web Services
L’utilizzo dei container è in continua crescita.
Se correttamente disegnate, le applicazioni basate su Container sono molto spesso stateless e flessibili.
I servizi AWS ECS, EKS e Kubernetes su EC2 possono sfruttare le istanze Spot, portando ad un risparmio medio del 70% rispetto alle istanze On Demand. In questa sessione scopriremo insieme quali sono le caratteristiche delle istanze Spot e come possono essere utilizzate facilmente su AWS. Impareremo inoltre come Spreaker sfrutta le istanze spot per eseguire applicazioni di diverso tipo, in produzione, ad una frazione del costo on-demand!
In recent months, many customers have been asking us the question – how to monetise Open APIs, simplify Fintech integrations and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to Open Finance marketplace presentation on October 20th.
Event Agenda :
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Rendi unica l’offerta della tua startup sul mercato con i servizi Machine Lea...Amazon Web Services
Per creare valore e costruire una propria offerta differenziante e riconoscibile, le startup di successo sanno come combinare tecnologie consolidate con componenti innovativi creati ad hoc.
AWS fornisce servizi pronti all'utilizzo e, allo stesso tempo, permette di personalizzare e creare gli elementi differenzianti della propria offerta.
Concentrandoci sulle tecnologie di Machine Learning, vedremo come selezionare i servizi di intelligenza artificiale offerti da AWS e, anche attraverso una demo, come costruire modelli di Machine Learning personalizzati utilizzando SageMaker Studio.
OpsWorks Configuration Management: automatizza la gestione e i deployment del...Amazon Web Services
Con l'approccio tradizionale al mondo IT per molti anni è stato difficile implementare tecniche di DevOps, che finora spesso hanno previsto attività manuali portando di tanto in tanto a dei downtime degli applicativi interrompendo l'operatività dell'utente. Con l'avvento del cloud, le tecniche di DevOps sono ormai a portata di tutti a basso costo per qualsiasi genere di workload, garantendo maggiore affidabilità del sistema e risultando in dei significativi miglioramenti della business continuity.
AWS mette a disposizione AWS OpsWork come strumento di Configuration Management che mira ad automatizzare e semplificare la gestione e i deployment delle istanze EC2 per mezzo di workload Chef e Puppet.
Scopri come sfruttare AWS OpsWork a garanzia e affidabilità del tuo applicativo installato su Instanze EC2.
Microsoft Active Directory su AWS per supportare i tuoi Windows WorkloadsAmazon Web Services
Vuoi conoscere le opzioni per eseguire Microsoft Active Directory su AWS? Quando si spostano carichi di lavoro Microsoft in AWS, è importante considerare come distribuire Microsoft Active Directory per supportare la gestione, l'autenticazione e l'autorizzazione dei criteri di gruppo. In questa sessione, discuteremo le opzioni per la distribuzione di Microsoft Active Directory su AWS, incluso AWS Directory Service per Microsoft Active Directory e la distribuzione di Active Directory su Windows su Amazon Elastic Compute Cloud (Amazon EC2). Trattiamo argomenti quali l'integrazione del tuo ambiente Microsoft Active Directory locale nel cloud e l'utilizzo di applicazioni SaaS, come Office 365, con AWS Single Sign-On.
Dal riconoscimento facciale al riconoscimento di frodi o difetti di fabbricazione, l'analisi di immagini e video che sfruttano tecniche di intelligenza artificiale, si stanno evolvendo e raffinando a ritmi elevati. In questo webinar esploreremo le possibilità messe a disposizione dai servizi AWS per applicare lo stato dell'arte delle tecniche di computer vision a scenari reali.
Amazon Web Services e VMware organizzano un evento virtuale gratuito il prossimo mercoledì 14 Ottobre dalle 12:00 alle 13:00 dedicato a VMware Cloud ™ on AWS, il servizio on demand che consente di eseguire applicazioni in ambienti cloud basati su VMware vSphere® e di accedere ad una vasta gamma di servizi AWS, sfruttando a pieno le potenzialità del cloud AWS e tutelando gli investimenti VMware esistenti.
Molte organizzazioni sfruttano i vantaggi del cloud migrando i propri carichi di lavoro Oracle e assicurandosi notevoli vantaggi in termini di agilità ed efficienza dei costi.
La migrazione di questi carichi di lavoro, può creare complessità durante la modernizzazione e il refactoring delle applicazioni e a questo si possono aggiungere rischi di prestazione che possono essere introdotti quando si spostano le applicazioni dai data center locali.
Crea la tua prima serverless ledger-based app con QLDB e NodeJSAmazon Web Services
Molte aziende oggi, costruiscono applicazioni con funzionalità di tipo ledger ad esempio per verificare lo storico di accrediti o addebiti nelle transazioni bancarie o ancora per tenere traccia del flusso supply chain dei propri prodotti.
Alla base di queste soluzioni ci sono i database ledger che permettono di avere un log delle transazioni trasparente, immutabile e crittograficamente verificabile, ma sono strumenti complessi e onerosi da gestire.
Amazon QLDB elimina la necessità di costruire sistemi personalizzati e complessi fornendo un database ledger serverless completamente gestito.
In questa sessione scopriremo come realizzare un'applicazione serverless completa che utilizzi le funzionalità di QLDB.
Con l’ascesa delle architetture di microservizi e delle ricche applicazioni mobili e Web, le API sono più importanti che mai per offrire agli utenti finali una user experience eccezionale. In questa sessione impareremo come affrontare le moderne sfide di progettazione delle API con GraphQL, un linguaggio di query API open source utilizzato da Facebook, Amazon e altro e come utilizzare AWS AppSync, un servizio GraphQL serverless gestito su AWS. Approfondiremo diversi scenari, comprendendo come AppSync può aiutare a risolvere questi casi d’uso creando API moderne con funzionalità di aggiornamento dati in tempo reale e offline.
Inoltre, impareremo come Sky Italia utilizza AWS AppSync per fornire aggiornamenti sportivi in tempo reale agli utenti del proprio portale web.
Database Oracle e VMware Cloud™ on AWS: i miti da sfatareAmazon Web Services
Molte organizzazioni sfruttano i vantaggi del cloud migrando i propri carichi di lavoro Oracle e assicurandosi notevoli vantaggi in termini di agilità ed efficienza dei costi.
La migrazione di questi carichi di lavoro, può creare complessità durante la modernizzazione e il refactoring delle applicazioni e a questo si possono aggiungere rischi di prestazione che possono essere introdotti quando si spostano le applicazioni dai data center locali.
In queste slide, gli esperti AWS e VMware presentano semplici e pratici accorgimenti per facilitare e semplificare la migrazione dei carichi di lavoro Oracle accelerando la trasformazione verso il cloud, approfondiranno l’architettura e dimostreranno come sfruttare a pieno le potenzialità di VMware Cloud ™ on AWS.
Amazon Elastic Container Service (Amazon ECS) è un servizio di gestione dei container altamente scalabile, che semplifica la gestione dei contenitori Docker attraverso un layer di orchestrazione per il controllo del deployment e del relativo lifecycle. In questa sessione presenteremo le principali caratteristiche del servizio, le architetture di riferimento per i differenti carichi di lavoro e i semplici passi necessari per poter velocemente migrare uno o più dei tuo container.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio, cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors, and newer malware including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
5. What is Internet/VPN?
• Globally available
• Default method of ingesting content into Amazon S3
• Simple standards-based (HTTP or HTTPS) connection
• Use your existing internet connection
• Available in a VPC for VPN connectivity
• Acceleration through multipart upload
• Data transfer into AWS is free
• VPN connections using the VPC virtual private gateway:
  - $0.05 per VPN connection-hour
  - $0.048 per VPN connection-hour for connections to the Tokyo region
6. How does Internet/VPN ingest work?
• Ingest data directly into S3 buckets with existing internet connectivity, through the console or API
• Accelerate data transfer using multipart upload
[Diagram: data flows over the Internet, and over the Internet through VPN + VPC (customer gateway, VPN connection, VPC endpoints), into an S3 bucket in an AWS Region]
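The multipart-upload acceleration mentioned above splits a large object into parts that upload in parallel. A minimal sketch of the part-sizing arithmetic, assuming S3's documented multipart limits (5 MiB minimum part size, at most 10,000 parts per upload); the function name is illustrative:

```python
import math

# S3 multipart upload constraints (documented service limits)
MIN_PART_SIZE = 5 * 1024 * 1024   # 5 MiB minimum (except the last part)
MAX_PARTS = 10_000                # at most 10,000 parts per upload

def choose_part_size(object_size: int, preferred: int = 8 * 1024 * 1024) -> int:
    """Pick a part size that honors S3's multipart limits.

    Starts from a preferred part size and doubles it until the object
    fits within MAX_PARTS.
    """
    part_size = max(preferred, MIN_PART_SIZE)
    while math.ceil(object_size / part_size) > MAX_PARTS:
        part_size *= 2
    return part_size

# A 1 TiB object at an 8 MiB preferred size would need 131,072 parts,
# so the part size grows to 128 MiB (8,192 parts).
size = 1 * 1024**4
part = choose_part_size(size)
print(part // (1024 * 1024), "MiB parts,", math.ceil(size / part), "parts total")
```

In practice the AWS SDKs do this for you; for example, boto3's `TransferConfig` exposes `multipart_threshold` and `multipart_chunksize` and parallelizes the part uploads automatically.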
8. Ingest & egress with S3 transfer acceleration
• Uses AWS's 55 global edge locations
• AWS determines the best edge location
• Data transfer is optimized between edge and customer, and between edge and S3
• Data is not stored on the edge cache
[Diagram: uploader → AWS edge location → S3 bucket, with optimized throughput]
9. Customers: Frame.io, Hudl, Viocorp
Accelerating media content uploads to their platforms
Problem Statement:
• Needed to accelerate customer content ingest into their respective applications running on AWS
• Existing ingest options were proprietary and too expensive
Use of AWS:
• S3 and S3 transfer acceleration for massively scalable ingest
• S3 for storage; CloudFront and S3 transfer acceleration for ingest
Business Benefits:
• Global, highly distributed data transport available on demand
• Massive scalability and elasticity
• Lower TCO for storage and data transport infrastructure
10. Service traffic flow
Client to S3 bucket example:
1. The customer client resolves b1.s3-accelerate.amazonaws.com through Amazon Route 53, which returns the best AWS edge location.
2. The client sends an HTTPS PUT/POST of upload_files.zip to that edge location.
3. An EC2 proxy at the edge location forwards the HTTP/S PUT/POST of “upload_files.zip” over the AWS network.
4. The data lands in the S3 bucket in the AWS region.
Data is not cached on the AWS edge location. Fully managed file transfer acceleration using all AWS edge locations.
11. Using the service is as easy as 1, 2, 3…
1. Enable the service in the AWS Management Console, or grant permissions through the API (s3:PutAccelerateConfiguration)
2. Update your application to point to the new S3 URL:
   • Change “bucket.s3.amazonaws.com” to “<bucket-name>.s3-accelerate.amazonaws.com”
   • The original bucket location and contents stay the same; only the namespace changes
3. Start uploading data to Amazon S3
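The endpoint rename described above can be sketched as a small helper (the function name is illustrative; it assumes the application keeps its S3 endpoint as a plain hostname string):

```python
def to_accelerate_endpoint(host: str) -> str:
    """Rewrite a standard S3 virtual-hosted endpoint to its
    transfer-acceleration equivalent.

    'my-bucket.s3.amazonaws.com' -> 'my-bucket.s3-accelerate.amazonaws.com'
    """
    suffix = ".s3.amazonaws.com"
    if not host.endswith(suffix):
        raise ValueError(f"not a standard S3 virtual-hosted endpoint: {host}")
    bucket = host[: -len(suffix)]
    return f"{bucket}.s3-accelerate.amazonaws.com"

print(to_accelerate_endpoint("my-bucket.s3.amazonaws.com"))
```

With the AWS SDKs you would normally not rewrite URLs by hand; boto3, for example, accepts `Config(s3={'use_accelerate_endpoint': True})` once acceleration has been enabled on the bucket.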
12. How fast is S3 transfer acceleration?
[Chart: time in hours for a 500 GB upload to a bucket in Singapore, public internet vs. S3 transfer acceleration, from edge locations in Rio de Janeiro, Warsaw, New York, Atlanta, Madrid, Virginia, Melbourne, Paris, Los Angeles, Seattle, Tokyo, and Singapore]
14. Pricing*
Dimension | Price/GB
Data transfer in from Internet** | $0.04 (edge location in US, EU, JP); $0.08 (edge location in rest of the world)
Data transfer out to Internet | $0.04
Data transfer out to another AWS region | $0.04
Amazon S3 charges | Standard data transfer charges apply
*Plus standard Amazon S3 data transfer charges apply
**You pay only for accelerated performance; if the transfer is not accelerated, there is no bandwidth charge
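As a worked example against the per-GB prices quoted in this deck (the helper is hypothetical and ignores the standard S3 charges that apply on top):

```python
# Per-GB S3 transfer acceleration prices as quoted on the slide.
IN_US_EU_JP = 0.04       # $/GB, inbound via a US/EU/JP edge location
IN_REST_OF_WORLD = 0.08  # $/GB, inbound via other edge locations

def accelerated_ingest_cost(gb: float, us_eu_jp_edge: bool = True) -> float:
    """Estimate the transfer-acceleration surcharge for an upload.

    Standard Amazon S3 storage and request charges still apply.
    """
    rate = IN_US_EU_JP if us_eu_jp_edge else IN_REST_OF_WORLD
    return gb * rate

# Uploading 500 GB through a US edge location: 500 * $0.04 = $20.00
print(f"${accelerated_ingest_cost(500):.2f}")
```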
16. What is AWS Direct Connect?
• Dedicated 1 GE or 10 GE private pipes into AWS
• Create private (VPC) or public virtual interfaces to AWS
• Reduced data-out rates (data-in still free)
• Consistent network performance
• At least one location for each AWS region
• Option for redundant connections
• Uses BGP to exchange routing information over a VLAN
17. Physical connection
• Cross-connect at the location
• Single-mode optical fiber (1000BASE-LX or 10GBASE-LR)
• Potential onward delivery through a Direct Connect partner
• Customer router
18. At the Direct Connect location
[Diagram: the customer's corporate network connects to a customer router colocated at the DX location; a cross-connect at the demarcation point links the customer router to the AWS Direct Connect routers, which connect to the AWS backbone network]
19. Dedicated port through Direct Connect partner
[Diagram: the customer's router connects over an access circuit through the partner network to partner equipment colocated at the DX location; a cross-connect at the demarcation point links the partner equipment to the AWS Direct Connect routers and on to the AWS backbone network]
20. Direct Connect Locations
AWS Region | AWS Direct Connect Location
Asia Pacific (Singapore) | Equinix SG2; GPX, Mumbai
Asia Pacific (Seoul) | KINX, Seoul
Asia Pacific (Sydney) | Equinix SY3; Global Switch
Asia Pacific (Tokyo) | Equinix OS1; Equinix TY2
China (Beijing) | Sinnet JiuXianqiao IDC; CIDS Jiachuang IDC
EU (Frankfurt) | Equinix FR5; Interxion Frankfurt
EU (Ireland) | TelecityGroup, London Docklands; Eircom Clonshaugh; Equinix LD4 - LD6, London
South America (Sao Paulo) | Terremark NAP do Brasil; Tivit
US East (Virginia) | CoreSite NY1 & NY2; Equinix DC1 - DC6 & DC10
US West (Northern California) | CoreSite One Wilshire & 900 North Alameda, CA; Equinix SV1 & SV5
US West (Oregon) | Equinix SE2 & SE3; Switch SUPERNAP, Las Vegas
AWS GovCloud (US) | Equinix SV1 & SV5
22. What is AWS Snowball?
• Petabyte-scale data transport
• 80 TB capacity, 10 GE network
• E-ink shipping label
• Ruggedized case (withstands an 8.5G impact)
• Rain- and dust-resistant
• Tamper-resistant case and electronics
• All data encrypted end-to-end
23. Customer: Scripps Networks Interactive
Active archive transport and archival for digital content provider
Problem Statement:
• Need a storage platform to manage active archive content
• Existing content repository too large to migrate via available network-based ingest methods
Use of AWS:
• S3 and Snowball for massively scalable ingest
• S3 for storage, Glacier for content archive
• Snowball to securely transport existing media content from on-premises storage and tape vault
Business Benefits:
• Petabyte-scale data transport without increased network costs
• Massive scalability and elasticity
• Lower TCO for active archive storage
24. Customer: DevFactory
Accelerated datacenter decommissioning for software provider
Problem Statement:
• Acquiring new companies at a rapid pace
• Needed a faster and more cost-effective data ingest process into their platform
Use of AWS:
• S3 and Snowball for massively scalable ingest
• Transferred thousands of virtual server images to EC2
• Total data migrated to date with Snowball amounts to over 1 PB
Business Benefits:
• Can migrate new customers to the DevFactory environment up to 60 percent faster than before by using AWS Snowball
• Significant reduction in migration and ongoing operational costs
26. How fast is Snowball?
• Less than 1 day to transfer 200 TB via 3x10G connections with 3 Snowballs; less than 1 week including shipping
• Number of days to transfer 200 TB via the Internet at typical utilizations:

Utilization | 1Gbps | 500Mbps | 300Mbps | 150Mbps
25% | 71 | 141 | 236 | 471
50% | 36 | 71 | 118 | 236
75% | 24 | 47 | 79 | 157
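The comparison comes down to simple bandwidth arithmetic. A rough estimator (a sketch assuming decimal units and no protocol overhead, so it lands near, but not exactly on, the slide's figures):

```python
def days_to_transfer(terabytes: float, link_mbps: float, utilization: float) -> float:
    """Days needed to move `terabytes` over a link sustained at the
    given utilization (0-1). Assumes 1 TB = 10**12 bytes."""
    bits = terabytes * 1e12 * 8
    seconds = bits / (link_mbps * 1e6 * utilization)
    return seconds / 86_400

# 200 TB over a 1 Gbps link at 25% utilization: roughly 74 days,
# in the same range as the ~71 days on the slide.
print(round(days_to_transfer(200, 1000, 0.25)))
```

Doubling utilization halves the transfer time, which is why each row of the table is roughly half the row above it.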
27. Pricing
Dimension | Price
Usage charge per job | $250.00 (80 TB)
Extra-day charge (first 10 days* are free) | $15.00
Data transfer in | $0.00/GB
Data transfer out | $0.03/GB
Shipping** | Varies
Amazon S3 charges | Standard storage and request fees apply
* Starts one day after the appliance is delivered to you. The day the appliance is received at your site and the day it is shipped out are also free and not counted against the 10-day free usage time.
** Shipping charges are based on your shipment destination and the shipping option (e.g., overnight, 2-day) you choose.
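Putting the figures quoted in this deck together (a hypothetical estimate; shipping is excluded because it varies, and standard S3 storage/request fees still apply):

```python
# Snowball charges as quoted on the slide.
JOB_CHARGE = 250.00    # per 80 TB Snowball job
EXTRA_DAY = 15.00      # per day beyond the 10 free on-site days
EGRESS_PER_GB = 0.03   # data transfer out of AWS; transfer in is free

def snowball_job_cost(extra_days: int = 0, egress_gb: float = 0.0) -> float:
    """Estimated charge for one Snowball job, excluding shipping and
    standard Amazon S3 storage/request fees."""
    return JOB_CHARGE + extra_days * EXTRA_DAY + egress_gb * EGRESS_PER_GB

# An import job kept 3 days past the free window: $250 + 3 * $15 = $295
print(f"${snowball_job_cost(extra_days=3):.2f}")
```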
29. What is AWS Storage Gateway?
A service connecting an on-premises software appliance with cloud-based storage.
• Works with your existing applications
• Secure and durable storage in AWS
• Low latency for frequently used data
• Scalable and cost-effective on-premises storage: $125 per gateway per month + S3/Amazon Glacier storage fees
30. Common uses for AWS Storage Gateway
Backup and archive
Disaster recovery
Data migration
31. How does AWS Storage Gateway work?
[Diagram: an application server on the customer premises talks to the AWS Storage Gateway appliance, which moves data over the Internet, AWS Direct Connect, or S3 transfer acceleration to the AWS Storage Gateway back end, landing in Amazon S3, Amazon Glacier, and Amazon EBS snapshots]
32. AWS Storage Gateway configurations
• Gateway-cached volumes (iSCSI block storage): low latency for frequently used data, with all data stored in AWS
• Gateway-stored volumes (iSCSI block storage): low latency for all your data, with point-in-time backups to AWS
• Gateway virtual tape library (VTL) (iSCSI virtual tape storage): replacement for on-premises physical tape infrastructure for backup and archive
33. Gateway-virtual tape library (VTL)
• Replace or augment your aging tape infrastructure with durable object storage
• Virtual tapes stored in AWS, frequently accessed data cached on-premises
• Up to 1,500 tapes, up to 2.5 TB each, for up to 150 TB per gateway-VTL
• Unlimited number of tapes in the virtual tape shelf (VTS)
[Diagram: in the customer data center, a backup server (iSCSI initiator) connects to the AWS Storage Gateway VM, which exposes a media changer and tape drives and maintains an upload buffer and cache storage; the gateway talks to the AWS Storage Gateway service, with gateway-VTL storage backed by Amazon S3 and VTS storage backed by Amazon Glacier]
35. Amazon storage partner ecosystem
• Gateway/NAS
• Data management
• Sync and share
• Backup/DR
• Content and acceleration
• Archive
• File system
36. Backup to AWS approaches
[Diagram: two on-premises patterns: (1) application servers back up to a media server and a cloud gateway with local disk; (2) application servers back up through backup software with a cloud connector to a media server with local disk. Both paths move data over the Internet or AWS Direct Connect into Amazon S3, Amazon S3-IA, and Amazon Glacier]
37. CommVault ties together on-premises and cloud-data strategies
Commvault orchestrates the enterprise:
• Back up in the cloud: keep backups of cloud workloads internal to the cloud
• Back up to the cloud: let on-premises workloads leverage AWS
• Disaster recovery to the cloud: automate disaster recovery to the cloud on a scheduled basis
• Workload portability: move virtual servers from on-premises to the cloud and back, keeping your data available wherever you need it
• Archiving to the cloud: move legacy data to tier-2 storage in the cloud for long-term archive
Together, AWS and Commvault minimize networking, storage, and infrastructure costs while providing the business a sound data protection and disaster-recovery strategy.
39. NetApp AltaVault: backup from on-premises to S3/Amazon Glacier
Solve backup and archive headaches with cloud-integrated storage:
• 90% reduction in time, cost, and data volumes
• Shrink recovery times from days to minutes
• 85% of backup and software providers supported
• Seamlessly integrates into existing storage and backup software environments
• Caches recent backups locally, vaults older copies to the cloud
• Deduplicates, compresses, and encrypts data stored in AWS
• Also available on AWS Marketplace to protect cloud-native workloads
Common backup applications integrated with AltaVault: NetApp SnapProtect, Arcserve, CommVault Simpana, EMC NetWorker, HP Data Protector, IBM Tivoli Storage Manager, Symantec Backup Exec, Symantec NetBackup, Veeam, Microsoft SQL Server, Oracle RMAN
[Diagram: on-premises NetApp FAS, E-Series, and non-NetApp storage feed the NetApp AltaVault cloud-integrated storage appliance, which stores data in AWS (S3 and Amazon Glacier)]
40. Summary: When to use each service
IF YOU NEED | CONSIDER
An optimized or replacement Internet connection to:
• connect directly into an AWS regional data center | Direct Connect
• migrate TB or PB of data to the cloud | Snowball
• accelerate data transfer | S3 Transfer Acceleration, AWS Partner
A friendly interface into S3 to:
• cache data locally in a hybrid model (for performance reasons) | Storage Gateway, AWS partner
• redirect backups or archives with minimal disruption | Storage Gateway, AWS partner