In this session, storage experts will walk you through Amazon S3 and Amazon Glacier, bulk data repositories that can deliver 99.999999999% durability and scale past trillions of objects worldwide - with cost points competitive against tape archives. Learn about the different ways you can accelerate data transfer into S3 and get a close look at new tools to secure and manage your data more efficiently. See how Amazon Athena runs "query in place" analytics on your data and hear about the new expedited and bulk retrievals from Amazon Glacier. Learn how AWS customers have built solutions that turn their data from a cost into a strategic asset, and bring your toughest questions straight to our experts. Learn More: https://aws.amazon.com/government-education/
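The "query in place" idea above means Athena reads data directly from S3 without any load step. As a minimal sketch (bucket, database, and table names here are hypothetical placeholders, not from the session), the parameters for a query look like this; with credentials configured you would pass them to `boto3.client("athena").start_query_execution(**params)`:

```python
# Sketch: "query in place" with Amazon Athena over data already in S3.
# All names below are hypothetical placeholders.
def build_athena_request(query, database, results_bucket):
    """Build the parameters for athena.start_query_execution()."""
    return {
        "QueryString": query,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {
            # Athena writes result files to this S3 location
            "OutputLocation": f"s3://{results_bucket}/athena-results/"
        },
    }

params = build_athena_request(
    "SELECT status, COUNT(*) FROM access_logs GROUP BY status",
    database="weblogs",
    results_bucket="example-results-bucket",
)
# boto3.client("athena").start_query_execution(**params)
```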
Active Archiving with Amazon S3 and Tiering to Amazon Glacier - March 2017 AW... | Amazon Web Services
Most organizations have data that they need to retain but access infrequently, if ever. In cases where this data needs to be accessible at a moment’s notice, it’s hard to save money by moving to archival storage, because access times on those platforms are slower. Now, customers are using Amazon S3 & Glacier for “Active Archiving” to reduce storage costs while maintaining the flexibility of instant access. In this tech talk, we’ll show you how to implement Active Archiving with AWS Object Storage services, and we’ll provide some real-world examples of how AWS customers are saving money with these capabilities today.
Learning Outcomes:
• Define Active Archiving, and understand how it is different from traditional cold archiving
• Review the cost modeling tools available to determine if Active Archiving is a good fit for your organization
• Learn about best practices for using AWS Object Storage features & functionality to enable Active Archiving
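In practice, the tiering behind Active Archiving is usually expressed as an S3 lifecycle configuration. A minimal sketch, assuming an illustrative 30-day move to Standard-IA and a 90-day move to Glacier (the bucket name and day counts are placeholders, not recommendations from the talk):

```python
# Sketch of an Active Archiving lifecycle rule: objects transition to
# Standard-IA after 30 days and to the Glacier storage class after 90.
def active_archive_rule(ia_days=30, glacier_days=90):
    return {
        "Rules": [{
            "ID": "active-archive",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # empty prefix = whole bucket
            "Transitions": [
                {"Days": ia_days, "StorageClass": "STANDARD_IA"},
                {"Days": glacier_days, "StorageClass": "GLACIER"},
            ],
        }]
    }

cfg = active_archive_rule()
# With credentials configured you would apply it with:
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-archive-bucket", LifecycleConfiguration=cfg)
```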
Deep Dive On Object Storage: Amazon S3 and Amazon Glacier - AWS PS Summit Can... | Amazon Web Services
Learn about the different ways you can accelerate data transfer into S3 and get a close look at new tools to secure and manage your data more efficiently. Discover how AWS customers have built solutions that turn their data into a strategic asset.
Speakers: Ben Thurgood, Solutions Architect, Amazon Web Services, with Timothy Eckersley, Enterprise Architect, NSW Pathology
Level: 300
AWS Storage Gateway is a service that connects an on-premises software appliance with AWS storage. It simplifies the adoption of cloud-based storage within on-premises environments, giving customers a secure, reliable, and cost-effective alternative to local storage. In this session, we take a detailed look at how to use Storage Gateway to back up and archive on-premises data. We discuss the three types of storage and how to select the right type for your environment. We walk through setup and configuration of the on-premises gateway appliance, data restoration, and daily management, such as monitoring performance and managing storage. The session is intended for customers who perform on-premises backup and archive today, and want to learn how to include cloud storage in their environment.
Not just for archiving or compliance use cases, Amazon Glacier accommodates customers simply looking to replace their on-premises long-term storage with a cost-efficient, durable cloud option, from which they can easily and quickly access their data when they need to. This session will introduce newly launched features for Amazon Glacier, review the current service feature set, and share the global data center shutdown and storage strategy of Sony DADC New Media Solutions (NMS). NMS is Sony’s digital servicing division providing global digital distribution, linear playout, and white-label OTT/commerce solutions for clients such as BBC Worldwide, NBCUniversal, Sony PlayStation, and Funimation Entertainment.
Hear from Andy Shenkler, NMS’s Chief Technology and Solutions Officer, as he talks about the key factors that drove the organization’s decision to move away from tape, toward the cloud, and out of the infrastructure business overall. Learn more about the impact and operational practices inside a world-class digital supply chain as they moved over 20 petabytes of data (over 1M hours of video) to the cloud and never looked back.
SRV403 Deep Dive on Object Storage: Amazon S3 and Amazon Glacier | Amazon Web Services
In this session, storage experts will walk you through Amazon S3 and Amazon Glacier, bulk data repositories that can deliver 99.999999999% durability and scale past trillions of objects worldwide – with cost points competitive against tape archives. Learn about the different ways you can accelerate data transfer into S3 and get a close look at new tools to secure and manage your data more efficiently. Hear about Amazon Glacier and new capabilities to get access to your data faster with expedited retrievals. Learn how AWS customers have built solutions that turn their data from a cost into a strategic asset, and bring your toughest questions straight to our experts.
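The expedited retrievals mentioned above are requested through the S3 `restore_object` call when data sits in the Glacier storage class. A minimal sketch (bucket and key are hypothetical; tiers are `Expedited`, `Standard`, or `Bulk`):

```python
# Sketch: restoring an object from the Glacier storage class with an
# expedited retrieval (typically minutes, vs. hours for Standard).
def build_restore_request(days=1, tier="Expedited"):
    """Build the RestoreRequest parameter for s3.restore_object()."""
    return {
        "Days": days,  # how long the restored copy stays available in S3
        "GlacierJobParameters": {"Tier": tier},
    }

req = build_restore_request(tier="Expedited")
# boto3.client("s3").restore_object(
#     Bucket="example-bucket", Key="archive/video.mp4", RestoreRequest=req)
```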
In this session, we’ll expand on the S3 re:Invent deep-dive session with a hands-on workshop on advanced S3 features and storage management capabilities. We’ll have AWS S3 and Glacier experts on hand to dive deep into S3 architecture; performance and scalability optimization; how to analyze your content and leverage storage tiers (S3 Standard, S3 Standard-Infrequent Access, Glacier) to balance cost and SLAs; security considerations; replication with Cross-Region Replication (CRR); versioning for data protection; and more.
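As a minimal sketch of the CRR setup covered in the workshop (role ARN and bucket names below are hypothetical placeholders; CRR also requires versioning enabled on both buckets):

```python
# Sketch of a Cross-Region Replication (CRR) configuration.
def crr_config(role_arn, dest_bucket_arn):
    return {
        "Role": role_arn,  # IAM role S3 assumes to replicate on your behalf
        "Rules": [{
            "ID": "replicate-all",
            "Status": "Enabled",
            "Prefix": "",  # empty prefix = replicate every key
            "Destination": {"Bucket": dest_bucket_arn},
        }],
    }

cfg = crr_config(
    "arn:aws:iam::123456789012:role/crr-role",
    "arn:aws:s3:::example-dest-bucket",
)
# boto3.client("s3").put_bucket_replication(
#     Bucket="example-source-bucket", ReplicationConfiguration=cfg)
```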
In the hands-on lab, we’ll walk through a customer scenario: architecting a high-performance infrastructure for consumer applications. In the scenario, we’ll use sample data sets on S3, analyze object retrieval patterns and design a complete solution using many of the features S3 offers including migrating objects to an appropriate tier.
Prerequisites:
- Participants should have an AWS account established and available for use during the workshop.
- Please bring your own laptop.
AWS Webcast - Archiving in the Cloud - Best Practices for Amazon Glacier | Amazon Web Services
Join our webinar to learn how to build a cost-effective archive application using Amazon Glacier, an extremely low-cost, secure, highly durable, and easy-to-use storage service in the AWS cloud.
We will explain how Amazon Glacier works and walk through some best practices to get the most out of the service.
We will also highlight how to choose between Amazon Glacier and Amazon S3’s Glacier storage option.
Learn more: http://aws.amazon.com/glacier/
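The choice above boils down to two API shapes: with S3's Glacier storage option you keep the familiar S3 API (keys, lifecycle rules, `restore_object`), while the native Amazon Glacier service works with vaults and archive IDs. A minimal sketch, with hypothetical bucket and vault names:

```python
# Sketch of the two paths: S3's Glacier storage class vs. native Glacier.
def s3_glacier_put(bucket, key, body):
    """put_object parameters that land an object directly in Glacier
    while keeping the S3 key-based API for later listing and restore."""
    return {"Bucket": bucket, "Key": key, "Body": body,
            "StorageClass": "GLACIER"}

params = s3_glacier_put("example-bucket", "backups/2017-03.tar", b"...")
# S3 path:      boto3.client("s3").put_object(**params)
# Native path:  boto3.client("glacier").upload_archive(
#                   vaultName="example-vault", body=b"...")
#               -> returns an opaque archiveId instead of a key
```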
Architectures for HPC and HTC Workloads on AWS | AWS Public Sector Summit 2017 | Amazon Web Services
Researchers and IT professionals using High Performance Computing (HPC) and High Throughput Computing (HTC) need large scale infrastructure in order to move their research forward. Neuroimaging employs a variety of computationally demanding techniques with which to interrogate the structure and function of the living brain. Tara Madhyastha with the University of Washington, Department of Radiology, is demonstrating these methods at scale. This session will provide reference architectures for running your workloads on AWS, enabling you to achieve scale on demand, and reduce your time to science. We will also debunk myths about HPC in the cloud and show techniques for running common on-premises workloads in the cloud. Learn More: https://aws.amazon.com/government-education/
Getting Started with the Hybrid Cloud: Enterprise Backup and Recovery | Amazon Web Services
This session is for architects and storage admins seeking simple and non-disruptive ways to adopt cloud platforms in their organizations. You will learn how to deliver lower costs and greater scale with nearly seamless integration into your existing backup and recovery (B&R) processes. Services mentioned: S3, Glacier, Snowball, 3rd-party partners, Storage Gateway, and ingestion services.
Amazon S3 hosts trillions of objects and is used for storing a wide range of data, from system backups to digital media. In this presentation from the Amazon S3 Masterclass webinar, we explain the features of Amazon S3, from static website hosting through server-side encryption to Amazon Glacier integration. This webinar dives deep into the feature set of Amazon S3 to give a rounded overview of its capabilities, looking at common use cases, APIs, and best practices.
See a recording of this video here on YouTube: http://youtu.be/VC0k-noNwOU
Check out future webinars in the Masterclass series here: http://aws.amazon.com/campaigns/emea/masterclass/
View the Journey Through the Cloud webinar series here: http://aws.amazon.com/campaigns/emea/journey/
This session is for IT pros working with compliance managers to deliver solutions that lower costs and still meet compliance demands. You will learn how to move large scale data stores to the cloud, while remaining compliant with existing regulations. Services mentioned: S3, Glacier and the Vault Lock feature, Snowball, ingestion services.
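The Vault Lock feature mentioned above enforces compliance controls by attaching an immutable policy to a Glacier vault. A minimal sketch of a WORM-style policy that denies archive deletion for a retention period (account ID, region, vault name, and period are hypothetical placeholders; a real lock is initiated, tested, and then completed within 24 hours):

```python
import json

# Sketch of a Glacier Vault Lock policy: deny DeleteArchive until an
# archive is at least `days` old. All identifiers are placeholders.
def vault_lock_policy(account_id, vault_name, days=365):
    return {"Policy": json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "deny-delete-for-retention",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "glacier:DeleteArchive",
            "Resource": f"arn:aws:glacier:us-east-1:{account_id}:vaults/{vault_name}",
            "Condition": {"NumericLessThan":
                          {"glacier:ArchiveAgeInDays": str(days)}},
        }],
    })}

policy = vault_lock_policy("123456789012", "compliance-vault")
# glacier.initiate_vault_lock(vaultName=..., policy=policy)  -> returns lockId
# glacier.complete_vault_lock(vaultName=..., lockId=...)     # within 24h
```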
AWS re:Invent 2016: Big Data Mini Con State of the Union (BDM205) | Amazon Web Services
Join us for this general session where AWS big data experts present an in-depth look at the current state of big data. Learn about the latest big data trends and industry use cases. Hear how other organizations are using the AWS big data platform to innovate and remain competitive. Take a look at some of the most recent AWS big data announcements, as we kick off the Big Data re:Source Mini Con.
Optimizing Data Management Using AWS Storage and Data Migration Products | AW... | Amazon Web Services
DigitalGlobe, Inc., the world’s leading provider of high-resolution Earth imagery, data, and analysis, is migrating its IT infrastructure, supporting imagery production and storage as well as satellite flight operations, to AWS with plans to close its commercial data centers within four years. DigitalGlobe has utilized AWS Snowmobile to move its 100PB image archive to the cloud. DigitalGlobe built its Geospatial Big Data platform, GBDX, natively on AWS. GBDX utilizes the image archive and combines geospatial big data and analytic tools, partner and customer data and tools, and dynamic cloud compute all in one place. This session will explore cost optimization for data management on AWS, highlighting various storage tiers and data import opportunities. We will focus on cost optimal usage of S3, S3-IA, Glacier, Snowball Edge and Snowmobile – balancing imagery access time with storage costs. Hear how DigitalGlobe utilized some of the newest features of the AWS platform to minimize their costs from storage. Learn More: https://aws.amazon.com/government-education/
(STG312) Amazon Glacier Deep Dive: Cold Data Storage in AWS | Amazon Web Services
This session explores some of the key features of Amazon Glacier, including security, durability, and configuration for storing compliance and regulatory data. It covers best practices for managing your cold data, including ingest, retrieval, and security controls. Other topics include: how to optimize storage, upload, and retrieval costs; how to identify the most applicable workloads; and recommended optimizations based on a few sample use cases from a number of industry verticals.
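A point worth making concrete from the retrieval discussion above: native Glacier access is asynchronous. You initiate a job (an inventory or archive retrieval), wait for it to complete, then download the output. A minimal sketch with hypothetical vault and archive identifiers:

```python
# Sketch: the two common Glacier job types. Retrieval is a two-step,
# asynchronous process: initiate a job, then fetch its output later.
def inventory_job_params():
    """List everything stored in a vault."""
    return {"Type": "inventory-retrieval", "Format": "JSON"}

def archive_job_params(archive_id, tier="Standard"):
    """Retrieve one archive; Tier is Expedited | Standard | Bulk."""
    return {"Type": "archive-retrieval",
            "ArchiveId": archive_id, "Tier": tier}

job = archive_job_params("EXAMPLE-ARCHIVE-ID", tier="Bulk")
# resp = glacier.initiate_job(vaultName="example-vault", jobParameters=job)
# ...poll glacier.describe_job(...), then
# glacier.get_job_output(vaultName="example-vault", jobId=resp["jobId"])
```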
Deep Dive on the AWS Storage Gateway - April 2017 AWS Online Tech Talks | Amazon Web Services
- Learn about the benefits and capabilities of AWS Storage Gateway
- Learn how to get started with AWS Storage Gateway
AWS Storage Gateway provides file, volume, and tape storage in AWS through standard protocols that integrate seamlessly into your on-premises and in-cloud environments. This tech talk is a deep dive into the main features and capabilities of AWS Storage Gateway, and patterns for using the service to accelerate your adoption of hybrid storage. With stories from real-life customer deployments, we’ll show the benefit of Storage Gateway for workloads such as backup and archive, disaster recovery, tiered storage, and cloud bursting.
AWS May Webinar Series - Getting Started: Storage with Amazon S3 and Amazon G... | Amazon Web Services
If you are interested in learning more about the AWS Chicago Summit, please register here: http://amzn.to/1RooPPL
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives:
• Business value of Amazon S3 and Amazon Glacier
• Leveraging S3 for web applications, media delivery, big data analytics and backup
• Leveraging Amazon Glacier to build cost effective archives
• Understand the life cycle management of AWS' storage services
Cost Optimising Your Architecture Practical Design Steps for Developer Saving... | Amazon Web Services
This session uses practical examples aimed at architects and developers. Using code and AWS CloudFormation in concert with services such as Amazon EC2, Amazon ECS, Lambda, Amazon RDS, Amazon SQS, Amazon SNS, Amazon S3 and more, we demonstrate the financial advantages of different architectural decisions. Attendees will walk away with concrete examples, as well as a new perspective on how they can build systems economically and effectively.
Speaker: Simon Elisha, Head of Solution Architecture, ANZ Public Sector, Amazon Web Services
Level 300
Companies typically overpay for data archiving. First, they're forced to make an expensive upfront payment for their archiving solution (which does not even include ongoing operational expenses such as power, facilities, staffing, and maintenance). Second, since companies have to guess what their capacity requirements will be, they understandably over-provision to make sure they have enough capacity for data redundancy and unexpected growth. This results in under-utilized capacity and wasted money. With Amazon Glacier, you pay only for what you use. Join us for this webinar to learn how Amazon Glacier changes the game for data archiving and backup: you pay nothing upfront, pay a very low price for storage, and can scale your usage up or down as needed.
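A back-of-envelope illustration of the pay-for-use point above, using an assumed price (Glacier storage was on the order of $0.004 per GB-month in this era; check current pricing, and note that retrieval and request fees are ignored here):

```python
# Illustrative only: storage cost at an assumed $0.004/GB-month rate.
# There is no upfront payment and no need to over-provision capacity.
def monthly_cost_usd(stored_gb, price_per_gb=0.004):
    return stored_gb * price_per_gb

# 50 TB actually stored (vs. paying for a 100 TB appliance up front):
used = monthly_cost_usd(50 * 1024)
print(f"${used:,.2f}/month")  # -> $204.80/month at the assumed price
```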
AWS Data Transfer Services: Data Ingest Strategies Into the AWS Cloud | Amazon Web Services
Different types and sizes of data require different strategies. In this session, learn about the various features and services available for migrating data, be it small ongoing transactional data or large multi-petabyte volumes. Come learn how customers are using the latest network, streaming and large scale ingest features for their cloud data migrations to AWS storage services.
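For the large-volume case, the workhorse technique is multipart upload: big objects are split into parts that transfer in parallel and retry independently. boto3's transfer manager does this automatically above a size threshold; the values below are illustrative assumptions, not recommendations from the session:

```python
# Sketch: tuning parameters for large-object ingest into S3.
def transfer_settings(part_size_mb=64, max_threads=10):
    """Settings you would pass to boto3.s3.transfer.TransferConfig."""
    return {
        "multipart_threshold": part_size_mb * 1024 * 1024,  # switch point
        "multipart_chunksize": part_size_mb * 1024 * 1024,  # per-part size
        "max_concurrency": max_threads,                     # parallel parts
    }

cfg = transfer_settings()
# from boto3.s3.transfer import TransferConfig
# s3.upload_file("big.tar", "example-bucket", "ingest/big.tar",
#                Config=TransferConfig(**cfg))
```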
Come learn about new and existing Amazon S3 features that can help you better protect your data, save on cost, and improve usability, security, and performance. We will cover a wide variety of Amazon S3 features and go into depth on several newer features with configuration and code snippets, so you can apply the learnings on your object storage workloads.
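Two of the protection features this kind of session covers can be sketched briefly: versioning (recover from accidental deletes and overwrites) and default server-side encryption. The bucket name is a placeholder; this is a minimal configuration sketch, not the session's exact snippets:

```python
# Sketch of two S3 data-protection settings.
def versioning_config():
    """Keep prior versions; a delete only adds a removable delete marker."""
    return {"Status": "Enabled"}

def default_encryption_config():
    """Encrypt every new object at rest with S3-managed keys (SSE-S3)."""
    return {"Rules": [{"ApplyServerSideEncryptionByDefault":
                       {"SSEAlgorithm": "AES256"}}]}

# s3.put_bucket_versioning(Bucket="example-bucket",
#                          VersioningConfiguration=versioning_config())
# s3.put_bucket_encryption(Bucket="example-bucket",
#     ServerSideEncryptionConfiguration=default_encryption_config())
```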
AWS April 2016 Webinar Series - S3 Best Practices - A Decade of Field Experience | Amazon Web Services
Amazon Simple Storage Service (S3) has been providing developers and IT teams with secure, durable, highly-scalable cloud storage for 10 years.
This webinar will share our insights about what we’ve seen in the past ten years of live customer environments, including backup, restore, archive, and compliance best practices as implemented by some of our largest data stores in the cloud. We will also do a quick review of the 6 different ways to transfer data into and out of AWS cloud storage, discuss how you can accelerate data transfers into and out of S3 over long distances and slow networks, and share some new developments with the AWS Import/Export Snowball appliance.
Learning Objectives:
• Best practices to keep data safe and cost effective (SIA, Versioning, Cross-region Replication, lifecycle policies)
• Quick overview on transfer services (Direct Connect, Snowball, Firehose, 3rd party partnerships, Storage Gateway)
• Deep dive on new ways to accelerate data transfers over long distances and slow networks
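One of the acceleration mechanisms behind that last objective is S3 Transfer Acceleration, which routes transfers through edge locations to speed long-distance, slow-network uploads. A minimal sketch with a hypothetical bucket name:

```python
# Sketch: enabling S3 Transfer Acceleration on a bucket.
def accelerate_config(enabled=True):
    return {"Status": "Enabled" if enabled else "Suspended"}

cfg = accelerate_config()
# s3.put_bucket_accelerate_configuration(
#     Bucket="example-bucket", AccelerateConfiguration=cfg)
# Then route transfers through the accelerate endpoint, e.g. a client
# built with botocore.config.Config(s3={"use_accelerate_endpoint": True}).
```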
AWS re:Invent 2016: Bringing Deep Learning to the Cloud with Amazon EC2 (CMP314) | Amazon Web Services
Algorithmia is a startup with a mission to make state-of-the-art machine learning discoverable by everyone: they offer the largest algorithm marketplace in the world, with over 2,500 algorithms supporting tens of thousands of application developers. Algorithmia is the first company to make deep learning, one of the most conceptually difficult areas of computing, accessible to any company via microservices. In this session, you learn how this startup has selected and optimized Amazon EC2 instances for various algorithms (including the latest generation of GPU-optimized instances) to create a flexible and scalable platform. They also share their architecture and best practices for getting any computationally intensive application started quickly.
The world is producing an ever increasing volume, velocity, and variety of big data. Consumers and businesses are demanding up-to-the-second (or even millisecond) analytics on their fast-moving data, in addition to classic batch processing. AWS delivers many technologies for solving big data problems. But what services should you use, why, when, and how? In this session, we simplify big data processing as a data bus comprising various stages: ingest, store, process, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architecture, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Presented by: Arie Leeuwesteijn, Principal Solutions Architect, Amazon Web Services
Customer Guest: Sander Kieft, Sanoma
SRV403 Deep Dive on Object Storage: Amazon S3 and Amazon Glacier | Amazon Web Services
In this session, storage experts will walk you through Amazon S3 and Amazon Glacier, bulk data repositories that can deliver 99.999999999% durability and scale past trillions of objects worldwide – with cost points competitive against tape archives. Learn about the different ways you can accelerate data transfer into S3 and get a close look at new tools to secure and manage your data more efficiently. See how Amazon Athena runs serverless analytics on your data and hear about expedited and bulk retrievals from Amazon Glacier. Learn how AWS customers have built solutions that turn their data from a cost into a strategic asset, and bring your toughest questions straight to our experts.
Learning Objectives:
- Review best practices to reduce costs, protect against data loss, and increase performance in Amazon S3
- Learn about new S3 storage management features that help you align storage with business needs
- Understand data security capabilities available in S3 that help protect against malicious or accidental deletion or other data loss
Getting Started with the Hybrid Cloud: Enterprise Backup and RecoveryAmazon Web Services
This sessions is for architects and storage admins seeking simple and non-disruptive ways to adopt cloud platforms in their organizations. You will learn how to deliver lower costs and greater scale with nearly seamless integration into your existing B&R processes. Services mentioned: S3, Glacier, Snowball, 3rd party partners, storage gateway, and ingestion services.
Amazon S3 hosts trillions of objects and is used for storing a wide range of data, from system backups to digital media. This presentation from the Amazon S3 Masterclass webinar we explain the features of Amazon S3 from static website hosting, through server side encryption to Amazon Glacier integration. This webinar will dive deep into the feature sets of Amazon S3 to give a rounded overview of its capabilities, looking at common use cases, APIs and best practice.
See a recording of this video here on YouTube: http://youtu.be/VC0k-noNwOU
Check out future webinars in the Masterclass series here: http://aws.amazon.com/campaigns/emea/masterclass/
View the Journey Through the Cloud webinar series here: http://aws.amazon.com/campaigns/emea/journey/
This session is for IT pros working with compliance managers to deliver solutions that lower costs and still meet compliance demands. You will learn how to move large scale data stores to the cloud, while remaining compliant with existing regulations. Services mentioned: S3, Glacier and the Vault Lock feature, Snowball, ingestion services.
AWS re:Invent 2016: Big Data Mini Con State of the Union (BDM205)Amazon Web Services
Join us for this general session where AWS big data experts present an in-depth look at the current state of big data. Learn about the latest big data trends and industry use cases. Hear how other organizations are using the AWS big data platform to innovate and remain competitive. Take a look at some of the most recent AWS big data announcements, as we kick off the Big Data re:Source Mini Con.
Optimizing Data Management Using AWS Storage and Data Migration Products | AW...Amazon Web Services
DigitalGlobe, Inc., the world’s leading provider of high-resolution Earth imagery, data, and analysis, is migrating its IT infrastructure, supporting imagery production and storage as well as satellite flight operations, to AWS with plans to close its commercial data centers within four years. DigitalGlobe has utilized AWS Snowmobile to move its 100PB image archive to the cloud. DigitalGlobe built its Geospatial Big Data platform, GBDX, natively on AWS. GBDX utilizes the image archive and combines geospatial big data and analytic tools, partner and customer data and tools, and dynamic cloud compute all in one place. This session will explore cost optimization for data management on AWS, highlighting various storage tiers and data import opportunities. We will focus on cost optimal usage of S3, S3-IA, Glacier, Snowball Edge and Snowmobile – balancing imagery access time with storage costs. Hear how DigitalGlobe utilized some of the newest features of the AWS platform to minimize their costs from storage. Learn More: https://aws.amazon.com/government-education/
(STG312) Amazon Glacier Deep Dive: Cold Data Storage in AWSAmazon Web Services
This session explores some of the key features of Amazon Glacier, including security, durability, and configuration for storing compliance and regulatory data. It covers best practices for managing your cold data, including ingest, retrieval, and security controls. Other topics include: how to optimize storage, upload, and retrieval costs; how to identify the most applicable workloads; and recommended optimizations based on a few sample use cases from a number of industry verticals.
Deep Dive on the AWS Storage Gateway - April 2017 AWS Online Tech TalksAmazon Web Services
- Learn about the benefits and capabilities of AWS Storage Gateway
- Learn how to get started with AWS Storage Gateway
AWS Storage Gateway provides file, volume, and tape storage in AWS through standard protocols which integrate seamlessly into your on-premises and in-cloud environments. This tech talk covers a deep dive to the main features and capabilities of AWS Storage Gateway, and patterns for using the service to accelerate your adoption of hybrid storage. With stories from real-life customer deployments, we’ll show the benefit of Storage Gateway for workloads such as backup and archive, disaster recovery, tiered storage, and cloud bursting.
AWS May Webinar Series - Getting Started: Storage with Amazon S3 and Amazon G...Amazon Web Services
If you are interested to know more about AWS Chicago Summit, please use the following to register: http://amzn.to/1RooPPL
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives:
• Business value of Amazon S3 and Amazon Glacier
• Leveraging S3 for web applications, media delivery, big data analytics and backup
• Leveraging Amazon Glacier to build cost effective archives
• Understand the life cycle management of AWS' storage services
This session is for IT pros working with compliance managers to deliver solutions that lower costs and still meet compliance demands. You will learn how to move large scale data stores to the cloud, while remaining compliant with existing regulations. Services mentioned: S3, Glacier and the Vault Lock feature, Snowball, ingestion services.
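The Vault Lock feature mentioned above enforces compliance controls by attaching an immutable policy to a Glacier vault. As a hedged sketch modeled on the documented deny-delete pattern (the account ID, vault name, and 365-day retention period are placeholder values), such a policy could be drafted and serialized like this:

```python
import json

# Sketch: a Glacier Vault Lock policy that denies archive deletion until an
# archive is at least 365 days old (a common WORM/compliance pattern).
# The account ID and vault name below are placeholders.
vault_lock_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "deny-delete-for-365-days",
        "Principal": "*",
        "Effect": "Deny",
        "Action": "glacier:DeleteArchive",
        "Resource": "arn:aws:glacier:us-east-1:111122223333:vaults/compliance-vault",
        "Condition": {
            "NumericLessThan": {"glacier:ArchiveAgeInDays": "365"}
        },
    }],
}

policy_json = json.dumps(vault_lock_policy)

# With credentials configured, the lock would be initiated (and later completed
# with complete_vault_lock) via:
#   import boto3
#   boto3.client("glacier").initiate_vault_lock(
#       vaultName="compliance-vault", policy={"Policy": policy_json})
```

Once the lock is completed, the policy can no longer be changed, which is what makes it suitable for regulatory retention requirements.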
Cost Optimising Your Architecture: Practical Design Steps for Developer Saving... | Amazon Web Services
This session uses practical examples aimed at architects and developers. Using code and AWS CloudFormation in concert with services such as Amazon EC2, Amazon ECS, Lambda, Amazon RDS, Amazon SQS, Amazon SNS, Amazon S3 and more, we demonstrate the financial advantages of different architectural decisions. Attendees will walk away with concrete examples, as well as a new perspective on how they can build systems economically and effectively.
Speaker: Simon Elisha, Head of Solution Architecture, ANZ Public Sector, Amazon Web Services
Level 300
Companies typically over-pay for data archiving. First, they're forced to make an expensive upfront payment for their archiving solution (which does not include the ongoing cost for operational expenses such as power, facilities, staffing, and maintenance). Second, since companies have to guess what their capacity requirements will be, they understandably over-provision to make sure they have enough capacity for data redundancy and unexpected growth. This set of circumstances results in under-utilized capacity and wasted money. With Amazon Glacier, you pay only for what you use. Join us for this webinar where you will learn how Amazon Glacier changes the game for data archiving and backup as you pay nothing upfront, pay a very low price for storage, and can scale your usage up or down as needed.
AWS Data Transfer Services: Data Ingest Strategies Into the AWS Cloud | Amazon Web Services
Different types and sizes of data require different strategies. In this session, learn about the various features and services available for migrating data, be it small ongoing transactional data or large multi-petabyte volumes. Come learn how customers are using the latest network, streaming and large scale ingest features for their cloud data migrations to AWS storage services.
Come learn about new and existing Amazon S3 features that can help you better protect your data, save on cost, and improve usability, security, and performance. We will cover a wide variety of Amazon S3 features and go into depth on several newer features with configuration and code snippets, so you can apply the learnings to your object storage workloads.
AWS April 2016 Webinar Series - S3 Best Practices - A Decade of Field Experience | Amazon Web Services
Amazon Simple Storage Service (S3) has been providing developers and IT teams with secure, durable, highly-scalable cloud storage for 10 years.
This webinar will share our insights about what we’ve seen in the past ten years of live customer environments, including backup, restore, archive, and compliance best practices as implemented by some of our largest data stores in the cloud. We will also do a quick review of the 6 different ways to transfer data into and out of AWS cloud storage, discuss how you can accelerate data transfers into and out of S3 over long distances and slow networks, and share some new developments with the AWS Import/Export Snowball appliance.
Learning Objectives:
• Best practices to keep data safe and cost effective (S3 Standard-IA, versioning, cross-region replication, lifecycle policies)
• Quick overview on transfer services (Direct Connect, Snowball, Firehose, 3rd party partnerships, Storage Gateway)
• Deep dive on new ways to accelerate data transfers over long distances and slow networks
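The lifecycle-policy best practice listed above can be sketched in code. Assuming a hypothetical bucket and prefix (and leaving the actual API call commented out, since it needs credentials), a tiering configuration for boto3's `put_bucket_lifecycle_configuration` might look like:

```python
# Sketch: an S3 lifecycle configuration that tiers objects to Standard-IA
# after 30 days and to Glacier after 90, then expires them after ~7 years.
# The bucket name and "backups/" prefix are hypothetical.
lifecycle_config = {
    "Rules": [{
        "ID": "archive-old-backups",
        "Filter": {"Prefix": "backups/"},
        "Status": "Enabled",
        "Transitions": [
            {"Days": 30, "StorageClass": "STANDARD_IA"},
            {"Days": 90, "StorageClass": "GLACIER"},
        ],
        "Expiration": {"Days": 2555},  # roughly 7 years
    }],
}

# With credentials configured, this would be applied with:
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="my-backup-bucket", LifecycleConfiguration=lifecycle_config)
```

The transition days and storage classes are the knobs to tune against your access patterns: objects read often in their first month belong in Standard, while rarely touched archives can move to Glacier sooner.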
AWS re:Invent 2016: Bringing Deep Learning to the Cloud with Amazon EC2 (CMP314) | Amazon Web Services
Algorithmia is a startup with a mission to make state-of-the-art machine learning discoverable by everyone: they offer the largest algorithm marketplace in the world, with over 2500 algorithms supporting tens of thousands of application developers. Algorithmia is the first company to make deep learning, one of the most conceptually difficult areas of computing, accessible to any company via microservices. In this session, you learn how this startup has selected and optimized Amazon EC2 instances for various algorithms (including the latest generation of GPU-optimized instances) to create a flexible and scalable platform. They also share their architecture and best practices for getting any computationally intensive application started quickly.
The world is producing an ever increasing volume, velocity, and variety of big data. Consumers and businesses are demanding up-to-the-second (or even millisecond) analytics on their fast-moving data, in addition to classic batch processing. AWS delivers many technologies for solving big data problems. But what services should you use, why, when, and how? In this session, we simplify big data processing as a data bus comprising various stages: ingest, store, process, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architecture, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Presented by: Arie Leeuwesteijn, Principal Solutions Architect, Amazon Web Services
Customer Guest: Sander Kieft, Sanoma
Come learn about new and existing Amazon S3 features that can help you better protect your data, save on cost, and improve usability, security, and performance. We will cover a wide variety of Amazon S3 features and go into depth on several newer features with configuration and code snippets, so you can apply the learnings to your object storage workloads.
SRV403 Deep Dive on Object Storage: Amazon S3 and Amazon Glacier | Amazon Web Services
In this session, storage experts will walk you through Amazon S3 and Amazon Glacier, bulk data repositories that can deliver 99.999999999% durability and scale past trillions of objects worldwide – with cost points competitive against tape archives. Learn about the different ways you can accelerate data transfer into S3 and get a close look at new tools to secure and manage your data more efficiently. See how Amazon Athena runs serverless analytics on your data and hear about expedited and bulk retrievals from Amazon Glacier. Learn how AWS customers have built solutions that turn their data from a cost into a strategic asset, and bring your toughest questions straight to our experts.
Learning Objectives:
- Review best practices to reduce costs, protect against data loss, and increase performance in Amazon S3
- Learn about new S3 storage management features that help you align storage with business needs
- Understand data security capabilities available in S3 that help protect against malicious or accidental deletion or other data loss
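One of the protections referenced above, versioning, guards against malicious or accidental deletion because a delete only adds a delete marker while prior object versions remain recoverable. A minimal sketch of the configuration payloads (the bucket name is hypothetical, and the API call itself is commented out):

```python
# Sketch: payloads for enabling S3 versioning so that overwrites and deletes
# keep prior object versions recoverable. The bucket name is a placeholder.
versioning_config = {"Status": "Enabled"}

# Optionally, MFA Delete additionally requires a valid MFA token to
# permanently delete a version or change the versioning state.
mfa_versioning_config = {"Status": "Enabled", "MFADelete": "Enabled"}

# With credentials configured:
#   import boto3
#   s3 = boto3.client("s3")
#   s3.put_bucket_versioning(
#       Bucket="my-bucket", VersioningConfiguration=versioning_config)
```

Pairing versioning with the lifecycle policies covered elsewhere in these sessions keeps old versions from accumulating cost indefinitely.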
Learn about new and existing Amazon S3 features that can help you better protect your data, save on cost, and improve usability, security, and performance. We will cover a wide variety of Amazon S3 features and go into depth on several newer features with configuration and code snippets, so you can apply the learnings to your object storage workloads.
Deep Dive on Amazon S3 - March 2017 AWS Online Tech Talks | Amazon Web Services
Learn about new and existing Amazon S3 features that can help you better protect your data, save on cost, and improve usability, security, and performance. We will cover a wide variety of Amazon S3 features and go into depth on several newer features with configuration and code snippets, so you can apply the learnings to your object storage workloads.
Learning Objectives:
• Review best practices to reduce costs, protect against data loss, and increase performance in Amazon S3
• Learn about new S3 storage management features that help you align storage with business needs
• Understand data security capabilities available in S3 that help protect against malicious or accidental deletion or other data loss
SRV403 Deep Dive on Object Storage: Amazon S3 and Amazon Glacier | Amazon Web Services
In this session, storage experts will walk you through Amazon S3 and Amazon Glacier, bulk data repositories that can deliver 99.999999999% durability and scale past trillions of objects worldwide – with cost points competitive against tape archives. Learn about the different ways you can accelerate data transfer into S3 and get a close look at new tools to secure and manage your data more efficiently. See how Amazon Athena runs serverless analytics on your data and hear about expedited and bulk retrievals from Amazon Glacier. Learn how AWS customers have built solutions that turn their data from a cost into a strategic asset, and bring your toughest questions straight to our experts.
AWS re:Invent 2016: Strategic Planning for Long-Term Data Archiving with Amaz... | Amazon Web Services
Without careful planning, data management can quickly turn complex with a runaway cost structure. Enterprise customers are turning to the cloud to solve long-term data archive needs such as reliability, compliance, and agility while optimizing the overall cost. Come to this session and hear how AWS customers are using Amazon Glacier to simplify their archiving strategy. Learn how customers architect their cloud archiving applications and integrate them to streamline their organization's data management and establish successful IT best practices.
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives:
• Business value of Amazon S3 and Amazon Glacier
• Leveraging S3 for web applications, media delivery, big data analytics and backup
• Leveraging Amazon Glacier to build cost effective archives
• Understand the life cycle management of AWS’s storage services
Who Should Attend:
• Developers, DevOps Engineers, Engineers and System Administrators
Learn how AWS customers save money, time and effort by using AWS's backup and archive services. Organizations of all sizes rely on AWS services to durably safeguard their data off-premises at a surprisingly low cost. This session will illustrate backup and archive architectures that AWS customers are benefitting from today.
Come learn about new and existing Amazon S3 features that can help you better protect your data, save on cost, and improve usability, security, and performance. We will cover a wide variety of Amazon S3 features and go into depth on several newer features with configuration and code snippets, so you can apply the learnings to your object storage workloads.
Deep Dive on Object Storage: Amazon S3 and Amazon Glacier | Amazon Web Services
In this session, storage experts will walk you through Amazon S3 and Amazon Glacier, bulk data repositories that can deliver 99.999999999% durability and scale past trillions of objects worldwide – with cost points competitive against tape archives.
Data is gravity. Your workloads and processing depend on where your data is and how it is stored. With AWS, you have a host of storage options, and the key to leveraging them successfully is knowing when to use which option. This session explains each of the AWS storage offerings in detail, along with options for data ingestion into the cloud using Snowball and Snowmobile.
Marc Trimuschat,
Head - Business Development, AWS Storage, AWS APAC
Overview of AWS Services for Data Storage and Migration - SRV205 - Anaheim AW... | Amazon Web Services
In this session, we explore the features and functions of AWS storage services. We provide context on the portfolio, and we cover the most common use cases for AWS offerings for object, file, block, and migration technologies, including the partner ecosystem. We then describe each service through customer case studies. Expect to leave this session understanding how to select a storage service and start moving workloads or building new ones.
An Overview of AWS Services for Data Storage and Migration - SRV205 - Atlanta... | Amazon Web Services
In this session, we explore the features and functions of AWS storage services. We provide context on the AWS storage portfolio, and we cover the most common use cases for AWS offerings for object, file, block, and migration technologies, including the AWS Partner Network (APN) ecosystem. Then we examine each service, using customer case studies as examples. You gain an understanding of how to select storage and start moving workloads or building new ones.
Strategic Uses for Cost Efficient Long-Term Cloud Storage | Amazon Web Services
Compared to storing long-term datasets on-premises, archiving in the cloud is a smart alternative whether you’re looking for an active archive solution, tape replacement, or to fulfill a compliance requirement. Learn how AWS customers are simplifying their archiving strategy and meeting compliance needs using Amazon Glacier. Hear how customers have evolved their backup and disaster recovery architectures and replaced tape solutions by turning to AWS for a more cost efficient, durable and agile solution. We will showcase Sony DADC's active archive deployment on Glacier and demo how some of our financial service customers have set up compliant archives to meet their regulatory objectives.
Introduction to Storage on AWS - AWS Summit Cape Town 2017 | Amazon Web Services
With AWS, you can choose the right storage service for the right use case. This session shows the range of AWS choices that are available to you: Amazon S3, Amazon EBS, Amazon EFS, Amazon Glacier and Cloud Data Migration solutions.
Eric Durand once again takes us on a journey through storage solutions for digital media, using the AWS Cloud.
This presentation was delivered at AWS Toronto, during the Media and Entertainment Symposium.
Introduction to key architectural concepts to build a data lake using Amazon S3 as the storage layer and making this data available for processing with a broad set of analytic options including Amazon EMR and open source frameworks such as Apache Hadoop, Spark, Presto, and more.
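Among the analytic options named here, Amazon Athena is the "query in place" choice: it runs standard SQL directly against objects in S3. A hedged sketch of the request parameters (the database, table, and results bucket are hypothetical, and the call itself is commented out):

```python
# Sketch: parameters for an Athena query run "in place" over data stored in
# Amazon S3. The "logs" database, "access_log" table, and results bucket
# are hypothetical examples.
query = "SELECT status, COUNT(*) AS n FROM access_log GROUP BY status"

start_query_args = {
    "QueryString": query,
    "QueryExecutionContext": {"Database": "logs"},
    "ResultConfiguration": {"OutputLocation": "s3://my-athena-results/"},
}

# With credentials configured:
#   import boto3
#   athena = boto3.client("athena")
#   resp = athena.start_query_execution(**start_query_args)
#   # then poll get_query_execution(QueryExecutionId=resp["QueryExecutionId"])
#   # until the state is SUCCEEDED, and read results from the output location.
```

Because Athena reads the data where it lives, no cluster has to be provisioned; the same S3 objects remain available to EMR, Spark, or Presto jobs.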
Data migration at petabyte scale is now a simple service from AWS. You can easily migrate large volumes of data from on-premises environments to the cloud, quickly get started with the cloud as a backup target, or burst workloads between your on-premises environments and the AWS Cloud. Learn about AWS Snowball, AWS Snowball Edge, AWS Snowmobile and AWS Storage Gateway, and understand which one is the right fit for your requirements. We will go through customer use cases, review the different applications used, and help you cut IT spend and management time on hardware and backup solutions.
In this session, you'll learn how to architect your applications based on Amazon Web Services' Well-Architected Framework principles and Adrian’s 10+ years of experience using AWS.
An Overview of AWS services for Data Storage and Migration - SRV205 - Toronto... | Amazon Web Services
In this session, we explore the features and functions of AWS Storage Services. We give context on the portfolio, cover the most common use cases for AWS offerings for object, file, block and migration technologies, including the partner ecosystem, and then go into each service with customer case study examples. Leave this session with an understanding of how to select storage and start moving workloads or building new ones.
Similar to Deep Dive on Object Storage: Amazon S3 and Amazon Glacier | AWS Public Sector Summit 2017
How to build Forecasting services using ML and deep learn... algorithms | Amazon Web Services
Forecasting is an important process for many companies and is used in various domains to try to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session, we will show how to pre-process data that contains a time component and then use an algorithm that produces an accurate forecast based on the type of data analyzed.
Big Data for Startups: how to create Big Data applications in Server... mode | Amazon Web Services
The variety and quantity of data created every day is accelerating ever faster and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters appears to be an investment affordable only for established companies. But the elasticity of the Cloud and, in particular, Serverless services allow us to break through these limits.
Let's see, then, how it is possible to develop Big Data applications quickly, without worrying about infrastructure, dedicating all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session, we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing its pace of innovation. Over this period, we learned how changing our approach to application development significantly increased our agility and release velocity, and ultimately allowed us to build more reliable and scalable applications. In this session, we will explain how we define modern applications and how building modern apps affects not only application architecture, but also organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to spend up to 90% less with containers and Spot Instances | Amazon Web Services
The use of containers continues to grow.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot Instances, leading to an average saving of 70% compared to On-Demand Instances. In this session we will explore the characteristics of Spot Instances and how they can easily be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda :
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make your startup's offering unique in the market with Machine Lea... services | Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative, purpose-built components.
AWS provides ready-to-use services and, at the same time, lets you customize and build the differentiating elements of your offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, including through a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: automate the management and deployments of... | Amazon Web Services
With the traditional approach to IT, for many years it was difficult to implement DevOps techniques, which until now often involved manual activities, occasionally leading to application downtime and interrupting user operations. With the advent of the cloud, DevOps techniques are now within everyone's reach at low cost for any kind of workload, guaranteeing greater system reliability and resulting in significant improvements to business continuity.
AWS offers AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances by means of Chef and Puppet workloads.
Discover how to leverage AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to support your Windows Workloads | Amazon Web Services
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we will explore the possibilities offered by AWS services for applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are organizing a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in cloud environments based on VMware vSphere® and access a broad range of AWS services, taking full advantage of the AWS cloud while protecting existing VMware investments.
Many organizations take advantage of the cloud by migrating their Oracle workloads, securing significant benefits in terms of agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, compounded by performance risks that can be introduced when moving applications out of on-premises data centers.
Build your first serverless ledger-based app with QLDB and NodeJS | Amazon Web Services
Many companies today build applications with ledger-style functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply chain flow of their products.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB eliminates the need to build complex custom systems by providing a fully managed, serverless ledger database.
In this session we will discover how to build a complete serverless application that uses QLDB's capabilities.
With the rise of microservice architectures and rich mobile and web applications, APIs are more important than ever for delivering an exceptional user experience to end users. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dig into several scenarios, understanding how AppSync can help solve these use cases by building modern APIs with real-time and offline data update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Database and VMware Cloud™ on AWS: myths to debunk | Amazon Web Services
Many organizations take advantage of the cloud by migrating their Oracle workloads, securing significant benefits in terms of agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, compounded by performance risks that can be introduced when moving applications out of on-premises data centers.
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads, accelerating the transformation to the cloud; they also dive into the architecture and demonstrate how to take full advantage of VMware Cloud™ on AWS.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies managing Docker containers through an orchestration layer controlling deployment and lifecycle. In this session we will present the service's main features, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.
JMeter webinar - integration with InfluxDB and Grafana | RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Connector Corner: Automate dynamic content and events by pushing a button | DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
GraphRAG is All You Need? LLM & Knowledge Graph | Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality | Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
UiPath Test Automation using UiPath Test Suite series, part 4 | DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Key Trends Shaping the Future of Infrastructure.pdf | Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio, cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors, and newer malware including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
2. Cloud Data Migration
• AWS Direct Connect
• AWS Snow* data transport family
• Third Party Connectors
• Transfer Acceleration
• AWS Storage Gateway
• Amazon Kinesis Firehose
The AWS Storage Portfolio
• Object: Amazon S3, Amazon Glacier
• Block: Amazon EBS (persistent), Amazon EC2 Instance Store (ephemeral)
• File: Amazon EFS
3. What to expect from the session
• Pick the right storage class for your use cases
• Automate management tasks
• Best practices to optimize S3 performance
• Tools to help you manage storage
• Storage migration, tiering, bursting
4. Choice of storage classes on S3
• Standard: active data
• Standard - Infrequent Access: infrequently accessed data
• Amazon Glacier: archive data
5. Storage classes designed for your use case
S3 Standard
• Big data analysis
• Content distribution
• Static website hosting
Standard - IA
• Backup & archive
• Disaster recovery
• File sync & share
• Long-retained data
Amazon Glacier
• Long term archives
• Digital preservation
• Magnetic tape replacement
7. When should you move to Standard-IA?
S3 Analytics – storage class analysis
• Visualize the access pattern on your data over time
• Measure the object age where data is infrequently accessed
• Dive deep by bucket, prefixes, or specific object tag
• Easily create a lifecycle policy based on the analysis
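A storage class analysis configuration like the one described above can be set through the S3 analytics API. A minimal sketch (bucket names, the configuration id, and the export prefix are placeholders; the boto3 call is shown as a comment):

```python
def storage_class_analysis_config(config_id, prefix, results_bucket_arn):
    # Scope the analysis to a prefix and export daily CSV results to
    # another bucket, where they can be visualized or queried.
    return {
        "Id": config_id,
        "Filter": {"Prefix": prefix},
        "StorageClassAnalysis": {
            "DataExport": {
                "OutputSchemaVersion": "V_1",
                "Destination": {
                    "S3BucketDestination": {
                        "Format": "CSV",
                        "Bucket": results_bucket_arn,
                        "Prefix": "analysis/",
                    }
                },
            }
        },
    }

cfg = storage_class_analysis_config(
    "logs-analysis", "logs/", "arn:aws:s3:::analysis-results")
# Apply it (not executed here):
# boto3.client("s3").put_bucket_analytics_configuration(
#     Bucket="my-bucket", Id=cfg["Id"], AnalyticsConfiguration=cfg)
```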
8. Amazon Glacier
Archival storage for infrequently accessed data
• Amazon Glacier is optimized for infrequent retrieval
• Stop managing physical media
• Even lower cost than Amazon S3; same high durability
9. Amazon Glacier – Data Retrieval Tiers
Expedited Retrieval ($0.03/GB)
• Emergency access
• 1-5 minutes
• Last-minute play-out schedule swap
Standard Retrieval ($0.01/GB)
• Current model
• 3-5 hours
• Disaster recovery
Bulk Retrieval ($0.0025/GB)
• Batch/bulk access
• 5-12 hours
• PB-scale re-transcoding or video/image analysis
Use cases range from on-site to off-site tape replacement.
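The tier is chosen per restore request. A sketch that picks a tier and builds the RestoreObject request body (tier timings and 2017 per-GB prices are from the slide; bucket and key names are placeholders):

```python
# Retrieval tiers as described on the slide: access-time range in
# minutes and retrieval price per GB (2017 pricing).
TIERS = {
    "Expedited": {"minutes": (1, 5),     "usd_per_gb": 0.03},
    "Standard":  {"minutes": (180, 300), "usd_per_gb": 0.01},
    "Bulk":      {"minutes": (300, 720), "usd_per_gb": 0.0025},
}

def restore_request(days, tier):
    if tier not in TIERS:
        raise ValueError(f"unknown tier: {tier}")
    return {
        "Days": days,  # how long the restored copy stays available in S3
        "GlacierJobParameters": {"Tier": tier},
    }

req = restore_request(days=7, tier="Bulk")
# boto3.client("s3").restore_object(
#     Bucket="my-archive", Key="video/master.mov", RestoreRequest=req)
print(req["GlacierJobParameters"]["Tier"])  # → Bulk
```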
10. S3 Data Lake Example: FINRA
Use Case
• Quickly ingest data from many sources, and store it efficiently in one location
• Multiple analytics/processing tools (e.g., Amazon Athena, Amazon EMR, Amazon Redshift) to speed time to value
• Migrate on-premises data warehouses, Hadoop & big data clusters to AWS
Value Proposition
• Decouple storage and compute—scale optimally and cost efficiently
• Eliminate silos—centrally manage, govern and access all data
• Use the right analytic tools for the job - evolve as requirements change - no data migration
FINRA TCO
• Analyzes & stores 75 billion events per day with S3, EMR & Amazon Redshift
• Securely stores 5PB of historical data on S3 for deeper ad hoc analytics
• Increased agility and speed to results
• Estimated savings of $10-20M per year over previous on-premises solution
11. Amazon Glacier Example: Satellite Image Archive
• DigitalGlobe takes satellite imagery of the Earth
• 100 PB image library = 6 billion square kilometers
• 1 PB new image every year
• Images to be archived and retained for decades
12. Pick the right storage class for your use cases
Automate management tasks
• Best practices to optimize S3 performance
• Tools to help you manage storage
• Storage migration, tiering, bursting
16. Automate data management with lifecycle policies
• Automatic tiering and cost controls
• Includes two possible actions:
• Transition: archives to Standard - IA or Amazon Glacier based on the object age you specify
• Expiration: deletes objects after a specified time
• Actions can be combined
• Set policies by bucket, prefix, or tags
• Set policies for current or non-current versions
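A lifecycle rule combining both actions can be sketched as follows (prefix and day counts are illustrative placeholders; the boto3 call is shown as a comment):

```python
def lifecycle_rule(prefix, ia_days=30, glacier_days=90, expire_days=365):
    # Transition by object age, then expire: Standard -> Standard-IA
    # after `ia_days`, Glacier after `glacier_days`, delete after
    # `expire_days`. All day counts here are example values.
    return {
        "ID": f"tier-then-expire-{prefix.rstrip('/') or 'all'}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": ia_days, "StorageClass": "STANDARD_IA"},
            {"Days": glacier_days, "StorageClass": "GLACIER"},
        ],
        "Expiration": {"Days": expire_days},
    }

config = {"Rules": [lifecycle_rule("logs/")]}
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=config)
```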
17. Set up a lifecycle policy on the AWS Management Console
18.
19.
20. Protect your data from accidental deletes
Best practice: versioning
• Protects against unintended user deletes or application logic failures
• New version with every upload
• Easy retrieval of deleted objects and rollback to previous versions
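On a versioned bucket, a DELETE only inserts a delete marker; removing that marker by version id brings the previous version back. A sketch of finding the marker to remove (the input mimics the shape of `list_object_versions` output; bucket/key names are placeholders):

```python
def find_delete_marker(delete_markers, key):
    """Return the version id of the latest delete marker for `key`,
    or None if the object is not currently deleted."""
    for m in delete_markers:
        if m["Key"] == key and m["IsLatest"]:
            return m["VersionId"]
    return None

marker_id = find_delete_marker(
    [{"Key": "report.csv", "IsLatest": True, "VersionId": "v3"}],
    "report.csv")
print(marker_id)  # → v3
# Deleting the marker itself "undeletes" the object:
# boto3.client("s3").delete_object(
#     Bucket="my-bucket", Key="report.csv", VersionId=marker_id)
```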
21. Automate with trigger-based workflow
Amazon S3 event notifications
Events can target an SNS topic, an SQS queue, or a Lambda function
• Notification when objects are created via PUT, POST, Copy, or Multipart Upload, or removed via DELETE
• Filter on prefixes and suffixes
• Trigger workflows with Amazon SNS, Amazon SQS, and AWS Lambda functions
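A notification configuration that invokes a Lambda function for new uploads matching a prefix and suffix can be sketched like this (the function ARN and filter values are placeholders):

```python
def lambda_notification(function_arn, prefix, suffix):
    # Invoke a Lambda function whenever a matching object is created
    # (covers PUT, POST, Copy, and Multipart Upload completion).
    return {
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": function_arn,
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {"Key": {"FilterRules": [
                {"Name": "prefix", "Value": prefix},
                {"Name": "suffix", "Value": suffix},
            ]}},
        }]
    }

config = lambda_notification(
    "arn:aws:lambda:us-east-1:123456789012:function:make-thumbnail",
    prefix="images/", suffix=".jpg")
# boto3.client("s3").put_bucket_notification_configuration(
#     Bucket="my-bucket", NotificationConfiguration=config)
```

SNS and SQS targets follow the same shape, using `TopicConfigurations` and `QueueConfigurations` respectively.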
22. Cross-region replication
Automated, fast, and reliable asynchronous replication of data across AWS regions
Use cases:
• Compliance – store data hundreds of miles apart
• Lower latency – distribute data to regional customers
• Security – create remote replicas managed by separate AWS accounts
How it works:
• Only replicates new PUTs. Once configured, all new uploads to the source bucket are replicated
• Entire bucket or prefix based
• 1:1 replication between any 2 regions
• Versioning required
• Deletes and lifecycle actions are not replicated
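The configuration behind this can be sketched as below, matching the 2017-era replication schema with a per-rule prefix (the IAM role and bucket ARNs are placeholders; versioning must already be enabled on both buckets):

```python
def replication_config(role_arn, dest_bucket_arn, prefix=""):
    return {
        # IAM role that S3 assumes to replicate objects on your behalf
        "Role": role_arn,
        "Rules": [{
            "ID": "replicate-new-puts",
            "Prefix": prefix,  # empty string = entire bucket
            "Status": "Enabled",
            "Destination": {"Bucket": dest_bucket_arn},
        }],
    }

config = replication_config(
    "arn:aws:iam::123456789012:role/s3-crr-role",
    "arn:aws:s3:::my-bucket-replica")
# boto3.client("s3").put_bucket_replication(
#     Bucket="my-bucket", ReplicationConfiguration=config)
```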
23. Summary – automate management tasks
• Cross-region replication
• Automate transition and expiration with lifecycle policies
• Trigger-based workflow with event notification
• Easily recover from accidental delete with versioning
24. Topics
Pick the right storage class for your use cases
Automate management tasks
Best practices to optimize S3 performance
• Tools to help you manage storage
• Storage migration, tiering, bursting
25. Faster upload of large objects
Best practice: parallelize PUTs with multipart uploads
• Increase aggregate throughput by parallelizing PUTs on high-bandwidth networks
• Move the bottleneck to the network, where it belongs
• Increase resiliency to network errors; fewer large restarts on error-prone networks
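Splitting an object into parts for parallel upload can be sketched as follows (the 64 MB part size is an illustrative choice; S3's actual minimum part size is 5 MB, except for the last part):

```python
MIN_PART = 5 * 1024 * 1024  # S3 minimum part size (5 MB)

def plan_parts(object_size, part_size=64 * 1024 * 1024):
    """Return the inclusive (start, end) byte range of each part."""
    if part_size < MIN_PART:
        raise ValueError("parts must be at least 5 MB")
    parts, offset = [], 0
    while offset < object_size:
        end = min(offset + part_size, object_size)
        parts.append((offset, end - 1))
        offset = end
    return parts

parts = plan_parts(200 * 1024 * 1024)  # 200 MB object, 64 MB parts
print(len(parts))  # → 4
```

In practice boto3's managed transfers (`upload_file` with a `TransferConfig` multipart threshold and concurrency) handle this planning and the parallel PUTs automatically.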
26. Faster download
You can parallelize GETs as well as PUTs
GET /example-object HTTP/1.1
Host: example-bucket.s3.amazonaws.com
x-amz-date: Fri, 28 Jan 2016 21:32:02 GMT
Range: bytes=0-9
Authorization: AWS AKIAIOSFODNN7EXAMPLE:Yxg83MZaEgh3OZ3l0rLo5RTX11o=
For large objects, use range-based GETs and align your GET ranges with your upload parts
For content distribution, enable Amazon CloudFront
• Caches objects at the edge
• Low latency data transfer to end user
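Building the `Range` headers for parallel ranged GETs, aligned with the part size used at upload time, can be sketched as:

```python
def range_headers(object_size, part_size):
    """One HTTP Range header value per part-aligned byte range."""
    return [f"bytes={start}-{min(start + part_size, object_size) - 1}"
            for start in range(0, object_size, part_size)]

headers = range_headers(object_size=30, part_size=10)
print(headers)  # → ['bytes=0-9', 'bytes=10-19', 'bytes=20-29']
# Each range can then be fetched concurrently, e.g. with boto3:
# s3.get_object(Bucket="my-bucket", Key="big.bin", Range=headers[0])
```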
28. Distributing key names
Add randomness to the beginning of the key name with a hash or reversed timestamp (ssmmhhddmmyy)
<my_bucket>/521335461-2013_11_13.jpg
<my_bucket>/465330151-2013_11_13.jpg
<my_bucket>/987331160-2013_11_13.jpg
<my_bucket>/465765461-2013_11_13.jpg
<my_bucket>/125631151-2013_11_13.jpg
<my_bucket>/934563160-2013_11_13.jpg
<my_bucket>/532132341-2013_11_13.jpg
<my_bucket>/565437681-2013_11_13.jpg
<my_bucket>/234567460-2013_11_13.jpg
<my_bucket>/456767561-2013_11_13.jpg
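The hash-prefix variant of this pattern can be sketched as below; the prefix length is an illustrative choice:

```python
import hashlib

def randomized_key(natural_key):
    # Derive a short, deterministic hash prefix from the natural key so
    # sequential uploads spread evenly across S3's key space.
    prefix = hashlib.md5(natural_key.encode()).hexdigest()[:8]
    return f"{prefix}-{natural_key}"

print(randomized_key("2013_11_13.jpg"))
```

Because the prefix is derived from the key itself, readers can recompute it to locate an object without a lookup table.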
29. Best practices – performance
• Faster upload over long distances with S3 Transfer Acceleration
• Faster upload for large objects with S3 multipart upload
• Optimize GET performance with Range GET and CloudFront
• SQL query in place on S3 with Athena
• Distribute key names for high-TPS workloads
• TCP window scaling for long, fat networks
• TCP SACK for fast, lossy connections like mobile
30. Topics
Pick the right storage class for your use cases
Automate management tasks
Best practices to optimize S3 performance
Tools to help you manage storage
• Storage migration, tiering, bursting
31. Organize your data with object tags
Manage data based on what it is, as opposed to where it's located
• Classify your data with up to 10 tags per object
• Tag your objects with key-value pairs
• Write policies once, based on the type of data
• Put an object with tags, or add tags to existing objects
Tags feed into storage metrics & analytics, lifecycle policies, and access control
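Tags can be attached at upload time via the URL-encoded form that PutObject's `Tagging` parameter uses; a sketch (tag names and object keys are placeholders):

```python
from urllib.parse import urlencode

def tagging_string(tags):
    # S3 allows at most 10 tags per object; tags are key-value pairs
    # passed as a URL-encoded string on PutObject.
    if len(tags) > 10:
        raise ValueError("S3 allows at most 10 tags per object")
    return urlencode(tags)

tags = {"classification": "confidential", "project": "alpha"}
print(tagging_string(tags))  # → classification=confidential&project=alpha
# boto3: s3.put_object(Bucket="my-bucket", Key="doc.pdf",
#                      Body=b"...", Tagging=tagging_string(tags))
# Existing objects take a structured TagSet via put_object_tagging.
```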
33. Manage access: restrict deletes
Best practice
• Bucket policies can restrict deletes
• For additional security, enable MFA (multi-factor authentication) delete, which requires additional authentication to:
• Change the versioning state of your bucket
• Permanently delete an object version
• MFA delete requires both your security credentials and a code from an approved authentication device
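A bucket policy that restricts deletes can be sketched as below: a statement denying `s3:DeleteObject` unless the request was MFA-authenticated (the bucket name is a placeholder):

```python
import json

def deny_unauthenticated_deletes(bucket):
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyDeleteWithoutMFA",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:DeleteObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            # Deny when the MFA context key is absent or false
            "Condition": {
                "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
            },
        }],
    }

policy = json.dumps(deny_unauthenticated_deletes("my-bucket"))
# boto3.client("s3").put_bucket_policy(Bucket="my-bucket", Policy=policy)
```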
34. Audit and monitor access with AWS CloudTrail data events
Use cases:
• Perform security analysis
• Meet your IT auditing and compliance needs
• Take immediate action on activity
How it works:
• Capture S3 object-level requests
• Enable at the bucket level
• Logs delivered to your S3 bucket
• $0.10 per 100,000 data events
35. Monitor performance and operation
Amazon CloudWatch metrics for S3
• Generate metrics for data of your choice
• Entire bucket, prefixes, and tags
• Up to 1,000 groups per bucket
• 1-minute CloudWatch metrics
• Alert and alarm on metrics
• $0.30 per metric per month
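Scoping 1-minute metrics to a prefix is done with a bucket metrics configuration; a minimal sketch (the configuration id and prefix are placeholders):

```python
def metrics_config(config_id, prefix):
    # Request 1-minute CloudWatch request metrics for one prefix only;
    # omitting Filter would cover the entire bucket instead.
    return {
        "Id": config_id,
        "Filter": {"Prefix": prefix},
    }

cfg = metrics_config("images-metrics", "images/")
# boto3.client("s3").put_bucket_metrics_configuration(
#     Bucket="my-bucket", Id=cfg["Id"], MetricsConfiguration=cfg)
```

A filter can also target a tag instead of a prefix, matching the grouping options listed above.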
36. CloudWatch Metrics for S3
Metric Name          Value
AllRequests          Count
PutRequests          Count
GetRequests          Count
ListRequests         Count
DeleteRequests       Count
HeadRequests         Count
PostRequests         Count
BytesDownloaded      MB
BytesUploaded        MB
4xxErrors            Count
5xxErrors            Count
FirstByteLatency     ms
TotalRequestLatency  ms
37.
38. S3 Inventory
Save time with daily or weekly delivery of a CSV file output to an S3 bucket
Use case: trigger business workflows and applications such as secondary index garbage collection, data auditing, and offline analytics
• More information about your objects than the LIST API provides, such as replication status, multipart upload flag, and delete marker
• Simple pricing: $0.0025 per million objects listed
39. S3 Inventory
Eventually consistent rolling snapshot
• New objects may not be listed
• Removed objects may still be included
Name                Type     Description
Bucket              String   Bucket name. UTF-8 encoded.
Key                 String   Object key name. UTF-8 encoded.
Version Id          String   Version Id of the object
Is Latest           boolean  true if the object is the latest (current) version of a versioned object, otherwise false
Delete Marker       boolean  true if the object is a delete marker of a versioned object, otherwise false
Size                long     Object size in bytes
Last Modified       String   Last modified timestamp. ISO format: YYYY-MM-DDTHH:mm:ss.SSSZ
ETag                String   ETag in hex-encoded format
StorageClass        String   Valid values: STANDARD, REDUCED_REDUNDANCY, GLACIER, STANDARD_IA. UTF-8 encoded.
Multipart Uploaded  boolean  true if the object was uploaded using multipart upload, otherwise false
Replication Status  String   Valid values: REPLICA, COMPLETED, PENDING, FAILED. UTF-8 encoded.
Validate before you act!
• Use HEAD OBJECT
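Processing the inventory CSV and validating before acting can be sketched as below; the column order follows the schema above, and bucket/key names are placeholders:

```python
import csv, io

# Column order matching the inventory schema listed above
INVENTORY_COLUMNS = ["Bucket", "Key", "VersionId", "IsLatest",
                     "DeleteMarker", "Size", "LastModified", "ETag",
                     "StorageClass", "MultipartUploaded",
                     "ReplicationStatus"]

def standard_class_keys(inventory_csv):
    """Keys of current-version objects still in STANDARD storage."""
    reader = csv.DictReader(io.StringIO(inventory_csv),
                            fieldnames=INVENTORY_COLUMNS)
    return [row["Key"] for row in reader
            if row["StorageClass"] == "STANDARD"
            and row["IsLatest"] == "true"]

sample = ("my-bucket,logs/a.gz,v1,true,false,1024,"
          "2017-01-01T00:00:00.000Z,abc,STANDARD,false,\n")
keys = standard_class_keys(sample)
print(keys)  # → ['logs/a.gz']
# Because the inventory is eventually consistent, validate each key
# with a HEAD request before acting on it:
# s3.head_object(Bucket="my-bucket", Key=keys[0])
```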
40. Pulling it all together
• SaaS security & compliance solution
• Built on AWS
• On-premises NAS migration to S3, Amazon Glacier
• Multi-PB migration
• Performant and scalable multi-tenant storage
• 1.7 PB per month
• 4000 customers
• Leverage new AWS storage features
• S3 tagging for scalability, lifecycle management
• S3 analytics to understand data access patterns
• Cross-region replication
• Glacier expedited retrieval for multi-region availability
“We couldn’t have completed the migration, optimized performance, cost-optimized via lifecycle, and classified our (4000+) customers without using S3’s new analytics, tagging, and lifecycle features.”
41. Topics
Pick the right storage class for your use cases
Automate management tasks
Best practices to optimize S3 performance
Tools to help you manage storage
Storage migration, tiering, bursting
42. Data has gravity and underpins all workloads
…it’s easier to move processing to the data
Workloads: 4k/8k video, genomics, seismic, financial, logs, IoT
1 PB over the Internet: 22 years
100 PB over T3: 609 years
100 PB over 1Gbps DX: 27 years
43. Data transfer into Amazon S3
• AWS Direct Connect
• AWS Snowball, AWS Snowball Edge, and AWS Snowmobile
• ISV Connectors
• Amazon Kinesis Firehose
• S3 Transfer Acceleration
• AWS Storage Gateway
44. Faster upload over long distances
S3 Transfer Acceleration: uploader → AWS edge location → S3 bucket, with optimized throughput
• Change your endpoint, not your code
• No firewall changes or client software
• Longer distance, larger files, more benefit
• Faster or free
• Global edge locations
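"Change your endpoint, not your code" amounts to swapping the regional endpoint for the accelerate endpoint once acceleration is enabled on the bucket; a sketch (the bucket name is a placeholder):

```python
def accelerate_endpoint(bucket):
    # S3 Transfer Acceleration is reached through a dedicated
    # bucket-scoped hostname instead of the regional endpoint.
    return f"https://{bucket}.s3-accelerate.amazonaws.com"

print(accelerate_endpoint("my-bucket"))
# With boto3, the equivalent is a client config flag plus a one-time
# bucket setting (shown as comments, not executed here):
# s3 = boto3.client("s3", config=botocore.config.Config(
#     s3={"use_accelerate_endpoint": True}))
# s3.put_bucket_accelerate_configuration(
#     Bucket="my-bucket", AccelerateConfiguration={"Status": "Enabled"})
```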
45. How fast is S3 Transfer Acceleration?
Chart: upload time (hrs.) for 500 GB from edge locations (Rio de Janeiro, Warsaw, New York, Atlanta, Madrid, Virginia, Melbourne, Paris, Los Angeles, Seattle, Tokyo, Singapore) to a bucket in Singapore, over the public Internet vs. S3 Transfer Acceleration
The longer the distance and the larger the file, the more the benefit
Try it at s3speedtest.com
46. How Snowball moves data into and out of AWS
Create a job → Connect the Snowball → Copy data to the Snowball → Your data moved to Amazon S3
Appliance status: job created → in transit to you → delivered to you → delivered to AWS → at AWS → job completed
47. AWS Snowball Edge (re:Invent 2016 launch)
Petabyte-scale hybrid device with onboard compute and storage
• 100 TB local storage
• Local compute equivalent to an Amazon EC2 m4.4xlarge instance
• 10GBase-T, 10/25Gb SFP28, and 40Gb QSFP+ copper and optical networking
• Ruggedized and rack-mountable
48. Hybrid capabilities beyond data migration
Migration: create job → copy data → moved to S3
Collection: collect data → create job → copy data → moved to S3
49. Case Study: Oregon State University
Use case:
• Collect and analyze oceanic and coastal images
• 60 TB of data per week
• Environmental and ocean ecosystem research
Architecture before Snowball:
• Transferred data with many small hard drives
• Used to take weeks to months to upload data
• $4MM+ in infrastructure investment
• Expensive and inefficient
Snowball lets OSU migrate TBs of data in days at a fraction of the cost
51. AWS Storage Gateway hybrid storage solutions
Use standard storage protocols to access AWS storage services
On-premises files, volumes, and tapes flow through AWS Storage Gateway into Amazon S3, Amazon Glacier, and Amazon EBS snapshots in the AWS Cloud, integrated with AWS Identity and Access Management (IAM), AWS Key Management Service (KMS), AWS CloudTrail, and Amazon CloudWatch.
52. Enabling cloud workloads
Move data to AWS storage for big data, cloud bursting, or migration
“Storage Gateway has the promise to transform the way we move data into the cloud. The NFS interface lets us easily integrate data files from analytical instruments, and the transparent S3 storage lets us easily connect our cloud-based applications and leverage the powerful storage capabilities of S3. With Storage Gateway, we can now unleash the full power of AWS on our instrument data.”
53. Backup, archive, and disaster recovery
Cost-effective storage in AWS with local or cloud restore
“Tapes are a headache, prone to hardware failures, offsite storage costs, and constant maintenance needs. Storage Gateway provided the most cost-effective and simple alternative. We even got disaster recovery by using a bicoastal data center.”
54. Amazon Storage Partner Solutions
aws.amazon.com/backup-recovery/partner-solutions/
Note: Represents a sample of storage partners
Primary Storage: solutions that leverage file, block, object, and streamed data formats as an extension to on-premises storage
Backup and Recovery: solutions that leverage Amazon S3 for durable data backup
Archive: solutions that leverage Amazon Glacier for durable and cost-effective long-term data backup