Amazon Web Services (AWS) is a cloud services platform that offers compute power, database storage, content delivery, and other functionality to help businesses scale and grow. Explore how millions of customers use AWS cloud products and solutions to build sophisticated applications that are increasingly flexible, scalable, and reliable.
AWS re:Invent 2016: Simplified Data Center Migration—Lessons Learned by Live ... (Amazon Web Services)
As the global leader in live entertainment, Live Nation promotes and produces over 22,000 events annually, operates out of 37 countries, and cultivates over 530 million fans globally. To focus on the growth of the business and shed increasing infrastructure costs, the company made the strategic decision to get out of the data center business and go all in with the cloud. Using instrumental services like AWS Import/Export Snowball, VM Import/Export, AWS CloudFormation and AWS Identity and Access Management, VP Cloud Services Jake Burns quickly and efficiently migrated priority business and operational applications, allowing for immediate cost efficiencies. Learn how AWS offerings like Snowball played a decisive role in Live Nation's ability to easily migrate data and enable end users to quickly access applications to minimize operational impact.
Cloud storage is a critical component of cloud computing, holding the information that applications use. Big data analytics, data warehouses, the Internet of Things, databases, and backup and archive applications all depend on some form of data storage architecture. Cloud storage is typically more reliable, scalable, and secure than traditional on-premises storage systems.
Big Data Architectural Patterns and Best Practices on AWS (Amazon Web Services)
The world is producing an ever-increasing volume, velocity, and variety of big data. Consumers and businesses are demanding up-to-the-second (or even millisecond) analytics on their fast-moving data, in addition to classic batch processing. AWS delivers many technologies for solving big data problems. But what services should you use, why, when, and how? In this session, we simplify big data processing as a data bus comprising various stages: ingest, store, process, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
February 2016 Webinar Series - Architectural Patterns for Big Data on AWS (Amazon Web Services)
With an ever-increasing set of technologies to process big data, organizations often struggle to understand how to build scalable and cost-effective big data applications.
In this webinar, we will simplify big data processing as a pipeline comprising various stages; and then show you how to choose the right technology for each stage based on criteria such as data structure, design patterns, and best practices.
Learning Objectives:
Understand key AWS Big Data services including S3, Amazon EMR, Kinesis, and Redshift
Learn architectural patterns for Big Data
Hear best practices for building Big Data applications on AWS
Who Should Attend:
Architects, developers and data scientists who are looking to start a Big Data initiative
Building Serverless Web Applications - DevDay Los Angeles 2017 (Amazon Web Services)
The document provides information about building serverless web applications using AWS Lambda and other AWS services. It begins with an overview of serverless computing using AWS Lambda and how it avoids the need to provision and manage servers. It then discusses various AWS compute offerings and when to use EC2, ECS, or Lambda. The rest of the document discusses serverless design patterns, demonstrates building a serverless web application using services like API Gateway and DynamoDB, and how to define and manage serverless applications using the AWS Serverless Application Model (SAM).
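As a hedged sketch of the pattern described above (the handler shape and the `notes` table name are illustrative, not taken from the talk), a Lambda function behind an API Gateway proxy integration can be written with the DynamoDB table injected, which keeps it unit-testable without AWS:

```python
import json

def make_handler(table):
    """Build an API Gateway proxy-style Lambda handler that writes to `table`.

    `table` only needs a put_item(Item=...) method, so it can be a real
    boto3 DynamoDB Table resource or an in-memory fake for testing.
    """
    def handler(event, context):
        body = json.loads(event.get("body") or "{}")
        item = {"id": body["id"], "text": body.get("text", "")}
        table.put_item(Item=item)
        return {
            "statusCode": 201,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(item),
        }
    return handler

# In a real deployment (names are illustrative):
#   import boto3
#   table = boto3.resource("dynamodb").Table("notes")
#   handler = make_handler(table)
```

SAM would then wire this handler to an API Gateway route and the DynamoDB table in the template, so the function body stays free of deployment details.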
With distributed frameworks like Hadoop and Kafka, it is essential to deploy the right environment to successfully support these workloads. Learn about the different block storage options from AWS and walk through with our experts how to select the best option for your big data analytics workloads. We will demonstrate how to set up, select, and modify volume types to right-size your environment.
Amazon Kinesis provides services for you to work with streaming data on AWS. Learn how to load streaming data continuously and cost-effectively to Amazon S3 and Amazon Redshift using Amazon Kinesis Firehose without writing custom stream processing code. Get an introduction to building custom stream processing applications with Amazon Kinesis Streams for specialised needs.
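As a rough illustration of what a Firehose producer does on its side (the delivery stream name below is hypothetical), `PutRecordBatch` accepts at most 500 records per call, so a producer typically chunks its records before sending:

```python
def batch_records(records, max_records=500):
    """Split records into chunks within Firehose's PutRecordBatch limit
    of 500 records per call (there is also a per-call size cap, not
    enforced here)."""
    for i in range(0, len(records), max_records):
        yield records[i:i + max_records]

# Sending the batches (stream name is illustrative):
#   import boto3
#   firehose = boto3.client("firehose")
#   for chunk in batch_records(raw_lines):
#       firehose.put_record_batch(
#           DeliveryStreamName="clickstream-to-s3",
#           Records=[{"Data": line.encode() + b"\n"} for line in chunk],
#       )
```

Firehose then handles buffering, compression, and delivery to S3 or Redshift, which is what removes the need for custom stream-processing code.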
This document discusses running databases on AWS. It provides an overview of several AWS database services including Amazon DynamoDB, Amazon RDS, Amazon Redshift, and Amazon ElastiCache. It highlights how these services provide scalability, reliability, automation of tasks like backups and patching, and pay-as-you-go pricing. It also discusses how AWS database services eliminate the need to manage hardware, databases, backups, and other complex tasks when compared to operating databases in an on-premises data center.
Serverless Big Data Analytics with Amazon Athena and QuickSight (Amazon Web Services)
Check out how you can easily query raw data in various formats in Amazon S3, transform it into a canonical form, analyze it, and build dashboards to get more insights from your data.
Amazon QuickSight is a fast, cloud-powered business intelligence (BI) service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get business insights from your data. In this session, we demonstrate how you can point Amazon QuickSight to AWS data stores, flat files, or other third-party data sources and begin visualizing your data in minutes. We also introduce you to SPICE, the Super-fast, Parallel, In-memory Calculation Engine in Amazon QuickSight, which performs advanced calculations and renders visualizations rapidly without requiring any additional infrastructure, SQL programming, or dimensional modeling, so you can seamlessly scale to hundreds of thousands of users and petabytes of data. Lastly, you will see how Amazon QuickSight provides smart visualizations and graphs that are optimized for your different data types, ensuring the most suitable visualization for your analysis, and how to share these visualization stories using the built-in collaboration tools.
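To make the Athena half of this pipeline concrete, here is a minimal sketch that assumes a hypothetical `access_logs` table has already been registered (for example by a Glue crawler) over raw files in S3:

```python
def athena_count_by(table, column, database="default"):
    """Build a simple aggregation query over raw data that an Athena
    table definition maps onto files sitting in S3."""
    return (
        f"SELECT {column}, COUNT(*) AS n "
        f"FROM {database}.{table} "
        f"GROUP BY {column} ORDER BY n DESC"
    )

# Submitting it (bucket and table names are illustrative):
#   import boto3
#   athena = boto3.client("athena")
#   athena.start_query_execution(
#       QueryString=athena_count_by("access_logs", "status"),
#       ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
#   )
```

QuickSight can then use the same Athena table as a data source, so no data is copied out of S3 at any stage.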
This session provides a foundational overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. This session will touch on the significant new offerings, outline some of the most common use cases and prepare you for the individual deep dive sessions, customer sessions and new announcements.
ENT313 Deploying a Disaster Recovery Site on AWS: Minimal Cost with Maximum E... (Amazon Web Services)
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum as well as optimizing your overall capital expense can be challenging. This session presents AWS features and services along with Disaster Recovery architectures that you can leverage when building highly available and disaster resilient applications. We will provide recommendations on how to improve your Disaster Recovery plan and discuss example scenarios showing how to recover from a disaster.
This document discusses building a modern data analytics architecture on AWS. It provides an overview of AWS services that can be used for ingesting, processing, storing, and analyzing large volumes of data in both real-time and batch scenarios. These include services like Amazon S3, Kinesis, EMR, Redshift, Athena, Elasticsearch, and Glue for ingesting, storing, processing, and querying data. Architectures shown include real-time data pipelines, data lakes, and batch ETL/ELT processes. Performance, cost effectiveness, and scalability benefits of AWS services are highlighted.
You have heard how containers are great for running microservices, but what is needed to get microservices to run in production at scale? In this session, we explore the reasoning and concepts behind microservices and how containers simplify building microservices-based applications. We will show how you can easily launch microservices on Amazon EC2 Container Service and how you can use ELB and Route 53 to easily do service discovery between microservices.
Presented by: Danny Fezer, Solutions Architect, Amazon Web Services
Customer Guest: Liz Duke, Technical Delivery Manager, Irdeto
Getting Started with Managed Database Services on AWS - AWS Summit Tel Aviv 2017 (Amazon Web Services)
In addition to running databases in Amazon EC2, AWS customers can choose among a variety of managed database services. These services save effort, save time, and unlock new capabilities and economies. In this session, we make it easy to understand how they differ, what they have in common, and how to choose one or more. We explain the fundamentals of Amazon DynamoDB, a fully managed NoSQL database service; Amazon RDS, a relational database service in the cloud; and Amazon Redshift, a fully managed, petabyte-scale data-warehouse solution that can be surprisingly economical. We will cover how each service might help support your application and how to get started.
This document summarizes key aspects of full stack analytics on AWS, including foundational services like storage, data ingestion, processing and analytics, machine learning, and security. It discusses AWS services like S3, Athena, Glue, Kinesis, Rekognition, and how they can be used together for cost-effective analytics from ingestion to machine learning to building smarter applications. Security is addressed at both the service and data levels using tools like IAM, encryption, and third party integration.
This document provides an agenda and overview for a workshop on building a data lake on AWS. The agenda includes reviewing data lakes, modernizing data warehouses with Amazon Redshift, data processing with Amazon EMR, and event-driven processing with AWS Lambda. It discusses how data lakes extend traditional data warehousing approaches and how services like Redshift, EMR, and Lambda can be used for analytics in a data lake on AWS.
This document summarizes key announcements from re:Invent 2016, AWS's annual user conference. The main themes included artificial intelligence, serverless computing, devops, data, and migration tools. Notable product announcements included AWS Batch for batch processing, Aurora for PostgreSQL, Athena for querying data lakes, and X-Ray for debugging distributed applications. The document also discusses AWS's strategy around machine learning and deep learning using MXNet as its primary framework.
Taking the Performance of your Data Warehouse to the Next Level with Amazon R... (Amazon Web Services)
Amazon Redshift gives you fast SQL query performance on large data sets. We will discuss optimisation from end to end, all the way from loading through to querying to ensure your end users get the data they need, when they need it.
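A common loading-side optimization for Redshift is bulk-loading with COPY rather than many single-row INSERTs, since COPY loads in parallel across slices. A minimal sketch of rendering such a statement (the table, bucket, and role names are hypothetical):

```python
def redshift_copy(table, s3_prefix, iam_role, fmt="CSV"):
    """Render a Redshift COPY statement for bulk-loading from S3."""
    return (
        f"COPY {table} FROM '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' FORMAT AS {fmt};"
    )

# e.g. (names are illustrative), executed through any SQL client
# connected to the cluster:
#   redshift_copy("events", "s3://my-bucket/events/2017/",
#                 "arn:aws:iam::123456789012:role/RedshiftCopyRole")
```

Splitting the input into multiple files under the prefix lets each slice load a file in parallel, which is where most of the load-time win comes from.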
Speaker: Russell Nash, Solutions Architect, Amazon Web Services
Featured Customer - Domain
by Darin Briskman, Technical Evangelist, AWS
SQL is a powerful tool to query data, but it doesn't cover everything you might need. Sometimes the precision of SQL is a limitation that can be overcome by using the flexibility and inherent ranking of search. Learn how to use AWS services to create fully managed solutions using Amazon Aurora and Amazon Elasticsearch Service to combine the power of query and search. Level: 200
Convert and Migrate Your NoSQL Database or Data Warehouse to AWS - July 2017 (Amazon Web Services)
Learning Objectives:
- Understand the use cases for migrating or replicating databases to the cloud
- Learn about the benefits of cloud-native databases for performance and costs reduction
- See how AWS Database Migration Service helps with your migration and how AWS Schema Conversion Tool makes conversions simple and quick
Moving or replicating your databases to the cloud should be simple and inexpensive. AWS has recently enhanced the AWS Database Migration Service and the AWS Schema Conversion Tool with new data sources to increase your migration options. You can now export from MongoDB databases and Greenplum, IBM Netezza, HPE Vertica, Teradata, Oracle DW and Microsoft SQL Server data warehouses to AWS. Learn how to export and migrate your data and procedural code with minimal downtime to the cloud database of your choice, including cloud-native offerings such as Amazon Aurora, Amazon DynamoDB and Amazon Redshift.
This document summarizes a presentation on simplifying big data processing with AWS. It discusses collecting and ingesting different types of data, various storage options for storing data, tools for processing and analyzing stored data, and consuming and visualizing results. It provides recommendations on which AWS services to use for different tasks based on factors like latency, throughput, data volume, and cost. Examples of using various AWS services together in solutions are also presented.
Database Migration: Simple, Cross-Engine and Cross-Platform Migrations with M... (Amazon Web Services)
Learn about the new AWS Database Migration Service, which helps you migrate databases with minimal downtime from on-premises and Amazon EC2 environments to Amazon RDS, Amazon Redshift, Amazon Aurora and EC2 databases. We discuss homogeneous (e.g. Oracle-to-Oracle, PostgreSQL-to-PostgreSQL, etc.) and heterogeneous (e.g. Oracle to Aurora, SQL Server to MariaDB) database migrations. We also talk about the new AWS Schema Conversion Tool that saves you development time when migrating your Oracle and SQL Server database schemas, including PL/SQL and T-SQL procedural code, to their MySQL, MariaDB and Aurora equivalents.
The document discusses strategies for scaling applications on AWS from 1 user to over 1 million users. It begins with hosting an application on a single EC2 instance, and progresses to introducing load balancing, separating databases onto RDS, read replicas, and sharding. It recommends services like S3, DynamoDB, and ElastiCache to offload components. Automation with services like CloudFormation and CodeDeploy, and a service-oriented architecture (SOA), are suggested so components can scale independently. Migrating parts of the application to specialized AWS databases and NoSQL options like DynamoDB is advised for very large user counts. The key is to leverage managed AWS services that scale automatically wherever possible.
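The "offload to ElastiCache" step usually means a cache-aside read path. A minimal sketch, assuming a Redis-style cache interface with `get`/`set` (the interface and names are illustrative):

```python
def cache_aside(cache, db_fetch, key, ttl=300):
    """Cache-aside read: return the cached value when present; on a miss,
    fetch from the database and populate the cache with a TTL so hot keys
    stop hitting the database."""
    val = cache.get(key)
    if val is None:
        val = db_fetch(key)
        cache.set(key, val, ttl)
    return val

# With ElastiCache the cache would be a Redis client and db_fetch a
# query against RDS (names illustrative):
#   user = cache_aside(redis_client, load_user_from_rds, "user:42")
```

The TTL bounds staleness; writes either delete the key or overwrite it so the next read repopulates the cache.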
Data comes in a variety of forms and in order to gain insight from this data you need to have the right platform in place. AWS has the services to cover all types of data, whether you need databases for structured data, Hadoop for unstructured data or a streaming engine for high-velocity data. In this session we will cover the various data analytics services on AWS and when to use them.
Data-driven companies need to make their data easily accessible to those who analyze it. Many organizations have adopted the Looker application on AWS: a centralized analytics platform, modeled with LookML, with a user-friendly interface that allows employees to ask and answer their own questions and make informed business decisions.
Join our webinar to learn how our customer, Casper, an online mattress retailer, made the switch from a transactional database to Looker’s data analytics program on Amazon Redshift. Looker on Amazon Redshift can help you greatly reduce your analytics lifecycle with a simplified infrastructure and rapid cloud scaling.
Join us to learn:
• How to utilize LookML to build reusable definitions and logic for your data
• Best practices for architecting a centralized analytical database
• How Casper leveraged Looker and Amazon Redshift to provide all their employees access to their data and metrics
Who should attend: Heads of Analytics, Heads of BI, Analytics Managers, BI Teams, Senior Analysts
Tackle Your Dark Data Challenge with AWS Glue - AWS Online Tech Talks (Amazon Web Services)
Learning Objectives:
- Discover dark data that you are currently not analyzing.
- Analyze dark data without moving it into your data warehouse.
- Visualize the results of your dark data analytics.
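A hedged sketch of how Glue gets pointed at dark data (all names below are hypothetical): a crawler walks an S3 prefix, infers schemas, and registers tables in the Glue Data Catalog, after which Athena can query the files in place with no load into a warehouse:

```python
def crawler_request(name, role_arn, database, s3_paths):
    """Assemble the arguments for glue.create_crawler: one crawler that
    walks the given S3 prefixes and registers inferred tables in the
    named Data Catalog database."""
    return {
        "Name": name,
        "Role": role_arn,
        "DatabaseName": database,
        "Targets": {"S3Targets": [{"Path": p} for p in s3_paths]},
    }

# Creating and starting it (names illustrative):
#   import boto3
#   glue = boto3.client("glue")
#   glue.create_crawler(**crawler_request(
#       "dark-data-crawler",
#       "arn:aws:iam::123456789012:role/GlueCrawlerRole",
#       "dark_data",
#       ["s3://my-bucket/unexplored/"]))
#   glue.start_crawler(Name="dark-data-crawler")
```

Once the crawler has run, the inferred tables show up in Athena under the `dark_data` database, which covers the "analyze without moving it" objective above.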
Join us for a series of introductory and technical sessions on AWS Big Data solutions. Gain a thorough understanding of what Amazon Web Services offers across the big data lifecycle and learn architectural best practices for applying those solutions to your projects.
We will kick off this technical seminar in the morning with an introduction to the AWS Big Data platform, including a discussion of popular use cases and reference architectures. In the afternoon, we will deep dive into Machine Learning and Streaming Analytics. We will then walk everyone through building your first Big Data application with AWS.
AWS re:Invent is the largest global gathering of the Amazon Web Services community, held in Las Vegas. Our five-day 2016 agenda, comprising over 400 breakout sessions and attracting over 32,000 attendees, offered expanded opportunities to learn about AWS solutions and how the cloud impacts your business, including deep dives into hot topics such as IoT and AI, new perspectives on cloud issues, and more chances to learn from the experts.
Not able to attend? Don't worry: we are bringing AWS re:Invent to Hong Kong on Jan 19, 2017. Packed into a one-day format, AWS re:Invent 2016 Recap Hong Kong will showcase the key products, services, and enhancements recently announced in Las Vegas, including new services for Big Data & Analytics, Architecture, Developers, and Enterprises. Local AWS customers are also invited to share their post-re:Invent perspectives and ongoing successes with AWS, so you can learn from their success stories.
Come join us at this event to hear new AWS product announcements addressed by the AWS Team!
AWS Data-Driven Insights Learning Series_ANZ Sep 2019 Part 2 (Amazon Web Services)
AWS has been supporting companies across Australia and New Zealand to put their most innovative tools and technologies to work to achieve their business needs and goals. AWS and our ecosystem of partners have helped the likes of CP Mining, IntelliHQ, WesCEF, Oz Minerals, Woodside and many more to modernise their analytics and data architecture in order to successfully generate business value from their data.
This event series aimed to give customers a broader understanding of how to build next-generation data lakes and analytics platforms, and to make connections with AWS.
Serverlesss Big Data Analytics with Amazon Athena and QuicksightAmazon Web Services
Check out how you can easily query raw data in various formats in Amazon S3, transform it into a canonical form, analyze it, and build dashboards to get more insights from your data.
Amazon QuickSight is a fast, cloud-powered business intelligence (BI) service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get business insights from your data. In this session, we demonstrate how you can point Amazon QuickSight to AWS data stores, flat files, or other third-party data sources and begin visualizing your data in minutes. We also introduce you to SPICE - a Super-fast, Parallel, In-memory, Calculation Engine in Amazon QuickSight, which performs advanced calculations and render visualizations rapidly without requiring any additional infrastructure, SQL programming, or dimensional modeling, so you can seamlessly scale to hundreds of thousands of users and petabytes of data. Lastly, you will see how Amazon QuickSight provides you with smart visualizations and graphs that are optimized for your different data types, to ensure the most suitable and appropriate visualization to conduct your analysis, and how to share these visualization stories using the built-in collaboration tools.
This session provides a foundational overview of the AWS storage portfolio, including block, file, object, and cloud data migration services. This session will touch on the significant new offerings, outline some of the most common use cases and prepare you for the individual deep dive sessions, customer sessions and new announcements.
ENT313 Deploying a Disaster Recovery Site on AWS: Minimal Cost with Maximum E...Amazon Web Services
In the event of a disaster, you need to be able to recover lost data quickly to ensure business continuity. For critical applications, keeping your time to recover and data loss to a minimum as well as optimizing your overall capital expense can be challenging. This session presents AWS features and services along with Disaster Recovery architectures that you can leverage when building highly available and disaster resilient applications. We will provide recommendations on how to improve your Disaster Recovery plan and discuss example scenarios showing how to recover from a disaster.
This document discusses building a modern data analytics architecture on AWS. It provides an overview of AWS services that can be used for ingesting, processing, storing, and analyzing large volumes of data in both real-time and batch scenarios. These include services like Amazon S3, Kinesis, EMR, Redshift, Athena, Elasticsearch, and Glue for ingesting, storing, processing, and querying data. Architectures shown include real-time data pipelines, data lakes, and batch ETL/ELT processes. Performance, cost effectiveness, and scalability benefits of AWS services are highlighted.
You have heard how containers are great for running microservices, but what is needed to get microservices to run in production at scale? In this session, we explore the reasoning and concepts behind microservices and how containers simplify building microservices based applications. We will show how you can easily launch microservices on Amazon EC2 Container Service and how you can use ELB and Route 53 to easily do service discovery between microservices.
Presented by: Danny Fezer, Solutions Architect, Amazon Web Services
Customer Guest: Liz Duke, Technical Delivery Manager, Irdeto
Getting Started with Managed Database Services on AWS - AWS Summit Tel Aviv 2017Amazon Web Services
In addition to running databases in Amazon EC2, AWS customers can choose among a variety of managed database services. These services save effort, save time, and unlock new capabilities and economies. In this session, we make it easy to understand how they differ, what they have in common, and how to choose one or more. We explain the fundamentals of Amazon DynamoDB, a fully managed NoSQL database service; Amazon RDS, a relational database service in the cloud; and Amazon Redshift, a fully managed, petabyte-scale data-warehouse solution that can be surprisingly economical. We will cover how each service might help support your application and how to get started.
This document summarizes key aspects of full stack analytics on AWS, including foundational services like storage, data ingestion, processing and analytics, machine learning, and security. It discusses AWS services like S3, Athena, Glue, Kinesis, Rekognition, and how they can be used together for cost-effective analytics from ingestion to machine learning to building smarter applications. Security is addressed at both the service and data levels using tools like IAM, encryption, and third party integration.
This document provides an agenda and overview for a workshop on building a data lake on AWS. The agenda includes reviewing data lakes, modernizing data warehouses with Amazon Redshift, data processing with Amazon EMR, and event-driven processing with AWS Lambda. It discusses how data lakes extend traditional data warehousing approaches and how services like Redshift, EMR, and Lambda can be used for analytics in a data lake on AWS.
This document summarizes key announcements from re:Invent 2016, AWS's annual user conference. The main themes included artificial intelligence, serverless computing, devops, data, and migration tools. Notable product announcements included AWS Batch for batch processing, Aurora for PostgreSQL, Athena for querying data lakes, and X-Ray for debugging distributed applications. The document also discusses AWS's strategy around machine learning and deep learning using MXNet as its primary framework.
Taking the Performance of your Data Warehouse to the Next Level with Amazon R...Amazon Web Services
Amazon Redshift gives you fast SQL query performance on large data sets. We will discuss optimisation from end to end, all the way from loading through to querying to ensure your end users get the data they need, when they need it.
Speaker: Russell Nash, Solutions Architect, Amazon Web Services
Featured Customer - Domain
by Darin Briskman, Technical Evangelist, AWS
SQL is a powerful tool to query data, but it doesn't cover everything you might need. Sometimes, the precision of SQL is a limitation, that can be overcome by using the flexibility and inherent ranking of search. Learn how to use AWS servcies to create fully managed solutions using Amazon Aurora and Amazon Elasticsearch Service to combine the power of query and search. Level: 200
Convert and Migrate Your NoSQL Database or Data Warehouse to AWS - July 2017Amazon Web Services
Learning Objectives:
- Understand the use cases for migrating or replicating databases to the cloud
- Learn about the benefits of cloud-native databases for performance and costs reduction
- See how AWS Database Migration Service helps with your migration and how AWS Schema Conversion Tool makes conversions simple and quick
Moving or replicating your databases to the cloud should be simple and inexpensive. AWS has recently enhanced the AWS Database Migration Service and the AWS Schema Conversion Tool with new data sources to increase your migration options. You can now export from MongoDB databases and Greenplum, IBM Netezza, HPE Vertica, Teradata, Oracle DW and Microsoft SQL Server data warehouses to AWS. Learn how to export and migrate your data and procedural code with minimal downtime to the cloud database of your choice, including cloud-native offerings such as Amazon Aurora, Amazon DynamoDB and Amazon Redshift.
This document summarizes a presentation on simplifying big data processing with AWS. It discusses collecting and ingesting different types of data, various storage options for storing data, tools for processing and analyzing stored data, and consuming and visualizing results. It provides recommendations on which AWS services to use for different tasks based on factors like latency, throughput, data volume, and cost. Examples of using various AWS services together in solutions are also presented.
Database Migration: Simple, Cross-Engine and Cross-Platform Migrations with M...Amazon Web Services
Learn about the new AWS Database Migration Service, which helps you migrate databases with minimal downtime from on-premises and Amazon EC2 environments to Amazon RDS, Amazon Redshift, Amazon Aurora and EC2 databases. We discuss homogeneous (e.g. Oracle-to-Oracle, PostgreSQL-to-PostgreSQL, etc.) and heterogeneous (e.g. Oracle to Aurora, SQL Server to MariaDB) database migrations. We also talk about the new AWS Schema Conversion Tool that saves you development time when migrating your Oracle and SQL Server database schemas, including PL/SQL and T-SQL procedural code, to their MySQL, MariaDB and Aurora equivalents.
The document discusses strategies for scaling applications on AWS from 1 user to over 1 million users. It begins with hosting an application on a single EC2 instance, then progresses to introducing load balancing, separating databases onto RDS, read replicas, and sharding. It recommends offloading components to services such as S3, DynamoDB, and ElastiCache. Automation with services like CloudFormation and CodeDeploy, together with a service-oriented architecture (SOA), is suggested so components can scale independently. Migrating parts of the application to specialized AWS databases and NoSQL options like DynamoDB is advised for very large user counts. The key is to leverage managed AWS services that scale automatically wherever possible.
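The sharding step mentioned above can be illustrated with a minimal sketch: hash-based routing of each user's data to one of N database shards. The shard names and the choice of `zlib.crc32` are assumptions for illustration, not part of the original talk.

```python
# A minimal sketch of hash-based sharding: route each user's data to one
# of N database shards by hashing the user id. Shard names are made up.
import zlib

SHARDS = ["users-db-0", "users-db-1", "users-db-2", "users-db-3"]

def shard_for(user_id: str) -> str:
    # A stable (non-salted) hash keeps routing deterministic across processes.
    return SHARDS[zlib.crc32(user_id.encode()) % len(SHARDS)]

# The same user always lands on the same shard:
assert shard_for("user-42") == shard_for("user-42")
```

A real deployment would add rebalancing when shards are added, which simple modulo routing handles poorly; that is one reason managed services like DynamoDB, which handle partitioning internally, are recommended at large scale.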
Data comes in a variety of forms and in order to gain insight from this data you need to have the right platform in place. AWS has the services to cover all types of data, whether you need databases for structured data, Hadoop for unstructured data or a streaming engine for high-velocity data. In this session we will cover the various data analytics services on AWS and when to use them.
Data-driven companies need to make their data easily accessible to those who analyze it. Many organizations have adopted the Looker application with LookML on AWS: a centralized analytical database with a user-friendly interface that allows employees to ask and answer their own questions to make informed business decisions.
Join our webinar to learn how our customer Casper, an online mattress retailer, made the switch from a transactional database to Looker's data analytics program on Amazon Redshift. Looker on Amazon Redshift can help you greatly shorten your analytics lifecycle with a simplified infrastructure and rapid cloud scaling.
Join us to learn:
• How to utilize LookML to build reusable definitions and logic for your data
• Best practices for architecting a centralized analytical database
• How Casper leveraged Looker and Amazon Redshift to provide all their employees access to their data and metrics
Who should attend: Heads of Analytics, Heads of BI, Analytics Managers, BI Teams, Senior Analysts
Tackle Your Dark Data Challenge with AWS Glue - AWS Online Tech TalksAmazon Web Services
Learning Objectives:
- Discover dark data that you are currently not analyzing.
- Analyze dark data without moving it into your data warehouse.
- Visualize the results of your dark data analytics.
Join us for a series of introductory and technical sessions on AWS Big Data solutions. Gain a thorough understanding of what Amazon Web Services offers across the big data lifecycle and learn architectural best practices for applying those solutions to your projects.
We will kick off this technical seminar in the morning with an introduction to the AWS Big Data platform, including a discussion of popular use cases and reference architectures. In the afternoon, we will deep dive into Machine Learning and Streaming Analytics. We will then walk everyone through building your first Big Data application with AWS.
AWS re:Invent is the largest global gathering of the Amazon Web Services community, held in Las Vegas. Our five-day 2016 agenda comprised over 400 breakout sessions and attracted over 32,000 attendees, offering expanded opportunities to learn about AWS solutions and how the cloud impacts your business, including deep dives into hot topics such as IoT and AI, new perspectives on cloud issues, and more chances to learn from the experts.
Not able to attend? Don't worry, we are bringing AWS re:Invent to Hong Kong on Jan 19, 2017. Packed into a one-day format, AWS re:Invent 2016 Recap Hong Kong will showcase the key products, services, and enhancements recently announced in Las Vegas, including new services for Big Data & Analytics, Architecture, Developers, and Enterprises. Local AWS customers are also invited to share their post-re:Invent perspectives and ongoing successes with AWS so you can learn from their success stories.
Come join us at this event to hear new AWS product announcements addressed by the AWS Team!
AWS Data-Driven Insights Learning Series_ANZ Sep 2019 Part 2Amazon Web Services
AWS has been supporting companies across Australia and New Zealand to put their most innovative tools and technologies to work to achieve their business needs and goals. AWS and our ecosystem of partners have helped the likes of CP Mining, IntelliHQ, WesCEF, Oz Minerals, Woodside and many more to modernise their analytics and data architecture in order to successfully generate business value from their data.
This event series aimed to give customers a broader understanding of how to build next-gen data lakes and analytics platforms, and to make connections with AWS.
Charlene-Elise Anderson is the Justice & Public Safety Manager for Amazon Web Services' (AWS) Worldwide Public Sector division. In 2018, AWS held 11 public sector summits around the world and had over 20,000 public sector customers. AWS offers a broad set of cloud computing services including compute, storage, database, analytics, and deployment services from data centers in 21 regions and 66 availability zones worldwide. Customers use AWS to improve time to market, scale infrastructure up and down as needed, and reduce costs compared to maintaining their own data centers. Some common uses of AWS by public sector customers include development and testing, disaster recovery, big data analytics, and virtually unlimited cloud-based storage.
Kalyanaraman Prasad, VP of AWS Edge Services, discusses the evolution of cloud computing and AWS. He notes that in 2014 control was shifting to customers, while in 2015 customers had control over their own destiny. AWS provides customers with "superpowers" through its robust and fully-featured technology infrastructure platform that innovates at a rapid pace with new capabilities added daily. AWS offers a broad spectrum of compute capabilities across many different service types to meet various needs. Customers in many industries have found success using AWS to transform their businesses through unprecedented scale, speed, and other benefits.
An Introduction to the AI services at AWS - AWS Summit Tel Aviv 2017Amazon Web Services
Artificial Intelligence (AI) services on the AWS cloud bring deep learning (DL) technologies like natural language understanding (NLU), automatic speech recognition (ASR), image recognition and computer vision (CV), text-to-speech (TTS), and machine learning (ML) within reach of every developer. In this session, you will be introduced to several new AI services: Amazon Lex, to build sophisticated text and voice chatbots; Amazon Rekognition, for deep learning-based image recognition; and Amazon Polly, for turning text into lifelike speech. The opportunities to apply one or more of these DL services are nearly boundless and this session will provide a number of examples and use cases to help you get started.
Introduction to Financial-Sector Innovation Cases Using AWS :: Felix Candelario :: AWS Fi...Amazon Web Services Korea
The document discusses how Amazon Web Services (AWS) provides scalable and cost-effective IT infrastructure for financial services firms. It highlights that AWS is used by over 1 million customers globally, including over 1,000 financial services organizations. The document then summarizes some key customer stories of how capital markets firms, banks, payments companies, and other financial services organizations are using AWS to power applications, analytics, exchanges, and other workloads.
Artificial Intelligence on the AWS PlatformAdrian Hornsby
The document provides an overview of artificial intelligence services available on Amazon Web Services (AWS). It discusses Amazon Polly for text-to-speech, Amazon Rekognition for image and video analysis, and Amazon Lex for conversational interfaces. It also describes Amazon Machine Learning for building machine learning models and Amazon EMR for running Spark and Hadoop clusters. Customers in various industries are using these AI services to power applications like fraud detection, personalization, targeted marketing, and more.
This document summarizes a presentation given by Teresa Carlson at the AWS Government, Education and Nonprofits Symposium in Canberra, Australia. Carlson discussed how cloud computing has become the new normal for many organizations. She provided examples of successful government adoption models and how AWS addresses security, compliance, procurement and culture issues. Carlson also presented statistics on AWS's growing customer base and the rapid pace of innovation, with over 500 new features and services launched in 2014.
AWS continues to rapidly innovate and grow its services. Over 1 million customers now use AWS each month, representing over 120% year-over-year growth. Revenue is also growing significantly, with an annual run rate now over $7 billion. Key trends include the rise of analytics solutions, real-time streaming analytics, machine learning and predictions, more database and storage options, high performance containers and serverless computing.
The document discusses Amazon Web Services (AWS) and why customers use AWS. It notes that the top reasons are: 1) Agility and scalability allowing customers to quickly scale resources, 2) The broad platform of services offered by AWS, and 3) AWS's ability to support innovation through frequent new feature and service launches. It provides examples of how customers from startups to large enterprises are using AWS across different industry domains.
This presentation was given at the AWS Enterprise Summit held on October 29, 2014. The session was delivered by Markku Lepisto, APAC Principal Technology Evangelist at Amazon Web Services.
Session summary: Cloud computing is rapidly changing how companies consume and produce IT services. Large enterprises generally recognize the value of cloud computing, but many are still unsure how to evaluate and adopt it in a way that fits their business. This session examines various cloud adoption strategies and the stages involved in each.
The AWS Workshop Series Online is a series of live webinars designed for IT professionals who are looking to leverage the AWS Cloud to build and transform their business, are new to the AWS Cloud, or are looking to further expand their skills and expertise. In this series, we will cover: 'Introduction to Cloud Computing with Amazon Web Services'.
Move fast, build things with AWS (June 2016)Julien SIMON
This document discusses top concerns of CIOs and how technology trends are impacting businesses. It argues that while problems remain the same over decades, attitudes, processes, tools, and embracing new technologies can help solve them. It promotes adopting cloud computing and serverless architectures to help businesses innovate faster, gain insights from data, and focus on their core operations rather than infrastructure management. Several examples of large companies migrating to AWS cloud are provided. The document advocates that infrastructure can become even more flexible and cost-efficient by leveraging services like AWS Lambda.
This document provides an overview of Amazon Web Services (AWS). It discusses AWS's position as a leader in cloud infrastructure services, its vast global infrastructure and integrated technology platform, and its expansive portfolio of cloud services including compute, storage, databases, analytics and artificial intelligence. The summary also notes that AWS continues rapid innovation, releasing new services and features daily to help customers build applications and drive digital transformation.
This document discusses how organizations can leverage big data and artificial intelligence (AI) to drive insights and add intelligence to their solutions. It covers common big data challenges, AWS big data solutions like Amazon S3, Glue, Athena, Redshift, Kinesis, and SageMaker, and how big data can power machine learning. Some key tenets for building big data architectures are using the right tools, leveraging managed services, adopting event-driven design patterns, and enabling ML applications.
Introduction to the AWS Cloud from Digital Tuesday MeetupIan Massingham
These are the slides that I used for my Introduction to AWS talk at the South Wales Digital Tuesday Meetup on the 2nd of December 2014.
Find out more about Digital Tuesday at their website here: http://www.digital-tuesday.com/
APAC Principal Solutions Architect, Johnathon Meichtry will run through the highlights of 2015 showcasing the biggest announcements and how customers are using these new features. This session will cover the entire breadth of the AWS platform, and is a chance to get a high level overview of all of the announcements, feature updates and new services that AWS has launched in 2015.
This document discusses options for disaster recovery in the AWS cloud, including Backup and Restore, Pilot Light, Warm Standby, and Multi-Site. AWS offers several solutions to meet different RTO and RPO requirements at varying cost. The cloud enables easy testing and flexible scaling of disaster recovery resources.
This document describes several AWS cloud security solutions, including tools for identity and access management, detection, infrastructure security, incident response, and data protection. AWS offers 203 security certifications and more than 2,600 controls audited annually to help customers stay compliant and secure in the cloud.
In this webinar, you will learn how companies can leverage the AWS cloud to automate software development pipelines. This approach lets your team be more agile, improving its ability to deliver applications and services quickly.
Technologies like containers and Kubernetes can make your software delivery processes easier and faster. In this webinar, we discuss how to use Amazon Elastic Kubernetes Service (EKS) to build modern applications with fully managed Kubernetes clusters.
Ransomware is one of the fastest-growing threats to any organization. No company, large or small, is immune to attacks by cybercriminals. In this session, we show how you can leverage AWS cloud services and capabilities to protect your most valuable data from cyberattacks and accelerate the restoration of operations.
Ransomware is a malicious practice that has become widespread in recent years. In this session, we show how, with Amazon Web Services, our customers can develop a proactive ransomware mitigation strategy, both in on-premises scenarios and when operating in the cloud.
When moving data to the cloud, customers need to understand the optimal methods for different use cases, the types of data they are moving, and the network resources available, among other factors. AWS migration and transfer solutions range from data migration with limited connectivity, hybrid cloud storage, and frequent B2B file transfers to online and offline data transfers. In this session, we show how you can accelerate and simplify data migration and transfer to and from the AWS cloud.
The document discusses strategies for migrating data to AWS, including services such as AWS Transfer Family for file transfer, AWS DataSync for moving data between on-premises environments and AWS, and the AWS Snow Family for offline transfer of large amounts of data.
File storage has diverse use cases, such as user directories, application data, media files, and shared storage for high-performance workloads. Managing on-premises file storage is often heavy, undifferentiated work with high acquisition costs and the operational burden of setup and administration, which leads to scalability challenges. In this session, we show how you can take advantage of AWS fully managed file solutions to stop worrying about the administrative overhead of configuring, securing, maintaining, and backing up your file infrastructure.
Visualizing analytical data is a challenge many organizations face. Being able to create dashboards and alerts, add predictions to your data, and act on them quickly is a need for every modern business. Join our architects to learn how Amazon QuickSight lets you add business intelligence to your applications and create forecasts from your data. Amazon QuickSight is a scalable, serverless business intelligence service built for the cloud, through which you can turn your business data into insights for making informed decisions without worrying about managing, scaling, or keeping your compute infrastructure available.
1) The document discusses the benefits of migrating Big Data workloads to AWS, including making it easier to build data lakes and analytics, offering a broader range of services, and providing more secure and scalable infrastructure.
2) The Amazon EMR platform is presented for running Big Data applications in a managed way on AWS, delivering better performance at lower cost compared with on-premises clusters.
3) The separation of compute and storage in Amazon EMR allows
AI in the Workplace Reskilling, Upskilling, and Future Work.pptxSunil Jagani
Discover how AI is transforming the workplace and learn strategies for reskilling and upskilling employees to stay ahead. This comprehensive guide covers the impact of AI on jobs, essential skills for the future, and successful case studies from industry leaders. Embrace AI-driven changes, foster continuous learning, and build a future-ready workforce.
Read More - https://bit.ly/3VKly70
The Microsoft 365 Migration Tutorial For Beginner.pptxoperationspcvita
This presentation will help you understand the power of Microsoft 365. We cover every productivity app included in Office 365, outline common migration scenarios, and explain how we can help you.
You can also read: https://www.systoolsgroup.com/updates/office-365-tenant-to-tenant-migration-step-by-step-complete-guide/
What is an RPA CoE? Session 1 – CoE VisionDianaGray10
In the first session, we will review the organization's vision and how this has an impact on the COE Structure.
Topics covered:
• The role of a steering committee
• How do the organization’s priorities determine CoE Structure?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
This talk will cover the ScyllaDB architecture from a cluster-level view and zoom in on data distribution and internal node architecture. In the process, we will learn the secret sauce behind ScyllaDB's high availability and superior performance. We will also touch on upcoming changes to the ScyllaDB architecture: the move to strongly consistent metadata and tablets.
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectorsDianaGray10
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations, for seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Northern Engraving | Nameplate Manufacturing Process - 2024Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
Must Know Postgres Extension for DBA and Developer during MigrationMydbops
Mydbops Opensource Database Meetup 16
Topic: Must-Know PostgreSQL Extensions for Developers and DBAs During Migration
Speaker: Deepak Mahto, Founder of DataCloudGaze Consulting
Date & Time: 8th June | 10 AM - 1 PM IST
Venue: Bangalore International Centre, Bangalore
Abstract: Discover how PostgreSQL extensions can be your secret weapon! This talk explores how key extensions enhance database capabilities and streamline the migration process for users moving from other relational databases like Oracle.
Key Takeaways:
* Learn about crucial extensions like oracle_fdw, pgtt, and pg_audit that ease migration complexities.
* Gain valuable strategies for implementing these extensions in PostgreSQL to achieve license freedom.
* Discover how these key extensions can empower both developers and DBAs during the migration process.
* Don't miss this chance to gain practical knowledge from an industry expert and stay updated on the latest open-source database trends.
Mydbops Managed Services specializes in taking the pain out of database management while optimizing performance. Since 2015, we have been providing top-notch support and assistance for the top three open-source databases: MySQL, MongoDB, and PostgreSQL.
Our team offers a wide range of services, including assistance, support, consulting, 24/7 operations, and expertise in all relevant technologies. We help organizations improve their database's performance, scalability, efficiency, and availability.
Contact us: info@mydbops.com
Visit: https://www.mydbops.com/
Follow us on LinkedIn: https://in.linkedin.com/company/mydbops
For more details and updates, please follow the links below.
Meetup Page : https://www.meetup.com/mydbops-databa...
Twitter: https://twitter.com/mydbopsofficial
Blogs: https://www.mydbops.com/blog/
Facebook(Meta): https://www.facebook.com/mydbops/
The Department of Veteran Affairs (VA) invited Taylor Paschal, Knowledge & Information Management Consultant at Enterprise Knowledge, to speak at a Knowledge Management Lunch and Learn hosted on June 12, 2024. All Office of Administration staff were invited to attend and received professional development credit for participating in the voluntary event.
The objectives of the Lunch and Learn presentation were to:
- Review what KM ‘is’ and ‘isn’t’
- Understand the value of KM and the benefits of engaging
- Define and reflect on your “what’s in it for me?”
- Share actionable ways you can participate in Knowledge Capture & Transfer
"Choosing proper type of scaling", Olena SyrotaFwdays
Imagine an IoT processing system that is already quite mature and production-ready, whose client coverage keeps growing, making scaling and performance life-and-death questions. The system has Redis, MongoDB, and stream processing based on ksqlDB. In this talk, we will first analyze scaling approaches and then select the proper ones for our system.
Essentials of Automations: Exploring Attributes & Automation ParametersSafe Software
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
Introducing BoxLang : A new JVM language for productivity and modularity!Ortus Solutions, Corp
Just like life, our code must adapt to the ever changing world we live in. From one day coding for the web, to the next for our tablets or APIs or for running serverless applications. Multi-runtime development is the future of coding, the future is to be dynamic. Let us introduce you to BoxLang.
Dynamic. Modular. Productive.
BoxLang redefines development with its dynamic nature, empowering developers to craft expressive and functional code effortlessly. Its modular architecture prioritizes flexibility, allowing for seamless integration into existing ecosystems.
Interoperability at its Core
With 100% interoperability with Java, BoxLang seamlessly bridges the gap between traditional and modern development paradigms, unlocking new possibilities for innovation and collaboration.
Multi-Runtime
From the tiny 2m operating system binary to running on our pure Java web server, CommandBox, Jakarta EE, AWS Lambda, Microsoft Functions, Web Assembly, Android and more. BoxLang has been designed to enhance and adapt according to its runnable runtime.
The Fusion of Modernity and Tradition
Experience the fusion of modern features inspired by CFML, Node, Ruby, Kotlin, Java, and Clojure, combined with the familiarity of Java bytecode compilation, making BoxLang a language of choice for forward-thinking developers.
Empowering Transition with Transpiler Support
Transitioning from CFML to BoxLang is seamless with our JIT transpiler, facilitating smooth migration and preserving existing code investments.
Unlocking Creativity with IDE Tools
Unleash your creativity with powerful IDE tools tailored for BoxLang, providing an intuitive development experience and streamlining your workflow. Join us as we embark on a journey to redefine JVM development. Welcome to the era of BoxLang.
In our second session, we shall learn all about the main features and fundamentals of UiPath Studio that enable us to use the building blocks for any automation project.
📕 Detailed agenda:
Variables and Datatypes
Workflow Layouts
Arguments
Control Flows and Loops
Conditional Statements
💻 Extra training through UiPath Academy:
Variables, Constants, and Arguments in Studio
Control Flow in Studio
As AI technology pushes into IT, I found myself wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations view. Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and provide a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premises strategy we may need to apply it to our own infrastructure and get it working from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working in practice.
Keywords: AI, Containeres, Kubernetes, Cloud Native
Event Link: https://meine.doag.org/events/cloudland/2024/agenda/#agendaId.4211
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite optimization efforts that go as far as sacrificing core functionality, state-of-the-art hashtable designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state of the art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design (1) offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. On a commodity server with a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
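To make the closed-addressing idea concrete, here is a minimal single-threaded Python sketch of bounded chaining: each bucket is a fixed-size block of slots (standing in for a cache line) that chains to an overflow bucket only when full, and a delete frees its slot instantly for reuse. This is only an illustration of the indexing scheme; the real DLHT adds lock-free concurrency, software prefetching, and non-blocking parallel resizing, and the slot count per bucket here is an assumption.

```python
SLOTS_PER_BUCKET = 7  # assumption: slots that fit one cache line alongside a next-pointer

class Bucket:
    def __init__(self):
        self.slots = [None] * SLOTS_PER_BUCKET  # (key, value) pairs or None
        self.next = None                        # overflow bucket in the bounded chain

class ChainedTable:
    def __init__(self, n_buckets=1024):
        self.buckets = [Bucket() for _ in range(n_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        b = self._bucket(key)
        free = None
        while True:                             # scan the whole chain for the key
            for i, slot in enumerate(b.slots):
                if slot is not None and slot[0] == key:
                    b.slots[i] = (key, value)   # key exists: update in place
                    return
                if slot is None and free is None:
                    free = (b, i)               # remember the first free slot seen
            if b.next is None:
                break
            b = b.next
        if free is None:                        # chain grows only when every slot is taken
            b.next = Bucket()
            free = (b.next, 0)
        fb, fi = free
        fb.slots[fi] = (key, value)

    def get(self, key):
        b = self._bucket(key)
        while b is not None:
            for slot in b.slots:
                if slot is not None and slot[0] == key:
                    return slot[1]
            b = b.next
        return None

    def delete(self, key):
        b = self._bucket(key)
        while b is not None:
            for i, slot in enumerate(b.slots):
                if slot is not None and slot[0] == key:
                    b.slots[i] = None           # slot is reusable immediately, no tombstone
                    return True
            b = b.next
        return False
```

Unlike open addressing, a delete here never needs a tombstone or a stop-the-world cleanup: the freed slot is simply reused by the next insert that hashes to the same bucket.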
"NATO Hackathon Winner: AI-Powered Drug Search", Taras KlobaFwdays
This is a session that details how PostgreSQL's features and Azure AI Services can be effectively used to significantly enhance the search functionality in any application.
In this session, we'll share insights on how we used PostgreSQL to facilitate precise searches across multiple fields in our mobile application. The techniques include using LIKE and ILIKE operators and integrating a trigram-based search to handle potential misspellings, thereby increasing the search accuracy.
We'll also discuss how the azure_ai extension on PostgreSQL databases in Azure and Azure AI Services were utilized to create vectors from user input, a feature beneficial when users wish to find specific items based on text prompts. While our application's case study involves a drug search, the techniques and principles shared in this session can be adapted to improve search functionality in a wide range of applications. Join us to learn how PostgreSQL and Azure AI can be harnessed to enhance your application's search capability.
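The trigram technique mentioned above can be modeled in a few lines. This is a rough Python approximation of what PostgreSQL's pg_trgm extension does for misspelling-tolerant search: similarity is the Jaccard index over three-character substrings of the padded string, and the 0.3 cutoff mirrors pg_trgm's default threshold (the padding and threshold here follow pg_trgm's documented behavior; the drug names are invented for illustration).

```python
def trigrams(word):
    # pg_trgm pads with two leading spaces and one trailing space before slicing
    padded = "  " + word.lower() + " "
    return {padded[i:i + 3] for i in range(len(padded) - 2)}

def similarity(a, b):
    # Jaccard index over the two trigram sets, as pg_trgm's similarity() computes
    ta, tb = trigrams(a), trigrams(b)
    return len(ta & tb) / len(ta | tb)

def fuzzy_search(query, names, threshold=0.3):
    # Return candidate names above the threshold, best match first
    return sorted((n for n in names if similarity(query, n) > threshold),
                  key=lambda n: -similarity(query, n))
```

In PostgreSQL itself the equivalent query would use the `%` operator backed by a GIN trigram index, so the matching happens inside the database rather than in application code.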
10. Global Infrastructure
• Over 1 million active customers
across 190 countries
• 2,300 government agencies
• 7,000 educational institutions
• 16 regions (2 more in 2016/2017: France, China)
• 42 availability zones
• 70 edge locations
11. *Gartner, Magic Quadrant for Cloud Infrastructure as a Service, Worldwide; Leong, Lydia; Petri, Gregor; Gill, Bob; Dorosh, Mike; August 3, 2016
This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available
upon request from AWS : http://www.gartner.com/doc/reprints?id=1-2G2O5FC&ct=150519&st=sb
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings
or other designation. Gartner research publications consist of the opinions of Gartner's research
organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of
merchantability or fitness for a particular purpose.
AWS is positioned highest in
execution and furthest in vision within
the Leaders Quadrant
AWS Positioned as a Leader
in the Gartner Magic Quadrant
for Cloud Infrastructure as
a Service, Worldwide
12. The fastest growing multi-billion dollar enterprise IT company in the world
& The Rise Of The New Guard
13. “AWS continues to grow at a nearly unbelievable rate, even as
credible and very deep-pocketed competitors have emerged.
That seems to indicate an almost insatiable demand from
businesses that want to offload computing and storage tasks to third
party providers rather than build more data centers.”
Where Are We In The Cloud’s Evolution?
Fortune, October 27th 2016
14. Where Are We In The Cloud’s Evolution?
New Normal
In 2014
15. New Normal Control Over
Your Own Destiny
In 2014 In 2015
Where Are We In The Cloud’s Evolution?
16. S U P E R P O W E R S
With AWS, It Can Feel Like You Have Been Given
18. Most Robust, Fully-Featured
Technology Infrastructure Platform
HYBRID ARCHITECTURE
Data Backups
Integrated App
Deployments
Direct
Connect
Identity
Federation
Integrated Resource
Management
Integrated
Networking
VMware
Integration
MARKETPLACE
Business
Apps
Databases
DevOps
Tools
Networking Security Storage
Business
Intelligence
INFRASTRUCTURE
Availability
Zones
Points of
Presence
Regions
CORE SERVICES
Compute
VMs, Auto-scaling, Load
Balancing, Containers, Cloud
functions
Storage
Object, Blocks, File,
Archivals,
Import/Export
Databases
Relational, NoSQL,
Caching, Migration
CDN
Networking
VPC, DX,
DNS
Access Control
Identity
Management
Key Management
& Storage
Monitoring
& Logs
SECURITY & COMPLIANCE
Resource &
Usage Auditing
Configuration
Compliance
Web application
firewall
Assessment and
reporting
TECHNICAL & BUSINESS SUPPORT
Support
Professional
Services
Account
Management
Partner
Ecosystem
Solutions
Architects
Training &
Certification
Security &
Billing Reports
Optimization
Guidance
ENTERPRISE APPS
Backup
Corporate
Email
Sharing &
Collaboration
Virtual
Desktops
IoT
Rules
Engine
Registry
Device
Shadows
Device
Gateway
Device
SDKs
DEVELOPMENT & OPERATIONS
MOBILE SERVICES
APP SERVICES
ANALYTICS
Data
Warehousing
Hadoop/
Spark
Streaming Data
Collection
Machine
Learning
Elastic
Search
Push
Notifications
Identity
Sync
Resource
Templates
One-click App
Deployment
Triggers
Containers
DevOps Resource
Management
Application Lifecycle
Management
API
Gateway
Transcoding
Queuing &
Notifications
Email
Workflow
Search
Streaming Data
Analysis
Business
Intelligence
Mobile
Analytics
Single Integrated
Console
Mobile App
Testing
Data
Pipelines
Petabyte-Scale
Data Migration
Database
Migration
Schema
Conversion
Application
Migration
MIGRATION
19. Pace Of Innovation:
New Capabilities Daily
In 2011, we released over 80 significant services and features; in 2012, nearly 160; in 2013, 280; in 2014, 516; and in 2015, 722. In 2016, we launched 1,017 new services and features. As of March 1st, we have launched 131 new features and services in 2017.
21. The Ability to Understand Your Customers And
Business Better Through Analytics
Real-time
streaming
data
Data
Warehouse
Hadoop, Spark,
HBase, Hive,
Presto, Mahout,
Pig, Zeppelin
Elasticsearch Business
Intelligence
Machine
Learning
Amazon
Kinesis
Amazon
Redshift
Amazon
EMR
Amazon
Elasticsearch
Amazon
QuickSight
Amazon
Machine Learning
27. Amazon Polly
Text To Speech Powered By Deep Learning
Generally Available Today
2
28. Amazon Polly: Text In, Life-like Speech Out
Amazon Polly
“The temperature
in WA is 75°F”
“The temperature
in Washington is 75 degrees
Fahrenheit”
29. Amazon Lex
(It’s what’s inside Alexa)
Preview Available Today
3
Natural Language Understanding (NLU) &
Automatic Speech Recognition (ASR) Powered By Deep Learning
30. Amazon Lex: Speech Recognition
& Natural Language Understanding
Amazon Lex
Automatic Speech Recognition
Natural Language Understanding
“What’s the weather
forecast?”
Weather
Forecast
31. Amazon Lex: Speech Recognition
& Natural Language Understanding
Amazon Lex
Automatic Speech Recognition
Natural Language Understanding
“What’s the weather
forecast?”
“It will be sunny
and 75°F”
Weather
Forecast
32. Amazon Lex: Build Natural, Conversational
Interactions In Voice & Text
Integrated
development in the
AWS console
Fully
managed
Trigger
Lambda
functions
Continually improving
ASR & NLU models
Enterprise
connectors
Multi-step
conversations
34. Startups Are Breathing New Life Into Virtually Every Industry
Accommodation
Payments Communication Trading Visual Search
Healthcare
Teams
Commerce Education Customer Relations
35. SaaS Companies Are Breathing New Life
Into Virtually Every Industry
ERP
Content management Business Applications
Operation Intelligence
Data Integration
Communication CRM
Healthcare
44. Amazon Aurora: Speed And Availability Of Commercial
Databases, With Cost-Effectiveness Of Open Source
Up To 5x Performance
Of High-end MySQL
Highly Available
and Durable
MySQL
Compatible
1/10th The Cost Of
Commercial Grade Databases
Fastest Growing
AWS Service, Ever
46. Migrating Databases To AWS
20,000+
databases migrated
Migrate between
on-prem and AWS
Migrate between
databases
Automated schema
conversion
Data replication for
zero downtime migrations
55. VMware Cloud On AWS
Seamlessly migrate workloads
Manage with your existing VMware tools
Run the same VMware software on AWS that
you run in your data centers
Nuna Health provides analytics-based healthcare insights to government agencies. Using AWS, Nuna Health has a powerful analytics platform that complies with HIPAA, ITAR, and other patient-privacy and insurance regulations, enabling it to help government agencies like the Centers for Medicare and Medicaid Services find patterns in vast amounts of data.
Robinhood, maker of a no-fee, mobile brokerage app that is disrupting the brokerage industry by democratizing stock trading, used AWS to create its massively scalable trading app with strong built-in security and compliance features that supported hundreds of thousands of users at launch.
Dollar Shave Club, a grooming-product-subscription service that is disrupting the shaving industry by neutralizing the distribution advantage of big name brands and offering cheaper, better razors online, launched all-in on AWS. The company has since added more than 3.2 million subscribers and is on pace to top $200 million in sales in 2016.
Intercom, a startup helping businesses talk to their customers: with services such as website chat, targeted marketing campaigns and customer support. Headcount has doubled in the past year, to 250, while revenue has quadrupled into the tens of millions of dollars.
OTHERS
• Pinterest
• AirBnB
• Stripe
• Slack
• Domo
• Periscope
• Oil & Gas: Shell, BP, Hess
• Healthcare: Merck, Pfizer, J&J, MNS
• Technology: Netflix, Samsung, Adobe
• Manufacturing: GE, Phillips, Schneider Electric
• Financial Services: Cap One, Finra, Intuit, Commonwealth Bank Of Australia
• Insurance: Liberty Mutual, Aviva, <more coming>
• CPG: <pending>
• Retail: ShopDirect, Ocado, Best Buy
• Media: Financial Times, Dow Jones, Newscorp, Conde Nast
UPDATE FOR REINVENT:
more than 2,300 government agencies,
7,000 education institutions
more than 22,000 nonprofit organizations
55 premiere,
Thousands of SIs
GSIs: Accenture, Cognizant, Booz Allen, Infosys, Wipro;
Born in Cloud: 2nd Watch, Bulletproof, Dedalus, Cloudreach, Slalom, InfoReliance and Smartronics
And if you read the report, it states: “Many times the aggregate size of all other providers in the market”
This analysis presents two interesting situations:
There is a real shift happening in the market map right now. The older companies are contracting, or barely growing, while at the same time new companies are growing to position themselves as the technology partners of the new decade.
Talk track: Most of these old companies, like MSFT, are contracting rather than growing with the change in the market.
Ok. For the footnote, can you change it to "Last Reported Quarterly Revenue YoY Growth"
The cloud has come a long way.
A reporter mused after our Q3 earnings that “AWS continues to grow at a nearly unbelievable rate, even as credible and very deep-pocketed competitors have emerged. That seems to indicate an almost insatiable demand from businesses that want to offload computing and storage tasks to third-party providers rather than build more data centers.”
Reason is, as we said in 2014, cloud is the new normal.
We talked about why cloud has become so compelling: cap ex to op ex, lower variable expense, elasticity up and down, agility, no undifferentiated heavy lifting, global presence quickly.
But this doesn’t necessarily explain why developers, dev managers, CIOs are SO passionate about the cloud movement.
Talked last year about what was really going on: that the cloud gave builders freedom and control of their own destiny, which is very motivating.
We talked about the freedom it gave builders
But as we continue to talk to builders, we’ve realized it’s more than just owning your own destiny.
Because the cloud and AWS gives builders so many more capabilities or tools than builders have had on premises, the cloud feels like it arms builders with super powers.
Kind of like the super heroes we watched on TV as kids (and now in movies) and kind of like what the secret organizations hand James Bond or Tom Cruise in Mission Impossible before they go try to save the world.
In tech and builders hands, these tools give them super powers to change their business and customer experiences and make the world a better place.
So if the cloud and AWS give builders super powers, what are some of these super powers.
I’ll share a few of these today, the first is…
Single biggest reason companies are moving to the cloud is for agility and speed.
The ability to be able to get from idea to implementation several orders of magnitude faster in the cloud and AWS than on-prem
There are two primary reasons:
1/ Unlike on-premises, where it typically takes 10 to 12 weeks to get a server to try an experiment, in the cloud you can spin up thousands of servers in minutes.
2/ Even more importantly, because AWS puts 70+ services at your disposal, builders don’t have to spend the time building and deploying all the infrastructure software services. They can use an unmatched collection of capabilities from AWS to either migrate existing apps to the cloud or invent altogether new apps or businesses faster than ever before.
42 AZs in 16 regions
68 POPs
Vs MSFT: 30 “regions” (really just DCs); use Akamai for CDN
MP: AWS customers use over 300 million hours a month of Amazon EC2 for AWS Marketplace products. Overall, the AWS Marketplace offers 35 categories, and more than 3,500 software listings from more than 1,100 ISVs. 112% growth since last year.
1/ robust tech infra platform with 70+ services;
2/ much more functionality, by large margin, than any other provider;
3/ won't go through all of this in detail, but at a high level…
4/ Regions/AZs/POPs;
5/ Building blocks;
6/ Security + Compliance;
7/ App Services;
8/ Analytics;
9/ IoT;
10/ Mobile Services;
11/ Migration services;
12 Hybrid capabilities;
13/ Enterprise Apps;
14/ People Services; 15/ MP
1000 if we ship 70% of what’s in the roadmap by December
One company that has been leveraging the breadth and depth of the AWS cloud is Enel. Here to tell you more is Fabio…
Pronunciation: Fabio Ver-o-nay-zay
It has never been more cost-effective and easy to collect, store, process, and analyze information than it is today with the cloud. And the equation has worked thanks to the services AWS has provided to our customers, such as being able to run managed Hadoop, Spark, and Presto on our EMR service.
Full EMR list: Hadoop, Ganglia, HBase, Hive, HCatalog, Hue, Mahout, Oozie, Phoenix, Pig, Presto, Spark, Tez, Zeppelin, ZooKeeper, Flink
Supercell: process 45B in game events per day to obtain key analytics and feed real time dashboards and applications
Phillips HealthSuite: use AWS to securely capture, analyze, and store 15PB+ of patient data gathered from 390M imaging studies, medical records, and patient inputs;
NTT Docomo: 6PB DW
A new collection of tools called AMAZON AI, which brings the power of artificial intelligence, in all its flavors, to all developers. We’re announcing the first three today, focused on making deep learning models as easy as possible to use.
The first is Rekognition.
Lets take a look at how this works…
* upload an image via API or SDK; Rekognition will analyze the contents of the image and tell you the objects inside (person, woman, car, steering wheel)...makes it easy to add advanced search (e.g., give me women-driving-cars pics)
* will detect faces and number of people so can generate thumbnails and crop faces well
* can detect sentiment and details in faces (smiling, glasses)
* face matching (good for security...e.g. Unlock laptops)
The service is designed to be very easy to use - all this sophistication is available at the end of an API, and it can perform batch processing of millions of images in an S3 bucket, or real time analysis as photos are uploaded to S3.
And because this is in the cloud we can continually improve the model and make those improvements available to everyone, and use our economies of scale to deliver this in a very low cost way.
Second is polly, which takes text, and uses deep learning to generate life like speech.
47 different voices across 27 different languages.
The basics are pretty simple, but the service has deep functionality.
You can send the service a simple string of text, and it will generate a life-like voice in your choice of 47 different voices. But it’s not naive about the context of the text. For example, the text here - ‘WA’ and ‘degree F’ - would sound strange if spoken out loud literally, like I just had to. Instead, Polly will automatically expand the text strings ‘WA’ and ‘degree F’ to ‘Washington’ and ‘degrees Fahrenheit’, to create more life-like speech. The developer doesn’t have to do anything - just send the text, and get a life-like voice back.
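The normalization step described above can be illustrated with a toy function (this is not Amazon Polly's actual implementation, and the abbreviation table is an invented example): expand units and abbreviations into their spoken forms before synthesis.

```python
import re

# Assumed lookup table for illustration; a real TTS front end would be far larger
ABBREVIATIONS = {"WA": "Washington", "CA": "California"}

def normalize_for_speech(text):
    # Expand temperatures like "75°F" into "75 degrees Fahrenheit"
    text = re.sub(r"(\d+)\s*°F", r"\1 degrees Fahrenheit", text)
    # Expand known state abbreviations when they appear as standalone words
    for abbr, full in ABBREVIATIONS.items():
        text = re.sub(rf"\b{abbr}\b", full, text)
    return text
```

Feeding the slide's example through it turns "The temperature in WA is 75°F" into "The temperature in Washington is 75 degrees Fahrenheit", which is the transformation the Polly slide depicts.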
With Lex, any application running on the web, a mobile app, or a device, can send natural language - as both text or speech - to Lex using an API or SDK. Lex will apply ASR and NLU to the incoming message to understand the intent of the user, so to understand what the question is, and map that to a Lambda function which will process the information, and…
Then form a response, which will be passed back to the user as either text, or will use Polly automatically to generate a voice response.
there's an integrated development console that lets you define what phrases you want to elicit what responses or additional questions...in either text or speech (where lambda triggers the response back)…
The service has an integrated development console - including the ability to quickly specify example phrases, which Lex will expand to capture more possible input from your users, and to build sophisticated multi-step conversations - and you can build and test these models, or bots, right in the AWS console.
You can then embed these through an API in your apps, services or devices, and take advantage of a continually improving deep learning models on the services.
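The request flow described above (utterance → intent → handler → response) can be sketched in a few lines. This is a hedged toy model, not the Lex API: real Lex applies deep-learning ASR/NLU, whereas this keyword matcher and the `GetWeatherForecast` intent name are invented for illustration, with a plain function standing in for the Lambda handler.

```python
def recognize_intent(utterance):
    # Toy NLU: keyword matching stands in for Lex's deep-learning models
    text = utterance.lower()
    if "weather" in text or "forecast" in text:
        return "GetWeatherForecast"
    return "Fallback"

def weather_handler(utterance):
    # Stand-in for the Lambda function that would fulfill the intent
    return "It will be sunny and 75 degrees Fahrenheit"

HANDLERS = {"GetWeatherForecast": weather_handler}

def handle(utterance):
    # Route the recognized intent to its handler and return the response text
    intent = recognize_intent(utterance)
    handler = HANDLERS.get(intent, lambda u: "Sorry, I didn't understand that.")
    return handler(utterance)
```

In the real service the response text could then be passed to Polly to produce a spoken reply, closing the voice loop the slides describe.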
We provide a collection of enterprise connectors:
Salesforce, Microsoft Dynamics, Marketo
Zendesk
Quickbooks
Hubspot
Which help you build new interfaces around existing enterprise data, and the whole thing is fully managed.
And to give us more detail on how to use this, and a live demo, the one and only, Dr. Matt Wood.
Talk about being born with immortality or acquiring it as a superpower. Can we extend our vitality with today's advances, or do we have to be born with enhanced DNA?
We don't yet have the answer to that, but it is very applicable to business: it is extremely difficult for a company to persist over a long period and maintain its success and position. If we look at the first Fortune 500 list from 1955, only 12% of those companies are still on the list today.
And in business there are variables that matter, such as understanding and acting on market dynamics, knowing the competition, and staying at the cutting edge.
BUT IF YOU HAVE A CHANCE TO LIVE FOREVER in business, one thing that is absolutely clear, at least to me, is that you HAVE to take advantage of technological evolution. The only certain thing we have seen over the last 10 years, and will surely keep seeing over the next 10, is that technology has evolved and changed at a very fast pace.
Pronunciation: A-neel Bus-ri
Aneel Bhusri
1/ lots of enterprises have figured out that they need to evolve their tech quickly to survive long term
2/ if your plan is to fight gravity, largely remain on premises, have a small fraction of the capabilities that all your competitors and peer cos and start-ups have (with a higher cost structure), very tough to compete even in medium term
3/ to me, biggest mistake enterprises can make is to give up their ability to adjust to what rapidly tech changes are allowing...
4/ can see several cos (Qantas, GE, Motorola Solutions, Capital One, Discovery Communications)
5/ an example of a long standing enterprise who gets this and is reinventing itself via the cloud is McDonalds...to share their journey, here's Tom Gergets the CTO
Pronunciation: Tom Ger-GETS (hard G both times)
MORE MAIL LIKE
Andy Q: “should we consider sharing that this is now over a $150M revenue run rate business in less than a year and a half? that's not valuation, that's revenue run rate...there are businesses valued at over $1B on that revenue run rate…"
Expedia, a leading online travel company, migrated its Lodging Inventory System to Amazon Aurora for greater scalability, enabling it to capture and analyze more than 300 million updates per day at a rate of more than 20,000 writes per second.
FanDuel, a fantasy sports website, uses Aurora to scale to meet the demands of more than 200,000 users in the run-up to major NFL games.
Intercom provides a way for Internet-based businesses to communicate with customers at scale. It moved its live databases—with more than two billion rows—to a new data store in Amazon Aurora with almost no downtime or any lost records.
Netflix, an online-media streaming provider delivering almost seven billion hours of videos to nearly 50 million customers in 60 countries each quarter, uses Amazon Aurora to run a priority-messaging system that serves the encoding system, which generates petabytes of streams and playback artifacts to support the growing content catalog.
14,000 databases migrated with DMS since January 2016
Customers have asked about postgres as a very popular alternative, even closer semantically to Oracle
Pronunciation: Steve Ran-ditch
You can see how many customers are excited about using Aurora and migrating from older-guard databases to Aurora; and we've launched 20 features for Aurora since launch. So, what's the top customer request for Aurora? By far, it's PostgreSQL compatibility. Why?
One of the cool things about being able to shape-shift is that you can be any shape you like. Relate that to computing: some workloads you want to run all in the cloud, and some have to stay on premises.
We know it is not a binary decision.
NEW HYBRID EXAMPLES
J&J consider their data center “limitless” by extending it with AWS, and they have built tools like Xbot, which scans both environments to identify any compliance deviation or vulnerability. They also run a hybrid SAP model.
Comcast, the world’s largest cable company and the leading provider of Internet in the U.S., uses AWS in a hybrid environment to innovate and deploy features for its flagship video product, XFINITY X1, several times a week instead of once every 12-18 months under its old architecture.
Bristol-Myers Squibb has a mission to deliver innovative medicines. By using AWS to build a secure self-service portal, Bristol-Myers can take advantage of on-demand capacity to run clinical trial simulations 98 percent faster than in its previous environment, enabling more efficient clinical trials and better patient outcomes.
Practically every company virtualizes with VMware.
TRANSITION TO PAT
Question 1: Pat, welcome. It’s been several weeks now since we announced VMware cloud on AWS. I’m sure you’ve had conversations about this with a bunch of your customers. What’s the reaction been like?
[Answer: Pat talks about the tremendous excitement he’s seen. Hundreds of customers have already told us they are planning to use this.]
Question 2A: Can you share a few examples how your customers are planning to use VMware cloud on AWS?
[Answer: Pat talks about the huge interest he’s seen and talks to three examples of new customers who are planning to use VMware Cloud on AWS: Merck, Liberty Mutual, State of Louisiana. Pat also wants to say that he’s seen strong interest from the GSI community and that Accenture, CSC, Deloitte & Cap all told him they are planning to use VMware Cloud on AWS with their customers.]
Question 2B: How about the system integrators?
Question 3: So, you chose one partner to do this managed, operated-by-VMware service … why did you choose to do it with us (other than my charm)?
[Answer: Pat to talk to 3-4 reasons. Pat’s team is working on a script for him that covers the following reasons that I suggested: AWS is the leading public cloud provider with the broadest and deepest platform; the strength of AWS’s security and compliance capabilities, needed for their enterprise customers; the largest partner ecosystem; and the global infrastructure footprint.]
We have spent the last 10 years analyzing and watching the evolution of the cloud together. We were pioneers 10 years ago, and since then it has been a grind, with people asking whether anyone would use the cloud, whether it would be more than a startup thing, whether government would use it, whether mission-critical applications would run on it. And we have reached the point where the conversation is no longer about “whether or not we are going to take advantage of the cloud,” but rather: when? And how?
We firmly believe that over the next 10 years we will see a phase of innovation even greater than the explosion of innovation we have delivered over the last 10. That is an advantage for all of you: when this morning ends, go back to your companies thinking about evolution and how to power it with our support, so you can build with these tools, create business at levels you have not yet imagined, and, most importantly, take your existing businesses and transform the end-user experience to become that company that persists forever.
Please enjoy the rest of the sessions.
Invention and failure as inseparable twins
Talk about the future. The first 10 years in the cloud has led to ungodly amount of invention and change, and the next 10 will be even more. Think about how much time people wasted in the first 10 years arguing about the cloud. The next generation and the next 10 years is going to take all that energy and think about changing customer experience forever. Today we had 13 new announcements. Over the year it will be ~1000 new features and services, which means that every day as a builder you can wake up and use all this new technology, and you only need to pay for what you actually use. This changes what is possible. Whether you're at a startup and building your business, or an enterprise figuring out how to evolve.
Option to also weave in the invention/failure point: Anyone who is an innovator knows that innovation and failure are inseparable twins, so really need to innovate at a high pace to figure out what works. Have to be working on your next failure