Businesses are generating more data than ever before.
Doing real-time data analytics requires IT infrastructure that often needs to be scaled up quickly, and running an on-premises environment in this setting has its limitations.
Organisations often require a massive amount of IT resources to analyse their data, and the upfront capital cost can deter them from embarking on these projects.
What’s needed is scalable, agile and secure cloud-based infrastructure at the lowest possible cost, so they can spin up servers that support their data analysis projects exactly when they are required. This infrastructure must enable them to create proofs of concept quickly and cheaply – to fail fast and move on.
Using real time big data analytics for competitive advantage | Amazon Web Services
Many organisations find it challenging to successfully perform real-time data analytics using their own on-premises IT infrastructure. Building a system that can adapt and scale rapidly to handle dramatic increases in transaction loads can be a costly and time-consuming exercise.
Most of the time, infrastructure is under-utilised and it’s near impossible for organisations to forecast the amount of computing power they will need in the future to serve their customers and suppliers.
To overcome these challenges, organisations can instead utilise the cloud to support their real-time data analytics activities. Scalable, agile and secure, cloud-based infrastructure enables organisations to quickly spin up infrastructure to support their data analytics projects exactly when it is needed. Importantly, they can ‘switch off’ infrastructure when it is not.
BluePi Consulting and Amazon Web Services (AWS) are giving you the opportunity to discover how organisations are using real time data analytics to gain new insights from their information to improve the customer experience and drive competitive advantage.
AWS Big Data and Analytics Services Speed Innovation | AWS Public Sector Summ... | Amazon Web Services
Data-driven agencies face extreme data integration and analytics challenges. Decades of point solutions have solved specific mission problems while creating valuable data stores. However, these data stores are not integrated and are stored in information silos. AWS's powerful data ingestion and integration services now allow agencies to rapidly store more in data lakes for deeper analytics. Join this discussion on how FAA and other agencies have leveraged AWS data integration and analytic services to optimize and innovate with their previously untapped information silos. Learn More: https://aws.amazon.com/government-education/
AWS re:Invent 2016 | HLC301 | Data Science and Healthcare: Running Large Scale... | Amazon Web Services
Working with Amazon Web Services (AWS) and 1Strategy, an Advanced AWS Consulting Partner, the Cambia Health Data Sciences teams have been able to deploy HIPAA-compliant, secure Amazon EMR (Elastic MapReduce) data pipelines in the cloud. In this session, we will dive deep into the architectural components of this solution, and you will learn how utilizing AWS services has helped Cambia decrease processing time for analytics, increase application flexibility and accelerate speed to production. The second part of the session covers machine learning and its role in reducing cost and improving quality of care. The healthcare community must rely on advanced analytics and machine learning to analyze multiple facets of healthcare data and process it at scale to gain insights on the things that matter. You will learn why AWS is a well-suited platform for machine learning. We will take you through the steps of building a machine learning model using Amazon ML for the real-world problem of predicting patient readmissions.
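As a rough illustration of the modelling idea behind readmission prediction (this is not Amazon ML’s actual API, and the features and coefficients below are invented for the example), a logistic model scores a patient record like this:

```python
import math

def readmission_risk(features, weights, bias):
    """Score a patient record with a logistic model: sigmoid(w.x + b)."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [age_normalized, prior_admissions, chronic_conditions]
weights = [0.8, 1.2, 0.9]   # illustrative coefficients, not learned values
bias = -2.0
risk = readmission_risk([0.7, 2, 1], weights, bias)
print(round(risk, 3))  # a probability-like score between 0 and 1
```

In practice the weights would be learned from historical admission records; the sketch only shows the scoring step.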
Antoine Genereux takes us on a detailed overview of the Database solutions available on the AWS Cloud, addressing the needs and requirements of customers at all levels. He also discusses Business Intelligence and Analytics solutions.
Join us for a series of introductory and technical sessions on AWS Big Data solutions. Gain a thorough understanding of what Amazon Web Services offers across the big data lifecycle and learn architectural best practices for applying those solutions to your projects.
We will kick off this technical seminar in the morning with an introduction to the AWS Big Data platform, including a discussion of popular use cases and reference architectures. In the afternoon, we will deep dive into Machine Learning and Streaming Analytics. We will then walk everyone through building your first Big Data application with AWS.
AWS re:Invent 2016: How to Build a Big Data Analytics Data Lake (LFS303) | Amazon Web Services
For discovery-phase research, life sciences companies have to support infrastructure that processes millions to billions of transactions. The advent of a data lake to accomplish such a task is showing itself to be a stable and productive data platform pattern to meet the goal. We discuss how to build a data lake on AWS, using services and techniques such as AWS CloudFormation, Amazon EC2, Amazon S3, IAM, and AWS Lambda. We also review a reference architecture from Amgen that uses a data lake to aid in their Life Science Research.
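A data lake on Amazon S3 typically organises objects under a partitioned key scheme so query engines can prune by date. A minimal sketch (the zone, source and dataset names here are hypothetical, not from the Amgen architecture):

```python
from datetime import date

def lake_key(zone, source, dataset, d, filename):
    """Build a partitioned S3 object key: zone/source/dataset/year=/month=/day=/file."""
    return (f"{zone}/{source}/{dataset}/"
            f"year={d.year}/month={d.month:02d}/day={d.day:02d}/{filename}")

key = lake_key("raw", "lims", "assay-results", date(2016, 11, 30), "batch-001.parquet")
print(key)  # raw/lims/assay-results/year=2016/month=11/day=30/batch-001.parquet
```

The `year=/month=/day=` convention is the Hive-style partitioning that tools such as EMR can exploit when filtering by date.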
AWS re:Invent 2016: Visualizing Big Data Insights with Amazon QuickSight (BDM... | Amazon Web Services
Amazon QuickSight is a fast BI service that makes it easy for you to build visualizations, perform ad-hoc analysis, and quickly get business insights from your data. QuickSight is built to harness the power and scalability of the cloud, so you can easily run analysis on large datasets and support hundreds of thousands of users. In this session, we’ll demonstrate how you can easily get started with Amazon QuickSight: uploading files, connecting to S3 and Redshift, and creating analyses from visualizations that are optimized based on the underlying data. Once we’ve built our analysis and dashboard, we’ll show you how easy it is to share it with colleagues and stakeholders in just a few seconds. And with SPICE – QuickSight’s in-memory calculation engine – you can go from data to insights faster than ever.
AWS re:Invent 2016: How Fulfillment by Amazon (FBA) and Scopely Improved Resu... | Amazon Web Services
We’ll share an overview of leveraging serverless architectures to support high performance data intensive applications. Fulfillment by Amazon (FBA) built the Seller Inventory Authority Platform (IAP) using Amazon DynamoDB Streams, AWS Lambda functions, Amazon Elasticsearch Service, and Amazon Redshift to improve results and reduce costs. Scopely will share how they used a flexible logging system built on Kinesis, Lambda, and Amazon Elasticsearch to provide high-fidelity reporting on hotkeys in Memcached and DynamoDB, and drastically reduce the incidence of hotkeys. Both of these customers are using managed services and serverless architecture to build scalable systems that can meet the projected business growth without a corresponding increase in operational costs.
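The hotkey-detection idea Scopely describes can be sketched independently of Kinesis and Elasticsearch: count accesses per key over a time window and flag outliers. A simplified, stream-agnostic version (the key names and threshold are invented):

```python
from collections import Counter

def find_hotkeys(access_log, window, threshold):
    """Flag keys whose access count within a window exceeds a threshold.

    access_log: iterable of (timestamp, key); window: (start, end) in the
    same time units; threshold: max acceptable hits per key per window.
    """
    start, end = window
    counts = Counter(k for t, k in access_log if start <= t < end)
    return {k: n for k, n in counts.items() if n > threshold}

log = [(1, "user#42"), (2, "user#42"), (2, "user#7"), (3, "user#42")]
print(find_hotkeys(log, window=(0, 10), threshold=2))  # {'user#42': 3}
```

In a Kinesis-plus-Lambda deployment, the log entries would arrive as stream records and the flagged keys would be indexed for reporting.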
AWS re:Invent 2016: Setting the Stage for Instant Success: Getting the Most O... | Amazon Web Services
Ticketmaster will share their playbill for managing and optimizing their AWS deployment in real time. Learn how this ticket-sales giant was able to deploy the Cloudyn solution across their significant AWS footprint, train their teams on it, and drive its utilization - and get rapid results.
Ticketmaster relies on a large and dynamic AWS workload of thousands of instances across multiple regions and availability zones. As their reliance on the cloud increases, so does their need for actionable insights. Using Cloudyn, Ticketmaster is now able to confidently grow their cloud and focus efforts on expanding their core business.
In this session, presenters will also demonstrate how to use big data analytics to manage and optimize cloud investments, and leverage best practices to realize cloud potential. Attendees will also hear about cloud industry trends, the latest developments in optimization and how to plan for 2017. Session sponsored by Cloudyn.
AWS re:Invent 2016: Continuous Compliance in the AWS Cloud for Regulated Life... | Amazon Web Services
Life sciences organizations running regulated workloads in the cloud can move from point-in-time testing of their environment to near real-time testing to achieve continuous compliance with the mandates of auditors and regulatory bodies. Get deep insights into some of the AWS services used to accomplish continuous compliance, such as Amazon CloudTrail, Amazon CloudWatch, AWS Config, Amazon VPC, Amazon S3, and Amazon EC2. Hear real-world use cases of how heavily regulated environments within Merck maintain governance and control over a shared environment. We also discuss the automated tools used by Merck to eliminate manual processes and streamline IT management.
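Conceptually, a continuous-compliance check is a rule evaluated repeatedly over a live resource inventory, the way an AWS Config rule flags non-conformant resources. A toy example (the inventory shape and bucket names are invented for illustration):

```python
def evaluate_encryption_rule(buckets):
    """Flag bucket descriptions lacking default encryption (a Config-style rule, simplified)."""
    return [b["name"] for b in buckets if not b.get("encrypted", False)]

inventory = [
    {"name": "trial-data", "encrypted": True},
    {"name": "scratch-exports"},            # no encryption configured
]
print(evaluate_encryption_rule(inventory))  # ['scratch-exports']
```

A real deployment would feed the rule from AWS Config or CloudTrail events rather than a hand-built list, and route findings to remediation.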
AWS Compute Overview: Servers, Containers, Serverless, and Batch | AWS Public... | Amazon Web Services
The AWS Compute platform has expanded EC2 instance types including FPGA and new GPU instances. There are also other ways to run workloads in AWS including Lambda (serverless), ECS (managed Docker), and AWS Batch (batch computing). This session will cover the newest instance types in EC2 and review AWS Lambda, ECS, and Batch. Learn More: https://aws.amazon.com/government-education/
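For the serverless option, an AWS Lambda function in Python is just a handler that takes an event and a context; a minimal sketch (the event shape here is hypothetical):

```python
def handler(event, context):
    """Minimal AWS Lambda-style handler: summarize the 'values' field of the event."""
    values = event.get("values", [])
    return {"count": len(values), "total": sum(values)}

# Locally, the handler can be invoked directly with a sample event:
print(handler({"values": [3, 4, 5]}, None))  # {'count': 3, 'total': 12}
```

In Lambda itself, the same function would be wired to an event source (S3, Kinesis, API Gateway) instead of being called by hand.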
AWS re:Invent 2016: High Performance Cinematic Production in the Cloud (MAE304) | Amazon Web Services
The process of making a film is highly complex, and comprises multiple workflows across story development, pre-production, production, post-production and final distribution. Given the size and amount of media and assets associated with each stage, high-performance infrastructure is often essential to meeting deadlines.
In this session, we will take a deeper dive into running a full cinematic production in the cloud, with a focus on solutions for each of the production stages. We will also look at best practices around design, optimization, performance, scheduling, scalability and low latency, utilizing AWS technologies such as EC2, Lambda, Snowball, Direct Connect, and Partner Solutions.
Understanding AWS Managed Database and Analytics Services | AWS Public Sector... | Amazon Web Services
The world is creating more data in more ways than ever before. The average internet user in 2017 generates 1.5GB of data per day, with the rate doubling every 18 months. A single autonomous vehicle can generate 4TB per day. Each smart manufacturing plant generates 1PB per day. Storing, managing, and analyzing this data requires integrated database and analytic services that provide reliability and security at scale. AWS offers a range of managed data services that let customers focus on making data useful, including Amazon Aurora, RDS, DynamoDB, Redshift, Spectrum, ElastiCache, Kinesis, EMR, Elasticsearch Service, and Glue. In this session, we discuss these services, share our vision for innovation, and show how our customers use these services today. Learn More: https://aws.amazon.com/government-education/
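The per-user growth figure quoted above can be turned into a quick projection: 1.5 GB/day doubling every 18 months implies, for example, 6 GB/day per user after three years.

```python
def projected_gb_per_day(base_gb=1.5, months=0, doubling_months=18):
    """Per-user daily data volume if the 2017 rate doubles every 18 months."""
    return base_gb * 2 ** (months / doubling_months)

print(projected_gb_per_day(months=36))  # 6.0  (two doublings: 1.5 -> 3 -> 6)
```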
Building analytics applications requires more than just one good service. It requires the ability to capture a vast amount of data and react to data changes in real time. It requires flexible tools that let end users work in the way they are most productive, and that address the needs of both data consumers and data scientists. This analysis won't just be about data exploration and reports; it must be able to support the largest-scale, most complex machine learning and deep learning models imaginable. Across it all, strong governance, security, and cataloguing are essential. In this session, come hear how to build a full-stack analytics application using AWS services. We'll see how to capture static and dynamic data in real time and react to data changes. We'll see AWS services that perform analytics from drag-and-drop, through simple query-on-files, to exascale data science. At the end, we'll have a data lake architecture that will meet the demands of the most sophisticated analytics customers for many years to come.
AWS Speaker: Ian Robinson, Specialist Solution Architect, Big Data and Analytics, EMEA - Amazon Web Services
ENT305 Migrating Your Databases to AWS: Deep Dive on Amazon Relational Databa... | Amazon Web Services
Amazon RDS allows you to launch an optimally configured, secure and highly available database with just a few clicks. It provides cost-efficient and resizable capacity, automates time-consuming database administration tasks, and provides you with six familiar database engines to choose from: Amazon Aurora, Oracle, Microsoft SQL Server, PostgreSQL, MySQL and MariaDB. In this session, we will take a close look at the capabilities of Amazon RDS and explain how it works. We’ll also discuss the AWS Database Migration Service and AWS Schema Conversion Tool, which help you migrate databases and data warehouses with minimal downtime from on-premises and cloud environments to Amazon RDS and other Amazon services. Gain your freedom from expensive, proprietary databases while providing your applications with the fast performance, scalability, high availability, and compatibility they need.
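As a sketch of what "a few clicks" corresponds to programmatically, the parameters below follow the shape that boto3's `rds.create_db_instance` call accepts; the identifier, instance class and sizes are illustrative values, not recommendations:

```python
def rds_launch_params(identifier, engine="postgres", instance_class="db.m4.large",
                      storage_gb=100, multi_az=True):
    """Assemble parameters in the shape boto3's rds.create_db_instance expects."""
    return {
        "DBInstanceIdentifier": identifier,
        "Engine": engine,                 # one of the six supported engines
        "DBInstanceClass": instance_class,
        "AllocatedStorage": storage_gb,   # in GB
        "MultiAZ": multi_az,              # synchronous standby for high availability
    }

params = rds_launch_params("orders-db")
# With credentials configured, the real call would be:
# import boto3; boto3.client("rds").create_db_instance(**params)
print(params["MultiAZ"])  # True
```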
Amazon Web Services gives you fast access to flexible, low-cost IT resources, so you can rapidly scale and build virtually any big data and analytics application – including data warehousing, clickstream analytics, fraud detection, recommendation engines, event-driven ETL, serverless computing, and internet-of-things processing – regardless of the volume, velocity, and variety of your data.
In this one-hour webinar, we will look at the portfolio of AWS Big Data services and how they can be used to build a modern data architecture.
We will cover:
Using different SQL engines to analyze large amounts of structured data
Analysing streaming data in near-real time
Architectures for batch processing
Best practices for Data Lake architectures
This session is suited for:
Solution and enterprise architects
Data architects/ Data warehouse owners
IT & Innovation team members
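The near-real-time streaming topic above usually boils down to windowed aggregation. A minimal tumbling-window sketch, independent of any particular AWS service (the click events are invented sample data):

```python
from collections import defaultdict

def tumbling_counts(events, width):
    """Aggregate (timestamp, value) events into fixed tumbling windows of `width` seconds."""
    windows = defaultdict(float)
    for ts, value in events:
        windows[ts - ts % width] += value   # bucket by window start time
    return dict(windows)

clicks = [(0, 1), (3, 1), (12, 1), (14, 1), (14, 1)]
print(tumbling_counts(clicks, width=10))  # {0: 2.0, 10: 3.0}
```

A managed streaming service would apply the same logic continuously as records arrive, rather than over a finished list.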
Real time data analytics - part 1 - backend infrastructure | Amazon Web Services
Building a real-time data analysis infrastructure is a challenging task that requires experienced engineers. With AWS services, you can do it in a matter of minutes, scale it easily to handle almost unlimited load, and keep costs low. This session is an opportunity to see a live demo of building such an infrastructure using a combination of Amazon Kinesis, Redshift, DynamoDB, EMR and CloudSearch to collect, process and share data.
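On the collection side, a producer shapes events into the record format Amazon Kinesis expects: a `Data` payload plus a `PartitionKey` that spreads load across shards. A minimal sketch with the actual `put_records` call left commented out (the stream name and event fields are hypothetical):

```python
import json

def kinesis_batch(events, key_field):
    """Shape events into PutRecords entries (Data bytes + PartitionKey),
    matching the shape boto3's kinesis.put_records expects."""
    return [{"Data": json.dumps(e).encode(), "PartitionKey": str(e[key_field])}
            for e in events]

records = kinesis_batch([{"user": "a1", "page": "/home"}], key_field="user")
# With credentials configured, the real call would be:
# import boto3; boto3.client("kinesis").put_records(StreamName="clicks", Records=records)
print(records[0]["PartitionKey"])  # a1
```

Choosing a high-cardinality partition key (here the user ID) is what lets Kinesis distribute throughput evenly across shards.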
Thinking through how you want to run Microsoft Windows Server and application workloads on AWS is straightforward, when you have a game plan. Understanding which service to leverage– like Amazon EC2, Amazon RDS, and Directory Services to name a few – will accelerate the process further. There are also a number of new enhancements to help make things even easier. In this session we will walk through how to think about mapping to the various AWS services available so you can get your deployment or migration project off to the right start. Think of this session as the decoder ring between your on-premises deployment and what you can expect from the AWS cloud for your Microsoft Windows Server and applications.
Application Delivery on Amazon Web Services for Developers | Amazon Web Services
Every developer has gone through the frustration of creating new features, fixing bugs, or refactoring beautiful code, and then waiting for it to reach the promised land of production. Come and learn how to get your changes into the hands of your customers with more speed, reliability, security and quality.
Speaker: Daniel Zoltak & Shiva Narayanaswamy, Solutions Architects, Amazon Web Services
Getting Started with EC2 Spot - November 2016 Webinar Series | Amazon Web Services
Amazon EC2 Spot instances are spare EC2 instances that you can bid on to run your cloud computing applications. Since Spot instances are often available at a lower price, you can significantly reduce the cost of running your applications, grow your application’s compute capacity and throughput for the same budget, and enable new types of cloud computing applications.
Learn how to launch a fleet of EC2 Spot instances in minutes to run your applications for less than 1 cent per core.
Learning Objectives:
* How Spot works
* Spot Best Practices
* How you can automate your fleet management
* Spot blocks for 1-6 hr tasks
* Get started in minutes with the new Spot console
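The cost argument for Spot is easy to quantify. With illustrative prices (not current quotes), the savings over On-Demand for a fixed workload come out directly:

```python
def spot_savings(on_demand_hourly, spot_hourly, hours):
    """Cost delta from running a workload on Spot instead of On-Demand."""
    on_demand = on_demand_hourly * hours
    spot = spot_hourly * hours
    return {"on_demand": on_demand, "spot": spot,
            "savings_pct": round(100 * (1 - spot / on_demand), 1)}

# Hypothetical $/hour figures for the same instance type:
print(spot_savings(0.10, 0.02, hours=100))
# e.g. {'on_demand': 10.0, 'spot': 2.0, 'savings_pct': 80.0}
```

The same arithmetic is why Spot suits interruptible, horizontally scalable workloads: the discount compounds across every instance-hour.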
In this session, we will walk through the fundamentals of Amazon Virtual Private Cloud (VPC). First, we will cover build-out and design fundamentals for VPC, including picking your IP space, subnetting, routing, security, NAT, and much more. We will then transition into different approaches and use cases for optionally connecting your VPC to your physical data center with VPN or AWS Direct Connect. This mid-level architecture discussion is aimed at architects, network administrators, and technology decision-makers interested in understanding the building blocks AWS makes available with VPC and how you can connect this with your offices and current data center footprint.
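"Picking your IP space" and subnetting can be prototyped with Python's standard `ipaddress` module, for example carving /24 subnets out of a /16 VPC block (the CIDR ranges are just the common RFC 1918 example):

```python
import ipaddress

def carve_subnets(vpc_cidr, new_prefix, count):
    """Carve the first `count` equal-sized subnets out of a VPC CIDR block."""
    vpc = ipaddress.ip_network(vpc_cidr)
    return [str(s) for s in list(vpc.subnets(new_prefix=new_prefix))[:count]]

print(carve_subnets("10.0.0.0/16", new_prefix=24, count=3))
# ['10.0.0.0/24', '10.0.1.0/24', '10.0.2.0/24']
```

Sketching the layout this way before creating subnets helps avoid overlaps with your existing data-center ranges when you later connect via VPN or Direct Connect.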
Amazon QuickSight is a fast, cloud-powered business intelligence (BI) service that makes it easy to build visualizations, perform ad-hoc analysis, and quickly get business insights from your data. In this session, we demonstrate how you can point Amazon QuickSight to AWS data stores, flat files, or other third-party data sources and begin visualizing your data in minutes. We also introduce you to SPICE - a Super-fast, Parallel, In-memory Calculation Engine in Amazon QuickSight - which performs advanced calculations and renders visualizations rapidly without requiring any additional infrastructure, SQL programming, or dimensional modeling, so you can seamlessly scale to hundreds of thousands of users and petabytes of data. Lastly, you will see how Amazon QuickSight provides smart visualizations and graphs optimized for your different data types, ensuring the most suitable visualization for your analysis, and how to share these visualization stories using the built-in collaboration tools.
Microservices on AWS: Divide & Conquer for Agility and Scalability – Amazon Web Services
To tackle complexity and change, AWS customers are increasingly evolving their architectures from monoliths towards microservices, and benefiting from increased agility, simplified scalability, resiliency, and faster deployments. However, microservices also introduce new technical challenges. In this session, we'll provide an introduction and overview of the benefits and challenges of microservices, and share best practices for architecting and deploying microservices on AWS.
Curious about AWS Mobile Services and latest updates? Attend this session for a deep dive on recent updates to AWS Mobile Services aimed at helping you build scalable, reliable, and feature-rich mobile apps. We’ll dig into the new features and discuss the relevant use cases. Specifically, we will cover the following releases: Amazon Cognito Your User Pools - Add sign-up and sign-on to your mobile apps, Amazon Simple Notification Service Global SMS - Send SMS messages to phone numbers in 200+ countries, and AWS Device Farm Remote Access - Gesture, swipe, and interact with iOS and Android devices in real time, directly from your web browser.
AWS re:Invent 2016: Building and Growing a Successful AWS User Group (DCS203) – Amazon Web Services
Our panelists lead AWS user groups in San Francisco, Yokohama, Munich, and Shanghai. In this session, they share the stories behind how their groups began, list best practices for sustaining a technical meetup over time, and offer advice for AWS enthusiasts considering starting a new user group in their city.
Build a Text Enabled Keg-orator Robot with Alexa, AWS IoT & AWS Lambda – Amazon Web Services
Learn how to build a text enabled robot that will take your beer order, serve your pint, and notify you when it is ready, all while keeping an eye on your consumption so that you wake up on time the next morning. In this demo-heavy workshop, we will use the Zipwhip Texterator as the platform on which we will show you how to use Alexa, AWS Lambda, and AWS IoT to build the ultimate beer serving device.
AWS DevDay San Francisco, June 21, 2016.
Presenter: John Rotach, SDE, AWS IoT
This is the complete deck presented at the Westin Calgary Hotel, on August 16th, 2016.
It covers the current state of the AWS Big Data solution set, contains several use cases of Big Data and Machine Learning, and includes a tutorial on how to implement and use Big Data on the AWS Cloud Platform.
Instructor: Ivan Cheng, Solution Architect, AWS
Join us for a series of introductory and technical sessions on AWS Big Data solutions. Gain a thorough understanding of what Amazon Web Services offers across the big data lifecycle and learn architectural best practices for applying those solutions to your projects.
We will kick off this technical seminar in the morning with an introduction to the AWS Big Data platform, including a discussion of popular use cases and reference architectures. In the afternoon, we will deep dive into Machine Learning and Streaming Analytics. We will then walk everyone through building your first Big Data application with AWS.
Understanding AWS Managed Database and Analytics Services | AWS Public Sector Summ... – Amazon Web Services
The world is creating more data in more ways than ever before. The average internet user in 2017 generates 1.5GB of data per day, with the rate doubling every 18 months. A single autonomous vehicle can generate 4TB per day. Each smart manufacturing plant generates 1PB per day. Storing, managing, and analyzing this data requires integrated database and analytic services that provide reliability and security at scale. AWS offers a range of managed data services that let customers focus on making data useful, including Amazon Aurora, RDS, DynamoDB, Redshift, Spectrum, ElastiCache, Kinesis, EMR, Elasticsearch Service, and Glue. In this session, we discuss these services, share our vision for innovation, and show how our customers use these services today. Learn More: https://aws.amazon.com/government-education/
(BDT201) Big Data and HPC State of the Union | AWS re:Invent 2014 – Amazon Web Services
Leveraging big data and high performance computing (HPC) solutions enables your organization to make smarter and faster decisions that influence strategy, increase productivity, and ultimately grow your business. We kick off the Big Data and HPC track with the latest advancements in data analytics, databases, storage, and HPC at AWS. Hear customer success stories and discover how to put data to work in your own organization.
Understand the Big Data ecosystem on the Cloud and the building blocks that help you build applications for Data Mining and Visualization. Also learn from LatentView Analytics how they built “PanelMiner”, a platform that efficiently transforms unstructured HTML data into structured data, to gain insights about consumer behavior from large data sets.
Presenter:
Ganesh Raja, Solution Architect, Amazon Internet Services
Ganesh Sankarlingam, Head of Delivery (US West Coast), LatentView Analytics
Shrirang Bapat, Vice President – Engineering, Pubmatic
This overview presentation discusses big data challenges and provides an overview of the AWS Big Data Platform by covering:
- How AWS customers leverage the platform to manage massive volumes of data from a variety of sources while containing costs.
- Reference architectures for popular use cases, including connected devices (IoT), log streaming, real-time intelligence, and analytics.
- The AWS big data portfolio of services, including Amazon S3, Kinesis, DynamoDB, Elastic MapReduce (EMR), and Redshift.
- The latest relational database engine, Amazon Aurora: a MySQL-compatible, highly available relational database engine, which provides up to five times better performance than MySQL at one-tenth the cost of a commercial database.
Created by: Rahul Pathak,
Sr. Manager of Software Development
What is Innovation? How can cloud computing help you innovate? How can you make your applications smarter and predictive? How can you interpret data and anticipate trends? Explore AWS Artificial Intelligence solutions - Machine Learning, Rekognition, Polly - and serverless building blocks such as Lambda and Step Functions.
Understanding AWS Managed Databases and Analytic Services - AWS Innovate Otta... – Amazon Web Services
• Overview of database services to elevate your applications, analytic services to engage your data, and migration services to help you reach database freedom.
• Survey of how Canadian and other organizations are using the cloud to make data scalable, reliable, and secure.
Amazon Web Services provides a broad range of services that help you build and deploy big data analytics applications quickly and easily. AWS offers fast access to flexible, low-cost IT resources, letting you rapidly scale virtually any big data application, including data warehousing, clickstream analytics, fraud detection, recommendation engines, event-driven ETL, serverless computing, and Internet of Things processing. With AWS you don't need to make large upfront investments of time or money to build and maintain infrastructure. Instead, you can provision exactly the right type and size of resources needed to power your big data analytics applications. You can access as many resources as you need, almost instantly, and pay only for what you use.
The New Normal: Benefits of Cloud Computing and Defining your IT Strategy – Amazon Web Services
The standard business model is changing rapidly. Companies used to be built for the long haul. But now, success is powered by rapid-paced innovation and the ability to get disruptive products to market first.
You’re used to balancing resources between keeping things running and the development of new initiatives. But merely keeping the lights on doesn't differentiate you from your competitors.
Modern apps and services are leveraging data to change the way we engage with users in a more personalized way. Skyla Loomis talks big data, analytics, NoSQL, SQL and how IBM Cloud is open for data.
Learn more by visiting our Bluemix Hybrid page: http://ibm.co/1PKN23h
Learn best practices for building a real-time streaming data architecture on AWS with Spark Streaming, Amazon Kinesis, and Amazon Elastic MapReduce (EMR). Get a closer look at how to ingest streaming data scalably and durably from data producers like mobile devices, servers, and even web browsers, and design a stream processing application with minimal data duplication and exactly-once processing.
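One common way to achieve the "minimal data duplication and exactly-once processing" mentioned above is to checkpoint per-shard sequence numbers and skip anything already processed. The sketch below is a hypothetical consumer, not Kinesis or Spark API code; the record shape and field names are invented for illustration:

```python
# Sketch: turning at-least-once delivery into exactly-once processing
# by checkpointing per-shard sequence numbers. Record shape is invented.

def process_stream(records, checkpoints, handle):
    """Apply `handle` once per record; skip records at or below the
    last checkpointed sequence number for their shard."""
    for rec in records:
        shard, seq = rec["shard_id"], rec["sequence"]
        if seq <= checkpoints.get(shard, -1):
            continue              # duplicate from a retry/re-read: drop it
        handle(rec)
        checkpoints[shard] = seq  # checkpoint only after a successful handle

seen = []
checkpoints = {}
batch = [
    {"shard_id": "s1", "sequence": 1, "data": "a"},
    {"shard_id": "s1", "sequence": 2, "data": "b"},
]
process_stream(batch, checkpoints, seen.append)
process_stream(batch, checkpoints, seen.append)  # replayed batch: nothing reprocessed
print(len(seen))  # 2
```

Checkpointing after the handler succeeds (never before) is what keeps a crash-and-replay from losing records; the cost is that side effects in the handler must tolerate an occasional retry.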
Presented by: Guy Ernest, Principal Business Development Manager, Amazon Web Services
Customer Guest: Harry Koch, Solutions Architecture, Philips
Horses for Courses: Database Roundtable – Eric Kavanagh
The blessing and curse of today's database market? So many choices! While relational databases still dominate the day-to-day business, a host of alternatives has evolved around very specific use cases: graph, document, NoSQL, hybrid (HTAP), column store, the list goes on. And the database tools market is teeming with activity as well. Register for this special Research Webcast to hear Dr. Robin Bloor share his early findings about the evolving database market. He'll be joined by Steve Sarsfield of HPE Vertica, and Robert Reeves of Datical in a roundtable discussion with Bloor Group CEO Eric Kavanagh. Send any questions to info@insideanalysis.com, or tweet with #DBSurvival.
Shawn Gandhi, head of Solutions Architecture for AWS Canada, takes us on a journey through Big Data and the different strategies and services available to implementers and practitioners.
Using AWS to design and build your data architecture has never been easier, helping you gain insights and uncover new opportunities to scale and grow your business. Join this workshop to learn how you can gain insights at scale with the right big data applications.
How to build Forecasting services using ML and deep learning algorithms... – Amazon Web Services
Forecasting is an important process for a great many companies, used across many domains to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a temporal component, and then use an algorithm that produces an accurate forecast based on the type of data analyzed.
Big Data for Startups: how to build Big Data applications in Serverless mode... – Amazon Web Services
The variety and quantity of data created every day keeps accelerating and represents a unique opportunity to innovate and create new startups.
However, managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment accessible only to established companies. But the elasticity of the Cloud and, in particular, Serverless services let us break these limits.
Let's see how to develop Big Data applications quickly, without worrying about the infrastructure, dedicating all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago, Amazon went through a radical transformation aimed at increasing the pace of innovation. During this period we learned how changing our approach to application development allowed us to significantly increase agility and release speed and, ultimately, enabled us to build more reliable and scalable applications. In this session we will illustrate how we define modern applications and how building modern apps affects not only the application architecture, but also the organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to spend up to 90% less with containers and Spot instances – Amazon Web Services
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can take advantage of Spot instances, leading to an average saving of 70% compared to On-Demand instances. In this session we will discover the characteristics of Spot instances and how they can easily be used on AWS. We will also learn how Spreaker uses Spot instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda:
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make your startup's market offering unique with Machine Learning services... – Amazon Web Services
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative components created ad hoc.
AWS provides ready-to-use services and, at the same time, lets you customize and create the differentiating elements of your offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, also through a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: automate the management and deployments of... – Amazon Web Services
Con l'approccio tradizionale al mondo IT per molti anni è stato difficile implementare tecniche di DevOps, che finora spesso hanno previsto attività manuali portando di tanto in tanto a dei downtime degli applicativi interrompendo l'operatività dell'utente. Con l'avvento del cloud, le tecniche di DevOps sono ormai a portata di tutti a basso costo per qualsiasi genere di workload, garantendo maggiore affidabilità del sistema e risultando in dei significativi miglioramenti della business continuity.
AWS mette a disposizione AWS OpsWork come strumento di Configuration Management che mira ad automatizzare e semplificare la gestione e i deployment delle istanze EC2 per mezzo di workload Chef e Puppet.
Scopri come sfruttare AWS OpsWork a garanzia e affidabilità del tuo applicativo installato su Instanze EC2.
Microsoft Active Directory on AWS to support your Windows Workloads – Amazon Web Services
Do you want to know the options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we will discuss the options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and deploying Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and being refined at a rapid pace. In this webinar we will explore the possibilities offered by AWS services for applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are organizing a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in VMware vSphere®-based cloud environments and access a wide range of AWS services, fully exploiting the potential of the AWS cloud while protecting existing VMware investments.
Build your first serverless ledger-based app with QLDB and NodeJS – Amazon Web Services
Many companies today build applications with ledger-type functionality, for example to verify the history of credits and debits in banking transactions, or to track the supply chain flow of their products.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable transaction log, but they are complex and costly tools to manage.
Amazon QLDB eliminates the need to build custom, complex systems by providing a fully managed serverless ledger database.
In this session we will find out how to build a complete serverless application that uses QLDB's capabilities.
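To illustrate (this is not the QLDB API itself) how a ledger database can make its transaction log immutable and cryptographically verifiable, here is a minimal hash-chained log in Python; the entry and field names are invented for the example:

```python
import hashlib
import json

# Illustrative sketch, not QLDB: a hash-chained ledger where each entry
# commits to the previous one, so any tampering breaks verification.

GENESIS = "0" * 64  # placeholder hash before the first entry

def append_entry(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else GENESIS
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash from the genesis; any altered entry fails."""
    prev = GENESIS
    for e in chain:
        body = json.dumps({"prev": prev, "payload": e["payload"]}, sort_keys=True)
        if e["prev"] != prev or e["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = e["hash"]
    return True

ledger = []
append_entry(ledger, {"account": "A", "debit": 50})
append_entry(ledger, {"account": "B", "credit": 50})
print(verify(ledger))                   # True: chain intact
ledger[0]["payload"]["debit"] = 5000    # tamper with history
print(verify(ledger))                   # False: tampering detected
```

QLDB maintains a journal with this kind of chained digest internally, which is what lets it offer verification without the operational burden of running such a system yourself.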
With the rise of microservices architectures and rich mobile and web applications, APIs are more important than ever for delivering an exceptional user experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open-source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dive into several scenarios, seeing how AppSync can help solve these use cases by creating modern APIs with real-time and offline data update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Databases and VMware Cloud™ on AWS: debunking the myths – Amazon Web Services
Many organizations take advantage of the cloud by migrating their Oracle workloads, securing significant benefits in agility and cost efficiency.
Migrating these workloads can create complexity during application modernization and refactoring, and performance risks can be introduced when moving applications out of on-premises data centers.
In these slides, AWS and VMware experts present simple, practical tips to facilitate and simplify the migration of Oracle workloads while accelerating the transformation to the cloud; they dive into the architecture and demonstrate how to fully exploit the potential of VMware Cloud™ on AWS.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies the management of Docker containers through an orchestration layer controlling deployment and lifecycle. In this session we will present the main features of the service, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.
JMeter webinar - integration with InfluxDB and Grafana – RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
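For a flavor of what flows between JMeter and InfluxDB in this setup, the sketch below formats a JMeter-style sample as InfluxDB line protocol, the wire format a backend listener writes and Grafana later queries. The measurement, tag, and field names are illustrative, not JMeter's exact ones:

```python
# Sketch: a JMeter-style sample as InfluxDB line protocol:
#   measurement,tag=value,... field=value,... timestamp
# Names below are illustrative; real JMeter listeners use their own schema.

def to_line_protocol(measurement, tags, fields, ts_ns):
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

line = to_line_protocol(
    "jmeter",
    {"transaction": "login", "status": "ok"},   # indexed dimensions
    {"response_time_ms": 87, "count": 1},       # measured values
    1700000000000000000,                        # nanosecond timestamp
)
print(line)
```

Each load-test sample becomes one such line; InfluxDB indexes the tags, and a Grafana dashboard can then chart response time per transaction in real time.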
DevOps and Testing slides at DASA Connect – Kari Kakkonen
Slides by Rik Marselis and me from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps means. We also held a lovely workshop with the participants, exploring different ways to think about quality and testing in different parts of the DevOps infinity loop.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Epistemic Interaction - tuning interfaces to provide information for AI support – Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Connector Corner: Automate dynamic content and events by pushing a button – DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
PHP Frameworks: I want to break free (IPC Berlin 2024) – Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024 – Tobias Schneck
As AI technology pushes into IT, I wondered, as an “infrastructure container Kubernetes guy”, how does this fancy AI technology get managed from an infrastructure operations view? Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and provide you a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply it to our own infrastructure from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and what could be beneficial for, or limiting to, your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working for real.
UiPath Test Automation using UiPath Test Suite series, part 4 – DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Key Trends Shaping the Future of Infrastructure – Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
"Impact of front-end architecture on development cost", Viktor TurskyiFwdays
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... – Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
2. What is Big Data?
Volume: quantum of data; TB to PB of data
Velocity: speed of data; millisecond latency
Variety: types of data; hundreds of data sources
Veracity: quality of data; varies greatly and affects the accuracy of analysis
Value: business relevance; how does it help the business?
25+ TB of data being generated per second globally
90+% of world's data created in last 2 years
90+% of data generated is unstructured
3. Evolution of Big Data Processing
Speed of analysis (batch to real-time) vs. type of analytics (descriptive, predictive, prescriptive):
Descriptive (batch): dashboards; traditional query & reporting. What & why it happened?
Real-time: alerts, analysis & detection; what is going wrong, fraudulent use. It is happening!
Predictive: prediction engines; inventory forecasting, cross-sell analysis. Probability of 'x' happening
Prescriptive: recommendation engines; routes, content recos. What to do if 'x' happens
4. Big Data was built for the Cloud
Big Data: potentially massive datasets. AWS Cloud: massive, virtually unlimited capacity
Big Data: iterative, experimental style of data manipulation & analysis. AWS Cloud: on-demand infrastructure allows iterative, experimental deployment/usage
Big Data: frequently not a steady-state workload; peaks & valleys. AWS Cloud: most efficient with highly variable workloads
Big Data: variety & velocity of data; management of tools complex. AWS Cloud: fully managed tools & services for structured & unstructured, batch & stream data
5. Broad, Tightly Integrated Capabilities
AWS provides the broadest platform for big data analytics today. Start here with a business case.
Pipeline: Data → Ingest/Collect → Store → Process & Analyze → Consume/Visualize → Answers & Insights
Ingest/Collect: Amazon Kinesis Firehose (real-time); AWS Import/Export Snowball (data import); AWS Direct Connect (data connect); AWS Storage Gateway (storage gateway); AWS Database Migration Service (database migration)
Store: Amazon S3 (object storage); Amazon Kinesis Streams (real-time); Amazon RDS (relational databases); Amazon DynamoDB (NoSQL databases); Amazon Elasticsearch
Process & Analyze: Amazon EMR (Hadoop, Spark, etc.); AWS Lambda and Amazon Kinesis Analytics (real-time); Amazon Redshift (data warehousing); Amazon Machine Learning
Consume/Visualize: Amazon QuickSight (BI & data visualization)
Key pipeline dimensions: time to answer (latency), throughput, cost
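As a concrete sketch of the ingest stage of the pipeline above, the snippet below batches events into a date-partitioned object in Amazon S3 using boto3. The bucket name, key layout, and `source` label are assumptions for illustration, not anything the deck prescribes.

```python
import json
from datetime import datetime, timezone

def build_object_key(source: str, event_time: datetime) -> str:
    """Build a date-partitioned S3 object key, a common layout that
    downstream stages (EMR jobs, Redshift COPY) can scan by date."""
    return (f"raw/{source}/"
            f"{event_time:%Y/%m/%d}/"
            f"events-{event_time:%H%M%S}.json")

def ingest(records: list, bucket: str, source: str) -> str:
    """Serialize a batch of records and upload it to S3.
    Credentials are resolved from the environment by boto3."""
    import boto3  # deferred so build_object_key stays testable offline
    key = build_object_key(source, datetime.now(timezone.utc))
    boto3.client("s3").put_object(
        Bucket=bucket,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )
    return key
```

Partitioning keys by date keeps later batch jobs from scanning the whole bucket when they only need one day's data.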
6. Amazon Redshift
Fast, fully managed, petabyte-scale data warehouse
• 10x better performance than traditional DBs
• Less than one tenth the cost of traditional solutions
• Simple and fully managed
• Flexible & scalable: easily change number or type of nodes
• ANSI SQL compatible: use familiar SQL clients/BI tools
• Secure: encryption, network isolation, audit & compliance
• Ideal usage patterns: sales, historical, gaming, finance, marketing, ad, social data
[Architecture diagram: SQL clients/BI tools connect via JDBC/ODBC to a leader node, which coordinates compute nodes (128 GB RAM, 16 TB disk, 16 cores each) over 10 GigE (HPC) networking; ingestion, backup and restore go through Amazon S3]
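Since the architecture above shows ingestion flowing through Amazon S3, a hedged sketch of that loading path is a Redshift COPY statement built and run from Python. The table, bucket, prefix, and IAM role names are placeholders, and the JSON/GZIP format options are one plausible choice, not something the slide specifies.

```python
def copy_from_s3(table: str, bucket: str, prefix: str, iam_role: str) -> str:
    """Build a Redshift COPY statement that bulk-loads gzipped JSON
    from S3 -- the usual high-throughput ingestion path. All
    identifiers here are illustrative placeholders."""
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{prefix}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS JSON 'auto' GZIP;"
    )

# Because Redshift is ANSI SQL compatible and speaks the PostgreSQL
# wire protocol, any PostgreSQL driver (e.g. psycopg2) can execute it:
#   conn = psycopg2.connect(host="my-cluster.example.redshift.amazonaws.com",
#                           port=5439, dbname="analytics",
#                           user="admin", password="...")
#   conn.cursor().execute(
#       copy_from_s3("events", "my-bucket", "raw/clicks/",
#                    "arn:aws:iam::123456789012:role/RedshiftCopy"))
```

COPY parallelizes the load across all compute nodes, which is why it is preferred over row-by-row INSERTs for bulk data.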
7. Amazon EMR
Quickly and cost-effectively process vast amounts of data
• Largest cloud operator of Hadoop infrastructure
• Open source & MapR distributions
• Most current Hadoop distribution
• Flexibility: decoupled compute & storage, select apps, resize
• Simple: launch a cluster in minutes, fully managed
• Scalable: provision as much capacity as needed
• Multiple pricing options: On-demand, Reserved Instances, Spot
• Typical use cases: clickstream analysis, log processing, genomics
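"Launch a cluster in minutes" maps onto EMR's RunJobFlow API. The sketch below assembles the request for a transient Spark cluster that terminates when its steps finish; the instance types, release label, and role names are illustrative assumptions, not recommendations from the deck.

```python
def spark_cluster_config(name: str, node_count: int) -> dict:
    """Assemble parameters for EMR's RunJobFlow API: a transient
    Spark cluster sized to node_count instances. Values here are
    illustrative, not prescriptive."""
    return {
        "Name": name,
        "ReleaseLabel": "emr-4.7.0",
        "Applications": [{"Name": "Spark"}],
        "Instances": {
            "MasterInstanceType": "m4.large",
            "SlaveInstanceType": "m4.large",
            "InstanceCount": node_count,
            # Transient cluster: terminate when all steps complete,
            # paying only for the processing time actually used.
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        "JobFlowRole": "EMR_EC2_DefaultRole",
        "ServiceRole": "EMR_DefaultRole",
    }

def launch(name: str, node_count: int) -> str:
    """Start the cluster and return its id."""
    import boto3  # deferred so the config builder is testable offline
    resp = boto3.client("emr").run_job_flow(**spark_cluster_config(name, node_count))
    return resp["JobFlowId"]
```

Because compute and storage are decoupled (input and output live in S3), tearing the cluster down between runs loses no data.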
8. Amazon Kinesis
Easily work with real-time streaming data
Amazon Kinesis Streams
• Build custom apps to process or analyze streaming data
• Typical use cases: log & event data collection, real-time analytics
Amazon Kinesis Firehose
• Easily load massive volumes of streaming data into S3, Redshift, Amazon Elasticsearch
• Typical use cases: digital marketing, IoT, mobile data capture
Amazon Kinesis Analytics
• Easily analyze data streams using standard SQL queries
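A custom producer for Kinesis Streams can be sketched in a few lines with boto3. The event shape and the choice of `user_id` as partition key are assumptions for illustration; the PutRecord call itself is the standard Streams API.

```python
import json

def make_record(event: dict, partition_field: str) -> dict:
    """Shape one event into the parameters PutRecord expects.
    The partition key decides which shard receives the record, so a
    high-cardinality field (e.g. a user id) spreads load evenly."""
    return {
        "Data": json.dumps(event).encode("utf-8"),
        "PartitionKey": str(event[partition_field]),
    }

def publish(stream: str, event: dict) -> None:
    """Send one event to a Kinesis stream."""
    import boto3  # deferred so make_record stays testable offline
    boto3.client("kinesis").put_record(
        StreamName=stream, **make_record(event, "user_id"))
```

On the consuming side, the Kinesis Client Library (KCL) handles shard assignment and checkpointing for applications reading this stream.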
9. Amazon Elasticsearch
Fully managed, making it easy to set up, operate & scale Elasticsearch clusters in the cloud
• Easy set-up & configuration; fully managed
• Flexible storage options
• Set up for high availability
• Seamlessly scale
• Direct access to Elasticsearch APIs
• Support for ELK; built-in Kibana
• Integration with AWS IAM for controlling access to your domain
• Integration with Amazon CloudTrail for auditing
[Architecture diagram: Amazon Route 53, Elastic Load Balancing, IAM, CloudWatch, the Elasticsearch API and CloudTrail surrounding the managed cluster]
10. Select Big Data & Analytics Customers
The vast majority of Big Data use cases deployed in the cloud today run on AWS
14. Businesses are drowning in data
2.5 quintillion bytes of data is being created every day1
90% of data in the world today has been created in the last two years2
1.7 megabytes of new information will be created every second for every human on the planet by 20203
<0.5% of all data is currently being analysed and used4
1-3. Source: https://www-01.ibm.com/software/data/bigdata/what-is-big-data.html
4. Source: https://www.technologyreview.com/business-report/big-data-gets-personal/download/?state=join#/join/
15. So what's the problem?
Internal systems can't cope: on-premise environments can't scale quickly enough for big data analytics projects to work well
Cost is prohibitive: the high capital cost of upgrading server infrastructure deters organisations from embarking on projects
Tools are outdated: data management architectures are complex and traditional data analytics tools are no longer suitable
16. Why cloud for real-time analytics?
Drives scale: offers agile and secure cloud infrastructure, provided by AWS, at a low cost
Provides clarity: easy to forecast how much computing power is needed; ensures infrastructure is not under-utilised
Empowers business: servers can be 'spun up' to support proofs of concept as required; enables organisations to go to market faster; supports a 'fail fast' culture
18. Cloud consulting services
Cloud advisory: consulting approach to identify the suitability of a move to the cloud, examining current apps, infrastructure, tools, methods and readiness
Migration and deployment: move web-based and ERP apps, including Oracle and SAP solutions, to the cloud
19. Cloud consulting services
DevOps: continuous integration, deployment and release management processes with Puppet Labs, Jenkins, Capistrano, and the ELK Stack
Managed services: proactive monitoring of AWS infrastructure, SLA-based resolution, 24x7 support, and account management
20. Analytics and product development services
Big data on cloud: process data in real time using Amazon Kinesis, Apache Kafka, AWS Lambda and Hadoop
Data warehouse on cloud: data warehouse design, management and reporting with Amazon Redshift, Amazon QuickSight and Tableau
Cloud native app and product development: microservices and event-driven architecture with tools like SQS and SNS
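Event-driven messaging between microservices, as mentioned above, can be sketched with boto3 and SQS. The queue semantics, message fields, and "order event" scenario are hypothetical, chosen only to show the shape of a SendMessage call.

```python
import json

def order_message(order_id: str, status: str) -> dict:
    """Build the body and attributes for an SQS SendMessage call.
    The event schema here is hypothetical: a small JSON body plus a
    typed attribute consumers can filter on without parsing the body."""
    return {
        "MessageBody": json.dumps({"order_id": order_id, "status": status}),
        "MessageAttributes": {
            "event_type": {"DataType": "String", "StringValue": status},
        },
    }

def emit(queue_url: str, order_id: str, status: str) -> None:
    """Publish one order event to an SQS queue."""
    import boto3  # deferred so order_message stays testable offline
    boto3.client("sqs").send_message(
        QueueUrl=queue_url, **order_message(order_id, status))
```

Pairing SQS with SNS (SNS fan-out into multiple queues) lets several services react to the same event independently.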
21. Our products and frameworks
Cloud stream: digital content, asset management, publishing workflow and video on demand
Cloudlytics: log and billing analytics, cloud automation and monitoring
CloudScale: load testing and resilience, and testing automation in the cloud
BlazeNAS: a highly available and fault-tolerant storage solution
22. Big data analytics engine
Our in-house developed big data framework, Cloudlytics 2.0, is an analytics engine that addresses applications from different domains like infrastructure, application monitoring and IoT. It gives organizations an edge over their competition by providing real-time insights which help reduce the time to market for products and services.
24. Case study: 5Abox
Analyzing IoT data in real time
5Abox is a software company building embedded solutions for the IoT world. It is focused on energy and domotics gateways, and 'VPN on request' solutions.
25. Case study: 5Abox
The problem/challenge:
Streaming real-time data
Complex transformation
Visualization
26. Case study: 5Abox
The solution:
IoT devices stream weather and voltage fluctuation data over the MQTT protocol to Cloudlytics 2.0 for real-time transformation and visualization.
The BlazeClan solution gave the customer real-time insights into weather and voltage fluctuation data.
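The case study only tells us that MQTT carries the weather and voltage data, so the sketch below invents a plausible topic layout and payload schema for a gateway publishing one reading; every name in it (topic segments, field names, broker host) is an assumption.

```python
import json
from datetime import datetime, timezone

def sensor_payload(device_id: str, voltage: float, temp_c: float) -> tuple:
    """Shape one gateway reading into an MQTT (topic, payload) pair.
    The topic layout and field names are illustrative guesses; the
    case study specifies only that MQTT is the transport."""
    topic = f"gateways/{device_id}/telemetry"
    payload = json.dumps({
        "voltage": voltage,
        "temp_c": temp_c,
        "ts": datetime.now(timezone.utc).isoformat(),
    })
    return topic, payload

# Publishing with the paho-mqtt client library would then look like:
#   client = paho.mqtt.client.Client()
#   client.connect("broker.example.com", 1883)
#   client.publish(*sensor_payload("gw-01", 231.4, 18.2))
```

On the receiving side an analytics engine subscribed to `gateways/+/telemetry` would see every device's readings as they arrive.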
Data in the 21st century is like oil in the 18th century: an immensely valuable yet largely untapped asset. As with oil, for those who see data's fundamental value and learn to extract and use it, there will be huge rewards.
Big data is typically described in terms of the 3Vs (volume, variety & velocity, all of which are ever increasing), and recently two more have been added (value & veracity):
Value: refers to the business relevance of the captured data, i.e. how does it help the business?
Veracity: refers to the quality of captured data, which varies greatly. This is important as it affects the accuracy of the analysis
Variety: refers to the nature of the captured data. You have a plethora of data sources today and hence a broad variety of data, be it log/streaming/IoT data or transactional data. Then you have, for example, file data with a fixed schema (CSV, Parquet, Avro) and file data which is schema-free (JSON, key-value). Then you have small files and large files, and I could go on
Velocity: refers to the speed at which the data is generated and processed. Today for real-time use cases we are talking about millisecond latency. 1 million reads and writes per second is becoming the norm, for example, for customers in the digital advertising business
Volume: refers to the quantity of data being generated and stored. The size of the data determines the value and potential insight, and whether it can actually be considered big data or not. Customers generating 100-150 TB a day is not very uncommon now
25+TB of data being generated per second globally
90+% of world’s data created in last 2 years
90+% of data generated is unstructured and hence needs some work before it can be meaningfully used
Now let's look at how big data processing is evolving
On the x-axis you have the speed of analysis, while on the y-axis you have the type of analytics you can derive at that speed
With batch analysis its typically descriptive analytics. Descriptive analytics answers the questions what happened and why did it happen. Descriptive analytics looks at past performance and understands that performance by mining historical data to look for the reasons behind past success or failure. Most management reporting - such as sales, marketing, operations, and finance - uses this type of post-mortem analysis. Good for dashboards, reports in response to queries, looking at trends, looking at outcomes e.g. (i) daily customer-preferences report from your web site’s click stream: helps you decide on how to optimize deals and what ad you should try next time, (ii) daily fraud reports: was there fraud yesterday.
Then comes dealing with data in real time, which moves the question from what happened to what is happening. Great for real-time alerts (what is happening now, what is going wrong now), real-time analysis (what to offer the current customer now), and real-time spending caps (a transaction gets denied because it exceeds your balance, for example)
The next phase is predictive analytics. Predictive analytics answers the question what might happen. This is when historical performance data is combined with a variety of statistical, modeling, data mining, and machine learning techniques, and occasionally external data to determine the probable future outcome of an event or the likelihood of a situation occurring
The final phase is prescriptive analytics, which goes beyond predicting future outcomes by also suggesting actions to benefit from the predictions and showing the implications of each decision option. e.g. Think of a traffic navigation app. Pick an origin and a destination — a multitude of factors get mashed together, and it advises you on different route choices, each with a predicted ETA. This is everyday prescriptive analytics at work. Prescriptive analytics can continually take in new data to re-predict and re-prescribe, thus automatically improving prediction accuracy and prescribing better decision options. So prescriptive analytics provide intelligent recommendations for the optimal next steps for almost any application or business process to drive desired outcomes. So while predictive analytics forecasts what might happen in the future, prescriptive analytics can help alter the future
Example of a retailer that offers free expedited shipping to loyal customers.
Descriptive analysis would provide the trends on which this program was structured
Based on past customer behavior, a predictive model would assume that customers will keep the majority of what they purchase with this promotion. However, one customer purchases eight items of clothing but decides to keep only one.
The retailer paid for expedited shipping with the assumption that there's this great consumer out there who bought eight items, so they're willing to invest and lose a little margin on shipping. The algorithm didn't take return behavior into account.
For this retailer, reducing its losses on "outlier" customers who don't follow what predictive analytics forecasted means having policies in place to cover itself. Using prescriptive analytics, the retailer might come up with the options of giving an in-store-only coupon to customers who make returns (to encourage another purchase in which shipping isn't a factor) or notifying customers that they must pay for return shipping
Big Data was built for the cloud, and if you aren't using the cloud for big data then either you aren't dealing with big data, or you are struggling or going to run into issues very soon. Let's understand why that's the case
With big data you are typically dealing with very large, or large and fast-growing, data sets, and with on-prem infrastructure you will run into capacity issues sooner rather than later. You have no such capacity issues with the cloud
With big data there are typically peaks and valleys and rarely a persistent volume, which creates challenges for on-prem infrastructure, as you have to provision for peak load, which is highly inefficient. The cloud, in contrast, is most efficient with highly variable workloads
Given the variety and velocity of big data you will need a set of services & tools to manage it, and managing those yourself is complex, while in the AWS cloud the same set of tools & services is fully managed
If you look at a typical big data pipeline, data comes in one side and answers/insights come out the other, with multiple stages in between: ingest, store, process & analyze, consume/visualize, with store and process repeating multiple times to shape the data into whatever format, rate or characteristic the end consuming application demands. What goes on in between is measured as time to answer (pipeline latency), pipeline throughput = f(volume, request rate), and cost
Before we get to the components that enable this, it's important to emphasize that it's imperative to start by understanding the use case, in other words the answers and insights that are required, why they are required and how they will help the business, before embarking on building out the solution and piecing together the elements to enable it. What's important is leveraging the data, not the technology stack. The technology exists today to make it all happen quickly, securely & cost efficiently!
Amazon Machine Learning is a service that makes it easy for developers of all skill levels to use machine learning technology. Amazon Machine Learning provides visualization tools and wizards that guide you through the process of creating machine learning (ML) models without having to learn complex ML algorithms and technology. Once your models are ready, Amazon Machine Learning makes it easy to obtain predictions for your application using simple APIs, without having to implement custom prediction generation code, or manage any infrastructure. Amazon Machine Learning is based on the same proven, highly scalable, ML technology used for years by Amazon’s internal data scientist community
And then we have a new addition to the analytics portfolio by way of Amazon QuickSight, our very fast, easy-to-use, cloud-powered business intelligence service, at 1/10th the cost of traditional BI solutions ($9/user/month). Amazon QuickSight is currently in preview
Fast, fully managed, petabyte-scale data warehouse
Fast - Optimized for data warehousing. Redshift has a massively parallel processing (MPP) architecture with columnar storage, data compression and 10GigE networking between nodes for up to 10x better performance than traditional relational, row-based databases
Cheap - No upfront costs; pay only for the resources you provision. Start small at $0.25 per hour and scale to over a PB at $935 per TB per year, less than a tenth of most other data warehousing solutions
Simple – Get started in minutes with a few clicks or a simple API call. Fully managed and fault tolerant. Easy to set up, operate and scale. We take care of provisioning, installation, monitoring, backup, restore and patching
Scalable – With a few clicks via the Console or a simple API call, you can change the type or number of nodes as your performance or capacity needs change. While resizing, your cluster still runs in read-only mode
ANSI SQL Compliant – Uses standard JDBC and ODBC drivers, allowing you to use a wide range of familiar SQL clients/BI tools
Secure – You can encrypt data at rest and in transit using hardware-accelerated AES-256 and SSL, isolate your clusters using Amazon VPC and even manage your keys using hardware security modules (HSMs). Compliant with SOC1, SOC2 & SOC3, FedRAMP, HIPAA and PCI DSS Level 1
Durability and availability: replication, backup, automated recovery from failed drives & nodes
Interfaces: JDBC/ODBC interface with BI/ETL tools; load from Amazon S3 or DynamoDB
Cost model: no upfront costs or long-term commitments; free backup storage equivalent to 100% of provisioned storage
Amazon Elastic MapReduce (EMR) simplifies big data processing by providing a managed Hadoop framework that makes it easy, fast, and cost-effective for you to distribute and process vast amounts of your data across dynamically scalable Amazon EC2 instances
You can also run other popular distributed frameworks such as Apache Spark and Presto or any other application in the Apache Hadoop stack in Amazon EMR, and interact with data in other AWS data stores such as Amazon S3 and Amazon DynamoDB
A little-trumpeted fact: EMR is the largest cloud operator of Hadoop infrastructure, having spun up tens of millions of clusters for customers since 2009
EMR supports the open source & MapR distributions and has the most current Hadoop distribution in the market today, with the current versions of the most popular Hadoop apps
Fully managed and hence simple: launch a cluster in minutes while EMR takes care of provisioning, set-up, configuration, tuning and monitoring
Extremely flexible, as we have decoupled compute & storage (which also provides a very significant cost benefit); you can select the apps you need and easily resize a running cluster
Elastic, as you can provision one, hundreds or thousands of instances to process data at any scale
Typical use cases – Clickstream analysis, log processing, genomics
Amazon Kinesis services make it easy to work with real-time streaming data. Let's look at the components and their functionalities
Amazon Kinesis Streams enables you to build custom applications that process or analyze streaming data for specialized needs. Amazon Kinesis Streams can continuously capture and store terabytes of data per hour from hundreds of thousands of sources such as website clickstreams, financial transactions, social media feeds, IT logs, and location-tracking events. With Amazon Kinesis Client Library (KCL), you can build Amazon Kinesis Applications and use streaming data to power real-time dashboards, generate alerts, implement dynamic pricing and advertising, and more
Next is Amazon Kinesis Firehose which is the easiest way to load streaming data into AWS. It can capture and automatically load streaming data into Amazon S3, Amazon Redshift, and Amazon Elasticsearch Service, enabling near real-time analytics with existing business intelligence tools and dashboards you’re already using today
And then you have Amazon Kinesis Analytics which allows you to easily analyze data streams using standard SQL queries
Easy set-up & configuration:
- create domains via console, SDK or CLI
- specify instance types, number of instances & storage options
- modify or delete existing domains at any time
Fully managed:
- addresses time-consuming management tasks
- ensures high availability, patch management, backups
- monitors the cluster and replaces nodes as required
Flexible storage options:
- choose between local on-instance storage or Amazon EBS volumes to store your Elasticsearch indices
- specify the size and type of the Amazon EBS volume
- modify the storage options after domain creation as needed
Set up for high availability:
- Zone Awareness distributes the instances supporting the domain across two different AZs
- with replicas enabled, instances are automatically distributed to deliver cross-zone replication
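Creating a domain "via SDK", as the first bullet says, maps onto the Elasticsearch Service CreateElasticsearchDomain API. The sketch below mirrors the options listed above (instance type and count, EBS storage, zone awareness); the specific instance type, volume size, and domain name are illustrative assumptions.

```python
def domain_config(name: str, instance_count: int, ebs_gib: int) -> dict:
    """Parameters for the Elasticsearch Service CreateElasticsearchDomain
    API, mirroring the options above: instance type/count, EBS storage,
    and zone awareness for cross-AZ replication. Values are illustrative."""
    return {
        "DomainName": name,
        "ElasticsearchClusterConfig": {
            "InstanceType": "m4.large.elasticsearch",
            "InstanceCount": instance_count,
            "ZoneAwarenessEnabled": True,  # spread instances across two AZs
        },
        "EBSOptions": {
            "EBSEnabled": True,
            "VolumeType": "gp2",
            "VolumeSize": ebs_gib,  # per-instance EBS volume, in GiB
        },
    }

def create(name: str) -> None:
    """Create the managed Elasticsearch domain."""
    import boto3  # deferred so domain_config stays testable offline
    boto3.client("es").create_elasticsearch_domain(**domain_config(name, 4, 100))
```

The same config dict can be edited and passed to the update API later, matching the "modify existing domains at any time" bullet.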
Here is a select set of referenceable customers using our analytics services
The vast majority of Big Data use cases deployed in the cloud today run on AWS
We now have a large and growing user base in India too