Data-driven companies have a need to make their data easily accessible to those who analyze it. Many organizations have adopted the Looker application, LookML on AWS, a centralized analytical database with a user-friendly interface that allows employees to ask and answer their own questions to make informed business decisions.
Join our webinar to learn how our customer, Casper, an online mattress retailer, made the switch from a transactional database to Looker’s data analytics program on Amazon Redshift. Looker on Amazon Redshift can help you greatly reduce your analytics lifecycle with a simplified infrastructure and rapid cloud scaling.
Join us to learn:
• How to utilize LookML to build reusable definitions and logic for your data
• Best practices for architecting a centralized analytical database
• How Casper leveraged Looker and Amazon Redshift to provide all their employees access to their data and metrics
Who should attend: Heads of Analytics, Heads of BI, Analytics Managers, BI Teams, Senior Analysts
AWS re:Invent 2016: Big Data Mini Con State of the Union (BDM205)Amazon Web Services
Join us for this general session where AWS big data experts present an in-depth look at the current state of big data. Learn about the latest big data trends and industry use cases. Hear how other organizations are using the AWS big data platform to innovate and remain competitive. Take a look at some of the most recent AWS big data announcements, as we kick off the Big Data re:Source Mini Con.
AWS re:Invent 2016: Big Data Architectural Patterns and Best Practices on AWS...Amazon Web Services
The world is producing an ever increasing volume, velocity, and variety of big data. Consumers and businesses are demanding up-to-the-second (or even millisecond) analytics on their fast-moving data, in addition to classic batch processing. AWS delivers many technologies for solving big data problems. But what services should you use, why, when, and how? In this session, we simplify big data processing as a data bus comprising various stages: ingest, store, process, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architecture, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Amazon Kinesis provides services for you to work with streaming data on AWS. Learn how to load streaming data continuously and cost-effectively to Amazon S3 and Amazon Redshift using Amazon Kinesis Firehose without writing custom stream processing code. Get an introduction to building custom stream processing applications with Amazon Kinesis Streams for specialised needs.
Cloud computing gives you a number of advantages, such as the ability to scale your web application or website on demand. If you have a new web application and want to use cloud computing, you might be asking yourself, "Where do I start?" Join us in this session to understand best practices for scaling your resources from zero to millions of users. We show you how to best combine different AWS services, how to make smarter decisions for architecting your application, and how to scale your infrastructure in the cloud.
Managing Data with Voume Velocity, and Variety with Amazon ElastiCache for RedisAmazon Web Services
Learn how to use Amazon ElastiCache with AWS IoT and AWS Lambda to create serverless solutions that let you rapidly make use of large and multisource data sets.
Building an Amazon Datawarehouse and Using Business Intelligence Analytics ToolsAmazon Web Services
Using AWS has never been easier or more affordable to solve business problems and uncover new opportunities using data. Now, businesses of all sizes and across all industries can take advantage of big data technologies and easily collect, store, process, analyze, and share their data. Gain a thorough understanding of what AWS offers across the big data lifecycle and learn architectural best practices for applying these technologies to your projects. We will also deep dive into how to use AWS services such as Kinesis, DynamoDB, Redshift, and Quicksight to optimize logging, build real-time applications, and analyze and visualize data at any scale.
AWS re:Invent 2016: How to Build a Big Data Analytics Data Lake (LFS303)Amazon Web Services
For discovery-phase research, life sciences companies have to support infrastructure that processes millions to billions of transactions. The advent of a data lake to accomplish such a task is showing itself to be a stable and productive data platform pattern to meet the goal. We discuss how to build a data lake on AWS, using services and techniques such as AWS CloudFormation, Amazon EC2, Amazon S3, IAM, and AWS Lambda. We also review a reference architecture from Amgen that uses a data lake to aid in their Life Science Research.
Convert and Migrate Your NoSQL Database or Data Warehouse to AWS - July 2017Amazon Web Services
Learning Objectives:
- Understand the use cases for migrating or replicating databases to the cloud
- Learn about the benefits of cloud-native databases for performance and costs reduction
- See how AWS Database Migration Service helps with your migration and how AWS Schema Conversion Tool makes conversions simple and quick
Moving or replicating your databases to the cloud should be simple and inexpensive. AWS has recently enhanced the AWS Database Migration Service and the AWS Schema Conversion Tool with new data sources to increase your migration options. You can now export from MongoDB databases and Greenplum, IBM Netezza, HPE Vertica, Teradata, Oracle DW and Microsoft SQL Server data warehouses to AWS. Learn how to export and migrate your data and procedural code with minimal downtime to the cloud database of your choice, including cloud-native offerings such as Amazon Aurora, Amazon DynamoDB and Amazon Redshift.
AWS re:Invent 2016: Big Data Mini Con State of the Union (BDM205)Amazon Web Services
Join us for this general session where AWS big data experts present an in-depth look at the current state of big data. Learn about the latest big data trends and industry use cases. Hear how other organizations are using the AWS big data platform to innovate and remain competitive. Take a look at some of the most recent AWS big data announcements, as we kick off the Big Data re:Source Mini Con.
AWS re:Invent 2016: Big Data Architectural Patterns and Best Practices on AWS...Amazon Web Services
The world is producing an ever increasing volume, velocity, and variety of big data. Consumers and businesses are demanding up-to-the-second (or even millisecond) analytics on their fast-moving data, in addition to classic batch processing. AWS delivers many technologies for solving big data problems. But what services should you use, why, when, and how? In this session, we simplify big data processing as a data bus comprising various stages: ingest, store, process, and visualize. Next, we discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architecture, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Amazon Kinesis provides services for you to work with streaming data on AWS. Learn how to load streaming data continuously and cost-effectively to Amazon S3 and Amazon Redshift using Amazon Kinesis Firehose without writing custom stream processing code. Get an introduction to building custom stream processing applications with Amazon Kinesis Streams for specialised needs.
Cloud computing gives you a number of advantages, such as the ability to scale your web application or website on demand. If you have a new web application and want to use cloud computing, you might be asking yourself, "Where do I start?" Join us in this session to understand best practices for scaling your resources from zero to millions of users. We show you how to best combine different AWS services, how to make smarter decisions for architecting your application, and how to scale your infrastructure in the cloud.
Managing Data with Voume Velocity, and Variety with Amazon ElastiCache for RedisAmazon Web Services
Learn how to use Amazon ElastiCache with AWS IoT and AWS Lambda to create serverless solutions that let you rapidly make use of large and multisource data sets.
Building an Amazon Datawarehouse and Using Business Intelligence Analytics ToolsAmazon Web Services
Using AWS has never been easier or more affordable to solve business problems and uncover new opportunities using data. Now, businesses of all sizes and across all industries can take advantage of big data technologies and easily collect, store, process, analyze, and share their data. Gain a thorough understanding of what AWS offers across the big data lifecycle and learn architectural best practices for applying these technologies to your projects. We will also deep dive into how to use AWS services such as Kinesis, DynamoDB, Redshift, and Quicksight to optimize logging, build real-time applications, and analyze and visualize data at any scale.
AWS re:Invent 2016: How to Build a Big Data Analytics Data Lake (LFS303)Amazon Web Services
For discovery-phase research, life sciences companies have to support infrastructure that processes millions to billions of transactions. The advent of a data lake to accomplish such a task is showing itself to be a stable and productive data platform pattern to meet the goal. We discuss how to build a data lake on AWS, using services and techniques such as AWS CloudFormation, Amazon EC2, Amazon S3, IAM, and AWS Lambda. We also review a reference architecture from Amgen that uses a data lake to aid in their Life Science Research.
Convert and Migrate Your NoSQL Database or Data Warehouse to AWS - July 2017Amazon Web Services
Learning Objectives:
- Understand the use cases for migrating or replicating databases to the cloud
- Learn about the benefits of cloud-native databases for performance and costs reduction
- See how AWS Database Migration Service helps with your migration and how AWS Schema Conversion Tool makes conversions simple and quick
Moving or replicating your databases to the cloud should be simple and inexpensive. AWS has recently enhanced the AWS Database Migration Service and the AWS Schema Conversion Tool with new data sources to increase your migration options. You can now export from MongoDB databases and Greenplum, IBM Netezza, HPE Vertica, Teradata, Oracle DW and Microsoft SQL Server data warehouses to AWS. Learn how to export and migrate your data and procedural code with minimal downtime to the cloud database of your choice, including cloud-native offerings such as Amazon Aurora, Amazon DynamoDB and Amazon Redshift.
Amazon Kinesis provides services for you to work with streaming data on AWS. Learn how to load streaming data continuously and cost-effectively to Amazon S3 and Amazon Redshift using Amazon Kinesis Firehose without writing custom stream processing code. Get an introduction to building custom stream processing applications with Amazon Kinesis Streams for specialized needs.
BDA307 Real-time Streaming Applications on AWS, Patterns and Use CasesAmazon Web Services
In this session, you will learn best practices for implementing simple to advanced real-time streaming data use cases on AWS. First, we’ll review decision points on near real-time versus real time scenarios. Next, we will take a look at streaming data architecture patterns that include Amazon Kinesis Analytics, Amazon Kinesis Firehose, Amazon Kinesis Streams, Spark Streaming on Amazon EMR, and other open source libraries. Finally, we will dive deep into the most common of these patterns and cover design and implementation considerations.
Building analytics applications requires more than just one good service. It requires the ability to capture a vast amount of data, and react to data changes in real time. It requires flexible tools which enable end users to work in the way they can be most productive, and which addresses the needs of both data consumers, as well as data scientists. This analysis won't just be about data exploration and reports, but must be able to support the largest scale, complex machine and deep learning models imaginable. Across it all, strong governance, security, and cataloguing is essential. In this session, come to hear about how to build a full stack analytics application using AWS Services. We'll see how to capture static and dynamic data in real time, and react to data changes. We'll see AWS Services which perform analytics from drag-and-drop, through simple query-on-files, and into exascale data science. At the end, we'll have a data lake architecture that will meet the demands of the most sophisticated analytics customers for many years to come.
AWS Speaker: Ian Robinson, Specialist Solution Architect, Big Data and Analytics, EMEA - Amazon Web Services
Getting Started with Managed Database Services on AWS - September 2016 Webina...Amazon Web Services
On AWS you can choose from a variety of managed database services that save effort, save time, and unlock new capabilities and economies. In this session, we make it easy to understand how they differ, what they have in common, and how to choose one or more. We'll explain the fundamentals of Amazon RDS, a managed relational database service in the cloud; Amazon DynamoDB, a fully managed NoSQL database service; Amazon ElastiCache, a fast, in-memory caching service in the cloud; and Amazon Redshift, a fully managed, petabyte-scale data-warehouse solution that can be surprisingly economical. We will cover how each service might help support your application, how much each service costs, and how to get started.
Learning Objectives:
• Overview of managed database services available on AWS
• How to combine them for high-performance cost effective architectures
• Learn how to choose between the AWS database services based on the use case
Who Should Attend:
• IT Managers, DBAs, Enterprise and Solution Architects, IT Managers, DBAs, Enterprise and Solution Architects, Devops Engineers and Developers
AWS re:Invent 2016 was AWS’ largest event yet with over 32,000 attendees, 400 breakout sessions, and two keynotes of new product announcements. In this talk, we’ll explore the core themes of AWS re:Invent 2016 such as serverless and artificial intelligence. We will also drill down into several of the services and features unveiled including AWS Batch, AWS Shield, Aurora for Postgres, X-Ray, Polly, Lex, Rekognition, AWS Step Functions. Light appetizers and refreshments will be provided.
AWS re:Invent 2016: Event Handling at Scale: Designing an Auditable Ingestion...Amazon Web Services
How does McGraw-Hill Education use the AWS platform to scale and reliably receive 10,000 learning events per second? How do we provide near-real-time reporting and event-driven analytics for hundreds of thousands of concurrent learners in a reliable, secure, and auditable manner that is cost effective? MHE designed and implemented a robust solution that integrates AWS API Gateway, AWS Lambda, Amazon Kinesis, Amazon S3, Amazon Elasticsearch Service, Amazon DynamoDB, HDFS, Amazon EMR, Amazopn EC2, and other technologies to deliver this cloud-native platform across the US and soon the world. This session describes the challenges we faced, architecture considerations, how we gained confidence for a successful production roll-out, and the behind-the-scenes lessons we learned.
AWS re:Invent 2016: Workshop: Building Your First Big Data Application with A...Amazon Web Services
Want to get ramped up on how to use Amazon's big data web services and launch your first big data application on AWS? Join us in this workshop as we build a big data application in real time using Amazon EMR, Amazon Redshift, Amazon Kinesis, Amazon DynamoDB, and Amazon S3. We review architecture design patterns for big data solutions on AWS, and give you access to a take-home lab so that you can rebuild and customize the application yourself.
AWS re:Invent 2016: Building Big Data Applications with the AWS Big Data Plat...Amazon Web Services
Building big data applications often requires integrating a broad set of technologies to store, process, and analyze the increasing variety, velocity, and volume of data being collected by many organizations. In this session, we show how you can build entire big data applications using a core set of managed services including Amazon S3, Amazon Kinesis, Amazon EMR, Amazon Elasticsearch Service, Amazon Redshift, and Amazon QuickSight.
We walk you through the steps of building and securing a big data application using the AWS Big Data Platform. We also share best practices and common use cases for AWS big data services, including tips to help you choose the best services for your specific application.
Cloud computing gives you a number of advantages, such as the ability to scale your web application or website on demand. If you have a new web application and want to use cloud computing, you might be asking yourself, "Where do I start?" Join us in this session to understand best practices for scaling your resources from zero to millions of users. We show you how to best combine different AWS services, how to make smarter decisions for architecting your application, and how to scale your infrastructure in the cloud.
Dive deep into some of the key innovations behind Amazon Aurora, discuss best practices and configurations, and share early customer experience from the field.
AWS re:Invent 2016: Case Study: How Startups like Mapbox, Ring, Hudl, and Oth...Amazon Web Services
Join us for this lightning-round showcase of hot new brands and startup companies that are using AWS to play a really big game. You'll hear from experts like Mapbox CIO Will White, Ring Senior Engineer Jason Gluckman, Hudl Engineering Director Rob Hruska, and many others as they explain how they thought about the problems they faced and how they solved them in this TED-style session packed with lots of creative thinking.
Managing Data with Amazon ElastiCache for Redis - August 2016 Monthly Webinar...Amazon Web Services
Many data sets, such as time-series collections or Internet of Things (IoT) deployments can include huge numbers of sensor reports and other data points, which can be a challenge to manage and aggregate. Amazon ElastiCache for Redis provides an on-demand managed service with the performance and scalability to turn big data into useful information. Join us to learn how to use Amazon ElastiCache to create serverless solutions that lets you rapidly make use of large and multisource data sets.
Learning Objectives:
• Learn how to ingest and analyze sensor data using Amazon ElastiCache for Redis and the AWS IoT Service
• Learn how to use ElastiCache Redis for Time-Series data
In dynamic cloud environments, many organizations have a need to implement a unified threat management solution that enhances visibility across their workloads. Learn how REAN Cloud adopted Sophos Unified Threat Management (UTM) for increased simplicity, visibility, and security of their AWS workloads. Sophos is an Advanced Technology Partner in the AWS Partner Network that provides a reliable, unified security solution capable of scaling to meet the agility and speed of the AWS Cloud. Join the upcoming webinar to hear Sri Vasireddy from REAN Cloud, Bryan Nairn from Sophos, and Nick Matthews from AWS discuss security innovations on the AWS Cloud. Join us to learn: • Why Sophos end user REAN Cloud trusts Sophos UTM for simplicity, visibility and security. • How easy it can be to protect your AWS workloads, with a proven and scalable solution designed for the AWS Cloud. • AWS security innovations, including support across multiple Availability Zones and UTM Auto Scaling.
Who should attend: Security Managers, Security Engineers, Security Architects, IT System Administrators, System Administrators, IT Administrators, IT Managers, DevOps, Architects, IT Architects, IT Security Engineers, Business Decision Makers
Legacy monitoring and troubleshooting tools can limit visibility and control over your infrastructure and applications. Organizations must find monitoring and troubleshooting tools that can scale with the volume, variety and velocity of data generated by today’s complex applications in order to keep pace with business demands. Our upcoming webinar will discuss how Sumo Logic helped Scripps Networks harness cloud-native machine data analytics to improve application quality and reliability on AWS. Sumo Logic allows IT operations teams to visualize and monitor workloads in real-time, identify issues and expedite root-cause analysis across the AWS environment.
Join us to learn:
• How to migrate from traditional on-premises data centers to AWS with confidence
• How to improve the monitoring and troubleshooting of modern applications
• How Scripps Networks, a leading content developer, used Sumo Logic to optimize their transition to AWS
Who should attend: Developers, DevOps Director/Manager, IT Operations Director/Manager, Director of Cloud/Infrastructure, VP of Engineering
Amazon Kinesis provides services for you to work with streaming data on AWS. Learn how to load streaming data continuously and cost-effectively to Amazon S3 and Amazon Redshift using Amazon Kinesis Firehose without writing custom stream processing code. Get an introduction to building custom stream processing applications with Amazon Kinesis Streams for specialized needs.
BDA307 Real-time Streaming Applications on AWS, Patterns and Use CasesAmazon Web Services
In this session, you will learn best practices for implementing simple to advanced real-time streaming data use cases on AWS. First, we’ll review decision points on near real-time versus real time scenarios. Next, we will take a look at streaming data architecture patterns that include Amazon Kinesis Analytics, Amazon Kinesis Firehose, Amazon Kinesis Streams, Spark Streaming on Amazon EMR, and other open source libraries. Finally, we will dive deep into the most common of these patterns and cover design and implementation considerations.
Building analytics applications requires more than just one good service. It requires the ability to capture a vast amount of data, and react to data changes in real time. It requires flexible tools which enable end users to work in the way they can be most productive, and which addresses the needs of both data consumers, as well as data scientists. This analysis won't just be about data exploration and reports, but must be able to support the largest scale, complex machine and deep learning models imaginable. Across it all, strong governance, security, and cataloguing is essential. In this session, come to hear about how to build a full stack analytics application using AWS Services. We'll see how to capture static and dynamic data in real time, and react to data changes. We'll see AWS Services which perform analytics from drag-and-drop, through simple query-on-files, and into exascale data science. At the end, we'll have a data lake architecture that will meet the demands of the most sophisticated analytics customers for many years to come.
AWS Speaker: Ian Robinson, Specialist Solution Architect, Big Data and Analytics, EMEA - Amazon Web Services
Getting Started with Managed Database Services on AWS - September 2016 Webina...Amazon Web Services
On AWS you can choose from a variety of managed database services that save effort, save time, and unlock new capabilities and economies. In this session, we make it easy to understand how they differ, what they have in common, and how to choose one or more. We'll explain the fundamentals of Amazon RDS, a managed relational database service in the cloud; Amazon DynamoDB, a fully managed NoSQL database service; Amazon ElastiCache, a fast, in-memory caching service in the cloud; and Amazon Redshift, a fully managed, petabyte-scale data-warehouse solution that can be surprisingly economical. We will cover how each service might help support your application, how much each service costs, and how to get started.
Learning Objectives:
• Overview of managed database services available on AWS
• How to combine them for high-performance cost effective architectures
• Learn how to choose between the AWS database services based on the use case
Who Should Attend:
• IT Managers, DBAs, Enterprise and Solution Architects, IT Managers, DBAs, Enterprise and Solution Architects, Devops Engineers and Developers
AWS re:Invent 2016 was AWS’ largest event yet with over 32,000 attendees, 400 breakout sessions, and two keynotes of new product announcements. In this talk, we’ll explore the core themes of AWS re:Invent 2016 such as serverless and artificial intelligence. We will also drill down into several of the services and features unveiled including AWS Batch, AWS Shield, Aurora for Postgres, X-Ray, Polly, Lex, Rekognition, AWS Step Functions. Light appetizers and refreshments will be provided.
AWS re:Invent 2016: Event Handling at Scale: Designing an Auditable Ingestion...Amazon Web Services
How does McGraw-Hill Education use the AWS platform to scale and reliably receive 10,000 learning events per second? How do we provide near-real-time reporting and event-driven analytics for hundreds of thousands of concurrent learners in a reliable, secure, and auditable manner that is cost effective? MHE designed and implemented a robust solution that integrates AWS API Gateway, AWS Lambda, Amazon Kinesis, Amazon S3, Amazon Elasticsearch Service, Amazon DynamoDB, HDFS, Amazon EMR, Amazopn EC2, and other technologies to deliver this cloud-native platform across the US and soon the world. This session describes the challenges we faced, architecture considerations, how we gained confidence for a successful production roll-out, and the behind-the-scenes lessons we learned.
AWS re:Invent 2016: Workshop: Building Your First Big Data Application with A...Amazon Web Services
Want to get ramped up on how to use Amazon's big data web services and launch your first big data application on AWS? Join us in this workshop as we build a big data application in real time using Amazon EMR, Amazon Redshift, Amazon Kinesis, Amazon DynamoDB, and Amazon S3. We review architecture design patterns for big data solutions on AWS, and give you access to a take-home lab so that you can rebuild and customize the application yourself.
AWS re:Invent 2016: Building Big Data Applications with the AWS Big Data Plat...Amazon Web Services
Building big data applications often requires integrating a broad set of technologies to store, process, and analyze the increasing variety, velocity, and volume of data being collected by many organizations. In this session, we show how you can build entire big data applications using a core set of managed services including Amazon S3, Amazon Kinesis, Amazon EMR, Amazon Elasticsearch Service, Amazon Redshift, and Amazon QuickSight.
We walk you through the steps of building and securing a big data application using the AWS Big Data Platform. We also share best practices and common use cases for AWS big data services, including tips to help you choose the best services for your specific application.
Cloud computing gives you a number of advantages, such as the ability to scale your web application or website on demand. If you have a new web application and want to use cloud computing, you might be asking yourself, "Where do I start?" Join us in this session to understand best practices for scaling your resources from zero to millions of users. We show you how to best combine different AWS services, how to make smarter decisions for architecting your application, and how to scale your infrastructure in the cloud.
Dive deep into some of the key innovations behind Amazon Aurora, discuss best practices and configurations, and share early customer experience from the field.
AWS re:Invent 2016: Case Study: How Startups like Mapbox, Ring, Hudl, and Oth...Amazon Web Services
Join us for this lightning-round showcase of hot new brands and startup companies that are using AWS to play a really big game. You'll hear from experts like Mapbox CIO Will White, Ring Senior Engineer Jason Gluckman, Hudl Engineering Director Rob Hruska, and many others as they explain how they thought about the problems they faced and how they solved them in this TED-style session packed with lots of creative thinking.
Managing Data with Amazon ElastiCache for Redis - August 2016 Monthly Webinar...Amazon Web Services
Many data sets, such as time-series collections or Internet of Things (IoT) deployments can include huge numbers of sensor reports and other data points, which can be a challenge to manage and aggregate. Amazon ElastiCache for Redis provides an on-demand managed service with the performance and scalability to turn big data into useful information. Join us to learn how to use Amazon ElastiCache to create serverless solutions that lets you rapidly make use of large and multisource data sets.
Learning Objectives:
• Learn how to ingest and analyze sensor data using Amazon ElastiCache for Redis and the AWS IoT Service
• Learn how to use ElastiCache Redis for Time-Series data
In dynamic cloud environments, many organizations have a need to implement a unified threat management solution that enhances visibility across their workloads. Learn how REAN Cloud adopted Sophos Unified Threat Management (UTM) for increased simplicity, visibility, and security of their AWS workloads. Sophos is an Advanced Technology Partner in the AWS Partner Network that provides a reliable, unified security solution capable of scaling to meet the agility and speed of the AWS Cloud. Join the upcoming webinar to hear Sri Vasireddy from REAN Cloud, Bryan Nairn from Sophos, and Nick Matthews from AWS discuss security innovations on the AWS Cloud. Join us to learn: • Why Sophos end user REAN Cloud trusts Sophos UTM for simplicity, visibility and security. • How easy it can be to protect your AWS workloads, with a proven and scalable solution designed for the AWS Cloud. • AWS security innovations, including support across multiple Availability Zones and UTM Auto Scaling.
Who should attend: Security Managers, Security Engineers, Security Architects, IT System Administrators, System Administrators, IT Administrators, IT Managers, DevOps, Architects, IT Architects, IT Security Engineers, Business Decision Makers
Legacy monitoring and troubleshooting tools can limit visibility and control over your infrastructure and applications. Organizations must find monitoring and troubleshooting tools that can scale with the volume, variety and velocity of data generated by today’s complex applications in order to keep pace with business demands. Our upcoming webinar will discuss how Sumo Logic helped Scripps Networks harness cloud-native machine data analytics to improve application quality and reliability on AWS. Sumo Logic allows IT operations teams to visualize and monitor workloads in real-time, identify issues and expedite root-cause analysis across the AWS environment.
Join us to learn:
• How to migrate from traditional on-premises data centers to AWS with confidence
• How to improve the monitoring and troubleshooting of modern applications
• How Scripps Networks, a leading content developer, used Sumo Logic to optimize their transition to AWS
Who should attend: Developers, DevOps Director/Manager, IT Operations Director/Manager, Director of Cloud/Infrastructure, VP of Engineering
Legacy on-premises identity and access management (IAM) solutions can slow your organization’s efficiency by forcing employees to focus on administrative tasks rather than business needs. Your organization can benefit from a tool to streamline IAM on AWS that securely connects users and ensures appropriate access to resources. Okta is an integrated identity and mobility management service. Learn through customer use cases how Okta has helped various organizations connect employees to the cloud by leveraging services such as AWS Identity and Access Management (AWS IAM) and logging services like AWS CloudTrail.
Join us to learn:
• Best practices for overcoming IAM challenges in the cloud, such as accessing multiple applications across multiple domains and securing your mobile workforce
• How to authenticate, manage, and secure your users’ access to the AWS Cloud more easily with Okta on AWS
• How to streamline identity management and the associated administrative tasks
Who should attend: IT Manager, IT Security Manager, Solution Architect, Cloud App Architect, Product Management, Product Manager, Business Development
Log Analytics with Amazon Elasticsearch Service - September Webinar SeriesAmazon Web Services
Elasticsearch is a popular open-source search and analytics engine used for log analytics. With Amazon Elasticsearch Service, you can easily run Elasticsearch on AWS. In this webinar, we will provide an overview of Amazon Elasticsearch Service and demo how to set up and configure an Amazon Elasticsearch domain for the log analytics use case.
Learning Objectives:
'- Understand Amazon Elasticsearch Service use cases and key features
- Learn how to secure your Amazon Elasticsearch cluster for access from Kibana and other plug-ins
- Learn best practices for scaling, monitoring, and troubleshooting Amazon Elasticsearch domains
Big data. Small data. All data. You have access to an ever-expanding volume of data inside the walls of your business and out across the web. The potential in data is endless – from predicting election results to preventing the spread of epidemics. But how can you use it to your advantage to help move your business forward?
Drive a Data Culture within your organisation
Keynote include Ric Howe & Anthony Saxby
The Getting Started on AWS deck serves to introduce Amazon users and prospective customers to the Amazon VPC, EC2 and the concepts and components that are necessary building Fault Tolerant & High Available environments on AWS. It also serves to introduce services like Direct Connect, Router53 (Amazon DNS Service) and one of our new additions, the Amazon
Application Load Balancer (ALB). After perusing this deck, users should have a better understanding of what these services are and their propose benefits.
Analyzing big data quickly and efficiently requires a data warehouse optimized to handle and scale for large datasets. Amazon Redshift is a fast, petabyte-scale data warehouse that makes it simple and cost-effective to analyze big data for a fraction of the cost of traditional data warehouses. By following a few best practices, you can take advantage of Amazon Redshift’s columnar technology and parallel processing capabilities to minimize I/O and deliver high throughput and query performance. This webinar will cover techniques to load data efficiently, design optimal schemas, and use work load management.
Learning Objectives:
• Get an inside look at Amazon Redshift's columnar technology and parallel processing capabilities
• Learn how to migrate from existing data warehouses, optimize schemas, and load data efficiently
• Learn best practices for managing workload, tuning your queries, and using Amazon Redshift's interleaved sorting features
Who Should Attend:
• Data Warehouse Developers, Big Data Architects, BI Managers, and Data Engineers
Deep Dive Amazon Redshift for Big Data Analytics - September Webinar SeriesAmazon Web Services
Analyzing big data quickly and efficiently requires a data warehouse optimized to handle and scale for large datasets. Amazon Redshift is a fast, petabyte-scale data warehouse that makes it simple and cost-effective to analyze big data for a fraction of the cost of traditional data warehouses. By following a few best practices, you can take advantage of Amazon Redshift’s columnar technology and parallel processing capabilities to minimize I/O and deliver high throughput and query performance. This webinar will cover techniques to load data efficiently, design optimal schemas, and tune query and database performance.
Learning Objectives:
• Get an inside look at Amazon Redshift's columnar technology and parallel processing capabilities
• Learn how to migrate from existing data warehouses, optimize schemas, and load data efficiently
• Learn best practices for managing workload, tuning your queries, and using Amazon Redshift's interleaved sorting features
An overview of the Amazon ElastiCache managed service, with examples of how it can be used to increase performance, lower costs and augment other database services and databases to make things faster, easier and less expensive.
Rackspace provides a comprehensive set of tooling and expertise on AWS that further unlocks your ability to secure your environment efficiently and cost effectively. The dynamic environment of data, applications, and infrastructure can pose challenges for businesses trying to manage security while following compliance regulations. To mitigate these challenges, businesses need a scalable security solution to ensure their data is safe, secure, and stable. In this webinar, Brad Schulteis, Jarret Raim and Todd Gleason will discuss the topic of security control requirements on AWS through the lens of three common compliance scenarios: HIPAA, PCI-DSS, and generalized security compliance based on the NIST Risk Management Framework. Watch our webinar to learn how Rackspace combines AWS and security expertise with tools like AWS CloudFormation, AWS CodeCommit and AWS CodeDeploy to help customers meet their security and compliance needs.
Join us to learn:
• Best practices for securely operating workloads on the AWS Cloud
• Architecting a secure environment for dynamic workloads
• How to incorporate Security by Design principles to address compliance needs across 3 use cases: HIPAA, PCI-DSS and generalized security compliance based on the NIST Risk Management Framework
Who should attend: Directors and Managers of Security, IT Administers, IT Architects, and IT Security Engineers
Learn about the Amazon made to a service-oriented architecture over a decade ago and an introduction to AWS CodeCommit, AWS CodePipeline, and AWS CodeDeploy, three new services born out of Amazon's internal DevOps experience.
So you got a handle on what Big Data is and how you can use it to find business value in your data. Now you need an understanding of the Microsoft products that can be used to create a Big Data solution. Microsoft has many pieces of the puzzle and in this presentation I will show how they fit together. How does Microsoft enhance and add value to Big Data? From collecting data, transforming it, storing it, to visualizing it, I will show you Microsoft’s solutions for every step of the way
Webinar - Accelerating Hadoop Success with Rapid Data Integration for the Mod...Hortonworks
Many enterprises are turning to Apache Hadoop to enable Big Data Analytics and reduce the costs of traditional data warehousing. Yet, it is hard to succeed when 80% of the time is spent on moving data and only 20% on using it. It’s time to swap the 80/20! The Big Data experts at Attunity and Hortonworks have a solution for accelerating data movement into and out of Hadoop that enables faster time-to-value for Big Data projects and a more complete and trusted view of your business. Join us to learn how this solution can work for you.
The Data World Distilled
Understanding how the data world works in the Big Data era
I created this slide deck as a learning tool for new employees, I figured I would post it in case it can help others understand the data space.
This slide deck covers:
- Big Data
- Data Warehouses
- ETL/Data Integration
- Business Intelligence and Analytics
- Data Quality
- Data Testing
- Data Governance
It provides a brief description along with key vendors in the space.
Businesses are generating more data than ever before.
Doing real time data analytics requires IT infrastructure that often needs to be scaled up quickly and running an on-premise environment in this setting has its limitations.
Organisations often require a massive amount of IT resources to analyse their data and the upfront capital cost can deter them from embarking on these projects.
What’s needed is scalable, agile and secure cloud-based infrastructure at the lowest possible cost so they can spin up servers that support their data analysis projects exactly when they are required. This infrastructure must enable them to create proof-of-concepts quickly and cheaply – to fail fast and move on.
GOTO Aarhus 2014: Making Enterprise Data Available in Real Time with elastics...Yann Cluchey
My talk from GOTO Aarhus, 30th September 2014. Cogenta is a retail intelligence company which tracks ecommerce web sites around the world to provide competitive monitoring and analysis services to retailers. Using its proprietary crawler technology, Lucene and SQL Server, a stream of 20 million raw product data entries is captured and processed each day. This case study looks at how Cogenta uses Elasticsearch to break the shackles imposed by the RDBMS (and a limited budget) to make the data available in real time to its customers.
Cogenta uses SQL as its canonical store & for complex reporting, and Elasticsearch for real-time processing & to drive its SaaS web applications. Elasticsearch is easy to use, delivers the powerful features of Lucene and enables the data & platform cost to scale linearly. But… synchronising your existing data in two places presents some interesting challenges such as aggregation and concurrency control. This talk will take a detailed look at how Cogenta how overcame those challenges, with a perpetually changing and asynchronously updated dataset.
http://gotocon.com/aarhus-2014/presentation/Cogenta%20-%20Making%20Enterprise%20Data%20Available%20in%20Real%20Time%20with%20Elasticsearch
Understanding AWS Managed Database and Analytics Services | AWS Public Sector...Amazon Web Services
The world is creating more data in more ways than ever before. The average internet user in 2017 generates 1.5GB of data per day, with the rate doubling every 18 months. A single autonomous vehicle can generate 4TB per day. Each smart manufacturing plant generates 1PB per day. Storing, managing, and analyzing this data requires integrated database and analytic services that provide reliability and security at scale. AWS offers a range of managed data services that let customers focus on making data useful, including Amazon Aurora, RDS, DynamoDB, Redshift, Spectrum, ElastiCache, Kinesis, EMR, Elasticsearch Service, and Glue. In this session, we discuss these services, share our vision for innovation, and show how our customers use these services today. Learn More: https://aws.amazon.com/government-education/
Understanding AWS Managed Database and Analytics Services | AWS Public Sector...Amazon Web Services
The world is creating more data in more ways than ever before. The average internet user in 2017 generates 1.5GB of data per day, with the rate doubling every 18 months. A single autonomous vehicle can generate 4TB per day. Each smart manufacturing plant generates 1PB per day. Storing, managing, and analyzing this data requires integrated database and analytic services that provide reliability and security at scale. AWS offers a range of managed data services that let customers focus on making data useful, including Amazon Aurora, RDS, DynamoDB, Redshift, Spectrum, ElastiCache, Kinesis, EMR, Elasticsearch Service, and Glue. In this session, we discuss these services, share our vision for innovation, and show how our customers use these services today. Learn More: https://aws.amazon.com/government-education/
Secrets of Enterprise Data Mining: SQL Saturday Oregon 201411Mark Tabladillo
If you have a SQL Server license (Standard or higher) then you already have the ability to start data mining. In this new presentation, you will see how to scale up data mining from the free Excel 2013 add-in to production use. Aimed at beginning to intermediate data miners, this presentation will show how mining models move from development to production. We will use SQL Server 2014 tools including SSMS, SSIS, and SSDT.
AWS Partner Webcast - Analyze Big Data for Consumer Applications with Looker ...Amazon Web Services
Analyze Big Data for Consumer Applications with Looker BI and Amazon Redshift Customizing the customer experience based on user behavior is a constant challenge for today’s consumer apps. Business intelligence helps analyze and model large amounts of data. Looker offers a modern approach to BI leveraging AWS that’s fast, agile, and easy to manage. Join this webinar to learn how MessageMe, which provides emotionally engaging messaging apps to consumers, leverages Looker business intelligence software and the Amazon Redshift data warehouse service to analyze billions of rows of customer data in seconds.
Webinar topics include:
• How MessageMe turns billions of rows of customer data stored in Amazon Redshift into actionable insights
• How Looker connects directly to Amazon Redshift in just a few clicks, enabling MessageMe to build a modern, big data analytics in the cloud. Who should attend
• Information or Solution Architects, Data Analysts, BI Directors, DBAs, Development Leads, Developers, or Technical IT Leaders.
Presenters:
• Justin Rosenthal, CTO, MessageMe
• Keenan Rice, VP, Marketing & Alliances, Looker
• Tina Adams, Senior Product Manager, AWS
Amazon Web Services proporciona una amplia gama de servicios que le ayudarán a crear e implementar aplicaciones de análisis de big data de forma rápida y sencilla. AWS ofrece un acceso rápido a recursos de TI económicos y flexibles, algo que permitirá escalar prácticamente cualquier aplicación de big data con rapidez, incluidos almacenamiento de datos, análisis de clics, detección de elementos fraudulentos, motores de recomendación, proceso ETL impulsado por eventos, informática sin servidor y procesamiento del Internet de las cosas. Con AWS no necesita hacer grandes inversiones iniciales de tiempo o dinero para crear y mantener la infraestructura. En su lugar, puede aprovisionar exactamente el tipo y el tamaño adecuado de los recursos que necesita para impulsar sus aplicaciones de análisis de big data. Puede obtener acceso a tantos recursos como necesite, prácticamente al instante, y pagar únicamente por los utilice.
Using AWS to design and build your data architecture has never been easier to gain insights and uncover new opportunities to scale and grow your business. Join this workshop to learn how you can gain insights at scale with the right big data applications.
Testing Big Data: Automated Testing of Hadoop with QuerySurgeRTTS
Are You Ready? Stepping Up To The Big Data Challenge In 2016 - Learn why Testing is pivotal to the success of your Big Data Strategy.
According to a new report by analyst firm IDG, 70% of enterprises have either deployed or are planning to deploy big data projects and programs this year due to the increase in the amount of data they need to manage.
The growing variety of new data sources is pushing organizations to look for streamlined ways to manage complexities and get the most out of their data-related investments. The companies that do this correctly are realizing the power of big data for business expansion and growth.
Learn why testing your enterprise's data is pivotal for success with big data and Hadoop. Learn how to increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your data - all with one data testing tool.
The data lake has become extremely popular, but there is still confusion on how it should be used. In this presentation I will cover common big data architectures that use the data lake, the characteristics and benefits of a data lake, and how it works in conjunction with a relational data warehouse. Then I’ll go into details on using Azure Data Lake Store Gen2 as your data lake, and various typical use cases of the data lake. As a bonus I’ll talk about how to organize a data lake and discuss the various products that can be used in a modern data warehouse.
QuerySurge Slide Deck for Big Data Testing WebinarRTTS
This is a slide deck from QuerySurge's Big Data Testing webinar.
Learn why Testing is pivotal to the success of your Big Data Strategy .
Learn more at www.querysurge.com
The growing variety of new data sources is pushing organizations to look for streamlined ways to manage complexities and get the most out of their data-related investments. The companies that do this correctly are realizing the power of big data for business expansion and growth.
Learn why testing your enterprise's data is pivotal for success with big data, Hadoop and NoSQL. Learn how to increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your data warehouse - all with one ETL testing tool.
This information is geared towards:
- Big Data & Data Warehouse Architects,
- ETL Developers
- ETL Testers, Big Data Testers
- Data Analysts
- Operations teams
- Business Intelligence (BI) Architects
- Data Management Officers & Directors
You will learn how to:
- Improve your Data Quality
- Accelerate your data testing cycles
- Reduce your costs & risks
- Provide a huge ROI (as high as 1,300%)
Using real time big data analytics for competitive advantageAmazon Web Services
Many organisations find it challenging to successfully perform real-time data analytics using their own on premise IT infrastructure. Building a system that can adapt and scale rapidly to handle dramatic increases in transaction loads can potentially be quite a costly and time consuming exercise.
Most of the time, infrastructure is under-utilised and it’s near impossible for organisations to forecast the amount of computing power they will need in the future to serve their customers and suppliers.
To overcome these challenges, organisations can instead utilise the cloud to support their real-time data analytics activities. Scalable, agile and secure, cloud-based infrastructure enables organisations to quickly spin up infrastructure to support their data analytics projects exactly when it is needed. Importantly, they can ‘switch off’ infrastructure when it is not.
BluePi Consulting and Amazon Web Services (AWS) are giving you the opportunity to discover how organisations are using real time data analytics to gain new insights from their information to improve the customer experience and drive competitive advantage.
Similar to A Data Culture with Embedded Analytics in Action (20)
Come costruire servizi di Forecasting sfruttando algoritmi di ML e deep learn...Amazon Web Services
Il Forecasting è un processo importante per tantissime aziende e viene utilizzato in vari ambiti per cercare di prevedere in modo accurato la crescita e distribuzione di un prodotto, l’utilizzo delle risorse necessarie nelle linee produttive, presentazioni finanziarie e tanto altro. Amazon utilizza delle tecniche avanzate di forecasting, in parte questi servizi sono stati messi a disposizione di tutti i clienti AWS.
In questa sessione illustreremo come pre-processare i dati che contengono una componente temporale e successivamente utilizzare un algoritmo che a partire dal tipo di dato analizzato produce un forecasting accurato.
Big Data per le Startup: come creare applicazioni Big Data in modalità Server...Amazon Web Services
La varietà e la quantità di dati che si crea ogni giorno accelera sempre più velocemente e rappresenta una opportunità irripetibile per innovare e creare nuove startup.
Tuttavia gestire grandi quantità di dati può apparire complesso: creare cluster Big Data su larga scala sembra essere un investimento accessibile solo ad aziende consolidate. Ma l’elasticità del Cloud e, in particolare, i servizi Serverless ci permettono di rompere questi limiti.
Vediamo quindi come è possibile sviluppare applicazioni Big Data rapidamente, senza preoccuparci dell’infrastruttura, ma dedicando tutte le risorse allo sviluppo delle nostre le nostre idee per creare prodotti innovativi.
Ora puoi utilizzare Amazon Elastic Kubernetes Service (EKS) per eseguire pod Kubernetes su AWS Fargate, il motore di elaborazione serverless creato per container su AWS. Questo rende più semplice che mai costruire ed eseguire le tue applicazioni Kubernetes nel cloud AWS.In questa sessione presenteremo le caratteristiche principali del servizio e come distribuire la tua applicazione in pochi passaggi
Vent'anni fa Amazon ha attraversato una trasformazione radicale con l'obiettivo di aumentare il ritmo dell'innovazione. In questo periodo abbiamo imparato come cambiare il nostro approccio allo sviluppo delle applicazioni ci ha permesso di aumentare notevolmente l'agilità, la velocità di rilascio e, in definitiva, ci ha consentito di creare applicazioni più affidabili e scalabili. In questa sessione illustreremo come definiamo le applicazioni moderne e come la creazione di app moderne influisce non solo sull'architettura dell'applicazione, ma sulla struttura organizzativa, sulle pipeline di rilascio dello sviluppo e persino sul modello operativo. Descriveremo anche approcci comuni alla modernizzazione, compreso l'approccio utilizzato dalla stessa Amazon.com.
Come spendere fino al 90% in meno con i container e le istanze spot Amazon Web Services
L’utilizzo dei container è in continua crescita.
Se correttamente disegnate, le applicazioni basate su Container sono molto spesso stateless e flessibili.
I servizi AWS ECS, EKS e Kubernetes su EC2 possono sfruttare le istanze Spot, portando ad un risparmio medio del 70% rispetto alle istanze On Demand. In questa sessione scopriremo insieme quali sono le caratteristiche delle istanze Spot e come possono essere utilizzate facilmente su AWS. Impareremo inoltre come Spreaker sfrutta le istanze spot per eseguire applicazioni di diverso tipo, in produzione, ad una frazione del costo on-demand!
In recent months, many customers have been asking us the question – how to monetise Open APIs, simplify Fintech integrations and accelerate adoption of various Open Banking business models. Therefore, AWS and FinConecta would like to invite you to Open Finance marketplace presentation on October 20th.
Event Agenda :
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Rendi unica l’offerta della tua startup sul mercato con i servizi Machine Lea...Amazon Web Services
Per creare valore e costruire una propria offerta differenziante e riconoscibile, le startup di successo sanno come combinare tecnologie consolidate con componenti innovativi creati ad hoc.
AWS fornisce servizi pronti all'utilizzo e, allo stesso tempo, permette di personalizzare e creare gli elementi differenzianti della propria offerta.
Concentrandoci sulle tecnologie di Machine Learning, vedremo come selezionare i servizi di intelligenza artificiale offerti da AWS e, anche attraverso una demo, come costruire modelli di Machine Learning personalizzati utilizzando SageMaker Studio.
OpsWorks Configuration Management: automatizza la gestione e i deployment del...Amazon Web Services
Con l'approccio tradizionale al mondo IT per molti anni è stato difficile implementare tecniche di DevOps, che finora spesso hanno previsto attività manuali portando di tanto in tanto a dei downtime degli applicativi interrompendo l'operatività dell'utente. Con l'avvento del cloud, le tecniche di DevOps sono ormai a portata di tutti a basso costo per qualsiasi genere di workload, garantendo maggiore affidabilità del sistema e risultando in dei significativi miglioramenti della business continuity.
AWS mette a disposizione AWS OpsWork come strumento di Configuration Management che mira ad automatizzare e semplificare la gestione e i deployment delle istanze EC2 per mezzo di workload Chef e Puppet.
Scopri come sfruttare AWS OpsWork a garanzia e affidabilità del tuo applicativo installato su Instanze EC2.
Microsoft Active Directory su AWS per supportare i tuoi Windows WorkloadsAmazon Web Services
Vuoi conoscere le opzioni per eseguire Microsoft Active Directory su AWS? Quando si spostano carichi di lavoro Microsoft in AWS, è importante considerare come distribuire Microsoft Active Directory per supportare la gestione, l'autenticazione e l'autorizzazione dei criteri di gruppo. In questa sessione, discuteremo le opzioni per la distribuzione di Microsoft Active Directory su AWS, incluso AWS Directory Service per Microsoft Active Directory e la distribuzione di Active Directory su Windows su Amazon Elastic Compute Cloud (Amazon EC2). Trattiamo argomenti quali l'integrazione del tuo ambiente Microsoft Active Directory locale nel cloud e l'utilizzo di applicazioni SaaS, come Office 365, con AWS Single Sign-On.
Dal riconoscimento facciale al riconoscimento di frodi o difetti di fabbricazione, l'analisi di immagini e video che sfruttano tecniche di intelligenza artificiale, si stanno evolvendo e raffinando a ritmi elevati. In questo webinar esploreremo le possibilità messe a disposizione dai servizi AWS per applicare lo stato dell'arte delle tecniche di computer vision a scenari reali.
Amazon Web Services e VMware organizzano un evento virtuale gratuito il prossimo mercoledì 14 Ottobre dalle 12:00 alle 13:00 dedicato a VMware Cloud ™ on AWS, il servizio on demand che consente di eseguire applicazioni in ambienti cloud basati su VMware vSphere® e di accedere ad una vasta gamma di servizi AWS, sfruttando a pieno le potenzialità del cloud AWS e tutelando gli investimenti VMware esistenti.
Molte organizzazioni sfruttano i vantaggi del cloud migrando i propri carichi di lavoro Oracle e assicurandosi notevoli vantaggi in termini di agilità ed efficienza dei costi.
La migrazione di questi carichi di lavoro, può creare complessità durante la modernizzazione e il refactoring delle applicazioni e a questo si possono aggiungere rischi di prestazione che possono essere introdotti quando si spostano le applicazioni dai data center locali.
Crea la tua prima serverless ledger-based app con QLDB e NodeJSAmazon Web Services
Molte aziende oggi, costruiscono applicazioni con funzionalità di tipo ledger ad esempio per verificare lo storico di accrediti o addebiti nelle transazioni bancarie o ancora per tenere traccia del flusso supply chain dei propri prodotti.
Alla base di queste soluzioni ci sono i database ledger che permettono di avere un log delle transazioni trasparente, immutabile e crittograficamente verificabile, ma sono strumenti complessi e onerosi da gestire.
Amazon QLDB elimina la necessità di costruire sistemi personalizzati e complessi fornendo un database ledger serverless completamente gestito.
In questa sessione scopriremo come realizzare un'applicazione serverless completa che utilizzi le funzionalità di QLDB.
Con l’ascesa delle architetture di microservizi e delle ricche applicazioni mobili e Web, le API sono più importanti che mai per offrire agli utenti finali una user experience eccezionale. In questa sessione impareremo come affrontare le moderne sfide di progettazione delle API con GraphQL, un linguaggio di query API open source utilizzato da Facebook, Amazon e altro e come utilizzare AWS AppSync, un servizio GraphQL serverless gestito su AWS. Approfondiremo diversi scenari, comprendendo come AppSync può aiutare a risolvere questi casi d’uso creando API moderne con funzionalità di aggiornamento dati in tempo reale e offline.
Inoltre, impareremo come Sky Italia utilizza AWS AppSync per fornire aggiornamenti sportivi in tempo reale agli utenti del proprio portale web.
Database Oracle e VMware Cloud™ on AWS: i miti da sfatareAmazon Web Services
Molte organizzazioni sfruttano i vantaggi del cloud migrando i propri carichi di lavoro Oracle e assicurandosi notevoli vantaggi in termini di agilità ed efficienza dei costi.
La migrazione di questi carichi di lavoro, può creare complessità durante la modernizzazione e il refactoring delle applicazioni e a questo si possono aggiungere rischi di prestazione che possono essere introdotti quando si spostano le applicazioni dai data center locali.
In queste slide, gli esperti AWS e VMware presentano semplici e pratici accorgimenti per facilitare e semplificare la migrazione dei carichi di lavoro Oracle accelerando la trasformazione verso il cloud, approfondiranno l’architettura e dimostreranno come sfruttare a pieno le potenzialità di VMware Cloud ™ on AWS.
Amazon Elastic Container Service (Amazon ECS) è un servizio di gestione dei container altamente scalabile, che semplifica la gestione dei contenitori Docker attraverso un layer di orchestrazione per il controllo del deployment e del relativo lifecycle. In questa sessione presenteremo le principali caratteristiche del servizio, le architetture di riferimento per i differenti carichi di lavoro e i semplici passi necessari per poter velocemente migrare uno o più dei tuo container.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology is pushing into IT I was wondering myself, as an “infrastructure container kubernetes guy”, how get this fancy AI technology get managed from an infrastructure operational view? Is it possible to apply our lovely cloud native principals as well? What benefit’s both technologies could bring to each other?
Let me take this questions and provide you a short journey through existing deployment models and use cases for AI software. On practical examples, we discuss what cloud/on-premise strategy we may need for applying it to our own infrastructure to get it to work from an enterprise perspective. I want to give an overview about infrastructure requirements and technologies, what could be beneficial or limiting your AI use cases in an enterprise environment. An interactive Demo will give you some insides, what approaches I got already working for real.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
1. A Data Culture with Embedded
Analytics in Action
Dave Rocamora • Solutions Architect, AWS
Erin Franz • Senior Analyst, Alliances, Looker
Scott Breitenother • VP, Data and Analytics, Casper
2. Data is Growing
of new data will be
created every second for
every human being on
the planet by 2020
1.7MB
compound annual growth
rate of 58% surpassing $1
billion by 2020 forecasted
for the Hadoop market
58%
of all data is ever
analyzed and used at the
moment
0.5%<
http://www.ap-institute.com/big-data-articles/big-data-
what-is-hadoop-%E2%80%93-an-explanation-for-
absolutely-anyone.aspx
http://www.marketanalysis.com/?p=279
http://www.technologyreview.com/news/514346/the-
data-made-me-do-it/
http://www.whizpr.be/upload/medialab/21/company/M
edia_Presentation_2012_DigiUniverseFINAL1.pdf
3. Big Data is for Everyone
The market for Big Data technologies is growing more than six times faster than the
information technology market as a whole…
…and those companies who use their data will win.
4. Why AWS for Big Data?
Immediately
Available
Broad and Deep
Capabilities
Trusted and Secure Scalable
5. Collect, Store, Analyze, and Visualize
It’s easy to get data to AWS, store it securely, and analyze it with the engine of your
choice, without any long-term commitment or vendor lock-in
Collect
AWS Import/Export
AWS Snowball
Direct Connect
VM Import/Export
Store
Amazon S3
Amazon EMR
Amazon Glacier
Amazon Redshift
DynamoDB
Amazon Aurora
Analyze
Amazon Kinesis
AWS Lambda
Amazon EMR
Amazon EC2
6. AWS Provides the Most Complete Platform
for Big Data
What can you do with Big Data on AWS?
Big Data Repositories Clickstream Analysis ETL Offload
Machine Learning Online Ad Serving BI Applications
7. A Data Culture with Embedded
Analytics in Action
Erin Franz • Senior Analyst, Alliances, Looker
8. Make it easy for everyone
to find, explore and
understand the data that
drives your business
9. Looker: A Self-Service Data Platform
Find, explore and understand the data
Explore Everything
Find, explore and
understand all the data
Create Standards
Define your data and
business metrics
Any SQL Database
Analyze all of your data
where it is stored
Build a Data Culture
Anyone can ask and
answer questions
10. Looker for Amazon Web Services
RDS Redshift EMR Aurora
Deployment
Easy deployment on Amazon EC2
Data Sources
Connect to Amazon RDS, Amazon Redshift,
Amazon Aurora and Amazon EMR
(Spark SQL and Presto)
Data Modeling Layer
Define your data and business metrics
Explore
Find, explore and understand your data
11. The Technical Pillars that Make it Possible
100% in Database
Leverage all your data
Avoid summarizing or
moving it
Modern Web
Architecture
Access from anywhere
Share and collaborate
Extend to anyone
LookML Intelligent
Modeling Layer
Describe the data
Create reusable and
shareable business logic
12. Looker/Redshift Integration Highlights
In-Database Architecture
The power of Amazon Redshift is
directly leveraged by Looker
because all transformation is
done in-database
Looker: A Standard for
Amazon Redshift
Some of the most demanding
Amazon Redshift deployments
choose Looker for data
exploration, including:
Highest Level of
Looker Features
We’ve invested in providing
Looker features for Amazon
Redshift to make the best
experience possible, including:
As real-time as data in
Amazon Redshift
Shared compute,
scalability, caching all
utilized by Looker
Persistent derived tables
Symmetric aggregates
Query killing
Lat/Long location
Sony
Lyft
Yahoo!
Kohler
Docker
13. Companies Winning with Redshift + Looker
eCommerce Technology Marketplaces Fin Services Media/Ad Tech
14. A Data Culture with Embedded
Analytics in Action
Scott Breitenother • VP, Data and Analytics, Casper
16. Data Powers Everything that We Do
Data Team Mission:
Enable better, faster
decisions through
information visibility
and analytical expertise
17. Until We Outgrew Our Data Infrastructure
Required data refresh by the data team
File speed and data size limitations
Intimidating presentation of information
Analysis is siloed in files
Cannot query across sources, must download
data to join
Difficult to manage ad hoc queries
No one place holds all the information
Inconsistent definitions (and a lot of work if you
make a small change!)
Production
Databases
Solution was not efficient or scalable
Big Excel Files
18. Enter Looker & AWS
Central warehouse for all data
Join previously siloed data for
better analysis
Dialect is very similar to Postgresql
We use AWS ecosystem (AWS
Lambda, Amazon RDS, Amazon EC2)
Efficient data modeling
Easy to manage source of truth
Visualization layer
Intuitive UI for business users
No SQL for business users!!
Amazon Redshift
19. We Implemented in Phases
Copy
Batch copy production
databases
1
Copy Faster
Frequent, faster and
incremental copy
2
ELT
Build specific data marts
3
20. Phase 1: Copy
Open source project from
DonorsChoose.org
Bash script with regex translations
from Postgres to Redshift
Full refresh with up to 40 min load
time
Whitelist of tables to copy for each
database
Results
Data updated every 6 hours
Missing certain key aggregations
Not read performant
Unwieldy to manage
Poor UX on Looker front end
How We Did It
21. Stitch (formerly RJ Metrics
Pipeline)
Integrates with Postgres as well as
other common third party sources
30 minute refresh cycle
Point and click to add tables and
integrations
Easy to use UI
Incremental copy
Pre-existing integrations and expertise
(multiple engineers, customer support)
Fully managed and relatively
inexpensive
Transparent logging (rows
replicated, errors)
Phase 2: Copy Faster
ResultsHow We Did It
22. Phase 3: ELT
How We Did It
Data Build Tool (dbt) from Fishtown Analytics
Looker like abstraction of tables and views
SQL that references other SQL
Manages dependency graph
Options for materializing SQL (CTE,
view, table)
Set sort and distribution keys
Simple repo deployed to EC2 tiny
Results
Pre-aggregated tables
Marts: de-normalized table for an area of
the business
Lookups: attributes for a product, location
Rollups: time-series aggregations for
summary reporting
Facts: aggregations on key “business
objects” (orders, customers)
Updates every 30 minutes
24. This Is What Success Looks Like
Access for
Business Users
Find many answers themselves
Easy to filter, pivot and visualize
Access to all existing analysis
Data is refreshed and up-to-date
Multiple ways to consume
(web, email, links via slack)
Simple Management
for Data Team
Single source of truth
Insight into usage
Centralized business logic
Git managed, easy collaboration
Keep pace with evolving business
(new countries and products)
25. Success Story: Supply Chain
Solution
Monitoring 2 KPIS:
Operational Metric – daily Days on Hand (DOH)
Success Metric – weekly Order to Ship SLA
Challenge
Operations team
needed to ensure fast
delivery of our highly
in-demand products
Order to Ship SLAMattress Inventory DOH
26. Success Story: Executive Reporting
Solution
Created a dashboard that highlights key metrics
from each department
Each metric has a goal and includes a weekly,
MTD, QTD and trend view
Challenge
Executive team needed
actionable metrics and
the ability to track against
goals while seeing trends
over time
27. There Will Always Be More Questions
Increased
Access to Data
More Sophisticated
Clients
Tougher
Questions
28. Q&A
Dave Rocamora • Solutions Architect, AWS
Erin Franz • Senior Analyst, Alliances, Looker
Scott Breitenother • VP, Data and Analytics, Casper