Akash Chauhan holds a Cloudera Certified Developer for Apache Hadoop (CCDH) certification for CDH Version 5 with license number 100-012-082, which was issued on February 11, 2015.
People are deploying servers in cloud environments faster than ever before, but most are still not doing so in a safe and secure manner. Too few server instances are hardened as part of the provisioning process, leaving the technological doors wide open to service disruption by malicious threat agents such as malware, automated attack tools, and human attackers. This talk explains how Chef can be used to automate the creation and maintenance of secure server baselines as a foundation for operating securely in cloud environments.
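Chef recipes themselves are declarative Ruby, but the core idea behind a secure baseline is language-independent: converge each instance toward a desired hardened state, idempotently. A minimal sketch of that converge loop in plain Python (the rules and setting names below are hypothetical illustrations, not Chef's actual API):

```python
# Sketch of the converge-to-baseline idea behind a hardening recipe:
# each rule describes a desired state and is only enforced when the
# current state drifts from it, so repeated runs are harmless.
# The settings below are hypothetical examples.

def converge(rules, state):
    """Apply each rule only if its check fails; return what changed."""
    changed = []
    for name, check, enforce in rules:
        if not check(state):
            enforce(state)
            changed.append(name)
    return changed

# Toy "system state" standing in for a freshly provisioned server.
state = {"sshd.PermitRootLogin": "yes", "telnet.enabled": True}

rules = [
    ("disable-root-ssh",
     lambda s: s["sshd.PermitRootLogin"] == "no",
     lambda s: s.update({"sshd.PermitRootLogin": "no"})),
    ("remove-telnet",
     lambda s: not s["telnet.enabled"],
     lambda s: s.update({"telnet.enabled": False})),
]

print(converge(rules, state))  # first run: both rules fire
print(converge(rules, state))  # second run: nothing left to change
```

Running the baseline on every converge interval, rather than only at provisioning time, is what keeps hardened instances from drifting back into an insecure state.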
AWS re:Invent 2016: Chalk Talk: Succeeding at Infrastructure-as-Code (GPSCT312) | Amazon Web Services
The days of manually managing infrastructure tasks are quickly coming to an end; businesses increasingly need their infrastructure teams to react with the same agility as their development teams. In this session, we discuss various approaches to infrastructure-as-code using AWS solutions across the areas of templated infrastructure provisioning, configuration management, and policy as code. We invite you to bring your questions and join AWS Solutions Architects as we dive deeper into the concepts and best practices behind infrastructure-as-code.
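The practical payoff of templated provisioning is that infrastructure becomes reviewable, diffable text. A hedged sketch of what that looks like: a minimal CloudFormation-style template built as plain data in Python (the logical resource name, AMI ID, and instance type are illustrative placeholders; actually submitting it would go through an API such as boto3's `create_stack`, which is not executed here):

```python
import json

# A minimal CloudFormation-style template expressed as plain data.
# "WebServer", the AMI ID, and the instance type are placeholders,
# not values from the session.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "WebServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "InstanceType": "t2.micro",
                "ImageId": "ami-12345678",  # placeholder AMI ID
            },
        }
    },
}

# Serializing deterministically (sorted keys) keeps diffs in code
# review small and meaningful, which is the point of "as code".
body = json.dumps(template, indent=2, sort_keys=True)
print(len(body.splitlines()))
```

Because the template is data, it can also be linted and policy-checked in a pipeline before anything is provisioned, which is where "policy as code" connects to the same workflow.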
Configuration Management in the Cloud - AWS Online Tech Talks | Amazon Web Services
Learning Objectives:
- Learn how to use AWS OpsWorks, AWS CodeDeploy, and AWS CodePipeline to build a reliable and consistent development pipeline
- Understand continuous integration and delivery for Infrastructure as Code
- Learn how to get started with these services.
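A pipeline built from these services is typically driven through the AWS APIs. As one hedged sketch, here is what triggering a CodeDeploy deployment from Python with boto3 might look like; the application, group, bucket, and key names are hypothetical, and the boto3 call itself is shown in a comment rather than executed, since it requires configured AWS credentials:

```python
# Sketch of the request a pipeline stage might send to CodeDeploy
# to roll out a revision stored in S3. All names are hypothetical.
deployment_request = {
    "applicationName": "my-web-app",
    "deploymentGroupName": "production",
    "revision": {
        "revisionType": "S3",
        "s3Location": {
            "bucket": "my-release-bucket",
            "key": "releases/my-web-app-1.0.zip",
            "bundleType": "zip",
        },
    },
}

# With credentials configured, the request would be submitted as:
#   import boto3
#   codedeploy = boto3.client("codedeploy")
#   response = codedeploy.create_deployment(**deployment_request)
print(sorted(deployment_request))
```

In a CodePipeline setup this call would usually not be hand-written at all; the deploy stage invokes CodeDeploy for you, but the same parameters appear in the stage configuration.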
Spark in YARN-Managed Multi-Tenant Clusters | shareddatamsft
Spark’s YARN support allows scheduling Spark workloads on Hadoop alongside a variety of other data-processing frameworks. In this talk, Pravin and Rajesh from the Microsoft HDInsight dev team deep-dive into how Spark works on YARN and why we chose YARN as our preferred cluster manager. We share how we achieved multi-tenancy, maximizing cluster resource utilization while ensuring minimum resources for each application, using Spark dynamic executor allocation and YARN schedulers on Spark HDInsight clusters.
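The multi-tenancy levers described above are largely Spark and YARN configuration. A sketch of the dynamic-allocation settings involved, with the numeric values as illustrative choices rather than HDInsight's actual defaults:

```python
# Spark-on-YARN dynamic allocation settings that trade cluster
# utilization against per-application guarantees. The numeric
# values here are illustrative, not HDInsight defaults.
spark_conf = {
    # Let Spark grow and shrink its executor count with the workload.
    "spark.dynamicAllocation.enabled": "true",
    # Required so shuffle data survives when executors are removed.
    "spark.shuffle.service.enabled": "true",
    # Floor and ceiling on executors for each application.
    "spark.dynamicAllocation.minExecutors": "2",
    "spark.dynamicAllocation.maxExecutors": "20",
}

# These would typically be passed to spark-submit, e.g.:
#   spark-submit --conf spark.dynamicAllocation.enabled=true ...
for key, value in sorted(spark_conf.items()):
    print(f"--conf {key}={value}")
```

The `minExecutors` floor is what gives each tenant a guaranteed minimum, while releasing idle executors back to YARN is what lets other tenants' schedulers keep the cluster busy.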
SharePoint 24x7x365 Architecting for High Availability, Fault Tolerance and D... | Eric Shupps
Building SharePoint farms for development and testing is easy. But building highly available farms to meet enterprise service level agreements that are fault tolerant, scalable and fully recoverable? Not so simple. Learn how to plan, design and implement a highly available on-premises farm architecture for SharePoint 2016 and 2019 using proven, field-tested techniques and practical guidance.
Cloudera Manager Webinar | Cloudera Enterprise 3.7 | Cloudera, Inc.
Managing Hadoop just got easier!
In this webinar, Cloudera VP of Product Charles Zedlewski introduces and explains the new features and functionality of Cloudera Manager, a core component of Cloudera Enterprise 3.7.
What is Cloudera Manager?
Cloudera Manager is the industry’s first end-to-end management application for Apache Hadoop. With Cloudera Manager, you can easily deploy and centrally operate a complete Hadoop stack. The application automates the installation process, reducing deployment time from weeks to minutes; gives you a cluster-wide, real-time view of the nodes and services running; provides a single, central place to enact configuration changes across your cluster; and incorporates a full range of reporting and diagnostic tools to help you optimize cluster performance and utilization.
What is Cloudera Enterprise?
Cloudera Enterprise enables data-driven enterprises to run Apache Hadoop environments in production cost-effectively, with repeatable success. Comprising Cloudera Support and Cloudera Manager, a software layer that delivers deep visibility into and across Hadoop clusters, Cloudera Enterprise gives Hadoop operators an efficient way to precisely provision and manage cluster resources. It also allows IT shops to apply familiar business metrics – such as measurable SLAs and chargebacks – to Hadoop environments so they can run at optimal utilization. Built-in predictive capabilities anticipate shifts in the Hadoop infrastructure, ensuring reliable operation.
(DVO306) AWS CodeDeploy: Automating Your Software Deployments | Amazon Web Services
So you’ve written some code. Now what? How do you make it available to your customers in an efficient and reliable manner? Learn how you can use AWS CodeDeploy to easily and quickly push your application updates. This talk will introduce you to the basics of CodeDeploy: key concepts, how it works, where it fits in your release process, and some deployment strategies to get you started on the right foot. We’ll walk through several demos, going from a basic sample deployment to a live update of a large multi-instance fleet, giving you a sense for how CodeDeploy can grow with your needs.
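The deployment strategies the talk mentions differ mainly in how many instances are updated at once. A toy simulation of a CodeDeploy-style rolling update over a fleet makes the trade-off concrete (instance names and version labels are hypothetical; this models the strategy, not CodeDeploy's implementation):

```python
# Toy model of a rolling fleet update: "one at a time" keeps most
# instances serving the old version while each is updated in turn;
# a large batch_size approaches "all at once", which is faster but
# riskier. Instance names and versions are hypothetical.

def rolling_deploy(fleet, new_version, batch_size=1):
    """Update the fleet in batches, yielding a snapshot after each batch."""
    instances = sorted(fleet)
    for i in range(0, len(instances), batch_size):
        for name in instances[i:i + batch_size]:
            fleet[name] = new_version
        yield dict(fleet)

fleet = {"web-1": "v1", "web-2": "v1", "web-3": "v1"}
for snapshot in rolling_deploy(fleet, "v2", batch_size=1):
    print(snapshot)  # one more instance on v2 per step
```

Yielding a snapshot per batch is also where a health check would slot in: if the freshly updated batch fails its checks, the rollout stops with most of the fleet still healthy on the old version.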
AWS Summit 2014 Melbourne - Breakout 2
Intel is contributing to a common security framework for Apache Hadoop, in the form of Project Rhino, which enables Hadoop to run workloads without compromising performance or security. Join this session to learn how your enterprise can take advantage of the security capabilities in the Intel Data Platform running on AWS to analyze data while ensuring technical safeguards that help you remain in compliance.
Presenter: Peter Kerney, Senior Solution Architect, Intel
The BlueData EPIC software platform makes deployment of Big Data infrastructure and applications easier, faster, and more cost-effective – whether on-premises or in the public cloud.
With BlueData EPIC on AWS, you can quickly and easily deploy your preferred Big Data applications, distributions and tools; leverage enterprise-class security and cost controls for multi-tenant deployments on the Amazon cloud; and tap into both Amazon S3 and on-premises storage for your Big Data analytics.
Sign up for a free two-week trial at www.bluedata.com/aws