This lightning talk will show you how simple it is to apply CI to the creation of Docker images, ensuring that each time the source changes, a new image is built, tagged, and published. I will then show how easy it is to deploy containers from that image and run tests to verify their behaviour.
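The build-tag-publish-test flow described above can be sketched as a declarative Jenkinsfile. This is a minimal illustration only: the `myapp` image name, the `registry.example.com` registry, and the `run-tests.sh` script are assumptions, not part of the talk.

```groovy
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps {
                // Build and tag the image from the Dockerfile in the repository
                sh 'docker build -t registry.example.com/myapp:${BUILD_NUMBER} .'
            }
        }
        stage('Publish') {
            steps {
                // Push the freshly tagged image to the registry
                sh 'docker push registry.example.com/myapp:${BUILD_NUMBER}'
            }
        }
        stage('Test') {
            steps {
                // Run a throwaway container from the new image and verify behaviour
                sh 'docker run --rm registry.example.com/myapp:${BUILD_NUMBER} ./run-tests.sh'
            }
        }
    }
}
```

Tagging with the Jenkins `BUILD_NUMBER` gives every source change its own traceable image.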
This presentation about Jenkins pipelines will help you understand what Jenkins is, how Jenkins performs continuous integration, why we need pipelines, and how a Jenkins pipeline works. You will learn how to create build and delivery pipelines and automate tasks, and understand what scripted and declarative pipelines are with the help of Groovy scripts. Jenkins is an open-source continuous integration tool used to automate software development phases such as building, testing and deploying. Jenkins Pipeline is a suite of plugins that supports the integration and implementation of jobs using continuous build and delivery pipelines. Now let's get started and understand how a Jenkins pipeline works.
The following topics are explained in this Jenkins pipeline presentation:
1) What is Jenkins?
2) What is Continuous Integration?
3) Why Pipeline?
4) How does Jenkins pipeline work?
5) Build and delivery pipeline
6) Scripted and declarative pipeline
7) Demo on Jenkins pipeline
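The scripted and declarative styles listed above differ mainly in syntax, not capability. A rough sketch of the two forms (the stage names and `make` targets are illustrative placeholders, and each variant would live in its own Jenkinsfile):

```groovy
// Declarative pipeline: structured blocks that Jenkins validates up front
pipeline {
    agent any
    stages {
        stage('Build')   { steps { sh 'make build' } }
        stage('Test')    { steps { sh 'make test' } }
        stage('Deliver') { steps { sh 'make deliver' } }
    }
}

// Scripted pipeline: plain Groovy, maximum flexibility
node {
    stage('Build')   { sh 'make build' }
    stage('Test')    { sh 'make test' }
    stage('Deliver') { sh 'make deliver' }
}
```

Declarative is usually recommended for new pipelines; scripted remains useful when you need arbitrary Groovy logic.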
Why learn DevOps?
Simplilearn’s DevOps training course is designed to help you become a DevOps practitioner and apply the latest in DevOps methodology to automate your software development lifecycle right out of the class. You will master configuration management, continuous integration, deployment, delivery and monitoring using DevOps tools such as Git, Docker, Jenkins, Puppet and Nagios in a practical, hands-on and interactive approach. The DevOps training course focuses heavily on the use of Docker containers, a technology that is revolutionizing the way apps are deployed in the cloud today and is a critical skill set to master in the cloud age.
After completing the DevOps training course, you will achieve hands-on expertise in various aspects of the DevOps delivery model. The practical learning outcomes of this DevOps training course are:
An understanding of DevOps and the modern DevOps toolsets
The ability to automate all aspects of a modern code delivery and deployment pipeline using:
1. Source code management tools
2. Build tools
3. Test automation tools
4. Containerization through Docker
5. Configuration management tools
6. Monitoring tools
Who should take this course?
DevOps career opportunities are thriving worldwide. DevOps is the third-highest-ranked tech role by employer demand on Indeed.com but has the second-highest talent deficit.
This DevOps training course will benefit the following professional roles:
1. Software Developers
2. Technical Project Managers
3. Architects
4. Operations Support
5. Deployment Engineers
6. IT Managers
7. Development Managers
Learn more at https://www.simplilearn.com/cloud-computing/devops-practitioner-certification-training
CI CD Pipeline Using Jenkins | Continuous Integration and Deployment | DevOps... - Edureka!
** DevOps Training: https://www.edureka.co/devops **
This CI CD Pipeline tutorial explains the concepts of Continuous Integration, Continuous Delivery and Deployment, their benefits, and their tools. The topics covered in the video are:
1. What is DevOps
2. What are CI and CD?
3. Pipelines: What are they?
4. Continuous Delivery and Continuous Deployment
5. Role of Jenkins
6. Role of Docker
7. Hands-On – Creating CI CD Pipeline Using Jenkins and Docker
Check our complete DevOps playlist here (includes all the videos mentioned in the video): http://goo.gl/O2vo13
Introduction to Jenkins and how to effectively apply Jenkins to your projects.
Jenkins growth, companies using Jenkins, and the most downloaded and used plugins.
Jenkins is an open source automation server written in Java. Jenkins helps to automate the non-human parts of the software development process, with continuous integration and by facilitating the technical aspects of continuous delivery. It is a server-based system that runs in servlet containers such as Apache Tomcat.
What is Jenkins | Jenkins Tutorial for Beginners - Edureka!
****** DevOps Training : https://www.edureka.co/devops ******
This DevOps Jenkins tutorial on what Jenkins is ( Jenkins Tutorial Blog Series: https://goo.gl/JebmnW ) will help you understand what Continuous Integration is and why it was introduced. The tutorial also explains in detail how Jenkins achieves Continuous Integration, and includes a hands-on session by the end of which you will learn how to compile code that is present in GitHub, review that code, and analyse the test cases present in the GitHub repository. The hands-on session also explains how to create a build pipeline using Jenkins and how to add Jenkins agent (slave) nodes.
The Hands-On session is performed on an Ubuntu-64bit machine in which Jenkins is installed.
To learn how Jenkins can be used to integrate multiple DevOps tools, watch the video titled 'DevOps Tools', by clicking this link: https://goo.gl/up9iwd
Check our complete DevOps playlist here: http://goo.gl/O2vo13
Jenkins is the leading open source continuous integration tool. It builds and tests our software continuously and monitors the execution and status of remote jobs, making it easier for team members and users to regularly obtain the latest stable code.
Learn all aspects of Maven step by step, enhance your skills, and launch your career. An on-demand course at an affordable price, with classes on virtually every topic. Try before you buy.
Maven is a software tool for managing and building Java projects, created by Jason van Zyl of Sonatype in 2002.
First Steps with Maven was our unconference session at BarCamp STI 2013.
Our main goal is to introduce project management with Maven through basic examples.
We also want to show the immediate value Maven can offer to many users and organizations, since it works the same way for small and large projects.
**BarCamp STI 2013 was the first BarCamp held in the Dominican Republic. It took place on the campus of the Pontificia Universidad Católica Madre y Maestra on Saturday, November 16, 2013, from 9 a.m. to 6 p.m.
What is Docker | Docker Tutorial for Beginners | Docker Container | DevOps To... - Edureka!
This DevOps Docker tutorial on what Docker is ( Docker Tutorial Blog Series: https://goo.gl/32kupf ) will help you understand how to use Docker Hub, Docker images, Docker containers and Docker Compose. The tutorial explains Docker's architecture and the Docker Engine in detail. It also includes a hands-on session, by the end of which you will learn to pull a CentOS Docker image and spin up your own Docker container. You will also see how to launch multiple Docker containers using Docker Compose. Finally, it covers the role Docker plays in the DevOps life cycle.
The Hands-On session is performed on an Ubuntu-64bit machine in which Docker is installed.
What Is Docker? | What Is Docker And How It Works? | Docker Tutorial For Begi... - Simplilearn
This presentation on Docker will help you understand DevOps tools, why Docker is needed, Docker vs virtual machines, what Docker is, how Docker works, and the components of Docker. Docker is a tool used to automate the deployment of applications in lightweight containers so that applications can work efficiently in different environments. A container is a software package that consists of all the dependencies required to run an application. Until now we have been running applications on virtual machines: every virtual machine used to be the base of our application, but with the advent of Docker and containerization technologies, each application now runs in a container-like logical space. Now, let us get started and learn what exactly Docker is.
The following topics are explained in this Docker presentation:
1. DevOps and its tools
2. What is Docker?
3. How does Docker work?
4. What are the components of Docker?
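To make the components above concrete, here is a minimal, illustrative Dockerfile for a hypothetical Python web app. The base image, file names and start command are assumptions, not part of the presentation:

```dockerfile
# Base image: everything the app needs starts from here
FROM python:3.11-slim

# Copy the application and its dependency list into the image
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Command run when a container is started from this image
CMD ["python", "app.py"]
```

Building with `docker build -t myapp .` produces an image; `docker run myapp` starts a container from it.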
Simplilearn's DevOps Certification Training Course will prepare you for a career in DevOps, the fast-growing field that bridges the gap between software developers and operations. You’ll become an expert in the principles of continuous development and deployment, automation of configuration management, inter-team collaboration and IT service agility, using modern DevOps tools such as Git, Docker, Jenkins, Puppet and Nagios. DevOps jobs are highly paid and in great demand, so start on your path today.
Who should take this course?
DevOps career opportunities are thriving worldwide. DevOps was featured as one of the 11 best jobs in America for 2017, according to CBS News, and data from Payscale.com shows that DevOps managers earn as much as $122,234 per year, with DevOps engineers making as much as $151,461. DevOps is the third-highest-ranked tech role by employer demand on Indeed.com but has the second-highest talent deficit.
This DevOps training course will benefit the following professional roles:
1. Software Developers
2. Technical Project Managers
3. Architects
4. Operations Support
5. Deployment Engineers
6. IT Managers
7. Development Managers
Learn more at: https://www.simplilearn.com/
Using Docker to build and test in your laptop and Jenkins - Micael Gallego
Docker is changing the way we create and deploy software. This presentation is a hands-on introduction to using Docker to build and test software, on your laptop and on your Jenkins CI server.
The Jenkins open source continuous integration server now provides a “pipeline” scripting language which can define jobs that persist across server restarts, can be stored in a source code repository and can be versioned with the source code they are building. By defining the build and deployment pipeline in source code, teams can take full control of their build and deployment steps. The Docker project provides lightweight containers and a system for defining and managing those containers. The Jenkins pipeline and Docker containers are a great combination to improve the portability, reliability, and consistency of your build process.
This session will demonstrate Jenkins and Docker in the journey from continuous integration to DevOps.
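The combination described above can be sketched in a scripted pipeline that runs its build steps inside a container. This is an illustrative fragment only, assuming a Node.js project; the image and commands are placeholders:

```groovy
node {
    checkout scm  // pull the repository that contains this pipeline definition

    // Run the build and tests inside a disposable container, so the build
    // environment is defined by the image rather than by the Jenkins agent
    docker.image('node:18-alpine').inside {
        sh 'npm ci'
        sh 'npm test'
    }
}
```

Because the Jenkinsfile lives in the repository, the pipeline is versioned with the code it builds, which is exactly the portability and consistency benefit the abstract describes.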
Deploying Spring Boot applications with Docker (east bay cloud meetup dec 2014) - Chris Richardson
This presentation describes how to deploy a Spring Boot-based microservice using Docker.
See http://plainoldobjects.com/2014/11/16/deploying-spring-boot-based-microservices-with-docker/
Continuous Integration/Deployment with Docker and Jenkins - Francesco Bruni
“Continuous Integration doesn’t get rid of bugs, but it does make them dramatically easier to find and remove” M. Fowler
Jenkins and Docker are cool technologies. Here's how they serve in a continuous-integration-based process and how they can be used to deliver new versions of the same software.
The slides present the whole process along with real code snippets.
Infrastructure Deployment with Docker & Ansible - Robert Reiz
This is an introduction to Docker & Ansible. It shows how Ansible can be used as an orchestration tool for Docker. Two real-world examples are included, with code samples in a Gist.
DevOps and Continuous Delivery reference architectures for Docker - Sonatype
People want to understand how to architect continuous delivery and DevOps environments using containerized applications and artifacts. We assembled this deck to represent best practices across a number of different organizations. These may look like the tool chains and infrastructure that you have built or would like to build.
Production sec ops with kubernetes in docker - Docker, Inc.
In this talk, Scott Coulton will walk through how to build a container-as-a-service platform with Docker EE. Starting from scratch, he will help you figure out which orchestrator to choose by deep-diving into the technical differences between Swarm and Kubernetes on the EE platform, and cover some of the practical considerations that could influence your decision. He will also share various automation solutions to deploy your cluster into production. Once the cluster is up and running, Scott will delve into sec ops and discuss security best practices, including signing images in DTR (Docker Trusted Registry) and CVE scanning to provide a secure supply chain into production. You'll leave this talk with the knowledge needed to build your own container platform in production. And did I mention it will all be done live, step-by-step?
Reduce DevOps Friction with Docker & Jenkins by Andy Pemberton, Cloudbees - Docker, Inc.
Jenkins and Docker are two game-changing technologies: together, they have huge potential to reduce DevOps friction. Come learn about the integration points between CloudBees Jenkins Platform and Docker and how you can use them to get on the path to frictionless DevOps in your company.
In this deck from the Stanford HPC Conference, Christian Kniep from Docker, Inc. gives a tutorial on linux containers.
"This tutorial provides a detailed overview of the components needed to run containerized applications and explores how distributed HPC applications can be tackled. We’ll explain the concept of Linux Containers and describe the bits and pieces participants will explore following step-by-step examples.
The workshop will introduce the predominant forms of orchestration in the industry; what problems they solve and how to approach the problem.
Attendees will explore the benefits and drawbacks of orchestrators first hand with their own small exemplary stack deployments.
Finally the workshop will introduce how HPC and Big Data workloads can be tackled on-top of these service-oriented clusters."
Watch the video: https://youtu.be/LJinZpCTyk0
Learn more: http://www.docker.com/ and http://hpcadvisorycouncil.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
DCEU 18: Building Your Development Pipeline - Docker, Inc.
Oliver Pomeroy - Solution Engineer, Docker
Laura Frank Tacho - Director of Engineering, CloudBees
Enterprises often want to provide automation and standardisation on top of their container platform, using a pipeline to build and deploy their containerized applications. However this opens up new challenges… Do I have to build a new CI/CD Stack? Can I build my CI/CD pipeline with Kubernetes orchestration? What should my build agents look like? How do I integrate my pipeline into my enterprise container registry? In this session full of examples and “how-to”s, Olly and Laura will guide you through common situations and decisions related to your pipelines. We’ll cover building minimal images, scanning and signing images, and give examples on how to enforce compliance standards and best practices across your teams.
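One of the practices mentioned above, building minimal images, is commonly done with a multi-stage build. A hedged sketch for a hypothetical Go service (the paths and binary name are illustrative assumptions):

```dockerfile
# Stage 1: build the binary with the full Go toolchain
FROM golang:1.21 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /service ./cmd/service

# Stage 2: copy only the static binary into a minimal runtime image
FROM scratch
COPY --from=build /service /service
ENTRYPOINT ["/service"]
```

Only the final stage ships, so the toolchain and source code never reach production, which shrinks the image and its attack surface.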
2016 Docker Palo Alto - CD with ECS and Jenkins - Tracy Kennedy
Through the use of build pipelines, Continuous Delivery enables faster and more frequent build, test and deployment cycles. But how do we build a continuous delivery pipeline in the real world? In this session, we demonstrate how to code a pipeline that builds a containerized application and ultimately deploys it to Amazon's container service, ECS.
Presentation on Docker and Docker Compose, including basic commands to get started with Docker containers. It was presented on 9 February 2018.
DCSF 19: Building Your Development Pipeline - Docker, Inc.
Oliver Pomeroy, Docker & Laura Tacho, Cloudbees
Enterprises often want to provide automation and standardisation on top of their container platform, using a pipeline to build and deploy their containerized applications. However this opens up new challenges; Do I have to build a new CI/CD Stack? Can I build my CI/CD pipeline with Kubernetes orchestration? What should my build agents look like? How do I integrate my pipeline into my enterprise container registry? In this session full of examples and how-to's, Olly and Laura will guide you through common situations and decisions related to your pipelines. We'll cover building minimal images, scanning and signing images, and give examples on how to enforce compliance standards and best practices across your teams.
PuppetConf 2017: What’s in the Box?! - Leveraging Puppet Enterprise & Docker... - Puppet
“Docker, Docker, Docker.” It’s a phrase we hear often, but what are containers, what can they be used for, and why should you know more about them? In this session, Grace (Puppet) and Tricia (AppDynamics) will introduce attendees to Docker and help them build and deploy their first container with Puppet. They will leverage the docker_image_build module from the Puppet Forge and take attendees through the proper workflow for coupling Docker and Puppet together. The session will focus on how to use some of the newest Docker features, such as multi-stage build files and password stores within Docker so you can pass "secrets" to a swarm for login credentials. The goal is to provide newcomers with a working proficiency of how to get started deploying containers using Puppet as their automation tool.
Enhancing the application development process in all its phases (building, scaling, shipping, deploying and running) plays a vital role in today’s competitive IT industry by shortening the time between writing code and running it.
This presentation gives a brief understanding of Docker architecture, explains what Docker is not, describes the basic commands, and explains CI/CD as an application of Docker.
Slides for the workshop session at Linuxing in London, introducing a demo app for the workshop, which uses Node.js, SQL Server, Nginx, Prometheus and Grafana - all in Docker containers on Linux.
Docker and Puppet for Continuous Integration - Giacomo Vacca
Today developers want to change the code, build and deploy often, even several times per day.
New versions of software may need to be tested on different distributions, and with different configurations.
Achieving this with virtual machines is possible, but very resource- and time-consuming. Docker provides an incredibly good solution for this, particularly when combined with Continuous Integration tools like Jenkins and configuration management tools like Puppet.
This presentation focuses on the opportunities to automatically configure Docker images, use Docker containers as disposable workers during your tests, and even run your Continuous Integration system inside Docker.
Similar to Build, Publish, Deploy and Test Docker images and containers with Jenkins Workflow
Containerize Your Game Server for the Best Multiplayer Experience - Docker, Inc.
Raymond Arifianto, AccelByte and Mark Mandel, Google -
We have been deploying containerized micro-services for our Game Backend Services for a while. Now we are tackling the challenge to scale up fleets of game dedicated servers in multiple regions, multiple data centers and multiple providers - some in bare metal, some in Cloud. So we leverage docker containerization to deploy Game Servers to achieve Portability, Fast Deployment and Predictability, enabling us to scale up to thousands of servers, on demand, without a sweat.
How to Improve Your Image Builds Using Advance Docker Build - Docker, Inc.
Nicholas Dille, Haufe-Lexware + Docker Captain -
Docker continues to be the standard tool for building container images. For more than a year, Docker has shipped with BuildKit as an alternative image builder, providing advanced features for secret and cache management. These features help make image builds faster and more secure. In this session, Docker Captain Nicholas Dille will teach you how to use BuildKit features to your advantage.
Build & Deploy Multi-Container Applications to AWS - Docker, Inc.
Lukonde Mwila, Entelect -
As the cloud-native approach to development and deployment becomes more prevalent, it's an exciting time for software engineers to learn how to dockerize multi-container applications and deploy them to the cloud.
In this talk, Lukonde Mwila, Software Engineer at Entelect, will cover the following topics:
- Docker Compose
- Containerizing an Nginx server
- Containerizing a React app
- Containerizing a Node.js app
- Containerizing a MongoDB instance
- Running the multi-container app locally
- Creating a CI/CD pipeline
- Adding a build stage to test containers and push images to Docker Hub
- Deploying the multi-container app to AWS Elastic Beanstalk
Lukonde will start by giving an overview of how Docker Compose works and how it makes it very easy and straightforward to startup multiple Docker containers at the same time and automatically connect them together with some form of networking.
After that, Lukonde will take a hands-on approach, containerizing an Nginx server, a React app, a Node.js app and a MongoDB instance to demonstrate the power of Docker Compose. He'll demonstrate the use of two Dockerfiles per application, one production-grade and the other for local development and running of tests. Lastly, he'll demonstrate creating a CI/CD pipeline in AWS to build and test the Docker images before pushing them to Docker Hub or AWS ECR, and finally deploying the multi-container application to AWS Elastic Beanstalk.
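The Compose setup described above can be sketched in a minimal docker-compose.yml. The service names, image tags and ports here are illustrative assumptions, not taken from the talk:

```yaml
services:
  web:
    build: ./app          # app container built from a local Dockerfile
    ports:
      - "3000:3000"       # expose the app on the host
    depends_on:
      - db                # start the database first
  db:
    image: mongo:6        # MongoDB instance from the official image
    volumes:
      - db-data:/data/db  # persist data across container restarts

volumes:
  db-data:
```

Running `docker compose up` starts both containers on a shared network, so `web` can reach the database at the hostname `db`.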
Securing Your Containerized Applications with NGINX - Docker, Inc.
Kevin Jones, NGINX -
NGINX is one of the most popular images on Docker Hub and has been at the forefront of the web since the early 2000's. In this talk we will discuss how and why NGINX's lightweight and powerful architecture makes it a very popular choice for securing containerized applications as a sidecar reverse proxy within containers. We will highlight important aspects of application security that NGINX can help with, such as TLS, HTTP, AuthN, AuthZ and traffic control.
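A sidecar reverse proxy of the kind described can be sketched with a minimal NGINX config; the certificate paths, the upstream name `app`, and the port are assumptions for illustration:

```shell
# Sketch of an NGINX sidecar config terminating TLS, enforcing basic AuthN,
# and proxying to an app container on the same network.
mkdir -p /tmp/nginx-sidecar && cd /tmp/nginx-sidecar

cat > default.conf <<'EOF'
server {
    listen 443 ssl;
    ssl_certificate     /etc/nginx/certs/server.crt;
    ssl_certificate_key /etc/nginx/certs/server.key;

    location / {
        # Basic AuthN at the edge; the app container never handles TLS or auth
        auth_basic           "restricted";
        auth_basic_user_file /etc/nginx/.htpasswd;

        # Traffic control and header hygiene also live at this layer
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_pass http://app:8080;   # "app" resolves via the shared network
    }
}
EOF
echo "wrote $(pwd)/default.conf"
```

The app container stays plain HTTP internally; everything security-sensitive is concentrated in the proxy, which can be updated independently.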
How To Build and Run Node Apps with Docker and Compose - Docker, Inc.
Kathleen Juell, Digital Ocean -
Containers are an essential part of today's microservice ecosystem, as they allow developers and operators to maintain standards of reliability and reproducibility in fast-paced deployment scenarios. And while there are best practices that extend across stacks in containerized environments, there are also things that make each stack distinct, starting with the application image itself.
This talk will dive into some of these particularities, both at the image and service level, while also covering general best practices for building and running Node applications with database backends using Docker and Compose.
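One stack-specific particularity at the image level is the multi-stage Node build. The following Dockerfile is a generic best-practice sketch (the paths and scripts such as `npm run build` and `dist/server.js` are assumptions), not the talk's own example:

```shell
# Multi-stage Node image: dev dependencies stay in the build stage;
# the runtime stage ships production deps only and runs as non-root.
mkdir -p /tmp/node-demo && cd /tmp/node-demo

cat > Dockerfile <<'EOF'
# --- build stage: full toolchain, dev dependencies, asset build ---
FROM node:14 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# --- runtime stage: production deps only, non-root user ---
FROM node:14-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY --from=build /app/dist ./dist
USER node
CMD ["node", "dist/server.js"]
EOF
echo "stages: $(grep -c '^FROM' Dockerfile)"
```

Copying `package*.json` before the source keeps the dependency layer cached across code-only changes, which is the main build-speed win for Node images.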
Jessica Deen, Microsoft -
Helm 3 is here; let's go hands-on! In this demo-fueled session, I'll walk you through the differences between Helm 2 and Helm 3. I'll offer tips for a successful rollout or upgrade, go over how to easily use charts created for Helm 2 with Helm 3 (without changing your syntax), and review opportunities where you can participate in the project's future.
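The headline syntax difference between Helm 2 and Helm 3 can be illustrated at the command level; the release and chart names below are made up:

```shell
# Command-level differences between Helm 2 and Helm 3, captured as strings.

# Helm 2: Tiller runs in the cluster and the release name uses a --name flag
helm2_install='helm install --name my-release stable/nginx-ingress'

# Helm 3: Tiller is gone, the release name is a positional argument,
# `helm init` no longer exists, and release names are namespace-scoped
helm3_install='helm install my-release stable/nginx-ingress'

printf '%s\n' "$helm2_install" "$helm3_install" > /tmp/helm-commands.txt
cat /tmp/helm-commands.txt
```

Charts written for Helm 2 generally install unchanged under Helm 3; it is the CLI invocation and the release bookkeeping (no more Tiller, no cluster-global names) that differ.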
Distributed Deep Learning with Docker at Salesforce - Docker, Inc.
Jeff Hajewski, Salesforce -
There is a wealth of information on building deep learning models with PyTorch or TensorFlow. Anyone interested in building a deep learning model is only a quick search away from a number of clear and well written tutorials that will take them from zero knowledge to having a working image classifier. But what happens when you need to deploy these models in a production setting? At Salesforce, we use TensorFlow models to help us provide customers with insights into their data, and we do this as close to real-time as possible. Designing these systems in a scalable manner requires overcoming a number of design challenges, but the core component is Docker. Docker enables us to design highly scalable systems by allowing us to focus on service interactions, rather than how our services will interact with the hardware. Docker is also at the core of our test infrastructure, allowing developers and data scientists to build and test the system in an end to end manner on their local machines. While some of this may sound complex, the core message is simplicity - Docker allows us to focus on the aspects of the system that matter, greatly simplifying our lives.
The First 10M Pulls: Building The Official Curl Image for Docker Hub - Docker, Inc.
James Fuller, webcomposite s.r.o. -
Curl is the venerable (yet very modern) 'swiss army knife' command line tool and library for transferring data with URLs. Recently we (the Curl team) decided to build a release for Docker Hub. This talk will outline our current development workflow with respect to the docker image and provide insights on what it takes to build a docker image for mass public consumption. We are also keen to learn from users and other developers how we might improve and enhance the official curl docker image.
Fabian Stäber, Instana -
In recent years, we saw a great paradigm shift in software engineering away from static monolithic applications towards dynamic distributed horizontally scalable architectures. Docker is one of the key technologies enabling this development. This shift poses a lot of new challenges for application monitoring, ranging from practical issues (need for automation) to technical challenges (Docker networking) to organizational topics (blurring line between software engineers and operations) to fundamental questions (define what is an application). In this talk we show how Docker changed the way we do monitoring, how modern application monitoring systems work, and what future developments we expect.
COVID-19 in Italy: How Docker is Helping the Biggest Italian IT Company Conti... - Docker, Inc.
Clemente Biondo, Engineering Ingegneria Informatica -
When the COVID 19 pandemic started, Engineering Ingegneria Informatica Group (1.25 billion euros of revenues, 65 offices around the world, 12.000 employees) was forced to put their digital transformation to the test in order to maintain operational continuity. In this session, Clemente Biondo, the Tech Lead of the Information Systems Department, will share how his company is reacting to this unforeseeable scenario and how Docker-driven digital transformation had paved the path for work to continue remotely. Clemente will discuss learnings moving from colocated teams, manual approaches, email based-business processes, and a monolithic application to a mature DevOps culture characterized by a distributed autonomous workforce and a continuous deployment process that deploys backward-compatible Docker containerized microservices into hybrid multi cloud datacenters an average of twice a day with zero-downtime. He will detail how they use Docker to unify dev, test and production environments, and as an efficient and automated mechanism for deploying applications. Lastly, Clemente shares how, in our darkest hour, he and others are working to shine their brightest light.
Chris Lauer, NOAA Space Weather Prediction Center -
This is the story of how adopting a containerized workflow changed the way our small software team works at NOAA’s Space Weather Prediction Center. Our old architecture, a big ball of mud shared-database integration, just wasn’t cutting it - it was killing our agility. Over the past two years, our small team has adopted a microservice style architecture, using Docker with docker-compose and environment files as our deployment strategy for all new development. We’ve discovered the joys of using containers for identical dev, staging, and production environments. We work closely with scientists: much of the code we’re running has complicated and conflicting library dependencies. Docker captures these beautifully - we’ve even had some success teaching our scientists to use it! I’ll share what we’ve learned, some of the persistent challenges we face, and one place we really got it wrong. This talk builds off of a popular hallway track from DockerCon 2019.
Become a Docker Power User With Microsoft Visual Studio Code - Docker, Inc.
Brian Christner, 56k + Docker Captain -
In this session, we will unlock the full potential of using Microsoft Visual Studio Code (VS Code) and Docker Desktop to turn you into a Docker Power User. When we expand and utilize the VS Code Docker plugin, we can take our projects and Docker skills to the next level. In addition to using VS Code, we streamline our Docker Desktop development workflow with less context switching and built-in shortcuts. You will learn how to bootstrap new projects, quickly write Dockerfiles utilizing templates, build, run, and interact with containers all from VS Code.
How to Use Mirroring and Caching to Optimize your Container Registry - Docker, Inc.
Brandon Mitchell, Boxboat + Docker Captain -
How do you make your builds more performant? This talk looks at options for configuring caching and mirroring of the images you need, to save on bandwidth costs and to keep running even if something goes down upstream.
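Two common mechanisms behind such a setup can be sketched as config fragments: a daemon-level mirror in `daemon.json`, and a registry instance configured as a pull-through cache. The mirror URL is a placeholder, and this is a generic sketch rather than the talk's own configuration:

```shell
mkdir -p /tmp/registry-demo && cd /tmp/registry-demo

# 1) Point the Docker daemon at a pull-through mirror for Docker Hub
cat > daemon.json <<'EOF'
{
  "registry-mirrors": ["https://registry.example.com"]
}
EOF

# 2) Configure a local registry:2 instance as a pull-through cache of Docker Hub
#    (fragment of its config.yml; a full config has more sections)
cat > registry-cache.yml <<'EOF'
proxy:
  remoteurl: https://registry-1.docker.io
EOF
# It would be started with (requires a Docker daemon):
#   docker run -d -p 5000:5000 \
#     -v "$PWD/registry-cache.yml:/etc/docker/registry/config.yml" registry:2
echo "wrote daemon.json and registry-cache.yml"
```

With both in place, repeated pulls of the same image hit the local cache instead of Docker Hub, which both saves bandwidth and survives upstream outages for already-cached tags.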
Monolithic to Microservices + Docker = SDLC on Steroids! - Docker, Inc.
Ashish Sharma, SS&C Eze -
SS&C Eze provides various products in the stock market domain. We spent the last couple of years building Eclipse, an investment suite born in the cloud. The journey so far has been very interesting. The very first version of the product was a bunch of monolithic Windows services deployed using the Octopus tool. We successfully managed to bring all the monolithic problems to the cloud and created a nightmare for ourselves. We then started applying microservices architecture principles and breaking the monolith into small services. Very soon we realized that we needed a better packaging/deployment tool. Docker looked like a magical solution to our problem. Since its adoption, it has not only solved the deployment problem for us but has made a deep impact on different aspects of the SDLC. It allowed us to use heterogeneous technology stacks, simplified development environment setup, simplified our testing strategy, improved our speed of delivery, and made our developers more productive. In this talk I would like to share our experience of using Docker and its positive impact on our SDLC.
Ara Pulido, Datadog -
Container technologies, although not new, have increased their popularity in the past few years, with container orchestrators allowing companies around the world to adopt these technologies to help them ship and scale microservices with precision and velocity. Kubernetes is currently the most popular container orchestration platform, and while many organizations are migrating their workloads to it, Kubernetes is still relatively immature. New corner cases, errors, and quirks are regularly discovered as users push the boundaries of size and scale. When Datadog adopted Kubernetes we discovered some of these boundaries the hard way, and we continuously challenge and modify our infrastructure decisions in order to fit our use case. Join me in this talk for our story on what we learned while we scaled our Kubernetes clusters, the contributions to Kubernetes we made along the way, and how you can apply those learnings when growing your Kubernetes clusters from a handful to hundreds or thousands of nodes.
Andy Clemenko, StackRox -
One underutilized, and amazing, thing about the Docker image scheme is labels. Labels are a built-in way to document all aspects of the image itself. Think about all the information the tags inside your clothing carry: if you care to look, you can find out everything about the garment. All that information can be very valuable. Now think about how we can leverage labels to carry similar information. We can even use labels to contain Docker Compose or Kubernetes YAML, and including labels in the CI/CD process can make things more secure and smoother. Come find out some fun techniques on how to leverage labels to do some fun and amazing things.
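Adding and reading labels can be sketched as follows; the values are made up, and the keys follow the OCI image annotation conventions rather than anything specific from the talk:

```shell
# Bake metadata into the image at build time, then read it back with
# docker inspect after building.
mkdir -p /tmp/label-demo && cd /tmp/label-demo

cat > Dockerfile <<'EOF'
FROM alpine:3.12
LABEL org.opencontainers.image.source="https://github.com/example/app" \
      org.opencontainers.image.revision="abc1234" \
      com.example.notes="labels can even carry embedded YAML or Compose text"
EOF

# After a build, a CI job could read a label back like this
# (requires a Docker daemon and a built image named example/app):
#   docker inspect --format \
#     '{{ index .Config.Labels "org.opencontainers.image.revision" }}' example/app
echo "wrote $(pwd)/Dockerfile"
```

Because labels travel inside the image, any downstream stage (scanner, deployer, auditor) can recover the source repo and commit without consulting an external database.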
Using Docker Hub at Scale to Support Micro Focus' Delivery and Deployment Model - Docker, Inc.
Patrick Deloulay, Micro Focus -
Micro Focus started their digital transformation 3 years ago, moving the entire portfolio into hundreds of container images. Leveraging Docker Hub as our primary registry service, we will cover how we ended up building a simple but secure push/pull model to publish and deliver our premium assets to our customers and partners to both meet the high agility of our DevOps teams while greatly simplifying the deployment of our applications.
From Fortran on the Desktop to Kubernetes in the Cloud: A Windows Migration S... - Docker, Inc.
Elton Stoneman, Docker Captain + Container Consultant and Trainer
How do you provide a SaaS offering when your product is a 10-year-old Fortran app, currently built to run on Windows 10? With Docker and Kubernetes, of course - and you can do it in a week (... to prototype level at least).
In this session I'll walk through the processes and practicalities of taking an older Windows app, making it run in containers with Kubernetes, and then building a simple API wrapper to host the whole stack as a cloud-based SaaS product.
There's a lot of technology here from a real world case study, and I'll focus on:
- running Windows apps in Docker containers
- building a .NET Core API which can run in Linux or Windows containers
- running the stack in Kubernetes with Docker Desktop locally and AKS in the cloud
- configuring AKS workloads in Azure to burst out to Azure Container Instances
And there's a core theme to this session: Docker and Kubernetes are complex technologies, but they're the key to modern development. If you invest time learning them, they make projects like this simple, portable, fast and fun.
Developing with Docker for the Arm Architecture - Docker, Inc.
This virtual meetup introduces the concepts and best practices of using Docker containers for software development for the Arm architecture across a variety of hardware systems. Using Docker Desktop on Windows or Mac, Amazon Web Services (AWS) A1 instances, and embedded Linux, we will demonstrate the latest Docker features to build, share, and run multi-architecture images with transparent support for Arm.
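One way to produce such multi-architecture images is `docker buildx`. The builder name, image name, and platform list below are illustrative, and the script is only written out rather than executed, since running it needs a Docker daemon with buildx:

```shell
# Multi-arch build workflow, sketched as CLI steps.
cat > /tmp/buildx-steps.sh <<'EOF'
# Create and select a builder that can target multiple platforms
docker buildx create --name multiarch --use
# Build for x86_64 and two Arm variants in one invocation, pushing a manifest list
docker buildx build \
  --platform linux/amd64,linux/arm64,linux/arm/v7 \
  -t example/app:latest --push .
# Consumers on any of those architectures just `docker pull example/app:latest`;
# the registry serves the matching image transparently.
EOF
echo "wrote /tmp/buildx-steps.sh"
```

The "transparent support" in the abstract comes from the manifest list: one tag fans out to per-architecture images, so Desktop, AWS A1, and embedded Linux all pull the same name.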
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... - Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes much work. It takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Search and Society: Reimagining Information Access for Radical Futures - Bhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse, explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what Testing in DevOps is. We also held a lovely workshop with the participants, trying to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview - Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities, spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Transcript: Selling digital books in 2024: Insights from industry leaders - T... - BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024 - Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an “infrastructure container Kubernetes guy”, how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our beloved cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply AI to our own infrastructure and get it working from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working for real.
Key Trends Shaping the Future of Infrastructure.pdf - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality - Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Essentials of Automations: Optimizing FME Workflows with Parameters - Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
9. The New World Order: Containers Codify OS Config
[Diagram] Traditionally, each environment runs its own server/VM with its own OS configuration: Dev Server/VM <DEV OS config>, QA Server/VM <QA OS config>, Staging Server/VM <STG OS config>, Prod Server/VM <PROD OS config>. With containers, every environment runs the same artifact, App <code> + <APP OS config>, so the OS configuration travels with the application instead of belonging to the host.
13. How Can You Use Jenkins & Docker Together?
1. Run Jenkins masters & slaves in Docker
2. Build, test, & deploy Docker images from Jenkins
14. 1. Run Jenkins Masters & Slaves in Docker
- Docker (Cloud): use Docker images as standardized build environments to improve isolation and elasticity
- Docker Custom Build Environment: specify customized build environments as Docker containers
- CloudBees Docker Shared Config: manage Docker (or Swarm) host configuration centrally in CloudBees Jenkins Operations Center
15. 2. Build, Test, & Deploy Docker Images from Jenkins
- Build and Publish: build projects that have a Dockerfile and push the resulting tagged image to Docker Hub
- Docker Traceability: identify which build pushed a particular container and display the build / image details in Jenkins
- Docker Hub Notification: trigger downstream jobs when a tagged container is pushed to Docker Hub
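What the Build and Publish plugin automates roughly corresponds to this CLI sequence. The image name, tag scheme, and credential variables are placeholders; the script is written to a file rather than executed, since running it needs a Docker daemon and registry credentials:

```shell
# The manual equivalent of the Build and Publish step for a project
# with a Dockerfile at its root.
cat > /tmp/build-publish.sh <<'EOF'
docker build -t example/mobile-deposit-api:1.0."$BUILD_NUMBER" .
docker tag example/mobile-deposit-api:1.0."$BUILD_NUMBER" example/mobile-deposit-api:latest
echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin
docker push example/mobile-deposit-api:1.0."$BUILD_NUMBER"
docker push example/mobile-deposit-api:latest
EOF
echo "wrote /tmp/build-publish.sh"
```

Tagging with the Jenkins `BUILD_NUMBER` gives every push a unique, traceable tag while `latest` tracks the newest successful build.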
17. Jenkins Workflow Primer
Jenkins-powered CD pipelines with Jenkins Workflow.
[Diagram] A flow from Dev to Prod through stages such as Commit, Build, Sonar Test, Selenium Test, Perf Test, and Stage Deploy.
Pipelines need: branching, looping, restarts, checkpoints, manual input, and more.
18. Key Workflow Features
- Entire flow is one concise Groovy script using the Workflow DSL: for loops, try-finally, fork-join, …
- Jenkins can be restarted while a flow is running
- Allocate slave nodes and workspaces: as many as you want, when you want
- Stages throttle concurrency of builds
- Human input/approval integrated into the flow
- Standard project concepts: SCM, artifacts, plugins
20. Pipeline Stages
[Diagram] SCM Checkout → Build and Unit Test App (mvn package) → Sonar Analysis (mvn sonar:sonar) → Int Test (mvn verify) → Prepare Release → Build Docker Image (docker build, docker tag) → Test Docker Image (docker run, cucumber, image.inside / withServer) → Publish Docker Image (docker push, notify). Artifacts produced along the way: war, img.
21. Build, unit test and package
[Pipeline stage diagram repeated from slide 20, highlighting the Build and Unit Test App stage: SCM checkout, mvn package.]
22. Build, unit test and package
stage 'Build App'                                       // specify the stage name
node('docker') {                                        // specify the slave label
    // Custom build environment: mount a volume from the slave
    docker.image('maven:3.3.3-jdk-8').inside('-v /data:/data') {
        sh 'mkdir -p /data/mvn'                         // .m2 repo location
        writeFile file: 'settings.xml', text: '(………)'
        git 'https://github.com/cloudbees/mobile-deposit-api.git'
        sh 'mvn -s settings.xml clean package'          // checkout and build
        …
24. Test the app
(Pipeline stage diagram repeated from slide 20)
25. Test the app
node('docker') {
    docker.image('maven:3.3.3-jdk-8').inside('-v /data:/data') {
        …
        stage 'Sonar analysis'
        sh 'mvn -s settings.xml sonar:sonar'
        stage 'Integration-test'
        sh 'mvn -s settings.xml verify'
        step([$class: 'JUnitResultArchiver', testResults: '**/target/surefire-reports/TEST-*.xml'])
    }
    …
Callouts: runs in the same environment as the build; Sonar tests; run API tests
26. Build, test and publish Docker image
(Pipeline stage diagram repeated from slide 20)
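The speaker notes for this slide describe building the image, tagging it, running a container for tests, and pushing on success. Put together, those steps might look roughly like this sketch using the docker-workflow DSL – the host URI, credential IDs, and test script are placeholders, and `version` is assumed to have been set in the Prepare Release stage:

```groovy
docker.withServer('tcp://docker-host:2376', 'docker-host-credentials') { // placeholder host/credentials
    stage 'Build Docker Image'
    def img = docker.build("cloudbees/mobile-deposit-api:${version}")    // build and tag in one step

    stage 'Test Docker Image'
    def container = img.run('-p 8080:8080')      // spin up a container from the image
    try {
        sh './run-functional-tests.sh'           // hypothetical functional test script
    } finally {
        container.stop()
    }

    stage 'Publish Docker Image'
    docker.withRegistry('https://index.docker.io/v1/', 'docker-hub-credentials') {
        img.push()                               // push the tested, tagged image
    }
}
```

The handles returned by docker.build and img.run are what let later steps address the image and container, exactly as the notes describe.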
30. Traceability
Builds on existing Jenkins artifact traceability.
Allows the tracking of the creation and use of Docker containers in Jenkins, and their future use.
Combine with artifact fingerprinting for a comprehensive solution.
Each build shows the image fingerprints created.
Identify which build pushed a particular container and display the build / image details in Jenkins.
(Screenshot: image fingerprints)
31. Traceability – registering events
Jenkins can track actions against this image such as:
• Creating a container
• Container events such as start/stop
To achieve this, it is necessary to call the Traceability API – see $(JENKINS_URL)/docker-traceability/api/
There are two endpoints to submit events to:
/docker-traceability/submitContainerStatus – submits the current container status snapshot with a minimal set of parameters. The output of docker inspect $(containerId) can be submitted directly to the Jenkins server using this command.
/docker-traceability/submitReport – submits a report using the extended JSON API. This endpoint can be used by scripts to submit the full available info about the container and its environment in a single command.
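As the speaker notes later mention, the output of docker inspect can be passed straight to the submitContainerStatus endpoint. From a Workflow script that might look like the sketch below – hedged heavily: the form-parameter name `inspectData` and the user/token are assumptions, so check $(JENKINS_URL)/docker-traceability/api/ for the exact request format:

```groovy
// Sketch: after starting a container, report its status to Jenkins.
// containerId is assumed to hold the id returned when the container was run.
node('docker') {
    sh """
        docker inspect ${containerId} > inspect.json
        curl -X POST -u user:apitoken \\
             --data-urlencode inspectData@inspect.json \\
             ${env.JENKINS_URL}docker-traceability/submitContainerStatus
    """
}
```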
36. Docker Hub Notification
Trigger downstream jobs when a tagged container is pushed to Docker Hub.
The Docker Hub Notification Trigger plugin lets you configure Jenkins to trigger builds when an image is pushed to Docker Hub, e.g. to run verification for the container.
What are the steps?
Set up a webhook account for notification
Set up your Docker registry to make callbacks on image events
Set up your builds
37. Docker Hub Notification – Docker Registry Webhook
In the format: http://<user>:<token>@<jenkins_url>/dockerhub-webhook/notify
40. Docker and Jenkins with Workflow is the proven CD Platform
Workflow CD pipeline triggers:
• New application code (i.e. feature, bug, etc.)
• Updated certified stack (security fix in Linux, etc.)
… will lead to a new gold image being built and available for…
… TESTING
… STAGING
… PRODUCTION
All taking place in a standardized/similar/consistent environment
(Diagram: app <code> from git etc. plus <OS config> from certified Docker images (Ubuntu, etc.) feed Jenkins Workflow, which produces the company "gold" Docker image, roughly one per app)
41. CloudBees: Leading the Way for Docker and CD
Docker Workflow – provides first-class support for Jenkins Workflow to build real-world CD pipelines for containerized applications using Jenkins and Docker
Build and Publish – builds projects that have a Dockerfile and pushes the resultant tagged image to Docker Hub
Docker Hub Notification – triggers downstream jobs when a tagged container is pushed to Docker Hub
Docker Traceability – identifies which build pushed a particular container that is running in production and displays that on the Jenkins builds page
Docker – uses Docker containers as standardized build environments to improve isolation and elasticity – Dockerized build slaves
Docker Custom Build Environment – specifies customized build environments as Docker containers
42. Getting started
Docker plugin documentation: http://documentation.cloudbees.com/docs/cje-user-guide/docker-workflow.html
Workflow tutorial: https://github.com/jenkinsci/workflow-plugin/blob/master/TUTORIAL.md
Example source code: https://github.com/harniman/mobile-deposit-api/blob/master/flow.groovy
43. How Do You Manage CD at Enterprise Scale?
CloudBees Jenkins Platform
Jenkins at Enterprise Scale for CI and CD
About me:
I work for CloudBees as a Solution Architect helping our customers understand how CloudBees Jenkins Platform can help them solve their goals.
I have been in engineering for over 20 years and have performed various Java development and architecture roles, including a stint as a build engineer and as a DevOps lead.
I came to CloudBees from Sky, where I had responsibility for the online video platform. As part of my time there I designed and built an online platform for sales and service using Infrastructure as Code principles – devops before it was called that! QAs deployed many times a day via a self-service mechanism, with DB redeployment/upgrade and flexible mocking options. We deployed to prod weekly, with full VM tear-down and rebuild via a scripted "next, next" approach.
I am interested in all things automation, devops, and especially how that applies in the cloud.
We've heard this meme over and over. Marc Andreessen said "Software is eating the world."
What does this mean? Wherever we look, products are defined as much by the software they run as by their physical appearance. For instance, is a car defined just by its style, or by the driver-automation features implemented in software, such as auto parking, lane assist, adaptive cruise control, and self-driving? What about the recent emissions scandal involving a certain German manufacturer? Was that attributed to hardware or software?
The software stakes have never been higher. Quality needs have never been greater – who wants their self-driving car to crash? – but speed to market of new features becomes critical as software becomes a key differentiator.
So, how do we do that? How do we deliver better software faster?
How do we take code developed by developers and rapidly move it to production as new features for users?
Whilst maintaining quality.
Well, Automation is the key.
Just as the Tesla Motor Company built a fully automated factory floor to produce their leading edge cars, we need to build a fully automated software factory using automation technologies.
Let's look at the advantage Docker brings to speeding up this process.
A typical full stack configuration looks like this:
Develop Code
Commit to SCM
Build and test app with Jenkins
Provision environment with Puppet
Test
Environment and App code are not bound tightly together. Environment changes do not propagate with App changes.
Testers find bugs; developers have to spend time investigating why it worked in DEV and not in PROD, and then reworking. This is not fast.
Use Docker to manage the environment config alongside the application. Propagate the same configuration across all environments.
If it works in Dev, it will work in prod.
Focus on new innovation rather than fault finding.
What does this look like in reality?
We package all app related OS config with the application code.
The same tested package is propagated across the environments.
This takes the single binary concept to the next level.
(NB: we still have to manage the data and network layers and provide consistent configuration; other tooling can address these needs.)
Images need to be built using a reliable, repeatable and automated process
These days it is not acceptable to build application artifacts by hand – so Docker Images need the same type of automation.
This is where Jenkins comes to our rescue
Jenkins is widely used for application CI and drives many CD initiatives (RebelLabs research showed 70% of Java projects use Jenkins).
Let's look at how Jenkins and Docker can be used together to take your delivery process to the next level.
Two patterns of use:
Use Docker to provide run-time environments for Jenkins components – Slaves and Masters (And Operations Center if running CloudBees Jenkins Platform)
Use Jenkins to build and test Docker Images
Firstly Docker can be leveraged as the runtime platform for Jenkins components such as Masters and Slaves.
There are standard Docker images for masters and for the CloudBees Jenkins Platform components. Docker can also be used to provision slave nodes on demand using the Docker Slaves plugin. Various images exist, or roll your own with all required tools. It also integrates with Swarm and Kubernetes for scaling across many Docker hosts.
Sometimes you want a very controlled build environment – think clean room, or you need certain pre-configured credentials or other config to exist. The Custom Build Environment plugin allows you to achieve just this. Within your slave, a container is spun up from a predefined image, filesystems mounted from the slave and the build steps executed within the container.
Users of the CloudBees Jenkins Platform are able to leverage the Shared Config capability to distribute the docker host and image/label configuration across the whole cluster of masters from a central point.
I won’t go into details of these now, as we want to focus on pipelines.
The second area that Jenkins and Docker deliver is the ability to create a fully automated pipeline to Build, Test and Deploy Docker images.
The Build and Publish plugin provides an easy to use abstraction of the Docker command line and adds Jenkins Build Steps for build, tag, push etc
Docker Traceability extends Jenkins existing Fingerprint capability to allow identification of the underlying build that created a given image, and allows tracing back from a running container
Docker Hub Notification addresses two needs: how do I trigger a redeployment when an image is pushed, and, given Docker's layered approach, how do I rebuild my image if an upstream layer is changed – i.e. my company pushes a new ubuntu-secure-base?
These plugins can be used in regular Jenkins jobs to assemble pipelines, but I want to show you how super simple this is using Jenkins Workflow.
Workflow is a new Job type. Launched in Nov 2014.
Workflow is available to the OSS users.
A job now becomes the whole pipeline, and has the power to model complex scenarios such as Branching, Looping, handling human input.
A workflow also runs in a detached manner, which means as long as the real work is being performed on executors, it survives a Master restart.
Jenkins Workflow has some really cool features…
Workflow has the concept of Stages.
This screen shot is using the Stage View plugin from CloudBees Jenkins Platform to show how a typical Docker pipeline might look.
Stages are fully customizable.
Let's look at this example pipeline in more detail.
We are going to build the app just like we do today – this will compile and unit test, producing a war (mvn package), run Sonar analysis (mvn sonar:sonar), and then run integration tests (mvn verify).
The difference here, is we are going to run this inside a specific Docker container using the Custom Build Environment plugin
We will then prepare the release – in this case it is grabbing the version from the POM
Now comes the real Docker integration.
We will create the docker image and tag it.
We then spin up a container from this image, notify Jenkins of the container (for traceability) and run tests against it – these could be functional, security, performance – maybe you will have multiple test phases run in parallel against multiple containers.
If the tests pass, we then publish the image to our registry – public or private – the choice is yours.
Let's look at the build step in more detail.
We specify the stage name
Next, we need to run these steps on a slave. This slave needs docker installed.
Then we define the container we need to run the build in
And mount an additional data volume – we do this to provide a common Maven repo cache,
which is why we use a custom settings file to point to the repo location.
And then we perform a git checkout, and run mvn from the command line
A note about docker slaves
In the global config, you specify various docker images that can be used as slaves, and map to labels.
There is an existing docker-in-docker plugin that I am using to spin up a container that can also run Docker.
Next we will look at the sonar and integration test steps
We add stage names
And then we run the mvn targets from the command line
This is typical workflow pipeline construction
Now we’ll focus on the image creation and use.
First we need to ensure we have access to a docker host.
You can see here I am referencing a Jenkins Credentials via its ID. The docker plugins are fully integrated with Jenkins Credentials API.
Within this block that performs the bind, I will then execute the workflow steps
You can see more stages defined – I’m not going to cover these in detail – it’s the same as before
Next we need to ensure we are executing commands within the context of the correct directory on the filesystem – the one that contains the Dockerfile.
We then invoke the build – providing the tag at the same time – note we obtain a reference to the image
Once the image is built, we want to provision a container. Note we also grab a handle to this so we can address it later.
The next step is to notify Jenkins that we have created a container from this image (I’ll show more details a bit later)
If the tests pass, we bind to the registry (the default is Docker Hub) – note we also supply a credentials reference here – and then push the image.
And voila, we have a tagged version that is fully tested.
So I mentioned earlier I would talk more about traceability.
Identify which build pushed a particular container and display the build / image details in Jenkins
After we have spun up a container, we need to call the Jenkins traceability endpoint with details.
Fortunately we can pass in the output of “docker inspect”
What does this give us?
On the left hand menu we have a new Docker Traceability item
It shows the containers known to Jenkins
Clicking on one reveals
The container’s events – as logged via the Traceability API
And the Build that created the image so you can trace back to the source.
A final word on Docker Hub Notifications
Trigger downstream jobs when a tagged container is pushed to Docker Hub
Need to configure Docker registry with a WebHook and provide the user and token to access
Then you configure the trigger conditions on the jobs
It can either be automatic from any Docker image used by the build – i.e. deploy a container from image x –
or you can list the dependent images explicitly.
OK, so in conclusion
Jenkins and Docker can be your key to Continuous Delivery.
The same automation engine that you already know and use for CI can fully power your docker based CD process as well.
Jenkins supports the creation and management of complex Delivery Pipelines
CloudBees has been working closely with Docker, the company, to create a number of Jenkins plugins that ensure that Docker is a first-class entity in the CD/DevOps ecosystem.
How can you get started?
Documentation on the Docker extensions for Workflow
Workflow Tutorial
Take a look at the example application and pipeline on my Github
So, how do you manage Jenkins at Enterprise scale?
If you are going to use Jenkins for CI or CD, then it will become a crucial part of your application delivery environment. You need to be confident that it will be there when needed.
That’s where we come in.
<click build>
CloudBees is the enterprise Jenkins company.
We offer subscription based access to CloudBees Jenkins Enterprise which is an enhanced, robust, and highly available version of Jenkins that is built on the same open source core that you know and trust.