This talk demonstrates how a continuous delivery deployment pipeline can be set up harnessing Jenkins 2's Pipeline as Code features as well as its new Blue Ocean User Experience.
Continuous delivery with Jenkins pipelines (@WeAreDevelopers2017) - Roman Pickl
Roman Pickl is the CTO of Fluid4me and has been using Jenkins since 2012. He gives a presentation on continuous delivery with Jenkins Pipelines. Jenkins is an open source tool for continuous integration and delivery that has over 100,000 active installations. The presentation demonstrates how to set up a Jenkinsfile and pipeline in code to automate building, testing, and deploying applications. It also shows Blue Ocean, a new user interface for visualizing and editing Jenkins pipelines.
Continuous delivery with Jenkins pipelines (@DevFest Vienna) - Roman Pickl
Presentation at DevFest Vienna 25.11.2017:
This talk demonstrates how a continuous delivery deployment pipeline can be set up harnessing Jenkins 2's Pipeline as Code features as well as its new Blue Ocean User Experience.
Jenkins Pipeline @ Scale: Building Automation Frameworks for Systems Integration - Oleg Nenashev
This is a follow-up to my talk at CloudBees | Jenkins Automotive and Embedded Day 2016, where I presented Pipeline usage strategies for use cases in the embedded area. In this presentation I talk about Jenkins Pipeline features for automation frameworks and share lessons learned in several projects.
This document summarizes a Jenkins pipeline for testing and deploying Chef cookbooks. The pipeline is configured to automatically scan a GitHub organization for any repositories containing a Jenkinsfile. It will then create and manage multibranch pipeline jobs for each repository and branch. The pipelines leverage a shared Jenkins global library which contains pipeline logic to test and deploy the Chef cookbooks. This allows for standardized and reusable pipeline logic across all Chef cookbook repositories.
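With such a shared global library in place, the per-repository Jenkinsfile can shrink to a single call into library code. A minimal sketch, assuming a hypothetical library name and step (neither is taken from the document):

```groovy
// Jenkinsfile in each cookbook repository.
// 'cookbook-pipeline-lib' and chefCookbookPipeline() are invented names;
// the shared library would define the actual test and deploy logic.
@Library('cookbook-pipeline-lib') _

chefCookbookPipeline {
    // per-cookbook overrides, if the library exposes any
    notifyChannel = '#chef-ci'
}
```

The multibranch organization scan then picks this file up automatically for every repository and branch that contains it.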
Jenkins vs. AWS CodePipeline (AWS User Group Berlin) - Steffen Gebert
This document summarizes a presentation comparing Jenkins and AWS CodePipeline for continuous integration and delivery. It provides an overview of how to set up and use Jenkins and CodePipeline, including building environments, secrets handling, testing, branching strategies, approvals, and deployments. It also compares features, pricing, access control, and visualization capabilities between the two tools. Finally, it discusses options for integrating Jenkins and CodePipeline together to leverage the strengths of both solutions. The overall message is that the best tool depends on each organization's needs, and combining tools can provide benefits over relying on a single solution.
This document discusses continuous delivery and the new features of Jenkins 2, including pipeline as code. Jenkins 2 introduces the concept of pipeline as a new type that allows defining build pipelines explicitly as code in Jenkinsfiles checked into source control. This enables pipelines to be versioned, more modular through shared libraries, and resumed if interrupted. The document provides examples of creating pipelines with Jenkinsfiles that define stages and steps for builds, tests and deployments.
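A minimal sketch of the kind of declarative Jenkinsfile described here; the stage names, shell commands, and branch name are illustrative placeholders, not taken from the presentation:

```groovy
// Declarative Jenkinsfile, checked into the repository root.
// Commands and stage names are hypothetical placeholders.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew assemble'   // illustrative build command
            }
        }
        stage('Test') {
            steps {
                sh './gradlew test'       // illustrative test command
            }
            post {
                always {
                    junit 'build/test-results/**/*.xml'  // archive test reports
                }
            }
        }
        stage('Deploy') {
            when { branch 'main' }        // only deploy from the main branch
            steps {
                sh './deploy.sh staging'  // hypothetical deploy script
            }
        }
    }
}
```

Because the file lives in source control, the pipeline is versioned with the code, and Jenkins can resume it if a run is interrupted.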
http://www.meetup.com/BruJUG/events/228994900/
During this session, you will be presented with a solution to the problem of scaling continuous delivery in Jenkins, when your organisation has to deal with thousands of jobs, by introducing a self-service approach based on the "pipeline as code" principles.
Kamon is an open-source tool for monitoring JVM applications like those using Akka. It provides metrics collection and distributed tracing capabilities. The document discusses how Kamon 1.0 can be used to monitor Akka applications by collecting automatic and custom metrics. It also describes how to set up Kamon with Prometheus and Grafana for metrics storage and visualization. The experience of instrumenting an application at EMnify with Kamon is presented as an example.
An Open-Source Chef Cookbook CI/CD Implementation Using Jenkins Pipelines - Steffen Gebert
This document discusses implementing continuous integration and continuous delivery (CI/CD) for Chef cookbooks using Jenkins pipelines. It introduces Jenkins pipelines and how they can be used to test, version, and publish Chef cookbooks. Key steps include linting, dependency resolution, test-kitchen testing, version bumping, and uploading to the Chef Server. The jenkins-chefci cookbook automates setting up Jenkins with the necessary tools to run pipelines defined in a shared Groovy library for cookbook CI/CD.
Belarus Jenkins Meetup - Managing security in Jenkins with configuration-as-c... - Oleg Nenashev
In this presentation I will show how to protect your Jenkins system from common user mistakes using Configuration-as-Code and Ownership-based security.
Jenkins Days workshop pipelines - Eric Long (ericlongtx)
This document provides an overview of a Jenkins Days workshop on building Jenkins pipelines. The workshop goals are to install Jenkins Enterprise, create a Jenkins pipeline, and explore additional capabilities. Hands-on exercises will guide attendees on installing Jenkins Enterprise using Docker, creating their first pipeline that includes checking code out of source control and stashing files, using input steps and checkpoints, managing tools, developing pipeline as code, and more advanced pipeline steps. The document encourages attendees to get involved with the Jenkins and CloudBees communities online and on Twitter.
Cloud-Native CI/CD on Kubernetes with Tekton Pipelines - Nikhil Thomas
This document discusses how to build cloud-native CI/CD pipelines with Tekton on Kubernetes. It introduces Tekton Pipeline custom resources like Tasks, Pipelines, PipelineResources that model CI/CD concepts. It demonstrates building a sample app pipeline that pulls source code, builds container images, and deploys services. The presentation concludes by showing how to use the Tekton CLI to interact with pipelines and view logs.
This document provides an overview of Kubernetes for Java developers. It discusses setting up Kubernetes with Docker and Minikube to deploy Java microservices. It covers key Kubernetes concepts like pods, replica sets, services and ingress. It also provides steps to deploy a sample database and application on Kubernetes, and best practices for containerizing Java applications to run on Kubernetes.
Continuous Deployment with Kubernetes, Docker and GitLab CI - alexanderkiel
This document discusses continuous deployment of Clojure services to Kubernetes using Docker and GitLab CI. It provides an overview of Docker, Kubernetes, deploying a sample Clojure service, and configuring GitLab CI for continuous integration and deployment. The sample Clojure service is built as a Docker image, tested using GitLab CI, and deployed to Kubernetes clusters for testing and production using configuration files and GitLab CI pipelines.
OpenShift: The power of Kubernetes for engineers - Riga Dev Days 18 - Jorge Morales
1. The document introduces OpenShift as a container application platform based on Kubernetes that provides developers with tools for building, deploying and managing containerized applications.
2. It discusses key OpenShift concepts like pods, services, projects and image registries that allow grouping and connecting container workloads as well as storing and distributing container images.
3. Hands-on examples and tutorials are provided to demonstrate how developers can use OpenShift to develop multi-container applications from source code to deployment through features like source-to-image builds, deployments and routes.
This document compares Jenkins and AWS CodePipeline for implementing software pipelines. It finds that Jenkins provides more flexibility through plugins and scripting but requires managing infrastructure, while CodePipeline is fully hosted but has less customization options. Both can be combined, with CodePipeline triggering Jenkins jobs or Jenkins deploying code using CodeDeploy. The document concludes that the right solution depends on individual needs and integrating tools enables getting benefits from both.
January OpenNTF Webinar: 4D - Domino Docker Deep Dive - Howard Greenberg
This talk is for Domino admins and developers who would like to leverage containerization and want to get started navigating this jungle of technologies. Docker, Podman, Kubernetes, OpenShift, and more - we're going to explain when to use which platform and how to automate your deployments. The speakers will be:
Thomas Hampel, Director, HCL Product Management
Daniel Nashed, HCL Lifetime Ambassador
How to use Concourse CI to deliver BOSH releases - Amit Gupta
This document discusses using Concourse CI to deliver BOSH releases. It recommends separating creation, deployment, and testing of releases in CI/CD pipelines. Specific practices mentioned include avoiding snowflake environments, testing task scripts, managing BOSH jobs with programs, and making pipeline configurations and jobs public. Resources provided include links to sample release and pipeline repositories.
A talk about Continuous Delivery and Jenkins 2, which enables us to define our pipelines as code, showing example pipelines, snippets, and the new Blue Ocean GUI.
CI/CD Pipeline as Code using Jenkins 2 - Mayank Patel
Mayank Patel from Oildex gave a presentation on Jenkins 2 Pipelines. He discussed how pipelines allow continuous delivery through features like resilience, pausability, and efficiency. Pipelines can be configured as code in source control and provide security and reusability. The presentation covered the Jenkins environment, ideal pipeline flows, important plugins, and included a demo of a sample pipeline configured with Docker.
Voxxed Luxembourg 2016: Jenkins 2.0 and Pipeline as Code - Damien Duportal
Born as Hudson in 2004 (cf. http://kohsuke.org/2011/01/11/bye-bye-hudson-hello-jenkins/), the Jenkins project has just passed a major milestone: Jenkins 2.0 (cf. https://groups.google.com/forum/#!msg/jenkinsci-dev/vbXK7JJekFw/BlEvO0UxBgAJ)!
This major release manages to reconcile support for legacy setups with the transition to more modern continuous deployment practices.
Among the new features, Pipeline-as-Code support and Docker integration are two elements from which you will be able to draw many benefits.
If you are interested in a concrete example of migrating from Jenkins 1.x to a Docker- and Pipeline-based workflow with Jenkins 2.0, this session is for you!
The running example will be a typical Java/Maven project, stored in a Git repository, with tests and analyses run as chained multi-jobs, which we will move into a Jenkins Pipeline configured via a file in the Git repository, in "continuous delivery" mode via Docker.
The document discusses the new Jenkins Workflow engine. It provides an overview of continuous delivery and how Jenkins is used to orchestrate continuous delivery processes. The new Workflow engine in Jenkins allows defining complex build pipelines using a Groovy DSL, with features like stages, interactions with humans, and restartable builds. Examples of using the new Workflow syntax are demonstrated. Possible future enhancements to Workflow are also discussed.
February OpenNTF Webinar: Introduction to Ansible for Newbies - Howard Greenberg
This talk is for Domino admins and developers who would like to learn Ansible basics. Ansible is an automation engine to automate deployments. HCL provides a set of Ansible playbooks and roles to deploy a complete HCL Connections 7 environment. Come learn what Ansible is and why you should use it in this webinar.
The speaker will be:
Christoph Stoettener, HCL Ambassador
This document discusses how Jenkins can be used to integrate with Git and Docker. It describes how Jenkins supports advanced Git integration through various plugins that help manage interactions with Git repositories. It also explains how Jenkins can be used to both manage Docker resources and build Docker images through available plugins. The document includes demonstrations of these capabilities.
The document provides an overview of continuous integration and continuous delivery practices. It discusses continuous integration, which involves integrating code changes frequently and verifying them through automated builds and tests. Continuous delivery is described as building software in a way that allows release to production at any time, while continuous deployment means any change is automatically deployed to production. Jenkins, an open source automation server, is introduced as a tool that enables continuous integration and deployment through jobs, credentials, scheduling, build steps, and post-build actions. Pipelines in Jenkins are discussed as dividing deployment into stages to provide quick feedback. The Blue Ocean plugin is highlighted as providing a simplified user interface for Jenkins pipelines.
The document outlines Julien Pivotto's presentation on building pipelines at scale using Jenkins and Puppet. It discusses how Puppet can be used to define Jenkins job configurations and pipelines for applications and infrastructure to allow easy deployment of new pipelines. It also covers alternative approaches using Jenkins plugins to define pipelines through Groovy scripts to reduce complexity compared to Puppet management.
Continuous delivery with Jenkins pipelines (@DevOps Pro Moscow) - Roman Pickl
This talk demonstrates how a continuous delivery deployment pipeline can be set up harnessing Jenkins 2's Pipeline as Code features as well as its new Blue Ocean User Experience.
Continuous delivery with Jenkins pipelines (@DevOpsDays Cairo) - Roman Pickl
This talk demonstrates how a continuous delivery deployment pipeline can be set up harnessing Jenkins 2's Pipeline as Code features as well as its new Blue Ocean User Experience.
Codifying the Build and Release Process with a Jenkins Pipeline Shared Library - Alvin Huang
These are my slides from my Jenkins World 2017 talk, detailing a war story of migrating 150-200 Freestyle jobs for build and release into ~10-line Jenkinsfiles that heavily leverage Jenkins Pipeline Shared Libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/).
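A shared-library step backing such a short Jenkinsfile typically lives under `vars/` in the library repository. A hedged sketch with invented names (the library name, step name, and make targets are illustrative, not from the talk):

```groovy
// vars/buildAndRelease.groovy in a Pipeline shared library (hypothetical name).
// A repository's Jenkinsfile can then be as short as:
//   @Library('release-lib') _
//   buildAndRelease(app: 'my-service')
def call(Map config = [:]) {
    node {
        stage('Checkout') {
            checkout scm                        // check out the calling repository
        }
        stage('Build') {
            sh "make build APP=${config.app}"   // illustrative build command
        }
        stage('Release') {
            sh "make release APP=${config.app}" // illustrative release command
        }
    }
}
```

Centralizing the logic this way is what allows hundreds of jobs to share one tested implementation instead of copy-pasted build steps.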
This document provides an overview of Jenkins, an open-source tool for continuous integration and continuous delivery. It discusses key Jenkins concepts like architecture, pipelines, and shared libraries. Jenkins allows integrating multiple stages of development through continuous integration and delivery. It has a master-slave architecture and supports defining automated build processes through pipelines implemented as code.
SD DevOps Meet-up - Jenkins 2.0 and Pipeline-as-Code - Brian Dawson
This is a presentation given at the March 16th San Diego DevOps Meet-up covering some of the upcoming activities around Jenkins 2.0 and the Pipeline plugins, which provide Pipeline-as-Code and give Jenkins first-class pipelines and stages.
Implementing CI/CD for UiPath Using Jenkins Plugin - Satish Prasad
The document provides a step-by-step guide to implementing continuous integration and continuous delivery (CI/CD) for UiPath projects using Jenkins and the UiPath Jenkins plugin. It covers setting up Jenkins, installing the UiPath plugin, creating a sample pipeline with build and test stages, and deploying packages to UiPath Orchestrator. The pipeline utilizes environment variables, credentials, and the UiPathPack, UiPathTest, and UiPathDeploy steps.
This presentation walks through a Jenkins as Code approach that aims to fully automate and describe the creation of Infrastructure, Application and Configuration as Code.
We treat our applications with a strong 'as code' approach, but often forget about the critical operational tools. This presentation shows how it is possible to create a code first approach to creating and managing a Jenkins Service.
Working code repository is available at https://bitbucket.org/stevemac/dockerfiles
Continuous Delivery in the Cloud with Bitbucket Pipelines - Atlassian
This document discusses Bitbucket Pipelines, a continuous integration tool from Atlassian. It allows developers to automatically build, test, and deploy their code every time a change is merged into a shared repository. Pipelines uses Docker containers to run builds, allowing them to be fast, isolated from infrastructure concerns, and reproducible across environments. It also supports defining build configurations as code to make the pipeline definition versioned, reusable, and easy to understand at a glance.
This document discusses Jenkins Pipelines, which allow defining continuous integration and delivery (CI/CD) pipelines as code. Key points:
- Pipelines are defined using a Groovy domain-specific language (DSL) for stages, steps, and environment configuration.
- This provides configuration as code that is version controlled and reusable across projects.
- Jenkins plugins support running builds and tests in parallel across Docker containers.
- Notifications can be sent to services like Slack on failure.
- The Blue Ocean UI in Jenkins focuses on visualization of pipeline runs.
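The points above can be sketched together in one declarative Jenkinsfile; the Docker images, commands, and Slack channel are placeholders, and `slackSend` assumes the Slack Notification plugin is installed:

```groovy
// Illustrative Jenkinsfile: parallel Docker-based test stages
// plus a Slack notification on failure. Names are placeholders.
pipeline {
    agent none
    stages {
        stage('Tests') {
            parallel {
                stage('Unit') {
                    agent { docker { image 'node:18' } }  // isolated container per stage
                    steps { sh 'npm ci && npm test' }
                }
                stage('Lint') {
                    agent { docker { image 'node:18' } }
                    steps { sh 'npm ci && npm run lint' }
                }
            }
        }
    }
    post {
        failure {
            // requires the Slack Notification plugin
            slackSend channel: '#ci-alerts', message: "Build failed: ${env.BUILD_URL}"
        }
    }
}
```

Blue Ocean renders the parallel branches side by side, which is where its visualization pays off.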
Fast and efficient software testing is easy with Docker. We often use containers to maintain parity across development, testing, and production environments, but we can also use containerization to significantly reduce the time needed for testing by spinning up multiple instances of fully isolated testing environments and executing tests in parallel. This strategy also helps you maximize the utilization of infrastructure resources. The enhanced toolset provided by Docker makes this process simple and unobtrusive, and you’ll see how Docker Engine, Registry, and Compose can work together to make your tests fast.
Continuous delivery with Jenkins pipelines incl. dev tools (@ Vienna DevOps &... - Roman Pickl
Presentation at Vienna DevOps & Security Meetup 14.06.2017:
Scripted Jenkins Pipelines - everyone knows and loves them. However, on this day, Roman Pickl (Fluidtime) will show us the new syntax: Declarative Pipelines! A hint for the insiders: the presentation will be in ocean blue!
Pipeline as code - new feature in Jenkins 2 - Michal Ziarnik
What is pipeline as code in continuous delivery/continuous deployment environment.
How to set up Multibranch pipeline to fully benefit from pipeline features.
Jenkins master-node concept in Kubernetes cluster.
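To make the multibranch setup concrete, here is a hedged Jenkinsfile sketch: in a multibranch pipeline, Jenkins discovers this file on every branch and creates a job per branch automatically, while `when { branch ... }` gates branch-specific steps. The build and deploy commands are assumptions:

```groovy
// Jenkinsfile committed on every branch of the repository
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh './gradlew build' }
        }
        stage('Deploy') {
            when { branch 'main' }          // run only for the main branch job
            steps { sh './deploy.sh staging' }
        }
    }
}
```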
Ordina Accelerator program 2019 - Jenkins blue ocean pipelines - Bert Koorengevel
This document provides an overview of CI/CD with Jenkins. It defines continuous integration and continuous delivery, discusses the history and benefits of Jenkins, and covers Jenkins pipeline plugins. It also demonstrates how to build a basic pipeline in Jenkins and extend it by integrating Sonar code quality analysis prior to deployment.
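The "Sonar analysis prior to deployment" extension mentioned above could look roughly like the following stages, inserted before the deploy stage. This is a sketch assuming the SonarQube Scanner plugin is installed and a server named 'sonar' is configured in Jenkins (both names are assumptions):

```groovy
// Additional pipeline stages for code quality analysis
stage('Sonar Analysis') {
    steps {
        withSonarQubeEnv('sonar') {       // injects server URL and token
            sh 'mvn -B sonar:sonar'
        }
    }
}
stage('Quality Gate') {
    steps {
        timeout(time: 10, unit: 'MINUTES') {
            // Fail the pipeline if the SonarQube quality gate fails
            waitForQualityGate abortPipeline: true
        }
    }
}
```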
“I have stopped counting how many times I’ve done this from scratch” - was one of the responses to the tweet about starting the project called Spring Cloud Pipelines. Every company sets up a pipeline to take code from your source control, through unit testing and integration testing, to production from scratch. Every company creates some sort of automation to deploy its applications to servers. Enough is enough - time to automate that and focus on delivering business value.
In this presentation we’ll go through the contents of the Spring Cloud Pipelines project. We’ll start a new project for which we’ll have a deployment pipeline set up in no time. We’ll deploy to Cloud Foundry (but we could also do it with Kubernetes) and check if our application is backwards compatible so that we can roll it back on production.
Continuous Deployment of your Application @SpringOne - ciberkleid
Spring Cloud Pipelines is an opinionated framework that automates the creation of structured continuous deployment pipelines.
In this presentation we’ll go through the contents of the Spring Cloud Pipelines project. We’ll start a new project for which we’ll have a deployment pipeline set up in no time. We’ll deploy to Cloud Foundry and check if our application is backwards compatible so that we can roll it back on production.
Explore seamless development with Continuous Integration using Jenkins and Python. Learn the essentials of integrating Jenkins with Python for efficient software deployment and management.
The Latest Status of CE Workgroup Shared Embedded Linux Distribution Project - Yoshitake Kobayashi
The CE workgroup of the Linux Foundation has started a project to share the work of maintaining long-term support for an embedded distribution by leveraging the work of the Debian and Debian LTS projects. Debian provides pre-compiled binary packages, while the meta-debian layer enables installing customized packages to create similar or smaller images. If both use cases can share the same source code, the maintenance effort can be shared as well.
In this talk, Yoshitake describes the details of meta-debian, which provides a meta layer for the Poky build system, and gives the latest status, technical details, and lessons learned from its development. All source code is available on GitHub, and related documentation is available on GitHub and the eLinux wiki.
SpringOne Platform 2017
Marcin Grzejszczak, Pivotal; Cora Iberkleid, Pivotal
Continuous Delivery on Kubernetes Using Spinnaker - WSO2
Continuous delivery helps development teams deliver faster and safer. WSO2 Continuous Delivery for Kubernetes provides the tools and pipelines required to continuously deliver WSO2 products to Kubernetes environments. Using tools like Jenkins, ELK, and Prometheus, WSO2 Kubernetes Pipeline provides an end-to-end solution for development teams to deliver changes and WSO2 updates effortlessly.
This deck explores:
- Installing the Kubernetes pipeline chart with pre-configured pipelines using Helm.
- Deploying development, staging, and production environments.
- Deploying changes and WSO2 updates across environments.
- Centralized logging using ELK.
- Monitoring using Prometheus and Grafana.
Cloud Native CI/CD with Jenkins X and Knative Pipelines - C4Media
Video and slides synchronized, mp3 and slide download available at URL http://bit.ly/2Pc3H50.
Christie Wilson and James Rawlings explain the CI/CD challenges in a cloud native landscape, and show how Jenkins X rises to them by leveraging open source cloud native technologies like Knative Pipelines. They demo a GitOps based Jenkins X workflow, showing how simple Jenkins X makes it for developers to stage and deploy changes on demand. Filmed at qconlondon.com.
Christie Wilson is a software engineer at Google, currently leading the knative build-pipeline project. Over the past ten years she has worked in the mobile, financial and video game industries. James Rawlings is a co-creator of the open source project Jenkins X and works for CloudBees, where he aims to help developers and teams move to the cloud.
Similar to Continuous delivery with Jenkins pipelines @ devdays
Are we really moving faster? How visualizing flow changed the way we work - Roman Pickl
This document discusses how visualizing workflow helped a company change the way they work and move faster. It explains that the team was initially creating too much inventory and could never run fast enough; value stream mapping revealed ways to change processes and improve flow, which finally enabled the company to move faster. The final step is fully instituting organizational change.
Are we really moving faster? How visualizing flow changed the way we work - Roman Pickl
This document discusses how visualizing workflow helped a company change the way they work to move faster. It explains that the team initially realized they were creating too much unnecessary work, and that visualizing their flow showed they could not simply keep running faster without changes. Value stream mapping showed they needed to change their processes, and these insights helped them finally start moving faster.
Are we really moving faster? How visualizing flow changed the way we work - Roman Pickl
Are we really moving faster?
After putting in countless hours improving the deployment pipeline, investing in automation and deploying new technologies, it is time to ask this fundamental question: "Are we really moving faster?"
This is a story of how we made work visible by applying Flow Metrics to discover bottlenecks and improve flow.
The session will leave you with concrete steps to implement key metrics, automatically collect and visualize them on an open source dashboard and find an answer to this important question.
Key Takeaways:
- A brief Intro to Value Stream Mapping
- Actionable Flow Metrics
- An Implementation Example using an Open Source Solution
- References and pointers to advanced material
For more information see: https://pickl.eu/blog/are-we-really-moving-faster-how-visualizing-flow-changed-the-way-we-work/
Continuous Code Quality with the Sonar Ecosystem @GeeCON 2017 in Prague - Roman Pickl
Continuous Code Quality with the SonarEcosystem
SonarQube is the leading platform for static code analysis and Continuous Code Quality. In this talk we will look into all three lines of defense of the SonarEcosystem and how they can help to find bugs before they enter your codebase (or at least go into production). After this talk, you’ll have a good overview of the SonarEcosystem as well as actionable starting points for increasing your code quality. Furthermore, we will share learnings from using SonarQube for more than 4 years and pointers to additional resources.
Roman Pickl
As Chief Technical Officer, Roman is in charge of the technical development at Fluidtime. He has comprehensive experience in project management, the technical coordination of national and international mobility projects and the optimisation of business and development processes. Roman Pickl studied business management and commercial information technology at the Vienna University of Economics and Business and the University of Technology, Sydney, as well as software engineering at the University of Applied Sciences Technikum Wien. There he specialised in the fields of entrepreneurship & innovation management, project & process management and information management as well as software evolution and mobile computing.
Continuous Code Quality with the Sonar ecosystem - Roman Pickl
Continuous Code Quality with the SonarEcosystem
SonarQube is the leading platform for static code analysis and Continuous Code Quality.
In this talk we will look into all three lines of defense of the SonarEcosystem and how they can help to find bugs before they enter your codebase (or at least go into production).
After this talk, you’ll have a good overview of the SonarEcosystem as well as actionable starting points for increasing your code quality.
Furthermore, we will share learnings from using SonarQube for more than 4 years and pointers to additional resources.
About the Speaker:
As Chief Technical Officer, Roman Pickl is in charge of technical development at Fluidtime. He has comprehensive experience in project management, the technical coordination of national and international mobility projects and the optimisation of business and development processes.
Consistent toolbox talks are critical for maintaining workplace safety, as they provide regular opportunities to address specific hazards and reinforce safe practices.
These brief, focused sessions ensure that safety is a continual conversation rather than a one-time event, which helps keep safety protocols fresh in employees' minds. Studies have shown that shorter, more frequent training sessions are more effective for retention and behavior change compared to longer, infrequent sessions.
By engaging workers regularly, toolbox talks promote a culture of safety, empower employees to voice concerns, and ultimately reduce the likelihood of accidents and injuries on site.
The traditional method of conducting safety talks with paper documents and lengthy meetings is not only time-consuming but also less effective. Manual tracking of attendance and compliance is prone to errors and inconsistencies, leading to gaps in safety communication and potential non-compliance with OSHA regulations. Switching to a digital solution like Safelyio offers significant advantages.
Safelyio automates the delivery and documentation of safety talks, ensuring consistency and accessibility. The microlearning approach breaks down complex safety protocols into manageable, bite-sized pieces, making it easier for employees to absorb and retain information.
This method minimizes disruptions to work schedules, eliminates the hassle of paperwork, and ensures that all safety communications are tracked and recorded accurately. Ultimately, using a digital platform like Safelyio enhances engagement, compliance, and overall safety performance on site. https://safelyio.com/
🏎️Tech Transformation: DevOps Insights from the Experts 👩💻campbellclarkson
Connect with fellow Trailblazers, learn from industry experts Glenda Thomson (Salesforce, Principal Technical Architect) and Will Dinn (Judo Bank, Salesforce Development Lead), and discover how to harness DevOps tools with Salesforce.
DECODING JAVA THREAD DUMPS: MASTER THE ART OF ANALYSIS - Tier1 app
Are you ready to unlock the secrets hidden within Java thread dumps? Join us for a hands-on session where we'll delve into effective troubleshooting patterns to swiftly identify the root causes of production problems. Discover the right tools, techniques, and best practices while exploring *real-world case studies of major outages* in Fortune 500 enterprises. Engage in interactive lab exercises where you'll have the opportunity to troubleshoot thread dumps and uncover performance issues firsthand. Join us and become a master of Java thread dump analysis!
Building API data products on top of your real-time data infrastructure - confluent
This talk and live demonstration will examine how Confluent and Gravitee.io integrate to unlock value from streaming data through API products.
You will learn how data owners and API providers can document and secure data products on top of Confluent brokers, including schema validation, topic routing, and message filtering.
You will also see how data and API consumers can discover and subscribe to products in a developer portal, as well as how they can integrate with Confluent topics through protocols like REST, Websockets, Server-sent Events and Webhooks.
Whether you want to monetize your real-time data, enable new integrations with partners, or provide self-service access to topics through various protocols, this webinar is for you!
The Power of Visual Regression Testing_ Why It Is Critical for Enterprise App... - kalichargn70th171
Visual testing plays a vital role in ensuring that software products meet the aesthetic requirements specified by clients in functional and non-functional specifications. In today's highly competitive digital landscape, users expect a seamless and visually appealing online experience. Visual testing, also known as automated UI testing or visual regression testing, verifies the accuracy of the visual elements that users interact with.
Enhanced Screen Flows UI/UX using SLDS with Tom Kitt - Peter Caitens
Join us for an engaging session led by Flow Champion, Tom Kitt. This session will dive into a technique of enhancing the user interfaces and user experiences within Screen Flows using the Salesforce Lightning Design System (SLDS). This technique uses Native functionality, with No Apex Code, No Custom Components and No Managed Packages required.
Mobile App Development Company In Noida | Drona Infotech - Drona Infotech
React.js, a JavaScript library developed by Facebook, has gained immense popularity for building user interfaces, especially for single-page applications. Over the years, React has evolved and expanded its capabilities, becoming a preferred choice for mobile app development. This article will explore why React.js is an excellent choice for the Best Mobile App development company in Noida.
Visit Us For Information: https://www.linkedin.com/pulse/what-makes-reactjs-stand-out-mobile-app-development-rajesh-rai-pihvf/
How GenAI Can Improve Supplier Performance Management.pdf - Zycus
Data Collection and Analysis with GenAI enables organizations to gather, analyze, and visualize vast amounts of supplier data, identifying key performance indicators and trends. Predictive analytics forecast future supplier performance, mitigating risks and seizing opportunities. Supplier segmentation allows for tailored management strategies, optimizing resource allocation. Automated scorecards and reporting provide real-time insights, enhancing transparency and tracking progress. Collaboration is fostered through GenAI-powered platforms, driving continuous improvement. NLP analyzes unstructured feedback, uncovering deeper insights into supplier relationships. Simulation and scenario planning tools anticipate supply chain disruptions, supporting informed decision-making. Integration with existing systems enhances data accuracy and consistency. McKinsey estimates GenAI could deliver $2.6 trillion to $4.4 trillion in economic benefits annually across industries, revolutionizing procurement processes and delivering significant ROI.
A neural network is a machine learning program, or model, that makes decisions in a manner similar to the human brain, by using processes that mimic the way biological neurons work together to identify phenomena, weigh options and arrive at conclusions.
Voxxed Days Trieste 2024 - Unleashing the Power of Vector Search and Semantic... - Luigi Fugaro
Vector databases are redefining data handling, enabling semantic searches across text, images, and audio encoded as vectors.
Redis OM for Java simplifies this innovative approach, making it accessible even for those new to vector data.
This presentation explores the cutting-edge features of vector search and semantic caching in Java, highlighting the Redis OM library through a demonstration application.
Redis OM has evolved to embrace the transformative world of vector database technology, now supporting Redis vector search and seamless integration with OpenAI, Hugging Face, LangChain, and LlamaIndex. This talk highlights the latest advancements in Redis OM, focusing on how it simplifies the complex process of vector indexing, data modeling, and querying for AI-powered applications. We will explore the new capabilities of Redis OM, including intuitive vector search interfaces and semantic caching, which reduce the overhead of large language model (LLM) calls.
Nashik's top web development company, Upturn India Technologies, crafts innovative digital solutions for your success. Partner with us and achieve your goals.
Why Apache Kafka Clusters Are Like Galaxies (And Other Cosmic Kafka Quandarie... - Paul Brebner
Closing talk for the Performance Engineering track at Community Over Code EU (Bratislava, Slovakia, June 5 2024) https://eu.communityovercode.org/sessions/2024/why-apache-kafka-clusters-are-like-galaxies-and-other-cosmic-kafka-quandaries-explored/ Instaclustr (now part of NetApp) manages 100s of Apache Kafka clusters of many different sizes, for a variety of use cases and customers. For the last 7 years I’ve been focused outwardly on exploring Kafka application development challenges, but recently I decided to look inward and see what I could discover about the performance, scalability and resource characteristics of the Kafka clusters themselves. Using a suite of Performance Engineering techniques, I will reveal some surprising discoveries about cosmic Kafka mysteries in our data centres, related to: cluster sizes and distribution (using Zipf’s Law), horizontal vs. vertical scalability, and predicting Kafka performance using metrics, modelling and regression techniques. These insights are relevant to Kafka developers and operators.
How Can Hiring A Mobile App Development Company Help Your Business Grow? - ToXSL Technologies
ToXSL Technologies is an award-winning Mobile App Development Company in Dubai that helps businesses reshape their digital possibilities with custom app services. As a top app development company in Dubai, we offer highly engaging iOS & Android app solutions. https://rb.gy/necdnt