In this presentation, I covered how I migrated an Android project from the old Jenkins (Freestyle jobs, first Jenkins instance) to a new Jenkins (Multibranch Pipeline, second Jenkins instance).
It also covers usage of a Jenkins Shared Library and integration tests for pipeline code.
At the end, I cover the pros and cons of the final result and the difficulties I faced during the migration.
Jenkins is a Continuous Integration (CI) server written in Java. It provides continuous integration services for software development and can be started from the command line or deployed in a web application server. Jenkins Pipeline is a suite of plugins that supports implementing and integrating continuous delivery pipelines into Jenkins.
Codifying the Build and Release Process with a Jenkins Pipeline Shared Library, by Alvin Huang
These are my slides from my Jenkins World 2017 talk, detailing a war story of migrating 150-200 Freestyle jobs for build and release into ~10-line Jenkinsfiles that heavily leverage Jenkins Pipeline Shared Libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/)
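A ~10-line Jenkinsfile of the kind described might look roughly like this; the library name, version, and the `buildAndRelease` step are hypothetical placeholders, not the talk's actual code:

```groovy
// Hypothetical Jenkinsfile sketch: almost all logic lives in the shared library.
@Library('release-pipeline@1.0') _  // library name/version are assumptions

buildAndRelease {
    project = 'my-app'        // illustrative parameters consumed by the
    deployTarget = 'staging'  // shared library's config closure
}
```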
Pipeline as Code: a new feature in Jenkins 2, by Michal Ziarnik
What pipeline as code is in a continuous delivery/continuous deployment environment.
How to set up a Multibranch Pipeline to fully benefit from pipeline features.
The Jenkins master/node concept in a Kubernetes cluster.
Building an Extensible, Resumable DSL on Top of Apache Groovy, by jgcloudbees
Presented at: https://apacheconeu2016.sched.org/event/8ULR
In 2014, a few Jenkins hackers set out to implement a new way of defining continuous delivery pipelines in Jenkins. Dissatisfied with chaining together jobs configured in the web UI, the effort started with Apache Groovy as the foundation and grew from there. Today the result of that effort, named Jenkins Pipeline, supports a rich DSL with "steps" provided by Jenkins plugins, built-in auto-generated documentation, and execution resumability, which allows Pipelines to continue executing while the master is offline.
In this talk we'll take a peek behind the scenes of Jenkins Pipeline, touring the various constraints we started with, whether imposed by Jenkins or Groovy, and discussing which features of Groovy were brought to bear during the implementation. If you're embedding or extending Groovy, or are simply interested in its internals, this talk should have plenty of food for thought.
** DevOps Training: https://www.edureka.co/devops **
This Edureka tutorial on "Jenkins Pipeline Tutorial" will help you understand the basic concepts of a Jenkins pipeline. The topics covered in the tutorial are:
1. The need for Continuous Delivery
2. What is Continuous Delivery?
3. Features before the Jenkins Pipeline
4. What is a Jenkins Pipeline?
5. What is a Jenkinsfile?
6. Pipeline Concepts
7. Hands-On
Check our complete DevOps playlist here (includes all the videos mentioned in the video): http://goo.gl/O2vo13
Mile High Agile 2018 presentation describing continuous integration concepts and how to implement them with a Jenkins single-branch Pipeline and with a Jenkins multi-branch Pipeline.
Additional presentation material describes several of the ways that Docker can benefit a continuous integration environment.
Slides from my presentation to the Sydney Jenkins Meetup on Declarative Pipeline. Video of the presentation available at https://www.youtube.com/watch?v=3R5xh4oeDg0&feature=youtu.be
Voxxed Luxembourg 2016: Jenkins 2.0 and Pipeline as Code, by Damien Duportal
Born as Hudson in 2004 (cf. http://kohsuke.org/2011/01/11/bye-bye-hudson-hello-jenkins/), the Jenkins project has just reached a major milestone: version Jenkins 2.0 (cf. https://groups.google.com/forum/#!msg/jenkinsci-dev/vbXK7JJekFw/BlEvO0UxBgAJ)!
This major release manages to reconcile support for the old with a transition toward more modern continuous deployment practices.
Among the new features, Pipeline-as-Code and Docker integration are two elements you will be able to draw many benefits from.
If you are interested in a concrete example of migrating from Jenkins 1.x to a Docker- and Pipeline-based workflow with Jenkins 2.0, this session is for you!
The example followed is a "typical" Java/Maven project, stored in a Git repository, with tests and analyses as "chained multi-jobs", which we will move into a "Jenkins Pipeline" configured via a file in the Git repository, in "continuous delivery" mode via Docker.
The Jenkins open source continuous integration server now provides a “pipeline” scripting language which can define jobs that persist across server restarts, can be stored in a source code repository and can be versioned with the source code they are building. By defining the build and deployment pipeline in source code, teams can take full control of their build and deployment steps. The Docker project provides lightweight containers and a system for defining and managing those containers. The Jenkins pipeline and Docker containers are a great combination to improve the portability, reliability, and consistency of your build process.
This session will demonstrate Jenkins and Docker in the journey from continuous integration to DevOps.
Jenkins Pipeline @ Scale: Building Automation Frameworks for Systems Integration, by Oleg Nenashev
This is a follow-up to my talk at CloudBees | Jenkins Automotive and Embedded Day 2016, where I presented Pipeline usage strategies for use cases in the embedded area. In this presentation I talk about Jenkins Pipeline features for automation frameworks and lessons learned in several projects.
Are you planning to start working with open source and feeling overwhelmed by all the new tools you have to learn? Here you can find a quick overview of Subversion, Git, and Maven.
Integration testing is hard, and teams are often tempted to do it in production. Testcontainers allows writing meaningful integration tests by spawning Docker containers for databases, queue systems, key-value stores, and other services. The talk, a blend of slides and live code, will show how we are able to deploy without fear while integrating with a dozen different datastores. Don't mock your database with fake data anymore; work with real data.
Build, Publish, Deploy and Test Docker images and containers with Jenkins Wor..., by Docker, Inc.
This lightning talk will show you how simple it is to apply CI to the creation of Docker images, ensuring that each time the source is changed, a new image is created, tagged, and published. I will then show how easy it is to then deploy containers from this image and run tests to verify the behaviour.
Everything as a Code / Alexander Tarasov (Odnoklassniki), by Ontico
RIT++ 2017, RootConf
Beijing + Shanghai hall, June 5, 11:00
Abstract:
http://rootconf.ru/2017/abstracts/2627.html
The development process neither begins nor ends with writing the product's code. We write documentation, figure out how to test it all, and take care that the application stays highly available.
We all do familiar things in the way we are used to, sometimes performing a lot of manual, inefficient work. But what if there is another, radical approach? Can you formalize your activities and turn them into code? Which practices and tools can you use for that?
The talk presents the author's personal experience automating various elements of software development.
GeeCON 2017 - TestContainers: Integration testing without the hassle, by Anton Arhipov
TestContainers is a Java library that supports JUnit tests, providing lightweight, throwaway instances of common databases, Selenium web browsers, or anything else that can run in a Docker container.
Level Up Your Android Build - Droidcon Berlin 2015, by Friedger Müffke
A journey of a young entrepreneur through the Android Gradle build system. It explains the Groovy and Gradle details that are needed during the development process of an Android app.
On stage with Volker Leck
Presented at AI NEXTCon Seattle, January 17-20, 2018
http://aisea18.xnextcon.com
Join our free online AI group with 50,000+ tech engineers to learn and practice AI technology, including the latest AI news, tech articles/blogs, tech talks, tutorial videos, and hands-on workshops/codelabs on machine learning, deep learning, data science, etc.
When to use Serverless? When to use Kubernetes?, by Niklas Heidloff
Slides of a session that I have given/will give at various developer conferences in H1 2018.
Niklas Heidloff
http://twitter.com/nheidloff
http://heidloff.net
Summary Article
http://heidloff.net/article/when-to-use-serverless-kubernetes
OpenWhisk
https://openwhisk.apache.org
https://github.com/ibm-functions/composer
https://github.com/nheidloff/openwhisk-debug-nodejs
Kubernetes
https://kubernetes.io
https://istio.io
IBM Cloud
http://ibm.biz/nheidloff
Abstract
There is a lot of debate about whether to use Serverless or Kubernetes to build cloud-native apps. Both have their advantages and unique capabilities, which developers should take into consideration when planning new projects. We will shed some light on ease of use, maturity, types of scenarios, developer productivity and debugging, supported languages, DevOps and monitoring, performance, community, and pricing. Cloud-native architectures shift complexity from within an application to the orchestration of microservices. Both Kubernetes and Serverless have their strengths, which we will discuss. Besides the core development topics, developers should also understand the operational aspects: how complicated it is to maintain your own systems versus using managed platforms.
46. What went wrong
Update build number
Publish Git tag
Plots
Upload to Google Play
3rd-party libraries in pipeline code
Parallel builds on slaves
Slack notifications
46
63. Upload to Google Play
Android application: build numbers 1..10000
Android TV application: build numbers 10000+
Upload failed: "Devices with version 10002 of this app would be downgraded to version 54 if they met the following criteria"
63
Android Lint sometimes points out quite important problems.
Fabric Beta is used for internal/external builds.
We follow gitflow (feature, develop, release, hotfix, master branches).
We had an old Jenkins hosting only the iOS and Android projects; almost nothing was set up there.
And we had to move everything to the new Jenkins that hosts all of the company's other projects.
This is what the old Jenkins, inherited as legacy, looked like.
For the VRV project alone there were 3 jobs.
That was almost everything inside those jobs.
No GitHub integration, no useful charts or data, almost nothing.
Semi-manual builds.
We moved to a Multibranch Pipeline; I'll go into a bit more detail about it below.
An integral part of the Jenkins Pipeline plugin.
I'll explain what it is and where we use it.
This is what the VRV Android Jenkins jobs look like: a Multibranch Pipeline plus one auxiliary job needed for the migration.
Jobs are created in Jenkins automatically from branches and pull requests.
And deleted automatically as well.
So how does this thing work?
The project itself contains a Jenkinsfile with the build instructions.
Once it all runs, you will see your stages.
In our case there are 4 of them, and a build takes about 15 minutes on average.
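A four-stage Jenkinsfile of the kind described could be sketched as follows; the stage names, agent label, and Gradle tasks are illustrative assumptions, not the project's actual file:

```groovy
// Illustrative declarative Jenkinsfile: four stages for an Android project.
pipeline {
    agent { label 'android' }  // assumed agent label
    stages {
        stage('Checkout') {
            steps { checkout scm }
        }
        stage('Build') {
            steps { sh './gradlew assembleDebug' }
        }
        stage('Test') {
            steps { sh './gradlew testDebugUnitTest lintDebug' }
        }
        stage('Distribute') {
            // The real distribution step depends on the setup (e.g. beta upload);
            // this Gradle task is a placeholder.
            steps { sh './gradlew publishApk' }
        }
    }
}
```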
A bit of information about the tests on the project.
This covers slightly more than half a year.
The coverage shown in this picture is out of date; I never find the time to refresh it.
Fortunately, we make sure that every new Pull Request comes with tests and that the core business logic is covered.
Some additional metrics that are not automated very often.
Application size is very important.
The number of methods in the project, an important metric for an Android project.
The whole implementation of those stages (build, test, distribute) is extracted into a Jenkins Shared Library.
This is what the default library layout looks like.
Source code, global variables, and resources.
This is what our project looks like.
Code for the different project types. AndroidPipeline.groovy.
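In the Shared Library convention, a global variable in `vars/` can delegate to a class in `src/`. A minimal sketch (the `vars/` file name, package, and parameters are hypothetical; `AndroidPipeline.groovy` comes from the slide above):

```groovy
// vars/androidPipeline.groovy -- hypothetical shared-library entry point.
// A project's Jenkinsfile could then shrink to: androidPipeline(project: 'vrv')
import com.example.ci.AndroidPipeline  // assumed package name

def call(Map params = [:]) {
    // Pass the pipeline script context ("this") so the class can call steps
    // such as sh(), checkout(), and echo().
    new AndroidPipeline(this, params).run()
}
```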
Jenkinsfile: automation of the pipeline project itself.
And, of course, tests.
It is 1 class of about 500 lines with roughly 40 atomic methods.
In more detail it looks like this.
Initially everything was in a single file, but now I am improving this and extracting what I can into separate, independent classes for reuse.
Details of each step.
I'll explain a bit further below why this part is commented out.
A lot of interesting things happen here.
How we test pipeline code.
It is worth saying that coverage is still very small: only 4 methods out of 40 are covered.
There are unit tests too, but they are ordinary; nothing interesting there.
This is what an integration test looks like.
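The talk's own test is not reproduced here, but one common way to exercise shared-library code outside Jenkins is the JenkinsPipelineUnit library; the class, file names, and registered steps below are hypothetical:

```groovy
// Hypothetical test for a shared-library entry point using JenkinsPipelineUnit
// (com.lesfurets:jenkins-pipeline-unit). Names and paths are assumptions.
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class AndroidPipelineTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub out pipeline steps so the script can run outside Jenkins.
        helper.registerAllowedMethod('sh', [String]) { /* no-op */ }
    }

    @Test
    void pipelineRunsWithoutErrors() {
        def script = loadScript('vars/androidPipeline.groovy')
        script.call([:])
        printCallStack()          // dump the recorded step invocations
        assertJobStatusSuccess()  // fail the test if the run was marked FAILURE
    }
}
```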
What went wrong, or rather where I ran into difficulties and spent a lot of time.
On Android every build has a build number, and they go in sequence.
In the old Jenkins there was a single job for alpha builds, so everything stayed in order.
Git checkout = HTTPS
It has to be transformed into SSH, and then, with sshagent, you can push to Git.
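A sketch of that transformation inside a pipeline script; the GitHub URL pattern and the credentials ID are assumptions, while the `sshagent` step comes from the Jenkins SSH Agent plugin:

```groovy
// Rewrite the HTTPS remote to SSH so a tag can be pushed with an SSH key.
def httpsUrl = sh(script: 'git config --get remote.origin.url', returnStdout: true).trim()
def sshUrl = httpsUrl.replace('https://github.com/', 'git@github.com:')
sh "git remote set-url origin ${sshUrl}"

sshagent(credentials: ['jenkins-ssh-key']) {  // hypothetical credentials ID
    sh 'git push origin --tags'
}
```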
The charts you saw were not compatible with Pipeline.
I made that possible. It was quite an interesting PR; I worked on it with people from CloudBees.
We have 2 projects, each with its own build-number specifics.
Neither plugin (Jenkins nor Gradle) can do this yet.
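The build-number scheme from the "Upload to Google Play" slide (phone builds in 1..10000, TV builds at 10000+) can be enforced by hand in Gradle instead; the `tvBuild` property and the use of `BUILD_NUMBER` are illustrative assumptions:

```groovy
// build.gradle sketch: keep phone and TV versionCodes in separate ranges so a
// TV build (10000+) is never treated as a downgrade of a phone install.
def buildNumber = (System.getenv('BUILD_NUMBER') ?: '1').toInteger()
def tvOffset = project.hasProperty('tvBuild') ? 10000 : 0  // assumed flag

android {
    defaultConfig {
        versionCode tvOffset + buildNumber
    }
}
```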
For Slack, for example, I needed to work with its API.
Groovy's built-in dependency manager = Grapes.
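Grapes lets a Groovy script pull in a third-party library with a `@Grab` annotation at load time. A sketch (the HTTP client choice and the Slack webhook call are only an illustration, not the talk's actual integration):

```groovy
// Fetch a third-party HTTP client via Grapes when the script is loaded.
@Grab(group = 'org.codehaus.groovy.modules.http-builder',
      module = 'http-builder', version = '0.7.1')
import groovyx.net.http.HTTPBuilder
import groovyx.net.http.ContentType

// Hypothetical Slack notification through an incoming webhook.
def notifySlack(String message, String webhookUrl) {
    new HTTPBuilder(webhookUrl).post(
        requestContentType: ContentType.JSON,
        body: [text: message])
}
```

Note that on Jenkins, Grapes downloads happen on the node running the script, so the controller or agent needs network access to the artifact repository.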
Many commits, many builds; there was not enough memory.
Notifications go to each person privately instead of a shared channel.
It is easier for the admins to administer.
Others can use your methods, for example the Slack integration I built.
DevOps engineers do the reviews on GitHub.
What happens is much clearer and more understandable in code than when configured through the UI.
You can cover it with tests and verify the functionality in advance.
updating plugins (you have to wait for everyone, and it has to keep working for everyone)
no access to the Jenkins settings
You have to keep in mind that things can break for others too; you can no longer live isolated in your own little Android world.
Sometimes it takes a lot of time, because the DevOps engineers are constantly busy and physically cannot keep up.
This is fairly non-trivial; you need to know Jenkins and Pipeline well.
It is hard to debug. I always have a local Jenkins running.
A bit of useful information.
Very little of it, but important: I reused what I had built long ago, and 50% of the pipeline was simply copy-pasted from there.
It has all the information about me: open-source projects, a link to my blog, Twitter, LinkedIn.