The document discusses how Docker can be used to ship software in containers, enabling scalable and efficient delivery pipelines. It explains that Docker containers standardize environments, provide a clean environment at each stage of development, testing, and production, and simplify deployments and rollbacks. The highlighted benefits include better software configuration management, since everything needed to build the software lives in version control.
Building multi-language continuous delivery pipelines with Docker and Jenkins - Camilo Ribeiro
Talk presented at QCon Rio de Janeiro in 2015, covering Docker, Jenkins, Job DSL, and pipeline automation.
Examples and code can be found at: https://github.com/camiloribeiro/cdeasy
We know how complicated it is to keep a stable grid, and how hard it is to maintain over time with enough capacity to cover most browsers and platforms. Internally, we found that ~75% of our tests ran in Firefox/Chrome, and the rest ran in Safari/IE. We decided to develop a tool where docker-selenium nodes are created, used, and disposed of on demand. For Safari/IE, we simply forward the tests to Sauce Labs/BrowserStack.
Zalenium is an OSS extension that scales your local grid up and down dynamically with Docker containers. It uses Docker-Selenium to run tests in Firefox/Chrome, and when a different browser is needed, tests are redirected to a cloud testing service. The result: our test suites run faster, since most tests run on local Firefox/Chrome nodes, and we make smarter use of the cloud testing service we pay for.
Diego Molina – Software Engineer in Test, Zalando SE
Leo Gallucci – Software Engineer, Tools and Infrastructure, Zalando SE
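For context, starting a Zalenium grid locally is a one-liner on top of the Docker socket. The image name and flags below reflect the project's documented defaults at the time and may have changed since; treat this as a sketch:

```shell
# Start Zalenium: it spins docker-selenium nodes up and down on demand,
# and can forward unsupported browsers to a paid cloud provider.
docker run --rm -ti --name zalenium -p 4444:4444 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  dosel/zalenium start --desiredContainers 2
```

Tests then point at the usual grid URL (http://localhost:4444/wd/hub) with no changes to the test code itself.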
DockerCon EU 2015: Stop Being Lazy and Test Your Software! - Docker, Inc.
Presented by Laura Frank, Engineer, Codeship
Testing software is necessary, no matter the size or status of your company. Introducing Docker to your development workflow can help you write and run your testing frameworks more efficiently, so that you can always deliver your best product to your customers; there are no more excuses for not writing tests. You’ll walk away from this talk with practical advice for using Docker to run your test frameworks more efficiently, as well as some solid knowledge of software testing principles.
Building the Test Automation Framework - Jenkins for Testers - William Echlin
http://www.TestManagement.com: Module 2 of a six module course on building the test automation framework. This second module looks at how to implement Jenkins in the test environment.
RESTful API Testing using Postman, Newman, and Jenkins - QASymphony
If you’re going to automate one kind of test at your company, API testing is the perfect place to start! It’s simple to write and fast to execute. If your company writes an API for its software, then you understand the need for and importance of testing it. In this webinar, we’ll give a live demonstration of how you can use free tools such as Postman, Newman, and Jenkins to enhance your software quality and security.
Elise Carmichael will cover:
Why your API tests should be included with your CI
Real examples using Postman, Newman, and Jenkins
An active Q&A where you can get your automated testing questions answered, live!
To get the most out of this session:
Download these free tools prior to the webinar: Postman, Newman (along with Node.js and npm), and Jenkins
Read up on how to parse JSON objects using JavaScript
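As a warm-up for that last point, here is a minimal sketch of parsing a JSON response body in plain JavaScript. The payload is invented, but the pattern is the same one Postman test scripts apply to the object returned by `pm.response.json()`:

```javascript
// Parse a JSON API response and pull out nested fields.
const body = '{"status":"ok","items":[{"id":1},{"id":2}]}'; // hypothetical response
const data = JSON.parse(body);
const ids = data.items.map(item => item.id);
console.log(data.status); // "ok"
console.log(ids);         // [ 1, 2 ]
```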
*Can’t attend the webinar live? Register and we will send the recording after the webinar is over.
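The webinar's toolchain boils down to a single Newman invocation that Jenkins can run as a build step. The collection and environment file names here are hypothetical stand-ins; the flags are standard Newman options:

```shell
# Run a Postman collection and emit a JUnit report for Jenkins to ingest.
newman run my-api.postman_collection.json \
  --environment staging.postman_environment.json \
  --reporters cli,junit \
  --reporter-junit-export results/newman.xml
```

In Jenkins, publish results/newman.xml with the JUnit report publisher so that failing API tests fail the build.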
TeamCity is a great tool for continuous integration, with a lot of advanced features provided out of the box. In this session, we will go through how TeamCity helps with the daily software development routine, what was added to the product in the latest releases, and what features are coming next.
You will learn why build pipelines are useful and how the CI server can be optimized when properly configured. I will also demonstrate how to configure builds using the Kotlin DSL provided with TeamCity.
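To give a flavour of that DSL, a minimal `.teamcity/settings.kts` might look like the sketch below. Package names and the `version` string vary by TeamCity release, and the Gradle step is an illustrative assumption:

```kotlin
import jetbrains.buildServer.configs.kotlin.*
import jetbrains.buildServer.configs.kotlin.buildSteps.script
import jetbrains.buildServer.configs.kotlin.triggers.vcs

version = "2023.05"

project {
    buildType(Build)
}

object Build : BuildType({
    name = "Build and Test"
    vcs {
        root(DslContext.settingsRoot)   // use the VCS root the settings came from
    }
    steps {
        script {
            name = "Run tests"
            scriptContent = "./gradlew test"
        }
    }
    triggers {
        vcs { }                         // run on every VCS change
    }
})
```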
DockerCon EU 2015 in Barcelona
Practical tips for using Docker to run tests during development and CI/CD, plus different strategies for speeding up your test suite by running parallel pipelines in containers.
The jet tool used to demonstrate parallel testing is available here: https://codeship.com/documentation/docker/installation/
In this talk, we will discuss the construction of a CI/CD pipeline consisting of Docker Engine, GitHub, Jenkins, Docker Registry and calm.io. The pipeline is kicked off by a commit to a GitHub repository. The commit causes Jenkins to run a build job and, upon successful completion of that job, push a Docker image up to Docker Registry. Once the new Docker image is available, Jenkins triggers calm.io to deploy the new image to staging and production systems.
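That pipeline can be sketched as a Jenkinsfile. The registry host and the deploy webhook below are hypothetical stand-ins (the talk used calm.io for the deploy step):

```groovy
// Sketch: build on commit, publish the image, then trigger deployment.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'docker build -t registry.example.com/myapp:${GIT_COMMIT} .'
            }
        }
        stage('Push') {
            steps {
                sh 'docker push registry.example.com/myapp:${GIT_COMMIT}'
            }
        }
        stage('Deploy') {
            steps {
                // Stand-in for the calm.io trigger described in the talk.
                sh 'curl -X POST https://deploy.example.com/hooks/staging'
            }
        }
    }
}
```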
Efficient Parallel Testing with Docker by Laura Frank - Docker, Inc.
Fast and efficient software testing is easy with Docker. We often use containers to maintain parity across development, testing, and production environments, but we can also use containerization to significantly reduce the time needed for testing by spinning up multiple instances of fully isolated testing environments and executing tests in parallel. This strategy also helps you maximize the utilization of infrastructure resources. The enhanced toolset provided by Docker makes this process simple and unobtrusive, and you’ll see how Docker Engine, Registry, Machine, and Compose can work together to make your tests fast.
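One way to realize that pattern with Compose is to define several one-shot test services over the same image and start them together. The service names, test paths, and the pytest runner below are illustrative assumptions:

```yaml
# Hypothetical docker-compose.yml: isolated runners executing suite slices in parallel.
services:
  tests-unit:
    build: .
    command: pytest tests/unit
  tests-integration:
    build: .
    command: pytest tests/integration
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
```

Starting both runners at once (`docker compose up tests-unit tests-integration`) gives each slice its own clean, fully isolated environment.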
"Workstation Up" - Docker Development at Flow by Mike RothDocker, Inc.
Docker is an integral part of Flow's technology stack, supporting everything from a developer's local environment to Production containers in AWS.
"Workstation" has become central to a developer's toolset at Flow, giving them the ability to bring up/down a service, along with any upstream/downstream dependencies, in a single, simple command implemented with GOlang CLI. For example, developers can run “workstation up --app www” - and reliably have the www app running along with its dozens of transitive dependencies. It truly is reliable - requiring no additional configuration - and just continues to work.
The team has recently transitioned to Docker for Mac Beta and just love referencing containers via localhost!
Learn how to automate your tests and integrate with your CI tool with 5 simple steps.
Check out a simple and powerful test management platform in the cloud: https://hiptest.net
Mobile industry data shows that up to 84% of users may delete an app due to a poor quality or performance experience. Consequently, it’s imperative to deliver high-quality builds from day one (even if it’s just an MVP) in order not to lose the customers you’ve worked so hard to acquire. A critical component of delivering high-quality apps is testing your app on real devices as they exist in the wild. In this session, you’ll learn about using AWS Device Farm as part of your testing regime and how to incorporate it into your CI/CD pipeline.
Gear4music has become one of the largest retailers of musical instruments and equipment in the United Kingdom. I joined the business back in October 2018, as they required a tester with API testing experience for upcoming projects. In this talk, I'll cover how we went from 0 to 1000+ API tests and how Postman has helped throughout the project's life cycle. I'll also talk about how I've evolved over the years as a Postman user and cover things I wish I knew when I first started.
Introduction to Docker for Node.js developers at Node DC, 2/26/2014 - lenworthhenry
This was my presentation to the Node DC meetup on using Docker for Node.js projects. The code for the demonstration is available on GitHub: https://github.com/lenworthhenry/Docker-Example
Advanced cgroups and namespaces
This talk picks up where we left off in the previous cgroups and namespaces talk and dives in even deeper!
Agenda:
* cgroups v2 design (cgroup v2 began to be merged in the current kernel, 4.4)
* cgroups v2 examples (migrating tasks, enabling and disabling controllers, and more).
* comparison between cgroup v2 unified hierarchy and cgroup v1 legacy hierarchy.
* PIDs namespaces (from kernel 4.3)
* cgroup namespaces (not merged yet)
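The v2 examples in the agenda map onto a handful of filesystem operations against the unified hierarchy. The sketch below assumes cgroup2 is mounted at /sys/fs/cgroup and that you are root; the cgroup name is arbitrary:

```shell
# Enable controllers for children, create a cgroup, and migrate a task (cgroup v2).
echo "+memory +pids" > /sys/fs/cgroup/cgroup.subtree_control  # enable controllers
mkdir /sys/fs/cgroup/demo                                     # new child cgroup
echo $$ > /sys/fs/cgroup/demo/cgroup.procs                    # migrate current shell
cat /sys/fs/cgroup/demo/pids.current                          # tasks accounted here
echo 100 > /sys/fs/cgroup/demo/pids.max                       # cap the PID count
```

Note the contrast with v1: in the unified hierarchy there is a single tree, and controllers are switched on per subtree rather than mounted as separate hierarchies.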
Build, Publish, Deploy and Test Docker images and containers with Jenkins Workflow - Docker, Inc.
This lightning talk will show you how simple it is to apply CI to the creation of Docker images, ensuring that each time the source is changed, a new image is created, tagged, and published. I will then show how easy it is to then deploy containers from this image and run tests to verify the behaviour.
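The flow described amounts to a short command sequence. The image name, registry host, and health endpoint below are illustrative assumptions:

```shell
# Build and tag a fresh image on every source change, publish it, then test it.
docker build -t myapp:"$BUILD_NUMBER" .
docker tag myapp:"$BUILD_NUMBER" registry.example.com/myapp:"$BUILD_NUMBER"
docker push registry.example.com/myapp:"$BUILD_NUMBER"
docker run -d --name myapp-test -p 8080:8080 registry.example.com/myapp:"$BUILD_NUMBER"
curl -fs http://localhost:8080/health   # verify behaviour before promoting
docker rm -f myapp-test
```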
Getting Started With Docker | Docker Tutorial | Docker Training | Edureka
This tutorial on "Getting started With Docker" will help you understand the fundamental concepts in Docker and how it is used for containerization. Below are the topics covered in this tutorial:
1. Challenges With Shipping & Transportation
2. How Does Docker Fit The Bill?
3. What Is Docker?
4. Benefits Of Docker Over Virtual Machines
5. Docker Terminology
6. Architecture Of Docker
7. Hands-On: Running Hello-World Docker Container
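The hands-on step in item 7 is a single command: Docker pulls the hello-world image on first run, and the container prints a message confirming the installation works.

```shell
docker run hello-world   # pulls the image if absent, runs it, prints a greeting
docker ps -a             # the exited hello-world container shows up here
```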
Docker for Developers talk from the San Antonio Web Dev Meetup in Aug 2023
Never used Docker? This is perfect for you!
New to Docker? You'll learn something for sure!
Links included for all slides, code, and examples
Go from no Docker experience to a fully running web app in one slide deck!
Patterns & Antipatterns in Docker Image Lifecycle - yoavl
While Docker has enabled an unprecedented velocity of software production, it is all too easy to spin out of control. A promotion-based model is required to control and track the flow of Docker images as much as it is required for a traditional software development lifecycle. New tools often introduce new paradigms. We will examine the patterns and the anti-patterns for Docker image management, and what impact the new tools have on the battle-proven paradigms of the software development lifecycle.
This talk assumes that everybody already understands the need for a CI/CD pipeline, and some of the basic techniques are taken for granted. It’s much more about tools, processes, and automation.
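A promotion-based model in practice often means moving the same immutable image between repositories rather than rebuilding it per stage. The registry and repository names below are hypothetical:

```shell
# Promote the exact image that passed testing from the dev repo to prod.
docker pull registry.example.com/dev/myapp:1.4.2
docker tag registry.example.com/dev/myapp:1.4.2 registry.example.com/prod/myapp:1.4.2
docker push registry.example.com/prod/myapp:1.4.2
```

Promoting by retag-and-push guarantees production runs exactly the bits that were tested, which is the pattern's whole point.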
Seamless Continuous Deployment Using Docker Containers - Faiz Bashir
To be a truly modern software development team, delivering software as quickly as possible using the minimum viable product (MVP) model is a key requirement. Achieving continuous software delivery is not trivial. A number of technologies have sprung up to simplify the process, and Docker is the hottest among them.
Docker Tutorial For Beginners | What Is Docker And How It Works? | Docker Tutorial | Simplilearn
This Docker presentation will help you understand what Docker is, the advantages of Docker, how Docker works, the components of Docker, virtual machines vs. Docker, advanced concepts in Docker, and basic Docker commands, along with a demo. Docker is OS-level virtualization software that enables developers and IT administrators to create, deploy, and run applications in Docker containers with all their dependencies. It is a very lightweight container and containerization platform. Docker Engine is a client-server application that builds and executes images using Docker components. Rapid deployment, portability, better efficiency, faster configuration, scalability, and security are some of the advantages you get by using Docker.
Below topics are explained in this Docker presentation:
1. Virtual machine vs Docker
2. What is Docker?
3. Advantages of Docker
4. How does Docker work?
5. Components of Docker
6. Advanced concepts in Docker
7. Basic Docker commands
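As a taste of item 7, these are the everyday commands such a tutorial typically walks through (nginx is just an example image; the container IDs are placeholders):

```shell
docker pull nginx              # fetch an image from Docker Hub
docker images                  # list local images
docker run -d -p 8080:80 nginx # start a container, mapping host port 8080 to 80
docker ps                      # list running containers
docker stop <container-id>     # stop a running container
docker rm <container-id>       # remove a stopped container
docker rmi nginx               # remove the local image
```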
Why learn DevOps?
Simplilearn’s DevOps training course is designed to help you become a DevOps practitioner and apply the latest DevOps methodology to automate your software development lifecycle right out of the class. You will master configuration management; continuous integration, deployment, delivery and monitoring using DevOps tools such as Git, Docker, Jenkins, Puppet and Nagios in a practical, hands-on and interactive approach. The DevOps training course focuses heavily on the use of Docker containers, a technology that is revolutionizing the way apps are deployed in the cloud today and is a critical skill set to master in the cloud age.
After completing the DevOps training course you will achieve hands-on expertise in various aspects of the DevOps delivery model. The practical learning outcomes of this DevOps training course are:
An understanding of DevOps and the modern DevOps toolsets
The ability to automate all aspects of a modern code delivery and deployment pipeline using:
1. Source code management tools
2. Build tools
3. Test automation tools
4. Containerization through Docker
5. Configuration management tools
6. Monitoring tools
Who should take this course?
DevOps career opportunities are thriving worldwide. DevOps was featured as one of the 11 best jobs in America for 2017, according to CBS News, and data from Payscale.com shows that DevOps Managers earn as much as $122,234 per year, with DevOps engineers making as much as $151,461. DevOps jobs are the third-highest tech role ranked by employer demand on Indeed.com but have the second-highest talent deficit.
This DevOps training course will benefit the following professional roles:
1. Software Developers
2. Technical Project Managers
3. Architects
4. Operations Support
5. Deployment engineers
6. IT managers
7. Development managers
You can learn more at https://www.simplilearn.com/cloud-computing/devops-practitioner-certification-training
Tech Talk #2: Docker - From $1 Billion Startup to the Future Industry Standard - Nexus FrontierTech
A talk by Trương Anh Quân, R&D Specialist, IT Center, Vietcombank, with over 9 years of experience in software development and management.
Demo: https://www.youtube.com/watch?v=sdeVSJgAQvQ&feature=youtu.be
Ed Seymour
Containerisation Lead – Red Hat
Ed has over 20 years of experience working in software development and IT automation. His career started with a small software start-up, where working efficiently and with agility was a necessity, and through his time at a global IT services company he gained valuable experience in promoting and effecting organisational change, adopting agile methods, and automating the software development life-cycle. At Red Hat, Ed’s role has focused on enabling customers as they embrace new organisational behaviours and structures, such as DevOps, and on developing new IT services through the adoption of emerging technologies such as cloud management and OpenStack. Ed specialises in solutions based on containers through Docker, Kubernetes and OpenShift.
C219 - Docker and PureApplication Patterns: Better Together - Hendrik van Run
Interest in deploying software using Docker containers has been growing very quickly. Clients are hearing all the "buzz" around Docker and beginning to investigate how they can take advantage of this new technology. In the latest v2.1 release of IBM PureApplication, support has been added that allows clients to easily create patterns that deploy Docker containers as software components using the pattern editor. Now clients can build upon the skills they already have with patterns and easily add Docker containers. Because the new support for Docker is integrated with the existing patterns, the new technology can be added incrementally at a pace that makes sense for the customer's business. There is no need to "start all over again" in order to exploit Docker.
What is Docker & Why is it Getting Popular? - Mars Devs
Docker and containerization in general are now causing quite a stir. But what is Docker, and how does it relate to containerization? In this blog, we walk you through the nitty-gritty of Docker and why it is being adopted so rapidly.
Click here to know more: https://www.marsdevs.com/blogs/what-is-docker-why-is-it-getting-popular
GridMate - End to end testing is a critical piece to ensure quality and avoid regressions - ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We finished with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Observability Concepts EVERY Developer Should Know - DeveloperWeek Europe - Paige Cruz
Monitoring and observability aren’t traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deployment Firewall and DBOM - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work, along with a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transformation - Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Securing your Kubernetes cluster: a step-by-step guide to success! – KatiaHIMEUR1
Today, after several years of existence, backed by an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been easier to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Removing Uninteresting Bytes in Software Fuzzing – Aftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating the uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns, and DIAR helps you find such seeds.
- These are the slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 2022.
15. Delivery Pipeline with Containers
Development -> Test -> Acceptance -> Production
Environment Setup
Clean Environments
Similarity to Production
Deployments and Roll-back/forwards
16. Delivery Pipeline with Containers
Development -> Test -> Acceptance -> Production
Environment Setup
Clean Environments
Similarity to Production
Deployments and Roll-back/forwards
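The stage list on these slides can be sketched as a minimal promotion script. This is an illustration, not from the talk; the image name and tag are hypothetical. The point is that one and the same container image moves unchanged through every stage, which is what gives each environment its similarity to production:

```shell
# Hypothetical promotion sketch: the same image is deployed to each
# stage of the pipeline, so every environment matches production.
IMAGE="myapp:build-123"   # illustrative tag; a real pipeline would use a build number

for STAGE in development test acceptance production; do
  # stand-in for: docker run --rm "${IMAGE}" plus stage-specific checks
  echo "deploying ${IMAGE} to ${STAGE}"
done
```

Because the artifact never changes between stages, a roll-back is just a re-deploy of a previously promoted tag.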
Hello, my name is Pini Reznik, I'm from Ugly Duckling, and I'm going to talk today about Docker. [0:10]
Let me present the problem through an analogy. Here you can see physical goods being shipped around the world through a delivery pipeline. Goods are shipped in many different formats, shapes and sizes, and some of them might interact with each other; for example, coffee and spices cannot be stored in close proximity. Each stage in the pipeline needs to support all possible formats, including those yet to be invented. [0:40]
And that is how the work is typically done in such a pipeline. It is manual, complicated, and requires the workers to understand the shipped content. Does it remind you of anything? Just think what the operational person in the picture would say to two development teams who built round barrels and square boxes. And what will the end customer at the destination say when the coffee smells like spices, or gasoline is spilled on a piano? [1:15]
The solution for efficient shipment is a standardized container. Today, all types of storage and transportation support standard containers. They are always sealed, and the content of each container is separated from the content of all other containers. Now producers can easily ship anything they want, as long as it fits into a container. And operations can focus on maintenance of the infrastructure without thinking about the content of a transported package. Maybe they can finally find some time to improve the railroads. [2:02]
And with containers we can finally tackle the scalability challenge. Imagine shipping a piano on such a ship without a container. [2:15]
The challenges and the solutions for a software delivery pipeline are very similar to those I just described. A wide variety of hardware platforms has to support an even wider variety of software components. With Docker, developers build their applications and put them into standard containers. Such a container is picked up by operations and deployed to virtually any platform without concern for dependencies and incompatibilities. [2:46]
Containers are easily built as part of the regular development lifecycle and can be started in a fraction of a second. This means that we can run every single build or test suite in a new, clean environment created for a single use, and dispose of it afterwards. [3:08]
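A minimal sketch of that idea, assuming Docker and a Maven-based project (the image name and mount paths are illustrative, not from the talk): the `--rm` flag deletes the container the moment the run finishes, so every test run starts from the same pristine image.

```shell
# Hypothetical helper: run the test suite in a throwaway container.
# --rm disposes of the container when the tests finish, so the next
# run starts again from the clean image and nothing leaks between runs.
run_tests() {
  docker run --rm \
    -v "$PWD:/app" \
    -w /app \
    maven:3-jdk-7 \
    mvn test
}
# A CI job would simply call: run_tests
```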
I don't have time for a full demo here, so I have chosen to show something really cool to illustrate my point. The first example shows the creation of a new container, which is measured in milliseconds. The second is an example of a Dockerfile used to build Docker images. It is short and simple, which makes it easily maintainable. [3:38]

$ time echo "Running inside container"
Running inside container
real 0m0.000s
user 0m0.000s
sys 0m0.000s

FROM quintenk/jdk7-oracle
MAINTAINER Pini Reznik <p.reznik@uglyduckling.nl>
RUN echo "deb http://archive.ubuntu.com/ubuntu precise main universe" > /etc/apt/sources.list
RUN apt-get update
RUN apt-get install -y maven
Using Docker, we can finally do proper Software Configuration Management and version our build and runtime environments together with the source code. [3:53]
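One common way to make that versioning concrete, sketched here under the assumption of a Git repository (the image name is made up, not from the talk): derive the image tag from the commit that produced it, so the environment and the code share one version.

```shell
# Hypothetical sketch: tag the image with the current source revision,
# so the build/runtime environment is versioned together with the code.
# Falls back to "dev" when not inside a git repository.
GIT_SHA=$(git rev-parse --short HEAD 2>/dev/null || echo "dev")
IMAGE="myteam/build-env:${GIT_SHA}"

# In a real pipeline this line would be: docker build -t "${IMAGE}" .
echo "would build ${IMAGE} from the Dockerfile above"
```

Checking out an old commit then tells you exactly which image ran alongside it.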
If you want to hear more details about Docker, or ask questions about your specific environment, come over to the Docker Amsterdam meetup, organised by Ugly Duckling together with Docker. [4:12]