Containerization provides a repeatable, reliable strategy for delivering applications across environments using lightweight, isolated containers. A container packages an application together with its dependencies into a self-contained unit that is easy to deploy, so the application behaves consistently regardless of the underlying infrastructure or environment it runs in. Containerization removes the unpredictability that often appears when moving applications between development and production environments.
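The packaging idea above can be made concrete by looking at what a container image definition actually declares. The sketch below generates the text of a minimal, hypothetical Dockerfile; the base image, file names, and entrypoint are illustrative assumptions, not taken from any deck summarized here.

```python
def render_dockerfile(base: str, requirements: str, entrypoint: str) -> str:
    """Build the text of a minimal container image definition.

    Each line pins one part of the self-contained unit: the OS layer,
    the dependencies, the application code, and the start command.
    """
    lines = [
        f"FROM {base}",                          # pinned base image = reproducible OS layer
        f"COPY {requirements} .",
        f"RUN pip install -r {requirements}",    # dependencies baked into the image
        "COPY . /app",                           # application code
        "WORKDIR /app",
        f'CMD ["python", "{entrypoint}"]',       # one well-defined start command
    ]
    return "\n".join(lines)

# Hypothetical application: names are placeholders for illustration only.
dockerfile = render_dockerfile("python:3.11-slim", "requirements.txt", "app.py")
print(dockerfile)
```

Because everything the application needs is declared in the image, the same image runs identically on a laptop, a test cluster, or production.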
IBM Blue Box is a private cloud as a service offering that provides a dedicated, scalable OpenStack cloud infrastructure either hosted in IBM data centers (Blue Box Dedicated) or on a customer's own premises (Blue Box Local). Key benefits include fully managed infrastructure, high availability, security, and flexibility to scale compute, storage, and networking resources as needed. IBM provides expertise in deploying and managing the OpenStack environment so customers can focus on applications.
Cloud computing involves computation done over the internet as a service. By 2020, cloud services are projected to be the primary IT consumption source for most individuals and enterprises, with the cloud computing market worth $241 billion. As computing moves to this new paradigm, IT jobs require new skills like virtualization, automation, security, and understanding IT as a service through models like SaaS, PaaS and IaaS. Cloud certifications and integrating cloud technologies into curricula can help develop these skills.
OpenNebula: Leading Innovation in Cloud Computing Management (Ignacio M. Llorente)
The document discusses OpenNebula, an open-source toolkit for building Infrastructure as a Service (IaaS) clouds. It originated from the RESERVOIR European research project. OpenNebula allows organizations to build private, hybrid, and public clouds to manage their infrastructure resources. It has over 4,000 downloads per month and is used by many organizations and projects to build cloud computing testbeds and ecosystems. The document outlines OpenNebula's innovation model and calls for collaboration to address challenges regarding cloud adoption and key research issues in areas like cloud aggregation, interoperability, and management.
The document discusses OpenNebula, an open-source tool for managing virtual infrastructure in cloud computing. It describes OpenNebula's interoperability and portability features, challenges in these areas, and the community's approach of leveraging standards. Examples are given of collaborations using standards like OCCI and OVF to enable interoperability between OpenNebula and other cloud platforms.
Open Hybrid Cloud.
A presentation given by Erik Geensen, responsible for Cloud, Platform and Virtualization at Red Hat Benelux, at the OPEN'14 conference in Belgium.
This document discusses IBM's hybrid multicloud platform and digital transformation. Some key points:
- IBM's hybrid multicloud platform is founded on Red Hat technologies like Red Hat Enterprise Linux and Red Hat OpenShift which allow applications to be built once and deployed anywhere across public clouds, private clouds, and on-premises.
- The platform provides consistent management, security, and services across heterogeneous cloud environments from different vendors through an open, standards-based approach.
- A case study describes how Deutsche Bank used Red Hat solutions to build an application platform that streamlined development, improved efficiency, and allowed applications to be developed in 2-3 weeks instead of 6-9 months.
IBM Test & Development Cloud + Rational Service Delivery Services Platform (Babak Hosseinzadeh)
- IBM is announcing a new cloud computing offering called IBM Smart Business Development and Test on the IBM Cloud, which will provide enterprises with a dynamic virtual development and test infrastructure service hosted on IBM's public cloud.
- The service will allow customers to provision virtual server configurations on demand, with options to add persistent storage and bandwidth on a pay-as-you-go basis. It will provide pre-configured software stacks including Rational Application Lifecycle Management tools.
- The service is aimed at helping enterprises accelerate application development and testing cycles through an on-demand, self-service cloud model with billing based on actual usage and reserve capacity options.
Live Introduction to the Cloud Native Microservices Platform – open, manageab... (Lucas Jellema)
The microservices architecture promises flexibility, scalability and optimal use of compute resources. Through independent components with well-defined scope, responsibility, interface and ownership, evolved and managed in an automated DevOps process, this architecture leverages current technologies and lessons learned. The Oracle Microservices Platform is an open-source runtime for deploying, running and managing container-based microservices. The platform offers a distributed container runtime based on Kubernetes and, on top of that, API management, a built-in event bus, a service broker to link in external services, advanced inter-microservice traffic control and load balancing, and extensive monitoring. It supports the pure pay-per-use, scale-on-request serverless paradigm. The platform can run anywhere: your laptop, your data center, a third-party cloud, or as an Oracle-managed cloud service. This session introduces the Microservices Platform and demonstrates how it is used to roll out and manage a set of collaborating microservices, both locally and in the cloud.
Kubernetes has become the de facto standard platform for managing containerized microservices. However, Kubernetes alone is not yet a complete platform. We also need facilities for managing traffic between microservices – monitoring, routing, authorization – as well as for handling events. We need to support the serverless architecture style, with triggered functions instead of pre-allocated servers. And we need a governance strategy around new versions of functions and microservices.
Oracle will launch an open (source) microservices platform with all these capabilities pre-integrated. The platform is based on Kubernetes and also leverages Kafka, Project Fn, OpenServiceBroker and Istio, with monitoring via Prometheus, Grafana and Kibana. It can run locally or on any IaaS platform. Oracle plans to monetize it through a managed cloud service.
In this session, I want to explore the need for a microservices platform and the essential components it should provide. I will then demonstrate this open microservices platform proposed by Oracle.
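One of the platform capabilities mentioned above, inter-microservice traffic control, can be sketched very simply. The snippet below shows the core of deterministic weighted routing between two service versions (the idea behind a canary rollout, which tools like Istio implement at the mesh level); the request-id scheme and percentages are illustrative assumptions.

```python
import zlib

def route(request_id: str, canary_pct: int) -> str:
    """Send roughly canary_pct% of traffic to v2, the rest to v1.

    Hashing the request id (crc32 here, for determinism) keeps the
    routing decision stable: the same request always lands on the
    same version, which simplifies debugging a canary rollout.
    """
    bucket = zlib.crc32(request_id.encode()) % 100   # stable bucket 0..99
    return "v2" if bucket < canary_pct else "v1"

# Boundary behaviour: canary disabled -> all v1; canary at 100% -> all v2.
all_v1 = {route(f"req-{i}", 0) for i in range(100)}
all_v2 = {route(f"req-{i}", 100) for i in range(100)}
mixed = [route(f"req-{i}", 20) for i in range(1000)]
```

Raising `canary_pct` gradually shifts traffic to the new version without redeploying either one.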
Binh-Minh Nguyen presented an approach for migrating applications to interoperable clouds using a Cloud Abstraction Layer (CAL). CAL provides a generalized interface that abstracts away differences between cloud providers and allows applications to be deployed across multiple cloud platforms. It aims to address issues like vendor lock-in and allow easier migration of applications between clouds. A prototype was demonstrated using CAL to deploy a bioinformatics workflow as a service across OpenStack clouds.
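The Cloud Abstraction Layer idea described above can be sketched as a common interface that hides provider-specific deployment APIs, so the same application spec can target any registered cloud. The provider classes below are illustrative stand-ins, not real SDK calls, and the names are assumptions rather than CAL's actual API.

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Generalized interface: applications depend on this, never on a vendor API."""
    @abstractmethod
    def deploy(self, app: str, image: str) -> str: ...

class OpenStackProvider(CloudProvider):
    def deploy(self, app, image):
        # real code would call the OpenStack APIs here
        return f"openstack://{app}"

class MockPublicCloud(CloudProvider):
    def deploy(self, app, image):
        # stand-in for any other provider's deployment API
        return f"publiccloud://{app}"

def migrate(app: str, image: str, source: CloudProvider, target: CloudProvider):
    """Redeploy the same app spec on a different provider - no app changes."""
    return source.deploy(app, image), target.deploy(app, image)

urls = migrate("bio-workflow", "workflow:1.0", OpenStackProvider(), MockPublicCloud())
```

Because the application only ever talks to `CloudProvider`, swapping vendors is a matter of passing a different implementation, which is exactly the lock-in problem the abstraction targets.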
Intro to SW Eng Principles for Cloud Computing - DNelson Apr2015 (Darryl Nelson)
The document provides an introduction to software engineering principles for cloud computing. It discusses key concepts like dematerialization, distributed computing, and shared responsibility between cloud vendors and clients. It outlines principles for cloud systems around infrastructure, scalability, reliability, availability, and avoiding vendor lock-in. Tactics, techniques and procedures are proposed for building cloud-native systems, along with resources for further learning. The document emphasizes designing systems to fail gracefully and testing systems under failure conditions.
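The "fail gracefully" principle the document emphasizes is commonly realized with retries and exponential backoff around calls that can fail transiently. The sketch below is a minimal illustration under that assumption; the delay values and the simulated flaky service are made up for the example.

```python
import time

def call_with_retries(fn, attempts=3, base_delay=0.01):
    """Retry a flaky call with exponential backoff, surfacing the
    last error only after all attempts are exhausted."""
    for i in range(attempts):
        try:
            return fn()
        except OSError:
            if i == attempts - 1:
                raise                      # out of attempts: fail loudly
            time.sleep(base_delay * (2 ** i))  # 10 ms, 20 ms, ...

calls = {"n": 0}
def flaky_service():
    """Simulated dependency that fails twice, then recovers."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("transient network error")
    return "ok"

result = call_with_retries(flaky_service)
```

Testing exactly this kind of injected-failure scenario is what the document means by exercising systems under failure conditions.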
CLMS was established in 1998 to bridge the gap between business and technology by designing digital operational enterprise ecosystems. In such ecosystems, applications, systems, processes, data, and external partners work together according to business strategy. CLMS builds on engineering foundations to improve business process effectiveness and offers solutions that help clients adapt to changing business and technology landscapes. Its approach involves domain modeling, knowledge management, continuous engineering processes, and integration infrastructures to facilitate interoperability and connectivity between systems.
This document introduces the characteristics of clouds and discusses the challenges these pose for engineering scientific applications on the cloud. Key challenges are programming models, developing PaaS interfaces for high-performance/high-throughput computing, and developing an SDE for cloud programming.
Cloud Computing & Batch Processing: Potential & Perspectives (Claude Riousset)
Presentation given on 21 October to the operations group of "Guide Share France".
Topic: cloud computing and batch processing
History and recap of the concepts
Data center & transformation
Cloud & batch synergy, with an example
OpenStack perspectives and examples
This document summarizes a presentation on building the perfect cloud. It discusses evaluating cloud deployment and delivery options, including private, public and hybrid clouds. It covers cloud architecture principles like "cloud ready IT" and common anti-patterns. The presentation addresses defining technology stacks, assessing workloads for cloud suitability, and adopting cloud native application architectures. It emphasizes that cloud transformation requires organizational alignment of people, processes and culture.
A Buyer's Guide to Hyper-Converged Infrastructure (Eric Van 't Hoff)
A very comprehensive buyer's guide for anyone considering a hyper-converged infrastructure to modernize their data center. The report is published by Computer Weekly and covers solutions from Dell EMC VxRail or XC, Nutanix, VMware vSAN ReadyNodes, HPE SimpliVity and Pivot3, to name just a few HCI leaders.
Daniel Raisch - raisch@br.ibm.com
Ten years after the start of what came to be called Digital Transformation, the main initiatives characterizing that transformation – Cloud, Mobile, Analytics – have reached maturity and are already on the priority agenda of more than 70% of Brazilian companies. In this presentation we show the evolution curve of these initiatives over that period and the state of the art each has reached in the industry.
Productionizing Predictive Analytics using the Rendezvous Architecture - for ... (danielschulz2005)
Redefining HCI: How to Go from Hyper Converged to Hybrid Cloud Infrastructure (NetApp)
The hyper converged infrastructure (HCI) market is entering a new phase of maturity. A modern HCI solution requires a private cloud platform that integrates with public clouds to create a consistent hybrid multi-cloud experience.
During this webinar, NetApp and an IDC guest speaker covered what led to the next generation of hyper converged infrastructure and which five capabilities are required to go from hyper converged to hybrid cloud infrastructure.
IBM Power System Presentation – Venaria Event, 14 October (PRAGMA PROGETTI)
The document discusses IBM's POWER8 processor and Linux on Power platform. It provides an overview of the OpenPOWER Consortium which aims to drive innovation through an open development model. Key highlights of POWER8 include 12 cores per socket, improved caches and memory bandwidth. Linux is highlighted as a growing enterprise workload with over 90% of supercomputers using it. Linux on Power is positioned as a strategic platform for new workloads like big data and analytics by combining Linux with the performance of POWER8.
Cloud native applications are built using microservices that are designed to integrate into any cloud environment. Microservices are loosely coupled, self-sufficient programs that execute a single business function. Docker uses containers to package software which provides isolation. Kubernetes is an open-source tool that automates deployment, scaling, and management of containerized applications across clusters. Migrating to cloud native involves steps like moving infrastructure to the cloud to reduce risk, automating processes, adopting a culture of experimentation, breaking monolithic apps into microservices, using containers, and orchestrating containers across clusters.
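The "single business function" idea above can be shown with a minimal microservice. The sketch below uses only the Python standard library to serve one function (price lookup) over HTTP; the service name, routes, and prices are hypothetical, and a real deployment would put this behind the container and orchestration layers the summary describes.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceHandler(BaseHTTPRequestHandler):
    """One business function only: look up a price by SKU."""
    PRICES = {"widget": 9.99, "gadget": 24.50}  # stand-in for a real datastore

    def do_GET(self):
        sku = self.path.lstrip("/")
        if sku in self.PRICES:
            body = json.dumps({"sku": sku, "price": self.PRICES[sku]}).encode()
            self.send_response(200)
        else:
            body = json.dumps({"error": "unknown sku"}).encode()
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PriceHandler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/widget") as resp:
    payload = json.loads(resp.read())
server.shutdown()
```

Because the service owns a single function behind a stable HTTP interface, it can be containerized, scaled, and replaced independently of the rest of the system.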
Jelastic Hybrid Cloud on Google Cloud Platform (Ruslan Synytsky)
Jelastic can be installed on top of Google Cloud Compute Engine as a Virtual Private Cluster. At the same time, enterprises can run the same version of Jelastic Private Cloud on-premises. This provides 100% compatibility between private and public clouds, and full automation for easy migration between them.
100% compatibility between public and private clouds is the key requirement for fast adoption by enterprises.
Edge computing is a distributed computing architecture that processes data closer to where it is generated, at the edge of the network, rather than sending all data to centralized cloud data centers for processing. It provides benefits like increased speed and reliability, reduced latency, and better security compared to cloud computing. Edge computing is well-suited for applications in smart cities, manufacturing, healthcare, augmented reality, and AI assistants. Future directions for edge computing include improved edge-to-cloud data exchange, common data exchange between edge devices, streaming and batch data analytics, and cloud-based deployments of edge applications.
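The latency and bandwidth benefits described above come from summarizing data where it is produced instead of shipping every raw sample to a central cloud. The sketch below illustrates that pattern with made-up sensor readings and an illustrative anomaly threshold.

```python
import statistics

readings = [21.4, 21.6, 21.5, 35.0, 21.5]  # hypothetical temperature samples

def edge_summary(samples, anomaly_threshold=30.0):
    """Process at the edge: send a compact summary (plus any anomalous
    raw values) upstream instead of every individual sample."""
    return {
        "count": len(samples),
        "mean": round(statistics.mean(samples), 2),
        "max": max(samples),
        # forward only the readings that need central attention
        "anomalies": [s for s in samples if s > anomaly_threshold],
    }

summary = edge_summary(readings)
```

At realistic sampling rates (thousands of readings per summary window), sending one summary per window instead of the raw stream is where the bandwidth and latency savings come from.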
Hybrid Cloud Point of View - IBM Event, 2015 (Denny Muktar)
My slides for an IBM Cloud event in November 2015. The deck covers disruption, innovation, four guiding principles for hybrid cloud, and the steps of the cloud journey.
A link to the IBM Cloud Adoption Advisor is at the end of the deck.
Must-watch video: Guy Kawasaki – TEDx talk.
How IBM is helping developers win the race to innovate with next-gen cloud se... (Michael Elder)
In the race to transform, enterprises employ cloud to deliver innovation and stay ahead of the competition. New services are built natively on cloud, but what about the 80% of enterprise applications that have not yet moved to the cloud?
In this session, we'll answer these questions: How do I integrate next-gen technology like Blockchain, Watson IoT, and Data & AI into my new applications? How do I make multicloud an advantage instead of adding new complexity?
Cloud engineering is the application of systematic and disciplined approaches to the ideation, development, operation and maintenance of cloud computing. It focuses on infrastructure as a service, platform as a service and software as a service. Cloud engineering draws from areas like system engineering, software engineering and security engineering. It aims to facilitate cloud adoption and standardization. Key benefits include reduced costs, improved quality, scalability and agility. The term was coined in 2007 and the first IEEE conference on cloud engineering was held in 2013.
Marine Air Ground Task Force Command & Control Systems Software Deployment an... (LaurenWendler)
This document discusses the current and future state of software deployment for Marine Air Ground Task Force Command & Control Systems. Currently, software deployment is a manual, costly, and time-intensive process involving building DVDs, physical distribution, and manual installation. The future state aims to automate the entire process using technologies like UrbanCode Deploy, Aspera, and BigFix to enable continuous integration, electronic distribution, automated installation, and automatic maintenance. This is expected to significantly reduce costs, improve compliance, and allow faster response times to security issues. The transition to the new system architecture and deployment approach will occur over fiscal year 2018 through migration, testing, and accreditation activities.
The document discusses cloud computing and its benefits. It summarizes that cloud provides a flexible way to deliver IT solutions and lower costs. It presents three use cases where cloud computing can help by reducing time-to-market, providing elastic computing power when needed, and allowing organizations to focus on innovation rather than infrastructure maintenance. The document promotes IBM's cloud strategy and services for developing and implementing cloud solutions. It highlights IBM's BlueMix platform and SoftLayer infrastructure as key offerings.
Webinar presented live on May 29, 2018
The Cloud Native Computing Foundation builds sustainable ecosystems and fosters a community around a constellation of projects that orchestrate containers as part of a microservices architecture. CNCF serves as the vendor-neutral home for many of the fastest-growing projects on GitHub, including Kubernetes, Prometheus and Envoy, fostering collaboration between the industry’s top developers, end users, and vendors.
In this webinar, Dan Kohn, CNCF Executive Director, will present:
- A brief overview of CNCF
- Evolving monolithic applications to microservices on Kubernetes
- Why Continuous Integration is the most important part of the cloud native architecture
Watch the video: http://www.cloud-council.org/webinars/kubernetes-and-container-technologies-from-cncf.htm
Bahrain ch9: Introduction to Docker, 5th Birthday (Walid Shaari)
A hands-on workshop covering the foundations of the container platform, including an overview of its system components: images, containers, repositories, clustering, and orchestration. The strategy is to teach through live demos and hands-on exercises. It also covers the use of containers to build a portable distributed application cluster running a variety of workloads, including HPC.
Webinar presented live on May 29, 2018
The Cloud Native Computing Foundation builds sustainable ecosystems and fosters a community around a constellation of projects that orchestrate containers as part of a microservices architecture. CNCF serves as the vendor-neutral home for many of the fastest-growing projects on GitHub, including Kubernetes, Prometheus and Envoy, fostering collaboration between the industry’s top developers, end users, and vendors.
In this webinar, Dan Kohn, CNCF Executive Director, will present:
- A brief overview of CNCF
- Evolving monolithic applications to microservices on Kubernetes
- Why Continuous Integration is the most important part of the cloud native architecture
Watch the video: http://www.cloud-council.org/webinars/kubernetes-and-container-technologies-from-cncf.htm
Bahrain ch9 introduction to docker 5th birthday Walid Shaari
A hands-on workshop will go over the foundations of the containers platform, including an overview of the platform system components: images, containers, repositories, clustering, and orchestration. The strategy is to demonstrate through "live demo, and hands-on exercises." The reuse case of containers in building a portable distributed application cluster running a variety of workloads including HPC workload.
A presentation to explain the microservices architecture, the pro and the cons, with a view on how to migrate from a monolith to a SOA architecture. Also, we'll show the benefits of the microservices architecture also for the frontend side with the microfrontend architecture.
Modern Architecture in the Cloud of 2018 (IT Camp 2018)Marius Zaharia
Today, the large public Clouds - Azure and AWS - deploy at high-speed a diversity of services and features. Between Azure Functions, Event Grid, Azure VM Scale Sets, or Logic Apps, what to choose? Shall I go on Microservices? Event-Driven? Lambda Architecture? Deploy on Serverless? Containers? Modern Compute? Let's put a bit of order in all that. Enter the Modern Architecture, the foundation of all the new wave of Cloud services and not only. Session focused on application and infrastructure architecture, examples based on Cloud, perspectives and roadmap of the corresponding services at Microsoft.
Recording here: https://www.youtube.com/watch?v=5W4n9K3PIVg
Since Docker was open sourced in 2013, the community and adoption around Docker containers has grown to over 6 billion downloads and over 1000 contributors. Learn about why this is, and why you should start using containers for your own applications.
Containers Anywhere with OpenShift by Red Hat - Session Sponsored by Red HatAmazon Web Services
OpenShift is Red Hat's Platform-as-a-Service (PaaS) that lets developers quickly develop, host, and scale Docker container-based applications. OpenShift enables a uniform and standardised approach to container management across all hosting options including AWS/EC2 and other private/public cloud and on/off-premise variants.
At this session, you will learn how Red Hat's enterprise clients are using OpenShift to enable their digital transformation initiatives. Examples will cover how realising a hybrid cloud strategy can simplify and reduce the risk of migrating and transitioning application workloads to containers in the cloud.
Speaker: Andrea Spanner, Red Hat Asia Pacific Pty Ltd
Red hat's updates on the cloud & infrastructure strategyOrgad Kimchi
Red Hat presented its cloud and infrastructure strategy, focusing on Red Hat Cloud Suite which includes OpenStack for the software platform, OpenShift for DevOps and containers, and CloudForms for cloud management. OpenStack provides massive scalability for infrastructure and removes vendor lock-in. OpenShift enables developers and operations to build, deploy, and manage containerized applications from development to production on any infrastructure including physical, virtual, private and public clouds. CloudForms allows for managing containers and OpenShift deployments across hybrid cloud environments.
Hybrid Cloud: How to Get a Return from an Investment Made Three Decades Ago (...Michael Elder
How do you get the value of the last 3 decades of investment in your backend into the hands of your end users faster? And through new mediums like mobile?
IBM Bluemix offers you the opportunity to craft new applications in a fully hosted and managed Platform as a Service. Wouldn’t it be great if you could tie these two worlds together? Well, in fact you can!
In this talk, we’ll show you how to incorporate backend services into your IBM Bluemix applications through Cast Iron Live, an API gateway that let’s you expose your on-prem backend services safely to off-prem applications on IBM Bluemix. We’ll even show you how to manage the entire chain using a consistent DevOps-centric toolchain using IBM UrbanCode Deploy!
Bob is technical architect, Alice is buyer, Ted is software developer. Three of them have good reasons to select Dimension Data as a strategic cloud service provider. Learn more about CloudControl (the global orchestration) and the Managed Cloud Platform (the pods). Engage with the European cloud team and create the capability that you are looking for!
This document provides an agenda and information for the IBM Cloud Tour Design Track on hybrid cloud infrastructure choices. The agenda includes lightning talks on various hybrid cloud topics, roundtable discussions, and demos. Several IBM experts are listed as presenting on topics like hybrid cloud infrastructure choices, building cloud object storage, starting an API journey, and managing hybrid deployments. The document promotes IBM Cloud as providing choices to meet varied app and data needs with security, cognitive solutions, powerful data/analytics, hybrid integration, DevOps productivity, and consistency across options. It encourages attendees to provide feedback and continue discussions at the roundtables and demos.
InterConnect 2015: 3045 Hybrid Cloud - How to get a return from an investment...Daniel Berg
This document discusses hybrid cloud and IBM's approach. It defines hybrid cloud as the secure consumption of services from both private and public clouds as well as traditional IT. It outlines IBM's focus on services integration, portability, and flexible deployment models to enable hybrid cloud. It also discusses IBM's DevOps services and tools like UrbanCode Deploy that help deliver applications to hybrid environments through continuous delivery pipelines.
Docker, Unikernels and Docker for Mac discusses how Docker spans the continuum of compute by enabling the building, shipping, and running of applications across Linux containers, Windows containers, and soon unikernels. Docker for Mac embeds a hypervisor and extends it with improvements for native packaging, enabling Docker containers to run seamlessly on Mac systems. Unikernels compile application source code into custom operating systems including only required functionality for high performance, efficiency, and security. Docker aims to incorporate unikernels onto a continuum with Linux and Windows containers to allow applications to run from datacenters to clouds to IoT.
Container Technologies and Transformational valueMihai Criveti
Transformational value for container technologies - the business impact of Digital Transformation to Cloud Native technologies.
A brief overview of the technology impact of containers, OpenShift and automation.
Talk delivered at Guide Share Europe Conference 2021: https://www.youtube.com/watch?v=1QunNECL26M
This document introduces ActOnMagic, an innovative technology company that provides cloud governance, analytics, management, and brokering software. It has expertise in multi-cloud and hybrid cloud environments. The company's ActOnCloud platform offers a single pane of glass for self-service provisioning, pricing/chargeback, compliance, and automation across public, private and hybrid clouds. The platform provides intelligence, management, and governance capabilities. Testimonials praise ActOnMagic's expertise in upgrading CloudStack environments and creating a cloud governance solution for Softlayer usage.
Cloud native refers to applications and infrastructure designed for cloud environments. It emphasizes characteristics like resilience, agility, operability, and observability. Cloud native applications are built for the cloud from the start, rather than refactored to run in the cloud. Cloud native infrastructure is software-defined and API-driven, enabling scalability and automation. Moving to cloud native involves virtualizing infrastructure, using containers, modularizing services, and implementing automation.
DockerCon SF 2015: Docker Community in ChinaDocker, Inc.
1) The document discusses the Docker community in China, noting that early adopters like Baidu helped drive adoption.
2) Meetups and content contributed to scaling the community from 1 to over 19 cities with thousands of attendees. Chinese contributors are also among the top for the Docker project.
3) The market for Docker in China is driven by the "Internet Plus" strategy and sectors like e-commerce, social media, and IoT. This is creating opportunities for startups and traditional businesses to embrace mobile and cloud technologies.
4) The ecosystem involves startups building tools for CI/CD, container services, and management, and projects like Hyper focusing on running containers on any hypervisor. Developers are also using
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
National Security Agency - NSA mobile device best practices
Slide shared
1. Containerization as an
Enterprise Architecture Strategy
Huxi LI
huxi.li@holydis.com
Head of Engineering, Holy-Dis, France
April 2019, Cluj-Napoca, Romania
(https://codecamp.ro/cluj)
2. Holy-Dis (https://holydis.com)
A leading French workforce management solution provider for retail, call centres, and more,
A company with 30 years of history and innovation.
Axon-soft (https://axon-soft.com)
Strategic partner of Holy-Dis,
Providing invaluable development expertise for Holy-Dis core products.
Huxi LI
Ph.D. in engineering and simulation,
Head of Engineering at Holy-Dis (France), leading the work on:
‐ A new-generation WFM platform with a cloud-native, highly scalable architecture, unifying all our products on a single platform,
‐ A fully containerized SaaS platform.
In the past (2000–2018), managing architect, leading enterprise system architecture and transformation for various companies and institutions.
WHO AM I
5. CONTAINER TECHNOLOGY
KEY Container Technologies
• FreeBSD Jail (2000): FreeBSD's OS-level virtualization technology; does NOT allow running kernel versions different from that of the base system.
• LXC (2008): Linux containers, the foundation of Linux containerization.
• Docker (2013): the leading container system for application deployment; initially based on LXC but now on the Open Container Initiative.
• LXD (2015): built on top of LXC; specializes in deploying Linux virtual machines and in integration with OpenStack.
• Rkt (2016): emphasizes security; competes with Docker, with support from Google, AWS, etc.
6. Docker leads the container movement:
Battle-proven in production,
De facto industry standard,
Broad industry support,
Widely available Docker images (https://hub.docker.com).
CONTAINER TECHNOLOGY
Google Trends data (2014–2019)
7. Docker adoption in the enterprise:
• 20% of hosts ran Docker at the end of 2017, and adoption is growing at a rate of 40% annually, according to data from Datadog.
CONTAINER TECHNOLOGY
https://www.datadoghq.com/docker-adoption/
8. Docker adoption in the enterprise:
• 25% of Datadog-monitored enterprises were using Docker in production at the end of 2017; this figure should be much higher now!
CONTAINER TECHNOLOGY
https://www.datadoghq.com/docker-adoption/
11. Cloud computing is the new norm
• Like it or not, cloud will be everywhere!
• IaaS grew +40% annually in 2018 (Synergy, 2019),
• Cloud IT infrastructure revenues surpassed traditional IT infrastructure revenues at the end of 2018 (IDC, 2019),
• Cloud is becoming a safe choice for architects.
CLOUD STRATEGY AND CONTAINERIZATION
[Chart: 2018 revenues — IaaS > 50%, traditional < 50%; IaaS growing +40% annually]
12. Cloud lock-in: the real, and worst, risk
• Cloud will be the "backbone" of future enterprise computing,
• The battle for the cloud is a new kind of OS war,
• All major cloud providers use marketing techniques to incite new customers to buy in:
AWS offers a one-year free account,
Microsoft Azure and Google GCP have their counterparts,
• All cloud players try to buy you in, but are you prepared for an exit door?
CLOUD STRATEGY AND CONTAINERIZATION
13. Facing cloud lock-in, you need a strategy
• You need pull-out options,
• A multi-cloud strategy is a wise architectural strategy to avoid cloud provider lock-in,
• The question is HOW: "How to deploy a multi-cloud strategy?"
CLOUD STRATEGY AND CONTAINERIZATION
14. Containerization = a good multi-cloud strategy
• A containerized application runs on any cloud,
- Runs on all major cloud IaaS platforms,
- Reduces / avoids the risks of cloud lock-in,
• A containerized application runs the same way, on-premise or on-cloud,
- Freedom of choice between on-premise & on-cloud,
- You have the choice of taking advantage of both on-premise & on-cloud resources.
CLOUD STRATEGY AND CONTAINERIZATION
17. Technologies move at unprecedented speed
• Disruptive technologies appear every few years, creating opportunities for many, but traps for those unprepared,
• Software is at the centre of digital revolutions, advancing at an amazing speed thanks to open source and globalized collaboration,
• How can your business keep up with changing technology trends?
INNOVATION AGILITY AND CONTAINERIZATION
18. Facing a rapidly changing tech industry, you need an agile innovation strategy:
• Efficient technology evaluation,
• Reliable technology upgrading,
• Reliable technology decommissioning,
• Reactive risk management.
INNOVATION AGILITY AND CONTAINERIZATION
19. Containerization = Innovation Catalyst:
• You have free access to
> 2 million images available on hub.docker.com,
Most open-source software available as Docker images,
Official images available for thousands of proprietary technologies,
• You can innovate without the burden of starting from scratch:
Any Docker image can be customized and extended!
• You can upgrade your technologies at your convenience:
Docker images are versioned,
You can choose the one you need.
INNOVATION AGILITY AND CONTAINERIZATION
20. Need a MySQL 8 server?
• Simply run the following command (the official mysql image requires a root password setting to start):
docker run -d --name mydb8 -e MYSQL_ROOT_PASSWORD=my-secret-pw -p 3306:3306 mysql:8
• You immediately get a running MySQL 8 server listening on port 3306!
• You can do anything you want with it!
• This is amazingly fast and simple!
Want to change versions?
• Simply change the image version:
docker run -d -e MYSQL_ROOT_PASSWORD=my-secret-pw -p 3306:3306 mysql:5.7
INNOVATION AGILITY AND CONTAINERIZATION
21. Need to learn MongoDB 3?
• Simply run the following command:
docker run -d --name mymongo -p 27017:27017 mongo:3
• You immediately get a running MongoDB 3 server!
• You can test, experiment, or do anything you like on your new server!
INNOVATION AGILITY AND CONTAINERIZATION
22. Containerization = Innovation Agility
• Rapid technology evaluations,
• Benefit from millions of pre-fabricated images covering an amazing number of technologies,
• You can easily keep your experimentation results:
By creating your own images,
By saving the Dockerfile in Git or any SCM and rebuilding your image at any moment.
INNOVATION AGILITY AND CONTAINERIZATION
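As a sketch of that last point, a few-line Dockerfile committed to Git is enough to make an experiment reproducible. The image name and configuration file below are hypothetical, not from the talk:

```dockerfile
# Hypothetical sketch: extend the official MySQL 8 image with a
# custom configuration file that is kept under version control.
FROM mysql:8

# Each instruction adds a new layer on top of the official image;
# the lower (official) layers are never modified.
COPY my-tuning.cnf /etc/mysql/conf.d/my-tuning.cnf
```

Rebuilding is then a single, repeatable command from any clone of the repository, e.g. `docker build -t myteam/mysql:8-tuned .` (the tag is illustrative).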
25. REPEATABLE PROVISION AND CONTAINERIZATION
Repeatable is hard!
• If something works here and now, there is no guarantee it will still work tomorrow or somewhere else,
• If something works in a development environment, there is no guarantee it will still work in production,
• From development to production is a permanent battle, with confusion and uncertainty,
- "But it worked yesterday, and nothing has changed!" "But it works on my machine!" Such conversations are common in the enterprise.
• These phenomena illustrate
- the frustration of IT professionals facing the non-reproducible behaviour of their enterprise systems,
- how hard it is to achieve repeatable IT delivery.
26. REPEATABLE PROVISION AND CONTAINERIZATION
Repeatable is hard, but it progresses:
• Machine: hardware virtualization revolutionized machine provision,
• Datacentre: cloud technologies industrialized datacentre provision,
• Application & Service: container technology revolutionized application & service provision.
27. Repeatable "Machine Provision"
• Hardware virtualization revolutionized machine provision:
- You can create a VM image with a preinstalled OS and software,
- You can deploy new VM instances with identical configurations,
• Hardware virtualization makes machine provision repeatable & economic,
• Hardware virtualization created a market valued at billions of USD.
REPEATABLE PROVISION AND CONTAINERIZATION
Server virtualization market size (Source: Market Research Future®)
28. Industrialized "Datacentre Provision"
• Cloud technologies industrialize datacentre provision with:
- One-click, on-demand VM deployment,
- Self-service Virtual Private Cloud (VPC) deployment,
• Cloud technology uses virtualization technologies with prefabricated VM images:
- 38 official Amazon Machine Images (AMIs),
- 64K community-provided AMIs.
REPEATABLE PROVISION AND CONTAINERIZATION
29. Surprises and the unexpected continue…
• Hardware virtualization is not enough,
• Strong coupling continues:
VMs are multi-purpose & multi-context,
Shared system components,
Shared environments,
Shared resources (RAM, CPU, network),
• Your programs are not alone, and suffer from:
Unexpected system changes,
Unexpected impacts on others,
Unexpected differences from development to production.
REPEATABLE PROVISION AND CONTAINERIZATION
30. Containerization = Repeatable Application Delivery
• Containerization provides:
Mono-purpose & mono-context images,
Self-contained & isolated runtimes,
Light-weight, versioned, immutable images,
• A containerized system:
Protects you from unexpected system changes,
Protects you from unexpected impacts on others,
Uses the same image for all environments (Dev, Stage, Prod).
REPEATABLE PROVISION AND CONTAINERIZATION
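The "same image for all environments" idea can be sketched with a Docker Compose file: the immutable image tag is pinned once, and only externalized configuration varies per environment. All names here (service, image tag, env files) are hypothetical:

```yaml
# Hypothetical sketch: one pinned image, three environments.
services:
  app:
    image: myteam/myapp:1.4.2      # the identical image runs in Dev, Stage, and Prod
    env_file:
      - ./${DEPLOY_ENV:-dev}.env   # dev.env, stage.env, or prod.env supplies the differences
    ports:
      - "8080:8080"
```

Because only the env file changes between environments, "it worked in Dev but not in Prod" can no longer be caused by a different build of the application.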
31. Containers are Isolated
• A container runs in an isolated black box, with its own filesystem, RAM, CPU, and network interfaces,
• Inside a container, you don't see any difference from a traditional operating system!
REPEATABLE PROVISION AND CONTAINERIZATION
Ubuntu 18.04 Container
# docker run --name myubuntu -d -it ubuntu:18.04
# docker exec -it myubuntu bash
# ls -l
32. Containers are Self-contained
• A container is a complete and independent unit of runtime!
• It contains all runtime components, independent of the host machine,
• That is why it runs exactly the same way on any platform with Docker installed, regardless of the nature of the host machine (cloud or on-premise, VM or physical, version of the host OS),
• That is why it runs exactly the same way in all environments (Dev, Stage, Prod).
REPEATABLE PROVISION AND CONTAINERIZATION
33. Containers are Lightweight
• A Docker image of Alpine Linux with a complete package index is only 5 MB in size!
• An Nginx server based on Alpine Linux is only 16 MB in size!
• A container can be as big as a virtual machine, or as small as a simple "Hello World" program,
• A container starts in < 1 second.
REPEATABLE PROVISION AND CONTAINERIZATION
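To illustrate how such small images are built, a minimal Alpine-based Nginx image can be described in a few lines. This is a sketch, not the official nginx:alpine Dockerfile:

```dockerfile
# Hypothetical minimal image: start from the ~5 MB Alpine base.
FROM alpine:3.9

# Install only the single package the container needs;
# --no-cache avoids keeping the package index in the image.
RUN apk add --no-cache nginx

# Run nginx in the foreground, as container convention requires.
CMD ["nginx", "-g", "daemon off;"]
```

The resulting image stays in the tens of megabytes because it carries nothing beyond the base layer and the one installed package.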
34. Container Images are Versioned
• A Docker image is a self-contained archive with a layered structure. Each layer introduces its own changes but never touches lower layers,
• Docker images are tagged (versioned) and identified by a unique ID,
• Image layers are hashed for optimization, caching, and sharing,
• Most importantly, a container image contains everything needed to run!
REPEATABLE PROVISION AND CONTAINERIZATION
[Diagram: layered image version tags 1.0, 1.1, 2.0, 2.3, 3.0, 5.0]
35. Containers are Mono-purpose
• Containers are designed for mono-purpose & mono-context application deployment,
• Containerized applications avoid:
unexpected system changes by others,
unexpected impacts on others.
REPEATABLE PROVISION AND CONTAINERIZATION