The document summarizes a presentation about American Airlines' transformation to a lean architecture and Agile development approach using IBM Cloud. Some key points:
- American Airlines wanted to adopt Agile but faced challenges with their monolithic website, complex applications, difficulty with regression tests, and siloed teams.
- The project aimed to transform AA.com, mobile apps, and kiosks to a modern microservices architecture on IBM Cloud to enable continuous innovation.
- They used an IBM Cloud Garage methodology with squads, guilds, and decentralized governance. Architecture was documented in tools developers used like GitHub.
- Lessons included evolving the architecture incrementally and starting platform teams ahead of squads.
This document discusses enterprise data management. It defines enterprise data management as removing organizational data issues by defining accurate, consistent, and transparent data that can be created, integrated, disseminated, and managed across enterprise applications in a timely manner. It also discusses the need for a structured data delivery strategy from producers to consumers. The document then outlines some key enterprise data categories and provides a conceptual and logical view of an enterprise master data lineage architecture with data flowing between transactional systems, a data management layer, and analytics.
Why an AI-Powered Data Catalog Tool is Critical to Business Success (Informatica)
Imagine a faster, more efficient business thriving on trusted data-driven decisions. An intelligent data catalog can help your organization discover, organize, and inventory all data assets across the organization and democratize data with the right balance of governance and flexibility. Informatica's data catalog tools are powered by AI and can automate tedious data management tasks and offer immediate recommendations based on derived business intelligence. We offer data catalog workshops globally. Visit Informatica.com to attend one near you.
Slide show for the webinar on "Spatial Data Science with R" organized for the GeoDevelopers.org community. The video of the webinar and all the related materials including source code and sample data can be downloaded from this link: http://amsantac.co/blog/en/2016/08/07/spatial-data-science-r.html
In this webinar I talked about Data Science in the context of its application to spatial data and explained how we can use the R language for the analysis of geographic information within the different stages of a data science workflow, from the import and processing of spatial data to visualization and publication of results.
DataOps: An Agile Method for Data-Driven Organizations (Ellen Friedman)
DataOps expands DevOps philosophy to include data-heavy roles (data engineering & data science). DataOps uses better cross-functional collaboration for flexibility, fast time to value and an agile workflow for data-intensive applications including machine learning pipelines. (Strata Data San Jose March 2018)
This document discusses Uber's use of big data with real-world examples. It describes how Uber handles millions of daily rides and billions of recorded rides globally. It discusses how Uber uses Kafka to centrally handle data arriving from different sources, in different formats, and at varying throughput, and how it uses Cassandra as its NoSQL database for needs like reading user and driver location data with fast response times. It provides examples of how Spark can run real-time and batch processing over Uber's huge data volumes to gain insights. Finally, it proposes a hypothetical system called Cablito that could handle Uber's personal user data, process booking requests and ride data, and perform analytics on metrics and historical data.
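The stream-versus-batch split described above can be illustrated with a minimal sketch. This is not Uber's actual code; the event fields, city names, and fares are invented. The point is only that a running aggregate updated per event (the streaming path a Kafka consumer would feed) and a full recomputation over history (the batch path) converge on the same answer.

```python
# Toy illustration (not Uber's actual pipeline): the same ride events
# processed two ways, mirroring the stream/batch split in the summary.
rides = [
    {"city": "SF", "fare": 12.0},
    {"city": "NYC", "fare": 20.0},
    {"city": "SF", "fare": 8.0},
]

# "Streaming" view: update a running total as each event arrives,
# the way a streaming job consumes events from a Kafka topic.
running = {}
for event in rides:
    running[event["city"]] = running.get(event["city"], 0.0) + event["fare"]

# "Batch" view: recompute totals over the full history in one pass,
# the way a nightly batch job would.
batch = {}
for event in rides:
    batch[event["city"]] = batch.get(event["city"], 0.0) + event["fare"]

assert running == batch  # both views agree on the totals
print(running)  # {'SF': 20.0, 'NYC': 20.0}
```

In a real deployment the interesting differences are latency, reprocessing, and fault tolerance; the arithmetic itself is the same.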
This document summarizes a research study that assessed the data management practices of 175 organizations between 2000-2006. The study had both descriptive and self-improvement goals, such as understanding the range of practices and determining areas for improvement. Researchers used a structured interview process to evaluate organizations across six data management processes based on a 5-level maturity model. The results provided insights into an organization's practices and a roadmap for enhancing data management.
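The assessment arithmetic behind a study like the one above can be sketched simply: each of the six processes gets a rating on the 5-level maturity scale, and the results drive an improvement roadmap. The process names and ratings below are invented for illustration, not taken from the study.

```python
# Hypothetical scoring sketch: six data management processes, each
# rated 1-5 on a maturity model. Names and ratings are illustrative.
ratings = {
    "data governance": 2,
    "data architecture": 3,
    "data quality": 2,
    "metadata management": 1,
    "data security": 3,
    "data operations": 2,
}

assert all(1 <= r <= 5 for r in ratings.values())

overall = sum(ratings.values()) / len(ratings)
weakest = min(ratings, key=ratings.get)  # lowest-rated process first
print(f"overall maturity: {overall:.2f}, start with: {weakest}")
```

A real assessment weights processes and uses structured interview evidence rather than a single number per process, but the roadmap logic (target the weakest areas first) is the same.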
The document provides an overview of database migration options using AWS Database Migration Service (DMS) and AWS Schema Conversion Tool (SCT). It discusses how DMS can be used to migrate databases across different database platforms with minimal downtime. It also outlines how SCT can be used to convert schemas from commercial databases to open-source databases like PostgreSQL. The document shares customer examples and benefits of using DMS and SCT for heterogeneous, scale-up, and split migrations. It also lists available resources for customers on DMS and SCT.
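The core of a heterogeneous schema conversion, as performed by a tool like SCT, is mapping source types to target equivalents. The sketch below is NOT AWS SCT's actual rule set; it shows the idea with a few simplified, commonly cited Oracle-to-PostgreSQL equivalents, and the function name is invented.

```python
# Illustrative type-mapping sketch (not AWS SCT's real rules):
# simplified Oracle-to-PostgreSQL column type equivalents.
ORACLE_TO_POSTGRES = {
    "VARCHAR2": "varchar",
    "NUMBER": "numeric",
    "DATE": "timestamp",
    "CLOB": "text",
    "BLOB": "bytea",
}

def convert_column(name: str, oracle_type: str) -> str:
    """Return a PostgreSQL column definition for an Oracle column."""
    pg_type = ORACLE_TO_POSTGRES.get(oracle_type.upper())
    if pg_type is None:
        # Real tools flag these as "action items" needing manual work.
        raise ValueError(f"no automatic mapping for {oracle_type}")
    return f"{name} {pg_type}"

print(convert_column("customer_name", "VARCHAR2"))  # customer_name varchar
```

The unmapped-type branch is where most real migration effort goes: procedural code, proprietary types, and engine-specific features rarely convert one-to-one.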
Introduction to Modern Data Virtualization (US) (Denodo)
Watch full webinar here: https://bit.ly/3uyvxN5
"Through 2022, 60% of all organizations will implement data virtualization as one key delivery style in their data integration architecture," according to Gartner. What is data virtualization, and why is its adoption growing so quickly? Modern data virtualization accelerates time to insights and data services without copying or moving data.
Watch this webinar to learn:
- Why organizations across the world are adopting data virtualization
- What is modern data virtualization
- How data virtualization works and how it compares to alternative approaches to data integration and management
- How modern data virtualization can significantly increase agility while reducing costs
- How to easily get started with Denodo Standard 8.0
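The central idea in the webinar description above, serving a unified view without copying or moving data, can be sketched in a few lines. The sources, fields, and values below are invented; a real platform such as Denodo does this against live databases with query pushdown, security, and caching rather than in-memory lists.

```python
# Minimal data virtualization sketch: a "virtual view" that answers
# queries by reading two source systems on demand, without copying
# rows into a central store. All data here is invented.
crm_source = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
billing_source = [{"id": 1, "balance": 250.0}, {"id": 2, "balance": 0.0}]

def customer_view():
    """Join the two sources lazily, yielding unified rows."""
    balances = {row["id"]: row["balance"] for row in billing_source}
    for row in crm_source:
        yield {**row, "balance": balances[row["id"]]}

# Consumers query one view; the data stays in its source systems.
overdue = [c["name"] for c in customer_view() if c["balance"] > 0]
print(overdue)  # ['Acme']
```

The generator stands in for the virtualization layer: nothing is materialized until a consumer asks, which is the contrast with ETL-style copying.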
Learn about the basics of Salesforce.com (force.com), the world's first and most popular CRM system, headquartered in San Francisco.
By: Vijay Maurya
Student at Baddi University of Emerging Science and Technology, Solan
For more info contact me:-
email: vijaymaurya3167@gmail.com
IG: @vijaymaurya_
Fb.com/vijay.maurya2
New Dynamics 365 Implementation Guide - Available for Download (Dynamics Square)
Almost 700 pages of insights and guidance around strategy, initiation, implementation, preparation, and operation. Contact us for Dynamics 365 implementation: https://www.dynamicssquare.com.au/dynamics-365/
Big Data Fabric: A Recipe for Big Data Initiatives (Denodo)
Big data fabric combines essential big data capabilities in a single platform to automate the many facets of data discovery, preparation, curation, orchestration, and integration across a multitude of data sources. Attend this session to learn how Big Data Fabric enabled by data virtualization constitutes a recipe for:
• Enabling new actionable insights with minimal effort
• Securing big data end-to-end
• Addressing big data skillset scarcity
• Providing easy access to data without having to decipher various data formats
Agenda:
• Big Data with Data Virtualization
• Product Demonstration
• Summary & Next Steps
• Q&A
Watch webinar on demand here: https://goo.gl/EpmIBx
This webinar is part of the Data Virtualization Packed Lunch Webinar Series: https://goo.gl/W1BeCb
Global Mapper Tutorial, Jimma University, Ethiopia (chala hailu)
This document discusses using Global Mapper software to delineate watersheds and calculate peak runoff for flood analysis. It provides an overview of common hydrological analysis methods for estimating maximum flood levels, and describes using the SCS unit hydrograph method within Global Mapper. It also outlines the basic steps and tools in Global Mapper for watershed delineation and peak runoff calculation.
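The runoff arithmetic behind the SCS method mentioned above can be written out directly in US-customary units (inches). The curve number and rainfall depth in the example are illustrative values, not figures from the slides; Global Mapper applies this kind of calculation over a delineated watershed rather than a single point.

```python
# SCS curve number runoff equation (US-customary units, inches):
#   S  = 1000/CN - 10                 potential maximum retention
#   Ia = 0.2 * S                      initial abstraction
#   Q  = (P - Ia)^2 / (P - Ia + S)    direct runoff, for P > Ia
def scs_runoff(rainfall_in: float, curve_number: float) -> float:
    """Direct runoff depth Q (inches) from rainfall P and curve number CN."""
    s = 1000.0 / curve_number - 10.0
    ia = 0.2 * s
    if rainfall_in <= ia:
        return 0.0  # all rainfall absorbed before runoff begins
    return (rainfall_in - ia) ** 2 / (rainfall_in - ia + s)

# Example: 4 inches of rain on a watershed with CN = 80.
print(round(scs_runoff(4.0, 80), 2))  # 2.04
```

Peak discharge estimation then combines this runoff depth with the watershed area and the unit hydrograph's time parameters, which is the part the tutorial performs inside Global Mapper.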
Survive the fog of system development! Developers' lives have gotten more complex in the last decade. There is too much to learn and understand now, and you need a co-pilot. Let AIOps be that co-pilot.
In this webinar, we'll share use cases and discuss:
What is AIOps?
Why AI and ML are well-suited for Ops and DevOps
A guide for assessing where to automate
The document provides an overview of SAP Cloud Platform, including key use cases for integrating apps and data, extending existing cloud and on-premise apps, and building new cloud apps. It also discusses connecting people and data. Customer stories demonstrate how companies are using SAP Cloud Platform for integration, innovation, Internet of Things applications, and digital experiences. Architectural blueprints illustrate potential implementations involving SAP and non-SAP systems and applications.
This document provides a training guide for Databricks partners. It outlines Databricks' partner enablement program, which includes self-paced and instructor-led training courses to help partners make the most of their partnership. The courses cover topics like the Databricks platform, data engineering, machine learning, and industry solutions. Partners can earn badges by completing courses. The guide also discusses partner certification programs, champions programs for top technical experts, and sales training to prepare partners' customer-facing roles.
Cloud offers organizations the opportunity to run their workloads on physical machines at a reduced cost, with better overall performance and enhanced security. Yet engaging in a partial or total migration to cloud requires a solid, holistic strategy that focuses on the technical and management challenges that will likely arise, and an organizational mindset that will help ensure the migration's success. This introductory, vendor-agnostic talk will highlight the technical, management, and cultural considerations that every cloud migration strategy should consider, how to address some common challenges, and best practices to help guide the process.
This document summarizes three case studies that used remote sensing and GIS techniques to analyze land use and land cover change over time. The first case study analyzed changes from 1990-2010 in Hawalbagh, India using Landsat imagery. It found increases in built-up land and decreases in barren land. The second studied coastal Egypt from 1987-2001 using Landsat, identifying 8 land cover classes. The third examined Simly watershed, Pakistan from 1992-2012 using Landsat and SPOT data, finding increases in agriculture and decreases in vegetation. All three used supervised classification and post-classification comparison to analyze land use/cover changes.
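The two techniques all three case studies share, supervised classification followed by post-classification comparison, can be shown with a toy example. Everything below is invented for illustration: real work uses multi-band imagery and classifiers trained on ground-truth samples, not a single band and hand-picked class means.

```python
# Toy sketch of supervised classification (nearest-centroid on
# single-band pixel values) and post-classification comparison.
# Band values and class means are invented.
from collections import Counter

# Training means per land cover class (e.g., mean reflectance).
centroids = {"water": 10.0, "vegetation": 60.0, "built-up": 120.0}

def classify(pixel: float) -> str:
    """Assign the pixel to the class with the nearest centroid."""
    return min(centroids, key=lambda c: abs(centroids[c] - pixel))

image_t1 = [12.0, 55.0, 58.0, 115.0]   # earlier date
image_t2 = [12.0, 118.0, 58.0, 125.0]  # later date

map_t1 = [classify(p) for p in image_t1]
map_t2 = [classify(p) for p in image_t2]

# Post-classification comparison: count from->to transitions per pixel.
change = Counter(zip(map_t1, map_t2))
print(change[("vegetation", "built-up")])  # 1 pixel converted
```

The `change` counter is a miniature change-detection matrix: its diagonal entries are stable pixels and its off-diagonal entries are the land use/cover transitions the case studies report.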
Henry Peyret Presentation - Data Governance 2.0.
Based on the analysis of Digital Transformation and Values Transformation, Forrester gives its insight and orientations in terms of Data Governance 2.0 and Data Citizenship.
Netflix on Cloud - Combined Slides for Dev and Ops (Adrian Cockcroft)
This document contains slides from a presentation given by Adrian Cockcroft on Netflix's use of cloud computing on Amazon Web Services (AWS). The summary includes:
1) Netflix moved most of its infrastructure to AWS to leverage AWS's scale and features rather than building its own datacenters, as capacity growth was unpredictable and datacenters were inflexible.
2) Netflix uses many AWS services including EC2, S3, EBS, EMR and more. It deployed a large movie encoding farm on EC2, stores content on S3, uses EMR/Hadoop for log analysis, and a CDN for content delivery.
3) Netflix has learned that cloud tools don't always scale for large deployments.
Sopra Steria: Intelligent Network Analysis in a Telecommunications Environment (Neo4j)
The Intelligent Network Analyzer (INA) uses the Neo4j graph database to build a digital twin of the mobile telecommunications network. Based on this digital twin, INA can efficiently perform various analyses to support network operators in their daily business. In our talk, we will show some features of INA and explain how they draw on the particular strengths of the Neo4j graph database.
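The digital-twin idea can be sketched with a plain adjacency map: nodes are network elements, edges are dependencies, and a typical analysis asks which services are impacted if an element fails. The element names below are invented, and INA itself would express this as a Cypher query against Neo4j rather than a Python dict; this only shows the graph-traversal shape of the problem.

```python
# Toy impact analysis over a dependency graph (invented elements).
# A graph database answers the same question with a path query.
from collections import deque

# element -> elements that directly depend on it
depends_on_me = {
    "antenna-1": ["cell-A"],
    "cell-A": ["controller-X"],
    "controller-X": ["service-voice", "service-data"],
}

def impacted_by(failed: str) -> set:
    """Breadth-first walk collecting everything downstream of a failure."""
    seen, queue = set(), deque([failed])
    while queue:
        node = queue.popleft()
        for dependent in depends_on_me.get(node, []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

print(sorted(impacted_by("antenna-1")))
# ['cell-A', 'controller-X', 'service-data', 'service-voice']
```

This variable-depth traversal is exactly where graph databases outperform relational joins, which is the "particular strength" the talk refers to.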
How to Execute a Successful API Strategy (Matt McLarty)
This document discusses executing a successful API strategy through a programmatic approach. It begins by outlining the digital age and how APIs serve as digital enablers, allowing companies to transform digitally. It then discusses what an API program entails and how to execute one through establishing a digital strategy, aligning the organization and culture, evaluating technologies, and engaging the ecosystem. The document also outlines several API program workshops that can help organizations at different stages of an API program, from strategy to productization of APIs.
Event Monitoring: Use Powerful Insights to Improve Performance and Security (Dreamforce)
Salesforce runs its business on Salesforce, but it supports that business with log data, to the tune of three terabytes per day, per pod. Log data answers questions like who's logging in, who's downloading the customer list, which Visualforce pages are slowest, which API versions you should upgrade, who's adopting Salesforce and how, and much more. Event Monitoring, part of Salesforce Shield, is now generally available to provide insights into your org activity like never before. Join us to learn how you can use this built-in premium service to support your organization with powerful insights. Watch the video now: https://www.youtube.com/watch?v=QlESjd5aNDY
Applications of RS and GIS in Urban Planning by Rakshith M Murthy (s0l0m0n7)
This document discusses the application of remote sensing (RS) and geographical information systems (GIS) in urban planning. It explains that RS allows for the collection of spatial, spectral, and temporal data about areas in an accurate and cost-effective manner, while GIS stores and analyzes geographic data in layers. The document then provides several examples of how RS and GIS have been used in urban planning, including analyzing urban sprawl in Bengaluru, mapping land use changes in Mysuru over time, assessing water demand and supply in Nairobi, and monitoring archaeological sites for encroachment using satellite imagery. It concludes that RS and GIS are necessary technologies for urban planning authorities to respond efficiently to the issues faced by rapidly urbanizing areas.
Use of Remote Sensing for Land Cover Monitoring: SERVIR Science Applications (Kabir Uddin)
This document discusses land cover mapping using remote sensing. It provides background on land cover mapping and monitoring in the Himalayan region, where deforestation and forest degradation have been issues. Remote sensing using satellite imagery and tools like GIS allows accurate land cover mapping over large areas. The document discusses different remote sensing platforms and sensors, as well as image classification techniques including unsupervised, supervised and object-based classification. It provides examples of software used for object-based image analysis, and outlines the steps involved in land cover mapping projects using remote sensing.
Cloud Migration: Cloud Readiness Assessment Case Study (CAST)
Learn more about Cloud Migration: https://www.castsoftware.com/use-cases/cloud-readiness-and-migration
Review this case study of a CIO migrating applications to Microsoft Azure to see how a cloud readiness assessment helped identify obstacles preventing the organization from moving faster to Azure. Learn how to gain quick visibility through an objective assessment of your core applications' cloud readiness before you plan your cloud migration.
Keynote: Elastic Observability Evolution and Vision (Elasticsearch)
Elastic Observability helps drive mean time to resolution toward zero with end-to-end visibility in a single platform. Hear about the latest features and capabilities, and get a glimpse into the future.
This document summarizes an IBM presentation on emerging cloud migration approaches including AI planning, chatbots, and beyond. The presentation discusses using AI and machine learning to improve various phases of the cloud migration process, such as workload classification, selection, and planning. It also covers automating migration tasks and using conversational interfaces like chatbots to assist throughout the lifecycle. The document provides examples of how these approaches have helped customers successfully migrate to the cloud.
The trial period is over - Microservices adoption gains momentum (Shahir Daya)
IBM Think 2019 session: Everybody in technology is talking about microservices these days. Microservices aren't the answer to all of your problems. However, they do address many of the pain points we are seeing with applications developed only a few years ago. Where and how we start a Microservices Transformation can be confusing. Come to this session to learn what the Microservices architectural style is really about, how to ensure your next Microservices Transformation is successful, and what benefits you can realistically expect from your efforts.
Introduction to Modern Data Virtualization (US)Denodo
Watch full webinar here: https://bit.ly/3uyvxN5
“Through 2022, 60% of all organizations will implement data virtualization as one key delivery style in their data integration architecture," according to Gartner. What is data virtualization and why is its adoption growing so quickly? Modern data virtualization accelerates that time to insights and data services without copying or moving data.
Watch this webinar to learn:
- Why organizations across the world are adopting data virtualization
- What is modern data virtualization
- How data virtualization works and how it compares to alternative approaches to data integration and management
- How modern data virtualization can significantly increase agility while reducing costs
- How to easily get started with Denodo Standard 8.0
Learn About The Basic of Salesforce.com (force.com) which is the worlds' first and most popular CRM system headquartered in San Francisco.
By : Vijay maurya
Student At Baddi University of emerging science and technology ,solan
For more info contact me:-
email: vijaymaurya3167@gmail.com
IG: @vijaymaurya_ (follow me)
Fb.com/vijay.maurya2
New Dynamics 365 Implementation Guide - Available for downloadDynamics Square
Almost 700 pages of insights and guidance around strategy, initiation, implementation, preparation and operation, Contact us for Dynamics 365 Implementation: https://www.dynamicssquare.com.au/dynamics-365/
Big Data Fabric: A Recipe for Big Data InitiativesDenodo
Big data fabric combines essential big data capabilities in a single platform to automate the many facets of data discovery, preparation, curation, orchestration, and integration across a multitude of data sources. Attend this session to learn how Big Data Fabric enabled by data virtualization constitutes a recipe for:
• Enabling new actionable insights with minimal effort
• Securing big data end-to-end
• Addressing big data skillset scarcity
• Providing easy access to data without having to decipher various data formats
Agenda:
• Big Data with Data Virtualization
• Product Demonstration
• Summary & Next Steps
• Q&A
Watch webinar on demand here: https://goo.gl/EpmIBx
This webinar is part of the Data Virtualization Packed Lunch Webinar Series: https://goo.gl/W1BeCb
Global mapper tutorial Jimma University Ethiopiachala hailu
This document discusses using Global Mapper software to delineate watersheds and calculate peak runoff for flood analysis. It provides an overview of common hydrological analysis methods for estimating maximum flood levels, and describes using the SCS unit hydrograph method within Global Mapper. It also outlines the basic steps and tools in Global Mapper for watershed delineation and peak runoff calculation.
Survive the fog of system development! Developers' lives have gotten more complex in the last decade. There is too much to learn and understand now, and you need a co-pilot. Let AIOps be that co-pilot.
In this webinar, we'll share use cases and discuss:
What is AIOps?
Why AI and ML are well-suited for Ops and DevOps
A guide for assessing where to automate
The document provides an overview of SAP Cloud Platform, including key use cases for integrating apps and data, extending existing cloud and on-premise apps, and building new cloud apps. It also discusses connecting people and data. Customer stories demonstrate how companies are using SAP Cloud Platform for integration, innovation, Internet of Things applications, and digital experiences. Architectural blueprints illustrate potential implementations involving SAP and non-SAP systems and applications.
This document provides a training guide for Databricks partners. It outlines Databricks' partner enablement program, which includes self-paced and instructor-led training courses to help partners make the most of their partnership. The courses cover topics like the Databricks platform, data engineering, machine learning, and industry solutions. Partners can earn badges by completing courses. The guide also discusses partner certification programs, champions programs for top technical experts, and sales training to prepare partners' customer-facing roles.
Cloud offers organizations the opportunity to run their workloads on physical machines at a reduced cost, with better overall performance and enhanced security. Yet engaging in a partial or total migration to cloud requires a solid, holistic strategy that focuses on technical and management challenges that will likely arise, and an organizational mindset that will help ensure the migration’s success. This introductory, vendor agnostic talk will highlight the technical, management and cultural considerations that every cloud migration strategy should consider, how to address some common challenges, and best practices to help guide the process.
This document summarizes three case studies that used remote sensing and GIS techniques to analyze land use and land cover change over time. The first case study analyzed changes from 1990-2010 in Hawalbagh, India using Landsat imagery. It found increases in built-up land and decreases in barren land. The second studied coastal Egypt from 1987-2001 using Landsat, identifying 8 land cover classes. The third examined Simly watershed, Pakistan from 1992-2012 using Landsat and SPOT data, finding increases in agriculture and decreases in vegetation. All three used supervised classification and post-classification comparison to analyze land use/cover changes.
Henry Peyret Presentation - Data Governance 2.0.
Based on the analysis of Digital Transformation and Values Transformation, Forrester gives its insight and orientations in terms of Data Governance 2.0 and Data Citizenship.
Netflix on Cloud - combined slides for Dev and OpsAdrian Cockcroft
This document contains slides from a presentation given by Adrian Cockcroft on Netflix's use of cloud computing on Amazon Web Services (AWS). The summary includes:
1) Netflix moved most of its infrastructure to AWS to leverage AWS's scale and features rather than building its own datacenters, as capacity growth was unpredictable and datacenters were inflexible.
2) Netflix uses many AWS services including EC2, S3, EBS, EMR and more. It deployed a large movie encoding farm on EC2, stores content on S3, uses EMR/Hadoop for log analysis, and a CDN for content delivery.
3) Netflix has learned that cloud tools don't always scale for large
Sopra Steria: Intelligent Network Analysis in a Telecommunications EnvironmentNeo4j
The Intelligent Network Analyzer (INA) uses the graph database by Neo4j to build a digital twin of the mobile telecommunications network. Based on this digital twin, INA can be used to efficiently perform various analyses to support network operators in their daily business. In our talk, we will show some features of INA and explain how they draw on the particular strengths of the Neo4j graph database.
How to Execute a Successful API StrategyMatt McLarty
This document discusses executing a successful API strategy through a programmatic approach. It begins by outlining the digital age and how APIs serve as digital enablers, allowing companies to transform digitally. It then discusses what an API program entails and how to execute one through establishing a digital strategy, aligning the organization and culture, evaluating technologies, and engaging the ecosystem. The document also outlines several API program workshops that can help organizations at different stages of an API program, from strategy to productization of APIs.
Event Monitoring: Use Powerful Insights to Improve Performance and SecurityDreamforce
Salesforce runs its business on Salesforce, but it supports its business using log data to the tune of three terabytes of data per day, per pod. Log data answers questions like who's logging in, who's downloading the customer list, which Visualforce pages are the slowest, which API versions you should upgrade, who's adopting Salesforce and how, and much, much more. Event Monitoring - part of Salesforce Shield is now generally available to provide insights into your org activity like never before. Join us to learn how you can use this built-in premium service to support your organization with powerful insights. Watch the video now: https://www.youtube.com/watch?v=QlESjd5aNDY
Applications of RS and GIS in Urban Planning by Rakshith m murthys0l0m0n7
This document discusses the application of remote sensing (RS) and geographical information systems (GIS) in urban planning. It explains that RS allows for the collection of spatial, spectral and temporal data about areas in an accurate and cost-effective manner, while GIS stores and analyzes geographic data in layers. The document then provides several examples of how RS and GIS have been used in urban planning, including analyzing urban sprawl in Bengaluru, mapping land use changes in Mysuru over time, assessing water demand and supply in Nairobi, and monitoring archaeological sites for encroachment using satellite imagery. It concludes that RS and GIS are necessary technologies for urban planning authorities to efficiently respond to issues faced by rapidly urbanizing
Use of remote sensing for land cover monitoring servir science applicationsKabir Uddin
This document discusses land cover mapping using remote sensing. It provides background on land cover mapping and monitoring in the Himalayan region, where deforestation and forest degradation have been issues. Remote sensing using satellite imagery and tools like GIS allows accurate land cover mapping over large areas. The document discusses different remote sensing platforms and sensors, as well as image classification techniques including unsupervised, supervised and object-based classification. It provides examples of software used for object-based image analysis, and outlines the steps involved in land cover mapping projects using remote sensing.
Cloud Migration: Cloud Readiness Assessment Case StudyCAST
Learn more about Cloud Migration: https://www.castsoftware.com/use-cases/cloud-readiness-and-migration
Review this case study of a CIO migrating applications to Microsoft Azure to see how a cloud readiness assessment help to identify obstacles preventing the organization from moving faster to Azure. Learn how to gain quick visibility through an objective assessment of your core application's cloud readiness, before you plan your cloud migration.
Learn more about Cloud Migration: https://www.castsoftware.com/use-cases/cloud-readiness-and-migration
Keynote: Elastic Observability evolution and visionElasticsearch
Elastic Observability helps drive mean time to resolution toward zero with end-to-end visibility in a single platform. Hear about the latest features and capabilities, and get a glimpse into the future.
This document summarizes an IBM presentation on emerging cloud migration approaches including AI planning, chatbots, and beyond. The presentation discusses using AI and machine learning to improve various phases of the cloud migration process, such as workload classification, selection, and planning. It also covers automating migration tasks and using conversational interfaces like chatbots to assist throughout the lifecycle. The document provides examples of how these approaches have helped customers successfully migrate to the cloud.
The trial period is over - Microservices adoption gains momentum - Shahir Daya
IBM Think 2019 session: Everybody in technology is talking about microservices these days. Microservices aren't the answer to all of your problems. However, they do address many of the pain points we are seeing with applications developed only a few years ago. Where and how we start a Microservices Transformation can be confusing. Come to this session to learn what the Microservices architectural style is really about, how to ensure your next Microservices Transformation is successful, and what benefits you can realistically expect from your efforts.
Client Deployment of IBM Cloud Private (IBM #Think2019 #5964) - Michael Elder
As you plan for the adoption of Kubernetes in your datacenter, you’ll face several common questions. How much capacity will your clusters need? How should you manage the network security of the cluster? How do you expose services on the cluster to your existing network fabric? What are the tradeoffs to consider between different storage providers? What should you do for backup and disaster recovery scenarios? In this session, we’ll review several examples of client deployment architectures that will help you get started on your journey to a hybrid, multicloud architecture for your apps!
Why Domino is still the best platform for Rapid Application Development! - Tony Ollivier
The document discusses the advantages of using IBM Domino for RAD (Rapid Application Development) needs. It addresses common objections to Domino by highlighting recent improvements like better performance, larger databases, new programming models, and mobile support. Domino's all-in-one architecture allows for faster development and deployment of applications compared to other platforms. The value of Domino increases as more applications and users are added due to its integrated features like database, security, email and replication capabilities in a single package.
Using GitHub and Visual Studio Code for Mainframe Development - DevOps.com
Developers can now use these popular, dev-friendly tools with mainframe applications. Join this session to learn how to use GitHub and VS Code with mainframe-native code and languages like COBOL. For developers already familiar with these tools, mainframe development becomes more like other platforms. For mainframe developers new to these tools, combining their productivity and collaboration benefits with access to a broad array of DevOps tools opens a world of possibilities.
The presenters will demonstrate GitHub with the Git bridge to CA Endevor, the dominant mainframe-native SCM, allowing next-generation developers to work alongside their peers who use traditional tools. The Zowe open source extension for Visual Studio Code, which enables additional interactions with the mainframe without ever seeing a green screen, will also be demonstrated.
Client Deployment of IBM Cloud Private (Think 2019 Session 5964A) - Yong Feng
This document provides guidance on planning and designing IBM Cloud Private deployments. It discusses key architecture decisions around high availability, workloads, security and more. It also covers topics like network topology, storage options, infrastructure providers, configuration of management services, and examples for large scale, multi-tenant and air-gapped environments. The goal is to help customers successfully plan their specific IBM Cloud Private architecture based on their requirements.
IBM Cloud Private and IBM Power Systems: Overview and Real-World Scenarios - Joe Cropper
The document provides an overview of IBM Cloud Private and IBM Power Systems. It discusses how they address customer needs for hybrid cloud, AI/ML, and modernization. It reviews the technical components of IBM Cloud Private including containers, orchestration, and provisioning. Real-world customer scenarios are presented, including one for a financial services company using IBM Cloud Private across x86, Power, and Z architectures. The document concludes by discussing packaging and purchasing options for IBM Cloud Private.
Continuous integration and deployment has become an increasingly standard and common practice in software development. However, doing this for machine learning models and applications introduces many challenges. Not only do we need to account for standard code quality and integration testing, but how do we best account for changes in model performance metrics coming from changes to code, deployment framework or mechanism, pre- and post-processing steps, changes in data, not to mention the core deep learning model itself?
In addition, deep learning presents particular challenges:
* model sizes are often extremely large and take significant time and resources to train
* models are often more difficult to understand and interpret, making it harder to debug issues
* inputs to deep learning are often very different from the tabular data involved in most ‘traditional machine learning’ models
* model formats, frameworks and the state-of-the art models and architectures themselves are changing extremely rapidly
* usually many disparate tools are combined to create the full end-to-end pipeline for training and deployment, making it trickier to plug together these components and track down issues.
We also need to take into account the impact of changes on wider aspects such as model bias, fairness, robustness and explainability. And we need to track all of this over time and in a standard, repeatable manner. This talk explores best practices for handling these myriad challenges to create a standardized, automated, repeatable pipeline for continuous deployment of deep learning models and pipelines. I will illustrate this through the work we are undertaking within the free and open-source IBM Model Asset eXchange.
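One concrete way to guard against silent regressions in model performance metrics, as the talk describes, is a quality gate in the CI pipeline that compares a candidate model's metrics against the currently deployed baseline. The sketch below is a minimal illustration of that idea; the metric names, baseline values, and tolerance are hypothetical, and a real pipeline would load these from the evaluation stage's output rather than hard-coding them.

```python
# Minimal sketch of a model-quality gate for a CI pipeline.
# Metric names, baseline values, and the tolerance are hypothetical.

BASELINE = {"accuracy": 0.91, "f1": 0.88}   # metrics of the deployed model
TOLERANCE = 0.01                             # allowed regression per metric

def check_regression(candidate: dict, baseline: dict = BASELINE,
                     tolerance: float = TOLERANCE) -> list:
    """Return the list of metrics that regressed beyond the tolerance."""
    failures = []
    for name, base_value in baseline.items():
        cand_value = candidate.get(name)
        # A missing metric or a drop larger than the tolerance fails the gate.
        if cand_value is None or base_value - cand_value > tolerance:
            failures.append(name)
    return failures

# A CI step would fail the build when any metric regresses:
failed = check_regression({"accuracy": 0.92, "f1": 0.86})
print(failed)  # ['f1']: f1 dropped by 0.02, more than the 0.01 tolerance
```

The same gate pattern extends to the wider aspects the talk mentions (bias, fairness, robustness scores) by adding those measurements to the baseline dictionary.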
Practical thoughts for cloud transformation - Mark Osborn
The document discusses cloud transformation and provides the following key points:
- Most enterprises now use multiple cloud environments and providers.
- Only 20% of enterprise workloads have moved to the cloud to date.
- There are three phases of cloud technology adoption: public cloud, hybrid cloud, and multi-cloud.
- Cloud transformation requires multiple concurrent approaches to minimize risk while leveraging new and existing investments.
Kubernetes for Developers - 7 lessons learned from 7 data centers in 7 months... - Michael Tougeron
Mike Tougeron from Adobe presented lessons learned from implementing Kubernetes across 7 data centers over 7 months. Some key lessons included the importance of communication, defining responsibilities, and training for abstracting Kubernetes resources. Additional lessons focused on code pipelines to production, ensuring applications work well on Kubernetes through metrics and monitoring, and performing cluster upgrades carefully. Managing applications and infrastructure across multiple clouds also presented challenges that were addressed.
This document discusses IBM's strategy and progress in attracting developers to use Db2. It outlines IBM's objectives to increase Db2 adoption, modernize the developer experience, and expand the ecosystem. It also summarizes initial work efforts such as improving documentation, examples, and community resources. Next steps are identified like answering more Stack Overflow questions, creating additional guides and videos, and expanding Db2 availability on cloud platforms.
Notebook-based AI Pipelines with Elyra and Kubeflow - Nick Pentreath
A typical machine learning pipeline begins as a series of preprocessing steps followed by experimentation, optimization and model-tuning, and, finally deployment. Jupyter notebooks have become a hugely popular tool for data scientists and other machine learning practitioners to explore and experiment as part of this workflow, due to the flexibility and interactivity they provide. However, with notebooks it is often a challenge to move from the experimentation phase to creating a robust, modular and production-grade end-to-end AI pipeline.
Elyra is a set of open-source, AI centric extensions to JupyterLab. Elyra provides a visual editor for building notebook-based pipelines that simplifies the conversion of multiple notebooks into batch jobs or workflows. These workflows can be executed both locally (during the experimentation phase) and on Kubernetes via Kubeflow Pipelines for production deployment. In this way, Elyra combines the flexibility and ease-of-use of notebooks and JupyterLab, with the production-grade qualities of Kubeflow (and in future potentially other Kubernetes-based orchestration platforms).
In this talk I introduce Elyra and its capabilities, then give a deep dive of Elyra's pipeline editor and the underlying pipeline execution mechanics, showing a demo of using Elyra to construct an end-to-end analytics and machine learning pipeline. I will also explore how to integrate and scale out model-tuning as well as deployment via Kubeflow Serving.
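The core idea behind Elyra's pipeline editor, running each notebook node once its upstream dependencies complete, can be illustrated with a toy dependency-ordered runner. This is not Elyra's or Kubeflow's actual API; the step names and functions below are made up purely to show the pipeline-as-DAG execution model.

```python
# Toy illustration of dependency-ordered pipeline execution, the model
# behind Elyra's visual editor. NOT Elyra's or Kubeflow's API; all step
# names and functions here are hypothetical.

from graphlib import TopologicalSorter

def load(state):       state["rows"] = [1, 2, 3, 4]
def preprocess(state): state["rows"] = [r * 2 for r in state["rows"]]
def train(state):      state["model"] = sum(state["rows"]) / len(state["rows"])

# Pipeline graph: node -> set of upstream dependencies
pipeline = {"load": set(), "preprocess": {"load"}, "train": {"preprocess"}}
steps = {"load": load, "preprocess": preprocess, "train": train}

# Run nodes in an order that respects the dependency edges.
state = {}
for node in TopologicalSorter(pipeline).static_order():
    steps[node](state)

print(state["model"])  # 5.0, the mean of the doubled rows
```

In Elyra each node is a notebook (or script) rather than a function, and on Kubernetes the nodes become containerized Kubeflow Pipelines tasks, but the ordering logic is the same.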
The document provides an overview of a presentation on IBM Connections Customizer given in Berlin, Germany on April 26-27, 2018. The summary includes:
- IBM Connections Customizer allows modifying the IBM Connections user experience by acting as a proxy between requests and responses and injecting custom JavaScript, CSS, and other items.
- The presentation covered Customizer capabilities like request routing, the app registry for managing extensions, and demoed samples for UI changes, API calls, and CSS customizations.
- Examples of customer and community apps that use Customizer were shown, and tips for managing Customizer in production environments were provided, including payload filtering and caching policies.
From Data to AI - Silicon Valley Open Source projects come to you - Madrid me... - Luciano Resende
The IBM Center for Open Source, Data and AI Technology "CODAIT" (https://developer.ibm.com/code/open/centers/codait/) works on multiple open-source Data and AI projects. In this session we introduce these projects, covering Jupyter Notebooks, reusable model and data assets, and Trusted AI, among others.
Building Notebook-based AI Pipelines with Elyra and Kubeflow - Databricks
This document contains multiple articles on the topics of BIM, including:
1. The difference between Building Information Modelling (BIM) and Partial Building Information Modelling, where BIM requires data from multiple disciplines and partial BIM is within a single discipline.
2. BIM data sharing methodologies like data exchange, data harmony, data unity, and data integration that allow sharing of model data between different software solutions with varying levels of data loss.
3. The importance of APIs for allowing communication between programs and using tools from one program in another program to help automate tasks without extensive new coding.
What is new in IBM Connections 5.5 and IBM Docs 2.0 - Luis Benitez
This deck covers the highlights of the new capabilities introduced in IBM Docs 2.0 and IBM Connections 5.5, released in December 2015.
To learn more, go to http://ibm.com/social
Follow me:
Twitter: http://twitter.com/lbenitez
LinkedIn: http://pr.linkedin.com/in/luisbenitez
My Blog: http://www.lbenitez.com
.NET and Kubernetes: Bringing Legacy .NET Into the Modern World with Pivotal ... - VMware Tanzu
SpringOne Platform 2019
.NET and Kubernetes: Bringing Legacy .NET Into the Modern World with Pivotal Container Services
Speakers: David Dieruf, Product Marketing Manager, Pivotal and Christopher Umbel, .NET AppTx Practice Lead, Pivotal
YouTube: https://youtu.be/nw6gI67l8GA
Learntek is a global online training provider for Big Data Analytics, Hadoop, Machine Learning, Deep Learning, IoT, AI, Cloud Technology, DevOps, Digital Marketing, and other IT and management courses.
Similar to A Toolchain for Lean Architecture at American Airlines (20)
The Key to Digital Success_ A Comprehensive Guide to Continuous Testing Integ... - kalichargn70th171
In today's business landscape, digital integration is ubiquitous, demanding swift innovation as a necessity rather than a luxury. In a fiercely competitive market with heightened customer expectations, the timely launch of flawless digital products is crucial for both acquisition and retention—any delay risks ceding market share to competitors.
What to do when you have a perfect model for your software but you are constrained by an imperfect business model?
This talk explores the challenges of bringing modelling rigour to the business and strategy levels, and talking to your non-technical counterparts in the process.
E-commerce Development Services - Hornet Dynamics
For any business hoping to succeed in the digital age, having a strong online presence is crucial. We offer Ecommerce Development Services that are customized according to your business requirements and client preferences, enabling you to create a dynamic, safe, and user-friendly online store.
Mobile App Development Company In Noida - Drona Infotech
Drona Infotech is a premier mobile app development company in Noida, providing cutting-edge solutions for businesses.
Visit Us For : https://www.dronainfotech.com/mobile-application-development/
UI5con 2024 - Bring Your Own Design System - Peter Muessig
How do you combine the OpenUI5/SAPUI5 programming model with a design system that makes its controls available as Web Components? Since OpenUI5/SAPUI5 1.120, the framework supports the integration of any Web Components. This makes it possible, for example, to natively embed your design system's own Web Components created with Stencil. The integration embeds the Web Components in a way that they can be used naturally in XMLViews, like standard UI5 controls, and can be bound with data binding. Learn how you can use the Web Components base class in OpenUI5/SAPUI5 to integrate your own Web Components, and get inspired by the solution to generate a custom UI5 library that provides control wrappers for the native Web Components.
Flutter is a popular open source, cross-platform framework developed by Google. In this webinar we'll explore Flutter and its architecture, delve into the Flutter Embedder and Flutter’s Dart language, discover how to leverage Flutter for embedded device development, learn about Automotive Grade Linux (AGL) and its consortium and understand the rationale behind AGL's choice of Flutter for next-gen IVI systems. Don’t miss this opportunity to discover whether Flutter is right for your project.
Preparing Non-Technical Founders for Engaging a Tech Agency - ISH Technologies
Preparing non-technical founders before engaging a tech agency is crucial for the success of their projects. It starts with clearly defining their vision and goals, conducting thorough market research, and gaining a basic understanding of relevant technologies. Setting realistic expectations and preparing a detailed project brief are essential steps. Founders should select a tech agency with a proven track record and establish clear communication channels. Additionally, addressing legal and contractual considerations and planning for post-launch support are vital to ensure a smooth and successful collaboration. This preparation empowers non-technical founders to effectively communicate their needs and work seamlessly with their chosen tech agency. Visit our site for more details, or contact us today: www.ishtechnologies.com.au
Malibou Pitch Deck For Its €3M Seed Round - sjcobrien
French start-up Malibou raised a €3 million Seed Round to develop its payroll and human resources management platform for VSEs and SMEs. The financing round was led by investors Breega, Y Combinator, and FCVC.
Consistent toolbox talks are critical for maintaining workplace safety, as they provide regular opportunities to address specific hazards and reinforce safe practices.
These brief, focused sessions ensure that safety is a continual conversation rather than a one-time event, which helps keep safety protocols fresh in employees' minds. Studies have shown that shorter, more frequent training sessions are more effective for retention and behavior change compared to longer, infrequent sessions.
By engaging workers regularly, toolbox talks promote a culture of safety, empower employees to voice concerns, and ultimately reduce the likelihood of accidents and injuries on site.
The traditional method of conducting safety talks with paper documents and lengthy meetings is not only time-consuming but also less effective. Manual tracking of attendance and compliance is prone to errors and inconsistencies, leading to gaps in safety communication and potential non-compliance with OSHA regulations. Switching to a digital solution like Safelyio offers significant advantages.
Safelyio automates the delivery and documentation of safety talks, ensuring consistency and accessibility. The microlearning approach breaks down complex safety protocols into manageable, bite-sized pieces, making it easier for employees to absorb and retain information.
This method minimizes disruptions to work schedules, eliminates the hassle of paperwork, and ensures that all safety communications are tracked and recorded accurately. Ultimately, using a digital platform like Safelyio enhances engagement, compliance, and overall safety performance on site. https://safelyio.com/