Operationalizing Machine Learning—Managing Provenance from Raw Data to Predictions (Databricks)
Our team at Comcast is challenged with operationalizing predictive ML models to improve customer experience. Our goal is to eliminate bottlenecks in the process from model inception to deployment and monitoring.
Traditionally, CI/CD manages code and infrastructure artifacts such as container definitions. We want to extend it to support granular traceability of ML models from use case to feature/attribute selection, development of versioned datasets, model training code, model evaluation artifacts, model prediction deployment containers, and the sinks to which predictions and outcomes are persisted. Our framework stack lets us track models from use case to deployment, manage and evaluate multiple models simultaneously in live-but-dark mode, and continuously monitor production models against real-world outcomes using configurable policies.
The technologies/components which drive this vision are:
1. FeatureStore – Enables data scientists to reuse versioned features and review feature metrics per model. Self-service capabilities allow all teams to onboard their event data into the feature store.
2. ModelRepository – Manages metadata about models, including pre-processing parameters (e.g., scaling parameters for features), mappings to the features needed to execute the model, model discovery mechanisms, etc.
3. Spark on Alluxio – Alluxio provides a universal data plane on top of various under-stores (e.g., S3, HDFS, RDBMS). Apache Spark, with its Data Sources API, provides a unified query language that data scientists use to consume features and create training/validation/test datasets, which are versioned and integrated into the full model pipeline using Ground-Context, discussed next.
4. Ground-Context – This open-source, vendor-neutral data context service enables full traceability across use cases, models, features, model-to-feature mappings, versioned datasets, the model training codebase, model deployment containers, and prediction/outcome sinks. It integrates with the FeatureStore, the container repository, and Git to tie data, code, and runtime artifacts together for CI/CD.
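The traceability chain described above (use case → features → versioned dataset → model) can be pictured as a lineage graph of nodes and edges, which is the general model behind data context services like Ground. The following is a minimal sketch of that idea; the class, node IDs, and edge labels are all illustrative, not Ground's actual API.

```python
# Illustrative lineage graph: nodes are ML artifacts, edges record how
# one artifact was derived from another. Not Ground's real API.
from dataclasses import dataclass, field

@dataclass
class LineageGraph:
    nodes: dict = field(default_factory=dict)   # id -> metadata
    edges: list = field(default_factory=list)   # (from_id, to_id, label)

    def add_node(self, node_id, **metadata):
        self.nodes[node_id] = metadata

    def link(self, src, dst, label):
        self.edges.append((src, dst, label))

    def upstream(self, node_id):
        """Walk edges backwards to find everything a node depends on."""
        deps, frontier = set(), {node_id}
        while frontier:
            frontier = {s for (s, d, _) in self.edges
                        if d in frontier and s not in deps}
            deps |= frontier
        return deps

g = LineageGraph()
g.add_node("use-case:churn", owner="cx-team")
g.add_node("features:v3", store="feature-store")
g.add_node("dataset:train-2019-04", version="v12")
g.add_node("model:churn-xgb", git_sha="abc123")
g.link("use-case:churn", "features:v3", "selects")
g.link("features:v3", "dataset:train-2019-04", "materializes")
g.link("dataset:train-2019-04", "model:churn-xgb", "trains")

# Full provenance of a deployed model: its dataset, features, and use case.
print(g.upstream("model:churn-xgb"))
```

The same walk in the other direction would answer the monitoring question the abstract raises: which downstream prediction sinks are affected when a feature changes.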
This session will demonstrate a new approach to LIMS implementations that eliminates the complexity, excessive customization, and lengthy validation requirements inherent in legacy LIMS, offering fast, “out-of-the-box” deployment, no custom coding, easy integration into existing software platforms, and enterprise-wide data management capabilities.
Each Accelrys LIMS application comes with Workflow Editors that eliminate traditional software custom-coding processes, enabling your own internal system administrator to deploy needed applications, workflows and procedures using a simple drag-and-drop process and dialog interface.
When finished with the workflow editing, a single mouse click generates a complete validation document for the application, workflow or procedure created. Built-in compliance at the “core technology” level turns qualification/validation into a simple, fast document review with no need for external validation consultants, even in regulated environments.
MLOps and Data Quality: Deploying Reliable ML Models in Production (Provectus)
Looking to build a robust machine learning infrastructure to streamline MLOps? Learn from Provectus experts how to ensure the success of your MLOps initiative by implementing Data QA components in your ML infrastructure.
For most organizations, the development of multiple machine learning models, their deployment and maintenance in production are relatively new tasks. Join Provectus as we explain how to build an end-to-end infrastructure for machine learning, with a focus on data quality and metadata management, to standardize and streamline machine learning life cycle management (MLOps).
Agenda
- Data Quality and why it matters
- Challenges and solutions of Data Testing
- Challenges and solutions of Model Testing
- MLOps pipelines and why they matter
- How to expand validation pipelines for Data Quality
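The agenda item on expanding validation pipelines for data quality can be made concrete with a small sketch: a batch of records passes through named checks before the pipeline is allowed to proceed. The checks and thresholds below are illustrative assumptions; production stacks typically use a dedicated tool such as Great Expectations for this role.

```python
# Hedged sketch of a data-quality gate run before a training or
# deployment step. Check names and thresholds are illustrative.
def run_checks(rows, checks):
    """Apply each named check to the batch; return the names that failed."""
    return [name for name, check in checks.items() if not check(rows)]

checks = {
    "non_empty": lambda rows: len(rows) > 0,
    "no_null_ids": lambda rows: all(r.get("id") is not None for r in rows),
    "amount_in_range": lambda rows: all(0 <= r["amount"] < 1e6 for r in rows),
}

batch = [{"id": 1, "amount": 250.0}, {"id": 2, "amount": 17.5}]
failures = run_checks(batch, checks)
if failures:
    # Failing the gate stops the pipeline instead of training on bad data.
    raise ValueError(f"Data quality gate failed: {failures}")
```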
Measure, Metrics, Indicators, Metrics of Process Improvement, Statistical Software Process Improvement, Metrics of Project Management, Metrics of the Software Product, 12 Steps to Useful Software Metrics
The Functional Mockup Interface: FMI overview
Modelica: a very brief overview
A Real-World Example: Active Grill Shutter Controls
Vehicle Thermal Management with Modelica
Continuous Validation of System Requirements
- Intermediate results from ITEA3 MODRIO project
Iterative Controller Development Using Modelica
Conclusions
Guiding through a typical Machine Learning Pipeline (Michael Gerke)
Many people are talking about AI and machine learning. Here is a quick guide to managing ML projects and what to consider when implementing machine-learning use cases.
Citrix Systems Inc. has an IT landscape consisting of diversified technologies that include SAP, SAP Ariba, Concur, and Workday solutions. Learn how Citrix put in place a test automation strategy to achieve end-to-end quality and validation of complex business processes that span multiple SAP technology solutions. Citrix HR, procurement, and travel and expenses business processes, which require integration between a hosted SAP solution and cloud-based Workday, SAP Ariba, and Concur platforms, will be explained.
Semantic Validation: Enforcing Kafka Data Quality Through Schema-Driven Verification (Hosted by Confluent)
Incorrect data produced into Kafka can be a poison pill that has the potential to disrupt businesses built upon Kafka. The “Semantic Validation” feature is designed to address the challenges posed by incorrect or unexpected data in Kafka’s data processing pipelines, with the goal of mitigating such disruptions. By allowing users to define robust field constraints directly within schemas, such as Avro, we aim to enhance data quality and minimize the downstream impacts of inaccurate data in Kafka.
Furthermore, this feature can be expanded to include offline data processing, in addition to Kafka and Flink real-time processing. By combining real-time processing, batch analytics, and AI data pipelines, a global semantic validation system can be built.
In our upcoming talk, we will delve into the use cases of this feature, discuss its architecture, provide examples of defining rules, and explain how we enforce these rules. Ultimately, we will demonstrate how this feature can significantly enhance reliability and trustworthiness in Uber’s data processing pipelines.
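The core idea of the abstract, field-level constraints attached to a schema and enforced before a record reaches Kafka, can be sketched as follows. The constraint syntax, field names, and enforcement point here are illustrative assumptions, not the actual Confluent/Uber feature.

```python
# Sketch: an Avro-style record schema with per-field constraints,
# checked at produce time. Everything here is illustrative.
import re

schema = {
    "name": "Trip",
    "fields": [
        {"name": "trip_id", "type": "string",
         "constraint": lambda v: re.fullmatch(r"[0-9a-f]{8}", v)},
        {"name": "distance_km", "type": "double",
         "constraint": lambda v: 0 <= v <= 1000},
    ],
}

def validate(record, schema):
    """Return the names of fields that are missing or violate their constraint."""
    errors = []
    for f in schema["fields"]:
        value = record.get(f["name"])
        if value is None or not f["constraint"](value):
            errors.append(f["name"])
    return errors

# A well-formed record passes; a producer would send it only if errors == [].
assert validate({"trip_id": "deadbeef", "distance_km": 12.3}, schema) == []
```

Rejecting (or dead-lettering) records at this point is what keeps the poison pill out of downstream consumers.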
Tutorial for Machine Learning 101 (an all-day tutorial at Strata + Hadoop World, New York City, 2015)
The course is designed to introduce machine learning via real applications, such as building a recommender and doing image analysis using deep learning.
In this talk we cover deployment of machine learning models.
Experiences from SpareBank 1 using synthetic test data from Tenor for API test automation.
Presented at a seminar organized by Skatteetaten, 27 April 2023.
Into the Box Keynote Day 2: Unveiling amazing updates and announcements for modern CFML developers! Get ready for exciting releases and updates on Ortus tools and products. Stay tuned for cutting-edge innovations designed to boost your productivity.
Check out the webinar slides to learn more about how XfilesPro transforms Salesforce document management by leveraging its world-class applications. For more details, please connect with sales@xfilespro.com
If you want to watch the on-demand webinar, please click here: https://www.xfilespro.com/webinars/salesforce-document-management-2-0-smarter-faster-better/
A Comprehensive Look at Generative AI in Retail App Testing (kalichargn70th171)
Traditional software testing methods are being challenged in retail, where customer expectations and technological advancements continually shape the landscape. Enter generative AI—a transformative subset of artificial intelligence technologies poised to revolutionize software testing.
Software Engineering, Software Consulting, Tech Lead.
Spring Boot, Spring Cloud, Spring Core, Spring JDBC, Spring Security,
Spring Transaction, Spring MVC,
Log4j, REST/SOAP WEB-SERVICES.
Exploring Innovations in Data Repository Solutions - Insights from the U.S. Geological Survey (Globus)
The U.S. Geological Survey (USGS) has made substantial investments in meeting evolving scientific, technical, and policy driven demands on storing, managing, and delivering data. As these demands continue to grow in complexity and scale, the USGS must continue to explore innovative solutions to improve its management, curation, sharing, delivering, and preservation approaches for large-scale research data. Supporting these needs, the USGS has partnered with the University of Chicago-Globus to research and develop advanced repository components and workflows leveraging its current investment in Globus. The primary outcome of this partnership includes the development of a prototype enterprise repository, driven by USGS Data Release requirements, through exploration and implementation of the entire suite of the Globus platform offerings, including Globus Flow, Globus Auth, Globus Transfer, and Globus Search. This presentation will provide insights into this research partnership, introduce the unique requirements and challenges being addressed and provide relevant project progress.
Custom Healthcare Software for Managing Chronic Conditions and Remote Patient Monitoring (Mind IT Systems)
Healthcare providers often struggle with the complexities of chronic conditions and remote patient monitoring, as each patient requires personalized care and ongoing monitoring. Off-the-shelf solutions may not meet these diverse needs, leading to inefficiencies and gaps in care. This is where custom healthcare software offers a tailored solution, ensuring improved care and effectiveness.
Large Language Models and the End of Programming (Matt Welsh)
Talk by Matt Welsh at Craft Conference 2024 on the impact that Large Language Models will have on the future of software development. In this talk, I discuss the ways in which LLMs will impact the software industry, from replacing human software developers with AI, to replacing conventional software with models that perform reasoning, computation, and problem-solving.
Prosigns: Transforming Business with Tailored Technology Solutions (Prosigns)
Unlocking Business Potential: Tailored Technology Solutions by Prosigns
Discover how Prosigns, a leading technology solutions provider, partners with businesses to drive innovation and success. Our presentation showcases our comprehensive range of services, including custom software development, web and mobile app development, AI & ML solutions, blockchain integration, DevOps services, and Microsoft Dynamics 365 support.
Custom Software Development: Prosigns specializes in creating bespoke software solutions that cater to your unique business needs. Our team of experts works closely with you to understand your requirements and deliver tailor-made software that enhances efficiency and drives growth.
Web and Mobile App Development: From responsive websites to intuitive mobile applications, Prosigns develops cutting-edge solutions that engage users and deliver seamless experiences across devices.
AI & ML Solutions: Harnessing the power of Artificial Intelligence and Machine Learning, Prosigns provides smart solutions that automate processes, provide valuable insights, and drive informed decision-making.
Blockchain Integration: Prosigns offers comprehensive blockchain solutions, including development, integration, and consulting services, enabling businesses to leverage blockchain technology for enhanced security, transparency, and efficiency.
DevOps Services: Prosigns' DevOps services streamline development and operations processes, ensuring faster and more reliable software delivery through automation and continuous integration.
Microsoft Dynamics 365 Support: Prosigns provides comprehensive support and maintenance services for Microsoft Dynamics 365, ensuring your system is always up-to-date, secure, and running smoothly.
Learn how our collaborative approach and dedication to excellence help businesses achieve their goals and stay ahead in today's digital landscape. From concept to deployment, Prosigns is your trusted partner for transforming ideas into reality and unlocking the full potential of your business.
Join us on a journey of innovation and growth. Let's partner for success with Prosigns.
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR (Tier1 app)
Although ‘java.lang.OutOfMemoryError’ appears on the surface to be a single error, there are in fact 9 types of OutOfMemoryError underneath it. Each type has different causes, diagnosis approaches, and solutions. This session equips you with the knowledge, tools, and techniques needed to troubleshoot and conquer OutOfMemoryError in all its forms, ensuring smoother, more efficient Java applications.
Code reviews are vital for ensuring good code quality. They serve as one of our last lines of defense against bugs and subpar code reaching production.
Yet, they often turn into annoying tasks riddled with frustration, hostility, unclear feedback and lack of standards. How can we improve this crucial process?
In this session we will cover:
- The Art of Effective Code Reviews
- Streamlining the Review Process
- Elevating Reviews with Automated Tools
By the end of this presentation, you'll know how to organize and improve your code review process.
Gamify Your Mind; The Secret Sauce to Delivering Success, Continuously Improv... (Shahin Sheidaei)
Games are powerful teaching tools, fostering hands-on engagement and fun. But they require careful consideration to succeed. Join me to explore factors in running and selecting games, ensuring they serve as effective teaching tools. Learn to maintain focus on learning objectives while playing, and how to measure the ROI of gaming in education. Discover strategies for pitching gaming to leadership. This session offers insights, tips, and examples for coaches, team leads, and enterprise leaders seeking to teach from simple to complex concepts.
How Recreation Management Software Can Streamline Your Operations (wottaspaceseo)
Recreation management software streamlines operations by automating key tasks such as scheduling, registration, and payment processing, reducing manual workload and errors. It provides centralized management of facilities, classes, and events, ensuring efficient resource allocation and facility usage. The software offers user-friendly online portals for easy access to bookings and program information, enhancing customer experience. Real-time reporting and data analytics deliver insights into attendance and preferences, aiding in strategic decision-making. Additionally, effective communication tools keep participants and staff informed with timely updates. Overall, recreation management software enhances efficiency, improves service delivery, and boosts customer satisfaction.
Enhancing Project Management Efficiency: Leveraging AI Tools like ChatGPT (Jay Das)
With the advent of artificial intelligence (AI) tools, project management processes are undergoing a transformative shift. By using tools like ChatGPT and Bard, organizations can empower their leaders and managers to plan, execute, and monitor projects more effectively.
In software engineering, the right architecture is essential for robust, scalable platforms. Wix has undergone a pivotal shift from event sourcing to a CRUD-based model for its microservices. This talk will chart the course of this pivotal journey.
Event sourcing, which records state changes as immutable events, provided robust auditing and "time travel" debugging for Wix Stores' microservices. Despite its benefits, the complexity it introduced in state management slowed development. Wix responded by adopting a simpler, unified CRUD model. This talk will explore the challenges of event sourcing and the advantages of Wix's new "CRUD on steroids" approach, which streamlines API integration and domain event management while preserving data integrity and system resilience.
Participants will gain valuable insights into Wix's strategies for ensuring atomicity in database updates and event production, as well as caching, materialization, and performance optimization techniques within a distributed system.
Join us to discover how Wix has mastered the art of balancing simplicity and extensibility, and learn how the re-adoption of the modest CRUD has turbocharged their development velocity, resilience, and scalability in a high-growth environment.
Listen to the keynote address and hear about the latest developments from Rachana Ananthakrishnan and Ian Foster who review the updates to the Globus Platform and Service, and the relevance of Globus to the scientific community as an automation platform to accelerate scientific discovery.
2. About me
• Ph.D. in Computer Science (1997) from the Norwegian University of Technology
• 25 years of work experience in software testing and QA
• Founder / CEO / Quality Engineer at QualiTest Norway
• Currently working as a contractor on the SpareBank 1 Master Data Management team
• Interests: test automation; continuous test execution; model-based testing; ML-supported testing; etc.
«Pursuing new and innovative ways of doing testing and QA in a smart and effective manner»
3. About the talk
01 Context and motivation
02 MBT implementation
03 Lesson learned
4. MDM at SpareBank 1 (SB1)
[Architecture diagram: public registers and SB1 line-of-business and enterprise systems feed updates into SB1-MDM for consolidation; producers and consumers reach the consolidated data through the MDM-API.]
MDM usage:
• 7 million customer records
• 25 consumers and 2 million requests/day
• 12,000 daily updates from public registers in real time
MDM solution:
• 12 microservices exposed to consumers
• 10 enterprise systems integrated
• 300 rules: data validation, transformation, and merge consolidation
• Heavy batch transactions
«The most up-to-date and best-quality enterprise data stored in one single place»
5. Challenges
• Extremely high requirements for data quality
• Technical complexity: many integration interfaces
• High frequency of changes to business rules and domain models
• Increasing number of consumers, with new requirements
• Increasing number of enterprise systems joining the consolidated platform
• Automated regression test suites are constantly growing and changing
• Frequent need to rapidly identify a test suite for the hot-fix at hand (targeted testing)
6. What we want
• Improve test effectiveness by smartly deriving the test scope implied by particular changes
• Reduce the cost of developing and maintaining tests under constant requirement changes
• Better control of traceability between business requirements and tests
MBT (model-based testing)?
7. Model-based testing (MBT)
[Pipeline diagram: rules/requirements → (manual modeling) → model → (automated generation) → test cases and test oracle → (automated coupling) → executable tests in the test tool → (automated execution) → SUT → (automated evaluation).]
Expected benefits:
• Automated (smart) test case generation
• Traceability between requirements and tests
• Adjustable test coverage: «the right tests for a particular change»
• Cost-efficient maintenance of tests by changing the models instead of the tests
8. Example
Input parameters:
• 32 input parameters
• On average, 4 possible input values need to be tested per parameter
Business rules: 60
Field rules:
1. If SSN is set then CustomerType = PER
2. LastName length shall not exceed 256 characters
Cross-field rules:
3. If Citizenship2 != null then Citizenship1 != null
4. If SSN is a D-number && PersonStatus = 3 then Sector = 9800
Behaviour rules:
• Changes from system A will override changes from system B because B's field trust level is downgraded after X days
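The field and cross-field rules on this slide are exactly the kind of thing a test oracle has to encode. The sketch below turns them into executable checks; the field names follow the slide, but the value encodings (e.g. an `IsDNumber` flag for the D-number case) are assumptions made for illustration.

```python
# The slide's rules 1-4 encoded as a sketch of a test oracle.
# Field names follow the slide; value encodings are assumptions.
def check_rules(c):
    violations = []
    # Rule 1: if SSN is set then CustomerType = PER
    if c.get("SSN") and c.get("CustomerType") != "PER":
        violations.append(1)
    # Rule 2: LastName length shall not exceed 256 characters
    if len(c.get("LastName", "")) > 256:
        violations.append(2)
    # Rule 3 (cross-field): Citizenship2 set requires Citizenship1 set
    if c.get("Citizenship2") and not c.get("Citizenship1"):
        violations.append(3)
    # Rule 4 (cross-field): D-number + PersonStatus 3 forces Sector 9800
    if c.get("IsDNumber") and c.get("PersonStatus") == 3 and c.get("Sector") != 9800:
        violations.append(4)
    return violations

ok = {"SSN": "01015012345", "CustomerType": "PER", "LastName": "Hansen",
      "Citizenship1": "NO"}
assert check_rules(ok) == []
```

With 32 parameters and 60 such rules, maintaining these checks in a model rather than in individual test cases is precisely the maintenance argument the talk makes.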
9. Model-based testing (MBT): pipeline recap (same diagram as slide 7).
11. Model-based testing (MBT): pipeline recap (same diagram as slide 7).
12. Test case generation
• 20 parameters, each with ≈ 3 possible values
• Number of possible permutations = 3^20 (some combinations are not relevant)
⇒ Still a «test case explosion»
Objective of test case generation: to derive a set of test cases that, when executed, gives necessary and sufficient coverage for a particular purpose.
Combinatorial algorithms:
• NWise (2 | 3 | ... | n)
• Cartesian (full coverage)
Model elements:
• Parameter
• Rule
• TC-filter
Generated test cases:
• (o1, a1, b3, c2, ...)
• (o2, a2, b1, c2, ...)
• ...
13. Model-based testing (MBT): pipeline recap (same diagram as slide 7).
14. Test case execution
Flow between the tools:
• EcFeed generates the test cases (algorithms + rules + TC-filters), each row of the form: expected_output, input_1, input_2, ..., input_n
• SoapUI binds the test parameters, executes the service API, and evaluates the test result
• Test results are exported to CSV files, DB tables, and Slack
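The execution flow above, bind each generated row, call the service, compare against the expected output, export the results, can be sketched as a small loop. The service call is stubbed here; in the talk that step is a SoapUI test against the MDM API, and the stub's signature is an assumption.

```python
# Sketch of the slide's execution loop: bind generated rows of
# (expected_output, input_1..input_n), call the service, evaluate, export.
import csv
import io

def run_suite(cases, call_service):
    results = []
    for expected, *inputs in cases:
        actual = call_service(*inputs)          # "execute service API" step
        results.append((expected, actual, expected == actual))
    return results

def export_csv(results):
    """One of the slide's export sinks (CSV); DB/Slack sinks would be analogous."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["expected", "actual", "passed"])
    writer.writerows(results)
    return buf.getvalue()

# Stub standing in for the MDM service under test (illustrative logic).
def fake_api(customer_type, ssn):
    return "PER" if ssn else customer_type

cases = [("PER", "PER", "123"), ("ORG", "ORG", None)]
report = export_csv(run_suite(cases, fake_api))
```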
15. Benefits
Expected benefits:
• Automated (smart) test case generation
• Traceability between requirements and tests
• Adjustable test coverage: «the right tests for a particular change»
• Cost-efficient maintenance of tests by changing the models instead of the tests
16. Lessons learned
MBT technique:
• Need a suitable use case for MBT implementation
• Completeness/correctness of the model is crucial; errors in models are hard to find (defect leakage or false alarms)
Implementation:
• Requires good domain and requirements knowledge -> a PO task?
• Define a sufficient number of rules and filters to facilitate effective tests
• Automated tests must be parameterized for easy binding
• Requires modeling skill; anticipate overhead
Tool integration:
• Smooth integration between the MBT tool ecFeed and SoapUI
• Improvement potential in making the MBT tool more intuitive and «test-friendly»