The benefits of introducing new IBM Rational tools into an existing
project often clearly outweigh the difficulties associated with making a change midstream. Read about the various techniques you can use to manage the process.
Deployment of a test management solution for a defence project using an integ... (Einar Karlsen)
The presentation shows how a test management solution was established for a defence project, in compliance with a set of applicable standards, using an integrated IBM Rational tool chain consisting of Rational Quality Manager for test management, IBM Rational DOORS for requirements management, IBM Rational Team Concert for defect management, IBM Rational Publishing Engine for automatic generation of project deliverables and, last but not least, IBM Rational Insight for trend and status reporting.
Plan ahead and act proficiently for reporting - Lessons Learned (Einar Karlsen)
This presentation, held at IBM InterConnect 2016 in Las Vegas, describes the top 10 mistakes an organization can make when deploying document generation tools, in terms of the implied costs, risk and impact. More importantly, however, it also gives you best practices as well as tips and tricks on how to avoid repeating those mistakes. The presentation is based on many years of experience deploying document generation tools such as IBM Rational Publishing Engine, and the discussion takes its origin in real-life examples.
THE UNIFIED APPROACH FOR ORGANIZATIONAL NETWORK VULNERABILITY ASSESSMENT (ijseajournal)
Today's business network infrastructure changes quickly, with new servers, services, connections, and ports added often, at times daily, and with an uncontrolled influx of laptops, storage media and wireless networks. With the growing number of vulnerabilities and exploits, coupled with the continual evolution of IT infrastructure, organizations now require more frequent vulnerability assessments. In this paper, a new approach to network vulnerability assessment, the Unified Process for Network Vulnerability Assessment (hereafter called unified NVA), is proposed. It is derived from the Unified Software Development Process (Unified Process), a popular iterative and incremental software development process framework.
1. The document discusses the Unified Process, an iterative software development methodology. It describes the four phases of the Unified Process: Inception, Elaboration, Construction, and Transition.
2. Within each phase, development is organized into iterations which produce executable software increments. Each iteration addresses requirements, design, coding, testing, and deployment activities.
3. The Unified Process uses several modeling techniques including use case models, UML diagrams, and other artifacts to plan and guide the development process from requirements to deployment.
Factors to consider when starting a brand-new requirements management project... (IBM Rational software)
The document discusses factors to consider when starting a new requirements management project in IBM Rational DOORS Next Generation. It recommends understanding project goals, environment and constraints to optimize the requirements process. Key questions to address include which artifacts define scope, how artifacts will be organized and tracked, what relationships are important, and which development methodology is being followed. The document also discusses configuring artifact types, attributes, link types and modules to structure requirements information in the project.
The document discusses different software development process models, including the spiral model, concurrent model, and component-based development model. The spiral model is an evolutionary model that combines iterative development and risk analysis; it produces progressively more complete versions of the software through iterations. The concurrent model allows activities like modeling, analysis, and design to progress concurrently in different states. The component-based development model is also evolutionary: it reuses prepackaged software components by researching available products, designing the architecture, and integrating and testing the components.
Basic concepts and terminology for the Requirements Management application (IBM Rational software)
After you complete this module, you should be able to do these tasks:
- Explain the difference between Jazz™ Team Server and the Requirements Management (RM) application
- Describe the basic concepts and terminology in the RM application
- Identify tasks that the team must do before starting a requirements management project with IBM® Rational® DOORS Next Generation or IBM® Rational® Requirements Composer
Dynamic Object-Oriented Requirements System (DOORS) (David Groff)
This document provides an overview of requirements management (RM) and the IBM Rational DOORS tool. It discusses what RM is, who uses it, and what DOORS is and how it can be used. It describes the key components and architecture of DOORS, including modules, objects, attributes, and links. It also covers security roles, configurations, scripting with DXL, and integrations with other tools.
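The module, object, attribute and link structure described above maps naturally onto a small data model. As a rough illustration only (the class and attribute names below are hypothetical, not the DOORS API or DXL), a DOORS-style module might be sketched in Python like this:

```python
from dataclasses import dataclass, field

@dataclass
class RequirementObject:
    """One object (requirement) in a DOORS-style module."""
    identifier: str
    text: str
    attributes: dict = field(default_factory=dict)   # e.g. {"Priority": "High"}
    out_links: list = field(default_factory=list)    # identifiers of linked objects

@dataclass
class Module:
    """A module groups a hierarchy of requirement objects."""
    name: str
    objects: list = field(default_factory=list)

    def link(self, src_id: str, dst_id: str) -> None:
        """Record a directed link (e.g. 'satisfies') between two objects."""
        src = next(o for o in self.objects if o.identifier == src_id)
        src.out_links.append(dst_id)

# Usage: two requirements with a traceability link between them
m = Module("System Requirements")
m.objects.append(RequirementObject("SR-1", "The system shall log in users."))
m.objects.append(RequirementObject("SR-2", "Login shall complete in under 2 s."))
m.link("SR-2", "SR-1")
```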
After you complete this module, you should be able to explain these concepts:
- How requirements fit in the development process
- Key principles of requirements definition and management
- How you can manage requirements by using IBM Rational requirements management tools
After you complete this module, you should be able to do these tasks:
- Generate requirements report documents
- Explain the reporting capability that is available through IBM® Rational® Reporting for Development Intelligence
Rhapsody and mechatronics, multi-domain simulation (Graham Bleakley)
This document discusses mechatronics and its application with Rational Rhapsody Design Manager. Mechatronics involves the integration of mechanical, electrical, and software engineering, requiring a systems engineering approach. Mechatronic modeling requires mathematical modeling tools that can be integrated into logical behavior models. Rhapsody provides a way to work with mathematical modeling tools like Simulink and Modelica to model both logical and physical behavior.
The document discusses two projects - a weather update app developed in Android using an API from OpenWeatherMap, and a charity website developed in PHP using frameworks like CakePHP and Zend.
The weather app allows users to access current weather data like temperature, humidity, and wind speed for any location by city name. It uses JSON parsing to retrieve data from the OpenWeatherMap API.
The charity website called Kalpvirksha is a platform for NGOs to showcase their work and connect with donors. It was developed using PHP and frameworks like CakePHP, Zend, Joomla and Wordpress to provide features like user accounts and project listings.
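To give a flavour of how such a weather app retrieves its data, here is a minimal Python sketch of calling OpenWeatherMap's current-weather endpoint and parsing the JSON response. The API key is a placeholder, and the error handling is deliberately basic:

```python
import requests  # third-party HTTP client: pip install requests

API_KEY = "YOUR_OPENWEATHERMAP_KEY"  # placeholder: register at openweathermap.org

def current_weather(city: str) -> dict:
    """Fetch current weather for a city and return a few selected fields."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": API_KEY, "units": "metric"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()  # JSON parsing, as the summary above describes
    return {
        "temperature_c": data["main"]["temp"],
        "humidity_pct": data["main"]["humidity"],
        "wind_speed_ms": data["wind"]["speed"],
    }

# print(current_weather("London"))
```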
This document provides a summary of an IBM Rational DOORS Next Generation training course. It discusses key topics covered in the course including requirements management, traceability, collaboration, reuse of requirements, project templates, security roles, and integrated reporting. The course teaches participants how to effectively use DOORS Next Generation to manage requirements throughout the entire project lifecycle.
This document discusses 10 lessons learned from building real-world machine learning systems. Some key points include: implicit signals from user actions often provide better predictions than explicit feedback; combining implicit and explicit signals can improve models; models will learn whatever they are trained on so careful selection of training data, metrics, and objectives is important; and machine learning features should be reusable, transformable, and interpretable. The document also cautions that machine learning systems can accumulate hidden technical debt over time from issues like data and system dependencies, feedback loops, and unused code paths. Regular evaluation and refactoring is needed to manage technical debt in machine learning applications.
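To make the implicit-versus-explicit point concrete, here is a toy Python sketch of blending an explicit rating with implicit engagement signals into a single training label; the weights and rescaling are invented for illustration, not taken from the document:

```python
from typing import Optional

def training_label(rating: Optional[float], watch_fraction: float, clicked: bool) -> float:
    """Blend explicit feedback (a 1-5 star rating, possibly absent) with
    implicit signals (fraction of the item consumed, click) into one
    relevance label in [0, 1]. The weights are arbitrary illustrations."""
    implicit = 0.7 * watch_fraction + 0.3 * (1.0 if clicked else 0.0)
    if rating is None:           # no explicit feedback: rely on behaviour alone
        return implicit
    explicit = (rating - 1) / 4  # rescale 1-5 stars to [0, 1]
    return 0.5 * explicit + 0.5 * implicit

# A 4-star rating plus 80% watch time and a click:
print(training_label(4.0, 0.8, True))  # 0.805
```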
Define and Manage Requirements with IBM Rational Requirements Composer (Alan Kan)
The document provides an overview of a hands-on lab session on IBM Rational Requirements Composer (RRC). The lab aims to demonstrate how RRC can help teams collaborate to define, manage and trace requirements across the software development lifecycle. The lab covers topics like importing and linking requirements, modeling business processes and use cases, conducting reviews, and generating work items and test cases from requirements. Known issues encountered in the labs are also documented.
When you complete this module, you should be able to do these tasks:
• Explore the content of a module
• Analyze the information in a module
• Create, move, edit and delete artifacts in a module
• Identify and implement hierarchical data structures in a module
This document discusses various prescriptive software process models. It begins by describing a generic process framework that includes communication, planning, modeling, construction, and deployment. It then covers traditional models like the waterfall model and incremental model. Specialized models discussed include component-based development and formal methods. Finally, it describes the unified process model, which is iterative and incremental.
After completing this unit, you should be able to:
- Describe the purpose of traceability
- Explain the difference between a content link and a traceability link
- Link objects to create traceability
- View traceability relationships in columns, graphically, and in the artifact sidebar
- Delete links between objects to fix traceability
- Navigate around different levels of information by using traceability links
- Analyze the impact of a changed requirement or failed test by using traceability
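The impact analysis in the last bullet is essentially a graph traversal over traceability links. A minimal sketch of the idea (generic Python, not the DOORS Next Generation API):

```python
from collections import defaultdict, deque

# Traceability links stored as "covered-by" edges:
# requirement -> artifacts that trace to it (designs, tests, ...).
links = defaultdict(list)
links["REQ-10"] = ["DES-3", "TC-7"]
links["DES-3"] = ["CODE-12"]

def impact_of_change(artifact: str) -> set:
    """Breadth-first walk of traceability links to find every downstream
    artifact that becomes suspect after `artifact` changes."""
    suspect, queue = set(), deque([artifact])
    while queue:
        for child in links[queue.popleft()]:
            if child not in suspect:
                suspect.add(child)
                queue.append(child)
    return suspect

print(impact_of_change("REQ-10"))  # {'DES-3', 'TC-7', 'CODE-12'}
```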
This presentation is about a lecture I gave within the "Software systems and services" immigration course at the Gran Sasso Science Institute, L'Aquila (Italy): http://cs.gssi.infn.it/.
http://www.ivanomalavolta.com
The document discusses Telelogic Rhapsody, a model-driven development tool for designing technical and embedded systems. It addresses key challenges in systems development such as effective collaboration, managing requirements changes, and testing. Rhapsody uses model-driven development approaches like UML/SysML modeling, requirements traceability, model-driven testing, and automatic code generation to help developers meet schedules, reduce errors, and facilitate team collaboration.
This document provides an overview of conducting reviews in IBM Rational DOORS Next Generation. It describes preparing for a review by creating either a formal or informal review from a collection or individual artifacts. It outlines the review lifecycle and roles. Participants can complete a review by approving, disapproving, or abstaining from artifacts. Once all participants are finished, the review creator can finalize the review. The goal of reviews is to catch errors early and reduce rework through collaboration and feedback on requirements.
Lecture 3: Navigating the Requirements Management application: Web client (IBM Rational software)
After you complete this module, you should be able to:
► Navigate the web client interface
► View requirement artifacts and information about them, both at the project level and in modules
► Use filters to focus on relevant information
► Change display settings to reveal relevant details
► Quickly switch perspectives of the requirements information by using views
► Use comments on artifacts to collaborate
► Use dashboards for lifecycle and progress analysis
- Manoj Kumar has 5 years of experience as a software professional specializing in data warehousing and ETL development using Informatica and Oracle database.
- He has experience designing and implementing complex ETL mappings including Slowly Changing Dimensions types 2 and 3.
- Manoj seeks new opportunities as an ETL developer where he can utilize his skills in Informatica, Oracle, shell scripting and more.
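Since the summary mentions Slowly Changing Dimensions, a compact example may help: under SCD type 2, a changed attribute closes the current dimension row and inserts a new versioned row instead of overwriting it (type 3 instead keeps a "previous value" column). A generic Python illustration, not Informatica syntax:

```python
from datetime import date

# A type-2 dimension keeps full history: old rows are end-dated, never updated in place.
dim_customer = [
    {"customer_id": 42, "city": "Pune", "valid_from": date(2020, 1, 1),
     "valid_to": None, "is_current": True},
]

def scd2_update(rows, customer_id, new_city, change_date):
    """Close the current row and append a new current version."""
    for row in rows:
        if row["customer_id"] == customer_id and row["is_current"]:
            row["valid_to"] = change_date
            row["is_current"] = False
    rows.append({"customer_id": customer_id, "city": new_city,
                 "valid_from": change_date, "valid_to": None, "is_current": True})

scd2_update(dim_customer, 42, "Mumbai", date(2024, 6, 1))
# dim_customer now holds both the historical Pune row and the current Mumbai row.
```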
After you complete this module, you should be able to manage change by doing these tasks:
- Identify changed artifacts
- Explore the history of an artifact
- Identify suspect traceability
Aspect-Oriented Software Development (AOSD) is a programming methodology that addresses limitations in object-oriented programming for modularizing cross-cutting concerns. AOSD uses aspects to encapsulate cross-cutting concerns so they can be separated from the core functionality. Aspects are automatically incorporated into the system by a weaver. This improves modularity and makes software easier to maintain and evolve over time.
Automatically bridging UML profiles into MOF metamodels (Ivano Malavolta)
27th August 2015. My presentation at SEAA 2015 (http://paginas.fe.up.pt/~dsd-seaa-2015/) about our approach for automatically bridging UML profiles into MOF metamodels. SEAA 2015 is the 41st Euromicro Conference on Software Engineering and Advanced Applications; it was held on 26-28 August 2015 in Funchal, Madeira, Portugal.
Accompanying paper:
http://www.ivanomalavolta.com/files/papers/SEAA_2015.pdf
Abstract:
In Model-Driven Engineering, UML profiles and MOF-based Domain Specific Modeling Languages (DSMLs) are the most widely used approaches for describing domain-specific applications. The choice of the right approach depends on several aspects, such as tool support, expressivity, complexity of models, and company policies. In general, profiled UML models are widely used since they are intuitive for designers and model editors already exist; however, they are intrinsically complex to manipulate (e.g., for transformation or analysis). Conversely, domain-specific models are more concise and easier to manipulate, but they require an initial effort in terms of designer training and model editor development.
In this paper we propose an approach that gets the best of both worlds: on one side, designers can use the UML profiles familiar to them; on the other side, DSML models (automatically generated from the profiled UML models) enable better model manipulation. Our approach is based on an automatic bridge between UML profiles and MOF metamodels (which are the main artifacts of MOF-based DSMLs). The bridge is transparent to the user, since it autonomously operates both on the UML profiles and on all the involved models. The bridge is realized through model transformation techniques in the Eclipse platform. In this paper we show its application on a case study based on SysML.
Practical Experiences Migrating Unified Modeling Language Models to IBM® Rati... (Einar Karlsen)
The presentation shares some experiences migrating UML models to Rational Software Architect. It covers these topics: Motivations and Mechanisms; Migration Process; Migration from Rational Rose and Rational XDE to RSA; Migration from 3rd-Party UML Tools; Basic Rules and Conclusions.
The document discusses various aspects of software design including the design process, concepts, models, heuristics, and styles. It describes software design as translating requirements into a finished product through iterative refinement. Key aspects covered include data/class design, architectural design, interface design, component design, abstraction, modularity, patterns, and information hiding. Architectural styles provide patterns for creating system architecture for given problems.
Software Design Patterns: Consider a company migrating to a third-p.pdf (arorastores)
Software Design Patterns:
Consider a company migrating to a third-party cloud-based solution from an internally maintained ecosystem of applications utilizing one current-generation database system, as well as a legacy system for older data. They plan to migrate all data to the cloud-based solution in time. But, for now, they are going to transition to the new cloud-based applications and the cloud-based database for new data, while relying upon the existing and legacy database for older data. The databases have approximately the same functionality, but different interfaces and languages.
What design pattern highlights the most significant challenge associated with integrating the different databases (as well as one way of addressing it)? What is that challenge? Briefly, and in English, describe how the pattern teaches that we should approach this problem. In other words, what is the pattern that should be followed for the solution?
Solution
Design patterns like the Factory pattern, Singleton pattern, etc. basically provide solutions to general problems faced by software developers during the development phase. These patterns do not play any role in data migration.
There are four stages in data migration:
1. Semantic data models, which comprise the dimensional models, semantic models, and mapping to semantic building blocks.
2. Data mapping specifications, which are used to translate source data to target data.
3. KPIs and data lineage, which are useful in establishing the data lineage for the organization and other rightful requirements.
4. End-to-end scope of data models, which is used to standardise the data that is loaded into the data warehouse.
Please follow the list of steps below while migrating data to the cloud:
1. Assess the requirements and then plan.
2. Disintegrate the dependencies after the initial assessments.
3. Redesign, re-program and reintegrate.
4. Test the newly migrated components.
5. Fine-tune and train.
However, there will be technical issues during data migration. Many firms that migrate data to the cloud proceed in a hybrid model, keeping key elements of their infrastructure in-house and under their control while they outsource less sensitive or core components. Cloud vendors would always expect the customers to provide or jointly develop a virtual image that specifies their basic server configuration, which is offered as a service after being built inside the cloud. The IT team must also have the skill set to create a VM template which includes the infrastructure, application and security required by the enterprise.
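To illustrate the kind of pattern the question is asking about: the Adapter pattern is commonly cited where two systems offer equivalent functionality through different interfaces and languages, as these two databases do. A minimal Python sketch of the idea, with hypothetical class and method names and no real database driver:

```python
class CloudDB:
    """New cloud database with its own interface (hypothetical)."""
    def run_query(self, statement: str) -> list:
        return [f"cloud result for: {statement}"]

class LegacyDB:
    """Legacy database with a different interface and language (hypothetical)."""
    def execute_legacy(self, cmd: bytes) -> list:
        return [f"legacy result for: {cmd.decode()}"]

class LegacyDBAdapter:
    """Adapter: exposes the cloud interface, delegates to the legacy one."""
    def __init__(self, legacy: LegacyDB):
        self._legacy = legacy
    def run_query(self, statement: str) -> list:
        # Translate the call (and, in reality, the query language) on the fly.
        return self._legacy.execute_legacy(statement.encode())

def fetch_orders(db) -> list:
    # Application code is written once, against a single interface.
    return db.run_query("SELECT * FROM orders")

print(fetch_orders(CloudDB()))
print(fetch_orders(LegacyDBAdapter(LegacyDB())))
```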
Automated software modernisation is the best solution: it is fast, low cost, preserves legacy value, and is less risky by comparison with the traditional methodology of a rewrite or replacement by packaged ERP. The Object Management Group's (OMG's) Model Driven Architecture (MDA) methodology provides an automated model-driven reverse and forward engineering process called Architecture Driven Modernisation (ADM), which has already been successfully adopted by a variety of high-profile organisations such as Boeing, the U.S. Air Force, Raytheon, EDS, Thales (European Aerospace) and numerous governments worldwide.
WORPCLOUD LTD is focused on being an automated software modernisation expert. We use OMG-compliant tools and parsing techniques to extract all system information, business semantics and software artifacts into an XML repository called the Abstract Syntax Tree Metamodel. Next, we use MDA's automated transformation procedures to generate new source code of your choice. Manual architecting of the target system is also performed before the transformation, thus ensuring the speed, low cost and accuracy of the automated process combined with the flexibility and insight of human analysis.
Research reveals that application modernisation and migration budgets are currently very strong, covering between 25% and 71% of most companies' IT budgets in 2013/2014. This clearly indicates that application modernisation is one of the most significant issues affecting companies, due to high software maintenance costs, low business flexibility and crippled integration and interoperability. Software modernisation is the sole remedy for these problems, and your organisation can make huge savings by modernising.
BPM-X Pattern-based model transformations (v2) (BPM-Xchange)
Model data conversions can be achieved with a pattern-based transformation engine, a component included in the BPM-Xchange® enterprise application integration (EAI) software.
This document discusses the use of Model-Driven Architecture (MDA) and model transformations in software product lines (SPL). It begins by introducing SPLs and MDA. SPLs aim to increase productivity by leveraging commonalities between related products. MDA uses platform-independent and platform-specific models with transformations between them. The document then explores combining MDA and SPL approaches through the Modden framework and Baseline-Oriented Modeling. Modden develops reusable core assets through domain and application engineering processes with MDA. Baseline-Oriented Modeling produces expert systems as PRISMA architectural models from SPLs using MDA.
An Application of Business Process Modeling System Ilnet.pdf (Jennifer Holmes)
The document describes an application called ILNET that was developed to automate administrative processes at educational institutions. ILNET is a business process modeling system that allows processes to be defined, visualized, and executed using workflows. It includes a graphical editor, server, compiler, and library of reusable building blocks. ILNET supports features like custom visualization of processes, hot-swapping of running models, and integration of web services. An example application of using ILNET for thesis administration processes at a university is also discussed.
DESIGN AND DEVELOPMENT OF BUSINESS RULES MANAGEMENT SYSTEM (BRMS) USING ATLAN... (ijcsit)
The document describes the design and development of a Business Rules Management System (BRMS) using the ATL and Eclipse Sirius frameworks. It proposes a new "Target Ecore meta model" to improve the structure and management of business rules. The system allows business rules to be modeled and transformed from their current format into an object-oriented format using ATL model transformations. This provides improved modularity, scalability and extensibility of the rules compared to the original structure. A case study demonstrates transforming an example business rule from a software package based on the proposed approach.
The document discusses the Unified Modeling Language (UML) and its role in object-oriented analysis and design. It describes UML as a graphical language used to visualize, specify, construct, and document software systems. UML provides tools and features to support complex systems using object-oriented concepts and methodology. UML diagrams are used to model system designs, with the key UML diagrams being class, sequence, use case, state machine, and activity diagrams. The document also briefly mentions some criticisms of UML regarding when diagrams should be used and how frequently they need to be updated.
IT 8003 Cloud Computing - For this activi.docx (vrickens)
IT 8003 Cloud Computing
For this activity you need to divide your class into groups.
Group Activity 1 "SuperTAX Software": SuperTax Overview
Did you know President Abraham Lincoln, one of America's most beloved leaders, also instituted one of its least liked obligations - the income tax? In this brief history of taxes, see the historical events which shaped income taxes in the United States today.
SuperTax is an American tax preparation software package developed in the mid-1980s. SuperTax Corporation is headquartered in Mountain View, California.
Group Activity 1 "SuperTAX Software": SuperTax Information
- Desktop software.
- Supports MS Windows and Mac OS.
- Delivery method: CD/DVD media format.
- Different versions: SuperTAX Basic, Deluxe, Premier, and Home & Business.
- Used by millions of users and organizations.
Group Activity 1 "SuperTAX Software": SuperTAX Project
SuperTAX has hired your group as consultants to move their desktop software to traditional IT-hosted software, available online.
Group Activity 1 "SuperTAX Software": For Discussion
- Find the challenges that your team will encounter in attempting to move SuperTAX Software to the new platform.
- Prepare a presentation for the class.
- Within your group you will need to define positions, for example: Project Manager, Senior Project Network, Senior Project Engineer, etc.
Group Activity 1 "SuperTAX Software": Challenges
- Infrastructure
- Software Development
- Software Testing
- Marketing & Business Model
- Project Management
Challenges: Infrastructure
- No more testing on a single machine (CD/DVD format model).
- Testing in a production cluster (20, 30 users?).
- A larger cluster can bring problems (1000s of users).
- Testing must be done for different clients (mobile, desktops, OS).
- A small performance bottleneck means slow performance.
Challenges: Marketing & Business Model
- One-time fixed cost vs. subscription model: before, a CD was sold; now, a subscription.
- Maintenance and replacement of cooling, power, and servers is required.
Challenges: Project Management
- A project can take many months to years for the software development cycle.
- Which model is appropriate for a hosted application? (Agile vs. waterfall)
- Ability to try new features faster.
Intersession 5 Final Project Projection
Shalini Kantamneni
Ottawa University
The Design Process
This process involves the formulation of a model to be used in deriving a comprehensive cloud application. In this case, the model-view-controller design pattern will be used. This type of design pattern partitions the logic of the application into three distinct domains that are to be interconnected to provide a working cloud application (Jailia et al., 2016). ...
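As a minimal illustration of that partitioning (generic Python, not tied to any cloud framework or to the cited paper):

```python
class Model:
    """Holds application state and business logic."""
    def __init__(self):
        self.items = []
    def add(self, name: str):
        self.items.append(name)

class View:
    """Renders state; knows nothing about how it changes."""
    def render(self, items):
        print("Items:", ", ".join(items) or "(none)")

class Controller:
    """Translates user input into model updates, then refreshes the view."""
    def __init__(self, model: Model, view: View):
        self.model, self.view = model, view
    def handle_add(self, name: str):
        self.model.add(name)
        self.view.render(self.model.items)

Controller(Model(), View()).handle_add("first item")
```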
General Methodology for developing UML models from UI (ijwscjournal)
In the recent past, every discipline and every industry has had its own methods of developing products, be it software development, mechanics, construction, psychology and so on. These demarcations work fine as long as the requirements lie within one discipline. However, if a project extends over several disciplines, interfaces have to be created and coordinated between the methods of those disciplines. Performance is an important quality aspect of web services because of their distributed nature, and predicting the performance of web services during the early stages of software development is significant. In industry, a prototype of these applications is developed during the analysis phase of the Software Development Life Cycle (SDLC), whereas performance models are generated from UML models, and methodologies for predicting performance from UML models are available. Hence, in this paper, a methodology for developing a use case model and an activity model from the user interface is presented. The methodology is illustrated with a case study on Amazon.com.
General Methodology for developing UML models from UI (ijwscjournal)
The document presents a methodology for developing UML models from a user interface prototype. The methodology involves identifying user interface elements from the prototype, developing a flow diagram of the elements, creating an activity model, and developing a use case model. The methodology is demonstrated through a case study of developing UML models for the login page of the Amazon.com website. Key steps include identifying UI elements like workspaces and functions, creating a flow diagram to show the main and exception flows, developing an activity model of the login process, and specifying a use case for login and authentication.
Process models are not perfect, but they provide a road map for software engineering work. Software models provide stability, control, and organization to a process that, if not managed, can easily get out of control. Software process models are adapted to meet the needs of software engineers and managers for a specific project.
The document discusses various prescriptive software process models including the waterfall model, incremental process model, evolutionary process model, and prototyping. The waterfall model proposes a sequential approach from requirements to deployment. The incremental model produces deliverable software increments. Evolutionary models iteratively produce more complete versions. Prototyping builds prototypes to help define requirements through evaluation. Issues with each approach are also outlined.
The document discusses object-oriented analysis and design (OOAD) and the Unified Modeling Language (UML). It describes OOAD as modeling a system as interacting objects characterized by their class, state, and behavior. Various UML diagrams can show the static structure, dynamic behavior, and runtime deployment of these collaborating objects. It then discusses object-oriented analysis, design, and the Unified Process framework for software development before briefly introducing UML diagrams.
The document provides an overview of a feasibility study for a software engineering project. It discusses the technical, economic, and operational feasibility aspects that should be analyzed. The technical feasibility section examines the system requirements and configuration. The economic feasibility section describes cost-benefit analysis to determine if the benefits outweigh the costs. The operational feasibility section considers the organizational impacts and changes, including changes to skills needed and staffing requirements.
This document discusses various prescriptive process models for software engineering. It begins by introducing generic process frameworks and then discusses traditional models like waterfall, incremental, prototyping, RAD and spiral. It also covers specialized models for component-based development and formal methods. Each model is explained in terms of its activities, advantages and challenges. Traditional models tend to be sequential while evolutionary models iterate and provide early feedback. Specialized models focus on areas like reuse and formal specification.
Similar to Migrating existing projects to Rational solutions
A case study in using IBM Watson Studio machine learning services - IBM Devel... (Einar Karlsen)
This IBM Developer article shows various ways of predicting customer churn using IBM Watson Studio, ranging from a semi-automated approach using the Model Builder, through a diagrammatic approach using SPSS Modeler Flows, to a fully programmed style using Jupyter notebooks.
Weather data meets IBM Cloud. Part 4 - analysis and visualization of weather ... (Einar Karlsen)
This IBM Developer article shows how to analyse and visualize weather data using IBM Watson Studio, Jupyter notebooks for Python, and the IBM Cognos Dashboard Embedded service on IBM Cloud.
Weather data meets IBM Cloud. Part 3 - transformation and aggregation of weat... (Einar Karlsen)
This IBM Developer article shows how to extract, transform and load weather data stored by IBM Cloud Object Storage using IBM SQL Query, pandas, Apache Spark, IBM Watson Studio and Jupyter notebooks for Python.
Weather data meets IBM Cloud. Part 2 - storage and query of weather data - IB... (Einar Karlsen)
This IBM Developer article - co-authored with Rene Meyer - shows how to store and query weather data using IBM Event Streams (based on Apache Kafka), IBM Cloud Object Storage (serving as landing zone) and IBM SQL Query.
Weather data meets IBM Cloud. Part 1 - ingestion and processing of weather da... (Einar Karlsen)
This recipe - co-authored with Julia Wiegel and Rene Meyer - shows how to ingest and process weather data using the Weather Company Data Service (API), IBM Cloud Functions (based on Apache OpenWhisk) and IBM Event Streams (based on Apache Kafka). It was originally published on IBM Developer.
IBM Innovate 2012 - CIO Cockpit for Integrated Planning, Controlling and Analysis (Einar Karlsen)
The presentation provides an overview of the CIO Cockpit for integrated planning, controlling and analysis using IBM Rational Focal Point, IBM Rational Insight and Cognos solutions for Business Intelligence.
Need for Speed: Removing speed bumps from your Symfony projects ⚡️ (Łukasz Chruściel)
No one wants their application to drag like a car stuck in the slow lane! Yet it’s all too common to encounter bumpy, pothole-filled solutions that slow the speed of any application. Symfony apps are not an exception.
In this talk, I will take you for a spin around the performance racetrack. We’ll explore common pitfalls - those hidden potholes on your application that can cause unexpected slowdowns. Learn how to spot these performance bumps early, and more importantly, how to navigate around them to keep your application running at top speed.
We will focus in particular on tuning your engine at the application level, making the right adjustments to ensure that your system responds like a well-oiled, high-performance race car.
UI5con 2024 - Boost Your Development Experience with UI5 Tooling ExtensionsPeter Muessig
The UI5 tooling is the development and build tooling of UI5. It is built in a modular and extensible way so that it can be easily extended by your needs. This session will showcase various tooling extensions which can boost your development experience by far so that you can really work offline, transpile your code in your project to use even newer versions of EcmaScript (than 2022 which is supported right now by the UI5 tooling), consume any npm package of your choice in your project, using different kind of proxies, and even stitching UI5 projects during development together to mimic your target environment.
Zoom is a comprehensive platform designed to connect individuals and teams efficiently. With its user-friendly interface and powerful features, Zoom has become a go-to solution for virtual communication and collaboration. It offers a range of tools, including virtual meetings, team chat, VoIP phone systems, online whiteboards, and AI companions, to streamline workflows and enhance productivity.
OpenMetadata Community Meeting - 5th June 2024OpenMetadata
The OpenMetadata Community Meeting was held on June 5th, 2024. In this meeting, we discussed about the data quality capabilities that are integrated with the Incident Manager, providing a complete solution to handle your data observability needs. Watch the end-to-end demo of the data quality features.
* How to run your own data quality framework
* What is the performance impact of running data quality frameworks
* How to run the test cases in your own ETL pipelines
* How the Incident Manager is integrated
* Get notified with alerts when test cases fail
Watch the meeting recording here - https://www.youtube.com/watch?v=UbNOje0kf6E
Enterprise Resource Planning System includes various modules that reduce any business's workload. Additionally, it organizes the workflows, which drives towards enhancing productivity. Here are a detailed explanation of the ERP modules. Going through the points will help you understand how the software is changing the work dynamics.
To know more details here: https://blogs.nyggs.com/nyggs/enterprise-resource-planning-erp-system-modules/
Artificia Intellicence and XPath Extension FunctionsOctavian Nadolu
The purpose of this presentation is to provide an overview of how you can use AI from XSLT, XQuery, Schematron, or XML Refactoring operations, the potential benefits of using AI, and some of the challenges we face.
Atelier - Innover avec l’IA Générative et les graphes de connaissancesNeo4j
Atelier - Innover avec l’IA Générative et les graphes de connaissances
Allez au-delà du battage médiatique autour de l’IA et découvrez des techniques pratiques pour utiliser l’IA de manière responsable à travers les données de votre organisation. Explorez comment utiliser les graphes de connaissances pour augmenter la précision, la transparence et la capacité d’explication dans les systèmes d’IA générative. Vous partirez avec une expérience pratique combinant les relations entre les données et les LLM pour apporter du contexte spécifique à votre domaine et améliorer votre raisonnement.
Amenez votre ordinateur portable et nous vous guiderons sur la mise en place de votre propre pile d’IA générative, en vous fournissant des exemples pratiques et codés pour démarrer en quelques minutes.
Transform Your Communication with Cloud-Based IVR SolutionsTheSMSPoint
Discover the power of Cloud-Based IVR Solutions to streamline communication processes. Embrace scalability and cost-efficiency while enhancing customer experiences with features like automated call routing and voice recognition. Accessible from anywhere, these solutions integrate seamlessly with existing systems, providing real-time analytics for continuous improvement. Revolutionize your communication strategy today with Cloud-Based IVR Solutions. Learn more at: https://thesmspoint.com/channel/cloud-telephony
Odoo ERP software
Odoo ERP software, a leading open-source software for Enterprise Resource Planning (ERP) and business management, has recently launched its latest version, Odoo 17 Community Edition. This update introduces a range of new features and enhancements designed to streamline business operations and support growth.
The Odoo Community serves as a cost-free edition within the Odoo suite of ERP systems. Tailored to accommodate the standard needs of business operations, it provides a robust platform suitable for organisations of different sizes and business sectors. Within the Odoo Community Edition, users can access a variety of essential features and services essential for managing day-to-day tasks efficiently.
This blog presents a detailed overview of the features available within the Odoo 17 Community edition, and the differences between Odoo 17 community and enterprise editions, aiming to equip you with the necessary information to make an informed decision about its suitability for your business.
DDS Security Version 1.2 was adopted in 2024. This revision strengthens support for long runnings systems adding new cryptographic algorithms, certificate revocation, and hardness against DoS attacks.
Software Engineering, Software Consulting, Tech Lead, Spring Boot, Spring Cloud, Spring Core, Spring JDBC, Spring Transaction, Spring MVC, OpenShift Cloud Platform, Kafka, REST, SOAP, LLD & HLD.
WhatsApp offers simple, reliable, and private messaging and calling services for free worldwide. With end-to-end encryption, your personal messages and calls are secure, ensuring only you and the recipient can access them. Enjoy voice and video calls to stay connected with loved ones or colleagues. Express yourself using stickers, GIFs, or by sharing moments on Status. WhatsApp Business enables global customer outreach, facilitating sales growth and relationship building through showcasing products and services. Stay connected effortlessly with group chats for planning outings with friends or staying updated on family conversations.
Flutter is a popular open source, cross-platform framework developed by Google. In this webinar we'll explore Flutter and its architecture, delve into the Flutter Embedder and Flutter’s Dart language, discover how to leverage Flutter for embedded device development, learn about Automotive Grade Linux (AGL) and its consortium and understand the rationale behind AGL's choice of Flutter for next-gen IVI systems. Don’t miss this opportunity to discover whether Flutter is right for your project.
Neo4j - Product Vision and Knowledge Graphs - GraphSummit ParisNeo4j
Dr. Jesús Barrasa, Head of Solutions Architecture for EMEA, Neo4j
Découvrez les dernières innovations de Neo4j, et notamment les dernières intégrations cloud et les améliorations produits qui font de Neo4j un choix essentiel pour les développeurs qui créent des applications avec des données interconnectées et de l’IA générative.
2. developerWorks® ibm.com/developerWorks/
Migrating existing projects to Rational solutions Page 2 of 13
Introducing new tools into existing projects
There are a variety of reasons development teams need to disrupt their daily flow by introducing
new products into an existing project. Typical motivations for this include:
1. Modernization. Existing platforms, tools, or methods have reached end-of-life or do not offer
adequate support and need to be replaced by more modern solutions. An example in this
category is the migration from a structured design method, supported by a tool facing end-of-life,
to an object-oriented design method using the Unified Modeling Language (UML).
2. Standardization/corporate governance. The project is not aligned with corporate standards
and is consequently forced to migrate to a solution mandated by those standards. This
typically occurs in the context of mergers, takeovers, or internal reorganizations.
3. Improved functionality/automation. Existing solutions do not offer the degree of automation
or functionality required to support the project and must therefore be replaced by improved
solutions. An example in this category is the transition to a modeling tool offering Model
Driven Architecture (MDA) support when the existing UML tool lacks these capabilities.
4. Improved interoperability/lifecycle support. A change in the tool environment is made in
one discipline in order to improve interoperability and lifecycle support with the rest of the
development disciplines. An example in this category is the change from one UML tool to
another in order to achieve a proper integration with a requirement management system or
with an Integrated Development Environment (IDE).
5. Cost reduction. The existing solutions have become too expensive, and there are offerings
available that would, if adopted, yield cost improvements -- e.g., lower maintenance costs that
still provide a ROI once migration and training costs are taken into account. Such a migration
would, for example, occur if a user of IBM Rational ClearQuest® and a third-party test
management system decided to migrate to the ClearQuest Test Manager.
Of course, other reasons might motivate a change in procedures and tools, but these are the ones
that I have come across during my many years supporting customers in their migration efforts.
Migration mechanisms
Migration can be seen as a process of transferring data between computer systems; it is to some
extent akin to data conversion (a transition that changes format and function) and data transformation
(a transition that maps source data to target data). There are several ways in which a migration of
source data maintained in an old tool to target data in a new tool can be performed:
• Manual migration
• Built-in export/import tool
• Third party tool
• Home-grown migration scripts
The simplest form of migration is a manual migration. This kind of migration is tedious and has
a high cost in terms of money and time when a large quantity of data is involved. Moreover, it is
inherently error prone. However, in the context of a small amount of data and lack of adequate tool
support, it is a perfectly viable option. One example: it does not pay off to invest resources into
developing a tool for migrating five state-charts in a UML model. Simply do it by hand!
The ideal mechanism for migrating data is to use a built-in export/import tool. You simply
export the data out of the old tool and import it into the new tool. Examples of this kind of migration
are the clearfsimport command for IBM Rational ClearCase® importing data from a file system,
the Comma Separated Value (CSV) and Microsoft Word import mechanisms of IBM Rational
RequisitePro®, the CSV import of the ClearQuest Import Tool, the IBM Rational TestManager to
ClearQuest Test Manager migration tool (see Berry, et al, parts 1 and 2, and Mirchandani 2007),
the Word and Excel import of the Rational Manual Tester and the XML Metadata Interchange
(XMI), IBM Rational Rose®, and the Rose XDE Import functions of Rational Software Modeler, as
shown in Figure 1.
In the absence of built-in export/import tools one may look for third party solutions. These may
come in the form of small migration scripts or utilities made available e.g. on developerWorks
by Rational consultants or users (see Karlsen, 2004 and 2006). Third party solutions may also
come in the form of complete migration tools such as the Rose to ParadigmPlus converter and
NewCodes Legacy Migrator tool for migrating Oracle forms to Java 2 Enterprise Edition (J2EE)
technology. The ToolBus by Reischmann Informatik provides, for example, converters that can
convert UML models from a variety of UML tools to the Rational Software Modeler (and vice
versa). The converters can be adapted by Reischmann Informatik to the needs of a client on a
consulting basis if required. Another example of a UML conversion toolkit is Meta Integration
Model Bridge (MIMB) from Meta Integration Technology, Inc.
Figure 1: XDE to Rational Software Modeler model import
In the absence of any migration tool at all, a last resort is to develop home-grown migration scripts,
either from scratch or based on any existing source available. This kind of migration mechanism
has the highest level of flexibility, since the tool can be adapted as required. However, it is also the
migration mechanism with the highest initial cost, and it requires 1) that personnel with the required
programming and domain know-how are available, and 2) that the appropriate APIs are available.
Once developed, however, the converter can be tuned in detail to fit the project at hand.
Be aware that the migration tools are not necessarily functionally complete: RequisitePro imports
requirements and attributes but no traces, and XMI import with Rational Software Modeler imports
UML models but no diagrams. Even with fully fledged import tools, such as the Rose and XDE
imports of Rational Software Modeler, there are always details that need to be considered. This
is due to the nature of the UML tools: most UML tools implement a subset of UML and render
UML diagrams in their own way, using tool-specific styles, profiles, etc. Moreover, there are significant
differences in the meta-model of UML 1.x and UML 2, which leads to a non-trivial mapping
between the two standards resulting in UML 1.x constructs being morphed to UML 2 elements.
Migration is therefore typically a one-way process and not a synchronization; i.e., you export
information out of one UML tool and import it into another. Once this is done, you discard the
model of the old tool and use the new tool from then on. An exchange of UML models from one
tool to another and vice versa is not state-of-the-art due to the inherent differences between two
different UML tools.
It should be noted that the various mechanisms are not at all exclusive. One may decide to
perform the migration of the bulk of the data using either a built-in export/import mechanism or,
alternatively, a third-party solution. Manual migration, ideally backed up with guidelines, can
then be used for bits and pieces of information that are not covered by the migration tool or,
alternatively, for pre- or post-processing of the data manipulated by the tools (such as creating
new UML diagrams or polishing migrated diagrams in context of a migration from one UML
tool to another). Home-grown scripts can also be used in the context of an existing migration tool,
either to pre-process the data (e.g., to clean up a UML model) or to post-process it (generating
diagrams, providing missing pieces of information to make the model well-formed, etc.). Last but
not least, just as a compiler processes source code in phases, you may decide to migrate
the data in several phases -- e.g., by using comma-separated value (CSV) files as an intermediate
format to be imported into the new tool.
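As an illustration of such a phased approach, here is a minimal sketch in Python. The file names,
column names, and tab-separated source format are assumptions made purely for illustration and are
not details from any of the engagements described here. Phase one extracts records from a source-tool
export into a CSV intermediate file; phase two reads that CSV and hands each record to whatever
import mechanism the new tool offers.

    import csv

    # Phase 1: extract records from a (hypothetical) source-tool export into a CSV
    # intermediate file. The source is assumed to be a simple tab-separated dump.
    def export_to_csv(source_dump, intermediate_csv):
        with open(source_dump, encoding="utf-8") as src, \
             open(intermediate_csv, "w", newline="", encoding="utf-8") as dst:
            writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
            writer.writerow(["ID", "Text", "Priority", "Status"])  # assumed target columns
            for line in src:
                fields = line.rstrip("\n").split("\t")
                if len(fields) >= 4:                 # skip malformed lines
                    writer.writerow(fields[:4])

    # Phase 2: read the intermediate CSV and hand each record to the import step
    # of the new tool (represented here by a placeholder callback).
    def import_from_csv(intermediate_csv, import_record):
        with open(intermediate_csv, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                import_record(row)                   # e.g., call the new tool's API or CLI

    if __name__ == "__main__":
        # File names are assumptions; replace them with the real export and import paths.
        export_to_csv("source_dump.txt", "requirements.csv")
        import_from_csv("requirements.csv", lambda row: print(row["ID"], row["Text"]))

Keeping the CSV file as an explicit intermediate result also makes it easier to inspect and clean up the
data between the two phases.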
Activities and process for migration
When I initially started getting engaged in migrating projects to Rational solutions, I was looking
for a fast path for performing the migration consisting of the following steps: 1) analyze the source
data, 2) select the migration method, 3) invoke the migration tool, and 4) check the resulting target
data. This fast path is an ideal case that applies in some situations, but definitely not in all.
More complex migration projects are just like any other projects that involve introducing Rational
tools into an organization. Such a project will require project management, with well-defined
plans, estimates, and associated risk management. Migrating an existing project is often more
challenging than when a project is started from scratch, since a migration project is usually done
in the context of a productive system. The downtime and risks involved in swapping to a new
technology must therefore be carefully minimized to avoid a Big Bang at the end.
Just as in any other project introducing Rational tools, the process for using the tool must be
defined as well. Without a process, the introduction of the tool is likely to fail. In this context it is
important to avoid a cardinal mistake: adopting the same process for the new tool as the one
that was in place for the old tool, thus attempting to make the new tool look like the old
one. That reduces the benefits of the change. A better way is to adapt the best practices of the
new tool to the existing organization.
Moreover, integration with other tools must be considered as well. This may concern, for example,
the integration of Rational Software Modeler with a configuration management system, in which
case guidelines must be developed for creating model fragments and for working with the model in
context of a team -- e.g., do you allow parallel work on a model file or prohibit parallel work?
Finally, users will need to be trained in the use of the new system. This is not a requirement unique
to migration projects, since all projects that introduce new development tools require time for
training. However, there may be some special concerns -- e.g., users who are used to working
with a repository-based UML tool and need to get acquainted with a file-based UML tool such as
Rational Software Modeler already have solution patterns in mind that will not work in the new
context. The course materials should therefore ideally be customized to deal with such situations
in order to explain the new mode of operation and properly position the best practices of the new
tool.
In fact, migration can be seen as an instance of introducing a Commercial Off the Shelf (COTS)
tool in an organization. The IBM Rational Unified Process® (RUP®) for COTS (see Péraire,
2005) covers all phases relevant for introducing new tools. It also identifies a number of activities
(Specify Data Migration, Perform Data Migration) and artifacts of relevance to the migration (Data
Migration Specification, Source Data, Target Data, Data Migration Evaluation).
When performing a migration project there are specific activities that one can run through that
go beyond the ideal fast-path track. A realistic workflow is outlined in the activity diagram in
Figure 2. The following activities are usually relevant for conducting non-trivial migrations:
Define scope. Interview the various stakeholders, including the sponsor and identified users, and
collect the requirements and stakeholder requests. Analyze the existing tool environment and
document findings, application details, challenges, risks, and mitigations. Then define the scope of
the migration efforts.
Define migration mapping. During this activity the source data is analyzed and the subset of the
data that needs to be migrated is identified. The target domain is identified, and the mapping
between the source data and the target data is defined.
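Such a mapping can often be written down as a small declarative table before any tooling is built.
The following sketch in Python uses invented attribute and value names purely for illustration; it is
not taken from any of the migrations described in this article.

    # Hypothetical migration mapping: source attribute -> target attribute,
    # plus value-level conversions for enumerated fields. All names are invented.
    ATTRIBUTE_MAP = {
        "No":    "ReqID",
        "No.":   "ReqID",              # schema variants map to the same target field
        "Text":  "Requirement Text",
        "Prio":  "Priority",
        "State": "Status",
    }

    VALUE_MAP = {
        "Priority": {"1": "High", "2": "Medium", "3": "Low"},
        "Status":   {"new": "Proposed", "ok": "Approved"},
    }

    def map_record(source_record):
        """Translate one source record (a dict) into the target schema."""
        target = {}
        for src_name, value in source_record.items():
            tgt_name = ATTRIBUTE_MAP.get(src_name)
            if tgt_name is None:
                continue               # attribute is out of scope for the migration
            target[tgt_name] = VALUE_MAP.get(tgt_name, {}).get(value, value)
        return target

    print(map_record({"No": "42", "Text": "The system shall log in users.", "Prio": "1"}))

Writing the mapping down in this form makes it easy to review with the client before any conversion
code is implemented.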
Determine migration mechanisms. During this activity the various options for migrating the
source model are analyzed, i.e. the various possible migration tools are evaluated and the
particular migration mechanism is determined.
Define migration plan. For large or more complex migration efforts it may be relevant to define a
migration plan identifying the various steps involved in performing the migration. Usually, however,
it is only possible to define this plan in detail after having run through one or more test migrations first.
Figure 2: Migration activities
Prepare target environment. The target environment is configured according to the process and
requirements. For a tool like RequisitePro that would mean configuring the test (or production)
databases according to the Requirement Management Plan. For a tool like ClearQuest Test
Manager it would imply creating the database and configuring the ClearQuest Test Manager
Schema.
Implement conversion scripts. Any missing tools needed in order to migrate or pre-/post-
process the data shall be implemented (or adapted from existing scripts) and tested.
Pre-process source data. It may be necessary to get the source data into a form in which it can be
processed by the migration tool. The source data may exhibit validation errors, or it may be in a
less canonical form than what is required by the new tool. In such a case a cleanup procedure may
be relevant. Pre-processing may be manual, automatic, or a mix of both.
Perform migration. During this activity the (pre-processed) source data is migrated following the
steps outlined in the migration plan. This usually boils down to calling the migration tool with the
relevant parameters and feeding it the pre-processed source data. This will produce initial target
data as a result. There may also be other artifacts produced, such as a migration log, that need to be
looked into.
Evaluate result. During this activity the target data is evaluated -- for example, with respect to the
scope of the migration effort. Evaluation can be done by looking into the model itself, by generating
reports, by invoking the validation function of the new tool, by running custom scripts checking the
data and, if available, by looking into the corresponding log of the migration tool. In the context of very
large data sets it may become relevant to split the evaluation work among several team members or
actual users of the tool.
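A custom check script of this kind can be as simple as comparing record counts and flagging missing
mandatory attributes. The sketch below is a hypothetical Python example; the file names, column
names, and completeness rules are assumptions, not part of any specific engagement.

    import csv

    REQUIRED_FIELDS = ["ReqID", "Requirement Text", "Status"]   # assumed mandatory attributes

    def evaluate(source_csv, target_csv):
        """Compare source and target exports and report obvious migration gaps."""
        with open(source_csv, newline="", encoding="utf-8") as f:
            source_rows = list(csv.DictReader(f))
        with open(target_csv, newline="", encoding="utf-8") as f:
            target_rows = list(csv.DictReader(f))

        print(f"Source records: {len(source_rows)}, target records: {len(target_rows)}")
        if len(source_rows) != len(target_rows):
            print("WARNING: record counts differ - check the migration log.")

        # Flag target records that lost mandatory attributes during the migration.
        for i, row in enumerate(target_rows, start=1):
            missing = [name for name in REQUIRED_FIELDS if not row.get(name, "").strip()]
            if missing:
                print(f"Record {i}: missing mandatory attributes {missing}")

    evaluate("source_export.csv", "target_export.csv")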
Post-process target data. Post-processing is by nature similar to pre-processing, except that the
subject has changed to the target data. During this step the target data is for example validated
and validation errors are removed -- either manually or automatically by running a script.
Document experiences. Write documentation in the form of migration guides and the tips and tricks
that were required to succeed. Migration projects are not performed every day, and documenting all
details may come in handy the next time a similar migration needs to be conducted.
Migration projects are highly iterative. Frequently a proof of concept (PoC), e.g. following the fast
path, is required even in the inception phase before the solution can be addressed at all. The PoC
can then be followed by one or more Pilots before the real migration is attempted. During initial
attempts the outcome is frequently that the target data is not completely okay. Unexpected issues
are likely to appear that may require re-evaluation of the migration scope or the migration method.
In simpler cases, a need is identified for corrective actions that can be handled simply by post-
processing the resulting target data. During later iterations, you may be lucky and determine that
the target data is complete and within scope, in which case the data migration can be performed
using the production system.
There is much more to migration projects than just data conversion. Take for example the
migration of UML models from one UML tool to another. Beyond converting the models, there
may also be a need to migrate existing custom scripts and reports. For each script you'll need to
determine if it really is required in the new UML tool; it is possible that the old script's functionality
comes out-of-the-box in the new tool, or the script may provide a solution that has become
obsolete. Alternatively, if the script is still relevant, you must decide in which form the solution
shall be provided -- in Rational Software Modeler terms, as a pluglet, a profile with
validation rules, a pattern, or a transformation. Following that analysis, the scripts are usually re-
implemented from scratch.
General rules and best practices
Looking back at the migration projects that I have conducted over the years, I have come up with
a number of rules I believe you'll find helpful. My list is probably not complete, but it presents a
number of tips that are relevant to starting a new migration project:
1. Every migration project is different. Even two projects using the same tool will do it in
different ways, utilizing different subsets of the tool, with different setups according to project
specific requirements. Requirements regarding the migration will therefore be different, as will
the migration itself.
2. The devil is in the details. Even in the context of migration projects for which there exists
an out-of-the-box migration mechanism, one should not expect a migration without some
surprises, such as special characters in the input data causing the conversion tool to fail, or
other trivial issues with a high impact on the outcome. These issues can usually be resolved
but require resources. One way to reduce the cost is to use available reusable assets (see
rule 6 below).
3. Migration is an iterative process. Migration is not a process that is repeated often. Usually it
is something that is done a limited number of times within a limited period of time, with unique
characteristics every time it is pursued. As a side effect of rules 1 and 2 above, one should
expect a migration to be a trial and error process to be done iteratively until a satisfactory
result has been achieved. This may involve a PoC and a Pilot before the final transition is
made.
4. Balance the scope of the migration. Often "good enough" really is good enough. There
may always be details that need special consideration. If there are problems migrating
specific data using a migration tool one should identify other solutions, and also question
the relevance of migrating that piece of information. It does not pay to invest resources into
migrating data that a) is not used anymore since there is no business need, or b) is hopelessly
outdated or inconsistent. Likewise, a small amount of data does not need a tool but can be
migrated by hand.
5. The result counts, not the method. As professionals, we often strive toward perfection,
forgetting for a moment that "the perfect is the enemy of the good." In the context of a
migration project it is clearly the result that counts -- i.e., to get the data moved to the
target tool. How elegantly this appears to happen has no relevance as long as the costs in
performing the migration are acceptable.
6. Reuse and adapt what is available. As in many other projects, it makes a difference whether the
project is started from scratch or whether there are reusable assets in stock that can be adapted to
the current context. If one has migrated a project that maintained test cases in Microsoft Excel
to Rational TestManager, then the cost of doing it a second time, even in the context of an Excel
spreadsheet with a different format, is much less than starting from scratch. The proactive
variant of this rule is to develop reusable, maintainable, and documented migration scripts
during the current engagement that are likely to be adaptable in future engagements.
7. Outsource whenever there is a need. If there is no in-house experience in conducting a
migration, or if the relevant tools and techniques are not at hand, find someone who has
done this before and outsource the job. Rational consultants as well as partner companies
have migration experience, and frequently they offer reusable assets that can be adapted to
perform a migration faster and at lower cost than a project starting out from scratch.
8. Document your experiences. Just as on any other project, it is good practice to document
the team's experiences. Migration occurs infrequently, and having all tips and tricks,
questionnaires, guidelines, checklists, findings, recommendations, etc., documented will
usually pay off next time a similar migration is attempted.
9. Publish your assets. Last but not least, migration is not rocket science, though it is
potentially complicated. Even small reusable assets -- such as a script that appropriately
quotes strings in CSV files as a pre-processing step before the file is imported into
ClearQuest -- may save a lot of time in context of an engagement.
Case studies
In this section I will offer a few case studies, and I will refer to the above rules using abbreviations
(e.g., R1, R8) whenever they are relevant for the case study.
Case Study 1: Requirements management
This migration effort took place in the context of a requirements management workshop with a
client that used Microsoft Word for requirements management. The motivation for introducing
Rational tools was improved functionality and improved lifecycle support. Initially, we looked into
the existing documents and requirements. Having configured RequisitePro according to the client's
requirements, we were faced with the challenge of importing the existing requirements kept in
Word tables, with one table for each requirement defining the requirement text, the attributes, and
the traces. We decided to do the migration in several steps (R5) using a mixture of out-of-the-
box import functions and home-grown scripts. During a pre-processing step the Word tables were
scanned and the information was converted into a CSV file using a custom Visual Basic script that
was written by the client representative in a short time. The produced CSV file was then imported
into RequisitePro using the RequisitePro CSV import wizard. In a post-processing step, I re-used
and adapted an available script (see RequisitePro extensibility interface sample) that could convert
trace information kept in RequisitePro attributes to proper trace links (R6). The migration took
about a day and required a few iterations (R3).
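The pre-processing script in this engagement was written in Visual Basic by the client representative.
Purely to illustrate the idea of scanning one-table-per-requirement Word documents into a CSV file,
the following sketch does something comparable in Python, assuming the third-party python-docx
library and a hypothetical table layout with label/value pairs; the file names and attribute names are
also assumptions.

    import csv
    from docx import Document   # third-party package: python-docx

    def word_tables_to_csv(docx_path, csv_path):
        """Scan one-table-per-requirement Word documents and write a CSV for import.

        Assumes each table holds label/value rows such as 'ID', 'Text', 'Priority'.
        """
        doc = Document(docx_path)
        rows = []
        for table in doc.tables:
            record = {}
            for row in table.rows:
                if len(row.cells) >= 2:
                    label = row.cells[0].text.strip()
                    value = row.cells[1].text.strip()
                    record[label] = value
            if record:
                rows.append(record)

        fieldnames = ["ID", "Text", "Priority", "Status"]        # assumed attribute names
        with open(csv_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction="ignore")
            writer.writeheader()
            writer.writerows(rows)

    word_tables_to_csv("requirements.docx", "requirements.csv")

The resulting CSV file can then be fed to the import wizard of the target tool, as was done with the
RequisitePro CSV import in this case study.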
Case Study 2: Requirements management
In this engagement I was faced with a large distributed project that had defined the requirements
using numerous RequisitePro projects -- one for each major part of the application. The projects
were connected using cross-project traces and the project schemas were almost, but not exactly,
identical (e.g., the attribute "No" was used in one project, "No." in another, etc.). The main issue,
however, was that it had become impossible to define consolidated views over all sub-projects,
and a consolidation into one RequisitePro project was therefore requested. The motivation for
change was to achieve improved functionality with respect to reporting. The migration required the
development of a home-grown converter.
Having captured the requirements, I continued looking for possible solutions, but it became clear
that none of the existing tools could do what was required. The only option was to implement a
converter that could merge all the individual projects, including all the requirement documents, into
one target project and at the same time deal with variations in the schemas of the source projects.
During an initial migration, I developed a small prototype (receiving bits and pieces of code from
development) for the critical part: migration of document-based requirements. This PoC turned
out to be successful. The next step was to define the mapping from source projects to the target
project with the client and agree that the migration of change histories was out of scope since it
would have unnecessarily complicated the migration (R4).
Development of the converter was a non-trivial task that took about twenty days of work.
The migration itself was attempted a couple of times (R1) using throwaway Microsoft Access
databases and finally went through using the production database without any issues in less
than a week. The converter (RequisitePro Project Merge Script) has since been published on
developerWorks (see Karlsen, 2004) and used in a couple of other engagements (R9). In one of
these engagements I managed to migrate a RequisitePro project to another schema with less
than two days of work using the tool with a single adaptation that was needed in order to avoid
premature termination in the context of special characters in requirement document names (R2).
Case Study 3: Model migration
In this engagement I was faced with a project that used Hierarchical Object Oriented Design
(HOOD), the design method of the European Space Agency. The motivation to change centered
on the need to modernize the development process, given that the design tool was facing end-of-
life. A migration to UML and Rational Rose for Ada was therefore initiated.
In the initial Inception and Elaboration phases I investigated whether any existing
conversion tools were available (R6), but none were. The only option was to develop a home-grown
converter from HOOD to UML. Before doing so I defined the HOOD to UML mapping, taking the
actual HOOD designs and requirements of the client into consideration. The converter was finally
tested, leading to a temporary analysis model. During this initial Elaboration phase we also reverse
engineered the Ada code to yield a design model. That made it possible to determine the final
scope and mechanism for the migration, which led to the definition of an implementation plan for
the Construction and Transition phases.
During several pre-processing steps, the architecture of the application was changed to avoid
cyclical imports between subsystems since Rose for Ada could not deal with such dependencies
(R2). An approach for exporting annotations out of code and into the model using pre-existing
components as well as new scripts was defined as well (R6). Scripts were furthermore developed
to remove irrelevant details from the design model. This was a surprising lesson learned: In large
complex projects, it does not make sense to reverse engineer every detail in the application;
it is the overview that counts (quite in contrast to the approach taken by users evaluating the
reverse engineering capabilities of a tool). Next, the reverse engineering process was undertaken
iteratively (R6). The various documentation fields from the converted HOOD model, as well as the
extracted code annotations, were then imported into the model during a post-processing phase.
Also during this phase the design model was synchronized with code so that changes in the model
could be forward engineered to code. Here we made the compromise simply to forward engineer
package/module specifications but not the code in the package bodies in order to keep the work at
a tolerable level (R4).
This was a complicated endeavor (R5), but we finally managed to complete the migration,
including provisioning of compliance-relevant reports, training of the users (which was outsourced
to a third-party consulting company) (R7), and transitioning into production. The migration effort
required approximately 100 days of consulting on the part of Rational and a significant investment
by the client as well.
Case Study 4: Model migration
Not all model migration projects are so large that they require a long time and a large amount
of effort (R1). This case study addresses the migration from Rational XDE to Rational Software
Modeler, which was motivated by XDE facing end-of-life. The migration took less than a week.
During an initial PoC the models were identified that should be migrated and then passed to the
Rational Software Modeler XDE Import Wizard. On this basis a readiness plan was defined
outlining the various tasks to be performed. Next we looked into the log file produced by the PoC
migration, analyzed the various groups of error messages, and identified the resulting actions. The
need for a pre- as well as a post-processing step was identified.
During pre-processing the models were validated and the validation errors resolved according to
instructions defined in the migration plan. This involved naming anonymous association ends (if
required) and deleting broken references or links that could not be mapped to UML 2 semantics.
After the models were migrated, the error messages in the migration log were considered one
by one using as a basis the guidelines previously defined. The error log reported some Java
exceptions during the migration of a few class diagrams, but investigations showed that the model
had not lost information and that we could repair the class diagrams by invoking "Filter>Show
Relationships" to render all relationships on the diagram (R2, R4, R5). Moreover, the integration
with RequisitePro was configured as well in order to avoid losing links between model elements
and requirements.
Next, the target model was validated, which resulted in more than 400 error messages. Most of
the errors were due to unnamed properties of association links and interface operations being
declared as private. Rather than correcting them by hand, I got hold of a script from a colleague
(R6) and adapted it to correct these errors by introducing role names and changing the visibility
of interface features to public. Although the migration was done in a relatively short time, I am
convinced that the client would not have been able to succeed on their own (R7).
Case Study 5: Test plan migration
In this engagement I was faced with a project that maintained a bulk set of test cases in Excel
files that needed to be imported into Rational TestManager. The project was about to introduce
the Rational Team Unifying Platform in order to achieve improved lifecycle support. It was clear
up front that this migration effort would require the development of a custom script. Having looked
into the existing test cases and determined the requirements of the customer, I defined the
mapping between fields in the Excel spreadsheets and the test plan and test case attributes of
TestManager. Next, TestManager was configured with the appropriate custom attributes for
test cases. I then implemented the migration script taking as a basis code fragments defined in (B.
Richmond, 2004) (R6). Since I had full control over the converter, neither pre- nor post-processing
steps were needed. However, due to changing requirements several iteration and evaluation
sessions were required before the migration came to an end (R3). The migration project took less
than two weeks of work.
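The actual migration script was built on code fragments for the TestManager API and is not reproduced
here. The sketch below only illustrates the Excel-reading and field-mapping half of such a script in
Python, assuming the third-party openpyxl library; the file name, column headings, and mapping are
invented for illustration.

    from openpyxl import load_workbook   # third-party package: openpyxl

    # Hypothetical mapping from spreadsheet column headings to test case attributes.
    FIELD_MAP = {"Test Name": "Name", "Steps": "Description", "Prio": "Priority"}

    def read_test_cases(xlsx_path):
        """Yield one dict per spreadsheet row, renamed according to FIELD_MAP."""
        ws = load_workbook(xlsx_path, read_only=True).active
        rows = ws.iter_rows(values_only=True)
        headings = [str(h) for h in next(rows)]          # first row holds the headings
        for row in rows:
            record = dict(zip(headings, row))
            yield {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}

    for test_case in read_test_cases("testplan.xlsx"):
        print(test_case)   # in the real migration these records fed the TestManager import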
Case Study 6: Requirements, test, and activity migration
In this engagement the client used Telelogic DOORS for requirements management, test
management, and change management, and the client requested a migration to RequisitePro,
TestManager, and ClearQuest, respectively. The migration concerned a single project that was not
in line with the tool standards. During an initial session I discussed the requirements and scope
of the migration and looked into the input data. I then defined the migration mechanisms reusing
experience from previous projects. I also looked at an IBM internal "DOORS to RequisitePro"
migration guide (see Haddock, 2004) as a source of inspiration. The idea was basically to export
CSV files out of DOORS and then import these files into the Rational tools using out-of-the-box
import features combined with home-grown scripts. With that in mind a project plan was defined
with an estimated duration of six days.
Initial test migrations were undertaken based on CSV export/import re-using existing assets (R6).
Pre-processing was required, however, in order to bring the information in DOORS into a proper
state (identifiers were represented as strings and weren't unique, enumeration literals were not
well-defined, etc.). This required several iterations (R2, R3). The requirements were imported into
RequisitePro, followed by a post-processing step to create traces, similar to Case Study 1. The
test cases were imported by adapting the import script from Case Study 5; however, the migration
required a quarter of the time, since the scripts could be adapted rather than being developed from
scratch. The change requests were imported using the ClearQuest Import Tool, but initially the
migration failed. The errors were caused by special characters and line breaks in description fields
(R2). A script forwarded by a colleague finally solved the issue and the migration was finished
within the estimated time (R6).
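The colleague's script is not reproduced here, but the underlying idea -- rewriting the export so that
description fields containing special characters and embedded line breaks are consistently quoted
before the ClearQuest Import Tool reads the file -- can be sketched in Python as follows; the file
names are assumptions.

    import csv

    def requote_csv(raw_csv, clean_csv):
        """Rewrite a CSV export so that every field is consistently quoted.

        Embedded line breaks inside quoted fields are collapsed to spaces so that
        one record occupies exactly one physical line in the output.
        """
        with open(raw_csv, newline="", encoding="utf-8") as src, \
             open(clean_csv, "w", newline="", encoding="utf-8") as dst:
            reader = csv.reader(src)
            writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
            for row in reader:
                writer.writerow([field.replace("\r", " ").replace("\n", " ") for field in row])

    requote_csv("doors_export.csv", "clearquest_import.csv")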
Case Study 7: Requirements management
In this engagement, the client was an IT department that was looking for a requirements
management solution to fill a gap in their tool chain. The stakeholders in the business department
delivered requirements in form of Microsoft Word tables, with one table for each requirement. The
table contained the requirement ID, the requirement text, as well as the various attributes, such
as priority and status. I suggested developing a RequisitePro plug-in (the RequisitePro import
table tool) (see Karlsen, 2006) that could import the information into RequisitePro. I already had
fragments of code available -- e.g., for managing the requirements documents from Case Study 2
and for scanning the Word document from Case Study 1 (R6). During initial steps, I collected the
requirements of the client and defined a migration/project plan. I then continued and configured
RequisitePro according to the requirements management approach of the client. The plug-in was
developed, installed, and tested during two iterations (R3), and I finally managed to perform a
PoC for providing the business and IT department with an integrated approach to requirements
management within eight days' work. I deliberately made the plug-in generally applicable by
allowing the user to configure it with respect to table format and position of the attributes in the
table, knowing that keeping requirements in tables was a helpful Word solution pattern I had come