The benefits of introducing new IBM Rational tools into an existing
project often clearly outweigh the difficulties associated with making a change midstream. Read about the various techniques you can use to manage the process.
developerWorks® ibm.com/developerWorks/
Migrating existing projects to Rational solutions
Introducing new tools into existing projects
There are a variety of reasons development teams need to disrupt their daily flow by introducing
new products into an existing project. Typical motivations for this include:
1. Modernization. Existing platforms, tools, or methods have reached end-of-life or do not offer
adequate support and need to be replaced by more modern solutions. An example in this
category is the migration from a structured design method, whose tool is facing end-of-life, to an
object-oriented design method using the Unified Modeling Language (UML).
2. Standardization/corporate governance. The project is not aligned with corporate standards
and is consequently forced to migrate to a solution mandated by those standards. This
typically occurs in the context of mergers, takeovers, or internal reorganizations.
3. Improved functionality/automation. Existing solutions do not offer the degree of automation
or functionality required to support the project and must therefore be replaced by improved
solutions. An example in this category is the transition from a UML tool that lacks Model
Driven Architecture (MDA) support to a modeling tool that offers these capabilities.
4. Improved interoperability/lifecycle support. A change in the tool environment is made in
one discipline in order to improve interoperability and lifecycle support with the rest of the
development disciplines. An example in this category is the change from one UML tool to
another in order to achieve a proper integration with a requirement management system or
with an Integrated Development Environment (IDE).
5. Cost reduction. The existing solutions have become too expensive, and there are offerings
available that would, if adopted, reduce costs -- e.g., lower maintenance costs that still yield
a return on investment once migration and training costs are taken into account. Such a
migration would, for example, occur if a user of IBM Rational ClearQuest® and a third-party
test management system decided to migrate to ClearQuest Test Manager.
Of course, other reasons might motivate a change in procedures and tools, but these are the ones
that I have come across during my many years supporting customers in their migration efforts.
Migration mechanisms
Migration can be seen as a process of transferring data between computer systems; it is to some
extent akin to data conversion (a transition that changes format and function) and data
transformation (a transition that maps source data to target data). There are several ways in which
a migration of source data maintained in an old tool to target data in a new tool can be performed:
• Manual migration
• Built-in export/import tool
• Third party tool
• Home-grown migration scripts
The simplest form of migration is a manual migration. This kind of migration is tedious and has
a high cost in terms of money and time when a large quantity of data is involved. Moreover, it is
inherently error prone. However, in the context of a small amount of data and a lack of adequate tool
support, it is a perfectly viable option. One example: it does not pay off to invest resources into
developing a tool for migrating five state-charts in a UML model. Simply do it by hand!
The ideal mechanism for migrating data is to use a built-in export/import tool. You simply
export the data out of the old tool and import it into the new tool. Examples of this kind of migration
include:
• the clearfsimport command for IBM Rational ClearCase®, which imports data from a file system
• the Comma Separated Value (CSV) and Microsoft Word import mechanisms of IBM Rational RequisitePro®
• the CSV import of the ClearQuest Import Tool
• the IBM Rational TestManager to ClearQuest Test Manager migration tool (see Berry, et al., parts 1 and 2, and Mirchandani, 2007)
• the Word and Excel import of Rational Manual Tester
• the XML Metadata Interchange (XMI), IBM Rational Rose®, and Rose XDE import functions of Rational Software Modeler, as shown in Figure 1
In the absence of built-in export/import tools, one may look for third-party solutions. These may
come in the form of small migration scripts or utilities made available, for example, on
developerWorks by Rational consultants or users (see Karlsen, 2004 and 2006). Third-party
solutions may also
come in the form of complete migration tools such as the Rose to ParadigmPlus converter and
NewCodes Legacy Migrator tool for migrating Oracle forms to Java 2 Enterprise Edition (J2EE)
technology. The ToolBus by Reischmann Informatik provides, for example, converters that can
convert UML models from a variety of UML tools to the Rational Software Modeler (and vice
versa). The converters can be adapted by Reischmann Informatik to the needs of a client on a
consulting basis if required. Another example of a UML conversion toolkit is Meta Integration
Model Bridge (MIMB) from Meta Integration Technology, Inc.
Figure 1: XDE to Rational Software Modeler model import
In the absence of any migration tool at all, a last resort is to develop home-grown migration scripts,
either from scratch or based on any existing source available. This kind of migration mechanism
has the highest level of flexibility, since the tool can be adapted as required. However, it is also the
migration mechanism with the highest initial cost, and it requires (1) that personnel with the
required programming and domain know-how are available, and (2) that the appropriate APIs are
available. Once developed, however, the converter can be tuned in detail to fit the project at hand.
Be aware that the migration tools are not necessarily functionally complete: RequisitePro imports
requirements and attributes but no traces, and XMI import with Rational Software Modeler imports
UML models but no diagrams. Even with fully fledged import tools, such as the Rose and XDE
imports of Rational Software Modeler, there are always details that need to be considered. This
is due to the nature of the UML tools: Most UML tools implement a subset of UML and render
UML diagrams their own way using tool-specific styles, profiles etc. Moreover, there are significant
differences in the meta-model of UML 1.x and UML 2, which leads to a non-trivial mapping
between the two standards resulting in UML 1.x constructs being morphed to UML 2 elements.
Migration is therefore typically a one-way process and not a synchronization; i.e., you export
information out of one UML tool and import it into another. Once this is done, you discard the
model of the old tool and use the new tool from then on. A round-trip exchange of UML models
between two tools is not state of the art, due to the inherent differences between any two
UML tools.
It should be noted that the various mechanisms are not at all exclusive. One may decide to
base the migration of the bulk of the data on either a built-in export/import mechanism or,
alternatively, a third-party solution. Manual migration, ideally backed up with guidelines, can
then be used for bits and pieces of information that are not covered by the migration tool or,
alternatively, for pre- or post-processing of the data manipulated by the tools (such as creating
new UML diagrams or polishing migrated diagrams in context of a migration from one UML
tool to another). Home-grown scripts can also be used alongside an existing migration tool,
either to pre-process the data (e.g., to clean up a UML model) or to post-process it (generating
diagrams, providing missing pieces of information to make the model well-formed, etc.). Last but
not least, just as a compiler processes source code in phases, you may decide to migrate
the data in several phases -- e.g., by using comma-separated value (CSV) files as an intermediate
representation to be imported into the new tool.
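Such a phased migration can be sketched in Python. The tab-separated source format, the field names, and the helper names below are purely illustrative assumptions, not part of any Rational tool:

```python
import csv
import io

def export_phase(raw_records):
    """Phase 1: parse hypothetical tab-separated records exported
    from the old tool into dictionaries keyed by field name."""
    fields = ["id", "name", "status"]
    return [dict(zip(fields, line.split("\t"))) for line in raw_records]

def to_csv_phase(records):
    """Phase 2: write the records to CSV as an intermediate
    representation ready for the new tool's import wizard."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "status"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# Two records as they might come out of the (hypothetical) old tool
raw = ["R1\tLogin\tApproved", "R2\tLogout\tProposed"]
intermediate = to_csv_phase(export_phase(raw))
```

Splitting the work this way makes each phase individually testable, and the intermediate CSV can be inspected or hand-corrected before the final import.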
Activities and process for migration
When I initially started getting engaged in migrating projects to Rational solutions, I was looking
for a fast path for performing the migration, consisting of the following steps: (1) analyze the source
data, (2) select the migration method, (3) invoke the migration tool, and (4) check the resulting
target data. This fast path is an ideal that can be followed in some cases, but definitely not in all.
More complex migration projects are just like any other projects that involve introducing Rational
tools into an organization. Such a project will require project management, with well-defined
plans, estimates, and associated risk management. Migrating an existing project is often more
challenging than when a project is started from scratch, since a migration project is usually done
in the context of a productive system. The downtime and risks involved in swapping to a new
technology must therefore be carefully minimized to avoid a Big Bang at the end.
Just as in any other project introducing Rational tools, the process for using the tool must be
defined as well. Without a process, the introduction of the tool is likely to fail. In this context it is
important to avoid a cardinal mistake: adopting the same process for the new tool as the one
that was in place for the old tool, thus attempting to make the use of the new tool look like the old
one. That reduces the benefits of the change. A better way is to adapt the best practices of the
new tool to the existing organization.
Moreover, integration with other tools must be considered as well. This may concern, for example,
the integration of Rational Software Modeler with a configuration management system, in which
case guidelines must be developed for creating model fragments and for working with the model in
context of a team -- e.g., do you allow parallel work on a model file or prohibit parallel work?
Finally, users will need to be trained in the use of the new system. This is not a requirement unique
to migration projects, since all projects that introduce new development tools require time for
training. However, there may be some special concerns -- e.g., users who are used to working
with a repository-based UML tool and need to get acquainted with a file-based UML tool such as
Rational Software Modeler already have solution patterns in mind that will not work in the new
context. The course materials should therefore ideally be customized to deal with such situations
in order to explain the new mode of operation and properly position the best practices of the new
tool.
In fact, migration can be seen as an instance of introducing a Commercial Off the Shelf (COTS)
tool in an organization. The IBM Rational Unified Process® (RUP®) for COTS (see Péraire,
2005) covers all phases relevant for introducing new tools. It also identifies a number of activities
(Specify Data Migration, Perform Data Migration) and artifacts of relevance to the migration (Data
Migration Specification, Source Data, Target Data, Data Migration Evaluation).
When performing a migration project, there are specific activities that one can run through that
go beyond the ideal fast-path track. A realistic workflow is outlined in the activity diagram in
Figure 2. The following activities are usually relevant for conducting non-trivial migrations:
Define scope. Interview the various stakeholders, including the sponsor and identified users, and
collect the requirements and stakeholder requests. Analyze the existing tool environment and
document findings, application details, challenges, risks, and mitigations. Then define the scope of
the migration efforts.
Define migration mapping. During this activity the source data is analyzed and the subset of the
data that needs to be migrated is identified. The target domain is identified, and the mapping
between the source data and the target data is defined.
Determine migration mechanisms. During this activity the various options for migrating the
source model are analyzed, i.e. the various possible migration tools are evaluated and the
particular migration mechanism is determined.
Define migration plan. For large or more complex migration efforts it may be relevant to define a
migration plan identifying the various steps involved in performing the migration. Usually, however,
it is only possible to define this plan in detail after having run through one or more test migrations.
Figure 2: Migration activities
Prepare target environment. The target environment is configured according to the process and
requirements. For a tool like RequisitePro that would mean configuring the test (or production)
databases according to the Requirement Management Plan. For a tool like ClearQuest Test
Manager it would imply creating the database and configuring the ClearQuest Test Manager
Schema.
Implement conversion scripts. Any missing tools needed in order to migrate or pre-/post-process
the data should be implemented (or adapted from existing scripts) and tested.
Pre-process source data. A need may exist to get the source data in a form where it can be
processed by the migration tool. The source data may exhibit validation errors or it may be in a
less canonical form than what is required by the new tool. In such a case a cleanup procedure may
be relevant. Pre-processing may be manual or automatic or a mix of both.
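A cleanup step of this kind can be sketched as a short script; the record layout and the specific normalizations below are illustrative assumptions, not rules prescribed by any particular tool:

```python
import re

def clean_value(value):
    """Remove control characters and collapse runs of whitespace --
    the kind of input that commonly trips up import tools."""
    value = re.sub(r"[\x00-\x1f]", " ", value)
    return re.sub(r"\s+", " ", value).strip()

def preprocess(records):
    """Apply the cleanup to every field of every source record,
    where each record is a dict of field name to string value."""
    return [{k: clean_value(v) for k, v in r.items()} for r in records]
```

Keeping the cleanup in a script rather than doing it by hand pays off as soon as the migration has to be repeated for a second test run.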
Perform migration. During this activity the (pre-processed) source data is migrated following the
steps outlined in the migration plan. This usually boils down to calling the migration tool with the
relevant parameters and feeding it the pre-processed source data. This will produce initial target
data as a result. There may also be other artifacts produced, such as a migration log, that need to
be looked into.
Evaluate result. During this activity the target data is evaluated -- for example, with respect to the
scope of the migration effort. Evaluation can be done by looking into the model itself, by generating
reports, by invoking the validation function of the new tool, by running custom scripts checking the
data and, if available, by looking into the corresponding log of the migration tool. In the context of
very large data sets it may become relevant to split the evaluation work among several team
members or actual users of the tool.
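A custom checking script of the kind mentioned above might look like the following sketch; the required fields and the uniqueness rule are illustrative assumptions, not checks taken from any Rational tool:

```python
def evaluate(records, required_fields=("id", "text")):
    """Run simple sanity checks on migrated target records and
    return a list of human-readable findings (empty list = clean)."""
    findings = []
    seen_ids = set()
    for i, record in enumerate(records):
        # Check that every required field is present and non-empty
        for field in required_fields:
            if not record.get(field):
                findings.append(f"record {i}: missing '{field}'")
        # Check that identifiers are unique across the target data
        rid = record.get("id")
        if rid in seen_ids:
            findings.append(f"record {i}: duplicate id '{rid}'")
        seen_ids.add(rid)
    return findings
```

A report of findings like this is easy to split among several reviewers when the data set is large.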
Post-process target data. Post-processing is by nature similar to pre-processing, except that the
subject has changed to the target data. During this step the target data is for example validated
and validation errors are removed -- either manually or automatically by running a script.
Document experiences. Write documentation in the form of migration guides and the tips and
tricks that were required to succeed. Migration projects are not performed every day, and
documenting all details may come in handy the next time a similar migration needs to be conducted.
Migration projects are highly iterative. Frequently a proof of concept (PoC), e.g. following the fast
path, is required even in the Inception phase before the solution can be addressed at all. The PoC
can then be followed by one or more Pilots before the real migration is attempted. During initial
attempts the outcome is frequently that the target data is not completely okay. Unexpected issues
are likely to appear that may require re-evaluation of the migration scope or the migration method.
In simpler cases, a need is identified for corrective actions that can be solved simply by post-
processing the resulting target data. During later iterations, you may be lucky and determine that
the target data is complete and within scope, in which case the data migration can be performed
using the production system.
There is much more to migration projects than just data conversion. Take for example the
migration of UML models from one UML tool to another. Beyond converting the models, there
may also be a need to migrate existing custom scripts and reports. For each script you'll need to
determine if it really is required in the new UML tool; it is possible that the old script's functionality
comes out-of-the-box in the new tool, or the script may provide a solution that has become
obsolete. Alternatively, if the script is still relevant, you must decide in what form the solution
should be provided -- i.e., in Rational Software Modeler terms, as a pluglet, a profile with
validation rules, a pattern, or a transformation. Following that analysis, the scripts are usually
re-implemented from scratch.
General rules and best practices
Looking back at the migration projects that I have conducted over the years, I have come up with
a number of rules I believe you'll find helpful. My list is probably not complete, but it presents a
number of tips that are relevant to starting a new migration project:
1. Every migration project is different. Even two projects using the same tool will do it in
different ways, utilizing different subsets of the tool, with different setups according to project
specific requirements. Requirements regarding the migration will therefore be different, as will
the migration itself.
2. The devil is in the details. Even in the context of migration projects for which there exists
an out-of-the-box migration mechanism, one should not expect a migration without some
surprises, such as special characters in the input data causing the conversion tool to fail, or
other trivial issues with a high impact on the outcome. These issues can usually be resolved
but require resources. One way to reduce the cost is to use available reusable assets (see
rule 6 below).
3. Migration is an iterative process. Migration is not a process that is repeated often. Usually it
is something that is done a limited number of times within a limited period of time, with unique
characteristics every time it is pursued. As a side effect of rules 1 and 2 above, one should
expect a migration to be a trial and error process to be done iteratively until a satisfactory
result has been achieved. This may involve a PoC and a Pilot before the final transition is
made.
4. Balance the scope of the migration. Often "good enough" really is good enough. There
may always be details that need special consideration. If there are problems migrating
specific data using a migration tool one should identify other solutions, and also question
the relevance of migrating that piece of information. It does not pay to invest resources into
migrating data that is a) not used anymore, since there is no business need, or b) hopelessly
outdated or inconsistent. Likewise, a small amount of data does not need a tool but can be
migrated by hand.
5. The result counts, not the method. As professionals, we often strive toward perfection,
forgetting for a moment that "the perfect is the enemy of the good." In the context of a
migration project it is clearly the result that counts -- i.e., to get the data moved to the
target tool. How elegantly this appears to happen has no relevance as long as the costs in
performing the migration are acceptable.
6. Reuse and adapt what is available. As in many other projects, it makes a difference whether
the project is started from scratch or there are reusable assets in stock that can be adapted to
the current context. If one has migrated a project that maintained test cases in Microsoft Excel
to Rational TestManager, then the cost of doing it a second time, even in the context of an Excel
spreadsheet with a different format, is much lower than starting out from scratch. The proactive
variant of this rule is to develop reusable, maintainable, and documented migration scripts
during the current engagement that are likely to be adaptable in future engagements.
7. Outsource whenever there is need. If there is no in-house experience in conducting a
migration, or if the relevant tools and techniques are not at hand, find someone who has
done this before and outsource the job. Rational consultants as well as partner companies
have migration experience, and frequently they offer reusable assets that can be adapted to
perform a migration faster and at lower cost compared to a project starting out from scratch.
8. Document your experiences. Just as on any other project, it is good practice to document
the team's experiences. Migration occurs infrequently, and having all tips and tricks,
questionnaires, guidelines, checklists, findings, recommendations, etc., documented will
usually pay off next time a similar migration is attempted.
9. Publish your assets. Last but not least, migration is not rocket science, though it is
potentially complicated. Even small reusable assets -- such as a script that appropriately
quotes strings in CSV files as a pre-processing step before the file is imported into
ClearQuest -- may save a lot of time in context of an engagement.
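A quoting script of that kind can be sketched in a few lines, assuming the input is already valid comma-separated data; this is an illustrative pre-processing step, not part of the actual ClearQuest tooling:

```python
import csv
import io

def requote_csv(text):
    """Re-write a CSV stream so that every field is quoted, which
    guards against embedded commas and quotes confusing an import
    tool further down the line."""
    out = io.StringIO()
    reader = csv.reader(io.StringIO(text))
    writer = csv.writer(out, quoting=csv.QUOTE_ALL)
    for row in reader:
        writer.writerow(row)
    return out.getvalue()
```

Even a tiny utility like this is worth publishing, because the next engagement with a CSV import step can reuse it as-is.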
Case studies
In this section I will offer a few case studies, and I will refer to the above rules using abbreviations
(e.g., R1, R8) whenever they are relevant for the case study.
Case Study 1: Requirements management
This migration effort took place in the context of a requirements management workshop with a
client that used Microsoft Word for requirements management. The motivation for introducing
Rational tools was improved functionality and improved lifecycle support. Initially, we looked into
the existing documents and requirements. Having configured RequisitePro according to the client's
requirements, we were faced with the challenge of importing the existing requirements kept in
Word tables, with one table for each requirement defining the requirement text, the attributes, and
the traces. We decided to do the migration in several steps (R5) using a mixture of out-of-the-
box import functions and home-grown scripts. During a pre-processing step the Word tables were
scanned and the information was converted into a CSV file using a custom Visual Basic script that
was written by the client representative in a short time. The produced CSV file was then imported
into RequisitePro using the RequisitePro CSV import wizard. In a post-processing step, I re-used
and adapted an available script (see RequisitePro extensibility interface sample) that could convert
trace information kept in RequisitePro attributes to proper trace links (R6). The migration took
about a day and required a few iterations (R3).
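The transformation performed by that pre-processing step can be sketched as follows. The original script was written in Visual Basic; this Python version, along with its column set, is purely illustrative:

```python
import csv
import io

def tables_to_csv(tables):
    """Flatten per-requirement tables -- each given here as a list of
    (label, value) pairs, as they would be read out of the Word
    document -- into one CSV stream for an import wizard."""
    columns = ["Text", "Priority", "Status", "Traces"]
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(columns)
    for table in tables:
        row = dict(table)
        # Missing attributes become empty cells rather than errors
        writer.writerow([row.get(c, "") for c in columns])
    return out.getvalue()
```

The same shape of script works whenever the source keeps one small table per record: scan, flatten, and let the target tool's CSV import do the rest.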
Case Study 2: Requirements management
In this engagement I was faced with a large distributed project that had defined the requirements
using numerous RequisitePro projects -- one for each major part of the application. The projects
were connected using cross-project traces and the project schemas were almost, but not exactly,
identical (e.g., the attribute "No" was used in one project, "No." in another, etc.). The main issue,
however, was that it had become impossible to define consolidated views over all sub-projects,
and a consolidation into one RequisitePro project was therefore requested. The motivation for
change was to achieve improved functionality with respect to reporting. The migration required the
development of a home-grown converter.
Having captured the requirements, I continued looking for possible solutions, but it became clear
that none of the existing tools could do what was required. The only option was to implement a
converter that could merge all the individual projects, including all the requirement documents, into
one target project and at the same time deal with variations in the schemas of the source projects.
During an initial migration, I developed a small prototype (receiving bits and pieces of code from
development) for the critical part: migration of document-based requirements. This PoC turned
out to be successful. The next step was to define the mapping from source projects to the target
project with the client and agree that the migration of change histories was out of scope since it
would have unnecessarily complicated the migration (R4).
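The schema-variation handling can be illustrated with a small sketch; the alias table below is hypothetical, not the actual mapping used in the engagement:

```python
# Hypothetical alias table: each source project named the same
# attribute slightly differently ("No" vs. "No.", etc.)
ALIASES = {"No.": "No", "Prio": "Priority", "State": "Status"}

def unify_schema(record):
    """Map one source record onto the consolidated target schema by
    folding attribute-name variants into a single canonical name."""
    return {ALIASES.get(key, key): value for key, value in record.items()}
```

Centralizing the variants in one table makes it cheap to extend the converter each time a new sub-project reveals another naming quirk.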
Development of the converter was a non-trivial task that took about twenty days of work.
The migration itself was attempted a couple of times (R1) using throwaway Microsoft Access
databases and finally went through using the production database without any issues in less
than a week. The converter (RequisitePro Project Merge Script) has since been published on
developerWorks (see Karlsen, 2004) and used in a couple of other engagements (R9). In one of
these engagements I managed to migrate a RequisitePro project to another schema with less
than two days of work using the tool with a single adaptation that was needed in order to avoid
premature termination in the context of special characters in requirement document names (R2).
Case Study 3: Model migration
In this engagement I was faced with a project that used Hierarchical Object Oriented Design
(HOOD), the design method of the European Space Agency. The motivation to change centered
on the need to modernize the development process, given that the design tool was facing end-of-
life. A migration to UML and Rational Rose for Ada was therefore initiated.
In the initial Inception and Elaboration phases I investigated the possibility of any existing
conversion tools (R6), but none were available. The only option was to develop a home-grown
converter from HOOD to UML. Before doing so I defined the HOOD to UML mapping, taking the
actual HOOD designs and requirements of the client into consideration. The converter was finally
tested leading to a temporary analysis model. During this initial Elaboration phase we also reverse
engineered the Ada code to yield a design model. That made it possible to determine the final
scope and mechanism for the migration, which led to the definition of an implementation plan for
the Construction and Transition phases.
During several pre-processing steps, the architecture of the application was changed to avoid
cyclical imports between subsystems since Rose for Ada could not deal with such dependencies
(R2). An approach for exporting annotations out of code and into the model using pre-existing
components as well as new scripts was defined as well (R6). Scripts were furthermore developed
to remove irrelevant details from the design model. This was a surprising lesson learned: In large
complex projects, it does not make sense to reverse engineer every detail in the application;
it is the overview that counts (quite in contrast to the approach taken by users evaluating the
reverse engineering capabilities of a tool). Next, the reverse engineering process was undertaken
iteratively (R6). The various documentation fields from the converted HOOD model, as well as the
extracted code annotations, were then imported into the model during a post-processing phase.
Also during this phase the design model was synchronized with code so that changes in the model
could be forward engineered to code. Here we made the compromise simply to forward engineer
package/module specifications but not the code in the package bodies in order to keep the work at
a tolerable level (R4).
This was a complicated endeavor (R5), but we finally managed to complete the migration,
including provisioning of compliance-relevant reports, training of the users (which was outsourced
to a third-party consulting company) (R7), and transitioning into production. The migration effort
required approximately 100 days of consulting on behalf of Rational and a significant investment
by the client as well.
Case Study 4: Model migration
Not all model migration projects are so large that they require a long time and a large amount
of effort (R1). This case study addresses the migration from Rational XDE to Rational Software
Modeler, which was motivated by XDE facing end-of-life. The migration took less than a week.
During an initial PoC, the models to be migrated were identified and then passed to the
Rational Software Modeler XDE Import Wizard. From this basis a readiness plan was defined
outlining the various tasks to be performed. Next we looked into the log file produced by the PoC
migration, analyzed the various groups of error messages, and identified the resulting actions. The
need for a pre- as well as a post-processing step was identified.
During pre-processing the models were validated and the validation errors resolved according to
instructions defined in the migration plan. This involved naming anonymous association ends (if
required) and deleting broken references or links that could not be mapped to UML 2 semantics.
After the models were migrated, the error messages in the migration log were considered one
by one, using the previously defined guidelines as a basis. The error log reported some Java
exceptions during the migration of a few class diagrams, but investigations showed that the model
had not lost information and that we could repair the class diagrams by invoking "Filter>Show
Relationships" to render all relationships on the diagram (R2, R4, R5). Moreover, the integration
with RequisitePro was configured as well in order to avoid losing links between model elements
and requirements.
Next, the target model was validated, which resulted in more than 400 error messages. Most of
the errors were due to unnamed properties of association links and interface operations being
declared as private. Rather than correcting them by hand, I got hold of a script from a colleague
(R6) and adapted it to correct these errors by introducing role names and changing the visibility
of interface features to public. Although the migration was done in a relatively short time, I am
convinced that the client would not have been able to succeed on their own (R7).
Case Study 5: Test plan migration
In this engagement I was faced with a project that maintained a bulk set of test cases in Excel
files that needed to be imported into Rational TestManager. The project was about to introduce
the Rational Team Unifying Platform in order to achieve improved lifecycle support. It was clear
up front that this migration effort would require the development of a custom script. Having looked
into the existing test cases and determined the requirements of the customer, I defined the
mapping between fields in the Excel spreadsheets and the test plan and test case attributes of
TestManager. Next, TestManager was configured with the appropriate custom attributes for
test cases. I then implemented the migration script, taking as a basis code fragments defined in
(B. Richmond, 2004) (R6). Since I had full control over the converter, neither pre- nor post-processing
steps were needed. However, due to changing requirements several iteration and evaluation
sessions were required before the migration came to an end (R3). The migration project took less
than two weeks of work.
Case Study 6: Requirements, test, and activity migration
In this engagement the client used Telelogic DOORS for requirements management, test
management, and change management, and the client requested a migration to RequisitePro,
TestManager, and ClearQuest, respectively. The migration concerned a single project that was not
in line with the tool standards. During an initial session I discussed the requirements and scope
of the migration and looked into the input data. I then defined the migration mechanisms reusing
experience from previous projects. I also looked at an IBM internal "DOORS to RequisitePro"
migration guide (see Haddock, 2004) as a source of inspiration. The idea was basically to export
CSV files out of DOORS and then import these files into the Rational tools using out-of-the-box
import features combined with home-grown scripts. With that in mind a project plan was defined
with an estimated duration of six days.
Initial test migrations were undertaken based on CSV export/import re-using existing assets (R6).
Pre-processing was required, however, in order to bring the information in DOORS into a proper
state (identifiers were represented as strings and weren't unique, enumeration literals were not
well-defined, etc.). This required several iterations (R2, R3). The requirements were imported into
RequisitePro, followed by a post-processing step to create traces, similar to Case Study 1. The
test cases were imported by adapting the import script from Case Study 5; this migration required
only a quarter of the time, since the scripts could be adapted rather than developed from
scratch. The change requests were imported using the ClearQuest Import Tool, but initially the
migration failed. The errors were caused by special characters and line breaks in description fields
(R2). A script forwarded by a colleague finally solved the issue and the migration was finished
within the estimated time (R6).
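The two pre-processing fixes described above -- making string identifiers unique and stripping embedded line breaks from description fields -- can be sketched as follows; the field names are illustrative assumptions:

```python
import re

def fix_exported_row(row, seen_ids):
    """Pre-process one exported record: strip embedded line breaks
    from the description and make the identifier unique by suffixing
    duplicates. seen_ids accumulates identifiers across calls."""
    row = dict(row)
    row["description"] = re.sub(r"[\r\n]+", " ",
                                row.get("description", "")).strip()
    base = row.get("id", "")
    candidate, n = base, 1
    while candidate in seen_ids:
        n += 1
        candidate = f"{base}-{n}"
    seen_ids.add(candidate)
    row["id"] = candidate
    return row
```

Running a pass like this over the exported CSV before the import keeps the import tool from choking on exactly the kind of trivial issues rule 2 warns about.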
Case Study 7: Requirements management
In this engagement, the client was an IT department that was looking for a requirements
management solution to fill a gap in their tool chain. The stakeholders in the business department
delivered requirements in the form of Microsoft Word tables, with one table for each requirement.
Each table contained the requirement ID, the requirement text, and the various attributes, such
as priority and status. I suggested developing a RequisitePro plug-in (the RequisitePro import
table tool) (see Karlsen, 2006) that could import the information into RequisitePro. I already had
fragments of code available -- e.g., for managing the requirements documents from Case Study 2
and for scanning the Word document from Case Study 1 (R6). During initial steps, I collected the
requirements of the client and defined a migration/project plan. I then continued and configured
RequisitePro according to the requirements management approach of the client. The plug-in was
developed, installed, and tested during two iterations (R3), and I finally managed to perform a
PoC for providing the business and IT department with an integrated approach to requirements
management within eight days' work. I deliberately made the plug-in generally applicable by
allowing the user to configure it with respect to table format and position of the attributes in the
table, knowing that keeping requirements in tables was a helpful Word solution pattern I had come