Best Practice
Data Migration for Banking Services Solutions – Customer Experience

Dietmar-Hopp-Allee 16, D-69190 Walldorf

STATUS: customer published
DATE: Nov 03, 2008
VERSION: 2.1
SOLUTION MANAGEMENT PHASE: Setup Operations
SAP SOLUTION: Transaction Banking 3.0 & 4.0 and Banking Services 5.0 & 6.0
TOPIC AREA: Business Process Operations
SOLUTION MANAGER AREA: Transactional Consistency & Data Integrity

Best_Practice_Data_Migration_V21.doc – 03.11.2008
Best Practice
Data Migration for TRBK 3.0 – Customer Experience and Changes to Banking Services 6.0

Table of Contents

1 Management Summary
  1.1 Goal of Using This Service
  1.2 Alternative Practices
  1.3 Staff and Skills Requirements
  1.4 System Requirements
  1.5 Duration, Timing and Migration Strategy
    1.5.1 Big Bang
    1.5.2 Parallel Processing
    1.5.3 In 'Slices'
2 Best Practice
  2.1 Preliminary Migration Tasks
  2.2 Migration Activities During the Business Blueprint Phase
    2.2.1 Organizational Aspects
    2.2.2 Migration Objects
    2.2.3 Sequence of Migration Objects
      2.2.3.1 Migration Strategy
      2.2.3.2 Data Volume
      2.2.3.3 Source System(s)
      2.2.3.4 Dependencies Between Migration Objects
      2.2.3.5 Verification of Migrated Data
    2.2.4 Analysis of the Source System(s) Data
    2.2.5 ETL: Strategy and Tools
    2.2.6 Data Reconciliation
    2.2.7 Dependencies Between Data Loading, Data Reconciliation, and Go-Live
    2.2.8 Definition of Acceptance Criteria
    2.2.9 External Impacts
    2.2.10 Technical Aspects
      2.2.10.1 Program Definition
      2.2.10.2 Transfer of Files
    2.2.11 Test Concept
    2.2.12 Realization
    2.2.13 Testing and Final Preparation
    2.2.14 Go-Live
3 Further Information
  3.1 Tasks After Migration

© 2008 SAP AG – Best_Practice_Data_Migration_V21.doc
1 Management Summary

This Run SAP document sets out the data migration approach and strategy for Banking Services. It is a collection of customer experiences from other banking migration projects.

A data migration in the banking environment is not only a technical challenge; it is also a challenge for project management because of the SAP implementation projects running in parallel. For this reason, the document briefly explains how to set up a migration team and how to choose a migration strategy.

The project scope of the required SAP migration objects and their migration sequence usually determines the complexity and duration of the migration project. The main part of the document therefore gives an insight into how these elements interact.

Dependencies between data loading and data reconciliation are important aspects of Data Migration. Data reconciliation, one of the critical success factors of a data migration, is therefore covered in the second part of the document.

Since an SAP migration project is mostly structured like the SAP implementation projects running in parallel (e.g. following the ASAP methodology), the document highlights the main migration tasks related to the project phases Testing, Final Preparation, and Go-Live at the end.

1.1 Goal of Using This Service

The document aims at providing a global overview of the migration process from the legacy system to the SAP Banking platform. It is not a complete "cookbook" and does not cover all necessary steps in detail. Its purpose is rather to explain the complexity of a migration project, the necessary steps, and the relation to the running project.
It is a collection of experiences from earlier migration projects that provides information on migration approach, strategy, sequence of migration objects, ETL methodology, testing, quality assurance, and the preparation, development, and execution of the migration.

The Resources section describes the resource requirements, roles, and responsibilities linked to the migration.

1.2 Alternative Practices

SAP Banking consultants should support you during the design and setup of the data migration project. Additional advice and experience for Data Migration can be delivered as part of an SAP premium support engagement, that is: SAP Enterprise Support, SAP MaxAttention, or SAP Safeguarding.

1.3 Staff and Skills Requirements

Before running a migration project, it is necessary to ensure that the migration team has the following knowledge and skills:

- Detailed knowledge of banking processes and the SAP solution
- Knowledge of the migration objects used in the project
- An overview of the project itself
It is necessary to define the data migration strategy as early as possible, as the effort is generally underestimated. All project members have to be involved from the beginning of the project. All documents from the Blueprint phase should have a chapter that describes the impacts on data migration.

Persons with deep knowledge of the source systems should be available to:

- Define a migration scenario
- Define the involved systems
- Decide which part of the development (extraction, transformation) can be done directly in the source systems
- Define transformation rules
- Give the order to cleanse wrong or incomplete data in the source system
- Define fields that are necessary for the target system or for systems that receive data from the target system (e.g. BW, user front ends)

Additional resources:

- A developer for writing extract programs in the source system(s)
- A developer for writing programs for the transformation, analysis, and cleansing of the source data
- A developer/customizer with good knowledge of the standard SAP data migration techniques (External Data Transfer, Direct Input) for loading objects into Banking Services 6.0 and comparing the loaded data with the provided source data
- Access to all project members responsible for customizing and development, to verify whether something is relevant for data migration or not
- Access to employees from the auditing department, for:
  - Deciding which data have to be controlled, on which basis, during or after data migration (automated or manually; via numbers, sums, or lists; all data or just spot checks, depending on regular reporting rules and internal instructions)
  - Defining the sign-off process for verifying that the data migration has taken place according to the predefined rules and that everything is complete and correct
- Access to a person with detailed knowledge of the bank's job processing tool, for:
  - Deciding whether to use it or not (alternatives?)
  - Describing the JCL (job control language) for implementing the data migration programs
  - Testing

1.4 System Requirements

Valid from TRBK 3.0 to Banking Services 6.0.

1.5 Duration, Timing and Migration Strategy

The duration and timing of a migration project depend on the chosen migration strategy (from experience, a minimum of 3 to 6 months). In general, the setup of a migration project should be aligned with the ASAP project methodology.

There is definitely no 'one size fits all' strategy for Data Migration. The next chapters give an overview of three possible migration strategies: big bang, parallel processing, and migration in 'slices'.
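The audit checks described in section 1.3 (verification via numbers, sums, or lists) lend themselves to automation. The following is a minimal, hypothetical sketch only; the record layout (account number, balance) and the rounding tolerance are assumptions, not part of the SAP delivery:

```python
# Hypothetical reconciliation sketch: compare extracted source records with
# loaded target records by record count, balance checksum, and a list of
# missing accounts, as audit sign-off rules typically require.

def reconcile(source_records, target_records):
    """Return simple audit figures for sign-off.

    Each record is an (account_id, balance) pair -- an assumed layout."""
    src_accounts = {acct for acct, _ in source_records}
    tgt_accounts = {acct for acct, _ in target_records}
    src_sum = sum(bal for _, bal in source_records)
    tgt_sum = sum(bal for _, bal in target_records)
    return {
        "count_ok": len(source_records) == len(target_records),
        "sum_ok": abs(src_sum - tgt_sum) < 0.005,  # tolerance for rounding
        "missing_accounts": sorted(src_accounts - tgt_accounts),
    }

src = [("4711", 100.00), ("4712", -50.25), ("4713", 0.00)]
tgt = [("4711", 100.00), ("4712", -50.25)]   # account 4713 was not loaded
print(reconcile(src, tgt))
# {'count_ok': False, 'sum_ok': True, 'missing_accounts': ['4713']}
```

Note how the dropped zero-balance account slips through the sum check but is caught by the count and list checks; this is why audit rules usually demand more than one kind of verification.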
1.5.1 Big Bang

This type of Data Migration is often chosen for a new software implementation. There is no parallel processing of data in the old and the new system. Depending on data volumes, you can choose from two scenarios:

Scenario 1:
- The source system is stopped completely (preferably on a long weekend)
- Data are extracted, transformed, and loaded
- After a successful system verification, the target system is set productive

Scenario 2:
- The biggest part of the data is extracted, transformed, loaded, and checked while the source system is still productive
- On a predefined weekend, the source system is stopped
- Only the delta has to be migrated and controlled
- The target system is set productive

Advantages:
- Short period of data migration
- Non-recurring effort in human and system resources
- Communication (internal, external) is only needed once
- No additional, temporary interfaces and programs
- No migration into an already productive system
- Fallback is easier on the migration weekend

Disadvantages:
- No pilot migration possible
- No risk diversification
- Testing of migration and follow-up processes is even more important
- 100% must be migrated correctly
- Downtime of up to a few days

1.5.2 Parallel Processing

The source system is productive during Data Migration and for a certain period after Data Migration. The aim is to prove that the new software produces the same results as the old system.
So every process has to be executed twice, and certain reconciliation points have to be defined to compare the results.

Advantages:
- Proof that the new system landscape is working properly
- Switch to the new system only when everything is working fine

Disadvantages:
- All data exist twice
- Extensive maintenance of all dependent systems
- Parallel delivery of interfaces
- Additional programs and human resources for the verification of results
- Necessity to temporarily adapt processes to get the same results
- Problems with new or changing processes

1.5.3 In 'Slices'

This strategy allows migrating a certain number of accounts or cards in slices, depending on what the bank defines. You can begin with a pilot tranche to reduce risk and validate processes before loading mass data. You can apply this strategy only if at least some accounts or cards do not depend on other accounts in the same system. As a precondition for this migration strategy, the cut into independent slices must be possible.
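A reconciliation point in the parallel processing strategy (section 1.5.2) compares the results of the same run in both systems, account by account. A hedged sketch; the account identifiers, amounts, and tolerance below are invented for illustration:

```python
# Hypothetical sketch of a parallel-run reconciliation point: the same
# settlement run is executed in the old and the new system, and the
# per-account results are compared. An empty result means the point passed.

def compare_parallel_run(old_results, new_results, tolerance=0.01):
    """old_results / new_results map account id -> settled amount."""
    diffs = {}
    for account in old_results.keys() | new_results.keys():
        old_val = old_results.get(account)
        new_val = new_results.get(account)
        if old_val is None or new_val is None or abs(old_val - new_val) > tolerance:
            diffs[account] = (old_val, new_val)
    return diffs

old = {"A1": 12.34, "A2": 0.50}
new = {"A1": 12.34, "A2": 0.50, "A3": 9.99}   # A3 exists only in the new system
print(compare_parallel_run(old, new))         # {'A3': (None, 9.99)}
```

In practice, the tolerance and the comparison key (settlement amount, posting totals, interest figures) have to be agreed per reconciliation point with the auditing department.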
Example: A special internal account 123 is needed to post an item in a foreign currency on a customer account in home currency. The delivering system only knows account 123. If such an account is needed in the old and in the new system at the same time, it is not possible to migrate this account (unless you rebuild the whole process before starting the data migration).

Advantages:
- Risk mitigation through migration of a pilot tranche ('uncritical' data)
- Small amount of data to validate the migration process in a productive system
- Safety through automation of the migration process
- No fixed date for Data Migration necessary
- Short or no downtime

Disadvantages:
- Additional tools and programs required to localize accounts (dispatcher) and their status during migration
- Parallel delivery of changes
- Automation is a must
- High effort in testing
- Long period of migration, complicated processes
- Many personnel resources needed
- No fallback after migration of the first tranche
- Migration into a productive system

As mentioned above, it is vital to come to a decision at an early stage of the project. The chosen strategy has an impact not only on the migration team itself, but also on the other teams of the project, and maybe even on surrounding systems that are not obviously involved.

Another important question is: What kind of history is needed? As the biggest problem of Data Migration is the time for execution, act according to the following principle: as much as needed – as little as possible.

Example 1: Loading the payment items of the past 10 years. Depending on the product, this can result in a huge number of payment items.
A bank normally has an archive system where the user can find every detail of past payments. Moreover, the data from Banking Services 6.0 have to be transferred to the archive system sooner or later anyway.

Example 2: If there is a need to load payment items and account settlements for the current and the previous year, you also need the limits and interest rates of this period. So you have to ensure that the customizing of the condition groups and reference interest rates is set up in the right way. If the lead system for overdraft limits is not the source system of the accounts, you need an additional extract program from the leading system. On the other hand, you do not have to rebuild the whole lifecycle of an account when you just need the correct historical interest rates.
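The "as much as needed – as little as possible" rule from the examples above usually ends up as a simple filter in the extraction step. A minimal sketch; the cutoff date and the record layout are assumptions chosen for illustration:

```python
# Hypothetical extraction filter: load only payment items inside the agreed
# history window (here: current year plus previous year) instead of ten
# years of history, which goes to the archive system anyway.
from datetime import date

def in_history_window(posting_date, cutoff=date(2007, 1, 1)):
    """Migrate an item only if it falls on or after the agreed cutoff."""
    return posting_date >= cutoff

items = [
    (date(1999, 5, 1), "old item, archive only"),
    (date(2007, 3, 15), "needed for settlement history"),
    (date(2008, 10, 2), "current year"),
]
to_load = [text for d, text in items if in_history_window(d)]
print(to_load)  # ['needed for settlement history', 'current year']
```

The cutoff itself is a business decision, as Example 2 shows: it must be aligned with the settlement, limit, and interest rate history that the target system needs.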
2 Best Practice

2.1 Preliminary Migration Tasks

The following list gives you an overview of nice-to-have documents and tools which speed up your migration project.

For preparation:
- This Data Migration document
- Documentation about the transactions and fields of the source and target system(s)

For the first workshops:
- Pros and cons of the possible Data Migration strategies
- Slides to explain the Data Migration strategies, if possible adjusted to the language and targets of the project (examples!)
- A list of Data Migration objects
- A project plan for the software implementation and the Data Migration
- Lists of contact persons for each subject (most projects have responsible persons for business partner, products (accounts, cards), payments, correspondence, settlement, ...) with mail address, phone number, and deputy
- Evaluation of a tool for data transformation or documentation (Business Objects, Access DB, Excel, ...)

For further work:
- Lists of attributes (source and target system) for every object, and the definition of transformation rules
- Blueprint documents, especially the relevant chapters from documents other than Data Migration

Please note: Depending on the Data Migration strategy and the implementation strategy in general, some of these ideas turn out to be very good and future-oriented.

When the first customers went productive with TRBK 2.0, SAP had the understanding that every bank had to switch its software from the old legacy software to TRBK and to set its migrated accounts and cards productive on the same day. It was not possible to migrate accounts into a productive system with already productive accounts. Also, most of the EOD programs could not distinguish between productive and non-productive accounts.
That was a major problem if you were still in the process of loading data. All these problems were discussed, and finally SAP created the so-called migration group, which disconnects the posting date of the bank posting area from objects (accounts, cards) that are in the process of Data Migration.
2.2 Migration Activities During the Business Blueprint Phase

A migration project is set up according to the ASAP methodology. In the following chapters, the phases Business Blueprint, Realization, Test, and Go-Live are described in more detail. During the Business Blueprint phase, the evaluated migration strategy has to be validated in detail.

2.2.1 Organizational Aspects

A core banking project is usually divided into a number of teams. Every team has to deliver a business blueprint document for its specific subject. A subject, in most cases, is not a business process but a function or a group of functions in the core banking system, such as:

- Business partner
- Account lifecycle
- Card lifecycle
- End-of-day processing
- Product change
- Data migration
- Postings
- Correspondence

If not explicitly demanded, most of the team members just describe the new processes but do not define:

- The transition from the old to the new system and from the old to the new processes. For example: if you define new fields on an existing product, do you need a change of the product version for accounts already using this product?
- Aspects of data migration. For example: you are not allowed to plan a data migration at a month end because the GL analysts are busy with the end-of-month processes.

Possible solutions:

- Insert a chapter describing the aspects of Data Migration into the business blueprint template.
- Ask for descriptions of interfaces, e.g. for user interfaces (UIs). At least the fields described there have to be delivered in the Data Migration process, too. Of special interest are the custom-developed fields, which are completely new.
- Ask also for historical aspects: What is needed to feed a certain turnover class correctly (e.g. transaction types 123456 and 234567 over a period of two settlements)? Otherwise the customer might end up with a limit of zero because of an unfilled turnover class.
  Or: Are the tax documents at the end of the year correct, even if the first account settlements of the current year were done in the old system and the remaining settlements were calculated in the new solution?

A complete migration process does not only include the extraction, transformation, and loading of data from the old legacy system into the new one, but also:

- Delivery of changed or new data to other dependent systems, such as front-end systems, archive systems, and BI, so that the employees of the bank are able to work according to the new processes after Data Migration
- Delivery of changed or new data to leading systems, if data are changed by the migration process itself

Example: The Customer Information Management system (CIM) has the lead over the products used by a customer. Normally, all processes that change these customer-product relationships have to do so via CIM, and CIM is the initiator of such changes.
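When the migration itself changes customer-product relationships, the delta has to be reported back to the leading system (such as CIM) after go-live. A hedged sketch of both steps; the rule table, product codes, and record layout are invented for illustration, and the real CIM interface is bank-specific:

```python
# Hypothetical sketch: transform accounts from old to new products via a
# rule table, and collect the resulting delta so the leading CIM system can
# be informed and take over the lead again.

PRODUCT_RULES = {
    # old product -> rule deciding the new product per account (invented codes)
    "CURR01": lambda acct: "NEW_PREMIUM" if acct["balance"] >= 10000 else "NEW_BASIC",
    "CURR02": lambda acct: "NEW_YOUTH",
}

def migrate_products(accounts):
    """Apply the rule table in place; return the delta for the leading system."""
    delta = []
    for acct in accounts:
        new_product = PRODUCT_RULES[acct["product"]](acct)
        if new_product != acct["product"]:
            delta.append({"customer": acct["customer"],
                          "old_product": acct["product"],
                          "new_product": new_product})
        acct["product"] = new_product
    return delta

accounts = [
    {"customer": "C1", "product": "CURR01", "balance": 25000},
    {"customer": "C2", "product": "CURR01", "balance": 500},
    {"customer": "C3", "product": "CURR02", "balance": 0},
]
delta = migrate_products(accounts)
print([d["new_product"] for d in delta])  # ['NEW_PREMIUM', 'NEW_BASIC', 'NEW_YOUTH']
```

A rule table like this also documents the transformation for testing and audit sign-off, which is harder when the mapping is buried in the load programs.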
Imagine the bank has five current account products in its legacy system. In the new system, the bank wants to diversify more, so there will be six additional products. The person responsible for Data Migration has to transform the existing accounts from five old to eleven new products via transformation rules. After go-live, the Banking Services system has to inform the other system, CIM, about the new customer-product relationships so that CIM can take over the lead again.

Note: Generally, it is not a good idea to combine a technically oriented Data Migration with application-oriented changes. Reasons:

- The validation of data is more complicated.
- The effort of testing is higher.
- The bank's customers and employees must be informed about changes in the functions of their accounts, and about the Data Migration itself if the system is stopped for a weekend or so.

Depending on the migration strategy, the following has to be defined:

- Timeframes for the whole migration process or for a 'slice'
- A flowchart of the migration process
- A definition of the processes to be stopped for contracts in migration
- A definition of the processes to be changed for contracts in migration
- The steps of data validation

2.2.2 Migration Objects

According to the IMG documentation of SAP, a migration object is a 'quantity of data that has similar business purposes and that can be transferred during migration from the legacy system to Account Management (FS-AM) (sometimes separated into master and flow data).
This also includes objects that are not business objects, but can be viewed as individual objects.'

The Business Process Platform from SAP 6.0 manages the following migration objects and channels (for even more details, see the documentation in the IMG under Technical Documentation for Account Management (FS-AM) -> Concepts and Guidelines -> Migration -> Migration Objects, and Technical Documentation for Master Contract Management (FS-MCM) -> Concepts and Guidelines -> Migration):

Direct Debit Order (master data)
  Channels: Direct Input, BAPI, Dialog

Billing and Open Items (master data: billing; flow data: open items)
  Channels: via account contract; RFC-enabled function modules to the bill creation system:
  - BCA_API_BL_OPEN_ITEMS_MIG (transfer of open items to the bill creation system)
  - BCA_API_BL_REV_OPEN_ITEMS_MIG (reverse open items in the bill creation system)
Settlement
  Account settlements as such cannot be migrated; however, there is a special logic with regard to the settlement calculation start and the settlement calculation without posting. This means that settlements already executed in the legacy system are also calculated in Account Management. These settlements are not, however, allowed to generate any postings.

Disbursement (master data)
  RFC-enabled function modules:
  - BCA_RFC_OR_DISB_CREATE_ACTIV (Create and Activate Disbursement)
  - BCA_RFC_OR_DISB_CHANGE_ACTIV (Change and Activate Disbursement)
  - BCA_RFC_OR_DISB_CHANGE_DEACTIV (Change and Deactivate Disbursement)
  You need to transfer the value BDDISB as the entry origin or editing origin for the external data transfer.

Notice on Amount
  Channels: Direct Input

Standing Order (master data)
  Channels: Direct Input, BAPI, Dialog

Time Deposit (master data)
  Channels: Direct Input

Capitalization (master data and, indirectly, flow data)
  You need to transfer the value BDCAPT as the entry origin or processing origin for the external data transfer.

Linked Accounts
  No information available yet.

Card Pool Cancellation (master data)
  Channels: Direct Input

Card Pool Contract (master data)
  Channels: Direct Input (create, change), BAPI (change)

Card Cancellation (master data)
  Channels: Direct Input

Card Contract (master data)
  Channels: Direct Input (create, change), BAPI (change)
Account Closure (order)
  Channels: Direct Input

Bank Statement (master data, flow data)
  Channels: Direct Input, BAPI; also via account contract

Account Holder Change
  Channels: Direct Input

Account-Card Relationship (master data)
  Channels: Direct Input (create, change), BAPI (change)

Account Contract (master data)
  Channels: Direct Input, BAPI

Correspondence-Recipient Management
  Channels: Direct Input, BAPI, Dialog

PLM Document (flow data)
  Channels: Direct Input

Product Change (Card)
  Channels: Direct Input

Product Change (Card Pool)
  Channels: Direct Input

Product Change (Account)
  Channels: Direct Input

Balance Confirmation (master and flow data)
  Channels: Direct Input, BAPI

Starting Balance
  Channels: Direct Input

Deferral
  Channels: Direct Input, BAPI

Forward Order
  Channels: Direct Input, BAPI

Waiver
  RFC-enabled function modules:
  - BCA_RFC_OR_WAIV_CREATE_ACTVT (Create and Activate Waiver)
  - BCA_RFC_OR_WAIV_CHANGE_ACTVT (Change and Activate Waiver)
  - BCA_RFC_OR_WAIV_CHANGE_DCTVTE (Change and Deactivate Waiver)
Early Payoff
  RFC-enabled function modules:
  - BCA_RFC_OR_PAYF_CREATE_ACTVT (Create and Activate Loan Payoff)
  - BCA_RFC_OR_PAYF_CHANGE_ACTVT (Change and Activate Loan Payoff)
  - BCA_RFC_OR_PAYF_CHANGE_DCTVTE (Change and Deactivate Loan Payoff)

Counter
  Channels: Direct Input, BAPI (after go-live), Dialog (after go-live)

Extension (master data and, indirectly, flow data)
  You need to transfer the value BDEXTN as the entry origin or processing origin for the external data transfer.

Skip
  You need to transfer the value BDSKIP as the entry origin or processing origin for the external data transfer.

Payment Item, Info Item
  Channels: Direct Input

Payment Agreement (master data, flow data)
  Channels: Direct Input; also via account contract.
  - Debit position: data is migrated using info items.
  - Installment payments: data is migrated using info items.
  - Counters for unpaid installments: data is migrated using counters (BAPI_BCA_COUNTER_CHANGE).
  - Installment status and dates (next debit position, next monitoring): data is migrated using a BAPI from payment monitoring (BAPI_BCA_PAYMON_CREATE_FR_DATA).
  - Dates for the due date display (correspondence at end of payment phase, correspondence at end of payment agreement, settlement at end of payment agreement, action at end of payment agreement): data is migrated using the master data (with synchronization).

Payment Form
  Channels: Direct Input
Payment Distribution Item (flow data)
  - Debit position: data is migrated using info items.
  - Installment payments: data is migrated using info items.
  - Counters for unpaid installments: data is migrated using counters (BAPI_BCA_COUNTER_CHANGE).
  - Installment status and dates (next debit position, next monitoring): data is migrated using a BAPI from payment monitoring (BAPI_BCA_PAYMON_CREATE_FR_DATA).
  - Dates for the due date display (correspondence at end of payment phase, correspondence at end of payment agreement, settlement at end of payment agreement, action at end of payment agreement): data is migrated using the master data (with synchronization).

Turnover Class
  Turnover classes are filled automatically with the migration of payment items.

Prenote
  Channels: Direct Input

Effective Cash Pooling (flow data)
  RFC-enabled function modules to migrate the migration object data:
  - /FSECP/API_DUE_DATA_MIGR (migration of ECP due dates)
  - /FSECP/API_LAST_ECP_DATA_MIGR (migration of ECP data: last run date)

Facility (master data, flow data)
  Migration with the main contract, participant main contracts, and main contract hierarchies. To transfer the flow data from the account-managing system, use the report Activate Facilities (/FSFAC/AL_PP_MIG_FAC_ACTIVATE).

Master Contract Termination (order)
  Channels: Direct Input. You must transfer the value BDTTAP as the entry origin or processing origin for the external data transfer.

Master Contract Hierarchy (master data)
  Channels: Direct Input

Combined Settlement
  RFC-enabled function module to migrate the migration object data: /FSBPR/IN_RFC_SETTLE_EV_CR_CH
2.2.3 Sequence of Migration Objects

As you may have noticed, there is not much information in the documentation about the sequence of migration objects. Maybe this is because there is no 'one and only' truth; as usual, there is more than one possible solution. When defining the sequence, consider the aspects described in the following sections.

2.2.3.1 Migration Strategy

When defining the migration strategy, you must take into account that some migration objects cannot be seen independently. So a given sequence has an influence on the overall strategy, but the strategy itself in turn affects the sequence of migration objects.

Example: It does not matter whether you want to migrate accounts, cards, or a master contract; you always need a business partner with the correct role in your target system. This seems obvious at first sight, but in some countries it is not allowed to have business partner and account data in the same system. In that case, you still have to migrate a business partner with at least a few basic (non-identifying) data. To assure the consistency of these data during daily business and the migration period, a program must be written to compare the leading and the dependent systems.

The effort rises enormously if you have to migrate while the relevant systems keep running and/or your planned go-live date keeps moving forward. What you have migrated today may not be valid anymore tomorrow. So you have to deliver master data changes and changes of flow data as well.

Note: Whenever possible, try to migrate volatile data (e.g. prenotes, PLM documents, forward orders) directly before the effective go-live. Changes should not be allowed in the source system at least for a short period before go-live.

Technically, it is possible to migrate a high number of objects and their history for every account, card, and so on.
In fact, the complexity and run time of a Data Migration are reduced considerably if only active contracts are migrated.

2.2.3.2 Data Volume

If the volume of data for a particular product is very large, this product and its dependent data might be migrated separately as a tranche. This can also be the case if the required migration objects differ totally or partially from those of other products (e.g. accounts vs. cards, loan accounts vs. term deposits).

2.2.3.3 Source System(s)

Sometimes it is not possible to migrate all data in one step, so you have to decide what will be migrated first and what later.

Examples:
- If you get data from an external card provider, you have to agree on possible delivery dates.
- Business partners for cards (card holder, card pool holder) should be migrated separately from the roles needed for account creation because of the time difference – even if the accounts are migrated before the cards.
- Another implementation project of the bank might have the lead on parts of your migration data, e.g. a special product. They might ask you to migrate 'their' accounts on a certain date. Here you have to find out what the preconditions are, e.g. are there reference accounts that have to be migrated first?
- If you have different source systems, e.g. for current accounts and for savings accounts, it is not easy to decide what is easier and faster: to merge the data after extraction and load it with one program, or to load it with two separate programs, which might increase the effort of data reconciliation.
- Consider differences between products and their dependent migration objects.

If there are big differences between bank products and their migration structures, it might be better to migrate the relevant objects separately.

Example: Comparing the product customizing of a term deposit account with that of a loan account, an estimated 25% of the used fields match. If you build the migration structure for the migration object 'account', you can either build one huge structure with all data needed for both products, or two smaller structures, one for each product. In terms of performance and required memory, the two-structure strategy turns out to be better. The structure of payment items, by contrast, should be more or less the same regardless of the product.

2.2.3.4 Dependencies Between Migration Objects

Accounts

Accounts have to be migrated before debit cards or savings cards. Credit cards can be migrated independently of accounts because their reference accounts can be in another system or bank. Accounts do not have to be productive before migrating cards; they can be migrated in the same migration group.

Historical contract master data

Historical contract master data can be migrated in the form of account changes or product changes.
The account creation should always refer to the oldest state you want to reproduce.

Note: Product changes can only be migrated with a correct historical valid-on date if the migration group present date starts in the past and is set forward step by step.

Note: If you have to migrate the contract history, find out whether the complete lifecycle with all account changes and product changes is required. Most customers only need historical interest conditions for:
- A recalculation of past settlements because of payment items with a value date in the past
- Comparing settlement results between the source and target systems

Historical changes of condition types and groups can be done with an actual valid-on date of the migration group present date if the product customizing allows changes (see next figure: green light, open lock):
If this is possible, you can migrate into the newest product. Older master data should be kept in an archive – an exact historization of master data is often not possible anyway.

Flow data

Before migrating any flow data, it is recommended to first create all master data including history. At the end, you should have the same product status you want to go live with. Reasons:
- Most source systems do not allow complicated product changes. Very often, you can migrate historical flow data (payment items, info items, settlements, account closure) independently of the product.
- All data you need for the future life of an account or card must be migrated against the actual status of the master data.
- Diverse checks take place during migration, and if you try to migrate data that is not allowed in the product, you have to find out whether the product is not appropriately customized or a transformation rule is wrong.

Payment items

Payment items have to be migrated before settlements. Otherwise the result of the settlement will not be correct, and the system does not allow migrating a payment item with a posting date older than the last (migrated) settlement.

To deal with the historical settlement calculation, you need at least the following:
- An active account
- Correct historical standard interest conditions, individual interest conditions, and limits
- A calculated starting balance, payment items, and info items
- Execution of the report RBCA_CN_BKK92_INSERT_PP

During migration, this report writes the data required for the calculation of settlement results to the table BCA92. The following data is required as a starting point for the first settlement run after Data Migration:
- Settlement period number
- Start of settlement calculation: the report RBCA_CN_BKK92_INSERT_PP creates an initial settlement for each settlement track.
In previous releases, the start date and end date of these settlements were based on event 013 (account interest calculation start). As of Banking Services 5.0, the start and end dates of those settlements are based on the start date of interest calculation of each track.

In case of condition type offsetting (e.g. for earnings credit), the surplus amount of the subtrahend can be forwarded to future periods. The surplus amount from the legacy system has to be migrated, for which the API BCA_API_CF_INSERT is used. The report RBCA_CN_BKK92_INSERT_PP stores the carry-forward amounts in the new table BCA_CARRY_FORW.

For combined settlements, the compensation start date has to be filled.
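The offsetting idea can be illustrated with a minimal sketch. The function name and the simple min-based offsetting rule are assumptions for illustration only; the actual calculation is done by the settlement engine, and the migrated surplus corresponds to the carry-forward amounts mentioned above:

```python
def offset_with_carry_forward(fee_amount, credit_amount, carried_in):
    """Offset a fee (minuend) against an earnings credit (subtrahend).

    Any unused credit (the surplus of the subtrahend) is carried forward
    to the next period, analogous to the BCA_CARRY_FORW amounts above.
    Illustrative sketch only.
    """
    available = credit_amount + carried_in
    offset = min(fee_amount, available)
    net_fee = fee_amount - offset
    carried_out = available - offset
    return net_fee, carried_out
```

For example, a fee of 100 offset against a credit of 80 plus a carried-in surplus of 30 leaves a net fee of 0 and a surplus of 10 to carry forward.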
Closed accounts

During the migration, you will reach the point where the following question is raised: why migrate an already closed account? Possible reasons are:
- Year-end tax documents can only be produced in the new system
- It is possible to reactivate an account for up to a month or so (Customizing) after closure

If it is necessary to migrate closed accounts, which objects are needed? Again, you first need to know why you are migrating those accounts. If you have to produce tax documents, you of course need posting and settlement information. If you want to reactivate an account, it is more complicated. But the reactivation process is just an exception, so some things can be done manually.

Note: The recommendation is to migrate the data that is necessary for a correct settlement and the closure itself. All data that represents the future life of an account (see above) should not be migrated, for performance reasons. Often it is not even possible.

You can migrate PLM documents, but only as long as the account is still active, and they must have the status "closed" before the account itself is closed. This might be an extra program in the migration process but can also be included in the migration program for account closure. In any case, it is a customer-owned development. Be aware that reactivating an account does not necessarily lead to a reactivation of every dependent object.

Reference accounts

If you want to migrate an account that refers to another account as a reference, e.g. for settlement payments, make sure that the reference account is either still productive in the target system or part of the same migration group but migrated before the referring account.

For an overview of a possible sequence, have a look at the following figures:
[Figures: examples of a possible sequence of migration objects]
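The precedence rules from this section can be captured in a small dependency graph and ordered automatically. The following sketch uses Python's standard graphlib; the object names and the exact dependency set are simplified assumptions based on the rules described above (business partner before accounts, accounts before debit cards, payment items before settlements):

```python
from graphlib import TopologicalSorter

# Each migration object maps to the objects that must be loaded before it.
DEPENDS_ON = {
    "business_partner": [],
    "account": ["business_partner"],
    "debit_card": ["account"],            # accounts before debit/savings cards
    "credit_card": ["business_partner"],  # independent of accounts (see text)
    "payment_item": ["account"],
    "settlement": ["payment_item"],       # payment items before settlements
}

# static_order() yields every object after all of its prerequisites.
order = list(TopologicalSorter(DEPENDS_ON).static_order())
```

Such a derived order is only a starting point; tranche cuts, data volume, and source system constraints (sections 2.2.3.2 and 2.2.3.3) still override it.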
Migration of GL-relevant data

After the go-live of a migration tranche, the balances of the accounts in the respective migration group must be posted to the correct accounts of the GL system. To do so, start the program BCA_INVA_RUN_PP (Inventory Preparation for Legacy Data Transfer to GL), ideally before the next EOD processing after go-live. The report carries out a preparation run of the legacy data transfer for the general ledger update.

When the legacy data is transferred, the balances posted in Account Management (FS-AM) are written to the legacy data transfer tables. If it is not a simulation run, the balance transfer to the general ledger is also prepared. If the system determines during preparation of the legacy data transfer that balances for a contract already exist in the general ledger, it issues a warning message and ignores the contract.

For each go-live date, the legacy data transfer selects, according to the selection parameters, all accounts that have not yet had a legacy data transfer. The system recognizes accounts as relevant for the legacy data transfer if the date of the contract start is after the go-live date for the account.

With the next GL transfer, the GL accounts are updated with the balances of the migrated accounts. The offsetting account should be an account to which you only post migration-relevant balances.

Note: The balance formerly posted via the old legacy system must be posted with the opposite posting record to reduce the amount on the 'old' asset or liability accounts. Again, the offsetting account must be the special migration account (customer-owned development). After every tranche, the balance of the offsetting account must be zero.

2.2.3.5 Verification of Migrated Data

Often the same functions are realized with different techniques in the source and target systems.
There is no benefit in an automatic reconciliation of data consistency if the same transformation rules are used for loading and for reconciliation. Another challenge is to migrate and reconcile data of the same object that comes from different leading source systems.

Example: In a customer project, the source for individual debit interest and limits was a different system than the source for individual credit interest. When those interest rates were loaded, they affected each other by creating new entries or by terminating an entry and creating a new one for the other condition. At the beginning, the reconciliation only worked if just one condition type was loaded.

2.2.4 Analysis of the Source System(s) Data

The migration project needs at least one contact person for each system involved to:
- Inform you about standard processes
- Explain fields and possible field values
- Inform you about major incidents, data inconsistencies, and data corrections that happened during the relevant period
- Write reports to evaluate the content of the source system
- Explain evaluation results and discrepancies from expected results
- Tell you how to get data that is no longer in the source legacy system at the time of migration, e.g. because of archiving or reorganization
During the Blueprint phase, you have to verify your migration strategy by defining migration steps, relevant objects, and transformation rules. The first step might be an interview with other project members. The product customizer, for example, has to find out what kinds of products a bank has and which processes can be executed with each product. Step by step, he or she evaluates the number of new products to be created and the necessary fields. So there must be at least an idea of how to transform accounts of an 'old' product into the new one.

Be aware that the other project members do not want to rebuild the legacy systems but to create new, future-oriented products and processes. The old processes and products should therefore only be used as an orientation. Maybe you do not find an equivalent for old data in the new system. You have to ascertain whether this is correct or whether it was forgotten during the definition.

Example (interest and fee conditions):
- Evaluate what kinds of standard conditions exist in the legacy system.
- Figure out whether there is an equivalent for all of them or whether someone wants a transformation into another standard condition.
- If you have to migrate historical standard conditions, make sure that you have an equivalent for them in the target system. If not, ask the conditions team to create one.
- Historical fee conditions must be correct from the beginning of the first period calculated and posted in the Banking Services 6.0 system. Normally, it does not matter whether they are correct for periods calculated in the legacy system, because those periods are not affected by postings with a value date in an already settled period.

Note: If you have a separate settlement track for period-based fees, make sure that the first period productively calculated in Banking Services 6.0 starts on the same date as the others (if all have the same period).
Recalculating historical fees is not recommended for performance reasons, but you still have to find the correct starting point for them.
- Evaluate what kinds of individual conditions exist in the legacy system. Especially for fee conditions, you will probably need to cleanse incorrectly created individual conditions. To do so, you should set up a 'mini project'.
- One target of the migration might be to reduce the effort of maintaining too many individual conditions, so evaluate whether you can convert former individual conditions into standard conditions in Banking Services 6.0. It is a good idea to ask the conditions team, because they too have to analyze the existing conditions.

2.2.5 ETL: Strategy and Tools

After the migration strategy is defined and you know what to migrate, you have to define how to migrate. ETL stands for Extraction, Transformation, Loading. Consider the following questions:

Who is going to write the extract programs?

If the person is a very experienced developer who knows the source system(s) very well, most of the transformation can already be done in the source system.

Advantages:
- Only the relevant data is selected for extraction and transformation: no data overhead, better performance
- Files (if chosen for the data transfer) are smaller; file transfers are, in general, faster
- Changes to transformation rules, which happen quite often during realization and testing, can be implemented faster, especially if additional fields, for example from an external system, are needed
- Inconsistent data is detected faster, and data cleansing or a workaround solution is defined more rapidly and with fewer mistakes

Where to transform data?

If the solution mentioned above is not possible, you have to decide whether to buy a transformation tool (e.g. Business Objects), develop a tool with bank-owned methods, or use the SAP standard EDT tool.

External Data Transfer – all transactions required for processing the ETL are available in the IMG.

Note: When generating a structure, pay attention to the following:
- If you are importing sender structures, never use packed fields. You will run into problems with the translation between different character sets (ASCII, EBCDIC).
- Do not sort the fields of a structure alphabetically. The structure should be the same as the one in the Data Dictionary.

How to document transformation rules?

The first step is to document the whole structure of a migration object, for instance in an Excel sheet, including the customer-developed fields. Workshops with the other project members who are responsible for accounts, cards, payment items, and so on will help you define which fields or field groups are not used by the bank. Now you can shorten the structure significantly. As a consequence, the transfer files will be much smaller and more transparent.

Note: Be careful not to delete too many fields. It is not easy to understand the technical coherences between fields. Sometimes you will need a field in the structure even if you do not want to fill it. So it is better to delete only field groups that you really do not need. For example, if you migrate only current accounts, you do not need fields with loan or time deposit content. The Product Configurator is a good help for identifying connected fields for accounts, cards, card pools, or master contracts.

During realization and testing, it often turns out that the bank wants to use more fields or functions than defined before. This can have a direct impact on extraction, transformation rules, and data load structures.

The second step is to define the transformation rules for every field or field group. You will find different degrees of complexity:
- The source system may not deliver anything. The field values can be taken from the product customizing.
- No (or nearly no) transformation needed: you can use the same field value as in the source system. You just have to describe whether leading zeros have to be added or whether the value has to be delivered left-aligned or right-aligned (e.g. account number, account currency in most cases).
- A (more or less) 1:1 transformation: the value 'A' in the source system becomes the value '6' in the target system (e.g. interest calculation method).
- Transformation into a field group in Banking Services 6.0: the logic in the source system can be completely different from that in Banking Services 6.0 (e.g. periodicity of account settlement, bank statement agreements).
- You do not find equivalent data in the source system. This can happen if a new functionality is used in Banking Services 6.0. Either you can take a fixed value or the value from the product customizing, or you have to define a transformation rule derived, for example, from CIM data.

Example 1: Account master data
- Copy the contract structure BCA_STR_CONTRACT_ALL_DI into an Excel sheet.
- Delete all fields that do not belong to an account contract.
- Mark all fields that are not needed in the migration project of your current customer (structure from TRBK 3.0, only the first fields):
[Figure: the contract structure with the fields marked that are not needed]
- Create a new sheet with the reduced structure (example; has to be adapted to customer requirements):
- Define the transformation rules. As the responsible person, choose the person responsible for the respective function (account statement, settlement, ...).
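The rule types listed above can be prototyped as a small mapping table before they are implemented in the real ETL programs. In this sketch the field names, the 'B' → '7' mapping, and the fixed-value field are hypothetical; only the 'A' → '6' interest calculation method example comes from the text:

```python
# Each target field gets one rule; the rule kinds mirror the degrees of
# complexity described above (reformatting, 1:1 mapping, fixed value).
RULES = {
    # reformatting only: pad the account number with leading zeros
    "account_number": lambda src: src["acct_no"].rjust(10, "0"),
    # 1:1 value mapping, e.g. interest calculation method 'A' -> '6'
    "int_calc_method": lambda src: {"A": "6", "B": "7"}[src["icm"]],
    # no source data: a fixed value is taken over for every record
    "statement_flag": lambda src: "X",
}

def transform(source_record):
    """Apply every rule to one extracted record."""
    return {field: rule(source_record) for field, rule in RULES.items()}
```

For example, `transform({"acct_no": "123", "icm": "A"})` yields `{"account_number": "0000000123", "int_calc_method": "6", "statement_flag": "X"}`. Keeping the rules in one table also gives you the documentation sheet and the executable logic from a single source.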
Example 2: Account settlement

The migration program calls, for example, the BAPI BAPI_BCA_ACC_SETTLE_STRTSINGLE with the import parameter of the above structure. Ensure that a commit is called after every single settlement before the next one is calculated. Otherwise the results will not be correct, because the starting date of the calculation would be wrong.

How to load the data?

Of course, you can choose the External Data Transfer (EDT) tool for loading data; it does not matter whether the transformation takes place outside the SAP system or not. However, the tool does not cover all the demands of a bank using Banking Services 6.0:
- Performance aspects: without a customer development, the tool is not able to load data in parallel processes, so it is not possible to load huge amounts of data in a short period. For parallel processing, you can use the Framework for Parallel Processing (FPP), which is part of each SAP NetWeaver installation. A developer guide is also available in the Run SAP Roadmap for Banking.
- It is not possible to use EDT for all migration objects (e.g. account settlement) or if your data load file has substructures that must be used to load individual conditions. You have to develop your own program for such objects.
- There are no control mechanisms for data consistency in place before loading the data. The customer should deliver a program that verifies, for example, that the number of data records received equals the number the sender wanted to deliver (this number can be part of a file trailer).
- After an abnormal end (abend) of a load program, a tool is required to analyze which records were loaded before the abend and which were not. In addition, you have to reconstruct a file with the unloaded data to start loading again. Such a tool or functionality is not available from SAP for EDT.
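The last two points can be sketched as follows. This is an illustrative customer-side helper, not SAP functionality; the semicolon-separated record layout with the key in the first field is an assumption:

```python
def trailer_check(records, trailer_count):
    """Verify that the number of records received matches the file trailer."""
    if len(records) != trailer_count:
        raise ValueError(f"expected {trailer_count} records, got {len(records)}")

def restart_file(records, loaded_keys):
    """After an abend, keep only the records that were not yet loaded,
    so they can be written to a new file and loaded again."""
    return [r for r in records if r.split(";")[0] not in loaded_keys]
```

For example, `restart_file(["4711;100.00", "4712;200.00", "4713;300.00"], {"4711", "4712"})` returns `["4713;300.00"]`, the one record still to be loaded.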
2.2.6 Data Reconciliation

When all necessary data has been migrated to the target system, the project has to define criteria to reconcile the results of the Data Migration and verify them against the instructions of the auditors.

There is no general rule for what has to be validated and how. The rather formal aspects can be handled in the ETL phase, for instance:
- Number of objects and datasets extracted = number of objects and datasets loaded
- Number and amount of debit payment items extracted = number and amount of debit payment items loaded
- Number and amount of credit payment items extracted = number and amount of credit payment items loaded

The reconciliation of content-oriented aspects must be defined separately. An automatic reconciliation with a good protocol is preferred, but very likely not all field values can be validated automatically. So, in addition, random samples of every product, migration object, and so on must be defined and checked.

An automatic validation is primarily of interest for data that can change during the migration process because of a delta load or a parallel delivery to the old and new systems via one or more channels: for example the account status, the number and amount of payment items, and PLM documents and their equivalents in the source system. The first reconciliation can be done directly after the initial load, the last one directly before the planned go-live. During reconciliation, no changes should be allowed. The verification of historical settlements is important, too, to validate that the conditions are correct and that a posting with a value date after the last settlement leads to a correct recalculation of a settlement period.

The migration team has to define the tools and programs for reconciliation. It is recommended to extract data from the source legacy system and transfer the result files to Banking Services 6.0.
A customer-written program should compare the loaded data with the content of the file. Contracts that are not OK cannot go live.

2.2.7 Dependencies Between Data Loading, Data Reconciliation, and Go-Live

SAP uses the so-called migration group to separate contracts and their dependent objects that are in migration from the already productive contracts. The idea is to load every contract of a tranche into a migration group.

All SAP standard programs, dialog programs as well as mass runs, know that they are not allowed to process contracts whose migration group has a non-productive status. The same logic must be implemented in the customer-developed programs. With this technique it is possible to load data independently of the end-of-day processing, and no mass run can destroy a half-migrated contract.
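In a customer-developed mass run, this rule comes down to a simple guard. The status codes ('30' for migration completed, '40' for migration failed) appear later in this document; the data layout and function name are assumptions for illustration:

```python
MIGRATION_COMPLETED = "30"   # migration completed / productive
MIGRATION_FAILED = "40"      # migration failed

def may_process(contract, migration_groups):
    """A mass run must skip contracts whose migration group is not yet
    productive (still loading, or failed)."""
    group_id = contract.get("migration_group")
    if group_id is None:
        return True  # contract was never part of a migration
    return migration_groups[group_id]["status"] == MIGRATION_COMPLETED
```

A custom mass run would filter its worklist with such a guard before touching any contract, mirroring what the SAP standard programs do.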
The Planned Go-Live Date enables you to load data over a period of more than one day, and the go-live can even be postponed.

As mentioned before, the Migration Group Present date allows you to reconstruct the lifecycle of a contract. If you use this function, you normally have many 'time slices', because a lot of contracts will be available. In this case, you need a program that derives the different migration groups and calls the program RBCA_SET_PRESENCE_MIG_GRP to set the date in Banking Services.

Note: Possible problems – after loading, it very often turns out that some of the data is incorrect and a correction is impossible. In that case, after the go-live decision, the first migration group (the one with the correct contracts) gets the status "productive", while the other (with the damaged contracts) receives the status "migration failed". These contracts and all related table entries can be deleted and, after a program or data correction, the same contracts can be reloaded.

Note: Not all standard programs can handle the migration group status 40 ("migration failed") correctly. To ensure that the Banking Services system works correctly, the program RBCA_UNDO_MIG_GRP, which deletes the key information of a contract, has to be executed. If possible, execute it before the first mass run after go-live starts. Otherwise, the mass runs create error messages, although nothing will be destroyed.

Depending on your migration strategy, you have to make sure that the source legacy system(s) also knows the status of a contract.
Depending on the status, the source legacy system decides, for example:
- Whether to send an error message because the contract is in the migration process
- Whether to send the status "in migration", so that the surrounding systems deliver changes either to the old legacy system, to Banking Services, or to both systems, as defined
- Whether or not to process a contract in the end-of-day processing, possibly in a different manner during migration

The following are some examples of possible communication steps between the source legacy system, Banking Services, and the surrounding systems.
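One possible delivery policy for the surrounding systems can be sketched like this; the status names and the concrete routing decisions are assumptions, since each project defines its own rules:

```python
def route_change(contract_status):
    """Decide which system(s) a surrounding system delivers a change to,
    based on the contract's migration status (one possible policy)."""
    if contract_status == "in_migration":
        return ["legacy", "banking_services"]  # deliver to both, as defined
    if contract_status == "migrated":
        return ["banking_services"]
    return ["legacy"]  # not (yet) in migration
```

A stricter policy would instead reject the change with an error message while the contract is in migration, as listed in the first bullet above.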
Example 1: Big bang – no communication

The flow chart for this example contains the following steps:
1. Start migration
2. Stop every system
3. Create one migration group
4. Extract, transform, and load all data
5. Validate the data
6. If the data is not correct: correct it if a correction is possible; otherwise fall back or apply a workaround
7. Go live and start only the new system
8. End of migration

- Before starting the ETL, every system of the bank that can change data in the legacy systems must be stopped. No master data changes or postings are allowed until all contracts are migrated and the new system landscape is set active.
- The old legacy system(s) do not need to know anything about the migration in terms of a status or different processes, because they will never be switched on again.
- After loading the data into Banking Services, a reconciliation process is started. Whenever possible, incorrect data should be corrected. If this is not possible, a decision has to be made, based on the number of incorrect contracts or the error types, whether a manual workaround can be executed or the migration itself has to be stopped.
- When everything is OK, the migration group status "migration complete" is set in Banking Services. After some tests, the new system landscape with all new processes can be started.
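The decision logic of this flow can be expressed as a small driver; all callbacks and the retry bound are illustrative assumptions, not part of any SAP tool:

```python
def big_bang_migration(load_all, validate, correct, fallback, go_live,
                       max_attempts=3):
    """Drive the big-bang flow: load everything, then validate and correct
    in a loop, and either go live or fall back to the workaround path."""
    load_all()
    for _ in range(max_attempts):
        errors = validate()
        if not errors:
            go_live()
            return "live"
        if not correct(errors):   # correction not possible -> fallback
            fallback()
            return "fallback"
    fallback()                    # still not clean after repeated correction
    return "fallback"
```

In a real project the go/no-go decision at the fallback branch is of course made by people, based on the acceptance criteria of section 2.2.8, not by code.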
Example 2: Migration of a tranche

The following is a simplified illustration of a migration process in tranches. The basic idea is to make sure that the migration of a certain number of contracts does not affect the functionality or technical availability of the remaining contracts. Neither the bank's customers nor its employees (except IT) shall notice that something is happening. The illustration is just one example out of many possible procedures.

Flow chart of the example:
1. Start migration of tranche 1
2. Create two migration groups
3. Set the status 'in migration' for the first tranche in the source system
4. ETL (initial load) of the first tranche into migration group 1 in BPP
5. Validate the data of the first tranche
6. Data correct? If no: transfer the incorrect contracts to migration group 2 and delete the status 'in migration' in the source system
7. ETL (delta load) of the remaining contracts into migration group 1
8. Validate the data. If not correct: again transfer the incorrect contracts to migration group 2 and delete the status 'in migration' in the source system
9. Go live with migration group 1
10. Set the status of migration group 2 to 'migration failed'; set the status 'unproductive' in the source system; end of migration of tranche 1
- No system, or at least no vital system, is stopped. All processes are executed normally. It is assumed in this case that the bank wants to make an initial load and a delta load a few days later.
- During the ETL phase itself, changes of master data are stopped, but only for the relevant accounts, to simplify the data reconciliation.
- A migration status is needed so that the surrounding systems know whether they can deliver or not. In this case master data should not be changed, but you can allow postings if you decide so.
- The first data reconciliation after the initial load decides (manually or automatically) whether, up to that point, the migration is correct for every migrated contract. If not, the easiest way to handle those contracts is to transfer them to a migration group that finally receives the status 'migration failed'. In this case, the source legacy system needs to know that. You can clear all migration information as if nothing had happened; the contract is then handled like the other contracts not in migration.
- The delta load treats only those accounts that are still OK. Again there is a reconciliation, and again there might be some contracts that are not OK... and so on.
- Finally, migration group 1 goes live (status 30: migration completed), independently of the number of contracts that 'survived', and migration group 2 gets the status 40 (migration failed).

Note: If you extract data more than once for the same contract, you have to make sure that you can identify the changes since the last extraction.

If online processes deliver data to both the old and the new systems, you should reconcile all data against the source system before go-live, even if you did not load anything in the last (delta) migration process.
You have to make sure that all data records are consistent, even if it is not the fault of the Data Migration team.

2.2.8 Definition of Acceptance Criteria

One task of the migration team is to define criteria for every step of the Data Migration. You need the assistance of people from the user departments and an auditor to cover all demands of the bank and of legislation.

Here are some important questions:
- Which data must be validated? Do you have to use special tools?
- What has to be done automatically, and what can be done manually?
- What kinds of lists must be produced? Which layout? Is a printed version required?
- How much time is needed for the various verification tasks?
- Who is allowed to do the verifications and to sign the results?
- What are the criteria for a go/no-go decision?
- What kind of document is needed to record the migration process itself?
- What kind of documentation is needed for manual or automatic corrections during or after the Data Migration process?

2.2.9 External Impacts

A Data Migration is not only a technical task. There is a lot of communication with other project members, bank employees, and external systems. There are decisions to make and restrictions to pay attention to that cannot be influenced by the bank, or not easily.
Here are some examples:

- In Germany, it is possible to close a bank for one day if one weekend is not enough to migrate all data. You have to apply for this using a special form.
- The migration of cards must be planned in close cooperation with the card provider:
  - The card provider normally has very narrow time windows for testing.
  - Mass runs of the card provider must be considered (e.g. renewal runs).
  - Ask the provider to stop master data changes, if necessary.
  - The delivery of master data from the provider must be organized, including reconciliation.
- You have to inform other banks and authorities about changing bank account numbers.
- Customers must be informed within a given time frame if important functionality will change after Data Migration, or if access to the accounts is limited or not possible during the migration process.

The definition of the migration process may result in tasks for other project teams or external systems, for example:

- Parallel delivery of changes depending on the migration status of a contract (you must produce a complete list of transactions and fields)
- If the bank runs more than one legacy system, there must be a 'dispatcher' to deliver the data to the correct system.
- Do not allow transactions (issue an error message instead) for contracts in migration.
- The Valid From date of a product must be at least as old as the oldest creation date (in the source legacy system, not the migration date!) of all contracts.
- Standard conditions must start with the earliest period to be calculated during the migration process and must correspond to the historical conditions in the source system.
- If there is no equivalent to the historical conditions, ask to create special condition groups for migration.
During the migration process (directly before or after go-live) you must change the artificial condition group into a valid one.
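The two date-related requirements above (product Valid From date, start of the standard conditions) lend themselves to a simple pre-migration check. The following is a hypothetical sketch in Python; the field and function names are illustrative and not SAP APIs.

```python
# Hypothetical pre-migration validation of the two date rules: the
# product must already be valid when the oldest contract was created in
# the source legacy system, and the standard condition must cover the
# earliest period to be calculated during migration.
from datetime import date

def check_product_valid_from(product_valid_from, contract_creation_dates):
    """The product's Valid From must not be later than the oldest creation date."""
    return product_valid_from <= min(contract_creation_dates)

def check_condition_start(condition_start, earliest_settlement_period_start):
    """Standard conditions must cover the earliest settlement period."""
    return condition_start <= earliest_settlement_period_start

# Invented example data: creation dates from the source legacy system.
creation_dates = [date(1999, 3, 1), date(2003, 7, 15), date(2007, 1, 2)]
ok_product = check_product_valid_from(date(1999, 1, 1), creation_dates)
ok_condition = check_condition_start(date(1998, 12, 31), date(1999, 4, 1))
```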
You need the following customizing entries:

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Assign Medium/Payment Method to Posting Processes

Technically, it is possible to migrate starting balances as payment items. If you do so, you do not need the balances entry.

Note: If you migrate a starting balance of 0 for every account that has no postings in the legacy system, make sure that the respective transaction type will not be selected as a payment item on the bank statement.
SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Maintain Posting Types

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Transaction Types and Transaction Type Groups → Maintain and Assign Transaction Types for Payment Items

You need both transaction types, or an equivalent if the bank creates its own transaction types.
SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Assign Posting Types to Transaction Types

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Maintain Non-Balance-Changing Transaction Types
SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Maintain Non-Balance-Changing Posting Categories

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → Basic Functions → Assign Non-Bal-Changing Posting Categories to Non-Bal-Changing Trans. Types
SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → General Ledger Transfer → Maintain GL Operation

SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → General Ledger Transfer → Assign Non-Balance-Changing Transaction Type to GL Operation
SAP Customizing Implementation Guide → Financial Services → Account Management → Item Management → General Ledger Transfer → GL Account Assignment → Non-Balance-Changing Processes

It is recommended to create a new offsetting account for Data Migration in the general ledger, for reconciliation reasons.

SAP Customizing Implementation Guide → Financial Services → Account Management → Tools → Parallel Processing → Maintain Job Distribution

The number of tasks must be defined together with SAP Basis. Relevant criteria are:

- Hardware
- Available batch processes
- CPU utilization (account settlements need a lot of CPU, payment items do not)
- Other batch programs running at the same time

Note: SAP delivers most of these entries, but sometimes customers create their own posting types or delete too many of the entries because they think they do not need them. So check that the entries are still there.
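The job distribution maintained above splits a mass run into packages that the parallel tasks process independently. As a minimal sketch, assuming a simple fixed package size (the real job-distribution customizing is more flexible), the packaging might look like this:

```python
# Simplified stand-in for job distribution in a parallel mass run:
# split the sorted account list into fixed-size packages, one package
# per batch task. Package size and function name are illustrative.
def build_packages(account_numbers, package_size):
    """Split the sorted account list into packages for parallel tasks."""
    accounts = sorted(account_numbers)
    return [accounts[i:i + package_size]
            for i in range(0, len(accounts), package_size)]

# Ten accounts, four accounts per package -> three packages.
packages = build_packages(range(1, 11), package_size=4)
```

The right package size is exactly what the performance tests later in this document are meant to determine: large packages reduce overhead, small packages balance the load across batch processes better.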
If there are external systems receiving data from the legacy systems, such as a front-end system or a business information warehouse, the following has to be decided and defined:

- Must there be an extraordinary data delivery after the go-live and before people start to work with the new environment? (Possible reasons: completely new data in the new system, or new processes in which these data are needed.)
- If not, how can the user update data from the day before? (Assumption: the data are delivered only once, in the end-of-day processing.) This is a general task, not only for Data Migration!

Delivery of data for SAP BW

If you are migrating into a productive system whose data are already delivered to SAP BW, you must pay attention to the following: The BW extraction works with time stamps. The already productive data are extracted every day in the end-of-day processing. The time stamp of the last extraction is saved in a table in Banking Services, and the next extraction selects all changes made after this time stamp. The standard BW extraction selects only contracts in migration if there is a migration group in progress; already productive data are not selected. After the selection, the new time stamp is saved.

The problem can be solved by executing the following steps:

- While migrating the master data of a contract, the BW link table BCA_BCT_CN_OBJV is filled automatically. The migration group is saved in the field XTR_STATUS.
- Program RBCA_PP_SIF_TBBW_CHG_MIGR_OBJV changes the value of the migration group into the standard value for already productive contracts, 'PEEEEEEEEE'. After this, there is no difference between productive contracts and contracts in migration.
- Additionally, you have to change the value of the field Status BW-Relevant to "20".

In the next end-of-day processing, all master data and all related flow data created after the last extraction time stamp will be selected.

Note: Program RBCA_PP_SIF_TBBW_CHG_MIGR_OBJV only selects relevant entries if the migration group has the status 20 (in migration). If the migration group goes productive without the program having been executed, the migrated master data are not extracted. If you then extract payment items and other flow data for those contracts, they cannot be assigned to a contract in SAP BW.
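The time-stamp-based delta extraction described above can be sketched as follows. 'PEEEEEEEEE' is the standard value for already productive contracts mentioned in the text; the data structures and function name here are illustrative, not the real Banking Services tables or extractors.

```python
# Sketch of the time-stamp delta extraction for SAP BW: each end-of-day
# run selects only changes made after the saved time stamp, for contracts
# flagged as productive in the link table, and then saves the new stamp.
PRODUCTIVE = "PEEEEEEEEE"  # standard value for already productive contracts

def extract_for_bw(change_log, link_table, last_timestamp):
    """Select changes after last_timestamp for productive contracts."""
    selected = [rec for rec in change_log
                if rec["changed_at"] > last_timestamp
                and link_table.get(rec["contract"]) == PRODUCTIVE]
    new_timestamp = max((r["changed_at"] for r in change_log),
                        default=last_timestamp)
    return selected, new_timestamp

# The conversion program replaces the migration group in the link table
# with the productive value, so the contract is extracted like any other.
link_table = {"C1": "MIGGRP01", "C2": PRODUCTIVE}
link_table["C1"] = PRODUCTIVE  # effect of RBCA_PP_SIF_TBBW_CHG_MIGR_OBJV

change_log = [
    {"contract": "C1", "changed_at": 100},
    {"contract": "C2", "changed_at": 90},
]
# Only C1 changed after the last extraction time stamp of 95.
selected, ts = extract_for_bw(change_log, link_table, last_timestamp=95)
```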
2.2.10 Technical Aspects

2.2.10.1 Program Definition

Data Migration programs should be developed according to the customer's development guidelines for other mass-run programs. Load programs should write a log in which you can find:

- The total number of data sets
- The number of data sets loaded
- The number of data sets that failed
- The reason for each failure (message number and message text)
- A statistic with the number of records per message

Load programs should generate a new file containing the failed data sets, to facilitate correction and reload. Data sets for loading that belong together must be treated together: if one of the data sets has an error, the whole group of data sets must be written into the error file and no data is saved (complete roll-back).

Example: To load an account, you need more than one data set, especially if you have more than one condition group. The periodicity of an account settlement or statement also needs more than one data set per periodicity, depending on the migration object. You need to define whether you have to pay attention to a special data load order or not.

Example 1: Payment items
If one payment item out of 1000 for one account has an invalid transaction type, you load all other 999 payment items. After correction, the one payment item can be reloaded.

Note: This is a general rule, but it depends on how you generate the time stamps of the payment items. If you load payment items more than once and use the time stamp to separate formerly loaded items from newly loaded ones, the time stamps can impact the first statement created in Banking Services, the correct data selection for SAP BI, and the data validation itself.

Example 2: Account settlements
You have to load four settlements for one account. The second leads to an error. You cannot load the third and fourth before loading the second, because it is not possible to reload the second settlement afterwards.
Data sets 2, 3, and 4 have to be written into the error file.

Reconciliation programs should generate lists according to the definition mentioned above:

- One entry per migration object (correct and incorrect objects)
- A marker showing what is not correct
- Statistics: how many objects in total, how many correct, how many incorrect
- Like standard SAP mass runs, the programs should write a log that can be found with transaction SLG1 (application log), with information about the variant started, the content of the selection screen, error messages, and the return code. If you use an external job control tool to set variables in a variant, incorrect delivery of a variable is a frequent mistake, especially at the beginning of testing with the job control tool.
- A reorganization program should be able to create a new input file out of the original error and success messages after a dump. Keep in mind that in a program with high data volumes you cannot reorganize those data manually if you do not know what has been loaded and what has not.
- If you use EDT for loading, you can use transaction KCLP to monitor the load process.
- If you write your own parallelizing programs, use program RBANK_PP_MONITOR or the ST13 Mass Activity Monitor to view the percentage of data loaded. For detailed error messages, you have to write your own log.

2.2.10.2 Transfer of Files

Most Data Migrations are done with files: it is easier to guarantee the consistency of the data and to find the reasons for an error. If you use files, you have to make sure that no file is loaded twice. Again, it depends on the migration object whether loading a file a second time causes trouble: loading an account with an external account number twice will lead to an error message; payment items, on the contrary, can be loaded many times. It is therefore recommended to use a unique name for each file and, of course, to check the name against already processed files in the load program.

Example: Account master data file <date>_account<run number>

- Name of the first file of a day: 20081005_account01
- Name of the second file of that day: 20081005_account02
- Name of the first file of the next day: 20081006_account01

It is also possible to use the creation time stamp in the header of a file for control.

If the transfer is not done manually, the techniques for a file transfer are normally defined by the SAP Basis department: FTP or SFTP, and the writing of transfer scripts. The folder structure must be defined by the project, because often more file transfers are in use, e.g. by the end-of-day processing.

2.2.11 Test Concept

Depending on the migration strategy, the migration process can be very complicated, with many impacts on surrounding systems and processes.
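Two of the load-program rules from section 2.2.10 can be combined in one sketch: a file with a unique name of the form <date>_account<run number> must not be processed twice, and data sets that belong together are loaded all-or-nothing, with failed groups written to an error file and errors counted per message. This is an illustrative Python model; the record layout, validation rule, and all names are invented.

```python
# Minimal sketch of a load program: refuse duplicate files, load each
# group of related data sets atomically (complete roll-back on error),
# and count error messages for the load log.
from collections import Counter

def load_file(name, groups, validate, processed, log):
    """Load one migration file; groups are lists of related data sets."""
    if name in processed:                      # e.g. 20081005_account01 again
        raise ValueError(f"file {name} was already processed")
    processed.add(name)
    loaded, error_file = [], []
    for group in groups:                       # e.g. all data sets of one account
        msgs = [m for m in (validate(rec) for rec in group) if m]
        if msgs:
            error_file.extend(group)           # complete roll-back of the group
            log.update(msgs)                   # statistics per message
        else:
            loaded.extend(group)
    return loaded, error_file

def validate(rec):
    """Invented validation rule standing in for the real checks."""
    if rec["transaction_type"] == "XXXX":
        return "E001: invalid transaction type"
    return None

processed, log = set(), Counter()
groups = [
    [{"account": "A", "transaction_type": "0100"},
     {"account": "A", "transaction_type": "XXXX"}],   # whole group fails
    [{"account": "B", "transaction_type": "0100"}],
]
loaded, error_file = load_file("20081005_account01", groups,
                               validate, processed, log)
```

The error file contains complete groups, so it can serve directly as the input file for the corrected reload described above.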
On the other hand, the possibility to test all these processes is often very limited because of a lack of time, personnel, or technical resources. The following list might be helpful for a good test result:

- A test environment for testing the migration process itself (ETL, reconciliation)
  - Unit tests
  - Realistic data for reproducible tests
  - Historical test data
- A test environment for testing the processes during and after Data Migration
  - System or integration tests
  - Consistent data in all systems
- Personnel resources to write and execute the test cases

2.2.12 Realization

Besides the usual tasks of developing programs or creating EDT structures, you have to carry out some additional migration-related tasks:

- Complete the defined transformation rules (often this is only possible during the Realization phase because of dependencies on other project teams; customer-developed fields in particular are very often not defined completely during the Specification phase)
- Write program descriptions for the job control tool:
  - Sequence of the migration programs, integration into the end-of-day processing
  - Program description
  - JCLs for all migration programs
  - Reaction if a program is canceled
- Create variants for the Banking Services programs
- Write an implementation or migration story book (depending again on the migration strategy) with every detail of the migration process:
  - Date and time of every Data Migration step
  - Description of every task
  - Responsible person
  - Person to be notified on completion of a task or in case of an error
  - Go/no-go decisions
  - Fallback strategy

2.2.13 Testing and Final Preparation

Tests are vital for the verification of the Data Migration process. An essential aspect is to verify that all planned processes really work on the contracts that are migrated. The end-of-year processing in particular may require data that was created in the old legacy system but was simply forgotten during the Definition phase.

Another important aspect is to carry out performance tests in order to:

- Define the size of the tranches
- Validate the planned time frame of the Data Migration
- Define the number of batch processes and the hardware equipment needed
- Find out whether programs are too slow and need performance optimization in terms of:
  - Coding
  - File size
  - Variants: number of objects per package, or package size
- Avoid collisions and locks with other mass runs, or optimize the number of batch processes if you cannot avoid these collisions
- Find out how much disk space is required for loading
- Define how much space is needed for generating, transferring, or archiving files
- Find out when to reorganize databases during and after the migration process
- See whether enough people, and the right people, are involved to verify data and to do all the manual tasks
- Verify the sequence of all tasks
- Verify all system authorizations needed by the persons involved

When it comes
to restricted time slots during the productive Data Migration, you should already know the types of errors that can occur and how to correct them. If you test with a near-productive data set, you will detect many errors in advance that you can fix or work around.

Note: To simplify unit testing, you can create a job net within Banking Services to repeat the loading more easily. If you do this, you do not have to start every load or reconciliation program manually. Precondition: the programs must be enabled for use in a job net; for this you can use the standard.
2.2.14 Go-Live

After proper testing, no major problem should occur during the real migration. The story book gives every participating person confidence about what to do. Of course, testing cannot give you a 100% guarantee, because the productive data have changed since the last tested copy, but you should try to get as close to it as possible. The test environment also differs from the productive one, which can cause some issues.

In the Preparation phase you should:

- Check whether you have the right to access the building at night or at the weekend
- Check whether you have valid users and passwords for all systems and tools you must work with
- Verify whether you have to use special monitoring workstations for the productive systems

Make sure that you get as much support as possible from the following people (in addition to the persons mentioned in the story book):

- Database specialists
- SAP Basis specialists
- Authorization specialists
- Someone who can give you access to the building at the weekend
- Someone who gives the OK for data corrections in the productive system
3 Further Information

3.1 Tasks After Migration

It is important that the migration team and the project team, or an equivalent group of people, support the first productive days of end-of-day, end-of-month, and end-of-year processing. In particular, the first account statements sent to the customers can result in a lot of incidents in which bank customers complain about missing information or documents, wrong periodicities, miscalculated interest or fees, and so on, or in which users have problems with new processes and layouts. There will not always be a need for a program or data correction, but even then, the effort to analyze all these incidents should not be underestimated.

During the first end-of-day processing it may turn out that the performance of a mass run is too poor for the number of accounts that are now in the system. This can happen because the test system has the same master data as the productive system but does not receive as many master data changes or flow data records. Then someone has to find a solution for this, often during the same night.

When the Data Migration is finished, you have to clean up:

- Archive the migration logs, task lists, and event logs
- Deactivate or delete the job definitions in the job control tool
- Remove modifications developed especially for the migration process
- Switch off all systems that are no longer used
