Collaborate 2012: Business Data Transformation and Consolidation for a Global Energy Services Company

For More Details Visit us: http://www.chain-sys.com


    Presentation Transcript

    • Business Data Transformation and Consolidation for a Global Energy Services Company
      Presented by: John Perkins, Director, ChainSys Corporation, Lansing, MI
      www.chainsys.com
    • Introduction: Business and organizational changes have significant ramifications on the data requirements of a global company. Combine these changes with radical business process change and you have a functional and technical problem of increased complexity. Migration of master, referential, and transactional data to new business organizations requires precise planning, coordination, and execution. We will review in detail a case study for a global energy services company, covering the process followed throughout the project.
      Objective 1: To provide project and data strategies for dealing with multiple organizational consolidations.
      Objective 2: To discuss both the manual and automated tools that were used to create the new business entity.
      Objective 3: To review the structured process and techniques followed by the project team.
      Objective 4: To review lessons learned in dealing with a complex business process and data consolidation project.
      Objective 5: To review processes to follow to mitigate business risks with data migration.
      Intended audiences: (i) individual contributor, (ii) project team member, (iii) project manager.
    • Project Overview: An established power generation company, part of a USD 4 billion group, with an operational capacity of 191 MW of power and a generating capacity of 2,460 MW under various stages of implementation. Demerged from BILT. Captive power plants operating since 1955.
    • Project Overview: Organization structure. APIL head office (HO), with captive power plant (CPP) and independent power plant (IPP) divisions; thermal power plants Korba West Power Co Ltd and Jhabua Power Ltd; Plants 1-4.
      Legend: APIL - Avantha Power and Infrastructure Limited; CPP - Captive Power Plants; IPP - Independent Power Plants; HO - Head Office.
    • Project Overview: Legacy and new systems.
      Legacy system: Oracle EBS 11.5.7. Modules: Oracle OPM, Oracle Financials.
      New system: Oracle EBS R12.1.3. Modules: Oracle OPM, Oracle Financials, Oracle Projects, Oracle EAM, Oracle OM/PO/QA/QP.
    • Why Reimplementation?
      Demerger from BILT in 2006 and the need for a separate legal entity and financial reporting.
      Need to re-engineer and standardize the end-to-end processes.
      Requirement to capture production and product costs for pricing, to cover ROI mark-up, and for regulatory reporting.
      Need to address requirements relating to capturing project costs, asset capitalizations, and maintenance costs.
      Need to remove redundant workarounds by enabling the new functionalities.
    • Challenges
      Operational: multiple financial transactions; business process re-engineering; no visibility into production cost; no tax benefits; no tracking of project expenditures and capital expenditures; complicated period-closing cycle; valuation for coal and chemicals not available.
      Compliance and governance: comply with IFRS; comply with tax rules; improve group-level consolidation and financial reporting; comply with multiple legislative, industry, and geography requirements.
      Technology: end-of-life application (Oracle EBS 11.5.7); no good reporting / BI tools; inadequate MIS reporting.
    • Process Improvements
    • Process Improvements
    • Key Concepts: Data Life Cycle. Stages: Plan, Obtain, Store, Maintain, Apply, Dispose.
      Master data: customers, employees, vendors, locations, organizations, accounts, products.
      Reference data: customer type, item type.
      Transactional data: sales orders, purchase orders, trips, deliveries, invoices, payments.
      Metadata: table names, field names, constraints, data types, data quality.
      Data mart: extracted data, transformation, validations, cleansing.
      Data manipulation: extraction, validation, transformation, cleansing, consolidation, cross-reference, loading.
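The data-manipulation stages above (extraction, validation, transformation, loading) can be sketched as a minimal pipeline. This is an illustrative sketch only; the function names, fields, and account map below are assumptions, not part of appMIGRATE.

```python
def extract(source_rows):
    """Obtain: copy raw records from the source into a staging list."""
    return [dict(row) for row in source_rows]

def validate(rows):
    """Validation: reject records missing required attributes."""
    good, rejected = [], []
    for row in rows:
        # Hypothetical rule: customer and amount must be present.
        (good if row.get("customer") and row.get("amount") else rejected).append(row)
    return good, rejected

def transform(rows, account_map):
    """Transformation: remap legacy GL accounts via a cross-reference."""
    for row in rows:
        row["account"] = account_map.get(row["account"], row["account"])
    return rows

# Illustrative data: one clean record, one that fails validation.
source = [
    {"customer": "ACME", "amount": 100, "account": "1000"},
    {"customer": "", "amount": 50, "account": "1000"},
]
staged = extract(source)
valid, rejected = validate(staged)
loaded = transform(valid, {"1000": "10-000-100"})
```

The rejected list plays the role of the error report reviewed before any load to the target system.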
    • Oracle E-Business Suite R12.1.3 Migration Approach. The main activities in data migration:
      1. Define data objects and attributes to be migrated. Our methodology begins with facilitated workshops to define the data objects and attributes to be migrated, together with the data validation, transformation, and cleansing rules and requirements. The workshops help build stakeholder ownership. This information is then documented, and source-to-target object/attribute maps and data validation and cleansing rules are created and approved. These documents are the basis for configuring the appMIGRATE templates.
      2. Configure appMIGRATE extraction templates accordingly. appMIGRATE features over 200 extraction adapters and load templates that include every object and attribute in the standard eBS database. Template configuration involves mapping the source object attributes to the target object attributes, including any user-defined attributes and additional fields (Descriptive Flexfields). ChainSys will be pleased to complete the configuration task or to train client personnel in configuring the templates.
      3. Extract selected objects and attributes from the source instance and move them to a staging table. The extract and load templates are selected, and the source instance and file(s) to be extracted are identified. Upon execution of the "Run-Loader" program, data is copied from the source object to the data mart and is available for review and edit. A complete electronic record of the source location of each record is maintained.
      4. Transform and cleanse data as required. appMIGRATE provides a robust logical data transformation tool, capable of logically modifying existing data or creating data in pre-defined fields. For example, if a new segment is added to the general ledger chart set, the entry can be created using this feature. appMIGRATE includes a number of tools to support data cleansing; for example, data may be compared to locate duplicate records with alias key identifiers based on sound, key words (Levenshtein distance method), and custom-configured logical relationships. All objects and attributes in the data mart are available for edit, and an audit trail is created for all changes.
      5. Pre-validate data prior to load to the eBS target instance, and correct any pre-validation errors. appMIGRATE performs a pre-validation of the data in the data mart using the same criteria as the target instance; in addition, special validation rules may be added to the pre-validation if needed. Data is validated and incorrect data rejected: an on-line error report is presented, and the user may make corrections as required. A tool to assist in the correction of errors (by suggestion) is also provided. All data corrections are collected and maintained, and are available in electronic document form.
      6. Load data into the target instance. appMIGRATE uses standard eBS Application Programming Interfaces (APIs) to ensure that data is validated in accordance with the eBS rules; where standard APIs are not provided by eBS, appMIGRATE uses custom APIs and Open Interface rules to provide for data validation. Error exception APIs and upload programs are created and executed. Load executions may be scheduled automatically using File Transfer Protocol (FTP) procedures standard to eBS, and a load execution runs as a background task, permitting other applications to be used simultaneously. Data migration is complete.
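Step 5 above, pre-validating staged data with the same criteria the target instance will apply, can be sketched as follows. The rule names and record fields are hypothetical; appMIGRATE's actual rule engine is not shown here.

```python
# Illustrative pre-validation rules: each is a (name, predicate) pair.
RULES = [
    ("missing item code", lambda r: bool(r.get("item"))),
    ("negative quantity", lambda r: r.get("qty", 0) >= 0),
]

def pre_validate(rows):
    """Apply every rule to every row; split into clean rows and an error report."""
    clean, errors = [], []
    for i, row in enumerate(rows):
        failed = [name for name, ok in RULES if not ok(row)]
        if failed:
            errors.append({"row": i, "errors": failed})
        else:
            clean.append(row)
    return clean, errors

rows = [{"item": "PUMP-01", "qty": 4}, {"item": "", "qty": -2}]
clean, report = pre_validate(rows)
```

Only the clean rows proceed to the API-based load; the report corresponds to the on-line error report the user corrects against.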
    • Data Transformation and Cleansing
      Data transformation: OPM inventory to discrete inventory transformations; new structure and accounts for GL accounts; new item codification; new sub-inventory and locator implementation; projects / tasks included in the inventory, PO, and sales transactions; transformation occurs on the data mart; rules engine for transformation; cross references.
      Data cleansing: manual and automated cleansing; threshold value for automated cleansing; extract the data and load it into the data mart for cleansing; manual cleansing by exporting the data mart data into an Excel spreadsheet; automated cleansing using the Levenshtein distance method, Soundex, and custom rules; audit trail for the cleansed data; validate the data after cleansing before it is loaded into Oracle EBS R12.1.3.
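The two matching techniques named above, Levenshtein distance and Soundex, can be sketched in a few lines. These are textbook implementations (the Soundex here is a simplified variant), not ChainSys code, and the threshold value is illustrative.

```python
def levenshtein(a, b):
    """Edit distance: minimum inserts/deletes/substitutions to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def soundex(name):
    """Simplified Soundex: first letter plus up to three consonant codes."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    out, last = name[0].upper(), codes.get(name[0].lower(), "")
    for ch in name[1:].lower():
        code = codes.get(ch, "")
        if code and code != last:   # skip vowels and collapse repeats
            out += code
        last = code
    return (out + "000")[:4]

def likely_duplicate(a, b, threshold=2):
    """Flag two names as duplicates by edit distance or matching sound."""
    return levenshtein(a, b) <= threshold or soundex(a) == soundex(b)
```

Records flagged this way would still go through the audit-trailed review step rather than being merged automatically.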
    • e●Chain™ Data Consolidation
      Source instances: JD Edwards EnterpriseOne, legacy ERP, eBS Instance 1, eBS Instance 2.
      Flow: collect master data from all sources; standardize / harmonize master data; review and cleanse master data; consolidate data in a common repository (appLOAD Data Repository); collect transactional data from all sources; consolidate transactional data, maintaining instance keys and cross references in a cross-referencing / harmonization file.
    • An alternate approach to data standardization is data harmonization. In this technique, all unique records are still linked via a common and standard key identifier. A cross-reference table, similar to that used for part-number cross-referencing, is developed to cross-reference the original key identifiers from the diverse data repositories to the standard key identifier. This technique provides a means of obtaining the standard key identifier when the original key identifier is presented.
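The harmonization cross-reference described above amounts to a lookup from (source system, original key) to the standard key identifier. The system names and keys below are invented for illustration.

```python
# Hypothetical cross-reference table: each source system's original key
# maps to one standard (harmonized) key identifier.
xref = {
    ("LEGACY_ERP", "CUST-0042"): "GLOBAL-000017",
    ("EBS_INST1", "10042"): "GLOBAL-000017",
    ("EBS_INST2", "C42"): "GLOBAL-000017",
}

def standard_key(system, original_key):
    """Return the harmonized identifier for a source-system key, or None."""
    return xref.get((system, original_key))
```

Unlike full standardization, the original keys survive in each source system; only the cross-reference links them.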
    • Key Take Away: Recommended Solution
    • Key Take Away
    • Thank You.