
Informatica 5+ Years of Experience


Having 5+ years of experience as an Informatica developer, with additional experience in Informatica Data Quality (IDQ).


DHARMA CH
E-Mail: dharamach@gmail.com
Contact No: +91 9742363999

Career Objective:
A challenging, growth-oriented position in a progressive company where my skills are effectively utilized to improve operations and contribute to the organization's success, and where the work lets me keep learning something new.

Professional Summary:
• Around 5.7 years of IT experience in data warehousing and BI tools, with progressive ETL experience in design, development, testing, implementation and administration.
• Experience with Informatica Data Quality (IDQ), using DQ transformations such as Case, Merger, Match, Key Generator, Association, Consolidation, Comparison, Exception and Address Validator.
• Good knowledge of profiling on data objects; experience migrating IDQ mappings and integrating them with PowerCenter.
• Around two years of experience with the TIBCO BW real-time integration tool.
• Experience with Salesforce.com configuration and integration with SFDC.
• Expertise in the analysis, design, development, testing and implementation of software applications, and in providing data warehousing and BI solutions.
• Extensive knowledge of architecture design for Extract, Transform, Load (ETL) environments using Informatica PowerCenter.
• Worked on various migration projects.
• Worked on TIBCO web servers and JMS queues and topics.
• Technical expertise in the Informatica ETL tools (Designer client tools, Workflow Manager, Workflow Monitor and Repository Manager).
• Proficient in integrating various data sources such as Salesforce, flat files, Oracle, Microsoft SQL Server and web services, using relational databases such as Oracle and Microsoft SQL Server as the staging area.
• Expertise in developing mappings, shared mapplets, sessions, workflows, worklets and all task types using Informatica Designer and Workflow Manager.
• Experience in designing and developing complex mappings using transformations such as Connected/Unconnected Lookup, Router, Filter, Expression, Aggregator, Normalizer, Joiner, Stored Procedure, Sequence Generator, Transaction Control, Update Strategy and Web Service Consumer.
• Worked on tasks such as Session, Email, Control, Assignment, Decision, Command, Event Wait and Event Raise; used assignment variables to pass values between workflow and mapping, mapping and session, and session and workflow.
• Worked with both sequential and concurrent batches.
• Implemented session partitioning and database partitioning to improve performance where data volumes are large.
• Worked with different databases, including Oracle 10g and MS SQL Server.
• Good knowledge of SQL and PL/SQL.
• Good knowledge of UNIX shell scripting.
• Experience in Informatica administration: moving code as folders and XMLs from DEV to ITG and then to PROD, taking repository backups, and setting user permissions and environment variables.
• Worked on the Tidal scheduling tool to schedule jobs and set them up in the Tidal adapter.
• Worked on the SOAP, JMS, ODBC and Service palettes to provide web services to the downstream systems.
• Worked on various data migration projects during the HP split, from HPE to HPI, using the Informatica data integration tool.

Education Summary:
• Master of Computer Applications from Andhra University, Vishakhapatnam, A.P., India, May 2010.

Technical Summary:
Operating Systems : Windows 2003 Server/XP, UNIX, Linux
ETL Tools         : Informatica PowerCenter 8.x/9.x, IDQ, TIBCO BW 5.7
Databases         : Oracle 10g/11g, Microsoft SQL Server
Tools             : TOAD, PuTTY, SFDC Apex Data Loader, SQL*Loader
Scheduling Tools  : Cisco Tidal Enterprise Scheduler, DAC

Professional Experience:
• Currently working as a Senior Software Engineer at Allegis Services (India) Pvt Ltd, from Sep 2014 till date.
• Worked as a Senior Software Engineer at Capgemini India Pvt Ltd from Jun 2011 to Sep 2014.
• Worked as a Software Engineer at Mahindra Satyam from Aug 2010 to Jun 2011.

Major Assignments:

Client      : Hewlett Packard (HPE & HP Inc)    Feb 2012 – till date
Role        : ETL Developer and Application Support
Project     : NextGen CRM Salesforce.com
Application : Salesforce Cloud
Tools       : Informatica PowerCenter 9.x, TIBCO BW, Apex Data Loader
Databases   : Oracle 10g/11g, Microsoft SQL Server
OS          : UNIX

Project: NextGen CRM Salesforce CRM Integration
Client: Hewlett Packard
Role: ETL Developer

Project Description:
Salesforce CRM is a cloud-based application that mainly deals with HP's customers, partners and products. In this project I worked on various integration applications such as Opportunity Management, Lead Management, Opportunity Renewals, ERMS, Territory Management and the Case API real-time integration, as well as various migration projects from Siebel to SFDC and from SFDC to SFDC. Opportunity Management loads the opportunities and their related products from Siebel, since Salesforce is the next generation of Siebel; the cases and support requests were also migrated.
ERMS is an enterprise records management system that maintains the data archived from Salesforce.

Opportunity Renewals is an interface that creates opportunities, products and contracts in SFDC from SAP. From SAP we receive three regional file sets, each with a header and a detail file, on a daily basis. The interface always picks up the oldest file set first. If any record fails to load into Salesforce, it is updated with a proper error message and loaded into a pending object in Salesforce with its key fields, and into a database pending table with all fields. Once the correction has been made and the admin team sets the "to be processed" flag, the interface picks the failed records up again from the database pending table. After every interface run, a bank-statement-style report covering each and every record is sent out using UNIX shell scripting (a shell sketch of this pick-up and reporting step follows the responsibilities list below).

Territory Management creates the territories and the users associated with those territories in SFDC from the MyComp system. It also creates the product specialty definitions, assigns the sales rep users to the corresponding specialties, and assigns the sales person managers and their role types to the sales representatives in SFDC.

Case API is a real-time application used to create support cases in SFDC from the boundary system; it sends the SFDC 18-digit ID back to the boundary system as the response.

Roles and Responsibilities:
• Participating in design calls to understand the design.
• Creating the TDD and KT documents and providing KT to the support team.
• Writing unit tests and logging defects during development.
• Coordinating with the source and target systems to make sure the data flows accurately.
• Providing problem-solving expertise and complex data analysis to develop business intelligence integration designs.
• Working with analysts and business users to translate functional specifications into technical designs for implementation and deployment.
• Implementing processes and logic to extract, transform and distribute data across one or more data stores from a wide variety of sources.
• Validating data integration by developing and executing test plans and scenarios, including data design, tool design and data extract/transform.
• Performance tuning of mappings, SQL queries and sessions (session partitioning) to optimize session performance.
• Unit testing and debugging of Informatica mappings.
• Using the variable assignment concept to pass mapping parameters/variables to worklets and workflows.
• Mapping data, establishing interfaces, developing and modifying functions, programs, routines and stored procedures to export, transform and load data while meeting performance parameters, and resolving and escalating integration issues.
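The Opportunity Renewals interface described above always processes the oldest header/detail file set first and mails a per-run report to the admin team. Below is a minimal shell sketch of that pick-up and notification step; the directory paths, the file-naming pattern and the distribution list are illustrative assumptions rather than the actual project values, and the Informatica workflow start is left as a placeholder.

    #!/bin/sh
    # Sketch only: pick the oldest Opportunity Renewals file set and mail the run report.
    # IMPORT_DIR, the *HEADER*/*DETAIL* naming pattern and the PDL address are assumptions.

    IMPORT_DIR=/data/renewals/import            # assumed pickup location for the SAP files
    REPORT=/data/renewals/logs/run_report.txt   # assumed per-run report written by the load
    ADMIN_PDL="renewals-admin@example.com"      # assumed admin distribution list

    # Oldest header file first (ls -tr lists oldest to newest by modification time)
    oldest_header=$(ls -tr "$IMPORT_DIR"/*HEADER*.dat 2>/dev/null | head -1)

    if [ -z "$oldest_header" ]; then
        echo "$(date): no Renewals file set found in $IMPORT_DIR" >> "$REPORT"
    else
        detail_file=$(echo "$oldest_header" | sed 's/HEADER/DETAIL/')
        echo "$(date): processing set $oldest_header / $detail_file" >> "$REPORT"
        # ... start the Informatica workflow for this file set here (e.g. with pmcmd) ...
    fi

    # Mail the record-level statistics gathered by the load to the admin team
    mailx -s "Opportunity Renewals - run report" "$ADMIN_PDL" < "$REPORT"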
Project #: NextGen CRM GETIT ED Integration (Salesforce CRM Integration)
Client: Hewlett Packard
Role: ETL Developer

Project Description:
GetIT ED is an SFDC HRIT application that maintains the information on all HP regular, terminated and contract employees. The data is downloaded from the Enterprise Directory to our UNIX server as files using the curl UNIX command, and the employee data is maintained in this application. Once a new employee is created, updated or inactivated, an email notification is sent from SFDC to the HRIT PDL, along with a report of the actioned records.

Roles & Responsibilities:
• Gathering the requirements from the business and understanding the design from the functional design documents (HLDs).
• Providing the level of estimation for these requirements and uploading them in SFDC SAASPMO, the project tracking tool.
• Designing the mappings and sessions based on the requirements.

Project #: MyComp SFDC Integration

Project Description:
MyComp is an external system that contains the account territory and product specialty information. It tells which account territory a user should sit under and which product specialty that user holds. The territory relationship is many-to-many (n:n), whereas product specialty is one-to-many: one user can have only one specialty, but one specialty can have more than one user assigned to it. MyComp sends the feed files in .dat and .aud format, and HPSB places them in the import location, which is the pickup location for the integration. A shell script compares the record count in the .aud file with the number of records in the .dat file, and only if the accuracy is 99% or better does the script load the file into SFDC (sketched below). The data is then loaded into SFDC using the business logic. After each load we send the statistics of the load to the admin team in the form of user reports: every record received from the input must end with a status of succeeded, rejected, failed or warning. The user reports of all the MyComp interfaces are zipped, and the email includes each individual SFDC attachment URL.
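A minimal shell sketch of the .aud versus .dat count check described for the MyComp feed above. The .aud layout (the record count taken from the last pipe-delimited field) and the argument names are assumptions made for illustration, and the step that actually starts the SFDC load is only a placeholder.

    #!/bin/sh
    # Sketch only: compare the record count declared in the .aud control file with the
    # number of records actually present in the .dat file; load only at >= 99% accuracy.
    # The .aud layout (count in the last pipe-delimited field) is an assumption.

    DATA_FILE=$1   # e.g. a MyComp .dat feed file
    AUD_FILE=$2    # the matching .aud control file

    expected=$(awk -F'|' 'END {print $NF}' "$AUD_FILE")   # declared record count
    actual=$(wc -l < "$DATA_FILE")                        # records actually received

    if [ -z "$expected" ] || [ "$expected" -eq 0 ]; then
        echo "No record count found in $AUD_FILE - file set rejected" >&2
        exit 1
    fi

    accuracy=$(( actual * 100 / expected ))               # integer percentage

    if [ "$accuracy" -ge 99 ]; then
        echo "Count check passed ($actual of $expected records); starting the SFDC load"
        # ... kick off the Informatica workflow that loads this file into SFDC ...
    else
        echo "Count check failed ($actual of $expected records); load skipped" >&2
        exit 1
    fi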
Project 1: Siebel CRM Integration    Jul 2011 – Feb 2012
Client : Hewlett Packard
Role : Software Engineer

Project Description:
In this project we send data from Siebel to the downstream systems in the form of files. The data from Siebel is sent as two sets of files: a data file with the .dat extension and an .aud file. The .dat file holds the data, while the .aud file holds the keys, such as the source system key, the timestamp, the version number and the number of records. After the .dat and .aud files are generated, both are zipped using DOS scripting. The data is extracted entity-wise: opportunities, agreements, activities, products, accounts, contacts, users, profiles, job roles and so on. A delete-capture process was also implemented: when a record is deleted from Siebel, triggers defined on each entity insert it into index-organized tables.

Roles and Responsibilities:
• Wrote the triggers on all the Siebel objects and executed them in the DEV and ITG databases.
• Tested the generated files using the HP File Format Testing (FFT) tool; whenever the counts differed between the .dat and .aud files, analysed the data and checked for stray newline characters.
• Wrote the unit test scripts and logged the unit testing defects in HP Quality Center.
• Ran the test cases in the ALM test lab.
• Set up the dependencies between the three loads (full, incremental and delete), region-wise.
• Ran the loads during SIT and UAT so that data was available for the testers.
• Monitored the loads and sent the weekly status report with the time taken by each session.
• Tested the dashboard reports.
• Scheduled the loads using the Data Warehouse Administration Console (DAC) tool.

Project #:
Client: Mahindra Satyam
Environment: Windows
From: Aug 2010 to Jun 2011

Informatica Administration:
Worked as an Informatica administrator, moving code from DEV to ITG and then to PROD.

Roles & Responsibilities:
• Created the folders and users for development and testing.
• Migrated folders and repositories from one domain to another.
• Moved folders and migrated the code XMLs from one repository to another.
• Loaded the shortcuts into the shared folder and granted permissions to the users who needed access to them.
• Performed code validations before folder migrations.
• Removed locks on mappings, sessions and workflows from the Admin Console.
• Took backups of the repositories and maintained them by timestamp (see the sketch below).
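As a rough illustration of the timestamped repository backups mentioned in the last responsibility above, the sketch below wraps the PowerCenter pmrep command-line client. The repository, domain and user names, the paths and the retention count are placeholders, and the pmrep option flags are quoted from memory of the 9.x client, so treat the exact syntax as an assumption to be verified against the installed version.

    #!/bin/sh
    # Sketch only: timestamped repository backup via pmrep, with a simple retention rule.
    # Repository/domain/user names, paths and the retention count are placeholders;
    # verify the pmrep options against the installed PowerCenter client.

    BACKUP_DIR=/infa/backups
    STAMP=$(date +%Y%m%d_%H%M%S)

    # Connect to the repository service, then write a backup file named by timestamp
    pmrep connect -r DEV_REP -d Domain_Dev -n admin_user -x "$INFA_ADMIN_PWD"
    pmrep backup -o "$BACKUP_DIR/DEV_REP_${STAMP}.rep"

    # Keep only the 14 most recent backup files (xargs -r is a GNU/Linux extension)
    ls -t "$BACKUP_DIR"/DEV_REP_*.rep | tail -n +15 | xargs -r rm -f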
