Mukhtar Ahmed
SUMMARY
 Over 8 years of work experience in Data Warehousing across various ETL projects.
 All-round experience in the design, build, deployment and support of DWH-ETL processes.
 In-depth knowledge and understanding of DWH concepts and dimensional modeling.
 Highly specialized in working on IBM InfoSphere Datastage 11.3/8.x, Ascential Datastage 7.x/6.0
 Worked on Server, Parallel and Sequence Datastage jobs involving a variety of stages.
 File stages such as CFF, Hash File, Flat File, Dataset and Sequential File
 Processing stages such as Merge, Join, Lookup, Transformer, Aggregator, Sort, Remove Duplicates,
CDC, Funnel, Copy and Surrogate Key Generator
 Database stages such as ODBC, DB2-UDB, Oracle Connector, Teradata stage and TPT Connector
 Built Shared Containers, Basic routines, Multi-Instance jobs, Before/After subroutines
 Worked on multiple databases: Teradata, DB2, Netezza, Oracle, SQL Server.
 Expert in working with Teradata Tools & Utilities (BTEQ/FastLoad/FastExport/MultiLoad/TPT/tbuild)
 Skilled in writing UNIX shell scripts and generic DS automation scripts to execute DS jobs.
 Worked on multiple scheduling/monitoring tools such as Tivoli Workload Scheduler (TWS), Control-M, Autosys.
 Specialized in dealing with massive data warehouse applications (larger than 100 terabytes)
 Confident in building high-quality code, including performance tuning & query optimization
 Hands-on experience in Datastage/Teradata/AIX upgrade/migration projects.
 Good understanding of the reporting tools like Cognos, Tableau, Business Objects.
 Expert in Operations & Production support functions to resolve Datastage/Teradata job aborts.
 Worked on Tivoli storage management tools for archival and retrieval of the data files
 Well versed in the AIX/UNIX commands required for performing ETL jobs.
 Gained domain knowledge in Health Care, Retail Banking, Mortgage and Insurance over time.
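The generic DS automation scripts mentioned above typically wrap the standard DataStage `dsjob` command-line interface. A minimal sketch, with hypothetical project and job names, that builds the command string rather than executing it (so it runs anywhere, since `dsjob` exists only on a DataStage server):

```shell
#!/bin/sh
# Hypothetical generic wrapper around the DataStage `dsjob` CLI.
# Builds and prints the command instead of executing it.
build_dsjob_cmd() {
    project="$1"    # DataStage project name
    job="$2"        # job name (or job.invocationid for multi-instance jobs)
    shift 2
    cmd="dsjob -run -jobstatus -wait"
    for p in "$@"; do               # optional name=value job parameters
        cmd="$cmd -param $p"
    done
    echo "$cmd $project $job"
}

build_dsjob_cmd NHI_PROJ LoadClaims RUN_DATE=2016-01-31
# -> dsjob -run -jobstatus -wait -param RUN_DATE=2016-01-31 NHI_PROJ LoadClaims
```

In a real wrapper the built command would be executed with `eval` and its exit status checked; `-jobstatus` makes `dsjob` return the job's finishing status.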
TECHNICAL SKILLS:
ETL Tools : IBM InfoSphere Datastage 8.1/8.5/11.3, Ascential Datastage 7.5.
RDBMS : SQL scripting in Teradata, DB2, Oracle 8.1, MS SQL Server 6.5/7.0
Teradata Tools : BTEQ/FASTEXPORT/FASTLOAD/MULTILOAD/TPT/TPUMP
SQL Clients : Teradata SQL Assistant, DB2 Control Center, Toad, SQL Developer.
O/S : AIX 6.1/7.1 (UNIX scripting), Windows batch scripting, Mainframe z/OS
Scheduling Tools : Control-M V6.3 & V7, Tivoli Workload Scheduler (TWS), Autosys.
Versioning Tools : MKS Source & Integrity, Tortoise SVN
Archive Tools : Tivoli Storage Manager (TSM)
QA Tools : HP QC & ALM
Document Tools : Microsoft Office Excel/Word/PowerPoint/Visio, UltraEdit, TextPad
TRAINING & CERTIFICATION:
 Six Sigma – White Belt trained professional.
 Company trained and certified on AIX, RDBMS and Data Warehousing
 Passed the Datastage competency test conducted within the organization.
 Trained on the Teradata database and utilities.
 Internally certified on global courses such as Global Anti-Bribery, Anti-Money Laundering, Information
Security Risk and Privacy/Data Protection Awareness
EDUCATIONAL BACKGROUND:
BE in Information Technology, Osmania University, 2004 – 2008
13800 Chestnut Drive, APT 208,
Eden Prairie, Minnesota 55344, U.S.A.
Email: asad_3077@yahoo.com
Cell: +1-952-737-2060
PROFESSIONAL EXPERIENCE:
Optum Technology Inc. (UnitedHealth Group),
Sr. ETL Developer November 2014 - Present
Project: Normative Health Informatics – adding health care data for over 10 million members, used for
Life Sciences research and analysis.
 Build complex Datastage jobs to handle huge workloads of data (hundreds of billions of records).
 Utilize Teradata Tools and Utilities such as BTEQ/FastLoad/MultiLoad/tbuild/TPT framework to load
large tables of over 2 terabytes.
 Build unique primary/secondary/join indexes on Teradata tables
 Worked on both Server and Parallel Datastage jobs involving all the different stages:
File stages (Sequential, Dataset, CFF, Hash files), Processing stages (Transformer, Sort,
Remove Duplicates, Aggregator, Copy, Merge, Join, Lookup, CDC, Funnel), Database stages
(ODBC, TD Enterprise, TPT Connector, TD MultiLoad, DB2, Oracle Connector, etc.)
 Develop generic shell scripts on the AIX server to execute thousands of Datastage & Teradata jobs.
 Create Tivoli Workload Scheduler jobs to schedule and automate the execution of these jobs.
 Coordinate with teams across locations in smooth execution of ETL process & data deliverables.
 Build ETL metric solution for detecting monthly increase/decrease of data from source files.
 Take full ownership for proper functioning of application processes and maintaining Data quality.
 Re-design and optimize the existing processes where required.
 Carry out capacity planning and take proactive measures to avoid production issues.
 Follow Agile methodology for progressing and tracking day-to-day project tasks
Environment: AIX 7.1, Datastage 8.5/11.3, Teradata 14/15, TWS, HPSM, SVN.
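The BTEQ-based loads above are typically generated and driven from shell. A sketch of a script generator, assuming placeholder server and table names and `TD_USER`/`TD_PASS` environment variables; it prints the BTEQ script instead of piping it into `bteq`, so it runs without a Teradata client installed:

```shell
#!/bin/sh
# Hypothetical sketch: emit a BTEQ script for a simple stage-to-target insert.
# Server and table names are placeholders; .LOGON/.IF/.QUIT are standard BTEQ.
make_bteq_script() {
    tdp="$1"        # Teradata server (tdpid)
    table="$2"      # target table; assumes a ${table}_STG staging table
    cat <<EOF
.LOGON $tdp/\$TD_USER,\$TD_PASS;
INSERT INTO $table
SELECT * FROM ${table}_STG;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}

make_bteq_script tdprod DW.CLAIMS
```

In practice the output would be piped into `bteq` and the exit code checked; the `.QUIT 8` on error is what lets the calling script detect a failed load.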
Optum Technology Inc. (UnitedHealth Group),
Sr. ETL Developer September 2013 - November 2014
Project: HCCI – integration of health care data for over 1 million additional members, critical for analysis
and research for the UnitedHealthcare business; this also involved migrations/upgrades of the existing system.
 Develop load process for member data and generate rekey data for medical claims rekey.
 Create new File Watcher jobs to help receive thousands of files at landing path and properly move,
process and archive them for future usage.
 Build complex data stage jobs to cleanse, transform and de-identify the source data
 Build Load scripts using TPT framework for parallel loading millions of records in tables
 Define unique primary indexes/secondary indexes/Join indexes on the tables
 Create different views on top of the tables to de-identify and showcase only the relevant data.
 Create new Unix shell scripts for running hundreds of DS jobs using automated scripts
 Build new TWS scheduling jobs to execute the newly built processes in the proper order.
 Work on BA request to perform detailed impact analysis on application design issues/changes.
 Write Technical specification documents and flow diagrams and keep them up to date.
 Build healthy relationship with project stakeholders/ business partners.
Environment: AIX 6.3, Datastage 8.5, Teradata 14, TWS, HPSM, SVN.
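The File Watcher jobs above can be sketched as a shell sweep of the landing path: each arriving file is moved to an archive directory with a date suffix for future retrieval. The directory layout and the `.dat` suffix are assumptions:

```shell
#!/bin/sh
# Hypothetical file-watcher step: sweep the landing path and archive each
# data file with a timestamp suffix. Paths and suffix are illustrative.
archive_landing_files() {
    landing="$1"
    archive="$2"
    mkdir -p "$archive"
    for f in "$landing"/*.dat; do
        [ -e "$f" ] || continue        # glob matched nothing: skip
        mv "$f" "$archive/$(basename "$f").$(date +%Y%m%d)"
    done
}
```

A real watcher would also poll on a schedule, verify file completeness (e.g. a trailer record or companion trigger file) before moving, and guard against same-day reruns overwriting an archived copy; those steps are omitted here.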
HSBC Software Dev. Pvt. Ltd, Sr. ETL Developer, Nov 2012 – Sep 2013
Project: Mortgage Transformation Program – tracking and storing every event that happens
during the lifecycle of a mortgage application, required for research.
 Lead projects through all SDLC phases, utilize the team & ensure timely delivery as per the release plan.
 Interact with the Business Analysts to understand ETL requirements
 Understanding the Mortgage Application journey in terms of different Milestones
 Design complex ETL processes, Basic server routines and Teradata SQL scripts (BTEQ)
 Build data cleansing, validation and DSJOB scripts using UNIX shell scripting
 Build multiple instance jobs to run the same process for 72 different sites
 Utilize Runtime Column Propagation to deal with multiple sets of files
 Use Import/Export commands for migrating the data stage code across the projects
 Developed Datastage jobs involving Join/Lookup/Merge/Transformer/Remove Duplicates/Funnel stages
 Recommend new data mart designs as well as improvements to the current data warehouse
 Act as a reviewer on Technical review panel and review code created by other team members
 Ensure compliance with organization and project defined standards and processes
 Follow and enforce best practices, framework processes, standards and conventions.
Environment: AIX 6.3, Teradata, HP-QC, Datastage 8.1, Trillium server, Control-M.
HSBC Software Development Pvt. Ltd, ETL Developer, Jul 2011 – Nov 2012
Project: Liquidity Management Program – gathering and storing details of all the liquidity assets and
liabilities held by the HSBC UK business, critical for calculating liquidity risk.
 Understand the business requirement and functional specifications documents
 Create Technical design documents and Data model documents
 Build sequencers to run datastage jobs, Routines, shell scripts in loops as per requirement
 Develop parallel jobs involving stages like Transformer, Sort, Remove Duplicates, Aggregator, Merge,
Join, Lookup, Change Capture, Change Apply and Peek.
 Create Basic routines to generate and allocate new ID for different source files received.
 Verify explain plans and optimize SQL queries by building join/secondary/primary indexes.
 Execute the System integration, User acceptance test as per plans
 Create DDL for several tables and define proper indexes, datatypes, compression.
 Use MKS Integrity code versioning tool for checking in/out the code components.
 Verify Crystal Reports on the Cognos framework to track and test the mortgage application status
 Provide secured data access to the users as per Compliance requests.
 Ensure compliance with organization and project defined standards and processes
Environment: IBM Information Server Datastage 8.0, Teradata, SQL Server, UNIX/AIX,
Control-M, MKS Integrity, ERWIN 4.1.
HSBC Software Development Pvt. Ltd, ETL Developer, Apr 2010 – Jun 2011
Project: Global Analytics & Integration Project – integration of data from multiple sources into a single
warehouse.
 Involved in Integration of several modules in the Global Analytics across different Regions
 Build/Amend, Test and Deploy Datastage Jobs and DB2 SQL scripts
 Build jobs to read source files using CFF stage/Hash file/Dataset/Sequential files
 Carry out Server to Parallel code changes, test and deploy it to Production.
 Build UNIX/AIX shell script for archival and retrieval of the source files
 Create Control-M 6.3 batch scheduling drafts and migrate them to the production environment
 Coordinate with QA test team and fix the pending Defects.
 Undertake Productivity and Quality assurance initiatives across the team
Environment: UNIX, DB2, Oracle 10g/9i, Control-M 6.3, Datastage 7.5
HSBC Software Development Pvt. Ltd, ETL Developer, Jun 2008 – Mar 2010
Project: Teradata Migration – migration of an existing UK data warehouse platform from the DB2
database to the Teradata database.
 Run Datastage server/parallel/sequence jobs on both DB2 and Teradata environments in parallel
 Build new BTEQ scripts, FastLoad/FastExport scripts and TPT scripts for loading and extracting data.
 Utilize the Control-M scheduling tool to run Datastage jobs on top of both DB2 and Teradata
 Utilize the GoldenGate tool to identify any mismatch during data reconciliations
 Capture test results and perform the reconciliation between the data
 Raise any defects identified during the test run and reconciliations
 Perform root cause analysis and apply fixes for the defects
 Extensively use Quality Center to log and track defects till closure
 Attend Daily status calls and update the run progress
Environment: AIX 6.3, Datastage 8.0, Teradata, DB2, Control-M, Golden Gate.
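The DB2-vs-Teradata reconciliation above often boils down to comparing per-table row counts exported from each platform. A sketch, assuming a hypothetical "table count" per-line file format with entries sorted by table name (required by `join`):

```shell
#!/bin/sh
# Hypothetical reconciliation sketch: given two files of "table rowcount"
# lines (one per platform, sorted by table name), report count mismatches.
reconcile_counts() {
    db2_file="$1"   # counts exported from DB2
    td_file="$2"    # counts exported from Teradata
    # join pairs each table's two counts; awk flags any that differ
    join "$db2_file" "$td_file" | awk '$2 != $3 { print "MISMATCH", $1, $2, $3 }'
}
```

Empty output means the counts match; any `MISMATCH` line would then be investigated at the row level (e.g. via a tool such as GoldenGate, as in the project above).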
Achievements:
 Sustaining Edge Award, Excellence at Workplace & Remarkable Contribution.
 Above & Beyond Award - Best Quality Deliverables.
 Long Service Award: 4 years of Excellent Service to the organization.
 Reuse & Innovation Award - Most Productive Initiative.
 Pat on the Back Award: Awarded for hard work and dedication.
PROFESSIONAL SUMMARY:
Software Engineer, HSBC Software Development Pvt. Ltd. June 2008 - September 2013.
Senior Software Engineer, Optum Services Inc. (UnitedHealth Group), September 2013 - Present.
References: Available upon request
