KRISHNA KISHORE PUSAPATI
kishore1.pusapati@gmail.com
SUMMARY
• 9+ years of experience building data warehouses as a Senior DataStage Designer, Senior DataStage Developer, DataStage Administrator, and QualityStage Developer in consultant roles.
• Strong knowledge of data warehouse architecture, including Star Schema, Snowflake Schema, Fact and Dimension tables, and physical and logical data modeling.
• Extensive knowledge and wide-ranging experience in data profiling, data cleansing, and designing and developing DataStage jobs in DataStage versions 11.3, 9.1, 8.7, 8.5, and 8.1 (IBM InfoSphere Information Server suite).
• Actively participated in migrating DataStage jobs between different versions.
• Good experience with DataStage Administrator, DataStage Director, and DataStage Designer.
• Involved in DataStage version migrations such as 8.5 to 9.1 and 8.1 to 9.1.
• Actively participated in user/client meetings to understand requirements and align them with the ETL design on both OLAP and OLTP systems.
• Involved in and assisted the team through different project phases: design, development, testing, preproduction, and production.
• Good experience writing SQL*Loader scripts and UNIX shell scripts to automate file manipulation and data loading procedures.
• Experience gathering business requirements, preparing software requirement specification (SRS) documents, and actively participating in ETL model design with the architect.
• Implemented procedures for maintenance, monitoring, backup, and recovery operations for the ETL environment.
• Reported (daily, weekly, and monthly) on the health and performance of the ETL environment and jobs.
• Infrastructure optimization, troubleshooting, and debugging.
• Actively participated in DataStage admin activities.
• Maintained ownership of release activities that interact with ETL projects.
• Work experience in production support for critical applications.
• Processed large volumes of data (more than 10 million records) on a daily frequency, from flat files as well as database tables.
• Worked on performance tuning with DataStage and SQL, and on Oracle and DB2 bulk loads and simple inserts.
• Experience performance-tuning DataStage jobs for both historical migrations and daily loads.
TECHNICAL SKILLS
ETL Tools: DataStage 11.3/9.1/8.7/8.5/8.1, QualityStage, Information Services Director, Standardization Rules Designer, Information Analyzer
Databases: Oracle 11g, Oracle 10g, Oracle 9i, DB2, Teradata, Netezza
Operating Systems: AIX, Linux, UNIX, Windows 2003 Server, XP
Languages: SQL, PL/SQL, C++, Perl scripting
Scheduling Tools: UNIX cron, batchman scheduler, DataStage Scheduler, AutoSys, Control-M
Certifications: Certified IBM DataStage Professional; Certified IBM QualityStage Professional
PROFESSIONAL EXPERIENCE
MACYS SYSTEMS and TECHNOLOGY, Johns Creek, GA Oct 2015 – Present
DataStage Lead Consultant
The business owner requires the ability to manage inbound and outbound campaigns (offers/messages), including those from
external lists, and operational reporting for Email, SMS, Mobile Push and Site media types for Macy’s/Bloomingdales
(M/BL) using the SAS MA/MO tools. The key goal of the RTO effort is to replace separate and disparate systems by channel
with a single enterprise omnichannel Campaign, Optimization, and Real-Time Offer/Message solution to enable a new
platform for delivering targeted, relevant, and personalized marketing campaigns and offers/messages to customers. Support
a shift in Marketing from the traditional event/calendar-driven paradigm towards a customer-centric, relationship-driven
approach to communicate with our customers in a contextually relevant dialogue across omnichannel marketing vehicles. To
achieve the above goals Macy’s/Bloomingdales (M/BL) will utilize the SAS Campaign Management/Campaign Optimization
(MA/MO) product, which will integrate with existing execution/delivery systems and a newly created data mart. DataStage is the
ETL tool that provided a good solution for creating the required data mart.
• Working on the environment set up for the current project.
• Gathering business requirements, preparing software requirement specification (SRS) documents, and actively participating in ETL model design with the architect.
• Designing the mapping documents for the ETL jobs under development.
• Active participation in dimensional modeling for the data marts and building the flattened structure for SAS.
• Working on IBM DataStage job design templates.
• Developing ETL jobs using DataStage and conducting unit testing for the jobs.
• Involved in administrative tasks for setting up the DataStage environment.
• Creating tables, indexes, and referential integrity constraints for the data marts.
• Involved in maintaining data quality and profiling client data.
• Managing offshore delivery teams directly (assigning tasks, tracking status, running regular calls, etc.).
• Preparing unit test cases and maintaining the quality of the ETL code.
• Working with Oracle tuning, modeling and database design.
• Responsible for the code deployment process between the environments.
• Working on the performance issues with the initial and incremental loads and code bug fixing.
• Working on the reject processing and audit control jobs.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and working with multiple APT_CONFIG_FILE configurations.
• Environment: IBM InfoSphere QualityStage and DataStage 11.3.2, Flat Files, DB2, Oracle 11g, SAS, Subversion, Control-M.
NYC DOHMH, Long Island City, NY April 2015 – Sep 2015
DataStage and QualityStage Lead Consultant
NYC DOHMH is continuing a project to evaluate the NY/NY III Supportive Housing program by identifying and comparing
individuals in supportive housing to individuals who are not, using health and administrative data sets. The Division of
Informatics, Information Technology, and Telecommunications (DIITT) is to develop a probabilistic matching algorithm
utilizing IBM Infosphere Quality Stage/Data Stage v9.1 to de-duplicate persons in supportive housing against multiple
disparate health data sets. These data need to be included in a SQL relational data warehouse, based on the results of the match, to link person-level health data across different sources. DIITT is a multidisciplinary unit with the goal of combining cutting-edge informatics, application development, analysis, and reporting to implement innovative IT solutions for public health.
• Responsible for standardizing different data sources in one common SQL table structure for de-duplication.
• Extensively worked with IBM InfoSphere QualityStage v9.1 investigation, standardization (including CASS address verification), and Unduplicate and Reference file matching.
• Worked to set up a one-time person match using historic data; worked with Informatics, Technology, and Epidemiology staff to provide data sets to evaluate the algorithm and set thresholds.
• Implemented stored procedures to support recurring matching going forward, including updating the SQL data warehouse.
• Experience with real-time stages such as XML and JSON (JavaScript Object Notation) parsing to handle web-based data.
• Worked with contract libraries containing the schema that describes the JSON document for company profiles.
• Technical documentation of implementation.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Implemented reusable components (Containers) for the error handling and error mailing.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies
• Experience with complex SQL scripts, stored procedures, creation of indexes, and complex joins in SQL Server 2012.
• Ability to assist with operations and production coverage
• Experience installing and upgrading the DataStage server product suite on different tiers.
• Experience with post-install configuration and configuring various data sources for the Enterprise plug-ins.
• Experience stopping and starting the server and monitoring the logs.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and working with multiple APT_CONFIG_FILE configurations.
• Outstanding communication and customer support skills and experience
• Environment: IBM InfoSphere QualityStage and DataStage 9.1, Flat Files, SQL Server 2012.
SEARS, Livonia, MI February 2013- March 2015
Data Stage and Quality Stage Lead Consultant/Infosphere Data Stage Admin
The objective of the project is to process data from source files and databases by performing ETL, developing job streams for the interfaces that move data from source to target. These job streams provide automated reporting capability for the client's businesses; these needs are met by extraction, transformation, and loading using DataStage.
• Worked with the Business process analyst to thoroughly understand the different business processes and requirements.
• Leading the role of onsite coordinator and handling the team.
• Implemented administrative tasks using DataStage Administrator.
• Involved in migrating DataStage jobs from the older version (8.1) to the newer version (9.1).
• Creating job backups and generating configuration files.
• Creating users and groups and assigning roles and responsibilities to the users.
• Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
• Involved in the implementation of clustering architecture (Grid technology).
• Developed the OLTP data models for the real time retail data.
• Developed parallel jobs between multiple source systems and sequencers.
• Extensively worked on Information Analyzer to perform data profiling and to identify and analyze column-level metadata (Column Analysis); developed detailed data profiling reports.
• Publishing XML document from tabular data
• Parsing XML document into tabular data
• Accessing a Web service with input and output data
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and working with multiple APT_CONFIG_FILE configurations.
• Experience with real-time stages such as XML and JSON (JavaScript Object Notation) parsing to handle web-based data.
• Worked with contract libraries containing the schema that describes the JSON document for company profiles.
• Used the JSON stage to get the data from the .XSD format.
• Implemented reusable components (Containers) for the error handling and error mailing.
• Worked on Information Analyzer for column analysis, primary key analysis and foreign key analysis and developed
detailed data quality assessment (DQA) reports.
• Ability to assist with operations and production coverage.
• Used DataStage and QualityStage Designer to develop customized rule sets for standardizing fields such as individual names, organization names, addresses, and phone numbers.
• Extensively used Data Stage and Quality Stage Designer to design and develop jobs for extracting, cleansing,
transforming, integrating, and loading data using various stages like Standardize, Match Frequency, Reference Match,
Unduplicate Match, Survive, Remove Duplicate, Surrogate Key, Aggregator, Funnel, Join, Merge, Lookup, Change
Capture, Change Apply and Copy.
• Worked with complex SQL scripts, stored procedures, creation of indexes, and complex joins in DB2.
• Worked with Data Stage Director to monitor and analyze performance of individual stages and run Data Stage parallel
jobs and sequencers.
• Extensively worked on migrating Data Stage jobs from development to test and to production environments.
• Performed Unit Testing, Regression Testing, and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed parallel jobs and sequencers by using various performance tuning
strategies.
• Environment: IBM InfoSphere DataStage 8.1/9.1, Information Analyzer, IBM AIX, Flat Files, Netezza NPS 7.0, DB2 9.7.
WALGREENS, Detroit, MI March 2011 – January 2013
Senior ETL Lead Data Stage Developer/Admin
The scope of the project is to develop a Data Warehouse built upon integrated legacy-system data feeds received from many business divisions. Data marts were built according to business-critical user requirements.
• Involved in different phases of building the Data Marts like analyzing business requirements, ETL process design,
performance enhancement, go-live activities and maintenance.
• Interacted with end users, business analysts, and architects to collect and understand the business requirements; documented them and translated requirements into system solutions.
• Worked as a lead data stage developer.
• Experienced with creating users, groups, and different tasks using DataStage Administrator.
• Added environment variables to the project and involved in migrating DataStage from version 8.5 to 9.1.
• Maintaining the Version control of the jobs.
• Implemented reusable components (Containers) for the error handling and error mailing.
• Designed ETL process as per the requirements and documented ETL process using MS Visio.
• Extensively worked on Data Stage Designer, Director and Administrator.
• Extensively used Data Stage Designer to design, develop ETL jobs for extracting data from Oracle tables and Flat files
and load them into respective DataMarts.
• Importing and Exporting of jobs through DataStage Designer into different projects.
• Extensively worked with Fact and Dimension tables to produce source - target mappings based upon business specs.
• Extensively worked on Information Analyzer to perform data profiling and to identify and analyze column-level metadata (Column Analysis); developed detailed data profiling reports.
• Worked on UNIX Shell Scripts to automate the file transfer process (FTP) and Scheduled jobs using AutoSys.
• Worked with DataStage Director to monitor and analyze performance of individual stages and run DataStage jobs.
• Experience in complex SQL scripts, stored procedures, creation of indexes, and complex joins in Oracle.
• Implemented enhancements to the current ETL programs based on the new requirements.
• Involved in setting up of war room sessions for the closure of issues.
• Involved in the design of BIS architecture, Code reviews of DataStage jobs and implemented best practices in DataStage
jobs.
• Involved in upgrading the tool and migrating jobs from lower to higher versions.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Extensively worked on coordination with Offshore.
• Environment: IBM InfoSphere DataStage 8.5/9.1, Oracle (10g/11g), SunOS 5.1, HP Quality Center, Cognos 10, AutoSys, MS Visio, PL/SQL, and Flat Files.
THAMES WATER, Reading, UK June 2010 – February 2011
DataStage and QualityStage Consultant
The scope of the WAMI (Thames Water) project was to give the client a single view of customer data. Customer data was initially scattered across different systems, so the client decided to create a single-customer solution using the SAP CRM system.
• Involved in different phases of Data Integration like gathering and analyzing business requirements, ETL process design,
performance enhancement, go-live activities and support.
• Interacted with the users and business analysts to collect and understand the business requirements.
• Designed ETL process as per the requirements and documented ETL process using MS Visio.
• Extensively worked on DataStage Enterprise Edition (Formerly Parallel Extender) Designer, Director and Administrator.
• Extensively used DataStage Designer to design, develop ETL jobs for extracting data from different source system,
transforming the legacy data to SAP ready data and loading the data into SAP system.
• Extensively worked on administration tasks like creating projects for the respective releases and creating global
parameters which can be commonly used across the environments.
• Designed and developed DataStage jobs, containers to implement business rules.
• Used before/after-job routines to perform specific tasks.
• Involved in master data such as Vendor Master and Customer Master, and transaction data such as AR invoices, IM- and WM-managed inventory, and packing conversion.
• Scheduled jobs using DataStage Director and Unix Shell Scripts.
• Worked on UNIX shell scripts to automate the file transfer process (FTP).
• Worked with DataStage Director to monitor and analyze performance of individual stages and run DataStage jobs.
• Provided 24/7 support during the various test and mock phases and production support during the Go-Live.
• Extensively worked on migrating DataStage codes from development to quality and to production environments.
• Actively participated in discussions and implementation of Error Handling and Reject management functions.
• Experience in monitoring the InfoSphere suite server for I/O usage, memory usage, CPU usage, network usage, storage
usage.
• Involved in setting up of war room sessions for the closure of issues.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Worked on improving the performance of the designed parallel jobs and sequencers by using various performance
tuning strategies.
• Extensive Offshore coordination.
• Environment: IBM InfoSphere DataStage and QualityStage 8.7 Parallel Extender, HP Quality Center, MS SQL Server 2000/2005, Sybase, SAP BW 7.0, IBM AIX 5.x, SAP R/3, AutoSys, MS Visio, PL/SQL, DOS, Flat Files, Oracle 10g, DB2 9.7.
Hutchison 3G, Maidenhead, UK May 2009 – March 2010
DataStage and QualityStage Consultant
Hutchison 3G UK Limited's business and main IT focus is now on BI: analyzing data to predict future strategies for driving business behavior, developing campaigns, and targeting clients to best effect with reliable, up-to-date information that helps the company differentiate itself in the marketplace from its competition.
• Worked with the Business analyst to thoroughly understand the different business processes and requirements.
• Documented user requirements, translated requirements into system solutions and develop implementation plan and
schedule.
• Developed parallel jobs between multiple source systems and sequencers.
• Extensively worked on Investigate Stage to perform data profiling and identify and analyze the data patterns and
inconsistencies. Developed a detailed data profiling report.
• Worked on Information Analyzer for column analysis, primary key analysis and foreign key analysis.
• Extensively used DataStage and QualityStage Designer to design and develop jobs for extracting, cleansing,
transforming, integrating, and loading data using various stages like Standardize, Match Frequency, Unduplicate Match,
Survive, Remove Duplicate, Surrogate Key, Aggregator, Funnel, Join, Change Capture, Change Apply and Copy.
• Worked with DataStage Director to schedule, monitor and analyze performance of individual stages and run DataStage
jobs.
• Extensively worked on administration tasks like creating the repository, user groups, and users; managing users by setting up their profiles and privileges; and creating global parameters that can be commonly used across the environments.
• Designed and developed DataStage jobs to populate the MDM and send notifications as XML messages using the web service transformer.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Extensively worked on migrating DataStage jobs from development to test and to production environments.
• Involved in setting up of war room sessions for the closure of issues.
• Designed mechanism for identifying bad records during the loading process.
• Performed Unit Testing, Regression Testing, and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Environment: IBM WebSphere DataStage and QualityStage 8.1 Parallel Extender, Information Analyzer, IBM MDM,
IBM AIX 6.1, Oracle 10g, PL/SQL, Flat Files.
ALLSTATE Insurance, Fort Wayne, IN January 2009 – April 2009
DataStage Consultant
The main objective of the project was to develop an Enterprise Data Warehouse for reporting and analysis purposes. Matrix extracts data from different source systems, applies business rules, loads the data into the warehouse and its data marts, and sends data to the representatives in the field.
• Involved in all the phases of building the DataMarts like analyzing the business requirements, ETL process design,
performance enhancement and maintenance.
• Extensively worked on DataStage Enterprise Edition (Formerly Parallel Extender) Administrator, Designer, Director,
and Manager.
• Extensively used DataStage Designer to design, develop ETL jobs for extracting, transforming and loading the data into
different DataMarts.
• Extensively worked on administration tasks like creating user profiles, assigning user privileges, creating projects for the respective releases, and creating global parameters that can be commonly used across the environments.
• Involved in data modeling and creation of Star Schema and Snowflake dimensional data marts using the ERwin tool.
• Extensively worked in Teradata RDBMS V2R6.1.
• Strong working experience with various Teradata client utilities like MultiLoad, FastLoad, and FastExport, and with Teradata administrator activities.
• Designed a mechanism to send alerts to PSO and the respective developers as soon as jobs failed.
• Provided production support during the various phases.
• Extensively worked on migrating DataStage jobs from development to test and to production environments.
• Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the
transformations while loading the data.
• Experience in complex SQL scripts, stored procedures, creation of indexes, and complex joins.
• Extensively worked on requirement analysis & impact analysis.
• Extensively worked on designing the solution framework and data analysis.
• Extensively coordinated with Offshore.
• Performed Unit Testing, Integration Testing, Regression Testing and User Acceptance Testing (UAT) for every code
change and enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Environment: IBM WebSphere DataStage 8.1 Parallel Extender, Cognos, AutoSys, Windows NT 4.0, IBM AIX 4.1, Red
Hat Enterprise Linux 5, Oracle 10g, PL/SQL, DOS and Sequential Files.
Ryvo Electronics, Atlanta, GA April 2007 – December 2008
PL/SQL and DataStage Developer
The objective of the project is to develop a system that gives the business owner intelligent information reports on the existing business situation and supports business analysis. Involved in all phases of building the data marts: analyzing the business requirements, ETL process design, performance enhancement, and maintenance.
• Extensively worked on Parallel Extender (PX) and QualityStage.
• Extensively used DataStage Designer to design, develop ETL jobs for extracting, transforming and loading the data into
different DataMarts.
• Performed Import and Export of DataStage components and table definitions using DataStage Manager.
• Extensively worked on administration tasks like creating user profiles, assigning user privileges, creating projects for the respective releases, and creating global parameters that can be commonly used across the environments.
• Developed Stored Procedures, Functions, database Triggers and created Packages to access databases from front end
screens.
• Developed both inbound and outbound interfaces to load data into targeted database and extract data from database to
flat files.
• Extensively used Data Stage Designer to develop a variety of jobs for extracting, cleansing, transforming, integrating and
loading data into target database.
• Extracted data from various source systems like Siebel, Oracle, Flat Files and SAP R/3.
• Extensively used SAP integration using SAP connection.
• Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the
transformations while loading the data.
• Extensively worked on requirement analysis & impact analysis.
• Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Environment: IBM WebSphere DataStage 8.1, Windows NT 4.0, UNIX, Linux, Oracle 8i, PL/SQL, IBM DB2, dBase III files, DOS and UNIX sequential files, MS Access, .csv files, XML files.
MetLife Insurance Company, Bangalore, INDIA January 2007 – March 2007
Software Developer
The primary objective of the project is to develop a single process for daily activities, used to calculate premiums and analyze what types of policies people prefer.
• Automated SQL Loader to load the data from the flat files
• Used Materialized Views, Packages and Dynamic SQL
• Created programs to transform data from Legacy systems into Oracle database.
• Worked on data marts using the ERwin tool.
• Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the
transformations while loading the data.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Implemented enhancements to the current programs based on the new requirements.
• Experience in complex SQL scripts, stored procedures, creation of indexes, and complex joins.
• Performed Unit Testing, Integration Testing and User Acceptance Testing (UAT) for every code change and
enhancement.
• Worked on improving the performance of the designed jobs by using various performance tuning strategies.
• Worked closely with Business Users and Data Quality Analysts after loading data for accuracy and consistency of data.
• Environment: MS Word, Mercury WinRunner, Mercury LoadRunner, Windows NT 4.0, AutoSys, UNIX, Knoppix Linux, Oracle 9i/8i, PL/SQL, dBase III, DOS and UNIX sequential files, MS Access, ERwin, DB2/UDB/EE database.
CERTIFICATIONS
IBM InfoSphere DataStage
IBM InfoSphere QualityStage
Received multiple performance awards.

 
Amit Kumar_resume
Amit Kumar_resumeAmit Kumar_resume
Amit Kumar_resume
 
Resume_RaghavMahajan_ETL_Developer
Resume_RaghavMahajan_ETL_DeveloperResume_RaghavMahajan_ETL_Developer
Resume_RaghavMahajan_ETL_Developer
 
Resume sailaja
Resume sailajaResume sailaja
Resume sailaja
 

Krishna_IBM_Infosphere_Certified_Datastage_Consultant

KRISHNA KISHORE PUSAPATI
kishore1.pusapati@gmail.com

SUMMARY
• 9+ years of experience building data warehouses as a Senior DataStage Designer, Senior DataStage Developer, DataStage Administrator, and QualityStage Developer in consultant roles.
• Strong knowledge of data warehouse architecture, including star schema, snowflake schema, fact and dimension tables, and physical and logical data modeling.
• Extensive experience in data profiling, data cleansing, and designing and developing DataStage jobs in DataStage versions 11.3, 9.1, 8.7, 8.5, and 8.1 (IBM InfoSphere Information Server suite).
• Actively participated in migration of DataStage jobs between versions.
• Good experience with DataStage Administrator, DataStage Director, and DataStage Designer.
• Involved in DataStage version migrations, such as 8.5 to 9.1 and 8.1 to 9.1.
• Actively participated in user/client meetings to understand requirements and make them compatible with the ETL design on both OLAP and OLTP systems.
• Involved in and assisted the team during all phases of the project: design, development, testing, preproduction, and production.
• Good experience writing SQL*Loader scripts and UNIX shell scripts to automate file manipulation and data loading procedures.
• Experience gathering business requirements, preparing software requirement specification (SRS) documents, and actively participating in the ETL model design with the architect.
• Implemented procedures for maintenance, monitoring, backup, and recovery operations for the ETL environment.
• Reported (daily, weekly, and monthly) on the health and performance of the ETL environment and jobs.
• Infrastructure optimization, troubleshooting, and debugging.
• Actively participated in DataStage administration activities.
• Maintained ownership of release activities that interact with ETL projects.
• Work experience in production support for critical applications.
• Processed large volumes of data (more than 10 million records) on a daily frequency, from flat files as well as database tables.
• Worked on performance tuning with DataStage, SQL, and Oracle and DB2 bulk loads and simple inserts.
• Experience in performance tuning of DataStage jobs for both historical migration and daily loads.

TECHNICAL SKILLS
ETL Tools: DataStage 11.3/9.1/8.7/8.5/8.1, QualityStage, Information Services Director, Standardization Rules Designer, Information Analyzer
Databases: Oracle 11g/10g/9i, DB2, Teradata, Netezza
Operating Systems: AIX, Linux, UNIX, Windows 2003 Server, XP
Languages: SQL, PL/SQL, C++, Perl scripting
Scheduling Tools: UNIX cron, Batchman scheduler, DataStage scheduler, AutoSys, Control-M
Certifications: Certified IBM DataStage Professional; Certified IBM QualityStage Professional

PROFESSIONAL EXPERIENCE

MACYS SYSTEMS and TECHNOLOGY, Johns Creek, GA    Oct 2015 - Present
DataStage Lead Consultant
The business owner requires the ability to manage inbound and outbound campaigns (offers/messages), including those from external lists, and operational reporting for the Email, SMS, Mobile Push, and Site media types for Macy's/Bloomingdale's (M/BL) using the SAS MA/MO tools. The key goal of the RTO effort is to replace separate, disparate per-channel systems with a single enterprise omnichannel campaign, optimization, and real-time offer/message solution, enabling a new platform for delivering targeted, relevant, and personalized marketing campaigns and offers/messages to customers. It also supports a shift in Marketing from the traditional event/calendar-driven paradigm toward a customer-centric, relationship-driven approach that communicates with customers in a contextually relevant dialogue across omnichannel marketing vehicles. To achieve these goals, M/BL will use the SAS Campaign Management/Campaign Optimization (MA/MO) product, integrated with existing execution/delivery systems and a newly created data mart. DataStage is the ETL tool that provided a good solution for creating the required data mart.
• Working on the environment setup for the current project.
• Gathering business requirements, preparing software requirement specification (SRS) documents, and actively participating in the ETL model design with the architect.
• Designing the mapping documents for ETL job development.
• Active participation in dimensional modeling for the data marts and building the flattened structure for SAS.
• Working on IBM DataStage job design templates.
• Developing ETL jobs using DataStage and conducting unit testing of the jobs.
• Involved in administrative tasks for the setup of the DataStage environment.
• Creating tables, indexes, and referential integrity constraints for the data marts.
• Involved in maintaining data quality and profiling of client data.
• Managing offshore delivery teams directly (assigning tasks, collecting status, running regular calls, etc.).
• Preparing unit test cases and maintaining the quality of the ETL code.
• Working on Oracle tuning, modeling, and database design.
• Responsible for the code deployment process between environments.
• Working on performance issues with the initial and incremental loads and on code bug fixing.
• Working on reject processing and audit control jobs.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and working with multiple APT_CONFIG_FILE configurations.
• Environment: IBM InfoSphere QualityStage and DataStage 11.3.2, flat files, DB2, Oracle 11g, SAS, Subversion, Control-M.

NYC DOHMH, Long Island City, NY    April 2015 - Sep 2015
DataStage and QualityStage Lead Consultant

NYC DOHMH is continuing a project to evaluate the NY/NY III supportive housing program by identifying and comparing individuals in supportive housing against individuals who are not, using health and administrative data sets. The Division of Informatics, Information Technology, and Telecommunications (DIITT) is developing a probabilistic matching algorithm using IBM InfoSphere QualityStage/DataStage v9.1 to de-duplicate persons in supportive housing against multiple disparate health data sets. These data need to be included in a SQL relational data warehouse, based on the results of the match, to link person health data across different sources. DIITT is a multidisciplinary unit whose goal is to combine cutting-edge informatics, application development, analysis, and reporting to implement innovative IT solutions for public health.
• Responsible for standardizing different data sources into one common SQL table structure for de-duplication.
• Extensively worked with IBM InfoSphere QualityStage v9.1 investigation, standardization (including CASS address verification), and Unduplicate and Reference file matching.
• Worked to set up a one-time person match using historic data; worked with Informatics, Technology, and Epidemiology staff to provide data sets for evaluating the algorithm and setting thresholds.
• Implemented stored procedures for recurring matching going forward, including updating the SQL data warehouse.
• Experience with real-time stages such as XML and JSON (JavaScript Object Notation) parsing to handle web-based data.
• Worked with contract libraries containing the schema that describes the JSON document for company profiles.
• Technical documentation of the implementation.
• Performed unit testing, integration testing, regression testing, and user acceptance testing (UAT) for every code change and enhancement.
• Implemented reusable components (containers) for error handling and error mailing.
• Worked on improving the performance of the designed jobs using various performance tuning strategies.
• Experience with complex SQL scripts, stored procedures, index creation, and complex joins in SQL Server 2012.
• Ability to assist with operations and production coverage.
• Experience installing and upgrading the DataStage server product suite on different tiers.
• Experience with post-install configuration and configuring various data sources for the enterprise plug-ins.
• Experience stopping and starting the server and monitoring the logs.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and working with multiple APT_CONFIG_FILE configurations.
• Outstanding communication and customer support skills and experience.
• Environment: IBM InfoSphere QualityStage and DataStage 9.1, flat files, SQL Server 2012.

SEARS, Livonia, MI    February 2013 - March 2015
DataStage and QualityStage Lead Consultant / InfoSphere DataStage Admin

The objective of the project is to process data from source files and databases, performing ETL to develop job streams for the interfaces that move data from source to target. These feeds provide automated reporting capability for client businesses, accomplished through extraction, transformation, and loading using DataStage.
• Worked with the business process analyst to thoroughly understand the different business processes and requirements.
• Led the role of onsite coordinator and handled the team.
• Implemented administrative tasks using DataStage Administrator.
• Involved in migration of DataStage jobs from the older version (8.1) to the new version (9.1).
• Created backups of the jobs and generated configuration files.
• Created users and groups and assigned roles and responsibilities to the users.
• Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
• Involved in the implementation of a clustering architecture (grid technology).
• Developed the OLTP data models for the real-time retail data.
• Developed parallel jobs between multiple source systems and sequencers.
• Extensively worked on Information Analyzer to perform data profiling and to identify and analyze the metadata of the data (column analysis); developed detailed data profiling reports.
• Published XML documents from tabular data.
• Parsed XML documents into tabular data.
• Accessed a web service with input and output data.
• Extensive experience creating/deleting projects, creating/modifying environment parameters, and working with multiple APT_CONFIG_FILE configurations.
• Experience with real-time stages such as XML and JSON (JavaScript Object Notation) parsing to handle web-based data.
• Worked with contract libraries containing the schema that describes the JSON document for company profiles.
• Used the JSON stage to get data from the .XSD format.
• Implemented reusable components (containers) for error handling and error mailing.
• Worked on Information Analyzer for column analysis, primary key analysis, and foreign key analysis, and developed detailed data quality assessment (DQA) reports.
• Ability to assist with operations and production coverage.
• Used DataStage and QualityStage Designer to develop customized rule sets for standardizing fields such as individual names, organization names, addresses, and phone numbers.
• Extensively used DataStage and QualityStage Designer to design and develop jobs for extracting, cleansing, transforming, integrating, and loading data using stages such as Standardize, Match Frequency, Reference Match, Unduplicate Match, Survive, Remove Duplicates, Surrogate Key, Aggregator, Funnel, Join, Merge, Lookup, Change Capture, Change Apply, and Copy.
• Worked with complex SQL scripts, stored procedures, index creation, and complex joins in DB2.
• Worked with DataStage Director to monitor and analyze the performance of individual stages and to run DataStage parallel jobs and sequencers.
• Extensively worked on migrating DataStage jobs from development to test and production environments.
• Performed unit testing, regression testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed parallel jobs and sequencers using various performance tuning strategies.
• Environment: IBM InfoSphere DataStage 8.1/9.1, Information Analyzer, IBM AIX, flat files, Netezza NPS 7.0, DB2 9.7.

WALGREENS, Detroit, MI    March 2011 - January 2013
Senior ETL Lead DataStage Developer/Admin

The scope of the project is to develop a data warehouse built upon integrated legacy system data feeds received from many divisions of business data sources. Data marts were built according to business-critical user requirements.
• Involved in different phases of building the data marts: analyzing business requirements, ETL process design, performance enhancement, go-live activities, and maintenance.
• Interacted with end users, business analysts, and architects to collect and understand the business requirements; documented them and translated them into system solutions.
• Worked as the lead DataStage developer.
• Experienced with creating users, groups, and different tasks using DataStage Administrator.
• Added environment variables to the project and was involved in the migration of DataStage from version 8.5 to 9.1.
• Maintained version control of the jobs.
• Implemented reusable components (containers) for error handling and error mailing.
• Designed the ETL process per the requirements and documented it using MS Visio.
• Extensively worked on DataStage Designer, Director, and Administrator.
• Extensively used DataStage Designer to design and develop ETL jobs for extracting data from Oracle tables and flat files and loading them into the respective data marts.
• Imported and exported jobs through DataStage Designer into different projects.
• Extensively worked with fact and dimension tables to produce source-to-target mappings based on business specs.
• Extensively worked on Information Analyzer to perform data profiling and to identify and analyze the metadata of the data (column analysis); developed detailed data profiling reports.
• Worked on UNIX shell scripts to automate the file transfer process (FTP) and scheduled jobs using AutoSys.
• Worked with DataStage Director to monitor and analyze the performance of individual stages and to run DataStage jobs.
• Experience with complex SQL scripts, stored procedures, index creation, and complex joins in Oracle.
• Implemented enhancements to the current ETL programs based on new requirements.
• Involved in setting up war room sessions for the closure of issues.
• Involved in the design of the BIS architecture and code reviews of DataStage jobs, and implemented best practices in DataStage jobs.
• Involved in upgrading the tool and migrating jobs from the lower version to the higher version.
• Performed unit testing, integration testing, regression testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed jobs using various performance tuning strategies.
• Extensively coordinated with offshore.
• Environment: IBM InfoSphere DataStage 8.5/9.1, Oracle 10g/11g, SunOS 5.1, HP Quality Center, Cognos 10, AutoSys, MS Visio, PL/SQL, and flat files.
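The bullets above mention UNIX shell scripts that automate file transfers and scheduled DataStage job launches. A minimal sketch of that kind of wrapper follows; all paths, the project name (DWH_PROJECT), and the job name (LoadDailySales) are invented for illustration and are not taken from the resume:

```shell
#!/bin/sh
# Hypothetical sketch: stage a daily extract file and compose the
# DataStage CLI call that a scheduler (e.g., AutoSys or cron) would run.

SRC_DIR=${SRC_DIR:-./landing}       # where the source system drops extracts
ARCHIVE_DIR=${ARCHIVE_DIR:-./archive}

stage_file() {
    # $1 = extract file name; succeed only if it exists and is non-empty,
    # then keep a copy in the archive directory for reruns and audits.
    [ -s "$SRC_DIR/$1" ] || return 1
    mkdir -p "$ARCHIVE_DIR" && cp "$SRC_DIR/$1" "$ARCHIVE_DIR/$1"
}

build_dsjob_cmd() {
    # Compose (rather than execute) the DataStage CLI call so this sketch
    # can be dry-run on a machine without a DataStage install.
    # Real usage would execute: dsjob -run -jobstatus <project> <job>
    echo "dsjob -run -jobstatus $1 $2"
}

# Dry-run demonstration with the invented project/job names:
build_dsjob_cmd DWH_PROJECT LoadDailySales
```

The `-jobstatus` option makes `dsjob` wait for job completion and reflect the job's status in its exit code, which is what lets a scheduler alert on failed loads.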
THAMES WATER, Reading, UK    June 2010 - February 2011
DataStage and QualityStage Consultant

The scope of the WAMI (Thames Water) project: the client wanted a single view of customer data. Initially, WAMI data was scattered across different systems, so they decided to create a single customer solution using the SAP CRM system.
• Involved in different phases of data integration: gathering and analyzing business requirements, ETL process design, performance enhancement, go-live activities, and support.
• Interacted with users and business analysts to collect and understand the business requirements.
• Designed the ETL process per the requirements and documented it using MS Visio.
• Extensively worked on DataStage Enterprise Edition (formerly Parallel Extender) Designer, Director, and Administrator.
• Extensively used DataStage Designer to design and develop ETL jobs for extracting data from different source systems, transforming the legacy data to SAP-ready data, and loading the data into the SAP system.
• Extensively worked on administration tasks such as creating projects for the respective releases and creating global parameters that can be commonly used across environments.
• Designed and developed DataStage jobs and containers to implement business rules.
• Used before/after job routines to perform specific tasks.
• Involved with master data such as vendor master and customer master, and transaction data such as AR invoices, IM- and WM-managed inventory, and packing conversion.
• Scheduled jobs using DataStage Director and UNIX shell scripts.
• Worked on UNIX shell scripts to automate the file transfer process (FTP).
• Worked with DataStage Director to monitor and analyze the performance of individual stages and to run DataStage jobs.
• Provided 24/7 support during the various test and mock phases and production support during go-live.
• Extensively worked on migrating DataStage code from development to quality and production environments.
• Actively participated in discussions and implementation of error handling and reject management functions.
• Experience monitoring the InfoSphere suite server for I/O, memory, CPU, network, and storage usage.
• Involved in setting up war room sessions for the closure of issues.
• Performed unit testing, integration testing, regression testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed parallel jobs and sequencers using various performance tuning strategies.
• Extensive offshore coordination.
• Environment: IBM InfoSphere DataStage and QualityStage 8.7 Parallel Extender, HP Quality Center, MS SQL Server 2000/2005, Sybase, SAP BW 7.0, IBM AIX 5.x, SAP R/3, AutoSys, MS Visio, PL/SQL, DOS, flat files, Oracle 10g, DB2 9.7.

Hutchison 3G, Maidenhead, UK    May 2009 - March 2010
DataStage and QualityStage Consultant

The Hutchison 3G UK Limited business and main IT focus is now on BI: analyzing data to predict future strategies for driving business behavior, developing campaigns, and targeting clients to best effect with reliable, up-to-date information that can help differentiate them in the marketplace and from their competition.
• Worked with the business analyst to thoroughly understand the different business processes and requirements.
• Documented user requirements, translated requirements into system solutions, and developed the implementation plan and schedule.
• Developed parallel jobs between multiple source systems and sequencers.
• Extensively worked on the Investigate stage to perform data profiling and to identify and analyze data patterns and inconsistencies; developed a detailed data profiling report.
• Worked on Information Analyzer for column analysis, primary key analysis, and foreign key analysis.
• Extensively used DataStage and QualityStage Designer to design and develop jobs for extracting, cleansing, transforming, integrating, and loading data using stages such as Standardize, Match Frequency, Unduplicate Match, Survive, Remove Duplicates, Surrogate Key, Aggregator, Funnel, Join, Change Capture, Change Apply, and Copy.
• Worked with DataStage Director to schedule, monitor, and analyze the performance of individual stages and to run DataStage jobs.
• Extensively worked on administration tasks such as creating the repository, user groups, and users; managing users by setting up their profiles and privileges; and creating global parameters that can be commonly used across environments.
• Designed and developed DataStage jobs to populate the MDM and send notifications in XML messages using the web service transformer.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Extensively worked on migrating DataStage jobs from development to test and production environments.
• Involved in setting up war room sessions for the closure of issues.
• Designed a mechanism for identifying bad records during the loading process.
• Performed unit testing, regression testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed jobs using various performance tuning strategies.
• Environment: IBM WebSphere DataStage and QualityStage 8.1 Parallel Extender, Information Analyzer, IBM MDM, IBM AIX 6.1, Oracle 10g, PL/SQL, flat files.

ALLSTATE Insurance, Fort Wayne, IN    January 2009 - April 2009
DataStage Consultant

The main objective of the project was to develop an enterprise data warehouse for reporting and analysis purposes. Matrix extracts data from different source systems, applies business rules, loads the data into the warehouse and different data marts, and sends data to the representatives in the field.
• Involved in all phases of building the data marts: analyzing the business requirements, ETL process design, performance enhancement, and maintenance.
• Extensively worked on DataStage Enterprise Edition (formerly Parallel Extender) Administrator, Designer, Director, and Manager.
• Extensively used DataStage Designer to design and develop ETL jobs for extracting, transforming, and loading the data into different data marts.
• Extensively worked on administration tasks such as creating user profiles, assigning user privileges, creating projects for the respective releases, and creating global parameters that can be commonly used across environments.
• Involved in data modeling and creation of star schema and snowflake dimensional data marts using the ERwin tool.
• Extensively worked with Teradata RDBMS V2R6.1.
• Strong working experience with various Teradata client utilities such as MultiLoad, FastLoad, and FastExport, and with Teradata administrator activities.
• Designed a mechanism to send alerts to PSO and the respective developers as soon as jobs failed.
• Provided production support during the various phases.
• Extensively worked on migrating DataStage jobs from development to test and production environments.
• Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
• Experience with complex SQL scripts, stored procedures, index creation, and complex joins.
• Extensively worked on requirement analysis and impact analysis.
• Extensively worked on designing the solution framework and on data analysis.
• Extensively coordinated with offshore.
• Performed unit testing, integration testing, regression testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed jobs using various performance tuning strategies.
• Environment: IBM WebSphere DataStage 8.1 Parallel Extender, Cognos, AutoSys, Windows NT 4.0, IBM AIX 4.1, Red Hat Enterprise Linux 5, Oracle 10g, PL/SQL, DOS, and sequential files.

Ryvo Electronics, Atlanta, GA    April 2007 - December 2008
PL/SQL and DataStage Developer

The objective of the project is to develop a system that can give the client intelligent information reports on the existing business situation and how the business can be analyzed. Involved in all phases of building the data marts: analyzing the business requirements, ETL process design, performance enhancement, and maintenance.
• Extensively worked on Parallel Extender (PX) and QualityStage.
• Extensively used DataStage Designer to design and develop ETL jobs for extracting, transforming, and loading the data into different data marts.
• Performed import and export of DataStage components and table definitions using DataStage Manager.
• Extensively worked on administration tasks such as creating user profiles, assigning user privileges, creating projects for the respective releases, and creating global parameters that can be commonly used across environments.
• Developed stored procedures, functions, and database triggers, and created packages to access databases from front-end screens.
• Developed both inbound and outbound interfaces to load data into the target database and extract data from the database to flat files.
• Extensively used DataStage Designer to develop a variety of jobs for extracting, cleansing, transforming, integrating, and loading data into the target database.
• Extracted data from various source systems such as Siebel, Oracle, flat files, and SAP R/3.
• Extensively used SAP integration via the SAP connection.
• Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
• Extensively worked on requirement analysis and impact analysis.
• Performed unit testing, integration testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed jobs using various performance tuning strategies.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Environment: IBM WebSphere DataStage 8.1, Windows NT 4.0, UNIX, Linux, Oracle 8i, PL/SQL, IBM DB2, dBase III files, DOS and UNIX sequential files, MS Access, .csv files, XML files.

MetLife Insurance Company, Bangalore, INDIA    January 2007 - March 2007
Software Developer

The primary objective of the project is to develop a process for daily activities used to calculate premiums and analyze what types of policies people prefer.
• Automated SQL*Loader to load the data from flat files.
• Used materialized views, packages, and dynamic SQL.
• Created programs to transform data from legacy systems into the Oracle database.
• Worked on data marts using the ERwin tool.
• Created stored procedures to transform the data and worked extensively in PL/SQL for the various transformation needs while loading the data.
• Wrote various UNIX shell scripts for scheduling the jobs.
• Implemented enhancements to the current programs based on new requirements.
• Experience with complex SQL scripts, stored procedures, index creation, and complex joins.
• Performed unit testing, integration testing, and user acceptance testing (UAT) for every code change and enhancement.
• Worked on improving the performance of the designed jobs using various performance tuning strategies.
• Worked closely with business users and data quality analysts after loading data to verify its accuracy and consistency.
• Environment: MS Word, Mercury WinRunner, Mercury LoadRunner, Windows NT 4.0, AutoSys, UNIX, Knoppix Linux, Oracle 9i/8i, PL/SQL, dBase III, DOS and UNIX sequential files, MS Access, ERwin, DB2 UDB EE database.

CERTIFICATIONS
• IBM InfoSphere DataStage
• IBM InfoSphere QualityStage
• Received many performance awards