Sreeram Makam
Mobile: (919) 889-0004 Website - http://sites.google.com/site/makamsreeram/
Email: sreeram.makam@gmail.com
SUMMARY
➢ IBM Certified Solution Developer -- WebSphere IIS DataStage Enterprise Edition V7.5
➢ SAP Business Objects Certified Professional – Data Integrator – Level One
➢ More than 10 years of IT experience, including 9 years of solid experience in Data Warehouse analysis, design,
development, and implementation
➢ Using Agile methodology to improve time to market for solutions and deployments in projects
➢ Extensively worked in Data Warehousing using Business Objects XI - Data Integrator, Web Intelligence XI,
Xcelsius for dashboards, Ascential DataStage 5.1/6.0 XE/7.0/7.1/7.5.1/8.1 (Designer, Director, and Administrator),
Orchestrate, and MetaStage.
➢ Experienced with the SAP 6.5 AFS system and its data across modules such as OTC, P2P, Fin/GL, and
AP/AR-related subject areas
➢ Experienced in integrating various data sources such as Oracle, DB2 UDB, SQL Server, IDocs, and flat files into
the staging area
➢ Experienced in Data Analysis and Data Profiling using Trillium Software System – TS Discovery v11
➢ Experienced in working with Bloomberg terminals for Asset Management with Bloomberg AIM
➢ Developed E-R models and dimensional data models, including Star Schema and Snowflake Schema designs
➢ Proficient in data warehousing techniques for data cleansing, slowly changing dimensions, and
surrogate key generation
➢ Extensive experience with UNIX shell scripting for file validation, file manipulation, and scheduling DataStage
jobs (see the sketch after this list)
➢ Experienced in Oracle PL/SQL programming and performance tuning using concepts like Explain Plan, Hints,
Indexes (Bitmap/B-Tree)
➢ Involved and assisted in unit testing, implementation, maintenance and performance tuning
➢ Highly adaptive, with a proven ability to work in a fast-paced team environment and excellent time-management,
analytical, written, and communication skills
➢ Exploring new technologies through Data Science MOOCs; familiar with the basic concepts of Big Data and the
Hadoop, MapReduce, Pig, and Hive components. Coursera Data Science Signature Track candidate.
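As a minimal sketch of the file-validation and job-scheduling shell work above: every name here (paths, project, job, mail address) is hypothetical; only the dsjob command line is standard DataStage.

#!/bin/ksh
# Illustrative sketch: validate an inbound feed, then run a DataStage job.
FEED=/data/inbound/daily_feed.dat
PROJECT=DW_PROJ          # hypothetical project name
JOB=LoadDailyFeed        # hypothetical job name
NOTIFY=dw-support@example.com

# File validation: the file must exist, be non-empty, and the record count
# carried in the trailer must match the actual number of detail rows.
if [ ! -s "$FEED" ]; then
    echo "Missing or empty feed: $FEED" | mailx -s "Feed failure" "$NOTIFY"
    exit 1
fi
expected=$(tail -1 "$FEED" | cut -d'|' -f2)      # count from trailer record
actual=$(( $(wc -l < "$FEED") - 2 ))             # minus header and trailer
if [ "$expected" -ne "$actual" ]; then
    echo "Count mismatch: trailer=$expected actual=$actual" \
        | mailx -s "Feed validation failed" "$NOTIFY"
    exit 1
fi

# Run the job and wait; with -jobstatus, dsjob's exit code reflects the
# job's finishing status (exact codes vary by DataStage version).
dsjob -run -jobstatus -param SourceFile="$FEED" "$PROJECT" "$JOB"
rc=$?
case $rc in
    0|1|2) ;;                                    # OK / finished with warnings
    *) echo "$JOB ended with status $rc" \
           | mailx -s "DataStage job failure" "$NOTIFY"
       exit "$rc" ;;
esac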
TECHNICAL SKILLS
ETL Tools Business Objects Data Integrator, IBM InfoSphere DataStage 5.1/6.0 XE/7.0/7.1/7.5.1/8.1/8.7
(Designer, Director, Manager, Administrator)
Reporting Tools Cognos (Framework Manager, Query Studio, Report Studio) 10.x, Business Objects
4.1/5.0/5.1/6.x (Designer, BO InfoView Web Intelligence XI)
Quality Assurance HP/Mercury Quality Center, Rational ClearQuest
Data Profiling Tools Trillium Software System (TS Discovery) v11
MDM Riversand, Siperian Software
Databases Oracle 7.1/8/8i/9i, IBM DB2/UDB EEE 9.7/10.5, MS SQL Server 2000, MS Access, Teradata,
Sybase
Modeling Tools IBM InfoSphere Data Architect, ERwin 3.5/4.0
Operating Systems Windows 2000/NT, IBM AIX UNIX 4.2/4.3, HP-UX
Languages Perl, NANT, C, C++, UNIX Shell Scripting, SQL, DB2, PL/SQL, SQL*Plus, VB 6.0
Web Development ASP 6, PHP 4, IIS 5/6, Apache, JavaScript, VBScript, HTML, XML
Other Software TOAD, MS Office, MS-Project, MS-Visio
PROFESSIONAL EXPERIENCE
VF Corp, Greensboro, NC Dec 2012 - Current
Sr. Design and Integration Engineer
Project: Acadia
Acadia is the North American implementation of a new SAP 6.5 AFS system; my role is to architect the ETL
solution for the BI group to build an Enterprise Data Warehouse. Data from multiple sources such as SAP, PKMS,
MDM (Riversand), Island Pacific, legacy systems, Logility, GT Nexus, and PayPal is integrated into the EDW.
➢ Administered and owned the DataStage v8.7 installation and maintenance in the initial phase, then transitioned to
the support model. Resolved issues by engaging IBM Support via PMRs.
➢ Participate in the implementation phase of the project life cycle using SDLC/Agile methodology
➢ Oversaw an offshore team of 18 resources (1 lead) and led them through the completion of ETL designs and
development over a period of 18 months, delivering roughly 1,300 ETL jobs
➢ Data modeling for the Stage, Integration, and Dimensional schemas using IDA for Master Data
➢ Creating source-to-target mappings and a documentation process for unit testing and UAT
➢ Used MQ Series for the initial SOA POC, then implemented it for 6 subject areas: Item Master, Customer,
Vendor, GT Nexus, WCS (e-commerce), and PKMS (DOM, WMS, DOCS)
➢ Used the ABAP Extract stage (CPIC and RFC connections) to create ETL designs that pull data from SAP. Created
custom tables and custom ABAP to bring the Sales Orders, Deliveries, Billing, Purchase Order/Purchase Requisition,
Finance, AP/AR, COPA, and Cost Center Accounting modules to Stage.
➢ Established processes and published best practices/checklists for code deployment and migration from Development to
QA/Pre-Prod and Production (see the sketch after this list)
➢ Participated in the various Mock, Cutover, Blackout, and Go-Live cycles, as well as the Hypercare phase of the projects
➢ Worked with interns to develop the ESP applications for the night and day batches.
➢ Worked with Enterprise Architects, Application Admins to identify the root cause/troubleshoot and resolve any
connectivity issues or errors.
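A minimal sketch of the deployment/migration step above, assuming the Information Server 8.x istool CLI; host names, credentials, project, and archive paths are placeholders.

#!/bin/ksh
# Hypothetical example: package the project's DataStage jobs into an .isx
# archive with istool (IS 8.x syntax), ready for import into the QA tier.
ISTOOL=/opt/IBM/InformationServer/Clients/istools/cli/istool

$ISTOOL export \
    -domain devserver:9080 -username dsadm -password '*****' \
    -archive /backup/ACADIA_$(date +%Y%m%d).isx \
    -datastage ' "devengine/ACADIA/Jobs/*/*.*" '

# The matching import on the QA tier would look like:
# $ISTOOL import -domain qaserver:9080 -username dsadm -password '*****' \
#     -archive /backup/ACADIA_20141201.isx -datastage ' "qaengine/ACADIA" '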
Environment:
WebSphere Application Server, IBM InfoSphere V8.7 (AIX P7, flash-based storage), IBM Cognos 10.1/8.4, MS SQL Server,
DB2 10.5, Oracle 10g, SQL, PL/SQL, Windows Server 2003, HP-QC, IBM InfoSphere Data Architect, ERwin Data Modeler
7, CA Workload Automation ESP scheduling
AIG, New York, NY May 2010-Dec 2012
Sr. ETL Architect
Project: AIG Asset Management - Bloomberg – Asset and Investment Management
Understand the Bloomberg specification requirements available from the Bloomberg terminal and design ETL jobs
using DataStage; schedule jobs using Autosys, UNIX scripts, Perl scripts, NANT, and batch scripts to load positions
and market positions for the securities into Bloomberg daily, along with custom sectors (Watchlists, Credit
Family) and custom fields (TSCF such as MAC1, MAC5). Create Compliance and Operations reports from the
transaction logs: Bloomberg Accounts reports, Par Share reports, and Trade Idea reports. Monitor the daily feed
jobs to Bloomberg and troubleshoot any issues.
Project: Chartis Insurance - ISUITE_REPORTING (Pega)
Design and build ETL jobs to populate the reporting datamart for the ISUITE WORKFLOW Submission,
Underwriting, and Issuance objects for the Operations team. Collected requirements from seven
different profit centers in Chartis for the North America region.
Project: Chartis Insurance - BI Chart Reporting Datamart (EDW)
Design and build ETL jobs for loading the DataMart and maintaining the DataModel for any incremental changes in
the design.
Project: Chartis Insurance - Business Metrics DataMart (EDW)
Enhance the existing DataMart by bringing in a new source system (Isuite). The new data is combined with existing
sources and integrated into the presentation layer.
Project: Chartis Insurance - AIUQSR Reporting DataMart (Quarterly Settlement Reporting)
Create UNIX scripts to automate the staging of the input files, and create a scheduling script to be linked
with Autosys.
Transition this project to another team by creating a transition plan and documentation.
Project: Chartis Insurance - Review Monitoring and Tracking
Production support and analysis for the live application and monitoring the weekly batches.
Project: Chartis Insurance - DMRS(Direct Marketing Reporting Services)
The strategic intent of the DMRS system is to aggregate and manage all relevant reporting inputs, enabling A&H
(Accident and Health) DMG Financial Analysts to produce and distribute the timely profit and loss information
necessary to run their business. The solution delivers operational access and insight, providing information at a
granular level supporting channels, partners, offers, and products.
Responsibilities:
➢ Participate in the whole project life cycle using SDLC methodology
➢ Administer and maintain WebSphere DataStage, including security administration such as adding users and granting
privileges from the console
➢ Migrate existing V7.5 projects to V8.1/8.5 and validate them
➢ Develop and architect the ETL jobs that populate tables from staging to presentation, and validate them
➢ Creating the data model for the initial design and vetting it with the Data Architect for approval and verification
➢ Communicating progress on day-to-day activities and planning for additional change requests, production
releases, and migrations
➢ Creating scripts for complex file processing and notification (see the sketch below).
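A minimal sketch of such a file-processing-and-notification script; directories, file names, and addresses are invented for illustration.

#!/bin/ksh
# Stage inbound files once their control file arrives, archive the
# originals, and notify the team. All names are placeholders.
IN=/data/landing; STG=/data/staging; ARC=/data/archive
CTL=$IN/feed.ctl
NOTIFY=etl-team@example.com

# Poll for the control file, which the sender drops after the data files.
tries=0
while [ ! -f "$CTL" ]; do
    tries=$((tries + 1))
    if [ "$tries" -gt 60 ]; then
        echo "Control file never arrived" | mailx -s "Feed late" "$NOTIFY"
        exit 1
    fi
    sleep 60
done

# Move each data file named in the control file to staging, keeping a
# timestamped copy in the archive.
ts=$(date +%Y%m%d%H%M%S)
while read f; do
    cp "$IN/$f" "$ARC/$f.$ts" && mv "$IN/$f" "$STG/$f" || exit 1
done < "$CTL"
rm -f "$CTL"

echo "Staged files at $ts" | mailx -s "Staging complete" "$NOTIFY"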
Environment:
WebSphere Application Server, IBM InfoSphere V7.5/8.1 (UNIX), IBM Cognos 10.1/8.4, MS SQL Server, Oracle 10g, SQL,
PL/SQL, Windows Server 2003, HP-UX, ERwin Data Modeler 7, WebSphere Integration Developer 6.1, CA Workload
Automation
MMM Healthcare, Inc, San Juan, PR August 2009-May 2010
ETL Architect
Project: Aveta Core Technologies(ACT – V)
ACT-V is the business initiative to implement a core system integration ('ODS') for MMM (Medicare y Mucho Más)
and to provide an interface to TriZetto's QNXT application for claims adjudication. This required current-system
analysis of EZCAP and other in-house applications to create an ODS layer for reporting that also acts as a Master
Cross Reference for other applications, using an Enterprise Service Bus architecture.
Responsibilities:
➢ Administer and maintain WebSphere DataStage, including security administration such as adding users and granting
privileges from the console
➢ Creating DSNs, resolving character-set mismatches to handle different languages, debugging driver-incompatibility
issues, and releasing job locks
➢ Migration of projects from Dev to SIT using TFS, and architecting the jobs to use parameters and $PROJDEF defaults.
➢ Copying and creating Environment variables across projects
➢ Creating scripts to take weekly backups of the DataStage jobs (see the sketch after this list)
➢ Scheduling the jobs with email notifications and guiding the developers to create effective sequences with Error
Handling and reject handling
➢ Analyzing the source/target data (QNXT, Market Prominence, EZCAP, ProHealth, Portico) and profiling it for
Member, Provider, Claims, and Utilization, including Case Management/Tracking Review (concurrent review)
➢ Identifying the Business keys, Foreign Keys, Primary keys, Alternate Keys for designing the ODS layer
➢ Creating Source to Target mappings for ETL for Provider, Claims and Utilization Model
➢ Creating the Staging Data Model using ERWIN Data Modeler from the XSD for Provider and Practitioner XML files
which are provided by Portico to load into ODS
➢ Creating the ETL Design document with complex transformations using Master Cross Reference and lookups along
with Scheduling strategy and Mail Notification
➢ Develop the ETL designs using different stages like Folder, ODBC, XML Input, Transformer, Lookup, Merge, Join,
Funnel and different Job Scheduling Stages like Job Activity, Notification Activity, Execute Command, Exception
Handler
➢ Document the process for Unit test and deployment, monitor using the Director and Web console.
➢ Created Master Sequences using CA Workload Automation (Autosys).
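The weekly job-backup script can be sketched as follows, again assuming the IS 8.x istool CLI; this environment was Windows, where the same call would sit in a .bat file on a weekly schedule, but a UNIX-style sketch is shown for readability, with all names as placeholders.

#!/bin/ksh
# Weekly backup of a project's DataStage jobs to a dated .isx archive.
ISTOOL=/opt/IBM/InformationServer/Clients/istools/cli/istool
STAMP=$(date +%Y%m%d)

$ISTOOL export \
    -domain actvserver:9080 -username dsadm -password '*****' \
    -archive /backup/ACTV_jobs_$STAMP.isx \
    -datastage ' "actvengine/ACTV/*/*.*" '

# Keep only the eight most recent weekly archives.
old=$(ls -1t /backup/ACTV_jobs_*.isx 2>/dev/null | tail -n +9)
[ -n "$old" ] && rm -f $old

# Example crontab entry (Sundays at 02:00):
# 0 2 * * 0 /opt/scripts/backup_dsjobs.sh >> /var/log/dsbackup.log 2>&1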
Environment:
WebSphere Application Server, IBM InfoSphere V8.1 (Windows), IBM Cognos, MS SQL Server, Oracle 10g, SQL, PL/SQL,
Windows Server 2003, HP-UX, ERwin Data Modeler 7, WebSphere Integration Developer 6.1, WebSphere TX, WebSphere
Service Registry and Repository, CA Workload Automation
Red Hat, Raleigh, NC April 2009 - July 2009
BI Consultant/ETL Architect
Project: Routes to Market
RTM is the business initiative to grow Red Hat's business, improve collaboration in the open source
community, and improve customer satisfaction.
Responsibilities:
➢ Using Agile methodology to improve time to market for RTM solutions using Sprints and Scrums
➢ Gather requirements using User story sessions and defining tasks and estimation
➢ Design ETL jobs using the Data mappings and complex History Preserving method
➢ Creating batch jobs with scripts for initial/delta loads and email notifications, as well as ETL stats (Facelift)
➢ Working with Rapid Marts for Sales, Salesforce, Finance modules.
➢ Create Reports for Users in BO Universe as well as Web Intelligence
➢ Parameterization of Job designs for migration and automation
➢ Create User defined queries for the source extract of the data
➢ Create different test cases for validation and testing the functionality
➢ Troubleshooting and resolving complex business transformation/data logic issues for the RT tickets raised and
proposing innovative solutions for Data Standardization
Environment:
Business Objects Data Integrator 11.2 (XI R2), Business Objects XI R2, RHEL 5.1, Oracle 10g, SQL, PL/SQL,
Windows Server 2003, HP-UX, ERwin Data Modeler 7
Merrill Lynch, Hopewell, NJ July 2008- March 2009
ETL Solution Architect
Project: CEDP – OnlineB
CEDP (Client Enterprise Data Project) is a business initiative to bring the Account and Profile centric data into a common
repository for all the Business divisions of Merrill Lynch.
Responsibilities:
➢ Create SADs (Software Architecture Documents) from SRSs (Software Requirement Specifications)
➢ Led and managed a team of 3 resources while completing my own deliverables.
➢ Design ETL jobs using the Data mappings
➢ Parameterization of Job designs for migration and automation
➢ Create User defined queries for the source extract of the data
➢ Create ETL extracts to consume binary files from DB2
➢ Create Unix scripts to truncate tables before loading into Prelanding and Landing tables
➢ Create ETL designs for creating the loadready files and then loading into the Target
➢ Improve performance by changing the Array size and rows per transaction
➢ Create different test cases for validation and testing the functionality
➢ Create ETL designs to strip Header and Trailer records before loading into Prelanding
➢ Create Custom Routine to generate a Header and Trailer record for the Extract files to be sent to other target
systems using BASIC
➢ Create MJC (Master Job Control) for ETL automation using CSVs, INIs, and BAT scripts to run in different
environments using KBA routines
➢ Execute Siperian Stage and Load jobs and Match and Merge using stored procedures.
➢ Configuring the Autosys job automation for Siperian MDM hub jobs using Autorepos tables in Autosys.
➢ Creating JIL (Job Information Language) definitions for running the batch scripts under Autosys (see the sketch after this list)
➢ Troubleshooting and resolving complex business transformation/data logic issues for the CQ’s raised and proposing
innovative solutions for Data Standardization
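A minimal JIL sketch for the Autosys wiring described above, fed to the jil utility through a here-document; the job names, machine, owner, and paths are invented.

#!/bin/ksh
# Define a command job in Autosys by piping JIL into the jil utility.
jil <<'EOF'
insert_job: CEDP_MJC_DAILY   job_type: c
command: /apps/cedp/bin/run_mjc.sh daily
machine: etlhost01
owner: dsadm@etlhost01
condition: s(CEDP_FILE_WATCH)   /* run after the file watcher succeeds */
std_out_file: /apps/cedp/logs/mjc_daily.out
std_err_file: /apps/cedp/logs/mjc_daily.err
alarm_if_fail: 1
EOF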
Environment:
IBM Ascential DataStage 7.5.1/2, Siperian Master Data Management, Oracle 9i, DB2, SQL, PL/SQL,
Windows Server 2003, HP-UX, ERwin 3.5, Rational ClearQuest, Autosys
Chevron, Houston, TX Feb 2007 - June 2008
ETL Architect
Project: LYNX- Global Supply and Trading
The El Segundo refinery built interfaces to extract, transform, and load data from source systems such as Simto, LIMS, and
spreadsheets.
IA (Information Architecture) built interfaces to extract, transform, and load data from source systems such as SAP (XML
inbound interface), PW (Web Services interface), and ICTS (Web Service outbound). Designed jobs both to be published as
web services and to consume web services.
Responsibilities:
➢ Collaborated to gather requirements to understand the scope of the project through various requirement meetings
and team collaboration meetings
➢ Create Technical Specifications from Functional Requirements
➢ Analyze and profile Master Data using Trillium Software
➢ Build ETL jobs using IBM DataStage 7.5.1a (Enterprise Edition) to push data to Staging / RIA / ODS (Simto)
➢ Build ETL jobs to pull data from production sources e.g. LIMS, SIMTO
➢ Design Job Sequences in a Batch process for daily, weekly, Intraday
➢ Build ETL jobs to read the input XML source files for processing in Datastage PX
➢ Build ETL jobs to sequence in a Batch process
➢ Build ETL Before/After routines to send mail notifications using DSSendMail.
➢ Design and build complex ETL transformations to load Dimensions and Facts, implementing SCD Type 2 designs (see the sketch after this list)
➢ Component testing of the jobs designed and production support for the initial/final cutover
➢ Designing POC for jobs, to be published as a web service with SOA, and also, consume a web service
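The SCD Type 2 pattern above reduces to two statements: expire the current dimension row when tracked attributes change, then insert the new version with a fresh surrogate key and effective dates. A sketch via a sqlplus here-document, with invented table and column names:

#!/bin/ksh
# Illustrative SCD Type 2 load (invented names), driven from a shell wrapper.
sqlplus -s dw_user/'*****'@DWDB <<'EOF'
WHENEVER SQLERROR EXIT 1

-- Expire current rows whose tracked attributes changed in staging.
UPDATE dim_material d
   SET d.eff_end_dt = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1 FROM stg_material s
                WHERE s.material_id = d.material_id
                  AND (s.grade <> d.grade OR s.plant <> d.plant));

-- Insert a new current version for changed and brand-new materials,
-- drawing the surrogate key from a sequence.
INSERT INTO dim_material
       (material_key, material_id, grade, plant,
        eff_start_dt, eff_end_dt, current_flg)
SELECT dim_material_seq.NEXTVAL, s.material_id, s.grade, s.plant,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_material s
 WHERE NOT EXISTS (SELECT 1 FROM dim_material d
                    WHERE d.material_id = s.material_id
                      AND d.current_flg = 'Y');

COMMIT;
EOF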
Environment: IBM Ascential DataStage 7.5.1/2, Trillium Software System v11, Oracle 9i, SQL Server 2005, SQL, PL/SQL,
Windows Server 2003, HP-UX, ERwin 3.5, Cognos 8.x
Kinko’s, Dallas, TX Apr 2006 - Jan 2007
DW ETL Consultant
Project: Transactions Datamart
Daily transactions from the Kinko’s centers are validated and loaded into a star-schema-based Transactions Datamart in
the DWH. The transactions are sourced from ODS (TODS) and COMPASS/CLARIFY (CRM) systems and loaded into the
DWH using Ascential DataStage. Batch processing is used to schedule the order of execution for the jobs.
Project: Revenue Segmentation
The transactions loaded into the DWH are updated with the Portfolio Assignment IDs, Customer IDs, and Revenue
Segment IDs.
Project: Zenith 4.1 AE support
New business rules prompted changes to the different portfolios brought into the DWH. AE (account executive)
portfolios were excluded from the DWH, as they were not commissionable. The Compass Category jobs were changed
to exclude AEs and to bring in other portfolios such as AM/MAM and SAE. Fit/gap analysis and end-to-end regression
testing were done using BO reports to verify the changes.
Responsibilities:
➢ Collaborated to gather requirements and understand the scope of the projects through various requirement meetings
➢ Designed source-to-target mappings for building the Datamart; tested and implemented the fit/gap analysis
➢ Created Batches (DS Job Control) and sequencers to control set of DataStage Jobs
➢ Created Technical Design Document for the ETL process with Source to Target mappings and the extract filters
giving a high level overview of the process using diagrams built in Visio
➢ Importing the BO universe and reports and changing the database connections from Dev to Test to do the
regression testing
➢ Improved performance by building indexes, pruning partitions, dropping and rebuilding indexes, and analyzing tables
after the partition loads (see the sketch after this list)
➢ Followed the naming standards for the design of jobs and stages and links with proper documentation of the
process
➢ Wrote UNIX scripts/commands in Job Control for file manipulation, and before-job routines to sort files before
aggregating
➢ Performed unit testing, User Acceptance Testing (UAT), and integration testing; created test plans, documented
the test results, and performed production validation after the jobs were migrated into production
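The post-load index and statistics maintenance above can be sketched as a small wrapper around sqlplus; the object and partition names are invented, and DBMS_STATS stands in for the older ANALYZE TABLE.

#!/bin/ksh
# Rebuild the index partition touched by the load, then refresh statistics.
PART=P_$(date +%Y%m)   # monthly partition naming assumed for illustration

sqlplus -s dw_user/'*****'@DWDB <<EOF
WHENEVER SQLERROR EXIT 1

-- Rebuild the local index partition left UNUSABLE by a direct-path load.
ALTER INDEX trans_fact_ix1 REBUILD PARTITION $PART NOLOGGING;

-- Refresh optimizer statistics for just that partition so pruning works.
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'DW', tabname => 'TRANS_FACT', partname => '$PART', cascade => TRUE)

EXIT
EOF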
Environment: Ascential DataStage 7.5.1, Oracle 9i, SQL, PL/SQL, HP-UX, ERWIN 3.5, Business Objects 6.x