Sriram Jasti
Mobile: 415-741-9969 E-Mail: sriramjasti@gmail.com
Seeking challenging assignments with a company that will utilize my experience and
technical skills in Data Warehousing and Business Intelligence.
PROFESSIONAL SUMMARY
 10+ years of rich experience in solving critical business needs by modeling data warehouses for business
intelligence environments, using Star Schema and Snowflake Schema methodologies for faster, more
effective querying and management of large data volumes.
Understanding Client Requirements
 Competent in client relationship management, presentations, designing functional specifications,
implementing solutions, and user training and support.
 Conducting gap analysis and providing strategic input to various business users to support reporting
and analysis initiatives.
 Served as Subject Matter Expert in data modeling for the Airlines, Financial Services, Property
Management, Manufacturing, Banking, Insurance, and Retail domains.
Expertise in Agile
 Expertise in project planning, architecture design, and development. Responsible for prototyping solutions;
preparing test scripts and conducting tests; data replication, extraction, transformation, loading, and
cleansing; creating multi-dimensional cubes; and creating prototype Excel pivot table reports.
 Experience in conducting sprint planning and scrum meetings.
Designing and implementing
 Implemented four full-lifecycle data warehouse and business intelligence solutions using Star Schema and
Snowflake Schema methodologies (an illustrative star-schema sketch follows this list).
 Created logical and physical models using MS Visio and Erwin.
 Rich expertise in data analysis, design, and development of data extraction, transformation, and
loading (ETL) processes using IPE and IDQ.
 Developed processes for capturing and maintaining metadata from all data warehousing components,
including sizing estimates for large-scale data warehousing projects and aggregation and indexing strategies
with database security.
 Rich experience in analyzing performance statistics for extracting high volumes of data from multiple
sources into the data warehouse, and in creating automated processes based on those statistics.
 Experience in analyzing data quality issues using IDQ.
 Experience in bullet-proofing systems using an Auditing, Balancing, and Control (ABC) mechanism.
 Created and maintained ETL processes using Microsoft SQL Server Integration Services (SSIS) against a
variety of data sources, including other servers, flat files, MS Access, and Excel.
 Responsible for data maintenance and database improvements, following best practices for data typing,
relational data structures, and data quality.
 Experience in creating Hive queries on Hadoop 1.x; very good understanding of the YARN architecture.
 Expertise in creating multi-dimensional cubes using Analysis Services.
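For illustration, a minimal star-schema sketch of the kind referred to above, in generic SQL with hypothetical retail table and column names (not drawn from any specific engagement): one fact table joined to conformed dimensions through surrogate keys.

-- Minimal star schema (hypothetical names): a fact table keyed to
-- dimension tables via surrogate keys.
CREATE TABLE dim_date (
    date_key      INT          NOT NULL PRIMARY KEY,  -- e.g. 20140531
    calendar_date DATE         NOT NULL,
    month_name    VARCHAR(10)  NOT NULL,
    year_num      SMALLINT     NOT NULL
);

CREATE TABLE dim_product (
    product_key   INT          NOT NULL PRIMARY KEY,  -- surrogate key
    product_code  VARCHAR(20)  NOT NULL,              -- natural/business key
    product_name  VARCHAR(100) NOT NULL,
    category      VARCHAR(50)  NOT NULL               -- denormalized in a star
);

CREATE TABLE fact_sales (
    date_key      INT           NOT NULL REFERENCES dim_date (date_key),
    product_key   INT           NOT NULL REFERENCES dim_product (product_key),
    qty_sold      INT           NOT NULL,
    sale_amount   DECIMAL(12,2) NOT NULL
);

In a snowflake variant, the category column would be normalized out of dim_product into its own table referenced by a category key.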
EXCELLENCE SPHERE
Requirement Analysis
Software Usability & Development
System Architecture
Data Modeling
Design and Implementation
User Support
Managing Onshore/Offshore Teams
 Troubleshoot and correct any reported data issues in deliverables.
 Rich experience in design and development using Informatica 6.1/7.3/8.1/8.6/9.0.1, DataStage 7.5/8.0.1 PX,
SSIS (SQL Server 2005/2008 R2), SSRS (SQL Server 2005/2008 R2), SSAS (SQL Server 2005/2008 R2), Erwin
4.2/7.1, Microsoft Visio 2010, SAS Data Integration Studio 3.3/3.4, and SAS 9.1.2 including SAS SQL and SAS Macros.
Additional Contributions
 Experience integrating Hive queries (HQL) with the data warehouse for better performance.
 Experience converting SQL Server DTS packages to Informatica mappings.
 Experience converting DataStage server jobs to Informatica mappings.
 Extensive working experience integrating Cognos Planning with the GL Interface in Oracle Applications 12i
and HFM.
 Pulled and pushed data to and from ERP systems such as Oracle Apps, Siebel CRM, PeopleSoft, and QAD
to maintain customer information.
 Experience cleansing data using address validation, standardization, and match-and-merge rules in IDQ (see the illustrative SQL sketch after this list).
 Knowledge of integrating with real-time systems such as mainframes by creating data maps using PowerExchange.
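IDQ expresses these rules as plans rather than hand-written SQL, but the underlying idea can be sketched in plain SQL. A minimal, illustrative example, assuming a hypothetical stg_customer staging table: names and addresses are standardized, duplicates are matched on a simple key, and one surviving record is kept per group.

-- Illustrative standardization and match-merge (hypothetical stg_customer table;
-- IDQ implements the equivalent with address validation and match/merge plans).
WITH standardized AS (
    SELECT customer_id,
           UPPER(LTRIM(RTRIM(customer_name))) AS std_name,
           REPLACE(UPPER(LTRIM(RTRIM(address_line))), ' STREET', ' ST') AS std_address,
           postal_code,
           last_updated
    FROM stg_customer
),
matched AS (
    SELECT customer_id, std_name, std_address, postal_code,
           ROW_NUMBER() OVER (
               PARTITION BY std_name, postal_code   -- simple match key
               ORDER BY last_updated DESC           -- survivorship: most recent wins
           ) AS rn
    FROM standardized
)
SELECT customer_id, std_name, std_address, postal_code
FROM matched
WHERE rn = 1;   -- one merged "golden" record per matched group

A real match rule would use fuzzy comparisons rather than an exact key; the exact-key version keeps the sketch short.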
CERTIFICATIONS
 Microsoft Certified Professional. http://www.microsoft.com/learning/mcp/transcripts
• Microsoft® SQL Server™ 2005 Business Intelligence – Implementation and Maintenance
• Designing a Business Intelligence Solution by Using Microsoft® SQL Server 2005
CAREER PATH
Data warehouse Consultant at Info Objects Inc., Santa Clara, from May ’14 to present
Data warehouse Consultant at SACC Inc., Redwood City, from Oct ’13 to May ’14
Data warehouse Architect at Laird Technologies Inc., Holly, from Apr ’11 to Sep ’13
Senior Programmer Analyst at Global Techies Inc., Detroit, from Dec ’09 to Mar ’11
Senior Associate – Platform L2 at Sapient Corporation, Bangalore, from Aug ’09 to Dec ’09
Senior Engineer – Level 1 at Satyam Computer Services, Malaysia, from Oct ’07 to July ’09
Software Engineer – Level 1 at Penfos Systems Pvt. Ltd., Hyderabad, from Apr ’04 to Sep ’07
SKILL SET
Project Management
• Implementing project plans within preset budgets and deadlines; monitoring project progress and outstanding
issues; ensuring quality and timeliness of deliverables; reporting on the project’s progress and escalating issues.
• Creating support SLAs and establishing SME & various technical support team interactions for improved
operational efficiency; administering quality procedures related to project and delivery.
Software Development
• Interfacing with clients for business requirements gathering, conducting system analysis, and finalizing
technical/functional specifications and high-level design documents for the project.
• Contributing to the design, development, testing, troubleshooting and debugging of the software.
• Architected, designed, developed & managed teams for products in the Financial Services, Manufacturing and
Retail domains using agile methodology.
• Providing post-implementation, application maintenance and enhancement support to the client.
• Providing ETL solutions for multi-terabyte data warehouses and understanding their impact.
Key Account Management
• Interfacing and interacting with customers to provide regular updates on project status and delivery
timelines; ensuring adherence to the delivery timelines.
• Managing and developing beneficial and cordial relationships with key clients.
• Ensuring speedy resolution of queries & grievances to maximize client/customer satisfaction levels and
maintaining excellent relations with clients/customers to generate avenues for further business.
• Played a key role in performance improvements and review for various projects.
Manpower Leadership
• Leading, mentoring & monitoring the performance of team members to ensure efficiency in process operations
and meeting of individual & group targets.
• Developing competency among team members and conducting interviews to recruit the right talent and
resources.
Refer to Annexure for Projects Executed
TECHNICAL PURVIEW
Operating Systems : Windows 2003, NT, XP, UNIX
Languages : C, C#, Base SAS, SAS SQL, shell scripting
Databases : Oracle 8i/9i/10g, MS SQL Server 2000/2005/2008 R2/2012
Data modeling Tools : Erwin 4.1/7.1, Embarcadero, and MS Visio 2010
Big Data : Hadoop 1.x MapReduce, Hive, Pig, Sqoop, Flume; Hadoop 2.x YARN
ETL Tools : Informatica PowerCenter 6.1/7.1/8.1.3/8.6.1/9.0.1,
Informatica PowerExchange 8.6.1,
Informatica Data Quality 8.6.1/9.1,
B2B Data Transformation 8.6.1,
MS-SSIS SQL Server 2005/2008 R2,
DataStage 7.5/8.0.1 PX, and SAS Data Integration Studio 3.3/3.4
Reporting Tools : MS-SSRS SQL Server 2005/2008 R2 with SharePoint Integration,
MS-SSAS SQL Server 2005/2008 R2/2012, Report Builder 3.0,
MS Excel Pivot Tables,
Business Objects XI R2 (Universe Build), and
Cognos 8 (Framework Manager)
Scheduling Tools : Control-M 8.x, LSF Platform Scheduler 3.x
Tools & Version Control : Tortoise Subversion (SVN), PVCS, Perforce
Defect Tracking Tool : JIRA
Agile Tools : Rally, JIRA
EDUCATION
• PG Diploma in Computer Applications from Jawaharlal Nehru National Youth Center in 2004
• B.E. (Electrical and Electronics Engineering) from Jawaharlal Nehru Technical University in 2004
ANNEXURE
PROJECTS EXECUTED
At SACC and Info Objects, Inc.
Client : Digital Insight Inc.
Title : IFS Data warehouse (IDW)
Duration : Oct ’13 – present
Role : Data warehouse Consultant
Location : Redwood City, CA
Environment :
Database : Oracle 11g
Tools : Informatica 9.1,
Embarcadero ER/Studio Data Architect XE2, Perforce,
Hadoop 1.x (MapReduce, Hive queries, HBase),
JIRA, Rally, and QlikView 11.0
O/S : UNIX
Description:
Digital Insight was previously part of Intuit and is now part of NCR, a leading ATM solution provider.
Digital Insight provides solutions for various financial institutions, such as online banking, digital banking,
mobile banking, payments, and money movement, and has provided solutions to many small businesses. DI's goal is
to integrate all the FIs' data and provide reports to individuals using a centralized data platform, and also to
provide data feeds by integrating all the applications. DI is in the process of redesigning the EDW into the IDW.
Responsibilities : Executed the following tasks
• Understanding existing business model and customer requirements
• Determination and documentation of Fit Gaps
• Create Dimensional Model (Logical and Physical Model).
• Create Data Dictionaries.
• Design the ETL flow i.e. architectural design.
• Implemented Auditing and Logging Mechanism.
• Interact with the Subject matter experts.
• Implemented a CDC mechanism to avoid bottlenecks and improve loading performance.
• Implemented ELT solution for multi-terabyte Data Warehouse and understand its impacts.
• Setting up Environment to pull data from various sources.
• Involved in migration of data between Data centers
• Involved in automating the migration of code from one environment to another.
• Sent consumer data files for individual FIs using the MoveIT application.
• Responsible for preparing ETL Design and Specification for developing Workflow.
• Created multiple reporting feeds for multiple FIs.
• Implemented a data quality methodology to validate the data.
• Created MapReduce jobs to pull data from Global Logging to capture activity details and
generate reports.
• Established a mechanism to handle different types of files for loading Hive external tables.
• Extracted data from HDFS using Hive queries, converted the aggregated data into CSV files, and loaded
these CSV files into the IDW (see the Hive sketch after this list).
• Responsible for data quality analysis and performance tuning for incremental and historical loads.
• Created dashboards for individual FIs.
• Created interactive reports to analyze financial data.
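A minimal Hive sketch of the pattern in the bullets above, with hypothetical HDFS paths and table names. An external table is laid over the raw log files, and the aggregate is written into a comma-delimited external table whose underlying files serve as the CSV extract for the IDW load.

-- Hypothetical names and paths; Hive on Hadoop 1.x.
CREATE EXTERNAL TABLE raw_activity (
    fi_id       STRING,
    activity_ts STRING,
    activity_cd STRING,
    user_id     STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/global_logging/activity';   -- files dropped here by upstream feeds

-- Comma-delimited target table; its HDFS files double as the CSV extract.
CREATE EXTERNAL TABLE agg_activity_csv (
    fi_id         STRING,
    activity_date STRING,
    activity_cnt  BIGINT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/exports/agg_activity';

-- Aggregate the raw activity and overwrite the export location.
INSERT OVERWRITE TABLE agg_activity_csv
SELECT fi_id,
       TO_DATE(activity_ts),
       COUNT(*)
FROM raw_activity
GROUP BY fi_id, TO_DATE(activity_ts);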
At Laird Technologies
Client : Laird Technologies Inc.
Title : Laird Enterprise Data warehouse (LEDW)
Duration : Apr ’11-Sep ‘13
Role : Data warehouse Architect
Location : Holly, MI
Environment :
Database : MS SQL 2000/2008 R2/2012, Progress DB, Hyperion Essbase
Tools : MS SSIS, SSRS, SSAS, Informatica 9.1, Informatica Data Quality 9.1
O/S : Microsoft Windows 2008 server
Description:
Laird Technologies designs and manufactures customized, performance-critical products for wireless and other
advanced electronics applications. The company is a global market leader in the design and supply of electromagnetic
interference (EMI) shielding, thermal management products, specialty metal products, signal integrity components,
and antenna solutions, as well as radio frequency (RF) modules and wireless remote controls and systems. Laird
has grown rapidly through acquisitions, so it is in the process of rebuilding the Laird Enterprise Business
Intelligence system.
Responsibilities : Executed the following tasks
• Determination and documentation of Fit Gaps
• Create Dimensional Model (Logical and Physical Model).
• Design the ETL flow i.e. architectural design.
• Interact and manage Vendors.
• Implemented ELT solution for multi-terabyte Data Warehouse and understand its impacts.
• Setting up Environment to pull data from various sources.
• Responsible for preparing ETL specifications for developing workflows.
• Created and modified reusable and user-defined objects.
• Responsible for data quality analysis and performance tuning for incremental and history loads.
• Initiated database cleanup and provided technical assistance for data mismatch queries, data
recovery, and data reconciliation across source vs. operational data store vs. staging vs. warehouse.
• Understanding existing business model and customer requirements.
• Migrated data from SQL Server 2000 to 2008 R2, and some of the data to SQL Server 2012.
• Created and edited SSIS packages, including error management and logging.
• Created and scheduled server jobs on SQL Server 2008 R2.
• Implemented a CDC mechanism using MS SSIS and Informatica 9.1 (see the watermark-based CDC sketch after this list).
• Responsible for integrating the cube with SharePoint using Report Builder.
• Responsible for creating finance KPIs using SSAS OLAP cubes.
• Created PerformancePoint reports and dashboard reports.
• Designed the reporting model.
• Published the reports to SharePoint and created BI security.
• Built the enterprise OLAP cubes for Laird Intelligence using Analysis Services.
• Prepared customized mappings, workflows, reports, etc., as required by the functional
members.
• Prepared unit test scripts, executed them, and documented the testing
exercise.
• Integrated data with SharePoint lists using the SSIS SharePoint List Add-in.
• Built and published customized interactive reports and dashboards, with report scheduling, using Tableau
Server.
• Automated the backup mechanism for cubes and database objects.
• Created an automated mechanism to process OLAP cubes after each ETL run.
• Created automated version control for database objects and cubes, and manually uploaded XMLs of
Informatica objects to SVN.
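The CDC mechanism can be sketched as a watermark-driven incremental extract. The T-SQL below is illustrative, with hypothetical table and column names; SSIS and Informatica implement the same pattern with their own CDC components.

-- Watermark-driven incremental extract (hypothetical names, T-SQL).
-- etl_watermark holds the high-water mark of the last successful load.
DECLARE @last_load DATETIME, @this_load DATETIME = GETDATE();

SELECT @last_load = last_loaded_at
FROM etl_watermark
WHERE table_name = 'src_orders';

-- Pull only the rows changed since the previous run.
INSERT INTO stg_orders (order_id, customer_id, order_amount, modified_at)
SELECT order_id, customer_id, order_amount, modified_at
FROM src_orders
WHERE modified_at > @last_load
  AND modified_at <= @this_load;

-- Advance the watermark only after the staging load succeeds.
UPDATE etl_watermark
SET last_loaded_at = @this_load
WHERE table_name = 'src_orders';

Capping the extract at @this_load keeps rows that arrive mid-run from being skipped when the watermark advances.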
At Global Techies Inc.
Client : Blue Cross Blue Shield of Michigan (BCBSM)
Title : EDW Data Quality
Duration : Jan’10 – Jan’11
Role : Senior Data warehouse Consultant
Location : Detroit, MI
Environment :
Database : Oracle 10g, MS SQL Server 2005
Tools : Informatica 9.0.1, Informatica IDQ
O/S : UNIX
Description:
Blue Cross Blue Shield is a non-profit healthcare insurance organization. BCBSM has two types of
data warehouses: one global and one local. BCBSM is in the process of integrating these two data
warehouses and building data marts. The data mart was developed to enable multi-level aggregations
of claims data across various providers and consumer groups. The Blue Direct Reporting data mart
was developed to meet the financial and utilization reporting needs of authorized agents.
For integration and analysis purposes, BCBSM chose Informatica as the ETL tool.
Responsibilities :
• Determination and documentation of Fit Gaps
• Create Dimensional Model (Logical and Physical Model).
• Design the ETL flow i.e. architectural design.
• Involved in creating the ABC (Audit, Balancing, and Controlling) mechanism for the data, based on the design.
• Setting up and running ETL jobs to pull data from various sources.
• Involved in preparing ETL specifications for developing jobs.
• Worked on several transformations such as SQL, Filter, Joiner, Rank, Sequence Generator, Stored
Procedure, and Expression; implemented SCDs (see the SCD Type 2 sketch after this list); knowledge of using CDC in Informatica.
• Experience creating data maps using PowerExchange.
• Involved in creating, editing, deleting, and scheduling sessions in Informatica.
• Created and modified reusable transformations and mapplets.
• Involved in data quality analysis and performance tuning for incremental and history loads.
• Initiated database cleanup and provided technical assistance for data mismatch queries, data recovery, and
data reconciliation across source vs. operational data store vs. staging vs. warehouse.
• Created IDQ plans for address validation and standardization of customer names.
• Understood the existing business model and customer requirements.
• Prepared customized mappings, workflows, reports, etc., as required by the functional members.
• Created master sequencer jobs and scheduled them based on business requirements.
• Prepared unit test scripts, executed them, and documented the testing exercise.
• Tested individual modules of the jobs.
• Validated data loaded by the ETL process.
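For illustration, a minimal SCD Type 2 sketch in SQL with hypothetical dim_customer and stg_customer names; in the project this logic was built as Informatica mappings rather than hand-written SQL.

-- SCD Type 2 (hypothetical names, T-SQL): expire the current row when a
-- tracked attribute changes, then insert a fresh current row.
UPDATE d
SET d.is_current = 0,
    d.valid_to   = GETDATE()
FROM dim_customer AS d
JOIN stg_customer AS s
  ON s.customer_id = d.customer_id
WHERE d.is_current = 1
  AND (d.customer_name <> s.customer_name
       OR d.address_line <> s.address_line);

-- Insert a current row for new customers and for those just expired above.
INSERT INTO dim_customer (customer_id, customer_name, address_line,
                          valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, s.address_line,
       GETDATE(), NULL, 1
FROM stg_customer AS s
LEFT JOIN dim_customer AS d
       ON d.customer_id = s.customer_id
      AND d.is_current = 1
WHERE d.customer_id IS NULL;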
At Sapient Corporation
Client : Target
Title : Target Everest
Duration : Aug ’09 – Dec ’09
Location : Minneapolis, MN
Role : Senior Data warehouse Analyst
Environment :
Database : Oracle 10g, DB2
Tools : ERWIN 7.1, Data stage 7.5
O/S : UNIX
Description:
Target Corporation is a retail vendor. Target opened its first store in 1962 and now has more than 1,700 stores
globally. Target wanted an end-to-end solution for online product purchases, so they chose Java as the
front end and middleware systems such as IBM Message Broker for messaging. For integration and
analysis purposes, Target chose DataStage as the ETL tool; Business Objects was used for creating
cubes and reports.
Responsibilities: Executed the following tasks
• Determination and documentation of Fit Gaps
• Creating Technical specification documents for developers.
• Implemented an auditing mechanism (see the row-count reconciliation sketch after this list).
• Involved in conducting design review sessions with different business users.
• Involved in code review sessions with developers and the manager.
• Created Test cases for ETL and Integration verification.
• Involved in Release Deployments.
• Created Operational guide documentation.
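The auditing mechanism can be sketched as a per-batch row-count balance check. The SQL below is illustrative (Oracle-flavored, with hypothetical names); the project's actual implementation sat inside the DataStage jobs.

-- ABC-style audit sketch (hypothetical names): record source vs. target
-- row counts per batch and flag any out-of-balance load.
CREATE TABLE etl_audit (
    batch_id    NUMBER(10)    NOT NULL,
    table_name  VARCHAR2(64)  NOT NULL,
    src_rows    NUMBER(18)    NOT NULL,
    tgt_rows    NUMBER(18)    NOT NULL,
    balanced    CHAR(1)       NOT NULL,   -- 'Y' when the counts match
    audited_at  DATE          NOT NULL
);

INSERT INTO etl_audit (batch_id, table_name, src_rows, tgt_rows, balanced, audited_at)
SELECT 1042, 'sales_fact', s.cnt, t.cnt,
       CASE WHEN s.cnt = t.cnt THEN 'Y' ELSE 'N' END,
       SYSDATE
FROM (SELECT COUNT(*) AS cnt FROM stg_sales) s,
     (SELECT COUNT(*) AS cnt FROM sales_fact WHERE batch_id = 1042) t;

A downstream control step reads etl_audit and halts or alerts when balanced = 'N'.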
At Satyam Computer Services Limited
Client : Suncorp, Brisbane, Australia
Title : Customer Franchise Data Mart (CFDM)
Duration : Oct ’08 – Aug ’09
Location : Kuala Lumpur, Malaysia (Offshore Team)
Role : Team Lead
Environment :
Database : Oracle 10g, MS SQL 2000
Tools : Erwin 7.1, DataStage 7.5, Cognos 8.3
O/S: Microsoft Windows XP, UNIX.
Description:
Suncorp was established in 1916 and has a wide range of businesses in banking and insurance. In Australia,
Suncorp is the 6th largest bank and the third largest insurance group. Suncorp Group Marketing has
embarked on an engagement to build a customer data mart for developing customer reporting and
analytics; however, there is no single repository where all this information is integrated and historically
maintained. The CFDM (Customer Franchise Data Mart) program is the strategic initiative to encompass
all the data warehousing requirements of Group Marketing. DataStage is the integrated ETL solution
recommended for the data mart, as the extraction and transformation tool to meet Suncorp’s
requirements, and Cognos is the reporting tool for analysis.
Responsibilities: Managed the following tasks:
• Understanding existing system and customer requirements.
• Meet business users to collect and analyze business requirements and provide system enhancement
recommendations.
• Designed the architecture of the data warehouse from an integration perspective.
• Design the Data model (Logical and Physical Model).
• Design the Mapping Documents for the ETL Developers to develop the jobs.
• Tested whether the data fit the data model and business requirements.
Client : Emirates NBD Bank
Title : Unified Emirates Financial Management System (UniFi)
Duration : Jan’08 – Nov ’08
Location : Dubai
Role : Design Analyst / Team Lead
Environment :
Database : Oracle 10g, MS-SQL Server 2000, MS-SQL Server 2005
Tools : Informatica 8.1.3, Cognos Planning 8, Cognos 8.3, Oracle Applications 12i
O/S: Microsoft Windows XP, Windows 2003.
Description:
Emirates Banking Group operates in six countries and is the largest bank in the Middle East. Because it
operates across different countries, EBG wants a solution for maintaining data in one place with one
application. With a view to integrating the financial and operational systems, Emirates Banking Group
is in the process of deploying Oracle Applications, which includes all the financial modules: purchasing,
inventory, property manager, enterprise asset management, and budgets coming from Cognos
Planning. As part of the Oracle General Ledger and Budgets integration implementation we are using
Informatica; this project covers the processes to be adopted by Emirates Banking Group based on the
functionalities of the General Ledger Interface and Budgets.
Responsibilities: Managed the following tasks:
• Understanding existing system and customer requirements.
• Meet business users to collect and analyze business requirements and provide system enhancement
recommendations.
• Designed the Architecture and Configuration of the Data warehouse
• Understood the Cognos Planning budget data and maintained the number of versions required by the business.
• Analyzed the data that needed to flow to the Oracle Apps GL interface and the data flowing to the Cognos
Planning system.
• Designed an Integration for the data flow from Oracle Apps to Data warehouse and Cognos Planning.
• Analyzed data quality issues and communicated with the concerned business users to resolve them.
• Designed and developed complex Update Strategy, Aggregator, Joiner, and Lookup transformations used in
mappings.
• Created sessions, database connections, and batches using Informatica Workflow Manager.
• Managed connection strings for source and target databases in Informatica.
• Created a temporary repository of the already-migrated database for system analysis.
• Involved in creating, editing, deleting, and scheduling sessions.
• Managed administrative activities such as session, event, and error logs for troubleshooting connectivity problems.
• Created, edited, deleted, and scheduled batches in Informatica to generate reports.
Client : Kuwait Airways
Title : Kuwait Airways Corporation Integration Information System (KAC-IIS)
Duration : Oct’07 – Jan ’08
Location : Kuwait
Role : Senior Data warehouse Developer, Testing Lead
Environment :
Database : Oracle 10g, MS-SQL Server 2000
Tools : DataStage 8.1, Informatica 7.1, BO XI R2
O/S : Microsoft Windows XP, Unix (AIX 5.3)
Description:
Kuwait Airways Corporation (KAC) is a premier airline in Kuwait, well known in the Middle East and surrounding
areas. KAC has varied systems, leading to a very heterogeneous environment. They want to leverage
their existing systems to build a world-class decision support system (business intelligence framework)
based on their business priorities. This project is to build and implement a world-class data warehousing
and business intelligence system.
Responsibilities: Managed the following tasks:
• Involved in logical and physical data model design and Star Schema design; also responsible for
troubleshooting, data cleansing, and data integrity.
• Created a set of definitions for a logical warehouse schema, its data sources and targets, and the operations
that map, transform, and populate the warehouse.
• Used DataStage Designer to create the various jobs for extracting, transforming, and loading
operational data into the data mart, from ODS/flat files to reporting tables.
• Designed the Test cases, Test Scripts and test results for all the Modules.
At Penfos Systems Pvt. Ltd.
Client : Ministry of Finance, Malaysia
Title : Property Data warehouse System (MFPDW)
Duration : Oct ’06 – Aug ’07
Location : Kuala Lumpur, Malaysia
Role : Senior Data warehouse Consultant
Environment :
Database : Oracle 9i, MS-SQL Server 2000
Tools : SAS Data integration Studio 3.3, LSF Platform Scheduler,
SAS OLAP Cube Studio, Web Report Studio
O/S: Microsoft Windows XP, Unix
Description:
The Ministry of Finance is a government organization that deals with budget management, tax analysis, the
Economic and International Division, loan management, finance markets, housing loans, and investments.
Business users needed to create monthly forecasts and yearly budgets (plans) to analyze trends in loan
management, finance market investments, and housing loans. Involved in the development of a property
information data warehouse using SAS Data Integration Studio, and a reporting system providing reports for
inventory analysis and transferor/transferee details, using SAS OLAP Cube Studio and WRS as the reporting
tools and Oracle 10g as the database.
Responsibilities: Managed the following tasks:
• Involved in designing the Data Model.
• Developed plans to load the dimension and fact tables into the data mart.
• Designed and developed jobs and transformation objects using SAS Data Integration Studio 3.3.
• Analyzed the reporting requirements for the budgeting and forecasting application.
• Designed and developed the budgeting cube.
• Scheduled batch jobs in LSF Platform Scheduler to transport data.
Client : P & G, Philippines
Title : Automated Sales Data warehouse Management System
Duration : May ’04 – Aug ’06
Location : Hyderabad (Offshore team)
Role : ETL Developer
Environment :
Database : Oracle 9i
Tools : Informatica 7.1, Business Objects 6.5
O/S: Microsoft Windows XP, Unix
Description:
The client was a large CPG (Consumer Packaged Goods) company. Their product portfolio spans the entire
spectrum of non-durables. They are also leaders in processed foods, frozen desserts, and beverages. The company
has a massive distribution network of warehouses, distributors, and retail stockists, and they have more than 500
brands in their portfolio. The aim was to develop a sales data mart and customized reports for sales and stock.
Responsibilities: Managed the following tasks:
• Involved in logical and physical data model design and Star Schema design; met business users to collect and
analyze business requirements and provide system enhancement recommendations; also responsible for
troubleshooting, data cleansing, and data integrity.
• Created a set of definitions for a logical warehouse schema, its data sources and targets, and the operations
that map, transform, and populate the warehouse.
• Used Informatica Designer to create the various mappings for extracting, transforming, and loading
operational data into the data mart, from ODS/flat files to reporting tables.

More Related Content

What's hot

Shraddha Verma_IT_ETL Architect_10+_CV
Shraddha Verma_IT_ETL Architect_10+_CVShraddha Verma_IT_ETL Architect_10+_CV
Shraddha Verma_IT_ETL Architect_10+_CVShraddha Mehrotra
 
ABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG AxisbankABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG AxisbankAbhijeet Ghag
 
JohnEGaryPMdetails2015
JohnEGaryPMdetails2015JohnEGaryPMdetails2015
JohnEGaryPMdetails2015John Gary
 
Mani_Sagar_ETL
Mani_Sagar_ETLMani_Sagar_ETL
Mani_Sagar_ETLMani Sagar
 
Narender Reddy Andra Profile
Narender Reddy Andra ProfileNarender Reddy Andra Profile
Narender Reddy Andra ProfileNarender Reddy
 
AnujGupta_TechnologyConsultant
AnujGupta_TechnologyConsultantAnujGupta_TechnologyConsultant
AnujGupta_TechnologyConsultantAnuj Gupta
 
Hadoop Big Data Resume
Hadoop Big Data ResumeHadoop Big Data Resume
Hadoop Big Data Resumearbind_jha
 
Resume - Abhishek Ray-Mar-2016 - Ind
Resume - Abhishek Ray-Mar-2016 - IndResume - Abhishek Ray-Mar-2016 - Ind
Resume - Abhishek Ray-Mar-2016 - IndAbhishek Ray
 
Sakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing ConsultantSakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing ConsultantSakthi Shenbagam
 
Resume_Sanket_S_Manjrekar - Software Developer
Resume_Sanket_S_Manjrekar - Software DeveloperResume_Sanket_S_Manjrekar - Software Developer
Resume_Sanket_S_Manjrekar - Software DeveloperSanket S. Manjrekar
 
PratikGhosh_Resume_Final
PratikGhosh_Resume_FinalPratikGhosh_Resume_Final
PratikGhosh_Resume_FinalPratik Ghosh
 
Consulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCSConsulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCSVictor M Torres
 
Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016David Colbourn
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh Kumar
 

What's hot (19)

JohnGary
JohnGaryJohnGary
JohnGary
 
Shraddha Verma_IT_ETL Architect_10+_CV
Shraddha Verma_IT_ETL Architect_10+_CVShraddha Verma_IT_ETL Architect_10+_CV
Shraddha Verma_IT_ETL Architect_10+_CV
 
ABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG AxisbankABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG Axisbank
 
JohnEGaryPMdetails2015
JohnEGaryPMdetails2015JohnEGaryPMdetails2015
JohnEGaryPMdetails2015
 
Mani_Sagar_ETL
Mani_Sagar_ETLMani_Sagar_ETL
Mani_Sagar_ETL
 
Narender Reddy Andra Profile
Narender Reddy Andra ProfileNarender Reddy Andra Profile
Narender Reddy Andra Profile
 
AnujGupta_TechnologyConsultant
AnujGupta_TechnologyConsultantAnujGupta_TechnologyConsultant
AnujGupta_TechnologyConsultant
 
Hadoop Big Data Resume
Hadoop Big Data ResumeHadoop Big Data Resume
Hadoop Big Data Resume
 
Venkatesh-Babu-Profile2
Venkatesh-Babu-Profile2Venkatesh-Babu-Profile2
Venkatesh-Babu-Profile2
 
Resume - Abhishek Ray-Mar-2016 - Ind
Resume - Abhishek Ray-Mar-2016 - IndResume - Abhishek Ray-Mar-2016 - Ind
Resume - Abhishek Ray-Mar-2016 - Ind
 
Sakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing ConsultantSakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing Consultant
 
Resume_Sanket_S_Manjrekar - Software Developer
Resume_Sanket_S_Manjrekar - Software DeveloperResume_Sanket_S_Manjrekar - Software Developer
Resume_Sanket_S_Manjrekar - Software Developer
 
PratikGhosh_Resume_Final
PratikGhosh_Resume_FinalPratikGhosh_Resume_Final
PratikGhosh_Resume_Final
 
SunilTiwari_CV
SunilTiwari_CVSunilTiwari_CV
SunilTiwari_CV
 
Resume_of_Vasudevan - Hadoop
Resume_of_Vasudevan - HadoopResume_of_Vasudevan - Hadoop
Resume_of_Vasudevan - Hadoop
 
Resume
ResumeResume
Resume
 
Consulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCSConsulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCS
 
Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resume
 

Viewers also liked

CDC - Women's Health - How to Plan a Health Fair
CDC - Women's Health - How to Plan a Health FairCDC - Women's Health - How to Plan a Health Fair
CDC - Women's Health - How to Plan a Health Fairincandescentret22
 
My experinces while I observrd Class.
My experinces while I observrd Class.My experinces while I observrd Class.
My experinces while I observrd Class.Kyara M Negron
 
HSU Giving Tuesday
HSU Giving TuesdayHSU Giving Tuesday
HSU Giving Tuesdayhsunitednyc
 
The color green
The color greenThe color green
The color greenDiiana Pb
 
Magento On-sale and Discounted Price Tutorials
Magento On-sale and Discounted Price TutorialsMagento On-sale and Discounted Price Tutorials
Magento On-sale and Discounted Price TutorialsAtall
 
Sap basis 5 years experience
Sap basis 5 years experienceSap basis 5 years experience
Sap basis 5 years experiencesuresh srcm
 
Ronen Sarig Resume
Ronen Sarig ResumeRonen Sarig Resume
Ronen Sarig ResumeRonen Sarig
 
Sommer Ron Bi Resume 2010 02 08
Sommer Ron Bi Resume   2010 02 08Sommer Ron Bi Resume   2010 02 08
Sommer Ron Bi Resume 2010 02 08rsommer608
 
Robert Hager - Application Developer - Resume_2016_0718
Robert Hager - Application Developer - Resume_2016_0718Robert Hager - Application Developer - Resume_2016_0718
Robert Hager - Application Developer - Resume_2016_0718hagerb99
 

Viewers also liked (18)

Bimbo
BimboBimbo
Bimbo
 
CDC - Women's Health - How to Plan a Health Fair
CDC - Women's Health - How to Plan a Health FairCDC - Women's Health - How to Plan a Health Fair
CDC - Women's Health - How to Plan a Health Fair
 
My experinces while I observrd Class.
My experinces while I observrd Class.My experinces while I observrd Class.
My experinces while I observrd Class.
 
HSU Giving Tuesday
HSU Giving TuesdayHSU Giving Tuesday
HSU Giving Tuesday
 
Health fair
Health fairHealth fair
Health fair
 
The color green
The color greenThe color green
The color green
 
Magento On-sale and Discounted Price Tutorials
Magento On-sale and Discounted Price TutorialsMagento On-sale and Discounted Price Tutorials
Magento On-sale and Discounted Price Tutorials
 
Raaaaaaaaa
RaaaaaaaaaRaaaaaaaaa
Raaaaaaaaa
 
Sap basis 5 years experience
Sap basis 5 years experienceSap basis 5 years experience
Sap basis 5 years experience
 
Painter and decorator
Painter and decoratorPainter and decorator
Painter and decorator
 
Dennis schmidresume
Dennis schmidresumeDennis schmidresume
Dennis schmidresume
 
Ronen Sarig Resume
Ronen Sarig ResumeRonen Sarig Resume
Ronen Sarig Resume
 
Sommer Ron Bi Resume 2010 02 08
Sommer Ron Bi Resume   2010 02 08Sommer Ron Bi Resume   2010 02 08
Sommer Ron Bi Resume 2010 02 08
 
We are Web Talent Marketing
We are Web Talent MarketingWe are Web Talent Marketing
We are Web Talent Marketing
 
My Dream school
My Dream schoolMy Dream school
My Dream school
 
Robert Hager - Application Developer - Resume_2016_0718
Robert Hager - Application Developer - Resume_2016_0718Robert Hager - Application Developer - Resume_2016_0718
Robert Hager - Application Developer - Resume_2016_0718
 
Jinank Jain
Jinank JainJinank Jain
Jinank Jain
 
Abdulla Resume
Abdulla ResumeAbdulla Resume
Abdulla Resume
 

Similar to Sriramjasti

Similar to Sriramjasti (20)

Senthilkumar_SQL_New
Senthilkumar_SQL_NewSenthilkumar_SQL_New
Senthilkumar_SQL_New
 
SunilTiwari_CV
SunilTiwari_CVSunilTiwari_CV
SunilTiwari_CV
 
Big Data Analyst at BankofAmerica
Big Data Analyst at BankofAmericaBig Data Analyst at BankofAmerica
Big Data Analyst at BankofAmerica
 
Sami patel full_resume
Sami patel full_resumeSami patel full_resume
Sami patel full_resume
 
Geetha_6 yrs_CV_July-2016
Geetha_6 yrs_CV_July-2016Geetha_6 yrs_CV_July-2016
Geetha_6 yrs_CV_July-2016
 
Abhishek jaiswal
Abhishek jaiswalAbhishek jaiswal
Abhishek jaiswal
 
Sr_MicroStrategy_Consultant
Sr_MicroStrategy_ConsultantSr_MicroStrategy_Consultant
Sr_MicroStrategy_Consultant
 
Salim Khan.Resume_3.8
Salim Khan.Resume_3.8Salim Khan.Resume_3.8
Salim Khan.Resume_3.8
 
Varadarajan CV
Varadarajan CVVaradarajan CV
Varadarajan CV
 
Business_Analytic_Kunal_Kaushal
Business_Analytic_Kunal_KaushalBusiness_Analytic_Kunal_Kaushal
Business_Analytic_Kunal_Kaushal
 
Deepa_Resume
Deepa_ResumeDeepa_Resume
Deepa_Resume
 
Rajesh CV
Rajesh CVRajesh CV
Rajesh CV
 
Msbi power bi_ lead
Msbi power bi_ leadMsbi power bi_ lead
Msbi power bi_ lead
 
Pratik Patel Python/ Big Data Analyst
Pratik Patel Python/ Big Data AnalystPratik Patel Python/ Big Data Analyst
Pratik Patel Python/ Big Data Analyst
 
BusinessIntelligence_SQL_Developer
BusinessIntelligence_SQL_DeveloperBusinessIntelligence_SQL_Developer
BusinessIntelligence_SQL_Developer
 
Padmini parmar
Padmini parmarPadmini parmar
Padmini parmar
 
Padmini Parmar
Padmini ParmarPadmini Parmar
Padmini Parmar
 
GANESH -Certified Consultant
GANESH -Certified ConsultantGANESH -Certified Consultant
GANESH -Certified Consultant
 
Monish R_9163_b
Monish R_9163_bMonish R_9163_b
Monish R_9163_b
 
BrodtKerry_122016
BrodtKerry_122016BrodtKerry_122016
BrodtKerry_122016
 

Sriramjasti

  • 1. Sriram Jasti Mobile: 415-741-9969 E-Mail: sriramjasti@gmail.com Aspiring for challenging assignments with a company that will utilize previous experience and technical skills in Data warehouse and Business Intelligence. PROFESSIONAL SUMMARY  10+ years of rich experience in solving critical business needs by modeling data warehouse for business intelligence database environments using Start Schema and Snow Flake Schema methodology for faster and effective Querying and managing large volumes of data. Understanding Client Requirement  Competent in client relationship management, presentation, Designing functional specifications, implementing the solution, User Training and support.  Conducting Gap Analysis and providing strategic input to various business users in order to support reporting and analysis initiatives.  Expertise in Served as Subject Matter Expert in data modeling for Airlines, Finance Services, Property Management, Manufacturing, Banking, Insurance, and Retail domains. Expertise in Agile  Expertise in Project Planning, Architecture Design and Development. Responsible for prototyping solutions, preparing test scripts, and conducting tests and for data replication, extraction, transformation, loading, cleansing, create Multi-dimensional Cubes and create prototype excel pivot table reports.  Experience in Conducting Sprint planning and Conducted scrum meeting. Designing and implementing  Implemented four full lifecycle Data warehouses and Business Intelligence solutions with Star Schemas and Snowflake Schemas Methodologies.  Create Logical and Physical model by using MS Visio and Erwin tools.  Rich expertise in Data Analysis, Design and Development process by using Data Extraction, Data Transformation and Data Loading (ETL) using IPE and IDQ.  Developed processes for capturing and maintaining metadata from all data warehousing components which includes sizing estimates for large scale data warehousing projects, aggregation and indexing strategies with database security.  Rich Experience in analyzing performance statistics for Extracting high volume of data from Multiple Sources to Data Warehouse and create automation process based on the performance statistics.  Experience in Analyzing Data Quality Issues using IDQ.  Experience in Bullet proofing the system by using Auditing, Balancing and Control (ABC) Mechanism.  Create/Maintain ETL processes utilizing Microsoft SQL Server Integration Services (SSIS) from a diversity of data sources including other servers, flat files, MS Access and Excel.  Responsible in data maintenance and database improvements following Best Practices for data-typing, relational data structure and data quality.  Experience in Creating HIVE Queries in Hadoop1.x and Very good understanding of YARN Architecture.  Expertise in Creating Multi-dimensional cubes using Analysis Services. EXCELLENCE SPHERE Requirement Analysis Software Usability & Development System Architecture Data Modelling Design and Implementation User Support Managing Onshore/ Offshore teams
  • 2.  Troubleshoot and correct any reported data issues in deliverables.  Rich experience in design & development using Informatica 6.1/7.3/8.1/8.6/9.0.1,Datastage 7.5/8.0.1 PX,SSIS SQL Server 2005/2008 r2,SSRS SQL Server 2005/2008 r2,SSAS SQL server 2005/2008 r2,Erwin 4.2/7.1,Microsoft Viso 2010,SAS Data Integration studio 3.3/3.4,SAS 9.1.2 including SAS SQL and SAS Macros. Additional Contributions  Experience involves integrating HIVE Queries (hql) with Data warehouse for better performance.  Experience involves converting the SQL Server DTS Packages to Informatica Mappings.  Experience involves converting the Data stage Server Jobs to Informatica Mappings.  Extensive working experience in integrating with the Cognos Planning to GL Interface in Oracle Applications 12i and HFM.  Pull and Push Data to/from different ERP Systems like Oracle Apps, SIEBEL CRM, People Soft and QAD systems for Maintaining Customer Information.  Experence in cleaning up data using Address validation, Standardization, Match and Merge rules using IDQ.  Knowledge in integration with real time systems like mainframes by creating Data Maps using Power Exchange. CERTIFICATIONS  Microsoft Certified Professional. http://www.microsoft.com/learning/mcp/transcripts • Microsoft® SQL ServerTM 2005 Business Intelligence – Implementation and Maintenance • Designing a Business Intelligence Solution by Using Microsoft® SQL Server 2005 CAREER PATH Data warehouse Consultant at Info Objects Inc., Santa Clara from May’14 to till date Data warehouse Consultant at SACC Inc., Redwood City from Oct’13 to May’14 Data warehouse Architect at Laird Technologies Inc., Holly from Apr’11 to Sep’13 Senior Programmer Analyst at Global Techies Inc., Detroit from Dec’09 – Mar’11 Senior Associate –Platform L2 at Sapient Corporation, Bangalore from Aug’09 – Dec’09 Senior Engineer – Level1 at Satyam Computer Services, Malaysia from Oct’07 – July’09 Software Engineer – Level1 at Penfos Systems Pvt. Ltd, Hyderabad from Apr’04 – Sep’07 SKILL SET Project Management • Implementing project plans within preset budgets and deadlines; monitoring project progress and outstanding issues; ensuring quality and timeliness of deliverables; reporting on the project’s progress and escalate issues. • Creating support SLAs and establishing SME & various technical support team interactions for improved operational efficiency; administering quality procedures related to project and delivery. Software Development • Interfacing with clients for business gathering, conducting system analysis and finalising technical / functional specifications and high level design documents for the project. • Contributing to the design, development, testing, troubleshooting and debugging of the software. • Architected, designed, developed & managed teams for products in the Financial Services, Manufacturing and Retail domains using agile methodology. • Providing post-implementation, application maintenance and enhancement support to the client. • Provide ETL Solutions for multi-terabyte Data warehouse and understand its impact Key Account: • Interfacing and interacting with customers to give proper updates regarding status of the project and delivery timelines; ensuring the adherence to the delivery timelines. • Managing and developing beneficial and cordial relationships with key clients. 
• Ensuring speedy resolution of queries & grievances to maximize client/customer satisfaction levels and maintaining excellent relations with clients/customers to generate avenues for further business. • Played a key role in performance improvements and review for various projects.
  • 3. Manpower Leadership • Leading, mentoring & monitoring the performance of team members to ensure efficiency in process operations and meeting of individual & group targets. • Developing competency among the team members; conducting interviews to recruit the right talent and resources and developing employee competency. Refer to Annexure for Projects Executed TECHNICAL PURVIEW Operating Systems : Windows 2003, NT, XP, UNIX Languages : C,C#, Base SAS,SAS SQL, shell scripting Databases : Oracle 8i/9i,10g, MS SQL Server 2000/2005/2008 r2/2012. Data modeling Tools : ERWin 4.1/7.1,Embarcardo and MS Viso 2010 Big Data : Hadoop1.x Mapreduce,Hive,Pig,Scoop,Flum,Hadoop 2.x Yarn ETL Tools : Informatica Power Center 6.1/7.1/8.1.3/8.6.1/9.0.1, Informatica Power exchange 8.6.1, Informatica Data quality 8.6.1/9.1, B2B Data transformation 8.6.1, MS-SSIS SQL Server 2005/2008 r2, Datastage7.5/8.0.1PX and SAS Data integration Studio 3.3/3.4 Reporting Tools : MS-SSRS SQL Server 2005/2008 R2 with SharePoint Integration, MS-SSAS SQL Server 2005/2008 R22012, Report Builder 3.0, MS- Excel Pivot tables Business Objects XI R/2(Universe Build) and Cognos 8 (Framework Manager) Scheduling Tools : Control-M 8.x, LSF Platform Scheduler 3.x Tools & Version Control : Tortoise Subversion (SVN), PVCS, Perforce Defect Tracking Tool : JIRA Agile Tools : Rally,JIRA EDUCATION • PG. Diploma in Computer Application from Jawaharlal Nehru National Youth center in 2004 • B.E. (Electrical and Electronics Engineering) from Jawaharlal Nehru Technical University in 2004
  • 4. ANNEXURE PROJECTS EXECUTED At SACC and Info Objects,inc Client : Digital Insight Inc. Title : IFS Data warehouse (IDW) Duration : Oct ’13-till date Role : Data warehouse Consultant Location : Redwood City, CA Environment : Database : Oracle 11g Tools : Informatica 9.1, Embarcardo ER Studio Data Architecture XE2,Perforce Hadoop 1.x Big data, MAP Reduce,HIVE Queries,HBASE, JIRA, Rally and QlikView 11.0 O/S : UNIX Description: Digital Insight is previously part of Intuit. Now Digital insight is part of NCR, leading Solution provider for ATM. Digital Insight Provides Solutions for various Financial Institutions such as online banking, Digital Banking, Mobile Banking, Payments, Money Movements and Provided Solutions to a lot of Small Business. DI goal is to integrate all the FI’s data and provide reports to individuals using a centralized Data Platform and also provide data feeds by integrating all the applications .DI is in the process of redesign from the EDW to IDW. Responsibilities : Executed the following tasks • Understanding existing business model and customer requirements • Determination and documentation of Fit Gaps • Create Dimensional Model (Logical and Physical Model). • Create Data Dictionaries. • Design the ETL flow i.e. architectural design. • Implemented Auditing and Logging Mechanism. • Interact with the Subject matter experts. • Implemented CDC Mechanism for avoiding bottlenecks/better loading performance. • Implemented ELT solution for multi-terabyte Data Warehouse and understand its impacts. • Setting up Environment to pull data from various sources. • Involved in migration of data between Data centers • Involved in Automating Migration of Code from One Environment. • Send data for individual FI’s Consumer Data files by using MoveIT application. • Responsible for preparing ETL Design and Specification for developing Workflow. • Created multiple reporting feed to multiple FI’s. • Implemented Data Quality Methodology to confirm on the data. • Created Map reduce job to pull data from Global Logging to accommodate Activity Details and generate reports. • Established a mechanism to handle different type of files to load HIVE External tables. • Extract data from HDFS using HIVE Queries and convert aggregated data into csv files and load these csv files to IDW
  • 5. • Responsible in Data Quality Analysis and Performance tuning for incremental and historical loads. • Created Dashboards for individual FI’s. • Create Interactive reports to analyze financial data. At Laird Technologies Client : Laird Technologies Inc. Title : Laird Enterprise Data warehouse (LEDW) Duration : Apr ’11-Sep ‘13 Role : Data warehouse Architect Location : Holly, MI Environment : Database : MS SQL 2000/2008 r2/2012, Progress DB, Hyperion Essbase , Tools : MS SSIS,SSRS, SSAS, Informatica 9.1,Informatica Data Quality 9.1 O/S : Microsoft Windows 2008 server Description: Laird Technologies designs and manufactures customized, performance-critical products for wireless and other advanced electronics applications. The company is a global market leader in the design and supply of electromagnetic interference (EMI) shielding, thermal management products, specialty metal products, signal integrity components, and antenna solutions, as well as radio frequency (RF) modules and wireless remote controls and systems. Laird is a fastly grown company by acquisitions. So, Laird is in the process of re-building the Laird Enterprise Business Intelligence system. Responsibilities : Executed the following tasks • Determination and documentation of Fit Gaps • Create Dimensional Model (Logical and Physical Model). • Design the ETL flow i.e. architectural design. • Interact and manage Vendors. • Implemented ELT solution for multi-terabyte Data Warehouse and understand its impacts. • Setting up Environment to pull data from various sources. • Responsible for preparing ETL Specification for developing Workflow. • Create and Modification of Reusable and User defined objects • Responsible in Data Quality Analysis and Performance tuning for incremental and history loads. • Initiating Database Cleanup, Technical assistance for any data mismatch related queries / Data recovery and Data Reconciliation of Source Vs Operational Data Store Vs Staging Vs Warehouse. • Understanding existing business model and customer requirements. • Migrated data from SQL Server 2000 to 2008 r2 and some of the data to SQL Server 2012. • Create and edit SSIS packages including error management and logging. • Create and schedule Server Jobs on SQL Server 2008 r2. • Implemented CDC Mechanism using MS SSIS and Informatica 9.1. • Responsible for integrating the Cube with the SharePoint using Report builder. • Responsible for Created Finance KPI using SSAS OLAP cubes. • Created Performance point reports and Dashboard reports. • Designed the reporting model. • Publish the Reports in the Share Point and created BI security. • Build the Enterprise OLAP Cubes for Laird Intelligence using Analysis services. • Preparation of customized Mappings and Workflows, reports etc as required by the functional members. • Preparation of Unit Test scripts and execution of the test scripts and documenting the testing exercise. • Integrate data with SharePoint list using SSIS SharePoint list Add-in. • Building, Publishing customized interactive reports and dashboards, report scheduling using Tableau Server • Created automation of Backup mechanism for Cubes and Databases objects. • Created automation mechanism to process OLAP cubes after ETL run. • Created automated version control for database object, Cubes and manually upload XML’s of Informatica objects to SVN At Global Techies Inc. Client : Blue Cross Blue Shield of Michigan(BCBSM)
  • 6. Title : EDW Data Quality Duration : Jan’10 – Jan’11 Role : Senior Data warehouse Consultant Location : Detroite,MI Environment : Database : Oracle 10g, MS SQL Server 2005 Tools : Informatica 9.0.1, Informatica IDQ O/S : UNIX Description: Blue Cross Blue Shield is a non-profitable healthcare Insurance organization. BCBSM has the two types of data warehouse .One of the data warehouse represents global and another data warehouse is for the local. BCBSM is in the process of integrating these two data warehouses and build data marts. The data mart was developed to enable multi-level aggregations of claims data across various providers and consumer groups. Blue Direct Reporting Data mart developed to meet financial and utilization reporting needs of authorized agents. For Integration and analysis purpose, BCBSM has chosen Informatica as an ETL. Responsibilities : • Determination and documentation of Fit Gaps • Create Dimensional Model (Logical and Physical Model). • Design the ETL flow i.e. architectural design. • Involved in Creating the ABC (Audit, Balancing and Controlling) Mechanism of data based on the design. • Setting up and running of ETL jobs to pull data from various sources. • Involved in prepare ETL Specification for developing Jobs. • Worked on several transformations such as SQL, Filter, Joiner, Rank, Sequence Generator, Stored Procedure, Expression and implemented SCD and Knowledge of using CDC in Informatica. • Experience in Creating Data maps using power exchange. • Involved in creating, editing, deleting and scheduling of the Sessions in Informatica • Create and Modification of Reusable Transformations, Maplets. • Involved in the Data Quality Analysis and Performance tuning for incremental and history loads. • Creating data maps in Power exchange. • Initiating Database Cleanup, Technical assistance for any data mismatch related queries / Data recovery and Data Reconciliation of Source Vs Operational Data Store Vs Staging Vs Warehouse. • Created IDQ plans for Address validation and Standardization of Customer names • Understood existing business model and customer requirements. • Preparation of customized Mappings and Workflows, reports etc as required by the functional members. • Created Master Sequencer Jobs and scheduled them based on business requirement. • Preparation of Unit Test scripts and execution of the test scripts and documenting the testing exercise. • Testing individual modules of the Jobs. • Validation of Data Loaded by ETL Process. At Sapient Corporation Client : Target Title : Target Everest Duration : Aug ’09 – Dec ’09 Location : Minneapolis, MN Role : Senior Data warehouse Analyst Environment : Database : Oracle 10g, DB2 Tools : ERWIN 7.1, Data stage 7.5 O/S : UNIX Description: Target Corporation is a Retail vendor. Target Stores have been opened in 1962 with more than 1,700 stores across the global. Target wanted to make an end to end solution for the online purchase of the products. So they have chosen Java as a front-end and some of the middleware systems like IBM Message broker for the messaging system .For Integration and analysis purpose, target has chosen Data stage as an ETL .For creating cubes and reports we use Business objects. Responsibilities: Executed the following tasks • Determination and documentation of Fit Gaps • Creating Technical specification documents for developers. • Implemented Auditing Mechanism
  • 7. • Involved in Conducting Design Review Sessions with Different business users. • Involved in Code review sessions with developers and manager. • Created Test cases for ETL and Integration verification. • Involved in Release Deployments. • Created Operational guide documentation. At Satyam Computer Services Limited Client : Sun corp, Brisbane, Australia Title : Customer Franchise Data Mart (CFDM) Duration : Oct ’08 – Aug ’09 Location : Kulalumpur,Malaysia (Offshore Team) Role : Team Lead Environment : Database : Oracle 10g,MS SQL 2000 Tools : ERWIN 7.1, Data stage 7.5, Cognos 8.3 O/S: Microsoft Windows XP, UNIX. Description: Sun Corp was established in the year 1916.It has wide range of business in Banking and Insurance. In Australia Suncorp is the 6th largest bank and third largest Insurance group. Suncorp Group Marketing has embarked on an engagement to build a Customer Data Mart for developing customer reporting and analytics. However, there is no single repository where all this information is integrated and historically maintained. The CFDM (Customer Franchise Data Mart) program is the strategic initiative to encompass all the Data warehousing requirements of Group Marketing. Data Stage is the integrated ETL solution recommended for the Data mart as the extraction and transformation tool to meet Suncorp’s requirements. Cognos is the reporting tool for doing Analysis. Responsibilities: Managed the following tasks: • Understanding existing system and customer requirements. • Meet business users to collect and analyze business requirements and provide system enhancement recommendations. • Designed the Architecture of the Data warehouse in the Integrative perspective • Design the Data model (Logical and Physical Model). • Design the Mapping Documents for the ETL Developers to develop the jobs. • Test the data whether the data fits as per the data model and Business Requirement. Client : Emirates and NBD Bank Title : Unified Emirates Financial Management System (UniFi) Duration : Jan’08 – Nov ’08 Location : Dubai Role : Design Analyst / Team Lead Environment : Database : Oracle 10g,MS-SQL Server 2000,MS-SQLServer 2005 . Tools : Informatica 8.1.3, Cognos Planning 8, Cognos 8.3, Oracle 12i. O/S: Microsoft Windows XP, Windows 2003. Description: Emirates banking group is operating in 6 countries and it is the largest bank in the Middle East. Since it is operating in the different countries So EBG wants a solution for maintaince of data in one place with one application. With a view to integrate the Financial and the operational systems, Emirates Banking Group is in the process of deploying Oracle Applications, which includes all the financial modules, purchasing, inventory, property manager, enterprise asset management and Budgets which is coming from the Cognos Planning. As part of Oracle General Ledger and Budgets integration implementation we are using Informatica, this project covers the processes to be adopted by Emirates Banking Group based on the functionalities of the General Ledger Interface and Budgets . Responsibilities: Managed the following tasks: • Understanding existing system and customer requirements. • Meet business users to collect and analyze business requirements and provide system enhancement recommendations. • Designed the Architecture and Configuration of the Data warehouse • Understand the Cognos planning Budgeted data and maintain number of versions required by the business.
Client : Emirates NBD Bank
Title : Unified Emirates Financial Management System (UniFi)
Duration : Jan’08 – Nov ’08
Location : Dubai
Role : Design Analyst / Team Lead
Environment :
Database : Oracle 10g, MS SQL Server 2000, MS SQL Server 2005
Tools : Informatica 8.1.3, Cognos Planning 8, Cognos 8.3, Oracle 12i
O/S : Microsoft Windows XP, Windows 2003

Description: Emirates Banking Group operates in six countries and is the largest bank in the Middle East. Because it operates across several countries, EBG wanted a solution for maintaining its data in one place with one application. To integrate the financial and operational systems, Emirates Banking Group is deploying Oracle Applications, covering all the financial modules, purchasing, inventory, property manager and enterprise asset management, with budgets coming from Cognos Planning. Informatica is used for the Oracle General Ledger and budgets integration, and this project covers the processes Emirates Banking Group adopts around the General Ledger interface and budgets.

Responsibilities: Managed the following tasks:
• Understood the existing system and customer requirements.
• Met business users to collect and analyze business requirements and provide system enhancement recommendations.
• Designed the architecture and configuration of the data warehouse.
• Understood the Cognos Planning budget data and maintained the number of versions required by the business.
• Analyzed the data that needed to flow to the Oracle Apps GL interface and the data flowing to the Cognos Planning system.
• Designed the integration for the data flow from Oracle Apps to the data warehouse and Cognos Planning.
• Analyzed data quality issues and communicated with the concerned business users to resolve them.
• Designed and developed complex Update Strategy, Aggregator, Joiner and Lookup transformations used in the mappings.
• Created sessions, database connections and batches using Informatica Workflow Manager.
• Managed connection strings for the source and target databases in Informatica.
• Created a temporary repository of the already migrated database for system analysis.
• Created, edited, deleted and scheduled sessions.
• Handled administrative activities such as session, event and error logs for troubleshooting connectivity problems.
• Created, edited, deleted and scheduled the batches in Informatica that generate reports.

Client : Kuwait Airways
Title : Kuwait Airways Corporation Integration Information System (KAC-IIS)
Duration : Oct’07 – Jan ’08
Location : Kuwait
Role : Senior Data Warehouse Developer, Testing Lead
Environment :
Database : Oracle 10g, MS SQL Server 2000
Tools : DataStage 8.1, Informatica 7.1, BO XI R2
O/S : Microsoft Windows XP, UNIX (AIX 5.3)

Description: Kuwait Airways Corporation (KAC) is a premier airline in Kuwait, well known in the Middle East and surrounding areas. KAC's varied systems make for a very heterogeneous environment, and it wanted to leverage those existing systems to build a world-class decision support system (business intelligence framework) based on its business priorities. This project built and implemented a world-class data warehousing and business intelligence system.

Responsibilities: Managed the following tasks:
• Designed the logical and physical data models and the star schema (a minimal DDL sketch follows this project); also responsible for troubleshooting, data cleansing and data integrity.
• Created a set of definitions for a logical warehouse schema, its data sources and targets, and the operations that map, transform and populate the warehouse.
• Used DataStage Designer to create the jobs that extract, transform and load operational data into the data mart, from ODS/flat files to the reporting tables.
• Designed the test cases, test scripts and test results for all modules.
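A star schema of the kind designed here centers a fact table on surrogate keys into its dimensions. A minimal, hypothetical Oracle DDL sketch (the flight-booking names are illustrative, not KAC's actual model):

    CREATE TABLE dim_date (
        date_key    NUMBER        PRIMARY KEY,   -- surrogate key, e.g. 20071015
        cal_date    DATE          NOT NULL,
        month_name  VARCHAR2(10)  NOT NULL,
        year_num    NUMBER(4)     NOT NULL
    );

    CREATE TABLE dim_route (
        route_key   NUMBER        PRIMARY KEY,
        origin      CHAR(3)       NOT NULL,      -- IATA airport codes
        destination CHAR(3)       NOT NULL
    );

    CREATE TABLE fact_booking (
        date_key     NUMBER        NOT NULL REFERENCES dim_date (date_key),
        route_key    NUMBER        NOT NULL REFERENCES dim_route (route_key),
        tickets_sold NUMBER        NOT NULL,
        revenue_amt  NUMBER(12,2)  NOT NULL
    );

Keeping every fact column either a foreign key or an additive measure is what lets the reporting layer aggregate freely across any combination of dimensions.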
At Penfos Systems Pvt. Ltd.

Client : Ministry of Finance, Malaysia
Title : Property Data Warehouse System (MFPDW)
Duration : Oct ’06 – Aug ’07
Location : Kuala Lumpur, Malaysia
Role : Senior Data Warehouse Consultant
Environment :
Database : Oracle 9i, MS SQL Server 2000
Tools : SAS Data Integration Studio 3.3, LSF Platform Scheduler, SAS OLAP Cube Studio, Web Report Studio
O/S : Microsoft Windows XP, UNIX

Description: The Ministry of Finance is a government organization dealing with budget management, tax analysis, economic and international divisions, loan management, finance markets, and investment and housing loans. Business users needed to create monthly forecasts and yearly budgets (plans) to analyze trends in loan management, finance-market investments and housing loans. Involved in developing the property information data warehouse using SAS Data Integration Studio, and a reporting system providing inventory-analysis and transferor/transferee reports, with SAS OLAP Cube Studio and Web Report Studio as the reporting tools and Oracle as the database.

Responsibilities: Managed the following tasks:
• Designed the data model.
• Developed the plan to load the dimension and fact tables into the data mart (an incremental fact-load sketch follows this project).
• Designed and developed jobs and transformation objects using SAS Data Integration Studio 3.3.
• Analyzed the reporting requirements for the budgeting and forecasting application.
• Designed and developed the budgeting cube.
• Scheduled batch jobs in the LSF Platform Scheduler to transport data.
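The dimension-and-fact load plan above boils down, per fact table, to resolving each incoming business key to its dimension surrogate key and appending only rows newer than the last load. A minimal, hypothetical SQL sketch (stg_loan_txn and the dimension tables are illustrative names, not the ministry's actual model):

    INSERT INTO fact_loan_balance (date_key, property_key, loan_amt)
    SELECT d.date_key,
           p.property_key,
           s.loan_amt
    FROM   stg_loan_txn s
    JOIN   dim_date     d ON d.cal_date    = s.txn_date
    JOIN   dim_property p ON p.property_id = s.property_id
    WHERE  s.txn_date > COALESCE(
               (SELECT MAX(dd.cal_date)           -- high-water mark of the
                FROM   fact_loan_balance f        -- previous load
                JOIN   dim_date dd ON dd.date_key = f.date_key),
               DATE '1900-01-01');                -- first load: take everything

In SAS Data Integration Studio the same pattern would be expressed as a job with lookup and table-loader transformations rather than hand-written SQL.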
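Client : P&G, Philippines
Title : Automated Sales Data Warehouse Management System
Duration : May ’04 – Aug ’06
Location : Hyderabad (offshore team)
Role : ETL Developer
Environment :
Database : Oracle 9i
Tools : Informatica 7.1, Business Objects 6.5
O/S : Microsoft Windows XP, UNIX

Description: The client was a large CPG (consumer packaged goods) company whose product portfolio spans the entire spectrum of non-durables; they are also leaders in processed foods, frozen desserts and beverages. The company has a massive distribution network of warehouses, distributors and retail stockists, and more than 500 brands in its portfolio. The aim was to develop a sales data mart and customized reports for sales and stock.

Responsibilities: Managed the following tasks:
• Designed the logical and physical data models and the star schema; met business users to collect and analyze business requirements and provide system enhancement recommendations; also responsible for troubleshooting, data cleansing and data integrity.
• Created a set of definitions for a logical warehouse schema, its data sources and targets, and the operations that map, transform and populate the warehouse.
• Used Informatica Designer to create the mappings that extract, transform and load operational data into the data mart, from ODS/flat files to the reporting tables (a sketch of the kind of sales summary these loads fed follows this project).

The customized sales reports built on such a mart are, at bottom, aggregations over the fact table by brand and period. A minimal, hypothetical sketch (all table and column names are illustrative):

    SELECT b.brand_name,
           t.year_num,
           t.month_name,
           SUM(f.sales_qty) AS units_sold,
           SUM(f.sales_amt) AS sales_value
    FROM   fact_sales f
    JOIN   dim_brand  b ON b.brand_key = f.brand_key
    JOIN   dim_date   t ON t.date_key  = f.date_key
    GROUP  BY b.brand_name, t.year_num, t.month_name
    ORDER  BY b.brand_name, t.year_num, t.month_name;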