Amit Kumar
+1-925-918-5201
amit.git@gmail.com
Professional Objective
Motivated to work in a competitive and congenial environment that helps the business grow and provides ample opportunities for individual professional growth.
Experience Summary
9.5 years of IT work experience, including 8+ years of DW/BI experience in lead developer and architect roles. Worked with clients across various geographies and domains, implementing DW/BI solutions for telecom, insurance, and healthcare data warehouse projects. My technological forte is data modelling, solution design, Informatica PowerCenter, Informatica BDE, UNIX, Hadoop, Hive, Oracle 10g, PL/SQL, T-SQL (SQL Server), SSIS, and Epic's Clarity ETL.
October 2006 – 14th August 2015: Tata Consultancy Services
17th August 2015 – till date: Capgemini
Joined Capgemini on 17th August 2015; currently assigned as Data Architect to lead an equipment install base project for an Oil and Gas major.
Technology & Tools
ETL Tools: Informatica PowerCenter 9.x/8.x, DT Studio, Informatica BDE Developer, Oracle Warehouse Builder, Epic's Clarity, SSIS
Big Data Solutions: Hadoop ecosystem, Hive, Pig Latin, Sqoop
Reporting Tools: Microsoft Power View, OBIEE, Predix
Databases: Exadata, Teradata, MS SQL Server 2012, Oracle 10g, PostgreSQL, Greenplum
Data Modelling Tools: Erwin, SQL Modeller
Languages: T-SQL, SQL, PL/SQL, UNIX shell scripting, basic Python, core Java
Other Tools: RTC, ALM, SQL*Plus, TOAD, Remedy Admin, Remedy User, SVN, StarTeam, VSS
Certifications
• Cloudera Certified Apache Hadoop Developer (CCD-410, 84%) – 2015
• MapR Certified Apache Hadoop Developer – 2015
• MapR Certified Apache HBase Developer (NoSQL) – 2015
• AHM 250 Certified – 2014
• Microsoft .NET Framework 2.0 Application Development Foundation Certification – 2008
• Oracle Certified Associate – 2010
Engagement Overview
Currently working for Capgemini as a full-time DW/BI consultant; the current end client is a multinational Oil and Gas major. I work as a data architect leading a 12-member team to develop an install base for the client's equipment service and sales business. The project runs in Agile mode, and my responsibilities include providing sprint estimations and driving execution through the client-specific change control process for code deployment. The UI for this solution is proposed to be implemented on a cloud platform.
I have 8+ years of experience providing and implementing DW/BI solutions, of which close to 3 years have been as a DW/BI architect. During this time, I have led teams to design and develop an install base BI solution, a product lifecycle BI solution, and a QA process data quality model, along with various data integration solutions. With BI implementation experience across multiple business domains, I am capable of helping teams manage their data strategy.
With extensive experience in ETL frameworks and a range of ETL tools, I can guide teams to correct and efficient ETL implementations. My analytical skills have helped me quickly produce data quality metrics and highlight data integration issues. I am experienced in analysing business processes, transactional system data, and other source data to understand dependencies and any anomalies that affect a BI implementation.
I also have 8 months of work experience with the Hadoop platform and its ecosystem tools such as Pig, Hive, and Sqoop, and with the ETL tool Informatica BDE. I completed Big Data Developer training with Cloudera and became a certified Hadoop developer with both Cloudera and MapR. In addition, I completed the MapR HBase (NoSQL) developer certification. I look forward to a project where I can contribute my Big Data experience and learn further.
Below are details of the assignments I have worked on, in reverse chronological order by assignment end date:
1. Project Name : Surface Install Base (June 2016 to date)
Application Name : Field 360
End Client : Major Commercial Oil and Gas client.
ETL Tool/Technology: Informatica Powercenter, SQL Modeller, Exadata, Postgres, Predix,
UNIX.
Team Size : 12
Working as Data Architect to provide an install base BI solution for the Oil and Gas equipment sales and service business.
Role: Data Architect
Responsibilities:
• Interact with business users daily as part of the Agile methodology; gather and refine user requirements and build/update the IT solution document for the project.
• Created the data model using SQL Modeller and obtained design approvals from IT owners and the platform architect. Prepare ETL process guidelines for performance-related issues and make changes as needed.
• Perform data profiling tasks and provide a strategy for data integration from ERP, CRM, and other sources.
• Design database tables, indexes, and other objects.
• Create data flow diagrams and design Informatica workflows.
• Lead the team to meet sprint and iteration deadlines.
• Work on data quality issues and defect analysis; interact with users on data issues. Act as the SPOC for all defects in ALM for the entire project.
• Design the UI and guide the team on user experience requirements.
• Follow the change control process to move developed code to production and make it available to users. Create the necessary change requests for go-live and follow up with the technical teams for review and approval.
• Provide time and effort estimations on requirements and obtain approvals from the IT owners, explaining the complexity where required.
2. Project Name : PQW data warehouse (August 2015 to May 2016)
Application Name : PQW
End Client : Major Healthcare equipment manufacturer client
ETL Tool/Technology: Informatica Powercenter, Erwin, Teradata, UNIX, ALM.
Team Size : 7
Worked as BI Architect and led the team to implement a BI solution for distribution records, which gave the client a correct view of the product lifecycle. Implemented performance improvements (30% improvement achieved) in the existing Product Quality Warehouse project. The project was executed in Agile mode.
Role: BI Architect
Responsibilities:
• Interact with business users daily as part of the Agile methodology; gather user requirements and prepare the solution document.
• Analyse ETL performance and get changes implemented to improve data processing time and resource consumption.
• Perform data profiling tasks and provide a strategy for data integration from ERP, PLM, and legacy sources.
• Provide time and effort estimations on requirements and obtain approvals from the client by explaining the complexity.
• Design the UI and guide the team on user experience requirements in OBIEE.
• Updated the existing data model using Erwin and obtained design approvals from the client. Monitor the ETL process for performance-related issues and make changes as needed.
• Design and develop Informatica workflows and Teradata ETL objects for data loading, and packages for reporting. Lead the team to meet each sprint's deadlines.
• Work with the client on defect analysis of reports; interact with users on issues as they occur and on their resolution. Act as the SPOC for all defects in ALM for the entire project.
• Follow the change control process to move developed code to production and make it available to users. Create the necessary change requests for go-live and follow up with the technical teams for review and approval.
3. Project Name : CI IT QA Metric (August 2014 to August 2015)
Application Name : CI IT QA Metric
End Client : USA Insurance Client
ETL Tool/Technology: Informatica Powercenter, Erwin, SSIS, SQL Server DB, Oracle,
Sharepoint, UNIX, Microsoft Power View
Team Size : 6
Designed a data mart to hold quality assurance team metrics sourced from various QA tool databases (ALM, RTC, JIRA) and multiple management dashboards. Designed and developed the Informatica ETL workflows for it and scheduled them for automatic updates.
Role: BI Architect
Responsibilities:
• Designed the QA data mart (logical and physical dimensional modelling) to facilitate metric reports on ongoing and completed projects.
• Performed data profiling tasks and provided a data integration strategy for the data backing the various QA tools' metric reporting.
• Designed and developed ETL workflows using Informatica and SSIS to pull data from QA tools such as ALM and RTC and from various project dashboards.
• Defined the scheduling of Informatica workflows.
• Addressed data quality issues.
• Shared knowledge about the developed metric reports with QA teams across domains in the CI vertical.
4. Project Name : EDW (September 2013 to August 2014)
Application Name : Enterprise Data warehouse
Client : USA Healthcare Client
ETL Tool/Technology: Hadoop, Hive, Pig Latin, Informatica BDE, Powercenter, Teradata,
UNIX
Team Size : 10+
The project developed an EDW and BI solution that integrates data from multiple source systems to support multiple lines of business. As an Informatica BDE developer, I designed and developed workflows in BDE to transfer data from HDFS to Hive, created Sqoop scripts to transfer data from Hive to the RDBMS stage, and scheduled them through Oozie.
Role: Informatica BDE/Powercenter Lead Developer.
Responsibilities:
• Design and develop ETL workflows in Informatica BDE and PowerCenter.
• Write Pig Latin scripts to process data in Hadoop.
• Write Sqoop commands to pull data from Hive into the Teradata stage.
• Create UDFs in Python for date transformations.
• Investigate data quality issues.
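A date-transformation UDF of the kind mentioned above can be sketched in plain Python; the function name and the list of source formats are illustrative assumptions, not taken from the project.

```python
from datetime import datetime

def to_iso_date(raw):
    """Normalize a source date string to ISO format (YYYY-MM-DD).

    Tries a few common source formats (an assumed list) and returns
    None when none of them match, so bad records can be flagged.
    """
    if raw is None:
        return None
    for fmt in ("%Y%m%d", "%d-%b-%Y", "%m/%d/%Y"):
        try:
            return datetime.strptime(raw.strip(), fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None
```

In a Pig script, a function like this would typically be registered with `REGISTER 'udfs.py' USING jython AS udfs;` and called as `udfs.to_iso_date(col)`.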
5. Project Name : Integrated Data Repository (Jan 2013 to September 2013)
Application Name : Integrated Data Repository
Client : USA Healthcare Client
ETL Tool/Technology: Informatica Powercenter, Teradata, Oracle, UNIX
Team Size : 7
Our team completed the rewrite of code from SAAS to Informatica and SQL to pull patient, encounter, hospital services, and related data into the IDR warehouse from sources such as Clarity and other out-of-network medical facilities. The new workflow was fully automated and saved the client the cost of maintaining a separate SAAS team to pull data from Clarity manually. The project was built on existing infrastructure at the client facility and proved more efficient than the old solution.
Role: Developer
Responsibilities:
• Developed a number of complex Informatica mappings and reusable transformations to implement the business logic and load data incrementally.
• Designed ETL processes using Informatica PowerCenter 9.x to load data from Teradata and SQL Server into the target Oracle database.
• Worked extensively with business and data analysts on requirements gathering and on translating business requirements into technical specifications.
• Tested the data and data integrity among various sources and targets. Worked with the production support team on various performance-related issues.
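The incremental-load pattern mentioned in the first bullet can be sketched as follows; the in-memory lists standing in for database tables and the 'updated_at' change-tracking column are illustrative assumptions.

```python
from datetime import datetime

def incremental_load(source_rows, target_rows, last_loaded):
    """Append only source rows newer than the last-load watermark.

    source_rows / target_rows are lists of dicts standing in for
    database tables; 'updated_at' is an assumed change-tracking
    column. Returns the new watermark for the next run.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > last_loaded]
    target_rows.extend(new_rows)
    return max((r["updated_at"] for r in new_rows), default=last_loaded)
```

Each run persists the returned watermark so the next run picks up exactly where this one stopped, instead of reloading the full source.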
6. Project Name : Clarity ETL (Oct 2011 to Dec 2012)
Application Name : Clarity
Client : USA Healthcare Client
ETL Tool/Technology: Epic’s Clarity, Teradata, UNIX
Team Size : 7
Upgraded (to Epic 2010) and maintained a multi-deployment Clarity environment processing approximately 20 TB of data daily. The project implemented a healthcare data warehouse product (Epic's Clarity) for a healthcare major. Clarity provided the backbone of the client's RDBMS-based reporting capability on day-to-day medical facility operations.
Role: Development Lead
Responsibilities:
• Installed, implemented, and tested complete Clarity environments (i.e., Console, Compass, Hyperspace, and the Teradata database configuration of Epic tools) for multiple non-production and production Clarity environments.
• Responsible for designing Clarity environments and ETL solutions for release and version upgrades; performed analysis between release and version levels to identify and document differences.
• Worked on Epic technical components such as the Caché database and Datalink.
• Investigated and resolved data quality issues raised by the business community.
7. Project Name : Component Engine (Oct 2010 to Sept 2011)
Application Name : Merlin.
Client : UK Telecomm Client
ETL Tool/Technology : Oracle PL/SQL, UNIX
Team Size : 4
Our team built a generic component engine in Oracle and UNIX, configurable by business clients through a web interface to calculate new component measures. The component engine populated multiple fact tables exposed to BI reports. This project is the backbone of the client's business reporting and of its analysis of call centre employee efficiency and performance.
Role: Dev Team Member
Responsibilities:
• Involved in requirement gathering and analysis with the delivery team and client.
• Created the PL/SQL procedures through which the component engine was implemented.
• Created a UNIX shell script to control the number of processes invoked for the component engine and to maintain execution order per the configurable inter-component dependencies.
• Production implementation support.
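The dependency-ordered scheduling that the shell script enforced can be sketched as a topological sort; the component names in the usage are illustrative assumptions, and the real script additionally capped how many processes ran concurrently.

```python
from collections import deque

def execution_order(dependencies):
    """Return a run order where every component follows its prerequisites.

    dependencies maps each component to the components it depends on.
    Uses Kahn's algorithm; raises ValueError on a dependency cycle.
    """
    indegree = {c: len(deps) for c, deps in dependencies.items()}
    dependents = {c: [] for c in dependencies}
    for comp, deps in dependencies.items():
        for dep in deps:
            dependents[dep].append(comp)
    ready = deque(c for c, n in indegree.items() if n == 0)
    order = []
    while ready:
        comp = ready.popleft()
        order.append(comp)
        for nxt in dependents[comp]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(dependencies):
        raise ValueError("cyclic inter-component dependency")
    return order
```

For example, with a (hypothetical) map like {"base": [], "calls": ["base"], "report": ["calls", "base"]}, the engine would run "base" before "calls" and both before "report".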
8. Project Name : Closed Loop Marketing & Email Repository (Nov 2009 to Sept 2010)
Application Name : EDI(Enterprise Data Integration)
Client : UK Telecomm Client
ETL Tool/Technology : Informatica Powercenter, Oracle, UNIX
Team Size : 5
This project was a new initiative by the client's business to make customer response data from telemarketing campaigns available alongside the marketing campaign source data, so that customer feedback could be accommodated in future campaigns. The technologies involved were Informatica for ETL and Siebel Analytics for business reporting and marketing campaigns.
Email Repository was an initiative by the client's business to consolidate email data, more cleansed than in the various distributed sources, in one place along with consent information.

More Related Content

What's hot

Mani_Sagar_ETL
Mani_Sagar_ETLMani_Sagar_ETL
Mani_Sagar_ETLMani Sagar
 
ECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCB
ECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCBECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCB
ECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCBEMA Design Automation
 
ABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG AxisbankABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG AxisbankAbhijeet Ghag
 
Amit Porwal_resume-Latest
Amit Porwal_resume-LatestAmit Porwal_resume-Latest
Amit Porwal_resume-LatestAmit Porwal
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh Kumar
 
PTC Live: Integrating PTC Windchill with Cadence PCB Design
PTC Live: Integrating PTC Windchill with Cadence PCB DesignPTC Live: Integrating PTC Windchill with Cadence PCB Design
PTC Live: Integrating PTC Windchill with Cadence PCB DesignEMA Design Automation
 
Consulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCSConsulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCSVictor M Torres
 
Qlikview training course contents, Qlikview training in bangalore, qlikview t...
Qlikview training course contents, Qlikview training in bangalore, qlikview t...Qlikview training course contents, Qlikview training in bangalore, qlikview t...
Qlikview training course contents, Qlikview training in bangalore, qlikview t...Manoj Jagtap
 

What's hot (19)

Mani_Sagar_ETL
Mani_Sagar_ETLMani_Sagar_ETL
Mani_Sagar_ETL
 
Harikrishna yaddanapudi
Harikrishna yaddanapudiHarikrishna yaddanapudi
Harikrishna yaddanapudi
 
Resume_Gulley_Oct7_2016
Resume_Gulley_Oct7_2016Resume_Gulley_Oct7_2016
Resume_Gulley_Oct7_2016
 
ECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCB
ECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCBECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCB
ECAD MCAD Design Data Management with PTC Windchill and Cadence Allegro PCB
 
Abhishek jaiswal
Abhishek jaiswalAbhishek jaiswal
Abhishek jaiswal
 
ABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG AxisbankABHIJEET MURLIDHAR GHAG Axisbank
ABHIJEET MURLIDHAR GHAG Axisbank
 
Resume
ResumeResume
Resume
 
Geetha 6 yrs cv_july-2016
Geetha 6 yrs cv_july-2016Geetha 6 yrs cv_july-2016
Geetha 6 yrs cv_july-2016
 
Amit Porwal_resume-Latest
Amit Porwal_resume-LatestAmit Porwal_resume-Latest
Amit Porwal_resume-Latest
 
Sivagama_sundari_Sakthivel_Resume_2016
Sivagama_sundari_Sakthivel_Resume_2016Sivagama_sundari_Sakthivel_Resume_2016
Sivagama_sundari_Sakthivel_Resume_2016
 
Shivaprasada_Kodoth
Shivaprasada_KodothShivaprasada_Kodoth
Shivaprasada_Kodoth
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resume
 
PTC Live: Integrating PTC Windchill with Cadence PCB Design
PTC Live: Integrating PTC Windchill with Cadence PCB DesignPTC Live: Integrating PTC Windchill with Cadence PCB Design
PTC Live: Integrating PTC Windchill with Cadence PCB Design
 
Resume (1)
Resume (1)Resume (1)
Resume (1)
 
Resume
ResumeResume
Resume
 
Consulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCSConsulting Profile_Victor_Torres_2016-VVCS
Consulting Profile_Victor_Torres_2016-VVCS
 
Qlikview training course contents, Qlikview training in bangalore, qlikview t...
Qlikview training course contents, Qlikview training in bangalore, qlikview t...Qlikview training course contents, Qlikview training in bangalore, qlikview t...
Qlikview training course contents, Qlikview training in bangalore, qlikview t...
 
Rajesh CV
Rajesh CVRajesh CV
Rajesh CV
 
Jinan Babu
Jinan BabuJinan Babu
Jinan Babu
 

Viewers also liked

Production Support_ETL_Informatica Developer_raja_velpula_5yrs
Production Support_ETL_Informatica Developer_raja_velpula_5yrsProduction Support_ETL_Informatica Developer_raja_velpula_5yrs
Production Support_ETL_Informatica Developer_raja_velpula_5yrsrajasekhar velpula
 
Sudhir hadoop and Data warehousing resume
Sudhir hadoop and Data warehousing resume Sudhir hadoop and Data warehousing resume
Sudhir hadoop and Data warehousing resume Sudhir Saxena
 
Sanket_Informatica_CV
Sanket_Informatica_CVSanket_Informatica_CV
Sanket_Informatica_CVSanket Gaware
 
Resume_Anjali Rakesh_BA_Sales Force
Resume_Anjali Rakesh_BA_Sales Force Resume_Anjali Rakesh_BA_Sales Force
Resume_Anjali Rakesh_BA_Sales Force Anjali Rakesh
 
BLANCA KEOGH PLSQL Developer
BLANCA KEOGH PLSQL DeveloperBLANCA KEOGH PLSQL Developer
BLANCA KEOGH PLSQL DeveloperBlanca Murillo
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh Kumar
 
Astha Kumari Resume2
Astha Kumari Resume2Astha Kumari Resume2
Astha Kumari Resume2Astha Singh
 
ASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer ResumeASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer ResumeSorakayala Ashok
 
Introdução a linguagem Swift
Introdução a linguagem SwiftIntrodução a linguagem Swift
Introdução a linguagem SwiftGabriel Rodrigues
 
Dynamic Width File in Spark_2016
Dynamic Width File in Spark_2016Dynamic Width File in Spark_2016
Dynamic Width File in Spark_2016Subhasish Guha
 
javier lasa: calculos videoweb2010
javier lasa: calculos videoweb2010javier lasa: calculos videoweb2010
javier lasa: calculos videoweb2010Gonzalo Martín
 
Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...
Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...
Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...Introspecta Taller Orientación Vocacional
 
Oficiais Aprovados
Oficiais AprovadosOficiais Aprovados
Oficiais AprovadosHugo Machado
 
How society and technologies influence User Interfaces
How society and technologies influence User InterfacesHow society and technologies influence User Interfaces
How society and technologies influence User InterfacesMarianne Abreu
 

Viewers also liked (17)

Production Support_ETL_Informatica Developer_raja_velpula_5yrs
Production Support_ETL_Informatica Developer_raja_velpula_5yrsProduction Support_ETL_Informatica Developer_raja_velpula_5yrs
Production Support_ETL_Informatica Developer_raja_velpula_5yrs
 
Sudhir hadoop and Data warehousing resume
Sudhir hadoop and Data warehousing resume Sudhir hadoop and Data warehousing resume
Sudhir hadoop and Data warehousing resume
 
Sanket_Informatica_CV
Sanket_Informatica_CVSanket_Informatica_CV
Sanket_Informatica_CV
 
Resume_Anjali Rakesh_BA_Sales Force
Resume_Anjali Rakesh_BA_Sales Force Resume_Anjali Rakesh_BA_Sales Force
Resume_Anjali Rakesh_BA_Sales Force
 
BLANCA KEOGH PLSQL Developer
BLANCA KEOGH PLSQL DeveloperBLANCA KEOGH PLSQL Developer
BLANCA KEOGH PLSQL Developer
 
Rakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resumeRakesh sr dwh_bi_consultant resume
Rakesh sr dwh_bi_consultant resume
 
Astha Kumari Resume2
Astha Kumari Resume2Astha Kumari Resume2
Astha Kumari Resume2
 
ASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer ResumeASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer Resume
 
Introdução a linguagem Swift
Introdução a linguagem SwiftIntrodução a linguagem Swift
Introdução a linguagem Swift
 
Dynamic Width File in Spark_2016
Dynamic Width File in Spark_2016Dynamic Width File in Spark_2016
Dynamic Width File in Spark_2016
 
javier lasa: calculos videoweb2010
javier lasa: calculos videoweb2010javier lasa: calculos videoweb2010
javier lasa: calculos videoweb2010
 
Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...
Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...
Descubre tu Vocación: Licenciatura Filosofía | Panorama laboral | ¿Cuánto gan...
 
Imaflora ras final
Imaflora ras finalImaflora ras final
Imaflora ras final
 
nd-grad-cert
nd-grad-certnd-grad-cert
nd-grad-cert
 
Oficiais Aprovados
Oficiais AprovadosOficiais Aprovados
Oficiais Aprovados
 
How society and technologies influence User Interfaces
How society and technologies influence User InterfacesHow society and technologies influence User Interfaces
How society and technologies influence User Interfaces
 
IBD 2016 QI symposium 2-23-2016
IBD 2016 QI symposium 2-23-2016IBD 2016 QI symposium 2-23-2016
IBD 2016 QI symposium 2-23-2016
 

Similar to Amit_Kumar_CV

Resume quaish abuzer
Resume quaish abuzerResume quaish abuzer
Resume quaish abuzerquaish abuzer
 
Vishwanath_M_CV_NL
Vishwanath_M_CV_NLVishwanath_M_CV_NL
Vishwanath_M_CV_NLVishwanath M
 
Murali tummala resume in SAP BO/BI
Murali tummala resume in SAP BO/BIMurali tummala resume in SAP BO/BI
Murali tummala resume in SAP BO/BIMurali Tummala
 
HarshaKore-HCLTechnologies
HarshaKore-HCLTechnologiesHarshaKore-HCLTechnologies
HarshaKore-HCLTechnologiesharsha kore
 
Ipsita_Informatica_9Year
Ipsita_Informatica_9YearIpsita_Informatica_9Year
Ipsita_Informatica_9Yearipsita mohanty
 
Subhoshree_ETLDeveloper
Subhoshree_ETLDeveloperSubhoshree_ETLDeveloper
Subhoshree_ETLDeveloperSubhoshree Deo
 
Swati Gupta Resume
Swati Gupta ResumeSwati Gupta Resume
Swati Gupta ResumeSwati Gupta
 
Resume - Deepak v.s
Resume -  Deepak v.sResume -  Deepak v.s
Resume - Deepak v.sDeepak V S
 
Dinesh_Deshpande_9Yrs_Exp_Informatica
Dinesh_Deshpande_9Yrs_Exp_InformaticaDinesh_Deshpande_9Yrs_Exp_Informatica
Dinesh_Deshpande_9Yrs_Exp_InformaticaDinesh Deshpande
 
Resume - DWBI - 5years - Akshay Shaha
Resume - DWBI - 5years - Akshay ShahaResume - DWBI - 5years - Akshay Shaha
Resume - DWBI - 5years - Akshay ShahaAkshay Shaha
 
Informatica,Teradata,Oracle,SQL
Informatica,Teradata,Oracle,SQLInformatica,Teradata,Oracle,SQL
Informatica,Teradata,Oracle,SQLsivakumar s
 

Similar to Amit_Kumar_CV (20)

RajeshS_ETL
RajeshS_ETLRajeshS_ETL
RajeshS_ETL
 
Resume quaish abuzer
Resume quaish abuzerResume quaish abuzer
Resume quaish abuzer
 
Vishwanath_M_CV_NL
Vishwanath_M_CV_NLVishwanath_M_CV_NL
Vishwanath_M_CV_NL
 
Abdul ETL Resume
Abdul ETL ResumeAbdul ETL Resume
Abdul ETL Resume
 
Subhoshree resume
Subhoshree resumeSubhoshree resume
Subhoshree resume
 
Murali tummala resume in SAP BO/BI
Murali tummala resume in SAP BO/BIMurali tummala resume in SAP BO/BI
Murali tummala resume in SAP BO/BI
 
HarshaKore-HCLTechnologies
HarshaKore-HCLTechnologiesHarshaKore-HCLTechnologies
HarshaKore-HCLTechnologies
 
Ganesh profile
Ganesh profileGanesh profile
Ganesh profile
 
Sudhir jaiswal
Sudhir jaiswalSudhir jaiswal
Sudhir jaiswal
 
Ipsita_Informatica_9Year
Ipsita_Informatica_9YearIpsita_Informatica_9Year
Ipsita_Informatica_9Year
 
Subhoshree_ETLDeveloper
Subhoshree_ETLDeveloperSubhoshree_ETLDeveloper
Subhoshree_ETLDeveloper
 
Swati Gupta Resume
Swati Gupta ResumeSwati Gupta Resume
Swati Gupta Resume
 
Resume - Deepak v.s
Resume -  Deepak v.sResume -  Deepak v.s
Resume - Deepak v.s
 
Dinesh_Deshpande_9Yrs_Exp_Informatica
Dinesh_Deshpande_9Yrs_Exp_InformaticaDinesh_Deshpande_9Yrs_Exp_Informatica
Dinesh_Deshpande_9Yrs_Exp_Informatica
 
Resume_Sita_Ramadas_akkineni
Resume_Sita_Ramadas_akkineniResume_Sita_Ramadas_akkineni
Resume_Sita_Ramadas_akkineni
 
Murali Tummala Resume
Murali Tummala ResumeMurali Tummala Resume
Murali Tummala Resume
 
Resume - DWBI - 5years - Akshay Shaha
Resume - DWBI - 5years - Akshay ShahaResume - DWBI - 5years - Akshay Shaha
Resume - DWBI - 5years - Akshay Shaha
 
Abhishek jaiswal
Abhishek jaiswalAbhishek jaiswal
Abhishek jaiswal
 
HamsaBalajiresume
HamsaBalajiresumeHamsaBalajiresume
HamsaBalajiresume
 
Informatica,Teradata,Oracle,SQL
Informatica,Teradata,Oracle,SQLInformatica,Teradata,Oracle,SQL
Informatica,Teradata,Oracle,SQL
 

Amit_Kumar_CV

  • 1. Amit Kumar +1-925-918-5201 amit.git@gmail.com Professional Objective Motivated to work in a competitive and congenial environment to help business grow and provides ample opportunities for individual professional growth. Experience Summary 9.5 years of IT work experience, with 8 years+ of DW/BI experience as a Lead developer and architect roles. Worked with clients based on various geographical locations and domains. Experience implementing DW/BI solutions for Telecom, insurance and healthcare domain data warehouse projects. My technological forte is Data modelling, Solution design, Informatica PowerCenter, Informatica BDE, UNIX, Hadoop, Hive, Oracle 10g, PL/SQL, T-SQL (SQL server), SSIS, EPIC’s Clarity ETL. October 2006 – 14th August 2015: Tata Consultancy Services. 17th August 2015 - Till Date: Capgemini Joined IT Multinational on 17th of August 2015, currently assigned to lead equipment install base project for Oil and Gas major, in capacity of Data Architect. Technology & Tools ETL Tools Informatica PowerCenter 9.x/8.x, DT Studio, Informatica BDE Developer, Oracle warehouse Builder, Epic’s Clarity, SSIS Big Data solutions Hadoop Ecosystem, Hive, Pig Latin, Sqoop Reporting Tools Microsoft power view, OBIEE, Predix Databases Exadata, Teradata, MS SQL Server 2012, Oracle 10g, PostgreSQL, Green Plum Data Modelling Tool Erwin, SQL Modeller Languages T-SQL, SQL, PL/SQL, Unix shell scripting, Basic Python, Core Java Other Tools RTC, ALM, SQL plus, TOAD, Remedy Admin, Remedy User, SVN, Star Team, VSS Certifications • Cloudera Certified Apache Hadoop Developer (CCD-410 with 84%) - 2015 • MapR Certified Apache Hadoop Developer- 2015 • MapR Certified Apache HBase Developer (No SQL)- 2015 • AHM 250 Certified – 2014 • Microsoft .NET Framework 2.0 - Application Development Foundation Certification - 2008 • Oracle certified Associate - 2010 1
  • 2. Engagement Overview Currently working for Capgemini as full time DW/BI consultant, current end client IT is a multinational Oil and Gas major. I am working in capacity of data architect leading 12 members team to develop install base for client’s equipment service and sales business. Project execution mode is Agile and my roles and responsibilities include providing sprint estimation and drive the execution following client specific change control process for code deployment. UI for this solution is proposed to be implemented on cloud platform. I have 8+ years of experience in providing and implementing DW/BI solutions of which close to 3 years worked as BW/BI architect. During this time, I have lead team to design and develop Install base BI solution, Product lifecycle BI solution, QA process data quality model along with various data integration solutions. With multiple business domain BI solution implementation experience, I am capable to helping teams to manage data strategy. With extensive working experience with ETL frameworks and range of ETL tools, I am capable of guiding teams to correct and efficient implementation of ETL solution. I am having very good analytical skills and it helped me quickly provide data quality matrix and highlight issues with data integration. Experienced in analysing business processes, transactional system data and other source data to understand dependencies, anomalies if present in implementing BI solution. I am also having 8 months of work experience with Hadoop platform and its echo system tools like Pig, Hive, Sqoop, and ETL tool Informatica BDE. I completed Big Data Developer training with Cloudera and became certified Hadoop developer by Cloudera and MapR. In addition to that I also completed MapR certification on Hbase (No SQL solution) developer. Looking forward for a project to contribute with my experience on Big Data and learn further. 
Below are the details of assignments I have worked in: - (given in chronologically reverse order depending on assignment end date) 2
  • 3. 1. Project Name : Surface install base (June 2016 to Till Date) Application Name : Field 360 End Client : Major Commercial Oil and Gas client. ETL Tool/Technology: Informatica Powercenter, SQL Modeller, Exadata, Postgres, Predix, UNIX. Team Size : 12 Working in capacity of Data architect to provide install base BI solution for Oil and Gas equipment sales and service business. Role: Data Architect Responsibilities: • Interact with the business users on daily basis as a part of agile methodology and Involve in understanding/gathering user’s requirements and build/update IT solution document for the project. • Created data model using SQL Modeller solution and get design approvals from IT owners and platform architect. Preparing guidelines for ETL process for performance related issues and make changes as per the need. • Data profiling tasks and provide strategy for data integration from ERP, CRMs and other sources. • Database table, Index design with other objects. • Create data flow diagram and design Informatica workflows. • Lead the team to meet sprint and iteration deadlines. • Work on data quality issues and defect analysis. Interact with users on data issues. Work as a SPOC for all defects in ALM for entire project. • UI design and guide team for user experience requirement. • Follow the Change control process to move the developed code to production to make it available to the users. Create necessary change requests for go live and follow up with different technical teams for review and approval. • Provide time and effort level estimations on requirements and getting approvals from the IT Owners by explaining the complexity if required. 2. Project Name : PQW data warehouse (August 2015 to May 2016) Application Name : PQW End Client : Major Healthcare equipment manufacturer client ETL Tool/Technology: Informatica Powercenter, Erwin, Teradata, UNIX, ALM. 
Team Size : 7 Worked in capacity of BI Architect and lead team to implement BI solution for distribution record, which helped them to get a view of correct product lifecycle. Implemented performance improvement changes (30 % improvement achieved) in existing Product Quality Warehouse project. Project was executed in Agile mode. Role: BI Architect Responsibilities: • Interact with the business users on daily basis as a part of agile methodology and Involve in gathering user’s requirements and prepare solution document. • ETL Performance analysis and get change implemented to improve data processing time and resource consumption. • Data profiling tasks and provide strategy for data integration from ERP, PLMs and Legacy sources. 3
• Provided time and effort estimates for requirements and obtained client approvals, explaining complexity where needed.
• UI design and guiding the team on user experience requirements in OBIEE.
• Updated the existing data model using Erwin and obtained design approvals from the client. Monitored the ETL process for performance issues and made changes as needed.
• Designed and developed Informatica workflows and Teradata ETL objects for data loading, and packages for reporting. Led the team to meet each sprint deadline on time.
• Worked on defect analysis of reports with the client; interacted with users to understand and resolve issues. Acted as SPOC for all defects in ALM for the entire project.
• Followed the change control process to move developed code to production; created the necessary change requests for go-live and followed up with the technical teams for review and approval.

3. Project Name: CI IT QA Metric (August 2014 to August 2015)
Application Name: CI IT QA Metric
End Client: USA insurance client
ETL Tool/Technology: Informatica PowerCenter, Erwin, SSIS, MS SQL Server, Oracle, SharePoint, UNIX, Microsoft Power View
Team Size: 6
Designed a data mart to hold QA team metrics sourced from the databases of various QA tools (ALM, RTC, Jira) and multiple management dashboards. Designed and developed the Informatica ETL workflows and scheduled them for automatic refresh.
Role: BI Architect
Responsibilities:
• Designed the QA data mart (logical and physical dimensional modelling) to support metric reports on ongoing and completed projects.
• Performed data profiling and provided the data integration strategy for QA-tool-backed data for metric reporting.
• Designed and developed ETL workflows using Informatica and SSIS to pull data from QA tools such as ALM and RTC and from project dashboards.
• Defined scheduling for the Informatica workflows.
• Addressed data quality issues.
• Provided knowledge transfer on the developed metric reports to QA teams across domains in the CI vertical.
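The QA metric data mart above follows standard dimensional modelling. A minimal sketch of the idea, with a defect fact joined to project and tool dimensions; all table, column, and sample names are illustrative assumptions, not the project's actual design, shown here in SQLite for self-containment:

```python
import sqlite3

# Minimal star schema for QA defect metrics: two dimensions, one fact.
# All names are hypothetical; the real mart design is not documented here.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_project (
    project_key  INTEGER PRIMARY KEY,
    project_name TEXT,
    domain       TEXT
);
CREATE TABLE dim_qa_tool (
    tool_key  INTEGER PRIMARY KEY,
    tool_name TEXT              -- e.g. ALM, RTC
);
CREATE TABLE fact_defect (
    project_key  INTEGER REFERENCES dim_project(project_key),
    tool_key     INTEGER REFERENCES dim_qa_tool(tool_key),
    defect_count INTEGER,
    open_count   INTEGER
);
""")
cur.execute("INSERT INTO dim_project VALUES (1, 'Policy Admin', 'Insurance')")
cur.execute("INSERT INTO dim_qa_tool VALUES (1, 'ALM')")
cur.execute("INSERT INTO fact_defect VALUES (1, 1, 42, 5)")

# A typical metric query: total defects per project and tool.
row = cur.execute("""
    SELECT p.project_name, t.tool_name, SUM(f.defect_count)
    FROM fact_defect f
    JOIN dim_project p ON p.project_key = f.project_key
    JOIN dim_qa_tool  t ON t.tool_key   = f.tool_key
    GROUP BY p.project_name, t.tool_name
""").fetchone()
print(row)  # ('Policy Admin', 'ALM', 42)
```

Surrogate keys on the dimensions keep the fact table narrow and let each QA tool's source identifiers be remapped without touching the fact rows.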
4. Project Name: EDW (September 2013 to August 2014)
Application Name: Enterprise Data Warehouse
Client: USA healthcare client
ETL Tool/Technology: Hadoop, Hive, Pig Latin, Informatica BDE, PowerCenter, Teradata, UNIX
Team Size: 10+
The project developed an EDW and BI solution, integrating data from multiple source systems to support multiple lines of business. As an Informatica BDE developer, designed and developed workflows in BDE to transfer data from HDFS to Hive. Created Sqoop scripts to transfer data from Hive to the RDBMS stage and scheduled them through Oozie.
Role: Informatica BDE/PowerCenter Lead Developer
Responsibilities:
• Designed and developed ETL workflows in Informatica BDE and PowerCenter.
• Wrote Pig Latin scripts to process data in Hadoop.
• Wrote Sqoop commands to pull data from Hive into the Teradata stage.
• Created UDFs in Python for date transformation.
• Investigated data quality issues.

5. Project Name: Integrated Data Repository (January 2013 to September 2013)
Application Name: Integrated Data Repository
Client: USA healthcare client
ETL Tool/Technology: Informatica PowerCenter, Teradata, Oracle, UNIX
Team Size: 7
Our team rewrote code from SAS to Informatica and SQL to pull patient, encounter, hospital services and related data into the IDR warehouse from sources such as Clarity and other out-of-network medical facilities. The new workflow was fully automated, saving the client the cost of maintaining a separate SAS team to pull data from Clarity manually. The project was built on existing infrastructure at the client facility and proved more efficient than the old solution.
Role: Developer
Responsibilities:
• Developed a number of complex Informatica mappings and reusable transformations to implement business logic and load data incrementally.
• Designed the ETL processes using Informatica PowerCenter 9.x to load data from Teradata and SQL Server into the target Oracle database.
• Worked extensively with business and data analysts to gather requirements and translate them into technical specifications.
• Tested data and data integrity across sources and targets. Assisted the production support team with various performance-related issues.
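The Python date-transformation UDFs mentioned under the EDW project would look roughly like this. The input format (MM/DD/YYYY) and function name are assumptions for illustration; in Pig such a function is registered as a Jython UDF, but the core logic is plain Python:

```python
from datetime import datetime

def to_iso_date(raw):
    """Normalize a source date string to ISO format (YYYY-MM-DD).

    The MM/DD/YYYY input format is an assumption for illustration.
    Empty or malformed values are returned as None so the load can
    flag them downstream instead of failing the whole job.
    """
    if not raw:
        return None
    try:
        return datetime.strptime(raw.strip(), "%m/%d/%Y").strftime("%Y-%m-%d")
    except ValueError:
        return None

print(to_iso_date("09/30/2013"))  # 2013-09-30
print(to_iso_date("bad-date"))    # None
```

Returning None for bad input rather than raising keeps a single dirty record from aborting a multi-hour Hadoop batch; the nulls can then be counted as a data quality metric.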
6. Project Name: Clarity ETL (October 2011 to December 2012)
Application Name: Clarity
Client: USA healthcare client
ETL Tool/Technology: Epic's Clarity, Teradata, UNIX
Team Size: 7
Upgraded (Epic 2010) and maintained a multi-deployment Clarity environment processing approximately 20 TB of data daily. The project implemented Epic's Clarity healthcare data warehouse product for a healthcare major; Clarity provided the backbone of the client's RDBMS-based reporting on day-to-day medical facility operations.
Role: Development Lead
Responsibilities:
• Installed, implemented and tested complete Clarity environments (Console, Compass, Hyperspace, Teradata database configuration of Epic tools) for multiple non-production and production environments.
• Designed Clarity environments and ETL solutions for release and version upgrades; performed analysis between release and version levels to identify and document differences.
• Worked on Epic technical components such as the Caché database and Datalink.
• Investigated and resolved data quality issues raised by the business community.

7. Project Name: Component Engine (October 2010 to September 2011)
Application Name: Merlin
Client: UK telecom client
ETL Tool/Technology: Oracle PL/SQL, UNIX
Team Size: 4
Our team built a generic component engine in Oracle and UNIX, configurable by business clients through a web interface to calculate new component measures. The component engine populated multiple fact tables exposed to BI reports. The project is the backbone of the client's business reporting, analysing the efficiency and performance of its call centre employees.
Role: Dev Team Member
Responsibilities:
• Involved in requirement gathering and analysis with the delivery team and client.
• Created the PL/SQL procedures through which the component engine was implemented.
• Created a UNIX shell script to control the number of processes invoked for the component engine and maintain execution order per the configurable inter-component dependencies.
• Production implementation support.

8. Project Name: Closed Loop Marketing & Email Repository (November 2009 to September 2010)
Application Name: EDI (Enterprise Data Integration)
Client: UK telecom client
ETL Tool/Technology: Informatica PowerCenter, Oracle, UNIX
Team Size: 5
This project was a new business initiative to make customer response data for telemarketing campaigns available alongside marketing campaign source data, so that customer feedback could inform future campaigns. Technologies involved were Informatica for ETL and Siebel Analytics for business reporting and marketing campaigns. Email Repository was an initiative to consolidate cleansed email data, along with consent information, in one place rather than across various distributed sources.
Role: Developer
Responsibilities:
• Developed mappings of medium complexity to pull campaign marketing and customer feedback data from Siebel Marketing systems, and customer data from the EDW, into the data warehouse.
• Used Informatica version control to check in all versions of the objects used in mappings and workflows, tracking changes across development, test and production.
• Worked with Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator and Update Strategy transformations.
• Involved in Informatica code deployments to all environments.
• Unit testing.

9. Project Name: Siebel Marketing (July 2009 to October 2009)
Application Name: EDI (Enterprise Data Integration)
Client: UK telecom client
ETL Tool/Technology: Informatica PowerCenter, Oracle, UNIX, QC
Team Size: 9
Our team provided value-added services to the client with expertise in OBI and ETL design. I worked as a functional tester for Informatica ETL mappings and PL/SQL code, testing the ETL workflows of several fact and dimension tables in the database BI layer.
Role: Application System Tester
Responsibilities:
• Involved in requirement gathering and analysis with the delivery team and client.
• Functional testing of Informatica mappings and maintaining quality gates.
• Created automation scripts for regression testing.
• Implementation support.

10. Project Name: TSR Migration, EDW (July 2008 to June 2009)
Application Name: Retail EDW (Enterprise Data Warehouse)
Client: UK telecom client
ETL Tool/Technology: OWB (Oracle Warehouse Builder), Oracle, UNIX, QC
Team Size: 5
The team worked on quality assurance for many data warehouse enhancements; I was involved in the system testing part of the delivery cycle.
One of the major projects I tested was TSR Migration. It was developed to migrate all UK business customer data from the classic stack (CSS) to the new strategic stack (One Siebel & Geneva) of UK Business Single Solution (UKBSS), achieving the physical separation of data required by the transformational and regulatory (TSR) requirements for BT UK Business. The overall migration is broadly divided into two main processes: Data Analysis (using OBI) and Data Migration (using ODI). The data analysis process identifies the set of billing accounts ready for migration in a particular release. The data migration process extracts data from legacy systems such as IRS and SIM, validates it against multiple exclusion and validation rules, transforms it using product and field mappings, generates load files, and sends them to intermediate systems that load the transformed data into strategic systems such as One Siebel and Geneva.
Role: Application System Tester
Responsibilities:
• Involved in test approach analysis with delivery team members and the client.
• System testing and maintaining quality gates.
• Created automation scripts for regression testing.
• Implementation support.

11. Project Name: NTM (December 2006 to May 2008)
Client: Qwest Communications
ETL Tool/Technology: Remedy Admin, Remedy User, Core Java
Team Size: 5
The team worked on network trouble ticketing, primarily with AR System Remedy Admin, Remedy User and QC. The project developed a ticketing system capable of registering trouble tickets both from manual complaints and proactively by monitoring network element parameters. Java was used to implement complex business logic that was not possible with the Remedy Admin tool features alone.
Role: Developer
Responsibilities:
• Understood requirements from business/IT users or the onshore counterpart.
• Provided approaches to implement the requirements in AR Remedy Admin.
• Unit tested the solution.
• Provided support for system testing.
• Implementation support.
Tools Developed:
• UNIX utility using OMB+ to create OWB, PL/SQL and UNIX auto-deployment scripts.
• PL/SQL packages to generate data for testing purposes.
• Macro-based Excel workbooks to load data into and retrieve data from an Oracle database.
• XML parser to generate CSV files of Clarity execution stats, along with performance graphs.
• Multiple project-specific UNIX shell scripts to automate processes.

Skills
• Good analytical skills and quick learning ability.
• Multidimensional thinking and problem-solving skills.
• Capable of handling multiple responsibilities.
• Good communication and interpersonal skills.
• Experience leading a technical team and aligning each individual's goals towards the end solution.

Personal Profile
Name: Amit Kumar
Date of Birth: 15 Jan 1984
Marital Status: Married
Nationality: Indian
Work Visa: H1B (valid till July 2018)
Languages Known: Hindi, English
Present Address: Houston, TX, USA

Declaration
I hereby declare that the information furnished above is true and correct to the best of my knowledge.

Place: Houston, TX, USA
Date: July 15, 2016
Amit Kumar