This document contains a summary of Amit Kumar's professional experience and qualifications. He has over 9 years of IT experience, including 8 years of data warehouse and business intelligence experience. Currently he works as a data architect at Capgemini, leading a team of 12 on an oil and gas equipment install base project. He has extensive experience designing and developing data integration solutions using tools like Informatica, Hadoop, and SQL.
Actively looking for new opportunities in Business Intelligence, Data Integration, and Data Warehousing. Hands-on experience in data analysis, providing ETL solutions, and building reports, dashboards, and frameworks for the business.
Tools/Technologies:
-Databases: SQL Server 2012, Teradata, MySQL.
-Reporting Tools: Pentaho Report Designer, Tableau.
-Dashboard Tool: Pentaho CDF, Pentaho CDE, Saiku.
-ETL Tools: Pentaho PDI, SSIS.
-Scripting Languages: Python, UNIX Shell Scripting, JavaScript.
-Cloud: AWS – S3, RDS, EC2, DMS, Glue; Snowflake, Matillion.
• 11+ years of IT industry experience in analysis, design, development, maintenance, and support of software applications, mainly in Data Warehousing (Informatica PowerCenter, OWB, SSIS, and Business Objects), Oracle (SQL, PL/SQL), and Teradata, across industry verticals including Finance, Telecom, Retail, and Healthcare.
• Work experience in client-facing roles in the UK and Ireland.
• Performed numerous roles in Business Intelligence projects, including Data Warehouse System Analyst, ETL Designer, Onshore Coordinator, Technical Lead, and Senior Data Warehouse Developer, with multinational, results-driven IT organizations.
• Extensive experience on data integration projects accessing sources such as Teradata, Oracle, and SQL Server.
• Created robust EDW solutions from various source types such as flat files, XML files, EBCDIC COBOL copybooks from mainframe systems, and DB2 unload files.
• Extensive experience in data discovery and cleansing using Informatica IDQ.
• Resolved inconsistent and duplicate data issues during data analysis to support strategic EDW goals.
• Extensive experience in data integration using the Informatica PowerCenter tool stack.
• Strong knowledge of data warehousing concepts, ETL concepts, data modeling, and dimensional modeling.
• Conducted training on Informatica and received awards for training proficiency.
• Excellent understanding of OLTP and OLAP concepts; expert in writing SQL and stored procedures on Teradata, Oracle, and SQL Server.
• Extensive experience implementing data warehousing methodologies, including star schema, snowflake schema, and 3NF, for large data warehouses.
• Extensive knowledge of Change Data Capture (CDC) and SCD Type 1, Type 2, and Type 3 implementations (see the sketch after this list).
• Excellent understanding of the Kimball and Inmon methodologies.
• Provided leadership in addressing high-level technical issues and questions about the functionality of reporting and business intelligence applications.
• Managed current engineering needs in the data integration space while planning strategically for future ones.
• Acted as an interface and coordinator between database administration, ETL development, testing, and reporting teams to remove roadblocks and keep information flowing smoothly.
• Hands-on experience tuning ETL mappings, identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
• Expert in designing and developing complex ETL mappings using Informatica PowerCenter.
• Proficient in resolving performance issues using Informatica PowerCenter and Teradata.
• Experience using Teradata utilities (TPT, BTEQ, FastLoad, MultiLoad, FastExport, TPump).
• Exposure to writing shell scripts per given requirements.
• Worked extensively with the Teradata GCFR tool.
• Experience in SAP ECC integration with Informatica.
• Trained in Tableau, QlikView, and SAP BW 3.5, and completed POCs for each.
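To make the CDC/SCD bullet above concrete, here is a minimal Python sketch of the SCD Type 2 pattern: changed dimension rows are expired and re-inserted as new versions so history is preserved. The table layout, the customer_id/city field names, and the in-memory list representation are illustrative assumptions, not details of any specific project here.

```python
from datetime import date

def apply_scd_type2(dimension, incoming, load_date=None):
    """Expire changed rows and insert new versions (SCD Type 2 history)."""
    load_date = load_date or date.today()
    current = {r["customer_id"]: r for r in dimension if r["is_current"]}

    for rec in incoming:
        old = current.get(rec["customer_id"])
        if old is None:
            # New key: insert the first version.
            dimension.append(dict(rec, effective_date=load_date,
                                  end_date=None, is_current=True))
        elif old["city"] != rec["city"]:
            # Tracked attribute changed: close the old row, open a new one.
            old["end_date"] = load_date
            old["is_current"] = False
            dimension.append(dict(rec, effective_date=load_date,
                                  end_date=None, is_current=True))
        # Unchanged rows are left alone; history is preserved either way.
    return dimension

if __name__ == "__main__":
    dim = [{"customer_id": 1, "city": "Leeds",
            "effective_date": date(2014, 1, 1),
            "end_date": None, "is_current": True}]
    new_snapshot = [{"customer_id": 1, "city": "Dublin"},
                    {"customer_id": 2, "city": "London"}]
    for row in apply_scd_type2(dim, new_snapshot):
        print(row)
```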
Sudhir Saxena: Hadoop and Data Warehousing Resume
Overall 5.8 years of professional IT experience in Data Warehousing and Business Intelligence.
One year of onsite experience in Mexico and the USA with the USAA client.
Good knowledge of Big Data, Hadoop, Pig, Hive, Sqoop, HBase, Python, Spark, and Scala.
I am Murali. Below is a short summary of my professional experience:
Total years of experience: 13 (relevant BO experience: 10 years).
Technologies/tools worked on: SAP BO 4.x and XI R2, BO Administration, Xcelsius dashboard design, Crystal Reports, SAP Design Studio, SAP BW 7, Unix, Oracle 10g PL/SQL, SQL Server, and Microsoft technologies (VB, ASP, and VB.NET).
I am interested in taking up new roles covering different aspects of the BO domain.
A competent professional offering over 7 years of experience in Business Intelligence & Data Warehousing, working in roles such as Business Analyst, Data Analyst & Reporting Developer, ETL Architect, and Data Architect. Extensive work with a wide range of tools such as Tableau, Spotfire, and SAP BusinessObjects. Skilled in Teradata, Oracle, MS SQL Server, MS Access, and PostgreSQL databases; requirements analysis; reporting; Extract, Transform, Load (ETL); data warehousing; end-to-end execution of BI projects; system integration; value delivery; team leadership; and the onshore-offshore working model.
Business expertise: Power & Engineering (GE Power), Banking and Financial Services (GE Capital), and the leasing domain.
Interests: Data Science.
IT professional with 9 years of data warehousing experience in the areas of ETL design and development. Excellent experience in requirements gathering and in designing, developing, documenting, and testing ETL jobs and mappings as parallel jobs in DataStage to populate tables in data warehouses and data marts.
Amit Kumar
+1-925-918-5201
amit.git@gmail.com
Professional Objective
Motivated to work in a competitive and congenial environment that helps the business grow and provides ample opportunities for individual professional growth.
Experience Summary
9.5 years of IT work experience, including 8+ years of DW/BI experience in lead developer and architect roles. Worked with clients across various geographies and domains, implementing DW/BI solutions for telecom, insurance, and healthcare data warehouse projects. My technological forte is data modelling, solution design, Informatica PowerCenter, Informatica BDE, UNIX, Hadoop, Hive, Oracle 10g, PL/SQL, T-SQL (SQL Server), SSIS, and Epic's Clarity ETL.
October 2006 – 14 August 2015: Tata Consultancy Services.
17 August 2015 – present: Capgemini.
Joined Capgemini on 17 August 2015; currently assigned, in the capacity of Data Architect, to lead an equipment install base project for an oil and gas major.
Technology & Tools
ETL Tools : Informatica PowerCenter 9.x/8.x, DT Studio, Informatica BDE Developer, Oracle Warehouse Builder, Epic's Clarity, SSIS
Big Data Solutions : Hadoop ecosystem, Hive, Pig Latin, Sqoop
Reporting Tools : Microsoft Power View, OBIEE, Predix
Databases : Exadata, Teradata, MS SQL Server 2012, Oracle 10g, PostgreSQL, Greenplum
Data Modelling Tools : Erwin, SQL Modeller
Languages : T-SQL, SQL, PL/SQL, UNIX shell scripting, basic Python, core Java
Other Tools : RTC, ALM, SQL*Plus, TOAD, Remedy Admin, Remedy User, SVN, StarTeam, VSS
Certifications
• Cloudera Certified Apache Hadoop Developer (CCD-410 with 84%) - 2015
• MapR Certified Apache Hadoop Developer- 2015
• MapR Certified Apache HBase Developer (NoSQL) - 2015
• AHM 250 Certified – 2014
• Microsoft .NET Framework 2.0 - Application Development Foundation Certification - 2008
• Oracle Certified Associate - 2010
Engagement Overview
Currently working for Capgemini as a full-time DW/BI consultant; the current end client is a multinational oil and gas major. I am working in the capacity of data architect, leading a 12-member team to develop an install base for the client's equipment service and sales business. The project is executed in Agile mode, and my roles and responsibilities include providing sprint estimations and driving execution following the client-specific change control process for code deployment. The UI for this solution is proposed to be implemented on a cloud platform.
I have 8+ years of experience providing and implementing DW/BI solutions, of which close to 3 years have been as a DW/BI architect. During this time, I have led teams to design and develop an install base BI solution, a product lifecycle BI solution, and a QA process data quality model, along with various data integration solutions. With BI solution implementation experience across multiple business domains, I am capable of helping teams manage their data strategy.
With extensive working experience with ETL frameworks and a range of ETL tools, I am capable of guiding teams to correct and efficient implementations of ETL solutions. My analytical skills have helped me quickly produce data quality metrics and highlight issues with data integration. Experienced in analysing business processes, transactional system data, and other source data to understand dependencies and any anomalies present when implementing a BI solution.
I also have 8 months of work experience with the Hadoop platform, its ecosystem tools such as Pig, Hive, and Sqoop, and the ETL tool Informatica BDE. I completed Big Data Developer training with Cloudera and became a certified Hadoop developer with both Cloudera and MapR. In addition, I completed the MapR certification for HBase (NoSQL) developers. I am looking forward to a project where I can contribute my Big Data experience and learn further.
Below are the details of the assignments I have worked on, in reverse chronological order by assignment end date:
1. Project Name : Surface Install Base (June 2016 – present)
Application Name : Field 360
End Client : Major commercial oil and gas client
ETL Tool/Technology : Informatica PowerCenter, SQL Modeller, Exadata, Postgres, Predix, UNIX
Team Size : 12
Working in the capacity of Data Architect to provide an install base BI solution for the oil and gas equipment sales and service business.
Role: Data Architect
Responsibilities:
• Interact with the business users on daily basis as a part of agile methodology and Involve in
understanding/gathering user’s requirements and build/update IT solution document for the
project.
• Created data model using SQL Modeller solution and get design approvals from IT owners and
platform architect. Preparing guidelines for ETL process for performance related issues and make
changes as per the need.
• Data profiling tasks and provide strategy for data integration from ERP, CRMs and other sources.
• Database table, Index design with other objects.
• Create data flow diagram and design Informatica workflows.
• Lead the team to meet sprint and iteration deadlines.
• Work on data quality issues and defect analysis. Interact with users on data issues. Work as a SPOC
for all defects in ALM for entire project.
• UI design and guide team for user experience requirement.
• Follow the Change control process to move the developed code to production to make it available
to the users. Create necessary change requests for go live and follow up with different technical
teams for review and approval.
• Provide time and effort level estimations on requirements and getting approvals from the IT
Owners by explaining the complexity if required.
2. Project Name : PQW Data Warehouse (August 2015 – May 2016)
Application Name : PQW
End Client : Major healthcare equipment manufacturer
ETL Tool/Technology : Informatica PowerCenter, Erwin, Teradata, UNIX, ALM
Team Size : 7
Worked as a BI Architect and led the team to implement a BI solution for distribution records, which gave the client a correct view of the product lifecycle. Implemented performance improvement changes (a 30% improvement was achieved) in the existing Product Quality Warehouse project. The project was executed in Agile mode.
Role: BI Architect
Responsibilities:
• Interact with business users daily as part of the Agile methodology; involved in gathering user requirements and preparing the solution document.
• Analyse ETL performance and get changes implemented to improve data processing time and resource consumption.
• Perform data profiling tasks and provide the strategy for data integration from ERP, PLM, and legacy sources.
• Provide time and effort estimations for requirements and obtain approvals from the client, explaining the complexity.
• Design the UI and guide the team on user experience requirements in OBIEE.
• Updated the existing data model using Erwin and obtained design approvals from the client. Monitor the ETL process for performance-related issues and make changes as needed.
• Design and develop Informatica workflows and Teradata ETL objects for data loading, and packages for reporting. Lead the team to meet each sprint's deadlines.
• Work on defect analysis of reports with the client; interact with users when issues occur and resolve them. Act as the SPOC for all defects in ALM for the entire project.
• Follow the change control process to move developed code to production and make it available to users. Create the necessary change requests for go-live and follow up with different technical teams for review and approval.
3. Project Name : CI IT QA Metrics (August 2014 – August 2015)
Application Name : CI IT QA Metrics
End Client : USA insurance client
ETL Tool/Technology : Informatica PowerCenter, Erwin, SSIS, SQL Server, Oracle, SharePoint, UNIX, Microsoft Power View
Team Size : 6
Designed a data mart to hold the QA team's metric-specific data sourced from the databases of various QA tools (ALM, RTC, Jira) and multiple management dashboards. Designed and developed the Informatica ETL workflow for it and scheduled it for automatic updates.
Role: BI Architect
Responsibilities:
• Designed the QA data mart (logical and physical dimensional modelling) to facilitate metric reports for ongoing and completed projects.
• Performed data profiling tasks and provided the data integration strategy for data backed by various QA tools for metric reporting.
• Designed and developed ETL workflows using Informatica and SSIS to pull data from QA tools such as ALM and RTC and from various project dashboards.
• Defined the scheduling of Informatica workflows.
• Addressed any data quality issues.
• Provided knowledge about the developed metric reports to QA teams across domains in the CI vertical.
4. Project Name : EDW (September 2013 – August 2014)
Application Name : Enterprise Data Warehouse
Client : USA healthcare client
ETL Tool/Technology : Hadoop, Hive, Pig Latin, Informatica BDE, PowerCenter, Teradata, UNIX
Team Size : 10+
The project developed an EDW and BI solution, integrating data from multiple source systems to support multiple lines of business. As an Informatica BDE developer, I designed and developed workflows in BDE to transfer data from HDFS to Hive, created Sqoop scripts to transfer data from Hive to the RDBMS stage, and scheduled them through Oozie.
Role: Informatica BDE/PowerCenter Lead Developer
Responsibilities:
• Design and develop ETL workflows in Informatica BDE and PowerCenter.
• Write Pig Latin scripts to process data in Hadoop.
• Write Sqoop commands to pull data from Hive and load it into the Teradata stage.
• Create UDFs in Python for date transformation (see the sketch after this list).
• Investigate data quality issues.
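As a hedged illustration of the Python-UDF bullet above: Pig can register a Python (Jython) function and call it from Pig Latin. The function name, field name, and date formats below are assumptions for illustration; outside Pig's runtime, the outputSchema decorator is stubbed out so the file still runs on its own. (The Hive-to-Teradata transfers mentioned would typically use sqoop export with a JDBC connection string, but the exact invocation depends on the connector in use.)

```python
# date_udf.py - a minimal sketch of a Pig Python (Jython) UDF.
# Inside Pig, register with:  REGISTER 'date_udf.py' USING jython AS dates;
# then call:                  FOREACH rows GENERATE dates.to_iso(visit_date);
try:
    # Pig's Jython runtime injects outputSchema for declaring return types.
    outputSchema
except NameError:
    def outputSchema(schema):          # stub so the module runs standalone
        def wrap(func):
            return func
        return wrap

from datetime import datetime

@outputSchema("iso_date:chararray")
def to_iso(raw):
    """Normalize a MM/DD/YYYY source date string to ISO YYYY-MM-DD."""
    if raw is None or raw.strip() == "":
        return None                    # propagate nulls rather than fail
    return datetime.strptime(raw.strip(), "%m/%d/%Y").strftime("%Y-%m-%d")

if __name__ == "__main__":
    print(to_iso("09/30/2013"))        # -> 2013-09-30
```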
5. Project Name : Integrated Data Repository (January 2013 – September 2013)
Application Name : Integrated Data Repository
Client : USA healthcare client
ETL Tool/Technology : Informatica PowerCenter, Teradata, Oracle, UNIX
Team Size : 7
Our team completed a code rewrite from SAS to Informatica and SQL to pull patient, encounter, hospital services, and related data into the IDR warehouse from sources such as Clarity and other out-of-network medical facilities. The new workflow was fully automated and saved the client the cost of maintaining a separate SAS team to pull data from Clarity manually. The project was built on existing infrastructure at the client's facility and proved more efficient than the old solution.
Role: Developer
Responsibilities:
• Developed a number of complex Informatica mappings and reusable transformations to implement the business logic and load the data incrementally (see the sketch after this list).
• Designed the ETL processes using Informatica PowerCenter 9.x to load data from Teradata and SQL Server into the target Oracle database.
• Worked extensively with the business and data analysts on requirements gathering and translating business requirements into technical specifications.
• Tested the data and data integrity among various sources and targets; assisted the production support team with various performance-related issues.
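To make the incremental-load bullet concrete, here is a minimal Python sketch of the high-water-mark pattern commonly used for incremental extraction: pull only the rows changed since the last run, upsert them, then advance the watermark. The table and column names (src_orders, tgt_orders, last_updated) and the SQLite backing store are assumptions for illustration, not details of the actual Informatica mappings.

```python
import sqlite3

WATERMARK_KEY = "src_orders.last_updated"

def get_watermark(conn):
    """Read the timestamp of the last successfully loaded row."""
    row = conn.execute(
        "SELECT value FROM etl_watermark WHERE key = ?", (WATERMARK_KEY,)
    ).fetchone()
    return row[0] if row else "1970-01-01 00:00:00"

def incremental_load(conn):
    # 1. Extract only rows changed since the previous run (high-water mark).
    since = get_watermark(conn)
    rows = conn.execute(
        "SELECT id, amount, last_updated FROM src_orders "
        "WHERE last_updated > ? ORDER BY last_updated", (since,)
    ).fetchall()

    # 2. Upsert each changed row into the target table.
    conn.executemany(
        "INSERT OR REPLACE INTO tgt_orders (id, amount) VALUES (?, ?)",
        [(r[0], r[1]) for r in rows],
    )

    # 3. Advance the watermark only after a successful load.
    if rows:
        conn.execute(
            "INSERT OR REPLACE INTO etl_watermark (key, value) VALUES (?, ?)",
            (WATERMARK_KEY, rows[-1][2]),
        )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE etl_watermark (key TEXT PRIMARY KEY, value TEXT);
        CREATE TABLE src_orders (id INTEGER, amount REAL, last_updated TEXT);
        CREATE TABLE tgt_orders (id INTEGER PRIMARY KEY, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0, '2013-02-01 09:00:00');
    """)
    incremental_load(conn)
    print(conn.execute("SELECT * FROM tgt_orders").fetchall())  # [(1, 10.0)]
```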
6. Project Name : Clarity ETL (October 2011 – December 2012)
Application Name : Clarity
Client : USA healthcare client
ETL Tool/Technology : Epic's Clarity, Teradata, UNIX
Team Size : 7
Upgraded (to Epic 2010) and maintained a multi-deployment Clarity environment that processed approximately 20 TB of data daily. The project was a healthcare data warehouse product implementation (Epic's Clarity) for a healthcare major. Clarity provided the backbone of the client's reporting capability on day-to-day medical facility operations from an RDBMS.
Role: Development Lead
Responsibilities:
• Installed, implemented, and tested complete Clarity environments (i.e., Console, Compass, Hyperspace, and the Teradata database configuration of Epic tools) for multiple Clarity non-production and production environments.
• Responsible for designing Clarity environments and ETL solutions for release and version upgrades; performed analysis between release and version levels to identify and document differences.
• Worked on Epic technical components such as the Caché database and Datalink.
• Investigated and resolved data quality issues raised by the business community.
7. Project Name : Component Engine (October 2010 – September 2011)
Application Name : Merlin
Client : UK telecom client
ETL Tool/Technology : Oracle PL/SQL, UNIX
Team Size : 4
Our team built a generic component engine in Oracle and UNIX, configurable by business clients through a web interface to calculate new component measures. The component engine was used to populate multiple fact tables exposed to BI reports. This project is the backbone of the client's business reporting and of analysing efficiency and performance for its call centre employees.
Role: Dev Team Member
Responsibilities:
• Involved in requirements gathering and analysis with the delivery team and client.
• Created the PL/SQL procedures through which the component engine was implemented.
• Created a UNIX shell script to control the number of processes invoked for the component engine and to maintain execution order per the configurable inter-component dependencies (see the sketch after this list).
• Production implementation support.
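A minimal Python sketch of the process-control idea in the shell-script bullet above: run components in dependency order while capping how many run at once. The component names, dependency map, and worker cap are illustrative assumptions (the original was a UNIX shell script, so this translates the technique rather than the code), and the dependency graph is assumed to be acyclic.

```python
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

# Hypothetical inter-component dependencies: component -> prerequisites.
DEPS = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
MAX_PARALLEL = 2  # cap on how many components run at once

def run_component(name):
    print("running component", name)  # real work would go here

completed, pending = set(), set(DEPS)
with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
    running = {}
    while pending or running:
        # Submit every component whose prerequisites are all complete
        # (assumes the dependency graph has no cycles).
        ready = [n for n in pending if all(d in completed for d in DEPS[n])]
        for name in ready:
            pending.discard(name)
            running[pool.submit(run_component, name)] = name
        # Wait for at least one running component to finish, then record it.
        finished, _ = wait(list(running), return_when=FIRST_COMPLETED)
        for fut in finished:
            fut.result()  # surface any failure instead of silently continuing
            completed.add(running.pop(fut))
```

The executor's worker cap plays the role of the shell script's limit on the number of processes invoked, and the ready-set loop plays the role of its configurable execution ordering.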
8. Project Name : Closed Loop Marketing & Email Repository (November 2009 – September 2010)
Application Name : EDI (Enterprise Data Integration)
Client : UK telecom client
ETL Tool/Technology : Informatica PowerCenter, Oracle, UNIX
Team Size : 5
This project was a new initiative by the client's business to make customer response data from telemarketing campaigns available alongside the marketing campaign source data, so that customer feedback could be accommodated in future campaigns. The technologies involved were Informatica for ETL and Siebel Analytics for business reporting and marketing campaigns.
The Email Repository was an initiative by the client's business to consolidate email data, in a more cleansed form than the various distributed sources, in one place along with consent information.
Role: Developer
Responsibilities:
• Developed mappings of medium complexity to pull campaign marketing and customer feedback data from Siebel Marketing systems, and customer data from the EDW, into the data warehouse.
• Used Informatica version control to check in all versions of the objects used in creating the mappings and workflows, keeping track of changes across development, test, and production.
• Worked with Source Qualifier, Aggregator, Expression, Lookup, Filter, Router, Rank, Sequence Generator, and Update Strategy transformations.
• Involved in Informatica code deployments to all environments.
• Unit testing.
9. Project Name : Siebel Marketing (July 2009 – October 2009)
Application Name : EDI (Enterprise Data Integration)
Client : UK telecom client
ETL Tool/Technology : Informatica PowerCenter, Oracle, UNIX, QC
Team Size : 9
Our team provided value-added services to the client's business with expertise in OBI and ETL design. I worked as a functional tester for Informatica ETL mappings and PL/SQL code, testing the ETL workflows for a couple of fact and dimension tables' data on the database BI layer.
Role: Application System Tester
Responsibilities:
• Involved in requirements gathering and analysis with the delivery team and client.
• Performed functional testing of Informatica maps and maintained quality gates.
• Created automation scripts for regression testing.
• Implementation support.
10. Project Name : TSR Migration, EDW (July 2008 – June 2009)
Application Name : Retail EDW (Enterprise Data Warehouse)
Client : UK telecom client
ETL Tool/Technology : OWB (Oracle Warehouse Builder), Oracle, UNIX, QC
Team Size : 5
The team worked on many aspects of quality assurance for data warehouse enhancements; I was involved in the system testing part of the delivery cycle. One of the major projects where I got the opportunity to work as a tester was TSR Migration, developed to migrate all UK business customer data from the classic stack (CSS) to the new strategic stack (One Siebel & Geneva) of the UK Business Single Solution (UKBSS), achieving the physical separation of data required by the transformational and regulatory (TSR) requirements for BT UK Business.
The overall migration is broadly divided into two main processes: Data Analysis (using OBI) and Data Migration (using ODI). The data analysis process identifies the set of billing accounts that are ready for migration in a particular release. The data migration process, in essence, extracts data from legacy systems such as IRS and SIM, validates the data against multiple exclusion and validation rules, transforms the data using product and field mappings, generates load files, and sends them to intermediate systems that load the transformed data into strategic systems such as One Siebel and Geneva.
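A hedged sketch of that extract-validate-transform-generate shape in Python. The record fields, the two sample rules, and the pipe-delimited load-file format are assumptions for illustration; the actual exclusion/validation rules and file formats were client-specific.

```python
import csv
import io

# Hypothetical exclusion/validation rules: each returns an error string or None.
def rule_has_account(rec):
    return None if rec.get("account_id") else "missing account_id"

def rule_active_only(rec):
    return None if rec.get("status") == "ACTIVE" else "account not active"

RULES = [rule_has_account, rule_active_only]

# Hypothetical product mapping from legacy codes to strategic-stack codes.
PRODUCT_MAP = {"LEG-01": "STR-100", "LEG-02": "STR-200"}

def migrate(records):
    """Validate, transform, and render a load file; return (file, rejects)."""
    accepted, rejects = [], []
    for rec in records:
        errors = [e for e in (rule(rec) for rule in RULES) if e]
        if errors:
            rejects.append((rec, errors))      # excluded from this release
            continue
        rec = dict(rec, product=PRODUCT_MAP.get(rec["product"], rec["product"]))
        accepted.append(rec)

    buf = io.StringIO()                        # generate the load file
    writer = csv.DictWriter(buf, fieldnames=["account_id", "product", "status"],
                            delimiter="|")
    writer.writeheader()
    writer.writerows(accepted)
    return buf.getvalue(), rejects

if __name__ == "__main__":
    rows = [{"account_id": "A1", "product": "LEG-01", "status": "ACTIVE"},
            {"account_id": "",   "product": "LEG-02", "status": "ACTIVE"}]
    load_file, rejected = migrate(rows)
    print(load_file)
    print("rejected:", rejected)
```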
Role: Application System Tester
Responsibilities:
• Involved in test approach analysis with delivery team members and the client.
• Performed system testing and maintained quality gates.
• Created automation scripts for regression testing.
• Implementation support.
11. Project Name : NTM (December 2006 – May 2008)
Client : Qwest Communications
ETL Tool/Technology : Remedy Admin, Remedy User, Core Java
Team Size : 5
The team worked on network trouble ticketing; the main tools we had the opportunity to work with were AR System Remedy Admin, Remedy User, and QC. The project developed a ticketing system capable of registering trouble both from manual complaints and proactively by monitoring network element parameters. Java was used to implement complex business logic that was not possible using the Remedy Admin tool's features.
Role: Developer
Responsibilities:
• Understand requirements from business/IT users or the onshore counterpart.
• Provide a way to implement the requirements in AR Remedy Admin.
• Unit test the solution.
• Provide support for system testing.
• Implementation Support.
Tools Developed:
• UNIX utility using OMB+ to create OWB, PL/SQL, and UNIX auto-deployment scripts.
• PL/SQL packages to generate data for testing purposes.
• Macro-based Excel workbooks to load data into an Oracle database and retrieve it.
• XML parser to generate a CSV file with Clarity execution stats, along with performance graphs (see the sketch after this list).
• Multiple project-specific UNIX shell scripts to automate processes.
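As a hedged illustration of the XML-to-CSV parser above: the element and attribute names (run, job, start, seconds) are assumptions for illustration, since the real Clarity execution-stats schema is Epic-specific; the emitted CSV is what a charting tool would then consume for the performance graphs.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical execution-stats XML; the real Clarity schema differs.
SAMPLE = """
<runs>
  <run job="LOAD_PATIENT" start="2012-06-01T02:00:00" seconds="845"/>
  <run job="LOAD_ENCOUNTER" start="2012-06-01T02:15:00" seconds="1290"/>
</runs>
"""

def stats_to_csv(xml_text):
    """Flatten <run> elements into CSV rows of job/start/seconds."""
    root = ET.fromstring(xml_text)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["job", "start", "seconds"])
    for run in root.iter("run"):
        writer.writerow([run.get("job"), run.get("start"), run.get("seconds")])
    return buf.getvalue()

if __name__ == "__main__":
    print(stats_to_csv(SAMPLE))
```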
Skills
• Good analytical skills and very quick learning ability.
• Multidimensional thinking and problem-solving skills.
• Capable of handling multiple responsibilities.
• Good communication and interpersonal skills.
• Experience leading technical teams and guiding each individual's goals toward achieving the end solution.
Personal Profile
Name : Amit Kumar
Date of Birth : 15 Jan 1984
Marital Status : Married
Nationality : Indian
Work visa : H-1B (valid until July 2018)
Languages Known : Hindi, English
Present Address : Houston, TX, USA
Declaration
I hereby declare that the information furnished above is true and correct to the best of my knowledge.
Place : Houston, TX, USA Amit Kumar
Date : July 15, 2016