Tara Prasad Panda
Flat no. C-301
Icon Linera
Bhumkar Chowk, Pune
Mobile: +91 8796543968
E-Mail: panda.tara@gmail.com
CAREER OBJECTIVE
To build a career that offers growth and opportunities to enrich my skills while contributing
my best to the organization I work with.
PROFESSIONAL SUMMARY
 2 years of experience in data warehousing and Informatica, including mapping creation and design
 Extensive hands-on experience in Informatica 9.6 and Informatica ILM (Information Lifecycle Management)
 Good experience with Oracle 10g and Microsoft SQL Server
 Worked with different data sources such as databases, flat files and MQ
 Knowledge of DEV, UAT and PROD environments
 Experience with dimensional modeling using star and snowflake schemas
 Created UNIX shell scripts to run Informatica workflows and control the ETL flow (see the sketch after this list)
 Performed performance tuning at the session level to maximize data throughput
 Developed a generic query for partitioning at the session level
 Built reusable transformations and mappings wherever logic was repeated
 Performed unit testing and maintained test logs and test cases for all mappings
 Parameterized hard-coded values at the session and mapping levels
 Good knowledge of the Hadoop ecosystem, especially Pig and Hive
 Assisted other ETL developers in solving complex scenarios and coordinated with
source-system owners on day-to-day ETL progress monitoring
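As a hedged illustration of the shell-script bullet above, the sketch below (in Python, standing in
for the actual UNIX wrapper scripts) shows the kind of logic involved in launching a workflow and
checking its result. The pmcmd client is Informatica's standard command-line tool, but every value
used here (service, domain, user, folder, workflow name) is a placeholder, not the project's actual
configuration.

    import subprocess

    # Hypothetical sketch only: all connection values below are placeholders.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",         # integration service (placeholder)
        "-d", "Domain_Dev",           # domain name (placeholder)
        "-u", "etl_user",             # repository user (placeholder)
        "-p", "********",             # password, normally read from a secure store
        "-f", "MASKING_FOLDER",       # repository folder (placeholder)
        "-wait", "wf_mask_accounts",  # workflow to run (placeholder)
    ]

    result = subprocess.run(cmd)
    if result.returncode != 0:
        raise SystemExit("Workflow failed with exit code %d" % result.returncode)
    print("Workflow completed successfully")

In practice the real wrappers also controlled ordering and restartability of the ETL flow; the
sketch only shows the launch-and-check step.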
EXPERIENCE SUMMARY
 Working as a Senior Software Engineer at Capgemini India Pvt. Ltd. from November 2014 to date.
ACADEMIC DETAILS
EXAMINATION | SPECIALIZATION | INSTITUTION | BOARD / UNIVERSITY | YEAR OF PASSING | PERCENTAGE / CGPA
B.Tech | Electronics and Communication Engineering | National Institute of Science and Technology | BPUT | 2014 | 8.7/10
12th | Science | Kendriya Vidyalaya, Bhubaneswar | CBSE | 2010 | 83%
10th | - | Kendriya Vidyalaya, Bhubaneswar | CBSE | 2008 | 91.6%
TECHNICAL SKILLS
 ETL Tools:
 Informatica PowerCenter 9.6
 Informatica ILM 9.x
 Databases:
 Oracle 10g/11g
 SQL Server 2008
 Programming Languages:
 SQL, Pig, Hive
 Operating Systems:
 UNIX (basic shell scripting)
 MS-DOS, Windows XP/7/8/10
 Tools:
 Autosys
 Toad
 SQL Server Management Studio
PROJECT EXPERIENCE
Project 1: Data Masking
Organization: Capgemini
Client: Barclays Capital
Duration: 1.9 years (Feb 2014 to date)
Summary:
Protecting personal data is one of the most challenging tasks for today's banking
organizations, and most data leakages occur from non-production environments. To address this, we
provided a data masking solution that masks all sensitive personal information before it is
migrated to non-production environments. We masked all types of sources, such as databases, flat
files, XMLs and MQs, and designed standard mapplets to mask information such as full name, user
identification number, credit card number and account number.
Challenges:
Integrity is the most challenging aspect of masking: a particular account number must be
masked with the same dummy but valid account number throughout all tables and applications. To
achieve this we designed secure key and unique key mapplets. The secure key mapplet preserves
integrity, while the unique key mapplet randomly masks data with unique values.
Another challenge was to develop the process for running Informatica jobs in a way that keeps
the masking logic confidential, so that no one can easily work out how the data is masked. We used
Autosys and UNIX scripts for running the Informatica workflows.
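A minimal sketch of the secure key idea follows, assuming a keyed hash is an acceptable stand-in
for the actual mapplet logic (which the project implemented in PowerCenter, not Python); the key,
output length and function name below are illustrative.

    import hmac
    import hashlib

    # Illustrative secret key; in practice this would be managed securely.
    MASKING_KEY = b"replace-with-a-managed-secret"

    def mask_account_number(account_number: str, length: int = 10) -> str:
        """Deterministically map an account number to a dummy numeric value.

        The same input always yields the same output, so the masked value stays
        consistent across every table and application (integrity), while the
        original value cannot be read back from the masked output.
        """
        digest = hmac.new(MASKING_KEY, account_number.encode(), hashlib.sha256).hexdigest()
        return str(int(digest, 16))[-length:].zfill(length)

    # Consistency check: identical inputs mask to identical outputs.
    assert mask_account_number("4001234567") == mask_account_number("4001234567")
    print(mask_account_number("4001234567"))

The unique key behaviour would instead assign a random unique value per row, trading integrity for
stronger unlinkability.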
Responsibilities:
 Mostly engaged in development activities, taking end-to-end responsibility for an application;
so far I have masked approximately 10 applications. The full cycle involves requirement
gathering, documentation, plan set-up, development, unit testing, sign-off and code migration
to UAT and then to PROD.
 Understanding business requirements.
 To import sources from different data sources such as databases, flat files and MQ.
 To generate code through Informatica ILM, a widely used tool for generating Informatica code.
 To standardize code with the help of UNIX scripts.
 To prepare Autosys jobs, taking care of dependencies. Autosys is a third-party scheduling tool
widely used to run Informatica workflows with dependencies.
 To prepare mappings for complex code; the mapping logic involves calculations, implementation
of algorithms and handling integrity, especially for many-to-many relationships.
 Developed mappings that extract, transform and load source data into the derived master schema,
using various PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router,
Sequence Generator, Lookup, Rank, Joiner, Expression, XML Parser, XML Generator, Transaction
Control, Normalizer and Update Strategy to implement the business logic in the mappings.
 To develop mapplets. We mostly create mapplets because the masking logic for particular
columns is the same (e.g. account numbers and credit card numbers have almost the same masking
logic), so it is more practical to reuse a mapplet than to develop the code in each mapping.
 To develop workflows and sessions; workflows may be sequential, parallel, with dependencies, etc.
 Performed partitioning at the session level to increase throughput (see the sketch after this list).
 To prepare deployment and run book documents. The deployment document contains all the
information needed to migrate the code to PROD or go live; run books guide end users on how to
run jobs and prepare a masked copy of the databases.
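A hedged sketch of the partitioning idea referenced above: each session partition reads only the
rows whose key hashes to its partition number, roughly what a generic source filter such as
MOD(hash(key), $$NUM_PARTITIONS) = $$PARTITION_ID achieves. The column, parameter and sample values
here are illustrative, not the project's actual ones.

    from zlib import crc32

    NUM_PARTITIONS = 4  # illustrative partition count

    def partition_id(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
        """Map a row key to a stable partition number so each session partition
        processes a disjoint, roughly even slice of the data."""
        return crc32(key.encode("utf-8")) % num_partitions

    # Example: splitting a handful of account keys across 4 partitions.
    rows = ["ACC1001", "ACC1002", "ACC1003", "ACC1004", "ACC1005", "ACC1006"]
    for pid in range(NUM_PARTITIONS):
        print(pid, [r for r in rows if partition_id(r) == pid])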
ACHIEVEMENTS / EXTRA-CURRICULAR ACTIVITIES
 Awarded Rising Star for H1 2016.
 Attended Cloudera Hadoop training.
 Attended Pig and Hive classroom trainings and did a mini project in Pig.
 Attended a Manage by Metrics training session.
 Attended software configuration management training.
PERSONAL PROFILE
Name : Tara Prasad Panda
Date of Birth : 10th July 1992
Nationality : Indian
Gender : Male
Marital Status : Single
Hobbies : Playing Table Tennis and reading tech news.
Languages known : English, Hindi and Odiya
Declaration:-
I hereby declare that the information furnished above is true to the best of my knowledge and belief.
Date:
Place: Tara Prasad Panda