CHANDAN DAS
Contact: +91 9421789997, Wakad, Pune
Email: chandandas64@gmail.com
CAREER OBJECTIVES
Dedicated IT professional seeking a developer/designer role with a leading, high-tech organization,
applying the expertise and experience gained in this field to deliver complex projects with efficiency
and excellence.
A resourceful problem finder and troubleshooter who identifies errors early to avoid time and cost
overruns. An effective leader with a record of coordinating internal and external teams to meet and
surpass expectations, and a motivated, enthusiastic professional capable of delivering complex
projects that offer scope for learning and challenge.
PROFESSIONAL EXPERIENCE
 Over seven years of IT experience implementing OLTP and OLAP data warehousing projects
with Teradata, Oracle, and ETL tools.
 Good knowledge of the Hadoop ecosystem (HDFS, Pig, Hive, Spark, HBase, Sqoop, Flume,
Oozie, ZooKeeper) for scalable, distributed, high-performance computing.
 Completed sample projects using Pig, Hive, Sqoop, and HBase.
 Working experience with Cloudera and Apache Hadoop.
 Able to set up a pseudo-distributed Apache Hadoop cluster, and to build a multi-node
test cluster on a single machine using VMware (see the setup sketch after this list).
 Extensive experience with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad
to export and load data to and from different source systems, including flat files.
 Hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, and
Teradata SQL Assistant.
 Experience in Oracle 9i database administration and performance tuning.
 Good knowledge of UNIX shell scripting.
 In-depth hands-on experience in database and ETL/ELT design and development, with strong
data analysis skills.
 Involved in the analysis, design, testing, and implementation of UNIX scripts.
 Complete domain and development life-cycle knowledge of data warehousing and client-server
concepts, together with basic data modeling.
 Fast learner with strong analytical-thinking, decision-making, and problem-solving skills.
 Experience working with both 3NF and dimensional models for data warehouses, and a good
understanding of OLAP/OLTP systems.
 Experience in data migration projects.
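For reference, bringing up such a pseudo-distributed node typically looks like the minimal shell sketch below; the install path and Hadoop 2.x layout are assumptions, not the exact environment used on the projects described here.

#!/bin/bash
# Pseudo-distributed bring-up sketch: one machine acts as NameNode, DataNode,
# ResourceManager and NodeManager. Assumes Hadoop 2.x is unpacked at /opt/hadoop
# and core-site.xml / hdfs-site.xml already set fs.defaultFS=hdfs://localhost:9000
# and dfs.replication=1 (placeholder paths and values).
export HADOOP_HOME=/opt/hadoop
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

hdfs namenode -format -force        # one-time format of the local NameNode
start-dfs.sh                        # start NameNode, DataNode, SecondaryNameNode
start-yarn.sh                       # start ResourceManager and NodeManager
jps                                 # confirm the daemons are running
hdfs dfs -mkdir -p /user/$(whoami)  # create an HDFS home directory for testing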
TECHNICAL SKILLSET
 Cloudera Hadoop, HDFS, Pig, Hive, HBase, Sqoop, Oozie, ZooKeeper
 Teradata (MultiLoad, FastLoad, BTEQ, TPump)
 Oracle 9i/10g DBA and developer: SQL, PL/SQL, performance tuning
 UNIX, shell scripting
 Working knowledge of Python
 Working knowledge of Ab Initio, Informatica, TWS, and mainframe
KEY ACHIEVEMENTS:
 Appreciated by clients UBA (Lagos, Nigeria) and Andhra Bank (HP, Tokyo, Japan) for improving
application performance.
 ITIL foundation certified.
 Onsite experience in Nigeria and Japan.
 Received the Best Performer award at Barclays Technology Centre India Pvt Ltd.
 Certified by Edureka as a Hadoop / Big Data developer.
 Certified in Informix at IBM.
PROJECTS COMPLETED:
COMPANY: IBM INDIA PVT LTD, DEC 2013 – PRESENT
DESIGNATION: APPLICATION DEVELOPER
CURRENT PROJECT:
Project Title: Application Development for Vodafone (Germany), System Testing for
Data Migration (Vodafone UK), and a Big Data POC
Project Description:
The application consists of multiple subsystems, which we enhance according to the
requirements received from the client.
Responsibility (application development and system testing for data migration):
 Writing BTEQ, MultiLoad, FastLoad, and TPump scripts (a sample load-script sketch follows this list).
 Writing Oracle SQL and PL/SQL scripts in TOAD.
 Coding, unit testing, bug fixing, system testing in QC, and troubleshooting.
 Preparing Visio diagrams, implementation plans, and system documents.
 Ensuring ongoing operational functionality and efficiency of the application.
 Ensuring data is loaded into the DWH according to the business rules and mapping documents
provided by the ETL designer.
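For illustration only, the kind of shell-driven BTEQ load script involved might look like the minimal sketch below; the logon, file, and table names are hypothetical placeholders, not the actual Vodafone project objects.

#!/bin/ksh
# Minimal BTEQ sketch: load a staging table from a delimited flat file.
# tdprod, etl_user, STG.CUSTOMER_STG and /data/in/customer.csv are placeholders.
bteq <<EOF > load_customer_stg.log 2>&1
.LOGON tdprod/etl_user,password;
.IMPORT VARTEXT ',' FILE = /data/in/customer.csv;
.QUIET ON;
.REPEAT *
USING (cust_id VARCHAR(20), cust_name VARCHAR(100))
INSERT INTO STG.CUSTOMER_STG (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF;
.QUIT;
EOF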
Project Description (Big Data POC):
Multiple source systems across different applications supply data as CSV files. Pig scripts
format those files to match the Hive schema in the presentation layer. The files are then
loaded into Hive external and managed tables, which form the data layer. Hive queries
prepare the data for the aggregate layer, and Sqoop extracts it and loads it into the
Teradata DWH for analysis.
Responsibility:
 Writing Pig scripts to format the data.
 Writing Hive scripts for the aggregation layer.
 Writing HBase scripts to transfer Pig and Hive output.
 Coding, unit testing, bug fixing, system testing in QC, and troubleshooting.
 Writing Sqoop scripts to extract data from Hive and load it into the Teradata DWH (a
pipeline sketch follows this list).
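A hypothetical shell driver for this CSV-to-Pig-to-Hive-to-Teradata flow is sketched below; all script, table, path, and connection names are invented placeholders, and the Sqoop export assumes the Teradata JDBC driver is available to Sqoop on the cluster.

#!/bin/bash
# Placeholder driver: format raw CSVs with Pig, aggregate in Hive, export to Teradata.
set -e

# 1. Format the raw CSV files to match the Hive schema (presentation layer).
pig -param INPUT=/landing/csv -param OUTPUT=/presentation/formatted format_feed.pig

# 2. Refresh the aggregate layer; data_layer.feed_ext is an external table over
#    /presentation/formatted (data layer).
hive -e "
INSERT OVERWRITE TABLE aggregate_layer.feed_daily
SELECT feed_date, source_system, COUNT(*) AS row_cnt
FROM data_layer.feed_ext
GROUP BY feed_date, source_system;"

# 3. Export the aggregate to the Teradata DWH.
sqoop export \
  --connect jdbc:teradata://tdprod/DATABASE=DWH \
  --username etl_user --password-file /secure/td.pass \
  --table FEED_DAILY \
  --export-dir /user/hive/warehouse/aggregate_layer.db/feed_daily \
  --input-fields-terminated-by '\001'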
COMPANY: BARCLAYS TECHNOLOGY CENTRE INDIA PVT LTD, MARCH 2011- DEC 2013
DESIGNATION: ANALYST PROGRAMMER
PROJECT NAME:
RMDB, Shares takes, Navigator MI, RCV (Retail Customer View), MVS (Mobile Validation System),
FIN-ETL
Software/tools/technology used: Teradata (MultiLoad, FastLoad, BTEQ), Oracle 10g, UNIX,
Ab Initio, Cognos, mainframe, and TWS
Project Description:
All six applications are in support-and-enhancement mode, so the work covered support
requests and small-change queries.
Responsibility:
 Managing and coordinating with the development, deployment, and support teams.
 Supporting the Teradata database.
 Writing BTEQ, MultiLoad, FastLoad, and TPump scripts.
 Writing Oracle SQL and PL/SQL scripts for small changes and enhancements.
 Leading the combined Barclays and vendor team.
 BAU activities.
 Documenting implementation plans.
 Helping the team with release work and DB changes.
 Interacting with Stakeholders.
COMPANY: PERSISTENT SYSTEMS LTD, FEB 2010 – MAR 2011
DESIGNATION: SENIOR SOFTWARE ENGINEER
PROJECT NAME: Agilent Access Fiber
Project Description:
Access Fiber is an Agilent Technologies application for tracing the fiber-optic connection
between two geo-marked points and setting up the associated routes, links, and related
configuration.
Responsibility:
 Oracle 9i and 8i DBA activities.
 Solving issues arising from the database as well as from the application.
 Writing SQL and PL/SQL for reporting.
 Helping the testing and development teams with the migration process.
 Creating staging areas, taking backups, and writing procedures and functions for migration
(a backup-and-migrate sketch follows this list).
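As a rough illustration of the backup-and-migrate step, a shell sketch using the classic Oracle export utility and SQL*Plus is shown below; the schema, file, and procedure names are invented placeholders, not the actual Agilent project objects.

#!/bin/sh
# Hypothetical backup-then-migrate step for an Oracle 8i/9i schema.
# ACCESS_FIBER, stg_pkg.load_staging and the file names are placeholders.

# 1. Take a logical backup of the source schema with the classic export utility.
exp system/password owner=ACCESS_FIBER file=access_fiber.dmp log=access_fiber_exp.log

# 2. Run the migration procedure that populates the staging area.
sqlplus -s migrate_user/password <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
EXEC stg_pkg.load_staging;
EXIT
EOF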
COMPANY: BEST OF BREED SOFTWARE SOLUTION PVT LTD, MAY 2007- FEB 2010
DESIGNATION: APPLICATION SUPPORT / IMPLEMENTATION ENGINEER
PROJECTS:
 Benchmark for Andhra Bank, for HP, Tokyo, Japan.
 Performance Tuning in UBA (United Bank for Africa), Nigeria.
 HDFC & CBOP Merger.
 OBC Data Migration.
 ICICI Data Center Merger
Responsibility:
 Various Oracle 9i DBA activities, such as taking backups, indexing, creating and resizing
data files and log files, resolving space problems, and monitoring.
 Generating query execution plans, and analyzing and tuning queries (see the explain-plan
sketch after this list).
 Tuning the Finacle and FinnOne interfaces and Finacle applications.
 Assisting the IBM AIX admin in tuning the web, application, and DB servers to suit the
application setup, and assisting the DBA in tuning the database for the Finacle application.
 Supporting HP's LoadRunner performance team and analyzing HP servers.
 Oracle database performance tuning to increase DB efficiency.
 Analyzing objects and performing Finacle application tuning.
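To illustrate the query-plan work, a minimal shell/SQL*Plus sketch of capturing and reading an execution plan is shown below; the user, query, and table are placeholders rather than Finacle objects.

#!/bin/sh
# Illustrative explain-plan check; appuser and daily_txn are placeholders.
sqlplus -s appuser/password <<'EOF'
SET LINESIZE 200 PAGESIZE 100
-- Capture the optimizer's plan for a candidate query.
EXPLAIN PLAN FOR
  SELECT account_id, SUM(txn_amount)
  FROM   daily_txn
  WHERE  txn_date = TRUNC(SYSDATE)
  GROUP  BY account_id;
-- Read the plan; full scans and high-cost steps are tuning candidates.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT
EOF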
EDUCATION
Bachelor of Engineering - (Electronics & Communication) - 2006
C.V. Raman College of Engineering, Utkal University, Orissa
TRAINING:
Infosys training on Finacle and the Finacle database for the Data Center Merger, Bangalore - 2007
PERSONAL DETAILS:
Date of Birth: 06th June 1984
Marital Status: Married
Languages Known: English, Oriya, Hindi, and Bengali
Nationality: INDIAN
Passport: G6511053
Address: S/o Trilochan Das
Plot No: 29/A, BJB Nagar, Kalpana Area, Bhubaneswar, Pin-751014
Dist.: Khurda, Odisha.
