PURNACHANDRA CH
Mobile: +91-9840380143 / E-Mail: chennareddypurna@gmail.com
IT PROFESSIONAL - HADOOP DEVELOPER & PROJECT LEAD
Targeting assignments in Hadoop Development, Big Data Solutions & Project Execution with a reputed IT organisation to deliver
solutions for complex technical requirements and for transforming large & unruly data sets into competitive advantages
PROFILE SUMMARY
MCA offering nearly 11 years of rich experience in Project Execution, Software Development (SDLC) in Oracle, PL/SQL including 2+
years of experience in Hadoop Development & Testing and Big Data Ecosystems; expertise entails:
• Project Planning/Execution • Software Development/Testing (UAT) • Hadoop Development/Ecosystems
• Big Data Solutions and Analysis • Hive Tables Creation/Technical Support • Incident & Release Management
• Oracle, PL/SQL Programming • Build, Deployment & Change Management • Data Import & Export
• Client Relationship Management • Reporting/SLAs & TAT Compliances • Team Management/Trainings
• Effectively supported the complete Software Development Lifecycle (SDLC) including requirement analysis, design, coding, testing,
UAT, production support, change & release management; acquired expertise in Trade Financing & Financial Services domains
• Familiar with the concepts of Project Execution, from effort estimation and risk analysis to QA; successfully executed Application
Design, Development, Production Support & Maintenance Projects for leading Global clients like Citibank & Presto Cards
• Skilled in designing, implementing and improving analytic solutions for Big Data on Hadoop, Hive & MapReduce
• Proficiency in developing & updating Hadoop software applications using MapReduce and troubleshooting problems related to
Hadoop jobs; well versed in writing Pig scripts and Hive queries as per requirements
• Expertise in Big Data Ecosystem tools like Sqoop & Flume for importing & exporting large volumes of data between the Hadoop
ecosystem & external systems; effectively managed structured and unstructured data using Pig Latin and HiveQL
• Proven talent in effectively interfacing with end-users, identifying functional/technical gaps, custom designing IT solutions &
resolving clients' queries; acquired international experience by working onsite in Singapore for 2 years
• Customer-Centric Leader with skills in leading & guiding team members and enabling knowledge-sharing amongst them;
comfortable working in a global teamwork environment with strong communication, problem-solving & analytical skills
TECHNICAL SKILLS
• Skills in processing large sets of structured, semi-structured & unstructured data and supporting systems application architecture
• Proficiency in developing data pipelines to import/export structured/unstructured datasets using Sqoop to move data in and out of
the Hadoop ecosystem; extensive exposure to Distributed Computing and Cluster Environment
• Acquired expertise in Business Analysis with in-depth knowledge of:
o Administration, configuration management, monitoring, debugging & performance tuning of Hadoop applications
o Big Data Technologies like HDFS, HUE, Sqoop, Pig, Hive and HBase
o HDFS Architecture and Cluster Concepts
o Hive Query Language and Debugging of Hive Issues
o Procedures, Packages and SQL Complex Queries
Operating Systems : Windows Families, Linux CentOS
Hadoop Eco System : HDFS, Hive, Pig, Sqoop
Databases : Oracle 10g and NoSQL (HBase)
Servers : Tomcat and WebSphere
Hadoop Environment : Cloudera Distribution and Apache Hadoop
Tools : TOAD, PL/SQL Developer, ServiceNow, Serena PVCS
Languages : Core Java, Pig Latin, SQL, HiveQL, PL/SQL, Pro*C
ORGANIZATIONAL EXPERIENCE
Since Dec’06 Polaris Consulting & Services Ltd., Chennai Senior Project Lead
Aug'05 to Dec’06 D-Soft India Pvt. Ltd., Chennai Software Engineer
Key Result Areas:
• Contributing to end-to-end project execution, involving project planning & scheduling
• Monitoring project progress as per scheduled deadlines and ensuring project completion within time & effort parameters
• Interacting with client & development team for requirement gathering, system study & analysis
• Working out the new system's requirements and test strategies and ensuring they meet the user specifications
• Supporting complete SDLC entailing design, development, Unit Testing, troubleshooting & debugging of the application
• Providing post-implementation, enhancement and maintenance support to client for application
• Utilising Hive & MapReduce programs to parse raw data, populate staging tables and store refined data in partitioned tables
• Creating HBase Tables to load large sets of structured, semi-structured & unstructured data coming from UNIX / other portfolios
• Coordinating with other internal teams to ensure data quality & availability
• Conducting Performance Testing and Quality Checks before sending jobs to Production Environment
• Ensuring high customer satisfaction metrics by achieving delivery & service quality norms
• Sustaining a dynamic environment that motivates high performance amongst team members
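As an illustration of the Hive workflow described above (raw data parsed into staging tables, then stored as refined data in partitioned tables), a minimal HiveQL sketch follows; the table names, columns and paths are hypothetical, not taken from the actual project:

```sql
-- Hypothetical staging table over raw delimited files already in HDFS
CREATE EXTERNAL TABLE stg_trades (
  trade_id STRING,
  branch   STRING,
  amount   DOUBLE,
  trade_dt STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/staging/trades';

-- Refined table, partitioned by trade date for efficient querying
CREATE TABLE trades_refined (
  trade_id STRING,
  branch   STRING,
  amount   DOUBLE
)
PARTITIONED BY (trade_dt STRING)
STORED AS ORC;

-- Populate the partitions dynamically from the staging table
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE trades_refined PARTITION (trade_dt)
SELECT trade_id, branch, amount, trade_dt
FROM stg_trades;
```

The external staging table leaves the raw files in place, while the partitioned ORC table gives downstream queries partition pruning on the date column.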
Ongoing Project:
Title: Trade Data Management
Client: Citibank, USA
Period: Jan’15 - Till Date
Hadoop Technologies: Pig, Hive, Sqoop, HDFS & MapReduce
Hadoop Environment: Cloudera Distribution
Role: Hadoop Developer-Testing
Description: Citi uses the TRIMS application worldwide to manage its Trade Financing System. The system provides extensive
facilities for managing trade functionality, such as Transaction Management (for the Import & Export side), Accounting, Messaging &
Doc Scanning. Citi moved towards the latest Big Data technical architecture to serve various technical queries, focusing on 5 Big Data
use cases - Fraud Detection & Security, Compliance & Regulatory, Customer Segmentation, Risk Management and Customer-based
Offerings. The aim is to gather customer insights from sources across the globe and heterogeneous applications in order to perform
analytics on unstructured/structured data in an efficient and reliable manner. Cornerstone goals entailed:
• Single Data Repository across Use Cases for Space Optimization
• Ensure Quality of data through strict Validation, Balancing & Controls
• Secure, Controlled and Monitored Access for Compliance and Auditing
• Full attention to Privacy constructs during data extraction
• Mandatory requirement for all use cases per audit and compliance
Responsibilities:
• Contributing to writing numerous scripts to handle data ingestion into HDFS from various data sources
• Importing and exporting data into HDFS and Hive using Sqoop
• Loading and transforming large data sets of structured, semi-structured and unstructured data
• Providing support in loading data from UNIX file system to HDFS
• Conducting Performance Testing and Quality Checks before sending jobs to Production Environment
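A representative Sqoop invocation for the import/export work described above is sketched below; the connection string, credentials, table and directory names are illustrative assumptions, not actual project details:

```shell
# Import an Oracle table into HDFS (hypothetical connection details)
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
  --username etl_user -P \
  --table TRADES \
  --target-dir /data/raw/trades \
  --num-mappers 4

# Export refined results from HDFS back to a relational table
sqoop export \
  --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
  --username etl_user -P \
  --table TRADE_SUMMARY \
  --export-dir /data/refined/trade_summary
```

Sqoop splits the import across the given number of mappers, so each mapper pulls a slice of the source table in parallel.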
ACADEMIC DETAILS
MCA from Sri Venkateswara University, Tirupathi, Andhra Pradesh in 2004
BA (MES) from Sri Venkateswara University, Tirupathi, Andhra Pradesh in 2000
PERSONAL DETAILS
Date of Birth: 31st March 1979
Address: No. 16-11/841, Opp. Airtel Tower, Haranathpuram 2nd Street, Nellore - 524003
Languages Known: English, Telugu & Hindi
Passport No.: Z2499641
Nationality: Indian
Location Preference: Chennai / Singapore ~ Industry Preference: IT
Please Refer to Annexure for Key Projects Undertaken
ANNEXURE
KEY PROJECTS EXECUTED
Title: Trade Data Management - POC
Client: Citibank, USA
Period: May 2014 – Dec 2014
Hadoop Technologies: Pig, Hive, Sqoop, HDFS, MapReduce
Hadoop Environment: Apache Hadoop
Role: Hadoop Developer
Description: Citi uses the TRIMS application worldwide to manage its Trade Financing System. The system provides extensive
facilities for managing trade functionality, such as Transaction Management (for the Import & Export side), Accounting, Messaging &
Doc Scanning. Citi moved towards the latest Big Data technical architecture to serve various technical queries. The migration was
implemented in country-wise phases. As part of the Big Data migration, generated specific reports based on user requirements
through Hadoop; moved the interface data to HDFS and then into Hive tables based on specific criteria. Cornerstone goals entailed:
• Single Data Repository across Use Cases for Space Optimization
• Ensure Quality of data through strict Validation, Balancing & Controls
• Secure, Controlled and Monitored Access for Compliance and Auditing
• Full attention to Privacy constructs during data extraction
• Mandatory requirement for all use cases per audit and compliance
Responsibilities:
• Contributed to writing numerous scripts to handle data ingestion into HDFS from the existing database
• Imported and exported data into HDFS and Hive using Sqoop
• Loaded and transformed large data sets of structured, semi-structured and unstructured data
• Supported the loading of data from UNIX file system to HDFS
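The UNIX-to-HDFS loading mentioned above typically amounts to standard `hadoop fs` shell commands; the local and HDFS paths below are illustrative only:

```shell
# Create the target HDFS directory (illustrative paths)
hadoop fs -mkdir -p /data/interface/incoming

# Copy an interface feed file from the local UNIX file system into HDFS
hadoop fs -put /opt/feeds/trades_20141201.csv /data/interface/incoming/

# Verify the load
hadoop fs -ls /data/interface/incoming
```

Once landed in HDFS, such files can be exposed to Hive as external tables or processed with Pig/MapReduce jobs.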
Title: Trade Record Information Management System (TRIMS), Citibank
Client: Citibank, APAC
Period: Aug 2010 – May 2014
Technologies: Oracle 11g, Pro*C, HP/AIX UNIX, PL/SQL
Role: Application Support & Development- Team Lead
Description: Trade Record Information Management System (TRIMS) is a comprehensive Global Trade Finance Division (GTFD) system
that processes transactions from initial registration to final liquidation for all GTFD products. It mainly revolves around Letters of
Credit (LC), Bills and other important documents. Maker/Checker modules have been developed in TRIMS across all functionalities to
ensure secure transaction processing. TRIMS is mainly divided into TRIMS Imaging and TRIMS Processing, which closely coordinate to
accomplish client requirements in the Trade business. TRIMS Processing provides a user-friendly interface to capture the different
details involved in a trade and to process the transaction as per the functionality needed by clients. It provides a charge calculation
module, which is driven by the database setup and user-modified parameters. It broadly divides trade products into i) Letters of
Credit and ii) Bills and Collections. The different modules and parties in the business are interconnected through various media such
as SWIFT messages, e-mails, NDM, etc. TRIMS coordinates with different partner systems, sending offline and online feed files, and
the system is also capable of online credit checks.
Responsibilities:
• Provided quick development fixes in the UAT and Production environments to reduce the impact on users
• Attended the:
o Global Status calls and prioritized the UAT release deployments
o Production Deployment Calls and finalized Production releases
• Prepared the Production Users Sanity Scope Plan and supported the Production User’s Sanity
• Imparted KT to the Team Members for the UAT/Production deployments as well as interacted with Users for Online Issues
• Monitored the UAT application for BAU support for users
• Assigned the EOD requests to Team and monitored the same
• Prioritized users' tickets (ServiceNow incidents) and reviewed the incidents based on severity
• Coordinated with the Dev. team to understand the client's requirements and provide fixes
Title: TRIMS – NA/EMEA UAT Support
Client: Citibank, NA/LATAM, EMEA
Period: Nov 2009 – Aug 2010
Technologies: Oracle 10g, UNIX (Solaris), Pro*C, PL/SQL
Role: Support - Lead
Description: Trade UAT support for the Trade Record Information Management System (TRIMS), NA-EMEA region. Supported the
team in BAU activities for UAT, including application SOD (Start of Day) & application health checks; supported UAT users on UAT
issues and advised them on user input errors in SDB (Static Database) setup. Ran application batches on users' request to help with
further cycles of testing; helped users with application-level issues and with DB & MQ upgrades at the application level. UAT support
coordinates with UAT downstream & partner systems, sending UAT offline and online batch files and collecting the report results.
Responsibilities:
• Monitored the health check for the application and managed tasks like:
o Running the EOD (Batch) based on users' request and sending the feed files to downstream systems
o Resolving the BAU issues and closing the Tickets
o Checking with users online for critical issues
o Supporting in Production deployment and Users Sanity
Title: TRIMS – Customer Online Payment Response
Client: Citibank, NA/LATAM
Period: Sep 2009 – Nov 2009
Technologies: Oracle 10g, Pro*C, PL/SQL
Role: Software Developer
Description: TRIMS sends online payment notifications for Docprep customers, based on customer details, through MQ. As part of
this enhancement, the destination point (receiving channel) was changed, with additional details and for specific customers.
Responsibilities:
• Conducted Analysis of the existing system and coordinated with the team to develop a common logic for the entire interface
• Provided support in:
o Program Specification and Test Case Documentation
o Developing the Payment module in Pro*C
o Code reviewing and all R&D related works for the project
o Unit Testing and Integration of all the interfaces
o Code check-in and SIT/QC promotion activities
o SIT/QC and UAT activity
Title: TRIMS – Canada GL Interface and Hogan Interface Development
Client: Citibank, NA/LATAM
Period: Apr 2009 – Sep 2009
Technologies: Oracle 10g, Pro*C, PL/SQL
Role: Software Developer
Description: TRIMS sends interface feeds to the HOGAN and Canada GL systems. As part of this enhancement, 2 new feeds - issuance
& amendment - had to be generated; these feeds provide Issuance, Amendment, Close, Cancel and Revaluation details. For the
Canada GL system, TRIMS sent the opening balance account details as part of EOD.
Responsibilities:
• Performed various functions like:
o Unit Testing and Integration of all the interfaces
o Code check-in and SIT/QC promotion activities
o SIT/QC and UAT activity
• Developed Pro*C modules to extract data with user interface capabilities
• Understood the Enhancement requirement
• Wrote the new events for generating the interfaces in Batch (EOD)
Title: TRIMS- Asset Sale Enhancement for NA/EMEA
Client: Citibank, NA/LATAM
Period: Aug 2008 - Apr 2009
Technologies: Oracle 10g, Pro*C, PL/SQL
Role: Software Developer
Description: As part of this enhancement, customers can share a Citibank loan; for those customers, Citibank pays Risk Participation
fees to share the risk. Based on this, the EOD feeds and other processing details were changed.
Responsibilities:
• Developed the new events based on enhancement requirements
• Modified the Accounting logics based on new enhancement changes
• Performed various functions like:
o Unit Test Case preparation and Testing of Asset Sale Module Components
o Architectural and Detail Design Documentation of Asset Sale Module
o Testing the enhancement
Title: Presto Retail Cards
Client: Presto Cards, Chile
Period: Dec 2006 - Jul 2008
Technologies: Oracle 10g, Pro*C, PL/SQL
Role: Software Developer
Description: The Presto Card project aimed at the development of reports based on the Intellect tool.
Responsibilities:
• Developed 250 Presto reports under the Reports module
• Interacted with BA for understanding the requirement and developed the Reports using Intellect Report Tool
• Contributed to:
o Unit Test Case preparation and Testing
o SIT & UAT Testing in Reports Module