ATHIRA MELE PATTADATH
Email: athiramp174@gmail.com
SUMMARY
Industry-certified Hadoop developer with 7+ years of experience in the software industry, including 6 years of Hadoop development experience
3+ years of experience as a Technical Lead
Domain experience in Retail Analytics, Hi-Tech, Banking, Telecom, and Insurance
Working experience with the Hortonworks, MapR, and Cloudera distributions
Experience building stream-processing systems using solutions such as Storm and Spark Streaming
Intermediate expertise in Scala programming
Strong understanding of and hands-on experience with distributed computing frameworks, particularly Apache Hadoop 2.0 (YARN, MapReduce, and HDFS) and associated technologies: Hive, Sqoop, Avro, Flume, Oozie, ZooKeeper, Hortonworks NiFi, etc.
Experience with NoSQL databases such as HBase, Cassandra, and MongoDB
Proficiency in Python scripting
Provided technical leadership in the Big Data space (Hadoop stack: MapReduce, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores such as Cassandra and HBase)
Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms
Driven significant technology initiatives end to end and across multiple layers of architecture
Provided strong technical leadership in adopting and contributing to open source technologies
related to BigData
Driven operational excellence through root cause analysis and continuous improvement for Big Data technologies and processes
Expert-level proficiency in Java.
Experience working within a Linux computing environment and with command-line tools, including knowledge of shell scripting for automating common tasks
Ability to work in a team in an Agile setting, with a clear understanding of how Git works
Experience in production deployment through Maven, Gradle, Git, etc.
Strong problem-solving, analytical, and decision-making skills
Able to effectively handle difficult and stressful situations with poise, tact, and patience, while demonstrating a sense of urgency
Can approach problems in an innovative manner, exhibiting the aptitude to think quickly
Excellent organizational skills with strong attention to detail, efficient time management, and the ability to prioritize work effectively
Strong collaborative skills and able to adjust approach to effectively interact with customers at all organizational and technical levels
Experience in developing and deploying project best practices, policies, procedures, and processes
Passionate about development and eager to expand influence in surrounding areas
Leadership, people management, coaching abilities
Experience in leading projects to ensure quality and on-time delivery, making technical recommendations to the product's functional owner when appropriate, and taking stakeholder input to come up with creative solutions of very high commercial quality
Managed and tracked the product backlog by working with the Product Manager to prioritize customer feedback, bug fixes, and the feature roadmap to rapidly iterate the product
Instructed and directed the software development team, formulating and defining system scope, integration requirements, and objectives based on user-defined needs
Led quality control tests and kept logs of problems, issues, changes, and future enhancements during the different phases of the development cycle
Supported the Product Manager, Business Owner, and other stakeholders in developing and implementing product launch plans to roll out products to technical and non-technical service delivery teams
Mentored and directed junior members of the team
Extensive experience in software tools architecture, development, and team leadership, including performance management
Experience leading Development teams in a continuous delivery, continuous integration, and
continuous testing environment
Education
Bachelor of Technology with specialization in Computer Science and Engineering, Model
Engineering College, Cochin University, Kochi, India, 2010.
Certifications
Cloudera Certified Developer for Apache Hadoop
MapR Certified Administrator
Areas/Applications
Retail Analytics
To migrate the traditional Oracle/Teradata databases and the ETL process to Hadoop.
Hi-Tech
- Big Data Framework
To analyze the data generated from the printers and build new marketing strategies, thereby regulating the production of the various printer models across the globe
Banking
- Big Data Framework
To identify the Party to Party Association map for the customers and to identify the
relations between the data available within the internal systems.
Telecom
- Big Data Product development
To provide a low-cost platform for storing and processing the huge volumes of CDR data and network probe data, providing various insights for the operations and marketing teams.
Insurance
- Re-engineering
To replace the existing VB 6.0 system with J2EE while retaining its functionality, database structure, and user interface look and feel
Career Profile
Since Jan 2015 TATA Consultancy Services
Title Enterprise Data Warehousing
Period Since Jan 2015
Client Name Walgreens
Position Technical Lead/Architect
Responsibilities
Understand the Business use cases and prepare requirement specification documents.
Collaborated with business analysts to devise technical solutions for initiating business
process change.
Design the basic approach and architecture of the system to meet the requirements.
Building stream-processing systems using solutions such as Spark Streaming
Conducting discussions with the client to understand the business requirements and constantly reviewing them
Implemented scope, scheme, priority and business goals in development activities
Studying existing application and carrying out the impact analysis for the new changes to
be incorporated.
Prepare the implementation strategy for requirements and execution plan.
Prepare the technical design documents (HBase: define the row key and column families; HBase data load; Hive)
Managing the resources – educating them on the design approach, distributing the tasks, and reviewing the code
Driving the build activities – committing to Git and monitoring the build plan.
Mentoring the new resources.
Leading the discussion with Hortonworks Distribution Team for Technical Solutions
Leading the Production Deployment.
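The HBase design work listed above (defining the row key and column families) can be illustrated with a minimal sketch. The salted-key scheme, bucket count, and field names here are hypothetical assumptions for illustration, not details from the Walgreens project.

```python
import hashlib

# Hypothetical bucket count; salting spreads writes across regions
SALT_BUCKETS = 16

def make_row_key(customer_id: str, txn_ts: str) -> str:
    """Build a salted HBase row key: <salt>|<customer_id>|<timestamp>.

    Prefixing a hash-derived salt avoids region hot-spotting when
    ingesting monotonically increasing timestamps, at the cost of
    fanning scans out across the salt buckets.
    """
    digest = hashlib.md5(customer_id.encode("utf-8")).hexdigest()
    salt = int(digest, 16) % SALT_BUCKETS
    return f"{salt:02d}|{customer_id}|{txn_ts}"

print(make_row_key("CUST-001", "20150101T120000"))
```

With this scheme, a scan for one customer issues one range scan per possible salt value, so the bucket count is a trade-off between write distribution and read fan-out.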
Project
Walgreens is one of the largest retailers in the US, with a primary business of pharmacy. Walgreens (WAG) is a convenience store and pharmacy chain in the United States that operates
more than 8,000 stores in 50 states and Puerto Rico. The Enterprise Data Warehousing Project
provides consistent, complete, integrated, accurate, and timely core business data to support
applications and information needs across the enterprise. The Enterprise Data Warehouse contains
key business data from across the enterprise, organized around the customer. EDW provides this data to business users and applications to meet their information needs. This data is kept for several years to gain visibility not only into current activities but also to analyze trends in the business as a whole. It
is a very powerful tool, but the combination of such a large amount of sensitive company data in a
single location will require some specific security restrictions.
The Project deals with a key corporate application group, i.e., EDW (Enterprise Data Warehouse) applications. It is basically a set of different business-critical applications which work in a sequence, and failure in one can cause the next one to fail too, thereby producing huge losses.
EDW refers to the enterprise data warehouse of Walgreens, with subject areas based on the different business segments of Walgreens. These subject areas comprise POS, Pharmacy, Photo, AARP, Epsilon, CDI, E-Commerce, Loyalty, etc. Data comes from multiple source systems and is transformed with the help of ETL: it is first loaded into the staging area, then transformed and loaded into the central Data Warehouse with its different subject areas. POS contains the point-of-sale transaction data captured at the counter, such as product codes/UPC numbers. Pharmacy contains transactional data related to patients, prescribers, drugs, and other related information. Photo contains the data related to photo attributes.
AARP provides data for retired citizens to be covered under special profiles. Epsilon provides marketing data to Walgreens. CDI provides information about the customer data. E-Commerce provides the transaction data generated at the online stores. The Loyalty program is a reward program in which loyal Walgreens customers are given points as per the business rules, which they can redeem while purchasing at WAG stores.
The Enterprise Data Warehouse currently contains five classifications of data, as it pertains to requesting access and the specific data usage restrictions:
Provide analysis capabilities to business, executive, and client leadership
Provide an information delivery architecture that is scalable and maintainable
Support iterative development and provide incremental functionality to the business
Provide the user community access to analytic data through user friendly front-end
reporting and analysis tools
Support the extension of existing source systems by integrating the corresponding data
directly into IDL
Hardware Intel
Operating System Linux CentOS
Languages Java
Tools Hadoop MapReduce, Hive, Pig, Oozie, Unix, Python, HBase
Special Software Eclipse IDE, Git, HDP 2.5, Gradle, Maven
CLUSTER DISTRIBUTION HDP 2.5
May 2013 – Jan 2015 TATA Consultancy Services
Title HP – Pony Express
Period May 2013-Jan 2015
Client Name Hewlett-Packard
Position Senior Hadoop Developer/Technical Lead
Responsibilities
Perform an overall analysis of the new/changed requirements and prepare the low-level design
Estimate time required for code completion
Analyze customer data loaded from multiple sources
Ensure all the developers understand the big picture.
Know the status of the developers' work and detect slippage
Managing the resources – Distributing the task
Reviewing the code against the design and standardizing it using coding standards and code-versioning techniques.
Analyzing the scope for optimization.
Scheduling the jobs using Oozie.
Performing code reviews and walkthroughs, and providing feedback to teammates for improvement.
Testing the application from a Big Data perspective.
Perform the causal analysis on the defects raised.
Ensure overall quality of all the deliverables.
Project
Hewlett-Packard is a manufacturer of PCs, laptops, and printers – inkjet, laser, and so on. In this project, starting with the inkjet printers as Phase 1, the details of prints made from various printers were made available. This data passes through a data filtering procedure and other volumetric calculations to find out the usage of these printers and the type of usage by the customers across geographic regions. This information helps in building marketing strategies and regulating the production of the various printer models. The cartridges installed in these printers were also analysed to check for malpractices performed on them.
The project was done on a MapR cluster. Web-based printers were also considered, where web services were used to store the printer-related data for analysis. In Phase 2, the details of the laser printers were to be analysed; this phase is in testing now.
Operating System Ubuntu / Red Hat Enterprise Linux Release 6.3
Tools Hive, PuTTY, Git, HBase, MapReduce
Special Software IPMS, Eclipse IDE
CLUSTER DISTRIBUTION MAPR
Project Location Kochi, India
DECEMBER 2012 – MAY 2013 TATA Consultancy Services
Title Relation Finder Application in a Big Data Framework to identify various relationships between customers by analysing multiple data feeds, in NextGen Solutions Kochi for Merrill Lynch, USA.
Period December 2012 – May 2013
Client Name Merrill Lynch, USA
Position Hadoop Developer
Responsibilities
• Develop UI screens to represent the relationships in a graphical model.
• Develop screens to represent real-time update feature.
• Develop the code according to the design.
• Standardize the code by following the coding standards and code versioning techniques
• Integration of Cassandra API with the UI.
• Integration testing of Relation Finder Application.
• Testing the application from a Big Data perspective.
• Perform Peer Review and Code Walkthrough.
• Ensure overall quality of all the deliverables.
Project
The aim of the application is to identify the Party-to-Party Association map for the customers. The Party-to-Party association process aims to identify the relations between the data available within the internal systems. The data association process looks through the data attributes in the identified sources to derive the relationships.
The POC is planned to be executed in two phases.
• Phase 1: Identify the internal party-to-party association map for a sizeable set of customers from internal systems
• Phase 2: Identify external associations and provide social behaviour and sentiment analysis for a finite set of high-net-worth customers.
The high-level solution approach is to get the customer profiles and the associations as the input data. This data is stored and processed to identify the matches and generate the relationship data. The visualisation layer creates the required customer UI to display the customer relationships graphically.
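The attribute-matching step described above can be sketched in miniature. The profile fields and the matching rule (associate two parties when any attribute value is shared) are hypothetical simplifications, not the project's actual matching logic.

```python
from itertools import combinations

# Hypothetical party profiles keyed by party id
parties = {
    "P1": {"phone": "555-0101", "address": "12 Elm St"},
    "P2": {"phone": "555-0101", "address": "98 Oak Ave"},
    "P3": {"phone": "555-0202", "address": "98 Oak Ave"},
}

# Associate two parties when any identifying attribute matches
associations = []
for (a, attrs_a), (b, attrs_b) in combinations(parties.items(), 2):
    shared = sorted(k for k in attrs_a if attrs_a[k] == attrs_b.get(k))
    if shared:
        associations.append((a, b, shared))

print(associations)
```

The resulting edge list (party pairs plus the attributes they share) is exactly the kind of relationship data a graph visualisation layer can render.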
Hardware Dell Servers
Operating System Ubuntu 4.4.3-4ubuntu5.1
Languages Java, SQL, HTML / CSS, JSP, Servlets
Special Software Git, Cassandra
CLUSTER DISTRIBUTION CLOUDERA
Project Location Kochi, India; Mumbai, India; New Jersey, USA
Since December 2011 TATA Consultancy Services
Title TeleInsights, a telecom-based solution aimed at providing diverse functionality to Telecom Service Providers.
Period December 2011 - December 2012
Position Hadoop Developer
Responsibilities
• Develop code to store tower data in MongoDB
• Develop GUI screens and implement MapReduce code for clustering algorithms
• Integration of Server and UI.
• Follow coding standards and code versioning techniques.
• Integration testing of TeleInsights Application.
• Testing the application from a Big Data perspective.
Project
TeleInsights is a unified solution that caters to the diverse needs of Telecom Service Providers, covering Network Analysis, Customer Segmentation, Customer Experience, Churn Prediction, Recommendation, etc. The solution deals with a variety of data from multiple sources internal and external to the service providers. Data from the OSS systems, CRM, social media customer profiles, etc. was brought into the Hadoop staging area, and filtering was done using Pig scripts and MapReduce jobs. Data aggregation was done after loading this filtered set into Hive tables. HBase tables were used to hold the aggregated data and perform reporting. Machine learning libraries in Apache Mahout were leveraged for Customer Churn Analysis, Prediction, and computing Customer Satisfaction.
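The filter-then-aggregate flow above (Pig/MapReduce filtering feeding Hive aggregation) can be mimicked in a short sketch; the CDR record layout and the zero-duration filter rule are illustrative assumptions, not fields from the actual feeds.

```python
from collections import defaultdict

# Hypothetical CDR records: (subscriber_id, call_seconds, dropped)
cdrs = [
    ("sub-1", 120, False),
    ("sub-1", 0, True),   # malformed/zero-duration record
    ("sub-2", 300, False),
    ("sub-2", 45, True),
]

# "Filtering" step (done with Pig/MapReduce in the project):
# discard zero-duration records
valid = [r for r in cdrs if r[1] > 0]

# "Aggregation" step (done in Hive in the project):
# per-subscriber totals that could feed churn/satisfaction models
totals = defaultdict(lambda: {"seconds": 0, "drops": 0})
for sub, secs, dropped in valid:
    totals[sub]["seconds"] += secs
    totals[sub]["drops"] += int(dropped)

print(dict(totals))
```

In the actual pipeline the same shape of computation runs distributed: the filter as a Pig/MapReduce pass over the staging area, the grouping as a Hive GROUP BY, with results landing in HBase for reporting.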
Hardware Dell Servers
Operating System Ubuntu 4.4.3-4ubuntu5.1
Languages Java
Tools SQL, HTML / CSS, JSP, Servlets, HiveQL, Pig, MongoDB
Special Software Git
CLUSTER DISTRIBUTION CLOUDERA
Since December 2010 TATA Consultancy Services
Title Renaissance – Sanlam LAMDA Re-engineering, SA
Period December 2010 – December 2011
Client Name Sanlam Insurance, South Africa
Position Developer
Responsibilities
Understanding the existing application
Developing code for the product
Adhering to coding standards and blueprint suggested by client
Integration of the code among multiple modules
Unit Testing of the Product
Functional testing
Bug Fixing
Enhancement of the Product
Developing Server-side code from Low-Level Design Models using
IBM Rational Application Developer.
Developing UI screens for Epsilon.
Performance tuning and code optimization
Coordinating with the QA team in system testing, defect fixing, and support.
Project
Lamda is a Policy Administration Tool used by Sanlam, one of the major financial services providers in South Africa. Renaissance is a re-engineering project that replaces the existing VB 6.0 Lamda system with the J2EE-based Epsilon application, retaining its functionality, database structure, and user interface look and feel to the extent possible. The new stand-alone application will be able to handle the needs of the user more efficiently and effectively than the old system.
The application code was reverse-engineered into accurately documented use cases with the help of domain experts. The use cases were then realized in Java/J2EE by the construction team. The logical implementation was tested against the current logic followed by the legacy system to form the re-engineered application. The main principles of the Epsilon application are:
• The Epsilon application is designed and developed using the Java and Java Platform
Enterprise Edition technologies.
• The database resides in Microsoft SQL Server. Hibernate is employed as the mapping framework.
• The online architecture is implemented as a three-tier architecture, with the desktop application accessing an application on the server tier, which interacts with the database.
Hardware Lenovo PC
Operating System Microsoft Windows Enterprise Edition 4.0 (XP)
Languages Java/J2EE, SQL
Special Software IBM RAD 7.5.5 and 8.0.1, SQL Server Management Studio 2005, SVN
Project Location Kochi, India
Personal Details
Date of Joining September 20, 2010
Designation IT Analyst (Senior Hadoop Developer)
Location Des Plaines, Illinois
Passport Details J1891937
Issued at Kozhikode on July 05, 2010
Valid up to July 04, 2020
Visa Details H1B
Valid up to June 22, 2019