M.V. Rama Kumar has 3 years of experience in application development using Java and big data technologies like Hadoop. He has 1.6 years of experience using Hadoop components such as HDFS, MapReduce, Pig, Hive, Sqoop, HBase and Oozie. He has extensive experience setting up Hadoop clusters and processing large, structured and unstructured data.
M.V. Rama Kumar
Mobile: +91-9014514500
Email: matururamakumar@gmail.com
Professional Summary:
• 3 years of overall IT experience in application development in Java and Big Data Hadoop.
• 1.6 years of exclusive experience in Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
• Extensive experience in setting up Hadoop clusters.
• Capable of processing large sets of structured, semi-structured and unstructured data, and of supporting systems application architecture.
• Involved in writing Pig scripts to reduce job execution time.
• Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
• Extended Hive and Pig core functionality by writing custom UDFs.
• Loaded data into Hive tables.
• Experience in analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java.
• Knowledge of the Oozie job workflow scheduler.
• Experience in capturing data from existing databases that provide SQL interfaces, using Sqoop.
• Good working knowledge of Sqoop, Apache Pig and Apache Oozie.
• Excellent communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.
• Exceptional ability to learn new concepts.
• Hard-working and enthusiastic.
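The data-cleaning and MapReduce bullets above describe the kind of per-record logic a mapper applies before data lands in Hive. As a minimal sketch in plain Java (the class name, the comma-delimited format and the three-column layout are illustrative assumptions, not taken from the resume):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of mapper-style record cleaning: drop blank or
// malformed CSV rows and normalise the surviving fields. The expected
// column count and normalisation rules are assumptions for the example.
public class RecordCleaner {
    static final int EXPECTED_FIELDS = 3;

    // Returns the cleaned row, or null if the row should be discarded.
    public static String clean(String rawLine) {
        if (rawLine == null || rawLine.trim().isEmpty()) return null;
        String[] fields = rawLine.split(",", -1);
        if (fields.length != EXPECTED_FIELDS) return null;
        List<String> out = new ArrayList<>();
        for (String f : fields) {
            out.add(f.trim().toLowerCase());
        }
        return String.join(",", out);
    }
}
```

In a real MapReduce job this method would sit inside `Mapper.map()`, with null results simply not emitted.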
Qualifications:
Program of Study : M.C.A
Specialization : CS
Institute : Prakasam Eng College
Board/University : JNTUK
Year of Passing : 2012
Percentage : 70%
Technical Skills:
Languages : SQL, Pig Script, HQL, Core Java, MapReduce.
Big Data Technologies : HDFS, MapReduce, Pig, Sqoop, Hive, HBase, Oozie.
Frameworks : Hadoop.
Java IDEs : Eclipse.
Operating Systems : Windows 7, Windows XP and Linux.
Work Experience
• Currently working as a System Analyst with Global InfoVision Pvt Ltd since September 2012.
Professional Summary
• Deploying and configuring the components of the system infrastructure on an ongoing basis.
• Understanding the tools the client requires and configuring them in the business environment, with reusable documentation.
• Giving knowledge transfer (KT) sessions to team members on newly configured tools.
• Configuring basic Hadoop Big Data setups on the Linux platform.
• Providing application support (administration, configuration and monitoring) for both off-the-shelf and home-grown applications.
• Supporting 100+ servers across the Dev, IDE, Staging/QA and Prod environments.
• Working closely with the software development, integration, quality assurance and production teams from the design to the deployment phase of applications.
• Troubleshooting client-related issues.
• Designed, configured and managed backup and recovery for Hadoop data.
• Optimized and tuned the Hadoop environment to meet performance requirements.
• Proficient in installing and configuring Hive, Pig, HBase and other Hadoop tools.
Project#1:
Project Name : Sales Analytics
Environment : Hadoop, HDFS, Oozie, MapReduce, HBase, Pig, Hive, Sqoop, Java
Duration : Jan 2015 to date
Role : Hadoop Developer
Project Description
• Analyzed historical sales information and generated monthly reports: comparing sales between months, comparing sales of different product categories, and analyzing growth and decline of sales, with weekly, monthly and yearly reports for the same.
• Forecast future sales and checked whether weekly and yearly targets were achieved.
• When selling products, enterprises usually reserve a certain amount of safety stock to prevent stock-outs caused by uncertain events.
• The amount of safety stock is directly related to inventory costs and the number of sales.
• We determine the amount of future safety stock from previous sales reports.
• A cluster of 7 nodes was set up using Cloudera Manager 4.0. Various data sources such as CSV, XML and RDBMS were used.
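One common way to derive safety stock from historical sales, shown here only as a hedged illustration (the resume does not state which formula was used), is safetyStock = z × stdDev(dailySales) × sqrt(leadTimeDays), where z is a service-level factor:

```java
// Hypothetical sketch of computing safety stock from previous sales
// reports. The formula, the z-score and the lead time are assumptions
// for illustration, not details taken from the project.
public class SafetyStock {
    // Population standard deviation of daily sales.
    public static double stdDev(double[] dailySales) {
        double mean = 0;
        for (double s : dailySales) mean += s;
        mean /= dailySales.length;
        double var = 0;
        for (double s : dailySales) var += (s - mean) * (s - mean);
        return Math.sqrt(var / dailySales.length);
    }

    // safetyStock = z * stdDev(demand) * sqrt(leadTimeDays)
    public static double safetyStock(double[] dailySales, double z, int leadTimeDays) {
        return z * stdDev(dailySales) * Math.sqrt(leadTimeDays);
    }
}
```

A higher z (e.g. 1.65 for roughly 95% service level) or a longer lead time raises the reserve, which matches the trade-off the bullets above describe between stock-outs and inventory cost.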
Roles & Responsibilities
• Involved in setting up the development environment
• Developed MapReduce jobs in Java for data cleaning and pre-processing.
• Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
• Assisted with data capacity planning and node forecasting.
• Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
• Wrote Hive and Pig queries on different data sets and joined them.
• Used Sqoop for data transfer between RDBMS and HDFS.
• Checked performance after introducing a combiner and a custom partitioner.
• Hadoop administration and support: log monitoring as well as node monitoring for performance issues, using Cloudera Manager and system log files.
• Worked on setting up a cluster with a high-availability NameNode configuration (active and standby) through NFS.
• Supported code/design analysis, strategy development and project planning.
• Created scripts to delete Hadoop log files periodically.
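The partitioner mentioned in the responsibilities above decides which reducer receives each key. Hadoop's default HashPartitioner uses the rule `(key.hashCode() & Integer.MAX_VALUE) % numReduceTasks`; the sketch below reproduces that rule in plain Java so it can be checked without a cluster (the class and method names are illustrative):

```java
// Plain-Java reproduction of Hadoop's default HashPartitioner rule:
// mask the sign bit of the hash code so negative hashes still map to
// a valid bucket, then take the remainder modulo the reducer count.
public class PartitionDemo {
    public static int partitionFor(String key, int numReducers) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReducers;
    }
}
```

Because the mapping is deterministic, all values for the same key land on the same reducer, which is what makes per-key aggregation (and the combiner optimization) correct.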
Project#2:
Project Name : Sentiment Analysis on Customer Evolution in the Banking Domain
Environment : Hadoop, HDFS, Apache Pig, Sqoop, Java, Linux, MySQL
Duration : May 2014 to Sep 2014
Role : Hadoop Developer
Project Description:
The purpose of the project is to store the bank's historical data, extract meaningful information out of it, and predict the customer's category based on that information. The solution is based on the open-source big data software Hadoop. The data is stored in the Hadoop file system and processed using MapReduce jobs for the product and pricing information.
Roles and Responsibilities:
• Involved in developing the Pig scripts.
• Developed Sqoop scripts to integrate Pig with the MySQL database.
• Completely involved in the requirement-analysis phase.
Project#3:
Project Name : Preparing a Web Interface for HBase
Environment : Hadoop, HDFS, Apache Pig, Sqoop, MapReduce, MySQL, HBase
Duration : Sep 2014 to Dec 2014
Role : Hadoop Developer
Project Description:
The purpose of the project is to analyze the effectiveness and validity of controls, storing terabytes of log information generated by the source providers as part of the analysis and extracting meaningful information out of it. The solution is based on the open-source big data software Hadoop. The data is stored in the Hadoop file system and processed using MapReduce jobs, which in turn involves getting the raw data, processing it to obtain controls and redesign/change-history information, extracting various reports from the controls history, and exporting the information for further processing.
Roles and Responsibilities:
• Fetched data from HDFS into HBase using Pig, Hive and MapReduce.
• Fetched data from an existing MySQL database into HBase using Sqoop and MapReduce.
• Prepared a web UI for the HBase database for CRUD operations such as put, get, scan, delete and update.
• Provided a UI to enter the user's query, which would be parsed and executed internally, with the result displayed on the web interface.
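Before the web UI can execute a user's query against HBase, it has to parse the query text into an operation and its argument. A hypothetical sketch of that first parsing step (the command grammar, class name and supported verbs are assumptions, since the resume does not describe the query syntax):

```java
// Hypothetical query parsing for the HBase web UI: split a command
// like "get row1" into an operation ("get", "put", "scan", "delete")
// and its argument, rejecting anything else before it reaches HBase.
public class QueryParser {
    public static String[] parse(String query) {
        String[] parts = query.trim().split("\\s+", 2);
        String op = parts[0].toLowerCase();
        switch (op) {
            case "get": case "put": case "scan": case "delete":
                return new String[] { op, parts.length > 1 ? parts[1] : "" };
            default:
                throw new IllegalArgumentException("Unsupported operation: " + op);
        }
    }
}
```

In the real application, the returned operation would then be dispatched to the corresponding HBase client call and the result rendered on the web page.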
Project#4:
Project Name : Severity Claims System (SCS).
Duration : Sep 2012 to Apr 2014
Role : Developer
Type of Project : Ticket Support and Maintenance.
Project Description:
The SCS (Severity Claims System) application is a core insurance claim-processing system which caters to the Excess Claims, Environmental and Toxic Tort LOBs (Lines of Business). It aims to identify potential severity claims, determine and eliminate duplicate claims, calculate AIG's potential exposure regarding claims, and optimize the claims-processing workflow.
Roles and Responsibilities:
• Perform Root Cause analysis for User requests.
• Performance Tuning and Query Optimization.
• Ensure smooth knowledge transition to team members.
• Attend Status Meetings.
• Peer-reviewing deliverables and documenting defects.
• Proposing new suggestions for application-performance improvement.
Operating System : Windows XP, Windows 7, OS/390.
Languages : CORE JAVA, HTML, JCL, COBOL, DB2.
Special Software : Eclipse, PVCS, Query Tool.
PERSONAL DETAILS:
Name : M.V. Rama Kumar
Address : 2-147/A, CHANDOLE (POST), PITTALAVANI PALEM (M.D), Guntur (DIST), A.P-522311
D.O.B : 12-11-1987
Languages Known : English, Telugu, Hindi
Nationality : Indian
Interests : Playing Cricket