Vishnu C
E-mail: vishnuch661@gmail.com
Mobile: +91-9550684116
Skills Summary
 5+ years of IT experience in application development using Java and Big Data (Hadoop).
 Working knowledge of object-oriented programming.
 2+ years of dedicated experience in Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop, and Oozie.
 Knowledge of setting up Hadoop clusters.
 Good working knowledge of MapReduce, Apache Pig, and Sqoop.
 Expertise and experience in web application development, with sound knowledge of Core Java, JDBC, JSP, Servlets, Struts, and Hibernate.
 Well experienced in designing and developing both server-side and client-side applications.
 Good understanding of IBM WebSphere Application Server 10.3 and Apache Tomcat 6.0, including deployment and configuration settings.
 Working experience with MySQL and Oracle databases.
 Experience with IDEs such as Eclipse and NetBeans; worked efficiently with version control systems such as CVS and SVN.
 Knowledge of Flume, HBase, Spark, and Scala.
 Excellent communication, interpersonal, and analytical skills, and a strong ability to perform as part of a team.
 Quick to learn new concepts; hardworking and enthusiastic.
Professional Experience
 Currently working as a Hadoop Developer at Capgemini, Hyderabad, from Nov 2013 to date.
 Previously worked as a System Engineer at Orange Business Services India Technology Private Limited, Mumbai, from Jan 2013 to Oct 2013.
 Previously worked as a Software Engineer at Ventech, Hyderabad, from June 2011 to November 2012.
Educational Qualifications
 MCA (Master of Computer Applications), 2011, from JNTU Kukatpally, Hyderabad, with an aggregate of 79%.
Technical Skills
Languages : Java.
Big Data : HDFS, MapReduce, Pig, Sqoop, Hive.
J2EE : JDBC, Servlets, JSP.
Web Framework : Struts 1.x and 2.0.
ORM Framework : Hibernate 3.3.
Java/JEE Framework : Spring (awareness).
Servers : Apache Tomcat 6.0 and IBM WebSphere 10.3.
Databases : Oracle 10g, MySQL.
IDEs : Eclipse 3.4, NetBeans.
Operating Systems : Windows 2000/XP, UNIX, and Linux.
Tools : Log4j, SVN.
Web Programming : HTML, CSS, JavaScript, Ajax, jQuery.
Project Profile
Project#4
Organization : Capgemini
Project Title : Analyze Cost & Performance
Client : Element Financial Services, USA.
Team Size : 12.
Role : Hadoop Developer.
Environment : Hadoop, HDFS, MapReduce, Apache Pig, Sqoop, Oozie, Java, Unix,
Struts 1.2, Hibernate 3.3, MySQL.
Period : Jan '14 to date.
Description:
Analyze is a web-based application that lets users view dashboards, graphs, and flow
charts of vehicle cost and performance, broken down by tier. Vehicle cost covers fuel,
maintenance, accident, and similar expenses, and performance is measured by metrics such as
kilometres per litre.
Responsibilities:
 Moved log/text files generated by various components into HDFS.
 Wrote MapReduce programs to process the HDFS data (a minimal sketch appears after this list).
 Wrote Apache Pig scripts to process the HDFS data.
 Developed Sqoop scripts to move data between Pig and the MySQL database.
 Wrote script files for processing data and loading it into HDFS.
 Created Hive tables to store the processed results in tabular format.
 Created external Hive tables on top of the parsed data.
 Wrote Oozie workflow jobs.
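A minimal sketch of the kind of MapReduce job described above, assuming a comma-separated
input layout of (tier, fuel cost, maintenance cost, accident cost); the class and field names
are illustrative only, not the project's actual code.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class TierCostJob {

      // Emits (tier, fuel + maintenance + accident cost) for each record.
      public static class CostMapper
          extends Mapper<LongWritable, Text, Text, DoubleWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
            throws IOException, InterruptedException {
          String[] f = value.toString().split(",");
          if (f.length < 4) {
            return; // skip malformed lines
          }
          try {
            double cost = Double.parseDouble(f[1])
                + Double.parseDouble(f[2])
                + Double.parseDouble(f[3]);
            ctx.write(new Text(f[0]), new DoubleWritable(cost));
          } catch (NumberFormatException e) {
            // skip records with non-numeric cost fields
          }
        }
      }

      // Sums the per-record costs for each tier.
      public static class CostReducer
          extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text tier, Iterable<DoubleWritable> costs,
            Context ctx) throws IOException, InterruptedException {
          double total = 0;
          for (DoubleWritable c : costs) {
            total += c.get();
          }
          ctx.write(tier, new DoubleWritable(total));
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "tier cost");
        job.setJarByClass(TierCostJob.class);
        job.setMapperClass(CostMapper.class);
        job.setReducerClass(CostReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }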
POC:
Sensex Log Data Processing
Environment : Hadoop, HDFS, MapReduce, Apache Pig, Sqoop, Java, Unix, Struts,
Hibernate, MySQL.
Role : Hadoop Developer.
Hardware : Virtual Machines, UNIX.
Period : Nov '13 to Jan '14.
Description:
The purpose of the project is to store terabytes of log information generated by
the Sensex website and extract meaningful information from it. The solution is based on the
open-source big data software Hadoop. The data is stored in the Hadoop file system and
processed using MapReduce jobs and Pig, and the processed data is exported to MySQL using
Sqoop. Graphs are then generated from the database data using Struts.
Responsibilities:
 Moved PDF data files generated from Sensex reports to HDFS for further processing
(see the ingestion sketch after this list).
 Wrote MapReduce programs to process the HDFS data.
 Wrote Apache Pig scripts to process the HDFS data.
 Developed Sqoop scripts to move data between Pig and the MySQL database.
 For the dashboard solution, developed the controller, service, and DAO layers of the
Struts framework.
 Fully involved in the requirements analysis phase.
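A minimal sketch of the HDFS ingestion step described above, using the Hadoop FileSystem
API; the NameNode URI and the source/target paths are assumptions for illustration.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsIngest {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; normally picked up from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");
        FileSystem fs = FileSystem.get(conf);

        Path localDir = new Path("/data/sensex/reports");   // assumed source
        Path hdfsDir = new Path("/user/hadoop/sensex/raw"); // assumed target
        if (!fs.exists(hdfsDir)) {
          fs.mkdirs(hdfsDir);
        }
        // Copy the local files into HDFS (keep source, overwrite target).
        fs.copyFromLocalFile(false, true, localDir, hdfsDir);
        fs.close();
      }
    }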
Project#3
Organization : Orange Business Services India Technology Pvt. Ltd.
Project Title : NBR Component
Client : World Bank, US.
Team Size : 15.
Role : Application Developer.
Environment : JDK 1.7, JSP, Struts 2.0, Hibernate 3.3, Oracle,
Tomcat 6.0, HTML, Ajax, JavaScript, jQuery, Eclipse.
Period : Jan '13 to Oct '13.
Description:
NBR Component is a web-based application that downloads World Bank users' recorded
meetings from the WebEx cloud to a repository server and transfers the downloaded meetings
from the repository server to a PDMZ server. At each step it sends a mail notification to the
administrator and the owner of the recorded meeting file, and it applies ACL permissions to
the downloaded recordings, among other operations.
Responsibilities:
 Involved in client meetings.
 Wrote action classes using Struts 2.0 (a minimal sketch appears after this list).
 Wrote Hibernate configuration and mapping files.
 Wrote SQL queries.
 Wrote business logic and DAO code.
 Implemented server-side validations using the XWork validation framework.
 Implemented client-side validations using JavaScript, jQuery, and Ajax.
 Developed persistence logic using Hibernate.
 Built user interfaces using JSPs, HTML, and CSS.
 Involved in debugging.
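A minimal sketch of a Struts 2 action class with server-side validation, of the kind
described above; the action, service, and field names are hypothetical, not the project's
actual code.

    import com.opensymphony.xwork2.ActionSupport;

    public class DownloadMeetingAction extends ActionSupport {

      private String meetingId; // populated by Struts from the request parameter

      @Override
      public String execute() {
        // Delegates to a (hypothetical) service that pulls the recording
        // from the WebEx cloud and stages it on the repository server.
        boolean ok = new MeetingService().download(meetingId);
        return ok ? SUCCESS : ERROR;
      }

      // Server-side validation; Struts calls validate() before execute().
      @Override
      public void validate() {
        if (meetingId == null || meetingId.trim().isEmpty()) {
          addFieldError("meetingId", "Meeting id is required");
        }
      }

      public String getMeetingId() { return meetingId; }
      public void setMeetingId(String meetingId) { this.meetingId = meetingId; }

      // Stub standing in for the real download logic.
      static class MeetingService {
        boolean download(String id) {
          return id != null && !id.isEmpty();
        }
      }
    }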
Project#1
Organization : Ventech.
Project Title : Inter Communication.
Client : In-House Project.
Team Size : 10.
Role : Team Member.
Environment : JDK 1.6, Servlets, JSP, JDBC 3.0, MySQL, Tomcat 6.0, NetBeans.
Description:
Inter Communication is a web-based application built on Java technology and aimed at
the employees of a company. The system is owned by the company and serves its people. The
application covers the interaction between project teams and the IT support team so that
project members' problems, such as software installations and access to other resources
across the network, get resolved.
Responsibilities:
 Developed view components.
 Implemented JSPs with tag libraries.
 Developed Java action classes using servlets (a minimal sketch appears after this list).
 Wrote client-side validations using JavaScript and jQuery.
 Involved in debugging.
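A minimal sketch of a servlet-based action of the kind described above; the servlet name,
JSP paths, and request parameter are illustrative assumptions (on Tomcat 6, Servlet 2.5 era,
the servlet would be mapped in web.xml rather than by annotation).

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Mapped to a URL such as /ticket in web.xml.
    public class TicketServlet extends HttpServlet {
      @Override
      protected void doPost(HttpServletRequest req, HttpServletResponse resp)
          throws ServletException, IOException {
        String issue = req.getParameter("issue");
        if (issue == null || issue.trim().isEmpty()) {
          // Server-side check backing up the JavaScript/jQuery validation.
          req.setAttribute("error", "Please describe the issue");
          req.getRequestDispatcher("/ticket.jsp").forward(req, resp);
          return;
        }
        // A real implementation would persist the ticket via JDBC here.
        req.setAttribute("message", "Ticket raised: " + issue);
        req.getRequestDispatcher("/confirmation.jsp").forward(req, resp);
      }
    }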
Personal Profile
Name : Vishnu C
D.O.B. : 25 March 1986.
Languages Known : English, Hindi, and Telugu.
Place : Mumbai
Date :
(Vishnu)