Name: NAGESWARA RAO DASARI    Tel No: +91-9035131268 (M)
Email: nageswara268@gmail.com
Career Objective
A challenging and vibrant career in a growing organization, where I can learn and apply my technical skills to
contribute to the organization's growth in the field of technology.
Professional Summary
 Software Engineer, Capgemini India Private Limited, Bangalore.
 4+ years of overall experience.
 2.6 years of work experience in Big Data technologies (Hadoop).
 1.7 years of work experience in Core Java.
 Highly versatile and experienced in adapting to and implementing the latest technologies in new application solutions.
 Hands-on experience in designing and implementing solutions using Apache Hadoop, HDFS, MapReduce, Spark,
Hive, Pig, and Sqoop.
 Knowledge of the Tableau reporting tool.
 Experience with the Agile software development process.
 Strong knowledge of OOP concepts and Core Java.
 Good exposure to Windows and Linux platforms.
Technical Summary
 Big Data Ecosystem: Hadoop, MapReduce, Pig, Hive.
 Good Knowledge: HBase, Sqoop, Flume, Oozie, ZooKeeper, Spark, and Scala.
 Languages & Frameworks: Core Java, SQL.
 Scripting Languages: Unix Shell Scripting.
 Web Technologies: HTML.
 Tools: Eclipse, PuTTY, Maven.
 Databases: Oracle 9i, MySQL.
 Development Methodologies: Agile (Scrum).
Educational Summary
 B.Tech in Electrical and Electronics Engineering from JNTU Kakinada (2008 – 2012)
Assignments
 Banking Customer – Enterprise Data Provisioning Platform
Duration: Jan 2016 – Present
Client: Barclays Bank, UK
Team Size: 31
Designation: Hadoop Developer
Project Description: The Enterprise Data Provisioning Platform (EDPP), the target build of the Information
Excellence project, will allow Barclays to address new business needs and is in line with Barclays's guiding
principle of operating with excellence. The primary objective of the EDPP project is to institutionalize a Hadoop
platform for the data collected within Barclays and make it available for analytics.
Environment:
CDH5 distribution, Apache Pig, Hive, Java, Unix, MySQL, Spark, and Scala.
Roles & Responsibilities:
 Designed schemas in Hive.
 Moved data obtained from different sources into the Hadoop environment.
 Created Hive tables to store the processed results in tabular format.
 Wrote MapReduce programs to process HDFS data and convert it into a common format (a sketch of this step follows the list).
 Wrote shell scripts to automate the loading process.
 Resolved JIRA tickets.
 Performed unit testing and performance tuning of Hive queries.
 Wrote various Hive queries.
 Involved in client engagements.
 Conducted Scrum meetings.
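For illustration, below is a minimal map-only MapReduce sketch of the "convert into a common format" step above. The pipe-delimited source layout, field positions, and tab-delimited target schema are assumptions made for the example; the actual feeds and common schema were project-specific.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Map-only job: reads pipe-delimited source records (assumed layout) and
// rewrites them as tab-delimited lines matching a common Hive schema.
public class CommonFormatJob {
  public static class NormalizeMapper
      extends Mapper<LongWritable, Text, NullWritable, Text> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      String[] fields = value.toString().split("\\|", -1);
      if (fields.length < 3) {
        return; // skip malformed records
      }
      // Trim and re-delimit the first three fields (hypothetical schema).
      String common = fields[0].trim() + "\t" + fields[1].trim() + "\t" + fields[2].trim();
      context.write(NullWritable.get(), new Text(common));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "normalize-to-common-format");
    job.setJarByClass(CommonFormatJob.class);
    job.setMapperClass(NormalizeMapper.class);
    job.setNumReduceTasks(0); // pure per-record transformation, no reduce needed
    job.setOutputKeyClass(NullWritable.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A map-only job (zero reducers) is the usual choice for per-record format conversion, since no grouping is required; the tab-delimited output can then back a Hive external table.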
 Retail Customer – TARGET Re-hosting of Web Intelligence Project
Duration: Dec 2014 – Nov 2015
Client: Target, USA
Team Size: 15
Designation: Hadoop Developer
Project Description: The purpose of the project is to store terabytes of log information generated by the
e-commerce website and to extract meaningful information from it. The solution is based on the open-source
big data software Hadoop: the data is stored in the Hadoop file system and processed by MapReduce jobs.
This in turn involves fetching raw HTML data from the websites, processing the HTML to obtain product and pricing
information, extracting various reports from the product pricing information, and exporting the information for
further processing.
The project re-platforms the existing system, which runs on WebHarvest (a third-party JAR) with a MySQL
database, onto Hadoop, a technology able to process very large data sets (terabytes to petabytes of data), in
order to meet the client's requirements in the face of increasing competition from other retailers.
Name: NAGESWARA RAO DASARI Tel No : 91-9035131268(M)
 Email : nageswara268@gmail.com
Environment:
CDH5 distribution, Apache Pig, Hive, Sqoop, Unix, MySQL
Roles & Responsibilities:
 Moved crawl-data flat files generated by various retailers to HDFS for further processing.
 Wrote Apache Pig scripts to process the HDFS data (a rough Java equivalent of one extraction step follows the list).
 Created Hive tables to store the processed results in tabular format.
 Developed Sqoop scripts to move data between Pig output and the MySQL database.
 Involved in resolving Hadoop-related JIRAs.
 Developed Unix shell scripts to create reports from Hive data.
 Fully involved in the requirement analysis phase.
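The extraction itself was written in Pig; as a rough Java stand-in for one such step, the mapper below pulls a product id and price out of raw HTML crawl lines, and would be wired into a map-only driver like the one shown for the previous project. The markup patterns are hypothetical; the real retailer pages varied.

import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Extracts (product id, price) pairs from raw HTML crawl lines stored in HDFS.
public class PriceExtractMapper extends Mapper<LongWritable, Text, Text, Text> {
  // Hypothetical markup patterns; each retailer's pages differed in practice.
  private static final Pattern PRODUCT_ID = Pattern.compile("data-product-id=\"(\\d+)\"");
  private static final Pattern PRICE = Pattern.compile("class=\"price\">\\$([0-9]+\\.[0-9]{2})");

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    String html = value.toString();
    Matcher id = PRODUCT_ID.matcher(html);
    Matcher price = PRICE.matcher(html);
    if (id.find() && price.find()) {
      // Key: product id; value: observed price. Downstream Hive tables and
      // the Sqoop export to MySQL consume this tab-delimited output.
      context.write(new Text(id.group(1)), new Text(price.group(1)));
    }
  }
}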
 Web Application – Intella Sphere
Duration: Dec 2012 – Jun 2014
Environment:
Java, MySQL, MongoDB 2.4.6, Activiti workflow, SVN
Designation: Java Developer
Project Description: The brand essence of Intella Sphere is direct, analytical, and engaging. It is all about
empowering businesses to gain the intelligence they need to grow and improve their brand in the new age of marketing.
Intella Sphere is the ultimate marketing tool, giving a company the means it needs to gain market share, beat the
competition, and get true results. Intella Sphere understands these challenges better than anyone and uses
experience and innovation to create the right tools for a business to clearly understand its audience and empower
it to grow and engage with its community.
Responsibilities:
• Created DAOs for all DB operations using the MongoDB API.
• Worked on the design phase of the application using the Visual Paradigm tool.
• Implemented social OAuth configuration.
• Used social APIs for the social networks (Facebook, Twitter, LinkedIn, Blogger, YouTube).
• Implemented Activiti workflow.
• Implemented aggregations for calculating metrics (a sketch follows this list).
• Worked on the MongoDB replica set.
• Worked on development and production environments.
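To illustrate the metrics aggregations, here is a minimal sketch that groups social posts by network using the MongoDB aggregation pipeline. It uses the current MongoDB Java driver API rather than the 2.x-era driver the project would have used, and the connection string, database, collection, and field names are all assumptions.

import java.util.Arrays;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Accumulators;
import com.mongodb.client.model.Aggregates;
import org.bson.Document;

public class EngagementMetrics {
  public static void main(String[] args) {
    try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
      // Hypothetical collection of captured social posts:
      // { network: "twitter", likes: 12, shares: 3, ... }
      MongoCollection<Document> posts =
          client.getDatabase("intellasphere").getCollection("posts");

      // $group by network, counting posts and totalling likes per network.
      posts.aggregate(Arrays.asList(
              Aggregates.group("$network",
                  Accumulators.sum("posts", 1),
                  Accumulators.sum("totalLikes", "$likes"))))
          .forEach(doc -> System.out.println(doc.toJson()));
    }
  }
}

Running the grouping server-side in MongoDB avoids pulling every post into the JVM just to count it.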
