Hassan Qureshi
Hadoop Lead Developer
hhquresh@gmail.com
PROFESSIONAL SUMMARY:
 Certified Java programmer with 9+ years of extensive IT experience, including several years with
Big Data technologies.
 Currently technical lead (researcher and developer) of a data engineering team that works
with data scientists to develop insights
 Good exposure to production processes such as change management, incident management,
and escalation management
 Hands-on experience with major components of the Hadoop ecosystem, including Hive, HBase,
HBase-Hive integration, Pig, Sqoop, and Flume, plus knowledge of the MapReduce/HDFS
framework.
 Hands-on experience installing, configuring, maintaining, monitoring, performance-tuning,
and troubleshooting Hadoop clusters in development, test, and production environments
 Defined file system layout and data set permissions
 Monitored local file system disk space usage and log files, cleaning log files with automated scripts
 Extensive knowledge of front-end technologies like HTML, CSS, and JavaScript.
 Good working knowledge of OOA and OOD using UML, including designing use cases.
 Good communication skills, a strong work ethic, and the ability to work efficiently in a team,
with good leadership skills.
TECHNICAL SKILLS:
Big Data: Hadoop, HDFS, MapReduce, Hive, Sqoop, Pig, HBase, MongoDB, Flume, Zookeeper, Oozie
Operating Systems: Windows, Ubuntu, Red Hat Linux, Linux, UNIX
Java Technologies: Java, J2EE, JDBC, JavaScript, SQL, PL/SQL, C
Programming/Scripting Languages: Java, SQL, Unix shell scripting, C, Python
Databases: MS SQL Server, MySQL, Oracle, MS Access
Middleware: WebSphere, TIBCO
IDEs & Utilities: Eclipse, JCreator, NetBeans
Protocols: TCP/IP, HTTP, HTTPS
Testing: Quality Center, WinRunner, LoadRunner, QTP
Frameworks: Hadoop, PySpark, Cassandra
PROFESSIONAL EXPERIENCE
Fortinet, New York, NY Nov 2013 – Present
Hadoop Lead
Project Details:
Fortinet is a company that provides networking software, including IP communicator phone
software. The IP enhancement project is a post-sales project. Fortinet devices, especially
routers and switches, send XML files to a centralized server on a daily basis; these files
contain information about each device's location and the activities performed that day. All
XML files need to be processed to derive device health and usage. XML makes up about 20
percent of the overall data, with the remainder coming from RDBMS sources and flat files.
Frameworks and tools used:
 HDP 2.3 distribution for the development cluster
 Hadoop ecosystem components Hive and MapReduce to process data
Contribution:
 Wrote MapReduce jobs for processing XML and flat files (see the mapper sketch after this list)
 Provided production support for cluster maintenance
 Commissioned and decommissioned nodes as needed
 Worked on a 10-node Hortonworks Data Platform cluster with 550 GB of RAM, 10 TB of SSD storage, and 8
cores
 Designed a star schema with fact tables and dimension tables
 Worked on analyzing the Hadoop stack and different big data analytics tools, including Pig,
Hive, the HBase database, and Sqoop
 Conducted training for new joiners on the project
 Triggered workflows based on time or data availability using the Oozie Coordinator
Engine (a hedged coordinator sketch also follows below)
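
Illustrative sketch of the XML-processing mapper referenced above. This is a minimal,
hypothetical example: it assumes each input line is one self-contained <device> record with
<id> and <health> elements, an assumption for illustration rather than the project's actual
record layout.

    import java.io.IOException;
    import java.io.StringReader;

    import javax.xml.parsers.DocumentBuilderFactory;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.w3c.dom.Document;
    import org.xml.sax.InputSource;

    // Hypothetical sketch: assumes one self-contained record per line, e.g.
    // <device><id>r-42</id><health>OK</health></device>
    public class DeviceHealthMapper extends Mapper<LongWritable, Text, Text, Text> {

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String id;
            String health;
            try {
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder()
                        .parse(new InputSource(new StringReader(value.toString())));
                id = doc.getElementsByTagName("id").item(0).getTextContent();
                health = doc.getElementsByTagName("health").item(0).getTextContent();
            } catch (Exception e) {
                // Count and skip malformed records instead of failing the whole job.
                context.getCounter("xml", "malformed").increment(1);
                return;
            }
            // Emit device id -> health so a reducer can aggregate health and usage.
            context.write(new Text(id), new Text(health));
        }
    }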
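
And for the time/data-availability triggering in the last bullet, a hedged Oozie coordinator
sketch: the application path, dataset URI, dates, and frequency below are assumed values for
illustration, not the project's actual configuration.

    <!-- Hypothetical coordinator: runs the workflow daily, but only once the
         day's XML drop directory exists (data-availability trigger). -->
    <coordinator-app name="device-xml-coord" frequency="${coord:days(1)}"
                     start="2014-01-01T00:00Z" end="2015-01-01T00:00Z"
                     timezone="UTC" xmlns="uri:oozie:coordinator:0.4">
      <datasets>
        <dataset name="device-xml" frequency="${coord:days(1)}"
                 initial-instance="2014-01-01T00:00Z" timezone="UTC">
          <uri-template>hdfs:///data/devices/${YEAR}${MONTH}${DAY}</uri-template>
        </dataset>
      </datasets>
      <input-events>
        <data-in name="input" dataset="device-xml">
          <instance>${coord:current(0)}</instance>
        </data-in>
      </input-events>
      <action>
        <workflow>
          <app-path>hdfs:///apps/device-health-wf</app-path>
          <configuration>
            <property>
              <name>inputDir</name>
              <value>${coord:dataIn('input')}</value>
            </property>
          </configuration>
        </workflow>
      </action>
    </coordinator-app>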
Emerson Climate Technologies, Louisville, KY Jan 2011 – Oct 2013
Hadoop Lead
Project Details:
Emerson Climate provides efficient HVAC systems that minimize the cost of energy used by
HVAC in buildings. The HVAC controller project controls HVAC speed based on floor occupancy,
determined by analyzing employees' access card data; HVAC systems otherwise run at constant
speeds irrespective of building occupancy. We developed a project that collects data on our own
employees across many locations at 10-minute frequency and recommends HVAC speeds based
on floor occupancy. We used MapReduce and Spark to clean and format the data; the jobs run
every 10 minutes via crontab to generate the HVAC controller report.
Frameworks and tools used:
 HDP 2.0 distribution for the development cluster
 All datasets were loaded daily from two different sources, Oracle and MySQL, into HDFS and
Hive respectively
 The data warehouse received an average of 80 GB per day on the whole; we used a 12-node
cluster to process the data
 Involved in loading data from the UNIX file system to HDFS
 Used Hive, MapReduce, and PySpark from the Hadoop ecosystem to process data
 Implemented the Capacity Scheduler to share cluster resources and performed Hadoop
admin responsibilities as needed (a sample queue configuration follows this list)
 Wrote MapReduce and PySpark jobs for cleansing data and applying algorithms
 Used a Cassandra database in conjunction with Hadoop HDFS for transforming and querying data
 Designed scalable big data cluster solutions
 Monitored job status through email alerts from cluster health monitoring tools
 Responsible for managing data coming from different sources.
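
A minimal capacity-scheduler.xml sketch of the queue setup mentioned in the list above; the
queue names and percentages are assumptions for illustration, not the project's actual values.
Splitting out a dedicated queue keeps the 10-minute cleansing jobs from starving ad hoc work.

    <!-- capacity-scheduler.xml sketch: splits the cluster between the default
         queue and a dedicated ETL queue for the periodic cleansing jobs. -->
    <configuration>
      <property>
        <name>yarn.scheduler.capacity.root.queues</name>
        <value>default,etl</value>
      </property>
      <property>
        <name>yarn.scheduler.capacity.root.default.capacity</name>
        <value>60</value>
      </property>
      <property>
        <name>yarn.scheduler.capacity.root.etl.capacity</name>
        <value>40</value>
      </property>
      <property>
        <!-- Let the ETL queue borrow idle capacity up to this ceiling. -->
        <name>yarn.scheduler.capacity.root.etl.maximum-capacity</name>
        <value>70</value>
      </property>
    </configuration>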
BMO Harris Bank, Buffalo Grove, IL Aug 2010 – Dec 2011
Hadoop Lead
BMO is a financial services company that helps customers with their financial needs, including
credit cards, banking, and loans.
The BMO Credit Card project was designed to extract raw data from different sources into the
Hadoop ecosystem to create and populate the necessary Hive tables. The main aim of the project
is to centralize the source of data for report generation using a historical database; these
reports were otherwise generated from multiple sources.
Responsibilities:
 Worked on importing and exporting data into HDFS in the financial sector
 Involved with the team in reviewing functional and non-functional requirements for
debit processing at the Atlanta location.
 Implemented Oozie workflows to perform ingestion and merging of data in MapReduce
jobs for credit card fraud detection.
 Extracted data from the Cassandra database through Sqoop, placed it in HDFS, and processed it.
 Hands-on experience creating Hive tables, loading them with data, and writing Hive queries that
run internally as MapReduce jobs to administer transactions (a sample DDL sketch follows this list)
 Developed a custom file system plug-in for Hadoop so it can access files on the data platform.
 This plug-in allows Hadoop MapReduce programs, HBase, Pig, and Hive to work unmodified
and access files directly.
 Expertise in server-side and J2EE technologies, including Java, J2SE, JSP, Servlets, XML,
Hibernate, Struts, Struts2, JDBC, and JavaScript development.
 Designed the GUI using the Model-View-Controller architecture (Struts framework).
 Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.
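
A minimal sketch of the Hive usage referenced in the bullet on creating and loading tables
above; the table name, columns, and HDFS paths are hypothetical, since the actual schemas are
not given here.

    -- Hypothetical external table over transaction files landed in HDFS.
    CREATE EXTERNAL TABLE IF NOT EXISTS card_txn (
      txn_id   STRING,
      card_id  STRING,
      amount   DOUBLE,
      txn_ts   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/bmo/card_txn';

    -- Load an additional extract into the table's location.
    LOAD DATA INPATH '/staging/card_txn/2011-10-01' INTO TABLE card_txn;

    -- On Hadoop 1.x, a query like this compiles internally to MapReduce jobs.
    SELECT card_id, COUNT(*) AS txns, SUM(amount) AS total
    FROM card_txn
    GROUP BY card_id;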
Environment: Hadoop 1.x, Hive, Pig, HBase, Sqoop, Flume, Spring, jQuery, Java, J2EE, HTML,
JavaScript, Hibernate
Lowe’s, Mooresville, NC Feb 2005 – July 2010
Sr. Java Developer
Lowes.com redesign and assembly/haul-away. Worked with different services based on Service-
Oriented Architecture (SOA) as well as standalone projects that use UNIX shell scripts to
execute Java programs. The REST and SOAP services include the catalog, façade, pricing,
purchase-history, MyLowes, and SEO-redirect services. Also worked on a REST API automation
project using the REST Assured framework. Involved in updating small batch-script-based Java
projects that produce CSV, Excel, and txt files and send them through SFTP or email.
Responsibilities:
 Developed new DAO methods using Hibernate 4.3 as the ORM for the application.
 Used a DOM parser to parse XML 1.1 data from files.
 Used JAXB 2.0 annotations to convert Java objects to/from XML files (see the sketch after this list).
 Created a SOAP 1.2 web service and generated its WSDL 2.0.
 Created a web service client and invoked the web service through it.
 Developed a REST-based service that reads a JSON file and passes it as an argument
to the controller, which handles multiple HTML 5.1 UI files.
 Used the Struts MVC framework for user authentication, with a Ping Federate server for single
sign-on (SSO)
 Used SAML so that a single sign-in grants access to many services
 Involved in coding the front end using Swing, HTML, JSP, JSF, and the Struts framework
 Design and development of Spring service classes and JSF pages
 Involved in all software development lifecycle phases: development, unit testing, regression
testing, performance testing, and deployment
 Responsible for developing, configuring, or modifying REST and SOAP web services using
technologies like JAX-RS, JAX-WS, Jersey, and Spring MVC.
 Used Spring JDBC as the data layer to query the DB2 and Cassandra databases.
 Worked on UNIX batch applications that generate product feeds and XML files.
 Worked with REST API automation using the REST Assured testing framework.
 Participated in scrum meetings, daily stand-ups, and grooming sessions.
 Used technologies like Spring, REST, JAX-RS, Jersey, JSON, JUnit testing, Mockito, EasyMock,
REST Assured, Ehcache, Maven, DB2, JDBC, batch scripting, WebSphere Commerce, and WebSphere.
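
A minimal sketch of the JAXB marshalling/unmarshalling referenced in the list above; the Order
class and its fields are invented for illustration, as the project's real payload types are not
named here. It uses the standard javax.xml.bind API (bundled with Java up through Java 8).

    import java.io.StringReader;
    import java.io.StringWriter;

    import javax.xml.bind.JAXBContext;
    import javax.xml.bind.annotation.XmlElement;
    import javax.xml.bind.annotation.XmlRootElement;

    // Hypothetical payload type used only to demonstrate the annotations.
    @XmlRootElement(name = "order")
    public class Order {
        private String sku;
        private int quantity;

        @XmlElement
        public String getSku() { return sku; }
        public void setSku(String sku) { this.sku = sku; }

        @XmlElement
        public int getQuantity() { return quantity; }
        public void setQuantity(int quantity) { this.quantity = quantity; }

        public static void main(String[] args) throws Exception {
            JAXBContext ctx = JAXBContext.newInstance(Order.class);

            // Marshal: Java object -> XML.
            Order order = new Order();
            order.setSku("ABC-123");
            order.setQuantity(2);
            StringWriter xml = new StringWriter();
            ctx.createMarshaller().marshal(order, xml);

            // Unmarshal: XML -> Java object.
            Order back = (Order) ctx.createUnmarshaller()
                    .unmarshal(new StringReader(xml.toString()));
            System.out.println(back.getSku() + " x" + back.getQuantity());
        }
    }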
Environment: Java, J2EE, JSP, ExtJS, Servlets, Struts, JDBC, JavaScript, Liferay, Google Web
Toolkit, Spring, EJB (SSB, MDB), Ajax, WebSphere 6.1
Education: BE in IT, Guelph University, 2005