BALAMURUGAN.KM
Contact: +91 8880949491
E-Mail: km.balamurugan@gmail.com
Big Data Architect
Expertise in Big Data and Java design and implementation
SUMMARY
 14 years of overall IT experience.
 3 years of Big Data design experience.
 3 years of Hadoop administration experience.
 1.6 years of MapReduce and Storm (with Zookeeper) experience.
 Developed applications for distributed environments using Hadoop, MapReduce, and Java.
 More than 10 years of experience in Java design and development.
 Experience in cloud enablement and workflows.
 Experience in the Telecom, Automotive, Banking, and Education domains.
 Familiar with components of the Hadoop ecosystem: Storm, Zookeeper, HDFS, Hive, Pig, Oozie, Spark, R, Ambari, Sqoop, Flume, Kafka, etc.
 Familiar with NoSQL data stores such as MongoDB and Cassandra.
IT SKILLS
 Hortonworks and Cloudera cluster installation and administration.
 Proficient in using MapReduce to develop Hadoop applications and jobs.
 Knowledge of the Hadoop ecosystem.
 Knowledge of Tableau.
 Experience in Linux, Windows, and Unix.
 Experience in Oracle, MySQL, and Vertica.
 Experience in the design, development, testing, and implementation of applications using Java.
 Experience with frameworks such as Apache Struts 1.2 and Spring.
 Experience in JSP, Servlets, XML, and JMS.
 Familiar with application servers such as JBoss, WebSphere, and WebLogic.
 Experience in core Java application development.
 Specialist in designing relational database objects and writing SQL stored procedures, packages, and functions for Oracle 9i/10g.
EMPLOYMENT HISTORY
 Hewlett-Packard Global Soft Limited, Bangalore, India
Currently working as a Hadoop Admin, from June 2011 to date.
 Ford Motor Company, Chennai, India
Worked as a Team Leader at Ford Motor Company, deputed from Thirdware Technology Solutions, from July 2010 to May 2011.
 ASPEYA InfoTech, Chennai, India
Worked as a Team Leader, from March 2008 to June 2010.
WORK EXPERIENCE
1) Project: PDE (Position Determining Element) Analysis - Feb 2015 – Jan 2016
Software Used: Hortonworks, Ambari, MapReduce, Java, Vertica, HP SPS, Tableau
Operating System: Linux & Windows
Team Size: 8
Client: Verizon
Description:
PDE helps identify the geographical location of mobile users. It generates near-real-time
trending, such as location- and time-based top application usage, top users, etc.
Features:
• A 10-node Hortonworks cluster with MapReduce was used.
• PDE bin-format data is converted into ASCII using QUALCOMM tooling.
• The converted PDE data is stored and processed with MapReduce.
• HP SPS is used for rollup analysis.
• Tableau is used for trending reports.
Responsibilities:
• ETL Design.
• Hortonworks Hadoop Cluster Installation & Administration.
• MapReduce Development (a job sketch follows below).
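For illustration, a minimal sketch of a MapReduce job with the shape described above: counting application usage per location and hour from the converted ASCII PDE records. The comma-separated layout and field positions are assumptions made for the sketch, not the actual PDE record format; the rollup and top-N trending steps were handled downstream by HP SPS and Tableau.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PdeUsageCount {

    // Mapper: emits (location|hour|application, 1) per ASCII PDE record.
    // The field positions used here are hypothetical.
    public static class UsageMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text outKey = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length < 4) {
                return; // skip malformed records
            }
            outKey.set(fields[0] + "|" + fields[1] + "|" + fields[3]);
            context.write(outKey, ONE);
        }
    }

    // Reducer (also used as combiner): sums the usage counts per key.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "pde-usage-count");
        job.setJarByClass(PdeUsageCount.class);
        job.setMapperClass(UsageMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Such a job would be launched in the usual way, e.g. hadoop jar pde.jar PdeUsageCount <input> <output>, against the HDFS paths holding the converted records.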
2) Project: CEP (Complex Event Processing) - July 2013 – Jan 2015
Software Used: Hortonworks, Storm, Zookeeper, Java, Servlet, JSP, jQuery, Ajax, JBoss AS 7.1.0, MySQL,
VoltDB
Operating System: Windows & Linux
Team Size: 11
Description:
Complex Event Processing performs real-time analytics driven by user-defined events and helps
retailers develop their business. It can process huge volumes of data with low latency.
Features:
• Helps create complex events based on time, historical data, etc.
• Sends notifications via mail, SMS, JMS, and logs
• Supports defining any type of data source
• Events can be scheduled as always-on, weekly, or monthly
• UI based
Responsibilities:
• Design.
• Hortonworks Installation.
• Storm & Zookeeper Installation.
• Storm & Java Development (a topology sketch follows below).
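For illustration, a minimal Storm topology sketch with the pipeline shape described above (events in, rule evaluation, notifications out). EventSpout and RuleBolt are hypothetical stand-ins, not the project's actual classes; the backtype.storm package names match the Storm releases of that period.

```java
import java.util.Map;

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.spout.SpoutOutputCollector;
import backtype.storm.task.OutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseRichBolt;
import backtype.storm.topology.base.BaseRichSpout;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;

public class CepTopology {

    // Hypothetical spout: the real system read from user-defined data sources.
    public static class EventSpout extends BaseRichSpout {
        private SpoutOutputCollector collector;

        @Override
        public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
            this.collector = collector;
        }

        @Override
        public void nextTuple() {
            // Placeholder event; real events came from retailer feeds.
            collector.emit(new Values("evt-1", System.currentTimeMillis()));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("eventId", "timestamp"));
        }
    }

    // Hypothetical rule bolt: stands in for time/history-based event evaluation.
    public static class RuleBolt extends BaseRichBolt {
        private OutputCollector collector;

        @Override
        public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
            this.collector = collector;
        }

        @Override
        public void execute(Tuple input) {
            collector.emit(input, new Values(input.getStringByField("eventId"), "MATCHED"));
            collector.ack(input);
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("eventId", "status"));
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("events", new EventSpout(), 2);
        builder.setBolt("rules", new RuleBolt(), 4)
               .fieldsGrouping("events", new Fields("eventId"));
        // A notification bolt (mail/SMS/JMS/log) would subscribe to "rules" here.

        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("cep-local", new Config(), builder.createTopology());
        Thread.sleep(10000); // let the local sketch run briefly
        cluster.shutdown();
    }
}
```

On a real cluster the topology would be submitted with StormSubmitter.submitTopology rather than LocalCluster.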
3) Project: SPS (Smart Profile Service) - Dec 2012 – June 2013
Software Used: Cloudera, HP SPS.
Operating System: Linux
Team Size: 4
Client: Axiata
Description:
SPS performs real-time and batch processing by creating the respective value packs. It can
analyze and aggregate incoming data.
Features:
• A 29-node Cloudera cluster was integrated for this purpose.
• SQL & Java Analysis.
• Java- and Linux-based Orchestration.
• UI Based.
• Sequential or Parallel Task Execution (sketched below).
Responsibilities:
• Cloudera Hadoop Cluster Installation & Administration.
• SPS Integration.
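For illustration, a minimal Java sketch of the sequential-or-parallel task execution feature, using an ExecutorService; the task names are hypothetical stand-ins for SPS value-pack steps, not SPS internals.

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class TaskRunner {

    public static void main(String[] args) throws InterruptedException {
        // Hypothetical tasks standing in for SPS processing steps.
        List<Runnable> tasks = Arrays.asList(
                () -> System.out.println("aggregate incoming data"),
                () -> System.out.println("apply value pack"),
                () -> System.out.println("publish results"));

        boolean parallel = args.length > 0 && "parallel".equals(args[0]);

        if (parallel) {
            // Parallel mode: all tasks run concurrently on a thread pool.
            ExecutorService pool = Executors.newFixedThreadPool(tasks.size());
            for (Runnable task : tasks) {
                pool.submit(task);
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } else {
            // Sequential mode: tasks run one after another, in order.
            for (Runnable task : tasks) {
                task.run();
            }
        }
    }
}
```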
4) Organization: Hewlett-Packard Global Soft Limited (June 2011 – Nov 2012)
Project: AP4SaaS
Software Used: Servlet, JSP, Spring, EJB, JBoss Seam, Web Services, JSF, RichFaces, Hibernate, Portlets,
JBoss 5.1 GA, JBoss EPP 4.3, Oracle 11g, HPSA, SATk
Operating System: Windows & Linux
Team Size: 6
Client: Rogers, ESUS and Airtel
Role: Technical Lead
Description:
The SaaS Platform gives telecom operators the ability to on-board SaaS services and offer a
variety of products, and their offers, to SMBs (small and medium-sized businesses), end users
(residential users), and resellers. It is an e-marketplace solution that brings together vendors of
services and their customers (SMBs, end users, and resellers). The platform provisions
organizations and their users in each SaaS service so that end users can access those services,
and it also handles subscription management, provisioning management, etc. In the future, the
platform is expected to integrate with any billing and charging components the operator may
already have, to support monetizing the services.
• Enhanced the SaaS base product on the Spring MVC framework using Eclipse on JBoss 5.1 (a controller sketch follows this list).
• Added new features and services to the base product.
• Developed the Product Management module using Spring WebFlow.
• Handled XML transformations.
• Designed and developed UI screens using JSP.
• Created database objects, loaded data, and wrote queries for Oracle 11g.
• Implemented exception handling, logging, and tracing best practices.
• Trained on HPSA, Spring, and SATk.
• Worked on the internationalization of the AP4SaaS project.
• Worked on web services.
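For illustration, a minimal Spring MVC controller sketch of the kind used when enhancing the base product; ProductService, the URL mapping, and the view name are hypothetical, not AP4SaaS code.

```java
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
@RequestMapping("/products")
public class ProductController {

    // Hypothetical catalog service; a real implementation would be a Spring bean.
    public interface ProductService {
        List<String> findOffersForSmb(String smbId);
    }

    @Autowired
    private ProductService productService;

    // Renders the offers available to an SMB; the JSP view name is illustrative.
    @RequestMapping("/{smbId}/offers")
    public String listOffers(@PathVariable String smbId, Model model) {
        model.addAttribute("offers", productService.findOffersForSmb(smbId));
        return "productOffers"; // resolved to a JSP by the view resolver
    }
}
```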
5) Organization: Ford Motor Company (deputed from Thirdware Technology Solutions), Chennai, India
Project: Exchange Core Tracking System (ECTS).
Role: Team Leader
Team Size: 7
Description:
ECTS is a computer-based core tracking and reporting system for handling parts tracking. Users
can access only the screens specific to the group they belong to (Admin, Dealer, EPC, etc.; an
access-control sketch follows the module list below). All authorized dealers use the ECTS system
via Ford WSL (Web Single Login).
The application handles approximately 20 intranet users and 5,000 extranet users, and is used in
21 countries across Europe.
Responsibility:
I implemented the Interface module and led the team for the modules below:
• Admin
• Dealer
• EPC
• Core Controller
• PRC
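For illustration, a minimal servlet-filter sketch of the group-based screen access described above; the URL-to-group convention is an assumption, and in the real system group membership came via Ford WSL authentication.

```java
import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Restricts each screen to the user group it belongs to (Admin, Dealer, EPC, ...).
public class GroupAccessFilter implements Filter {

    @Override
    public void init(FilterConfig config) {
    }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // Hypothetical convention: /admin/* requires Admin, /dealer/* requires
        // Dealer, /epc/* requires EPC. Other paths pass through unchecked.
        String path = request.getServletPath();
        String requiredGroup = path.startsWith("/admin") ? "Admin"
                : path.startsWith("/dealer") ? "Dealer"
                : path.startsWith("/epc") ? "EPC"
                : null;

        if (requiredGroup != null && !request.isUserInRole(requiredGroup)) {
            response.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }
        chain.doFilter(req, res);
    }

    @Override
    public void destroy() {
    }
}
```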
EDUCATION
 MCA (Master of Computer Applications) from IGNOU, India, 2002