HIMANSU SEKHAR BEHERA Email Id: himansubehera3@gmail.com
Java/J2EE Developer and Big Data Consultant Cell No: +1 669-246-3600
PROFESSIONAL SUMMARY
● 7 years 8 months of IT experience in requirements analysis, design, development, implementation, production, and support of Big Data technologies such as Apache Spark and Apache Kafka, along with Java, J2EE, and client-server technologies.
● Worked directly in domains including Investment Banking, Retail, Transportation, Manufacturing, and Telecom, from requirement gathering through deployment and post-deployment enhancements and support.
● Involved in all phases of the Software Development Life Cycle (SDLC), with experience in software development methodologies such as Agile, Iterative, and Waterfall.
● Implemented Kafka log compaction as one of the features.
● Strong experience building data pipelines and exposing them to partners through Amazon Kinesis Streams.
● Hands-on experience generating metrics for executive reporting using Apache Spark.
● Hands-on experience migrating MapReduce jobs to Spark jobs and Apache Storm topologies.
● Strong working experience in the design, development, and implementation of J2EE frameworks such as Spring IoC, Spring MVC, and Hibernate.
● Well versed in Object-Oriented Programming, the Collections Framework, multithreading, the Executor Framework, exception handling, Spring Dependency Injection, and ORM frameworks such as Hibernate.
● Experience developing applications using HTML and JavaScript.
● Strong experience developing the server side of applications using JSP, Servlets, JavaBeans, XML, and JAXB.
● Developed and deployed multi-tier web applications using Apache Tomcat.
● Developed J2EE applications in Eclipse, NetBeans, and IntelliJ IDEA.
● Implemented the in-house ETL layer for the Hadoop Cluster for Apple.
● Implemented design patterns such as Factory, Adapter, Bridge, Singleton, DAO, Command, Service Locator, and Observer.
● Sound knowledge of UNIX commands and shell script programming.
● Sound knowledge of Hadoop commands and basic Pig scripting.
● Sound knowledge of MapReduce jobs.
● Sound knowledge of Spark technology.
● Good hands-on experience with Apache Kafka.
● Developed ANT scripts and Maven scripts for the build and deployment of J2EE applications.
● Working experience with UML and XML parsers (SAX, DOM).
● Experience in Web Services and working knowledge of data transfer using JSON, XML, etc.
● Hands-on experience with TortoiseSVN and GitHub code repositories (server and client).
● Sound RDBMS concepts; worked with the MySQL database.
● Developed and deployed RESTful web services to interact with partner interfaces, and wrote client interfaces and implementation classes to consume them.
● Strong design, programming, and troubleshooting skills with ability to plan processes in meticulous detail and ability
to adapt to challenging roles.
● Handled onsite/offshore team coordination, coordinating medium and large development and testing teams from beginning to completion.
● Motivated team player with equally good emphasis on individual contribution.
● Very good experience developing test cases using JUnit.
● Excellent communication and interpersonal skills; a very good team player.
TECHNICAL SKILLS
Languages : Java, SQL, UML, XML
J2EE Technologies : Core Java, Executor Framework, Concurrency Framework, JSP, RESTful Web Services,
SOAP Web Services, Hibernate, JDBC, ODBC, XML (SAX & DOM), JAXB, Ant,
JUnit, Spring Core, Spring WebMVC
Big Data Framework : MapReduce, Apache Spark, Apache Kafka, Amazon Kinesis, Apache Storm
Big Data Platform : Apache Mesos, Apache Hadoop YARN
Web/App Servers : Apache Tomcat 5.x/4.x, GlassFish Server 4.1
Databases : MySQL
Tools : Eclipse, NetBeans, IntelliJ IDEA, JUnit, TortoiseSVN, Subversion, Git, Maven,
HP Quality Center, Sonar, PMD, Mockito
Operating Systems : Windows, Mac OS X, Ubuntu
EDUCATION
Bachelor of Technology in Electrical Engineering
Institute of Technical Education and Research (ITER), Bhubaneswar, India
PROFESSIONAL EXPERIENCE
WalmartLabs, Sunnyvale, CA August 2016 – Present
Qarth Product
System Analyst/ Big Data Consultant
Qarth Product is a massive, scalable, centralized, and enriched product repository built to optimize assortment growth and maximize customer engagement. It contains all product information for products sold by Walmart and beyond, and is enriched with high-quality product data.
It also features a comprehensive set of granular, hierarchical product types that are classified internally, and it forms the backbone of most of the Walmart.com experience.
Qarth Product Interface lets users validate classification/normalization results and conflicts, and trains the machine learning model while producing accuracy measurements. Qarth Product Service is used to assign the most appropriate shelves to all products, based on product likeness to other products on a shelf.
Responsibilities:
● Understanding client requirements and converting them into feasible software solutions.
● Generating metrics for executive reporting using Apache Spark.
● Building a real-time data analytics pipeline by reading real-time data from Apache Kafka topics and enriching the data from different data sources.
● Designing solutions that are dynamic and deliver performance.
● Coding in Java to develop the test automation tool.
● Peer Code Review
● Analyzing Sprint stories with Scrum team to identify automation candidates
● Deep-diving into automation candidate stories to outline test steps to automate.
● Implemented data pipelines by reading Avro data from our internal Kafka cluster, enriching it from other sources, and exposing it to partners through Amazon Kinesis.
● Implemented transformation functionality such as sorting, grouping, filtering, and merging of data.
● Implemented data pipelines by reading Avro data from our internal Kafka cluster and enriching it from other sources using Spark for executive reporting.
● Implemented multithreading for parsing and validating large amounts of data.
● Developing applications using various Java/J2EE design patterns to improve usability and flexibility.
● Responsible for maintaining and enhancing jobs.
● Writing user specification documents and design documents.
● Performed reviews, prepared unit test cases, and executed them.
● Agile methodology was used for the software development process, with daily scrums.
● Involved in high-level and low-level design and technical documentation.
● Used Maven to build projects and configured dependencies for existing projects.
● Implemented the mechanism of logging and debugging with Log4j.
● Hands-on experience using the Ant and Maven build tools.
● Completed POCs related to Apache Spark Streaming.
● Performed static code analysis using PMD; code coverage was measured using Sonar.
● Used HP Quality Center as the bug/defect tracking tool.
● Unit testing and maintenance of the project.
● Analyzed production issues, performed root cause analysis, and provided quick resolutions.
● Deployed the projects to production.
● Worked with customer requirements analyst on a bi-weekly basis to realize end-user requirements into functional
requirements
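The transformation steps described above (sorting, grouping, filtering, and merging of records) can be sketched in plain Java. This is an illustrative example only; the record fields and method names are hypothetical, not the production Qarth code:

```java
import java.util.*;
import java.util.stream.*;

/** Illustrative sketch of record transformations: filter, group, merge, sort. */
public class RecordTransforms {

    /** A simplified product record: id, category, price (hypothetical fields). */
    public record Product(String id, String category, double price) {}

    /** Filter out records below a minimum price. */
    public static List<Product> filterByMinPrice(List<Product> in, double min) {
        return in.stream().filter(p -> p.price() >= min).collect(Collectors.toList());
    }

    /** Group records by category. */
    public static Map<String, List<Product>> groupByCategory(List<Product> in) {
        return in.stream().collect(Collectors.groupingBy(Product::category));
    }

    /** Merge two batches; the later batch wins on duplicate ids; result sorted by id. */
    public static List<Product> merge(List<Product> a, List<Product> b) {
        Map<String, Product> byId = new LinkedHashMap<>();
        for (Product p : a) byId.put(p.id(), p);
        for (Product p : b) byId.put(p.id(), p);   // b overrides a on duplicate ids
        List<Product> out = new ArrayList<>(byId.values());
        out.sort(Comparator.comparing(Product::id)); // final sort by id
        return out;
    }

    public static void main(String[] args) {
        List<Product> batch1 = List.of(
            new Product("p1", "toys", 9.99),
            new Product("p2", "home", 24.50));
        List<Product> batch2 = List.of(
            new Product("p2", "home", 19.99),   // updated price for p2
            new Product("p3", "toys", 4.25));

        List<Product> merged = merge(batch1, batch2);
        System.out.println(merged.size());                               // 3
        System.out.println(merged.get(1).price());                       // 19.99
        System.out.println(groupByCategory(merged).get("toys").size());  // 2
    }
}
```

In the real pipeline these operations ran over Avro records read from Kafka; the sketch shows only the transformation logic itself.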
Environment: Core Java, J2EE, Spring IoC, Apache Kafka, Apache Storm, Amazon Kinesis, Apache Spark, Apache Hadoop YARN, Executor Framework, Concurrent Framework, Spring WebMVC 2.5, Tomcat 6.0, SVN (Subversion), Agile Methodologies, Maven, Eclipse IDE, Log4j, PMD, HP Quality Center, Unix
Comcast Inc., Philadelphia, PA July 2015 – August 2016
Xfinity Business Intelligence Platform
System Analyst
The X1 platform generates valuable data that partner users of the platform need to have. The platform has data about
customer viewing patterns including what they're watching, how long, when they tune away, what they record, what they
watch on demand, where they are (zip code), what devices they're using to watch, and more. Partners can use this data
for their analytics and reporting. This data is also valuable to third parties such as ratings services. To accommodate
these requirements for partners, the data solution needs to aim for a syndication mindset. Fairmount is the data
syndication framework built to meet this requirement.
Responsibilities:
● Understanding client requirements and converting them into feasible software solutions.
● Building data pipelines using Apache Spark.
● Generating metrics for executive reporting using Apache Spark.
● Building a data pipeline by reading real-time data from Apache Kafka topics and enriching the data from different data sources.
● Designing solutions that are dynamic and deliver performance.
● Coding in Java to develop the test automation tool.
● Peer Code Review
● Analyzing Sprint stories with Scrum team to identify automation candidates
● Deep-diving into automation candidate stories to outline test steps to automate.
● Redesigned the existing Observer-pattern implementation and introduced a batch processing approach for processing logs.
● Designed the Avro Schema Registry module to handle Avro schemas for the data formats of different mobile apps, for which I received client appreciation.
● Implemented data pipelines by reading Avro data from our internal Kafka cluster, enriching it from other sources, and exposing it to partners through Amazon Kinesis.
● Implemented transformation functionality such as sorting, grouping, filtering, and merging of data.
● Implemented data pipelines by reading Avro data from our internal Kafka cluster and enriching it from other sources using Spark for executive reporting.
● Implemented Spring Dependency Injection in the existing Web Receiver application, for which I received client appreciation.
● Implemented multithreading for parsing and validating large amounts of data.
● Developing applications using various Java/J2EE design patterns to improve usability and flexibility.
● Responsible for maintaining and enhancing jobs.
● Writing user specification documents and design documents.
● Performed reviews, prepared unit test cases, and executed them.
● Agile methodology was used for the software development process, with daily scrums.
● Involved in high-level and low-level design and technical documentation.
● Used Maven to build projects and configured dependencies for existing projects.
● Implemented the mechanism of logging and debugging with Log4j.
● Hands-on experience using the Ant and Maven build tools.
● Conducted research and POCs related to Apache Spark.
● Performed static code analysis using PMD; code coverage was measured using Sonar.
● Used HP Quality Center as the bug/defect tracking tool.
● Unit testing and maintenance of the project.
● Analyzed production issues, performed root cause analysis, and provided quick resolutions.
● Deployed the projects to production.
● Worked with customer requirements analyst on a bi-weekly basis to realize end-user requirements into functional
requirements
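The Avro Schema Registry module described above maps each mobile app's data format to a registered schema, so the pipeline can decode records from any app. A minimal in-memory sketch of that lookup, with hypothetical names rather than the production Comcast design:

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Hypothetical sketch of a schema-registry lookup: each app registers the
 * Avro schema (as its JSON definition) for its event format, keyed by app
 * name and schema version. Illustrative only, not the production module.
 */
public class SchemaRegistrySketch {

    // Thread-safe store, since producers and consumers may hit it concurrently.
    private final Map<String, String> schemas = new ConcurrentHashMap<>();

    private static String key(String app, int version) {
        return app + ":" + version;
    }

    /** Register the schema JSON for an app/version pair. */
    public void register(String app, int version, String schemaJson) {
        schemas.put(key(app, version), schemaJson);
    }

    /** Look up the schema for an app/version, or empty if unknown. */
    public Optional<String> lookup(String app, int version) {
        return Optional.ofNullable(schemas.get(key(app, version)));
    }
}
```

A consumer would resolve the schema for the (app, version) pair carried with each record before decoding its Avro payload.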
Environment: Core Java, J2EE, Spring IoC, Apache Kafka, Amazon Kinesis, Apache Spark, Apache Hadoop YARN, Executor Framework, Concurrent Framework, Spring WebMVC 2.5, Tomcat 6.0, SVN (Subversion), Agile Methodologies, Maven, Eclipse IDE, Log4j, PMD, HP Quality Center, Unix
Comcast Inc., Philadelphia, PA July 2015 – Nov 2015
Casper
System Analyst
Casper is an application for extracting changed data (deltas) from pre-determined column families in Cassandra.
Specifically, it monitors certain column families in the central Cassandra database to extract data required for analytics.
Casper uses Spark, running on a Mesos cluster, to read from dedicated Cassandra nodes to compute deltas. The
results are persisted to a data bus (Kafka).
Responsibilities:
● Understanding client requirements and converting them into feasible software solutions.
● Designing solutions that are dynamic and deliver performance.
● Writing a Spark application in Scala that reads data from the central Cassandra repository and checks whether each record is new or updated.
● Writing updated and new records to a Kafka topic using the Kafka log-compaction feature.
● Unit testing and maintenance of the Project.
● Analyzed production issues, performed root cause analysis, and provided quick resolutions.
● Deployed the project to production.
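The core of this job is computing which rows changed between runs. The real implementation was a Scala Spark application reading from Cassandra, but the delta logic can be sketched in plain Java; names and types here are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Sketch of Casper-style delta extraction: given the previously seen state
 * and a fresh snapshot of a column family, emit only rows that are new or
 * changed, keyed so a log-compacted Kafka topic retains the latest value
 * per key. Illustrative only; the production job ran on Spark/Mesos.
 */
public class DeltaExtractor {

    /**
     * Returns key -> value for every row in {@code snapshot} that is absent
     * from, or different in, {@code previous} -- the "deltas".
     */
    public static Map<String, String> deltas(Map<String, String> previous,
                                             Map<String, String> snapshot) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : snapshot.entrySet()) {
            String old = previous.get(e.getKey());
            if (!e.getValue().equals(old)) {   // new row, or value changed
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }
}
```

Publishing each delta keyed by its row key to a compacted topic means Kafka keeps only the most recent value per key, which is exactly what downstream analytics consumers need.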
Environment: Apache Kafka, Apache Spark, Apache Mesos, SVN (Subversion), Agile Methodologies, Maven, Eclipse IDE, Log4j, PMD, HP Quality Center, Unix
Apple Inc., Cupertino, CA May 2013 – May 2015
iReporter
Sr. Java Developer
The iReporter project is the ETL layer for extracting data from different iCloud apps to the cluster. iReporterAgent is the first component of the ETL layer of Apple's iCloud application. It extracts the logs produced by different apps and sends them to the transformation layer via HTTP in JSON format; the iReporter agent library is the client side of a massive log service that collects events. iReporterWebReceiver is the second component of the ETL layer. It receives data from iReporterAgent and transforms it by validating, sorting, grouping, and filtering. After transformation, it writes the data to the file system in TSV format. HDFSCopier is the third and latest component of the ETL layer; it loads the transformed TSV data into the Hadoop Distributed File System (HDFS).
Responsibilities:
● Understanding client requirements and converting them into feasible software solutions.
● Designing solutions that are dynamic and deliver performance.
● Coding in Java to develop the test automation tool.
● Peer Code Review
● Analyzing Sprint stories with Scrum team to identify automation candidates
● Deep-diving into automation candidate stories to outline test steps to automate.
● Redesigned the existing Observer-pattern implementation and introduced a batch processing approach for processing logs.
● Redesigned the simple web-based application as a Spring Web MVC application to handle data for different mobile apps, for which I received client appreciation.
● Implemented transformation functionality such as sorting, grouping, filtering, and merging of data.
● Implemented Spring Dependency Injection in the existing Web Receiver application, for which I received client appreciation.
● Implemented multithreading for parsing and validating large amounts of data.
● Worked on Apache Kafka for real-time data processing; sound knowledge of Kafka's producer, consumer, and broker concepts.
● Developing applications using various Java/J2EE design patterns to improve usability and flexibility.
● Responsible for maintaining and enhancing the application.
● Writing user specification documents and design documents.
● Performed reviews, prepared unit test cases, and executed them.
● Agile methodology was used for the software development process, with daily scrums.
● Involved in high-level and low-level design and technical documentation.
● Used Maven to build projects and configured dependencies for existing projects.
● Implemented the mechanism of logging and debugging with Log4j.
● Developed a Spring-enabled HDFSCopier application that copies data files from the local file system to the HDFS cluster.
● Redesigned and developed the HDFSCopier component using multithreading, the Concurrent framework, and synchronization, reducing execution time by 70 percent.
● Hands-on experience using the Ant and Maven build tools.
● Conducted research and POCs related to Apache Spark.
● Performed static code analysis using PMD; code coverage was measured using Sonar.
● Used HP Quality Center as the bug/defect tracking tool.
● Unit testing and maintenance of the project.
● Analyzed production issues, performed root cause analysis, and provided quick resolutions.
● Deployed the projects to production.
● Worked with customer requirements analyst on a bi-weekly basis to realize end-user requirements into functional
requirements
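The multithreaded copy approach described for HDFSCopier can be sketched with a fixed thread pool that copies many files concurrently instead of one at a time. In this simplified sketch the destination is a local directory; the real component wrote to HDFS via Apple's in-house APIs, and all names here are illustrative:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

/** Sketch of parallel file copying with a fixed-size worker pool. */
public class ParallelCopier {

    /** Copies every file in {@code sources} into {@code destDir} using {@code threads} workers. */
    public static void copyAll(List<Path> sources, Path destDir, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            var pending = sources.stream()
                .map(src -> pool.submit(() -> {
                    // Each worker copies one file; REPLACE_EXISTING makes reruns idempotent.
                    Files.copy(src, destDir.resolve(src.getFileName()),
                               StandardCopyOption.REPLACE_EXISTING);
                    return null;
                }))
                .toList();
            for (var f : pending) f.get();  // wait, and propagate any copy failure
        } finally {
            pool.shutdown();
        }
    }
}
```

Because the work is I/O-bound, overlapping copies across a pool of workers is where a large speedup (like the 70 percent reduction noted above) typically comes from.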
Environment: Core Java, J2EE, Spring IoC, Executor Framework, Apple's Foundation (in-house API) for development, Concurrent Framework, Spring WebMVC 2.5, XML, JavaScript, JSP, Tomcat 6.0, SVN (Subversion), Agile Methodologies, Maven, Eclipse IDE, Log4j, PMD, HP Quality Center, Unix
Societe Generale Investment Banking, NYC, NY Mar 2011 – April 2013
Tracking/Monitoring Bank Defaulter
Analyst Programmer
Default Monitoring is a web-based banking application used for capturing recovery and loss events and calculating provision amounts (the provisioning process) for debtors in default whose credit ratings are between 8 and 10. This is a regulatory requirement for the bank. The Default Monitoring Application (DMN) was developed to meet this requirement and to replace two existing applications, "Colloss" and "PROV1", to avoid data redundancy. The important modules are Watchlist, Administration, Default, Provision Calculation, and Multi-Obligor Management.
Responsibilities:
● Worked in the design and development of the work flow queues and valuation dashboard using J2EE design
patterns.
● Designed and developed the JSP pages, backing beans, business delegates, services, and DAO mappings, with Oracle as the backend.
● Worked on Java backend services using the Spring Framework.
● Implemented the web service layer (Controllers, Services, and DAOs) using the Java Spring Framework, the Tomcat server, and the Eclipse IDE.
● Implemented UI screens using JSP and JavaScript.
● Wrote Hibernate entity mappings for the database table entities and for persistence into the MySQL database.
● Developing applications using various Java/J2EE design patterns to improve usability and flexibility.
● Worked on developing modules end-to-end, including user interfaces, service modules, and database queries.
● Implemented capturing of user inputs and loading of data into screens using JSP and Spring WebMVC, and implemented the business logic.
● Involved in support of the system maintenance project.
● Used the Spring application context file to define beans, the data source, and Hibernate properties.
● Used Hibernate for persistence into the MySQL database.
● Handled customer queries for the product.
● Performed code reviews with the team and design reviews with the architects.
● Prepared builds and deployments and coordinated with the release management team to ensure the proper process was followed during each release.
● Customized Log4j for maintaining information and debugging.
● Followed the defined quality procedures for the projects and continuously monitored and audited to ensure the team met quality goals.
Environment: Java 5, Tomcat, Spring 2.5 Framework, Core Java, Spring Core, Spring AOP, Hibernate, TortoiseSVN, JSP, MySQL, Windows, RESTful Web Services, SOAP Web Services
Ericsson Nov 2010 – Feb 2011
WCDMA Radio Access Network (WRAN) Data consistency
Sr. Software Engineer
The Consistency Check application is used to check that Network Elements in the WCDMA Radio Access Network
(RAN) have consistent data. If data is not consistent between Network Elements, problems can occur in the network.
Using the Consistency Check interface, you can select Network Elements and check that they are consistent. This is
done by selecting Network Elements and applying 'rules' to check against. The results of a consistency check can be
displayed in a report.
Responsibilities:
● Involved in requirement analysis, code reviews, understanding of end-to-end product.
● Worked on collecting data using Java multithreading and applied design patterns during implementation.
● Worked on implementing algorithms for Data Consistency
● Implemented and designed modules for the User-Interface screens using JSP.
● Designed the database schema and implemented SQL queries using the Firebird database.
● Generated reports using Jasper Reports.
● Developed Java Servlets for implementing the core business logic
● Performed unit testing on the project using JUnit and Mockito
● Demonstration of the product at section and team levels
● Developed Ant-scripts for the project
● Wrote high-level design documents and performed code coverage analysis.
● Used HP Quality Center as the defect/bug tracking tool.
Environment: Java 5, Java Servlets, JSP, SQL, PMD, JUnit, Mockito, Ant-scripts, Apache Tomcat, Windows, HP
Quality Center
State Street Corporation, Boston, MA April 2010 – Oct 2010
Integrated Channel Offer and Response Engine
Software Engineer
Integrated Channel Offer and Response Engine (ICORE) is a web application that enables a flexible and richly
interactive, consultative product acquisition experience for State Street. The system is deployed to enable capabilities
for a diverse constituency of users including consumers looking for a credit product, State Street customers looking for
a new product or for an enhancement to an existing product, and employees and agents of State Street who will use
the system on behalf of the consumer, such as internal and external call centers and bank branch associates. The intent of the ICORE system is to provide a central set of capabilities that can be leveraged to support these channels and the common and specialized needs of these user groups.
Responsibilities:
● Worked in the design and development of the work flow queues and valuation dashboard using J2EE design
patterns.
● Designed and developed the JSP pages using the JSF framework, backing beans, business delegates, services, and DAO mappings, with MySQL as the backend.
● Involved in support of the system maintenance project
● Used the Spring application context file to define beans, the data source, and Hibernate properties.
● Used Hibernate for persistence into the MySQL database.
● Handled customer queries for the product.
● Implemented multithreading.
● Prepared builds and deployments and coordinated with the release management team to ensure the proper process was followed during each release.
● Customized Log4j for maintaining information and debugging.
● Followed the defined quality procedures for the projects and continuously monitored and audited to ensure the team met quality goals.
Environment: Java 5, JSF Framework, multithreading concepts such as synchronization, UNIX, HP Quality Center
American Bureau of Shipping Jul 2009 – Mar 2010
SAFE SHIP
Software Engineer, Level 1
SAFE SHIP, a division of ABS, is one of the leading providers of integrated asset management software solutions for marine and offshore operators. ABS SAFE SHIP offers a fully integrated, modular approach to managing the principal operational expenses associated with a vessel, boat, or offshore rig. Major modules are Machinery Maintenance, Crew Management, and Administration.
Responsibilities:
● Implemented core business service modules in Java.
● Developed some of the user interface screens using Java Swing.
● Gathered performance statistics from remote hosts (ESX VMware hosts) via SSH connections.
● Responsible for analyzing, designing, and developing various modules in the project.
● Was part of review meetings and project demonstrations.
● Implemented the database using MySQL and also worked on business intelligence reports.
● Was responsible for supporting the project and handled customer queries.
● Involved in various responsibilities, including tracking all issues raised by clients/users, performing impact analysis, making proposals, and coordinating with the team.
Environment: Java 5, Eclipse, Core Java, Swing, JAXB, Windows, HP Quality Center, MySQL