Abhishek Kapoor
Software Engineer
M. 9873057645
E-mail : abhishek.kapoor@outlook.com
abhishek.kapoor@iic.ac.in
Blog : kapoorabhish.github.io
LinkedIn : https://in.linkedin.com/in/abhishek-kapoor-4b7b9295
Github : github.com/kapoorabhish
[About Me]
Passionate, responsible, and committed engineer with a get-it-done, on-time spirit and 2 years of experience in product development using Python and PHP. I invest time in understanding the business problem first and then implement those ideas as well as possible through code. I work well alone, but I am at my best collaborating with others.
[Education]
Institute of Informatics and Communication
University of Delhi
JULY 2012 - JULY 2014
M.Sc. Informatics
CGPA: 7.68
Rajdhani College
University of Delhi
JULY 2009 - JULY 2012
B.Sc. (H) Electronics
Aggregate: 67.6%
Rajkiya Pratibha Vikas Vidyalaya
APR 2008 - MAY 2009
CBSE: 76.6%
R. A. Geeta Co-Ed Sr. Sec. School
APR 2006 - MAY 2007
CBSE: 88.6%
Address: B-14, Sharma Bhavan, South
Anarkali, Som Bazar, Krishna
Nagar, Delhi-110051
[Skills]
Python
MongoDB
Elasticsearch
MySQL/PostgreSQL
Redis
PHP
SQL
HTML
Apache Solr
Apache Storm
C
C++
JavaScript
CSS
Core Java
Shell Script
ETL
Cassandra DB
[Experience]
Intern
National Informatics Centre (NIC) : May 2013 – July 2013
Analyst (IT)
Smart Mandate : July 2014 – December 2014
Software Engineer
Juxt Smart Mandate : January 2015 – Present
[Projects]
Project-Exim-Guide
Developed under the aegis of NIC, Govt. of India, "Exim Guide" aims to serve as a single query window for people involved in Indian trade. The app provides important details such as ITC and IEC codes, a tariff calculator, web links, and addresses of trade associations.
I built the web app end to end with 2 other members of my team.
Lunchbox
Lunchbox is a dashboard for people running restaurants, showcasing what people say about those restaurants on the web. We gather web content about the restaurants, run NLP on it, and extract insights such as the sentiment and topics of the content, which are visualised on the dashboard.
URL - http://lunchbox.subcortext.com
I set up the entire pipeline with 3 other members of my team.
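A minimal, illustrative Python sketch of the kind of per-review sentiment scoring such a pipeline could include (the actual Lunchbox implementation is not shown here; the TextBlob library and the function name are assumptions for demonstration):

    # Illustrative sketch only; not the actual Lunchbox code.
    # TextBlob is an assumed library choice for demonstration.
    from textblob import TextBlob

    def review_sentiment(text):
        # Polarity ranges from -1.0 (negative) to +1.0 (positive).
        return TextBlob(text).sentiment.polarity

    print(review_sentiment("The pasta was great but the service was slow."))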
Distributed Scraper
Distributed Scraper is a web scraper that runs in a distributed environment. Machines can be added to the environment on the fly and immediately start scraping. It was designed with speed in mind while avoiding bans from the websites being scraped.
I designed and implemented it end to end.
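A minimal Python sketch of how a worker in such a system might pull URLs from a shared queue (the actual design is not shown here; the use of Redis, the key names, and the request delay are assumptions for illustration):

    # Illustrative sketch only; the real design is not shown here.
    # The Redis queue key, delay, and host are hypothetical assumptions.
    import time

    import redis
    import requests

    r = redis.Redis(host="localhost", port=6379)

    def scrape_worker(queue_key="url_queue", delay_seconds=2.0):
        # Each machine added to the environment runs this loop: it pops a
        # URL from the shared queue, fetches the page, stores the HTML, and
        # pauses between requests to reduce the risk of being banned.
        while True:
            url = r.lpop(queue_key)
            if url is None:
                time.sleep(delay_seconds)
                continue
            response = requests.get(url.decode(), timeout=10)
            r.hset("scraped_pages", url, response.text)
            time.sleep(delay_seconds)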
Elasticsearch Cluster Creation
Elasticsearch Cluster Creation sets up an Elasticsearch cluster with minimal effort. It was designed so that clusters can be created on demand, as required, without manual work.
I designed and implemented it end to end.
Elasticsearch Data Uploader
Elasticsearch Data Uploader loads data into an Elasticsearch cluster with minimal effort. Given a CSV file as input, it uploads the data to the cluster. The idea is that anyone can load a large dataset easily and get insights from it by visualising it on a dashboard.
I designed and implemented it end to end.
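A minimal Python sketch of the core idea, streaming CSV rows into an index with the Elasticsearch bulk helper (the tool's real interface is not shown here; the host, index name, and file name are assumptions for illustration):

    # Illustrative sketch only; the tool's real interface is not shown here.
    # Host, index name, and file path below are placeholder assumptions.
    import csv

    from elasticsearch import Elasticsearch, helpers

    def upload_csv(csv_path, index_name, host="http://localhost:9200"):
        es = Elasticsearch(host)
        with open(csv_path, newline="") as f:
            reader = csv.DictReader(f)
            # Each CSV row becomes one JSON document in the target index.
            actions = ({"_index": index_name, "_source": row} for row in reader)
            helpers.bulk(es, actions)

    upload_csv("restaurants.csv", "restaurants")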