Data Scientist & ML Engineer Resume
ABOUT ME
I'm a Data Scientist and Machine Learning Engineer with over 2 years of
project experience in the field.
EDUCATION
B. Tech in Electronics and Communication Engineering | REVA University
AUG 2015 – AUG 2019
CGPA: 6.72
Data Scientist Nanodegree | Udacity
DEC 2018 – AUG 2019
NANODEGREE CERTIFICATE
Learned the skills necessary to become a successful Data Scientist: worked on projects designed by
industry experts, and learned to run data pipelines, design experiments, build recommendation
systems, and deploy solutions to the cloud.
Machine Learning Foundations Nanodegree | Udacity
JULY 2018 – DEC 2018
NANODEGREE CERTIFICATE
Developed skills in programming, descriptive and inferential statistics, and evaluation and
verification of machine learning models, and worked on related projects.
Higher Secondary | St. Joseph’s Pre-University College
MAR 2015
PERCENTAGE: 81.16%
Secondary | RT. Nagar High School
APR 2013
PERCENTAGE: 88.80%
LANGUAGES
ENGLISH, KANNADA, HINDI, TAMIL
MOHAN C R
Bangalore, IN
+91-8431099532
mohancrnwk@gmail.com
www.linkedin.com/in/mohancr8
https://github.com/MohanCR97
mohancr.ml
SKILLS
• Programming Languages:
Python, HTML, CSS, SQL, PHP, C, C++
• Tools:
Tableau, Jupyter Notebook, Anaconda,
AWS
• Libraries & Frameworks:
Pandas, NumPy, Scikit-learn,
Matplotlib, TensorFlow, Keras,
Seaborn, PyTorch, Spark, Bootstrap
• Version Control System: Git
• Database: MySQL
PROJECTS
1. IMAGE CLASSIFIER
Technologies and Tools used: PyTorch, Jupyter Notebook, Anaconda, Python3 and its
libraries NumPy and Matplotlib.
• In this project I developed an image classifier with deep learning and converted it into a
command-line application that others can use on any set of labeled images.
• In the first stage of the project, I loaded and preprocessed an image dataset of 102 flower
categories, each with 20 training images, trained the image classifier on the dataset, and
used the trained classifier to predict image content. For the model architecture I primarily
used VGG16 with two hidden layers.
• At the final stage of the project, I wrote two Python scripts that run from the command
line: one trains a new network on the dataset and saves the model as a checkpoint,
and the other uses the trained network to predict the class for an input image.
• Result: Successfully developed the application for use on other labeled image sets; the
trained network achieved 82% accuracy on the test data.
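The approach above can be sketched as follows. This is a minimal, hypothetical illustration of replacing a pretrained VGG16-style classifier head with two hidden layers for 102 flower classes; the layer sizes shown are my assumptions for illustration, not figures from the original project.

```python
import torch
import torch.nn as nn

# Hypothetical classifier head for a VGG16 backbone (25088 = flattened
# output of VGG16's convolutional features). Hidden sizes are illustrative.
classifier = nn.Sequential(
    nn.Linear(25088, 4096),
    nn.ReLU(),
    nn.Dropout(0.5),
    nn.Linear(4096, 512),
    nn.ReLU(),
    nn.Dropout(0.5),
    nn.Linear(512, 102),       # 102 flower categories
    nn.LogSoftmax(dim=1),      # log-probabilities for NLLLoss
)

# A fake flattened feature vector stands in for real VGG16 features here.
x = torch.randn(1, 25088)
log_probs = classifier(x)
print(log_probs.shape)         # torch.Size([1, 102])
```

In practice this head would be attached to `torchvision.models.vgg16(pretrained=True)` with the convolutional layers frozen, and trained on the flower dataset.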
2. RECOMMENDATION ENGINE
Technologies and Tools used: Jupyter Notebook, Anaconda, Pickle, Python3 and its libraries
NumPy, Pandas and Matplotlib.
• In this project I built recommendations for IBM Watson Studio's data platform by analyzing
the interactions users have with articles on the platform and recommending new articles
they might like.
• First performed exploratory analysis on the data to gain insights before making
recommendations. Then built rank-based recommendations that surface the most popular
articles by user-interaction count, since no ratings were available for any of the articles.
• Next applied user-user collaborative filtering to make recommendations more personal,
by finding users who are similar in terms of the items they have interacted with.
• Finally, decomposed the user-item interaction matrix with Singular Value Decomposition
(NumPy) to estimate how well new articles that an individual might interact with can be
predicted.
• Result: Successfully suggested which recommendation strategies suit different user types,
depending on whether they were new users or users who had already read a few articles.
The system's testing accuracy was around 93% with 300 latent features.
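The SVD step above can be sketched on toy data. This is a hypothetical illustration only: the matrix below is randomly generated, and the matrix sizes and number of latent features are made-up stand-ins for the real IBM Watson Studio interaction data.

```python
import numpy as np

# Toy user-item interaction matrix: 1 = user interacted with the article.
rng = np.random.default_rng(0)
user_item = (rng.random((20, 15)) > 0.7).astype(float)

# Full SVD is applicable here because the matrix has no missing entries.
u, s, vt = np.linalg.svd(user_item)

k = 5                                   # latent features kept (illustrative)
u_k, s_k, vt_k = u[:, :k], np.diag(s[:k]), vt[:k, :]
preds = u_k @ s_k @ vt_k                # reconstructed interaction estimates

# Fraction of entries recovered after rounding predictions back to 0/1.
acc = np.mean(np.round(preds) == user_item)
print(preds.shape, round(float(acc), 2))
```

Varying `k` and measuring accuracy on held-out interactions is how one would choose the number of latent features in practice.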
3. DISASTER RESPONSE PIPELINE
Technologies and Tools used: Jupyter Notebook, Anaconda, NLTK, SQLAlchemy, Flask,
Python3 and its libraries NumPy, Pandas, Scikit-learn, Plotly.
• In this project I took up the challenge of analyzing real-life disaster data from the
company Figure Eight and building a model for an API that classifies disaster messages.
• Built an ETL pipeline that cleans the data and stores it in an SQLite database, and an ML
pipeline that uses NLTK together with scikit-learn's Pipeline and GridSearchCV to produce
a final model that predicts classifications across 36 categories (multi-output
classification), exported as a pickle file.
• This machine learning pipeline categorizes disaster events so that messages can be routed
to the appropriate disaster relief agency. Also built a Flask web app where an emergency
worker can input a new message and get classification results across the categories; the
web app also displays visualizations of the data.
• Result: Successfully built the pipelines and converted them to Python scripts, so that
anyone who later arrives with a revised or new dataset of messages can create a new model
just by running the code. Also successfully displayed the results for the Figure Eight
dataset in the Flask web app.
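The ML pipeline described above can be sketched on toy data. This is a hypothetical, scaled-down illustration, assuming TF-IDF text features feeding a multi-output classifier; the messages and the 3 categories below are invented stand-ins for the real 36-category Figure Eight data.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multioutput import MultiOutputClassifier
from sklearn.ensemble import RandomForestClassifier

# Toy disaster messages with 3 binary categories: [water, fire, medical].
messages = ["we need water", "fire spreading fast", "send water and food",
            "medical help needed", "house on fire", "need a doctor"]
labels = np.array([[1, 0, 0], [0, 1, 0], [1, 0, 0],
                   [0, 0, 1], [0, 1, 0], [0, 0, 1]])

# TF-IDF features -> one classifier per output category.
model = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("clf", MultiOutputClassifier(RandomForestClassifier(random_state=0))),
])
model.fit(messages, labels)
pred = model.predict(["water needed urgently"])
print(pred.shape)              # (1, 3): one 0/1 prediction per category
```

In the full project this estimator would be wrapped in GridSearchCV to tune pipeline parameters, and an NLTK tokenizer/lemmatizer would be passed to the vectorizer.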
4. CUSTOMER SEGMENTATION
Technologies and Tools used: Jupyter Notebook, Anaconda, Python3 and its libraries
NumPy, Pandas, Seaborn, Scikit-learn and Matplotlib.
• In this project I applied unsupervised learning techniques to two demographics datasets
to identify segments and clusters in the population and to see how a company's customers
map to them.
• The first dataset was demographic data for the general population of Germany (891,211
persons × 85 features); the second was demographic data for customers of a mail-order
company (191,652 persons × 85 features).
• I created a cleaning (preprocessing) function for the demographic data, performed
dimensionality reduction on the scaled data using scikit-learn's PCA class, and then
ran K-means clustering on the PCA-transformed data for the general population.
• Result: Successfully compared the cluster distributions of the customer data to the
demographic data and suggested where the company's strongest customer base lies.
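The scale-reduce-cluster workflow above can be sketched as follows. This is a hypothetical illustration on synthetic data; the matrix sizes, component count, and cluster count below are assumptions for demonstration, not values from the original 85-feature datasets.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic stand-in for the (much larger) general-population data.
rng = np.random.default_rng(42)
population = rng.normal(size=(500, 10))

# 1) Scale features, 2) reduce dimensionality with PCA, 3) cluster.
scaled = StandardScaler().fit_transform(population)
pca = PCA(n_components=5).fit(scaled)
reduced = pca.transform(scaled)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(reduced)
print(np.bincount(kmeans.labels_))     # size of each cluster
```

With a fitted scaler, PCA, and KMeans, the customer data would be pushed through the same `transform`/`predict` chain and its cluster proportions compared against the general population's.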
OTHER PROJECTS:
• ISRO Satellite Design
Designed a 6-unit satellite for the agriculture sector that helps locate suitable plots
for farming and helps reduce deforestation.
• Autonomous Agriculture Robot
An autonomous robot that uses deep learning (TensorFlow) to replace labor-intensive
agricultural tasks and improve existing methods with machine learning.
• Renewable Power Trading System using Blockchain
A project in which we developed a custom blockchain model to facilitate an electrical
grid system that allows easy trading of DC power.
For more projects, see: http://mohancr.ml
https://github.com/MohanCR97
PUBLICATIONS AND AWARDS
ACTIVITIES AND CERTIFICATIONS
• Interactive Robot with Image Classification Techniques
Mar-2019
Publisher: International Journal of Scientific Research and Review
Impact Factor: 6.1
• Received the Best Paper Award for the paper titled “Interactive Robot with Image
Classification Techniques” at the 2nd National Conference on Recent Innovation in
Engineering, Science, Humanities and Management.
• Prathibha Puraskar Award-2013 for Securing Distinction in my Secondary School.
• Participated in a 24-Hour ‘DoraHacks Global Hack Series 2018’ Hackathon conducted
by DoraHacks on Blockchain Technology.
• Participated in a 3-day bootcamp on building a payload, organized by Young
Professionals in Space (YPS) in association with IEEE.
• Web Development Training Certification program by Internshala.
• Active member and volunteer of the IEEE student branch (2016-2018).
• Attended workshops on “Android APP Development”, “Blockchain And Bitcoin
Technology”, “Introduction to ERP Using SAP”.
• Volunteered at the International Conference on Smart Technologies for Smart Nation
(SmartTechCon), Aug 2017.
• Attended the 3-day Google Developer Days 2017 event.