@louisdorard
#papisconnect
PredictiveAPIs
Student / Researcher / Data Scientist / Developer / Non-technical
Agenda:
• Machine Learning
• Use cases
• Limitations
• Predictive APIs
• Does it work?
• Case study
• ML Canvas
“Predictive apps are the next big thing in app development.”
— Mike Gualtieri, Principal Analyst at Forrester
“Predictive is the ‘killer app’ for big data.”
— Waqar Hasan, VISA
1. Machine Learning
2. Data
BUT
“A significant constraint on realizing value from big data will be a shortage of talent, particularly of people with deep expertise in statistics and machine learning.”
— McKinsey & Co. (2011)
Demystifying Machine Learning
“Which type of email is this? — Spam/Ham”
→ Classification (input I: an email; output O: Spam or Ham)
“How much is this house worth? — X $”
→ Regression
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 3 | 1   | 1012 | 1951 | house     |         |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |
| 3 | 2   | 1599 | 1964 | house     |         |
| 3 | 2   | 987  | 1951 | townhouse | 790,000 |
| 1 | 1   | 530  | 2007 | condo     | 122,000 |
| 4 | 2   | 1574 | 1964 | house     | 835,000 |
| 4 |     |      | 2001 | house     | 855,000 |
| 3 | 2.5 | 1472 | 2005 | house     |         |
| 4 | 3.5 | 1714 | 2005 | townhouse |         |
| 2 | 2   | 1113 | 1999 | condo     |         |
| 1 |     | 769  | 1999 | condo     | 315,000 |
ML is a set of AI techniques where
“intelligence” is built by referring to
examples
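As a minimal sketch of "learning from examples" (assuming scikit-learn is available; the numbers are the rows of the table above that have a known price, using only bedrooms and surface):

```python
# "Intelligence" built by referring to examples: fit a model on known
# (inputs, output) pairs, then ask it about an unseen input.
from sklearn.linear_model import LinearRegression

# (bedrooms, surface) -> price, from the rows with known prices
X = [[3, 860], [2, 968], [4, 1315], [3, 987], [1, 530], [4, 1574], [1, 769]]
y = [565_000, 447_000, 648_000, 790_000, 122_000, 835_000, 315_000]

model = LinearRegression().fit(X, y)     # learn from the examples
prediction = model.predict([[3, 1012]])  # the house whose price is unknown
```

The model contains no hand-written rules about houses; everything it "knows" comes from the example rows.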
Use cases (input → output, example product):
• Real estate: property → price (Zillow)
• Spam filtering: email → spam indicator (Gmail)
• Priority inbox: email → importance indicator (Gmail)
• Crowd prediction: location & context → #people (Tranquilien)
I. Get more customers
• Reduce churn: customer → churn indicator
• Score leads: customer → revenue
• Optimize campaigns: customer & campaign → interest indicator
II. Serve customers better
• Cross-sell: customer & product → purchase indicator
• Increase engagement: user & item → interest indicator
• Optimize pricing: product & price → #sales
III. Serve customers more efficiently
• Predict demand: context → demand
• Automate tasks: credit application → repayment indicator
• Use predictive enterprise apps
Predictive enterprise apps
• Priority filtering: message → priority indicator
• Message routing: request → employee
• Auto-configuration: user & actions → settings
“Pairing human workers with machine learning and automation will transform knowledge work and unleash new levels of human productivity and creativity.”
— Katherine Barr, Partner at VC firm MDV
Limitations
Need examples of inputs AND outputs
What if not enough data points?
What if similar inputs have dissimilar outputs?
| Bedrooms | Bathrooms | Price ($) |
| 3 | 2 | 500,000 |
| 3 | 2 | 800,000 |
| 1 | 1 | 300,000 |
| 1 | 1 | 800,000 |
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Price ($) |
| 3 | 2 | 800  | 1950 | 500,000 |
| 3 | 2 | 1000 | 1950 | 800,000 |
| 1 | 1 | 500  | 1950 | 300,000 |
| 1 | 1 | 500  | 2014 | 800,000 |
“A model can only be as good as the data it was given to train on”
— @louisdorard
Predictive APIs: ML for all
HTML / CSS / JavaScript
squarespace.com
The two phases of machine learning:
• TRAIN a model
• PREDICT with a model
The two methods of predictive APIs:
• TRAIN a model
• PREDICT with a model
The two methods of predictive APIs:
• model = create_model(dataset)
• predicted_output = create_prediction(model, new_input)
The two methods of predictive APIs:
• model = create_model('training.csv')
• predicted_output = create_prediction(model, new_input)
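A runnable sketch of this two-method contract (real providers expose the same two calls over HTTP; here a toy 1-nearest-neighbour model stands in for the remote service, so nothing in this sketch is any particular provider's API):

```python
# Toy local implementation of a predictive API's two methods.
import csv
import io

def create_model(training_csv):
    """TRAIN: 'fit' a model from CSV text whose last column is the output."""
    rows = list(csv.reader(io.StringIO(training_csv)))
    return [([float(v) for v in r[:-1]], float(r[-1])) for r in rows[1:]]

def create_prediction(model, new_input):
    """PREDICT: return the output of the most similar training example."""
    def dist(x):
        return sum((a - b) ** 2 for a, b in zip(x, new_input))
    return min(model, key=lambda ex: dist(ex[0]))[1]

training = ("bedrooms,surface,price\n"
            "3,860,565000\n2,968,447000\n1,530,122000\n4,1574,835000\n")
model = create_model(training)
predicted_output = create_prediction(model, [3, 1012])
```

Features are deliberately left unscaled here, so surface dominates the distance; a real service would handle such preprocessing for you.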
“Is this email important?
— Yes/No”
“Is this customer going to leave next month?
— Yes/No”
“What is the sentiment of this tweet?
— Positive/Neutral/Negative”
The two phases of machine learning:
• TRAIN a model
• PREDICT with an already existing model
“Is this email spam?
— Yes/No”
Does it work?
How well?
Take the dataset again:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 3 | 1   | 1012 | 1951 | house     |         |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |
| 3 | 2   | 1599 | 1964 | house     |         |
| 3 | 2   | 987  | 1951 | townhouse | 790,000 |
| 1 | 1   | 530  | 2007 | condo     | 122,000 |
| 4 | 2   | 1574 | 1964 | house     | 835,000 |
| 4 |     |      | 2001 | house     | 855,000 |
| 3 | 2.5 | 1472 | 2005 | house     |         |
| 4 | 3.5 | 1714 | 2005 | townhouse |         |
| 2 | 2   | 1113 | 1999 | condo     |         |
| 1 |     | 769  | 1999 | condo     | 315,000 |
Put the rows with known prices first:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |
| 3 | 2   | 987  | 1951 | townhouse | 790,000 |
| 1 | 1   | 530  | 2007 | condo     | 122,000 |
| 4 | 2   | 1574 | 1964 | house     | 835,000 |
| 4 |     |      | 2001 | house     | 855,000 |
| 1 |     | 769  | 1999 | condo     | 315,000 |
| 3 | 1   | 1012 | 1951 | house     |         |
| 3 | 2   | 1599 | 1964 | house     |         |
| 3 | 2.5 | 1472 | 2005 | house     |         |
| 4 | 3.5 | 1714 | 2005 | townhouse |         |
| 2 | 2   | 1113 | 1999 | condo     |         |
Keep only the rows with a known price:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |
| 3 | 2   | 987  | 1951 | townhouse | 790,000 |
| 1 | 1   | 530  | 2007 | condo     | 122,000 |
| 4 | 2   | 1574 | 1964 | house     | 835,000 |
| 4 |     |      | 2001 | house     | 855,000 |
| 1 |     | 769  | 1999 | condo     | 315,000 |
Shuffle them:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 1 |     | 769  | 1999 | condo     | 315,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |
| 4 | 2   | 1574 | 1964 | house     | 835,000 |
| 3 | 2   | 987  | 1951 | townhouse | 790,000 |
| 4 |     |      | 2001 | house     | 855,000 |
| 1 | 1   | 530  | 2007 | condo     | 122,000 |
Split them into two halves:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 4 | 2   | 1574 | 1964 | house     | 835,000 |
| 3 | 2   | 987  | 1951 | townhouse | 790,000 |
| 4 |     |      | 2001 | house     | 855,000 |
| 1 | 1   | 530  | 2007 | condo     | 122,000 |

| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 1 |     | 769  | 1999 | condo     | 315,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |
Train a model on one half:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Price ($) |
| 2 | 1.5 | 968  | 1976 | townhouse | 447,000 |
| 3 | 1   | 860  | 1950 | house     | 565,000 |
| 1 |     | 769  | 1999 | condo     | 315,000 |
| 4 |     | 1315 | 1950 | house     | 648,000 |

Set the actual prices of the other half aside, and have the model predict them:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Predicted ($) | Actual ($) |
| 4 | 2 | 1574 | 1964 | house     | | 835,000 |
| 3 | 2 | 987  | 1951 | townhouse | | 790,000 |
| 4 |   |      | 2001 | house     | | 855,000 |
| 1 | 1 | 530  | 2007 | condo     | | 122,000 |
Compare the model's predictions with the actual prices:
| Bedrooms | Bathrooms | Surface (ft²) | Year built | Type | Predicted ($) | Actual ($) |
| 4 | 2 | 1574 | 1964 | house     | 818,000 | 835,000 |
| 3 | 2 | 987  | 1951 | townhouse | 800,000 | 790,000 |
| 4 |   |      | 2001 | house     | 915,000 | 855,000 |
| 1 | 1 | 530  | 2007 | condo     | 100,000 | 122,000 |
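The split-train-predict-compare procedure can be sketched as follows (scikit-learn assumed; bedrooms and surface are used as features, and rows with missing values are left out):

```python
# Evaluate a model by hiding half of the answers and measuring
# how far its predictions are from the actual prices.
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

# first half: training rows (prices known to the model)
X_train = [[2, 968], [3, 860], [1, 769], [4, 1315]]
y_train = [447_000, 565_000, 315_000, 648_000]
# second half: test rows (prices hidden, kept only for comparison)
X_test = [[4, 1574], [3, 987], [1, 530]]
y_test = [835_000, 790_000, 122_000]

model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)
mae = mean_absolute_error(y_test, y_pred)  # average error, in dollars
```

With this few examples the error is large, which is exactly the point of measuring: it tells you whether the model is good enough to use.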
Need real-time machine learning?
The two phases of machine learning:
• TRAIN a model
• PREDICT with a model
• Training time
• Prediction time
• Accuracy
Case study: churn analysis
• Who: SaaS company selling monthly subscription
• Question asked: “Is this customer going to leave within 1 month?”
• Input: customer
• Output: no-churn (negative) or churn (positive)
• Data collection: history up until 1 month ago
• Baseline: if no usage for more than 15 days then
churn
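The baseline is a one-line rule that any trained model has to beat:

```python
# Baseline from the case study: predict churn iff the customer
# has not used the service for more than 15 days.
def baseline_predict(days_since_last_use):
    return "churn" if days_since_last_use > 15 else "no-churn"

baseline_predict(20)  # "churn"
baseline_predict(3)   # "no-churn"
```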
Learning: OK, but…
• How to represent customers?
• What to do after predicting churn?
Customer representation:
• basic info (age, income, etc.)
• usage of service (# times used app, avg time spent,
features used, etc.)
• interactions with customer support (how many,
topics of questions, satisfaction ratings)
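One possible way to flatten such a record into a fixed-length feature vector a model can consume (the field names here are illustrative, not the company's actual schema):

```python
def customer_features(c):
    """Turn one customer record into a numeric feature vector."""
    return [
        c["age"], c["income"],                       # basic info
        c["app_uses"], c["avg_session_minutes"],     # usage of the service
        c["support_tickets"], c["avg_satisfaction"], # support interactions
    ]

features = customer_features({
    "age": 34, "income": 52_000,
    "app_uses": 18, "avg_session_minutes": 7.5,
    "support_tickets": 2, "avg_satisfaction": 4.0,
})
```

Every customer then maps to a vector of the same length, which is what TRAIN and PREDICT both expect.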
Taking action to prevent churn:
• contact customers (in which order?)
• switch to different plan
• give special offer
• no action?
Measuring accuracy:
• #TP (we predict customer churns and he does)
• #FP (we predict customer churns but he doesn’t)
• #FN (we predict customer doesn’t churn but he does)
• Compare to baseline
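These counts can be computed directly from the predicted and actual label lists (1 = churn, 0 = no-churn; the example labels below are made up):

```python
def confusion_counts(predicted, actual):
    """Count true positives, false positives and false negatives."""
    tp = sum(1 for p, a in zip(predicted, actual) if p == 1 and a == 1)
    fp = sum(1 for p, a in zip(predicted, actual) if p == 1 and a == 0)
    fn = sum(1 for p, a in zip(predicted, actual) if p == 0 and a == 1)
    return tp, fp, fn

tp, fp, fn = confusion_counts(predicted=[1, 1, 0, 0, 1],
                              actual=[1, 0, 0, 1, 1])
# tp=2, fp=1, fn=1
```

Run the same counts on the baseline's predictions to see whether the model actually improves on it.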
Estimating Return On Investment:
• Taking action for #TP and #FP customers has a cost
• We earn #TP × success rate × revenue per customer per month
• Compare to baseline
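The ROI estimate spelled out as a formula (all the figures below are placeholders, not data from the case study):

```python
def estimated_roi(tp, fp, success_rate, revenue_per_cust_month, action_cost):
    """Monthly ROI of acting on predicted churners."""
    cost = (tp + fp) * action_cost                     # we act on everyone predicted to churn
    gain = tp * success_rate * revenue_per_cust_month  # churners we actually retain
    return gain - cost

# e.g. 40 TP, 10 FP, 30% of contacted churners stay,
# $50/customer/month revenue, $5 cost per retention action
roi = estimated_roi(40, 10, 0.30, 50, 5)  # 600 - 250 = 350
```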
Machine Learning Canvas
PREDICTIONS OBJECTIVES DATA
Context
Who will use the predictive system / who will be
affected by it? Provide some background.
Value Proposition
What are we trying to do? E.g. spend less time on
X, increase Y...
Data Sources
Where do/can we get data from? (internal
database, 3rd party API, etc.)
Problem
Question to predict answers to (in plain English)
Input (i.e. question "parameter")
Possible outputs (i.e. "answers")
Type of problem (e.g. classification, regression,
recommendation...)
Baseline
What is an alternative way of making predictions
(e.g. manual rules based on feature values)?
Performance evaluation
Domain-specific / bottom-line metrics for
monitoring performance in production
Prediction accuracy metrics (e.g. MSE if
regression; % accuracy, #FP for classification)
Offline performance evaluation method (e.g.
cross-validation or simple training/test split)
Dataset
How do we collect data (inputs and outputs)?
How many data points?
Features
Used to represent inputs and extracted from
data sources above. Group by types and
mention key features if too many to list all.
Using predictions
When do we make predictions and how many?
What is the time constraint for making those predictions?
How do we use predictions and confidence values?
Learning predictive models
When do we create/update models? With which data / how much?
What is the time constraint for creating a model?
Criteria for deploying model (e.g. minimum performance value — absolute,
relative to baseline or to previous model)
IDEA / SPECS / DEPLOYMENT
BACKGROUND
ENGINE SPECS
INTEGRATION
PREDICTIONS OBJECTIVES DATA
BACKGROUND: End-user / Value prop / Sources
ENGINE SPECS: ML problem / Perf eval / Preparation
INTEGRATION: Using pred / Learning model
Why fill in the ML Canvas?
• Target the right problem for your company
• Choose the right algorithm, infrastructure, or ML solution
• Guide project management
• Improve team communication
machinelearningcanvas.com
Recap
• Need examples of inputs AND outputs
• Need enough examples
• ML to create value from data
• 2 phases: TRAIN and PREDICT
• Predictive APIs make it more accessible
• Good data is essential
• What do we do with predictions?
• Measure performance with accuracy, time and
bottom-line
• Also: deploy, maintain, improve…
louisdorard.com
