Hyperparameter Optimization 101
Alexandra Johnson
Software Engineer, SigOpt
What are Hyperparameters?
Hyperparameters affect model performance.
How Do I Find The Best Hyperparameters?
Step 1: Pick an Objective Metric
Classification models: Accuracy
Regression models: Root MSE
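Both metrics can be computed directly; a minimal sketch with made-up predictions (the arrays below are illustrative, not from the talk):

```python
import numpy as np

# Hypothetical classifier output: accuracy is the fraction of correct labels.
y_true_cls = np.array([1, 0, 1, 1, 0])
y_pred_cls = np.array([1, 0, 0, 1, 0])
accuracy = np.mean(y_true_cls == y_pred_cls)

# Hypothetical regressor output: RMSE is the root of the mean squared error.
y_true_reg = np.array([2.0, 3.5, 4.0])
y_pred_reg = np.array([2.5, 3.0, 4.0])
rmse = np.sqrt(np.mean((y_true_reg - y_pred_reg) ** 2))

print(accuracy)  # 0.8
print(rmse)
```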
Caveat: Cross-Validate to Prevent Overfitting
Cross Validation
[Diagram: the data (indices 0–9) is split into train and validate folds; the split is repeated K times with a different validation fold each time, and the metric is computed for each split.]
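The K-fold scheme in the diagram can be sketched as follows; the helper names and the toy mean-predictor model are mine, not from the talk:

```python
import numpy as np

def k_fold_cv(X, y, train_and_score, k=3, seed=0):
    """Shuffle indices into k folds; each fold is held out once as the
    validation set, and the metric is averaged across the k runs."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        scores.append(train_and_score(X[train], y[train], X[val], y[val]))
    return float(np.mean(scores))

# Toy "model": predict the mean of the training labels, score as negative MSE
# (higher is better, so this plugs into any maximizing search loop).
def mean_model(X_tr, y_tr, X_val, y_val):
    pred = y_tr.mean()
    return -np.mean((y_val - pred) ** 2)

X = np.arange(12, dtype=float).reshape(-1, 1)
y = X.ravel() * 2.0
score = k_fold_cv(X, y, mean_model, k=3)
print(score)
```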
Step 2: Pick an Optimization Strategy
Grid Search, Random Search, or Bayesian Optimization
Step 3: Evaluate N Times
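Steps 2 and 3 together: pick a strategy, then evaluate the objective N times. A minimal sketch of grid versus random search over two hypothetical hyperparameters — the objective here is an analytic stand-in for a cross-validated score, not anything from the talk:

```python
import itertools
import random

# Stand-in objective: pretend this is a cross-validated score as a function
# of a learning rate and a regularization strength, peaking at (0.1, 0.01).
def objective(lr, reg):
    return -((lr - 0.1) ** 2 + (reg - 0.01) ** 2)

# Grid search: evaluate every combination on a fixed grid of candidate values.
lrs = [0.001, 0.01, 0.1, 1.0]
regs = [0.0, 0.01, 0.1]
best_grid = max(itertools.product(lrs, regs), key=lambda p: objective(*p))

# Random search: evaluate N points sampled uniformly from the same ranges.
random.seed(0)
N = 12
samples = [(random.uniform(0.001, 1.0), random.uniform(0.0, 0.1))
           for _ in range(N)]
best_random = max(samples, key=lambda p: objective(*p))

print(best_grid)  # (0.1, 0.01) -- the grid point at the optimum
print(best_random)
```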
What is the Best Hyperparameter Optimization Strategy?
Primary Consideration: How Good are the “Best” Hyperparameters?
“Best Found Value” Distributions
[Chart: distribution of best-found accuracy across repeated experiments.]
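One way to produce such a distribution, sketched under my own assumptions (a synthetic one-parameter objective, with random search as the strategy): repeat the whole search many times and record the best value found in each run — the spread of those best values is what the chart compares across strategies.

```python
import random
import statistics

def objective(x):
    # Stand-in for a model's cross-validated accuracy as a function of one
    # hyperparameter, peaking at x = 0.3.
    return 1.0 - (x - 0.3) ** 2

def best_of_random_search(n_evals, rng):
    # One "experiment": run the strategy for n_evals evaluations and
    # return the best objective value it found.
    return max(objective(rng.uniform(0, 1)) for _ in range(n_evals))

rng = random.Random(42)
# 100 independent experiments of 20 evaluations each; the distribution of
# these best-found values is what the slide's chart shows per strategy.
best_values = [best_of_random_search(20, rng) for _ in range(100)]
print(statistics.median(best_values))
```

The abstract's point about statistical tests follows from this setup: with one best-found-value sample per experiment and per strategy, the two samples can be compared with a standard two-sample test.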
Secondary Consideration: How Much Time Do You Have?
Number of Evaluations Required

               Grid Search   Random Search   Bayesian Optimization
2 parameters           100              ??                   20-40
3 parameters         1,000              ??                   30-60
4 parameters        10,000              ??                   40-80
5 parameters       100,000              ??                   50-100
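The Grid Search column follows directly if each parameter gets 10 candidate values — that per-axis count is my assumption, inferred from the slide's numbers:

```python
# A full factorial grid with 10 values per parameter needs 10**d
# evaluations for d parameters, which matches the Grid Search column.
grid_evals = {d: 10 ** d for d in range(2, 6)}
print(grid_evals)  # {2: 100, 3: 1000, 4: 10000, 5: 100000}
```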
SigOpt
Easy-to-use REST API with R, Java, and Python clients
Ensemble of Bayesian optimization techniques
Free trial, academic discount, we’re hiring!
SigOpt Tutorial Videos
Versus untuned models:
+315.2% accuracy with TensorFlow CNN
+49.2% accuracy with XGBoost + unsupervised features
Learn More
See more at sigopt.com/research:
● Blog posts
● Papers
● Videos
Thank You!
Twitter: @SigOpt
Email: support@sigopt.com
Web: sigopt.com/getstarted
Hyperparameter optimization is a common problem in machine learning. Machine learning algorithms, from logistic regression to neural nets, depend on well-tuned hyperparameters to reach maximum effectiveness. Different hyperparameter optimization strategies have varied performance and cost (in time, money, and compute cycles). So how do you choose? This talk gives a brief introduction to hyperparameter tuning and its importance, as well as the basics of how we apply statistical tests to make confident assertions about which hyperparameter optimization strategies can give you better results, faster.

Published in: Technology