Create my own design in Photoshop using the new techniques I have learned. Upload the before and after products, along with a Word document describing everything and listing the tutorials I used and completed.
Hyperparameter Optimization (HPO) is the technique of tuning a learning algorithm's hyperparameters to optimize its performance on an independent dataset.
Wiki Definition: https://en.wikipedia.org/wiki/Hyperparameter_optimization
In this presentation, we highlight how one can use VW (Vowpal Wabbit) and the Spark framework to build generalized models in a distributed way.
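To make the idea concrete, here is a minimal sketch of grid-search HPO in plain Python. The objective function and the parameter names (`learning_rate`, `l2`, echoing VW's `--learning_rate` and `--l2` flags) are illustrative assumptions; in a real VW-on-Spark setup each worker would train a VW model for one parameter combination and report its held-out loss.

```python
from itertools import product

# Toy stand-in for validation loss; a real pipeline would train a
# VW model per combination and measure loss on a held-out dataset.
def validation_loss(learning_rate, l2):
    return (learning_rate - 0.1) ** 2 + (l2 - 1e-6) ** 2

# Hypothetical search space over two VW-style hyperparameters.
grid = {
    "learning_rate": [0.01, 0.1, 0.5],
    "l2": [0.0, 1e-6, 1e-4],
}

def grid_search(grid, objective):
    """Evaluate every parameter combination; return (best_params, best_loss)."""
    names = list(grid)
    best = None
    for values in product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        loss = objective(**params)
        if best is None or loss < best[1]:
            best = (params, loss)
    return best

best_params, best_loss = grid_search(grid, validation_loss)
print(best_params)
```

Because each combination is evaluated independently, the inner loop is embarrassingly parallel, which is exactly what Spark exploits when the sweep is distributed across executors.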