Hyperparameters control a model's behavior and must be chosen carefully before a machine learning model is fit to data. They regulate an algorithm's complexity and topology, and they generally fall into two categories by purpose: those that define the model's structure and those that control the training process. Tuning them well is therefore essential to getting a well-performing model. Neural networks, for example, have hidden layers between the input and output that nonlinearly transform their inputs through activation functions; the number of hidden layers and the number of units within them can be varied according to the network's performance, while the weights themselves are learned during training. Hyperparameters are part of every machine learning pipeline: they are the values assigned to a model before training, sometimes called the model's "magic numbers".
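
As a rough illustration (not taken from the slides), the sketch below fixes a handful of hyperparameters for scikit-learn's MLPClassifier before training. The layer sizes, activation function, and learning rate shown here are assumed example values, not recommendations; the network's weights are then learned when fit is called.

    # Minimal sketch: hyperparameters are set before training,
    # weights are learned during training.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Toy dataset standing in for real data.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = MLPClassifier(
        hidden_layer_sizes=(64, 32),  # topology hyperparameter: two hidden layers
        activation="relu",            # nonlinear transform used in the hidden layers
        learning_rate_init=1e-3,      # training hyperparameter
        max_iter=300,
        random_state=0,
    )

    model.fit(X_train, y_train)       # the weights are learned here, not chosen by hand
    print("test accuracy:", model.score(X_test, y_test))

In practice these values would be selected by a search over candidate settings (for example grid or random search with cross-validation) rather than fixed by hand.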



