Hyperparameter
- Knobs that the ML practitioner has to tweak during successive runs of training a model
- Unlike parameters, hyperparameters are not automatically learned
- Examples: learning rate, learning rate decay, number of epochs, batch size, number of layers, activation functions of the layers (see the sketch after this list)
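As a concrete illustration, the sketch below sets each of these knobs explicitly before training starts. It assumes scikit-learn and its `MLPClassifier`, which are not part of the notes above; they are used here only to show where the hyperparameters live.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Small synthetic dataset so the sketch is self-contained.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Every argument below is a hyperparameter: chosen by the trainer up front,
# not learned from the data.
model = MLPClassifier(
    hidden_layer_sizes=(64, 32),   # number of layers and their widths
    activation="relu",             # activation function of the layers
    solver="sgd",                  # needed so the decay schedule below applies
    learning_rate_init=0.001,      # learning rate
    learning_rate="invscaling",    # learning rate decay schedule
    batch_size=32,                 # batch size
    max_iter=50,                   # number of epochs (passes over the data)
    random_state=0,
)

model.fit(X, y)  # the parameters (weights, biases) are learned here
```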
Area | Hyperparameter | Parameter |
---|---|---|
Who decides? | Trainer | Training process |
Example | Number of layers, learning rate, activation functions | Weight, bias |
What is the difference between a parameter and a hyperparameter?
- Parameters are automatically learned during the training process.
- Hyperparameters have to be decided by the trainer.
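To make the distinction concrete, the short sketch below continues the hypothetical scikit-learn example above and inspects the parameters that the training process produced on its own; the trainer only chose the hyperparameters.

```python
# Continuing the sketch above: after fit(), the learned parameters
# (weights and biases) are available on the model; the trainer never
# set their values directly.
for i, (w, b) in enumerate(zip(model.coefs_, model.intercepts_)):
    print(f"layer {i}: weight matrix {w.shape}, bias vector {b.shape}")
```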