https://d2l.ai/chapter_hyperparameter-optimization/hyperopt-intro.html
There is no code to read for question 1
Hyperparameters are parameters that you set before training your model; they are not learned from the data. Choosing them is usually an iterative, trial-and-error process.
Examples:
- batch_size
- no_of_epochs
- learning_rate, etc.
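A minimal sketch of that trial-and-error process (the values below are hypothetical, not from the book): hyperparameters are fixed before each training run, and one common way to choose them is to enumerate candidate combinations and train once per combination.

```python
import itertools

# Hypothetical candidate values for two hyperparameters.
batch_sizes = [32, 64]
learning_rates = [0.1, 0.01]

# Each configuration would be used for one full training run.
configs = list(itertools.product(batch_sizes, learning_rates))
print(len(configs))  # 4 candidate configurations to try
```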
I hope it helps.
There is a mismatch between the first code example in section 19.1.1.2 and its explanation. To be consistent, either the code needs to be changed to “stats.loguniform(1e-4, 0.1)” or the text to “which represents a uniform distribution between -4 and 0 in the logarithmic space.”
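To illustrate the point (a quick sketch, not the book's code): with `scipy.stats.loguniform(a, b)`, the base-10 logarithm of the samples is uniform on [log10(a), log10(b)], so `loguniform(1e-4, 1)` corresponds to the interval [-4, 0] in log space, while [-4, -1] would require `loguniform(1e-4, 0.1)`.

```python
import numpy as np
from scipy import stats

# Draw samples from loguniform(1e-4, 1); their log10 values are
# uniformly distributed over [-4, 0].
samples = stats.loguniform(1e-4, 1).rvs(size=10_000, random_state=0)
logs = np.log10(samples)
print(logs.min() >= -4 and logs.max() <= 0)  # all samples lie in [1e-4, 1]
```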
Best regards
It is in chapter 3.2.4