The base class of learning rates for descent-based minimizers.
This is the interface used in descent-based minimizers (e.g., GradientDescendUpdater::update_variable(SGVector<float64_t> variable_reference, SGVector<float64_t> gradient)).
Definition at line 46 of file LearningRate.h.
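The following is a minimal sketch of how a descent-based updater might consume a learning-rate object of this kind. It is not the actual Shogun implementation; the free function update_variable and the use of std::vector instead of SGVector are illustrative assumptions, and only the pure virtual get_learning_rate method mirrors the interface documented below.

// Sketch only: illustrates the role of the learning-rate interface in a
// gradient-descent update. Names other than get_learning_rate are assumptions.
#include <cstdint>
#include <vector>

class LearningRate
{
public:
    virtual ~LearningRate() = default;
    // Corresponds to the pure virtual method documented below.
    virtual double get_learning_rate(int32_t iter_counter) = 0;
};

// Hypothetical descent step: variable <- variable - rate * gradient
void update_variable(std::vector<double>& variable,
                     const std::vector<double>& gradient,
                     LearningRate& rate, int32_t iter_counter)
{
    const double eta = rate.get_learning_rate(iter_counter);
    for (std::size_t i = 0; i < variable.size(); ++i)
        variable[i] -= eta * gradient[i];
}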
virtual float64_t get_learning_rate ( int32_t iter_counter )  [pure virtual]
Get the learning rate for the descent direction. Note that the learning rate is usually positive.
- Parameters
  - iter_counter — the number of iterations
- Returns
  - the learning rate (a.k.a. step size/length)
Implemented in ConstLearningRate and InverseScalingLearningRate.
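For illustration, a constant-rate subclass of the sketch interface above might look as follows. This mirrors what an implementation such as ConstLearningRate could do (return a fixed positive step size regardless of the iteration counter), but the class and member names here are assumptions, not the actual Shogun source.

// Sketch of a constant learning rate built on the interface above.
class SketchConstLearningRate : public LearningRate
{
public:
    explicit SketchConstLearningRate(double rate) : m_rate(rate) {}

    // The iteration counter is ignored: the step size never changes.
    double get_learning_rate(int32_t iter_counter) override
    {
        (void)iter_counter;
        return m_rate;  // positive by convention
    }

private:
    double m_rate;
};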
The documentation for this class was generated from the following file: LearningRate.h