SHOGUN  4.1.0
LearningRate Class Reference [abstract]

Detailed Description

The base class for learning rates used in descent-based minimizers.

This is the interface used in descent-based minimizers (e.g., GradientDescendUpdater::update_variable(SGVector<float64_t> variable_reference, SGVector<float64_t> gradient)).

Definition at line 46 of file LearningRate.h.

Inheritance diagram for LearningRate:

Public Member Functions

virtual float64_t get_learning_rate (int32_t iter_counter)=0
 
virtual void update_context (CMinimizerContext *context)=0
 
virtual void load_from_context (CMinimizerContext *context)=0
 

Member Function Documentation

virtual float64_t get_learning_rate ( int32_t  iter_counter)
pure virtual

Get the learning rate for the descent direction. Note that the learning rate is usually positive.

Parameters
iter_counter	the number of iterations
Returns
the learning rate (a.k.a. step size/length)

Implemented in ConstLearningRate, and InverseScalingLearningRate.
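A hedged sketch of what an inverse-scaling schedule can look like follows. The formula eta_t = eta0 / (b + t)^p and the name InverseScalingRate are assumptions for illustration; the exact formula and parameters of Shogun's InverseScalingLearningRate may differ.

```cpp
#include <cstdint>
#include <cmath>

typedef double float64_t;

// Hypothetical inverse-scaling schedule: eta_t = eta0 / (b + t)^p.
// Not Shogun's actual InverseScalingLearningRate implementation.
struct InverseScalingRate
{
    float64_t m_eta0; // initial rate
    float64_t m_b;    // shift, keeps the denominator nonzero at t = 0
    float64_t m_p;    // decay exponent

    InverseScalingRate(float64_t eta0, float64_t b, float64_t p)
        : m_eta0(eta0), m_b(b), m_p(p) {}

    float64_t get_learning_rate(int32_t iter_counter)
    {
        return m_eta0 / std::pow(m_b + iter_counter, m_p);
    }
};
```

A decaying schedule like this is a common way to satisfy the shrinking-step-size conditions of stochastic descent methods while still taking large steps early on.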

virtual void load_from_context ( CMinimizerContext *  context)
pure virtual

Load the given context object to restore mutable variables.

Parameters
context	a context object

Implemented in InverseScalingLearningRate, and ConstLearningRate.

virtual void update_context ( CMinimizerContext *  context)
pure virtual

Update a context object to store the mutable variables used in the learning rate.

Parameters
context	a context object

Implemented in InverseScalingLearningRate, and ConstLearningRate.
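The update_context/load_from_context pair lets a minimizer snapshot and restore a learning rate's mutable state. The sketch below illustrates the pattern with a plain key-value map standing in for CMinimizerContext; the Context type, DecayingRate class, and the halving schedule are all illustrative assumptions, not Shogun code.

```cpp
#include <cstdint>
#include <map>
#include <string>

typedef double float64_t;

// Hypothetical stand-in for CMinimizerContext: a plain key -> value store.
typedef std::map<std::string, float64_t> Context;

// A rate with mutable state (m_eta halves on every query), so that
// save/restore has something observable to do.
struct DecayingRate
{
    float64_t m_eta;
    explicit DecayingRate(float64_t eta) : m_eta(eta) {}

    float64_t get_learning_rate(int32_t) { m_eta *= 0.5; return m_eta; }

    // Mirrors update_context(): write mutable variables into the context.
    void update_context(Context& ctx) const
    {
        ctx["DecayingRate::m_eta"] = m_eta;
    }

    // Mirrors load_from_context(): restore mutable variables from the context.
    void load_from_context(const Context& ctx)
    {
        m_eta = ctx.at("DecayingRate::m_eta");
    }
};
```

After update_context(), later calls to get_learning_rate() keep mutating m_eta, but load_from_context() rewinds the rate to the saved state, which is what lets a minimizer checkpoint and resume mid-run.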


The documentation for this class was generated from the following file:

LearningRate.h
SHOGUN Machine Learning Toolbox - Documentation