SHOGUN 3.2.1
CNeuralLinearLayer Class Reference

## Detailed Description

Neural layer with linear neurons, with an identity activation function. Can be used as a hidden layer or an output layer.

Each neuron in the layer is connected to all the neurons in all the layers that connect into this layer.

Activations for each train/test case are computed according to $$b + \sum_i W_i x_i$$ where $b$ is the bias vector, $W_i$ is the weight matrix between this layer and layer i of its inputs, and $x_i$ is the activations vector of layer i.

The layout of the parameter vector of this layer is as follows:

• The first num_neurons elements correspond to the biases
• The following elements correspond to the weight matrices. For each layer i that connects into this layer as input, a weight matrix of size num_neurons*num_neurons_i is stored in column-major format.

When used as an output layer, a squared error measure is used.
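The parameter layout and activation formula above can be sketched for the single-input-layer case as follows. This is a minimal illustration with a hypothetical helper function, not Shogun's actual implementation:

```cpp
#include <cstddef>
#include <vector>

// Sketch: computes b + W*x for a linear layer from a packed parameter vector
// laid out as described above: the first num_neurons entries are the biases,
// followed by a num_neurons x num_inputs weight matrix in column-major order.
std::vector<double> linear_activations(const std::vector<double>& params,
                                       const std::vector<double>& x,
                                       std::size_t num_neurons)
{
    std::size_t num_inputs = x.size();
    std::vector<double> a(num_neurons);
    for (std::size_t j = 0; j < num_neurons; j++)
        a[j] = params[j]; // bias b_j
    // column-major: W(j, i) lives at params[num_neurons + i*num_neurons + j]
    for (std::size_t i = 0; i < num_inputs; i++)
        for (std::size_t j = 0; j < num_neurons; j++)
            a[j] += params[num_neurons + i * num_neurons + j] * x[i];
    return a;
}
```

With several input layers, the same loop would repeat once per stored weight matrix, accumulating into the same activations vector.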

Definition at line 63 of file NeuralLinearLayer.h.

Inheritance diagram for CNeuralLinearLayer:

## Public Member Functions

CNeuralLinearLayer ()
CNeuralLinearLayer (int32_t num_neurons)
virtual ~CNeuralLinearLayer ()
virtual void initialize (CDynamicObjectArray *layers, SGVector< int32_t > input_indices)
virtual void initialize_parameters (SGVector< float64_t > parameters, SGVector< bool > parameter_regularizable, float64_t sigma)
virtual void compute_activations (SGVector< float64_t > parameters, CDynamicObjectArray *layers)
virtual void compute_gradients (SGVector< float64_t > parameters, SGMatrix< float64_t > targets, CDynamicObjectArray *layers, SGVector< float64_t > parameter_gradients)
virtual float64_t compute_error (SGMatrix< float64_t > targets)
virtual void enforce_max_norm (SGVector< float64_t > parameters, float64_t max_norm)
virtual float64_t compute_contraction_term (SGVector< float64_t > parameters)
virtual void compute_contraction_term_gradients (SGVector< float64_t > parameters, SGVector< float64_t > gradients)
virtual void compute_local_gradients (SGMatrix< float64_t > targets)
virtual const char * get_name () const
virtual void set_batch_size (int32_t batch_size)
virtual bool is_input ()
virtual void compute_activations (SGMatrix< float64_t > inputs)
virtual void dropout_activations ()
virtual int32_t get_num_neurons ()
virtual int32_t get_width ()
virtual int32_t get_height ()
virtual int32_t get_num_parameters ()
virtual SGMatrix< float64_t > get_activations ()
virtual SGMatrix< float64_t > get_activation_gradients ()
virtual SGMatrix< float64_t > get_local_gradients ()
virtual CSGObject * shallow_copy () const
virtual CSGObject * deep_copy () const
virtual bool is_generic (EPrimitiveType *generic) const
template<class T >
void set_generic ()
void unset_generic ()
virtual void print_serializable (const char *prefix="")
virtual bool save_serializable (CSerializableFile *file, const char *prefix="", int32_t param_version=Version::get_version_parameter())
virtual bool load_serializable (CSerializableFile *file, const char *prefix="", int32_t param_version=Version::get_version_parameter())
DynArray< TParameter * > * load_file_parameters (const SGParamInfo *param_info, int32_t file_version, CSerializableFile *file, const char *prefix="")
DynArray< TParameter * > * load_all_file_parameters (int32_t file_version, int32_t current_version, CSerializableFile *file, const char *prefix="")
void map_parameters (DynArray< TParameter * > *param_base, int32_t &base_version, DynArray< const SGParamInfo * > *target_param_infos)
void set_global_io (SGIO *io)
SGIO * get_global_io ()
void set_global_parallel (Parallel *parallel)
Parallel * get_global_parallel ()
void set_global_version (Version *version)
Version * get_global_version ()
SGStringList< char > get_modelsel_names ()
void print_modsel_params ()
char * get_modsel_param_descr (const char *param_name)
index_t get_modsel_param_index (const char *param_name)
void build_gradient_parameter_dictionary (CMap< TParameter *, CSGObject * > *dict)
virtual void update_parameter_hash ()
virtual bool parameter_hash_changed ()
virtual bool equals (CSGObject *other, float64_t accuracy=0.0, bool tolerant=false)
virtual CSGObject * clone ()

## Public Attributes

bool is_training
float64_t dropout_prop
float64_t contraction_coefficient
ENLAutoencoderPosition autoencoder_position
SGIO * io
Parallel * parallel
Version * version
Parameter * m_parameters
Parameter * m_model_selection_parameters
ParameterMap * m_parameter_map
uint32_t m_hash

## Protected Member Functions

virtual TParameter * migrate (DynArray< TParameter * > *param_base, const SGParamInfo *target)
virtual void one_to_one_migration_prepare (DynArray< TParameter * > *param_base, const SGParamInfo *target, TParameter *&replacement, TParameter *&to_migrate, char *old_name=NULL)
virtual void load_serializable_pre () throw (ShogunException)
virtual void load_serializable_post () throw (ShogunException)
virtual void save_serializable_pre () throw (ShogunException)
virtual void save_serializable_post () throw (ShogunException)

## Protected Attributes

int32_t m_num_neurons
int32_t m_width
int32_t m_height
int32_t m_num_parameters
SGVector< int32_t > m_input_indices
SGVector< int32_t > m_input_sizes
int32_t m_batch_size
SGMatrix< float64_t > m_activations
SGMatrix< float64_t > m_input_gradients
SGMatrix< float64_t > m_local_gradients
SGMatrix< bool > m_dropout_mask

## Constructor & Destructor Documentation

 CNeuralLinearLayer ( )

default constructor

Definition at line 44 of file NeuralLinearLayer.cpp.

 CNeuralLinearLayer ( int32_t num_neurons )

Constructor

Parameters
 num_neurons Number of neurons in this layer

Definition at line 48 of file NeuralLinearLayer.cpp.

 virtual ~CNeuralLinearLayer ( )
virtual

Definition at line 75 of file NeuralLinearLayer.h.

## Member Function Documentation

 void build_gradient_parameter_dictionary ( CMap< TParameter *, CSGObject * > * dict )
inherited

Builds a dictionary of all parameters in SGObject as well as those of SGObjects that are parameters of this object. The dictionary maps parameters to the objects that own them.

Parameters
 dict dictionary of parameters to be built.

Definition at line 1185 of file SGObject.cpp.

 CSGObject * clone ( )
virtualinherited

Creates a clone of the current object. This is done via recursively traversing all parameters, which corresponds to a deep copy. Calling equals on the cloned object always returns true although none of the memory of both objects overlaps.

Returns
an identical copy of the given object, which is disjoint in memory. NULL if the clone fails. Note that the returned object is SG_REF'ed

Definition at line 1302 of file SGObject.cpp.

 void compute_activations ( SGVector< float64_t > parameters, CDynamicObjectArray * layers )
virtual

Computes the activations of the neurons in this layer, results should be stored in m_activations. To be used only with non-input layers

Parameters
 parameters Vector of size get_num_parameters(), contains the parameters of the layer
 layers Array of layers that form the network that this layer is being used with

Reimplemented from CNeuralLayer.

Reimplemented in CNeuralLogisticLayer, CNeuralRectifiedLinearLayer, and CNeuralSoftmaxLayer.

Definition at line 77 of file NeuralLinearLayer.cpp.

 virtual void compute_activations ( SGMatrix< float64_t > inputs )
virtualinherited

Computes the activations of the neurons in this layer, results should be stored in m_activations. To be used only with input layers

Parameters
 inputs activations of the neurons in the previous layer, matrix of size previous_layer_num_neurons * batch_size

Reimplemented in CNeuralInputLayer.

Definition at line 153 of file NeuralLayer.h.

 float64_t compute_contraction_term ( SGVector< float64_t > parameters )
virtual

Computes

$$\frac{\lambda}{N} \sum_{k=0}^{N-1} \left \| J(x_k) \right \|^2_F$$

where $\left \| J(x_k) \right \|^2_F$ is the Frobenius norm of the Jacobian of the activations of the hidden layer with respect to its inputs, $N$ is the batch size, and $\lambda$ is the contraction coefficient.

Should be implemented by layers that support being used as a hidden layer in a contractive autoencoder.

Parameters
 parameters Vector of size get_num_parameters(), contains the parameters of the layer

Reimplemented from CNeuralLayer.

Reimplemented in CNeuralLogisticLayer, and CNeuralRectifiedLinearLayer.

Definition at line 295 of file NeuralLinearLayer.cpp.
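For a layer with an identity activation, the Jacobian of the activations with respect to the inputs is simply the weight matrix $W$ for every example, so the batch average above collapses to $\lambda \left \| W \right \|^2_F$. A sketch under that assumption (hypothetical helper, not Shogun's implementation):

```cpp
#include <vector>

// Sketch: contraction term for a linear (identity-activation) layer.
// Since J(x_k) = W for every example, (lambda/N) * sum_k ||J(x_k)||_F^2
// reduces to lambda * ||W||_F^2, i.e. lambda times the sum of squared weights.
double contraction_term_linear(const std::vector<double>& W, double lambda)
{
    double frobenius_sq = 0.0;
    for (double w : W)
        frobenius_sq += w * w;
    return lambda * frobenius_sq;
}
```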

 void compute_contraction_term_gradients ( SGVector< float64_t > parameters, SGVector< float64_t > gradients )
virtual

Adds the gradients of

$$\frac{\lambda}{N} \sum_{k=0}^{N-1} \left \| J(x_k) \right \|^2_F$$

to the gradients vector, where $\left \| J(x_k) \right \|^2_F$ is the Frobenius norm of the Jacobian of the activations of the hidden layer with respect to its inputs, $N$ is the batch size, and $\lambda$ is the contraction coefficient.

Should be implemented by layers that support being used as a hidden layer in a contractive autoencoder.

Parameters
 parameters Vector of size get_num_parameters(), contains the parameters of the layer
 gradients Vector of size get_num_parameters(). Gradients of the contraction term will be added to it

Reimplemented in CNeuralLogisticLayer, and CNeuralRectifiedLinearLayer.

Definition at line 304 of file NeuralLinearLayer.cpp.

 float64_t compute_error ( SGMatrix< float64_t > targets )
virtual

Computes the error between the layer's current activations and the given target activations. Should only be used with output layers

Parameters
 targets desired values for the layer's activations, matrix of size num_neurons*batch_size

Reimplemented from CNeuralLayer.

Reimplemented in CNeuralSoftmaxLayer.

Definition at line 260 of file NeuralLinearLayer.cpp.
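The squared error measure can be sketched as follows. The function name and the exact scaling constant are assumptions for illustration (the conventional 1/2 factor makes the error gradient equal to (activation - target) / batch_size); Shogun's source may scale differently:

```cpp
#include <cstddef>
#include <vector>

// Sketch: squared error between activations and targets over a batch,
// with an assumed 1/(2*batch_size) scaling.
double squared_error(const std::vector<double>& activations,
                     const std::vector<double>& targets,
                     std::size_t batch_size)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < activations.size(); i++)
    {
        double d = activations[i] - targets[i];
        sum += d * d;
    }
    return sum / (2.0 * batch_size);
}
```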

 void compute_gradients ( SGVector< float64_t > parameters, SGMatrix< float64_t > targets, CDynamicObjectArray * layers, SGVector< float64_t > parameter_gradients )
virtual

Computes the gradients that are relevant to this layer:

• The gradients of the error with respect to the layer's parameters
• The gradients of the error with respect to the layer's inputs

Deriving classes should make sure to account for dropout [Hinton, 2012] during gradient computations

Parameters
 parameters Vector of size get_num_parameters(), contains the parameters of the layer
 targets a matrix of size num_neurons*batch_size. If the layer is being used as an output layer, targets is the desired values for the layer's activations, otherwise it's an empty matrix
 layers Array of layers that form the network that this layer is being used with
 parameter_gradients Vector of size get_num_parameters(). To be filled with gradients of the error with respect to each parameter of the layer

Reimplemented from CNeuralLayer.

Definition at line 135 of file NeuralLinearLayer.cpp.

 void compute_local_gradients ( SGMatrix< float64_t > targets )
virtual

Computes the gradients of the error with respect to this layer's pre-activations. Results are stored in m_local_gradients.

This is used by compute_gradients() and can be overridden to implement layers with different activation functions

Parameters
 targets a matrix of size num_neurons*batch_size. If the layer is being used as an output layer, targets is the desired values for the layer's activations, otherwise it's an empty matrix

Reimplemented in CNeuralLogisticLayer, CNeuralRectifiedLinearLayer, and CNeuralSoftmaxLayer.

Definition at line 242 of file NeuralLinearLayer.cpp.
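With an identity activation and a squared error of the form above, the gradient of the error with respect to each pre-activation is proportional to (activation - target). A sketch with an assumed 1/batch_size scaling (hypothetical helper, not Shogun code):

```cpp
#include <cstddef>
#include <vector>

// Sketch: local gradients of the squared error for an identity-activation
// layer: d(error)/d(pre-activation) = (activation - target) / batch_size.
std::vector<double> local_gradients(const std::vector<double>& activations,
                                    const std::vector<double>& targets,
                                    std::size_t batch_size)
{
    std::vector<double> g(activations.size());
    for (std::size_t i = 0; i < g.size(); i++)
        g[i] = (activations[i] - targets[i]) / batch_size;
    return g;
}
```

Subclasses with non-identity activations would additionally multiply by the derivative of the activation function.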

 CSGObject * deep_copy ( ) const
virtualinherited

A deep copy. All the instance variables will also be copied.

Definition at line 146 of file SGObject.cpp.

 void dropout_activations ( )
virtualinherited

Applies dropout [Hinton, 2012] to the activations of the layer

If is_training is true, fills m_dropout_mask with random values (according to dropout_prop) and multiplies it into the activations, otherwise, multiplies the activations by (1-dropout_prop) to compensate for using dropout during training

Definition at line 90 of file NeuralLayer.cpp.
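The two branches described above can be sketched as follows (hypothetical helper, not Shogun's implementation):

```cpp
#include <random>
#include <vector>

// Sketch of the documented behavior: during training, each activation is
// kept with probability (1 - dropout_prop) and zeroed otherwise; at test
// time, all activations are scaled by (1 - dropout_prop) so their expected
// values match those seen during training.
void apply_dropout(std::vector<double>& activations, double dropout_prop,
                   bool is_training, std::mt19937& rng)
{
    if (is_training)
    {
        std::bernoulli_distribution keep(1.0 - dropout_prop);
        for (double& a : activations)
            if (!keep(rng))
                a = 0.0;
    }
    else
    {
        for (double& a : activations)
            a *= 1.0 - dropout_prop;
    }
}
```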

 void enforce_max_norm ( SGVector< float64_t > parameters, float64_t max_norm )
virtual

Constrains the weights of each neuron in the layer to have an L2 norm of at most max_norm

Parameters
 parameters pointer to the layer's parameters, array of size get_num_parameters()
 max_norm maximum allowable norm for a neuron's weights

Reimplemented from CNeuralLayer.

Definition at line 271 of file NeuralLinearLayer.cpp.
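The constraint can be sketched as follows, assuming the column-major weight layout described above, so that a neuron's incoming weights form one row of the matrix (hypothetical helper, not Shogun code):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch: rescales each neuron's incoming weight vector (row j of a
// column-major num_neurons x num_inputs matrix) whenever its L2 norm
// exceeds max_norm, so that the norm becomes exactly max_norm.
void enforce_max_norm_sketch(std::vector<double>& weights,
                             std::size_t num_neurons, double max_norm)
{
    std::size_t num_inputs = weights.size() / num_neurons;
    for (std::size_t j = 0; j < num_neurons; j++)
    {
        double norm_sq = 0.0;
        for (std::size_t i = 0; i < num_inputs; i++)
        {
            double w = weights[i * num_neurons + j];
            norm_sq += w * w;
        }
        double norm = std::sqrt(norm_sq);
        if (norm > max_norm)
            for (std::size_t i = 0; i < num_inputs; i++)
                weights[i * num_neurons + j] *= max_norm / norm;
    }
}
```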

 bool equals ( CSGObject * other, float64_t accuracy = 0.0, bool tolerant = false )
virtualinherited

Recursively compares the current SGObject to another one. Compares all registered numerical parameters, recursion upon complex (SGObject) parameters. Does not compare pointers!

May be overwritten but please do with care! Should not be necessary in most cases.

Parameters
 other object to compare with
 accuracy accuracy to use for comparison (optional)
 tolerant allows lenient check on float equality (within accuracy)
Returns
true if all parameters were equal, false if not

Definition at line 1206 of file SGObject.cpp.

 virtual SGMatrix< float64_t > get_activation_gradients ( )
virtualinherited

Gets the layer's activation gradients, a matrix of size num_neurons * batch_size

Returns
layer's activation gradients

Definition at line 284 of file NeuralLayer.h.

 virtual SGMatrix< float64_t > get_activations ( )
virtualinherited

Gets the layer's activations, a matrix of size num_neurons * batch_size

Returns
layer's activations

Definition at line 277 of file NeuralLayer.h.

 SGIO * get_global_io ( )
inherited

get the io object

Returns
io object

Definition at line 183 of file SGObject.cpp.

 Parallel * get_global_parallel ( )
inherited

get the parallel object

Returns
parallel object

Definition at line 224 of file SGObject.cpp.

 Version * get_global_version ( )
inherited

get the version object

Returns
version object

Definition at line 237 of file SGObject.cpp.

 virtual int32_t get_height ( )
virtualinherited

Returns the height assuming that the layer's activations are interpreted as images (i.e. for convolutional nets)

Returns
Height

Definition at line 265 of file NeuralLayer.h.

 virtual SGMatrix< float64_t > get_local_gradients ( )
virtualinherited

Gets the layer's local gradients, a matrix of size num_neurons * batch_size

Returns
layer's local gradients

Definition at line 294 of file NeuralLayer.h.

 SGStringList< char > get_modelsel_names ( )
inherited
Returns
vector of names of all parameters which are registered for model selection

Definition at line 1077 of file SGObject.cpp.

 char * get_modsel_param_descr ( const char * param_name )
inherited

Returns description of a given parameter string, if it exists. SG_ERROR otherwise

Parameters
 param_name name of the parameter
Returns
description of the parameter

Definition at line 1101 of file SGObject.cpp.

 index_t get_modsel_param_index ( const char * param_name )
inherited

Returns the index of the model selection parameter with the provided name

Parameters
 param_name name of model selection parameter
Returns
index of model selection parameter with provided name, -1 if there is no such

Definition at line 1114 of file SGObject.cpp.

 virtual const char* get_name ( ) const
virtual

Returns the name of the SGSerializable instance. It MUST BE the CLASS NAME without the prefixed 'C'.

Returns
name of the SGSerializable

Reimplemented from CNeuralLayer.

Reimplemented in CNeuralLogisticLayer, CNeuralRectifiedLinearLayer, and CNeuralSoftmaxLayer.

Definition at line 213 of file NeuralLinearLayer.h.

 virtual int32_t get_num_neurons ( )
virtualinherited

Gets the number of neurons in the layer

Returns
number of neurons in the layer

Definition at line 251 of file NeuralLayer.h.

 virtual int32_t get_num_parameters ( )
virtualinherited

Gets the number of parameters used in this layer

Returns
number of parameters used in this layer

Definition at line 271 of file NeuralLayer.h.

 virtual int32_t get_width ( )
virtualinherited

Returns the width assuming that the layer's activations are interpreted as images (i.e. for convolutional nets)

Returns
Width

Definition at line 258 of file NeuralLayer.h.

 void initialize ( CDynamicObjectArray * layers, SGVector< int32_t > input_indices )
virtual

Initializes the layer, computes the number of parameters needed for the layer

Parameters
 layers Array of layers that form the network that this layer is being used with
 input_indices Indices of the layers that are connected to this layer as input

Reimplemented from CNeuralLayer.

Definition at line 53 of file NeuralLinearLayer.cpp.

 void initialize_parameters ( SGVector< float64_t > parameters, SGVector< bool > parameter_regularizable, float64_t sigma )
virtual

Initializes the layer's parameters. The layer should fill the given arrays with the initial value for its parameters

Parameters
 parameters Vector of size get_num_parameters()
 parameter_regularizable Vector of size get_num_parameters(). This controls which of the layer's parameters are subject to regularization, i.e. to turn off regularization for parameter i, set parameter_regularizable[i] = false. This is usually used to turn off regularization for bias parameters.
 sigma standard deviation of the Gaussian used to randomly initialize the parameters

Reimplemented from CNeuralLayer.

Definition at line 63 of file NeuralLinearLayer.cpp.
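The initialization scheme described above can be sketched as follows; the function name and the bias-layout assumption (biases occupy the first num_neurons entries, as in the parameter layout above) are illustrative, not Shogun's code:

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Sketch: fills the packed parameter vector with zero-mean Gaussian noise
// of standard deviation sigma, and marks the first num_neurons entries
// (the biases) as not subject to regularization.
void init_parameters_sketch(std::vector<double>& parameters,
                            std::vector<bool>& regularizable,
                            std::size_t num_neurons, double sigma,
                            std::mt19937& rng)
{
    std::normal_distribution<double> gauss(0.0, sigma);
    for (std::size_t i = 0; i < parameters.size(); i++)
    {
        parameters[i] = gauss(rng);
        regularizable[i] = i >= num_neurons; // biases are not regularized
    }
}
```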

 bool is_generic ( EPrimitiveType * generic ) const
virtualinherited

If the SGSerializable is a class template then TRUE will be returned and GENERIC is set to the type of the generic.

Parameters
 generic set to the type of the generic if returning TRUE
Returns
TRUE if a class template.

Definition at line 243 of file SGObject.cpp.

 virtual bool is_input ( )
virtualinherited

returns true if the layer is an input layer. Input layers are the root layers of a network, that is, they don't receive signals from other layers; they receive signals from the input features to the network.

Local and activation gradients are not computed for input layers

Reimplemented in CNeuralInputLayer.

Definition at line 127 of file NeuralLayer.h.

 DynArray< TParameter * > * load_all_file_parameters ( int32_t file_version, int32_t current_version, CSerializableFile * file, const char * prefix = "" )
inherited

maps all parameters of this instance to the provided file version and loads all parameter data from the file into an array, which is sorted (basically calls load_file_parameter(...) for all parameters and puts all results into a sorted array)

Parameters
 file_version parameter version of the file
 current_version version from which mapping begins (you want to use Version::get_version_parameter() for this in most cases)
 file file to load from
 prefix prefix for members
Returns
(sorted) array of created TParameter instances with file data

Definition at line 648 of file SGObject.cpp.

 DynArray< TParameter * > * load_file_parameters ( const SGParamInfo * param_info, int32_t file_version, CSerializableFile * file, const char * prefix = "" )
inherited

loads some specified parameters from a file with a specified version. The provided parameter info has a version which is recursively mapped until the file parameter version is reached. Note that there may possibly be multiple parameters in the mapping; therefore, a set of TParameter instances is returned

Parameters
 param_info information of parameter
 file_version parameter version of the file, must be <= provided parameter version
 file file to load from
 prefix prefix for members
Returns
new array with TParameter instances with the attached data

Definition at line 489 of file SGObject.cpp.

 bool load_serializable ( CSerializableFile * file, const char * prefix = "", int32_t param_version = Version::get_version_parameter() )
virtualinherited

Load this object from file. If it fails (returning FALSE), this object will contain inconsistent data and should not be used!

Parameters
 file where to load from
 prefix prefix for members
 param_version (optional) a parameter version different to (this is mainly for testing, better do not use)
Returns
TRUE if done, otherwise FALSE

Definition at line 320 of file SGObject.cpp.

 void load_serializable_post ( ) throw (ShogunException)
protectedvirtualinherited

Can (optionally) be overridden to post-initialize some member variables which are not PARAMETER::ADD'ed. Make sure that at first the overridden method BASE_CLASS::LOAD_SERIALIZABLE_POST is called.

Exceptions
 ShogunException Will be thrown if an error occurs.

Definition at line 1004 of file SGObject.cpp.

 void load_serializable_pre ( ) throw (ShogunException)
protectedvirtualinherited

Can (optionally) be overridden to pre-initialize some member variables which are not PARAMETER::ADD'ed. Make sure that at first the overridden method BASE_CLASS::LOAD_SERIALIZABLE_PRE is called.

Exceptions
 ShogunException Will be thrown if an error occurs.

Definition at line 999 of file SGObject.cpp.

 void map_parameters ( DynArray< TParameter * > * param_base, int32_t & base_version, DynArray< const SGParamInfo * > * target_param_infos )
inherited

Takes a set of TParameter instances (base) with a certain version and a set of target parameter infos and recursively maps the base level wise to the current version using CSGObject::migrate(...). The base is replaced. After this call, the base version containing parameters should be of same version/type as the initial target parameter infos. Note for this to work, the migrate methods and all the internal parameter mappings have to match

Parameters
 param_base set of TParameter instances that are mapped to the provided target parameter infos
 base_version version of the parameter base
 target_param_infos set of SGParamInfo instances that specify the target parameter base

Definition at line 686 of file SGObject.cpp.

 TParameter * migrate ( DynArray< TParameter * > * param_base, const SGParamInfo * target )
protectedvirtualinherited

creates a new TParameter instance, which contains migrated data from the version that is provided. The provided parameter data base is used for migration; this base is a collection of all parameter data of the previous version. Migration is done FROM the data in param_base TO the provided param info. Migration is always one version step. The method has to be implemented in subclasses; if no match is found, the base method has to be called.

If there is an element in the param_base which equals the target, a copy of the element is returned. This represents the case when nothing has changed and therefore, the migrate method is not overloaded in a subclass

Parameters
 param_base set of TParameter instances to use for migration
 target parameter info for the resulting TParameter
Returns
a new TParameter instance with migrated data from the base of the type which is specified by the target parameter

Definition at line 893 of file SGObject.cpp.

 void one_to_one_migration_prepare ( DynArray< TParameter * > * param_base, const SGParamInfo * target, TParameter *& replacement, TParameter *& to_migrate, char * old_name = NULL )
protectedvirtualinherited

This method prepares everything for a one-to-one parameter migration. One to one here means that only ONE element of the parameter base is needed for the migration (the one with the same name as the target). Data is allocated for the target (in the type as provided in the target SGParamInfo), and a corresponding new TParameter instance is written to replacement. The to_migrate pointer points to the single needed TParameter instance needed for migration. If a name change happened, the old name may be specified by old_name. In addition, the m_delete_data flag of to_migrate is set to true. So if you want to migrate data, the only thing to do after this call is converting the data in the m_parameter fields. If unsure how to use - have a look into an example for this. (base_migration_type_conversion.cpp for example)

Parameters
 param_base set of TParameter instances to use for migration
 target parameter info for the resulting TParameter
 replacement (used as output) here the TParameter instance which is returned by migration is created into
 to_migrate the only source that is used for migration
 old_name with this parameter, a name change may be specified

Definition at line 833 of file SGObject.cpp.

 bool parameter_hash_changed ( )
virtualinherited
Returns
whether parameter combination has changed since last update

Definition at line 209 of file SGObject.cpp.

 void print_modsel_params ( )
inherited

prints all parameter registered for model selection and their type

Definition at line 1053 of file SGObject.cpp.

 void print_serializable ( const char * prefix = "" )
virtualinherited

prints registered parameters out

Parameters
 prefix prefix for members

Definition at line 255 of file SGObject.cpp.

 bool save_serializable ( CSerializableFile * file, const char * prefix = "", int32_t param_version = Version::get_version_parameter() )
virtualinherited

Save this object to file.

Parameters
 file where to save the object; will be closed during returning if PREFIX is an empty string.
 prefix prefix for members
 param_version (optional) a parameter version different to (this is mainly for testing, better do not use)
Returns
TRUE if done, otherwise FALSE

Definition at line 261 of file SGObject.cpp.

 void save_serializable_post ( ) throw (ShogunException)
protectedvirtualinherited

Can (optionally) be overridden to post-initialize some member variables which are not PARAMETER::ADD'ed. Make sure that at first the overridden method BASE_CLASS::SAVE_SERIALIZABLE_POST is called.

Exceptions
 ShogunException Will be thrown if an error occurs.

Reimplemented in CKernel.

Definition at line 1014 of file SGObject.cpp.

 void save_serializable_pre ( ) throw (ShogunException)
protectedvirtualinherited

Can (optionally) be overridden to pre-initialize some member variables which are not PARAMETER::ADD'ed. Make sure that at first the overridden method BASE_CLASS::SAVE_SERIALIZABLE_PRE is called.

Exceptions
 ShogunException Will be thrown if an error occurs.

Definition at line 1009 of file SGObject.cpp.

 void set_batch_size ( int32_t batch_size )
virtualinherited

Sets the batch_size and allocates memory for m_activations and m_input_gradients accordingly. Must be called before forward or backward propagation is performed

Parameters
 batch_size number of training/test cases the network is currently working with

Reimplemented in CNeuralConvolutionalLayer.

Definition at line 75 of file NeuralLayer.cpp.

 void set_generic< complex128_t > ( )
inherited

set generic type to T

Definition at line 38 of file SGObject.cpp.

 void set_global_io ( SGIO * io )
inherited

set the io object

Parameters
 io io object to use

Definition at line 176 of file SGObject.cpp.

 void set_global_parallel ( Parallel * parallel )
inherited

set the parallel object

Parameters
 parallel parallel object to use

Definition at line 189 of file SGObject.cpp.

 void set_global_version ( Version * version )
inherited

set the version object

Parameters
 version version object to use

Definition at line 230 of file SGObject.cpp.

 CSGObject * shallow_copy ( ) const
virtualinherited

A shallow copy. All the SGObject instance variables will be simply assigned and SG_REF-ed.

Reimplemented in CGaussianKernel.

Definition at line 140 of file SGObject.cpp.

 void unset_generic ( )
inherited

unset generic type

this has to be called in classes specializing a template class

Definition at line 250 of file SGObject.cpp.

 void update_parameter_hash ( )
virtualinherited

Updates the hash of current parameter combination

Definition at line 196 of file SGObject.cpp.

## Member Data Documentation

 ENLAutoencoderPosition autoencoder_position
inherited

For autoencoders, specifies the position of the layer in the autoencoder, i.e. an encoding layer or a decoding layer. Default value is NLAP_NONE

Definition at line 327 of file NeuralLayer.h.

 float64_t contraction_coefficient
inherited

For hidden layers in a contractive autoencoders [Rifai, 2011] a term:

$$\frac{\lambda}{N} \sum_{k=0}^{N-1} \left \| J(x_k) \right \|^2_F$$

is added to the error, where $\left \| J(x_k) \right \|^2_F$ is the Frobenius norm of the Jacobian of the activations of the hidden layer with respect to its inputs, $N$ is the batch size, and $\lambda$ is the contraction coefficient.

Default value is 0.0.

Definition at line 322 of file NeuralLayer.h.

 float64_t dropout_prop
inherited

probability of dropping out a neuron in the layer

Definition at line 311 of file NeuralLayer.h.

 SGIO* io
inherited

io

Definition at line 461 of file SGObject.h.

 bool is_training
inherited

Should be true if the layer is currently being used during training. Initial value is false

Definition at line 308 of file NeuralLayer.h.

 SGMatrix< float64_t > m_input_gradients
protectedinherited

gradients of the error with respect to the layer's inputs, size previous_layer_num_neurons * batch_size

Definition at line 365 of file NeuralLayer.h.

 SGMatrix< float64_t > m_activations
protectedinherited

activations of the neurons in this layer, size num_neurons * batch_size

Definition at line 360 of file NeuralLayer.h.

 int32_t m_batch_size
protectedinherited

number of training/test cases the network is currently working with

Definition at line 355 of file NeuralLayer.h.

 SGMatrix< bool > m_dropout_mask
protectedinherited

binary mask that determines whether a neuron will be kept or dropped out during the current iteration of training, size num_neurons * batch_size

Definition at line 377 of file NeuralLayer.h.

 Parameter* m_gradient_parameters
inherited

parameters wrt which we can compute gradients

Definition at line 476 of file SGObject.h.

 uint32_t m_hash
inherited

Hash of parameter values

Definition at line 482 of file SGObject.h.

 int32_t m_height
protectedinherited

Height of the image (if the layer's activations are to be interpreted as images). Default value is 1

Definition at line 341 of file NeuralLayer.h.

 SGVector< int32_t > m_input_indices
protectedinherited

Indices of the layers that are connected to this layer as input

Definition at line 347 of file NeuralLayer.h.

 SGVector< int32_t > m_input_sizes
protectedinherited

Number of neurons in the layers that are connected to this layer as input

Definition at line 352 of file NeuralLayer.h.

protectedinherited

gradients of the error with respect to the layer's pre-activations; this is usually used as a buffer when computing the input gradients, size num_neurons * batch_size

Definition at line 371 of file NeuralLayer.h.

 Parameter* m_model_selection_parameters
inherited

model selection parameters

Definition at line 473 of file SGObject.h.

 int32_t m_num_neurons
protectedinherited

Number of neurons in this layer

Definition at line 331 of file NeuralLayer.h.

 int32_t m_num_parameters
protectedinherited

Number of parameters in this layer

Definition at line 344 of file NeuralLayer.h.

 ParameterMap* m_parameter_map
inherited

map for different parameter versions

Definition at line 479 of file SGObject.h.

 Parameter* m_parameters
inherited

parameters

Definition at line 470 of file SGObject.h.

 int32_t m_width
protectedinherited

Width of the image (if the layer's activations are to be interpreted as images). Default value is m_num_neurons

Definition at line 336 of file NeuralLayer.h.

 Parallel* parallel
inherited

parallel

Definition at line 464 of file SGObject.h.

 Version* version
inherited

version

Definition at line 467 of file SGObject.h.

The documentation for this class was generated from the following files:

NeuralLinearLayer.h
NeuralLinearLayer.cpp

SHOGUN Machine Learning Toolbox - Documentation