Class List

Here are the classes, structs, unions and interfaces with brief descriptions:
CAccuracyMeasureClass AccuracyMeasure used to measure accuracy of 2-class classifier
CAlphabetThe class Alphabet implements an alphabet and alphabet utility functions
CANOVAKernelANOVA (ANalysis Of VAriances) kernel
CArray< T >Template class Array implements a dense one dimensional array
CArray2< T >Template class Array2 implements a dense two dimensional array
CArray3< T >Template class Array3 implements a dense three dimensional array
CAsciiFileAn ASCII file access class
CAttenuatedEuclidianDistanceClass AttenuatedEuclidianDistance
CAttributeFeaturesImplements attributed features, that is in the simplest case a number of (attribute, value) pairs
CAUCKernelThe AUC kernel can be used to maximize the area under the receiver operator characteristic curve (AUC) instead of margin in SVM training
CAveragedPerceptronClass AveragedPerceptron implements the standard linear (online) perceptron algorithm with averaging; the averaged perceptron is a simple extension of the Perceptron
CAvgDiagKernelNormalizerNormalize the kernel by either a constant or the average value of the diagonal elements (depending on argument c of the constructor)
CBALMeasureClass BALMeasure used to measure balanced error of 2-class classifier
CBesselKernelClass Bessel kernel
CBinaryClassEvaluationThe class BinaryClassEvaluation is a base class used to evaluate 2-class classification
CBinaryFileA Binary file access class
CBinaryStream< T >Memory mapped emulation via binary streams (files)
CBitStringString class embedding a string in a compact bit representation
CBrayCurtisDistanceClass Bray-Curtis distance
CCache< T >Template class Cache implements a simple cache
CCanberraMetricClass CanberraMetric
CCanberraWordDistanceClass CanberraWordDistance
CCauchyKernelCauchy kernel
CChebyshewMetricClass ChebyshewMetric
CChi2KernelThe Chi2 kernel operating on real-valued vectors computes the chi-squared distance between sets of histograms (a formula sketch appears after this class list)
CChiSquareDistanceClass ChiSquareDistance
CCircularKernelCircular kernel
CCombinedDotFeaturesFeatures that allow stacking of a number of DotFeatures
CCombinedFeaturesThe class CombinedFeatures is used to combine a number of feature objects into a single CombinedFeatures object
CCombinedKernelThe Combined kernel is used to combine a number of kernels into a single CombinedKernel object by linear combination
CCommUlongStringKernelThe CommUlongString kernel may be used to compute the spectrum kernel from strings that have been mapped into unsigned 64bit integers
CCommWordStringKernelThe CommWordString kernel may be used to compute the spectrum kernel from strings that have been mapped into unsigned 16bit integers
CCompressorCompression library for compressing and decompressing buffers using one of the standard compression algorithms, LZO, GZIP, BZIP2 or LZMA
CConstKernelThe Constant Kernel returns a constant for all elements
CContingencyTableEvaluationThe class ContingencyTableEvaluation is a base class used to evaluate 2-class classification with TP, FP, TN, FN rates
CCosineDistanceClass CosineDistance
CCplexClass CCplex to encapsulate access to the commercial cplex general purpose optimizer
CCPLEXSVMCplexSVM, an SVM solver implementation based on CPLEX (unfinished)
CCrossCorrelationMeasureClass CrossCorrelationMeasure used to measure cross correlation coefficient of 2-class classifier
CCrossValidationBase class for cross-validation evaluation. Given a learning machine, a splitting strategy, an evaluation criterion, features and corresponding labels, this provides an interface for cross-validation. Results may be retrieved using the evaluate method. A number of repetitions may be specified for obtaining more accurate results. The arithmetic mean of different runs is returned along with confidence intervals, if a p-value is specified. The default number of runs is one, and confidence interval computation is disabled (a generic k-fold sketch appears after this class list)
CCustomDistanceThe Custom Distance allows for custom user provided distance matrices
CCustomKernelThe Custom Kernel allows for custom user provided kernel matrices
CDecompressString< ST >Preprocessor that decompresses compressed strings
CDiagKernelThe Diagonal Kernel returns a constant for the diagonal and zero otherwise
CDiceKernelNormalizerDiceKernelNormalizer performs kernel normalization inspired by the Dice coefficient (see http://en.wikipedia.org/wiki/Dice's_coefficient)
CDimensionReductionPreprocessorClass DimensionReductionPreprocessor, a base class for preprocessors used to lower the dimensionality of given simple features (dense matrices)
CDistanceClass Distance, a base class for all the distances used in the Shogun toolbox
CDistanceKernelThe Distance kernel takes a distance as input
CDistanceMachineA generic DistanceMachine interface
CDistantSegmentsKernelThe distant segments kernel is a string kernel, which counts the number of substrings, so-called segments, at a certain distance from each other
CDistributionBase class Distribution from which all methods implementing a distribution are derived
CDomainAdaptationSVMClass DomainAdaptationSVM
CDomainAdaptationSVMLinearClass DomainAdaptationSVMLinear
CDotFeaturesFeatures that support dot products among other operations
CDotKernelTemplate class DotKernel is the base class for kernels working on DotFeatures
CDummyFeaturesThe class DummyFeatures implements features that only know the number of feature objects (but don't actually contain any)
CDynamicArray< T >Template Dynamic array class that creates an array that can be used like a list or an array
CDynamicObjectArray< T >Template Dynamic array class that creates an array that can be used like a list or an array
CDynInt< T, sz >Integer type of dynamic size
CDynProgDynamic Programming Class
CErrorRateMeasureClass ErrorRateMeasure used to measure error rate of 2-class classifier
CEuclidianDistanceClass EuclidianDistance
CEvaluationThe class Evaluation is a base class for other classes used to evaluate labels, e.g. accuracy of classification or mean squared error of regression
CExplicitSpecFeaturesFeatures that compute the Spectrum Kernel feature space explicitly
CExponentialKernelThe Exponential Kernel, closely related to the Gaussian Kernel computed on CDotFeatures
CF1MeasureClass F1Measure used to measure F1 score of 2-class classifier
CFeaturesThe class Features is the base class of all feature objects
CFileA File access base class
CFirstElementKernelNormalizerNormalize the kernel by a constant obtained from the first element of the kernel matrix, i.e. $ c=k({\bf x},{\bf x})$
CFixedDegreeStringKernelThe FixedDegree String kernel takes as input two strings of same size and counts the number of matches of length d
CFKFeaturesThe class FKFeatures implements Fisher kernel features obtained from two Hidden Markov models
CGaussianGaussian distribution interface
CGaussianKernelThe well known Gaussian kernel (swiss army knife for SVMs) computed on CDotFeatures
CGaussianMatchStringKernelThe class GaussianMatchStringKernel computes a variant of the Gaussian kernel on strings of same length
CGaussianNaiveBayesClass GaussianNaiveBayes, a Gaussian Naive Bayes classifier
CGaussianShiftKernelAn experimental kernel inspired by the WeightedDegreePositionStringKernel and the Gaussian kernel
CGaussianShortRealKernelThe well known Gaussian kernel (swiss army knife for SVMs) on dense short-real valued features
CGCArray< T >Template class GCArray implements a garbage collecting static array
CGeodesicMetricClass GeodesicMetric
CGHMMClass GHMM - this class is non-functional and was meant to implement a Generalized Hidden Markov Model (aka semi-Markov HMM)
CGMMGaussian Mixture Model interface
CGMNPLibClass GMNPLib Library of solvers for Generalized Minimal Norm Problem (GMNP)
CGMNPSVMClass GMNPSVM implements a one vs. rest MultiClass SVM
CGNPPLibClass GNPPLib, a Library of solvers for Generalized Nearest Point Problem (GNPP)
CGNPPSVMClass GNPPSVM
CGPBTSVMClass GPBTSVM
CGridSearchModelSelectionModel selection class which searches for the best model by a grid-search. See CModelSelection for details
CGUIClassifierUI classifier
CGUIDistanceUI distance
CGUIFeaturesUI features
CGUIHMMUI HMM (Hidden Markov Model)
CGUIKernelUI kernel
CGUILabelsUI labels
CGUIMathUI math
CGUIPluginEstimateUI estimate
CGUIPreprocessorUI preprocessor
CGUIStructureUI structure
CGUITimeUI time
CHammingWordDistanceClass HammingWordDistance
CHashCollection of Hashing Functions
CHashedWDFeaturesFeatures that compute the Weighted Degree Kernel feature space explicitly
CHashedWDFeaturesTransposedFeatures that compute the Weighted Degree Kernel feature space explicitly
CHashSetClass HashSet, a set based on a hash table. See http://en.wikipedia.org/wiki/Hash_table
CHessianLocallyLinearEmbeddingClass HessianLocallyLinearEmbedding used to preprocess data using Hessian Locally Linear Embedding algorithm described in
CHierarchicalAgglomerative hierarchical single linkage clustering
CHingeLossCHingeLoss implements the hinge loss function
CHistogramClass Histogram computes a histogram over all 16bit unsigned integers in the features
CHistogramIntersectionKernelThe HistogramIntersection kernel operating on realvalued vectors computes the histogram intersection distance between sets of histograms. Note: the current implementation assumes positive values for the histograms, and input vectors should sum to 1
CHistogramWordStringKernelThe HistogramWordString computes the TOP kernel on inhomogeneous Markov Chains
CHMMHidden Markov Model
CIdentityKernelNormalizerIdentity Kernel Normalization, i.e. no normalization is applied
CImplicitWeightedSpecFeaturesFeatures that compute the Weighted Spectrum Kernel feature space explicitly
CIndirectObject< T, P >Array class that accesses elements indirectly via an index array
CInputParser< T >Class CInputParser is a templated class used to maintain the reading/parsing/providing of examples
CIntronListClass IntronList
CInverseMultiQuadricKernelInverseMultiQuadricKernel
CIOBufferAn I/O buffer class
CIsomapClass Isomap used to preprocess data using K-Isomap algorithm as described in
CJensenMetricClass JensenMetric
CKernelThe Kernel base class
CKernelDistanceThe Kernel distance takes a distance as input
CKernelLocallyLinearEmbeddingClass KernelLocallyLinearEmbedding used to preprocess data using kernel extension of Locally Linear Embedding algorithm as described in
CKernelMachineA generic KernelMachine interface
CKernelNormalizerThe class Kernel Normalizer defines a function to post-process kernel values
CKernelPCAPreprocessor KernelPCA performs kernel principal component analysis
CKMeansKMeans clustering, partitions the data into k (a-priori specified) clusters
CKNNClass KNN, an implementation of the standard k-nearest neighbor classifier
CKRRClass KRR implements Kernel Ridge Regression - a regularized least square method for classification and regression
CLabelsThe class Labels models labels, i.e. class assignments of objects
CLaplacianEigenmapsClass LaplacianEigenmaps used to preprocess data using Laplacian Eigenmaps algorithm as described in:
CLaRankLaRank multiclass SVM machine
CLBPPyrDotFeaturesImplements DotFeatures based on Local Binary Pattern (LBP) pyramids
CLDAClass LDA implements regularized Linear Discriminant Analysis
CLibLinearClass to implement LibLinear
CLibSVMLibSVM
CLibSVMMultiClassClass LibSVMMultiClass
CLibSVMOneClassClass LibSVMOneClass
CLibSVRClass LibSVR, performs support vector regression using LibSVM
CLinearHMMThe class LinearHMM is for learning Higher Order Markov chains
CLinearKernelComputes the standard linear kernel on CDotFeatures
CLinearMachineClass LinearMachine is a generic interface for all kinds of linear machines like classifiers
CLinearStringKernelComputes the standard linear kernel on dense char valued features
CListClass List implements a doubly connected list for low-level-objects
CListElementClass ListElement, defines what an element of the list looks like
CLocalAlignmentStringKernelThe LocalAlignmentString kernel compares two sequences through all possible local alignments between the two sequences
CLocalityImprovedStringKernelThe LocalityImprovedString kernel is inspired by the polynomial kernel. By comparing neighboring characters it puts emphasis on local features
CLocallyLinearEmbeddingClass LocallyLinearEmbedding used to preprocess data using Locally Linear Embedding algorithm described in
CLocalTangentSpaceAlignmentClass LocalTangentSpaceAlignment used to preprocess data using Local Tangent Space Alignment (LTSA) algorithm as described in:
CLogKernelLog kernel
CLogLossCLogLoss implements the logarithmic loss function
CLogLossMarginClass CLogLossMargin implements a margin-based log-likelihood loss function
CLogPlusOnePreprocessor LogPlusOne does what the name says: it adds one to a dense real-valued vector and takes the logarithm of each component
CLossClass which collects generic mathematical functions
CLossFunctionClass CLossFunction is the base class of all loss functions
CLPBoostClass LPBoost trains a linear classifier called Linear Programming Machine, i.e. an SVM using an $\ell_1$ norm regularizer
CLPMClass LPM trains a linear classifier called Linear Programming Machine, i.e. an SVM using an $\ell_1$ norm regularizer
CMachineA generic learning machine interface
CManhattanMetricClass ManhattanMetric
CManhattanWordDistanceClass ManhattanWordDistance
CMatchWordStringKernelThe class MatchWordStringKernel computes a variant of the polynomial kernel on strings of same length converted to a word alphabet
CMathClass which collects generic mathematical functions
CMeanSquaredErrorClass MeanSquaredError used to compute error of regression model
CMemoryMappedFile< T >Memory mapped file
CMinkowskiMetricClass MinkowskiMetric
CMKLMultiple Kernel Learning
CMKLClassificationMultiple Kernel Learning for two-class-classification
CMKLMultiClassMKLMultiClass is a class for L1-norm multiclass MKL
CMKLOneClassMultiple Kernel Learning for one-class-classification
CMKLRegressionMultiple Kernel Learning for regression
CModelSelectionAbstract base class for model selection. Takes a parameter tree which specifies parameters for model selection, and a cross-validation instance and searches for the best combination of parameters in the abstract method select_model(), which has to be implemented in concrete sub-classes
CModelSelectionParametersClass to select parameters and their ranges for model selection. The structure is organized as a tree with different kinds of nodes, depending on the values of its member variables of name and CSGObject
CMPDSVMClass MPDSVM
CMulticlassAccuracyThe class MulticlassAccuracy used to compute accuracy of multiclass classification
CMultiClassSVMClass MultiClassSVM
CMultidimensionalScalingClass MultidimensionalScaling is used to perform multidimensional scaling (capable of landmark approximation if requested)
CMultiquadricKernelMultiquadricKernel
CMultitaskKernelMaskNormalizerThe MultitaskKernel allows Multitask Learning via a modified kernel function
CMultitaskKernelMaskPairNormalizerThe MultitaskKernel allows Multitask Learning via a modified kernel function
CMultitaskKernelMklNormalizerBase-class for parameterized Kernel Normalizers
CMultitaskKernelNormalizerThe MultitaskKernel allows Multitask Learning via a modified kernel function
CMultitaskKernelPlifNormalizerThe MultitaskKernel allows learning a piece-wise linear function (PLIF) via MKL
CMultitaskKernelTreeNormalizerThe MultitaskKernel allows Multitask Learning via a modified kernel function based on taxonomy
CNodeA CNode is an element of a CTaxonomy, which is used to describe hierarchical structure between tasks
CNormOnePreprocessor NormOne, normalizes vectors to have norm 1
COligoStringKernelThis class offers access to the Oligo Kernel introduced by Meinicke et al. in 2004
COnlineLibLinearClass implementing a purely online version of LibLinear, using the L2R_L1LOSS_SVC_DUAL solver only
COnlineLinearMachineClass OnlineLinearMachine is a generic interface for linear machines like classifiers which work through online algorithms
COnlineSVMSGDClass OnlineSVMSGD
CParameterCombinationClass that holds ONE combination of parameters for a learning machine. The structure is organized as a tree. Every node may hold a name or an instance of a Parameter class. Nodes may have children. The nodes are organized in such way, that every parameter of a model for model selection has one node and sub-parameters are stored in sub-nodes. Using a tree of this class, parameters of models may easily be set. There are these types of nodes:
CParseBuffer< T >Class CParseBuffer implements a ring of examples of a defined size. The ring stores objects of the Example type
CPCAPreprocessor PCACut performs principal component analysis on the input vectors and keeps only the n eigenvectors with eigenvalues above a certain threshold
CPerceptronClass Perceptron implements the standard linear (online) perceptron
CPlifClass Plif
CPlifArrayClass PlifArray
CPlifBaseClass PlifBase
CPlifMatrixStore plif arrays for all transitions in the model
CPluginEstimateClass PluginEstimate
CPolyFeaturesImplement DotFeatures for the polynomial kernel
CPolyKernelComputes the standard polynomial kernel on CDotFeatures
CPolyMatchStringKernelThe class PolyMatchStringKernel computes a variant of the polynomial kernel on strings of same length
CPolyMatchWordStringKernelThe class PolyMatchWordStringKernel computes a variant of the polynomial kernel on word-features
CPositionalPWMPositional PWM
CPowerKernelPower kernel
CPRCEvaluationThe class PRCEvaluation is used to evaluate the PRC (Precision Recall Curve) of a binary classifier. This class is also capable of calculating the auPRC (area under the PRC)
CPrecisionMeasureClass PrecisionMeasure used to measure precision of 2-class classifier
CPreprocessorClass Preprocessor defines a preprocessor interface
CPruneVarSubMeanPreprocessor PruneVarSubMean will subtract the mean and remove features that have zero variance
CPyramidChi2Pyramid Kernel over Chi2 matched histograms
CQPBSVMLibClass QPBSVMLib
CRandomFourierGaussPreprocPreprocessor CRandomFourierGaussPreproc implements Random Fourier Features for the Gaussian kernel, following Rahimi and Recht (NIPS 2007); after preprocessing, using the resulting features in a linear kernel approximates a Gaussian kernel (a generic sketch appears after this class list)
CRationalQuadraticKernelRational Quadratic kernel
CRealDistanceClass RealDistance
CRealFileFeaturesThe class RealFileFeatures implements a dense double-precision floating point matrix from a file
CRecallMeasureClass RecallMeasure used to measure recall of 2-class classifier
CRegulatoryModulesStringKernelThe Regulatory Modules kernel, based on the WD kernel, as published in Schultheiss et al., Bioinformatics (2009) on regulatory sequences
CRidgeKernelNormalizerNormalize the kernel by adding a constant term to its diagonal. This helps kernels become positive definite even when they are not (a condition often caused by numerical problems)
CROCEvaluationThe class ROCEvaluation is used to evaluate the ROC (Receiver Operator Characteristic) curve of a binary classifier. This class is also capable of calculating the auROC (area under the ROC curve)
CrossValidationResultType to encapsulate the results of an evaluation run. May contain a confidence interval (if conf_int_alpha!=0). m_conf_int_alpha is the error probability, i.e. the probability that the value does not lie in the confidence interval
CSalzbergWordStringKernelThe SalzbergWordString kernel implements the Salzberg kernel
CScatterKernelNormalizerScatter kernel normalizer
CScatterSVMScatterSVM - Multiclass SVM
CSegmentLossClass SegmentLoss
CSerializableAsciiFileSerializable ASCII file
CSerializableFileSerializable file
CSet< T >Template Set class
CSGDQNClass SGDQN
CSGObjectClass SGObject is the base class of all shogun objects
CSigmoidKernelThe standard Sigmoid kernel computed on dense real valued features
CSignalClass Signal implements signal handling to e.g. allow ctrl+c to cancel a long running process
CSimpleDistance< ST >Template class SimpleDistance
CSimpleFeatures< ST >The class SimpleFeatures implements dense feature matrices
CSimpleFile< T >Template class SimpleFile to read and write from files
CSimpleLocalityImprovedStringKernelSimpleLocalityImprovedString kernel, is a ``simplified'' and better performing version of the Locality improved kernel
CSimplePreprocessor< ST >Template class SimplePreprocessor, base class for preprocessors (cf. CPreprocessor) that apply to CSimpleFeatures (i.e. rectangular dense matrices)
CSmoothHingeLossCSmoothHingeLoss implements the smooth hinge loss function
CSNPFeaturesFeatures that compute the Weighted Degree Kernel feature space explicitly
CSNPStringKernelThe class SNPStringKernel computes a variant of the polynomial kernel on strings of same length
CSortUlongStringPreprocessor SortUlongString, sorts the individual strings in ascending order
CSortWordStringPreprocessor SortWordString, sorts the individual strings in ascending order
CSparseDistance< ST >Template class SparseDistance
CSparseEuclidianDistanceClass SparseEuclidianDistance
CSparseFeatures< ST >Template class SparseFeatures implements sparse matrices
CSparseKernel< ST >Template class SparseKernel, is the base class of kernels working on sparse features
CSparsePolyFeaturesImplement DotFeatures for the polynomial kernel
CSparsePreprocessor< ST >Template class SparsePreprocessor, base class for preprocessors (cf. CPreprocessor) that apply to CSparseFeatures
CSparseSpatialSampleStringKernelSparse Spatial Sample String Kernel by Pavel Kuksa <pkuksa@cs.rutgers.edu> and Vladimir Pavlovic <vladimir@cs.rutgers.edu>
CSpecificityMeasureClass SpecificityMeasure used to measure specificity of 2-class classifier
CSpectrumMismatchRBFKernelSpectrum mismatch rbf kernel
CSpectrumRBFKernelSpectrum rbf kernel
CSphericalKernelSpherical kernel
CSplineKernelComputes the Spline Kernel function, which is a cubic polynomial
CSplittingStrategyAbstract base class for all splitting types. Takes a CLabels instance and generates a desired number of subsets which are being accessed by their indices via the method generate_subset_indices(...)
CSqrtDiagKernelNormalizerSqrtDiagKernelNormalizer divides by the square root of the product of the diagonal elements (a formula sketch appears after this class list)
CSquaredHingeLossClass CSquaredHingeLoss implements a squared hinge loss function
CSquaredLossCSquaredLoss implements the squared loss function
CStatisticsClass that contains certain functions related to statistics, such as the student's t distribution
CStratifiedCrossValidationSplittingImplementation of stratified cross-validation based on CSplittingStrategy. Produces subset index sets of equal size (differing by at most one) in which the label ratio matches (up to at most one difference) the label ratio of the specified labels
CStreamingAsciiFileClass StreamingAsciiFile to read vector-by-vector from ASCII files
CStreamingDotFeaturesStreaming features that support dot products among other operations
CStreamingFeaturesStreaming features are features which are used for online algorithms
CStreamingFileA Streaming File access class
CStreamingFileFromFeaturesClass StreamingFileFromFeatures to read vector-by-vector from a CFeatures object
CStreamingFileFromSimpleFeatures< T >Class CStreamingFileFromSimpleFeatures is a derived class of CStreamingFile which creates an input source for the online framework from a CSimpleFeatures object
CStreamingFileFromSparseFeatures< T >Class CStreamingFileFromSparseFeatures is derived from CStreamingFile and provides an input source for the online framework. It uses an existing CSparseFeatures object to generate online examples
CStreamingFileFromStringFeatures< T >Class CStreamingFileFromStringFeatures is derived from CStreamingFile and provides an input source for the online framework from a CStringFeatures object
CStreamingSimpleFeatures< T >This class implements streaming features with dense feature vectors
CStreamingSparseFeatures< T >This class implements streaming features with sparse feature vectors. The vector is represented as an SGSparseVector<T>. Each entry is of type SGSparseVectorEntry<T> with members `feat_index' and `entry'
CStreamingStringFeatures< T >This class implements streaming features as strings
CStreamingVwCacheFileClass StreamingVwCacheFile to read vector-by-vector from VW cache files
CStreamingVwFeaturesThis class implements streaming features for use with VW
CStreamingVwFileClass StreamingVwFile to read vector-by-vector from Vowpal Wabbit data files. It reads the example and label into one object of VwExample type
CStringDistance< ST >Template class StringDistance
CStringFeatures< ST >Template class StringFeatures implements a list of strings
CStringFileFeatures< ST >File based string features
CStringKernel< ST >Template class StringKernel, is the base class of all String Kernels
CStringPreprocessor< ST >Template class StringPreprocessor, base class for preprocessors (cf. CPreprocessor) that apply to CStringFeatures (i.e. strings of variable length)
CSubGradientLPMClass SubGradientLPM trains a linear classifier called Linear Programming Machine, i.e. an SVM using an $\ell_1$ norm regularizer
CSubGradientSVMClass SubGradientSVM
CSubsetClass for adding subset support to a class. Provides an interface for getting/setting subset_matrices and index conversion. Do not inherit from this class, use it as variable. Write wrappers for all get/set functions
CSVMA generic Support Vector Machine Interface
CSVMLightClass SVMlight
CSVMLightOneClassTrains a one class C SVM
CSVMLinClass SVMLin
CSVMOcasClass SVMOcas
CSVMSGDClass SVMSGD
CSVRLightClass SVRLight, performs support vector regression using SVMLight
CSyntaxHighLightSyntax highlight
CTanimotoDistanceClass TanimotoDistance (Tanimoto coefficient)
CTanimotoKernelNormalizerTanimotoKernelNormalizer performs kernel normalization inspired by the Tanimoto coefficient (see http://en.wikipedia.org/wiki/Jaccard_index )
CTaxonomyCTaxonomy is used to describe hierarchical structure between tasks
CTensorProductPairKernelComputes the Tensor Product Pair Kernel (TPPK)
CTimeClass Time that implements a stopwatch based on either cpu time or wall clock time
CTOPFeaturesThe class TOPFeatures implements TOP kernel features obtained from two Hidden Markov models
CTrie< Trie >Template class Trie implements a suffix trie, i.e. a tree in which all suffixes up to a certain length are stored
CTronClass Tron
CTStudentKernelGeneralized T-Student kernel
CVarianceKernelNormalizerVarianceKernelNormalizer divides by the ``variance''
CVowpalWabbitClass CVowpalWabbit is the implementation of the online learning algorithm used in Vowpal Wabbit
CVwAdaptiveLearnerVwAdaptiveLearner uses an adaptive subgradient technique to update weights
CVwCacheReaderBase class from which all cache readers for VW should be derived
CVwCacheWriterCVwCacheWriter is the base class for all VW cache creating classes
CVwEnvironmentClass CVwEnvironment is the environment used by VW
CVwLearnerBase class for all VW learners
CVwNativeCacheReaderClass CVwNativeCacheReader reads from a cache exactly as that which has been produced by VW's default cache format
CVwNativeCacheWriterClass CVwNativeCacheWriter writes a cache exactly as that which would be produced by VW's default cache format
CVwNonAdaptiveLearnerVwNonAdaptiveLearner uses a standard gradient descent weight update rule
CVwParserCVwParser is the object which provides the functions to parse examples from buffered input
CVwRegressorRegressor used by VW
CWaveKernelWave kernel
CWaveletKernelClass WaveletKernel
CWDFeaturesFeatures that compute the Weighted Degree Kernel feature space explicitly
CWDSVMOcasClass WDSVMOcas
CWeightedCommWordStringKernelThe WeightedCommWordString kernel may be used to compute the weighted spectrum kernel (i.e. a spectrum kernel for 1 to K-mers, where each k-mer length is weighted by some coefficient $\beta_k$) from strings that have been mapped into unsigned 16bit integers (a formula sketch appears after this class list)
CWeightedDegreePositionStringKernelThe Weighted Degree Position String kernel (Weighted Degree kernel with shifts)
CWeightedDegreeRBFKernelWeighted degree RBF kernel
CWeightedDegreeStringKernelThe Weighted Degree String kernel
CWRACCMeasureClass WRACCMeasure used to measure weighted relative accuracy of 2-class classifier
CZeroMeanCenterKernelNormalizerZeroMeanCenterKernelNormalizer centers the kernel in feature space
DynArray< T >Template Dynamic array class that creates an array that can be used like a list or an array
Example< T >Class Example is the container type for the vector+label combination
MKLMultiClassGLPKMKLMultiClassGLPK is a helper class for MKLMultiClass
MKLMultiClassGradientMKLMultiClassGradient is a helper class for MKLMultiClass
MKLMultiClassOptimizationBaseMKLMultiClassOptimizationBase is a helper class for MKLMultiClass
ModelClass Model
ParallelClass Parallel provides helper functions for multithreading
ParameterParameter class
ParameterMapImplements a map of ParameterMapElement instances
ParameterMapElementClass to hold instances of a parameter map. Each element contains a key and a value, which are of type SGParamInfo. May be compared to each other based on their keys
SerializableAsciiReader00Serializable ASCII reader
SGIOClass SGIO, used to do input output operations throughout shogun
SGMatrix< T >Shogun matrix
SGNDArray< T >Shogun n-dimensional array
SGParamInfoClass that holds information about a certain parameter of a CSGObject. Contains name, type, etc. This is used for mapping types that have changed in different versions of shogun. Instances of this class may be compared to each other. Ordering is based on name, equality is based on all attributes
SGSparseMatrix< T >Template class SGSparseMatrix
SGSparseVector< T >Template class SGSparseVector
SGSparseVectorEntry< T >Template class SGSparseVectorEntry
SGString< T >Shogun string
SGStringList< T >Template class SGStringList
SGVector< T >Shogun vector
ShogunExceptionClass ShogunException defines an exception which is thrown whenever an error inside of shogun occurs
SSKFeaturesSSKFeatures
substringStruct Substring, specified by start position and end position
TParameterParameter struct
CSerializableFile::TSerializableReaderSerializable reader
TSGDataTypeDatatypes that shogun supports
v_array< T >Class v_array is a templated class used to store variable length arrays. Memory locations are stored as 'extents', i.e., address of the first memory location and address after the last member
VersionClass Version provides version information
VwExampleExample class for VW
VwFeatureOne feature in VW
VwLabelClass VwLabel holds a label object used by VW
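
A common form of the chi-squared kernel described for CChi2Kernel above (the exact width convention used by the class may differ) is
$ k({\bf x},{\bf x}') = \exp\left(-\frac{1}{w}\sum_i \frac{(x_i - x'_i)^2}{x_i + x'_i}\right), $
where ${\bf x}$ and ${\bf x}'$ are histograms and $w$ is a width parameter.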
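
The CCrossValidation entry above describes the overall procedure (split, train, evaluate, average over folds and repetitions). The following is a minimal, library-independent C++ sketch of a single k-fold run; the callback name train_and_evaluate is hypothetical and does not correspond to Shogun's actual API.

// Minimal, library-independent k-fold cross-validation sketch (C++11).
// "train_and_evaluate" is a hypothetical callback: it stands in for training a
// learning machine on the training indices and returning an evaluation score
// computed on the held-out test indices.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <numeric>
#include <random>
#include <vector>

double cross_validate(
    size_t num_examples, size_t num_folds,
    const std::function<double(const std::vector<size_t>&,
                               const std::vector<size_t>&)>& train_and_evaluate)
{
    // Shuffle the example indices once, then cut them into num_folds subsets.
    std::vector<size_t> idx(num_examples);
    std::iota(idx.begin(), idx.end(), 0);
    std::mt19937 rng(42);
    std::shuffle(idx.begin(), idx.end(), rng);

    double sum = 0.0;
    for (size_t f = 0; f < num_folds; ++f)
    {
        std::vector<size_t> train_idx, test_idx;
        for (size_t i = 0; i < num_examples; ++i)
            ((i % num_folds == f) ? test_idx : train_idx).push_back(idx[i]);
        sum += train_and_evaluate(train_idx, test_idx);
    }
    // Arithmetic mean over the folds, as described for CCrossValidation.
    return sum / static_cast<double>(num_folds);
}

int main()
{
    // Toy usage: the "evaluation" simply reports the test-fold fraction.
    double mean = cross_validate(100, 5,
        [](const std::vector<size_t>& train, const std::vector<size_t>& test)
        { return static_cast<double>(test.size()) / (train.size() + test.size()); });
    std::printf("mean score over folds: %f\n", mean);
    return 0;
}

Repeating this with different shuffles and averaging the fold means corresponds to the repeated-runs behaviour described above.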
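
As a rough illustration of the idea behind CRandomFourierGaussPreproc (a generic sketch under standard assumptions, not Shogun's implementation or parameter conventions): sample random frequencies from a Gaussian and random phases uniformly, so that the dot product of the transformed features approximates the Gaussian kernel.

// Generic Random Fourier Features sketch (after Rahimi & Recht, NIPS 2007);
// this is NOT Shogun's CRandomFourierGaussPreproc API, just the underlying idea.
// z(x)_i = sqrt(2/D) * cos(w_i . x + b_i), with w_i ~ N(0, I/sigma^2) and
// b_i ~ Uniform[0, 2*pi), so that z(x) . z(y) approximates the Gaussian kernel
// exp(-||x - y||^2 / (2 * sigma^2)).
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct RandomFourierFeatures
{
    std::vector<std::vector<double>> w; // D x dim random frequencies
    std::vector<double> b;              // D random phase offsets

    RandomFourierFeatures(size_t dim, size_t D, double sigma, unsigned seed = 0)
        : w(D, std::vector<double>(dim)), b(D)
    {
        const double two_pi = 6.283185307179586;
        std::mt19937 rng(seed);
        std::normal_distribution<double> gauss(0.0, 1.0 / sigma);
        std::uniform_real_distribution<double> unif(0.0, two_pi);
        for (auto& row : w)
            for (auto& v : row)
                v = gauss(rng);
        for (auto& v : b)
            v = unif(rng);
    }

    std::vector<double> transform(const std::vector<double>& x) const
    {
        std::vector<double> z(w.size());
        for (size_t i = 0; i < w.size(); ++i)
        {
            double dot = 0.0;
            for (size_t j = 0; j < x.size(); ++j)
                dot += w[i][j] * x[j];
            z[i] = std::sqrt(2.0 / w.size()) * std::cos(dot + b[i]);
        }
        return z;
    }
};

int main()
{
    RandomFourierFeatures rff(3, 2000, 1.0);
    std::vector<double> x = {0.1, 0.2, 0.3}, y = {0.0, 0.25, 0.35};
    std::vector<double> zx = rff.transform(x), zy = rff.transform(y);

    double approx = 0.0; // linear kernel in the transformed space
    for (size_t i = 0; i < zx.size(); ++i)
        approx += zx[i] * zy[i];

    double dist2 = 0.0; // exact Gaussian kernel for comparison
    for (size_t i = 0; i < x.size(); ++i)
        dist2 += (x[i] - y[i]) * (x[i] - y[i]);

    std::printf("linear-kernel approximation: %f, exact Gaussian: %f\n",
                approx, std::exp(-dist2 / 2.0));
    return 0;
}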
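
Assuming the usual cosine-style normalization, the operation described for CSqrtDiagKernelNormalizer can be written as
$ k'({\bf x},{\bf x}') = \frac{k({\bf x},{\bf x}')}{\sqrt{k({\bf x},{\bf x})\,k({\bf x}',{\bf x}')}}. $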
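
The weighted spectrum kernel described for CWeightedCommWordStringKernel can be sketched in its general form (omitting implementation details) as
$ k({\bf x},{\bf x}') = \sum_{k=1}^{K} \beta_k \sum_{u \in \Sigma^k} N_u({\bf x})\, N_u({\bf x}'), $
where $N_u({\bf x})$ counts the occurrences of the k-mer $u$ in string ${\bf x}$ and $\Sigma$ is the alphabet.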

SHOGUN Machine Learning Toolbox - Documentation