/*! \page staticinterfaces Static Interfaces

As mentioned before, SHOGUN interfaces to several programming languages and
toolkits such as Matlab(tm), R, Python and Octave. The following sections
give you an overview of the static interface commands of SHOGUN. For the
static interfaces we tried to keep the syntax of the commands consistent
across all the different languages. However, in some cases this was not
possible, and we document the subtle differences in syntax and semantics
in the respective toolkit. Instead of reading through all of this, we
suggest having a look at the large number of examples available in the
\b examples directory, e.g. examples/R or examples/python.
<b>Overview of Static Interfaces & Testing the Installation</b>
\li \ref staticoctaveinterf_sec
\li \ref staticpythoninterf_sec
\li \ref staticrinterf_sec

<b>Interface Commands</b>
\li \ref staticiffeatures_sec
\li \ref staticifkernel_sec
\li \ref staticifsvm_sec
\li \ref staticifhmm_sec
\li \ref staticifpoim_sec
\li \ref staticifutil_sec
\li \ref staticifexample_sec

<b>Command Reference</b>
\li \ref staticifcmdref_sec
\section staticifoverview_sec Overview of Static Interfaces & Testing the Installation
\subsection staticoctaveinterf_sec Static Matlab and Octave Interface

Since octave is nowadays up to par with matlab, a single documentation for both
interfaces is sufficient and will be based on octave (matlab can be used
analogously).

To start SHOGUN in octave, start octave and check that it is correctly installed
by typing (let ">" be the octave prompt)

\verbatim
> sg('help')
\endverbatim

inside of octave. This should show you some help text.
\subsection staticpythoninterf_sec Static Python Interface

To start SHOGUN in python, start python and check that it is correctly installed
by typing (let ">" be the python prompt)

\verbatim
> from sg import sg
> sg('help')
\endverbatim

inside of python. This should show you some help text.
\subsection staticrinterf_sec Static R Interface

To fire up SHOGUN in R, make sure that you have SHOGUN correctly installed in
R. You can check this by typing (let ">" be the R prompt):

\verbatim
> library()
\endverbatim

inside of R; this command should list all R packages that have been
installed on your system. You should have an entry like:

\verbatim
sg The SHOGUN Machine Learning Toolbox
\endverbatim

After you have made sure that SHOGUN is installed correctly, you can start it via:

\verbatim
> library(sg)
\endverbatim

and you will see some information about the SHOGUN core (compile options etc.).
After this command, R and SHOGUN are ready to receive your commands.

In general, all commands in SHOGUN are issued using the function sg(...).
To invoke the SHOGUN command help, one types:

\verbatim
> sg('help')
\endverbatim

and a help text appears, giving a short description of all commands.
\section staticifcmds Static Interface Commands

\subsection staticiffeatures_sec Features
These functions transfer data from the interface to shogun and back.
Suppose you have a matlab matrix or R matrix "features" which
contains your training data and you want to register this data. You simply
transfer the features to shogun via one of:
\arg \b set_features \verbatim sg('set_features', 'TRAIN|TEST', features[, DNABINFILE|<ALPHABET>]) \endverbatim
\arg \b add_features \verbatim sg('add_features', 'TRAIN|TEST', features[, DNABINFILE|<ALPHABET>]) \endverbatim
Features can be char/byte/word/int/real valued matrices, real valued sparse
matrices, or strings (lists or cell arrays of strings). When dealing with
strings, an alphabet name has to be specified (DNA, RAW, ...). Use 'TRAIN' to
tell SHOGUN that this is the data you want to train your classifier on and
'TEST' for the data you want to classify.

In contrast to \b set_features, \b add_features will create a combined feature
object and append the features to it. This is useful when dealing with a set of
different features (real valued and strings) and multiple kernels.
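For instance, a real valued and a string feature set can be registered side by side and paired with one kernel each. This is only a sketch: it assumes a combined kernel is set up as in the bundled examples, and the variables realfeats and stringfeats are illustrative placeholders.

\verbatim
sg('clean_features', 'TRAIN')
sg('add_features', 'TRAIN', realfeats)
sg('add_features', 'TRAIN', stringfeats, 'DNA')
sg('set_kernel', 'COMBINED', 100)
sg('add_kernel', 1.0, 'GAUSSIAN', 'REAL', 50, 1.0)
sg('add_kernel', 0.5, 'LINEAR', 'CHAR', 50)
\endverbatim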
In case a single string was set using \b set_features, it can be "multiplexed"
by sliding a window over it using
\arg \b from_position_list \verbatim sg('from_position_list', 'TRAIN|TEST', winsize, shift[, skip]) \endverbatim
or
\arg \b obtain_from_sliding_window \verbatim sg('obtain_from_sliding_window', winsize, skip) \endverbatim
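For example, a single DNA string can be cut into overlapping windows of length 10 (a sketch with an illustrative sequence):

\verbatim
sg('set_features', 'TRAIN', {'ACGTACGTACGTACGTACGT'}, 'DNA')
sg('obtain_from_sliding_window', 10, 1)  % winsize=10, skip=1
\endverbatim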
To delete the features previously assigned in the current SHOGUN session, use
\arg \b clean_features \verbatim sg('clean_features') \endverbatim

To obtain the features from shogun, use
\arg \b get_features \verbatim [features]=sg('get_features', 'TRAIN|TEST') \endverbatim
One proceeds similarly when assigning labels to the training data and obtaining
labels from shogun. The commands

\arg \b set_labels \verbatim sg('set_labels', 'TRAIN', trainlab) \endverbatim
\arg \b get_labels \verbatim [labels]=sg('get_labels', 'TRAIN|TEST') \endverbatim

tell SHOGUN that the labels of the assigned training data reside in trainlab,
respectively return the current labels (note that currently all data is
\b copied into SHOGUN, so modifications to trainlab are local within the
interface and do not affect the copy stored in SHOGUN).
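A minimal label round trip in octave looks like this (the label values are illustrative):

\verbatim
trainlab=[-1, 1, 1, -1];
sg('set_labels', 'TRAIN', trainlab);
labels=sg('get_labels', 'TRAIN');
\endverbatim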
\subsection staticifkernel_sec Kernel & Distances

Kernel and distance matrix specific commands, used to create kernels and
distances, set their parameters and obtain the resulting matrices.

To create a kernel in shogun, use
\arg \b set_kernel \verbatim sg('set_kernel', 'KERNELNAME', 'FEATURETYPE', CACHESIZE, PARAMETERS) \endverbatim
\arg \b add_kernel \verbatim sg('add_kernel', WEIGHT, 'KERNELNAME', 'FEATURETYPE', CACHESIZE, PARAMETERS) \endverbatim

Here KERNELNAME is the name of the kernel one wishes to use, FEATURETYPE the
type of features (e.g. REAL for standard real valued feature vectors), CACHESIZE
the size of the kernel cache in megabytes and PARAMETERS are kernel specific
additional parameters.
\subsubsection staticifsuppkernels_sec Supported Kernels

The following kernels are implemented in SHOGUN:

\li User defined CustomKernel
\li Kernel from Distance
\li Fixed Degree StringKernel
\li Gaussian \f$ k(x,x')=e^{-\frac{||x-x'||^2}{\sigma}} \f$

To work with a Gaussian kernel on real values one issues:
\verbatim sg('set_kernel', 'GAUSSIAN', 'TYPE', CACHESIZE, SIGMA)\endverbatim

For example,
\verbatim sg('set_kernel', 'GAUSSIAN', 'REAL', 40, 1)\endverbatim
creates a Gaussian kernel on real values with a cache size of 40MB and a sigma
value of one. Available types for the Gaussian kernel: REAL, SPARSEREAL.
\li Gaussian Shift Kernel

\li Linear \f$k(x,x')=x\cdot x'\f$

A linear kernel is created via:
\verbatim sg('set_kernel', 'LINEAR', 'TYPE', CACHESIZE)\endverbatim

For example,
\verbatim sg('add_kernel', 1.0, 'LINEAR', 'REAL', 50)\endverbatim
creates a linear kernel with a cache size of 50MB for real data values, with weight 1.0.

Available types for the linear kernel: BYTE, WORD, CHAR, REAL, SPARSEREAL.
\li Local Alignment StringKernel
\li Locality Improved StringKernel
\li Polynomial Kernel \f$k(x,x')=(x\cdot x')^d\f$

A polynomial kernel is created via:
\verbatim sg('set_kernel', 'POLY', 'TYPE', CACHESIZE, DEGREE, INHOMOGENE, NORMALIZE) \endverbatim

For example,
\verbatim sg('add_kernel', 0.1, 'POLY', 'REAL', 50, 3, 0) \endverbatim
adds a polynomial kernel of degree 3 with weight 0.1. Available types for the
polynomial kernel: REAL.
\li Sigmoid Kernel

To work with a sigmoid kernel on real values one issues:
\verbatim sg('set_kernel', 'SIGMOID', 'TYPE', CACHESIZE, GAMMA, COEFF)\endverbatim

For example,
\verbatim sg('set_kernel', 'SIGMOID', 'REAL', 40, 0.1, 0.1) \endverbatim
creates a sigmoid kernel on real values with a cache size of 40MB, a gamma
value of 0.1 and a coefficient of 0.1. Available types for the sigmoid kernel: REAL.
\li Weighted Spectrum Kernel
\li Weighted Degree Kernels

Assign a user defined custom kernel, for which either only the upper triangle is
given (DIAG), the full matrix (FULL), or the full matrix which is then
internally stored as an upper triangle (FULL2DIAG).
\arg \b set_custom_kernel \verbatim sg('set_custom_kernel', kernelmatrix, 'DIAG|FULL|FULL2DIAG') \endverbatim
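For example, a full 3x3 kernel matrix can be registered like this (the values are illustrative; a kernel matrix should be symmetric):

\verbatim
km=[1.0 0.2 0.1; 0.2 1.0 0.3; 0.1 0.3 1.0];
sg('set_custom_kernel', km, 'FULL')
\endverbatim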
The purpose of the get_kernel_matrix and get_distance_matrix commands is to
return the kernel or distance matrix computed on the TRAIN or TEST features:

\arg \b get_distance_matrix \verbatim [D]=sg('get_distance_matrix', 'TRAIN|TEST') \endverbatim
\arg \b get_kernel_matrix \verbatim [K]=sg('get_kernel_matrix', 'TRAIN|TEST') \endverbatim

Here D and K refer to matrix objects.
\subsection staticifsvm_sec SVM
\arg \b new_classifier Creates a new classifier (e.g. an SVM instance).
\arg \b train_classifier Starts the training of the SVM on the assigned features and kernels.

The get_svm command returns some properties of an SVM such as the Lagrange
multipliers alpha, the bias b and the indices of the support vectors SV (zero
based):
\arg \b get_svm \verbatim [bias, alphas]=sg('get_svm') \endverbatim
\arg \b set_classifier \verbatim sg('set_classifier', bias, alphas) \endverbatim

This command returns a list of arguments. \b set_classifier may later be used (after creating an SVM classifier) to set the alphas and bias again.
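For example, the SVM parameters can be saved after training and restored later (a sketch):

\verbatim
[bias, alphas]=sg('get_svm');
% ... later, after creating an SVM classifier of the same kind:
sg('set_classifier', bias, alphas);
\endverbatim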
The result of classifying the test sample is obtained via:
\arg \b classify \verbatim [result]=sg('classify') \endverbatim
\arg \b classify_example \verbatim [result]=sg('classify_example', feature_vector_index) \endverbatim
where result is a vector containing the classification result for each
datapoint, and \b classify_example only obtains the output for a single example
(the index is zero based as in Python; note that Octave, Matlab and R are 1 based).
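For example (a sketch, assuming test features have already been set and a classifier has been trained):

\verbatim
out=sg('classify');              % outputs for all test examples
out0=sg('classify_example', 0);  % output for the first test example (zero based)
\endverbatim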
\subsection staticifhmm_sec HMM

\li hmm_classify_example
\subsection staticifpoim_sec POIM

\li get_SPEC_consensus
\subsection staticifutil_sec Utility
Miscellaneous functions.

Returns the SVN version number:
\arg \b get_version \verbatim sg('get_version') \endverbatim

Gives you a help text:
\arg \b help \verbatim sg('help') \endverbatim
\arg \b help \verbatim sg('help', 'CMD') \endverbatim

Sets the debugging log level - useful to trace errors:
\arg \b loglevel \verbatim sg('loglevel', 'LEVEL') \endverbatim
LEVEL can be one of ALL, DEBUG, WARN, ERROR:
\li ALL: very verbose logging output (useful only for hunting memory leaks).
\li DEBUG: verbose logging output (useful for debugging).
\li WARN: less logging output (useful for error search).
\li ERROR: only logging output on critical errors.

For example,
\verbatim
> sg('loglevel', 'ALL')
\endverbatim
enables the most verbose logging output.
Let's get started: equipped with the above information on the basic SHOGUN
commands, you are now able to create your own SHOGUN applications.
\section staticifexample_sec Example
Let us discuss an example:

\li \verbatim sg('set_features', 'TRAIN', traindat) \endverbatim
registers the training sample which resides in traindat.

\li \verbatim sg('set_labels', 'TRAIN', trainlab) \endverbatim
registers the training labels.

\li \verbatim sg('set_kernel', 'GAUSSIAN', 'REAL', 100, 1.0) \endverbatim
creates a new Gaussian kernel for reals with a cache size of 100MB and width 1.

\li \verbatim sg('new_classifier', 'SVMLIGHT') \endverbatim
creates a new SVM object inside the SHOGUN core.

\li \verbatim sg('c', 20.0) \endverbatim
sets the C value of the new SVM to 20.0.

\li \verbatim sg('train_classifier') \endverbatim
attaches the data to the kernel, does some initialization and then
starts the training on the sample.

\li \verbatim sg('set_features', 'TEST', testdat) \endverbatim
registers the test sample.

\li \verbatim out=sg('classify') \endverbatim
attaches the data to the kernel and classifies; the classification result is
returned as a vector.
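Put together, the whole session in octave reads as follows (a sketch, assuming traindat, trainlab and testdat already hold your data):

\verbatim
sg('set_features', 'TRAIN', traindat);
sg('set_labels', 'TRAIN', trainlab);
sg('set_kernel', 'GAUSSIAN', 'REAL', 100, 1.0);
sg('new_classifier', 'SVMLIGHT');
sg('c', 20.0);
sg('train_classifier');
sg('set_features', 'TEST', testdat);
out=sg('classify');
\endverbatim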
\section staticifcmdref_sec Function Reference
\li \subpage staticoctave
\li \subpage staticpython
\li \subpage staticcmdline