IRC logs of #shogun for Monday, 2011-07-25

--- Log opened Mon Jul 25 00:00:12 2011
--- Day changed Mon Jul 25 2011
blackburnsonney2k: libsvm doesn't work in either python or java00:00
blackburnSystemError: [ERROR] assertion kernel->get_num_vec_lhs()==problem.l failed in file classifier/svm/LibSVM.cpp line 8500:00
@sonney2kI don't understand00:01
@sonney2kyou said the minimal example worked?00:01
@sonney2kthat should be using libsvm too00:01
blackburnminimal example yes, classifier_libsvm_modular.py - not00:02
@sonney2kbut where is the difference?00:02
@sonney2kthey both do the same00:02
blackburnI would fix that already if I knew00:03
@sonney2kblackburn, I suspect that the number of labels doesn't match the matrix00:03
@sonney2kdoes python work?00:03
@sonney2kor java?00:03
@sonney2kor none?00:03
blackburnnone00:03
blackburnah yes00:04
blackburnmy fauld00:04
blackburnfault*00:04
@sonney2kbtw the minimal example in java cannot work00:04
@sonney2kforget what I just said00:04
@sonney2kblackburn, your fault?00:05
@sonney2kwhat does 'ah yes' mean?00:05
blackburnnumber of labels wrong00:05
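
The assertion above is the usual symptom of a label/vector count mismatch: LibSVM checks that the number of labels equals the number of training vectors (columns of the feature matrix). A minimal sketch of the point, assuming the modular Python API used in these examples:

    # Sketch only: the label count must match the number of feature vectors (columns).
    from numpy import array, float64
    from modshogun import RealFeatures, Labels, GaussianKernel, LibSVM

    feats = RealFeatures(array([[1.0, 2.0, 3.0],
                                [4.0, 5.0, 6.0]], dtype=float64))  # 2 features x 3 vectors
    labels = Labels(array([-1.0, 1.0, 1.0]))  # 3 labels, one per vector
    # Labels(array([-1.0, 1.0])) would trip the assertion in LibSVM.cpp line 85
    svm = LibSVM(1.0, GaussianKernel(feats, feats, 2.0), labels)
    svm.train()
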
blackburnsonney2k: why can't the minimal example in java work?00:06
@sonney2kok00:06
@sonney2kdid you read what I said above?00:06
blackburnyes I did00:06
@sonney2k<sonney2k> btw the minimal example in java cannot work00:07
@sonney2k<sonney2k> forget what I just said00:07
@sonney2k:D00:07
blackburnsonney2k: libsvm produces the same results00:08
@sonney2kblackburn, as in same labels.get_labels() ?00:08
blackburnblackburn@blackburn-laptop:~/shogun/shogun/examples/undocumented/java_modular$ ./check.sh classifier_libsvm_modular.java00:08
blackburn[0.1938791717197525, 0.19659259940936621]00:08
blackburnblackburn@blackburn-laptop:~/shogun/shogun/examples/undocumented/python_modular$ python classifier_libsvm_modular.py00:08
blackburnLibSVM00:08
blackburn[ 0.19387917  0.1965926 ]00:08
@sonney2kwait00:10
@sonney2k2 outputs only?00:10
@sonney2kwe should have 9200:10
blackburnyes, I modified data00:10
@sonney2kok00:10
blackburnI can't check 92 numbers for equality00:10
@sonney2kbetter check for the whole sample00:10
blackburntakes more time00:10
@sonney2kjust print them and do a diff00:10
@sonney2kyes I understand, but it is impossible for 92 numbers to match by chance00:11
blackburnokay okay00:11
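
One way to avoid eyeballing 92 numbers is to dump both runs to text files and compare them programmatically; a hedged sketch (the file names are made up, and each example would need a small savetxt/print addition to produce them):

    # Compare dumped label vectors from the python and java runs.
    from numpy import loadtxt, allclose

    py_labels = loadtxt('python_labels.txt')    # e.g. written with numpy.savetxt in the python example
    java_labels = loadtxt('java_labels.txt')    # e.g. redirected output of the java check.sh run
    assert py_labels.shape == java_labels.shape
    print(allclose(py_labels, java_labels, atol=1e-10))  # True if all 92 values agree
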
@sonney2kohh dam'd I am compiling shogun on debian unstable00:11
@sonney2klots of new warnings....00:11
blackburn?00:12
@sonney2kI think I should start preparing a debian package00:12
@sonney2kfor the new thing00:12
blackburnsonney2k: the same for 9200:13
@sonney2khurray!00:13
blackburnok, i'm pretty tired with java today00:16
@sonney2kblackburn, what did you expect?00:16
@sonney2kI think it ran rather smoothly00:16
blackburnwhaT?00:16
@sonney2kit basically worked out of the box00:16
blackburncan't understand what you are talking about00:17
@sonney2kdidn't you only have to modify load.py / Load.java?00:17
@sonney2kno bugs in typemaps (so far)00:17
@sonney2k?00:17
blackburnyes00:17
@sonney2kso yes - that is trivial compared to typemap bugs00:17
blackburnit is00:18
@sonney2kso we are lucky00:18
@sonney2kand chances are that other examples will just work00:18
blackburnI saw some strange outputs in some of them00:19
blackburnNaN or so00:19
blackburntomorrow will take a look00:19
@sonney2kI mean now it remains to test if string based typemaps work00:19
@sonney2ke.g. stringfeatures00:19
@sonney2kweighteddegreestringkernel00:20
@sonney2kand then some really complex example with preprocessors attached or multiple kernels00:20
@sonney2kif something big works then the rest is just minor issues00:20
@sonney2kblackburn, I really think we need someone doing a tutorial with some nice data set00:24
@sonney2kI mean like we have a certain data set and no idea about it00:24
@sonney2kso we do pca or so first00:24
@sonney2kand visualize it00:24
@sonney2kthen we do some classification or so00:24
blackburnwith as many methods used as possible?00:25
@sonney2kyeah00:28
@sonney2klike some story line from very explorative unsupervised00:28
@sonney2kto simple supervised00:28
@sonney2ke.g. linear00:28
@sonney2kthen e.g. svm w/ kernels00:28
@sonney2kand then maybe even multiple kernels / data sources00:29
blackburngood idea00:29
@sonney2kthat could work for anything, 2-class classification, regression, multiclass00:30
@sonney2kwould be cool to use heikos x-validation on top already for that00:30
blackburnI would say I can do it if I wasn't embarrassed with my manifold learning algos00:31
@sonney2kblackburn, don't worry at some point we will have shogun 1.0 and then we might have time to work on some nice applications too :)00:42
@sonney2kanyway00:42
@sonney2kgoing to bed now00:42
@sonney2kcu00:42
blackburnsee you00:43
-!- blackburn [~blackburn@188.122.239.253] has quit [Ping timeout: 255 seconds]00:49
-!- f-x [~user@117.192.199.217] has joined #shogun01:59
-!- f-x_ [fx@213.155.190.134] has joined #shogun04:02
-!- f-x [~user@117.192.199.217] has quit [Ping timeout: 260 seconds]04:24
-!- in3xes [~in3xes@180.149.49.230] has quit [Quit: Leaving]05:09
-!- gsomix [~gsomix@178.45.88.77] has joined #shogun05:45
-!- [1]warpy [~warpy@bzq-79-181-43-167.red.bezeqint.net] has quit [Quit: HydraIRC -> http://www.hydrairc.com <- Like it? Visit #hydrairc on EFNet]07:08
-!- gsomix [~gsomix@178.45.88.77] has quit [Read error: Connection reset by peer]07:29
-!- f-x [~user@117.192.204.84] has joined #shogun08:52
-!- sploving1 [~sploving@124.16.139.134] has joined #shogun08:55
@sonney2ksploving1, could you please test on the kernel example first?09:11
sploving1I tested all of them09:12
@sonney2kand?09:12
sploving1the results are not the same!!09:12
sploving1why??09:12
@sonney2kwhy what?09:13
@sonney2kwhen you just set features and get the features09:13
sploving1the results are not the same (python, lua)09:13
@sonney2kare they the same as in python?09:13
@sonney2ksploving1, yes you said that already - now we need to debug why not.09:14
sploving1sonne2y, why does classifier_averaged_perceptron_modular.py give different results when run twice?09:14
sploving1sonney2k09:15
@sonney2kthey were not the same in (python,java) either. but now they are at least for some examples09:15
@sonney2ksploving1, please first try a simpler example like just setting features / getting features09:15
sploving1can you give me the name of an example, and I will try it. I do not know what setting/getting features means09:17
@sonney2ksploving1, I see09:17
@sonney2klets try features_simple_real_modular.py09:17
sploving1works well~09:20
sploving1sonney2k, I mean in python it works well09:20
@sonney2ksploving1, ok - now compare if that works in lua too09:21
sploving1sonney2k, can we compile them both09:21
sploving1I mean in configure, we need compile them both.09:21
@sonney2ksploving1, yes, just configure python_modular,lua_modular09:22
sploving1./configure --interfaces=python_modular,lua_modular09:22
@sonney2kbut when you already installed you don't need to09:22
@sonney2k,09:22
@sonney2kbetween yes09:22
sploving1okay that is good09:22
@sonney2ksploving1, I am looking into the averaged perceptron issue09:22
@sonney2k(now)09:22
@sonney2kit is a different problem - it seems09:23
sploving1yeap. I do not need09:28
sploving11 4 009:29
sploving10 0 909:29
sploving10 0 009:29
sploving10 5 009:29
sploving10 0 609:29
sploving19 9 909:29
sploving1this is lua result09:29
sploving1[[ 1.  2.  3.]09:29
sploving1 [ 4.  0.  0.]09:29
sploving1 [ 0.  0.  0.]09:29
sploving1 [ 0.  5.  0.]09:29
sploving1 [ 0.  0.  6.]09:29
sploving1 [ 9.  9.  9.]]09:29
-!- sploving1 was kicked from #shogun by bettyboo [flood]09:29
-!- sploving1 [~sploving@124.16.139.134] has joined #shogun09:29
sploving1[[ 1.  2.  3.]09:29
sploving1 [ 4.  0.  0.]09:29
sploving1 [ 0.  0.  0.]09:29
sploving1 [ 0.  5.  0.]09:29
sploving1 [ 0.  0.  6.]09:29
sploving1 [ 9.  9.  9.]]09:29
-!- sploving1 was kicked from #shogun by bettyboo [flood]09:29
-!- sploving1 [~sploving@124.16.139.134] has joined #shogun09:29
sploving1[[ 1.  2.  3.]09:30
sploving1 [ 4.  0.  0.]09:30
sploving1 09:30
sploving1 [ 0.  0.  0.]09:30
sploving1 [ 0.  5.  0.]09:30
sploving1 [ 0.  0.  6.]09:30
sploving1 [ 9.  9.  9.]]09:30
sploving1this is python result09:30
sploving1sonney2k, which is correct?09:33
@sonney2ksploving1, what is the original input?09:34
sploving1a=RealFeatures(A), a.set_feature_vector(array([1,4,0,0,0,9], dtype=float64), 0) will affect a.get_feature_matrix()??09:34
sploving1matrix=array([[1,2,3],[4,0,0],[0,0,0],[0,5,0],[0,0,6],[9,9,9]], dtype=float64)09:34
sploving1this  is the original input09:34
sploving1sonney2k, in lua it is : matrix = {{1,2,3},{4,0,0},{0,0,0},{0,5,0},{0,0,6},{9,9,9}}09:35
-!- f-x [~user@117.192.204.84] has quit [Ping timeout: 260 seconds]09:36
sploving1as I do not know features' meaning, I have no idea which result is correct09:36
@sonney2ksploving1, did you do the set_feature_vector in lua too?09:37
sploving1sonney2k, yeap . a:set_feature_vector({1,4,0,0,0,9}, 0)09:38
@sonney2ksploving1, I am a bit lost - please comment the set_feature_vector in both languages09:38
@sonney2kand then just show what you get in python (first) and then lua09:39
@sonney2ksploving1, maybe use gist.github.com for pasting...09:39
@sonney2kor /query me09:39
sploving1https://gist.github.com/110371709:41
sploving1sonney2k, take a look at it!09:41
@sonney2kthe python one is correct09:42
sploving1you mean set_feature_vector has no effect on the result? sonney2k??09:43
sploving1sonney2k, then why does set_feature_vector affect the result in lua?? so strange!09:44
@sonney2ksploving1, it should not yes09:50
@sonney2ksploving1, please don't do set_feature_vector for now and check09:51
@sonney2kit is probably wrong in lua nevertheless09:51
@sonney2k(just result transposed - I guess different order in typemap)09:51
sploving1sonne2k, without it(set_..), it is the correct result09:53
sploving1sonney2k09:53
sploving1it is the original input09:54
sploving1the same with09:54
sploving1different order? sonney2k, can you explain it in more detail?09:55
sploving1so I can fix it09:55
@sonney2ksploving1, yes that can happen when both set_feature_matrix and get_feature_matrix use a different ordering09:55
@sonney2ksploving1, when you uncomment09:56
@sonney2k    print a.get_num_vectors()09:56
@sonney2k    print a.get_num_features()09:56
@sonney2kwhat do these display in lua?09:56
@sonney2ksploving1, in the end i suspect that just this array[i * cols + j] statement in lua is wrong09:58
sploving13 609:58
@sonney2kthat is correct09:58
@sonney2ksploving1, if you write  matrix = {{1,2,3},{4,0,0},{0,0,0},{0,5,0},{0,0,6},{9,9,9}} how many tables are these?10:03
@sonney2k6 right?10:04
@sonney2kso that should match rows10:04
@sonney2kand cols is 3 since each table has 3 elements10:04
sploving16 yeap10:05
-!- f-x [~user@117.192.204.152] has joined #shogun10:05
sploving16 is rows, 3 is cols10:05
@sonney2ksploving1, does that match the meaning you have in swig_typemaps.i?10:05
sploving1yeap. now I understand what set_feature_vector means10:05
sploving1it sets the first column10:05
@sonney2kyes10:05
sploving1I will take a look at the file and fix it10:06
@sonney2ksploving1, but please note that the vector set function is correct10:07
@sonney2kit must be the in/out typemaps for SGMatrix that are *both* wrong10:07
sploving1you mean SGVector is correct and SGMatrix is wrong?10:07
@sonney2kyes10:07
sploving1sonney2k, okay10:07
@sonney2kI think it should be array[j * rows + i] in line 156 in swig_typemaps.i10:12
@sonney2kand same indexing in line 17610:13
@sonney2ksploving1, ^10:13
@sonney2kthen it should work10:13
sploving1I fixed it and now recompiling10:13
sploving1I thought shogun stores rows first, but that is wrong10:14
sploving1shogun stores columns first, then rows, sonney2k10:14
@sonney2kyes it is always column by column10:14
@sonney2klike fortran, matlab, r, octave, ...10:14
-!- blackburn [~blackburn@188.122.239.253] has joined #shogun10:15
@sonney2k... but not python :)10:15
sploving1oh. I know10:15
@sonney2k(in python numpy one can specify that one wants fortran order - so it works there too :)10:16
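
To make the ordering discussed above concrete: shogun stores feature matrices column-major (Fortran order), so the flat index of element (i, j) is j * rows + i, which is the indexing the lua typemap needs. A small numpy illustration:

    # Column-major (Fortran) vs row-major (C) flattening of the same matrix.
    import numpy as np

    m = np.array([[1, 2, 3],
                  [4, 0, 0],
                  [9, 9, 9]], dtype=np.float64)
    rows, cols = m.shape

    flat_c = m.ravel(order='C')  # row-major:    [1 2 3 4 0 0 9 9 9]
    flat_f = m.ravel(order='F')  # column-major: [1 4 9 2 0 9 3 0 9]  <- what shogun expects

    i, j = 2, 1
    assert flat_f[j * rows + i] == m[i, j]  # correct indexing for column-major storage
    assert flat_c[i * cols + j] == m[i, j]  # the row-major indexing that produced the transposed output
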
sploving1sonney2k, now I want to support ruby narray10:17
@sonney2ksploving1, does it work now?10:17
@sonney2kI mean lua matrix?10:18
sploving1just compiling10:18
sploving1I fetch upstream10:18
sploving1so I git clean and compiling from new fresh10:18
@sonney2kok10:19
sploving1I just see narray examples,but no api10:19
sploving1like numpy10:19
@sonney2kserialhex, in case you are around again ping us10:19
@sonney2ksploving1, I would do the same kind of low-level support that you did for lua10:20
sploving1sonne2k, ?10:20
sploving1sonney2k10:20
sploving1what do you mean?10:20
@sonney2ksploving1, just using arrays10:22
-!- sploving1 [~sploving@124.16.139.134] has quit [Remote host closed the connection]10:22
@sonney2k?10:23
@sonney2khmmhh it looks like narray is still being developed10:24
@sonney2kso it is probably worth supporting10:24
@sonney2kand the api is in narray.h10:27
-!- sploving1 [~sploving@124.16.139.194] has joined #shogun10:30
sploving1sonney2k,/../../src/interfaces/lua_modular/modshogun.so: undefined symbol: _ZN6shogun4CGMM10train_smemEiidid10:30
sploving1my machine crashed just now. I needed to reboot and just run make (cannot run other applications) to compile shogun10:31
sploving1but I met this problem: lua: error loading module 'modshogun' from file '../../../src/interfaces/lua_modular/modshogun.so':10:31
sploving1../../../src/interfaces/lua_modular/modshogun.so: undefined symbol: _ZN6shogun4CGMM10train_smemEiidid10:31
@sonney2ksploving1, yes it needs 1.5 G to compile10:31
sploving1I have git clean -dfx and configure it with lua/python modular10:33
@sonney2ksploving1, I am doing now too10:34
@sonney2ksploving1, btw you can use narray.h for the api of ruby's narray10:37
sploving1oh. i hope there is a api doc10:39
@sonney2ksploving1, I couldn't find any - but the .h does contain the needed info10:39
@sonney2ke.g. IsNArray() to test if the obj. is an narray10:39
sploving1okay. I will take a look at it10:40
@sonney2kand there is RNArray with the data10:40
sploving1if it is similar to python, that may not be difficult10:40
sploving1numpy, i mean10:40
@sonney2kit is definitely similar and not beautiful :)10:42
@sonney2kwhich example did not work?10:43
@sonney2ksploving1, ^10:43
sploving1feature10:44
sploving1features_simple_real_modular.lua, sonney2k10:44
sploving1does fresh shogun work well in your computer??10:45
@sonney2kI just did git clean -dfx and recompiled10:45
@sonney2kit works....10:45
@sonney2khow do you run the example?10:46
sploving1export LUA_PATH=../../../src/interfaces/lua_modular/?.lua\;?.lua10:46
sploving1export LUA_CPATH=../../../src/interfaces/lua_modular/?.so10:46
sploving1then lua features_simple_real_modular.lua10:46
@sonney2k(I ran ./check.sh)10:46
@sonney2kyes that works too10:47
sploving1oh. I will compile it again10:50
-!- sploving1 [~sploving@124.16.139.194] has left #shogun []10:50
-!- warpyyy [~theuser@212.179.28.34] has joined #shogun10:52
@sonney2kblackburn, does current master compile and run for you?10:52
blackburnmin10:53
-!- warpyyy [~theuser@212.179.28.34] has quit [Read error: Connection reset by peer]10:54
blackburnsonney2k: yes, all ok, interfaces=java_modular10:57
@sonney2kok then it must be sth on splovings side11:00
* sonney2k is transitioning CLabels for SGVector11:01
CIA-87shogun: Soeren Sonnenburg master * re9d4632 / src/interfaces/lua_modular/swig_typemaps.i :11:47
CIA-87shogun: Merge pull request #232 from sploving/master11:47
CIA-87shogun: fix matrix typemap(columns first then rows) - https://github.com/shogun-toolbox/shogun/commit/e9d463232d97ebb290e9db25ae31905469c22acf11:47
CIA-87shogun: Baozeng Ding master * rad8130a / src/interfaces/lua_modular/swig_typemaps.i : fix matrix typemap(columns first then rows) - https://github.com/shogun-toolbox/shogun/commit/ad8130a578f008d729f4f2525b83631b52ba462011:47
-!- sploving1 [~sploving@124.16.139.194] has joined #shogun12:06
sploving1now lua features_simple_real works!12:07
sploving1sonney2k, do you know why classifier_averaged_perceptron_modular.py produces different results?12:08
@sonney2ksploving1, and kernel too?12:09
@sonney2ksploving1, I am working on that perceptron issue12:09
@sonney2ksince that issue appears in python too - it must be some general problem12:09
sploving1I am trying kernel now12:09
sploving1kernel_gaussian_modular, https://gist.github.com/110387112:16
sploving1sonney2k, not the same. lua generates such a long result!!12:16
@sonney2ksploving1, could you please test km_train first?12:18
sploving1okay12:18
@sonney2kit should be as big as number of columns12:18
@sonney2ktimes number of columns12:18
@sonney2kIIRC 92x9212:18
sploving1lua: km_train: 92*812:23
sploving1sonney2k, how do I print it fully in python??12:23
sploving1it has ... in the result12:24
@sonney2kyou mean print(x) ?12:24
sploving1yeap12:24
@sonney2krepr()12:24
@sonney2kx.repr()12:24
@sonney2ksploving1, how can km_train for lua be 92x8 ?12:25
@sonney2knot possible...12:25
sploving1sonney2k,  'numpy.ndarray' object has no attribute 'repr'12:25
sploving1km_train.repr()??12:25
sploving1sonney2k, what it should be? 92*92?12:26
@sonney2krepr(km_train)12:26
@sonney2kyes12:26
sploving1sonney2k, https://gist.github.com/1103871, the python output still cannot be dumped using repr. it has ... in the output12:28
sploving1I mean it omits some results12:30
sploving1using repr, or print12:30
@sonney2ksploving1, then use numpy.savetxt('somefilename', km_train)12:31
@sonney2kbut that is not the actual problem...12:32
@sonney2kbtw the lua matrix is 92x92 here too12:34
sploving1https://gist.github.com/110388712:35
sploving1this is the result12:35
sploving192*92? how do you know that??12:36
sploving1I just counted roughly rows = 92, cols = 8, sonney2k12:36
@sonney2knope 92x9212:40
@sonney2kand gives same result as in python btw12:42
sploving1sonney2k, you are so great. can you tell me how you did that????12:44
sploving1sonney2k, I did not see the right side12:45
@sonney2kI just write the output of the lua matrix to a file12:45
sploving1sorry for that12:45
sploving1you mean lua *.lua > 1.txt? sonney2k12:46
@sonney2kyes12:46
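
A sketch of that dump-and-compare workflow (file names are illustrative): write the full matrix from the python side with numpy.savetxt so nothing is elided with '...', redirect the lua script's output to a second file, then diff the two:

    # Dump a kernel matrix to a text file for comparison with the lua run.
    from numpy import savetxt
    from numpy.random import rand
    from modshogun import RealFeatures, GaussianKernel

    feats = RealFeatures(rand(2, 92))          # 92 two-dimensional vectors
    km_train = GaussianKernel(feats, feats, 1.3).get_kernel_matrix()
    savetxt('km_train_python.txt', km_train, fmt='%.12g')
    # then on the shell:  lua kernel_gaussian_modular.lua > km_train_lua.txt
    #                     diff km_train_python.txt km_train_lua.txt
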
sploving1okay. I will move to ruby12:50
sploving1sonney2k I gtg bye12:51
@bettyboosee you12:51
@sonney2ksploving1, don't forget strings12:51
@sonney2kin lua I mean12:51
@sonney2kyou havent' tested these yet12:51
sploving1sonney2k, okay. I will test them. when output, just lua *.lua > 1.txt???12:52
@sonney2ksploving1, I don't understand what you mean12:52
sploving1in my machine, I did not see the right side of the columns12:52
sploving11 0.026404555080215 0.00051704925476818 0.013315592813501 0.98824215105277 0.65147692801705 0.14929587268109 2.354172745438e-05 0.2832383077822712:53
sploving1for instance i just see the first row is the above12:53
sploving1but when I paste them on pastebin, it has the right side12:53
sploving1so strange12:53
-!- sploving1 [~sploving@124.16.139.194] has left #shogun []12:54
@sonney2kblackburn, why do you use m_labels in GaussianNB?13:08
@sonney2kI mean you only need this in train() or?13:09
blackburnhmm13:11
blackburnI can't remember13:11
blackburnlet me look ;)13:11
blackburnsonney2k: m_labels is used only in train, yes13:12
@sonney2kok then I remove m_labels from the .h13:12
@sonney2ketc13:12
-!- heiko1 [~heiko@134.91.10.200] has joined #shogun13:35
@sonney2kheiko1, hey ... you survived :)13:41
heiko1sonney2k, yes, but pain all over the body ;)13:43
heiko1how do you say Muskelkater (sore muscles) in english? :)13:43
blackburnhard weekend with gf? :D13:43
blackburnthe last words were 'girlfriend is waiting' IIRC :D13:44
heiko1oh, yes and that also :)13:44
heiko1and climbing13:44
* sonney2k sings la la la *VERY* *LOUDLY*13:44
heiko1and you guys? did you have a good weekend?13:45
@sonney2kblackburn had a lot of fun with java or coffee or so13:47
blackburnhaha13:47
blackburnwith HLLE too13:47
@sonney2kI am currently transitioning labels to use SGVector for real (internally)13:48
@sonney2ktoo13:48
-!- heiko1 [~heiko@134.91.10.200] has quit [Ping timeout: 258 seconds]15:27
-!- heiko1 [~heiko@134.91.52.56] has joined #shogun15:33
@sonney2kheiko1, do you know by heart where you recently added a SG_NOT_IMPLEMENTED?15:41
heiko1yes, copy_subset() of CFeatures15:44
heiko1called by KernelMachine::train15:44
heiko1an example fails?15:45
@sonney2kyeah serialization15:45
@sonney2kbut I don't think it was that one15:45
heiko1mmh15:45
@sonney2kit is the gaussian kernel that is failing15:45
heiko1let me check15:45
heiko1compute feature vector of SimpleFeatures15:46
@sonney2kheiko1, yes that one!15:47
heiko1this method hjust returns NULL there15:47
heiko1has to be overridden or something15:47
heiko1should I remove the SG_NOTIMPLEMENTED?15:47
@sonney2kI don't know though why it is called15:47
heiko1i saw a call of it recently... i will check15:49
heiko1SimpleFeatures::get_feature_vector15:50
heiko1if no feature matrix is set15:50
@sonney2kmakes sense15:50
@sonney2kso it could be that serializaton has some chicken / egg problem15:50
heiko1yes15:50
@sonney2kthat kernel should be loaded but features are not yet there or so15:51
heiko1mmh15:51
heiko1but if the SG_NOTIMPLEMENTED is removed15:51
heiko1NULL is returned there15:51
@sonney2kand the gaussian kernel does some operation in load_serializable_post15:51
@sonney2kto precompute some x_i^215:52
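
For context on those x_i^2 values (the standard identity, not a claim about the exact code): the Gaussian kernel only needs inner products plus per-vector squared norms,

    k(x_i, x_j) = \exp\left( -\frac{\| x_i - x_j \|^2}{\text{width}} \right),
    \qquad
    \| x_i - x_j \|^2 = \| x_i \|^2 + \| x_j \|^2 - 2 \langle x_i, x_j \rangle ,

so the squared norms are cached once per feature vector. Rebuilding that cache in load_serializable_post requires the feature matrix to already be present, which is why a missing matrix surfaces as a failure in the Gaussian kernel.
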
heiko1sonney2k, the fire-alarm just started howling here15:53
heiko1i will check whats happening15:53
@sonney2kheiko1, ok15:53
heiko1indeed, there is a fire15:55
@sonney2kwow!15:55
heiko1cars ariving, i will go outside for a minute15:55
@sonney2kwho is burning?15:55
heiko1(probalby takes more)15:55
heiko1dont know15:55
blackburnit is what they call 'extreme programming'15:56
heiko1i am in 6th floor15:56
heiko1afk...15:56
@sonney2kblackburn, your name is heiko1 program15:56
blackburnwhat&15:57
blackburn>15:57
blackburn?15:57
@sonney2kI see them all burning with black smoke15:57
@sonney2khope heiko1 manages to escape15:57
blackburnwhere do you see it?15:57
@sonney2klive tv of course ;-)15:58
blackburnjoke? ;)15:58
@sonney2kno of course not *eg*15:58
@sonney2kI guess this bug hunt here is driving me mad16:01
@sonney2kwhich reminds me16:01
@sonney2kblackburn, how is java going along?16:02
blackburnsonney2k: currently working on HLLE16:02
@sonney2kblackburn, that is not fair16:02
@sonney2kyou can have fun16:02
@sonney2kI have to fix bugs16:02
blackburnI'm currently fixing bugs in HLLE :D16:03
@sonney2kyou mean it is semi-fun16:03
@sonney2khmmhh not sure I can live with that16:03
heiko1re16:04
heiko1small fire16:04
heiko1wow 3 cars and police here16:04
heiko1but builing will not be evacuated16:04
@sonney2kheiko1, what happened?16:14
@sonney2kor what is going on?16:14
heiko1they are already gone,16:14
heiko1but I did not have the motivation to go down all 150 stair steps ;)16:15
heiko1probably nothing too scary16:15
@sonney2kah no elevator...16:15
@sonney2kyou could have climbed...16:15
heiko1yes, but I dont like being in the elevator16:15
heiko1yes, it's possible here :)16:15
@sonney2kanyway heiko1 I made some progress on this here16:15
heiko1ok?16:15
@sonney2kthe strange thing is that the feature object got loaded just fine16:16
@sonney2kand it pretends that it did also load the matrix16:16
@sonney2kbut for some reason not?!16:16
heiko1mmmh16:18
heiko1are these simple features?16:18
@sonney2kyes16:18
@sonney2kheiko1, I only noticed because I changed all of labels and thus lots of classifiers16:19
@sonney2kand so I recognized that some examples fail...16:19
heiko1is the load method of SimpleFeatures called?16:20
@sonney2kenabling debug I see that the simple features are already loaded...16:21
@sonney2kheiko1, w/ debug on I see that16:45
@sonney2k[DEBUG] START LOADING CSGObject 'SimpleFeatures'16:45
@sonney2k....16:45
@sonney2k[DEBUG] Loading parameter 'feature_matrix' of type 'Matrix<float64>'16:45
@sonney2k[DEBUG] DONE LOADING CSGObject 'SimpleFeatures' (0x3c973b0)16:45
@sonney2kbut then in kernel lhs=0x3c973b0 '(nil)' num_vec_fm=0 num_feat_fm=0 num_vec=20 num_feat=216:45
@sonney2kthe nil there corresponds to no feature matrix around16:45
@sonney2kand the num_vec/feat_fm =0 indicate that the matrix indeed did not get loaded16:46
heiko1strange :(16:48
@sonney2kheiko1, does the matrix stuff work at all?16:48
@sonney2klet me create a fool proof example16:48
heiko1what do you mean with matrix stuff?16:48
heiko1ok16:49
@sonney2kanswer is no16:50
@sonney2kfrom modshogun import *16:50
@sonney2kfrom numpy import array16:50
@sonney2kfeats=RealFeatures(array([[1.0,2,3],[4,5,6]]))16:50
@sonney2kfstream = SerializableAsciiFile("foo.asc", "w")16:50
@sonney2kfeats.save_serializable(fstream)16:50
@sonney2kbut in foo.asc we have feature_matrix Matrix<float64> 0 0 ()16:51
heiko1so which part is not working? save or load?16:52
@sonney2ksave16:52
@sonney2kheiko1, vector works though16:52
@sonney2kl=Labels(array([1.0,2,3]))16:52
@sonney2kfstream = SerializableAsciiFile("foo2.asc", "w")16:52
@sonney2kl.save_serializable(fstream)16:52
@sonney2klabels Vector<float64> 3 ({1}{2}{3})16:52
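
For reference, a consolidated, self-contained version of the two snippets pasted above, with the load half added so the roundtrip can be checked once saving works (a sketch, following the pattern of the serialization examples):

    # Save a feature matrix to an ASCII file and load it back.
    from numpy import array
    from modshogun import RealFeatures, SerializableAsciiFile

    feats = RealFeatures(array([[1.0, 2, 3], [4, 5, 6]]))
    fstream = SerializableAsciiFile("foo.asc", "w")
    feats.save_serializable(fstream)
    fstream.close()

    loaded = RealFeatures()
    fstream = SerializableAsciiFile("foo.asc", "r")
    loaded.load_serializable(fstream)
    print(loaded.get_feature_matrix())   # should print the 2x3 matrix, not an empty one
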
@sonney2kheiko1, do you have an idea where I should look for the bug?16:53
@sonney2kor do you even know what the problem could be?16:53
heiko1mmh16:53
heiko1i mean the save code of SimpleFeatures is short.16:54
heiko1writer->f_write(feature_matrix, num_features, num_vectors);16:54
@sonney2kheiko1, not the save code of simplefeatures16:54
@sonney2kserialization16:54
heiko1ah sorry16:54
@sonney2klike where you did add support for SGVector / SGMatrix :)16:54
heiko1oh, ... mmh, perhaps the add methods of Parameter I did are wrong16:55
@sonney2kor not - we have to find out16:55
@sonney2kthat was in shogun/base/Parameter.cpp?16:56
heiko1yes16:56
heiko1all the add methods16:56
heiko1with SGVector SGMatrix16:56
heiko1hope theres no mistake in them16:57
@sonney2kbut we are not even using SGMatrix etc16:57
heiko1oh16:57
heiko1mmh16:57
heiko1then this cant be the mistake16:57
@sonney2kwe use the add_matrix stuff16:57
heiko1but this wasn't touched recently, was it?16:58
@sonney2kmaybe for the subsetting business16:59
heiko1yes16:59
@sonney2kheiko1, I mean there are feature_matrix_num_vectors etc16:59
@sonney2kand these are 0 too17:00
heiko1oh17:00
heiko1I just got an idea17:00
heiko1let me check17:00
@sonney2kand indeed they are17:00
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun17:00
heiko1perhaps this has to do something with the variables that i changed17:01
heiko1the add methods have also been changed17:01
heiko1At some point I replaced the features with an SGVector17:01
@sonney2kheiko1, no I think it is a bug in simplefeatures somehow17:01
heiko1but undid this17:01
heiko1yes17:01
heiko1in SimpleFeatures17:01
@sonney2kI mean dimensions of feature matrix need to be non-zero17:02
@sonney2kcould very well be my fault too...17:02
heiko1mmh17:02
blackburnhooray to new heisenbug in arpack wrapper!17:02
heiko1sonney2k, I have an appointment in a few minutes, sorry for that, but I will be back later17:02
@sonney2kheiko1, found the bug!17:03
heiko1where?17:04
@sonney2kheiko1, in the set_feature_matrix for SGMatrix type17:04
@sonney2kit forgot to set feature_matrix_num_vectors17:05
heiko1alright17:05
heiko1then17:05
@sonney2keverywhere else it was ok17:05
heiko1glad you found it :)17:05
heiko1so, see you this in the evening17:06
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 276 seconds]17:16
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun17:28
CIA-87shogun: Soeren Sonnenburg master * rd85cb65 / (23 files in 9 dirs):17:35
CIA-87shogun: remove unused confidences from labels and add SGVector in methods17:35
CIA-87shogun: utilizing labels when possible - https://github.com/shogun-toolbox/shogun/commit/d85cb655161d037c2d0a0c2f0970f44de3a9e13117:35
CIA-87shogun: Soeren Sonnenburg master * rd2f13fb / examples/undocumented/python_modular/serialization_matrix_modular.py : add example to just serialize matrix - https://github.com/shogun-toolbox/shogun/commit/d2f13fb17d3f9998af1a175dfd4e2bea4544fb3d17:35
CIA-87shogun: Soeren Sonnenburg master * r58fd62c / (10 files in 7 dirs):17:35
CIA-87shogun: various bugfixes related to SGVector/SGMatrix transition17:35
CIA-87shogun: - in out typemap of python_modular for vectors17:35
CIA-87shogun: - in simplefeatures set_feature_matrix17:35
CIA-87shogun: - KRR double free17:35
CIA-87shogun:  ... - https://github.com/shogun-toolbox/shogun/commit/58fd62cc8e2765e82b9e4fc5926603942355d47e17:35
-!- CIA-87 was kicked from #shogun by bettyboo [flood]17:35
-!- CIA-87 [~CIA@cia.atheme.org] has joined #shogun17:35
@sonney2kblackburn, yay!17:36
@sonney2kall examples work again :)17:36
@sonney2kpython_modular only of course17:37
blackburnnice17:39
CIA-87shogun: Soeren Sonnenburg master * r00d57d5 / examples/undocumented/python_modular/clustering_gmm_modular.py : fix clustering example - https://github.com/shogun-toolbox/shogun/commit/00d57d584f283a8ed9512f97e74b6a8413b7b66217:45
CIA-87shogun: Sergey Lisitsyn master * rbcb7bb8 / (2 files): Added DORGQR routine wrapper for lapack - https://github.com/shogun-toolbox/shogun/commit/bcb7bb8f5a965e5eb7e51f7c158a4015411f5cf518:37
CIA-87shogun: Sergey Lisitsyn master * r988360c / (5 files in 2 dirs): Introduced Hessian Locally Linear Embedding preprocessor - https://github.com/shogun-toolbox/shogun/commit/988360ca58dfa56181aedb7cb7781b7d64ee0a3d18:37
CIA-87shogun: Sergey Lisitsyn master * r179091b / (2 files): Fixed LLE and added HLLE python modular example - https://github.com/shogun-toolbox/shogun/commit/179091b1af287374bc754b852f2a833321e8704e18:41
blackburnsonney2k: vodka?18:42
-!- gsomix [~gsomix@109.169.131.11] has joined #shogun19:25
gsomixhi all19:25
-!- f-x [~user@117.192.204.152] has quit [Remote host closed the connection]19:28
gsomixsonney2k, i saw my own ohloh account. I think i did something wrong with committing. :)19:30
gsomixAll Languages. Total Lines Changed: 221,575.19:34
heiko1sonney2k, are you there?20:06
@sonney2kheiko1, yes20:47
@sonney2kheiko1, wassup?20:47
heiko1did you receive my email?20:48
heiko1http://pastebin.com/mjPq525g20:48
heiko1I want to create StringFeatures20:48
heiko1but it does not work20:48
heiko1because CAlphabet::check_alphabet() fails20:48
heiko1ALPHABET does not contain all symbols in histogram20:48
heiko1CAlphabet::621020:48
heiko161020:48
heiko1and I am a bit unsure, what I am doing wrong here20:49
heiko1the program creates some random char strings (this works, they are printed) and then creates a CStringFeatures instance20:49
heiko1and then SG_ERROR20:49
CIA-87shogun: Soeren Sonnenburg master * rfe563a7 / src/shogun/features/Alphabet.h :20:51
CIA-87shogun: fix documentation for ALPHANUM and PROTEIN alphabets (they take upper20:51
CIA-87shogun: case chars not lowercase) - https://github.com/shogun-toolbox/shogun/commit/fe563a7c46e90eace5ec4177ab458883510986db20:51
@sonney2kheiko1, ^20:51
@sonney2kUPPER CASE20:52
@sonney2kso 0x41 and more20:52
heiko1ah20:52
heiko1oh no :)20:53
heiko1silly mistake20:53
@sonney2kheiko1, well documentation was wrong...20:53
@sonney2kit said a-z20:53
@sonney2knot A-Z ...20:53
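
In other words, the built-in alphabets here expect upper-case symbols, so randomly generated strings have to be drawn from 'A'-'Z' (0x41 and up). A small sketch of the working setup, assuming the modular Python API rather than the C++ used in the pastebin:

    # Random upper-case strings pass CAlphabet::check_alphabet(); lower-case ones do not.
    import random, string
    from modshogun import StringCharFeatures, ALPHANUM

    strings = [''.join(random.choice(string.ascii_uppercase) for _ in range(10))
               for _ in range(5)]
    feats = StringCharFeatures(strings, ALPHANUM)
    print(feats.get_num_vectors())
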
@sonney2kblackburn, vodka!20:53
blackburnnot yet! found error :D20:54
@sonney2kblackburn, you definitely need that to start working on java :D20:54
@sonney2kohh20:54
blackburnwith arpack the solution is kinda wrong20:54
heiko1sonney2k, well however, thanks .)20:54
blackburntemporary will force to use lapack20:54
@sonney2kgsomix, sounds like it...20:54
@sonney2kheiko1, IIRC there is some debug option - then it will display the histogram...21:05
heiko1sonney2k, works now :)21:05
CIA-87shogun: Sergey Lisitsyn master * r9ff9954 / (3 files): Added force_lapack option for Locally Linear Embedding and forced HLLE to use lapack solver - https://github.com/shogun-toolbox/shogun/commit/9ff99546af4be84dfc1c079b4318f29e7b72973521:06
@sonney2kheiko1, surprise ;-)21:06
heiko1sonney2k, I will try to generalise the model selection parameters now21:06
heiko1was trying to do model selection with a string kernel21:06
heiko1but it has int32_t type parameters21:06
heiko1that does not work yet21:06
@sonney2kheiko1, so you do all the other standard types like int / byte etc right?21:06
@sonney2kshould be easy given that you have double working already21:07
heiko1yes21:07
@sonney2kenums can be a bit problematic though - but in the end these are ints too21:07
heiko1yes21:07
heiko1i am not that deep into generics21:09
heiko1hope it is possible to append an instance of one modelparam to another with another type21:09
@sonney2kheiko1, enums are usually represented as integers - so at least from C/C++ / python it will work21:11
@sonney2kone could specify illegal values though - but that should be caught anyways21:12
heiko1uuh this is harder than i thought21:23
heiko1all these generic classes build trees21:23
heiko1I dont know if it is possible to have a datastructure that holds instances of generic classes with different types?21:23
heiko1no, its not21:25
heiko1mmh21:25
heiko1sonney2k, basically this problem:21:29
heiko1http://www.cplusplus.com/forum/general/1281/21:29
@sonney2kyes indeed that is not working21:32
@sonney2kheiko1, but can't you define all the types / trees that you need?21:32
heiko1but one tree may have different node types21:32
@sonney2kI mean there are only a handful21:32
heiko1like one parameter int and another float21:33
heiko1both children of one node21:33
@sonney2kwhat you could certainly do is to store the node types21:34
heiko1yes21:35
heiko1I think I will have to do that21:35
@sonney2kand then have some access function that gives you the correct item from a union or so based on that type21:35
@sonney2kor you have a node content base class21:36
@sonney2kand then have derived classes for each node content type21:37
@sonney2kbut same thing not type safe21:37
@sonney2kyou need to store the type again21:37
heiko1yes21:37
blackburnLLE IS FUCKING UNSTABLE21:38
blackburn:E21:38
heiko1I think I will just store the type and save the data with void pointers21:38
@sonney2kconsidering that we need just int, byte, bool - it is probably easiest to just use a union21:38
@sonney2kor that yes21:39
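
A language-agnostic sketch of the idea being settled on here (not shogun's actual model selection classes, just the "store the type tag next to the value" design): each tree node carries an explicit type tag plus untyped values, so children of different parameter types can live in one tree:

    # Hypothetical parameter-tree node carrying a type tag plus untyped values.
    class ParamNode:
        def __init__(self, name, value_type=None, values=None):
            self.name = name                # parameter or object name
            self.value_type = value_type    # e.g. 'int32', 'float64'; None for inner nodes
            self.values = values or []      # candidate values for model selection
            self.children = []

        def append_child(self, child):
            self.children.append(child)

    root = ParamNode('WeightedDegreeStringKernel')
    root.append_child(ParamNode('degree', 'int32', [1, 2, 3, 4]))
    root.append_child(ParamNode('C', 'float64', [0.1, 1.0, 10.0]))
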
@sonney2kblackburn, like java examples SCNR :D21:39
blackburnit drives me mad21:40
blackburnI thought HLLE does21:40
blackburnbut it is not21:40
blackburnI guess I should write sth like "If embedding is shitty try other data" in the description21:41
blackburnsonney2k: you want me doing some not-so-funny things, don't you?21:50
* blackburn wonders if there will be some ModifiedHessianLocallyLinearEmbedding21:52
blackburnor even ImplicitlyRestartedModifiedStableHessianLocallySublinearMegaEmbedding21:53
@sonney2k;)21:56
blackburnoh22:04
blackburnHLLE rocks!22:04
blackburnsonney2k: I guess you might understand how good it is: http://dl.dropbox.com/u/10139213/shogun/hlle-faces-k10.png22:05
@sonney2ksure22:09
blackburndamn that guy looks just like me :D22:10
-!- in3xes [~in3xes@180.149.49.227] has quit [Quit: Leaving]22:15
-!- gsomix [~gsomix@109.169.131.11] has quit [Ping timeout: 240 seconds]22:26
CIA-87shogun: Sergey Lisitsyn master * r053c634 / .gitignore : Added some more filetypes to ignore by git - https://github.com/shogun-toolbox/shogun/commit/053c634fce544b1efb1b2141826dc5d4d7ba3ad722:35
CIA-87shogun: Sergey Lisitsyn master * rac722b7 / src/shogun/mathematics/arpack.cpp : Improved arpack wrapper - https://github.com/shogun-toolbox/shogun/commit/ac722b7fafeb94352f855b66dbfe66aca0cdd50922:35
CIA-87shogun: Sergey Lisitsyn master * r1fff667 / src/shogun/preprocessor/LocallyLinearEmbedding.cpp : Improved stability of Locally Linear Embedding - https://github.com/shogun-toolbox/shogun/commit/1fff667d78a71a7ab8fd3c0ab9ca303ef850235922:35
CIA-87shogun: Sergey Lisitsyn master * re11c4bc / src/shogun/preprocessor/HessianLocallyLinearEmbedding.cpp : Improved HLLE - https://github.com/shogun-toolbox/shogun/commit/e11c4bcfcfab8163380b8c13933aaa3bc9d171a322:35
CIA-87shogun: Sergey Lisitsyn master * r05427ed / src/shogun/preprocessor/LocallyLinearEmbedding.cpp : Changed solver for LLE and removed unnecessary default parameter - https://github.com/shogun-toolbox/shogun/commit/05427edcff160e1241f9b9b8ad6674e8fc7b31cc22:39
-!- blackburn [~blackburn@188.122.239.253] has left #shogun []23:42
--- Log closed Tue Jul 26 00:00:21 2011

Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!