IRC logs of #shogun for Thursday, 2011-07-21

--- Log opened Thu Jul 21 00:00:50 2011
00:08 -!- serialhex_ [~quassel@99-101-148-183.lightspeed.wepbfl.sbcglobal.net] has joined #shogun
00:13 <blackburn> sonney2k: hey it took <1gb to compile
00:13 <blackburn> with python_modular
00:13 -!- serialhex [~quassel@99-101-148-183.lightspeed.wepbfl.sbcglobal.net] has quit [Ping timeout: 255 seconds]
00:28 -!- blackburn [~blackburn@188.122.253.215] has quit [Quit: Leaving.]
01:54 -!- f-x [~user@117.192.196.76] has joined #shogun
02:07 -!- f-x [~user@117.192.196.76] has quit [Ping timeout: 260 seconds]
02:08 -!- f-x [~user@117.192.196.76] has joined #shogun
02:38 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
02:41 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds]
04:24 -!- f-x [~user@117.192.196.76] has quit [Ping timeout: 260 seconds]
06:49 -!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 255 seconds]
07:01 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
07:47 -!- in3xes_ is now known as in3xes
07:54 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
07:57 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds]
08:14 -!- cwidmer [~quassel@connect.tuebingen.mpg.de] has quit [Read error: Operation timed out]
09:03 -!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun
09:07 -!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 258 seconds]
09:39 -!- warpyyy [~theuser@212.179.28.34] has joined #shogun
10:00 -!- in3xes1 is now known as in3xes
10:17 -!- warpyyy [~theuser@212.179.28.34] has quit [Ping timeout: 276 seconds]
11:04 -!- gsomix [~gsomix@109.169.132.216] has joined #shogun
11:13 -!- cwidmer [~quassel@connect.tuebingen.mpg.de] has joined #shogun
11:17 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds]
11:18 -!- heiko [~heiko@main.uni-duisburg.de] has joined #shogun
11:21 -!- blackburn [~blackburn@188.122.253.215] has joined #shogun
11:27 -!- blackburn1 [~blackburn@188.122.253.215] has joined #shogun
11:28 -!- blackburn [~blackburn@188.122.253.215] has quit [Read error: No route to host]
11:29 <@sonney2k> blackburn1, 1G? it was here 1.6G...
11:29 -!- in3xes [~in3xes@180.149.49.227] has joined #shogun
11:29 <blackburn1> how did you measure it?
11:30 <blackburn1> I was looking to 'top'
11:30 <blackburn1> any other way?
11:30 -!- blackburn1 is now known as blackburn
11:31 <@sonney2k> blackburn, did the same thing
11:31 <@sonney2k> maybe you have 50% of the features disabled ;-)
11:32 <blackburn> --disable-optimizations --disable-doxygen
11:32 <@sonney2k> w/ doxygen here and optimizations
11:32 <blackburn> ah
11:32 <blackburn> I guess it is the answer
11:36 -!- in3xes [~in3xes@180.149.49.227] has quit [Remote host closed the connection]
11:43 -!- in3xes [~in3xes@180.149.49.227] has joined #shogun
11:48 <blackburn> oh shit
11:48 <blackburn> shiiiit
11:48 -!- cwidmer [~quassel@connect.tuebingen.mpg.de] has quit [Remote host closed the connection]
11:48 <blackburn> sonney2k: damned git clean -dfx
11:49 <blackburn> lost HLLE source :D
11:49 <blackburn> *facepalm*
12:44 <heiko> blackburn, oh no :(
12:46 <blackburn> not enough vodka
12:56 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
12:59 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds]
13:43 <@sonney2k> blackburn, man why didn't you git add your files before you do such things?
13:43 <blackburn> forgot
13:44 <@sonney2k> but for the record I managed to do the same thing with modshogun 1-2 days ago
13:44 <@sonney2k> heiko, around?
14:17 <heiko> sonney2k, now
14:18 <@sonney2k> heiko, good morning ;)
14:18 <@sonney2k> I wanted to discuss about this kernel/distance machine issue
14:18 <heiko> sonney2k, no no :) I am awake since 8 o clock ;)
14:19 <heiko> yes ok, I already worked a few hours on that
14:19 <heiko> you were right, with the rhs of kernel
14:19 <heiko> I am doing it that way now
14:19 <@sonney2k> lhs
14:19 <@sonney2k> left is always training
14:19 <heiko> lhs, yes sorry
14:19 <@sonney2k> right is 'testing'
14:19 <@sonney2k> ok
14:20 <heiko> I added a CFeatures variable, a boolean flag and a map for indices to kernelmachine
14:20 <heiko> then in the apply method, the conversion is done
14:20 <heiko> and the copied sv vectors are used as lhs
14:21 <heiko> only thing that is a bit more complicated is to set the SVs
14:21 <@sonney2k> sorry which conversion is done in apply()?
14:21 <heiko> from m_svs[i] to the index the feature has in the CFeatures variable
14:21 <@sonney2k> heiko, I mean if you do that then m_svs[i] will just be == i
14:22 <@sonney2k> So when you have a function, store_svs_in_object()
14:22 <@sonney2k> it should modify m_svs
14:23 <@sonney2k> and store the svs
14:23 <heiko> oh yes, tree indeed
14:23 <heiko> true
14:23 <heiko> yes of course
14:24 -!- petiera [~ohyvarin@kosh.org.aalto.fi] has joined #shogun
14:24 <heiko> well I think this already might work with the changes I made
14:24 <heiko> but the setting is more complicated
14:24 <heiko> because the SVs are set one by one
14:24 <heiko> by set_support_vector
14:24 <heiko> method
14:24 <heiko> and another crucial point is create_new_model
14:25 <heiko> in create_new_model a CFeatures instance of the size num_sv has to be created of same type as current lhs
14:25 <@sonney2k> but you dont' need that...
14:26 <heiko> my plan was to create the CFeatures instance in create_new_model and then to copy the SV data vector wise in the set_support_vector method
14:27 <@sonney2k> heiko, I would prefer one method that is doing the conversion separately
14:27 <@sonney2k> this way it can be called after training - if wanted
14:27 <heiko> mmmh
14:27 <heiko> ok
14:28 <heiko> then all KernelMachine implementations would have to call it
14:29 <@sonney2k> yes, after training they could all call this common function
14:29 <petiera> good afternoon, I'd like to know if it is possible to get leave-one-out estimates of svm classifiers after training?
14:30 <heiko> hi petiera, we are currently working on this
14:30 <petiera> ok, nice!
14:31 <heiko> petiera, currently this works for LibLinear, I am currently doing it for KernelMachines
14:31 <heiko> sonney2k, I will continue on the stuff now, moving it to the separate function
14:33 <petiera> heiko: I'd like to use it with MKL, which to my understanding uses either LibSVM or SVMLight. Am I right?
14:36 <heiko> petiera, yes, I think so, any SVM
14:36 <@sonney2k> heiko, I would do the following modification to the train function of all classes deriving from kernel machines: rename the train function to train_kernel_machine() (make it protected/private and virtual), in CKernelMachine::train() then call train_kernel_machine() and do the postprocessing afterwards if the copy_svs flag is true
14:37 <@sonney2k> petiera, any SVM / SVR but SVMLight with interleaved is fastest
14:37 <heiko> sonney2k, yes good idea
14:38 <@sonney2k> heiko, so this is the same as we do in CKernel, there is a CKernel::kernel() function that calls compute() and is overloaded in the subclasses
14:38 <heiko> yes
14:39 <petiera> sonney2k, heiko: ok, thanks to both!
14:40 <heiko> petiera, Perhaps the KernelMachine cross-validation will work at the end of this week *experimentally*, so you might check it then
14:43 <petiera> heiko: cool, I'll take a look at it when you're ready
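
A rough sketch of the train() split sonney2k describes above, written in Python for readability (the real CKernelMachine code is C++, and names such as copy_svs and store_model_features are taken loosely from the discussion rather than from shogun's actual API):

    class KernelMachine(object):
        """Base class owning the shared train() logic (illustration only)."""

        def train(self, data=None):
            self.train_kernel_machine(data)   # subclass-specific solver, protected/virtual in C++
            if self.copy_svs:                 # flag name assumed from the discussion above
                self.store_model_features()   # copy SV features into the machine, remap m_svs[i] -> i
            return True

        def train_kernel_machine(self, data=None):
            raise NotImplementedError         # overridden by LibSVM, SVMLight, ...
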
14:55 -!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun
14:58 -!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 252 seconds]
15:14 -!- in3xes1 [~in3xes@180.149.49.227] has quit [Remote host closed the connection]
15:14 -!- in3xes [~in3xes@180.149.49.227] has joined #shogun
15:25 <CIA-87> shogun: Baozeng Ding master * rc3d0e6d / testsuite/lua_modular/generator.lua : add generator.lua - https://github.com/shogun-toolbox/shogun/commit/c3d0e6d9f092cd80dede9852d10a6d9588e23987
15:26 <CIA-87> shogun: Soeren Sonnenburg master * r5ac6102 / (3 files in 3 dirs):
15:26 <CIA-87> shogun: Merge pull request #220 from sploving/master
15:26 <CIA-87> shogun: add generator.lua - https://github.com/shogun-toolbox/shogun/commit/5ac61021fb14249b0270ba77eca07e1a35e5233e
15:27 -!- sploving1 [~sploving@124.16.139.194] has joined #shogun
15:27 <sploving1> sonney2k, a bad news. in lua, we cannot serialize/dump user data
15:27 <@sonney2k> sploving1, ?
15:28 <@sonney2k> why not?
15:28 <sploving1> cannot serialize userdata. it is opaque.
15:28 <@sonney2k> what is userdata?
15:29 <sploving1> in our examples, kernel=GaussianKernel(), then kernel is userdata
15:29 <@sonney2k> sploving1, but shogun objects can be dumped as ascii stream
15:30 <@sonney2k> like
15:30 <@sonney2k>     fstream = SerializableAsciiFile("blaah.asc", "w")
15:30 <@sonney2k>     status = svm.save_serializable(fstream)
15:30 <@sonney2k>     check_status(status)
15:30 <@sonney2k> so you can read this string / save it
15:31 <@sonney2k> with svm being any shogun object
15:31 <sploving1> I mean the generator.py, which serialize them into file
15:31 <sploving1> lua does not support serialize them into file
15:31 <@sonney2k> sploving1, yes I understand
15:32 <@sonney2k> but how do you turn an object into one that can be serialized?
15:33 <sploving1> it can support tables, but userdata is different to tables
15:34 <@sonney2k> but can't you attach some function to userdata that if available will enable the object to be serialized?
15:35 <sploving1> no
15:35 <sploving1> it does not support
15:36 <@sonney2k> in the end it does not really matter
15:36 <@sonney2k> we can still serialize shogun objects
15:39 <@sonney2k> sploving1, I am not really sure what you are trying to do...
15:39 <sploving1> writing generator.lua
15:39 <@sonney2k> sploving1, do the general typemaps for strings/matrices/vectors work?
15:40 <@sonney2k> sploving1, well ignore the generator / tester for now
15:40 <sploving1> of course they work. I have pushed the examples
15:40 <sploving1> !!!
15:40 <@sonney2k> we might be able to do it all differently
15:40 <@sonney2k> sploving1, did you compare that they give the same results as in python_modular?
15:41 <@sonney2k> not the typemaps but e.g. the distance matrix / or learned classifier
15:43 <sploving1> I leave the last week to test them. Now the main work is coding
15:43 <sploving1> if you have time, you can have a try
15:43 <@sonney2k> sploving1, please test them now
15:44 <sploving1> or blackburn helps me
15:44 <@sonney2k> otherwise we will never know if this works or not
15:44 <@sonney2k> sploving1, blackburn has his own project to work on ...
15:45 <@sonney2k> if these tests work you can move on to the next language
15:45 <sploving1> what about generator and tester
15:46 <sploving1> I have worked nearly a day for generator.lua :(
15:46 <sploving1> I think the test is very easy
15:46 <sploving1> just print out some results
15:46 <@sonney2k> sploving1, well ok then please explain to me how it works
15:47 <@sonney2k> how do you save everything except shogun objects?
15:47 <sploving1> current generator.lua does not save anything
15:48 <@sonney2k> so how will it serialize anything ?
15:49 <sploving1> http://lua-users.org/wiki/TableSerialization
15:49 <sploving1> I am not familiar to them now
15:49 <@sonney2k> I mean the examples will return some strings, doubles etc
15:50 <sploving1> if it is tables, it can serialize them
15:50 <@sonney2k> sploving1, so serialization is not part of lua
15:50 <@sonney2k> but you need to get some serialization library to do the work?
15:51 <@sonney2k> but then you can easily dump shogun objects. you just call these functions I mentioned above and save the string
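
Spelled out, the approach sonney2k sketches above looks roughly like this on the python side (a sketch only: the import assumes the consolidated modshogun module mentioned later in the log, check_status is omitted, and a kernel stands in for the svm):

    from numpy import array
    from modshogun import RealFeatures, GaussianKernel, SerializableAsciiFile

    feats = RealFeatures(array([[1.0, 2.0], [3.0, 4.0]]))
    kernel = GaussianKernel(feats, feats, 1.0)

    # dump the shogun object to an ascii file ...
    fstream = SerializableAsciiFile("gaussian.asc", "w")
    status = kernel.save_serializable(fstream)
    fstream.close()

    # ... and read it back into a fresh object
    fstream = SerializableAsciiFile("gaussian.asc", "r")
    kernel2 = GaussianKernel()
    status = kernel2.load_serializable(fstream)

The same calls should be available from the lua (and java) modular interfaces, which is why serializing the shogun object itself sidesteps lua's lack of a native dump for userdata.
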
15:52 <@sonney2k> sploving1, anyway it is much more important to check if the few examples you converted return the exact same results
15:52 <sploving1> oh. I will check them
15:52 <sploving1> just print out some result??
15:53 <@sonney2k> up to you
15:53 <@sonney2k> for kernel / distance I would print out the distance / kernel matrix
15:53 <@sonney2k> for features the feature matrix
15:53 <sploving1> okay. no problem!
15:53 <sploving1> if they work well, I can change to ruby??
15:54 <@sonney2k> and for classifiers the labels
15:54 <@sonney2k> and then compare these to the python output
15:54 <@sonney2k> this is manual labor
15:54 <@sonney2k> so you modify the python script to print out that stuff
15:54 <sploving1> I would like to finish lua as early as possible, as ruby and C#, especially C#, may take longer time
15:54 <@sonney2k> then do the same in lua for the same script
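
A minimal sketch of that kind of spot check on the python side (assuming the consolidated modshogun module; the lua port of the same example should print an identical matrix):

    from numpy import array
    from modshogun import RealFeatures, GaussianKernel

    # the same toy data has to be used in the python and the lua version
    feats = RealFeatures(array([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0]]))
    kernel = GaussianKernel(feats, feats, 1.5)
    print kernel.get_kernel_matrix()   # compare this output against the lua example's output
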
15:55 <@sonney2k> sploving1, do you have an example with strings?
15:55 <sploving1> you mean dna
15:55 <sploving1> I forgot it!
15:56 <@sonney2k> e.g. python_modular/kernel_weighted_degree_string_modular.py
15:56 <@sonney2k> currently I see matrix and vector typemaps related tests
15:56 <@sonney2k> for floats
15:56 <sploving1> okay. I will add it.
15:57 <@sonney2k> sploving1, do other types such as byte or so work too?
15:58 <sploving1> not yet. I thought port 1 example means, one kernel, one distance, one ...
15:58 <@sonney2k> while there is no sparse matrix stuff in lua these typemaps can be emulated by returning a table with 3 examples
15:58 <@sonney2k> s/examples/columns/
15:58 <@sonney2k> or 3 vectors no idea how that is called in lua
15:59 <sploving1> I will not support sparse matrix
15:59 <sploving1> it is not efficient
16:02 <@sonney2k> well then do the tests and support for other types at least
16:09 <sploving1> sonney2k, okay. what troubles me is there is no method like the pickle.dump method in lua, which generator.lua and tester.lua need
16:11 <sploving1> sonney2k, can we find another way to compare results, not use dump to file method,
16:11 <sploving1> which is general to all modules??
16:12 <sploving1> as we know, java, lua can not use this dump method well
16:14 <sploving1> sonney2k, I gtg. bye
16:15 -!- f-x` [~user@117.192.220.115] has joined #shogun
16:16 -!- f-x` [~user@117.192.220.115] has quit [Client Quit]
16:16 -!- sploving1 [~sploving@124.16.139.194] has left #shogun []
16:17 -!- f-x` [~user@117.192.220.115] has joined #shogun
16:17 -!- f-x` [~user@117.192.220.115] has quit [Client Quit]
16:18 -!- f-x` [~user@117.192.220.115] has joined #shogun
16:23 -!- f-x` is now known as f-x
16:23 <heiko> sonney2k, while replacing train, I found a classify method, in KRR, I think its obsolete or?
16:24 -!- f-x is now known as Guest68359
16:24 -!- Guest68359 is now known as f-x`
16:25 <heiko> sonney2k, and kernelPerceptron seems so be empty?
16:31 -!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has joined #shogun
16:31 <@sonney2k> heiko, yes classify should be apply()
16:31 <@sonney2k> and classify_example should be apply() too
16:31 -!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has quit [Read error: Connection reset by peer]
16:31 <heiko> ok, will do this on the fly then
16:32 <@sonney2k> heiko, and yes remove kernel perceptron
16:32 -!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has joined #shogun
16:32 <heiko> ok then
16:36 <@sonney2k> hi f-x`
16:36 -!- blackburn [~blackburn@188.122.253.215] has quit [Quit: Leaving.]
16:38 <f-x`> hey sonney2k
16:39 <@sonney2k> hi f-x` - how is it going?
16:39 <f-x`> sonney2k: made an sgd-qn implementation based on the paper in python.. but still have to get it to work properly
16:39 <@sonney2k> I've seen you update a few things in the streamingfeatures (no longer busy loop)
16:39 <f-x`> so in the meantime i'm trying to bring over a few vw things into shogun
16:41 <f-x`> sonney2k: will we be using the vw input format?
16:42 <f-x`> plus, vw uses hashing on features to access them fast, i think.. should we use this too?
16:44 <alesis-novik> Hey, is R modular working properly in current version of shogun?
16:51 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds]
17:02 <@sonney2k> alesis-novik, no - but octave_modular is now :)
17:02 <@bettyboo> <:*)
17:02 <alesis-novik> sonney2k, thanks
17:02 <@sonney2k> alesis-novik, actually what the README says
17:02 <@sonney2k> (not the src/README but the toplevel one)
17:03 -!- in3xes [~in3xes@180.149.49.227] has joined #shogun
17:15 -!- gsomix [~gsomix@109.169.132.216] has quit [Quit: I'm leaving you (xchat 2.4.5 or higher)]
17:16 -!- heiko [~heiko@main.uni-duisburg.de] has quit [Ping timeout: 258 seconds]
17:20 <@sonney2k> heiko great stuff!
17:20 <CIA-87> shogun: Soeren Sonnenburg master * r2399d29 / (47 files in 11 dirs):
17:20 <CIA-87> shogun: Merge pull request #221 from karlnapf/master
17:20 <CIA-87> shogun: changes towards kernel machine model storage (+10 more commits...) - https://github.com/shogun-toolbox/shogun/commit/2399d29c2d8889531388a2b15dc4138cfd93a639
17:21 <@sonney2k> f-x`, it makes sense to support also VW's input format
17:21 <@sonney2k> f-x`, this way you can much easier test if vw and shogun's vw do the same thing
17:21 <@sonney2k> f-x`, so you would either modify AsciiFeatures and set a ascii feature type (like VW or SVMLIGHT)
17:22 <@sonney2k> or create a new class VWAsciiFeatures (though I prefer the first option)
17:22 <@sonney2k> f-x`, shogun's dotfeatures do that too
17:22 <@sonney2k> (hashing)
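
The 'hashing' referred to here is the hashing trick: feature names are mapped into a fixed-size weight vector by a hash function instead of a dictionary lookup, so lookups stay O(1) and memory stays bounded. A toy illustration (VW itself uses murmurhash; crc32 below is only for the sketch):

    import zlib

    NUM_BINS = 2 ** 18                    # size of the hashed feature space (arbitrary here)
    w = [0.0] * NUM_BINS                  # weight vector indexed by hashed feature ids

    def hashed_index(feature_name):
        # map an arbitrary feature name to a slot in the weight vector
        return zlib.crc32(feature_name) % NUM_BINS

    w[hashed_index("price")] += 0.1       # update a weight without any string -> index table
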
17:23 <f-x`> sonney2k: ah.. didn't know that
17:25 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
17:26 <f-x`> i think it's high time i made an initial implementation of vw
17:26 <f-x`> and then add those advanced features to make it fast
17:28 <@sonney2k> f-x`, yeah and a systematic comparison of liblinear/sgd/sgd-qn/vw on these challenge data sets (there is not just alpha)
17:28 <f-x`> sonney2k: but what can we do about sgd-qn?
17:28 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds]
17:30 -!- gsomix [~gsomix@109.169.132.216] has joined #shogun
17:30 -!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has quit [Quit: Leaving]
17:30 <@sonney2k> f-x`, I have to check in detail
17:30 <@sonney2k> but not now
17:31 <f-x`> so should i leave that for later?
17:31 <f-x`> because i think i'll have to go through that paper properly and make an implementation of my own, and obviously i can' do that well enough
17:31 <f-x`> *can't
17:36 -!- in3xes_ is now known as in3xes
19:05 -!- f-x` [~user@117.192.220.115] has quit [Remote host closed the connection]
19:16 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
19:20 -!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds]
20:05 -!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds]
20:18 -!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun
20:45 -!- in3xes_ is now known as in3xes
20:47 -!- blackburn [~blackburn@188.122.253.215] has joined #shogun
21:13 <blackburn> sonney2k: any news?
21:13 <@sonney2k> blackburn, sunset is over
21:14 <blackburn> great
21:14 <blackburn> hehe
21:14 <@sonney2k> apart from that I heard that someone did a git clean -dfx and lost quite a bit of code :-]
21:14 <blackburn> yeah, stupid guy
21:15 <blackburn> I also heard he is suffering some memory issues now
21:15 <@sonney2k> and even worse - he is one of the shogun contributors
21:15 <blackburn> it is crazy
21:15 <@sonney2k> no idea who why anyone wants to use that toolbox
21:15 <@sonney2k> alllllllllllllllllllrighty
21:16 <@sonney2k> I am currently doing some experiments on my notebook
21:16 <blackburn> warming it up to 95C?
21:16 <@sonney2k> so I cannot really fiddle with shogun atm
21:16 <@sonney2k> 92C
21:16 <@sonney2k> 94C
21:16 <@sonney2k> around that
21:17 <@sonney2k> if you open the window you will hear my fan
21:17 <blackburn> yeah, some noise
21:17 <blackburn> zzzzzz
21:18 <blackburn> sonney2k: btw I won't be online tomorrow - hope you won't miss me ;)
21:18 <blackburn> but I will finish HLLE on this weekend
21:19 <@sonney2k> I will immediately have to go shopping
21:19 <@sonney2k> buy some vodka for tomorrow
21:19 <@sonney2k> to not be alone here
21:20 <blackburn> hehe
21:21 <@sonney2k> blackburn, did you look at any of the java examples yet?
21:21 <blackburn> I've executed all of them
21:21 <blackburn> some failed
21:22 <blackburn> but it was before your berserking
21:22 <@sonney2k> we really need to *test* if they are giving the same result like the python ones
21:23 <@sonney2k> since I learned today that lua has no real principle of serializing - it might be very worthwhile to do the comparison via shogun serialization
21:23 <@sonney2k> still I'd like to be able to serialize any shogun java object
21:24 <blackburn> I'll test all of them on sunday, ok?
21:26 <@sonney2k> ok then I will adjust them to the new modshogun
21:26 <@sonney2k> and then try to get the R typemaps to work more reliably
21:26 <blackburn> now I have no idea why HLLE fails with memory
21:27 <blackburn> aha
21:27 <blackburn> svd hehe
21:28 <@sonney2k> did I mention that we need a build-bot?
21:45 <blackburn> yes
22:07 * gsomix heard 'vodka'...
22:34 <CIA-87> shogun: Soeren Sonnenburg master * rfd2da15 / (130 files): Modify examples to only load modshogun. - https://github.com/shogun-toolbox/shogun/commit/fd2da157e85cc3b6534f2eaa1af2963915834015
22:35 <@sonney2k> gsomix, that's the spirit :D
22:35 <gsomix> smells like vodka spirit
23:02 * gsomix http://media.skysurvey.org/interactive360/index.html
--- Log closed Fri Jul 22 00:00:55 2011
