--- Log opened Thu Jul 21 00:00:50 2011 | ||
-!- serialhex_ [~quassel@99-101-148-183.lightspeed.wepbfl.sbcglobal.net] has joined #shogun | 00:08 | |
blackburn | sonney2k: hey it took <1gb to compile | 00:13 |
blackburn | with python_modular | 00:13 |
-!- serialhex [~quassel@99-101-148-183.lightspeed.wepbfl.sbcglobal.net] has quit [Ping timeout: 255 seconds] | 00:13 | |
-!- blackburn [~blackburn@188.122.253.215] has quit [Quit: Leaving.] | 00:28 | |
-!- f-x [~user@117.192.196.76] has joined #shogun | 01:54 | |
-!- f-x [~user@117.192.196.76] has quit [Ping timeout: 260 seconds] | 02:07 | |
-!- f-x [~user@117.192.196.76] has joined #shogun | 02:08 | |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 02:38 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds] | 02:41 | |
-!- f-x [~user@117.192.196.76] has quit [Ping timeout: 260 seconds] | 04:24 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 255 seconds] | 06:49 | |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 07:01 | |
-!- in3xes_ is now known as in3xes | 07:47 | |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 07:54 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds] | 07:57 | |
-!- cwidmer [~quassel@connect.tuebingen.mpg.de] has quit [Read error: Operation timed out] | 08:14 | |
-!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun | 09:03 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 258 seconds] | 09:07 | |
-!- warpyyy [~theuser@212.179.28.34] has joined #shogun | 09:39 | |
-!- in3xes1 is now known as in3xes | 10:00 | |
-!- warpyyy [~theuser@212.179.28.34] has quit [Ping timeout: 276 seconds] | 10:17 | |
-!- gsomix [~gsomix@109.169.132.216] has joined #shogun | 11:04 | |
-!- cwidmer [~quassel@connect.tuebingen.mpg.de] has joined #shogun | 11:13 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds] | 11:17 | |
-!- heiko [~heiko@main.uni-duisburg.de] has joined #shogun | 11:18 | |
-!- blackburn [~blackburn@188.122.253.215] has joined #shogun | 11:21 | |
-!- blackburn1 [~blackburn@188.122.253.215] has joined #shogun | 11:27 | |
-!- blackburn [~blackburn@188.122.253.215] has quit [Read error: No route to host] | 11:28 | |
@sonney2k | blackburn1, 1G? it was here 1.6G... | 11:29 |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 11:29 | |
blackburn1 | how did you measure it? | 11:29 |
blackburn1 | I was looking at 'top' | 11:30 |
blackburn1 | any other way? | 11:30 |
-!- blackburn1 is now known as blackburn | 11:30 | |
@sonney2k | blackburn, did the same thing | 11:31 |
@sonney2k | maybe you have 50% of the features disabled ;-) | 11:31 |
blackburn | --disable-optimizations --disable-doxygen | 11:32 |
@sonney2k | w/ doxygen here and optimizations | 11:32 |
blackburn | ah | 11:32 |
blackburn | I guess it is the answer | 11:32 |
-!- in3xes [~in3xes@180.149.49.227] has quit [Remote host closed the connection] | 11:36 | |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 11:43 | |
blackburn | oh shit | 11:48 |
blackburn | shiiiit | 11:48 |
-!- cwidmer [~quassel@connect.tuebingen.mpg.de] has quit [Remote host closed the connection] | 11:48 | |
blackburn | sonney2k: damned git clean -dfx | 11:48 |
blackburn | lost HLLE source :D | 11:49 |
blackburn | *facepalm* | 11:49 |
heiko | blackburn, oh no :( | 12:44 |
blackburn | not enough vodka | 12:46 |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 12:56 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds] | 12:59 | |
@sonney2k | blackburn, man why didn't you git add your files before doing such things? | 13:43 |
blackburn | forgot | 13:43 |
@sonney2k | but for the record I managed to do the same thing with modshogun 1-2 days ago | 13:44 |
@sonney2k | heiko, around? | 13:44 |
heiko | sonney2k, now | 14:17 |
@sonney2k | heiko, good morning ;) | 14:18 |
@sonney2k | I wanted to discuss about this kernel/distance machine issue | 14:18 |
heiko | sonney2k, no no :) I have been awake since 8 o'clock ;) | 14:18 |
heiko | yes ok, I already worked a few hours on that | 14:19 |
heiko | you were right, with the rhs of kernel | 14:19 |
heiko | I am doing it that way now | 14:19 |
@sonney2k | lhs | 14:19 |
@sonney2k | left is always training | 14:19 |
heiko | lhs, yes sorry | 14:19 |
@sonney2k | right is 'testing' | 14:19 |
@sonney2k | ok | 14:19 |
heiko | I added a CFeatures variable, a boolean flag and a map for indices to kernelmachine | 14:20 |
heiko | then in the apply method, the conversion is done | 14:20 |
heiko | and the copied sv vectors are used as lhs | 14:20 |
heiko | only thing that is a bit more complicated is to set the SVs | 14:21 |
@sonney2k | sorry which conversion is done in apply()? | 14:21 |
heiko | from m_svs[i] to the index the feature has in the CFeatures variable | 14:21 |
@sonney2k | heiko, I mean if you do that then m_svs[i] will just be == i | 14:21 |
@sonney2k | So when you have a function, store_svs_in_object() | 14:22 |
@sonney2k | it should modify m_svs | 14:22 |
@sonney2k | and store the svs | 14:23 |
heiko | oh yes, true indeed | 14:23 |
heiko | true | 14:23 |
heiko | yes of course | 14:23 |
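The SV-storage idea heiko and sonney2k settle on above can be sketched in a few lines: copy only the support vectors' feature columns into a compact set and renumber `m_svs` so that `m_svs[i] == i` afterwards. This is an illustrative numpy sketch, not shogun's actual C++ API; `store_sv_features` is a hypothetical stand-in for the `store_svs_in_object()` function discussed.

```python
import numpy as np

def store_sv_features(train_feats, m_svs):
    """Copy the SV columns into a compact feature set and remap
    indices so that m_svs[i] == i afterwards (hypothetical helper
    mirroring store_svs_in_object(); not shogun's real API)."""
    sv_feats = train_feats[:, m_svs]   # copy only the SV columns
    new_svs = np.arange(len(m_svs))    # indices into the copy
    return sv_feats, new_svs

# toy example: 4 training vectors (columns), SVs are columns 1 and 3
X = np.array([[0., 1., 2., 3.],
              [4., 5., 6., 7.]])
sv_feats, new_svs = store_sv_features(X, [1, 3])
print(sv_feats)   # columns 1 and 3 of X
print(new_svs)    # [0 1]
```

With the copy used as the kernel's lhs, apply() no longer needs the original training features around.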
-!- petiera [~ohyvarin@kosh.org.aalto.fi] has joined #shogun | 14:24 | |
heiko | well I think this already might work with the changes I made | 14:24 |
heiko | but the setting is more complicated | 14:24 |
heiko | because the SVs are set one by one | 14:24 |
heiko | by set_support_vector | 14:24 |
heiko | method | 14:24 |
heiko | and another crucial point is create_new_model | 14:24 |
heiko | in create_new_model a CFeatures instance of size num_sv has to be created, of the same type as the current lhs | 14:25 |
@sonney2k | but you dont' need that... | 14:25 |
heiko | my plan was to create the CFeatures instance in create_new_model and then to copy the SV data vector wise in the set_support_vector method | 14:26 |
@sonney2k | heiko, I would prefer one method that is doing the conversion separately | 14:27 |
@sonney2k | this way it can be called after training - if wanted | 14:27 |
heiko | mmmh | 14:27 |
heiko | ok | 14:27 |
heiko | then all KernelMachine implementations would have to call it | 14:28 |
@sonney2k | yes, after training they could all call this common function | 14:29 |
petiera | good afternoon, I'd like to know if it is possible to get leave-one-out estimates of svm classifiers after training? | 14:29 |
heiko | hi petiera, we are currently working on this | 14:30 |
petiera | ok, nice! | 14:30 |
heiko | petiera, currently this works for LibLinear, I am currently doing it for KernelMachines | 14:31 |
heiko | sonney2k, I will continue on the stuff now, moving it to the separate function | 14:31 |
petiera | heiko: I'd like to use it with MKL, which to my understanding uses either LibSVM or SVMLight. Am I right? | 14:33 |
heiko | petiera, yes, I think so, any SVM | 14:36 |
@sonney2k | heiko, I would do the following modification to the train function of all classes deriving from kernel machines: rename the train function to train_kernel_machine() (make it protected/private and virtual), in CKernelMachine::train() then call train_kernel_machine() and do the postprocessing afterwards if the copy_svs flag is true | 14:36 |
@sonney2k | petiera, any SVM / SVR but SVMLight with interleaved is fastest | 14:37 |
heiko | sonney2k, yes good idea | 14:37 |
@sonney2k | heiko, so this is the same as we do in CKernel, there is a CKernel::kernel() function that calls compute() and is overloaded in the subclasses | 14:38 |
heiko | yes | 14:38 |
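The refactoring sonney2k proposes is the classic template-method pattern he compares to `CKernel::kernel()` calling `compute()`: the public `train()` calls a protected `train_kernel_machine()` implemented by subclasses, then runs the common SV post-processing if the flag is set. A minimal Python sketch of the pattern (names are illustrative, not shogun's exact ones):

```python
class KernelMachine:
    """Base class: train() is the template method; subclasses only
    override _train_machine() (sketch of train_kernel_machine())."""
    def __init__(self, copy_svs=True):
        self.copy_svs = copy_svs
        self.m_svs = []

    def train(self, data):
        self._train_machine(data)   # subclass-specific training
        if self.copy_svs:
            self._store_svs(data)   # common post-processing
        return True

    def _train_machine(self, data):
        raise NotImplementedError

    def _store_svs(self, data):
        # copy SV features and renumber so m_svs[i] == i
        self.sv_data = [data[i] for i in self.m_svs]
        self.m_svs = list(range(len(self.m_svs)))

class ToySVM(KernelMachine):
    def _train_machine(self, data):
        self.m_svs = [0, 2]         # pretend items 0 and 2 are SVs

svm = ToySVM()
svm.train(['a', 'b', 'c'])
print(svm.m_svs)    # [0, 1]
print(svm.sv_data)  # ['a', 'c']
```

Every subclass gets the SV copying for free, and it can also be invoked separately after training, as requested above.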
petiera | sonney2k, heiko: ok, thanks to both! | 14:39 |
heiko | petiera, Perhaps the KernelMachine cross-validation will work at the end of this week *experimentally*, so you might check it then | 14:40 |
petiera | heiko: cool, I'll take a look at it when you're ready | 14:43 |
-!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun | 14:55 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 252 seconds] | 14:58 | |
-!- in3xes1 [~in3xes@180.149.49.227] has quit [Remote host closed the connection] | 15:14 | |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 15:14 | |
CIA-87 | shogun: Baozeng Ding master * rc3d0e6d / testsuite/lua_modular/generator.lua : add generator.lua - https://github.com/shogun-toolbox/shogun/commit/c3d0e6d9f092cd80dede9852d10a6d9588e23987 | 15:25 |
CIA-87 | shogun: Soeren Sonnenburg master * r5ac6102 / (3 files in 3 dirs): | 15:26 |
CIA-87 | shogun: Merge pull request #220 from sploving/master | 15:26 |
CIA-87 | shogun: add generator.lua - https://github.com/shogun-toolbox/shogun/commit/5ac61021fb14249b0270ba77eca07e1a35e5233e | 15:26 |
-!- sploving1 [~sploving@124.16.139.194] has joined #shogun | 15:27 | |
sploving1 | sonney2k, bad news: in lua, we cannot serialize/dump userdata | 15:27 |
@sonney2k | sploving1, ? | 15:27 |
@sonney2k | why not? | 15:28 |
sploving1 | cannot serialize userdata. it is opaque. | 15:28 |
@sonney2k | what is userdata? | 15:28 |
sploving1 | in our examples, kernel=GaussianKernel(), then kernel is userdata | 15:29 |
@sonney2k | sploving1, but shogun objects can be dumped as ascii stream | 15:29 |
@sonney2k | like | 15:30 |
@sonney2k | fstream = SerializableAsciiFile("blaah.asc", "w") | 15:30 |
@sonney2k | status = svm.save_serializable(fstream) | 15:30 |
@sonney2k | check_status(status) | 15:30 |
@sonney2k | so you can read this string / save it | 15:30 |
@sonney2k | with svm being any shogun object | 15:31 |
sploving1 | I mean the generator.py, which serializes them into a file | 15:31 |
sploving1 | lua does not support serializing them into a file | 15:31 |
@sonney2k | sploving1, yes I understand | 15:31 |
@sonney2k | but how do you turn an object into one that can be serialized? | 15:32 |
sploving1 | it can support tables, but userdata is different from tables | 15:33 |
@sonney2k | but can't you attach some function to userdata that if available will enable the object to be serialized? | 15:34 |
sploving1 | no | 15:35 |
sploving1 | it does not support | 15:35 |
@sonney2k | in the end it does not really matter | 15:36 |
@sonney2k | we can still serialize shogun objects | 15:36 |
@sonney2k | sploving1, I am not really sure what you are trying to do... | 15:39 |
sploving1 | writing generator.lua | 15:39 |
@sonney2k | sploving1, do the general typemaps for strings/matrices/vectors work? | 15:39 |
@sonney2k | sploving1, well ignore the generator / tester for now | 15:40 |
sploving1 | of course they work. I have pushed the examples | 15:40 |
sploving1 | !!! | 15:40 |
@sonney2k | we might be able to do it all differently | 15:40 |
@sonney2k | sploving1, did you compare that they give the same results as in python_modular? | 15:40 |
@sonney2k | not the typemaps but e.g. the distance matrix / or learned classifier | 15:41 |
sploving1 | I'm leaving the last week to test them. Now the main work is coding | 15:43 |
sploving1 | if you have time, you can have a try | 15:43 |
@sonney2k | sploving1, please test them now | 15:43 |
sploving1 | or blackburn helps me | 15:44 |
@sonney2k | otherwise we will never know if this works or not | 15:44 |
@sonney2k | sploving1, blackburn has his own project to work on ... | 15:44 |
@sonney2k | if these tests work you can move on to the next language | 15:45 |
sploving1 | what about generator and tester | 15:45 |
sploving1 | I have worked nearly a day on generator.lua :( | 15:46 |
sploving1 | I think the test is very easy | 15:46 |
sploving1 | just print out some results | 15:46 |
@sonney2k | sploving1, well ok then please explain to me how it works | 15:46 |
@sonney2k | how do you save everything except shogun objects? | 15:47 |
sploving1 | current generator.lua does not save anything | 15:47 |
@sonney2k | so how will it serialize anything ? | 15:48 |
sploving1 | http://lua-users.org/wiki/TableSerialization | 15:49 |
sploving1 | I am not familiar with them yet | 15:49 |
@sonney2k | I mean the examples will return some strings, doubles etc | 15:49 |
sploving1 | if it is tables, it can serialize them | 15:50 |
@sonney2k | sploving1, so serialization is not part of lua | 15:50 |
@sonney2k | but you need to get some serialization library to do the work? | 15:50 |
@sonney2k | but then you can easily dump shogun objects. you just call these functions I mentioned above and save the string | 15:51 |
@sonney2k | sploving1, anyway it is much more important to check if the few examples you converted return the exact same results | 15:52 |
sploving1 | oh. I will check them | 15:52 |
sploving1 | just print out some result?? | 15:52 |
@sonney2k | up to you | 15:53 |
@sonney2k | for kernel / distance I would print out the distance / kernel matrix | 15:53 |
@sonney2k | for features the feature matrix | 15:53 |
sploving1 | okay. no problem! | 15:53 |
sploving1 | if they work well, I can change to ruby?? | 15:53 |
@sonney2k | and for classifiers the labels | 15:54 |
@sonney2k | and then compare these to the python output | 15:54 |
@sonney2k | this is manual labor | 15:54 |
@sonney2k | so you modify the python script to print out that stuff | 15:54 |
sploving1 | I would like to finish lua as early as possible, as ruby and C#, especially C#, may take longer time | 15:54 |
@sonney2k | then do the same in lua for the same script | 15:54 |
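The comparison sonney2k describes is: print the kernel matrix from the python script, print it from the lua script, and check the numbers match. As a reference point, the Gaussian kernel matrix can be reproduced without shogun at all; the sketch below assumes shogun's `GaussianKernel` convention k(x,y) = exp(-||x-y||² / width), with features stored column-wise as in shogun.

```python
import numpy as np

def gaussian_kernel_matrix(X, Y, width):
    """Dense Gaussian kernel matrix; X, Y hold examples as columns
    (shogun-style). Pure numpy, so the same reference values can be
    checked against both the python and the lua example output."""
    d2 = ((X[:, :, None] - Y[:, None, :]) ** 2).sum(axis=0)
    return np.exp(-d2 / width)

# two 2-d examples as columns: (0,0) and (1,1)
X = np.array([[0., 1.],
              [0., 1.]])
K = gaussian_kernel_matrix(X, X, width=2.0)
print(K)   # diagonal is 1, off-diagonal exp(-1)
```

Printing such a matrix from each language and diffing the output is exactly the manual check proposed above.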
@sonney2k | sploving1, do you have an example with strings? | 15:55 |
sploving1 | you mean dna | 15:55 |
sploving1 | I forgot it! | 15:55 |
@sonney2k | e.g. python_modular/kernel_weighted_degree_string_modular.py | 15:56 |
@sonney2k | currently I see matrix and vector typemaps related tests | 15:56 |
@sonney2k | for floats | 15:56 |
sploving1 | okay. I will add it. | 15:56 |
@sonney2k | sploving1, do other types such as byte or so work too? | 15:57 |
sploving1 | not yet. I thought 'port 1 example' meant one kernel, one distance, one ... | 15:58 |
@sonney2k | while there is no sparse matrix stuff in lua these typemaps can be emulated by returning a table with 3 examples | 15:58 |
@sonney2k | s/examples/columns/ | 15:58 |
@sonney2k | or 3 vectors no idea how that is called in lua | 15:58 |
sploving1 | I will not support sparse matrix | 15:59 |
sploving1 | it is not efficient | 15:59 |
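sonney2k's emulation idea above — pass a sparse matrix as a table with three vectors — is the classic coordinate (COO) layout: parallel arrays of row indices, column indices, and values. A minimal Python sketch of that representation (the lua side would use three tables the same way):

```python
def coo_to_dense(rows, cols, vals, shape):
    """Expand a (rows, cols, vals) coordinate-format sparse matrix
    into a dense list-of-lists, for illustration only."""
    dense = [[0.0] * shape[1] for _ in range(shape[0])]
    for r, c, v in zip(rows, cols, vals):
        dense[r][c] = v
    return dense

# 3x3 matrix with two non-zero entries
rows, cols, vals = [0, 2], [1, 2], [5.0, 7.0]
dense = coo_to_dense(rows, cols, vals, (3, 3))
print(dense)
```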
@sonney2k | well then do the tests and support for other types at least | 16:02 |
sploving1 | sonney2k, okay. what troubles me is that there is no method like pickle.dump in lua, which generator.lua and tester.lua need | 16:09 |
sploving1 | sonney2k, can we find another way to compare results, not the dump-to-file method, | 16:11 |
sploving1 | one which is general to all modules?? | 16:11 |
sploving1 | as we know, java and lua cannot use this dump method well | 16:12 |
sploving1 | sonney2k, I gtg. bye | 16:14 |
-!- f-x` [~user@117.192.220.115] has joined #shogun | 16:15 | |
-!- f-x` [~user@117.192.220.115] has quit [Client Quit] | 16:16 | |
-!- sploving1 [~sploving@124.16.139.194] has left #shogun [] | 16:16 | |
-!- f-x` [~user@117.192.220.115] has joined #shogun | 16:17 | |
-!- f-x` [~user@117.192.220.115] has quit [Client Quit] | 16:17 | |
-!- f-x` [~user@117.192.220.115] has joined #shogun | 16:18 | |
-!- f-x` is now known as f-x | 16:23 | |
heiko | sonney2k, while replacing train, I found a classify method in KRR, I think it's obsolete, no? | 16:23 |
-!- f-x is now known as Guest68359 | 16:24 | |
-!- Guest68359 is now known as f-x` | 16:24 | |
heiko | sonney2k, and KernelPerceptron seems to be empty? | 16:25 |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has joined #shogun | 16:31 | |
@sonney2k | heiko, yes classify should be apply() | 16:31 |
@sonney2k | and classify_example should be apply() too | 16:31 |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has quit [Read error: Connection reset by peer] | 16:31 | |
heiko | ok, will do this on the fly then | 16:31 |
@sonney2k | heiko, and yes remove kernel perceptron | 16:32 |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has joined #shogun | 16:32 | |
heiko | ok then | 16:32 |
@sonney2k | hi f-x` | 16:36 |
-!- blackburn [~blackburn@188.122.253.215] has quit [Quit: Leaving.] | 16:36 | |
f-x` | hey sonney2k | 16:38 |
@sonney2k | hi f-x` - how is it going? | 16:39 |
f-x` | sonney2k: made an sgd-qn implementation based on the paper in python.. but still have to get it to work properly | 16:39 |
@sonney2k | I've seen you update a few things in the streamingfeatures (no longer busy loop) | 16:39 |
f-x` | so in the meantime i'm trying to bring over a few vw things into shogun | 16:39 |
f-x` | sonney2k: will we be using the vw input format? | 16:41 |
f-x` | plus, vw uses hashing on features to access them fast, i think.. should we use this too? | 16:42 |
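The hashing f-x` mentions is vw's feature-hashing trick: feature names are mapped straight to indices in a fixed-size weight vector via a hash, so lookup is O(1) and no dictionary is kept. A minimal sketch of the idea; vw itself uses murmurhash with 2^b buckets (default b=18), and `zlib.crc32` merely stands in here:

```python
import zlib

def hash_feature(name, num_bits=18):
    """Map a feature name to a slot in a 2**num_bits weight vector
    (illustrative; vw uses murmurhash, not crc32)."""
    mask = (1 << num_bits) - 1
    return zlib.crc32(name.encode()) & mask

weights = [0.0] * (1 << 18)
idx = hash_feature('price')
weights[idx] += 0.5           # update the hashed slot directly
print(idx)
```

Collisions are accepted as noise in exchange for bounded memory, which is what makes this attractive for streaming features.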
alesis-novik | Hey, is R modular working properly in current version of shogun? | 16:44 |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds] | 16:51 | |
@sonney2k | alesis-novik, no - but octave_modular is now :) | 17:02 |
@bettyboo | <:*) | 17:02 |
alesis-novik | sonney2k, thanks | 17:02 |
@sonney2k | alesis-novik, actually what the README says | 17:02 |
@sonney2k | (not the src/README but the toplevel one) | 17:02 |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 17:03 | |
-!- gsomix [~gsomix@109.169.132.216] has quit [Quit: I'm leaving you (xchat 2.4.5 or older)] | 17:15 | |
-!- heiko [~heiko@main.uni-duisburg.de] has quit [Ping timeout: 258 seconds] | 17:16 | |
@sonney2k | heiko great stuff! | 17:20 |
CIA-87 | shogun: Soeren Sonnenburg master * r2399d29 / (47 files in 11 dirs): | 17:20 |
CIA-87 | shogun: Merge pull request #221 from karlnapf/master | 17:20 |
CIA-87 | shogun: changes towards kernel machine model storage (+10 more commits...) - https://github.com/shogun-toolbox/shogun/commit/2399d29c2d8889531388a2b15dc4138cfd93a639 | 17:20 |
@sonney2k | f-x`, it makes sense to support also VW's input format | 17:21 |
@sonney2k | f-x`, this way you can much easier test if vw and shogun's vw do the same thing | 17:21 |
@sonney2k | f-x`, so you would either modify AsciiFeatures and set a ascii feature type (like VW or SVMLIGHT) | 17:21 |
@sonney2k | or create a new class VWAsciiFeatures (though I prefer the first option) | 17:22 |
@sonney2k | f-x`, shogun's dotfeatures do that too | 17:22 |
@sonney2k | (hashing) | 17:22 |
f-x` | sonney2k: ah.. didn't know that | 17:23 |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 17:25 | |
f-x` | i think it's high time i made an initial implementation of vw | 17:26 |
f-x` | and then add those advanced features to make it fast | 17:26 |
@sonney2k | f-x`, yeah and a systematic comparison of liblinear/sgd/sgd-qn/vw on these challenge data sets (there is not just alpha) | 17:28 |
f-x` | sonney2k: but what can we do about sgd-qn? | 17:28 |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds] | 17:28 | |
-!- gsomix [~gsomix@109.169.132.216] has joined #shogun | 17:30 | |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has quit [Quit: Leaving] | 17:30 | |
@sonney2k | f-x`, I have to check in detail | 17:30 |
@sonney2k | but not now | 17:30 |
f-x` | so should i leave that for later? | 17:31 |
f-x` | because i think i'll have to go through that paper properly and make an implementation of my own, and obviously i can't do that well enough | 17:31 |
-!- in3xes_ is now known as in3xes | 17:36 | |
-!- f-x` [~user@117.192.220.115] has quit [Remote host closed the connection] | 19:05 | |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 19:16 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds] | 19:20 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds] | 20:05 | |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 20:18 | |
-!- in3xes_ is now known as in3xes | 20:45 | |
-!- blackburn [~blackburn@188.122.253.215] has joined #shogun | 20:47 | |
blackburn | sonney2k: any news? | 21:13 |
@sonney2k | blackburn, sunset is over | 21:13 |
blackburn | great | 21:14 |
blackburn | hehe | 21:14 |
@sonney2k | apart from that I heard that someone did a git clean -dfx and lost quite a bit of code :-] | 21:14 |
blackburn | yeah, stupid guy | 21:14 |
blackburn | I also heard he is suffering some memory issues now | 21:15 |
@sonney2k | and even worse - he is one of the shogun contributors | 21:15 |
blackburn | it is crazy | 21:15 |
@sonney2k | no idea why anyone wants to use that toolbox | 21:15 |
@sonney2k | alllllllllllllllllllrighty | 21:15 |
@sonney2k | I am currently doing some experiments on my notebook | 21:16 |
blackburn | warming it up to 95C? | 21:16 |
@sonney2k | so I cannot really fiddle with shogun atm | 21:16 |
@sonney2k | 92C | 21:16 |
@sonney2k | 94C | 21:16 |
@sonney2k | around that | 21:16 |
@sonney2k | if you open the window you will hear my fan | 21:17 |
blackburn | yeah, some noise | 21:17 |
blackburn | zzzzzz | 21:17 |
blackburn | sonney2k: btw I won't be online tomorrow - hope you won't miss me ;) | 21:18 |
blackburn | but I will finish HLLE on this weekend | 21:18 |
@sonney2k | I will immediately have to go shopping | 21:19 |
@sonney2k | buy some vodka for tomorrow | 21:19 |
@sonney2k | to not be alone here | 21:19 |
blackburn | hehe | 21:20 |
@sonney2k | blackburn, did you look at any of the java examples yet? | 21:21 |
blackburn | I've executed all of them | 21:21 |
blackburn | some failed | 21:21 |
blackburn | but it was before your berserking | 21:22 |
@sonney2k | we really need to *test* if they are giving the same result like the python ones | 21:22 |
@sonney2k | since I learned today that lua has no real notion of serialization - it might be very worthwhile to do the comparison via shogun serialization | 21:23 |
@sonney2k | still I'd like to be able to serialize any shogun java object | 21:23 |
blackburn | I'll test all of them on sunday, ok? | 21:24 |
@sonney2k | ok then I will adjust them to the new modshogun | 21:26 |
@sonney2k | and then try to get the R typemaps to work more reliably | 21:26 |
blackburn | now I have no idea why HLLE fails with memory | 21:26 |
blackburn | aha | 21:27 |
blackburn | svd hehe | 21:27 |
@sonney2k | did I mention that we need a build-bot? | 21:28 |
blackburn | yes | 21:45 |
* gsomix heard 'vodka'... | 22:07 | |
CIA-87 | shogun: Soeren Sonnenburg master * rfd2da15 / (130 files): Modify examples to only load modshogun. - https://github.com/shogun-toolbox/shogun/commit/fd2da157e85cc3b6534f2eaa1af2963915834015 | 22:34 |
@sonney2k | gsomix, that's the spirit :D | 22:35 |
gsomix | smells like vodka spirit | 22:35 |
* gsomix http://media.skysurvey.org/interactive360/index.html | 23:02 | |
--- Log closed Fri Jul 22 00:00:55 2011 |
Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!