--- Log opened Fri Jul 22 00:00:55 2011 | ||
-!- gsomix [~gsomix@109.169.132.216] has quit [Quit: I'm leaving you (xchat 2.4.5 or older)] | 00:03 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 252 seconds] | 00:23 | |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 00:24 | |
-!- blackburn [~blackburn@188.122.253.215] has quit [Quit: Leaving.] | 01:17 | |
-!- f-x [~user@117.192.196.224] has joined #shogun | 05:44 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds] | 06:35 | |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 06:58 | |
-!- gsomix [~gsomix@109.169.132.216] has joined #shogun | 07:15 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 252 seconds] | 07:20 | |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 07:33 | |
-!- gsomix [~gsomix@109.169.132.216] has quit [Ping timeout: 252 seconds] | 07:34 | |
-!- in3xes_ [~in3xes@210.212.58.111] has joined #shogun | 08:02 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds] | 08:06 | |
CIA-87 | shogun: Baozeng Ding master * r334cc39 / src/interfaces/lua_modular/swig_typemaps.i : add stringfeatures check typemap - https://github.com/shogun-toolbox/shogun/commit/334cc39b56b30d5f79415c48ef9bf3a3587346da | 08:40 |
CIA-87 | shogun: Baozeng Ding master * r8bb327a / (2 files in 2 dirs): add kernel_weighted_degree_string_modular.lua example - https://github.com/shogun-toolbox/shogun/commit/8bb327a2f81c5e85ea97070aa548739bbb96291d | 08:40 |
CIA-87 | shogun: Soeren Sonnenburg master * rc86e131 / (3 files in 2 dirs): | 08:40 |
CIA-87 | shogun: Merge pull request #222 from sploving/8bb327a2f81c5e85ea97070aa548739bbb96291d | 08:40 |
CIA-87 | shogun: add string support in typemap and example - https://github.com/shogun-toolbox/shogun/commit/c86e131225489bddbe45932604fd40c3177a9157 | 08:40 |
-!- in3xes_ [~in3xes@210.212.58.111] has quit [Ping timeout: 252 seconds] | 09:46 | |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 09:59 | |
-!- heiko [~heiko@main.uni-duisburg.de] has joined #shogun | 10:12 | |
CIA-87 | shogun: Baozeng Ding master * re385d71 / doc/pages/Installation.mainpage : add lua doc in Installion.mainpage - https://github.com/shogun-toolbox/shogun/commit/e385d7177cb165e59d5f33c8c3af110c2982ebb8 | 10:27 |
CIA-87 | shogun: Baozeng Ding master * r49f6d5e / (2 files in 2 dirs): add minimal libsvm lua modular - https://github.com/shogun-toolbox/shogun/commit/49f6d5e82a6dfe0d2d75964d4f37381184423ccf | 10:27 |
CIA-87 | shogun: Baozeng Ding master * rfe2408f / doc/pages/ModularTutorial.mainpage : add lua doc in ModularTutorial.mainpage - https://github.com/shogun-toolbox/shogun/commit/fe2408fa7b2f9aad282381981bdd5019341c49fa | 10:27 |
CIA-87 | shogun: Soeren Sonnenburg master * r6d0ae13 / (16 files in 3 dirs): | 10:27 |
CIA-87 | shogun: Merge pull request #223 from sploving/master | 10:27 |
CIA-87 | shogun: add lua docs and further examples - https://github.com/shogun-toolbox/shogun/commit/6d0ae13a0b07debbcf09db4b2fd7c072874fd23e | 10:27 |
-!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun | 11:43 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 240 seconds] | 11:46 | |
-!- f-x` [~user@117.192.196.224] has joined #shogun | 12:10 | |
-!- f-x [~user@117.192.196.224] has quit [Ping timeout: 260 seconds] | 12:12 | |
CIA-87 | shogun: Heiko Strathmann master * r3bb09d5 / (19 files): made get_num_vectors() a const method - https://github.com/shogun-toolbox/shogun/commit/3bb09d5aee7ac6f12f0b393e70f8bdf3d9ee9b66 | 12:25 |
CIA-87 | shogun: Soeren Sonnenburg master * rf8239e9 / (19 files): | 12:25 |
CIA-87 | shogun: Merge pull request #224 from karlnapf/master | 12:25 |
CIA-87 | shogun: made get_num_vectors method const - https://github.com/shogun-toolbox/shogun/commit/f8239e91fd171086786a114fae9b61c578c4ee20 | 12:25 |
CIA-87 | shogun: Heiko Strathmann master * r295d6f0 / src/shogun/features/SimpleFeatures.h : implemented copy_subset method - https://github.com/shogun-toolbox/shogun/commit/295d6f0ace7c41881e6dd6ac89d4432579ceed81 | 12:38 |
CIA-87 | shogun: Heiko Strathmann master * r68f6f71 / (2 files): added example/test for copy_subset method of SimpleFeatures - https://github.com/shogun-toolbox/shogun/commit/68f6f71f7c62f425b875526a316ccb032e1db493 | 12:38 |
CIA-87 | shogun: Soeren Sonnenburg master * rabdbf53 / (3 files in 2 dirs): | 12:38 |
CIA-87 | shogun: Merge pull request #225 from karlnapf/master | 12:38 |
CIA-87 | shogun: implementation of copy_subset - https://github.com/shogun-toolbox/shogun/commit/abdbf5369b6b1796f669b7fa24bc5c0d569982cd | 12:38 |
-!- heiko [~heiko@main.uni-duisburg.de] has quit [Ping timeout: 258 seconds] | 12:42 | |
-!- in3xes1 [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds] | 14:17 | |
-!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun | 14:29 | |
-!- gsomix [~gsomix@95.67.172.217] has joined #shogun | 14:34 | |
-!- gsomix [~gsomix@95.67.172.217] has quit [Client Quit] | 14:37 | |
-!- heiko [~heiko@main.uni-duisburg.de] has joined #shogun | 14:39 | |
heiko | sonney2k, just received my new memory, will plug it in now, wish me luck :) | 14:40 |
-!- heiko [~heiko@main.uni-duisburg.de] has quit [Client Quit] | 14:40 | |
@sonney2k | heiko, I just spent 80 EUR to get 16G of memory too | 14:40 |
@sonney2k | good luck! | 14:41 |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 14:56 | |
-!- in3xes1 [~in3xes@180.149.49.227] has quit [Ping timeout: 252 seconds] | 14:59 | |
-!- heiko [~heiko@main.uni-duisburg.de] has joined #shogun | 15:17 | |
heiko | there we go | 15:19 |
heiko | sad, 4GB turns out to be only 3.3 ;) | 15:19 |
heiko | oh no github offline | 15:20 |
@sonney2k | heiko, ? | 15:41 |
@sonney2k | are you stuck on x86? | 15:41 |
heiko | hi | 15:41 |
heiko | what? | 15:41 |
@sonney2k | I just ordered 16G for 73 EUR ... | 15:41 |
@sonney2k | why do you only have 3.3G? | 15:42 |
heiko | oh no, I need a new computer :) | 15:42 |
heiko | don't know, I am not really into this stuff :) | 15:42 |
@sonney2k | my desktop is not really new and I was surprised that I can get 16G for <100EUR | 15:42 |
heiko | yes | 15:43 |
heiko | well i have a notebook | 15:43 |
heiko | always more expensive | 15:43 |
@sonney2k | yes sure | 15:43 |
@sonney2k | but github is online - at least now or? | 15:43 |
heiko | yes, it's back on | 15:43 |
heiko | will try to compile python_modular soon | 15:45 |
@sonney2k | heh | 15:45 |
heiko | but first kernel machine example :) | 15:45 |
heiko | should work now | 15:45 |
@sonney2k | heiko, I am a bit worried about the paradigm shift: after learning everything is in the classifier | 15:46 |
@sonney2k | for kernel machines we had -- at some point -- the assumption that lhs == training data | 15:46 |
@sonney2k | this was necessary to be able to initialize kernelnormalizers | 15:47 |
@sonney2k | because we didn't save the state of the kernel / normalizer | 15:47 |
heiko | I do not really understand what you mean | 15:48 |
@sonney2k | heiko, for example: if we estimate a scale parameter on all lhs data | 15:48 |
@sonney2k | then it will be different when you just have svs in there | 15:48 |
heiko | ah ok | 15:48 |
heiko | mmh | 15:48 |
heiko | what about not storing the sv data in lhs, but in a separate variable | 15:49 |
heiko | and then use it if desired, and keep the old lhs? | 15:49 |
heiko | or just keep a reference on the training data separately | 15:50 |
@sonney2k | heiko, no I mean I think what we do now is much more reasonable | 15:51 |
@sonney2k | it is just that we have to pay extra attention | 15:51 |
heiko | oh, ok :) | 15:51 |
@sonney2k | it simply makes a lot of sense to do model = some_method.train(features) | 15:52 |
@sonney2k | and then call model.apply() | 15:52 |
@sonney2k | without any dependency to anything before | 15:52 |
@sonney2k | of course this paradigm can be inefficient in some cases | 15:53 |
@sonney2k | but then one has to do things manually anyways | 15:53 |
heiko | yes, it's optional after all | 15:53 |
@sonney2k | (when the training data doesn't fit in memory twice, for example) | 15:53 |
@sonney2k | KNN is probably the best counter example | 15:53 |
@sonney2k | but for such cases to be efficient one would need to support multiple views | 15:54 |
@sonney2k | so when doing get_feature_vector() always pass the view/subset | 15:54 |
heiko | ok, this is another thing then | 15:55 |
@sonney2k | we can do that but not for version 1.0 I would say | 15:55 |
@sonney2k | heiko, it would also 'resolve' the sv-stuff | 15:55 |
heiko | so much to do still | 15:55 |
heiko | yes, it would | 15:56 |
@sonney2k | I mean one could create a subset of a subset object and then pass this along | 15:56 |
@sonney2k | it won't help in the kernel could be initialized differently case though | 15:56 |
heiko | another thing not so nice for the current solution is that every Feature instance has to implement the copy_subset method | 15:57 |
@sonney2k | anyways please continue where you have left | 15:57 |
heiko | this is a lot of work | 15:57 |
@sonney2k | heiko, that is true though | 15:57 |
heiko | I changed the get_num_vectors method | 15:57 |
heiko | there were at least 10 classes | 15:57 |
heiko | perhaps some of this stuff could be moved to the feature class itself | 15:58 |
@sonney2k | I think there are really only 3 classes | 15:58 |
@sonney2k | simple, sparse, string | 15:58 |
heiko | yes | 15:58 |
@sonney2k | the rest is just using them in one way or another | 15:58 |
heiko | but the methods have to be implemented anyways | 15:59 |
heiko | but well I will see | 15:59 |
heiko | will build an example first for simple features and see if it works | 15:59 |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 260 seconds] | 16:01 | |
@sonney2k | heiko, yes the method would be there but just call the respective feature shrink method | 16:02 |
@sonney2k | hmmhh | 16:02 |
@sonney2k | I am thinking more about this 'multiple view' thing | 16:02 |
@sonney2k | one could add an optional parameter to each get_feature_vector etc function | 16:03 |
@sonney2k | (default = NULL) | 16:03 |
@sonney2k | then it would be classifiers that need to know which subset they need to operate on for training/testing | 16:03 |
@sonney2k | and it would just be one data set that is required | 16:04 |
heiko | yes | 16:04 |
heiko | this would be really nice | 16:04 |
heiko | so the subset comes from outside not from the feature class itself | 16:05 |
@sonney2k | so this would need to be part of CMethod | 16:05 |
@sonney2k | the training / test subset | 16:05 |
@sonney2k | however, people usually use separate training/ test data sets | 16:06 |
@sonney2k | so it would be cumbersome for some... and I am not sure if we don't need to support a different test data set anyways just because of that | 16:06 |
@sonney2k | this certainly needs more thought | 16:07 |
heiko | yes, indeed, this is quite natural | 16:07 |
@sonney2k | I mean one could in principle emulate the concatenation of features | 16:08 |
heiko | so that test/train data is "concatenated" and then used as one data set? | 16:08 |
@sonney2k | heiko, yes - only that it is not really concatenated | 16:08 |
heiko | yes, only view-wise | 16:09 |
@sonney2k | but virtually :) | 16:09 |
@sonney2k | I don't want to think about all the implications | 16:10 |
heiko | also, the "concatenation" could be done automatically internally, so the user does not need to change their habits | 16:10 |
@sonney2k | so one could call apply(subset) | 16:11 |
@sonney2k | or apply(features) | 16:11 |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 16:13 | |
@sonney2k | this has many implications, every method needs to be changed | 16:13 |
@sonney2k | some methods work with transposed features for efficiency | 16:13 |
@sonney2k | so that will be more tricky to resolve | 16:14 |
@sonney2k | etc | 16:14 |
@sonney2k | the easy part though would be to change all the features | 16:14 |
@sonney2k | :) | 16:14 |
heiko | well, yes :) | 16:14 |
@sonney2k | because you already did the work | 16:14 |
@sonney2k | you would just assume that every get_feature_vector etc method gets a subset parameter and check that | 16:15 |
@sonney2k | instead of the m_subset :) | 16:15 |
heiko | yes, quite straightforward | 16:15 |
@sonney2k | so that would mean if we do it this way | 16:17 |
@sonney2k | it won't work for any Method out of the box | 16:18 |
@sonney2k | only for those for which get_feature_vector() calls are changed | 16:18 |
@sonney2k | heiko, please let it sink in over the weekend :) | 16:19 |
@sonney2k | too much work for a quick shot | 16:19 |
heiko | all clear ;) | 16:19 |
CIA-87 | shogun: Heiko Strathmann master * rc3629a8 / src/shogun/machine/KernelMachine.cpp : removed a check that did not make sense - https://github.com/shogun-toolbox/shogun/commit/c3629a8bd39ce17e8015aaea1ef89000d43b500d | 16:31 |
CIA-87 | shogun: Soeren Sonnenburg master * r35719a8 / src/shogun/machine/KernelMachine.cpp : | 16:31 |
CIA-87 | shogun: Merge pull request #226 from karlnapf/master | 16:31 |
CIA-87 | shogun: correction - https://github.com/shogun-toolbox/shogun/commit/35719a854d3721aaaa87e5d443117e29ffbe375e | 16:31 |
CIA-87 | shogun: Soeren Sonnenburg master * r0f1f5d9 / (2 files): enable major/minor string finding over 2 snp feature objects - https://github.com/shogun-toolbox/shogun/commit/0f1f5d9ed7352a33426c5c8285670ecf36fb1b39 | 16:35 |
-!- heiko [~heiko@main.uni-duisburg.de] has quit [Ping timeout: 258 seconds] | 16:55 | |
-!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun | 17:04 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 246 seconds] | 17:08 | |
-!- in3xes1 is now known as in3xes | 17:10 | |
-!- heiko [~heiko@main.uni-duisburg.de] has joined #shogun | 17:55 | |
CIA-87 | shogun: Heiko Strathmann master * r20f6974 / src/shogun/evaluation/CrossValidation.cpp : usage of mean method of CStatistics instead of CMath - https://github.com/shogun-toolbox/shogun/commit/20f697448860a53f4fc1063f0abcdd119d03326b | 18:13 |
CIA-87 | shogun: Heiko Strathmann master * rc94ebcc / src/shogun/evaluation/CrossValidation.cpp : usage of CStatisitcs::mean instead of CMath - https://github.com/shogun-toolbox/shogun/commit/c94ebcc313eb30725c1c9be2c2ab71a0a31cb903 | 18:13 |
CIA-87 | shogun: Heiko Strathmann master * rd05c40c / (7 files in 2 dirs): removed "name nodes" which were a placeholder to group SGObject nodes. This is just not necessary - https://github.com/shogun-toolbox/shogun/commit/d05c40c773f1a9cee48475b4624267fb5d94ef8b | 18:13 |
CIA-87 | shogun: Heiko Strathmann master * r80b7302 / examples/undocumented/python_modular/modelselection_parameter_tree_modular.py : applied name node removal of model selection parameters - https://github.com/shogun-toolbox/shogun/commit/80b730257932bb5e116ec8c67114dc0bd8ba0ee7 | 18:13 |
CIA-87 | shogun: Soeren Sonnenburg master * r31d21d3 / (9 files in 4 dirs): | 18:13 |
CIA-87 | shogun: Merge pull request #227 from karlnapf/master | 18:13 |
CIA-87 | shogun: simplified parameter trees for model selection - https://github.com/shogun-toolbox/shogun/commit/31d21d3b65ca24884fa8bbb2c7e343774cdb130c | 18:13 |
-!- in3xes_ [~in3xes@180.149.49.227] has joined #shogun | 18:15 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 264 seconds] | 18:19 | |
heiko | sonney2k, python compiles just fine here :) | 18:22 |
@bettyboo | hrhr heiko | 18:22 |
-!- in3xes1 [~in3xes@180.149.49.227] has joined #shogun | 18:52 | |
-!- in3xes_ [~in3xes@180.149.49.227] has quit [Ping timeout: 250 seconds] | 18:56 | |
@sonney2k | heiko, great :) | 18:57 |
CIA-87 | shogun: Soeren Sonnenburg master * r9d9ea5c / (7 files in 3 dirs): | 19:56 |
CIA-87 | shogun: Merge pull request #228 from karlnapf/master | 19:56 |
CIA-87 | shogun: first working grid search with a kernel (+6 more commits...) - https://github.com/shogun-toolbox/shogun/commit/9d9ea5c4ef651932817e4f72f3a5de838c469dea | 19:56 |
heiko | sonney2k, hi | 19:57 |
@sonney2k | hi | 19:57 |
@bettyboo | welcome | 19:57 |
heiko | finally, kernel selection works :) | 19:57 |
heiko | I am going home now, have a nice weekend! | 19:57 |
@sonney2k | heiko, thanks for your hard work! | 19:57 |
@sonney2k | and don't dream of shogun and model selection and feature subsets :) | 19:58 |
heiko | it's cool stuff :) | 19:58 |
heiko | no, I won't ;) | 19:58 |
@sonney2k | in this case do dream about it! | 19:58 |
heiko | looking forward to a computer free weekend :) | 19:58 |
@sonney2k | heiko, and rain... | 19:58 |
heiko | rain? | 19:58 |
@sonney2k | at least it is raining here for the last 2 days | 19:58 |
heiko | oh | 19:58 |
heiko | yes | 19:59 |
heiko | bad weather all the time | 19:59 |
@sonney2k | ideal coding weather | 19:59 |
heiko | however, going climbing in the dry hours anyways :) | 19:59 |
* sonney2k is crossing fingers | 19:59 | |
@sonney2k | no accidents please | 19:59 |
@sonney2k | maybe we discuss on monday about this subset thing | 19:59 |
heiko | no, not interested in that | 19:59 |
heiko | yes | 19:59 |
heiko | let's do this, I will have a think about it | 20:00 |
heiko | so, I gotta go, my girlfriend is waiting | 20:00 |
heiko | bye | 20:00 |
@sonney2k | bye | 20:00 |
@sonney2k | and thanks again | 20:00 |
-!- heiko [~heiko@main.uni-duisburg.de] has quit [Quit: Leaving.] | 20:00 | |
-!- in3xes1 is now known as in3xes | 20:13 | |
-!- gsomix [~gsomix@95.67.172.217] has joined #shogun | 21:37 | |
-!- in3xes_ [~in3xes@180.149.49.230] has joined #shogun | 22:08 | |
-!- in3xes [~in3xes@180.149.49.227] has quit [Remote host closed the connection] | 22:08 | |
-!- in3xes_ is now known as in3xes | 22:09 | |
-!- gsomix [~gsomix@95.67.172.217] has quit [Quit: I'm leaving you (xchat 2.4.5 or older)] | 22:58 | |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has joined #shogun | 23:06 | |
alesis-novik | Hmm, is it ok to use memset to zero float64_t? | 23:09 |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has quit [Ping timeout: 240 seconds] | 23:29 | |
-!- alesis-novik [~alesis@cpat001.wlan.net.ed.ac.uk] has joined #shogun | 23:30 | |
-!- f-x` [~user@117.192.196.224] has quit [Ping timeout: 260 seconds] | 23:58 | |
--- Log closed Sat Jul 23 00:00:00 2011 |