--- Log opened Wed Jun 06 00:00:41 2012 | ||
-!- heiko [~heiko@host86-179-59-69.range86-179.btcentralplus.com] has quit [Ping timeout: 256 seconds] | 01:07 | |
-!- blackburn [d5578aee@gateway/web/freenode/ip.213.87.138.238] has quit [Ping timeout: 245 seconds] | 02:39 | |
-!- romi_ [~mizobe@187.66.121.115] has quit [Quit: Leaving] | 04:36 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Read error: Operation timed out] | 05:47 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 06:07 | |
-!- wiking [~wiking@78-23-189-112.access.telenet.be] has joined #shogun | 06:13 | |
-!- wiking [~wiking@78-23-189-112.access.telenet.be] has quit [Changing host] | 06:13 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 06:13 | |
-!- uricamic [~uricamic@2001:718:2:1634:29b5:2f5b:6ebd:d1b0] has joined #shogun | 09:01 | |
-!- gsomix [~gsomix@109.169.142.23] has joined #shogun | 10:01 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 10:05 | |
gsomix | hi | 10:08 |
-!- wiking [~wiking@we02c096.ugent.be] has joined #shogun | 10:34 | |
-!- wiking [~wiking@we02c096.ugent.be] has quit [Changing host] | 10:34 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 10:34 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun | 11:08 | |
-!- gsomix [~gsomix@109.169.142.23] has quit [Quit: Ex-Chat] | 11:18 | |
-!- gsomix [~gsomix@109.169.142.23] has joined #shogun | 11:20 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 265 seconds] | 11:52 | |
-!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has joined #shogun | 12:00 | |
alexlovesdata | may I ask: who is zxtx, naywhayare, and the CIA agent ? | 12:07 |
alexlovesdata | with respect to the others I have an idea who they are | 12:08 |
zxtx | fan of the software | 12:14 |
zxtx | working on a patch to get pegasos into the repo | 12:14 |
alexlovesdata | ahh thx! | 12:15 |
gsomix | alexlovesdata, CIA-9 is github bot. | 12:15 |
alexlovesdata | thx | 12:17 |
-!- wiking [~wiking@78-23-189-112.access.telenet.be] has joined #shogun | 12:23 | |
-!- wiking [~wiking@78-23-189-112.access.telenet.be] has quit [Changing host] | 12:23 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 12:23 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Quit: leaving] | 12:58 | |
-!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has quit [Ping timeout: 245 seconds] | 13:11 | |
-!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has joined #shogun | 15:26 | |
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has joined #shogun | 16:05 | |
alexlovesdata | viking? | 16:11 |
wiking | alexlovesdata: yeps here | 16:25 |
alexlovesdata | nice | 16:26 |
alexlovesdata | ok so what you would like to do next? | 16:26 |
wiking | alexlovesdata: so yeah as i was writing to you earlier the very short plan for the latent extension of shogun would be to get a simple -1,1 labelled dataset and solve the optimization problem for that. currently i have a dataset with mammals in it. and the 'task' is basically to label the image (i.e. give the right mammal present in the image) and give a bounding box for it | 16:27 |
alexlovesdata | so joint object detection and classification | 16:27 |
wiking | so this is a very typical example for object recognition in images, i.e. your h would be something like (x,y) and (w, h) | 16:28 |
alexlovesdata | caltech256 animals? a dataset which I had used | 16:28 |
alexlovesdata | then you could cite me :) | 16:28 |
wiking | hheheheh :) | 16:28 |
alexlovesdata | unofficial Gsoc rule: cite your mentor, joke aside: | 16:28 |
wiking | imho there's only 6 different mammals in this dataset | 16:28 |
alexlovesdata | caltech256 animals has 52 or so | 16:29 |
wiking | i cannot recall now which dataset is this | 16:29 |
alexlovesdata | except for the fantasy animals like minotaur | 16:29 |
wiking | but then again it's small and simple... | 16:29 |
wiking | and the features are already ready | 16:29 |
alexlovesdata | thats not important | 16:29 |
wiking | so yeah anyhow... i was trying now to get that example working | 16:29 |
alexlovesdata | so you want to implement the felzenszwalb style latent svm? | 16:30 |
wiking | yes | 16:30 |
wiking | so first that one | 16:30 |
wiking | and after that try to work on a SO latent svm | 16:30 |
wiking | as soon as i can start using the solver of n4nd0 (fernando) | 16:31 |
alexlovesdata | and psi(x,h) would be = ? HoG feature over the box? | 16:31 |
wiking | yep | 16:31 |
wiking | so now i have like n hog for each image | 16:31 |
alexlovesdata | ok, sounds easy | 16:31 |
alexlovesdata | my suggestion would be to get a base latentpsi class and then derive your special class from it | 16:32 |
alexlovesdata | which has its own argmax | 16:33 |
alexlovesdata | and its own way to get the psi(x,h) | 16:33 |
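The base-class idea sketched above might look roughly like this (a minimal sketch; `LatentPsiBase`, `psi`, and `argmax_h` are invented names for illustration, not Shogun API of the time):

```cpp
#include <cassert>
#include <vector>

// Minimal sketch of the proposed design: an abstract latent-psi base
// class; each application derives it and supplies its own psi(x,h)
// and its own argmax over the latent variable h. All names invented.
struct LatentPsiBase {
    virtual ~LatentPsiBase() {}
    // joint feature map psi(x,h) for example x_idx with latent value h
    virtual std::vector<double> psi(int x_idx, int h) const = 0;
    // argmax_h w . psi(x,h); derived classes may exploit structure here
    virtual int argmax_h(const std::vector<double>& w, int x_idx) const = 0;
};

// Toy derived class: h ranges over {0,...,n_h-1} and psi is a scaled
// indicator vector, so brute-force search is trivially correct.
struct ToyPsi : LatentPsiBase {
    int n_h;
    explicit ToyPsi(int n) : n_h(n) {}
    std::vector<double> psi(int x_idx, int h) const override {
        std::vector<double> v(n_h, 0.0);
        v[h] = 1.0 + x_idx;  // toy feature value
        return v;
    }
    int argmax_h(const std::vector<double>& w, int x_idx) const override {
        int best = 0;
        double best_val = -1e300;
        for (int h = 0; h < n_h; ++h) {
            std::vector<double> p = psi(x_idx, h);
            double val = 0.0;
            for (int d = 0; d < n_h; ++d) val += w[d] * p[d];
            if (val > best_val) { best_val = val; best = h; }
        }
        return best;
    }
};
```

An object-detector class would derive the same way, with h encoding a bounding box and psi computing e.g. a HOG descriptor over it.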
wiking | ah so that one can already use this 'example' for other object recognition | 16:33 |
alexlovesdata | what do you mean by your last question? | 16:34 |
wiking | so i mean that we have this derived class | 16:34 |
alexlovesdata | yes | 16:34 |
wiking | which users can use out of box | 16:34 |
wiking | if they wanna have an object detector | 16:34 |
wiking | for example... | 16:35 |
wiking | ok | 16:35 |
alexlovesdata | yes, good idea | 16:35 |
wiking | that should be fine, although i have to discuss about this with blackburn (sergey) | 16:35 |
wiking | since i think it should be part of the library itself and not the example part in the repository | 16:36 |
wiking | but this is some minor thing | 16:36 |
alexlovesdata | what to discuss? | 16:36 |
wiking | well where exactly to store the code for this 'example' | 16:36 |
wiking | anyhow i was just wondering if there's such easy example for latent svm | 16:36 |
wiking | that would be a usual use case | 16:36 |
wiking | as it would be good to have 2-3 use cases for latent svm implemented in the library | 16:37 |
alexlovesdata | in examples/someinterface ? | 16:41 |
alexlovesdata | examples/undocumented/* | 16:41 |
alexlovesdata | or do you mean C++ code? | 16:41 |
alexlovesdata | C++ code you can have as test method even | 16:42 |
wiking | alexlovesdata: i mean that i think these basic 'examples' should actually be really part of the library itself. so that one could just basically include in his own code an ObjectDetector.h or something which is basically a latent svm based object recognizer... | 16:43 |
alexlovesdata | yes, this is ok | 16:44 |
alexlovesdata | you have a base class and a derived example | 16:44 |
wiking | yep | 16:44 |
wiking | but i need to talk about this with blackburn... how exactly we should do this | 16:44 |
wiking | i mean where to put the actual code/header... | 16:44 |
alexlovesdata | and then some code for interfaces (python whatever) | 16:45 |
alexlovesdata | yes put it into shogun main | 16:45 |
wiking | yeah the modular interfaces will be the last step ... | 16:45 |
wiking | so first only c++ and then when it all works fine i'll do the modular interfaces for python etc.. | 16:45 |
alexlovesdata | because it is a usable piece of code | 16:45 |
alexlovesdata | my question would be | 16:45 |
alexlovesdata | will you also implement the mining of hard negatives | 16:47 |
wiking | aaaah | 16:48 |
wiking | not this week :D | 16:48 |
alexlovesdata | no not this week | 16:48 |
alexlovesdata | I wanted to say: it is NOT mandatory for latent SVM | 16:48 |
wiking | but yeah i was thinking about it | 16:48 |
alexlovesdata | I think it is not your core duty to implement Felzenszwalb in all details, ok? | 16:49 |
alexlovesdata | so if you do binary latent SVM fine | 16:49 |
alexlovesdata | doing more like hard negatives mining is NOT mandatory. | 16:50 |
alexlovesdata | everything besides mining hard negatives is luxury ... for Donald Trump's wife | 16:50 |
wiking | yea but actually it would be great to have imho | 16:50 |
wiking | and of course when SO is in a working shape, it'd be great to have latent structural svm | 16:51 |
wiking | what i want this week is really the simple solver i've mentioned earlier.... based on ocas | 16:53 |
alexlovesdata | thats fine! | 16:53 |
alexlovesdata | and pls do not waste time on more than mining hard negatives | 16:54 |
wiking | :> | 16:55 |
wiking | will try :) | 16:56 |
alexlovesdata | hmm, should we discuss nandos struct while he is not in chat? | 16:57 |
alexlovesdata | bash it and talk about our wishes :) ? | 16:58 |
wiking | :D | 17:00 |
alexlovesdata | https://github.com/iglesias/shogun/tree/master/src/shogun/so | 17:00 |
wiking | ahhahaha | 17:00 |
alexlovesdata | because if we discuss this in three weeks it might be too late | 17:00 |
alexlovesdata | so now is time for wishing what we want | 17:00 |
wiking | yeah i've already told him 2 weeks ago | 17:01 |
wiking | what i want :D | 17:01 |
alexlovesdata | any desired changes which you could tell me ? | 17:01 |
alexlovesdata | so that I know what you want, too :D ? | 17:01 |
wiking | well i think the problem here will be | 17:01 |
wiking | that i'll have basically 2 base classes | 17:02 |
wiking | 1) a base class latent svm solver with -1,1 labelling | 17:02 |
wiking | 2) same but with structured labelling | 17:02 |
wiking | i don't see how it can be covered by only 1 base class | 17:02 |
wiking | so lets say i'll have something like: LatentLinearMachine and LatentStructuredLinearMachine | 17:03 |
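The two proposed entry points might have roughly this shape (names taken from the chat; neither class existed in Shogun at the time, and the bodies here are stubs for illustration only):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical shape of the two proposed base classes. Both would fix
// the latent variable h in an outer loop; they differ in label type
// and joint feature map: PSI(x,h) for the binary -1/+1 case,
// PSI(x,y,h) for the structured case.
struct LatentLinearMachine {
    virtual ~LatentLinearMachine() {}
    // binary -1/+1 labels; stub returns the number of examples seen
    virtual std::size_t train(const std::vector<double>& x,
                              const std::vector<int>& y) {
        return x.size();
    }
};

struct LatentStructuredLinearMachine {
    virtual ~LatentStructuredLinearMachine() {}
    // structured labels, represented here as int vectors for the sketch
    virtual std::size_t train(const std::vector<double>& x,
                              const std::vector<std::vector<int> >& y) {
        return x.size();
    }
};
```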
alexlovesdata | does that affect nando's framework? because for -1,+1 labels you could use ocas and be fine | 17:04 |
alexlovesdata | am i wrong? | 17:04 |
alexlovesdata | wait I get myself a coffee for 3 minutes | 17:05 |
wiking | no you are wrong | 17:05 |
wiking | ok no worries | 17:05 |
wiking | i'll write the rest here in the meanwhile... so afaik LatentStructuredLinearMachine can be derived from CLinearStructuredOutputMachine | 17:05 |
wiking | and that would be the latent s-svm solver | 17:06 |
-!- romi_ [~mizobe@187.66.121.115] has joined #shogun | 17:09 | |
alexlovesdata | if I am wrong then pls correct me | 17:12 |
alexlovesdata | back from getting coffee | 17:12 |
wiking | ok | 17:13 |
-!- uricamic [~uricamic@2001:718:2:1634:29b5:2f5b:6ebd:d1b0] has quit [Quit: Leaving.] | 17:15 | |
alexlovesdata | afaik LatentStructuredLinearMachine can be derived from CLinearStructuredOutputMachine ... | 17:20 |
alexlovesdata | I would say: you use it rather as a solver instead of deriving | 17:20 |
alexlovesdata | so it would be a member | 17:20 |
alexlovesdata | or called in train() only | 17:20 |
alexlovesdata | that might be easier than deriving it | 17:21 |
alexlovesdata | then you can use nandos stuff only as a solver and are free to design your own interfaces as you like them most | 17:21 |
wiking | mmm | 17:21 |
wiking | but we'll have to change it | 17:22 |
wiking | since the PSI is different in case of a | 17:22 |
wiking | LatentStructuredLinearMachine and CLinearStructuredOutputMachine | 17:22 |
wiking | PSI(x,y,h) vs PSI(x,y) | 17:22 |
wiking | so i wouldn't be able to use it directly | 17:23 |
alexlovesdata | right | 17:23 |
alexlovesdata | for solver calls you know the h's already | 17:23 |
alexlovesdata | so you could add to your member a getpsi_knowhiddenlabels | 17:24 |
alexlovesdata | method | 17:24 |
alexlovesdata | which inputs the right psi into nando's solver | 17:24 |
alexlovesdata | wrong? | 17:24 |
wiking | mmm | 17:25 |
alexlovesdata | if you see a problem in it pls say so ... mistakes belong to me like rotten fruits to a market | 17:25 |
* wiking thinking | 17:25 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun | 17:26 | |
wiking | so this is the current implementation of an SO solver: https://github.com/iglesias/shogun/blob/master/src/shogun/so/VanillaStructuredOutputMachine.cpp | 17:27 |
wiking | this as is i won't be able to use as a solver directly in CLinearStructuredOutputMachine | 17:27 |
wiking | i've meant: LatentStructuredLinearMachine | 17:27 |
wiking | or i don't see it yet | 17:28 |
wiking | mmm | 17:29 |
wiking | ok i'll be able | 17:29 |
-!- blackburn [d5578d64@gateway/web/freenode/ip.213.87.141.100] has joined #shogun | 17:29 | |
wiking | i'll only need to change the CStructuredModel | 17:29 |
blackburn | hey | 17:29 |
blackburn | I just read logs - wiking what is the code you want to discuss where to put it in? | 17:29 |
wiking | because in https://github.com/iglesias/shogun/blob/master/src/shogun/so/VanillaStructuredOutputMachine.cpp#L34 the passed CFeatures* data would be actually the already calculated PSI(x,y,h) | 17:30 |
wiking | blackburn: ok so let's say i have a basic latent svm solver, e.g. LatentLinearMachine | 17:30 |
blackburn | right | 17:30 |
wiking | that needs a lot of parameters/functions implemented if you actually want to use it for an actual problem | 17:30 |
wiking | so let's say u want to have an object detector in an image based on latent svm (typical use case) | 17:31 |
wiking | that would be something like ObjectDetector : public LatentLinearMachine | 17:31 |
n4nd0 | wiking: there have been some changes in Vanilla and other classes | 17:31 |
n4nd0 | wiking: check my branch so to see the latest ones | 17:31 |
wiking | imho that class is so 'often' used that actually having it in shogun library would be good | 17:31 |
blackburn | yes | 17:32 |
wiking | n4nd0: https://github.com/iglesias/shogun/blob/master/src/shogun/so/VanillaStructuredOutputMachine.cpp#L34 | 17:32 |
blackburn | I don't mind to put it into shogun/latent | 17:32 |
wiking | isn't this the latest? | 17:32 |
wiking | blackburn: ok | 17:32 |
n4nd0 | wiking: no | 17:32 |
wiking | n4nd0: ? | 17:32 |
wiking | :D | 17:32 |
wiking | where is it then? :D | 17:32 |
blackburn | however there could be some issues | 17:33 |
blackburn | i.e if you want HoG there | 17:33 |
n4nd0 | wiking: in the branch | 17:33 |
n4nd0 | wiking: not in master | 17:33 |
wiking | n4nd0: oh shit yeah sorry :DDD | 17:33 |
n4nd0 | no problem :D | 17:33 |
wiking | blackburn: well that implementation is not dependent on HoG itself | 17:33 |
wiking | so you could use other features | 17:33 |
n4nd0 | I have a bunch of new changes too I will push soon them to the branch | 17:34 |
alexlovesdata | right, all what we need is a class for psi(x,y,h) | 17:34 |
blackburn | wiking: then it could become pretty big | 17:34 |
alexlovesdata | can anyone give me the link to the relevant branch? | 17:34 |
blackburn | I don't mind to put it into applications as well | 17:34 |
wiking | alexlovesdata: https://github.com/iglesias/shogun/tree/so/src/shogun/so | 17:34 |
wiking | n4nd0: liked the other api better :))) | 17:37 |
wiking | n4nd0: was more flexible :P | 17:37 |
n4nd0 | wiking: because of the function pointers? | 17:37 |
wiking | not just because of that | 17:37 |
alexlovesdata | right, this api is a bit more special | 17:37 |
alexlovesdata | because it separates the structured labels from the features | 17:38 |
alexlovesdata | in SO this split is artificial | 17:38 |
alexlovesdata | one works over Psi(x,y) | 17:38 |
n4nd0 | wiking, alexlovesdata : tell me what parts you don't like and we can adapt it | 17:38 |
alexlovesdata | this can be constructed from phi(x) and y | 17:38 |
alexlovesdata | but that is not necessary ... | 17:38 |
wiking | n4nd0: i'm just checking https://github.com/iglesias/shogun/blob/so/src/shogun/so/StructuredModel.h | 17:39 |
wiking | as basically that's the thing i'll have to modify | 17:39 |
wiking | or create a derived class | 17:39 |
alexlovesdata | if I am allowed to say something ... | 17:39 |
n4nd0 | what do you want to modify? | 17:39 |
n4nd0 | alexlovesdata: sure :) | 17:39 |
wiking | go ahead | 17:39 |
alexlovesdata | if we would have an alternate setter which allows to input the Psi(x,y) directly | 17:39 |
alexlovesdata | without constructing them from struct labels and features | 17:40 |
alexlovesdata | for our stuff we will probably use a psi class to get this abstraction ... I do not require this | 17:40 |
n4nd0 | alexlovesdata: the part of Psi is quite undone so far | 17:40 |
alexlovesdata | but as thinking input it might be an idea | 17:40 |
alexlovesdata | the psi class has its own argmax | 17:41 |
alexlovesdata | depending on the structure | 17:41 |
alexlovesdata | and can be initialized as one likes | 17:41 |
alexlovesdata | eg explicitly by set structlabels and set features | 17:41 |
alexlovesdata | the point is: the struct solver needs only psi(x,y) and information which x and which y belongs to each psi and the set of all possible ys and x's | 17:42 |
alexlovesdata | everything else is more specialized to some applications | 17:42 |
alexlovesdata | am I wrong?? | 17:42 |
alexlovesdata | thats why I would like to have a way such that one can input the psis directly together with an argmax | 17:43 |
alexlovesdata | and that was possible with the old C-style interface | 17:44 |
alexlovesdata | by overriding the function pointer | 17:44 |
alexlovesdata | but you can do that with the new interface as well | 17:44 |
blackburn | please prefer interfaces | 17:44 |
n4nd0 | alexlovesdata: my idea is that Psi would be a class member of the StructuredModel | 17:44 |
alexlovesdata | ... with a different abstraction however | 17:44 |
blackburn | pointers is more painful for modular interfaces | 17:44 |
n4nd0 | alexlovesdata: then you could have a set_psi there too | 17:45 |
alexlovesdata | I agree blackburn | 17:45 |
alexlovesdata | and its uglier | 17:45 |
blackburn | with brand new directors (TM) we can do some funky shit here | 17:45 |
alexlovesdata | however even set_psi is bad when the psis are too big for memory | 17:45 |
alexlovesdata | so it would be better to have a psi class which has its get_a_specific_psi member | 17:46 |
alexlovesdata | because the solver could just use the getter to get the right psi | 17:46 |
alexlovesdata | and its associated struct label | 17:46 |
alexlovesdata | and no need for members like vector<fullpsis> | 17:47 |
n4nd0 | we should also take into account that sonney2k wants to have the joint features or psi with the idea of COFFIN | 17:47 |
n4nd0 | so I think that at the end we will use a class similar to CDotFeatures | 17:47 |
alexlovesdata | well, the psis need then a scalar prod (member of class) and a linadd ... | 17:47 |
alexlovesdata | yep | 17:47 |
alexlovesdata | I have not looked into CDotFeatures but my idea behind it is: it needs then some getter for the psi(x_i,y_i) | 17:48 |
alexlovesdata | and for its associated label index and feature index | 17:48 |
alexlovesdata | that should be enough for the solver | 17:49 |
alexlovesdata | and a derived class could implement a psi from Cfeatures and Cstructlabels ... as done now in the current code | 17:49 |
alexlovesdata | so you would retain the current functionality | 17:50 |
alexlovesdata | just split the solver from getting the psis | 17:50 |
alexlovesdata | thats my suggestion ... sorry for assholing around | 17:50 |
alexlovesdata | so the structuredmodel would have a setpsiclass member or so | 17:51 |
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has quit [Quit: Page closed] | 17:51 | |
alexlovesdata | is that an idea?> | 17:52 |
wiking | n4nd0: ping :) | 17:52 |
n4nd0 | alexlovesdata: so what you suggest is to have a psi_function that is a member of the model with a setter | 17:53 |
n4nd0 | or? | 17:53 |
alexlovesdata | yes | 17:53 |
alexlovesdata | but psifunction is a class itself | 17:53 |
n4nd0 | wiking: I was answering, it takes some time to read and think :P | 17:53 |
wiking | n4nd0: :>>> no worries | 17:53 |
n4nd0 | alexlovesdata: I agree with that suggestion, it's the idea I have | 17:54 |
alexlovesdata | I get older, too ... | 17:54 |
wiking | ok | 17:54 |
n4nd0 | I have not so clear though what functionality we should provide in this base psi_function class | 17:54 |
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has joined #shogun | 17:54 | |
alexlovesdata | what the solver needs ... | 17:55 |
alexlovesdata | 1)accessing the i-th psi | 17:55 |
alexlovesdata | getting its index into training data (no x_i actually necessary) | 17:55 |
alexlovesdata | getting its index into structlabels | 17:55 |
alexlovesdata | if we assume that structlabels are discrete | 17:56 |
n4nd0 | where i-th represents both an index for a feature and another for a label? | 17:56 |
alexlovesdata | the index could be for a continuous case also a vector of real numbers | 17:56 |
alexlovesdata | but in the simplest discrete case it is a long or a vector of long | 17:56 |
alexlovesdata | for prediction it needs the range of struct labels | 17:57 |
alexlovesdata | not for the solver: ways to construct these psis | 17:57 |
alexlovesdata | index for training data means the i of the training data point x_i | 17:58 |
alexlovesdata | index for structlabels means: which y was used | 17:59 |
alexlovesdata | so these two indices are two different things | 17:59 |
alexlovesdata | I need a black tea for a moment | 18:00 |
alexlovesdata | back | 18:03 |
n4nd0 | ok, tell me then | 18:04 |
alexlovesdata | nando: you could check for a struct formulation what it needs besides the Psis | 18:04 |
alexlovesdata | then you know what members you will need for the psi class | 18:04 |
alexlovesdata | it will also have its own argmax | 18:04 |
alexlovesdata | because that depends on the structure of Psi | 18:04 |
n4nd0 | why? | 18:04 |
n4nd0 | I think that Psi and ArgMax should be different parts | 18:05 |
n4nd0 | I don't understand why the psi function have its own argmax | 18:05 |
alexlovesdata | because max_{y \in Y} w*psi(x,y) | 18:06 |
alexlovesdata | depends on the structure of psi(x,y) and y | 18:06 |
alexlovesdata | in the most generic case it would be searching all y's brute force | 18:06 |
alexlovesdata | in more special cases you would search only some y's based on their structure | 18:07 |
alexlovesdata | eg in computer vision only some bounding boxes close to a given one | 18:07 |
alexlovesdata | y=bounding box params | 18:07 |
alexlovesdata | thats why argmax would be a member of the psi class | 18:07 |
alexlovesdata | wrong? | 18:07 |
n4nd0 | I understand your point | 18:08 |
n4nd0 | but thinking of the code, it looks to me kind of weird that the psi function has its own argmax | 18:08 |
n4nd0 | it is like, the psi function is computed independently of how the argmax is computed | 18:09 |
n4nd0 | then, why should argmax be a member of psi? | 18:09 |
alexlovesdata | because I would say that computing the argmax depends on the structure of psi and y | 18:09 |
-!- romi_ [~mizobe@187.66.121.115] has quit [Ping timeout: 244 seconds] | 18:10 | |
alexlovesdata | I think a good starting point would be if you look into the SO formulation based on: how would we start if we would load psi(x,y) from disk | 18:10 |
alexlovesdata | what member would the psi class need | 18:11 |
n4nd0 | psi class needs labels and features | 18:11 |
alexlovesdata | if you work directly with precomputed psis | 18:11 |
alexlovesdata | only labels, no x's | 18:11 |
n4nd0 | why not? | 18:11 |
alexlovesdata | because in SO SVM you never use the X directly in optimization | 18:11 |
alexlovesdata | only Psi(x,y) | 18:12 |
alexlovesdata | am I wrong? | 18:12 |
n4nd0 | ok, so you mean like we use a particular example of X (let's say an x_i) but not the whole X? | 18:12 |
alexlovesdata | what you need is only for psi(x_i,y_i) to remember y_i and the index i | 18:13 |
alexlovesdata | no | 18:13 |
alexlovesdata | psi(x,y)=cos(x)*log(y) | 18:13 |
alexlovesdata | you can work directly with the psis | 18:14 |
-!- heiko [~heiko@host86-179-192-248.range86-179.btcentralplus.com] has joined #shogun | 18:14 | |
alexlovesdata | you will never need to know the value of x at no point in training or testing | 18:14 |
-!- romi_ [~mizobe@187.66.121.115] has joined #shogun | 18:14 | |
n4nd0 | provided that psis are precomputed, or? | 18:15 |
alexlovesdata | right! | 18:15 |
alexlovesdata | and our derived psi class takes care of precomputing them in the style which you like | 18:15 |
alexlovesdata | e.g. precomputing on the fly from x's and y's like you and nico are used to do | 18:15 |
n4nd0 | ok, I understand what you mean | 18:16 |
alexlovesdata | but we can also load psis from disk or an SSD on demand (thats why the getter for required single psi(x_i,y)) | 18:16 |
n4nd0 | your point implies that m_features should not be in CStructuredModel, right? | 18:17 |
alexlovesdata | right! | 18:17 |
alexlovesdata | because you can still load them if necessary from SSD by get_your_psi | 18:17 |
alexlovesdata | even when they do not fit into your mem | 18:18 |
alexlovesdata | that would be scalable ;) | 18:18 |
-!- heiko1 [~heiko@host86-180-43-237.range86-180.btcentralplus.com] has joined #shogun | 18:18 | |
alexlovesdata | and a derived psi class could take care of that loading on demand or whatever | 18:18 |
alexlovesdata | stop me if I am talking crap | 18:18 |
-!- heiko [~heiko@host86-179-192-248.range86-179.btcentralplus.com] has quit [Ping timeout: 256 seconds] | 18:18 | |
alexlovesdata | may happen ;) | 18:18 |
n4nd0 | aham, so m_features could even disappear from LinearSOMachine | 18:18 |
n4nd0 | alexlovesdata: haha ook :D | 18:19 |
alexlovesdata | right because it asks the getter member to provide the next psi | 18:19 |
alexlovesdata | yeah! | 18:19 |
n4nd0 | alexlovesdata: ok, so I can understand that but you have still to convince with Psi having the Argmax :) | 18:20 |
alexlovesdata | hehehe | 18:20 |
alexlovesdata | so the goal is to compute argmax_y w*psi(x_i,y) right? | 18:21 |
n4nd0 | yes | 18:21 |
alexlovesdata | I look up something | 18:22 |
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has quit [Quit: Page closed] | 18:24 | |
alexlovesdata | so what happens if you have prior knowledge about how to compute the psis from x and y | 18:24 |
alexlovesdata | encoded in your derived class | 18:24 |
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has joined #shogun | 18:24 | |
alexlovesdata | then you can write a very efficient argmax | 18:24 |
alexlovesdata | as member of this class | 18:24 |
alexlovesdata | which exploits properties of the psi to compute argmax | 18:24 |
alexlovesdata | to skip some candidate y's | 18:25 |
alexlovesdata | example y \in \RR^d | 18:25 |
alexlovesdata | x in \RR^d | 18:25 |
n4nd0 | still, I see the relation in the other way; argmax has psi as a member | 18:26 |
CIA-9 | shogun: Heiko Strathmann master * rdaeea81 / examples/undocumented/libshogun/statistics.cpp : put different values to examples - http://git.io/r9gYDA | 18:26 |
CIA-9 | shogun: Heiko Strathmann master * r724eb3b / examples/undocumented/libshogun/statistics.cpp : Merge pull request #570 from karlnapf/master - http://git.io/CqY94g | 18:26 |
n4nd0 | alexlovesdata: would that fit for what you are saying? | 18:27 |
n4nd0 | I think it would | 18:27 |
alexlovesdata | psi(x,y)=cos(sum(x))*y | 18:27 |
alexlovesdata | psi is one d | 18:27 |
alexlovesdata | then for w>0 and cos(sum(x))>0 you can skip all negative y's | 18:28 |
alexlovesdata | the argmax is a function ... | 18:28 |
alexlovesdata | you want to make it a class? | 18:28 |
n4nd0 | yes | 18:29 |
n4nd0 | it's already a class in the last version of the code | 18:29 |
n4nd0 | we cannot afford to use function pointers | 18:29 |
alexlovesdata | or better psi(x,y)=cos(sum(x))*y*difficultcomplexbutpositivefunctionof(x,y) | 18:29 |
alexlovesdata | then an efficient argmax can just look at the signs of the first two terms | 18:30 |
alexlovesdata | and skip the difficultcomplexbutpositivefunctionof(x,y) | 18:30 |
alexlovesdata | if psi is one dimensional | 18:30 |
alexlovesdata | does that serve as an example | 18:31 |
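The one-dimensional example can be made concrete (hypothetical helper names; `g` stands in for the expensive but positive factor):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Concrete version of the 1-D example above:
//   psi(x,y) = cos(sum(x)) * y * g(x,y)   with g(x,y) > 0 everywhere.
// For a scalar w, sign(w * psi) depends only on sign(w * cos(sum(x)) * y),
// so an argmax specialized to this psi can skip evaluating the
// expensive g for any candidate y whose sign cannot win (assuming the
// candidate set contains y's of both signs).
static double g(double x, double y) {
    return std::exp(-0.1 * (x * x + y * y)) + 1.0;  // always positive
}

static double cos_sum(const std::vector<double>& x) {
    double s = 0.0;
    for (double v : x) s += v;
    return std::cos(s);
}

double argmax_y(const std::vector<double>& x, const std::vector<double>& ys, double w) {
    const double c = cos_sum(x);
    double best_y = ys.front(), best_val = -1e300;
    for (double y : ys) {
        // sign test: skip candidates whose sign cannot beat a same-sign winner
        if (w * c > 0 && y < 0) continue;
        if (w * c < 0 && y > 0) continue;
        // only now pay for the expensive factor (toy: g of first coordinate)
        double val = w * c * y * g(x.front(), y);
        if (val > best_val) { best_val = val; best_y = y; }
    }
    return best_y;
}
```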
alexlovesdata | my argument for making psi a member is that it needs only the w and knowledge about the structure of psi and y | 18:31 |
alexlovesdata | this is already present in the psi class | 18:32 |
n4nd0 | ok | 18:32 |
n4nd0 | but thinking of argmax as a class too | 18:32 |
alexlovesdata | which is no contradiction | 18:32 |
alexlovesdata | because a derived class could call a member argmax, right? | 18:32 |
n4nd0 | your idea should fit also if psi IS the member of argmax | 18:32 |
n4nd0 | alexlovesdata: yes | 18:33 |
alexlovesdata | your idea should fit also if psi IS the member of argmax: yes | 18:33 |
alexlovesdata | I agree | 18:33 |
-!- blackburn1 [~blackburn@188.168.3.9] has joined #shogun | 18:33 | |
alexlovesdata | hmm, C++ has no real pope for it, what a pity | 18:33 |
n4nd0 | alexlovesdata: for what? | 18:34 |
alexlovesdata | for being an instance which tells us what to choose and can never make a mistake :) | 18:35 |
alexlovesdata | with the psi being member of the argmax you would call then argmax->psiclass->getnextpsi in optimization | 18:36 |
alexlovesdata | to get psi(xi,yi) | 18:36 |
alexlovesdata | also possible ... | 18:36 |
n4nd0 | it is because IMHO to have argmax inside psi is not intuitive | 18:37 |
alexlovesdata | I would be tempted to ask why not intuitive | 18:41 |
alexlovesdata | but I do not want to go on your nerves | 18:41 |
n4nd0 | haha | 18:41 |
n4nd0 | no problem | 18:41 |
alexlovesdata | you can tell nico to berate me for today :) | 18:41 |
n4nd0 | because Psi doesn't need to know anything about the argmax in order to do its task | 18:41 |
alexlovesdata | right | 18:42 |
alexlovesdata | but sometimes the argmax can use specialized information about the psi for computation | 18:42 |
alexlovesdata | and I think technically it would not make the argmax more special by putting into the psi, right? | 18:43 |
n4nd0 | that is still in the direction of argmax needs psi | 18:43 |
n4nd0 | but not psi needs argmax :D | 18:43 |
alexlovesdata | right | 18:43 |
alexlovesdata | but psi also needs no getter | 18:43 |
alexlovesdata | but the getter needs psi | 18:44 |
n4nd0 | ? | 18:44 |
alexlovesdata | thats why the getter which delivers the i-th psi is a member | 18:44 |
alexlovesdata | could be that we are stuck in a question which needs a pope ... :) | 18:44 |
n4nd0 | yeah | 18:45 |
alexlovesdata | or you do as you prefer as long as we can implement a specialized argmax for special psi | 18:45 |
n4nd0 | it feels better if we all agree | 18:46 |
alexlovesdata | I would attach functions to the data classes ... but I will not require that from anyone else ... because that is a style matter | 18:46 |
alexlovesdata | (a papal matter) | 18:46 |
blackburn1 | alexlovesdata: I am curious - can one use latent svm with not only bounding box but some transformations like rotation or perspective? | 18:47 |
alexlovesdata | I can agree on anything which allows users to implement specialized argmaxes and psi-getters | 18:47 |
alexlovesdata | I think for general users yes | 18:47 |
alexlovesdata | thats why I want an abstract psi | 18:47 |
alexlovesdata | so that any guy who needs something more weird can program it | 18:48 |
n4nd0 | yes, that's important too | 18:48 |
alexlovesdata | so that construction of the actual psi can be done outside the solver code | 18:50 |
alexlovesdata | except for where it is unavoidable | 18:51 |
alexlovesdata | maybe I forgot the important point: | 18:52 |
alexlovesdata | which I had mentioned just now ... splitting solving the problem from constructing the psi | 18:52 |
alexlovesdata | because with the current interface that is inside the solver | 18:53 |
n4nd0 | I am sorry but I don't understand | 18:54 |
alexlovesdata | void set_labels(CStructuredLabels* labs); void set_features(CFeatures* feats); SGVector<float64_t> compute_joint_feature(int32_t feat_idx, int32_t lab_idx) | 18:55 |
alexlovesdata | 1. now we store the features and labels in memory (can be loaded on the fly ... your virt memory will like that :D ) | 18:56 |
alexlovesdata | 2. compute joint feature is now inside the solver class, | 18:56 |
alexlovesdata | with a getter it would be outside the solver algorithm | 18:56 |
alexlovesdata | with a psi class and a getter it would be outside the solver algorithm | 18:57 |
alexlovesdata | no one understands me :'-(( | 18:57 |
alexlovesdata | :) | 18:57 |
alexlovesdata | I need to check MKL regression .. was broken in 0.10.0 and 1.1.0 for Matlab with custom kernels | 18:58 |
blackburn1 | alexlovesdata: are you the author of MKL in svmlight? | 18:59 |
alexlovesdata | no | 18:59 |
alexlovesdata | that was marius kloft | 18:59 |
n4nd0 | alexlovesdata: it is not like compute_joint_feature is inside the solver | 18:59 |
blackburn1 | aham | 18:59 |
alexlovesdata | but I have a little bit insight in it | 18:59 |
blackburn1 | we had some issue there | 18:59 |
blackburn1 | with LINADD optimizations | 18:59 |
blackburn1 | basically it is broken | 19:00 |
alexlovesdata | ok, that part I never looked into | 19:00 |
n4nd0 | alexlovesdata: the idea at that moment was that the model has the joint feature function as a member and provides this compute_joint_feature for the solver | 19:00 |
blackburn1 | ok | 19:00 |
n4nd0 | alexlovesdata: since the solver does not have a reference to the psi function directly but a reference to the model | 19:01 |
alexlovesdata | I understand ... but that requires storing CStructuredLabels* m_labels; /** feature vectors */ CFeatures* m_features; | 19:01 |
alexlovesdata | with an external psi class this would be transparent | 19:01 |
alexlovesdata | or do some complex hacks which insert artificially a psi | 19:02 |
alexlovesdata | I am strongly for keeping that computation out of the solver and let the psi getter do that job | 19:02 |
alexlovesdata | because then you can init the solver with a nando-style psi | 19:03 |
n4nd0 | alexlovesdata: would you mind sketching in a class diagram or a gist how you would like it to be then? | 19:03 |
alexlovesdata | or an object detection psi | 19:03 |
alexlovesdata | or a custom psi | 19:04 |
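The design being argued for in this thread — keep construction of psi out of the solver behind an abstract getter, so a nando-style psi, an object-detection psi, or a custom psi can be plugged in — could be sketched roughly as below. All class and method names here are hypothetical illustrations, not Shogun's actual API.

```python
# Hypothetical sketch of the proposed split; not Shogun's actual API.
from abc import ABC, abstractmethod

class JointFeatureMap(ABC):
    """Abstract psi: delivers Psi(x_{feat_idx}, y_{lab_idx}) via a getter,
    where both indices point into the training data."""

    @abstractmethod
    def get_psi(self, feat_idx, lab_idx):
        ...

class BlockPsi(JointFeatureMap):
    """Toy multiclass-style psi: copy x into the block belonging to label y."""

    def __init__(self, X, Y, n_labels):
        self.X, self.Y, self.n_labels = X, Y, n_labels

    def get_psi(self, feat_idx, lab_idx):
        x = self.X[feat_idx]
        y = self.Y[lab_idx]
        psi = [0.0] * (len(x) * self.n_labels)
        psi[y * len(x):(y + 1) * len(x)] = x  # x lands in label y's block
        return psi

class Solver:
    """The solver holds only the abstract psi, not features or labels,
    so a specialized psi (or argmax) can be supplied from outside."""

    def __init__(self, psi):
        self.psi = psi

    def compute_joint_feature(self, feat_idx, lab_idx):
        return self.psi.get_psi(feat_idx, lab_idx)
```

With this split, initializing the solver with any user-defined psi subclass requires no change to the solver code itself, which is the point being made about keeping the computation out of the solver.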
alexlovesdata | I am a mathematician | 19:04 |
wiking | n4nd0: i can do that for ya | 19:04 |
n4nd0 | wiking: ok | 19:04 |
alexlovesdata | I could write an example header, ok?, but I am not familiar with diagrams | 19:04 |
n4nd0 | alexlovesdata: ok | 19:05 |
wiking | alexlovesdata: i think i know what you'd like to do here so i'll try to sketch it up in a gist | 19:05 |
wiking | and let you and n4nd0 check it out | 19:05 |
alexlovesdata | great! thank you! | 19:05 |
wiking | nw | 19:05 |
alexlovesdata | nw = ?? | 19:06 |
wiking | i'll post it on the mailing list | 19:06 |
wiking | nw = no worries | 19:06 |
n4nd0 | nice | 19:06 |
alexlovesdata | hi nando, I hope you can live with my blabla ;) | 19:08 |
wiking | :D | 19:09 |
blackburn1 | n4nd0: I failed with 'curl' word :) | 19:10 |
n4nd0 | alexlovesdata: sure no problem! it's good to talk and discuss | 19:10 |
n4nd0 | blackburn1: noooooo | 19:10 |
blackburn1 | I have absolutely no idea where to put any curl here :D | 19:11 |
n4nd0 | alexlovesdata: I did some modifications to a diagram I was using, incorporating today's conversation | 19:21 |
n4nd0 | alexlovesdata: can you take a look at it and tell me if it represents what you said? | 19:21 |
n4nd0 | http://dl.dropbox.com/u/11020840/shogun/diagram.pdf | 19:21 |
alexlovesdata | thank you! | 19:21 |
n4nd0 | it is the left-most part | 19:21 |
alexlovesdata | I'll take a look | 19:22 |
alexlovesdata | the arrow means derived class, right? | 19:24 |
alexlovesdata | the 45-degree rotated cube means one class is a member of another? | 19:25 |
alexlovesdata | at first sight it looks nice | 19:25 |
alexlovesdata | should give us the possibility to do what we need ... | 19:25 |
alexlovesdata | wiking: what do you think? | 19:25 |
wiking | alexlovesdata: just checking | 19:26 |
n4nd0 | alexlovesdata: arrow derived class and the cube member yes | 19:26 |
wiking | ah yeah | 19:26 |
wiking | one thing | 19:26 |
n4nd0 | I think it strictly means weak aggregation or something like that, but member is fine :D | 19:26 |
wiking | get_psi (lab_idx, feat_idx) | 19:27 |
wiking | ok never mind | 19:27 |
wiking | it's ok | 19:27 |
wiking | i mean essentially it'd be the same idx or? | 19:27 |
alexlovesdata | probably compute_joint_feature would use get_psi again? | 19:27 |
wiking | or can it be that you want get_psi(0,1) ? | 19:27 |
wiking | ok yeah you may actually want to do that... so ok | 19:28 |
alexlovesdata | I think feat_idx refers to the index in x_i | 19:28 |
n4nd0 | alexlovesdata: yes, compute_joint_feature is get_psi | 19:28 |
alexlovesdata | lab_idx refers to the index into all possible Ys | 19:28 |
n4nd0 | not into all possible Ys | 19:28 |
alexlovesdata | right? wrong? | 19:28 |
wiking | alexlovesdata: well afaik you cannot have all the possible Ys | 19:28 |
wiking | alexlovesdata: only the ones that are present ;) | 19:29 |
n4nd0 | but the index into the Ys we got in training data | 19:29 |
n4nd0 | like the true Ys | 19:29 |
wiking | ^ what n4nd0 means here ;) | 19:29 |
alexlovesdata | you are right, I agree | 19:29 |
alexlovesdata | but I think lab_idx would not be the index into y_i, right? | 19:29 |
alexlovesdata | otherwise I should go to bed soon | 19:29 |
n4nd0 | yes, it is | 19:29 |
n4nd0 | but don't go to bed :P | 19:30 |
alexlovesdata | I need to be awake until 1:30 am :( | 19:31 |
n4nd0 | why did you say that lab_idx is not the i in y_i? | 19:31 |
alexlovesdata | so it is the index in y_i ? | 19:31 |
wiking | alexlovesdata: not in but of | 19:32 |
wiking | so i guess y_{lab_idx} | 19:32 |
alexlovesdata | ok | 19:32 |
wiking | it actually refers to a given y in the set | 19:32 |
alexlovesdata | so lab_idx is NOT the training data index, but the index into all available values for Y ? | 19:33 |
alexlovesdata | that I would think | 19:33 |
alexlovesdata | I get myself a black tea again | 19:33 |
wiking | alexlovesdata: actually it is | 19:33 |
wiking | the index into the training data | 19:33 |
n4nd0 | yes, in the training data index | 19:34 |
wiking | or i understand it as such | 19:34 |
n4nd0 | me too | 19:34 |
wiking | ok if n4nd0 as well then we are good ;) | 19:34 |
n4nd0 | I mean, for me the index into all available values for Y makes no sense | 19:34 |
wiking | as it can be an infinite set | 19:34 |
-!- romi_ [~mizobe@187.66.121.115] has quit [Ping timeout: 260 seconds] | 19:34 | |
n4nd0 | and there's no order defined there | 19:35 |
wiking | countable but not finite ;) | 19:35 |
alexlovesdata | ok then I do not get it currently | 19:38 |
alexlovesdata | never mind ... temporary confusion | 19:39 |
-!- romi_ [~mizobe@187.66.121.115] has joined #shogun | 19:42 | |
alexlovesdata | nice that we could agree today ... | 19:47 |
blackburn1 | hard discussion tonight | 19:47 |
wiking | \o/ | 19:49 |
n4nd0 | :) | 19:49 |
gsomix | n4nd0, hey | 19:49 |
n4nd0 | gsomix: hi | 19:50 |
gsomix | can you be concrete about which director classes you wish for? | 19:50 |
n4nd0 | yeah sure, I don't know if you read the conversation about it with sonney2k | 19:51 |
n4nd0 | basically it's that I think it would be a good idea to have the argmax and the psi function of SO with director classes | 19:51 |
n4nd0 | so we can prototype in python and so on | 19:51 |
n4nd0 | wiking, alexlovesdata: do you think it would be good to have that? | 19:52 |
alexlovesdata | yes, sounds practical :) | 19:52 |
wiking | second that | 19:53 |
alexlovesdata | maybe we should prioritize the wishes for directors among all objects ... what do people use most?? | 19:53 |
n4nd0 | gsomix: what do you think? | 19:53 |
n4nd0 | I have no idea how hard it is or how much time it takes to do these director classes | 19:54 |
n4nd0 | gsomix: you have done some already, right? | 19:54 |
gsomix | n4nd0, I just can tell you I'll do whatever you want :) | 19:54 |
gsomix | n4nd0, e.g. DirectorDistance, last day | 19:55 |
n4nd0 | ok | 19:57 |
n4nd0 | so let's wait a few days until the design of these classes is better established and I will tell you about it | 19:58 |
gsomix | n4nd0, ok | 19:58 |
n4nd0 | I am going now, talk to you later guys! | 19:59 |
-!- romi_ [~mizobe@187.66.121.115] has quit [Ping timeout: 252 seconds] | 19:59 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Read error: Operation timed out] | 20:02 | |
alexlovesdata | does anybody have an idea how the regression label class is initialized in Matlab? | 20:05 |
alexlovesdata | blackburn: what are the scores for the German sign recognition data? | 20:08 |
blackburn1 | alexlovesdata: ah I stopped at 97.84% | 20:08 |
blackburn1 | not enough time to try different hogs, etc :( | 20:09 |
alexlovesdata | seems to be +0.4 or so | 20:09 |
alexlovesdata | ok | 20:09 |
blackburn1 | yes colors are great | 20:09 |
blackburn1 | out of curiosity I tried to add the test images to the training ones | 20:09 |
blackburn1 | 99.84% :D | 20:09 |
alexlovesdata | maybe they did??? | 20:10 |
alexlovesdata | hmm, the winner got pretty much like 99.84 | 20:10 |
blackburn1 | winner got 99.46% | 20:10 |
blackburn1 | well they use LeNet | 20:10 |
alexlovesdata | I mean there is this transductive stuff | 20:10 |
blackburn1 | yes I just wanted to say I could try transductive | 20:11 |
alexlovesdata | :D | 20:11 |
blackburn1 | but unfortunately I have to get all things done in 4 hours | 20:11 |
alexlovesdata | with a neural net it could be implicitly transductive | 20:11 |
blackburn1 | so I just claim my 97.82 w/o any cheating | 20:11 |
alexlovesdata | well but for a thesis you can point out that there are some hard cases missing in training and adding them ... blablabla | 20:12 |
blackburn1 | haha | 20:12 |
-!- romi_ [~mizobe@187.66.121.115] has joined #shogun | 20:13 | |
blackburn1 | alexlovesdata: I have another thing that lets me claim I've got 100% accuracy :D | 20:14 |
blackburn1 | rejects | 20:14 |
alexlovesdata | also nice ... in particular if you can reject automatically | 20:14 |
blackburn1 | alexlovesdata: I just threshold outputs | 20:14 |
blackburn1 | with high threshold I get no errors *at all* | 20:15 |
alexlovesdata | then you could grab similar data from the web, add it to training and get 99.99% | 20:15 |
blackburn1 | however 50% are rejected then | 20:15 |
alexlovesdata | haha | 20:15 |
alexlovesdata | thats the ML trick of the day | 20:15 |
blackburn1 | :D | 20:15 |
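The reject-option trick described above — threshold the classifier's scores and refuse to predict on low-confidence samples — trades coverage for accuracy. A toy sketch (made-up function name and numbers, not blackburn1's actual pipeline):

```python
def accuracy_with_reject(scores, preds, truth, threshold):
    """Evaluate only on samples whose confidence score clears the threshold.

    Returns (accuracy on accepted samples or None, fraction accepted)."""
    accepted = [(p, t) for s, p, t in zip(scores, preds, truth)
                if s >= threshold]
    coverage = len(accepted) / len(truth)
    if not accepted:  # everything rejected
        return None, 0.0
    accuracy = sum(p == t for p, t in accepted) / len(accepted)
    return accuracy, coverage
```

Raising the threshold can drive accuracy on the accepted set to 100% while a large share of the data is refused, exactly the "no errors at all but 50% rejected" effect mentioned above.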
blackburn1 | alexlovesdata: too bad I spent too much time on fun instead of training an efficient classifier :D | 20:19 |
alexlovesdata | aren't we all doing it similarly? :D | 20:20 |
blackburn1 | i.e. I managed to cite Karl Popper but did not do overlapping HoG | 20:20 |
blackburn1 | because it was funnier | 20:20 |
alexlovesdata | and noetherian rings, too? | 20:21 |
blackburn1 | no, it is impossible I think | 20:21 |
blackburn1 | alexlovesdata: okay lesser cheating - added to trainset images from testset with errors :D | 20:26 |
blackburn1 | only 183 images actually | 20:26 |
alexlovesdata | but then did images from the test set which have not been added improve as well? | 20:26 |
blackburn1 | I am not sure I understand that | 20:27 |
blackburn1 | what do you mean? :) | 20:28 |
alexlovesdata | so you added images from testset to trainset, this implies that all added images will be classified well (because SVMs overfit terribly) | 20:31 |
blackburn1 | hmm yes probably | 20:31 |
alexlovesdata | but were there images which you did not add to the trainset, which have been classified wrongly before and then have been classified correctly after | 20:31 |
alexlovesdata | i.e. these images would have profited from improved generalization by adding the 183 other | 20:32 |
alexlovesdata | and not from mere overfitting | 20:32 |
blackburn1 | no, I added all images where I had errors | 20:33 |
blackburn1 | results will be in a min I think | 20:33 |
blackburn1 | alexlovesdata: does SVM really overfit terribly? | 20:34 |
alexlovesdata | yea, usually AUC=100 on training data | 20:35 |
blackburn1 | alexlovesdata: what about NNs then? | 20:35 |
alexlovesdata | I have no idea for neural nets ... | 20:37 |
blackburn1 | alexlovesdata: in my world it was thought that NNs overfit and SVMs are better because they do not overfit so much | 20:38 |
blackburn1 | alexlovesdata: wow, adding 183 images led to 99.69% | 20:38 |
blackburn1 | these 183 images would make me a winner of a contest | 20:39 |
blackburn1 | huuh | 20:39 |
alexlovesdata | on the training data I work with I get AUC=100 | 20:39 |
alexlovesdata | error=0 | 20:39 |
blackburn1 | alexlovesdata: I feel confused because you make me feel like all the capacity control blabla is useless stuff :) | 20:40 |
blackburn1 | and probably all that stuff is useless for real | 20:40 |
alexlovesdata | well capacity control is made for getting a reasonable error on testing data (cross-validation) | 20:40 |
alexlovesdata | it is not made to tune error rate on train set = error rate on test set | 20:41 |
blackburn1 | isn't that for some generalization ability? | 20:42 |
blackburn1 | I mean I thought max margin is the point of good generalization | 20:42 |
blackburn1 | am I wrong? | 20:42 |
blackburn1 | alexlovesdata: I remember you are a big fan of our 'president' - let you become a fan of our parliament http://cs304702.userapi.com/v304702563/1de3/Spom2VHmg4o.jpg | 20:47 |
blackburn1 | oops not 183 but 276 | 20:49 |
alexlovesdata | yes, it is for generalization but still you get zero error on training data | 21:04 |
alexlovesdata | what does the image mean?? - you can explain it to me tomorrow, I'm going home, 12 hours is enough! | 21:05 |
blackburn1 | alexlovesdata: image? the one I sent? | 21:05 |
blackburn1 | well deputy playing teddy bear | 21:06 |
blackburn1 | :D | 21:06 |
-!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has quit [Ping timeout: 245 seconds] | 21:09 | |
-!- blackburn [d5578d64@gateway/web/freenode/ip.213.87.141.100] has quit [Ping timeout: 245 seconds] | 21:47 | |
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has quit [Ping timeout: 245 seconds] | 21:49 | |
@sonney2k | blackburn1, SVMs don't overfit and your SVM giving that high accuracy on test data certainly does not | 22:00 |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun | 22:04 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 22:19 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 22:23 | |
gsomix | sonney2k, hey | 22:24 |
@sonney2k | gsomix, ho :) | 22:24 |
gsomix | EuclidianDistance vs DirectorDistance: 10 vs 37 (in seconds) | 22:25 |
@sonney2k | gsomix, how is it going? | 22:25 |
gsomix | sonney2k, working, programming, building. | 22:25 |
@sonney2k | gsomix, you mean again 10 times slower right? | 22:25 |
@sonney2k | sounds good | 22:25 |
gsomix | sonney2k, nope. 3.7 times | 22:26 |
@sonney2k | ahh ok - I guess you have the non-optimized atlas version like blackburn1 | 22:27 |
@sonney2k | gsomix, you could do a director for a general kernel machine | 22:28 |
@sonney2k | next I mean | 22:28 |
gsomix | sonney2k, ok | 22:28 |
@sonney2k | same with general linearmachine | 22:28 |
@sonney2k | maybe these two should be next | 22:29 |
@sonney2k | I think the only important thing to overload here is the train method | 22:29 |
@sonney2k | that's about it | 22:29 |
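What a director class buys can be mocked in pure Python (this sketch uses neither Shogun nor SWIG; every name in it is hypothetical): framework code calls train() through the base interface, and a subclass written in the target language supplies the implementation.

```python
class KernelMachine:
    """Stand-in for a C++ base class exposed through SWIG."""

    def train(self, data):
        raise NotImplementedError("overridden natively or via a director subclass")

class MeanThresholdMachine(KernelMachine):
    """Toy 'training' routine: store the data mean as a decision threshold."""

    def train(self, data):
        self.threshold = sum(data) / len(data)
        return True

def run_training(machine, data):
    # Framework code sees only the base interface; with real SWIG directors
    # the overridden Python train() would be dispatched to from C++.
    return machine.train(data)
```

As noted above, train() is often the only method that needs overloading; the roughly 3.7x slowdown gsomix measured is the price of crossing the language boundary on every callback.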
n4nd0 | sonney2k: by the way, did you read part of conversation wiking, alexander and I had before? | 23:07 |
-!- wiking_ [~wiking@huwico/staff/wiking] has joined #shogun | 23:20 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 240 seconds] | 23:21 | |
-!- wiking_ is now known as wiking | 23:22 | |
@sonney2k | n4nd0, superficially yes | 23:38 |
n4nd0 | sonney2k: ok, just in case you had a comment on whether psi should be a member of argmax or vice versa :D | 23:48 |
blackburn1 | sonney2k: hmm do not overfit at all? | 23:48 |
n4nd0 | that was the hot topic | 23:48 |
blackburn1 | you are all confusing me with contradictory claims :D | 23:48 |
gsomix | good night guys | 23:54 |
gsomix | ah, btw, google summer of building report http://instagr.am/p/Lf3WXUMs4H/ | 23:55 |
gsomix | first wall | 23:55 |
gsomix | hehe | 23:55 |
gsomix | .___. | 23:55 |
--- Log closed Thu Jun 07 00:00:41 2012 |
Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!