--- Log opened Wed Feb 15 00:00:19 2012
n4nd0 | blackburn: hi! | 01:13 |
blackburn | n4nd0: hi | 01:22 |
n4nd0 | blackburn: how is it going? | 01:25 |
n4nd0 | blackburn: I tested the classifier splitting the data in train and test and I am quite surprised with the results | 01:25 |
blackburn | fine, but I got no sleep today preparing a talk for a students' conference :) | 01:25 |
blackburn | tell me | 01:25 |
n4nd0 | blackburn: they were much better than I expected | 01:26 |
blackburn | with which classifier? | 01:26 |
n4nd0 | blackburn: SVMLib | 01:26 |
blackburn | libsvm I guess | 01:26 |
blackburn | I see | 01:26 |
n4nd0 | blackburn: yeah, libsvm sorry | 01:27 |
n4nd0 | blackburn: so the accuracy is 0.993338360985 | 01:27 |
n4nd0 | blackburn: which is much better than I expected using just an svm and the image pixels as features | 01:27 |
blackburn | looks pretty unreal for faces hmm | 01:27 |
n4nd0 | blackburn: exactly | 01:28 |
blackburn | what are the features? | 01:28 |
n4nd0 | blackburn: pixel values | 01:28 |
blackburn | is it CBCL face database? | 01:28 |
n4nd0 | blackburn: black and white images, normalized to mean 0 and std 1, but pixel values indeed | 01:29 |
n4nd0 | blackburn: it's the database I used for a course at the university, let me check if it was taken from a known database | 01:29 |
blackburn | I recall it has 19x19 images too | 01:30 |
n4nd0 | blackburn: yes | 01:30 |
blackburn | then why did you do splitting? | 01:30 |
blackburn | I mean there is test set | 01:31 |
n4nd0 | so I don't use all the data for training but I have part of it to test as well | 01:31 |
n4nd0 | I don't know if I got what you meant | 01:32 |
blackburn | i mean if we are talking about the same dataset | 01:32 |
blackburn | http://cbcl.mit.edu/projects/cbcl/software-datasets/FaceData1Readme.html | 01:32 |
blackburn | there is a train and test sets | 01:32 |
blackburn | so you don't have to split I guess | 01:33 |
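A minimal sketch of the experiment discussed above: a libsvm-backed SVM on raw pixel features with a held-out split. This assumes scikit-learn's `SVC` (which wraps libsvm) and uses synthetic stand-in data for the 19x19 CBCL face patches; with the real dataset, the fixed train/test sets it ships with would replace the split.

```python
# Train/test split + libsvm-backed SVM on pixel features (sketch).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, d = 400, 19 * 19                            # 19x19 images flattened to 361 features
X = rng.normal(size=(n, d))                    # stand-in for grayscale patches
y = (X[:, :10].sum(axis=1) > 0).astype(int)    # toy labels

# normalize each image to mean 0, std 1, as mentioned in the chat
X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)               # fraction of correct test predictions
print(f"accuracy: {accuracy:.3f}")
```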
n4nd0 | aha, I see | 01:34 |
n4nd0 | I will try with that database to see what I get | 01:34 |
blackburn | btw it is important to use the whole training set | 01:34 |
n4nd0 | ok | 01:34 |
blackburn | or even with virtual images | 01:34 |
blackburn | it is a common practice | 01:34 |
n4nd0 | virtual images? | 01:34 |
n4nd0 | artificially generated you mean? | 01:35 |
blackburn | i.e. shifted faces or noised | 01:35 |
blackburn | yeah | 01:35 |
n4nd0 | ok | 01:35 |
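The "virtual images" mentioned above can be sketched as a tiny augmentation step: shifted and noised copies of each training patch. Pure NumPy; the random patch here is a stand-in for a 19x19 grayscale face image, and the shift/noise parameters are illustrative choices.

```python
# Generate "virtual" training images: shifted and noised variants of a patch.
import numpy as np

def virtual_images(img, rng):
    """Return shifted and noised variants of a 2D image array."""
    out = []
    for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        # shift by one pixel in each of the four directions (wrap-around)
        out.append(np.roll(np.roll(img, dy, axis=0), dx, axis=1))
    out.append(img + rng.normal(scale=0.05, size=img.shape))  # noised copy
    return out

rng = np.random.default_rng(0)
patch = rng.random((19, 19))
augmented = virtual_images(patch, rng)
print(len(augmented))  # 4 shifts + 1 noised copy = 5 variants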
blackburn | there is a citation here, they used some crazy method for features | 01:35 |
n4nd0 | I remember that was pretty useful to train the cascade with boosting | 01:36 |
blackburn | yeah boosting works well in practice afaik | 01:36 |
blackburn | I am not a big fan of it though ;) | 01:36 |
n4nd0 | I believe it is called bootstrapping | 01:36 |
blackburn | or more generally ensembles | 01:37 |
blackburn | :) | 01:37 |
n4nd0 | :) | 01:37 |
n4nd0 | anyway | 01:37 |
n4nd0 | I guess that trained classifiers can be stored and retrieved from a program later right? | 01:37 |
blackburn | hmm yes, using serialization techniques | 01:38 |
n4nd0 | like save them into a file and later load them into memory | 01:38 |
n4nd0 | does that work between interfaces? | 01:38 |
n4nd0 | e.g., I store sth trained with python and load it from octave | 01:38 |
blackburn | should work | 01:39 |
blackburn | if you did not use pickle or so | 01:39 |
n4nd0 | don't really know what pickle is | 01:39 |
blackburn | pickle is a serialization package for python | 01:40 |
blackburn | that enables saving/loading objects | 01:40 |
n4nd0 | ah ok | 01:40 |
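The pickle round-trip described above, in its simplest form. The dict below is just a stand-in for a trained model object:

```python
# Serialize an object to bytes and restore it with Python's pickle module.
import pickle

model = {"weights": [0.1, 0.2], "bias": -0.3}  # stand-in for a trained model

blob = pickle.dumps(model)       # serialize to bytes (pickle.dump writes to a file)
restored = pickle.loads(blob)    # deserialize back into an equal object
print(restored == model)
```

Note the caveat raised in the chat: pickle is a Python-only format, so an object pickled from Python cannot be loaded from octave; cross-interface loading needs a language-neutral serialization format.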
blackburn | I am not sure there | 01:41 |
n4nd0 | so then there is nothing in shogun itself that enables serialization? | 01:41 |
blackburn | you could try | 01:41 |
blackburn | hmm that's pretty complex thing | 01:41 |
blackburn | let me try to describe | 01:41 |
blackburn | while we have swig interfaces | 01:41 |
blackburn | we have some internal serialization part | 01:42 |
blackburn | based on C++ | 01:42 |
blackburn | and 'external' related to concrete interface like python or ruby | 01:42 |
blackburn | so when pickle in python tries to save or load shogun object it uses that C++ serialization code | 01:43 |
n4nd0 | ok | 01:44 |
n4nd0 | then it might be possible, just wondering anyway :) | 01:44 |
blackburn | http://www.youtube.com/watch?feature=player_detailpage&v=EokaVrvZWBs | 01:47 |
n4nd0 | blackburn: about gsoc, I want to start taking a closer look to possible projects I could apply for | 01:48 |
n4nd0 | haha good summary of a day | 01:48 |
blackburn | my current life reminds me of that video: http://www.youtube.com/watch?feature=player_detailpage&v=kfchvCyHmsc | 01:49 |
n4nd0 | blackburn: I think it is good if I start some coding related to it | 01:50 |
blackburn | hmm | 01:50 |
blackburn | okay there would definitely be structured output learning | 01:50 |
blackburn | and gaussian processes | 01:50 |
blackburn | then one idea would be possibly related to ECOC and some label tree learning | 01:50 |
n4nd0 | are you planning to apply for one of those this year or will you continue with dimensionality reduction? | 01:51 |
blackburn | ah yes I am going to apply to multitask learning | 01:52 |
n4nd0 | I read today a bit about gaussian processes and structured output learning, both look really interesting | 01:53 |
blackburn | yeah promising edges | 01:54 |
n4nd0 | blackburn: so what do you think is a suitable approach to get into one of those? | 01:55 |
blackburn | I am not sure what you mean | 01:56 |
blackburn | :) | 01:56 |
n4nd0 | blackburn: study a current implementation of it, like this one for GPs and later try to port it? | 01:56 |
n4nd0 | http://mloss.org/software/view/118/ | 01:56 |
blackburn | hmm I would suggest you take a look at scikits' GPs | 01:56 |
blackburn | I believe it would be more readable | 01:57 |
n4nd0 | all right, thanks :) | 01:57 |
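For reference, a tiny Gaussian process regression example. The chat refers to an earlier "scikits" GP module; the class below is the current scikit-learn equivalent, and the 1-D sine data is just an illustrative toy problem:

```python
# Fit a Gaussian process regressor to noiseless 1-D data and query the
# posterior mean and standard deviation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.linspace(0, 5, 20)[:, None]   # 20 training inputs
y = np.sin(X).ravel()                # noiseless targets

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)
mean, std = gp.predict(X, return_std=True)   # posterior mean and uncertainty
print(float(np.abs(mean - y).max()))         # near-zero at the training points
```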
n4nd0 | blackburn: so I meant, for example: I start a documentation phase, and once coding can start I just say around here that I am working on that, and that's all? | 01:58 |
blackburn | not really | 01:58 |
blackburn | are you about gsoc applying part? | 01:59 |
n4nd0 | right now I was asking about collaboration before gsoc | 01:59 |
blackburn | well it is not necessary to start right now | 02:00 |
n4nd0 | I guess that if I start with this during the next days I will be able to start working on it before gsoc has started | 02:00 |
blackburn | you don't have to hurry | 02:00 |
n4nd0 | aha ok | 02:00 |
blackburn | if you finish things in may what would you do during summer? :) | 02:01 |
n4nd0 | haha I see | 02:01 |
n4nd0 | I assumed that could give better chances for the application to succeed | 02:02 |
blackburn | sure but you may do some other things for now ;) | 02:02 |
blackburn | for example you'd be very welcome to integrate your face recognition things | 02:02 |
blackburn | we lack good -real- examples | 02:03 |
n4nd0 | ok :) | 02:03 |
n4nd0 | that's good then, I will continue with that and try to make a nice example :) | 02:04 |
n4nd0 | some doc about GPs | 02:04 |
blackburn | I find it pretty useful to implement things by myself in some scripting lang | 02:04 |
blackburn | and then port to C++ | 02:04 |
blackburn | so if you are hungry for GPs it would be useful to implement them in octave/matlab/python/etc | 02:05 |
blackburn | I believe it would take less than a week to port things then | 02:05 |
blackburn | :) | 02:05 |
n4nd0 | good | 02:06 |
n4nd0 | I have to get some sleep now | 02:07 |
blackburn | and staying in touch increases your chances significantly :) | 02:07 |
n4nd0 | blackburn: :) | 02:07 |
blackburn | there are some guys promising some things to implement | 02:07 |
blackburn | they appear and disappear for weeks | 02:07 |
n4nd0 | aham | 02:08 |
blackburn | not really good practice for me :) | 02:08 |
n4nd0 | well, thank you for the suggestions | 02:08 |
blackburn | you should talk to Soeren as well | 02:09 |
blackburn | so he gets to know you | 02:09 |
blackburn | I am not the boss :) | 02:09 |
blackburn | okay sleep well then :) | 02:09 |
n4nd0 | haha ok | 02:09 |
n4nd0 | good night | 02:09 |
blackburn | good night | 02:09 |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Quit: leaving] | 02:09 | |
-!- naywhayare [~ryan@spoon.lugatgt.org] has joined #shogun | 03:40 | |
-!- blackburn [~qdrgsm@188.168.4.152] has left #shogun [] | 03:47 | |
-!- dfrx [~f-x@inet-hqmc01-o.oracle.com] has joined #shogun | 04:16 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 05:17 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 08:10 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 09:49 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 09:49 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 240 seconds] | 09:53 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 10:00 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 10:47 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 10:59 | |
-!- dfrx [~f-x@inet-hqmc01-o.oracle.com] has quit [Quit: Leaving.] | 11:57 | |
-!- blackburn [5bdfb203@gateway/web/freenode/ip.91.223.178.3] has joined #shogun | 14:04 | |
blackburn | wiking: JS was a little better than HIK ;) | 14:05 |
wiking | hehehe | 14:05 |
wiking | cool | 14:05 |
wiking | it took this long? | 14:05 |
blackburn | wiking: not really, just recalled | 14:06 |
blackburn | but anyway long, yes | 14:06 |
blackburn | 30K seconds for 2000 vectors | 14:07 |
blackburn | libsvm OvO | 14:07 |
blackburn | not a real-time system lol | 14:07 |
wiking | :>> | 14:09 |
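The histogram intersection kernel (HIK) compared above can be sketched by feeding a precomputed Gram matrix to a libsvm-backed SVM. This uses scikit-learn's precomputed-kernel interface on toy histogram data; the Jensen-Shannon kernel blackburn compared against would slot in the same way.

```python
# Histogram intersection kernel, K[i, j] = sum_k min(A[i, k], B[j, k]),
# plugged into libsvm via a precomputed Gram matrix.
import numpy as np
from sklearn.svm import SVC

def hik(A, B):
    """Histogram intersection kernel between rows of A and rows of B."""
    return np.minimum(A[:, None, :], B[None, :, :]).sum(axis=-1)

rng = np.random.default_rng(0)
X = rng.random((60, 16))                 # 60 toy histograms with 16 bins
y = (X[:, 0] > 0.5).astype(int)

K = hik(X, X)                            # symmetric Gram matrix
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(hik(X, X), y))           # training accuracy
```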
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun | 14:14 | |
blackburn | sonne|work: ???-??-??-???! | 14:42 |
blackburn | unbelievable %$%$ | 14:47 |
wiking | :> | 15:34 |
sonne|work | blackburn: ? | 15:38 |
blackburn | sonne|work: you don't answer mails and it is impossible to catch you! | 15:38 |
blackburn | looks like I've been pinging you for the last month :D | 15:40 |
blackburn | wiking: are you willing to integrate this homogeneous kernel map from vlfeat? | 15:40 |
wiking | blackburn: yep | 15:41 |
blackburn | so I shouldn't, right? | 15:41 |
blackburn | just checking | 15:41 |
wiking | blackburn: i'm just finishing up some other code, but if u guys say that you are willing to do the pull then i'll do it this week | 15:41 |
wiking | i guess it should go within the preprocessing | 15:41 |
blackburn | I'm not really in hurry with it | 15:42 |
wiking | i mean preprocessor | 15:42 |
blackburn | hmm | 15:42 |
blackburn | yes, looks like | 15:42 |
wiking | ok great | 15:42 |
blackburn | could be converter as well.. | 15:42 |
blackburn | but i guess preprocessor fits better | 15:42 |
wiking | ok | 15:42 |
blackburn | btw we already have similar thing here | 15:43 |
blackburn | random gaussian fourier blabla | 15:43 |
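The two explicit feature maps discussed above can be sketched with scikit-learn's kernel approximation module: `AdditiveChi2Sampler` approximates an additive chi2 kernel (in the same spirit as vlfeat's homogeneous kernel map), and `RBFSampler` is the random Fourier feature map for the Gaussian kernel ("random gaussian fourier"). Both turn a kernel SVM problem into a linear one; the data here is an illustrative stand-in for histogram features.

```python
# Explicit feature maps that approximate additive chi2 and Gaussian kernels.
import numpy as np
from sklearn.kernel_approximation import AdditiveChi2Sampler, RBFSampler

rng = np.random.default_rng(0)
X = rng.random((50, 16))                  # nonnegative histogram-like features

chi2_map = AdditiveChi2Sampler(sample_steps=2).fit(X)
X_chi2 = chi2_map.transform(X)            # expanded feature space for chi2

rbf_map = RBFSampler(n_components=100, random_state=0).fit(X)
X_rbf = rbf_map.transform(X)              # 100 random Fourier features

print(X_chi2.shape, X_rbf.shape)
```

A linear classifier trained on `X_chi2` or `X_rbf` then approximates the corresponding kernel SVM at a fraction of the cost, which is the point of preferring such a map over the 30K-second exact-kernel run mentioned earlier.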
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 260 seconds] | 16:21 | |
CIA-18 | shogun: Soeren Sonnenburg master * r0338587 / examples/undocumented/python_modular/regression_linear_ridge_modular.py : remove unused gaussian kernel from example - http://git.io/Sf2o5w | 16:23 |
-!- blackburn [5bdfb203@gateway/web/freenode/ip.91.223.178.3] has quit [Quit: Page closed] | 16:38 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 245 seconds] | 17:33 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 18:13 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Remote host closed the connection] | 19:33 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 19:33 | |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun | 22:36 | |
--- Log closed Thu Feb 16 00:00:19 2012 |