IRC logs of #shogun for Sunday, 2012-02-12

--- Log opened Sun Feb 12 00:00:19 2012
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Quit: Leaving]01:13
-!- rj-code [~rj-code@bb220-255-82-222.singnet.com.sg] has joined #shogun03:49
-!- rj-code [~rj-code@bb220-255-82-222.singnet.com.sg] has quit [Client Quit]03:49
-!- Netsplit *.net <-> *.split quits: wiking05:34
-!- Netsplit over, joins: wiking05:40
-!- Ram108 [~amma@14.99.255.99] has joined #shogun06:29
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]07:39
-!- Ram108 [~amma@14.99.255.99] has quit [Ping timeout: 240 seconds]07:58
-!- Ram108 [~amma@14.99.248.44] has joined #shogun08:18
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun09:04
-!- Ram108 [~amma@14.99.248.44] has quit [Ping timeout: 248 seconds]11:04
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]11:28
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun11:41
-!- Ram108 [~amma@14.99.60.150] has joined #shogun11:54
-!- blackburn [~qdrgsm@188.168.5.246] has joined #shogun11:54
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun13:09
-!- Ram108 [~amma@14.99.60.150] has quit [Ping timeout: 265 seconds]13:35
<n4nd0> blackburn, hey!  13:47
<blackburn> hey  13:47
<n4nd0> I am training the LPBoost classifier on the faces and non-faces training set  13:48
<n4nd0> but I am only able to train it with 500 training examples :-O  13:48
<blackburn> because of CPLEX's limitations?  13:48
<n4nd0> I am not completely sure  13:49
<n4nd0> but that is the most reasonable explanation, since the documentation says this:  13:49
<n4nd0> "The CPLEX Optimizers will solve problems up to 500 variables and 500 constraints."  13:49
<n4nd0> I don't know if it is the CPLEX solver itself or because the version I downloaded is a trial one  13:50
<blackburn> why don't you simply use some SVM?  13:50
<n4nd0> do you think that a single SVM can handle the problem?  13:51
<blackburn> what do you mean, single?  13:51
<n4nd0> one SVM  13:51
<blackburn> hmm, why not?  13:51
<n4nd0> I think the idea of using boosting for face detection was that the problem is too hard for just one classifier  13:52
<blackburn> sure, but you may use some kernel  13:52
<n4nd0> ok  13:52
<n4nd0> I can look on the web to see whether there is previous work on face detection with SVMs :)  13:52
<blackburn> I am not sure, but you may try  13:53
<n4nd0> sure  13:53
<blackburn> is it a multiclass problem?  13:53
<n4nd0> no  13:53
<blackburn> face/not face?  13:53
<n4nd0> just two classes, face or non-face  13:53
<n4nd0> exactly  13:53
<blackburn> hmm, then an SVM should work well too  13:53
<n4nd0> I will check  13:53
<n4nd0> it seems that it works with SVMs too  13:57
<n4nd0> some people have done it with cascaded SVMs, but I guess that is a similar approach to boosting  13:58
<blackburn> btw what features do you use?  14:02
<blackburn> some Haar-like ones?  14:02
<n4nd0> yes  14:04
<n4nd0> I have found an article where they use something called a variance-based Haar-like feature  14:04
<n4nd0> I am trying to get it and look at it in more depth; the abstract looks like it could work for what I want  14:05
<blackburn> n4nd0: what is the size of the images in pixels? :)  14:08
<blackburn> there is one simple classifier in shogun  14:09
<blackburn> class ConjugateIndex  14:09
<blackburn> could you try it on pixels? :)  14:09
<blackburn> I am just curious  14:09
<n4nd0> the training images are 19x19  14:12
<n4nd0> what do you mean by trying it on pixels?  14:12
<blackburn> just the 19x19 pixel-value vector  14:12
<blackburn> brightness  14:12
<n4nd0> ok  14:15
<n4nd0> do you think that has any chance of working :P?  14:15
<blackburn> yeah, maybe 60-70% accuracy :)  14:16
<n4nd0> I will take a look at it ;-)  14:18
<n4nd0> I will first take a look at that paper ... if I manage to get hold of it!  14:18
<blackburn> which?  14:18
<n4nd0> Face Detection using Variance based Haar-Like feature and SVM  14:20
<n4nd0> by Cuong Nguyen Khac, Ju H Park, Ho-youl Jung  14:20
<blackburn> ah I see  14:21
<blackburn> I would rather suggest trying things first :)  14:21
<n4nd0> with the ConjugateIndex, you mean?  14:21
<blackburn> with any of the classifiers  14:21
<n4nd0> ok  14:21
<blackburn> it is pretty easy to change  14:22
<n4nd0> cool  14:22
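
A side note on blackburn's ConjugateIndex suggestion: a rough sketch of what trying it on raw pixel vectors could look like in the modular Python interface of that time is below. The ConjugateIndex(features, labels) constructor, the 0/1 label encoding, and the random stand-in data are assumptions, not something confirmed in this conversation.

    # Rough sketch: ConjugateIndex on raw 19x19 pixel vectors (see the assumptions noted above).
    import numpy as np
    from shogun.Features import RealFeatures, Labels
    from shogun.Classifier import ConjugateIndex

    num_train = 200
    X_train = np.random.rand(361, num_train)                         # 361 = 19*19 pixels, one column per image
    y_train = np.random.randint(0, 2, num_train).astype(np.float64)  # assumed class indices: 0 = non-face, 1 = face

    features = RealFeatures(X_train)
    labels = Labels(y_train)

    classifier = ConjugateIndex(features, labels)   # assumed constructor signature
    classifier.train()

    predictions = classifier.apply(RealFeatures(np.random.rand(361, 50)))
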
<blackburn> okay, will be back later  14:25
-!- blackburn [~qdrgsm@188.168.5.246] has quit [Quit: Leaving.]14:25
-!- wiking [~wiking@huwico/staff/wiking] has quit [Remote host closed the connection]14:46
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun14:47
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 245 seconds]15:35
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun15:47
-!- Ram108 [~amma@14.99.1.86] has joined #shogun15:53
-!- Ram108 [~amma@14.99.1.86] has quit [Ping timeout: 260 seconds]17:57
-!- blackburn [~qdrgsm@188.168.5.125] has joined #shogun18:11
<blackburn> n4nd0: what's up?  18:12
<n4nd0> blackburn, hi!  18:43
<n4nd0> blackburn, I have tried an SVM, libsvm_oneclass, with different sets of parameters  18:44
<blackburn> and how was it?  18:44
<n4nd0> blackburn, using pixel values directly  18:44
<n4nd0> the training error is still high, so not really good  18:44
<n4nd0> but I have taken a look at the paper I told you about and read about the features they used for training the SVM  18:45
<n4nd0> I am going to try those to see how they work out  18:45
<blackburn> could you try ConjugateIndex?  18:45
<blackburn> :)  18:45
<blackburn> I'm curious  18:46
<n4nd0> they are pretty similar to the Haar ones I already have  18:46
<n4nd0> I tried to check it but I have not identified it in the matlab/octave static interface  18:46
<n4nd0> does it have a short name?  18:46
<blackburn> what interface do you use?  18:47
<n4nd0> I am with static matlab  18:47
<blackburn> ah, then no  18:47
<blackburn> it seems you don't like easy ways, right? :)  18:47
<n4nd0> haha, why?  18:47
<blackburn> well, the modular things are much more convenient for me  18:48
<n4nd0> I must try them  18:51
<n4nd0> I don't know why I started with this one  18:51
<blackburn> well, there is no matlab modular interface  18:52
<n4nd0> what is the main difference between the two of them?  18:52
<blackburn> hmm  18:52
<blackburn> the static interface allows only one classifier and two feature sets at a time  18:52
<blackburn> with modular you can create different classifiers as objects  18:53
<blackburn> just  18:53
<blackburn> classifier = LibSVM(C, kernel, labels)  18:53
<blackburn> classifier.train()  18:53
<blackburn> predicted = classifier.apply(test_features)  18:53
<n4nd0> cool  18:53
<n4nd0> I would actually like to start working with modular in Python  18:54
<blackburn> sure, I prefer Python too  18:54
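
blackburn's three lines above are only pseudocode; a fuller sketch of the modular Python workflow from that era might look like the following. The Gaussian kernel, its width, the C value, and the random stand-in data are illustrative assumptions rather than anything from the conversation, and the import paths reflect the pre-2.x modular interface.

    # Hedged expansion of the LibSVM pseudocode above, modular Python interface circa 2012.
    import numpy as np
    from shogun.Features import RealFeatures, Labels
    from shogun.Kernel import GaussianKernel
    from shogun.Classifier import LibSVM

    # toy data: 361 = 19*19 pixel values per column, one column per example
    X_train = np.random.rand(361, 100)
    y_train = np.where(np.random.rand(100) > 0.5, 1.0, -1.0)   # +1 face, -1 non-face

    feats_train = RealFeatures(X_train)
    labels_train = Labels(y_train)

    kernel = GaussianKernel(feats_train, feats_train, 10.0)    # kernel width is a guess
    svm = LibSVM(1.0, kernel, labels_train)                    # C=1.0 is a guess
    svm.train()

    feats_test = RealFeatures(np.random.rand(361, 20))
    predicted = svm.apply(feats_test).get_labels()             # real-valued SVM outputs
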
<n4nd0> but I would like to re-use the matlab code I have for the Haar features  18:54
<blackburn> you may save a .mat file  18:54
<blackburn> and load it in Python  18:55
<n4nd0> I save .mat files right now  18:55
<n4nd0> can they be used directly from Python?  18:55
<blackburn> yes  18:55
<blackburn> not directly, but they can be loaded  18:55
<blackburn> import scipy.io  18:56
<blackburn> faces = scipy.io.loadmat(data_file)['faces']  18:56
<blackburn> for example  18:56
<n4nd0> it looks good then  18:56
<n4nd0> I am going to try it right now  18:56
<blackburn> once you've loaded the matrix  18:57
<blackburn> you have to create features  18:57
<blackburn> train_features = RealFeatures(some_train_features_matrix)  18:58
<blackburn> train_labels = Labels(some_train_labels)  18:58
<n4nd0> I could load it from Python  19:02
<n4nd0> nice  19:02
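
Putting blackburn's two snippets together, a minimal sketch of the .mat-to-Shogun path might look like this; the file name and the 'faces'/'labels' variable names are placeholders for whatever the MATLAB code actually saves.

    # Hedged sketch: load MATLAB-exported data and wrap it for the modular Python interface.
    # 'train_data.mat', 'faces', and 'labels' are placeholder names, not from the conversation.
    import scipy.io
    from shogun.Features import RealFeatures, Labels

    data = scipy.io.loadmat('train_data.mat')
    faces = data['faces'].astype('float64')         # expected shape: num_features x num_examples
    y = data['labels'].ravel().astype('float64')    # entries in {-1, +1}

    train_features = RealFeatures(faces)
    train_labels = Labels(y)
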
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 252 seconds]20:25
<blackburn> sonney2k: ping  21:20
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun21:30
-!- axitkhurana [d2d43a6f@gateway/web/freenode/ip.210.212.58.111] has joined #shogun22:08
<n4nd0> blackburn, hey!  22:25
<blackburn> hi  22:25
<blackburn> (again)  22:25
<blackburn> :)  22:25
<n4nd0> blackburn, one question  22:25
<n4nd0> yeah again :)  22:25
<n4nd0> in one of the examples on the web I have come across a class called  22:26
<n4nd0> PerformanceMeasures  22:26
<blackburn> yeah, it is not here anymore  22:26
<n4nd0> ok ... that answers my question then :P  22:26
<blackburn> if you want to measure something  22:26
<n4nd0> so one has to use the different classes, AccuracyMeasure  22:26
<blackburn> you can use AccuracyMeasure, ROCEvaluation, etc.  22:27
<blackburn> yes  22:27
<n4nd0> ROCEvaluation and the like  22:27
<n4nd0> ok  22:27
<blackburn> you speak just like me  22:27
<blackburn> :D  22:27
<n4nd0> haha  22:27
<n4nd0> from the example it seems to me that PerformanceMeasures was handier  22:27
<n4nd0> what was the problem with it?  22:28
<blackburn> could you please paste a link?  22:28
<n4nd0> http://www.shogun-toolbox.org/doc/en/current/modular_tutorial.html  22:28
<n4nd0> PerformanceMeasures is the exact name as it stands there  22:28
<blackburn> it was decomposed mainly because of model selection  22:29
<blackburn> there is a framework there where you can choose which measure to use  22:29
<blackburn> i.e. when you want to maximize auROC you select ROCEvaluation  22:29
<n4nd0> aha, I see  22:30
<blackburn> btw you can use  22:30
<blackburn> ContingencyTableEvaluation  22:30
<blackburn> and then  22:30
<blackburn> wait a min :)  22:30
<blackburn> yes  22:30
<n4nd0> checking Contingency right now  22:30
<blackburn> something like  22:30
<n4nd0> looks like a useful one :)  22:31
<blackburn> evaluator = ContingencyTableEvaluation()  22:31
<blackburn> evaluator.get_error_rate()  22:31
<blackburn> ah, a mistake, but you get the idea I guess  22:31
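
The step missing from blackburn's two lines above is the evaluate() call that actually compares predictions against the ground truth. A hedged completion is sketched below; the toy label values are stand-ins for real SVM outputs and test labels.

    # Hedged completion of the ContingencyTableEvaluation snippet above,
    # with toy Labels standing in for real predictions and ground truth.
    import numpy as np
    from shogun.Features import Labels
    from shogun.Evaluation import ContingencyTableEvaluation

    predicted    = Labels(np.array([ 1.0, -1.0,  1.0, -1.0]))
    ground_truth = Labels(np.array([ 1.0, -1.0, -1.0, -1.0]))

    evaluator = ContingencyTableEvaluation()
    evaluator.evaluate(predicted, ground_truth)   # predicted first, ground truth second
    print(evaluator.get_error_rate())
    print(evaluator.get_accuracy())
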
<n4nd0> ;)  22:32
<blackburn> but you are right, it was handy  22:32
<blackburn> I have to think about it  22:32
<n4nd0> btw  22:33
<blackburn> we don't have too many users yet, so we are able to change everything, hah  22:33
<n4nd0> I found it weird that ROCEvaluation complained when the output of the SVM was not precisely either -1 or 1  22:33
<blackburn> I guess you mixed things up  22:34
<n4nd0> while the others (AccuracyMeasure and ContingencyTableEvaluation) didn't  22:34
<n4nd0> mmm  22:34
<blackburn> predicted is first  22:34
<blackburn> and the ground truth (-1, 1) is second  22:34
<n4nd0> aha! that's it  22:35
<n4nd0> stupid mistake :P  22:35
<blackburn> not really  22:35
<blackburn> there should be a warning here or something  22:35
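
For the record, the argument order that caused the confusion above: the real-valued SVM outputs go in as the predicted labels, and the -1/+1 ground truth comes second. A minimal hedged sketch with toy stand-in values:

    # Argument order for ROCEvaluation, as clarified above: predicted outputs
    # first (real-valued is fine), ground truth (-1/+1) second.
    import numpy as np
    from shogun.Features import Labels
    from shogun.Evaluation import ROCEvaluation

    predicted    = Labels(np.array([ 2.3, -0.4,  0.7, -1.8]))   # raw SVM outputs
    ground_truth = Labels(np.array([ 1.0, -1.0, -1.0, -1.0]))   # true -1/+1 labels

    roc = ROCEvaluation()
    auc = roc.evaluate(predicted, ground_truth)   # returns the area under the ROC curve
    print(auc)
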
<blackburn> sleep time  22:37
<blackburn> n4nd0: see you later :)  22:37
<n4nd0> blackburn, good night  22:38
<blackburn> btw did you manage to test things with modular?  22:38
<n4nd0> blackburn, and thank you for your help :D  22:38
<n4nd0> yes, I am working with it in Python right now  22:38
<blackburn> no problem, you are a potential developer so you may help later ;)  22:38
<blackburn> okay, good night  22:40
-!- blackburn [~qdrgsm@188.168.5.125] has quit [Quit: Leaving.]22:40
--- Log closed Mon Feb 13 00:00:19 2012

Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!