--- Log opened Sun Feb 12 00:00:19 2012 | ||
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Quit: Leaving] | 01:13 | |
-!- rj-code [~rj-code@bb220-255-82-222.singnet.com.sg] has joined #shogun | 03:49 | |
-!- rj-code [~rj-code@bb220-255-82-222.singnet.com.sg] has quit [Client Quit] | 03:49 | |
-!- Netsplit *.net <-> *.split quits: wiking | 05:34 | |
-!- Netsplit over, joins: wiking | 05:40 | |
-!- Ram108 [~amma@14.99.255.99] has joined #shogun | 06:29 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 07:39 | |
-!- Ram108 [~amma@14.99.255.99] has quit [Ping timeout: 240 seconds] | 07:58 | |
-!- Ram108 [~amma@14.99.248.44] has joined #shogun | 08:18 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 09:04 | |
-!- Ram108 [~amma@14.99.248.44] has quit [Ping timeout: 248 seconds] | 11:04 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 11:28 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 11:41 | |
-!- Ram108 [~amma@14.99.60.150] has joined #shogun | 11:54 | |
-!- blackburn [~qdrgsm@188.168.5.246] has joined #shogun | 11:54 | |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun | 13:09 | |
-!- Ram108 [~amma@14.99.60.150] has quit [Ping timeout: 265 seconds] | 13:35 | |
n4nd0 | blackburn, hey! | 13:47 |
blackburn | hey | 13:47 |
n4nd0 | I am training the lpboost classifier for the faces and non-faces training set | 13:48 |
n4nd0 | but I am just able to train it with 500 train examples :-O | 13:48 |
blackburn | because of CPLEX's limitations? | 13:48 |
n4nd0 | I am not completely sure | 13:49 |
n4nd0 | but it is the most reasonable explanation, since there is this comment: | 13:49 |
n4nd0 | The CPLEX Optimizers will solve problems up to 500 variables and 500 constraints. | 13:49 |
n4nd0 | I don't know if it is the CPLEX solver itself or because the version I downloaded is a trial one | 13:50 |
blackburn | why don't u simply use some svm? | 13:50 |
n4nd0 | do you think that a single svm can handle the problem? | 13:51 |
blackburn | what do you mean single? | 13:51 |
n4nd0 | one svm | 13:51 |
blackburn | hmm why not? | 13:51 |
n4nd0 | I think that the idea of using boosting for face detection was that the problem was too hard for just one classifier | 13:52 |
blackburn | sure but you may use some kernel | 13:52 |
n4nd0 | ok | 13:52 |
n4nd0 | I can try to look on the web if there is some previous work on face detection with svm :) | 13:52 |
blackburn | I am not sure but you may try | 13:53 |
n4nd0 | sure | 13:53 |
blackburn | is it a multiclass problem? | 13:53 |
n4nd0 | no | 13:53 |
blackburn | face/not face? | 13:53 |
n4nd0 | just two classes, face or non face | 13:53 |
n4nd0 | exactly | 13:53 |
blackburn | hmm then svm should work well too | 13:53 |
n4nd0 | I will check | 13:53 |
n4nd0 | it seems that it works with svms too | 13:57 |
n4nd0 | some people have done it with cascaded svms, but I guess that is a similar approach to boosting | 13:58 |
blackburn | btw what features do you use? | 14:02 |
blackburn | some haar like? | 14:02 |
n4nd0 | yes | 14:04 |
n4nd0 | I have found an article where they use something called a variance-based Haar-like feature | 14:04 |
n4nd0 | I am trying to get it and look at it more in depth, the abstract looks like it can work for what I want | 14:05 |
blackburn | n4nd0: what is the size of images in pixels? :) | 14:08 |
blackburn | there is one simple classifier in shogun | 14:09 |
blackburn | class ConjugateIndex | 14:09 |
blackburn | could you try it on pixels? :) | 14:09 |
blackburn | I am just curious | 14:09 |
n4nd0 | the training images are 19x19 | 14:12 |
n4nd0 | what do you mean to try it on pixels? | 14:12 |
blackburn | just 19x19 pixels value vector | 14:12 |
blackburn | brightness | 14:12 |
n4nd0 | ok | 14:15 |
n4nd0 | do you think that has any chance of working :P? | 14:15 |
blackburn | yeah may be 60-70% accuracy :) | 14:16 |
n4nd0 | I will take a look at it ;-) | 14:18 |
n4nd0 | I will first take a look at that paper ... if I manage to get hold of it! | 14:18 |
blackburn | which? | 14:18 |
n4nd0 | Face Detection using Variance based Haar-Like feature and SVM | 14:20 |
n4nd0 | by Cuong Nguyen Khac, Ju H Park, Ho-youl Jung | 14:20 |
blackburn | ah I see | 14:21 |
blackburn | I would better suggest to try first :) | 14:21 |
n4nd0 | with the ConjugateIndex you mean? | 14:21 |
blackburn | with any of classifiers | 14:21 |
n4nd0 | ok | 14:21 |
blackburn | it is pretty easy to change | 14:22 |
n4nd0 | cool | 14:22 |
blackburn | okay will be back later | 14:25 |
-!- blackburn [~qdrgsm@188.168.5.246] has quit [Quit: Leaving.] | 14:25 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Remote host closed the connection] | 14:46 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 14:47 | |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 245 seconds] | 15:35 | |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun | 15:47 | |
-!- Ram108 [~amma@14.99.1.86] has joined #shogun | 15:53 | |
-!- Ram108 [~amma@14.99.1.86] has quit [Ping timeout: 260 seconds] | 17:57 | |
-!- blackburn [~qdrgsm@188.168.5.125] has joined #shogun | 18:11 | |
blackburn | n4nd0: what's up? | 18:12 |
n4nd0 | blackburn, hi! | 18:43 |
n4nd0 | blackburn, I have tried an svm, libsvm_oneclass, with different sets of parameters | 18:44 |
blackburn | and how was it? | 18:44 |
n4nd0 | blackburn, using pixel values directly | 18:44 |
n4nd0 | the training error is still high so not really good | 18:44 |
n4nd0 | but I have taken a look at the paper I told you about and read about the features they used for training the svm | 18:45 |
n4nd0 | I am going to try with those to see how they work out | 18:45 |
blackburn | could you try ConjugateIndex? | 18:45 |
blackburn | :) | 18:45 |
blackburn | I'm curious | 18:46 |
n4nd0 | they are pretty similar to the Haar ones I already have | 18:46 |
n4nd0 | I tried to check it but I have not identified it in the matlab/octave static interface | 18:46 |
n4nd0 | does it have a short name? | 18:46 |
blackburn | what interface do you use? | 18:47 |
n4nd0 | I am with static matlab | 18:47 |
blackburn | oh, then no | 18:47 |
blackburn | it seems you don't like easy ways, right? :) | 18:47 |
n4nd0 | haha why? | 18:47 |
blackburn | well, modular things are much more convenient for me | 18:48 |
n4nd0 | I must try them | 18:51 |
n4nd0 | I don't know why I started with this one | 18:51 |
blackburn | well there is no matlab modular | 18:52 |
n4nd0 | what is the main difference between the two of them? | 18:52 |
blackburn | hmm | 18:52 |
blackburn | static allows only one classifier and two features | 18:52 |
blackburn | with modular you can create different classifiers as objects | 18:53 |
blackburn | just | 18:53 |
blackburn | classifier = LibSVM(C,kernel,labels) | 18:53 |
blackburn | classifier.train() | 18:53 |
blackburn | predicted = classifier.apply(test_features) | 18:53 |
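The three-line pattern above is the heart of the modular interface: build a classifier object, train it, apply it to test features. As a hedged illustration of that same object pattern (this is not Shogun code; `NearestCentroid` is a toy stand-in written for this example, using only numpy):

```python
import numpy as np

class NearestCentroid:
    """Toy classifier illustrating the modular train/apply object pattern."""
    def __init__(self, features, labels):
        self.features = features  # shape (n_samples, n_dims)
        self.labels = labels      # values in {-1, +1}
        self.centroids = {}

    def train(self):
        # One centroid per class: the mean of that class's feature vectors.
        for c in (-1, 1):
            self.centroids[c] = self.features[self.labels == c].mean(axis=0)

    def apply(self, test_features):
        # Predict the class whose centroid is nearest to each test vector.
        preds = [min(self.centroids,
                     key=lambda c: np.linalg.norm(x - self.centroids[c]))
                 for x in test_features]
        return np.array(preds)

train_x = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
train_y = np.array([-1, -1, 1, 1])
classifier = NearestCentroid(train_x, train_y)
classifier.train()
predicted = classifier.apply(np.array([[0.1, 0.0], [1.0, 0.9]]))
print(predicted)  # -> [-1  1]
```

Swapping in a different classifier then only means changing the constructor line, which is the convenience being described here.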
n4nd0 | cool | 18:53 |
n4nd0 | I would actually like to start working with modular in python | 18:54 |
blackburn | sure, I prefer python too | 18:54 |
n4nd0 | but I would like to re-use the matlab code I have for the haar features | 18:54 |
blackburn | you may save .mat | 18:54 |
blackburn | and load it in python | 18:55 |
n4nd0 | I save to .mat right now | 18:55 |
n4nd0 | can they be used directly from python? | 18:55 |
blackburn | yes | 18:55 |
blackburn | not directly but can be loaded | 18:55 |
blackburn | import scipy.io | 18:56 |
blackburn | faces = scipy.io.loadmat(data_file)['faces'] | 18:56 |
blackburn | example | 18:56 |
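A minimal round-trip sketch of the loadmat idea above (the file and variable names here are made up for the example; in practice the .mat file would come from the MATLAB code):

```python
import os
import tempfile

import numpy as np
import scipy.io

# Save a matrix to a .mat file, as MATLAB's `save` would.
faces_matrix = np.arange(12.0).reshape(3, 4)
path = os.path.join(tempfile.mkdtemp(), 'faces.mat')
scipy.io.savemat(path, {'faces': faces_matrix})

# loadmat returns a dict keyed by the MATLAB variable names.
faces = scipy.io.loadmat(path)['faces']
print(faces.shape)  # -> (3, 4)
```

One caveat worth knowing: loadmat returns arrays that are at least 2-D, so a MATLAB row vector comes back with shape (1, n) and may need a reshape before use.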
n4nd0 | it looks good then | 18:56 |
n4nd0 | I am going to try it right now | 18:56 |
blackburn | once you've loaded matrix | 18:57 |
blackburn | you have to create features | 18:57 |
blackburn | train_features = RealFeatures(some_train_features_matrix) | 18:58 |
blackburn | train_labels = Labels(some_train_labels) | 18:58 |
n4nd0 | I could load it from Python | 19:02 |
n4nd0 | nice | 19:02 |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 252 seconds] | 20:25 | |
blackburn | sonney2k: ping | 21:20 |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun | 21:30 | |
-!- axitkhurana [d2d43a6f@gateway/web/freenode/ip.210.212.58.111] has joined #shogun | 22:08 | |
n4nd0 | blackburn, hey! | 22:25 |
blackburn | hi | 22:25 |
blackburn | (again) | 22:25 |
blackburn | :) | 22:25 |
n4nd0 | blackburn, one question | 22:25 |
n4nd0 | yeah again :) | 22:25 |
n4nd0 | I have come across, in one of the examples on the web, a class called | 22:26 |
n4nd0 | PerformanceMeasure | 22:26 |
blackburn | yeah it is not here anymore | 22:26 |
n4nd0 | ok ... that answers my question then :P | 22:26 |
blackburn | if you want to measure something | 22:26 |
n4nd0 | so one has to use the different classes AccuracyMeasure | 22:26 |
blackburn | you can use AccuracyMeasure, ROCEvaluation, etc. | 22:27 |
blackburn | yes | 22:27 |
n4nd0 | ROCEvaluation and the like | 22:27 |
n4nd0 | ok | 22:27 |
blackburn | you speak just like me | 22:27 |
blackburn | :D | 22:27 |
n4nd0 | haha | 22:27 |
n4nd0 | from the example, it seems to me that PerformanceMeasures was handier | 22:27 |
n4nd0 | what was the problem with it? | 22:28 |
blackburn | could you please paste a link? | 22:28 |
n4nd0 | http://www.shogun-toolbox.org/doc/en/current/modular_tutorial.html | 22:28 |
n4nd0 | PerformanceMeasures is the exact name as it stands | 22:28 |
blackburn | it was decomposed mainly because of model selection | 22:29 |
blackburn | there is a framework there where you choose which measure to use | 22:29 |
blackburn | i.e. when you want to maximize auROC you select ROCEvaluation | 22:29 |
n4nd0 | aha I see | 22:30 |
blackburn | btw you can use | 22:30 |
blackburn | ContingencyTableEvaluation | 22:30 |
blackburn | and then | 22:30 |
blackburn | wait a min :) | 22:30 |
blackburn | yes | 22:30 |
n4nd0 | checking Contingency right now | 22:30 |
blackburn | something like | 22:30 |
n4nd0 | looks like a useful one :) | 22:31 |
blackburn | evaluator = ContingencyTableEvaluation() | 22:31 |
blackburn | evaluator.get_error_rate() | 22:31 |
blackburn | ah, a mistake, but you get it I guess | 22:31 |
n4nd0 | ;) | 22:32 |
blackburn | but you are right it was handy | 22:32 |
blackburn | I have to think about it | 22:32 |
n4nd0 | btw | 22:33 |
blackburn | we don't have too many users yet, so we are able to change everything hah | 22:33 |
n4nd0 | found it weird that ROCEvaluation complained when the output of the SVM was not precisely either -1 or 1 | 22:33 |
blackburn | I guess you mixed up things | 22:34 |
n4nd0 | while the others (AccuracyMeasure and ContingencyTableEvaluation) didn't | 22:34 |
n4nd0 | mmm | 22:34 |
blackburn | predicted is first | 22:34 |
blackburn | and ground truth (-1,1) is second | 22:34 |
n4nd0 | aha! that's it | 22:35 |
n4nd0 | stupid mistake :P | 22:35 |
blackburn | not really | 22:35 |
blackburn | there should be a warning there or something | 22:35 |
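The argument-order point matters because the two vectors play different roles: the first slot holds real-valued classifier outputs, the second holds -1/+1 ground truth. A plain-numpy sketch of the error rate a contingency-table evaluator derives (my own illustration, not Shogun code):

```python
import numpy as np

def error_rate(predicted, ground_truth):
    """Fraction of mismatches between sign(predicted) and true -1/+1 labels."""
    # Predicted outputs may be real-valued SVM scores, so threshold them
    # at zero; the ground truth is expected to be exactly -1 or +1.
    pred_signs = np.where(np.asarray(predicted) >= 0, 1, -1)
    truth = np.asarray(ground_truth)
    assert set(np.unique(truth)) <= {-1, 1}, "ground truth must be -1/+1"
    return float(np.mean(pred_signs != truth))

# Real-valued outputs in the first slot, -1/+1 ground truth in the second:
print(error_rate([0.8, -0.3, 0.1, -2.0], [1, -1, -1, 1]))  # -> 0.5
```

Swapping the arguments would put the real-valued scores in the ground-truth slot and trip the assertion, which mirrors the complaint ROCEvaluation raised above when its arguments were mixed up.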
blackburn | sleep time | 22:37 |
blackburn | n4nd0: see you later :) | 22:37 |
n4nd0 | blackburn, good night | 22:38 |
blackburn | btw did you manage to test things with modular? | 22:38 |
n4nd0 | blackburn, and thank you for your help :D | 22:38 |
n4nd0 | yes, I am right now with it in python | 22:38 |
blackburn | no problem you are a potential developer so you may help later ;) | 22:38 |
blackburn | okay good night | 22:40 |
-!- blackburn [~qdrgsm@188.168.5.125] has quit [Quit: Leaving.] | 22:40 | |
--- Log closed Mon Feb 13 00:00:19 2012 |
Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!