--- Log opened Sat Feb 18 00:00:19 2012
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun | 00:00 | |
n4nd0 | so is there any way to build a strong classifier with shogun where several "weak" ones are put together? | 00:07 |
n4nd0 | I am looking into it for an example application on face detection | 00:08 |
n4nd0 | I have made tests using fancy features but no good results using svm with a polynomial kernel | 00:09 |
@sonney2k | n4nd0, we don't have boosting if you mean that | 00:28 |
@sonney2k | try multiboost for that | 00:28 |
n4nd0 | sonney2k, I was looking for an alternative solution to boosting | 00:29 |
n4nd0 | I want to stick to shogun | 00:30 |
n4nd0 | sonney2k, about boosting, I have seen in the gsoc ideas page that one of the suggested projects last year was to merge shogun and multiboost | 00:33 |
@sonney2k | no longer there ... | 00:33 |
@sonney2k | too ambitious | 00:33 |
n4nd0 | aham | 00:33 |
n4nd0 | but would that still be of interest for shogun? | 00:34 |
@sonney2k | sure | 00:35 |
@sonney2k | alright ... sleep time! | 00:36 |
@sonney2k | cu | 00:36 |
n4nd0 | sonney2k, I have worked with adaboost in particular before | 00:36 |
n4nd0 | so I could start taking a look at multiboost and try to see how that could be ported to shogun | 00:36 |
n4nd0 | ok | 00:36 |
n4nd0 | good night | 00:36 |
-!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Quit: Leaving] | 00:51 | |
-!- CIA-64 [~CIA@cia.atheme.org] has joined #shogun | 03:00 | |
-!- Netsplit *.net <-> *.split quits: CIA-18 | 03:03 | |
-!- blackburn [~qdrgsm@109.226.88.39] has joined #shogun | 09:08 | |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 10:09 | |
CIA-64 | shogun: Sergey Lisitsyn master * rdd64345 / (5 files in 4 dirs): Fixed a couple of warnings - http://git.io/EduvuQ | 10:14 |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 10:44 | |
blackburn | wiking: have you tried the 1/2-homogeneous map? | 11:01 |
blackburn | vedaldi surprisingly reported an improvement with this | 11:01 |
wiking | what's 1/2-hkm? | 11:03 |
blackburn | wiking: the order of homogeneity | 11:05 |
blackburn | I mean they use 1/2 as order | 11:05 |
blackburn | and it smooths things | 11:05 |
blackburn | but I got worse results | 11:06 |
wiking | but how can he do 1/2 order | 11:09 |
wiking | when it's an int in their implementation | 11:09 |
wiking | :D | 11:09 |
blackburn | hmm | 11:10 |
blackburn | wait | 11:10 |
blackburn | not order | 11:10 |
blackburn | gamma | 11:10 |
blackburn | my bad | 11:11 |
blackburn | k(cx,cy) = c^\gamma k(x,y) | 11:11 |
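The homogeneity identity above, k(cx, cy) = c^γ k(x, y), can be checked numerically. A minimal sketch using a toy γ-homogeneous kernel (a powered min/intersection kernel, chosen only to make the scaling law concrete; this is not the Jensen-Shannon kernel under discussion, and positive-definiteness of the toy kernel is not claimed):

```python
import numpy as np

def k_gamma(x, y, gamma):
    # toy gamma-homogeneous kernel: min(x, y)**gamma satisfies
    # k(c*x, c*y) = min(c*x, c*y)**gamma = c**gamma * min(x, y)**gamma
    return np.minimum(x, y) ** gamma

x, y, c, gamma = 0.3, 0.8, 2.5, 0.5
lhs = k_gamma(c * x, c * y, gamma)
rhs = c ** gamma * k_gamma(x, y, gamma)
assert abs(lhs - rhs) < 1e-12  # the scaling identity holds
```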
blackburn | wiking: using 0.5 for JS improved by 1% | 11:20 |
blackburn | on my data | 11:20 |
blackburn | hog | 11:20 |
wiking | aha | 11:21 |
wiking | compared to original JS? | 11:21 |
wiking | or compared to 1.0 gamma ;P | 11:21 |
wiking | i'm gonna introduce a new kernel today/tomorrow | 11:21 |
wiking | and let's see if i can make that HKM as well | 11:21 |
blackburn | wiking: JS with gamma = 1.0 and JS with gamma = 0.5 | 11:22 |
blackburn | btw it is pretty dirty hack | 11:22 |
blackburn | for me | 11:22 |
blackburn | wiking: jenson-shannon-renyi-mozart? | 11:22 |
wiking | which? :) | 11:22 |
wiking | ahahah yepp | 11:23 |
wiking | :) | 11:23 |
blackburn | wiking: hmm what is the natural gamma for JS btw? | 11:23 |
wiking | none | 11:24 |
wiking | i mean this is basically like a Taylor series | 11:24 |
wiking | so it's just approximation | 11:24 |
wiking | so i guess gamma -> inf would get you to original JS | 11:24 |
blackburn | hmm | 11:25 |
wiking | Smaller value of @f$ \gamma @f$ enhance the kernel non-linearity and | 11:26 |
wiking | are sometimes beneficial in applications | 11:26 |
blackburn | I guess your suggestion about gamma -> inf wasn't right | 11:27 |
wiking | yep | 11:27 |
blackburn | gamma -> inf makes it LINEAR :) | 11:27 |
wiking | and it says that | 11:27 |
blackburn | or something like that | 11:27 |
blackburn | oh I don't like it | 11:27 |
blackburn | makes more params :( | 11:27 |
wiking | if gamma = 1 then u should obtain the standard kernels | 11:27 |
wiking | the order -> inf would do what i'm saying | 11:28 |
wiking | i mean yeah as it says here | 11:28 |
blackburn | I see | 11:28 |
wiking | The homogeneous kernel map | 11:28 |
wiking | approximation is based on periodicizing the kernel | 11:28 |
wiking | so basically you could see this | 11:28 |
wiking | or at least i would see this as sampling the signal | 11:29 |
wiking | and then u have there the Nyquist-Shannon sampling theorem | 11:29 |
wiking | and basically period is the sampling rate | 11:29 |
wiking | imho | 11:29 |
blackburn | pretty complex huh | 11:29 |
blackburn | but yeah I catch it | 11:30 |
blackburn | wiking: where are you doing your phd btw? | 11:42 |
wiking | gent, belgium | 11:46 |
wiking | yesterday i've found a fucking paper that partially implements that idea i had lately | 11:47 |
wiking | it'll be published next month | 11:47 |
wiking | fuckers... from stanford.. | 11:47 |
blackburn | hah that fucked up feeling | 11:47 |
blackburn | so you are a big traveller :) | 11:48 |
blackburn | wiking: will you have enough time for doing latent svms this summer? | 11:49 |
wiking | yep | 11:49 |
blackburn | nice | 11:49 |
wiking | i'll be in iceland for a month in july | 11:49 |
wiking | doing nothing | 11:49 |
blackburn | iceland?? | 11:49 |
wiking | yeah i was living quite around this world... | 11:49 |
blackburn | I wish I did the same haha | 11:50 |
wiking | nevertoolate | 11:50 |
blackburn | notenoughmoney | 11:50 |
blackburn | :D | 11:50 |
wiking | u think i had ? :P | 11:50 |
wiking | anyhow | 11:50 |
wiking | there's this thingy | 11:50 |
blackburn | will you live in a box? | 11:50 |
blackburn | :D | 11:50 |
wiking | http://www.pascal-network.org/?q=node/19 | 11:51 |
wiking | i think it's a good idea, and shogun could apply for it | 11:51 |
wiking | the only problem is that the program is running out and they don't know if there's going to be a last call for participation or not... | 11:51 |
wiking | they're going to decide it at the end of march | 11:51 |
wiking | and then with this you could travel | 11:52 |
wiking | ;) | 11:52 |
wiking | ok i'm off now | 11:52 |
blackburn | see you | 11:52 |
blackburn | nice program | 11:52 |
wiking | i'll be back sometime at night | 11:52 |
blackburn | but no idea how to apply it for shogun | 11:52 |
blackburn | ok | 11:52 |
wiking | it's easy | 11:52 |
blackburn | I mean we can hardly find 4-8 guys | 11:53 |
wiking | nono | 11:53 |
wiking | easy | 11:53 |
wiking | you are one | 11:53 |
wiking | :> | 11:53 |
wiking | anyhow i'll let you know if there is going to be a last call for participation | 11:53 |
blackburn | ok | 11:53 |
blackburn | e.g. Soeren can't participate in this program | 11:53 |
wiking | ttyl | 11:53 |
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] | 11:54 | |
blackburn | he is pretty busy with his job | 11:54 |
blackburn | cu | 11:54 |
blackburn | :) | 11:54 |
blackburn | sonne|work: http://www.pascal-network.org/?q=node/19 is it any applicable to shogun? | 11:59 |
blackburn | hm I forgot it is saturday | 13:46 |
blackburn | :D | 13:46 |
blackburn | sonney2k: ^ | 13:46 |
-!- Ram108 [~amma@14.96.26.93] has joined #shogun | 14:33 | |
Ram108 | blackburn: there? | 14:34 |
blackburn | Ram108: yes | 14:34 |
Ram108 | hi i think i figured out what went wrong......... | 14:35 |
Ram108 | well the weight vectors were initialised to large values and the error equation yielded values too small to change the weight vectors appropriately | 14:36 |
Ram108 | thanks for the help.......... | 14:36 |
blackburn | hah, did I help you anyhow? :) | 14:39 |
Ram108 | lol | 14:39 |
Ram108 | erm do u mind if i ask how long u have been in this field of work? | 14:39 |
Ram108 | i mean machine learning | 14:40 |
blackburn | hmm | 14:41 |
Ram108 | i am not able to grasp what all topics this field comprises | 14:41 |
blackburn | more than a year | 14:41 |
blackburn | may be 1.5 | 14:41 |
Ram108 | oh hmmm thanks :) | 14:41 |
Ram108 | well NN, SVM, fuzzy logic, Genetic algorithms | 14:42 |
Ram108 | is that about it? | 14:42 |
Ram108 | i mean do all the learning algorithms fall under one of these? | 14:42 |
blackburn | NN + SVM mainly | 14:42 |
blackburn | genetic stuff stands apart usually | 14:43 |
blackburn | more generally, evolutionary computing | 14:43 |
blackburn | fuzzy logic as well | 14:43 |
Ram108 | oh hmmm okay......... | 14:43 |
blackburn | but there are a lot of intersections everywhere | 14:43 |
Ram108 | ah........ | 14:43 |
blackburn | shogun contains svms mainly | 14:44 |
Ram108 | i can see that lol | 14:44 |
Ram108 | thats perhaps why i am not able to understand all the liblinear lib...... etc etc | 14:44 |
Ram108 | i ll read up on that...... | 14:45 |
Ram108 | by the way could u enlighten me more on what Mr sonney is doing? | 14:45 |
blackburn | I don't know, he does some research at tomtom | 14:46 |
blackburn | earlier he was related to bioinformatics | 14:47 |
Ram108 | oh hmmm i c....... and he devotes the rest of his spare time to building shogun? | 14:47 |
blackburn | yes | 14:49 |
Ram108 | ok :) last one....... hw old is shogun? | 14:49 |
blackburn | since 1999 | 14:50 |
Ram108 | thanks :) | 14:50 |
-!- Ram108 [~amma@14.96.26.93] has quit [Ping timeout: 240 seconds] | 14:55 | |
-!- Ram108 [~amma@14.96.172.24] has joined #shogun | 15:12 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun | 15:33 | |
-!- Ram108 [~amma@14.96.172.24] has quit [Ping timeout: 255 seconds] | 16:28 | |
-!- naywhaya1e [~ryan@spoon.lugatgt.org] has joined #shogun | 18:35 | |
-!- naywhayare [~ryan@spoon.lugatgt.org] has quit [Ping timeout: 240 seconds] | 18:37 | |
-!- blackburn [~qdrgsm@109.226.88.39] has quit [Ping timeout: 240 seconds] | 18:37 | |
-!- sonne|work [~sonnenbu@194.78.35.195] has quit [Ping timeout: 240 seconds] | 18:37 | |
-!- sonne|work [~sonnenbu@194.78.35.195] has joined #shogun | 18:37 | |
-!- blackburn [~qdrgsm@109.226.88.39] has joined #shogun | 18:37 | |
-!- Ram108 [~amma@14.99.168.133] has joined #shogun | 18:43 | |
blackburn | n4nd0: hey | 19:19 |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 252 seconds] | 19:19 | |
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun | 20:03 | |
-!- Ram108 [~amma@14.99.168.133] has quit [Quit: Ex-Chat] | 20:06 | |
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun | 21:16 | |
n4nd0 | sonney2k: hi! so about what we talked a bit yesterday of adding boosting to shogun | 21:28 |
n4nd0 | sonney2k: I would like to first start with adaboost, what do you think is a good way to start? | 21:29 |
wiking | blackburn: yo | 21:39 |
blackburn | wiking: hey | 21:39 |
wiking | wazza | 21:39 |
wiking | are we getting multiclass linear sum soon? | 21:40 |
wiking | *svm | 21:40 |
blackburn | heh | 21:40 |
blackburn | yes, I'm working on it | 21:40 |
blackburn | although not very fast | 21:40 |
wiking | ok cool | 21:40 |
wiking | that'd be great to see the speedup | 21:40 |
blackburn | yeah liblinear with Crammer-Singer should be fast and accurate | 21:41 |
n4nd0 | blackburn: do you know sth about multiboost and the gsoc project that was suggested last year porting it to shogun? | 21:42 |
blackburn | n4nd0: no, unfortunately no | 21:42 |
n4nd0 | blackburn: I started thinking of it since I don't see any other way of making the face detector work :( | 21:43 |
blackburn | do you need a face detector? | 21:43 |
n4nd0 | blackburn: I have tried with some fancy haar features for a single svm but no luck | 21:43 |
blackburn | did you get better results with boosting? | 21:43 |
n4nd0 | blackburn: well, not really ... it was more with the idea of making this example we talked about | 21:43 |
n4nd0 | blackburn: yes, with the implementation from scratch I did in matlab with boosting of weak classifiers it worked | 21:44 |
blackburn | what are the numbers? | 21:44 |
blackburn | if you really need boosting you may work on it, else it is okay to show bad results | 21:45 |
blackburn | bad results are results as well right? | 21:45 |
n4nd0 | haha yes they are | 21:45 |
n4nd0 | in any case it could be nice to try to get some of the algorithms in multiboost into shogun | 21:47 |
n4nd0 | blackburn: or do you think it might be too optimistic to do, let's say, in a couple of non-intense weeks? | 21:48 |
blackburn | n4nd0: Soeren said it is pretty ambitious | 21:49 |
blackburn | did he? | 21:49 |
n4nd0 | he referred to the gsoc project that was proposed last year, but yes he said that | 21:49 |
n4nd0 | I had in mind anyway to limit to adaboost firstly | 21:50 |
n4nd0 | these are the performance metrics of the classifier I get using a single svm on the CBCL database | 21:50 |
n4nd0 | >>>> accuracy = 0.932669322709 | 21:50 |
n4nd0 | >>>> precision = 0.985507246377 | 21:50 |
n4nd0 | >>>> error rate = 0.0673306772908 | 21:50 |
n4nd0 | >>>> recall = 0.28813559322 | 21:50 |
n4nd0 | >>>> ROC area = 0.907626093794 | 21:50 |
n4nd0 | recall is too low | 21:51 |
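For reference, the metrics quoted above relate to confusion-matrix counts in the usual way. A small sketch with made-up counts (the numbers below are hypothetical, chosen only to reproduce the same high-precision / low-recall pattern, not taken from the CBCL run):

```python
# hypothetical confusion-matrix counts for a face / non-face classifier
tp, fp, tn, fn = 30, 2, 900, 70

accuracy  = (tp + tn) / (tp + fp + tn + fn)
precision = tp / (tp + fp)   # high: few false alarms
recall    = tp / (tp + fn)   # low: most faces are missed
error     = 1.0 - accuracy

print(precision, recall)     # precision == 0.9375, recall == 0.3
```

A low recall with high precision typically means the decision threshold or class balance is skewed toward the negative class, which matches the symptoms discussed here.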
blackburn | what are results with your boosting? | 21:52 |
n4nd0 | I didn't try it with this database | 21:52 |
blackburn | ehmm how then can you compare :) | 21:53 |
n4nd0 | because we could use that classifier "for real", not in real time though | 21:54 |
n4nd0 | but it worked nicely with images | 21:54 |
n4nd0 | and, even if I have not tried the svm with real images, with a recall of about 0.288 I don't think it will work fine | 21:54 |
blackburn | I just wonder if recall is calculated correctly | 21:57 |
blackburn | hmm seems so | 21:59 |
blackburn | n4nd0: did you try kernels? | 21:59 |
n4nd0 | blackburn: I have used a polynomial one | 22:00 |
blackburn | gaussian? sigmoid? | 22:00 |
n4nd0 | blackburn: I have tried it with degree 2 and 3 | 22:00 |
n4nd0 | no, I have not tried other ones | 22:00 |
blackburn | why? | 22:00 |
n4nd0 | in the page where I got the CBCL face data I read they had used those kernels and not others | 22:01 |
blackburn | heh | 22:01 |
n4nd0 | I assumed those were the best for this application | 22:01 |
n4nd0 | but it might be that they improve | 22:02 |
n4nd0 | let me try | 22:02 |
blackburn | well gaussian works well usually | 22:02 |
blackburn | n4nd0: btw did you change C? | 22:02 |
blackburn | normalization? | 22:03 |
blackburn | these things are considerable steps | 22:03 |
n4nd0 | currently I am using a value of 1.0 for C | 22:04 |
n4nd0 | I tried changing it to bigger ones but it went worse | 22:04 |
n4nd0 | do you mean normalization for the training and test data? | 22:04 |
blackburn | yes, both should be normalized | 22:04 |
n4nd0 | so the images get zero mean and std 1? | 22:04 |
blackburn | I mean vectors of your features should have L2 norm = 1 for linear kernel | 22:05 |
n4nd0 | mmm | 22:05 |
n4nd0 | no | 22:05 |
blackburn | could be better | 22:05 |
n4nd0 | the features are not normalized :( | 22:05 |
blackburn | preprocessor = NormOne() | 22:06 |
blackburn | preprocessor.apply_to_feature_matrix(train_features) | 22:06 |
blackburn | preprocessor.apply_to_feature_matrix(test_features) | 22:06 |
blackburn | something like that | 22:06 |
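The same L2 normalization can be sketched without shogun; a minimal numpy version of what a NormOne-style preprocessor does (the function name here is made up for illustration, and features are stored one vector per column, shogun-style):

```python
import numpy as np

def norm_one(features):
    # scale every column (one feature vector per column) to unit L2 norm
    norms = np.linalg.norm(features, axis=0, keepdims=True)
    norms[norms == 0] = 1.0          # leave all-zero vectors untouched
    return features / norms

train = np.array([[3.0, 0.0],
                  [4.0, 2.0]])       # two 2-d vectors as columns
normed = norm_one(train)
# columns become [0.6, 0.8] and [0.0, 1.0], each with unit norm
```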
n4nd0 | ok, I will take a look at it | 22:06 |
n4nd0 | even if the features are the pixel of the images and the images are already normalized, do you think it will make a difference? | 22:07 |
n4nd0 | they are normalized as I told you, zero mean and standard deviation equal to one | 22:07 |
blackburn | I don't know | 22:08 |
blackburn | better try | 22:08 |
blackburn | :) | 22:08 |
n4nd0 | ok | 22:08 |
n4nd0 | blackburn: what is a good way to choose the parameters for the kernels, such as the width for the Gaussian? | 22:09 |
n4nd0 | cross-validation? | 22:09 |
blackburn | yeah | 22:09 |
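The cross-validation idea can be sketched as a plain grid search over candidate widths. This is a generic skeleton of the fold-splitting and selection logic only; `score_fn` stands in for training and evaluating the SVM at a given width, and is an assumption of this sketch:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    # shuffle sample indices 0..n-1 and split them into k validation folds
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n), k)

def select_width(widths, score_fn, n, k=5):
    # pick the width whose mean validation score over k folds is best
    folds = kfold_indices(n, k)
    best_width, best_score = None, -np.inf
    for w in widths:
        scores = []
        for i in range(k):
            val_idx = folds[i]
            train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
            scores.append(score_fn(w, train_idx, val_idx))
        mean = float(np.mean(scores))
        if mean > best_score:
            best_width, best_score = w, mean
    return best_width

# dummy scorer that peaks at width 20, just to exercise the skeleton
best = select_width([1, 5, 20, 100], lambda w, tr, va: -abs(w - 20), n=50)
# best == 20
```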
wiking | blackburn: hey man btw do you know viktor pelevin? he's one of my favorite contemporary writers... and i've just seen that they've made a movie out of one of his books http://www.imdb.com/title/tt0459748/ | 22:09 |
blackburn | wiking: yes, I've seen this movie :) | 22:10 |
wiking | is it good? | 22:10 |
blackburn | and have read a book as well | 22:10 |
blackburn | not bad :) | 22:10 |
wiking | unfortunately haven't read that book yet from him.. but all the latest ones... | 22:10 |
wiking | starting from the yellow arrow | 22:10 |
wiking | empire v | 22:10 |
blackburn | generation P is the only book I've read :) | 22:11 |
blackburn | from pelevin I mean | 22:11 |
blackburn | not generally :D | 22:11 |
wiking | and sacred book of the werewolf... and i don't know what was the english title for one playing in the big russian revolution time | 22:11 |
wiking | hehehe | 22:11 |
wiking | anyhow he's really cool | 22:11 |
wiking | i'm just getting this movie | 22:11 |
wiking | i'm just a bit scared of the subtitles | 22:12 |
blackburn | is it translated? | 22:12 |
blackburn | ah | 22:12 |
blackburn | hey you should understand russian a little :) | 22:12 |
wiking | heheh i do a little | 22:12 |
blackburn | wiking: there are a lot of really good artists | 22:13 |
wiking | it's funny when i was living in australia there were a lot of russians around me... i could understand 60-70% of what they were saying... but when i've tried serbian with them they couldn't understand a word... | 22:13 |
blackburn | epifantsev as tatarsky, efremov as azadovsky.. | 22:13 |
blackburn | have you been living in australia?? | 22:13 |
wiking | yeah | 22:13 |
wiking | for almost 2 years | 22:14 |
blackburn | damn is there any island where you weren't? | 22:14 |
wiking | never been to latin america nor africa and neither in asia | 22:14 |
blackburn | :D | 22:14 |
wiking | just been at bangkok airport 3 times :P | 22:14 |
wiking | i wanted to do the transiberian | 22:14 |
wiking | but that has been postponed... but hoping to do it once soon | 22:15 |
wiking | where are u living in russia atm? | 22:15 |
blackburn | you are welcome at samara/togliatti | 22:15 |
blackburn | :) | 22:15 |
wiking | aaah | 22:15 |
wiking | it's by the volga river right? | 22:15 |
blackburn | yeah | 22:15 |
wiking | cool | 22:15 |
wiking | how far is the star city from you? :) | 22:16 |
blackburn | the place 'lada' cars are being made | 22:16 |
blackburn | star? | 22:16 |
wiking | i mean cosmos city | 22:16 |
wiking | or what | 22:16 |
wiking | in kazakstan | 22:16 |
blackburn | ah baikonur? | 22:16 |
wiking | aaah yeah | 22:16 |
blackburn | pretty far | 22:16 |
blackburn | let me check | 22:16 |
wiking | fucking hell i'm tired not remembering the name | 22:16 |
wiking | oooh fuck | 22:17 |
wiking | it's that faaar | 22:17 |
wiking | i mean baikonur | 22:17 |
blackburn | ~1200 | 22:17 |
wiking | heheheh so the name for the car lada samara is coming from the name of the city ?:))) | 22:17 |
wiking | hahah didn't know that one :> | 22:18 |
blackburn | yes | 22:18 |
blackburn | VAZ is in togliatti | 22:18 |
wiking | that's a funny car :> | 22:18 |
blackburn | but the car was named after the city, right | 22:18 |
blackburn | sometimes it is called zubilo | 22:18 |
blackburn | http://upload.wikimedia.org/wikipedia/commons/thumb/f/fa/Lada_Samara.JPG/220px-Lada_Samara.JPG | 22:19 |
blackburn | http://upload.wikimedia.org/wikipedia/commons/thumb/5/50/ColdChisels.jpg/220px-ColdChisels.jpg | 22:19 |
blackburn | something similar right? | 22:19 |
wiking | yep yep those lada samaras | 22:19 |
wiking | i remember seeing them a lot | 22:19 |
blackburn | where? | 22:19 |
wiking | well all around eastern europe | 22:19 |
wiking | but since the communism stopped being... | 22:20 |
blackburn | btw now they do cars like that | 22:20 |
blackburn | http://wroom.ru/i/cars2/lada_priora_1.jpg | 22:20 |
wiking | woah | 22:20 |
wiking | high tech shit | 22:20 |
blackburn | not as ugly | 22:20 |
wiking | loved russian cars | 22:20 |
blackburn | as it was | 22:20 |
wiking | they were reliable | 22:20 |
wiking | and simple so easy to fix | 22:20 |
blackburn | not really but it is easy to fix it | 22:20 |
wiking | yeah i mean that's why | 22:20 |
blackburn | with kuvalda | 22:21 |
wiking | even if there was something wrong, somebody with a little knowledge could fix it | 22:21 |
wiking | uaz | 22:21 |
wiking | or what was that truck | 22:21 |
blackburn | uaz is made a little further north | 22:21 |
blackburn | in ulyanovsk | 22:22 |
wiking | ahhaha they were like survivor machines | 22:22 |
wiking | :) | 22:22 |
wiking | the uaz bus | 22:22 |
wiking | http://upload.wikimedia.org/wikipedia/commons/a/a1/UAZ-Bus.jpg | 22:22 |
blackburn | bus? I'm not sure | 22:22 |
wiking | this one!!! | 22:22 |
blackburn | ah | 22:22 |
blackburn | buhanka | 22:22 |
wiking | do u know this one? :) | 22:22 |
blackburn | this shit is fucking crazy | 22:22 |
wiking | that thing was going in any shit :))) | 22:22 |
blackburn | it is able to climb 75 degree mountain | 22:23 |
blackburn | :D | 22:23 |
wiking | yep | 22:23 |
wiking | amazing shit | 22:23 |
blackburn | I heard one story | 22:23 |
blackburn | a couple of guys went to hunt a little | 22:23 |
blackburn | here | 22:23 |
blackburn | they had land cruiser, etc | 22:23 |
blackburn | so they got stuck in snow | 22:23 |
blackburn | and there came man on this uaz | 22:24 |
blackburn | pulled their cars easily and told them to use better cars next time :D | 22:24 |
wiking | nothing can mess with an UAZ!!! | 22:24 |
wiking | yeah i totally believe | 22:24 |
blackburn | hah yeah crazy car | 22:24 |
wiking | that car is some amazing piece of mechanism | 22:25 |
blackburn | there are a lot of them still here | 22:25 |
wiking | but yeah i really like russian made stuff... i mean they are just bruteforce | 22:25 |
wiking | they were never nice and shinny stuff | 22:25 |
wiking | but for the purpose it was great :) | 22:25 |
blackburn | rather soviet | 22:25 |
wiking | hahahaha | 22:26 |
wiking | yeah | 22:26 |
wiking | i just remember the soviet era stuff | 22:26 |
wiking | dunno how is it now with russia :> | 22:26 |
wiking | how's IT doing it there? | 22:26 |
wiking | i mean it should be great in one way | 22:26 |
blackburn | IT? | 22:26 |
blackburn | inf. tech. stuff? | 22:26 |
wiking | as there's amazing coders coming from russia | 22:26 |
wiking | yeah inf.tech | 22:26 |
blackburn | well it is ok | 22:26 |
blackburn | for example we have a lot of outsourcing in samara | 22:27 |
blackburn | netcracker, epam, mercury | 22:27 |
blackburn | I work at netcracker btw | 22:27 |
wiking | heheheh i always wondered why people talk about outsourcing to india, when russian coders are way better imho | 22:27 |
blackburn | yeah I think so :) | 22:28 |
wiking | man when i was working with indians in nokia.... | 22:28 |
wiking | i really cannot explain | 22:28 |
blackburn | I can imagine haha | 22:28 |
wiking | if i could have i would release all their codes | 22:28 |
blackburn | in one book? | 22:28 |
blackburn | :D | 22:28 |
wiking | it's like 2 functions they've used for EVERYTHING | 22:28 |
blackburn | oracle_func_1 | 22:29 |
blackburn | oracle_func_2 | 22:29 |
wiking | exactly | 22:29 |
wiking | i remember cleaning up their shit | 22:29 |
blackburn | this function oracles everything | 22:29 |
blackburn | hah | 22:29 |
wiking | and the best is | 22:29 |
wiking | using 2 files | 22:29 |
wiking | .h and .cpp | 22:29 |
wiking | not more | 22:29 |
wiking | 2 | 22:29 |
blackburn | damn how old are you? | 22:29 |
wiking | and that was for a browser interface | 22:29 |
blackburn | you were at every country I know | 22:29 |
blackburn | :D | 22:29 |
blackburn | worked in every company? | 22:29 |
blackburn | :D | 22:29 |
wiking | ahahah too old man too old | 22:30 |
wiking | 30 | 22:30 |
blackburn | ohh it's getting clearer now | 22:30 |
wiking | hehehe yeah makes sense right? :P | 22:30 |
blackburn | hah yeah | 22:30 |
wiking | but yeah it is fucking crazy cleaning up a 10k+ lines file | 22:30 |
blackburn | I am 2/3 of your age hah | 22:30 |
wiking | hahahahahahahha | 22:30 |
wiking | little green grasshopper then | 22:31 |
blackburn | yeah | 22:31 |
n4nd0 | blackburn: wow the training with the Gaussian kernel is never ending :-O | 22:31 |
blackburn | n4nd0: should be a little slower | 22:31 |
blackburn | hmm.. | 22:31 |
blackburn | wiking: artist playing tatarsky in generation P had some crazy roles before | 22:34 |
blackburn | for example | 22:34 |
blackburn | http://www.youtube.com/watch?feature=player_detailpage&v=jhzpe7QxlGw | 22:34 |
wiking | hahahaha | 22:35 |
blackburn | lets test your russian | 22:35 |
blackburn | did you get title? :) | 22:35 |
wiking | what is ?????? | 22:36 |
wiking | ? | 22:36 |
blackburn | head | 22:36 |
wiking | ahhaha | 22:36 |
blackburn | TIDE or cutting off the head, something like that I guess | 22:36 |
wiking | yeah | 22:36 |
wiking | i didn't get ???? | 22:36 |
blackburn | just tide | 22:36 |
wiking | cutting was ok | 22:36 |
wiking | i got that one | 22:37 |
wiking | and of course ili | 22:37 |
wiking | :P | 22:37 |
wiking | aaah golovi | 22:37 |
blackburn | yeah | 22:37 |
wiking | it's glava on serbian | 22:37 |
wiking | i should have got it | 22:37 |
blackburn | 3:10 hah | 22:37 |
wiking | :) | 22:37 |
blackburn | hmm I just wonder what do you think about kosovo | 22:38 |
wiking | honestly | 22:38 |
blackburn | 3:46 is ok as well | 22:38 |
wiking | no opinion | 22:38 |
wiking | i mean it's a big mess | 22:38 |
wiking | it's just bad that people cannot agree on it in a normal manner | 22:38 |
wiking | but that's quite usual in balkans :P | 22:38 |
blackburn | as for me it was a great shame for my country to not help serbians there | 22:38 |
wiking | i mean that they cannot communicate in a normal manner | 22:39 |
blackburn | but may be I'm wrong | 22:39 |
wiking | well | 22:39 |
wiking | it's really not nice for the minorities there (kosovoar people) | 22:39 |
wiking | because of the past stories... oppression by the serbs etc | 22:39 |
wiking | so i completely understand that part | 22:39 |
wiking | it's just funny when people from around the world who have no idea about anything | 22:40 |
wiking | try to fix it | 22:40 |
wiking | :) | 22:40 |
wiking | but when you look at their 'own mess' it's even worse in a way | 22:40 |
blackburn | so a lot of my friends (me as well) think 'kosovo je serbia' :) | 22:40 |
wiking | like the guy who sketched up the 'solution' for kosovo is from finland | 22:40 |
blackburn | funny thing I don't think chechnya should be included | 22:41 |
blackburn | into russia | 22:41 |
wiking | and for instance the situation with russians in finland (on the border) | 22:41 |
wiking | it's like wtf | 22:41 |
wiking | i mean finnish people just amazingly hating russians... :( | 22:41 |
wiking | hehehe yeah you have some troubles of your own as well | 22:42 |
blackburn | for what? | 22:42 |
wiking | well i don't know | 22:42 |
wiking | it's just something from the past | 22:42 |
blackburn | do you know how they 'solved' chechnya problem? | 22:42 |
blackburn | njet molotoff hah | 22:42 |
wiking | hahahahahha | 22:42 |
wiking | i mean their hate is irrational (finnish) | 22:43 |
wiking | they hate russians because they tried to invade finland couple of times | 22:43 |
blackburn | hah yeah | 22:43 |
wiking | but they have almost no real problems with the swedes | 22:43 |
wiking | who kind of like ruled them for 100+ years | 22:43 |
wiking | so it's amazing how unbalanced that shit is | 22:44 |
blackburn | I know no one hating finnish :) | 22:44 |
wiking | but it's all the same with those countries there in the baltics | 22:44 |
blackburn | estonia/latvia/lithuania same | 22:44 |
wiking | yeah | 22:44 |
blackburn | they hate their soviet legacy | 22:44 |
wiking | yeah | 22:44 |
wiking | but it's part of their culture | 22:44 |
wiking | and identity | 22:44 |
wiking | so funny to hate something that is part of u | 22:45 |
wiking | :> | 22:45 |
blackburn | btw currently we don't like soviet state of mind as well | 22:45 |
wiking | ahhahahah | 22:45 |
wiking | well | 22:45 |
wiking | i don't know which was better | 22:45 |
wiking | i mean don't get me wrong | 22:45 |
wiking | i don't know that much of current state of russia | 22:45 |
wiking | but the thing with putin and yelcin | 22:45 |
blackburn | so the situation with putin clearly describes what I mean | 22:46 |
blackburn | people here want vozhd | 22:46 |
blackburn | who will rule them | 22:46 |
wiking | yeah i kind of like sensed that one... that in russia some people just want a big leader | 22:46 |
blackburn | 30% do | 22:46 |
wiking | something like stalin | 22:46 |
blackburn | + some falsification | 22:46 |
wiking | or breznyev :P | 22:46 |
blackburn | and here we go, putin again | 22:47 |
wiking | ehehheheh | 22:47 |
wiking | but that's amazing | 22:47 |
wiking | i mean the whole thing around putin | 22:47 |
wiking | the oligarch | 22:47 |
wiking | it's like a big fucking maffia | 22:47 |
blackburn | exactly it is | 22:47 |
wiking | especially with gazprom | 22:47 |
blackburn | some day he will be judged | 22:47 |
blackburn | particularly yukos as well | 22:48 |
wiking | i mean on the other hand if u look what was happening with yelcin... | 22:48 |
blackburn | khodorkovsky | 22:48 |
blackburn | yeltsin was worse for sure | 22:48 |
wiking | i mean that was amazing how the things gone really bad with yeltsin | 22:48 |
wiking | everything started to get wasted... | 22:48 |
blackburn | but there was a big bonus for russia | 22:49 |
blackburn | cost of oil | 22:49 |
wiking | heheh yeah | 22:49 |
blackburn | it impacted everything | 22:49 |
blackburn | if oil was 30$ a barrel - there would be no way for such a 'great putin' | 22:49 |
blackburn | would be only wastelands here :) | 22:49 |
wiking | :P | 22:50 |
blackburn | wiking: do you know how they're trying to calm down people here? | 22:50 |
blackburn | there were protests after elections | 22:51 |
blackburn | a lot of | 22:51 |
wiking | yeah | 22:51 |
wiking | read about those | 22:51 |
blackburn | they just grow hate to US | 22:51 |
blackburn | :D | 22:51 |
blackburn | they say US wants to do revolution here | 22:51 |
wiking | no way! | 22:51 |
blackburn | hahah | 22:51 |
wiking | ahahahah | 22:51 |
blackburn | yes, they say we don't want another lebanon here | 22:51 |
blackburn | or syria | 22:51 |
blackburn | or egypt | 22:51 |
blackburn | it works for not-too-smart people | 22:52 |
blackburn | but more educated city people usually just laugh at it | 22:52 |
wiking | sorry but i gotta run again... would love to continue this conversation some time soon | 22:52 |
wiking | :( | 22:52 |
blackburn | aha okay :) | 22:53 |
wiking | but yeah fuck putin :P | 22:53 |
blackburn | was great to talk to you | 22:53 |
blackburn | hahah | 22:53 |
wiking | yeah you too! | 22:53 |
wiking | laterz! | 22:53 |
blackburn | see you | 22:53 |
wiking | cya | 22:53 |
n4nd0 | blackburn: would it be better if I increase the cache-size parameter? | 22:57 |
n4nd0 | right now it is 40, and it has been training the svm for a long long while | 22:57 |
blackburn | n4nd0: yes, should be | 22:58 |
blackburn | still?? | 22:58 |
blackburn | that's crazy :) | 22:58 |
n4nd0 | yeah I know :-P | 22:58 |
n4nd0 | but it is also because the size of the images changed when I tried with some fancy haar features | 22:59 |
n4nd0 | I am going to go back to 19x19 and increase the cache size | 22:59 |
n4nd0 | which one is a good value for it? | 22:59 |
blackburn | it is size of cache in mb | 22:59 |
blackburn | you may use any that fits into your memory | 22:59 |
n4nd0 | and for the width in the Gaussian? | 23:02 |
n4nd0 | an approximate value that should work well? | 23:02 |
n4nd0 | is there any heuristic or thumb rule to use? | 23:02 |
blackburn | well not really | 23:03 |
blackburn | it should be very small and very large | 23:03 |
blackburn | :D | 23:03 |
blackburn | shouldn't* | 23:03 |
blackburn | I don't know any good | 23:03 |
n4nd0 | I will try with 20 then | 23:05 |
n4nd0 | blackburn: nothing good with the Gaussian kernel :( | 23:34 |
blackburn | bad | 23:35 |
n4nd0 | by the way | 23:48 |
n4nd0 | I don't think I clearly got the idea behind the two parameters of the kernel constructors | 23:49 |
n4nd0 | I mean the ones that are called | 23:49 |
n4nd0 | CDotFeatures * l, CDotFeatures * r | 23:49 |
n4nd0 | so far I am using the same for both | 23:49 |
n4nd0 | feats_train, feats_train | 23:49 |
n4nd0 | but feels weird to do it that way | 23:50 |
n4nd0 | blackburn: should they be different things? | 23:53 |
blackburn | n4nd0: no | 23:53 |
blackburn | ok when you train classifier | 23:53 |
blackburn | you need k_ij between train features and train features | 23:54 |
blackburn | but when you classify | 23:54 |
blackburn | you need kernel values between train features and test features | 23:54 |
n4nd0 | aham | 23:54 |
blackburn | that thing is going on when you call apply | 23:54 |
blackburn | it inits kernel with | 23:54 |
blackburn | feats_train, feats_test | 23:55 |
blackburn | and then does things according to the alphas/support vectors | 23:55 |
n4nd0 | actually I am not changing anything new when I call apply | 23:56 |
blackburn | apply does | 23:56 |
n4nd0 | I do sth like | 23:56 |
n4nd0 | kernel = GaussianKernel(feats_train, feats_train, width, size_cache) | 23:57 |
n4nd0 | svm = LibSVM(C, kernel, labels_train) | 23:57 |
n4nd0 | svm.train() | 23:57 |
n4nd0 | output = svm.apply(feats_test) | 23:57 |
blackburn | svm.apply() changes kernel | 23:57 |
blackburn | it inits kernel with feats train and feats test | 23:57 |
blackburn | svm.apply(feats_test) I mean | 23:57 |
n4nd0 | ok | 23:58 |
n4nd0 | so there is nothing I should change? | 23:58 |
n4nd0 | is it done automatically? | 23:58 |
blackburn | yes | 23:58 |
blackburn | it is the same if you | 23:58 |
blackburn | did | 23:58 |
blackburn | kernel.init(feats_train,feats_test) | 23:58 |
blackburn | output = svm.apply() | 23:58 |
blackburn | without anytihng in apply() | 23:59 |
n4nd0 | cool I get it | 23:59 |
blackburn | not very clear design here | 23:59 |
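The train/test kernel distinction described above can be made concrete outside shogun. A numpy sketch of the two Gram matrices involved, assuming shogun's exp(-||x-y||^2 / width) convention for the Gaussian kernel width:

```python
import numpy as np

def gaussian_gram(X, Y, width):
    # K[i, j] = exp(-||X[i] - Y[j]||^2 / width), one row vector per sample
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / width)

rng = np.random.default_rng(0)
Xtr = rng.normal(size=(6, 3))   # 6 training vectors
Xte = rng.normal(size=(4, 3))   # 4 test vectors

K_train = gaussian_gram(Xtr, Xtr, 20.0)  # 6x6, used during svm.train()
K_test  = gaussian_gram(Xte, Xtr, 20.0)  # 4x6, what apply(feats_test) re-inits
# K_train is symmetric with ones on the diagonal; K_test is rectangular
```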
n4nd0 | I have to go now for a while | 23:59 |
n4nd0 | be back later | 23:59 |
n4nd0 | bye | 23:59 |
blackburn | but no idea how to do it flexibly and in a better way | 23:59 |
blackburn | bye | 23:59 |
--- Log closed Sun Feb 19 00:00:19 2012 |
Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!