--- Log opened Mon Nov 28 00:00:59 2011 | ||
-!- blackburn [~blackburn@83.234.54.62] has quit [Quit: Leaving.] | 02:24 | |
-!- in3xes [~in3xes@180.149.49.230] has joined #shogun | 06:35 | |
-!- blackburn [~blackburn@188.168.4.251] has joined #shogun | 08:34 | |
sonne|work | morning blackburn | 08:44 |
blackburn | hi | 08:44 |
sonne|work | I finally understood the typemap issue | 08:44 |
sonne|work | typecheck typemaps are only called when there is ambiguity | 08:44 |
blackburn | oh | 08:44 |
blackburn | and what is called if not? | 08:44 |
sonne|work | e.g. if there are 2 functions named e.g. set_vector(float64_t*,int32_t) / set_vector(SGVector<> v) | 08:45 |
sonne|work | if not then the function is *always* called | 08:45 |
blackburn | hmm | 08:45 |
sonne|work | so it needs to check the type too | 08:45 |
sonne|work | that's all | 08:45 |
sonne|work | I will rework the typemaps hopefully tonight | 08:45 |
blackburn | anyway we should remove float64_t*, int32_t stuff | 08:46 |
blackburn | sonne|work: I ran tests yesterday, did HMM fail before? | 08:46 |
sonne|work | blackburn: the problem only occurs when the float64_t* stuff is *not* there | 08:47 |
blackburn | sonne|work: heh I see | 08:47 |
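The typemap behaviour described above can be sketched as a SWIG interface fragment. This is illustrative only, not shogun's actual typemap: SWIG consults typecheck typemaps only to disambiguate overloads such as set_vector(float64_t*, int32_t) vs. set_vector(SGVector<float64_t>); with a single overload the "in" typemap is applied unconditionally, so it must validate the type itself. The is_array/array_type helpers are assumed to come from numpy.i.

```cpp
/* Hypothetical SWIG typecheck typemap (interface-file syntax).
 * Returns 1 in $1 if the incoming Python object matches this overload,
 * 0 otherwise, so SWIG can pick the right set_vector() variant. */
%typemap(typecheck, precedence=SWIG_TYPECHECK_DOUBLE_ARRAY)
    (float64_t* vec, int32_t len)
{
    /* $input is the incoming PyObject; accept only float64 numpy arrays */
    $1 = (is_array($input) && array_type($input) == NPY_FLOAT64) ? 1 : 0;
}
```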
sonne|work | blackburn: re HMM I don't remember | 08:47 |
sonne|work | you need to test it | 08:47 |
sonne|work | btw regarding your valgrind question | 08:47 |
sonne|work | you can use valgrind suppressions | 08:48 |
blackburn | sonne|work: that is not about suppression but about the necessity of testing with valgrind | 08:48 |
sonne|work | there used to be a suppression file somewhere in /usr/share/doc/python* | 08:48 |
sonne|work | then you will only see shogun errors not the python ones | 08:49 |
sonne|work | (which are in fact false alarms) | 08:49 |
blackburn | i.e. if grep finds shogun in the output then something has gone wrong | 08:49 |
blackburn | sonne|work: IIRC I would need to recompile my python to use it, it is not the problem I describe | 08:50 |
sonne|work | no you don't need to recompile python for that | 08:50 |
sonne|work | it is just a list of errors to ignore | 08:50 |
sonne|work | ...that you pass to valgrind | 08:50 |
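The workflow described here — run under valgrind with a Python suppression file so only shogun's own reports remain — might look like the following. The suppression-file path varies by distribution and the script name is a placeholder:

```shell
# Hypothetical invocation: suppress Python's own (false-alarm) valgrind
# reports, then grep for anything that still mentions shogun -- those
# are the real leaks/errors to chase.
valgrind --suppressions=/usr/share/doc/python2.7/valgrind-python.supp \
         --leak-check=full \
         python hmm_test.py 2>&1 | grep -i shogun
```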
blackburn | sonne|work: I would like you to test HMM cause I really don't know how to use it | 08:51 |
blackburn | the results are different from the ones we have now | 08:52 |
sonne|work | did the tests fail in 1.0.0 ? | 08:52 |
blackburn | sonne|work: ah do you suggest to test it with 1.0.0? | 08:52 |
blackburn | hmm why not ok | 08:53 |
sonne|work | I mean then we know whether we introduced this problem in 1.1... | 08:53 |
blackburn | sonne|work: I started playing with SVMs and noticed that classifier/svm is like a dark forest | 08:54 |
blackburn | so much 'unstructured' code | 08:55 |
sonne|work | what do you mean? | 08:55 |
blackburn | well there are a lot of files with strange names | 08:56 |
sonne|work | you mean like SVM, LDA, PCA, :D | 08:56 |
blackburn | no codestyle in sources | 08:56 |
blackburn | i.e. some of them are uppercase | 08:56 |
blackburn | some are lower | 08:56 |
blackburn | etc | 08:56 |
sonne|work | please be more specific | 08:57 |
sonne|work | or give me an example | 08:57 |
blackburn | sonne|work: the code is impossible to maintain | 08:57 |
blackburn | at least for me | 08:57 |
sonne|work | which code? | 08:57 |
blackburn | what if there is an error? | 08:57 |
blackburn | any svm | 08:57 |
blackburn | sonne|work: I have no idea how to maintain it properly | 08:59 |
sonne|work | you mean the deep internals of svm algo's? | 08:59 |
blackburn | yes | 08:59 |
sonne|work | yes that is impossible for non-experts | 09:00 |
sonne|work | that is why everyone uses e.g. libsvm's code as black box | 09:01 |
sonne|work | well ok I did a couple of changes to libsvm but tested them | 09:01 |
sonne|work | and you can only do it when you have read their paper... | 09:01 |
blackburn | sonne|work: hmm I think it is the same with the dimreduction code and you | 09:07 |
blackburn | my code is just as incomprehensible | 09:07 |
blackburn | :D | 09:07 |
sonne|work | exactly :) | 09:09 |
sonne|work | it takes a *long* time to understand it... | 09:10 |
blackburn | sonne|work: especially in the case of LLE | 09:13 |
blackburn | I did it with alignment and it is not as in the paper | 09:13 |
blackburn | that's why it is $\infty$ times faster than any other impl | 09:14 |
blackburn | I wonder, sonne|work did you ever heard anything about elections in russia this week? | 09:15 |
blackburn | I think nobody even knows what is going on because nobody cares :) | 09:16 |
blackburn | but still interested | 09:16 |
sonne|work | no | 09:19 |
sonne|work | what happened? | 09:19 |
-!- in3xes [~in3xes@180.149.49.230] has quit [Quit: Leaving] | 09:22 | |
blackburn | sonne|work: well we all know who will be elected :D | 09:29 |
blackburn | government party cheats way too much.. | 09:30 |
sonne|work | like everywhere... | 09:34 |
blackburn | sonne|work: you can't imagine how much | 09:34 |
blackburn | sonne|work: we know who will be president for 4 years already | 09:34 |
blackburn | sonne|work: yes, there is a regression with HMM between 1.0.0 and 1.1.0 | 10:21 |
-!- blackburn [~blackburn@188.168.4.251] has quit [Quit: Leaving.] | 10:32 | |
-!- blackburn [5bdfb203@gateway/web/freenode/ip.91.223.178.3] has joined #shogun | 12:40 | |
blackburn | hah my new lcd screen for the notebook has finally arrived | 12:46 |
blackburn | 67 days on the loong way home :D | 12:46 |
blackburn | crazy, it took 67 days to move from GB to Russia | 12:47 |
sonne|work | blackburn: congrats | 12:54 |
blackburn | sonne|work: have 3 mins? | 12:55 |
sonne|work | blackburn: regarding the HMM regression - could you do a git bisect to figure out the commit that is causing the trouble? | 12:55 |
blackburn | sonne|work: sure, this night | 12:55 |
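The bisect session suggested above could be sketched as follows. The tag names and the test script are placeholders, not shogun's actual release tags; the script should exit nonzero when the HMM output is wrong, so git can home in on the first bad commit automatically:

```shell
# Hypothetical git bisect run between the two releases.
git bisect start
git bisect bad v1.1.0        # regression present here
git bisect good v1.0.0       # known-good release
git bisect run ./run_hmm_test.sh
```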
blackburn | sonne|work: what is the multiclass SVM you would suggest to use? | 12:55 |
sonne|work | ask but I may have to leave | 12:55 |
sonne|work | GMNPSVM | 12:55 |
blackburn | I used larank | 12:56 |
sonne|work | (true multiclass) | 12:56 |
blackburn | is it worse? | 12:56 |
sonne|work | one is online one not | 12:56 |
blackburn | GMNPSVM looks like it takes infinite time | 12:56 |
sonne|work | larank can be faster on more data | 12:56 |
sonne|work | sure | 12:56 |
blackburn | will it produce better accuracy? | 12:56 |
sonne|work | how many classes /data points | 12:56 |
sonne|work | I would expect so | 12:56 |
blackburn | 43 classes, 200-600 each | 12:57 |
blackburn | LibSVMMultiClass was slightly worse, 84% accuracy | 12:57 |
blackburn | and with LaRank I got ~87% | 12:57 |
sonne|work | precompute kernel matrix? | 12:57 |
sonne|work | ok but then you are already pretty good | 12:57 |
blackburn | sonne|work: gaussian on HOG :) | 12:57 |
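The "gaussian on HOG" setup — a Gaussian (RBF) kernel over HOG descriptors — can be sketched with plain numpy. The toy features and kernel width below are stand-ins, and shogun's own GaussianKernel may parameterize the width differently:

```python
import numpy as np

def gaussian_kernel(X, Y, width=2.1):
    """Gaussian (RBF) kernel matrix: k(x, y) = exp(-||x - y||^2 / width)."""
    # Squared distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-np.maximum(sq, 0.0) / width)

# Toy stand-in for HOG descriptors: 5 samples, 8-dimensional features
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
K = gaussian_kernel(X, X)   # symmetric, ones on the diagonal
```

Precomputing K like this (as suggested above) avoids re-evaluating kernel entries during training.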
blackburn | can I expect any other impact on accuracy with GMNPSVM? | 12:58 |
blackburn | it is pretty good indeed but others got almost 99% with convolutional neural networks | 12:59 |
blackburn | I would like to beat them hah | 12:59 |
blackburn | I guess you had to leave :) | 13:01 |
sonne|work | well convolutional NN are very hard to beat | 13:09 |
sonne|work | for such obj. recognition tasks | 13:09 |
blackburn | sonne|work: aren't they overfitted? | 13:10 |
blackburn | are they really so good? | 13:10 |
sonne|work | http://yann.lecun.com/exdb/mnist/ | 13:11 |
sonne|work | they can be when done right | 13:12 |
sonne|work | alright back to work | 13:12 |
blackburn | ok | 13:12 |
blackburn | hmm I'm in doubt | 13:12 |
blackburn | I guess I have to try CNNs.. | 13:28 |
sonne|work | I would be very interested in the results | 13:35 |
sonne|work | btw, did you use larank from shogun? | 13:35 |
sonne|work | did you normalize your data? | 13:35 |
sonne|work | did you add virtual examples (by rotating/shifting objects?) | 13:35 |
sonne|work | scaling etc | 13:35 |
sonne|work | all these things help | 13:35 |
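The "virtual examples" idea — enlarging the training set with shifted copies of each image — can be sketched in numpy. The shift range and zero padding here are arbitrary illustrative choices:

```python
import numpy as np

def shifted_copies(img, max_shift=1):
    """Generate 'virtual examples' by shifting a 2-D image by up to
    max_shift pixels in each direction, zero-padding at the borders."""
    out = []
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            if dy == 0 and dx == 0:
                continue  # skip the unshifted original
            v = np.zeros_like(img)
            src_y = slice(max(0, -dy), img.shape[0] - max(0, dy))
            dst_y = slice(max(0, dy), img.shape[0] - max(0, -dy))
            src_x = slice(max(0, -dx), img.shape[1] - max(0, dx))
            dst_x = slice(max(0, dx), img.shape[1] - max(0, -dx))
            v[dst_y, dst_x] = img[src_y, src_x]
            out.append(v)
    return out

img = np.arange(16.0).reshape(4, 4)
virtual = shifted_copies(img)   # 8 shifted variants for max_shift=1
```

Rotations and small scalings would be handled analogously, each one multiplying the effective training-set size.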
blackburn | sonne|work: me too; from shogun, normalized, but did not extract RoI, no | 13:46 |
blackburn | sonne|work: I don't like any neural nets, what about you? :) | 14:10 |
sonne|work | me neither | 14:11 |
sonne|work | too many local minima | 14:11 |
sonne|work | very hard to control | 14:11 |
blackburn | I don't have much SVM experience | 14:12 |
blackburn | but it seems to be better | 14:12 |
sonne|work | well it is a convex optimization problem so local minima == global minima | 14:19 |
sonne|work | it helps a lot when you know that the same choice of model parameters leads to the same result | 14:20 |
blackburn | as I understand | 14:21 |
blackburn | SVM will have similar results on slightly different data | 14:21 |
blackburn | but NN could get totally wrong | 14:21 |
blackburn | right? | 14:21 |
blackburn | like ill-posed things in linalg | 14:21 |
sonne|work | like trying to find the roots of a polynomial with *many* variables | 14:28 |
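The contrast drawn here — a convex SVM objective with one global minimum versus a nonconvex landscape full of local minima — can be illustrated with toy gradient descent. The functions below are illustrative stand-ins, not actual SVM or NN objectives:

```python
def descend(grad, x, lr=0.01, steps=5000):
    """Plain gradient descent from a given starting point."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Convex: f(x) = (x - 1)^2, gradient 2(x - 1).
# Every start reaches the unique minimum at x = 1.
convex_min = [descend(lambda x: 2 * (x - 1), x0) for x0 in (-5.0, 0.0, 7.0)]

# Nonconvex double well: f(x) = x^4 - 2x^2, gradient 4x^3 - 4x.
# Local minima at x = -1 and x = +1, so the starting point decides the answer.
bumpy_min = [descend(lambda x: 4 * x**3 - 4 * x, x0) for x0 in (-2.0, 2.0)]
```

This is the practical point made above: with a convex objective the same model parameters always lead to the same result, while a nonconvex one depends on initialization.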
blackburn | time to go to gym :) | 16:24 |
-!- blackburn [5bdfb203@gateway/web/freenode/ip.91.223.178.3] has quit [Quit: Page closed] | 16:24 | |
-!- blackburn [~blackburn@188.168.5.8] has joined #shogun | 19:07 | |
-!- mode/#shogun [+o sonney2k] by ChanServ | 19:52 | |
blackburn | 7 steps.. | 21:32 |
blackburn | 3 left | 22:20 |
blackburn | damn | 22:54 |
blackburn | I failed to determine the reason | 22:54 |
--- Log closed Tue Nov 29 00:00:59 2011 |