--- Log opened Mon Jul 25 00:00:12 2011 | ||
--- Day changed Mon Jul 25 2011 | ||
blackburn | sonney2k: libsvm doesn't work in either python or java | 00:00 |
blackburn | SystemError: [ERROR] assertion kernel->get_num_vec_lhs()==problem.l failed in file classifier/svm/LibSVM.cpp line 85 | 00:00 |
@sonney2k | I don't understand | 00:01 |
@sonney2k | you said the minimal example worked? | 00:01 |
@sonney2k | that should be using libsvm too | 00:01 |
blackburn | minimal example yes, classifier_libsvm_modular.py - not | 00:02 |
@sonney2k | but where is the difference? | 00:02 |
@sonney2k | they both do the same | 00:02 |
blackburn | I would fix that already if I knew | 00:03 |
@sonney2k | blackburn, I suspect that the number of labels doesn't match the matrix | 00:03 |
@sonney2k | does python work? | 00:03 |
@sonney2k | or java? | 00:03 |
@sonney2k | or none? | 00:03 |
blackburn | none | 00:03 |
blackburn | ah yes | 00:04 |
blackburn | my fault | 00:04 |
@sonney2k | btw the minimal example in java cannot work | 00:04 |
@sonney2k | forget what I just said | 00:04 |
@sonney2k | blackburn, your fault? | 00:05 |
@sonney2k | what does "ah yes" mean? | 00:05 |
blackburn | number of labels wrong | 00:05 |
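The assertion above is the generic symptom of a label/feature-count mismatch. A minimal sketch of how to trigger it, assuming the python_modular class names used in classifier_libsvm_modular.py (the values are placeholders):

```python
# Hedged sketch: the assertion at classifier/svm/LibSVM.cpp:85 fires when the
# number of labels differs from the number of feature vectors (columns).
from numpy import array, float64
from modshogun import RealFeatures, Labels, GaussianKernel, LibSVM

# 2 features x 4 vectors -- shogun treats each column as one example
feats = RealFeatures(array([[1.0, 2, 3, 4], [5, 6, 7, 8]], dtype=float64))
labels = Labels(array([-1.0, -1, 1]))  # only 3 labels for 4 vectors: mismatch

kernel = GaussianKernel(feats, feats, 1.0)
svm = LibSVM(1.0, kernel, labels)
svm.train()  # raises: assertion kernel->get_num_vec_lhs()==problem.l failed
```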
blackburn | sonney2k: why can't the minimal example in java work? | 00:06 |
@sonney2k | ok | 00:06 |
@sonney2k | did you read what I said above? | 00:06 |
blackburn | yes I did | 00:06 |
@sonney2k | <sonney2k> btw the minimal example in java cannot work | 00:07 |
@sonney2k | <sonney2k> forget what I just said | 00:07 |
@sonney2k | :D | 00:07 |
blackburn | sonney2k: libsvm produces the same results | 00:08 |
@sonney2k | blackburn, as in same labels.get_labels() ? | 00:08 |
blackburn | blackburn@blackburn-laptop:~/shogun/shogun/examples/undocumented/java_modular$ ./check.sh classifier_libsvm_modular.java | 00:08 |
blackburn | [0.1938791717197525, 0.19659259940936621] | 00:08 |
blackburn | blackburn@blackburn-laptop:~/shogun/shogun/examples/undocumented/python_modular$ python classifier_libsvm_modular.py | 00:08 |
blackburn | LibSVM | 00:08 |
blackburn | [ 0.19387917 0.1965926 ] | 00:08 |
@sonney2k | wait | 00:10 |
@sonney2k | 2 outputs only? | 00:10 |
@sonney2k | we should have 92 | 00:10 |
blackburn | yes, I modified data | 00:10 |
@sonney2k | ok | 00:10 |
blackburn | I can't check 92 numbers for equality | 00:10 |
@sonney2k | better check for the whole sample | 00:10 |
blackburn | takes more time | 00:10 |
@sonney2k | just print them and do a diff | 00:10 |
@sonney2k | yes I understand but it is impossible for 92 numbers to match by chance | 00:11 |
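A minimal sketch of the print-and-diff approach, assuming numpy; the file names are placeholders and `out` stands in for the example's output vector:

```python
# Hedged sketch: dump each interface's output to a text file and compare,
# instead of checking 92 numbers by eye.
from numpy import savetxt, loadtxt, allclose, random

out = random.rand(92)            # stand-in for the python example's output
savetxt('out_python.txt', out)   # do the analogous dump on the java side

# then: diff out_python.txt out_java.txt, or numerically:
# allclose(loadtxt('out_python.txt'), loadtxt('out_java.txt'), atol=1e-12)
```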
blackburn | okay okay | 00:11 |
@sonney2k | ohh dam'd I am compiling shogun on debian unstable | 00:11 |
@sonney2k | lots of new warnings.... | 00:11 |
blackburn | ? | 00:12 |
@sonney2k | I think I should start preparing a debian package | 00:12 |
@sonney2k | for the new thing | 00:12 |
blackburn | sonney2k: the same for 92 | 00:13 |
@sonney2k | hurray! | 00:13 |
blackburn | ok, I'm pretty tired of java today | 00:16 |
@sonney2k | blackburn, what did you expect? | 00:16 |
@sonney2k | I think it ran rather smoothly | 00:16 |
blackburn | whaT? | 00:16 |
@sonney2k | it basically worked out of the box | 00:16 |
blackburn | can't understand what you are talking about | 00:17 |
@sonney2k | didn't you only have to modify load.py / Load.java? | 00:17 |
@sonney2k | no bugs in typemaps (so far) | 00:17 |
@sonney2k | ? | 00:17 |
blackburn | yes | 00:17 |
@sonney2k | so yes - that is trivial compared to typemap bugs | 00:17 |
blackburn | it is | 00:18 |
@sonney2k | so we are lucky | 00:18 |
@sonney2k | and chances are that other examples will just work | 00:18 |
blackburn | I saw some strange outputs in some of them | 00:19 |
blackburn | NaN or so | 00:19 |
blackburn | tomorrow I will take a look | 00:19 |
@sonney2k | I mean now it remains to test if string based typemaps work | 00:19 |
@sonney2k | e.g. stringfeatures | 00:19 |
@sonney2k | weighteddegreestringkernel | 00:20 |
@sonney2k | and then some really complex example with preprocessors attached or multiple kernels | 00:20 |
@sonney2k | if something big works then the rest is just minor issues | 00:20 |
@sonney2k | blackburn, I really think we need someone doing a tutorial with some nice data set | 00:24 |
@sonney2k | I mean like we have a certain data set and no idea about it | 00:24 |
@sonney2k | so we do pca or so first | 00:24 |
@sonney2k | and visualize it | 00:24 |
@sonney2k | then we do some classification or so | 00:24 |
blackburn | with as many methods used as possible? | 00:25 |
@sonney2k | yeah | 00:28 |
@sonney2k | like some story line from very explorative unsupervised | 00:28 |
@sonney2k | to simple supervised | 00:28 |
@sonney2k | e.g. linear | 00:28 |
@sonney2k | then e.g. svm w/ kernels | 00:28 |
@sonney2k | and then maybe even multiple kernels / data sources | 00:29 |
blackburn | good idea | 00:29 |
@sonney2k | that could work for anything, 2-class classification, regression, multiclass | 00:30 |
@sonney2k | would be cool to use heiko's x-validation on top already for that | 00:30 |
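A minimal self-contained sketch of that storyline in plain numpy (the real tutorial would use shogun's preprocessors, kernels and cross-validation instead):

```python
# Hedged sketch: from explorative/unsupervised to simple supervised.
import numpy as np

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 5) + 2, rng.randn(50, 5) - 2])  # "unknown" data
y = np.array([1.0] * 50 + [-1.0] * 50)

# 1. explorative / unsupervised: PCA via SVD, project to 2d for visualization
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc.dot(Vt[:2].T)

# 2. simple supervised: a least-squares linear classifier on the projection
w = np.linalg.lstsq(X2, y)[0]
print(np.mean(np.sign(X2.dot(w)) == y))  # training accuracy
```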
blackburn | I would say I could do it if I weren't embarrassed by my manifold learning algos | 00:31 |
@sonney2k | blackburn, don't worry at some point we will have shogun 1.0 and then we might have time to work on some nice applications too :) | 00:42 |
@sonney2k | anyway | 00:42 |
@sonney2k | going to bed now | 00:42 |
@sonney2k | cu | 00:42 |
blackburn | see you | 00:43 |
-!- blackburn [~blackburn@188.122.239.253] has quit [Ping timeout: 255 seconds] | 00:49 | |
-!- f-x [~user@117.192.199.217] has joined #shogun | 01:59 | |
-!- f-x_ [fx@213.155.190.134] has joined #shogun | 04:02 | |
-!- f-x [~user@117.192.199.217] has quit [Ping timeout: 260 seconds] | 04:24 | |
-!- in3xes [~in3xes@180.149.49.230] has quit [Quit: Leaving] | 05:09 | |
-!- gsomix [~gsomix@178.45.88.77] has joined #shogun | 05:45 | |
-!- [1]warpy [~warpy@bzq-79-181-43-167.red.bezeqint.net] has quit [Quit: HydraIRC -> http://www.hydrairc.com <- Like it? Visit #hydrairc on EFNet] | 07:08 | |
-!- gsomix [~gsomix@178.45.88.77] has quit [Read error: Connection reset by peer] | 07:29 | |
-!- f-x [~user@117.192.204.84] has joined #shogun | 08:52 | |
-!- sploving1 [~sploving@124.16.139.134] has joined #shogun | 08:55 | |
@sonney2k | sploving1, could you please test on the kernel example first? | 09:11 |
sploving1 | I tested all of them | 09:12 |
@sonney2k | and? | 09:12 |
sploving1 | the results are not the same! | 09:12 |
sploving1 | why?? | 09:12 |
@sonney2k | why what? | 09:13 |
@sonney2k | when you just set features and get the features | 09:13 |
sploving1 | the results are not the same (python, lua) | 09:13 |
@sonney2k | are they the same as in python? | 09:13 |
@sonney2k | sploving1, yes you said that already - now we need to debug why not. | 09:14 |
sploving1 | sonney2k, why does classifier_averaged_perceptron_modular.py give different results when run twice? | 09:14 |
sploving1 | sonney2k | 09:15 |
@sonney2k | they were not the same in (python,java) either. but now they are at least for some examples | 09:15 |
@sonney2k | sploving1, please first try a simpler example like just setting features / getting features | 09:15 |
sploving1 | give me the name of an example and I will try it. I do not know what setting/getting features means | 09:17 |
@sonney2k | sploving1, I see | 09:17 |
@sonney2k | lets try features_simple_real_modular.py | 09:17 |
sploving1 | works well~ | 09:20 |
sploving1 | sonney2k, I mean in python it works well | 09:20 |
@sonney2k | sploving1, ok - now compare if that works in lua too | 09:21 |
sploving1 | sonney2k, can we compile them both? | 09:21 |
sploving1 | I mean in configure, we need to compile them both. | 09:21 |
@sonney2k | sploving1, yes, just configure python_modular,lua_modular | 09:22 |
sploving1 | ./configure --interfaces=python_modular,lua_modular | 09:22 |
@sonney2k | but when you already installed you don't need to | 09:22 |
@sonney2k | yes, with a "," in between | 09:22 |
sploving1 | okay that is good | 09:22 |
@sonney2k | sploving1, I am looking into the averaged perceptron issue | 09:22 |
@sonney2k | (now) | 09:22 |
@sonney2k | it is a different problem - it seems | 09:23 |
sploving1 | yeap. I do not need | 09:28 |
sploving1 | 1 4 0 | 09:29 |
sploving1 | 0 0 9 | 09:29 |
sploving1 | 0 0 0 | 09:29 |
sploving1 | 0 5 0 | 09:29 |
sploving1 | 0 0 6 | 09:29 |
sploving1 | 9 9 9 | 09:29 |
sploving1 | this is lua result | 09:29 |
sploving1 | [[ 1. 2. 3.] | 09:29 |
sploving1 | [ 4. 0. 0.] | 09:29 |
sploving1 | [ 0. 0. 0.] | 09:29 |
sploving1 | [ 0. 5. 0.] | 09:29 |
sploving1 | [ 0. 0. 6.] | 09:29 |
sploving1 | [ 9. 9. 9.]] | 09:29 |
sploving1 | this is python result | 09:30 |
sploving1 | sonney2k, which is correct? | 09:33 |
@sonney2k | sploving1, what is the original input? | 09:34 |
sploving1 | a=RealFeatures(A), a.set_feature_vector(array([1,4,0,0,0,9], dtype=float64), 0) will affect a.get_feature_matrix()?? | 09:34 |
sploving1 | matrix=array([[1,2,3],[4,0,0],[0,0,0],[0,5,0],[0,0,6],[9,9,9]], dtype=float64) | 09:34 |
sploving1 | this is the original input | 09:34 |
sploving1 | sonney2k, in lua it is : matrix = {{1,2,3},{4,0,0},{0,0,0},{0,5,0},{0,0,6},{9,9,9}} | 09:35 |
-!- f-x [~user@117.192.204.84] has quit [Ping timeout: 260 seconds] | 09:36 | |
sploving1 | as I do not know features' meaning, I have no idea which result is correct | 09:36 |
@sonney2k | sploving1, did you do the set_feature_vector in lua too? | 09:37 |
sploving1 | sonney2k, yeap . a:set_feature_vector({1,4,0,0,0,9}, 0) | 09:38 |
@sonney2k | sploving1, I am a bit lost - please comment out the set_feature_vector in both languages | 09:38 |
@sonney2k | and then just show what you get in python (first) and then lua | 09:39 |
@sonney2k | sploving1, maybe use gist.github.com for pasting... | 09:39 |
@sonney2k | or /query me | 09:39 |
sploving1 | https://gist.github.com/1103717 | 09:41 |
sploving1 | sonney2k, take a look at it! | 09:41 |
@sonney2k | the python one is correct | 09:42 |
sploving1 | you mean set_feature_vector has no effect on the result? sonney2k?? | 09:43 |
sploving1 | sonney2k, then why does set_feature_vector affect the result in lua?? so strange! | 09:44 |
@sonney2k | sploving1, yes, it should not | 09:50 |
@sonney2k | sploving1, please don't do set_feature_vector for now and check | 09:51 |
@sonney2k | it is probably wrong in lua nevertheless | 09:51 |
@sonney2k | (just result transposed - I guess different order in typemap) | 09:51 |
sploving1 | sonney2k, without it (set_..), it is the correct result | 09:53 |
sploving1 | it is the same as the original input | 09:54 |
sploving1 | different order? sonney2k, can you explain it in more detail? | 09:55 |
sploving1 | so I can fix it | 09:55 |
@sonney2k | sploving1, yes that can happen when both set_feature_matrix and get_feature_matrix use a different ordering | 09:55 |
@sonney2k | sploving1, when you uncomment | 09:56 |
@sonney2k | print a.get_num_vectors() | 09:56 |
@sonney2k | print a.get_num_features() | 09:56 |
@sonney2k | what do these display in lua? | 09:56 |
@sonney2k | sploving1, in the end i suspect that just this array[i * cols + j] statement in lua is wrong | 09:58 |
sploving1 | 3 6 | 09:58 |
@sonney2k | that is correct | 09:58 |
@sonney2k | sploving1, if you write matrix = {{1,2,3},{4,0,0},{0,0,0},{0,5,0},{0,0,6},{9,9,9}} how many tables are these? | 10:03 |
@sonney2k | 6 right? | 10:04 |
@sonney2k | so that should match rows | 10:04 |
@sonney2k | and cols is 3 since each table has 3 elements | 10:04 |
sploving1 | 6 yeap | 10:05 |
-!- f-x [~user@117.192.204.152] has joined #shogun | 10:05 | |
sploving1 | 6 is rows, 3 is cols | 10:05 |
@sonney2k | sploving1, does that match the meaning you have in swig_typemaps.i? | 10:05 |
sploving1 | yeap. now I understand what set_feature_vector means | 10:05 |
sploving1 | it is to set the first column | 10:05 |
@sonney2k | yes | 10:05 |
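A numpy illustration of the semantics just agreed on, assuming the example's 6x3 input: get_num_features() is 6, get_num_vectors() is 3, and set_feature_vector(v, 0) replaces feature vector 0, i.e. the first column:

```python
# Hedged sketch of what set_feature_vector([1,4,0,0,0,9], 0) should do.
import numpy as np

m = np.array([[1, 2, 3], [4, 0, 0], [0, 0, 0],
              [0, 5, 0], [0, 0, 6], [9, 9, 9]], dtype=np.float64)

m[:, 0] = [1, 4, 0, 0, 0, 9]  # replace feature vector 0 = the first column
print(m)                      # unchanged here: the new column equals the old one
```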
sploving1 | I will take a look at the file and fix it | 10:06 |
@sonney2k | sploving1, but please note that the vector set function is correct | 10:07 |
@sonney2k | it must be the in/out typemaps for SGMatrix that are *both* wrong | 10:07 |
sploving1 | you mean SGVector is correct and SGMatrix is wrong? | 10:07 |
@sonney2k | yes | 10:07 |
sploving1 | sonney2k, okay | 10:07 |
@sonney2k | I think it should be array[j * rows + i] in line 156 in swig_typemaps.i | 10:12 |
@sonney2k | and same indexing in line 176 | 10:13 |
@sonney2k | sploving1, ^ | 10:13 |
@sonney2k | then it should work | 10:13 |
sploving1 | I fixed it and now recompiling | 10:13 |
sploving1 | I thought shogun stores rows first, but that is wrong | 10:14 |
sploving1 | shogun stores columns first, then rows, sonney2k | 10:14 |
@sonney2k | yes it is always column by column | 10:14 |
@sonney2k | like fortran, matlab, r, octave, ... | 10:14 |
-!- blackburn [~blackburn@188.122.239.253] has joined #shogun | 10:15 | |
@sonney2k | ... but not python :) | 10:15 |
sploving1 | oh. I know | 10:15 |
@sonney2k | (in python numpy one can specify that one wants fortran order - so it works there too :) | 10:16 |
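A quick numpy check of the proposed fix, assuming column-major (Fortran) layout: element (i, j) of a rows x cols matrix sits at flat index j * rows + i:

```python
# Hedged check of the typemap fix discussed above.
import numpy as np

m = np.arange(12, dtype=np.float64).reshape(3, 4)  # rows=3, cols=4
flat = m.flatten(order='F')                        # shogun's column-first layout
i, j = 1, 2
assert flat[j * 3 + i] == m[i, j]                  # array[j * rows + i]

mf = np.asfortranarray(m)                          # numpy in fortran order directly
```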
sploving1 | sonney2k, now I want to support ruby narray | 10:17 |
@sonney2k | sploving1, does it work now? | 10:17 |
@sonney2k | I mean lua matrix? | 10:18 |
sploving1 | just compiling | 10:18 |
sploving1 | I fetched upstream | 10:18 |
sploving1 | so I did git clean and am compiling from fresh | 10:18 |
@sonney2k | ok | 10:19 |
sploving1 | I just see narray examples, but no api doc | 10:19 |
sploving1 | like numpy has | 10:19 |
@sonney2k | serialhex, in case you are around again ping us | 10:19 |
@sonney2k | sploving1, I would do the same kind of low-level support that you did for lua | 10:20 |
sploving1 | sonney2k? | 10:20 |
sploving1 | what do you mean? | 10:20 |
@sonney2k | sploving1, just using arrays | 10:22 |
-!- sploving1 [~sploving@124.16.139.134] has quit [Remote host closed the connection] | 10:22 | |
@sonney2k | ? | 10:23 |
@sonney2k | hmmhh it looks like narray is still being developed | 10:24 |
@sonney2k | so it is probably worth supporting | 10:24 |
@sonney2k | and the api is in narray.h | 10:27 |
-!- sploving1 [~sploving@124.16.139.194] has joined #shogun | 10:30 | |
sploving1 | sonney2k,/../../src/interfaces/lua_modular/modshogun.so: undefined symbol: _ZN6shogun4CGMM10train_smemEiidid | 10:30 |
sploving1 | my machine crashed just now. I need to reboot and just run make (cannot run other applications) to compile shogun | 10:31 |
sploving1 | but I met this problem: lua: error loading module 'modshogun' from file '../../../src/interfaces/lua_modular/modshogun.so': | 10:31 |
sploving1 | ../../../src/interfaces/lua_modular/modshogun.so: undefined symbol: _ZN6shogun4CGMM10train_smemEiidid | 10:31 |
@sonney2k | sploving1, yes it needs 1.5 G to compile | 10:31 |
sploving1 | I did git clean -dfx and configured it with lua/python modular | 10:33 |
@sonney2k | sploving1, I am doing now too | 10:34 |
@sonney2k | sploving1, btw you can use narray.h for the api of ruby's narray | 10:37 |
sploving1 | oh. I hope there is an api doc | 10:39 |
@sonney2k | sploving1, I couldn't find any - but the .h does contain the needed info | 10:39 |
@sonney2k | e.g. IsNArray() to test if the obj. is an narray | 10:39 |
sploving1 | okay. I will take a look at it | 10:40 |
@sonney2k | and there is RNArray with the data | 10:40 |
sploving1 | if it is similar to python, it may not be difficult | 10:40 |
sploving1 | numpy, I mean | 10:40 |
@sonney2k | it is definitely similar and not beautiful :) | 10:42 |
@sonney2k | which example did not work? | 10:43 |
@sonney2k | sploving1, ^ | 10:43 |
sploving1 | feature | 10:44 |
sploving1 | features_simple_real_modular.lua, sonney2k | 10:44 |
sploving1 | does fresh shogun work well on your computer? | 10:45 |
@sonney2k | I just did git clean -dfx and recompiled | 10:45 |
@sonney2k | it works.... | 10:45 |
@sonney2k | how do you run the example? | 10:46 |
sploving1 | export LUA_PATH=../../../src/interfaces/lua_modular/?.lua\;?.lua | 10:46 |
sploving1 | export LUA_CPATH=../../../src/interfaces/lua_modular/?.so | 10:46 |
sploving1 | then lua features_simple_real_modular.lua | 10:46 |
@sonney2k | (I ran ./check.sh) | 10:46 |
@sonney2k | yes that works too | 10:47 |
sploving1 | oh. I will compile it again | 10:50 |
-!- sploving1 [~sploving@124.16.139.194] has left #shogun [] | 10:50 | |
-!- warpyyy [~theuser@212.179.28.34] has joined #shogun | 10:52 | |
@sonney2k | blackburn, does current master compile and run for you? | 10:52 |
blackburn | min | 10:53 |
-!- warpyyy [~theuser@212.179.28.34] has quit [Read error: Connection reset by peer] | 10:54 | |
blackburn | sonney2k: yes, all ok, interfaces=java_modular | 10:57 |
@sonney2k | ok then it must be sth on splovings side | 11:00 |
* sonney2k is transitioning CLabels for SGVector | 11:01 | |
CIA-87 | shogun: Soeren Sonnenburg master * re9d4632 / src/interfaces/lua_modular/swig_typemaps.i : | 11:47 |
CIA-87 | shogun: Merge pull request #232 from sploving/master | 11:47 |
CIA-87 | shogun: fix matrix typemap(columns first then rows) - https://github.com/shogun-toolbox/shogun/commit/e9d463232d97ebb290e9db25ae31905469c22acf | 11:47 |
CIA-87 | shogun: Baozeng Ding master * rad8130a / src/interfaces/lua_modular/swig_typemaps.i : fix matrix typemap(columns first then rows) - https://github.com/shogun-toolbox/shogun/commit/ad8130a578f008d729f4f2525b83631b52ba4620 | 11:47 |
-!- sploving1 [~sploving@124.16.139.194] has joined #shogun | 12:06 | |
sploving1 | now lua features_simple_real works! | 12:07 |
sploving1 | sonney2k, do you know why classifier_averaged_perceptron_modular.py produces different results? | 12:08 |
@sonney2k | sploving1, and kernel too? | 12:09 |
@sonney2k | sploving1, I am working on that perceptron issue | 12:09 |
@sonney2k | since that issue appears in python too - it must be some general problem | 12:09 |
sploving1 | I am trying kernel now | 12:09 |
sploving1 | kernel_gaussian_modular, https://gist.github.com/1103871 | 12:16 |
sploving1 | sonney2k, not the same. lua generates such a long result! | 12:16 |
@sonney2k | sploving1, could you please test km_train first? | 12:18 |
sploving1 | okay | 12:18 |
@sonney2k | it should be as big as the number of columns | 12:18 |
@sonney2k | times the number of columns | 12:18 |
@sonney2k | IIRC 92x92 | 12:18 |
sploving1 | lua: km_train: 92*8 | 12:23 |
sploving1 | sonney2k, how do I print it in python? | 12:23 |
sploving1 | it has ... in the result | 12:24 |
@sonney2k | you mean print(x) ? | 12:24 |
sploving1 | yeap | 12:24 |
@sonney2k | repr() | 12:24 |
@sonney2k | x.repr() | 12:24 |
@sonney2k | sploving1, how can km_train for lua be 92x8 ? | 12:25 |
@sonney2k | not possible... | 12:25 |
sploving1 | sonney2k, 'numpy.ndarray' object has no attribute 'repr' | 12:25 |
sploving1 | km_train.repr()?? | 12:25 |
sploving1 | sonney2k, what should it be? 92*92? | 12:26 |
@sonney2k | repr(km_train) | 12:26 |
@sonney2k | yes | 12:26 |
sploving1 | sonney2k, https://gist.github.com/1103871, the python output still cannot be dumped using repr. it has ... in the output | 12:28 |
sploving1 | I mean it omits some results | 12:30 |
sploving1 | using repr, or print | 12:30 |
@sonney2k | sploving1, then use numpy.savetxt('somefilename', km_train) | 12:31 |
@sonney2k | but that is not the actual problem... | 12:32 |
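A sketch of getting the full matrix past numpy's "..." elision; km_train below is a random stand-in for the kernel matrix from the example:

```python
# Hedged sketch: two ways to see all 92x92 entries instead of the elided repr.
import numpy

km_train = numpy.random.rand(92, 92)       # stand-in for the example's kernel matrix
numpy.set_printoptions(threshold=100000)   # print every element
print(km_train)
numpy.savetxt('km_python.txt', km_train)   # or dump to a file and diff vs lua
```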
@sonney2k | btw the lua matrix is 92x92 here too | 12:34 |
sploving1 | https://gist.github.com/1103887 | 12:35 |
sploving1 | this is the result | 12:35 |
sploving1 | 92*92? how do you know that?? | 12:36 |
sploving1 | I just counted roughly rows = 92, cols = 8, sonney2k | 12:36 |
@sonney2k | nope 92x92 | 12:40 |
@sonney2k | and gives same result as in python btw | 12:42 |
sploving1 | sonney2k, you are so great. can you tell me how you do that? | 12:44 |
sploving1 | sonney2k, I did not see the right side | 12:45 |
@sonney2k | I just write the output of the lua matrix to a file | 12:45 |
sploving1 | sorry for that | 12:45 |
sploving1 | you mean lua *.lua > 1.txt? sonney2k | 12:46 |
@sonney2k | yes | 12:46 |
sploving1 | okay. I will move to ruby | 12:50 |
sploving1 | sonney2k I gtg bye | 12:51 |
@bettyboo | see you | 12:51 |
@sonney2k | sploving1, don't forget strings | 12:51 |
@sonney2k | in lua I mean | 12:51 |
@sonney2k | you haven't tested these yet | 12:51 |
sploving1 | sonney2k, okay. I will test them. to get the output, just lua *.lua > 1.txt? | 12:52 |
@sonney2k | sploving1, I don't understand what you mean | 12:52 |
sploving1 | on my machine, I did not see the right side of the columns | 12:52 |
sploving1 | 1 0.026404555080215 0.00051704925476818 0.013315592813501 0.98824215105277 0.65147692801705 0.14929587268109 2.354172745438e-05 0.28323830778227 | 12:53 |
sploving1 | for instance i just see the first row is the above | 12:53 |
sploving1 | but when I paste them on pastebin, it has the right side | 12:53 |
sploving1 | so strange | 12:53 |
-!- sploving1 [~sploving@124.16.139.194] has left #shogun [] | 12:54 | |
@sonney2k | blackburn, why do you use m_labels in GaussianNB? | 13:08 |
@sonney2k | I mean you only need this in train() or? | 13:09 |
blackburn | hmm | 13:11 |
blackburn | I can't remember | 13:11 |
blackburn | let me look ;) | 13:11 |
blackburn | sonney2k: m_labels is used only in train, yes | 13:12 |
@sonney2k | ok then I remove m_labels from the .h | 13:12 |
@sonney2k | etc | 13:12 |
-!- heiko1 [~heiko@134.91.10.200] has joined #shogun | 13:35 | |
@sonney2k | heiko1, hey ... you survived :) | 13:41 |
heiko1 | sonney2k, yes, but pain all over the body ;) | 13:43 |
heiko1 | how do you say Muskelkater (sore muscles) in english? :) | 13:43 |
blackburn | hard weekend with gf? :D | 13:43 |
blackburn | the last words were 'girlfriend is waiting' IIRC :D | 13:44 |
heiko1 | oh, yes and that also :) | 13:44 |
heiko1 | and climbing | 13:44 |
* sonney2k sings la la la *VERY* *LOUDLY* | 13:44 | |
heiko1 | and you guys? did you have a good weekend? | 13:45 |
@sonney2k | blackburn had a lot of fun with java or coffee or so | 13:47 |
blackburn | haha | 13:47 |
blackburn | with HLLE too | 13:47 |
@sonney2k | I am currently transitioning labels to use SGVector for real (internally) | 13:48 |
@sonney2k | too | 13:48 |
-!- heiko1 [~heiko@134.91.10.200] has quit [Ping timeout: 258 seconds] | 15:27 | |
-!- heiko1 [~heiko@134.91.52.56] has joined #shogun | 15:33 | |
@sonney2k | heiko1, do you know by heart where you recently added a SG_NOT_IMPLEMENTED? | 15:41 |
heiko1 | yes, copy_subset() of CFeatures | 15:44 |
heiko1 | called by KernelMachine::train | 15:44 |
heiko1 | an example fails? | 15:45 |
@sonney2k | yeah serialization | 15:45 |
@sonney2k | but I don't think it was that one | 15:45 |
heiko1 | mmh | 15:45 |
@sonney2k | it is the gaussian kernel that is failing | 15:45 |
heiko1 | let me check | 15:45 |
heiko1 | compute feature vector of SimpleFeatures | 15:46 |
@sonney2k | heiko1, yes that one! | 15:47 |
heiko1 | this method just returns NULL there | 15:47 |
heiko1 | has to be overridden or something | 15:47 |
heiko1 | should I remove the SG_NOTIMPLEMENTED? | 15:47 |
@sonney2k | I don't know though why it is called | 15:47 |
heiko1 | i saw a call of it recently... i will check | 15:49 |
heiko1 | SimpleFeatures::get_feature_vector | 15:50 |
heiko1 | if no feature matrix is set | 15:50 |
@sonney2k | makes sense | 15:50 |
@sonney2k | so it could be that serializaton has some chicken / egg problem | 15:50 |
heiko1 | yes | 15:50 |
@sonney2k | that kernel should be loaded but features are not yet there or so | 15:51 |
heiko1 | mmh | 15:51 |
heiko1 | but if the SG_NOTIMPLEMENTED is removed | 15:51 |
heiko1 | NULL is returned there | 15:51 |
@sonney2k | and the gaussian kernel does some operation in load_serializable_post | 15:51 |
@sonney2k | to precompute some x_i^2 | 15:52 |
heiko1 | sonney2k, the fire-alarm just started howling here | 15:53 |
heiko1 | i will check whats happening | 15:53 |
@sonney2k | heiko1, ok | 15:53 |
heiko1 | indeed, there is a fire | 15:55 |
@sonney2k | wow! | 15:55 |
heiko1 | cars arriving, I will go outside for a minute | 15:55 |
@sonney2k | who is burning? | 15:55 |
heiko1 | (probably takes more) | 15:55 |
heiko1 | dont know | 15:55 |
blackburn | it is what they call 'extreme programming' | 15:56 |
heiko1 | i am in 6th floor | 15:56 |
heiko1 | afk... | 15:56 |
@sonney2k | blackburn, your name is heiko1 program | 15:56 |
blackburn | what& | 15:57 |
blackburn | > | 15:57 |
blackburn | ? | 15:57 |
@sonney2k | I see them all burning with black smoke | 15:57 |
@sonney2k | hope heiko1 manages to escape | 15:57 |
blackburn | where do you see it? | 15:57 |
@sonney2k | live tv of course ;-) | 15:58 |
blackburn | joke? ;) | 15:58 |
@sonney2k | no of course not *eg* | 15:58 |
@sonney2k | I guess this bug hunt here is driving me mad | 16:01 |
@sonney2k | which reminds me | 16:01 |
@sonney2k | blackburn, how is java going along? | 16:02 |
blackburn | sonney2k: currently working on HLLE | 16:02 |
@sonney2k | blackburn, that is not fair | 16:02 |
@sonney2k | you can have fun | 16:02 |
@sonney2k | I have to fix bugs | 16:02 |
blackburn | I'm currently fixing bugs in HLLE :D | 16:03 |
@sonney2k | you mean it is semi-fun | 16:03 |
@sonney2k | hmmhh not sure I can live with that | 16:03 |
heiko1 | re | 16:04 |
heiko1 | small fire | 16:04 |
heiko1 | wow 3 cars and police here | 16:04 |
heiko1 | but the building will not be evacuated | 16:04 |
@sonney2k | heiko1, what happened? | 16:14 |
@sonney2k | or what is going on? | 16:14 |
heiko1 | they are already gone, | 16:14 |
heiko1 | but I did not have the motivation to go down all 150 stair steps ;) | 16:15 |
heiko1 | probably nothing too scary | 16:15 |
@sonney2k | ah no elevator... | 16:15 |
@sonney2k | you could have climbed... | 16:15 |
heiko1 | yes, but I don't like being in the elevator | 16:15 |
heiko1 | yes, it's possible here :) | 16:15 |
@sonney2k | anyway heiko1 I made some progress on this here | 16:15 |
heiko1 | ok? | 16:15 |
@sonney2k | the strange thing is that the feature object got loaded just fine | 16:16 |
@sonney2k | and it pretends that it did also load the matrix | 16:16 |
@sonney2k | but for some reason not?! | 16:16 |
heiko1 | mmmh | 16:18 |
heiko1 | are these simple features? | 16:18 |
@sonney2k | yes | 16:18 |
@sonney2k | heiko1, I only noticed because I changed all of labels and thus lots of classifiers | 16:19 |
@sonney2k | and so I recognized that some examples fail... | 16:19 |
heiko1 | is the load method of SimpleFeatures called? | 16:20 |
@sonney2k | enabling debug I see that the simple features are already loaded... | 16:21 |
@sonney2k | heiko1, w/ debug on I see that | 16:45 |
@sonney2k | [DEBUG] START LOADING CSGObject 'SimpleFeatures' | 16:45 |
@sonney2k | .... | 16:45 |
@sonney2k | [DEBUG] Loading parameter 'feature_matrix' of type 'Matrix<float64>' | 16:45 |
@sonney2k | [DEBUG] DONE LOADING CSGObject 'SimpleFeatures' (0x3c973b0) | 16:45 |
@sonney2k | but then in kernel lhs=0x3c973b0 '(nil)' num_vec_fm=0 num_feat_fm=0 num_vec=20 num_feat=2 | 16:45 |
@sonney2k | the nil there corresponds to no feature matrix around | 16:45 |
@sonney2k | and the num_vec/feat_fm =0 indicate that the matrix indeed did not get loaded | 16:46 |
heiko1 | strange :( | 16:48 |
@sonney2k | heiko1, does the matrix stuff work at all? | 16:48 |
@sonney2k | let me create a foolproof example | 16:48 |
heiko1 | what do you mean with matrix stuff? | 16:48 |
heiko1 | ok | 16:49 |
@sonney2k | answer is no | 16:50 |
@sonney2k | from modshogun import * | 16:50 |
@sonney2k | from numpy import array | 16:50 |
@sonney2k | feats=RealFeatures(array([[1.0,2,3],[4,5,6]])) | 16:50 |
@sonney2k | fstream = SerializableAsciiFile("foo.asc", "w") | 16:50 |
@sonney2k | feats.save_serializable(fstream) | 16:50 |
@sonney2k | but in foo.asc we have feature_matrix Matrix<float64> 0 0 () | 16:51 |
heiko1 | so which part is not working? save or load? | 16:52 |
@sonney2k | save | 16:52 |
@sonney2k | heiko1, vector works though | 16:52 |
@sonney2k | l=Labels(array([1.0,2,3])) | 16:52 |
@sonney2k | fstream = SerializableAsciiFile("foo2.asc", "w") | 16:52 |
@sonney2k | l.save_serializable(fstream) | 16:52 |
@sonney2k | labels Vector<float64> 3 ({1}{2}{3}) | 16:52 |
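A round-trip version of the matrix repro above (load_serializable, close and get_feature_matrix are assumed to be exposed as in the rest of python_modular); before the fix the reloaded matrix comes back empty, matching the '0 0 ()' in foo.asc:

```python
# Hedged sketch: save the 2x3 matrix, reload it, and inspect the result.
from numpy import array
from modshogun import RealFeatures, SerializableAsciiFile

feats = RealFeatures(array([[1.0, 2, 3], [4, 5, 6]]))
fstream = SerializableAsciiFile("foo.asc", "w")
feats.save_serializable(fstream)
fstream.close()

fstream = SerializableAsciiFile("foo.asc", "r")
feats2 = RealFeatures()
feats2.load_serializable(fstream)
print(feats2.get_feature_matrix())  # should print the 2x3 matrix once fixed
```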
@sonney2k | heiko1, do you have an idea where I should look for the bug? | 16:53 |
@sonney2k | or do you even know what the problem could be? | 16:53 |
heiko1 | mmh | 16:53 |
heiko1 | i mean the save code of SimpleFeatures is short. | 16:54 |
heiko1 | writer->f_write(feature_matrix, num_features, num_vectors); | 16:54 |
@sonney2k | heiko1, not the save code of simplefeatures | 16:54 |
@sonney2k | serialization | 16:54 |
heiko1 | ah sorry | 16:54 |
@sonney2k | like where you did add support for SGVector / SGMatrix :) | 16:54 |
heiko1 | oh, ... mmh, perhaps the add methods of Parameter I did are wrong | 16:55 |
@sonney2k | or not - we have to find out | 16:55 |
@sonney2k | that was in shogun/base/Parameter.cpp? | 16:56 |
heiko1 | yes | 16:56 |
heiko1 | all the add methods | 16:56 |
heiko1 | with SGVector SGMatrix | 16:56 |
heiko1 | hope there's no mistake in them | 16:57 |
@sonney2k | but we are not even using SGMatrix etc | 16:57 |
heiko1 | oh | 16:57 |
heiko1 | mmh | 16:57 |
heiko1 | then this can't be the mistake | 16:57 |
@sonney2k | we use the add_matrix stuff | 16:57 |
heiko1 | but this wasn't touched recently, was it? | 16:58 |
@sonney2k | maybe for the subsetting business | 16:59 |
heiko1 | yes | 16:59 |
@sonney2k | heiko1, I mean there are feature_matrix_num_vectors etc | 16:59 |
@sonney2k | and these are 0 too | 17:00 |
heiko1 | oh | 17:00 |
heiko1 | I just got an idea | 17:00 |
heiko1 | let me check | 17:00 |
@sonney2k | and indeed they are | 17:00 |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 17:00 | |
heiko1 | perhaps this has something to do with the variables that I changed | 17:01 |
heiko1 | the add methods have also been changed | 17:01 |
heiko1 | At some point I replaced the features with an SGVector | 17:01 |
@sonney2k | heiko1, no I think it is a bug in simplefeatures somehow | 17:01 |
heiko1 | but undid this | 17:01 |
heiko1 | yes | 17:01 |
heiko1 | in SimpleFeatures | 17:01 |
@sonney2k | I mean dimensions of feature matrix need to be non-zero | 17:02 |
@sonney2k | could very well be my fault too... | 17:02 |
heiko1 | mmh | 17:02 |
blackburn | hooray to new heisenbug in arpack wrapper! | 17:02 |
heiko1 | sonney2k, I have an appointment in a few minutes, sorry for that, but I will be back later | 17:02 |
@sonney2k | heiko1, found the bug! | 17:03 |
heiko1 | where? | 17:04 |
@sonney2k | heiko1, in the set_feature_matrix for SGMatrix type | 17:04 |
@sonney2k | setting feature_matrix_num_vectors was forgotten | 17:05 |
heiko1 | alright | 17:05 |
heiko1 | then | 17:05 |
@sonney2k | everywhere else it was ok | 17:05 |
heiko1 | glad you found it :) | 17:05 |
heiko1 | so, see you in the evening | 17:06 |
-!- in3xes [~in3xes@180.149.49.227] has quit [Ping timeout: 276 seconds] | 17:16 | |
-!- in3xes [~in3xes@180.149.49.227] has joined #shogun | 17:28 | |
CIA-87 | shogun: Soeren Sonnenburg master * rd85cb65 / (23 files in 9 dirs): | 17:35 |
CIA-87 | shogun: remove unused confidences from labels and add SGVector in methods | 17:35 |
CIA-87 | shogun: utilizing labels when possible - https://github.com/shogun-toolbox/shogun/commit/d85cb655161d037c2d0a0c2f0970f44de3a9e131 | 17:35 |
CIA-87 | shogun: Soeren Sonnenburg master * rd2f13fb / examples/undocumented/python_modular/serialization_matrix_modular.py : add example to just serialize matrix - https://github.com/shogun-toolbox/shogun/commit/d2f13fb17d3f9998af1a175dfd4e2bea4544fb3d | 17:35 |
CIA-87 | shogun: Soeren Sonnenburg master * r58fd62c / (10 files in 7 dirs): | 17:35 |
CIA-87 | shogun: various bugfixes related to SGVector/SGMatrix transition | 17:35 |
CIA-87 | shogun: - in out typemap of python_modular for vectors | 17:35 |
CIA-87 | shogun: - in simplefeatures set_feature_matrix | 17:35 |
CIA-87 | shogun: - KRR double free | 17:35 |
CIA-87 | shogun: ... - https://github.com/shogun-toolbox/shogun/commit/58fd62cc8e2765e82b9e4fc5926603942355d47e | 17:35 |
-!- CIA-87 was kicked from #shogun by bettyboo [flood] | 17:35 | |
-!- CIA-87 [~CIA@cia.atheme.org] has joined #shogun | 17:35 | |
@sonney2k | blackburn, yay! | 17:36 |
@sonney2k | all examples work again :) | 17:36 |
@sonney2k | python_modular only of course | 17:37 |
blackburn | nice | 17:39 |
CIA-87 | shogun: Soeren Sonnenburg master * r00d57d5 / examples/undocumented/python_modular/clustering_gmm_modular.py : fix clustering example - https://github.com/shogun-toolbox/shogun/commit/00d57d584f283a8ed9512f97e74b6a8413b7b662 | 17:45 |
CIA-87 | shogun: Sergey Lisitsyn master * rbcb7bb8 / (2 files): Added DORGQR routine wrapper for lapack - https://github.com/shogun-toolbox/shogun/commit/bcb7bb8f5a965e5eb7e51f7c158a4015411f5cf5 | 18:37 |
CIA-87 | shogun: Sergey Lisitsyn master * r988360c / (5 files in 2 dirs): Introduced Hessian Locally Linear Embedding preprocessor - https://github.com/shogun-toolbox/shogun/commit/988360ca58dfa56181aedb7cb7781b7d64ee0a3d | 18:37 |
CIA-87 | shogun: Sergey Lisitsyn master * r179091b / (2 files): Fixed LLE and added HLLE python modular example - https://github.com/shogun-toolbox/shogun/commit/179091b1af287374bc754b852f2a833321e8704e | 18:41 |
blackburn | sonney2k: vodka? | 18:42 |
-!- gsomix [~gsomix@109.169.131.11] has joined #shogun | 19:25 | |
gsomix | hi all | 19:25 |
-!- f-x [~user@117.192.204.152] has quit [Remote host closed the connection] | 19:28 | |
gsomix | sonney2k, I saw my own ohloh account. I think I did something wrong with committing. :) | 19:30 |
gsomix | All Languages. Total Lines Changed: 221,575. | 19:34 |
heiko1 | sonney2k, are you there? | 20:06 |
@sonney2k | heiko1, yes | 20:47 |
@sonney2k | heiko1, wassup? | 20:47 |
heiko1 | did you receive my email? | 20:48 |
heiko1 | http://pastebin.com/mjPq525g | 20:48 |
heiko1 | I want to create StringFeatures | 20:48 |
heiko1 | but it does not work | 20:48 |
heiko1 | because CAlphabet::check_alphabet() fails | 20:48 |
heiko1 | ALPHABET does not contain all symbols in histogram | 20:48 |
heiko1 | CAlphabet::6210 | 20:48 |
heiko1 | 610 | 20:48 |
heiko1 | and I am a bit unsure, what I am doing wrong here | 20:49 |
heiko1 | the program creates some random char strings (this works, they are printed) and then creates a CStringFeatures instance | 20:49 |
heiko1 | and then SG_ERROR | 20:49 |
CIA-87 | shogun: Soeren Sonnenburg master * rfe563a7 / src/shogun/features/Alphabet.h : | 20:51 |
CIA-87 | shogun: fix documentation for ALPHANUM and PROTEIN alphabets (they take upper | 20:51 |
CIA-87 | shogun: case chars not lowercase) - https://github.com/shogun-toolbox/shogun/commit/fe563a7c46e90eace5ec4177ab458883510986db | 20:51 |
@sonney2k | heiko1, ^ | 20:51 |
@sonney2k | UPPER CASE | 20:52 |
@sonney2k | so 0x41 and more | 20:52 |
heiko1 | ah | 20:52 |
heiko1 | oh no :) | 20:53 |
heiko1 | silly mistake | 20:53 |
@sonney2k | heiko1, well documentation was wrong... | 20:53 |
@sonney2k | it said a-z | 20:53 |
@sonney2k | not A-Z ... | 20:53 |
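A minimal python_modular sketch of the fix, assuming StringCharFeatures and the DNA alphabet enum: draw the random symbols from upper-case 'ACGT' (0x41 and up):

```python
# Hedged sketch: DNA/PROTEIN alphabets expect UPPER case symbols, so random
# strings must use 'ACGT', not 'acgt', to pass CAlphabet::check_alphabet().
import random
from modshogun import StringCharFeatures, DNA

strings = [''.join(random.choice('ACGT') for _ in range(10)) for _ in range(5)]
feats = StringCharFeatures(strings, DNA)
print(feats.get_num_vectors())
```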
@sonney2k | blackburn, vodka! | 20:53 |
blackburn | not yet! found error :D | 20:54 |
@sonney2k | blackburn, you definitely need that to start working on java :D | 20:54 |
@sonney2k | ohh | 20:54 |
blackburn | with arpack the solution is kinda wrong | 20:54 |
heiko1 | sonney2k, well however, thanks .) | 20:54 |
blackburn | temporarily I will force it to use lapack | 20:54 |
@sonney2k | gsomix, sounds like it... | 20:54 |
@sonney2k | heiko1, IIRC there is some debug option - then it will display the histogram... | 21:05 |
heiko1 | sonney2k, works now :) | 21:05 |
CIA-87 | shogun: Sergey Lisitsyn master * r9ff9954 / (3 files): Added force_lapack option for Locally Linear Embedding and forced HLLE to use lapack solver - https://github.com/shogun-toolbox/shogun/commit/9ff99546af4be84dfc1c079b4318f29e7b729735 | 21:06 |
@sonney2k | heiko1, surprise ;-) | 21:06 |
heiko1 | sonney2k, I will try to generalise the model selection parameters now | 21:06 |
heiko1 | was trying to do model selection with a string kernel | 21:06 |
heiko1 | but it has int32_t type parameters | 21:06 |
heiko1 | that does not work yet | 21:06 |
@sonney2k | heiko1, so you do all the other standard types like int / byte etc right? | 21:06 |
@sonney2k | should be easy given that you have double working already | 21:07 |
heiko1 | yes | 21:07 |
@sonney2k | enums can be a bit problematic though - but in the end these are ints too | 21:07 |
heiko1 | yes | 21:07 |
heiko1 | i am not that deep into generics | 21:09 |
heiko1 | hope it is possible to append an instance of one modelparam to another with another type | 21:09 |
@sonney2k | heiko1, enums are usually represented as integers - so at least from C/C++ / python it will work | 21:11 |
@sonney2k | one could specify illegal values though - but that should be caught anyway | 21:12 |
heiko1 | uuh this is harder than i thought | 21:23 |
heiko1 | all these generic classes build trees | 21:23 |
heiko1 | I don't know if it is possible to have a data structure that holds instances of generic classes with different types | 21:23 |
heiko1 | no, it's not | 21:25 |
heiko1 | mmh | 21:25 |
heiko1 | sonney2k, basically this problem: | 21:29 |
heiko1 | http://www.cplusplus.com/forum/general/1281/ | 21:29 |
@sonney2k | yes indeed that is not working | 21:32 |
@sonney2k | heiko1, but can't you define all the types / trees that you need? | 21:32 |
heiko1 | but one tree may have different node types | 21:32 |
@sonney2k | I mean there are only a handful | 21:32 |
heiko1 | like one parameter int and another float | 21:33 |
heiko1 | both children of one node | 21:33 |
@sonney2k | what you could certainly do is to store the node types | 21:34 |
heiko1 | yes | 21:35 |
heiko1 | I think I will have to do that | 21:35 |
@sonney2k | and then have some access function that gives you the correct item from a union or so based on that type | 21:35 |
@sonney2k | or you have a node content base class | 21:36 |
@sonney2k | and then have derived classes for each node content type | 21:37 |
@sonney2k | but same thing not type safe | 21:37 |
@sonney2k | you need to store the type again | 21:37 |
heiko1 | yes | 21:37 |
blackburn | LLE IS FUCKING UNSTABLE | 21:38 |
blackburn | :E | 21:38 |
heiko1 | I think I will just store the type and save the data with void pointers | 21:38 |
@sonney2k | considering that we need just int, byte, bool - it is probably easiest to just use a union | 21:38 |
@sonney2k | or that yes | 21:39 |
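A Python sketch of the idea (the real code is C++, with the value in a union or behind a void* plus an explicit type tag): each node records its type, so one parameter tree can hold int, float and bool nodes together:

```python
# Hedged sketch of a type-tagged parameter tree node.
class ParamNode:
    def __init__(self, name, type_tag, value=None, children=None):
        self.name = name
        self.type_tag = type_tag   # e.g. 'int32', 'float64', 'bool'
        self.value = value         # in C++: a union member or void* cast by tag
        self.children = children or []

tree = ParamNode('kernel', 'node', children=[
    ParamNode('degree', 'int32', 3),
    ParamNode('width', 'float64', 2.1),
])
```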
@sonney2k | blackburn, like java examples SCNR :D | 21:39 |
blackburn | it drives me mad | 21:40 |
blackburn | I thought HLLE does | 21:40 |
blackburn | but it is not | 21:40 |
blackburn | I guess I should write sth like "If the embedding is shitty, try other data" in the description | 21:41 |
blackburn | sonney2k: you want me to do some not-fun things, don't you? | 21:50 |
* blackburn wonders if there will be some ModifiedHessianLocallyLinearEmbedding | 21:52 | |
blackburn | or even ImplicitlyRestartedModifiedStableHessianLocallySublinearMegaEmbedding | 21:53 |
@sonney2k | ;) | 21:56 |
blackburn | oh | 22:04 |
blackburn | HLLE rocks! | 22:04 |
blackburn | sonney2k: I guess you might understand how good it is: http://dl.dropbox.com/u/10139213/shogun/hlle-faces-k10.png | 22:05 |
@sonney2k | sure | 22:09 |
blackburn | damn that guy looks just like me :D | 22:10 |
-!- in3xes [~in3xes@180.149.49.227] has quit [Quit: Leaving] | 22:15 | |
-!- gsomix [~gsomix@109.169.131.11] has quit [Ping timeout: 240 seconds] | 22:26 | |
CIA-87 | shogun: Sergey Lisitsyn master * r053c634 / .gitignore : Added some more filetypes to ignore by git - https://github.com/shogun-toolbox/shogun/commit/053c634fce544b1efb1b2141826dc5d4d7ba3ad7 | 22:35 |
CIA-87 | shogun: Sergey Lisitsyn master * rac722b7 / src/shogun/mathematics/arpack.cpp : Improved arpack wrapper - https://github.com/shogun-toolbox/shogun/commit/ac722b7fafeb94352f855b66dbfe66aca0cdd509 | 22:35 |
CIA-87 | shogun: Sergey Lisitsyn master * r1fff667 / src/shogun/preprocessor/LocallyLinearEmbedding.cpp : Improved stability of Locally Linear Embedding - https://github.com/shogun-toolbox/shogun/commit/1fff667d78a71a7ab8fd3c0ab9ca303ef8502359 | 22:35 |
CIA-87 | shogun: Sergey Lisitsyn master * re11c4bc / src/shogun/preprocessor/HessianLocallyLinearEmbedding.cpp : Improved HLLE - https://github.com/shogun-toolbox/shogun/commit/e11c4bcfcfab8163380b8c13933aaa3bc9d171a3 | 22:35 |
CIA-87 | shogun: Sergey Lisitsyn master * r05427ed / src/shogun/preprocessor/LocallyLinearEmbedding.cpp : Changed solver for LLE and removed unnecessary default parameter - https://github.com/shogun-toolbox/shogun/commit/05427edcff160e1241f9b9b8ad6674e8fc7b31cc | 22:39 |
-!- blackburn [~blackburn@188.122.239.253] has left #shogun [] | 23:42 | |
--- Log closed Tue Jul 26 00:00:21 2011 |