--- Log opened Thu Nov 13 00:00:06 2014
-!- DSrupt [~DSrupt@73.6.109.86] has joined #shogun | 01:18 | |
-!- DSrupt [~DSrupt@73.6.109.86] has quit [Quit: (null)] | 03:06 | |
-!- txomon|home [~txomon@unaffiliated/txomon] has quit [Ping timeout: 258 seconds] | 03:17 | |
-!- txomon|home [~txomon@unaffiliated/txomon] has joined #shogun | 03:18 | |
wiking | shogun-buildbot: stop build nightly_default "restarting" | 03:44 |
shogun-buildbot | build 929 interrupted | 03:44 |
wiking | shogun-buildbot: force build --develop=branch 'nightly_default' | 03:45 |
shogun-buildbot | Something bad happened (see logs) | 03:45 |
wiking | shogun-buildbot: force build --branch=develop 'nightly_default' | 03:45 |
shogun-buildbot | The build has been queued, I'll give a shout when it starts | 03:45 |
shogun-buildbot | build #930 forced | 03:51 |
shogun-buildbot | I'll give a shout when the build finishes | 03:51 |
-!- pickle27 [~pickle27@192-0-136-118.cpe.teksavvy.com] has joined #shogun | 03:55 | |
-!- shogun-notifier- [~irker@7nn.de] has joined #shogun | 03:57 | |
shogun-notifier- | shogun: Viktor Gal :develop * cc1b966 / doc/Doxyfile_cn.in,doc/Doxyfile_en.in: https://github.com/shogun-toolbox/shogun/commit/cc1b9661e20986257f63909dca0d1b19561442b2 | 03:57 |
shogun-notifier- | shogun: Fix mathjax script location in doxygen config files | 03:57 |
shogun-notifier- | shogun: [ci skip] | 03:57 |
wiking | shogun-buildbot: stop build nightly_default "restarting" | 03:57 |
shogun-buildbot | build 930 interrupted | 03:57 |
shogun-buildbot | Hey! build nightly_default #930 is complete: Exception [exception interrupted] | 03:57 |
shogun-buildbot | Build details are at http://buildbot.shogun-toolbox.org/builders/nightly_default/builds/930 | 03:57 |
wiking | shogun-buildbot: force build --branch=develop 'nightly_default' | 03:57 |
shogun-buildbot | build #931 forced | 03:57 |
shogun-buildbot | I'll give a shout when the build finishes | 03:57 |
-!- Floatingman [~Floatingm@c-68-52-34-232.hsd1.tn.comcast.net] has quit [Remote host closed the connection] | 05:36 | |
-!- Floatingman [~Floatingm@c-68-52-34-232.hsd1.tn.comcast.net] has joined #shogun | 05:41 | |
-!- pickle27 [~pickle27@192-0-136-118.cpe.teksavvy.com] has quit [Remote host closed the connection] | 06:21 | |
shogun-buildbot | build #931 of nightly_default is complete: Failure [failed notebooks] Build details are at http://buildbot.shogun-toolbox.org/builders/nightly_default/builds/931 | 06:46 |
shogun-buildbot | build #886 of precise - libshogun is complete: Failure [failed compile] Build details are at http://buildbot.shogun-toolbox.org/builders/precise%20-%20libshogun/builds/886 blamelist: Viktor Gal <viktor.gal@maeth.com> | 06:49 |
shogun-buildbot | build #913 of FCRH - libshogun is complete: Failure [failed test] Build details are at http://buildbot.shogun-toolbox.org/builders/FCRH%20-%20libshogun/builds/913 blamelist: Viktor Gal <viktor.gal@maeth.com> | 06:53 |
-!- shogun-notifier- [~irker@7nn.de] has quit [Quit: transmission timeout] | 06:57 | |
shogun-buildbot | build #119 of osx2 - modular_interfaces is complete: Failure [failed csharp modular] Build details are at http://buildbot.shogun-toolbox.org/builders/osx2%20-%20modular_interfaces/builds/119 blamelist: Viktor Gal <viktor.gal@maeth.com> | 06:59 |
shogun-buildbot | build #480 of debian wheezy - memcheck is complete: Success [build successful] Build details are at http://buildbot.shogun-toolbox.org/builders/debian%20wheezy%20-%20memcheck/builds/480 | 09:36 |
-!- HeikoS [~heiko@0545399b.skybroadband.com] has joined #shogun | 12:05 | |
-!- mode/#shogun [+o HeikoS] by ChanServ | 12:05 | |
wiking | HeikoS: ping | 12:17 |
@HeikoS | wiking: pong | 12:18 |
-!- Heikotablet [~androirc@0545399b.skybroadband.com] has joined #shogun | 12:20 | |
Heikotablet | wiking, whatsup | 12:20 |
wiking | Heikotablet: do u have matlab at UCL? | 12:23 |
Heikotablet | Yes | 12:23 |
wiking | Heikotablet: ok then get a machine + matlab + buildbot | 12:24 |
@HeikoS | wiking: can do that I think | 12:27 |
@HeikoS | wiking: can I use my desktop? | 12:27 |
@HeikoS | wiking: its 24/7 | 12:27 |
wiking | yeah | 12:27 |
wiking | u can | 12:27 |
@HeikoS | but can only install ubuntu packages | 12:27 |
wiking | that's ok | 12:27 |
@HeikoS | I dont have full admin rights | 12:27 |
@HeikoS | but can install anything from repo | 12:27 |
wiking | mmm question of course is | 12:27 |
wiking | whether u can open a custom port :P | 12:27 |
wiking | although no | 12:27 |
wiking | it's ok | 12:27 |
wiking | so u just need to | 12:27 |
wiking | apt-get install buildbot-slave | 12:28 |
@HeikoS | let me try | 12:28 |
wiking | and then the rest we can figure out | 12:28 |
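(For context: the `apt-get` line installs only the slave side; the new worker also has to be registered on the master. A minimal sketch of a buildbot 0.8.x-era `master.cfg` fragment — the slave name `ucl-matlab`, the password, and the port are placeholders standing in for whatever was agreed in PM, not real credentials:)

```python
# master.cfg fragment (buildbot 0.8.x style) -- "ucl-matlab" and
# "changeme" are illustrative placeholders, not the real credentials.
from buildbot.buildslave import BuildSlave

c = BuildmasterConfig = {}
c['slaves'] = [BuildSlave("ucl-matlab", "changeme")]
c['slavePortnum'] = 9989  # the "custom port" mentioned above
```

The slave then dials out to the master with something like `buildslave create-slave BASEDIR master.example.org:9989 ucl-matlab changeme` followed by `buildslave start BASEDIR` — only outbound access is needed, which is why the open-port concern gets dropped above.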
wiking | HeikoS: do u have g++ on that machine? :P | 12:28 |
wiking | i guess os or? | 12:28 |
wiking | *so | 12:28 |
@HeikoS | wiking: yeah yeah, I also use shogun on it | 12:28 |
@HeikoS | remember all my questions a while ago? | 12:28 |
@HeikoS | our cluster consists of all our desktop machines | 12:28 |
wiking | yeah ok because then we can haz swig-matlab | 12:28 |
@HeikoS | wiking: awesome! | 12:29 |
wiking | now the only question remains | 12:29 |
@HeikoS | ok installed buildbot slave | 12:29 |
wiking | ok | 12:29 |
wiking | lemme set it up | 12:29 |
@HeikoS | wiking: what do you need from me? | 12:29 |
wiking | HeikoS: i'm just writing you in pm | 12:29 |
-!- shogun-buildbot [~shogun-bu@7nn.de] has quit [Quit: buildmaster reconfigured: bot disconnecting] | 12:32 | |
-!- shogun-buildbot [~shogun-bu@7nn.de] has joined #shogun | 12:32 | |
-!- rajul [~rajul@59.89.130.252] has joined #shogun | 14:37 | |
wiking | hahah | 14:44 |
wiking | matlab_modular interface generated | 14:44 |
wiking | :D | 14:44 |
@lisitsyn | really? | 14:53 |
-!- rajul [~rajul@59.89.130.252] has quit [Ping timeout: 264 seconds] | 14:54 | |
wiking | yep | 14:54 |
wiking | but we need to generate the typemaps | 14:54 |
-!- rajul [~rajul@117.199.144.148] has joined #shogun | 15:06 | |
-!- rajul_ [~rajul@123.239.48.133] has joined #shogun | 15:13 | |
-!- rajul [~rajul@117.199.144.148] has quit [Ping timeout: 245 seconds] | 15:15 | |
-!- rajul_ is now known as rajul | 15:16 | |
@HeikoS | wiking: whoooooo! :) | 15:21 |
@HeikoS | wiking: might be cool for release, at least experimental | 15:21 |
@HeikoS | but I dont know how much work typemaps are | 15:21 |
@HeikoS | wiking: we should definitely take the opportunity to unit test the maps systematically as a draft for how to test the existing ones | 15:22 |
-!- HeikoS [~heiko@0545399b.skybroadband.com] has quit [Quit: Leaving.] | 15:37 | |
-!- HeikoS [~heiko@0545399b.skybroadband.com] has joined #shogun | 15:46 | |
-!- mode/#shogun [+o HeikoS] by ChanServ | 15:46 | |
@lisitsyn | HeikoS: hey | 15:47 |
@lisitsyn | HeikoS: jfyi I am ok with your date to talk about vw | 15:48 |
@HeikoS | lisitsyn: good to know | 15:54 |
@HeikoS | lets see what john says | 15:54 |
@HeikoS | sorry for postponing all the time | 15:54 |
@lisitsyn | HeikoS: np I am sick today anyway | 15:54 |
@HeikoS | lisitsyn: did he reply? | 15:54 |
@HeikoS | nope not yet | 15:54 |
@lisitsyn | HeikoS: not yet | 15:54 |
@lisitsyn | HeikoS: https://jakevdp.github.io/blog/2014/11/11/the-hipster-effect-interactive/ | 15:55 |
@HeikoS | lisitsyn: haha I saw that paper | 15:56 |
@HeikoS | nice that they have a notebook | 15:56 |
@lisitsyn | HeikoS: it ain't them | 15:56 |
@lisitsyn | it's jake from scikit learn | 15:57 |
@lisitsyn | HeikoS: can you recommend me something to read about variational inference? | 15:57 |
@HeikoS | yes | 15:58 |
@HeikoS | chris bishops book | 15:58 |
@lisitsyn | I am tired of being stupid :D | 15:58 |
@HeikoS | pattern recognition and machine learning | 15:58 |
@HeikoS | its the probabilistic ML bible | 15:58 |
@lisitsyn | hmm I think I glanced through it before | 15:58 |
@HeikoS | or Wu's notebook | 15:58 |
@HeikoS | its an easy idea | 15:58 |
@HeikoS | distribution is intractable | 15:58 |
@HeikoS | so you select one that you can deal with (mostly gaussian) | 15:58 |
@HeikoS | and then minimise KL div between approximation and true | 15:59 |
@lisitsyn | so in layman terms | 15:59 |
@lisitsyn | you just fit multidimensional gaussian to the distribution? | 15:59 |
@lisitsyn | via KL? | 15:59 |
@HeikoS | kind of | 16:00 |
@HeikoS | its not really the same as fitting the gaussian | 16:00 |
@HeikoS | minimizing KL is different | 16:00 |
@HeikoS | kullback leibler divergence | 16:00 |
@lisitsyn | yeah I know | 16:00 |
@lisitsyn | so you have that integral | 16:00 |
@HeikoS | some kind of distance between distributions | 16:00 |
@HeikoS | but not symmetric | 16:00 |
@HeikoS | and then you usually do this via maximising a lower bound | 16:00 |
@HeikoS | the KL gives you a lower bound on the marginal likelihood of the distribution you care about | 16:00 |
@lisitsyn | I don't get one thing yet | 16:01 |
@HeikoS | the bound is usually not tight (thats why its approximate) but you just hope for the best | 16:01 |
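(The KL minimisation described above is concrete in one dimension: for two Gaussians the divergence has a closed form, so both the "zero iff equal" property and the asymmetry mentioned a few lines up are easy to check. A minimal sketch in plain Python — the function name is illustrative, nothing shogun-specific:)

```python
import math

def kl_gauss(mu1, s1, mu2, s2):
    """Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ) for 1-D Gaussians."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# KL is zero exactly when the two distributions coincide ...
print(kl_gauss(0.0, 1.0, 0.0, 1.0))  # -> 0.0

# ... and it is not symmetric, which is why the direction of the
# approximation (q||p vs p||q) matters in variational inference
print(kl_gauss(0.0, 1.0, 1.0, 2.0))
print(kl_gauss(1.0, 2.0, 0.0, 1.0))
```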
@lisitsyn | so are we talking about inference or training? | 16:01 |
@HeikoS | getting posterior | 16:03 |
@HeikoS | have a representation of posterior | 16:03 |
@HeikoS | that is kind of training | 16:03 |
@lisitsyn | is posterior usually a gaussian? | 16:04 |
@HeikoS | but if you have a gaussian posterior, usually inference is easy | 16:04 |
@HeikoS | no usually not | 16:04 |
@HeikoS | thats why the approximation | 16:04 |
@lisitsyn | but you approximate with gaussian, right? | 16:04 |
@HeikoS | yes | 16:05 |
@HeikoS | kind of most cases | 16:05 |
@HeikoS | but sometimes you just assume a certain type of factorisation | 16:05 |
@HeikoS | that is you assume certain variables in posterior are independent | 16:05 |
@HeikoS | and then a parametric form of posterior (that is not gaussian) drops out of the model math | 16:05 |
@HeikoS | for example for LDA for topic modelling that happens | 16:05 |
@lisitsyn | HeikoS: hmm I see | 16:06 |
@HeikoS | variational bayes there is very similar to gibbs sampling updates | 16:06 |
@HeikoS | with the posterior approximations happening to be discrete /dirichlet | 16:06 |
@HeikoS | I gotta run off now, we can discuss a little later today if you want | 16:06 |
@lisitsyn | sure | 16:06 |
@HeikoS | see you :) | 16:06 |
@lisitsyn | thanks | 16:06 |
@lisitsyn | see you | 16:06 |
@HeikoS | Ill be back soon, just a talk now | 16:06 |
-!- HeikoS [~heiko@0545399b.skybroadband.com] has quit [Quit: Leaving.] | 16:06 | |
wiking | aahahahahahah | 16:09 |
wiking | ahahahhahahhahaha | 16:09 |
* wiking just had 3 shots in a row.... :D | 16:09 | |
-!- rajul [~rajul@123.239.48.133] has quit [Ping timeout: 256 seconds] | 16:24 | |
-!- rajul [~rajul@117.199.144.26] has joined #shogun | 16:50 | |
-!- HeikoS [~heiko@pat-191-250.internal.eduroam.ucl.ac.uk] has joined #shogun | 17:15 | |
-!- mode/#shogun [+o HeikoS] by ChanServ | 17:15 | |
@HeikoS | lisitsyn: re | 17:15 |
@lisitsyn | HeikoS: cool | 17:15 |
@lisitsyn | HeikoS: I can ask you random questions if you are not busy :D | 17:15 |
@HeikoS | lisitsyn: please do | 17:17 |
@lisitsyn | HeikoS: you were talking about gaussian posterior | 17:17 |
@lisitsyn | but what's about other distributions? | 17:17 |
@HeikoS | yes | 17:17 |
@lisitsyn | is it kind of engineering to choose the distribution | 17:18 |
@HeikoS | as said, for discrete posteriors things are usually different | 17:18 |
@HeikoS | lisitsyn: not really, the point about Gaussians is that one can integrate over them | 17:18 |
@HeikoS | but if you say learn a Gaussian mixture model | 17:18 |
@HeikoS | you do the same thing | 17:18 |
@HeikoS | you minimise the KL between the posterior and the mixture | 17:18 |
@HeikoS | for a standard GMM, one can do that in closed form and gets the EM algorithm | 17:19 |
@HeikoS | but for other mixture models, that might not be possible | 17:19 |
@HeikoS | so usually Gaussian, yes | 17:19 |
@lisitsyn | how inaccurate it is? | 17:19 |
@HeikoS | it depends on your posterior | 17:20 |
@HeikoS | if it doesnt look like a Gaussian | 17:20 |
@HeikoS | its not accurate | 17:20 |
@HeikoS | but you dont know how it looks | 17:20 |
@lisitsyn | hmm say I have features obtained from layer 2 of some deep learning net | 17:20 |
@HeikoS | so you dont know how accurate variational inference is | 17:20 |
@lisitsyn | will it be consistent to just try gaussian? | 17:21 |
@lisitsyn | I am just trying to get the thought behind it | 17:21 |
@HeikoS | what is the model | 17:21 |
@lisitsyn | what do you mean? | 17:21 |
@HeikoS | if you want to do variational inference you need a model | 17:22 |
@HeikoS | usually one starts with the model, and then comes up with inference algorithms for it | 17:22 |
@lisitsyn | ahh no I am speaking quite general | 17:22 |
@HeikoS | so what do you want to do? | 17:22 |
@lisitsyn | is the class of possible models broad? | 17:22 |
@HeikoS | it is an algorithm that characterises a posterior distribution | 17:22 |
@HeikoS | so *any* probabilistic model | 17:22 |
@lisitsyn | ahm I see | 17:23 |
@lisitsyn | HeikoS: I was talking about gps recently | 17:23 |
@lisitsyn | and was asked | 17:23 |
@lisitsyn | what is this limitation of 'answer' being gaussian | 17:23 |
@lisitsyn | like is it very restrictive | 17:24 |
@lisitsyn | and I still don't have a real answer :D | 17:24 |
@lisitsyn | HeikoS: what do you think? | 17:24 |
@HeikoS | the limitation is that your predictive uncertainty when you integrate over the posterior might be wrong | 17:24 |
@HeikoS | btw depends on what kind of gp | 17:25 |
@HeikoS | since for example regression is analytically tractable, posterior *is* gaussian | 17:25 |
@HeikoS | classification, it is not | 17:25 |
@HeikoS | but it usually is close to being Gaussian | 17:25 |
@lisitsyn | is regression always gaussian like? | 17:25 |
@HeikoS | yes | 17:26 |
@HeikoS | posterior is closed form | 17:26 |
@lisitsyn | ah | 17:26 |
@lisitsyn | ok I see | 17:26 |
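(The closed-form GP regression posterior mentioned above is short enough to write out. A sketch with numpy, assuming a squared-exponential kernel and noisy 1-D observations — helper names and the length-scale `ell` are illustrative, not shogun's API:)

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    """Exact GP regression posterior mean/cov at test points Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))   # train covariance + noise
    Ks = rbf(X, Xs)                          # train/test cross-covariance
    Kss = rbf(Xs, Xs)                        # test covariance
    alpha = np.linalg.solve(K, y)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, cov

X = np.array([-1.0, 0.0, 1.0])
y = np.sin(X)
mean, cov = gp_posterior(X, y, np.array([0.0]))
# with small noise, the posterior mean at a training input stays
# close to its target, and the posterior variance is small but positive
print(mean[0], cov[0, 0])
```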
@HeikoS | lisitsyn: | 17:26 |
@HeikoS | http://nbviewer.ipython.org/gist/yorkerlin/d991d9c0c7eeb14a62ff | 17:26 |
@HeikoS | wu put pictures of posterior and approximations to it | 17:26 |
@lisitsyn | let me check | 17:26 |
@lisitsyn | HeikoS: okay next random :D | 17:26 |
@lisitsyn | there is a book | 17:27 |
@lisitsyn | let me find | 17:27 |
@HeikoS | ah he did not put the plot | 17:27 |
@HeikoS | lisitsyn: but you can download and run | 17:27 |
@lisitsyn | ten lectures on statistical and structural pattern recognition | 17:27 |
@lisitsyn | have you seen that? | 17:28 |
@lisitsyn | http://books.google.ru/books/about/Ten_Lectures_on_Statistical_and_Structur.html?id=VLI2u3oVkAoC&redir_esc=y | 17:28 |
@HeikoS | nope | 17:28 |
@HeikoS | but i really recommend the GP book by Rasmussen and the ML book by Bishop | 17:28 |
@HeikoS | these two are really really good books | 17:28 |
@lisitsyn | ah yeah that's for sure | 17:28 |
@lisitsyn | but it was a random question ;) | 17:28 |
@lisitsyn | HeikoS: the thing is this book comes from some years before | 17:29 |
@lisitsyn | and they think about say Wald, Neyman-Pearson tasks | 17:29 |
@lisitsyn | some Anderson task etc | 17:29 |
@lisitsyn | like P(error) < e | 17:29 |
@lisitsyn | or anything like that | 17:29 |
@HeikoS | ah | 17:29 |
@HeikoS | well | 17:29 |
@lisitsyn | HeikoS: so my question is whether it is dead | 17:29 |
@HeikoS | I dont like this kind of statistic | 17:29 |
@lisitsyn | why? | 17:29 |
@HeikoS | personal taste | 17:30 |
@HeikoS | not bayesian | 17:30 |
@HeikoS | not powerful | 17:30 |
@HeikoS | small d, small n | 17:30 |
@HeikoS | more like foundations | 17:30 |
@lisitsyn | hmm I see | 17:30 |
@HeikoS | im more a computational statistics boy :) | 17:30 |
-!- rajul [~rajul@117.199.144.26] has quit [Ping timeout: 265 seconds] | 17:30 | |
@lisitsyn | HeikoS: you said about small d small n | 17:31 |
@lisitsyn | is it working well under these circumstances? | 17:31 |
@HeikoS | lisitsyn: these methods attack different problems | 17:31 |
@HeikoS | so you cannot really compare | 17:31 |
@lisitsyn | HeikoS: I am just trying to get what's really wrong about it | 17:32 |
@lisitsyn | the only thing I can say is that you don't know P anyway | 17:32 |
@lisitsyn | so speaking about P(error) is kind of wrong | 17:32 |
@HeikoS | mmh | 17:32 |
@HeikoS | nothing wrong about it | 17:33 |
@HeikoS | but everything depends on what you want to do | 17:33 |
@HeikoS | if you want to do hypothesis testing, then this is the way to go | 17:33 |
@HeikoS | all tools | 17:33 |
@HeikoS | how good they are depends on what you want to do | 17:33 |
@lisitsyn | say when would you choose variational inference? | 17:34 |
@lisitsyn | or more important - when would you choose something else :) | 17:34 |
@lisitsyn | HeikoS: let me attack it like that - NNs are good at images/audio now | 17:35 |
wiking | alalallaalaaaaaaaaaaaaa | 17:36 |
@HeikoS | again | 17:36 |
@HeikoS | variational inference is an inference algorithm | 17:36 |
@HeikoS | it is not like you decide to use it | 17:36 |
@HeikoS | you first need to define your problem | 17:36 |
@HeikoS | and your model | 17:36 |
@HeikoS | the model is the critical part | 17:36 |
@HeikoS | not the inference algoirithm | 17:37 |
@HeikoS | you dont choose to use variational inference, you choose to do a probabilistic model | 17:37 |
@lisitsyn | HeikoS: I see | 17:37 |
@lisitsyn | HeikoS: okay then how would you choose to do probabilistic model? | 17:37 |
@HeikoS | it depends on what you want to do | 17:37 |
wiking | choose to drink :D | 17:38 |
@HeikoS | for example if you want to understand the process you are modelling | 17:38 |
@lisitsyn | wiking: haha | 17:38 |
@HeikoS | or if uncertainty is important for you | 17:38 |
@HeikoS | wiking: haha | 17:38 |
@lisitsyn | HeikoS: oh that seems legit | 17:38 |
@lisitsyn | answers my question :) | 17:38 |
@HeikoS | lisitsyn: example might be classification | 17:38 |
@HeikoS | if you are interested in confidence | 17:38 |
@HeikoS | and maybe have some complicated relationship that you would like to take into account for classification | 17:39 |
wiking | 'Column indexes start at 1 in JDBC' | 17:39 |
@HeikoS | say a group-like structure that shares hyper-parameters | 17:39 |
@HeikoS | and are interested in how these parameters look since that tells you something about the world | 17:39 |
wiking | why on fucking earth would anybody start indexing with 1 | 17:39 |
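(For contrast with the JDBC convention quoted above, Python's DB-API returns plain tuples, so result columns are 0-based — a quick check with the stdlib's sqlite3, using a throwaway in-memory query:)

```python
import sqlite3

# Python's DB-API rows are plain tuples, indexed from 0 --
# unlike JDBC, where ResultSet.getString(1) is the *first* column.
conn = sqlite3.connect(":memory:")
row = conn.execute("SELECT 'first' AS a, 'second' AS b").fetchone()
print(row[0])  # -> first
print(row[1])  # -> second
conn.close()
```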
@HeikoS | then a probabilistic model might be good for you | 17:39 |
@HeikoS | wiking: I recently got the same question with 0 :) | 17:39 |
@lisitsyn | HeikoS: is it moving towards something more automatic? | 17:40 |
wiking | HeikoS: lemme guess... matlab developer? :D | 17:40 |
@lisitsyn | I mean neural guys claim they have feature learning | 17:40 |
@lisitsyn | is there any work on trying to pass raw features and getting something out of it? | 17:40 |
wiking | you can learn anything with an NN.. given that you have enough time/examples :d | 17:41 |
@lisitsyn | wiking: yeah absolutely but that's why I don't like what neural is all about | 17:41 |
@HeikoS | wiking: no, proper scientists :) | 17:42 |
@HeikoS | lisitsyn: that is about representation learning | 17:42 |
@HeikoS | lisitsyn: kind of orthogonal to probabilistic modelling, and quite a different goal also | 17:42 |
@HeikoS | probabilistic models are something for scientists who want to understand the world better | 17:42 |
@lisitsyn | HeikoS: I see | 17:45 |
@HeikoS | lisitsyn: but there are lots of connections between back propagation and stochastic variational inference, kind of the same in fact ;) | 17:46 |
-!- rajul [~rajul@101.56.101.74] has joined #shogun | 17:47 | |
-!- HeikoS [~heiko@pat-191-250.internal.eduroam.ucl.ac.uk] has quit [Quit: Leaving.] | 19:03 | |
@lisitsyn | wiking: hah so now I am on a sickness leave for like a week | 19:16 |
@lisitsyn | wiking: curious if I can switch to help you on release | 19:16 |
-!- rajul [~rajul@101.56.101.74] has quit [Ping timeout: 272 seconds] | 19:35 | |
-!- rajul [~rajul@101.58.27.168] has joined #shogun | 19:48 | |
-!- rajul [~rajul@101.58.27.168] has quit [Ping timeout: 255 seconds] | 20:26 | |
-!- rajul [~rajul@101.58.23.147] has joined #shogun | 20:40 | |
-!- rajul [~rajul@101.58.23.147] has quit [Ping timeout: 250 seconds] | 21:32 | |
--- Log closed Fri Nov 14 00:00:08 2014
Generated by irclog2html.py 2.10.0 by Marius Gedminas - find it at mg.pov.lt!