mlpack IRC logs, 2017-07-28

Logs for the day 2017-07-28 (starts at 0:00 UTC) are shown below.

--- Log opened Fri Jul 28 00:00:18 2017
00:56 -!- stephentu [~stephentu@dyn-160-39-55-103.dyn.columbia.edu] has joined #mlpack
01:00 -!- stephentu [~stephentu@dyn-160-39-55-103.dyn.columbia.edu] has quit [Ping timeout: 255 seconds]
01:56 -!- stephentu [~stephentu@dyn-160-39-55-103.dyn.columbia.edu] has joined #mlpack
02:00 -!- stephentu [~stephentu@dyn-160-39-55-103.dyn.columbia.edu] has quit [Ping timeout: 240 seconds]
02:08 -!- govg [~govg@unaffiliated/govg] has joined #mlpack
02:15 -!- stephentu [~stephentu@dyn-160-39-55-103.dyn.columbia.edu] has joined #mlpack
02:29 -!- sumedhghaisas_ [~sumedh@192.41.128.94] has quit [Quit: Ex-Chat]
02:29 -!- sumedhghaisas__ [~sumedh@192.41.128.94] has joined #mlpack
02:43 -!- sumedhghaisas__ [~sumedh@192.41.128.94] has quit [Ping timeout: 268 seconds]
03:14 -!- sumedhghaisas__ [~sumedh@188.74.64.249] has joined #mlpack
04:50 -!- stephentu [~stephentu@dyn-160-39-55-103.dyn.columbia.edu] has quit [Quit: Lost terminal]
05:14 -!- sumedhghaisas__ [~sumedh@188.74.64.249] has quit [Ping timeout: 268 seconds]
06:17 -!- andrzejku [~textual@031011130234.dynamic-zab-05.vectranet.pl] has joined #mlpack
06:34 -!- andrzejku [~textual@031011130234.dynamic-zab-05.vectranet.pl] has quit [Ping timeout: 260 seconds]
06:44 -!- andrzejku [~textual@90-156-117-46.internetia.net.pl] has joined #mlpack
06:51 -!- kris1 [~kris@103.232.241.5] has joined #mlpack
07:44 -!- andrzejku [~textual@90-156-117-46.internetia.net.pl] has quit [Quit: Textual IRC Client: www.textualapp.com]
09:17 -!- kris1 [~kris@103.232.241.5] has quit [Quit: kris1]
09:20 -!- kris1 [~kris@103.232.241.5] has joined #mlpack
10:19 -!- kris1 [~kris@103.232.241.5] has quit [Quit: kris1]
10:20 < zoq> sumedhghais: Not sure what the test framework looks like; maybe you can open a PR or something like that. I was also wondering whether we need a visitor to call the specific method, since we already know the type?
10:37 -!- kris1 [~kris@103.232.241.5] has joined #mlpack
12:38 < zoq> ironstark: Hello, do you need any help with the dlib-ml implementation? Probably a good first step would be to write an install script?
13:04 -!- sumedhghaisas__ [~sumedh@188.74.64.249] has joined #mlpack
13:15 -!- zoq [~marcus_zo@urgs.org] has quit [Read error: Connection reset by peer]
13:15 -!- zoq_ [~marcus_zo@urgs.org] has joined #mlpack
13:24 < kris1> I had a question zoq
13:24 < kris1> so in the layers you use InputDataType and OutputDataType
13:26 < kris1> but I am not able to understand how you assign some variables like gradient to OutputDataType.
13:26 < kris1> Is the logic that anything that gets computed within the layer is labeled as OutputDataType?
13:30 < zoq_> kris1: Yes, everything inside the layer is of type OutputDataType; only the input is of type InputDataType.
13:30 -!- zoq_ is now known as zoq
13:39 < kris1> okay… but why use InputDataType and also arma::Mat<eT> when you could have replaced arma::Mat<eT> with InputDataType?
13:48 < zoq> kris1: Right, in fact I already refactored some of the existing layers to not use arma::Mat<eT> but e.g. InputType for the input and OutType for the output, etc. That way we could also pass e.g. an arma::subview.
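A minimal sketch of the layer pattern being discussed (simplified, assumed names; not the actual mlpack source):

    // Minimal sketch of the mlpack-style layer pattern discussed above;
    // simplified and assumed, not the real mlpack code.
    #include <armadillo>

    template<typename InputDataType = arma::mat,
             typename OutputDataType = arma::mat>
    class ExampleLayer
    {
     public:
      // Everything computed inside the layer (output, delta, gradient) is
      // stored as OutputDataType; only the incoming data uses InputDataType.
      template<typename InputType, typename OutputType>
      void Forward(const InputType& input, OutputType& output)
      {
        // Templating Forward() on InputType/OutputType instead of
        // arma::Mat<eT> also lets expression types such as arma::subview
        // be passed in directly.
        output = weights * input;
      }

      const OutputDataType& Gradient() const { return gradient; }

     private:
      OutputDataType weights;
      OutputDataType gradient;
    };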
13:51 < lozhnikov> kris1: I am looking through your changes. It seems you did an invalid merge; I did a git reset since you committed some unnecessary code. I pointed that out here: https://github.com/mlpack/mlpack/pull/1046#issuecomment-317401006
13:54 < lozhnikov> however, it seems you fixed that later
14:13 < kris1> Yes, I think we can just cherry-pick the commits if we see there are some unnecessary commits.
14:13 < kris1> Were you able to look at the comments I left in the PR?
14:14 < lozhnikov> I am looking through the code right now, I'll post a review soon
14:48 -!- shikhar [~shikhar@122.161.13.119] has joined #mlpack
14:53 -!- shikhar [~shikhar@122.161.13.119] has quit [Ping timeout: 240 seconds]
14:53 -!- shikhar [~shikhar@122.161.58.208] has joined #mlpack
15:15 -!- shikhar [~shikhar@122.161.58.208] has quit [Ping timeout: 240 seconds]
15:17 -!- shikhar [~shikhar@122.161.0.245] has joined #mlpack
15:22 -!- shikhar [~shikhar@122.161.0.245] has quit [Ping timeout: 260 seconds]
15:24 < kris1> lozhnikov: I looked at the comments and made some points. Were you able to look them over?
15:25 < kris1> Also, one of the things I am not sure about is whether ssRBM can outperform a binary RBM on the MNIST dataset. I did not see any papers that even consider using the MNIST dataset; they all use “natural images”
15:25 < kris1> for ssRBM classification testing.
15:25 < kris1> This goes for variants of ssRBM as well.
15:26 -!- shikhar [~shikhar@122.161.13.118] has joined #mlpack
15:27 < lozhnikov> yeah, I think we should try the CIFAR dataset
15:42 -!- sumedhghaisas__ [~sumedh@188.74.64.249] has quit [Ping timeout: 268 seconds]
15:50 < lozhnikov> kris1: I'll be unavailable this weekend
16:11 -!- kris1 [~kris@103.232.241.5] has quit [Quit: kris1]
16:46 -!- kris1 [~kris@103.232.241.5] has joined #mlpack
16:46 -!- shikhar [~shikhar@122.161.13.118] has quit [Read error: Connection reset by peer]
16:48 < lozhnikov> kris1: are you online?
16:48 < kris1> Yes
16:48 < lozhnikov> I'll be unavailable this weekend
16:49 < kris1> Ahhh…okay.
16:49 < lozhnikov> I'll return in the evening on Sunday
16:49 < lozhnikov> I am going to ride a bicycle for 2 days
16:50 < kris1> Wow!!! That sounds exciting and tiring.
16:50 < kris1> Is it part of some marathon?
16:51 < kris1> I will continue working on the weekend on ssRBM and GAN.
16:51 < lozhnikov> No, I just want to relax with my family
16:51 < kris1> I just need to clarify a few things
16:52 < lozhnikov> usually, I don't participate in bicycle marathons
16:53 < kris1> 1. GAN: see my comment on GitHub. 2. ssRBM: did you agree that slabPenalty can’t be used as a scalar, as I had commented on GitHub?
16:54 < lozhnikov> I replied on GitHub to your comment about slabPenalty
16:54 < lozhnikov> I think it is possible to simplify expressions with slabPenalty
16:55 < lozhnikov> The second option is to use arma::diagmat()
16:56 < lozhnikov> but I think that an arma::mat would be slow in this case
16:57 < lozhnikov> regarding GAN:
16:57 < lozhnikov> I think it is possible to call optimize() only once instead of `numIterations` times
16:58 < lozhnikov> *once
16:58 < kris1> Your comment was to use arma::cumsum. So I think you mean spike(i) * slabBias * arma::cumsum(weight.slice(i).t() * visible)
16:58 < kris1> Is that correct?
17:00 < lozhnikov> yeah, looks like that is correct, it isn't hard to verify
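For reference, a tiny Armadillo sketch of the two options weighed above, i.e. a diagonal slab-penalty matrix versus a plain scalar (hypothetical names and shapes; the actual ssRBM expressions are in the PR under review):

    // Hypothetical illustration of the two options discussed above; not the
    // actual ssRBM code from the PR.
    #include <armadillo>

    // Option 1: treat the slab penalty as a vector and build a diagonal
    // matrix with arma::diagmat() (suspected above to be the slower route).
    arma::vec SlabTermDiag(const arma::mat& weightSlice, const arma::vec& visible,
                           const arma::vec& slabPenaltyVec)
    {
      return arma::diagmat(slabPenaltyVec) * (weightSlice.t() * visible);
    }

    // Option 2: if slabPenalty is a single scalar, the diagonal matrix is just
    // slabPenalty * I, so the expression collapses to a scalar multiplication.
    arma::vec SlabTermScalar(const arma::mat& weightSlice, const arma::vec& visible,
                             const double slabPenalty)
    {
      return slabPenalty * (weightSlice.t() * visible);
    }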
17:00 < kris1> okay.
17:00 < kris1> Can you explain how you would call optimize just once for the GAN?
17:01 < lozhnikov> maybe it is possible to add a counter to Gradient()
17:02 < kris1> The cause of the problem for me seems to be that we have to generate the outputs for the optimisation of the generator from the training of the discriminator.
17:02 < kris1> So if we have just one call to the optimizer, we can’t change the predictors and responses.
17:03 < kris1> Also, we can’t generate all the predictors and responses in a matrix since we need to get the predictors from a trained generator at a previous time step.
17:04 < kris1> Sorry, for the 1st comment: it should generate outputs for the optimisation of the discriminator from a trained generator.
17:05 < lozhnikov> I think you can move everything to the Gradient() function, I have to think about that
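A rough sketch of the single-Optimize() idea floated above (hypothetical class and member names, not the eventual mlpack GAN implementation): keep an iteration counter inside Gradient() and alternate between discriminator and generator updates, drawing fresh samples inside Gradient() instead of fixing predictors/responses up front.

    // Hypothetical sketch of driving a whole GAN training run from a single
    // optimizer.Optimize() call by keeping a counter inside Gradient();
    // class and member names are illustrative, not mlpack's GAN API.
    #include <armadillo>
    #include <cstddef>

    template<typename Discriminator, typename Generator>
    class GANSketch
    {
     public:
      GANSketch(Discriminator& discriminator, Generator& generator,
                const size_t generatorUpdateInterval) :
          discriminator(discriminator), generator(generator),
          generatorUpdateInterval(generatorUpdateInterval), counter(0) { }

      // Called by the optimizer on every iteration, so one Optimize() call
      // covers both networks.  Because the samples are drawn here, there is
      // no need to change the predictors/responses between optimizer calls.
      void Gradient(const arma::mat& /* parameters */, const size_t /* begin */,
                    arma::mat& gradient, const size_t /* batchSize */)
      {
        gradient.zeros();  // Gradient for whichever network is updated now.

        if (counter % generatorUpdateInterval == 0)
        {
          // Every k-th step: sample noise, push it through the current
          // generator, and accumulate the generator gradient against the
          // discriminator's output.
          // generator.Gradient(..., gradient);
        }
        else
        {
          // Otherwise: mix real data with freshly generated samples and
          // accumulate the discriminator gradient.
          // discriminator.Gradient(..., gradient);
        }
        ++counter;
      }

     private:
      Discriminator& discriminator;
      Generator& generator;
      size_t generatorUpdateInterval;
      size_t counter;
    };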