Deep Learning Module in mlpack (Week 7)

Week Seven

This week I mostly worked on completing the ssRBM and GAN PRs. The majority of the time was spent making both implementations work on the test dataset, and we finally managed to do so. With the ssRBM PR we were running into memory-management errors because I was allocating around ~30 GB of memory for the parameters: I had declared all of the parameters as full matrices, but I managed to reduce them to just vectors. The remaining problem with ssRBM is the training part: we are getting an accuracy of around 12% on the MNIST dataset that we used for the binary RBM. We are working on fixing the test.
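To illustrate why switching from full matrices to vectors saves so much memory: the per-unit precision matrices in the ssRBM are diagonal, so only the diagonal entries carry information. A minimal sketch (names and sizes here are assumptions for illustration, not mlpack's actual parameter layout):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Doubles needed when every per-hidden-unit parameter is stored as a
// full slabCount x slabCount matrix.
std::size_t fullMatrixDoubles(std::size_t hiddenUnits, std::size_t slabCount)
{
  return hiddenUnits * slabCount * slabCount;
}

// Doubles needed when only the diagonal of each matrix is kept as a vector.
std::size_t diagonalDoubles(std::size_t hiddenUnits, std::size_t slabCount)
{
  return hiddenUnits * slabCount;
}

// Applying a diagonal matrix stored as a vector is also O(n) instead of
// O(n^2): just an element-wise product.
std::vector<double> diagMul(const std::vector<double>& diag,
                            const std::vector<double>& x)
{
  std::vector<double> out(x.size());
  for (std::size_t i = 0; i < x.size(); ++i)
    out[i] = diag[i] * x[i];
  return out;
}
```

The storage drops by a factor of `slabCount`, which is what brought the allocation down from tens of gigabytes to something manageable.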

This week I also managed to finish the GAN implementation. The code runs on the test data but produces near-random images, even after 1000 iterations of alternating SGD, with the discriminator trained for 3000 (3 × 1000) iterations and the generator trained for 1000 iterations (the generator and discriminator used here are just simple FFNs). The GAN PR also needs review for me to fully understand where I am going wrong. I would also like to thank Konstantin here, since I was using the cross-entropy code that he wrote for the GANs. I am also not sure how best to test a GAN; right now I am just trying to see if it can generate realistic images.
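The alternating schedule described above (3 discriminator updates per generator update, for 1000 outer iterations) can be sketched as a plain loop; the gradient steps below are stand-in callbacks, not mlpack's optimizer API:

```cpp
#include <cassert>
#include <cstddef>
#include <functional>

// Illustrative alternating-SGD schedule for GAN training: for each outer
// iteration, take discStepsPerIter discriminator updates, then one
// generator update. The actual SGD steps are passed in as callbacks.
void TrainGan(std::size_t iterations,
              std::size_t discStepsPerIter,
              const std::function<void()>& discStep,
              const std::function<void()>& genStep)
{
  for (std::size_t i = 0; i < iterations; ++i)
  {
    for (std::size_t k = 0; k < discStepsPerIter; ++k)
      discStep();  // One SGD update of the discriminator.
    genStep();     // One SGD update of the generator.
  }
}
```

With `iterations = 1000` and `discStepsPerIter = 3` this reproduces the 3000/1000 split mentioned above.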

Next week: I will mostly be working on fixing both the GAN and ssRBM tests. I will also write serialization for GANs next week. I am hoping that within 10 days both PRs will be mergeable.


Deep Learning Module in mlpack (Week 6)

Week Six

This week the majority of my time went into refactoring the existing RBM code and writing the code for ssRBM. I have successfully refactored the code and made all the tests pass for the binary RBM (the serialization test and the classification test). The code for ssRBM is also now complete. I primarily aim to test the ssRBM implementation this week, though I am not sure which test to use. The classification test given in the ssRBM paper requires the CIFAR-10 dataset, which is around 178 MB, so I don't think that test is good to commit, though it can be run locally. Mikhail suggested running the existing classification test with MNIST, though I don't yet have any idea which values to start with.

This week I also implemented the GAN, though there are still some issues we have to work through. Testing the GAN would be easy if we don't run into any issues with training, which is a known problem with GANs.

This week I am planning to get the test for ssRBM right and to open a PR for the GAN. P.S.: Sorry for the late blog post.


Deep Learning Module in mlpack (Week 5)

Week Five

This week was primarily focused on reading and understanding the ssRBM paper. I also opened a new PR for ssRBM that implements the spike-slab layer. Our approach is to implement ssRBM as a policy class of the RBM class. This means we have very little code duplication: the only things we actually need to implement for the ssRBM are the Gradient, Reset, and FreeEnergy functions. I have already implemented these; the only part remaining is refactoring the RBM class.
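The policy-class idea can be sketched as follows: the generic RBM template delegates the variant-specific pieces (FreeEnergy, Gradient, Reset) to a policy type, so a binary RBM and a spike-slab RBM only differ in the policy. All names and signatures here are illustrative, not mlpack's actual interface:

```cpp
#include <cassert>

// Generic RBM that forwards the variant-specific computation to a policy.
template<typename Policy>
class Rbm
{
 public:
  explicit Rbm(Policy policy) : policy(policy) { }

  // Delegated to the policy; BinaryPolicy and a hypothetical
  // SpikeSlabPolicy would each supply their own FreeEnergy.
  double FreeEnergy(double v) { return policy.FreeEnergy(v); }

 private:
  Policy policy;
};

// Stand-in policy: a real binary-RBM policy would compute the actual
// free energy from the weights and biases.
struct BinaryPolicy
{
  double FreeEnergy(double v) { return -v; }
};
```

The payoff is that the training loop, sampling, and serialization live once in `Rbm`, and each variant only implements its handful of policy functions.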

The plan for next week is to complete the ssRBM implementation and hopefully test it for classification on the MNIST/CIFAR dataset. I don't know exactly how we would add these to the repo, since the CIFAR dataset is huge.


Deep Learning Module in mlpack (Week 4)

Week Four

This week we mainly put the finishing touches on our existing binary RBM PR. They took time mainly because we were not able to train the RBM correctly, plus one disastrous commit of mine that rolled back changes I had made earlier :(. We did a lot of trial and error (mainly with the Gibbs sampling step) to finally make it work. I now understand why people in deep learning talk so much about how hard it is to train DL models.

Here are our results on the MNIST dataset.

The samples are generated from 1000 steps of Gibbs sampling.

This image is generated from the example.

We also added another test: a classification test using the latent representation learnt from the MNIST dataset. We compared our results with the scikit-learn implementation, and we were able to get better accuracy on the subset of test cases, though I think it would be a fairer comparison if we did 10-fold cross-validation. Right now, with a test size of 100 and a train size of 2500, our implementation's classification accuracy is around 90% while scikit-learn's is only around 78% (with the number of Gibbs steps in the sklearn implementation set to 1). We have also added a serialization test to our implementation this week.
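The latent representation used in this kind of test is just the vector of hidden-unit probabilities sigma(Wv + c), which is then fed to an ordinary classifier. A minimal sketch, assuming a row-major weight matrix W of shape hidden × visible and a hidden bias c (illustrative names, not mlpack's API):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hidden-unit probabilities of a binary RBM: h_j = sigmoid(c_j + W_j . v).
// These probabilities are the "latent representation" used as features
// for the downstream classification test.
std::vector<double> HiddenActivations(const std::vector<double>& W,
                                      const std::vector<double>& c,
                                      const std::vector<double>& v,
                                      std::size_t hidden,
                                      std::size_t visible)
{
  std::vector<double> h(hidden);
  for (std::size_t j = 0; j < hidden; ++j)
  {
    double act = c[j];
    for (std::size_t i = 0; i < visible; ++i)
      act += W[j * visible + i] * v[i];  // Row-major dot product.
    h[j] = 1.0 / (1.0 + std::exp(-act)); // Sigmoid.
  }
  return h;
}
```

With zero weights and biases every activation is exactly 0.5, which is a handy sanity check for an untrained network.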

I think the PR will be accepted this week (fingers crossed).

I have also started working on the ssRBM. It was disappointing to see that no other library has actually implemented the ssRBM, so we cannot compare our results against them; even the authors do not provide a link to their code. Anyway, I have implemented the spike-slab (hidden) layer and the visible layer for the ssRBM and will be opening a PR by this weekend.

The main goals for next week are the following: 1. Implement ssRBM 2. Start writing tests for ssRBM.

P.S. I would like to thank Mikhail for all the help this week :)


Deep Learning Module in mlpack (Week 3)

Week Three

This week I tried to finish the PR for the binary RBM implementation. I expected it would not take much time, but as the famous saying goes, "We make plans and God laughs." Most of my time this week was spent debugging the existing RBM implementation. That took the majority of my time, though the code also went through some major changes. The major changes were as follows: 1. Change of the evaluation function 2. Major style fixes 3. Addition of the CD-k code 4. Addition of batch training to the CD-k algorithm.
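Since most of the debugging revolved around the Gibbs sampling inside CD-k, here is a minimal sketch of one half of a Gibbs step for a binary RBM, sampling h ~ p(h|v); the weight layout and function names are assumptions for illustration, not mlpack's code:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

double Sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

// One half of a Gibbs step for a binary RBM: sample each hidden unit
// h_j from a Bernoulli with p = sigmoid(c_j + W_j . v). W is row-major
// with shape hidden x visible. The v-given-h half is symmetric.
std::vector<int> SampleHidden(const std::vector<double>& W,
                              const std::vector<double>& c,
                              const std::vector<int>& v,
                              std::size_t hidden,
                              std::size_t visible,
                              std::mt19937& rng)
{
  std::uniform_real_distribution<double> u(0.0, 1.0);
  std::vector<int> h(hidden);
  for (std::size_t j = 0; j < hidden; ++j)
  {
    double act = c[j];
    for (std::size_t i = 0; i < visible; ++i)
      act += W[j * visible + i] * v[i];
    h[j] = (u(rng) < Sigmoid(act)) ? 1 : 0;  // Bernoulli sample.
  }
  return h;
}
```

CD-k then runs k such v→h→v round trips starting from a data point, and the gradient is the difference of correlations between the data and the sampled "reconstruction".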

The important thing I learnt this week is how important it is to initialise variables.

We finally solved the training problem and we get okay results now; have a look here. Here is the parameter list we got these results with: CD-1, batch size: 20, learning rate: 0.1.

The samples are generated from 1-step Gibbs sampling.

The last image uses the mnist-binary dataset with a threshold value of 0.2.

Next Week

I had hoped to finish the binary RBM last week, but it has now been extended into this week. Major goals for this week include: 1. Writing tests for the binary RBM (right now I am planning to add reconstruction loss and classification accuracy as tests) 2. Merging the binary RBM PR 3. Starting on ssRBM.

Hopefully this week we will be able to achieve our targets. :)


Deep Learning Module in mlpack (Week 2)

This week we pushed our existing implementations to mlpack. We are now done with the basic wrapper layer for the RBM, and with the base visible and hidden layers. We are also done with the CD-k and PCD-k algorithms. This week will be spent writing tests for the binary RBM layer. I will also try to complete the ssRBM; this should hopefully be easy, since we only have to edit the visible layer. Currently we are facing some difficulties with the storage of the parameters that are shared by the visible and hidden layers, but we expect to finish that this week.
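The difference between CD-k and PCD-k is only in where the negative-phase Gibbs chain starts: CD-k restarts it from the current data point, while PCD-k keeps a persistent chain alive across parameter updates. A minimal sketch of just that bookkeeping (the names and the representation of the chain state are assumptions for illustration):

```cpp
#include <cassert>
#include <vector>

// Bookkeeping for the negative phase of RBM training. CD-k restarts the
// Gibbs chain from the data every update; PCD-k reuses the state the
// persistent chain was left in after the previous update.
struct NegativePhase
{
  bool persistent;            // true => PCD-k, false => CD-k.
  std::vector<double> chain;  // Current state of the persistent chain.

  // Returns the state the k-step Gibbs chain should start from.
  std::vector<double> ChainStart(const std::vector<double>& data)
  {
    if (!persistent || chain.empty())
      chain = data;  // CD-k (or the very first PCD-k update): start at data.
    return chain;
  }
};
```

After each update the k Gibbs steps would write their final state back into `chain`, so in PCD-k mode the next call ignores the fresh data and continues the old chain.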

Here are the links to the work: 1. [CD-k algorithm] ( 2. [Wrapper layer and the binary visible layer] (