[mlpack] An Introduction

Amitrajit Sarkar aaiijmrtt at gmail.com
Fri Mar 11 13:56:49 EST 2016


Dear Marcus,

I apologize for the delay. I have been through all the material that you
suggested: thank you. I have also been through the structure of mlpack. I
have a few questions:

1. Referring to the organization of mlpack, should the neuroevolution
algorithms be implemented as an 'optimizer' within 'ann' 'methods'
(supporting 'optimize', 'reset' and 'update')? Or should they be
implemented as separate 'methods'?
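To make question 1 concrete, here is a minimal sketch of what an
'optimizer'-style interface might look like. All names are hypothetical
illustrations, not mlpack's actual API, and the random-mutation loop is only
a crude stand-in for a real neuroevolution update:

```cpp
#include <cstddef>
#include <cstdlib>
#include <functional>
#include <vector>

// Hypothetical sketch (not the mlpack API) of a neuroevolution
// "optimizer" exposing an Optimize()-style surface.
class CNEOptimizer
{
 public:
  CNEOptimizer(std::size_t populationSize, std::size_t generations) :
      populationSize(populationSize), generations(generations) { }

  // Minimize `fitness` over a flat weight vector; returns best fitness.
  double Optimize(
      const std::function<double(const std::vector<double>&)>& fitness,
      std::vector<double>& weights)
  {
    double best = fitness(weights);
    for (std::size_t g = 0; g < generations; ++g)
    {
      for (std::size_t p = 0; p < populationSize; ++p)
      {
        // Mutate a copy of the current best individual.
        std::vector<double> candidate = weights;
        for (double& w : candidate)
          w += 0.1 * (2.0 * std::rand() / RAND_MAX - 1.0);

        const double f = fitness(candidate);
        if (f < best) { best = f; weights = candidate; }
      }
    }
    return best;
  }

 private:
  std::size_t populationSize, generations;
};
```

Whether such a class lives under 'optimizer' or as its own 'method' is
exactly the design decision being asked about here.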

2. Keeping track of NEAT's historical markers suggests a linked-list,
connectionist ANN representation. Should this be implemented as a separate
data structure, or serve as the connectionist representation of the network
itself? If a vectorized matrix representation is used, how will new nodes
evolved between layers be maintained?
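One way to picture the linked-list alternative from question 2: a genome as
a flat list of connection genes tagged with innovation numbers, where adding
a node disables the old gene and appends two new ones. This is an
illustrative sketch only, not mlpack code:

```cpp
#include <cstddef>
#include <vector>

// Illustrative NEAT-style genome: a list of connection genes, each
// carrying a historical innovation number for crossover alignment.
struct ConnectionGene
{
  std::size_t from, to;    // node ids
  double weight;
  bool enabled;
  std::size_t innovation;  // historical marker
};

struct Genome
{
  std::vector<ConnectionGene> genes;

  // Splitting an existing connection inserts a new node: the old gene
  // is disabled and two new genes (with fresh innovation numbers) are
  // added, so the topology can grow between "layers" without being
  // pinned to a fixed weight matrix.
  void AddNode(std::size_t geneIndex, std::size_t newNode,
               std::size_t& innovationCounter)
  {
    genes[geneIndex].enabled = false;
    const ConnectionGene old = genes[geneIndex];  // copy before push_back
    genes.push_back({ old.from, newNode, 1.0, true, ++innovationCounter });
    genes.push_back({ newNode, old.to, old.weight, true,
        ++innovationCounter });
  }
};
```

The open question is then whether this gene list is the network itself or
merely a recipe that is compiled down to matrices for evaluation.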

3. What functions will the deep learning modules support? 'Train',
'predict', 'evaluate', 'gradient' and 'serialize'? Some deep learning
networks use forms of unsupervised pretraining. How should these be
incorporated into the mlpack API?
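A toy sketch of how a two-phase API could separate unsupervised pretraining
from supervised training, as raised in question 3. The class and method
names are assumptions for illustration, not the mlpack interface, and the
"learning" is deliberately trivial:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical module interface: Pretrain() runs an unsupervised phase
// on inputs only (as an RBM or auto-encoder would), and Train() can then
// refine whatever Pretrain() produced.
class PretrainableModule
{
 public:
  // Toy unsupervised phase: initialise the single weight from the
  // input mean (a stand-in for real pretraining).
  void Pretrain(const std::vector<double>& inputs)
  {
    double sum = 0.0;
    for (double x : inputs) sum += x;
    weight = inputs.empty() ? 0.0 : sum / inputs.size();
    pretrained = true;
  }

  // Supervised phase; gradient steps omitted in this sketch.
  void Train(const std::vector<double>& inputs,
             const std::vector<double>& targets)
  {
    (void) inputs;
    (void) targets;
  }

  double Predict(double input) const { return weight * input; }

  bool Pretrained() const { return pretrained; }

 private:
  double weight = 0.0;
  bool pretrained = false;
};
```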

Best,
Amitrajit.

On Tue, Mar 8, 2016 at 3:18 AM, Marcus Edel <marcus.edel at fu-berlin.de>
wrote:

> Hello Amitrajit,
>
> I shall certainly look through the new references I find on the reading
> lists
> before filling out my application. If there are any resources in
> particular that
> you would like me to take note of, do mention them.
>
>
> Here are some papers for the Neuroevolution idea. A good theoretical
> understanding of what these models do and why they work is a necessity to
> be
> able to implement these well.
>
> - HyperNEAT-GGP:
> http://nn.cs.utexas.edu/downloads/papers/hausknecht.gecco12.pdf
> - NEAT:
> http://nn.cs.utexas.edu/?stanley:ec02
> - CMA-ES:
> http://image.diku.dk/igel/paper/NfRLUES.pdf
> - CoSyNE:
> ftp://ftp.cs.utexas.edu/pub/neural-nets/papers/gomez.ecml06.ps
> - Multi-Objective Neuroevolution in Super Mario Bros.:
> http://www.diva-portal.org/smash/get/diva2:676807/FULLTEXT01.pdf
>
> And here are some papers for the 'We need to go deeper' idea.
>
> - Going Deeper with Convolutions:
> http://arxiv.org/abs/1409.4842
> - Selective Search for Object Recognition:
> http://koen.me/research/pub/uijlings-ijcv2013-draft.pdf
> - Scalable Object Detection using Deep Neural Networks (multi-box):
> http://arxiv.org/abs/1312.2249
>
> And here are some papers on neural network models.
>
> Restricted Boltzmann Machines (RBM)
> - https://www.cs.toronto.edu/~hinton/absps/guideTR.pdf
> - http://deeplearning.net/tutorial/rbm.html
>
> Deep Belief Networks (DBN)
> - http://www.cs.toronto.edu/~rsalakhu/papers/science.pdf
> - http://deeplearning.net/tutorial/DBN.html
>
> Radial Basis Function Networks (RBFN)
> - http://www.cc.gatech.edu/~isbell/tutorials/rbf-intro.pdf
>
> Bidirectional Recurrent Networks (BRN)
> Note: mlpack already provides an implementation of recurrent networks
> - http://www.di.ufpe.br/~fnj/RNA/bibliografia/BRNN.pdf
>
> Convolutional Auto-Encoders (CAE)
> - http://people.idsia.ch/~masci/papers/2011_icann.pdf
>
> Hopfield neural networks (HNN)
> - http://page.mi.fu-berlin.de/rojas/neural/chapter/K13.pdf
>
> Keep in mind that you don't have to implement all of these models; a good
> project will select a handful of architectures and implement them with
> tests and documentation. Writing good tests is often the hardest part, so
> keep that in mind when you create your project timeline.
>
> I was wondering whether a connectionist approach would be better with
> regard to
> implementing the Neuroevolution algorithms when dealing with Augmenting
> Topologies. I would like your views on the matter.
>
>
> Basically it's for performance reasons, but you can mimic a connectionist
> model by simply setting the weights in the LinearLayer to zero, so that
> unit_11^(0) is only connected with unit_11^(1) and unit_12^(1) and not
> with unit_13^(1). You can also implement a special Layer to get this done
> even more easily.
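[The masking trick described above can be sketched as follows; plain C++
vectors stand in for mlpack's Armadillo matrices, and the function name is
illustrative only.]

```cpp
#include <cstddef>
#include <vector>

// A dense linear layer can mimic a sparse, connectionist topology by
// pinning masked weights to zero: an absent connection simply
// contributes no term to the output.
std::vector<double> Forward(
    const std::vector<std::vector<double>>& weights,
    const std::vector<std::vector<bool>>& mask,
    const std::vector<double>& input)
{
  std::vector<double> output(weights.size(), 0.0);
  for (std::size_t i = 0; i < weights.size(); ++i)
    for (std::size_t j = 0; j < input.size(); ++j)
      if (mask[i][j])  // connection exists in the evolved topology
        output[i] += weights[i][j] * input[j];
  return output;
}
```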
>
> Also, would you like to see a basic implementation of CNE, using the
> existing
> mlpack neural networks, as a warm-up task? I really look forward to
> contributing
> to mlpack.
>
>
> Contributing is not a requirement for an application. Anyway, if you'd
> like to do that as a warm-up task, I'm here to help you out. Keep in mind
> that you have to write a test before I can merge anything in.
>
> Thanks,
> Marcus
>
>
> On 07 Mar 2016, at 19:40, Amitrajit Sarkar <aaiijmrtt at gmail.com> wrote:
>
> Hello Marcus,
>
> I agree: each of these projects requires a lot of background study.
> However, my undergrad research work has been focused on neural networks and
> deep learning for over a year now. Hence I am already familiar with the
> concepts appearing on the Ideas page, as well as those previously mentioned
> in the mailing list, having implemented several myself. I shall certainly
> look through the new references I find on the reading lists before filling
> out my application. If there are any resources in particular that you would
> like me to take note of, do mention them.
>
> I built mlpack from source, tried the tutorials, and started deciphering
> the source code. I understand that neural networks in mlpack use armadillo
> matrices for efficiency, a vectorized approach. I was wondering whether a
> connectionist approach would be better with regard to implementing the
> Neuroevolution algorithms when dealing with Augmenting Topologies. I would
> like your views on the matter.
>
> Also, would you like to see a basic implementation of CNE, using the
> existing mlpack neural networks, as a warm-up task? I really look forward
> to contributing to mlpack.
>
> Regards,
> Amitrajit.
>
> On Mon, Mar 7, 2016 at 5:38 AM, Marcus Edel <marcus.edel at fu-berlin.de>
> wrote:
>
>> Hello Amitrajit,
>>
>> sorry for the slow response.
>>
>> I am especially interested in:
>>
>> Neuroevolution Algorithms,
>> Essential Deep Learning Modules,
>> We Need To Go Deeper - Google LeNet.
>>
>>
>> I might suggest that you narrow your focus, because each of these
>> projects requires a significant amount of background knowledge.
>>
>> To learn more about each of the projects than what has been listed on the
>> Ideas
>> page, take a look at the mailing list archives:
>>
>> https://mailman.cc.gatech.edu/pipermail/mlpack/
>>
>> However, others are already working on the warmup tasks listed alongside
>> the
>> projects. Are there any other tasks that I could try?
>>
>>
>> Don't worry, contributing is not a requirement for an application. So if
>> you don't find anything that you think you can do, that's not necessarily
>> a problem. However, I'll see if I can add some more "easy" issues in the
>> next couple of days. On the other hand, you are always welcome to just
>> poke around the library and try to fix any problems you find, or improve
>> the speed of various parts.
>>
>> Thanks,
>> Marcus
>>
>> On 06 Mar 2016, at 08:39, Amitrajit Sarkar <aaiijmrtt at gmail.com> wrote:
>>
>> Hi,
>>
>> I am Amitrajit Sarkar, a CS undergrad from Jadavpur University, India. I
>> have been working on machine learning for over a year now. I even have my
>> own neural networks library <https://github.com/aaiijmrtt/NET>, which I
>> wrote from scratch while trying to understand existing theories. I am very
>> eager to contribute to mlpack for GSoC 2016, as almost all the projects
>> excite me equally.
>>
>> I am especially interested in:
>>
>> Neuroevolution Algorithms,
>> <https://github.com/mlpack/mlpack/wiki/SummerOfCodeIdeas#neuroevolution-algorithms>
>> Essential Deep Learning Modules,
>> <https://github.com/mlpack/mlpack/wiki/SummerOfCodeIdeas#essential-deep-learning-modules>
>> We Need To Go Deeper - Google LeNet.
>> <https://github.com/mlpack/mlpack/wiki/SummerOfCodeIdeas#we-need-to-go-deeper---googlenet>
>>
>> I have implemented basic neuroevolution algorithms here
>> <https://github.com/aaiijmrtt/LEARNING>, and several deep learning
>> modules here <https://github.com/aaiijmrtt/NET>. I am certain that I can
>> take up the tasks. However, others are already working on the warmup tasks
>> listed alongside the projects. Are there any other tasks that I could try?
>> I have a lot of experience with research work, and am a skilled coder.
>>
>> I am attaching my CV for reference. You may find more about my interests
>> on my blog <http://aaiijmrtt.github.io/>.
>>
>> Cheers,
>> Amitrajit.
>> _______________________________________________
>> mlpack mailing list
>> mlpack at cc.gatech.edu
>> https://mailman.cc.gatech.edu/mailman/listinfo/mlpack
>>
>>
>>
>
>