[mlpack] queries regarding the implementation of rnn.hpp in ann

Nikhil Yadala nikhil.yadala at gmail.com
Mon Mar 14 00:01:43 EDT 2016


Hi Marcus, Ryan,

I have gone through the complete code of ann, but I still don't have an exact idea of how the RNN is implemented in mlpack. I have a few queries:

Could you tell me what the variables inputSize, outputSize, and seqOut specify?

How is the output taken from the network: do we take an output after every time step, or do we take the output only at the end of the input (time) sequence?

Also, as I understand BPTT, the network is unrolled over a short subsequence of time steps (typically k = 3, at t-1, t, t+1), each time step is treated as one layer, and the rest is similar to an FFN, with the constraint that the weight matrix is the same at every layer. But I don't understand how BPTT is implemented in rnn.hpp. (If that is not the way it is implemented here, could you please direct me to a link where I could get a better understanding of what BPTT does and how it does it?)
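For concreteness, here is a minimal sketch of my understanding of truncated BPTT: a scalar RNN that emits an output at every step, unrolled over k steps that share the weights w and u. This is not mlpack's actual implementation, and the names Loss and Bptt are my own, not from rnn.hpp:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Loss of a scalar RNN h_t = tanh(w * h_{t-1} + u * x_t) with a target
// y_t at every time step, summed over the k unrolled steps.
double Loss(double w, double u, const std::vector<double>& x,
            const std::vector<double>& y)
{
  double h = 0.0, loss = 0.0;
  for (size_t t = 0; t < x.size(); ++t)
  {
    h = std::tanh(w * h + u * x[t]);
    loss += 0.5 * (h - y[t]) * (h - y[t]);
  }
  return loss;
}

// Truncated BPTT: unroll the k steps as layers sharing w and u, then
// backpropagate from the last step to the first, accumulating the
// gradient of the shared weights at every layer.
void Bptt(double w, double u, const std::vector<double>& x,
          const std::vector<double>& y, double& dw, double& du)
{
  const size_t k = x.size();
  std::vector<double> h(k + 1, 0.0);  // h[0] is the initial state.
  for (size_t t = 0; t < k; ++t)
    h[t + 1] = std::tanh(w * h[t] + u * x[t]);

  dw = du = 0.0;
  double dh = 0.0;  // Gradient flowing back through the hidden state.
  for (size_t t = k; t-- > 0; )
  {
    // Error from this step's output plus error from later steps.
    const double d = (h[t + 1] - y[t]) + dh;
    const double dpre = d * (1.0 - h[t + 1] * h[t + 1]);  // tanh'.
    dw += dpre * h[t];  // Shared weight: accumulate at every layer.
    du += dpre * x[t];
    dh = dpre * w;      // Pass the gradient to the previous time step.
  }
}
```

The accumulated dw and du match a numerical (finite-difference) gradient of Loss, which is how I convinced myself the shared-weight accumulation is the right picture.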

Regarding the project proposal, I am planning to implement a bidirectional deep RNN, so that there is no need to code the two-layer BRNN explicitly. I am also planning to spend very little time on the convolutional auto-encoder, since cnn.hpp does almost the same thing; the only tweak needed is to hard-code the output layer to the inputs (am I right?). Could you please give your views on these?


thanks,
Nikhil Yadala.

