[mlpack] Stop gradient in mlpack ANN module

Ryan Curtin ryan at ratml.org
Thu Feb 14 21:15:27 EST 2019


On Fri, Feb 15, 2019 at 09:35:03AM +0800, problemset wrote:
> Hi all,
> 
> Nowadays, as ML/DL/RL develop quickly, there is growing demand for
> flexibility in the ANN module. I am wondering whether there is a way
> to stop gradient backpropagation through a particular layer in
> mlpack, like PyTorch's detach() or TensorFlow's stop_gradient.

Hey there Xiaohong,

Could we create a layer that simply doesn't pass the gradient through,
perhaps?

That may not be the best solution (in fact I am sure it is not), but it
could at least be a start.
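
Something along these lines, as a rough sketch: the class name
StopGradient and the exact Forward()/Backward() signatures below are
only assumptions (the real ANN layer interface varies between mlpack
versions), but the idea is just an identity layer whose Backward()
zeroes the gradient it passes upstream.

    #include <mlpack/core.hpp>

    // Hypothetical StopGradient layer: identity on the forward pass,
    // zero gradient on the backward pass. The name and method
    // signatures here are assumptions; check the existing ANN layer
    // headers for the interface your mlpack version actually uses.
    template<typename InputDataType = arma::mat,
             typename OutputDataType = arma::mat>
    class StopGradient
    {
     public:
      StopGradient() { /* No parameters to initialize. */ }

      // Forward: pass the input through unchanged.
      template<typename InputType, typename OutputType>
      void Forward(const InputType& input, OutputType& output)
      {
        output = input;
      }

      // Backward: send zeros upstream so no gradient flows past this
      // layer.
      template<typename eT>
      void Backward(const arma::Mat<eT>& /* input */,
                    const arma::Mat<eT>& gy,
                    arma::Mat<eT>& g)
      {
        g.zeros(gy.n_rows, gy.n_cols);
      }

      OutputDataType const& Delta() const { return delta; }
      OutputDataType& Delta() { return delta; }

     private:
      // Locally-stored delta, following the pattern of other ANN
      // layers.
      OutputDataType delta;
    };

If something like that works, it could be added to a network the same
way as any other layer; whether it belongs in the main layer list is a
separate question.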

-- 
Ryan Curtin    | "I know... but I really liked those ones."
ryan at ratml.org |   - Vincent

