[mlpack] [mlpack/mlpack] HMM initialization: don't use equal initial probabilities. (#828)

Ryan Curtin notifications at github.com
Mon Dec 12 14:45:29 EST 2016


Using equal initial probabilities can sometimes cause training to fail.  Instead, optimization seems to perform better when using random initialization.

This is a fix that came out of some debugging in the IRC channel.
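For reference, a minimal sketch of what random initialization of the initial-state probabilities and transition matrix could look like with Armadillo.  This is illustrative only (numStates is a hypothetical value), not the exact code from the patch:

    #include <mlpack/core.hpp>

    int main()
    {
      const size_t numStates = 4; // hypothetical number of hidden states

      // Random (not equal) initial-state probabilities, normalized to sum to 1.
      arma::vec initialProbs = arma::randu<arma::vec>(numStates);
      initialProbs /= arma::accu(initialProbs);

      // Random column-stochastic transition matrix: each column sums to 1
      // (1-norm normalization along columns).
      arma::mat transition = arma::randu<arma::mat>(numStates, numStates);
      transition = arma::normalise(transition, 1, 0);

      initialProbs.print("initial probabilities:");
      transition.print("transition matrix:");

      return 0;
    }
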
You can view, comment on, or merge this pull request online at:

  https://github.com/mlpack/mlpack/pull/828

-- Commit Summary --

  * Don't use equal initial probabilities.

-- File Changes --

    M src/mlpack/methods/hmm/hmm_impl.hpp (11)
    M src/mlpack/tests/hmm_test.cpp (5)

-- Patch Links --

https://github.com/mlpack/mlpack/pull/828.patch
https://github.com/mlpack/mlpack/pull/828.diff

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/mlpack/mlpack/pull/828