[mlpack] mlpack 1.0.9 released

Ryan Curtin gth671b at mail.gatech.edu
Mon Jul 28 14:30:54 EDT 2014


Hello there,

Today the last bits of work required for a new mlpack release were
finished.  You can download mlpack 1.0.9 from

  http://www.mlpack.org/

with a direct URL

  http://www.mlpack.org/files/mlpack-1.0.9.tar.gz

It has been a while since a release, and there have been a lot of
contributions.  The projects from our five Summer of Code students are
starting to come to a close, and some of their code has been
incorporated into this release.  (You can expect another release shortly
after their projects are done; hopefully early September.)

Here is a list of changes, pulled straight from HISTORY.txt:

 - GMM initialization is now safer and provides a working GMM when
   constructed with only the dimensionality and number of Gaussians
   (#314).

 - Added a check for division by zero in the forward-backward algorithm
   for HMMs (#314).

 - Fixed MaxVarianceNewCluster (used when re-initializing clusters for
   k-means) (#314).

 - Fixed implementation of Viterbi algorithm in HMM::Predict() (#316).

 - Significant speedups for dual-tree algorithms using the cover tree
   (#243, #329), including a faster implementation of FastMKS.

 - Fix for LRSDP optimizer so that it compiles and can be used (#325).

 - CF (collaborative filtering) now expects users and items to be
   zero-indexed, not one-indexed (#324).

 - CF::GetRecommendations() API change: it now requires the number of
   recommendations as the first parameter.  The number of users in the
   local neighborhood should be specified with
   CF::NumUsersForSimilarity().  (There is a short usage sketch after
   this list.)

 - Removed incorrect PeriodicHRectBound (#30).

 - Refactored LRSDP into an LRSDP class and a standalone function to be
   optimized (#318).

 - Fix for centering in kernel PCA (#355).

 - Added simulated annealing (SA) optimizer, contributed by Zhihao Lou.

 - HMMs now support initial state probabilities; these can be set in the
   constructor, trained, or set manually with HMM::Initial() (#315).
   (See the example after this list.)

 - Added the Nyström method for kernel matrix approximation, contributed
   by Marcus Edel.

 - Kernel PCA now supports using the Nyström method for approximation.

 - Ball trees now work with dual-tree algorithms, via the BallBound<>
   bound structure (#320); fixed by Yash Vadalia.

 - The NMF class is now AMF<> and supports far more types of
   factorizations; contributed by Sumedh Ghaisas.

 - A QUIC-SVD implementation has returned, written by Siddharth Agrawal
   and based on older code from Mudit Gupta.

 - Added a perceptron and a decision stump, contributed by Udit Saxena
   (these are weak learners for an eventual AdaBoost class).

 - Sparse autoencoder added by Siddharth Agrawal.
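
For the CF changes above (zero-indexed IDs and the new
GetRecommendations() signature), here is a minimal sketch of the new
usage.  The constructor form and the reference-setter style of
NumUsersForSimilarity() are my assumptions about the current API, so
check the headers before relying on them:

  #include <mlpack/core.hpp>
  #include <mlpack/methods/cf/cf.hpp>

  using namespace mlpack::cf;

  int main()
  {
    // Ratings stored as one (user, item, rating) triple per column;
    // user and item IDs are now zero-indexed.
    arma::mat data;
    data << 0   << 1   << 1   << arma::endr   // users
         << 0   << 2   << 1   << arma::endr   // items
         << 5.0 << 3.0 << 4.5 << arma::endr;  // ratings

    CF cf(data);

    // Size of the local neighborhood (reference-setter form assumed).
    cf.NumUsersForSimilarity() = 5;

    // The number of recommendations is now the first parameter.
    arma::Mat<size_t> recommendations;
    cf.GetRecommendations(3, recommendations);

    return 0;
  }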

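Similarly, a short sketch of setting initial state probabilities on an
HMM.  The constructor form, the DiscreteDistribution emission type, and
the reference-returning Initial() accessor are assumptions on my part:

  #include <mlpack/core.hpp>
  #include <mlpack/methods/hmm/hmm.hpp>

  using namespace mlpack::hmm;
  using namespace mlpack::distribution;

  int main()
  {
    // A two-state HMM emitting one of three discrete symbols.
    HMM<DiscreteDistribution> hmm(2, DiscreteDistribution(3));

    // Set the initial state probabilities by hand; they can also be
    // given in the constructor or learned during training.
    hmm.Initial() = arma::vec("0.7 0.3");

    return 0;
  }
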
If you want a (nearly) full list of bugfixes and changes since 1.0.8,
you can see the list of resolved tickets:

http://www.mlpack.org/trac/query?status=closed&group=resolution&milestone=mlpack+1.0.9
(Note: at the moment, Trac is down.  Sorry about that.  It is being
worked on, but there is no solution quite yet... it should be back up in
a day or two.)

The benchmarks for 1.0.9 are not done yet, but I will respond to this
email with a link when they are complete and posted.

The next release is likely to be 1.1.0; we anticipate some
backward-incompatible API changes.  However, we will provide a guide
covering any big changes and how existing code can be adapted.  Many of
the abstractions we use to organize and understand our algorithms are
still not completely stable (especially the tree API), and sometimes
this forces changes...

Anyway, thanks to everyone for their hard work on this release!  Lots of
work has gone into this one.

Ryan

-- 
Ryan Curtin    | "This is how Number One works!"
ryan at ratml.org |   - Number One


