[mlpack] GSoC'22 Project Proposal: Enhance CMA-ES

Ryan Curtin ryan at ratml.org
Tue Apr 19 10:29:19 EDT 2022


Hey there Phan,

Thanks for getting in touch!  I am not an expert on CMA-ES, but what you
have written sounds good to me.  I don't think I can directly answer the
specific questions you asked about CMA-ES, but perhaps this discussion
has some useful information:

https://github.com/mlpack/ensmallen/issues/70

I hope that is helpful!

Thanks,

Ryan

On Mon, Apr 18, 2022 at 01:19:38PM +0800, Nhật Hoàng Phan wrote:
> I’m Phan Nhat Hoang, a freshman computer science student at Nanyang
> Technological University. I’m familiar with multivariate statistical
> analysis, matrix analysis, and C++ programming, all of which are relevant
> to CMA-ES (which is quite linear-algebra intensive, in my opinion). I
> would like to work on extending CMA-ES in the ensmallen repository as a
> project for Google Summer of Code 2022.
>
> I have read the original paper ("Completely Derandomized Self-Adaptation
> in Evolution Strategies"), the 2016 tutorial ("The CMA Evolution
> Strategy: A Tutorial"), and several recent variants, including saACM-ES,
> IPOP-CMA-ES, Active-CMA-ES, Cholesky-CMA-ES (2016), and the large-scale
> variants LM-CMA-ES and sep-CMA-ES.
>
> For the implementation part, I have examined the CMA-ES implementation
> in ensmallen. Although cmaes/cmaes_impl.hpp states that it implements
> the "Completely Derandomized Self-Adaptation in Evolution Strategies"
> paper, many of the implementation details appear to follow "Efficient
> covariance matrix update for variable metric evolution strategies" by
> Suttorp et al., for example the Cholesky decomposition, the
> hyperparameter choices, and the adaptation to multi-objective functions
> (see the covariance-update sketch after this message). Furthermore, the
> population size lambda in ensmallen is 10 * (4 + round(3 * log(N))),
> whereas, to the best of my knowledge, lambda is usually set to
> 4 + round(3 * log(N)) (see the population-size sketch below). I
> therefore think we need some clarification on which theoretical source
> the ensmallen implementation follows.
>
> Beyond the current implementation, the test functions (currently only
> Rosenbrock and logistic regression) should be extended with Sphere
> (important), Cigar, Ellipsoid, and Different Powers for reliable
> benchmarking (a Sphere sketch follows this message). For the CMA-ES
> enhancements, I would pick two or three algorithms (possibly
> IPOP-CMA-ES, Active-CMA-ES, and LM-CMA-ES) to improve the scalability,
> adaptation speed, and global search behaviour of the original CMA-ES,
> together with an intensive benchmark for each. The current documentation
> also lacks mathematical detail, so my plan includes mathematical
> documentation and guides on how to use these optimizers on toy functions
> and in larger-scale applications.
>
> Lastly, I am writing a detailed proposal on how to leverage the
> ensmallen APIs to implement the algorithms above, along with the
> documentation and testing plan. In this thread, I am looking for your
> comments on my current direction and on how the original CMA-ES is
> implemented.
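
As background for the Cholesky point above: in Hansen's tutorial the
covariance matrix is adapted with a rank-one plus rank-mu update, while the
Suttorp et al. paper maintains a Cholesky factor of C directly so that no
decomposition has to be recomputed each generation.  A sketch of the
standard update in the tutorial's notation (background only, not a claim
about what ensmallen does):

    C^{(g+1)} = (1 - c_1 - c_\mu)\, C^{(g)}
                + c_1\, p_c^{(g+1)} \big(p_c^{(g+1)}\big)^{\top}
                + c_\mu \sum_{i=1}^{\mu} w_i\, y_{i:\lambda}^{(g+1)}
                  \big(y_{i:\lambda}^{(g+1)}\big)^{\top},
    \qquad
    y_{i:\lambda}^{(g+1)} = \frac{x_{i:\lambda}^{(g+1)} - m^{(g)}}{\sigma^{(g)}}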

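On the population-size question: the tutorial's default is
lambda = 4 + floor(3 * ln(N)) for an N-dimensional problem.  Here is a
small sketch comparing that default with the 10x expression quoted above.
Both helper names are illustrative only, not part of the ensmallen API, and
whether ensmallen really applies the factor of 10 should be checked against
cmaes_impl.hpp.

    #include <cmath>
    #include <cstddef>

    // Default population size from "The CMA Evolution Strategy: A Tutorial":
    //   lambda = 4 + floor(3 * ln(N)), where N is the problem dimension.
    inline std::size_t TutorialLambda(const std::size_t n)
    {
      return 4 + static_cast<std::size_t>(std::floor(3.0 * std::log(n)));
    }

    // The scaled variant described in the message above:
    //   lambda = 10 * (4 + round(3 * ln(N))).
    // Illustrative only; verify against the actual ensmallen source.
    inline std::size_t ScaledLambda(const std::size_t n)
    {
      return 10 * (4 + static_cast<std::size_t>(std::round(3.0 * std::log(n))));
    }

    // Example: for N = 50, TutorialLambda gives 4 + floor(11.7) = 15,
    // while ScaledLambda gives 10 * (4 + 12) = 160.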
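
On extending the test functions: below is a minimal sketch of a Sphere
objective written against ensmallen's separable-function interface
(NumFunctions() / Evaluate(coordinates, begin, batchSize)) and optimized
with ens::CMAES.  The constructor arguments shown (lambda, lower bound,
upper bound, batch size, max iterations, tolerance) are an assumption based
on the ensmallen documentation and should be checked against the current
headers.

    #include <ensmallen.hpp>

    // Sphere objective f(x) = sum_i x_i^2, written as a "separable"
    // function so that ensmallen's CMAES can evaluate it batch-wise.
    class SphereFunction
    {
     public:
      SphereFunction(const size_t dimension) : dimension(dimension) { }

      // One separable term per coordinate.
      size_t NumFunctions() const { return dimension; }

      // Partial objective over coordinates [begin, begin + batchSize).
      double Evaluate(const arma::mat& coordinates,
                      const size_t begin,
                      const size_t batchSize) const
      {
        double objective = 0.0;
        for (size_t i = begin; i < begin + batchSize; ++i)
          objective += coordinates(i) * coordinates(i);
        return objective;
      }

      // Terms are order-independent, so there is nothing to shuffle.
      void Shuffle() { }

     private:
      size_t dimension;
    };

    int main()
    {
      SphereFunction f(50);
      arma::mat coordinates = arma::randu<arma::mat>(50, 1);

      // lambda = 0 asks ensmallen to pick its default population size.
      ens::CMAES<> cmaes(0, -10.0, 10.0, 32, 10000, 1e-8);
      cmaes.Optimize(f, coordinates);

      coordinates.print("solution (should be near the origin)");
      return 0;
    }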


-- 
Ryan Curtin    | "Oh boy, am I happy!  I'm surrounded by bugs!"
ryan at ratml.org |  - Agitha
