[mlpack] GSOC

Oleksandr Nikolskyy onikolskyy at gmail.com
Mon Mar 29 04:03:10 EDT 2021


Dear all, I want to share a draft of the first part of my GSoC proposal
(covering approximately the first month).



***

(Draft) Title: Improving the scalability of CMA-ES and introducing NES
modules in ensmallen


Motivation:

Applying Evolution Strategies (ES) to reinforcement learning (RL) problems is
an interesting topic. However, ES is often limited by computation and memory
capacity.

For example, CMA-ES, as a black-box optimizer for non-differentiable
problems, is very robust, but it requires O(n^2) space to store the
covariance matrix and to compute its Cholesky decomposition.
This means that, while solving the CartPole problem from OpenAI Gym is fast
and easy, using CMA-ES to optimize even a minimal deep convolutional network
to learn to play Atari games requires 75 GB of RAM or more.
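
As a rough back-of-the-envelope check (the exact parameter count here is my
own assumption, not a figure from the original): with on the order of
n ~ 10^5 network weights stored as 8-byte doubles, the covariance matrix
alone needs

  n^2 * 8 bytes ~ (10^5)^2 * 8 bytes = 8 * 10^10 bytes ~ 80 GB,

which is the order of magnitude quoted above.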

There exist, however, promising approaches which can handle a
high-dimensional parameter space. The idea is to introduce some of these
algorithms to the ensmallen library and, in the ideal case, to integrate
them with mlpack.

1st half:

Main focus: ES framework

There are scalability improvements to CMA-ES that achieve up to linear time
and memory complexity.

The idea is to refactor the existing CMA-ES into a modular framework which
allows implementing and using different variants of CMA-ES. Structurally,
the considered CMA-ES variants obey the general form (a simplified sketch of
this loop follows the list):

1. Sample new population
2. Selection and recombination
3. Step size control
4. Covariance adaptation
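
Below is a minimal, self-contained sketch of one such generation loop in C++
with Armadillo, just to make the four steps concrete. It is deliberately
simplified (no evolution paths, a crude step-size heuristic instead of CSA,
a plain rank-mu covariance update), and all names are illustrative rather
than existing ensmallen code:

  #include <armadillo>

  // Example objective: the sphere function f(x) = x^T x.
  double Sphere(const arma::vec& x) { return arma::dot(x, x); }

  int main()
  {
    const size_t n = 10, lambda = 20, mu = 5, generations = 200;

    arma::vec mean(n, arma::fill::randu);
    arma::mat C(n, n, arma::fill::eye);  // The O(n^2) covariance matrix.
    double sigma = 0.5;

    for (size_t g = 0; g < generations; ++g)
    {
      // 1. Sample a new population x_i ~ N(mean, sigma^2 C).
      arma::mat A = arma::chol(C, "lower");
      arma::mat pop(n, lambda);
      arma::vec fitness(lambda);
      for (size_t i = 0; i < lambda; ++i)
      {
        pop.col(i) = mean + sigma * (A * arma::randn<arma::vec>(n));
        fitness(i) = Sphere(pop.col(i));
      }

      // 2. Selection and recombination: average the mu best samples.
      arma::uvec order = arma::sort_index(fitness);
      arma::vec newMean(n, arma::fill::zeros);
      for (size_t i = 0; i < mu; ++i)
        newMean += pop.col(order(i)) / double(mu);

      // 3. Step-size control (crude heuristic, not cumulative adaptation).
      sigma *= (fitness(order(0)) < Sphere(mean)) ? 1.05 : 0.95;

      // 4. Covariance adaptation: simplified rank-mu update.
      arma::mat Cmu(n, n, arma::fill::zeros);
      for (size_t i = 0; i < mu; ++i)
      {
        arma::vec y = (pop.col(order(i)) - mean) / sigma;
        Cmu += y * y.t() / double(mu);
      }
      C = 0.8 * C + 0.2 * Cmu;

      mean = newMean;
    }

    mean.print("final mean (should be close to zero):");
  }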

Different approaches exist; LM-CMA-ES, for example, does not store a
covariance matrix at all in order to reduce memory usage. Therefore, from a
more general point of view, we can summarize the ES structure as (a sketch
of the resulting framework follows the list):

1. SamplerStrategy
2. SelectionStrategy
3. AdaptationStrategy
4. RestartStrategy (?)
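
To illustrate how these strategies could be composed, here is a rough,
hypothetical skeleton of such a framework. The class and method names are
placeholders of mine, not existing ensmallen types; the real design would of
course have to follow ensmallen's optimizer API:

  #include <armadillo>
  #include <utility>

  template<typename SamplerStrategyType,
           typename SelectionStrategyType,
           typename AdaptationStrategyType>
  class ModularES
  {
   public:
    ModularES(SamplerStrategyType sampler = SamplerStrategyType(),
              SelectionStrategyType selection = SelectionStrategyType(),
              AdaptationStrategyType adaptation = AdaptationStrategyType()) :
        sampler(std::move(sampler)),
        selection(std::move(selection)),
        adaptation(std::move(adaptation)) { }

    template<typename FunctionType>
    double Optimize(FunctionType& function,
                    arma::vec& coordinates,
                    const size_t maxGenerations = 1000)
    {
      for (size_t g = 0; g < maxGenerations; ++g)
      {
        // Each strategy owns one of the steps listed above.
        arma::mat population = sampler.Sample(coordinates);

        arma::vec fitness(population.n_cols);
        for (size_t i = 0; i < population.n_cols; ++i)
          fitness(i) = function.Evaluate(population.col(i));

        coordinates = selection.Recombine(population, fitness);
        adaptation.Update(population, fitness, coordinates);
      }
      return function.Evaluate(coordinates);
    }

   private:
    SamplerStrategyType sampler;
    SelectionStrategyType selection;
    AdaptationStrategyType adaptation;
  };

A restart strategy could then wrap the whole Optimize() call, similar to how
IPOP/BIPOP restarts wrap plain CMA-ES.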

An important point in constructing the framework would be to provide type
traits to ensure that a given combination of strategies can work together.
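
As a sketch of what such a compile-time check could look like (the trait and
method names here are hypothetical, not existing ensmallen code), a
detection idiom can verify that a sampler actually produces a population
matrix:

  #include <armadillo>
  #include <type_traits>
  #include <utility>

  // True if SamplerType provides Sample(const arma::vec&) returning arma::mat.
  template<typename SamplerType, typename = void>
  struct IsPopulationSampler : std::false_type { };

  template<typename SamplerType>
  struct IsPopulationSampler<SamplerType,
      std::enable_if_t<std::is_same_v<
          decltype(std::declval<SamplerType&>().Sample(
              std::declval<const arma::vec&>())),
          arma::mat>>> : std::true_type { };

Inside the framework, an incompatible combination could then be rejected at
compile time with a readable message:

  static_assert(IsPopulationSampler<SamplerStrategyType>::value,
      "SamplerStrategyType must provide Sample(const arma::vec&) -> arma::mat");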

One of the main advantages of this general framework is extensibility: a
good implementation would allow users to invent their own ES variants and
give them the possibility to experiment.

Further, the algorithms
CMA-ES
LM-CMA-ES
SEP-CMA-ES

should be predefined and accessible in a similar way to the predefined
DecisionTree types in mlpack (a sketch of possible aliases follows below).
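
Concretely, these could be exposed as preconfigured aliases of the
framework, in the same spirit as mlpack's DecisionTree typedefs. The
strategy names below are pure placeholders of mine; none of them exist yet:

  // Hypothetical strategy types, assumed to be implemented as sketched above.
  struct FullCovarianceSampler { };
  struct LimitedMemorySampler  { };
  struct DiagonalSampler       { };
  struct MuBestSelection       { };
  struct RankMuUpdate          { };
  struct CholeskyFactorUpdate  { };
  struct DiagonalUpdate        { };

  template<typename S, typename Sel, typename A>
  class ModularES;  // As sketched earlier.

  using CMAES    = ModularES<FullCovarianceSampler, MuBestSelection, RankMuUpdate>;
  using LMCMAES  = ModularES<LimitedMemorySampler,  MuBestSelection, CholeskyFactorUpdate>;
  using SepCMAES = ModularES<DiagonalSampler,       MuBestSelection, DiagonalUpdate>;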

A tutorial on the usage and extension of the framework should be provided.

2nd half:

Main focus: Variants of Natural ES

Special interest:
Parallelized Natural Evolution Strategies for RL

... More details coming soon; in the meantime, take a coffee break ...

Bonus:

CEM-RL for mlpack (possibly working together with another student who
focuses on RL)





***



I would be happy to receive some feedback!
In particular, I am interested in hearing your thoughts on the idea of
writing the CMA-ES framework. Do you think it is overkill? I have been
reading through different papers on improvements to CMA-ES, and there are so
many variants out there that extensibility is perhaps a good approach.