[mlpack] GSoC'20

Marcus Edel marcus.edel at fu-berlin.de
Mon Mar 16 18:13:33 EDT 2020


Hello Yihan,

thanks for the update. I really like the IPOP-CMA-ES idea, and the others are
interesting as well. The general project plan sounds reasonable to me. If you
apply, make sure to document how you plan to test the implementation as well.
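
For example, a convergence test could run the optimizer on a function with a
known minimum and check the result. A minimal sketch, assuming ensmallen's
bundled SphereFunction test problem and the default CMAES parameters (exact
names and the eventual test framework are up to you):

#include <ensmallen.hpp>

int main()
{
  // The sphere function f(x) = x^T x has its minimum, 0, at the origin.
  // Assumes ensmallen's bundled test problem; header/namespace may differ.
  ens::test::SphereFunction f(2);
  arma::mat coordinates = f.GetInitialPoint();

  ens::CMAES<> optimizer;
  const double objective = optimizer.Optimize(f, coordinates);

  // A restart variant like IPOP-CMA-ES should reach the optimum at least
  // as reliably as the plain optimizer does here.
  return (objective < 1e-3) ? 0 : 1;
}

Since CMA-ES is stochastic, controlling the random seed and running repeated
trials is worth thinking about too.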

Thanks,
Marcus

> On 16. Mar 2020, at 16:13, Yihan Wang <wangyihan617 at gmail.com> wrote:
> 
> Hi Marcus, 
> 
> I have gotten familiar with the code base and the listed papers, and I think these are the modifications we should add to the original CMA-ES:
> * ACM-ES
> We can add a surrogate model to the original algorithm: during each iteration we check whether a surrogate was given and, if so, use it in place of some true objective evaluations (see the sketch after Phase #3 below).
> * IPOP-CMA-ES
> The key idea is to restart the algorithm with an increased population size each time a run terminates without reaching the target (a rough sketch follows this list).
> * Active-CMA-ES
> The key idea is to introduce new update rules for the covariance matrix and related parameters, so that the worst offspring also contribute, with negative weights. I think we can extract the update functions in the main algorithm into separate functions that can be overridden in derived classes.
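> 
> To make the IPOP idea concrete, here is a rough sketch of the restart wrapper I have in mind. The wrapper itself is hypothetical, not existing ensmallen code; it assumes lambda is the first CMAES constructor parameter and restarts from the same initial point for simplicity:
> 
> #include <ensmallen.hpp>
> #include <cmath>
> 
> template<typename FunctionType>
> double IPOPCMAES(FunctionType& function, arma::mat& coordinates,
>                  const double targetObjective,
>                  const size_t maxRestarts = 9)
> {
>   // Standard default population size for CMA-ES.
>   size_t lambda = 4 + (size_t) std::floor(3 * std::log((double) coordinates.n_elem));
>   const arma::mat initial = coordinates;
>   double best = arma::datum::inf;
> 
>   for (size_t restart = 0; restart <= maxRestarts; ++restart)
>   {
>     // Assumed: lambda is the first CMAES constructor parameter.
>     ens::CMAES<> cmaes(lambda);
>     arma::mat candidate = initial;  // restart from the initial point
>     const double result = cmaes.Optimize(function, candidate);
> 
>     if (result < best)
>     {
>       best = result;
>       coordinates = candidate;
>     }
>     if (best <= targetObjective)
>       break;  // good enough, no restart needed
> 
>     lambda *= 2;  // IPOP: double the population size and try again
>   }
> 
>   return best;
> }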
> 
> Here are the steps I think we need to take:
> Phase #1
> 1. refactor initialization, iteration, and the update rules into separate functions (see the sketch after this list)
> 2. implement IPOP-CMA-ES, which will be easier on the basis of step #1
> 3. tests, and test preparation for the other two algorithms
> 4. documentation
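> 
> For step #1, I am thinking of a shape like the following. All names are hypothetical, and the update bodies are heavily simplified stand-ins for the real equations; since ensmallen is header-only and template-heavy, a policy/template parameter may fit its style better than virtual functions:
> 
> #include <armadillo>
> #include <cmath>
> 
> class CMAESBase
> {
>  public:
>   virtual ~CMAESBase() { }
> 
>  protected:
>   // Called once per generation from the (omitted) main loop; 'steps'
>   // holds one mean-centered offspring step per column, sorted by fitness.
>   virtual void UpdateMean(arma::vec& mean, const arma::mat& offspring)
>   { mean = arma::mean(offspring, 1); }  // unweighted, for brevity
> 
>   virtual void UpdateCovariance(arma::mat& C, const arma::mat& steps)
>   { C = steps * steps.t() / steps.n_cols; }  // rank-mu only, simplified
> 
>   virtual void UpdateStepSize(double& sigma, const arma::vec& path)
>   { sigma *= std::exp(arma::norm(path) / path.n_elem - 1.0); }  // stand-in, not the real CSA rule
> };
> 
> class ActiveCMAES : public CMAESBase
> {
>  protected:
>   // Active-CMA-ES reuses everything except the covariance update, where
>   // the worst offspring also enter, with negative weights (simplified).
>   void UpdateCovariance(arma::mat& C, const arma::mat& steps) override
>   {
>     const size_t half = steps.n_cols / 2;
>     const arma::mat good = steps.cols(0, half - 1);
>     const arma::mat bad = steps.cols(half, steps.n_cols - 1);
>     C = (good * good.t() - 0.5 * bad * bad.t()) / half;
>   }
> };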
> 
> Phase #2
> 1. implement Active-CMA-ES
> 2. test and documentation
> 
> Phase #3
> 1. implement ACM-ES
> 2. test and documentation
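> 
> For the ACM-ES surrogate hook from the list above, I imagine something along these lines. The SurrogateType interface with Predict()/Train() is hypothetical, and f.Evaluate(x) is assumed to be the plain single-argument form:
> 
> #include <armadillo>
> #include <algorithm>
> 
> // Rank offspring with the cheap surrogate and spend exact evaluations
> // only on the most promising candidates.
> template<typename FunctionType, typename SurrogateType>
> void EvaluateOffspring(FunctionType& f,
>                        SurrogateType* surrogate,  // hypothetical interface
>                        const arma::mat& offspring,  // one column per child
>                        arma::vec& fitness)
> {
>   fitness.set_size(offspring.n_cols);
> 
>   if (surrogate == nullptr)
>   {
>     // No surrogate given: behave exactly like plain CMA-ES.
>     for (size_t i = 0; i < offspring.n_cols; ++i)
>       fitness(i) = f.Evaluate(offspring.col(i));
>     return;
>   }
> 
>   // Pre-rank all children by the surrogate's prediction.
>   arma::vec predicted(offspring.n_cols);
>   for (size_t i = 0; i < offspring.n_cols; ++i)
>     predicted(i) = surrogate->Predict(offspring.col(i));
> 
>   // Evaluate only the most promising quarter exactly, and feed those
>   // (point, value) pairs back to refine the surrogate.
>   const arma::uvec order = arma::sort_index(predicted);
>   const size_t exact = std::max<size_t>(1, offspring.n_cols / 4);
> 
>   fitness = predicted;  // fall back to predictions for the rest
>   for (size_t i = 0; i < exact; ++i)
>   {
>     const arma::uword j = order(i);
>     fitness(j) = f.Evaluate(offspring.col(j));
>     surrogate->Train(offspring.col(j), fitness(j));
>   }
> }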
> 
> Please let me know if you have any suggestions on the plan.
> 
> Thanks,
> Yihan
> 
> 
> On Mon, Mar 9, 2020 at 7:08 AM, Marcus Edel <marcus.edel at fu-berlin.de> wrote:
> Hello Yihan,
> 
> > * Enhance CMA-ES: I have begun to check the listed references, and I have a
> > question related to the current mlpack: is there an original CMA-ES algorithm
> > in mlpack already? If there is none, I can begin from the original
> > implementation.
> 
> All mlpack optimizers, including CMA-ES, are in a separate repository,
> ensmallen:
> https://github.com/mlpack/ensmallen and
> https://github.com/mlpack/ensmallen/tree/master/include/ensmallen_bits/cmaes.
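> 
> For reference, calling the existing optimizer looks roughly like this; a minimal sketch with a hand-written toy objective, since ens::CMAES expects the separable function interface (Evaluate(x, begin, batchSize), NumFunctions(), Shuffle()) and all constructor parameters have defaults:
> 
> #include <ensmallen.hpp>
> #include <iostream>
> 
> // Toy objective f(x) = x^T x, written against the separable interface.
> class SphereObjective
> {
>  public:
>   double Evaluate(const arma::mat& x, const size_t /* begin */,
>                   const size_t /* batchSize */)
>   { return arma::accu(x % x); }
> 
>   size_t NumFunctions() const { return 1; }
>   void Shuffle() { }
> };
> 
> int main()
> {
>   SphereObjective f;
>   arma::mat coordinates = arma::randu<arma::mat>(10, 1);
> 
>   ens::CMAES<> optimizer;  // default parameters; may vary by version
>   const double objective = optimizer.Optimize(f, coordinates);
> 
>   std::cout << "found minimum " << objective << std::endl;
> }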
> 
> > * Implement the Transformer in mlpack: I think what we need to do is first
> > implement an attention layer and then the transformer itself. For testing, we
> > can compare the results with those produced by PyTorch or a similar framework.
> 
> Agreed, mlpack doesn't implement an attention layer.
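> 
> As a reference point for that comparison, the core computation is small. Here is scaled dot-product attention in Armadillo (mlpack's matrix backend); an illustrative sketch only, not mlpack layer code:
> 
> #include <armadillo>
> #include <cmath>
> 
> // Illustrative sketch of scaled dot-product attention:
> // softmax(K^T Q / sqrt(d_k)) applied to V.
> // Each column of Q, K, V is one sequence position (dims x seqLen).
> arma::mat Attention(const arma::mat& Q,
>                     const arma::mat& K,
>                     const arma::mat& V)
> {
>   // Scores: element (i, j) compares key i with query j.
>   arma::mat scores = (K.t() * Q) / std::sqrt((double) K.n_rows);
> 
>   // Numerically stable column-wise softmax over the keys.
>   scores.each_row() -= arma::max(scores, 0);
>   scores = arma::exp(scores);
>   scores.each_row() /= arma::sum(scores, 0);
> 
>   // Weighted combination of the value vectors (dims x seqLen).
>   return V * scores;
> }
> 
> A real layer adds the learned Q/K/V projections, multi-head splitting, masking, and the backward pass on top of this, which is where most of the work will be.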
> 
> Let me know if I should clarify anything.
> 
> Thanks,
> Marcus
> 
> > On 8. Mar 2020, at 07:54, Yihan Wang <wangyihan617 at gmail.com> wrote:
> > 
> > Hi all,
> > 
> > I am Yihan Wang, a final-year student at Tsinghua University with more than a year's research experience in machine learning algorithms. I am interested in participating in this year's GSoC, in particular in these two topics:
> > 
> > * Enhance CMA-ES
> > I have begun to check the listed references, and I have a question related to the current mlpack: is there an original CMA-ES algorithm in mlpack already? If there is none, I can begin from the original implementation.
> > 
> > * Implement the Transformer in mlpack
> > I think what we need to do is first implement an attention layer and then the transformer itself. For testing, we can compare the results with those produced by PyTorch or a similar framework.
> > 
> > Do you have any suggestions related to these two ideas?
> > 
> > Best,
> > Yihan
> 
