[mlpack] GSoC 2018: Particle Swarm Optimization

Marcus Edel marcus.edel at fu-berlin.de
Sun Mar 11 13:15:10 EDT 2018


Hello Chintan,

> I've been through the MOPSO with GD approach as we discussed earlier, and I
> certainly like the idea. My thoughts as of now are:
> 
> (1) Add a basic PSO optimizer to the existing optimization API (might refer
> to the gradient descent optimizer code for help/coding style).

There is already an open PR for the standard PSO
https://github.com/mlpack/mlpack/pull/1225 that should be helpful.
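
In case it helps to see the overall shape, below is a minimal global-best PSO
sketch against the existing optimizer API. This is illustrative only (not the
code from the PR), the parameter names and defaults are my own, and the only
requirement on FunctionType is a double Evaluate(const arma::mat&) method:

#include <armadillo>
#include <limits>

// Illustrative global-best PSO: each particle keeps a position, a velocity,
// and its personal best; the swarm tracks the global best in 'iterate'.
class PSO
{
 public:
  PSO(const size_t numParticles = 16,
      const size_t maxIterations = 1000,
      const double inertia = 0.72,
      const double cognitive = 1.49,
      const double social = 1.49) :
      numParticles(numParticles), maxIterations(maxIterations),
      inertia(inertia), cognitive(cognitive), social(social) { }

  template<typename FunctionType>
  double Optimize(FunctionType& function, arma::mat& iterate)
  {
    // One cube slice per particle; start as perturbations of the iterate.
    arma::cube positions(iterate.n_rows, iterate.n_cols, numParticles);
    positions.each_slice() = iterate;
    positions += 0.1 * arma::randn<arma::cube>(iterate.n_rows,
        iterate.n_cols, numParticles);
    arma::cube velocities(iterate.n_rows, iterate.n_cols, numParticles,
        arma::fill::zeros);

    arma::cube bestPositions = positions;
    arma::vec bestFitness(numParticles);
    double globalBest = std::numeric_limits<double>::infinity();
    for (size_t i = 0; i < numParticles; ++i)
    {
      bestFitness(i) = function.Evaluate(positions.slice(i));
      if (bestFitness(i) < globalBest)
      {
        globalBest = bestFitness(i);
        iterate = positions.slice(i);
      }
    }

    for (size_t t = 0; t < maxIterations; ++t)
    {
      for (size_t i = 0; i < numParticles; ++i)
      {
        // Velocity update: inertia plus randomly weighted pulls toward the
        // personal best and the global best.
        velocities.slice(i) = inertia * velocities.slice(i) +
            cognitive * arma::randu<arma::mat>(iterate.n_rows,
            iterate.n_cols) % (bestPositions.slice(i) - positions.slice(i)) +
            social * arma::randu<arma::mat>(iterate.n_rows,
            iterate.n_cols) % (iterate - positions.slice(i));
        positions.slice(i) += velocities.slice(i);

        const double fitness = function.Evaluate(positions.slice(i));
        if (fitness < bestFitness(i))
        {
          bestFitness(i) = fitness;
          bestPositions.slice(i) = positions.slice(i);
        }
        if (fitness < globalBest)
        {
          globalBest = fitness;
          iterate = positions.slice(i);
        }
      }
    }

    return globalBest;
  }

 private:
  size_t numParticles, maxIterations;
  double inertia, cognitive, social;
};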

> (2) Add support for constraint-based optimization (the AugLagrangian class
> has some support for equality constraints by calculating the associated
> penalty and keeping it under a threshold; maybe a similar approach will
> work here?).

Agreed, that is definitely an option we can adapt for the PSO method.
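
Roughly, the idea would be to fold the constraint violations into the
particle fitness. Here is a sketch, assuming the function type follows the
same interface AugLagrangian expects (i.e. it exposes NumConstraints() and
EvaluateConstraint()); sigma is a penalty parameter that could be grown over
the iterations, similar to what AugLagrangian does:

#include <armadillo>

// Sketch: penalty-augmented fitness for constrained PSO.  Each violated
// equality constraint c_i(x) = 0 adds sigma * c_i(x)^2 to the fitness the
// swarm minimizes.
template<typename LagrangianFunctionType>
double PenalizedEvaluate(LagrangianFunctionType& function,
                         const arma::mat& coordinates,
                         const double sigma)
{
  double fitness = function.Evaluate(coordinates);
  for (size_t i = 0; i < function.NumConstraints(); ++i)
  {
    const double constraint = function.EvaluateConstraint(i, coordinates);
    fitness += sigma * constraint * constraint;
  }
  return fitness;
}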

> (3) Extend the functionality to multi-objective optimization, this might
> require reworking the Evaluate() methods of FunctionTypes to evaluate the
> position for multiple objective functions; perhaps a better approach would
> be to add another FunctionType which will be used by the MOPSO optimizer
> (MultiObjectiveFunctionType?).

Agreed, another FunctionType is a good idea to represent multiple functions.
One idea is to use variadic templates to pass multiple functions; that would
allow us to keep the same interface, something like the sketch below.
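
This is just a sketch (the class name is hypothetical, std::index_sequence
needs C++14, and each wrapped function is assumed to expose the usual
double Evaluate(const arma::mat&)):

#include <armadillo>
#include <tuple>
#include <utility>

// Sketch of a variadic MultiObjectiveFunctionType: wraps any number of
// single-objective functions and evaluates all of them at once, returning
// one objective value per function.
template<typename... FunctionTypes>
class MultiObjectiveFunction
{
 public:
  MultiObjectiveFunction(FunctionTypes&... functions) :
      functions(functions...) { }

  // Evaluate every wrapped objective at the given coordinates.
  arma::vec Evaluate(const arma::mat& coordinates)
  {
    arma::vec objectives(sizeof...(FunctionTypes));
    EvaluateAll(coordinates, objectives,
        std::index_sequence_for<FunctionTypes...>());
    return objectives;
  }

 private:
  template<size_t... Is>
  void EvaluateAll(const arma::mat& coordinates,
                   arma::vec& objectives,
                   std::index_sequence<Is...>)
  {
    // Pack expansion: call Evaluate() on each stored function.
    const int dummy[] =
        { (objectives(Is) = std::get<Is>(functions).Evaluate(coordinates),
           0)... };
    (void) dummy;
  }

  std::tuple<FunctionTypes&...> functions;
};

A MOPSO optimizer could then be templatized on this one type, so its
Optimize() signature stays the same as for the single-objective optimizers.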

> I also had a small doubt: is there a minimum amount of RAM/resources I need to
> have on my system? I am running a Fedora 27 on a Core i5 with 4GB of RAM, and I
> cannot keep another application open while building the code (I switched from
> the distributed tar file to a clone of the repo), not even Atom. Should I
> consider getting a RAM upgrade?

Right, the codebase does use a lot of memory during the build step; one option
is to turn off the Python bindings and the command-line executables:

cmake -DBUILD_CLI_EXECUTABLES=OFF -DBUILD_PYTHON_BINDINGS=OFF ..

You can also check whether adding -DDEBUG=ON helps, since a debug build is
compiled without optimization, which can reduce the compiler's memory use.
Also, maybe there is an option to increase or add swap space?

Let me know if I should clarify anything.

Thanks,
Marcus

> On 11. Mar 2018, at 11:39, Chintan Soni <chintan.soni4 at gmail.com> wrote:
> 
> Hi Marcus,
> 
> I've been through the MOPSO with GD approach as we discussed earlier, and I certainly like the idea. My thoughts as of now are:
> 
> (1) Add a basic PSO optimizer to the existing optimization API (might refer to the gradient descent optimizer code for help/coding style).
> 
> (2) Add support for constraint-based optimization (the AugLagrangian class has some support for equality constraints by calculating the associated penalty and keeping it under a threshold; maybe a similar approach will work here?).
> 
> (3) Extend the functionality to multi-objective optimization, this might require reworking the Evaluate() methods of FunctionTypes to evaluate the position for multiple objective functions; perhaps a better approach would be to add another FunctionType which will be used by the MOPSO optimizer (MultiObjectiveFunctionType?).
> 
> What are your thoughts about this? Especially regarding (2)?
> 
> I also had a small doubt: is there a minimum amount of RAM/resources I need to have on my system? I am running a Fedora 27 on a Core i5 with 4GB of RAM, and I cannot keep another application open while building the code (I switched from the distributed tar file to a clone of the repo), not even Atom. Should I consider getting a RAM upgrade?
> 
> Thanks and regards,
> Chintan


