[mlpack] GSoC 2019
Marcus Edel
marcus.edel at fu-berlin.de
Mon Mar 4 17:43:42 EST 2019
Hello Daniel,
thanks for getting in touch.
> - Given that there are many variants of PSO that seem worth implementing, such as using or not velocity clamping
> or a constriction coefficient to prevent velocity explosion, fully informed PSO, etc., how should I go about
> reflecting variations of the algorithm? I have observed that there are different directories for different versions
> of gradient descent, should I imitate this?
That depends on the final decision about which variants/variations you would like
to implement. If we can combine multiple variations, e.g. by using the policy
design pattern, I think it makes sense to do so; if not, it's absolutely fine to
implement each variant as a separate class. Also, the PSO class should follow
the function type documentation; see
http://www.ensmallen.org/docs.html#function-type-documentation for more details.
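To illustrate the policy idea: the velocity update is what most PSO variants
change, so it can be factored out into a small policy class that the optimizer
takes as a template parameter. The sketch below is hypothetical (the class and
method names are illustrative, not ensmallen's actual API) and uses scalars
instead of Armadillo vectors to keep it self-contained; the constriction
policy uses the standard Clerc-Kennedy coefficient.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch, not ensmallen's API: two velocity-update policies
// that a PSO class template can select at compile time.

// Plain inertia-weight update: v' = w*v + c1*r1*(pBest - x) + c2*r2*(gBest - x).
struct InertiaWeightUpdate
{
  double w = 0.7, c1 = 1.5, c2 = 1.5;

  double Update(double v, double x, double pBest, double gBest,
                double r1, double r2) const
  {
    return w * v + c1 * r1 * (pBest - x) + c2 * r2 * (gBest - x);
  }
};

// Clerc-Kennedy constriction coefficient, one way to prevent velocity
// explosion without explicit clamping (requires c1 + c2 > 4).
struct ConstrictionUpdate
{
  double c1 = 2.05, c2 = 2.05;

  double Update(double v, double x, double pBest, double gBest,
                double r1, double r2) const
  {
    const double phi = c1 + c2;
    const double chi =
        2.0 / std::abs(2.0 - phi - std::sqrt(phi * phi - 4.0 * phi));
    return chi * (v + c1 * r1 * (pBest - x) + c2 * r2 * (gBest - x));
  }
};

// The optimizer takes the velocity policy as a template parameter, so a new
// variant is a new small policy class rather than a whole new optimizer.
template<typename VelocityUpdatePolicy>
class PSO
{
 public:
  VelocityUpdatePolicy policy;

  double Step(double v, double x, double pBest, double gBest,
              double r1, double r2) const
  {
    return policy.Update(v, x, pBest, gBest, r1, r2);
  }
};
```

Variants that can't be expressed as a swapped-in policy (e.g. fully informed
PSO, which changes which particles contribute to the update) would then still
get their own class.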
> - From what I have read in some papers I have got the impression that the usual way to implement constrained optimization
> is by modelling constraints as penalty functions, using functions that take infinite values to represent hard constraints.
> Should traits for constraints reflect this?
I think that is a good idea, yes.
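As a concrete sketch of the penalty idea (names are illustrative, not
ensmallen's API): a soft constraint adds a penalty term that grows with the
violation, while a hard constraint returns +infinity so the optimizer never
accepts an infeasible point. The toy objective, bounds, and penalty
coefficient below are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <limits>

// Hypothetical penalized objective: f(x) = x^2 with a soft constraint
// x >= 1 and a hard constraint x <= 10.
double PenalizedObjective(double x)
{
  // Hard constraint: any x > 10 is infeasible, so return +infinity.
  if (x > 10.0)
    return std::numeric_limits<double>::infinity();

  const double f = x * x;

  // Soft constraint x >= 1: quadratic penalty on the violation amount.
  const double violation = std::max(0.0, 1.0 - x);
  const double mu = 100.0;  // Penalty coefficient (assumed value).

  return f + mu * violation * violation;
}
```

A trait for constrained functions could then simply require that the function
type expose this kind of penalized evaluation.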
I hope this was helpful; let me know if I should clarify anything.
Thanks,
Marcus
> On 4. Mar 2019, at 22:39, danipozo at autistici.org wrote:
>
> Hello,
>
> I am Daniel Pozo Escalona, fourth year computer science and mathematics student at the University of Granada, Spain.
> I am interested in one of the projects proposed for GSoC 2019: implementing different variants of Particle Swarm Optimization.
>
> If I have understood correctly the structure of the project, an implementation should:
>
> - Add a directory to include/ensmallen_bits in the ensmallen repo, where the code for PSO would reside.
> - Add a trait for functions that provide constraints.
>
> I have some questions on these points:
>
> - Given that there are many variants of PSO that seem worth implementing, such as using or not velocity clamping
> or a constriction coefficient to prevent velocity explosion, fully informed PSO, etc., how should I go about
> reflecting variations of the algorithm? I have observed that there are different directories for different versions
> of gradient descent, should I imitate this?
>
> - From what I have read in some papers I have got the impression that the usual way to implement constrained optimization
> is by modelling constraints as penalty functions, using functions that take infinite values to represent hard constraints.
> Should traits for constraints reflect this?
>
> Regards,
> Daniel.
> _______________________________________________
> mlpack mailing list
> mlpack at lists.mlpack.org
> http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack