[mlpack] Query regarding constrained and unconstrained methods

Marcus Edel marcus.edel at fu-berlin.de
Tue Mar 13 19:07:11 EDT 2018


Hello Adeel,

thanks for the update. I like the idea of using a traits class; my only concern is
that I can't use both (with and without constraints) on the same function. What
do you think about adding HasConstraints to the function itself? We could also
use SFINAE and enable_if to detect whether a function implements a specific
method and call it accordingly. Let me know what you think.
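
As a rough sketch of the SFINAE idea (the HasConstraints detector and the
ProjectOntoConstraints() name below are only placeholders, nothing that exists
in mlpack yet), something like this could work:

#include <type_traits>

// Compile-time check: does FunctionType have a member named
// ProjectOntoConstraints()? (The method name is just a placeholder.)
template<typename FunctionType>
class HasConstraints
{
 private:
  template<typename T>
  static std::true_type Check(decltype(&T::ProjectOntoConstraints));

  template<typename T>
  static std::false_type Check(...);

 public:
  static const bool value = decltype(Check<FunctionType>(0))::value;
};

// Overload chosen when the function provides constraints.
template<typename FunctionType>
typename std::enable_if<HasConstraints<FunctionType>::value, void>::type
EnforceConstraints(FunctionType& function)
{
  function.ProjectOntoConstraints();
}

// No-op overload for unconstrained functions.
template<typename FunctionType>
typename std::enable_if<!HasConstraints<FunctionType>::value, void>::type
EnforceConstraints(FunctionType& /* function */)
{
  // Nothing to enforce.
}

The optimizer could then simply call EnforceConstraints(function) after each
update; for unconstrained functions the call compiles to a no-op.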

Thanks,
Marcus

> On 13. Mar 2018, at 19:18, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
> 
> Hello Marcus,
> 
> Thank you for your guidance so far. The pull request I started working on last month now provides an interface for handling unconstrained problems (https://github.com/mlpack/mlpack/pull/1225). Please let me know if it requires further improvements.
> 
> I have started working on my application and will share it as soon as I have a reviewable draft. Currently, I'm brainstorming some ideas related to the API design for handling constrained problems.
> 
> We previously discussed using lambda functions for defining the constraints. This seems alright, but we should also have a mechanism for detecting whether the user wants to add constraints at all. For this, I think we can make use of traits. I imagine something like this would work:
> 
> template<typename ConstraintType>
> struct ConstraintTraits
> {
>   // If true, the constraints are enforced after each update.
>   // This defaults to false.
>   static const bool HasConstraints = false;
> };
> 
> The user can then make a template specialization to set the flag:
> 
> template<>
> struct ConstraintTraits<bool>
> {
>   static const bool HasConstraints = true;
> };
> 
> Which can then be used to enforce constraints:
> 
> template<typename ConstraintType>
> void EnforceConstraints(/* ... */)
> {
>   if (ConstraintTraits<ConstraintType>::HasConstraints)
>   {
>     // Enforce constraints.
>   }
> }
> 
> Something similar is already being done in bound_traits.hpp (https://github.com/mlpack/mlpack/blob/3d3d733ba3c41c4f51764f44185767384ab6d9c7/src/mlpack/core/tree/bound_traits.hpp#L31). Additionally, I think constraints can also be provided as part of the ConstraintTraits structure, using a lambda function as the structure member.
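> 
> Just to illustrate that last point (the ExampleFunction type and the
> Constraint() member below are only placeholders, not anything in mlpack),
> such a specialization could look roughly like this:
> 
> #include <functional>
> 
> // Hypothetical objective type, used only for this sketch.
> class ExampleFunction { /* Evaluate(), etc. */ };
> 
> // Primary template: no constraints by default.
> template<typename ConstraintType>
> struct ConstraintTraits
> {
>   static const bool HasConstraints = false;
> };
> 
> // Specialization that sets the flag and also exposes the constraint itself
> // as a callable member of the structure.
> template<>
> struct ConstraintTraits<ExampleFunction>
> {
>   static const bool HasConstraints = true;
> 
>   static std::function<bool(double, double)> Constraint()
>   {
>     return [](double x, double y) { return x + y - 2 < 0; };
>   }
> };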
> 
> Another approach would be to provide constraints as part of the evaluation function class. In the implementation I proposed, the template class passed to the Optimize() method would contain the constraint definition.
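> 
> As a rough sketch of this second approach (again, the class and the
> Constraint() method name are just placeholders):
> 
> #include <cmath>
> 
> #include <armadillo>
> 
> // Hypothetical function type passed to Optimize(); besides Evaluate(), it
> // carries its own constraint definition.
> class ConstrainedExampleFunction
> {
>  public:
>   double Evaluate(const arma::mat& coordinates) const
>   {
>     return std::pow(1 - coordinates(0), 2) +
>         100 * std::pow(coordinates(1) - std::pow(coordinates(0), 2), 2);
>   }
> 
>   // The constraint definition lives in the function class itself.
>   bool Constraint(const arma::mat& coordinates) const
>   {
>     return coordinates(0) + coordinates(1) - 2 < 0;
>   }
> };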
> 
> Could you please let me know which of these techniques would be more desirable? Or, should I mention both in my proposal and try to come up with pros and cons for each?
> 
> Thank you,
> Adeel
> 
> From: Marcus Edel <marcus.edel at fu-berlin.de>
> Sent: Tuesday, January 30, 2018 6:05 PM
> To: Adeel Ahmad
> Cc: mlpack at lists.mlpack.org
> Subject: Re: [mlpack] Query regarding constrained and unconstrained methods
>  
> Hello Adeel,
> 
>> I have created a minimal layout for the PSO (unconstrained problems). Please see
>> this commit (https://github.com/adl1995/mlpack/commit/493c87968a9291582bb663d53722818295e5cd47).
>> Could you please provide some feedback on the current progress?
>> Maybe it would be easier to use GitHub to provide the feedback?
> 
> Not sure what your plans are; if you'd like to work on the standard PSO just for
> fun and to jump into the codebase, my recommendation is to open a PR, but don't
> feel obligated, we don't require a code submission.
> 
> The overall structure looks good; it's clear, and I think once you go for the
> actual implementation this should be helpful since you can just modify one of
> the existing tests to get the first results. One comment about the test (file
> pso_test.cpp): writing tests is done with the Boost Unit Test Framework:
> http://www.boost.org/doc/libs/1_60_0/libs/test/doc/html/index.html. Realistically,
> you can look at some of the other tests in src/mlpack/tests/ for examples;
> BOOST_REQUIRE_CLOSE() and BOOST_REQUIRE_EQUAL() are the most useful macros.  If
> you make a test suite called "TestSuite" (BOOST_AUTO_TEST_SUITE(TestSuite)), and
> then build 'mlpack_test' ('make mlpack_test'), you can run only the tests in
> that test suite with 'bin/mlpack_test -t TestSuite'.  A specific test case
> called 'TestCase' (BOOST_AUTO_TEST_CASE(TestCase)) could be run with
> 'bin/mlpack_test -t TestSuite/TestCase'.  So the main() you implemented is good
> for debugging, but in the end we should write a test that fits into the existing
> infrastructure.
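> 
> As a rough skeleton of such a test (the PSO optimizer doesn't exist yet, so its
> invocation is only sketched in comments):
> 
> // A real test would also include <mlpack/core.hpp> and the (not yet
> // existing) PSO header.
> #include <boost/test/unit_test.hpp>
> 
> BOOST_AUTO_TEST_SUITE(PSOTest);
> 
> // Placeholder test case; the BOOST_REQUIRE_* checks would operate on the
> // coordinates produced by the (not yet written) PSO optimizer.
> BOOST_AUTO_TEST_CASE(SimplePSOTest)
> {
>   // PSO optimizer;
>   // arma::mat coordinates = f.GetInitialPoint();
>   // optimizer.Optimize(f, coordinates);
> 
>   const double result = 1.0;  // Stand-in for an optimized coordinate.
>   BOOST_REQUIRE_CLOSE(result, 1.0, 1e-5);
>   BOOST_REQUIRE_EQUAL(1, 1);
> }
> 
> BOOST_AUTO_TEST_SUITE_END();
> 
> Dropped into src/mlpack/tests/ and added to the mlpack_test target, this would
> run via 'bin/mlpack_test -t PSOTest'.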
> 
>> Is it advisable to follow the test driven development workflow i.e. write test
>> cases (failing) before the actual implementation?
> 
> That is a good idea; realistically, you can take a look at the CNE or CMA-ES test
> cases and adapt the code accordingly. We can also add more tests along the way.
> 
>> Also, if we want to implement further variants of PSO later on, what would be
>> the best way to do this? For CMA-ES, this is achieved using a policy based
>> design where each variant has a unique class and it implements the Select()
>> method. I'm not sure, but I think all PSO variants mostly differ in how the
>> velocity and position are updated. So, we can use the same Optimize() method and
>> maybe use an updateParameters() method, which updates the particles' velocity and
>> position based on the PSO variant, all using a policy-based design. However, I'm
>> not sure how the user would provide variant-specific parameters, e.g. the
>> constriction factor k, since this is absent in vanilla PSO.
> 
> An easy solution would be to use a dummy parameter for the variants that don't
> provide a specific parameter. Here are two examples:
> https://github.com/mlpack/mlpack/blob/master/src/mlpack/core/optimizers/sgdr/cyclical_decay.hpp and
> https://github.com/mlpack/mlpack/blob/master/src/mlpack/core/optimizers/sgdr/snapshot_ensembles.hpp;
> in this case the iterate is only used in the second decay policy. We could do the
> same for the PSO implementation; another idea would be to use SFINAE and select
> the methods according to the given parameters.
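> 
> For the PSO case, a rough sketch of such an update policy could look like this
> (the names are only placeholders):
> 
> #include <armadillo>
> 
> // Update policy for the vanilla PSO; a constriction-based variant would expose
> // the same interface but actually use the extra parameter.
> class VanillaUpdate
> {
>  public:
>   // The constriction factor is accepted but unused here, so that every variant
>   // can share one constructor signature (the dummy parameter idea).
>   VanillaUpdate(const double /* constrictionFactor */ = 0.0) { }
> 
>   void Update(arma::mat& position, arma::mat& velocity)
>   {
>     // The actual velocity update would go here; for this sketch each particle
>     // just moves by its current velocity.
>     position += velocity;
>   }
> };
> 
> The PSO class itself would then be templated on the update policy (e.g.
> PSO<VanillaUpdate>), just like SGD is templated on its update policies.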
> 
> Thanks,
> Marcus
> 
> 
>> On 29. Jan 2018, at 19:10, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
>> 
>> Hello Marcus,
>> 
>> I have created a minimal layout for the PSO (unconstrained problems). Please see this commit (https://github.com/adl1995/mlpack/commit/493c87968a9291582bb663d53722818295e5cd47). Could you please provide some feedback on the current progress? Maybe it would be easier to use GitHub to provide the feedback?
>> 
>> Is it advisable to follow the test driven development workflow i.e. write test cases (failing) before the actual implementation?
>> 
>> Also, if we want to implement further variants of PSO later on, what would be the best way to do this? For CMA-ES, this is achieved using a policy-based design where each variant has a unique class and it implements the Select() method. I'm not sure, but I think all PSO variants mostly differ in how the velocity and position are updated. So, we can use the same Optimize() method and maybe use an updateParameters() method, which updates the particles' velocity and position based on the PSO variant, all using a policy-based design. However, I'm not sure how the user would provide variant-specific parameters, e.g. the constriction factor k, since this is absent in vanilla PSO.
>> 
>> Thank you,
>> Adeel
>> From: Marcus Edel <marcus.edel at fu-berlin.de>
>> Sent: Wednesday, January 24, 2018 7:55 PM
>> To: Adeel Ahmad
>> Subject: Re: [mlpack] Query regarding constrained and unconstrained methods
>>  
>> Hello Adeel,
>> 
>>> I'm not sure how we can provide multiple constraints in
>>> (https://stackoverflow.com/a/28747100). Would we have to use pack expansion like
>>> in the code block above?
>>> 
>>> Also, wouldn't it be easier if the user could provide multiple constraints in a
>>> single lambda function, rather than creating a separate lambda function for each
>>> constraint? For example:
>>> 
>>> void optimize(std::function<bool(double, double)> constraint)
>>> {
>>>   ...
>>>   while(!constraint(x, y))
>>>     // do something
>>> }
>>> 
>>> int main() {
>>>   auto constraint = [](double x, double y) { return (x < 3 && y > 4);};
>>>   optimize(constraint);
>>> }
>>> 
>>> Maybe I'm missing out on a crucial point which makes the above solution
>>> unusable. Please let me know if I am.
>> 
>> Agreed, that would make everything a lot easier; I will have to think about a
>> situation where that might be counterproductive. I guess if someone would like to
>> add another constraint, this approach would mean redefining everything, but I
>> don't think this is a huge burden. Anyway, I'll think about the single-constraint
>> and multiple-constraint approaches and let you know once I've figured something
>> out.
>> 
>>> > I'd like to start with the unconstrained PSO method first, but if you like to
>>> > start with a proof of concept for the constrained method feel free to do so.
>>> > Either way, really like all the thoughts you already put into the idea.
>>> 
>>> I have no issue starting out with the unconstrained PSO method first. However,
>>> could you please provide some guidelines on how to start with the
>>> implementation? I was thinking of creating a fork of the mlpack repository and
>>> including the optimizer in src/mlpack/core/optimizers/pso. Would that be
>>> alright?
>> 
>> Yes, that's the recommended way to do it. I would also open another branch in the
>> fork (git checkout -b pso); it makes the workflow easier if you work on more than
>> one feature at the same time. Also, you can take a look at the existing
>> optimizer classes to get some more details, especially
>> https://github.com/mlpack/mlpack/tree/master/src/mlpack/core/optimizers/cmaes
>> should be helpful here. Another reference is https://arxiv.org/abs/1711.06581
>> ("A generic and fast C++ optimization framework"),
>> which provides some more details about the Optimizer API in general.
>> 
>> Thanks,
>> Marcus
>> 
>> 
>>> On 24. Jan 2018, at 09:54, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
>>> 
>>> Hello Marcus,
>>> 
>>> I was able to create a variadic template that takes an arbitrary number of constraints following this blog post (https://eli.thegreenplace.net/2014/variadic-templates-in-c/). The code block below returns true only when all the constraints are satisfied:
>>> 
>>> template<typename T>
>>> bool constraintCheck(T v) {
>>>   return v(3);
>>> }
>>> 
>>> template<typename T, typename... Args>
>>> bool constraintCheck(T constraint, Args... args) {
>>>   return constraint(3) && constraintCheck(args...);
>>> }
>>> 
>>> int main() {
>>>   auto constraint1=[](int x){ return x < 3; };
>>>   auto constraint2=[](int x){ return x > 3; };
>>>   auto constraint3=[](int x){ return x % 3 == 0; };
>>>   auto constraint4=[](int x){ return x == 3; };
>>> 
>>>   constraintCheck(constraint1, constraint2, constraint3, constraint4); // evaluates to 0
>>> }
>>> 
>>> I'm not sure how we can provide multiple constraints in (https://stackoverflow.com/a/28747100). Would we have to use pack expansion like in the code block above?
>>> 
>>> Also, wouldn't it be easier if the user could provide multiple constraints in a single lambda function, rather than creating a separate lambda function for each constraint? For example:
>>> 
>>> void optimize(std::function<bool(double, double)> constraint)
>>> {
>>>   ...
>>>   while(!constraint(x, y))
>>>     // do something
>>> }
>>> 
>>> int main() {
>>>   auto constraint = [](double x, double y) { return (x < 3 && y > 4);};
>>>   optimize(constraint);
>>> }
>>> 
>>> Maybe I'm missing out on a crucial point which makes the above solution unusable. Please let me know if I am.
>>> 
>>> > I'd like to start with the unconstrained PSO method first, but if you like to
>>> > start with a proof of concept for the constrained method feel free to do so.
>>> > Either way, really like all the thoughts you already put into the idea.
>>> 
>>> I have no issue starting out with the unconstrained PSO method first. However, could you please provide some guidelines on how to start with the implementation? I was thinking of creating a fork of the mlpack repository and including the optimizer in src/mlpack/core/optimizers/pso. Would that be alright?
>>> 
>>> Thank you,
>>> Adeel
>>> 
>>> From: Marcus Edel <marcus.edel at fu-berlin.de>
>>> Sent: Tuesday, January 23, 2018 5:02 AM
>>> To: Adeel Ahmad
>>> Subject: Re: [mlpack] Query regarding constrained and unconstrained methods
>>>  
>>> Hello Adeel,
>>> 
>>> What do you think about https://stackoverflow.com/a/28747100? I think in
>>> combination with a variadic template we could even pass an arbitrary number of
>>> constraints. What I don't like about the solution is that it contains a virtual
>>> function; maybe we can avoid that somehow, I have to think about that.
>>> 
>>> Should I start working on a minimal script that implements the PSO using lambda
>>> functions?
>>> 
>>> I'd like to start with the unconstrained PSO method first, but if you like to
>>> start with a proof of concept for the constrained method feel free to do so.
>>> Either way, really like all the thoughts you already put into the idea.
>>> 
>>> Thanks,
>>> Marcus
>>> 
>>>> On 22. Jan 2018, at 13:37, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
>>>> 
>>>> Hello Marcus,
>>>> 
>>>> Thank you for the clarification on the usage of C++11 lambda functions. This seems a more intuitive approach, rather than using a vector representation. I can think of two ways by which the user can apply constraints. One is to define the constraint as a lambda function and pass it as a function pointer. However, this way the lambda function should not capture anything (https://stackoverflow.com/a/28746827), so we have to rely on std::function instead. This could be implemented like this:
>>>> 
>>>> void optimize(std::function<bool(double)> constraint)
>>>> {
>>>>   ...
>>>>   while(!constraint(x))
>>>>     // do something
>>>> }
>>>> 
>>>> auto constraint = [](double x) { return x < 3; };
>>>> optimize(constraint, /* other parameters */);
>>>> 
>>>> Because the user's lambda may capture state, we cannot rely on passing the constraint as a plain function pointer.
>>>> 
>>>> Another way is to define the constraint as a class member and initialize it with a lambda function with the user input, like this:
>>>> 
>>>> class PSO
>>>> {
>>>> public:
>>>>   PSO(double x)
>>>>   {
>>>>     // Capture the user-supplied bound in the constraint.
>>>>     constraint = [x](double value) { return value < x; };
>>>>   }
>>>> private:
>>>>   std::function<bool(double)> constraint;
>>>> };
>>>> 
>>>> The former technique has the advantage that the user can define any sort of constraint they want, while in the latter technique only a handful of constraints could be offered (maybe this limitation could be eliminated using additional parameters).
>>>> 
>>>> Should I start working on a minimal script that implements the PSO using lambda functions?
>>>> 
>>>> > I think it's just fine to let the user select the value; however, we should
>>>> > note some good initial values in the documentation and examples.
>>>> > Does this sound reasonable?
>>>> 
>>>> Yes, it sounds fine if the user can initialize the value. Maybe we can point out the recommended initial values from the paper in the optimizer documentation.
>>>> 
>>>> Thank you,
>>>> Adeel
>>>> 
>>>> 
>>>> From: Marcus Edel <marcus.edel at fu-berlin.de>
>>>> Sent: Sunday, January 21, 2018 7:53 PM
>>>> To: Adeel Ahmad
>>>> Cc: mlpack at lists.mlpack.org
>>>> Subject: Re: [mlpack] Query regarding constrained and unconstrained methods
>>>>  
>>>> Hello Adeel,
>>>> 
>>>>> I have done some research on C++ lambda functions. Did you mean to use these
>>>>> instead of the standard accessors and mutators? From what I have found, lambda
>>>>> functions are used for writing an anonymous inline functor right into the spot
>>>>> where it is called, like in this example below (source):
>>>>> 
>>>>> std::for_each(v.begin(), v.end(), [](int) { /* do something here */ });
>>>>> 
>>>>> Although they can be used to modify the parameters (passed in a capture list) by
>>>>> using the mutable keyword, I don't know what advantage this would have over the
>>>>> standard accessors and mutators. If you had a different use in mind, please let
>>>>> me know.
>>>> 
>>>> I was thinking of using C++11 lambda functions to define the constraints
>>>> instead of a matrix representation; I had something like this in mind:
>>>> 
>>>> auto constraint = [](double x) { return x < 3; };
>>>> std::cout << "constraint: " << constraint(6) << std::endl;
>>>> 
>>>> I think it might be a good idea to work on a proof of concept before deciding on
>>>> the design, what do you think?
>>>> 
>>>>> I have read some sections from the Velocity Adaptation in Particle Swarm
>>>>> Optimization paper. The PSO variant presented there is somewhat similar to PSO
>>>>> with inertia weight in Looking Inside Particle Swarm Optimization in Constrained
>>>>> Search Spaces paper. The algorithm presented in section 4 for PSO with Velocity
>>>>> Adaptation uses Velocity Length l for scaling the particle velocity based on its
>>>>> current behavior. There are various initialization methods for setting the
>>>>> initial value of velocity length, such as l = r, l = r / sqrt(n). If I opt to
>>>>> implement this PSO variant in my GSoC application, would I leave it to the user
>>>>> for specifying the value of l, or set it by default following a heuristic, or
>>>>> maybe a combination of both?
>>>> 
>>>> I think it's just fine to let the user select the value; however, we should
>>>> note some good initial values in the documentation and examples.
>>>> Does this sound reasonable?
>>>> 
>>>> Thanks,
>>>> Marcus
>>>> 
>>>> 
>>>>> On 20. Jan 2018, at 11:19, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
>>>>> 
>>>>> Hello Marcus,
>>>>> 
>>>>> I have done some research on C++ lambda functions. Did you mean to use these instead of the standard accessors and mutators? From what I have found, lambda functions are used for writing an anonymous inline functor right into the spot where it is called, like in this example below (source: https://stackoverflow.com/a/7627218):
>>>>> 
>>>>> std::for_each(v.begin(), v.end(), [](int) { /* do something here */ });
>>>>> 
>>>>> Although they can be used to modify the parameters (passed in a capture list) by using the mutable keyword, I don't know what advantage this would have over the standard accessors and mutators. If you had a different use in mind, please let me know.
>>>>> 
>>>>> Yes, a policy-based design seems like a much better option for implementing the optimizer. We could create a base class named PSO and use its methods in another class, for instance LBPSO, using the former class's object. This would be more intuitive if other variants of PSO are to be implemented in the future.
>>>>> 
>>>>> I have read some sections from the Velocity Adaptation in Particle Swarm Optimization paper. The PSO variant presented there is somewhat similar to PSO with inertia weight in Looking Inside Particle Swarm Optimization in Constrained Search Spaces paper. The algorithm presented in section 4 for PSO with Velocity Adaptation uses Velocity Length l for scaling the particle velocity based on its current behavior. There are various initialization methods for setting the initial value of velocity length, such as l = r, l = r / sqrt(n). If I opt to implement this PSO variant in my GSoC application, would I leave it to the user for specifying the value of l, or set it by default following a heuristic, or maybe a combination of both?
>>>>> 
>>>>> Thank you,
>>>>> Adeel
>>>>> 
>>>>> From: Marcus Edel <marcus.edel at fu-berlin.de>
>>>>> Sent: Thursday, January 18, 2018 6:51 PM
>>>>> To: Adeel Ahmad
>>>>> Cc: mlpack at lists.mlpack.org
>>>>> Subject: Re: [mlpack] Query regarding constrained and unconstrained methods
>>>>>  
>>>>> Hello Adeel,
>>>>> 
>>>>>> I have read the research paper you linked. In the paper, two variants of PSO are
>>>>>> mentioned -- inertia weight and constriction factor based. It is stated that the
>>>>>> local-best particle swarm optimizer (LBPSO) with constriction k produces the
>>>>>> best results. I assume all variants must be implemented for GSoC, however, in
>>>>>> the paper a modified version of PSO is presented (MPSO), which dynamically
>>>>>> updates two hyper-parameters, k and c2 (acceleration constant for social
>>>>>> elements in the swarm), should this be implemented as well? I suppose this won't
>>>>>> be time consuming if vanilla PSO is already in place.
>>>>> 
>>>>> I'm not sure it would be reasonable to implement every variant mentioned in the
>>>>> paper over the summer; keep in mind that each method has to be tested (writing
>>>>> good tests is time-consuming). So my recommendation is to focus on a single
>>>>> variant; in your proposal you can point out that, if there is time left, you aim
>>>>> for another variant. But in the end it's up to you, choose the methods you think
>>>>> are interesting. Also, there is another paper that might be interesting as well:
>>>>> "Particle Swarm Optimization with Velocity Adaptation" by S. Helwig et al. (let
>>>>> me know if you can't access the paper).
>>>>> 
>>>>>> Regarding the design of the optimizer itself, it was pointed out earlier by Ryan
>>>>>> that the SDP (semidefinite program) optimizer supports constraints. In there,
>>>>>> the constraints are specified as Armadillo matrices, and set using setters. I
>>>>>> think the same methodology could be applied for PSO.
>>>>> 
>>>>> Right, as pointed out on the ideas page, a matrix representation is definitely
>>>>> one option; another would be to use C++11 lambda functions
>>>>> (https://en.wikipedia.org/wiki/C%2B%2B11#Lambda_functions_and_expressions), which I
>>>>> think would be easier to use as someone could naturally define the constraints.
>>>>> Let me know what you think; coming up with a good structure is part of the
>>>>> project.
>>>>> 
>>>>>> For specifying whether the
>>>>>> PSO is local or global, a boolean could be used. However, the constriction
>>>>>> factor k should only be created in case of constriction based PSO, I'm not sure
>>>>>> what would be the best design for this.
>>>>> 
>>>>> 
>>>>> Another option would be to use a policy-based design: provide a separate class
>>>>> for each method and reuse as much code as possible internally. We do something
>>>>> similar for Adam, RMSProp, etc.; each optimizer basically uses the SGD class, and
>>>>> all we do is provide a wrapper class to set the optimizer-specific parameters. Let
>>>>> me know what you think.
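>>>>> 
>>>>> To make that idea concrete, a rough sketch (all names are placeholders, this is
>>>>> not existing mlpack code) might look like:
>>>>> 
>>>>> // Generic PSO, templated on the velocity/position update policy.
>>>>> template<typename UpdatePolicyType>
>>>>> class PSOType
>>>>> {
>>>>>  public:
>>>>>   PSOType(const UpdatePolicyType& updatePolicy = UpdatePolicyType()) :
>>>>>       updatePolicy(updatePolicy) { }
>>>>> 
>>>>>   // Optimize(FunctionType& function, arma::mat& coordinates) would go here
>>>>>   // and call updatePolicy.Update(...) in every iteration.
>>>>> 
>>>>>  private:
>>>>>   UpdatePolicyType updatePolicy;
>>>>> };
>>>>> 
>>>>> // Policy for the constriction-based variant.
>>>>> class ConstrictionUpdate
>>>>> {
>>>>>  public:
>>>>>   ConstrictionUpdate(const double k = 0.729) : k(k) { }
>>>>>   // Update(...) would apply the constriction-based velocity update.
>>>>>  private:
>>>>>   double k;
>>>>> };
>>>>> 
>>>>> // Thin wrapper that fixes the policy and exposes the variant-specific
>>>>> // parameter, similar in spirit to how Adam wraps SGD.
>>>>> class ConstrictionPSO
>>>>> {
>>>>>  public:
>>>>>   ConstrictionPSO(const double k = 0.729) : optimizer(ConstrictionUpdate(k)) { }
>>>>>   // Optimize() would simply forward to optimizer.Optimize().
>>>>>  private:
>>>>>   PSOType<ConstrictionUpdate> optimizer;
>>>>> };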
>>>>> 
>>>>>> Would it be possible for us to discuss the optimizer architecture in more detail
>>>>>> on the mailing list?
>>>>> 
>>>>> Absolutely, we are here to help.
>>>>> 
>>>>> Thanks,
>>>>> Marcus
>>>>> 
>>>>> 
>>>>>> On 18. Jan 2018, at 08:54, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
>>>>>> 
>>>>>> Hello Marcus,
>>>>>> 
>>>>>> I have read the research paper you linked. In the paper, two variants of PSO are mentioned -- inertia weight and constriction factor based. It is stated that the local-best particle swarm optimizer (LBPSO) with constriction k produces the best results. I assume all variants must be implemented for GSoC, however, in the paper a modified version of PSO is presented (MPSO), which dynamically updates two hyper-parameters, k and c2 (acceleration constant for social elements in the swarm), should this be implemented as well? I suppose this won't be time consuming if vanilla PSO is already in place.
>>>>>> 
>>>>>> Regarding the design of the optimizer itself, it was pointed out earlier by Ryan that the SDP (semidefinite program) optimizer supports constraints. In there, the constraints are specified as Armadillo matrices, and set using setters. I think the same methodology could be applied for PSO. For specifying whether the PSO is local or global, a boolean could be used. However, the constriction factor k should only be created in case of constriction based PSO, I'm not sure what would be the best design for this.
>>>>>> 
>>>>>> Would it be possible for us to discuss the optimizer architecture in more detail on the mailing list?
>>>>>> 
>>>>>> Thank you,
>>>>>> Adeel
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> From: Marcus Edel <marcus.edel at fu-berlin.de>
>>>>>> Sent: Wednesday, January 17, 2018 5:39 PM
>>>>>> To: Adeel Ahmad
>>>>>> Cc: mlpack at lists.mlpack.org
>>>>>> Subject: Re: [mlpack] Query regarding constrained and unconstrained methods
>>>>>>  
>>>>>> Hello Adeel,
>>>>>> 
>>>>>> sorry for the slow response on this one. There are various approaches to solve
>>>>>> constrained problems; one is the use of a penalty function. The constrained
>>>>>> problem is transformed into an unconstrained one by penalizing the constraints,
>>>>>> so that it can be solved using an unconstrained optimization method. You might
>>>>>> take a look at "Looking Inside Particle Swarm Optimization in Constrained Search
>>>>>> Spaces" by Jorge Isacc Flores-Mendoza and Efrén Mezura-Montes; they describe
>>>>>> various PSO methods to solve constrained problems.
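>>>>>> 
>>>>>> Just to illustrate the penalty idea on a toy problem (minimize (x - 2)^2
>>>>>> subject to x <= 1; the helper name is made up for this sketch):
>>>>>> 
>>>>>> #include <algorithm>
>>>>>> 
>>>>>> // Penalized objective: the constraint violation max(0, x - 1) is squared,
>>>>>> // scaled by mu and added to the objective, so an unconstrained optimizer
>>>>>> // can be applied directly.
>>>>>> double PenalizedObjective(const double x, const double mu = 100.0)
>>>>>> {
>>>>>>   const double objective = (x - 2) * (x - 2);
>>>>>>   const double violation = std::max(0.0, x - 1.0);
>>>>>>   return objective + mu * violation * violation;
>>>>>> }
>>>>>> 
>>>>>> For large mu, the minimizer of PenalizedObjective() approaches the constrained
>>>>>> optimum x = 1.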
>>>>>> 
>>>>>>> I apologize if I misunderstood what constrained problems are, but can't we apply
>>>>>>> constraints to the methods already present in "src/mlpack/methods/*" directory?
>>>>>>> Or, are these unrelated? In the latter case, are there some specialized methods
>>>>>>> for constrained problems that need to be implemented for this project?
>>>>>> 
>>>>>> 
>>>>>> Currently, mlpack does not implement an optimizer that can handle constrained
>>>>>> problems. So for example, if you like to solve the constrained (cube, line)
>>>>>> Rosenbrock function:
>>>>>> 
>>>>>> f(x, y) = (1 - x)^2 + 100(y - x^2)^2
>>>>>> 
>>>>>> with constraints (x - 1)^3 - y + 1 < 0 and x + y - 2 < 0
>>>>>> 
>>>>>> Currently, there is no structure to represent the problem and there is no
>>>>>> optimizer that can solve the constrained problem. Coming up with a structure is
>>>>>> one part of the project; implementing an optimizer (PSO) that can handle
>>>>>> constrained problems is the other part. But as pointed out in the project idea,
>>>>>> it's recommended to start with a PSO implementation for unconstrained problems
>>>>>> and to extend the work later on.
>>>>>> 
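>>>>>> For example, the problem above could be written down in plain C++11 as a
>>>>>> starting point for thinking about such a structure (only a sketch, not a
>>>>>> proposed API):
>>>>>> 
>>>>>> // Objective and constraints of the problem above, expressed as lambdas.
>>>>>> auto rosenbrock = [](double x, double y)
>>>>>> {
>>>>>>   return (1 - x) * (1 - x) + 100 * (y - x * x) * (y - x * x);
>>>>>> };
>>>>>> 
>>>>>> auto constraint1 = [](double x, double y)
>>>>>> {
>>>>>>   return (x - 1) * (x - 1) * (x - 1) - y + 1 < 0;
>>>>>> };
>>>>>> 
>>>>>> auto constraint2 = [](double x, double y) { return x + y - 2 < 0; };
>>>>>> 
>>>>>> // A constrained optimizer would minimize rosenbrock() subject to both
>>>>>> // constraint1() and constraint2() being satisfied.
>>>>>> 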
>>>>>>> Regarding the test cases structuring, I've found that in some cases a
>>>>>>> test_function.cpp or <method_name>_test_function.cpp file is present in the main
>>>>>>> method directory, such as here
>>>>>>> (https://github.com/mlpack/mlpack/blob/master/src/mlpack/core/optimizers/gradient_descent/test_function.cpp).
>>>>>>> Later, an object of this class is created in the main tests directory
>>>>>>> ("src/mlpack/tests/*"), in this case, here
>>>>>>> (https://github.com/mlpack/mlpack/blob/master/src/mlpack/tests/gradient_descent_test.cpp).
>>>>>>> So, my question is this: what is the preferred
>>>>>>> structure for writing test cases? In this case, I think this could have been
>>>>>>> directly tested without the need of a separate GDTestFunction class, however,
>>>>>>> this might not have been a neat alternative.
>>>>>> 
>>>>>> There is an open PR which consolidates different problems into one folder
>>>>>> (https://github.com/mlpack/mlpack/pull/1151); the benefit of not implementing
>>>>>> the test function inside the test itself is that someone could reuse the
>>>>>> functionality for other methods/tests. One example is the SGDTestFunction which
>>>>>> is used to test Adam, SGD, RMSProp, etc.
>>>>>> 
>>>>>> I hope this is helpful, let us know if we should clarify anything.
>>>>>> 
>>>>>> Thanks,
>>>>>> Marcus
>>>>>> 
>>>>>> 
>>>>>>> On 16. Jan 2018, at 19:58, Adeel Ahmad <adeelahmad14 at hotmail.com> wrote:
>>>>>>> 
>>>>>>> Greetings,
>>>>>>> 
>>>>>>> I'm following a potential idea for GSoC 2018 titled "Particle swarm optimization". I have read a few documents and familiarized myself with the algorithm. It's listed in the idea description: "So this project is divided into two parts: First implement one or two unconstrained methods and afterwards takes a look at one --contained-- (constrained [?]) method". I apologize if I misunderstood what constrained problems are, but can't we apply constraints to the methods already present in "src/mlpack/methods/*" directory? Or, are these unrelated? In the latter case, are there some specialized methods for constrained problems that need to be implemented for this project?
>>>>>>> 
>>>>>>> Regarding the test cases structuring, I've found that in some cases a test_function.cpp or <method_name>_test_function.cpp file is present in the main method directory, such as here (https://github.com/mlpack/mlpack/blob/master/src/mlpack/core/optimizers/gradient_descent/test_function.cpp). Later, an object of this class is created in the main tests directory ("src/mlpack/tests/*"), in this case, here (https://github.com/mlpack/mlpack/blob/master/src/mlpack/tests/gradient_descent_test.cpp). So, my question is this: what is the preferred structure for writing test cases? In this case, I think this could have been directly tested without the need of a separate GDTestFunction class, however, this might not have been a neat alternative.
>>>>>>> 
>>>>>>> Thank you,
>>>>>>> Adeel
>>>>>>> _______________________________________________
>>>>>>> mlpack mailing list
>>>>>>> mlpack at lists.mlpack.org
>>>>>>> http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
>> 
>> 
> 
> 

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://knife.lugatgt.org/pipermail/mlpack/attachments/20180314/9037873a/attachment-0001.html>


More information about the mlpack mailing list