[mlpack] Basic AdaBoost CLI design ideas

Mj Liu noodleslik at gmail.com
Tue Mar 11 12:38:20 EDT 2014


This is an attempt to define the command line interface (CLI). The CLI
shall provide simple, intuitive options to the users, and provide helpful
info when users are not quite sure how to use the AdaBoost command.
I think the following command line options shall be provided to the
users.
  1) --help           prints the whole help info.
  2) --version        AdaBoost may evolve with the development of the
                      mlpack library, and the algorithm itself may evolve
                      independently.
  3) --weak_learner   followed by a built-in weak learner algorithm in
                      mlpack. This must be specified.
  4) --iterations     maximum number of steps the algorithm can execute;
                      the default shall be 1000 or some other value.
  5) --input_file     path to the user-provided dataset. This must be
                      specified.
  6) --output_file    path for the algorithm's results; the default shall
                      be "output.csv" in the working directory.
  7) --threads        number of threads used to run the algorithm; the
                      default shall be a single thread.

Implementation of the CLI could reference the other methods provided by
mlpack. Ryan Curtin has mentioned that OpenMP is a good option for the
multi-threaded implementation, since it provides a much clearer structure
and is easy to maintain (the comment was on "Ideas on AdaBoost"). The
single-threaded algorithm shall be implemented in the first stage.
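
As a rough illustration (not a final design), here is a minimal sketch of
how these options might be declared with mlpack's PARAM_* macros and CLI
class; --help is generated automatically from these declarations, and the
option names are just the ones proposed above:

#include <mlpack/core.hpp>

PROGRAM_INFO("AdaBoost", "An implementation of the AdaBoost meta-algorithm "
    "using weak learners from mlpack.");

PARAM_STRING_REQ("input_file", "Path to the user-provided dataset.", "i");
PARAM_STRING_REQ("weak_learner", "Built-in weak learner to boost.", "w");
PARAM_STRING("output_file", "Path for the results.", "o", "output.csv");
PARAM_INT("iterations", "Maximum number of boosting iterations.", "n", 1000);
PARAM_INT("threads", "Number of threads to use.", "t", 1);

using namespace mlpack;

int main(int argc, char* argv[])
{
  CLI::ParseCommandLine(argc, argv);

  const std::string inputFile = CLI::GetParam<std::string>("input_file");
  const int iterations = CLI::GetParam<int>("iterations");

  // ... load the dataset, construct the chosen weak learner, run
  // AdaBoost, and write the results to --output_file ...

  return 0;
}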

I'm not sure if we shall provide the following option:
  *) --algorithm      a user-defined algorithm which can be called by the
                      AdaBoost algorithm.
If this is provided, would the user-defined algorithm need to be compiled
into mlpack?

As Ryan mentioned, the AdaBoost class definition could look like the
following:

template<
  typename WeakClassifier1Type,
  typename WeakClassifier2Type,
  typename WeakClassifier3Type,
  ...
>
class AdaBoost
{
  // ...
  void Classify(/* ... */);
  // ...
};
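
With C++11 this enumeration could also be written as a variadic template.
A small compilable sketch of that form, using purely hypothetical weak
learner types:

// Hypothetical weak learner types, for illustration only.
class DecisionStump { };
class Perceptron { };

// C++11 variadic form of the enumerated template above.
template<typename... WeakClassifierTypes>
class AdaBoost
{
 public:
  void Classify() { /* ... */ }
};

int main()
{
  // Instantiate with two hypothetical weak learner types.
  AdaBoost<DecisionStump, Perceptron> booster;
  booster.Classify();
}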


Then WeakClassifier1Type and the others would have to be defined. I
believe this is doable for the methods defined inside mlpack, but for
user-defined classifiers this would not work properly. So, the AdaBoost
algorithm shall have the ability to be inherited by a user-defined class,
and the user can just override the calling method, something like
        adaboost::SetWeakLearner(void (*weakLearner)(char** args, ...))
and then AdaBoost does everything else; a rough sketch of this idea
follows.
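
A minimal sketch of that inheritance idea (all names here are
hypothetical, not an existing mlpack API):

#include <cstddef>

// Hypothetical base class: AdaBoost calls TrainWeakLearner() on each
// boosting round, and a user subclasses it to plug in their own learner.
class AdaBoost
{
 public:
  virtual ~AdaBoost() { }

  // Run the boosting loop; everything except the weak learner call is
  // handled by AdaBoost itself.
  void Run(const std::size_t iterations)
  {
    for (std::size_t i = 0; i < iterations; ++i)
      TrainWeakLearner();
  }

 protected:
  // Override this to call any user-defined weak learner.
  virtual void TrainWeakLearner() = 0;
};

// A user-defined classifier only overrides the calling method.
class MyAdaBoost : public AdaBoost
{
 protected:
  virtual void TrainWeakLearner()
  {
    // ... call the user's own weak learner here ...
  }
};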
The above can be viewed as an attempt to define the CLI; suggestions,
comments, and criticisms are welcome! :)
Thanks!

Best Regards

Mingjun Liu