[mlpack] GSOC 2014: Introduction

Ryan Curtin gth671b at mail.gatech.edu
Thu Mar 6 11:29:18 EST 2014


On Fri, Feb 28, 2014 at 01:36:33AM +0530, Udit Saxena wrote:
> Hey,
> 
> I came across the perceptron when I was looking further.
> 
> ieee-neural-weak-class.pdf
> <http://users.ece.gatech.edu/~jic/ieee-neural-weak-class.pdf>
> - the section on weak classifiers B.III. talks about perceptron as an
> option, and also talks about combination of weak learners.
> 
> Maybe this gives you other ideas ?

Yeah; there are numerous weak classifiers, and the perceptron would be a
particularly simple one to implement and test.  Slightly larger neural
networks are an option too.  So are simple linear regression, ridge
regression, and the Naive Bayes classifier (which we actually already
have implemented).
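
To give an idea of how simple, here is a rough sketch of a perceptron
training pass (this is just something I'm writing inline, not mlpack code;
the function name, the two-dimensional points, and the +1/-1 labels are all
assumptions for illustration):

#include <array>
#include <cstddef>
#include <vector>

// One pass over the data: on each mistake, nudge the weight vector
// toward (or away from) the misclassified point.
inline void PerceptronEpoch(const std::vector<std::array<double, 2>>& data,
                            const std::vector<int>& labels,    // +1 / -1
                            std::array<double, 3>& w)          // w[0] = bias
{
  for (std::size_t i = 0; i < data.size(); ++i)
  {
    const double activation = w[0] + w[1] * data[i][0] + w[2] * data[i][1];
    const int prediction = (activation >= 0.0) ? +1 : -1;
    if (prediction != labels[i])
    {
      // Mistake-driven update with a learning rate of 1.
      w[0] += labels[i];
      w[1] += labels[i] * data[i][0];
      w[2] += labels[i] * data[i][1];
    }
  }
}

Repeating that pass until the weights stop changing (or some maximum number
of epochs is reached) is all the training the basic perceptron needs.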

> And extending through templates might just end up being too simple, as at
> least two of these are significantly different. But it shouldn't be too
> much of a problem, considering the spine remains the same. I was just
> wondering which ones we would be interested in, but it seems that, as
> MingJun Liu says, we can test through experimental implementations.
> 
> I will get in touch with Marcus Edel; I'm just trying to get a simple patch
> put in. What VCS do you use for the source? Could you help me with this?

We use subversion; the repository is located at

http://svn.cc.gatech.edu/fastlab/mlpack/

or you can just check out trunk from

http://svn.cc.gatech.edu/fastlab/mlpack/trunk/
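
For example, a checkout would look something like

  svn checkout http://svn.cc.gatech.edu/fastlab/mlpack/trunk/ mlpack-trunk

(the 'mlpack-trunk' target directory name there is just a suggestion).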
 
> Debian and Ubuntu packaging: oh, you've been working on it? Great. I did
> want to combine it this time, but we'll see. Maybe I would want to help
> maintain it, along with keeping the Arch Linux package up to date too.
> Currently Arch's package is outdated; I had to build one from your source.
> Arch's version is 1.0.5 or 1.0.2, one of them. Oh, and I'm on Arch.

There was a guy who did this just a couple of days before you sent this
message; his handle is govg and I see him in IRC often.

https://aur.archlinux.org/packages/mlpack/

> I think a good way to implement them would be (a basic high-level
> overview):
> 
>    - implement a batch of weak learners (say, 4); we will have to write
>    separate functions/classes for this, for each weak learner.
>    - write the AdaBoost class.
>    - and (through templates maybe?) allow the user to load separate
>    instances of the weak learners as a potential input to the AdaBoost
>    algorithm.

Yes, that seems reasonable.  It will take some thinking to figure out
how to make the API work in such a way that arbitrary weak learners can
be plugged in.
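
Just to sketch one possibility (none of this is mlpack code; the class name,
the Train()/Classify() signatures, the one-dimensional double points, and the
+1/-1 labels are all assumptions for illustration), a templated AdaBoost
class could require only that the weak learner type exposes a weighted
Train() and a Classify():

#include <cmath>
#include <cstddef>
#include <vector>

// Any WeakLearnerType exposing
//   void Train(const std::vector<double>& data,
//              const std::vector<int>& labels,        // +1 / -1
//              const std::vector<double>& weights);
//   int Classify(double point) const;
// could be plugged in.
template<typename WeakLearnerType>
class AdaBoost
{
 public:
  void Train(const std::vector<double>& data,
             const std::vector<int>& labels,
             const std::size_t rounds)
  {
    std::vector<double> weights(data.size(), 1.0 / data.size());
    for (std::size_t t = 0; t < rounds; ++t)
    {
      WeakLearnerType wl;
      wl.Train(data, labels, weights);

      // Weighted error of this round's weak hypothesis.
      double error = 0.0;
      for (std::size_t i = 0; i < data.size(); ++i)
        if (wl.Classify(data[i]) != labels[i])
          error += weights[i];
      if (error >= 0.5)
        break; // no better than chance; stop boosting

      const double alpha = 0.5 * std::log((1.0 - error) / (error + 1e-10));

      // Re-weight the points: misclassified points gain weight.
      double sum = 0.0;
      for (std::size_t i = 0; i < data.size(); ++i)
      {
        weights[i] *= std::exp(-alpha * labels[i] * wl.Classify(data[i]));
        sum += weights[i];
      }
      for (std::size_t i = 0; i < data.size(); ++i)
        weights[i] /= sum;

      learners.push_back(wl);
      alphas.push_back(alpha);
    }
  }

  // Final classification is a weighted vote of the weak learners.
  int Classify(const double point) const
  {
    double score = 0.0;
    for (std::size_t t = 0; t < learners.size(); ++t)
      score += alphas[t] * learners[t].Classify(point);
    return (score >= 0.0) ? +1 : -1;
  }

 private:
  std::vector<WeakLearnerType> learners;
  std::vector<double> alphas;
};

With an interface along those lines, AdaBoost<Perceptron> or
AdaBoost<DecisionStump> would instantiate cleanly, and existing classes like
the Naive Bayes classifier could be plugged in through a small adapter; the
real work is agreeing on what those two method signatures should be.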

Also, sorry for the slow response... things have been busy.

Thanks,

Ryan

-- 
Ryan Curtin    | "None of your mailman friends can hear you."
ryan at ratml.org |   - Alpha


