Note that the package is still under construction and has not yet been tested.
A selection of learning classifiers. All the classes strive to implement the interface of the classifiers from mlpy, with the methods learn(), pred(), and labels().
Module: | pysteg.ml |
---|---|
Date: | $Date$ |
Revision: | $Revision$ |
Author: | © 2012: Hans Georg Schaathun <georg@schaathun.net> |
Ensemble classifier.
Module: | pysteg.ml.classifiers |
---|---|
Date: | $Date$ |
Revision: | $Revision$ |
Author: | © 2012: Hans Georg Schaathun <georg@schaathun.net> |
This module primarily implements the ensemble classifier, together with the necessary auxiliaries. This includes the abstract Classifier class, which is intended as a base class for other classifiers as well.
Abstract classifier class.
Any derived class must implement _learn() and _pred(), which are called from the training method learn() and the prediction method pred(), respectively. The main methods normalise the arguments, so _learn() and _pred() can assume well-formed np.array arguments.
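The dispatch pattern can be sketched roughly as follows; the exact normalisation steps (np.asarray, np.atleast_2d) are assumptions for illustration and not necessarily the package's actual code.

```python
import numpy as np

class Classifier(object):
    "Sketch of the abstract base class; derived classes override _learn()/_pred()."

    def learn(self, x, y):
        "Normalise the training data and labels, then delegate to _learn()."
        return self._learn(np.asarray(x, dtype=float), np.asarray(y))

    def pred(self, t):
        "Normalise the test vector(s) into a 2-D array, then delegate to _pred()."
        return self._pred(np.atleast_2d(np.asarray(t, dtype=float)))

    def _learn(self, x, y):
        raise NotImplementedError("derived classes must implement _learn()")

    def _pred(self, t):
        raise NotImplementedError("derived classes must implement _pred()")
```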
This class represents an ensemble classifier.
Run classification of the test vector(s) t and return soft information classification scores.
Parameters: | t : the test vector(s) to classify |
---|---|
Returns: | soft information classification scores |
NOT COMPLETED. Need to design fusion rules for multi-class prediction.
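For the two-class case, one natural fusion rule is to aggregate the base learners' hard decisions into a vote balance. The following sketch illustrates how such soft scores could be computed; the function name, the ±1 label convention, and the list of already-trained base classifiers are illustrative assumptions.

```python
import numpy as np

def vote_scores(base_classifiers, t):
    """Sum the base learners' hard +/-1 decisions for each test vector;
    the sign gives the fused prediction and the magnitude a confidence.
    (Illustrative sketch only; names and label convention are assumed.)"""
    t = np.atleast_2d(np.asarray(t, dtype=float))
    votes = np.array([clf.pred(t) for clf in base_classifiers])
    return votes.sum(axis=0)              # one soft score per test vector
```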
Support Vector Machines with Grid Search.
Module: | pysteg.ml.svm |
---|---|
Date: | $Date$ |
Revision: | $Revision$ |
Author: | © 2012: Hans Georg Schaathun <georg@schaathun.net> |
Copyright: | A small part of the code is derived from code (c) 2000-2010 Chih-Chung Chang and Chih-Jen Lin; see the source code for details. |
This provides a wrapper for mlpy.LibSvm, where a grid search is performed as an integral part of training. Thus gamma and C do not have to be specified at initialisation time; instead, search ranges are provided and the training method finds the optimal values within those ranges.
The module also exports the auxiliary functions for grid search and cross-validation.
Support Vector Machine.
This is a wrapper for mlpy.LibSvm with additional functionality.
This class is based on the published PDF documentation for mlpy version 3.5.0. It is incompatible with the current head of the mlpy Mercurial repository, and it will in all likelihood be DEPRECATED.
Perform a grid search based on the given test set and options. If gnuplot is given, it should implement the interface of a Gnuplot object; it will be used to plot and update a contour plot of the parameter choices during the search.
The return value is (db, (best_c1, best_g1, best_rate)), where db is a list of all the tested parameters with their performance, best_c1 and best_g1 are the optimal parameters (log C and log gamma), and best_rate is the cross-validation accuracy at the optimised parameters; a sketch of such a search follows below.
This code is derived from the grid search script distributed with the libSVM library.
Copyright (c) 2000-2010 Chih-Chung Chang and Chih-Jen Lin
Copyright (c) 2012 Hans Georg Schaathun
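To make the return structure concrete, here is a minimal sketch of such a grid search with k-fold cross-validation. It is illustrative only: make_svm is a hypothetical factory returning a classifier with the mlpy-style learn()/pred() interface, and the default log2 ranges follow the usual libSVM grid script rather than this module's documented defaults.

```python
import itertools
import numpy as np

def grid_search(x, y, make_svm,
                log2c=range(-5, 16, 2), log2g=range(-15, 4, 2), fold=5):
    """Illustrative grid search over (log2 C, log2 gamma) using plain
    k-fold cross-validation accuracy as the selection criterion."""
    x, y = np.asarray(x, dtype=float), np.asarray(y)
    folds = np.array_split(np.random.permutation(len(y)), fold)
    db, best = [], (None, None, -1.0)
    for c1, g1 in itertools.product(log2c, log2g):
        hits = 0
        for test in folds:                                # k-fold cross-validation
            train = np.setdiff1d(np.arange(len(y)), test)
            clf = make_svm(C=2.0 ** c1, gamma=2.0 ** g1)  # hypothetical factory
            clf.learn(x[train], y[train])
            hits += np.sum(np.asarray(clf.pred(x[test])) == y[test])
        rate = hits / float(len(y))
        db.append((c1, g1, rate))
        if rate > best[2]:
            best = (c1, g1, rate)
    return db, best                       # (db, (best_c1, best_g1, best_rate))
```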
Cross-validate a model
Parameters: | cls : class of the classifier to validate |
---|---|
| k : integer |
All other keyword arguments are passed to the classifier constructor.
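A minimal sketch of such a routine, assuming the data are passed as (x, y) and that the overall accuracy is returned, could look like this:

```python
import numpy as np

def crossvalidate(cls, k, x, y, **kw):
    """Illustrative k-fold cross-validation: cls is the classifier class,
    k the number of folds, and remaining keyword arguments are passed to
    the constructor.  The (x, y) arguments and the returned accuracy are
    assumptions about the real interface."""
    x, y = np.asarray(x, dtype=float), np.asarray(y)
    folds = np.array_split(np.random.permutation(len(y)), k)
    hits = 0
    for test in folds:
        train = np.setdiff1d(np.arange(len(y)), test)
        clf = cls(**kw)                   # fresh classifier for every fold
        clf.learn(x[train], y[train])
        hits += np.sum(np.asarray(clf.pred(x[test])) == y[test])
    return hits / float(len(y))           # overall cross-validation accuracy
```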