pystegML documentation

Note: the package is still under construction and has not yet been tested.

A selection of learning classifiers. All the classes strive to implement the interface of the classifiers from mlpy, with the methods learn(), pred(), and labels().

Module: pysteg.ml
Date: $Date$
Revision: $Revision$
Author: © 2012: Hans Georg Schaathun <georg@schaathun.net>

The pysteg.ml.classifiers module

Ensemble classifier.

Module: pysteg.ml.classifiers
Date: $Date$
Revision: $Revision$
Author: © 2012: Hans Georg Schaathun <georg@schaathun.net>

This module primarily implements the ensemble classifier and the necessary auxiliaries. This includes the abstract Classifier class, which is intended as a base class for other classifiers as well.

class pysteg.ml.classifiers.Classifier[source]

Abstract classifier class.

Any derived class must implement _learn() and _pred(), which are called from the training method learn() and the prediction method pred() respectively. The main methods normalise the arguments, so _learn() and _pred() can assume well-formed numpy array arguments.
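
For illustration, a minimal subclass might look as follows. This is only a sketch: the class name NearestMeanClassifier and the attributes _classes and _means are hypothetical, and the code assumes nothing beyond what is stated above, namely that _learn() and _pred() receive well-formed numpy arrays.

    import numpy as np
    from pysteg.ml.classifiers import Classifier

    class NearestMeanClassifier(Classifier):
        # Hypothetical subclass: classify each sample by the nearest class mean.

        def _learn(self, x, y):
            # x is an (N, dim) numpy array, y a 1d integer array of labels,
            # both already normalised by learn().
            self._classes = np.unique(y)
            self._means = np.array([x[y == c].mean(axis=0)
                                    for c in self._classes])

        def _pred(self, t):
            # t is a 2d numpy array of feature vectors; return one label per row.
            d = ((t[:, None, :] - self._means[None, :, :]) ** 2).sum(axis=2)
            return self._classes[d.argmin(axis=1)]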

labels()[source]

Return the set of labels used.

learn(x, y, labels=None)[source]

Train the classifier.

Parameters :
x : 2d array_like object

training data (N, dim); each row is one object.

y : 1d array_like object (integer)

true labels of the training set

pred(t)[source]

Run classification of test vector(s) t, returning discrete prediction labels.

Parameters :
t : 1d (one sample) or 2d array_like object

Feature vectors to be classified

Returns :
p : integer or 1d numpy array

predicted class(es)
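
A typical call sequence, using the hypothetical subclass sketched above on random toy data (the assumption that the constructor needs no arguments is the sketch's, not the documentation's):

    import numpy as np

    x = np.random.randn(100, 10)   # 100 objects with 10 features each (random toy data)
    y = np.repeat([0, 1], 50)      # two classes, e.g. cover and stego

    clf = NearestMeanClassifier()  # any concrete Classifier subclass
    clf.learn(x, y)
    print(clf.labels())            # the labels seen during training
    print(clf.pred(x[:5]))         # 1d array of predicted labels
    print(clf.pred(x[0]))          # a single integer for a single sample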

class pysteg.ml.classifiers.EC(L, dred, base=mlpy.da.LDAC, rule='majority')[source]

This class represents an ensemble classifier.

pred_values(t)

Run classification of test vector(s) t and return soft information classification scores.

Parameters :
t : 1d (one sample) or 2d array_like object

Feature vectors to be classified

Returns :
p : float or 1d numpy array

classification scores

NOT COMPLETED. Need to design fusion rules for multi-class prediction.
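
A hedged usage sketch of the ensemble classifier. Reading L as the number of base learners and dred as the reduced feature dimension per learner follows the usual ensemble-classifier design, but is not stated explicitly above; the numbers are arbitrary.

    from pysteg.ml.classifiers import EC

    ec = EC(L=31, dred=8)       # assumed: 31 base learners on 8-dimensional random subspaces
    ec.learn(x, y)              # x, y as in the Classifier example above
    hard = ec.pred(x)           # hard predictions under the default 'majority' rule
    soft = ec.pred_values(x)    # soft classification scores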

The pysteg.ml.svm module

Support Vector Machines with Grid Search.

Module: pysteg.ml.svm
Date: $Date$
Revision: $Revision$
Author: © 2012: Hans Georg Schaathun <georg@schaathun.net>
Copyright: a small part of the code is derived from code (c) 2000-2010 Chih-Chung Chang and Chih-Jen Lin. See the source code for details.

This provides a wrapper for LibSvm, where a grid search is performed as an integral part of training. Thus gamma and C do not have to be specified at initialisation time; instead, search ranges are provided and the training method finds the optimal values within those ranges.

The module also exports the auxiliary functions to do grid search and cross-validation.

class pysteg.ml.svm.SVM(fold=5, crange=(-5, 15, 2), grange=(3, -15, -2), nworker=1, gnuplot=None, verbosity=0, **kw)[source]

Support Vector Machine.

This is a wrapper for mlpy.LibSvm with additional functionality.

This class is based on the published PDF documentation for mlpy version 3.5.0. It is incompatible with the current head of the mlpy mercurial repository and will in all likelihood be DEPRECATED.
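
A usage sketch, assuming the class follows the learn()/pred() interface described for the classifiers above. The search ranges shown are the defaults and are interpreted as (start, stop, step) exponents of 2 for C and gamma, as in the libSVM grid script.

    from pysteg.ml.svm import SVM

    # Default search ranges: log2(C) from -5 to 15 in steps of 2,
    # log2(gamma) from 3 down to -15 in steps of -2.
    svm = SVM(fold=5, crange=(-5, 15, 2), grange=(3, -15, -2))
    svm.learn(x, y)             # grid search for (C, gamma), then training at the optimum
    p = svm.pred(x)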

pysteg.ml.svm.gridsearch(x, y, fold, crange=(-5, 15, 2), grange=(3, -15, -2), nr_local_worker=1, gnuplot=None, verbosity=0, **kw)[source]

Perform a grid search based on the given test set and options. If gnuplot is given, it should implement the interface of a GnuPlot object, and it will be used to plot and update a contour plot of parameter choices during the search.

The return value is (db, (best_c1, best_g1, best_rate)), where db is a list of all the tested parameters with their performance, best_c1 and best_g1 are the optimal parameters (log C and log gamma), and best_rate is the cross-validation accuracy at the optimal parameters.

This code is derived from the grid search script distributed with the libSVM library.

Copyright (c) 2000-2010 Chih-Chung Chang and Chih-Jen Lin
Copyright (c) 2012 Hans Georg Schaathun
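
For example, the search could be run and the optimal LibSvm parameters recovered from their logarithms like this:

    from pysteg.ml.svm import gridsearch

    db, (best_c1, best_g1, best_rate) = gridsearch(x, y, fold=5)

    # Recover the actual parameters, assuming base-2 logarithms as in the libSVM script.
    C, gamma = 2 ** best_c1, 2 ** best_g1
    print("best C=%g, gamma=%g, CV accuracy=%.3f" % (C, gamma, best_rate))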

pysteg.ml.svm.xvalidate(cls, k, x, y, seed=0, **kw)[source]

Cross-validate a model

Parameters :

cls : class

the classifier class to validate

k : integer

to perform k-fold cross-validation

x : 2-D array

training set feature vectors

y : 1-D array

training set labels

All other keyword arguments are passed to the classifier constructor.
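
A sketch of a call; what exactly is returned (presumably an accuracy estimate) is not specified above, so the name score is only indicative.

    from pysteg.ml.svm import xvalidate
    from pysteg.ml.classifiers import EC

    # 10-fold cross-validation of the ensemble classifier; the keyword
    # arguments L and dred are passed through to the EC constructor.
    score = xvalidate(EC, 10, x, y, seed=42, L=31, dred=8)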
