
Classifiers

On this page the classifier is introduced. It is an essential extension of the PRTools mapping.

Background

Programmatically there is almost no difference between classifiers and other trainable mappings; the difference lies in the way they are used. A c-class classifier maps the input vector space onto an output space of c dimensions. Every dimension is related to one of the classes. The vector elements represent either distances to the corresponding class (numbers on the interval $$[-\infty,\infty]$$), class memberships on $$[0,1]$$, class confidences on $$[0,1]$$ summing to $$1$$, or posterior probabilities, also on $$[0,1]$$ and summing to $$1$$.
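
The following fragment sketches how this looks in practice. It is a minimal sketch, assuming a standard PRTools installation in which gendatb, ldc and classc are available:

  a = gendatb([50 50]);     % banana-shaped dataset: 2 classes, 2 features
  w = ldc(a);               % train a linear (normal density based) classifier
  d = a*w;                  % classification dataset: 100 objects x 2 class outputs
  +d(1:3,:)                 % raw outputs of the first three objects, one column per class
  +(d(1:3,:)*classc)        % the same outputs normalised to confidences summing to 1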

The output dataset produced by a trained classifier, the classification dataset, is special in another way as well. Datasets offer the possibility of annotating the features by so-called feature labels. In a classification dataset these should be set to the class names, i.e. the labels of the objects used to train the classifier. This makes it possible to assign label estimates during the classification procedure: for every test object the estimate is the feature label of the column holding the largest element of its output vector. The classification routine labeld is based on this.
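
A small sketch of label assignment by labeld, assuming the trained classifier w from the fragment above:

  b = gendatb([20 20]);     % an independent test set
  lab = b*w*labeld;         % label estimates: feature label of the largest output
  disp(lab(1:5,:));         % show the first five estimated labels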

The object labels of a test dataset are copied to the classification dataset. Every test object thereby carries two labels: its own object label and the label found by the classification procedure. Test routines like testc, testd, confmat and prcrossval compare these two labelings.
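
Continuing the fragments above, the evaluation routines can be used as follows (a sketch that assumes the datasets a and b and the classifier w defined earlier):

  e = b*w*testc                             % classification error on the test set
  C = confmat(getlabels(b), b*w*labeld)     % confusion matrix: true versus estimated labels
  e10 = prcrossval(a, ldc, 10)              % 10-fold cross-validation error of ldc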

It is the responsibility of the programmer of classification routines to produce outputs in the above sense and to label the features of the classification dataset appropriately.
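
As an illustration, the following toy fragment constructs such a classification dataset by hand for a hypothetical nearest-mean-like model (the variables m1, m2, out and d exist only in this sketch). Once the feature labels are set to the class names, the standard routine labeld works on it:

  a = gendatb([30 30]);                 % data to be classified
  lablist = getlablist(a);              % class names (normally taken from the training set)
  m1 = mean(+seldat(a,1));              % mean of class 1: a toy 'trained' model
  m2 = mean(+seldat(a,2));              % mean of class 2
  out = -[distm(+a,m1) distm(+a,m2)];   % negated squared distances: larger means closer
  d = setdat(a, out);                   % store the outputs in a classification dataset
  d = setfeatlab(d, lablist);           % feature labels set to the class names
  lab = d*labeld;                       % standard label assignment now works
  disp(lab(1:5,:));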

Definition

The definition of a classifier is formally no different from that of other trainable mappings. It is the programmatic construction of the output classification dataset that makes the difference. See mapping definition.

Overload

Operations on mappings apply to classifiers as well. Especially important is the combining of classifiers. See the sections on stacked and parallel combining, and the one on dyadic operations. Sequential combining of mappings and classifiers can be used to modify classifiers, e.g. by first scaling or reducing the feature space, as sketched below.
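
A short sketch of these constructions, assuming the standard PRTools data generators and fixed combining rules:

  a = gendath([100 100]);                                % Highleyman dataset, 2 classes
  [t, s] = gendat(a, 0.5);                               % split into a training and a test set
  u = scalem([],'variance')*ldc;                         % sequential: variance scaling followed by ldc
  v = [ldc*classc, qdc*classc, knnc([],3)*classc]*maxc;  % stacked combining with the maximum rule
  e_seq   = s*(t*u)*testc                                % test error of the sequential classifier
  e_stack = s*(t*v)*testc                                % test error of the stacked combiner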

Examples
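
A minimal end-to-end sketch, assuming standard PRTools routines such as gendatb, parzenc, scatterd and plotc:

  a = gendatb([120 120]);          % banana-shaped two-class dataset
  [t, s] = gendat(a, 0.5);         % 50/50 split into training and test sets
  w = parzenc(t);                  % train a Parzen density based classifier
  figure; scatterd(s);             % scatterplot of the test set
  plotc(w);                        % decision boundary of the trained classifier
  e = s*w*testc                    % test error of the trained classifier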

 
