PRTools Contents

PRTools User Guide

svc

SVC

Trainable classifier: Support Vector Machine

    [W,J] = SVC(A,KERNEL,C)
    [W,J] = A*SVC([],KERNEL,C)
    [W,J] = A*SVC(KERNEL,C)

Input
 A Dataset
 KERNEL Untrained mapping to compute the kernel by A*(A*KERNEL) during training, or by B*(A*KERNEL) during testing with dataset B.
 -  String to compute kernel matrices by FEVAL(KERNEL,B,A). Default: linear kernel (PROXM('p',1))
 C Regularisation parameter (optional; default: 1)

Output
 W Mapping: Support Vector Classifier
 J Object indices of support objects

Description

Optimises a support vector classifier for the dataset A by quadratic programming. The non-linearity is determined by the kernel. If KERNEL = 0, it is assumed that A is already the (square) kernel matrix. In that case a kernel matrix B should also be supplied at evaluation time by B*W or PRMAP(B,W).
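As a sketch of the KERNEL = 0 route described above (illustrative, not part of this help file; the dataset names a and b and the radial basis width 1 are assumptions):

 a = gendatb;             % training set
 b = gendatb;             % independent test set
 u = proxm('r',1);        % untrained radial basis kernel mapping
 K = a*(a*u);             % square kernel matrix of the training set
 w = svc(K,0);            % KERNEL = 0: K is taken as the kernel matrix
 Kb = b*(a*u);            % kernel matrix between test and training objects
 Kb*w*testc               % classification error on b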

There are several ways to define KERNEL, e.g. PROXM('r',1) for a radial basis kernel, or USERKERNEL for a user-defined kernel.

If C is NaN, this regularisation parameter is optimised by REGOPTC.
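A minimal sketch of this automatic regularisation (the polynomial kernel is an arbitrary choice for illustration):

 a = gendatb;                    % banana-shaped two-class data
 w = a*svc(proxm('p',3),NaN);    % C optimised internally by REGOPTC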

SVC is basically a two-class classifier. Multi-class problems are solved in a one-against-rest fashion by MCLASSC. The resulting base classifiers are combined by the maximum confidence rule. A better, non-linear combiner might be FISHERCC, e.g. W = A*(SVC*FISHERCC).
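The two multi-class routes can be sketched as follows (a sketch under the assumption that GENDATM yields a multi-class dataset; errors are shown on the training set for brevity):

 a = gendatm;              % multi-class dataset
 w1 = a*svc;               % one-against-rest via MCLASSC, maximum confidence rule
 w2 = a*(svc*fishercc);    % trained non-linear combiner FISHERCC
 a*w1*testc                % error of the maximum confidence combination
 a*w2*testc                % error of the FISHERCC combination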

See SVCINFO for more possibilities.

Example(s)

 a = gendatb;                     % generate banana classes
 [w,J] = a*svc(proxm('p',3));     % compute svm with 3rd order polynomial kernel
 a*w*testc                        % show error on the training set
 scatterd(a)                      % show scatterplot
 plotc(w)                         % plot classifier
 hold on;
 scatterd(a(J,:),'o')             % mark support objects

See also

mappings, datasets, proxm, userkernel, nusvc, rbsvc, libsvc, regoptc, mclassc, fishercc


This file has been automatically generated. If badly readable, use the help-command in Matlab.