SVO

Support Vector Optimiser, low-level routine

    [V,J,C,NU] = SVO(K,NLAB,C,OPTIONS)

Input
 K Similarity matrix
 NLAB Label list consisting of -1/+1
 C Scalar for weighting the errors (optional; default: 1)
 OPTIONS Structure with the following fields (a construction sketch follows this list)
 .PD_CHECK Force positive definiteness of the kernel by adding a small constant to the kernel diagonal (default: 1)
 .BIAS_IN_ADMREG The bias of the SVC (the b term) may turn out to be undefined. If BIAS_IN_ADMREG == 1, b is then taken as the midpoint of its admissible region; if BIAS_IN_ADMREG == 0, the situation is treated as an optimisation failure and handled accordingly (default: 1)
 .PF_ON_FAILURE If the optimisation fails (or the bias is undefined and BIAS_IN_ADMREG is 0) and PF_ON_FAILURE == 1, a Pseudo-Fisher classifier is computed instead; if PF_ON_FAILURE == 0, an error is issued (default: 1)
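
The OPTIONS argument is an ordinary Matlab structure. A minimal construction sketch with the defaults listed above is shown below; the lower-case field names follow the usual PRTools convention for options structures and are an assumption, so adjust them if your PRTools version differs.

    opts.pd_check       = 1;   % add a small constant to the kernel diagonal if needed
    opts.bias_in_admreg = 1;   % take an undefined bias from the midpoint of its admissible region
    opts.pf_on_failure  = 1;   % fall back to a Pseudo-Fisher classifier if optimisation fails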

Output
 V Vector of weights for the support vectors
 J Index vector pointing to the support vectors
 C The value of C that was actually used in the optimisation
 NU The NU parameter of the NUSVC algorithm that gives the same classifier

Description

A low-level routine that optimises the set of support vectors for a 2-class classification problem based on the similarity matrix K computed from the training set. SVO is called directly by SVC. The labels NLAB should indicate the two classes by +1 and -1. Optimisation is done by quadratic programming: if available, the QLD function is used, otherwise an appropriate standard Matlab routine.
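
For illustration only, the sketch below calls SVO directly on a small synthetic 2-class problem with a linear kernel; the synthetic data, the variable names and the lower-case options fields are assumptions and not part of the PRTools documentation itself. In normal use the kernel is computed and SVO is called by SVC.

    % Sketch: direct call to SVO on synthetic data.
    A    = [randn(10,2)-1; randn(10,2)+1];   % 20 objects in 2 dimensions, two classes
    nlab = [-ones(10,1); ones(10,1)];        % labels given as -1/+1, as required
    K    = A*A';                             % linear kernel used as similarity matrix

    opts.pd_check       = 1;                 % assumed lower-case field names, see above
    opts.bias_in_admreg = 1;
    opts.pf_on_failure  = 1;

    [v,J,C_used,nu] = svo(K,nlab,1,opts);    % C = 1

The support objects are then A(J,:); C_used and nu report the trade-off parameters that were actually used.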

See also

svc
