PARALLEL Combining classifiers in different feature spaces
WC = PARALLEL(W1,W2,W3, ...)   or  WC = [W1;W2;W3; ...]
WC = PARALLEL({W1;W2;W3; ...}) or  WC = [{W1;W2;W3; ...}]
WC = PARALLEL(WC,W1,W2, ...)   or  WC = [WC;W1;W2; ...]
WC = PARALLEL(C)
Description

The base classifiers (or mappings) W1, W2, W3, ... defined in different feature spaces are combined in WC. This is a classifier defined for the total number of features and with the combined set of outputs. So, for three two-class classifiers defined for the classes 'c1' and 'c2', a dataset A is mapped by D = A*WC onto the outputs 'c1','c2','c1','c2','c1','c2', which are the feature labels of D. Note that classification by LABELD(D) finds for each vector in D the feature label of the column with the maximum value. This is equivalent to using the maximum combiner MAXC. Other fixed combining rules like PRODC, MEANC and VOTEC can be applied by D = A*WC*PRODC, etc.

A trained combiner like FISHERC has to be supplied with the appropriate training set by AC = A*WC; VC = AC*FISHERC. So the expression VC = A*WC*FISHERC yields a classifier and not a dataset, as it would for fixed combining rules. This classifier operates in the intermediate feature space, the output space of the set of base classifiers. A new dataset B has to be mapped to this intermediate space first by BC = B*WC before it can be classified by D = BC*VC. As this is equivalent to D = B*WC*VC, the total trained combiner is WTC = WC*VC = WC*A*WC*FISHERC. To simplify this procedure, PRTools executes the training of a combined classifier written as WTC = A*(WC*FISHERC) as WTC = WC*A*WC*FISHERC.

In order to train an untrained parallel combined classifier by A*WC, the subsets of the features of A that apply to the individual base classifiers of WC should be known to WC. These subset sizes are stored in the dataset description if the dataset is constructed by horizontal concatenation, e.g. A = [A1 A2 A3].
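The following is a minimal sketch of the above workflows, assuming PRTools is on the path. The dataset generator GENDATD, the 2+2 feature split and the base classifiers LDC and QDC are illustrative choices, not prescribed by PARALLEL.

  A  = gendatd([50 50],4);        % 2-class dataset with 4 features
  A1 = A(:,1:2);                  % feature space of the first base classifier
  A2 = A(:,3:4);                  % feature space of the second base classifier

  W1 = A1*ldc;                    % train each base classifier in its own space
  W2 = A2*qdc;
  WC = [W1;W2];                   % parallel combination, same as PARALLEL(W1,W2)

  AC = [A1 A2];                   % horizontal concatenation stores the feature
                                  % subset sizes (2 and 2) in the dataset

  % Fixed combining rules
  D       = AC*WC;                % map to the intermediate output space
  labmax  = labeld(D);            % maximum of the outputs, i.e. the MAXC rule
  labprod = labeld(AC*WC*prodc);  % product rule instead

  % Trained combining rule
  VC   = (AC*WC)*fisherc;         % train FISHERC in the intermediate space
  WTC  = WC*VC;                   % total trained combiner
  WTC2 = AC*(WC*fisherc);         % the same, written as a single training step

  B = gendatd([20 20],4);         % new data in the total feature space
  e = B*WTC*testc;                % estimate the classification error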
See also

mappings, datasets, maxc, minc, meanc, medianc, prodc, fisherc, stacked

Example(s)

prex_combining