DisTools examples: Combining dissimilarity matrices
Different dissimilarity measures generate different representations. Various ways of combining them are possible, and some examples are shown here. It is assumed that readers are familiar with PRTools and will consult the following pages where needed:
- PRTools User Guide; see the bottom of that page for a table of contents
- Introduction to DisTools
- Dissimilarity Representation Course
- The following packages should be in the MATLAB path: PRTools, DisTools and PRDisData
If a set of dissimilarity measures has been computed for the same set of objects, the following options may be considered to combine them (a small code sketch follows the list):
- Select the best one by cross-validation.
- Normalize and average.
- Weight and sum; optimizing the weights is similar to what is studied in metric learning.
- Concatenate all dissimilarity spaces. This is realized by horizontal concatenation of the dissimilarity matrices.
- Determine the best classifier for every dissimilarity measure (matrix) and combine them.
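As a rough illustration, assume D1 and D2 are two such dissimilarity matrices, stored as PRTools datasets over the same labeled objects. The last four options then amount to simple dataset operations; this is only a sketch, and it assumes that disnorm returns a normalization mapping and that PRTools dataset arithmetic (+, /) is available, which should be checked against the help texts.

  D1n = D1*disnorm(D1);        % normalize each matrix (assumed mapping usage)
  D2n = D2*disnorm(D2);
  Davg  = (D1n + D2n)/2;       % normalize and average
  Dwsum = 0.7*D1n + 0.3*D2n;   % weighted sum; the weights are free parameters
  Dcat  = [D1n D2n];           % concatenation of the two dissimilarity spaces
  U  = fisherc;                % any untrained classifier
  W1 = D1n*U; W2 = D2n*U;      % train one classifier per matrix
  Wcomb = [W1; W2]*meanc;      % parallel combiner, to be applied to [D1n D2n]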
Below are some examples. In PRDisData the following datasets consist of a set of dissimilarity matrices of the same objects: chickenpieces (44 sets), flowcytodis (4 sets), mfeatdis (6 sets), cover80, covers_beethoven, covers_beatles and covers_songs. Moreover, the separately mentioned datasets coildelftdiff, coildelftsame and coilyork refer to the same set of images in the COIL database, and polydish57 and polydism57 are based on the same set of polygons.
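For instance, one of the chickenpieces matrices can be loaded and inspected as shown below. The call and its two parameters (a segment length and an angle weight) are given here as an assumption; check the PRDisData documentation for the exact interface.

  D = chickenpieces(20,60);   % assumed call: loads one of the 44 chickenpieces matrices
  size(D)                     % a square, labeled dissimilarity matrix (prdataset)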
Exercise
- Take one of the multi-dismat problems.
- Normalize all matrices by disnorm.
- Decide on one or more classifiers to use: knnc, fisherc, svc, loglc.
- Determine for every dismat the performance by cross-validation.
- Average all matrices and determine, as above, the performance of your classifier(s).
- Concatenate all matrices and determine, as above, the performance of your classifier(s).
- Are you able to combine classifiers and determine the performance of the combiner? A possible sketch of all these steps is given after this list.
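The following lines are one possible, untested sketch of these steps. They assume that the dissimilarity matrices of the chosen problem have been collected in a cell array DD of PRTools datasets with identical object labels, that disnorm returns a normalization mapping, and that prcrossval is available for the cross-validation errors; all of this should be verified against the corresponding help texts.

  U = fisherc;                        % choose one (or more) untrained classifiers
  n = numel(DD);
  e = zeros(1,n);
  for i = 1:n
    DD{i} = DD{i}*disnorm(DD{i});     % normalize every dissimilarity matrix
    e(i)  = prcrossval(DD{i},U,10);   % 10-fold cross-validation error per matrix
  end
  Davg = DD{1};                       % average the normalized matrices
  for i = 2:n
    Davg = Davg + DD{i};
  end
  Davg  = Davg/n;
  e_avg = prcrossval(Davg,U,10);      % performance of the averaged matrix
  Dcat  = [DD{:}];                    % concatenate all dissimilarity spaces
  e_cat = prcrossval(Dcat,U,10);      % performance in the concatenated space

For the final step, the per-matrix classifiers may be combined by parallel combining, e.g. W = [DD{1}*U; DD{2}*U]*meanc, applied to the corresponding concatenation [DD{1} DD{2}]. Make sure to evaluate such a combiner on objects that were not used for training, for instance via a split made with gendat.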