Trunk’s example of the peaking phenomenon

In 1979 G.V. Trunk published a very clear and simple example of the peaking phenomenon, which has since been cited many times to explain why peaking occurs. Here we summarize and discuss it for those who want a better understanding of the peaking problem. The paper presents an extreme example. Its value…

Read the rest of this entry

Pattern recognition at Easter

We were invited to watch the search for Easter eggs at the preschool of our grandson Pim. He is about two and a half, does not speak much yet, and is mainly an observer. Internally he might even be a philosopher, who knows? We had to take the early train and rent some bicycles to arrive in time…

Read the rest of this entry

A crisis in the theory of pattern recognition

In 1972 the Russian scientist A. Lerner published a paper under the title “A crisis in the theory of Pattern Recognition”. This is definitely a title that attracts the attention of researchers interested in the history of the field. What was it that appeared to be the crisis? The answer is surprising; in short, it…

Read the rest of this entry

The curse of dimensionality

Imagine a two-class problem represented by 100 training objects in a 100-dimensional feature (vector) space. If the objects are in general position (not accidentally lying in a lower-dimensional subspace), they still fit exactly in a 99-dimensional subspace. This is a ‘plane’, formally a hyperplane, in the 100-dimensional feature space. We will argue that this…
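The claim can be checked numerically: 100 random points in a 100-dimensional space, once centered, span a subspace of dimension 99. A minimal sketch in NumPy (the random Gaussian data is illustrative, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 100))  # 100 objects in a 100-D feature space

# Centering shifts the points into the affine subspace they span;
# for points in general position its dimension is 100 - 1 = 99.
rank = np.linalg.matrix_rank(X - X.mean(axis=0))
print(rank)  # 99
```

Random Gaussian points are in general position with probability one, so the rank is 99, one less than the number of objects.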

Read the rest of this entry

Hughes phenomenon

The peaking paradox was heavily discussed in pattern recognition after a general mathematical analysis of the phenomenon was published by Hughes in 1968. It puzzled researchers for at least a decade. Peaking is a real-world phenomenon that has been observed many times. Although the explanation by Hughes seemed general and convincing,…

Read the rest of this entry

The peaking paradox

To measure is to know. Consequently, if we measure more, we know more. This is a fundamental understanding in science, as phrased by Kelvin. If we want to know more, or want to increase the accuracy of our knowledge, we should observe more. How to realize this in pattern recognition, however, is a constantly recurring problem…

Read the rest of this entry

Non-metric dissimilarities are all around

A big advantage of representing objects in a dissimilarity space, compared with the use of kernels, is that it has no problems with non-Euclidean dissimilarity measures. More specifically, it can handle non-metric measures as well. Here, we will show by common examples that such dissimilarities arise easily, both in daily life as…
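A dissimilarity measure is non-metric when some triple of objects violates the triangle inequality. A minimal sketch of such a check, using a small hypothetical dissimilarity matrix (the values are illustrative, not taken from the post):

```python
import numpy as np

# Hypothetical dissimilarities between three objects A, B, C.
# A and C are both close to B, yet far from each other.
D = np.array([[0., 1., 3.],
              [1., 0., 1.],
              [3., 1., 0.]])

# Find all triples violating the triangle inequality d(i,k) <= d(i,j) + d(j,k).
n = D.shape[0]
violations = [(i, j, k)
              for i in range(n) for j in range(n) for k in range(n)
              if D[i, k] > D[i, j] + D[j, k] + 1e-12]
print(violations)  # [(0, 1, 2), (2, 1, 0)]
```

Here d(A,C) = 3 exceeds d(A,B) + d(B,C) = 2, so no Euclidean (or even metric) embedding of these three objects exists; the dissimilarity space handles them regardless.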

Read the rest of this entry

Metric learning, a problem in consciousness

Pattern recognition studies the tools for learning from examples what is as yet unknown. Distances (or dissimilarities) are a primary notion for learning in pattern recognition. What should be done if no proper distance measure is known? Can it be learnt? On what basis? This seems to be a consciousness problem. There are three sources of scientific…

Read the rest of this entry

Kernel-induced space versus the dissimilarity space

The dissimilarity representation has a strong resemblance to a kernel. There are, however, essential differences in assumptions and usage. Here they will be summarized and illustrated by some examples. Dissimilarities and kernels are both functions describing the pairwise relations between objects. Dissimilarities can be considered as a special type of kernel if kernels are understood…
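The basic construction behind the dissimilarity representation can be sketched in a few lines: every object becomes a vector of its dissimilarities to a representation set of prototypes, after which any vector-space classifier applies. A minimal sketch assuming Euclidean distance as the measure (the data and the helper name `dissimilarity_space` are illustrative, not from the post):

```python
import numpy as np

def dissimilarity_space(X, R):
    """Embed the objects X (rows) as vectors of Euclidean distances
    to a representation set R: one row per object, one feature per
    prototype in R."""
    return np.sqrt(((X[:, None, :] - R[None, :, :]) ** 2).sum(axis=2))

X = np.array([[0., 0.], [3., 4.]])   # two objects in a 2-D feature space
R = np.array([[0., 0.], [0., 4.]])   # two prototypes
print(dissimilarity_space(X, R))     # [[0. 4.] [5. 3.]]
```

Unlike a kernel, this matrix need not be symmetric or positive semidefinite, which is why non-Euclidean and non-metric measures pose no problem here.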

Read the rest of this entry

Personal history on the dissimilarity representation

Personal and historical notes: the previous post briefly explains the arguments for the steps we took between 1995 and 2005. From the perspective we have now, it has become much clearer what we did in those years. Below, a few historical remarks are made, as they sketch how research may proceed. It all started when…

Read the rest of this entry
