
2004 | Original Paper | Book Chapter

Support Blob Machines

The Sparsification of Linear Scale Space

Author: Marco Loog

Published in: Computer Vision - ECCV 2004

Publisher: Springer Berlin Heidelberg


A novel generalization of linear scale space is presented. The generalization allows for a sparse approximation of the function at a certain scale.

To start with, we first consider the Tikhonov regularization viewpoint on scale space theory [15]. The sparsification is then obtained using ideas from support vector machines [22], based on the link between sparse approximation and support vector regression described in [4] and [19].

In regularization theory, an ill-posed problem is solved by searching for a solution that has a certain differentiability while, in some precise sense, remaining close to the initial signal. To obtain scale space, a quadratic loss function is used to measure the closeness of the initial function to its scale-σ image.

We propose to alter this loss function, thus obtaining our generalization of linear scale space. Comparable to the linear ε-insensitive loss function introduced in support vector regression [22], we use a quadratic ε-insensitive loss function instead of the original quadratic measure. The ε-insensitive loss allows errors in the approximating function without any increase in loss; it penalizes errors only when they become larger than the a priori specified constant ε. The quadratic form is mainly maintained for consistency with linear scale space.

Although the main concern of the article is the theoretical connection between the foregoing theories, the proposed approach is also tested and exemplified in a small experiment on a single image.
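The contrast between the two loss functions in the abstract can be sketched numerically. The snippet below (an illustrative sketch, not the paper's implementation; the value of ε and the sample residuals are arbitrary) compares the standard quadratic data term of linear scale space with the quadratic ε-insensitive variant: residuals smaller than ε contribute zero loss, which is what makes a sparse support-vector-style representation possible, and setting ε = 0 recovers the ordinary quadratic loss.

```python
import numpy as np

def quadratic_loss(r):
    """Standard quadratic loss used in the Tikhonov data term of linear scale space."""
    return np.asarray(r, dtype=float) ** 2

def quadratic_eps_insensitive_loss(r, eps=0.1):
    """Quadratic epsilon-insensitive loss: residuals with |r| <= eps incur no
    penalty; larger residuals are penalized quadratically on the excess.
    With eps = 0 this reduces to the standard quadratic loss."""
    return np.maximum(np.abs(r) - eps, 0.0) ** 2

# Illustrative residuals between the initial signal and its approximation.
residuals = np.array([-0.3, -0.05, 0.0, 0.08, 0.5])
print(quadratic_loss(residuals))
print(quadratic_eps_insensitive_loss(residuals, eps=0.1))
```

Note how the three residuals inside the ±0.1 band contribute exactly zero under the ε-insensitive loss, while the quadratic loss penalizes every nonzero residual; those zero-loss points are the ones that can be dropped from a sparse expansion.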

Metadata
Title
Support Blob Machines
Author
Marco Loog
Copyright Year
2004
Publisher
Springer Berlin Heidelberg
DOI
https://doi.org/10.1007/978-3-540-24673-2_2