Open Access
Feature Combination beyond Basic Arithmetics
Author(s) -
Hao Fu,
Guoping Qiu,
Hangen He
Publication year - 2011
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.25.58
Subject(s) - kernel (algebra) , histogram , pattern recognition (psychology) , kernel method , artificial intelligence , computer science , feature (linguistics) , matching (statistics) , kernel embedding of distributions , mathematics , simple (philosophy) , tree kernel , string kernel , support vector machine , image (mathematics) , statistics , linguistics , philosophy , epistemology , combinatorics
Kernel-based feature combination techniques such as Multiple Kernel Learning use arithmetical operations to linearly combine different kernels. We have observed that the kernel distributions of different features are usually very different. We argue that the similarity distributions amongst the data points for a given dataset should not change with their representation features, and we propose the concept of relative kernel distribution invariance (RKDI). We have developed a very simple histogram-matching-based technique to achieve RKDI by transforming the kernels to a canonical distribution. We have performed extensive experiments on various computer vision and machine learning datasets and show that calibrating the kernels to an empirically chosen canonical space before they are combined can always achieve a performance gain over state-of-the-art methods. As histogram matching is a remarkably simple and robust technique, the new method is universally applicable to kernel-based feature combination.
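The abstract describes the core idea only at a high level: match each kernel's value distribution to a chosen canonical distribution before linearly combining the kernels. The sketch below is one plausible rendering of that idea, not the authors' exact procedure; the helper names (`histogram_match`, `combine_kernels`) and the choice of the first kernel as the canonical reference are illustrative assumptions.

```python
import numpy as np

def histogram_match(kernel, canonical_values):
    """Map the entries of `kernel` onto the empirical distribution of
    `canonical_values` by rank (classic histogram matching).
    Note: a symmetric kernel stays only approximately symmetric here;
    this is a sketch, not the paper's exact calibration step."""
    flat = kernel.ravel()
    order = np.argsort(flat)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(flat.size)          # rank of each kernel entry
    canon_sorted = np.sort(np.asarray(canonical_values).ravel())
    # Look up the canonical quantile that corresponds to each rank
    quantiles = (ranks + 0.5) / flat.size
    canon_quantiles = (np.arange(canon_sorted.size) + 0.5) / canon_sorted.size
    matched = np.interp(quantiles, canon_quantiles, canon_sorted)
    return matched.reshape(kernel.shape)

def combine_kernels(kernels, canonical, weights=None):
    """Calibrate every kernel to the canonical distribution, then combine linearly."""
    if weights is None:
        weights = np.ones(len(kernels)) / len(kernels)
    calibrated = [histogram_match(K, canonical) for K in kernels]
    return sum(w * K for w, K in zip(weights, calibrated))

# Usage: two RBF kernels built from different feature sets, calibrated to the first
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(50, 10)), rng.normal(size=(50, 5))
rbf = lambda X, gamma: np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
K1, K2 = rbf(X1, 0.1), rbf(X2, 1.0)
K_combined = combine_kernels([K1, K2], canonical=K1)
```

After calibration, all kernels share the same value distribution, so a simple (even uniform) linear combination no longer lets one kernel dominate merely because its similarity values are larger in scale.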
