Nyström-based approximate kernel subspace learning

    Research output: Contribution to journal › Article › Scientific › peer-reviewed

    22 Citations (Scopus)

    Abstract

    In this paper, we describe a method for determining a subspace of the feature space in kernel methods that is suited to large-scale learning problems. Linear model learning in the obtained space corresponds to nonlinear model learning in the input space. Since the feature space is determined solely from properties of the training data, the approach can be used for generic nonlinear pattern recognition. That is, the nonlinear data mapping can be regarded as a pre-processing step exploiting nonlinear relationships between the training data. Linear techniques can subsequently be applied in the new feature space and can thus model nonlinear properties of the problem at hand. To address the time and memory complexities inherent to kernel learning methods, we follow an approximate learning approach. We show that the method yields considerable speed gains while achieving very good performance. Experimental results verify our analysis.
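    The Nyström-style mapping described above can be sketched as follows. This is a minimal illustration of the general Nyström feature construction, not the paper's exact algorithm; the RBF kernel, the uniform landmark sampling, and all parameter values are assumptions made for the example. Inner products of the resulting features approximate the full kernel matrix, so linear models trained on them act as nonlinear models in the input space.

    ```python
    import numpy as np

    def rbf_kernel(A, B, gamma=1.0):
        # Pairwise RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)
        sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * sq)

    def nystroem_features(X, m=50, gamma=1.0, seed=0):
        """Map X (n x d) into an m-dimensional approximate kernel subspace.

        Samples m landmark points from the training data; the Gram matrix of
        the returned features approximates the exact kernel: Phi @ Phi.T ~= K.
        """
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=m, replace=False)
        landmarks = X[idx]
        K_mm = rbf_kernel(landmarks, landmarks, gamma)   # m x m landmark kernel
        K_nm = rbf_kernel(X, landmarks, gamma)           # n x m cross-kernel
        # Symmetric inverse square root of K_mm via eigendecomposition
        w, V = np.linalg.eigh(K_mm)
        w = np.clip(w, 1e-12, None)                      # guard tiny/negative eigenvalues
        K_mm_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
        return K_nm @ K_mm_inv_sqrt                      # n x m feature matrix

    # Usage: compare the approximate Gram matrix against the exact kernel.
    X = np.random.default_rng(1).standard_normal((200, 5))
    Phi = nystroem_features(X, m=100, gamma=0.5)
    K_exact = rbf_kernel(X, X, gamma=0.5)
    rel_err = np.linalg.norm(Phi @ Phi.T - K_exact) / np.linalg.norm(K_exact)
    ```

    The computational gain comes from working with the n x m and m x m kernel blocks instead of the full n x n Gram matrix, so cost scales with the number of landmarks m rather than the training set size n.
    
    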

    Original language: English
    Pages (from-to): 190-197
    Number of pages: 8
    Journal: Pattern Recognition
    DOIs
    Publication status: Published - Sept 2016
    Publication type: A1 Journal article-refereed

    Keywords

    • Kernel methods
    • Nonlinear pattern recognition
    • Nonlinear projection trick
    • Nyström approximation

    Publication forum classification

    • Publication forum level 3

    ASJC Scopus subject areas

    • Software
    • Artificial Intelligence
    • Computer Vision and Pattern Recognition
    • Signal Processing
