Generalization of the K-SVD algorithm for minimization of β-divergence

Victor M. Garcia-Molla, Pablo San Juan, Tuomas Virtanen, Antonio M. Vidal, Pedro Alonso

Research output: Contribution to journal › Article › Scientific › peer-reviewed

1 Citation (Scopus)


In this paper, we propose, describe, and test a modification of the K-SVD algorithm. Given a set of training data, the proposed algorithm computes an overcomplete dictionary by minimizing the β-divergence (β ≥ 1) between the data and its representation as linear combinations of atoms of the dictionary, under strict sparsity restrictions. For the special case β = 2, the proposed algorithm minimizes the Frobenius norm and is therefore equivalent to the original K-SVD algorithm. We describe the modifications needed and discuss the possible shortcomings of the new algorithm. The algorithm is tested with random matrices and with an example based on speech separation.
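The abstract's claim that β = 2 recovers the Frobenius-norm objective can be checked numerically with the standard β-divergence definition (this is an illustrative sketch, not code from the paper; the function name and test matrices are our own):

```python
import numpy as np

def beta_divergence(X, Y, beta):
    """Sum of the elementwise beta-divergence d_beta(x, y) over all entries.

    Standard definition: beta = 1 gives the generalized Kullback-Leibler
    divergence, and beta = 2 gives half the squared Frobenius norm of X - Y.
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    if beta == 1:
        # Generalized KL divergence, with the convention 0 * log(0) = 0.
        term = np.where(X > 0, X * np.log(np.where(X > 0, X, 1.0) / Y), 0.0)
        return float(np.sum(term - X + Y))
    # General case (beta != 0, 1):
    # d_beta(x, y) = (x^beta + (beta-1) y^beta - beta x y^(beta-1)) / (beta (beta-1))
    return float(np.sum((X**beta + (beta - 1) * Y**beta
                         - beta * X * Y**(beta - 1)) / (beta * (beta - 1))))

# Hypothetical data and its approximation (e.g., a sparse dictionary reconstruction).
X = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = np.array([[1.5, 1.0], [2.0, 4.5]])

# For beta = 2 the divergence equals half the squared Frobenius norm of the residual.
print(np.isclose(beta_divergence(X, Y, 2),
                 0.5 * np.linalg.norm(X - Y, "fro")**2))
```

In a dictionary-learning setting, `Y` would be the product of the dictionary and the sparse coefficient matrix; the algorithm in the paper minimizes this quantity under a hard limit on the number of nonzero coefficients per column.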

Original language: English
Pages (from-to): 47-53
Number of pages: 7
Journal: Digital Signal Processing: A Review Journal
Publication status: Published - 1 Sept 2019
Publication type: A1 Journal article-refereed


Keywords

  • Beta-divergence
  • K-SVD
  • Matching pursuit algorithms
  • NMF
  • Nonnegative K-SVD

Publication forum classification

  • Publication forum level 1

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering


