Self-Attention Neural Bag-of-Features

Kateryna Chumachenko, Alexandros Iosifidis, Moncef Gabbouj

Research output: Conference article › Scientific › peer-reviewed

Abstract

In this work, we propose several attention formulations for multivariate sequence data. We build on top of the recently introduced 2D-Attention and reformulate the attention learning methodology by quantifying the relevance of the feature/temporal dimensions through latent spaces based on self-attention, rather than learning them directly. In addition, we propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information without treating the feature and temporal representations independently. The proposed approaches can be used in various architectures, and we specifically evaluate their application together with the Neural Bag-of-Features feature extraction module. Experiments on several sequence data analysis tasks show the improved performance yielded by our approach compared to standard methods.
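The record does not include code, but the core idea of the abstract, producing a single joint 2D (feature-temporal) attention mask from a self-attention latent space and applying it element-wise to a multivariate sequence, can be sketched roughly as follows. This is a minimal illustrative NumPy sketch under our own assumptions (function names, projection shapes, and the sigmoid gating are not taken from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def joint_2d_attention(X, Wq, Wk):
    """Illustrative joint feature-temporal attention (assumed form).

    X  : (T, D) multivariate sequence with T time steps, D features.
    Wq, Wk : (D, d_latent) projections into a latent space, from which
             relevance is quantified (self-attention style), rather
             than learning the attention mask directly.

    Returns the reweighted sequence and the joint 2D mask A of the
    same shape as X, so feature and temporal dimensions are not
    treated independently.
    """
    Q = X @ Wq                                  # (T, d_latent)
    K = X @ Wk                                  # (T, d_latent)
    # Scaled dot-product self-attention over time steps.
    S = Q @ K.T / np.sqrt(Q.shape[-1])          # (T, T)
    context = softmax(S, axis=-1) @ X           # (T, D) attended context
    # Joint 2D mask: an element-wise sigmoid gate over the context,
    # giving one relevance weight per (time step, feature) pair.
    A = 1.0 / (1.0 + np.exp(-context))          # (T, D), values in (0, 1)
    return A * X, A

# Usage with random data (shapes only; weights would be learned).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
Wq = rng.standard_normal((3, 4))
Wk = rng.standard_normal((3, 4))
Y, A = joint_2d_attention(X, Wq, Wk)
print(Y.shape, A.shape)  # (5, 3) (5, 3)
```

In contrast to applying separate temporal and feature masks, the mask here has the full (T, D) shape of the input, which matches the abstract's description of a joint 2D attention mask.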

Original language: English
Title: 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing, MLSP 2022
Publisher: IEEE
ISBN (electronic): 9781665485470
DOI: permanent links
Status: Published - 2022
Publication type (Ministry of Education and Culture): A4 Article in conference proceedings
Event: IEEE International Workshop on Machine Learning for Signal Processing - Xi'an, China
Duration: 22 Aug 2022 – 25 Aug 2022

Publication series

Name: IEEE International Workshop on Machine Learning for Signal Processing, MLSP
Volume: 2022-August
ISSN (print): 2161-0363
ISSN (electronic): 2161-0371

Conference

Conference: IEEE International Workshop on Machine Learning for Signal Processing
Country/Territory: China
City: Xi'an
Period: 22/08/22 – 25/08/22

Publication forum level

  • Jufo level 1

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Signal Processing
