TY - GEN
T1 - Self-Attention Neural Bag-of-Features
AU - Chumachenko, Kateryna
AU - Iosifidis, Alexandros
AU - Gabbouj, Moncef
N1 - Funding Information:
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 871449 (OpenDR).
Publisher Copyright:
© 2022 IEEE.
jufoid=70573
PY - 2022
Y1 - 2022
N2 - In this work, we propose several attention formulations for multivariate sequence data. We build on the recently introduced 2D-Attention and reformulate the attention learning methodology by quantifying the relevance of feature/temporal dimensions through latent spaces based on self-attention, rather than learning them directly. In addition, we propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information without treating the feature and temporal representations independently. The proposed approaches can be used in various architectures, and we specifically evaluate their application together with the Neural Bag-of-Features feature extraction module. Experiments on several sequence data analysis tasks show that our approaches yield improved performance compared to standard methods.
AB - In this work, we propose several attention formulations for multivariate sequence data. We build on the recently introduced 2D-Attention and reformulate the attention learning methodology by quantifying the relevance of feature/temporal dimensions through latent spaces based on self-attention, rather than learning them directly. In addition, we propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information without treating the feature and temporal representations independently. The proposed approaches can be used in various architectures, and we specifically evaluate their application together with the Neural Bag-of-Features feature extraction module. Experiments on several sequence data analysis tasks show that our approaches yield improved performance compared to standard methods.
U2 - 10.1109/MLSP55214.2022.9943454
DO - 10.1109/MLSP55214.2022.9943454
M3 - Conference contribution
AN - SCOPUS:85142762742
T3 - IEEE International Workshop on Machine Learning for Signal Processing, MLSP
BT - 2022 IEEE 32nd International Workshop on Machine Learning for Signal Processing, MLSP 2022
PB - IEEE
T2 - IEEE International Workshop on Machine Learning for Signal Processing
Y2 - 22 August 2022 through 25 August 2022
ER -