Attention-based Neural Bag-of-Features Learning for Sequence Data

Dat Thanh Tran, Nikolaos Passalis, Anastasios Tefas, Moncef Gabbouj, Alexandros Iosifidis

Research output: Contribution to journal › Article › Scientific › peer-review


In this paper, we propose 2D-Attention (2DA), a generic attention formulation for sequence data, which acts as a complementary computation block that can detect and focus on relevant sources of information for the given learning objective. The proposed attention module is incorporated into the recently proposed Neural Bag-of-Features (NBoF) model to enhance its learning capacity. Since 2DA acts as a plug-in layer, injecting it into different computation stages of the NBoF model results in different 2DA-NBoF architectures, each of which possesses a unique interpretation. We conducted extensive experiments on financial forecasting, audio analysis, and medical diagnosis problems to benchmark the proposed formulations against existing methods, including the widely used Gated Recurrent Units. Our empirical analysis shows that the proposed attention formulations not only improve the performance of NBoF models but also make them resilient to noisy data.
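To illustrate the idea of a plug-in attention block operating on a 2D sequence representation (features × time steps), the following is a minimal NumPy sketch. It assumes an attention mask produced by a learnable matrix `W` and a softmax over a chosen axis; the function names, the weighting scheme, and the choice of a single matrix `W` are illustrative assumptions, not the paper's exact 2DA formulation.

```python
import numpy as np

def softmax(z, axis):
    # numerically stable softmax along the given axis
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_2d(X, W, axis=1):
    """Re-weight a feature matrix X (D features x T steps) with an attention mask.

    axis=1 attends over the temporal dimension, axis=0 over the feature
    dimension. This is a hypothetical sketch of a 2D attention block,
    not the formulation from the 2DA-NBoF paper.
    """
    A = softmax(W @ X, axis=axis)  # attention mask, same shape as X
    return A * X                   # element-wise re-weighting of the input

# Toy usage: 4 features observed over 6 time steps.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 6))
W = rng.standard_normal((4, 4))  # learnable parameters in a real model
Y = attention_2d(X, W, axis=1)
print(Y.shape)  # same shape as X: (4, 6)
```

Because the block preserves the input's shape, it can be inserted before or after other layers (e.g. the codebook stage of an NBoF model) without changing the surrounding architecture, which is what makes different 2DA-NBoF variants possible.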

Original language: English
Journal: IEEE Access
Publication status: E-pub ahead of print - 2022
Publication type: A1 Journal article-refereed


Keywords

  • Attention Mechanism
  • Neural Bag-of-Features
  • Time-series Analysis

Publication forum classification

  • Publication forum level 2

ASJC Scopus subject areas

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)


