Recurrent bag-of-features for visual information analysis

Marios Krestenitis, Nikolaos Passalis, Alexandros Iosifidis, Moncef Gabbouj, Anastasios Tefas

Research output: Contribution to journal › Article › Scientific › peer-review

3 Citations (Scopus)

Abstract

Deep Learning (DL) has provided powerful tools for visual information analysis. For example, Convolutional Neural Networks (CNNs) excel in complex and challenging image analysis tasks by extracting meaningful feature vectors with high discriminative power. However, these powerful feature vectors are compressed by the pooling layers of the network, which usually implement the pooling operation in a rather unsophisticated manner. This can lead to significant information loss, especially when the informative content of the data is sequentially distributed over the spatial or temporal dimension, e.g., in videos, which often require extracting fine-grained temporal information. A novel stateful recurrent pooling approach that can overcome the aforementioned limitations is proposed in this paper. The proposed method is inspired by the well-known Bag-of-Features (BoF) model, but employs a stateful trainable recurrent quantizer instead of plain static quantization, allowing it to efficiently process sequential data and to encode both their temporal and spatial aspects. The effectiveness of the proposed Recurrent BoF model at encoding spatio-temporal information, compared to other competitive methods, is demonstrated on six different datasets and two different tasks.
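To make the pooling idea concrete, the following is a minimal, hypothetical sketch of how a recurrent Bag-of-Features pooling layer could look in PyTorch. It is not the authors' implementation: the GRU-based recurrent quantizer, the layer sizes, and all names (RecurrentBoFPooling, feature_dim, n_codewords, hidden_dim) are illustrative assumptions; only the general idea of replacing static soft quantization with a stateful, trainable assignment over learnable codewords is taken from the abstract.

```python
# Minimal sketch of a recurrent Bag-of-Features (BoF) pooling layer.
# NOT the authors' code: the GRU-based quantizer and all sizes/names
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RecurrentBoFPooling(nn.Module):
    """Aggregates a sequence of feature vectors into a fixed-length
    histogram over learnable codewords, using a stateful recurrent
    quantizer instead of a static (stateless) soft assignment."""

    def __init__(self, feature_dim: int, n_codewords: int, hidden_dim: int = 128):
        super().__init__()
        # Learnable BoF codewords (the "dictionary").
        self.codewords = nn.Parameter(torch.randn(n_codewords, feature_dim))
        # Stateful recurrent quantizer (assumed here to be a GRU cell).
        self.rnn = nn.GRUCell(feature_dim, hidden_dim)
        # Maps the recurrent state to per-codeword assignment logits.
        self.assign = nn.Linear(hidden_dim, n_codewords)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (batch, seq_len, feature_dim), e.g. per-frame CNN vectors.
        batch, seq_len, _ = features.shape
        h = features.new_zeros(batch, self.rnn.hidden_size)
        histogram = features.new_zeros(batch, self.codewords.shape[0])
        for t in range(seq_len):
            x_t = features[:, t, :]
            h = self.rnn(x_t, h)  # update the quantizer state with the new input
            # Similarity of the current feature vector to each codeword ...
            sim = -torch.cdist(
                x_t.unsqueeze(1),
                self.codewords.unsqueeze(0).expand(batch, -1, -1),
            ).squeeze(1)
            # ... modulated by the recurrent state before the soft assignment.
            memberships = F.softmax(sim + self.assign(h), dim=-1)
            histogram = histogram + memberships
        # Normalized histogram: a fixed-length, order-aware representation.
        return histogram / seq_len


# Example usage: pool 32 per-frame feature vectors of size 512 into a
# 64-bin histogram for a batch of 8 (hypothetical) video clips.
pool = RecurrentBoFPooling(feature_dim=512, n_codewords=64)
frames = torch.randn(8, 32, 512)
print(pool(frames).shape)  # torch.Size([8, 64])
```

In this sketch each per-frame feature vector first updates the quantizer state and is then softly assigned to the codewords, so the resulting histogram reflects the order of the frames rather than only their average content; this is one plausible reading of the stateful recurrent quantizer described in the abstract, not a statement of the paper's exact architecture.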
Original language: English
Article number: 107380
Journal: Pattern Recognition
Volume: 106
Publication status: Published - Oct 2020
Publication type: A1 Journal article-refereed

Keywords

  • Bag-of-Features
  • Recurrent neural networks
  • Pooling operators
  • Activity recognition

Publication forum classification

  • Publication forum level 3
