Adaptive L2 regularization in person Re-identification

Xingyang Ni, Liang Fang, Heikki Huttunen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-reviewed

17 Citations (Scopus)

Abstract

We introduce an adaptive L2 regularization mechanism in the setting of person re-identification. In the literature, it is common practice to utilize hand-picked regularization factors which remain constant throughout the training procedure. Unlike existing approaches, the regularization factors in our proposed method are updated adaptively through backpropagation. This is achieved by incorporating trainable scalar variables as the regularization factors, which are further fed into a scaled hard sigmoid function. Extensive experiments on the Market-1501, DukeMTMC-reID and MSMT17 datasets validate the effectiveness of our framework. Most notably, we obtain state-of-the-art performance on MSMT17, which is the largest dataset for person re-identification. Source code is publicly available at https://github.com/nixingyang/AdaptiveL2Regularization.
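As a rough illustration of the mechanism described in the abstract, the sketch below shows how trainable regularization factors fed through a scaled hard sigmoid could be implemented in PyTorch. This is not the authors' implementation (see the linked repository for that); the class name AdaptiveL2Penalty, the initialization value, and the upper bound of 1e-3 are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveL2Penalty(nn.Module):
    """Adaptive L2 penalty for one weight tensor: the regularization factor
    is parameterized by a trainable scalar passed through a scaled hard
    sigmoid, so it stays in a bounded non-negative range and is updated by
    backpropagation together with the model weights (illustrative sketch)."""

    def __init__(self, init_value: float = 0.0, upper_bound: float = 1e-3):
        super().__init__()
        # Trainable scalar that parameterizes the regularization factor.
        self.raw_factor = nn.Parameter(torch.tensor(init_value))
        # Maximum value of the effective factor (assumed, not from the paper).
        self.upper_bound = upper_bound

    def forward(self, weight: torch.Tensor) -> torch.Tensor:
        # hardsigmoid maps the raw scalar into [0, 1]; scaling bounds the
        # effective regularization factor to [0, upper_bound].
        factor = self.upper_bound * F.hardsigmoid(self.raw_factor)
        return factor * weight.pow(2).sum()


# Hypothetical usage: one adaptive penalty per parameter tensor, with the
# penalties added to the task loss and optimized jointly with the model.
model = nn.Linear(2048, 751)          # stand-in for a Re-ID backbone/head
penalties = nn.ModuleList(AdaptiveL2Penalty() for _ in model.parameters())
optimizer = torch.optim.SGD(
    list(model.parameters()) + list(penalties.parameters()), lr=0.01
)

features = torch.randn(8, 2048)
logits = model(features)
task_loss = logits.mean()             # placeholder for the actual Re-ID loss
reg_loss = sum(p(w) for p, w in zip(penalties, model.parameters()))
(task_loss + reg_loss).backward()
optimizer.step()
```

The scaled hard sigmoid keeps each factor non-negative and capped, so the penalty cannot grow without bound, while gradients can still move each factor anywhere within its range during training.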

Original language: English
Title of host publication: Proceedings of ICPR 2020 - 25th International Conference on Pattern Recognition
Publisher: IEEE
Pages: 9601-9607
Number of pages: 7
ISBN (Electronic): 978-1-7281-8808-9
DOIs
Publication status: Published - 2020
Publication type: A4 Article in conference proceedings
Event: 25th International Conference on Pattern Recognition, ICPR 2020 - Virtual, Milan, Italy
Duration: 10 Jan 2021 - 15 Jan 2021

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 25th International Conference on Pattern Recognition, ICPR 2020
Country/Territory: Italy
City: Virtual, Milan
Period: 10/01/21 - 15/01/21

Publication forum classification

  • Publication forum level 1

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
