ANHIR: Automatic Non-Rigid Histological Image Registration Challenge

Jiří Borovec, Jan Kybic, Ignacio Arganda-Carreras, Dmitry V. Sorokin, Gloria Bueno, Alexander V. Khvostikov, Spyridon Bakas, Eric I-Chao Chang, Stefan Heldmann, Kimmo Kartasalo, Leena Latonen, Johannes Lotz, Michelle Noga, Sarthak Pati, Kumaradevan Punithakumar, Pekka Ruusuvuori, Andrzej Skalski, Nazanin Tahmasebi, Masi Valkonen, Ludovic Venet, Yizhe Wang, Nick Weiss, Marek Wodzinski, Yu Xiang, Yan Xu, Yan Yan, Paul Yushkevich, Shengyu Zhao, Arrate Muñoz-Barrutia

    Research output: Contribution to journal › Article › Scientific › peer-review

    94 Citations (Scopus)

    Abstract

    The Automatic Non-rigid Histological Image Registration (ANHIR) challenge was organized to compare the performance of image registration algorithms on several kinds of microscopy histology images in a fair and independent manner. We have assembled 8 datasets, containing 355 images with 18 different stains, resulting in 481 image pairs to be registered. Registration accuracy was evaluated using manually placed landmarks. In total, 256 teams registered for the challenge, 10 submitted results, and 6 participated in the workshop. Here, we present the results of 7 well-performing methods from the challenge together with 6 well-known existing methods. The best methods used a coarse but robust initial alignment followed by nonrigid registration, employed a multiresolution scheme, and were carefully tuned for the data at hand. They outperformed off-the-shelf methods, mostly by being more robust. The best methods could successfully register over 98% of all landmarks, and their mean landmark registration accuracy (target registration error, TRE) was 0.44% of the image diagonal. The challenge remains open to submissions and all images are available for download.
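
    As a rough illustration of how the landmark-based accuracy above can be interpreted, the sketch below computes a mean target registration error (TRE) normalized by the image diagonal; the 0.44% figure quoted in the abstract is expressed in these units. The function name and array conventions are hypothetical, and the official ANHIR evaluation additionally reports median and robustness statistics, so this is only a minimal sketch under those assumptions, not the challenge's evaluation code.

```python
import numpy as np

def mean_relative_tre(fixed_landmarks, warped_landmarks, image_shape):
    """Mean target registration error (TRE) as a fraction of the image diagonal.

    fixed_landmarks, warped_landmarks: (N, 2) arrays of corresponding (x, y)
    landmark coordinates -- the manually placed landmarks in the fixed image
    and the moving-image landmarks mapped through the estimated transform.
    image_shape: (height, width) of the fixed image in pixels.
    (Hypothetical helper; the official ANHIR evaluation differs in details.)
    """
    fixed = np.asarray(fixed_landmarks, dtype=float)
    warped = np.asarray(warped_landmarks, dtype=float)
    # Euclidean distance between each pair of corresponding landmarks.
    tre = np.linalg.norm(fixed - warped, axis=1)
    # Normalize by the image diagonal so differently sized slides are comparable.
    diagonal = np.hypot(image_shape[0], image_shape[1])
    return tre.mean() / diagonal

# A mean relative TRE of about 0.0044 corresponds to the 0.44% of the
# image diagonal reported for the best-performing methods.
```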

    Original language: English
    Pages (from-to): 3042-3052
    Number of pages: 11
    Journal: IEEE Transactions on Medical Imaging
    Volume: 39
    Issue number: 10
    DOIs
    Publication status: Published - 2020
    Publication type: A1 Journal article-refereed

    Funding

    Manuscript received November 22, 2019; revised March 21, 2020; accepted April 2, 2020. Date of publication April 7, 2020; date of current version September 30, 2020. The work of Jan Kybic was supported in part by OP VVV funding under Project CZ.02.1.01/0.0/0.0/16_019/0000765 and in part by the Research Center for Informatics and the Czech Science Foundation under Project 17-15361S. The work of Dmitry V. Sorokin and Alexander V. Khvostikov was supported by the Russian Science Foundation under Grant 17-11-01279. The work of Spyridon Bakas, Sarthak Pati, Ludovic Venet, and Paul Yushkevich was supported in part by the National Institutes of Health under Grant NIH/NCI/ITCR:U24-CA189523, Grant NIH/NIBIB:R01EB017255, Grant NIH/NIA:R01AG056014, and Grant NIH/NIA:P30AG010124. The work of Pekka Ruusuvuori was supported by the Academy of Finland under Project 313921 and Project 314558. The work of Michelle Noga, Kumaradevan Punithakumar, and Nazanin Tahmasebi was supported by WestGrid and Compute Canada. The work of Marek Wodzinski was supported by the Preludium Project funded by the National Science Centre in Poland under Grant UMO-2018/29/N/ST6/00143. The work of Arrate Muñoz-Barrutia was supported in part by the Spanish Ministry of Economy and Competitiveness under Grant TEC2015-73064-EXP, Grant TEC2016-78052-R, and Grant RTC-2017-6600-1 and in part by the 2017 Leonardo Grant for Researchers and Cultural Creators, BBVA Foundation. (Corresponding author: Jan Kybic.) Please see the Acknowledgment section of this article for the author affiliations.

    Keywords

    • Image registration
    • Microscopy

    Publication forum classification

    • Publication forum level 2

    ASJC Scopus subject areas

    • Software
    • Radiological and Ultrasound Technology
    • Computer Science Applications
    • Electrical and Electronic Engineering
