Curriculum-based Teacher Ensemble for Robust Neural Network Distillation

Georgios Panagiotatos, Nikolaos Passalis, Alexandros Iosifidis, Moncef Gabbouj, Anastasios Tefas

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

1 Citation (Scopus)


Neural network distillation is used for transferring the knowledge from a complex teacher network into a lightweight student network, thereby improving the performance of the student network. However, neural distillation does not always lead to consistent results, with several factors affecting the efficiency of the knowledge distillation process. In this paper it is experimentally demonstrated that the selected teacher can indeed have a significant effect on knowledge transfer. To overcome this limitation, we propose a curriculum-based teacher ensemble that allows for performing robust and efficient knowledge distillation. The proposed method is motivated by the way that humans learn through a curriculum, and it is supported by recent findings that hint at the existence of critical learning periods in neural networks. The effectiveness of the proposed approach, compared to various distillation variants, is demonstrated using three image datasets and different network architectures.
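The knowledge transfer described in the abstract typically uses the standard distillation objective: the student is trained to match the teacher's temperature-softened output distribution. The sketch below shows that generic loss only, not the paper's curriculum-based ensemble scheme; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of logits;
    # a higher T yields a softer (more uniform) distribution.
    z = [l / T for l in logits]
    m = max(z)                                # shift for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between the softened teacher and student distributions,
    # scaled by T^2 as is conventional in knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the student reproduces the teacher's distribution exactly and grows as the two distributions diverge, which is why the choice of teacher matters for the transfer.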
Original language: English
Title of host publication: 2019 27th European Signal Processing Conference (EUSIPCO)
Number of pages: 5
ISBN (Electronic): 978-9-0827-9703-9
ISBN (Print): 978-1-5386-7300-3
Publication status: Published - Sep 2019
Publication type: A4 Article in conference proceedings
Event: European Signal Processing Conference

Publication series

Name: European Signal Processing Conference
ISSN (Print): 2219-5491
ISSN (Electronic): 2076-1465


Conference: European Signal Processing Conference


Keywords

  • neural network distillation
  • knowledge transfer
  • curriculum-based distillation
  • lightweight deep learning

Publication forum classification

  • Publication forum level 1


