DropELM: Fast neural network regularization with Dropout and DropConnect

Alexandros Iosifidis, Anastasios Tefas, Ioannis Pitas

    Research output: Contribution to journal › Article › Scientific › peer-reviewed

    49 Citations (Scopus)

    Abstract

    In this paper, we propose an extension of the Extreme Learning Machine (ELM) algorithm for Single-hidden Layer Feedforward Neural network training that incorporates Dropout and DropConnect regularization in its optimization process. We show that both types of regularization lead to the same solution for the calculation of the network output weights, which is adopted by the proposed DropELM network. The proposed algorithm thus exploits Dropout and DropConnect regularization without computationally intensive iterative weight tuning, and the adoption of such a regularization approach can lead to better solutions for the network output weights. We incorporate the proposed regularization approach in several recently proposed ELM algorithms and show that their performance can be enhanced without much additional computational cost.
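    To make the closed-form flavour of the method concrete, below is a minimal NumPy sketch of an ELM whose output-weight solution adds a dropout-style diagonal penalty to the usual ridge term. This is an illustrative assumption about the regularizer's form, not the paper's exact formulation; the tanh activation, the drop_p/(1 - drop_p) scaling, and all function names are hypothetical.

    import numpy as np

    def drop_elm_train(X, T, n_hidden=100, drop_p=0.2, lam=1e-2, seed=None):
        """Sketch: ELM with a dropout-style closed-form regularizer.

        X: d x N input matrix (columns are samples), T: c x N target matrix.
        Assumption (not from the paper's text): marginalizing dropout with
        rate drop_p over the hidden activations adds a diagonal penalty
        proportional to diag(H H^T) to the standard ridge term.
        """
        rng = np.random.default_rng(seed)
        # Random input weights and biases, fixed after initialization (standard ELM).
        W_in = rng.standard_normal((n_hidden, X.shape[0]))
        b = rng.standard_normal((n_hidden, 1))
        H = np.tanh(W_in @ X + b)          # hidden-layer outputs, n_hidden x N
        G = H @ H.T
        # Closed-form output weights: ridge term plus dropout-induced diagonal term.
        reg = lam * np.eye(n_hidden) + (drop_p / (1.0 - drop_p)) * np.diag(np.diag(G))
        W_out = np.linalg.solve(G + reg, H @ T.T)   # n_hidden x c
        return W_in, b, W_out

    Prediction then amounts to W_out.T @ np.tanh(W_in @ x + b), and setting drop_p = 0 recovers the standard ridge-regularized ELM output-weight solution; no iterative weight tuning is involved in either case.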

    Original language: English
    Pages (from-to): 57-66
    Number of pages: 10
    Journal: Neurocomputing
    Volume: 162
    DOIs
    Publication status: Published - 25 Aug 2015
    Publication type: A1 Journal article-refereed

    Keywords

    • DropConnect
    • Dropout
    • Extreme Learning Machine
    • Regularization
    • Single Hidden Layer Feedforward Networks

    ASJC Scopus subject areas

    • Artificial Intelligence
    • Computer Science Applications
    • Cognitive Neuroscience
