Transfer learning using a nonparametric sparse topic model

Ali Faisal, Jussi Gillberg, Gayle Leen, Jaakko Peltonen

    Research output: Article, scientific, peer-reviewed

    9 Citations (Scopus)
    1 Downloads (Pure)

    Abstract

    In many domains data items are represented by vectors of counts: count data arises, for example, in bioinformatics or in the analysis of text documents represented as word count vectors. However, often the amount of data available from an interesting data source is too small to model the data source well. When several data sets are available from related sources, exploiting their similarities by transfer learning can improve the resulting models compared to modeling sources independently. We introduce a Bayesian generative transfer learning model which represents similarity across document collections by sparse sharing of latent topics controlled by an Indian buffet process. Unlike a prominent previous model, hierarchical Dirichlet process (HDP) based multi-task learning, our model decouples topic sharing probability from topic strength, making sharing of low-strength topics easier. In experiments, our model outperforms the HDP approach both on synthetic data and in the first of two case studies on text collections, and achieves performance similar to the HDP approach in the second case study.
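    The abstract's core idea is that a binary sharing pattern and topic strengths are modeled separately. The following is a minimal illustrative sketch of that idea only, not the paper's actual model or inference: the Indian buffet process construction follows its standard definition, while the function name, the numpy usage, the Gamma strength prior, and all parameter values are assumptions chosen purely for illustration.

    # Sketch: an Indian buffet process draws a binary matrix deciding which topics
    # each collection uses; topic strengths are drawn from a separate prior, so
    # sharing probability is decoupled from topic strength.
    import numpy as np

    rng = np.random.default_rng(0)

    def indian_buffet_process(num_collections, alpha):
        """Sample a binary collection-by-topic matrix Z from an IBP prior."""
        Z = []           # one row per collection
        num_topics = 0
        for i in range(num_collections):
            row = []
            for k in range(num_topics):
                # Existing topic k is reused with probability m_k / (i + 1),
                # where m_k is the number of earlier collections using it.
                m_k = sum(z[k] for z in Z)
                row.append(bool(rng.random() < m_k / (i + 1)))
            # Each collection also introduces Poisson(alpha / (i + 1)) new topics.
            new = rng.poisson(alpha / (i + 1))
            row.extend([True] * new)
            for z in Z:
                z.extend([False] * new)
            Z.append(row)
            num_topics += new
        return np.array(Z, dtype=bool)

    # Hypothetical sizes, for the sketch only.
    Z = indian_buffet_process(num_collections=4, alpha=2.0)

    # Strengths come from a separate prior (a Gamma here, purely for illustration),
    # so a topic shared by few collections can still be strong where it is active.
    strengths = rng.gamma(shape=1.0, scale=1.0, size=Z.shape[1])
    theta = Z * strengths                                 # unnormalized topic weights
    row_sums = theta.sum(axis=1, keepdims=True)
    theta = np.divide(theta, row_sums,                    # per-collection mixing proportions
                      out=np.zeros_like(theta), where=row_sums > 0)
    print(Z.astype(int))
    print(np.round(theta, 3))

    In this sketch the sparsity pattern Z and the strengths are independent draws, which mirrors the abstract's point that a rarely shared topic is not forced to be weak, in contrast to HDP-style sharing where prevalence and strength are coupled.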

    Original language: English
    Pages: 124-137
    Number of pages: 14
    Journal: Neurocomputing
    Volume: 112
    DOI - permanent links
    Status: Published - 2013
    Ministry of Education (OKM) publication type: A1 Original article in a scientific journal

    Research areas

    • Latent Dirichlet allocation
    • Nonparametric Bayesian inference
    • Small sample size
    • Sparsity
    • Topic models
    • Transfer learning

    Publication forum level

    • Jufo level 1
