A Double-Layer ELM with Added Feature Selection Ability using a Sparse Bayesian Approach

Farkhondeh Kiaee, Christian Gagné and Hamid Sheikhzadeh


Abstract - The Sparse Bayesian Extreme Learning Machine (SBELM) has recently been proposed to reduce the number of units activated in the hidden layer. To deal with high-dimensional data, a novel sparse Bayesian Double-Layer ELM (DL-ELM) is proposed in this paper. The first layer of the proposed DL-ELM is based on a set of SBELM subnetworks that are separately applied to the features of the input layer. The second layer consists of a set of weight parameters that determine the contribution of each feature to the output. Adopting a Bayesian approach with Gaussian priors, the proposed method is sparse in both the hidden layer (of SBELM subnetworks) and the input layer. Sparseness in the input layer (i.e., pruning of irrelevant features) is achieved by decaying second-layer weights to zero, such that the contribution of the corresponding input feature is deactivated. The proposed framework thus enables simultaneous feature selection and classifier design at training time. Experimental comparisons on real benchmark data sets show that the proposed method provides efficient feature selection while producing a compact classification model with good accuracy and generalization properties.
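To illustrate the architecture described in the abstract, the following is a minimal NumPy sketch of the DL-ELM forward pass, not the authors' implementation: each input feature is fed to its own small ELM subnetwork (random hidden weights, learned output weights), and the subnetwork outputs are combined by second-layer feature weights. All shapes, names, and the sigmoid activation are illustrative assumptions; in the paper these weights are inferred with a sparse Bayesian procedure rather than drawn at random.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions for illustration): n samples, d features,
# H hidden units per feature subnetwork.
n, d, H = 100, 5, 10
X = rng.normal(size=(n, d))

# One SBELM subnetwork per input feature: random hidden weights/biases
# (fixed, as in standard ELM) and per-subnetwork output weights beta
# (learned in SBELM; random placeholders here).
W = rng.normal(size=(d, H))
b = rng.normal(size=(d, H))
beta = rng.normal(size=(d, H))

# Second-layer feature weights (learned; sparse under a Gaussian prior).
alpha = rng.normal(size=d)

# Input-layer sparsity: a feature whose second-layer weight has decayed
# to zero contributes nothing to the output (feature pruned).
alpha[2] = 0.0

def dl_elm_output(X, W, b, beta, alpha):
    """Forward pass: per-feature subnetworks combined by second-layer weights."""
    sub_out = np.empty(X.shape)
    for j in range(X.shape[1]):
        # Hidden activations of subnetwork j on feature j alone: (n, H).
        H_j = 1.0 / (1.0 + np.exp(-(X[:, [j]] * W[j] + b[j])))
        sub_out[:, j] = H_j @ beta[j]
    return sub_out @ alpha

y = dl_elm_output(X, W, b, beta, alpha)
```

Because `alpha[2]` is zero, perturbing or zeroing feature 3 leaves `y` unchanged, which is exactly the feature-deactivation mechanism the abstract describes.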



@article{Kiaee2016,
    author    = { Farkhondeh Kiaee and Christian Gagné and Hamid Sheikhzadeh },
    title     = { A Double-Layer ELM with Added Feature Selection Ability using a Sparse Bayesian Approach },
    journal   = { Neurocomputing },
    volume    = { 216 },
    pages     = { 371--380 },
    year      = { 2016 }
}

Last modification: Aug 11 2016 3:16PM by fakia1


©2002-. Computer Vision and Systems Laboratory. All rights reserved