A structure optimization framework for feed-forward neural networks using sparse representation
RIS ID
110121
Abstract
Traditionally, optimizing the structure of a feed-forward neural network is time-consuming and requires balancing the trade-off between network size and network performance. In this paper, a sparse-representation-based framework, termed SRS, is introduced to generate a small-sized network structure without compromising network performance. Based on a forward selection strategy, the SRS framework selects significant elements (weights or hidden neurons) from the initial network that minimize the residual output error. The main advantage of the SRS framework is that it optimizes the network structure and the training performance simultaneously: the training error decreases as the number of selected elements increases. The efficiency and robustness of the SRS framework are evaluated on several benchmark datasets. Experimental results indicate that the SRS framework performs favourably compared with alternative structure optimization algorithms.
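To illustrate the greedy forward-selection idea the abstract describes, the sketch below selects hidden neurons one at a time so as to reduce the residual output error, in the style of orthogonal matching pursuit. It is a minimal illustration, not the paper's actual SRS implementation: the random-feature hidden layer, the function name select_neurons, and the least-squares refit of output weights are assumptions made for the example.

```python
import numpy as np

def select_neurons(H, y, k):
    """Greedy forward selection (sketch): pick columns of H (candidate
    hidden-neuron activations) that best reduce the residual output error."""
    selected = []
    residual = y.copy()
    for _ in range(k):
        # Score each candidate neuron by its correlation with the residual.
        scores = np.abs(H.T @ residual)
        scores[selected] = -np.inf          # do not re-select chosen neurons
        selected.append(int(np.argmax(scores)))
        # Refit the output weights on the selected neurons via least squares.
        Hs = H[:, selected]
        beta, *_ = np.linalg.lstsq(Hs, y, rcond=None)
        residual = y - Hs @ beta
    return selected, beta

# Toy usage on synthetic data with a random-feature hidden layer (assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
W = rng.normal(size=(5, 100))               # candidate input weights
H = np.tanh(X @ W)                          # candidate hidden activations
idx, beta = select_neurons(H, y, k=10)
print("selected neurons:", idx)
print("training RMSE:", np.sqrt(np.mean((H[:, idx] @ beta - y) ** 2)))
```

As in the abstract, each added neuron can only lower the training residual, so the example exhibits the same monotone relationship between the number of selected elements and the training error.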
Publication Details
Yang, J. & Ma, J. (2016). A structure optimization framework for feed-forward neural networks using sparse representation. Knowledge-Based Systems, 109, 61-70.