DAP: A dataset-agnostic predictor of neural network performance

Publication Name

Neurocomputing

Abstract

Training a deep neural network on a large dataset to convergence is a time-consuming task, and it often must be repeated many times, especially when developing a new deep learning algorithm or performing a neural architecture search. This problem can be mitigated if a deep neural network's performance can be estimated without actually training it. In this work, we investigate the feasibility of two tasks: (i) predicting a deep neural network's performance accurately given only its architectural descriptor, and (ii) generalizing the predictor across different datasets without re-training. To this end, we propose a dataset-agnostic regression framework that uses a novel dual-LSTM model and a new dataset difficulty feature. The experimental results show that both tasks are indeed feasible, and the proposed method outperforms existing techniques in all experimental cases. Additionally, we demonstrate several practical use cases of the proposed predictor.
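
To make the idea concrete, the sketch below shows one plausible shape of such a predictor in PyTorch: two LSTM encoders read a layer-wise architectural descriptor sequence, their final hidden states are fused with a scalar dataset-difficulty feature, and a small MLP head regresses the expected accuracy. This is a minimal illustration under assumed names, dimensions, and fusion scheme; the abstract does not specify the paper's actual DAP architecture, so none of this should be read as the authors' implementation.

```python
# Minimal sketch of a dual-LSTM performance predictor. All module names,
# dimensions, and the fusion scheme are illustrative assumptions; the
# paper's actual DAP design may differ.
import torch
import torch.nn as nn

class DualLSTMPredictor(nn.Module):
    def __init__(self, descriptor_dim=16, hidden_dim=64):
        super().__init__()
        # Two independent LSTM encoders over the same per-layer
        # architectural descriptor sequence (assumed layout).
        self.lstm_a = nn.LSTM(descriptor_dim, hidden_dim, batch_first=True)
        self.lstm_b = nn.LSTM(descriptor_dim, hidden_dim, batch_first=True)
        # MLP head: fused LSTM states + 1 dataset-difficulty scalar -> accuracy.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden_dim + 1, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
            nn.Sigmoid(),  # predicted accuracy in [0, 1]
        )

    def forward(self, arch_seq, difficulty):
        # arch_seq: (batch, num_layers, descriptor_dim) per-layer encodings
        # difficulty: (batch, 1) dataset-difficulty feature
        _, (h_a, _) = self.lstm_a(arch_seq)
        _, (h_b, _) = self.lstm_b(arch_seq)
        fused = torch.cat([h_a[-1], h_b[-1], difficulty], dim=-1)
        return self.head(fused)

# Usage: predict accuracy for a batch of 4 architectures, 10 layers each.
model = DualLSTMPredictor()
arch = torch.randn(4, 10, 16)
diff = torch.rand(4, 1)
print(model(arch, diff).shape)  # torch.Size([4, 1])
```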

Open Access Status

This publication may be available as open access

Volume

583

Article Number

127544

Funding Sponsor

Australian Research Council

Link to publisher version (DOI)

http://dx.doi.org/10.1016/j.neucom.2024.127544