On the initialization of long short-term memory networks

Mostafa Mehdipour Ghazi*, Mads Nielsen, Akshay Pai, Marc Modat, M. Jorge Cardoso, Sébastien Ourselin, Lauge Sørensen

*Corresponding author of this work

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer review

5 Citations (Scopus)

Abstract

Weight initialization is important for the convergence speed and stability of deep neural network training. In this paper, a robust initialization method is developed to address training instability in long short-term memory (LSTM) networks. It is based on a normalized random initialization of the network weights that aims to keep the variance of the network input and output in the same range. The method is applied to standard LSTMs for univariate time series regression and to LSTMs robust to missing values for multivariate disease progression modeling. The results show that in all cases, the proposed initialization method outperforms state-of-the-art initialization techniques in terms of training convergence and the generalization performance of the obtained solution.
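The abstract describes a variance-preserving random initialization for LSTM weights. As a minimal illustrative sketch only (the paper's exact scaling factors are not given here), the following assumes a fan-in-scaled zero-mean normal scheme, in the spirit of Glorot/He initialization, applied to the LSTM's input and recurrent weight matrices:

```python
import numpy as np

def normalized_lstm_init(input_size, hidden_size, rng=None):
    """Sketch of a variance-preserving random initialization for one LSTM layer.

    Weights are drawn from a zero-mean normal distribution with variance scaled
    by the fan-in of each matrix, so that the variance of the pre-activations
    stays in the same range as the variance of the inputs. The 1/fan_in scaling
    is an assumption for illustration; the paper's exact factors may differ.
    """
    rng = rng or np.random.default_rng(0)
    # One weight matrix per gate (input, forget, cell, output), stacked row-wise.
    W = rng.normal(0.0, np.sqrt(1.0 / input_size), size=(4 * hidden_size, input_size))
    U = rng.normal(0.0, np.sqrt(1.0 / hidden_size), size=(4 * hidden_size, hidden_size))
    b = np.zeros(4 * hidden_size)
    return W, U, b

W, U, b = normalized_lstm_init(input_size=64, hidden_size=128)
# For standard-normal inputs, the pre-activation variance stays near 1:
x = np.random.default_rng(1).normal(size=(64, 10000))
print(np.var(W @ x))  # close to 1
```

With this scaling, each pre-activation element has variance fan_in * (1/fan_in) * Var(x) = Var(x), which is the variance-preservation property the abstract refers to.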

Original language: English
Title: Neural Information Processing - 26th International Conference, ICONIP 2019, Proceedings
Editors: Tom Gedeon, Kok Wai Wong, Minho Lee
Number of pages: 12
Publisher: Springer VS
Publication date: 2019
Pages: 275-286
ISBN (Print): 9783030367077
DOI
Status: Published - 2019
Event: 26th International Conference on Neural Information Processing, ICONIP 2019 - Sydney, Australia
Duration: 12 Dec 2019 - 15 Dec 2019

Conference

Conference: 26th International Conference on Neural Information Processing, ICONIP 2019
Country/Territory: Australia
City: Sydney
Period: 12/12/2019 - 15/12/2019
Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11953 LNCS
ISSN: 0302-9743
