INEB
Title: Error entropy minimization for LSTM training
Publication Type: Conference Paper
Year of Publication: 2006
Authors: Alexandre, LA, Marques de Sá, JP
Series Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4131 LNCS - I
Pages: 244-253
City: Athens
ISSN/ISBN: 0302-9743 (ISSN); 3-540-38625-4 (ISBN); 978-3-540-38625-4 (ISBN-13)
Keywords: Algorithms, Convergence of numerical methods, Cost function, Entropy, Error Entropy Minimization, Errors, Long Short-Term Memory (LSTM), Neural networks, Optimization
Abstract: In this paper we present a new training algorithm for the Long Short-Term Memory (LSTM) recurrent neural network. This algorithm uses entropy instead of the usual mean squared error as the cost function for the weight update. More precisely, we use the Error Entropy Minimization (EEM) approach, where the entropy of the error is minimized after each symbol is presented to the network. Our experiments show that this approach enables the convergence of the LSTM more frequently than the traditional learning algorithm does. This in turn relaxes the burden of parameter tuning, since learning is achieved for a wider range of parameter values. The use of EEM also reduces, in some cases, the number of epochs needed for convergence. © Springer-Verlag Berlin Heidelberg 2006.
URL: http://www.scopus.com/inward/record.url?eid=2-s2.0-33749837658&partnerID=40&md5=5f40159a3f1faf4760405dd581e79bf7
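
Note: the record carries no implementation detail beyond the abstract. As a rough illustration of the cost function the abstract describes, the sketch below estimates Rényi's quadratic entropy of the error samples with a Gaussian Parzen window, which is the standard formulation of Error Entropy Minimization. The function name, the kernel width sigma, the sample values, and the use of NumPy are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def renyi_quadratic_entropy(errors, sigma=0.5):
    """Parzen-window estimate of Renyi's quadratic entropy H2 of the
    error samples, the quantity minimized in EEM-style training.
    `errors` holds e_i = target_i - output_i; `sigma` is the Gaussian
    kernel width (a tuning parameter, chosen here arbitrarily)."""
    e = np.asarray(errors, dtype=float)
    n = e.size
    # Pairwise differences e_i - e_j for the double kernel sum.
    diff = e[:, None] - e[None, :]
    # Convolving two Gaussian kernels of variance sigma^2 gives one of
    # variance 2*sigma^2, hence the 4*sigma^2 terms below.
    kernel = np.exp(-diff**2 / (4.0 * sigma**2)) / np.sqrt(4.0 * np.pi * sigma**2)
    # Information potential V = (1/N^2) * sum_ij kernel(e_i - e_j);
    # H2 = -log(V), so minimizing H2 is maximizing V.
    v = kernel.sum() / n**2
    return -np.log(v)

# Example: entropy of the errors after presenting a batch of symbols.
errors = np.array([0.1, -0.2, 0.05, 0.15])
print(renyi_quadratic_entropy(errors))
```

In a training loop, the gradient of this estimate with respect to the network weights would replace the mean-squared-error gradient in the weight update; that substitution, rather than any change to the LSTM architecture itself, is the idea the abstract summarizes.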