INEB
Title: Neural networks trained with the EEM algorithm: Tuning the smoothing parameter
Publication Type: Journal Article
Year of Publication: 2005
Authors: Santos, JM; Marques de Sá, J; Alexandre, LA
Journal: WSEAS Transactions on Systems (WSEAS Trans. Syst.)
Volume: 4
Issue: 4
Pagination: 295-299
Date Published: 2005
ISSN: 1109-2777
Keywords: cost function; entropy; Error Entropy Minimization (EEM) algorithm; learning algorithms; learning systems; neural networks; optimization; Parzen window; probability density function; smoothing parameter
Abstract: The training of neural networks, and of Multi-Layer Perceptrons (MLPs) in particular, is performed by minimizing an error function, usually known as the cost function. In our previous work we applied the Error Entropy Minimization (EEM) algorithm, and an optimized version of it, to classification, using as cost function the entropy of the errors between the network outputs and the desired targets. One of the difficulties in implementing the EEM algorithm is the choice of the smoothing parameter, also known as the window size, of the Parzen window probability density function estimate used to compute the entropy and its gradient. We present here a formula that yields the value of the smoothing parameter as a function of the number of data samples and of the neural network output dimension. Several experiments with real data sets were carried out to show the validity of the proposed formula.
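The abstract hinges on a Parzen-window estimate of the error entropy, with the smoothing parameter h as the tunable quantity. The sketch below shows how such an estimate is typically computed, assuming a Gaussian kernel and Rényi's quadratic entropy (the usual choice in the EEM literature); the paper's actual tuning formula for h is not reproduced here, and the function name and sample data are illustrative only.

import numpy as np

def renyi_quadratic_entropy(errors: np.ndarray, h: float) -> float:
    """Parzen-window estimate of Renyi's quadratic entropy of the errors.

    errors : (N, d) array of output-target differences e_i = y_i - t_i.
    h      : smoothing parameter (Parzen window size) -- the quantity
             the paper's formula selects from N and d.

    A minimal sketch assuming a Gaussian kernel, as is standard in
    EEM-style training; this is not the paper's own implementation.
    """
    n, d = errors.shape
    # Pairwise differences e_i - e_j, shape (N, N, d).
    diffs = errors[:, None, :] - errors[None, :, :]
    sq_dists = np.sum(diffs ** 2, axis=-1)
    # Gaussian kernel evaluated at the pairwise differences; the
    # convolution of two Gaussians of width h has width sqrt(2)*h,
    # hence variance 2*h**2 in the normalization.
    norm = (2.0 * np.pi * (2.0 * h ** 2)) ** (d / 2.0)
    kernel = np.exp(-sq_dists / (4.0 * h ** 2)) / norm
    # Information potential V = (1/N^2) sum_ij G(e_i - e_j); H = -log V.
    v = kernel.mean()
    return -np.log(v)

# Hypothetical usage: errors of a 2-output classifier on 100 samples.
rng = np.random.default_rng(0)
e = rng.normal(scale=0.1, size=(100, 2))
print(renyi_quadratic_entropy(e, h=0.5))

Because the kernel width enters both the entropy value and its gradient, an over- or under-sized h respectively over-smooths or fragments the estimated error density, which is why a principled choice of h, as proposed in the paper, matters for EEM training.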
URL: http://www.scopus.com/inward/record.url?eid=2-s2.0-21844450579&partnerID=40&md5=31c652daee111a68194cd4f006d7ca7b