INEB
Title: Data classification with multilayer perceptrons using a generalized error function
Publication Type: Journal Article
Year of Publication: 2008
Authors: Silva, LM; Marques de Sá, J; Alexandre, LA
Journal: Neural Networks (Neural Netw.)
Volume: 21
Issue: 9
Pagination: 1302-1310
Date Published: 2008
ISSN: 0893-6080
Keywords: algorithm, Algorithms, analytical error, article, calculation, Classification, Complementary features, Cybernetics, data analysis, Data classification, Data classifications, Data Interpretation, Statistical, Electric loads, entropy, Error functions, learning, Learning processes, mathematical analysis, Mathematical properties, Models, Statistical, Multilayer neural networks, Multilayer perceptron, Multilayer perceptrons, Multilayers, Neural networks, Neural Networks (Computer), Pattern recognition systems, perceptron, Performance improvements, priority journal, Probability density function
Abstract: The learning process of a multilayer perceptron requires the optimization of an error function E(y, t) comparing the predicted output, y, and the observed target, t. We review some usual error functions, analyze their mathematical properties for data classification purposes, and introduce a new one, E_Exp, inspired by the Z-EDM algorithm that we have recently proposed. An important property of E_Exp is its ability to emulate the behavior of other error functions by the sole adjustment of a real-valued parameter. In other words, E_Exp is a sort of generalized error function embodying complementary features of the other functions. The experimental results show that the flexibility of the new, generalized error function allows one to match the best results achievable with the other functions, with a performance improvement in some cases. © 2008 Elsevier Ltd. All rights reserved.
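The abstract describes an error function whose behavior is tuned by a single real-valued parameter. As a rough illustration of how one parameter can interpolate toward a standard cost such as mean squared error, the sketch below assumes an exponential form `tau * exp(sum((t - y)**2) / tau)`; this exact functional form, and the names `e_exp` and `grad_e_exp`, are assumptions for illustration only, not the paper's definition.

```python
import numpy as np

def e_exp(y, t, tau):
    """Illustrative exponential error (assumed form, not the paper's exact
    definition): E = tau * exp(S / tau), with S = sum((t - y)**2)."""
    return tau * np.exp(np.sum((t - y) ** 2) / tau)

def grad_e_exp(y, t, tau):
    """Gradient of the illustrative error with respect to the output y:
    dE/dy = exp(S / tau) * dS/dy = exp(S / tau) * (-2 * (t - y))."""
    S = np.sum((t - y) ** 2)
    return np.exp(S / tau) * (-2.0 * (t - y))

# For large tau, exp(S / tau) -> 1, so the gradient approaches the plain
# squared-error gradient -2 * (t - y): one real parameter steers the
# function toward MSE-like training behavior.
t = np.array([1.0, 0.0])
y = np.array([0.8, 0.3])
g_big_tau = grad_e_exp(y, t, tau=1e6)
g_mse = -2.0 * (t - y)
print(np.allclose(g_big_tau, g_mse, atol=1e-4))  # True
```

Smaller (or differently signed) values of the parameter would weight large deviations very differently, which is the kind of flexibility the abstract attributes to E_Exp.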
http://www.scopus.com/inward/record.url?eid=2-s2.0-54449100428&partnerID=40&md5=30f8de18c0272f4a440395045bd3c164