Parikh, CR, Pont, MJ, Li, Yuhua and Jones, NB (2000) Investigating the performance of MLP classifiers where limited training data are available for some classes. In: SOFT COMPUTING TECHNIQUES AND APPLICATIONS. UNSPECIFIED. 6 pp. [Conference contribution]
Full text not available from this repository.
The standard implementation of the back-propagation training algorithm for multi-layer Perceptron (MLP) neural networks assumes that there are an equal number of samples for training each of the required classes. Where limited training data are available for one (or more) classes, sub-optimal performance may be obtained. We have demonstrated in a previous study [Parikh et al., 1999. Proceedings of Condition Monitoring 1999, Swansea, UK] that, where unequal training class sizes cannot be avoided, performance of the classifier may be substantially improved by duplicating the available patterns in the smaller class. In this study, we investigate whether the addition of random noise to the 'duplicated' training patterns further improves classification performance. We conclude that the addition of noise does not give a consistent improvement in performance.
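The full text is not available here, so the paper's exact procedure cannot be reproduced; the following is only a minimal sketch of the idea the abstract describes: balancing class sizes by duplicating minority-class training patterns, optionally jittering the duplicates with Gaussian noise. The function name, the `noise_std` parameter, and the use of Gaussian noise are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def oversample_minority(X, y, minority_label, noise_std=0.0, rng=None):
    """Duplicate minority-class patterns until class sizes are equal.

    noise_std=0.0 corresponds to plain duplication (the earlier study);
    noise_std>0 adds random noise to the duplicates (the question
    investigated here). Both the name and the Gaussian-noise choice
    are illustrative assumptions.
    """
    rng = np.random.default_rng(rng)
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)

    minority = X[y == minority_label]
    largest = max(np.sum(y == lbl) for lbl in np.unique(y))
    deficit = largest - len(minority)
    if deficit <= 0:
        return X, y  # already balanced

    # Draw duplicates (with replacement) from the minority class.
    idx = rng.integers(0, len(minority), size=deficit)
    dup = minority[idx]
    if noise_std > 0:
        dup = dup + rng.normal(0.0, noise_std, size=dup.shape)

    X_bal = np.vstack([X, dup])
    y_bal = np.concatenate([y, np.full(deficit, minority_label)])
    return X_bal, y_bal
```

With `noise_std=0` the duplicated patterns are exact copies; a small positive `noise_std` perturbs each copy, which is the variation whose benefit the abstract reports as inconsistent.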
Item Type: Conference contribution (Paper)
Keywords: Multi-layer Perceptron; training algorithm; condition monitoring; fault diagnosis
Faculties and Schools: Faculty of Computing & Engineering > School of Computing and Intelligent Systems
Research Institutes and Groups: Computer Science Research Institute > Intelligent Systems Research Centre
Deposited By: Dr Yuhua Li
Deposited On: 09 Mar 2010 16:14
Last Modified: 09 May 2016 10:51