Abstract
This paper presents an efficient algorithm for large-scale multi-system learning tasks. The proposed architecture, referred to as the 'RBFxSOM', is based on the SOM2, that is, a 'SOM of SOMs'. As in the modular network SOM (mnSOM) with multilayer perceptron modules (MLP-mnSOM), the aim of the RBFxSOM is to organize a continuous map of nonlinear functions representing the multi-class input-output relations of the given datasets. By adopting the algorithm of the SOM2, the RBFxSOM generates a map much faster than the original mnSOM, and without the local minima problem. In addition, the RBFxSOM can be applied to more difficult cases that are not easily dealt with by the MLP-mnSOM; in particular, it can handle cases in which the probability density of the inputs depends on the classes, a situation that arises more often as the input dimension increases. The RBFxSOM therefore overcomes many of the problems inherent in the MLP-mnSOM, and this is crucial for application to large-scale tasks. Simulation results with artificial datasets and a meteorological dataset confirm the performance of the RBFxSOM.