The Self-Organizing Map (SOM), with its variants, is the most popular artificial neural network algorithm in the unsupervised learning category. This book provides an overview of self-organizing map formation, including recent developments; self-organizing maps form a branch of unsupervised learning. The SOM algorithm was introduced by the author, and its theory and many applications form one of the major approaches to unsupervised learning.
The results show that our proposed method yields promising results, with better average accuracy and quantisation errors than the other methods, as well as convincing significance tests.

Introduction

In a classification process, large classes of objects are normally separated into smaller classes. This can be very complicated because of the challenge of identifying the separation criteria, especially for procedures involving complex data structures.
In this scenario, machine learning (ML) techniques have been introduced by many researchers as alternative solutions to the above problems. Various applications of artificial neural networks (ANNs) to practical problems such as meteorological forecasting, image processing, and agriculture are discussed in [2–4].
In an ANN model, simple neurons are connected together to form a network. While a neural network does not have to be adaptive, its advantages arise when proper algorithms update the weights of the connections to produce a desired output.
ANN and evolutionary computation methodologies have each been proven effective in solving certain classes of problems. For example, neural networks are very efficient at mapping input vectors to output vectors, and evolutionary algorithms are very useful for optimisation.
ANN weaknesses can be addressed either by enhancing the structure of the ANN itself or by hybridising it with evolutionary optimisation [5, 6]. Evolutionary computation comprises population-based optimisation techniques such as evolutionary algorithms (EA) and swarm intelligence (SI).
One of the techniques used in EA is the genetic algorithm (GA), inspired by mechanisms of biological evolution such as inheritance, mutation, selection, and crossover.
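As a minimal illustration of these operators (the one-max objective and all parameter values here are hypothetical, not taken from the paper), one GA generation of selection, crossover, and mutation might be sketched as:

```python
import random

random.seed(0)

def fitness(bits):
    # Toy objective ("one-max"): maximise the number of 1-bits.
    return sum(bits)

def evolve(pop_size=20, n_bits=16, generations=40, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            # Selection: size-2 tournaments pick two fitter parents.
            p1 = max(random.sample(pop, 2), key=fitness)
            p2 = max(random.sample(pop, 2), key=fitness)
            # Crossover: single cut point combines the parents.
            cut = random.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]
            # Mutation: flip each bit with small probability.
            child = [bit ^ (random.random() < p_mut) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Selection pressure drives the population toward all-ones strings; mutation keeps exploring, so the final best is near the optimum of 16 rather than guaranteed to reach it.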
Implementing the search with an evolutionary method, for example for ANN learning, may overcome the handicaps of gradient-based methods. However, convergence is in general much slower, since these are general-purpose methods. Kennedy and Eberhart [7] proposed a very simple nonlinear optimisation technique called particle swarm optimisation (PSO), which requires little computational cost.
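A bare-bones sketch of the PSO velocity and position updates follows; the objective function, inertia weight, and acceleration coefficients are illustrative values, not ones specified in the text:

```python
import random

random.seed(1)

def pso(f, dim=2, n_particles=15, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Positions start randomly in [-5, 5]; velocities start at zero.
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]           # each particle's personal best
    gbest = min(pos, key=f)[:]            # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
            if f(pos[i]) < f(gbest):
                gbest = pos[i][:]
    return gbest

# Toy objective: minimise the sphere function, optimum at the origin.
sphere = lambda x: sum(v * v for v in x)
best = pso(sphere)
print(best)
```

The per-particle cost is a handful of multiply-adds per dimension, which is the "little computational cost" the method is known for.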
Early studies show that multistrategy learning with a PSO-SOM approach was first introduced by Shi and Eberhart [8], using a modified particle swarm optimiser.
Subsequently, Xiao et al. adopted this factor, which is valuable as a competitive learning technique because it reduces the number of epochs necessary to produce a robust solution. The authors suggested using different distance metrics when calculating the distance between the input vectors and each member of the swarm, to produce competitive results for data classification.
However, that study did not consider different types of SOM lattice structure. A further problem emerged in generating image classes that provide a concise visualisation of the image dataset.
The authors used a growing grid structure in the Kohonen network to learn the sample data through a mapping grid, and PSO to probe the optimum fitting points on the surface. In that study, the proposed Kohonen network was a 3D rectangular map enhanced with the growing grid method.
They form a discrete approximation of the distribution of the training samples: more neurons point to regions with a high concentration of training samples, and fewer to regions where samples are scarce.
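This density-matching behaviour can be seen in a minimal SOM training loop. The grid size, learning-rate and radius schedules, and the two-cluster toy dataset below are illustrative assumptions, not details from the text:

```python
import math
import random

random.seed(2)

def train_som(data, rows=5, cols=5, iters=2000, lr0=0.5, radius0=2.5):
    dim = len(data[0])
    # One weight vector per grid node, initialised randomly.
    w = {(r, c): [random.random() for _ in range(dim)]
         for r in range(rows) for c in range(cols)}
    for t in range(iters):
        x = random.choice(data)
        # Decay learning rate and neighbourhood radius over time.
        frac = t / iters
        lr = lr0 * (1 - frac)
        radius = max(radius0 * (1 - frac), 0.5)
        # Best-matching unit: node whose weights are closest to x.
        bmu = min(w, key=lambda n: sum((w[n][d] - x[d]) ** 2 for d in range(dim)))
        for n, wn in w.items():
            grid_d2 = (n[0] - bmu[0]) ** 2 + (n[1] - bmu[1]) ** 2
            h = math.exp(-grid_d2 / (2 * radius ** 2))  # Gaussian neighbourhood
            for d in range(dim):
                wn[d] += lr * h * (x[d] - wn[d])
    return w

# Two clusters: most samples near (0, 0), a few near (1, 1).
data = [[random.gauss(0.0, 0.05), random.gauss(0.0, 0.05)] for _ in range(90)]
data += [[random.gauss(1.0, 0.05), random.gauss(1.0, 0.05)] for _ in range(10)]
som = train_som(data)
# More nodes settle near the dense cluster than near the sparse one.
near_dense = sum(1 for v in som.values() if v[0] ** 2 + v[1] ** 2 < 0.25)
print(near_dense)
```

After training, counting the nodes that land within 0.5 of each cluster centre shows the dense cluster attracting the larger share of the grid, which is exactly the discrete density approximation described above.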
Originally, the SOM was not formulated as a solution to an optimisation problem. Nevertheless, there have been several attempts to modify the definition of the SOM and to formulate an optimisation problem that gives similar results.
The generative topographic map (GTM) is topology preserving in the sense that it explicitly requires a smooth and continuous mapping from the input space to the map space; in a practical sense, however, this measure of topological preservation is lacking. The time adaptive self-organising map (TASOM) includes a scaling parameter to make the network invariant to scaling, translation, and rotation of the input space. The TASOM and its variants have been used in several applications, including adaptive clustering, multilevel thresholding, input-space approximation, and active contour modelling. The growing self-organising map (GSOM) starts with a minimal number of nodes (usually four) and grows new nodes on the boundary based on a heuristic.
By using a value called the spread factor, the data analyst can control the growth of the GSOM. The elastic maps approach borrows from spline interpolation the idea of minimising an elastic energy.
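In the commonly cited GSOM formulation, the spread factor SF in (0, 1) maps to a growth threshold GT = -D·ln(SF), where D is the data dimensionality; a node whose accumulated quantisation error exceeds GT triggers the growth of new boundary nodes. A small sketch of this control knob (the function name and parameter values are illustrative):

```python
import math

def growth_threshold(spread_factor, dim):
    # Commonly cited GSOM rule: GT = -D * ln(SF), with SF in (0, 1).
    # Lower SF -> higher threshold -> less growth (a coarser map);
    # SF close to 1 -> threshold near 0 -> aggressive growth.
    if not 0.0 < spread_factor < 1.0:
        raise ValueError("spread factor must lie strictly between 0 and 1")
    return -dim * math.log(spread_factor)

# A node grows new neighbours once its accumulated error exceeds GT.
for sf in (0.1, 0.5, 0.9):
    print(sf, growth_threshold(sf, dim=3))
```

Because the threshold scales with the dimensionality D, the same spread factor yields comparable map resolutions across datasets of different dimension, which is what makes SF a dataset-independent knob for the analyst.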
In learning, it minimises the sum of the quadratic bending and stretching energies together with the least-squares approximation error. The conformal approach uses conformal mapping to interpolate each training sample between grid nodes on a continuous surface.
A one-to-one smooth mapping is possible in this approach.
The oriented and scalable map (OS-Map) generalises the neighbourhood function and the winner selection. The homogeneous Gaussian neighbourhood function is replaced with the matrix exponential.
Thus one can specify the orientation either in the map space or in the data space.