Learning Vector Quantization (LVQ) [1], a supervised classification algorithm, will be used as the neural algorithm to optimize.

An LVQ network consists of an array of labeled vectors, which
are conventionally called *weight vectors*. Each label corresponds
to a class.

During training, each labeled vector in the training sample is presented
to the network. The closest weight vector (usually called the *winning*
vector or neuron) is computed and then updated: it is moved closer to the
input vector if both belong to the same class, and moved away from it if
they belong to different classes.
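The update step described above can be sketched as follows. This is a minimal illustration of the basic LVQ1 rule, not the exact procedure used here; the function name, the learning rate, and the number of epochs are illustrative assumptions.

```python
import numpy as np

def lvq1_train(X, y, weights, labels, lr=0.1, epochs=10):
    """Sketch of the LVQ1 update rule.

    X: (n, d) training vectors; y: (n,) their class labels.
    weights: (m, d) initial weight vectors; labels: (m,) their classes.
    """
    W = weights.astype(float).copy()
    for _ in range(epochs):
        for x, c in zip(X, y):
            # find the winning (closest) weight vector
            j = np.argmin(np.linalg.norm(W - x, axis=1))
            if labels[j] == c:
                W[j] += lr * (x - W[j])  # same class: attract
            else:
                W[j] -= lr * (x - W[j])  # different class: repel
    return W
```

Classification of a new vector then simply assigns the label of its winning weight vector.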

LVQ usually performs well when the initial weight
vectors are close to their final values. For this reason, heuristic
procedures are used for weight initialization, such as Kohonen's self-organizing map (SOM) or another vector quantization procedure, such as *k-means*.
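One way to realize the *k-means* initialization mentioned above is to run k-means separately on the samples of each class and use the resulting centroids, tagged with their class label, as the initial weight vectors. The sketch below (function names and the number of centroids per class are illustrative assumptions) uses a minimal Lloyd's-algorithm loop to stay self-contained.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal Lloyd's algorithm; returns k centroids of X."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each point to its nearest centroid
        a = np.argmin(np.linalg.norm(X[:, None] - C[None, :], axis=2), axis=1)
        for j in range(k):
            pts = X[a == j]
            if len(pts):
                C[j] = pts.mean(axis=0)
    return C

def init_codebook(X, y, per_class=2, seed=0):
    """Per-class k-means: centroids become the initial weight vectors."""
    W, labels = [], []
    for c in np.unique(y):
        W.append(kmeans(X[y == c], per_class, seed=seed))
        labels += [c] * per_class
    return np.vstack(W), np.array(labels)
```

Varying the seed yields the several sets of initial weight vectors from which the best solution is chosen.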

The number of weight vectors remains fixed during training, and is usually set prior to training to be equal to the number of classes, or to a fixed number of weight vectors per class. Several sets of initial weight vectors are tested, and the best solution is chosen.

Mon Nov 21 11:09:09 MET 1994