ABSTRACT
A recurrent neural network is presented which performs quadratic optimization subject to bound constraints on each of the optimization variables. The network is shown to be globally convergent, and conditions on the quadratic problem and the network parameters are established under which exponential asymptotic stability is achieved. Through suitable choice of the network parameters, the system of differential equations governing the network activations is preconditioned in order to reduce its sensitivity to noise and to roundoff errors. The optimization method employed by the neural network is shown to fall into the general class of gradient methods for constrained nonlinear optimization and, in contrast with penalty function methods, is guaranteed to yield only feasible solutions.
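The abstract does not give the network equations, but the core idea it describes — gradient dynamics on a quadratic cost with the iterate projected onto the box constraints at every step, so that only feasible points are ever produced — can be sketched with a simple discrete-time projected-gradient loop. The function name, parameters, and step size below are illustrative assumptions, not the paper's network.

```python
import numpy as np

def solve_box_qp(Q, c, lo, hi, eta=0.01, steps=5000):
    """Sketch of projected gradient descent for
        minimize 0.5 * x^T Q x + c^T x   subject to  lo <= x <= hi.
    Every iterate is clipped to the box, so the trajectory stays
    feasible throughout, unlike penalty-function methods."""
    x = np.clip(np.zeros_like(c), lo, hi)  # feasible starting point
    for _ in range(steps):
        grad = Q @ x + c                   # gradient of the quadratic cost
        x = np.clip(x - eta * grad, lo, hi)  # gradient step, then project onto the box
    return x
```

For example, with `Q = 2I`, `c = (-2, -2)` and the box `[0, 0.5]^2`, the unconstrained minimizer `(1, 1)` lies outside the box, and the iteration settles on the boundary point `(0.5, 0.5)` while remaining feasible at every step.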
ABSTRACT
The so-called "simple cells" in layer IV of feline primary visual cortex have been shown to have Gabor function spatial receptive field profiles (RFPs). Since Gabor functions are not mutually orthogonal, the decomposition of an image into Gabor function coefficients is usually performed by minimising some measure of the error between the original image and that reconstructed from the coefficients. A cortical relaxation model is proposed which performs this minimisation implicitly, and is used to examine the biological relevance and feasibility of reconstruction error minimisation.
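The minimisation the abstract refers to — fitting coefficients for a non-orthogonal Gabor dictionary by driving down the reconstruction error — can be illustrated with gradient descent on the squared error. This is a minimal one-dimensional sketch, not the proposed cortical relaxation model; the Gabor parameters and function names below are illustrative assumptions.

```python
import numpy as np

def gabor_1d(x, x0, freq, sigma):
    """Cosine-phase 1-D Gabor: a Gaussian envelope times a sinusoid."""
    return np.exp(-((x - x0) ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * freq * (x - x0))

def fit_coeffs(G, image, steps=5000):
    """Gradient descent on the reconstruction error 0.5 * ||image - G a||^2.
    Because the Gabor columns of G are not orthogonal, the coefficients
    cannot be read off by inner products; they are found iteratively."""
    L = np.linalg.eigvalsh(G.T @ G).max()  # Lipschitz constant of the gradient
    a = np.zeros(G.shape[1])
    for _ in range(steps):
        a -= (1.0 / L) * (G.T @ (G @ a - image))  # step along the negative error gradient
    return a
```

With a small dictionary of overlapping Gabors (columns of `G`), the loop converges to coefficients whose reconstruction `G @ a` matches the input signal closely, which is the error-minimisation behaviour the relaxation model is meant to perform implicitly.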