Minimax and Hamiltonian Dynamics of Excitatory-Inhibitory Networks

Part of Advances in Neural Information Processing Systems 10 (NIPS 1997)


Authors

H. Sebastian Seung, Tom Richardson, J. Lagarias, John J. Hopfield

Abstract

A Lyapunov function for excitatory-inhibitory networks is constructed. The construction assumes symmetric interactions within excitatory and inhibitory populations of neurons, and antisymmetric interactions between populations. The Lyapunov function yields sufficient conditions for the global asymptotic stability of fixed points. If these conditions are violated, limit cycles may be stable. The relations of the Lyapunov function to optimization theory and classical mechanics are revealed by minimax and dissipative Hamiltonian forms of the network dynamics.

The dynamics of a neural network with symmetric interactions provably converges to fixed points under very general assumptions[1, 2]. This mathematical result helped to establish the paradigm of neural computation with fixed point attractors[3]. But in reality, interactions between neurons in the brain are asymmetric. Furthermore, the dynamical behaviors seen in the brain are not confined to fixed point attractors, but also include oscillations and complex nonperiodic behavior. These other types of dynamics can be realized by asymmetric networks, and may be useful for neural computation. For these reasons, it is important to understand the global behavior of asymmetric neural networks.

The interaction between an excitatory neuron and an inhibitory neuron is clearly asymmetric. Here we consider a class of networks that incorporates this fundamental asymmetry of the brain's microcircuitry. Networks of this class have distinct populations of excitatory and inhibitory neurons, with antisymmetric interactions between populations and symmetric interactions within each population. Such networks display a rich repertoire of dynamical behaviors including fixed points, limit cycles[4, 5] and traveling waves[6].

After defining the class of excitatory-inhibitory networks, we introduce a Lyapunov function that establishes sufficient conditions for the global asymptotic stability of fixed points. The generality of these conditions contrasts with the restricted nature of previous convergence results, which applied only to linear networks[5], or to nonlinear networks with infinitely fast inhibition[7].
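
To make this sign structure concrete, the full interaction matrix of such a network can be written in block form (a sketch using the notation A, B, C defined in Section 1 below):

W = \begin{pmatrix} A & -B \\ B^T & -C \end{pmatrix}, \qquad A = A^T, \quad C = C^T,

where the diagonal blocks contain the symmetric interactions within the excitatory and inhibitory populations, and the off-diagonal blocks -B and B^T contain the antisymmetric interactions between populations.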

The use of the Lyapunov function is illustrated with a competitive or winner-take-all network, which consists of an excitatory population of neurons with recurrent inhibition from a single neuron[8]. For this network, the sufficient conditions for global stability of fixed points also happen to be necessary conditions. In other words, we have proved global stability over the largest possible parameter regime in which it holds, demonstrating the power of the Lyapunov function. There exists another parameter regime in which numerical simulations display limit cycle oscillations[7].
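
As a concrete illustration of this architecture (a sketch only; the rectification nonlinearity and the parametrization below are illustrative, not necessarily the paper's), a winner-take-all network with n excitatory neurons x_1, \ldots, x_n and a single inhibitory neuron y can be written as

\tau_x \dot{x}_i + x_i = \left[\, u_i + \alpha x_i - \beta y \,\right]^+, \qquad \tau_y \dot{y} + y = \beta \sum_j x_j,

where [z]^+ = \max(z, 0), \alpha sets the recurrent self-excitation, and \beta the gain of the inhibitory feedback loop. In the notation of Section 1, this corresponds to A = \alpha I, B = \beta \mathbf{1}, C = 0, and linear g.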

Similar convergence proofs for other excitatory-inhibitory networks may be obtained by tedious but straightforward calculations. All the necessary tools are given in the first half of the paper. But the rest of the paper explains what makes the Lyapunov function especially interesting, beyond the convergence results it yields: its role in a conceptual framework that relates excitatory-inhibitory networks to optimization theory and classical mechanics.

The connection between neural networks and optimization[3] was established by proofs that symmetric networks could find minima of objective functions[1, 2]. Later it was discovered that excitatory-inhibitory networks could perform the minimax computation of finding saddle points[9, 10, 11], though no general proof of this was given at the time. Our Lyapunov function finally supplies such a proof, and one of its components is the objective function of the network's minimax computation.
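
Concretely, if S(x, y) denotes the saddle objective, the minimax interpretation says that the flow descends S in the excitatory variables and ascends it in the inhibitory variables. Schematically, suppressing the nonlinearities f and g,

\tau_x \dot{x} \propto -\frac{\partial S}{\partial x}, \qquad \tau_y \dot{y} \propto +\frac{\partial S}{\partial y},

so that a fixed point of the dynamics is a saddle point solving \min_x \max_y S(x, y). This descent-ascent picture is a schematic summary, not the exact network equations.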

Our Lyapunov function can also be obtained by writing the dynamics of excitatory-inhibitory networks in Hamiltonian form, with extra velocity-dependent terms. If these extra terms are dissipative, then the energy of the system is nonincreasing, and is a Lyapunov function. If the extra terms are not purely dissipative, limit cycles are possible. Previous Hamiltonian formalisms for neural networks made the more restrictive assumption of purely antisymmetric interactions, and did not include the effect of dissipation[12].
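
The logic of the dissipative Hamiltonian form can be sketched with a generic second-order system (an illustration of the argument, not the paper's exact equations). Suppose the dynamics take the form

\ddot{q} + R(q)\,\dot{q} + \frac{\partial H}{\partial q} = 0,

with H playing the role of the energy and R(q)\,\dot{q} collecting the velocity-dependent terms. Then along trajectories

\frac{d}{dt}\left[ \frac{1}{2}\|\dot{q}\|^2 + H(q) \right] = -\dot{q}^T R(q)\,\dot{q},

so the energy is nonincreasing, and hence a Lyapunov function, whenever R is positive semidefinite (purely dissipative). If R is indefinite, energy can be pumped into some modes, and limit cycles become possible.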

This paper establishes sufficient conditions for global asymptotic stability of fixed points. The problem of finding sufficient conditions for oscillatory and chaotic behavior remains open. The perspectives of minimax and Hamiltonian dynamics may help in this task.

1 EXCITATORY-INHIBITORY NETWORKS

The dynamics of an excitatory-inhibitory network is defined by

\tau_x \dot{x} + x = f(u + Ax - By),
\tau_y \dot{y} + y = g(v + B^T x - Cy),

where x and y are the state vectors of the excitatory and inhibitory populations, \tau_x and \tau_y are their time constants, u and v are bias inputs, f and g are the neural nonlinearities, A and C are the symmetric interactions within each population, and B mediates the antisymmetric interaction between populations.
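
As a minimal numerical sketch of these dynamics (the rectification nonlinearity, the forward-Euler integrator, and all parameter values below are illustrative assumptions, not specifications from the paper):

import numpy as np

def simulate(A, B, C, u, v, tau_x=1.0, tau_y=0.5, dt=0.01, steps=5000):
    # Forward-Euler integration of
    #   tau_x dx/dt + x = f(u + A x - B y)
    #   tau_y dy/dt + y = g(v + B^T x - C y)
    # with f = g = rectification (an assumption; the paper allows
    # more general nonlinearities).
    f = g = lambda z: np.maximum(z, 0.0)
    x = np.zeros(A.shape[0])   # excitatory state vector
    y = np.zeros(C.shape[0])   # inhibitory state vector
    for _ in range(steps):
        x = x + (dt / tau_x) * (f(u + A @ x - B @ y) - x)
        y = y + (dt / tau_y) * (g(v + B.T @ x - C @ y) - y)
    return x, y

# Illustrative winner-take-all configuration: three excitatory neurons
# with self-excitation, one inhibitory neuron pooling their activity.
A = 0.5 * np.eye(3)               # symmetric excitatory-excitatory weights
B = np.ones((3, 1))               # excitatory-inhibitory coupling
C = np.zeros((1, 1))              # no inhibitory-inhibitory interaction
u = np.array([1.0, 0.9, 0.8])     # bias inputs; the largest should win
v = np.zeros(1)
x, y = simulate(A, B, C, u, v)
print("excitatory:", x, "inhibitory:", y)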