Madison Chaos and Complex Systems Seminar

Spring 1995 Seminars

All talks are now listed, but we don't have abstracts yet for some of them. Talks are on Tuesdays at 12:05 in 8417 Social Science, except as noted.

Short List

February 1. Prof. John Holland, University of Michigan. ``Complex Adaptive Systems.''

Unusual time and place: Wednesday, 12:05 pm, in 1610 Engineering.

Abstract: Many of our most troubling long-range problems - trade balances, sustainability, AIDS, genetic defects, mental health, computer viruses - center on certain systems of extraordinary complexity - economies, ecologies, immune systems, embryos, nervous systems, computer networks. These systems, called complex adaptive systems, are above all ``social'' systems: Their global behavior stems from interactions among large numbers of agents, agents that adapt and learn in response to the actions of other agents in the system.

Because these systems are continually in flux, they are not readily studied using the traditional mathematics of equilibria and steady states, and they cannot be isolated for easy experimentation. Computer-based models provide for organized experimentation, allowing a search for patterns and general principles that can be formalized and then verified in real systems. This lecture describes the basic elements of such models, and discusses some of the relevant mathematics.

A videotape of the lecture is on reserve at the desk in the Physics Library (4220 Chamberlin Hall), and can be checked out overnight. ``The video quality is good, but the first few minutes of audio are missing.'' (Sprott.)

February 7. Prof. Thomas Erber, Illinois Institute of Technology. ``Catastrophes, Chaos, Complexity: Have Physicists Finally Got It Right?''

Abstract: Thomas Kuhn's message in The Structure of Scientific Revolutions was that science lurches forward in abrupt ``paradigm shifts''. Is this really so? Once upon a time we used to quote Linnaeus, ``Nature doesn't jump!'' - and then came Catastrophe Theory. We also used to think that science was a search for order - and then came Chaos. Finally, just the day before yesterday, ``Complexity'' wasn't a subject, but a state of mind, best left to biologists and computer buffs. Now ``Complexity Institutes'' are sprouting like mushrooms after the rain. In this talk we'll try to separate the paradigms from the hype, and present a review of some of the ``hard'' science advances that are likely to have broad significance and be of lasting value.

February 14. Anthony Ives, UW Dept of Zoology. ``Population Dynamics in Stochastic Ecological Systems.''

Abstract: Ecological communities represent a complex web of interactions among many species. These interactions present a challenge to both theoretical and empirical ecologists trying to predict how communities will change in response to environmental perturbations. My talk will address three questions:

  1. From time series data, is it possible to separate exogenous (environmental) from endogenous (biotic) sources of variance in population densities?
  2. Can long-term temporal patterns in population dynamics be generated by endogenous processes, and can these processes be characterized in real data sets?
  3. Is it possible to predict how long-term (structural) changes in the environment will affect mean population densities?

February 21. Robert Jeanne, UW Dept of Entomology. ``Organization in an Insect Colony: a Parallel Distributed Process.''

Abstract: Insect societies self-organize and show emergent behavior, and appear to be examples of complex systems. The behavioral acts performed by individual workers are simple and limited in number, yet the behavior produced by the whole colony can be amazingly complex. The task confronting insect sociobiologists is to derive the complex patterns of behavior seen at the colony level from a knowledge of the interactions among colony members. Because it is regular, predictable, and easily observed and measured, nest construction is an ideal form of group behavior on which to attempt this analysis. This is especially true for Polybia occidentalis, a tropical social wasp my students and I have been studying for several years. In this seminar I'll describe our progress toward understanding how the components of nest construction behavior are organized and regulated.

February 28. Mark Smucker, UW Department of Computer Sciences. ``Social Network Formation in Evolutionary Iterated Prisoner's Dilemma with Choice and Refusal of Partners.'' (Written with E. Ann Stanley, Dan Ashlock, and Leigh Tesfatsion of Iowa State University.)

Unusual time: 12:30

Abstract: Social structures affect many human interactions. We study the emergence of behaviors in an artificial ecology in which evolved players use expected payoffs to select partners for games of prisoner's dilemma. Each player has a finite state machine specifying its prisoner's dilemma strategy, and a genetic algorithm is used to evolve new players. Full cooperation is the most common behavior which evolves in our simulations, but many other behaviors, which rely upon interesting social network structures, also evolve. (This talk will be a new and improved version of the talk given last semester in computer sciences. New material will be presented.)
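The choice-and-refusal mechanism can be sketched in a few lines (an illustrative toy, not the authors' model: the payoff values, initial expectation, learning rate, and refusal threshold below are assumptions):

```python
# Standard prisoner's dilemma payoffs to the row player.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

class Player:
    def __init__(self, name, move):
        self.name, self.move = name, move
        self.expected = {}                 # running payoff estimate per partner

    def choose_partner(self, others, threshold=1.5):
        # Prefer the partner with the highest expected payoff, and
        # refuse anyone whose estimate has fallen below the threshold.
        candidates = [p for p in others
                      if self.expected.get(p.name, 3.0) >= threshold]
        if not candidates:
            return None
        return max(candidates, key=lambda p: self.expected.get(p.name, 3.0))

    def update(self, partner, payoff, rate=0.3):
        # Exponentially weighted running average of realized payoffs.
        old = self.expected.get(partner.name, 3.0)
        self.expected[partner.name] = old + rate * (payoff - old)

coop = Player('coop', 'C')                 # unconditional cooperator
defector = Player('defect', 'D')           # unconditional defector
for _ in range(20):
    partner = coop.choose_partner([defector])
    if partner is None:                    # the defector is now ostracized
        break
    coop.update(partner, PAYOFF[(coop.move, partner.move)])
print(coop.expected)
```

After two exploitative rounds the cooperator's estimate drops below the refusal threshold and the defector is shut out - the seed of the social network structures the abstract mentions.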

March 7. Clint Sprott, UW Physics Department. ``Strange Attractors: From Art to Science.''

Abstract: From the dawn of science until just a few years ago the phenomenon of chaos was largely unknown. Now chaos is seen everywhere. Is chaos the exception or the rule? I will describe numerical experiments that assess the prevalence of chaos. Millions of equations are solved and the solutions catalogued. A portion of these solutions are chaotic and produce strange attractors - fractal objects of great beauty and mathematical interest. Properties of the solutions will be described. The technique is extended to complex systems (artificial neural nets) containing many identical parts interacting by simple rules, and it will be shown that such systems are nearly always weakly chaotic.
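The flavor of the numerical census can be conveyed with a one-dimensional sketch (an assumption-laden miniature: the actual survey covered millions of higher-dimensional equations, while this one samples random quadratic maps x -> a0 + a1*x + a2*x^2 and calls a bounded orbit chaotic when its estimated Lyapunov exponent is positive):

```python
import math
import random

def largest_lyapunov(a, n=2000, n_transient=500):
    # Iterate x -> a[0] + a[1]*x + a[2]*x^2 and average log|f'(x)|
    # along the orbit to estimate the Lyapunov exponent.
    x, total = 0.05, 0.0
    for i in range(n + n_transient):
        deriv = a[1] + 2.0 * a[2] * x
        x = a[0] + a[1] * x + a[2] * x * x
        if abs(x) > 1e6:                  # orbit escaped to infinity
            return None
        if i >= n_transient:
            total += math.log(abs(deriv) + 1e-12)
    return total / n

random.seed(1)
trials, chaotic = 1000, 0
for _ in range(trials):
    a = [random.uniform(-1.2, 1.2) for _ in range(3)]
    lam = largest_lyapunov(a)
    if lam is not None and lam > 0.005:   # bounded and exponentially sensitive
        chaotic += 1
print(f"{chaotic} of {trials} random maps are bounded and chaotic")
```

For the logistic map (a = [0, 4, -4]) this estimator recovers an exponent near ln 2, the known value, which is a useful sanity check on the method.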

March 14. No Meeting - Spring recess.
March 22. John Rinzel, National Institutes of Health. ``Nonlinear Dynamics of Neurons and Neuronal Ensembles.''

Unusual day, time and place: Wednesday, March 22nd, 12:05, the anatomy conference room, 341 Bardeen Hall.

Abstract: During non-REM sleep, and during absence seizures, widespread synchronous oscillations in electrical activity are seen in the brain. The thalamus is believed to act as a generator for these rhythms. A recently developed in vitro preparation, an isolated ``slice'' of ferret thalamus, shows spontaneous rhythmic behaviors like those seen in vivo, and with interesting wave-like behavior. Current hypotheses for the neural basis of these rhythms will be outlined, and computational neural network models that simulate the rhythms will be presented. The nonlinear dynamics of single cells, and of the coupling between cells, will be described for these Hodgkin-Huxley-like models. In this system, coupling via synaptic inhibition plays a key role in rhythmogenesis.

March 28. No meeting.
April 4. Ann Bell, UW Department of Economics. ``Interdependent Preferences in a Lattice Economy.''

Abstract: This talk examines the economic consequences of a fad or bandwagon effect in consumers' preferences. Agents trade in a global market but change their preferences over time in response to the consumption decisions of other agents located nearby. The models embed standard economic theory in a nonlinear dynamical systems framework. The results show clustering of preferences and consumption on an individual level and a characteristic evolution of price and average preferences on an economy-wide level.
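A minimal cartoon of such a model (entirely hypothetical parameters: agents on a ring who conform to their two lattice neighbors whenever those neighbors agree) already produces the clustering of preferences:

```python
import random

random.seed(2)
N = 60
prefs = [random.choice([0, 1]) for _ in range(N)]   # two rival goods

for _ in range(50):
    new = prefs[:]
    for i in range(N):
        left, right = prefs[(i - 1) % N], prefs[(i + 1) % N]
        if left == right:        # bandwagon: conform when neighbors agree
            new[i] = left
    prefs = new                  # synchronous update of all agents

# Count boundaries between runs of like-minded consumers; fewer
# boundaries means larger preference clusters.
boundaries = sum(prefs[i] != prefs[(i + 1) % N] for i in range(N))
print(''.join(map(str, prefs)), '-', boundaries, 'cluster boundaries')
```
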

April 11. Blake LeBaron, UW Department of Economics. ``Asset Pricing under Inductive Reasoning: Experiments with an Artificial Stock Market.'' (Coauthors: Brian Arthur, John Holland, Richard Palmer, Paul Taylor.)

Abstract: In this project an artificial stock market provides an environment in which to study the behavior of many artificially intelligent agents trying to forecast the future behavior of a traded asset paying a random dividend. The objective is to understand some of the phenomena possible from the interactions of learning algorithms brought together in a simple stock market trading environment. Traders using Holland's classifier systems build up sets of simple rules to forecast future stock market price behavior. Successful rules are strengthened and used more frequently while less successful rules are replaced with new rules created by a genetic algorithm. The relationship between this model and traditional modeling of financial markets in economics will be discussed.
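A much-reduced sketch of the classifier-system idea (names, the payoff-to-strength mapping, and all parameter values here are assumptions for illustration): each rule matches a coded market state against a condition string, its strength tracks forecast accuracy, and a genetic step replaces the weakest rule with a mutated copy of the strongest.

```python
import random

class Rule:
    def __init__(self, condition, forecast):
        self.condition = condition       # e.g. '1#0'; '#' matches either bit
        self.forecast = forecast         # price forecast when the rule fires
        self.strength = 0.0              # earned through accurate forecasts

    def matches(self, state):
        return all(c in ('#', s) for c, s in zip(self.condition, state))

def update_strength(rule, actual, rate=0.2):
    # Accurate rules (small squared error) are strengthened.
    error = (rule.forecast - actual) ** 2
    rule.strength += rate * (1.0 / (1.0 + error) - rule.strength)

def ga_step(rules, mutation=0.2):
    # Replace the weakest rule with a mutated copy of the strongest;
    # the new rule starts at zero strength and must earn its keep.
    rules.sort(key=lambda r: r.strength, reverse=True)
    parent, child = rules[0], rules[-1]
    child.condition = parent.condition
    child.forecast = parent.forecast + random.gauss(0, mutation)
    child.strength = 0.0

random.seed(0)
rules = [Rule(''.join(random.choice('01#') for _ in range(3)),
              random.uniform(0, 10)) for _ in range(6)]
for _ in range(30):                      # 30 trading periods, price fixed at 5
    for r in rules:
        if r.matches('101'):             # current coded market state
            update_strength(r, 5.0)
    ga_step(rules)
best = max(rules, key=lambda r: r.strength)
print(f"best rule {best.condition} forecasts {best.forecast:.2f}")
```
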

April 18. Danny Kaplan, McGill University. ``Detecting Unstable Fixed Points in Biological Data'' (formerly ``Exploiting Unstable Fixed Points in Nonlinear Data Analysis'').

Abstract: Possibly the simplest sort of feature in a dynamical system is a fixed point, sometimes called a ``steady state.'' Stable fixed points play an important role in the study of linear systems, and unstable fixed points underlie much nonlinear, chaotic behavior. Recently, much experimental and theoretical work has investigated how unstable fixed points can be exploited to control chaotic and other nonlinear behavior, and it has been suggested that such techniques may be applicable to controlling cardiac rhythms and epilepsy. I will discuss techniques for identifying unstable fixed points in experimental data, and show applications to fibrillating hearts and to the subthreshold dynamics of the squid axon.
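One common recipe for that identification, sketched here on a synthetic series rather than biological data (the window size and the logistic-map surrogate are illustrative assumptions): collect the near-recurrences of the series, fit a local linear map x_{n+1} = a*x_n + b through them, and read off the fixed point x* = b/(1-a); a multiplier |a| > 1 marks it as unstable.

```python
def logistic_series(n, r=4.0, x0=0.3):
    # Chaotic surrogate data standing in for an experimental recording.
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def find_fixed_point(xs, window=0.05):
    # Near-recurrences x_{n+1} ~ x_n cluster around fixed points; keep
    # the upper branch (x > 0.5) to isolate the nontrivial one.
    pairs = [(xs[i], xs[i + 1]) for i in range(len(xs) - 1)
             if abs(xs[i + 1] - xs[i]) < window and xs[i] > 0.5]
    # Least-squares fit of x_{n+1} = a*x_n + b over those pairs.
    n = len(pairs)
    sx = sum(p[0] for p in pairs)
    sy = sum(p[1] for p in pairs)
    sxx = sum(p[0] ** 2 for p in pairs)
    sxy = sum(p[0] * p[1] for p in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b = (sy - a * sx) / n
    return b / (1.0 - a), a              # fixed point estimate, multiplier

xs = logistic_series(5000)
xstar, slope = find_fixed_point(xs)
print(f"fixed point near {xstar:.3f}, multiplier {slope:.2f}")
```

For the logistic map at r = 4 the true values are x* = 0.75 with multiplier -2, so the fit can be checked directly.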

April 25. Mark Craven, UW Department of Computer Sciences. ``Extracting Comprehensible Symbolic Representations from Trained Neural Networks.''

Abstract: Neural networks offer an appealing approach to concept learning because they are applicable to a large class of problems, and because they have demonstrated good generalization performance on a number of difficult real-world tasks. A limitation of neural networks, however, is that the concept representations they form are nearly impenetrable to human understanding. To address this limitation, we have been developing algorithms for extracting comprehensible, symbolic representations from trained neural networks. I will first discuss why it is important to be able to understand the concept representations formed by neural networks, and then describe our approach to this task. We have developed a novel method that involves viewing the rule-extraction task as a separate learning problem in which the target concept is the network itself. In addition to learning from training examples, our method exploits the property that networks can be queried.
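The querying idea can be sketched abstractly (everything here is a stand-in: the ``network'' is just a black-box function, and the extracted representation is a single threshold rule rather than the richer symbolic forms the talk concerns):

```python
import random

# Stand-in for a trained network: a black box we may only query.
def network(x1, x2):
    return 1 if 0.7 * x1 + 0.3 * x2 > 0.5 else 0

# Treat rule extraction as its own learning problem whose target
# concept is the network: generate queries, record the answers, and
# search for the axis-aligned rule "IF x_i > t THEN 1" that agrees
# with the network most often.
random.seed(0)
queries = [(random.random(), random.random()) for _ in range(2000)]
answers = [network(x1, x2) for x1, x2 in queries]

best = None
for feature in (0, 1):
    for t in (i / 100.0 for i in range(100)):
        agree = sum((q[feature] > t) == bool(a)
                    for q, a in zip(queries, answers))
        if best is None or agree > best[0]:
            best = (agree, feature, t)

agree, feature, t = best
print(f"IF x{feature + 1} > {t:.2f} THEN 1   (agreement {agree / 2000:.0%})")
```

Because the network can be queried at will, the extractor is not limited to the original training examples - the point the abstract closes on.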

May 2. Thomas Higgins, UW Engineering. ``History, Development, and Applications of Lyapunov Theory.''

Abstract: For some 40+ years the speaker developed and taught undergraduate and graduate courses in electrical engineering; some 50+ courses in all. Among these, and of particular interest, were a series of seven courses in automatic feedback control systems theory and a series of five courses in advanced linear and nonlinear circuit theory. In each of these two subject areas stability theory occupies a central role, and in each case the speaker made a special effort to seek out, read, and utilize all published writings on stability theory. In recent years, the so-called Lyapunov stability theory has come to be of interest and use to electrical engineers especially concerned with the theory and design of various nonlinear systems utilized in engineering practice.

In his talk the speaker will trace, in chronological order, the history of the development of Lyapunov stability theory - a history of interest to those concerned with ``chaos'' theory.

May 9. Andreas Weigend, Department of Computer Science, University of Colorado, Boulder. ``Nonlinear Mixture Models for Time Series Analysis: Discovering Regimes and Avoiding Overfitting.''

Abstract: When trying to forecast the future behavior of real-world systems, two of the key problems are overfitting (particularly serious for noisy processes) and regime switching (the underlying process changes its characteristics). In this talk we show how Gaussian mixture models with nonlinear experts point to solutions to these problems. In connectionist terms, the architecture consists of several experts and a gating net. The nonlinear experts put out the conditional mean (as usual), but each expert also has its own adaptive width. The gating net puts out an input-dependent probability for each expert. There is a supervised component in learning, to predict the next value(s), and an unsupervised component, to discover the (hidden) regimes. We report a number of results:

  1. The gating net discovers the different regimes that underlie the process: its outputs segment the data correctly into the different regions.
  2. The widths associated with each expert characterize the sub-processes: i.e., the variances give the expected squared error for each regime.
  3. There is significantly less overfitting compared to single nets, for two reasons: only subsets of the potential inputs are given to the experts and gating net, and the experts learn to match their variances to the (local) noise levels, thus only learning as much as the data support.

We compare these results from the mixture model to single networks of different sizes, as well as to nets with two outputs, one for the mean and the other for the confidence interval as a function of the input. Several data sets are used: a computer-generated series, the laser data set from the Santa Fe Competition, the daily electricity demand of France, and the daily exchange rate between German marks and US dollars.
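In schematic form (a hand-assembled toy with fixed weights rather than a trained model; all array values below are assumptions), the architecture combines expert means and per-expert widths under input-dependent gating probabilities:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

W_exp = np.array([[2.0], [-1.0]])        # each expert's linear slope
b_exp = np.array([0.0, 1.0])             # expert intercepts
sigma = np.array([0.1, 0.5])             # each expert's adaptive width
W_gate = np.array([[5.0], [-5.0]])       # gating net weights
b_gate = np.array([0.0, 0.0])

def predict(x):
    means = x @ W_exp.T + b_exp              # (n, 2) conditional means
    gates = softmax(x @ W_gate.T + b_gate)   # (n, 2) mixing probabilities
    mean = (gates * means).sum(axis=1)       # mixture mean
    # Law of total variance: within-expert widths plus spread of means.
    var = (gates * (sigma ** 2 + means ** 2)).sum(axis=1) - mean ** 2
    return mean, var, gates

x = np.array([[2.0], [-2.0]])
mean, var, gates = predict(x)
print(mean, var)
```

With these weights the gate hands positive inputs to the first, narrow expert and negative inputs to the second, wider one, so the predicted variance is itself regime-dependent - the segmentation property claimed in result 1 and the width property in result 2.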
Last changed 3 April 1995