Madison Chaos and Complex Systems Seminar
Spring 1998 Seminars
Dates, speakers, titles and abstracts will be listed as they become available. Meetings will be noon Tuesdays in 4274 Chamberlin Hall unless otherwise noted.
Short List
- 2 January. No Meeting.
- 27 January. George Hrabovsky: ``The Shape of Chaos.''
- 3 February. Blake LeBaron: ``Evolving Neural Network Architectures for Forecasting'' (part 1).
- 10 February. Blake LeBaron: ``Evolving Neural Network Architectures for Forecasting'' (part 2).
- 17 February. Dan Callan: ``A Dynamic Neural Network Model of Speech Production in the Developing Child.''
- 24 February. Craig Berridge: ``The Locus Coeruleus-Noradrenergic System: Modulation of Behavioral State and State-Dependent Processes.'' Cancelled due to illness.
- 3 March. Olvi Mangasarian: ``Mathematical Programming in Data Mining.''
- 10 March. No seminar --- spring recess.
- 17 March. Julia Evans: ``Nonlinear Dynamic Model of Social Discourse: Implications for the Study of Children with Language Disorders.''
- 24 March. Josh Chover: ``On Modeling Memory.''
- 31 March. Grace Wahba: ``Multivariate Smoothing Methods in Time and Space with Application to Historical Trends and Patterns in the Global Historical Climate Network Data.''
- 7 April. Clarence Clay: ``Fractals and the Ocean Floor.''
- 14 April. John Young: ``El Nino --- An Overview of its Complex Dynamics.''
- 21 April. Michael Morgan: ``Using Singular Vectors of Observed Atmospheric Flows to Diagnose Cyclone Development and Predictability.''
- 22 April. Seminar of possible interest: Steven Strogatz: ``Dynamics of Small-World Networks.''
- 28 April. Amir Assadi: ``Perceptual Simplification of Geometry of Natural Surfaces in Human Vision.''
- 5 May. David Newman: ``If Self Organized Critical Systems Are All Around Us, Can We Identify and Control Them?''
- 11 May. Seminar of possible interest: Tomaso Poggio: ``Learning Sparse Representations for Vision.''
- 12 May. Steering Committee Meeting.
27 January. George E. Hrabovsky, UW Physics: ``The Shape of Chaos.''
Abstract: Chaos is a small part of dynamical systems theory, which treats systems that evolve in time according to specific rules. Such rules are determined by studying the systems in question. There are three broad approaches to this subject: algebraic, analytic, and geometric. Algebraic approaches involve matrices and even more specialized structures to determine the properties of the system in question. An analytic approach involves the solution of a system of very general functions or differential equations. A geometric approach seeks to discover the "shape of the domain of the system." This division is an oversimplification, since there is much overlap. In this talk I present some general concepts regarding the shape of the domain of several chaotic systems. Without going into any of the gory mathematics, I will take you on a tour of some of the concepts of modern geometry and topology as they apply to dynamical systems theory.
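An illustrative aside (my own, not material from the talk): the Python sketch below traces a long trajectory of the Lorenz system; the thin, folded object the trajectory fills out is the sort of "shape of the domain" a geometric approach studies. It assumes numpy and scipy.

    # Illustrative only: trace a trajectory of the Lorenz system and form a crude
    # geometric summary of the region it occupies.
    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

    sol = solve_ivp(lorenz, (0.0, 50.0), [1.0, 1.0, 1.0],
                    t_eval=np.linspace(0.0, 50.0, 20000))
    x, y, z = sol.y

    # The trajectory stays inside a modest bounding box but visits only a thin,
    # folded subset of it -- a first hint of the attractor's intricate geometry.
    box_volume = np.prod([c.max() - c.min() for c in (x, y, z)])
    cells_visited = len({(round(a), round(b), round(c)) for a, b, c in zip(x, y, z)})
    print(f"bounding-box volume ~ {box_volume:.0f} unit cells; cells visited: {cells_visited}")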
3 and 10 February. Blake LeBaron, UW Economics: ``Evolving Neural Network Architectures for Forecasting.''
Abstract:
These two talks will introduce a procedure combining aspects of evolutionary optimization algorithms with bootstrap and Monte Carlo cross-validation procedures. Evolution is used to search the large space of potential network architectures for lean network structures. These leaner networks reduce overfitting problems and improve the numerical maximization properties of heavily parameterized single-hidden-layer networks. In-sample biases are estimated using several techniques, but estimates using bootstrap-based cross-validation are emphasized. The procedure will be applied to two very different examples. The first is the Henon attractor, and the second is a foreign exchange forecasting problem. In both cases this procedure is shown to generate networks which perform well out of sample on several criteria, some of which are motivated by economic objectives.
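An illustrative aside, not LeBaron's actual procedure: the sketch below, assuming numpy and scikit-learn, searches over hidden-layer sizes for a one-step Henon-map forecaster and scores each candidate by bootstrap ("out-of-bag") cross-validation, keeping the leanest network whose score does not deteriorate.

    # Illustrative sketch only: prune the hidden-layer size of a one-step
    # Henon-map forecaster, scoring candidates by bootstrap cross-validation.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.utils import resample

    def henon_series(n, a=1.4, b=0.3):
        x, y, out = 0.1, 0.1, []
        for _ in range(n):
            x, y = 1.0 - a * x * x + y, b * x
            out.append(x)
        return np.array(out)

    series = henon_series(600)
    X = np.column_stack([series[:-2], series[1:-1]])   # two lagged inputs
    target = series[2:]

    def bootstrap_score(hidden_units, n_boot=5):
        """Mean out-of-bag squared forecast error over bootstrap resamples."""
        idx, errs = np.arange(len(target)), []
        for _ in range(n_boot):
            train = resample(idx)                      # sample with replacement
            oob = np.setdiff1d(idx, train)             # held-out points
            net = MLPRegressor(hidden_layer_sizes=(hidden_units,),
                               max_iter=2000, random_state=0)
            net.fit(X[train], target[train])
            errs.append(np.mean((net.predict(X[oob]) - target[oob]) ** 2))
        return float(np.mean(errs))

    # Crude stand-in for an evolutionary search: accept successively leaner
    # networks as long as the bootstrap score does not deteriorate.
    best_h, best_score = 16, bootstrap_score(16)
    for h in (12, 8, 6, 4, 3, 2):
        score = bootstrap_score(h)
        if score <= best_score:
            best_h, best_score = h, score
    print(f"selected {best_h} hidden units, bootstrap MSE {best_score:.5f}")

A real evolutionary search would mutate and recombine architectures (input lags, hidden units, connections) rather than sweep a single size, but the bootstrap scoring step plays the same role.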
17 February. Daniel Callan, Waisman Center: ``A Dynamic Neural Network Model of Speech Production in the Developing Child.''
Abstract:
Many theories of speech processing propose the existence of motor control systems that utilize invariant vocal tract configurations to specify particular goals of speech production. One problem that challenges these theories is the fact that the associated structures involved with speech production go through a considerable amount of change during development. The same speech goals continue to be achieved during the course of development despite the changes that the vocal tract configuration undergoes. In this paper, characteristics of speech production in the developing child are accounted for by the DIVA model (Guenther, 1995; Guenther et al., in press), which incorporates auditory perceptual targets to plan articulation. The conversion from articulator configuration to acoustic signal is worked out by a modified version of the Maeda articulation model that utilizes developmental parameters. The performance of the neural network was assessed at different points during development by determining the articulation-acoustic output needed to produce different vowels.
24 February. Craig Berridge, UW Psychology: ``The Locus Coeruleus-Noradrenergic System: Modulation of Behavioral State and State-Dependent Processes.''
Due to illness of the speaker, this seminar was cancelled. It will be rescheduled in the fall.
Abstract:
The locus coeruleus-noradrenergic system is one of a number of brainstem-originating ascending systems that display state-dependent activity. During the past 25 years, considerable information has been collected that indicates that this system is well-positioned to exert a widespread and potent modulatory influence throughout the CNS. This talk will review recent observations that indicate that this system enhances acquisition and processing of sensory information, through a series of concerted actions across a variety of disparate brain regions.
3 March. Olvi Mangasarian, UW Computer Sciences: ``Mathematical Programming in Data Mining.''
Abstract:
Mathematical programming approaches to two fundamental problems will be described: feature selection and clustering. The feature selection problem considered is that of discriminating between two sets while recognizing irrelevant and redundant features and suppressing them. This creates a lean model that often generalizes better to new unseen data. Computational results on real data confirm improved generalization of leaner models. Clustering is exemplified by the unsupervised learning of patterns and clusters that may exist in a given database and is a useful tool for knowledge discovery in databases (KDD). A mathematical programming formulation of this problem is proposed that is theoretically justifiable and computationally implementable in a finite number of steps. A resulting k-Median Algorithm is utilized to discover very useful survival curves for breast cancer patients from a medical database.
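An illustrative aside, not the talk's mathematical-programming formulation: a plain k-median clustering loop (1-norm assignment, coordinate-wise median update) can be written in a few lines of Python, assuming numpy.

    # Illustrative sketch only: assign points to the nearest center in the 1-norm,
    # then move each center to the coordinate-wise median of its cluster.
    import numpy as np

    def k_median(points, k, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = points[rng.choice(len(points), size=k, replace=False)]
        for _ in range(n_iter):
            dists = np.abs(points[:, None, :] - centers[None, :, :]).sum(axis=2)
            labels = dists.argmin(axis=1)
            new_centers = np.array([
                np.median(points[labels == j], axis=0) if np.any(labels == j)
                else centers[j]
                for j in range(k)
            ])
            if np.allclose(new_centers, centers):
                break
            centers = new_centers
        return centers, labels

    # Toy data: two well-separated blobs in the plane.
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(8, 1, (50, 2))])
    centers, labels = k_median(data, k=2)
    print("cluster centers:\n", centers)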
17 March. Julia L. Evans, Waisman Center: ``Nonlinear Dynamic Model of Social Discourse: Implications for the Study of Children with Language Disorders.''
Abstract:
Nonlinear dynamic models provide a theoretical framework to study conversations in real time. In particular, recent models based on the coupling of two complex systems indicate that: 1) speakers have a strong tendency to converge or coordinate their verbal and non-verbal speaking behavior moment-to-moment, and 2) changes in the degree to which one speaking partner desires to participate in the conversation affect not only the stability of the dyad as a whole, but the behavior of the second speaker as well. Implications of the model's predictions will be discussed with respect to the nature of verbal interactions for children with language disorders.
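An illustrative aside, not the discourse models discussed in the talk: two symmetrically coupled chaotic maps give a minimal picture of how coupling pulls two trajectories together, loosely analogous to two speakers converging.

    # Toy illustration only: two chaotic logistic maps with symmetric coupling.
    # Stronger coupling tends to pull the two trajectories together.

    def coupled_logistic(eps, r=3.9, n=2000, x0=0.2, y0=0.7):
        """Return the separation of the two coupled maps after n iterations."""
        x, y = x0, y0
        for _ in range(n):
            fx, fy = r * x * (1 - x), r * y * (1 - y)
            x, y = (1 - eps) * fx + eps * fy, (1 - eps) * fy + eps * fx
        return abs(x - y)

    for eps in (0.0, 0.1, 0.2, 0.3, 0.4):
        print(f"coupling {eps:.1f}: final separation {coupled_logistic(eps):.4f}")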
24 March. Josh Chover, UW Mathematics: ``On Modeling Memory.''
Abstract:
I'll present a model of a neural network which seems to learn sequences of "stimuli" without extensive training, recall them in correct order when prompted, and recognize novelty. The features of the model are conjectured to correspond to basic biological mechanisms. No technical knowledge should be necessary to understand this talk --- neither remote nor recent recall.
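An illustrative aside, my own toy rather than the speaker's model: a heteroassociative Hebbian matrix that stores transitions between random patterns and replays the sequence when cued, written in Python with numpy.

    # Illustrative toy only: store the transitions between random +/-1 patterns
    # in a Hebbian matrix; cueing with the first pattern replays the sequence.
    import numpy as np

    rng = np.random.default_rng(0)
    n, seq_len = 200, 5
    patterns = rng.choice([-1, 1], size=(seq_len, n))

    # One-shot Hebbian storage of each transition pattern[t] -> pattern[t + 1].
    W = sum(np.outer(patterns[t + 1], patterns[t]) for t in range(seq_len - 1)) / n

    state = patterns[0].copy()            # the prompt
    for t in range(1, seq_len):
        state = np.sign(W @ state)        # one synchronous recall step
        overlap = float(state @ patterns[t]) / n
        print(f"step {t}: overlap with stored pattern {t} = {overlap:+.2f}")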
31 March. Grace Wahba, UW Statistics: ``Multivariate Smoothing Methods in Time and Space with Application to Historical Trends and Patterns in the Global Historical Climate Network Data.''
No abstract yet.
7 April. Clarence Clay, UW Geology and Geophysics: ``Fractals and the Ocean Floor.''
Abstract:
We know more about the surface of the moon than the seafloor that covers 70% of planet Earth. The main reason is that the oceans are effectively opaque to electromagnetic waves and light over distances greater than 10 to 100 m. The attenuation of sound waves is small for frequencies less than 12 kHz. Sonars or acoustic echosounders get excellent reflections from the seafloor over most oceans at 3 to 5 km depth. The seafloor is mapped by having oceanographic survey ships make many tracks over the survey area. For decades these were single tracks, and the "depths" were the shortest time of arrival from a reflection or scattering from features on the bottom. The echo sounder beam widths were in the 10 to 30 degree range. Although the echosounders recorded continuously, the sonar "yard-stick" gives samples of depth at 3.5 km that are roughly 100 m apart or more. Charts of the seafloor involve much simplification, guess work, and artistic imagination. I will show examples of seafloor images for a wide range of scales.
We are interested in knowing the details of the seafloor roughness because the geological processes cause it to be rough. We can use acoustic scatter from the seafloor to estimate roughness. Generally, scattering theories use approximate spatial spectra or spatial correlation functions as inputs. If the phases are not random, then a completely different form of acoustic scattering theory is needed.
There are good reasons to believe that the seafloor has a fractal structure. Measurements of the spatial power spectra give exponents of the wave numbers that are in the range of -2 to -5. The conventional wisdom is that the spectral components have random phases. The task for the marine geophysicist is to interpolate the seafloor roughness between sonar yard-stick measurements of depth. I analyzed a published profile and got a wavenumber slope of -3.2. However, the unwrapped phases have linear dependencies on wavenumber over decent ranges of the wavenumbers. I believe that fractal interpolation methods are a good replacement for the artist's imagination. With luck, I will show fractal interpolations for the echosounding profile that I used.
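An illustrative aside, not the speaker's analysis: the Python sketch below (assuming numpy) synthesizes a profile with a k^-3.2 power spectrum and random phases, then recovers the slope from a log-log fit; spectral synthesis of this kind is one standard way to generate such fractal profiles.

    # Illustrative sketch only: build a 1-D profile whose power spectrum falls
    # off as k**(-3.2), then recover the slope with a log-log fit.
    import numpy as np

    n = 4096
    k = np.fft.rfftfreq(n, d=1.0)[1:]                  # positive wavenumbers
    amplitudes = k ** (-3.2 / 2.0)                     # amplitude ~ sqrt(power)
    phases = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, size=k.size)
    spectrum = np.concatenate([[0.0], amplitudes * np.exp(1j * phases)])
    profile = np.fft.irfft(spectrum, n)                # synthetic "seafloor" profile

    power = np.abs(np.fft.rfft(profile))[1:] ** 2      # power spectrum of the profile
    slope, _ = np.polyfit(np.log(k), np.log(power), 1)
    print(f"recovered spectral slope: {slope:.2f}")    # close to -3.2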
14 April. John Young, UW Atmospheric and Oceanic Sciences: ``El Nino --- An Overview of its Complex Dynamics.''
Abstract:
The phenomenon of ``El Nino'' is an irregular, inter-annual disruption of the dynamical climate of the tropical ocean and atmosphere. This anomalous state is believed to depend strongly upon air-sea coupling and probably on inherent nonlinearities in the dynamics of each fluid system. The resulting behavior is complex, but it seems to be partly predictable several months in advance.
In this talk, I will begin by briefly showing the structures of the physical components of this year's event, and historical time series indicating coupled but irregular behavior over the past decades. I will schematically review the governing mathematical equations, focusing on the crucial elements of air-sea coupling, Kelvin and Rossby waves, and nonlinearities.
In simple models, increased coupling suggests that unstable wave growth can exist, which can lead to chaotic behavior that probably dooms predictions beyond a year into the future. However, some models with more degrees of freedom suggest that the chaotic tendencies are less strong, and that prediction error growth is influenced more by stochastic influences. Resolution of these issues will be of practical societal as well as scientific benefit.
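An illustrative aside, not necessarily the speaker's material: one standard caricature of the coupled dynamics is the delayed-oscillator model, in which delayed negative feedback carried by ocean waves opposes local growth of a temperature anomaly. A minimal Euler-stepped version in Python (assuming numpy):

    # Illustrative toy only: the delayed-oscillator caricature of ENSO,
    #   dT/dt = T - T**3 - alpha * T(t - delta),
    # stepped with a simple Euler scheme and a history buffer for the delay.
    import numpy as np

    alpha, delta, dt, steps = 0.75, 6.0, 0.01, 20000
    lag = int(delta / dt)
    T = np.zeros(steps + lag)
    T[: lag + 1] = 0.1                    # constant initial history

    for i in range(lag, steps + lag - 1):
        dT = T[i] - T[i] ** 3 - alpha * T[i - lag]
        T[i + 1] = T[i] + dt * dT

    # The delayed feedback destabilizes the steady state and sustains an
    # oscillating anomaly; inspect its range.
    print("anomaly range:", T[lag:].min(), "to", T[lag:].max())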
21 April. Michael Morgan, UW Atmospheric and Oceanic Sciences: ``Using Singular Vectors of Observed Atmospheric Flows to Diagnose Cyclone Development and Predictability.''
Abstract:
Quasi-geostrophic (QG) models of baroclinic instability successfully capture many of the salient characteristics of observed cyclones, including the westward tilt with height of the geopotential field, growth rates, and the horizontal length scales of the disturbance. A more recent description of surface cyclogenesis views it as an initial value problem. Rather than disturbances being of normal mode form, the structure of the growing disturbance changes from being initially tilted against the flow to being tilted downshear at a later time. For a given flow, one may identify those disturbances which amplify most rapidly over a fixed time interval, for a given norm. These "optimal" disturbances are the singular vectors of the linear operator which describes the dynamical evolution of the fluid system.
In this presentation, calculations of singular vectors from simple QG models are presented and interpreted from a potential vorticity perspective. The salient features of the transient development of these singular vectors are identified and related to structures seen in cyclogenesis events. The relationship of singular vectors to predictability and the deployment of adaptive observing systems is also discussed.
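An illustrative aside, not the speaker's quasi-geostrophic calculation: for a linear system dx/dt = Ax, the disturbance that amplifies most over an interval tau (in the 2-norm) is the leading right singular vector of the propagator expm(A*tau). The sketch below (assuming numpy and scipy) shows transient growth for a non-normal operator whose eigenvalues all decay.

    # Illustrative sketch only: optimal (singular-vector) growth for a toy
    # non-normal linear operator.
    import numpy as np
    from scipy.linalg import expm

    A = np.array([[-0.1, 5.0],
                  [ 0.0, -0.2]])          # decaying eigenvalues, non-normal
    tau = 5.0
    M = expm(A * tau)                     # propagator over the interval tau

    U, s, Vt = np.linalg.svd(M)
    optimal_initial = Vt[0]               # fastest-growing initial structure
    evolved = M @ optimal_initial         # its state at time tau
    print("optimal growth factor:", s[0])
    print("modal (eigenvalue) growth over tau:",
          np.exp(np.linalg.eigvals(A).real * tau))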
22 April. Seminar of Possible Interest: Steven Strogatz, Dept. of Theoretical and Applied Mechanics, Cornell: ``Dynamics of Small-World Networks.''
Unusual time and place: Wednesday, 2:25pm in room 901 Van Vleck.
Abstract: According to folklore, everyone on the planet is connected to everyone else through a short chain of acquaintances. This idea is often called ``six degrees of separation'' (after the play, and later Hollywood film, of the same name). The Kevin Bacon game, Erdös numbers, and the small-world phenomenon are all variations on the same theme. In this talk, I'll explore the mathematics underlying the small-world phenomenon, and argue that it is not merely a curiosity --- it is probably a common feature of large, sparse networks that are neither completely regular nor completely random. Networks in this middle ground have not been studied much, but they are ubiquitous and scientifically important; examples include neural networks and the electrical power grid of the western United States. Simple models of dynamical systems with small-world coupling appear to display enhanced propagation speed and computational power, compared to their locally-connected counterparts. This is joint work with Duncan Watts.
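An illustrative aside: the small-world construction is easy to experiment with. The sketch below, which assumes the networkx library, rewires a ring lattice with probability p and reports average path length and clustering; a little rewiring collapses path lengths while clustering stays high.

    # Illustrative sketch only: ring lattices with a fraction p of edges rewired.
    import networkx as nx

    n, k = 1000, 10          # nodes, neighbors per node in the ring lattice
    for p in (0.0, 0.01, 0.1, 1.0):
        G = nx.connected_watts_strogatz_graph(n, k, p, seed=0)
        L = nx.average_shortest_path_length(G)
        C = nx.average_clustering(G)
        print(f"p={p:<5}  path length {L:6.2f}  clustering {C:.3f}")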
28 April. Amir Assadi, UW Mathematics: ``Perceptual Simplification of Geometry of Natural Surfaces in Human Vision.''
Abstract:
Object recognition in human vision begins with rays of photons that are emitted from the object reaching the eyes. A sequence of elaborate information processing tasks takes place in the brain, and leads to complex products such as object recognition and visual attention. The complexity of visual perception stems from many factors; among them are the billions of neurons, and their circuits and networks, mainly dedicated to vision. Nonetheless, everyday visual tasks are performed with great ease.
How does the visual system solve the numerous geometric problems, such as estimating the shape and spatial position of objects in a scene? Traditionally, two levels of information processing are distinguished: the bottom-up processes in Early (or Low Level) Vision, and the top-down processes in High Level Vision. Vision employs a combination of sequences of such processes in addition to other cognitive processes. While Low Level Vision is primarily concerned with local information, High Level Vision focuses on global phenomena. Where and how does the local-to-global transition in information processing occur?
We propose the hypothesis that there is a processing stage intermediate to these two levels. In this intermediate level, the visual system associates the simplest global geometric structures to the complex array of Early visual processes of local nature, taking into account the statistical nature of perception and the observer's ability to estimate global shapes of objects. Based on this model, we provide numerical estimates for some common geometric attributes of textured surfaces that the visual system might qualitatively perceive.
5 May. David Newman, Fusion Energy Division, Oak Ridge National Laboratory: ``If Self Organized Critical Systems Are All Around Us, Can We Identify and Control Them?''
Abstract:
In nature there are many systems which appear to exhibit some form of self-organization. Among these are forest fires, earthquakes, sandpiles, turbulent transport, and even many aspects of society itself. Investigations into the similarity of the dynamics of such systems have been undertaken using simple cellular automata models. These models have produced a remarkable amount of insight into the dynamics of such systems. Some basic features of SOC systems, from forest fires to earthquakes and sandpiles, will be discussed. Recently a Self-Organized Criticality (SOC) model for turbulent transport in magnetically confined plasmas was proposed in order to explain some of the observed features of the transport dynamics in these plasmas. Because of this, there has been increased interest in methods for identifying and controlling such systems. A perturbed extension to a sandpile model of turbulent transport, together with data from various systems, is used to investigate methods for possible control of SOC systems and methods for identifying whether these systems are SOC. Time permitting, some speculation on the implications for society of attempting to control certain behavior (for example, risk avoidance) in the context of controlling a SOC system will be discussed.
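An illustrative aside, the classic sandpile rule rather than the speaker's transport model: the Python sketch below (assuming numpy) drops grains at random sites, topples any site holding four or more grains, and records avalanche sizes, which span many scales once the pile organizes itself toward the critical state.

    # Illustrative sketch only: a 2-D sandpile cellular automaton. A toppling
    # site gives one grain to each neighbor; grains leaving the edge are lost.
    import numpy as np

    rng = np.random.default_rng(0)
    L = 20
    grid = np.zeros((L, L), dtype=int)
    avalanche_sizes = []

    for _ in range(10000):
        i, j = rng.integers(L, size=2)
        grid[i, j] += 1                       # drop one grain at a random site
        size = 0
        while np.any(grid >= 4):
            for x, y in np.argwhere(grid >= 4):
                grid[x, y] -= 4
                size += 1
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = x + dx, y + dy
                    if 0 <= ni < L and 0 <= nj < L:
                        grid[ni, nj] += 1
        avalanche_sizes.append(size)

    # After a transient, avalanche sizes range from zero to system-spanning.
    print("largest avalanche:", max(avalanche_sizes))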
11 May. Tomaso Poggio, Brain Science Department and A. I. Lab, MIT: ``Learning Sparse Representations for Vision.''
Unusual time and place: Monday, 11 May, 3:30-4:30pm in room 1221 Computer Sciences and Statistics.
Abstract: Learning is becoming the central problem in trying to understand intelligence and in trying to develop intelligent machines. I will outline some of our recent efforts in the domain of vision to develop machines that learn and to understand brain mechanisms of learning. I will begin with some recent theoretical results on the problem of function approximation and sparse representations that connect regularization theory, Support Vector Machine Regression, Basis Pursuit Denoising, and PCA techniques. I will then motivate the appeal of learning sparse representations from an overcomplete dictionary of basis functions in terms of recent results in two different fields: computer vision and neuroscience. In particular, we have developed a trainable object detection architecture that succeeds in learning a sparse representation from an overcomplete set of Haar wavelets to perform difficult object detection tasks. In neuroscience, physiological data from IT cortex suggest that individual neurons encode a large vocabulary of elementary shapes before converging on cells tuned to specific views of specific 3D objects.
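An illustrative aside, not the speaker's system: sparse coding against an overcomplete dictionary can be sketched with an L1-penalized regression, the optimization at the heart of basis pursuit denoising. The fragment below assumes numpy and scikit-learn.

    # Illustrative sketch only: recover a sparse code for a synthetic signal
    # from an overcomplete random dictionary by L1-regularized regression.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n_dims, n_atoms = 64, 256                     # 4x overcomplete dictionary
    D = rng.normal(size=(n_dims, n_atoms))
    D /= np.linalg.norm(D, axis=0)                # unit-norm atoms

    true_code = np.zeros(n_atoms)
    true_code[rng.choice(n_atoms, 5, replace=False)] = rng.normal(size=5)
    signal = D @ true_code + 0.01 * rng.normal(size=n_dims)

    lasso = Lasso(alpha=0.01, max_iter=10000)     # L1-penalized least squares
    lasso.fit(D, signal)
    code = lasso.coef_
    print("atoms used in the sparse code:", int(np.sum(np.abs(code) > 1e-3)))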
12 May. Steering Committee Meeting.
Open meeting of the seminar steering committee; all are welcome to attend.
Last change worth mentioning Wednesday 6 May 1998
CRS