Overview
A number of researchers in computational/systems neuroscience and in
information/communication theory are investigating problems of information
representation and processing. While their goals are often the same, these
researchers bring different perspectives to a common set of neuroscience
problems. They often participate in different fora, however, and their
interaction is limited.
The goal of the workshop is to bring some of these researchers together to
discuss challenges posed by neuroscience, to exchange ideas, and to present
their latest work.
The workshop is targeted towards computational and systems neuroscientists
with interest in methods of information theory as well as
information/communication theorists with interest in neuroscience.
Thursday, July 20, 2006 (8:45 AM - 5:00 PM)
Morning Session (8:45 AM - 12:00 noon)
8:45 AM - 9:30 AM
Information Theory and Neuroscience
When Shannon developed information theory, he envisioned a systematic
way to determine how much "information" could be transmitted over an
arbitrary communications channel. While this classic work embraces
many of the key aspects of neural communication (e.g., stochastic
stimuli and communication signals, and multiple-neuron populations),
there are difficulties in applying his concepts meaningfully in
neuroscience. We describe the classic information-theoretic
quantities---entropy, mutual information, and capacity---and how they
can be used to assess the ultimate fidelity of the neural stimulus
representation. We also discuss some of the problems that accompany using and
interpreting these quantities in a neuroscience context, and we present an
overview of post-Shannon research areas, building on Shannon's rate-distortion
theory, that are highly relevant to neuroscientists seeking to understand the
neural code. The presentation is meant to be mostly tutorial in nature,
setting the stage for other workshop presentations.
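For reference, the standard definitions of these quantities (textbook forms,
not specific to this talk), for a discrete stimulus S and response R:

    H(R) = -\sum_r p(r) \log_2 p(r),
    I(S;R) = \sum_{s,r} p(s,r) \log_2 \frac{p(s,r)}{p(s)\,p(r)},
    C = \max_{p(s)} I(S;R),

where the capacity C is the largest mutual information attainable over
choices of the stimulus distribution.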
9:30 AM - 10:00 AM
Lossy Compression in Neural Sensory Systems: At What Cost (Function)?
Biological sensory systems, and more so individual neurons, do not represent
external stimuli exactly. This obvious statement is a consequence of the
almost infinite richness of the sensory world compared to the relative
paucity of neural resources that are used to represent it. Even if the
intrinsic uncertainty present in all biological systems is disregarded,
there will always be a many-to-one representation of whole regions of
sensory space by indistinguishable neural responses. When noise is included,
the representation is many-to-many.
One direction of research in sensory neuroscience, espoused by us and
others, is to identify and model such regions, with the goal of eventually
completely describing neural sensory function as the partitioning of sensory
space into distinguishable regions, associated with different response states
of a sensory system. In essence, our goal is to quantify the distortion
function of a particular biological system. In pursuing this agenda, the
vastness of sensory space imposes a certain style of analysis, one that
explicitly addresses the problems posed by the availability of only
relatively small datasets with which to describe relatively large sensory
regions. We report our progress in this direction.
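As background (standard rate-distortion theory, not the speaker's own
notation): a partition of sensory space into response classes is a lossy
code whose quality is measured by an expected distortion E[d(S, \hat{S})],
and the best achievable trade-off between representation rate and distortion
is

    R(D) = \min_{p(\hat{s}|s)\,:\,E[d(S,\hat{S})] \le D} I(S;\hat{S}),

so quantifying a system's distortion function d pins down which stimulus
regions its responses can and cannot distinguish.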
10:00 AM - 10:30 AM
Estimating Mutual Information by Bayesian Binning
I'll present an exact Bayesian treatment of a simple, yet sufficiently general
probability distribution model, constructed by considering piecewise constant
distributions P(X) with a uniform (second-order) prior over the locations of
the discontinuity points and the probabilities assigned to each piece. The
predictive distribution and the model complexity can be determined completely
from the data in a computational time that is linear in the number of degrees
of freedom and quadratic in the number of possible values of X. Furthermore,
exact values of the expectations of entropies and their variances can be
computed with polynomial effort. The expectation of the mutual information
thus becomes available as well, along with a strict upper bound on its
variance. The
resulting algorithm is particularly useful in experimental research areas
where the number of available samples is severely limited (e.g.
neurophysiology).
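The small-sample problem this addresses can be seen in a few lines of code
(a minimal sketch of the naive "plug-in" estimator whose downward bias
motivates Bayesian treatments; this illustrates the problem, and is not the
Bayesian binning algorithm itself):

    import numpy as np

    rng = np.random.default_rng(0)
    n_values = 20                        # alphabet size of X
    true_H = np.log2(n_values)           # entropy of the uniform distribution, in bits

    def plugin_entropy(samples, n_values):
        # Histogram the samples, then apply H = -sum p log2 p to the frequencies.
        counts = np.bincount(samples, minlength=n_values)
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    for n_samples in (10, 100, 1000, 100000):
        est = [plugin_entropy(rng.integers(0, n_values, n_samples), n_values)
               for _ in range(200)]
        print(f"n={n_samples:6d}: plug-in H = {np.mean(est):.3f} bits"
              f" (true {true_H:.3f}, bias {np.mean(est) - true_H:+.3f})")

Running this shows the plug-in estimate falling well below the true entropy
for small sample counts, the regime typical of neurophysiology.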
10:30 AM - 11:00 AM
Morning Break
11:00 AM - 11:30 AM
An Exactly Solvable Maximum Entropy Model
One of our main goals in life is to dream up distributions
that do a good job of approximating neural data. One natural
class of distributions is the maximum entropy class, which has
a great deal of aesthetic appeal.
Before applying these to real data, it would be nice
to develop an understanding of their properties. This,
however, is hard, mainly because for most correlated
distributions even sampling is intractable, let alone doing
anything analytic (the obvious exception, Gaussians, rarely
occur in real life, something that is especially true for
neural data).
Fortunately, there's at least one correlated distribution for
which we can calculate many things analytically; that model
is what we investigate here. Our goal is twofold. First, we
simply want to develop intuition for maximum entropy models.
Second, we want to understand something about estimating
these models from data, in particular whether results we get
from a reasonably small number of neurons, say around 10,
provide us with any information about what's happening when
the number of neurons is large, on the order of 1000s or more.
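For reference, the general form such distributions take (standard maximum
entropy theory; the exactly solvable model of the talk is not reproduced
here): the distribution that maximizes entropy subject to measured
expectations \langle f_i(x) \rangle is

    P(x) = \frac{1}{Z(\lambda)} \exp\Big(\sum_i \lambda_i f_i(x)\Big),

where the \lambda_i are chosen to match the constraints. Constraining the
mean rates and pairwise correlations of binary neurons x_j \in \{0,1\} gives
the familiar Ising-like case
P(x) \propto \exp(\sum_j h_j x_j + \sum_{j<k} J_{jk} x_j x_k).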
11:30 AM - 12:00 PM
Real-time Adaptive Information-theoretic Optimization of Neurophysiological Experiments
Adaptive optimal experimental design is a promising idea for
minimizing the number of trials needed to characterize a neuron's
response properties in the form of a parametric statistical encoding
model. However, this potential has been limited to date by severe
computational challenges: to find the stimulus which will provide the
most information about the (typically high-dimensional) model
parameters, we must perform a high-dimensional integration and
optimization in near-real time. Here we develop a fast algorithm,
based on a Fisher approximation of the Shannon information and
specialized numerical linear algebra techniques, to compute this
optimal (most informative) stimulus. This algorithm requires only a
one-dimensional linesearch, and is therefore efficient even for
high-dimensional stimulus and parameter spaces; for example, we
require just 10 milliseconds on a desktop computer to optimize a
100-dimensional stimulus, making real-time adaptive experimental
design feasible. Simulation results show that model parameters can be
estimated much more efficiently using these adaptive techniques than
by using random (nonadaptive) stimuli. Finally, we generalize the
algorithm to efficiently handle both fast adaptation due to
spike-history effects and slow, non-systematic drifts in the model
parameters.
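A toy illustration of the underlying idea (a hedged sketch, not the authors'
algorithm: it replaces their Fisher-approximation-plus-linesearch optimization
with a brute-force search over random candidate stimuli, and assumes a
Poisson neuron with an exponential nonlinearity):

    import numpy as np

    rng = np.random.default_rng(1)
    dim = 20
    theta_true = rng.normal(size=dim) / np.sqrt(dim)   # unknown parameter vector

    # Gaussian approximation to the posterior over theta.
    mu = np.zeros(dim)
    cov = np.eye(dim)

    def info_score(x, mu, cov):
        # Fisher-information score of stimulus x for a Poisson neuron with
        # rate exp(theta @ x), evaluated at the posterior mean: rate * x' C x.
        return np.exp(mu @ x) * (x @ cov @ x)

    for trial in range(200):
        # Adaptive step: pick the highest-scoring of many random unit-norm stimuli.
        cands = rng.normal(size=(500, dim))
        cands /= np.linalg.norm(cands, axis=1, keepdims=True)
        x = max(cands, key=lambda c: info_score(c, mu, cov))

        # Simulate the neuron's response and apply a crude Laplace-style update.
        y = rng.poisson(np.exp(theta_true @ x))
        rate = np.exp(mu @ x)
        cov = np.linalg.inv(np.linalg.inv(cov) + rate * np.outer(x, x))
        mu = mu + cov @ (x * (y - rate))               # one Newton-style step

    print("parameter error:", np.linalg.norm(mu - theta_true))

Replacing the adaptive candidate search with a single random stimulus per
trial makes the final parameter error noticeably larger, which is the effect
the simulations in the talk quantify.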
12:00 PM - 2:00 PM
Lunch
Afternoon Session (2:00 PM - 5:00 PM)
2:00 PM - 2:30 PM
Some Interrelationships between Information,
Information Processing, and Energy
The talk will consider the development of
quantitative predictions that arise when communication and
information processing are constrained by efficient use of
metabolic energy.
Computation in the brain is adiabatic. Information processing
and communication use currents powered by ion gradients;
the Na-K ATPase pump then expends metabolic energy to
maintain these ion gradients via an unmixing process. Both
ends of the process (computation and pumping) are essentially
frictionless. The heat generated by the brain comes from the
inefficient conversion of glucose into ATP.
Several of the ways that energy is used in the brain can be surprising,
particularly when compared to energy use in manufactured
computers and communication equipment. Some examples will be
discussed.
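A schematic version of the kind of prediction this yields (following the
spirit of Levy & Baxter, 1996, with placeholder costs): if a neuron fires
with probability p per coding interval, conveys at most the binary entropy
H(p) = -p \log_2 p - (1-p) \log_2 (1-p) bits, and pays a resting cost c_0
plus a per-spike cost c, then energy-efficient coding maximizes bits per
unit energy,

    \max_p \frac{H(p)}{c_0 + c\,p},

and the optimum sits at low firing probabilities whenever spiking is
expensive (c \gg c_0), consistent with the low average firing rates observed
in cortex.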
Attwell, D. & Gibb, A., Nat. Rev. Neurosci. 6, 2005, 841-849.
Attwell, D. & Laughlin, S. B., J. Cerebral Blood Flow and Metabolism 21, 2001, 1133-1145.
Crotty, P., Sangrey, T., & Levy, W. B., J. Neurophysiol., in press, 2006.
Levy, W. B. & Baxter, R. A., Neural Comp. 8, 1996, 531-543.
Levy, W. B. & Baxter, R. A., J. Neurosci. 22, 2002, 4746-4755.
Levy, W. B., Crotty, P., Sangrey, T., & Friesen, O., J. Neurophysiol., in press, 2006.
Sangrey, T. & Levy, W. B., Neurocomputing 65-66, 2005, 907-913.
2:30 PM - 3:00 PM
Information Representation with an Ensemble of Hodgkin-Huxley Neurons
Information representation in Communications and Information Theory
is based on the classical sampling theorem. A continuous-time
bandlimited signal is represented by a sequence of equidistant
signal samples. The signal can be perfectly recovered if the
samples are taken at a rate higher than or equal to the Nyquist
rate. A clock is needed for sampling, however. How can stimuli
be represented in neural systems given the absence of a ubiquitous
clock? Motivated by the natural representation of stimuli in sensory
systems, we review the representation of bandlimited stimuli with
integrate-and-fire (IAF) neurons. As in the case of classical sampling,
perfect recovery of a bandlimited stimulus from the spike sequence
generated by an IAF neuron can be achieved provided that the average
spike rate is greater than the Nyquist rate. We extend these results
to stimuli encoded with a Hodgkin-Huxley neuron and describe a general
algorithm for recovering the stimulus at the input of a neuronal ensemble.
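In the ideal IAF case the encoding admits a compact description (this
t-transform form follows Lazar's published work on time encoding; b is the
bias, \kappa the integration constant, \delta the threshold): consecutive
spike times satisfy

    \int_{t_k}^{t_{k+1}} u(s)\, ds = \kappa\delta - b\,(t_{k+1} - t_k),

so every interspike interval delivers one linear measurement of the stimulus
u, and perfect recovery is possible when these measurements accumulate, on
average, faster than the Nyquist rate.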
3:00 PM - 3:30 PM
Decoding Spike Times without Knowing the Stimulus Time
In most studies of neural coding one considers whether
registering the spike times with a fine temporal precision
increases the information available about the external
stimulus. However, most studies do not consider whether the
information contained in precise spike times can be decoded
by another neural system which, unlike the experimenter, does
not have a precise knowledge of the stimulus time. In this
talk we introduce in detail a class of information-theoretic
metrics that can simultaneously quantify how much
information is encoded by precise spike times, and how much
of this information can be decoded by a downstream system
that has only a limited knowledge of the
stimulus time. We discuss potential applications to
experimental studies of
neural coding, as well as directions for our future
mathematical research on the issue.
This is joint work with my long-term collaborators Mathew
Diamond (SISSA) and Ehsan Arabzadeh (Sydney University).
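A toy numerical illustration of the issue (a hedged sketch, not the metrics
introduced in the talk: it merely measures how the stimulus information
carried by a precisely timed spike latency shrinks as the assumed
stimulus-onset time becomes uncertain):

    import numpy as np

    def mutual_info(joint):
        # I(S;T) in bits from a joint probability table p(s, t).
        ps = joint.sum(axis=1, keepdims=True)
        pt = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return (joint[nz] * np.log2(joint[nz] / (ps @ pt)[nz])).sum()

    rng = np.random.default_rng(2)
    n_bins, n_trials = 40, 100000       # 1 ms bins; latencies fall in a 40 ms window
    latency = {0: 10.0, 1: 14.0}        # two stimuli evoke spikes 4 ms apart

    for jitter_sd in (0.0, 1.0, 2.0, 4.0, 8.0):
        joint = np.zeros((2, n_bins))
        s = rng.integers(2, size=n_trials)
        t = (np.where(s == 0, latency[0], latency[1])
             + rng.normal(0.0, 2.0, n_trials)          # intrinsic 2 ms spike jitter
             + rng.normal(0.0, jitter_sd, n_trials))   # uncertainty in onset time
        bins = np.clip(np.rint(t).astype(int), 0, n_bins - 1)
        np.add.at(joint, (s, bins), 1.0)
        joint /= joint.sum()
        print(f"onset jitter {jitter_sd:3.0f} ms: "
              f"I(S; spike time) = {mutual_info(joint):.3f} bits")

As the onset jitter grows past the latency difference between the two
stimuli, the decodable information collapses, even though the spike times
themselves remain precise.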
3:30 PM - 4:00 PM
Afternoon Break
4:00 PM - 4:30 PM
Correlations, Synergy and Coding in the Cortex: an Old Dogma
Learns New Tricks
Recent results from a number of groups have revealed the
"correlation = 0.2 in the cortex" dogma to be an
over-simplification: with carefully constructed stimuli,
inter-neuronal synchronization can depend upon stimulus
properties in interesting ways. We have been using
Information Theory to study the effect of this stimulus-
dependent synchronization on the neural coding of orientation
and contrast in V1. We analysed pairs of simultaneously
recorded neurons in the macaque primary visual cortex, whose
receptive fields were carefully mapped and co-stimulated.
Direction coding showed weak synergistic effects at short
timescales, trailing off to informational independence at
long timescales. An information component analysis revealed
that this was due to a balance of a synergistic contribution
due to the stimulus-dependence of synchronization, with
redundancy due to the overlap of tuning. In comparison,
contrast coding was dominated by redundancy due to the
similarity in contrast tuning curves and showed a weak
synergy at only very short time scales (< 5 ms). Stimulus
dependence of synchronization does therefore have an effect
on coding, and its effect is to pull the coding regime back
towards informational independence, when redundancy would
otherwise rule due to the (possibly inevitable) effects of
correlations and tuning overlap.
Work performed in collaboration with Fernando Montani, Adam
Kohn and Matthew Smith.
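The synergy/redundancy language above has a standard quantitative reading
(the textbook definition, not the full information component breakdown used
in this work): for a pair of cells with responses R_1, R_2 and stimulus S,

    \Delta I = I(R_1, R_2; S) - I(R_1; S) - I(R_2; S),

with \Delta I > 0 indicating synergy, \Delta I < 0 redundancy, and
\Delta I = 0 the informational independence referred to in the abstract.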
4:30 PM - 5:00 PM
Panel Discussion
CNS*2006 Workshop on
Methods of Information Theory in Computational Neuroscience
Thursday, July 20, 2006
Methods originally developed in Information Theory have found wide applicability in
computational neuroscience. Beyond these original methods there is a need to
develop novel tools and approaches that are driven by problems arising in neuroscience.
Organizers
Aurel A. Lazar, Department of Electrical Engineering, Columbia University
and
Alex Dimitrov, Center for Computational Biology, Montana State University.
Program Overview
Entropy Models, Rate Distortion Measures & Design of Experiments
Chair: A.A. Lazar
Don H. Johnson and Christopher J. Rozell,
Department of Electrical Engineering, Rice University.
Alex Dimitrov, Center for Computational Biology, Montana State University.
Dominik Endres, School
of Psychology, University of St Andrews.
Peter Latham, Gatsby Computational Neuroscience Unit,
University College London.
Jeremy Lewi and Robert Butera, Georgia Institute of Technology, and
Liam Paninski, Department of Statistics, Columbia University.
Information Representation, Processing and Decoding
Chair: A. Dimitrov
William B. Levy,
Laboratory for Systems Neurodynamics, University of Virginia.
Aurel A. Lazar, Department of Electrical Engineering, Columbia University.
Stefano Panzeri, Faculty of Life Sciences, University of Manchester.
Simon R. Schultz, Department of Bioengineering, Imperial College.