Associate Professor
Computer Science Department and
Center for the Neural Basis of Cognition
Carnegie Mellon University
Computational Principles of Sensory Coding
Part of my work focuses on the issue of sensory coding: How should natural images or sounds be encoded? One approach that my research has developed is the application of probabilistic models to learn codes that are efficient in an information-theoretic sense. Given a particular sensory environment, this top-down computational approach makes predictions about the properties of neural codes at the population level. When this approach is applied to natural images, the resulting representation very closely matches the receptive field properties of visual cortex cells. This framework also demonstrates that the model system encodes natural images more efficiently than many common Fourier or wavelet-based coding schemes. Because the theory is based on general probabilistic models of a high-dimensional data space, it can be applied to a variety of sensory patterns, including static and dynamic visual images as well as patterns from different acoustic environments. A recent extension of this theoretical framework to the temporal domain led to a theory of how continuous time-varying signals can be optimally represented by a population of spiking neurons.
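To make this kind of approach concrete, the sketch below (a minimal illustration of my own, not the actual models or data from this work) learns a linear code for image patches by maximizing the statistical independence of the coefficients, using FastICA from scikit-learn on synthetic patches generated from sparse sources; the patch size, component count, and generated data are all assumptions for illustration.

    # Sketch: learning an efficient linear code for image patches.
    # Assumptions (not from the original text): 16x16 patches flattened to 256
    # dimensions, 64 components, synthetic sparse-source data standing in for
    # whitened natural-image patches.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)

    # Placeholder data: sparse (Laplacian) sources mixed linearly, standing in
    # for a matrix of natural-image patches, one flattened patch per row.
    sources = rng.laplace(size=(10000, 64))
    mixing = rng.standard_normal((64, 256))
    patches = sources @ mixing

    # FastICA finds a linear transform whose outputs are maximally
    # statistically independent -- one formalization of coding efficiency.
    ica = FastICA(n_components=64, random_state=0)
    codes = ica.fit_transform(patches)        # efficient coefficients
    basis_functions = ica.mixing_             # columns ~ learned "receptive fields"

    print(codes.shape, basis_functions.shape)  # (10000, 64), (256, 64)

On real natural-image patches the columns of the learned mixing matrix take on the localized, oriented structure referred to above; here they simply recover the random mixing used to generate the placeholder data.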
Higher-Order Representation and Inference
A second aim of my work is to develop algorithms for learning hierarchical representations. These are important because they can extract successively more abstract properties of sensory patterns and may provide insight into how the brain carries out the more complex aspects of perception that subserve object recognition. One result of this work has been a computational explanation for the role of feedback, which is ubiquitous in the brain but poorly understood. The long-term goal of this research is to develop abstract neural architectures and learning algorithms that help elucidate the details of the higher-level processes that allow the brain to perceive and process natural stimuli under a wide range of conditions. These include the computational principles underlying the representation and learning of perceptual invariances, as well as attentional processes such as auditory and visual scene segmentation.
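One common way to formalize the hierarchical idea, sketched below under my own simplifying assumptions rather than as the specific architectures developed in this work, is a two-layer model in which a second layer learns a code for the first layer's coefficients, and its top-down prediction of those coefficients plays the role of feedback; the layer sizes and synthetic data are illustrative only.

    # Sketch: a two-layer hierarchy in which the second layer models the
    # first layer's coefficients, and its top-down prediction acts as feedback.
    # All sizes and the synthetic data are illustrative assumptions.
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning

    rng = np.random.default_rng(0)
    X = rng.standard_normal((2000, 64))       # stand-in for sensory patterns

    layer1 = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
    H1 = layer1.fit_transform(X)              # first-layer (lower-level) code

    layer2 = MiniBatchDictionaryLearning(n_components=8, alpha=1.0, random_state=0)
    H2 = layer2.fit_transform(H1)             # second-layer (more abstract) code

    # "Feedback": the higher layer's prediction of the lower layer's activity,
    # which could in principle be used to refine or disambiguate that code.
    H1_topdown = H2 @ layer2.components_
    X_recon = H1_topdown @ layer1.components_
    print(X_recon.shape)                      # (2000, 64)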
Signal Processing
As we better understand how to represent and process sensory information, we can also develop better algorithms for signal processing. For example, applying the efficient coding framework to natural images yields a code that is demonstrably more efficient than many standard compression methods such as wavelets or JPEG. Furthermore, because this framework is based on probability theory, it leads naturally to algorithms for optimal inference in the face of uncertainty, such as denoising and filling in missing information. Further extensions have led to novel algorithms for texture recognition and segmentation.
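As a small illustration of inference under uncertainty in this style, the sketch below denoises a signal by computing the MAP estimate of its coefficients under a Laplacian (sparse) prior in an orthonormal DCT basis, which reduces to soft-thresholding; the choice of basis, noise level, and threshold are my own assumptions, not the particular model from this work.

    # Sketch: denoising as MAP inference under a sparse (Laplacian) prior.
    # With an orthonormal basis and Gaussian noise, the MAP estimate of the
    # coefficients is a soft-threshold of the noisy coefficients.
    # The DCT basis, test signal, noise level, and threshold are assumptions.
    import numpy as np
    from scipy.fft import dctn, idctn

    rng = np.random.default_rng(0)

    x = np.linspace(0, np.pi, 32)
    clean = np.outer(np.sin(2 * x), np.cos(3 * x))   # smooth, DCT-sparse test "patch"
    sigma = 0.5
    noisy = clean + sigma * rng.normal(size=clean.shape)

    coeffs = dctn(noisy, norm="ortho")               # transform to the coding basis

    # Soft-thresholding = MAP estimate with a Laplacian prior on coefficients.
    thresh = 1.5 * sigma
    denoised_coeffs = np.sign(coeffs) * np.maximum(np.abs(coeffs) - thresh, 0.0)

    denoised = idctn(denoised_coeffs, norm="ortho")
    print(float(np.mean((noisy - clean) ** 2)),      # error before denoising
          float(np.mean((denoised - clean) ** 2)))   # error after denoising

Filling in missing values can be handled in the same spirit, by inferring the most probable coefficients given only the observed pixels, though that requires an iterative solver rather than a single threshold.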
Efficient Algorithms for Processing Natural Signals
Another important aspect of my research is the development of computationally efficient algorithms. Many algorithms are attractive from the viewpoint of theory but are of sufficient complexity that they can only be applied to small toy problems, which may not provide a valid test of the underlying computational theory. Developing efficient implementations of these algorithms allows them to be tested on more realistic problems and also affords the possibility of developing practical applications.