Neural coding - Touch and proprioception
We seek to elucidate the neural codes that underlie the sense of touch: At each stage of processing along the somatosensory neuraxis, how does neural activity carry information about objects held in the hand? We assess the degree to which putative neural codes carry information about object properties – shape, size, texture, etc. – and account for the perception thereof, as measured in psychophysical experiments with humans or monkeys. For example, we have shown that the tactile perception of texture is mediated by two neural codes in the nerves: Coarse surface features are encoded in the spatial patterns of activation across one population of tactile nerve fibers, while fine surface features – only tangible when the skin moves across the surface – are encoded in millisecond-precision temporal spiking patterns in two other populations of nerve fibers. These disparate neural signals are read out by overlapping populations of neurons in somatosensory cortex: Some neurons extract information from spatial patterns of afferent input and preferentially encode coarse textural features, whereas other neurons extract information from temporal patterns and preferentially encode fine features.
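To illustrate the distinction between these two codes, the Python sketch below contrasts a rate-based spatial readout with a timing-based temporal readout. It uses simulated spike trains and invented parameters purely for illustration; it is not an analysis from our studies.

import numpy as np

# Simulated population of afferent spike trains (binary spike counts in 1-ms bins).
rng = np.random.default_rng(0)
n_afferents, duration_s, dt = 20, 1.0, 0.001
spikes = rng.random((n_afferents, int(duration_s / dt))) < 0.01

# Spatial code (coarse features): the pattern of mean firing rates across the
# afferent population, ignoring fine spike timing.
spatial_code = spikes.sum(axis=1) / duration_s            # spikes/s per afferent

# Temporal code (fine features): millisecond-scale structure within each spike
# train, summarized here by power at high frequencies.
spectrum = np.abs(np.fft.rfft(spikes.astype(float), axis=1)) ** 2
freqs = np.fft.rfftfreq(spikes.shape[1], d=dt)
high_freq_power = spectrum[:, freqs > 50.0].mean(axis=1)  # power above 50 Hz

The key point of the sketch is that the same spike trains support two different readouts: one that discards timing and one that depends on it.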
Measuring neuronal responses to the same stimuli in both the nerve and the brain allows us to discover the computations that the afferent input undergoes to ultimately give rise to cortical responses. We find that the cortical representation of texture is much more robust to changes in the way the surface is explored than is its counterpart in the nerve, and we demonstrate that this robustness is achieved via the identified computations. This type of detailed analysis has been applied to other stimulus domains, notably vibration, shape, and motion. A running theme of this work is the identification of canonical computations, observed not only across stimulus domains (e.g., vibration and texture) but also across sensory modalities, for example in the extraction of shape and motion information in vision and touch, and of texture and vibration information in audition and touch.
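As a schematic example of what such a computation might look like, the sketch below models a cortical response as a linear filter applied to afferent firing rates, with excitation followed by delayed inhibition (a temporal differentiator). The kernel shape and all parameter values are invented for illustration and are not fits from our data.

import numpy as np

def cortical_filter(afferent_rate, dt=0.001, exc_tau=0.005,
                    inh_tau=0.015, inh_delay=0.010):
    # Excitatory lobe followed by a delayed inhibitory lobe: a differentiator.
    t = np.arange(0.0, 0.05, dt)
    excitation = np.exp(-t / exc_tau)
    inhibition = np.exp(-(t - inh_delay) / inh_tau) * (t >= inh_delay)
    kernel = excitation - inhibition
    kernel /= np.abs(kernel).sum()
    response = np.convolve(afferent_rate, kernel, mode="same")
    return np.maximum(response, 0.0)   # rectify: firing rates are non-negative

# A step in afferent drive: the filtered response emphasizes the change
# (onset) rather than the sustained level, one way a downstream representation
# can become less sensitive to exactly how the stimulus is delivered.
afferent_rate = np.concatenate([np.zeros(200), np.full(300, 50.0)])  # spikes/s
cortical_rate = cortical_filter(afferent_rate)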
Neuroprosthetics - Biomimetic sensory feedback
One approach to restoring sensorimotor function in amputees or tetraplegic patients consists of equipping them with anthropomorphic robotic arms that are interfaced directly with the nervous system. To control these arms, not only must motor intention be translated into movements of the limb, but sensory signals must also be transmitted from the limb back to the patient. Indeed, without these signals, controlling the arm is slow, clumsy, and effortful. With this in mind, we develop approaches to convey meaningful and naturalistic sensations through stimulation of peripheral or cortical neurons, attempting to reproduce, to the extent possible, the patterns of neuronal activation that are relevant for basic object interactions. We anticipate that these studies will constitute an important step toward restoring the sense of touch to those who have lost it.
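A minimal sketch of what a biomimetic encoder of this kind could look like is shown below: stimulation amplitude tracks both the pressure on a prosthetic fingertip and its rate of change, echoing the sustained and transient components of natural afferent responses. The function name, gains, and limits are placeholders for illustration, not parameters from our devices or studies.

import numpy as np

def biomimetic_encoder(pressure, dt=0.01, k_sustained=0.8, k_transient=2.0,
                       max_amplitude_uA=100.0):
    # Drive combines the sustained pressure signal with its rate of change,
    # so contact onset and offset produce brief peaks in stimulation, as they
    # do in the firing of tactile nerve fibers.
    dpressure = np.gradient(pressure, dt)
    drive = k_sustained * pressure + k_transient * np.abs(dpressure)
    return np.clip(drive, 0.0, max_amplitude_uA)   # stimulation amplitude (uA)

# Example: a one-second contact produces elevated stimulation throughout,
# with transients at contact and release.
t = np.arange(0.0, 2.0, 0.01)
pressure = np.where((t > 0.5) & (t < 1.5), 1.0, 0.0)
amplitude = biomimetic_encoder(pressure)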