Summary
We explored many aspects of multisensory interactions between audition,
vision, and somatosensation. Before we could study somatosensation using fMRI, we
had to build a tactile stimulator that was compatible with the MRI environment.
Chapter one describes the system that was built based on commercially available
piezoelectric bending actuators. This system proved to be able to deliver robust,
computer-controlled tactile stimulation to several parts of the body simultaneously.
In chapter two we used these piezoelectric stimulators to find responses to
tactile stimuli in a visual area responsible for processing motion, area MST. Why would
a visual motion area respond to tactile stimulation? One possible answer is that this
multimodal integration aids in hand-eye coordination. This contradicts the classical
model of dedicated sensory cortices and supports a model of strongly
interconnected sensory cortical areas with at least some parallel processing
between them.
In chapter three we used the piezoelectric stimulators to investigate the
superior temporal sulcus multisensory area (STSms), an area that had previously been shown to
be involved in integrating auditory and visual information. STSms overlaps with
Wernicke's area, an area that has long been known to be critical for understanding both
spoken and written language. Since we demonstrated that STSms also responds to
tactile stimulation, could this area be the neural substrate for touch replacement that
allows the use of Braille? If so, our findings suggest that touch replacement could also be
used to help deaf people understand spoken language by using a device that translates