The brain is made up of two hemispheres, the right and the left, which are very similar in their anatomy and physiology. Yet a striking number of differences in how the two hemispheres function have been described,
ranging from differences in the processing of basic sensory features to differences in emotion, language, and problem solving. This disparity between the general neural similarity of the two hemispheres and the distinctiveness of their functions highlights the limits of our current understanding of the mapping between neural and functional properties.
Although we now have a better sense of which brain areas may be involved in various language functions, the computations these areas perform are not yet understood well enough to explain why a left hemisphere (LH) area is crucially involved in a given function while the corresponding right hemisphere (RH) area, with the same basic cell types, neurochemistry, and inputs and outputs, is not. In the CABLab, we conduct
studies designed to understand how and why the hemispheres differ in how they comprehend and remember the world, as well as how they work together during complex cognitive tasks like language processing.
To examine hemispheric differences in meaning processing, we use what is known as the visual half-field (VF) presentation method, which we combine with the measurement of event-related brain potentials (ERPs), the brain's electrical responses recorded at the scalp.
The visual half-field technique takes advantage of the fact that information presented in peripheral vision on the right side is sent first to the left hemisphere, and, correspondingly, information in left peripheral vision is sent first to the right hemisphere. Although the hemispheres can share information (across the corpus callosum, a bundle of nerve fibers connecting the two hemispheres), giving one hemisphere information more directly gives it a processing advantage, which reveals differences between the hemispheres.
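The crossed-projection logic behind VF presentation can be sketched as a small function. This is an illustrative toy, not code from our studies: the function name and the 2-degree eccentricity cutoff (stimuli too close to fixation fall on both halves of the retina and are not usefully lateralized) are assumptions chosen for the example.

```python
def first_receiving_hemisphere(eccentricity_deg: float,
                               min_eccentricity: float = 2.0) -> str:
    """Return which hemisphere receives a lateralized stimulus first.

    eccentricity_deg: horizontal position of the stimulus in degrees of
    visual angle; negative = left of fixation, positive = right.
    min_eccentricity: illustrative cutoff below which the stimulus is
    too close to fixation to be reliably lateralized.
    """
    if abs(eccentricity_deg) < min_eccentricity:
        # Near fixation, both hemispheres receive the input at once,
        # so the trial gives neither hemisphere a head start.
        return "bilateral"
    # Crossed projection: the right visual field projects first to the
    # left hemisphere, and the left visual field to the right hemisphere.
    return "left" if eccentricity_deg > 0 else "right"


# Example trials: a word 4 degrees right of fixation reaches the LH first.
print(first_receiving_hemisphere(4.0))   # left
print(first_receiving_hemisphere(-4.0))  # right
print(first_receiving_hemisphere(0.5))   # bilateral
```

The "head start" this routing provides is what lets the method reveal hemispheric differences even though the corpus callosum later shares the information between hemispheres.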
Our studies have shown that some assumptions about how the hemispheres function do not hold up when we measure brain activity rather than just behavior. For example, some theories have argued that the right hemisphere represents the meaning of words more "coarsely" than the left, and that the right hemisphere cannot build a representation of the meaning of an entire sentence (as opposed to a single word). Our ERP studies, however, have consistently indicated that the right and left hemispheres represent basic aspects of meaning fairly similarly, and that the right hemisphere can construct the meaning of a sentence.
While both hemispheres seem to be able to use world knowledge to comprehend sentences, they also seem to use context information differently (see figure). We have put forward a theoretical framework, PARLO (Production Affects Reception in Left Only), for understanding the pattern of ERP data in the context of previous neuropsychological and behavioral results.
In particular, we have found that the left hemisphere (which is better at language production)
seems to make predictions about what is likely to happen next (such as what word will come up in a sentence) and to get ready to process those inputs. In contrast, the right hemisphere seems
biased toward maintaining information veridically and integrating it with working memory. Such a division of labor across the hemispheres may help the brain manage the inherent tradeoff between efficiency and accuracy in information processing.