Lisa Aziz-Zadeh, PhD
Assistant Professor of Occupational Science
Room: HNB B20
Phone: (213) 821-2970
Lisa Aziz-Zadeh trained at the University of California, Los Angeles, receiving her BA in psychology with a minor in neuroscience, and her PhD in psychology with an emphasis in cognitive neuroscience. She completed postdoctoral work in Dr. Giacomo Rizzolatti's laboratory at the University of Parma, Italy, and in Dr. Richard Ivry's laboratory at the University of California, Berkeley, and was a fellow at the UCLA Tennenbaum Family Creativity Initiative. She has published numerous papers and book chapters on the mirror neuron system, embodied cognition, and language. In 2008-2009, she was an invited fellow at the Institute for Advanced Study in Berlin.
Dr. Aziz-Zadeh studies embodied representations, creativity, and language from a cognitive neuroscience perspective, using techniques including functional magnetic resonance imaging (fMRI) and transcranial magnetic stimulation (TMS).
Doctor of Philosophy (PhD) in Neurosciences
University of California, Los Angeles
Doctor of Philosophy (PhD) in Psychology
University of California, Los Angeles
Master of Arts (MA) in Neurosciences
University of California, Los Angeles
Master of Arts (MA) in Psychology
University of California, Los Angeles
Bachelor of Arts (BA) in Psychology
University of California, Los Angeles
Aziz-Zadeh, L. S., & Liew, S. L. (2013). The human mirror neuron system, social control, and language. In D. D. Franks & J. H. Turner (Eds.), Handbook of neurosociology. Dordrecht, The Netherlands: Springer.
The human putative mirror neuron system (MNS) is a key network hypothesized to play a role in many social cognitive and language-related abilities. This chapter begins by discussing basic findings on the mirror system, which encompasses motor-related brain regions that fire both when an individual performs an action and when the individual observes others performing the same action. We then discuss how these shared action/observation regions are thought to underlie one’s ability to understand others via simulation of their actions onto one’s own motor representations. Finally, we conclude by noting how the frontal mirror region coincides with Broca’s area, a language region in the brain, leading some to propose that the MNS may also play a role in language and gesture abilities.
Liew, S. L., & Aziz-Zadeh, L. S. (2011). The human mirror neuron system and social cognition. In R. P. Ebstein, S. Shamay-Tsoory, & S. H. Chew (Eds.), From DNA to social cognition (1st ed.). Hoboken, NJ: Wiley-Blackwell.
Grafton, S. T., Aziz-Zadeh, L. S., & Ivry, R. B. (2009). Relative hierarchies and the representation of action. In M. Gazzaniga (Ed.), The cognitive neurosciences (4th ed.). Cambridge, MA: MIT Press.
Aziz-Zadeh, L. S., & Ivry, R. (2008). Action representation and the mirror neuron system. In D. Sternad (Ed.), Progress in motor control: A multidisciplinary perspective (5th ed.). New York, NY: Springer.
Liew, S. L., Sheng, T., & Aziz-Zadeh, L. S. (2013). Experience with an amputee modulates one’s own sensorimotor response during action observation. NeuroImage, 69, 138-145. doi:10.1016/j.neuroimage.2012.12.028.
Observing actions performed by others engages one's own sensorimotor regions, typically with greater activity for actions within one's own motor abilities or for which one has prior experience. However, it is unclear how experience modulates the neural response during the observation of impossible actions, beyond one's own abilities. Using fMRI, we scanned typically developed participants as they observed actions performed by a novel biological effector (the residual limb of a woman born without arms) and a familiar biological effector (a hand). Participants initially demonstrated greater activity in the bilateral inferior and superior parietal cortices when observing actions made by the residual limb compared to the hand, with more empathic participants activating the right inferior parietal lobule, corresponding to the posterior component of the action observation network, more strongly. Activity in the parietal regions may indicate matching the kinematics of a novel effector to one's own existing sensorimotor system, a process that may be more active in more empathic individuals. Participants then received extended visual exposure to each effector, after which the only remaining difference between activation for residual limb and hand actions was in the right superior parietal lobule. This suggests that visual experience may attenuate the difference between how residual limb and hand actions are represented using one's own body representations, allowing us to flexibly map physically different others onto our own body representations.
Aziz-Zadeh, L. S., Liew, S. L., & Dandekar, F. (2012). Exploring the neural correlates of visual creativity. Social Cognitive and Affective Neuroscience, doi:10.1093/scan/nss021.
Although creativity has been called the most important of all human resources, its neural basis is still unclear. In the current study, we used fMRI to measure neural activity in participants solving a visuospatial creativity problem that involves divergent thinking and has been considered a canonical right hemisphere task. As hypothesized, both the visual creativity task and the control task as compared to rest activated a variety of areas including the posterior parietal cortex bilaterally and motor regions, which are known to be involved in visuospatial rotation of objects. However, directly comparing the two tasks indicated that the creative task more strongly activated left hemisphere regions including the posterior parietal cortex, the premotor cortex, dorsolateral prefrontal cortex (DLPFC) and the medial PFC. These results demonstrate that even in a task that is specialized to the right hemisphere, robust parallel activity in the left hemisphere supports creative processing. Furthermore, the results support the notion that higher motor planning may be a general component of creative improvisation and that such goal-directed planning of novel solutions may be organized top-down by the left DLPFC and by working memory processing in the medial prefrontal cortex.
Liew, S. L., Garrison, K. A., Werner, J., & Aziz-Zadeh, L. S. (2012). The mirror neuron system: Innovations and implications for occupational therapy. OTJR: Occupation, Participation and Health, 32(3), 79-86. doi:10.3928/15394492-20111209-01.
Occupational therapy has traditionally championed the use of meaningful occupations in rehabilitation. Emerging research in neuroscience about the putative human mirror neuron system may provide empirical support for the use of occupations to improve outcomes in rehabilitation. This article provides an interdisciplinary framework for understanding the mirror neuron system—a network of motor-related brain regions activated during the production and perception of the same actions—in relation to occupational therapy. The authors present an overview of recent research on the mirror neuron system, highlighting features that are relevant to clinical practice in occupational therapy. They also discuss the potential use of the mirror neuron system in motor rehabilitation and how it may be deficient in populations served by occupational therapy, including individuals with dyspraxia, multisensory integration disorders, and social interaction difficulties. Methods are proposed for occupational therapy to translate these neuroscience findings on the mirror neuron system into clinical applications and the authors suggest that future research in neuroscience would benefit from integrating the occupational therapy perspective.
Aziz-Zadeh, L. S., Sheng, T., Liew, S. L., & Damasio, H. C. (2012). Understanding otherness: The neural bases of action comprehension and pain empathy in a congenital amputee. Cerebral Cortex, 22, 811-819. doi:10.1093/cercor/bhr139.
How do we understand and empathize with individuals whose bodies are drastically different from our own? We investigated the neural processes by which an individual with a radically different body, a congenital amputee who is born without limbs, engages her own sensory-motor representations as a means to understand other people’s body actions or emotional states. Our results support the prediction that when the goal of the action is possible for the observer, one’s own motor regions are involved in processing action observation, just as when individuals viewed those similar to themselves. However, when the observed actions are not possible, mentalizing mechanisms, relying on a different set of neural structures, are additionally recruited to process the actions. Furthermore, our results indicate that when individuals view others experiencing pain in body parts that they have, the insula and somatosensory cortices are activated, consistent with previous reports. However, when an individual views others experiencing pain in body parts that she does not have, the insula and secondary somatosensory cortices are still active, but the primary somatosensory cortices are not. These results provide a novel understanding for how we understand and empathize with individuals who drastically differ from the self.
Aziz-Zadeh, L. S., & Liew, S. L. (2011). The neuroscience of language and action in occupations: A review of findings from brain and behavioral sciences. Journal of Occupational Science, 18(2), 97-114. doi:10.1080/14427591.2011.575758.
Language is a dominant part of our daily activities, playing a significant role in narrating our actions and mediating our interactions with one another. In this article, we examine emerging neuroscientific evidence that language is biologically linked to action and suggest that studying language from an occupation-based perspective can contribute a rich dimension of analysis for occupational science. We briefly review several of the ways in which language is currently being incorporated into the study of occupations and conclude by suggesting future directions for an occupation-based study of language.
Liew, S. L., Han, S., & Aziz-Zadeh, L. S. (2011). Familiarity modulates mirror neuron and mentalizing regions during intention understanding. Human Brain Mapping, 32, 1986-1997. doi:10.1002/hbm.21164.
Recent research suggests that the inference of others' intentions from their observed actions is supported by two neural systems that perform complementary roles. The human putative mirror neuron system (pMNS) is thought to support automatic motor simulations of observed actions, with increased activity for previously experienced actions, whereas the mentalizing system provides reflective, non-intuitive reasoning of others' perspectives, particularly in the absence of prior experience. In the current fMRI study, we show how motor familiarity with an action and perceptual familiarity with the race of an actor uniquely modulate these two systems. Chinese participants were asked to infer the intentions of actors performing symbolic gestures, an important form of non-verbal communication that has been shown to activate both mentalizing and mirror neuron regions. Stimuli were manipulated along two dimensions: (1) actor's race (Caucasian vs. Chinese actors) and (2) participants' level of experience with the gestures (familiar or unfamiliar). We found that observing all gestures compared to observing still images was associated with increased activity in key regions of both the pMNS and mentalizing systems. In addition, observations of one's same race generated greater activity in the posterior pMNS-related regions and the insula than observations of a different race. Surprisingly, however, familiar gestures more strongly activated regions associated with mentalizing, while unfamiliar gestures more strongly activated the posterior region of the pMNS, a finding that is contrary to prior literature and demonstrates the powerful modulatory effects of both motor and perceptual familiarity on pMNS and mentalizing regions when asked to infer the intentions of intransitive gestures.
Aziz-Zadeh, L. S. (2011). Who's afraid of the boss: Cultural differences in social hierarchies modulate self-face recognition in Chinese and Americans. PLoS ONE, 6(2), e16901. doi:10.1371/journal.pone.0016901.
Human adults typically respond faster to their own face than to the faces of others. However, in Chinese participants, this self-face advantage is lost in the presence of one's supervisor, and they respond faster to their supervisor's face than to their own. While this “boss effect” suggests a strong modulation of self-processing in the presence of influential social superiors, the current study examined whether this effect was true across cultures. Given the wealth of literature on cultural differences between collectivist, interdependent versus individualistic, independent self-construals, we hypothesized that the boss effect might be weaker in independent than interdependent cultures. Twenty European American college students were asked to identify orientations of their own face or their supervisors' face. We found that European Americans, unlike Chinese participants, did not show a “boss effect” and maintained the self-face advantage even in the presence of their supervisor's face. Interestingly, however, their self-face advantage decreased as their ratings of their boss's perceived social status increased, suggesting that self-processing in Americans is influenced more by one's social status than by one's hierarchical position as a social superior. In addition, when their boss's face was presented with a labmate's face, American participants responded faster to the boss's face, indicating that the boss may represent general social dominance rather than a direct negative threat to oneself, in more independent cultures. Altogether, these results demonstrate a strong cultural modulation of self-processing in social contexts and suggest that the very concept of social positions, such as a boss, may hold markedly different meanings to the self across Western and East Asian cultures.
Garrison, K. A., Winstein, C. J., & Aziz-Zadeh, L. S. (2010). The mirror neuron system: A neural substrate for methods in stroke rehabilitation. Neurorehabilitation and Neural Repair, doi:10.1177/1545968309354536.
Mirror neurons found in the premotor and parietal cortex respond not only during action execution, but also during observation of actions being performed by others. Thus, the motor system may be activated without overt movement. Rehabilitation of motor function after stroke is often challenging due to severity of impairment and poor to absent voluntary movement ability. Methods in stroke rehabilitation based on the mirror neuron system—action observation, motor imagery, and imitation—take advantage of this opportunity to rebuild motor function despite impairments, as an alternative or complement to physical therapy. Here the authors review research into each condition of practice, and discuss the relevance of the mirror neuron system to stroke recovery.
Aziz-Zadeh, L. S., Sheng, T., & Gheytanchi, A. (2010). Common premotor regions for the perception and production of prosody and correlations with empathy and prosodic ability. PLoS ONE, 5(1), e8759. doi:10.1371/journal.pone.0008759.
BACKGROUND: Prosody, the melody and intonation of speech, involves the rhythm, rate, pitch and voice quality to relay linguistic and emotional information from one individual to another. A significant component of human social communication depends upon interpreting and responding to another person’s prosodic tone as well as one’s own ability to produce prosodic speech. However, there has been little work on whether the perception and production of prosody share common neural processes, and if so, how these might correlate with individual differences in social ability.
METHODS: The aim of the present study was to determine the degree to which perception and production of prosody rely on shared neural systems. Using fMRI, neural activity during perception and production of a meaningless phrase in different prosodic intonations was measured. Regions of overlap for production and perception of prosody were found in premotor regions, in particular the left inferior frontal gyrus (IFG). Activity in these regions was further found to correlate with how high an individual scored on two different measures of affective empathy as well as a measure of prosodic production ability.
CONCLUSIONS: These data indicate, for the first time, that areas that are important for prosody production may also be utilized for prosody perception, as well as other aspects of social communication and social understanding, such as aspects of empathy and prosodic ability.
Sheng, T., Gheytanchi, A., & Aziz-Zadeh, L. S. (2010). Default network deactivations are correlated with psychopathic personality traits. PLoS ONE, 5(9), e12611. doi:10.1371/journal.pone.0012611.
BACKGROUND: The posteromedial cortex (PMC) and medial prefrontal cortex (mPFC) are part of a network of brain regions that has been found to exhibit decreased activity during goal-oriented tasks. This network is thought to support a baseline of brain activity, and is commonly referred to as the “default network”. Although recent reports suggest that the PMC and mPFC are associated with affective, social, and self-referential processes, the relationship between these default network components and personality traits, especially those pertaining to social context, is poorly understood.
METHODOLOGY/PRINCIPAL FINDINGS: In the current investigation, we assessed the relationship between PMC and mPFC deactivations and psychopathic personality traits using fMRI and a self-report measure. We found that PMC deactivations predicted traits related to egocentricity and mPFC deactivations predicted traits related to decision-making.
CONCLUSIONS/SIGNIFICANCE: These results suggest that the PMC and mPFC are associated with processes involving self-relevancy and affective decision-making, consistent with previous reports. More generally, these findings suggest a link between default network activity and personality traits.
Landau, A. N., Aziz-Zadeh, L. S., & Ivry, R. B. (2010). The influence of language on perception: Listening to sentences about faces affects the perception of faces. The Journal of Neuroscience, 30, 15254-15261. doi:10.1523/JNEUROSCI.2046-10.2010.
We examined the effect of linguistic comprehension on early perceptual encoding in a series of electrophysiological and behavioral studies on humans. Using the fact that pictures of faces elicit a robust and reliable evoked response that peaks at ∼170 ms after stimulus onset (N170), we measured the N170 to faces that were preceded by primes that referred to either faces or scenes. When the primes were auditory sentences, the magnitude of the N170 was larger when the face stimuli were preceded by sentences describing faces compared to sentences describing scenes. In contrast, when the primes were visual, the N170 was smaller after visual primes of faces compared to visual primes of scenes. Similar opposing effects of linguistic and visual primes were also observed in a reaction time experiment in which participants judged the gender of faces. These results provide novel evidence of the influence of language on early perceptual processes and suggest a surprising mechanistic description of this interaction: linguistic primes produce content-specific interference on subsequent visual processing. This interference may be a consequence of the natural statistics of language and vision given that linguistic content is generally uncorrelated with the contents of perception.
Aziz-Zadeh, L. S., Kaplan, J. T., & Iacoboni, M. (2009). "Aha!": The neural correlates of verbal insight solutions. Human Brain Mapping, 30, 908-916. doi:10.1002/hbm.20554.
What are the neural correlates of insight solutions? To explore this question we asked participants to perform an anagram task while in the fMRI scanner. Previous research indicates that anagrams are unique in that they can yield both insight and search solutions in expert subjects. Using a single-trial fMRI paradigm, we utilized the anagram methodology to explore the neural correlates of insight versus search solutions. We used both reaction time measures and subjective reports to classify each trial as a search or insight solution. Data indicate that verbal insight solutions activate a distributed neural network that includes bilateral activation in the insula, the right prefrontal cortex, and the anterior cingulate. These areas are discussed with their possible role in evaluation and metacognition of insight solutions, as well as attention and monitoring during insight.
Aziz-Zadeh, L. S., Fiebach, C. J., Narayanan, S., Feldman, J., Dodge, E., & Ivry, R. B. (2008). Modulation of the FFA and PPA by language related to faces and places. Social Neuroscience, 3(3-4), 229-238. doi:10.1080/17470910701414604.
Does sentence comprehension related to faces modulate activity in the fusiform face area (FFA) and does sentence comprehension related to places modulate activity in the parahippocampal place area (PPA)? We investigated this question in an fMRI experiment. Participants listened to sentences describing faces, places, or objects, with the latter serving as a control condition. In a separate run, we localized the FFA and PPA in each participant using a perceptual task. We observed a significant interaction between the region of interest (FFA vs. PPA) and sentence type (face vs. place). Activity in the left FFA was modulated by face sentences and in the left PPA was modulated by place sentences. Surprisingly, activation in each region of interest was reduced when listening to sentences requiring semantic analysis related to that region's domain specificity. No modulation was found in the corresponding right hemisphere ROIs. We conclude that processing sentences may involve inhibition of some visual processing areas in a content-specific manner. Furthermore, our data indicate that this semantic-based modulation is restricted to the left hemisphere. We discuss how these results may constrain neural models of embodied semantics.
Kaplan, J. T., Aziz-Zadeh, L. S., Uddin, L. Q., & Iacoboni, M. (2008). The self across the senses: An fMRI study of self-face and self-voice recognition. Social Cognitive and Affective Neuroscience, 3, 218-223. doi:10.1093/scan/nsn014.
There is evidence that the right hemisphere is involved in processing self-related stimuli. Previous brain imaging research has found a network of right-lateralized brain regions that preferentially respond to seeing one's own face rather than a familiar other. Given that the self is an abstract multimodal concept, we tested whether these brain regions would also discriminate the sound of one's own voice compared to a friend's voice. Participants were shown photographs of their own face and friend's face, and also listened to recordings of their own voice and a friend's voice during fMRI scanning. Consistent with previous studies, seeing one's own face activated regions in the inferior frontal gyrus (IFG), inferior parietal lobe and inferior occipital cortex in the right hemisphere. In addition, listening to one's voice also showed increased activity in the right IFG. These data suggest that the right IFG is concerned with processing self-related stimuli across multiple sensory modalities and that it may contribute to an abstract self-representation.
Aziz-Zadeh, L. S., & Damasio, A. R. (2008). Embodied semantics for actions: Findings from functional brain imaging. Journal of Physiology, Paris, 102, 35-39. doi:10.1016/j.jphysparis.2008.03.012.
The theory of embodied semantics for actions specifies that the sensory-motor areas used for producing an action are also used for the conceptual representation of the same action. Here we review the functional imaging literature that has explored this theory and consider both supporting as well as challenging fMRI findings. In particular we address the representation of actions and concepts as well as literal and metaphorical phrases in the premotor cortex.
Gazzola, V., Aziz-Zadeh, L. S., & Keysers, C. (2006). Empathy and the somatotopic auditory mirror system in humans. Current Biology, 16, 1824-1829. doi:10.1016/j.cub.2006.07.072.
How do we understand the actions of other individuals if we can only hear them? Auditory mirror neurons respond both while monkeys perform hand or mouth actions and while they listen to sounds of similar actions. This system might be critical for auditory action understanding and language evolution. Preliminary evidence suggests that a similar system may exist in humans. Using fMRI, we searched for brain areas that respond both during motor execution and when individuals listened to the sound of an action made by the same effector. We show that a left hemispheric temporo-parieto-premotor circuit is activated in both cases, providing evidence for a human auditory mirror system. In the left premotor cortex, a somatotopic pattern of activation was also observed: A dorsal cluster was more involved during listening and execution of hand actions, and a ventral cluster was more involved during listening and execution of mouth actions. Most of this system appears to be multimodal because it also responds to the sight of similar actions. Finally, individuals who scored higher on an empathy scale activated this system more strongly, adding evidence for a possible link between the motor mirror system and empathy.
Aziz-Zadeh, L. S., Wilson, S., Rizzolatti, G., & Iacoboni, M. (2006). Congruent embodied representations for visually presented actions and linguistic phrases describing actions. Current Biology, 16(18), 1818-1823.
The thesis of embodied semantics holds that conceptual representations accessed during linguistic processing are, in part, equivalent to the sensory-motor representations required for the enactment of the concepts described. Here, using fMRI, we tested the hypothesis that areas in human premotor cortex that respond both to the execution and observation of actions - mirror neuron areas - are key neural structures in these processes. Participants observed actions and read phrases relating to foot, hand, or mouth actions. In the premotor cortex of the left hemisphere, a clear congruence was found between effector-specific activations of visually presented actions and of actions described by literal phrases. These results suggest a key role of mirror neuron areas in the re-enactment of sensory-motor representations during conceptual processing of actions invoked by linguistic stimuli.
Aziz-Zadeh, L. S., Koski, L., Zaidel, E., Mazziotta, J., & Iacoboni, M. (2006). Author response to "Mirror neurons and the lateralization of language". Journal of Neuroscience, 26, 6666-6667.
Aziz-Zadeh, L. S., Koski, L., Zaidel, E., Mazziotta, J., & Iacoboni, M. (2006). Lateralization of the human mirror neuron system. Journal of Neuroscience, 26(11), 2964-2970.
A cortical network consisting of the inferior frontal, rostral inferior parietal, and posterior superior temporal cortices has been implicated in representing actions in the primate brain and is critical to imitation in humans. This neural circuitry may be an evolutionary precursor of neural systems associated with language. However, language is predominantly lateralized to the left hemisphere, whereas the degree of lateralization of the imitation circuitry in humans is unclear. We conducted a functional magnetic resonance imaging study of imitation of finger movements with lateralized stimuli and responses. During imitation, activity in the inferior frontal and rostral inferior parietal cortex, although fairly bilateral, was stronger in the hemisphere ipsilateral to the visual stimulus and response hand. This ipsilateral pattern is at variance with the typical contralateral activity of primary visual and motor areas. Reliably increased signal in the right superior temporal sulcus (STS) was observed for both left-sided and right-sided imitation tasks, although subthreshold activity was also observed in the left STS. Overall, the data indicate that visual and motor components of the human mirror system are not left-lateralized. The left hemisphere superiority for language, then, must have been favored by other types of language precursors, perhaps auditory or multimodal action representations.
Aziz-Zadeh, L. S., Iacoboni, M., & Zaidel, E. (2006). Hemispheric sensitivity to body stimuli in simple reaction time. Experimental Brain Research, 170, 116-121.
Previous research indicates that people respond fastest when the motor response is (spatially, functionally, anatomically, or otherwise) congruent to the visual stimulus. This effect, called ideomotor compatibility, is thought to be expressed in motor areas. Congruence occurs when the stimulus and response share some dimensions in their internal representations. If the ideomotor compatibility hypothesis were true, we would expect facilitation when right hand stimuli are presented to the left hemisphere, or left hand stimuli are presented to the right hemisphere. To address this issue, we conducted a simple reaction time experiment with lateralized targets. Participants were instructed to press a button as soon as a target was observed. The target stimulus was a left hand, a right hand, or a neutral control. Each hemisphere showed faster responses to contralateral hand stimuli as compared with ipsilateral hand stimuli, consistent with the ideomotor compatibility hypothesis. The results support an automatic and implicit processing of visual stimuli within motor representations even when no recognition of, or decision about, the stimulus is necessary.
Aziz-Zadeh, L. S., Cattaneo, L., Rochat, M., & Rizzolatti, G. (2005). Covert speech arrest induced by rTMS over both motor and nonmotor left hemisphere frontal sites. Journal of Cognitive Neuroscience, 17, 928-938. doi:10.1162/0898929054021157.
Blocking the capacity to speak aloud (overt speech arrest, SA) may be induced by repetitive transcranial magnetic stimulation (rTMS). The possibility, however, of blocking internal speech (covert SA) has not been explored. To investigate this issue, we conducted two rTMS experiments. In the first experiment, we stimulated two left frontal lobe sites. The first was a motor site (left posterior site) and the second was a nonmotor site located in correspondence to the posterior part of the inferior frontal gyrus (IFG) (left anterior site). The corresponding right hemisphere nonmotor SA site was stimulated as a control. In the second experiment, we focused on the right hemisphere and stimulated a right hemisphere motor site (right posterior site), and, as control sites, a right hemisphere nonmotor site corresponding to the IFG (right anterior site) and a left hemisphere anteromedial site (left control). For both experiments, participants performed a syllable counting task both covertly and overtly for each stimulation site. Longer latencies in this task imply the occurrence of an overt and/or covert SA. All participants showed significantly longer latencies when stimulation was either over the left posterior or the left anterior site, as compared with the right hemisphere site (Experiment 1). This result was observed for the overt and covert speech task alike. During stimulation of the posterior right hemisphere site, a dissociation for overt and covert speech was observed. An overt SA was observed but there was no evidence for a covert SA (Experiment 2). Taken together, the results show that rTMS can induce a covert SA when applied to areas over the brain that are pertinent to language. Furthermore, both the left posterior/motor site and the left anterior/IFG site appear to be essential to language elaboration even when motor output is not required.
Aziz-Zadeh, L. S., Iacoboni, M., Zaidel, E., Wilson, S., & Mazziotta, J. (2004). Left hemisphere motor facilitation in response to manual action sounds. European Journal of Neuroscience, 19, 2609-2612. doi:10.1111/j.1460-9568.2004.03348.x.
Previous studies indicate that the motor areas of both hemispheres are active when observing actions. Here we explored how the motor areas of each hemisphere respond to the sounds associated with actions. We used transcranial magnetic stimulation (TMS) to measure motor corticospinal excitability of hand muscles while listening to sounds. Sounds associated with bimanual actions produced greater motor corticospinal excitability than sounds associated with leg movements or control sounds. This facilitation was exclusively lateralized to the left hemisphere, the dominant hemisphere for language. These results are consistent with the hypothesis that action coding may be a precursor of language.
Aziz-Zadeh, L. S., Maeda, F., Zaidel, E., Mazziotta, J., & Iacoboni, M. (2002). Lateralization in motor facilitation during action observation: A TMS study. Experimental Brain Research, 144, 127-131. doi:10.1007/s00221-002-1037-5.
Action observation facilitates corticospinal excitability. This is presumably due to a premotor neural system that is active when we perform actions and when we observe actions performed by others. It has been speculated that this neural system is a precursor of neural systems subserving language. If this theory is true, we may expect hemispheric differences in the motor facilitation produced by action observation, with the language-dominant left hemisphere showing stronger facilitation than the right hemisphere. Furthermore, it has been suggested that body parts are recognized via cortical regions controlling sensory and motor processing associated with that body part. If this is true, then corticospinal facilitation during action observation should be modulated by the laterality of the observed body part. The present study addressed these two issues using TMS for each motor cortex separately as participants observed actions being performed by a left hand, a right hand, or a control stimulus on the computer screen. We found no overall difference between the right and left hemisphere for motor-evoked potential (MEP) size during action observation. However, when TMS was applied to the left motor cortex, MEPs were larger while observing right hand actions. Likewise, when TMS was applied to the right motor cortex, MEPs were larger while observing left hand actions. Our data do not suggest left hemisphere superiority in the facilitating effects of action observation on the motor system. However, they do support the notion of a sensory-motor loop according to which sensory stimulus properties (for example, the image of a left hand or a right hand) directly affect motor cortex activity, even when no motor output is required. The pattern of this effect is congruent with the pattern of motor representation in each hemisphere.