Do you see what I am saying? Electrophysiological dynamics of visual speech processing and the role of orofacial effectors.
Human speech perception has predominantly been studied with a focus on auditory processing. However, the visual and motor systems appear to play a more important role in speech perception than previously thought. The current study investigated the electrophysiological responses evoked by visual speech cues and other kinds of orofacial movements, as well as the role of automatic mimicry in speech versus non-speech visual perception. The results show that a) visual linguistic content, and particularly the place of articulation of the syllables, strongly modulated the electrophysiological responses, and b) this effect disappeared when automatic mimicry was interfered with by asking participants to hold a depressor between their teeth. These results support the idea that speech processing is multimodal, involving not only auditory but also visual and motor systems.