Marta Font-Alaminos et al.

Our actions shape our everyday experience: what we experience, and how we perceive and remember it, is deeply affected by how we interact with the world. Performing an action to deliver a stimulus engages neurophysiological processes that are reflected in the modulation of sensory and pupil responses. In this study, we hypothesized that these processes shape memory encoding, parsing experience by grouping self- and externally generated stimuli into differentiated events. Participants encoded sound sequences in which either the first or the last few sounds were self-generated and the rest externally generated. We tested recall of the sequential order of sounds that had originated from the same source (within events) or from different sources (across events). Memory performance was not higher for within-event sounds, suggesting that the memory representation was not structured by actions. However, during encoding we replicated the well-known attenuation of electrophysiological responses, together with increased pupil dilation, for self-generated sounds. Moreover, we found that at the boundary between events, physiological responses to the first sound originating from the new source were determined by the direction of the source switch. The results suggest that introducing actions acts as a stronger contextual shift than removing them, despite not directly contributing to memory performance. These findings advance our understanding of how interacting with sensory input shapes our experiences by addressing the unexplored relationships between action effects on sensory responses, pupil dilation, and memory encoding, and by ruling out a meaningful contribution of the low-level neurophysiological mechanisms associated with action execution to the modulation of memory.

Stefanie Sturm et al.

Active engagement improves learning and memory, and self- versus externally generated stimuli are processed differently: perceptual intensity and neural responses are attenuated for self-generated stimuli. Whether this attenuation is linked to memory formation remains poorly understood. This study investigates whether active oculomotor control over auditory stimuli, controlling for movement and stimulus predictability, benefits associative learning, and examines the underlying neural mechanisms. Using EEG and eye tracking, we explored the impact of control during learning on the processing and memory recall of arbitrary oculomotor-auditory associations. Participants (N = 23) learned associations through active exploration or passive observation, using a gaze-controlled interface to generate sounds. Our results show faster learning progress in the active condition. ERPs time-locked to sound onset showed that learning progress was linked to an attenuation of the P3a component. The detection of matching movement-sound pairs triggered a target-matching P3b response. There was no general modulation of ERPs by active learning. However, participants could be divided into different learner types: those who benefited strongly from active control during learning and those who did not. The strength of the N1 attenuation effect for self-generated stimuli was correlated with memory gain in active learning. Our results show that control aids learning and memory and modulates sensory responses, and that individual differences in sensory processing predict the strength of the memory benefit. Taken together, these results help disentangle the effects of agency, unspecific motor-based neuromodulation, and stimulus predictability on ERP components, and establish a link between self-generation effects and the memory gain of active learning.