33. Husta, C., Meyer, A. S., & Drijvers, L. (2024). Using rapid invisible frequency tagging (RIFT) to probe the attentional distribution between speech planning and comprehension. bioRxiv.

    32. Ter Bekke, M., Drijvers, L., & Holler, J. (2024). Gestures speed up responses to questions. Language, Cognition and Neuroscience, 1-8.

    31. Ter Bekke, M., Drijvers, L., & Holler, J. (2024). Hand gestures have predictive potential during conversation: An investigation of the timing of gestures in relation to speech. Cognitive Science.

    30. Rubianes, M., Drijvers, L., Muñoz, F., Jiménez-Ortega, L., Almeida-Rivera, T., Sánchez-García, J., … & Martín-Loeches, M. (2024). The Self-reference Effect Can Modulate Language Syntactic Processing Even without Explicit Awareness: An EEG Study. Journal of Cognitive Neuroscience, 1-15.

    29. Seijdel, N., Schoffelen, J.-M., Hagoort, P., & Drijvers, L. (2024). Attention drives visual processing and audiovisual integration during multimodal communication. The Journal of Neuroscience.

    28. Drijvers, L., & Mazzini, S. (2023). Neural oscillations in audiovisual speech/language and communication. In S. Murray Sherman (Ed.), Oxford Research Encyclopedia of Neuroscience. New York and Oxford: Oxford University Press. doi:10.1093/acrefore/9780190264086.013.ORE_NEU-00455.R1

    27. Mazzini*, S., Seijdel*, N., & Drijvers*, L. (2023). Autistic individuals benefit from gestures during degraded speech comprehension. PsyArXiv. doi:10.31234/

    26. Seijdel*, N., Mazzini*, S., & Drijvers*, L. (2023). Environmental noise affects audiovisual gain during speech comprehension in adverse listening conditions. OSF Preprints. doi:10.31219/

    25. Mazzini, S., Holler, J., & Drijvers, L. (2023). Studying naturalistic human communication using dual-EEG and audio-visual recordings. STAR Protocols, 4(3): 102370. doi:10.1016/j.xpro.2023.102370.

    24. Drijvers, L., & Holler, J. (2023). Face-to-face spatial orientation fine-tunes the brain for neurocognitive processing in conversation. iScience.

    23. Drijvers, L., & Holler, J. (2023). The multimodal facilitation effect in human communication. Psychonomic Bulletin & Review.

    22. Körner, A., Castillo, M., Drijvers, L., Fischer, M., Günther, F., Marelli, M., Platonova, O., Rinaldi, L., Shaki, S., Trujillo, J.P., Tsaregorodtseva, O., Glenberg, A.M. (accepted). Embodied Processing at Six Linguistic Granularity Levels: A Consensus Paper. Journal of Cognition.

    21. Seijdel, N., Marshall, T. R., & Drijvers, L. (2022). Rapid Invisible Frequency Tagging (RIFT): A promising technique to study neural and cognitive processing using naturalistic paradigms. Cerebral Cortex.

    20. Wilms, V., Drijvers, L.*, & Brouwer, S.* (2022). The effects of iconic gestures and babble language on word intelligibility in sentence context. Journal of Speech, Language, and Hearing Research. (*shared senior authorship)

    19. Holler, J., Drijvers, L., Rafiee, A., & Majid, A. (2022). Embodied space-pitch associations are shaped by language. Cognitive Science.

    18. Pouw, W., Proksch, S., Drijvers, L., Gamba, M., Holler, J., Kello, C., Schaefer, R., & Wiggins, G. (2021). Multilevel rhythms in multimodal communication. Proceedings of the Royal Society B: Biological Sciences. doi:10.31219/

    17. Duprez, J., Stokkermans, M., Drijvers, L., & Cohen, M. X. (2021). Synchronization between keyboard typing and neural oscillations. Journal of Cognitive Neuroscience.

    16. Drijvers, L., Jensen, O.*, & Spaak, E.* (2021). Rapid invisible frequency tagging reveals nonlinear integration of auditory and visual information. Human Brain Mapping.

    15. Trujillo, J. P., Ozyurek, A., Holler, J., & Drijvers, L. (2021). Speakers exhibit a multimodal Lombard effect in noise. Scientific Reports.

    14. Ter Bekke, M., Drijvers, L., & Holler, J. (2020). The predictive potential of hand gestures during conversation: An investigation of the timing of gestures in relation to speech. PsyArXiv Preprints. doi:10.31234/

    13. Drijvers, L., & Ozyurek, A. (2020). Non-native listeners benefit less from gestures and visible speech than native listeners during degraded speech comprehension. Language and Speech, 63(2), 209-220. doi:10.1177/0023830919831311.

    12. Ripperda, J., Drijvers, L., & Holler, J. (2020). Speeding up the detection of non-iconic and iconic gestures (SPUDNIG): A toolkit for the automatic detection of hand movements and gestures in video data. Behavior Research Methods. Advance online publication. doi:10.3758/s13428-020-01350-2.

    11. Schubotz, L., Holler, J., Drijvers, L., & Ozyurek, A. (2020). Aging and working memory modulate the ability to benefit from visible speech and iconic gestures during speech-in-noise comprehension. Psychological Research. Advance online publication. doi:10.1007/s00426-020-01363-8.

    10. Drijvers, L., Van der Plas, M., Ozyurek, A., & Jensen, O. (2019). Native and non-native listeners show similar yet distinct oscillatory dynamics when using gestures to access speech in noise. NeuroImage, 194, 55-67. doi:10.1016/j.neuroimage.2019.03.032.

    9. Drijvers, L., Vaitonyte, J., & Ozyurek, A. (2019). Degree of language experience modulates visual attention to visible speech and iconic gestures during clear and degraded speech comprehension. Cognitive Science, 43: e12789. doi:10.1111/cogs.12789.

    8. Drijvers, L. (2019). On the oscillatory dynamics underlying speech-gesture integration in clear and adverse listening conditions. PhD Thesis, Radboud University Nijmegen, Nijmegen.

    7. Drijvers, L., Ozyurek, A., & Jensen, O. (2018). Alpha and beta oscillations index semantic congruency between speech and gestures in clear and degraded speech. Journal of Cognitive Neuroscience, 30(8), 1086-1097. doi:10.1162/jocn_a_01301.

    6. Drijvers, L., & Trujillo, J. P. (2018). Commentary: Transcranial magnetic stimulation over left inferior frontal and posterior temporal cortex disrupts gesture-speech integration. Frontiers in Human Neuroscience, 12: 256. doi:10.3389/fnhum.2018.00256.

    5. Drijvers, L., & Ozyurek, A. (2018). Native language status of the listener modulates the neural integration of speech and iconic gestures in clear and adverse listening conditions. Brain and Language, 177-178, 7-17. doi:10.1016/j.bandl.2018.01.003.

    4. Drijvers, L., Ozyurek, A., & Jensen, O. (2018). Hearing and seeing meaning in noise: Alpha, beta and gamma oscillations predict gestural enhancement of degraded speech comprehension. Human Brain Mapping, 39(5), 2075-2087. doi:10.1002/hbm.23987.

    3. Drijvers, L., & Ozyurek, A. (2017). Visual context enhanced: The joint contribution of iconic gestures and visible speech to degraded speech comprehension. Journal of Speech, Language, and Hearing Research, 60, 212-222. doi:10.1044/2016_JSLHR-H-16-0101.

    2. Drijvers, L., Mulder, K., & Ernestus, M. (2016). Alpha and gamma band oscillations index differential processing of acoustically reduced and full forms. Brain and Language, 153-154, 27-37. doi:10.1016/j.bandl.2016.01.003.

    1. Drijvers, L., Zaadnoordijk, L., & Dingemanse, M. (2015). Sound-symbolism is disrupted in dyslexia: Implications for the role of cross-modal abstraction processes. In D. Noelle, R. Dale, A. S. Warlaumont, J. Yoshimi, T. Matlock, C. D. Jennings, & P. P. Maglio (Eds.), Proceedings of the 37th Annual Meeting of the Cognitive Science Society (CogSci 2015) (pp. 602-607). Austin, TX: Cognitive Science Society.