Doreen Hansmann, Catherine Theys, and I just published our article, “Tri-modal Speech: Audio-visual-tactile Integration in Speech Perception,” in the Journal of the Acoustical Society of America. The paper was also presented as a poster at the American Speech-Language-Hearing Association (ASHA) Annual Convention in Orlando, Florida, November 21–22, 2019, where it won a meritorious poster award.
TL;DR: People use auditory, visual, and tactile speech information to accurately identify syllables in noise. Auditory information is the most important, then visual, and lastly aero-tactile – but we can use all three at once.
Abstract: Speech perception is a multi-sensory experience. Visual information can enhance (Sumby and Pollack, 1954) or interfere with (McGurk and MacDonald, 1976) speech perception. Similarly, tactile information, transmitted by puffs of air arriving at the skin and aligned with the speech audio, alters (Gick and Derrick, 2009) auditory speech perception in noise. Aero-tactile information has also been shown to influence visual speech perception when an auditory signal is absent (Derrick, Bicevskis, and Gick, 2019a). However, researchers had not yet identified the combined influence of aero-tactile, visual, and auditory information on speech perception. The effects of matching and mismatching visual and tactile speech on two-way forced-choice auditory syllable-in-noise classification tasks were tested. The results showed that both visual and tactile information altered the signal-to-noise ratio (SNR) threshold for accurate identification of the auditory signals. As in previous studies, the visual component had a strong influence on auditory syllable-in-noise identification, evidenced by a 28.04 dB difference in SNR threshold between matching and mismatching visual stimulus presentations. In comparison, the tactile component had a small influence, resulting in a 1.58 dB SNR match-mismatch range. The effects of the visual and tactile information were shown to be additive.
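The additivity claim at the end of the abstract can be illustrated with a toy calculation. This is only a sketch of what an additive model implies: the baseline threshold, the `predicted_threshold` function, and the half-range split are hypothetical illustrations, not values or methods from the paper; only the two match-mismatch ranges come from the abstract above.

```python
# Toy illustration of additive cue effects on a syllable-in-noise
# identification threshold (dB SNR). The match-mismatch ranges are
# from the abstract; everything else here is a hypothetical sketch.

VISUAL_RANGE_DB = 28.04   # matching vs. mismatching visual stimulus
TACTILE_RANGE_DB = 1.58   # matching vs. mismatching tactile stimulus

def predicted_threshold(baseline_db, visual_match, tactile_match):
    """Predict an SNR threshold under a purely additive model: each
    matching cue lowers (improves) the threshold by half its
    match-mismatch range; each mismatching cue raises it by half."""
    shift = -VISUAL_RANGE_DB / 2 if visual_match else VISUAL_RANGE_DB / 2
    shift += -TACTILE_RANGE_DB / 2 if tactile_match else TACTILE_RANGE_DB / 2
    return baseline_db + shift

baseline = 0.0  # hypothetical audio-only threshold
best = predicted_threshold(baseline, visual_match=True, tactile_match=True)
worst = predicted_threshold(baseline, visual_match=False, tactile_match=False)
print(f"best-case threshold:  {best:+.2f} dB SNR")
print(f"worst-case threshold: {worst:+.2f} dB SNR")
print(f"total range: {worst - best:.2f} dB")  # 28.04 + 1.58 = 29.62 dB
```

Under this additive model, the spread between the fully matching and fully mismatching conditions is simply the sum of the two ranges, 29.62 dB, with the visual cue accounting for nearly all of it.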
Derrick, D., Bicevskis, K., and Gick, B. (2019a). “Visual-tactile speech perception and the autism quotient,” Frontiers in Communication – Language Sciences 3(61), 1–11, doi: https://doi.org/10.3389/fcomm.2018.00061.
Gick, B., and Derrick, D. (2009). “Aero-tactile integration in speech perception,” Nature 462, 502–504, doi: https://doi.org/10.1038/nature08572.
McGurk, H., and MacDonald, J. (1976). “Hearing lips and seeing voices,” Nature 264, 746–748, doi: https://doi.org/10.1038/264746a0.
Sumby, W. H., and Pollack, I. (1954). “Visual contribution to speech intelligibility in noise,” Journal of the Acoustical Society of America 26(2), 212–215, doi: https://doi.org/10.1121/1.1907309.