Open Access

On the Relationship between Face Movements, Tongue Movements, and Speech Acoustics

  • Jintao Jiang1,
  • Abeer Alwan1,
  • Patricia A. Keating2,
  • Edward T. Auer Jr.3 and
  • Lynne E. Bernstein3
EURASIP Journal on Advances in Signal Processing 2002, 2002:506945

Received: 29 November 2001

Published: 28 November 2002


This study examines relationships between external face movements, tongue movements, and speech acoustics for consonant-vowel (CV) syllables and sentences spoken by two male and two female talkers with different visual intelligibility ratings. The questions addressed are how the relationships among these measures vary by syllable, whether talkers who are more intelligible produce greater optical evidence of tongue movements, and how the results for CV syllables compare with those for sentences. Results show that the prediction of one data stream from another is better for C/a/ syllables than for C/i/ and C/u/ syllables. Across the different places of articulation, lingual places yield better predictions of one data stream from another than do bilabial and glottal places. Results vary from talker to talker; interestingly, a higher intelligibility rating does not result in better predictions. In general, predictions for CV syllables are better than those for sentences.


articulatory movements, speech acoustics, Qualisys, EMA, optical tracking

Authors’ Affiliations

Electrical Engineering Department, University of California at Los Angeles, Los Angeles, USA
Linguistics Department, University of California at Los Angeles, Los Angeles, USA
Communication Neuroscience Department, House Ear Institute, Los Angeles, USA


© Jiang et al. 2002