  • Research Article
  • Open access
  • Published:

Spoken Emotion Recognition Using Glottal Symmetry


Speech variability in real-world situations makes spoken emotion recognition a challenging task. While a variety of temporal and spectral speech features have been proposed, this paper investigates the effectiveness of the glottal airflow signal for recognizing emotions. The speech used in this investigation comes from a classic recording of the theatrical play "Waiting for Godot" by Samuel Beckett. Six emotions were investigated: happiness, anger, sadness, fear, surprise, and neutral. The proposed method was tested on the original recording and under simulated distortion conditions. In clean signal conditions the proposed method achieved average recognition rates of 76% for four emotions and 66.5% for all six emotions. Furthermore, it proved fairly robust under signal distortion and noisy conditions, achieving recognition rates of 60% for four and 51.6% for six emotions on severely low-pass filtered speech, while with additive white Gaussian noise at SNR = 10 dB recognition rates were 53% and 47% for the four- and six-emotion tasks, respectively. The results indicate that glottal signal features separate spoken emotions well and achieve improved classification performance compared to other approaches.
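The noisy test condition mentioned in the abstract (additive white Gaussian noise at SNR = 10 dB) can be simulated in a few lines. The sketch below is illustrative only and not taken from the paper; the `add_awgn` helper name and the synthetic test tone are assumptions:

```python
import numpy as np

def add_awgn(signal, snr_db, rng=None):
    """Corrupt a signal with white Gaussian noise at a target SNR (dB).

    The noise variance is chosen so that the ratio of signal power to
    noise power equals 10**(snr_db / 10) in expectation.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    sig_power = np.mean(signal ** 2)
    noise_power = sig_power / 10 ** (snr_db / 10.0)
    noise = rng.normal(0.0, np.sqrt(noise_power), signal.shape)
    return signal + noise

# Illustrative use: a 200 Hz tone at a 16 kHz sampling rate, degraded
# at SNR = 10 dB as in the paper's noisy-condition experiments.
clean = np.sin(2 * np.pi * 200 * np.arange(0, 0.1, 1 / 16000))
noisy = add_awgn(clean, snr_db=10)
```

The measured SNR of `noisy` fluctuates slightly around the 10 dB target because the noise power is only exact in expectation; over a frame of a few thousand samples the deviation is a fraction of a decibel.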

Publisher note

To access the full article, please see PDF.

Author information

Authors and Affiliations


Corresponding author

Correspondence to Alexander I. Iliev.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Iliev, A.I., Scordilis, M.S. Spoken Emotion Recognition Using Glottal Symmetry. EURASIP J. Adv. Signal Process. 2011, 624575 (2011).


  • Received:

  • Revised:

  • Accepted:

  • Published:

  • DOI: