An HPSG approach to synchronous speech and deixis

Authors

Katya Alahverdzhieva
Alex Lascarides

DOI:

https://doi.org/10.21248/hpsg.2011.1

Abstract

The use of hand gestures to point at objects and individuals, or to navigate through landmarks on a virtually created map, is ubiquitous in face-to-face conversation. We take this observation as a starting point and demonstrate that deictic gestures can be analysed on a par with speech using standard methods from constraint-based grammars such as HPSG. In particular, we use the form of the deictic signal, the form of the speech signal (including its prosodic marking) and their relative temporal performance to derive an integrated multimodal tree that maps to an integrated multimodal meaning. The integration process is constrained via construction rules that rule out ill-formed input. These rules are derived from an empirical corpus study which sheds light on the interaction between speech and deictic gesture.
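To give a flavour of the kind of constraint the abstract describes, the following is a minimal, illustrative sketch (not the authors' HPSG formalism): it licenses combining a deictic gesture with a speech token only when the gesture temporally overlaps a prosodically prominent word, and rules the input out otherwise. All names here (SpeechToken, DeicticGesture, integrate) are hypothetical and chosen only for exposition.

```python
# Toy constraint on speech-gesture integration: a gesture may attach to a
# word only if it overlaps the word in time and the word is pitch-accented.
# This is an illustrative sketch, not the grammar proposed in the paper.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SpeechToken:
    word: str
    start: float          # onset in seconds
    end: float            # offset in seconds
    pitch_accented: bool  # prosodic marking


@dataclass
class DeicticGesture:
    target: str           # object or region pointed at
    start: float
    end: float


def overlaps(a_start: float, a_end: float, b_start: float, b_end: float) -> bool:
    """True if the two time intervals share any portion."""
    return a_start < b_end and b_start < a_end


def integrate(token: SpeechToken, gesture: DeicticGesture) -> Optional[dict]:
    """Combine speech and deixis into a single multimodal node if the
    temporal and prosodic constraints hold; otherwise reject the input."""
    if token.pitch_accented and overlaps(token.start, token.end,
                                         gesture.start, gesture.end):
        return {"mother": "multimodal-sign",
                "speech-dtr": token.word,
                "gesture-dtr": gesture.target}
    return None


if __name__ == "__main__":
    word = SpeechToken("this", start=1.20, end=1.45, pitch_accented=True)
    point = DeicticGesture(target="the red block", start=1.10, end=1.60)
    print(integrate(word, point))  # licensed: overlapping and accented
```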

Published

2011-11-16

How to Cite

Alahverdzhieva, Katya & Lascarides, Alex. 2011. An HPSG approach to synchronous speech and deixis. Proceedings of the 18th International Conference on Head-Driven Phrase Structure Grammar, 6–24. (doi:10.21248/hpsg.2011.1) (https://proceedings.hpsg.xyz/article/view/748) (Accessed March 29, 2024.)