Yılmaz, Begüm, Furman, Reyhan (ORCID: 0000-0001-6034-3820), Göksun, Tilbe and Eskenazi, Terry (2025) Speech Disfluencies and Hand Gestures as Metacognitive Cues. Cognitive Science: A Multidisciplinary Journal, 49 (8), e70093. ISSN 0364-0213
PDF - Accepted Version (Restricted to Repository staff only, 617kB)
Official URL: https://doi.org/10.1111/cogs.70093
Abstract
How language interacts with metacognitive processes is an understudied area. Earlier research shows that people produce disfluencies (e.g., "uh" or "um") in their speech when they are unsure of their answers, indicating metacognitive monitoring. Gestures have monitoring and predictive roles in language, also implicating metacognitive processes. Further, the rates of speech disfluencies and gestures change as a function of the communicative setting: people produce fewer disfluencies and more gestures when they can see the listener than when the listener is not visible. In the current study, 50 participants (32 women, mean age = 21.16 years, SD = 1.46) were asked 40 general knowledge questions, with either a visible (n = 25) or a nonvisible (n = 25) listener. They provided a feeling-of-knowing (FOK) judgment immediately after seeing each question and were asked to think aloud while pondering their answers. They then provided retrospective confidence judgments (RCJs). Results showed that gestures and speech disfluencies were related neither to answer accuracy nor to FOK judgments. However, both gestures and speech disfluencies predicted RCJs uniquely and interactively. Speech disfluencies negatively predicted RCJs, whereas hand gestures were positively related to RCJs. Importantly, gesture use was more strongly related to RCJs when disfluency rates were also higher. No effect of the communicative setting on the rates of gestures or speech disfluencies was found. These results highlight the importance of multimodal language cues in the elaboration of metacognitive judgments.