TimeToM: Temporal Space is the Key to Unlocking the Door of Large Language Models’ Theory-of-Mind

Guiyang Hou, Wenqi Zhang, Yongliang Shen, Linjuan Wu, Weiming Lu


Abstract
Theory of Mind (ToM), the cognitive ability to reason about the mental states of ourselves and others, is the foundation of social interaction. Although ToM comes naturally to humans, it poses a significant challenge for even the most advanced Large Language Models (LLMs). Because of the complex logical chains involved in ToM reasoning, especially for higher-order ToM questions, simply applying reasoning methods such as Chain of Thought (CoT) does not improve the ToM capabilities of LLMs. We present TimeToM, which constructs a temporal space and uses it as the foundation for improving the ToM capabilities of LLMs across multiple scenarios. Specifically, within the temporal space, we construct a Temporal Belief State Chain (TBSC) for each character and, inspired by the cognition perspective of the social world model, divide the TBSC into self-world beliefs and social world beliefs, aligning with first-order ToM (first-order belief) questions and higher-order ToM (higher-order belief) questions, respectively. Moreover, we design a novel tool, the belief solver, which, by considering belief communication between characters in the temporal space, can transform one character's higher-order beliefs into another character's first-order beliefs during the belief communication period.
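To make the idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation, which is prompt-based) of how a time-indexed belief chain, the self-world/social-world split, and the higher-order-to-first-order reduction described in the abstract might be represented. All class and function names here are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Belief:
    time: int                # position of the event in the story's temporal space
    holder: str              # character who holds the belief
    about: Optional[str]     # character whose mind this belief concerns (None = about the world itself)
    content: str             # propositional content, e.g. "the ball is in the basket"

@dataclass
class TemporalBeliefStateChain:
    character: str
    beliefs: List[Belief]    # kept in temporal order

    def self_world(self) -> List[Belief]:
        # Beliefs about the world itself -> used for first-order ToM questions.
        return [b for b in self.beliefs if b.about is None]

    def social_world(self) -> List[Belief]:
        # Beliefs about other characters' minds -> used for higher-order ToM questions.
        return [b for b in self.beliefs if b.about is not None]

def reduce_order(belief: Belief, communication_window: range) -> Belief:
    """Illustrative 'belief solver' step: if the belief concerns another character
    and falls within a period in which the two characters communicated, the
    higher-order belief 'A believes that B believes X' can be treated as B's
    first-order belief 'B believes X' for that period."""
    if belief.about is not None and belief.time in communication_window:
        return Belief(time=belief.time, holder=belief.about, about=None, content=belief.content)
    return belief

# Usage: Anne's second-order belief about Sally becomes Sally's first-order belief
# during the window in which they talked about the ball.
anne_belief = Belief(time=3, holder="Anne", about="Sally", content="the ball is in the basket")
print(reduce_order(anne_belief, communication_window=range(2, 5)))
```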
Anthology ID:
2024.findings-acl.685
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11532–11547
URL:
https://aclanthology.org/2024.findings-acl.685
DOI:
10.18653/v1/2024.findings-acl.685
Cite (ACL):
Guiyang Hou, Wenqi Zhang, Yongliang Shen, Linjuan Wu, and Weiming Lu. 2024. TimeToM: Temporal Space is the Key to Unlocking the Door of Large Language Models’ Theory-of-Mind. In Findings of the Association for Computational Linguistics: ACL 2024, pages 11532–11547, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
TimeToM: Temporal Space is the Key to Unlocking the Door of Large Language Models’ Theory-of-Mind (Hou et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.685.pdf