CNNs found to jump around more skillfully than RNNs: Compositional Generalization in Seq2seq Convolutional Networks

Roberto Dessì, Marco Baroni


Abstract
Lake and Baroni (2018) introduced the SCAN dataset, which probes the ability of seq2seq models to capture compositional generalizations, such as inferring the meaning of “jump around” zero-shot from its component words. Recurrent networks (RNNs) were found to fail the most challenging generalization cases completely. We test a convolutional network (CNN) on these tasks, reporting hugely improved performance with respect to RNNs. Despite this large improvement, however, the CNN has not induced systematic rules, suggesting that the difference between compositional and non-compositional behaviour is not clear-cut.
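For readers unfamiliar with SCAN, the sketch below illustrates the command-to-action mapping the dataset contains and the zero-shot “jump” generalization split the abstract refers to. The example pairs follow the published SCAN grammar (Lake & Baroni, 2018); the surrounding helper code is purely illustrative and not taken from the paper.

```python
# Illustrative SCAN-style examples (grammar from Lake & Baroni, 2018).
# The split below mimics the "add jump" generalization task: "jump" is
# seen only in isolation during training, and the model must compose it
# with modifiers ("twice", "around left") zero-shot at test time.

train_pairs = [
    ("jump", "I_JUMP"),  # primitive seen alone
    ("walk twice", "I_WALK I_WALK"),
    ("walk around left",
     "I_TURN_LEFT I_WALK I_TURN_LEFT I_WALK "
     "I_TURN_LEFT I_WALK I_TURN_LEFT I_WALK"),
]

test_pairs = [
    ("jump twice", "I_JUMP I_JUMP"),  # composition never seen in training
    ("jump around left",
     "I_TURN_LEFT I_JUMP I_TURN_LEFT I_JUMP "
     "I_TURN_LEFT I_JUMP I_TURN_LEFT I_JUMP"),
]

def accuracy(model, pairs):
    """Exact-match sequence accuracy, the metric used on SCAN.

    `model` is any callable mapping a command string to an action
    string (a hypothetical placeholder for a trained seq2seq CNN
    or RNN)."""
    return sum(model(cmd) == acts for cmd, acts in pairs) / len(pairs)
```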
Anthology ID:
P19-1381
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3919–3923
URL:
https://aclanthology.org/P19-1381
DOI:
10.18653/v1/P19-1381
Cite (ACL):
Roberto Dessì and Marco Baroni. 2019. CNNs found to jump around more skillfully than RNNs: Compositional Generalization in Seq2seq Convolutional Networks. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3919–3923, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
CNNs found to jump around more skillfully than RNNs: Compositional Generalization in Seq2seq Convolutional Networks (Dessì & Baroni, ACL 2019)
PDF:
https://aclanthology.org/P19-1381.pdf
Data
SCAN