Sequence-based Structured Prediction for Semantic Parsing

Abstract: We propose an approach to semantic parsing that uses a recurrent neural network to map a natural language question into a logical form representing a KB query. Building on recent work by Wang et al. (2015), the interpretable logical forms, which are structured objects obeying certain constraints, are enumerated by an underlying grammar and paired with their canonical realizations. In order to use sequence prediction, we need to sequentialize these logical forms. We compare three sequentializations: a direct linearization of the logical form, a linearization of the associated canonical realization, and a sequence consisting of derivation steps relative to the underlying grammar. We also show how grammatical constraints on the derivation sequence can easily be integrated inside the RNN-based sequential predictor. Our experiments show important improvements over previous results on the same dataset, and also demonstrate the advantage of incorporating the grammatical constraints.
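The constrained decoding idea in the abstract can be illustrated with a minimal sketch (not the authors' code): at each step, symbols the grammar forbids after the current prefix are masked out before the predictor's argmax. The toy grammar, symbol names, and the score table standing in for RNN logits below are all hypothetical.

```python
# Illustrative sketch of grammar-constrained greedy decoding.
# The grammar and scores are toy examples, not the paper's dataset.

TOY_GRAMMAR = {
    # state -> {symbol: next_state}: which derivation steps are legal
    # after a given prefix (a tiny finite-state approximation)
    "START": {"answer(": "REL"},
    "REL":   {"capital_of(": "ENT", "population_of(": "ENT"},
    "ENT":   {"germany": "CLOSE", "france": "CLOSE"},
    "CLOSE": {"))": "END"},
}

def allowed_next(state):
    """Symbols the grammar permits from the current derivation state."""
    return set(TOY_GRAMMAR.get(state, {}))

def constrained_greedy_decode(score_fn, max_len=10):
    """Greedy decoding where symbols outside the grammar's allowed set
    are masked out before taking the argmax at each step."""
    state, output = "START", []
    while state != "END" and len(output) < max_len:
        candidates = allowed_next(state)
        if not candidates:
            break
        # keep only grammar-valid symbols, pick the highest-scoring one
        symbol = max(candidates, key=lambda s: score_fn(output, s))
        output.append(symbol)
        state = TOY_GRAMMAR[state][symbol]
    return output

# Stand-in for RNN logits: a fixed preference table (hypothetical).
PREFS = {"answer(": 1.0, "capital_of(": 0.9, "population_of(": 0.4,
         "germany": 0.8, "france": 0.7, "))": 1.0}

def toy_scores(prefix, symbol):
    return PREFS.get(symbol, 0.0)

print(constrained_greedy_decode(toy_scores))
# -> ['answer(', 'capital_of(', 'germany', '))']
```

Because the mask is applied before the argmax, every decoded sequence is guaranteed to be derivable by the grammar, regardless of how noisy the predictor's scores are.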
Document type: Conference papers
Cited literature: 23 references
Contributor: Claire Gardent
Submitted on: Wednesday, October 25, 2017 - 4:24:27 PM
Last modification on: Wednesday, November 3, 2021 - 7:09:02 AM
Long-term archiving on: Friday, January 26, 2018 - 3:01:04 PM

Publisher files allowed on an open archive

Chunyang Xiao, Marc Dymetman, Claire Gardent. Sequence-based Structured Prediction for Semantic Parsing. Annual Meeting of the Association for Computational Linguistics (ACL), Aug 2016, Berlin, Germany. pp. 1341-1350, ⟨10.18653/v1/P16-1127⟩. ⟨hal-01623762⟩