Sequence-based Structured Prediction for Semantic Parsing

Abstract: We propose an approach for semantic parsing that uses a recurrent neural network to map a natural language question into a logical form representation of a KB query. Building on recent work by Wang et al. (2015), the interpretable logical forms, which are structured objects obeying certain constraints, are enumerated by an underlying grammar and are paired with their canonical realizations. In order to use sequence prediction, we need to sequentialize these logical forms. We compare three sequentializations: a direct linearization of the logical form, a linearization of the associated canonical realization, and a sequence consisting of derivation steps relative to the underlying grammar. We also show how grammatical constraints on the derivation sequence can easily be integrated inside the RNN-based sequential predictor. Our experiments show important improvements over previous results for the same dataset, and also demonstrate the advantage of incorporating the grammatical constraints.
Document type: Conference paper

Cited literature: 23 references

https://hal.inria.fr/hal-01623762
Contributor: Claire Gardent
Submitted on: Wednesday, October 25, 2017


Citation

Chunyang Xiao, Marc Dymetman, Claire Gardent. Sequence-based Structured Prediction for Semantic Parsing. Annual Meeting of the Association for Computational Linguistics (ACL), Aug 2016, Berlin, Germany. pp. 1341-1350, ⟨10.18653/v1/P16-1127⟩. ⟨hal-01623762⟩
