dc.contributor.author | Fernández González, Daniel | |
dc.date.accessioned | 2024-09-18T08:01:42Z | |
dc.date.available | 2024-09-18T08:01:42Z | |
dc.date.issued | 2024-08-22 | |
dc.identifier.citation | Cognitive Computation, 1, 1-17 (2024) | spa |
dc.identifier.issn | 1866-9956 | |
dc.identifier.issn | 1866-9964 | |
dc.identifier.uri | http://hdl.handle.net/11093/7460 | |
dc.description.abstract | Intelligent voice assistants, such as Apple Siri and Amazon Alexa, are widely used nowadays. These task-oriented dialogue systems require a semantic parsing module in order to process user utterances and understand the action to be performed. This semantic parsing component was initially implemented by rule-based or statistical slot-filling approaches for processing simple queries; however, the appearance of more complex utterances demanded the application of shift-reduce parsers or sequence-to-sequence models. Although shift-reduce approaches were initially considered the most promising option, the emergence of sequence-to-sequence neural systems has propelled the latter to the forefront as the highest-performing method for this particular task. In this article, we advance the research on shift-reduce semantic parsing for task-oriented dialogue. We implement novel shift-reduce parsers that rely on Stack-Transformers. This framework allows transition systems to be adequately modeled on the Transformer neural architecture, notably boosting shift-reduce parsing performance. Furthermore, our approach goes beyond the conventional top-down algorithm: we incorporate alternative bottom-up and in-order transition systems derived from constituency parsing into the realm of task-oriented parsing. We extensively test our approach on multiple domains from the Facebook TOP benchmark, improving over existing shift-reduce parsers and state-of-the-art sequence-to-sequence models in both high-resource and low-resource settings. We also empirically prove that the in-order algorithm substantially outperforms the commonly used top-down strategy. Through the creation of innovative transition systems and harnessing the capabilities of a robust neural architecture, our study showcases the superiority of shift-reduce parsers over leading sequence-to-sequence methods on the main benchmark. | en
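The abstract describes shift-reduce parsing with a top-down transition system for TOP-style semantic parses. As a rough illustration only (the example utterance, action names, and helper are assumptions, not taken from the article), a minimal sketch of how a SHIFT / NT(label) / REDUCE action sequence deterministically builds a bracketed TOP tree might look like this:

```python
# Minimal sketch of a top-down shift-reduce transition system for
# TOP-style task-oriented semantic parsing. Action names (SHIFT,
# NT(...), REDUCE) follow common usage; the utterance and gold
# action sequence below are illustrative, not from the article.

def parse(tokens, actions):
    """Apply an action sequence to build a bracketed TOP-style tree."""
    stack = []
    buffer = list(tokens)
    for action in actions:
        if action == "SHIFT":
            # Move the next input token onto the stack.
            stack.append(buffer.pop(0))
        elif action.startswith("NT("):
            # Open a non-terminal, e.g. an intent (IN:) or slot (SL:).
            stack.append(("OPEN", action[3:-1]))
        elif action == "REDUCE":
            # Pop completed children down to the nearest open non-terminal
            # and replace them with a single bracketed subtree.
            children = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "OPEN"):
                children.append(stack.pop())
            label = stack.pop()[1]
            stack.append("[" + label + " " + " ".join(reversed(children)) + " ]")
    assert len(stack) == 1 and not buffer, "incomplete or invalid action sequence"
    return stack[0]

tokens = "What 's the weather in Boston".split()
actions = ["NT(IN:GET_WEATHER)", "SHIFT", "SHIFT", "SHIFT", "SHIFT", "SHIFT",
           "NT(SL:LOCATION)", "SHIFT", "REDUCE", "REDUCE"]
print(parse(tokens, actions))
# → [IN:GET_WEATHER What 's the weather in [SL:LOCATION Boston ] ]
```

In the article's setting, the action at each step is predicted by the neural model (a Stack-Transformer) rather than supplied; the bottom-up and in-order variants mentioned in the abstract change when non-terminals are opened relative to their children, not this basic stack/buffer machinery.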
dc.description.sponsorship | Xunta de Galicia | Ref. ED431C 2020/11 | spa |
dc.description.sponsorship | Xunta de Galicia | Ref. ED431G 2019/01 | spa |
dc.description.sponsorship | Universidade de Vigo/CISUG | spa |
dc.description.sponsorship | European Commission | Ref. 714150 | spa
dc.description.sponsorship | Agencia Estatal de Investigación | Ref. PID2020-113230RB-C21 | spa |
dc.language.iso | eng | spa |
dc.publisher | Cognitive Computation | spa |
dc.relation | info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2020-113230RB-C21/ES | |
dc.rights | Attribution 4.0 International | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.title | Shift-reduce task-oriented semantic parsing with stack-transformers | en |
dc.type | article | spa |
dc.rights.accessRights | openAccess | spa |
dc.relation.projectID | info:eu-repo/grantAgreement/EC/H2020/714150 | spa |
dc.identifier.doi | 10.1007/s12559-024-10339-4 | |
dc.identifier.editor | https://link.springer.com/10.1007/s12559-024-10339-4 | spa |
dc.publisher.departamento | Informática | spa |
dc.publisher.grupoinvestigacion | COmputational LEarning | spa |
dc.subject.unesco | 1203.04 Artificial Intelligence | spa
dc.subject.unesco | 3325.99 Others | spa
dc.date.updated | 2024-09-10T10:59:35Z | |
dc.computerCitation | pub_title=Cognitive Computation|volume=1|journal_number=|start_pag=1|end_pag=17 | spa |