Neural natural language generation: a survey on multilinguality, multimodality, controllability and learning

Bibliographic Details
Main Authors: Erdem, Erkut (Author), Kuyu, Menekse (Author), Yagcioglu, Semih (Author), Frank, Anette (Author), Pârcălăbescu, Letiția (Author), Plank, Barbara (Author), Babii, Andrii (Author), Turuta, Oleksii (Author), Erdem, Aykut (Author), Calixto, Iacer (Author), Lloret, Elena (Author), Apostol, Elena-Simona (Author), Truică, Ciprian-Octavian (Author), Šandrih, Branislava (Author), Martinčić-Ipšić, Sanda (Author), Berend, Gábor (Author), Gatt, Albert (Author), Korvel, Gražina (Author)
Format: Article (Journal)
Language: English
Published: Apr 6, 2022
In: Journal of Artificial Intelligence Research
Year: 2022, Volume: 73, Pages: 1131-1207
ISSN: 1943-5037
DOI: 10.1613/jair.1.12918
Online Access: Publisher (license required), full text: https://doi.org/10.1613/jair.1.12918
Publisher (license required), full text: https://jair.org/index.php/jair/article/view/12918
Description
Summary: Developing artificial learning systems that can understand and generate natural language has been one of the long-standing goals of artificial intelligence. Recent decades have witnessed impressive progress on both of these problems, giving rise to a new family of approaches. In particular, advances in deep learning over the past several years have led to neural approaches to natural language generation (NLG). These methods combine generative language learning techniques with neural network-based frameworks. With a wide range of applications in natural language processing, neural NLG (NNLG) is a new and fast-growing field of research. In this state-of-the-art report, we investigate the recent developments and applications of NNLG in their full extent from a multidimensional view, covering critical perspectives such as multimodality, multilinguality, controllability, and learning strategies. We summarize the fundamental building blocks of NNLG approaches from these aspects and provide detailed reviews of commonly used preprocessing steps and basic neural architectures. The report also covers seminal applications of these NNLG models, such as machine translation, description generation, automatic speech recognition, abstractive summarization, text simplification, question answering and generation, and dialogue generation. Finally, we conclude with a thorough discussion of the described frameworks, pointing out some open research directions.
Item Description: Viewed on 05.10.2022
Physical Description: Online Resource