
Scratchpad









Keeping Notes: Conditional Natural Language Generation with a Scratchpad Encoder

Cite (ACL): Ryan Benmalek, Madian Khabsa, Suma Desu, Claire Cardie, and Michele Banko. 2019. Keeping Notes: Conditional Natural Language Generation with a Scratchpad Encoder. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4157–4167, Florence, Italy. Association for Computational Linguistics.

Anthology ID: P19-1407
Venue: ACL
Month: July
Year: 2019
Address: Florence, Italy
Publisher: Association for Computational Linguistics
Pages: 4157–4167
DOI: 10.18653/v1/P19-1407
Bibkey: benmalek-etal-2019-keeping

Abstract: We introduce the Scratchpad Mechanism, a novel addition to the sequence-to-sequence (seq2seq) neural network architecture, and demonstrate its effectiveness in improving the overall fluency of seq2seq models for natural language generation tasks. By enabling the decoder at each time step to write to all of the encoder output layers, Scratchpad can employ the encoder as a "scratchpad" memory to keep track of what has been generated so far and thereby guide future generation. We evaluate Scratchpad in the context of three well-studied natural language generation tasks (machine translation, question generation, and text summarization) and obtain state-of-the-art or comparable performance on standard datasets for each task. Qualitative assessments in the form of human judgements (question generation), attention visualization (machine translation), and sample output (summarization) provide further evidence of Scratchpad's ability to generate fluent and expressive output.
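The core idea in the abstract — letting the decoder write back into the encoder outputs at every step, so they act as a rewritable memory — can be sketched in a few lines of numpy. This is only an illustrative sketch: the attention read, the per-position write gate, and the update vector below are plausible assumptions about such a mechanism, not the paper's exact equations or parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def scratchpad_step(enc_states, dec_state, W_attn, W_gate, W_upd):
    """One decoding step with a scratchpad-style write-back (illustrative).

    enc_states : (T, d) encoder outputs, treated as rewritable memory
    dec_state  : (d,)   current decoder hidden state
    Returns the attention context and the updated encoder states.
    """
    # Read phase: ordinary dot-product attention over the encoder states.
    scores = enc_states @ (W_attn @ dec_state)        # (T,)
    attn = softmax(scores)
    context = attn @ enc_states                       # (d,)

    # Write phase (assumed form): a global update vector, gated per
    # position, is blended into every encoder state so the memory
    # records what has been generated so far.
    update = np.tanh(W_upd @ np.concatenate([dec_state, context]))  # (d,)
    gate = sigmoid(enc_states @ (W_gate @ dec_state))               # (T,)
    new_states = gate[:, None] * update[None, :] + (1 - gate[:, None]) * enc_states
    return context, new_states

# Toy dimensions: 5 source positions, hidden size 4.
T, d = 5, 4
enc = rng.normal(size=(T, d))
dec = rng.normal(size=(d,))
W_attn, W_gate = rng.normal(size=(d, d)), rng.normal(size=(d, d))
W_upd = rng.normal(size=(d, 2 * d))

ctx, enc_after = scratchpad_step(enc, dec, W_attn, W_gate, W_upd)
print(ctx.shape, enc_after.shape)  # memory keeps its shape; its contents change
```

Calling `scratchpad_step` repeatedly, feeding each step's `new_states` into the next, is what distinguishes this from plain attention, where the encoder outputs stay frozen for the whole decode.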









