
Talk - Making A Point: Pointer-Generator Transformers for Disjoint Vocabularies

Hosted By
Elizabeth S. and Maggie P.

Details

The Front Range NLP Group is kicking off the spring semester with a talk by Nikhil Prabhu on March 11. Please join us as he presents the paper "Making a Point: Pointer-Generator Transformers for Disjoint Vocabularies," co-authored with Katharina Kann and published at the 2020 conference of the Asia-Pacific Chapter of the Association for Computational Linguistics (AACL).

Please join us at 6:30pm on Thursday, March 11th, at this Zoom link: https://cuboulder.zoom.us/j/99300892288

Speaker: Nikhil Prabhu
Nikhil Prabhu is a Master's student and Research Assistant in the NALA Lab, working on problems related to low-resource machine translation. His paper won the Best Paper Award at the AACL Student Research Workshop.

Abstract
Explicit mechanisms for copying have improved the performance of neural models for sequence-to-sequence tasks in the low resource setting. However, they rely on an overlap between source and target vocabularies. Here, we propose a model that does not: a pointer-generator transformer for disjoint vocabularies. We apply our model to a low resource version of the grapheme-to-phoneme conversion (G2P) task, and show that it outperforms a standard transformer by an average of 5.1 WER over 15 languages. While our model does not beat the best performing baseline, we demonstrate that it provides complementary information to it: an oracle that combines the best outputs of the two models improves over the strongest baseline by 7.7 WER on average in the low-resource setting. In the high resource setting, our model performs comparably to a standard transformer.

Full paper available here: https://www.aclweb.org/anthology/2020.aacl-srw.13.pdf
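
For readers unfamiliar with pointer-generator models, the sketch below illustrates the general gating idea the abstract refers to: at each decoding step the model mixes a softmax over the target vocabulary with a copy distribution derived from attention over the source, weighted by a learned gate. This is a minimal illustration of the standard pointer-generator mechanism, not the authors' implementation; in particular, how source tokens are mapped into the output space when the vocabularies are disjoint is exactly what the paper addresses, and the `src_to_tgt_ids` mapping below is simply assumed to exist. All module and dimension names are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PointerGeneratorHead(nn.Module):
    """Hypothetical output head mixing generation and copying (sketch only)."""

    def __init__(self, d_model: int, tgt_vocab_size: int):
        super().__init__()
        self.generate = nn.Linear(d_model, tgt_vocab_size)  # generation logits
        self.gate = nn.Linear(2 * d_model, 1)                # p_gen from decoder state + context

    def forward(self, dec_state, enc_states, src_to_tgt_ids):
        # dec_state:      (batch, d_model)           decoder state at the current step
        # enc_states:     (batch, src_len, d_model)  encoder outputs
        # src_to_tgt_ids: (batch, src_len)           source tokens mapped to output-vocab ids
        #                 (assumed given; defining this map for disjoint
        #                  vocabularies is the problem the paper studies)
        attn = torch.softmax(
            torch.bmm(enc_states, dec_state.unsqueeze(-1)).squeeze(-1), dim=-1
        )  # attention over source positions, reused as the copy distribution
        context = torch.bmm(attn.unsqueeze(1), enc_states).squeeze(1)
        p_gen = torch.sigmoid(self.gate(torch.cat([dec_state, context], dim=-1)))

        gen_dist = F.softmax(self.generate(dec_state), dim=-1)
        copy_dist = torch.zeros_like(gen_dist).scatter_add_(1, src_to_tgt_ids, attn)

        # Final distribution: weighted mixture of generating a token and copying one.
        return p_gen * gen_dist + (1.0 - p_gen) * copy_dist
```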
