Learning variable length units for SMT between related languages via Byte Pair Encoding
Abstract
We explore the use of segments learnt using Byte Pair Encoding (referred to as BPE units) as basic units for statistical machine translation between related languages, and compare them with orthographic syllables, which are currently the best-performing basic units for this translation task. BPE identifies the most frequent character sequences as basic units, while orthographic syllables are linguistically motivated pseudo-syllables. We show that BPE units modestly outperform orthographic syllables as units of translation, with up to an 11% increase in BLEU score. While orthographic syllables can be used only for languages whose writing systems represent vowels, BPE is writing-system independent, and we show that BPE outperforms other units for non-vowel writing systems too. Our results are supported by extensive experimentation spanning multiple language families and writing systems.
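To make the BPE learning procedure concrete, here is a minimal Python sketch in the spirit of Sennrich et al.'s algorithm: it repeatedly counts adjacent symbol pairs over a word-frequency table and merges the most frequent pair into a new unit. The toy corpus and the number of merges are illustrative, not taken from the paper.

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Learn BPE merge operations from a {word: frequency} table."""
    # Start from characters: each word is a tuple of single-character symbols.
    vocab = {tuple(w): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # the most frequent pair becomes a new unit
        merges.append(best)
        # Rewrite the vocabulary with every occurrence of `best` merged.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

toy_corpus = {"lower": 5, "low": 7, "newest": 6, "widest": 3}
print(learn_bpe(toy_corpus, 5))
# [('l', 'o'), ('lo', 'w'), ('e', 's'), ('es', 't'), ('n', 'e')]
```

Because the merge inventory is learned purely from symbol co-occurrence statistics, nothing in the procedure depends on the writing system, which is what makes BPE applicable where orthographic syllables are not.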
Similar papers
Transfer Learning across Low-Resource, Related Languages for Neural Machine Translation
We present a simple method to improve neural translation of a low-resource language pair using parallel data from a related, also low-resource, language pair. The method is based on the transfer method of Zoph et al., but whereas their method ignores any source vocabulary overlap, ours exploits it. First, we split words using Byte Pair Encoding (BPE) to increase vocabulary overlap. Then, we tra...
Orthographic Syllable as basic unit for SMT between Related Languages
We explore the use of the orthographic syllable, a variable-length consonant-vowel sequence, as a basic unit of translation between related languages which use abugida or alphabetic scripts. We show that orthographic syllable level translation significantly outperforms models trained over other basic units (word, morpheme and character) when training over small parallel corpora.
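As a rough illustration of what an orthographic syllable looks like for an alphabetic script, the sketch below splits a word into C*V+ units (a consonant cluster followed by a vowel cluster), attaching any word-final consonants to the last unit. The vowel inventory and segmentation details are simplifying assumptions for illustration; abugida scripts, which the paper also covers, would instead be segmented at the akshara level.

```python
import re

VOWELS = "aeiou"  # illustrative vowel inventory for a Latin-script language

def orthographic_syllables(word):
    """Split a word into C*V+ units; word-final consonants join the last unit."""
    units = re.findall(rf"[^{VOWELS}]*[{VOWELS}]+|[^{VOWELS}]+$", word.lower())
    # A trailing consonant-only chunk is not a syllable on its own: merge it back.
    if len(units) > 1 and not any(c in VOWELS for c in units[-1]):
        tail = units.pop()
        units[-1] += tail
    return units

print(orthographic_syllables("translation"))  # ['tra', 'nsla', 'tion']
```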
Neural Machine Translation of Rare Words with Subword Units
Neural machine translation (NMT) models typically operate with a fixed vocabulary, but translation is an open-vocabulary problem. Previous work addresses the translation of out-of-vocabulary words by backing off to a dictionary. In this paper, we introduce a simpler and more effective approach, making the NMT model capable of open-vocabulary translation by encoding rare and unknown words as seq...
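The segmentation side of this approach can be sketched as follows: given the merge list learned at training time, a rare or unseen word is reduced to characters and the merges are replayed greedily in priority order, so every word decomposes into known subword units. Sennrich et al.'s implementation also tracks an end-of-word marker, which this illustrative sketch omits.

```python
def apply_bpe(word, merges):
    """Segment one word using a learned, priority-ordered BPE merge list."""
    ranks = {pair: i for i, pair in enumerate(merges)}  # earlier merge = higher priority
    symbols = list(word)
    while len(symbols) > 1:
        # Find the adjacent pair with the highest-priority (lowest-rank) merge.
        candidates = [(ranks[(a, b)], i)
                      for i, (a, b) in enumerate(zip(symbols, symbols[1:]))
                      if (a, b) in ranks]
        if not candidates:
            break  # no applicable merges left: the segmentation is final
        _, i = min(candidates)
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

# Illustrative merge list; in practice it comes from BPE learning on training data.
merges = [("l", "o"), ("lo", "w"), ("e", "r"), ("low", "er")]
print(apply_bpe("lowering", merges))  # ['lower', 'i', 'n', 'g']
```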
BPEmb: Tokenization-free Pre-trained Subword Embeddings in 275 Languages
We present BPEmb, a collection of pre-trained subword unit embeddings in 275 languages, based on Byte-Pair Encoding (BPE). In an evaluation using fine-grained entity typing as testbed, BPEmb performs competitively, and for some languages better than alternative subword approaches, while requiring vastly fewer resources and no tokenization. BPEmb is available at https://github.com/bheinzerling/b...
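For reference, a minimal usage sketch with the bpemb Python package, assuming the interface described in the project README (model files are downloaded on first use); the vocabulary size and dimensionality shown are arbitrary choices.

```python
from bpemb import BPEmb  # pip install bpemb

# English BPE model: 10k-operation vocabulary, 50-dimensional embeddings.
bpemb_en = BPEmb(lang="en", vs=10000, dim=50)

subwords = bpemb_en.encode("tokenization-free subword embeddings")
print(subwords)  # BPE units such as ['▁token', 'ization', ...]; exact split depends on the model

vectors = bpemb_en.embed("tokenization-free subword embeddings")
print(vectors.shape)  # (number_of_subwords, 50)
```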
Literature Survey: Study of Neural Machine Translation
We build Neural Machine Translation (NMT) systems for English-Hindi, Bengali-Hindi and Gujarati-Hindi with two different units of translation, i.e. word and subword, and present a comparative study of subword NMT and word-level NMT systems, along with strong results and case studies. We train an attention-based encoder-decoder model for word-level translation and use Byte Pair Encoding (BPE) in subword NMT for wo...