Pavel Petrushkov
Applied Scientist

Originally from the small, beautiful city of Elabuga, Pavel began his academic career at Saint Petersburg State University in Russia, where he completed his Bachelor’s thesis in 2015 on quadcopter visual navigation and control. He moved to Germany in the summer of 2015 to study at RWTH Aachen University. For the past two years his research has revolved around deep learning with neural models, in particular neural machine translation. In January 2018 he completed his Master’s thesis, a joint project between eBay and RWTH Aachen University, on reinforcement learning in neural machine translation.

His research interests include deep learning, sequence-to-sequence modeling, and machine translation.

EMNLP, Copenhagen, Denmark, September 2017

Neural Machine Translation Leveraging Phrase-based Models in a Hybrid Search

In this paper, we introduce a hybrid search for attention-based neural machine translation (NMT). A target phrase learned with statistical MT models extends a hypothesis in the NMT beam search when the attention of the NMT model focuses on the source words translated by this phrase. Phrases added in this way are scored with the NMT model, but also with SMT features including phrase-level translation probabilities and a target language model. Experimental results on German->English news-domain and English->Russian e-commerce-domain translation tasks show that using phrase-based models in NMT search improves MT quality by up to 2.3% BLEU absolute compared with a strong NMT baseline.
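The core idea of the hybrid search can be sketched in a few lines: during beam search, check whether the NMT attention for the current step concentrates on a contiguous source span; if a phrase table covers that span, propose its target phrase as an extension and score it with a weighted mix of NMT and SMT features. The sketch below is purely illustrative and not the paper's implementation; the toy phrase table, the span-detection heuristic, the feature weights, and the mock NMT scorer are all hypothetical.

```python
# Illustrative sketch of one hybrid beam-search extension step (hypothetical
# names and weights; not the implementation from the paper).
import math

# Toy phrase table: source phrase -> (target phrase, SMT log-probability).
PHRASE_TABLE = {
    ("das", "haus"): (("the", "house"), math.log(0.8)),
}

def dominant_span(attention, threshold=0.9):
    """Return the shortest contiguous source span that holds at least
    `threshold` of the attention mass, or None if no span qualifies."""
    n = len(attention)
    best = None
    for i in range(n):
        total = 0.0
        for j in range(i, n):
            total += attention[j]
            if total >= threshold:
                if best is None or (j - i) < (best[1] - best[0]):
                    best = (i, j)
                break
    return best

def phrase_extensions(source, attention, nmt_phrase_score, weights=(1.0, 0.5)):
    """Yield (target_phrase, combined_score) pairs for phrase-table entries
    whose source side matches the span the attention focuses on.
    `nmt_phrase_score` stands in for scoring the phrase with the NMT model."""
    span = dominant_span(attention)
    if span is None:
        return []
    src_phrase = tuple(source[span[0]:span[1] + 1])
    out = []
    if src_phrase in PHRASE_TABLE:
        tgt, smt_logprob = PHRASE_TABLE[src_phrase]
        w_nmt, w_smt = weights
        # Combine NMT model score with SMT phrase-translation feature.
        out.append((tgt, w_nmt * nmt_phrase_score(tgt) + w_smt * smt_logprob))
    return out
```

For example, with attention weights `[0.5, 0.45, 0.05]` over the source `["das", "haus", "."]`, the mass concentrates on the span "das haus", so the phrase "the house" is proposed as a candidate extension alongside the usual word-level NMT expansions.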