This repository contains the implementation and pretrained models from Facebook AI's submissions to the WMT19 Shared Task on News Translation. Built on the fairseq sequence modeling toolkit, these models achieved state-of-the-art results on several language pairs by combining large transformer architectures with training techniques such as back-translation and sampling-based data augmentation.
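fairseq exposes its published WMT19 checkpoints through `torch.hub`, so the pretrained models can be loaded in a few lines. A minimal sketch, assuming network access to download the checkpoint and the `fastBPE` and `sacremoses` Python packages installed (the model name follows fairseq's published hub naming):

```python
# Name of a published fairseq WMT19 checkpoint on torch.hub.
MODEL_NAME = "transformer.wmt19.en-de.single_model"

if __name__ == "__main__":
    import torch

    # Downloads the checkpoint on first use; tokenizer/bpe settings match
    # how the WMT19 models were trained (Moses tokenization + fastBPE).
    model = torch.hub.load("pytorch/fairseq", MODEL_NAME,
                           tokenizer="moses", bpe="fastbpe")
    print(model.translate("Machine learning is great!"))
```

Ensemble variants (e.g. without the `.single_model` suffix) are also published; the single-model checkpoint is smaller and convenient for experimentation.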
Key Features

- Pretrained models for multiple WMT19 language pairs
- Built on top of the fairseq sequence modeling toolkit
- Large transformer architectures tuned with novel training strategies
- Data augmentation via back-translation, sampling, and denoising
- Strong results in the WMT19 benchmark evaluations
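The sampling-based back-translation mentioned above can be illustrated with a toy, dependency-free sketch: given monolingual target-side sentences and a reverse (target-to-source) model, synthetic source sentences are *sampled* from the reverse model rather than taken greedily, and paired with the original targets as extra training data. All names here (`sample_back_translation`, `toy_reverse_translate`) are hypothetical stand-ins, not fairseq APIs:

```python
import random

def sample_back_translation(target_sentences, reverse_translate,
                            num_samples=2, seed=0):
    """Toy sampling-based back-translation: for each monolingual target
    sentence, sample synthetic source sentences from a reverse model and
    emit (synthetic_source, target) training pairs."""
    rng = random.Random(seed)
    pairs = []
    for tgt in target_sentences:
        for _ in range(num_samples):
            candidates = reverse_translate(tgt)  # candidate source sentences
            src = rng.choice(candidates)         # sample instead of argmax
            pairs.append((src, tgt))
    return pairs

# Hypothetical stand-in for a target->source translation model: returns a
# list of candidate "source" renderings of the input.
def toy_reverse_translate(sentence):
    return [sentence.lower(), sentence.upper()]

augmented = sample_back_translation(["Hallo Welt"], toy_reverse_translate)
```

Sampling (rather than always picking the highest-scoring candidate) increases the diversity of the synthetic source side, which is the motivation behind sampling-based augmentation in these submissions.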