M2M-100: Multilingual Many-to-Many Machine Translation Model

M2M-100 is a multilingual, many-to-many machine translation model developed by Facebook AI Research. It translates directly between any pair of 100 languages without pivoting through English as an intermediary. Built on FAIRSEQ, the model supports training, evaluation, and inference on diverse multilingual datasets.
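To illustrate what many-to-many coverage means in practice, the sketch below counts directed translation pairs. With 100 languages, a many-to-many model covers every ordered source/target pair, while an English-centric system only covers pairs that include English. The language codes here are placeholders for illustration, not the model's actual codes.

```python
from itertools import permutations

# Placeholder language codes; assume "lang0" stands in for English.
langs = [f"lang{i}" for i in range(100)]

# Every ordered (source, target) pair with source != target.
direct_pairs = list(permutations(langs, 2))

# Pairs an English-centric system covers directly (English as src or tgt).
english_pairs = [p for p in direct_pairs if "lang0" in p]

print(len(direct_pairs))   # 100 * 99 = 9900 directed pairs
print(len(english_pairs))  # 2 * 99  = 198 pairs involving English
```

The gap between 9,900 and 198 directions is what pivoting through English papers over: every non-English pair must be chained through two translations, compounding errors, whereas M2M-100 handles each direction natively.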

Key Features

  • Supports direct translation between any of 100 languages
  • Does not require pivoting through English
  • Built on FAIRSEQ, optimized for research and scalability
  • Pre-trained models and configuration scripts included
  • Capable of zero-shot and supervised translation
  • Strong BLEU performance on standard translation benchmarks