Tencent recently announced the official open-source release of its lightweight translation model, Hunyuan-MT-7B, which has posted impressive results in international machine translation. At the recently concluded ACL WMT2025 competition, Hunyuan-MT-7B, with a compact 7 billion parameters, took first place in 30 of 31 language categories, covering Chinese, English, and Japanese as well as smaller languages such as Czech, outperforming many competitors with far larger parameter counts. This achievement not only demonstrates its technological leadership but also highlights the groundbreaking potential of lightweight models in professional translation scenarios.
Technically, Hunyuan-MT-7B leverages Tencent's proprietary AngelSlim compression tool to improve inference performance by 30%, allowing it to handle more translation requests on the same hardware. Compared with traditional machine translation, the model more accurately grasps the context of complex texts, such as slang and classical poetry, producing more natural translations. The concurrently released Hunyuan-MT-Chimera-7B ensemble model can also incorporate translations from other models, such as DeepSeek, to deliver even higher-quality results for professional users.
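For readers who want to try the model, below is a minimal sketch of running Hunyuan-MT-7B for translation with the Hugging Face Transformers library. The repository id (`tencent/Hunyuan-MT-7B`), the chat-template usage, and the prompt wording are assumptions for illustration, not details from the article; consult the official model card for the supported prompt format.

```python
# Minimal sketch: translating a sentence with Hunyuan-MT-7B via transformers.
# The repo id and prompt format below are assumptions; check the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tencent/Hunyuan-MT-7B"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, device_map="auto", torch_dtype="auto", trust_remote_code=True
)

# Assumed instruction-style prompt asking for a Chinese -> English translation.
prompt = "Translate the following text from Chinese to English:\n\n海内存知己，天涯若比邻。"
messages = [{"role": "user", "content": prompt}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens (the translation).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same loading pattern would apply to the Hunyuan-MT-Chimera-7B ensemble model, substituting its repository id, though its input format for fusing candidate translations may differ.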
Currently, the model has been deployed in products such as Tencent Meeting and WeChat for Business, significantly improving the multilingual communication experience. Since its debut in 2023, the Tencent Hunyuan series has continued to release open-source models, and the open-sourcing of Hunyuan-MT-7B will further promote a shared ecosystem for large-model technology.