A free translator (wrapper) that supports both online and offline translation. This package uses Google Translate as its first option; when the Google translation fails, it falls back to the offline translation model. This dual-mode (online + offline) design lets the package always return a translation result without failing.
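The dual-mode fallback described above can be sketched as follows. This is a minimal illustration with hypothetical stand-in callables, not the package's actual internals:

```python
# Minimal sketch of the dual-mode (online + offline) fallback.
# The callables below are hypothetical stand-ins for the real backends.
def translate_with_fallback(text, online, offline):
    try:
        return online(text)       # first option: Google (online)
    except Exception:
        return offline(text)      # fallback: local offline model


def flaky_online(text):
    raise ConnectionError("no network")  # simulate Google being unreachable


def offline_model(text):
    return f"[offline] {text}"           # stand-in for the local model


print(translate_with_fallback("Hola", flaky_online, offline_model))
# [offline] Hola
```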
A machine with 16 GB of RAM can load the local models and perform inference with them.
It relies on the deep-translator package for performing translations through Google.
It uses the facebook/fasttext-language-identification model for language detection, which can detect 217 languages. The model size is around 1.18 GB. When the class object is initialized for the first time, this model is downloaded and stored in the cache dir "~/.cache/huggingface/hub".
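The fastText language-identification model returns labels in a prefixed form such as `"__label__spa_Latn"` (language code plus script). A small helper like the hypothetical one below can reduce that to a bare language code; the simulated prediction output here is illustrative, not an actual model call:

```python
# Hypothetical post-processing of a fastText language-ID prediction.
# Labels look like "__label__spa_Latn": a "__label__" prefix, a
# language code, and a script suffix.
def label_to_lang_code(label: str) -> str:
    code = label.replace("__label__", "")  # e.g. "spa_Latn"
    return code.split("_")[0]              # e.g. "spa"


# Simulated shape of model.predict("Hola, ¿cómo estás?"):
# a tuple of labels and a tuple of confidence scores.
labels, scores = ("__label__spa_Latn",), (0.98,)
print(label_to_lang_code(labels[0]))  # spa
```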
It uses the facebook/m2m100_1.2B model for performing offline translations. This model can translate directly between any of the 9,900 directions of 100 languages and has 1.2B parameters. The model size is around 4.96 GB. When the class object is initialized for the first time, this model is downloaded and stored in the cache dir "~/.cache/huggingface/hub".
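The 9,900 figure is just the number of ordered (source, target) pairs among 100 languages:

```python
# 100 languages, excluding same-language "translation":
# each of the 100 sources can target the other 99 languages.
num_languages = 100
directions = num_languages * (num_languages - 1)
print(directions)  # 9900
```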
Find the languages supported by each model/service in 'docs/supported_langauges.md'.
Install the package directly using:

```shell
python setup.py install
```

Basic usage (the target language defaults to English):

```python
from good_translator import GoodTranslator

gt = GoodTranslator()
text = "Hola. Cómo estás ? ¿Cómo va todo en España?"
gt.translate(text)  # default value for param target_lang="en"
```

Pass `target_lang` to translate into another language:

```python
from good_translator import GoodTranslator

gt = GoodTranslator()
text = "Translate me to spanish please."
gt.translate(text, target_lang="es")
```

For batch translation, the result will contain a list of tuples of (original text, translated text):
```python
from good_translator import GoodTranslator

gt = GoodTranslator()
texts = ["Hola. Cómo estás ? ¿Cómo va todo en España?", "Extraño la paella."]
gt.batch_translate(texts)  # default value for param target_lang="en"
```
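Since the batch result is documented as a list of (original text, translated text) tuples, it can be split into parallel sequences with `zip`. The result data below is illustrative, not captured from an actual run:

```python
# Illustrative shape of a batch_translate result:
# a list of (original, translated) tuples.
result = [
    ("Hola. Cómo estás ? ¿Cómo va todo en España?",
     "Hello. How are you? How is everything in Spain?"),
    ("Extraño la paella.", "I miss paella."),
]

# Split into two parallel tuples of originals and translations.
originals, translations = zip(*result)
print(translations[1])  # I miss paella.
```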


