MADLAD-400-10B-MT, converted to CTranslate2
The google/madlad400-10b-mt model, converted to the CTranslate2 format.
Usage:

```python
import ctranslate2
import torch
from huggingface_hub import snapshot_download
from sentencepiece import SentencePieceProcessor

model_path = snapshot_download("santhosh/madlad400-3b-ct2")

tokenizer = SentencePieceProcessor()
tokenizer.load(f"{model_path}/sentencepiece.model")

translator = ctranslate2.Translator(
    model_path, device="cuda" if torch.cuda.is_available() else "cpu"
)

# Prefix the source text with the target-language token (here <2pt> for Portuguese)
text = "<2pt> I love pizza!"
input_tokens = tokenizer.encode(text, out_type=str)

results = translator.translate_batch(
    [input_tokens],
    batch_type="tokens",
    beam_size=2,
    no_repeat_ngram_size=1,
)

print(tokenizer.decode(results[0].hypotheses[0]))
# Eu adoro pizza!
```
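The target language is selected by the `<2xx>` token prepended to the source text (`<2pt>` above requests Portuguese). A minimal sketch of building such inputs for several languages; `make_input` is a hypothetical helper, not part of the model's or CTranslate2's API:

```python
def make_input(text: str, target_lang: str) -> str:
    # MADLAD-400 models expect a <2xx> target-language token
    # (assumption: target_lang is a language code the model was trained on)
    return f"<2{target_lang}> {text}"

# The same sentence aimed at different target languages:
for lang in ["pt", "de", "hi"]:
    print(make_input("I love pizza!", lang))
# <2pt> I love pizza!
# <2de> I love pizza!
# <2hi> I love pizza!
```

Each prefixed string would then be tokenized and passed to `translate_batch` exactly as in the snippet above.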