
# Vocabulary Trimmed facebook/mbart-large-50: duongttr/japanese-trimmed-mbart-large

This model is a trimmed version of facebook/mbart-large-50, produced with vocabtrimmer, a tool that trims a language model's vocabulary to reduce its size. The following table summarizes the trimming results.

|                            | facebook/mbart-large-50 | duongttr/japanese-trimmed-mbart-large |
|----------------------------|-------------------------|---------------------------------------|
| parameter_size_full        | 610,879,488             | 416,319,488                           |
| parameter_size_embedding   | 256,055,296             | 61,495,296                            |
| vocab_size                 | 250,054                 | 60,054                                |
| compression_rate_full      | 100.0                   | 68.15                                 |
| compression_rate_embedding | 100.0                   | 24.02                                 |
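The compression rates above are simply the trimmed parameter counts expressed as a percentage of the original ones. A quick sanity check, using the numbers from the table:

```python
# Parameter counts from the table above
full_orig, full_trimmed = 610_879_488, 416_319_488
emb_orig, emb_trimmed = 256_055_296, 61_495_296

# compression_rate = trimmed size / original size, as a percentage
rate_full = round(full_trimmed / full_orig * 100, 2)
rate_emb = round(emb_trimmed / emb_orig * 100, 2)
print(rate_full, rate_emb)  # 68.15 24.02
```

Most of the savings come from the embedding matrix, which shrinks to about 24% of its original size, while the rest of the network is untouched.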

The following table shows the parameters used to trim the vocabulary.

| language | dataset                      | dataset_column | dataset_name | dataset_split | target_vocab_size | min_frequency |
|----------|------------------------------|----------------|--------------|---------------|-------------------|---------------|
| ja       | vocabtrimmer/mc4_validation  | text           | ja           | validation    | 60000             | 2             |
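Conceptually, these parameters select the `target_vocab_size` most frequent tokens (each appearing at least `min_frequency` times) in the Japanese mC4 validation split, then keep only the corresponding rows of the embedding matrix and remap token ids. A toy sketch of that idea, on a tiny fake vocabulary (this is an illustration only, not the actual vocabtrimmer implementation; the function name is made up):

```python
from collections import Counter

def trim_embedding(embedding, corpus_token_ids, target_vocab_size, min_frequency):
    """Keep rows of `embedding` for frequent tokens only, returning the
    trimmed matrix and an old-id -> new-id mapping."""
    counts = Counter(corpus_token_ids)
    # Tokens seen at least min_frequency times, most frequent first
    frequent = [tok for tok, c in counts.most_common() if c >= min_frequency]
    # Cap at target_vocab_size, then sort so new ids follow old-id order
    kept = sorted(frequent[:target_vocab_size])
    old_to_new = {old: new for new, old in enumerate(kept)}
    return [embedding[i] for i in kept], old_to_new

# Pretend vocabulary of 10 tokens with 4-dim embeddings
emb = [[float(i)] * 4 for i in range(10)]
corpus = [0, 0, 1, 1, 1, 2, 3, 3, 9]  # token ids observed in a toy corpus
trimmed, mapping = trim_embedding(emb, corpus, target_vocab_size=4, min_frequency=2)
print(len(trimmed), mapping)  # 3 {0: 0, 1: 1, 3: 2}
```

Only tokens 0, 1, and 3 occur at least twice, so the 10-row embedding shrinks to 3 rows; the same principle takes mBART-50's 250,054-token vocabulary down to 60,054 entries here.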