MahaGemma-7B

MahaGemma-7B is a Marathi Gemma model: a Gemma 7B (google/gemma-7b) base LoRA fine-tuned on translated Marathi instruction datasets. Dataset link: [MarathiNLP](https://github.com/l3cube-pune/MarathiNLP)

This is part of the MahaNLP initiative. More details coming soon.

Prompt format:

```
<bos>\n### Instruction:\nमहाराष्ट्राची राजधानी काय आहे?\n\n### Input:\n\n\n### Response:\nमहाराष्ट्राची राजधानी मुंबई आहे
```

(In the example above, the instruction asks "What is the capital of Maharashtra?" and the response reads "The capital of Maharashtra is Mumbai.")
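The template above can be assembled programmatically. Below is a minimal sketch; the `build_prompt` helper is hypothetical (not part of the released code) and simply reproduces the documented tag layout:

```python
# Hypothetical helper that assembles a prompt in the MahaGemma format
# documented above: <bos>, an Instruction block, an optional Input block,
# and an empty Response block for the model to complete.

def build_prompt(instruction: str, input_text: str = "") -> str:
    """Return a MahaGemma-style prompt for the given instruction and optional input."""
    return (
        "<bos>\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Input:\n{input_text}\n\n"
        "### Response:\n"
    )

# With an empty input, the Input block collapses to the blank section
# shown in the documented example.
prompt = build_prompt("महाराष्ट्राची राजधानी काय आहे?")
print(prompt)
```

The resulting string can then be tokenized and passed to the model (e.g. loaded from the Hub with `transformers`' `AutoModelForCausalLM` and `AutoTokenizer`); the model's completion follows the `### Response:` marker.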

Citing

```
@article{joshi2022l3cube,
  title={L3Cube-MahaNLP: Marathi natural language processing datasets, models, and library},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2205.14728},
  year={2022}
}
```

Model Family:
MahaGemma-2B
MahaGemma-7B
