---
language:
  - en
license: other
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - mistral
  - trl
  - biology
  - farming
  - agriculture
  - climate
base_model: unsloth/mistral-7b-instruct-v0.2-bnb-4bit
---

# Uploaded model

- **Developed by:** Caleb DeLeeuw; Copyleft Cultivars, a nonprofit
- **License:** Hippocratic License 3.0 (HL3-CL-ECO-EXTR), https://firstdonoharm.dev/version/3/0/cl-eco-extr.html
- **Finetuned from model:** unsloth/mistral-7b-instruct-v0.2-bnb-4bit
- **Dataset used:** CopyleftCultivars/Training-Ready_NF_chatbot_conversation_history, curated from real-world agriculture and natural farming questions paired with the best answers from a previous proof-of-concept chatbot, then lightly edited by domain experts

Using real-world user data from a previous farmer-assistant chatbot service and additional curated datasets (prioritizing sustainable, regenerative, organic farming practices), Gemma 2B and Mistral 7B LLMs were iteratively fine-tuned and compared against each other as well as against basic benchmarks. The Gemma 2B fine-tune performed best, while this Mistral fine-tune remained viable. LoRA adapters were saved for each model.
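
Below is a minimal sketch of how the saved LoRA adapter can be loaded on top of the 4-bit base model with `peft` and `transformers`. The adapter repo id shown is a placeholder, not the confirmed name of this repository, and the prompt is only an example.

```python
# Sketch: load the 4-bit base model, then attach the saved LoRA adapter.
# "adapter_id" below is a placeholder -- replace it with this repository's actual id.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/mistral-7b-instruct-v0.2-bnb-4bit"
adapter_id = "CopyleftCultivars/natural-farming-mistral-7b-lora"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

# Example inference using the Mistral instruct prompt format.
prompt = "[INST] How can I improve nitrogen availability in my cover crops? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```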

Shout out to roger j (bhugxer) for help with the dataset and training framework.

This Mistral model was trained with Unsloth and Hugging Face's TRL library.
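
For reference, the sketch below shows a typical Unsloth + TRL `SFTTrainer` setup for this kind of fine-tune. The hyperparameters, LoRA rank, and the `text` column name are illustrative assumptions, not the exact values used for this model.

```python
# Illustrative Unsloth + TRL SFT setup (assumed hyperparameters, not the exact run).
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the 4-bit base model and attach LoRA adapters via Unsloth.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)
model = FastLanguageModel.get_peft_model(
    model,
    r=16,  # assumed LoRA rank
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Curated natural-farming conversation dataset referenced above.
dataset = load_dataset(
    "CopyleftCultivars/Training-Ready_NF_chatbot_conversation_history",
    split="train",
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",  # assumed column name
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```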