LoRA

Opened by MrRobotoAI

mergekit-extract-lora AuraIndustries/Aura-8B arcee-ai/Llama-3.1-SuperNova-Lite OUTPUT_PATH --no-lazy-unpickle --skip-undecomposable --rank=32 --extend-vocab --model_name=Aura-r32-LoRA --verbose

Does this command extract "all" LoRAs, "all" of the fine-tuning relative to the base model, or something else? Apologies in advance for such a mundane question, but I have not created LoRAs before and I am very interested in delving into it.
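From what I can tell, this kind of extraction works by decomposing the difference between the fine-tuned weights and the base weights into low-rank factors, which would explain the --rank=32 flag. Here is a toy sketch of that idea as I understand it (my own rough illustration, not mergekit's actual code):

```python
# Toy sketch of low-rank extraction: approximate the difference between a
# fine-tuned weight matrix and its base counterpart with a rank-r product,
# which is roughly what a LoRA adapter stores for each targeted layer.
import torch

def extract_lora_pair(w_base, w_finetuned, rank=32):
    """Return (lora_a, lora_b) such that lora_b @ lora_a approximates w_finetuned - w_base."""
    delta = (w_finetuned - w_base).float()           # what the fine-tune changed
    u, s, vh = torch.linalg.svd(delta, full_matrices=False)
    lora_b = u[:, :rank] * s[:rank]                  # (out_features, rank)
    lora_a = vh[:rank, :]                            # (rank, in_features)
    return lora_a, lora_b

# Stand-in matrices for one linear layer's weights.
base = torch.randn(512, 512)
tuned = base + 0.01 * torch.randn(512, 512)
a, b = extract_lora_pair(base, tuned, rank=32)
approx_tuned = base + b @ a                          # low-rank reconstruction of the tuned weight
```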

Also, if it isn't too tedious: are there any environment variables specifically needed, or is there an HF Space that can help perform the task?
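For what it's worth, once the adapter is extracted I was planning to apply it back onto the base model with PEFT, assuming OUTPUT_PATH ends up as a standard PEFT-compatible LoRA directory and that the second positional argument (arcee-ai/Llama-3.1-SuperNova-Lite) is the base model here; please correct me if either assumption is wrong:

```python
# Hypothetical usage sketch, assuming the extracted adapter is PEFT-compatible.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("arcee-ai/Llama-3.1-SuperNova-Lite")
model = PeftModel.from_pretrained(base, "OUTPUT_PATH")  # directory produced by mergekit-extract-lora
model = model.merge_and_unload()                        # optionally bake the adapter back into the weights
```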
