# Venus 120B v1.0 - GGUF

- Model creator: nsfwthrowitaway69
- Original model: Venus 120B v1.0
## Description
GGUF quants for nsfwthrowitaway69's Venus 120B v1.0.
## Provided quants
| Name | Quant method | Size |
| --- | --- | --- |
| Venus-120b-v1.0.Q2_K.gguf | Q2_K | 50.71 GB |
| Venus-120b-v1.0.Q3_K_S.gguf | Q3_K_S | 51.81 GB |
| Venus-120b-v1.0.Q3_K_M.gguf | Q3_K_M | 57.64 GB |
| Venus-120b-v1.0.Q3_K_L.gguf | Q3_K_L | 63.01 GB |
| Venus-120b-v1.0.Q4_K_S.gguf | Q4_K_S | 67.88 GB |
| Venus-120b-v1.0.Q4_K_M.gguf | Q4_K_M | 72.14 GB |
| Venus-120b-v1.0.Q5_K_S.gguf | Q5_K_S | 82.76 GB |
| Venus-120b-v1.0.Q5_K_M.gguf | Q5_K_M | 85.02 GB |
| Venus-120b-v1.0.Q6_K.gguf | Q6_K | 98.70 GB |
| Venus-120b-v1.0.Q8_0.gguf | Q8_0 | 127.84 GB |
## All of the files are split and require joining

Note: Hugging Face does not support uploading files larger than 50 GB, so the quants have been uploaded as split files.
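If you want to confirm which split parts are in the repo before downloading, here is a minimal sketch using `huggingface_hub` (assuming the package is installed and that this repo's id is `3-3/Venus-120b-v1.0-GGUF`):

```python
# Hedged sketch: list the .gguf split parts available in this repo.
from huggingface_hub import list_repo_files

repo_id = "3-3/Venus-120b-v1.0-GGUF"  # assumed repo id
for name in sorted(list_repo_files(repo_id)):
    if ".gguf" in name:
        print(name)
```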
### Q2_K - Q6_K

Download the two parts of your preferred quant. For Q6_K, that would be:

- Venus-120b-v1.0.Q6_K.gguf-split-a
- Venus-120b-v1.0.Q6_K.gguf-split-b
### Q8_0

Download the three parts of the Q8_0 quant:

- Venus-120b-v1.0.Q8_0.gguf-split-a
- Venus-120b-v1.0.Q8_0.gguf-split-b
- Venus-120b-v1.0.Q8_0.gguf-split-c
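For reference, a minimal Python sketch for downloading the parts programmatically with `huggingface_hub`; the repo id `3-3/Venus-120b-v1.0-GGUF` and the Q6_K filenames are assumptions, so swap in the quant you actually want:

```python
# Hedged sketch: download both parts of the Q6_K quant into the current
# directory. The repo id and filenames below are assumptions; adjust them
# for the quant you downloaded.
from huggingface_hub import hf_hub_download

repo_id = "3-3/Venus-120b-v1.0-GGUF"  # assumed repo id
parts = [
    "Venus-120b-v1.0.Q6_K.gguf-split-a",
    "Venus-120b-v1.0.Q6_K.gguf-split-b",
]
for filename in parts:
    path = hf_hub_download(repo_id=repo_id, filename=filename, local_dir=".")
    print("Downloaded", path)
```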
To join the files, do the following:
**Linux and macOS:**

```sh
cat Venus-120b-v1.0.Q6_K.gguf-split-* > Venus-120b-v1.0.Q6_K.gguf && rm Venus-120b-v1.0.Q6_K.gguf-split-*
```

Note: Replace Q6_K with the quant you downloaded.
**Windows command line:**

Q2_K - Q6_K:

```bat
COPY /B Venus-120b-v1.0.Q6_K.gguf-split-a + Venus-120b-v1.0.Q6_K.gguf-split-b Venus-120b-v1.0.Q6_K.gguf
del Venus-120b-v1.0.Q6_K.gguf-split-a Venus-120b-v1.0.Q6_K.gguf-split-b
```

Note: Replace Q6_K with the quant you downloaded.
Q8_0:

```bat
COPY /B Venus-120b-v1.0.Q8_0.gguf-split-a + Venus-120b-v1.0.Q8_0.gguf-split-b + Venus-120b-v1.0.Q8_0.gguf-split-c Venus-120b-v1.0.Q8_0.gguf
del Venus-120b-v1.0.Q8_0.gguf-split-a Venus-120b-v1.0.Q8_0.gguf-split-b Venus-120b-v1.0.Q8_0.gguf-split-c
```
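If you prefer not to use shell commands, the same join can be done with a small Python script that works on any OS. This is only a sketch of the cat / COPY /B step above, not part of the original instructions:

```python
# Concatenate the split parts (in -split-a, -split-b, ... order) into a single
# .gguf file, then delete the parts. Equivalent to the cat / COPY /B commands.
import glob
import os
import shutil

quant = "Q6_K"  # replace with the quant you downloaded
parts = sorted(glob.glob(f"Venus-120b-v1.0.{quant}.gguf-split-*"))
output = f"Venus-120b-v1.0.{quant}.gguf"

with open(output, "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)  # stream each part into the output file

for part in parts:
    os.remove(part)  # remove the split parts once the join is complete
```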