apepkuss79 committed
Commit 609cf09 · verified · 1 Parent(s): 13082bd

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -89,9 +89,9 @@ language:
  | [Palmyra-Med-70B-32K-Q8_0-00002-of-00003.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-Q8_0-00002-of-00003.gguf) | Q8_0 | 8 | 29.8 GB| very large, extremely low quality loss - not recommended |
  | [Palmyra-Med-70B-32K-Q8_0-00003-of-00003.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-Q8_0-00003-of-00003.gguf) | Q8_0 | 8 | 15.4 GB| very large, extremely low quality loss - not recommended |
  | [Palmyra-Med-70B-32K-f16-00001-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00001-of-00005.gguf) | f16 | 16 | 30.0 GB| |
- | [Palmyra-Med-70B-32K-f16-00001-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00001-of-00005.gguf) | f16 | 16 | 29.6 GB| |
- | [Palmyra-Med-70B-32K-f16-00001-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00001-of-00005.gguf) | f16 | 16 | 29.9 GB| |
- | [Palmyra-Med-70B-32K-f16-00001-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00001-of-00005.gguf) | f16 | 16 | 29.6 GB| |
+ | [Palmyra-Med-70B-32K-f16-00002-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00002-of-00005.gguf) | f16 | 16 | 29.6 GB| |
+ | [Palmyra-Med-70B-32K-f16-00003-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00003-of-00005.gguf) | f16 | 16 | 29.9 GB| |
+ | [Palmyra-Med-70B-32K-f16-00004-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00004-of-00005.gguf) | f16 | 16 | 29.6 GB| |
  | [Palmyra-Med-70B-32K-f16-00005-of-00005.gguf](https://huggingface.co/second-state/Palmyra-Med-70B-32K-GGUF/blob/main/Palmyra-Med-70B-32K-f16-00005-of-00005.gguf) | f16 | 16 | 22.2 GB| |

  *Quantized with llama.cpp b3499*
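The fix above corrects copy-pasted shard names: the old README listed `00001-of-00005` four times, so the f16 table did not describe a complete five-shard set. A small check like the following can catch that kind of mistake, assuming the `-%05d-of-%05d.gguf` suffix convention used by llama.cpp's `gguf-split`. The helper names here (`SHARD_RE`, `is_complete_shard_set`) are illustrative, not part of any library.

```python
import re

# Matches the "-00002-of-00005.gguf" style suffix on split GGUF files.
SHARD_RE = re.compile(r"-(\d{5})-of-(\d{5})\.gguf$")

def is_complete_shard_set(filenames):
    """Return True if filenames form exactly shards 1..N of one N-way split."""
    indices, totals = set(), set()
    for name in filenames:
        m = SHARD_RE.search(name)
        if not m:
            return False  # not a split-GGUF name
        indices.add(int(m.group(1)))
        totals.add(int(m.group(2)))
    if len(totals) != 1:
        return False  # mixed "-of-N" counts
    n = totals.pop()
    # Duplicated or missing shard indices both fail this equality.
    return indices == set(range(1, n + 1))
```

Run against the table before and after this commit, the old list (shard 1 repeated) fails while the corrected `00001`–`00005` list passes.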