exui error - 'intermediate_size' on gptq-3bit-128g-actorder_True
#3
by AgileTurnip - opened
Unable to load this in exui; I get this error:
Model: https://huggingface.co/TheBloke/Falcon-180B-GPTQ, branch gptq-3bit-128g-actorder_True
@AgileTurnip I believe exllamav2, which exui uses, does not support Falcon models.
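That would be consistent with the error in the title: `intermediate_size` is a field that Llama-style `config.json` files carry, and Falcon configs typically do not include it, so a loader expecting that key fails. A minimal sketch to inspect what that branch's config actually contains (assumes `huggingface_hub` is installed; this check is only illustrative, it is not how exui/exllamav2 parse the config):

```python
import json
from huggingface_hub import hf_hub_download

# Fetch config.json from the branch mentioned in this thread.
config_path = hf_hub_download(
    repo_id="TheBloke/Falcon-180B-GPTQ",
    filename="config.json",
    revision="gptq-3bit-128g-actorder_True",
)

with open(config_path) as f:
    config = json.load(f)

print("model_type:", config.get("model_type"))                   # expected: "falcon"
print("architectures:", config.get("architectures"))
print("has intermediate_size:", "intermediate_size" in config)   # likely False for Falcon
```

If the key really is absent, the failure comes from the config parser rather than from this particular quantization, so switching to a different GPTQ branch of the same repo would not help.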