model_type
#4
by sdyy - opened
state-spaces/mamba-130m

Which value should `model_type` be set to for a Mamba GGUF model? Right now I have model_type='?????'.
```python
from langchain_community.llms import CTransformers

llm = CTransformers(
    model='./mamba-2.8b-f32.gguf',
    model_type='mamba',  # or any other suitable type
    config={'max_new_tokens': 256, 'repetition_penalty': 1.1}
)
print(llm.invoke('AI is going to'))
```

Change the model_type to 'llama'.
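Applying the suggested fix, the call would look like the sketch below. This is a minimal illustration only: it assembles the keyword arguments with `model_type='llama'` as the reply suggests, and the actual `CTransformers` call is left commented out because it requires the `ctransformers` package and the local `mamba-2.8b-f32.gguf` file from the question; whether the `llama` loader actually handles a Mamba-architecture GGUF is the reply's claim, not verified here.

```python
# Corrected configuration per the reply: 'mamba' is not a model_type
# that ctransformers recognizes, so the suggestion is to pass 'llama'.
kwargs = {
    'model': './mamba-2.8b-f32.gguf',        # local file from the question
    'model_type': 'llama',                   # changed from 'mamba'
    'config': {'max_new_tokens': 256, 'repetition_penalty': 1.1},
}

# Requires langchain_community and ctransformers installed, plus the model file:
# from langchain_community.llms import CTransformers
# llm = CTransformers(**kwargs)
# print(llm.invoke('AI is going to'))

print(kwargs['model_type'])
```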