'VisionExpertAttention' object has no attribute 'q_proj'
#20 · opened by TJ98
Thanks for your excellent work. When I try to run inference across multiple RTX 2080 Ti GPUs, the code raises `'VisionExpertAttention' object has no attribute 'q_proj'`, and I'm not sure how to fix it. I also could not find any `q_proj` entry under `VisionExpertAttention` in the model.safetensors.index.json file. A sketch of the multi-GPU loading path in question is included after the environment list below.
My environment is configured as follows:
```
torch          2.1.0+cu118
xformers       0.0.22.post7+cu118
transformers   4.35.0
triton         2.1.0
accelerate     0.24.1
sentencepiece  0.1.99
einops         0.7.0
```
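
As context for the error: in the CogVLM modeling code, `VisionExpertAttention` fuses Q/K/V into `vision_expert_query_key_value` / `language_expert_query_key_value` projections, so there is no `q_proj` attribute for LLaMA-style tooling to find. Below is a minimal multi-GPU loading sketch with `accelerate`, assuming the `THUDM/cogvlm-chat-hf` checkpoint and the `CogVLMDecoderLayer`/`TransformerLayer` class names from its remote modeling code; the `no_split_module_classes` hint keeps each decoder layer, and hence each `VisionExpertAttention`, intact on a single GPU rather than letting the auto device map slice it apart:

```python
import torch
from transformers import AutoModelForCausalLM
from accelerate import init_empty_weights, infer_auto_device_map, load_checkpoint_and_dispatch

# Hypothetical paths -- substitute the actual hub ID / local snapshot folder.
MODEL_PATH = "THUDM/cogvlm-chat-hf"
CHECKPOINT_DIR = "/path/to/local/snapshot"  # load_checkpoint_and_dispatch needs local files

# Build the model skeleton on the meta device; no weights are materialized yet.
with init_empty_weights():
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_PATH,
        torch_dtype=torch.float16,  # fp16 rather than bf16 for Turing cards like the 2080 Ti
        low_cpu_mem_usage=True,
        trust_remote_code=True,
    )

# Keep whole decoder layers on one GPU so accelerate never splits
# VisionExpertAttention across devices (the class names below come from
# CogVLM's remote modeling code and are an assumption here).
device_map = infer_auto_device_map(
    model,
    max_memory={0: "9GiB", 1: "9GiB", "cpu": "16GiB"},  # headroom on each 11 GB card
    no_split_module_classes=["CogVLMDecoderLayer", "TransformerLayer"],
)

model = load_checkpoint_and_dispatch(model, CHECKPOINT_DIR, device_map=device_map)
model = model.eval()
```

Note that `load_checkpoint_and_dispatch` expects a local folder of weight files, so the hub ID above only serves to build the skeleton; the dispatch step needs a downloaded snapshot path.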
TJ98 changed discussion status to closed