flash attention 2
#2
by yassin-eb · opened
Hello, thank you very much for your amazing work!
Are there any plans to implement flash attention in cogvlm2 anytime soon?
Using torch, which already ships with flash-attn v2 built in, is fine.
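As a minimal sketch of what that reply means (assuming PyTorch >= 2.0): torch's built-in `scaled_dot_product_attention` dispatches to a FlashAttention-2 kernel automatically on supported CUDA GPUs, so no separate flash-attn integration is needed. The tensor shapes below are illustrative, not taken from cogvlm2.

```python
import torch
import torch.nn.functional as F

# (batch, heads, seq_len, head_dim) — illustrative sizes only
q = torch.randn(1, 8, 128, 64)
k = torch.randn(1, 8, 128, 64)
v = torch.randn(1, 8, 128, 64)

# On CPU this falls back to the math kernel; on a supported CUDA GPU
# PyTorch picks the flash / memory-efficient kernels automatically.
out = F.scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 128, 64])
```

When loading the model with `transformers`, passing `attn_implementation="sdpa"` to `from_pretrained` should route attention through this same kernel, provided the model supports it.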
zRzRzRzRzRzRzR changed discussion status to closed