exl2

#2 · opened by SekkSea

Can we get a low-quant exl2 of this? I'm working with 12 GB of VRAM, and these sparse models are bringing new hope to me.

Sure, we will try to prepare quantized models and release them after evaluation.
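
For anyone estimating which bits-per-weight target might fit on a 12 GB card, here is a rough back-of-the-envelope sketch. The parameter count and VRAM headroom below are placeholders for illustration, not this model's actual figures:

```python
# Rough sketch: estimate exl2 weight footprint vs. a 12 GB card.
# The parameter count and headroom are placeholders, not this model's real numbers.

def weights_gib(n_params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GiB at a given average bits-per-weight."""
    return n_params_billion * 1e9 * bits_per_weight / 8 / 2**30

if __name__ == "__main__":
    vram_gib = 12.0
    headroom_gib = 2.0        # assumed reserve for KV cache and activations
    n_params_billion = 30.0   # placeholder parameter count
    for bpw in (2.4, 3.0, 3.5, 4.0):
        need = weights_gib(n_params_billion, bpw)
        verdict = "fits" if need < vram_gib - headroom_gib else "too big"
        print(f"{bpw:.1f} bpw -> ~{need:.1f} GiB weights ({verdict})")
```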
