[fine-tuning] attention_dropout not defined

#2
by jondurbin - opened

Hi there,

When attempting to fine-tune the model, I bumped into an attribute error here (self.attention_dropout):
https://huggingface.co/internlm/internlm2-base-20b/blob/main/modeling_internlm2.py#L483

I can set it manually on my local copy for now, but ideally this would be set properly.
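
For anyone else hitting this before it's fixed upstream, here is roughly the workaround I mean, as a sketch rather than a proper fix: after loading, patch the missing attribute onto the attention modules. The class-name check is a guess at how the remote code names its attention classes, and 0.0 assumes you don't actually want attention dropout:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "internlm/internlm2-base-20b", trust_remote_code=True
)

# Define the attribute the forward pass reads (modeling_internlm2.py#L483)
# so the self.attention_dropout lookup no longer raises AttributeError.
# 0.0 simply disables attention dropout.
for module in model.modules():
    if "Attention" in type(module).__name__:
        module.attention_dropout = 0.0
```

The cleaner fix would of course be to set this in the attention module's `__init__` (e.g. from a `config.attention_dropout` field) in the repo itself.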

Thanks for the release!
