Fine Tuning

#1
by skylerjoe - opened

Hi everyone, thanks for putting this great model out there. I was fine-tuning the larger 3.8B parameter version with PEFT, creating an adapter via AutoTrain. However, when I try to load the adapter back onto the base model, the embedding sizes of the base model and the adapter created with AutoTrain don't match.
My error:
```
size mismatch for base_model.model.model.embed_tokens.weight: copying a param with shape torch.Size([32012, 3072]) from checkpoint, the shape in current model is torch.Size([32064, 3072]).
size mismatch for base_model.model.lm_head.weight: copying a param with shape torch.Size([32012, 3072]) from checkpoint, the shape in current model is torch.Size([32064, 3072]).
```
It seems to have something to do with the vocabulary size: the adapter checkpoint expects 32012 embedding rows, while the base model ships with 32064. If anyone could help me out or point me to some resources, that would be great! Thanks in advance.
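
For reference, here is a minimal sketch of the workaround I'm considering, assuming the adapter was saved after AutoTrain resized the embeddings to 32012 rows and saved its tokenizer alongside the adapter. The model id and adapter path below are placeholders, not the exact values from my run:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "microsoft/Phi-3-mini-4k-instruct"  # assumption: the 3.8B base model used for training
adapter_dir = "path/to/autotrain/adapter"     # placeholder: wherever AutoTrain saved the adapter

# Load the tokenizer that was saved with the adapter, plus the stock base model.
tokenizer = AutoTokenizer.from_pretrained(adapter_dir)
model = AutoModelForCausalLM.from_pretrained(base_id)

# Resize embed_tokens (and the tied lm_head) to the 32012 rows the adapter
# checkpoint expects before loading the adapter weights. Ideally this equals
# len(tokenizer) used during training.
model.resize_token_embeddings(32012)

# Now the shapes should line up and the adapter should load without the mismatch.
model = PeftModel.from_pretrained(model, adapter_dir)
```

Does this look like the right approach, or is there a cleaner way to handle the vocabulary mismatch?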
