Cannot load model in transformers.pipeline

#51
by DevBhuyan - opened

Hi everyone, this is my first time posting on this platform. Please correct me if I'm wrong about anything.

I've tried this boilerplate code to test the predictions of this model through the Python SDK on my local machine:

from transformers import pipeline

messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe = pipeline("text-generation", model="CohereForAI/c4ai-command-r-plus")
pipe(messages)

It throws this error:

ValueError: Could not load model CohereForAI/c4ai-command-r-plus with any of the following classes: (<class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForCausalLM'>,). See the original errors:

while loading with TFAutoModelForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/home/dev/anaconda3/lib/python3.9/site-packages/transformers/pipelines/base.py", line 283, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/home/dev/anaconda3/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 567, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.cohere.configuration_cohere.CohereConfig'> for this kind of AutoModel: TFAutoModelForCausalLM.
Model type should be one of BertConfig, CamembertConfig, CTRLConfig, GPT2Config, GPT2Config, GPTJConfig, MistralConfig, OpenAIGPTConfig, OPTConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoFormerConfig, TransfoXLConfig, XGLMConfig, XLMConfig, XLMRobertaConfig, XLNetConfig.

Before this, I've always used pretrained models based on established architectures (RoBERTa, GPT, BART, etc.), and transformers would find an appropriate PreTrainedModel class for the given model, so everything was smooth. This time, however, I'm guessing the issue pops up because c4ai-command-r-plus is a newer architecture. I tried upgrading transformers to the latest version, but it didn't help. (I'm using Python 3.9, btw, with transformers==4.42.3.)

Am I missing something obvious?

Thanks in advance for your inputs.

pip install torch - you have TF but not Torch installed, so the pipeline is trying to initialize with the TensorFlow backend, and this architecture has no TF implementation.
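
Once torch is installed, something like this should work (a rough sketch; framework="pt" just makes the backend choice explicit, and torch_dtype/device_map are optional but sensible for a model this large, with device_map="auto" requiring accelerate):

import torch
from transformers import pipeline

messages = [
    {"role": "user", "content": "Who are you?"},
]

# Explicitly select the PyTorch backend; with torch installed the pipeline
# would pick it by default anyway.
pipe = pipeline(
    "text-generation",
    model="CohereForAI/c4ai-command-r-plus",
    framework="pt",
    torch_dtype=torch.float16,  # optional: reduces memory use
    device_map="auto",          # optional: needs accelerate installed
)

print(pipe(messages))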

@Rocketknight1 thanks for pointing that out! I didn't see it that way because all the earlier models I used worked with the TF backend. I guess it's time to move to torch for this one.

Cohere For AI org

Hi, this issue looks resolved, so I'm closing it, but feel free to reopen it in case you're still facing any issues related to this, @DevBhuyan!

shivi changed discussion status to closed
