cannot use model via the serverless API

#66
by lenadan - opened

Hi,
I'm a Pro user and, from what I understand, this model (Meta-Llama-3-70B-Instruct) should be available via the serverless API. However, I get an error every time I try to use it. I don't think it's a problem with my token, since other "pro" models work fine for me. Using the inference UI fails as well. This is the error I get:

(screenshot of the error attached: Screenshot 2024-09-26 at 11.22.32.png)

What could the problem be?
P.S. I requested and was granted access to this model, so that's not the issue.
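For reference, a serverless API call to this model can be sketched as below. This is a minimal example, not the exact code I'm running: the endpoint URL follows the standard serverless Inference API pattern, the prompt and generation parameters are placeholders, and the token is read from an `HF_TOKEN` environment variable.

```python
# Minimal sketch of a serverless Inference API request for this model.
# Assumptions: HF_TOKEN is set in the environment; prompt/parameters are placeholders.
import os

import requests

API_URL = "https://api-inference.huggingface.co/models/meta-llama/Meta-Llama-3-70B-Instruct"
headers = {"Authorization": f"Bearer {os.environ.get('HF_TOKEN', '')}"}
payload = {"inputs": "Hello, Llama!", "parameters": {"max_new_tokens": 50}}


def query(payload):
    # A non-200 status here (e.g. 503 while the model loads, or 403/404 when
    # the model is not being served) surfaces as the kind of error described above.
    response = requests.post(API_URL, headers=headers, json=payload)
    response.raise_for_status()
    return response.json()


if os.environ.get("HF_TOKEN"):
    print(query(payload))
```

With a valid token and the model actually deployed, the response is a JSON list containing the generated text.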

Same here!

Hi all! This should be back!
