Getting error "config.json not found"

#8
by vishx1 - opened

Hello Abhishek,
I'm getting this error when trying to run the Inference API: "404 Client Error. (Request ID: Root=1-6602cc60-7dd3e60715b1fc1363caac33;871fd7cc-ab13-412e-90ad-0c8bba20ba5b) Entry Not Found for url: https://huggingface.co/abhishek/llama-2-7b-hf-small-shards/resolve/5faf5923e00b35c1d73067c87a6570e6aaf55973/config.json."

Can you please tell me if I'm missing something?

Here's the full error log:
```
Could not load model vishx1/ai-test-2 with any of the following classes: (<class 'transformers.models.llama.modeling_llama.LlamaForCausalLM'>,). See the original errors:

while loading with LlamaForCausalLM, an error is thrown:
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 286, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 404 Client Error: Not Found for url: https://huggingface.co/abhishek/llama-2-7b-hf-small-shards/resolve/5faf5923e00b35c1d73067c87a6570e6aaf55973/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/utils/hub.py", line 398, in cached_file
    resolved_file = hf_hub_download(
                    ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
    metadata = get_hf_file_metadata(
               ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
    r = _request_wrapper(
        ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
    response = _request_wrapper(
               ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
    hf_raise_for_status(response)
  File "/usr/local/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py", line 296, in hf_raise_for_status
    raise EntryNotFoundError(message, response) from e
huggingface_hub.utils._errors.EntryNotFoundError: 404 Client Error. (Request ID: Root=1-6602cc60-7dd3e60715b1fc1363caac33;871fd7cc-ab13-412e-90ad-0c8bba20ba5b) Entry Not Found for url: https://huggingface.co/abhishek/llama-2-7b-hf-small-shards/resolve/5faf5923e00b35c1d73067c87a6570e6aaf55973/config.json.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/transformers/src/transformers/pipelines/base.py", line 279, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/modeling_utils.py", line 2983, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/configuration_utils.py", line 602, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/transformers/src/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/src/transformers/src/transformers/utils/hub.py", line 452, in cached_file
    raise EnvironmentError(
OSError: abhishek/llama-2-7b-hf-small-shards does not appear to have a file named config.json. Checkout 'https://huggingface.co/abhishek/llama-2-7b-hf-small-shards/5faf5923e00b35c1d73067c87a6570e6aaf55973' for available files.
```
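For what it's worth, the failing URL in the log splits into repo, revision, and file, which shows the 404 is for `config.json` in the base repo `abhishek/llama-2-7b-hf-small-shards` at that pinned revision, not in my `vishx1/ai-test-2` repo. A small offline sketch (the `parse_resolve_url` helper name is mine, not a Hub API):

```python
from urllib.parse import urlparse

def parse_resolve_url(url: str) -> dict:
    # Hypothetical helper: Hub download URLs have the shape
    # https://huggingface.co/<repo_id>/resolve/<revision>/<path/to/file>
    parts = urlparse(url).path.strip("/").split("/")
    i = parts.index("resolve")
    return {
        "repo_id": "/".join(parts[:i]),
        "revision": parts[i + 1],
        "filename": "/".join(parts[i + 2:]),
    }

# The URL from the 404 in the log above:
url = ("https://huggingface.co/abhishek/llama-2-7b-hf-small-shards/resolve/"
       "5faf5923e00b35c1d73067c87a6570e6aaf55973/config.json")
info = parse_resolve_url(url)
print(info["repo_id"])   # abhishek/llama-2-7b-hf-small-shards
print(info["filename"])  # config.json
```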

vishx1 changed discussion status to closed
vishx1 changed discussion status to open