runtime error

Exit code: 1. Reason:

.10/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Loading pipeline components...:  67%|██████▋   | 4/6 [00:00<00:00, 14.22it/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 38, in <module>
    pipe_inpaint = StableDiffusionXLInpaintPipeline.from_single_file(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/single_file.py", line 495, in from_single_file
    loaded_sub_model = load_single_file_sub_model(
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/single_file.py", line 102, in load_single_file_sub_model
    loaded_sub_model = load_method(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/diffusers/loaders/single_file_model.py", line 299, in from_single_file
    unexpected_keys = load_model_dict_into_meta(model, diffusers_format_checkpoint, dtype=torch_dtype)
  File "/usr/local/lib/python3.10/site-packages/diffusers/models/model_loading_utils.py", line 154, in load_model_dict_into_meta
    raise ValueError(
ValueError: Cannot load because down_blocks.1.attentions.0.proj_in.weight expected shape tensor(..., device='meta', size=(640, 640, 1, 1)), but got torch.Size([640, 640]). If you want to instead overwrite randomly initialized weights, please make sure to pass both `low_cpu_mem_usage=False` and `ignore_mismatched_sizes=True`. For more information, see also: https://github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389 as an example.
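The ValueError is raised while copying the converted state dict into the freshly instantiated UNet: the instantiated model expects a 1x1 convolution weight of shape (640, 640, 1, 1) at down_blocks.1.attentions.0.proj_in, but the checkpoint stores a 2D linear weight of shape (640, 640). A conv-versus-linear mismatch like this usually means the single-file checkpoint's UNet layout does not match the configuration the loader built for it, for example a non-SDXL inpainting checkpoint being loaded through StableDiffusionXLInpaintPipeline. Passing low_cpu_mem_usage=False and ignore_mismatched_sizes=True, as the message suggests, would only replace the mismatched tensors with randomly initialized weights rather than load them. Below is a minimal sketch of one way to handle this, assuming the checkpoint may actually follow the non-XL inpainting layout; the checkpoint path is a hypothetical placeholder for whatever app.py line 38 passes.

    import torch
    from diffusers import StableDiffusionInpaintPipeline, StableDiffusionXLInpaintPipeline

    # Hypothetical placeholder; substitute the file used in app.py line 38.
    CKPT_PATH = "models/inpaint-checkpoint.safetensors"

    try:
        # If the weights really are an SDXL inpainting export, this load should succeed.
        pipe_inpaint = StableDiffusionXLInpaintPipeline.from_single_file(
            CKPT_PATH, torch_dtype=torch.float16
        )
    except ValueError:
        # A conv (640, 640, 1, 1) vs. linear (640, 640) mismatch suggests the
        # checkpoint uses a different UNet layout, so fall back to the non-XL
        # inpainting pipeline instead of overwriting weights at random.
        pipe_inpaint = StableDiffusionInpaintPipeline.from_single_file(
            CKPT_PATH, torch_dtype=torch.float16
        )

If the app has to stay on SDXL, the alternative is to point from_single_file at a checkpoint that is known to be an SDXL inpainting export, so the expected and stored shapes agree without any fallback.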
