runtime error

Exit code: 1. Reason:

The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
The config attributes {'block_out_channels': [64, 64, 64, 64]} were passed to AutoencoderTiny, but are not expected and will be ignored. Please verify your config.json configuration file.
Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:00<00:00, 2.74it/s]
You set `add_prefix_space`. The tokenizer needs to be converted from the slow tokenizers
Loading pipeline components...: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 7/7 [00:06<00:00, 1.17it/s]
ZeroGPU tensors packing: 0%| | 0.00/33.7G [00:00<?, ?B/s]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 296, in <module>
    app.launch()
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/gradio.py", line 142, in launch
    task(*task_args, **task_kwargs)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 348, in pack
    _pack(Config.zerogpu_offload_dir)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 340, in _pack
    pack = pack_tensors(originals, fakes, offload_dir, callback=update)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/packing.py", line 114, in pack_tensors
    os.posix_fallocate(fd, 0, total_asize)
OSError: [Errno 28] No space left on device
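The failure is a disk-space issue rather than a model bug: the packing step appears to preallocate the full offload file with os.posix_fallocate before any tensors are written (the progress bar is still at 0.00/33.7G), so it aborts with ENOSPC as soon as the target filesystem cannot hold the requested size. Below is a minimal sketch of that failure mode under those assumptions; the directory, file name, and byte count are illustrative, not the Space's actual Config.zerogpu_offload_dir.

import errno
import os
import shutil
import tempfile

# Illustrative offload target; the real Space packs into Config.zerogpu_offload_dir.
offload_dir = tempfile.mkdtemp()
required_bytes = 34 * 1024**3  # roughly the 33.7G reported by the packing progress bar

# Show how much space the filesystem actually offers.
total, used, free = shutil.disk_usage(offload_dir)
print(f"free: {free / 1e9:.1f} GB, required: {required_bytes / 1e9:.1f} GB")

pack_path = os.path.join(offload_dir, "pack.bin")
fd = os.open(pack_path, os.O_RDWR | os.O_CREAT)
try:
    # Preallocating the whole file up front raises OSError(ENOSPC) when the
    # filesystem cannot back the requested size -- the same error as the traceback.
    os.posix_fallocate(fd, 0, required_bytes)
    print("Preallocation succeeded; there is enough free space for the packed tensors.")
except OSError as exc:
    if exc.errno == errno.ENOSPC:
        print("[Errno 28] No space left on device")
    else:
        raise
finally:
    os.close(fd)
    os.unlink(pack_path)

In practice this means the ~33.7 GB of packed tensors simply exceed the container's free disk; loading fewer or smaller models at startup, or loading them in lower precision, is the usual way to bring the pack size under the available space.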
