runtime error

Exit code: 1. Reason:
 75%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–Œ | 4.97G/6.62G [00:04<00:01, 1.35GB/s]
model-00002-of-000002.safetensors: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–‰| 6.62G/6.62G [00:05<00:00, 1.31GB/s]
Downloading shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:12<00:00, 6.07s/it]
Downloading shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:12<00:00, 6.21s/it]
Loading checkpoint shards:   0%|          | 0/2 [00:00<?, ?it/s]
Loading checkpoint shards:  50%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆ     | 1/2 [00:04<00:04, 4.35s/it]
Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:07<00:00, 3.88s/it]
Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 2/2 [00:07<00:00, 3.95s/it]
generation_config.json:   0%|          | 0.00/181 [00:00<?, ?B/s]
generation_config.json: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 181/181 [00:00<00:00, 1.03MB/s]
/usr/local/lib/python3.10/site-packages/gradio/components/chatbot.py:279: UserWarning: The 'tuples' format for chatbot messages is deprecated and will be removed in a future version of Gradio. Please set type='messages' instead, which uses openai-style 'role' and 'content' keys.
  warnings.warn(
ZeroGPU tensors packing:   0%|          | 0.00/30.5G [00:00<?, ?B/s]
ZeroGPU tensors packing:   0%|          | 0.00/30.5G [00:00<?, ?B/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 129, in <module>
    ).queue().launch()
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/gradio.py", line 162, in launch
    task(*task_args, **task_kwargs)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 367, in pack
    _pack(Config.zerogpu_offload_dir)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/patching.py", line 359, in _pack
    pack = pack_tensors(originals, fakes, offload_dir, callback=update)
  File "/usr/local/lib/python3.10/site-packages/spaces/zero/torch/packing.py", line 114, in pack_tensors
    os.posix_fallocate(fd, 0, total_asize)
OSError: [Errno 28] No space left on device
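The crash is the final `OSError: [Errno 28] No space left on device`: `os.posix_fallocate` fails while the `spaces` package tries to reserve ~30.5 GB in its ZeroGPU offload directory for the packed model tensors. That packed size is far larger than the two downloaded shards, which suggests the weights are being held as float32 at launch. As a minimal sketch, assuming the Space loads the model with transformers' `AutoModelForCausalLM` (the checkpoint name below is a placeholder, not taken from the log), loading in bfloat16 roughly halves what ZeroGPU has to pack:

```python
# Hypothetical sketch: load the checkpoint in bfloat16 so the tensors the
# ZeroGPU runtime packs to disk are about half the float32 size.
# MODEL_ID is a placeholder; the real app.py uses its own checkpoint name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "org/model-name"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # avoid upcasting the shards to float32
    low_cpu_mem_usage=True,      # load shard by shard instead of all at once
)
```

The Gradio `UserWarning` about the deprecated 'tuples' chatbot format is unrelated to the exit; it only asks for `gr.Chatbot(type="messages")` with openai-style role/content message dicts.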
