https://huggingface.co/FINGU-AI/RomboUltima-32B

#692
by Nurburgring - opened

This was already queued on the 3rd of February, so it must have failed for some reason. I don't yet have a way to check the logs to see why it failed, so I just forcefully queued it again and we'll see.

This model failed because of `ValueError: Can not map tensor 'model.layers.0.mlp.down_proj.weight.absmax'`:

INFO:hf-to-gguf:Exporting model...
INFO:hf-to-gguf:gguf: loading model weight map from 'model.safetensors.index.json'
INFO:hf-to-gguf:gguf: loading model part 'model-00001-of-00005.safetensors'
INFO:hf-to-gguf:token_embd.weight,         torch.float16 --> F16, shape = {5120, 151665}
INFO:hf-to-gguf:blk.0.attn_norm.weight,    torch.float16 --> F32, shape = {5120}
INFO:hf-to-gguf:blk.0.ffn_down.weight,     torch.uint8 --> F16, shape = {1, 70778880}
Traceback (most recent call last):
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 5143, in <module>
    main()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 5137, in main
    model_instance.write()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 439, in write
    self.prepare_tensors()
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 298, in prepare_tensors
    for new_name, data_torch in (self.modify_tensors(data_torch, name, bid)):
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 266, in modify_tensors
    return [(self.map_tensor_name(name), data_torch)]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/llmjob/llama.cpp/convert_hf_to_gguf.py", line 214, in map_tensor_name
    raise ValueError(f"Can not map tensor {name!r}")
ValueError: Can not map tensor 'model.layers.0.mlp.down_proj.weight.absmax'
job finished, status 1
job-done<0 RomboUltima-32B noquant 1>
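The `.absmax` suffix (together with the `torch.uint8` weight in the log above) is the kind of extra scaling tensor that bitsandbytes-quantized checkpoints carry, and `convert_hf_to_gguf.py` has no mapping for it. A minimal sketch of how such markers could be detected up front, using a hypothetical helper and a made-up weight map mimicking this model's `model.safetensors.index.json`:

```python
# Hypothetical pre-check: scan a safetensors weight map for tensor-name
# suffixes that bitsandbytes quantization adds and that the GGUF
# converter cannot map. Marker list is an assumption, not exhaustive.
BNB_MARKERS = (".absmax", ".quant_map", ".quant_state", ".nested_absmax")

def find_bnb_tensors(weight_map: dict) -> list:
    """Return tensor names that look like bitsandbytes quantization state."""
    return [name for name in weight_map
            if any(marker in name for marker in BNB_MARKERS)]

# Example weight map resembling the failing model's index file
index = {
    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
    "model.layers.0.mlp.down_proj.weight.absmax": "model-00001-of-00005.safetensors",
    "model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors",
}
print(find_bnb_tensors(index))
```

Running a check like this on the `weight_map` of a repo's index before queuing would flag quantized checkpoints without downloading any shards.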


The weights appear to have been uploaded in bitsandbytes-quantized form (uint8 data plus `.absmax` scaling tensors), which the converter can't handle, so this is probably unfixable. I nuked the model from the queue.

mradermacher changed discussion status to closed
