Reopen the Space

#43
by laiking - opened

Hello,

Is it possible to reopen this Space, please? I use a GPU cluster that cannot run PyTorch > 2.5.1 (it runs on CentOS 7), and I got the following error when trying to load the GoLLIE model:

ValueError: Due to a serious vulnerability issue in `torch.load`, even with `weights_only=True`, we now require users to upgrade torch to at least v2.6 in order to use the function. This version restriction does not apply when loading files with safetensors.

However, the model author never accepted the safetensors PR, so I need to convert the weights myself to be able to load them.

Safetensors org

yes, we'll reopen asap

Thanks for reporting

Safetensors org

Apologies for this @laiking , the Space has been restarted.

julien-c changed discussion status to closed

Thanks @julien-c @lysandre ,
I did not find an easy way to convert to safetensors locally: the convert.py script mentioned in the documentation still works by opening a PR, and in a lot of repos those PRs are left open and never merged by the authors.
The workaround I found is to duplicate the model and add the safetensors files to the duplicated model.

Is there an easier way to do this? (Like pointing to the PR directly when downloading the model or something like that? I saw the PRs were accessible as branches named pr-[pr-number].)

PS: I apologize if this is not the right place to ask this.

Safetensors org

> like pointing to the PR directly when downloading the model or something like that

@lysandre can confirm, but transformers should automatically use the safetensors weights in the verified conversion branch if it exists (no need for it to be merged).

Otherwise, duplicating the model and adding the safetensors files to the duplicated model is also perfectly fine.

OK, it was my bad: I did not point to the safetensors revision when downloading the model, so I only had .bin files locally.

Now that I've added --revision refs/pr/... to the hf download CLI command, it works. Sorry for bothering you!
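For anyone landing here with the same problem: on the Hub, every pull request is addressable as the git ref refs/pr/<n>, so the converted safetensors weights can be pulled straight from the conversion PR without it being merged. Below is a minimal sketch; the repo id and PR number are placeholders, not the actual GoLLIE PR.

```python
def pr_revision(pr_number: int) -> str:
    """Build the git ref the Hub assigns to a pull request."""
    return f"refs/pr/{pr_number}"


# With huggingface_hub installed, pass the ref as `revision`
# (repo id and PR number below are hypothetical examples):
#
#   from huggingface_hub import snapshot_download
#   snapshot_download("some-user/some-model", revision=pr_revision(1))
#
# or directly with transformers:
#
#   from transformers import AutoModel
#   AutoModel.from_pretrained("some-user/some-model", revision=pr_revision(1))

print(pr_revision(1))  # → refs/pr/1
```

The same ref works on the command line, e.g. `hf download some-user/some-model --revision refs/pr/1`.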
