```python
# Got a bunch of .ckpt files to convert?
# Here's a handy script to take care of all that for you!
# Original .ckpt files are not touched!
# Make sure you have enough disk space! You are going to DOUBLE the size of your models folder!
#
# First, run:
# pip install torch torchsde==0.2.5 safetensors==0.2.5
#
# Place this file in the **SAME DIRECTORY** as all of your .ckpt files, open a command prompt for that folder, and run:
# python convert_to_safe.py

import os

import torch
from safetensors.torch import save_file

files = os.listdir()
for f in files:
    if f.lower().endswith('.ckpt'):
        print(f'Loading {f}...')
        fn = f"{f.replace('.ckpt', '')}.safetensors"
        if fn in files:
            print(f'Skipping, as {fn} already exists.')
            continue
        try:
            with torch.no_grad():
                weights = torch.load(f)["state_dict"]
                fn = f"{f.replace('.ckpt', '')}.safetensors"
                print(f'Saving {fn}...')
                save_file(weights, fn)
        except Exception as ex:
            print(f'ERROR converting {f}: {ex}')
print('Done!')
```
I just tried NovelAI and it seems to work perfectly. I think the earlier failure was just because it was being sent a nested structure instead of the raw weights. In any case, I used https://huggingface.co/spaces/safetensors/convert and it worked perfectly for my use case.
How did you convert NovelAI with that link? It appears that you can only convert models hosted on Hugging Face, and AFAIK NovelAI is not hosted there anywhere.
Hmm, I used the script contained there to convert a private model on HF and then deleted the model. I was just curious about what was failing, but everything seemed to work fine. (I didn't check the resulting file, but everything looked OK.)
Got NAI to convert by referencing @Narsil's colab found here. Just need to add

```python
weights.pop("state_dict")
```

after `weights` is defined. I made a new gist in case this one doesn't get updated.
@RandomLegend @2PeteShakur @MiyatoKyo
@RassilonSleeps You're the hero!
Works like a charm! Now I can swap the model in 4 seconds instead of 21, lol.
Thanks!!
Are the converted output files saved as fp16?
For each file loaded, I get this error:

```
ERROR converting (...): 'state_dict'
```

Removing your non-descriptive exception text, I got this:

```
Traceback (most recent call last):
  File "convert_to_safe.py", line 64, in <module>
    weights.pop("state_dict")
KeyError: 'state_dict'
```
> Got NAI to convert by referencing @Narsil's colab found here. Just need to add `weights.pop("state_dict")` after `weights` is defined. I made a new gist in case this one doesn't get updated. @RandomLegend @2PeteShakur @MiyatoKyo
@cooperdk - look at this comment, there is a better version. That one works for every model I tried.
> Got NAI to convert by referencing @Narsil's colab found here. Just need to add `weights.pop("state_dict")` after `weights` is defined. I made a new gist in case this one doesn't get updated. @RandomLegend @2PeteShakur @MiyatoKyo
>
> @cooperdk - look at this comment, there is a better version. That one works for every model I tried.
It breaks the script for non-NovelAI models, which is expected, as NovelAI was what I wrote about. It is line 35 in your linked script, not 64; I just used a variant that lists out the models and prints more detail. I can paste it here later.

Anyway, I got the error as mentioned because the script tries to execute line 35 (64 in my script) on all models.

It will not convert the non-NovelAI models, because the exception is raised BEFORE the converted model is saved: the try/except around the loop body aborts the current iteration and moves on to the next file.

I mentioned elsewhere that the `weights.pop` in line 35 must be conditioned to apply only to NovelAI models. So instead, something like:

```python
if "novelai" in fn:
    weights.pop("state_dict")
```

(Or check the model for specific data.)

That way it will work on all models.
```python
if 'state_dict' in weights:
    weights.pop("state_dict")
```

Maybe? (So it doesn't rely on the filename.)
Is there any way to do this the other way around?
This is just a guess, but since the difference between the formats is that .ckpt can carry injected Python code, have you tried simply renaming the model to .ckpt?
> `if 'state_dict' in weights: weights.pop("state_dict")` Maybe? (So it doesn't rely on the filename.)
I agree, that would be better. It also supports any model with that dictionary.
> `if 'state_dict' in weights: weights.pop("state_dict")`
Please tell me where to add these lines to the code.
After line 28
Hey, I'm getting this error:

```
ERROR converting xxxx.ckpt: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
```

How do I fix this? I'm running it on a VM with no GPU.

Thank you.
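The message itself names the fix: pass `map_location` to `torch.load` so the GPU-saved storages are mapped to the CPU at load time. A minimal sketch of what the script's load step becomes (the `load_weights_cpu` helper is illustrative):

```python
import torch

def load_weights_cpu(path: str) -> dict:
    # Map all storages to the CPU so a checkpoint saved on a GPU
    # machine can be deserialized where CUDA is unavailable.
    ckpt = torch.load(path, map_location=torch.device('cpu'))
    return ckpt.get("state_dict", ckpt)
```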