question

#2
by LHJ0 - opened

Can this be used as a normal FLUX fp8 model?

It can be used, but Diffusers does not support expanding torch FP8 weights in memory, so the model is loaded as BF16. If you are using it with ComfyUI or similar tools, I think it is better to use the standalone fp8 safetensors file.
https://huggingface.co/Kijai/flux-fp8
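For reference, Diffusers can also load a standalone fp8 safetensors file directly. A minimal sketch, assuming the file name below (check the repo for the actual one); the FP8 weights are upcast to BF16 in memory:

import torch
from diffusers import FluxPipeline, FluxTransformer2DModel

# Assumed single-file checkpoint name inside Kijai/flux-fp8; verify in the repo.
transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-dev-fp8.safetensors",
    torch_dtype=torch.bfloat16,  # FP8 weights are upcast to BF16 when loaded
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",  # supplies the text encoders, VAE, etc.
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")

This saves download size for the transformer, but memory use at runtime is essentially the same as BF16.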

But I met this problem: TypeError: expected str, bytes or os.PathLike object, not NoneType

My code:
pipe = FluxPipeline.from_pretrained(os.path.join(self.model_root, self.config["flux_fp8_repo"]), torch_dtype=torch.bfloat16, low_cpu_mem_usage=False).to(self.device)

os.path.join(self.model_root, self.config["flux_fp8_repo"])

From the error message, it seems that the error occurred at a stage before the program tried to read this repo.
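As a quick sanity check (the local path below is hypothetical, adjust it to your layout), you could list what each subfolder of that local pipeline directory actually contains before calling from_pretrained:

import os

# Hypothetical: the folder os.path.join(self.model_root, self.config["flux_fp8_repo"]) points to
repo_dir = "/path/to/flux1-dev-fp8-flux"
for name in ("transformer", "text_encoder", "text_encoder_2",
             "tokenizer", "tokenizer_2", "vae", "scheduler"):
    sub = os.path.join(repo_dir, name)
    print(name, sorted(os.listdir(sub)) if os.path.isdir(sub) else "MISSING")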

Loading pipeline components...: 43%|████████████ | 3/7 [00:01<00:01, 2.53it/s]
Some weights of the model checkpoint at /home/iv/Algo_new/LouHaijie/IVAlgoHTTP/IVAlgoHTTP/weights/FluxTxt2Img/diffusers_file/flux1-dev-fp8-flux/text_encoder were not used when initializing CLIPTextModel: ['text_projection.weight']

  • This IS expected if you are initializing CLIPTextModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
  • This IS NOT expected if you are initializing CLIPTextModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Loading pipeline components...: 57%|████████████████ | 4/7 [01:10<00:53, 17.71s/it]
Process Process-3:
Traceback (most recent call last):
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/diffusers/models/model_loading_utils.py", line 143, in load_state_dict
    file_extension = os.path.basename(checkpoint_file).split(".")[-1]
  File "/home/iv/anaconda3/lib/python3.10/posixpath.py", line 142, in basename
    p = os.fspath(p)
TypeError: expected str, bytes or os.PathLike object, not NoneType

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/iv/anaconda3/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/iv/anaconda3/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/IVAlgoHTTP/app.py", line 112, in process_data
    IVmodelHandle.loadModel()
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/IVAlgoHTTP/algos/BaseModel.py", line 28, in loadModel
    self.model = GlobalConfig.get_model()
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/IVAlgoHTTP/algos/GlobalConfig.py", line 76, in get_model
    modelHandler = model()
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/IVAlgoHTTP/algos/FluxTxt2Img/Flux_txt2img.py", line 151, in __init__
    self.pipe = FluxPipeline.from_pretrained(os.path.join(self.model_root, self.config["flux_fp8_repo"]), torch_dtype=torch.bfloat16, low_cpu_mem_usage=False, device_map=None).to(self.device)
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/diffusers/pipelines/pipeline_utils.py", line 924, in from_pretrained
    loaded_sub_model = load_sub_model(
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/diffusers/pipelines/pipeline_loading_utils.py", line 725, in load_sub_model
    loaded_sub_model = load_method(os.path.join(cached_folder, name), **loading_kwargs)
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/diffusers/models/modeling_utils.py", line 986, in from_pretrained
    state_dict = load_state_dict(model_file, variant=variant)
  File "/home/iv/Algo_new/LouHaijie/IVAlgoHTTP/66v/lib/python3.10/site-packages/diffusers/models/model_loading_utils.py", line 157, in load_state_dict
    with open(checkpoint_file) as f:
TypeError: expected str, bytes or os.PathLike object, not NoneType

These are the full messages.

Errors themselves are not strange, they are quite common, but the content of this error was strange, so I searched for it. It is possible that the error is caused by low_cpu_mem_usage.
https://github.com/huggingface/diffusers/issues/9343
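If you want to test that, a minimal variant would be to simply drop the override so the library default is used (the local path is hypothetical again):

import torch
from diffusers import FluxPipeline

repo_dir = "/path/to/flux1-dev-fp8-flux"  # hypothetical local pipeline folder
# low_cpu_mem_usage is left at its default instead of being forced to False
pipe = FluxPipeline.from_pretrained(repo_dir, torch_dtype=torch.bfloat16)
pipe.to("cuda")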

OK, I will check it. However, can I run it without low_cpu_mem_usage?

When I run the code without low_cpu_mem_usage=False, I get:

Please make sure to pass low_cpu_mem_usage=False and device_map=None if you want to randomly initialize those weights or else make sure your checkpoint file is correct.

That's quite a tricky problem...
It seems that there have been cases in the past where this problem occurred depending on the version of the accelerate library, but that was a long time ago, so it's probably not relevant.
I want to isolate the problem. Will it work with the following BF16 repo?
There shouldn't be any difference other than whether the precision of the safetensors file is FP8 or BF16.
camenduru/FLUX.1-dev-diffusers
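Something like this, i.e. the same loading code pointed at the BF16 repo (just a sketch; it will download the full-precision weights):

import torch
from diffusers import FluxPipeline

# Isolation test: identical loading code, but against the BF16 Diffusers repo
pipe = FluxPipeline.from_pretrained(
    "camenduru/FLUX.1-dev-diffusers",
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")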
