Part of the console log while starting ComfyUI:
...custom_nodes/comfyui-rmbg/SDMatte/utils/utils.py: Requires Flash-Attention version >=2.7.1,<=2.8.2 but got 2.8.3
...custom_nodes/comfyui-rmbg/SDMatte/modeling/SDMatte/meta_arch.py: Failed to import diffusers.models.autoencoders.autoencoder_kl because of the following error (look up to see its traceback):
Requires Flash-Attention version >=2.7.1,<=2.8.2 but got 2.8.3.
Is it possible to add support for Flash-Attention 2.8.3 in SDMatte? Thanks.
Or maybe someone knows how to install one of the earlier Flash-Attention versions? I couldn't find wheels for Python 3.11, torch 2.8.0, CUDA 12.8 on Linux.
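
One workaround I'm considering but haven't tested yet: since no prebuilt wheel seems to match this combination, pip can build Flash-Attention from source, provided the CUDA toolkit (nvcc) and a C++ compiler are installed. Pinning to 2.8.2, the newest version the check accepts, would look like:

pip install flash-attn==2.8.2 --no-build-isolation

The source build can take a long time and use a lot of RAM; the flash-attn README suggests limiting parallel compile jobs with MAX_JOBS (e.g. MAX_JOBS=4) on machines with limited memory.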