
Some nodes still need Flash-Attention version < 2.8.3 and don't work #190

@marchcat69

Description


Part of the console log while starting ComfyUI:
...custom_nodes/comfyui-rmbg/SDMatte/utils/utils.py: Requires Flash-Attention version >=2.7.1,<=2.8.2 but got 2.8.3
...custom_nodes/comfyui-rmbg/SDMatte/modeling/SDMatte/meta_arch.py: Failed to import diffusers.models.autoencoders.autoencoder_kl because of the following error (look up to see its traceback):
Requires Flash-Attention version >=2.7.1,<=2.8.2 but got 2.8.3.

Is it possible to add support for Flash-Attention 2.8.3 to SDMatte? Thanks.

Or maybe someone knows how to install one of the earlier Flash-Attention versions? I couldn't find prebuilt wheels for Python 3.11, torch 2.8.0, and CUDA 12.8 on Linux.
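For context, the error message looks like it comes from a version-range gate applied at import time. The snippet below is a minimal hypothetical re-creation of that kind of check, not the library's actual code; the function name `check_flash_attn` and the exact logic are assumptions, only the version bounds and message text come from the log above:

```python
from importlib.metadata import version, PackageNotFoundError
from packaging.version import Version

def check_flash_attn(min_ok: str = "2.7.1", max_ok: str = "2.8.2") -> None:
    """Raise ImportError if the installed flash-attn falls outside [min_ok, max_ok]."""
    try:
        # Read the installed distribution's version without importing flash_attn
        # (importing it would load its CUDA extension).
        installed = Version(version("flash-attn"))
    except PackageNotFoundError:
        raise ImportError("flash-attn is not installed")
    if not (Version(min_ok) <= installed <= Version(max_ok)):
        # Mirrors the message seen in the ComfyUI console log above.
        raise ImportError(
            f"Requires Flash-Attention version >={min_ok},<={max_ok} "
            f"but got {installed}"
        )

check_flash_attn()  # raises with flash-attn 2.8.3 installed
```

Absent a matching prebuilt wheel, one possible workaround (untested here) is building an older release from source, e.g. `pip install flash-attn==2.8.2 --no-build-isolation`, which requires a local CUDA toolchain and can take a long time to compile.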
