Fix: Graceful fallback for torch.jit.script on AMD/ROCm#192

Open
0xDELUXA wants to merge 1 commit into 1038lab:main from 0xDELUXA:fix/amd-rocm-jit-script-fallback
Conversation

0xDELUXA commented Mar 13, 2026

The @torch.jit.script decorator fails on the AMD/ROCm backend due to incomplete TorchScript support. This change replaces the decorators on fast_diag_generalized_box_iou and fast_diag_box_iou with a try/except block that attempts JIT compilation at import time and falls back silently to eager mode if compilation fails.

Behavior on CUDA and CPU is unchanged.
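The fallback pattern described above can be sketched as follows. The function body here is a placeholder diagonal IoU (the actual implementations live in the repository); only the try/except wrapping is the point of the example:

```python
import torch

def fast_diag_box_iou(boxes1: torch.Tensor, boxes2: torch.Tensor) -> torch.Tensor:
    # Placeholder body: element-wise (diagonal) IoU for boxes in xyxy format.
    area1 = (boxes1[:, 2] - boxes1[:, 0]) * (boxes1[:, 3] - boxes1[:, 1])
    area2 = (boxes2[:, 2] - boxes2[:, 0]) * (boxes2[:, 3] - boxes2[:, 1])
    lt = torch.max(boxes1[:, :2], boxes2[:, :2])   # top-left of intersection
    rb = torch.min(boxes1[:, 2:], boxes2[:, 2:])   # bottom-right of intersection
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]
    union = area1 + area2 - inter
    return inter / union

# Instead of decorating with @torch.jit.script, attempt compilation at
# import time; on backends with incomplete TorchScript support (e.g.
# AMD/ROCm) this raises, and we silently keep the eager-mode function.
try:
    fast_diag_box_iou = torch.jit.script(fast_diag_box_iou)
except Exception:
    pass  # eager fallback: the plain Python function stays in place
```

On CUDA and CPU the try branch succeeds, so the scripted function is used exactly as before; only backends where scripting raises end up on the eager path.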

Without this change, AMD/ROCm users encounter the following error:

Error loading C:\ComfyUI\custom_nodes\ComfyUI-RMBG\py\AILab_SAM3Segment.py: module 'torch.distributed' has no attribute 'rpc'

The error message is misleading: torch.distributed.rpc is in fact available in the affected environment:

(venv) PS C:\ComfyUI> python -c "import torch; print(hasattr(torch.distributed, 'rpc'))"
True
