
Fix model loading crash with transformers 5.0.0#194

Open
LostSunset wants to merge 1 commit into 1038lab:main from LostSunset:fix/transformers-5-module-file-attr

Conversation

@LostSunset

Summary

  • Set __file__ attribute on dynamically created module in RMBGModel.load_model() to fix AttributeError with transformers 5.0.0
  • Without this fix, transformers.modeling_utils._can_set_experts_implementation() crashes when accessing sys.modules[cls.__module__].__file__ on a module created via types.ModuleType(), which lacks a __file__ attribute
  • The fallback from_pretrained() path also fails due to meta tensor .item() calls during model initialization

Reproduction

With transformers==5.0.0 and torch==2.10.0, loading any RMBG-2.0 model fails with:

AttributeError: module 'custom_birefnet_model_...' has no attribute '__file__'

Fix

One-line change: add module.__file__ = birefnet_path after types.ModuleType() creation (line 190).
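A minimal sketch of the change, assuming hypothetical values for birefnet_path and the module name (the actual variable names and paths in RMBGModel.load_model() may differ):

```python
import sys
import types

# Hypothetical stand-ins for the values used in RMBGModel.load_model()
birefnet_path = "/models/RMBG-2.0/birefnet.py"
module_name = "custom_birefnet_model_example"

module = types.ModuleType(module_name)
# Fix: give the dynamically created module a __file__ attribute so that
# transformers 5.0.0, which reads sys.modules[cls.__module__].__file__,
# does not raise AttributeError on it.
module.__file__ = birefnet_path
sys.modules[module_name] = module

assert sys.modules[module_name].__file__ == birefnet_path
```

Without the assignment, `getattr(module, "__file__")` raises AttributeError, which is the crash seen in the traceback above.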

Test plan

  • Verified RMBG-2.0 model loads successfully with transformers 5.0.0
  • No impact on other model types (INSPYRENET, BEN, BEN2)

🤖 Generated with Claude Code

Set __file__ attribute on dynamically created module to prevent
AttributeError in transformers' _can_set_experts_implementation().

Without this, transformers 5.0.0 fails at
sys.modules[cls.__module__].__file__ because the module created via
types.ModuleType() lacks __file__. The fallback path also fails due
to meta tensor .item() calls during from_pretrained() initialization.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
