
[wip] migrate to transformers v5 #12976

Draft

sayakpaul wants to merge 59 commits into main from transformers-v5-pr

Conversation


@sayakpaul sayakpaul commented Jan 14, 2026

What does this PR do?

This PR assesses whether we can move our CI back to transformers main. It will also help us migrate to transformers v5 successfully.

I have left a number of inline comments to make reviewers aware of internal discussions.

@sayakpaul sayakpaul requested a review from DN6 January 14, 2026 09:23
- "tests/pipelines/test_pipelines_common.py"
- "tests/models/test_modeling_common.py"
- "examples/**/*.py"
- ".github/**.yml"
@sayakpaul (Member, Author):

Temporary. For this PR.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@sayakpaul sayakpaul marked this pull request as draft January 15, 2026 12:05
@sayakpaul sayakpaul changed the title from "switch to transformers main again." to "[main] switch to transformers main again." on Jan 15, 2026
@sayakpaul sayakpaul changed the title from "[main] switch to transformers main again." to "[wip] switch to transformers main again." on Jan 15, 2026
logger.addHandler(stream_handler)


@unittest.skipIf(is_transformers_version(">=", "4.57.5"), "Size mismatch")

torch.nn.ConvTranspose2d,
torch.nn.ConvTranspose3d,
torch.nn.Linear,
torch.nn.Embedding,
@sayakpaul (Member, Author):

Happening because of the way weight loading is done in v5.
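For reviewers skimming this: a minimal sketch of the kind of allowlist-driven check such a tuple typically feeds, assuming the surrounding test walks the model and treats these layer types as weight-bearing. The helper name and the meta-device check below are illustrative assumptions, not the actual code touched by this PR.

```python
import torch

# Assumed allowlist of weight-bearing layer types; transformers v5 materializes
# weights differently during loading, so extra layer types may need to be listed.
WEIGHT_BEARING_MODULES = (
    torch.nn.Conv2d,
    torch.nn.ConvTranspose2d,
    torch.nn.ConvTranspose3d,
    torch.nn.Linear,
    torch.nn.Embedding,
)


def modules_left_on_meta(model: torch.nn.Module) -> list[str]:
    """Names of weight-bearing submodules whose own parameters were never materialized."""
    return [
        name
        for name, module in model.named_modules()
        if isinstance(module, WEIGHT_BEARING_MODULES)
        and any(p.is_meta for p in module.parameters(recurse=False))
    ]
```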

Comment on lines +23 to +25
model = AutoModel.from_pretrained(
"hf-internal-testing/tiny-stable-diffusion-torch", subfolder="text_encoder", use_safetensors=False
)

Comment on lines +281 to +283
input_ids = (
input_ids["input_ids"] if not isinstance(input_ids, list) and "input_ids" in input_ids else input_ids
)
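A hedged illustration of the two input shapes this guard has to accept, reusing the tiny checkpoint from the test above (the exact call site may differ): a tokenizer call returns a dict-like BatchEncoding carrying "input_ids", whereas some callers already pass a bare list of token ids, which the isinstance(list) check lets through unchanged.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "hf-internal-testing/tiny-stable-diffusion-torch", subfolder="tokenizer"
)

# Dict-like BatchEncoding: the guard extracts encoded["input_ids"].
encoded = tokenizer(["dance monkey"], return_tensors="pt")
print(type(encoded).__name__)       # BatchEncoding
print(encoded["input_ids"].shape)

# Bare list of ids: the guard short-circuits on isinstance(input_ids, list)
# and passes it through as-is.
raw_ids = encoded["input_ids"].tolist()
print(isinstance(raw_ids, list))    # True
```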

  inputs = {
      "prompt": "dance monkey",
-     "negative_prompt": "",
+     "negative_prompt": "bad",
@sayakpaul (Member, Author):

Otherwise, the corresponding tokenizer outputs:

negative_prompt=[' ']
prompt=[' ']
text_input_ids=tensor([], size=(1, 0), dtype=torch.int64)

which leads to:

E       RuntimeError: cannot reshape tensor of 0 elements into shape [1, 0, -1, 8] because the unspecified dimension size -1 can be any value and is ambiguous
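To make the failure mode concrete, here is a minimal sketch under the assumption that the tokenizer reduces an empty or whitespace-only prompt to zero tokens; the checkpoint is reused from this PR's tests purely for illustration, and the reshape dimensions simply mirror the error above.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "hf-internal-testing/tiny-stable-diffusion-torch", subfolder="tokenizer"
)

# Assumption: with special tokens disabled, an empty prompt tokenizes to zero
# tokens, i.e. an input_ids tensor of shape (1, 0).
empty = tokenizer([""], add_special_tokens=False, return_tensors="pt")
print(empty["input_ids"].shape)  # torch.Size([1, 0])

# Reshaping a zero-element tensor with a -1 dimension is ambiguous, which is
# exactly the RuntimeError quoted above.
try:
    empty["input_ids"].reshape(1, 0, -1, 8)
except RuntimeError as err:
    print(err)
```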


sayakpaul commented Jan 20, 2026

Hmm, https://github.com/huggingface/diffusers/actions/runs/21354964855/job/61460242386?pr=12976 fails on this PR but passes on https://github.com/huggingface/diffusers/actions/runs/21344761402/job/61450173169?pr=12996, so I am not sure what is going on at this point, TBH. The other failures appear to be pre-existing and well known.

for component_name in model_components_pipe:
pipe_component = model_components_pipe[component_name]
pipe_loaded_component = model_components_pipe_loaded[component_name]
for p1, p2 in zip(pipe_component.parameters(), pipe_loaded_component.parameters()):
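As a side note, a hedged sketch of an equivalent comparison keyed on parameter names rather than on parameters() ordering, so that a mismatch reports which tensor differs; this is illustrative only, not a suggested change to the test.

```python
import torch


def assert_parameters_match(
    pipe_component: torch.nn.Module,
    loaded_component: torch.nn.Module,
    atol: float = 1e-5,
):
    """Compare two modules' parameters by name and report the first mismatch."""
    params = dict(pipe_component.named_parameters())
    loaded = dict(loaded_component.named_parameters())
    assert params.keys() == loaded.keys(), "parameter names differ between the two components"
    for name, p1 in params.items():
        p2 = loaded[name]
        assert torch.allclose(p1, p2, atol=atol), f"parameter mismatch at {name}"
```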

@sayakpaul sayakpaul changed the title from "[wip] switch to transformers main again." to "[wip] move transformers v5" on Feb 14, 2026
@sayakpaul sayakpaul changed the title from "[wip] move transformers v5" to "[wip] migrate to transformers v5" on Feb 19, 2026