
Conversation


@crcrpar crcrpar commented Sep 25, 2025

What does this PR do?

As per title. We might want to upcast packed fp4 tensors to 16/32-bit to get the min and max, but for now this just avoids errors from the thunderfx `_splitter`.
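A minimal sketch of the kind of guard described above: skip the packed fp4 dtype when computing a tensor's min and max, since packed `torch.float4_e2m1fn_x2` tensors cannot be reduced directly. The function and variable names here are illustrative, not the PR's actual code; `getattr` is used so the sketch also runs on PyTorch builds that predate the dtype.

```python
import torch

# The packed fp4 dtype, if this PyTorch build defines it (assumption:
# older builds lack the attribute, so we fall back to None).
_FP4_X2 = getattr(torch, "float4_e2m1fn_x2", None)

def get_min_and_max(t: torch.Tensor):
    """Return (min, max) of ``t``, or (None, None) for packed fp4."""
    if _FP4_X2 is not None and t.dtype == _FP4_X2:
        # Packed fp4 has no reduction support; a future version might
        # upcast to 16/32-bit first. For now, just skip it.
        return None, None
    return t.min().item(), t.max().item()
```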

@crcrpar crcrpar added the thunderfx for things that could be applicable to the dynamo+thunder frontend label Sep 25, 2025

t-vi commented Sep 25, 2025

CI says we need to make the addition conditional on such a dtype existing in PyTorch (that would be my favourite) or bump the PyTorch requirement.
https://github.com/Lightning-AI/lightning-thunder/actions/runs/18001272157/job/51210920505?pr=2533
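A sketch of the first option t-vi suggests: only register the dtype if the installed PyTorch build defines it, so no version bump is needed. The set name `SKIPPED_DTYPES` is illustrative, not the repository's actual identifier.

```python
import torch

# Build the dtype set conditionally: torch.float4_e2m1fn_x2 only exists
# in sufficiently new PyTorch builds, so guard with hasattr instead of
# raising AttributeError on older versions.
SKIPPED_DTYPES = set()
if hasattr(torch, "float4_e2m1fn_x2"):
    SKIPPED_DTYPES.add(torch.float4_e2m1fn_x2)
```

This keeps the code importable on every supported PyTorch version, at the cost of the guard being silently a no-op on old builds.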

@crcrpar crcrpar mentioned this pull request Sep 25, 2025
Signed-off-by: Masaki Kozuki <[email protected]>
@crcrpar crcrpar changed the title avoid `torch.float4_e2m1fn_x2 in _get_min_and_val` avoid torch.float4_e2m1fn_x2 in _get_min_and_val Sep 29, 2025