Conversation

@Aidyn-A (Collaborator) commented Oct 15, 2025

We need to gate this behind a torch.distributed.is_available() check, because otherwise apex fails on import:

$ python -c "import apex"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/local/lib/python3.12/dist-packages/apex/__init__.py", line 16, in <module>
    from . import transformer
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/__init__.py", line 4, in <module>
    from apex.transformer import pipeline_parallel
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/pipeline_parallel/__init__.py", line 1, in <module>
    from apex.transformer.pipeline_parallel.schedules import get_forward_backward_func
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/pipeline_parallel/schedules/__init__.py", line 3, in <module>
    from apex.transformer.pipeline_parallel.schedules.fwd_bwd_no_pipelining import (
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/pipeline_parallel/schedules/fwd_bwd_no_pipelining.py", line 10, in <module>
    from apex.transformer.pipeline_parallel.schedules.common import Batch
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/pipeline_parallel/schedules/common.py", line 9, in <module>
    from apex.transformer.pipeline_parallel.p2p_communication import FutureTensor
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/pipeline_parallel/p2p_communication.py", line 25, in <module>
    from apex.transformer.utils import split_tensor_into_1d_equal_chunks
  File "/usr/local/lib/python3.12/dist-packages/apex/transformer/utils.py", line 11, in <module>
    torch.distributed.all_gather_into_tensor = torch.distributed._all_gather_base
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'torch.distributed' has no attribute '_all_gather_base'

cc @crcrpar
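
A minimal sketch of the kind of guard described above. The `dist` object here is a hypothetical stand-in for `torch.distributed` (using `types.SimpleNamespace`) so the gating logic can be shown without depending on a particular PyTorch build; in apex the real check would run against `torch.distributed` itself in `apex/transformer/utils.py`:

```python
import types

# Hypothetical stand-in for torch.distributed: on builds without
# distributed support, is_available() returns False and private helpers
# like _all_gather_base are simply absent from the module.
dist = types.SimpleNamespace(is_available=lambda: False)

# Guard the monkey-patch: only install the alias when distributed
# support is compiled in and the private helper actually exists on
# this torch version.
if dist.is_available() and hasattr(dist, "_all_gather_base"):
    dist.all_gather_into_tensor = dist._all_gather_base

# On a no-distributed build the alias is never installed, so importing
# the package no longer raises AttributeError.
print(hasattr(dist, "all_gather_into_tensor"))  # False
```

With this gate in place, the `AttributeError: module 'torch.distributed' has no attribute '_all_gather_base'` shown in the traceback cannot be triggered at import time.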

@crcrpar crcrpar merged commit 184ea24 into NVIDIA:master Oct 27, 2025