apex: AttributeError: module 'torch.distributed' has no attribute '_all_gather_base'

    from apex.transformer.utils import split_tensor_into_1d_equal_chunks
  File "/home/ailab/anaconda3/envs/yy_FAFS/lib/python3.8/site-packages/apex/transformer/utils.py", line 11, in <module>
    torch.distributed.all_gather_into_tensor = torch.distributed._all_gather_base
AttributeError: module 'torch.distributed' has no attribute '_all_gather_base'

My versions are:

python                    3.8.13 
torch                     1.7.1+cu110              pypi_0    pypi
torchaudio                0.7.2                    pypi_0    pypi
torchvision               0.8.2+cu110              pypi_0    pypi
tqdm                      4.64.1  
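
As a quick sanity check (a minimal sketch, not from the original report), you can confirm whether your torch build exposes the private op that apex/transformer/utils.py aliases at import time:

    # Check for the private collective that apex aliases on import.
    import torch
    import torch.distributed as dist

    print(torch.__version__)                  # e.g. 1.7.1+cu110
    print(hasattr(dist, "_all_gather_base"))  # False here, matching the AttributeError above

On torch 1.7.x this prints False, which is exactly why the import in the traceback fails.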

About this issue

  • State: open
  • Created 2 years ago
  • Reactions: 2
  • Comments: 15

Most upvoted comments

That PyTorch version is a bit too old for the current master branch of this repo. Some of the older branches, e.g. 22.04-dev, could work for your environment.
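
For completeness, here is a hedged sketch of the line that fails (apex/transformer/utils.py, line 11 in the traceback). Guarding the alias lets the module import on older torch, but it does not add the missing collective, so anything that later calls all_gather_into_tensor will still break; switching to an older apex branch or upgrading PyTorch is the real fix.

    # Hypothetical local patch (sketch only; assumes you edit your installed copy of
    # apex/transformer/utils.py): alias the private op only if this torch build has it.
    import torch

    if hasattr(torch.distributed, "_all_gather_base"):
        torch.distributed.all_gather_into_tensor = torch.distributed._all_gather_base
    # Older torch (e.g. 1.7/1.8) has neither name, so code that actually calls
    # torch.distributed.all_gather_into_tensor will still fail at runtime.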

Hi, I have the same problem. My versions are:

torch                     1.8.0                    pypi_0    pypi
torchaudio                0.8.0                    pypi_0    pypi
torchvision               0.9.0                    pypi_0    pypi
tqdm                      4.64.1                   pypi_0    pypi

and the CUDA version is 10.2.

I used one of the older branches, e.g. 22.04-dev, and it works in my environment. It looks OK, but I haven't tested the other code paths. You can give it a try.