torch.distributed with single process #773

@linshokaku

Description

if world_size > 1 and not torch.distributed.is_initialized():  # type: ignore
    torch.distributed.init_process_group(  # type: ignore
        backend, init_method=init_method, world_size=world_size, rank=rank
    )
    torch.distributed.barrier()  # type: ignore

I think torch.distributed.init_process_group() could be executed unconditionally here, since it raises no error even when world_size == 1.

By allowing initialization even in the world_size == 1 case, it becomes possible to test code paths that assume torch.distributed.is_initialized() is True, without having to launch the job via MPI.
