mindspore.communication.create_group
- mindspore.communication.create_group(group, rank_ids, options=None)[source]
- Create a user collective communication group.
- Note
  - This method is not supported in the GPU and CPU versions of MindSpore.
  - The size of rank_ids should be larger than 1, and rank_ids should not contain duplicate data.
  - This method should be used after init().
  - In PyNative mode, only the global single communication group is supported unless the job is started with mpirun.
- Parameters
- group (str) – The name of the communication group to be created. 
- rank_ids (list) – A list of device IDs. 
- options (GroupOptions, optional) – Additional communication group configuration parameters. The backend will automatically select supported parameters and apply them during group initialization, e.g. for the HCCL backend, hccl_config can be specified so that group initialization configurations can be applied. Default is None.
  GroupOptions is defined as a class that can be instantiated as a Python object:
  GroupOptions {
      hccl_config (dict)
  }
  hccl_config currently only supports the keys "hccl_buffer_size" and "hccl_comm":
  - hccl_buffer_size (uint32): specifies the size of the HCCL communication buffer.
  - hccl_comm (int64): specifies an existing HcclComm pointer. If "hccl_comm" is set, "hccl_buffer_size" will be ignored.
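The key-handling rules above (only two supported keys, and "hccl_comm" taking precedence over "hccl_buffer_size") can be sketched with a plain-Python stand-in. This is illustrative only: the real GroupOptions class is provided by mindspore._c_expression, and the validation logic here is an assumption based on the documented key descriptions, not MindSpore's actual implementation.

```python
# Illustrative stand-in for GroupOptions (the real class comes from
# mindspore._c_expression). The validation below is an assumption based
# on the documented keys, not MindSpore's actual behavior.
class SketchGroupOptions:
    _ALLOWED_KEYS = {"hccl_buffer_size", "hccl_comm"}

    def __init__(self):
        # hccl_config is documented as a dict attribute.
        self.hccl_config = {}

    def effective_config(self):
        # Reject keys other than the two documented ones.
        for key in self.hccl_config:
            if key not in self._ALLOWED_KEYS:
                raise ValueError(f"unsupported hccl_config key: {key}")
        # If "hccl_comm" is set, "hccl_buffer_size" is ignored.
        effective = dict(self.hccl_config)
        if "hccl_comm" in effective:
            effective.pop("hccl_buffer_size", None)
        return effective


opts = SketchGroupOptions()
opts.hccl_config = {"hccl_buffer_size": 400, "hccl_comm": 0x1234}
print(opts.effective_config())  # hccl_buffer_size is dropped
```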
 
 
- Raises
- TypeError – If group is not a string or rank_ids is not a list. 
- ValueError – If rank_ids size is not larger than 1, or rank_ids has duplicate data, or backend is invalid. 
- RuntimeError – If HCCL is not available or MindSpore is GPU/CPU version. 
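The TypeError and ValueError conditions listed above can be expressed as a small pre-flight check. The helper below is hypothetical, written only to mirror the documented error conditions; it is not part of the MindSpore API.

```python
# Hypothetical pre-flight check mirroring the documented TypeError /
# ValueError conditions of create_group. Not part of the MindSpore API.
def check_create_group_args(group, rank_ids):
    # TypeError: group is not a string or rank_ids is not a list.
    if not isinstance(group, str):
        raise TypeError("group must be a string")
    if not isinstance(rank_ids, list):
        raise TypeError("rank_ids must be a list")
    # ValueError: rank_ids size is not larger than 1.
    if len(rank_ids) <= 1:
        raise ValueError("rank_ids size must be larger than 1")
    # ValueError: rank_ids has duplicate data.
    if len(set(rank_ids)) != len(rank_ids):
        raise ValueError("rank_ids must not contain duplicate data")


check_create_group_args("0-7", [0, 7])  # valid arguments pass silently
```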
 
 - Supported Platforms:
- Ascend
- Examples
- Note
  Before running the following examples, you need to configure the communication environment variables.
  For Ascend/GPU/CPU devices, it is recommended to use the msrun startup method without any third-party or configuration file dependencies. Please see the msrun startup for more details.

>>> import mindspore as ms
>>> from mindspore import set_context, ops
>>> from mindspore._c_expression import GroupOptions
>>> from mindspore.communication import init, create_group, get_rank
>>> set_context(mode=ms.GRAPH_MODE)
>>> ms.set_device(device_target="Ascend")
>>> init()
>>> group = "0-7"
>>> rank_ids = [0,7]
>>> options = GroupOptions()
>>> options.hccl_config = {"hccl_buffer_size": 400}
>>> if get_rank() in rank_ids:
...     create_group(group, rank_ids, options)
...     allreduce = ops.AllReduce(group)