Invention Grant
- Patent Title: Communication optimizations for distributed machine learning
- Application No.: US15859180
- Application Date: 2017-12-29
- Publication No.: US11270201B2
- Publication Date: 2022-03-08
- Inventor: Srinivas Sridharan, Karthikeyan Vaidyanathan, Dipankar Das, Chandrasekaran Sakthivel, Mikhail E. Smorkalov
- Applicant: Intel Corporation
- Applicant Address: Santa Clara, CA, US
- Assignee: Intel Corporation
- Current Assignee: Intel Corporation
- Current Assignee Address: Santa Clara, CA, US
- Agency: Jaffery Watson Mendonsa & Hamilton LLP
- Main IPC: G06N3/08
- IPC: G06N3/08 ; G06F9/50 ; G06N3/04 ; G06N3/063 ; G06N7/00

Abstract:
Embodiments described herein provide a system to configure distributed training of a neural network, the system comprising memory to store a library to facilitate data transmission during distributed training of the neural network; a network interface to enable transmission and receipt of configuration data associated with a set of worker nodes, the worker nodes configured to perform distributed training of the neural network; and a processor to execute instructions provided by the library, the instructions to cause the processor to create one or more groups of the worker nodes, the one or more groups of worker nodes to be created based on a communication pattern for messages to be transmitted between the worker nodes during distributed training of the neural network.
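To illustrate the idea of creating worker-node groups based on a communication pattern, below is a minimal sketch, not the patented implementation, using PyTorch's `torch.distributed` API. The two-way split of ranks and the group roles are illustrative assumptions; the patent does not prescribe this library or partitioning.

```python
# Minimal sketch (assumption: PyTorch torch.distributed, launched with RANK /
# WORLD_SIZE set by the job launcher). Groups worker nodes so that messages
# sharing a communication pattern are exchanged only within their group.
import torch
import torch.distributed as dist


def init_worker_groups():
    # Every worker joins the global job first.
    dist.init_process_group(backend="gloo")
    world_size = dist.get_world_size()

    # Hypothetical split: the first half of the ranks form one group and the
    # second half another, so collectives stay within each group.
    half = world_size // 2
    group_a = dist.new_group(ranks=list(range(half)))
    group_b = dist.new_group(ranks=list(range(half, world_size)))
    return group_a, group_b


def reduce_gradients(grad: torch.Tensor, group) -> torch.Tensor:
    # All-reduce a gradient tensor only among the workers in `group`,
    # avoiding traffic to workers that do not need this message.
    dist.all_reduce(grad, op=dist.ReduceOp.SUM, group=group)
    grad /= dist.get_world_size(group=group)
    return grad
```

In this sketch, each worker calls `reduce_gradients` with the group it belongs to, so gradient averaging during distributed training generates communication only among the workers that share that pattern.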
Public/Granted literature
- US20190205745A1: COMMUNICATION OPTIMIZATIONS FOR DISTRIBUTED MACHINE LEARNING, published 2019-07-04