Invention Grant
- Patent Title: Distributed deep learning system using a communication network for stochastic gradient descent calculations
- Application No.: US16967702
- Application Date: 2019-02-06
- Publication No.: US12008468B2
- Publication Date: 2024-06-11
- Inventor: Junichi Kato, Kenji Kawai, Huycu Ngo, Yuki Arikawa, Tsuyoshi Ito, Takeshi Sakamoto
- Applicant: Nippon Telegraph and Telephone Corporation
- Applicant Address: JP Tokyo
- Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
- Current Assignee: NIPPON TELEGRAPH AND TELEPHONE CORPORATION
- Current Assignee Address: JP Tokyo
- Agency: SLATER MATSIL, LLP
- Priority: JP 18025940 2018-02-16
- International Application: PCT/JP2019/004213 2019-02-06
- International Announcement: WO2019/159783A 2019-08-22
- National phase entry date: 2020-08-05
- Main IPC: G06N3/08
- IPC: G06N3/08 ; G06N3/04 ; G06N3/063

Abstract:
Each learning node calculates the gradients of a loss function from the output obtained by inputting learning data to the neural network being trained, stores the calculation result in a packet, and transmits the packet to a computing interconnect device. The computing interconnect device receives the packets transmitted from the learning nodes, acquires the gradient values stored in the packets, calculates the sum of the gradients, stores the result in a packet, and transmits that packet to each of the learning nodes. Each learning node receives the packet transmitted from the computing interconnect device and updates the constituent parameters of the neural network based on the value stored in the packet.
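The abstract describes a synchronous gradient-aggregation loop: each node packetizes its locally computed gradients, the interconnect device sums the packets and broadcasts the total, and every node applies the same update. The sketch below illustrates that flow in Python under simplified assumptions; the class and helper names (LearningNode, ComputingInterconnect, pack, unpack) and the toy linear-regression loss are illustrative choices, not APIs or details taken from the patent.

```python
# Hypothetical sketch of the gradient-sum aggregation flow from the abstract.
# Names and the toy model are assumptions for illustration only.
import numpy as np


def pack(gradients: np.ndarray) -> bytes:
    """Serialize a gradient vector into a 'packet' (here, raw bytes)."""
    return gradients.astype(np.float32).tobytes()


def unpack(packet: bytes) -> np.ndarray:
    """Recover the gradient vector from a packet."""
    return np.frombuffer(packet, dtype=np.float32)


class LearningNode:
    """One learning node: computes gradients of a loss and applies the aggregated update."""

    def __init__(self, weights: np.ndarray, lr: float = 0.1):
        self.weights = weights.astype(np.float32)
        self.lr = lr

    def compute_gradient_packet(self, x: np.ndarray, y: np.ndarray) -> bytes:
        # Toy model: linear regression with mean-squared-error loss.
        pred = x @ self.weights
        grad = 2.0 * x.T @ (pred - y) / len(y)  # dL/dw
        return pack(grad)

    def apply_update_packet(self, packet: bytes) -> None:
        summed_grad = unpack(packet)
        self.weights -= self.lr * summed_grad   # SGD step using the summed gradients


class ComputingInterconnect:
    """Receives one gradient packet per node, sums them, and broadcasts the result."""

    def aggregate(self, packets: list) -> bytes:
        total = np.sum([unpack(p) for p in packets], axis=0)
        return pack(total)


# Usage: two nodes with shared initial weights, one synchronous training step.
rng = np.random.default_rng(0)
w0 = np.zeros(3, dtype=np.float32)
nodes = [LearningNode(w0.copy()) for _ in range(2)]
interconnect = ComputingInterconnect()

packets = []
for node in nodes:
    x = rng.normal(size=(8, 3)).astype(np.float32)
    y = rng.normal(size=8).astype(np.float32)
    packets.append(node.compute_gradient_packet(x, y))

summed = interconnect.aggregate(packets)
for node in nodes:
    node.apply_update_packet(summed)  # every node applies the same aggregated update
```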
Public/Granted literature
- US20210034978A1 Distributed Deep Learning System, Publication date: 2021-02-04