Invention Grant
- Patent Title: Training giant neural networks using pipeline parallelism
- Application No.: US16989787
- Application Date: 2020-08-10
- Publication No.: US11232356B2
- Publication Date: 2022-01-25
- Inventors: Zhifeng Chen, Yanping Huang, Youlong Cheng, HyoukJoong Lee, Dehao Chen, Jiquan Ngiam
- Applicant: Google LLC
- Applicant Address: Mountain View, CA, US
- Assignee: Google LLC
- Current Assignee: Google LLC
- Current Assignee Address: Mountain View, CA, US
- Agency: Fish & Richardson P.C.
- Main IPC: G06N3/08
- IPC: G06N3/08 ; G06N3/04

Abstract:
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for training giant neural networks. One of the methods includes obtaining data specifying a partitioning of the neural network into N composite layers that form a sequence of composite layers, wherein each composite layer comprises a distinct plurality of layers from the multiple network layers of the neural network; obtaining data assigning each of the N composite layers to one or more computing devices from a set of N computing devices; partitioning a mini-batch of training examples into a plurality of micro-batches; and training the neural network, comprising: performing a forward pass through the neural network until output activations have been computed for each micro-batch for a final composite layer in the sequence, and performing a backward pass through the neural network until output gradients have been computed for each micro-batch for the first composite layer in the sequence.
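The training step described in the abstract can be sketched in plain Python. This is a minimal illustration of the schedule only (partition the mini-batch into micro-batches, run every micro-batch forward through the sequence of composite layers, then walk the layers in reverse per micro-batch); the function name, the representation of composite layers as plain callables, and the placeholder gradient step are all assumptions for illustration, not details from the patent.

```python
def train_step(composite_layers, mini_batch, num_micro_batches):
    """One pipeline-style training step over N composite layers (a sketch).

    composite_layers: sequence of callables, each standing in for a
    composite layer assigned to one computing device (assumption).
    """
    # Partition the mini-batch of training examples into micro-batches.
    size = len(mini_batch) // num_micro_batches
    micro_batches = [mini_batch[i * size:(i + 1) * size]
                     for i in range(num_micro_batches)]

    # Forward pass: compute output activations for each micro-batch at
    # every composite layer, finishing at the final layer in the sequence.
    activations = []  # activations[m][k] = outputs of layer k on micro-batch m
    for mb in micro_batches:
        x, acts = mb, []
        for layer in composite_layers:
            x = [layer(v) for v in x]  # placeholder per-example compute
            acts.append(x)
        activations.append(acts)

    # Backward pass: propagate gradients from the final composite layer
    # back to the first, per micro-batch (gradient arithmetic elided).
    grads = []
    for m in reversed(range(num_micro_batches)):
        g = [1.0] * len(micro_batches[m])  # stand-in for dLoss/dOutput
        for _layer in reversed(composite_layers):
            pass  # real code would compute per-layer gradients here
        grads.append(g)
    return activations, grads
```

In a real system each composite layer lives on its own device and micro-batches flow through the stages concurrently, so the pipeline stays busy; the sequential loops above only show the ordering constraints, not the parallel execution.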
Public/Granted literature
- US20210042620A1, TRAINING GIANT NEURAL NETWORKS USING PIPELINE PARALLELISM, Publication Date: 2021-02-11