Invention Grant
- Patent Title: Data parallelism in distributed training of artificial intelligence models
- Application No.: US16588402
- Application Date: 2019-09-30
- Publication No.: US11436019B2
- Publication Date: 2022-09-06
- Inventor: Bharadwaj Pudipeddi, Marc Tremblay, Sujeeth Subramanya Bharadwaj, Devangkumar Patel, Jinwen Xi, Maral Mesmakhosroshahi
- Applicant: Microsoft Technology Licensing, LLC
- Applicant Address: US WA Redmond
- Assignee: Microsoft Technology Licensing, LLC
- Current Assignee: Microsoft Technology Licensing, LLC
- Current Assignee Address: US WA Redmond
- Agency: Fiala & Weaver P.L.L.C.
- Main IPC: G06F15/16
- IPC: G06F15/16; G06F9/38; H04L67/289; G06N3/08; H04L67/00

Abstract:
Methods, systems, apparatuses, and computer program products are described herein that enable execution of a large AI model on a memory-constrained target device that is communicatively connected to a parameter server, which stores a master copy of the AI model. The AI model may be dissected into smaller portions (e.g., layers or sub-layers), and each portion may be executed as efficiently as possible on the target device. After execution of one portion of the AI model is finished, another portion of the AI model may be downloaded and executed at the target device. To improve efficiency, the input samples may be divided into microbatches, and a plurality of microbatches executed in sequential order may form a minibatch. The size of the microbatch group, or minibatch, can be adjusted to reduce the communication overhead. Multi-level parallel parameter reduction may be performed at the parameter server and the target device.
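The staged execution scheme described in the abstract can be sketched as follows. This is a hypothetical, simplified illustration, not the patented implementation: the names `ParameterServer`, `Portion`, `split_into_microbatches`, and `run_minibatch` are assumptions chosen for clarity, and gradient accumulation and parameter reduction are omitted.

```python
from typing import Callable, List

# One "portion" of the model (e.g., a layer or sub-layer) is modeled
# here as a function mapping a microbatch of values to new values.
Portion = Callable[[List[float]], List[float]]

class ParameterServer:
    """Holds the master copy of the model as an ordered list of portions."""
    def __init__(self, portions: List[Portion]):
        self._portions = portions

    def download(self, idx: int) -> Portion:
        # In the described system this would transfer the portion's
        # parameters to the memory-constrained target device.
        return self._portions[idx]

    def num_portions(self) -> int:
        return len(self._portions)

def split_into_microbatches(samples: List[float], size: int) -> List[List[float]]:
    """Divide the input samples into microbatches of a configurable size."""
    return [samples[i:i + size] for i in range(0, len(samples), size)]

def run_minibatch(server: ParameterServer, samples: List[float],
                  microbatch_size: int) -> List[float]:
    """Execute the model one portion at a time on the target device.

    Each downloaded portion processes every microbatch in sequence before
    the next portion is fetched, so only one portion needs to be resident
    in device memory at any moment.
    """
    microbatches = split_into_microbatches(samples, microbatch_size)
    for idx in range(server.num_portions()):
        portion = server.download(idx)          # fetch next layer/sub-layer
        microbatches = [portion(mb) for mb in microbatches]
    return [x for mb in microbatches for x in mb]  # reassemble outputs

# Toy model: two portions applied in sequence.
server = ParameterServer([
    lambda mb: [x * 2 for x in mb],   # portion 0
    lambda mb: [x + 1 for x in mb],   # portion 1
])
out = run_minibatch(server, [1.0, 2.0, 3.0, 4.0], microbatch_size=2)
# out == [3.0, 5.0, 7.0, 9.0]
```

Tuning `microbatch_size` (and thus how many microbatches form a minibatch) trades device-memory pressure against how often portions must be swapped in from the parameter server, which is the communication-overhead adjustment the abstract refers to.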
Public/Granted literature
- US20210019152A1 DATA PARALLELISM IN DISTRIBUTED TRAINING OF ARTIFICIAL INTELLIGENCE MODELS, Publication Date: 2021-01-21