Invention Grant
- Patent Title: Batch normalization layer training method
- Application No.: US16814578
- Application Date: 2020-03-10
- Publication No.: US12014268B2
- Publication Date: 2024-06-18
- Inventors: Seung-Kyun Oh, Jinseok Im, Sanghoon Kim
- Applicant: LG ELECTRONICS INC.
- Applicant Address: Seoul, KR
- Assignee: LG ELECTRONICS INC.
- Current Assignee: LG ELECTRONICS INC.
- Current Assignee Address: Seoul, KR
- Agency: Birch, Stewart, Kolasch & Birch, LLP
- Priority: KR 20190094523, 2019-08-02
- Main IPC: G06N3/08
- IPC: G06N3/08; G06N3/04

Abstract:
Disclosed is a batch normalization layer training method that may be used in a neural network learning apparatus having limited operational processing capability and storage space. A batch normalization layer training method according to an embodiment of the present disclosure may perform the batch normalization transform by setting the gradients of the loss function with respect to the standard deviation and the mean to zero, and by applying a normalized statistic value obtained from an initial neural network or a previous neural network to the gradient of the loss function. The neural network learning apparatus of the present disclosure may be connected to or converged with an Artificial Intelligence module, an Unmanned Aerial Vehicle (UAV), a robot, an Augmented Reality (AR) apparatus, a Virtual Reality (VR) apparatus, a 5G network service-related apparatus, etc.
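Treating the loss gradients with respect to the batch mean and standard deviation as zero removes the per-batch reduction terms from the standard batch normalization backward pass, leaving only a cheap element-wise product, which is what makes the method attractive for devices with limited compute and memory. The following NumPy sketch illustrates that simplification as described in the abstract alone; the function and variable names (bn_forward, bn_backward_simplified, stored_mean, stored_var) are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def bn_forward(x, gamma, beta, stored_mean, stored_var, eps=1e-5):
    """Forward pass: normalize with statistics carried over from an
    initial or previous network instead of the current batch."""
    x_hat = (x - stored_mean) / np.sqrt(stored_var + eps)
    return gamma * x_hat + beta, x_hat

def bn_backward_simplified(dy, x_hat, gamma, stored_var, eps=1e-5):
    """Backward pass with dL/dmean and dL/dstd treated as zero.
    The two batch-statistic correction terms of the standard BN
    gradient vanish, leaving only the direct normalization path."""
    dgamma = (dy * x_hat).sum(axis=0)  # gradient for the scale parameter
    dbeta = dy.sum(axis=0)             # gradient for the shift parameter
    dx = dy * gamma / np.sqrt(stored_var + eps)  # element-wise, no batch reductions
    return dx, dgamma, dbeta

# Example: a batch of 8 samples with 4 features
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
gamma, beta = np.ones(4), np.zeros(4)
stored_mean, stored_var = np.zeros(4), np.ones(4)  # e.g. from a previous network
y, x_hat = bn_forward(x, gamma, beta, stored_mean, stored_var)
dx, dgamma, dbeta = bn_backward_simplified(np.ones_like(y), x_hat, gamma, stored_var)
```

By contrast, the full BN backward pass computes dx = gamma / (N * std) * (N * dy - sum(dy) - x_hat * sum(dy * x_hat)), which requires two reductions over the batch at every step; the simplified form above avoids both.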
Public/Granted literature
- US20210034972A1 BATCH NORMALIZATION LAYER TRAINING METHOD (published 2021-02-04)