ACCELERATING DEEP NEURAL NETWORK TRAINING WITH INCONSISTENT STOCHASTIC GRADIENT DESCENT
    Type: Invention application
    Status: Pending (published)

    Publication number: WO2017136802A1

    Publication date: 2017-08-10

    Application number: PCT/US2017/016637

    Filing date: 2017-02-06

    CPC classification number: G06N3/08 G06N3/04 G06N3/0454 G06N3/084

    Abstract: Aspects of the present disclosure describe techniques for training a convolutional neural network using an inconsistent stochastic gradient descent (ISGD) algorithm. The training effort spent on each batch is dynamically adjusted according to that batch's measured loss, by which batches are classified into one of two sub-states: well-trained or under-trained. The ISGD algorithm applies more iterations to under-trained batches while reducing iterations for well-trained ones.
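The loss-driven scheme described in the abstract can be sketched as follows. The specific control limit used here (mean plus one standard deviation of a running window of batch losses), the cap on extra iterations, and the toy scalar model are all illustrative assumptions for this sketch, not the patent's exact formulation.

```python
# Sketch of the ISGD idea: a batch whose loss is high relative to recent
# history is treated as "under-trained" and given extra iterations.

def isgd_train(batches, step, window=20, max_extra_iters=5):
    """One pass over `batches`; `step(batch)` does an SGD update and returns the batch loss."""
    recent = []
    for batch in batches:
        loss = step(batch)
        recent = (recent + [loss])[-window:]
        mean = sum(recent) / len(recent)
        std = (sum((l - mean) ** 2 for l in recent) / len(recent)) ** 0.5
        # Under-trained: loss above the control limit earns extra iterations
        # (assumed heuristic: limit = mean + one std of the recent-loss window).
        extra = 0
        while loss > mean + std and extra < max_extra_iters:
            loss = step(batch)
            extra += 1


# Toy problem: fit a scalar weight w to targets with per-batch squared-error loss.
w = 0.0
LR = 0.1

def sgd_step(batch):
    global w
    grad = sum(2.0 * (w - t) for t in batch) / len(batch)
    w -= LR * grad
    return sum((w - t) ** 2 for t in batch) / len(batch)

data = [[2.9, 3.1], [3.0, 3.2], [2.8, 3.0]] * 30  # targets near 3.0
isgd_train(data, sgd_step)
```

After the pass, `w` has converged close to the target mean; under-trained batches (those with atypically high loss early on) received up to five extra updates each.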

