1. CN112020723 - METHOD AND DEVICE FOR TRAINING CLASSIFICATION NEURAL NETWORK FOR SEMANTIC SEGMENTATION, AND ELECTRONIC APPARATUS

Office China
Application Number 201880092697.6
Application Date 23.05.2018
Publication Number 112020723
Publication Date 01.12.2020
Publication Kind A
IPC
G06N 3/02
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  N   COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  3   Computer systems based on biological models
  02  using neural network models
CPC
G06N 3/02
  G   PHYSICS
  06  COMPUTING; CALCULATING; COUNTING
  N   COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  3   Computer systems based on biological models
  02  using neural network models
Applicants FUJITSU LTD.
富士通株式会社
Inventors SHI LU
石路
WANG QI
王琪
Agents 北京三友知识产权代理有限公司 (Beijing Sanyou Intellectual Property Agency Co., Ltd.) 11127
Title
(EN) METHOD AND DEVICE FOR TRAINING CLASSIFICATION NEURAL NETWORK FOR SEMANTIC SEGMENTATION, AND ELECTRONIC APPARATUS
(ZH) 用于语义分割的分类神经网络的训练方法及装置、电子设备
Abstract
(EN) A device and method for training a classification neural network for semantic segmentation, and an electronic apparatus. Because each training operation partially reuses the gradient obtained in the previous training operation, the invention effectively reduces the computation load even when a large number of training images are used, making it suitable for limited hardware resources. The reduced computation load also speeds up training, shortening the time needed to complete it. Moreover, because each training operation additionally uses new data, training accuracy is preserved even under limited hardware resources.
(ZH) 一种用于语义分割的分类神经网络的训练装置及方法、电子设备。即使在训练网络时使用了较多数量的训练图像,由于在每次训练时能够部分利用之前训练时已经获得的梯度,因此能够有效减少计算量并适用于硬件资源有限的情况,并且,由于计算量的减少,训练速度加快,能够缩短训练完成的时间,另外,由于每次训练时都使用了新的数据,使得在使用有限的硬件资源的条件下仍然能够保证训练的精度。
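Illustration (not part of the record): the abstract does not disclose the exact update rule, so the sketch below only illustrates the general idea of partially reusing the gradient from the previous training operation while processing a small batch of new images at each step. The momentum-style blend factor alpha, the toy linear classifier, the chunk size, and all variable names are assumptions made for this example, not the claimed method.

import numpy as np

rng = np.random.default_rng(0)

# Toy "image" dataset: 512 samples, 64 features (e.g. flattened patches), 4 classes.
X = rng.normal(size=(512, 64)).astype(np.float32)
y = rng.integers(0, 4, size=512)

W = np.zeros((64, 4), dtype=np.float32)   # linear classifier standing in for the network head
prev_grad = np.zeros_like(W)              # gradient carried over from the previous training step
alpha, lr, chunk = 0.7, 0.1, 64           # assumed reuse weight, learning rate, new images per step

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

for step in range(40):
    # Each step draws only a small batch of NEW training images ...
    idx = rng.choice(len(X), size=chunk, replace=False)
    xb, yb = X[idx], y[idx]

    probs = softmax(xb @ W)
    probs[np.arange(chunk), yb] -= 1.0    # d(cross-entropy)/d(logits)
    new_grad = xb.T @ probs / chunk

    # ... and partially reuses the gradient obtained in the previous step,
    # so far fewer new images need to be processed per step.
    grad = alpha * prev_grad + (1.0 - alpha) * new_grad
    W -= lr * grad
    prev_grad = grad

Under these assumptions the per-step cost is dominated by the small chunk of new images, while the reused gradient keeps information from earlier data in the update, which matches the abstract's claim of reduced computation with preserved accuracy.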