CN109885378 - Model training method and device, computer device and computer readable storage medium

Office China
Application Number 201910008124.7
Application Date 04.01.2019
Publication Number 109885378
Publication Date 14.06.2019
Publication Kind A
IPC
G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
G06F 16/951 - Indexing; Web crawling techniques
G06F 16/955 - Retrieval from the web using information identifiers, e.g. uniform resource locators
G06N 20/00 - Machine learning
CPC
G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
G06F 16/951 - Indexing; Web crawling techniques
G06F 16/955 - Retrieval from the web using information identifiers, e.g. uniform resource locators [URL]
G06N 20/00 - Machine learning
Applicants PING AN TECHNOLOGY (SHENZHEN) CO., LTD.
平安科技(深圳)有限公司
Inventors WU ZHUANGWEI
吴壮伟
Agents 深圳市精英专利事务所 44242
Title
(EN) Model training method and device, computer device and computer readable storage medium
(ZH) 模型训练方法、装置、计算机设备及计算机可读存储介质
Abstract
(EN)
The embodiments of the invention provide a model training method and device, a computer device, and a computer-readable storage medium. The method comprises: obtaining a training corpus in a first preset manner; segmenting the corpus according to preset conditions to obtain a plurality of data blocks; inputting the data blocks into the corresponding sub-models according to a preset correspondence so as to train each sub-model and obtain trained sub-models; and combining the trained sub-models in a second preset manner to obtain a composite model. When training a model, the embodiments follow the idea of combining parallel and serial processing: the corpus is divided into data blocks, the data blocks are fed into the corresponding sub-models according to the preset mapping, each sub-model is trained in parallel, and serial combined computation is then carried out over the multiple subdivided sub-models to form a final multi-layer composite model, which significantly improves training efficiency and saves computer hardware resources.

(ZH)
本申请实施例提供了一种模型训练方法、装置、计算机设备及计算机可读存储介质。方法包括:通过第一预设方式获取训练语料;将语料按照预设条件进行切分以得到多个数据块;将数据块按照预设对应关系分别输入至对应的子模型以训练各个子模型,得到训练后的子模型;按照第二预设方式合成训练后的子模型以得到合成模型。本申请实施例实现模型训练时,基于并行和串行相结合的构思,通过将语料进行划分数据块,将所述数据块按照预先设置分别输入至对应的子模型,采用并行方式训练各个子模型,然后通过多个细分子模型进行串行的组合计算,组成了最终多层的合成模型,显著提高了模型训练时的效率和对计算机硬件资源的节省。
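
As an illustration of the parallel/serial training flow described in the abstract, the following is a minimal Python sketch. The patent publishes no code, so every name here (split_corpus, SubModel, CompositeModel, train_composite) is hypothetical, and the toy token-counting sub-model merely stands in for whatever models the claims actually cover.

    # Illustrative sketch only; mirrors the abstract's four steps:
    # segment the corpus, map blocks to sub-models, train in parallel,
    # then combine the trained sub-models serially into a composite model.
    from concurrent.futures import ThreadPoolExecutor
    from dataclasses import dataclass, field


    def split_corpus(corpus, block_size):
        """Segment the training corpus into data blocks of a preset size."""
        return [corpus[i:i + block_size] for i in range(0, len(corpus), block_size)]


    @dataclass
    class SubModel:
        """Toy sub-model: it just counts token frequencies in its block."""
        counts: dict = field(default_factory=dict)

        def train(self, block):
            for token in block:
                self.counts[token] = self.counts.get(token, 0) + 1
            return self


    @dataclass
    class CompositeModel:
        """Serial combination of the trained sub-models."""
        sub_models: list

        def score(self, token):
            # Serial pass over the trained sub-models, accumulating their outputs.
            return sum(m.counts.get(token, 0) for m in self.sub_models)


    def train_composite(corpus, block_size=3, workers=2):
        blocks = split_corpus(corpus, block_size)              # segment into data blocks
        sub_models = [SubModel() for _ in blocks]              # preset block-to-sub-model mapping
        with ThreadPoolExecutor(max_workers=workers) as pool:  # train each sub-model in parallel
            trained = list(pool.map(lambda pair: pair[0].train(pair[1]),
                                    zip(sub_models, blocks)))
        return CompositeModel(trained)                         # combine trained sub-models


    if __name__ == "__main__":
        corpus = "the cat sat on the mat the end".split()
        model = train_composite(corpus, block_size=3, workers=2)
        print(model.score("the"))  # prints 3

With real models, the thread pool would typically be replaced by process- or GPU-level parallelism; only the overall structure (segment, train in parallel, combine serially) is taken from the abstract.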

Related patent documents