
WO2020069239 - EXPLOITING ACTIVATION SPARSITY IN DEEP NEURAL NETWORKS

Publication Number WO/2020/069239
Publication Date 02.04.2020
International Application No. PCT/US2019/053325
International Filing Date 27.09.2019
IPC
G06N 3/063 (2006.01)
  G    PHYSICS
  06   COMPUTING; CALCULATING OR COUNTING
  N    COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  3    Computer systems based on biological models
  02   using neural network models
  06   Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
  063  using electronic means
CPC
G06F 5/06
  G    PHYSICS
  06   COMPUTING; CALCULATING; COUNTING
  F    ELECTRIC DIGITAL DATA PROCESSING
  5    Methods or arrangements for data conversion without changing the order or content of the data handled
  06   for changing the speed of data flow, i.e. speed regularising or timing, e.g. delay lines, FIFO buffers; over- or underrun control therefor
G06F 7/5443
  G    PHYSICS
  06   COMPUTING; CALCULATING; COUNTING
  F    ELECTRIC DIGITAL DATA PROCESSING
  7    Methods or arrangements for processing data by operating upon the order or content of the data handled
  38   Methods or arrangements for performing computations using exclusively denominational number representation, e.g. using binary, ternary, decimal representation
  48   using non-contact-making devices, e.g. tube, solid state device; using unspecified devices
  544  for evaluating functions by calculation
  5443 Sum of products
G06N 3/063
  G    PHYSICS
  06   COMPUTING; CALCULATING; COUNTING
  N    COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  3    Computer systems based on biological models
  02   using neural network models
  06   Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
  063  using electronic means
Applicants
  • QUALCOMM INCORPORATED [US]/[US]
Inventors
  • HILL, Rexford
  • LAMB, Aaron
  • GOLDFARB, Michael
  • ANSARI, Amin
  • LOTT, Christopher
Agents
  • MEISAROSH, Edward J.
Priority Data
16/147,297   28.09.2018   US
Publication Language English (EN)
Filing Language English (EN)
Designated States
Title
(EN) EXPLOITING ACTIVATION SPARSITY IN DEEP NEURAL NETWORKS
(FR) EXPLOITATION DE LA RARETÉ D'ACTIVATION DANS DES RÉSEAUX NEURONAUX PROFONDS
Abstract
(EN)
A method of exploiting activation sparsity in deep neural networks is described. The method includes retrieving an activation tensor and a weight tensor where the activation tensor is a sparse activation tensor. The method also includes generating a compressed activation tensor comprising non-zero activations of the activation tensor, where the compressed activation tensor has fewer columns than the activation tensor. The method further includes processing the compressed activation tensor and the weight tensor to generate an output tensor.
(FR)
L'invention concerne un procédé d'exploitation de la rareté d'activation dans des réseaux neuronaux profonds. Le procédé consiste à récupérer un tenseur d'activation et un tenseur de poids, le tenseur d'activation étant un tenseur d'activation rare. Le procédé consiste également à générer un tenseur d'activation compressé comprenant des activations non nulles du tenseur d'activation, le tenseur d'activation compressé ayant moins de colonnes que le tenseur d'activation. Le procédé consiste en outre à traiter le tenseur d'activation compressé et le tenseur de poids en vue de générer un tenseur de sortie.
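The compression step described in the abstract — packing only the non-zero activations into a tensor with fewer columns, then multiplying against the weight tensor — can be illustrated with a small NumPy sketch. This is an illustrative interpretation of the abstract only, not the patented hardware implementation; all function names here are hypothetical.

```python
import numpy as np

def compress_activations(act):
    """Pack each row's non-zero activations into a narrower tensor.

    Returns (values, indices): `values` holds the non-zero activations
    of each row, zero-padded to the largest per-row count, and `indices`
    records the original column of each packed value. `values` has fewer
    columns than `act` whenever the activations are sparse.
    """
    nnz_per_row = (act != 0).sum(axis=1)
    width = int(nnz_per_row.max())
    values = np.zeros((act.shape[0], width), dtype=act.dtype)
    indices = np.zeros((act.shape[0], width), dtype=np.int64)
    for r in range(act.shape[0]):
        cols = np.nonzero(act[r])[0]
        values[r, :cols.size] = act[r, cols]
        indices[r, :cols.size] = cols
    return values, indices

def sparse_matmul(values, indices, weight):
    """Multiply compressed activations by the weight tensor.

    Each packed value selects its matching weight row via `indices`,
    so zero activations contribute no multiply-accumulate work.
    """
    out = np.zeros((values.shape[0], weight.shape[1]), dtype=weight.dtype)
    for r in range(values.shape[0]):
        for k in range(values.shape[1]):
            if values[r, k] != 0:
                out[r] += values[r, k] * weight[indices[r, k]]
    return out

# Example: a ReLU-style sparse activation tensor (4 columns -> 2 columns).
act = np.array([[0., 2., 0., 1.],
                [3., 0., 0., 0.]])
weight = np.arange(8, dtype=float).reshape(4, 2)
vals, idx = compress_activations(act)
dense = act @ weight                      # reference result
sparse = sparse_matmul(vals, idx, weight) # same result, fewer MACs
assert np.allclose(dense, sparse)
```

The sketch skips multiply-accumulate operations for zero activations, which is the efficiency gain the abstract attributes to operating on the compressed tensor instead of the full sparse one.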
Latest bibliographic data on file with the International Bureau