WO2020112213 - DEEP NEURAL NETWORK PROCESSING FOR SENSOR BLINDNESS DETECTION IN AUTONOMOUS MACHINE APPLICATIONS

Publication Number WO/2020/112213
Publication Date 04.06.2020
International Application No. PCT/US2019/051097
International Filing Date 13.09.2019
CPC
G06K 9/00791
  G - PHYSICS
  06 - COMPUTING; CALCULATING; COUNTING
  K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  00624 - Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
  00791 - Recognising scenes perceived from the perspective of a land vehicle, e.g. recognising lanes, obstacles or traffic signs on road scenes
G06K 9/036
  G - PHYSICS
  06 - COMPUTING; CALCULATING; COUNTING
  K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  03 - Detection or correction of errors, e.g. by rescanning the pattern
  036 - Evaluation of quality of acquired pattern
G06K 9/6228
  G - PHYSICS
  06 - COMPUTING; CALCULATING; COUNTING
  K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  62 - Methods or arrangements for recognition using electronic means
  6217 - Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
  6228 - Selecting the most significant subset of features
G06K 9/6262
  G - PHYSICS
  06 - COMPUTING; CALCULATING; COUNTING
  K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  62 - Methods or arrangements for recognition using electronic means
  6217 - Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
  6262 - Validation, performance evaluation or active pattern learning techniques
G06K 9/6267
  G - PHYSICS
  06 - COMPUTING; CALCULATING; COUNTING
  K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  62 - Methods or arrangements for recognition using electronic means
  6267 - Classification techniques
G06K 9/6273
  G - PHYSICS
  06 - COMPUTING; CALCULATING; COUNTING
  K - RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  62 - Methods or arrangements for recognition using electronic means
  6267 - Classification techniques
  6268 - relating to the classification paradigm, e.g. parametric or non-parametric approaches
  627 - based on distances between the pattern to be recognised and training or reference patterns
  6271 - based on distances to prototypes
  6272 - based on distances to cluster centroïds
  6273 - Smoothing the distance, e.g. Radial Basis Function Networks
Applicants
  • NVIDIA CORPORATION [US]/[US]
Inventors
  • SEO, Hae-Jong
  • BAJPAYEE, Abhishek
  • NISTER, David
  • PARK, Minwoo
  • CVIJETIC, Neda
Agents
  • PATEL, Maitry
Priority Data
  • 16/570,187   13.09.2019   US
  • 62/730,652   13.09.2018   US
Publication Language English (EN)
Filing Language English (EN)
Designated States
Title
(EN) DEEP NEURAL NETWORK PROCESSING FOR SENSOR BLINDNESS DETECTION IN AUTONOMOUS MACHINE APPLICATIONS
Abstract
(EN)
In various examples, a deep neural network (DNN) is trained for sensor blindness detection using a region- and context-based approach. Using sensor data, the DNN may compute locations of blindness or compromised-visibility regions, as well as blindness classifications and/or blindness attributes associated therewith. In addition, the DNN may predict a usability of each instance of the sensor data for performing one or more operations, such as operations associated with semi-autonomous or autonomous driving. The combination of the outputs of the DNN may be used to filter out instances of the sensor data, or portions of instances determined to be compromised, that may lead to inaccurate or ineffective results for the one or more operations of the system.
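The abstract describes a DNN that jointly predicts where a sensor is blinded (with per-region classifications) and how usable each frame is, and then filters out compromised sensor data before downstream operations. The publication does not disclose a concrete architecture, label set, or threshold, so the following PyTorch sketch is only illustrative: BlindnessNet, BLINDNESS_CLASSES, and min_usability are hypothetical names and values chosen to show the general shape of such a pipeline.

```python
# Illustrative sketch only; layer sizes, class list, and threshold are assumptions,
# not the architecture disclosed in WO2020112213.
import torch
import torch.nn as nn

BLINDNESS_CLASSES = ["clear", "blur", "occlusion", "glare"]  # hypothetical label set


class BlindnessNet(nn.Module):
    """Shared encoder with (a) a per-region blindness-classification head and
    (b) a per-frame usability head, loosely following the region- and
    context-based formulation in the abstract."""

    def __init__(self, num_classes: int = len(BLINDNESS_CLASSES)):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Region head: one class score per spatial cell of the downsampled grid.
        self.region_head = nn.Conv2d(128, num_classes, kernel_size=1)
        # Usability head: a single score in [0, 1] for the whole frame.
        self.usability_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1), nn.Sigmoid()
        )

    def forward(self, image: torch.Tensor):
        features = self.encoder(image)
        region_logits = self.region_head(features)   # (N, C, H/8, W/8)
        usability = self.usability_head(features)    # (N, 1)
        return region_logits, usability


def filter_usable_frames(model, frames, min_usability: float = 0.5):
    """Drop frames whose predicted usability falls below a (hypothetical)
    threshold, mirroring the filtering step described in the abstract."""
    model.eval()
    usable = []
    with torch.no_grad():
        for frame in frames:
            _, usability = model(frame.unsqueeze(0))
            if usability.item() >= min_usability:
                usable.append(frame)
    return usable


if __name__ == "__main__":
    model = BlindnessNet()
    frames = [torch.rand(3, 224, 224) for _ in range(4)]  # stand-in camera frames
    kept = filter_usable_frames(model, frames)
    print(f"kept {len(kept)} of {len(frames)} frames")
```

Using two heads over a shared encoder lets the frame-level usability prediction draw on whole-image context while the region head localizes and classifies the compromised areas; in this sketch either output (or both) could drive the filtering decision.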