
Offices: all
Language: en
Stemming: true
Single Family Member: false
Include NPL: false

Refine Options

Offices: All
Language: the language of your search keywords.
Stemming: reduces inflected words to their stem or root form. For example, the words fishing, fished, fish, and fisher are all reduced to the root word fish, so a search for fisher returns all the different variations.
Single Family Member: returns only one member of a family of patents.
Include NPL: includes non-patent literature in the results.
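The stemming behaviour described above can be sketched with a toy suffix-stripping stemmer. This is a hypothetical minimal version for illustration only, not PATENTSCOPE's actual stemming engine (real systems use algorithms such as Porter stemming):

```python
def stem(word):
    """Strip common English inflectional suffixes to approximate a root form."""
    for suffix in ("ing", "ed", "er", "s"):
        # keep at least a 3-letter stem so short words are left alone
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

words = ["fishing", "fished", "fish", "fisher"]
print([stem(w) for w in words])  # every variation reduces to "fish"
```

With stemming enabled, a query term is reduced the same way, so a search for fisher matches any of these variants.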

Full Query

AI application field Networks Social Networks



1. 20200311520  Training machine learning model
US  01.10.2020
Int. Class G06T 7/00
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  T   IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
  7   Image analysis
Appl. No: 16369135  Applicant: International Business Machines Corporation  Inventor: Shiwan Zhao

Techniques are provided for training a machine learning model. According to one aspect, training data is received by one or more processing units. The machine learning model is trained based on the training data, wherein the training comprises optimizing the machine learning model based on stochastic gradient descent (SGD) by adding a dynamic noise to a gradient of a model parameter of the machine learning model calculated by the SGD.
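The core idea in this abstract, SGD with a dynamic noise term added to the gradient, can be sketched as follows. This is my own illustration rather than the patent's actual method: the noise scale decaying with the step count is one plausible reading of "dynamic", and the objective (w - 3)^2 is made up:

```python
import random

random.seed(0)  # reproducible illustration

def noisy_sgd_step(w, grad, lr, step, noise_scale=0.1):
    """One SGD update with decaying Gaussian noise added to the gradient."""
    noise = random.gauss(0.0, noise_scale / (1 + step))
    return w - lr * (grad + noise)

w = 0.0
for step in range(200):
    grad = 2 * (w - 3.0)          # gradient of (w - 3)^2
    w = noisy_sgd_step(w, grad, lr=0.1, step=step)
print(round(w, 2))  # settles near 3.0, the minimizer
```

Because the injected noise shrinks over time, early updates explore while late updates converge cleanly.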

2. 20140180975  INSTANCE WEIGHTED LEARNING MACHINE LEARNING MODEL
US  26.06.2014
Int. Class G06N 99/00
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  N   COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
  99  Subject matter not provided for in other groups of this subclass
Appl. No: 13725653  Applicant: INSIDESALES.COM, INC.  Inventor: Martinez Tony Ramon

An instance weighted learning (IWL) machine learning model. In one example embodiment, a method of employing an IWL machine learning model to train a classifier may include determining a quality value that should be associated with each machine learning training instance in a temporal sequence of reinforcement learning machine learning training instances, associating the corresponding determined quality value with each of the machine learning training instances, and training a classifier using each of the machine learning training instances. Each of the machine learning training instances includes a state-action pair and is weighted during the training based on its associated quality value using a weighting factor that weights different quality values differently such that the classifier learns more from a machine learning training instance with a higher quality value than from a machine learning training instance with a lower quality value.
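The weighting scheme this abstract describes, where each training instance carries a quality value and higher-quality instances influence the model more, can be sketched like this. This is my own construction, not InsideSales.com's implementation; the 1-D linear model and the data are made up:

```python
def train_weighted(instances, lr=0.1, epochs=50):
    """Fit a 1-D linear score w*x by quality-weighted gradient descent.

    instances: list of (x, y, quality) triples; the quality value scales each
    update, so the model learns more from high-quality instances.
    """
    w = 0.0
    for _ in range(epochs):
        for x, y, quality in instances:
            grad = 2 * (w * x - y) * x
            w -= lr * quality * grad   # weighting factor applied per instance
    return w

# High-quality instances say y = 2x; one low-quality instance disagrees.
data = [(1.0, 2.0, 1.0), (2.0, 4.0, 1.0), (1.0, 0.0, 0.05)]
w = train_weighted(data)
print(round(w, 2))  # pulled close to 2 by the high-quality instances
```

The low-quality instance barely moves the fit, which is exactly the behaviour the abstract claims for its weighting factor.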

3. 2013364041  Instance weighted learning machine learning model
AU  09.07.2015
Int. Class G06F 15/18
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  F   ELECTRIC DIGITAL DATA PROCESSING
  15  Digital computers in general; Data processing equipment in general
  18  in which a program is changed according to experience gained by the computer itself during a complete run; Learning machines
Appl. No: 2013364041  Applicant: InsideSales.com, Inc.  Inventor: Martinez, Tony Ramon

An instance weighted learning (IWL) machine learning model. In one example embodiment, a method of employing an IWL machine learning model to train a classifier may include determining a quality value that should be associated with each machine learning training instance in a temporal sequence of reinforcement learning machine learning training instances, associating the corresponding determined quality value with each of the machine learning training instances, and training a classifier using each of the machine learning training instances. Each of the machine learning training instances includes a state-action pair and is weighted during the training based on its associated quality value using a weighting factor that weights different quality values differently such that the classifier learns more from a machine learning training instance with a higher quality value than from a machine learning training instance with a lower quality value.

4. WO/2014/100738  INSTANCE WEIGHTED LEARNING MACHINE LEARNING MODEL
WO  26.06.2014
Int. Class G06F 15/18
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  F   ELECTRIC DIGITAL DATA PROCESSING
  15  Digital computers in general; Data processing equipment in general
  18  in which a program is changed according to experience gained by the computer itself during a complete run; Learning machines
Appl. No: PCT/US2013/077260  Applicant: INSIDESALES.COM, INC.  Inventor: MARTINEZ, Tony, Ramon

An instance weighted learning (IWL) machine learning model. In one example embodiment, a method of employing an IWL machine learning model to train a classifier may include determining a quality value that should be associated with each machine learning training instance in a temporal sequence of reinforcement learning machine learning training instances, associating the corresponding determined quality value with each of the machine learning training instances, and training a classifier using each of the machine learning training instances. Each of the machine learning training instances includes a state-action pair and is weighted during the training based on its associated quality value using a weighting factor that weights different quality values differently such that the classifier learns more from a machine learning training instance with a higher quality value than from a machine learning training instance with a lower quality value.

5. 20140052678  Hierarchical based sequencing machine learning model
US  20.02.2014
Int. Class G06E 1/00
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  E   OPTICAL COMPUTING DEVICES
  1   Devices for processing exclusively digital data
Appl. No: 13590000  Applicant: Martinez Tony Ramon  Inventor: Martinez Tony Ramon

A hierarchical based sequencing (HBS) machine learning model. In one example embodiment, a method of employing an HBS machine learning model to predict multiple interdependent output components of an MOD output decision may include determining an order for multiple interdependent output components of an MOD output decision. The method may also include sequentially training a classifier for each component in the selected order to predict the component based on an input and based on any previous predicted component(s).
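The sequencing described above, training one classifier per output component in a chosen order, each seeing the input plus all previously predicted components, can be sketched like this. This is my illustration rather than the patented system; the lookup-table "classifier" and the channel/time components are hypothetical stand-ins:

```python
def fit(features, labels):
    """Toy trainer: memorize feature -> label, default to the first label."""
    table = dict(zip(features, labels))
    return lambda f: table.get(f, labels[0])

def train_hbs(data, order):
    """data: list of (input, components) pairs; returns component -> classifier."""
    classifiers = {}
    for i, comp in enumerate(order):
        earlier = order[:i]
        # features = input extended with the components earlier in the sequence
        feats = [tuple([x] + [comps[e] for e in earlier]) for x, comps in data]
        labels = [comps[comp] for _, comps in data]
        classifiers[comp] = fit(feats, labels)
    return classifiers

def predict_hbs(classifiers, order, x):
    """Predict each component in order, feeding predictions forward."""
    out, feats = {}, [x]
    for comp in order:
        out[comp] = classifiers[comp](tuple(feats))
        feats.append(out[comp])
    return out

data = [("lead_a", {"channel": "email", "time": "morning"}),
        ("lead_b", {"channel": "call", "time": "afternoon"})]
order = ["channel", "time"]
clfs = train_hbs(data, order)
print(predict_hbs(clfs, order, "lead_a"))  # {'channel': 'email', 'time': 'morning'}
```

The key structural point is that the "time" classifier is trained and queried with the predicted "channel" as an extra feature, matching the abstract's "based on any previous predicted component(s)".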

6. 20140180978  INSTANCE WEIGHTED LEARNING MACHINE LEARNING MODEL
US  26.06.2014
Int. Class G06N 99/00
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  N   COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
  99  Subject matter not provided for in other groups of this subclass
Appl. No: 14189669  Applicant: INSIDESALES.COM, INC.  Inventor: Martinez Tony Ramon

An instance weighted learning (IWL) machine learning model. In one example embodiment, a method of employing an IWL machine learning model may include identifying a temporal sequence of reinforcement learning machine learning training instances with each of the training instances including a state-action pair, determining a first quality value for a first training instance in the temporal sequence of reinforcement learning machine learning training instances determining a second quality value for a second training instance in the temporal sequence of reinforcement learning machine learning training instances, associating the first quality value with the first training instance, and associating the second quality value with the second training instance. In this example embodiment, the first quality value is higher than the second quality value.

7. 20160155069  Machine learning classifier that can determine classifications of high-risk items
US  02.06.2016
Int. Class G06F 15/18
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  F   ELECTRIC DIGITAL DATA PROCESSING
  15  Digital computers in general; Data processing equipment in general
  18  in which a program is changed according to experience gained by the computer itself during a complete run; Learning machines
Appl. No: 15014773  Applicant: Accenture Global Solutions Limited  Inventor: James Hoover

A machine learning classifier system includes a data set processing subsystem to generate a training set and a validation set from multiple data sources. Classifier hardware induces a classifier according to the training set, and tests the classifier according to the validation set. A buffer connected to the classifier hardware stores data objects to be classified, and a register connected to the classifier hardware stores outputs of the classifier, including classified data objects.
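The flow this abstract describes, generating a training set and a validation set, inducing a classifier from the former and testing it on the latter, can be sketched as follows. This is illustrative only, not Accenture's system: the threshold "induction" rule and the 1-D toy data are made up:

```python
import random

def split(data, train_frac=0.8, seed=0):
    """Shuffle pooled data and split it into (training set, validation set)."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

def induce(train):
    """Induce a threshold classifier at the midpoint of the two class means."""
    xs0 = [x for x, y in train if y == 0]
    xs1 = [x for x, y in train if y == 1]
    t = (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2
    return lambda x: 1 if x > t else 0

def accuracy(clf, validation):
    """Test the induced classifier on the held-out validation set."""
    return sum(clf(x) == y for x, y in validation) / len(validation)

data = [(x / 10, 0) for x in range(10)] + [(x / 10, 1) for x in range(10, 20)]
train, validation = split(data)
clf = induce(train)
print(accuracy(clf, validation))  # high on this cleanly separable toy set
```

The patented system adds hardware details (a buffer for objects to classify, a register for outputs); the split/induce/test cycle above is the underlying methodology.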

8. WO/2020/013760  ANNOTATION SYSTEM FOR A NEURAL NETWORK
WO  16.01.2020
Int. Class G06N 3/08
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  N   COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
  3   Computing arrangements based on biological models
  02  Neural networks
  08  Learning methods
Appl. No: PCT/SG2019/050324  Applicant: XJERA LABS PTE. LTD.  Inventor: DING, Lu

An annotation system for a neural network and a method thereof are disclosed in the present application. The annotation system comprises a memory and a processor operatively coupled to the memory. The memory is configured for storing instructions to cause the processor to receive information comprising a first set of unlabeled instances from at least one source; set a learning target of the information; select a second set of unlabeled instances from the first set of unlabeled instances by executing a software algorithm; and annotate the second set of unlabeled instances for generating labeled data. The software algorithm increases the efficiency of annotation in training neural networks for deep-learning-based video analysis by combining semi-supervised learning and transfer learning via a data augmentation method. The software algorithm can reduce the amount of annotation required by an order of magnitude.
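The selection step in this abstract, picking a smaller subset of unlabeled instances to annotate, can be sketched with an uncertainty-based rule. This is my illustration, not XJERA LABS' algorithm; selecting by lowest model confidence is one common way such a subset is chosen, and the pool items are hypothetical:

```python
def select_for_annotation(pool, score, budget):
    """Return the `budget` instances whose score (here, confidence) is lowest."""
    return sorted(pool, key=score)[:budget]

# Hypothetical pool: (instance_id, model_confidence) pairs.
pool = [("img1", 0.95), ("img2", 0.40), ("img3", 0.55), ("img4", 0.99)]
chosen = select_for_annotation(pool, score=lambda item: item[1], budget=2)
print([i for i, _ in chosen])  # the two least-confident instances: img2, img3
```

Only the selected subset goes to human annotators, which is how such a system cuts the annotation workload.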

9. 20140188462  System and method for analyzing ambiguities in language for natural language processing
US  03.07.2014
Int. Class G06F 17/00
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  F   ELECTRIC DIGITAL DATA PROCESSING
  17  Digital computing or data processing equipment or methods, specially adapted for specific functions
Appl. No: 14201974  Applicant: Zadeh Lotfi A.  Inventor: Zadeh Lotfi A.

Specification covers new algorithms, methods, and systems for artificial intelligence, soft computing, and deep learning/recognition, e.g., image recognition (e.g., for action, gesture, emotion, expression, biometrics, fingerprint, facial, OCR (text), background, relationship, position, pattern, and object), large number of images (“Big Data”) analytics, machine learning, training schemes, crowd-sourcing (using experts or humans), feature space, clustering, classification, similarity measures, optimization, search engine, ranking, question-answering system, soft (fuzzy or unsharp) boundaries/impreciseness/ambiguities/fuzziness in language, Natural Language Processing (NLP), Computing-with-Words (CWW), parsing, machine translation, sound and speech recognition, video search and analysis (e.g. tracking), image annotation, geometrical abstraction, image correction, semantic web, context analysis, data reliability (e.g., using Z-number (e.g., “About 45 minutes; Very sure”)), rules engine, control system, autonomous vehicle, self-diagnosis and self-repair robots, system diagnosis, medical diagnosis, biomedicine, data mining, event prediction, financial forecasting, economics, risk assessment, e-mail management, database management, indexing and join operation, memory management, and data compression.

10. WO/2014/031683  HIERARCHICAL BASED SEQUENCING MACHINE LEARNING MODEL
WO  27.02.2014
Int. Class G06F 15/18
  G   PHYSICS
  06  COMPUTING; CALCULATING OR COUNTING
  F   ELECTRIC DIGITAL DATA PROCESSING
  15  Digital computers in general; Data processing equipment in general
  18  in which a program is changed according to experience gained by the computer itself during a complete run; Learning machines
Appl. No: PCT/US2013/055856  Applicant: INSIDESALES.COM, INC.  Inventor: MARTINEZ, Tony, Ramon

A hierarchical based sequencing (HBS) machine learning model. In one example embodiment, a method of employing an HBS machine learning model to predict multiple interdependent output components of an MOD output decision may include determining an order for multiple interdependent output components of an MOD output decision. The method may also include sequentially training a classifier for each component in the selected order to predict the component based on an input and based on any previous predicted component(s).