Offices: all; Languages: en; Stemming: true; Single Family Member: false; Include NPL: false

Full Query

AI application field Business ECommerce

1. 20220180975 - METHODS AND SYSTEMS FOR DETERMINING GENE EXPRESSION PROFILES AND CELL IDENTITIES FROM MULTI-OMIC IMAGING DATA
US, 09.06.2022
Int. Class G16B 40/30
G PHYSICS
16 INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
B BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
40 ICT specially adapted for biostatistics; ICT specially adapted for bioinformatics-related machine learning or data mining, e.g. knowledge discovery or pattern finding
30 Unsupervised data analysis
Appl. No. 17553691; Applicant: The Broad Institute, Inc.; Inventor: Aviv Regev

The present disclosure relates to systems and methods for determining transcriptomic profiles from omics imaging data. The systems and methods train machine learning models on intrinsic and extrinsic features of a cell and/or tissue to define transcriptomic profiles of the cell and/or tissue. Applicants utilize a convolutional autoencoder to define cell subtypes from images of the cells.
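
The autoencoder principle behind this abstract can be miniaturized: an encoder compresses each feature vector to a low-dimensional latent code and a decoder reconstructs it, with both trained on reconstruction error. The sketch below is a hedged stand-in (a tiny linear autoencoder in pure Python, not the patent's convolutional architecture); all data and hyperparameters are invented.

```python
import random

# Minimal linear autoencoder: encode a 3-feature vector to a 1-D latent
# code z, decode back, and train both weight vectors by stochastic
# gradient descent on the squared reconstruction error.

def train_autoencoder(data, dim, epochs=1000, lr=0.005, seed=0):
    rng = random.Random(seed)
    w_enc = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
    w_dec = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
    for _ in range(epochs):
        for x in data:
            z = sum(w_enc[i] * x[i] for i in range(dim))   # encode
            recon = [w_dec[i] * z for i in range(dim)]     # decode
            err = [recon[i] - x[i] for i in range(dim)]
            dz = sum(err[i] * w_dec[i] for i in range(dim))
            for i in range(dim):                           # gradient steps
                w_dec[i] -= lr * err[i] * z
                w_enc[i] -= lr * dz * x[i]
    return w_enc, w_dec

# Rank-1 toy "imaging features": every vector is a multiple of [1, 2, 3],
# so a single latent dimension can represent them exactly.
data = [[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [0.5, 1.0, 1.5]]
w_enc, w_dec = train_autoencoder(data, 3)
```

In the patent's setting the latent codes would then be clustered to define cell subtypes; here each sample's code is just one number.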

2. 12112752 - Cohort determination in natural language processing
US, 08.10.2024
Int. Class G10L 15/22
G PHYSICS
10 MUSICAL INSTRUMENTS; ACOUSTICS
L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
15 Speech recognition
22 Procedures used during a speech recognition process, e.g. man-machine dialog
Appl. No. 17688279; Applicant: Amazon Technologies, Inc.; Inventor: Rahul Gupta

Devices and techniques are generally described for cohort determination in natural language processing. In various examples, a first natural language input to a natural language processing system may be determined. The first natural language input may be associated with a first account identifier. A first machine learning model may determine first data representing one or more words of the first natural language input. A second machine learning model may determine second data representing one or more acoustic characteristics of the first natural language input. Third data may be determined, the third data including a predicted performance for processing the first natural language input by the natural language processing system. The third data may be determined based on the first data and the second data.
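
The two-branch pipeline in this abstract can be miniaturized: one branch scores the words of an utterance, a second scores its acoustics, and a combiner predicts how well the system will handle the input. Everything below (the vocabulary, the SNR squashing, the combination weights) is a hypothetical stand-in for the trained models the abstract describes.

```python
# Toy sketch: a word-level score and an acoustic score are combined
# into a predicted-performance estimate for an NLP system.

KNOWN_WORDS = {"play", "music", "stop", "volume"}  # invented vocabulary

def word_features(text):
    # fraction of tokens the (hypothetical) system saw in training
    tokens = text.lower().split()
    return sum(t in KNOWN_WORDS for t in tokens) / len(tokens)

def acoustic_features(snr_db):
    # squash signal-to-noise ratio into [0, 1]
    return max(0.0, min(1.0, snr_db / 30.0))

def predicted_performance(text, snr_db, w_text=0.6, w_audio=0.4):
    # "third data": weighted combination of both branches
    return w_text * word_features(text) + w_audio * acoustic_features(snr_db)

score = predicted_performance("play music", snr_db=24.0)
```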

3. 20220308943 - System and AI pattern model for actionable alerts for events within a ChatOps platform
US, 29.09.2022
Int. Class G06F 9/54
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
F ELECTRIC DIGITAL DATA PROCESSING
9 Arrangements for program control, e.g. control units
06 using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
46 Multiprogramming arrangements
54 Interprogram communication
Appl. No. 17210853; Applicant: KYNDRYL, INC.; Inventor: Raghuram Srinivasan

In an approach for building a machine learning model that predicts the appropriate action to resolve a malfunction or system error, a processor receives an alert that a malfunction or a system error has occurred. A processor creates a workspace on a ChatOps platform integrated with a chatbot and one or more tools. A processor inputs data relating to the alert in a natural language format. A processor processes the data using a natural language processing algorithm. Responsive to determining a pre-set threshold for outputting the appropriate action is not met, a processor establishes a conversation between two or more support service agents in the workspace. A processor monitors the conversation using the natural language processing algorithm. A processor analyzes a transcript of the conversation using text analytics or pattern matching. A processor creates and trains a machine learning model to predict the appropriate action in future iterations.
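
The triage step above (try to resolve the alert automatically, otherwise escalate to a support conversation) reduces to a threshold test. The symptom table, the crude keyword-overlap score, and the threshold below are invented for illustration; the patent's approach uses NLP and a trained model, not keyword matching.

```python
# Hypothetical mapping from known symptoms to resolution actions.
KNOWN_ACTIONS = {
    "disk full": "expand volume",
    "connection timeout": "restart service",
}

def resolve(alert_text, threshold=0.5):
    """Return ("auto", action) if a known symptom clears the threshold,
    otherwise ("escalate", ...) to open a support conversation."""
    words = set(alert_text.lower().split())
    best_action, best_score = None, 0.0
    for symptom, action in KNOWN_ACTIONS.items():
        symptom_words = set(symptom.split())
        score = len(words & symptom_words) / len(symptom_words)
        if score > best_score:
            best_action, best_score = action, score
    if best_score >= threshold:
        return ("auto", best_action)
    return ("escalate", "open support conversation")
```

In the patent's flow, the escalated conversation would itself be monitored and mined to train a model that handles the alert automatically next time.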

4. 20200311520 - Training machine learning model
US, 01.10.2020
Int. Class G06T 7/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
7 Image analysis
Appl. No. 16369135; Applicant: International Business Machines Corporation; Inventor: Shiwan Zhao

Techniques are provided for training a machine learning model. According to one aspect, training data is received by one or more processing units. The machine learning model is trained based on the training data, wherein the training comprises: optimizing the machine learning model based on stochastic gradient descent (SGD) by adding a dynamic noise to a gradient of a model parameter of the machine learning model calculated by the SGD.
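
The core training idea, SGD with a dynamic noise term added to each computed gradient, fits in a few lines. The quadratic objective and the 1/(1+t) decay schedule below are illustrative assumptions; the abstract does not specify either.

```python
import random

# SGD on a single parameter w minimizing (w - 3)^2, with a noise term
# added to each gradient whose scale decays as training proceeds.

def noisy_sgd(steps=2000, lr=0.05, noise0=1.0, seed=0):
    rng = random.Random(seed)
    w = 0.0
    for t in range(steps):
        grad = 2.0 * (w - 3.0)             # gradient of (w - 3)^2
        noise_scale = noise0 / (1.0 + t)   # "dynamic" (decaying) noise
        grad += rng.gauss(0.0, noise_scale)
        w -= lr * grad
    return w

w = noisy_sgd()
```

Early in training the noise dominates and perturbs the trajectory; as the scale decays, the updates approach plain SGD and the parameter settles near the minimizer.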

5. 2017300259 - Distributed machine learning systems, apparatus, and methods
AU, 25.01.2018
Int. Class G06N 99/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
99 Subject matter not provided for in other groups of this subclass
Appl. No. 2017300259; Applicant: Nant Holdings IP, LLC; Inventor: Benz, Stephen Charles

A distributed, online machine learning system is presented. Contemplated systems include many private data servers, each having local private data. Researchers can request that relevant private data servers train implementations of machine learning algorithms on their local private data without requiring de-identification of the private data or without exposing the private data to unauthorized computing systems. The private data servers also generate synthetic or proxy data according to the data distributions of the actual data. The servers then use the proxy data to train proxy models. When the proxy models are sufficiently similar to the trained actual models, the proxy data, proxy model parameters, or other learned knowledge can be transmitted to one or more non-private computing devices. The learned knowledge from many private data servers can then be aggregated into one or more trained global models without exposing private data.
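
A minimal sketch of the proxy-data round trip, under invented assumptions: each private server fits a simple distribution (here a normal) to its local data, samples synthetic proxy records, trains a trivial "model" (a mean estimate) on them, and shares only that parameter. An aggregator averages the shared parameters into a global model; the distributions, the mean-estimator model, and the averaging rule are all illustrative stand-ins, not the patent's actual protocol.

```python
import random
import statistics

def proxy_parameters(private_data, n_proxy=1000, seed=0):
    """Fit a normal to local data, sample proxy data, train a proxy
    'model' (a mean estimate) on it, and return only its parameter."""
    rng = random.Random(seed)
    mu = statistics.mean(private_data)
    sigma = statistics.stdev(private_data)
    proxy = [rng.gauss(mu, sigma) for _ in range(n_proxy)]  # synthetic data
    return statistics.mean(proxy)      # nothing private leaves the server

def aggregate(params):
    # global model = average of the shared proxy parameters
    return sum(params) / len(params)

server_a = [1.0, 2.0, 3.0, 2.5, 1.5]     # private to server A
server_b = [10.0, 11.0, 9.0, 10.5, 9.5]  # private to server B
global_param = aggregate([proxy_parameters(server_a),
                          proxy_parameters(server_b, seed=1)])
```

The raw lists `server_a` and `server_b` never leave their servers; only the proxy-trained parameter is transmitted.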

6. WO/2018/017467 - DISTRIBUTED MACHINE LEARNING SYSTEMS, APPARATUS, AND METHODS
WO, 25.01.2018
Int. Class G06N 99/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
99 Subject matter not provided for in other groups of this subclass
Appl. No. PCT/US2017/042356; Applicant: NANTOMICS, LLC; Inventor: SZETO, Christopher

A distributed, online machine learning system is presented. Contemplated systems include many private data servers, each having local private data. Researchers can request that relevant private data servers train implementations of machine learning algorithms on their local private data without requiring de-identification of the private data or without exposing the private data to unauthorized computing systems. The private data servers also generate synthetic or proxy data according to the data distributions of the actual data. The servers then use the proxy data to train proxy models. When the proxy models are sufficiently similar to the trained actual models, the proxy data, proxy model parameters, or other learned knowledge can be transmitted to one or more non-private computing devices. The learned knowledge from many private data servers can then be aggregated into one or more trained global models without exposing private data.

7. 20140180975 - INSTANCE WEIGHTED LEARNING MACHINE LEARNING MODEL
US, 26.06.2014
Int. Class G06N 99/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
99 Subject matter not provided for in other groups of this subclass
Appl. No. 13725653; Applicant: INSIDESALES.COM, INC.; Inventor: Martinez Tony Ramon

An instance weighted learning (IWL) machine learning model. In one example embodiment, a method of employing an IWL machine learning model to train a classifier may include determining a quality value that should be associated with each machine learning training instance in a temporal sequence of reinforcement learning machine learning training instances, associating the corresponding determined quality value with each of the machine learning training instances, and training a classifier using each of the machine learning training instances. Each of the machine learning training instances includes a state-action pair and is weighted during the training based on its associated quality value using a weighting factor that weights different quality values differently such that the classifier learns more from a machine learning training instance with a higher quality value than from a machine learning training instance with a lower quality value.
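
The weighting mechanism can be shown with a toy classifier: each (state, action) training instance carries a quality value, and the learner accumulates quality-weighted votes, so a single high-quality instance can outweigh several low-quality ones. The states, actions, and quality values below are invented; the patent's classifier would be a trained model, not a vote table.

```python
from collections import defaultdict

def train_weighted(instances):
    """instances: iterable of (state, action, quality) triples.
    Returns a state -> best-action map using quality-weighted votes."""
    votes = defaultdict(lambda: defaultdict(float))
    for state, action, quality in instances:
        votes[state][action] += quality        # weight by quality value
    return {s: max(a, key=a.get) for s, a in votes.items()}

instances = [
    ("lead_new", "call", 0.9),    # one high-quality instance
    ("lead_new", "email", 0.2),   # two low-quality instances
    ("lead_new", "email", 0.3),
    ("lead_old", "email", 0.8),
]
model = train_weighted(instances)
```

Note that an unweighted vote for "lead_new" would pick "email" (two instances versus one); the quality weighting flips the decision to "call", which is exactly the behavior the abstract describes.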

8. 2013364041 - Instance weighted learning machine learning model
AU, 09.07.2015
Int. Class G06F 15/18
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
F ELECTRIC DIGITAL DATA PROCESSING
15 Digital computers in general; Data processing equipment in general
18 in which a program is changed according to experience gained by the computer itself during a complete run; Learning machines
Appl. No. 2013364041; Applicant: InsideSales.com, Inc.; Inventor: Martinez, Tony Ramon

An instance weighted learning (IWL) machine learning model. In one example embodiment, a method of employing an IWL machine learning model to train a classifier may include determining a quality value that should be associated with each machine learning training instance in a temporal sequence of reinforcement learning machine learning training instances, associating the corresponding determined quality value with each of the machine learning training instances, and training a classifier using each of the machine learning training instances. Each of the machine learning training instances includes a state-action pair and is weighted during the training based on its associated quality value using a weighting factor that weights different quality values differently such that the classifier learns more from a machine learning training instance with a higher quality value than from a machine learning training instance with a lower quality value.

9. 10990645 - System and methods for performing automatic data aggregation
US, 27.04.2021
Int. Class G06N 3/08
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3 Computing arrangements based on biological models
02 Neural networks
08 Learning methods
Appl. No. 16127764; Applicant: Sophtron, Inc.; Inventor: Nanjuan Shi

Systems, apparatuses, and methods for automated data aggregation. In some embodiments, this is achieved by use of techniques such as natural language processing (NLP) and machine learning to enable the automation of data aggregation from websites without the use of pre-programmed scripts.

10. 20140052678 - Hierarchical based sequencing machine learning model
US, 20.02.2014
Int. Class G06E 1/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
E OPTICAL COMPUTING DEVICES
1 Devices for processing exclusively digital data
Appl. No. 13590000; Applicant: Martinez Tony Ramon; Inventor: Martinez Tony Ramon

A hierarchical based sequencing (HBS) machine learning model. In one example embodiment, a method of employing an HBS machine learning model to predict multiple interdependent output components of an MOD output decision may include determining an order for multiple interdependent output components of an MOD output decision. The method may also include sequentially training a classifier for each component in the selected order to predict the component based on an input and based on any previous predicted component(s).
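
The sequencing idea can be illustrated with hand-written stand-ins for the trained classifiers: the interdependent components of the MOD output decision are predicted in a fixed order, and each later predictor sees the input plus all earlier predictions. The lead fields, rules, and component names below are invented for illustration.

```python
def predict_channel(lead):
    # first component: depends only on the input
    return "call" if lead["has_phone"] else "email"

def predict_time(lead, channel):
    # second component: depends on the input AND the previous prediction
    return "morning" if channel == "call" else "anytime"

def predict_sequence(lead):
    """Predict interdependent output components in a fixed order,
    feeding each prediction into the next classifier."""
    channel = predict_channel(lead)
    time = predict_time(lead, channel)
    return {"channel": channel, "time": time}

decision = predict_sequence({"has_phone": True})
```

In the patent, each step would be a classifier trained sequentially in the selected order rather than a hand-written rule, but the data flow (input plus previous components) is the same.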