
Settings

Offices: All
Languages: en (the language of the search keywords)
Stemming: true
Single Family Member: false
Include NPL: false

Stemming reduces inflected words to their stem or root form. For example, the words fishing, fished, fish, and fisher are reduced to the root word fish, so a search for fisher returns all the different variations. "Single Family Member" returns only one member of a family of patents; "Include NPL" includes non-patent literature in the results.

Full Query

AI technique Machine Learning Unsupervised Learning


Analysis

1. 20220180975 METHODS AND SYSTEMS FOR DETERMINING GENE EXPRESSION PROFILES AND CELL IDENTITIES FROM MULTI-OMIC IMAGING DATA
US 09.06.2022
Int. Class G16B 40/30
G PHYSICS
16 INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
B BIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
40 ICT specially adapted for biostatistics; ICT specially adapted for bioinformatics-related machine learning or data mining, e.g. knowledge discovery or pattern finding
30 Unsupervised data analysis
Appl. No 17553691 Applicant The Broad Institute, Inc. Inventor Aviv Regev

The present disclosure relates to systems and methods for determining transcriptomic profiles from omics imaging data. The systems and methods train machine learning models on intrinsic and extrinsic features of a cell and/or tissue to define transcriptomic profiles of the cell and/or tissue. Applicants utilize a convolutional autoencoder to define cell subtypes from images of the cells.
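
The encode-then-decode bottleneck at the heart of any autoencoder can be sketched with a toy linear example (illustrative only, not the patent's convolutional architecture): a 2-D point is compressed to a 1-D code and reconstructed, and in the disclosure it is such compact codes that get clustered into cell subtypes.

```python
import math

# Assumed already-learned unit weight vector along the data's dominant
# direction; a trained (convolutional) autoencoder would learn this.
w = [1 / math.sqrt(2), 1 / math.sqrt(2)]

def encode(x):
    # Project the input onto the weight vector -> 1-D code.
    return sum(wi * xi for wi, xi in zip(w, x))

def decode(c):
    # Expand the 1-D code back into input space.
    return [wi * c for wi in w]

x = [3.0, 3.0]          # a point lying on the (1, 1) direction
code = encode(x)        # compact embedding of the "cell image"
recon = decode(code)    # reconstruction is exact for on-direction data
```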

2. WO/2023/059663 SYSTEMS AND METHODS FOR ASSESSMENT OF BODY FAT COMPOSITION AND TYPE VIA IMAGE PROCESSING
WO 13.04.2023
Int. Class A61B 5/00
A HUMAN NECESSITIES
61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
B DIAGNOSIS; SURGERY; IDENTIFICATION
5 Measuring for diagnostic purposes; Identification of persons
Appl. No PCT/US2022/045706 Applicant THE BROAD INSTITUTE, INC. Inventor KHERA, Amit
The subject matter disclosed herein relates to utilizing the silhouette of an individual to measure body fat volume and distribution. Particular examples relate to providing a system, a computer-implemented method, and a computer program product that utilize a binary outline, or silhouette, to predict the individual's fat depot volumes with machine learning models.

3. 20200342307 Swarm fair deep reinforcement learning
US 29.10.2020
Int. Class G06N 3/08
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3 Computing arrangements based on biological models
02 Neural networks
08 Learning methods
Appl. No 16395187 Applicant International Business Machines Corporation Inventor Aaron K. Baughman

Fair deep reinforcement learning is provided. A microstate of an environment and reaction of items in a plurality of microstates within the environment are observed after an agent performs an action in the environment. Semi-supervised training is utilized to determine bias weights corresponding to the action for the microstate of the environment and the reaction of the items in the plurality of microstates within the environment. The bias weights from the semi-supervised training are merged with non-bias weights using an artificial neural network. Over time, it is determined where bias is occurring in the semi-supervised training based on merging the bias weights with the non-bias weights in the artificial neural network. A deep reinforcement learning model that decreases reliance on the bias weights is generated based on determined bias to increase fairness.
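
The merge-and-decay idea in the abstract can be sketched numerically (an illustrative stand-in, not the patent's neural network): bias weights from semi-supervised training are combined with non-bias weights, and a reliance factor is decayed where bias is detected, shrinking the bias contribution.

```python
def merge_weights(bias_w, non_bias_w, reliance):
    """Convex combination; a smaller `reliance` means less bias influence."""
    return [reliance * b + (1 - reliance) * n
            for b, n in zip(bias_w, non_bias_w)]

def decay_reliance(reliance, rate=0.5):
    """Decrease reliance on the bias weights when bias is detected."""
    return reliance * rate

bias_w, non_bias_w = [1.0, -2.0], [0.2, 0.1]
r = 1.0
for _ in range(3):                    # three fairness updates
    r = decay_reliance(r)             # 1.0 -> 0.5 -> 0.25 -> 0.125
merged = merge_weights(bias_w, non_bias_w, r)
```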

4. 20210397895 INTELLIGENT LEARNING SYSTEM WITH NOISY LABEL DATA
US 23.12.2021
Int. Class G06K 9/62
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
9 Methods or arrangements for recognising patterns
62 Methods or arrangements for pattern recognition using electronic means
Appl. No 16946465 Applicant INTERNATIONAL BUSINESS MACHINES CORPORATION Inventor Yang SUN

Various embodiments provide machine learning with noisy label data in a computing environment using one or more processors in a computing system. A label corruption probability of noisy labels may be estimated for selected data from a dataset using temporal inconsistency in a machine learning model's predictions during a training operation in a neural network.
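
The temporal-inconsistency idea can be sketched in a few lines (a toy illustration with simulated prediction histories, not the patent's implementation): a sample whose predicted label keeps flipping from epoch to epoch is scored as more likely to carry a corrupted label.

```python
def temporal_inconsistency(pred_history):
    """Fraction of consecutive epochs where the prediction changed."""
    flips = sum(1 for a, b in zip(pred_history, pred_history[1:]) if a != b)
    return flips / max(len(pred_history) - 1, 1)

def estimate_corruption(histories):
    """Map each sample id to an inconsistency-based corruption score."""
    return {i: temporal_inconsistency(h) for i, h in histories.items()}

# Simulated per-epoch predicted labels for three samples; sample 2 is
# unstable across training, so its given label is flagged as suspect.
histories = {
    0: [1, 1, 1, 1, 1],
    1: [0, 0, 0, 0, 0],
    2: [1, 0, 1, 0, 1],
}
scores = estimate_corruption(histories)
noisy = [i for i, s in scores.items() if s > 0.5]
```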

5. 20210295045 Automatic makeup transfer using semi-supervised learning
US 23.09.2021
Int. Class G09G 5/00
G PHYSICS
09 EDUCATING; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
5 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
Appl. No 16822878 Applicant Adobe Inc. Inventor Yijun Li

The present disclosure relates to systems, computer-implemented methods, and non-transitory computer readable medium for automatically transferring makeup from a reference face image to a target face image using a neural network trained using semi-supervised learning. For example, the disclosed systems can receive, at a neural network, a target face image and a reference face image, where the target face image is selected by a user via a graphical user interface (GUI) and the reference face image has makeup. The systems transfer, by the neural network, the makeup from the reference face image to the target face image, where the neural network is trained to transfer the makeup from the reference face image to the target face image using semi-supervised learning. The systems output for display the makeup on the target face image.

6. 4163833 DEEP NEURAL NETWORK MODEL DESIGN ENHANCED BY REAL-TIME PROXY EVALUATION FEEDBACK
EP 12.04.2023
Int. Class G06N 3/04
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3 Computing arrangements based on biological models
02 Neural networks
04 Architecture, e.g. interconnection topology
Appl. No 22186944 Applicant INTEL CORP Inventor CUMMINGS DANIEL J
The present disclosure is related to artificial intelligence (AI), machine learning (ML), and Neural Architecture Search (NAS) technologies, and in particular, to Deep Neural Network (DNN) model engineering techniques that use proxy evaluation feedback. The DNN model engineering techniques discussed herein provide near real-time feedback on model performance via low-cost proxy scores without requiring continual training and/or validation cycles, iterations, epochs, etc. In conjunction with the proxy-based scoring, semi-supervised learning mechanisms are used to map proxy scores to various model performance metrics. Other embodiments may be described and/or claimed.

7. 20220027777 Generalized expectation maximization for semi-supervised learning
US 27.01.2022
Int. Class G06N 20/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
20 Machine learning
Appl. No 16935313 Applicant Oracle International Corporation Inventor Felix Schmidt

Techniques are described that extend supervised machine-learning algorithms for use with semi-supervised training. Random labels are assigned to unlabeled training data, and the data is split into k partitions. During a label-training iteration, each of these k partitions is combined with the labeled training data, and the combination is used to train a single instance of the machine-learning model. Each of these trained models is then used to predict labels for data points in the k−1 partitions of previously-unlabeled training data that were not used to train the model. Thus, every data point in the previously-unlabeled training data obtains k−1 predicted labels. For each data point, these labels are aggregated to obtain a composite label prediction for the data point. After the labels are determined via one or more label-training iterations, a machine-learning model is trained on the data with the resulting composite label predictions and on the labeled data set.
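
The partition-and-vote scheme above can be sketched end to end with a deliberately tiny model (a nearest-centroid classifier on 1-D points stands in for the abstract's unspecified machine-learning model; all data and names here are illustrative):

```python
import random
from collections import Counter

def centroid_fit(points):
    """Nearest-centroid 'model': the mean of each class."""
    groups = {}
    for x, y in points:
        groups.setdefault(y, []).append(x)
    return {y: sum(v) / len(v) for y, v in groups.items()}

def centroid_predict(model, x):
    return min(model, key=lambda y: abs(model[y] - x))

def composite_labels(labeled, unlabeled, k=3, seed=0):
    rng = random.Random(seed)
    classes = sorted({y for _, y in labeled})
    # 1. Assign random labels to the unlabeled training data.
    randomly = [(x, rng.choice(classes)) for x in unlabeled]
    # 2. Split it into k partitions.
    parts = [randomly[i::k] for i in range(k)]
    votes = {x: [] for x in unlabeled}
    for i in range(k):
        # 3. Train one model instance on labeled data + partition i.
        model = centroid_fit(labeled + parts[i])
        # 4. Predict labels for points in the other k-1 partitions.
        for j in range(k):
            if j != i:
                for x, _ in parts[j]:
                    votes[x].append(centroid_predict(model, x))
    # 5. Aggregate each point's k-1 predictions into a composite label.
    return {x: Counter(v).most_common(1)[0][0] for x, v in votes.items()}

labeled = [(0.0, "a"), (1.0, "a"), (9.0, "b"), (10.0, "b")]
unlabeled = [0.5, 0.8, 9.5, 9.9, 0.2, 9.1]
labels = composite_labels(labeled, unlabeled)
```

Because the labeled anchors dominate each centroid, the composite labels recover the underlying clusters even though the unlabeled points start with random labels.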

8. 20180165554 Semisupervised autoencoder for sentiment analysis
US 14.06.2018
Int. Class G06K 9/62
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
9 Methods or arrangements for recognising patterns
62 Methods or arrangements for pattern recognition using electronic means
Appl. No 15838000 Applicant The Research Foundation for The State University of New York Inventor Zhongfei Zhang

A method of modelling data, comprising: training an objective function of a linear classifier, based on a set of labeled data, to derive a set of classifier weights; defining a posterior probability distribution on the set of classifier weights of the linear classifier; approximating a marginalized loss function for an autoencoder as a Bregman divergence, based on the posterior probability distribution on the set of classifier weights learned from the linear classifier; and classifying unlabeled data using the autoencoder according to the marginalized loss function.
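
The Bregman divergence that the abstract uses to approximate the marginalized loss has the general form D_phi(p, q) = phi(p) − phi(q) − ⟨∇phi(q), p − q⟩; choosing phi(x) = ||x||² recovers the squared Euclidean distance. A minimal numeric sketch (the choice of phi here is illustrative, not the patent's):

```python
def bregman(phi, grad_phi, p, q):
    """Bregman divergence D_phi(p, q) for a convex generator phi."""
    inner = sum(g * (a - b) for g, a, b in zip(grad_phi(q), p, q))
    return phi(p) - phi(q) - inner

phi = lambda v: sum(x * x for x in v)        # phi(x) = ||x||^2
grad_phi = lambda v: [2 * x for x in v]      # grad phi(x) = 2x

p, q = [1.0, 2.0], [0.0, 0.0]
d = bregman(phi, grad_phi, p, q)             # equals ||p - q||^2 = 5.0
```

As expected for any Bregman divergence, D_phi(p, p) = 0, and with this quadratic generator the divergence is symmetric in p and q.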

9. 3786855 AUTOMATED DATA PROCESSING AND MACHINE LEARNING MODEL GENERATION
EP 03.03.2021
Int. Class G06N 5/00
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
5 Computing arrangements using knowledge-based models
Appl. No 19290076 Applicant ACCENTURE GLOBAL SOLUTIONS LTD Inventor HIGGINS LUKE
A device may obtain first data relating to a machine learning model. The device may pre-process the first data to alter the first data to generate second data. The device may process the second data to select a set of features from the second data. The device may analyze the set of features to evaluate a plurality of types of machine learning models with respect to the set of features. The device may select a particular type of machine learning model for the set of features based on analyzing the set of features to evaluate the plurality of types of machine learning models. The device may tune a set of parameters of the particular type of machine learning model to train the machine learning model. The device may receive third data for prediction. The device may provide a prediction using the particular type of machine learning model.
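
The evaluate-then-select step can be sketched with two toy candidate model types (a mean predictor and a least-squares line; both are illustrative stand-ins for the abstract's unspecified "plurality of types of machine learning models"):

```python
def mse(preds, ys):
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(ys)

def fit_mean(xs, ys):
    """Candidate type 1: predict the mean of y, ignoring x."""
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_line(xs, ys):
    """Candidate type 2: 1-D least-squares line fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    b = my - slope * mx
    return lambda x: slope * x + b

def select_model(xs, ys, candidates):
    """Evaluate each candidate type on the features; pick the best."""
    scores = {}
    for name, fit in candidates.items():
        model = fit(xs, ys)
        scores[name] = mse([model(x) for x in xs], ys)
    return min(scores, key=scores.get)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]        # perfectly linear: y = 2x + 1
best = select_model(xs, ys, {"mean": fit_mean, "line": fit_line})
```

On linear data the line-fit candidate wins; in the patent's device, the selected type would then have its parameters tuned before serving predictions.
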

10. 20200272947 Orchestrator for machine learning pipeline
US 27.08.2020
Int. Class G06F 15/173
G PHYSICS
06 COMPUTING; CALCULATING OR COUNTING
F ELECTRIC DIGITAL DATA PROCESSING
15 Digital computers in general; Data processing equipment in general
16 Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
163 Interprocessor communication
173 using an interconnection network, e.g. matrix, shuffle, pyramid, star or snowflake
Appl. No 16284291 Applicant SAP SE Inventor Lukas Carullo

Provided is a system and method for training and validating models in a machine learning pipeline for failure mode analytics. The machine learning pipeline may include an unsupervised training phase, a validation phase and a supervised training and scoring phase. In one example, the method may include receiving an identification of a machine learning model, executing a machine learning pipeline comprising a plurality of services which train the machine learning model via at least one of an unsupervised learning process and a supervised learning process, the machine learning pipeline being controlled by an orchestration module that triggers ordered execution of the services, and storing the trained machine learning model output from the machine learning pipeline in a database associated with the machine learning pipeline.
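
The orchestration module's role, triggering ordered execution of the pipeline's services, can be sketched as follows (service names and the dict-based "model" are illustrative, not SAP's actual interfaces):

```python
class Orchestrator:
    """Triggers ordered execution of registered pipeline services,
    threading each service's output into the next."""

    def __init__(self):
        self.services = []
        self.log = []

    def register(self, name, fn):
        self.services.append((name, fn))

    def run(self, model):
        for name, fn in self.services:   # strictly ordered execution
            model = fn(model)
            self.log.append(name)
        return model

orc = Orchestrator()
orc.register("unsupervised_training", lambda m: {**m, "clusters": 3})
orc.register("validation", lambda m: {**m, "validated": True})
orc.register("supervised_scoring", lambda m: {**m, "score": 0.9})
trained = orc.run({"name": "failure_mode_model"})
```

The final `trained` dict carries the accumulated results of all three phases, mirroring the unsupervised-training, validation, and supervised-scoring stages named in the abstract.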