Settings

Offices: all
Languages: en
Stemming: true
Single Family Member: false
Include NPL: false
RSS feed can only be generated if you have a WIPO account

Save query

A private query is visible only to you when you are logged in and cannot be used in RSS feeds.

Query Tree

Refine Options

Offices
All
Specify the language of your search keywords
Stemming reduces inflected words to their stem or root form.
For example, the words fishing, fished, fish, and fisher are reduced to the root word fish, so a search for fisher returns all the different variations.
Returns only one member of a family of patents
Include Non-Patent literature in results
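The stemming behaviour described above can be sketched with a minimal suffix-stripping function. This is a crude, hypothetical stand-in for whatever stemmer PATENTSCOPE actually uses, shown only to illustrate why the inflected forms all match:

```python
def simple_stem(word: str) -> str:
    """Strip common English inflectional suffixes (crude Porter-style sketch)."""
    for suffix in ("ing", "ed", "er", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

# All inflected forms reduce to the same stem, so a search for any
# one of them matches the others.
words = ["fishing", "fished", "fisher", "fish"]
stems = {simple_stem(w) for w in words}
print(stems)  # {'fish'}
```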

Full Query

AI technique Probabilistic Reasoning


Analysis

1. 2020102708: STUDENT PARTICIPATION AND PERFORMANCE PREDICTION ANALYSIS TECHNIQUE DURING ONLINE CLASSES USING DATA MINING
AU 05.11.2020
Int.Class G06Q 50/20
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
Q: INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
50: Information and communication technology specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
10: Services
20: Education
Appl.No 2020102708 Applicant Garg, Sharvan Kumar DR Inventor Garg, Sharvan Kumar
As online classes are becoming widespread, it is crucial to predict student performance using various techniques. Educational data mining and learning analytics are the two main factors to analyse; the analysis can provide solutions based on the institutional view, the instructor's view, and the dataset that arrives. Prediction is possible using methods such as decision trees, neural networks, naïve Bayes, k-nearest neighbour and support vector machines. The decision tree is the simplest predictor for both small and large data, and it is easy to convert into IF-THEN rules. A neural network predicts regardless of dependent or independent variables, handles psychometric data and GPAs well, and yields good accuracy. Naïve Bayes also makes a good comparative prediction of student performance. K-nearest neighbour produces good accuracy and predicts faster than the other algorithms. The support vector machine is mainly used for classification but also predicts, and is faster than the other techniques. In addition, tools such as Probabilistic Soft Logic (PSL), logistic regression, ID3 and the Classification and Regression Tree (CART) algorithm can be used to analyse the data and produce accurate results.
Drawings: Figure 1: Overall architecture for gathering information
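The abstract's point that a decision tree converts readily to IF-THEN rules can be illustrated with a minimal sketch. The tree, features and thresholds below are invented for illustration and are not taken from the patent:

```python
# Minimal hand-built decision tree for a hypothetical student-performance
# task. A node is (feature, threshold, left, right); a leaf is a label.
tree = ("attendance", 0.75,
        ("quiz_avg", 0.5, "fail", "pass"),   # low-attendance branch
        "pass")                              # high-attendance branch

def predict(node, sample):
    if isinstance(node, str):                # leaf: class label
        return node
    feature, threshold, left, right = node
    branch = left if sample[feature] < threshold else right
    return predict(branch, sample)

def to_rules(node, conditions=()):
    """Flatten the tree into the IF-THEN rules the abstract mentions."""
    if isinstance(node, str):
        cond = " AND ".join(conditions) or "TRUE"
        return [f"IF {cond} THEN {node}"]
    feature, threshold, left, right = node
    return (to_rules(left, conditions + (f"{feature} < {threshold}",)) +
            to_rules(right, conditions + (f"{feature} >= {threshold}",)))

print(predict(tree, {"attendance": 0.6, "quiz_avg": 0.8}))  # pass
for rule in to_rules(tree):
    print(rule)
```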
2. WO/2022/101452: ARCHITECTURE FOR EXPLAINABLE REINFORCEMENT LEARNING
WO 19.05.2022
Int.Class G06N 3/08
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3: Computing arrangements based on biological models
02: Neural networks
08: Learning methods
Appl.No PCT/EP2021/081600 Applicant UMNAI LIMITED Inventor DALLI, Angelo
An exemplary embodiment may provide an explainable reinforcement learning system. Explanations may be incorporated into an exemplary reinforcement learning agent/model or a corresponding environmental model. The explanations may be incorporated into an agent's state and/or action space. An explainable Bellman equation may implement an explainable state and explainable action as part of an explainable reward function. An explainable reinforcement learning induction method may implement a dataset to provide a white-box model which mimics a black-box reinforcement learning system. An explainable generative adversarial imitation learning model may implement an explainable generative adversarial network to train the occupancy measure of a policy and may generate multiple levels of explanations. Explainable reinforcement learning may be implemented on a quantum computing system using an embodiment of an explainable Bellman equation.
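As a rough illustration of the idea of attaching explanations to a Bellman-style update, here is a toy tabular Q-learning step that records a textual explanation alongside each value update. The states, actions, reward and hyperparameters are invented; this is not the patent's actual formulation:

```python
# Sketch: a tabular Q-update that carries a textual explanation for each
# updated (state, action) pair, loosely illustrating an "explainable
# Bellman equation".
def explainable_q_update(Q, E, state, action, reward, next_state,
                         alpha=0.5, gamma=0.9):
    best_next = max(Q[next_state].values(), default=0.0)
    target = reward + gamma * best_next          # Bellman target
    Q[state][action] += alpha * (target - Q[state][action])
    E[(state, action)] = (f"updated toward reward {reward} plus "
                          f"{gamma} * best next value {best_next:.2f}")
    return Q[state][action]

Q = {"s0": {"a": 0.0}, "s1": {"a": 1.0}}   # value table
E = {}                                     # explanation table
v = explainable_q_update(Q, E, "s0", "a", reward=1.0, next_state="s1")
print(round(v, 2))  # 0.95
print(E[("s0", "a")])
```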
3. 3180/MUM/2011: METHOD AND SYSTEM FOR RECOGNIZING ANCIENT INSCRIBED PALI CHARACTERS USING FUZZY FILTERS
IN 27.01.2012
Int.Class G06K 9/00
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
9: Methods or arrangements for recognising patterns
Appl.No 3180/MUM/2011 Applicant ANUPAMA D. SAKHARE Inventor ANUPAMA D. SAKHARE
The present invention is a technical solution to the problem of character recognition caused by the different versions of the inscribed ancient Pali text corpus found on rock plates. It provides a method and a system for recognizing ancient inscribed Pali language characters using the computational-linguistic method of a phonological knowledge base, as used in language understanding, together with a neuro-fuzzy soft computing approach to recognize characters accurately. Three algorithms are used for processing: the CANFIS system, the Color Recipe Classifier system and the Fuzzy Controller algorithms. Each character is identified, and positional features are used for identification and character recognition.
4. WO/2022/101515: METHOD FOR AN EXPLAINABLE AUTOENCODER AND AN EXPLAINABLE GENERATIVE ADVERSARIAL NETWORK
WO 19.05.2022
Int.Class G06N 3/04
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3: Computing arrangements based on biological models
02: Neural networks
04: Architecture, e.g. interconnection topology
Appl.No PCT/EP2021/081899 Applicant UMNAI LIMITED Inventor DALLI, Angelo
An exemplary embodiment provides an autoencoder which is explainable. An exemplary explainable autoencoder may explain the degree to which each feature of the input attributed to the output of the system, which may be a compressed data representation. An exemplary embodiment may be used for classification, such as anomaly detection, as well as other scenarios where an explainable autoencoder is used as input to another machine learning system or when an explainable autoencoder is a component in an end-to-end deep learning architecture. An exemplary embodiment provides an explainable generative adversarial network that adds explainable generation, simulation and discrimination capabilities. The underlying architecture of an exemplary embodiment may be based on an explainable or interpretable neural network, allowing the underlying architecture to be a fully explainable white-box machine learning system.
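The notion of explaining the degree to which each input feature contributed to the output can be illustrated with a toy linear encoder: with a linear map, each feature's contribution is simply weight times value, so contributions sum exactly to the output. The weights and inputs below are invented:

```python
# Toy per-feature attribution for a linear "encoder" producing one
# compressed value. Real explainable autoencoders are far richer; this
# only shows the additive-attribution idea.
weights = [0.5, -1.0, 2.0]
x = [2.0, 1.0, 0.5]

contributions = [w * v for w, v in zip(weights, x)]
output = sum(contributions)
print(output, contributions)  # 1.0 [1.0, -1.0, 1.0]
```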
5. WO/2021/115929: XAI AND XNN CONVERSION
WO 17.06.2021
Int.Class G06N 3/04
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3: Computing arrangements based on biological models
02: Neural networks
04: Architecture, e.g. interconnection topology
Appl.No PCT/EP2020/084523 Applicant UMNAI LIMITED Inventor DALLI, Angelo
In an exemplary embodiment, a method for extracting a model from an existing machine learning model may be shown and described. In black-box models, transfer learning consists of transferring knowledge with the objective of learning new patterns. However, in an exemplary embodiment, transfer learning presents the concept of converting an explainable neural network into logically equivalent variants, which may not be possible with black-box neural networks, which typically consist of multiple fully-connected layers. The white-box nature of an exemplary XNN or XAI enables new ways of transferring knowledge with intelligent conversions of neural networks in ways that are impossible to do with a black-box model.
6. 20230281449: Architectures, systems and methods for program defined transaction system and decentralized cryptocurrency systems
US 07.09.2023
Int.Class G06N 3/08
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3: Computing arrangements based on biological models
02: Neural networks
08: Learning methods
Appl.No 18196466 Applicant MILESTONE ENTERTAINMENT, LLC Inventor Randall M. Katz

In one aspect, the invention comprises a system and method for control of a transaction state system utilizing a distributed ledger. First, the system and method includes an application plane layer adapted to receive instructions regarding operation of the transaction state system. Preferably, the application plane layer is coupled to the application plane layer interface. Second, a control plane layer is provided, the control plane layer including an adaptive control unit, such as a cognitive computing unit, artificial intelligence unit or machine-learning unit. Third, a data plane layer includes an input interface to receive data input from one or more data sources and to provide output coupled to a decentralized distributed ledger, the data plane layer is coupled to the control plane layer. Optionally, the system and method serve to implement a smart contract on a decentralized distributed ledger.
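The three-plane layering described in this abstract can be sketched roughly as follows. The rule-based control plane here is a trivial, invented stand-in for the cognitive/ML unit, and the in-memory list stands in for the decentralized distributed ledger:

```python
# Hypothetical sketch: an application plane submits instructions, a
# control plane decides, and a data plane records the outcome.
class ControlPlane:
    def decide(self, instruction):
        # stand-in for the adaptive/ML unit: approve small transfers only
        return "approve" if instruction["amount"] <= 100 else "reject"

class DataPlane:
    def __init__(self):
        self.ledger = []            # stand-in for the distributed ledger
    def record(self, entry):
        self.ledger.append(entry)

class ApplicationPlane:
    def __init__(self, control, data):
        self.control, self.data = control, data
    def submit(self, instruction):
        decision = self.control.decide(instruction)
        self.data.record({**instruction, "decision": decision})
        return decision

app = ApplicationPlane(ControlPlane(), DataPlane())
print(app.submit({"amount": 50}))   # approve
print(app.submit({"amount": 500}))  # reject
```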

7. 20180197111: Transfer learning and domain adaptation using distributable data models
US 12.07.2018
Int.Class G06F 17/00
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
F: ELECTRIC DIGITAL DATA PROCESSING
17: Digital computing or data processing equipment or methods, specially adapted for specific functions
Appl.No 15835436 Applicant Fractal Industries, Inc. Inventor Jason Crabtree

A system for transfer learning and domain adaptation using distributable data models is provided, comprising a network-connected distributable model configured to serve instances of a plurality of distributable models; and a directed computation graph module configured to receive at least an instance of at least one of the distributable models from the network-connected computing system, create a second dataset from machine learning performed by a transfer engine, train the instance of the distributable model with the second dataset, and generate an update report based at least in part by updates to the instance of the distributable model.
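As a rough sketch of the flow described (receive a model instance, train it on a second dataset, generate an update report), here is a toy single-weight model. The data, learning step and report format are invented for illustration only:

```python
# Hypothetical sketch: continue training an instance of a shared
# ("distributable") model on a second dataset, then report the change.
def train_instance(weight, dataset, lr=0.1):
    # one pass of gradient descent on squared error for y = w * x
    for x, y in dataset:
        weight -= lr * 2 * (weight * x - y) * x
    return weight

base_weight = 1.0                             # instance received over the network
second_dataset = [(1.0, 2.0), (2.0, 4.0)]     # produced by a transfer step
new_weight = train_instance(base_weight, second_dataset)
report = {"old": base_weight, "new": round(new_weight, 3),
          "delta": round(new_weight - base_weight, 3)}
print(report)
```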

8. 3340570: PROFILING CYBER THREATS DETECTED IN A TARGET ENVIRONMENT AND AUTOMATICALLY GENERATING ONE OR MORE RULE BASES FOR AN EXPERT SYSTEM USABLE TO PROFILE CYBER THREATS DETECTED IN A TARGET ENVIRONMENT
EP 27.06.2018
Int.Class H04L 29/06
H: ELECTRICITY
04: ELECTRIC COMMUNICATION TECHNIQUE
L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
29: Arrangements, apparatus, circuits or systems, not covered by a single one of groups H04L1/00-H04L27/00
02: Communication control; Communication processing
06: characterised by a protocol
Appl.No 18157284 Applicant CYBERLYTIC LTD Inventor LAIDLAW STUART
A method of automatically generating one or more rule bases for an expert system usable to profile cyber threats detected in a target environment, comprising the steps of: for each alert of a training set of alerts triggered by a potential cyber threat detected by an SIEM: retrieving captured packet data related to the alert; and extracting training threat data pertaining to a set of attributes from captured packet data triggering the alert; generating a predictive model of the level of risk posed by an alert based on attribute values for that alert by analysing the captured training threat data pertaining to the set of attributes; and generating a set of fuzzy rules based on the predictive model, said rules being usable at run time in a fuzzy logic engine to evaluate data pertaining to one or more of the extracted attributes of a detected cyber threat to determine values for one or more output variables indicative of a level of an aspect of risk attributable to the detected cyber threat.
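The run-time fuzzy evaluation step can be sketched with toy membership functions and rule weights. In a real system these would be derived from the trained predictive model as the abstract describes; here the attributes, ramps and weights are all invented:

```python
# Minimal fuzzy-rule sketch: two invented rules map attribute values of a
# detected threat to a risk score in [0, 1].
def membership_high(x, low=0.4, high=0.8):
    """Degree to which x counts as 'high' (linear ramp)."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def risk_score(payload_entropy, request_rate):
    # Rule 1: IF payload entropy is high THEN risk is high (weight 0.6)
    # Rule 2: IF request rate is high THEN risk is high (weight 0.4)
    return (0.6 * membership_high(payload_entropy) +
            0.4 * membership_high(request_rate))

print(round(risk_score(0.9, 0.6), 2))  # 0.8
```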
9. WO/2022/258775: AUTOMATIC XAI (AUTOXAI) WITH EVOLUTIONARY NAS TECHNIQUES AND MODEL DISCOVERY AND REFINEMENT
WO 15.12.2022
Int.Class G06N 3/12
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3: Computing arrangements based on biological models
12: using genetic models
Appl.No PCT/EP2022/065745 Applicant UMNAI LIMITED Inventor DALLI, Angelo
An exemplary model search may provide optimal explainable models based on a dataset. An exemplary embodiment may identify features from a training dataset, and may map feature costs to the identified features. The search space may be sampled to generate initial or seed candidates, which may be chosen based on one or more objectives and/or constraints. The candidates may be iteratively optimized until an exit condition is met. The optimization may be performed by an external optimizer. The external optimizer may iteratively apply constraints to the candidates to quantify a fitness level of each of the seed candidates. The fitness level may be based on the constraints and objectives. The candidates may be a set of data, or may be trained to form explainable models. The external optimizer may optimize the explainable models until the exit conditions are met.
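The iterate-until-exit search loop can be sketched as a toy evolutionary optimizer. The objective, constraint and mutation scheme below are invented, and simple elitist selection plays the role of the external optimizer:

```python
import random

# Toy evolutionary search: candidates are parameter vectors, "fitness"
# combines an objective with a penalty constraint, and the exit
# condition is a fixed generation budget.
random.seed(0)

def fitness(candidate):
    # objective: minimise sum of squares (optimum at all zeros);
    # constraint: penalise candidates whose first gene exceeds 1.0
    penalty = 10.0 if candidate[0] > 1.0 else 0.0
    return -sum(x * x for x in candidate) - penalty

def mutate(candidate):
    return [x + random.gauss(0, 0.3) for x in candidate]

population = [[random.uniform(-2, 2) for _ in range(3)] for _ in range(8)]
initial_best = max(map(fitness, population))

for generation in range(30):              # exit condition: fixed budget
    population.sort(key=fitness, reverse=True)
    parents = population[:4]              # elitism: keep the fittest
    population = parents + [mutate(random.choice(parents)) for _ in range(4)]

final_best = max(map(fitness, population))
print(final_best >= initial_best)  # True: elitism never loses the best
```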
10. WO/2023/143707: TRAINING A NEURAL NETWORK TO PERFORM A MACHINE LEARNING TASK
WO 03.08.2023
Int.Class G06N 3/04
G: PHYSICS
06: COMPUTING; CALCULATING OR COUNTING
N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
3: Computing arrangements based on biological models
02: Neural networks
04: Architecture, e.g. interconnection topology
Appl.No PCT/EP2022/051710 Applicant PETERSEN RESEARCH, LLC Inventor PETERSEN, Felix
A method, program and system for training a neural network to perform a machine learning task is provided. The method comprises receiving input data for the neural network. The method further comprises determining values for a plurality of hyperparameters of the neural network. The method further comprises building the neural network according to the hyper parameter values, wherein the neural network comprises a plurality of neurons. Each neuron includes a probability distribution for a plurality of logic operators, such that the neuron includes a corresponding probability for each of the logic operators. The method further comprises training the neural network according to the hyper parameter values and the input data by learning the probability distribution of each neuron. The method further comprises determining a logic operator of the plurality of logic operators for each neuron by selecting a value in the probability distribution.
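The idea of a neuron holding a probability distribution over logic operators can be sketched as follows. The probabilities are hand-set here, whereas the described method learns them during training; picking the argmax stands in for selecting a value in the distribution:

```python
# Sketch: a "neuron" is a probability distribution over logic operators;
# hardening the network picks the most probable operator per neuron.
OPERATORS = {
    "AND": lambda a, b: a and b,
    "OR":  lambda a, b: a or b,
    "XOR": lambda a, b: a != b,
}

def harden(neuron_probs):
    """Select the most probable logic operator for a neuron."""
    return max(neuron_probs, key=neuron_probs.get)

neuron = {"AND": 0.1, "OR": 0.2, "XOR": 0.7}   # hand-set, not trained
op_name = harden(neuron)
op = OPERATORS[op_name]
print(op_name, op(True, False))  # XOR True
```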