
WO2020091928 - SCENE ANNOTATION USING MACHINE LEARNING

Publication Number WO/2020/091928
Publication Date 07.05.2020
International Application No. PCT/US2019/053744
International Filing Date 30.09.2019
IPC
H04N 5/445 (2011.01)
  H    ELECTRICITY
  04   ELECTRIC COMMUNICATION TECHNIQUE
  N    PICTORIAL COMMUNICATION, e.g. TELEVISION
  5    Details of television systems
  44   Receiver circuitry
  445  for displaying additional information
G06N 20/00 (2019.01)
  G    PHYSICS
  06   COMPUTING; CALCULATING OR COUNTING
  N    COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  20   Machine learning
CPC
A63F 13/60
  A    HUMAN NECESSITIES
  63   SPORTS; GAMES; AMUSEMENTS
  F    CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
  13   Video games, i.e. games using an electronically generated display having two or more dimensions
  60   Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
G06F 16/65
  G    PHYSICS
  06   COMPUTING; CALCULATING; COUNTING
  F    ELECTRIC DIGITAL DATA PROCESSING
  16   Information retrieval; Database structures therefor; File system structures therefor
  60   of audio data
  65   Clustering; Classification
G06K 9/00671
  G    PHYSICS
  06   COMPUTING; CALCULATING; COUNTING
  K    RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
  9    Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
  00624  Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
  00664  Recognising scenes such as could be captured by a camera operated by a pedestrian or robot, including objects at substantially different ranges from the camera
  00671  for providing information about objects in the scene to a user, e.g. as in augmented reality applications
G06N 3/0454
  G    PHYSICS
  06   COMPUTING; CALCULATING; COUNTING
  N    COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  3    Computer systems based on biological models
  02   using neural network models
  04   Architectures, e.g. interconnection topology
  0454 using a combination of multiple neural nets
G07F 17/32
  G    PHYSICS
  07   CHECKING-DEVICES
  F    COIN-FREED OR LIKE APPARATUS
  17   Coin-freed apparatus for hiring articles; Coin-freed facilities or services
  32   for games, toys, sports or amusements, e.g. casino games, online gambling or betting
G10L 13/00
  G    PHYSICS
  10   MUSICAL INSTRUMENTS; ACOUSTICS
  L    SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
  13   Speech synthesis; Text to speech systems
Applicants
  • SONY INTERACTIVE ENTERTAINMENT INC. [JP]/[JP]
  • KRISHNAMURTHY, Sudha [US]/[US]
Inventors
  • KRISHNAMURTHY, Sudha
  • ADAMS, Justice
  • JATI, Arindam
  • OMOTE, Masanori
  • ZHENG, Jian
Agents
  • ISENBERG, Joshua
Priority Data
16/177,214   31.10.2018   US
Publication Language English (EN)
Filing Language English (EN)
Designated States
Title
(EN) SCENE ANNOTATION USING MACHINE LEARNING
(FR) ANNOTATION DE SCÈNE PAR APPRENTISSAGE AUTOMATIQUE
Abstract
(EN)
A system enhances existing audio-visual content with audio describing the setting of the visual content. A scene annotation module classifies scene elements from an image frame received from a host system and generates a caption describing the scene elements. A text to speech synthesis module may then convert the caption to synthesized speech data describing the scene elements within the image frame.
(FR)
Un système améliore un contenu audiovisuel existant avec une description parlée du réglage du contenu visuel. Un module d’annotation de scène classifie des éléments d’une scène à partir d’une trame d’image reçue en provenance d’un système hôte et génère un sous-titre décrivant les éléments de la scène. Un modèle de synthèse de textes en paroles peut ensuite convertir le sous-titre en des données vocales de synthèse décrivant les éléments de la scène dans la trame d’image.
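The abstract describes a three-stage pipeline: a scene annotation module classifies elements in an image frame, a caption is generated from those elements, and a text-to-speech module converts the caption to speech data. A minimal sketch of that data flow is below; the function names, the rule-based classifier stub, and the byte-encoding TTS stand-in are illustrative assumptions, not the patented implementation (which, per the CPC codes, would use neural networks and real speech synthesis).

```python
from dataclasses import dataclass
from typing import List


@dataclass
class SceneElement:
    label: str          # e.g. "castle", "dragon"
    confidence: float   # classifier score in [0, 1]


def classify_scene_elements(frame: bytes) -> List[SceneElement]:
    """Stand-in for the scene annotation module's classifier.

    A real system would run the frame through one or more neural
    networks (cf. G06N 3/0454); this stub returns fixed elements so the
    pipeline is runnable.
    """
    return [SceneElement("castle", 0.92), SceneElement("dragon", 0.87)]


def generate_caption(elements: List[SceneElement],
                     threshold: float = 0.5) -> str:
    """Build a natural-language caption from classified scene elements."""
    labels = [e.label for e in elements if e.confidence >= threshold]
    if not labels:
        return "No recognizable scene elements."
    return "The scene contains " + " and ".join(labels) + "."


def synthesize_speech(caption: str) -> bytes:
    """Stand-in for the text-to-speech module (cf. G10L 13/00).

    A real module would emit audio samples; this stub encodes the
    caption so the caption-to-speech-data hand-off is visible.
    """
    return caption.encode("utf-8")


# End-to-end: image frame -> scene elements -> caption -> speech data.
frame = b"\x00" * 16  # placeholder frame received from the host system
caption = generate_caption(classify_scene_elements(frame))
speech_data = synthesize_speech(caption)
```

The key design point visible even in this sketch is the clean interface between stages: the caption is the only artifact passed from annotation to speech synthesis, so either module could be swapped out independently.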
Latest bibliographic data on file with the International Bureau