Offices: all | Languages | Stemming: false | Single Family Member: true | Include NPL: false

Refine Options

Offices: All
Languages: Specify the language of your search keywords.
Stemming: Reduces inflected words to their stem or root form. For example, the words "fishing", "fished", "fish", and "fisher" are reduced to the root word "fish", so a search for "fisher" returns all the different variations.
Single Family Member: Returns only one member of a family of patents.
Include NPL: Includes non-patent literature in the results.

Full Query

IC:B25J9/00

1. 20210316452 - INFORMATION PROCESSING DEVICE, ACTION DECISION METHOD AND PROGRAM
US, 14.10.2021
Int. Class B25J 9/16
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  9: Programme-controlled manipulators
  16: Programme controls
Appl. No. 17259693 | Applicant: SONY CORPORATION | Inventor: Natsuko OZAKI

More natural communication and interaction are enabled. The autonomous system (100) includes an action decision unit (140) that decides the action a drive mechanism is caused to perform, based on an attention level map (40) in which an attention level indicating the degree of attention for each position in a predetermined space is set.
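The abstract describes picking the robot's next action from an attention level map over positions in a space. As a rough, hypothetical illustration (not the patented method), the sketch below selects the grid cell with the highest attention level and turns it into a simple "orient toward that position" command; the map shape, cell size, and command format are all assumptions.

```python
import numpy as np

def decide_action(attention_map: np.ndarray, cell_size: float = 0.1):
    """Pick the most attention-worthy position and derive a simple action.

    attention_map: 2D grid of attention levels over a predefined space (assumed layout).
    cell_size: edge length of one grid cell in metres (assumed).
    """
    # Find the grid cell with the highest attention level.
    row, col = np.unravel_index(np.argmax(attention_map), attention_map.shape)
    target_xy = (col * cell_size, row * cell_size)

    # A toy "action": face the target position (a real system would plan motion,
    # gaze, speech, etc. for the drive mechanism).
    return {"type": "orient_towards", "target": target_xy,
            "attention": float(attention_map[row, col])}

if __name__ == "__main__":
    # Example: a 10 x 10 space with a single salient spot.
    amap = np.zeros((10, 10))
    amap[3, 7] = 0.9              # e.g. a person speaking at this position
    print(decide_action(amap))    # -> orient towards roughly (0.7, 0.3)
```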

2. 20210316461 - DATA GENERATION DEVICE, METHOD OF GENERATING DATA, AND REMOTE MANIPULATION SYSTEM
US, 14.10.2021
Int. Class B25J 9/16
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  9: Programme-controlled manipulators
  16: Programme controls
Appl. No. 17267288 | Applicant: KAWASAKI JUKOGYO KABUSHIKI KAISHA | Inventor: Yasuhiko HASHIMOTO

A data generation device generates at least a part of the data used to generate an image displayed on a display unit. The display unit displays, as a video, a workspace model modeled after an actual workspace. The workspace model includes a robot model modeled after an actual robot and a peripheral object model modeled after a given peripheral object around the actual robot. The robot model is created so as to operate according to the operator's operation of a manipulator. The data generation device includes a state information acquiring module configured to acquire state information indicative of a state of the peripheral object, and an estimating module configured to estimate, based on the state information, the state of the peripheral object after a given period of time from the current time point, and to generate the result of the estimation as peripheral-object model data used to create the peripheral object model displayed on the display unit.
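The core idea is predicting where a peripheral object will be after a given period and emitting that prediction as model data for the displayed workspace. Below is a minimal sketch of that estimation step only, assuming a constant-velocity model and a simple dict-based "model data" format; both are assumptions, not the patent's actual modules.

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    """State information for a peripheral object (position and velocity, assumed 2D)."""
    x: float
    y: float
    vx: float
    vy: float

def estimate_future_state(state: ObjectState, dt: float) -> dict:
    """Estimate the object's state dt seconds ahead (constant-velocity assumption)
    and package it as peripheral-object model data for the display side."""
    return {
        "position": (state.x + state.vx * dt, state.y + state.vy * dt),
        "velocity": (state.vx, state.vy),
        "horizon_s": dt,
    }

if __name__ == "__main__":
    # A conveyed item moving in +x at 0.2 m/s, predicted 0.5 s ahead.
    print(estimate_future_state(ObjectState(1.0, 0.0, 0.2, 0.0), dt=0.5))
```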

3. 20210316458 - CLOUD BASED COMPUTER-IMPLEMENTED VISUALLY PROGRAMMING METHOD AND SYSTEM FOR ROBOTIC MOTIONS IN CONSTRUCTION
US, 14.10.2021
Int. Class B25J 9/16
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  9: Programme-controlled manipulators
  16: Programme controls
Appl. No. 17218653 | Applicant: SMART BUILDING TECH CO., LTD. | Inventor: SHIH-CHUNG KANG

The present invention relates to a computer-implemented method. The method includes causing a visual programming panel, including a timeline editor and a plurality of motion blocks enabling a variety of robotic motions, to be displayed in a visualization interface provided by a robot simulator shown on a web browser; receiving, from a user at the visual programming panel, a selection of at least one motion block from the plurality of motion blocks and adding the at least one motion block into the timeline editor, via drag-and-drop, to form a motion configuration; and, according to the motion configuration at the visual programming panel, automatically generating a program capable of commanding an end effector equipped on a target robot in a work cell to perform at least one selected robotic motion from the variety of robotic motions in the robot simulator.
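The method turns a drag-and-drop timeline of motion blocks into a program that commands the robot's end effector. The sketch below shows one plausible shape for that last step: walking an ordered list of motion blocks and emitting command strings. The block names, parameters, and command syntax are invented for illustration and are not the patent's actual format.

```python
# Hypothetical motion blocks as they might sit in the timeline editor, in order.
timeline = [
    {"block": "move_to", "params": {"x": 0.5, "y": 0.2, "z": 0.8}},
    {"block": "pick",    "params": {"grip_force": 20}},
    {"block": "move_to", "params": {"x": 1.0, "y": 0.0, "z": 0.8}},
    {"block": "place",   "params": {}},
]

def generate_program(blocks):
    """Translate the motion configuration into end-effector commands (toy syntax)."""
    program = []
    for i, blk in enumerate(blocks):
        args = ", ".join(f"{k}={v}" for k, v in blk["params"].items())
        program.append(f"{i:03d}: {blk['block']}({args})")
    return program

if __name__ == "__main__":
    for line in generate_program(timeline):
        print(line)
```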

4. WO/2021/204428 - METHOD AND APPARATUS FOR SUPPLYING AN OBJECT WITH A SUPPLY SUBSTANCE
WO, 14.10.2021
Int. Class B60L 53/35
  B: PERFORMING OPERATIONS; TRANSPORTING
  60: VEHICLES IN GENERAL
  L: PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
  53: Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
  30: Constructional details of charging stations
  35: Means for automatic or assisted adjustment of the relative position of charging devices and vehicles
Appl. No. PCT/EP2021/050258 | Applicant: VOLKSWAGEN AKTIENGESELLSCHAFT | Inventor: SCHULZ, Hannes

The invention relates to a method for supplying an object with a supply substance, by means of at least one mobile, driverless robot (9) and a separate mobile platform (4) having the supply substance, wherein the mobile platform (4) has a mechanical coupling (8) for a separate mobile, driverless transport device (1) and/or the mobile platform (4) has a driverless transport device, wherein the transport device (1) and the robot (9) obtain a supply task and move, independently of one another, to the object; and to an apparatus (20).

5. 20210316463 - CANDIDATE SIX DIMENSIONAL POSE HYPOTHESIS SELECTION
US, 14.10.2021
Int. Class B25J 9/16
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  9: Programme-controlled manipulators
  16: Programme controls
Appl. No. 16848460 | Applicant: Hong Kong Applied Science and Technology Research Institute Co., Ltd. | Inventor: Wing To Ku

The present disclosure relates to methods, devices, and systems for selecting a candidate six dimensional pose hypothesis from among a plurality of six dimensional pose hypotheses. For example, the systems, devices, and methods described herein may be used to quickly, accurately, and precisely select a candidate six dimensional pose hypothesis from among a plurality of six dimensional pose hypotheses so that the selected candidate six dimensional pose hypothesis substantially overlaps with an image of an object to be identified from an image of a plurality of objects. In this manner, an object can be identified from among a plurality of objects based on the selected candidate six dimensional pose hypothesis.
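Selection here boils down to scoring each candidate 6D pose by how well its projected silhouette overlaps the object's image and keeping the best one. A minimal sketch using a binary-mask intersection-over-union score follows; the rendering of hypotheses into masks and the IoU criterion itself are assumptions, not necessarily the patent's scoring rule.

```python
import numpy as np

def mask_iou(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection-over-union of two boolean silhouette masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(inter) / float(union) if union else 0.0

def select_pose_hypothesis(object_mask: np.ndarray, hypotheses: list) -> dict:
    """Pick the candidate whose projected silhouette best overlaps the observed object.

    hypotheses: list of {"pose": ..., "mask": np.ndarray}, where "mask" would come
    from rendering the object model at that 6D pose (rendering not shown here).
    """
    return max(hypotheses, key=lambda h: mask_iou(object_mask, h["mask"]))

if __name__ == "__main__":
    observed = np.zeros((8, 8), bool)
    observed[2:6, 2:6] = True
    cand_a = {"pose": "hypothesis A", "mask": np.roll(observed, 1, axis=1)}  # slightly off
    cand_b = {"pose": "hypothesis B", "mask": observed.copy()}               # good match
    print(select_pose_hypothesis(observed, [cand_a, cand_b])["pose"])        # hypothesis B
```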

6. 20210316448 - GENERATING AND/OR USING TRAINING INSTANCES THAT INCLUDE PREVIOUSLY CAPTURED ROBOT VISION DATA AND DRIVABILITY LABELS
US, 14.10.2021
Int. Class B25J 9/16
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  9: Programme-controlled manipulators
  16: Programme controls
Appl. No. 16720498 | Applicant: X Development LLC | Inventor: Ammar Husain

Implementations set forth herein relate to generating training data, such that each instance of training data includes a corresponding instance of vision data and drivability label(s) for the instance of vision data. A drivability label can be determined using first vision data from a first vision component that is connected to the robot. The drivability label(s) can be generated by processing the first vision data using geometric and/or heuristic methods. Second vision data can be generated using a second vision component of the robot, such as a camera that is connected to the robot. The drivability labels can be correlated to the second vision data and thereafter used to train one or more machine learning models. The trained models can be shared with a robot(s) in furtherance of enabling the robot(s) to determine drivability of areas captured in vision data, which is being collected in real-time using one or more vision components.
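The pipeline described is: derive drivability labels from a first vision component using geometric or heuristic processing, attach those labels to imagery from a second vision component, and train a model on the result. Here is a rough sketch of one possible instance-building step, assuming a simple height-threshold heuristic and treating the camera image as an already-aligned array; the cross-sensor alignment and the actual heuristics are glossed over and are not taken from the patent.

```python
import numpy as np

def drivability_labels(height_map: np.ndarray, max_step: float = 0.05) -> np.ndarray:
    """Heuristic labels: a cell is drivable if the terrain height stays below a threshold."""
    return np.abs(height_map) < max_step

def build_training_instance(height_map: np.ndarray, camera_image: np.ndarray) -> dict:
    """Pair second-sensor vision data with drivability labels derived from the first sensor."""
    labels = drivability_labels(height_map)
    return {"vision_data": camera_image, "drivability": labels}

if __name__ == "__main__":
    heights = np.zeros((4, 4))
    heights[0, :] = 0.3                      # a raised ledge: not drivable
    image = np.random.rand(4, 4, 3)          # stand-in for camera pixels
    instance = build_training_instance(heights, image)
    print(instance["drivability"])
```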

7. WO/2021/204303 - ROBOT-BASED DISINFECTION METHOD AND APPARATUS, DEVICE AND MEDIUM
WO, 14.10.2021
Int. Class B25J 11/00
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  11: Manipulators not otherwise provided for
Appl. No. PCT/CN2021/098140 | Applicant: SHANGHAI TAIMI ROBOTICS TECHNOLOGY CO., LTD. | Inventor: PAN, Jing

A robot-based disinfection method and apparatus, a device and a medium. The robot-based disinfection method comprises: in response to receiving a disinfection request for triggering a disinfection operation, determining a target disinfection region for a robot; determining region attribute feature information of the target disinfection region according to region identification information of the target disinfection region; and controlling, according to the region attribute feature information, the robot to disinfect the target disinfection region.
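The claimed flow is: on a disinfection request, resolve the target region, look up its attribute features from its region identification information, and control the robot accordingly. A toy sketch of that control flow is below; the region table, feature names, and control parameters are invented for illustration only.

```python
# Hypothetical region registry keyed by region identification information.
REGION_FEATURES = {
    "ward_03": {"occupancy": "high", "surface": "smooth"},
    "corridor_1": {"occupancy": "low", "surface": "smooth"},
}

def handle_disinfection_request(region_id: str) -> dict:
    """Determine the target region's attribute features and derive disinfection settings."""
    features = REGION_FEATURES.get(region_id, {"occupancy": "unknown", "surface": "unknown"})
    # Map attribute features to a robot control policy (values are illustrative).
    speed = 0.2 if features["occupancy"] == "high" else 0.5   # m/s
    passes = 2 if features["occupancy"] == "high" else 1
    return {"region": region_id, "speed_mps": speed, "passes": passes, "features": features}

if __name__ == "__main__":
    print(handle_disinfection_request("ward_03"))
```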

8. WO/2021/204393 - ROBOT ARM HAVING AN ARTICULATED JOINT
WO, 14.10.2021
Int. Class B25J 9/10
  B: PERFORMING OPERATIONS; TRANSPORTING
  25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; HANDLES FOR HAND IMPLEMENTS; WORKSHOP EQUIPMENT; MANIPULATORS
  J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
  9: Programme-controlled manipulators
  10: characterised by positioning means for manipulator elements
Appl. No. PCT/EP2020/060200 | Applicant: AGILE ROBOTS AG | Inventor: DÜRR, Daniel Mark

The invention relates to a robot arm having at least two limbs (2), which are connected to one another at their ends via an articulated joint (10) so that they can be pivoted relative to one another about a rotation axis (A), the two limbs (2) each comprising at least one joint portion (3) and a transition region (4) adjoined thereto and each extending in a longitudinal direction (L). According to the invention, the transition region (4) of at least one of the limbs (2, 2a) has a circumferential edge (6) which, when considered from a direction running transverse to the rotation axis (A) and transverse to the longitudinal direction (L), crosses a parting line (9) between the two joint portions (3) and runs at a predefined distance radially outside the joint portion (3) of the other limb (2), the circumferential edge (6) running at a distance, in relation to the rotation axis (A), of less than 25 mm outside the surface of the joint portion (3) of the other limb (2) arranged beneath. In addition, a portion of the circumferential edge (6) extends obliquely to the rotation axis (A) when considered from a direction running transverse to the rotation axis (A) and transverse to the longitudinal direction (L).

9. 20210319367 - DIVERSIFIED IMITATION LEARNING FOR AUTOMATED MACHINES
US, 14.10.2021
Int. Class G06N 20/00
  G: PHYSICS
  06: COMPUTING; CALCULATING OR COUNTING
  N: COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
  20: Machine learning
Appl. No. 17304800 | Applicant: Rita H. Wouhaybi | Inventor: Rita H. Wouhaybi

Disclosed herein are embodiments of systems and methods for diversified imitation learning for automated machines. In an embodiment, a process-profiling system obtains sensor data captured by a plurality of sensors that are arranged to observe one or more human subjects performing one or more processes to accomplish one or more tasks. The process-profiling system clusters the sensor data based on a set of one or more process-performance criteria. The process-profiling system also performs, based on the clustered sensor data, one or both of generating and updating one or more process profiles in a plurality of process profiles. The process-profiling system selects, for one or more corresponding automated machines, one or more process profiles from among the plurality of process profiles, and the process-profiling system configures the one or more corresponding automated machines to operate according to the selected one or more process profiles.
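At its core this is: cluster human demonstrations by process-performance criteria, turn each cluster into a process profile, and assign profiles to automated machines. A compact sketch using k-means on per-demonstration performance features follows; the features, cluster count, and profile-selection rule are assumptions, not the embodiment's actual criteria.

```python
import numpy as np
from sklearn.cluster import KMeans

# Each row summarises one observed human demonstration by hypothetical
# process-performance criteria: [cycle_time_s, defect_rate].
demos = np.array([
    [30.0, 0.01], [32.0, 0.02], [31.0, 0.01],   # fast, fairly accurate style
    [55.0, 0.00], [58.0, 0.00], [54.0, 0.01],   # slower, very careful style
])

# Cluster the sensor-derived summaries into candidate process profiles.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(demos)
profiles = {f"profile_{k}": demos[labels == k].mean(axis=0) for k in set(labels)}

# Select a profile for a machine, e.g. preferring the lowest average defect rate.
best = min(profiles, key=lambda name: profiles[name][1])
print(profiles)
print("configure machine with:", best)
```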

10. 102020204532 - Position Measurement in a Positioning Device (Lagemessung bei einer Positioniervorrichtung)
DE, 14.10.2021
Int. Class G01B 21/00
  G: PHYSICS
  01: MEASURING; TESTING
  B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
  21: Measuring arrangements or details thereof in so far as they are not adapted to particular types of measuring means of the other groups of this subclass
Appl. No. 102020204532 | Applicant: Carl Zeiss Industrielle Messtechnik GmbH | Inventor: Haverkamp Nils

The invention relates in particular to a positioning device (10) with a position-measuring function, wherein the positioning device (10) has a first kinematic link (K1) and at least one second kinematic link (K2) to which a first measuring member (M1) is attached, the first and second kinematic links (K1, K2) being connected to one another via a joint (G1); wherein one of at least one measuring standard (18) and at least one sensor (16) for detecting the measuring standard (18) and/or signals emitted therefrom is arranged on the first measuring member (M1), and the corresponding other of the measuring standard (18) and the sensor (16) is arranged, at least indirectly, on the first kinematic link (K1), wherein, on the basis of the sensor detection, at least one further relative-position value for another degree of freedom can be determined in addition to a relative-position value corresponding to the degree of freedom of the joint (G1); and wherein a fastening location (B1) of the first measuring member (M1) on the second kinematic link (K2) lies closer to an end of the second kinematic link (K2) remote from the joint (G1) than to the joint (G1).