G05B2219/39546

Generating robotic grasping instructions for inventory items

Robotic arms may be utilized to grasp inventory items within an inventory system. Information about an inventory item to be grasped can be detected and used to determine a grasping strategy in conjunction with information from a database. Instructions for grasping an inventory item can be generated based on the detected information and the database.
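A minimal sketch of the idea in this abstract: detected item attributes are matched against database entries to select a grasping strategy. All names, fields, and thresholds here (DETECTED, STRATEGY_DB, the width/weight limits) are hypothetical illustrations, not taken from the patent.

```python
# Hypothetical detected attributes of an inventory item.
DETECTED = {"width_cm": 12.0, "weight_kg": 0.4}

# Assumed database: each entry maps item limits to a strategy name.
STRATEGY_DB = [
    {"max_width_cm": 8.0,  "max_weight_kg": 0.2, "strategy": "two_finger_pinch"},
    {"max_width_cm": 15.0, "max_weight_kg": 1.0, "strategy": "parallel_jaw"},
    {"max_width_cm": 40.0, "max_weight_kg": 5.0, "strategy": "suction_cup"},
]

def select_strategy(item, db):
    """Return the first strategy whose limits accommodate the item."""
    for entry in db:
        if (item["width_cm"] <= entry["max_width_cm"]
                and item["weight_kg"] <= entry["max_weight_kg"]):
            return entry["strategy"]
    return "manual_review"  # fall back when no stored strategy fits

print(select_strategy(DETECTED, STRATEGY_DB))  # parallel_jaw
```

Grasping instructions could then be generated from the selected strategy plus the detected pose of the item.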

HUMAN-IN-LOOP ROBOT TRAINING AND TESTING SYSTEM WITH GENERATIVE ARTIFICIAL INTELLIGENCE (AI)
20240408757 · 2024-12-12

A robot teaching and testing system and method that performs human-operated robot tasks according to instructions generated from generative AI models. The process starts with a user prompt, which is combined with predefined prompt templates to generate well-formatted text prompts. Generative AI models convert the text prompts into high-level instructions or control codes that can be deployed on a robot. The high-level instructions are then converted into human-operated robot tasks for a human data collector using a mixed reality (MR) device. The human data collector attempts to follow the instructions to complete the human-operated robot tasks and may override the suggested instructions by performing a different action, demonstrate a task without instructions, or leave feedback or comments regarding the tasks. The feedback data are captured and saved for improving the robot system.
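One step of this pipeline, combining a free-form user prompt with a predefined template to produce a well-formatted text prompt for a generative model, can be sketched as follows. The template wording is a hypothetical example, not the patent's actual template.

```python
# Assumed template; the patent does not specify its wording.
TEMPLATE = (
    "You are a robot task planner.\n"
    "Task request: {user_prompt}\n"
    "Respond with numbered high-level instructions only."
)

def build_prompt(user_prompt: str, template: str = TEMPLATE) -> str:
    """Merge the raw user prompt into the predefined template."""
    return template.format(user_prompt=user_prompt.strip())

print(build_prompt("  pick up the red cube and place it in the bin "))
```

The resulting text prompt would then be sent to a generative AI model, whose output is converted into deployable high-level instructions.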

Systems and methods for planning a robot grasp that can withstand task disturbances
09649764 · 2017-05-16

In one embodiment, a system and method for planning a robot grasp involve measuring interaction forces imposed on an object by an environment while a task is demonstrated using the object to obtain a disturbance distribution dataset, modeling a task requirement based upon the disturbance distribution dataset, identifying robot grasp types that can be used to satisfy the task requirement, calculating a grasp wrench space for each identified robot grasp, and calculating a grasp quality of each grasp.
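A toy illustration of the underlying idea, heavily simplified to 2D forces: disturbances measured during a human demonstration form a dataset, and each grasp is scored by the fraction of those disturbances it can withstand. Modeling a grasp's wrench space as a single resistable force magnitude is an assumption made here for brevity, not the patent's formulation.

```python
import math

# Hypothetical disturbance forces (fx, fy) recorded during a demonstration.
disturbances = [(1.0, 0.5), (2.0, 0.0), (0.5, 2.5), (3.0, 1.0)]

# Crude stand-in for a grasp wrench space: max resistable force magnitude.
grasps = {"power": 3.5, "pinch": 2.0}

def quality(limit, samples):
    """Fraction of observed disturbances the grasp can withstand."""
    resisted = sum(1 for fx, fy in samples if math.hypot(fx, fy) <= limit)
    return resisted / len(samples)

for name, limit in grasps.items():
    print(name, quality(limit, disturbances))
```

A full implementation would compute the grasp wrench space from contact points and compare it against the measured 6D disturbance wrenches rather than scalar magnitudes.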

ROBOTIC GRASPING OF ITEMS IN INVENTORY SYSTEM

Robotic arms or manipulators can be utilized to grasp inventory items within an inventory system. Information can be obtained about constraints relative to relevant elements of a process of transferring the item from place to place. Examples of such elements may include a grasping location from which an item is to be grasped, a receiving location in which a grasped item is to be placed, or a space between the grasping location and the receiving location. The information about the constraints can be used to select from multiple possible grasping options, such as by eliminating options that conflict with the constraints or preferring options that outperform others given the constraints.
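The selection scheme described here, eliminating options that conflict with constraints and preferring options that outperform the rest, can be sketched as a filter-then-rank step. The option names, clearance values, and scores are invented for illustration.

```python
# Hypothetical grasp options with a required clearance and a quality score.
options = [
    {"name": "top_suction",  "clearance_cm": 2.0,  "score": 0.9},
    {"name": "side_pinch",   "clearance_cm": 8.0,  "score": 0.7},
    {"name": "bottom_scoop", "clearance_cm": 12.0, "score": 0.8},
]

def choose(options, available_clearance_cm):
    # Eliminate options whose required clearance conflicts with the
    # constraints of the grasping or receiving location.
    feasible = [o for o in options
                if o["clearance_cm"] <= available_clearance_cm]
    if not feasible:
        return None
    # Prefer the option that outperforms the others given the constraints.
    return max(feasible, key=lambda o: o["score"])["name"]

print(choose(options, available_clearance_cm=10.0))  # top_suction
```

Real constraints would also cover the space between the grasping and receiving locations (e.g., collision envelopes along the transfer path), not just clearance at one end.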

Generating a grasp affordance for an object based on a thermal image of the object that is captured following human manipulation of the object
09616568 · 2017-04-11

Methods, apparatus, and computer readable storage media related to utilizing a thermographic camera to capture at least one thermal image of an object following human manipulation of the object, and generating a grasp affordance for the object based on the temperatures indicated by the captured thermal image. The generated grasp affordance may be utilized, directly or indirectly, by one or more robots for determining grasping parameters for manipulating the object and/or other objects that are similar to the object.
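A minimal sketch of the core signal: pixels that read warmer than ambient after human handling suggest where the object was touched, and those regions form a grasp-affordance mask. The ambient temperature, threshold, and tiny 3x3 "image" are assumptions for illustration.

```python
AMBIENT_C = 22.0
TOUCH_DELTA_C = 2.0  # assumed threshold above ambient

# Hypothetical 3x3 thermal image (degrees C) captured after manipulation.
thermal_image = [
    [22.1, 22.0, 26.5],
    [22.3, 27.0, 27.2],
    [21.9, 22.2, 22.0],
]

def affordance_mask(image, ambient=AMBIENT_C, delta=TOUCH_DELTA_C):
    """1 marks pixels plausibly touched during human manipulation."""
    return [[1 if t >= ambient + delta else 0 for t in row] for row in image]

for row in affordance_mask(thermal_image):
    print(row)
```

A robot could then bias its grasp parameters toward the marked regions, directly or via a learned model, for this object and similar ones.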

Robotic grasping of items in inventory system

Robotic arms or manipulators can be utilized to grasp inventory items within an inventory system. Information about an item to be grasped can be detected and/or accessed from one or more databases to determine a grasping strategy for grasping the item with a robotic arm or manipulator. For example, one or more accessed databases can contain information about the item, characteristics of the item, and/or similar items, such as information indicating grasping strategies that have been successful or unsuccessful for such items in the past.
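The historical-outcome idea in this abstract can be sketched as ranking candidate strategies by their empirical success rate over similar items. The strategy names and counts are invented for illustration.

```python
# Hypothetical per-strategy outcome counts for similar items.
history = {
    "suction":      {"success": 18, "failure": 2},
    "parallel_jaw": {"success": 9,  "failure": 11},
    "pinch":        {"success": 4,  "failure": 1},
}

def rank_by_success(history):
    """Order strategies by empirical success rate, best first."""
    def rate(counts):
        total = counts["success"] + counts["failure"]
        return counts["success"] / total if total else 0.0
    return sorted(history, key=lambda k: rate(history[k]), reverse=True)

print(rank_by_success(history))  # ['suction', 'pinch', 'parallel_jaw']
```

A production system would also weight by sample size and recency rather than raw rate alone.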

Robot hand and humanoid robot having the same

Disclosed herein is a control method of a robot hand including recognizing a pre-posture of user's fingers using a master device, changing the shape of the robot hand according to the recognized pre-posture, recognizing a gripping motion of the user's fingers using the master device, and executing a gripping motion of the robot hand according to a gripping posture corresponding to the recognized pre-posture.
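The control flow described here, a recognized pre-posture selecting a hand shape before the gripping motion executes, can be sketched as a simple lookup-and-dispatch. The posture names and shape labels are hypothetical.

```python
# Assumed mapping from recognized pre-postures to robot hand shapes.
PRE_POSTURE_TO_SHAPE = {
    "pinch":  "two_finger",
    "sphere": "all_fingers_curved",
    "hook":   "four_finger_hook",
}

def control_hand(recognized_pre_posture, gripping_detected):
    """Change the hand shape for the pre-posture, then grip if commanded."""
    shape = PRE_POSTURE_TO_SHAPE.get(recognized_pre_posture, "neutral")
    if gripping_detected:
        return f"grip_with_{shape}"
    return f"hold_{shape}"

print(control_hand("pinch", gripping_detected=True))  # grip_with_two_finger
```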

SYSTEM AND METHOD FOR DETERMINING A GRASPING HAND MODEL

Method for determining a grasping hand model suitable for grasping an object by: receiving an image including at least one object; obtaining an object model that estimates the pose and shape of the object from the image; selecting a grasp class from a set of grasp classes by means of a neural network trained with a cross-entropy loss, thereby obtaining a set of parameters defining a coarse grasping hand model; refining the coarse grasping hand model by minimizing loss functions over the parameters of the hand model to obtain an operable grasping hand model, while minimizing the distance between the fingers of the hand model and the surface of the object and preventing interpenetration; and obtaining a mesh of the hand represented by the refined set of parameters.
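The refinement objective described here, pulling fingertips to the object surface while penalizing interpenetration, can be illustrated with a toy loss. Modeling the object as a unit sphere and the penetration weight of 10 are assumptions for this sketch only.

```python
import math

OBJECT_RADIUS = 1.0  # assume a unit sphere centered at the origin

def refinement_loss(fingertips, penetration_weight=10.0):
    """Sum of surface-distance and interpenetration penalties."""
    loss = 0.0
    for x, y, z in fingertips:
        # Signed distance to the sphere surface: negative means inside.
        signed = math.sqrt(x * x + y * y + z * z) - OBJECT_RADIUS
        if signed >= 0:
            loss += signed  # gap between fingertip and surface
        else:
            loss += penetration_weight * (-signed)  # interpenetration
    return loss

on_surface = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
penetrating = [(0.5, 0.0, 0.0), (0.0, 1.2, 0.0)]

print(refinement_loss(on_surface))   # 0.0
print(refinement_loss(penetrating))
```

In the actual method this loss would be minimized over the hand-model parameters (e.g., by gradient descent) against the estimated object mesh, not a fixed sphere.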

TIME-OF-FLIGHT SENSORS FOR WEARABLE ROBOTIC TRAINING DEVICES

Technology disclosed herein includes a wearable data collection device for training robotic systems. In an implementation, a wearable data collection device includes a hand element configured to receive a user's hand, multiple finger elements extending from the hand element, and joints coupling the finger elements to the hand element. The finger elements are constrained to movements that match capabilities of a robotic counterpart device. Multiple sensors mounted on the device capture pressure, position, visual, proximity, and acoustic data during recording sessions. The device may integrate with position tracking technologies such as mobile devices or augmented reality headsets. Data collected through the wearable device serves as training input for a neural network that controls the robotic counterpart.
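The recording sessions described here can be sketched as frames bundling the multi-sensor readings the wearable captures; the field names and sample values below are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class SensorFrame:
    timestamp_s: float
    finger_pressures: list   # one reading per finger element
    joint_positions: list    # joint angles in radians
    proximity_cm: float      # time-of-flight distance reading

@dataclass
class RecordingSession:
    frames: list = field(default_factory=list)

    def record(self, frame: SensorFrame):
        self.frames.append(frame)

session = RecordingSession()
session.record(SensorFrame(0.00, [0.1] * 5, [0.0] * 5, 14.0))
session.record(SensorFrame(0.05, [0.4] * 5, [0.2] * 5, 9.5))
print(len(session.frames))  # 2
```

Sequences of such frames, possibly aligned with visual and acoustic streams, would serve as the training input for the network controlling the robotic counterpart.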