G05B2219/39543

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND COMPUTER PROGRAM
20230264365 · 2023-08-24

Provided is an information processing apparatus that processes information for implementing the best gripping operation suitable for a characteristic of a gripper.

The information processing apparatus includes: a grip information analysis unit that analyzes an object and generates generalized grip shape information; and a generalized grip shape information accumulation unit that accumulates the generalized grip shape information. The information processing apparatus further includes a grip objective planning unit that plans a grip objective of the object on the basis of the generalized grip shape information of the object and a characteristic of the gripper to be used for the grip. The grip objective planning unit plans the grip objective, namely which portion of the object to grip and how to grip it.
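
The planning step described above can be illustrated with a minimal sketch: candidate grip regions of an object are filtered and ranked against the characteristics of the gripper at hand. The region names, fields, and scoring rule below are invented for illustration and are not the patented method.

```python
# Hypothetical sketch: rank candidate grip regions of an object against
# a gripper's characteristics (field names and scoring are illustrative).

def plan_grip_objective(grip_regions, gripper):
    """Pick the region whose width and surface best match the gripper.

    grip_regions: list of dicts with 'name', 'width_mm', 'flat' keys.
    gripper: dict with 'max_opening_mm' and 'needs_flat_surface' keys.
    """
    feasible = [
        r for r in grip_regions
        if r["width_mm"] <= gripper["max_opening_mm"]
        and (r["flat"] or not gripper["needs_flat_surface"])
    ]
    if not feasible:
        return None
    # Prefer the widest feasible region: more contact area, firmer grip.
    return max(feasible, key=lambda r: r["width_mm"])

mug_regions = [
    {"name": "handle", "width_mm": 15, "flat": False},
    {"name": "rim", "width_mm": 90, "flat": False},
    {"name": "body", "width_mm": 80, "flat": False},
]
parallel_jaw = {"max_opening_mm": 85, "needs_flat_surface": False}
print(plan_grip_objective(mug_regions, parallel_jaw)["name"])  # body
```

The same object yields a different grip objective for a different gripper, which is the point of conditioning the plan on the gripper's characteristics.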

AUTOMATED MANIPULATION OF OBJECTS USING A VISION-BASED METHOD FOR DETERMINING COLLISION-FREE MOTION PLANNING
20220152825 · 2022-05-19

In accordance with various aspects and embodiments of the invention, a system and method are provided for manipulation and movement of objects. In accordance with one aspect of the invention, the system includes a robotic arm that grabs and manipulates objects along a collision-free path. The objects can be in a randomly arranged pile or in an orderly arranged location. In accordance with various aspects and embodiments of the invention, the objects are moved from an orderly location to a storage location.
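
A core ingredient of any collision-free motion planner is a path validity check. As a minimal sketch, not the patented vision-based method, the snippet below samples a straight-line gripper path and tests each sample against known circular obstacles in the plane; all geometry and clearances are assumptions.

```python
import math

# Illustrative sketch: validate that a straight-line gripper path stays
# clear of known circular obstacles (2-D, for brevity).

def path_is_collision_free(start, goal, obstacles, clearance=0.05, steps=100):
    """obstacles: list of (x, y, radius) tuples in the same units as the path."""
    for i in range(steps + 1):
        t = i / steps
        x = start[0] + t * (goal[0] - start[0])
        y = start[1] + t * (goal[1] - start[1])
        for ox, oy, r in obstacles:
            if math.hypot(x - ox, y - oy) < r + clearance:
                return False  # sample penetrates an inflated obstacle
    return True

bin_edge = [(0.5, 0.5, 0.1)]
print(path_is_collision_free((0, 0), (1, 1), bin_edge))  # False: cuts through
print(path_is_collision_free((0, 0), (1, 0), bin_edge))  # True: skirts it
```

A sampling-based planner (e.g., RRT) would call a check like this on every candidate edge; the vision system's job is to supply the obstacle set.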

Device for outputting holding detection results

The purpose of the present invention is to provide a device that outputs holding detection results from a highly accurate simulation that takes into account parameters related to the holding member. A user enters workpiece information through an input UI unit. A selection control unit executes an automatic selection process of a suction pad based on the workpiece information input through the input UI unit, an automatic selection process of a workpiece physical model, an automatic selection process of a robot, and a confirmation process of a vibration tolerance, and then displays the selection results. The selection control unit determines whether there is a problem with the selection results based on an input instruction from the user.
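
The automatic suction-pad selection step can be pictured as a rule-based filter over a pad catalog. The sketch below is a guess at the flavor of such a process; the pad names, thresholds, and workpiece fields are all invented for illustration.

```python
# Hypothetical rule-based sketch of automatic suction-pad selection from
# workpiece information; pad catalog and thresholds are invented.

PADS = [
    {"name": "pad_20mm", "diameter_mm": 20, "max_payload_kg": 0.5},
    {"name": "pad_40mm", "diameter_mm": 40, "max_payload_kg": 2.0},
    {"name": "pad_60mm", "diameter_mm": 60, "max_payload_kg": 5.0},
]

def select_suction_pad(workpiece):
    """Smallest pad that fits the workpiece's flat face and carries its mass."""
    for pad in sorted(PADS, key=lambda p: p["diameter_mm"]):
        if (pad["diameter_mm"] <= workpiece["flat_face_mm"]
                and workpiece["mass_kg"] <= pad["max_payload_kg"]):
            return pad["name"]
    return None  # no pad qualifies: flag for the user's confirmation step

print(select_suction_pad({"flat_face_mm": 50, "mass_kg": 1.2}))  # pad_40mm
```

Returning `None` mirrors the abstract's confirmation loop: when automatic selection fails or looks doubtful, the result is handed back to the user for judgment.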

SYSTEMS AND METHODS FOR SKU INDUCTION, DECANTING AND AUTOMATED-ELIGIBILITY ESTIMATION

An object induction system is disclosed for assigning handling parameters to an object. The system includes an analysis system, an association system, and an assignment system. The analysis system includes at least one characteristic perception system for providing perception data regarding an object to be processed. The association system includes an object information database and assigns association data to the object responsive to commonality of any of the characteristic perception data with any of the characteristic recorded data. The assignment system is for assigning programmable motion device handling parameters to the indicia perception data based on the association data, and includes a workflow management system as well as a separate operational controller.
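
The association step, matching fresh perception data against recorded data and inheriting handling parameters on any commonality, can be sketched as a simple database lookup. The record fields, matching keys, and default policy below are assumptions for illustration, not the disclosed system.

```python
# Illustrative sketch: assign handling parameters to a perceived object by
# finding commonality with recorded objects (fields and rule are assumed).

RECORDED = [
    {"sku": "A1", "barcode": "0123", "color": "red", "handling": {"speed": "slow"}},
    {"sku": "B2", "barcode": "0456", "color": "blue", "handling": {"speed": "fast"}},
]

def assign_handling(perception):
    for rec in RECORDED:
        shared = [k for k in ("barcode", "color")
                  if perception.get(k) == rec[k]]
        if shared:  # any commonality with the recorded data suffices here
            return rec["handling"]
    return {"speed": "slow"}  # conservative default for unmatched objects

print(assign_handling({"barcode": "0456"}))  # {'speed': 'fast'}
```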

Interactive Tactile Perception Method for Classification and Recognition of Object Instances

A controller is provided for interactive classification and recognition of an object in a scene using tactile feedback. The controller includes an interface configured to transmit control signals and to receive sensor signals from a robot arm, gripper signals from a gripper attached to the robot arm, tactile signals from sensors attached to the gripper, and signals from at least one vision sensor; a memory module to store robot control programs and a classifier and recognition model; and a processor to generate control signals, based on the control program and a grasp pose on the object, configured to control the robot arm to grasp the object with the gripper. Further, the processor is configured to compute a tactile feature representation from the tactile sensor signals and to repeat gripping the object and computing a tactile feature representation over the set of grasp poses, after which the processor processes the ensemble of tactile features to learn a model that is used to classify or recognize the object as known or unknown.
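
The known-vs-unknown decision over an ensemble of tactile features can be sketched with a nearest-prototype classifier and a distance threshold. The prototypes, feature dimensionality, and threshold below are invented for illustration; the patent does not specify this particular model.

```python
import math

# Sketch only: classify an ensemble of tactile feature vectors against
# stored class prototypes, labeling distant features "unknown".

PROTOTYPES = {"sponge": [0.1, 0.9], "mug": [0.8, 0.2]}

def classify_tactile(features, threshold=0.3):
    """features: list of 2-D feature vectors from repeated grasps."""
    votes = []
    for f in features:
        label, dist = min(
            ((name, math.dist(f, proto)) for name, proto in PROTOTYPES.items()),
            key=lambda x: x[1],
        )
        votes.append(label if dist < threshold else "unknown")
    # Majority vote over the grasp ensemble.
    return max(set(votes), key=votes.count)

print(classify_tactile([[0.12, 0.88], [0.15, 0.85]]))  # sponge
print(classify_tactile([[0.5, 0.5], [0.55, 0.5]]))     # unknown
```

Repeating the grasp at several poses, as the abstract describes, is what makes the majority vote meaningful: a single ambiguous contact is outvoted by consistent ones.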

Haptic photogrammetry in robots and methods for operating the same

Robots, robot systems, and methods for operating the same based on environment models including haptic data are described. An environment model which includes representations of objects in an environment is accessed, and a robot system is controlled based on the environment model. The environment model includes haptic data, which provides more effective control of the robot. The environment model is populated based on visual profiles, haptic profiles, and/or other data profiles for objects or features retrieved from respective databases. Identification of objects or features can be based on cross-referencing between visual and haptic profiles, to populate the environment model with data not directly collected by a robot which is populating the model, or data not directly collected from the actual objects or features in the environment.

Object grasp system and method
11312581 · 2022-04-26

A grasping system includes a robotic arm having a gripper. A fixed sensor monitors a grasp area, and an onboard sensor that moves with the gripper also monitors the area. A controller receives information indicative of a position of an object to be grasped and operates the robotic arm to bring the gripper into a grasp position adjacent the object based on information provided by the fixed sensor. The controller is also programmed to operate the gripper to grasp the object in response to information provided by the onboard sensor.
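
The two-phase control described above, coarse approach from the fixed sensor, grasp trigger from the onboard sensor, can be sketched as follows. Sensor readings are simulated values and the proximity threshold is an assumption, not the patented control law.

```python
# Minimal sketch of the two-sensor idea: a fixed sensor guides the coarse
# approach; an onboard (eye-in-hand) sensor decides when to close the gripper.

def grasp_sequence(fixed_sensor_pose, onboard_distance_mm, close_at_mm=10):
    """Return the ordered commands the controller would issue."""
    commands = [("move_to", fixed_sensor_pose)]  # approach from the fixed view
    if onboard_distance_mm <= close_at_mm:       # onboard sensor confirms proximity
        commands.append(("close_gripper", None))
    else:
        commands.append(("refine_approach", onboard_distance_mm))
    return commands

print(grasp_sequence((0.4, 0.2, 0.05), onboard_distance_mm=8))
# [('move_to', (0.4, 0.2, 0.05)), ('close_gripper', None)]
```

Splitting authority this way lets the fixed sensor's wide view handle gross positioning while the onboard sensor, unaffected by arm occlusion, makes the final grasp decision.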

SYSTEMS, DEVICES, ARTICLES, AND METHODS FOR PREHENSION
20230302665 · 2023-09-28

An end-effector may include a base, a plurality of underactuated fingers coupled to the base, and an adhesion gripper coupled to the base. An end-effector may include a base, an actuator, and a first underactuated finger comprising a proximal link and a distal link. The proximal link includes a distal end and a guide for a first tendon spaced a first distance away from the distal end of the proximal link. The distal link includes a lever arm disposed on a proximal side to the distal pad and extending in a volar direction from a first axis, and a node disposed on the lever arm sized and shaped to receive a first tendon. The end-effector may include a first revolute joint compliant in a first direction disposed between the base and the proximal link, and a second revolute joint compliant in the first direction disposed between the proximal link and the distal link.

Information processing apparatus, grasping system, and information processing method

An object of the invention is to enable a grasping operation to be executed according to a state of an object. The invention provides an information processing apparatus which determines the grasping operation in a grasping unit for grasping the object. The information processing apparatus has: an obtaining unit for obtaining an image acquired by capturing the object; a recognizing unit for recognizing a state of the object from the image obtained by the obtaining unit; and a generating unit for generating information for allowing the grasping unit to execute the grasping operation on the basis of the object state recognized by the recognizing unit and conditions to execute the grasping operation.

OBJECT GRASPING SYSTEM

There is provided an object grasping system (100) including a camera (150), a grasping unit (140), and a control unit (110) for moving the grasping unit (140) toward an object (200A) while repeatedly specifying a relative position of the object (200A) with respect to the grasping unit (140) based on an image taken by the camera (150).
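
The control loop in this last abstract, repeatedly re-estimating the object's position relative to the gripper from camera images and moving toward it, is a form of visual servoing. A minimal sketch follows; the camera measurement is mocked as a direct position difference and the gain, tolerance, and iteration cap are assumptions.

```python
# Illustrative visual-servoing loop: repeatedly re-estimate the object's
# position relative to the gripper and step toward it (camera is mocked).

def servo_to_object(gripper_pos, object_pos, gain=0.5, tol=0.01, max_iters=50):
    """Proportional controller on the camera-derived relative position."""
    pos = list(gripper_pos)
    for _ in range(max_iters):
        rel = [o - p for o, p in zip(object_pos, pos)]  # camera measurement
        if max(abs(r) for r in rel) < tol:
            break                                       # close enough to grasp
        pos = [p + gain * r for p, r in zip(pos, rel)]  # step toward object
    return pos

final = servo_to_object((0.0, 0.0, 0.5), (0.3, 0.1, 0.05))
print(all(abs(f - o) < 0.01 for f, o in zip(final, (0.3, 0.1, 0.05))))  # True
```

Re-measuring the relative position on every iteration, rather than planning once from a single image, is what makes the approach robust to calibration error and to the object shifting mid-approach.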