
AUTONOMOUS MOBILE GRABBING METHOD FOR MECHANICAL ARM BASED ON VISUAL-HAPTIC FUSION UNDER COMPLEX ILLUMINATION CONDITION
20230042756 · 2023-02-09

The present disclosure provides an autonomous mobile grabbing method for a mechanical arm based on visual-haptic fusion under complex illumination conditions. The method mainly comprises approach control toward a target position and feedback control based on environment information.

According to the method, under complex illumination, weighted fusion is applied to visible-light and depth images of a preselected region, the target object is identified and located by a deep neural network, and the mobile mechanical arm is driven to continuously approach the target. The pose of the mechanical arm is then adjusted according to contact-force information from a sensor module, the external environment, and the target object. Finally, visual and haptic information about the target object are fused to select the optimal grabbing pose and an appropriate grabbing force.
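
The weighted-fusion step can be sketched minimally. The function below is a hypothetical illustration: the names, default weights, and the per-image min-max normalization are assumptions, not the patent's disclosed method.

```python
import numpy as np

def fuse_visible_depth(visible, depth, w_visible=0.6, w_depth=0.4):
    """Weighted fusion of a visible-light image and a depth map.

    Both inputs are min-max normalized to [0, 1] before blending so the
    weights act on comparable ranges. The weights are illustrative; in
    practice they would be tuned to the illumination condition.
    """
    def normalize(img):
        img = img.astype(np.float64)
        span = img.max() - img.min()
        return (img - img.min()) / span if span > 0 else np.zeros_like(img)

    return w_visible * normalize(visible) + w_depth * normalize(depth)
```

The fused image would then be fed to the detection network in place of the raw visible-light frame.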

The method improves object-positioning precision and grabbing accuracy, effectively prevents collision damage to and instability of the mechanical arm, and reduces harmful deformation of the grabbed object.

Systems, devices, components, and methods for a compact robotic gripper with palm-mounted sensing, grasping, and computing devices and components
11559900 · 2023-01-24

Disclosed are various embodiments of a three-dimensional perception and object manipulation robot gripper configured for connection to, and operation in conjunction with, a robot arm. In some embodiments, the gripper comprises a palm, a plurality of motors or actuators operably connected to the palm, a mechanical manipulation system operably connected to the palm, and a plurality of fingers operably connected to the motors or actuators and configured to manipulate one or more objects located within a workspace or target volume accessible to the fingers. A depth camera system is also operably connected to the palm. One or more computing devices are operably connected to the depth camera system and are programmed to process the images it provides to determine the location and orientation of the one or more objects within the workspace. Based on that determination, the computing devices output control signals or instructions that the motors or actuators employ to control the movement and operation of the plurality of fingers, permitting the fingers to manipulate the objects within the workspace or target volume. The gripper can also be configured to controllably vary at least one of a force, a torque, a stiffness, and a compliance applied by one or more of the fingers to the one or more objects.
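
The perception stage of such a palm-mounted depth camera can be illustrated with a deliberately simple stand-in: treat the pixels nearest the camera as one object and take their centroid as the target location. The margin-based thresholding heuristic and all names below are assumptions for illustration, not the patent's algorithm.

```python
import numpy as np

def locate_nearest_object(depth, margin=0.05):
    """Return (row, col, mean_depth) of the centroid of the nearest region.

    Pixels within `margin` metres of the minimum depth are treated as one
    object; their pixel centroid and mean depth serve as the grasp target.
    A real system would segment and estimate full 6-DoF orientation.
    """
    near = depth <= depth.min() + margin
    rows, cols = np.nonzero(near)
    return rows.mean(), cols.mean(), depth[near].mean()
```

The resulting target would be converted to finger motor commands by a downstream controller.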

ROBOTIC MANIPULATOR WITH VISUAL GUIDANCE & TACTILE SENSING

A robotic manipulator includes one or more end effectors that can engage an object, and one or more cameras that simultaneously observe each end effector and the surrounding environment. For example, an end effector can include a contact surface with tactile markers that deform when the end effector contacts the object.
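
At its simplest, marker-based tactile sensing of this kind reduces to tracking how the markers move between a rest frame and a contact frame. A minimal sketch follows; the (N, 2) pixel-coordinate representation of marker positions is an assumption.

```python
import numpy as np

def marker_displacements(rest_positions, observed_positions):
    """Per-marker displacement vectors and their magnitudes.

    rest_positions / observed_positions: (N, 2) pixel coordinates of the
    tactile markers before and during contact. Large magnitudes indicate
    where, and roughly how strongly, the surface is being pressed.
    """
    disp = np.asarray(observed_positions, float) - np.asarray(rest_positions, float)
    return disp, np.linalg.norm(disp, axis=1)
```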

SYSTEMS AND METHODS FOR ROBOTIC CONTROL UNDER CONTACT
20230105746 · 2023-04-06 ·

In variants, a method for robot control can include: receiving sensor data of a scene; modeling the physical objects within the scene; determining a set of potential grasp configurations for grasping a physical object within the scene; determining a reach behavior based on a potential grasp configuration; determining a trajectory for the reach behavior; and grasping the object using the trajectory.
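
The select-then-reach structure can be sketched as picking the best-scoring grasp candidate and interpolating a trajectory toward it. Poses are reduced to 3-D points and scoring is left external here; both are simplifications of the behaviors the patent describes.

```python
import numpy as np

def plan_reach(current_pose, grasp_candidates, scores, steps=10):
    """Pick the highest-scoring grasp candidate and build a straight-line
    reach trajectory toward it.

    Returns (trajectory, index_of_chosen_candidate). A real planner would
    use full 6-DoF poses and avoid obstacles; this only shows the flow.
    """
    best = int(np.argmax(scores))
    start = np.asarray(current_pose, float)
    target = np.asarray(grasp_candidates[best], float)
    ts = np.linspace(0.0, 1.0, steps)
    return [tuple(start + t * (target - start)) for t in ts], best
```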

Gripping Device Modalities

Robotic gripping devices and methods for performing a picking operation are disclosed. The methods may involve positioning a gripping device with respect to an item to be grasped and then executing a first picking operation using the gripping device to obtain a grasp on the item. The methods may then involve executing at least two of: a force detection procedure to detect a force applied to a portion of the gripping device; a grasping space detection procedure to detect an item within grasping range of the gripping device; a pressure detection procedure to detect pressure in an airflow path; and an item load detection procedure to detect force in a mechanical load path of the gripping device.
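
Requiring agreement among at least two of these detection procedures can be expressed as a simple vote over their boolean outcomes. The procedure names and the voting rule below are illustrative assumptions; the per-procedure thresholding logic is omitted.

```python
def grasp_verified(checks, minimum=2):
    """Count how many independent detection procedures confirm the grasp.

    `checks` maps a procedure name (e.g. 'force', 'grasping_space',
    'pressure', 'item_load') to its boolean outcome. The grasp is treated
    as verified when at least `minimum` procedures confirm it.
    """
    return sum(bool(v) for v in checks.values()) >= minimum
```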

Tactile sensor

A visuo-haptic sensor is presented that uses a deformable, passive material mounted in view of a camera. When objects interact with the sensor, the deformable material is compressed, changing its shape. This change of shape is detected and evaluated by an image processor operatively connected to the camera. The camera may also observe the vicinity of the manipulator to measure ego-motion and the motion of nearby objects. The visuo-haptic sensor may be attached to a mobile platform, a robotic manipulator, or any other machine that needs to acquire haptic information about its environment.
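
In the simplest case, the image processor's shape-change detection can be approximated by differencing a reference frame of the undeformed material against the current frame; the fixed intensity threshold below is an assumption, and a real implementation would track features or shading instead.

```python
import numpy as np

def contact_map(reference, current, threshold=10):
    """Binary map of where the deformable material changed shape.

    reference/current: grayscale frames of the passive material from the
    sensor's camera. Casting to int before subtracting avoids unsigned
    wrap-around; pixels whose intensity changed by more than `threshold`
    are flagged as contact.
    """
    diff = np.abs(current.astype(int) - reference.astype(int))
    return diff > threshold
```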

Hand control apparatus and hand control system
11207788 · 2021-12-28

A hand control apparatus including: an extracting unit that extracts, from a storage unit storing shapes of plural types of objects in association with grip patterns, the grip pattern of the object whose shape is closest to the shape acquired by a shape acquiring unit; a position and posture calculating unit that calculates a gripping position and posture of the hand in accordance with the extracted grip pattern; a hand driving unit that causes the hand to grip the object based on the calculated gripping position and posture; a determining unit that determines whether the gripped state of the object is appropriate based on information acquired by at least one of the shape acquiring unit, a force sensor, and a tactile sensor; and a gripped state correcting unit that corrects at least one of the gripping position and the posture when the gripped state is determined to be inappropriate.
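
The extracting unit's closest-shape lookup is essentially a nearest-neighbor query over the stored shapes. The sketch below assumes shapes are represented as fixed-length feature vectors; the patent does not specify the representation, so the data layout and names are hypothetical.

```python
import numpy as np

def extract_grip_pattern(object_descriptor, library):
    """Return the grip pattern stored for the closest known shape.

    `library` is a list of (shape_descriptor, grip_pattern) pairs, a
    stand-in for the patent's storage unit; `object_descriptor` stands in
    for the shape acquiring unit's output. Closeness is Euclidean distance.
    """
    q = np.asarray(object_descriptor, float)
    dists = [np.linalg.norm(q - np.asarray(shape, float)) for shape, _ in library]
    return library[int(np.argmin(dists))][1]
```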

Systems, Devices, Components, and Methods for a Compact Robotic Gripper with Palm-Mounted Sensing, Grasping, and Computing Devices and Components
20210394367 · 2021-12-23 ·

Disclosed are various embodiments of a three-dimensional perception and object manipulation robot gripper configured for connection to, and operation in conjunction with, a robot arm. In some embodiments, the gripper comprises a palm, a plurality of motors or actuators operably connected to the palm, a mechanical manipulation system operably connected to the palm, and a plurality of fingers operably connected to the motors or actuators and configured to manipulate one or more objects located within a workspace or target volume accessible to the fingers. A depth camera system is also operably connected to the palm. One or more computing devices are operably connected to the depth camera system and are programmed to process the images it provides to determine the location and orientation of the one or more objects within the workspace. Based on that determination, the computing devices output control signals or instructions that the motors or actuators employ to control the movement and operation of the plurality of fingers, permitting the fingers to manipulate the objects within the workspace or target volume. The gripper can also be configured to controllably vary at least one of a force, a torque, a stiffness, and a compliance applied by one or more of the fingers to the one or more objects.

Soft Robotic Gripper for Berry Harvesting

A system for harvesting berries comprising a tendon-driven gripper having fingers made of a compliant material.

END EFFECTOR DEVICE
20210354316 · 2021-11-18 ·

The end effector device includes an end effector having a palm and a plurality of fingers, a drive device, a position shift direction determination unit, and a position shift correction unit. Each finger includes a tactile sensor unit capable of detecting external forces in at least three axial directions. When at least one of the detected external forces reaches a specified value or more, the position shift direction determination unit determines, from the tactile sensor detection result, in which direction the grasped object is shifted with respect to the fitting recess. The position shift correction unit then moves the palm in the direction opposite to the determined shift direction.
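
The correction step can be sketched by treating the net contact force across the fingers as a proxy for the shift direction and stepping the palm the opposite way. The threshold, gain, and the force-to-direction proxy are all illustrative assumptions, not the patent's determination logic.

```python
import numpy as np

def palm_correction(finger_forces, limit=2.0, gain=0.001):
    """Correction step for the palm, opposite to the inferred shift.

    finger_forces: (N, 3) external forces from the per-finger tactile
    sensor units. If no component reaches `limit`, no correction is made;
    otherwise the summed force is taken as the shift direction and the
    palm is stepped the other way. Units and gain are illustrative.
    """
    f = np.asarray(finger_forces, float)
    if np.abs(f).max() < limit:
        return np.zeros(3)
    net = f.sum(axis=0)
    return -gain * net
```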