Patent classifications
G05B2219/39531
AUTONOMOUS MOBILE GRABBING METHOD FOR MECHANICAL ARM BASED ON VISUAL-HAPTIC FUSION UNDER COMPLEX ILLUMINATION CONDITION
The present disclosure describes an autonomous mobile grabbing method for a mechanical arm based on visual-haptic fusion under complex illumination conditions, comprising approach control toward a target position and feedback control based on environment information.
Under complex illumination, weighted fusion is performed on visible-light and depth images of a preselected region, the target object is identified and localized by a deep neural network, and the mobile mechanical arm is driven to continuously approach the target object. The pose of the mechanical arm is then adjusted according to contact-force information from a sensor module, the external environment, and the target object. Finally, the visual and haptic information of the target object are fused to select the optimal grabbing pose and an appropriate grabbing force.
The method improves object-positioning precision and grabbing accuracy, effectively prevents collision damage and instability of the mechanical arm, and reduces harmful deformation of the grabbed object.
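As a rough illustration of the weighted visible/depth fusion step described above, the sketch below blends a visible-light image and a depth map with an illumination-dependent weight. The brightness-based weighting rule, the normalization, and all thresholds are assumptions for illustration only; the patent does not disclose its fusion weights.

```python
import numpy as np

def fuse_images(visible: np.ndarray, depth: np.ndarray, w: float) -> np.ndarray:
    """Pixel-wise weighted fusion of a visible-light image and a depth map.

    Both inputs are assumed normalized to [0, 1] with the same resolution;
    `w` is the weight given to the visible channel.
    """
    if visible.shape != depth.shape:
        raise ValueError("images must share the same shape")
    return w * visible + (1.0 - w) * depth

def illumination_weight(visible: np.ndarray, lo: float = 0.1, hi: float = 0.9) -> float:
    """Heuristic: trust the visible image more when the scene is well lit.

    Uses mean brightness as a crude illumination proxy (an assumption;
    the patent does not specify its weighting rule).
    """
    return float(np.clip(float(visible.mean()), lo, hi))

# Usage: fuse a dim visible frame with its depth map. The low brightness
# yields a small visible weight, so the fused result leans on depth.
vis = np.full((4, 4), 0.2)   # dim scene
dep = np.full((4, 4), 0.8)
w = illumination_weight(vis)
fused = fuse_images(vis, dep, w)
```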
Systems, devices, components, and methods for a compact robotic gripper with palm-mounted sensing, grasping, and computing devices and components
Disclosed are various embodiments of a three-dimensional perception and object manipulation robot gripper configured for connection to and operation in conjunction with a robot arm. In some embodiments, the gripper comprises a palm, a plurality of motors or actuators operably connected to the palm, a mechanical manipulation system operably connected to the palm, a plurality of fingers operably connected to the motors or actuators and configured to manipulate one or more objects located within a workspace or target volume that can be accessed by the fingers. A depth camera system is also operably connected to the palm. One or more computing devices are operably connected to the depth camera and are configured and programmed to process images provided by the depth camera system to determine the location and orientation of the one or more objects within a workspace, and in accordance therewith, provide as outputs therefrom control signals or instructions configured to be employed by the motors or actuators to control movement and operation of the plurality of fingers so as to permit the fingers to manipulate the one or more objects located within the workspace or target volume. The gripper can also be configured to vary controllably at least one of a force, a torque, a stiffness, and a compliance applied by one or more of the plurality of fingers to the one or more objects.
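The palm-mounted depth-camera pipeline above (depth image in, object location and orientation out) could be sketched as follows. The centroid-plus-principal-axis perception is an illustrative assumption, not the patent's prescribed algorithm, and the "table plane" threshold is hypothetical.

```python
import numpy as np

def locate_object(depth: np.ndarray, table_depth: float):
    """Estimate an object's pixel centroid and in-plane orientation from a
    palm-mounted depth image. Pixels nearer than the table plane are taken
    as the object (a simplification; the patent does not prescribe a
    particular perception algorithm)."""
    ys, xs = np.nonzero(depth < table_depth)   # pixels above the table plane
    if xs.size == 0:
        return None                            # nothing in the workspace
    cx, cy = xs.mean(), ys.mean()
    pts = np.stack([xs - cx, ys - cy])         # centered 2-D object points
    cov = pts @ pts.T / xs.size                # 2x2 pixel covariance
    evals, evecs = np.linalg.eigh(cov)         # eigenvalues ascending
    major = evecs[:, np.argmax(evals)]         # principal axis of the object
    angle = float(np.arctan2(major[1], major[0]))
    return float(cx), float(cy), angle

# Usage: a horizontal bar of object pixels in an otherwise empty scene.
depth = np.ones((10, 10))
depth[5, 2:8] = 0.5                            # object closer than the table
cx, cy, angle = locate_object(depth, table_depth=0.9)
```

The returned pose is what the computing devices would translate into motor or actuator control signals for the fingers.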
Arithmetic device, control program, machine learner, grasping apparatus, and control method
The arithmetic device, which performs calculations for controlling the motion of a grasping apparatus whose work involves sliding a grasped object, includes: an acquisition unit that acquires a state variable indicating the state of the grasping apparatus during the work; a storage unit storing a neural network trained on a plurality of training data sets, each combining a state variable acquired in advance with correct answer data corresponding to that state variable; an arithmetic unit that calculates a target value for each of the actuators involved in the work by inputting the state variable to the trained neural network read from the storage unit; and an output unit that outputs the target values to the grasping apparatus.
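A minimal sketch of such an arithmetic unit follows: a stored, already-trained two-layer network maps a state variable to one target value per actuator. The layer sizes, random weights, and the slip-related state features are illustrative assumptions, not taken from the patent.

```python
import numpy as np

class ArithmeticUnit:
    """Holds a learned network's weights (the 'storage unit' contents) and
    computes actuator target values from a state variable."""

    def __init__(self, w1, b1, w2, b2):
        self.w1, self.b1, self.w2, self.b2 = w1, b1, w2, b2

    def targets(self, state: np.ndarray) -> np.ndarray:
        """Compute a target value for each actuator from the state variable."""
        hidden = np.maximum(0.0, self.w1 @ state + self.b1)  # ReLU hidden layer
        return self.w2 @ hidden + self.b2                    # one output per actuator

# Usage: a hypothetical 3-D state (grip force, slip speed, object weight)
# mapped to targets for 2 actuators; random weights stand in for training.
rng = np.random.default_rng(0)
unit = ArithmeticUnit(rng.normal(size=(4, 3)), np.zeros(4),
                      rng.normal(size=(2, 4)), np.zeros(2))
state = np.array([1.0, 0.1, 0.5])
targets = unit.targets(state)
```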
ROBOTIC MANIPULATOR WITH VISUAL GUIDANCE & TACTILE SENSING
A robotic manipulator includes one or more end effectors that can engage with an object, and one or more cameras that simultaneously observe each end effector and the surrounding environment. For example, an end effector can include a contact surface with tactile markers that deform when the end effector contacts the object.
ROBOT AND CONTROL METHOD THEREFOR
A robot includes: an arm; a hand including a first finger with a first sensor and a second finger with a second sensor; and a processor. The processor is configured to: activate the second sensor upon sensing an object through the first sensor while the robot is moving to grip the object; receive, from the first and second sensors, distance information comprising a plurality of pairs of distance values, where each pair includes a first distance between the first sensor and the object and a second distance between the second sensor and the object, and each pair corresponds to a respective position of the hand relative to the object; and control the first and second fingers to grip the object based on the distance information.
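One plausible use of those paired distance readings is to center the hand over the object before closing the fingers, as in the sketch below. The centering rule and the tolerance are assumptions; the patent only requires that gripping be controlled from the pairs of distances.

```python
def centering_offset(pairs):
    """Given (d1, d2) readings from the two finger sensors at successive
    hand positions, estimate the lateral offset at the latest position:
    positive means the object sits closer to the second finger."""
    d1, d2 = pairs[-1]
    return (d1 - d2) / 2.0

def ready_to_grip(pairs, tol=0.002):
    """Close the fingers once the latest pair is balanced within `tol`
    meters (a hypothetical threshold)."""
    d1, d2 = pairs[-1]
    return abs(d1 - d2) <= tol

# Usage: readings taken as the hand re-centers over the object.
readings = [(0.040, 0.020), (0.032, 0.028), (0.0301, 0.0299)]
offset = centering_offset(readings)
grip = ready_to_grip(readings)
```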
Gripping Device Modalities
Robotic gripping devices and methods for performing a picking operation. The methods described herein may involve positioning a gripping device with respect to an item to be grasped and then executing a first picking operation using the gripping device to obtain a grasp on the item. The methods may then involve executing at least two of: a force detection procedure to detect a force applied to a portion of the gripping device, a grasping space detection procedure to detect an item within grasping range of the gripping device, a pressure detection procedure to detect pressure in an airflow path, and an item load detection procedure to detect force in a mechanical load path of the gripping device.
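The "at least two procedures" requirement can be pictured as running several independent grasp checks and requiring a quorum, as sketched below. All sensor readings, thresholds, and the agreement rule are illustrative assumptions.

```python
def force_detected(finger_force, threshold=1.0):
    return finger_force >= threshold            # force on a gripper portion

def item_in_grasping_space(proximity_reading, max_range=0.05):
    return proximity_reading <= max_range       # item within grasping range

def suction_sealed(airflow_pressure, ambient=101.3):
    return airflow_pressure < ambient - 5.0     # pressure drop in airflow path

def load_detected(load_cell_force, threshold=0.5):
    return load_cell_force >= threshold         # force in mechanical load path

def grasp_confirmed(checks, minimum=2):
    """Confirm the grasp when at least `minimum` procedures succeed."""
    return sum(bool(c) for c in checks) >= minimum

# Usage: three of the four procedures agree, so the grasp is confirmed.
checks = [force_detected(2.3), item_in_grasping_space(0.03),
          suction_sealed(94.0), load_detected(0.1)]
confirmed = grasp_confirmed(checks)
```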
ESTIMATION APPARATUS, ESTIMATION METHOD, AND ESTIMATION PROGRAM
An estimation apparatus includes: an acquisition section that acquires a measurement result from a measurement unit, which measures, in a contactless manner, an object whose contact sense is to be estimated; a determination section that makes a determination as to an aspect of the object or a measurement condition of the object on the basis of the measurement result; a selection section that selects, on the basis of the determination result, an estimation scheme to be used for estimating the contact sense of the object from among a plurality of estimation schemes; and an estimation section that estimates the contact sense of the object using the selected scheme.
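The select-then-estimate structure could be sketched as below: the measurement condition decides which of several contact-sense estimators to run. The two schemes, their formulas, and the brightness-based selection rule are hypothetical; the patent only requires choosing among plural schemes from the measurement result.

```python
def estimate_from_image(measurement):
    """Hypothetical image-based scheme: texture variance as a softness proxy."""
    return 0.5 * measurement["texture_variance"]

def estimate_from_spectrum(measurement):
    """Hypothetical spectral scheme: surface reflectance as a softness proxy."""
    return 0.8 * measurement["reflectance"]

def select_scheme(measurement):
    """Choose a scheme from an aspect of the measurement: fall back to the
    spectral estimator when the image is too dark to trust."""
    if measurement.get("brightness", 0.0) >= 0.3:
        return estimate_from_image
    return estimate_from_spectrum

# Usage: a dim measurement routes to the spectral scheme.
m = {"brightness": 0.1, "reflectance": 0.4, "texture_variance": 0.2}
softness = select_scheme(m)(m)
```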
Hand control apparatus and hand control system
A hand control apparatus including: an extracting unit that extracts, from a storage unit storing shapes of plural types of objects in association with grip patterns, the grip pattern of the stored object whose shape is closest to the shape acquired by a shape acquiring unit; a position and posture calculating unit that calculates a gripping position and posture of the hand in accordance with the extracted grip pattern; a hand driving unit that causes the hand to grip the object based on the calculated gripping position and posture; a determining unit that determines whether the gripped state of the object is appropriate based on information acquired by at least one of the shape acquiring unit, a force sensor, and a tactile sensor; and a gripped state correcting unit that corrects at least one of the gripping position and the posture when the gripped state is determined to be inappropriate.
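The extracting unit's nearest-shape lookup could be sketched as a minimal nearest-neighbor search over stored shape descriptors. The three stored shapes, the feature vectors, and the Euclidean metric are illustrative assumptions.

```python
import math

# Hypothetical storage unit: shape descriptor -> associated grip pattern.
STORED = {
    "cylinder": ([0.9, 0.1, 0.5], "wrap_grip"),
    "box":      ([0.1, 0.9, 0.4], "pinch_grip"),
    "sphere":   ([0.5, 0.5, 0.9], "spherical_grip"),
}

def extract_grip_pattern(acquired):
    """Return the grip pattern of the stored shape nearest (by Euclidean
    distance) to the acquired shape descriptor."""
    name = min(STORED, key=lambda k: math.dist(STORED[k][0], acquired))
    return STORED[name][1]

# Usage: a shape close to the stored cylinder selects the wrap grip.
pattern = extract_grip_pattern([0.85, 0.15, 0.45])
```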
Manipulator system
A manipulator system configured to perform work on a workpiece being moved by a moving device includes: a robotic arm having one or more joints, to which a tool for performing the work on the workpiece is attached; an operating device for operating the robotic arm; a first imaging unit that images the workpiece while following its movement; a second imaging unit fixed in the work area to image the state of the work on the workpiece; a display unit that displays the images captured by the first and second imaging units; and a control device that controls the operation of the robotic arm based on operating instructions from the operating device, while detecting the moving amount of the workpiece moved by the moving device and carrying out tracking control of the robotic arm according to that moving amount.
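The tracking control can be pictured as superimposing the detected workpiece displacement on the operator's command, so the tool follows the workpiece as it moves. The one-dimensional simplification and the example numbers are assumptions for illustration.

```python
def tracked_target(operator_target, workpiece_start, workpiece_now):
    """Shift the commanded tool position along the conveyor axis by the
    workpiece's detected moving amount."""
    moving_amount = workpiece_now - workpiece_start
    return operator_target + moving_amount

# Usage: the operator commands 0.10 m along the conveyor while the
# workpiece has advanced from 0.50 m to 0.62 m, so the arm aims at
# 0.10 + 0.12 = 0.22 m to keep the tool on the moving workpiece.
target = tracked_target(0.10, 0.50, 0.62)
```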