G05B2219/39532

Autonomous mobile grabbing method for mechanical arm based on visual-haptic fusion under complex illumination condition

The present disclosure provides an autonomous mobile grabbing method for a mechanical arm based on visual-haptic fusion under complex illumination conditions, which mainly comprises approach control toward a target position and feedback control based on environmental information. According to the method, under complex illumination conditions, weighted fusion is performed on visible-light and depth images of a preselected region, the target object is identified and located by a deep neural network, and the mobile mechanical arm is driven to continuously approach the target object. The pose of the mechanical arm is then adjusted according to contact-force information between the sensor module, the external environment, and the target object. Meanwhile, visual and haptic information about the target object are fused to select the optimal grabbing pose and an appropriate grabbing force. The method improves object-positioning precision and grabbing accuracy, effectively prevents collision damage to and instability of the mechanical arm, and reduces harmful deformation of the grabbed object.
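As an illustration only (the abstract does not disclose the specific fusion rule or network), the following Python sketch shows one plausible form of illumination-adaptive weighted fusion of a visible-light image and a depth image; all function and parameter names are assumptions, and the detector step is only referenced in a comment.

```python
# Minimal sketch, not the patented method: weight the depth channel more
# heavily when the visible-light image is poorly or harshly lit.
import numpy as np

def fuse_visible_depth(rgb, depth, lum_ref=0.5):
    """Blend normalized visible and depth channels with an
    illumination-dependent weight (illustrative assumption)."""
    gray = rgb.mean(axis=2) / 255.0                      # normalize visible light
    d = (depth - depth.min()) / (np.ptp(depth) + 1e-9)   # normalize depth
    # Simple illumination score: distance of mean luminance from a midtone.
    lum_quality = 1.0 - min(abs(gray.mean() - lum_ref) / lum_ref, 1.0)
    w_vis = lum_quality                                  # trust vision in good light
    return w_vis * gray + (1.0 - w_vis) * d              # weighted fusion

# Synthetic example: a dim scene, so the depth channel dominates the fusion.
rgb = np.random.randint(0, 60, (480, 640, 3), dtype=np.uint8)
depth = np.random.rand(480, 640).astype(np.float32)
fused = fuse_visible_depth(rgb, depth)
# `fused` would then be passed to a deep-network detector to locate the target.
```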

Robotic control device and method for manipulating a hand-held tool
09613180 · 2017-04-04

Described is a robotic control device for manipulating a gripper-held tool. The device includes a robotic gripper having a plurality of tactile sensors. Each sensor generates tactile sensory data upon grasping a tool based on the interface between the tool and the corresponding tactile sensor. In operation, the device causes the gripper to grasp a tool and move the tool into contact with a surface. A control command is used to cause the gripper to perform a pseudo-random movement with the tool against the surface to generate tactile sensory data. A dimensionality reduction is performed on the tactile sensory data to generate a low-dimensional representation of the tactile sensory data, which is then associated with the control command to generate a sensory-motor mapping. A series of control commands can then be generated in a closed loop based on the sensory-motor mapping to manipulate the tool against the surface.
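The abstract does not name the reduction or mapping algorithms, so the sketch below assumes PCA for the dimensionality reduction and a linear least-squares fit for the sensory-motor mapping; the taxel count, feature dimension, and command dimension are illustrative.

```python
# Minimal sketch of the pipeline described above, under the stated assumptions.
import numpy as np

rng = np.random.default_rng(0)
tactile = rng.normal(size=(200, 64))   # 200 samples of 64 taxel readings
commands = rng.normal(size=(200, 3))   # paired 3-DoF control commands

# Dimensionality reduction: project tactile data onto its top PCA components.
mean = tactile.mean(axis=0)
_, _, vt = np.linalg.svd(tactile - mean, full_matrices=False)
components = vt[:8]                    # keep an 8-dimensional representation
low_dim = (tactile - mean) @ components.T

# Sensory-motor mapping: least-squares map from tactile features to commands.
mapping, *_ = np.linalg.lstsq(low_dim, commands, rcond=None)

def next_command(tactile_frame):
    """Closed-loop step: reduce a new tactile frame, then apply the mapping."""
    z = (tactile_frame - mean) @ components.T
    return z @ mapping

print(next_command(tactile[0]))        # predicted command for one tactile frame
```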