G05B2219/39532

ROBOTIC MANIPULATORS
20210031373 · 2021-02-04 ·

A robot comprising a chopstick configured for at least four degrees of freedom of movement, the chopstick being a stiff body with shape and proportions approximating those of a pool cue; an electromagnetic actuator, comprising a motor, for each degree of freedom of movement, coupled with the stiff body, wherein the functional mapping from each actuator's motor current to torque output along an axis of motion is stored and used, in concert with a calibrated model of the robot, for effective impedance control; and a six-axis force/torque sensor mounted inline between the actuators and each chopstick.
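The stored current-to-torque mapping makes impedance control possible roughly as follows: a virtual spring-damper law produces a desired joint torque, which the inverse of the motor map converts into a current command. This is a minimal single-joint sketch; the linear motor constant and the gain values are illustrative assumptions, not taken from the application.

```python
# Minimal impedance-control sketch for one actuated degree of freedom.
# The linear current-to-torque map (torque = kt * current) and the gains
# below are illustrative assumptions, not the patented implementation.

def torque_to_current(torque, kt=0.1):
    """Invert a hypothetical linear motor map: torque = kt * current."""
    return torque / kt

def impedance_torque(q, q_des, dq, stiffness=5.0, damping=0.5):
    """Virtual spring-damper: tau = K*(q_des - q) - D*dq."""
    return stiffness * (q_des - q) - damping * dq

# Command a current that realizes the impedance torque at the joint.
tau = impedance_torque(q=0.2, q_des=0.5, dq=0.1)
current = torque_to_current(tau)
```

In a calibrated system the scalar map would be replaced by the stored per-actuator current-to-torque function and the gains chosen from the robot model.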

SYSTEMS AND METHODS FOR ROBOTIC CONTROL UNDER CONTACT
20210031375 · 2021-02-04 ·

A system comprises a database; at least one hardware processor coupled with the database; and one or more software modules that, when executed by the at least one hardware processor: receive at least one of sensory data from a robot and images from a camera; identify and build models of objects in an environment, wherein each model encompasses immutable properties of identified objects, including mass and geometry, and wherein the geometry is assumed not to change; estimate the state, including position, orientation, and velocity, of the identified objects; determine, based on the state and model, potential configurations (pre-grasp poses) for grasping the identified objects, returning multiple grasping configurations per identified object; determine an object to be picked based on a quality metric; translate the pre-grasp poses into behaviors that define motor forces and torques; and communicate the motor forces and torques to the robot to allow it to perform a complex behavior generated from the behaviors.
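The grasp-selection step described above can be sketched as follows: each identified object yields several candidate pre-grasp poses, each scored by a quality metric, and the object to pick is the one whose best candidate scores highest. The data layout and the metric values are illustrative assumptions.

```python
# Sketch of grasp selection over multiple pre-grasp candidates per object.
# Poses and quality scores are synthetic examples.

candidates = {
    "mug":  [{"pose": (0.1, 0.2, 0.0), "quality": 0.6},
             {"pose": (0.1, 0.2, 1.6), "quality": 0.8}],
    "bolt": [{"pose": (0.4, 0.1, 0.0), "quality": 0.7}],
}

def best_grasp(candidates):
    """Return (object_id, pre-grasp pose) with the highest quality score."""
    obj, grasp = max(
        ((name, g) for name, grasps in candidates.items() for g in grasps),
        key=lambda item: item[1]["quality"],
    )
    return obj, grasp["pose"]

obj, pose = best_grasp(candidates)
```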

END EFFECTOR AND END EFFECTOR DEVICE
20210213627 · 2021-07-15 ·

The end effector includes a palm, a plurality of fingers capable of grasping operation, a tactile sensor unit provided with each of the plurality of fingers, and a force receiving portion that receives a force from the object being grasped when the object being grasped is grasped by the plurality of fingers, the force receiving portion being connected to each of the plurality of fingers via the tactile sensor unit. The force receiving portion includes a grasping surface that receives a force from the object being grasped, the grasping surface being placed facing the object being grasped to be able to grasp the object being grasped, and a pressing surface that is placed further away from the palm than the second end portion of each of the plurality of fingers and extends in a direction intersecting the grasping surface.

Incorporating Vision System and In-Hand Object Location System for Object Manipulation and Training

A system and method of object manipulation and training including providing at least one robotic hand including a plurality of grippers connected to a body and providing a plurality of cameras disposed in a periphery surface of the grippers. The method also includes providing a plurality of tactile sensors disposed in the periphery surface of the grippers and actuating the grippers to grasp an object. The method further includes detecting a position of the object with respect to the robotic hand via a first image feed from the tactile sensors and detecting a position of the object with respect to the robotic hand via a second image feed from the cameras. The method also includes generating instructions to grip and manipulate an orientation of the object based on the first and the second image feeds for a visualization of the object relative to the robotic hand.
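Combining the two position estimates (one from the tactile-sensor image feed, one from the camera feed) into a single object position could be done, for example, by inverse-variance weighting. The weighting scheme and the variance values are illustrative assumptions, not specified by the application.

```python
# Sketch of fusing the tactile-derived and camera-derived position
# estimates. Inverse-variance weighting is an illustrative choice.

def fuse(p_tactile, p_camera, var_tactile=1.0, var_camera=4.0):
    """Inverse-variance weighted fusion of two 3-D position estimates."""
    w_t = 1.0 / var_tactile
    w_c = 1.0 / var_camera
    return tuple((w_t * t + w_c * c) / (w_t + w_c)
                 for t, c in zip(p_tactile, p_camera))

fused = fuse((0.10, 0.00, 0.05), (0.20, 0.00, 0.05))
```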

Method of Automated Calibration for In-Hand Object Location System

A method of automated in-hand calibration including providing at least one robotic hand including a plurality of grippers connected to a body and providing at least one camera disposed on a periphery surface of the plurality of grippers. The method also includes providing at least one tactile sensor disposed in the at least one illumination surface and actuating the plurality of grippers to grasp an object. The method further includes locating a position of the object with respect to the at least one robotic hand and calibrating a distance parameter via the at least one camera. The method also includes calibrating the at least one tactile sensor with the at least one camera and generating instructions to grip and manipulate an orientation of the object via an image feed from the at least one camera for a visualization of the object.

ROBOTIC GRIPPER WITH INTEGRATED TACTILE SENSOR ARRAYS

A robotic gripper (end effector) for an arm-type robotic system includes a hierarchical sensor architecture that utilizes a central data processing circuit to generate rich sensory tactile data in response to pressure, temperature, vibration and/or proximity sensor data generated by finger-mounted sensor groups in response to interactions between the robotic gripper and a target object during robotic system operations. The rich sensory tactile data is used to generate feedback signals that directly control finger actuators and/or tactile information that is supplied to the robotic system's control circuit. Sensor data processing circuits are configured to receive single-sensor data signals in parallel from the sensor groups, and to transmit corresponding finger-level sensor data signals on a serial bus/signal line to the central data processing circuit. Each sensor group and an associated sensor data processing circuit are disposed on a PCB structure and mounted on a contact portion of an associated gripper finger.
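The finger-level aggregation step can be sketched as follows: a sensor data processing circuit collects parallel single-sensor readings and serializes them as one finger-level frame for the central data processing circuit. The frame layout (one finger-id byte followed by big-endian 16-bit samples) is an assumption for illustration.

```python
# Sketch of serializing parallel sensor readings into one finger-level
# frame for a serial bus. The frame layout is an illustrative assumption.
import struct

def pack_finger_frame(finger_id, readings):
    """Serialize one finger's sensor readings into a byte frame."""
    return struct.pack(">B%dH" % len(readings), finger_id, *readings)

def unpack_finger_frame(frame):
    """Recover (finger_id, readings) from a byte frame."""
    n = (len(frame) - 1) // 2
    vals = struct.unpack(">B%dH" % n, frame)
    return vals[0], list(vals[1:])

frame = pack_finger_frame(2, [100, 200, 300])
fid, readings = unpack_finger_frame(frame)
```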

FORCE ESTIMATION USING DEEP LEARNING

A computer system generates a tactile force model for a tactile force sensor by performing a number of calibration tasks. In various embodiments, the calibration tasks include pressing the tactile force sensor while the tactile force sensor is attached to a pressure gauge, interacting with a ball, and pushing an object along a planar surface. Data collected from these calibration tasks is used to train a neural network. The resulting tactile force model allows the computer system to convert signals received from the tactile force sensor into a force magnitude and direction with greater accuracy than conventional methods. In an embodiment, force on the tactile force sensor is inferred by interacting with an object, determining the motion of the object, and estimating the forces on the object based on a physical model of the object.
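The pressure-gauge calibration task amounts to pairing raw tactile readings with reference forces and fitting a model from one to the other. A one-parameter least-squares fit stands in for the neural network in this sketch; the readings and forces are synthetic.

```python
# Sketch of calibrating a tactile sensor against a pressure gauge: fit a
# gain k so that force ~= k * reading. A linear fit through the origin
# stands in for the trained neural network; data are synthetic.

def fit_gain(readings, forces):
    """Least-squares gain k minimizing sum((k*r - f)^2)."""
    num = sum(r * f for r, f in zip(readings, forces))
    den = sum(r * r for r in readings)
    return num / den

readings = [10, 20, 30]      # raw sensor counts
forces = [0.5, 1.0, 1.5]     # newtons from the pressure gauge
k = fit_gain(readings, forces)
predicted = k * 40           # predicted force for a new reading
```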

METHOD OF CONTROLLING ROBOT BODY, METHOD OF MANUFACTURING PRODUCT, ROBOT APPARATUS, AND RECORDING MEDIUM
20200114507 · 2020-04-16 ·

A method includes controlling a robot body performed by a controller. The robot body includes a finger, a driving unit, and a detection unit. The driving unit is configured to move the finger. The detection unit is configured to output a signal corresponding to a state of the finger moved by the driving unit. The method includes causing the finger to hold a workpiece, causing the robot body to start a predetermined operation while causing the finger to keep holding the workpiece, if a detected value based on the signal outputted from the detection unit is within a first range, and causing the robot body to continue to perform the predetermined operation until completion of the predetermined operation, if the detected value is within a second range in the predetermined operation. The second range is different from the first range.
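The two-range control flow described in the claim can be sketched directly: the operation starts only if the detected value lies in a first range, and runs to completion only while subsequent values stay within a different second range. The range bounds below are illustrative assumptions.

```python
# Sketch of the two-range gating logic. Bounds are illustrative.

FIRST_RANGE = (1.0, 2.0)    # acceptable at the start of the operation
SECOND_RANGE = (0.5, 3.0)   # acceptable while the operation runs

def in_range(value, rng):
    lo, hi = rng
    return lo <= value <= hi

def run_operation(samples):
    """Return True if the operation starts and runs to completion."""
    if not samples or not in_range(samples[0], FIRST_RANGE):
        return False                     # do not start
    return all(in_range(v, SECOND_RANGE) for v in samples[1:])

ok = run_operation([1.5, 1.2, 2.5, 2.8])
```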

Systems, Devices, Components, and Methods for a Compact Robotic Gripper with Palm-Mounted Sensing, Grasping, and Computing Devices and Components
20190381670 · 2019-12-19 ·

Disclosed are various embodiments of a three-dimensional perception and object manipulation robot gripper configured for connection to and operation in conjunction with a robot arm. In some embodiments, the gripper comprises a palm, a plurality of motors or actuators operably connected to the palm, a mechanical manipulation system operably connected to the palm, and a plurality of fingers operably connected to the motors or actuators and configured to manipulate one or more objects located within a workspace or target volume that can be accessed by the fingers. A depth camera system is also operably connected to the palm. One or more computing devices are operably connected to the depth camera system and are configured and programmed to process images provided by the depth camera system to determine the location and orientation of the one or more objects within the workspace, and in accordance therewith to provide as outputs control signals or instructions employed by the motors or actuators to control movement and operation of the plurality of fingers, so as to permit the fingers to manipulate the one or more objects located within the workspace or target volume. The gripper can also be configured to controllably vary at least one of a force, a torque, a stiffness, and a compliance applied by one or more of the plurality of fingers to the one or more objects.
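The first stage of such a pipeline, reducing a palm-mounted depth image to an object position that can parameterize finger commands, can be sketched with a pinhole back-projection. The camera intrinsics and the tiny depth image below are illustrative assumptions.

```python
# Sketch of reducing a masked depth image to an object centroid in the
# camera frame via pinhole back-projection. Intrinsics are illustrative.

def centroid_from_depth(depth, mask, fx, fy, cx, cy):
    """Back-project masked depth pixels and average them (camera frame)."""
    pts = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if mask[v][u]:
                pts.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

# A flat 2x2 patch of depth 0.5 m, fully masked, centered on the sensor.
c = centroid_from_depth([[0.5, 0.5], [0.5, 0.5]],
                        [[True, True], [True, True]],
                        fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```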

End effector device
11931887 · 2024-03-19 ·

The end effector device includes an end effector including a palm and a plurality of fingers, a drive device, a position shift direction determination unit and a position shift correction unit. Each finger includes a tactile sensor unit capable of detecting external forces in at least three axial directions. The position shift direction determination unit determines in which direction the object being grasped is position-shifted with respect to the fitting recess based on a detection result detected by the tactile sensor unit in a case where at least one of the external forces detected by the tactile sensor unit is a specified value or more. The position shift correction unit moves the palm in a direction opposite to a position shift direction of the object being grasped determined by the position shift direction determination unit.
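The correction rule described in the claim can be sketched as: when any tactile force component reaches a specified value, infer the shift direction from the lateral force and displace the palm the opposite way. The threshold and step size below are illustrative assumptions.

```python
# Sketch of position-shift correction from 3-axis tactile forces.
# Threshold ("specified value") and step size are illustrative.

THRESHOLD = 1.0   # newtons
STEP = 0.001      # metres per correction step

def correction_step(force_xyz):
    """Return a palm displacement opposing the sensed lateral force,
    or None if no force component reaches the threshold."""
    if max(abs(f) for f in force_xyz) < THRESHOLD:
        return None
    fx, fy, _ = force_xyz
    return (-STEP if fx > 0 else STEP if fx < 0 else 0.0,
            -STEP if fy > 0 else STEP if fy < 0 else 0.0)
```

For example, a +x lateral force above threshold yields a small move in -x, consistent with moving the palm opposite to the determined shift direction.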