
Systems, devices, components, and methods for a compact robotic gripper with palm-mounted sensing, grasping, and computing devices and components

Disclosed are various embodiments of a three-dimensional perception and object manipulation robot gripper configured for connection to and operation in conjunction with a robot arm. In some embodiments, the gripper comprises a palm, a plurality of motors or actuators operably connected to the palm, a mechanical manipulation system operably connected to the palm, and a plurality of fingers operably connected to the motors or actuators and configured to manipulate one or more objects located within a workspace or target volume that can be accessed by the fingers. A depth camera system is also operably connected to the palm. One or more computing devices are operably connected to the depth camera system and are configured and programmed to process images provided by the depth camera system to determine the location and orientation of the one or more objects within the workspace, and in accordance therewith to provide as outputs control signals or instructions employed by the motors or actuators to control movement and operation of the plurality of fingers, permitting the fingers to manipulate the one or more objects located within the workspace or target volume. The gripper can also be configured to controllably vary at least one of a force, a torque, a stiffness, and a compliance applied by one or more of the plurality of fingers to the one or more objects.
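A minimal sketch of the perception-to-actuation loop this abstract describes: locate an object in a depth image, then derive finger commands from its size. The centroid heuristic, the function names, and all numeric values are illustrative assumptions, not the patent's actual algorithm.

```python
# Hypothetical sketch of the palm-mounted depth pipeline: find the
# nearest object in a depth image (meters), then command the fingers.

def locate_object(depth_image, max_range_m=0.5):
    """Return (row, col, depth) centroid of pixels nearer than max_range_m."""
    hits = [(r, c, d)
            for r, row in enumerate(depth_image)
            for c, d in enumerate(row)
            if d < max_range_m]
    if not hits:
        return None  # nothing within the target volume
    n = len(hits)
    return (sum(h[0] for h in hits) / n,
            sum(h[1] for h in hits) / n,
            sum(h[2] for h in hits) / n)

def finger_commands(object_width_m, n_fingers=3, closing_margin_m=0.01):
    """Command each finger to an aperture slightly tighter than the object."""
    aperture = max(object_width_m - closing_margin_m, 0.0)
    return [aperture / 2.0] * n_fingers  # per-finger radial displacement
```

In a real system the centroid step would be replaced by point-cloud segmentation and pose estimation; the sketch only shows the data flow from depth image to finger commands.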

ROBOTIC SYSTEM FOR PICKING AND PLACING OBJECTS FROM AND INTO A CONSTRAINED SPACE
20210031368 · 2021-02-04

A system comprising: a database configured to store a multi-body model of a robot, the robot comprising a plurality of manipulators, a plurality of joints, and a plurality of actuators and actuator motors configured to move the joints, and wherein the multi-body model includes a kinematic and geometric model of each manipulator, a catalog of models for objects to be manipulated, the models comprising a current configuration and a target configuration, and a functional mapping of sensory data to configurations of the robot and the manipulators needed to manipulate the objects; at least one hardware processor coupled with the database; and one or more software modules that, when executed by the at least one hardware processor, receive sensory data from within a constrained space, identify objects in the constrained space based on the received sensory data and the catalog of models, determine a target pose for the joints and the manipulators based on the sensory data and the current and target configurations associated with the identified objects, and compute the joint space positions necessary to realize the target pose.
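The pipeline above (catalog lookup, then joint-space solve for a target pose) can be sketched as follows. The catalog contents are invented, and a planar two-link inverse-kinematics solve stands in for the patent's unspecified multi-body solver.

```python
import math

# Illustrative catalog of object models; entries are assumptions.
CATALOG = {
    "mug": {"grasp_height_m": 0.05},
    "plate": {"grasp_height_m": 0.01},
}

def identify(sensed_label):
    """Match a sensed object label against the catalog of models."""
    return CATALOG.get(sensed_label)

def two_link_ik(x, y, l1=0.3, l2=0.3):
    """Joint angles (radians) placing a 2-link planar arm's tip at (x, y)."""
    d2 = x * x + y * y
    cos_q2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_q2) > 1:
        return None  # target pose unreachable
    q2 = math.acos(cos_q2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2),
                                       l1 + l2 * math.cos(q2))
    return q1, q2
```

A full implementation would solve for all joints of each manipulator against the stored kinematic and geometric model; the two-link case only illustrates the "target pose to joint-space positions" step.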

ROBOTIC MANIPULATORS
20210031373 · 2021-02-04

A robot comprising: a chopstick configured for at least four degrees of freedom of movement, the chopstick being a stiff body with shape and proportions approximating those of a pool cue; an electromagnetic actuator, comprising a motor, for each degree of freedom of movement coupled with the stiff body, wherein the functional mapping from each actuator's motor current to torque output along an axis of motion is stored and used in concert with a calibrated model of the robot for effective impedance control; and a 6-axis force/torque sensor mounted inline between the actuators and each chopstick.
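The stored current-to-torque mapping enables torque-level impedance control without a joint torque sensor. A minimal sketch, assuming the common linear torque-constant model (the constant, gains, and names are illustrative, not the patent's calibration):

```python
# Assumed linear current-to-torque mapping for one actuator axis.
KT_NM_PER_A = 0.12  # calibrated torque constant, N*m per ampere

def torque_from_current(current_a):
    """Stored functional mapping: motor current -> output torque."""
    return KT_NM_PER_A * current_a

def impedance_current(q_des, q, qd, stiffness=5.0, damping=0.5):
    """Joint-space impedance law tau = K*(q_des - q) - B*qd,
    converted into the motor current that realizes that torque."""
    tau = stiffness * (q_des - q) - damping * qd
    return tau / KT_NM_PER_A
```

Inverting the mapping turns a desired virtual spring-damper behavior into a current command, which is the essence of using the calibrated model for effective impedance control.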

SYSTEMS AND METHODS FOR ROBOTIC CONTROL UNDER CONTACT
20210031375 · 2021-02-04

A system comprises: a database; at least one hardware processor coupled with the database; and one or more software modules that, when executed by the at least one hardware processor, receive at least one of sensory data from a robot and images from a camera, identify and build models of objects in an environment, wherein the models encompass immutable properties of identified objects, including mass and geometry, and wherein the geometry is assumed not to change, estimate the state, including position, orientation, and velocity, of the identified objects, determine, based on the state and models, potential configurations, or pre-grasp poses, for grasping the identified objects and return multiple grasping configurations per identified object, determine an object to be picked based on a quality metric, translate the pre-grasp poses into behaviors that define motor forces and torques, and communicate the motor forces and torques to the robot in order to allow the robot to perform a complex behavior generated from the behaviors.
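The selection step (multiple pre-grasp poses per object, one picked by a quality metric) can be sketched as below. The data layout and the scalar quality score are assumptions; the patent does not specify the metric.

```python
# Hypothetical sketch: rank candidate pre-grasp poses across all
# identified objects and select the best one by quality score.

def best_grasp(candidates):
    """candidates: {object_id: [(pose, quality), ...]}.
    Returns (object_id, pose) with the highest quality, or None."""
    best = None
    for obj, grasps in candidates.items():
        for pose, quality in grasps:
            if best is None or quality > best[2]:
                best = (obj, pose, quality)
    return (best[0], best[1]) if best else None
```

Downstream, the chosen pre-grasp pose would be translated into motor force/torque behaviors; only the metric-based selection is shown here.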

LEARNING DEVICE, ROBOT CONTROL SYSTEM, AND LEARNING CONTROL METHOD
20210016439 · 2021-01-21

A learning device includes storage and a learning section. The storage stores a learning model. The learning section causes the learning model to learn training data including captured image data and gripping force data. The captured image data corresponds to data to be input to the learning model. The gripping force data corresponds to data to be output from the learning model. The captured image data is generated by capturing an image of a workpiece to be gripped by a robotic device. The gripping force data indicates the gripping force of the robotic device when gripping the workpiece.
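A minimal stand-in for the learning step described: image-derived features as input, a grip force as the learned output. The patent leaves the model unspecified; a nearest-neighbour lookup is used here purely to make the input/output roles concrete, and all names are assumptions.

```python
# Hypothetical learning-section sketch: pair image features (input)
# with grip-force labels (output), predict by nearest neighbour.

class GripForceModel:
    def __init__(self):
        self.samples = []  # list of (feature_vector, grip_force_n)

    def learn(self, features, grip_force_n):
        """Store one training example (captured image data, gripping force)."""
        self.samples.append((list(features), grip_force_n))

    def predict(self, features):
        """Return the grip force of the closest stored feature vector."""
        def dist(sample):
            return sum((a - b) ** 2 for a, b in zip(sample[0], features))
        return min(self.samples, key=dist)[1]
```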

Robot Grip Detection Using Non-Contact Sensors

A method is provided that includes controlling a robotic gripping device to cause a plurality of digits of the robotic gripping device to move towards each other in an attempt to grasp an object. The method also includes receiving, from at least one non-contact sensor on the robotic gripping device, first sensor data indicative of a region between the plurality of digits of the robotic gripping device. The method further includes receiving, from the at least one non-contact sensor on the robotic gripping device, second sensor data indicative of the region between the plurality of digits of the robotic gripping device, where the second sensor data is based on a different sensing modality than the first sensor data. The method additionally includes determining, using an object-in-hand classifier that takes as input the first sensor data and the second sensor data, a result of the attempt to grasp the object.
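A toy version of the fusion step: the claimed object-in-hand classifier takes two non-contact modalities as input. Here a range reading and a reflected-intensity reading are assumed as the two modalities, with a hand-set decision rule standing in for whatever classifier the method actually trains; thresholds and names are illustrative.

```python
# Hypothetical two-modality object-in-hand check for the region
# between the digits: time-of-flight range plus reflected intensity.

def object_in_hand(range_m, ir_intensity, max_range_m=0.04, min_intensity=0.5):
    """True when both modalities agree something sits between the digits."""
    return range_m < max_range_m and ir_intensity > min_intensity
```

Requiring both modalities to agree is what makes the classifier robust: a short range alone could be a digit occlusion, and high intensity alone could be ambient reflection.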

Object Grasp System and Method
20200331709 · 2020-10-22

A grasping system includes a robotic arm having a gripper. A fixed sensor monitors a grasp area, and an onboard sensor that moves with the gripper also monitors the area. A controller receives information indicative of a position of an object to be grasped and operates the robotic arm to bring the gripper into a grasp position adjacent the object based on information provided by the fixed sensor. The controller is also programmed to operate the gripper to grasp the object in response to information provided by the onboard sensor.
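The two-stage policy described (fixed sensor guides the approach, onboard sensor gates the actual grasp) can be sketched as a small state machine. The states, actions, and tolerance are illustrative assumptions.

```python
# Hypothetical controller step for the fixed/onboard sensor split:
# approach on the fixed sensor's range, grasp on the onboard sensor.

def control_step(state, dist_to_object_m, onboard_sees_object,
                 approach_tol_m=0.005):
    """Return (next_state, action) for one control cycle."""
    if state == "approach":  # driven by the fixed sensor's estimate
        if dist_to_object_m <= approach_tol_m:
            return "grasp", "hold"
        return "approach", "move_toward"
    if state == "grasp":  # gated by the onboard sensor
        if onboard_sees_object:
            return "done", "close_gripper"
        return "grasp", "hold"
    return state, "hold"
```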

ROBOTIC GRIPPER WITH INTEGRATED TACTILE SENSOR ARRAYS

A robotic gripper (end effector) for an arm-type robotic system includes a hierarchical sensor architecture that utilizes a central data processing circuit to generate rich tactile sensory data in response to pressure, temperature, vibration and/or proximity sensor data generated by finger-mounted sensor groups during interactions between the robotic gripper and a target object. The rich tactile sensory data is used to generate feedback signals that directly control finger actuators and/or tactile information that is supplied to the robotic system's control circuit. Sensor data processing circuits are configured to receive single-sensor data signals in parallel from the sensor groups, and to transmit corresponding finger-level sensor data signals on a serial bus/signal line to the central data processing circuit. Each sensor group and its associated sensor data processing circuit are disposed on a PCB structure mounted on a contact portion of an associated gripper finger.
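The hierarchical data path (parallel single-sensor signals in, one serialized finger-level frame out) can be illustrated with a packing sketch. The frame layout is an invented assumption, not the patent's bus protocol.

```python
import struct

# Hypothetical finger-level frame for the serial bus to the central
# data processing circuit: finger id (u8) plus three float32 channels.

def pack_finger_frame(finger_id, pressure, temperature_c, proximity_mm):
    """Serialize one finger's aggregated sensor readings (little-endian)."""
    return struct.pack("<Bfff", finger_id, pressure, temperature_c, proximity_mm)

def unpack_finger_frame(frame):
    """Central-circuit side: recover per-finger readings from a frame."""
    finger_id, p, t, d = struct.unpack("<Bfff", frame)
    return {"finger": finger_id, "pressure": p,
            "temperature_c": t, "proximity_mm": d}
```

Moving the parallel-to-serial conversion onto each finger's PCB keeps the wire count through the gripper low, which is the practical motivation for the per-finger sensor data processing circuits.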

FLEX-RIGID SENSOR ARRAY STRUCTURE FOR ROBOTIC SYSTEMS

A flex-rigid sensor apparatus for providing sensor data from sensors disposed on an end-effector/gripper to the control circuit of an arm-type robotic system. The apparatus includes piezo-type pressure sensors sandwiched between lower and upper PCB stack-up structures respectively fabricated using rigid PCB (e.g., FR-4) and flexible PCB (e.g., polyimide) manufacturing processes. Additional (e.g., temperature and proximity) sensors are mounted on the upper/flexible stack-up structure. A spacer structure is disposed between the two stack-up structures and includes an insulating material layer defining openings that accommodate the pressure sensors. Copper film layers are configured to provide Faraday cages around each pressure sensor. The pressure sensors, additional sensors and Faraday cages are connected to sensor data processing and control circuitry (e.g., analog-to-digital converter circuits) by way of signal traces formed in the lower and upper stack-up structures and in the spacer structure. An encapsulation layer is formed on the upper PCB stack-up structure.