Patent classifications
G05B2219/40625
TACTILE AND/OR OPTICAL DISTANCE SENSOR, SYSTEM HAVING SUCH A DISTANCE SENSOR, AND METHOD FOR CALIBRATING SUCH A DISTANCE SENSOR OR SUCH A SYSTEM
A tactile and/or optical distance sensor includes a housing, which has at least one elongate portion, a measurement arm, which is arranged in the housing, at least partially extends through the elongate portion and has a tactile and/or an optical probe element at one end, a transducer, which is configured to capture a position of the tactile probe element or a signal of the optical probe element and to generate associated probe element measurement signals, and an advance unit, with which the housing is linearly displaceable along an advance direction. A strain sensor is located in the region of the measurement arm extending through the elongate portion or at an adjacent region directly adjoining said region. In addition, a system for measuring the roughness of a surface of a workpiece and a method for calibrating a distance sensor or a system are provided.
METHOD AND APPARATUS FOR MANIPULATING A TOOL TO CONTROL IN-GRASP SLIDING OF AN OBJECT HELD BY THE TOOL
A tool control system may include: a tactile sensor configured to, when a tool holds a target object and slides the target object downward across the tool, obtain tactile sensing data from the tool; one or more memories configured to store a target velocity and computer-readable instructions; and one or more processors configured to execute the computer-readable instructions to: receive the tactile sensing data from the tactile sensor; estimate a velocity of the target object based on the tactile sensing data, by using one or more neural networks that are trained based on a training image of a sample object captured while the sample object is sliding down; and generate a control parameter of the tool based on the estimated velocity and the target velocity.
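The control loop described above, estimate the slide velocity from tactile data, then derive a control parameter from the gap to the target velocity, can be sketched as follows. This is a minimal illustration, not the patented method: `estimate_velocity` is a hypothetical stand-in for the trained neural network (here a finite difference of marker centroids), and the proportional grip-force rule and its gains are assumptions.

```python
def estimate_velocity(tactile_frames, dt):
    """Hypothetical stand-in for the learned estimator: finite difference
    of a tracked marker centroid between two tactile frames (pixels/s)."""
    (x0, y0), (x1, y1) = tactile_frames
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt

def grip_control(v_est, v_target, f_base=5.0, gain=0.5, f_min=0.0, f_max=10.0):
    """Proportional rule: tighten the grip when the object slides faster
    than desired, relax it when slower; clamp to actuator limits."""
    delta = gain * (v_est - v_target)
    return min(f_max, max(f_min, f_base + delta))
```

A real implementation would close this loop at the tactile sensor's frame rate and feed the force command to the gripper controller.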
TACTILE DEXTERITY AND CONTROL
Systems and methods relating to tactile dexterity and control are disclosed. In one embodiment, a method of manipulating an object based on tactile sensing includes sensing an object by receiving signals from a tactile sensor of an end effector of a robotic system in contact with the object, controlling a contact state by operating the end effector to enforce a desired contact condition between the end effector and the object, estimating a pose of the object based on the received signals, and planning at least one trajectory of the object based on the estimated pose of the object and a desired pose of the object.
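The final planning step, producing an object trajectory from the estimated pose and a desired pose, can be illustrated with a minimal sketch. The straight-line interpolation of a planar pose below is an assumption for illustration; the patented method plans trajectories subject to contact conditions, which this ignores.

```python
def plan_trajectory(pose_est, pose_goal, steps):
    """Linearly interpolate a planar pose (x, y, theta) from the
    estimated pose to the desired pose in `steps` increments."""
    return [tuple(a + (b - a) * t / steps for a, b in zip(pose_est, pose_goal))
            for t in range(steps + 1)]
```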
SYSTEMS AND METHODS FOR DETERMINING POSE OF OBJECTS HELD BY FLEXIBLE END EFFECTORS
Systems and methods for determining a pose of an object held by a flexible end effector of a robot are disclosed. A method of determining a pose of the object includes receiving tactile data from tactile sensors, receiving curvature data from curvature sensors, determining a plurality of segments of the flexible end effector from the curvature data, assigning a frame to each segment, determining a location of each point of contact between the object and the flexible end effector from the tactile data, calculating a set of relative transformations and determining a location of each point relative to one of the frames, generating continuous data from the determined location of each point, and providing the continuous data to a pose determination algorithm that uses the continuous data to determine the pose of the object.
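The core geometric step, assigning a frame to each segment of the flexible end effector, composing the relative transformations, and expressing each contact point in a common frame, can be sketched in 2D as follows. The planar homogeneous transforms and the specific frame parameterization (rotation plus translation per segment) are assumptions for illustration.

```python
import math

def segment_frame(angle, tx, ty):
    """2D homogeneous transform for one segment frame, relative to the
    previous segment: rotation by `angle`, then translation (tx, ty)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, tx], [s, c, ty], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def contact_in_base(frames, seg_idx, point):
    """Compose the chain of relative segment transforms up to `seg_idx`
    and map a contact point from that segment's frame into the base frame."""
    T = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for f in frames[: seg_idx + 1]:
        T = matmul(T, f)
    x, y = point
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])
```

Mapping every tactile contact into one frame this way yields the continuous data that the pose-determination algorithm consumes.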
Tactile Sensor
A tactile sensor including a cap having a top surface and an undersurface. The undersurface includes pins, each pin having a mark. A portion of the undersurface is attachable to a device. A camera positioned in view of the marks captures images of the marks placed in motion by elastic deformation of the top surface of the cap. A processor receives the captured images and determines a set of relative positions of the marks in the captured images, by identifying measured image coordinates of locations in the captured images. The processor determines a net force tensor acting on the top surface using a stored machine vision algorithm, by matching the set of relative positions of the marks to a stored set of previously learned relative positions of the marks placed in motion. A controller controls the device in response to the net force tensor determined in the processor.
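The matching step, comparing observed relative marker positions against previously learned marker patterns to recover the acting force, can be sketched as a nearest-template lookup. The sum-of-squared-distances criterion and the template structure below are assumptions; the patent's machine vision algorithm is not specified at this level.

```python
def match_force(observed, templates):
    """Match an observed set of relative marker positions against stored
    templates of previously learned marker patterns; return the force
    label of the closest template (sum of squared marker distances)."""
    def dist(a, b):
        return sum((ax - bx) ** 2 + (ay - by) ** 2
                   for (ax, ay), (bx, by) in zip(a, b))
    return min(templates, key=lambda t: dist(observed, t["markers"]))["force"]
```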
DETECTING SLIPPAGE FROM ROBOTIC GRASP
A plurality of sensors are configured to provide a corresponding output that reflects a sensed value associated with engagement of a robotic arm end effector with an item. The respective outputs of one or more sensors comprising the plurality of sensors are used to determine one or more inputs to a multi-modal model configured to provide, based at least in part on the one or more inputs, an output associated with slippage of the item within or from a grasp of the robotic arm end effector. A determination associated with slippage of the item within or from the grasp of the robotic arm end effector is made based at least in part on an output of the multi-modal model. A responsive action is taken based at least in part on the determination associated with slippage of the item within or from the grasp of the robotic arm end effector.
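The pipeline above, fuse per-sensor inputs through a multi-modal model, obtain a slippage output, and take a responsive action, can be sketched with a toy fusion model. The logistic score, the feature set, and the action names are all assumptions; the patented multi-modal model is not specified.

```python
import math

def slip_probability(features, weights, bias=0.0):
    """Toy multi-modal fusion: a logistic score over per-sensor features
    (e.g. normalized shear, vibration energy, grip-force drop)."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def responsive_action(p_slip, threshold=0.5):
    """Tighten the grasp when the model flags likely slippage."""
    return "increase_grip" if p_slip >= threshold else "continue"
```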
IN-HAND OBJECT POSE TRACKING
Apparatuses, systems, and techniques are described that estimate the pose of an object while the object is being manipulated by a robotic appendage. In at least one embodiment, a sample-based optimization algorithm tracks in-hand object poses during manipulation via contact feedback and a GPU-accelerated robotic simulation is developed. In at least one embodiment, parallel simulations concurrently model object pose changes that may be caused by complex contact dynamics. In at least one embodiment, the optimization algorithm tunes simulation parameters during object pose tracking to further improve tracking performance. In various embodiments, real-world contact sensing may be improved by utilizing vision in-the-loop.
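The sample-based optimization loop, perturb candidate poses, evaluate each through a contact simulation, and keep the candidate whose predicted contact best matches the feedback, can be sketched as follows. The Gaussian perturbation, the squared-error score, and the identity-style `simulate` callback are assumptions standing in for the GPU-accelerated parallel simulation.

```python
import random

def track_pose(pose, observed_contact, simulate, n_samples=64, sigma=0.05, seed=0):
    """Sample-based update: perturb the current pose estimate, run each
    candidate through a (stand-in) contact simulator, and keep the
    candidate whose predicted contact best matches the observation."""
    rng = random.Random(seed)
    best = pose
    best_err = sum((a - b) ** 2 for a, b in zip(simulate(pose), observed_contact))
    for _ in range(n_samples):
        cand = tuple(p + rng.gauss(0, sigma) for p in pose)
        err = sum((a - b) ** 2 for a, b in zip(simulate(cand), observed_contact))
        if err < best_err:
            best, best_err = cand, err
    return best
```

Scoring the current estimate first means an update step never regresses, which mirrors the role of tuning during tracking described above.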
HAPTIC PHOTOGRAMMETRY IN ROBOTS AND METHODS FOR OPERATING THE SAME
Robots, robot systems, and methods for operating the same based on environment models including haptic data are described. An environment model which includes representations of objects in an environment is accessed, and a robot system is controlled based on the environment model. The environment model includes haptic data, which provides more effective control of the robot. The environment model is populated based on visual profiles, haptic profiles, and/or other data profiles for objects or features retrieved from respective databases. Identification of objects or features can be based on cross-referencing between visual and haptic profiles, to populate the environment model with data not directly collected by a robot which is populating the model, or data not directly collected from the actual objects or features in the environment.
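The cross-referencing idea, identify an object by touch, then populate the environment model with visual data the robot never directly collected, reduces to a keyed lookup across profile databases. The dict-based databases and the shared-ID scheme below are assumptions for illustration.

```python
def cross_reference(object_id, haptic_db, visual_db):
    """Merge the haptic profile identified for an object with the visual
    profile stored under the same ID, so the environment model gains data
    the robot did not directly collect."""
    entry = dict(haptic_db[object_id])
    entry.update(visual_db.get(object_id, {}))
    return entry
```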
ROBOT SYSTEM AND METHOD FOR CONTROLLING ROBOT
A robot system and method for controlling a robot, wherein during learning, a detection unit detects, as waveform data for learning, the contact state when a socket is caused to contact a set position on the head of a bolt and to rotate around the set position within a set movement range. A learning unit learns a plurality of detected sets of the waveform data for learning and writes the learning results to a determination unit. During practical operation, the determination unit recognizes the amount by which the socket slips with respect to the bolt on the basis of the written learning results and actual waveform data indicating the change in the contact state while the socket is in contact with the head of the bolt.
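The learn-then-recognize flow above can be sketched as a nearest-neighbor lookup over stored labeled waveforms. This is a deliberately simple stand-in: the patent's learning unit is presumably far more elaborate, and the sample-wise squared-distance criterion is an assumption.

```python
def learn(waveforms, slip_amounts):
    """'Learning' here is simply storing labeled training waveforms,
    producing the results written to the determination unit."""
    return list(zip(waveforms, slip_amounts))

def recognize_slip(model, waveform):
    """Return the slip amount of the nearest stored training waveform
    (sum of squared sample-wise differences)."""
    return min(model, key=lambda t: sum((a - b) ** 2
                                        for a, b in zip(t[0], waveform)))[1]
```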
END EFFECTOR AND END EFFECTOR DEVICE
The end effector includes a palm, a plurality of fingers capable of a grasping operation, a tactile sensor unit provided on each of the plurality of fingers, and a force receiving portion that receives a force from the object being grasped when the object is grasped by the plurality of fingers, the force receiving portion being connected to each of the plurality of fingers via the tactile sensor unit. The force receiving portion includes a grasping surface that receives a force from the object being grasped, the grasping surface being placed facing the object so as to be able to grasp it, and a pressing surface that is placed further away from the palm than the second end portion of each of the plurality of fingers and extends in a direction intersecting the grasping surface.