Patent classifications
G05B2219/40625
POSITION DETECTION METHOD, CONTROLLER, AND ROBOT SYSTEM
A method includes: (a) causing a robotic arm to position a contacting structure of the arm laterally in a horizontal direction in relation to a first subject on a target object; (b) causing the arm to bring the contacting structure into contact with at least three locations on the first subject; (c) detecting positions of the contacting structure in relation to the robot when contacting the locations; (d) detecting a position of the first subject in relation to the robot by using the detected positions of the contacting structure; (e) performing the same steps as the steps (a) to (d) for a second subject on the target object; and (f) detecting a position of the robot in relation to the target object by using the positions of the subjects in relation to the robot and using positions of the subjects in relation to the target object.
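The three-contact step in (b)–(d) resembles a standard circle-fitting measurement: if the first subject is a circular feature, the circumcenter of the three contact points gives its position. A minimal 2-D sketch (the function name and the circular-feature assumption are illustrative, not taken from the patent):

```python
def circumcenter(p1, p2, p3):
    """Center (x, y) of the circle through three 2-D contact points."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        raise ValueError("contact points are collinear")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy
```

Repeating this for a second feature yields two known points in the robot frame, which is enough to solve for the planar pose of the robot relative to the target object, as in step (f).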
ROBOTIC MANIPULATOR WITH VISUAL GUIDANCE & TACTILE SENSING
A robotic manipulator includes one or multiple end effectors that can engage with an object, and one or multiple cameras that simultaneously observe each end effector and the surrounding environment. For example, an end effector can include a contact surface with tactile markers that deform when the end effector contacts the object.
Tactile sensor
A tactile sensor including a cap having a top surface and an undersurface. The undersurface includes pins, and each pin has a mark. A portion of the undersurface is attachable to a device. A camera positioned in view of the marks captures images of the marks placed in motion by elastic deformation of the top surface of the cap. A processor receives the captured images and determines a set of relative positions of the marks in the captured images by identifying measured image coordinates of locations in the captured images. The processor determines a net force tensor acting on the top surface using a stored machine vision algorithm, by matching the set of relative positions of the marks to a stored set of previously learned relative positions of the marks placed in motion. A controller controls the device in response to the net force tensor determined in the processor.
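The matching step described above can be sketched as a nearest-pattern lookup: observed marker displacements are compared against previously learned displacement patterns, and the force associated with the closest pattern is returned. This is a crude stand-in for the patent's learned machine-vision matching; the function name and data layout are assumptions:

```python
def match_force(observed, learned):
    """observed: list of (dx, dy) marker displacements.
    learned: list of (pattern, force) pairs, each pattern having the
    same marker count as `observed`.
    Returns the force whose pattern is closest in a least-squares sense."""
    def dist(a, b):
        return sum((ax - bx) ** 2 + (ay - by) ** 2
                   for (ax, ay), (bx, by) in zip(a, b))
    return min(learned, key=lambda pf: dist(observed, pf[0]))[1]
```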
Tactile perception apparatus for robotic systems
A human-like tactile perception apparatus for providing enhanced tactile information (feedback data) from an end-effector/gripper to the control circuit of an arm-type robotic system. The apparatus's base structure is attached to the gripper's finger and includes a flat/planar support plate that presses a pressure sensor array against a target object during operable interactions. The pressure sensor array generates pressure sensor data that indicates portions of the array contacted by surface features of the target object. A sensor data processing circuit generates tactile information in response to the pressure sensor data, and then transmits the tactile information to the robotic system's control circuit. An optional mezzanine connector extends through an opening in the support plate to pass pressure sensor data to the processing circuit. An encapsulating layer covers the pressure sensor array and transmits pressure waves generated by slipping objects to enhance the tactile information.
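The first stage of the sensor-data processing described above, identifying which portions of the array are contacted by surface features, can be sketched as a simple threshold over the pressure readings (threshold value and data layout are illustrative assumptions):

```python
def contact_map(pressures, threshold):
    """Binary contact map from a 2-D pressure array (row-major lists):
    1 where the reading exceeds the threshold, else 0."""
    return [[1 if p > threshold else 0 for p in row] for row in pressures]
```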
Tactile and/or optical distance sensor, system having such a distance sensor, and method for calibrating such a distance sensor or such a system
A tactile and/or optical distance sensor includes a housing, which has at least one elongate portion, a measurement arm, which is arranged in the housing, at least partially extends through the elongate portion and has a tactile and/or an optical probe element at one end, a transducer, which is configured to capture a position of the tactile probe element or a signal of the optical probe element and to generate associated probe element measurement signals, and an advance unit, with which the housing is linearly displaceable along an advance direction. A strain sensor is located in the region of the measurement arm extending through the elongate portion or at an adjacent region directly adjoining said region. In addition, a system for measuring the roughness of a surface of a workpiece and a method for calibrating a distance sensor or a system are provided.
Robotic Touch Perception
An apparatus such as a robot capable of performing goal-oriented tasks may include one or more touch sensors to receive touch perception feedback on the location of objects and structures within an environment. A fusion engine may be configured to combine touch perception data with other types of sensor data such as data received from an image or distance sensor. The apparatus may combine distance sensor data with touch sensor data using inference models such as Bayesian inference. The touch sensor may be mounted onto an adjustable arm of a robot. The apparatus may use the data it has received from both a touch sensor and distance sensor to build a map of its environment and perform goal-oriented tasks such as cleaning or moving objects.
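The Bayesian fusion of distance-sensor and touch-sensor data mentioned above can be sketched, under a Gaussian-belief assumption, as a precision-weighted average of the two sensors' estimates (the function and the Gaussian simplification are illustrative, not the patent's specific model):

```python
def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Fuse two Gaussian beliefs about the same quantity (e.g. an
    object's position from a distance sensor and a touch sensor).
    Returns the mean and variance of the product of the two Gaussians:
    the fused variance is always smaller than either input variance."""
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    mu = var * (mu_a / var_a + mu_b / var_b)
    return mu, var
```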
CONTROL DEVICE, CONTROL METHOD, AND PROGRAM
The present disclosure relates to a control device, a control method, and a program capable of supporting an object with a more appropriate supporting force. The control device includes a supporting force control unit that controls a supporting force for supporting an object on the basis of information regarding a shape of a contact portion in contact with the object and information regarding a shear force of the contact portion. The information regarding the shear force includes, for example, information regarding a shear displacement of the contact portion. The present disclosure can be applied to, for example, a control device, a control method, an electronic device, a robot, a support system, a gripping system, a program, and the like.
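One simple reading of the control law above is that shear displacement at the contact portion signals incipient slip, so the supporting force is increased in proportion to the displacement beyond a slip threshold. The sketch below is a hypothetical illustration of that idea; the threshold, gain, and function name are not from the disclosure:

```python
def supporting_force(f_current, shear_disp, threshold=0.5e-3, gain=200.0):
    """Return an updated supporting force (N) given the measured shear
    displacement (m) of the contact portion: increase the force in
    proportion to displacement beyond the slip threshold, else hold."""
    excess = shear_disp - threshold
    if excess > 0:
        return f_current + gain * excess
    return f_current
```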
Method and apparatus for manipulating a tool to control in-grasp sliding of an object held by the tool
A tool control system may include: a tactile sensor configured to, when a tool holds a target object and slides the target object downward across the tool, obtain tactile sensing data from the tool; one or more memories configured to store a target velocity and computer-readable instructions; and one or more processors configured to execute the computer-readable instructions to: receive the tactile sensing data from the tactile sensor; estimate a velocity of the target object based on the tactile sensing data, by using one or more neural networks that are trained based on a training image of a sample object captured while the sample object is sliding down; and generate a control parameter of the tool based on the estimated velocity and the target velocity.
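The last step, generating a control parameter from the estimated and target velocities, can be sketched as a simple proportional update of a normalized grip effort: tighten when the object slides faster than desired, loosen when slower. This is a generic control sketch, not the patent's actual controller; all names and gains are assumptions:

```python
def grip_command(v_est, v_target, u_prev, kp=0.8, u_min=0.0, u_max=1.0):
    """Update a normalized grip effort u in [u_min, u_max] from the
    velocity error: positive error (sliding too fast) tightens grip."""
    u = u_prev + kp * (v_est - v_target)
    return max(u_min, min(u_max, u))
```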
ROBOT HAND
A robot hand is provided. The robot hand includes first and second drive gears rotated by first and second actuators, respectively; a first interlocked gear interlocked with the second drive gear to rotate in the opposite direction; a second interlocked gear interlocked with the first drive gear to rotate in the opposite direction; a first inner link engaged with rotation of the first drive gear; a first outer link engaged with rotation of the first interlocked gear; a first end link connected to the first inner link and the first outer link opposite the first actuator; a second inner link engaged with rotation of the second interlocked gear; a second outer link engaged with rotation of the second drive gear; and a second end link connected to the second inner link and the second outer link opposite the second actuator.