Patent classifications
G05B2219/39057
Pose calibration method, robot and computer readable storage medium
A pose calibration method, a robot, and a computer readable storage medium are provided. The method includes: obtaining, through a depth camera on a robot, a depth image including a target plane (i.e., the plane where the robot is located); determining point cloud data corresponding to the depth image; and calibrating a target pose of the depth camera based on the point cloud data and a preset optimization method, that is, calibrating the pitch angle and roll angle of the depth camera and the height of the depth camera in the coordinate system of the robot. In this manner, the accuracy of the target pose calibration can be effectively improved while the implementation remains simple and the amount of calculation small, and the efficiency of the calibration can also be improved, thereby improving the user experience.
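The plane-based calibration this abstract describes can be sketched in a few lines: fit a plane to the floor point cloud with an SVD, then read pitch, roll, and height off the plane normal. The axis convention (camera z forward, y down), angle signs, and function name below are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def calibrate_from_floor_plane(points):
    """Fit a plane to floor points (camera frame: z forward, y down)
    and recover the camera's pitch, roll, and height above the floor."""
    centroid = points.mean(axis=0)
    # The right singular vector for the smallest singular value of the
    # centered cloud is the least-squares plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    if normal @ centroid > 0:           # orient the normal toward the camera
        normal = -normal
    height = abs(normal @ centroid)     # camera-origin-to-plane distance
    # For a level camera the floor normal is (0, -1, 0); deviations of the
    # normal give pitch (about x) and roll (about z). Signs depend on the
    # chosen convention.
    pitch = np.arctan2(normal[2], -normal[1])
    roll = np.arctan2(-normal[0], -normal[1])
    return pitch, roll, height
```

In practice the "preset optimization method" could also be a robust fit (e.g. RANSAC) to reject non-floor points; the SVD fit above assumes the cloud has already been cropped to the floor.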
Robot hand-eye calibration method and apparatus, computing device, medium and product
When a force sensor on a robot arm detects that the force of contact between an end of a calibration device and a calibration plate reaches a threshold, the robot arm stops, and the end of the calibration device performs marking at the contact position between the end of the calibration device and the calibration plate. The robot arm moves upward and stops at a position where the end of the robot arm is at a predetermined height. At this position, a camera at the end of the robot arm photographs marks on the calibration plate, records the coordinates of the marks in the camera coordinate system, and records the coordinates of the end of the calibration device in the robot coordinate system. A calibration transformation matrix is calculated according to the recorded coordinates of at least three marks.
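Computing a "calibration transformation matrix" from at least three paired coordinates, as described above, is the classic absolute-orientation problem. A minimal sketch using the Kabsch algorithm (the patent does not name the solver; this is one standard choice):

```python
import numpy as np

def rigid_transform(cam_pts, rob_pts):
    """Least-squares R, t with rob_pts[i] ~= R @ cam_pts[i] + t (Kabsch).
    Needs at least three non-collinear point pairs."""
    cc, rc = cam_pts.mean(axis=0), rob_pts.mean(axis=0)
    H = (cam_pts - cc).T @ (rob_pts - rc)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = rc - R @ cc
    return R, t
```

The "at least three marks" requirement in the abstract matches this solver's minimum: three non-collinear correspondences fix a unique rigid transform.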
Robot Device Configured to Determine an Interaction Machine Position of at Least One Element of a Predetermined Interaction Machine, and Method
A robot device includes an optical detection device configured to detect a surrounding area image of an area surrounding the robot device. The robot device further includes a control device storing a predetermined reference marking and a predetermined reference position of the reference marking. The control device is configured to detect an image detail that shows the reference marking of the interaction machine in the surrounding area image of the area surrounding the robot device, detect the predetermined reference marking in the image detail, determine a distortion of the predetermined reference marking in the image detail, determine a spatial position of the reference marking, determine an interaction machine position of at least one element of the interaction machine with respect to the robot device from the spatial position of the reference marking, and subject the robot device to closed-loop control and/or open-loop control.
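Determining the pose of the reference marking from its "distortion" in the image amounts to planar pose recovery. A minimal sketch, assuming a planar marker with known geometry and normalized (intrinsics-free) image coordinates: estimate the plane-to-image homography by DLT, then decompose it into rotation and translation. All names and the normalization assumption are illustrative, not from the patent.

```python
import numpy as np

def marker_pose(plane_xy, image_uv):
    """Pose of a planar marker from >= 4 point correspondences.
    plane_xy: (N,2) marker-plane coords; image_uv: (N,2) normalized image coords."""
    # DLT: build the standard 2N x 9 system for the homography H.
    rows = []
    for (X, Y), (u, v) in zip(plane_xy, image_uv):
        rows.append([X, Y, 1, 0, 0, 0, -u*X, -u*Y, -u])
        rows.append([0, 0, 0, X, Y, 1, -v*X, -v*Y, -v])
    _, _, vt = np.linalg.svd(np.array(rows, float))
    H = vt[-1].reshape(3, 3)
    if H[2] @ np.array([*plane_xy[0], 1.0]) < 0:    # enforce positive depth
        H = -H
    # H is proportional to [r1 r2 t]; recover scale from the rotation columns.
    s = 2.0 / (np.linalg.norm(H[:, 0]) + np.linalg.norm(H[:, 1]))
    r1, r2, t = s * H[:, 0], s * H[:, 1], s * H[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)                     # project onto SO(3)
    return U @ Vt, t
```

The recovered marker pose, combined with the stored reference position, yields the interaction machine position used for the closed-loop or open-loop control step.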
Simultaneous Kinematic and Hand-Eye Calibration
Described are machine vision systems and methods for simultaneous kinematic and hand-eye calibration. A machine vision system includes a robot or motion stage and a camera in communication with a control system. The control system is configured to move the robot or motion stage to poses, and for each pose: capture an image of calibration target features and robot joint angles or motion stage encoder counts. The control system is configured to obtain initial values for robot or motion stage calibration parameters, and determine initial values for hand-eye calibration parameters based on the initial values for the robot or motion stage calibration parameters, the image, and joint angles or encoder counts. The control system is configured to determine final values for the hand-eye calibration parameters and robot or motion stage calibration parameters by refining the hand-eye calibration parameters and robot or motion stage calibration parameters to minimize a cost function.
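The final refinement step above — jointly adjusting hand-eye and kinematic parameters to minimize a cost function — is a nonlinear least-squares solve. The patent does not specify the solver; a generic Gauss-Newton sketch with a forward-difference Jacobian, purely illustrative:

```python
import numpy as np

def gauss_newton(residual_fn, x0, iters=25, eps=1e-6):
    """Minimize 0.5 * ||residual_fn(x)||^2 by Gauss-Newton iterations.
    residual_fn maps an n-vector of parameters to an m-vector of residuals."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual_fn(x)
        # Forward-difference Jacobian, one column per parameter.
        J = np.column_stack([
            (residual_fn(x + eps * np.eye(len(x))[j]) - r) / eps
            for j in range(len(x))
        ])
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

In the patented setup the residuals would stack, over all poses, the difference between predicted and observed calibration-target features, with the parameter vector holding both the hand-eye transform and the kinematic corrections.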
Robot Control Device, Robot, Robot System, And Calibration Method Of Camera
A robot control device includes a processor that creates a parameter of a camera including a coordinate transformation matrix between a hand coordinate system of an arm and a camera coordinate system of the camera. The processor calculates a relationship between an arm coordinate system and a pattern coordinate system at the time of capturing a pattern image of the calibration pattern, and estimates the coordinate transformation matrix between the hand coordinate system of the arm and the camera coordinate system of the camera using the relationship between the arm coordinate system and the pattern coordinate system, a position and attitude of the arm at the time of capturing the pattern image, and the pattern image.
Image processing apparatus, image processing system, image processing method, and computer program
A movement command to move an end effector to a plurality of predetermined positions is transmitted to a robot controller so as to change a relative position of a target, which becomes an imaging target, with respect to an imaging device. First coordinate values are acquired, the values being each of position coordinates of the end effector having moved in accordance with the movement command, and an image of the target is captured at each movement destination, to which the end effector has moved. Second coordinate values being position coordinates of the target are detected based on the image of the target captured at each movement destination, and a conversion rule between both of the coordinates is calculated based on the first coordinate values and the second coordinate values.
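The "conversion rule" between the first coordinate values (end-effector positions) and the second coordinate values (detected target positions) is commonly modeled as a 2D affine map fitted by least squares over the coordinate pairs. A sketch under that modeling assumption (the patent does not fix the model; names are illustrative):

```python
import numpy as np

def fit_conversion_rule(image_pts, robot_pts):
    """Least-squares 2D affine map: robot ~= A @ image + b, from N >= 3 pairs."""
    X = np.hstack([image_pts, np.ones((len(image_pts), 1))])   # N x 3
    M, *_ = np.linalg.lstsq(X, robot_pts, rcond=None)          # 3 x 2
    A, b = M[:2].T, M[2]
    return A, b

def image_to_robot(A, b, uv):
    """Apply the fitted conversion rule to image coordinates."""
    return uv @ A.T + b
```

An affine model absorbs scale, rotation, and shear between the two frames; if lens distortion matters, a higher-order model would be fitted from the same pairs instead.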
ROBOT AND ROBOT SYSTEM
A robot includes an instruction receiving unit that receives a calibration initiation instruction, and an arm that changes a positional relationship between a marker which indicates a reference point and a capturing unit when the calibration initiation instruction is received, in which the calibration of a coordinate system of the capturing unit and a coordinate system of the robot is performed on the basis of an image in which the marker is captured by the capturing unit after the positional relationship between the capturing unit and the marker changes.
METHOD OF TEACHING ROBOT AND ROBOT SYSTEM
A robot system includes a robot, a vision sensor, and a controller. The vision sensor is configured to be detachably attached to the robot. The controller is configured to measure a reference object by using the vision sensor and calibrate a relative relationship between a sensor portion of the vision sensor and an engagement portion of the vision sensor, and teach the robot by referring to the relative relationship and by using the vision sensor, after the vision sensor is attached to the robot.
CONTROL SYSTEM, CONTROL METHOD, AND CAMERA CONTROL DEVICE
Provided is a control system that controls an imaging condition of a camera mounted on a robot so as to capture an image in which blur is suppressed.
The control system includes a robot control unit that controls the motion of the robot equipped with the camera, and a camera control unit that controls the imaging operation of the camera. The camera control unit determines an exposure time that keeps the blur occurring in a captured image within an allowable number of pixels for the moving speed of the camera; in a case where such an exposure time cannot be set, the camera control unit instructs the robot control unit to update the speed limit value of the robot.
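The exposure/blur trade-off described above reduces to simple arithmetic: blur in pixels is approximately camera speed × exposure time ÷ image scale. A sketch of both directions of the check (parameter names and units are illustrative, not from the patent):

```python
def max_exposure_s(speed_mm_s, mm_per_px, allowable_blur_px):
    """Longest exposure that keeps motion blur within the allowable pixel count."""
    return allowable_blur_px * mm_per_px / speed_mm_s

def required_speed_limit_mm_s(min_exposure_s, mm_per_px, allowable_blur_px):
    """If the camera cannot expose shorter than min_exposure_s, the robot's
    speed limit must be lowered to at most this value."""
    return allowable_blur_px * mm_per_px / min_exposure_s
```

For example, at 100 mm/s with 0.1 mm/pixel and a 2-pixel blur budget, the exposure must stay below 2 ms; if the camera's minimum exposure were longer, the second function gives the reduced speed limit the camera control unit would request.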