Patent classification: B25J9/1692
HAND-EYE CALIBRATION OF CAMERA-GUIDED APPARATUSES
The invention describes a generic framework for hand-eye calibration of camera-guided apparatuses, wherein the rigid 3D transformation between the apparatus and the camera must be determined. An example of such an apparatus is a camera-guided robot.
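As a rough illustration of the underlying constraint (not the patent's specific method), the unknown hand-eye transform X relates a robot motion A to the corresponding camera motion B through A·X = X·B. The Python sketch below builds a consistent (A, B) pair from an assumed X and checks that residual; all transforms are hypothetical demo values.

```python
import math

def mat_mul(A, B):
    # 4x4 homogeneous-matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rot_z(theta, tx=0.0, ty=0.0, tz=0.0):
    # homogeneous rigid transform: rotation about z plus a translation
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, tx],
            [s,  c, 0, ty],
            [0,  0, 1, tz],
            [0,  0, 0, 1]]

def inverse(T):
    # invert a rigid transform: R^T and -R^T t
    R = [row[:3] for row in T[:3]]
    t = [T[i][3] for i in range(3)]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    ti = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]], [0, 0, 0, 1]]

# X: the (normally unknown) hand-eye transform, chosen here for the demo
X = rot_z(0.3, 0.1, -0.2, 0.05)

# A: a relative robot (gripper) motion between two poses
A = rot_z(0.7, 0.4, 0.0, 0.1)

# B: the camera motion implied by A, via B = X^-1 A X
B = mat_mul(mat_mul(inverse(X), A), X)

# The calibration constraint A X = X B should hold to numerical precision
AX, XB = mat_mul(A, X), mat_mul(X, B)
residual = max(abs(AX[i][j] - XB[i][j]) for i in range(4) for j in range(4))
print(residual < 1e-9)  # True
```

Real hand-eye calibration runs this constraint in reverse: collect many (A, B) pairs and solve for the X that minimizes the residual.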
METHOD FOR CORRECTING ROBOT
A method for correcting a robot is provided. The method includes: providing a correction device, wherein the correction device comprises a jig wafer; grabbing and/or transferring the jig wafer by using the robot to obtain collected data; determining, based on the collected data, whether the robot needs to be corrected; and in response to determining that the robot needs to be corrected, obtaining a compensation value according to the collected data, and correcting the robot based on the compensation value.
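A minimal sketch of this correct-if-needed logic: compare the positions collected with the jig wafer against a reference, and derive a compensation value when the mean deviation exceeds a tolerance. The reference position, collected data, units, and tolerance below are all hypothetical.

```python
# Nominal pick position of the jig wafer and positions collected by the robot
reference = (100.0, 250.0)                      # assumed units
collected = [(100.8, 249.5), (101.2, 249.3), (100.9, 249.6)]
tolerance = 0.5                                  # max acceptable mean deviation

# Mean deviation of the collected data from the reference
n = len(collected)
dx = sum(p[0] for p in collected) / n - reference[0]
dy = sum(p[1] for p in collected) / n - reference[1]

needs_correction = max(abs(dx), abs(dy)) > tolerance
# The compensation value is the negated mean offset, applied to the taught position
compensation = (-dx, -dy) if needs_correction else (0.0, 0.0)
print(needs_correction, compensation)
```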
DEVICE FOR CORRECTING ROBOTIC ARM
Embodiments relate to a device for correcting a robotic arm, including: a first robotic arm positioned in a vacuum transmission chamber; a first jig wafer comprising a first wafer body and a first jig positioned on a front surface of the first wafer body; a first distance measuring sensor positioned at a center position of a back surface of the first wafer body and configured to detect whether a center of the first jig wafer is aligned with a center of a wafer chuck; and a second distance measuring sensor positioned on the front surface of the first wafer body and on an outside of the first jig and configured to detect a lifting height of the first robotic arm when the first robotic arm performs a pick-and-place operation on the first jig wafer on an upper surface of the wafer chuck.
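A toy sketch of the centre-alignment check the first distance measuring sensor enables: when the sensor sits over the chuck centre its reading matches an expected value. The expected reading and tolerance below are assumed values, not from the patent.

```python
# Assumed reading of the centre distance sensor when the jig wafer is centred
EXPECTED_CENTER_READING = 12.0   # hypothetical distance units
ALIGN_TOLERANCE = 0.3            # hypothetical acceptance band

def centers_aligned(sensor_reading):
    """Flag whether the jig wafer's centre sits over the wafer-chuck centre."""
    return abs(sensor_reading - EXPECTED_CENTER_READING) < ALIGN_TOLERANCE

print(centers_aligned(12.1), centers_aligned(9.8))  # True False
```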
Robot-conveyor calibration method, robot system and control system
A method for calibrating a robot coordinate system of a robot with a conveyor coordinate system of a movable conveyor member, the method including providing a sensor configured to detect positions of the robot in a non-contact manner; detecting a position of the robot by the sensor in a sensor coordinate system when the conveyor member is positioned at a first operating position; detecting a position of the robot and/or of the conveyor member by the sensor in the sensor coordinate system when the conveyor member is positioned at a second operating position different from the first operating position; and determining a relationship between the robot coordinate system and the conveyor coordinate system based on at least one detected position of the robot in the sensor coordinate system. A robot system and a control system are also provided.
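The two-position idea can be sketched roughly as follows, assuming the sensor reports detected positions in the robot coordinate system and the conveyor member reports encoder counts. The values, 2D geometry, and linear-axis assumption are all illustrative.

```python
import math

# Detections of a point on the conveyor member at two operating positions,
# expressed in the robot coordinate system (illustrative values)
e1, p1 = 1000, (0.20, 0.50)   # encoder count, (x, y) in metres
e2, p2 = 3000, (0.80, 0.50)

# Conveyor travel direction and scale (metres per encoder count)
dx, dy = p2[0] - p1[0], p2[1] - p1[1]
length = math.hypot(dx, dy)
direction = (dx / length, dy / length)
scale = length / (e2 - e1)

def conveyor_to_robot(encoder):
    """Map a conveyor encoder value to robot coordinates along the travel axis."""
    d = (encoder - e1) * scale
    return (p1[0] + d * direction[0], p1[1] + d * direction[1])

print(conveyor_to_robot(2000))  # midway between the two detections
```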
Method of plotting ultraviolet (UV) radiation for disinfection
Implementations of the disclosed subject matter provide a method of moving, using a drive system, a mobile robot within an area, and detecting, using at least one sensor of the mobile robot, at least one of air within the area, a surface within the area, and/or an object within the area. The area may be mapped in three dimensions based on the detecting of at least one of the air, the surface, and the object as the mobile robot moves within the area. Ultraviolet (UV) light may be emitted from a light source of the mobile robot to disinfect at least a portion of the area. A representation of the emission of the UV light may be plotted onto the mapped area to generate an exposure plot, where the representation is of the UV light emitted on at least one of the air, the surface, and the object in the area.
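A simplified sketch of accumulating such an exposure plot, reduced to a 2D grid (the patent maps in three dimensions); the inverse-square falloff, clamp near the source, and all numbers are assumptions for illustration.

```python
# 2D grid over the mapped area
W, H, cell = 10, 10, 0.5                       # grid size and cell edge in metres
exposure = [[0.0] * W for _ in range(H)]        # accumulated dose per cell

def emit(pos, power, dt):
    """Accumulate UV dose on the grid for one time step, inverse-square falloff."""
    for j in range(H):
        for i in range(W):
            cx, cy = (i + 0.5) * cell, (j + 0.5) * cell
            r2 = (cx - pos[0]) ** 2 + (cy - pos[1]) ** 2
            exposure[j][i] += power * dt / max(r2, cell ** 2)  # clamp near source

# Robot path: a straight line across the area, emitting as it moves
for step in range(20):
    emit((0.25 * step, 2.5), power=1.0, dt=0.1)

# Cells near the path receive a higher dose than a distant corner
near, far = exposure[5][5], exposure[9][9]
print(near > far)  # True
```

An operator could inspect such a plot to find under-exposed cells that need another pass.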
Task planning for measurement variances
Methods, systems, and apparatus, including computer programs encoded on computer storage media, for modifying a process definition to ensure accuracy, timeliness, or both of workcell measurement. One of the methods includes receiving an initial process definition for a process to be performed by a robot, wherein the process definition defines a sequence of actions to be performed in a workcell, and wherein a first action in the sequence of actions has an associated measurement tolerance; computing a predicted accumulated measurement variance for each of one or more actions that occur before the first action in the sequence; determining that the predicted accumulated measurement variance for the one or more actions that occur before the first action in the sequence exceeds a threshold; and in response, generating a modified process definition that inserts a measurement action at a location in the sequence before the first action.
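The variance-driven insertion of a measurement action can be sketched as follows; the action list, per-action variance values, and threshold are hypothetical.

```python
# Each action contributes measurement variance; actions with a tolerance need
# the accumulated variance to stay below a threshold (all values illustrative)
actions = [
    {"name": "move_a", "variance": 0.4},
    {"name": "move_b", "variance": 0.5},
    {"name": "insert_pin", "variance": 0.2, "tolerance": True},
]
THRESHOLD = 0.6  # assumed maximum acceptable accumulated variance

def plan_with_measurements(actions, threshold):
    """Insert a 'measure' action (which resets the accumulated variance)
    before any toleranced action whose predicted variance is too high."""
    out, acc = [], 0.0
    for a in actions:
        if a.get("tolerance") and acc > threshold:
            out.append({"name": "measure", "variance": 0.0})
            acc = 0.0
        out.append(a)
        acc += a["variance"]
    return out

plan = plan_with_measurements(actions, THRESHOLD)
print([a["name"] for a in plan])  # ['move_a', 'move_b', 'measure', 'insert_pin']
```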
DISPLAY GUIDED HIGH-ACCURACY ROBOTIC NAVIGATION AND MOTION CONTROL SYSTEM AND METHODS
A display guided robotic navigation and control system comprises a display system including a display surface and a display device configured to display an image including a visual pattern onto the display surface, a robotic system including a mobile robotic device and an optical sensor attached to the mobile robotic device, and a computing system communicatively connected to the display system and the robotic system. Related methods are also disclosed.
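A toy sketch of closing the loop from the displayed pattern to a motion command: locate the pattern in the optical sensor's image, then issue a proportional correction toward the image centre. The detection, gain, and frame geometry are all assumptions, not the patent's method.

```python
# Toy camera frame: 1 marks pixels of the displayed visual pattern
frame = [[0] * 8 for _ in range(8)]
for r, c in [(2, 5), (2, 6), (3, 5), (3, 6)]:   # pattern blob, off-centre
    frame[r][c] = 1

# Centroid of the detected pattern
pix = [(r, c) for r in range(8) for c in range(8) if frame[r][c]]
cr = sum(r for r, c in pix) / len(pix)
cc = sum(c for r, c in pix) / len(pix)

# Offset of the pattern from the optical centre drives a proportional correction
K = 0.1                                # assumed controller gain
err_r, err_c = cr - 3.5, cc - 3.5      # image centre of an 8x8 frame
command = (-K * err_r, -K * err_c)     # move the robot to re-centre the pattern
print(command)
```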
External parameter calibration method for robot sensors and apparatus and robot with the same
The present disclosure provides an external parameter calibration method for robot sensors as well as an apparatus, robot and storage medium with the same. The method includes: obtaining first sensor data and second sensor data through a first sensor and a second sensor of the robot by collecting position information of a calibration reference object, and converting to a same coordinate system to obtain corresponding first converted sensor data and second converted sensor data, thereby determining a first coordinate and a second coordinate of a reference point of the calibration reference object; using the first coordinate and the second coordinate as a set of coordinate data; and repeating the above-mentioned steps, each time after the relative positional relationship between the robot and the calibration reference object has been changed, to obtain N sets of the coordinate data for calculating the external parameter between the first sensor and the second sensor.
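A minimal 2D analogue of recovering an extrinsic (external parameter) between two sensors from N paired observations of the reference point, using a closed-form rigid fit. The data and the specific estimator are illustrative, not the patent's.

```python
import math

# N observations of the same reference point in each sensor's frame, after
# changing the robot-to-target geometry between observations (demo data)
true_theta, true_t = 0.5, (0.3, -0.1)     # ground-truth extrinsic for the demo
first = [(1.0, 0.0), (0.0, 2.0), (-1.5, 1.0), (2.0, 2.0)]

def apply(theta, t, p):
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])

second = [apply(true_theta, true_t, p) for p in first]

# Closed-form 2D rigid fit: rotation from the centred cross terms, then translation
n = len(first)
ax = sum(p[0] for p in first) / n; ay = sum(p[1] for p in first) / n
bx = sum(p[0] for p in second) / n; by = sum(p[1] for p in second) / n
Sxx = sum((p[0]-ax)*(q[0]-bx) + (p[1]-ay)*(q[1]-by) for p, q in zip(first, second))
Sxy = sum((p[0]-ax)*(q[1]-by) - (p[1]-ay)*(q[0]-bx) for p, q in zip(first, second))
theta = math.atan2(Sxy, Sxx)
c, s = math.cos(theta), math.sin(theta)
t = (bx - (c * ax - s * ay), by - (s * ax + c * ay))

print(round(theta, 6), round(t[0], 6), round(t[1], 6))  # recovers 0.5, 0.3, -0.1
```

With noisy real data, more sets of coordinate data (larger N) average the noise down, which is why the method repeats the collection step.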
DETERMINATION OF RELATIVE POSITION WHILE LANDING
A system for determining the position and orientation of a landing area relative to an approaching vehicle is disclosed. The landing area has a plurality of target items disposed proximate to the landing area at known locations. The system includes a sensor on which is formed images of target items that are within a Field of View (FOV) of the sensor, a processor configured to receive the positions of the target items on the sensor, and a memory coupled to the processor. The memory stores the locations of the target items relative to the landing area and instructions that, when loaded into the processor and executed, cause the processor to determine a position and an orientation of the landing area relative to the vehicle based in part on the information received from the sensor and the locations of the target items.
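A 2D sketch of the pose recovery: with two targets at known landing-area locations and their observed positions in the vehicle frame, the relative orientation and position follow in closed form. The flat 2D geometry and all values are assumptions; the patent's sensor observes projections in a FOV.

```python
import math

# Known target locations in the landing-area frame (stored in memory)
T1, T2 = (0.0, 0.0), (4.0, 0.0)

def observe(p, theta, pos):
    # landing-area point -> vehicle frame: translate, then rotate by -theta
    c, s = math.cos(-theta), math.sin(-theta)
    dx, dy = p[0] - pos[0], p[1] - pos[1]
    return (c * dx - s * dy, s * dx + c * dy)

# Simulated observations for a vehicle at (1, -3) rotated by 30 degrees
pose_theta, pose_pos = math.radians(30), (1.0, -3.0)
o1, o2 = observe(T1, pose_theta, pose_pos), observe(T2, pose_theta, pose_pos)

# Orientation: difference between the known and observed inter-target bearings
theta = math.atan2(T2[1] - T1[1], T2[0] - T1[0]) \
      - math.atan2(o2[1] - o1[1], o2[0] - o1[0])
# Position: place T1 back using the recovered orientation
c, s = math.cos(theta), math.sin(theta)
pos = (T1[0] - (c * o1[0] - s * o1[1]), T1[1] - (s * o1[0] + c * o1[1]))

print(round(math.degrees(theta), 3), round(pos[0], 3), round(pos[1], 3))
```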
Method for Setting More Precisely a Position and/or Orientation of a Device Head
A method for setting more precisely a position and/or an orientation of a device head in a measuring environment by a distance measuring device which has a number M, M≥1, of distance measuring sensors and which is connected to the device head. A control device is communicatively connected to the distance measuring device and an on-board sensor device. The position and/or the orientation of the device head is determined by the on-board sensor device, and the position and/or the orientation thus determined is set more precisely by the control device.
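One way to picture the refinement step, reduced to a single height axis: the M distance sensor readings against a reference surface correct the coarse estimate from the on-board sensor device. The values and the simple averaging rule are illustrative assumptions.

```python
# M distance sensors on the device head measure the gap to a reference surface
M = 3
nominal_gap = 50.0                       # expected sensor-to-surface distance, mm
readings = [49.2, 49.5, 49.0]            # measured gaps from the M sensors

coarse_height = 120.0                    # height from the on-board sensor device, mm
offset = sum(nominal_gap - r for r in readings) / M
refined_height = coarse_height + offset  # shift the estimate by the mean shortfall

print(round(refined_height, 3))
```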