Patent classifications
G05B2219/45065
Method for interactively providing waypoints to a mobile robot for use in the marking of a geometric figure on a ground surface
A mobile robot, and a method for interactively providing waypoints to a mobile robot for use in marking a geometric figure on a ground surface, comprising the steps of:
i) Selecting a control function accepting manual positioning of a mobile robot at two or more target locations on a ground surface;
ii) Positioning the mobile robot in proximity to a first target location to be marked on a surface, and directing a position determining device of the mobile robot to said first target location;
iii) Instructing the mobile robot to store the first target location as a first waypoint;
iv) Repeating steps ii)-iii) to obtain at least a second waypoint;
v) Selecting a control function accepting manual selection of a geometric figure to be marked on said ground surface;
vi) Instructing the mobile robot to compute the best fit of the selected geometric figure on the surface based on the two or more waypoints;
vii) Instructing the mobile robot to compute waypoint coordinates of the geometric figure to be marked from the fitted position of said geometric figure; and
viii.a) Instructing the mobile robot to store the computed waypoint coordinates of the geometric figure; or
viii.b) Instructing the mobile robot to mark the geometric figure on the surface.
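Steps vi)-vii) amount to a best-fit computation followed by projection of the stored waypoints onto the fitted figure. A minimal sketch, assuming the selected figure is a straight line in 2D ground coordinates (the patent covers arbitrary figures; function names are illustrative):

```python
def fit_line(waypoints):
    """Least-squares fit of a line y = a + b*x to stored waypoints (step vi)."""
    n = len(waypoints)
    sx = sum(x for x, _ in waypoints)
    sy = sum(y for _, y in waypoints)
    sxx = sum(x * x for x, _ in waypoints)
    sxy = sum(x * y for x, y in waypoints)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def project(point, a, b):
    """Orthogonal projection of a waypoint onto the fitted line y = a + b*x."""
    x0, y0 = point
    x = (x0 + b * (y0 - a)) / (1 + b * b)
    return (x, a + b * x)

def marking_waypoints(waypoints):
    """Steps vi)-vii): fit the figure, then derive marking coordinates from it."""
    a, b = fit_line(waypoints)
    return [project(p, a, b) for p in waypoints]
```

With noisy manually taught waypoints, the returned coordinates lie exactly on the fitted figure, which is what the robot then stores (step viii.a) or marks (step viii.b).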
SYSTEM AND METHOD FOR INDICATING DEGRADED OPERATION OF A ROBOTIC PAINT STATION
A system for indicating degraded operation of a robotic paint station is provided. The system includes a paint nozzle operable to spray paint upon a workpiece, a paint supply tube system including a plurality of paint supply tubes operable to supply a flow of paint to the paint nozzle, a vacuum system connected to the paint supply tube system and operable to remove air pockets from the paint supply tube system, a pressure sensor connected to the vacuum system, and a computerized robot control module. The computerized robot control module is programmed to monitor data from the pressure sensor, compare the data to a threshold pressure value, and generate a maintenance warning indicating degraded vacuum system operation based upon the comparison.
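The control-module logic described above reduces to a threshold comparison over sensor samples. A minimal sketch, assuming absolute-pressure readings in kPa and that a degraded vacuum cannot pull pressure below the threshold (units, names, and values are illustrative, not from the patent):

```python
def check_vacuum(pressure_samples, threshold_kpa):
    """Monitor pressure-sensor data and flag degraded vacuum operation.

    pressure_samples: recent absolute-pressure readings (kPa) from the
    sensor on the vacuum system; lower means a stronger vacuum.
    Returns a maintenance warning string, or None if operation is normal.
    """
    worst = min(pressure_samples)  # strongest vacuum achieved in the window
    if worst > threshold_kpa:
        return "MAINTENANCE WARNING: degraded vacuum system operation"
    return None
```

A real station would likely debounce this over time rather than warn on a single window, but the compare-to-threshold step is the same.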
TWIN LASER CAMERA ASSEMBLY
A twin laser camera unitary assembly for a robot processing tool is disclosed. The assembly has a housing having a front wall defining an upright U-shaped channel into which a tubular portion of the tool is laterally insertable. A mounting support attaches the housing relative to the tool in operative position. Twin laser range finders are respectively mounted in the housing on opposite sides of the U-shaped channel in a symmetrical in-line arrangement with respect to the tool. A controller mounted in the housing is configured to receive robot control signals, operate laser projectors and process image signals produced by imagers of the laser range finders so that joint and bead position and geometry signals are produced in a robot reference frame. The assembly is designed and protected for use in industrial processes such as robotic laser and arc welding and sealant dispensing.
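The controller's job of producing joint and bead positions "in a robot reference frame" is a frame transformation of each range-finder measurement. A minimal 2D sketch, assuming each sensor's pose in the robot frame is known from the mounting geometry (the patent works in 3D; names are illustrative):

```python
import math

def sensor_to_robot(point_s, sensor_pose):
    """Transform a joint/bead position measured in a range-finder's frame
    into the robot reference frame.

    point_s: (x, y) measured in the sensor frame.
    sensor_pose: (x, y, theta) of the sensor frame expressed in the
    robot frame, fixed by the symmetrical in-line mounting.
    """
    x, y, th = sensor_pose
    px, py = point_s
    return (x + px * math.cos(th) - py * math.sin(th),
            y + px * math.sin(th) + py * math.cos(th))
```

With twin symmetrically mounted sensors, the same function is applied with two mirrored `sensor_pose` values, so both measurements land in one common robot frame.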
Machine Vision-Based Method and System for Measuring 3D Pose of a Part or Subassembly of Parts
A machine vision-based method and system for measuring 3D pose of a part or subassembly of parts having an unknown pose are disclosed. A number of different applications of the method and system are disclosed including applications which utilize a reprogrammable industrial automation machine such as a robot. The method includes providing a reference cloud of 3D voxels which represent a reference surface of a reference part or subassembly having a known reference pose. Using at least one 2D/3D hybrid sensor, a sample cloud of 3D voxels which represent a corresponding surface of a sample part or subassembly of the same type as the reference part or subassembly is acquired. The sample part or subassembly has an actual pose different from the reference pose. The voxels of the sample and reference clouds are processed utilizing a matching algorithm to determine the pose of the sample part or subassembly.
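The matching of the sample cloud to the reference cloud can be sketched, in 2D and with known point correspondences, as a closed-form rigid registration. This is an assumption for illustration only: the patent does not specify the matching algorithm, and production systems typically use iterative methods such as ICP over full 3D voxel clouds.

```python
import math

def estimate_pose_2d(reference, sample):
    """Closed-form 2D rigid registration of matched point sets.

    Finds rotation theta and translation (tx, ty) such that
    sample ~= R(theta) * reference + t, i.e. the pose of the sample
    part relative to the known reference pose.
    """
    n = len(reference)
    rcx = sum(x for x, _ in reference) / n
    rcy = sum(y for _, y in reference) / n
    scx = sum(x for x, _ in sample) / n
    scy = sum(y for _, y in sample) / n
    sxx = syy = sxy = syx = 0.0
    for (px, py), (qx, qy) in zip(reference, sample):
        ax, ay = px - rcx, py - rcy   # centered reference point
        bx, by = qx - scx, qy - scy   # centered sample point
        sxx += ax * bx
        syy += ay * by
        sxy += ax * by
        syx += ay * bx
    theta = math.atan2(sxy - syx, sxx + syy)
    tx = scx - (rcx * math.cos(theta) - rcy * math.sin(theta))
    ty = scy - (rcx * math.sin(theta) + rcy * math.cos(theta))
    return theta, tx, ty
```

The same least-squares structure generalizes to 3D (the Kabsch algorithm via SVD), which is the natural fit for the voxel clouds the patent describes.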
LIQUID MATERIAL APPLICATION APPARATUS AND LIQUID MATERIAL APPLICATION METHOD
Object: To provide a liquid material application apparatus and a liquid material application method with which a liquid material can be discharged in a predetermined discharge amount per unit time, regardless of the relative moving speed, throughout a series of application operations.
Solution: The liquid material application apparatus includes a discharge head, a robot that moves the discharge head relative to a workpiece, a movement control unit that controls the relative movement of the discharge head and the workpiece, and a discharge control unit that controls the discharge of a liquid material from the discharge head. The discharge control unit executes, switchably in accordance with an application program, a first-mode discharge control that changes the discharge amount of the liquid material per unit time depending on the relative moving speed between the discharge head and the workpiece, and a second-mode discharge control that operates the discharge head to discharge the liquid material in a predetermined amount per unit time regardless of the relative moving speed. The liquid material application method is implemented using the liquid material application apparatus.
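The two discharge modes switched by the application program can be sketched as a single rate function. A minimal sketch with illustrative units and parameter names (the patent does not specify them): in the first mode the rate tracks speed, which keeps the deposited amount per unit of path length constant; in the second mode the rate per unit time is fixed regardless of speed.

```python
def discharge_rate(mode, speed_mm_s, base_rate_ul_s, rate_per_speed_ul_mm):
    """Return the commanded discharge rate (microliters per second).

    mode 1: rate proportional to the relative moving speed between
            discharge head and workpiece (constant amount per mm of path).
    mode 2: fixed predetermined rate per unit time, regardless of speed.
    """
    if mode == 1:
        return rate_per_speed_ul_mm * speed_mm_s
    return base_rate_ul_s
```

Switching `mode` per program segment reproduces the "switchable manner in accordance with an application program" behavior: speed-tracking for uniform bead thickness along a path, fixed-rate for dwell or fill operations.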
System and method for painting an interior wall of housing using a semi-automatic painting robot
A system for painting an interior wall of a house is disclosed. The system includes a semi-automatic painting robot 106 and a user device 104. The semi-automatic painting robot 106 includes a microprocessor 304, a servo drive module 306, a DC motor drive module 316, a magnetometer 312, a distance sensor module 318, a first servo motor 708, a second servo motor 714, a spray gun 710, and a belt-driven linear actuator 804. The user device 104 captures one or more images of the interior wall to be painted, processes them, and sends the resulting coordinates of the interior wall to the microprocessor 304. The microprocessor 304 receives the coordinates and paints the interior wall using one or more painting strokes. The user 102 may control the semi-automatic painting robot 106 through the user interface of the user device 104.
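Converting the wall coordinates into "one or more painting strokes" is a coverage-planning step. A minimal sketch, assuming the strokes are alternating horizontal passes over a rectangular wall region (a boustrophedon pattern; the patent does not specify the stroke geometry, and all names are illustrative):

```python
def plan_strokes(width_m, height_m, stroke_pitch_m):
    """Generate alternating horizontal spray strokes covering a wall.

    Returns a list of ((x_start, y), (x_end, y)) segments, spaced
    stroke_pitch_m apart vertically, reversing direction each pass so
    the spray gun never makes an unpainted return trip.
    """
    strokes = []
    y = 0.0
    left_to_right = True
    while y <= height_m + 1e-9:
        if left_to_right:
            strokes.append(((0.0, y), (width_m, y)))
        else:
            strokes.append(((width_m, y), (0.0, y)))
        left_to_right = not left_to_right
        y += stroke_pitch_m
    return strokes
```

The microprocessor would then drive the belt-driven linear actuator and servo motors along each returned segment in turn.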
ROBOT SYSTEM AND OPERATING METHOD THEREOF
A robot system includes a robot that self-travels along a traveling shaft and is provided with a position detection sensor at a distal end, a support member that has a plurality of reference positions juxtaposed and supports a workpiece, a plurality of calibration members that are juxtaposed along the traveling shaft, and a control device, in which the calibration members each have a calibration position, and the control device is configured to cause the robot to move by a predetermined first distance along the traveling shaft, calibrate position coordinates of the robot based on position coordinates of the calibration positions detected by the position detection sensor, and subsequently calibrate position coordinates of the workpiece based on position coordinates of the reference positions detected by the position detection sensor.
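The two-stage calibration described above (first against the calibration members, then against the workpiece reference positions) reduces to measuring the offset between where the robot believes a mark is and where the sensor detects it, then applying that offset downstream. A minimal 1D sketch along the traveling shaft (the patent works in full position coordinates; names are illustrative):

```python
def calibrate_axis(commanded_positions, detected_positions):
    """Average offset between the robot's commanded positions for the
    calibration members and the positions its position detection
    sensor actually detects.
    """
    n = len(commanded_positions)
    return sum(d - c for c, d in zip(commanded_positions, detected_positions)) / n

def corrected(workpiece_reference_positions, offset):
    """Apply the calibrated offset to the workpiece reference positions
    before the robot operates on the supported workpiece."""
    return [p + offset for p in workpiece_reference_positions]
```

Repeating this after each movement by the predetermined first distance keeps accumulated travel error along the shaft from reaching the workpiece coordinates.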