Patent classifications
G05B2219/40607
Surface finishing apparatus
A surface finishing apparatus includes: an arm to which a tool is attached; a force sensor that detects force applied to the tool; a visual sensor that acquires an image of a plane surface of a member; a storage device that stores data indicating a target state of the plane surface; and a controller that performs a removing-position determination process for determining, by using at least unfinished-surface image data and the data indicating the target state, a plurality of removing positions on the plane surface of the member, and an arm control process for controlling the arm to sequentially perform surface removal at the plurality of determined removing positions, wherein a surface inspection agent is applied to, and thereby distributed over, the plane surface whose image is to be acquired by the visual sensor.
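The removing-position determination described above can be sketched as a comparison between the imaged inspection-agent distribution and the stored target state. The abstract does not specify the comparison, so the thresholding rule, array representation, and all names below are illustrative assumptions:

```python
import numpy as np

def determine_removing_positions(unfinished, target, threshold=0.1):
    """Return (row, col) removal positions where the imaged surface state
    deviates from the target state by more than `threshold`.

    unfinished, target: 2D float arrays in [0, 1] representing the
    inspection-agent distribution over the plane surface (an assumed model).
    """
    deviation = np.abs(unfinished - target)
    rows, cols = np.nonzero(deviation > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

# Toy example: a 3x3 surface with one high spot that needs removal.
unfinished = np.zeros((3, 3))
unfinished[1, 2] = 0.5
target = np.zeros((3, 3))
positions = determine_removing_positions(unfinished, target)
```

The controller would then drive the arm to each returned position in sequence for surface removal.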
PRODUCTION SYSTEM FOR PROCESSING WORKPIECES
Production system for processing workpieces, having a robot module, a workpiece carrier module and a machining module, all of them being working modules, wherein each of said working modules includes an interface surface, the interface surface having a supply interface and a communication interface, wherein the robot module includes a robot and a robot controller for handling workpieces, wherein the workpiece carrier module includes a plurality of workpiece locations for receiving unmachined and finished workpieces, and wherein the machining module includes a processing system for carrying out at least one processing operation on at least one workpiece; wherein a data carrier is assigned to each of the working modules, which data carrier stores processing data, the processing data including a transfer position for workpieces and being coded for processing in the robot controller of the robot module.
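The data carrier described above stores processing data, including a transfer position, in a form the robot controller can decode. The encoding is not specified in the abstract; the JSON payload, field names, and units below are purely illustrative:

```python
import json
from dataclasses import dataclass

@dataclass
class ProcessingData:
    module_id: str
    transfer_position: tuple  # (x, y, z) workpiece hand-off point in mm (assumed)

def decode_processing_data(raw: bytes) -> ProcessingData:
    """Decode a data-carrier payload into processing data for the robot
    controller. JSON is an assumed encoding, not the patent's."""
    record = json.loads(raw.decode("utf-8"))
    return ProcessingData(record["module_id"], tuple(record["transfer_position"]))

payload = b'{"module_id": "carrier-01", "transfer_position": [120.0, 45.5, 300.0]}'
data = decode_processing_data(payload)
```

The robot controller would use `data.transfer_position` as the hand-off point when exchanging workpieces with the identified module.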
CONTROL DEVICE, CONTROL METHOD, AND PROGRAM
A control device according to one or more embodiments may control a robot that performs a collaborative work with a worker. The control device may include: a storage section storing an operation program to cause the robot to perform the collaborative work with the worker; a control section controlling the robot based on the operation program when the collaborative work is performed; a calculation section calculating a motion of the worker when the collaborative work is performed; and a correction section correcting the operation program based on the motion of the worker calculated by the calculation section.
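One simple way the correction section could adapt the operation program to the calculated worker motion is to scale the programmed robot speed by the worker's observed pace. The abstract leaves the correction unspecified, so the proportional rule, the clamping policy, and all names here are assumptions:

```python
def correct_speed(program_speed, worker_speed, nominal_worker_speed=1.0):
    """Scale the programmed robot speed by the ratio of the worker's
    observed speed to a nominal speed, clamped to [0.5, 1.5] so the
    correction stays bounded (both the rule and the bounds are assumed)."""
    ratio = worker_speed / nominal_worker_speed
    return program_speed * max(0.5, min(1.5, ratio))
```

A slow worker (ratio 0.4) would reduce a 100 mm/s move to 50 mm/s, while a fast worker (ratio 2.0) would raise it only to the 150 mm/s cap.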
METHOD AND COMPUTING SYSTEMS FOR PERFORMING OBJECT DETECTION
A computing system includes a communication interface and a processing circuit. The communication interface communicates with a robot and with a camera having a field of view. The processing circuit obtains image information based on objects within the field of view and determines a first template matching score, which indicates a degree of match between the image information and a model template. The processing circuit further determines image edge information based on the image information and determines a second template matching score, which indicates a degree of match between the image edge information and a template. The processing circuit additionally determines an overall template matching score based on the first template matching score and the second template matching score.
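The two-score scheme can be sketched as follows: one score from intensity matching, one from edge matching, combined into an overall score. The abstract does not name the matching metric or the combination rule, so normalized cross-correlation, finite-difference edges, and the weighted sum below are all assumptions:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def edge_map(img):
    """Gradient-magnitude edge information via finite differences."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def overall_score(image, model_template, edge_template, w=0.5):
    s1 = ncc(image, model_template)           # first template matching score
    s2 = ncc(edge_map(image), edge_template)  # second score, on edge information
    return w * s1 + (1 - w) * s2              # weighted combination (assumed)

# A patch that exactly matches both templates scores close to 1.
img = np.array([[0, 0, 0, 0],
                [0, 1, 1, 0],
                [0, 1, 1, 0],
                [0, 0, 0, 0]], dtype=float)
score = overall_score(img, img, edge_map(img))
```

Combining intensity and edge evidence makes the detection more robust than either cue alone, e.g. against lighting changes that shift intensities but preserve edges.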
Control system and control method
A control device estimates a position and pose of an imaging device relative to a robot based on an image of the robot captured by the imaging device. A simulation device arranges a robot model at a teaching point and generates a simulation image of the robot model captured by a virtual camera, the virtual camera being arranged so that its position and pose relative to the robot model in the virtual space coincide with the estimated position and pose of the imaging device. The control device determines an amount of correction of the position and pose of the robot for the teaching point so that the position and pose of the robot in the actual image, captured after the robot has been driven according to a movement command to the teaching point, approximate the position and pose of the robot model in the simulation image.
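At its core, the correction amount drives the observed robot pose toward the simulated one. As a deliberately simplified sketch (the abstract does not give the update law; the 6-DoF pose vector, the proportional gain, and all names are assumptions):

```python
import numpy as np

def correction_amount(actual_pose, simulated_pose, gain=1.0):
    """Correction to apply at the teaching point so the pose observed in
    the actual image approaches the pose in the simulation image.
    Poses are (x, y, z, roll, pitch, yaw); the proportional update is an
    assumed stand-in, not the patent's method."""
    actual = np.asarray(actual_pose, dtype=float)
    simulated = np.asarray(simulated_pose, dtype=float)
    return gain * (simulated - actual)

delta = correction_amount([0, 0, 0, 0, 0, 0], [1.0, 0.5, 0, 0, 0, 0.1])
```

Applied iteratively (re-image, re-compare, re-correct), such an update would converge the actual robot pose onto the taught pose as rendered by the virtual camera.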
POSITIONING METHOD AND POSITIONING DEVICE
There are provided a positioning method and a positioning device that can position workpieces with a simple method and configuration. A positioning method includes: gripping at least one of a first workpiece and a second workpiece; obtaining point group data of the at least one gripped workpiece; calculating a translation matrix of shape-fitted point group data, which is obtained by adjusting a position of the point group data to reference data in a position adjustment state of the first and second workpieces; calculating an inverse matrix based on the translation matrix; and positioning the first and second workpieces by moving the at least one gripped workpiece based on at least one of the translation matrix and the inverse matrix.
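The translation matrix and its inverse can be illustrated with a minimal sketch. Real shape fitting would use something like ICP on the point group data; aligning centroids of identical shifted point sets, as below, is a deliberately simplified stand-in, and all names are assumptions:

```python
import numpy as np

def translation_to(reference, points):
    """4x4 homogeneous translation matrix aligning the centroid of
    `points` with that of `reference` (a simplified stand-in for full
    shape fitting of point group data)."""
    t = reference.mean(axis=0) - points.mean(axis=0)
    T = np.eye(4)
    T[:3, 3] = t
    return T

reference = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
points = reference + np.array([2.0, -1.0, 0.5])   # same shape, shifted
T = translation_to(reference, points)             # translation matrix
T_inv = np.linalg.inv(T)                          # inverse matrix
```

Moving the gripped workpiece by `T` brings it into the fitted position; `T_inv` undoes that move, which matches the abstract's use of "at least one of the translation matrix and the inverse matrix" for positioning.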
AUTOMATED PRODUCTION WORK CELL
A robotic work cell uses an object separating mechanism to disperse bulk objects into a 2D arrangement on a horizontal surface and uses a vision system to generate pick-up (positional) data and rotational orientation data for each sequentially selected target object of the 2D arrangement. A pick-and-place robot mechanism uses the positional data to pick up each target object and uses the rotational orientation data to reorient the target object during transfer to a designated hand-off location. A carousel-type robotic end-tool disposed on a 4-axis object-processing robot mechanism rotates a gripper mechanism around a vertical axis to move the target object from the hand-off location to a designated processing location, where an associated processing device performs a desired process (e.g., label application) on the target object. In one embodiment the gripper mechanism is selectively rotatable around a horizontal axis to facilitate processing on opposing surfaces of the target object.
Handling assembly comprising a handling device for carrying out at least one work step, method, and computer program
A handling assembly having a handling device for carrying out at least one working step with and/or on a workpiece in a working region of the handling device, stations being situated in the working region, with at least one monitoring sensor for optically monitoring the working region and providing the result as monitoring data, and with a localization module designed to recognize the stations and to determine a station position for each of the stations.
ROBOT SYSTEM
A robot system including: a robot that grips one of a first workpiece and a second workpiece that are disposed adjacent to each other; an illumination device that radiates a light beam onto the surfaces of the first workpiece and the second workpiece on either side of a border between the workpieces, along a plane that intersects the border; a camera that captures an image containing a first line image of the light beam formed on the surface of the first workpiece and a second line image of the light beam formed on the surface of the second workpiece; and a robot controller that operates the robot based on a misalignment amount and direction of the second line image with respect to the first line image in the acquired image, and that performs a correction of a level difference between the surface of the first workpiece and the surface of the second workpiece.
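The misalignment-to-level-difference step can be sketched as comparing the image rows of the two laser-line segments. The abstract does not give the geometry, so the pixel-to-millimetre scale, the sign convention, and all names below are illustrative assumptions:

```python
import numpy as np

def level_difference(first_line_rows, second_line_rows, mm_per_pixel=0.1):
    """Estimate the step between two surfaces from the vertical offset of
    their laser-line images (row coordinates in the camera image).
    The scale and sign convention are assumed, not from the patent."""
    offset_px = np.mean(second_line_rows) - np.mean(first_line_rows)
    return offset_px * mm_per_pixel  # positive: second line sits lower in the image

# Line on the first workpiece near row 100, on the second near row 110.
d = level_difference([100, 100, 101], [110, 110, 111])
```

The robot controller would then move the gripped workpiece by the estimated level difference (and in the indicated direction) to bring the two surfaces flush.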
ROBOT SAFETY MONITORING SYSTEM
A robot safety monitoring system includes a stereo camera set configured to photograph a robot and a monitoring area around the robot and a processor connected to the stereo camera set. The stereo camera set includes a first camera including a plurality of first camera modules spaced apart from each other, and a second camera including a plurality of second camera modules spaced apart from each other. Based on a first direction, the first camera and the second camera have different fields of view. The processor is configured to set the monitoring area, determine whether an object is present in the monitoring area, and control the robot based on determining that an object is present in the monitoring area.