Patent classifications
G05B2219/35506
Initializing individual exposure field parameters of an overlay controller
A method for initializing individual exposure field parameters of an overlay controller is disclosed, including initializing a first control thread having a first context associated with a first product type, wherein a first layout of first exposure fields is defined for the first product type for processing in a stepper. The method further includes remapping a set of previous control state data for a set of control threads associated with other product types different from the first product type into the first layout. The other product types have layouts of second exposure fields different from the first layout. An initial set of control state data for the first control thread associated with the first product type is generated using the remapped previous control state data. The stepper is configured for processing a first substrate of the first product type using the initial set of control state data.
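The remapping step might be sketched as follows. This is a minimal nearest-field heuristic, assuming per-field overlay corrections keyed by field-center coordinates on the wafer; the abstract does not specify the actual remapping algorithm, so the function and data shapes here are illustrative only.

```python
import math

def remap_control_state(prev_states, new_layout):
    """Remap per-field overlay corrections from previous product layouts
    onto a new exposure-field layout by nearest-field lookup.

    prev_states: list of (center_xy, correction) pairs gathered from the
                 other product types' layouts, where center_xy is the
                 (x, y) field center on the wafer and correction is an
                 assumed (dx, dy) overlay offset.
    new_layout:  list of (x, y) field centers of the new product layout.
    Returns one initial correction per field of the new layout.
    """
    initial = []
    for cx, cy in new_layout:
        # Pick the previous field whose center lies closest on the wafer;
        # its correction seeds the new field's control state.
        nearest = min(prev_states,
                      key=lambda s: math.hypot(s[0][0] - cx, s[0][1] - cy))
        initial.append(nearest[1])
    return initial
```

A distance-weighted average over several neighboring fields would be an equally plausible choice; nearest-field is shown only because it is the simplest mapping between two unequal layouts.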
VISUALIZATION AND MODIFICATION OF OPERATIONAL BOUNDING ZONES USING AUGMENTED REALITY
An augmented reality (AR) system for visualizing and modifying robot operational zones. The system includes an AR device such as a headset in communication with a robot controller. The AR device includes software for the AR display and modification of the operational zones. The AR device is registered with the robot coordinate frame via detection of a visual marker. The AR device displays operational zones overlaid on real world images of the robot and existing fixtures, where the display is updated as the user moves around the robot work cell. Control points on the virtual operational zones are displayed and allow the user to reshape the operational zones. The robot can be operated during the AR session, running the robot's programmed motion and evaluating the operational zones. Zone violations are highlighted in the AR display. When zone definition is complete, the finalized operational zones are uploaded to the robot controller.
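The zone-violation check that drives the AR highlighting could be sketched as below. Axis-aligned box zones and a sampled tool-path are assumptions for illustration; the patent does not restrict zone geometry.

```python
def point_in_zone(p, zone):
    """Test whether a 3D point lies inside an axis-aligned box zone,
    given as ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    lo, hi = zone
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def find_violations(trajectory, forbidden_zones):
    """Return indices of trajectory points that enter any forbidden zone,
    so the AR display can highlight the offending motion segments."""
    return [i for i, p in enumerate(trajectory)
            if any(point_in_zone(p, z) for z in forbidden_zones)]
```

Running this over the robot's programmed motion during the AR session yields exactly the set of poses to flag before the finalized zones are uploaded to the controller.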
Use of a live video stream in a process control system
A method, a user interface control arrangement, and a computer program product for controlling a stationary user interface in an industrial process control system, as well as such a process control system. The user interface control arrangement obtains a first live video stream from a video camera monitoring an industrial process at a first location, obtains a process control view for the first location, overlays the process control view on the first live video stream, and displays the first live video stream with the overlaid process control view on a display of the user interface.
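The overlay step amounts to compositing the process control view onto each video frame. A minimal per-pixel alpha-blend sketch follows, assuming frames as 2D lists of RGB tuples with `None` marking transparent overlay pixels; the patent does not prescribe a compositing method.

```python
def blend_pixel(video_px, overlay_px, alpha):
    """Alpha-blend one overlay pixel onto one video pixel (RGB tuples)."""
    return tuple(round(alpha * o + (1 - alpha) * v)
                 for v, o in zip(video_px, overlay_px))

def overlay_view(frame, overlay, alpha=0.6):
    """Composite a process control view onto a live video frame.
    Pixels where the overlay is None carry no process graphic and
    pass the video through unchanged."""
    return [[px if ov is None else blend_pixel(px, ov, alpha)
             for px, ov in zip(row_f, row_o)]
            for row_f, row_o in zip(frame, overlay)]
```

In practice this would run on the GPU or in the display pipeline rather than per-pixel in Python; the sketch only fixes the semantics of "overlays the process control view on the first live video stream".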
Environment property estimation and graphical display
A surgical robot including an imaging system comprising at least one camera, a processor in communication with the imaging system, a manipulation system in communication with the processor, and a visual display in communication with the processor. The processor is operable to calculate a mechanical property estimate for an area of an environment based on an environment model of tool-environment interaction data, create a composite image comprising a mechanical property map of the mechanical property estimate overlaid on an environment image from the at least one camera, and output the composite image on the visual display.
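One simple instance of such a mechanical property estimate is stiffness fitted from tool-environment interaction samples, then mapped to a pseudocolor for the overlay. The linear elastic model F ≈ k·x and the red-blue color ramp below are illustrative assumptions; the patent's environment model is not specified in the abstract.

```python
def estimate_stiffness(displacements, forces):
    """Least-squares stiffness k for F = k * x from paired
    tool-environment interaction samples (regression through the origin,
    assuming a linear elastic contact model)."""
    sxx = sum(x * x for x in displacements)
    sxf = sum(x * f for x, f in zip(displacements, forces))
    return sxf / sxx

def stiffness_to_color(k, k_max):
    """Map a stiffness estimate to a red (stiff) to blue (soft)
    pseudocolor for compositing onto the camera image."""
    t = max(0.0, min(1.0, k / k_max))
    return (round(255 * t), 0, round(255 * (1 - t)))
```

Computing `stiffness_to_color(estimate_stiffness(...), k_max)` per probed area and blending the result onto the camera image yields the composite mechanical property map the abstract describes.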
Image-Based Placing of Workpiece Machining Operations
Techniques are described for machining flat workpieces, such as metal sheets, or three-dimensional workpieces on a processing machine, such as a machine tool or laser cutting machine, including capturing a live image of a workpiece to be machined with an image capturing device for capturing two-dimensional images; displaying at least one workpiece machining operation to be performed in the live image of the workpiece by a predetermined forward transformation of the workpiece machining operation from the three-dimensional machine coordinate system into the two-dimensional live-image coordinate system; repositioning the workpiece machining operation to be performed in the live image of the workpiece; and performing the workpiece machining operation on the workpiece by a predetermined inverse transformation of the repositioned workpiece machining operation from the two-dimensional live-image coordinate system into the three-dimensional machine coordinate system.
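The forward and inverse transformations between the 3D machine coordinate system and the 2D live-image coordinate system can be sketched for the flat-workpiece case, assuming an idealized pinhole camera looking straight down at the workpiece plane. The intrinsics (`f`, `cx`, `cy`, `cam_z`) are hypothetical; a real machine would use a calibrated camera model.

```python
def forward(p3, f, cx, cy, cam_z):
    """Project a machine-coordinate point (x, y, z) into live-image
    pixel coordinates (u, v), with the camera on the machine z-axis
    at height cam_z and focal length f in pixel units."""
    x, y, z = p3
    s = f / (cam_z - z)          # perspective scale at this depth
    return (cx + s * x, cy + s * y)

def inverse(p2, f, cx, cy, cam_z, z=0.0):
    """Back-project an image point (u, v) onto the workpiece plane at
    height z, recovering machine coordinates. Well defined for a flat
    workpiece because the target plane is known."""
    u, v = p2
    s = f / (cam_z - z)
    return ((u - cx) / s, (v - cy) / s, z)
```

Repositioning a machining operation in the live image and then applying `inverse` recovers its new machine coordinates, which is exactly the round trip the described technique performs.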