
MANUFACTURING APPARATUS
20180011473 · 2018-01-11

A manufacturing apparatus includes a manufacturing unit; a cover that includes a see-through portion; an input unit that occupies a predetermined area on a surface of the see-through portion, detects contact with the predetermined area, and outputs information regarding the detection; a display that displays a predetermined image at a position on the cover; and a controller that, in response to a predetermined input to the input unit, controls the display to display the predetermined image at or near the position where the predetermined input is detected, controls the display to display a predetermined operation panel image as the predetermined image at or near a position on the cover that overlaps the input unit, and controls the manufacturing unit based on a detection position of the predetermined input on the predetermined operation panel image.

THERMAL IMAGING ASSET INSPECTION SYSTEMS AND METHODS
20230024701 · 2023-01-26

Various techniques are provided for performing temperature inspections of assets, such as temperature-sensitive industrial equipment. In one example, a method includes receiving, at a portable device, inspection instructions associated with an asset at a location in an environment. The method also includes displaying, at the portable device, the asset in an augmented reality format to guide a user of the portable device to the location. The method also includes capturing, by a thermal imager associated with the portable device, a thermal image of the asset when the thermal imager is aligned with the asset. The method also includes extracting, from the thermal image, at least one temperature measurement associated with the asset. Additional methods and systems are also provided.
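The extraction step described above can be sketched as a region-of-interest statistic over a radiometric image. The linear count-to-Celsius calibration constants below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Hypothetical linear radiometric calibration (assumed for illustration).
GAIN = 0.04       # degrees C per raw count
OFFSET = -273.15  # degrees C at zero counts

def extract_temperature(thermal_image, roi):
    """Return min/max/mean temperature (C) over a rectangular ROI.

    thermal_image: 2-D array of raw radiometric counts.
    roi: (row0, row1, col0, col1) bounding the asset in the frame.
    """
    r0, r1, c0, c1 = roi
    counts = np.asarray(thermal_image, dtype=np.float64)[r0:r1, c0:c1]
    celsius = counts * GAIN + OFFSET
    return {
        "min": float(celsius.min()),
        "max": float(celsius.max()),
        "mean": float(celsius.mean()),
    }
```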

MEAN TIME BETWEEN FAILURE OF SEMICONDUCTOR-FABRICATION EQUIPMENT USING DATA ANALYTICS WITH NATURAL-LANGUAGE PROCESSING

In one embodiment, a system includes a wafer handling system, processing components, a controller, a virtual assistant, a natural language processing (NLP) engine, and a data-analytics engine. The wafer handling system is configured to hold one or more wafers for processing. The processing components are configured to physically treat the one or more wafers. The controller is configured to operate the processing components. The virtual assistant, in communication with the NLP engine, is configured to receive a user query from a user, understand an intent or context of the user query, and provide a context-specific response to the user query. The data-analytics engine is configured to generate and provide analytical data relating to the user query based on data collected from a plurality of data sources via one or more communication protocols.

Method and system for augmented reality visualisation

A method for visualizing an image combining an image (Ic) of a real object (200) originating from a video capture system (300) with digital information (In) originating from a three-dimensional model of the equipment, comprising: carrying out a processing operation to superimpose, in real time, a reference point (402) of the three-dimensional model with a reference point (302) of the video capture system and an object reference point (202) of the real object, and displaying at least some of the digital information superimposed on the image captured by the video capture system; further comprising an initial step (Ei) of recording the reference texture (T200) of the real object, and a step (Ea) of analyzing the images transmitted by the video capture system, the analysis step comprising: generating a synthesis image from the captured image and from the three-dimensional model of the equipment, textured using the recorded texture; and calculating a composite image by mixing the synthesis image and the captured image.
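The final mixing step can be sketched as a simple weighted blend of the model-rendered synthesis image with the captured frame. The fixed alpha weight is an assumption for illustration; the patent's mixing could equally be spatially varying:

```python
import numpy as np

def composite(synthesis, captured, alpha=0.5):
    """Mix a rendered synthesis image with a captured camera frame.

    Both images are H x W x 3 float arrays in [0, 1]; alpha is the
    weight given to the synthesis image.
    """
    s = np.clip(np.asarray(synthesis, dtype=np.float64), 0.0, 1.0)
    c = np.clip(np.asarray(captured, dtype=np.float64), 0.0, 1.0)
    return alpha * s + (1.0 - alpha) * c
```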

SYSTEM FOR REMOTE ASSISTANCE OF A FIELD OPERATOR

A system (1) for assisting a field operator, in particular a maintenance field operator, by a remote assistant (51) equipped with a viewer (50) comprises a first group of components configured to be connected to a support (10) wearable on the operator's head, including a first video camera (11), a local viewer (13) arranged to be watched by the operator, a first control unit (15) and a first wireless connection interface (19). The system also comprises a handpiece (20) providing a visual sensor (21) such as a video camera or a thermal imaging camera, a serial port (24) for connecting a peripheral device (30) such as an instrument or a sensor or a video source, a second control unit (25) and a second connection interface (26). The connection interfaces (19, 26) are configured for mutually exchanging, locally or through a remote server (99) available through a data network (3), data streams (12, 22, 32) coming from the video camera (11), from the visual sensor (21) or from the peripheral device (30), respectively, intended to be displayed by the local viewer (13). Moreover, the connection interfaces are configured for exchanging the data streams with the remote assistant's (51) viewer (50) through the remote server (99). The system (1) also comprises a scenario-switching device (80) for displaying a same data stream of interest (42) at the same time to the local viewer (13) and to the remote viewer (50), said data stream of interest selected among the data streams (12, 22, 32) related to the scenarios (11′, 21′) framed by the video camera (11) and by the visual sensor (21), respectively, and to the data obtained from the peripheral device (30).

MEASUREMENT PROGRAM SELECTION ASSISTING APPARATUS AND MEASUREMENT CONTROL APPARATUS
20230214083 · 2023-07-06

The present invention provides a measurement program selection assisting apparatus capable of visually confirming whether a selected measurement program is suitable for an object to be measured. One aspect of the present invention is a measurement program selection assisting apparatus comprising: a measurement program database storing a measurement program related to measurement of an object and superimposed display information corresponding to a three-dimensional shape of the object in association with each other; a display unit capable of displaying information defined in a virtual space superimposed on the real space; and a display control unit for acquiring the superimposed display information corresponding to a selected measurement program from the measurement program database and displaying the acquired superimposed display information in mixed reality on the display unit.

Virtual object positioning in augmented reality applications
11694403 · 2023-07-04

Systems and methods include: determination of a first component of a set of components under assembly in a physical environment; determination of a first physical position of a user with respect to the first component in the physical environment; determination of a second component of the set of components under assembly to be installed at least partially on the first component, based on assembly information associated with the set of components; determination of three-dimensional surface data of the second component; determination of a physical relationship in which the second component is to be installed at least partially on the first component, based on a model associated with the set of components; determination of a graphical representation of the second component based on the first physical position of the user with respect to the first component, the physical relationship, and the three-dimensional surface data of the second component; and presentation of the graphical representation to the user in a view including the first component in the physical environment, wherein the presented graphical representation appears to the user to be in the physical relationship with respect to the first component.
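Placing the second component's surface data into the user's view amounts to composing transforms: the first component's pose in the workcell with the modeled "physical relationship" between the two components. A minimal sketch, where both 4x4 transforms are assumed inputs for illustration:

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def place_component(surface_points, T_world_first, T_first_second):
    """Map the second component's surface points into world coordinates.

    T_world_first: pose of the installed first component in the workcell.
    T_first_second: where the second component sits relative to the
    first (the 'physical relationship' from the assembly model).
    """
    T = T_world_first @ T_first_second
    pts = np.asarray(surface_points, dtype=np.float64)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return (T @ homog.T).T[:, :3]
```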

Systems and methods for automatic sensor registration and configuration

Various approaches to ensuring safe operation of industrial machinery in a workcell include disposing multiple image sensors proximate to the workcell and acquiring, with at least some of the image sensors, a first set of images of the workcell; registering the sensors to each other based at least in part on the first set of images and, based at least in part on the registration, converting the first set of images to a common reference frame of the sensors; determining a transformation matrix for transforming the common reference frame of the sensors to a global frame of the workcell; registering the sensors to the industrial machinery; acquiring a second set of images during operation of the industrial machinery; and monitoring the industrial machinery during operation thereof based at least in part on the acquired second set of images, the transformation matrix, and the registration of the sensors to the industrial machinery.
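The sensor-to-sensor registration step can be sketched with the standard Kabsch algorithm, which estimates the rigid transform between two sensors from corresponding 3-D points (e.g., features matched across their first images). Using Kabsch here, and assuming known correspondences, is an illustrative choice rather than the patent's specific method:

```python
import numpy as np

def rigid_registration(src, dst):
    """Estimate rotation R and translation t such that dst ~ R @ src + t.

    src, dst: N x 3 arrays of corresponding points observed by two
    sensors. Implements the Kabsch algorithm via SVD.
    """
    src = np.asarray(src, dtype=np.float64)
    dst = np.asarray(dst, dtype=np.float64)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])             # guard against reflections
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    return R, t
```

Composing the per-sensor transforms with the workcell's global transformation matrix then brings all images into the common reference frame the abstract describes.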

Dynamic, interactive signaling of safety-related conditions in a monitored environment

Systems and methods for determining safe and unsafe zones in a workspace—where safe actions are calculated in real time based on all relevant objects (e.g., some observed by sensors and others computationally generated based on analysis of the sensed workspace) and on the current state of the machinery (e.g., a robot) in the workspace—may utilize a variety of workspace-monitoring approaches as well as dynamic modeling of the robot geometry. The future trajectory of the robot(s) and/or the human(s) may be forecast using, e.g., a model of human movement and other forms of control. Modeling and forecasting of the robot may, in some embodiments, make use of data provided by the robot controller that may or may not include safety guarantees.
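A real-time safe/unsafe decision of the kind described above often reduces to a speed-and-separation check. The sketch below follows the general shape of the ISO/TS 15066 protective separation distance in a simplified form; all constants are illustrative assumptions, not values from the patent:

```python
def is_safe(distance, human_speed, robot_speed,
            reaction_time=0.1, stop_time=0.5, stop_distance=0.3,
            intrusion_margin=0.1):
    """Simplified speed-and-separation check (after ISO/TS 15066).

    Returns True when the current human-robot distance (m) exceeds
    the protective separation distance computed from the human's and
    robot's speeds (m/s) and the robot's stopping behavior.
    """
    protective = (human_speed * (reaction_time + stop_time)
                  + robot_speed * reaction_time
                  + stop_distance + intrusion_margin)
    return distance > protective
```

In the forecasting setting the abstract describes, `human_speed` and `robot_speed` would come from the predicted trajectories rather than instantaneous measurements.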

System and method for using virtual/augmented reality for interaction with collaborative robots in manufacturing or industrial environment
11529737 · 2022-12-20

A method includes determining a movement of an industrial robot in a manufacturing environment from a first position to a second position. The method also includes displaying an image showing a trajectory of the movement of the robot on a wearable headset. The displaying of the image comprises at least one of: displaying an augmented reality (AR) graphical image or video of the trajectory superimposed on a real-time actual image of the robot, or displaying a virtual reality (VR) graphical image or video showing a graphical representation of the robot together with the trajectory.
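Rendering the trajectory on the headset requires waypoints between the first and second positions. A minimal sketch using straight-line joint-space interpolation, an illustrative assumption (a real planner would supply the actual path):

```python
import numpy as np

def trajectory_samples(q_start, q_end, n=20):
    """Interpolate joint configurations between two robot positions.

    Returns an n x d array of waypoints that an AR/VR renderer could
    draw as the robot's planned trajectory.
    """
    q0 = np.asarray(q_start, dtype=np.float64)
    q1 = np.asarray(q_end, dtype=np.float64)
    s = np.linspace(0.0, 1.0, n)[:, None]   # blend factor per waypoint
    return (1.0 - s) * q0 + s * q1
```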