Patent classifications
G05B2219/32014
METHOD AND SYSTEM FOR REMOTE COLLABORATION
A method for providing an augmented reality (AR)-based remote collaboration between a robot located at a worksite, a field worker terminal, and a remote administrator terminal located outside the worksite includes acquiring a captured image including a field image captured by the robot or a user image captured by the field worker terminal, displaying the captured image of the worksite, generating virtual content based on input from a remote administrator and a field worker with respect to the displayed captured image, and displaying an AR image in which the virtual content is augmented on the displayed captured image.
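The collaboration loop the abstract describes (capture, display, annotate, augment) can be sketched as follows. All names (`Annotation`, `compose_ar_frame`, the frame dictionary layout) are illustrative, not taken from the patent:

```python
# Minimal sketch of the AR remote-collaboration flow: a captured frame from the
# robot or worker terminal is combined with virtual content authored by both the
# remote administrator and the field worker.
from dataclasses import dataclass

@dataclass
class Annotation:
    author: str   # "remote_admin" or "field_worker"
    x: int
    y: int
    label: str

def compose_ar_frame(captured_frame: dict, annotations: list) -> dict:
    """Augment the displayed capture with virtual content from both parties."""
    return {
        "source": captured_frame["source"],   # robot camera or worker terminal
        "image": captured_frame["image"],
        "overlays": [(a.x, a.y, a.label, a.author) for a in annotations],
    }

frame = {"source": "robot", "image": "frame_0001"}
notes = [Annotation("remote_admin", 120, 80, "check valve"),
         Annotation("field_worker", 200, 150, "done")]
ar_frame = compose_ar_frame(frame, notes)
```

The AR image here is just the capture plus an overlay list; a real system would render the overlays into the displayed image.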
Augmented vision system with active welder guidance
Welding headwear comprises a display operable to present images for viewing by a wearer of the welding headwear, and circuitry operable to determine an identifier associated with a workpiece, retrieve, from memory, welding work instructions associated with the identifier, and generate the images for presentation on the display based on said work instructions.
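The retrieval step amounts to keying stored work instructions on a workpiece identifier and turning each step into a display image. A minimal sketch, with hypothetical identifiers and instruction text:

```python
# Look up welding work instructions by workpiece identifier and build one
# display string per instruction step, as the headwear circuitry would.
WORK_INSTRUCTIONS = {
    "WP-1042": ["tack at corners", "weld seam A at 120 A", "inspect bead"],
}

def images_for_workpiece(workpiece_id: str) -> list:
    steps = WORK_INSTRUCTIONS.get(workpiece_id, [])
    # One display image (here, a caption string) per instruction step.
    return ["step {}: {}".format(i + 1, step) for i, step in enumerate(steps)]

images = images_for_workpiece("WP-1042")
```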
Augmented reality human machine interface testing
An industrial virtual reality (VR) system includes visualization processing capabilities that allow an augmented reality (AR) human-machine interface (HMI) application to be tested within a virtual representation of the plant environment. This approach can yield an interactable AR HMI that simulates, within the VR environment, what a wearer of an AR appliance will see while traversing the physical plant. In this way, proper operation of the AR HMI can be verified prior to commissioning of the physical system. This can include ensuring that graphics are tied to the correct data points and confirming correct, non-obtrusive locations of graphics within the user's field of view.
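One of the verifications mentioned, that each HMI graphic is tied to the correct data point, reduces to comparing the tested bindings against an expected mapping. A sketch with hypothetical graphic names and tag addresses:

```python
# Pre-commissioning check: flag any AR HMI graphic whose bound data point
# differs from the expected binding.
hmi_graphics = {"tank_level_gauge": "PLC1.Tank1.Level",
                "motor_rpm_readout": "PLC2.Motor3.RPM"}
expected_bindings = {"tank_level_gauge": "PLC1.Tank1.Level",
                     "motor_rpm_readout": "PLC2.Motor3.RPM"}

def misbound_graphics(actual: dict, expected: dict) -> list:
    return [name for name, tag in actual.items() if expected.get(name) != tag]

problems = misbound_graphics(hmi_graphics, expected_bindings)
```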
360° assistance for QCS scanner with mixed reality and machine learning technology
An apparatus, method, and non-transitory machine-readable medium provide 360° assistance for a QCS scanner with mixed reality (MR) and machine learning technology. The apparatus includes an optical sensor, a display, a chatbot, a cloud service, and a processor operably connected to the optical sensor and the display. The processor receives diagnostic information from a server related to a field device in an industrial process control and automation system; identifies an issue of the field device based on the diagnostic information; detects, using the optical sensor, the field device corresponding to the identified issue; guides, using the display, a user to a location and a scanner part of the field device that is related to the issue; provides, using the display, the steps or actions necessary to resolve the issue; and connects the user, using a cloud server, to modules for installation, commissioning, annual maintenance (AMC), and training for a quality control system (QCS) according to the selected persona.
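The processor's issue-to-guidance chain can be sketched as a lookup from a diagnostic code to the affected scanner part and its resolution steps. The rule table and error codes below are hypothetical:

```python
# Identify the issue from diagnostic information, then produce the guidance
# (affected part, resolution steps) that the display would present.
DIAGNOSTIC_RULES = {
    "E-STALL": {"part": "scanner head drive",
                "steps": ["power-cycle the drive", "check belt tension"]},
}

def assist(diagnostics: dict) -> dict:
    rule = DIAGNOSTIC_RULES[diagnostics["code"]]
    return {"device": diagnostics["device"],
            "guide_to": rule["part"],
            "actions": rule["steps"]}

guidance = assist({"device": "QCS-scanner-7", "code": "E-STALL"})
```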
Display system and display device
A display system includes a display and a detector configured to detect an orientation of the display. The display system includes a display source image generation unit configured to generate a display source image including a first image and a second image related to this first image; and a visual field image display unit configured to display the first image in a first display region of the display and display the second image in a second display region of the display. The visual field image display unit is configured to set a predetermined fixed region of the display as the first display region when a display mode associated with the first image is a first display mode, and determine the first display region according to the orientation of the display when the display mode is a second display mode.
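The mode-dependent region selection can be sketched as a small function: in the first display mode the first image always lands in a predetermined fixed region, while in the second mode the region is derived from the detected display orientation. Region coordinates and the orientation mapping are illustrative assumptions:

```python
# First-image region selection: fixed region in the first display mode,
# orientation-dependent region in the second display mode.
FIXED_REGION = (0, 0, 640, 360)   # predetermined fixed region (x0, y0, x1, y1)

def first_display_region(mode: str, orientation_deg: float) -> tuple:
    if mode == "first":
        return FIXED_REGION
    # Second mode: place the region according to the detected orientation.
    x = int(orientation_deg) % 640
    return (x, 0, x + 320, 180)

region_a = first_display_region("first", 45.0)
region_b = first_display_region("second", 90.0)
```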
SYSTEMS AND METHODS FOR PROVIDING CONTEXT-BASED DATA FOR AN INDUSTRIAL AUTOMATION SYSTEM
A tangible, non-transitory, computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive user input indicative of a selection of a user experience of a plurality of user experiences. The plurality of user experiences include a first user experience associated with a first event that occurred in an industrial automation system at a first time prior to receiving the user input and a second user experience associated with a second event expected to occur in the industrial automation system at a second time after receiving the user input. When executed, the instructions also cause the processing circuitry to determine, based on the user input, output representative data associated with the industrial automation system and instruct an extended reality device to present the output representative data.
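The two experience types differ only in where the selected event sits relative to the moment of user input: a past event is replayed, an expected future event is previewed. A minimal sketch with hypothetical events:

```python
# Choose a presentation mode per user experience: replay for an event that
# occurred before the user input, preview for an event expected afterward.
import datetime as dt

events = [
    {"name": "conveyor jam", "time": dt.datetime(2024, 1, 5, 9, 0)},
    {"name": "scheduled maintenance", "time": dt.datetime(2030, 6, 1, 8, 0)},
]

def experience_for(event: dict, now: dt.datetime) -> str:
    return "replay" if event["time"] < now else "preview"

now = dt.datetime(2025, 1, 1)
modes = [experience_for(e, now) for e in events]
```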
SYSTEMS AND METHODS FOR PROVIDING CONTEXT-BASED DATA FOR AN INDUSTRIAL AUTOMATION SYSTEM
A non-transitory computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive, from first sensors, first sensory datasets associated with an industrial automation system, receive, from second sensors, second sensory datasets associated with a machine configured to perform mechanical operations, determine a position of the machine relative to the industrial automation system based on the first sensory datasets and the second sensory datasets, determine output representative data associated with the industrial automation system based on the first sensory datasets and the second sensory datasets and in accordance with the position of the machine relative to the industrial automation system, instruct an extended reality device to present the output representative data, determine movement of components of the machine, and instruct the extended reality device to present feedback based on the movement of the components.
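The positioning and feedback steps can be sketched as follows: the two sensory datasets yield a position of the machine relative to the system, and detected movement of the machine drives feedback on the extended reality device. Coordinates, the distance threshold, and the message strings are illustrative:

```python
# Fuse system-side and machine-side position readings into a relative position,
# then flag machine movement for extended-reality feedback.
def relative_position(system_xy: tuple, machine_xy: tuple) -> tuple:
    return (machine_xy[0] - system_xy[0], machine_xy[1] - system_xy[1])

def movement_feedback(prev_pos: tuple, new_pos: tuple, threshold: float = 0.5) -> str:
    dx, dy = new_pos[0] - prev_pos[0], new_pos[1] - prev_pos[1]
    moved = (dx * dx + dy * dy) ** 0.5 > threshold
    return "warn: machine moving near system" if moved else "ok"

rel = relative_position((10.0, 4.0), (12.0, 7.0))
msg = movement_feedback((2.0, 3.0), rel)
```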
SYSTEMS AND METHODS FOR MODIFYING CONTEXT-BASED DATA PROVIDED FOR AN INDUSTRIAL AUTOMATION SYSTEM
A tangible, non-transitory, computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive sensory datasets associated with an industrial automation system, determine context information based on a sensory dataset and representative of an environmental condition, predict an intent of a user to complete a task associated with the industrial automation system based on the sensory datasets and the context information, present first output representative data via an extended reality device based on the intent and a setting, the setting including a data presentation format for presenting the sensory datasets, receive inputs indicative of changes to the data presentation format, present second output representative data via the extended reality device in response to receiving the inputs, and update the setting based on the inputs and historical data indicative of users changing the data presentation format of the first output representative data.
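The setting-update step, adopting the presentation format that the change history shows users repeatedly choosing, can be sketched as a frequency count over past format changes. The threshold and format names are assumptions:

```python
# Update the data-presentation-format setting when the change history shows a
# format being chosen at least min_count times; otherwise keep the current one.
from collections import Counter

def update_format(current: str, change_history: list, min_count: int = 3) -> str:
    if not change_history:
        return current
    fmt, count = Counter(change_history).most_common(1)[0]
    return fmt if count >= min_count else current

setting = update_format("table", ["chart", "chart", "chart", "table"])
```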
SYSTEMS AND METHODS FOR PROVIDING CONTEXT-BASED DATA FOR AN INDUSTRIAL AUTOMATION SYSTEM
A non-transitory computer-readable medium includes instructions that, when executed by processing circuitry, are configured to cause the processing circuitry to receive sensory datasets associated with an industrial automation system from sensors, receive positioning data via an extended reality device associated with a user, determine a first virtual positioning of the user in a virtual coordinate system based on the positioning data, determine a second virtual positioning of the industrial automation system in the virtual coordinate system based on the sensory datasets, determine output representative data to be presented by the extended reality device based on the sensory datasets and in accordance with the first virtual positioning relative to the second virtual positioning, and instruct the extended reality device to present the output representative data.
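The dependence on relative positioning can be sketched as a distance test in the shared virtual coordinate system: what is presented changes with how far the user's virtual position is from the system's. The threshold and the two presentation levels are illustrative assumptions:

```python
# Select output representative data based on the user's virtual position
# relative to the industrial automation system's virtual position.
import math

def data_to_present(user_pos: tuple, system_pos: tuple, near: float = 5.0) -> str:
    # Nearby: detailed, device-level data; far away: a summary overlay.
    return "detailed" if math.dist(user_pos, system_pos) <= near else "summary"

view_near = data_to_present((1.0, 2.0, 0.0), (3.0, 2.0, 0.0))
view_far = data_to_present((0.0, 0.0, 0.0), (30.0, 0.0, 0.0))
```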
Augmented reality diagnostic tool for data center nodes
An augmented reality (AR) diagnostic tool embodied as a software application on a portable device employs AR infrastructure to enable a user to locate a failed/malfunctioning node of a cluster and, with minimal interaction, diagnose causes and provide recommendations to repair the node. The portable device may be a computer embodied as visualization technology and configured to execute the software application. Once installed, the AR diagnostic (ARD) tool is ready for use by the user, e.g., a customer service technician, to locate and repair one or more failed cluster nodes. In response to a failure/malfunction, the cluster node sends diagnostic and configuration information (i.e., failure/malfunction information) of the failed node to an analytics service. The failure information informs the technician of the cluster failure. The technician may then activate the ARD tool and AR infrastructure to locate and repair the failed node.
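The failure-handling chain, node sends diagnostic and configuration information, the technician is notified, and the tool produces a repair recommendation, can be sketched as follows. The field names and the recommendation rule are hypothetical:

```python
# Sketch of the ARD failure flow: package the failed node's diagnostic and
# configuration information for the analytics service, notify the technician,
# and derive a repair recommendation.
def handle_node_failure(node_id: str, diagnostics: dict) -> dict:
    report = {"node": node_id}
    report.update(diagnostics)            # sent to the analytics service
    recommendation = ("replace PSU" if diagnostics.get("psu_fault")
                      else "run extended diagnostics")
    return {"report": report,
            "notify": "node {} failed".format(node_id),
            "recommend": recommendation}

ticket = handle_node_failure("node-17", {"psu_fault": True, "uptime_s": 0})
```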