Patent classifications
G05B2219/24012
Method and apparatus for the automated management of a coating plant
A plant for manufacturing products, in particular for applying a coating on parts, includes apparatuses equipped with radio-frequency transmitters that broadcast, with a pre-set periodicity, a radio signal containing a unique identifier of each apparatus; a portable device, carried by an operator, that receives the radio signal; and a program loaded on and executed by the portable device to extract the unique identifier of the apparatus and to detect the operator-apparatus distance by analyzing the trend of the radio signal. The radio transmitters periodically transmit their unique identifiers and have a remote configuring/setting channel, based on radio-frequency communication with an external control device, for exchanging data packets. A connecting procedure between the portable device and the radio-frequency transmitter allows information on the operating status of the apparatuses to be received, the signals being transmitted to a remote server or a cloud to monitor the operating conditions of the apparatuses.
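The abstract above describes estimating the operator-apparatus distance from the trend of the received radio signal. A minimal sketch of one common way to do this is the log-distance path-loss model applied to received signal strength (RSSI); the model, the reference transmit power, and the path-loss exponent here are assumptions for illustration, not details taken from the patent.

```python
# Hedged sketch: distance estimation from RSSI via the log-distance
# path-loss model. tx_power_dbm is the expected RSSI at 1 m; both
# default parameters are illustrative assumptions.

def estimate_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate operator-apparatus distance in meters from RSSI."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

In practice the "trend" analysis mentioned in the abstract would smooth a series of such estimates (e.g. with a moving average) rather than act on a single sample.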
METHOD AND SYSTEM FOR COMMISSIONING ENVIRONMENTAL SENSORS
Methods and systems to commission building environmental sensors are disclosed. The system includes a device that will move about the environment of a building (inside and/or outside) and detect building environmental sensor devices that are installed in the environment. For each of the sensor devices, in response to detecting the sensor device, the system will retrieve an identifier for the sensor device and determine whether the sensor device is registered with a control system. If the sensor device is registered with the control system, the system will not automatically implement a commissioning process with the sensor device. If the sensor device is not registered with the control system, the system will automatically implement the commissioning process with the sensor device.
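The registration check described above can be sketched as a simple decision step. The function and parameter names below are hypothetical, introduced only to illustrate the logic of commissioning a sensor exactly when it is not already registered.

```python
# Hedged sketch of the commissioning decision: a detected sensor is
# commissioned only if its identifier is not already registered with
# the control system. All names here are illustrative.

def handle_detected_sensor(sensor_id: str, registered_ids: set,
                           commissioned: list) -> bool:
    """Return True if the commissioning process was run for sensor_id."""
    if sensor_id in registered_ids:
        return False  # already registered: skip commissioning
    commissioned.append(sensor_id)  # stand-in for the commissioning process
    registered_ids.add(sensor_id)   # record the sensor as registered
    return True
```

The moving device would call such a routine once per detected sensor as it traverses the building.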
Systems and methods for providing an immersive experience of a facility control room using virtual reality
A video surveillance system includes a plurality of video surveillance cameras each for producing a corresponding video stream, a server configured to receive and store the video streams, and a first control room having a video wall. The video wall may be operatively coupled to the server and may be configured to concurrently display two or more of the video streams from two or more of the plurality of video surveillance cameras in a first arrangement. The video surveillance system may further include a remote virtual reality headset with a display and a virtual reality controller operatively coupled to the virtual reality headset and the server. The virtual reality controller may be configured to receive the same two or more video streams that are displayed on the video wall in the first control room and display them in the virtual reality headset.
Methods for collaboratively assisting a control room operator
Methods for collaboratively assisting a control room operator are disclosed. An example disclosed method includes receiving a notification associated with a person via a portable wireless device. The notification includes information regarding a process control variable in a process control environment. The example method further includes altering the notification by the person via the portable wireless device and transmitting the altered notification via the portable wireless device. The control room operator and other persons associated with the notification are to receive the altered notification.
Operating system for a machine of the food industry
The present disclosure relates to an operating system for a machine of a food industry. The operating system includes eyeglasses for a user of the operating system and a transceiver for exchanging data between the operating system and the machine. The eyeglasses include a display system configured to display a control element and information of a human machine interface (HMI). Furthermore, the operating system includes at least one input module which receives user input from the user with respect to the control element. The operating system additionally includes a processing module. The processing module converts the received user input into an input signal for controlling at least one of the machine or the HMI.
Infrastructure construction digital integrated twin (ICDIT)
The present disclosure describes a computer-implemented method to manage an industrial plant facility, the method including: monitoring multiple streams of input data from a network of sensors at the industrial plant facility, wherein the network of sensors includes one or more camera devices; determining, by a server computer, an event during construction or operation of the industrial plant facility based on analyzing the multiple streams of input data in real-time; and based on the determined event, generating a notification to alert at least one operator of the industrial plant facility.
METHOD FOR COMPUTER ASSISTANCE IN THE MANAGEMENT OF A PRODUCTION LINE
The present invention relates to a method for computer assistance in the management of a production line. In the method, a portable terminal executes and displays a human-machine interface (HMI), and the terminal visualizes through display means at least one module of said production line. The method performs a geographical localization of said terminal within a three-dimensional reference frame centered on said production line and determines the position of said terminal with respect to said module. The method also displays, via said display means and in superimposition, a virtual model limited to said module as a function of the determined position of said terminal.
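The position-dependent overlay step described above can be sketched as selecting which module's virtual model to superimpose given the terminal's localized position. The module coordinates, the distance threshold, and all names below are assumptions for illustration only.

```python
# Hedged sketch: given the terminal's position in a frame centered on
# the production line, pick the nearest module (within a maximum
# range) whose virtual model should be displayed in superimposition.
import math

def module_to_display(terminal_pos, modules, max_range=5.0):
    """Return the id of the nearest module within max_range, else None."""
    best_id, best_dist = None, max_range
    for module_id, position in modules.items():
        d = math.dist(terminal_pos, position)
        if d < best_dist:
            best_id, best_dist = module_id, d
    return best_id
```

A real implementation would also account for the terminal's orientation, not just its position, when deciding what to overlay.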
Gesture recognition method, corresponding circuit, device and computer program product
A programmable data processing circuit is configured to receive sensor signals indicative of gestures for identification by the processing circuit. The processing circuit applies finite state machine processing resources to the sensor signals to provide identification output signals indicative of gestures identified as a function of the sensor signals. A plurality of finite state machine processing programs loaded into the processing circuit each include a data section and an instruction section. The data section includes a fixed-size part specifying the respective processing resources used by the programs and a variable-size part, with respective sizes, for allocating those processing resources. The instruction section includes conditions and commands for execution by the respective processing resources, operating on data located in the respective data sections.
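The program layout described above pairs a data section with an instruction section of conditions and commands. A minimal sketch of that structure and of one execution step is given below; the class, the condition/command representation, and the example gesture are all illustrative assumptions, not the patent's actual encoding.

```python
# Hedged sketch: an FSM program with a fixed-size data part (resource
# descriptors), a variable-size data part (per-program allocation),
# and an instruction section of (condition, command) pairs.
from dataclasses import dataclass

@dataclass
class FSMProgram:
    fixed_data: dict          # resources used by the program (fixed-size part)
    variable_data: bytearray  # per-program allocation (variable-size part)
    instructions: list        # (condition, command) pairs

def step(program, sample, state):
    """Run one step: execute the first command whose condition holds."""
    for condition, command in program.instructions:
        if condition(sample, state):
            return command(sample, state)
    return None
```

Calling `step` once per sensor sample lets each loaded program advance independently, which mirrors the idea of several gesture programs sharing the same processing circuit.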
Drone-enabled operator rounds
Drones (e.g., unmanned aerial vehicles, or “UAVs”) equipped with cameras and sensors may be configured to travel throughout the field environment of a process plant to monitor process plant conditions. Onboard computing devices associated with the drones control the movement of the drones through the field environment of the process plant. The onboard computing devices interface with the cameras and other sensors and communicate with user interface devices, controllers, servers and/or databases via a network. The onboard computing devices may receive drone commands from user interface devices and/or servers, or may access drone commands stored in one or more databases. The onboard computing devices may transmit data captured by the cameras and/or other sensors to UI devices, controllers, servers, etc. Accordingly, the user interface devices may display data (including live video feeds) captured by the drone cameras and/or drone sensors to an operator in a human machine interface application.