Patent classifications
G05B2219/31046
Shop order status visualization system
A method and apparatus for visualizing an assembly of parts for an aircraft. A shop order instance is identified for the assembly of parts for the aircraft. A volume is identified for the shop order instance. The assembly of parts is displayed within the volume in context with a number of assemblies of other parts within the volume.
Process calibrator, method for controlling a process calibrator and user interface for a process calibrator
A process calibrator is formed with a user interface that guides the user to choose the right connection arrangements of the process calibrator according to the selected function. The user interface shows the connection arrangements of the process calibrator on its display and visually distinguishes the connection arrangement assigned to the selected function from the other connection arrangements. The connection arrangements shown in the display are laid out like the physical connection arrangements of the process calibrator. Because the user interface indicates the connection arrangement assigned to the selected function, it is easy for the user to make the right connection.
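As an illustrative sketch only (the terminal names, functions, and mapping are invented, not taken from the patent), the guidance logic can be modeled as a table from each calibrator function to the terminals it uses; the UI then highlights those terminals and dims the rest, mirroring the physical layout:

```python
# Hypothetical terminal layout and function-to-terminal mapping.
TERMINALS = ["V_IN", "mA_IN", "V_OUT", "mA_OUT"]
FUNCTION_TERMINALS = {
    "measure_voltage": ["V_IN"],
    "source_current": ["mA_OUT"],
}

def terminal_states(function):
    """Return a display state for every terminal: highlight the ones
    assigned to the selected function, dim all others."""
    active = set(FUNCTION_TERMINALS[function])
    return {t: ("highlight" if t in active else "dim") for t in TERMINALS}
```

A renderer that draws the terminals in the same positions as the physical connectors would consume this state dictionary directly.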
ACOUSTICAL OR VIBRATIONAL MONITORING IN A GUIDED ASSEMBLY SYSTEM
A guidance, monitoring, and inspection system for a work area includes a non-visual detection sensor, such as a microphone or vibration sensor, along with visual sensors, other sensors, and a processor. The microphone or vibration sensor is configured to sense sounds or vibrations generated in the work area during the performance of an action; the resulting signals are received by the processor. The visual sensors and other sensors sense the characteristics and identity of objects present in the work area. The processor analyzes the received acoustic and/or vibrational signals and compares them to an expected signal associated with the operational step that was performed, to confirm that the operational step has been performed and has been performed properly. The processor identifies the particular operational step as defined by the visual sensors and the other sensors.
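The comparison of a received signal to an expected one can be sketched, purely for illustration (the cosine-similarity measure and 0.8 threshold are assumptions, not the patented method), as:

```python
def normalized_similarity(received, expected):
    """Cosine similarity between two equal-length signals, in [-1, 1]."""
    dot = sum(r * e for r, e in zip(received, expected))
    norm_r = sum(r * r for r in received) ** 0.5
    norm_e = sum(e * e for e in expected) ** 0.5
    if norm_r == 0 or norm_e == 0:
        return 0.0
    return dot / (norm_r * norm_e)

def step_confirmed(received, expected, threshold=0.8):
    """Confirm the operational step when the received acoustic/vibration
    signal is close enough to the stored template for that step."""
    return normalized_similarity(received, expected) >= threshold

expected = [0.0, 1.0, 0.5, -0.5, -1.0]   # stored template for the step
good = [0.0, 0.9, 0.55, -0.45, -1.1]     # close match: step confirmed
bad = [1.0, -1.0, 1.0, -1.0, 1.0]        # mismatch: step not confirmed
```

A production system would compare spectral or time-frequency features rather than raw samples, but the confirm/reject decision has the same shape.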
LIGHT GUIDED ASSEMBLY SYSTEM AND METHOD
A guide system and method of projecting visual indicators onto a physical object to guide actions of an individual comprises providing a guide system having a guide system controller and a projector, with the guide system controller including a plurality of addressed display features. An input signal is provided to the guide system controller whereby an addressed display feature is selected based on the input signal and the projector projects a visual indicator corresponding to the selected addressed display feature onto a physical object. The guide system controller may also include a plurality of positional identifiers, with the method including the selecting of a positional identifier based on the input signal and the projecting of the visual indicator onto a physical object at a location corresponding to the selected positional identifier.
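The selection logic described above can be sketched as two lookup tables keyed by the input signal, one for the addressed display features and one for the positional identifiers; all names and values below are invented for illustration:

```python
# Hypothetical addressed display features and positional identifiers.
DISPLAY_FEATURES = {
    "step_1": "arrow",
    "step_2": "circle",
}
POSITIONAL_IDS = {
    "step_1": (120, 45),   # (x, y) on the physical object, e.g. in mm
    "step_2": (300, 80),
}

def project_indicator(input_signal):
    """Select the display feature and position addressed by the input
    signal, returning what the projector should draw and where."""
    feature = DISPLAY_FEATURES[input_signal]
    position = POSITIONAL_IDS[input_signal]
    return {"indicator": feature, "at": position}
```

The projector driver would then render the returned indicator at the returned coordinates on the physical object.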
Information handling system keyboard disposition automated using performance metrics
End users subscribe to use information handling systems having a selected set of available performance characteristics, defined by a keyboard and touchpad configuration chosen when the information handling systems are built. A manufacturer meets subscriptions with information handling systems built from an inventory of new keyboards, deployed keyboards of information handling systems in use by subscribers, and separated keyboards taken from returned information handling systems and re-used. End user subscriptions are met in part by building replacement information handling systems with separated keyboards whose remaining useful life aligns with end user keyboard usage patterns tracked over time, benchmarked performance metrics, and end user subscription performance characteristics.
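One way to picture the matching step is as an inventory query: pick the separated keyboard with the least remaining life that still covers the subscriber's projected usage. This is a sketch under invented assumptions (field names and the keystroke-based life model are not from the patent):

```python
def pick_keyboard(inventory, expected_daily_keystrokes, subscription_days):
    """Choose the keyboard with the least remaining life that still
    covers the projected usage, so longer-lived stock is preserved."""
    needed = expected_daily_keystrokes * subscription_days
    candidates = [k for k in inventory if k["remaining_keystrokes"] >= needed]
    if not candidates:
        return None
    return min(candidates, key=lambda k: k["remaining_keystrokes"])

inventory = [
    {"id": "K1", "remaining_keystrokes": 1_000_000},
    {"id": "K2", "remaining_keystrokes": 5_000_000},
]
```

For a heavy user over a year, K1 no longer suffices and K2 is chosen; for a lighter user, K1 is the tighter and therefore preferred fit.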
Autonomous device employed in a system for facilitating the assembly of a product
The invention relates to a device (D2) employed in a system for facilitating the assembly of a product, said assembly being carried out by following one or more assembly sequences, an assembly sequence comprising several successive assembly steps ordered by execution rank, said device comprising: a presence sensor (C2), at least one signalling member (V2), and a sequencer (SQ2) connected to the presence sensor (C2) and to the signalling member (V2), said sequencer comprising a microcontroller designed to manage said assembly sequence. The invention also relates to a system for facilitating the assembly of a product comprising several devices (D2) connected together via a communication line.
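The sequencer's behavior can be sketched as a simple state machine over the ranked steps: the current step is signalled, and a presence-sensor event advances the rank. Class and method names below are illustrative assumptions, not from the patent:

```python
class Sequencer:
    """Minimal sketch of a sequencer stepping through ranked assembly steps."""

    def __init__(self, steps):
        # steps: list of step names, ordered by execution rank
        self.steps = steps
        self.rank = 0

    def current_step(self):
        """The step the signalling member should indicate, or None when done."""
        return self.steps[self.rank] if self.rank < len(self.steps) else None

    def on_presence_detected(self):
        """Presence sensor fired: the current step is done, advance the rank."""
        if self.rank < len(self.steps):
            self.rank += 1

seq = Sequencer(["pick part A", "insert part A", "fasten screw"])
```

In the multi-device system, each device would run one such sequencer and hand off to the next device over the communication line when its own sequence completes.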
Remote behavior navigation system and processing method thereof
There are provided a remote behavior navigation system and a method thereof, which allow an instructor at a remote location to provide guidance to a worker while watching a video picture of the worker's work. In response to a video picture from a worker-side camera at time t1, an instructor-side computer extracts a work instruction video picture from a video picture of an instructor-side camera at time t2. The instructor-side computer superimposes the work instruction video picture at time t2 on the video picture from the worker-side camera at time t1, and displays the superimposed video picture on an instructor-side monitor. A worker-side computer applies a time difference correction to the work instruction video picture at time t2, superimposes the corrected work instruction video picture on a video picture from the worker-side camera at time t3, and displays the superimposed video picture on a worker-side monitor.
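One plausible reading of the time difference correction, sketched here under stated assumptions (the frame representation and nearest-timestamp rule are invented, not the patented method), is that the worker side subtracts the instructor-side delay t2 − t1 from each instruction frame's timestamp before choosing which frame to overlay at the current worker time t3:

```python
def align_instruction_frame(worker_t, instructor_delay, instruction_frames):
    """Pick the instruction frame whose delay-corrected timestamp is
    closest to the worker's current time worker_t."""
    return min(
        instruction_frames,
        key=lambda f: abs((f["t"] - instructor_delay) - worker_t),
    )

frames = [
    {"t": 10, "img": "frame_A"},
    {"t": 12, "img": "frame_B"},
    {"t": 14, "img": "frame_C"},
]
```

With a delay of 2 (t2 − t1), a worker time of 10 selects the frame captured at instructor time 12, i.e. the instruction that was drawn over the worker's view from time 10.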
WIRING ASSISTANCE SYSTEM
A wiring assistance method includes receiving wiring data associated with a set of industrial automation devices and a set of wires and determining a set of wire properties based on the wiring data. The wiring assistance method also includes generating a wiring design based on the set of wire properties. The wiring design includes a set of wire couplings, each wire coupling indicative of a connection between two or more industrial automation devices of the set of industrial automation devices. The wiring assistance method also includes generating, based on the wiring design, a wiring instruction indicative of a first wire coupling of the set of wire couplings for display via an electronic display.
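The pipeline described above can be sketched end to end, purely as an illustration (the field names and the toy gauge-sizing rule are assumptions, not from the patent): wiring data in, wire properties derived, couplings generated, and one instruction emitted for display.

```python
def wire_properties(wiring_data):
    """Derive a simple property (here: gauge from current) for each wire."""
    props = []
    for wire in wiring_data:
        gauge = 12 if wire["current_amps"] > 15 else 16   # toy sizing rule
        props.append({**wire, "gauge": gauge})
    return props

def wiring_design(props):
    """Build the set of wire couplings: each connects two devices."""
    return [{"coupling": (w["from"], w["to"]), "gauge": w["gauge"]}
            for w in props]

def first_instruction(design):
    """Render the first wire coupling as a display instruction."""
    c = design[0]
    return (f"Connect {c['coupling'][0]} to {c['coupling'][1]} "
            f"with {c['gauge']} AWG wire")

data = [{"from": "PLC-1", "to": "Drive-1", "current_amps": 20}]
```

A real wiring assistant would consider many more properties (voltage rating, insulation, routing length), but the receive → derive → design → instruct flow is the same.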
Controlling of operations using a tool during production of an assembly of parts
A system for controlling operations during production of an assembly of parts is provided. The system includes a tool for performing manual operations on a plurality of elements of the assembly of parts, and locating means integrated at least in part into an electronic module attached to the tool and able to determine a location of at least one operation associated with an element by locating the position of the tool in a three-dimensional coordinate system associated with a set of modelling data representing a three-dimensional modelled image of the assembly of parts. The locating means includes a depth camera belonging to the electronic module and a processing module able to estimate the position of the tool on the basis of images captured by the depth camera.
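Mapping a point seen by the tool-mounted depth camera into the assembly's three-dimensional coordinate system is, at its core, a rigid transform. This sketch is illustrative only (the calibration values are assumptions, and a real estimator would also register camera images against the model data):

```python
def to_assembly_frame(point_cam, rotation, translation):
    """Apply p_assembly = R @ p_cam + t, written out without numpy.
    rotation is a 3x3 matrix (list of rows); translation a 3-tuple."""
    x, y, z = point_cam
    return tuple(
        rotation[i][0] * x + rotation[i][1] * y + rotation[i][2] * z
        + translation[i]
        for i in range(3)
    )

# Hypothetical calibration: camera axes aligned with the assembly's,
# camera origin offset by (100, 50, 0) mm in the assembly frame.
R_identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (100.0, 50.0, 0.0)
```

Given the tool's pose in the assembly frame, the system can then decide which element the current operation belongs to by comparing against the modelled positions of the elements.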