Housing for a Manufacturing Machine and/or a Part of a Production Line and System for Open-Loop and/or Closed-Loop Control of a Production Facility
20230359026 · 2023-11-09
Assignee
Inventors
CPC classification
G06F3/011
PHYSICS
G05B19/409
PHYSICS
International classification
Abstract
Various embodiments of the teachings herein include a housing for a manufacturing machine and/or a part of a production line, the housing comprising a transparent element used at least partially as: a projection surface for a head-up display comprising an optics module and an imaging unit in addition to the projection surface, a carrier for a transparent interactive input device including at least one of: a transparent multitouch element, a sensor button, a slider, a wheel, a trackpad, and a touchscreen, and/or a combiner for augmented and/or assisted reality.
Claims
1. A housing for a manufacturing machine and/or a part of a production line, the housing comprising: a transparent element used at least partially as: a projection surface for a head-up display comprising an optics module and an imaging unit in addition to the projection surface, a carrier for a transparent interactive input device including at least one of: a transparent multitouch element, a sensor button, a slider, a wheel, a trackpad, and a touchscreen, and/or a combiner for augmented and/or assisted reality.
2. The housing as claimed in claim 1, wherein the housing holds a camera.
3. The housing as claimed in claim 1, wherein the housing holds a loudspeaker.
4. The housing as claimed in claim 1, wherein the housing holds a microphone.
5. The housing as claimed in claim 1, wherein the housing holds a 360° camera.
6. The housing as claimed in claim 1, wherein the housing holds a haptic film.
7. The housing as claimed in claim 1, wherein the housing holds a laser grid.
8. The housing as claimed in claim 1, wherein the housing comprises an at least 25% opaque film.
9. The housing as claimed in claim 1, wherein the housing holds a processor.
10. A system for closed-loop and/or open-loop control of a manufacturing machine and/or a part of a production line, the system comprising: a display and operating unit having an HUD unit; a transparent interactive input device; sensors; and a processor programmed to receive, combine, and process the data of the sensors, to operate the display and operating unit, and to control in an open-loop and/or closed-loop manner the manufacturing machine and/or the part of the production line.
11. The system as claimed in claim 10, further comprising: a camera, a light barrier, a microphone, and/or a loudspeaker.
12. The system as claimed in claim 10, wherein the sensors detect a quality of the product to be manufactured.
13. The system as claimed in claim 10, wherein the sensors operate to authenticate a user.
14. The system as claimed in claim 10, further comprising an AI for creating predictions about the progress of the production of the product to be manufactured.
15. The system as claimed in claim 10, wherein the sensors detect a status of the manufacturing machine, the part of the production line, and/or a tool.
Description
DETAILED DESCRIPTION
[0021] The teachings of the present disclosure may provide a housing for a manufacturing machine and/or a part of a production line having at least one transparent element, wherein the at least one transparent element is used at least partially as: a projection surface for a head-up display (HUD), the latter also comprising at least an optics module and an imaging unit in addition to the projection surface; and/or a carrier for a transparent interactive input device, such as a transparent display of at least one of: a transparent multitouch element, a sensor button, a slider, a wheel, a trackpad, and/or a touchscreen; and/or a combiner for augmented and/or assisted reality.
[0022] Some embodiments include a system for closed-loop and/or open-loop control of a manufacturing machine and/or a part of a production line, comprising a display and operating element having an HUD unit, a transparent interactive input device, and/or a combiner, sensors, and a processor, wherein the processor is provided to receive, combine, and process the data of the sensors and from the display and operating element and thus to control in an open-loop and closed-loop manner the manufacturing machine and/or the part of a production line.
[0023] In the present disclosure, “manufacturing machine” and/or “part of a production line” denotes a device which processes, finishes, transports, assembles, and/or constructs a product to be manufactured in a housing. The devices can operate in a fully or partially automated manner. For example, this can be a robot.
[0024] A “head-up display” (HUD) is a display system in which the user can maintain their head position or viewing direction because the items of information are projected into their field of view. An HUD can be equipped with a 3D and/or augmented reality function. In current computer games, status displays which are not part of the virtual environment but are positioned statically at the edges of the field of view are generally designated as an HUD.
[0025] An HUD generally comprises an imaging unit, an optics module, and a projection surface. The imaging unit generates the image. The optics module, comprising a collimator and deflection optics, conducts the image onto the projection surface or the combiner.
[0026] The teachings herein are explained in more detail hereinafter on the basis of exemplary embodiments:
Example 1
[0027] An exemplary structure of an HUD having projection on at least one transparent windowpane, or part of such a windowpane, of a housing of a production line comprises: [0028] an optics module, for example in the form of a display comprising a haptic film and/or a laser grid for gesture interaction, and an at least 25% opaque film for the projection on the transparent windowpane, which is a Plexiglas windowpane, for example; [0029] an imaging unit in the form of a close-range projector; [0030] a processor which receives, processes, and sends, passes on, or returns data of the product to be manufactured, the manufacturing machine, the optics module of the HUD, the sensors, the touch elements, and/or the imaging unit to one of these modules, by which the system controls in an open-loop and closed-loop manner; [0031] for example, an optical sensor such as a light barrier for activating the system when a user is detected; and [0032] for example, a loudspeaker and a microphone for speech interaction and/or alarm output.
[0033] The windowpane forms, for example, the projection surface of the HUD. This is in particular a reflective, light-transmissive windowpane. The user of, for example, a windowpane projector thus sees the reflected items of information of the imaging unit and at the same time the real world of the production behind the windowpane.
[0034] In some embodiments, the system has various sensors which are interfaces to the user, to the manufacturing machine in the housing, and to the product to be manufactured. These sensors send the data detected thereby to the processor, which processes these data and in turn sends corresponding signals to the manufacturing machine, the imaging unit, the loudspeaker, and/or the touchscreen. All types of known sensors can be used: analytical sensors for checking the quality of the product to be manufactured; sensors generating light, temperature, pressure, audio, and/or video data; and, for example, “sensors” such as a camera and/or a microphone.
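The receive-combine-dispatch role of the processor described above can be illustrated with a minimal sketch. The sensor names, threshold values, and signal labels here are illustrative assumptions, not part of the disclosure:

```python
# Minimal sketch of the processor loop: collect sensor readings,
# combine them, and emit control signals to the machine, imaging
# unit, and loudspeaker. Names and thresholds are illustrative.
from dataclasses import dataclass, field


@dataclass
class Processor:
    readings: dict = field(default_factory=dict)

    def receive(self, sensor: str, value) -> None:
        # Store the latest reading from each sensor.
        self.readings[sensor] = value

    def process(self) -> list:
        # Combine readings into control signals for the output modules.
        signals = []
        if self.readings.get("light_barrier") == "user_detected":
            signals.append(("imaging_unit", "activate_hud"))
        if self.readings.get("temperature_C", 0) > 80:
            signals.append(("machine", "reduce_speed"))
            signals.append(("loudspeaker", "overtemperature_alarm"))
        return signals


p = Processor()
p.receive("light_barrier", "user_detected")
p.receive("temperature_C", 95)
```

A real system would of course fuse far more channels (pressure, audio, video) and drive the closed-loop machine control from the same combined data.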
[0035] In some embodiments, the system comprises an artificial intelligence (AI), which creates a prediction for the progress of the production of the product to be manufactured with the aid of the data combined and processed by the processor. Error sources in production can thus be recognized and eliminated rapidly on the basis of the detected data.
[0036] Furthermore, the system can detect the status of the manufacturing machine and/or the tools used via the sensors, wherein the data are either displayed directly by the HUD or, for example, processed by an AI, with corresponding predictions, requirements, etc. then being displayed by the HUD.
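The two paths described above, showing raw status directly on the HUD versus routing it through an AI first, can be sketched as a simple dispatch. The wear threshold and field names are hypothetical:

```python
# Sketch of status routing: raw display vs. AI-derived prediction.
# The 90 % wear rule and "tool_wear_pct" key are illustrative only.

def route_status(status: dict, predictor=None) -> dict:
    """Return the payload the HUD should display for a status reading."""
    if predictor is None:
        # No AI configured: show the raw machine/tool status on the HUD.
        return {"hud": status}
    # AI configured: display the prediction alongside the raw status.
    return {"hud": {"status": status, "prediction": predictor(status)}}


def wear_predictor(status: dict) -> str:
    # Hypothetical rule standing in for a trained model.
    return "replace_tool_soon" if status.get("tool_wear_pct", 0) > 90 else "ok"
```

In a deployed facility the `predictor` would be a trained model rather than a fixed rule; the routing structure stays the same.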
[0037] “Combiner” refers to a device which combines signals from various sources to form one signal. For example, a combiner for augmented reality combines light which is incident from the surroundings with light which comes from an image source - such as the imaging unit of an HUD - to form a joint view. A possible structure of a combiner comprises semitransparent mirrors, which let through ambient light and at the same time reflect a display installed at the matching distance, for example a liquid crystal display (LCD).
[0038] For an observer from a matching perspective, the impression thus results that the graphics displayed on the display are located behind the combiner, while objects located behind the combiner are still visible.
[0039] In a housing of a production line having HUD display, the combiner is often simply the Plexiglas windowpane. In some embodiments, there are also so-called “boxed combiners”, which use a separate small windowpane independently of the Plexiglas windowpane to merge the virtual graphics with the light from the physical surroundings.
[0040] The generated virtual image can be projected so that it can be acquired with one eye - monocular - or with both eyes - binocular. Binocular HUDs have a higher visibility range than monocular ones. The projection of the virtual image is matched to the size of, and the distance from, the physical object behind the windowpane.
[0041] For example, laser diodes and/or light-emitting diodes are used as the light source for HUDs. The brightness of the image is controlled in an open-loop manner in dependence on the ambient light via a photosensor.
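The photosensor-driven brightness adjustment mentioned above amounts to an open-loop mapping from ambient illuminance to display brightness. The breakpoints below (10% floor, full brightness at 10,000 lux) are illustrative assumptions:

```python
# Sketch of open-loop HUD brightness control from a photosensor
# reading. No feedback is measured from the display itself, hence
# "open-loop". Breakpoint values are illustrative assumptions.

def hud_brightness(ambient_lux: float,
                   min_pct: float = 10.0,
                   max_pct: float = 100.0,
                   full_at_lux: float = 10_000.0) -> float:
    """Map ambient illuminance (lux) to a brightness percentage, clamped."""
    frac = max(0.0, min(ambient_lux / full_at_lux, 1.0))
    return min_pct + frac * (max_pct - min_pct)
```

Keeping a nonzero floor ensures the projected image stays legible in a darkened hall, while the clamp prevents overdriving the imaging unit in direct sunlight.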
[0042] An optics module of an HUD and/or a haptic film in the form of a touch element and/or a combiner - thus an interactive display and/or input element - is generated, for example, by a high-resolution color thin-film transistor (TFT) display. This can be used, for example, in the form of an active matrix display.
[0043] Such an interactive display and/or input element is, for example, a device in the form of a transparent touch element. This is available, for example, as a touchscreen or multitouch element in a typical commercial form as flexible transparent films or on glass as the carrier. Touchscreens are also available in the form of transparent multitouch elements, which are operable via sensor buttons - thus electronic switching elements controlled by finger touch without mechanical buttons - and/or by means of zooming and swiping, etc.
[0044] Input devices such as displays and/or touchscreens, in which electrical signals are generated by circling, swiping, and/or pulling movements of the hand, a finger, and/or a pen, and thus inputs can be made reliably and unambiguously into a system having a processor, are referred to as a “wheel” and/or a “slider”.
[0045] A “processor” is a programmable arithmetic unit, thus a machine which controls other machines or electrical circuits according to delivered commands and executes an algorithm, which usually includes data processing. A computer comprises a processor, a server comprises a processor, etc.
[0046] By way of a transparent display and/or operating element according to another exemplary embodiment of the invention having corresponding sensors, a user who walks past can be informed about the most up-to-date KPIs (key performance indicators), the current assembly design plans, the current status of the product, and/or the current documentation of the storage containers, machines, tools, and facilities. For example, the display and/or operating element can be equipped for this purpose with a sensor switch, by way of which it can assign the user to a group which has certain authorizations, so that the processor, after receiving the signal that the user is in the vicinity, immediately provides the items of information tailored to the respective user via the display.
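The group-based authorization filtering described above can be sketched as a lookup table mapping a user's group to the information categories they may see. The role names and category labels are hypothetical:

```python
# Sketch of role-based information filtering for the display element.
# Role names, category keys, and the authorization table are
# illustrative assumptions, not part of the disclosure.

AUTHORIZATIONS = {
    "operator":   {"kpis", "product_status"},
    "maintainer": {"kpis", "product_status", "machine_docs", "tool_docs"},
    "visitor":    {"kpis"},
}


def info_for_user(role: str, available: dict) -> dict:
    # After the sensor switch reports a nearby user and their group,
    # show only the items that group is authorized to see.
    allowed = AUTHORIZATIONS.get(role, set())
    return {k: v for k, v in available.items() if k in allowed}
```

An unknown role yields an empty view, which is the safe default for a factory-floor display.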
[0047] A display and/or an indicator in which items of digital information are overlaid on the physical world in real time is referred to as “augmented reality (AR)”. That is to say, the users still perceive their environment, but this is supplemented with virtual elements and items of information.
[0048] A minimalistic form of “augmented reality” is referred to as “assisted reality”, in which users have, for example, items of text information overlaid, so that the course of time, degree of production, product status, and test results are visible in the HUD. The user thus sees the product and at the same time perceives the results of the running quality tests, for example the surface roughness of a wafer, via the projection surface of the HUD on the windowpane.
[0049] Simultaneously with the data about the manufacturing product, data about the manufacturing machine, for example the number of the rotations of a robot arm per minute etc., can be overlaid via assisted reality.
[0050] A further combination of virtual and physical world is referred to as “mixed reality”. In mixed reality, a digitally generated virtual world is combined with items of information and/or elements of the physical world. The user can act both in the physical and in the virtual world here.
[0051] Above all, “augmented reality” - thus the overlay of digital content over the affected real components of the facility - and natural interaction with the system may be implemented by the use proposed here of the transparent elements of housings of the devices and machines of a production line as HUDs or multitouch display and input elements having greatly varying accompanying sensors and modules for speech interaction. Content can thus be provided efficiently with context for faster comprehension by users, and decisions can be made or tasks performed directly in a natural manner by speech interaction, gestures, etc.
[0052] For example, the system detects an irregularity on the housed machine and reports this in an automated manner. The user located in the closest proximity authenticates themselves on the system, for example by means of smartphone, card reader, smartwatch, facial recognition, and/or voice, and depending on their role is tasked with eliminating the irregularity - under certain circumstances immediately with action recommendations. Depending on the role, this can extend from a simple list of responsible contacts who are to be found immediately up to a video introduction for immediate error elimination, which can be played back via the HUD.
[0053] In some embodiments, the system recognizes a passing user - this can be a human or a machine “AEV” - and outputs a selection of simple tasks with the request to carry them out. For some tasks, expert knowledge is not required, for example refilling packaging cardboard. On the other hand, expert knowledge can be required; the task issued by the machine to an arbitrary user then involves finding the closest human qualified for the task and accompanying them back.
[0054] In some embodiments, the system recognizes a passing user by way of a sensor system for authenticating the user. This can comprise, for example, a face recognition, a speech recognition, a Bluetooth recognition, and/or an RFID recognition, etc. The system then addresses the user in an automated manner and gives them their task.
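The multi-method recognition and automated task assignment described above can be sketched as a dispatch over the available recognizers. The recognizer names, user identities, and task table are illustrative assumptions:

```python
# Sketch of multi-method user authentication (face, voice, Bluetooth,
# RFID) followed by automated task assignment. User names and the
# task table are illustrative assumptions.
from typing import Optional

TASKS = {"alice": "refill packaging cardboard"}


def authenticate(evidence: dict) -> Optional[str]:
    # Accept the first recognition method that yields a user identity.
    for method in ("face", "voice", "bluetooth", "rfid"):
        user = evidence.get(method)
        if user:
            return user
    return None


def address_user(evidence: dict) -> str:
    # Address the recognized user and give them their task, if any.
    user = authenticate(evidence)
    if user is None:
        return "no user recognized"
    task = TASKS.get(user)
    return f"{user}: {task}" if task else f"{user}: no current task"
```

Trying the recognizers in a fixed order is a simplification; a real system might require agreement between methods before granting authorizations.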
[0055] If no current task exists for the affected user or a passing user, the system simply displays the KPIs and the current status of the facility with the aid of augmented reality. Furthermore, AI-assisted predictions are output, for example that resources could be used up in X minutes and a refilling requirement will arise, or the point in time of the last maintenance.
[0056] In some embodiments, the system identifies a passing user, for example, on the basis of their smartwatch or smartphone. For this user, role-related data and tasks not affecting them are then hidden.
[0057] In some embodiments, the system recognizes a passing user and displays the KPIs and/or the current status of the system with the aid of augmented reality on the projection surface of the HUD. Furthermore, AI-assisted predictions, which are created, for example, in an automated manner by the processor on the basis of the data acquired by the sensors, can be displayed. The user can thus recognize immediately that, for example, in X minutes a storage container will be consumed, or a nozzle will be completely clogged and thus there is a need for cleaning, or that a screw is coming loose on the machine, etc. The point in time of the last maintenance etc. can also be displayed in an automated manner.
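The "consumed in X minutes" style of prediction described above can be sketched, in its simplest form, as a linear extrapolation from two fill-level readings. The disclosure envisions AI-assisted predictions; reducing this to a linear rate is our simplifying assumption:

```python
# Sketch of a consumption prediction from two sensor fill-level
# readings. A deployed system might use a trained model instead;
# linear extrapolation is an illustrative simplification.
from typing import Optional


def minutes_until_empty(level_then_pct: float,
                        level_now_pct: float,
                        minutes_between: float) -> Optional[float]:
    """Extrapolate the remaining minutes until the fill level hits 0 %."""
    rate = (level_then_pct - level_now_pct) / minutes_between  # % per minute
    if rate <= 0:
        return None  # level stable or rising: no refill requirement predicted
    return level_now_pct / rate
```

The resulting "X minutes" figure is what the HUD would overlay next to the affected storage container.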
[0058] In some embodiments, the system also comprises a camera, for example a 360° camera. The system can thus acquire further data and have it incorporated, via the processor which comprises the open-loop control of the system, into an AI or other data processing.
[0059] Three-dimensional video data may also be acquired and likewise processed via one or more cameras. For example, a camera can detect that a user needs assistance in a specific process step, which is then provided to them in an automated manner via the HUD.
[0060] The system can thus display the tasks of a production machine, as well as those of a user, via the HUD and visualize the degree of processing. Furthermore, the work can be displayed prioritized according to degree of urgency or time dependence.
[0061] The system can also detect irregularities in the work sequence via its camera: by means of AI-based comparison in the processor, it can establish poor posture, absence, closed eyes, etc. and communicate or display them in an automated manner. The system can also apply the personal settings of a user in an automated manner upon authentication of the user.
[0062] The present teachings open up for the first time the possibility of replacing conventional monitors in a production line and simultaneously, by using transparent interactive display and input elements such as transparent multitouch elements and/or HUDs, making the transparent areas of the housings of a production line usable via augmented reality and/or assisted reality.