DEVICE FOR GENERATING AN IMAGE OF AN OBJECT

20220156916 · 2022-05-19

Abstract

A device for generating an image of an object by electromagnetic waves has a transmission device which is set up to radiate electromagnetic waves in the direction of the object, a receiving device which is set up to receive electromagnetic waves from the object, and a digital processing and control unit which is set up to generate image data of the object from the measured data. Here, the transmission device and the receiving device are arranged in at least one modular unit. The digital processing and control unit has an interface via which different modular units can be exchangeably coupled to the digital processing and control unit. Here, the interface is set up to transmit data to the modular unit and to receive data from it, to transmit control signals to the transmission device and to the receiving device, and to supply the modular unit with energy.

Claims

1. A device for generating an image of an object (O, O*) by means of electromagnetic waves, having: a transmission device, which is set up to radiate electromagnetic waves in the direction of the object (O, O*); a receiving device, which is set up to receive electromagnetic waves from the object (O, O*); and a digital processing and control unit (20), which is set up to generate image data of the object (O, O*) from the measured data; wherein the transmission device and the receiving device are arranged in at least one modular unit (10, 11, 12, 13, 14, 15, 16), and the digital processing and control unit (20, 21) has an interface (S1), via which different modular units (10, 11, 12, 13, 14, 15, 16) are exchangeably coupled to the digital processing and control unit (20, 21); and wherein the interface (S1) is set up to transmit data to the modular units (10, 11, 12, 13, 14, 15, 16) and to receive data from them, to transmit control signals to the transmission device and to the receiving device and to supply the modular unit (10, 11, 12, 13, 14, 15, 16) with energy.

2. The device according to claim 1, wherein the modular unit (10, 11, 12, 13, 14, 15, 16) has a pre-processing device, which is set up to generate partial image data of the object (O, O*) from the received electromagnetic waves, the interface (S1) is set up to transmit the partial image data to the digital processing and control unit (20, 21), and the digital processing and control unit (20, 21) is set up to generate image data from the partial image data.

3. The device according to claim 1, wherein the digital processing and control unit (20, 21) and at least one modular unit (10, 11, 12, 13, 14, 15, 16) are arranged in a common housing (40, 45, 46, 47).

4. The device according to claim 1, wherein the digital processing and control unit (20, 21) and at least one modular unit (10, 11, 12, 13, 14, 15, 16) are arranged in different housings (42, 44, 46).

5-7. (canceled)

8. The device according to claim 1, wherein the transmission device is set up to emit electromagnetic waves with a frequency or with several frequencies of a frequency range, and the receiving device is set up to receive the electromagnetic waves with the frequency or with the frequencies in the frequency range.

9. The device according to claim 1, wherein the transmission device emits electromagnetic waves in a frequency range of from 1 GHz to 10 THz, and the receiving device is set up to receive the electromagnetic waves in this frequency range.

10. The device according to claim 1, wherein the transmission device and the receiving device have several measuring channels.

11. The device according to claim 1, wherein the transmission device and the receiving device are set up to carry out a reflection measurement on the object (O, O*).

12. The device according to claim 11, wherein an absorber (A) or a reflector (R) is arranged opposite the transmission device.

13. The device according to claim 1, wherein the transmission device and the receiving device are set up to carry out a transmission measurement on the object (O, O*).

14. The device according to claim 1, wherein the digital processing and control unit (20, 21) is set up to ascertain a movement of the object (O, O*) from the measured data.

15. The device according to claim 1, wherein several modular units (10, 11, 12, 13, 14, 15, 16) can be simultaneously coupled to the digital processing and control unit (20, 21) via the interface (S1).

16. The device according to claim 15, wherein the several modular units (10, 11, 12, 13, 14) are arranged in lines.

17. The device according to claim 16, wherein the digital processing and control unit (20) is set up to control the several modular units (10, 11, 12, 13, 14) in such a way that they each simultaneously carry out a measurement in sequences running temporally one after the other.

18. The device according to claim 15, wherein the several modular units (10, 11, 15, 16) are arranged in different orientations in relation to the object (O, O*), and the transmission devices radiate the electromagnetic waves from different directions onto the object (O, O*), and the receiving devices receive the electromagnetic waves from different directions from the object (O, O*).

19. The device according to claim 15, wherein the respective transmission devices of the several modular units (10, 11, 12, 13, 14, 15, 16) radiate electromagnetic waves with different polarization and/or in different frequency ranges, and the corresponding receiving devices of the modular units (10, 11, 12, 13, 14, 15, 16) receive the electromagnetic waves with the different polarization and/or in different frequency ranges.

20. The device according to claim 15, wherein the digital processing and control unit (20) is set up to transmit a reference signal for coherently controlling the modular units (10, 11, 12, 13, 14, 15, 16) to the modular units (10, 11, 12, 13, 14, 15, 16) via the interface (S1).

21. The device according to claim 1, comprising an image evaluation unit (30) which is part of the digital processing and control unit (20, 21), wherein the image evaluation unit (30) is set up to evaluate the image data generated by the digital processing and control unit (20, 21), to generate output signals therefrom and to emit the generated output signals.

22. The device according to claim 1, comprising an image evaluation unit (30) which is modularly connected to the digital processing and control unit (20) and is set up to evaluate the image data generated by the digital processing and control unit (20), to generate output signals therefrom and to emit the generated output signals.

23. The device according to claim 22, wherein the modular image evaluation unit (30) is connected to the digital processing and control unit (20, 21) by means of a further interface (S2), and the further interface (S2) is set up to transmit the image data from the digital processing and control unit (20, 21) to the image evaluation unit (30).

24. The device according to claim 22, wherein the modular image evaluation unit (30) is connected to several digital processing and control units (20, 21) and is set up to evaluate the image data respectively generated by the digital processing and control units (20, 21), to generate output signals therefrom and to emit the generated output signals.

25. The device according to claim 1, wherein the interface (S1) between the modular unit (10, 11, 12, 13, 14, 15, 16) and the digital processing and control unit (20, 21) and/or the interface (S2) between the modular image evaluation unit (30) and the digital processing and control unit (20, 21) are formed as the connection to a computer network; and wherein the digital processing and control unit (20, 21) is formed to detect anomalies when generating the image data.

26. (canceled)

27. The device according to claim 1, wherein the digital processing and control unit (20, 21) or the image evaluation unit (30) has an output interface (S3), via which the generated image data and/or output signals are emitted.

28. The device according to claim 27, wherein the output interface (S3) is formed as an IO-Link, Ethernet or fieldbus interface.

29. The device according to claim 1, wherein the digital processing and control unit (20, 21) or the image evaluation unit (30) is formed to identify a constant symbol from the image data and to emit an error signal when this symbol is no longer identified.

30. The device according to claim 1, wherein the digital processing and control unit (20, 21) and/or the image evaluation unit (30) can be coupled to further sensors (50) and/or is set up to obtain measuring data from further sensors (50), and the digital processing and control unit (20, 21) and/or the image evaluation unit (30) is set up to include the measuring data of the further sensors (50) in the evaluation.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0056] Other objects and features of the invention will become apparent from the following detailed description considered in connection with the accompanying drawings. It is to be understood, however, that the drawings are designed as an illustration only and not as a definition of the limits of the invention.

[0057] In the drawings,

[0058] FIGS. 1A, 1B, 1C, 1D, and 1E each show a schematic depiction of different exemplary embodiments of the device according to the invention.

[0059] FIGS. 2A, 2B, 3A, and 3B each show a schematic depiction of various arrangements of the modular units for reflection measurement.

[0060] FIG. 4 shows a schematic depiction of an arrangement of the modular units for transmission measurement.

[0061] FIG. 5 shows an isometric view of the device according to the invention and an object to be examined on a band conveyor.

[0062] FIG. 6 schematically shows the sequence of a measuring method.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

[0063] Different exemplary embodiments of the device according to the invention are depicted in FIGS. 1A, 1B, 1C, 1D, and 1E. The same components are labelled with the same reference numerals and, for reasons of clarity, are only described once. Unless otherwise described, the features of the individual exemplary embodiments also apply to the other exemplary embodiments. FIGS. 1A, 1B, 1C, and 1D each show exemplary embodiments with two modular units 10, 11, a digital processing and control unit 20 and an image evaluation unit 30. In further exemplary embodiments, additional modular units or also only one modular unit can be provided.

[0064] The modular units 10, 11 are each formed as an individual module, referred to as a high frequency front end module, constituting a closed functional unit. The modular units 10, 11 each have a transmission device and/or a receiving device, which are not depicted here; reference is made to FIGS. 2A, 2B, 3A, 3B, and 4 for their description. The transmission device radiates electromagnetic waves in the direction of the object, and the receiving device receives the electromagnetic waves from the object. In addition, each modular unit 10, 11 has a pre-processing device (not depicted) which generates partial image data of the object from the electromagnetic waves received by the receiving device and forwards these to the digital processing and control unit 20. In this embodiment, the modular units 10, 11 are constructed in the same way and have the same functions. In further embodiments, the modular units 10, 11 differ and, for example, output electromagnetic waves with a different frequency and/or polarization.

[0065] The modular units 10, 11 are connected to the digital processing and control unit 20 via an interface. The first interface S1 is formed as a wired internal communication connection between the two components. In order to connect several modular units 10, 11 to the digital processing and control unit 20, the first interface S1 is formed as a bus system. With the wired connection, a separate cable can be provided for each modular unit 10, 11, or the connection to the individual modular units 10, 11 is at least partially produced via the same cable. The wired connection can also be looped through the respective modular units 10, 11. Measuring data are transmitted from the modular units 10, 11 to the digital processing and control unit 20 via the first interface S1, and control signals are transmitted from the digital processing and control unit 20 to the transmission device and/or receiving device of the modular units 10, 11. In addition, the modular units 10, 11 are supplied with energy from an energy supply E, as described below, via the first interface S1.

[0066] In relation to the functionality of the digital processing and control unit 20 and the image evaluation unit 30, reference is made to the description of FIGS. 2A, 2B, and 6. A second interface S2 is provided between the digital processing and control unit 20 and the image evaluation unit 30, said interface being formed as an Ethernet connection or as a USB connection, for example. The image evaluation unit 30 is supplied with energy via an energy supply E and forwards it to the digital processing and control unit 20 via the second interface S2.

[0067] In addition, the image evaluation unit 30 has a third interface S3, which functions as an output interface. The third interface S3 is formed as an IO-Link interface or, in other embodiments, as a fieldbus interface or as an Ethernet interface. Output signals, which are generated by the image evaluation unit 30 and, optionally, by the digital processing and control unit 20, are output via the third interface S3. Control signals and/or parameters for the digital processing and control unit 20 and/or for the image evaluation unit 30 can also be input via the output interface.

[0068] FIG. 1A shows a first exemplary embodiment, in which the digital processing and control unit 20 and the two modular units 10, 11 are arranged in a common housing 40. The modular units 10, 11 nevertheless each form independent closed functional units. Each modular unit 10, 11 is connected to the digital processing and control unit 20 via the first interface S1, which constitutes an internal communication connection. In this exemplary embodiment, the image evaluation unit 30 is arranged in a separate housing 41, whereby it can be placed spatially separated from the processing and control unit 20 and the modular units 10, 11. The image evaluation unit 30 is connected to the digital processing and control unit 20 via the second interface S2 formed as a wired Ethernet connection or USB connection.

[0069] FIG. 1B shows a second exemplary embodiment, in which the digital processing and control unit 20 is arranged in a housing 42 and the modular units 10, 11 are each arranged in a separate housing 43, 44. The housing 43 of one modular unit 10 is then connected to the housing 42 of the digital processing and control unit 20, and the housing 44 of the other modular unit 11 is connected to the housing 43 of the modular unit 10 arranged before it. For this, the housings 43, 44 of the modular units 10, 11 each have a fixing element (not shown), with which they can be fixed on the housing 42 of the digital processing and control unit 20 or on the housing 43 of the modular unit 10 arranged before it. The housing 42 of the digital processing and control unit 20 and the housings 43, 44 of the modular units 10, 11 each have a complementary fixing element (not shown), which interacts with the fixing element in order to produce a fastening. The fastening is a plug connection, for example. The first interface S1 is here also implemented via the plug connection. In this way, any desired number of modular units can be coupled to the digital processing and control unit 20. As described in the first exemplary embodiment, the image evaluation unit 30 is arranged in a separate housing 41; in relation to this, reference is made to the description above.

[0070] FIG. 1C shows a third exemplary embodiment, which differs from the first exemplary embodiment in that the image evaluation unit 30 is arranged in a common housing 45 together with the digital processing and control unit 20 (and, in this example, with the modular units 10, 11). In this example, the image evaluation unit 30 is still formed as a closed functional unit and is connected to the digital processing and control unit 20 via the second interface S2. In a further example, the image evaluation unit 30 can also be part of the digital processing and control unit 20. As a result of the arrangement of the modules 10, 11, 20, 30 in one housing 45, the device can be used as a single component at the site of application. In the devices according to the first, second and fourth exemplary embodiments, the image evaluation unit 30 can likewise be arranged in a common housing together with the digital processing and control unit 20.

[0071] FIG. 1D shows a fourth exemplary embodiment in which a first modular unit, as described in connection with the first exemplary embodiment, is arranged in a common housing 46 with the digital processing and control unit 20. As described in connection with the second exemplary embodiment, a second modular unit 11 is arranged in a separate housing 44 and is connected to the housing 46 via the plug connection.

[0072] FIG. 1E shows a fifth exemplary embodiment in which the modular image evaluation unit 30 is connected to two digital processing and control units 20, 21. As described in connection with the first exemplary embodiment, a first digital processing and control unit 20 is arranged in a common housing 40 with two modular units 10, 11 and is coupled to these. Analogously, a second digital processing and control unit 21 is arranged in a common housing 47 with two further modular units 12, 13. The image evaluation unit 30 receives image data from the two digital processing and control units 20, 21 and evaluates the image data in combination and, on this basis, generates the output signals.

[0073] In addition, an optical camera 50 is provided which provides optical image data of the object O to the image evaluation unit 30. The image evaluation unit 30 uses the optical image data when evaluating the image data generated by the digital processing and control unit 20. Using the optical image data of the optical camera 50, the position, the contour and the surface of the object can already be ascertained in advance. The optical camera 50 can also be provided in other exemplary embodiments in the same way.

[0074] In further embodiments, the interfaces S1, S2 can be formed as a connection to a computer network, which functions as a cloud. In this case, the digital processing and control unit 20 or parts thereof can be implemented in the computer network. The interfaces S1, S2 and S3 can also be formed as a radio connection.

[0075] In FIGS. 2A and 2B, an arrangement of two modular units 10, 11 for a reflection measurement of an object O is respectively shown. The construction of such a modular unit 10, 11 is described using the first modular unit 10 as an example: the first modular unit 10 is formed as a high frequency front end module and has a transceiver 60 and an antenna unit 61, which comprises at least one antenna (not shown). The transceiver 60 is connected to the antenna unit 61 and provides it with an electrical signal, with which the antenna unit 61 generates electromagnetic waves. In order to set or change the frequency in situ, the electrical signal of the transceiver 60 can be controlled and output corresponding to the desired frequency. The transceiver 60 and the antenna unit 61 thus function as the transmission device. The frequency of the electromagnetic waves lies in a frequency range of from 1 GHz to 10 THz, i.e., in the microwave range or in the terahertz range. The electromagnetic waves are radiated by the at least one antenna in the direction of the object O. There, the electromagnetic waves are reflected or scattered and propagate back to the modular unit 10. The scattered or reflected electromagnetic waves are received by the at least one antenna of the antenna unit 61 and converted by the transceiver 60 into an electrical signal that can be recorded metrologically. For this, the transceiver 60 can downmix the received signal to a low baseband frequency, wherein a baseband with a frequency of 0 Hz is also possible. The antenna unit 61 and the transceiver 60 thus also function as the receiving unit. The electrical signal is then transmitted to the digital processing and control unit 20 via the first interface S1 (not depicted) and processed there. The baseband signals are pre-processed and digitized by the digital processing and control unit 20. In addition, a predetermined algorithm is applied to the baseband signals, by means of which an imaging method is implemented. Various algorithms for image calculation are known for this; for example, the back projection algorithm can be used. The digital processing and control unit 20 is additionally set up to control the transceiver 60.
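Purely by way of illustration of the back projection principle mentioned above (and not as part of the claimed subject matter), the following minimal sketch reconstructs a single point scatterer from simulated multi-frequency reflection data. The antenna positions, frequencies and the ideal point-scatterer model are assumed example values:

```python
import cmath
import math

# Illustrative back-projection sketch: a small aperture of antennas
# measures an ideal point scatterer at several frequencies, and the
# image is formed by compensating the round-trip phase and summing.

C = 3e8  # speed of light in m/s

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def simulate(ant_pos, freqs, target):
    """Round-trip phase history of one ideal point scatterer."""
    return [[cmath.exp(-1j * 2 * (2 * math.pi * f / C) * dist(p, target))
             for f in freqs] for p in ant_pos]

def back_projection(data, ant_pos, freqs, grid):
    """For each pixel, undo the round-trip phase and sum coherently."""
    image = []
    for g in grid:
        acc = 0j
        for i, p in enumerate(ant_pos):
            r = dist(p, g)
            for j, f in enumerate(freqs):
                acc += data[i][j] * cmath.exp(1j * 2 * (2 * math.pi * f / C) * r)
        image.append(abs(acc))
    return image

# Nine antennas along Y at X = 0; assumed target at (0.5 m, 0.05 m).
ant_pos = [(0.0, -0.2 + 0.05 * i) for i in range(9)]
freqs = [75e9 + 0.5e9 * j for j in range(11)]
target = (0.5, 0.05)
grid = [(0.5, -0.1 + 0.005 * n) for n in range(41)]  # image line at X = 0.5

img = back_projection(simulate(ant_pos, freqs, target), ant_pos, freqs, grid)
peak_y = grid[img.index(max(img))][1]  # the image peaks at the scatterer's Y
```

At the true scatterer position all phase terms cancel, so the coherent sum peaks there; practical implementations vectorize this triple loop and add windowing and calibration.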

[0076] In this embodiment, the second modular unit 11 is constructed in the same way and has the same components and functions. The control of the second modular unit 11 is carried out by the digital processing and control unit 20 in the manner described above. In another embodiment not depicted, the two modular units 10, 11 are constructed differently and/or have different functions. For example, the two modular units 10, 11 can output electromagnetic waves with a different frequency and/or polarization. In this case, the digital processing and control unit 20 correspondingly controls the different modular units 10, 11.

[0077] In FIG. 2A, the two modular units 10, 11 are directed towards the object O and each radiate electromagnetic waves onto it. The modular units 10, 11 are arranged at different positions and have different orientations. Thus, the object O is observed by the modular units 10, 11 from different perspectives. Based on the received signals of the two modular units 10, 11, the digital processing and control unit 20 implements the imaging method. The multiple perspectives lead to improved imaging.

[0078] In FIG. 2B, the two modular units 10, 11 are also directed towards the object and are arranged on different sides of the object O. In the case depicted, an obstacle H is located between the first modular unit 10 and the object. The obstacle H is impenetrable to the electromagnetic waves emitted by the antenna unit 61 of the first modular unit 10. The second modular unit 11, however, has a clear view of the object O. Thus, the measurement is carried out only by the second modular unit 11.

[0079] In FIGS. 3A and 3B, an arrangement of a modular unit 10 together with an additional boundary condition for a reflection measurement of an object O is respectively shown. The modular unit 10 corresponds to that described above. In FIG. 3A, an absorber A is arranged opposite the modular unit 10 on the other side of the object O. The electromagnetic waves that penetrate the object O are absorbed by the absorber A, and thus only the waves reflected or scattered by the object O are received by the antenna unit 61. In FIG. 3B, a reflector R is arranged opposite the modular unit 10 on the side of the object O facing away from the modular unit 10. The electromagnetic waves that penetrate the object O are reflected by the reflector R and pass through the object O again before they are received by the antenna unit 61. Thus, scattering properties within the object can be better ascertained. The absorber A and the reflector R can also be arranged next to one another. The boundary conditions described can be assumed for the exemplary embodiments described.

[0080] In FIG. 4, an arrangement of two modular units 15, 16 for a transmission measurement of an object O is shown. The first modular unit 15 is formed as a high frequency front end module and has a transceiver 70 and an antenna unit 71, which comprises at least one antenna (not shown). Instead of the transceiver 70, a transmitter can also be provided in the first modular unit 15 for the transmission measurement. The transceiver 70 is connected to the antenna unit 71 and provides it with an electrical signal, with which the antenna unit 71 generates electromagnetic waves. In order to set or change the frequency in situ, the electrical signal of the transceiver 70 can be controlled and output corresponding to the desired frequency. The transceiver 70 and the antenna unit 71 thus function as a transmission device. The frequency of the electromagnetic waves lies in a frequency range of from 1 GHz to 10 THz, i.e., in the microwave range or in the terahertz range. The electromagnetic waves are radiated by the at least one antenna in the direction of the object O. There, the electromagnetic waves penetrate the object O and reach the other side.

[0081] A second modular unit 16 is arranged opposite the first modular unit 15, said second modular unit also being formed as a high frequency front end module. The second modular unit 16 has a transceiver 80 and an antenna unit 81, which comprises at least one antenna (not shown). Instead of the transceiver 80, a receiver can also be provided in the second modular unit 16 for the transmission measurement. The transmitted electromagnetic waves are received by the at least one antenna of the antenna unit 81 and converted by the transceiver 80 into an electrical signal that can be recorded metrologically. For this, the transceiver 80 can downmix the received signal to a low baseband frequency, wherein a baseband with a frequency of 0 Hz is also possible. The antenna unit 81 and the transceiver 80 thus also function as the receiving unit. The electrical signal is then transmitted to the digital processing and control unit 20 via the first interface S1 (not depicted in this Figure). The baseband signals are pre-processed and digitized by the digital processing and control unit 20. In addition, a predetermined algorithm is applied to the baseband signals, via which an imaging method is implemented. Various algorithms for image calculation are known for this; for example, the magnitudes of the individual measuring points can be interpreted as pixels, which thus result in an image. The digital processing and control unit 20 is additionally set up to control the transceivers 70 and 80.
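To illustrate this pixel interpretation (a sketch with assumed sample values, not the claimed implementation): in a transmission measurement, the magnitude of each complex-valued baseband sample can serve directly as a pixel, and an absorbing object shows up as a region of low magnitude:

```python
# Hypothetical 3x5 grid of complex baseband samples: rows are receiving
# channels, columns are successive measuring events; an absorbing
# object in the middle attenuates the transmitted wave.
raw = [
    [1.0 + 0.0j, 0.9 - 0.1j, 0.95 + 0.05j, 1.0 + 0.0j, 0.9 + 0.0j],
    [1.0 + 0.1j, 0.2 + 0.1j, 0.10 - 0.05j, 0.2 + 0.0j, 1.0 - 0.1j],
    [0.9 + 0.0j, 1.0 + 0.0j, 0.90 + 0.10j, 1.0 + 0.1j, 0.9 + 0.0j],
]

# The magnitude of each measuring point becomes one pixel of the image.
image = [[abs(sample) for sample in row] for row in raw]

# A simple threshold (assumed value) marks the object's "shadow".
shadow = [[px < 0.5 for px in row] for row in image]
```

The thresholding step is only one possible interpretation of the magnitude image; the patent leaves the concrete evaluation open.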

[0082] In FIG. 5, an exemplary application of the device according to the invention on a band conveyor F is shown. Several objects O, O*, which are each surrounded by packaging V, are moved on the band conveyor F in the running direction L of the band conveyor F. The device according to the invention serves to check the objects O, O* through the packaging V. In this example, a defective object O is to be identified. A defective object has anomalies, for example. In the situation depicted in FIG. 5, the two objects O* on the left-hand side have already been checked by the device according to the invention, the central object O is being checked by the device according to the invention, and the object O* on the right-hand side will be checked next. Here, five modular units 10-14 are provided, which are positioned above the band conveyor F. The five modular units 10-14 are arranged according to the first embodiment of FIG. 1A in a common housing 40 with the digital processing and control unit 20; in this embodiment they are constructed in the same way and have the same functions. The five modular units 10-14 are arranged in a row along a line perpendicular to the running direction L of the band conveyor F, are oriented counter to the running direction L, and substantially cover the entire width of the band conveyor F.

[0083] On opposite sides of the end at which the digital processing and control unit 20 is arranged, the common housing 40 has two tubular fixing elements 90. In each case, one holding rod 91 of a holding device 92 is inserted into these tubular fixing elements 90 in order to position the device according to the invention above the band conveyor F.

[0084] The five modular units 10-14 are controlled jointly by the digital processing and control unit 20, that is to say in such a way that the units 10-14 each simultaneously carry out a measurement along the line, in sequences running temporally one after the other with a predetermined temporal spacing. For this, the digital processing and control unit 20 optionally transfers a reference signal for coherently controlling the modular units 10-14 via the first interface S1. When the object O moves in the running direction L of the band conveyor F, each of the five modular units 10-14 measures, with each sequence of the measurement, a measuring point of the object O at a different position in the running direction L. The spatial spacing of the measuring points results directly from the movement speed of the object O and the temporal spacing of the measuring events carried out one after the other. Since the modular units 10-14 are fixed via the holding device 92 and are thus not moved, and the movement direction of the object O is predetermined by the running direction L of the band conveyor F, the movement speed of the object O can be directly ascertained from the measured data. For this, the Doppler shift or a method of tracking a scattering center of the object O, for example, is used. As described below in connection with FIG. 6, the measuring events are then evaluated. The modular units 10-14 have a pre-processing device (not shown), with which partial images can be compiled from the measured data. The partial images are then transmitted to the digital processing and control unit 20 and combined there into a whole image.
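The relationship just described between belt speed, sequence timing and measuring-point spacing can be sketched as follows (a simplified illustration with assumed numbers; the tracking-based speed estimate is reduced to averaging the displacements of a tracked scattering center):

```python
def point_spacing(belt_speed_m_s, sequence_period_s):
    """Spatial spacing dx = v * dt of successive measuring points."""
    return belt_speed_m_s * sequence_period_s

def estimate_speed(tracked_positions_m, sequence_period_s):
    """Estimate the belt speed from positions of a tracked scattering
    center, one position per measuring sequence."""
    steps = [b - a for a, b in zip(tracked_positions_m, tracked_positions_m[1:])]
    return (sum(steps) / len(steps)) / sequence_period_s

dx = point_spacing(0.5, 0.01)             # 0.5 m/s belt, 10 ms per sequence
positions = [0.100, 0.105, 0.110, 0.115]  # assumed tracked X positions (m)
v = estimate_speed(positions, 0.01)       # recovers the belt speed
```

In the stated setup the modular units are stationary and the movement direction is fixed, which is exactly why this one-dimensional estimate suffices.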

[0085] The whole image is transmitted to the image evaluation unit 30 via the second interface S2. Finally, the image evaluation unit 30 evaluates the whole image and assesses the object O, O*. If an object is identified as unobjectionable, as is the case for the two objects O* on the left-hand side, it can be processed further as usual. In contrast, if an object is identified as objectionable, as is the case with the central object O being examined, an error signal is emitted. The object O can then be treated specifically depending on the situation. The output signals generated by the image evaluation unit 30 are emitted to an output device 35, such as a PC (e.g., a laptop) or a mobile terminal (e.g., a smartphone or a tablet), via the output interface S3 formed as an IO-Link, Ethernet or fieldbus interface.

[0086] FIG. 6 schematically shows the sequence of a measuring method for the exemplary application from FIG. 5. The object O, which is not visible through the cover, i.e., the packaging V, is here assumed to have the shape of the letter “D”. A measuring event is carried out corresponding to the measuring plane 100 depicted. The five modular units 10-14 are arranged in a row in the Y direction, and each modular unit 10-14 carries out a measuring event 101-105. The running direction L of the band conveyor F here corresponds to the X direction. Through the sequence of measuring events 101-105 running temporally one after the other, a plurality of measuring events is carried out, by means of which the second dimension of the measuring plane in the X direction is achieved. As a result, a plurality of measuring points is obtained, which are depicted in this Figure as dashes and of which an exemplary measuring point is labelled 106. The spacing of the measuring points in the Y direction corresponds to the spatial spacing of the modular units 10-14. The spacing of the measuring points in the X direction results from the temporal spacing of the sequence of measuring events 101-105 and the movement speed of the object in the running direction L. Complex-valued raw data are obtained from the measuring events. A raw data image 110 is assembled from the raw data by pre-processing. In the example shown, the measuring point 106 of the measurement results in a pixel in the raw data image 110. In the digital processing and control unit 20, image data of the whole image are calculated from the raw data or the pre-processed raw data image via an algorithm. Based on a two-dimensional measurement (see measuring plane 100), three-dimensional image data can be calculated, whereby a piece of distance information can be allocated to each pixel. Anomaly detection can already be carried out by the digital processing and control unit 20 when calculating the image data.
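The per-pixel distance information mentioned above can be illustrated with a small sketch (assumed frequencies and distance, not the patented algorithm): when a pixel is measured at several stepped frequencies, its distance follows from the slope of the round-trip phase over frequency:

```python
import cmath
import math

C = 3e8                                        # speed of light in m/s
true_r = 0.42                                  # assumed target distance (m)
freqs = [76e9 + 0.1e9 * n for n in range(8)]   # assumed frequency steps

# Simulated complex-valued raw data of one pixel: round-trip phase
# -4*pi*f*r/C at each stepped frequency.
samples = [cmath.exp(-1j * 4 * math.pi * f * true_r / C) for f in freqs]

# Phase increment between adjacent frequency steps, then invert
# r = -C * dphi / (4 * pi * df); valid while |dphi| < pi (no phase wrap).
df = freqs[1] - freqs[0]
dphis = [cmath.phase(b * a.conjugate()) for a, b in zip(samples, samples[1:])]
estimated_r = -C * (sum(dphis) / len(dphis)) / (4 * math.pi * df)
```

With the assumed 0.1 GHz step the unambiguous range is limited to C / (4 * df) = 0.75 m, which the example distance respects.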

[0087] Based on this three-dimensional information, suitable visualization forms can be implemented. In FIG. 6, a sectional depiction 120 of the observation space is shown by way of example. Projecting the spatial information onto a predetermined geometry constitutes another possibility. In further embodiments, only individual sectional images are compiled at predetermined positions, whereby the calculating effort can be considerably reduced and the calculation can be accelerated. In order to accelerate the calculation in the digital processing and control unit 20 and to implement a real-time capable system, hardware implementation of calculations and calculating steps is provided, for example by using Field Programmable Gate Arrays (FPGAs). Moreover, fast programmable microprocessors and/or graphics processors (GPUs) can be used for this.

[0088] In this embodiment, the electromagnetic waves are radiated and received with a very low directional effect, which is why the focusing is undertaken via the algorithm. In further embodiments, the radiation and the reception of the electromagnetic waves are carried out already focused, by the use of lenses or other typical beam shaping concepts of high frequency technology. In a further embodiment, the beam shaping and focusing can be set electronically via phase shifters and/or attenuators in the waveguide. By achieving a physical focusing, the calculating effort for image calculation can be clearly reduced.
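The electronic focusing via phase shifters can be sketched as follows (a simplified, assumed geometry; a real front end would also handle attenuator settings and quantized phase states): each element's phase shifter is set so that all contributions arrive in phase at the desired focus:

```python
import math

C = 3e8
f = 77e9                                   # assumed operating frequency (Hz)
k = 2 * math.pi * f / C                    # wavenumber

elements = [(0.0, -0.02 + 0.01 * n) for n in range(5)]  # 1 cm element spacing
focus = (0.3, 0.0)                         # desired focal point (m)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Delay each element so its total phase matches the farthest element's.
d_ref = max(dist(e, focus) for e in elements)
phases = [(k * (d_ref - dist(e, focus))) % (2 * math.pi) for e in elements]

# Check: phase-shifter setting plus propagation phase is the same for
# every element at the focus, i.e. the contributions add coherently.
ref_phase = (k * d_ref) % (2 * math.pi)
total = [(p + k * dist(e, focus)) % (2 * math.pi)
         for p, e in zip(phases, elements)]
```

The same phase law, applied per pixel instead of once, is what the back projection algorithm evaluates numerically when no physical focusing is available.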

[0089] Next, the image data are evaluated by the image evaluation unit 30 (image post-processing). Here, the object O is firstly identified 130. In one embodiment, machine-vision software, for example BVS-Cockpit by Balluff GmbH, is used in order to allow a user to simply carry out certain evaluation steps. In further embodiments, analytical, model-based or self-learning evaluation methods are provided, the latter using artificial intelligence, for example. The image data can be fused with measuring data of further sensors, for example the optical camera 50. Finally, output signals are generated which allow a good-bad evaluation, the checking of target values, a classification of states, or the like. The output signals are emitted via the output interface S3.
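As a purely illustrative sketch of such a good-bad evaluation (the function, tolerances and template below are assumed, not taken from the patent), an image can be compared pixel-wise against a reference template and classified by the fraction of deviating pixels:

```python
def evaluate(image, template, pixel_tol=0.2, max_bad_fraction=0.05):
    """Return ('OK' or 'NOK', fraction of pixels outside tolerance)."""
    pairs = [(p, t) for row_i, row_t in zip(image, template)
             for p, t in zip(row_i, row_t)]
    bad = sum(1 for p, t in pairs if abs(p - t) > pixel_tol)
    fraction = bad / len(pairs)
    return ("OK" if fraction <= max_bad_fraction else "NOK"), fraction

template = [[1.0, 1.0], [0.2, 0.2]]    # expected pixel magnitudes
good = [[0.95, 1.05], [0.25, 0.15]]    # all pixels within tolerance
faulty = [[0.95, 1.05], [0.90, 0.15]]  # one anomalous pixel (0.90 vs 0.2)

verdict_good, _ = evaluate(good, template)
verdict_bad, frac_bad = evaluate(faulty, template)
```

A "NOK" verdict would correspond to the error signal emitted for the objectionable central object O in FIG. 5.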

[0090] Although only a few embodiments of the present invention have been shown and described, it is to be understood that many changes and modifications may be made thereunto without departing from the spirit and scope of the invention.