Detection of objects

20210352214 · 2021-11-11

    Abstract

    A camera for detecting objects in a detection zone is provided that has an image sensor for recording image data of the objects, a distance sensor for detecting at least one distance value from a respective object, and a control and evaluation unit that is configured to perform at least one setting of the camera for a recording using the distance value. The control and evaluation unit here has real time capability and is configured to generate a recording at a trigger time using time information of the distance sensor.

    Claims

    1. A camera for detecting objects in a detection zone, the camera comprising: an image sensor for recording image data of the objects, a distance sensor for detecting at least one distance value from a respective object, and a control and evaluation unit that is configured to perform at least one setting of the camera for a recording using the distance value, wherein the control and evaluation unit has real time capability and is configured to generate a recording at a trigger time using time information of the distance sensor.

    2. The camera in accordance with claim 1, wherein the control and evaluation unit has one of a microprocessor having real time capability connected to the distance sensor and an FPGA connected to the distance sensor.

    3. The camera in accordance with claim 2, wherein the one of the microprocessor and the FPGA is integrated in the distance sensor.

    4. The camera in accordance with claim 1, that has a focus adjustable optics arranged in front of the image sensor, and wherein the setting of the camera comprises a focus setting.

    5. The camera in accordance with claim 1, wherein the distance sensor is integrated into the camera.

    6. The camera in accordance with claim 1, wherein the distance sensor is an optoelectronic distance sensor.

    7. The camera in accordance with claim 6, wherein the optoelectronic distance sensor operates in accordance with a time of flight principle.

    8. The camera in accordance with claim 1, wherein the distance sensor has a plurality of measurement zones for measuring a plurality of distance values.

    9. The camera in accordance with claim 8, wherein the control and evaluation unit is configured to form a common distance value from the plurality of distance values.

    10. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to recognize a new object when the distance value changes and then to determine a trigger time for this.

    11. The camera in accordance with claim 1, wherein the distance sensor is configured to determine a remission value.

    12. The camera in accordance with claim 11, wherein the control and evaluation unit is configured to recognize a new object when the remission value changes and then to determine a trigger time for this.

    13. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to carry out the setting of the camera in accordance with a sequence of a plurality of trigger times.

    14. The camera in accordance with claim 1, wherein the distance sensor is configured to transmit a time stamp as the time information on a distance value or a trigger time.

    15. The camera in accordance with claim 1, that has a pulsed illumination unit, wherein the control and evaluation unit is configured to synchronize the trigger time with an illumination pulse.

    16. The camera in accordance with claim 15, wherein the pulsed illumination unit is configured to transmit one or more of said illumination pulses.

    17. The camera in accordance with claim 1, wherein the setting of the camera comprises an exposure time of the recording.

    18. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to read code contents of codes recorded with the objects.

    19. The camera in accordance with claim 1, that is installed in a stationary manner at a conveying device that conveys the objects in a direction of movement.

    20. A method of detecting objects in a detection zone, in which image data of the objects are recorded by a camera and at least one distance value from a respective object is determined by a distance sensor, wherein at least one setting of the camera for a recording is performed with reference to the distance value, wherein a recording is generated at a trigger time using time information of the distance sensor in a processing having real time capability.

    Description

    [0026] The invention will be explained in more detail in the following, also with respect to further features and advantages, by way of example with reference to embodiments and to the enclosed drawing. In the figures of the drawing:

    [0027] FIG. 1 a schematic sectional representation of a camera with an optoelectronic distance sensor; and

    [0028] FIG. 2 a three-dimensional view of an exemplary use of the camera in an installation at a conveyor belt.

    [0029] FIG. 1 shows a schematic sectional representation of a camera 10. Received light 12 from a detection zone 14 is incident on a reception optics 16 that conducts the received light 12 to an image sensor 18. The optical elements of the reception optics 16 are preferably configured as an objective composed of a plurality of lenses and other optical elements such as diaphragms, prisms, and the like, but are here represented by only one lens for reasons of simplicity.

    [0030] To illuminate the detection zone 14 with transmitted light 20 during a recording of the camera 10, the camera 10 comprises an optional illumination unit 22 that is shown in FIG. 1 in the form of a simple light source and without a transmission optics. In other embodiments, a plurality of light sources such as LEDs or laser diodes are arranged around the reception path, in ring form, for example, and can also be multi-color and controllable in groups or individually to adapt parameters of the illumination unit 22 such as its color, intensity, and direction.

    [0031] In addition to the actual image sensor 18 for detecting image data, the camera 10 has an optoelectronic distance sensor 24 that measures distances from objects in the detection zone 14 using a time of flight (TOF) process. The distance sensor 24 comprises a TOF light transmitter 26 having a TOF transmission optics 28 and a TOF light receiver 30 having a TOF reception optics 32. A TOF light signal 34 is thus transmitted and received again. A time of flight measurement unit 36 determines the time of flight of the TOF light signal 34 and determines from this the distance from an object at which the TOF light signal 34 was reflected back.
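The pulse time of flight relation underlying the distance measurement can be sketched as follows (a minimal illustration only; the constant and function names are assumptions, not part of the disclosure):

```python
# Pulse time-of-flight relation: the TOF light signal travels to the object
# and back, so the one-way distance is half the round trip, d = c * t / 2.
C_MM_PER_NS = 299_792_458e-6  # speed of light, approx. 299.79 mm per nanosecond

def tof_to_distance_mm(time_of_flight_ns: float) -> float:
    """Convert a measured round-trip time of flight to a one-way distance in mm."""
    return C_MM_PER_NS * time_of_flight_ns / 2.0
```

A round-trip time of 10 ns thus corresponds to an object roughly 1.5 m away.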

    [0032] The TOF light receiver 30 has a plurality of light reception elements 30a. The light reception elements 30a individually or in smaller groups form measurement zones with which a respective distance value is determined. Preferably, therefore, not just a single distance value is detected, although that is also possible; rather, the distance values are spatially resolved and can be assembled to form a vertical section. The number of measurement zones of the TOF light receiver 30 can remain comparatively small, for example some tens, hundreds, or thousands of measurement zones, far removed from the customary megapixel resolutions of the image sensor 18.

    [0033] The design of the distance sensor 24 is purely exemplary. The optoelectronic distance measurement by means of time of flight processes is known and will therefore not be explained in detail. Two exemplary measurement processes are photomixing detection using a periodically modulated TOF light signal 34 and pulse time of flight measurement using a pulse modulated TOF light signal 34. There are also highly integrated solutions here in which the TOF light receiver 30 is accommodated on a common chip with the time of flight measurement unit 36 or at least parts thereof, for instance TDCs (time to digital converters) for time of flight measurements. In particular a TOF light receiver 30 is suitable for this purpose that is designed as a matrix of SPAD (single photon avalanche diode) light reception elements 30a. Measurement zones of SPAD light reception elements 30a can be directly deactivated and activated by setting the bias voltage below or above the breakdown voltage. An active zone of the distance sensor 24 can thereby be set. The TOF optics 28, 32 are shown only symbolically as respective individual lenses representative of any desired optics such as a microlens field.

    [0034] Despite its name, the distance sensor 24 is in a preferred embodiment additionally able to also measure a remission value. The intensity of the received TOF light signal 34 is evaluated for this purpose. With SPAD light reception elements 30a, the individual event is not suitable for an intensity measurement because the same maximum photocurrent is generated on registration of a photon by the uncontrolled avalanche breakdown. However, events in a plurality of SPAD light reception elements 30a of a measurement zone and/or over a longer measurement duration can indeed be counted. This is then also a measure for the intensity with SPAD light reception elements.
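This counting approach can be sketched as follows (an illustration only; the function name and units are assumptions): since every SPAD avalanche yields the same maximum photocurrent, only the number of events over the zone's elements and the measurement duration carries intensity information.

```python
def remission_estimate(event_counts: list[int], measurement_duration_us: float) -> float:
    """Estimate remission as the SPAD event rate of one measurement zone:
    events summed over the zone's light reception elements, per microsecond."""
    return sum(event_counts) / measurement_duration_us
```

A brighter (more strongly remitting) object yields a higher event rate for the same measurement duration.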

    [0035] A control and evaluation unit 37 having real time capability is provided for the evaluation with real time capability of the distance values of the distance sensor 24. It comprises, for example, a microprocessor or FPGA having real time capability or a combination thereof. The connection between the distance sensor 24 and the control and evaluation unit 37 having real time capability can be implemented via I2C or SPI. A connection between the microprocessor and the FPGA can take place via PCI, PCIe, MIPI, UART, or the like. The time critical processes, in particular the real time synchronization with the image recording of the image sensor 18, are controlled by the control and evaluation unit 37 having real time capability. In addition, settings of the camera 10, for instance a focal position or an exposure time, are made using the evaluation of the distance values.

    [0036] A further control and evaluation unit 38 does not have to have real time capability and is connected to the illumination unit 22, to the image sensor 18, and to the control and evaluation unit 37 of the distance sensor 24. This control and evaluation unit is responsible for further control, evaluation, and other coordination work in the camera 10. It therefore reads image data of the image sensor 18 to store them and to output them at an interface 40, for example. The control and evaluation unit 38 is preferably able to localize and decode code zones in the image data so that the camera 10 becomes a camera-based code reader.

    [0037] The division into a control and evaluation unit 37 having real time capability and a control and evaluation unit 38 not having real time capability in FIG. 1 should clarify the principle and is purely by way of example. The control and evaluation unit 37 having real time capability can be at least partially implemented in the distance sensor 24 or in its time of flight measurement unit 36. Functions can furthermore be shifted between the control and evaluation units 37, 38. It is only not possible in accordance with the invention that a component not having real time capability takes over time critical functions such as the determination of a trigger time for the image sensor 18.

    [0038] The camera 10 is protected by a housing 42 that is terminated by a front screen 44 in the front region where the received light 12 is incident.

    [0039] FIG. 2 shows a possible use of the camera 10 in an installation at a conveyor belt 46. The camera 10 is shown here only as a single symbol and no longer with its structure already explained with reference to FIG. 1. The conveyor belt 46 conveys objects 48, as indicated by an arrow for the direction of movement 50, through the detection zone 14 of the camera 10. The objects 48 can bear code zones 52 at their outer surfaces. It is the object of the camera 10 to detect properties of the objects 48 and, in a preferred use as a code reader, to recognize the code zones 52, to read and decode the codes affixed there, and to associate them with the respective associated object 48. In order also to recognize laterally applied code zones 54, additional cameras 10, not shown, are preferably used from different perspectives.

    [0040] The use on a conveyor belt 46 is only an example. The camera 10 can alternatively be used for different applications, for instance at a fixed workplace at which a worker holds respective objects 48 into the detection zone.

    [0041] The real time processing of distance values and the control of the image recording by the control and evaluation unit 37 will now be explained in a sequence example.

    [0042] The distance sensor 24 or its time of flight measurement unit 36 already takes care of converting the raw data, for example in the form of reception events, into distance values. In addition, depending on the embodiment, a time stamp for the time of the distance measurement of the respective distance value and a remission value are available. With a single-zone distance sensor 24, only one respective distance value is measured that cannot be further evaluated. With a plurality of measurement zones and thus distance values, a preselection of relevant measurement zones is preferably made. In a conveyor belt application as in FIG. 2, these are preferably measurement zones that detect an incoming object 48 as early as possible. Central measurement zones are more suitable at a workstation at which objects 48 are manually held into the detection zone 14. Since some settings, such as a focal position, can only be made once and not in a differentiated manner for a vertical section, the plurality of distance values are combined with one another, for example as a mean value.
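The preselection and combination of distance values described above can be sketched as follows (a minimal illustration; the names are assumptions, not part of the disclosure):

```python
def common_distance(zone_distances: dict[int, float], relevant_zones: set[int]) -> float:
    """Combine the distance values of the preselected measurement zones into
    one common value, here a simple mean, since a setting such as a focal
    position can only be made once and not per zone."""
    values = [d for zone, d in zone_distances.items() if zone in relevant_zones]
    if not values:
        raise ValueError("no relevant measurement zone delivered a distance value")
    return sum(values) / len(values)
```

In a conveyor application the relevant zones would be those facing the incoming objects; at a manual workstation, the central zones.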

    [0043] It is mostly possible to separate objects 48 from one another with reference to the distance values. This is not always possible due to measurement errors of the distance sensor 24 and in unfavorable constellations such as objects 48 of a similar height following one another closely or with very flat objects 48 such as an envelope. The remission value can then be used as a supplementary or alternative criterion. In a specific example, a check can be made whether the distance values differ from the distance from the conveyor belt 46 by more than a noise threshold. If this is the case, the distance values are the dominant feature with reference to which the focal position is set. A mean value is preferably only formed from distance values different from the distance from the conveyor belt 46 since only they belong to the actual object 48. If conversely all the distances within the framework of the noise threshold only measure the conveyor belt 46, a check is made whether the remission values allow a difference to be recognized, for example to recognize a light envelope on a dark conveyor belt 46. A focal position can then be placed onto the plane of the conveyor belt 46. If there are no significant differences either in the distance or in the remission, an object 48 may be overlooked; a black envelope on a black background is not recognized, but it would anyway not be able to bear any readable code 52.
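The decision logic of this specific example can be sketched as follows (an illustration only; the threshold values and names are assumptions, not part of the disclosure):

```python
def detect_object(distances_mm, remissions, belt_distance_mm,
                  noise_mm=5.0, remission_delta=0.1):
    """Return (object_present, focus_distance_mm or None).

    Distances dominate: values lifted off the belt by more than the noise
    threshold indicate an object. Remission is the fallback criterion for
    very flat objects such as an envelope."""
    off_belt = [d for d in distances_mm if abs(d - belt_distance_mm) > noise_mm]
    if off_belt:
        # Average only the distance values that belong to the object itself.
        return True, sum(off_belt) / len(off_belt)
    # All distances measure the belt: check for a remission contrast,
    # e.g. a light envelope on a dark conveyor belt.
    if max(remissions) - min(remissions) > remission_delta:
        return True, belt_distance_mm  # focus on the plane of the belt
    return False, None  # e.g. black envelope on black background
```

The first branch realizes the distance criterion, the second the remission fallback, and the final return the case in which an object may be overlooked.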

    [0044] In an alternative application in which objects 48 are held into the detection zone 14 by hand, it is extremely unlikely that this takes place for two successive objects 48 at the same distance without a gap. A separation of objects 48 using the distance values is therefore possible as a rule. A supplementary use of remission values is nevertheless also conceivable here.

    [0045] It is thus recognized whether a new camera setting and a trigger time for a further object 48 are required. The trigger time at which the image recording takes place results from the time stamp. A fixed or a dynamically determined time offset has to be taken into account here. In the conveyor belt application, this is the time that the object 48 requires to be conveyed from the first detection by the distance sensor 24 into the recording position, for example centrally into the detection zone 14. This depends, on the one hand, on the belt speed that is known by parameterization, specification, or measurement and, on the other hand, on the object height measured via the distance values and on the geometrical arrangement. A constant time offset is sufficient in an application with a manual guidance of objects 48.
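The trigger time calculation can be sketched as follows (a simplified illustration; the geometrical arrangement is modelled by a single assumed height coefficient and all names are assumptions):

```python
def trigger_time(timestamp_s: float, travel_mm: float, belt_speed_mm_s: float,
                 object_height_mm: float = 0.0,
                 height_gain_mm_per_mm: float = 0.0) -> float:
    """Absolute time at which the image recording is triggered: the time stamp
    of the first detection by the distance sensor plus the travel time to the
    recording position. A simple linear correction shortens the effective
    travel for taller objects, standing in for the geometrical arrangement."""
    effective_travel_mm = travel_mm - height_gain_mm_per_mm * object_height_mm
    return timestamp_s + effective_travel_mm / belt_speed_mm_s
```

With a manual guidance of objects, the offset degenerates to a constant, i.e. fixed `travel_mm / belt_speed_mm_s`.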

    [0046] The control and evaluation unit 37 having real time capability preferably does not immediately readjust the camera 10 in accordance with the last measured distance values, but rather only at the trigger time, naturally where possible in good time while taking adaptation delays into account. A focal position may, for example, not be adjusted immediately if another object 48 should still be recorded first. Its distance values then remain decisive up to its trigger time. As soon as the control and evaluation unit 37 having real time capability has determined that it is no longer necessary to wait for a previous image, the readjustment can begin.

    [0047] The image recording generally takes place at the trigger time. If, however, the illumination unit 22 is operated in a pulsed manner at a predefined frequency, the image recording should be synchronized with it. For this purpose, the image recording is synchronized to a suitable illumination pulse, that is, for example, to the next illumination pulse before or after the originally intended trigger time. In principle, it is alternatively conceivable to displace the illumination pulse. However, the illumination unit 22 must support this and, in addition, the pulse sequence is frequently not a free variable but is rather predefined by conditions.
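Snapping the intended trigger time to a suitable pulse of a fixed-frequency illumination can be sketched as follows (an illustration only; names and the phase parameter are assumptions):

```python
def sync_to_pulse(trigger_s: float, pulse_period_s: float, phase_s: float = 0.0) -> float:
    """Return the illumination pulse time closest to the intended trigger
    time, assuming pulses at phase_s + n * pulse_period_s for integer n."""
    n = round((trigger_s - phase_s) / pulse_period_s)
    return phase_s + n * pulse_period_s
```

Choosing the nearest pulse corresponds to taking the next illumination pulse before or after the originally intended trigger time, whichever is closer.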

    [0048] The focal position frequently used as an example is in no way the only conceivable camera setting. An adaptation of the exposure time in dependence on the distance value is, for example, also conceivable to avoid an overexposure or an underexposure. An adaptation of the illumination intensity would alternatively be conceivable for this. However, this requires that the illumination unit 22 can be set with real time capability.
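A distance-dependent exposure adaptation can be sketched as follows (an illustration only; the quadratic falloff model and all constants are assumptions, not part of the disclosure):

```python
def exposure_time_us(distance_mm: float, ref_distance_mm: float = 500.0,
                     ref_exposure_us: float = 100.0) -> float:
    """Scale the exposure time quadratically with distance relative to a
    calibrated reference point, since the received light from an actively
    illuminated object falls off roughly with the square of the distance."""
    return ref_exposure_us * (distance_mm / ref_distance_mm) ** 2
```

Doubling the distance then quadruples the exposure time; equivalently, the illumination intensity could be scaled instead if the illumination unit supports a real time setting.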

    [0049] On the storage or output of the image data that are detected at the trigger time, metadata such as the distance values or the trigger time can be appended. This enables further evaluations at a later time and also a diagnosis and improvement of the camera 10 and its application.