Camera and Method for Detecting an Object

20230353883 · 2023-11-02

    Abstract

    A camera for detecting an object in a detection zone is provided that has an image sensor having a plurality of light reception elements for the generation of image data from received light from the detection zone, a reception optics having a focus adjustment unit for setting a focal position, with an angle of incidence of the received light on the image sensor changing on a change of the focal position, and a control and evaluation unit that is configured to set the focal position for a sharp recording of image data of the object, wherein a sensitivity change of the capturing of the image data caused by the respective angle of incidence of the received light is compensated.

    Claims

    1. A camera for detecting an object in a detection zone that has an image sensor having a plurality of light reception elements for the generation of image data from received light from the detection zone, a reception optics having a focus adjustment unit for setting a focal position, with an angle of incidence of the received light on the image sensor changing on a change of the focal position, and a control and evaluation unit that is configured to set the focal position for a sharp recording of image data of the object, wherein the control and evaluation unit is further configured to compensate a sensitivity change of the capturing of image data caused by the respective angle of incidence of the received light.

    2. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to compensate the capturing of image data by a computational and/or physical adaptation of an amplification factor.

    3. The camera in accordance with claim 1, wherein the reception optics has a pivotable optical element, and wherein a pivoting of the optical element changes the focal position and thus the angle of incidence of the received light.

    4. The camera in accordance with claim 3, wherein the pivotable optical element is a deflection mirror.

    5. The camera in accordance with claim 1, wherein the control and evaluation unit has a memory in which a correction table or a correction rule is stored that associates a brightness adaptation with a respective angle of incidence of the received light.

    6. The camera in accordance with claim 1, wherein the control and evaluation unit has a memory in which a correction table or a correction rule is stored that associates a brightness adaptation with a respective focal position.

    7. The camera in accordance with claim 1, wherein the focus adjustment unit has a drive and wherein the control and evaluation unit has a memory in which a correction table or correction rule is stored that associates a brightness adaptation with a respective position of the drive.

    8. The camera in accordance with claim 1, wherein the correction table or correction rule is determined in a teaching process in which the image sensor is homogeneously illuminated and the intensity distribution is measured via the light reception elements for different angles of incidence of the received light.

    9. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to compensate the sensitivity change caused by the respective angle of incidence of the received light and other varying sensitivities of the light reception elements in respective separate steps.

    10. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to compensate the sensitivity change caused by the respective angle of incidence of the received light for all the light reception elements together, for groups of light reception elements together, or for individual light reception elements.

    11. The camera in accordance with claim 10, wherein the groups are rows.

    12. The camera in accordance with claim 1, wherein a color filter is arranged upstream of at least some light reception elements.

    13. The camera in accordance with claim 1, wherein the image sensor is configured as a multiple line scan sensor having two to four rows of light reception elements, having at least one white line whose light reception elements for recording a gray scale image are sensitive to white light, and at least one color line whose light reception elements for recording a color image are sensitive to light of only one respective color.

    14. The camera in accordance with claim 1, wherein a microlens filter is arranged upstream of at least some light reception elements.

    15. The camera in accordance with claim 1, wherein the control and evaluation unit is configured to read a code content of a code on a detected object with the aid of the image data.

    16. The camera in accordance with claim 1, that is installed in a stationary manner at a conveying device that guides the object to be detected in a direction of conveying through the detection zone.

    17. A method of detecting an object in a detection zone in which an image sensor having a plurality of light reception elements generates image data from received light from the detection zone, wherein a focus adjustment unit sets the focal position of a reception optics for a sharp recording of image data of the object and a change of the angle of incidence of the received light on the image sensor is accompanied by a change of the focal position, wherein a sensitivity change of the capturing of the image data caused by the respective angle of incidence of the received light is compensated.

    Description

    [0032] The invention will be explained in more detail in the following, also with respect to further features and advantages, by way of example with reference to embodiments and to the enclosed drawing. The Figures of the drawing show:

    [0033] FIG. 1 a schematic sectional representation of a camera;

    [0034] FIG. 2 a three-dimensional view of an exemplary use of the camera in an installation at a conveyor belt;

    [0035] FIG. 3 a schematic representation of a focus adjustment for a camera having a pivotable optical element in a first focal position with a first angle of incidence of the received light on the image sensor;

    [0036] FIG. 4 a representation in accordance with FIG. 3 now with a perpendicular angle of incidence of the received light on the image sensor;

    [0037] FIG. 5 a representation in accordance with FIG. 3 with a further angle of incidence of the received light on the image sensor;

    [0038] FIG. 6 exemplary intensity curves of two pixels of an image sensor in dependence on the angle of incidence of the received light on the image sensor; and

    [0039] FIG. 7 exemplary intensity curves in accordance with FIG. 6 for two further pixels of an image sensor.

    [0040] FIG. 1 shows a schematic sectional representation of a camera 10. The camera 10 is, for example, a camera-based code reader or an industrial camera (machine vision camera), for example for quality control or for the automatic detection of specific features. Received light 12 from a detection zone 14 is incident on a reception optics 16 that guides the received light 12 onto an image sensor 18 having a plurality of light reception elements or pixels, in particular arranged to form a line, a plurality of lines, or a matrix. The optical elements of the reception optics 16 are preferably configured as an objective composed of a plurality of lenses and other optical elements such as diaphragms, prisms, and the like, but are represented here by only one lens for reasons of simplicity. The reception optics 16 can be set to different focal positions by means of a focus adjustment 20 to record objects in focus at different distances. The most varied functional principles are conceivable for this purpose, for instance a change of the image back focal distance by a stepper motor or a moving coil actuator system. The focus adjustment 20 is shown purely schematically in FIG. 1. As will be explained below with reference to FIGS. 3 to 5, the angle of incidence of the received light 12 on the image sensor 18 changes simultaneously with the focal position. An optional internal or external distance sensor, not shown, can be provided to measure the distance from an object to be recorded and to derive a required focal position therefrom.

    [0041] To illuminate the detection zone 14 with transmitted light 22 during a recording of the camera 10, the camera 10 comprises an optional internal or external illumination unit 24 that is shown in FIG. 1 in the form of a simple light source and without a transmission optics. In other embodiments, a plurality of light sources such as LEDs or laser diodes are arranged around the reception path, in ring form, for example, and can also be multi-color and controllable in groups or individually to adapt parameters of the illumination unit 24 such as its color, intensity, and direction.

    [0042] A control and evaluation unit 26 is connected to the focus adjustment 20, to the illumination unit 24, and to the image sensor 18 and is responsible for the control work, the evaluation work, and for other coordination work in the camera 10. It therefore controls the focus adjustment 20 with a suitable focal position, in particular corresponding to a measured distance value from an object to be recorded, and reads image data from the image sensor 18 to store them or to output them to an interface 28. The control and evaluation unit 26 is preferably able to localize and decode code regions in the image data so that the camera 10 becomes a camera-based code reader. A plurality of modules can be provided for the different control and evaluation work, for example to perform the focus adaptations in a separate module or to perform pre-processing of the image data on a separate FPGA. The camera 10 is protected by a housing 30 that is terminated by a front screen 32 in the front region where the received light 12 is incident.

    [0043] FIG. 2 shows a possible use of the camera 10 in an installation at a conveyor belt 34. The conveyor belt 34 conveys objects 36, as indicated by the arrow 38, through the detection zone 14 of the camera 10. The objects 36 can bear code regions 40 at their outer surfaces. It is the object of the camera 10 to detect properties of the objects 36 and, in a preferred use as a code reader, to recognize the code regions 40, to read and decode the codes affixed there, and to associate them with the respective associated object 36. In order in particular also to recognize laterally applied code regions 42, additional cameras 10, not shown, are preferably used from different perspectives. In addition, a plurality of cameras can be arranged next to one another to together cover a wider detection zone 14.

    [0044] FIGS. 3 to 5 very schematically illustrate a focus adjustment 20 that is based on a pivoting of an optical element 44, in particular a deflection mirror. The image sensor 18 is here by way of example a multiple line scan sensor having four line arrangements A-D of light reception elements or pixels. Such a multiple line scan sensor can in particular be used for the simultaneous capturing of black and white images and color images as is described in EP 3 822 844 B1 named in the introduction.

    [0045] FIG. 3 shows a first pivot position of the optical element 44 and thus a first focal position with an angle of incidence α; FIG. 4 shows a second pivot position of the optical element 44 and thus a second focal position with a perpendicular light incidence; and FIG. 5 shows a third pivot position and thus a third focal position with an angle of incidence β. Further focal positions are adopted in further angular positions of the optical element 44, and the received light 12 is thus incident on the image sensor 18 at further angles of incidence. Such a focus adjustment 20 is explained in more detail, for example, in EP 1 698 995 B1 and EP 1 698 996 B1 that are named in the introduction and to which reference is additionally made. The light path between the reception optics 16 and the image sensor 18 is shortened or lengthened in dependence on the angular position of the optical element 44; the focal position is thus varied. It is conceivable to also pivot the image sensor 18 so that the image sensor 18 continues to be impinged by the received light 12 at the different angles of incidence. The focus adjustment 20 shown is to be understood as an example; the angle of incidence of the received light 12 on the image sensor 18 can also change with the focal position with other focus adjustment principles.

    [0046] FIG. 6 shows exemplary intensity curves of two light reception elements or pixels of the image sensor 18 in dependence on the angle of incidence of the received light on the image sensor 18. The sensitivity of the light reception elements varies with the angle of incidence, in some cases very differently for the individual elements. The reasons for this can be the geometrical circumstances and the quantum efficiency, but also microlenses or color filters in front of the light reception elements. The effect illustrated in FIG. 6 can be called an angle dependent PRNU (photo response non-uniformity). With knowledge of the current angle of incidence and of the angle dependent PRNU, its effect can be compensated in that an analog or digital amplification of the respective light reception element is reciprocally adapted and/or a corresponding offset is carried out in the image data.

    [0047] FIG. 7 shows further exemplary intensity curves for two further light reception elements. Unlike FIG. 6, there are here also differences for a perpendicular light incidence, i.e. at the angle of incidence corresponding to the position of the Y axis. Further effects that affect the sensitivity of the pixels can accordingly be superposed on the angle dependent PRNU; depending on the embodiment, they can be compensated together with the angle dependent PRNU or separately.

    [0048] The dependencies between the angle of incidence and the required brightness adaptation can be stored as a compensation table (lookup table, LUT) or as a compensation rule, i.e. as a functional relationship or formula. A compensation rule can be derived from a compensation table as a compact summary, for example by a function fit or by a polynomial.
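    The step from a compensation table to a compact compensation rule can be sketched as follows. This is an illustrative example only, not an implementation from the patent; the table values and the polynomial degree are assumptions.

```python
import numpy as np

# Hypothetical compensation table: angle of incidence (degrees) -> gain.
# The values are illustrative assumptions, not measured data.
angles = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
gains = np.array([1.08, 1.03, 1.00, 1.04, 1.10])

# Derive a compact compensation rule from the table by a polynomial fit.
coeffs = np.polyfit(angles, gains, deg=2)
rule = np.poly1d(coeffs)

# The rule approximates the table entries and interpolates between them,
# e.g. for an angle of incidence that was never explicitly taught.
interpolated_gain = rule(7.5)
```

    A lookup table only answers at the taught angles, whereas the fitted rule provides a brightness adaptation for any intermediate angle of incidence at negligible memory cost.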

    [0049] Theoretical considerations, for example in the form of a model of the camera 10 or of a simulation, can lead to a compensation table or compensation rule.

    [0050] Alternatively or additionally, the compensation table or compensation rule can be acquired in a teaching or calibration process, for example during a production calibration. In this respect, the angle of incidence of the received light 12 is varied under homogeneous illumination of the image sensor 18, for example in one degree steps or with another desired fineness, and in so doing the intensities received by the light reception elements of the image sensor 18 are measured. The required brightness adaptations can be calculated from them. The control and evaluation unit 26 can provide a teaching module for this purpose in which these variations of the angle of incidence and the corresponding measurements and calculations for acquiring the correction table or correction rule are carried out automatically.
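    The teaching process just described can be sketched roughly as follows. The flat-field recording is simulated here with an assumed sensitivity model, since the real image recording depends on the camera hardware; everything in the sketch is illustrative.

```python
import numpy as np

def capture_flat_field(angle_deg, n_pixels=8):
    """Simulated recording under homogeneous illumination.
    Stands in for the real image sensor; the angle dependent
    sensitivity model is purely an assumption for illustration."""
    sensitivity = 1.0 - 0.004 * abs(angle_deg) * np.linspace(0.5, 1.5, n_pixels)
    return 100.0 * sensitivity  # nominal flat-field level of 100

# Vary the angle of incidence in one degree steps and record the
# intensities of the light reception elements at each angle.
correction_table = {}
for angle in range(-10, 11):
    measured = capture_flat_field(angle)
    # Brightness adaptation: reciprocal gain that maps the measured
    # intensity back to the homogeneous reference level.
    correction_table[angle] = 100.0 / measured
```

    At perpendicular incidence (angle 0) the simulated gains are all 1.0; at larger angles they grow to counteract the simulated sensitivity loss.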

    [0051] It is admittedly ultimately the angle of incidence that causes the angle dependent PRNU. The camera 10, however, has more direct access to its focal position, which implies the respective angle of incidence. It is therefore conceivable to associate a respective brightness adaptation in the correction table or correction rule with a focal position instead of with an angle of incidence. The processing chain can be extended by a further step in that the correction table or correction rule relates to a motor control of the focus adjustment 20 and associates a respective brightness adaptation with a motor position. For it is this actuator that is controlled to adjust the focal position and thus the angle of the optical element 44 and ultimately the angle of incidence of the received light 12 at the image sensor 18.

    [0052] The motor position can in particular be expressed via motor increments that each correspond to a respective rotational position of the motor. In a teaching process designed for this purpose, the motor increments are run through and the intensities of the pixels are measured after an image recording under homogeneous illumination. An incremental encoder can, in the sense of a regulation of the motor movements, check whether the respectively controlled rotational position of the motor is actually adopted. The control and evaluation unit 26 is typically also aware of an association between motor positions or motor increments and the focal position so that the correction table or correction rule can be selectively linked to the motor position or the focal position. In the case of a drive producing a linear movement, the motor position can be a linear position instead of a rotational position of the motor.

    [0053] In operation, the correction table or correction rule is used, in dependence on the embodiment, to associate a brightness adaptation with the motor position, the focal position, the angular position of the optical element 44, or the angle of incidence of the received light 12 on the image sensor 18. In this respect, the correction table or correction rule can directly include the compensating values, for example amplification factors required for the compensation, or initially only the intensity differences due to the angle dependent PRNU, from which the required compensation is then determined in real time. Since the compensation should take place in real time where possible, it is advantageous to implement the corresponding functionality of the control and evaluation unit 26 on an FPGA (field programmable gate array) or in an ASIC (application specific integrated circuit). An actual calibration preferably no longer takes place in operation; the compensation is then based completely on the previously taught relationships. The compensation can take place as a control, not as a regulation, since the relevant interference value, namely the respective focal position, is known.
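    The operational use of a taught correction table keyed by motor position might look like the following sketch; the table entries, line length, and raw intensities are assumed values for illustration.

```python
import numpy as np

# Correction table taught beforehand: motor position (in increments) ->
# per-pixel amplification factors. All values are illustrative assumptions.
correction_table = {
    0:   np.array([1.00, 1.00, 1.00, 1.00]),
    50:  np.array([1.02, 1.01, 1.03, 1.02]),
    100: np.array([1.05, 1.04, 1.06, 1.05]),
}

def compensate(raw_line, motor_position):
    """Apply the brightness adaptation associated with the current motor
    position: a pure control step with no feedback loop, since the
    relevant interference value (the focal position) is known."""
    return raw_line * correction_table[motor_position]

raw = np.array([96.0, 98.0, 95.0, 96.0])
corrected = compensate(raw, 100)
```

    A lookup indexed directly by the motor position avoids converting back to a focal position or an angle of incidence at run time, which suits an FPGA or ASIC implementation.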

    [0054] The compensation of the angle dependent PRNU can take place with a different granularity. An individual compensation at the level of individual light reception elements of the image sensor 18 is possible. Alternatively, only an across-the-board adaptation for the whole image sensor 18 takes place. As an intermediate solution, pixels are combined group-wise for the compensation and are compensated in the same manner within the group, for example with reference to a mean value determined for this group. A particularly preferred special case of such groups is a combination in lines, in particular for a multiple line scan sensor as shown by way of example in FIGS. 3 to 5. For a group combination to be sensible, the angle dependent PRNU should not differ too much within a group. This is the case with the white or color lines of a multiple line scan sensor.
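    The three granularities just described (whole sensor, group of pixels such as a line, individual pixel) can be contrasted in a short sketch; the gain matrix for a four-line sensor is an assumed example, not taught data.

```python
import numpy as np

# Assumed per-pixel gains for a four-line sensor at one focal position,
# e.g. derived from the angle dependent PRNU (illustrative values only).
per_pixel_gains = np.array([
    [1.04, 1.05, 1.03, 1.04],  # white line
    [1.10, 1.12, 1.11, 1.09],  # color line
    [1.11, 1.10, 1.12, 1.11],  # color line
    [1.05, 1.04, 1.05, 1.04],  # white line
])

global_gain = per_pixel_gains.mean()        # whole image sensor at once
row_gains = per_pixel_gains.mean(axis=1)    # one gain per line (group)
pixel_gains = per_pixel_gains               # fully individual compensation

# Group-wise compensation of a flat test image: every pixel of a line
# is corrected with the mean gain of that line.
image = np.full((4, 4), 100.0)
compensated = image * row_gains[:, np.newaxis]
```

    Within a white line or a color line the gains differ only little, which is why the group-wise mean is a sensible compromise between effort and accuracy.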

    [0055] The image sensor 18 is color sensitive in a preferred embodiment. For this purpose, color filters are arranged upstream of at least some of its light reception elements, for instance in a Bayer pattern or, in the case of a color line of a multiple line scan sensor, as a color filter that is uniform over the line or as a sequence of colors, for example an alternating sequence. There can be monochrome light reception elements or at least one whole monochrome line therebetween. The angle dependent PRNU varies particularly strongly with such color filters and differently in dependence on the color, with parasitic effects also being able to occur at light reception elements that have no color filter of their own but are in the vicinity of light reception elements with a color filter. With a multiple line scan sensor, the information of the plurality of lines is preferably combined to acquire, in particular, a respective single black and white image line or a color image line aggregated from the primary colors. In such a combination, the respective influences of the angle dependent PRNU accumulate and the compensation in accordance with the invention is therefore particularly advantageous because a particularly pronounced total error would otherwise result. Amplifying effects of the PRNU can also arise through other elements arranged upstream of the image sensor 18, for example microlenses, that cooperate as parts of the reception optics 16 with its focus adjustable optical elements.