Automatic configuration of a plurality of cameras
20210306597 · 2021-09-30
Inventors
CPC classification (ELECTRICITY)
H04N7/18
H04N23/662
H04N7/181
H04N23/74
H04N23/661
International classification
Abstract
A method is provided for the automatic configuration of a plurality of cameras that each have an image sensor, an illumination, and a communication interface and that are installed such that they together record objects in a detection zone by partially overlapping, mutually complementary fields of view, wherein the cameras are connected via their communication interfaces to form a network. In this process, the cameras produce sample recordings while only specific cameras activate their illumination, a check is made for the respective cameras with reference to the sample recordings whether there is a proximity relationship to a camera having active illumination, and the relative arrangement of the cameras with respect to one another is determined from the proximity relationships.
Claims
1. A method for the automatic configuration of a plurality of cameras that each have an image sensor, an illumination, and a communication interface and together record objects in a detection zone by partially overlapping, mutually complementary fields of view, wherein the cameras are connected via their communication interfaces to form a network, wherein the cameras produce sample recordings while only specific cameras activate their illumination; wherein a check is made for the respective cameras with reference to the sample recordings whether there is a proximity relationship to a camera having active illumination; and wherein the relative arrangement of the cameras with respect to one another is determined from the proximity relationships.
2. The method in accordance with claim 1, wherein the objects are present in a stream of objects conveyed through the detection zone.
3. The method in accordance with claim 1, wherein sample recordings are repeatedly produced while exactly one camera alternatingly activates its illumination.
4. The method in accordance with claim 1, wherein first, a local proximity relationship is determined per camera from the information on the direction in which a camera having active illumination is disposed in accordance with the check of the sample recordings; and wherein the local proximity relationships with respect to the relative arrangement of the cameras are subsequently collected.
5. The method in accordance with claim 1, wherein the cameras form a row arrangement having fields of view partially overlapping in pairs.
6. The method in accordance with claim 4, wherein first, it is determined by checking the sample recordings per camera whether a camera having active illumination is located to the right, left, or not even in proximity; and wherein the local proximity relationships thus acquired are subsequently combined to form the relative arrangement of the cameras.
7. The method in accordance with claim 1, wherein the cameras are of the same construction among one another and/or are configured originally identical with a basic setting.
8. The method in accordance with claim 1, wherein an automatic device recognition recognizes the cameras connected in the network.
9. The method in accordance with claim 8, wherein the automatic device recognition determines one camera as a master.
10. The method in accordance with claim 1, wherein a provisional network configuration is assigned to every camera.
11. The method in accordance with claim 10, wherein a provisional network configuration is assigned to every camera by a camera configured as a master.
12. The method in accordance with claim 1, wherein a new network configuration is assigned to the cameras that corresponds to the relative arrangement of the cameras with respect to one another.
13. The method in accordance with claim 1, wherein the cameras each record a reference image while no illumination is active and the reference image is taken into account in the check of sample recordings.
14. The method in accordance with claim 1, wherein the cameras are synchronized.
15. The method in accordance with claim 1, wherein an individual autoconfiguration method is triggered and performed in the cameras by which their own positions with respect to the detection zone are determined.
16. The method in accordance with claim 15, wherein the individual autoconfiguration method is triggered and performed in the cameras by which their own positions with respect to the detection zone are determined using recordings of a calibration object.
17. The method in accordance with claim 1, wherein the relative arrangement of the cameras with respect to one another found from the sample recordings is checked using position information of the cameras that is in particular acquired from recordings of a calibration object.
18. The method in accordance with claim 1, wherein at least one camera is parameterized; wherein system parameters are distinguished in the cameras that have a significance for the cameras; and wherein on the setting or changing of a system parameter in the parameterized camera, the system parameter is transferred to the other cameras.
19. The method in accordance with claim 1, wherein the cameras record images of the objects and read codes affixed there in an operating phase after the automatic configuration.
20. A multi-camera system having a plurality of cameras, wherein the cameras each have an image sensor, an illumination, and a communication interface and that are installed such that they together record objects in a detection zone by partially overlapping, mutually complementary fields of view; wherein the cameras are connected via their communication interfaces to form a network; and wherein the cameras are configured for performing an automatic configuration in accordance with any one of the preceding claims.
21. The multi-camera system in accordance with claim 20, further comprising a reading tunnel for reading codes on objects that are conveyed through the reading tunnel.
Description
[0031] The invention will be explained in more detail in the following, also with respect to further features and advantages, by way of example with reference to embodiments and to the enclosed drawing. The figures of the drawing show embodiments of the invention; the individual figure captions of paragraphs [0032] to [0040] are not reproduced in this extract.
[0041] In a preferred embodiment, the cameras 18.sub.1 . . . 18.sub.n are configured as code readers in which the control and evaluation unit 26 additionally acts as a decoding unit for reading code information and for corresponding preprocessing to localize and prepare code regions. It is also conceivable to detect streams of objects 14 without codes 16 and accordingly to dispense with the decoding unit itself or with its use, for example for an inspection or a quality control. In addition, a decoding of the image data can also take place downstream, outside the cameras 18.sub.1 . . . 18.sub.n, in particular in the higher ranking control 32.
[0042] The conveyor belt 12 is too wide to be detected by an individual camera 18.sub.1 . . . 18.sub.n. The fields of view 20.sub.1 . . . 20.sub.n of the cameras 18.sub.1 . . . 18.sub.n therefore complement one another in the transverse direction of the conveyor belt 12 with a partial mutual overlap. An optional geometry detection sensor 34, for example in the form of a distance measuring laser scanner known per se, is arranged upstream of the individual cameras 18.sub.1 . . . 18.sub.n with respect to the direction of movement of the conveyor belt 12 in order to measure the three-dimensional contour of the objects 14 on the conveyor belt 12 in advance, with dimensions, volume, position, and shape resulting therefrom. In addition, the geometry data can be used to trigger the recordings and to focus the cameras 18.sub.1 . . . 18.sub.n.
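The use of the geometry data for focusing can be sketched as follows. This sketch is not part of the patent text; the mounting height, function names, and the simple height-to-distance rule are assumptions for illustration only:

```python
# Sketch (assumption, not from the patent): deriving a focus distance per
# recording from the height profile measured by the geometry detection
# sensor before the object reaches the cameras.

CAMERA_MOUNT_HEIGHT_MM = 1600.0  # assumed mounting height above the belt


def focus_distance_mm(object_height_mm: float) -> float:
    """Distance from the camera to the top surface of an object."""
    return CAMERA_MOUNT_HEIGHT_MM - object_height_mm


def focus_plan(measured_heights_mm):
    """One focus setting per measured object, computed in advance."""
    return [focus_distance_mm(h) for h in measured_heights_mm]
```

Because the sensor measures the contour in advance, each focus setting can be queued before the object enters the fields of view.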
[0044] The use of a calibration object 36 is advantageous in a number of cases, but is not absolutely necessary for the configuration now described.
[0046] In a step S1, the cameras 18.sub.1 . . . 18.sub.n are first installed and physically connected to one another. In the starting state, the cameras 18.sub.1 . . . 18.sub.n are at the factory setting with the same network setting, that is, for example, mutually identical IP addresses and device addresses, here called CAN node IDs. The specific embodiment of the network setting can vary and depends on the network architecture used. The cameras 18.sub.1 . . . 18.sub.n, however, preferably have a unique serial number or another feature such as a MAC address. An individualizing feature could otherwise also be negotiated. A configuration processor is preferably connected during the configuration, in particular to the higher ranking control 32, to start, to observe, and optionally to supplement or change the configuration. A control element at the camera 18.sub.1 . . . 18.sub.n itself is, however, also conceivable, for instance a button that triggers the configuration.
[0047] In a step S2, the connected cameras 18.sub.1 . . . 18.sub.n are first recognized. A master is preferably selected and configured to which a corresponding IP address is assigned. The master then controls the further procedure. As an alternative to a master camera, a master in the higher ranking control 32 can also take over this function. The method does not rely on a master-slave architecture, but will be described on this basis below.
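The device recognition and master selection of step S2 can be sketched minimally as follows. The selection rule "lowest serial number" is an illustrative assumption; the patent leaves the selection criterion open:

```python
# Sketch (assumption, not from the patent): device recognition over the
# network followed by a deterministic master selection.

def discover_cameras(network_scan_results):
    """Device recognition: collect the unique serial numbers found by a scan."""
    return sorted(set(network_scan_results))


def select_master(serials):
    """Deterministically pick one camera as master, here the lowest serial."""
    return min(serials)
```

Any deterministic rule works, since every camera (or the higher ranking control) reaches the same result from the same scan.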
[0048] In a step S3, the master assigns a provisional network setting to the other cameras 18.sub.1 . . . 18.sub.n, with a unique IP address and a unique node ID being determined with reference to the respective serial number, for example. A double assignment of possible addresses can be precluded by a further network scan. The cameras 18.sub.1 . . . 18.sub.n can optionally still be required to provide a sign, for example by activating their illumination 30, so that a service engineer can see whether all the cameras 18.sub.1 . . . 18.sub.n have been found and integrated in the network.
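Step S3 can be sketched as follows; the address scheme and the derivation "ordered by serial number" are assumptions for illustration, as is the uniqueness check standing in for the further network scan:

```python
# Sketch (assumption, not from the patent): provisional network setting
# derived from the unique serial numbers, plus a double-assignment check.

def provisional_network_setting(serials, base_ip="192.168.0.", first_host=10):
    """Assign a unique provisional IP address and node ID per serial number."""
    setting = {}
    for index, serial in enumerate(sorted(serials)):
        setting[serial] = {
            "ip": f"{base_ip}{first_host + index}",
            "node_id": index + 1,
        }
    return setting


def addresses_unique(setting):
    """Network-scan style check that no IP address was issued twice."""
    ips = [entry["ip"] for entry in setting.values()]
    return len(ips) == len(set(ips))
```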
[0049] In a step S4, the master starts an individual automatic setup in every camera 18.sub.1 . . . 18.sub.n. This is coordinated such that mutual interference is avoided. The calibration object 36 explained with reference to
[0050] In a step S5, the master starts a synchronization service in all cameras 18.sub.1 . . . 18.sub.n. This subsequently enables a synchronized triggering of the image recording and the illumination. There could otherwise be interference due to simultaneous flash light, for example.
[0051] In a step S6, all the cameras 18.sub.1 . . . 18.sub.n record a reference image with an inactive illumination 30 and store it for later use. The reference image delivers information on the ambient light situation and is taken into account later, for example in that it is removed from further recordings.
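Removing the reference image from a later recording can be sketched as a pixel-wise subtraction clamped at zero. This is a simple assumption for illustration; the patent leaves the exact computation open:

```python
# Sketch (assumption, not from the patent): ambient-light compensation by
# subtracting the stored reference image from a sample recording.

def remove_ambient(sample, reference):
    """Subtract the ambient-light reference image pixel by pixel, clamped at 0."""
    return [
        [max(s - r, 0) for s, r in zip(sample_row, reference_row)]
        for sample_row, reference_row in zip(sample, reference)
    ]
```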
[0052] In the steps S7 to S10, the cameras 18.sub.1 . . . 18.sub.n independently find their relative arrangement with respect to one another. In a step S7, only one camera 18.sub.1-18.sub.n activates its illumination 30. All the cameras 18.sub.1-18.sub.n, possibly with the exception of the one with an active illumination 30, generate a sample recording in a step S8. The direction in which the camera 18.sub.1 . . . 18.sub.n having the active illumination 30 is located is determined by evaluating the sample recording in a step S9. The reference image is taken into account here to preclude environmental artifacts. In a row arrangement, a distinction only has to be made between left and right, with it optionally being possible to distinguish, by comparing the degree of illumination, whether it is the next neighbor or one further remote. Local proximity relationships of the cameras 18.sub.1 . . . 18.sub.n are thus derived from the visibility of the illumination. The steps S7 to S9 are repeated until every camera 18.sub.1-18.sub.n has had its turn activating its illumination 30 once.
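The evaluation of step S9 for a row arrangement can be sketched as follows. The half-image brightness comparison and the two thresholds are assumptions for illustration; the patent only requires that direction and, optionally, neighbor distance be distinguishable from the degree of illumination:

```python
# Sketch (assumption, not from the patent): evaluate one sample recording to
# decide whether an active illumination is visible, on which side, and whether
# it comes from the next neighbor or one further remote.

def illumination_direction(sample, reference, min_level=10, near_level=60):
    # Remove the ambient-light reference image first (compare step S6).
    diff = [
        [max(s - r, 0) for s, r in zip(sample_row, reference_row)]
        for sample_row, reference_row in zip(sample, reference)
    ]
    width = len(diff[0])
    half = width // 2
    rows = len(diff)
    left = sum(sum(row[:half]) for row in diff) / (rows * half)
    right = sum(sum(row[half:]) for row in diff) / (rows * (width - half))
    peak = max(left, right)
    if peak < min_level:
        return None  # no illuminated neighbor visible in this recording
    side = "left" if left > right else "right"
    reach = "adjacent" if peak >= near_level else "remote"
    return side, reach
```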
[0053] In a step S10, the local proximity relationships are finally collected and combined by the master. A final or global proximity relationship is thereby produced, i.e. the sought relative arrangement of the cameras 18.sub.1 . . . 18.sub.n with respect to one another is now known. This step can be brought forward in order to decide after every run of the steps S7 to S9 whether the relative arrangement of the cameras 18.sub.1 . . . 18.sub.n has already been uniquely fixed and the loop can be aborted.
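Combining the local proximity relationships into the global row arrangement can be sketched as a chain assembly. The triple format and the restriction to directly adjacent observations are assumptions for illustration:

```python
# Sketch (assumption, not from the patent): the master combines local
# observations (observer, active_camera, side) into the global left-to-right
# arrangement of the row. Only directly adjacent observations are used.

def combine_arrangement(local_relations):
    right_of = {}
    cameras = set()
    for observer, active, side in local_relations:
        cameras.update((observer, active))
        if side == "left":
            right_of[active] = observer  # active camera sits directly left of observer
        else:
            right_of[observer] = active  # active camera sits directly right of observer
    # The leftmost camera is the only one that is not to the right of anyone.
    leftmost = (cameras - set(right_of.values())).pop()
    order = [leftmost]
    while order[-1] in right_of:
        order.append(right_of[order[-1]])
    return order
```

Because each round of steps S7 to S9 contributes redundant observations from both neighbors, contradictory or missing entries can additionally be detected before the loop is aborted.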
[0054] The steps S7 to S10 are illustrated in
[0058] Now that the relative arrangement of the cameras 18.sub.1 . . . 18.sub.n is known, new network configurations can be distributed in a step S11 so that the physical and logical arrangements correspond to one another. This step is not required for the function since the association can also be ensured in a different manner, but does facilitate the further handling, for example on a later exchange of devices.
[0060] The configuration can be verified again in a final step S12. This can involve the service engineer in that, for example, the cameras 18.sub.1 . . . 18.sub.n activate their illumination 30 one after another. If the cameras have determined their own positions in step S4, these positions can be compared with the relative arrangement found in step S10. In addition, a network scan can again be carried out to check for IP addresses issued twice and the like.
[0061] The individual cameras 18.sub.1 . . . 18.sub.n are preferably subsequently parameterized individually. In some cases, system parameters are here also set or changed that are not only of significance for the currently parameterized camera 18.sub.1 . . . 18.sub.n, but also for the other cameras 18.sub.1 . . . 18.sub.n. Such parameters have to be kept consistent over the camera device 10, however.
[0062] For this purpose, in an embodiment, the system parameters as such are known, for example by a corresponding flag or a table that determines for every parameter whether it is a system parameter or not. If a configuration change relates to a system parameter, it is transferred to the other cameras 18.sub.1-18.sub.n concerned, either immediately or after completion of the configuration of a camera 18.sub.1 . . . 18.sub.n. The information as to whether a parameter is a system parameter, or that it is now also being transferred to the other cameras 18.sub.1-18.sub.n, can, but does not have to, be displayed to the service engineer since the camera device 10 itself takes care of consistently distributing system parameters in the background.
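The flag table and the background distribution of system parameters can be sketched as follows. The parameter names and the class structure are assumptions for illustration:

```python
# Sketch (assumption, not from the patent): a flag table marks which
# parameters are system parameters; setting one on any camera silently
# propagates it to all other cameras of the device.

SYSTEM_PARAMETERS = {"trigger_delay", "belt_speed"}  # assumed flag table


class CameraDevice:
    """Toy model of the camera device distributing system parameters."""

    def __init__(self, camera_names):
        self.parameters = {name: {} for name in camera_names}

    def set_parameter(self, camera, key, value):
        self.parameters[camera][key] = value
        if key in SYSTEM_PARAMETERS:
            # Consistently distribute the system parameter in the background.
            for other in self.parameters:
                self.parameters[other][key] = value
```

Camera-local parameters stay local; only flagged parameters are mirrored, which is exactly why the service engineer need not know the distinction.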
[0063] The service engineer can thus optimize the camera device 10 by settings at the individual cameras 18.sub.1 . . . 18.sub.n without having to take care that the system parameters remain constant over the camera device 10. The service engineer does not even have to know which parameters are such system parameters that also have to be communicated to the other cameras 18.sub.1-18.sub.n.