IMAGE PROCESSOR, COMPUTER-IMPLEMENTED IMAGE PROCESSING METHOD, COMPUTER PROGRAM AND NON-VOLATILE DATA CARRIER
20230384456 · 2023-11-30
CPC classification
A01J5/007
HUMAN NECESSITIES
G01S17/894
PHYSICS
G06V10/60
PHYSICS
International classification
G01S17/894
PHYSICS
G06V10/60
PHYSICS
G06V10/74
PHYSICS
Abstract
An image processor obtains image data registered by a time-of-flight imaging system and representing a scene illuminated by two or more light sources calibrated to enable the image processor to include distance data in the image data. The image processor determines whether a shadow effect exists, by which a first object in the scene obstructs light of at least one light source from reaching a part of a second object in the scene, and adjusts the distance data to compensate for the at least one light source whose light did not reach the part of the second object.
Claims
1. An image processor (140), configured to: obtain image data (D.sub.img) registered by a time-of-flight (TOF) imaging system (110), said image data (D.sub.img) representing a scene (100) illuminated by two or more light sources calibrated to enable the image processor (140) to produce distance data (d.sub.D) to be comprised in the image data (D.sub.img), said distance data (d.sub.D) expressing respective distances from the TOF imaging system (110) to points on objects imaged by the imaging system (110) in the scene; determine if a shadow effect exists, by which a first object of the objects in the scene (100) obstructs light from at least one light source of the two or more light sources from reaching a part of a second object of the objects in the scene (100) and from being reflected from the part of the second object to the TOF imaging system (110); and upon determination that the shadow effect exists, adjust the distance data (d.sub.D) to compensate (d.sub.Δ) for the at least one light source for which the light is obstructed and does not reach the part of the second object.
2. The image processor (140) according to claim 1, wherein the distance data (d.sub.D) is adjusted by modifying a piece of distance data (d.sub.D′) expressing a distance to a point on a surface (FLT.sub.S1) on the part of the second object (FLT) by an adaptation amount (d.sub.Δ), the piece of distance data (d.sub.D′) being determined without consideration of the shadow effect, and the adaptation amount (d.sub.Δ) depending on a determination via the imaging system of which one or ones of the two or more light sources emit light that does not reach said point, where a count of the two or more light sources equals a first total number n, and a count of the one or ones of the two or more light sources for which light does not reach said point equals a second total number of at least one and is n−1 or less.
3. The image processor (140) according to claim 1, wherein: the image processor (140) is communicatively connected to a lookup table (1450) comprising a data set of adaptation amounts, one for each possible combination of light sources of the two or more light sources for which light does not reach said point, for at least one distance expressed by the distance data (d.sub.D), and the image processor (140) is configured to adjust the distance data (d.sub.D) by modifying a piece of distance data (d.sub.D′) based on an adaptation amount (d.sub.Δ) drawn from the adaptation amounts of the lookup table (1450), the piece of distance data (d.sub.D′) expressing a distance to a point on a surface (FLT.sub.S1) on the part of the second object (FLT), the piece of distance data (d.sub.D′) being determined without consideration of the shadow effect.
4. The image processor (140) according to claim 1, wherein the first object (TC) has a known position and spatial extension relative to the TOF imaging system (110) and the two or more light sources, and the image processor (140) is configured to: adjust the distance data (d.sub.D) to compensate for the at least one of the two or more light sources for which light is obstructed from reaching the part of the second object behind the first object (TC).
5. The image processor (140) according to claim 4, wherein the scene (100) comprises a milking location, and the first object is a teat cup (TC) arranged on a carrying structure (RA) mechanically linked to the TOF imaging system (110).
6. The image processor (140) according to claim 4, wherein the scene (100) comprises a milking location, and the first object is a teat of a milking animal.
7. The image processor (140) according to claim 1, further configured to: determine a distance (d.sub.D) to a potentially shadowing object in the scene (100), said potentially shadowing object is located at a shorter distance from the TOF imaging system (110) than any other object in the scene (100); apply a reverse ray-tracing algorithm to establish at least one sector in the scene (100) estimated to be shadowed by the potentially shadowing object with respect to light from at least one light source of the two or more light sources; and include the potentially shadowing object in a group of candidates from which to select the at least one first object when determining if the shadow effect exists.
8. The image processor (140) according to claim 7, further configured to: carry out a first processing step wherein a spatial position of a further potentially shadowing object (FLT) in the scene (100) is determined, said further potentially shadowing object (FLT) located at a shorter distance from the TOF imaging system (110) than any other object in the scene (100) that has not yet been included in the group of candidates; carry out a second processing step wherein said reverse ray-tracing algorithm is applied to establish at least one sector in the scene (100) estimated to be shadowed by the further potentially shadowing object (FLT) with respect to light from at least one light source of the two or more light sources; and carry out a third processing step wherein the further potentially shadowing object (FLT) is included in the group of candidates from which to select the at least one first object when determining if the shadow effect exists.
9. The image processor (140) according to claim 8, further configured to repeat the first, second and third processing steps until a stop criterion has been fulfilled.
10. The image processor (140) according to claim 9, wherein the stop criterion is set in response to at least one of: a time constraint, and a processing capacity constraint.
11. The image processor (140) according to claim 1, further configured to: determine a first piece of the distance data (d.sub.D) expressing a first distance to a first surface area (ARLT.sub.1234) of the second object (RLT) which first surface area is located in a first sector (S.sub.1234) illuminated by all of the two or more light sources; and determine a second piece of the distance data (d.sub.D) expressing a second distance to a second surface area (ARLT.sub.23) of the part of the second object (RLT), said second surface area located in a second sector (S.sub.23) illuminated by a subset of the two or more light sources, and the determining of the second piece of the distance data (d.sub.D) includes extrapolating the first surface area (ARLT.sub.1234) into the second sector (S.sub.23).
12. The image processor (140) according to claim 11, wherein the extrapolating assumes that the second object (RLT) has a generally known shape.
13. The image processor (140) according to claim 12, wherein the second number is zero.
14. The image processor (140) according to claim 11, wherein the scene (100) comprises a milking location, and the second object (RLT) is a teat of a milking animal.
15. The image processor (140) according to claim 1, wherein the image data (D.sub.img) comprises data that expresses a light intensity value for each pixel in a set of pixels, and the image processor (140) is further configured to: adjust a light intensity value of a pixel in the set of pixels by an adaptation intensity, said pixel representing a point on a surface (FLT.sub.S1) of the part of the second object being illuminated by light from less than all of the two or more light sources, the light intensity value being calculated without consideration of the shadow effect, and the adaptation intensity being proportional to a count of light sources of the two or more light sources for which light is obstructed from reaching said point, where a count of the two or more light sources amounts to a first total number n, and a count of the light sources for which light is obstructed from reaching said point equals a second total number of at least one and is n−1 or less.
16. A computer-implemented image processing method, comprising: obtaining image data (D.sub.img) registered by a time-of-flight (TOF) imaging system (110), said image data (D.sub.img) representing a scene (100) illuminated by two or more light sources calibrated to enable an image processor (140) to produce distance data (d.sub.D) to be comprised in the image data (D.sub.img), said distance data (d.sub.D) expressing respective distances from the TOF imaging system (110) to points on objects imaged by the imaging system (110) in the scene; determining if a shadow effect exists, by which a first object of the objects in the scene (100) obstructs light from at least one light source of the two or more light sources from reaching a part of a second object of the objects in the scene (100) and being reflected from the part of the second object to the TOF imaging system (110); and determining that the shadow effect exists and subsequently adjusting the distance data (d.sub.D) to compensate (d.sub.Δ) for the at least one light source for which the light did not reach the part of the second object.
17. The method according to claim 16, further comprising: adjusting the distance data (d.sub.D) by modifying a piece of distance data (d.sub.D′) expressing a distance to a point on a surface (FLT.sub.S1) on the at least one part of the at least one second object (FLT) by an adaptation amount (d.sub.Δ), the piece of distance data (d.sub.D′) being determined without consideration of the shadow effect, and the adaptation amount (d.sub.Δ) depending on a determination via the imaging system of which one or ones of two or more light sources emit light that does not reach said point.
18. The method according to claim 16, further comprising: obtaining an adaptation amount (d.sub.Δ) from a lookup table (1350) comprising a data set expressing an adaptation amount (d.sub.Δ) for each possible combination of light sources of the two or more light sources for which light does not reach said point for at least one distance expressed by the distance data (d.sub.D), and adjusting the distance data (d.sub.D) by modifying a piece of distance data (d.sub.D′) expressing a distance to a point on a surface (FLT.sub.S1) on the at least one part of the at least one second object (FLT) by the adaptation amount (d.sub.Δ), the piece of distance data (d.sub.D′) being determined without consideration of the shadow effect.
19. The method according to claim 16, wherein the at least one first object (TC) has a known position and spatial extension relative to the TOF imaging system (110) and the two or more light sources, and the method further comprises: adjusting the distance data (d.sub.D) to compensate for the at least one of the two or more light sources for which light is obstructed from reaching the part of the second object behind the first object (TC).
20. The method according to claim 19, wherein the scene (100) comprises a milking location, and the first object is a teat cup (TC) arranged on a carrying structure (RA) mechanically linked to the TOF imaging system (110).
21. The method according to claim 19, wherein the scene (100) comprises a milking location, and the first object is a teat of a milking animal.
22. The method according to claim 16, further comprising: determining a distance (d.sub.D) to a potentially shadowing object in the scene (100), said potentially shadowing object located at a shorter distance from the TOF imaging system (110) than any other object in the scene (100); applying a reverse ray-tracing algorithm to establish at least one sector in the scene (100) estimated to be shadowed by the potentially shadowing object with respect to light from at least one light source of the two or more light sources; and including the potentially shadowing object in a group of candidates from which to select the at least one first object when determining if the shadow effect exists.
23. The method according to claim 22, further comprising: executing a first processing step wherein a spatial position of a further potentially shadowing object (RLT) in the scene (100) is determined, said further potentially shadowing object (RLT) located at a shorter distance from the TOF imaging system (110) than any other object in the scene (100) that has not yet been included in the group of candidates; executing a second processing step wherein said reverse ray-tracing algorithm is applied to establish at least one sector in the scene (100) estimated to be shadowed by the further potentially shadowing object (RLT) with respect to light from at least one light source of the two or more light sources; and executing a third processing step wherein the further potentially shadowing object (RLT) is included in the group of candidates from which to select the at least one first object when determining if the shadow effect exists.
24. The method according to claim 23, further comprising: executing the first, second and third processing steps repeatedly until a stop criterion has been fulfilled.
25. The method according to claim 24, further comprising: setting the stop criterion in response to at least one of: a time constraint, and a processing capacity constraint.
26. The method according to claim 16, comprising: determining a first piece of the distance data (d.sub.D) expressing a first distance to a first surface area (ARLT.sub.1234) of the second object (RLT) which first surface area is located in a first sector (S.sub.1234) illuminated by all of the two or more light sources; and determining a second piece of the distance data (d.sub.D) expressing a second distance to a second surface area (ARLT.sub.23) of part of the second object (RLT), said second surface area is located in a second sector (S.sub.23) illuminated by a subset of the two or more light sources, and the determining of the second piece of the distance data (d.sub.D) includes extrapolating the first surface area (ARLT.sub.1234) into the second sector (S.sub.23).
27. The method according to claim 26, wherein the extrapolating assumes that the second object (RLT) has a generally known shape.
28. The method according to claim 27, wherein the second number is zero.
29. The method according to claim 26, wherein the scene (100) comprises a milking location, and the second object (RLT) is a teat of a milking animal.
30. The method according to claim 16, wherein the image data (D.sub.img) comprises data that for each pixel in a set of pixels expresses a light intensity value, and the method further comprises: adjusting the light intensity value of a pixel in the set of pixels by an adaptation intensity, said pixel representing a point on a surface (FLT.sub.S1) of the part of the second object being illuminated by light from less than all of the two or more light sources, the light intensity value being calculated without consideration of the shadow effect, and the adaptation intensity being proportional to a count of the light sources of the two or more light sources for which light is obstructed from reaching said point.
31. A non-volatile, non-transitory data carrier (1326) having recorded thereon a computer program (1327) comprising processor-executable code that, when executed by a processing unit of a computing device, causes the computing device to carry out the method according to claim 16.
32. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] The invention is now to be explained more closely by means of preferred embodiments, which are disclosed as examples, and with reference to the attached drawings.
DETAILED DESCRIPTION
[0039] Here, where the set of light sources contains four light sources, the light sources are preferably arranged in a plane parallel to a sensor matrix of the TOF imaging system 110 and at 90 degrees angular offset from one another. For example, the light sources 121, 122, 123 and 124 may be attached to a frame 120 around a front lens 111 of the TOF imaging system 110. In any case, the light sources 121, 122, 123 and 124 are controlled by at least one control signal C in synchronization with how image data is registered by the sensor matrix of the TOF imaging system 110. For example, the at least one control signal C may control the light sources 121, 122, 123 and 124 so that there is a particular desired phase delay between each of the light beams emitted from the respective light sources. Such calibration between the image data registration and the illumination of a scene makes it relatively straightforward to produce distance data expressing respective distances from the TOF imaging system 110 to different points on imaged objects in the scene. If, however, a common control signal C controls all the light sources 121, 122, 123 and 124 in a serial manner, and the signal lines interconnecting the light sources for forwarding the common control signal C cause an unknown delay of the respective points in time at which the light beams from the different light sources are emitted, then the light sources 121, 122, 123 and 124 need to be individually calibrated to the sensor matrix of the TOF imaging system 110 to enable production of accurate distance data.
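The phase-based distance computation implied by the calibration above can be sketched as follows, assuming a continuous-wave TOF system with a known modulation frequency; the function names, the 20 MHz figure and the per-source delay offset are illustrative assumptions, not taken from the disclosure.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_shift_rad: float, f_mod_hz: float) -> float:
    """Distance from the measured phase shift of a CW-modulated light beam.

    Light travels to the object and back, so the round trip covers 2*d:
        phase_shift = 2*pi * f_mod * (2*d / c)  =>  d = c * phase_shift / (4*pi*f_mod)
    """
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

def calibrated_phase(raw_phase_rad: float, source_delay_rad: float) -> float:
    """Remove a per-light-source emission delay (a hypothetical calibration
    offset) so that all sources share a common phase reference."""
    return (raw_phase_rad - source_delay_rad) % (2.0 * math.pi)

# A 20 MHz modulation and a measured phase shift of pi/2 rad:
d = tof_distance(math.pi / 2, 20e6)  # ≈ 1.87 m
```

Individual calibration of serially driven light sources, as in the paragraph above, would then amount to storing one `source_delay_rad` per light source and applying it before the distance computation.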
[0042] An image processor 140 is here configured to obtain image data D.sub.img that is registered by a TOF imaging system 110. The image data D.sub.img represents a scene 100, which in this case contains an udder to whose teats a carrying structure RA of the milking robot shall attach teat cups TC based on the image data D.sub.img. The scene 100 is illuminated by at least two light sources, for example as described above.
[0043] In order to enhance the quality of the distance data, the image processor 140 is configured to determine if a shadow effect exists by which at least one first object in the scene 100 obstructs light from at least one of the light sources from reaching at least one part of at least one second object in the scene 100 and from being reflected therefrom into the TOF imaging system 110.
[0045] If the image processor 140 determines that the shadow effect exists with respect to at least one part of at least one second object as described above, the image processor 140 adjusts the distance data to compensate for the at least one light source, i.e. 121, 122, 123 and/or 124, whose light did not reach the at least one part of the at least one second object FLT, FRT, RLT and/or RRT. The principles for adjusting the distance data are discussed below.
[0046] According to one embodiment of the invention, the image processor 140 is configured to determine if a shadow effect exists as follows. The image processor 140 determines a distance to a potentially shadowing object in the scene 100, such as the teat cup TC, the front right teat FRT and/or the front left teat FLT, which potentially shadowing object is located at a shorter distance from the TOF imaging system 110 than any other object in the scene 100. The image processor 140 then applies a reverse ray-tracing algorithm to establish sectors in the scene 100 estimated to be shadowed by the potentially shadowing object TC, FRT and/or FLT with respect to light from at least one light source of the at least two light sources 121, 122, 123 and/or 124.
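A reverse ray-tracing test of the kind described can be sketched by modeling a potentially shadowing object as a sphere and checking whether the segment from a surface point to each light source intersects it; the sphere occluder model and all names here are simplifying assumptions, not the disclosure's own implementation.

```python
import numpy as np

def segment_hits_sphere(p, l, center, radius):
    """True if the line segment from surface point p to light source l passes
    through a sphere (a simplified occluder model, an assumption here)."""
    p, l, center = map(np.asarray, (p, l, center))
    d = l - p                      # segment direction, p + t*d for t in [0, 1]
    f = p - center
    a = d @ d
    b = 2.0 * (f @ d)
    c = f @ f - radius ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return False               # the ray misses the sphere entirely
    sq = np.sqrt(disc)
    for t in ((-b - sq) / (2 * a), (-b + sq) / (2 * a)):
        if 0.0 < t < 1.0:          # intersection lies strictly between p and l
            return True
    return False

def shadowed_sources(point, light_sources, occluders):
    """Indices of light sources whose light is blocked before reaching `point`."""
    return [i for i, l in enumerate(light_sources)
            if any(segment_hits_sphere(point, l, c, r) for c, r in occluders)]
```

A point directly behind an occluder relative to one source, but with a clear line to another source, is then reported as shadowed only with respect to the first.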
[0047] Preferably, the image processor 140 applies a stepwise procedure to determine if the shadow effect exists. Specifically, the image processor 140 may be configured to execute first, second and third processing steps as follows.
[0048] In the first processing step, the image processor 140 determines a spatial position of a further potentially shadowing object in the scene 100. The further potentially shadowing object is located at a shorter distance from the TOF imaging system 110 than any other object in the scene 100 that has not yet been included in the above group of candidates.
[0049] In the second processing step, the image processor 140 applies said reverse ray-tracing algorithm to establish at least one sector in the scene 100 that is estimated to be shadowed by the further potentially shadowing object with respect to light from at least one light source of the at least two light sources 121, 122, 123, and/or 124.
[0050] In the third processing step, the image processor 140 includes the further potentially shadowing object in the group of candidates from which to select the at least one first object TC, FLT and/or FRT when determining if the shadow effect exists.
[0051] For accuracy reasons, it is desirable if the image processor 140 is configured to repeat the first, second and third processing steps until a stop criterion has been fulfilled. The stop criterion may be set in response to a time constraint and/or a processing capacity constraint on the image processor 140.
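The stepwise candidate-building procedure described above might be organized as below; the callbacks standing in for the scene model, and the concrete stop criteria (a time budget and a candidate cap), are assumptions for illustration.

```python
import time

def build_shadow_candidates(objects, distance_of, shadow_sectors_of,
                            time_budget_s=0.05, max_candidates=None):
    """Iteratively pick the nearest not-yet-considered object (step 1),
    estimate the sectors it shadows via reverse ray tracing, abstracted as a
    callback (step 2), and add it to the candidate group (step 3), repeating
    until a stop criterion is fulfilled."""
    remaining = sorted(objects, key=distance_of)   # nearest object first
    candidates, sectors = [], []
    deadline = time.monotonic() + time_budget_s
    for obj in remaining:
        # Stop criteria: a time constraint and/or a processing capacity constraint.
        if time.monotonic() > deadline:
            break
        if max_candidates is not None and len(candidates) >= max_candidates:
            break
        sectors.extend(shadow_sectors_of(obj))     # step 2: shadowed sectors
        candidates.append(obj)                     # step 3: join the group
    return candidates, sectors
```

The candidate group returned here is the set from which the at least one first object is selected when determining whether the shadow effect exists.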
[0055] It should be noted that the field of view of the TOF imaging system 110 and the respective angular widths of illumination of the light sources shown in this document are used for illustrative purposes only. Preferably, the light sources should have wider illumination sectors than what is illustrated in the drawings, say in the order of 150 to 170 degrees. It is typically advantageous if the field of view of the TOF imaging system 110 is also somewhat wider than what is illustrated in the drawings.
[0057] Let us here assume that the surface represents a second object in the form of the teat FRT being visible to the TOF imaging system 110.
[0058] The TOF imaging system 110 is designed to determine a line-of-sight distance d.sub.P1 between a sensor 810 in the TOF imaging system 110 and respective points on objects reflecting the light from the light sources 121 and 122. Based on the line-of-sight distance d.sub.P1, in turn, the TOF imaging system 110 may be further configured to determine an orthogonal distance d.sub.D to said points. However, the determinations made by the TOF imaging system 110 presuppose that each of said points has received light from all illuminating light sources associated with the TOF imaging system 110. Here, where the light sources are represented by 121 and 122, they may be calibrated to the TOF imaging system 110, for instance by applying information about phase delays between the emitted light beams. The inventor has found that if, due to a shadow effect, a surface on an imaged object is illuminated by light from less than all the light sources associated with the TOF imaging system 110, the distance data d.sub.D determined by the TOF imaging system 110 shall be adjusted by an adaptation amount. The adaptation amount, in turn, depends on which specific light beams were shadowed with respect to the surface in question.
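The relation between the line-of-sight distance d_P1 and the orthogonal distance d_D mentioned above reduces to a projection onto the optical axis; a minimal sketch, with the pixel's off-axis angle assumed known from the lens model:

```python
import math

def orthogonal_distance(line_of_sight_d, pixel_angle_rad):
    """Project the line-of-sight distance d_P1, measured along a pixel's
    viewing ray, onto the optical axis to get the orthogonal distance d_D.

    `pixel_angle_rad` is the angle between the viewing ray and the optical
    axis; deriving it from the pixel position and lens model is assumed done
    elsewhere."""
    return line_of_sight_d * math.cos(pixel_angle_rad)

# A point seen 30 degrees off-axis at 2.0 m line-of-sight distance:
d_D = orthogonal_distance(2.0, math.radians(30))  # ≈ 1.732 m
```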
[0061] Analogously, if the light from the light source 121 were shadowed with respect to the surface, the TOF imaging system 110 would instead have determined the orthogonal distance to this surface to be somewhat shorter, more precisely by an amount equal to the measure −d.sub.Δ.
[0062] In fact, for a given TOF imaging system 110, and for each piece of distance data d.sub.D′ determined by the TOF imaging system 110, an appropriate adjustment amount can be determined, which adjustment amount depends on which specific light sources are shadowed. As discussed above, the adjustment amount may be either positive or negative. Moreover, the magnitude of the adjustment amount may vary with the distance data d.sub.D′ as well as with the number of light sources being shadowed. Nevertheless, for a range of pieces of distance data d.sub.D′ determined by the TOF imaging system 110, such appropriate adjustment amounts can be determined at well-defined increments, i.e. sampling points, by measuring actual physical distances to different points in the TOF imaging system's 110 field of view and comparing these measurements with corresponding pieces of distance data d.sub.D′ determined by the TOF imaging system 110 under various shadowing conditions.
[0063] According to one embodiment of the invention, the image processor 140 is configured to adjust the distance data d.sub.D by modifying a piece of distance data d.sub.D′ expressing a distance to a point on a shadowed surface, such as the above-mentioned at least one part of the at least one second object FLT, by an adaptation amount d.sub.Δ. The piece of distance data d.sub.D′ is determined without consideration of the shadow effect, i.e. straight out of the TOF imaging system 110. The adaptation amount d.sub.Δ depends on which light source or light sources' light did not reach said point. The at least two light sources amount to a first total number n, and the light source or light sources whose light did not reach said point amount to a second total number of at least one and n−1 or less. Thus, if the system includes two light sources, no more than one light source can be shadowed with respect to said point. The adaptation amount d.sub.Δ may be either positive or negative, and its magnitude depends on the interrelationship between the shadowed and non-shadowed light sources.
[0064] For example, if the system is set to operate at a relatively short and known range of distances, such as in a milking installation, it may be advantageous to register appropriate values of the adaptation amount d.sub.Δ in advance, and store these values in a lookup table for easy access by the image processor 140. Naturally, such an approach may be advantageous irrespective of the operating distance; that is, the lookup-table design is generally beneficial for any range of operation of the TOF imaging system 110.
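One plausible shape for such a lookup table, offered as an assumption since the disclosure does not fix a data layout, keys pre-measured adaptation amounts d_Δ by the combination of shadowed light sources and a binned uncorrected distance d_D′:

```python
import bisect

class AdaptationTable:
    """Lookup of pre-measured adaptation amounts d_delta, keyed by the set of
    shadowed light sources and the (binned) uncorrected distance d_D'.

    The bin edges and the stored values below are hypothetical; in practice
    they would come from calibration measurements against known physical
    distances under various shadowing conditions."""

    def __init__(self, distance_bins, table):
        self.bins = distance_bins                  # sorted sampling points [m]
        self.table = {frozenset(k): v for k, v in table.items()}

    def adaptation(self, shadowed_ids, d_prime):
        # Pick the nearest sampled distance at or below d_prime (clamped).
        i = min(bisect.bisect_right(self.bins, d_prime), len(self.bins) - 1)
        return self.table[frozenset(shadowed_ids)][i]

    def adjust(self, shadowed_ids, d_prime):
        """Corrected distance: raw d_D' plus the stored adaptation amount."""
        return d_prime + self.adaptation(shadowed_ids, d_prime)
```

For instance, with sources 1 and 2 and three sampled distances, a shadowed source 2 at an uncorrected 0.7 m might yield a small negative correction, consistent with the sign discussion in paragraph [0062].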
[0066] It is generally advantageous if the image processor 140 is configured to effect the above-described procedure in an automatic manner by executing a computer program 1327. Therefore, the image processor 140 may include a memory unit 1326, i.e. a non-volatile data carrier, storing the computer program 1327, which, in turn, contains software for making processing circuitry in the form of at least one processor 1325 in the image processor 140 execute the actions mentioned in this disclosure when the computer program 1327 is run on the at least one processor 1325.
[0067] According to one embodiment of the invention, the image data D.sub.img contains data that for each pixel in a set of pixels expresses a light intensity value. In this embodiment, the image processor 140 is further configured to adjust the intensity value of a pixel in the set of pixels by an adaptation intensity as described below.
[0068] Here, we assume that the pixel represents one of the points P1, P2, . . . , P6 on a surface FLT.sub.S1 of the at least one part of the second object that is illuminated by light from less than all of the at least two light sources 121, 122, 123 and 124. The intensity value is calculated without consideration of the shadow effect, and the adaptation intensity is proportional to the number of light sources 121, 122, 123 and/or 124 whose light is obstructed from reaching said point P1, P2, . . . , P6. Consequently, the pictorial data quality can be enhanced, and any partially shadowed objects in the scene can be made somewhat brighter and easier to distinguish visually. This, in turn, facilitates distinguishing the rearmost teats RLT and RRT, i.e. the teats that are furthest from the TOF imaging system 110.
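A simple proportionality model consistent with this paragraph, though the exact gain is an assumption rather than something the disclosure specifies, adds back one average per-source contribution for each obstructed source:

```python
def compensate_intensity(i_raw, n_sources, n_shadowed, gain=None):
    """Brighten a pixel whose point was lit by fewer than all sources.

    The adaptation is proportional to `n_shadowed`, the count of obstructed
    sources; as a hypothetical choice of proportionality constant, the gain
    defaults to the average per-source contribution observed in the raw
    value, i.e. i_raw / (n_sources - n_shadowed)."""
    if not 0 < n_shadowed < n_sources:
        return i_raw               # fully lit (or fully dark): nothing to add
    if gain is None:
        gain = i_raw / (n_sources - n_shadowed)
    return i_raw + gain * n_shadowed

# 4 sources, 1 shadowed, raw intensity 90: per-source contribution 30,
# so the compensated value is 120.
```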
[0070] According to one embodiment of the invention, the image processor 140 is configured to determine a first piece of the distance data d.sub.D expressing a first distance to a first surface area ARLT.sub.1234 of the second object RLT, which first surface area ARLT.sub.1234 is located in a first sector S.sub.1234 illuminated by all four light sources 121, 122, 123 and 124. The image processor 140 is further configured to determine a second piece of the distance data d.sub.D expressing a second distance to a second surface area ARLT.sub.23 of the at least one part of the at least one second object RLT, which second surface area ARLT.sub.23 is located in a second sector S.sub.23 illuminated only by a subset of the light sources 121, 122, 123 and 124, namely by the light sources 122 and 123. The image processor 140 determines the second piece of the distance data d.sub.D according to a procedure that involves extrapolating the first surface area ARLT.sub.1234 into the second sector S.sub.23. Due to the better illumination, the distance to the first surface area ARLT.sub.1234 can be determined with higher accuracy. In fact, such extrapolating can even be carried out over smaller surfaces that are not illuminated by any light sources at all.
[0071] Of course, the extrapolating can be improved if the image processor 140 can assume that the at least one second object RLT has a generally known shape. This type of extrapolating may for example be applied if the scene 100 contains a milking location, and the at least one second object RLT is a teat of a milking animal. Indeed, in a milking scenario, the specific shape of the at least one second object RLT may be known to a comparatively high degree of accuracy. Namely, provided that the milking animals are identified, the physical characteristics of each teat of each animal may have been measured and stored in a database to which the image processor 140 has access. Consequently, the image processor 140 may determine distances to the surfaces of the teats with very high accuracy, also if some of these surfaces are partially shadowed with respect to one or more of the light sources 121, 122, 123 and/or 124 associated with the TOF imaging system 110.
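For an object whose cross-section is roughly circular, such as a teat, the known-shape extrapolation could be sketched as a least-squares circle fit to well-illuminated surface points, followed by evaluation of the fitted shape in the shadowed sector; the circular model and the Kasa fitting method are assumptions, not taken from the disclosure.

```python
import numpy as np

def fit_circle(points):
    """Kasa least-squares circle fit: solve x^2 + y^2 = 2*a*x + 2*b*y + c,
    so that center = (a, b) and radius = sqrt(c + a^2 + b^2)."""
    pts = np.asarray(points, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return (a, b), np.sqrt(c + a ** 2 + b ** 2)

def extrapolate_point(center, radius, angle_rad):
    """Predict a surface point in a shadowed angular sector from the fit."""
    cx, cy = center
    return (cx + radius * np.cos(angle_rad), cy + radius * np.sin(angle_rad))
```

Points measured on the well-illuminated sector fix the center and radius; `extrapolate_point` then supplies surface positions, and hence distances, in sectors illuminated by only a subset of the light sources, or by none at all.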
[0072] In addition to objects in the scene 100 shadowing one another from the light emitted by the light sources, shadow effects may occur due to known objects that are always present in front of the TOF imaging system 110, and for example form part of a milking robot.
[0073] According to one embodiment of the invention, the image processor 140 is configured to adjust the distance data d.sub.D determined by the TOF imaging system 110 in order to compensate for the at least one of the light sources 121, 122, 123 and/or 124 whose light is obstructed from reaching at least one sector behind the first object TC.
[0075] To sum up, and with reference to the flow diagram, the procedure comprises the following steps.
[0076] In a first step 1410, image data D.sub.img is obtained, which has been registered by a TOF imaging system 110 and which represents a scene 100 illuminated by at least two light sources calibrated to enable an image processor 140 to produce distance data d.sub.D to be comprised in the image data D.sub.img. The distance data d.sub.D expresses respective distances from the TOF imaging system 110 to points on imaged objects in the scene 100.
[0077] A step 1420 thereafter determines if a shadow effect exists by which at least one first object in the scene 100 obstructs light from at least one light source of the at least two light sources from reaching at least one part of at least one second object in the scene 100 and from being reflected therefrom into the TOF imaging system 110. If it is determined that the shadow effect exists, a step 1430 follows; otherwise the procedure loops back to step 1410.
[0078] In step 1430, the distance data d.sub.D produced by the TOF imaging system 110 is adjusted to compensate for the at least one light source whose light did not reach the at least one part of the at least one second object. The adjustment is made depending on which specific light source or sources' light did not reach the at least one part of the at least one second object. Subsequently, the procedure loops back to step 1410.
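The steps 1410, 1420 and 1430 and the loop back can be sketched as a simple control loop; the three callbacks and the exit condition are placeholders for the scene model and system described above.

```python
def processing_loop(acquire_frame, detect_shadow, adjust_distances):
    """One rendering of the summarized method: step 1410 obtains image data,
    step 1420 tests for a shadow effect, step 1430 adjusts the distance
    data; the procedure then loops back to step 1410.

    A frame of None is used here as a hypothetical termination signal."""
    while True:
        frame = acquire_frame()                    # step 1410
        if frame is None:
            return
        shadow = detect_shadow(frame)              # step 1420
        if shadow:
            adjust_distances(frame, shadow)        # step 1430
        # otherwise loop straight back to step 1410
```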
[0079] All of the process steps, as well as any sub-sequence of steps, described above may be controlled by means of a programmed processor, for example by executing the computer program 1327.
[0080] Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims.
[0081] The term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components. The term does not preclude the presence or addition of one or more additional elements, features, integers, steps or components or groups thereof. The indefinite article “a” or “an” does not exclude a plurality. In the claims, the word “or” is not to be interpreted as an exclusive or (sometimes referred to as “XOR”). On the contrary, an expression such as “A or B” covers all the cases “A and not B”, “B and not A” and “A and B”, unless otherwise indicated. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
[0082] It is also to be noted that features from the various embodiments described herein may freely be combined, unless it is explicitly stated that such a combination would be unsuitable.
[0083] The invention is not restricted to the described embodiments in the figures, but may be varied freely within the scope of the claims.