Method for operating a light source for a camera, light source, camera
10663837 · 2020-05-26
Assignee
Inventors
CPC classification
G03B15/03
PHYSICS
H04N23/10
ELECTRICITY
G01S17/42
PHYSICS
H04N23/74
ELECTRICITY
G03B15/05
PHYSICS
H04N13/254
ELECTRICITY
H04N13/25
ELECTRICITY
International classification
G03B15/05
PHYSICS
G03B15/03
PHYSICS
G01S17/42
PHYSICS
H04N13/25
ELECTRICITY
H04N13/254
ELECTRICITY
Abstract
In an embodiment a method includes illuminating a scene in a first illumination by identically driving the emitters of a light source such that first exposures and/or first colour values of segments are ascertained by an image sensor; determining first illumination parameters for the segments of the scene, wherein the first illumination parameters are determined based on the first exposures and/or the first colour values; illuminating the scene in a second subsequent illumination by differently driving the emitters based on the first illumination parameters of the segments such that second exposures and/or second colour values of the segments are ascertained by the image sensor; determining second illumination parameters for the segments of the scene, wherein the second illumination parameters are determined based on the second exposures and/or the second colour values; and illuminating the scene in a third subsequent illumination by differently driving the emitters based on the second illumination parameters of the segments.
Claims
1. A method for operating a light source for a camera, wherein the light source comprises at least two individually driveable emitters, the method comprising: individually illuminating segments of a scene to be recorded by the emitters, wherein an illumination parameter is determined for a segment of the scene and an emitter is individually driven on a basis of the illumination parameter, wherein the illumination parameter is determined by a measurement of a physical variable, wherein the physical variable is an exposure and/or a colour value, wherein the measurement of the physical variable is carried out by the scene being firstly illuminated in a first illumination by the light source in such a way that the emitters are driven identically, wherein the exposure and/or the colour value of at least one segment are/is ascertained by an image sensor, wherein the illumination parameters of the segments are determined on the basis of the ascertained exposure and/or the ascertained colour value, wherein the scene is subsequently illuminated in a second illumination by the light source, wherein in this case the emitters are operated individually differently on the basis of the illumination parameters of the segments, wherein a recording is created by the image sensor during the second illumination, wherein during the first illumination the emitters are operated with a lower light intensity than during the second illumination, wherein a further illumination of the scene by the light source is carried out between the first illumination and the second illumination, wherein the light intensity of the further illumination corresponds to the light intensity of the first illumination, wherein the emitters are operated individually differently on the basis of the illumination parameters of the segments in the case of the further illumination, and wherein, during the further illumination by an image sensor, the exposure and/or the colour value of the segments are/is ascertained 
and the ascertainment of the exposure and/or the colour value during the further illumination influences a determination of the illumination parameters for the second illumination.
2. The method according to claim 1, wherein the illumination parameter is a light intensity and/or a colour temperature.
3. The method according to claim 1, wherein the individually driveable emitters are constructed from at least two single emitters, wherein the single emitters are in each case individually driveable, and wherein a colour temperature of an emitter as illumination parameter is set by individually driving the single emitters.
4. The method according to claim 1, wherein the physical variable comprises a distance from the camera to an object in a segment of the scene, and wherein the illumination parameter of the emitter is determined on the basis of the distance of the object.
5. The method according to claim 1, wherein a preview image is displayed on a screen of the camera, and wherein a user of the camera manually selects a region of the preview image and defines an illumination parameter for the manually selected region.
6. The method according to claim 4, wherein the distance from the camera to the object is ascertained by a propagation time measurement of a radiation pulse.
7. The method according to claim 4, wherein the distance from the camera to the object is ascertained by a LIDAR.
8. The method according to claim 4, wherein the distance from the camera to the object is ascertained by the scene being illuminated with structured light comprising a predefined pattern, and wherein the predefined pattern is distorted by objects in the segments of the scene, and wherein the distorted light is evaluated.
9. The method according to claim 6, wherein the radiation pulse is emitted by a radiation source, wherein the radiation pulse is reflected from the object partly in a direction of the camera, wherein a spatially resolving radiation detector detects reflected radiation, and wherein the detected radiation comprises different propagation times depending on the segment of the scene and the distance from the object to the camera, and the distance from the object to the camera is determined on the basis of the propagation time.
10. The light source configured to be used according to the method of claim 1.
11. The camera comprising: the light source according to claim 10; a control device; and the image sensor.
12. A method for operating a light source, wherein the light source comprises at least two individually driveable emitters, the method comprising: illuminating a scene in a first illumination by identically driving the emitters of the light source such that first exposures and/or first colour values of segments are ascertained by an image sensor; determining first illumination parameters for the segments of the scene, wherein the first illumination parameters are determined based on the first exposures and/or the first colour values; illuminating the scene in a second subsequent illumination by differently driving the emitters of the light source based on the first illumination parameters of the segments such that second exposures and/or second colour values of the segments are ascertained by the image sensor; determining second illumination parameters for the segments of the scene, wherein the second illumination parameters are determined based on the second exposures and/or the second colour values; illuminating the scene in a further subsequent illumination by differently driving the emitters of the light source based on the second illumination parameters of the segments; and recording an image of the scene during the second illumination, wherein during the first illumination the emitters are operated with a lower light intensity than during the further illumination, and wherein a light intensity of the second illumination corresponds to the light intensity of the first illumination.
13. The method according to claim 12, wherein the first illumination parameters or the second illumination parameters are light intensities and/or colour temperatures.
14. The method according to claim 12, wherein an illumination parameter of an emitter is further based on a distance from a camera to an object in a segment of the scene.
15. The method according to claim 14, wherein the distance from the camera to the object is ascertained by a propagation time measurement of a radiation pulse.
16. The method according to claim 14, wherein the distance from the camera to the object is ascertained by a LIDAR.
17. The method according to claim 14, wherein the distance from the camera to the object is ascertained by the scene being illuminated with structured light comprising a predefined pattern, and wherein the predefined pattern is distorted by objects in the segments of the scene, and wherein the distorted light is evaluated.
18. A camera comprising: the light source according to claim 12; a control device; and the image sensor.
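The metering-then-capture sequence of claims 1 and 12 can be sketched in code. The following Python sketch is illustrative only: the function names, the segment reflectances and the target exposure are assumptions rather than part of the patent, and each emitter is assumed to map one-to-one onto a segment.

```python
def meter(exposures, target=0.5):
    """Per-segment gain that would bring each measured exposure to the target."""
    return [target / e if e > 0 else 1.0 for e in exposures]

def illuminate(drive_levels, reflectance=(0.9, 0.2, 0.5, 0.1)):
    """Stand-in for driving the emitters and reading back segment exposures.
    Exposure is modelled as drive level times a hypothetical scene reflectance."""
    return [d * r for d, r in zip(drive_levels, reflectance)]

LOW, FULL = 0.2, 1.0  # illustrative intensities

# First illumination: all emitters driven identically at low intensity.
exposures_1 = illuminate([LOW] * 4)
gains_1 = meter(exposures_1)

# Further illumination: still at the low intensity, but the emitters are
# driven individually so the parameters can be checked and adapted.
exposures_2 = illuminate([LOW * g for g in gains_1])
gains_2 = meter(exposures_2)

# Second illumination: full intensity with the refined per-segment
# parameters; the recording is created during this pass.
drive_final = [min(FULL, LOW * g1 * g2) for g1, g2 in zip(gains_1, gains_2)]
```

In this toy model the further illumination exactly equalizes the segment exposures, so the final pass only scales the verified parameters up to the capture intensity.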
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The above-described properties, features and advantages of this invention, and the way in which they are achieved, will become clearer and more readily understood in association with the following description of the exemplary embodiments, which are explained in greater detail in association with the drawings, in which, in each case in a schematic illustration:
DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS
(15) In one exemplary embodiment, the first emitter 110 is configured to illuminate only the first segment 210 of the scene 200. In this case, the first emitter 110 may be driven on the basis of an illumination parameter for the first segment 210, while the second emitter 120 is driven on the basis of an illumination parameter for the second segment 211.
(16) In one exemplary embodiment, the illumination parameter is a light intensity and/or a colour temperature. Provision may thus be made for illuminating the first segment 210 with a different intensity from the second segment 211. Likewise, provision may be made for illuminating the first segment 210 with a different colour temperature from the second segment 211.
(18) The first emitter 110 is constructed from a first single emitter 111 and a second single emitter 112. The second emitter 120 is constructed from a third single emitter 121 and a fourth single emitter 122. The single emitters 111, 112, 121, 122 are each individually driveable. The first single emitter 111 and the second single emitter 112 comprise a mutually different colour temperature. The third single emitter 121 and the fourth single emitter 122 likewise comprise a mutually different colour temperature. A colour temperature of the first emitter 110 may be set by an individual driving of the first single emitter 111 and the second single emitter 112. Likewise, a colour temperature of the second emitter 120 may be set by an individual driving of the third single emitter 121 and the fourth single emitter 122. As a result, the light source may be operated in such a way that the first segment 210 and the second segment 211 of the scene 200 are illuminated with a different colour temperature.
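Setting an intermediate colour temperature by individually driving two single emitters, as described above, is often approximated as linear mixing in mired space (10^6 / CCT). The sketch below assumes that first-order model; the function name and the emitter colour temperatures are hypothetical, not taken from the patent.

```python
def warm_fraction(cct_warm, cct_cool, cct_target):
    """Fraction of total drive to give the warm single emitter so that the
    mixed light approximates cct_target, assuming mixing is linear in
    mired units (a common first-order approximation)."""
    m_warm, m_cool = 1e6 / cct_warm, 1e6 / cct_cool
    m_target = 1e6 / cct_target
    f = (m_target - m_cool) / (m_warm - m_cool)
    return min(1.0, max(0.0, f))  # clamp to the physical drive range

# Example: 2700 K and 6500 K single emitters mixed toward 4000 K.
f_warm = warm_fraction(2700.0, 6500.0, 4000.0)
```

Driving the warm single emitter at `f_warm` of the total intensity and the cool one at the remainder approximates the target colour temperature for that segment.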
(19) Here, too, provision may be made for the first emitter 110 to illuminate exclusively the first segment 210 of the scene 200.
(22) The emitters 110, 120, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140 may each once again be constructed from single emitters analogously to
(23) If a recording of the scene 200 is then intended to be created, provision may be made for an illumination parameter for the fifth segment 214, the sixth segment 215, the eighth segment 217 and the ninth segment 218, in which the vehicle 240 is situated, to be intended to be chosen differently from any other segments. By way of example, provision may be made for the vehicle 240 to be illuminated more intensely than the rest of the recording. The fifth emitter 133, the sixth emitter 134, the eighth emitter 136 and the ninth emitter 137 are then operated with a greater intensity than the remaining emitters 110, 120, 131, 132, 135, 138, 139, 140.
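The selective boost described in paragraph (23) amounts to mapping object segments to their emitters and raising only those drive levels. A minimal illustrative sketch, assuming a one-to-one segment-to-emitter mapping and made-up intensity values:

```python
BASE, BOOST = 0.4, 0.9  # illustrative drive levels

def drive_for_object(num_emitters, object_segments):
    """One drive level per emitter; emitter i is assumed to illuminate
    segment i, and segments containing the object are driven more intensely."""
    return [BOOST if i in object_segments else BASE for i in range(num_emitters)]

# Vehicle in the fifth, sixth, eighth and ninth segments (0-based: 4, 5, 7, 8).
levels = drive_for_object(12, {4, 5, 7, 8})
```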
(24) The tree 230 in the first segment 210 and the second segment 211 of the scene comprises a treetop 231 in the first segment 210 and a trunk 232 in the second segment 211. Provision may be made for the tree 230 to be intended to be illuminated with a different colour temperature compared with the vehicle 240. The first emitter 110 and the second emitter 120 may then be operated with a different colour temperature from that of the fifth emitter 133, the sixth emitter 134, the eighth emitter 136 and the ninth emitter 137, which illuminate the vehicle 240. Likewise, provision may be made for the treetop 231 and the trunk 232 to be illuminated with a mutually different colour temperature. In this case, the first emitter 110 may be operated with a different colour temperature from the second emitter 120.
(25) In one exemplary embodiment, the scene 200 from
(26) In one exemplary embodiment, during the first exposure the light source 100 is operated with a lower intensity than during the second exposure.
(27) In one exemplary embodiment, a further exposure is carried out between the first exposure and the second exposure, in the case of which further exposure the light source 100 is operated with the intensity of the first exposure, but the emitters 110, 120, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140 are already operated on the basis of the illumination parameter for the segments 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221 of the scene 200. By virtue of the further exposure, the illumination parameters for the segments 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221 of the scene 200 are checked and, if appropriate, adapted.
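The check-and-adapt step in paragraph (27) can be modelled as a proportional correction of each segment's illumination parameter toward a target exposure. A sketch with an assumed target and update rate:

```python
def adapt_parameters(params, measured, target=0.5, rate=0.5):
    """Nudge each segment's illumination parameter toward the target
    exposure, clamped to the emitter's drive range [0, 1]."""
    return [min(1.0, max(0.0, p + rate * (target - m)))
            for p, m in zip(params, measured)]

# Underexposed segment raised, overexposed segment lowered, on-target kept.
adapted = adapt_parameters([0.3, 0.8, 0.5], [0.2, 0.9, 0.5])
```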
(28) Provision may be made for the scene 200 to comprise more than twelve segments 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221 and for the light source 100 to comprise more than twelve emitters 110, 120, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140.
(30) The illumination parameters for the segments 250 of the scene 200 may be determined by means of the image sensor 310.
(32) The illumination parameters with which the emitters 150 are operated are determined on the basis of the first distance 236 and the second distance 237. In this case, the distances 236, 237 may be ascertained by means of the device 340 for distance measurement. Alternatively, the distances may also be ascertained by means of the image sensor 310. In this case, the device 340 for distance measurement is not necessary.
(33) In one exemplary embodiment, the camera 300 comprises two image sensors 310. The controller 330 is configured to stereoscopically evaluate the image sensors 310 and to ascertain the distances 236, 237 on the basis of this evaluation.
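Stereoscopic evaluation of two image sensors typically recovers distance by triangulation, Z = f·B/d, with focal length f in pixels, baseline B between the sensors, and disparity d in pixels. The patent does not specify the evaluation; this standard formula and all values below are illustrative.

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Triangulated distance from the pixel disparity between two sensors."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: object unmatched or at infinity")
    return focal_px * baseline_m / disparity_px

# 800 px focal length, 5 cm baseline, 20 px disparity -> 2 m.
z = stereo_distance(800.0, 0.05, 20.0)
```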
(34) In one exemplary embodiment, the distances 236, 237 are ascertained by means of a propagation time measurement of a radiation pulse. In this case, the radiation pulse may be emitted by the device 340 for distance measurement. The reflected radiation pulse is detected by the device 340 for distance measurement. This may involve a LIDAR system, for example.
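The propagation-time measurement reduces to d = c·t/2, i.e. half the round-trip distance travelled by the pulse. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    """Distance from the round-trip propagation time of a radiation pulse."""
    return C * round_trip_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
d = tof_distance(20e-9)
```

Applied per pixel of a spatially resolving detector, this yields the segment-wise distances 236, 237.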
(35) Provision may likewise be made for a short radiation pulse to be emitted by the light source 100. Part of the short radiation pulse is reflected from the tree 230 and the person 235 in the direction of the camera. By means of the image sensor 310, it is possible to ascertain the propagation times of the radiation pulse depending on the location of the reflection and thus the distances 236, 237. A device 340 for distance measurement is not necessary in this case.
(37) The predefined pattern of the structured light 160 is thus distorted by objects 260 within the scene 200 and the distortion is evaluated.
(41) The pixelated emitter 170 may be constructed, for example, from light emitting diodes on a carrier, wherein the light emitting diodes may in turn comprise conversion materials. In this case, the emitters 171 may be constructed in particular from two single emitters, which emit white light comprising different colour temperatures, or from three single emitters, which emit red, green and blue light, respectively.
(42) The emitters 110, 120, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 150, 171 of the light source 100 may be configured, for example, in the form of a light emitting diode or a laser diode. In this case, the light emitting diodes or laser diodes may be configured as semiconductor chips and comprise a conversion material.
(44) If the scene 200 is then illuminated using a conventional light source of a conventional camera, the strongly illuminated segments 225 of the lake 280 will be overexposed, while the weakly illuminated segments 226 will be underexposed. However, the butterfly 270 is intended to be shown to advantage in the recording, which is made more difficult by the illumination situation. By recording a plurality of images of the weakly illuminated segments 226, it is possible to create a recording with a higher exposure. However, if the camera or the butterfly 270 moves during the recording of the images, the recording will be blurred.
(45) However, if the scene 200 from
(46) On the basis of the exposure ascertained by the image sensor 310, said exposure representing a physical variable, the segments 225, 226 are subdivided into strongly illuminated segments 225 and weakly illuminated segments 226. The control device 330 of the camera 300 then ascertains which of the emitters 150 illuminate the weakly illuminated segments 226. During a second illumination of the scene 200, the emitters 150 which are directed at the weakly illuminated segments 226 of the scene are then operated. As a result, the exposure is significantly increased in the weakly illuminated segments 226 of the scene 200. In this case, the illumination parameter is the light intensity with which the segments 225, 226 are to be illuminated in each case. During the second illumination, the recording is created by means of the image sensor 310.
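The classification in paragraph (46) can be sketched as thresholding the metered exposures and boosting only the emitters aimed at the weak segments. The threshold, the drive levels and the one-emitter-per-segment mapping are illustrative assumptions:

```python
WEAK_THRESHOLD = 0.3  # assumed exposure threshold

def second_pass_drive(metered, base=0.3, boost=1.0):
    """Classify segments as weakly/strongly illuminated from the metered
    exposures and return the per-emitter drive for the second illumination."""
    weak = [e < WEAK_THRESHOLD for e in metered]
    drive = [boost if w else base for w in weak]
    return drive, weak

drive, weak = second_pass_drive([0.8, 0.1, 0.5, 0.2])
```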
(47) Likewise, between the first illumination and the second illumination it is possible to carry out further illuminations for the purpose of more accurately ascertaining the driving of the individual emitters 150.
(48) Furthermore, it is also possible to provide more than two different light intensities for different segments.
(49) Although the invention has been more specifically illustrated and described in detail by means of the preferred exemplary embodiments, the invention is not restricted by the examples disclosed and other variations may be derived therefrom by the person skilled in the art, without departing from the scope of protection of the invention.